Capturing muscle activity signals for machine learning

About six months ago I had the idea of creating an open framework for neural interfaces.

In the video, the EMG signal from the muscles is captured with an eight-channel EMG sensor worn on the forearm. In effect, we pick up through the skin a muscle-amplified (and as yet undeciphered) pattern of motor-neuron activation.

The raw signal travels from the sensor over Bluetooth to an Android / Android Things app.

To train the system, we assign a movement class to a specific hand gesture. For example, if we need a "stop" state plus rotation of two motors in two directions, we record five gestures. We collect the samples into files and feed them to a neural network for training. The network's input is the muscle activity; its output is a recognized movement class.
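As a rough sketch of how one recorded sample could be shaped for training (the window size and function names are my assumptions, not from the post): eight readings from each of the eight channels are flattened into a 64-value feature vector, paired with a one-hot label for one of the five gestures.

```python
# Hypothetical sketch of shaping one recorded EMG sample for training.
NUM_CHANNELS = 8
READINGS_PER_CHANNEL = 8   # 8 * 8 = 64, matching the network's input_dim
NUM_GESTURES = 5

def make_sample(window, gesture_index):
    """Flatten an 8x8 EMG window into a 64-value feature vector
    and build a one-hot label for the gesture class."""
    assert len(window) == NUM_CHANNELS
    assert all(len(ch) == READINGS_PER_CHANNEL for ch in window)
    features = [value for channel in window for value in channel]
    label = [0.0] * NUM_GESTURES
    label[gesture_index] = 1.0
    return features, label

# Example: a dummy all-zero window, labelled as gesture 2
window = [[0.0] * READINGS_PER_CHANNEL for _ in range(NUM_CHANNELS)]
features, label = make_sample(window, 2)
```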

An example of the network architecture on Keras:

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

model = Sequential()
# 8 EMG channels, each recorded 8 times (8 * 8 = 64 inputs)
model.add(Dense(36, activation='relu', input_dim=64))
model.add(Dense(20, activation='relu'))
model.add(Dense(16, activation='relu'))
# 5 recorded gestures
model.add(Dense(5, activation='softmax'))
sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])
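On the output side, the softmax layer produces five probabilities, and the recognized movement class is simply the index of the largest one. A minimal pure-Python sketch of that decoding step (the class names are illustrative; the real mapping depends on the order the gestures were recorded in):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative class order: "stop" plus two motors in two directions.
CLASSES = ["stop", "motor1_cw", "motor1_ccw", "motor2_cw", "motor2_ccw"]

def decode(logits):
    """Map the network's 5-way output to a gesture name and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return CLASSES[best], probs[best]

gesture, confidence = decode([0.1, 2.5, 0.3, -1.0, 0.0])
```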

A server connects the app and the neural network. The client-server setup makes it easy to script machine-learning experiments with TensorFlow without changing the app code, and avoids constant reinstalls during debugging.
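The post does not specify the wire protocol, so the message shapes below are purely an assumption: the app could serialize each 64-value window as JSON, and the TensorFlow-side server could reply with the predicted class and its confidence.

```python
import json

# Hypothetical request the Android app might send to the server:
# one flattened 8x8 EMG window per prediction.
request = {
    "type": "predict",
    "features": [0.0] * 64,
}
payload = json.dumps(request).encode("utf-8")

# Hypothetical reply from the TensorFlow server.
reply = json.loads('{"gesture": 1, "confidence": 0.93}')
```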

The resulting classifiers can be deployed with TFLite or TF Serving.

The system code is here.

Future plans:

  • An open-source multi-channel EMG sensor that works over USB
  • Machine-learning experiments to improve control reliability

My friend garastard writes about our Android adventures with neural interfaces in this article.
