Android, Rx and Kotlin, or how to make a LEGO claw clench. Part 1

Hi, Habr folks! By a lucky chance, in August 2018 I got to work with my comrade (kirillskiy) on a fascinating project. By day we were ordinary programmers, and by night we were programmers tackling movement recognition for people with limited use of their limbs. Of course, healthy people could use this kind of technology too, in a variety of ways.

In this article Kirill gives a general overview of the project; here I will go into more detail and cover the Android side of it.
First, about the project as a whole: what we came up with and how we wanted to implement it:

1) EMG (electromyography, recording the electrical activity of muscles) was chosen as the way to obtain data (and oh yes, there will be a lot of data). The method was first applied in 1907, so we were walking a well-trodden path.

2) We found an 8-channel EMG sensor that works over Bluetooth. It even has its own API, which in the end turned out to be absolutely useless, because we had to connect to it directly as a BT device. Thanks to the makers at least for writing the specification.

3) We decided that everything would work like this:

  • Training mode. We put the sensor on the forearm and choose the type of movement to train, for example "hand flexion", then start a session (the movement is repeated 12 times). The data recorded during the session is saved and later sent to the server, where we train the neural network (easy, I will get to that too).
  • Recognition mode. Data captured while a movement is performed is compared against the model produced by training the neural network, and out comes, for example, "HAND FLEXION".
  • Control mode. A recognized movement has to make something move, for example a manipulator assembled in the kitchen from a construction kit (damn, it's expensive) by a famous Danish manufacturer.

4) The Android part. I am an Android developer, so it would have been a sin not to use that. The Android app:

  • finds all available BT devices
  • connects to the sensor
  • draws charts from the data taken off the sensor (8 channels at 200 Hz): 8 beautiful, multicolored curves
  • implements the training mode (selecting the movement type to train, a training start button, a data upload button)
  • implements client-server communication: the data has to be sent to the server to train the neural network
  • implements the connection to and interaction with the Raspberry Pi 3B, to which the motors driving the manipulator are soldered

5) Raspberry Pi 3B. We put Android Things on the Raspberry and brought up a BT server on it that receives messages from the Android device and moves the corresponding motors driving the super-claw made of LEGO (a sketch of this side follows right after this list).

6) Server. Deployed locally with Docker. It accepts the data sent by the device, trains the neural network and returns the model (the client side of this exchange is sketched below as well).
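
To give a feel for item 5 right away: a minimal sketch of the actuator side on Android Things could look like the snippet below. It is only an illustration assuming a single GPIO-driven motor; the pin name "BCM6", the one-byte command format and the class itself are made up for this example, the real server lives in the project sources.

import com.google.android.things.pio.Gpio
import com.google.android.things.pio.PeripheralManager

// Illustrative sketch: drive one claw motor from commands received over BT.
// The pin name "BCM6" and the one-byte command format are assumptions.
class ClawController {

    private val motorPin: Gpio = PeripheralManager.getInstance()
            .openGpio("BCM6")
            .apply { setDirection(Gpio.DIRECTION_OUT_INITIALLY_LOW) }

    // Called for every command byte the BT server receives from the phone.
    fun onCommand(command: Byte) {
        when (command.toInt()) {
            1 -> motorPin.value = true   // clench the claw
            0 -> motorPin.value = false  // release it
        }
    }

    fun close() = motorPin.close()
}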
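
And for item 6, since Retrofit2 is on the stack (listed below), the client side of the exchange can be pictured roughly like this; the endpoint name and the data types are illustrative, not the project's real API:

import io.reactivex.Single
import okhttp3.ResponseBody
import retrofit2.http.Body
import retrofit2.http.POST

// Illustrative request body: a recorded batch of 8-channel EMG frames.
data class TrainingBatch(val movementType: String, val frames: List<List<Float>>)

interface TrainingApi {
    // Upload the recorded data; the server trains the network and returns the model.
    @POST("train")
    fun train(@Body batch: TrainingBatch): Single<ResponseBody>
}

(Returning Single here assumes RxJava2CallAdapterFactory is plugged into the Retrofit builder.)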

Part 1: Android. This time we will look under the hood of the project on the Android side, up to the point where the data is sent to the server.

The app is called NUKLEOS (https://github.com/cyber-punk-me/nukleos)
Stack:

- Kotlin
- MVP
- Dagger2
- Retrofit2
- RxKotlin, RxAndroid

For the Raspberry:

- Android Things

At work they do not let me play with architecture, so here, at last, was a chance to fiddle with an old toy called MVP.
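
To show what that split looks like, here is a toy MVP contract; the names are made up for illustration and are not the project's real interfaces:

import android.bluetooth.BluetoothDevice
import io.reactivex.Flowable
import io.reactivex.android.schedulers.AndroidSchedulers
import io.reactivex.disposables.Disposable

// The view only knows how to display things...
interface ScannerView {
    fun addDevice(name: String, address: String)
    fun showError(message: String)
}

// ...while the presenter owns the logic and the Rx subscriptions.
class ScannerPresenter(private val view: ScannerView) {

    private var subscription: Disposable? = null

    fun startScan(devices: Flowable<BluetoothDevice>) {
        subscription = devices
                .observeOn(AndroidSchedulers.mainThread())
                .subscribe(
                        { view.addDevice(it.name ?: "unknown", it.address) },
                        { view.showError("scan failed") })
    }

    fun stop() {
        subscription?.dispose()
    }
}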

The application consists of a single activity with BottomNavigation and 4 fragments:
The first one is the "List of all available BT devices" screen.

We chose an 8-channel BT sensor that had its own API for working over BT. Unfortunately, the API turned out to be absolutely useless: it immediately offered to classify the movement as one of 6 (or so) types, but the recognition accuracy was 80%, and that was no good. Besides, we needed the actual data: the changes in bioelectric potential that occur in a person's muscles when the muscle fibers fire. For that, we had to work with the sensor directly. Its creators left a description of its communication protocol, so figuring it out did not take long. I can describe working with bare BT devices in a separate article if there is interest, but in brief it looks like this:

import android.bluetooth.BluetoothDevice
import android.bluetooth.BluetoothManager
import android.bluetooth.le.ScanCallback
import android.bluetooth.le.ScanFilter
import android.bluetooth.le.ScanResult
import android.bluetooth.le.ScanSettings
import android.content.Context
import android.os.ParcelUuid
import io.reactivex.BackpressureStrategy
import io.reactivex.Flowable
import io.reactivex.FlowableEmitter
import java.util.UUID
import java.util.concurrent.TimeUnit

class BluetoothConnector(val context: Context) {

    private val mBTLowEnergyScanner by lazy {
        (context.getSystemService(Context.BLUETOOTH_SERVICE) as BluetoothManager)
                .adapter.bluetoothLeScanner
    }

    private var mBluetoothScanCallback: BluetoothScanCallback? = null

    // Scan: emits every device found; stopScan() runs when the subscriber cancels.
    fun startBluetoothScan(serviceUUID: UUID?): Flowable<BluetoothDevice> =
            Flowable.create<BluetoothDevice>({
                mBluetoothScanCallback = BluetoothScanCallback(it)
                if (serviceUUID == null) {
                    mBTLowEnergyScanner.startScan(mBluetoothScanCallback)
                } else {
                    // Filter by service UUID so only the sensor we want shows up.
                    mBTLowEnergyScanner.startScan(
                            arrayListOf(ScanFilter.Builder().setServiceUuid(ParcelUuid(serviceUUID)).build()),
                            ScanSettings.Builder().setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY).build(),
                            mBluetoothScanCallback)
                }
            }, BackpressureStrategy.BUFFER)
                    .doOnCancel { mBTLowEnergyScanner.stopScan(mBluetoothScanCallback) }

    // Scan with timeout.
    fun startBluetoothScan(interval: Long, timeUnit: TimeUnit, serviceUUID: UUID? = null) =
            startBluetoothScan(serviceUUID).takeUntil(Flowable.timer(interval, timeUnit))

    inner class BluetoothScanCallback(private val emitter: FlowableEmitter<BluetoothDevice>) : ScanCallback() {

        override fun onScanResult(callbackType: Int, result: ScanResult?) {
            super.onScanResult(callbackType, result)
            result?.let { emitter.onNext(it.device) }
        }

        override fun onScanFailed(errorCode: Int) {
            super.onScanFailed(errorCode)
            emitter.onError(RuntimeException("BT scan failed, error code $errorCode"))
        }
    }
}

We carefully wrap the standard BT scanner in Rx and get less pain.

Next we start the scan, and thanks to Rx, in the subscription we build a list of all found devices and push them into a RecyclerView:

mFindSubscription = mFindFlowable
        ?.subscribeOn(Schedulers.io())
        ?.observeOn(AndroidSchedulers.mainThread())
        ?.subscribe({
            // onNext: another device found by the scan
            if (it !in mBluetoothStuffManager.foundBTDevicesList) {
                addSensorToList(SensorStuff(it.name, it.address))
                mBluetoothStuffManager.foundBTDevicesList.add(it)
            }
        }, {
            // onError: the scan failed
            hideFindLoader()
            showFindError()
            if (mBluetoothStuffManager.foundBTDevicesList.isEmpty()) {
                showEmptyListText()
            }
        }, {
            // onComplete: the scan finished (e.g. the timeout fired)
            hideFindLoader()
            showFindSuccess()
            if (mBluetoothStuffManager.foundBTDevicesList.isEmpty()) {
                showEmptyListText()
            }
        })

We pick one of the devices and move on to the next screen:
“Sensor Settings”

We connect to it and start streaming the sensor data using commands we prepared in advance. Fortunately, the sensor's creators described the protocol for working with the device:

object CommandList {

    // `Command` here is the project's typealias for ByteArray.

    // Stop the streaming
    fun stopStreaming(): Command {
        val command_data = 0x01.toByte()  // command type: set mode
        val payload_data = 3.toByte()     // payload length
        val emg_mode = 0x00.toByte()      // EMG streaming off
        val imu_mode = 0x00.toByte()      // IMU off
        val class_mode = 0x00.toByte()    // built-in classifier off
        return byteArrayOf(command_data, payload_data, emg_mode, imu_mode, class_mode)
    }

    // Start streaming (with filter)
    fun emgFilteredOnly(): Command {
        val command_data = 0x01.toByte()  // command type: set mode
        val payload_data = 3.toByte()     // payload length
        val emg_mode = 0x02.toByte()      // filtered EMG streaming on
        val imu_mode = 0x00.toByte()      // IMU off
        val class_mode = 0x00.toByte()    // built-in classifier off
        return byteArrayOf(command_data, payload_data, emg_mode, imu_mode, class_mode)
    }
    .....

Working with the device is also carefully wrapped in Rx, so it does not hurt.
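
Just to illustrate, sending one of those commands could look roughly like the sketch below. The UUIDs are placeholders (the real values come from the sensor's specification), and a production version would also wait for the onCharacteristicWrite callback rather than fire and forget:

import android.bluetooth.BluetoothGatt
import io.reactivex.Completable
import java.util.UUID

// Placeholder UUIDs -- the real ones come from the sensor's protocol spec.
val CONTROL_SERVICE_UUID: UUID = UUID.fromString("0000aaaa-0000-1000-8000-00805f9b34fb")
val COMMAND_CHAR_UUID: UUID = UUID.fromString("0000bbbb-0000-1000-8000-00805f9b34fb")

// Wrap a one-shot characteristic write into a Completable.
fun BluetoothGatt.sendCommand(command: ByteArray): Completable = Completable.fromAction {
    val characteristic = getService(CONTROL_SERVICE_UUID)
            ?.getCharacteristic(COMMAND_CHAR_UUID)
            ?: throw IllegalStateException("command characteristic not found")
    characteristic.value = command
    if (!writeCharacteristic(characteristic)) {
        throw RuntimeException("writeCharacteristic() rejected the command")
    }
}

// Usage: gatt.sendCommand(CommandList.emgFilteredOnly()).subscribe()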

Naturally, the sensor returns raw ByteArrays, so we had to hack together a converter; the sensors run at 200 Hz... I can describe it in detail if there is interest (or just look at the code), but in the end we work with a rather large volume of data in two ways:

1 - We need to draw the curve for each sensor. Of course, there is no sense in drawing ABSOLUTELY all the data: on a mobile screen the eye cannot follow 200 changes per second on each channel. So we will not take everything.

2 - We need to work with the full volume of data when it comes to training or recognition.

For these needs Rx, with all its filters, is perfect.
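
As a hedged sketch of how one 200 Hz stream can serve both needs (the frame type and the two callbacks are assumptions; the real wiring is in the repo):

import io.reactivex.Flowable
import io.reactivex.android.schedulers.AndroidSchedulers
import io.reactivex.disposables.Disposable
import java.util.concurrent.TimeUnit

// emgFrames: decoded 8-channel frames arriving at 200 Hz.
fun wireUpStreams(
        emgFrames: Flowable<FloatArray>,
        drawFrame: (FloatArray) -> Unit,
        uploadBatch: (List<FloatArray>) -> Unit
): List<Disposable> {
    // 1) Charts: no point painting 200 points a second,
    //    so sample the stream down to ~25 frames per second.
    val chart = emgFrames
            .sample(40, TimeUnit.MILLISECONDS)
            .observeOn(AndroidSchedulers.mainThread())
            .subscribe { drawFrame(it) }

    // 2) Training/recognition: keep every frame, batched one second at a time.
    val training = emgFrames
            .buffer(200)
            .subscribe { uploadBatch(it) }

    return listOf(chart, training)
}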

We had to write the charts ourselves; if you are interested, see PowerfullChartsView in the views folder.
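
Stripped to the bone, the idea behind such a view is roughly the toy sketch below (not the actual PowerfullChartsView):

import android.content.Context
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.Path
import android.util.AttributeSet
import android.view.View

// Toy multi-channel chart: one horizontal lane per channel, a ring buffer of points.
class SimpleEmgChartView @JvmOverloads constructor(
        context: Context, attrs: AttributeSet? = null
) : View(context, attrs) {

    private val channels = 8
    private val window = 100                       // points kept per channel
    private val data = Array(channels) { FloatArray(window) }
    private var cursor = 0
    private val path = Path()
    private val paints = Array(channels) { i ->
        Paint(Paint.ANTI_ALIAS_FLAG).apply {
            style = Paint.Style.STROKE
            strokeWidth = 3f
            color = Color.HSVToColor(floatArrayOf(i * 360f / channels, 1f, 1f))
        }
    }

    // Push one 8-channel sample, each value normalized to -1..1.
    fun addSample(sample: FloatArray) {
        for (c in 0 until channels) data[c][cursor] = sample[c]
        cursor = (cursor + 1) % window
        invalidate()
    }

    override fun onDraw(canvas: Canvas) {
        val laneHeight = height / channels.toFloat()
        val dx = width / (window - 1).toFloat()
        for (c in 0 until channels) {
            val mid = laneHeight * (c + 0.5f)
            path.reset()
            for (i in 0 until window) {
                val v = data[c][(cursor + i) % window]  // oldest to newest
                val x = i * dx
                val y = mid - v * laneHeight / 2
                if (i == 0) path.moveTo(x, y) else path.lineTo(x, y)
            }
            canvas.drawPath(path, paints[c])
        }
    }
}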

And now a little video:


The video shows Kirill working with the system as a whole, including the model. For now the model lives on the server; in the future it will of course run on the device, which will significantly speed up the response.

Write in the comments which aspects you find interesting and which should be covered in more detail. Naturally, we keep working on the project and are open to your suggestions.

The whole project is on GitHub: https://github.com/cyber-punk-me/nukleos
