Making the camera in Qt work on Android



Qt is already a good environment for developing mobile applications, but a few corners remain unfinished. For example, if you try to run the standard camera example, it works on Windows but not on Android. At the same time, the examples that use the camera via QML work just fine. So camera support on Android is implemented, but there is no full access to it. And what if we want the freedom of direct access to the video stream?

Studying the source code of the QtMultimedia module made it clear that the limitations on working with the camera exist to hide crutches. And those crutches had to be installed to provide hardware-accelerated output through OpenGL. Nevertheless, full access to the camera's video stream is possible.

Before the explanation begins, a warning: you do not have to do everything described below just to get individual pictures. You can simply use the camera via QML and write your own component on top of it to capture individual frames. How to do that is written here.

In order not to write everything from scratch, let's take the standard Qt example called "Camera Example" and make it work. To display the image it uses an object of the QCameraViewfinder class. We will write our own instead, and for the output we will have to use OpenGL.
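For orientation, the wiring might end up looking roughly like this (a sketch; CameraSurface and CameraSurfaceWidget are the two classes we are about to write):

#include <QCamera>

QCamera *camera = new QCamera;
CameraSurface *surface = new CameraSurface;                     // receives the frames
CameraSurfaceWidget *widget = new CameraSurfaceWidget(surface); // draws them via OpenGL
widget->show();

camera->setViewfinder(surface);  // QCamera accepts a QAbstractVideoSurface directly
camera->start();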

To write custom output classes for frames received from media objects, Qt provides the abstract class QAbstractVideoSurface with virtual functions through which the interaction takes place. We create our own class based on it, responsible for receiving frames, and call it CameraSurface. Frame output will be handled by the CameraSurfaceWidget class, inherited from QOpenGLWidget. It would be possible to combine these two classes, but inheriting from both QAbstractVideoSurface and QOpenGLWidget would mean inheriting from QObject twice, and that is not allowed.
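A sketch of what the two declarations might look like (the member and slot names here are my own, apart from the Qt virtuals):

#include <QAbstractVideoSurface>
#include <QOpenGLWidget>
#include <QVideoFrame>
#include <QMutex>

class CameraSurface : public QAbstractVideoSurface
{
    Q_OBJECT
public:
    explicit CameraSurface(QObject *parent = nullptr);

    // Qt asks which formats we accept; the camera picks one of them
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType type) const override;
    // Called with every new frame of the video stream
    bool present(const QVideoFrame &frame) override;

    QVideoFrame frame() const;           // thread-safe copy of the last frame

public slots:
    void scheduleOpenGLContextUpdate();  // hands our GL context to the surface

private slots:
    void updateOpenGLContext();          // writes the hidden "GLContext" property

private:
    QVideoFrame _frame;
    mutable QMutex _mutex;
};

class CameraSurfaceWidget : public QOpenGLWidget
{
    Q_OBJECT
public:
    explicit CameraSurfaceWidget(CameraSurface *surface, QWidget *parent = nullptr);

protected:
    void paintGL() override;

private:
    CameraSurface *_surface;
};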

You can see the entire implementation below; here I will just describe the key points. And just in case, you can learn more about how to work with the QAbstractVideoSurface class here.

We receive each new frame in the function bool CameraSurface::present(const QVideoFrame &frame). The frame parameter is that very new frame of our video stream. The data coming from the camera can arrive as an array (this happens on Windows or Symbian) or as a texture (on Android). If it is a texture, do not try to read it right away. Calling frame.handle() looks like it merely returns a texture index, but in fact it also performs a tricky lazy initialization of resources tied to the OpenGL context of the calling thread. And this function is called not in your thread, which means that OpenGL context will be of no use here. And don't let the const keyword in the declaration fool you: the data inside is insidiously marked mutable. Just copy the frame and read its data when drawing.
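A minimal sketch of present() under these constraints (the mutex and the frame() accessor are my own additions):

bool CameraSurface::present(const QVideoFrame &frame)
{
    // Runs in the camera's thread, not ours: don't touch frame.handle() here,
    // or the texture gets bound to the wrong OpenGL context. Just keep a copy.
    QMutexLocker lock(&_mutex);
    _frame = frame;
    return true;
}

QVideoFrame CameraSurface::frame() const
{
    // Read (and map or handle) the copy later, from the widget's GL context
    QMutexLocker lock(&_mutex);
    return _frame;
}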

But that is not all you need to know. When attached to a camera, our CameraSurface gets a hidden "GLContext" property, and you are expected to write your OpenGL context into it. This is best done in the thread of the CameraSurface object itself, that is, via a slot call through Qt's signals-and-slots machinery. After that, an event announcing that "GLContext" has been written must be sent to the object stored in the "_q_GLThreadCallback" property, and this event must have the QEvent::User type. In theory that is a custom event type, but you were never supposed to know about these crutches at all, so don't give it a second thought. On Windows everything works without any of this, but if you skip it on Android, the camera simply never starts sending frames.
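What that boils down to in code might be the following (a sketch; only the "GLContext" property name comes from Qt's internals, the two method names are my own):

#include <QOpenGLContext>

// Called from paintGL(), where our widget's OpenGL context is current
void CameraSurface::scheduleOpenGLContextUpdate()
{
    // AutoConnection: runs directly when the surface lives in this thread,
    // otherwise the call is queued into the surface's own thread
    QMetaObject::invokeMethod(this, "updateOpenGLContext");
}

void CameraSurface::updateOpenGLContext()
{
    // The hidden property that Qt's Android video backend reads
    setProperty("GLContext",
                QVariant::fromValue<QObject *>(QOpenGLContext::currentContext()));
}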

In short, the drawing code will be something like this:

void CameraSurfaceWidget::paintGL()
{
    if (!_surface->isActive()) {
        // We are not receiving frames from the camera yet, so we need to
        // hand our OpenGL context over to the surface
        _surface->scheduleOpenGLContextUpdate();

        // Where do we send the event saying everything is ready
        // to accept the video stream?
        QObject *glThreadCallback = _surface->property("_q_GLThreadCallback").value<QObject *>();
        if (glThreadCallback) {
            QEvent event(QEvent::User); // an event with the user type code
            glThreadCallback->event(&event); // now deliver it
        }
        // This part above is not needed on Windows. But, importantly,
        // it does not break anything there either.
    } else {
        QVideoFrame frame = _surface->frame();
        // drawing the frame
    }
}
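For completeness, the "drawing the frame" branch might be shaped like this, with drawFrame() as a hypothetical helper of mine called from the else-branch above (a sketch; the actual quad/shader drawing is omitted):

void CameraSurfaceWidget::drawFrame(QVideoFrame frame)
{
    if (frame.handleType() == QAbstractVideoBuffer::GLTextureHandle) {
        // Android: the handle is a GL texture id, valid in our context here
        GLuint textureId = frame.handle().toUInt();
        Q_UNUSED(textureId);
        // ...bind it and draw a textured quad with a shader program...
    } else if (frame.map(QAbstractVideoBuffer::ReadOnly)) {
        // Windows and the like: raw pixels to upload into a texture;
        // frame.bits(), frame.bytesPerLine() and frame.pixelFormat() describe them
        frame.unmap();
    }
}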

As a result, we get the ability to process the stream on Android with the same interface as on Windows. By the way, the data can be pulled out of the frame's texture using a framebuffer object and glReadPixels (glGetTexImage does not exist in OpenGL ES). And this is not the only way: you can also receive frames through QVideoProbe, but then everything is apparently processed on the CPU, because it lags wildly. So it's better to just forget about that option.
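A sketch of that readback path, assuming the frame handle is an ordinary GL_TEXTURE_2D and using hypothetical textureId/width/height variables:

#include <QOpenGLContext>
#include <QOpenGLFunctions>

QOpenGLFunctions *f = QOpenGLContext::currentContext()->functions();

// Attach the frame's texture to a temporary FBO...
GLuint fbo = 0;
f->glGenFramebuffers(1, &fbo);
f->glBindFramebuffer(GL_FRAMEBUFFER, fbo);
f->glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_TEXTURE_2D, textureId, 0);

// ...and read it back, since glGetTexImage is unavailable on OpenGL ES
QByteArray pixels(width * height * 4, 0);
f->glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());

f->glBindFramebuffer(GL_FRAMEBUFFER, 0);
f->glDeleteFramebuffers(1, &fbo);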

More Qt oddities
And one more strange thing to finish with. If the frame format is Format_RGB32, the color channels actually come in BGR order. If the format is Format_BGR32, they come in RGB order. Something is mixed up in Qt.
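So when reading a mapped frame byte by byte, the swap has to be compensated for, roughly like this (a sketch for a little-endian machine, with frame being a copy of the current video frame):

if (frame.map(QAbstractVideoBuffer::ReadOnly)) {
    const uchar *p = frame.bits();
    // Despite the name Format_RGB32, a pixel's bytes arrive as B, G, R, X
    uchar b = p[0], g = p[1], r = p[2];
    frame.unmap();
    // ...use r, g, b...
}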

You can download the corrected example here.
