Motivation:
Android's SurfaceTexture is used by all Android classes that need to display a stream of frames; it can be used to display the output of the media player, the camera, etc.
You can also use it in combination with other players: OpenMAX, VLC, etc. (well, VLC and other C++ players use MediaCodec) to decode the images directly into our SurfaceTexture. As you can see, it's pretty useful.
In this tutorial we're going to see how easy it is to create a media player using this QML item and Android's MediaPlayer.
We're going to drive Android's MediaPlayer directly through JNI, to exercise our JNI skills ;-).
Please start by creating a simple Qt Quick (Controls) application.
Step I
Add the QtAndroidExtras module to your .pro file (QT += androidextras). We're going to use QtAndroidExtras a lot.
Step II
First, let's create a new class, QAndroidMediaPlayer, which inherits QObject.
Then let's create our Android MediaPlayer object in the QAndroidMediaPlayer constructor:
qandroidmediaplayer.cpp
m_mediaPlayer is declared as: QAndroidJniObject m_mediaPlayer
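A minimal sketch of that constructor (the exact code in the linked repository may differ slightly):

```cpp
// qandroidmediaplayer.cpp (sketch)
#include "qandroidmediaplayer.h"

QAndroidMediaPlayer::QAndroidMediaPlayer(QObject *parent)
    : QObject(parent)
    // Construct an android.media.MediaPlayer object through JNI.
    , m_mediaPlayer("android/media/MediaPlayer")
{
}
```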
In QAndroidMediaPlayer's destructor we must call the release method.
Because we don't know which state m_mediaPlayer is in, we call its stop and reset methods before releasing it.
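In code, that looks roughly like this:

```cpp
QAndroidMediaPlayer::~QAndroidMediaPlayer()
{
    // We don't know which state the player is in, so stop and reset it
    // before releasing its resources.
    m_mediaPlayer.callMethod<void>("stop");
    m_mediaPlayer.callMethod<void>("reset");
    m_mediaPlayer.callMethod<void>("release");
}
```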
The next task is to implement a playFile Q_INVOKABLE method.
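A sketch of playFile; error handling (for example, checking for Java exceptions after prepare) is omitted for brevity:

```cpp
void QAndroidMediaPlayer::playFile(const QString &file)
{
    // Start from a clean state, hand the path to the Java MediaPlayer,
    // prepare it (synchronously) and start playback.
    m_mediaPlayer.callMethod<void>("reset");
    m_mediaPlayer.callMethod<void>("setDataSource",
                                   "(Ljava/lang/String;)V",
                                   QAndroidJniObject::fromString(file).object<jstring>());
    m_mediaPlayer.callMethod<void>("prepare");
    m_mediaPlayer.callMethod<void>("start");
}
```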
At this point we have a working media player that can play any media file supported by the Android codecs. The only problem is that we don't see any images when playing video files :) . Let's fix that!
First and foremost, let's add a new property to QAndroidMediaPlayer that sets a Surface on Android's MediaPlayer object.
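A sketch of the property setter; the property name is illustrative, and it assumes the QSurfaceTexture item we'll build in Step III exposes its android.graphics.SurfaceTexture through a surfaceTexture() accessor:

```cpp
void QAndroidMediaPlayer::setVideoOut(QSurfaceTexture *videoOut)
{
    if (m_videoOut == videoOut)
        return;
    m_videoOut = videoOut;

    if (videoOut) {
        // Wrap the SurfaceTexture in an android.view.Surface (this
        // constructor needs API level 14, see Step IV) and hand it
        // to the MediaPlayer.
        QAndroidJniObject surface("android/view/Surface",
                                  "(Landroid/graphics/SurfaceTexture;)V",
                                  videoOut->surfaceTexture().object());
        m_mediaPlayer.callMethod<void>("setSurface",
                                       "(Landroid/view/Surface;)V",
                                       surface.object());
    }
    emit videoOutChanged();
}
```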
Step III
Implement a QQuickItem which wraps an Android SurfaceTexture object. This item will be used to display the frames that are pushed into the SurfaceTexture. Basically, we rewrite Google's MyGLSurfaceView.java in Qt.
We start by creating a new class, QSurfaceTexture, which inherits QQuickItem and overrides the updatePaintNode method.
Let's see the header file:
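A sketch of the header (the accessor name is an assumption; it just needs to expose m_surfaceTexture to the media player):

```cpp
// qsurfacetexture.h (sketch)
#pragma once

#include <QQuickItem>
#include <QAndroidJniObject>

class QSurfaceTexture : public QQuickItem
{
    Q_OBJECT
public:
    QSurfaceTexture(QQuickItem *parent = nullptr);
    ~QSurfaceTexture();

    // Gives QAndroidMediaPlayer::setVideoOut access to the wrapped
    // android.graphics.SurfaceTexture object.
    QAndroidJniObject &surfaceTexture() { return m_surfaceTexture; }

    // Called on the render thread to create/update the scene graph node.
    QSGNode *updatePaintNode(QSGNode *node, UpdatePaintNodeData *data) override;

private:
    QAndroidJniObject m_surfaceTexture;
    uint m_textureId = 0;
};
```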
There's no magic here: we need the surfaceTexture() accessor so that QAndroidMediaPlayer::setVideoOut can reach m_surfaceTexture, and we also declare our m_textureId.
Now, let's see the implementation file:
We need to set the ItemHasContents flag in the constructor and, in the destructor, delete the texture (if it was created).
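A minimal sketch:

```cpp
#include "qsurfacetexture.h"

#include <QtGui/qopengl.h> // brings in the GLES functions on Android

QSurfaceTexture::QSurfaceTexture(QQuickItem *parent)
    : QQuickItem(parent)
{
    // Tell the scene graph that this item paints something.
    setFlag(ItemHasContents);
}

QSurfaceTexture::~QSurfaceTexture()
{
    // Delete the GL texture if updatePaintNode() created one
    // (assumes a current OpenGL context).
    if (m_textureId)
        glDeleteTextures(1, &m_textureId);
}
```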
Now, let's focus on the updatePaintNode method. We'll check SurfaceTextureListener.java in a moment; SurfaceTextureNode is a custom QSGGeometryNode, and we'll see its implementation right after SurfaceTextureListener.
The comments in the code below should be enough to understand the rest.
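A sketch of updatePaintNode(); it assumes the Java SurfaceTextureListener from the next step takes the native QSurfaceTexture pointer as a long in its constructor:

```cpp
#ifndef GL_TEXTURE_EXTERNAL_OES
#define GL_TEXTURE_EXTERNAL_OES 0x8D65
#endif

QSGNode *QSurfaceTexture::updatePaintNode(QSGNode *node, UpdatePaintNodeData *)
{
    SurfaceTextureNode *n = static_cast<SurfaceTextureNode *>(node);
    if (!n) {
        // First call: create the external texture ...
        glGenTextures(1, &m_textureId);
        glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_textureId);
        glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        // ... wrap it in an android.graphics.SurfaceTexture ...
        m_surfaceTexture = QAndroidJniObject("android/graphics/SurfaceTexture",
                                             "(I)V", jint(m_textureId));

        // ... and install the Java listener that tells us when a new frame
        // was decoded (SurfaceTextureListener.java, see below).
        m_surfaceTexture.callMethod<void>(
            "setOnFrameAvailableListener",
            "(Landroid/graphics/SurfaceTexture$OnFrameAvailableListener;)V",
            QAndroidJniObject("com/kdab/android/SurfaceTextureListener",
                              "(J)V", jlong(this)).object());

        // SurfaceTextureNode is the custom QSGGeometryNode shown below.
        n = new SurfaceTextureNode(m_surfaceTexture, m_textureId);
    }

    // Stretch the node's quad over the item's bounding rectangle.
    QSGGeometry::updateTexturedRectGeometry(n->geometry(), boundingRect(),
                                            QRectF(0, 0, 1, 1));
    n->markDirty(QSGNode::DirtyGeometry | QSGNode::DirtyMaterial);
    return n;
}
```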
Now let's check SurfaceTextureListener.java.
You'll need to add this file to your project in the android/src/com/kdab/android folder. Also, make sure the ANDROID_PACKAGE_SOURCE_DIR qmake variable points to your android sources folder.
This class is needed to call a native function that signals when a new frame has been decoded. Let's check the C/C++ implementation of that function.
This function is called from the decoder thread, therefore we use QMetaObject::invokeMethod to post a call to the update method on QSurfaceTexture's thread.
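A sketch of that native function, assuming the listener passes the QSurfaceTexture pointer as a jlong; the function name and its JNI registration (for example via RegisterNatives in JNI_OnLoad) have to match the Java class:

```cpp
#include <jni.h>

#include "qsurfacetexture.h"

// Called by SurfaceTextureListener.onFrameAvailable() on the decoder thread.
static void frameAvailable(JNIEnv * /*env*/, jobject /*thiz*/, jlong nativePtr)
{
    // We are not on the QSurfaceTexture's thread, so queue a call to
    // QQuickItem::update() instead of calling it directly.
    QMetaObject::invokeMethod(reinterpret_cast<QSurfaceTexture *>(nativePtr),
                              "update", Qt::QueuedConnection);
}
```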
Next, let's take a look at the SurfaceTextureNode implementation. SurfaceTextureNode inherits QSGGeometryNode, which is needed to render the texture content using a geometry and a material. As you can see, the texture is not bound as GL_TEXTURE_2D but as GL_TEXTURE_EXTERNAL_OES (required by Android's MediaCodec/MediaPlayer/Camera/etc.), so we'll need a custom shader to draw it (otherwise QSGSimpleTextureNode would be the more suitable way to display a texture).
This class creates and sets our SurfaceTextureShader (the material), then, on every frame, it updates the texture image and fetches its transform matrix.
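A sketch of the node. In the actual source, the State structure and SurfaceTextureShader (shown in the next snippet) have to be declared before this class, and GL_TEXTURE_EXTERNAL_OES is the same define used in the updatePaintNode snippet:

```cpp
#include <QAndroidJniEnvironment>
#include <QMatrix4x4>
#include <QSGGeometryNode>
#include <QSGSimpleMaterial>

class SurfaceTextureNode : public QSGGeometryNode
{
public:
    SurfaceTextureNode(const QAndroidJniObject &surfaceTexture, uint textureId)
        : m_surfaceTexture(surfaceTexture)
        , m_geometry(QSGGeometry::defaultAttributes_TexturedPoint2D(), 4)
        , m_textureId(textureId)
    {
        setGeometry(&m_geometry);

        // The material is a QSGSimpleMaterial<State> created by our shader.
        setMaterial(SurfaceTextureShader::createMaterial());
        setFlag(OwnsMaterial);

        // We want preprocess() to run before every frame is rendered.
        setFlag(UsePreprocess);
    }

    void preprocess() override
    {
        auto *material = static_cast<QSGSimpleMaterial<State> *>(this->material());

        // Bind the external texture and latch the newest decoded frame.
        glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_textureId);
        m_surfaceTexture.callMethod<void>("updateTexImage");

        // Fetch the 4x4 texture transform matrix provided by SurfaceTexture.
        QAndroidJniEnvironment env;
        jfloatArray jmatrix = env->NewFloatArray(16);
        m_surfaceTexture.callMethod<void>("getTransformMatrix", "([F)V", jmatrix);
        float matrix[16];
        env->GetFloatArrayRegion(jmatrix, 0, 16, matrix);
        env->DeleteLocalRef(jmatrix);

        // Android hands back the matrix in column-major (OpenGL) order,
        // QMatrix4x4 expects row-major, hence the transpose.
        material->state()->uSTMatrix = QMatrix4x4(matrix).transposed();
        markDirty(QSGNode::DirtyMaterial);
    }

private:
    QAndroidJniObject m_surfaceTexture;
    QSGGeometry m_geometry;
    uint m_textureId;
};
```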
Next, let's check the SurfaceTextureShader class and the State structure. SurfaceTextureShader is based on QSGSimpleMaterialShader.
In the State structure we only need the texture transform matrix. This matrix is updated by the SurfaceTextureNode::preprocess method and used, below, by the SurfaceTextureShader::updateState method.
In SurfaceTextureShader we have (almost) the same vertex & fragment shaders as in MyGLSurfaceView.java; the only difference is the Qt shader variables (qt_Matrix and qt_Opacity).
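A sketch of both; the attribute and uniform names follow the ones used in MyGLSurfaceView.java (aPosition, aTextureCoord, uSTMatrix, sTexture):

```cpp
#include <QMatrix4x4>
#include <QSGSimpleMaterialShader>

// The only per-material state: the SurfaceTexture transform matrix,
// filled in by SurfaceTextureNode::preprocess().
struct State {
    QMatrix4x4 uSTMatrix;
};

class SurfaceTextureShader : public QSGSimpleMaterialShader<State>
{
    QSG_DECLARE_SIMPLE_SHADER(SurfaceTextureShader, State)
public:
    const char *vertexShader() const override {
        return
            "uniform highp mat4 qt_Matrix;                     \n"
            "uniform highp mat4 uSTMatrix;                     \n"
            "attribute highp vec4 aPosition;                   \n"
            "attribute highp vec4 aTextureCoord;               \n"
            "varying highp vec2 vTextureCoord;                 \n"
            "void main() {                                     \n"
            "  gl_Position = qt_Matrix * aPosition;            \n"
            "  vTextureCoord = (uSTMatrix * aTextureCoord).xy; \n"
            "}";
    }

    const char *fragmentShader() const override {
        return
            "#extension GL_OES_EGL_image_external : require                    \n"
            "varying highp vec2 vTextureCoord;                                 \n"
            "uniform samplerExternalOES sTexture;                              \n"
            "uniform lowp float qt_Opacity;                                    \n"
            "void main() {                                                     \n"
            "  gl_FragColor = texture2D(sTexture, vTextureCoord) * qt_Opacity; \n"
            "}";
    }

    // Must match the attribute order of the node's geometry.
    QList<QByteArray> attributes() const override {
        return QList<QByteArray>() << "aPosition" << "aTextureCoord";
    }

    void updateState(const State *state, const State *) override {
        // Upload the SurfaceTexture transform matrix every frame.
        program()->setUniformValue(m_uSTMatrixLoc, state->uSTMatrix);
    }

    void resolveUniforms() override {
        m_uSTMatrixLoc = program()->uniformLocation("uSTMatrix");
        // The external texture is always bound to texture unit 0.
        program()->setUniformValue("sTexture", 0);
    }

private:
    int m_uSTMatrixLoc = -1;
};
```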
Step IV
Put everything together and enjoy our MediaPlayer!
The main.cpp changes:
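A sketch of the type registration; the import URI and QML type names here are just examples, the repository may use different ones:

```cpp
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QtQml>

#include "qandroidmediaplayer.h"
#include "qsurfacetexture.h"

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Expose our two C++ types to QML.
    qmlRegisterType<QSurfaceTexture>("com.kdab.android", 1, 0, "SurfaceTexture");
    qmlRegisterType<QAndroidMediaPlayer>("com.kdab.android", 1, 0, "MediaPlayer");

    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));

    return app.exec();
}
```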
The path to the file is hard-coded in the example, so make sure you push a video file to /sdcard/testfile.mp4, or change that value.
The last thing we need to do is set the minimum API level in your AndroidManifest.xml file to 14, because the Surface(SurfaceTexture) constructor we need was added in API level 14.
Now, the only thing left to do is tap the screen to play that video file!
You can find the full source code of this article here: https://github.com/KDAB/android
17 Comments
12 - Apr - 2016
Anton Kudryavtsev
Thanks! Is there a way to show video on a SurfaceView, without a texture? It's useful for low-end devices and might help to resolve https://bugreports.qt.io/browse/QTBUG-36021
14 - Apr - 2016
BogDan Vatra
Not very easy. You can put a SurfaceView on top of the other Qt controls and pass it to a media player, but you'll not be able to draw anything on top.
18 - May - 2016
Sandro Frenzel
Thanks for that! Does this implementation use Android hardware acceleration?
18 - May - 2016
BogDan Vatra
Yep, it's using hardware acceleration if available.
19 - May - 2016
Sandro Frenzel
Could you please have a look at the following issue:
https://github.com/KDAB/android/issues/1
I have posted the stacktrace of the crash when trying to start the media file.
23 - May - 2016
BogDan Vatra
I'll check it this week
23 - Jan - 2018
Luka
Could you please take a look at this issue: https://github.com/KDAB/android/issues/3
I am trying to make this work using loader.
25 - Jan - 2018
Skroopa
Hi, BogDan Vatra! Great example. I have a problem with creating two SurfaceTexture (QML) elements with different movies to play. When I start to play one of them I see only one movie at a time, and only one GL_TEXTURE_EXTERNAL_OES is shown on the screen (the second texture is also mirrored vertically and horizontally). I think in "preprocess" there must be some functions or manipulations with the texture. I tried to unbind the texture, call updateTexImage and then bind the texture again, with no changes. Any possibility to resolve the problem above? Thanks!
29 - Mar - 2018
Yurii Oleksyshyn
Hi Bogdan, your solution is very interesting, great job. I have a question: can this example be extended in a way that any Android view can be rendered as a texture and displayed in the QML tree? I am thinking of using your approach, but in a slightly different way. I want to try to reimplement onDraw and render onto a Surface that is created from a SurfaceTexture created in C++. When onDraw is called I will invokeMethod to notify the Qt QML thread that it needs to redraw. What do you think about this approach?
6 - Mar - 2019
Barry
Hi Yurii, I have the same idea as you. Can you share your progress with me?
6 - Mar - 2019
BogDan Vatra
Yup, I think with some effort it can be done.
6 - Mar - 2019
Barry
Thanks very much
15 - Mar - 2019
David RAIMOND
Hi Bogdan. Thanks for that great job. That works fine with a local mp4 file and with an mp4 file located on a remote http server, but I have the following error (please have a look at the trace at the bottom of this message) when I try a local network camera RTSP flow (the screen remains black). The "setDataSource" MediaPlayer function documentation specifies that RTSP URLs are supported. I tried to use the "prepareAsync" call instead of "prepare" but the result is the same. Note that I also tried adding the INTERNET permission in the manifest. With another application from the Play Store, the flow is correctly displayed. My environment: Win10, Qt 5.12, NDK r10e, Android 8.1.0, gcc armv7a. Any idea or suggestion (increase the log level or... something that is missing)? Thanks for your help. Regards, David.
16 - Mar - 2019
David RAIMOND
Hi Bogdan, Sorry, I made a small mistake in my message. The Qt version was 5.11 (and not 5.12) on my laptop. I tried your example on my desktop (difference is Qt5.12/Clang on my desktop vs 5.11/GCC on my laptop). When I build with my desktop for the same Android 8 device, the RTSP flow is correctly displayed. I tried to build with my desktop after reading the bug tracker QTBUG-50539. Great job & thank you very much. Regards. David.
28 - Apr - 2023
Kalileo
Unfortunately this does not work anymore in Qt 6. There are no more QSGSimpleMaterial and QSGSimpleMaterialShader classes, and classes such as QSGMaterialShader are quite different from what they were in Qt 5.
Would it be possible to update this great example to Qt 6?
9 - May - 2023
BogDan Vatra
Sadly, at the moment I have no time to update the example.