Live streaming of stereo video to VR glasses (Oculus Go)
There will be no lengthy introduction; let's get straight to the point.
So, we have a stereo camera that can deliver H264 video over various protocols, and Oculus Go glasses. How do we watch a live stereo stream from the camera in the VR glasses? Preferably with minimal delay and locally, so that YouTube and other RTMP video services are out of the picture.
Looking ahead, here is what came of it: first, playback of a previously recorded stereo video file, then playback of a live stream from the StereoPi (MPEG-TS over UDP).
The stereo camera I use is the StereoPi, so I will give specific examples for it. It is essentially an ordinary Raspberry Pi with two cameras, so the examples below can be tried on a regular Raspberry Pi if you really want to; you will, however, need to install the StereoPi firmware.
First of all, there was an attempt to make a regular Android application that plays the camera stream full screen and to sideload it onto the Oculus (via adb).
After some fiddling with the manifest, the glasses agreed to treat this application as native. It appeared under “Unknown Sources” in the library, launched, and showed everything it was supposed to, but there was a problem: head movements were not taken into account, the camera video was simply drawn full screen in the glasses. The stereo effect was there, yes, but as soon as you moved your head even slightly, your brain started going haywire, which made for a very, very uncomfortable feeling.
In case you want to try it, here is the .apk: StereoPi for Oculus Go. The archive also contains adb, so you can push it to the glasses right away. Just run:
adb install StereoPi.apk
After that, go to Library -> Unknown Sources; the com.virt2real.stereopi application should appear there.
Launch it, and if the StereoPi is on the same local network as the glasses, you will immediately see the stereo image from the camera.
But this is no good... I want a normal native video app for the Oculus, with a stationary screen that does not lurch when you move your head. I am not ready to learn Unity for the Oculus yet, so I got the idea of trying the video player applications already in the Oculus store. I usually watch 3D movies in Skybox, so that is what I tried.
Besides the usual playback of media files from the built-in storage and from network devices, Skybox turned out to have an interesting “AirScreen” feature. You can install the Skybox application on a Windows computer (or a Mac), feed it video files, and then watch those files in the glasses. In other words, the desktop application is a video server and the glasses are a client. I could not find the communication protocol documented anywhere, so I had to break out tcpdump.
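For reference, a capture along these lines shows the discovery traffic and its payloads (a sketch; the interface name eth0 is an assumption for your setup):

# Print UDP payloads as ASCII (-A), skip name resolution (-n);
# eth0 is an assumed interface name — substitute your own
sudo tcpdump -i eth0 -A -n udp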
After a short dig, it turned out that Skybox uses UDP broadcast messages to discover a server on the LAN. The message looks something like this:
{"command":"search","project":"direwolf","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","deviceType":"vr","udpPort":"6881"}
All messages are JSON, which is very convenient.
To this message we need to send a reply to the sender's host, on the port specified in the message, i.e. 6881:
{"udp":true,"project":"direwolf server","command":"searchResult","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","computerId":"53709de962eba2f9695c8a926562486c","computerName":"STEREO-PI","ip":"192.168.1.51","ips":["192.168.1.51"],"port":6888}
Here we specify our host and the port on which our WebSocket server is running. All further communication goes over WebSockets.
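For illustration, a minimal sketch of this discovery responder in Node.js (using the built-in dgram module; the values mirror the messages above, and the port the broadcast arrives on is an assumption — check it in your own tcpdump capture):

const dgram = require('dgram');
const sock = dgram.createSocket('udp4');

sock.on('message', (msg, rinfo) => {
  const req = JSON.parse(msg.toString());
  if (req.command !== 'search') return;
  const reply = {
    udp: true,
    project: 'direwolf server',
    command: 'searchResult',
    deviceId: req.deviceId,
    computerId: '53709de962eba2f9695c8a926562486c',
    computerName: 'STEREO-PI',
    ip: '192.168.1.51',            // host where our server runs
    ips: ['192.168.1.51'],
    port: 6888                     // port of our WebSocket server
  };
  // Reply to the sender, on the udpPort named in its request (6881)
  sock.send(JSON.stringify(reply), Number(req.udpPort), rinfo.address);
});

sock.bind(6881); // assumed broadcast port — verify with tcpdump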
For example, the first message over WebSockets will be something like this:
{"command":"addDevice","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","deviceName":"Oculus Pacific","deviceType":"vr","showLoginCode":true}
We answer it by echoing the same message back:
{"command":"addDevice","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","deviceName":"Oculus Pacific","deviceType":"vr","showLoginCode":true}
After that, Skybox in the glasses shows our StereoPi. Next comes a bunch of requests that need answers — the playlist contents, for example.
Playlist example for Skybox
[{id: 'livestream-rtsp',
  name: 'Live Stream RTSP',
  duration: 0,
  size: 0,
  url: 'rtsp://192.168.1.51:554/h264',
  thumbnail: 'http://192.168.1.51/thumbnail/livestream.png',
  thumbnailWidth: 186,
  thumbnailHeight: 120,
  lastModified: 1,
  defaultVRSetting: 1,
  userVRSetting: 2,
  width: 1280,
  height: 720,
  orientDegree: '0',
  subtitles: [],
  ratioTypeFor2DScreen: 'default',
  rotationFor2DScreen: 0,
  exists: true,
  isBadMedia: false,
  addedTime: 1},
 {id: 'livestream-mpegts',
  name: 'Live Stream MPEG-TS',
  duration: 0,
  size: 0,
  url: 'udp://@:3001',
  thumbnail: 'http://192.168.1.51/thumbnail/livestream.png',
  thumbnailWidth: 186,
  thumbnailHeight: 120,
  lastModified: 1,
  defaultVRSetting: 1,
  userVRSetting: 2,
  width: 1280,
  height: 720,
  orientDegree: '0',
  subtitles: [],
  ratioTypeFor2DScreen: 'default',
  rotationFor2DScreen: 0,
  exists: true,
  isBadMedia: false,
  addedTime: 1}]
This is where it got especially interesting: the playlist generated by the Windows application contained the coveted abbreviation RTSP. At first glance it seemed the server application streams video files over RTSP, which would already be suitable for live video streaming — exactly what we need. On closer inspection, only the name said “RTSP”; the links to the video files were plain http, i.e. the server application still serves files over HTTP, which does not suit us. At this point I was getting discouraged, but then thought: why not try putting a link into the playlist in the format VLC usually understands, i.e. rtsp://192.168.1.51:554/h264. And hooray — Skybox started playing the video stream from the RTSP server in stereo. The delay was very large, about 20 seconds, so the digging continued.
Next we try feeding it an MPEG-TS stream over UDP. Again, VLC usually eats this via a link like udp://@:3001, so I specified it the same way for Skybox. Then it only remains to direct the MPEG-TS stream to the headset's host and the specified UDP port. GStreamer takes care of that:
# Capture a side-by-side stereo H264 stream from the cameras, mux it into
# MPEG-TS and push it over UDP to the headset (192.168.1.60:3001 here);
# buffers of 1316 bytes = 7 TS packets of 188 bytes, matching alignment=7
raspivid -3d sbs -w 1280 -h 720 -o - | gst-launch-1.0 -q fdsrc ! h264parse ! mpegtsmux alignment=7 name=muxer ! rndbuffersize max=1316 min=1316 ! multiudpsink clients="192.168.1.60:3001" sync=false
In Skybox we click the “Live Stream MPEG-TS” playlist entry and voilà — we see the live MPEG-TS stream on the big screen of a virtual cinema. The delay is much smaller than with RTSP, around 2-3 seconds, but still far larger than in my simple application that receives a raw H264 stream over UDP (there the delay is usually 100-150 ms at 720p).
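Incidentally, the same MPEG-TS stream can be sanity-checked on a PC before involving the glasses; a sketch of a GStreamer receive pipeline (assuming GStreamer 1.0 with the usual plugin sets installed):

# Receive the MPEG-TS stream on UDP port 3001, demux, decode and display it
gst-launch-1.0 udpsrc port=3001 caps="video/mpegts, systemstream=(boolean)true, packetsize=(int)188" ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false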
At this point I hit a dead end; so far I have not managed to reduce the delay any further. Perhaps buffering needs to be disabled in Skybox itself — I will try writing to the developers, maybe they will add a “Disable buffering” option :-)
Finally
In general, if for some reason you need to watch a live video stream in an Oculus or other VR glasses (Skybox is available on many platforms), you can try the method described here. I do not know whether it will work with other stereo cameras, but with the StereoPi it is tested and it works.
References
Server source code for Skybox
Forum thread with the discussion
Thank you all, that's all for now.
Oh yes, I almost forgot: if anyone can help with a native app for the Oculus (something like Skybox), write me a private message and we will discuss the details.