
Questions about live-streaming to Unity

Posted: Wed Aug 04, 2021 3:24 am
by blatimer
I'm looking for a product to use for a school project, and I have a couple of questions regarding live-streaming.

- Is it possible to live-stream the cameras from the StereoPi to Unity? If so, how would I go about setting that up?

- Are the cameras able to live-stream at 60 FPS, and at what resolution? I was unable to find the camera specs for this online.
- What would the latency be when streaming over WiFi? Again, I couldn't find this online, but it's important for me to know.

- My main goal is to live-stream the cameras to Unity, build some sort of "game" for the Quest 2 (basically an RC car, but with the remote control in VR), and send the controls from this game back to the StereoPi, which would be connected to hardware like motors (see the rough sketch below). Would this be possible without a big delay between the controls and the video? My main concern is low latency / high FPS.
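To make that last point concrete, here is roughly what I'm picturing for the control-return path on the Pi. Every name, pin, port, and packet format below is just a placeholder I made up, not something I've tested:

# Hypothetical sketch of a Pi-side control receiver (placeholder ports/pins).
import json
import socket

from gpiozero import Motor  # standard Raspberry Pi GPIO library

CONTROL_PORT = 3002  # made-up UDP port the Pi listens on for controls

# Made-up wiring: two GPIO pins per motor driver channel.
left_motor = Motor(forward=4, backward=14)
right_motor = Motor(forward=17, backward=27)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", CONTROL_PORT))

while True:
    data, _addr = sock.recvfrom(1024)
    # The Unity "game" would send small JSON packets like {"l": 0.5, "r": -0.2},
    # where each value is a signed motor speed in [-1, 1].
    msg = json.loads(data)
    for motor, value in ((left_motor, msg["l"]), (right_motor, msg["r"])):
        if value >= 0:
            motor.forward(min(value, 1.0))
        else:
            motor.backward(min(-value, 1.0))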

Sorry if my questions are a bit strange; I'm new to this hardware and don't know all the similarities and differences compared to a normal Raspberry Pi. Thanks!

Re: Questions about live-streaming to Unity

Posted: Thu Sep 09, 2021 10:33 am
by Realizator
Hi blatimer,
- Our Oculus Quest app is based on Unity, but we use hardware-specific code for receiving the real-time stream. I'm not sure it can be easily ported to a desktop app. You can ask the author of this Unity asset, who helped us with the Unity part of our application.
- We usually use a 30 FPS live stream. 60 FPS is achievable at 1280x720, and for Full HD you get a somewhat lower FPS in stereoscopic mode. The cameras themselves can capture higher resolutions and frame rates, but the bottleneck is the RPi hardware encoder.
- The latency for the H.264 video stream is ~100-150 ms and depends on your WiFi connection.
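If it helps, here is a rough Python sketch of this kind of sender, piping raspivid's H.264 output over UDP. The IP, port, and chunk size are placeholders; the -3d option requires a stereo-capable build such as the StereoPi image, and a production setup would normally use proper RTP packetization (e.g. via GStreamer) rather than raw datagrams:

# Rough sketch: stream side-by-side H.264 from the StereoPi over UDP.
import socket
import subprocess

UNITY_HOST = "192.168.1.50"   # placeholder IP of the machine running Unity
UNITY_PORT = 3001             # placeholder UDP port the receiver listens on

cmd = [
    "raspivid", "-t", "0", "-3d", "sbs",   # side-by-side stereoscopic frame
    "-w", "1280", "-h", "720", "-fps", "30",
    "-o", "-",                              # write the H.264 stream to stdout
]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)

# Chop the elementary stream into datagram-sized chunks and send them out.
while True:
    chunk = proc.stdout.read(1400)  # stay under a typical MTU
    if not chunk:
        break
    sock.sendto(chunk, (UNITY_HOST, UNITY_PORT))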

I hope this helps!

Re: Questions about live-streaming to Unity

Posted: Thu Oct 14, 2021 4:47 pm
by diceky
Hey Realizator,

Would you be able to share an example Unity scene that streams video from StereoPi using the FMETP plugin?

I purchased the FMETP V2 plugin but am struggling to receive the stream from my StereoPi over UDP, so I contacted the plugin owner, who recommended asking the StereoPi team for an example scene.

Re: Questions about live-streaming to Unity

Posted: Wed Oct 20, 2021 3:06 pm
by leoson
FMETP STREAM recently added cross-platform support for StereoPi.
Please follow this setup guide for Unity applications:
https://frozenmist.com/docs/apis/fmetp- ... i-example/

Re: Questions about live-streaming to Unity

Posted: Wed Oct 20, 2021 8:37 pm
by Realizator
Leoson, thank you for the link! :-)

Re: Questions about live-streaming to Unity

Posted: Mon Nov 28, 2022 8:06 pm
by eswara1997
Realizator wrote:
Thu Sep 09, 2021 10:33 am
- We usually use a 30 FPS live stream. 60 FPS is achievable at 1280x720, and for Full HD you get a somewhat lower FPS in stereoscopic mode. The cameras themselves can capture higher resolutions and frame rates, but the bottleneck is the RPi hardware encoder.
If I combine this with a Jetson Nano, can I push the resolution and FPS higher? How would I configure that?