Questions about live-streaming to unity

S.L.P. image questions, stereoscopic video livestream and recording, stereoscopic photo capture etc.
Post Reply
blatimer
Posts: 1
Joined: Wed Aug 04, 2021 2:58 am

Questions about live-streaming to unity

Post by blatimer »

I'm looking for a product to use for a school project, and I have a couple of questions regarding live-streaming.

- Is it possible in any way to live-stream the cameras from the StereoPi to Unity? If so, how would I go about setting that up?

- Are the cameras able to live-stream at 60 FPS, and at what resolution? I was unable to find the cameras' specs for this online.

- What would the latency be when streaming over WiFi? Again, I couldn't find this online, but it would be important to know.

- My main goal is to live-stream the cameras to Unity, build some sort of "game" for the Quest 2 (basically an RC car, but with the remote control in VR), and send the controls from this game back to the StereoPi, to be connected to some hardware (like motors). Would this be possible without a big delay between the controls and the video? The main concern is low latency / high FPS.

Sorry if my questions are a bit strange, I'm new to this hardware and don't know all the similarities/differences to a normal Raspberry Pi. Thanks!

User avatar
Realizator
Site Admin
Posts: 900
Joined: Tue Apr 16, 2019 9:23 am
Contact:

Re: Questions about live-streaming to unity

Post by Realizator »

Hi blatimer,
- Our Oculus Quest app is based on Unity, but we use hardware-specific code for receiving the real-time stream. I'm not sure it can be easily ported to a desktop app. You can ask the author of this Unity asset, who helped us with the Unity part of our application.
- We usually use a 30 FPS live stream. 60 FPS is achievable at 1280x720, and for Full HD you get a somewhat lower FPS in stereoscopic mode. The cameras themselves can capture higher resolutions and frame rates, but the bottleneck is the RPi hardware encoder.
- The latency of the H.264 video stream is ~100-150 ms and depends on your WiFi connection.

I hope this helps!
Eugene a.k.a. Realizator
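
For readers wondering what such a stream looks like in practice: a common approach (a sketch only, not necessarily the exact pipeline the official app uses; IP address, port, bitrate, and stereo mode here are assumptions you'd adjust for your setup) is to pipe raspivid's H.264 output into a GStreamer RTP sender on the StereoPi and decode it on the receiving machine:

```shell
# On the StereoPi (sender): side-by-side stereoscopic 720p at 60 fps,
# H.264-encoded by the RPi hardware encoder, sent as RTP over UDP.
# Replace 192.168.1.50 with your receiver's IP address (assumption).
raspivid -t 0 -3d sbs -w 1280 -h 720 -fps 60 -b 3000000 -n -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=192.168.1.50 port=5000

# On the receiving PC: depayload, decode, and display the stream.
# sync=false drops the renderer's clock sync to minimize added latency.
gst-launch-1.0 udpsrc port=5000 \
  caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! \
  rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false
```

In Unity you would replace the receiving pipeline with whatever the chosen asset (e.g. FMETP) provides on its end; the sender side stays the same. These commands require the StereoPi camera hardware, so they can't be tested off-device.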

diceky
Posts: 1
Joined: Thu Oct 14, 2021 4:34 pm

Re: Questions about live-streaming to unity

Post by diceky »

Hey Realizator,

Would you be able to share an example Unity scene that streams video from StereoPi using the FMETP plugin?

I purchased the FMETP V2 plugin but am struggling to receive the stream from my StereoPi over UDP, so I contacted the plugin owner, who recommended asking the StereoPi team for an example scene.

leoson
Posts: 1
Joined: Wed Oct 20, 2021 3:01 pm

Re: Questions about live-streaming to unity

Post by leoson »

FMETP STREAM recently added cross-platform support for StereoPi.
Please follow this setup guide for applications built in Unity.
https://frozenmist.com/docs/apis/fmetp- ... i-example/

User avatar
Realizator
Site Admin
Posts: 900
Joined: Tue Apr 16, 2019 9:23 am
Contact:

Re: Questions about live-streaming to unity

Post by Realizator »

Leoson, thank you for the link! :-)
Eugene a.k.a. Realizator

eswara1997
Posts: 2
Joined: Mon Nov 28, 2022 5:33 am

Re: Questions about live-streaming to unity

Post by eswara1997 »

Realizator wrote:
Thu Sep 09, 2021 10:33 am
Hi blatimer,
- Our Oculus Quest app is based on Unity, but we use hardware-specific code for receiving the real-time stream. I'm not sure it can be easily ported to a desktop app. You can ask the author of this Unity asset, who helped us with the Unity part of our application.
- We usually use a 30 FPS live stream. 60 FPS is achievable at 1280x720, and for Full HD you get a somewhat lower FPS in stereoscopic mode. The cameras themselves can capture higher resolutions and frame rates, but the bottleneck is the RPi hardware encoder.
- The latency of the H.264 video stream is ~100-150 ms and depends on your WiFi connection.

I hope this helps!
If I combine this with a Jetson Nano, can I push the resolution and FPS higher? How would I configure that?

Post Reply