The purpose of the stream above is to push the image to an Oculus or other VR headset. I believe that requires the gldownload step. Is there any other way to do it? I'm not an expert.
I'd be surprised if you need gldownload and friends to make this work on macOS. This should work without the GPU -> CPU -> GPU overhead:
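Something along these lines, assuming the input is already decoded video. This is only a sketch with standard GStreamer GL elements; `videotestsrc` is a placeholder for the real source, not the original pipeline:

```shell
# Sketch: stay in GL memory all the way to the sink, so no
# gldownload (GPU -> CPU readback) step is needed.
# videotestsrc is a placeholder for the actual video source.
gst-launch-1.0 videotestsrc ! glupload ! glcolorconvert ! glimagesink
```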
Thaytan, Wildag's task is to obtain an equirectangular projection on the Oculus Go. Ideally we have only the StereoPi (with the CM3 onboard) and the Oculus Go (or Quest). So we have two options here:
1. Do this real-time image transformation onboard (on the CM3) and live-stream the already-processed video.
2. Send the video "as is" to the Oculus and do the transformation on the headset itself.
The Mac implementation he mentioned is a proof of concept. I agree with you that a Mac has enough performance to do this without the GPU. But both the Raspberry Pi CM3 and the Oculus are embedded systems with limited performance, so GPU acceleration is the only way to achieve the FPS and latency we need.
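For option 1, a GPU-accelerated pipeline on the CM3 could look roughly like the sketch below. Everything here is an assumption rather than our actual setup: `rpicamsrc` and `omxh264enc` stand in for whatever capture and encoding elements the StereoPi uses, `equirect.frag` is a hypothetical fragment shader doing the fisheye-to-equirectangular mapping, and the host/port are placeholders:

```shell
# Hypothetical sketch for option 1: transform on the Pi's GPU,
# then download, encode, and stream. Element names, the shader
# file, and the destination address are placeholders.
gst-launch-1.0 rpicamsrc ! video/x-raw,width=1280,height=720 ! \
  glupload ! \
  glshader fragment="$(cat equirect.frag)" ! \
  gldownload ! videoconvert ! \
  omxh264enc ! h264parse ! rtph264pay ! \
  udpsink host=192.168.1.100 port=5000
```

Note that here the gldownload step is unavoidable, since the hardware encoder consumes buffers in system memory, not GL textures.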
Hmmm, interesting approach. Maybe our recordings from the 180- or 200-degree cameras can be viewed "as is".
I'll add this experiment to my to-do list.
As of now, all those transformations are done on the fly in our Oculus Quest application. We wrote a special shader to do this conversion in real time.
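For reference, a per-pixel mapping such a shader commonly evaluates, assuming an ideal equidistant fisheye lens with focal length $f$ and image center $(c_x, c_y)$ (these symbols are my assumptions, not the actual Quest shader). For an output pixel at longitude $\lambda$ and latitude $\varphi$, the view direction is

```latex
\mathbf{p} = (\cos\varphi \sin\lambda,\; \sin\varphi,\; \cos\varphi \cos\lambda)
```

the angle from the optical axis is $\theta = \arccos(p_z)$, and the equidistant model places the source pixel at

```latex
(u, v) = (c_x, c_y) + \frac{f\,\theta}{\sqrt{p_x^2 + p_y^2}}\,(p_x,\; p_y)
```

Real lenses deviate from the ideal model, so a calibrated polynomial in $\theta$ usually replaces the plain $f\,\theta$ term.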