
External trigger sync

Posted: Sun Aug 23, 2020 12:16 am
by Grunky
Is there a way to force the cameras to sample using an external trigger? More generally, can precise frame timing information be obtained from the camera images, and is the spacing between images consistent?

I'm interested in SLAM, but precise timing information is required if I want to fuse odometry and IMU information.

Re: External trigger sync

Posted: Sun Aug 23, 2020 8:51 am
by stereomaton
This post viewtopic.php?f=10&t=106 discusses how to add a button to trigger a photo. You can replace the button with another electronic signal that switches from high impedance to GND (e.g. through a transistor). It is not designed for low latency, though.

For lower latency, you might try the -s option of raspistill, which makes it wait for a signal (software interrupt) to shoot; you would then wait for a hardware interrupt on a GPIO and send that signal. However, you have to throw away most of the mechanisms in SLP. More detail on this would require a complete article.

For precise timing of the images, there is an option to timestamp each frame of a video, but I do not remember how to do it. It could answer your question about spacing.
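If I recall correctly, raspivid can write per-frame presentation timestamps to a file with its --save-pts option (one timestamp in milliseconds per line, after a header line). Assuming that format, a short sketch of how you could check the spacing between frames from such a file:

```python
# Parse a timestamp file as written by `raspivid --save-pts <file>`
# (assumed format: a "#"-prefixed header line, then one presentation
# timestamp in milliseconds per frame).
def frame_spacing_ms(lines):
    ts = [float(l) for l in lines
          if l.strip() and not l.startswith("#")]
    # Difference between consecutive timestamps = inter-frame spacing.
    return [b - a for a, b in zip(ts, ts[1:])]

# Synthetic example: a nominal 30 fps capture with one late frame.
sample = ["# timecode format v2", "0.000", "33.333", "66.667", "105.000"]
deltas = frame_spacing_ms(sample)
print([round(d, 3) for d in deltas])  # → [33.333, 33.334, 38.333]
```

A histogram of these deltas would tell you quickly whether the spacing is consistent enough for your sensor-fusion needs.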

I do not know the precision required for SLAM or the framerate needed (I guess it depends on the speed of the robot), so my answer might be partly irrelevant to your application.

Re: External trigger sync

Posted: Mon Aug 24, 2020 9:26 am
by Realizator
Hi Grunky,
Stereomaton, thank you for your detailed answer!
Grunky, I'd like to add one more idea here. If we are talking about extremely high-precision timing for camera synchronization and capture time, we have the only option here - it is low-level sensors synchronization and external trigger signal, which goes directly to the coupled sensors. You see, all Sony sensors (both IMX219 in V2 cams and IMX477 in HQ cams) have a special stereo synchronization mode.
Let me highlight this: it is direct, sensor-to-sensor synchronization, independent of the CPU and the rest of the system. But to achieve it we need to program both sensors' registers and create a direct connection between them. As the GPU code is closed by Broadcom, the only way to do this is to ask the Raspberry Pi Foundation to add support for this mode. Luckily, the HQ camera has soldering points where we can access all the pins we need.