
Live disparity map quality

Posted: Thu Feb 20, 2020 9:56 pm
by vivvyk

I'm attempting to create a disparity map with two fisheye cameras. We're running scripts from

We ran the calibration successfully and tuned the parameters exactly as shown in the videos. Even so, we're having trouble achieving the same performance, smoothness, and quality of disparity map shown in the demo video. We've tuned parameters extensively and we've also looked at some of the other posts on this topic, but we still have an extremely noisy map (fluctuating colors, rough edges, lots of speckles, and patches of random color).

Changing minDisparity, numberOfDisparities, speckleRange, or speckleWindowSize seems to help, but it's quite difficult to pinpoint the exact parameter values for the best image. Is there any methodology or set process for finding the optimal parameter values (even programmatically)? Is there something beyond our parameter values that could be causing the imperfection (lighting, our calibration, the cameras)? Is there anything in the calibration or disparity-map code that could be changed to improve our map quality?

For reference, here's our fork of the repository that contains all of our calibration data, results, and settings (scenes, etc.): In the "vids" directory, you can see recorded videos of our disparity map. I've also included some screenshots here. As a side note, the saved video is slightly lower-resolution; the map looks much clearer live from the cameras.

I certainly understand it's difficult to tell what may be going wrong, but any suggestions or guidance to improve the quality of our map would be appreciated!

Thanks! :)

Re: Live disparity map quality

Posted: Fri Feb 21, 2020 8:57 am
by Realizator
Hi Vivvyk,
I've looked at your video, and I have several ideas.
1. If you look at your dm-tune.jpg, you can see that the left half of your stereopair has some glare on the bright elements (the lamp and the window). Please clean the optics on your left camera :-) This can affect the detection of small details.

2. You may notice that your depth-map problems appear on very nearby objects (your hand near the camera). When an object is too close to the camera, its positions in the left and right images are far apart. If this horizontal difference is larger than your NumOfDisparities, the object is "out of the depth field of view", and StereoBM cannot match it to compute disparities.
By the way, if you run script 7, you will find that the system does not show red points closer than a certain distance. This 2D map can clarify what your StereoPi really sees.
Also, you may notice that your dm-tune.jpg image has no objects close to the camera. The man in this image is much farther from the camera than the hand in the photo from this post.
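The "out of the depth field of view" condition above can be checked with a rough calculation: the pixel disparity of an object at distance Z is approximately f_px * B / Z, where f_px is the focal length in pixels and B the stereobase. The numbers below are illustrative assumptions (the focal length in particular is a guess, not measured from this setup):

```python
# Illustrative numbers only: an object is matchable by StereoBM only while
# its pixel disparity f_px * B / Z stays within numberOfDisparities.
f_px = 300.0          # ASSUMED focal length in pixels (not measured)
baseline_m = 0.065    # 65 mm stereobase
num_disparities = 64

def disparity_px(distance_m):
    """Approximate pixel disparity of an object at the given distance."""
    return f_px * baseline_m / distance_m

for z in (0.1, 0.3, 0.5, 1.0):
    d = disparity_px(z)
    status = "matchable" if d <= num_disparities else "out of range"
    print(f"{z:.1f} m -> {d:.1f} px disparity ({status})")
```

With these assumed numbers, an object at 10 cm produces a disparity far beyond 64 px and cannot be matched, which is consistent with the 30-50 cm minimum distance suggested below.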
How to fix this:
- Before capturing dm-tune.jpg, place one object CLOSE to your camera, at the distance you want to be the CLOSEST detection distance. As a rule, this is 30-50 cm or more. If you put the closest object at, say, 10 cm, you will have two problems. The first is bad matching (the cameras see this object from VERY different projections). The second is a narrow FOV for your depth map (you may notice that the black fields on the left and right sides of your depth map depend on mindisp and numofdisp).
- After that, tune your depth map so that this nearest object is red.
- Then tune mindisp so that far objects come out dark blue.
- Note that for tuning it is better to use textured, high-contrast objects, not uniform surfaces (like a white t-shirt, walls, etc.).

3. You may ask: "What if I want to detect objects from 3 or 5 cm up to several meters, not from 30-50 cm?"
The answer is simple. To detect very close objects you need a smaller stereobase; that is, the distance between your cameras should be not 65 mm but 25 mm.
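This trade-off can be put in numbers: the closest matchable distance is roughly Z_min = f_px * B / numberOfDisparities, so shrinking the stereobase shrinks the minimum distance proportionally. A sketch with assumed values (again, the focal length is a guess, not measured):

```python
# Illustrative only: minimum matchable distance vs stereobase, using
# Z_min = f_px * B / numberOfDisparities with an ASSUMED focal length.
f_px = 300.0          # assumed focal length in pixels
num_disparities = 64

def z_min_m(baseline_m):
    """Approximate closest distance StereoBM can still match."""
    return f_px * baseline_m / num_disparities

for base_mm in (65, 25):
    print(f"{base_mm} mm stereobase -> closest matchable "
          f"~{z_min_m(base_mm / 1000.0) * 100:.0f} cm")
```

Under these assumptions, moving from a 65 mm to a 25 mm stereobase cuts the minimum distance from roughly 30 cm to roughly 12 cm; the cost is reduced depth resolution at longer range.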

For example, here is a 65 mm stereobase setup:
And here is 25 mm stereobase setup:

Fun fact: the stereobase of your eyes is also about 65 mm (the average for people). Yet you can look at very close objects (3-5 cm) — in that case your eyes converge, viewing the close object at an angle of up to about 90 degrees. Our StereoPi cameras are fixed and cannot rotate. Rotating the cameras while adjusting the depth-map settings on the fly is an advanced technique used in humanoid robots. You can experiment with this approach, for example, using the open-source humanoid robot InMoov.

Re: Live disparity map quality

Posted: Mon Feb 24, 2020 5:12 am
by vivvyk
Thanks for the quick reply! We've tried changing some of the lighting conditions and will try the other calibration suggestions and get back with results soon!

Re: Live disparity map quality

Posted: Tue Feb 25, 2020 8:05 am
by Realizator
vivvyk, by the way, judging by your rectifyed_right.jpg and rectifyed_left.jpg, your calibration is OK. :-)