3D Distance from Disparity Map

OpenCV, Python and other image processing questions

3D Distance from Disparity Map

Post by vivvyk »


After some earlier advice, we've managed to create a high-quality disparity map! However, we're having a lot of trouble getting actual distances out of it.

Here are two frames from our video of a tissue box:
tissueframes.png
Here is our disparity map:
tissue-disparity.png
To get the distance, we've followed the procedure in 7_2d_map.py, using

Code:

cv2.reprojectImageTo3D(disp, Q)
and the Q matrix obtained during calibration. This returns a three-dimensional matrix. Taking a pixel from the center of the box, we get
array([-0.25788638, 0.1133885 , 1.1743402 ], dtype=float32)
The tissue box is 200 mm away. We're guessing that the third value (Z/W) is what we need, but we aren't sure of the units. We read somewhere that the disparity map returned by OpenCV is scaled by 16, and we've assumed that the returned units are cm. That gives us 1.1743402 * 16 * 10 = about 192 mm, which is our best result thus far.
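For anyone following along, the math cv2.reprojectImageTo3D applies per pixel is just a multiplication by Q followed by a homogeneous divide. A minimal numpy sketch, with a completely made-up Q matrix (focal length, principal point, and baseline here are illustrative, not from any real calibration):

```python
import numpy as np

# Hypothetical Q from stereoRectify. Q maps (x, y, disparity, 1)
# to homogeneous (X, Y, Z, W); units follow the calibration units.
Q = np.array([
    [1.0, 0.0, 0.0, -320.0],  # -cx
    [0.0, 1.0, 0.0, -240.0],  # -cy
    [0.0, 0.0, 0.0,  700.0],  # focal length f, in pixels
    [0.0, 0.0, 0.1,    0.0],  # 1 / baseline (baseline = 10 units)
], dtype=np.float64)

def reproject_pixel(x, y, d, Q):
    """Per-pixel equivalent of cv2.reprojectImageTo3D."""
    X, Y, Z, W = Q @ np.array([x, y, d, 1.0])
    return np.array([X, Y, Z]) / W  # divide out homogeneous W

# depth Z = f * baseline / disparity = 700 * 10 / 35
point = reproject_pixel(300, 220, 35.0, Q)
print(point[2])  # 200.0
```

If the numbers look off by a constant factor, the disparity scaling (see the reply below about dividing by 16) or the calibration units are the usual suspects.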

We aren't sure if this is the correct approach; we've struggled to find documentation on the right way to do this. We'd appreciate any input on how to correctly calculate distance, or any other feedback on our approach so far.

I'd be happy to provide our disparity map, Q matrix, our code, or anything else that's needed. Let me know!



Re: 3D Distance from Disparity Map

Post by Realizator »

Hi vivvyk,
1. Please look at this topic. There you can find a draft of the code I plan to add to the fisheye repo after polishing. Please use the division by 16 I've implemented there; it really does "fix" your real distance (it changes the trigonometry). I get a lot of questions from users on this, so I hope to release some polished code within a week or so.
Update: you need to divide the disparity by 16 (see line 111 of 22.py attached in the mentioned topic):

Code:

# StereoBM/StereoSGBM .compute() returns fixed-point disparity scaled by 16
return disparity_color, disparity_fixtype, disparity.astype(np.float32) / 16.0
After that, your cv2.reprojectImageTo3D(disp, Q) will give you different results, which can be linearly scaled to get your real distance.
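To illustrate the fixed-point format with a numpy-only sketch (the raw values below are made up, not from a real matcher):

```python
import numpy as np

# StereoBM/StereoSGBM store disparity as int16: the true value times 16.
# Negative entries mark pixels where no match was found.
raw = np.array([[160, 320],
                [480, -16]], dtype=np.int16)  # hypothetical raw output

disp = raw.astype(np.float32) / 16.0  # true disparities: 10, 20, 30, -1
print(disp)
```

It is this float array, not the raw int16 one, that should be passed to cv2.reprojectImageTo3D.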

2. Please play with the point cloud (do a "view from the top" projection onto the ZX plane). You will see a "top view" distance map, which will help you understand the real and relative distances to the objects in your scene while tuning. Put a couple of test objects at different distances (and of different sizes).
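One way to get that top view is to collapse the point cloud into a 2D occupancy grid over X (left-right) and Z (depth). A sketch using a random stand-in cloud (a real one would come from reprojectImageTo3D):

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.normal(size=(120, 160, 3)).reshape(-1, 3)  # stand-in cloud
points[:, 2] = np.abs(points[:, 2]) + 1.0               # fake depths > 0

# Drop invalid points and anything beyond a max depth of interest.
mask = np.isfinite(points).all(axis=1) & (points[:, 2] < 5.0)
x, z = points[mask, 0], points[mask, 2]

# Bright cells in this grid correspond to objects at that X/Z location.
top_view, xedges, zedges = np.histogram2d(x, z, bins=(64, 64))
print(top_view.shape)
```

Plotting top_view as an image (e.g. with matplotlib's imshow) gives the "view from the top" described above; objects at known distances should appear as bands at the expected Z.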

3. In case you haven't noticed: in OpenCV 4.x calibration, the chessboard square size is ignored. I.e. you get your distance measured in "squares" of the chessboard, so multiplying it by the real square size will give you the real distance.
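Concretely, the conversion is a single multiplication. The square size and the measured Z below are hypothetical placeholders; measure your own board:

```python
# If calibration treated one chessboard square as the unit of length,
# reprojectImageTo3D returns coordinates in "squares".
SQUARE_SIZE_MM = 25.4   # hypothetical: the measured edge of one square
z_in_squares = 7.87     # hypothetical Z value read from the point cloud

z_mm = z_in_squares * SQUARE_SIZE_MM  # real-world depth in millimetres
print(round(z_mm, 1))
```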
Eugene a.k.a. Realizator
