Visual Odometry (aka. Egomotion estimation) with OpenCV

I'm planning to implement an application with augmented reality features. For one of the features I need egomotion estimation. Only the camera is moving, in a space with fixed objects (nothing, or only small parts, will be moving, so they can be ignored).

So I searched and read a lot and stumbled upon OpenCV. Wikipedia explicitly states that it can be used for egomotion, but I cannot find any documentation about it.

  1. Do I need to implement the egomotion algorithm myself on top of OpenCV's object detection methods? (I think it would be very complex, because objects will move at different speeds depending on their distance to the camera, and I also need to account for rotations.)
  2. If so, where should I start? Is there a good code example for a Kanade–Lucas–Tomasi feature tracker with support for scaling and rotation?
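On point 1: the depth dependence you describe is parallax, and a pinhole camera model makes it concrete. Here is a minimal pure-Python sketch (the focal length, depths, and translation are made-up illustrative values, not real data): under a sideways camera translation `t`, a point at depth `Z` shifts by `-f*t/Z` pixels, so near points move much more than far ones — which is why plain 2D tracking alone can't give you the camera motion.

```python
# Pinhole-model sketch: the image displacement caused by a sideways camera
# translation is inversely proportional to the point's depth.
# (Focal length f and all coordinates are illustrative values.)

def project(x, z, f=700.0):
    """Project a 3D point (lateral coordinate x, depth z) to an image x-coordinate."""
    return f * x / z

def displacement(x, z, t, f=700.0):
    """Pixel shift of a point at depth z when the camera translates by t sideways.
    Moving the camera by +t is equivalent to moving the point by -t."""
    return project(x - t, z, f) - project(x, z, f)

# The same 0.1 m camera motion shifts a near point ten times as far as a
# point ten times more distant.
near = displacement(x=1.0, z=2.0, t=0.1)    # point 2 m away
far = displacement(x=1.0, z=20.0, t=0.1)    # point 20 m away
print(round(near, 1), round(far, 1))        # -35.0 -3.5
```

This is exactly why visual odometry has to estimate the camera motion geometrically instead of assuming all features shift uniformly.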

P.S.: I also know about marker-based frameworks like Vuforia, but using a marker is something I would like to avoid, as it restricts the possible viewpoints.

Update 2013-01-08: I learned that Egomotion Estimation is better known as Visual Odometry. So I updated the title.

Craps answered 7/1, 2013 at 16:33

You can find a good implementation of monocular visual odometry based on optical flow here.

It's coded using Emgu CV (a C# OpenCV wrapper), but you should have no trouble converting it back to pure OpenCV.

Bailable answered 11/1, 2013 at 7:54
Thanks a lot. I had already read that paper, which is the basis for the implementation, but it didn't contain enough information for me to implement it myself. So that link helps me very much, especially because it contains an updated link to the paper author's source code.Craps

Egomotion estimation (or visual odometry) is usually based on optical flow, and OpenCV has motion analysis and object tracking functions for computing optical flow (in conjunction with a feature detector like cvGoodFeaturesToTrack()).

This example might be of use.

Not a complete solution, but might at least get you going in the right direction.
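The "motion from tracked features" step can be sketched without OpenCV at all. Below is a pure-Python toy (synthetic points, made-up numbers) that recovers a planar rotation and translation from point correspondences by least squares. A real monocular pipeline would instead estimate the essential matrix from the tracked features (e.g. cv2.findEssentialMat() followed by cv2.recoverPose()), but the underlying idea — fit the single camera motion that best explains all feature displacements — is the same.

```python
# Toy version of the motion-estimation step: recover a planar rotation +
# translation from point correspondences by least squares (2D Procrustes/
# Kabsch fit). All points and the test motion below are synthetic.
import math

def estimate_rigid_2d(prev_pts, curr_pts):
    """Return (theta, tx, ty) such that curr ≈ R(theta) @ prev + t."""
    n = len(prev_pts)
    # Centroids of both point sets.
    cpx = sum(p[0] for p in prev_pts) / n
    cpy = sum(p[1] for p in prev_pts) / n
    ccx = sum(q[0] for q in curr_pts) / n
    ccy = sum(q[1] for q in curr_pts) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = syx = syy = 0.0
    for (px, py), (qx, qy) in zip(prev_pts, curr_pts):
        px, py, qx, qy = px - cpx, py - cpy, qx - ccx, qy - ccy
        sxx += px * qx; sxy += px * qy
        syx += py * qx; syy += py * qy
    # Rotation angle maximising the alignment, then the matching translation.
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    tx = ccx - (c * cpx - s * cpy)
    ty = ccy - (s * cpx + c * cpy)
    return theta, tx, ty

# Synthetic check: rotate three tracked points by 10° and shift by (5, -2);
# the estimator should recover exactly that motion from the correspondences.
theta0 = math.radians(10)
c, s = math.cos(theta0), math.sin(theta0)
prev = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
curr = [(c * x - s * y + 5.0, s * x + c * y - 2.0) for x, y in prev]
theta, tx, ty = estimate_rigid_2d(prev, curr)
print(round(math.degrees(theta), 3), round(tx, 3), round(ty, 3))  # 10.0 5.0 -2.0
```

In a real system the correspondences would come from the feature tracker (e.g. cvGoodFeaturesToTrack() plus pyramidal Lucas–Kanade flow), they would be noisy, and you would wrap the fit in an outlier-rejection scheme such as RANSAC.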

Dwarfism answered 8/1, 2013 at 18:04
As I feared I'll have to implement the algorithm myself. :-/Craps
@ChristianStrempfer Hi. I would like to know how you went about solving your problem. Could you post the code?Grease
@Clive: I implemented a simple Android app which tracks visual features. I came to the conclusion that mobile phones are not yet good enough for odometry, because I couldn't analyse enough frames per second to support even slow movement, and the battery drained very fast. Therefore I didn't implement an odometry algorithm.Craps