How can you track motion using the iPhone's camera?

I saw that someone has made an app that tracks your feet using the camera, so that you can kick a virtual football on your iPhone screen.

How could you do something like this? Does anyone know of any code examples or other information about using the iPhone camera for detecting objects and tracking them?

Paralyse asked 14/10, 2010 at 13:36 Comment(3)
I have had some success tracking faces and eyes using OpenCV on the iPhone. Here's a good place to start: niw.at/articles/2009/03/14/using-opencv-on-iphone/en. I guess the trick is finding a cascade (a description of what the camera should be looking for) that describes a foot; not really sure whether that exists, though (see the sketch after these comments). - Slicker
Can OpenCV be used to track in realtime? The linked article seems to indicate that it takes up to 10 seconds to recognize a face in a 480 x 320 image. - Rayon
I've used the CoreVideo framework on an iPhone 3GS to track a face in realtime, using the small AVCaptureSessionPresetLow sample size. I was able to consistently detect eyes in under 4 ms. - Slicker
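
To make the cascade approach from the first comment concrete, here is a minimal sketch against OpenCV's old C API, the same API the linked article builds on. This is an illustration rather than code from that article; the header path, cascade file name, and tuning parameters are assumptions.

    // Compile this as Objective-C++ (.mm), which is typical when mixing OpenCV
    // with Cocoa code; the header path depends on how OpenCV is packaged for iOS.
    #include <opencv/cv.h>
    #include <stdio.h>

    // Load a Haar cascade once, up front. "haarcascade_frontalface_default.xml"
    // is the stock face cascade that ships with OpenCV; a cascade describing a
    // foot would have to be trained separately, as noted above.
    static CvHaarClassifierCascade *LoadCascade(const char *path)
    {
        return (CvHaarClassifierCascade *)cvLoad(path, NULL, NULL, NULL);
    }

    // Run the cascade over one grayscale camera frame and print whatever it finds.
    static void DetectInFrame(IplImage *grayFrame, CvHaarClassifierCascade *cascade,
                              CvMemStorage *storage)
    {
        cvClearMemStorage(storage);

        // A scale factor of 1.2 and min neighbors of 2 trade some accuracy for
        // speed, which matters on the phone.
        CvSeq *objects = cvHaarDetectObjects(grayFrame, cascade, storage,
                                             1.2, 2, CV_HAAR_DO_CANNY_PRUNING,
                                             cvSize(30, 30));

        for (int i = 0; objects != NULL && i < objects->total; i++) {
            CvRect *r = (CvRect *)cvGetSeqElem(objects, i);
            printf("object at (%d, %d), %d x %d\n", r->x, r->y, r->width, r->height);
        }
    }

As the later comments note, the capture preset and the cost of the cascade are what decide whether this runs in seconds or in milliseconds per frame.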

I just gave a talk at SecondConf where I demonstrated the use of the iPhone's camera to track a colored object using OpenGL ES 2.0 shaders. The post accompanying that talk, including my slides and sample code for all demos, can be found here.

The sample application I wrote, whose code can be downloaded from here, is based on an example produced by Apple for demonstrating Core Image at WWDC 2007. That example is described in Chapter 27 of the GPU Gems 3 book.

The basic idea is that you can use custom GLSL shaders to process images from the iPhone camera in realtime, determining which pixels match a target color within a given threshold. Those pixels then have their normalized X,Y coordinates embedded in their red and green color components, while all other pixels are marked as black. The color of the whole frame is then averaged to obtain the centroid of the colored object, which you can track as it moves across the view of the camera.
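
To make that per-pixel logic concrete, here is a rough CPU-side sketch of the same math. This is an illustration I've added, not the GLSL shader from the sample code, which does the equivalent work on the GPU:

    #import <CoreGraphics/CoreGraphics.h>

    typedef struct { float r, g, b; } RGBPixel;   // components in the 0.0 - 1.0 range

    // Keep pixels within a threshold of the target color, treat their normalized
    // coordinates as the values the shader would write into red and green, and
    // average them to recover the centroid that the shader's frame-wide
    // averaging produces.
    static CGPoint CentroidOfColor(const RGBPixel *pixels, int width, int height,
                                   RGBPixel target, float threshold)
    {
        double sumX = 0.0, sumY = 0.0, matchCount = 0.0;

        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                RGBPixel p = pixels[y * width + x];
                float dr = p.r - target.r, dg = p.g - target.g, db = p.b - target.b;

                // The threshold test: is this pixel close enough to the target color?
                if (dr * dr + dg * dg + db * db < threshold * threshold) {
                    sumX += (double)x / width;    // normalized X, the shader's red channel
                    sumY += (double)y / height;   // normalized Y, the shader's green channel
                    matchCount += 1.0;
                }
            }
        }

        if (matchCount == 0.0) {
            return CGPointMake(-1.0f, -1.0f);     // nothing matched this frame
        }
        return CGPointMake(sumX / matchCount, sumY / matchCount);
    }

On the GPU the equivalent averaging is done in parallel (typically by repeatedly downsampling the frame rather than looping over pixels), which is what makes it fast enough to run on live video.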

While this doesn't address the case of tracking a more complex object like a foot, it should be possible to write shaders along these lines that could pick out such a moving object.

As an update to the above, in the two years since I wrote this I've developed an open source framework that encapsulates OpenGL ES 2.0 shader processing of images and video. One of the recent additions to it is a GPUImageMotionDetector class that processes a scene and detects any kind of motion within it. It gives you back the centroid and intensity of the overall motion it detects via a simple callback block. Using this framework should be a lot easier than rolling your own solution.
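
A rough usage sketch, not code from the post itself, might look like this. The class and property names (GPUImageVideoCamera, GPUImageMotionDetector, motionDetectionBlock) are from the framework's public headers as I recall them, so check them against the version you pull down:

    #import <UIKit/UIKit.h>
    #import "GPUImage.h"

    @interface MotionTrackingViewController : UIViewController
    @end

    @implementation MotionTrackingViewController {
        GPUImageVideoCamera *videoCamera;         // kept as ivars so they aren't deallocated
        GPUImageMotionDetector *motionDetector;
    }

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                           cameraPosition:AVCaptureDevicePositionBack];
        videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

        motionDetector = [[GPUImageMotionDetector alloc] init];

        // The callback delivers the centroid and intensity of whatever motion the
        // detector sees in each processed frame.
        [motionDetector setMotionDetectionBlock:^(CGPoint motionCentroid, CGFloat motionIntensity, CMTime frameTime) {
            if (motionIntensity > 0.01) {
                NSLog(@"Motion around (%.2f, %.2f), intensity %.3f",
                      motionCentroid.x, motionCentroid.y, motionIntensity);
            }
        }];

        [videoCamera addTarget:motionDetector];
        [videoCamera startCameraCapture];
    }

    @end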

Rayon answered 24/10, 2010 at 4:29 Comment(2)
This is cool... is it possible to build an interface like the Xbox Kinect with this? :) I'm looking for fast, lightweight code to put in my iPhone app, like soundstep.com/blog/experiments/jsdetection ... would be cool if it were possible with mobile web, though. - Dyestuff
@CarlLindberg - The Kinect uses projected structured IR light to perform 3-D mapping of an environment, so you're obviously not going to match that with a color camera in an iOS device. The crude motion detection I have so far isn't tracking hands or fingers, and for that you'll need to explore optical flow or object tracking techniques. Getting those to work on live video will be quite a challenge. - Rayon
