I am trying to build a Kinect and iPhone based application.
I am trying to compute the acceleration of my hands over time on each of the X, Y, and Z axes, based on the trajectory returned by the Kinect. Basically, I select a standard time interval of 0.5 seconds, or 15 frames (dt), and three points (x0, x1, and x2) over time, each separated by 0.5 seconds. I should mention that the positions of the three points are given in meters. Using these points, I compute two speeds (v0 = (x1 - x0) / dt and v1 = (x2 - x1) / dt). Finally, using these speeds, I compute the acceleration between x1 and x2 as acc = (v1 - v0) / dt.
I repeat this computation at each frame and obtain an array of accelerations.
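The sliding-window computation described above can be sketched as follows. This is a minimal illustration under my own assumptions (a hypothetical `accelerations` helper, a 30 fps Kinect stream, and per-frame hand positions already extracted into a list of floats in meters for one axis), not the asker's actual code:

```python
def accelerations(positions, fps=30, dt=0.5):
    """Finite-difference acceleration over a sliding window.

    positions: per-frame hand positions in meters along one axis,
               sampled at `fps` frames per second.
    For each frame i, take three samples spaced dt seconds
    (fps * dt frames) apart, compute two speeds, then the
    acceleration between them, exactly as described above.
    """
    step = int(round(fps * dt))  # 15 frames for dt = 0.5 s at 30 fps
    accs = []
    for i in range(len(positions) - 2 * step):
        x0 = positions[i]
        x1 = positions[i + step]
        x2 = positions[i + 2 * step]
        v0 = (x1 - x0) / dt          # speed over the first half-second
        v1 = (x2 - x1) / dt          # speed over the second half-second
        accs.append((v1 - v0) / dt)  # acceleration in m/s^2
    return accs
```

Because the window slides by one frame while the samples stay 15 frames apart, consecutive entries of the result overlap heavily; on a trajectory with constant acceleration, every entry comes out equal to that acceleration.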
As I've said, I also have an iPhone, and I want to determine which hand is holding it, left or right. I plan to do this by matching the accelerations of each hand against the accelerations of the iPhone, held in the right orientation so that both share the same axis system.
The only problem is that there is a huge difference between my accelerations and the accelerations of the phone.
The phone's accelerometer readings are somewhere between -2 and 2 on each axis, whereas mine are between -10 and 10. How should I interpret the iPhone accelerations in order to obtain measures comparable to mine, in meters per second squared?