Getting Direction Vector in Android

How can I get a direction vector representing the direction the back of the device is pointing relative to earth coordinates?

For example, if the device is placed face up on a desk (screen facing up) it should read [0,0,-1], and if it is held vertically facing north it should read [1,0,0], etc.

I know how to calculate it from heading, pitch, and roll, as long as those are relative to earth coordinates. To be clear here I am not looking for angular velocity, but the actual current angle relative to the plane tangent to the earth. So if the device is held vertically and facing north, the angle "alpha" should read 0 or 360, the angle "beta" should read 90, and "gamma" should read 0. I can't figure out how to get these values either.

I've been reading the API all day and I still cannot find how to get either of these things.

public void onSensorChanged(SensorEvent event) {
    // ?    
}

Thanks for any insights.

Torquemada asked 17/6, 2012 at 2:49 Comment(0)

SensorManager.getRotationMatrix() already does what's outlined below (I wrote this up before finding that out). I'll leave the explanation in, because if you want to correct for the difference between magnetic and true north you'll still need it.

The rough algorithm is to get the rotation matrix, multiply the vector [0,0,-1] by it, then adjust the result to your coordinate system. Why? The Android docs give the coordinate systems for the device and for the world:

[Figures from the Android docs: the device coordinate system and the world coordinate system]

Note [0,0,-1] in Android device coords points perpendicular backward away from the screen. If you multiply rotation matrix R by this vector, you'll get [0,0,-1] in world coords when the device is on the table facing up as you desire. When it's upright facing north, you'll get [0,-1,0], which indicates that you've chosen a coordinate system where x and y are swapped with respect to the Android system, but that's simply a change of conventions.

Note that R * [0,0,-1]^T is just the negated third column of R. From this I get the pseudocode:

getRotationMatrix(R);
let v = the first three elements of the third column of R, negated;
swap v[0] and v[1]

This ought to get what you want.
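
As a concrete (if rough) Java sketch of that pseudocode, assuming gData and mData are float[3] arrays holding the latest accelerometer and magnetometer readings (the names are illustrative):

// Sketch only: gData and mData are assumed to be float[3] arrays filled from
// the accelerometer and magnetometer respectively.
float[] R = new float[9];
if (SensorManager.getRotationMatrix(R, null, gData, mData)) {
    // R * [0,0,-1]^T is the negated third column of R (row-major indices 2, 5, 8).
    float[] v = { -R[2], -R[5], -R[8] };
    // Swap x and y to go from Android's world axes to the question's
    // convention (x and y swapped relative to Android's world system).
    float tmp = v[0];
    v[0] = v[1];
    v[1] = tmp;
    // v now holds the direction the back of the device points, in world coordinates.
}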

Additional information on what getRotationMatrix() is doing follows.


You need both accelerometer data to establish the direction "down" and magnetometer data to determine the direction "north." You'll have to assume the accelerometers are sensing only gravity (the device is stationary or moving at a steady velocity). Then you need to project the magnetometer vector onto the plane perpendicular to the gravity vector (because the magnetic field is not generally tangent to the earth's surface). This gives you two axes. The third is orthogonal, so it can be computed by cross product. This gives you the earth coordinate vectors in the device system. It looks like you want the inverse: device coordinates in earth coordinates. For this, just construct the matrix of direction cosines and invert it.
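
Here is a rough, illustrative sketch of that construction (not the actual Android implementation; the helper functions cross() and normalize() are made up for clarity):

// gravity and mag are float[3] readings in device coordinates.
static float[] buildRotationMatrix(float[] gravity, float[] mag) {
    float[] up = normalize(gravity);            // unit "up" axis in device coords
    // The cross product with "up" discards the vertical component of the
    // magnetic field, i.e. it performs the projection described above.
    float[] east = normalize(cross(mag, up));
    float[] north = cross(up, east);            // completes the right-handed basis
    // Rows are the world axes expressed in device coordinates, so this matrix
    // maps device vectors to world vectors; transpose it (its inverse, since
    // it is orthonormal) to go the other way.
    return new float[] { east[0],  east[1],  east[2],
                         north[0], north[1], north[2],
                         up[0],    up[1],    up[2] };
}

static float[] cross(float[] a, float[] b) {
    return new float[] { a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0] };
}

static float[] normalize(float[] a) {
    float n = (float) Math.sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
    return new float[] { a[0]/n, a[1]/n, a[2]/n };
}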

I will add that the above discussion assumes the magnetometer vector points north. I think (from high school science!) it's actually toward magnetic south, but I have no device at hand so I can't try it. Of course, magnetic north/south differs from true north by anywhere from zero to 180 degrees, depending on where you are on the earth. You can retrieve GPS coordinates and compute the actual offset.
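
If you do want to correct for declination, Android's GeomagneticField class will compute the local offset from a GPS fix; a minimal sketch, assuming location is a recent android.location.Location:

// Sketch: "location" is assumed to come from the LocationManager / GPS.
GeomagneticField field = new GeomagneticField(
        (float) location.getLatitude(),
        (float) location.getLongitude(),
        (float) location.getAltitude(),
        System.currentTimeMillis());
// Declination in degrees east of true north; add it to a magnetic heading
// to get a heading relative to true north.
float declination = field.getDeclination();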

If you are not familiar with the math needed to do these, I can explain further, but it will have to be later.

Knuth answered 17/6, 2012 at 3:3 Comment(2)
Thanks so much, this is really helpful. You both gave such good answers, I wish there was a way to select them both. It looks like I can get the gravity vector with just Sensor.TYPE_GRAVITY - they did that work for me. I wish it were just as easy to get a "north" vector. – Torquemada
I ended up getting it using your explanation. It turns out I can pass the values returned from a gravity sensor and a magnetic field sensor directly into getRotationMatrix() and it works everything out. Then I just multiplied that matrix by [0,0,-1] and got the direction vector. – Torquemada

Read this page: http://developer.android.com/reference/android/hardware/Sensor.html

In API 8 and above, there are "virtual" sensors which are generated by combining the inputs of all available sensors with appropriate filters. The TYPE_ORIENTATION sensor gives you the overall orientation of your device, but that interface is deprecated because it has failure states at certain orientations. The newer sensor is TYPE_ROTATION_VECTOR (API 9 and above), which gives your device's orientation as a quaternion. This is really the best sensor to use, but the math behind it is a little heavy.
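
If you go the TYPE_ROTATION_VECTOR route, SensorManager can do the quaternion math for you; here is a minimal sketch, assuming a listener registered for that sensor (the rotationMatrix field name is just illustrative):

private final float[] rotationMatrix = new float[9];

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        // Converts the rotation-vector (quaternion) reading into a rotation matrix.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        // Negated third column of the row-major matrix = the device's -z axis
        // (the back of the device) expressed in world coordinates.
        float[] back = { -rotationMatrix[2], -rotationMatrix[5], -rotationMatrix[8] };
    }
}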

Failing that, what you do is call SensorManager.getRotationMatrix(), passing the latest gravity and magnetometer data. This fills in a rotation matrix which can be used to convert a vector from device coordinates to world coordinates or vice versa (just transpose the matrix to invert it).

The getOrientation() function can give you heading, pitch, and roll, but these have the same failure states as the TYPE_ORIENTATION sensor.

Examples:

  Device flat on a table, top facing north:
    1  0  0
    0  1  0
    0  0  1

  Tilted up 30 degrees (rotated about X axis):
    1   0      0
    0   0.86  -0.5
    0   0.5    0.86

  Device vertical (rotated about X axis), facing north:
    1  0  0
    0  0 -1
    0  1  0

  Device flat on a table, top facing west:
    0 -1  0
    1  0  0
    0  0  1

  Device rotated about its Y axis, onto its left side, top
  facing north:
    0  0 -1
    0  1  0
    1  0  0

Here is some sample code you may find useful:

// Fields assumed to exist in the original class: latest readings, scratch
// matrices, and bookkeeping (pfdView, acft, and TAG are app-specific members
// of the original flight app and are not shown here).
private final float[] gData = new float[3];        // latest accelerometer reading
private final float[] mData = new float[3];        // latest magnetometer reading
private final float[] R1 = new float[9];           // rotation matrix
private final float[] Imat = new float[9];         // inclination matrix (unused below)
private final float[] orientation = new float[3];  // azimuth (yaw), pitch, roll in radians
private boolean haveData = false;                  // true once a magnetometer reading arrives
private long last_time;                            // timestamp of the previous update, ns
private static final double DEG = 180.0 / Math.PI; // radians -> degrees

public void onSensorChanged(SensorEvent event) {
    long now = event.timestamp;     // ns

    switch( event.sensor.getType() ) {
      case Sensor.TYPE_ACCELEROMETER:
        gData[0] = event.values[0];
        gData[1] = event.values[1];
        gData[2] = event.values[2];
        break;
      case Sensor.TYPE_MAGNETIC_FIELD:
        mData[0] = event.values[0];
        mData[1] = event.values[1];
        mData[2] = event.values[2];
        haveData = true;
        break;
    }

    if( haveData ) {
        double dt = (now - last_time) * .000000001;   // ns -> s

        SensorManager.getRotationMatrix(R1, Imat, gData, mData);
        SensorManager.getOrientation(R1, orientation);
        pfdView.newOrientation(orientation[2], (float)dt);

        Log.d(TAG, "yaw: " + (int)(orientation[0]*DEG));
        Log.d(TAG, "pitch: " + (int)(orientation[1]*DEG));
        Log.d(TAG, "roll: " + (int)(orientation[2]*DEG));

        acft.compass = orientation[0];

        last_time = now;
    }
}
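
For completeness, the listeners that feed this callback would be registered along these lines (a sketch; where and how you obtain the SensorManager is up to your Activity):

// Sketch: typically done in onResume() of an Activity that implements SensorEventListener.
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
sensorManager.registerListener(this,
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_GAME);
sensorManager.registerListener(this,
        sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_GAME);
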
Otero answered 17/6, 2012 at 4:16 Comment(2)
Thanks for your reply - I just have two questions: if I use TYPE_ROTATION_VECTOR, is the quaternion expressed in world coordinates? Is it already integrated, or do I have to integrate it to get the current vector? – Torquemada
World coordinates of the device, I believe. It is already converted to a unit quaternion, and sometimes the w term is omitted. To be honest, I've never used it myself. The code I display above is from a flight navigation app I wrote for Android 1.5, before the new sensor types were available. – Otero
