what's wrong with my sensor monitoring technique?

(Please read UPDATE 3 at the end.) I'm developing an app that works continuously with the device's sensors, using the accelerometer and the magnetic field sensor to retrieve the device's orientation (the purpose is mentioned here). In other words, my app needs to know the orientation of the device in real time. That is never strictly possible, so I want it as fast as possible instead (but really as fast as possible!). As mentioned in Professional Android 4 Application Development by Reto Meier:

The accelerometers can update hundreds of times a second...

I must not lose any data that the sensors report, and I also want to perform time-consuming operations on that data (retrieving the orientation and then doing further calculations...). I decided to solve the problem with a LinkedBlockingQueue:

    public void startSensors() {
        // 'array' is shared with doCalculations(), so it is a field rather than a local
        array = new LinkedBlockingQueue<float[][]>();
        sensorListenerForOrientation = new SensorEventListener() {

            @Override
            public void onSensorChanged(SensorEvent event) {
                if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                    aValues = event.values.clone();
                else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                    mValues = event.values.clone();
                if (aValues != null && mValues != null) {
                    try {
                        array.put(new float[][] { aValues, mValues });
                    } catch (InterruptedException e) {
                    }
                }
            }

            @Override
            public void onAccuracyChanged(Sensor sensor, int accuracy) {
            }
        };
        Sensor aSensor = sm.getSensorList(Sensor.TYPE_ACCELEROMETER).get(
                sm.getSensorList(Sensor.TYPE_ACCELEROMETER).size() - 1);
        Sensor mSensor = sm.getSensorList(Sensor.TYPE_MAGNETIC_FIELD).get(
                sm.getSensorList(Sensor.TYPE_MAGNETIC_FIELD).size() - 1);
        sm.registerListener(sensorListenerForOrientation, aSensor,
                SensorManager.SENSOR_DELAY_FASTEST);
        sm.registerListener(sensorListenerForOrientation, mSensor,
                SensorManager.SENSOR_DELAY_FASTEST);
        executor.execute(new Runnable() {
            @Override
            public void run() {
                doCalculations();
            }
        });
    }

and

    public void doCalculations() {
        for (;;) {
            float[][] result = null;
            try {
                result = array.take();
            } catch (InterruptedException e) {
            }
            float[] aValues = result[0];
            float[] mValues = result[1];

            int[] degrees = getOrientation(aValues, mValues);
            Log.e("", String.valueOf(degrees[0]));

            // other calculations...
        }
    }
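
For context, the getOrientation(aValues, mValues) helper is not shown above. On Android, the usual way to turn an accelerometer/magnetometer pair into azimuth/pitch/roll is SensorManager.getRotationMatrix() followed by SensorManager.getOrientation(); the sketch below is one hypothetical version of such a helper under that assumption, not the asker's actual code.

    import android.hardware.SensorManager;

    // Hypothetical helper, NOT the asker's implementation: converts an accelerometer
    // reading and a magnetometer reading into azimuth/pitch/roll in whole degrees.
    public static int[] getOrientation(float[] aValues, float[] mValues) {
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3];
        int[] degrees = new int[3];

        // Build the rotation matrix from gravity (accelerometer) and geomagnetic data.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, aValues, mValues)) {
            // Azimuth, pitch and roll come back in radians.
            SensorManager.getOrientation(rotationMatrix, orientation);
            for (int i = 0; i < 3; i++) {
                degrees[i] = (int) Math.round(Math.toDegrees(orientation[i]));
            }
        }
        return degrees;
    }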

Now I pick up my device, rotate it about 90 degrees to the right, and then quickly return it to the starting position (in about 1.5 seconds, say). But when I look at the orientations that were logged, I see, for example: 0,1,2,3,4,5,...,40,39,38,37,...,0

In other words, I never see the full range of degrees in my result. Based on what I have done and what I have researched, I am sure that I am NOT losing any data; every new value reported by the sensors is recorded.

Any idea or solution?!

Regards!

UPDATE 1: I did another experiment with my device and got surprising results! If I rotate my device 90 degrees about an axis quickly (in less than a second), I see all the degrees in my result: 0,1,2,3,...,89,90 (for example). But if I rotate it 90 degrees and then rotate it back to its starting position, the result is 0,1,2,...,36,37,36,...,2,1,0 (for example). Really confusing!

UPDATE 2: I updated the doCalculations() method to make it clearer what I have done.

UPDATE 3: I think maybe we can solve the problem in another way! I have a clear purpose for this code; please have a look at this. I have described what is supposed to happen: I need to detect a specific movement gesture. So maybe the whole approach I have chosen (the technique above) is not a good way to solve this problem. Maybe it's better to detect that gesture using other sensors, or using the same sensors in a different way. What do you think?

Xymenes answered 13/9, 2013 at 12:46 Comment(10)
What are aValues and mValues in your listener? Shared variables? Have you tried logging each event in the listener to check if you are missing any?Sindysine
global variables: public float[] aValues,mValues;Xymenes
Yes, I've just realized that I'm not missing anything; the problem is somewhere else... (I will update my post now and mention new things)Xymenes
Where do you get the values from (1,2,3...37,36,...)? From logging all the events on the first line of your listener? If yes, then it's an issue with the sensor/how you access its data, if not then it could be a concurrency issue. Please indicate clearly that point.Sindysine
@Sindysine I updated my code (doCalculation() method)Xymenes
Why don't you try to log all events on the first line of your listener? If the two lists of events (in the listener and in doCalculation) are the same, the problem is with the listener/sensor, if they are different, the problem is with your code, probably a concurrency issue. At least that will narrow down the problem.Sindysine
@Sindysine I have tried what you suggested. The two lists are exactly the same! So the problem seems to come from the sensors. But read what I have mentioned in the update paragraph; that's what leaves me really confused :(Xymenes
Your last update (experiment) and your last comment together make this a really interesting question. I would suggest that you remove all the code in doCalculations and just log the values.Jacobba
@SherifelKhatib The logging results say that nothing is lost, so where does the problem come from? I don't know :(Xymenes
please read Update 3 !Xymenes

Your code looks reasonable. The big unknown is how good the sensors and the sensor fusion are on your device. Quick angle-change readings rely either on integrating angular acceleration or on a physical gyroscope, with magnetic data mixed in to make the result align absolutely with the earth. Magnetic data are subject to the surroundings: big metal structures and magnetic equipment (motors, even fluorescent light ballasts) can blank out the field or introduce arbitrary errors. If your device has low-quality sensors, or there are magnetic disturbances in your environment, it is entirely possible to see the kind of error you are seeing.

For normal use, a device only needs an accelerometer to determine which way is down so that screen flips are accurate. That only has to work when the device is not moving, where a gyro plays no role. If you have a phone or tablet with sensors meant only to serve this purpose, and therefore with no gyro or an inaccurate one, you are seeing a device limitation. The erratic values are further evidence that your device is low quality and/or that you are in a location where the earth's magnetic field is being distorted. Try the program on another (preferably expensive) device, outside and in the open, and see what you get.
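
One quick way to check whether a hardware gyroscope is present at all (a small sketch; it assumes sm is the SensorManager already used in the question and relies only on the standard getDefaultSensor() API):

    import android.hardware.Sensor;
    import android.hardware.SensorManager;
    import android.util.Log;

    // Sketch: probe for a hardware gyroscope before relying on fast orientation changes.
    // getDefaultSensor() returns null when the requested sensor type is absent.
    public static boolean hasGyroscope(SensorManager sm) {
        Sensor gyro = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        if (gyro == null) {
            Log.w("SensorCheck", "No gyroscope; orientation relies on accelerometer + magnetometer only");
            return false;
        }
        Log.i("SensorCheck", "Gyroscope present: " + gyro.getName() + " (" + gyro.getVendor() + ")");
        return true;
    }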

Driskill answered 22/9, 2013 at 1:1 Comment(7)
Thank you for your ideas, but have you read Update 2? I can't understand why rotating quickly about an axis gives me what I want (maybe with some latency), but if I rotate the device and then rotate it back to the starting position, things change completely and another shocking (!) result appears...?Xymenes
Do you think using the gyroscope sensor to retrieve the orientation would help?Xymenes
@Xymenes I read all the updates. The point is that if the device has a low resolution sensor system with no physical gyro, only the down vector will be reliable, and only when the device is stationary. North will be reasonably correct if you're away from big metal and magnets. It's possible that fast rotation works better with cheap hardware while slower rotation is subject to more error. This would be consistent with the device having no hardware gyro. iPhone did not have a gyro until V4. If your Android device is old or cheap, it may not have one, either.Driskill
So, if I can use the gyro sensor alongside the accelerometer and magnetic sensors, I can get better results, but a gyro is not available on all devices. Correct?!Xymenes
It seems that the approach I have chosen to solve my problem (detecting a gesture: #18812280) is not a reliable or reasonable one and I must think differently. What do you think? Any ideas?Xymenes
@Xymenes What is the big picture? What kind of software is this? Game? There is a reason all game-writers were so excited when iPhone 4 added a hardware gyro. It made gestures much more practical. If your device has no gyro, it will be hard to get any kind of resolution. You might try writing directly to the developer.android.com/guide/topics/sensors/… gyro API to see what you get.Driskill
No, it's not a game; it's a simple app that needs to detect this movement gesture. Since not all devices have a gyro, it's not reliable to use the gyro sensor in the calculations; on the other hand, using the accelerometer and magnetic sensors together didn't give good results. So I'm going crazy over how to detect that gesture with the tools we have!Xymenes

So it looks like you are trying to find a high-throughput, low-latency solution to a standard "producer-consumer" problem. Basically, the idea is quite straightforward: decrease the data-handling overhead and process data in parallel. Suggestions are the following:

1. Use "low latency" libraries

  • javolution.org - a real-time library aiming to make Java and Java-like/C++ applications faster and more time-predictable. It includes Android support.
  • mentaqueue - a super-fast, garbage-less, lock-free, two-thread (producer-consumer) queue based on the Disruptor ideas. Android support is undefined (it looks like it should work).
  • disruptor - yet another lightning-fast library.
  • trove - provides high-speed regular and primitive collections for Java.

Any of these solutions will save you a lot of CPU cycles.

[Chart: throughput comparison of the collection/queue implementations listed above]

2. Process data wisely

There is an overhead every time you submit a job. Batch processing can be really helpful.


Process data continuously. Note that executor.execute() carries a fair amount of overhead per submission; several long-lived consumers might help.
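
As an illustration only (assuming the LinkedBlockingQueue named array and the getOrientation() helper from the question), a long-lived consumer can use BlockingQueue.drainTo() to pull everything that has accumulated in one call instead of taking samples one by one:

    import java.util.ArrayList;
    import java.util.List;

    // Sketch of a long-lived batching consumer; amortises per-element overhead.
    public void consumeInBatches() throws InterruptedException {
        List<float[][]> batch = new ArrayList<float[][]>();
        for (;;) {
            batch.add(array.take());       // block until at least one sample is available
            array.drainTo(batch, 63);      // then grab whatever else is queued, up to a cap
            for (float[][] sample : batch) {
                int[] degrees = getOrientation(sample[0], sample[1]);
                // ...gesture detection on 'degrees'...
            }
            batch.clear();
        }
    }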

3. Finally, use micro-optimization techniques

For example, get rid of if-else-if chains in favor of switch.
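
For instance, in the listener from the question (a trivial sketch; aValues and mValues are the asker's fields):

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {      // evaluate the sensor type once, branch on it
            case Sensor.TYPE_ACCELEROMETER:
                aValues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                mValues = event.values.clone();
                break;
            default:
                break;
        }
        // ...enqueue the pair as before...
    }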

Track performance all the time in order to identify good and bad solutions. Experiment.

Happy coding.

Castellany answered 18/9, 2013 at 0:36 Comment(2)
The things you have suggested are very helpful for latency issues, but right now, based on the experiments I mentioned above and the logging results, I can at least be sure that nothing is being lost; or let me say I have all the data the sensors give me, because every new reading is recorded as soon as it arrives... that's why I'm getting crazy and confused :(Xymenes
Interesting ideas, but your chart only includes collections which, to my knowledge, are not thread safe. I don't know how much difference you'll find between java.util.concurrent and the Disruptor, for example. It could be worth trying other queue implementations first (ArrayBlockingQueue, for example). Also, I'd suggest trying to get rid of the shared variables, which incur a penalty compared to local variables.Sindysine

Just a thought: please try the following:

public void startSensors() {
    final Stack<Runnable> mStack = new Stack<Runnable>();
    sensorListenerForOrientation = new SensorEventListener() {

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                aValues = (event.values.clone());
            else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                mValues = (event.values.clone());
            if (aValues != null && mValues != null) {
                mStack.push(new Calculater(new float[][] { aValues, mValues }));
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    };
    Sensor aSensor = sm.getSensorList(Sensor.TYPE_ACCELEROMETER).get(
            sm.getSensorList(Sensor.TYPE_ACCELEROMETER).size() - 1);
    Sensor mSensor = sm.getSensorList(Sensor.TYPE_MAGNETIC_FIELD).get(
            sm.getSensorList(Sensor.TYPE_MAGNETIC_FIELD).size() - 1);
    sm.registerListener(sensorListenerForOrientation, aSensor,
            SensorManager.SENSOR_DELAY_FASTEST);
    sm.registerListener(sensorListenerForOrientation, mSensor,
            SensorManager.SENSOR_DELAY_FASTEST);
    new Thread() { 
        public void run() {
            while(true)
            {
                try { 
                    Runnable r = mStack.pop();
                    r.run();
                } catch(Exception ex){}
            }
        }
    }.start();
}
private class Calculater implements Runnable {
    float[][] theValues;
    public Calculater(float[][] values) {
        theValues = values;
    }
    public void run() {
        int[] degrees= getOrientation(theValues[0], theValues[1]);
        Log.e("",String.valueOf(degrees[0]));
    }
}
Jacobba answered 18/9, 2013 at 15:0 Comment(2)
Thanks, it works well, but the problem I mentioned in Update 1 still exists!!! Interesting, isn't it?!Xymenes
it should be a queue instead of stack (my fault). if this still loses some values, then the problem is in the sensor and not the codeJacobba

The usual thing to do inside an event callback is almost nothing, because events arrive really fast. "Almost" is the important word. In your case, the callback could just add the event's data (from the event parameter) to some data structure (a list, a stack, a circular buffer... your pick). That way you should lose fewer events (if any).

You can then read the stored events (for instance, periodically) and decide whether a gesture was made. That means your intensive calculations run less often, but you don't lose any events. I think this is acceptable for your purpose, which is gesture recognition; I assume the check doesn't have to run on every single sensor update.

Note: this is a common way of handling things in the Linux world (do the minimum in the handler, defer the real work).
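
A minimal sketch of that approach (the class name PeriodicGestureChecker and the detectGesture routine are illustrative, not an existing API): the listener just records samples into a concurrent buffer, and a scheduled task periodically drains it and runs the gesture check.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ConcurrentLinkedQueue;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class PeriodicGestureChecker {
        // Filled from onSensorChanged() via record(aValues, mValues); no heavy work there.
        private final ConcurrentLinkedQueue<float[][]> buffer = new ConcurrentLinkedQueue<float[][]>();
        private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        public void start() {
            // Every 100 ms, take a snapshot of everything collected so far and analyse it.
            scheduler.scheduleWithFixedDelay(new Runnable() {
                @Override
                public void run() {
                    List<float[][]> snapshot = new ArrayList<float[][]>();
                    float[][] sample;
                    while ((sample = buffer.poll()) != null) {
                        snapshot.add(sample);
                    }
                    if (!snapshot.isEmpty()) {
                        detectGesture(snapshot);  // hypothetical gesture-detection routine
                    }
                }
            }, 0, 100, TimeUnit.MILLISECONDS);
        }

        public void record(float[] aValues, float[] mValues) {
            buffer.add(new float[][] { aValues, mValues });
        }

        private void detectGesture(List<float[][]> samples) {
            // placeholder: run the orientation math / gesture logic on the batch
        }
    }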

Armada answered 22/9, 2013 at 2:39 Comment(2)
What you have suggested doesn't solve the problem in my case, since I am not losing any data. The worst that can happen is latency in the calculations, but the calculations are done for all the data. Your suggestion may help me with latency, and it's actually a good idea for the latency issue, thanks! But the main problem remains :(Xymenes
I see. Thank you for commenting, and I'm glad that my answer was somewhat useful. Good luck solving your problem!Armada

Just a thought. I had a similar problem when I needed to collect several large sample sets and perform calculations. My situation was probably quite different from yours, as I only needed acceleration. What I did was create an ArrayList and calculate the acceleration for every reported record:

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];

        float acceleration = FloatMath.sqrt((x * x) + (y * y) + (z * z));

Then, in the same onSensorChanged method, I wait until the list reaches a certain size, say 300, copy that sample set to a new list, clear the original, perform the calculations on the new list, and continue in that manner. I get results in seconds. I'm not sure how much delay is acceptable for your application, but when I run this I get what I'm looking for in less than 5 seconds. If you need more sample code let me know, but that is the gist. Sorry if I didn't understand your question properly, but I think you were asking for a way to process the data without losing much? Also, I register the listener with a separate Handler so the work doesn't interfere with the main thread or affect the user experience.
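
A condensed sketch of that idea, not the answerer's actual code: a batch size of 300 is assumed, the heavy work is handed to a single background worker, and Math.sqrt replaces the deprecated FloatMath.

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class BatchingListener implements SensorEventListener {
        private static final int BATCH_SIZE = 300;            // sample-count threshold, as in the answer
        private final List<Float> samples = new ArrayList<Float>();
        private final ExecutorService worker = Executors.newSingleThreadExecutor();

        @Override
        public void onSensorChanged(SensorEvent event) {
            float x = event.values[0];
            float y = event.values[1];
            float z = event.values[2];
            samples.add((float) Math.sqrt(x * x + y * y + z * z));  // acceleration magnitude

            if (samples.size() >= BATCH_SIZE) {
                final List<Float> batch = new ArrayList<Float>(samples);  // copy the window
                samples.clear();                                          // start the next window
                worker.execute(new Runnable() {
                    @Override
                    public void run() {
                        // run the expensive calculations on 'batch' off the sensor callback
                    }
                });
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    }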

Vidar answered 24/5, 2016 at 10:44 Comment(0)
  1. Change the variable declaration:

    List<float[][]> array = Collections.synchronizedList(new ArrayList<float[][]>());

  2. Inside the runnable:

    Iterator<float[][]> values = array.iterator();
    while (values.hasNext()) {
        float[][] result = values.next();
        // calculating...

        // after calculating, remove the item.
        values.remove();
    }
Baculiform answered 15/9, 2013 at 17:30 Comment(6)
LinkedBlockingQueue is designed to handle multiple threads, but not atomic operations where you put, read, and delete at the same time. Combining synchronizedList and an iterator, it works.Baculiform
thanks, I will try it tomorrow and let you know the result :)Xymenes
As you can see in my code, the runnable is executed once. You have put Iterator<float[][]> values = array.iterator(); before while (values.hasNext()), so that happens only once too. Isn't that a problem?Xymenes
Put the iterator inside your for(;;): for(;;){ /* the code runs here in a loop */ }Baculiform
Your code has 2 problems: 1. you never remove any element from array; 2. I read the Iterator documentation, and it is NOT recommended for synchronization, it is for debugging purposes...Xymenes
@Baculiform that makes no sense: BlockingQueues are already thread safe (and by the way the loop you propose is not...).Sindysine

This is what's wrong with your code. "As fast as possible" requires fast coding techniques. Save the sensor type instead of evaluating it twice:

    @Override
    public void onSensorChanged(SensorEvent event) {
        int i = event.sensor.getType();
        if (i == Sensor.TYPE_ACCELEROMETER)
            aValues = (event.values.clone());
        else if (i == Sensor.TYPE_MAGNETIC_FIELD)
            mValues = (event.values.clone());
    }
Cracker answered 20/9, 2013 at 19:41 Comment(2)
Huh... No - && will return false if any of the two is null. The short circuit will only happen if the first is null (and will return false). As for the micro optimisation you suggest, it is probably insignificant enough to not matterSindysine
My bad on the short circuit; I'll edit it out. I see the micro-optimization is what you asked for ("but really as fast as possible!"); it saves 2 operations 100-plus times a second.Cracker
