We are thinking about detecting which hand the user is holding their mobile device in: right, left, or both. To the best of our knowledge this is not possible with 100% accuracy on current hardware, and we doubt it can even exceed 90%, but suppose you tried to achieve it with the sensor data available on most smartphones today. How would you process that sensor data, and how would you decide?
Our initial thoughts are:
- Checking the horizontal tilt angle via the gyroscope,
- Deciding based on face detection and the angle to the user's eyes, using the front camera.
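The tilt idea above could be sketched as a simple heuristic: average the roll angle derived from the accelerometer's gravity vector over a short window, and classify based on which way the device leans. This is only an illustration; the `guess_hand` function, the threshold, and especially the sign convention (which way a left or right thumb tilts the phone) are assumptions that would need calibration against real data.

```python
import math
from statistics import mean

def roll_deg(ax, ay, az):
    """Roll angle in degrees from one accelerometer sample (m/s^2).
    Zero when the device is flat; the sign indicates sideways lean."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def guess_hand(samples, threshold_deg=5.0):
    """Classify the holding hand from a window of (ax, ay, az) samples.
    The mapping of lean direction to hand is an ASSUMPTION here and
    would have to be verified empirically per device orientation."""
    avg = mean(roll_deg(*s) for s in samples)
    if avg > threshold_deg:
        return "left"      # assumed: leans right under a left thumb
    if avg < -threshold_deg:
        return "right"     # assumed: leans left under a right thumb
    return "unknown"       # too close to flat to decide
```

On Android you would feed this from `TYPE_GRAVITY` or a low-pass-filtered `TYPE_ACCELEROMETER` stream; even then, expect plenty of "unknown" windows, since many users hold the phone nearly flat.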
If you ask why we would do such a thing:
As devices get larger (e.g. the Samsung Note 2 and Note 3), reaching every part of the screen becomes harder, which causes user experience/ergonomics problems. We think that if we can detect the holding hand automatically with reasonable accuracy, we can adjust our layouts to provide a better user experience.
Thanks to everyone for sharing your thoughts.