I am working on a multi-touch app using MonoGame, where multiple users can work with separate documents/images/videos simultaneously on a large multi-touch screen. I was wondering if it's possible to make gestures "context-aware", i.e. two fingers pinching a document on one side of the wall shouldn't affect somebody panning on the other side of the wall.
The way MonoGame works, all input points are translated into gestures, which can be read using:

while (TouchPanel.IsGestureAvailable)
{
    var gesture = TouchPanel.ReadGesture();
    // do stuff
}
Is there a way to limit gestures to a certain region of the screen, or do I need to implement this myself? Looking at the source code, it appears that the TouchPanelState class does all the work, but unfortunately its constructors are internal.
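If the built-in gesture recognizer can't be scoped, one do-it-yourself approach is to skip TouchPanel.ReadGesture() entirely and partition the raw touch points from TouchPanel.GetState() by screen region, detecting pinch/pan per region. A minimal sketch of that idea — the region layout, class name, and gesture logic here are illustrative assumptions, not MonoGame API:

using System.Collections.Generic;
using System.Linq;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input.Touch;

class RegionGestures
{
    // Assumption: the wall is split into two side-by-side regions,
    // one per user. Adjust to your actual layout.
    static readonly Rectangle[] Regions =
    {
        new Rectangle(0, 0, 960, 1080),
        new Rectangle(960, 0, 960, 1080),
    };

    // Previous two-finger distance per region, for pinch deltas.
    readonly Dictionary<int, float> prevPinchDist = new Dictionary<int, float>();

    public void Update()
    {
        TouchCollection touches = TouchPanel.GetState();

        for (int i = 0; i < Regions.Length; i++)
        {
            // Keep only the touches that fall inside this region,
            // so fingers elsewhere on the wall are invisible here.
            var local = touches
                .Where(t => Regions[i].Contains((int)t.Position.X, (int)t.Position.Y))
                .ToList();

            if (local.Count == 2)
                DetectPinch(i, local[0], local[1]);
            else if (local.Count == 1)
                DetectPan(i, local[0]);
            else
                prevPinchDist.Remove(i);
        }
    }

    void DetectPinch(int region, TouchLocation a, TouchLocation b)
    {
        float dist = Vector2.Distance(a.Position, b.Position);
        if (prevPinchDist.TryGetValue(region, out float prev))
        {
            float delta = dist - prev; // > 0: fingers spreading, < 0: closing
            // apply zoom delta to the document owned by this region
        }
        prevPinchDist[region] = dist;
    }

    void DetectPan(int region, TouchLocation t)
    {
        prevPinchDist.Remove(region);
        if (t.TryGetPreviousLocation(out TouchLocation prev))
        {
            Vector2 delta = t.Position - prev.Position;
            // apply pan delta to the document owned by this region
        }
    }
}

Since each region only ever sees its own touch points, a pinch on one side of the wall can't be misread as part of a pan on the other side.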
The TouchPanel static class has a GetState() method. Maybe that could help with instantiating the TouchPanelState class you want? – Mongolian