How to simulate a touch event with Android while giving the X and Y coordinates manually?
Valentin Rocher's method works if you've extended your view, but if you're using an event listener, use this:
view.setOnTouchListener(new OnTouchListener() {
    public boolean onTouch(View v, MotionEvent event) {
        Toast toast = Toast.makeText(
                getApplicationContext(),
                "View touched",
                Toast.LENGTH_LONG
        );
        toast.show();
        return true;
    }
});
// Obtain MotionEvent object
long downTime = SystemClock.uptimeMillis();
long eventTime = SystemClock.uptimeMillis() + 100;
float x = 0.0f;
float y = 0.0f;
// List of meta states found here: developer.android.com/reference/android/view/KeyEvent.html#getMetaState()
int metaState = 0;
MotionEvent motionEvent = MotionEvent.obtain(
        downTime,
        eventTime,
        MotionEvent.ACTION_UP,
        x,
        y,
        metaState
);

// Dispatch touch event to view
view.dispatchTouchEvent(motionEvent);
For more on obtaining a MotionEvent object, here is an excellent answer: Android: How to create a MotionEvent?
downTime would be the time when the user touches down on the screen, while eventTime in this case would be when the user lifts their finger up (ACTION_UP). I am not sure if it will still work if both are the same. You could test it and post your results. – Astereognosis
eventTime is the time (in ms) when this specific event was generated; it must be obtained from uptimeMillis(). downTime is the time (in ms) when the user originally pressed down to start a stream of position events; it must also be obtained from uptimeMillis(). – Ernaline
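If you want to simulate a complete tap rather than just a lone ACTION_UP, you can dispatch an ACTION_DOWN followed by an ACTION_UP. This is only a minimal sketch based on the snippet above; the helper name simulateTap, the coordinates, and the 100 ms gap are my own choices, not part of the original answer:

import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

// Sketch: fake a full tap at (x, y) by dispatching DOWN then UP.
private void simulateTap(View view, float x, float y) {
    long downTime = SystemClock.uptimeMillis();
    MotionEvent down = MotionEvent.obtain(
            downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0);
    view.dispatchTouchEvent(down);
    down.recycle();

    long upTime = SystemClock.uptimeMillis() + 100;
    MotionEvent up = MotionEvent.obtain(
            downTime, upTime, MotionEvent.ACTION_UP, x, y, 0);
    view.dispatchTouchEvent(up);
    up.recycle();
}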
Here is a monkeyrunner script that sends touches and drags to an application. I have been using it to test that my application can handle rapid, repetitive swipe gestures.
# This is a monkeyrunner jython script that opens a connection to an Android
# device and continually sends a stream of swipe and touch gestures.
#
# See http://developer.android.com/guide/developing/tools/monkeyrunner_concepts.html
#
# usage: monkeyrunner swipe_monkey.py
#

# Imports the monkeyrunner modules used by this program
from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice

# Connects to the current device
device = MonkeyRunner.waitForConnection()

# A swipe left from (x1, y) to (x2, y) in 2 steps
y = 400
x1 = 100
x2 = 300
start = (x1, y)
end = (x2, y)
duration = 0.2
steps = 2
pause = 0.2

for i in range(1, 250):
    # Every so often inject a touch to spice things up!
    if i % 9 == 0:
        device.touch(x2, y, 'DOWN_AND_UP')
        MonkeyRunner.sleep(pause)
    # Swipe right
    device.drag(start, end, duration, steps)
    MonkeyRunner.sleep(pause)
    # Swipe left
    device.drag(end, start, duration, steps)
    MonkeyRunner.sleep(pause)
Consider using MonkeyDevice.DOWN_AND_UP instead of the string 'DOWN_AND_UP'. (DOWN_AND_UP is the default, so your code still works.) – Fingerling
UP action – Axes
Use adb shell commands to simulate the touch event:
adb shell input tap x y
and also
adb shell sendevent /dev/input/event0 3 0 5
adb shell sendevent /dev/input/event0 3 1 29
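If you want to issue the same input tap command from an instrumented test instead of from a host PC, something like this sketch may work. It assumes the androidx.test uiautomator library and API level 21+, neither of which is mentioned in the answers above:

import java.io.IOException;
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.UiDevice;

// Sketch: run "input tap x y" through UiAutomator's shell access.
void tapAt(int x, int y) throws IOException {
    UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());
    device.executeShellCommand("input tap " + x + " " + y);
}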
You should give the new monkeyrunner a go. Maybe this can solve your problems. You can put keycodes in it for testing; touch events may also be possible.
It is not adb shell monkey; it is the monkeyrunner, which is a different tool. – Wrand
It is in the tools directory. – Wrand
Run android update sdk and install the latest SDK and tools. They also contain the monkeyrunner. – Wrand
If I understand correctly, you want to do this programmatically. Then you could use the onTouchEvent method of View and create a MotionEvent with the coordinates you need.
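A minimal sketch of that idea (it reuses the MotionEvent.obtain() call from the first answer; the coordinates here are just placeholders, and calling onTouchEvent directly only works from code that has a reference to the view):

long now = SystemClock.uptimeMillis();
MotionEvent event = MotionEvent.obtain(
        now, now, MotionEvent.ACTION_DOWN, 150.0f, 230.0f, 0);
view.onTouchEvent(event); // or view.dispatchTouchEvent(event)
event.recycle();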
When using Monkey script I noticed that DispatchPress(KEYCODE_BACK) does nothing, which really sucks. In many cases this is because the Activity doesn't consume the key event. The solution to this problem is to use a mix of Monkey script and the adb shell input command in sequence:
1. Monkey script gives you good timing control: it can wait a certain number of seconds for the activity, and it is a blocking adb call.
2. Finally, sending adb shell input keyevent 4 ends the running APK.
E.g.:
adb shell monkey -p com.my.application -v -v -v -f /sdcard/monkey_script.txt 1
adb shell input keyevent 4
MotionEvent is generated only by touching the screen.