Sure you can. Read up on calling Java from C(++), and call the respective Java functions - either construct UI elements (layouts, buttons, etc.) one by one, or load an XML layout. There's no C-specific interface for that, but the Java one is there to call.
Unless it's a game and you intend to do your own drawing via OpenGL ES. I'm not sure if you can mix and match.
In a NativeActivity, you can still get a pointer to the Java Activity object and call its methods - it's the clazz member of the ANativeActivity structure that's passed to your android_main as a parameter, via the android_app structure. Take that pointer, take the JNIEnv* from the same, and assign a layout.
How this will interoperate with OpenGL drawing, I'm not sure.
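Here's a minimal sketch of that, assuming the glue library's android_app (the function name and the title string are just illustrative). One caveat: android_main runs on its own thread, so attach it to the VM first rather than reusing the env member directly:

    #include <jni.h>
    #include <android_native_app_glue.h>

    // Sketch: call a method on the Java Activity object from native code.
    static void SetActivityTitle(struct android_app* app, const char* title)
    {
        JavaVM* vm = app->activity->vm;
        JNIEnv* env = NULL;
        vm->AttachCurrentThread(&env, NULL);  // android_main runs on its own thread

        jobject activity = app->activity->clazz;  // the Java Activity instance
        jclass cls = env->GetObjectClass(activity);
        // Activity.setTitle(CharSequence) - any Activity method works the same way
        jmethodID mid = env->GetMethodID(cls, "setTitle", "(Ljava/lang/CharSequence;)V");
        jstring jtitle = env->NewStringUTF(title);
        env->CallVoidMethod(activity, mid, jtitle);

        env->DeleteLocalRef(jtitle);
        env->DeleteLocalRef(cls);
        vm->DetachCurrentThread();
    }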
EDIT: about putting together your own input processing. The key callback is onInputEvent(struct android_app* app, AInputEvent* event) within the android_app structure. Place your callback there, and Android will call it whenever appropriate. Use AInputEvent_getType(event) to retrieve the event type; touch events have type AINPUT_EVENT_TYPE_MOTION.
EDIT2: here's a minimum native app that grabs the touch events:
    #include <jni.h>
    #include <android_native_app_glue.h>
    #include <android/log.h>

    // Called by the glue library for every input event.
    static int32_t OnInput(struct android_app* app, AInputEvent* event)
    {
        __android_log_write(ANDROID_LOG_ERROR, "MyNativeProject", "Hello input event!");
        return 0;  // 0 = not consumed; let the default handling proceed
    }

    extern "C" void android_main(struct android_app* App)
    {
        app_dummy();  // keeps the glue library from being stripped by the linker
        App->onInputEvent = OnInput;

        // The canonical glue event loop: block until a source (input events,
        // lifecycle commands) is ready, then let it dispatch.
        for (;;)
        {
            struct android_poll_source* source;
            int ident;
            int events;
            while ((ident = ALooper_pollAll(-1, NULL, &events, (void**)&source)) >= 0)
            {
                if (source != NULL)
                    source->process(App, source);
                if (App->destroyRequested != 0)
                    return;
            }
        }
    }
You need, naturally, to add a project around it, with a manifest, Android.mk and everything. Android.mk will need the following as the last line:
    $(call import-module,android/native_app_glue)
native_app_glue is a static library that provides some C bridging for the APIs that are normally consumed via Java.
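For reference, a minimal Android.mk along those lines might look like this (the module and source file names are placeholders; -llog and -landroid link the logging and NDK runtime libraries):

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_MODULE    := MyNativeProject
    LOCAL_SRC_FILES := main.cpp
    LOCAL_LDLIBS    := -llog -landroid
    LOCAL_STATIC_LIBRARIES := android_native_app_glue
    include $(BUILD_SHARED_LIBRARY)

    $(call import-module,android/native_app_glue)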
You can do it without the glue library as well. But then you'll need to provide your own ANativeActivity_onCreate function, and a bunch of other callbacks. The android_main/android_app combo is an interface defined by the glue library.
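If you go glue-less, the shape of that entry point is roughly this (which callbacks you fill in depends on what your app needs; this is a sketch, not a full skeleton):

    #include <android/native_activity.h>

    static void OnInputQueueCreated(ANativeActivity* activity, AInputQueue* queue)
    {
        // Attach the queue to a looper and read events from it yourself.
    }

    // Without the glue library, this is the entry point Android calls directly.
    extern "C" void ANativeActivity_onCreate(ANativeActivity* activity,
                                             void* savedState, size_t savedStateSize)
    {
        activity->callbacks->onInputQueueCreated = OnInputQueueCreated;
        // ...plus onNativeWindowCreated, onDestroy and friends, as needed.
    }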
EDIT: For touch coordinates, use AMotionEvent_getX/Y(), passing the event object as the first parameter and the index of the pointer as the second. Use AMotionEvent_getPointerCount() to retrieve the number of pointers (touch points). That's your native processing of multitouch events.
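Dropped into the OnInput callback from the sample above, that looks roughly like this (logging each pointer is just for illustration):

    static int32_t OnInput(struct android_app* app, AInputEvent* event)
    {
        if (AInputEvent_getType(event) == AINPUT_EVENT_TYPE_MOTION)
        {
            size_t count = AMotionEvent_getPointerCount(event);
            for (size_t i = 0; i < count; i++)
            {
                __android_log_print(ANDROID_LOG_INFO, "MyNativeProject",
                    "Pointer %zu at (%f, %f)", i,
                    AMotionEvent_getX(event, i), AMotionEvent_getY(event, i));
            }
            return 1;  // consumed
        }
        return 0;
    }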
I'm supposed to detect the [x,y] position everytime, compare it to the location of my joystick, store the previous position, compare the previous position and the next one to get the direction?
In short, yes, you are. There's no built-in platform support for virtual joysticks; you deal with touches and coordinates, and you translate that into your app's UI metaphor. That's pretty much the essence of programming.
Not "everytime" though - only when it's changing. Android is an event-driven system.
Now, about your "I want it on the OS level" sentiment. It's WRONG on many levels. First, the OS does not owe you anything. The OS is what it is; take it or leave it. Second, unwillingness to expend effort (AKA being lazy) is generally frowned upon in the software community. Third, the OS code is still code. Moving something into the OS might gain you some efficiency, but why do you think it will make a user-perceptible difference? It's touch processing we're talking about - not a particularly CPU-intensive task. Did you actually build an app, profile it, and find its performance lacking? Until you do, don't guess where the bottleneck will be. The word for that is "premature optimization", and it's something that everyone and their uncle's cat will warn you against.