How to implement an Assistant with Google Assist API
I have been checking out and reading about Google Now on Tap (from http://developer.android.com/training/articles/assistant.html).

It was very interesting to learn from that article that Now on Tap is built on Google's Assist API, which is bundled with Marshmallow, and that it seems possible for us to develop our own assistant (the term Google uses in the article for apps like Now on Tap) using the API.

However, the article only very briefly discusses how to use the Assist API, and I couldn't find any additional information about using it to develop a custom assistant, even after spending a few days searching the Internet. There is no documentation and no example.

I was wondering if any of you have experience with the Assist API that you could share? Any help is appreciated.

Thanks

Childhood answered 9/2, 2016 at 10:26 Comment(1)
Is Google Now on Tap still a thing in, e.g., Android 10? I can't see that option on my phone, and all the articles and videos I find date to 2015–2016. I just see Google Assistant coming to life, with no option to revert to Google Now on Tap. – Spinal
You can definitely implement a personal assistant, just like Google Now on Tap, using the Assist API starting with Android 6.0. The official developer guide (http://developer.android.com/training/articles/assistant.html) describes exactly how to implement it:

Some developers may wish to implement their own assistant. As shown in Figure 2, the active assistant app can be selected by the Android user. The assistant app must provide an implementation of VoiceInteractionSessionService and VoiceInteractionSession as shown in this example and it requires the BIND_VOICE_INTERACTION permission. It can then receive the text and view hierarchy represented as an instance of the AssistStructure in onHandleAssist(). The assistant receives the screenshot through onHandleScreenshot().

CommonsWare has four demos of basic Assist API usage; the TapOffNow demo (https://github.com/commonsguy/cw-omnibus/tree/master/Assist/TapOffNow) should be enough to get you started.

You don't have to use onHandleScreenshot() to get the relevant textual data; the AssistStructure passed to onHandleAssist() gives you a root ViewNode that usually contains everything you can see on the screen.

You will probably also need to implement some sort of helper to quickly locate the specific ViewNode you want to focus on, using a recursive search over the children of this root ViewNode.
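The recursive search described above can be sketched as follows. Since AssistStructure.ViewNode only exists on a device, this standalone sketch models it with a hypothetical stub Node class that mirrors the real getChildCount()/getChildAt(int)/getText() methods; the traversal itself is the same pattern you would run on the root ViewNode inside onHandleAssist():

```java
import java.util.ArrayList;
import java.util.List;

public class NodeSearch {
    // Minimal stand-in for AssistStructure.ViewNode, for illustration only.
    // The real class exposes getChildCount(), getChildAt(int) and getText().
    public static class Node {
        public final CharSequence text;
        public final List<Node> children = new ArrayList<>();
        public Node(CharSequence text) { this.text = text; }
        public int getChildCount() { return children.size(); }
        public Node getChildAt(int i) { return children.get(i); }
        public CharSequence getText() { return text; }
    }

    // Depth-first walk collecting all non-empty text, top to bottom.
    public static void collectText(Node node, List<String> out) {
        if (node == null) return;
        CharSequence text = node.getText();
        if (text != null && text.length() > 0) {
            out.add(text.toString());
        }
        for (int i = 0; i < node.getChildCount(); i++) {
            collectText(node.getChildAt(i), out);
        }
    }

    public static void main(String[] args) {
        Node root = new Node(null);
        Node title = new Node("Title");
        title.children.add(new Node("Subtitle"));
        root.children.add(title);

        List<String> texts = new ArrayList<>();
        collectText(root, texts);
        System.out.println(texts);  // prints [Title, Subtitle]
    }
}
```

The same depth-first shape also works for finding a single node (return the first match instead of accumulating into a list).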

Capreolate answered 8/3, 2016 at 4:31 Comment(5)
Thanks @lifelogger. I did see the paragraph you quoted from the developer docs; however, to me as a beginner in Android, it's quite vague. For instance, how to implement VoiceInteractionSessionService and VoiceInteractionSession (the example they provided wasn't very helpful to me), how to integrate such an implementation into an assistant app, how the Android system recognises my custom assistant app, etc. The examples you provided were definitely much more helpful. Thanks for that. – Childhood
I've tried out the example apps and found they only work if we open the apps and hold the home button for a while. I'm wondering if there is anything we can do to make one completely replace Google Now on Tap. For example, can we set it as the default assistant so that whenever the user holds the home button on any screen, the app pops up? That seems possible according to Google ("the active assistant app can be selected by the Android user"), but I couldn't find a way to do it after quite a lot of searching... P.S. Sorry, I don't have 15 reputation points, so my upvote for you isn't visible :( – Childhood
You can set your assistant as the default by going to Settings → Assist & voice input and selecting yours. – Capreolate
Note that the Assist API only works starting with Android 6.0. – Capreolate
Hey @H.Nguyen, any more issues with this? If not, please accept my answer. – Capreolate
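For development, the Settings path mentioned in the comments above can also be driven over adb on many builds. This is a hedged sketch, not an official recipe: the secure-settings key names can vary by Android version, and the component name below is taken from the example package used elsewhere on this page, so substitute your own.

```shell
# Point the system's assistant selection at your VoiceInteractionService
# (development/testing only; adjust the component name to your app).
adb shell settings put secure voice_interaction_service \
    com.eaydin79.voiceinteraction/.voiceInteractionService
adb shell settings put secure assistant \
    com.eaydin79.voiceinteraction/.voiceInteractionService
```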

There is a complete example here, but it's too complicated to start with. Here is my example, which works on Android 7.1.1.

AndroidManifest.xml

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.eaydin79.voiceinteraction">
    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:theme="@style/AppTheme" >
        <service 
            android:name="voiceInteractionService"
            android:permission="android.permission.BIND_VOICE_INTERACTION" >
            <meta-data 
                android:name="android.voice_interaction"
                android:resource="@xml/interaction_service" />
            <intent-filter>
                <action android:name="android.service.voice.VoiceInteractionService" />
            </intent-filter>
        </service>
        <service 
            android:name="voiceInteractionSessionService"
            android:permission="android.permission.BIND_VOICE_INTERACTION" >
        </service>
    </application>
</manifest>

This is the interaction_service.xml file, stored in the res/xml folder:

<?xml version="1.0" encoding="utf-8"?>
<voice-interaction-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:sessionService="com.eaydin79.voiceinteraction.voiceInteractionSessionService"
    android:recognitionService="com.eaydin79.voiceinteraction.voiceInteractionService"
    android:supportsAssist="true" />

voiceInteractionService.java

package com.eaydin79.voiceinteraction;
import android.service.voice.VoiceInteractionService;

// Entry point the system binds to; sessions are created by the session service.
public class voiceInteractionService extends VoiceInteractionService {
    @Override
    public void onReady() {
        super.onReady();
    }
}

voiceInteractionSessionService.java

package com.eaydin79.voiceinteraction;
import android.os.Bundle;
import android.service.voice.VoiceInteractionSession;
import android.service.voice.VoiceInteractionSessionService;

public class voiceInteractionSessionService extends VoiceInteractionSessionService {    
    @Override
    public VoiceInteractionSession onNewSession(Bundle bundle) {
         return new voiceInteractionSession(this);
    }
}

voiceInteractionSession.java

package com.eaydin79.voiceinteraction;
import android.content.Context;
import android.media.AudioManager;
import android.os.Bundle;
import android.service.voice.VoiceInteractionSession;

public class voiceInteractionSession extends VoiceInteractionSession {
   
    voiceInteractionSession(Context context) {
        super(context);
    }

    @Override
    public void onShow(Bundle args, int showFlags) {
        super.onShow(args, showFlags);
        // Whatever you want to do when the user holds the home button.
        // I am using it to show the volume control slider.
        AudioManager audioManager = (AudioManager) getContext().getSystemService(Context.AUDIO_SERVICE);
        if (audioManager != null) {
            audioManager.adjustStreamVolume(AudioManager.STREAM_MUSIC, AudioManager.ADJUST_SAME, AudioManager.FLAG_SHOW_UI);
        }
        hide();
    }

}
Surd answered 20/3, 2021 at 21:31 Comment(3)
Another example I found here. – Surd
Thanks for posting this. I tried building it on Android 11, but how do I enable it as an assistant? I don't see the "Assist and Voice Input" option anywhere. Any ideas? – Lanyard
@Lanyard Maybe this flow, with adaptations for Android 11, or just search for a similar video on YouTube. – Spinal

© 2022 - 2024 — McMap. All rights reserved.