I want to launch some features of my Android app ("Start", "Stop") through voice commands to Google Assistant. I have integrated actions.intent.OPEN_APP_FEATURE into my app and tested it successfully through the App Actions Test Tool in Android Studio.

- What are the next steps to be able to launch these features from Google Assistant?
- Which natural language queries does the Open App Feature built-in intent support?

My actions.xml:
<?xml version="1.0" encoding="utf-8"?>
<actions>
    <action intentName="actions.intent.OPEN_APP_FEATURE">
        <fulfillment urlTemplate="http://www.my-app.com/{?featureName}">
            <parameter-mapping
                intentParameter="feature"
                urlParameter="featureName" />
        </fulfillment>
        <parameter name="feature">
            <entity-set-reference entitySetId="FeatureEntitySet" />
        </parameter>
    </action>
    <entity-set entitySetId="FeatureEntitySet">
        <entity
            name="@string/start_capture"
            identifier="START" />
        <entity
            name="@string/stop_capture"
            identifier="STOP" />
    </entity-set>
</actions>
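For context, when the Test Tool triggers the action, my Activity receives the expanded fulfillment URL (e.g. http://www.my-app.com/?featureName=START) and dispatches on the featureName query parameter, whose value is one of the entity identifiers (START or STOP). My handling boils down to roughly the following sketch (plain Java without the Android framework; class and method names are illustrative, not my actual code):

```java
import java.net.URI;

public class FeatureDispatch {
    // Extract the value of one query parameter from an expanded fulfillment URL.
    static String queryParam(String url, String name) {
        String query = URI.create(url).getQuery(); // e.g. "featureName=START"
        if (query == null) return null;
        for (String pair : query.split("&")) {
            String[] kv = pair.split("=", 2);
            if (kv.length == 2 && kv[0].equals(name)) return kv[1];
        }
        return null;
    }

    // Map the entity identifier from the entity-set to an app feature.
    static String dispatch(String url) {
        String feature = queryParam(url, "featureName");
        if ("START".equals(feature)) return "start capture";
        if ("STOP".equals(feature)) return "stop capture";
        return "unknown feature";
    }

    public static void main(String[] args) {
        System.out.println(dispatch("http://www.my-app.com/?featureName=START"));
    }
}
```

This path works when the Test Tool fires the intent, so the question is only about triggering it from Assistant itself.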
When asking Google Assistant something like "Open start from MyApp", I expected the same behavior as when testing through the App Actions Test Tool (the app feature opens), but Google Assistant returns generic web search results instead.
I see the same behavior with actions.xml when specifying the inline inventory for feature. And I still don't see anything in the documentation saying which natural language queries trigger this action. – Jibber