With Glass you can launch an app via the 'OK, Glass' menu, and it seems to pick the nearest match unless a command is miles off; you can also, obviously, see the list of available commands on screen.
Is there any way, from within the app or from the voice prompt (after the initial app trigger), to present a similar list and have the nearest match returned?
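For context, the only in-app voice prompt I've found so far is the stock Android speech recogniser, which just hands back free text. A minimal sketch of what I mean (class name and prompt string are mine, not from any real app):

```java
import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.List;

public class ColourActivity extends Activity {
    private static final int SPEECH_REQUEST = 0; // arbitrary request code

    // Launch the stock speech prompt; as far as I can tell it only
    // returns free text, with no way to supply a list of expected phrases.
    private void promptForColour() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Which colour?");
        startActivityForResult(intent, SPEECH_REQUEST);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == SPEECH_REQUEST && resultCode == RESULT_OK) {
            // The first entry is the recogniser's best guess at what was said.
            List<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            String spokenText = results.get(0);
            // ...show the colour, if we can work out which one was meant
        }
        super.onActivityResult(requestCode, resultCode, data);
    }
}
```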
Random (non-real-world) example: an app that shows you a colour, "OK Glass, show the colour red".
'show the colour' could be your voice trigger, and Glass seems to match it on a 'nearest neighbour' basis; however, 'red' is just read in as free text, so it could easily be misheard as 'dread' or 'head', or even transcribed as 'read', since there is no way of telling 'read' from 'red' by sound alone.
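If it helps, this is how I'm reading that free text at the moment: when the voice trigger declares an input, whatever was said after the trigger phrase arrives on the launching intent. A sketch, assuming a trigger set up that way (class name is hypothetical):

```java
import android.app.Activity;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

public class ShowColourActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Whatever was said after "show the colour" arrives as free text
        // on the launching intent.
        ArrayList<String> voiceResults = getIntent().getExtras()
                .getStringArrayList(RecognizerIntent.EXTRA_RESULTS);
        String colourWord = voiceResults.get(0); // might be "red"... or "read", or "dread"
    }
}
```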
Is there a way to pass a list of pre-approved options (red, green, blue, orange*, etc.) to this stage, or to another voice prompt within the app, so the user can see the list and get more accurate results when there is a finite set of expected responses (like the main 'OK, Glass' screen)? The only workaround I can think of is filtering the results myself, as sketched below the footnote.
*OK, well, nothing rhymes with orange, so we're probably safe there.
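Failing a built-in way, my current fallback idea is to walk the recogniser's N-best candidate list (the full list behind `RecognizerIntent.EXTRA_RESULTS`) and take the first entry that appears in the approved set, re-prompting if none do. A sketch, with the class, method, and colour list all mine:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

public final class ColourMatcher {
    // The finite set of responses the app actually understands.
    private static final List<String> APPROVED =
            Arrays.asList("red", "green", "blue", "orange");

    // Walk the recogniser's N-best candidates and return the first one
    // that is an approved colour, or null if none match (re-prompt then).
    public static String matchColour(List<String> candidates) {
        for (String candidate : candidates) {
            String word = candidate.trim().toLowerCase(Locale.ENGLISH);
            if (APPROVED.contains(word)) {
                return word;
            }
        }
        return null;
    }
}
```

This at least rescues the case where 'red' shows up further down the candidate list behind 'read', but it still doesn't show the user the list of valid options up front the way the main menu does, which is what I'm really after.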