Integrating PocketSphinx for Android with a PhoneGap app
I'm trying to integrate PocketSphinx with my PhoneGap app, following pocketsphinx-android-demo, but I get a RuntimeException on startup. Here are the details:

E/OADemo  (15835): java.lang.RuntimeException: Decoder_setSearch returned -1
E/OADemo  (15835):  at edu.cmu.pocketsphinx.PocketSphinxJNI.Decoder_setSearch(Native Method)
E/OADemo  (15835):  at edu.cmu.pocketsphinx.Decoder.setSearch(Unknown Source)
E/OADemo  (15835):  at edu.cmu.pocketsphinx.SpeechRecognizer.startListening(Unknown Source)
E/OADemo  (15835):  at cn.fsll.oademo.OADemo.switchSearch(OADemo.java:103)
E/OADemo  (15835):  at cn.fsll.oademo.OADemo.access$100(OADemo.java:22)
E/OADemo  (15835):  at cn.fsll.oademo.OADemo$1.onPostExecute(OADemo.java:53)
E/OADemo  (15835):  at cn.fsll.oademo.OADemo$1.onPostExecute(OADemo.java:34)
E/OADemo  (15835):  at android.os.AsyncTask.finish(AsyncTask.java:632)
E/OADemo  (15835):  at android.os.AsyncTask.access$600(AsyncTask.java:177)
E/OADemo  (15835):  at android.os.AsyncTask$InternalHandler.handleMessage(AsyncTask.java:645)
E/OADemo  (15835):  at android.os.Handler.dispatchMessage(Handler.java:102)
E/OADemo  (15835):  at android.os.Looper.loop(Looper.java:136)
E/OADemo  (15835):  at android.app.ActivityThread.main(ActivityThread.java:5050)
E/OADemo  (15835):  at java.lang.reflect.Method.invoke(Native Method)
E/OADemo  (15835):  at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:789)
E/OADemo  (15835):  at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:605)

So, how can I fix this? Any help is appreciated.

package cn.fsll.oademo;

import static android.widget.Toast.makeText;
import static edu.cmu.pocketsphinx.SpeechRecognizerSetup.defaultSetup;

import android.content.Context;
import android.os.Bundle;
import android.os.AsyncTask;
import android.widget.Toast;
import android.util.Log;

import java.io.File;
import java.io.IOException;

import edu.cmu.pocketsphinx.Assets;
import edu.cmu.pocketsphinx.Hypothesis;
import edu.cmu.pocketsphinx.RecognitionListener;
import edu.cmu.pocketsphinx.SpeechRecognizer;

import org.apache.cordova.*;

public class OADemo extends CordovaActivity implements RecognitionListener {
    private static final String KWS_SEARCH = "wakeup";
    private static final String KEYPHRASE = "oa";

    private final String LOGTAG = "OADemo";
    private SpeechRecognizer recognizer;

    @Override
    public void onCreate(Bundle savedInstanceState) {

        super.onCreate(savedInstanceState);
        super.init();

        new AsyncTask<Void, Void, Exception>() {
            @Override
            protected Exception doInBackground(Void... params) {
                try {
                    Assets assets = new Assets(OADemo.this);
                    File assetDir = assets.syncAssets();

                    setupRecognizer(assetDir);
                } catch (IOException ex) {
                    Log.e(LOGTAG, Log.getStackTraceString(ex));
                    return ex;
                }

                return null;
            }

            @Override
            protected void onPostExecute(Exception ex) {
                if(null == ex) {
                    switchSearch(KWS_SEARCH);
                } else {
                    Log.e(LOGTAG, Log.getStackTraceString(ex));
                }
            }
        }.execute();

        super.loadUrl(Config.getStartUrl());
    }

    @Override
    public void onPartialResult(Hypothesis hypothesis) {
        Log.d(LOGTAG, "onPartialResult");
        onResult(hypothesis);
    }

    @Override
    public void onResult(Hypothesis hypothesis) {
        Log.d(LOGTAG, "onResult");

        // Results forwarded from onPartialResult can be null, so guard before use.
        if (hypothesis == null) {
            return;
        }

        String text = hypothesis.getHypstr();

        if(text.equals(KEYPHRASE)) {
            recognizer.stop();
            // TODO: start iflytek speech recognizer
        }
    }

    @Override
    public void onBeginningOfSpeech() {
        Log.d(LOGTAG, "onBeginningOfSpeech");
    }

    @Override
    public void onEndOfSpeech() {
        Log.d(LOGTAG, "onEndOfSpeech");

        switchSearch(KWS_SEARCH);
    }

    private void toast(final String msg) {
        makeText(getApplicationContext(), msg, Toast.LENGTH_SHORT).show();
        Log.e(LOGTAG, msg);
    }

    private void switchSearch(final String searchName) {
        Log.d(LOGTAG, "switchSearch");

        try {
            recognizer.stop();
            recognizer.startListening(searchName);
        } catch(Exception ex) {
            Log.e(LOGTAG, Log.getStackTraceString(ex));
        }
    }

    private void setupRecognizer(final File assetsDir) {
        Log.d(LOGTAG, "setupRecognizer");

        File modelsDir = new File(assetsDir, "models");
        recognizer = defaultSetup()
            .setAcousticModel(new File(modelsDir, "hmm/en-us-semi"))
            .setDictionary(new File(modelsDir, "dict/cmu07a.dic"))
            .setRawLogDir(assetsDir).setKeywordThreshold(1e-20f)
            .getRecognizer();

        recognizer.addListener(this);
        recognizer.addKeyphraseSearch(KWS_SEARCH, KEYPHRASE);
    }
}
Ellga answered 21/5, 2014 at 1:45 · Comments (10)
Care to provide the code you wrote? Logcat output? – Homesick
I'm pretty sure that you didn't specify either a grammar or a language model. Take a look at the demo application. – Expressionism
@AlexanderSolovets, yes, that was the case at first, but in the meantime I've tried copying the code from the demo, including adding the grammar and language model to the recognizer, and I've made sure the grammar and language files are accessible from the app, but I still can't get it working. – Ellga
As @NikolayShmyrev mentioned, it's better to show the code. – Expressionism
@AlexanderSolovets, I've just added my source; many thanks for your kind help. – Ellga
The keyphrase "oa" is most likely not in the dictionary; you can see that in the logcat output, which you didn't provide although it was requested. "oa" is also too short for a keyphrase; it's better to choose a keyphrase of three syllables. – Homesick
Thanks @NikolayShmyrev, I'll try your suggestion later. I just wonder why I get this exact exception even when I provide the grammar and language model to the recognizer. – Ellga
Voilà, finally this works! – Ellga
@AlexanderSolovets, would you mind posting your comment as an answer, so that I can accept it? – Ellga
@NikolayShmyrev, would you mind posting your comment as an answer, so that I can accept it? – Ellga
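
Following the diagnosis in the comments, Decoder_setSearch returning -1 here most likely means the keyword search named in startListening() was never successfully created, because the keyphrase "oa" has no entry in dict/cmu07a.dic and is also too short to spot reliably. Below is a minimal sketch of how the setup might be adjusted along those lines, reusing the fields and imports from the question's code. The keyphrase "oh mighty computer" and the keyphraseInDictionary() helper are hypothetical placeholders (the helper is plain file I/O, not part of the PocketSphinx API), and the asset paths are assumed to match the demo layout.

// Sketch only: every word of whatever keyphrase you choose must have an entry
// in dict/cmu07a.dic, and three or more syllables spot more reliably than "oa".
private static final String KWS_SEARCH = "wakeup";
private static final String KEYPHRASE = "oh mighty computer"; // hypothetical example

private void setupRecognizer(final File assetsDir) throws IOException {
    File modelsDir = new File(assetsDir, "models");
    File dictionary = new File(modelsDir, "dict/cmu07a.dic");

    // Fail early with a readable message instead of the native "returned -1".
    if (!keyphraseInDictionary(dictionary, KEYPHRASE)) {
        throw new IOException("Keyphrase '" + KEYPHRASE + "' contains words missing from " + dictionary);
    }

    recognizer = defaultSetup()
            .setAcousticModel(new File(modelsDir, "hmm/en-us-semi"))
            .setDictionary(dictionary)
            .setKeywordThreshold(1e-20f)
            .getRecognizer();

    recognizer.addListener(this);
    recognizer.addKeyphraseSearch(KWS_SEARCH, KEYPHRASE);
}

// Hypothetical helper: the first whitespace-separated token of each dictionary
// line is the word, so scan the file and check that no keyphrase word is missing.
private static boolean keyphraseInDictionary(File dic, String keyphrase) throws IOException {
    java.util.Set<String> missing = new java.util.HashSet<String>(
            java.util.Arrays.asList(keyphrase.toLowerCase().split("\\s+")));
    java.io.BufferedReader reader = new java.io.BufferedReader(new java.io.FileReader(dic));
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            missing.remove(line.split("\\s+", 2)[0].toLowerCase());
        }
    } finally {
        reader.close();
    }
    return missing.isEmpty();
}

Because setupRecognizer() is called inside the AsyncTask's try/catch for IOException, declaring it with throws IOException keeps the existing error handling intact, and a missing keyphrase now surfaces as a clear log message rather than a native -1.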
