Real-time audio processing in Android

5

37

I'm trying to figure out how to write an app that can decode audio Morse code on the fly. I found this document, which explains how to record audio from the microphone in Android. What I'd like to know is whether it's possible to access the raw input from the microphone, or whether it has to be written to a file first.

Thanks.

Nugget answered 13/2, 2010 at 9:28 Comment(1)
Did you ever do this project Paul? I'm looking into something similar as a "homer" project and was interested in processing incoming audio on-the-fly also.... maybe need to use a native library to get enough performance? drop me an email if you want at andrew at mackenzie-serres.net. Thanks!Reef
29

If you use MediaRecorder (as in the example above), it will save compressed audio to a file.

If you use AudioRecord, you can get audio samples directly.

Yes, what you want to do should be possible.
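AudioRecord can fill a short[] with samples directly; if you read into a byte[] instead, the 16-bit PCM samples arrive as little-endian byte pairs. A minimal plain-Java sketch of that conversion (the class and method names are mine, not part of the Android API):

```java
public class PcmUtil {
    // Convert little-endian 16-bit PCM bytes, as delivered by
    // AudioRecord.read(byte[], ...), into signed 16-bit samples.
    public static short[] bytesToSamples(byte[] raw, int length) {
        short[] out = new short[length / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (short) ((raw[2 * i] & 0xFF) | (raw[2 * i + 1] << 8));
        }
        return out;
    }
}
```

Reading into a short[] avoids this step entirely, which is usually the simpler choice for analysis code.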

Flotilla answered 14/2, 2010 at 6:25 Comment(2)
Can you help me with #61184852 questionLampley
To get better intuition about Audio Processing, you can visit this reference: Android-Audio-Processing-Using-WebRTCDaphnedaphnis
7

There is a sensing framework from the MIT Media Lab called funf: http://code.google.com/p/funf-open-sensing-framework/
They have already created classes for audio input and some analysis (FFT and the like); saving to files and uploading are implemented as well, as far as I've seen, and they handle most of the sensors available on the phone. You can also take inspiration from the code they wrote, which I think is pretty good.
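If all you need is to detect the presence of a single Morse tone, a Goertzel filter is a lighter alternative to a full FFT. A minimal plain-Java sketch (the class name, tone frequency, and sample rate below are illustrative choices, not anything funf provides):

```java
public class Goertzel {
    // Relative power of one target frequency in a block of PCM samples,
    // computed with the Goertzel recurrence (cheaper than a full FFT
    // when only one frequency bin is needed).
    public static double power(short[] samples, double targetHz, double sampleRate) {
        int n = samples.length;
        int k = (int) Math.round(n * targetHz / sampleRate); // nearest DFT bin
        double omega = 2.0 * Math.PI * k / n;
        double coeff = 2.0 * Math.cos(omega);
        double s1 = 0, s2 = 0;
        for (short sample : samples) {
            double s0 = coeff * s1 - s2 + sample;
            s2 = s1;
            s1 = s0;
        }
        return s1 * s1 + s2 * s2 - coeff * s1 * s2;
    }
}
```

Thresholding the power of consecutive blocks yields the on/off keying signal from which dots and dashes can then be timed.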

Chrissa answered 26/5, 2013 at 21:42 Comment(0)
6

Using AudioRecord is overkill. Just check MediaRecorder.getMaxAmplitude() every 1000 milliseconds for loud noises versus silence.

If you really need to analyze the waveform, then yes, you need AudioRecord. Get the raw data and compute something like the root mean square (RMS) of the portion of the samples you are concerned with, to get a sense of the volume.
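The RMS idea can be sketched in plain Java; the class name and the normalization by the 16-bit full scale are my choices:

```java
public class Volume {
    // Root mean square of a frame of 16-bit PCM samples,
    // normalized to the 0..1 range (1.0 = full scale).
    public static double rms(short[] frame) {
        double sumSquares = 0;
        for (short s : frame) {
            sumSquares += (double) s * s;
        }
        return Math.sqrt(sumSquares / frame.length) / 32768.0;
    }
}
```

Comparing the RMS of short frames against a threshold distinguishes tone from silence, which is the raw on/off signal a Morse decoder needs.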

But why do all that when MediaRecorder.getMaxAmplitude() is so much easier to use?

See my code from this answer to this question.

Attribute answered 1/3, 2012 at 18:37 Comment(3)
1000 ms = 1 s, which seems not nearly often enough for parsing Morse code.Tiemannite
But you can't use getMaxAmplitude() (well, you can, but you'll always get 0) unless you actually start recording. So you would still have to record to a file, which could grow infinitely big. Definitely not a solution.Carnivore
Can you help me with #61184852 questionLampley
1

I have found a way to do it. Basically, you need to run a new thread in which you continuously call myAndroidRecord.read(). After each call, loop over the entries in the buffer and you can see the raw values in real time, one by one. Below is a code sample of the main activity:

package com.example.mainproject;

import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.widget.TextView;

public class MainActivity extends AppCompatActivity {

    private AudioManager myAudioManager;
    private static final int REQUEST_RECORD_AUDIO_PERMISSION = 200;
    // Requesting permission to RECORD_AUDIO
    private boolean permissionToRecordAccepted = false;
    private String [] permissions = {Manifest.permission.RECORD_AUDIO};

    private static final int PERMISSION_RECORD_AUDIO = 0;
    Thread mThread;

    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        switch (requestCode) {
            case REQUEST_RECORD_AUDIO_PERMISSION:
                permissionToRecordAccepted = grantResults[0] == PackageManager.PERMISSION_GRANTED;
                break;
        }
        if (!permissionToRecordAccepted) finish();
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            if (ActivityCompat.shouldShowRequestPermissionRationale(this,
                    Manifest.permission.RECORD_AUDIO)) {
                // Show an explanation to the user *asynchronously* -- don't block
                // this thread waiting for the user's response. After the user
                // sees the explanation, request the permission again.
            }
            // Request the permission once; the result is delivered to
            // onRequestPermissionsResult() with the matching request code.
            ActivityCompat.requestPermissions(this,
                    new String[] { Manifest.permission.RECORD_AUDIO },
                    REQUEST_RECORD_AUDIO_PERMISSION);
        }else{

            myAudioManager = (AudioManager)getSystemService(Context.AUDIO_SERVICE);
            String x = myAudioManager.getProperty(AudioManager.PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED);

            runOnUiThread(()->{
                TextView tvAccXValue = findViewById(R.id.raw_available);
                tvAccXValue.setText(x);
            });

            mThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    record();
                }
            });
            mThread.start();
        }
    }

    private void record() {
        int audioSource = MediaRecorder.AudioSource.MIC;
        int samplingRate = 11025;
        int channelConfig = AudioFormat.CHANNEL_IN_DEFAULT;
        int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
        // getMinBufferSize() returns a size in bytes; with 16-bit samples,
        // the short[] below holds a quarter of that many samples.
        int bufferSize = AudioRecord.getMinBufferSize(samplingRate, channelConfig, audioFormat);

        short[] buffer = new short[bufferSize / 4];
        AudioRecord myRecord = new AudioRecord(audioSource, samplingRate, channelConfig, audioFormat, bufferSize);

        myRecord.startRecording();

        int noAllRead = 0;
        while (true) {
            // Blocks until up to bufferSize / 4 samples have been captured.
            int bufferResults = myRecord.read(buffer, 0, bufferSize / 4);
            noAllRead += bufferResults;
            int ii = noAllRead;
            for (int i = 0; i < bufferResults; i++) {
                int val = buffer[i];
                // Posting one UI update per sample floods the main thread;
                // it is fine for a demo, but batch the updates in a real app.
                runOnUiThread(() -> {
                    TextView raw_value = findViewById(R.id.sensor_value);
                    raw_value.setText(String.valueOf(val));
                    TextView no_read = findViewById(R.id.no_read_val);
                    no_read.setText(String.valueOf(ii));
                });
            }
        }
    }
}

This is just a demonstration; in a real app you will need to think a bit more about how and when to stop the running thread. This example simply runs indefinitely until you exit the app.

Code concerning the UI updates, such as TextView raw_value = findViewById(R.id.sensor_value);, is specific to this example; you should define your own.

The lines int ii = noAllRead; and int val = buffer[i]; are necessary because Java does not let you capture variables that are not effectively final in lambda expressions.

Musicianship answered 22/10, 2019 at 21:25 Comment(0)
-2

It looks like it has to be dumped to a file first.

If you peek at the android.media.AudioRecord source, the native audio data byte buffers are not exposed to the public API.

In my experience, having built an audio synthesizer for Android, it's hard to achieve real-time performance and maintain audio fidelity. A Morse Code 'translator' is certainly doable though, and sounds like a fun little project. Good Luck!

Husky answered 13/2, 2010 at 19:29 Comment(4)
Why do you think the audio buffers aren't passed to Java? What about the read() method?Flotilla
@Error454 Could you please give an example of reading the values directly without dumping to a file? I am struggling with the same problem at the moment. Thank you!Musicianship
@Musicianship You would use AudioRecord and call read(...) periodically to get the raw audio data.Roscoeroscommon
@Error454 Yes, I figured it out just before I read your response. Too bad I didn't see it earlier; it would have saved me quite some time. I posted an answer with sample code for others as well. Thank you!Musicianship

© 2022 - 2024 — McMap. All rights reserved.