Android AudioRecord example [closed]

I am designing an Android app and I need to implement an AudioRecord class to record the user's sound. After some research (that didn't provide enough information) and a few failed attempts, I was wondering if anyone could help me by posting an example (code) of how to capture high-quality sound using AudioRecord. I would really appreciate it. Thank you.

Attainable answered 14/12, 2011 at 2:57 Comment(0)

Here is a code example that records good-quality sound using the AudioRecord API.

Note: if you run this in the emulator the sound quality will not be very good, because the example uses an 8 kHz sample rate, which is the only rate the emulator supports. On a real device, use a 44.1 kHz sample rate for better quality.
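
If you want to check which rates a given device (or the emulator) actually accepts, you can probe with AudioRecord.getMinBufferSize(), which returns an error code for unsupported configurations. A minimal sketch of that idea (the SampleRateProbe class and the candidate list are just illustrations, not part of the original example):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.util.Log;

public class SampleRateProbe {
    // Returns the first sample rate (in Hz) from the candidates that this device
    // supports for 16-bit mono recording, or -1 if none of them work.
    public static int findSupportedSampleRate() {
        int[] candidateRates = { 44100, 22050, 16000, 11025, 8000 };
        for (int rate : candidateRates) {
            int minBuf = AudioRecord.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);
            // getMinBufferSize() returns ERROR_BAD_VALUE (or ERROR) when the
            // rate/channel/encoding combination is not supported.
            if (minBuf != AudioRecord.ERROR_BAD_VALUE && minBuf != AudioRecord.ERROR) {
                Log.d("SampleRateProbe", rate + " Hz supported, min buffer " + minBuf + " bytes");
                return rate;
            }
        }
        return -1;
    }
}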

// Requires the RECORD_AUDIO and WRITE_EXTERNAL_STORAGE permissions in the manifest.
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.view.KeyEvent;
import android.view.View;
import android.widget.Button;

import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

public class Audio_Record extends Activity {
    private static final int RECORDER_SAMPLERATE = 8000;
    private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
    private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
    private AudioRecord recorder = null;
    private Thread recordingThread = null;
    private boolean isRecording = false;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        setButtonHandlers();
        enableButtons(false);

        // Query the minimum recording buffer size (in bytes) for this configuration;
        // ideally this value should be used when constructing the AudioRecord below.
        int bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE,
                RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
    }

    private void setButtonHandlers() {
        ((Button) findViewById(R.id.btnStart)).setOnClickListener(btnClick);
        ((Button) findViewById(R.id.btnStop)).setOnClickListener(btnClick);
    }

    private void enableButton(int id, boolean isEnable) {
        ((Button) findViewById(id)).setEnabled(isEnable);
    }

    private void enableButtons(boolean isRecording) {
        enableButton(R.id.btnStart, !isRecording);
        enableButton(R.id.btnStop, isRecording);
    }

    int BufferElements2Rec = 1024; // number of 16-bit samples to read per buffer (1024 shorts = 2048 bytes)
    int BytesPerElement = 2; // 2 bytes per sample in 16-bit PCM

    private void startRecording() {

        recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                RECORDER_SAMPLERATE, RECORDER_CHANNELS,
                RECORDER_AUDIO_ENCODING, BufferElements2Rec * BytesPerElement);

        recorder.startRecording();
        isRecording = true;
        recordingThread = new Thread(new Runnable() {
            public void run() {
                writeAudioDataToFile();
            }
        }, "AudioRecorder Thread");
        recordingThread.start();
    }

    // Converts an array of 16-bit samples to little-endian bytes.
    private byte[] short2byte(short[] sData) {
        int shortArrsize = sData.length;
        byte[] bytes = new byte[shortArrsize * 2];
        for (int i = 0; i < shortArrsize; i++) {
            bytes[i * 2] = (byte) (sData[i] & 0x00FF);
            bytes[(i * 2) + 1] = (byte) (sData[i] >> 8);
            sData[i] = 0;
        }
        return bytes;

    }

    private void writeAudioDataToFile() {
        // Write the captured samples to a raw PCM file (no WAV header).
        String filePath = "/sdcard/voice8K16bitmono.pcm";
        short sData[] = new short[BufferElements2Rec];

        FileOutputStream os = null;
        try {
            os = new FileOutputStream(filePath);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
            return;
        }

        while (isRecording) {
            // Read 16-bit samples from the microphone; read() returns how many
            // shorts were actually delivered, which may be less than requested.
            int readShorts = recorder.read(sData, 0, BufferElements2Rec);
            System.out.println("Read " + readShorts + " shorts from the recorder");
            try {
                // Convert the samples to bytes and write only what was actually read.
                if (readShorts > 0) {
                    byte bData[] = short2byte(sData);
                    os.write(bData, 0, readShorts * BytesPerElement);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        try {
            os.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void stopRecording() {
        // stops the recording activity
        if (null != recorder) {
            isRecording = false;
            recorder.stop();
            recorder.release();
            recorder = null;
            recordingThread = null;
        }
    }

    private View.OnClickListener btnClick = new View.OnClickListener() {
        public void onClick(View v) {
            switch (v.getId()) {
            case R.id.btnStart: {
                enableButtons(true);
                startRecording();
                break;
            }
            case R.id.btnStop: {
                enableButtons(false);
                stopRecording();
                break;
            }
            }
        }
    };

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        if (keyCode == KeyEvent.KEYCODE_BACK) {
            finish();
        }
        return super.onKeyDown(keyCode, event);
    }
}
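
AudioRecord.read() also has a byte[] overload, so the short-to-byte conversion above is optional. A minimal sketch of the same recording loop using it (this variant is not from the original answer; it reuses the recorder, isRecording, BufferElements2Rec and BytesPerElement members from the class above and writes only the bytes actually read):

    private void writeAudioDataToFileBytes() {
        byte[] bData = new byte[BufferElements2Rec * BytesPerElement];
        FileOutputStream os = null;
        try {
            os = new FileOutputStream("/sdcard/voice8K16bitmono.pcm");
            while (isRecording) {
                // read() returns the number of bytes delivered, or a negative error code.
                int read = recorder.read(bData, 0, bData.length);
                if (read > 0) {
                    os.write(bData, 0, read);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (os != null) {
                try { os.close(); } catch (IOException ignored) {}
            }
        }
    }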

For more detail, try this AUDIORECORD BLOG.

Annamarieannamese answered 21/11, 2012 at 5:57 Comment(10)
Can I ask how often the data is written to the output stream? And also, why do you read in shorts? Doesn't the API offer to read in bytes? – Quartus
@JamesClark It depends on the size you're sending to the server. If you're sending 1024 shorts of audio data, i.e. 2048 bytes, then you have to send it to the server within 40 ms. You can read in bytes too; it doesn't make much difference. – Annamarieannamese
Adding to this: 44100 is the only sample rate guaranteed to be supported on all devices. developer.android.com/reference/android/media/AudioRecord.html – Francklyn
Be careful with this solution since it truncates the number of audio samples to 1024, which will result in distortion in the recorded audio IF the size is NOT a multiple of 1024. You should use the size returned by "AudioRecord.getMinBufferSize". Also, calling "short2byte" is not necessary. – Misfire
@us_david Yes, you can use the returned size; that will reduce the distortion. But we needed to send a specific amount of data to the server, so it was designed that way. – Annamarieannamese
What are BufferElements2Rec and BytesPerElement? How can I get those values, please? I am new to audio recording on Android and I am struggling with it. – Dragonnade
I got "Error code -20 when initializing native AudioRecord object" even with sample rate 44100. – Eyehole
I imported this into my Android project to test, but how do I create the buttons? Which objects are they in the code? – Copperplate
@us_david: In the code, do you only need to use AudioRecord.getMinBufferSize instead of BufferElements2Rec to avoid distortion? – Frankie
@us_david: Why did you write that short2byte is not necessary? – Frankie

Here is an end-to-end solution I implemented for streaming Android microphone audio to a server for playback: Android AudioRecord to Server over UDP Playback Issues
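
The linked post has the full code; the core idea is to send each filled PCM buffer to the server as a UDP datagram as soon as it is read. A minimal sketch of the sending side, assuming the recorder and isRecording fields from the answer above and using java.net.DatagramSocket, java.net.DatagramPacket and java.net.InetAddress (the host, port, and chunk size here are placeholders, not values from the linked post):

    private void streamAudioOverUdp() throws IOException {
        DatagramSocket socket = new DatagramSocket();
        InetAddress server = InetAddress.getByName("192.168.1.100"); // placeholder host
        byte[] buffer = new byte[2048];                              // placeholder chunk size
        while (isRecording) {
            int read = recorder.read(buffer, 0, buffer.length);      // raw 16-bit PCM
            if (read > 0) {
                socket.send(new DatagramPacket(buffer, read, server, 50005)); // placeholder port
            }
        }
        socket.close();
    }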

Homemaking answered 11/4, 2013 at 19:1 Comment(1)
Nice code! It solved my next problem. +1 – Gaw
