
How to pass an audio format (AudioFormat.CHANNEL_OUT_MONO) to the AudioRecord.getMinBufferSize function in Android?

I would like to do real-time audio streaming in my application. I have a function named listenUserVoice(String userVoice) that accepts Base64-encoded voice data. My problem is that when I retrieve the buffer size with the AudioRecord.getMinBufferSize function using AudioFormat.CHANNEL_OUT_MONO, it returns -2 (ERROR_BAD_VALUE). While trying to fix the error I found that there is no switch case for AudioFormat.CHANNEL_OUT_MONO in the Android AudioRecord.java class.

public void listenUserVoice(String userVoice) {
    Log.d(TAG, "voice data  " + userVoice);
    try {
        // This call returns -2 (ERROR_BAD_VALUE) when passed CHANNEL_OUT_MONO.
        playBufSize = AudioRecord.getMinBufferSize(frequency, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        Log.d(TAG, "-------------- playBufSize  " + playBufSize);

        // Build a streaming AudioTrack and a transfer buffer of the queried size.
        audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, frequency, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, playBufSize, AudioTrack.MODE_STREAM);
        bytes = new byte[playBufSize];

        if (audioTrack.getState() == AudioTrack.STATE_INITIALIZED) {
            audioTrack.play();
        }

        // Decode the Base64 payload and stream the raw PCM bytes to the AudioTrack.
        byte[] decoded = Base64.decode(userVoice, Base64.DEFAULT);
        InputStream inputStream = new ByteArrayInputStream(decoded);
        int i;
        while ((i = inputStream.read(bytes)) != -1) {
            audioTrack.write(bytes, 0, i);
        }
        inputStream.close();
    } catch (IOException e) {
        Log.d(TAG, "Error  " + e.getMessage());
        e.printStackTrace();
    }
}
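
For comparison, here is a minimal sketch of how I understand the two buffer-size queries are meant to be used: AudioRecord.getMinBufferSize with a CHANNEL_IN_* configuration for the recording side, and AudioTrack.getMinBufferSize with a CHANNEL_OUT_* configuration for the playback side. The 44100 Hz sample rate and the class/method names are placeholders I made up for illustration.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.util.Log;

public class BufferSizeSketch {
    private static final String TAG = "BufferSizeSketch";
    private static final int SAMPLE_RATE = 44100; // placeholder sample rate

    public static void logMinBufferSizes() {
        // Recording path: AudioRecord expects a CHANNEL_IN_* configuration.
        int recordBufSize = AudioRecord.getMinBufferSize(
                SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        // Playback path: AudioTrack expects a CHANNEL_OUT_* configuration.
        int playBufSize = AudioTrack.getMinBufferSize(
                SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        // Both methods return ERROR_BAD_VALUE (-2) for unsupported parameters,
        // so the result should be checked before it is used as a buffer size.
        if (recordBufSize == AudioRecord.ERROR_BAD_VALUE
                || playBufSize == AudioTrack.ERROR_BAD_VALUE) {
            Log.d(TAG, "unsupported parameters: record=" + recordBufSize + " play=" + playBufSize);
            return;
        }
        Log.d(TAG, "record min buffer=" + recordBufSize + ", play min buffer=" + playBufSize);
    }
}

This is only my reading of the documentation for the two classes; my actual code above still fails at the AudioRecord.getMinBufferSize call.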

AudioRecord.java
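
Looking at the framework source, the channel-config handling inside getMinBufferSize appears to only accept the CHANNEL_IN_* (and deprecated CHANNEL_CONFIGURATION_*) constants. Below is my own simplified paraphrase of that switch, not the actual framework code, just to show why CHANNEL_OUT_MONO falls through to the default case:

import android.media.AudioFormat;
import android.media.AudioRecord;

public class ChannelConfigSketch {

    // Simplified paraphrase of the channel-config switch in
    // AudioRecord.getMinBufferSize(); not the actual framework source.
    public static int channelCountOrError(int channelConfig) {
        switch (channelConfig) {
            case AudioFormat.CHANNEL_IN_MONO:    // 0x10
                return 1;
            case AudioFormat.CHANNEL_IN_STEREO:  // 0xC
                return 2;
            default:
                // AudioFormat.CHANNEL_OUT_MONO is 0x4, so it matches no
                // CHANNEL_IN_* case and ends up here, i.e. -2.
                return AudioRecord.ERROR_BAD_VALUE;
        }
    }
}

If that reading is right, using AudioFormat.CHANNEL_IN_MONO when querying the record buffer size (or AudioTrack.getMinBufferSize for the playback side, as in the sketch above) would avoid the -2, but I would like to confirm whether that is the intended usage.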

Tags: java, android, java-native-interface, audio-streaming, microphone
