For my project I record user audio with MediaRecorder, and it almost works fine. My problem arises when I want to display a waveform of the user recording with Wavesurfer.js, which fails to load my recording. Playing the same recording through an Audio element works fine, though.
After trying different sources, it seems the cause is that the final .webm file has almost no metadata, not even a duration or bitrate (even though I set the bitrate in the MediaRecorder options). Here is the output from ffprobe for one of the files:
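For context, here is roughly how I hand the recording to Wavesurfer.js (simplified; the container selector and options are not my real ones, and recordingUrl is the blob URL created further down):

// Simplified sketch of the Wavesurfer.js side
const wavesurfer = WaveSurfer.create({ container: '#waveform' });
wavesurfer.load(recordingUrl); // the same blob URL plays fine in an <audio> element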
Input #0, matroska,webm, from '206_3.webm':
  Metadata:
    encoder         : Chrome
  Duration: N/A, start: 0.000000, bitrate: N/A
  Stream #0:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)
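(That output comes from simply running ffprobe on the saved file, e.g. ffprobe 206_3.webm.)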
So my question is: am I doing something wrong when recording the audio? Here is how I start the recording:
// Somewhere in the code...
this._handleUserMedia(await navigator.mediaDevices.getUserMedia({ audio: true }));

// ... and elsewhere
_handleUserMedia(stream) {
  this._mediaRecorder = new MediaRecorder(stream, { audioBitsPerSecond: 64000 });

  // Collect the recorded chunks as they arrive
  this._mediaRecorder.ondataavailable = event => {
    this._mediaBuffer.push(event.data);
  };

  this._mediaRecorder.onstop = () => {
    // Add the buffer and a URL to it to the results, for saving and playback
    let blob = new Blob(this._mediaBuffer, { type: "audio/webm" });
    this.state.results[this.state.currentWordIdx].recordingBlob = blob;
    this.state.results[this.state.currentWordIdx].recordingUrl = URL.createObjectURL(blob);

    // Reset the buffer for the next recording
    this._mediaBuffer = [];
    this._gotoNextWord();
  };

  this._gotoNextWord();
}
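The recorder itself is started and stopped elsewhere (I have not included those call sites here); those calls are just the standard MediaRecorder ones, essentially:

// Elsewhere: start/stop the recorder (exact call sites omitted)
this._mediaRecorder.start(); // ondataavailable will deliver the recorded chunk(s)
// ... later, when the user is done with the current word
this._mediaRecorder.stop();  // fires a final dataavailable event, then onstop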
As you can see, I create a blob, which I save to disk later on with Node.js's fs.writeFile (a rough sketch of that step is at the end of this post). Then, when I need to display the waveform, I load the file with fs.readFile like this:
fs.readFile(`${this.getAppData()}/${filePath}`, (err, buffer) => {
  if (err) { return reject(err); }
  const blob = new Blob([buffer], { type: 'audio/webm' });
  resolve(URL.createObjectURL(blob)); // If an ArrayBuffer is needed => toArrayBuffer(buffer)
});
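And for reference, the save step mentioned earlier is just a Blob-to-Buffer conversion followed by fs.writeFile; the real code is not shown here, so treat this as an approximation (the path comes from getAppData() as above):

// Rough sketch of the save step: turn the Blob into a Buffer and write it out
blob.arrayBuffer().then(arrayBuffer => {
  fs.writeFile(`${this.getAppData()}/${filePath}`, Buffer.from(arrayBuffer), err => {
    if (err) { console.error(err); }
  });
});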