I have a web application that makes audio calls between clients. As the server I use FreeSWITCH with SIP (via SIP.js) over a secure WebSocket. When I make a call between Firefox and Firefox everything works fine, but when I make a call between Firefox and Chrome I hear nothing on the Chrome client. On the Firefox client I can hear what was sent from Chrome.
Using Wireshark I found that the stream reaches the client machine and is correct. After long trial and error I found that creating an audio tag in JS code and assigning the remote stream to it as srcObject helps (adapting the solution from "WebRTC doesn't work with AudioContext").
const AudioContextClass = window.AudioContext || window.webkitAudioContext;
const audioContext = new AudioContextClass();

var aaaudio = new Audio();                                          // <--- adding these
aaaudio.srcObject = sipSession.mediaHandler.getRemoteStreams()[0];  // <--- two lines helps

var source = audioContext.createMediaStreamSource(sipSession.mediaHandler.getRemoteStreams()[0]);
var gainNode = audioContext.createGain();
gainNode.gain.value = 0.5;
source.connect(gainNode);
gainNode.connect(audioContext.destination);
I really don't know why this does not work without the audio tag, and how I can fix it to work without it. Am I missing something?
Is there some mechanism in Chrome that blocks audio output from a stream connected in JS without creating an explicit new audio tag?
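For completeness, here is the minimal form of the workaround I currently use. It assumes sipSession.mediaHandler.getRemoteStreams() returns the remote MediaStream (as in my SIP.js setup); the element is kept muted so only the Web Audio graph produces audible output:

```javascript
// Workaround sketch: keep a muted audio element attached to the remote
// WebRTC stream so Chrome actually starts pulling samples from it, and
// route the audible output through the Web Audio graph instead.
const remoteStream = sipSession.mediaHandler.getRemoteStreams()[0];

// Without this element, Chrome never seems to start the remote track.
const keepAlive = new Audio();
keepAlive.muted = true;          // the element itself stays silent
keepAlive.srcObject = remoteStream;

const ctx = new (window.AudioContext || window.webkitAudioContext)();
const source = ctx.createMediaStreamSource(remoteStream);
const gainNode = ctx.createGain();
gainNode.gain.value = 0.5;       // attenuate to half volume
source.connect(gainNode);
gainNode.connect(ctx.destination);
```

This only papers over the problem; I would still like to understand why the media element is required at all.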
Info: Chrome was started both with CORS policies enabled and disabled.