HTML5 Audio tag on Safari has a delay
I'm trying to accomplish a simple doodle-like behaviour, where an mp3/ogg sound plays on click, using the HTML5 audio tag. It is supposed to work under Firefox and Safari, and Safari on iPad is very desirable.

I've tried many approaches and have come down to this:

HTML

    <span id="play-blue-note" class="play blue" ></span>
    <span id="play-green-note" class="play green" ></span>


    <audio id="blue-note" style="display:none" controls preload="auto" autobuffer> 
        <source src="blue.mp3" />
        <source src="blue.ogg" />
        <!-- now include flash fall back -->
    </audio>

    <audio id="green-note" style="display:none" controls preload="auto" autobuffer> 
        <source src="green.mp3" />
        <source src="green.ogg" />
    </audio>

JS

function addSource(elem, path) {
    $('<source>').attr('src', path).appendTo(elem);
}

$(document).ready(function() {

    $('body').delegate('.play', 'click touchstart', function() {
        var clicked = $(this).attr('id').split('-')[1];
        $('#' + clicked + '-note').get(0).play();
    });

});

This works great under Firefox, but Safari seems to have a delay whenever you click, even when you click several times and the audio file has loaded. On Safari on iPad it behaves almost unpredictably.

Also, Safari's performance seems to improve when I test locally; I'm guessing Safari is downloading the file each time. Is this possible? How can I avoid it? Thanks!

Tutorial answered 21/3, 2012 at 19:20 Comment(3)
Hey, any input on Safari's problem? :STutorial
The current answers are correct and, based upon their disposition, entirely in line with this previous answer on a similar thread. The most effective workaround, as cited by the author and in this answer in this thread, is indeed to smash all of your behaviors into a single file and call into it at different frames. It's a lot of extra work, but such is what happens when Apple makes a design decision everyone else is forced to conform with. (Let me know if you'd prefer this as an answer.)Revert
@ign did you manage to play multiple audios at once on desktop Safari? I am experiencing delay with desktop Safari unfortunately. FF and Chrome work OK.Raji

I just answered another iOS/<audio> question a few minutes ago. Seems to apply here as well:

Preloading <audio> and <video> on iOS devices is disabled to save bandwidth.

In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and be charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it.

Source: Safari Developer Library
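Since no data loads until the user initiates it, one plausible workaround (a sketch, not part of the documentation quoted above; the helper name and element wiring are assumptions) is to call load() inside the first user gesture, so later play() calls no longer pay the download cost:

```javascript
// Sketch: kick off the audio download inside the first user gesture, since
// iOS only honors load()/play() when they are user-initiated.
// The element and event target are passed in so the helper is easy to test.
function primeOnFirstGesture(audioEl, target) {
  function prime() {
    audioEl.load(); // user-initiated, so iOS permits the network fetch
    target.removeEventListener('touchstart', prime);
  }
  target.addEventListener('touchstart', prime);
}

// Browser usage (not executed here):
// primeOnFirstGesture(document.getElementById('blue-note'), document.body);
```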

Bass answered 21/3, 2012 at 19:24 Comment(1)
Thanks. So, I guess there's no way to achieve almost real time feedback on iPad, right? Any insight on Safari?Tutorial

On desktop Safari, adding AudioContext fixes the issue:

const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioCtx = new AudioContext();

I found out by accident, so I have no idea why it works, but this removed the delay on my app.
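One plausible companion step (an assumption on my part, not part of the original answer): an AudioContext often starts out in the 'suspended' state until a user gesture, so resuming it on the first interaction may be what removes the delay:

```javascript
// Sketch: resume a suspended AudioContext on the first user gesture.
// The context and event target are passed in so the helper is testable.
function unlockOnFirstGesture(ctx, target) {
  target.addEventListener('click', function unlock() {
    if (ctx.state === 'suspended') ctx.resume();
    target.removeEventListener('click', unlock);
  });
}

// Browser usage (not executed here):
// const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
// unlockOnFirstGesture(audioCtx, document);
```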

Stockist answered 9/1, 2019 at 23:18 Comment(11)
I can vouch that simply adding these two lines of code improved audio performanceMuscovado
Can you provide more detail on this? How do you link the "audioCtx" to the actual audio object? I assume that just adding these lines of code without more code that connects to the actual audio is insufficient.Subscript
I spent a while trying to find a solution for this, and this worked! For anyone trying to get this working in TypeScript, I found this workaround helpful.Workingman
What is meant in this answer? To properly use fetch + AudioContext instead of regular Audio or just to add these two lines as a magic trick? They will be erased as a dead code while bundling anyway.Accompany
@AlexanderZinchuk Yes, it is a magic trick. I don't remember the details anymore, but at the time of writing, it did fix the issue for me.Stockist
Wow, this indeed fixed the delay on desktop Safari. No idea how this works though.Quadruplet
This fixed my Chrome Extension's audio bug I've been battling for weeks. Happy little accidents 🌟Croquet
To those wondering why this works: I believe that on mobile only one program can hold the audio system open at a time, and each time it switches back to you, it mutes the first 100 ms of sound; if you are making short sounds, the whole thing may be omitted. The audio context grabs the audio and keeps your program active, so you don't lose the audio channel immediately after a sound finishes and have the next sound truncated. This error drove me crazy for a day. It was audio-length sensitive!Bollix
@EdwardDeJong after all these years (meaning both of the years, it just feels like 5 because of COVID), I finally got an explanation. I am not even working as a programmer anymore but I was still curious, as this thread always pops up... thank you very much!Stockist
Can confirm; this improved my audio on mobile by leaps and bounds. Thanks so much!Wo
If your Safari is more up to date now (in 2022), you can skip the webkit-prefix safety-net and add just a single line: const audioContext = new AudioContext();Venusian

The problem with Safari is that it issues a new request each time the audio file is played. You can try creating an HTML5 cache manifest. Unfortunately, my experience has been that you can only add one audio file to the cache at a time. A workaround is to merge all your audio files sequentially into a single audio file and start playing at a specific position depending on the sound needed. You can create an interval to track the current play position and pause it once it has reached a certain timestamp.
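The merge-and-seek idea can be sketched like this (the sprite names and offsets below are hypothetical; adjust them to match your merged file):

```javascript
// Hypothetical sprite map: start/end offsets in seconds within one merged file.
const sprites = {
  blue:  { start: 0.0, end: 1.2 },
  green: { start: 1.2, end: 2.5 }
};

// Pure helper: clip length for a named sprite (kept separate so it is testable).
function spriteDuration(map, name) {
  var s = map[name];
  return s ? s.end - s.start : 0;
}

// Browser glue: seek to the sprite's start, play, and pause at its end
// by polling currentTime on an interval.
function playSprite(audioEl, name) {
  var s = sprites[name];
  if (!s) return;
  audioEl.currentTime = s.start;
  audioEl.play();
  var timer = setInterval(function () {
    if (audioEl.currentTime >= s.end) {
      audioEl.pause();
      clearInterval(timer);
    }
  }, 10);
}
```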

Read more about creating an HTML5 cache manifest here:

http://www.html5rocks.com/en/tutorials/appcache/beginner/

http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html

Hope it helps!

Krp answered 31/3, 2012 at 22:28 Comment(3)
Thanks for your insight. It sounds too much hassle for a simple task I'm trying to implement, though.Tutorial
I don't think you're right about the request every time, but I think you make a good point about putting all sound effects into a single file and playing at offsets in different contexts, as described under the Specifying Playback Range section at developer.mozilla.org/en-US/docs/Web/Guide/HTML/…. I wonder how well this would work with multiple <audio> elements that have the same source, or whether it would work at all. I'm having a similar issue with a game I'm developing: Chrome plays the audio flawlessly, but both Firefox and Safari are noticeably laggy.Artificer
This means that with fast repeating sounds, like the player shooting, bullet impacts, etc., the repeats often aren't played. Another potential workaround might be to have several <audio> elements attached to the same source and play them each in turn in round robin fashion. Again, I don't know if this would work, but it's something I'm planning to try later on. And, by the way, I'm seeing these issues on desktop versions of the browsers, never mind mobile.Artificer

Apple decided, to save cellular data, not to preload <audio> and <video> HTML elements.

From the Safari Developer Library:

In Safari on iOS (for all devices, including iPad), where the user may be on a cellular network and be charged per data unit, preload and autoplay are disabled. No data is loaded until the user initiates it. This means the JavaScript play() and load() methods are also inactive until the user initiates playback, unless the play() or load() method is triggered by user action. In other words, a user-initiated Play button works, but an onLoad="play()" event does not.

This plays the movie: <input type="button" value="Play" onClick="document.myMovie.play()">

This does nothing on iOS: <body onLoad="document.myMovie.play()">


I don't think you can bypass this restriction directly, but there may be a workaround.


Update: After some experimenting, I found a way to play the <audio> with JavaScript:

var vid = document.createElement("iframe");
vid.setAttribute('src', "http://yoursite.com/yourvideooraudio.mp4"); // replace with actual source
vid.setAttribute('width', '1px');
vid.setAttribute('height', '1px');
vid.setAttribute('scrolling', 'no');
vid.style.border = "0px";
document.body.appendChild(vid);

Note: I only tried this with <audio>.


Update 2: jsFiddle here. Seems to work.

Stridor answered 2/4, 2012 at 20:0 Comment(4)
The question isn't how to play the audio, it's why is there a delay.Bass
Your answer is just a regurgitation of what I already posted and a snippet of how to play an audio file. Am I missing something new or different that you're contributing that actually answers the question?Bass
@Bass I remember an article about Safari downloading the file each time and how to tell Safari to cache it. The solution was a meta tag, but I can't remember the site and I cleared my history :( sorryStridor
Oh sorry, I honestly didn't read your answer; I didn't mean to copy.Stridor

HTML5 Audio Delay on Safari iOS (<audio> Element vs AudioContext)

Yes, Safari on iOS has an audio delay when using the native <audio> element; however, this can be overcome by using AudioContext.

My code snippet is based on what I learnt from https://lowlag.alienbill.com/

Please test the functionality on your own iOS device (I tested in iOS 12) https://fiddle.jshell.net/eLya8fxb/51/show/

Snippet from JS Fiddle https://jsfiddle.net/eLya8fxb/51/

// Requires jQuery 

// Adding:
// Strip down lowLag.js so it only supports audioContext (So no IE11 support (only Edge))
// Add "loop" monkey patch needed for looping audio (my primary usage)
// Add single audio channel - to avoid overlapping audio playback

// Original source: https://lowlag.alienbill.com/lowLag.js

if (!window.console) console = {
  log: function() {}
};

var lowLag = new function() {
  this.someVariable = undefined;
  this.showNeedInit = function() {
    lowLag.msg("lowLag: you must call lowLag.init() first!");
  }
  this.load = this.showNeedInit;
  this.play = this.showNeedInit;
  this.pause = this.showNeedInit;
  this.stop = this.showNeedInit;
  this.switch = this.showNeedInit;
  this.change = this.showNeedInit;
  
  this.audioContext = undefined;
  this.audioContextPendingRequest = {};
  this.audioBuffers = {};
  this.audioBufferSources = {};
  this.currentTag = undefined;
  this.currentPlayingTag = undefined;

  this.init = function() {
    this.msg("init audioContext");
    this.load = this.loadSoundAudioContext;
    this.play = this.playSoundAudioContext;
    this.pause = this.pauseSoundAudioContext;
    this.stop = this.stopSoundAudioContext;
    this.switch = this.switchSoundAudioContext;
    this.change = this.changeSoundAudioContext;

    if (!this.audioContext) {
      this.audioContext = new(window.AudioContext || window.webkitAudioContext)();
    }
  }

  //we'll use the tag they hand us, or else the url as the tag if it's a single tag,
  //or the first url 
  this.getTagFromURL = function(url, tag) {
    if (tag != undefined) return tag;
    return lowLag.getSingleURL(url);
  }
  this.getSingleURL = function(urls) {
    if (typeof(urls) == "string") return urls;
    return urls[0];
  }
  //coerce to be an array
  this.getURLArray = function(urls) {
    if (typeof(urls) == "string") return [urls];
    return urls;
  }

  this.loadSoundAudioContext = function(urls, tag) {
    var url = lowLag.getSingleURL(urls);
    tag = lowLag.getTagFromURL(urls, tag);
    lowLag.msg('webkit/chrome audio loading ' + url + ' as tag ' + tag);
    var request = new XMLHttpRequest();
    request.open('GET', url, true);
    request.responseType = 'arraybuffer';

    // Decode asynchronously
    request.onload = function() {
      // if you want "successLoadAudioFile" to only be called one time, you could try just using Promises (the newer return value for decodeAudioData)
      // Ref: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/decodeAudioData

      //Older callback syntax:
      //baseAudioContext.decodeAudioData(ArrayBuffer, successCallback, errorCallback);
      //Newer promise-based syntax:
      //Promise<decodedData> baseAudioContext.decodeAudioData(ArrayBuffer);


      // ... however you might want to use a pollfil for browsers that support Promises, but does not yet support decodeAudioData returning a Promise.
      // Ref: https://github.com/mohayonao/promise-decode-audio-data
      // Ref: https://caniuse.com/#search=Promise

      // var retVal = lowLag.audioContext.decodeAudioData(request.response);

      // Note: "successLoadAudioFile" is called twice. Once for legacy syntax (success callback), and once for newer syntax (Promise)
      var retVal = lowLag.audioContext.decodeAudioData(request.response, successLoadAudioFile, errorLoadAudioFile);
      //Newer versions of audioContext return a promise, which could throw a DOMException
      if (retVal && typeof retVal.then == 'function') {
        retVal.then(successLoadAudioFile).catch(function(e) {
          errorLoadAudioFile(e);
          urls.shift(); //remove the first url from the array
          if (urls.length > 0) {
            lowLag.loadSoundAudioContext(urls, tag); //try the next url
          }
        });
      }
    };

    request.send();

    function successLoadAudioFile(buffer) {
      lowLag.audioBuffers[tag] = buffer;
      if (lowLag.audioContextPendingRequest[tag]) { //a request might have come in, try playing it now
        lowLag.playSoundAudioContext(tag);
      }
    }

    function errorLoadAudioFile(e) {
      lowLag.msg("Error loading webkit/chrome audio: " + e);
    }
  }

  this.playSoundAudioContext = function(tag) {
    var context = lowLag.audioContext;

    // if some audio is currently active and hasn't been switched, or you are explicitly asking to play audio that is already active... then see if it needs to be unpaused
    // ... if you've switch audio, or are explicitly asking to play new audio (that is not the currently active audio) then skip trying to unpause the audio
    if ((lowLag.currentPlayingTag && lowLag.currentTag && lowLag.currentPlayingTag === lowLag.currentTag) || (tag && lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag)) {
      // find currently paused audio (suspended) and unpause it (resume)
      if (context !== undefined) {
        // ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
        if (context.state === 'suspended') {
          context.resume().then(function() {
            lowLag.msg("playSoundAudioContext resume " + lowLag.currentPlayingTag);
            return;
          }).catch(function(e) {
            lowLag.msg("playSoundAudioContext resume error for " + lowLag.currentPlayingTag + ". Error: " + e);
          });
          return;
        }
      }
    }
    
    if (tag === undefined) {
      tag = lowLag.currentTag;
    }

    if (lowLag.currentPlayingTag && lowLag.currentPlayingTag === tag) {
      // ignore request to play same sound a second time - it's already playing
      lowLag.msg("playSoundAudioContext already playing " + tag);
      return;
    } else {
      lowLag.msg("playSoundAudioContext " + tag);
    }

    var buffer = lowLag.audioBuffers[tag];
    if (buffer === undefined) { //possibly not loaded; put in a request to play onload
      lowLag.audioContextPendingRequest[tag] = true;
      lowLag.msg("playSoundAudioContext pending request " + tag);
      return;
    }

    // need to create a new AudioBufferSourceNode every time... 
    // you can't call start() on an AudioBufferSourceNode more than once. They're one-time-use only.
    var source;
    source = context.createBufferSource(); // creates a sound source
    source.buffer = buffer; // tell the source which sound to play
    source.connect(context.destination); // connect the source to the context's destination (the speakers)
    source.loop = true;
    lowLag.audioBufferSources[tag] = source;

    // find current playing audio and stop it
    var sourceOld = lowLag.currentPlayingTag ? lowLag.audioBufferSources[lowLag.currentPlayingTag] : undefined;
    if (sourceOld !== undefined) {
      if (typeof(sourceOld.noteOff) == "function") {
        sourceOld.noteOff(0);
      } else {
        sourceOld.stop();
      }
      lowLag.msg("playSoundAudioContext stopped " + lowLag.currentPlayingTag);
      lowLag.audioBufferSources[lowLag.currentPlayingTag] = undefined;
      lowLag.currentPlayingTag = undefined;
    }

    // play the new source audio
    if (typeof(source.noteOn) == "function") {
      source.noteOn(0);
    } else {
      source.start();
    }
    lowLag.currentTag = tag;
    lowLag.currentPlayingTag = tag;
    
    if (context.state === 'running') {
      lowLag.msg("playSoundAudioContext started " + tag);
    } else if (context.state === 'suspended') {
      /// if the audio context is in a suspended state then unpause (resume)
      context.resume().then(function() {
        lowLag.msg("playSoundAudioContext started and then resumed " + tag);
      }).catch(function(e) {
        lowLag.msg("playSoundAudioContext started and then had a resuming error for " + tag + ". Error: " + e);
      });
    } else if (context.state === 'closed') {
      // ignore request to pause sound - it's already closed
      lowLag.msg("playSoundAudioContext failed to start, context closed for " + tag);
    } else {
      lowLag.msg("playSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
    }
  }

  this.pauseSoundAudioContext = function() {
    // not passing in a "tag" parameter because we are playing all audio in one channel
    var tag = lowLag.currentPlayingTag;
    var context = lowLag.audioContext;

    if (tag === undefined) {
      // ignore request to pause sound as nothing is currently playing
      lowLag.msg("pauseSoundAudioContext nothing to pause");
      return;
    }

    // find currently playing (running) audio and pause it (suspend)
    if (context !== undefined) {
      // ref: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/suspend
      if (context.state === 'running') {
        lowLag.msg("pauseSoundAudioContext " + tag);
        context.suspend().then(function() {
          lowLag.msg("pauseSoundAudioContext suspended " + tag);
        }).catch(function(e) {
          lowLag.msg("pauseSoundAudioContext suspend error for " + tag + ". Error: " + e);
        });
      } else if (context.state === 'suspended') {
        // ignore request to pause sound - it's already suspended
        lowLag.msg("pauseSoundAudioContext already suspended " + tag);
      } else if (context.state === 'closed') {
        // ignore request to pause sound - it's already closed
        lowLag.msg("pauseSoundAudioContext already closed " + tag);
      } else {
        lowLag.msg("pauseSoundAudioContext unknown AudioContext.state for " + tag + ". State: " + context.state);
      }
    }
  }

  this.stopSoundAudioContext = function() {
    // not passing in a "tag" parameter because we are playing all audio in one channel
    var tag = lowLag.currentPlayingTag;

    if (tag === undefined) {
      // ignore request to stop sound as nothing is currently playing
      lowLag.msg("stopSoundAudioContext nothing to stop");
      return;
    } else {
      lowLag.msg("stopSoundAudioContext " + tag);
    }

    // find current playing audio and stop it
    var source = lowLag.audioBufferSources[tag];
    if (source !== undefined) {
      if (typeof(source.noteOff) == "function") {
        source.noteOff(0);
      } else {
        source.stop();
      }
      lowLag.msg("stopSoundAudioContext stopped " + tag);
      lowLag.audioBufferSources[tag] = undefined;
      lowLag.currentPlayingTag = undefined;
    }
  }

  this.switchSoundAudioContext = function(autoplay) {
    lowLag.msg("switchSoundAudioContext " + (autoplay ? 'and autoplay' : 'and do not autoplay'));

    if (lowLag.currentTag && lowLag.currentTag == 'audio1') {
      lowLag.currentTag = 'audio2';
    } else {
      lowLag.currentTag = 'audio1';
    }

    if (autoplay) {
      lowLag.playSoundAudioContext();
    }
  }

  this.changeSoundAudioContext = function(tag, autoplay) {
    lowLag.msg("changeSoundAudioContext to tag " + tag + " " + (autoplay ? 'and autoplay' : 'and do not autoplay'));

    if (tag === undefined) {
      lowLag.msg("changeSoundAudioContext tag is undefined");
      return;
    }
    
    lowLag.currentTag = tag;

    if (autoplay) {
      lowLag.playSoundAudioContext();
    }
  }

  this.msg = function(m) {
    m = "-- lowLag " + m;
    console.log(m);
  }
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.0/jquery.min.js"></script>
<script>
  // AudioContext
  $(document).ready(function() {
    lowLag.init();
    lowLag.load(['https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3'], 'audio1');
    lowLag.load(['https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3'], 'audio2');
    // starts with audio1
    lowLag.changeSoundAudioContext('audio1', false);
  });

  // ----------------

  // Audio Element
  $(document).ready(function() {
    var $audioElement = $('#audioElement');
    var audioEl = $audioElement[0];
    var audioSources = {
      "audio1": "https://coubsecure-s.akamaihd.net/get/b86/p/coub/simple/cw_looped_audio/f0dab49f867/083bf409a75db824122cf/med_1550250381_med.mp3",
      "audio2": "https://coubsecure-s.akamaihd.net/get/b173/p/coub/simple/cw_looped_audio/0d5adfff2ee/80432a356484068bb0e15/med_1550254045_med.mp3"
    };
    playAudioElement = function() {
      audioEl.play();
    }
    pauseAudioElement = function() {
      audioEl.pause();
    }
    stopAudioElement = function() {
      audioEl.pause();
      audioEl.currentTime = 0;
    }
    switchAudioElement = function(autoplay) {
      var source = $audioElement.attr('data-source');

      if (source && source == 'audio1') {
        $audioElement.attr('src', audioSources.audio2);
        $audioElement.attr('data-source', 'audio2');
      } else {
        $audioElement.attr('src', audioSources.audio1);
        $audioElement.attr('data-source', 'audio1');
      }

      if (autoplay) {
        audioEl.play();
      }
    }
    changeAudioElement = function(tag, autoplay) {
      var source = $audioElement.attr('data-source');
      
      if(tag === undefined || audioSources[tag] === undefined) {
      	return;
      }

      $audioElement.attr('src', audioSources[tag]);
      $audioElement.attr('data-source', tag);

      if (autoplay) {
        audioEl.play();
      }
    }
    changeAudioElement('audio1', false); // starts with audio1
  });

</script>

<h1>
  AudioContext (<a href="https://developer.mozilla.org/en-US/docs/Web/API/AudioContext" target="blank">api</a>)
</h1>
<button onClick="lowLag.play();">Play</button>
<button onClick="lowLag.pause();">Pause</button>
<button onClick="lowLag.stop();">Stop</button>
<button onClick="lowLag.switch(true);">Switch</button>
<button onClick="lowLag.change('audio1', true);">Play 1</button>
<button onClick="lowLag.change('audio2', true);">Play 2</button>

<hr>

<h1>
  Audio Element (<a href="https://developer.mozilla.org/en-US/docs/Web/HTML/Element/audio" target="blank">api</a>)
</h1>
<audio id="audioElement" controls loop preload="auto" src="">
</audio>
<br>
<button onClick="playAudioElement();">Play</button>
<button onClick="pauseAudioElement();">Pause</button>
<button onClick="stopAudioElement();">Stop</button>
<button onClick="switchAudioElement(true);">Switch</button>
<button onClick="changeAudioElement('audio1', true);">Play 1</button>
<button onClick="changeAudioElement('audio2', true);">Play 2</button>


Cissie answered 17/6, 2019 at 2:57 Comment(0)

Unfortunately, the only way to make this work properly in Safari is to use the Web Audio API, or a third-party library that handles it. Check the source code here (it's not minified):
https://drums-set-js.herokuapp.com/index.html
https://drums-set-js.herokuapp.com/app.js

Sandblast answered 12/11, 2019 at 7:12 Comment(0)

Same issue. I tried to preload it in various ways. Finally I wrapped the animation logic in the "playing" callback, so it should run only once the file has loaded and playback has started. But as a result, I see the animation logic start while the audio plays with around a 2-second delay. It's breaking my mind: how can there be a delay if the audio has already fired the "playing" callback?

AudioContext resolved my issue. The simplest example I found is at https://developer.mozilla.org/en-US/docs/Web/API/Body/arrayBuffer (its getData function prepares your audio file); then you can play it with source.start(0).

That page doesn't show how to get audioCtx; you can create it like this: let audioCtx = new (window.AudioContext || window.webkitAudioContext)();

Boni answered 8/6, 2020 at 16:23 Comment(1)
Can you post this as a question referencing the parent question and also mentioning all versions you are using?Recurrent

Your audio files are loaded once and then cached; playing the sounds repeatedly, even after a page refresh, did not cause further HTTP requests in Safari.

I just had a look at one of your sounds in an audio editor: there is a small amount of silence at the beginning of the file, which will manifest as latency.

Is the Web Audio API a viable option for you?

Isolationism answered 2/4, 2012 at 21:49 Comment(1)
Thanks, I don't think the delay is caused by silence in the file, though. I'll check on Web Audio API.Tutorial

I am having this same issue. What is odd is that I am preloading the file. On WiFi it plays fine, but on phone data there is a long delay before starting. I thought that had something to do with load speeds, but I do not start playing my scene until all images and the audio file are loaded. Any suggestions would be great. (I know this isn't an answer, but I thought it better than making a duplicate post.)

Banded answered 4/11, 2013 at 4:23 Comment(0)

I would simply create an <audio autoplay /> DOM element on click; this works in all major browsers, with no need to handle events and trigger play manually.

If you want to respond to audio status changes manually, I would suggest listening for the play event instead of loadeddata; its behavior is more consistent across browsers.
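A minimal sketch of that idea (the file name and the factory's shape are assumptions; the document is passed in so the helper can be exercised outside a browser):

```javascript
// Sketch: build an autoplaying <audio> element on demand.
function makeAutoplayAudio(doc, src) {
  var audio = doc.createElement('audio');
  audio.setAttribute('src', src);
  audio.setAttribute('autoplay', '');
  return audio;
}

// Browser usage (not executed here):
// document.querySelector('.play').addEventListener('click', function () {
//   document.body.appendChild(makeAutoplayAudio(document, 'note.mp3'));
// });
```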

Burnell answered 31/5, 2021 at 10:38 Comment(0)

If you have a small/short audio file that doesn't require a lot of audio clarity, you can convert the audio file to base64 encoding.

This way the audio file is text based and has no download latency, since iOS otherwise fetches the audio roughly at the moment it's played.

On one hand, it's nice what iOS does to prevent abuse. On the other hand, it's annoying when it gets in the way of legitimate usage.

Here's a base64 encoder for audio files.
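A sketch of the playback side, assuming you already have the base64 string (the helper name and default MIME type are my assumptions):

```javascript
// Pure helper: wrap a base64 payload in a data: URI so no network fetch
// is needed when the clip is played.
function audioDataURI(b64, mime) {
  return 'data:' + (mime || 'audio/mpeg') + ';base64,' + b64;
}

// Browser usage (not executed here): the audio bytes are inlined in the page.
// new Audio(audioDataURI(myBase64String)).play();
```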

Subminiature answered 15/7, 2022 at 13:59 Comment(0)

It is possible to measure the latency between audible audio and currentTime. E.g. this method is used in wavearea:

let mediaElement = new Audio('data:audio/wav;base64,UklGRmgAAABXQVZFZm10IBAAAAABAAEAgLsAAAB3AQACABAAZGF0YQIAAABpNUxJU1Q6AAAASU5GT0lTRlQUAAAAcHJvYmUuYXVkaW90b29sLmNvbQBJQ1JEEQAAADIwMjMtMDMtMDIgMDctNDQAAA==')
mediaElement.load()
mediaElement.volume = 0

// Measure latency of audio file between first 'playing' event and actual first sample
export async function measureLatency() {
  return new Promise(ok => {
    mediaElement.play()
    let start
    mediaElement.onplaying = () => start = performance.now()
    mediaElement.onended = () => ok(performance.now() - start)
  })
}

// ...then somewhere on the first click
let latency = await measureLatency()

It creates a 1-sample WAV file, plays it, and measures the time between the onplaying and onended events.

Rossetti answered 2/3, 2023 at 20:58 Comment(0)
C
0

In my experience there is still a significant delay with most of the answers I came across, even after changing the audio format and using base64. The only way the audio delay issue was fixed for mobile was using the Web Audio API in place of new Audio() calls.

Here's a quick drop in function:

// Replace this
const audio = new Audio(url)
audio.play()

// With this
const audioPlay = async url => {
    const context = new AudioContext();
    const source = context.createBufferSource();
    const audioBuffer = await fetch(url)
      .then(res => res.arrayBuffer())
      .then(ArrayBuffer => context.decodeAudioData(ArrayBuffer));
  
    source.buffer = audioBuffer;
    source.connect(context.destination);
    source.start();
};
audioPlay(url)

The code was not created by me; credit goes to the person on SO who wrote it. I'm very sorry I cannot find where I came across it.

Cult answered 18/11, 2023 at 11:25 Comment(0)
