How to minimize the delay in a live stream with ffmpeg

I have a problem. I would like to do live streaming with ffmpeg from my webcam.

  1. I launch the ffserver and it works.

  2. From another terminal I launch ffmpeg to stream with this command and it works:

    sudo ffmpeg -re -f video4linux2 -i /dev/video0 -fflags nobuffer -an http://localhost:8090/feed1.ffm
    
  3. In my configuration file I have this stream:

    <Stream test.webm>
    Feed feed1.ffm
    Format webm
    NoAudio

    # Video settings
    VideoCodec libvpx
    VideoSize 720x576                   # Video resolution
    VideoFrameRate 25                   # Video FPS
    AVOptionVideo flags +global_header  # Parameters passed to the encoder
                                        # (same as ffmpeg command-line parameters)
    AVOptionVideo cpu-used 0
    AVOptionVideo qmin 10
    AVOptionVideo qmax 42
    #AVOptionVideo quality good
    PreRoll 5
    StartSendOnKey
    VideoBitRate 400                    # Video bitrate
    </Stream>
    
  4. I launch the stream with:

    ffplay http://192.168.1.2:8090/test.webm

It works, but I have a delay of 4 seconds and I would like to minimize it, because low latency is essential for my application. Thanks

Satterlee answered 20/5, 2013 at 21:41 Comment(4)
You can try to reduce the value of probesize: ffplay -probesize 500000 http://192.168.1.2:8090/test.webm (that sets it to 500 KB; experiment with this value, the default is 5 MB if I'm not mistaken.)Sturrock
OK, thanks. Now I have to watch the stream in a mobile browser, so I cannot use ffplay. I need some suggestions for optimizing the configuration file.Satterlee
Reading ffmpeg.org/sample.html, I imagine you could try VideoBufferSize or reducing the GOP size (which increases bandwidth usage); see the sketch after these comments.Fascine
Did you manage to find a solution ?Colony
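Following up on the comment above about VideoBufferSize and the GOP: a minimal sketch of what those directives could look like inside the question's <Stream> block. The values are illustrative guesses to experiment with, not recommendations.

    VideoBufferSize 40          # smaller rate-control buffer, less buffering delay (value is a guess)
    VideoGopSize 12             # smaller GOP = more frequent keyframes, faster start, more bandwidth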

I found three commands that helped me reduce the delay of live streams. The first is very basic and straightforward, the second combines other options and might work differently in each environment, and the last is a hacky version I found in the documentation. The hacky one was useful at the beginning, but currently the first option is more stable and better suited to my needs.

1. Basic: using -fflags nobuffer

This format flag reduces the latency introduced by buffering during the initial analysis of the input streams. The command below noticeably reduces the delay and does not introduce audio glitches.

ffplay -fflags nobuffer -rtsp_transport tcp rtsp://<host>:<port>

2. Advanced: -flags low_delay and other options

We can combine the previous -fflags nobuffer format flag with other generic and advanced options for a more elaborate command:

  • -flags low_delay: this generic codec flag forces low delay.
  • -framedrop: drop video frames if the video is out of sync. This is enabled by default when the master clock is not set to video. Use this option to enable frame dropping for all master clock sources.
  • -strict experimental: -strict specifies how strictly to follow the standards, and the experimental value allows non-standardized, experimental (unfinished/work-in-progress/not well tested) decoders and encoders. This option is optional; remember that experimental decoders can pose a security risk, so do not use them to decode untrusted input.
ffplay -fflags nobuffer -flags low_delay -framedrop \
-strict experimental -rtsp_transport tcp rtsp://<host>:<port>

This command might introduce some audio glitches, but rarely.

You can also try adding:

  • -avioflags direct to reduce buffering, and
  • -fflags discardcorrupt to discard corrupted packets, although I think this is a very aggressive approach that might break audio-video synchronization.
ffplay -fflags nobuffer -fflags discardcorrupt -flags low_delay \
-framedrop -avioflags direct -rtsp_transport tcp rtsp://<host>:<port>

3. A hacky option (found in the old documentation)

This is a debugging solution based on setting -probesize and -analyzeduration to low values, to help your stream start up more quickly.

  • -probesize 32 sets the probing size in bytes (i.e. the size of the data to analyze to get stream information). A higher value enables detecting more information in case it is dispersed in the stream, but increases latency. It must be an integer not less than 32; the default is 5000000.
  • -analyzeduration 0 specifies how many microseconds are analyzed to probe the input. A higher value enables detecting more accurate information, but increases latency. It defaults to 5000000 microseconds (5 seconds).
  • -sync ext sets the master clock to an external source to try to stay realtime; the default is audio. The master clock is used to control audio-video synchronization, so this option selects which source drives it (audio, video, or ext).
ffplay -probesize 32 -analyzeduration 0 -sync ext -rtsp_transport tcp rtsp://<host>:<port>

This command might occasionally introduce audio glitches.

The -rtsp_transport option can be set to udp or tcp depending on your stream; for these examples I'm using tcp.
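If, as in the original question, the source is an HTTP/WebM stream from ffserver rather than RTSP, the same player-side flags should presumably work once -rtsp_transport is dropped; an untested sketch using the question's URL:

ffplay -fflags nobuffer -flags low_delay -framedrop http://192.168.1.2:8090/test.webm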

Streeter answered 14/3, 2018 at 8:47 Comment(4)
None of this had any effect in my testing. Using the SDL preview with ffmpeg (not ffplay) was the only option that reduced the delay (but the lack of sound is an obvious drawback).Oosperm
Reducing analyzeduration and probesize from its default values did the trick for me. Could reduce 2-3 secs from stream start time.Lamentation
I would NOT recommend attaching the nobuffer flag for a livestream. It does in fact seem to cause glitches during frame drops, causing the video either to stall for a couple of seconds or to repeat certain fragments; this behaviour can be controlled with the -vsync option. Also, a VOD of such a livestream is filled with corrupted frames and has to be re-encoded. Removing the nobuffer flag solved the problem in my case.Gylys
@PatrykCieszkowski Remember that this is a solution for ffplay, not for ffmpeg, and the main idea is to reduce the delay while playing a stream. If you want to store it or transcode it with low delay, ffmpeg -re might help, but again every solution has its own pros and cons.Streeter

FFmpeg's streaming guide has a specific section on how to reduce latency; I haven't tried all of its suggestions yet. http://ffmpeg.org/trac/ffmpeg/wiki/StreamingGuide#Latency

It makes a particular note about the latency ffplay itself introduces:

By default, ffplay introduces a small latency of its own. Also useful is mplayer with its -nocache for testing latency (or -benchmark). Using the SDL output is also said to show frames with minimal latency: ffmpeg ... -f sdl -
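The elided input in that quote depends on your source; assuming a v4l2 webcam like the one in the question, an SDL preview sketch might look like this (the input device and window title are illustrative assumptions):

ffmpeg -f v4l2 -i /dev/video0 -pix_fmt yuv420p -f sdl "Low-latency preview"  # input and title are assumptions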

Eats answered 9/7, 2013 at 2:47 Comment(3)
thanks, ffplay -probesize 32 -sync ext INPUT from the link did the trick for me!Illusionism
What latency did you achieve? I need more than 100 ms streaming latency . Is it possible ?Colony
I didn't measure the final delay but it looked close to live on a monitor in person with the sound which was going through a sound booth, so was good enough for me.Eats

Consider using the filter option -vf setpts=0. This makes all frames display as soon as possible, without adding any delay to match the frame rate. It allows the stream to catch up in case it falls behind, which I've found happens if I move or resize the ffplay window. However, it could make the video look choppy if your video data is received at an inconsistent rate.
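Combined with the stream from the question, a sketch could look like the following (the URL is the question's; the extra -fflags nobuffer is optional):

ffplay -fflags nobuffer -vf setpts=0 http://192.168.1.2:8090/test.webm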

Rwanda answered 4/3, 2022 at 14:18 Comment(2)
I ended up with the following command: ffplay -flags low_delay -vf setpts=0 udp://127.0.0.1:10000. Adding -vf setpts=0 reduced the latency by 6 frames. The other solutions are not working (in my case). Thank you!!!Ligula
Setting -vf setpts=0 is the only thing that worked for me to eliminate accrued latency when playing a USB dshow webcam. I believe it is because the webcam is tagging the frames' timestamps slower than my computer's clock, so ffplay ends up delaying playback until it has to drop frames because of a full cache.Botchy

Try setting the flags of the AVFormatContext to AVFMT_FLAG_NOBUFFER | AVFMT_FLAG_FLUSH_PACKETS:

AVFormatContext *ctx;
...
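// reduce demuxer buffering and flush the I/O context after each packet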
ctx->flags = AVFMT_FLAG_NOBUFFER | AVFMT_FLAG_FLUSH_PACKETS;

Then try setting the decoder thread count to 1. It seems that more threads cause more latency:

AVCodecContext *ctx;
...
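// frame-threaded decoding adds roughly one frame of delay per extra thread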
ctx->thread_count = 1;
Pendley answered 5/3, 2019 at 9:1 Comment(0)

For me, the latency problem was solved by passing -tune zerolatency:

ffmpeg -f rawvideo -i /dev/video0 -preset slow -tune zerolatency -pix_fmt yuv420p -c:v libx264 -f rawvideo /tmp/pipe.h264

It made the quality of the motion vectors a bit worse, but the latency was more important to me, so...

Filaria answered 25/12, 2022 at 14:43 Comment(0)
