Stream webcam using ffmpeg and live555

I am new to live555.

I want to stream my webcam from a Windows 7 (64-bit) machine behind a home LAN, using ffmpeg as the encoder, to a live555 server running on a Debian 64-bit Linux machine in a data center, over the WAN. I want to send an H.264 RTP/UDP stream from ffmpeg, and the "testOnDemandRTSPServer" should send out RTSP streams to clients that connect to it.

I am using the following ffmpeg command, which sends UDP data to port 1234 at IP address AA.BB.CC.DD:

.\ffmpeg.exe -f dshow -i video="Webcam C170":audio="Microphone (3- Webcam C170)" -an -vcodec libx264 -f mpegts udp://AA.BB.CC.DD:1234

On the Linux server I am running the testOnDemandRTSPServer on port 5555, which expects raw UDP data from AA.BB.CC.DD:1234. I try to open the RTSP stream in VLC using rtsp://AA.BB.CC.DD:5555/mpeg2TransportStreamFromUDPSourceTest
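
(As a sanity check, the raw UDP stream can also be probed directly on the Debian box, independent of live555. Something like the following untested sketch should print the MPEG-TS stream details if the packets are arriving at all; udp://0.0.0.0:1234 simply means "listen on UDP port 1234 on all interfaces".)

ffprobe -i udp://0.0.0.0:1234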

But I get nothing in VLC. What am I doing wrong? How can I fix it?

Waiwaif answered 16/6, 2014 at 16:03 Comment(1)
RTP can only send one stream at a time; you're sending using MPEG-TS in this example... – Greenback

One thing you can try is to increase VLC's logging verbosity level to 2. VLC expects in-band parameter sets, and when they are missing it prints a debug message in the messages window saying that it is waiting for parameter sets. Just having the parameter sets in the SDP of the RTSP DESCRIBE response is not sufficient. IIRC you can configure x264 to output parameter sets periodically, or at least with every IDR frame.
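
With ffmpeg's libx264 that would look something like the line below (an untested sketch based on your original command; repeat-headers is the x264 option that re-emits the SPS/PPS with every keyframe):

.\ffmpeg.exe -f dshow -i video="Webcam C170" -an -vcodec libx264 -x264opts repeat-headers=1 -f mpegts udp://AA.BB.CC.DD:1234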

Other things you can try: you can test the stream with openRTSP before using VLC. If you run

openRTSP -d 5 -Q rtsp://xxx.xxx.xxx.xxx:5555/mpeg2TransportStreamFromUDPSourceTest

openRTSP will print quality statistics after streaming for 5 seconds (-d 5 sets the duration, -Q requests the statistics). Then you will be able to verify that the testOnDemandRTSPServer is indeed relaying the stream, and that there is no problem between the ffmpeg application and the testOnDemandRTSPServer.

Doughty answered 16/6, 2014 at 18:56 Comment(4)
I get a 'Missing sync byte' on the testOnDemandRTSPServer console. – Waiwaif
You are right: openRTSP is not receiving any streamed data. How do I fix this problem between ffmpeg and live555? – Waiwaif
I ran all the programs on the same Windows 7 machine. Still the same result: openRTSP receives 1 or 2 packets in 5 seconds. – Waiwaif
I'm not sure; I've personally never used testOnDemandRTSPServer. I would try the live555 mailing list first; there's generally a fairly quick response time to queries. – Doughty

Have you tried a different stream? Also, I had a similar problem due to issues with my firewall; you might want to make sure you can actually stream data through those ports.
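
(One quick way to verify this, assuming you have shell access to the Debian box: run tcpdump there while ffmpeg is sending and see whether the packets arrive at all. If nothing shows up, a firewall or NAT is dropping the traffic before it ever reaches live555.)

sudo tcpdump -n udp port 1234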

If you are missing a sync byte, it's probably a stream issue. Try using a different data source and see if that helps: try an .avi file or an .mp4 file; usually .mp4 files are easy to stream. If the streaming works with the .mp4 file and not with your MPEG-TS stream, then it's a problem in your stream - the receiving side is trying to figure out where each "frame" or "frame set" of data ends so that it can stream discrete chunks. A file-streaming sketch follows below.
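
For the file test, something like this (untested; test.mp4 is a placeholder for any local file) would stream it to the same UDP port, with -re pacing the read at the file's native frame rate:

.\ffmpeg.exe -re -i test.mp4 -an -vcodec libx264 -f mpegts udp://AA.BB.CC.DD:1234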

It's been over 2 years since I last worked with this stuff, so let me know if you get anywhere.

Creolized answered 18/6, 2014 at 12:27 Comment(1)
Thanks for responding. My only interest is in streaming a webcam. I haven't tried a different stream. Do I need to write a DeviceSource for the webcam? Is any sample code available which shows how to use it with ffmpeg? – Waiwaif

From what I remember, it was non-trivial to write a DeviceSource class. The problem you're describing is definitely something that's discussed quite frequently on the live555 mailing list; you need to get yourself approved for the list ASAP if you want to do anything related to RTSP development.

The problem you seem to be having is related to the fact that only some video formats are written with streaming in mind: the RTSP server can easily stream certain formats because they contain "sync bytes" and other markers which it can use to determine where frame boundaries end. The simplest solution would be to get your hands on the SDK for the camera and use that to request data from it. There are many different libraries and toolkits that let you access data from the camera, one of which is the DirectX SDK. Once you have the camera data, you need to encode it into a streamable format: you might grab the raw camera frames using DirectX, then convert them to MP4/H.264 frame data using ffmpeg's libraries (libavcodec, libavformat). A command-line shortcut is sketched below.
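
(If you'd rather not program against the DirectX SDK directly, the ffmpeg command line can do both the capture and the encode. This is an untested sketch; the device name is the one from the question, and -tune zerolatency is just one plausible choice for live streaming. The first line lists the available DirectShow devices as a sanity check.)

.\ffmpeg.exe -list_devices true -f dshow -i dummy
.\ffmpeg.exe -f dshow -i video="Webcam C170" -vcodec libx264 -tune zerolatency -f mpegts udp://AA.BB.CC.DD:1234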

Once you have your encoded frame data, you feed it into your DeviceSource class, and that will take care of streaming the data for you. I wish I had code on hand, but I was bound by an NDA not to remove code from the premises; the general algorithm is documented on the live555 website, though, so I am able to explain it here.

I hope you have a bit more luck with this. If you get stuck, remember to add code to your question. Right now the only thing that's stopping your original plan (streaming a file to VLC) from working is the file format you chose to stream.

Creolized answered 18/6, 2014 at 12:51 Comment(0)
