I currently have two command-line pipelines set up to stream video from a Raspberry Pi camera (ArduCam module) to a PC over ethernet; these work great:
gst-sender.sh
./video2stdout | gst-launch-1.0 -v fdsrc fd=0 ! \
video/x-h264, width=1280, height=800, framerate=60/1 ! \
h264parse ! rtph264pay ! \
udpsink host=xxx.xxx.xx.xxx port=xxxx
gst-receiver.sh
gst-launch-1.0 -v -e udpsrc port=xxxx \
caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, \
encoding-name=(string)H264, payload=(int)96" ! \
rtph264depay ! h264parse ! mp4mux ! filesink location=video.mp4
However, I will ultimately be running multiple cameras, synchronized via an external hardware trigger. Since I can't guarantee that the streams will begin at the same time, I need timestamps, either for the stream start time or for each frame.
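Since the trigger is shared, even a coarse absolute start time recorded per stream would let me line things up afterwards. A minimal sketch of what I mean (the file name here is made up, and the sender launch is commented out):

```shell
#!/bin/sh
# Sketch: stamp an absolute wall-clock start time next to the stream,
# then launch the existing sender pipeline as before.
start_file=stream-start.txt        # made-up file name
date +%s.%N > "$start_file"        # Unix time with nanoseconds
echo "stream start: $(cat "$start_file")"
# ./gst-sender.sh                  # launch as before (commented for the sketch)
```

But this only captures when the script launched, not when the first frame actually arrived, so it's not accurate enough on its own.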
By adding 'identity silent=false' between h264parse and rtph264pay in gst-sender.sh, I can access the stream's buffer data, and with the following command I can retrieve the frame timestamps:
./gst-sender.sh | grep -oP "(?<=dts: )(\d+:){2}\d+\.\d+"
But these timestamps are relative to the start of the stream, so I can't use them to line up saved videos from multiple streams!
Start video encoding...
0:00:00.000000000
0:00:00.016666666
0:00:00.033333332
0:00:00.049999998
0:00:00.066666664
0:00:00.083333330
0:00:00.099999996
0:00:00.116666662
0:00:00.133333328
0:00:00.149999994
0:00:00.166666660
0:00:00.183333326
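If I could get an absolute reference for each stream, offsetting these relative values would be straightforward. A sketch of the arithmetic (the start time here is a made-up stand-in for whatever absolute reference I can get):

```shell
#!/bin/sh
# Sketch: shift a relative DTS (H:MM:SS.NNNNNNNNN) by a stream's
# wall-clock start time to get an absolute Unix timestamp.
stream_start=1700000000.000000000   # made-up absolute start time
rel="0:00:00.033333332"             # one of the relative DTS values above

abs=$(echo "$stream_start $rel" | awk '{
    split($2, t, ":")               # t[1]=hours t[2]=minutes t[3]=seconds
    printf "%.9f\n", $1 + t[1]*3600 + t[2]*60 + t[3]
}')
echo "$abs"
```

The missing piece is the absolute reference itself, which is what I'm asking about below.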
It looks like GStreamer has an "absolute" clock time that it uses for latency calculations [1], but I have been unable to find any way to access it from the command line.
Is there a way to access GStreamer's absolute/system clock from the command line? Or another way to get the stream start timestamp?