I am capturing video from a webcam that provides an MJPEG stream. The capture runs in a worker thread, and I start it like this:
const std::string videoStreamAddress = "http://192.168.1.173:80/live/0/mjpeg.jpg?x.mjpeg";
qDebug() << "start";
cap.open(videoStreamAddress); // blocks until the MJPEG stream is opened
qDebug() << "really started";
cap.set(CV_CAP_PROP_FRAME_WIDTH, 720);
cap.set(CV_CAP_PROP_FRAME_HEIGHT, 576);
The camera feeds the stream at 20 fps, but if I also read at 20 fps like this:
if (!cap.isOpened()) return;
Mat frame;
cap >> frame; // get a new frame from camera
mutex.lock();
m_imageFrame = frame; // shallow copy: the Mat header is copied, the pixel data is shared
mutex.unlock();
Then there is a lag of 3+ seconds. The reason is that the captured video is first stored in a buffer: when I start the camera, the buffer accumulates frames that I have not yet read, so reading from it always gives me old frames. The only solution I have so far is to read at 30 fps so the buffer drains quickly and the serious lag disappears.
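A minimal sketch of that read-and-overwrite workaround, using std::thread/std::mutex instead of my Qt worker and mutex; captureLoop, latestFrame, frameMutex and keepRunning are illustrative names only:

#include <atomic>
#include <mutex>
#include <thread>
#include <opencv2/opencv.hpp>

cv::Mat latestFrame;                 // always holds the newest decoded frame
std::mutex frameMutex;
std::atomic<bool> keepRunning{true};

void captureLoop(cv::VideoCapture &cap)
{
    cv::Mat frame;
    while (keepRunning) {
        if (!cap.read(frame))        // blocks until the next frame is available
            break;
        std::lock_guard<std::mutex> lock(frameMutex);
        frame.copyTo(latestFrame);   // overwrite, never queue old frames
    }
}

The consumer locks frameMutex and copies latestFrame whenever it needs an image, so it always gets the most recent frame the worker has decoded.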
Is there any other possible solution so that I could clean/flush the buffer manually each time I start the camera?
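To be concrete, what I have in mind is something like a drain step built on grab(), which fetches a frame without the decoding cost of retrieve()/read(); flushCapture and framesToDrop are placeholder names and the drop count would need tuning for the stream:

#include <opencv2/opencv.hpp>

// Drain whatever the backend has buffered right after opening the stream.
void flushCapture(cv::VideoCapture &cap, int framesToDrop = 60)
{
    for (int i = 0; i < framesToDrop; ++i) {
        if (!cap.grab())   // grab() skips the decode step, so this is cheap
            break;
    }
}

// usage, right after cap.open(videoStreamAddress):
// flushCapture(cap);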
Calling video_capture.set(cv2.CAP_PROP_POS_FRAMES, 0) before every video_capture.read() call helps me get the latest frames from a USB camera with Python 3, OpenCV 4.2 and GStreamer, whereas CAP_PROP_BUFFERSIZE gives a GStreamer "unhandled property" warning. – Subjoinder

Calling video_capture.set(cv2.CAP_PROP_POS_FRAMES, 0) before every video_capture.read() actually made my video stream lag even more... – Anklebone
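Since the first comment notes that CAP_PROP_BUFFERSIZE is ignored by the GStreamer backend, a hedged C++ equivalent (the property exists in OpenCV 3.x/4.x, but support depends on the capture backend) would at least check the return value of set() before relying on it:

#include <opencv2/opencv.hpp>

cv::VideoCapture cap("http://192.168.1.173:80/live/0/mjpeg.jpg?x.mjpeg");
// Ask the backend to keep at most one buffered frame; not every backend
// honours this property (the comment above shows GStreamer ignoring it).
if (!cap.set(cv::CAP_PROP_BUFFERSIZE, 1)) {
    // Property not handled by this backend; fall back to draining with grab().
}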