Creating synchronized stereo videos using webcams
I am using OpenCV to capture video streams from two USB webcams (Microsoft LifeCam Studio) in Ubuntu 14.04. I am using very simple VideoCapture code (source here) and am trying to at least view the two video streams synchronized against each other.

I used Android stopwatch apps (UltraChron Stopwatch Lite and Stopwatch Timer) on my Samsung Galaxy S3 mini to verify that the displayed images are out of sync (they show different times on the stopwatch).

The frames are in sync maybe 50% of the time. The frame time differences I get range from 0 to about 300 ms, with an average of about 120 ms. The amount of timeout used seems to have very little effect on sync (same for 1000 ms or 2000 ms). I tried to minimize the timeout (waitKey(1), the minimum for the OpenCV loop to work at all) and read only every Xth iteration of the loop - this gave worse results than waitKey(1000). I run in Full HD, but lowering the resolution to 640x480 had no effect.
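
A software-only workaround (along the lines of the threading suggestion in the comments) is to read each camera in its own thread and timestamp every frame on arrival, so the display loop always pairs the two most recent frames instead of reading the cameras sequentially. Below is a minimal sketch of that structure; the `FakeCamera` class is a stand-in I made up so the sketch runs without hardware - for real webcams you would replace it with `cv2.VideoCapture(0)` and `cv2.VideoCapture(1)` (camera indices are an assumption). This reduces the pairing skew but cannot eliminate it, since the cameras still run on their own free clocks.

```python
import threading
import time

class ThreadedGrabber:
    """Continuously reads frames from a capture source in its own thread,
    keeping only the latest frame together with its arrival timestamp."""

    def __init__(self, source):
        self.source = source          # e.g. cv2.VideoCapture(0) (assumption)
        self.lock = threading.Lock()
        self.latest = None            # (timestamp, frame)
        self.running = True
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def _loop(self):
        while self.running:
            ok, frame = self.source.read()
            if ok:
                with self.lock:
                    self.latest = (time.monotonic(), frame)

    def read(self):
        """Return the newest (timestamp, frame) pair seen so far."""
        with self.lock:
            return self.latest

    def stop(self):
        self.running = False
        self.thread.join()

# Stand-in source so the sketch runs without cameras; replace with
# cv2.VideoCapture(0) / cv2.VideoCapture(1) for real webcams.
class FakeCamera:
    def __init__(self, name):
        self.name, self.n = name, 0
    def read(self):
        self.n += 1
        time.sleep(0.01)              # simulate ~100 FPS capture
        return True, f"{self.name}-frame-{self.n}"

left = ThreadedGrabber(FakeCamera("L"))
right = ThreadedGrabber(FakeCamera("R"))
time.sleep(0.1)                       # let both threads produce frames

t_l, f_l = left.read()
t_r, f_r = right.read()
skew_ms = abs(t_l - t_r) * 1000.0     # residual skew between latest frames
print(f"pair: {f_l} / {f_r}, skew = {skew_ms:.1f} ms")

left.stop()
right.stop()
```

With real cameras you would show the two latest frames side by side in the display loop; the measured `skew_ms` tells you how far apart the paired frames actually are.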

An ideal result would be a 100% synchronized stereo video stream at X FPS. So far I have only used OpenCV to view the video frames, but I do not mind using anything else to get the desired result (it can be on Windows too).

Thanks for help in advance!

EDIT: In my search for low-cost hardware I found that it is probably possible to do some commodity hardware hacking (link here) and inject a single clock signal into multiple camera modules simultaneously to get the desired sync. The guy who did that seems to have developed his GENLOCKed camera board (called NerdCam1) and even a synced stereo camera board that he now sells for about €200.

However, I have almost zero hardware-hacking ability. I am also not sure whether such clock injection is possible for resolutions above the NTSC/PAL standard (it seems to be an "analog" solution?). Also, I would prefer a variable-baseline option where the two cameras are not soldered onto a single board.

Squinteyed answered 6/8, 2014 at 13:40 Comment(4)
Since the cameras don't have genlock, I think it is not possible to do this.Planetoid
Hi, your first problem is that you have sequential code, so it is physically impossible for the cameras to capture at the same time. The first step forward is to put the captures into their own threads.Fremantle
This is why most machine vision cameras support strobing / triggering for synchronization. It's a very difficult problem to solve on cheap USB hardware. Many researchers (myself included) use cameras from Point Grey (there are others too; they are just the main standard for academics). Another option is to capture at a high frame rate and hope that the scene doesn't change much.Cypsela
This has been cross posted to Video Production SE: video.stackexchange.com/questions/12312/…Squinteyed
It is not possible to stereo-sync two common webcams, because webcams lack an external-trigger feature that would let one precisely sync multiple cameras from a common trigger signal. Such triggering can be done in either software or hardware, but the latter gives better precision. Webcams only support "free-running" mode: they stream at whatever FPS they support, but you cannot influence when exactly the frame integration/exposure happens.
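
You can still shave some of the software-side skew off a free-running pair: OpenCV splits capture into a cheap `grab()` (latch the frame on the device) and a slower `retrieve()` (transfer and decode), so calling `grab()` on both cameras back to back before either `retrieve()` narrows the gap between the latched frames. Here is a sketch of that ordering; the `FakeCapture` class is a stand-in I invented so it runs without hardware - with real webcams you would pass two `cv2.VideoCapture` instances instead. The cameras' own exposure clocks remain unsynchronized, so this only bounds the software-side portion of the skew.

```python
import time

def soft_sync_read(cap_a, cap_b):
    """Approximate software sync: latch a frame on both devices back to back
    with the cheap grab() call, then do the expensive decode with retrieve().
    Works with any objects exposing OpenCV's grab()/retrieve() interface,
    e.g. two cv2.VideoCapture instances."""
    t0 = time.monotonic()
    ok_a = cap_a.grab()               # latch frame A (fast)
    ok_b = cap_b.grab()               # latch frame B immediately after
    grab_gap_ms = (time.monotonic() - t0) * 1000.0
    if not (ok_a and ok_b):
        return None, None, grab_gap_ms
    _, frame_a = cap_a.retrieve()     # slow decode happens after both latches
    _, frame_b = cap_b.retrieve()
    return frame_a, frame_b, grab_gap_ms

# Minimal stand-in with the same grab()/retrieve() split so the sketch
# runs without hardware; replace with cv2.VideoCapture(0) and (1).
class FakeCapture:
    def __init__(self, name):
        self.name, self.n, self.pending = name, 0, None
    def grab(self):
        self.n += 1
        self.pending = f"{self.name}-frame-{self.n}"
        return True
    def retrieve(self):
        time.sleep(0.005)             # simulate slow decode
        return True, self.pending

a, b = FakeCapture("A"), FakeCapture("B")
fa, fb, gap = soft_sync_read(a, b)
print(fa, fb, f"grab gap: {gap:.2f} ms")
```

The printed grab gap is the time between the two latches, i.e. the part of the skew your software ordering controls; the cameras' exposure timing is still outside your reach without a hardware trigger.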

There are USB cameras with a dedicated external-trigger feature (usually scientific cameras, such as those from Point Grey). They are more expensive than webcams (starting at about $300/piece) but can be synced. If you really are on a low budget, you can try to hack the PS3 Eye camera to get the external-trigger feature.

Squinteyed answered 16/12, 2016 at 23:4 Comment(0)
