How to de-warp 180 degree or 360 degree fisheye video with ffmpeg?
Let's say I have video from an IP camera with a 180 degree or 360 degree fisheye lens and I want to dewarp the image in some way. Ideally I would be able to select some rectangular area of the input image and dewarp that into a "normal"-looking output video, but it would also be acceptable to dewarp the video into some sort of equirectangular or equi-angular cubemap projection. The input video looks like this:

[image: circular fisheye frame from the camera]

I'm aware of two filters that might be used for this:

  1. lenscorrection filter - I think this is on the right track, but all of the examples I can find for this filter are only for "minor" fisheye lenses, and I can't seem to get it to work correctly for videos with 360 degree fisheye lenses; it simply doesn't dewarp enough.

  2. v360 filter - I thought this must be the correct filter, but it seems to be intended for 360 degree videos rather than 360 degree fisheye lenses? I didn't know there was a difference, but I can't get it to work. When I try to take my input video and map it through an equirectangular projection I get some odd output like this:

[image: distorted output from the v360 filter]

I've tried a dozen or so different combinations of parameters, but none of them give me the output that I want, which is a single dewarped image. Can someone help me with the filter graph parameters for this filter?

Is there something that I'm missing? Are either of these filters the correct way forward?

EDIT -

I've been experimenting with the v360 filter and I think I've gotten closer. What I want is to map a fisheye input to an equirectangular output, so I've tried this:

ffmpeg -i input.mp4 -vf v360=fisheye:equirect:id_fov=360 output.mp4

This should mean that my input is a fisheye lens with a diagonal field of view of 360 degrees and that I want my output to be an equirectangular projection, but this is what I get:

[image: garbled equirectangular output from id_fov=360]

Trailer answered 6/6/2020 at 22:22. Comments (3):
That first output you have is a cubemap, which is a more efficient projection for dealing with 360 degree video than a regular equirectangular projection. – Tailrace
You must use ih_fov=180 and iv_fov=180 in place of id_fov=360. But you will still get two dark areas in the equirectangular result, because they represent the non-visible half of the world which your camera cannot see. – Enroot
I'm voting to close this question because, as explained by the ffmpeg tag: Only questions about programmatic use of the FFmpeg libraries, API, or tools are on topic. Questions about interactive use of the command line tool should be asked on superuser.com or video.stackexchange.com. Please delete this. – Maryrosemarys
ffmpeg -i input.mp4 -vf v360=fisheye:equirect:ih_fov=360:iv_fov=360 output.mp4

Diagonal FOV is not the same as horizontal or vertical FOV, and your camera has both a horizontal and a vertical FOV of 360 degrees.

Founder answered 24/6/2020 at 9:23. Comments (2):
This is what I searched for for weeks, but my photo/video/VR lingo isn't that great, plus English isn't my native language, making it hard to comprehend the manual. So thanks! However, what I did read in the manual is that setting a diagonal FOV resets H and V. JFYI. – Gasparo
This is wrong; no camera in the world can have a 360x360 FOV, you need two cameras for that! 360 is the azimuth covered all around the zenith/nadir, but the camera is not looking behind, hence the FOV is just 180 (although I have a camera and a fisheye add-on for a phone which brings the FOV up to 235°). Hence the command is ffmpeg -i input_file -vf v360=fisheye:equirect:id_fov=180 output_file (regardless of whether the files are images or videos). – Enroot
Using FFmpeg, the proper command line is probably (see notes below) something like this:

ffmpeg -i input_file -vf v360=fisheye:equirect:id_fov=180 output_file

or:

ffmpeg -i input_file  -vf v360=input=fisheye:id_fov=180:output=equirect:pitch=-90 -y output_file

regardless of whether the files are images or videos.

This happens because no camera in the world can have a 360x360 FOV; you need two cameras for that. 360 is the azimuth covered all around the zenith/nadir, but the camera is not looking behind itself, hence the FOV is just 180 (although I have a camera and a fisheye add-on for a phone which brings the FOV up to 235°).

Errors in your command line:

  • by using id_fov in place of ih_fov and iv_fov, and giving it the value 360, you are actually specifying a horizontal/vertical FOV of about 254°, because id_fov is the DIAGONAL FOV (360 / sqrt(2) ≈ 254)

  • your camera probably has a 180° horizontal/vertical FOV, not 360°
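The diagonal-to-horizontal relation in the first note can be checked with a quick shell one-liner (a sketch assuming a square frame, where the diagonal FOV exceeds the horizontal one by a factor of sqrt(2)):

```shell
# Horizontal/vertical FOV implied by id_fov=360 on a square fisheye frame:
# diagonal = horizontal * sqrt(2), so horizontal = diagonal / sqrt(2).
awk 'BEGIN { printf "%d\n", 360 / sqrt(2) }'   # prints 254
```

The same arithmetic run in reverse gives the diagonal value to pass for a true 180° lens: 180 * sqrt(2) ≈ 255.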

FFmpeg result using id_fov=360:

[image: id_fov=360 result]

using ih_fov=254:iv_fov=254:

[image: ih_fov=254:iv_fov=254 result]

(identical to the one above)

using ih_fov=180:iv_fov=180:

[image: ih_fov=180:iv_fov=180 result]

Probably none of these results is the right one: you should identify the real horizontal/vertical FOV of your camera and use it for both ih_fov and iv_fov.
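As a starting point, it can help to confirm that the fisheye circle sits in a square frame in the source file. A sketch using ffprobe (input.mp4 is a placeholder filename) reads the stream dimensions:

```shell
# Print the pixel dimensions of the first video stream, e.g. "1920x1920".
# A circular fisheye normally sits in a square frame; if width != height,
# the horizontal and vertical FOV values may need to differ too.
ffprobe -v error -select_streams v:0 \
        -show_entries stream=width,height -of csv=s=x:p=0 input.mp4
```

Note that this only reports pixel geometry; the optical FOV itself has to come from the camera's spec sheet or from test shots.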

Once you convert to equirectangular, you can upload images here to view them:

https://spano.pyrik.dev/

To view them in a VR viewer such as Google Cardboard, you'll need to add some EXIF data, as done in this script.


XDV360, an app that comes with my "panocamera", allows converting fisheye to equirectangular (both videos and images); it's limited to cameras pointing up (zenith) or down (nadir), but your case falls within these limits, so it's OK. You can find the Android version online, and the PC version here:

https://windowsdraw.altervista.org/XDV360_201672719.zip

Enroot answered 9/7/2021 at 17:27. Comments (0)

© 2022 - 2024 — McMap. All rights reserved.