Better Calibration to Eliminate Headaches

March 26, 2021

As Tiledmedia’s math guru (not my own words :-), I get to solve the challenging problems. Here’s an interesting one: fisheye calibration. Let me explain.

Working with Cosm, we supply client-side technology for a major upcoming sports event late this summer. (Naming the event would require me to get all sorts of permissions, and I am too busy doing math, so I'll leave that bit to the imagination of the reader.) There will be a VR service, and much of it will be stereoscopic 180-degree video captured with fisheye lenses. Now there are many camera bodies and many fisheye lenses, and even more combinations of the two, and each combination requires its own set of calibration parameters.

So my colleagues asked me to ensure the fisheye video looks good in our client-side SDK. Now while a sound calibration is always important, it is even more important when it comes to stereoscopic content. When the calibration is even a little off, the stereo won’t work, or it will look odd. Your eyes may have trouble adjusting. Even worse, it might give the user a splitting headache. And believe me, having looked at many badly calibrated videos, I have first-hand experience.

I used to think that calibration was an art: I would painstakingly work through videos on my screen, trying to find straight lines in them, measuring them, doing math on the sensor density, and so on. More art than science, really. As long as we only worked with a handful of combinations, doing the calibration manually was not a problem. But lately we have been getting more requests and more combinations, so we needed something less time-consuming, and less headache-inducing.

OCamCalib to the Rescue

Last week, while reading about Intel's XCam open source stitching software, I stumbled upon a nifty tool called OCamCalib.

Provided by Davide Scaramuzza of ETH in Zürich, this "Omnidirectional Camera Calibration Toolbox for Matlab" does something that is nothing short of amazing. Print a chessboard, move it around in front of the camera you want to calibrate, and record that in a video. Next, drop the video into OCamCalib and out come all the parameters we need for a polynomial fisheye camera model. There can be quite a few parameters; to keep things manageable we limit the number to 39 (yes, thirty-nine).
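To give a feel for what such a polynomial model does, here is a minimal Python sketch (my own illustration, not our production code) of the back-projection step in a Scaramuzza-style model: a pixel is mapped to a viewing ray (u, v, f(rho)), where f is a polynomial in the distance rho from the distortion centre. The coefficients, the centre and the affine terms below are made up for the example; the real numbers are exactly what the toolbox estimates for a given camera-lens combination.

import numpy as np

def pixel_to_ray(px, py, poly, center, affine=(1.0, 0.0, 0.0)):
    # Back-project one pixel to a 3D viewing ray with a Scaramuzza-style
    # polynomial fisheye model (the kind of model OCamCalib estimates).
    # poly   : [a0, a1, a2, ...] coefficients of f(rho) -- illustrative values
    # center : (xc, yc) distortion centre in pixels
    # affine : (c, d, e) of the sensor misalignment matrix [[c, d], [e, 1]]
    c, d, e = affine
    xc, yc = center
    # Pixel coordinates -> idealised sensor-plane coordinates.
    A_inv = np.linalg.inv(np.array([[c, d], [e, 1.0]]))
    u, v = A_inv @ np.array([px - xc, py - yc])
    # Distance from the distortion centre and the polynomial z component.
    rho = np.hypot(u, v)
    z = np.polyval(poly[::-1], rho)  # f(rho) = a0 + a1*rho + a2*rho**2 + ...
    ray = np.array([u, v, z])
    return ray / np.linalg.norm(ray)  # unit-length viewing direction

# Made-up coefficients, only to make the example runnable; the first
# coefficient typically comes out negative in the toolbox's results.
poly = [-780.0, 0.0, 6.4e-4, -2.0e-7, 9.0e-11]
print(pixel_to_ray(1500.0, 900.0, poly, center=(1920.0, 1080.0)))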

I did learn that this requires clean video feeds, without any "correction" plugins applied in post. One such video kept me quite busy for a day, trying to understand why I got strange-looking results. Then I realized that some 3D processing had been applied; once I got the raw camera output, it worked like a charm.

Useful in ClearVR Cloud and the ClearVR SDK

Armed with the new toolbox, we are now re-calibrating our fisheye settings. They were already quite good, but now we can make them even better. This matters because ClearVR maps fisheye video at two points in our end-to-end chain.

First, we ingest fisheye video into ClearVR Cloud and convert it to a 180-degree cubemap before doing a tiled transcode. We do this for the live matches in Sky Worlds, for instance. Second, our client-side SDK is able to take non-tiled 180-degree fisheye video and apply the same mapping.
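To sketch what such a mapping involves (a simplified illustration, not our actual implementation), the snippet below resamples the front face of a cubemap from a fisheye frame. For brevity it assumes an ideal equidistant fisheye (r = focal_px * theta) centred in the image and uses nearest-neighbour sampling; in the real chain the calibrated polynomial model and proper filtering take the place of both.

import numpy as np

def fisheye_to_cubeface(src, face_size, focal_px):
    # Resample the front face of a cubemap from a 180-degree fisheye frame.
    # Assumes an ideal equidistant fisheye (r = focal_px * theta) whose
    # optical axis sits at the image centre; nearest-neighbour sampling.
    h, w = src.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0

    # Viewing directions for every pixel of the front cube face (+z).
    s = (np.arange(face_size) + 0.5) / face_size * 2.0 - 1.0
    xs, ys = np.meshgrid(s, -s)              # rows run top-to-bottom
    zs = np.ones_like(xs)
    norm = np.sqrt(xs ** 2 + ys ** 2 + zs ** 2)
    x, y, z = xs / norm, ys / norm, zs / norm

    # Angle from the optical axis, mapped to a radius in the fisheye image.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    r = focal_px * theta
    phi = np.arctan2(y, x)
    u = (cx + r * np.cos(phi)).round().astype(int).clip(0, w - 1)
    v = (cy - r * np.sin(phi)).round().astype(int).clip(0, h - 1)
    return src[v, u]

For a fisheye whose 180 degrees span the full image width, focal_px is roughly the width divided by pi, so a call could look like face = fisheye_to_cubeface(frame, 1024, focal_px=frame.shape[1] / np.pi).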

We also confirmed something that is well known in the industry: stereoscopic video in VR should avoid very close objects. Such objects make it hard for users to adjust their eyes and trigger the vergence-accommodation conflict. They also make the lack of true parallax in stereoscopic 3-DoF video even more apparent, and more annoying.
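To put a rough number on it (back-of-the-envelope figures of my own, not measurements from the event content): the angle over which the eyes must converge grows quickly as an object gets close, while the headset's focal distance stays fixed.

import numpy as np

# Vergence angle for an object at a given distance, assuming a 63 mm
# interpupillary distance (an average figure, not something from the post).
ipd = 0.063  # metres
for distance in (0.3, 0.5, 2.0, 10.0):
    vergence_deg = np.degrees(2 * np.arctan(ipd / 2 / distance))
    print(f"{distance:5.1f} m -> {vergence_deg:5.2f} degrees of vergence")

An object at half a metre asks for roughly 7 degrees of vergence, versus about a third of a degree at ten metres, which is why close-by objects are so much more demanding.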

As always, the proof of the pudding will be in the eating, but I believe this will save sports fans some serious headaches this summer, and make their VR sports experience all the more enjoyable.

Curious? Drop me a note at [email protected]!

(And thanks, Justin Fiksel, for the picture!)
