What does it look like to be on a mountain top engulfed by fire?

August 15, 2017

The article below was contributed by Paul Bourke (please see https://www.youtube.com/user/pauldbourke and http://paulbourke.net/ ) as background information for a time-lapse animation he produced from HPWREN images. The video is available at https://www.youtube.com/watch?v=uTBCVDvV36A and shows a four-camera 360 degree view, at 4K resolution, of the Whittier Fire near Santa Barbara in July 2017, projected onto the inside of a sphere within which a viewer can pan, tilt and zoom. Viewing options include a PC with a mouse to control rotations (best in full-screen mode), a tablet or smartphone that reacts to the viewer's movements, or a virtual reality head-mounted display able to track head tilt and rotation.

The history of panoramic photography dates back to the very earliest days of photography in the 1840s. Techniques to photographically capture panoramas have continued to evolve, including the development of bespoke camera systems. With new presentation devices such as cylindrical projection systems and head-mounted displays allowing a panorama to convey an immersive sense of place, there has been considerable effort in developing cameras to record panoramic video. Single-camera designs capable of recording 360 video are simple and easily create seamless panoramas, but have the disadvantage of not scaling in resolution. Higher resolution requires rigs consisting of multiple individual cameras, whose overlapping images are fused together to form the final panorama. The challenges for these multiple-camera systems involve minimising the inevitable parallax errors [reference 1] and achieving a consistent colour grade; how well these are handled determines how seamless the panorama appears.

Figure 1 shows a single time snapshot from four cameras mounted on Santa Ynez North Peak on 8 July 2017. Each camera has a field of view (FOV) greater than 90 degrees horizontally, so together the four capture everything visible (within a limited vertical FOV) from the position of the cameras. While these four images capture the entire horizontal field of view, as they stand they present a distorted and disconnected view, a form not directly useful for current virtual reality (VR) experiences. A frequently used approach to forming suitable panoramas is to use machine vision to find feature points between the images, that is, points on the same object visible in adjacent images; a typical pipeline is sketched below. Once these points are found, the images can be warped and blended together, aligning the feature points to form the panorama. This approach can introduce spatial distortion and is unsuited to this case because there is insufficient overlap to find enough reliable feature points.

Figure 1
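
For illustration, a minimal sketch of such a feature-matching pipeline, assuming OpenCV's ORB detector and brute-force matcher; the library choice and the image filenames are hypothetical, not taken from the article:

    import cv2

    # Two adjacent camera images (hypothetical filenames).
    left = cv2.imread("camera1.jpg", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("camera2.jpg", cv2.IMREAD_GRAYSCALE)

    # Detect feature points and descriptors in each image.
    orb = cv2.ORB_create(nfeatures=2000)
    kp_left, des_left = orb.detectAndCompute(left, None)
    kp_right, des_right = orb.detectAndCompute(right, None)

    # Brute-force Hamming matching; cross-checking discards
    # one-sided correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_left, des_right),
                     key=lambda m: m.distance)

    # Keep only confident matches. With lenses barely wider than
    # 90 degrees the shared region is a thin strip, so in a scene
    # like this too few reliable matches survive for warping.
    good = [m for m in matches if m.distance < 40]
    print(len(good), "candidate feature correspondences")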

The approach used here is to model the key optical and geometric properties of the camera system, including the field of view of the fisheye lenses and the camera orientations. Armed with this parameterised model, one can project the fisheye images onto the geometry required for the final panorama: onto a cylinder for a cylindrical panorama, or, as in this case, onto a sphere to form an equirectangular projection (a sketch of this mapping follows figure 2). Figure 2 shows the equirectangular projection corresponding to the four images in figure 1. The equirectangular projection is the most commonly used format for VR applications; it extends 360 degrees horizontally and 180 degrees vertically. Each fisheye image is responsible for exactly one quarter of this image horizontally. Since the lenses cover slightly more than 90 degrees, the overlap between adjacent quarters is used to blend the images together, reducing the visibility of the seams.

Figure 2
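
A minimal sketch of the mapping, assuming an ideal equidistant fisheye lens (radius on the sensor proportional to the angle from the optical axis); the FOV value and image sizes are illustrative, not the measured camera parameters:

    import numpy as np

    def fisheye_to_equirect(fisheye, out_w, out_h, fov_deg=100.0):
        h, w = fisheye.shape[:2]
        fov = np.radians(fov_deg)

        # Longitude and latitude of every pixel in the output panorama.
        lon = (np.arange(out_w) / out_w - 0.5) * 2 * np.pi
        lat = (0.5 - np.arange(out_h) / out_h) * np.pi
        lon, lat = np.meshgrid(lon, lat)

        # Unit direction vectors on the sphere; the camera looks along +x.
        x = np.cos(lat) * np.cos(lon)
        y = np.cos(lat) * np.sin(lon)
        z = np.sin(lat)

        # Angle from the optical axis, and angle around it.
        theta = np.arccos(np.clip(x, -1.0, 1.0))
        phi = np.arctan2(z, y)

        # Equidistant fisheye: sensor radius grows linearly with theta.
        r = theta / (fov / 2) * (w / 2)
        u = (w / 2 + r * np.cos(phi)).astype(int)
        v = (h / 2 - r * np.sin(phi)).astype(int)

        # Copy only the directions the lens actually sees.
        ok = (theta < fov / 2) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        out = np.zeros((out_h, out_w) + fisheye.shape[2:], fisheye.dtype)
        out[ok] = fisheye[v[ok], u[ok]]
        return out

Running this for each of the four cameras, with the direction vectors first rotated by that camera's heading (0, 90, 180 and 270 degrees), fills the four quarters; where adjacent quarters overlap, the pixels are cross-faded to hide the seam.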

The missing piece is how to measure the optical and geometric properties of each camera, noting that the author did not have access to the cameras and that commodity optical elements often vary by around 5%. An interactive application was developed that allows the operator to manually adjust the important parameters and get immediate feedback on how they affect the alignment of features in the scene.

Figure 3 is a screen dump of this application. Using the mouse and keyboard, one can adjust the lens FOV, the position of the fisheye circle on the camera sensor, and the orientation of the cameras; for example, each of these cameras is tilted downwards by about 9 degrees. Once the desired alignment is achieved, the parameters are saved to a file and used in the subsequent processing of frames (a sketch of this parameter set follows figure 3). It should be noted that due to fundamental parallax issues it is not possible to achieve perfect alignment across all depths, only at a single depth [reference 1]; the goal here was a good alignment for distant objects. Additionally, these cameras are not colour calibrated and operate in automatic exposure mode, so colour differences between the quarters are to be expected.

Figure 3
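
A sketch of the kind of parameterised camera model such a tool adjusts; the parameter list (FOV, fisheye centre on the sensor, orientation) and the roughly 9 degree downward tilt come from the article, while the structure and names are illustrative:

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class CameraModel:
        fov_deg: float = 100.0   # fisheye lens field of view
        cx: float = 0.0          # fisheye centre offset on sensor (pixels)
        cy: float = 0.0
        yaw_deg: float = 0.0     # heading: 0, 90, 180 or 270 here
        pitch_deg: float = -9.0  # these cameras tilt down by about 9 degrees
        roll_deg: float = 0.0

        def rotation(self):
            """Camera orientation composed from yaw, pitch and roll."""
            y, p, r = np.radians([self.yaw_deg, self.pitch_deg, self.roll_deg])
            rz = np.array([[np.cos(y), -np.sin(y), 0],
                           [np.sin(y),  np.cos(y), 0],
                           [0, 0, 1]])
            ry = np.array([[ np.cos(p), 0, np.sin(p)],
                           [0, 1, 0],
                           [-np.sin(p), 0, np.cos(p)]])
            rx = np.array([[1, 0, 0],
                           [0, np.cos(r), -np.sin(r)],
                           [0, np.sin(r),  np.cos(r)]])
            return rz @ ry @ rx

    # During projection, panorama direction vectors are expressed in each
    # camera's frame before the lens mapping is applied. When the operator
    # nudges a parameter, only this small model changes, so the preview
    # panorama can be regenerated immediately.
    cam = CameraModel(yaw_deg=90.0)
    directions = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # world space
    in_camera = directions @ cam.rotation()  # rows rotated into camera frame

Once the operator is satisfied with the alignment, these few numbers per camera are all that need to be saved and reused for every subsequent frame.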

The advantages of 360 panorama images and video are many. If the panorama is laid out flat, as in figure 2, nothing is hidden from view and the local distortion is minimal; one need only keep in mind that the left and right edges are connected. A navigable perspective view can be presented online [reference 2], removing the distortion entirely while allowing the operator to look around in real time (sketched below). A number of surround presentation devices have been built to give a sense of immersion without the need to wear anything, while providing a more social experience; examples include the iDome [reference 3] and the AVIE cylinders [reference 4]. When viewed within a VR head-mounted display one gets an undistorted view while being able to turn around naturally, as one might in real life: the ultimate sense of "being there".
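
As a sketch of what such an online viewer does internally, the following extracts a navigable perspective view from the equirectangular panorama; pan, tilt and zoom simply change the view direction and field of view. The function and its parameters are illustrative, not the viewer's actual implementation:

    import numpy as np

    def perspective_view(equi, out_w, out_h, pan, tilt, fov_deg=60.0):
        H, W = equi.shape[:2]
        # Pinhole focal length for the requested field of view (zoom).
        f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)

        # A ray through every output pixel; the camera looks along +x.
        u = np.arange(out_w) - out_w / 2
        v = out_h / 2 - np.arange(out_h)
        u, v = np.meshgrid(u, v)
        d = np.stack([np.full_like(u, f), -u, v], axis=-1)
        d /= np.linalg.norm(d, axis=-1, keepdims=True)

        # Rotate the rays by the current tilt (about y) then pan (about z).
        cp, sp = np.cos(tilt), np.sin(tilt)
        cy, sy = np.cos(pan), np.sin(pan)
        ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
        rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        d = d @ (rz @ ry).T

        # Convert each ray to longitude/latitude and sample the panorama.
        lon = np.arctan2(d[..., 1], d[..., 0])
        lat = np.arcsin(np.clip(d[..., 2], -1.0, 1.0))
        px = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
        py = ((0.5 - lat / np.pi) * (H - 1)).astype(int)
        return equi[py, px]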


Online references

1. Fundamental parallax issues explained.

http://paulbourke.net/miscellaneous/parallaxerror/

2. Time-lapse sequence of which figure 2 is a single frame.

https://youtu.be/wipj73bIkXY

3. iDome immersive display, presenting a 180 degree view with mouse navigation.

http://paulbourke.net/dome/iDome/

4. AVIE - Advanced Visualisation and Interaction Environment.

http://paulbourke.net/miscellaneous/AVIE.pdf