Lytro Immerge

PRE-PRODUCTION & PLANNING

Creating content for VR has introduced novel problems for cinematographers that invite innovative solutions. Industry-honed techniques for set design and scene lighting need to be dramatically reassessed in this new medium. Detailed planning and pre-production are critical to success during every stage of production, and the Lytro Immerge system is designed to ease this challenge.

Lytro provides a set of tools to aid in planning a Light Field VR production, along with robust guidance and technical support from the Lytro team, including shooting guides and pre-vis templates. Lytro also provides training on camera operation and supports productions with technical experts, from planning the shoot through mastering and delivery of the final Light Field VR experience.

To support previsualization on-set, the Immerge rig is equipped with a center-mounted witness camera to assist with set planning, lighting planning, composition and blocking. For previewing takes, on-set video from the witness camera can be stitched into a 2D 360° preview for viewing in a VR HMD.

PRODUCTION

Lytro Immerge is the world’s first Light Field virtual reality system for capturing live action content. Intended for use in a studio environment, the system produces a full 360°x180° Light Field by capturing a series of three overlapping 120° wedges.

Each wedge captures raw Light Field data within that wedge's field of view. The raw Light Field data from each wedge is then processed without stitching during post-production to create the Light Field volume that is the basis of the final 360° VR experience. Hundreds of individual cameras are utilized across that 360° volume, resulting in an incredibly high-resolution, dense 1.5-gigapixel Light Field.
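As a rough illustration of how per-camera resolution aggregates into the quoted 1.5-gigapixel figure (the camera count and per-camera resolution below are illustrative assumptions, not published specifications):

```python
# Back-of-envelope: how hundreds of cameras add up to a ~1.5 gigapixel
# Light Field. Camera count and per-camera resolution are illustrative
# assumptions, not Lytro specifications.
def total_light_field_gigapixels(num_cameras: int, megapixels_per_camera: float) -> float:
    """Total captured pixels across all contributing cameras, in gigapixels."""
    return num_cameras * megapixels_per_camera * 1e6 / 1e9

# e.g. ~375 cameras at ~4 MP each would yield ~1.5 GP
print(total_light_field_gigapixels(375, 4.0))  # → 1.5
```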

One of the outputs from processing the raw Light Field data is depth information for every pixel in the captured scene. Many of the capabilities of Light Field VR rely on having accurate depth information; accordingly, small decisions regarding aspects of the set design can affect the quality of this depth information. Producing pixel-accurate depth requires spatially and temporally consistent depth information. To achieve this, the Lytro Immerge production guidelines call for techniques such as Lidar scanning for ground-truth depth of static elements and green-screen capture of dynamic subjects. The better segmentation gained by combining these techniques produces consistent depth data and improves depth accuracy, especially around object edges.
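A minimal sketch of the fusion idea described above, assuming a Lidar-derived depth map for the static set, a per-pixel estimated depth for the whole frame, and a green-screen matte marking dynamic subjects (all array names and values here are hypothetical):

```python
import numpy as np

# Hypothetical depth-fusion sketch: use Lidar ground truth for static
# background pixels, and keep the estimated depth only where the
# green-screen matte marks a dynamic subject.
def fuse_depth(static_lidar_depth: np.ndarray,
               estimated_depth: np.ndarray,
               foreground_matte: np.ndarray) -> np.ndarray:
    """Per-pixel depth: estimate inside the matte, Lidar elsewhere."""
    return np.where(foreground_matte, estimated_depth, static_lidar_depth)

lidar = np.full((2, 2), 5.0)                    # static set is 5 m away
estimate = np.array([[1.2, 5.3], [1.1, 4.9]])   # noisy per-pixel estimate
matte = np.array([[True, False], [True, False]])  # left column = actor
print(fuse_depth(lidar, estimate, matte))
# background pixels snap to the Lidar value; the actor keeps estimated depth
```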

A fundamental challenge in VR production is translating on-set decisions into their visual impact in the final VR experience. The Lytro Immerge system offers an advanced control interface that provides access to all necessary configuration, capture, and playback controls for each of the hundreds of contributing individual cameras. Lytro Immerge also provides fully featured live view and instant playback of captured footage, including output for quick-stitch preview in an HMD of your choice.

Lytro Immerge is designed for Light Field capture and does not capture audio directly. However, the Lytro Immerge VR tools support integration of externally captured audio, including the ability to match the 6DoF visual experience with directional and positional audio playback via the B-format.
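For context, first-order B-format encodes a mono source into four channels (W, X, Y, Z) based on its direction. A minimal sketch using the traditional FuMa convention (this is generic ambisonics, not a Lytro API; signal values are illustrative):

```python
import math

# First-order B-format encoding (traditional FuMa convention) for a mono
# source at a given azimuth/elevation, in radians. This is the kind of
# directional audio representation the player can match to 6DoF visuals.
def encode_b_format(sample: float, azimuth: float, elevation: float):
    """Return (W, X, Y, Z) channel values for one mono sample."""
    w = sample * (1 / math.sqrt(2))                       # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    return w, x, y, z

# A source directly in front at ear level:
print(encode_b_format(1.0, 0.0, 0.0))  # → (0.7071..., 1.0, 0.0, 0.0)
```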

To achieve all the benefits of Light Field virtual reality, Lytro Immerge captures incredibly dense data sets. Included with the system is the capture server, intended to provide enough footage capacity and data throughput to capture all image data recorded during production. In addition, the capture server also provides real-time local backup of all footage, safeguarding precious digital production assets.

POST-PRODUCTION & MASTERING

To take full advantage of the Light Field volume and its benefits, the Lytro Immerge VFX post-production workflow uses many industry-standard processes and tools to perform corrections and creative changes. It utilizes the 16-bit EXR files generated during VR pre-processing as the new source data and allows the VFX team to edit them using existing 2D and 3D pipeline tools. During this stage, VFX artists clean up depth maps, integrate CG assets, adjust lights and colors, and perform any necessary changes prior to the VR Light Field volume rendering stage.

Light Field VR does not require stitching like other forms of live action VR. Rather, it places all objects in the scene in their exact spatial position based on the viewer's point of view without duplicating, deleting or stretching data. This technique is highly reliant on accurately representing the captured 3D space. Accordingly, depth compositing is a vital step of the VFX post-production stage. Note that a typical Lytro Immerge production will require depth compositing to improve consistency and correct specific estimation errors. This touch-up process needs to take place prior to any VFX work and is offered as an optional service by Lytro.

With depth quality in mind, a number of techniques are encouraged during both production and post-production. During production, use of green screen and Lidar greatly assists with the consistency of depth information. During post-production, rotoscoping can be used to identify the boundary pixels of a dynamic subject to improve depth segmentation in situations where green screen did not work well enough or was not feasible.

The Lytro Immerge VR tools support integration of CG elements in post-production by providing a virtual camera that matches the exact planar configuration of the Lytro Immerge rig. The virtual camera is imported into the VFX rendering pipeline, which renders CG elements as a sequence of Light Field-ready images, preserving their full cinematic fidelity, with temporally consistent depth information and view-dependent lighting, all without baking shaders or reducing geometric complexity. CG assets rendered through the pipeline, for example with Maya/V-Ray or Houdini/Mantra, are output as RGBA and Z 16-bit EXR images. These rendered images are ready to be composited into the live-action Light Field content. During this last compositing step, it's possible to modify both the rendered CG asset images and the live-action content images using professional editing suites such as Nuke®.
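The per-pixel Z channel makes a simple depth composite possible: for each pixel, keep whichever source is closer to the camera. A minimal sketch of that idea (real compositing in tools like Nuke also handles alpha, filtering and color management; the arrays below are illustrative):

```python
import numpy as np

# Minimal Z-depth compositing sketch: per pixel, take the rendered CG
# color where its Z is nearer than the live-action plate's Z, otherwise
# keep the plate. Array shapes: RGB is (H, W, 3), Z is (H, W).
def z_composite(live_rgb, live_z, cg_rgb, cg_z):
    closer = cg_z < live_z                    # is the CG pixel in front?
    return np.where(closer[..., None], cg_rgb, live_rgb)

live_rgb = np.zeros((1, 2, 3)); live_z = np.array([[2.0, 2.0]])
cg_rgb = np.ones((1, 2, 3));    cg_z = np.array([[1.0, 3.0]])
print(z_composite(live_rgb, live_z, cg_rgb, cg_z))
# first pixel takes the CG color (z=1 < 2); second keeps the plate
```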

When capturing the full 360° Light Field volume, Lytro Immerge utilizes hundreds of cameras and produces significant amounts of data in the process. This volume of data requires careful management and quality control. The Lytro Immerge VR tools assist VFX artists in propagating their changes across the large number of cameras to help reduce this challenge. Finally, the true measure of a production's quality can only be assessed within a VR HMD. To enable this quality review process, the system includes the Lytro Immerge VR player, which acts as an authoring aid and can be used to review content for QC and dailies.

One key concept to take into account in Light Field productions is the relationship between minimum object distance and the number of cameras required in the post-production stage. The closer objects are to the Light Field volume, the more cameras are required to retain sufficient Light Field density. Once the closest object is more than 1.5 meters from the edge of the Light Field volume, it is possible to reduce that density by cutting the number of cameras by roughly 50%; this step trims down post-production work. To ensure the appropriate balance of quality and data density yielded by trimming, Lytro's experts provide guidance and consultation throughout the process.
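The distance rule above can be sketched as a simple function (the base camera count is an illustrative assumption, not a published figure):

```python
# Sketch of the trimming guideline: beyond a 1.5 m minimum object
# distance from the Light Field volume edge, camera density can be
# roughly halved. The base camera count is an illustrative assumption.
def required_cameras(min_object_distance_m: float, base_cameras: int = 400) -> int:
    """Approximate camera count needed in post-production."""
    if min_object_distance_m > 1.5:
        return base_cameras // 2  # ~50% reduction, per the guideline
    return base_cameras

print(required_cameras(1.0))  # → 400 (close subjects need full density)
print(required_cameras(2.0))  # → 200 (distant scene can be trimmed)
```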


Once post-production is wrapped and the Light Field volume has been rendered, the Lytro Immerge VR tools are used to customize and finalize the Light Field VR experience for the full range of VR platforms. The Lytro Immerge VR player supports 6DoF playback on both the Oculus Rift® and HTC Vive®, and content can additionally be mastered to a variety of 3DoF experiences for other platforms, such as Samsung Gear VR®, Facebook, YouTube and future devices.

The Lytro Immerge system provides tools for tuning how the user will experience the content, allowing aspects of the Light Field volume to be modified. Customization of the visual and control elements that viewers will see and interact with is also possible. This includes changing the behavior of the Light Field volume edges, as viewers reach the furthest extent of that volume. The tools also support the ability to modify the virtual IPD, to influence the perception of scale and stereo effect in the experience.

To support creation of a branded VR experience, the tools can customize visual aspects of the Light Field VR experience, and tailor the look and feel of viewer interaction with the player package.

Once the Light Field VR experience is complete, it's time to prepare it for final packaging and delivery. To help reduce the size of the mastered VR experience, Lytro's Light Field compression algorithms can be put to use and tuned for specific results. To bring down file sizes further, special statistical tracking tools within the Light Field VR player can be utilized to identify and trim portions of the Light Field volume that are less likely to be accessed by seated viewers (highly dependent on the content itself).
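A hypothetical sketch of how such access statistics might drive trimming, assuming per-cell view counts collected from the player for cells of the Light Field volume a seated viewer's head actually occupied (the data layout and keep threshold are assumptions):

```python
import numpy as np

# Hypothetical access-statistics trimming: drop volume cells whose share
# of total recorded accesses falls below a threshold. Cell layout and the
# 1% threshold are illustrative assumptions.
def trim_volume(access_counts: np.ndarray, keep_fraction: float = 0.01) -> np.ndarray:
    """Boolean mask of Light Field volume cells worth keeping."""
    total = access_counts.sum()
    return access_counts / total >= keep_fraction

counts = np.array([500, 480, 15, 5])  # seated viewers rarely reach the last cell
print(trim_volume(counts))            # → [ True  True  True False]
```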

Delivery

After mastering the Light Field VR experience, it's time to get it to the audience. For 6DoF content on Windows® machines with Oculus Rift® and HTC Vive® HMDs, the Light Field volume is packaged together with the Lytro Immerge VR player and the mastered configuration into a player package. Once downloaded to or placed on the desired playback machine, this player package automatically launches the Light Field VR experience.

It is worth noting that Light Field VR experiences require a large amount of free hard drive space on the destination computer. Like most compressed digital media, the size of the packaged Light Field VR experience is highly dependent on the content's amount of motion, detail and overall visual complexity, as well as compression settings and mastering choices. At the highest level, there are three major tiers of experiences that can be produced using Lytro Immerge. These tiers permit the Light Field VR experience to be shared across every level of hardware, from dedicated in-venue systems, to home VR machines and even mobile devices. Each successive tier optimizes the experience for a smaller package size and footprint:

  1. An in-venue experience with sufficient local storage can deliver the full Light Field experience at the highest visual settings and the largest Light Field view volume.
  2. Downloadable Light Field VR experiences can have a smaller viewing volume, trimmed for ergonomic ranges of motion, and can be compressed for faster download times and smaller storage requirements (multiple experiences can be mastered for a variety of situations and needs).
  3. An exported 3DoF version of the Light Field VR experience produces a superior-quality 360° spherical video experience (Omni-Stereo*, regular stereo or mono) and can be compressed significantly more using many third-party tools; these experiences can be easily streamed to various mobile platforms.

*The third tier of delivery includes a unique form of spherical video called Omni-Stereo. This computational variation on spherical video leverages data from the entire Light Field to create a completely stitch-free rendering, designed to be mapped onto a sphere without stretching or warp artifacts, with perfect stereo representation in every direction and angle. This far superior rendering of spherical video can only be produced computationally from a rich set of Light Field data.

As a point of reference, below are the expected data sizes for a single 360° frame in an average 6DoF Light Field VR experience with various Light Field view volume sizes and amount of trimming to produce an optimized Light Field volume for seated viewers.

My Involvement

• Lead UX/UI Designer
• Flow Charts
• Wireframes
• Ridiculous amounts of Prototyping
• Scoping features and planning
• Information Architecture
• Interaction Design
• UI Style Direction
• UI Style Production
• User Testing

Project Overview

Achieve true presence for live-action VR through six degrees of freedom with Lytro Immerge. It is a Light Field-powered solution built to support the entire scope of producing high-quality VR experiences: from planning to production, through post-production, on to mastering, and finally to delivery.

Details

Worked with art direction, producers, game designers and engineers to develop a flow and a process for what was to become Lytro IMMERGE.