Real-time Augmented Graphics that Blend the Real and Virtual Worlds with Marker-less Camera Tracking

Published in Miscellaneous

Augmented graphics have come a long way in the past decade, from formerly rotund weatherman Al Roker obscuring the first two days of the seven-day forecast to reporter Jessica Yellin channeling a holographic Princess Leia while discussing results of the 2008 presidential election with Wolf Blitzer. Fast forward ten years and the real and virtual worlds are blending into one.

With camera tracking that works from commonly found features in indoor and outdoor environments (e.g., pipes in a ceiling, colors on a rug, specks of dirt on a floor, trees) and acquisition technology capable of capturing the most pristine detail, broadcasters, as well as filmmakers, can create photo-realistic virtual graphics, in real time, that interact naturally with the talent. The presenter can move and occupy physical space alongside the graphic as if the graphic were real. For example, the presenter’s hand doesn’t suddenly disappear when placed in front of the graphic, nor is the graphic cut off by the presenter.
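
Conceptually, marker-less tracking of this kind rests on detecting natural features, matching them between frames, and solving for the camera motion that explains the matches. The Python sketch below illustrates the general technique using the open-source OpenCV library; it is a simplified stand-in rather than Ncam's pipeline, and the intrinsic matrix K is assumed to come from a prior lens calibration.

```python
# A simplified look at marker-less camera tracking with natural features,
# using OpenCV; this is an illustration, not Ncam's proprietary algorithm.
import cv2
import numpy as np

def estimate_camera_motion(prev_gray, curr_gray, K):
    """Estimate relative camera motion between two grayscale frames.

    K is the 3x3 camera intrinsic matrix, assumed known from lens calibration.
    """
    orb = cv2.ORB_create(nfeatures=2000)  # finds corners/texture: pipes, rugs, dirt
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)

    # Match binary descriptors between frames (Hamming distance suits ORB).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # RANSAC rejects outliers such as moving talent, leaving static scene features.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # rotation matrix and translation direction (scale is ambiguous)
```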

“With the combined advances in acquisition and marker-less tracking technology, Augmented Reality (AR) graphics can be an integral part of the program content rather than added merely to showcase a new graphics toy,” said Nic Hatch, CEO, Ncam, makers of the virtual production tool and optical tracking system, Ncam Reality. “And given the graphics can be fully immersed in the real-world environment, audiences have a hard time determining virtual reality from reality. Broadcasters can truly up their storytelling and production value by interacting with their assets and blending these two worlds seamlessly.”  

Real Depth is a feature of the Ncam system that, when used with a green screen, extracts depth data in real time to allow subjects to interact seamlessly with their virtual surroundings, creating a fully immersive environment. Objects respond to the physics of the real world. For graphics to look truly integrated into a real-world environment, the lighting of the graphics must match reality. Ncam’s newly launched Real Light product captures real-world lighting, including its color, intensity, and direction, and renders those qualities onto the graphics in real time. It creates the right shadows and allows objects to react ‘naturally’, adapting to every lighting change.
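
The principle that lets a presenter's hand occlude a graphic is simple to state: for each pixel, whichever layer is closer to the camera wins. The numpy sketch below is a minimal illustration of that idea, assuming aligned color and depth maps for both the live image and the CG render; it is not Ncam's Real Depth implementation.

```python
# Minimal depth-keyed composite: per pixel, the nearer layer wins. This is an
# illustration of the principle, not Ncam's Real Depth implementation.
import numpy as np

def composite_with_depth(real_rgb, real_depth, cg_rgb, cg_depth):
    """real_rgb/cg_rgb: HxWx3 images; real_depth/cg_depth: HxW distances (m)."""
    real_in_front = real_depth < cg_depth            # True where talent occludes CG
    mask = real_in_front[..., np.newaxis]            # broadcast across RGB channels
    return np.where(mask, real_rgb, cg_rgb)

# Example: a presenter 2m from camera stays visible in front of a graphic at 3m.
real = np.full((4, 4, 3), 200, dtype=np.uint8)       # stand-in live image
cg = np.full((4, 4, 3), 50, dtype=np.uint8)          # stand-in CG render
out = composite_with_depth(real, np.full((4, 4), 2.0), cg, np.full((4, 4), 3.0))
assert (out == real).all()                           # real layer wins everywhere
```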

Ncam’s camera tracking device uses a lightweight sensor bar that provides real-time positional tracking data (XYZ translations and rotations) and lens information, including distortion. It tracks natural features in the environment, allowing the camera to move freely in any location, indoors or out, while generating a continuous stream of precise positional and rotational information that feeds graphics engines, including Avid, Brainstorm, ChyronHego, Ross, Ventuz, Vizrt, and more, via Ncam’s software development kit. Ncam systems don’t require calibration or floor and ceiling ‘markers.’ The device can be mounted or hand-held, and initialization takes just one second.
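
That continuous stream can be pictured as one record per video frame. The sketch below shows a hypothetical shape for such a record; the field names and units are illustrative assumptions for this article, not Ncam's actual SDK.

```python
# A hypothetical per-frame tracking record, sketching the kind of data a
# marker-less tracker streams to a render engine; the field names and units
# are illustrative, not taken from Ncam's SDK.
from dataclasses import dataclass

@dataclass
class TrackingSample:
    timecode: str            # e.g. "10:24:31:12", genlocked to the video frame
    x: float                 # camera translation, meters
    y: float
    z: float
    pan: float               # camera rotation, degrees
    tilt: float
    roll: float
    focal_length_mm: float   # zoom position, read from the lens encoder
    focus_distance_m: float  # focus position, read from the lens encoder
    k1: float                # radial lens-distortion coefficients
    k2: float

# One sample per video frame keeps the virtual camera locked to the real one.
sample = TrackingSample("10:24:31:12", 1.20, 1.65, -3.40,
                        12.5, -3.0, 0.1, 24.0, 2.8, -0.031, 0.004)
print(sample)
```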

Critical to the accurate representation of visual information is lens selection. Ncam’s go-to lenses are FUJINON’s Cabrio 14-35mm and 19-90mm, both T2.9. Ncam cites three factors that are crucial to the performance of their system and also greatly reduce set-up time:

  1. Optical accuracy. Center tracking, lack of geometric distortion and edge-to-edge performance are must-haves for optics.
  2. Consistency from lens to lens.
  3. Ease of use. The Cabrio’s detachable digital servo uses 16-bit encoding, ensuring that all lens data output, including the position of the zoom, iris, and focus, is extremely accurate (a sketch of what that resolution means in practice follows this list).
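
To see why 16-bit encoding matters, consider the arithmetic: 65,536 steps across a 19-90mm zoom travel resolve the lens setting to roughly a thousandth of a millimeter. The sketch below shows how raw encoder counts might be mapped onto physical values; the linear mapping and value ranges are illustrative assumptions (real lens mappings are nonlinear and come from per-lens calibration), not FUJINON's actual protocol.

```python
# How a 16-bit encoder count might map to a physical lens setting. The linear
# mapping and ranges are illustrative assumptions (real lens mappings are
# nonlinear and calibrated per lens), not FUJINON's actual protocol.

def decode_counts(raw: int, lo: float, hi: float) -> float:
    """Map a 16-bit encoder reading (0..65535) onto a physical range."""
    if not 0 <= raw <= 0xFFFF:
        raise ValueError("expected a 16-bit encoder value")
    return lo + (raw / 0xFFFF) * (hi - lo)

# 65,536 steps across the 19-90mm zoom travel is roughly 0.001mm per count,
# fine enough to keep CG geometry locked to the glass during a zoom.
focal_mm = decode_counts(32768, lo=19.0, hi=90.0)
print(f"focal length ≈ {focal_mm:.2f}mm")  # mid-travel ≈ 54.50mm
```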

“With our FUJINON lenses, after calibration, we just plug our system into the lens’ servo and we’re ready to track. Of all the options, FUJINON’s encoding solution is the easiest for us,” explained Hatch. “The accuracy of the FUJINON servo that sends focus, iris, and zoom information is unmatched. The more data you have, and the more accurate that data, the more accurate the calibration and the more intertwined the graphics and video will be. Even in graphic scenes with a continuous zoom throughout, our system keeps up with the zoom capture. Operators can zoom in and out and keep the same detail in the real and the virtual world. That simply can’t be done if you have an unreliable lens.”

The compact, lightweight size of the lenses also minimizes the footprint of the camera system. The 19-90mm zoom is only 8.9” long, with a weight of 6.3 lbs. (2.85kg). The 14-35mm Cabrio is just over 9” long and weighs in at 6.4 lbs. (2.9kg).

The focal range, flexibility, and light weight of the Cabrios mean operators can shoot all day with the same lens, saving the considerable downtime traditionally spent swapping lenses. Cabrio zooms cover a 31.5mm-diagonal sensor on digital cinema-style cameras. While sensors on standard broadcast cameras are a standardized size, sensors on digital cine cameras vary; the Cabrios’ large image circle ensures the captured image covers those larger sensors for optimal resolution. A common camera/lens combination for the Ncam system is a Sony PMW-F55 CineAlta 4K camera with a FUJINON Cabrio zoom, although Ncam can work with virtually any camera.
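
Whether a given sensor fits inside a lens's image circle comes down to comparing the sensor diagonal against that circle, as the short sketch below shows. The 31.5mm figure is from the article; the sensor dimensions used are illustrative, not published Sony specifications.

```python
# Does a sensor fit inside a lens's image circle? Compare the diagonal.
# The 31.5mm image circle is from the article; the sensor dimensions below
# are illustrative, not published Sony specifications.
import math

def sensor_is_covered(image_circle_mm: float, width_mm: float, height_mm: float) -> bool:
    return math.hypot(width_mm, height_mm) <= image_circle_mm

# A Super 35-style 4K sensor of roughly 24.9 x 14.0mm has a ~28.6mm diagonal,
# comfortably inside a 31.5mm image circle.
print(sensor_is_covered(31.5, 24.9, 14.0))  # True
```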

Ncam’s solution is currently being deployed mainly in feature films and live broadcast. In the world of motion pictures and high-end drama production, Ncam’s solution gives the crew and cast the ability to preview what the complete visual effects shot looks like—while still on set. The camera operator or director of photography can see a preview of the computer-generated (CG) imagery or visual effects in real time, while actors can see where they’re standing within the virtual world and react accordingly. This translates into fewer mistakes, fewer fixes required in post-production, and an altogether faster post-production process.

The system was deployed recently on Solo: A Star Wars Story, most notably on the fast-paced, CG-heavy train heist sequence.

“Ordinarily, with a blue screen shoot, everyone on set has to imagine what the final composited shot will look like, which makes it hard for actors to react and can result in errors that aren’t picked up until post-vis or post-production,” said Rob Bredow, Visual Effects Supervisor and Co-Producer. “With the Ncam system, we could see the on-set action and the CG background composited together in real time, live on set, which really helped our cast, camera operators, and editors to understand what we were shooting for each of the complicated beats of that sequence. We could see the mountains racing by the train and frame up appropriately; when a corner was coming up, the camera could pan or tilt from the actor’s face up to the corner ahead, and we could intuitively find those shots on the set that simulated being out in the real environment.”

In live sports, Univision Deportes, the sports programming division of Spanish-language broadcaster Univision, used Ncam Reality on a crane for the network’s Deportes Awards last year. A virtual graphic of a soccer player accepting an award was indistinguishable from reality. During the awards telecast, the crane operated without any markers while accommodating all the noise and vibration present during the popular annual event. “The effect was perfectly clean,” said Alexis Salinas, Senior Director, Technical Creative Services at Univision Communications. “All of the camera switching, camera block preparation, AR camera tracking, and final composite went without a single glitch. It was extremely successful.”

While film and broadcast are the more obvious beneficiaries of AR technology, a large growth sector for Ncam’s camera-tracking solution may well lie outside of the entertainment sector. Major car brands are already using Ncam technology to map photo-realistic 3D models of cars on top of the same real car at live launch events, enabling the manufacturer to show 3D cutaways of the car, while maintaining the illusion that everything viewers see in the monitor is real. Mercedes Benz uses Ncam to virtually disassemble a vehicle while showing mechanics the function and location of each part.

“Motion capture and pre-visualization are really coming to the fore with some very compelling applications in diverse sectors,” concluded Hatch. “We’re seeing more AR graphics using a Steadicam during major live sporting events, while outdoor use is on the rise, and I think we’ll see more implementation in youth sports as well as e-sports in the near term. This virtually unlimited potential is what excites us – we create these innovative technologies to help our customers realize their creative visions.”

Ncam Reality has been utilized on notable productions including:


Broadcast Credits

  • Super Bowl XLVIII (Fox Sports)
  • The Open Championship (Sky Sports)
  • Monday Night Football (ESPN)
  • Daytona (Fox Sports)
  • UEFA Champions League (BT Sport)
  • NFC Championship Game (Fox Sports)

Motion Picture Credits

  • Beauty and the Beast (Disney)
  • Solo: A Star Wars Story (Walt Disney Studios)
  • Fantastic Beasts: The Crimes of Grindelwald (Warner Bros)
  • Deadpool 2 (Marvel)
  • Aquaman (Warner Bros)
  • Avengers: Age of Ultron (Marvel)
  • Jupiter Ascending (Warner Bros)

Television Shows

  • Game of Thrones (HBO)
  • The Man in the High Castle (Amazon Studios)
  • Outlander (Sony)

Ncam Reality is available in three configurations for broadcast environments:

  • Ncam Reality PTZ is designed for broadcast setups built around pan, tilt, and zoom moves.
  • Ncam Reality Studio is suited to indoor/studio broadcast, real-time product visualization, and augmented reality.
  • Ncam Reality Event is built for episodic production, broadcast, large events, real-time product visualization, and augmented reality.

You can learn more about Ncam.
