Gaming Tech Drives AR/VR Adoption In TV
Advances in video game technology from engine makers like Epic Games and Unity Technologies are driving increased adoption of augmented reality graphics and virtual sets among broadcasters, as the latest gaming engines afford real-time rendering and far more realistic AR graphics and virtual environments than previously achievable.
This level of improvement makes the use of AR and virtual sets viable on an everyday basis, according to broadcast experts who gathered last week for the TVNewsCheck webinar, AR/VR State of the Art, moderated by this reporter.
The Weather Channel started experimenting with AR and virtual sets as a storytelling tool about six years ago, using them to supplement the large flat panel displays and video walls in its Atlanta studios to describe and explain weather events. But the network quickly faced difficulties.
“We recognized almost immediately that what we were doing at the time, which was using a traditional broadcast graphics engine, was not going to be sustainable,” said Mike Chesterfield, senior director of weather presentation for The Weather Channel (TWC). “It was just too hard of a process, mainly due to the rendering that it took to get these things done. The biggest improvement, and really the key to where we are at today, was the emergence of real-time graphics engines.”
TWC uses Epic’s Unreal gaming engine, whose real-time capabilities remove a “major blocker” in the production process, said Chesterfield, and has developed “bridge” applications that link Unreal to traditional broadcast systems and allow the network to play out the photorealistic virtual environments to air [TWC uses Zero Density virtual set software]. The end result has been a much more efficient workflow.
Immersive Mixed Reality
TWC has expanded its use of AR/VR graphics with what it calls Immersive Mixed Reality, which relies on a new virtual set that blends into a conventional set in its studio and allows talent to walk seamlessly between the two environments. As shown in a TWC video clip, a meteorologist can stand in what looks like a conventional studio and have virtual video walls of weather maps or 3D graphics of daily temperatures pop up behind or in front of them, or even under their feet. TWC talent can also be instantly placed on the ground in big cities like New York, Philadelphia or Miami through “Virtual Views,” photorealistic renderings of those locations that can depict current or future conditions.
“It’s powerful enough to handle basically anything we want to throw at it,” Chesterfield said. “Our goal all along has been to create hyperrealism within these experiences. We want for the audience not to realize what’s fake or real, really, so they can place themselves into these situations.
“The real-time functionality is really what took this to the next level, because on a production pipeline we were able to work in real time and make changes on the fly without having to stand down and wait hours and hours for a rendering process to occur,” he said.
Spanish software firm Brainstorm has been in the broadcast virtual production space for almost three decades, dating back to its founding in Madrid in 1993, and its eStudio 3D graphics engine, Aston 2D/3D motion graphics and InfinitySet virtual set products are used by major broadcasters including ESPN, CNBC, the BBC and NFL RedZone. For example, CNBC runs around 7,000 to 8,000 AR and motion graphics daily using Brainstorm’s software, said Ruben Ruiz, EVP of sales for U.S. and Canada for Brainstorm.
The company can work with both the Unreal and Unity engines to generate photorealistic real-time renderings. But it is further ahead in its use of Unreal, whose development has moved faster, Ruiz said.
“We introduced Unreal Engine into our product line probably about four or five years ago,” he said. “Now we have our own engine, and we simultaneously run Unreal Engine under the hood. So, you actually have access to both while you’re running our software.”
Ruiz also shared a clip of Brainstorm’s TeleTransporter solution, which allows live video of geographically dispersed talent to be “teleported” to a virtual set where they can talk and interact as if they were sitting in a real studio. Working in conjunction with Google, Brainstorm has also created a news-specific version of this capability for remote reporters called TelePorter, which runs off a mobile phone application.
“It allows reporters to appear in-studio, so placing reporters together with interviewees, regardless of their physical location,” Ruiz said.
Tool Vs. Toy At CNN
CNN started playing with live augmented reality way back in 2008 for its presidential election coverage, when its “hologram” graphic allowed correspondent Jessica Yellin to magically appear alongside Wolf Blitzer in the network’s New York studio, even though she was covering a rally in Chicago at the time. Since then, the internal narrative about AR, virtual sets or any new production technology has been divided into “tool versus toy,” said Pallavi Reddy, senior director of new media for CNN.
“That’s something my team and I talk about a lot, how we are using technology as a tool versus just kind of a toy,” Reddy said. “Not that there won’t be playful aspects of the technology to be used, to make it more interesting, but the basis of it always has to come down to editorial use cases of it.”
In 2012, CNN developed its first green-screen studio. It soon began to be used on a daily basis for “explanatory journalism,” where Reddy’s graphics and editing team are tasked with turning around a piece in less than six hours, start to finish, with 3D elements that help explain the news of the day.
Since then, the technology has become easier to work with and vendors have become more responsive to broadcasters’ needs, Reddy said. She shared a clip of CNN’s 2020 presidential election coverage from the Iowa caucus, where correspondent Tom Foreman could be seen inside a college gymnasium explaining a bevy of AR graphics displaying real-time voting results that dynamically appeared amidst the rafters, walls and bleachers.
Virtual Production And ‘Extended Reality’
Planar is best known for its LED video walls and large LCD displays that are used by broadcasters like NBC Olympics, Golf Channel and Canal+ for live sports and news production. But the company has also been expanding into “virtual production” and “extended reality” with its Planar Studios arm.
Planar Studios markets the company’s LED video walls as a way to display photorealistic backdrops that live talent can stand in front of or on top of, thus creating the experience of an immersive world that can be captured by digital cinema cameras for film and TV production [this is how the Disney+ hit The Mandalorian was produced, using LED walls instead of a conventional set]. Planar has introduced a new series of LED walls aimed specifically at virtual production, the VX Series, and also offers optical tracking technology for virtual production through its sister company NaturalPoint.
“We’re even seeing people use LEDs as green screens,” said Kathy Skinski, general manager for broadcast, media and virtual production for Planar. “One of the advances there is that they can actually interact with the content directly, so they see the content. It’s kind of a game-changer, and a little easier to work with.”
Training Talent On AR/VR
Learning to work with AR/VR systems isn’t necessarily easy for talent, though Chesterfield said TWC meteorologists’ traditional green-screen training has helped them feel comfortable with the virtual set environment. They have eagerly taken to the network’s AR/VR technology as a way to more readily explain the weather and educate viewers about severe weather threats, such as showing a visualization of a storm surge rising 10 feet above their head during a hurricane or dodging a 3D graphic of flying debris from a tornado.
“For the first time we can now put our talent into situations that we would never otherwise put them in,” Chesterfield said. He added that such visualizations help get people to act, such as evacuating an area facing flood warnings.
To give talent a visual reference, TWC projects the scene onto the green-screen space with every color but green blocked out, so meteorologists can actually see the image when they turn around and look behind them and get a sense of where they are in the scene. Additional monitors placed on the periphery of the studio give them further reference points.
Reddy said that some talent takes more readily to working with AR than others. She described Foreman as a “master class” when it comes to a traditional news journalist presenting in a green-screen environment, speaking to live graphics that he can’t see but looking very comfortable when doing it (she noted that Foreman credits his childhood hobby of magic as being helpful to working with AR).
A six-person team produces CNN’s AR and VR graphics, not only for the network’s broadcast air but also for its digital products. Reddy said that none of them came from technology or graphics backgrounds; instead, all were former news producers. Over time they’ve learned what the AR/VR technology can do, as well as its limitations, and that helps them communicate with other CNN staff about what is possible and what isn’t.
“That’s been our secret sauce to our success,” she said. “It’s not just creating a team that works with LED screens, or our tech operations people or our studio operations people, and … producers who are just going to tell them what to do. It’s also understanding the limits and capabilities of all of these technologies, and the story itself and how that presents out.”