TVN TECH

ESPN Raises The AR/XR Bar In Bristol

A new ESPN facility in Bristol, Conn., has introduced dramatic new virtual set capabilities for the sports network, taking its production game to the next level. Pictured: Studio X, the SportsCenter set, showing the 45’ by 16’ LED “depth wall” that supports dynamic virtual production elements.

Cable sports giant ESPN is making expanded use of augmented reality (AR), extended reality (XR) and virtual production technologies at its Bristol, Conn., headquarters with the creation of a new virtual production studio, “Catalyst Stage,” and significant enhancements to Studio X, home of its flagship SportsCenter program.

The “immersive” Catalyst Stage in Studio C, which went live in late June after two years of development, employs production techniques similar to those used on the popular Disney+ sci-fi series The Mandalorian, blending physical and virtual elements to capture talent in realistic 3D environments. The new studio, located in a relatively compact space repurposed from the original HD SportsCenter set, uses high-resolution Roe Visual LED displays for the background wall and floor to create a virtual production “volume” measuring roughly 30 feet wide by 15 feet tall by 15 feet deep.

ESPN has been using Catalyst Stage to blend virtual and physical elements for segments like “Gear Up.”

The LEDs display photorealistic 3D graphics rendered by Epic Games’ Unreal Engine, as well as video and still images, and are used in combination with sophisticated camera tracking technology and effects software from vendors Disguise, Pixotope and GhostFrame. Between the wall and the floor there are 11 million pixels in total, refreshing 7,600 times per second. They are fed by 26 real-time servers (19 Disguise, seven Unreal) that pump data at aggregate rates of up to 128 GB per second.
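That pixel count is easy to sanity-check. A rough Python sketch, assuming a 2.8 mm pixel pitch (a common choice for LED volumes; ESPN has not specified Catalyst’s actual pitch):

```python
# Back-of-the-envelope check on the quoted pixel count. The pixel
# pitch is an assumption; 2.8 mm is typical for virtual production walls.
FT_TO_M = 0.3048
wall_area = (30 * FT_TO_M) * (15 * FT_TO_M)   # 30' x 15' background wall, m^2
floor_area = (30 * FT_TO_M) * (15 * FT_TO_M)  # 30' x 15' floor, m^2

pitch_mm = 2.8                                # assumed pixel pitch
pixels_per_m2 = (1000 / pitch_mm) ** 2        # ~127,600 pixels per m^2

total_pixels = (wall_area + floor_area) * pixels_per_m2
print(f"{total_pixels / 1e6:.1f} million pixels")  # ~10.7 million,
# in line with the quoted 11 million between wall and floor
```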

GhostFrame’s “MultiSource” technology also allows up to four different video feeds to be sent simultaneously to the LED wall and captured by individual cameras, while the naked eye sees only one feed. So, a segment can be shot once and yield four different versions, each with a different virtual background.
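Conceptually, the wall cycles through its sources in rapid time slots, with each camera’s shutter genlocked to a single slot; the integrating human eye sees one image while each camera records only its own. A simplified Python sketch of that scheduling idea (feed and camera names here are hypothetical, and the real system also manages genlock timing and hidden chroma frames):

```python
# Simplified sketch of frame multiplexing: the wall cycles through N
# source feeds in repeating time slots, and each camera's shutter is
# synced to exactly one slot. Feed/camera names are hypothetical.
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    slot: int  # the time slot this camera's shutter is genlocked to

FEEDS = ["stadium", "studio_twin", "hilltop", "chroma_key"]  # 4 sources
CAMERAS = [Camera(f"cam{i + 1}", i) for i in range(len(FEEDS))]

def feed_on_wall(time_slot: int) -> str:
    """Which source the LED wall shows during a given time slot."""
    return FEEDS[time_slot % len(FEEDS)]

# One physical take, four recorded versions: each camera exposes only
# during its own slot, so it sees only "its" background.
for cam in CAMERAS:
    print(cam.name, "records the", feed_on_wall(cam.slot), "background")
```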

The end result of all this XR technology is the ability to place talent in any location, whether it is a “digital twin” of an existing physical set in Bristol, an imaginary studio on top of a scenic hilltop, or the middle of a playing field. Live video feeds from anywhere in the ESPN network can also be easily pumped to virtual monitors within the XR environment, allowing talent to physically interact with highlights or conduct interviews with remote guests. With its “digital twin” capability, Catalyst can also act as an extension of existing physical studios in Bristol when production requirements warrant it.

Catalyst Stage currently supports both a 1080p/60 broadcast workflow with four Grass Valley LDX 150 cameras, and a 4K digital cinema workflow at 24 frames per second using an Arri Amira Live camera. All five cameras are robotically controlled by Mark Roberts Motion Control systems. The broadcast workflow is for traditional studio shows, which ESPN hopes to launch from Studio C in the near future. So far, Catalyst has been used this way primarily to create specialty segments like “Gear Up.”

The Arri digital cinema camera in Studio C, dubbed “Wall-E” after the animated film, is equipped with a Mark Roberts Motion Control robotic system and used for commercial and promo work.

The 4K cinematic workflow is aimed at creating promos and advertising content. ESPN recently employed Catalyst to create a series of spots for Inspire Brands, owner of Dunkin’, Arby’s and Buffalo Wild Wings, featuring College GameDay host Rece Davis. The XR spots showed Davis sitting at the real GameDay anchor desk, which was placed on the floor of the volume, and pitching Inspire products from a variety of locations.

“We’re mixing practical with virtual in here, and it seems to do the trick of really selling it [the reality] and making it feel grounded,” says Christiaan Cokas, director – creative automation & studio technology, Disney Entertainment & ESPN Technology. “This is another thing that has been easy for us to do, and flexible for our sales teams. To be able to use our talent is the best part of having it in Bristol.”

Catalyst shoots are controlled by a small team of video shading operators in an adjoining control room. Shaders program and build the camera moves or live-drive the robotics, and handle shading for the cameras, the LEDs and any virtual monitors displayed within the LEDs. Cokas says ESPN is looking to use automation as much as possible to streamline the shaders’ robust workload.

One of the keys to Catalyst’s realism is the use of machine learning (ML)-assisted lipstick cameras situated around the studio that automatically track talent as they move through the volume. The “Talent Track” system, powered by Pixotope, creates “digital mannequins” of talent inside Unreal Engine that are used to generate the shadows and reflections that would exist on a physical set and render them on the LEDs. Put simply, the system creates fake shadows in the virtual environment so talent’s movement through it looks more realistic.

“It’s really tooling to add to the realism,” says Jason Black, manager, creative automation and studio technology, Disney Entertainment & ESPN Technology.
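The geometry behind those fake shadows is simple enough to sketch: once tracking establishes where the talent stands relative to a virtual light, the shadow follows from similar triangles. A toy Python model (Pixotope’s actual mannequins are full 3D proxies rendered in Unreal, not vertical segments):

```python
# Toy model of the shadow geometry: treat the talent as a vertical
# segment and a studio light as a point; the shadow cast on the
# (virtual) floor follows from similar triangles.
import math

def shadow_length(talent_xy, light_xy, light_h, talent_h=1.8):
    """Length of the shadow extending from the talent's feet (meters)."""
    dist = math.hypot(talent_xy[0] - light_xy[0],
                      talent_xy[1] - light_xy[1])
    return dist * talent_h / (light_h - talent_h)

# As the tracked talent walks away from the light, the rendered shadow
# stretches, just as it would on a physical set.
for x in (1.0, 2.0, 3.0):
    print(f"talent at x={x}: shadow {shadow_length((x, 0), (0, 0), 6.0):.2f} m")
```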

Between Studio C, Studio X and Studio Y, a dedicated greenscreen set, ESPN has about 15 virtual environments in Unreal today. It is steadily growing that pool of assets, as Cokas’ team works with Joe Ferretti, director – studio design & development, ESPN, to build new virtual sets to complement the hard sets Ferretti has been building for years.

“We want to be able to have the flexibility to do this anytime, anywhere, on any LED,” Cokas says. “And as our library gets bigger, our flexibility grows.”

Ferretti notes that most of the virtual environments themselves can also be easily tweaked for different shows.

“The hilltop set, we built that so we could change the basketball court to a football field,” Ferretti says. “So, we have that kind of flexibility within those environments to produce content for different sports.”

XR Breathes New Life Into Studio X

In addition to Catalyst Stage, ESPN’s technology teams have been busy over the same period upgrading Studio X, where SportsCenter has been produced since 2014. Along with new robotic cameras, lighting and scenic backgrounds, major enhancements include a 38 million-pixel LED screen on the east wall of the studio, installed last year; a 165-inch interactive touch screen on the west wall, the biggest on the Bristol campus; and a new 45’ by 16’ LED “depth wall” on the north side that supports dynamic virtual production elements.

A view of ESPN’s Catalyst Stage, taken from a mini-production area that shows the blank LED wall and floor on the actual set in the background. (Joe Faraoni/ESPN Images)

All of the improvements, including two LED monitors on the south wall that can be rotated from a horizontal to vertical orientation, were officially unveiled on Sept. 7 during the 2 p.m. edition of SportsCenter.

“We’ve made a lot of additions,” Ferretti says. “Obviously, we wanted to go 360 [degrees] as much as possible in here. On the east wall, instead of using more hard scenery, we used some SEG fabric graphics to do backlighting. So, that whole wall can be color changing as well, just to offer different looks for the different SportsCenters that are in here.”

The giant east wall display of the revamped SportsCenter set is actually six individual Ledman LED displays, each approximately 4’ by 8’, and all run by a Hudson Motion Control automation system.

“The video shading [operator] plays back a cue, and you can do countless combinations,” says Anthony Lacaprucia, ESPN senior manager – production operations. “You can have six individual 4’ by 8’ displays, or you can combine 1 and 2, 3 and 4, 5 and 6, and now you have three 8’ by 8’ displays. Or you combine 1, 2, 3 and 4, 5, 6, and now you have two 12’ by 8’ displays. Or you can pull them all together and it’s 24 feet wide and 8 feet tall.”
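Those groupings fall directly out of the panel arithmetic: any equal split of the six 4-foot-wide panels yields a valid layout, as a quick Python sketch shows:

```python
# The east wall's grouping options: six 4' x 8' panels, driven
# individually or ganged into equal contiguous groups.
PANEL_W_FT, PANEL_H_FT, NUM_PANELS = 4, 8, 6

def layout(group_size: int) -> list[str]:
    """Split the panels into equal contiguous groups; report each size."""
    assert NUM_PANELS % group_size == 0, "groups must tile all six panels"
    return [f"{group_size * PANEL_W_FT}' x {PANEL_H_FT}'"] * (NUM_PANELS // group_size)

for size in (1, 2, 3, 6):
    print(f"groups of {size}:", layout(size))
# groups of 1: six 4' x 8'    groups of 2: three 8' x 8'
# groups of 3: two 12' x 8'   groups of 6: one 24' x 8'
```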

The 165-inch Ledman touchscreen on the studio’s west wall has a high-resolution 1.9 mm pixel pitch. It is surrounded by an infrared frame that allows talent to control it either with their finger or with a wand, as NFL analyst Ryan Clark did on a recent SportsCenter while breaking down a Monday Night Football game.

The touchscreen is flanked by two vertical LED “leg” tiles, which can be used as RGB lighting strips under the control of Madrix software. Or they can be used to display static graphics, such as a representation of wood that seamlessly matches the physical wood border around the west wall.

Anthony Lacaprucia, ESPN’s senior manager – production operations, with one of the Sony hard cameras mounted on a Telemetrics robotic system. Behind the camera is a giant touchscreen. (Glen Dickson photo)

“We’re using these two side walls as digital scenic,” explains Erik Barone, ESPN senior technical ops manager. “That’s real scenic on the inside, and then we just mimicked it on the outside. That’s how good the color reproduction is on these walls with the hi-res, high-pixel count.”

ESPN first experimented with a “depth monitor,” using a moving camera with tracking technology to give viewers the impression of 3D depth, with its coverage of the 2010 FIFA World Cup. Since then, graphics rendering and camera tracking technology have improved dramatically, and the new “depth wall” in Studio X can make the physical space appear to extend well beyond the north wall.
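The principle behind the illusion is camera-tracked perspective: a virtual object “behind” the wall must be drawn where the line from the lens to the object crosses the wall plane, so its on-wall position shifts as the camera moves. A toy 2D Python sketch of that parallax calculation (a simplification of what the tracking data ultimately drives):

```python
# Toy 2D model of the depth illusion: a virtual object sits at z > 0
# "behind" the wall plane (z = 0); the camera is in the studio at z < 0.
# The object must be drawn where the camera-to-object ray crosses the wall.
def draw_position(cam, obj, wall_z=0.0):
    """X coordinate on the wall where the object should be rendered."""
    t = (wall_z - cam[1]) / (obj[1] - cam[1])  # parameter along the ray
    return cam[0] + t * (obj[0] - cam[0])

obj = (1.0, 4.0)  # virtual object 4 m behind the wall
for cam_x in (-2.0, 0.0, 2.0):  # camera dollies across the studio floor
    x = draw_position((cam_x, -6.0), obj)
    print(f"camera at x={cam_x:+.1f} m -> draw object at wall x={x:+.2f} m")
# The drawn position shifts with the camera, reproducing the parallax a
# real opening behind the wall would show. Without tracking, it cannot.
```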

The depth wall’s virtual set environment, initially being used to display a “skydeck” within a large stadium, was designed by Jack Morton Worldwide, which also designed the physical elements in the updated set. NEP subsidiary Halon Entertainment handled the actual build in Unreal (Halon also did the Unreal build for Catalyst Stage’s hilltop set, for which the initial scenic design was also done by Jack Morton).

“Their team designed the look in that to match with the studio,” Ferretti says. “So, it’s an extension of the physical into the virtual world with the same designers.”

The Sony jib camera in front of the blank depth wall. (Glen Dickson photo)

A single Sony hard camera on a jib captures the depth wall images, using Mo-Sys optical tracking. Tracking data is sent to a Vizrt graphics system to create AR effects. Both Disguise XR and Unreal Engine are also used.

The rest of the cameras in Studio X are also Sony units, mounted on Telemetrics robotic systems. Key supporting technology in the control room includes Grass Valley Kayenne K-Frame production switchers and Lawo audio consoles.

To make the new virtual elements in Catalyst Stage and Studio X work smoothly with traditional production workflows across ESPN, Cokas’ team of software programmers has created a proprietary control system called GRACE (Graphic Realtime Automation and Control Environment). The web-based GRACE controls lights, images and video within virtual environments. It also connects with ESPN’s IMS (Image Management System, an image database) as well as VCS (Viz Control System, a proprietary Vizrt graphics user interface developed by Disney Entertainment & ESPN Technology) to give virtual operators access to traditional router sources.
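GRACE’s API is proprietary and ESPN has not published its details, but the general shape of such a web-based control surface is familiar. A purely hypothetical Python sketch of the kind of request a system like this might accept (host, endpoint and fields are invented for illustration):

```python
# Purely hypothetical illustration; GRACE's real API is proprietary and
# not publicly documented. Host, endpoint and fields are invented.
import json
from urllib import request

payload = {
    "environment": "hilltop",        # which Unreal scene to address
    "virtual_monitor": "vm_01",      # a monitor inside that scene
    "source": "router/espn2_clean",  # a traditional router source
    "transition": {"type": "cut"},
}

req = request.Request(
    "http://grace.example.internal/api/v1/cue",  # placeholder host
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# request.urlopen(req)  # left commented: the endpoint is illustrative only
```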

GRACE also gives producers and directors the ability to easily order up virtual elements, such as 3D renderings of player- or shot-tracking data from various sports to incorporate in their shows. ESPN expects that practice will become commonplace in the future.

“With GRACE we’re just getting started,” says Kevin Burroughs, ESPN director – production operations, who noted the significance of using virtual elements on a day-to-day basis in the network’s flagship show.

“The technology being packaged by our technology team, that has the competence and really understands Unreal in a deep way, has allowed us to operationalize this,” Burroughs says. “It’s quite compelling.”

