NEW FACILITIES INSIDER

Award-Winning ESPN Facility Blazes IP Trail

The sports network’s new Digital Center 2 facility didn’t set out to be a cutting-edge technology standard bearer. The designers of the facility simply were looking for a way to move about 40,000 signals in and around the facility. ESPN’s solution was a trio of technologies: IP networking, JPEG 2000 compression and MPEG transport streams.

Necessity is the mother of invention. Evidence that the aphorism is just as true today as it was in the days of Plato comes from ESPN and its award-winning Digital Center 2 facility.

The 193,000-square-foot DC-2 on ESPN’s campus in Bristol, Conn., is a showpiece of just what IP-based video can do for the TV industry.

But the home to SportsCenter, most of ESPN’s NFL studio coverage, sports news updates and assorted online programming didn’t set out to be a cutting-edge technology standard bearer. The designers of the facility simply were looking for a way to move about 40,000 signals in and around the facility. It would have required installing “many conventional routers” at a “substantial cost,” says Jonathan Pannaman, ESPN VP, content and production systems.

What’s more, that approach would have constrained DC-2, locking it into predetermined signal paths and compounding the problem of signal transport by requiring new baseband infrastructure to be installed to tie it in with other buildings on the campus, he says.

“Using conventional techniques that didn’t scale would have painted us more and more into a corner,” says Pannaman. “We just decided we had to get out of there some way.”

ESPN’s solution was a trio of technologies: IP networking, JPEG 2000 (J2K) compression and MPEG transport streams.


Together, they form the core infrastructure that is capable of handling 30,000 sources and 150,000 separate content streams and that can be expanded to accommodate more as demand grows.

“The whole point was to get to an unlimited number of sources and destinations so that we could build without having to physically plan the exact design of each room, each facility within DC-2,” Pannaman says.

Consisting of five studios, 16 edit suites and six control rooms, with 1,100 miles of fiber optic cable carrying signals throughout, DC-2 earned an IBC Innovation Award in the Content Management category last month for ESPN and its technical partners, Evertz, Vizrt and Arista Networks.

Initial planning for the facility began six years ago, but it wasn’t until 18 months before its June 22, 2014, on-air date that Pannaman and his team decided the facility would be IP-based.

“Evertz and the other vendors had to develop hardware for it,” he says. “I would say 85% of all the equipment in this facility didn’t exist prior to this project. In fact, a lot of it still hasn’t made it into the catalog of products yet.”

“As vendors, we see ESPN as a very progressive customer,” says Mo Goyal, Evertz director of product marketing. “They are one of the most valued assets in the industry — one of the biggest providers of content. I think as vendors, we look at how we can continue to push technology to satisfy their needs.”

A major Evertz contribution was its EXE-VSR, an IP video routing platform, he says. It provides the switching scale ESPN needed to handle the many sources, destinations and signals envisioned.

The 40RU routing platform, which sits at the core of the Evertz 10 gig software-defined video networking solution, can switch about 14,000 uncompressed HD-SDI signals.

However, when JPEG 2000 compression is employed, signal switching capacity grows dramatically, meeting ESPN’s current and future requirements.
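The arithmetic behind that capacity jump can be sketched in a few lines. The bit rates below are illustrative assumptions, not ESPN’s published figures: a nominal 1.485 Gbps for an uncompressed 1.5G HD-SDI signal and an assumed 150 Mbps visually lossless J2K contribution rate on a single 10 GbE port.

```python
# Back-of-the-envelope: how J2K compression multiplies the number of
# HD streams one 10 GbE port can carry. Rates are assumptions for
# illustration, not ESPN's actual operating parameters.

UNCOMPRESSED_HD_SDI_BPS = 1.485e9   # nominal 1.5G HD-SDI rate
J2K_CONTRIBUTION_BPS = 150e6        # assumed J2K contribution rate
PORT_CAPACITY_BPS = 10e9            # one 10 GbE port

uncompressed_per_port = int(PORT_CAPACITY_BPS // UNCOMPRESSED_HD_SDI_BPS)
compressed_per_port = int(PORT_CAPACITY_BPS // J2K_CONTRIBUTION_BPS)

print(f"Uncompressed HD streams per port: {uncompressed_per_port}")
print(f"J2K-compressed HD streams per port: {compressed_per_port}")
```

At those assumed rates, a port that carries six uncompressed HD signals carries 66 compressed ones — roughly an elevenfold increase, which is why compression, not raw port count, sets the practical scale of the router.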

Vizrt, too, developed technology that didn’t previously exist to meet ESPN’s requirements, says company CTO Petter Ole Jakobsen. “We started exploring together with Nvidia a solution that could do eight simultaneous HD displays, which was unprecedented.”

What grew out of that effort was Octoviz, a real-time, live graphics generator that uses a pair of Nvidia Quadro K6000 boards and technology that syncs the two together.

What makes Octoviz unique is its ability to composite and render visuals for eight separate monitors. That scale was instrumental in satisfying ESPN’s need to drive live graphics to more than 120 monitors on the set of SportsCenter, says Jakobsen.

Rather than 30 or more systems, ESPN could use eight, he says. (Not all SportsCenter monitors need to be active at the same time.)

The DVI output from the Octoviz systems is compressed using J2K, packetized and fed into DC-2’s IP network. Before reaching the SportsCenter monitors, it’s converted back to DVI.
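The packetization step in that pipeline follows the standard MPEG transport-stream format: the compressed payload is split into fixed 188-byte packets, each led by a 4-byte header with a 0x47 sync byte, a 13-bit packet identifier (PID) and a 4-bit continuity counter. A minimal sketch of that step, with the PID value and payload chosen for illustration (a real muxer also emits PAT/PMT tables, PES headers and adaptation fields):

```python
# Minimal sketch of MPEG transport-stream packetization: split a
# compressed payload (standing in for one J2K code-stream) into
# fixed 188-byte TS packets. Header flags are simplified; real
# muxers also carry PAT/PMT, PES headers and adaptation fields.

TS_PACKET_SIZE = 188
TS_HEADER_SIZE = 4
TS_SYNC_BYTE = 0x47

def packetize(payload: bytes, pid: int) -> list[bytes]:
    """Split payload into 188-byte TS packets on a single PID."""
    chunk_size = TS_PACKET_SIZE - TS_HEADER_SIZE
    packets = []
    for i, offset in enumerate(range(0, len(payload), chunk_size)):
        chunk = payload[offset:offset + chunk_size]
        header = bytes([
            TS_SYNC_BYTE,
            (pid >> 8) & 0x1F,   # top 5 bits of PID (flag bits left clear)
            pid & 0xFF,          # low 8 bits of PID
            0x10 | (i & 0x0F),   # payload-only flag + continuity counter
        ])
        # Pad the final chunk with 0xFF so every packet is exactly 188 bytes
        packets.append(header + chunk.ljust(chunk_size, b"\xff"))
    return packets

frame = bytes(1000)              # stand-in for one compressed frame
pkts = packetize(frame, pid=0x101)
print(len(pkts), len(pkts[0]))   # 6 packets of 188 bytes each
```

Because every packet is the same fixed size and self-describing via its PID, the IP network can route, switch and multiplex streams without inspecting the video inside — which is what makes the transport stream agnostic to HD, 4K or anything else it carries.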

Committing wholly to IP video switching in January 2013 “was a leap,” says Pannaman. That’s why ESPN also designed in “a Plan B where we would use some more conventional technology.”

“ESPN has parallel IP and SDI infrastructures,” says Goyal, “both running under the [Evertz] Magnum control system.”

From the outset, having the legacy infrastructure in place gave ESPN “a comfort level that there is an SDI core behind there that they are familiar with,” he says. It also makes it easier to migrate operations fully to IP as new IP-based production equipment becomes available.

What gave ESPN enough confidence to tackle IP in the first place was that the core technology modules needed to succeed — specifically J2K encoding, IP networking and MPEG transport streams — were “mostly there,” says Pannaman.

ESPN is confident IP will carry it far into the future. One sign of that IP buy-in is the technology update that is being contemplated for the 12-year-old Digital Center 1, says Pannaman.

“There’s a lot of glue technology bridging HD-SDI equipment like cameras and production switchers into the IP infrastructure [in DC-2],” Pannaman says.

For DC-1, ESPN is starting to request that vendors build Ethernet ports and J2K encoding and decoding directly into their equipment, he says. “If we achieve that, even on a modest scale, we’ll be reducing that glue technology dramatically in future builds.”

ESPN’s approach to the DC-2 design has also positioned it to handle 4K when that becomes a reality.

“I’d like to say that this was incredible technology foresight on my part, but it was a little more luck than judgment,” says Pannaman.

“But when you look at it, you realize that transport stream moves anything you ask it to, and J2K is a compression that has had 4K in its specification from the beginning. We realized we had picked the right compression and the right transport mechanism to move anything around — 4K, 8K, mixtures of 4K and HD, anything.

“This is the most flexible combination we’ve ever seen.”

Arista Networks, a developer of software-driven cloud networking solutions for data center storage, declined to be interviewed for this story.

This is the latest in a series of TVN Tech stories highlighting new television facilities. Read the other installments here. To stay up to date on all things tech, follow Phil Kurz on TVNewsCheck’s Playout tech blog here. And follow him on Twitter: @TVplayout.

