TVN TECH: NEW FACILITIES INSIDER

CNN Rides The IP Train Into Hudson Yards

The global news giant has built the first major broadcast facility based entirely on the SMPTE 2110 IP networking standard: a software-configurable, expandable plant designed, in the words of Turner broadcast engineering chief Bob Hesskamp, not to be “out of date on Day One.”

Two months in, and it’s full-steam ahead for CNN at its sparkling new headquarters at 30 Hudson Yards in New York City.

Within a 1,296-foot skyscraper situated atop an active railroad yard on the west side of Manhattan, the global news giant has built the first major broadcast facility to be completely based on the SMPTE 2110 IP networking standard.

CNN began moving personnel from its previous midtown Manhattan home at Time Warner Center in March and went on the air from 30 Hudson Yards on May 6, although its engineering team says the transition will not be complete until the end of this year.

The network already has about 1,100 people in the new building, where it occupies multiple floors with roughly 110,000 square feet of technical space. Corporate parent WarnerMedia occupies 1.5 million total square feet in the 100-story-plus building, consolidating its CNN, Turner, HBO and Warner Bros. divisions.

“It’s the first time the WarnerMedia brands are all under one roof,” says Lisa Pedrogo, VP of N.Y. engineering and strategic initiatives for Turner and the project manager for the 30 Hudson Yards project.

She also led construction of CNN’s facility at Time Warner Center and worked on previous CNN builds in London, Los Angeles, Abu Dhabi and Miami.

BRAND CONNECTIONS

CNN has been steadily shifting all of its operations to IP to keep pace with the rapidly changing content business. In addition to building 30 Hudson Yards, it has created a new IP-based master control facility at its Techwood campus in Atlanta and is currently constructing a new IP plant in London, which is scheduled to go live later this month.

“The idea behind going IP was to get everything across CNN’s facilities connected,” says Bob Hesskamp, EVP of broadcast engineering for Turner. “The other reason we did this was we wanted to build a facility that wasn’t out of date on Day One, that was software-configurable, expandable and easier to make changes to.”

The backbone of the 30 Hudson Yards plant is a 25-gigabit-per-second Evertz 2110-compliant routing infrastructure running across 10,000 fiber-optic cables. That represents a major shift in architecture from CNN’s former home at Time Warner Center, which employed some 500,000 copper cables to distribute signals.
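
Some rough arithmetic, offered here as an illustrative sketch rather than CNN’s own engineering math, shows what that 25-gigabit backbone buys in an uncompressed 2110 plant:

```python
# Back-of-the-envelope math for why 25 GbE links suit an ST 2110 plant.
# Figures are illustrative: 1080i/60, 10-bit 4:2:2, active picture only
# (ST 2110-20 carries active video; real flows add RTP/UDP/IP overhead).

WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 20          # 10-bit luma + 10-bit alternating chroma (4:2:2)
FRAMES_PER_SEC = 30          # 1080i/60 = 60 fields/s = 30 frames/s (nominal)

video_bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FRAMES_PER_SEC
print(f"Uncompressed payload: {video_bps / 1e9:.2f} Gb/s per video flow")

LINK_BPS = 25e9              # one 25 GbE port
HEADROOM = 0.8               # keep ~20% headroom for audio, ANC and bursts
flows = int(LINK_BPS * HEADROOM / video_bps)
print(f"~{flows} uncompressed HD flows fit on one 25 GbE link")
```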

Other key technology at 30 Hudson Yards includes Sony cameras and production switchers; Harmonic video playout storage and encoding; Vizrt graphics; Adobe Premiere nonlinear editors; Avid iNews newsroom computer system; Studer audio consoles; Autoscript teleprompters; and Leyard video walls.

The new home of CNN’s Erin Burnett OutFront

Diversified was the primary systems integrator of the technical infrastructure. CNN’s own engineering team handled much of the configuration of the IP routing system, working with embedded engineers from Evertz and other key vendors.

“This technology is the first of its kind,” says Pedrogo. “It’s cutting edge. We’re not the only ones learning here; the vendors are learning, too.”

One big thing that CNN engineers learned about IP is that installing and connecting equipment is only a small piece of the puzzle. Configuring those individual systems to make them work as an intelligent media network is a much bigger job, and one that took longer than expected.

“In an SDI world, the build is somewhat done when you hook everything up and turn things on,” says Hesskamp. “There’s some minor configuration that you have to do.

“In an IP world, the configuration of the routes, the router and all the systems in 2110 is a lot of work and is very complex. And configuration became the bulk of our workload after the facility was basically racked, stacked and wired. It took a lot longer than we anticipated, it was complex and it pushed back some testing.”
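
What that configuration work involves can be sketched in miniature. Every name and multicast address below is hypothetical, and a real plant drives this from a broadcast controller rather than a hand-edited table, but it hints at the bookkeeping a 2110 system demands for each sender and receiver:

```python
# A minimal sketch of the route bookkeeping an ST 2110 plant needs.
# Names and addresses are invented for illustration only.

SENDERS = {
    "studio19y-cam1-video": {"mcast": "239.10.1.1", "port": 5004},
    "studio19y-cam1-audio": {"mcast": "239.10.1.2", "port": 5004},
    "ctrl-a-pgm-video":     {"mcast": "239.20.1.1", "port": 5004},
}

RECEIVERS = {
    "ctrl-a-switcher-in1": "studio19y-cam1-video",
    "ctrl-a-audio-in1":    "studio19y-cam1-audio",
}

def check_collisions(senders):
    """Flag any two senders sharing a multicast group/port pair."""
    seen = {}
    for name, ep in senders.items():
        key = (ep["mcast"], ep["port"])
        if key in seen:
            raise ValueError(f"{name} collides with {seen[key]} on {key}")
        seen[key] = name

check_collisions(SENDERS)
for rx, tx in RECEIVERS.items():
    ep = SENDERS[tx]
    print(f"{rx} <- {tx} @ {ep['mcast']}:{ep['port']}")
```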

With such new technology at 30 Hudson Yards, CNN made a lot of decisions with the ability to “fail over” safely in mind, says Pedrogo.

For redundancy, there are two terminal gear rooms that are mirror images of each other, cooled by a robust HVAC system that keeps the gear at a steady 65 to 70 degrees. All equipment is supported by UPS (uninterruptible power supply) backup systems, with dual power supplies used wherever possible, and CNN designed for continuity by splitting the load of its four studios and control rooms between the two mirrored gear rooms.

“We did a lot of redundant planning and a lot of careful scrutiny of what’s coming in,” Pedrogo says. “Because the reality of it is, 2110 as a standard is still in its infancy. There are still interpretations that have to happen. It’s very hard to get two vendors’ proprietary code to talk to each other without sharing each other’s code, but to work together to figure out ‘How can I get my Evertz thing talking to my Sony thing so we can do TV with it?’ We faced a lot of those things here.”

The new facility has foot-high raised flooring with ample space to add fiber paths underneath, although roughly 10,000 strands of fiber-optic cable already run through the space. By comparison, the old Time Warner Center uptown had 18-inch floors that were “stuffed with copper cable” underneath, says Pedrogo.

An innovative touch is the use of an overhead electric system with movable Starline power boxes that run on tracks in the ceiling and can be moved throughout the space to provide power to any rack of hardware.

CNN’s At This Hour in Studio 19Y

“We didn’t know when we built it exactly where we were going to have density [of some equipment],” says Pedrogo. “With this system we can put extra power where we need it. If we put in a new server that needs a lot more power, we just slide over a new cable.”

Connectivity to other CNN locations is greatly improved at 30 Hudson Yards. At Time Warner Center, CNN maintained a 10-gigabit-per-second OC-192 fiber loop running from Atlanta through Washington to New York, plus a separate bidirectional OC-192 path between New York and Atlanta. The OC-192 connections required dedicated terminal equipment in a gear room at each location, with accompanying maintenance and support.

Now with the help of new corporate parent AT&T, CNN has a 100-gigabit-per-second redundant fiber loop that goes over the same paths without requiring dedicated terminal gear, as AT&T handles those functions and guarantees data delivery.

All of CNN’s fiber pipes now connect to an offsite colocation center (known as a “colo”), so the amount of equipment onsite is minimal. During the move, CNN also used a 40-gig AT&T pipe between Time Warner Center and 30 Hudson Yards to migrate content between the old and new servers.
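
The math behind such a migration is straightforward. The library size below is an assumed figure for illustration, since the volume of content moved isn’t stated, but it shows why a 40-gig pipe turns a move of this kind into a matter of days rather than weeks:

```python
# Rough migration math for a 40 Gb/s pipe between two buildings.
# LIBRARY_TB is a hypothetical archive size chosen for illustration.

LIBRARY_TB = 500            # assumed archive size, terabytes
PIPE_BPS = 40e9             # 40 Gb/s AT&T pipe
EFFICIENCY = 0.7            # sustained throughput after protocol overhead

seconds = LIBRARY_TB * 1e12 * 8 / (PIPE_BPS * EFFICIENCY)
print(f"{LIBRARY_TB} TB moves in about {seconds / 3600:.0f} hours "
      f"({seconds / 86400:.1f} days)")
```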

The improved connectivity facilitates REMI (remote integration) productions, where a control room and personnel in one city produce a live show from a studio in another location. Many CNN shows are still produced in a traditional manner, with the studio and control room in the same location. But CNN also does REMI production on a regular basis, with an anchor often sitting in New York and the production crew, editorial team and control room team situated in Atlanta.

The backbone of CNN’s 30 Hudson Yards plant is an Evertz IP routing infrastructure.

In addition to the high-speed fiber links, low-latency JPEG 2000 compression also helps make REMI productions easy, with a latency of around a single frame, says Matthew Holcombe, SVP, broadcast production engineering for Turner.

“The latency between sites with J2K [JPEG 2000] is imperceptible, frankly, for the production teams,” says Holcombe. “There are no complaints. And it’s consistent. So it’s easy for us to take a mic and a camera from one line, and mix it with a camera from another line. All of that stuff is fixed-delay. So, it makes mixing video and audio super-straightforward for the production team, and it’s a fairly consistent and easy setup for these REMIs between cities.”
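
The fixed-delay property Holcombe describes can be illustrated with a small alignment calculation. The paths and latencies below are hypothetical; the point is that constant, known delays let every source be padded to match the slowest one:

```python
# Sketch of fixed-delay alignment for a REMI production: because each
# J2K path has a constant, known latency, every source can be padded to
# land in the same frame. Paths and latencies are hypothetical.

FRAME_MS = 1000 / 29.97      # one 1080i/60 frame, ~33.4 ms

paths_ms = {
    "nyc-anchor-cam":   1 * FRAME_MS,   # local, one frame through the plant
    "atl-camera-feed":  2 * FRAME_MS,   # J2K encode/decode plus transport
    "atl-return-audio": 2 * FRAME_MS,
}

slowest = max(paths_ms.values())
for name, latency in paths_ms.items():
    pad = slowest - latency
    print(f"{name}: add {pad:.1f} ms so all sources stay in step")
```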

The 30 Hudson Yards plant is configured for 1080i/60 production with an upgrade path to 1080p HDR if required in the future. And almost all of it is done through IP networking, with Evertz gear taking incoming video feeds and encapsulating them in SMPTE 2110.
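
One thing 2110 encapsulation changes is that a single feed ceases to be one signal: video, audio and ancillary data travel as separate elementary flows, each on its own multicast stream. A schematic sketch, with invented addresses:

```python
# In ST 2110 one "feed" becomes separate elementary flows, each its own
# RTP/multicast stream: 2110-20 video, 2110-30 audio, 2110-40 ancillary
# data. Addresses below are hypothetical; a controller assigns them.

from dataclasses import dataclass

@dataclass
class Flow:
    essence: str      # "video", "audio" or "anc"
    standard: str     # the ST 2110 part defining the payload
    mcast: str
    port: int

incoming_feed = [
    Flow("video", "ST 2110-20", "239.30.1.1", 5004),
    Flow("audio", "ST 2110-30", "239.30.1.2", 5004),
    Flow("anc",   "ST 2110-40", "239.30.1.3", 5004),
]

for f in incoming_feed:
    print(f"{f.essence:5s} ({f.standard}) -> {f.mcast}:{f.port}")
```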

For a few products that weren’t yet 2110-compliant, CNN wound up using the SMPTE 2022-6 (Transport of High Bit Rate Media) and 2022-7 (Seamless Protection Switching) standards, with Evertz “glue” converting those signals to 2110.
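
The 2022-7 side of that arrangement is conceptually simple: send identical packets down two paths and let the receiver keep the first copy of each sequence number, so losing either path costs nothing. A toy illustration (a real receiver merges live jitter buffers, not finished lists):

```python
# Toy model of ST 2022-7 seamless protection switching: the receiver
# takes the same RTP stream over two paths and de-duplicates by
# sequence number, so a loss on either path drops no packets.

def seamless_merge(path_a, path_b):
    """Merge two (seq, payload) streams, keeping the first copy of each."""
    delivered = set()
    for seq, payload in sorted(path_a + path_b):
        if seq not in delivered:
            delivered.add(seq)
            yield seq, payload

# Path A loses packets 3 and 4; path B loses packet 1. Together: no loss.
path_a = [(1, "p1"), (2, "p2"), (5, "p5")]
path_b = [(2, "p2"), (3, "p3"), (4, "p4"), (5, "p5")]
print([seq for seq, _ in seamless_merge(path_a, path_b)])  # [1, 2, 3, 4, 5]
```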

“There might be a little bit of SDI glue, for a device here or there that we couldn’t get into IP,” says Michael Koetter, SVP, media technology and development for Turner. “But there’s no SDI subrouter or anything like that hanging around.”

CNN’s general production format is MXF-wrapped MPEG-2 at 35 megabits per second, which CNN likes for its relatively low bit rate, fast editing and rapid file transfers. It is also compatible with Sony XDCAM, CNN’s primary acquisition format in the field.

“We do thousands of file transfers a day across all the locations of CNN, and the transfers move fast,” says Koetter. “It’s about speed and volume. So the lightweight format is key in that respect.”
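
The arithmetic behind that speed is simple. The link rate below is illustrative, but it shows why a 35 Mb/s house format keeps files small relative to the inter-city pipes:

```python
# Why a 35 Mb/s house format keeps transfers quick: the files are small
# relative to the WAN links. The link speed here is an assumption.

BITRATE_BPS = 35e6               # MXF-wrapped MPEG-2 house format
hour_bytes = BITRATE_BPS * 3600 / 8
print(f"One hour of material: ~{hour_bytes / 1e9:.1f} GB")

LINK_BPS = 10e9                  # assume one 10 Gb/s slice of the WAN
secs = hour_bytes * 8 / LINK_BPS
print(f"Transfers site-to-site in ~{secs:.0f} seconds at line rate")
```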

Specialized media processors convert 2110 to an “MPEG XDCAM 35-like transport stream,” explains Koetter, which is then multicast back into the Evertz routing fabric. From there it can be captured by Evertz DreamCatcher production servers and written as an MXF file to Harmonic MediaGrid storage. That is how CNN manages its workflow for feed recording and package editing. With CNN’s IP connectivity between its various locations, an edit bay in New York can easily access content stored in Atlanta or Washington.

CNN predominantly relies on Adobe Premiere for its editing. The network uses Avid’s iNews newsroom computer system as the backbone of its on-air newscast production, while it employs a mix of content management tools to produce its various digital offerings like CNN Go.

CNN’s New Day studio

There are four large, multipurpose studios and four control rooms. Three of the studios are around 2,500 square feet, while the fourth is around 4,000 square feet. Each control room is around 900 square feet with an enclosed audio booth for sound mixing.

There is also one “flash studio” control room to work with a half-dozen smaller “flash studios” that handle single-head shots of CNN anchors setting up live reports from the field.

There is a large open-floor plan newsroom supported by 28 edit bays, including three outfitted as “finishing rooms” with a bigger space and additional seating for producers. Almost all of the desks are configured to allow working while standing or sitting.

CNN’s overall floorplan is designed with efficient adjacencies in mind. The engineering department has a door that goes directly into the terminal gear room, while technical operations staff is located on the same floor as the studios.

Office spaces feature abundant natural light, with many windows offering views of the Hudson River and midtown Manhattan.

“The architects were instructed to design with the idea of bringing the outdoors inside because we’re here for a long day,” says Pedrogo. “Same with engineers, same with IT people. They may have a shift, but, if there’s breaking news, we’re here. And we do a lot of programming throughout the day. We’re almost a 24/7 facility here.”

CNN received strong support from vendors in bringing 30 Hudson Yards on-air, but more work remains, says Hesskamp, such as providing 2110-capable devices like keyers, delays and analog audio converters.

A particular emphasis for CNN going forward is the AMWA-NMOS (Advanced Media Workflow Association-Networked Media Open Specifications) protocol for communicating with and controlling remote equipment. NMOS, which allows two devices to automatically discover each other across an IP network, is expected to greatly streamline setting up IP broadcast infrastructures.

“There was tons of configuration on our end, and we owned that and we learned a lot by doing it ourselves and I think that was the right decision,” says Holcombe.

“At the same time, we need NMOS so we don’t have to manage so many different databases. Building a configuration and IP address system for the router, and for the intercom, and for the switcher, and for the audio board … we really need NMOS to solidify so we can manage this thing more as a system, and not as so many disparate databases across the organization.”
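
What NMOS promises can be sketched concretely. In an IS-04 deployment, a controller queries a central registry instead of keeping its own per-device database. The registry host below is hypothetical; the query path follows the pattern defined in the AMWA IS-04 specification:

```python
# Hedged sketch of NMOS IS-04 discovery: rather than maintaining a
# hand-built address database per device, a controller asks the
# registry what senders and receivers exist on the network.

import json
import urllib.request

REGISTRY = "http://nmos-registry.example.internal"  # hypothetical host

def list_resources(kind):
    """Query the IS-04 registry for 'senders', 'receivers' or 'devices'."""
    url = f"{REGISTRY}/x-nmos/query/v1.3/{kind}"
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)

if __name__ == "__main__":
    for sender in list_resources("senders"):
        print(sender["id"], sender.get("label", ""))
```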

This is the latest in a series of TVN Tech stories highlighting new television facilities.

