Broadcasters are finding growing opportunities to use remote integration production, which lets them produce remote broadcasts from their studio after backhauling the camera feeds via fiber or broadband connections. Sports was the first application of this “at-home” technique, but with steadily improving gear and falling latency, stations are looking at expanding it to cover political debates and other local events.

Above: Nexstar’s WCIA Champaign, Ill., uses TVU Networks’ RPS (remote production system) transmission product to stream multiple frame-synched cameras for coverage of remote sporting events.
One of the hot topics at last month’s NAB Show was REMI (remote integration) production, also known as “at-home” production, where a live sports broadcast is produced by backhauling the camera feeds via fiber or broadband connections to a remote studio.
There the show is produced by a director and other production staff, who switch between cameras, add graphics and generate replays, all functions that have traditionally been handled at the sports venue in a remote production truck.
At-home production has been around in some form since the 1996 Summer Olympics in Atlanta, when NBC took the then-novel step of pulling live feeds via fiber back to its New York headquarters to insert graphics and build highlight packages.
NBC used fiber-optic links from AT&T to create a “virtual IBC,” linking production personnel and equipment at the International Broadcast Center (IBC) in Atlanta and 30 Rockefeller Plaza in New York. That way, the network could avoid building a huge dedicated production facility in Atlanta, while improving existing graphics and editing systems in New York.
Since then, NBC has significantly expanded its use of at-home production techniques for its Winter and Summer Olympics coverage, producing streaming and cable-network coverage of many sports by backhauling live feeds to production centers in New York and, more recently, Stamford, Conn., where commentators and producers are stationed.
Eurosport, the Discovery-owned European cable sports giant, also relied heavily on at-home production for its coverage of the 2018 Winter Olympics from PyeongChang, South Korea, using a mix of seven studios in Korea and 10 studios back in Europe to deliver coverage in 21 languages to 48 countries.
With the huge number of Olympic sports that are being covered through various distribution channels today, employing at-home production workflows is almost a necessity for Olympics rightsholders like NBC and Eurosport, said Boromy Ung, chief product officer for ChyronHego.
“You don’t have a choice, really,” Ung said. “You’re not getting more space at IBC, and the amount of production you need to do is growing much faster than the real estate you’re being given at the IBC.”
While the Olympics may have been at-home production’s early proving ground, major sports broadcasters like ESPN and Fox, as well as newer players like Pac-12 Networks, have adopted at-home techniques on an everyday basis as a way to both lower costs and make their existing staff more efficient. Big drivers have been lower costs for fiber capacity, particularly in the “last mile,” and improvements in IP networking technology.
Keith Buckley, CEO of The Switch, which provides high-bandwidth, low-latency fiber links to 180 sports venues across the U.S., has seen a major uptick in at-home productions in the past three years. While it was initially pursued only by high-end customers, at-home production is now being used by customers across the spectrum, including lower-tier sports, Buckley said.
In addition to cutting down on operational and travel costs by deploying fewer people to remote venues, at-home also results in other efficiencies, such as a single director being able to produce multiple games in one day.
“Everybody’s dabbling in it in some form or fashion, as they all have applications in their programming lineup that are perfectly suited to it,” said Buckley.
The Switch’s DTM (dynamic synchronous transfer mode) fiber links typically introduce one frame of latency across the U.S., though Buckley noted that latency isn’t as important as making sure each camera feed is perfectly synched with the others. The Switch generally deploys dual 10-gigabit-per-second paths into a venue, with a standard camera path running around 150 megabits per second using JPEG 2000 compression.
“Out of any pro sports venue we have, we’ll allow customers to take a 10-gig [path] and divide it up as they see fit,” he said. “They could do 14 cameras down that path, and there’ll be plenty of room. But typically they’re using 4 to 5 gigs.”
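To put those numbers in perspective, a rough capacity calculation (using the figures Buckley cites, plus an assumed allowance for network overhead) shows why a 14-camera show leaves plenty of room on a single 10-gig path:

```python
# Rough capacity math for a venue fiber path, using the figures cited above:
# a 10 Gbps path and ~150 Mbps per JPEG 2000-compressed camera feed.
# The overhead allowance is an illustrative assumption.

PATH_GBPS = 10.0      # one of the dual 10-gig paths into the venue
CAMERA_MBPS = 150.0   # approximate JPEG 2000 contribution rate per camera
OVERHEAD = 0.10       # assumed allowance for IP/FEC overhead and headroom

usable_mbps = PATH_GBPS * 1000 * (1 - OVERHEAD)
print(f"Camera feeds that fit on one 10G path: {int(usable_mbps // CAMERA_MBPS)}")  # -> 60

# A 14-camera show, as in Buckley's example, uses only a fraction of the path.
show_mbps = 14 * CAMERA_MBPS
print(f"14 cameras consume ~{show_mbps / 1000:.1f} Gbps of 10 Gbps")  # -> ~2.1 Gbps
```

Even with audio, comms and data circuits layered on top, that sits comfortably inside the 4 to 5 gigabits customers typically draw.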
Improvements in network latency have also made it easier to control IP-connected devices remotely, which means doing an at-home production doesn’t necessarily mean getting rid of mobile trucks.
ESPN, for example, has adopted a hybrid at-home model for its Major League Baseball coverage (except for games in Mexico and Puerto Rico), where it still rolls a typical complement of mobile trucks but devices like Vizrt graphics and EVS replay systems are controlled remotely by operators in Bristol, Conn.
That means a smaller core team of production staff is required on site, which cuts down on travel costs without affecting the quality of the production.
“For our regular games, we’re using this approach,” said Phil Orlins, senior coordinating producer at ESPN. “There are all different flavors of how to tackle economic improvements, and for now, this is one I feel very good about on the content side.”
NEP Australia has taken at-home production to the next level with its “Andrews Hub” project. To handle all of Fox Sports Australia’s remote production needs, the mobile production (or OB, for outside broadcast) vendor has created IP-based production centers in Sydney and Melbourne that link via Telstra’s fiber-optic network to 29 sports venues spread across Australia, as well as each other.
The 50-gigabit-per-second fiber links allow NEP to connect to its OB trucks parked at the various venues and backhaul uncompressed camera feeds to Sydney and Melbourne, where Fox Sports directors and other production personnel actually produce the games.
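As a rough sanity check on that capacity (assuming roughly 1.5 gigabits per second per uncompressed HD feed, the rate of an HD-SDI signal, plus an illustrative headroom figure; actual ST 2110 flows vary):

```python
# Back-of-the-envelope check on uncompressed backhaul over a 50 Gbps link.
# The per-feed figure and headroom are assumptions; real SMPTE ST 2110 video,
# audio and ancillary essence rates will differ.

LINK_GBPS = 50.0
HD_FEED_GBPS = 1.5   # approx. one uncompressed HD camera feed
HEADROOM = 0.20      # assumed reserve for audio, comms, control and bursts

usable_gbps = LINK_GBPS * (1 - HEADROOM)
print(f"Uncompressed HD feeds per 50G link: {int(usable_gbps // HD_FEED_GBPS)}")
# -> about 26 feeds, enough for a full multi-camera sports complement
```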
As part of the project, NEP created seven IP-based control rooms, four in Sydney and three in Melbourne, and four new IP-based production trucks. It also upgraded seven existing OB trucks with IP capability.
NEP and Fox began testing the Andrews Hub last fall and officially launched with a March 10 broadcast of a soccer game in Perth. At press time they had already used it to produce 46 games. They will produce 250 events through the Andrews Hub in 2018, said Marc Segar, NEP Australia director of technology, with the number likely doubling to 500 in 2019.
Cameras and mics at a game are still operated by onsite staff, and the audio mix is created onsite with Lawo audio consoles in the trucks that are controlled remotely from the Andrews Hubs. But the Sony production switchers and EVS replay units used to produce games are located back at the hubs (though there are backup units in the trucks in case of a network outage), and IP videoconferencing systems are used to allow directors to easily communicate with field crews.
The latency of the Telstra network is an imperceptible 48 milliseconds from Perth to Sydney, a 2,700-mile run, said Segar. And NEP measured a scant 136 milliseconds of roundtrip latency in a test conducted last week between Sydney and Los Angeles, some 8,000 miles away. An EVS operator in Los Angeles was even able to control an EVS server back in Australia.
“We had a production crew based in Sydney, and 30 camera feeds coming back from Los Angeles, and we cut a show,” said Segar. “And it doesn’t feel any different from the Perth event.”
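Those measurements sit close to the physical floor set by light propagation in fiber. A rough comparison, treating the quoted mileages as the fiber path length and assuming light travels at about two-thirds of its vacuum speed in glass (real routes are longer, so real floors are higher):

```python
# Compare the quoted latencies against the propagation floor of fiber.
# Assumes light travels at ~2/3 c in glass and treats the quoted mileages
# as the fiber path length; actual routed paths are longer.

C_KM_PER_MS = 299_792.458 / 1000   # speed of light, km per millisecond
FIBER_FACTOR = 2 / 3               # approx. velocity factor in glass fiber
MILES_TO_KM = 1.609344

def one_way_floor_ms(miles: float) -> float:
    """Minimum one-way propagation delay over the given distance in fiber."""
    return (miles * MILES_TO_KM) / (C_KM_PER_MS * FIBER_FACTOR)

print(f"Perth-Sydney (~2,700 mi) floor: {one_way_floor_ms(2700):.0f} ms one way")
# -> ~22 ms one way; the measured 48 ms leaves room for routing and equipment
print(f"Sydney-LA (~8,000 mi) floor: {2 * one_way_floor_ms(8000):.0f} ms round trip")
# -> ~129 ms round trip, close to the 136 ms NEP measured
```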
The new workflow has halved the number of people Fox Sports needs to fly to produce a typical soccer or rugby game. That results in tremendous savings in both money and time, as top-flight directors no longer need to spend up to two days of travel to produce a one-day event. In fact, said Segar, with the hubs, life as an OB director can become a bit “like an office job”: directors who live in Melbourne and Sydney can sleep at home and then commute to the Hub the next day to produce a game anywhere in Australia.
NEP Australia signed the contract to be Fox Sports’ one-stop shop for mobile production in late 2016, and Segar said the timing was fortuitous: the SMPTE 2110 standard was being finalized just as vendors were racing to support the new specifications and fiber capacity was improving dramatically.
“We literally couldn’t have done this one year before we did do it,” said Segar. “The technology was not there, and the network was not built out.”
Segar was quick to credit vendors like Lawo, EVS and Sony for doing the development work to make the Andrews Hub project a reality. Sony, for example, created ST 2110 input and output boards for its XVS8000 production switchers. Sony has been involved in at-home productions for several years with its sports clients, and at NAB it demonstrated how its switchers can be controlled remotely via IP networking through a fiber link between CNN in Atlanta and the Sony booth.
Sony also recently completed a REMI test with telecommunications company CenturyLink, doing a trans-Atlantic live transmission using the SMPTE 2110 standard. For the test, Sony simulated a live broadcast and switched between cameras in New York and London. The test found a roundtrip latency of 69 milliseconds, equivalent to 2.5 to 3 frames and perfectly viable, said Deon LeCointe, Sony marketing manager for production switchers.
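How many frames a given delay represents depends on the frame rate, and the switcher’s own processing adds to the network number. A rough conversion sketch (the frame rates and the processing allowance below are illustrative assumptions, not figures from the test):

```python
# Convert a roundtrip delay into frame periods at common broadcast frame rates.
# The processing allowance is an illustrative assumption layered on top of the
# 69 ms network figure from the trans-Atlantic test described above.

NETWORK_RTT_MS = 69.0
PROCESSING_MS = 10.0   # assumed switcher/multiviewer processing allowance

for fps in (25.0, 29.97, 50.0, 59.94):
    frame_ms = 1000.0 / fps
    frames = (NETWORK_RTT_MS + PROCESSING_MS) / frame_ms
    print(f"{fps:>5} fps: {frames:.1f} frames of total delay")
# Whether the resulting frame count is acceptable is the "can a technical
# director run the show" question LeCointe describes below.
```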
“There’s operational latency, yes, there’s network latency, and then there’s the law of physics,” noted LeCointe. “At the end of the day, does it create a viable environment for a technical director to run a show? When you switch from Source A to B, does it take too long for the switch to occur in the multiviewer environment?”
The New York to London test showed that the answer was no. More important, he said, the trial proved the ability to “separate” the XVS IP-based switcher between the switcher’s processing in New York and the control surface in London. With the switcher’s resource-sharing capabilities, users can also split one processor into multiple logical switchers, enabling simultaneous production operation from the same hardware.
“Having the ability to do control over IP means the hardware can sit anywhere,” said LeCointe.
While CNN has only tested Sony’s REMI workflows to date, LeCointe said Sony continues to talk to the network and other news organizations about the possibility of incorporating such capability into everyday news production. He said Sony also did a proof-of-concept test with what he called a “call-letter network.”
“If you’re a news organization, the idea is that you can set up a production data center to pool resources,” said LeCointe. “And you can have those resources shared among multiple stations or bureaus. The real estate in New York is way too expensive. You could potentially have the hardware living in Utah, buried in a mountain somewhere, and that would be much cheaper.”
Del Parks, chief technology officer of Sinclair Broadcast Group, said he’s not sure Sinclair will be employing such a data-center concept anytime soon. The real estate in many Sinclair markets isn’t that expensive, he said, and having a control room right next to the set is vital for local news production. But he is intrigued by at-home production in general, and the kind of remote-control capability that NEP Australia is using.
He noted that Tennis Channel, which Sinclair owns, already uses at-home techniques for the vast majority of its programming, with U.S.-based commentators calling live matches held in far-off locales like Dubai.
Parks sees at-home production having great potential for “hyperlocal production,” such as covering high-school football games or town-hall meetings.
“If you’re doing high school football in Altoona, Iowa, you can’t afford to rent a big truck to do that,” said Parks. “You have to size the investment, and the cost to produce compared to the revenue. What you could do is get five LiveU [bonded cellular] packs, hook them up to five cameras, and send five camera people out, an audio guy, and if you want to, also send an announcer. You could stream all that back, feed all five LiveUs into the LiveU server, run it into your switcher, and switch it back at the station. At that point, you’re not renting a $10,000, $8,000 or $5,000 truck. I get excited about the concepts, because there’s a lot of potential.”
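A rough bandwidth sanity check on the model Parks sketches (the per-feed bitrate below is an illustrative assumption for HD bonded-cellular contribution, not a figure from Sinclair or LiveU):

```python
# Rough bandwidth check on the five-camera bonded-cellular setup Parks
# describes. The per-feed bitrate is an illustrative assumption.

CAMERAS = 5
FEED_MBPS = 8.0   # assumed HD contribution bitrate per bonded-cellular pack

aggregate_mbps = CAMERAS * FEED_MBPS
print(f"Aggregate contribution bandwidth: ~{aggregate_mbps:.0f} Mbps")
# Each pack only has to sustain its own ~8 Mbps feed, bonded across several
# cellular carriers, rather than relying on one large backhaul circuit.
```

Because each pack carries only its own feed, no single large circuit is needed at the field, which is part of what makes the approach plausible for high school venues.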
A station that has already pursued that exact model is WCIA, Nexstar’s CBS affiliate in Champaign, Ill., for its coverage of the Illinois Marathon. For last Saturday’s race, WCIA used TVU Networks’ RPS (remote production system) transmission product to stream five cameras, all frame-synched, from its base at Memorial Stadium at the University of Illinois back to its studio, where they were fed into a Ross Video Carbonite switcher.
It is the second year that WCIA has used the TVU RPS product to cover the race, said WCIA Chief Engineer Darren Martin, alongside the TVU backpack units the station uses to follow the race from the road. The station had previously rented a mobile unit that cost around $7,000, but had experienced problems with the IFB systems. It also had to install a microwave antenna at the stadium to transmit video.
By comparison, the RPS unit cost $2,000 for the week and worked flawlessly with WCIA’s Clear-Com LQ headset units, said Martin, with no video synch problems. The RPS unit doesn’t yet have a dedicated way to transmit audio back, he said. So WCIA embedded mic feeds into two of the video channels and decoded them back at the station, which Martin said worked fine.
Connectivity was also not an issue with the TVU RPS, said Martin. WCIA tapped into the university’s high-speed (250 Mbps down/200 up) broadband to transmit the signals and used a Comcast symmetrical 100 Mbps link at the studio to receive the feeds.
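Given those link speeds, a rough budget suggests how five frame-synched feeds fit, with the studio’s 100 Mbps receive link as the binding constraint (the overhead allowance is an assumption, not a figure from WCIA):

```python
# Rough bitrate budget for WCIA's marathon setup: five camera feeds sent over
# the university's ~200 Mbps uplink and received over a symmetric 100 Mbps
# link at the studio. The overhead allowance is an assumption.

UPLINK_MBPS = 200.0
STUDIO_LINK_MBPS = 100.0
FEEDS = 5
OVERHEAD = 0.20   # assumed allowance for FEC, retransmission and comms

bottleneck = min(UPLINK_MBPS, STUDIO_LINK_MBPS)
per_feed = bottleneck * (1 - OVERHEAD) / FEEDS
print(f"Bottleneck link: {bottleneck:.0f} Mbps at the studio")
print(f"Per-camera budget: ~{per_feed:.0f} Mbps")
# -> roughly 16 Mbps per camera, comfortable for compressed HD contribution,
#    including the two feeds carrying the embedded mic audio.
```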
The Illinois Marathon is basically a one-off remote event for WCIA. But based on the success of the at-home workflow, Martin is considering whether WCIA could use a similar scheme to cover other remote events in the future, such as political debates at local universities.
“It’s so much better being able to do everything from our studio,” he said. “We’re using our graphics system and our switcher, and everyone is much more at home being able to use our own stuff.”