Sinclair is using public cloud-based playout to broadcast a three-hour block of children’s programming loaded across 51 stations. So far, so good, says CTO Del Parks. “It’s been pretty bulletproof.” And it also just may be the future for the whole industry.
Is virtual playout from a public cloud an economical and reliable option for broadcasting master control?
Sinclair Broadcast Group is aiming to find out.
Earlier this month, 51 Sinclair stations began using cloud-based playout to broadcast a three-hour block of children’s programming each morning, with a full commercial load. The stations aired one of four time-zone feeds. (Some 70 other stations receive the block as part of Tribune-MGM’s This TV.)
For the pioneering project, Microsoft Azure is supplying the cloud; Imagine Communications, the software; and LTN Global Communications, the high-speed downstream link to the stations.
So far, so good, says Del Parks, chief technology officer at Sinclair. “It’s been pretty bulletproof. It’s providing us with an opportunity to experience the system and work through some of its idiosyncrasies.”
It’s also a chance to figure out whether it makes financial sense, he adds. “We have done a preliminary cost analysis and it seems to work for us.”
Sinclair launched the kids block last July from its network operating center in Las Vegas using a conventional playout system.
The cloud-based service came online in August and ran in parallel until last month when Sinclair and its tech suppliers were confident the cloud service could go it alone.
Imagine was a likely choice for the project. Sinclair has been a long-time user of Imagine’s OSI traffic system, which is an integral part of the project. And Imagine has unmatched experience in virtual playout through its massive project with Disney.
With Imagine’s help, Disney has virtualized playout in its own private cloud for its cable networks, including the Disney Channel, Disney XD, Disney Junior and Freeform.
“The system that runs at Disney is exactly the same software that we are using with Sinclair and that runs 24/7, 365 days a year,” says Steve Reynolds, Imagine’s chief technology officer.
Reynolds gives the performance of the Sinclair cloud an A. “It’s been rock solid. It turns out that having the compute resources running up in Azure is actually a very robust environment. The guys at Microsoft have done an incredible job of engineering that environment for both stability and up time.”
Imagine is, by no means, the only tech company developing and marketing virtual playout, which is widely seen as the next big thing in network and station automation.
At the NAB Show last April, several companies offered solutions, including Harmonic, Evertz, Grass Valley, Florical Systems, PlayBox Technologies and Pebble Beach Systems.
But, as far as could be determined, no other commercial U.S. broadcaster has gone as far as Sinclair in cloud-based playout. However, two years ago Sony leveraged IP signal transport and the cloud to centralize playout for WGBH Boston and other noncommercial stations.
And working with Crispin, Tegna has launched a cloud-based system for ingesting syndicated programming, processing it and distributing it to the Tegna stations.
Many broadcasters remain wary of public clouds, fearing that they will lose control of the content and expose themselves to cyberattacks.
But Reynolds is reassuring. “The public cloud is actually considerably more secure than private cloud — 100%,” he says. “The public cloud environments are architected, engineered and operated with a level of security that is orders of magnitude more stringent than any private cloud. There are thousands of people who show up to work every day at Microsoft and Google and Amazon thinking about nothing but cloud security.”
From a financial standpoint, virtualization transforms playout from capex to opex. For its kids system, Sinclair doesn’t buy a bunch of hardware; it pays fees for Imagine’s software, LTN’s service and Azure’s capacity.
“One of the big advantages of this whole movement…is that you pay for what you use,” Reynolds says. “Because of that, the utilization is dramatically higher than if you were to try to build out a facility to do this exact same thing on the ground.”
At the heart (or the brain, some would say) of the Sinclair system is Versio, Imagine’s automation software. It does all the hard work, passing information back and forth with the OSI traffic system and creating the playlist.
“It was designed from the ground up to be deployed and operated in a cloud environment,” Reynolds says.
The system is scalable, Reynolds says. To add another station or feed, he says, all Sinclair has to do is “spin up another instance” — that is, copy — of Versio to the cloud and rent a little more space from Azure.
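Neither Imagine’s nor Azure’s provisioning interfaces are detailed in the article; as a rough, hypothetical sketch of the scaling model Reynolds describes — one independent playout instance per feed, added without disturbing the others — it amounts to something like:

```python
# Hypothetical sketch of per-feed scaling: each station or time-zone feed
# gets its own independent playout instance. All names here are
# illustrative; this is not Imagine's or Azure's actual API.

def spin_up_instance(instances: dict, feed_name: str, timezone: str) -> dict:
    """Register a new playout instance for a feed; existing feeds are untouched."""
    if feed_name in instances:
        raise ValueError(f"feed {feed_name!r} already has an instance")
    instances[feed_name] = {"timezone": timezone, "software": "Versio"}
    return instances

feeds: dict = {}
spin_up_instance(feeds, "kids-east", "US/Eastern")
spin_up_instance(feeds, "kids-central", "US/Central")
```

The point of the model is that adding a fifth time-zone feed would be one more call (plus incremental Azure capacity), not a hardware build-out.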
Azure receives the animated shows that go into the block from ColorTime, a media distribution company working for the programmers, Reynolds says.
Once ingested into Azure’s “storage blob,” another piece of the Imagine software, Selenio Flex, kicks in to process each “asset” and make sure it’s in the proper file format.
“It has the ability to read hundreds of different file formats and write hundreds of different file formats. It basically takes something that’s in input format X and transforms it into output format Y.”
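Selenio Flex’s internals are proprietary, but the normalization step it performs — detect an asset’s format and plan a conversion into the house playout format — can be sketched in a few lines. The house format and function names below are assumptions, not Imagine’s actual API:

```python
# Hypothetical sketch of format normalization: decide whether an incoming
# asset needs transcoding into an assumed house playout format.

HOUSE_FORMAT = "mxf"  # assumed house format; illustrative only


def needs_transcode(filename: str) -> bool:
    """Return True if the asset is not already in the house format."""
    return not filename.lower().endswith("." + HOUSE_FORMAT)


def plan_transcode(filename: str) -> str:
    """Return the target filename: unchanged if already conformed,
    otherwise the same stem with the house-format extension."""
    if not needs_transcode(filename):
        return filename
    stem = filename.rsplit(".", 1)[0]
    return stem + "." + HOUSE_FORMAT
```

In the real system the “plan” would drive an actual transcode job; here it only names the output, which is enough to show the input-X-to-output-Y idea.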
Working with Versio and OSI, Reynolds says, each station can assemble its own playlist and traffic log — “what programs have to play at what time, what the break structure is going to look like, what commercials are going to get aired.”
Versio includes a feature called Motion, which checks each station’s playlist against the assets in the cloud, making sure that every program, commercial and promo it calls for is available.
If not, Motion sounds an alarm at Sinclair’s Las Vegas NOC, which monitors everything that goes on in the cloud. “They normally have got a couple of days to remediate that asset problem, which really just means copy the file up to the cloud.”
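The check Motion performs boils down to a set comparison — every playlist entry must have a matching asset in cloud storage, and anything missing becomes an alert for the NOC. A minimal sketch (names are illustrative, not Imagine’s API):

```python
# Toy version of the Motion check: compare a station's playlist against
# the assets actually present in cloud storage and report what's missing.

def missing_assets(playlist: list, cloud_assets: set) -> list:
    """Return the playlist entries with no matching asset in the cloud."""
    return [item for item in playlist if item not in cloud_assets]


playlist = ["cartoon_ep101", "spot_cereal_30s", "promo_fall"]
uploaded = {"cartoon_ep101", "promo_fall"}

# Each missing item would raise an alarm at the NOC; the usual fix is
# simply to copy the file up to the cloud before air time.
alarms = missing_assets(playlist, uploaded)
```

With a couple of days of lead time, the remediation is as mundane as Reynolds describes: upload the missing file.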
For delivery of the custom streams from the cloud to each station, Sinclair turned to LTN, Sinclair’s Baltimore area neighbor, which has been handling distribution of Sinclair’s diginets.
Chris Myers, EVP and chief development officer, says LTN is well suited for the job. Its network functions much as satellite would, distributing high-bit-rate (up to 150 Mbps), high-quality MPEG streams in real time with extremely low latency (250-300 milliseconds) and with no loss in quality.
But the LTN pipe has two great advantages over satellite, Myers says. It’s “a fraction” of the cost and can deliver discrete, customized streams to individual stations — critical in broadcasting where each station has a different programming lineup and commercial load.
LTN accomplishes all this with a combination of proprietary software and IP network redundancy. “We don’t rely on any one Tier One ISP. Our network is a combination of almost all of them.”
The software corrects common IP problems like packet loss and jitter in real time and sniffs out any “abnormalities” in a network and redirects traffic to avoid them, Myers says.
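LTN’s routing software is proprietary, but the core idea Myers describes — continuously measuring each ISP path and steering traffic away from abnormal ones — can be illustrated with a toy path selector. The weighting and stats below are invented for illustration, not LTN’s algorithm:

```python
# Toy illustration of multi-ISP path selection, not LTN's actual algorithm.
# Each candidate path carries live measurements; traffic is steered to the
# healthiest path, so a lossy or jittery ISP is routed around in real time.

def pick_path(paths: dict) -> str:
    """Choose the path with the lowest combined loss/jitter penalty."""

    def penalty(stats: dict) -> float:
        # Weight packet loss heavily (a fraction of a percent matters more
        # than a few milliseconds of jitter); weights are illustrative.
        return stats["loss_pct"] * 100 + stats["jitter_ms"]

    return min(paths, key=lambda name: penalty(paths[name]))


paths = {
    "isp_a": {"loss_pct": 0.0, "jitter_ms": 4.0},
    "isp_b": {"loss_pct": 0.5, "jitter_ms": 2.0},  # lossy: avoid
}
```

Run continuously across many Tier One ISPs, a selector like this is what lets a stream stay in real time without the seconds of buffering that forward error correction schemes add.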
Other IP distribution systems can move low-quality video over the open internet, Myers says. But they do it by “putting all kinds of latency into the feeds — five seconds, 10 seconds, 30 seconds of latency — to do forward error correction.”
That’s OK for Netflix and MLB.com, but not for dynamic, high-throughput broadcast streams that, like Sinclair’s kids block, have to stay in real time, he says.
Gearing up for LTN is not a big deal. It requires only “an appliance,” similar in function to a satellite IRD, that acts as a receiver at each station. Each appliance can support multiple connections.
Myers sees Sinclair as a true tech pioneer that is taking some risk to develop a more economical and more flexible playout mechanism.
“This is the tip of the iceberg,” he says. “You are going to see a lot of people follow what Del [Parks] and Sinclair are doing. I know you are because our phone is ringing off the hook with people wanting to do similar things.”
Like Myers, Reynolds believes playout from the public cloud is the future. Broadcasters just have to get used to the idea, he says.
“You don’t have to look very far for examples of other industries that have entirely migrated toward the cloud, and the bottom line is there is really no reason to build a large physical facility in order to support these kinds of common compute resources,” he says.
Reynolds also points out that putting content in the cloud opens up new possibilities for broadcasters. “You don’t get content to your iPhone in any other way than by connecting to the cloud.”
If everything continues to go well with the kids block, Parks says the next step may be to move playout of its “emerging networks” like Comet and TBD to the cloud.
After that? “I am hopeful that cloud-based playback of traditional linear TV channels is practical,” he says. “There are many issues that would need to still be resolved, most importantly, integration of live news and sporting events, but I wouldn’t rule anything out.”