
TVN Tech | COVID-Era Content Management Taps Cloud, AI

Technology executives from WarnerMedia, Sinclair and Hearst said at a recent TVNewsCheck webinar that they’re tackling the content management challenge amplified by the pandemic by using cloud storage and leveraging artificial intelligence and machine learning to improve indexing and searching.

The proliferation of affordable acquisition gear and the explosion in IP transport technology mean that broadcasters and cable networks are producing more programming and distributing it on more platforms than ever before. Efficiently managing and fully utilizing all of that content is vital.

The importance of content management has become even more pronounced during the COVID-19 pandemic, as stations and networks have turned to their archives to help fill airtime in the absence of regular news and sports production and relied on content sharing to support distributed workflows.

Broadcasters are tackling the content management challenge by using cloud storage where it makes financial sense and leveraging artificial intelligence (AI) and machine learning (ML) tools to make indexing and searching through archive content via metadata more efficient. At the same time, they are taking advantage of improved compression techniques to lower the bit rates required for high-quality storage and transmission, as well as making some well-educated bets on how much a certain piece of content will get reused.

WarnerMedia is working toward a content management future where a shared taxonomy will be used across its diverse array of programming. For now, the media conglomerate is focused on taking the highest-quality original product into its archive, whether it’s news, TV, film or sports content, and then tagging it with the most accurate metadata possible, said Renard Jenkins, the company’s VP, content transmission and production technology.


“The next thing we look at is [whether] this is something that’s going to have legs for us to bring back and utilize over time,” Jenkins said. “If it is, then we separate that out into a different tier. If we do believe it may be a historical thing that we may bring out now and then, then we do put [it] into a deep archive, but as we do, we make sure that we’re focused on the tagging portion of all of this.”
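In code terms, that tiering decision reduces to a simple routing rule. Here is a minimal Python sketch; the tier names, thresholds and asset fields are invented for illustration and are not WarnerMedia’s actual rules:

```python
# Hypothetical sketch of archive tier assignment based on expected reuse.
# Tier names, thresholds and the Asset fields are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Asset:
    title: str
    expected_reuse_per_year: float   # editorial estimate
    historical_value: bool = False
    tags: dict = field(default_factory=dict)

def assign_tier(asset: Asset) -> str:
    """Route an asset to a storage tier before it enters the archive."""
    if asset.expected_reuse_per_year >= 12:
        return "hot"        # frequently reused: keep on fast storage
    if asset.historical_value:
        return "deep"       # rarely touched, but worth keeping forever
    return "standard"

asset = Asset("2019 ALCS Game 6", expected_reuse_per_year=2,
              historical_value=True, tags={"sport": "MLB"})
print(assign_tier(asset))   # -> "deep"
```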


Jenkins was one of several broadcasters and technology vendors who joined last week’s TVNewsCheck Working Lunch Webinar, “AI, the Cloud and the Future of TV Content Management,” moderated by this reporter.

Sinclair Turns To Sports Archives

During the early days of the COVID-19 pandemic, Sinclair Broadcast Group relied on the archive to keep the lights on for its regional sports networks, which it acquired from Fox in 2019. Sinclair is currently in the process of moving the RSNs from their current playout center in The Woodlands, Texas (a facility now owned by Disney), to a new facility at Encompass in Atlanta.

“The games went away, and we had to fill the air with something,” said Mike Palmer, senior director, advanced technology/media management for Sinclair. “We went to the archive for that and started airing a bunch of classic games.”

“That came at a really interesting time for us, because we were in the middle of migrating away from the previous owner to our new facility that we were designing and trying to project what our usage was,” he added. “It threw all of our statistics way out of kilter, when you’re pulling everything from the archive instead of constantly pushing things into it. So that made things interesting.”


Palmer is currently involved in an enterprise-wide project to streamline content management across the various Sinclair properties, including the local stations, the RSNs (now rebranded as Bally Sports networks) and Tennis Channel, from production all the way to origination.

“We’re working on all these parts of media management at this point, and it is at the core of our business transformation,” Palmer said.

Hearst’s Archive ‘Oasis’

Hearst Television also leaned heavily on its news archives in March 2020 to keep newscasts going amid COVID-19 lockdowns, before the “Zoomasphere” became an everyday acquisition tool, said Joe Addalia, director of technology projects for Hearst Television. For the past eight years, the station group has pushed all of the stories it uses each day to cloud storage, where they can be searched and accessed through Bitcentral Oasis, its content sharing platform.

“This was one of the largest resources throughout the early days of the pandemic, before we got our arms around how we cover things and how to stay safe,” Addalia said.

“We were using this ability to reach into each individual station’s archive as well as share day-and-date assets,” he said. “This was all possible because of the way we do it. We actually archive the entire metadata set and story body from our news editorial system, and it stays with that asset as it gets archived. So we have pretty robust systems that allow for drag and drop of moving a story, and also a video asset will follow.”
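A simple way to picture Hearst’s approach is a sidecar file that carries the full editorial metadata and story body along with the media. The sketch below assumes a JSON sidecar and hypothetical field names; Bitcentral Oasis’s actual internals are not public:

```python
# Illustrative sidecar approach: archive the full editorial metadata and
# story body alongside the video file so both travel together. The field
# names are hypothetical, not Hearst's actual schema.
import json
from pathlib import Path

def archive_story(video_path: str, metadata: dict, story_body: str) -> Path:
    """Write a JSON sidecar next to the media so dragging a story
    between systems carries the video asset's context with it."""
    sidecar = Path(video_path).with_suffix(".json")
    sidecar.write_text(json.dumps({
        "metadata": metadata,          # slug, reporter, air date, rights...
        "story_body": story_body,      # full text from the editorial system
    }, indent=2))
    return sidecar

archive_story("wxyz_flood_pkg.mxf",
              {"slug": "FLOOD-PKG", "station": "WXYZ"},
              "Anchor intro... Package script...")
```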

COVID Revs Up Cloud, AI Workflows


Raoul Cospen, Dalet’s director of product strategy for news, said COVID-19 was an “eye-opener” that accelerated the use of cloud workflows and AI for content management by roughly five years. Dalet is focused on delivering mobility and easy collaboration through its news production and asset management tools. It has developed a new product called Dalet Pyramid, which introduces the “Storytelling 360” approach, a way to organize collaboration at the story level and unite functions previously siloed in different parts of the broadcast plant.

Storytelling 360 starts with an “umbrella story” that links to multiple versions for different platforms and gathers all the objects associated with the story: video, pictures, edit decision lists and graphics. It then provides access to production tools like ingest and editing directly from the story. Finally, it organizes the overall production workflow, including who is going to shoot video in the field, who is going to create graphics and who is going to handle editing.

“So really the organization to track the progress of your story, the editorial features, the cost of your story coverage, and the progress of it,” Cospen said.
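One way to picture that umbrella story model is as a parent record that links platform versions, media objects and crew assignments. The rough Python sketch below is illustrative only; the class and field names are not Dalet Pyramid’s actual data model:

```python
# Rough data-model sketch of an "umbrella story" as described above.
# Class and field names are illustrative, not Dalet Pyramid's actual API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MediaObject:
    kind: str          # "video", "picture", "edl", "graphic"
    uri: str

@dataclass
class StoryVersion:
    platform: str      # "broadcast", "web", "social"
    objects: List[MediaObject] = field(default_factory=list)

@dataclass
class Assignment:
    role: str          # "shoot", "graphics", "edit"
    assignee: str
    done: bool = False

@dataclass
class UmbrellaStory:
    slug: str
    versions: List[StoryVersion] = field(default_factory=list)
    assignments: List[Assignment] = field(default_factory=list)

    def progress(self) -> float:
        """Fraction of production tasks completed, for tracking."""
        if not self.assignments:
            return 0.0
        return sum(a.done for a in self.assignments) / len(self.assignments)
```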

Visibility Into Feeds Is Key

The kind of IP transport provided by LTN Global has contributed to broadcasters’ content management challenge, as it makes it that much easier to access and distribute content feeds, whether they are coming from a traditional sports venue, a cloud production workflow or a newsroom. LTN is helping its customers through tight integration with newsroom computer systems and content management tools, said Rick Young, LTN Global’s SVP and head of global product.

“The key from our perspective is providing visibility to the dizzying array of feeds that are coming into production environments across a countless number of sources and types of formats of content,” Young said. “How do you provide visibility in one place? That’s number one. Number two is how do you notify the users, the folks that need to know, whether it’s in a story-specific workflow or it’s a more general facility level — how do you let folks know that something of interest is coming into the facility?”
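That visibility-and-notification problem can be pictured as a simple publish/subscribe match between incoming feed metadata and registered interests. The toy sketch below invents the feed fields and matching rule for illustration; it is not LTN’s implementation:

```python
# Toy publish/subscribe sketch of feed-arrival notification. Feed fields
# and the matching rule are assumptions for illustration only.
from typing import Callable, List, Tuple

Subscriber = Callable[[dict], None]

class FeedNotifier:
    def __init__(self):
        self._subs: List[Tuple[dict, Subscriber]] = []

    def subscribe(self, interest: dict, callback: Subscriber) -> None:
        """Register interest, e.g. {"sport": "NBA"} or {"slug": "FLOOD"}."""
        self._subs.append((interest, callback))

    def feed_arrived(self, feed: dict) -> None:
        """Called on ingest; notify every subscriber whose interest matches."""
        for interest, callback in self._subs:
            if all(feed.get(k) == v for k, v in interest.items()):
                callback(feed)

notifier = FeedNotifier()
notifier.subscribe({"sport": "NBA"},
                   lambda f: print(f"Producer alert: {f['slug']} is in"))
notifier.feed_arrived({"slug": "NBA-FINALS-CLEAN", "sport": "NBA"})
```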


Extracting accurate metadata helps solve those problems, though Young and other panelists said maintaining metadata throughout the entire production and transmission chain is challenging. For example, Palmer noted that many production systems strip out geospatial information and other metadata generated by cameras.

Human Errors Mar Metadata

AI and ML tools can be used today to automatically generate metadata, particularly when indexing material stored in the cloud, and reduce errors prevalent when tagging is performed solely by human operators.

“The key to it all is, the less human interaction that we have with metadata, the more accurate it is actually going to be,” Jenkins said. “If we can take the metadata from the camera, from the original source, as we go through our transcodes and actually maintain it throughout the process, then it makes that orchestration layer a lot more powerful and a lot more valuable within the system itself.”
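One common way to do what Jenkins describes is to tell the transcoder to copy the source file’s metadata forward. The minimal sketch below uses ffmpeg’s -map_metadata option; the paths and quality settings are illustrative:

```python
# Minimal sketch: transcode to H.265 while copying the source file's
# global metadata forward, so tags survive the generation change.
# Paths and quality settings are illustrative.
import subprocess

def transcode_keep_metadata(src: str, dst: str) -> None:
    subprocess.run([
        "ffmpeg", "-i", src,
        "-map_metadata", "0",        # copy global metadata from input 0
        "-c:v", "libx265",           # HEVC video
        "-crf", "23",
        "-c:a", "copy",              # pass audio through untouched
        dst,
    ], check=True)

transcode_keep_metadata("camera_original.mov", "archive_master.mp4")
```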

WarnerMedia is currently building ML models to improve discoverability of content, with a focus on its archive. Jenkins said the crucial first step is making sure the metadata is clean and accurate.

Several panelists said that commodity AI tools like speech-to-text can be used very effectively today to generate serviceable metadata, with Addalia calling speech-to-text the “low-hanging fruit with AI” and Cospen describing it as a “game-changer” when used with content recommendation engines. Jenkins cautioned that sometimes speech-to-text tools are inaccurate with different accents and identified that as a place where human intervention may still be required.
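As one concrete example of that low-hanging fruit, an open-source model such as OpenAI’s Whisper can turn a clip’s audio into time-coded text that doubles as search metadata. A minimal sketch follows; the sidecar output format is an assumption:

```python
# Minimal sketch: generate time-coded text metadata from a media file
# with OpenAI's open-source Whisper model (pip install openai-whisper).
# The JSON sidecar format below is an assumption for illustration.
import json
import whisper

model = whisper.load_model("base")
result = model.transcribe("archive_clip.mp4")

# Keep per-segment timecodes so a search hit can jump into the clip.
index = [{"start": round(seg["start"], 1),
          "end": round(seg["end"], 1),
          "text": seg["text"].strip()}
         for seg in result["segments"]]
with open("archive_clip.stt.json", "w") as f:
    json.dump(index, f, indent=2)
```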

Pursuing Cost Effectiveness

While broadcasters are making more use of the cloud for archive storage, the cost feasibility of doing so depends directly on how often they expect to access that content. One way some of Dalet’s big station-group customers are cost-effectively taking advantage of the cloud, Cospen said, is to store proxy versions of content in the cloud while keeping high-resolution versions in on-premises storage. That avoids big egress charges for pulling content.

Hearst follows a similar model by storing proxies in the cloud, and sometimes even hosts proxies on low-cost servers within its own wide area network (WAN).


“The proxy is a really good way to avoid those cloud egress charges,” Addalia said. “Because a lot of times what we’ll find is that users just like to look at the video. Well, just to look at the video as opposed to use it, a proxy is perfect for that.”
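The proxy split comes down to a simple resolution rule: viewing requests get the cloud proxy, while edit or conform requests get the on-premises master, so high-bitrate egress happens only when the media is actually used. A sketch with invented locations:

```python
# Sketch of the proxy/hi-res split described above. URLs and paths are
# invented; the point is that only "edit" requests touch the big file.
def resolve_media(asset_id: str, purpose: str) -> str:
    """Return a location for an asset depending on intent.

    'view' -> low-bitrate cloud proxy (cheap to stream, tiny egress)
    'edit' -> full-resolution master on on-premises storage (no egress)
    """
    if purpose == "view":
        return f"https://proxies.example.com/{asset_id}_proxy.mp4"
    if purpose == "edit":
        return f"smb://san.local/masters/{asset_id}.mxf"
    raise ValueError(f"unknown purpose: {purpose!r}")

print(resolve_media("wx-2020-0317-001", "view"))
```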

The Pendulum Of Acceptability

Any discussion of storage costs relates directly to a discussion of bit rates and how much compression is appropriate for a given piece of content. The panelists agreed that maintaining a high-quality “mezzanine” level for future transmission or editing purposes is important, though advanced compression techniques are lowering the required bit rates. Young noted that the last 14 months have “changed dramatically what people think is acceptable,” and it remains to be seen how much the pendulum will shift back.

“Everybody wants the highest quality at the end of the day, but it’s just not always practical,” Young said. “There’s definitely an acceptance of lower bit rates.”

Jenkins said that MPEG-2 or MPEG-4 content encoded at 8 Mbps could likely be encoded at 6 Mbps using H.265 (HEVC) compression and still yield a high-quality 1080i or 1080p version for archiving. Palmer said MPEG-2 content encoded at 50 Mbps could be compressed to 15 to 18 Mbps with H.264 encoding and give equivalent quality. That would yield a “huge amount” of savings in file sizes and cloud costs over the long term, particularly if Sinclair can play out the mezzanine format directly without having to go through another transcode step.

“Because again, in the cloud we’re being charged for every piece of media movement and every transformation that we go through,” Palmer said. “So it’s really important for us to make sure that we have a standard format that we can use from a deep storage tier, and then bring it all the way back out to playout for us.”
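Some back-of-the-envelope math shows why those bit-rate numbers matter for storage bills:

```python
# Back-of-the-envelope storage math for the bit rates quoted above.
def gb_per_hour(mbps: float) -> float:
    """Decimal gigabytes stored per hour of content at a given bit rate."""
    return mbps / 8 * 3600 / 1000   # Mb/s -> MB/s -> MB/hour -> GB/hour

for label, rate in [("MPEG-2 @ 50 Mbps", 50),
                    ("H.264  @ 16 Mbps", 16),
                    ("HEVC   @  6 Mbps", 6)]:
    print(f"{label}: {gb_per_hour(rate):.1f} GB/hour")
# MPEG-2 @ 50 Mbps: 22.5 GB/hour
# H.264  @ 16 Mbps: 7.2 GB/hour   (~68% smaller)
# HEVC   @  6 Mbps: 2.7 GB/hour
```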


Editor’s Note: An earlier version of this story incorrectly referred to Dalet’s Storytelling 360 as a product. It is a facet of Dalet Pyramid.

