Depth Will Add Value To Local TV News
Memo to TV newsrooms: the 90-second story just isn’t cutting it on depth anymore. Try doubling it.
While you’re at it, reshuffle your reporting assignments into A and B teams who work on three-minute pieces every other day. The side bonus is your new stories will have a much longer shelf life on time-shifted, on-demand platforms.
Those are among the lessons coming from Frank Mungeam, the newly minted Knight Professor of Practice in TV News Innovation at Arizona State University’s Cronkite School of Journalism.
Prior to ASU, Mungeam was VP of digital audience engagement at Tegna. In his new post, his remit is to help create stories and storytelling formats that commercial broadcasters might otherwise be too constrained — by budget or convention — to tackle themselves.
In an exclusive interview with TVNewsCheck’s Special Projects Editor Michael Depp, Mungeam says newsrooms need an overhaul of conventions like the morning meeting and their daily assignment structures. And he says artificial intelligence is ushering in great tools that can eliminate routine tasks and free journalists up to do what they do best — tell important stories in compelling new ways.
An edited transcript.
One of the things you’ve been working on at ASU is creating more dexterous, “Swiss army knife content,” as you’ve called it. What do you mean by that, and how can it be useful in a multiplatform news environment?
There’s a set of content and formats that are the staples of local broadcast news today — packages, V.O.s, over-the-shoulders, readers — and then there are emerging platforms with platform-specific content that works great. But it’s not cost-effective to create custom content for them because, while the audience is there, the revenue isn’t yet.
Another way to think of this is bridge content. Between those two worlds, what is the overlap? What are the content types and formats that can work and account for the business demands of today while also working for the audience and their demands of tomorrow? How do you future-proof by reverse engineering content that can work in both of those worlds?
Can you give an example of what that might look like?
I’ll give two. There’s a lot local broadcasters could learn from YouTube. We’ve done a number of experiments with explainer videos, and those are great for on-demand viewing on platforms like YouTube. They actually work really well in newscast formats as part of the treatment of a story.
Another simple one that is underutilized is the natural sound package — there is no reporter: no intro, toss, stand-ups or voice-overs. All of the sound is driven by the characters in the story.
The metrics show that TV news packages do not perform well in their digital afterlife. Natural sound packages work great on TV in the newscast format, and they work really well on demand, time-shifted later, on these other video platforms. It’s a format TV reporters and photographers know very well how to produce, but it’s very underused given how well it travels.
Where does your idea of the “superblock” fit into this?
The superblock tries to address all of the current problems and turn them into solutions. YouTube tells its news partners that three to four-plus minutes is the minimum length of an impactful on-demand video. TV treatments, even when in depth, may run 1:30 or two minutes. That’s just not deep enough. The second thing is that TV newscasts tend to be driven by story count, so there’s a lack of both depth and variety of format.
The superblock idea is this: what if you built deep-dive, full segments of newscasts on topics that truly matter to your community, then intentionally injected diverse storytelling techniques to give that block immediate interest for the viewer who’s there at 5 o’clock and coverage interesting enough that you would watch the entire block later on demand?
We did a deep-dive superblock on vaccination rates in Arizona. We only allowed our producing team one conventional package. We created an explainer on the concept of herd immunity, did a deep-dive in-studio interview to complement the reporting and added data visualization elements and an interactive map showing in real time which Arizona counties were at highest and lowest risk of infection. All those elements gave the superblock variety within a single topic.
You’ve described skilled reporters as a newsroom’s most precious resource, but you think their time is not often used to best effect. Describe your notion of dividing journalists into A and B teams, and how that structure might improve their story output.
The premise is that we are all operating in an increasingly crowded and competitive news ecosystem. Those of us in the local news business need a better answer to the question of what unique, differentiated reporting we’re offering. The 90-second package formula is too small a box to get to that level of depth and value.
Anyone who knows a local reporter knows how frustrating their typical daily workflow is. If you’re doing a 90-second day-turn package, you have a morning meeting that eats up 45-plus minutes of your day, you have drive time to the story, you have to get back to the station, you need to write your web version…. There’s a shockingly small amount of time available for the high-value activity — the reporting, the writing, the storytelling. For many local reporters, it’s a frustrating, repetitive exercise.
How do you get deeper reporting without adding resources? Imagine you’re a newsroom with six day-side reporters. You send them out and they each do a 90-second story. That’s nine minutes of content. What if you divided them into A and B teams, and each reporter turned in a three-minute story every other day? You get the same amount of content from the same number of people, but think of all the things you could gain: better access to the best interviews, and depth that makes a story worth time-shifting to see. For news directors, you get retention of your best storytellers, because reporters would have a much more positive experience. I call it stories that count, prioritized over story count.
Seems like a very sensible approach. What’s to keep a newsroom from adopting it?
The biggest obstacle is history and habit. The reporter assigned to that deep-dive story gets reassigned in the middle of the day to the urgency of the hour. There is a discipline and a leap of faith required to decide to let go of some of what we have always chased in the past in order to do stories that will last beyond today. Far too often we react and chase lesser breaking news stories and miss the chance to do stories that have a lifetime value for our audience.
You’ve got some ideas on revamping the daily morning news meeting. What’s a better way of doing it?
Over time, one of the most critical things we do to improve the morning meeting is increase the diversity of our hires, the voices in the newsroom and the deciders in the meeting.
The morning meeting reminds me of my tragic Little League baseball experience as a right fielder: standing there doing nothing most of the time, punctuated by brief bouts of activity. Some local newsrooms are trying to move some of that process out and ahead of the morning meeting.
One of my favorite examples is [Tegna’s] WFAA in Dallas, where they use Slack to get reporter pitches in advance of the meeting. Not only did that move the work out of the meeting, but reporters helped each other out, adding to and elaborating on ideas. By the time they got to the meeting, the pitches were better, and that time wasn’t consumed just going around the horn.
Start with data. Data isn’t just numbers; it represents your audience. The best meetings don’t just present the numbers of what’s trending. They also provide the qualitative answer to why.
Despite many people’s reservations about AI in the newsroom, you’ve got an optimistic outlook on where it can be a highly useful tool in certain workflows. How can you see it helping?
Technology is neither friend nor foe. I see exciting opportunities for newsrooms to automate the boring stuff in order to free up our people to spend their precious, limited time on the activities that are most relevant to quality local journalism.
When I was at Tegna, we did a partnership with True Anthem that helped our local digital teams automate the process of publishing our story links to social. The rote task of pushing our links out isn’t a high value activity; it’s just time consuming.
Another example is automated transcription of video for reporters. Think of the typical reporter and how much of their time goes to transcribing interviews. In just the last two years, the accuracy and speed of voice-to-text transcription have improved dramatically. In three years, I’d be shocked if any reporter still transcribed their interviews by hand.
Specific to local broadcast, there’s another application I’m keeping close tabs on: the whole world of video clipping and video archives. The low-hanging fruit before us is the time spent each day in local TV newsrooms clipping video from the newscasts for the web. That’s a task that begs for automation.
Local TV newsrooms have gold in the basement in the form of decades of video archives. Using AI to automatically ingest, tag and add metadata to those archives, and then search them, would open up a whole new source of content that could be reformatted internally through new expressions like OTT specials, and would also allow stations to license the archives as a source of revenue. From demos I’ve seen, we really are reaching the tipping point when we can unlock the archives.