In The ‘Daily Battle’ Against Disinformation, Reinforcements Are Coming

TV newsrooms face a constant and ever-escalating fight against the volume of bad information coming across their transoms, but new tools are on the way to provide help.

In TV news today, identifying manipulated video or any piece of untrustworthy content before it leaks to the airwaves is “a daily battle,” said Emily Stone, VP of digital content operations for Fox Television Stations, during TVNewsCheck’s NewsTECHForum panel “News Technology and Combating Disinformation” on Dec. 14.

“In the last 10 years it’s just exploded, and social media has amplified the problem, so has the pace of the news cycle,” Stone added. “It’s everywhere, everyone’s a journalist these days — no degrees required.”

And because technological developments have only recently scaled this issue up, many in the industry have yet to adopt the proper terminology for such meddlesome sources. To wit, the panel’s moderator, TVNewsCheck Editor Michael Depp, began the conversation by asking Jonathan Forsythe, managing editor of Tegna’s Verify programming, to differentiate “misinformation” from “disinformation,” two terms that are often used interchangeably, incorrectly.

“Misinformation is false or inaccurate information, simply,” Forsythe said. “Disinformation is false or inaccurate information that is intended to deceive, so it’s sent intentionally — that’s propaganda, bad actors, that sort of thing. There’s a big difference.”

Providing an example, Forsythe recounted how, in early November, altered video of California Gov. Gavin Newsom, speaking with what appeared to be partial facial paralysis, sparked an internet theory that he was suffering extreme side effects from a COVID-19 booster shot. The claim was later debunked and seen as a ploy by anti-vaxxers to spread disinformation. In fact, it qualified as misinformation, Forsythe said, because the original producer of the video indicated on Twitter that it was intended as satire.

“The problem is there’s people with an agenda who take that joke and then spread it,” Forsythe said. Like other panel members at several points throughout the discussion, Forsythe summoned the “whack-a-mole” metaphor to describe the chaos in discriminating between authentic and nefarious content across the entire web.

“What we worry most about is watering down our product,” said Dustin Block, audience development lead at Graham Media Group. “Local TV plays a really important role, people turn to us in time of need, and we have people trying to trick us constantly.”

For a company like Graham, which oversees multiple stations with individual news teams producing a series of news shows throughout any given day, Block said it can happen that one group debunks a false story in the morning only to see another producer pick it up for, say, an 11 p.m. broadcast.

Block also remembered an incident in which a station broadcast a segment about an expensive property for sale that had yet to be built. A developer had only drawn up renderings of the home and pitched it to local stations as an available piece of real estate. The station was later alerted to the fact that should anyone travel to the location where the home supposedly stood, all they’d find was a vacant lot.

“It just makes us look dumb,” Block said of such unfortunate outcomes. “The local actors are getting better at disinformation and spreading misinformation, and that’s the area where I think our newsrooms have to be really vigilant.”

But TV news producers are doing their best to whack these moles, putting in place new protocols and infrastructure to help snuff them out.

At Verify, Forsythe said the team is “constantly searching for misinformation, setting up open-source feeds [and other] tools that are available to us, and we’re monitoring for trends.” They also answer audience questions and field pitches from Tegna stations. Researchers and vetted expert sources confirm or deny any given claim, and the resulting Verify segment, rife with transparency, clearly outlines the fact-checking process to show how the team came to its conclusion.

Stone said Fox-owned stations rely heavily on the judgment of the trusted journalists they employ, who leverage communication technology platforms like Slack to keep real-time discourse going at the quick pace of the breaking news cycle. When the lead time is longer, full-time, centrally located fact checkers come into the fray, taking pitches from Fox stations across the country.

Stone indicated there’s increased deliberation among all of the Fox news teams’ members.

She added that falling back on the tried-and-true processes of good journalism is more important now than ever.

“Like everyone else we want to be first, but accuracy and building trust and keeping trust is far more important to us than being the first to publish,” she said. “This is now more than about who, what, where, when, why and how; it’s understanding why and how you got the information, considering the emotion that’s behind it and the impact it has if a story is written about it by a reputable news organization.”

Over at Graham, Block noted the company has worked with Fathm, a media consultancy agency, to develop its Trust Index, a company-branded verification process. Graham’s also partnered with The Trust Project, an organization whose mission is to build integrity in news, to help its news teams learn what they can do to develop trust with their audience, particularly during periods like an election cycle, when it’s so critical.

Block said both groups have been “so helpful” and enjoyable to work with. He also noted that news teams today have to work toward boosting transparency, while “listening” to the audience as well.

“Which is interesting because in this world … we’re also dealing with audiences sending us mis and disinformation,” Block said. “It’s an interesting exercise to try and be transparent and build trust while being really careful not to get duped.”

More reinforcements are arriving.

Project Origin, built on a partnership between the BBC, the CBC, Microsoft and The New York Times, has developed a “technical provenance approach” that, according to its website, “in conjunction with media education and synthetic media detection techniques will help to establish a foundation for trust in media.”

Bruce MacCormack, co-lead of Project Origin, said that among the group’s capabilities is “tracing a piece of media, so we know who it’s coming from, that it hasn’t been tampered with, that the metadata attached to it is [secure].”
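Project Origin’s actual implementation isn’t detailed on the panel, but the tamper-detection idea MacCormack describes can be illustrated with a minimal sketch: the originator records a cryptographic hash of a media file alongside a signed manifest of its metadata, and any later recipient recomputes the hash to confirm nothing has changed. The function names and the HMAC-based signing key below are illustrative assumptions, not Project Origin’s real API or format.

```python
import hashlib
import hmac
import json

# Illustrative sketch only: production provenance systems use standardized,
# certificate-backed manifests. This toy version shows the core idea of
# binding a media hash and its metadata together under a signature.

SECRET_KEY = b"newsroom-signing-key"  # hypothetical signing key

def create_manifest(media_bytes: bytes, metadata: dict) -> dict:
    """Record the media hash and metadata, then sign the whole record."""
    record = {
        "media_sha256": hashlib.sha256(media_bytes).hexdigest(),
        "metadata": metadata,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_manifest(media_bytes: bytes, manifest: dict) -> bool:
    """Re-check the signature and re-hash the media; False means tampering."""
    claimed = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected_sig = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected_sig, manifest["signature"])
        and claimed["media_sha256"] == hashlib.sha256(media_bytes).hexdigest()
    )

video = b"raw video bytes"
manifest = create_manifest(video, {"source": "WXYZ", "shot": "2021-12-14"})
print(verify_manifest(video, manifest))            # unmodified file passes
print(verify_manifest(b"edited bytes", manifest))  # altered file fails
```

Either change — a single edited pixel in the media or a rewritten metadata field — breaks verification, which is what lets a downstream newsroom trust both who a clip came from and that it arrived intact.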

MacCormack expects that sometime in the next year or so vendors will begin to adopt the Project Origin tech into their systems. (Adobe has already done so with Photoshop.)

“We moved from black and white to color, we moved from low definition to high definition, and we’re going to make the move from insecure to secure media files,” MacCormack said. “And that’s going to take a process and adjustment.”

For more stories on NewsTECHForum 2021, click here.
