Helping news organizations monetize their content and finding ways to uncover fake news are big priorities for Facebook. That’s according to Campbell Brown, the platform’s head of news partnerships and a former TV anchor, who spoke at TVNewsCheck’s NewsTECHForum Monday.
“There is some urgency around [monetization]. We’re in a heavy news cycle that is probably not going to last forever. Having been in the news business for a very long time, [I know] these things come in waves. So in this moment when we’re in a strong news cycle, we want to get monetization products out there — especially if publishers have a subscription business where you’re trying to get direct connections with your audience.”
Brown said that she doesn’t “worry about” monetization for the big media companies like The New York Times, CNN and NBC. “Because I spent most of my career in local news, I know that local broadcasters or publishers don’t have access to the same technology. So my bias is toward what we can do for those that don’t have the resources or the technical expertise. We’re looking at creating special SWAT teams, as I like to call them. So for example, with the subscription product they can go in and help smaller newsrooms implement it.”
That’s part of Facebook’s focus on training for newsrooms. “We can develop all the tools in the world, but unless you can use them [they don’t do any good]. So it’s on us to grow the work we do in this area,” she said.
Brown joined Facebook last January, around the time the company launched the Facebook Journalism Project (FJP). The project puts news professionals in the same room with Facebook’s product teams, in roundtable settings, so the platform can understand newsroom priorities and build products and services that newsrooms can actually use.
One such product is the new “Watch” tab, where users can find video content from a variety of sources. It isn’t meant for Netflix-style or traditional TV content; rather, for stations it can be a way to engage consumers with video that invites voting and other interactive functions. “The kinds of content that’s become really successful are the ones that engage the audience in a conversation,” she said, such as a mini version of The Bachelor.
“It’s very new and has yet to become a destination for news,” Brown added. “You’ll see us trying a lot of things.” For example, prerolls haven’t worked well with news content that stations publish on Facebook, but stations could find they make more sense in the Watch tab.
Facebook is also testing out a paywall. “That was the No. 1 ask we had from media companies,” Brown said. “That was something that Facebook had never considered prior to that. After many months of collaboration with several partners, we’re testing on Android and hopefully on Apple soon. And this will be the first time there’s been a paywall.”
Harry Jessell, TVNewsCheck’s editor and co-publisher, who interviewed Brown during the session, asked her if there were any indications that the platform increased news viewership. “Facebook, for better or worse, is a marketing platform,” Brown said. “At the end of the day what is in your newsfeed is based, almost entirely, on who you’re friends with, who you follow, the pages that you follow.” The platform can help stations know their audience, and engage them through calls to action that might elicit email addresses. “My hope is that we can help you be more targeted.”
At the same time Facebook works on the monetization piece of the puzzle for news organizations, it’s also making a “massive investment” to keep false news from appearing on Facebook. And it’s doing so in two ways, Brown said.
“Most of the false news on the platform is financially motivated. With that in mind, disrupting the financial incentives on the technology side is something that our teams have been focused on,” she said.
That problem is easier to address than identifying false news itself, such as the Russian disinformation problem.
“We’ve made a bunch of ranking changes, reducing the clickbait and sensationalism on the platform.” Stories like “a woman who swallows a frog” will be penalized or down-ranked. This effort will continue over the next couple of months, she said.
Facebook is headed toward answering a big question: “How do you make decisions about what is a trusted source of news in a way that’s fair and that eliminates political bias and captures what the community thinks and believes but doesn’t cross the editorial lines?” Brown said. That will take a combination of machine learning along with human fact checkers and journalists.
“We don’t want to make editorial decisions. But we feel very strongly that we need to get to a place where we can identify where the widely trusted sources are and give them a boost. The only way we can give your stuff more visibility is to get the crap off the platform,” she added.