
On Tech

The Limits of Facebook’s ‘Supreme Court’

What happens on Facebook has such big consequences that its quasi-independent Oversight Board can only do so much.


This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

What Facebook calls its “Supreme Court” ruled on Wednesday that it was the right decision for the company to kick former President Donald J. Trump off the platform after his posts about the riot at the U.S. Capitol in January.

Well, sort of. In a sign of how weird this whole decision was, the Oversight Board punted the call about Trump’s account back to Facebook. He might reappear on Facebook in a few months. Or he might not.

Let me explain the decision, its potential implications and the serious limits of Facebook’s Oversight Board.

Wait, what is happening to Trump’s account?

Facebook indefinitely suspended Trump after he used the site to condone the actions of the Capitol rioters and, as Mark Zuckerberg said, “to incite violent insurrection against a democratically elected government.”

Facebook’s Oversight Board, a quasi-independent body that the company created to review some of its high-profile decisions, essentially agreed on Wednesday that Facebook was right to suspend Trump. His posts broke Facebook’s guidelines and presented a clear and present danger of potential violence, the board said.

But the board also said that Facebook was wrong to make Trump’s suspension indefinite. When people break Facebook’s rules, the company has policies to delete the violating material, suspend the account holder for a defined period of time or permanently disable an account. The board said Facebook should re-examine the penalty against Trump and within six months choose a time-limited ban or a permanent one rather than let the squishy suspension remain.

Facebook has to make the hard calls:

A big “wow” line from the Oversight Board was its criticism of Facebook for passing the buck on what to do about Trump. “In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities,” the board wrote.

The quietly scathing part on influential Facebook users:

The meat of the board’s statement is a brutal assessment of Facebook’s errors in considering the substance of people’s messages, and not the context.

Facebook currently treats your neighbor with five followers the same as Trump and others with huge followings.

(Actually, at least when he was president, Trump had even more leeway in his posts than your neighbor. Facebook and Twitter have said that the public should generally be able to see and hear for themselves what their leaders say, even if they’re spreading misinformation.)

The Oversight Board agreed that the same rules should continue to apply to everyone on Facebook — but with some big caveats.

“Context matters when assessing issues of causality and the probability and imminence of harm,” the board wrote. “What is important is the degree of influence that a user has over other users.”

With world leaders, the Oversight Board said that Facebook should suspend accounts if they repeatedly “posted messages that pose a risk of harm under international human rights norms.”

To this I say, heck yes. The Oversight Board showed that it understands the ways that Facebook is giving repeat superspreaders of bogus information a dangerous pathway to shape our beliefs.

The limits of the Oversight Board:

It is remarkable that in its first year of operation, this board seems to grasp some of Facebook’s fundamental flaws: The company’s policies are opaque, and its judgments are too often flawed or incomprehensible. The board repeatedly, including on Wednesday, has urged Facebook to be far more transparent. This is a useful measure of accountability.

But the last year has also proved the grave limitations of this check on Facebook’s power.

Facebook makes millions of judgment calls each day on people’s posts and accounts. Most of the people who think Facebook made a mistake will never get heard by the board.

This includes those who have had their Facebook accounts disabled and are desperate for help to get them back, people who wind up in Facebook “jail” and don’t know which of the company’s zillions of opaque rules they might have broken and others who are harassed after someone posted something malicious about them. It includes journalists in the Philippines whose work is undermined by government officials regularly trashing them anonymously on the site.

The Oversight Board is a useful backstop to some of Facebook’s hard calls, but it is a complete mismatch to the fast pace of communications among billions of people that, by design, happen with little human intervention.

I’m also bothered by the Supreme Court comparison for this oversight body that Facebook invented and pays for. Facebook is not a representative democracy with branches of government that keep a check on one another. It is a castle ruled by an all-powerful king who has invited billions of people inside to mingle — but only if they abide by opaque, ever-changing rules that are often applied by a fleet of mostly lower-wage workers making rapid-fire judgment calls.

The Oversight Board is good, but the scale of Facebook and its consequences are so vast that the body can only do so much.



  • Peloton is recalling its home treadmills: A U.S. safety commission had warned about dozens of injuries and one child’s death that were linked to the machines. My colleague Daniel Victor wrote that Peloton said it made a mistake by initially fighting the agency’s request to recall the $4,295 treadmills.

  • One family’s story of pandemic learning: Jordyn Coleman, an 11-year-old in Mississippi, said he used to like school but his grades and attendance have suffered because of inadequate technology for virtual classes and pandemic disruptions in his family. My colleague Rukmini Callimachi spent time with Jordyn, who she wrote is among the children at risk of “becoming one of the lost students of the coronavirus pandemic.”

  • Jake from State Farm ENDLESSLY: It’s not your imagination if you feel like you see the same commercials over and over on streaming video sites like Hulu and Peacock. Bloomberg News says that the unruly mess of streaming video is making it hard for advertisers to know how many times their commercials are being shown and where.

The main branch of San Francisco’s public library reopened to in-person browsing for the first time in more than a year. You have to watch this video of excited patrons who are greeted by clapping and cheering library staff.

(The man who was the first in line told The San Francisco Chronicle, “The library is like my best friend.”)


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.

If you don’t already get this newsletter in your inbox, please sign up here.

Shira Ovide writes the On Tech newsletter, a guide to how technology is reshaping our lives and world. More about Shira Ovide
