
Sunday 29 March 2026

Zuckerberg’s ‘fig leaf’ slips as oversight board proves toothless

The tech firm’s court loss might have been avoided if the internal body had been able to do its job

Meta’s bad week in the US courts should not have happened. The company has its own oversight board – Mark Zuckerberg called it the firm’s “supreme court” when he announced it in 2018.

Launched with great fanfare in 2020, the board of 21 high-profile law professors, journalists and human rights advocates – including writer and activist Khaled Mansour, Denmark’s first female prime minister Helle Thorning-Schmidt, and Alan Rusbridger, the former editor-in-chief of the Guardian – is supposed to provide accountability and transparency. It has since made more than 200 rulings, including backing Zuckerberg’s ban on Donald Trump in 2021.

But the board’s profile has waned. When asked about it, one academic expert in online conspiracy theories told The Observer: “I have no idea what you’re talking about, I’m sorry.”

In theory, child safety is one of the oversight board’s key concerns. Its stated aims include “enhancing and protecting teens’ online experiences” while “considering policies that improve their safety from exploitation, abuse and other offline harms”.

In practice it is toothless by design. The Observer contacted the board, as well as individual board members, in light of this week’s ruling in a Los Angeles court that Meta and Google had intentionally built addictive platforms, and a ruling in New Mexico that found Meta liable for misleading users over child safety.

Only two members responded to our inquiries. One, Rusbridger, said the cases were “not part of our remit I’m afraid”.

The other, Suzanne Nossel, former COO of Human Rights Watch, explained that most of the board’s work consists of judging specific pieces of content – reviewing decisions Meta has already made. The board does not have “the authority to review the inner workings of the algorithms or platform design at the heart of the Los Angeles case”, she said, clearly unhappy.

“From the start it has been clear to us that the impact of social media has as much, or more, to do with the ways that content is delivered algorithmically as with what content stays up,” she said. “But we have not been able to take on issue areas without the participation of the company.”

Online safety experts are unimpressed. “If we’re serious about teenagers’ wellbeing, then focusing on individual pieces of content is not going to help,” Jessica Chalmers, who advises schools and parents on child safety, said.

“The board hasn’t got the power to change how these powerful algorithms are designed. I don’t think it’s working, but it wasn’t built to protect young people online anyway.”


Tanmay Durani, researcher in cyber law and AI at Rajiv Gandhi National University of Law, said the board’s limited impact points to severe governance issues at the company, and that it “looks much closer to a fig leaf when you shift from individual speech to systemic issues like youth addiction, algorithmic amplification of harmful content or design choices that expose children to predators.

“The existence of a high‑profile, independent‑seeming body helps Meta project an image of responsible self‑regulation while consequential questions about product design and profit safety trade‑offs remain outside its remit.”

And even this fig leaf may fall. Meta is only funding the oversight board until the end of next year. As Zuckerberg embraces Trump, observers suspect the board will vanish when this round of funding dries up.

Overseeing a redesign of algorithms to reduce addiction and keep children safe? Even if the board survives, “that’s not in the remit, I’m afraid”.

