Meta Oversight Board Takes Its First Threads Case
Meta’s Oversight Board has now expanded its scope to include the company’s newest platform, Instagram Threads. Designed as an independent body that hears appeals and issues precedent-setting content moderation decisions, the board has so far ruled on cases including Facebook’s ban of Donald Trump, Covid-19 misinformation, the removal of breast cancer photos, and more.
The board has now begun hearing cases originating on Threads, Meta’s Twitter/X competitor.
This is an important point of differentiation between Threads and its rivals. It’s also a very different approach from how decentralized alternatives, like Mastodon and Bluesky, handle moderation on their platforms. Decentralization allows community members to establish their own servers with their own moderation rules, as well as to defederate from other servers whose content does not meet their guidelines.
The startup Bluesky is also investing in stackable moderation, which lets community members create and run their own moderation services that can be combined with others to give each user a personalized experience.
Meta’s offloading of tough calls to an independent board that can overrule the company and its CEO, Mark Zuckerberg, was supposed to solve the problem of Meta’s centralized authority and control over content moderation. But as these startups have shown, there are other ways to give users more control over what they see without infringing on others’ right to do the same.
Nonetheless, the Oversight Board announced Thursday that it would hear its first case from Threads.
The case concerns a user’s reply to a post containing a screenshot of a news article in which Japanese Prime Minister Fumio Kishida made a statement about his party’s alleged underreporting of fundraising revenues. The reply included a caption criticizing him over tax evasion, derogatory language, and the phrase “drop dead,” as well as a derogatory term for a person who wears glasses. Because of the “drop dead” phrasing and hashtags calling for death, a human reviewer at Meta decided the post violated the company’s Violence and Incitement rule, even though it reads much like an ordinary social media post these days. After a second reviewer rejected the user’s appeal, the user appealed to the Board.
The Board says it chose this case to examine Meta’s content moderation policies and its enforcement practices for political content on Threads. It’s a timely move, given not only that it’s an election year, but also that Meta has said it won’t proactively recommend political content on Instagram or Threads.
The Board’s case will be the first involving Threads, but it won’t be the last. The organization is already preparing to announce another set of cases tomorrow focused on criminal allegations based on nationality. Those cases were referred to the Board by Meta, but the Board will also receive and consider appeals from Threads users, as it did with the case regarding Prime Minister Kishida.
The Board’s decisions will influence how Threads shapes itself as a platform: whether it preserves users’ ability to express themselves freely, or moderates content more closely than Twitter/X does. That will ultimately help shape public opinion of the platforms and push users toward one or the other, or perhaps toward a startup experimenting with new, more personalized ways to moderate content.
Source: TechCrunch