Bluesky saw explosive growth last year, forcing the platform to step up its moderation efforts. In its recently released moderation report for 2024, Bluesky said its user count increased by approximately 23 million, from 2.9 million users to almost 26 million. And its moderators received 17 times more user reports than in 2023: 6.48 million in 2024, compared to 358,000 the previous year.
The bulk of these reports concerned “harassment, trolling or intolerance”, spam and misleading content (including impersonation and misinformation). Accounts impersonating other people became a problem after Bluesky’s spike in popularity, prompting the platform to take a “more aggressive” approach to cracking down on them. At the time, the company said it had quadrupled its moderation team. The new report states that Bluesky’s moderation team now numbers around 100 people and that recruiting is underway. “Some moderators specialize in particular policy areas, such as dedicated child safety officers,” the report notes.
Bluesky says it also received reports in other categories, including “illegal and urgent issues” and unwanted sexual content, along with 726,000 reports marked as “other.” The company says it responded to 146 of the 238 requests it received last year from “law enforcement, governments and legal firms.”
The platform plans to make some changes to how reports and appeals are handled this year, which it says will “streamline communication with users,” such as providing users with updates on the actions taken on content they reported and, later, allowing them to appeal takedown decisions directly in the app. Moderators removed 66,308 accounts in 2024, while automated systems removed 35,842 spam and bot profiles. “Looking ahead to 2025, we are investing in stronger proactive detection systems to complement user reporting, as a growing network needs multiple detection methods to quickly identify and address harmful content,” Bluesky said.