Twitter disbands Trust & Safety Council after key members resign • TechCrunch


Twitter today disbanded the Trust & Safety Council, an advisory group of roughly 100 independent researchers and human rights activists. Formed in 2016, the group advised the social network on content and human rights issues such as the removal of child sexual abuse material (CSAM), suicide prevention and online safety. The move could have implications for Twitter's global content moderation, as the group was made up of experts from around the world.

According to multiple reports, council members received an email from Twitter on Monday saying the council is "not the best structure" for bringing outside perspectives into the company's product and policy strategy. Although the company said it would "continue to welcome" ideas from council members, it offered no guarantee they would be considered. Disbanding an advisory group created specifically to provide such insights amounts to saying "thanks, but no thanks."

A Wall Street Journal report notes that the email was sent an hour before the council had a scheduled meeting with Twitter staff, including new trust and safety lead Ella Irwin and senior director of public policy Nick Pickles.

The move comes after three key members of the Trust & Safety Council resigned last week. In a letter, the members said Elon Musk had ignored the group despite claiming to be focused on keeping users safe on the platform.

"The establishment of the Council represented Twitter's commitment to move away from a US-centric approach to user safety, stronger collaboration across regions, and the importance of having deeply experienced people on the safety team. That last commitment is no longer evident, given Twitter's recent statement that it will rely more heavily on automated content moderation. Algorithmic systems can only go so far in protecting users from ever-evolving abuse and hate speech before detectable patterns have developed," they said.

After taking over Twitter, Musk said he would form a new content moderation council with a "diverse set of views," but there has been no movement on that front. As my colleague Taylor Hatmaker noted in her story in August, the lack of robust content moderation systems can hurt underrepresented groups such as the LGBTQ community.
