As we work to improve the health of the public conversation, we’re committed to reaching beyond Twitter’s virtual walls to integrate diverse perspectives that make our service better for everyone. That’s why we regularly collaborate with trusted partners on our Trust and Safety Council to develop products and programs, and to improve the Twitter Rules.
Today, we’re sharing a recap of some of the work we’ve accomplished hand-in-hand with these trusted partners, as well as more about our ongoing commitment to incorporating the expertise of global experts, researchers, and developers to support healthy public conversation on Twitter.
Incorporating feedback to make Twitter safer
We know the best version of Twitter is the one built by the people who use it. Over the past year, we’ve engaged with the Trust and Safety Council on thirteen projects early in the development process. We distilled their feedback on ways to offer a better, safer experience for people using Twitter and put it to use. Their input directly informed our approach on several products.
Twitter has become an extremely important communication tool in India, and it is encouraging to see Twitter take active interest and respond to the feedback given by council members. We’re happy to be part of a team of trusted partners which gets heard and is able to make Twitter a safer platform for all, especially women.
As a founding member of the Trust and Safety Council, we’ve worked alongside Twitter to help influence positive change for over a decade. We believe that everyone should benefit from technology free from abuse and harassment and bring this perspective to all council meetings. We look forward to continued collaboration with Twitter to ensure that matters raised by our organization and the people we support are integrated into Twitter’s products and policies.
Transparency is core to Twitter’s approach. Through initiatives such as our open developer platform, our information operations archive, and our disclosures in the Twitter Transparency Center and Lumen, we continue to support third-party research into what’s happening on Twitter. We’ll continue to build on these efforts and inform the public as we improve Twitter in the open. The following are highlights from the past year.
What’s next? Increasing transparency and understanding of our approach to content moderation
As we continue to invite trusted partners and the public to share feedback on ways to make Twitter safer, it’s important to be transparent about how we develop and enforce the Twitter Rules. Our newly formed Content Governance Initiative (CGI) aims to do this by developing a governance framework that provides a consistent and principled approach to the development, enforcement, and assessment of our global rules and policies. To build our governance framework, we’re engaging external stakeholders and have created an additional advisory group on our Trust and Safety Council. We’ll continue collaborating with this group and cross-functional teams across Twitter to establish standardized guidelines on policy development, enforcement, and appeals that help drive a common understanding of Twitter’s approach to content moderation. The framework’s principles and guidelines will aim to fulfill a consistent set of objectives.
We recognize that achieving these objectives will not be easy. Content moderation at scale is a highly complex and challenging process. This initiative reflects our ongoing commitment to working systematically — in partnership with external stakeholders around the world — to improve the transparency and consistency of our content moderation processes.
Here’s to another year of building together
We want to acknowledge the members of the Trust and Safety Council, research partners, civil society representatives, and you, the people using Twitter, for continuing to hold us accountable. You challenge us, offer different perspectives, and support us in our mission to safeguard the public conversation. We’re looking forward to this next chapter and can’t wait to see what we build together.