In February this year, we outlined the steps we have been taking to combat racist abuse on Twitter. We condemn racism in all its forms - our aim is to become the world’s most diverse, inclusive, and accessible tech company, and to lead the industry in stopping such abhorrent views from being seen on our platform. We were appalled by those who targeted players from the England football team with racist abuse following the Euro 2020 Final. Having completed an initial review of our data relating to Euro 2020, we wanted to share an update publicly and transparently. Our aim is to contribute to the shared understanding of these behaviours online and to provide an overview of some of the steps we have put in place since our initial blog post was published in February.
Action we took to protect the Euro 2020 conversation
In advance of the Euro 2020 Tournament, alongside our wider work with the football authorities, we put in place specific plans to quickly identify and remove racist, abusive Tweets targeting the England team and the wider Euro 2020 conversation.
Following the appalling abuse targeting members of the England team on the night of the Final, our automated tools, which had been in place throughout Euro 2020, kicked in immediately to identify and remove 1,622 Tweets during the Final and in the 24 hours that followed.
While our automated tools now detect the majority of the abusive Tweets we remove, we also continue to take action on reports. New vectors of abuse are constantly emerging, so our systems must adapt on an ongoing basis. To supplement our efforts, trusted partners are therefore able to report any further Tweets directly to our front-line enforcement teams. In total, over 90% of the Tweets we removed for abuse over this period were detected proactively.
We continued to remove violative content as it was posted on the platform in the days that followed. By 14th July, 1,961 Tweets had been removed proactively following the Final, with a total of 126 removed from reports.
99% of the accounts suspended were not anonymous
Following the Tournament, we undertook our own analysis of the Tweets removed and accounts suspended. This is to ensure we have a comprehensive understanding of the behaviour we encountered and the users involved, and that the steps we take going forwards can be as effective as possible. While that work is continuing, we wanted to share some initial findings.
Given the international nature of the Euro 2020 Final, it was no surprise to see that the Tweets we removed came from all over the world. However, while many have quite rightly highlighted the global nature of the conversation, it is also important to acknowledge that the UK was - by far - the largest country of origin for the abusive Tweets we removed on the night of the Final and in the days that followed.
We also wanted to better understand the users we had permanently suspended over the course of the Tournament. While we have always welcomed ideas from partners on what will help, including from within the football community, our data suggests that ID verification would have been unlikely to prevent the abuse from happening, because the accounts we suspended were not themselves anonymous. Of the accounts permanently suspended during the Tournament, 99% of account owners were identifiable.
We also continue our work on reducing the visibility of this kind of content, ensuring fewer people see it. Indeed, only 2% of the Tweets we removed following the Final generated more than 1,000 Impressions (the number of views a Tweet receives before being removed). This underlines the importance of focusing even further on ways to ensure these Tweets are seen by as few people as possible - or are prevented from being sent in the first place.
Racist behaviour does not reflect the vast majority of people who use Twitter to participate in vibrant conversations about football in the UK. Critically, the word “proud” was used more often on the day following the Final than on any other day this year, as people expressed their support for the England team.
Steps we have taken since our update in February
We have taken a number of wider steps since our last update in February to combat racist abuse targeting the football community. For the period from 19th February (when that blog post was published) to 1st June, we removed just under 13,000 Tweets - 95% of which were identified proactively using technology.
Our continued commitment to curbing this societal issue
Our aim is always that Twitter be used as a vehicle for every person to communicate safely - whether in highlighting injustice or in giving a voice to communities that have historically been under-represented. There is no place for racist abuse on Twitter, and we are determined to do all we can to stop these abhorrent views and behaviours from being seen on our platform.
We can do better, and we fully acknowledge our responsibility to ensure the service is safe - not just for the football community, but for all users.
However, we also have to be honest that the progress we can make alone would be magnified by greater interventions across the board. As long as racism exists offline, people will continue to try to bring these views online - it is a scourge that technology cannot solve alone. Everyone has a role to play - including the government and the football authorities - and we will continue to call for a collective approach to combatting this deep societal issue.