Today we are adding four networks of accounts to our archive of state-linked information operations, the only archive of its kind in the industry. The networks we are disclosing relate to independent, state-affiliated information operations that we have attributed to Armenia, Russia, and a previously disclosed network from Iran.
Once our investigations were complete, the 373 associated accounts across the four networks were permanently suspended from Twitter for violations of our platform manipulation policies. As with previous disclosures, we shared early access to the data we’re releasing today with the Stanford Internet Observatory for independent investigation and analysis.
As we proactively communicated in October 2020, and based on information provided to us by the FBI, we removed approximately 130 accounts originating in Iran that were attempting to disrupt the public conversation during the first 2020 US Presidential Debate.
Once the final investigation was complete, we suspended a total of 238 accounts operating from Iran for various violations of our platform manipulation policies. As previously stated, these accounts had low engagement and did not make an impact on the public conversation. Today, we're adding them to the archive to empower independent research and analysis.
Under our platform manipulation policy, we investigated and removed 35 accounts with ties to the Government of Armenia. These accounts were created to advance narratives targeting Azerbaijan that were geostrategically favorable to the Armenian government. In some cases, the fake accounts purported to represent government and political figures in Azerbaijan, as well as news entities claiming to operate in Azerbaijan. The accounts engaged in spammy activity to gain followers and further amplify these narratives.
Today we’re disclosing two separate networks that have Russian ties.
1. Our first investigation found and removed a network of 69 fake accounts that can be reliably tied to Russian state actors. A number of these accounts amplified narratives that were aligned with the Russian government, while another subset of the network focused on undermining faith in the NATO alliance and its stability.
2. As part of our second investigation in this region, we removed 31 accounts from two networks that show signs of being affiliated with the Internet Research Agency (IRA) and Russian government-linked actors. These accounts amplified narratives that had been previously associated with the IRA and other Russian influence efforts targeting the United States and European Union.
With every disclosure we make, we want to continue educating people about the tactics state actors use to manipulate or undermine the open democratic conversation that happens on Twitter. Since we launched our first archive in October 2018, we have added data related to more than 85,000 accounts, associated with platform manipulation campaigns originating from 20 countries, to our information operations archive.
We believe we have a responsibility to protect the integrity of the public conversation and to offer transparency on our findings.
Our partnerships in this area are key to the work we undertake and we will continue to keep our archive updated so the public, journalists, and the research community can access and analyze these networks.
Stay on top of regular updates from us over at @TwitterSafety.