Stories

Is it possible to build more kindness into Twitter?

By Common Thread
Tuesday, 21 September 2021

In 2019, Ph.D. candidate Lindsay Blackwell and journalist Micah Loewinger set up an ambitious social experiment. They wanted to see if they could get people to apologize to each other online. For three months, they tried their hand at mediating conflict in a Christianity-focused subreddit. Their results were mixed. But when Blackwell tells the story, the one person they successfully rehabilitated stands out to her. 

“We had one guy who was just a constant pain in the ass for the [moderation] team,” Blackwell said while describing how Reddit user “James” was permanently banned for his aggressive comments and tendency to pick fights. 

But Blackwell and Loewinger noticed that James had made meaningful connections in the subreddit and posted multiple times a day before he was banned. They decided to learn more and invited him to chat in a private Discord server. Here they gave him a chance to explain his side of the story while helping him reflect on his behavior. 

“We basically said if you come participate in this group, and you make a good faith effort, we will lift your restrictions and invite you back into the community with open arms,” Blackwell said.

It worked: Not only did James return to the subreddit on his best behavior, he also became an asset to the moderating team by helping explain the rules to other users.

“The best part is that this is something that I've heard over and over again in my research with offenders,” Blackwell said, referring to her own academic research at the University of Michigan and also her work as a researcher at Facebook and now at Twitter. “People who are successfully rehabilitated and who are welcomed back into the community after violating some norm or rule tend to become very good advocates for the rules,” she said.

Blackwell is now a researcher on Twitter’s safety team, where she applies some of the lessons she learned while earning her doctorate. Blackwell and researchers like her have spent the past few years advocating for a different approach to social media moderation. This reconciliation-based approach, known as restorative justice, focuses on repairing damage and ultimately bringing offenders back into their community.

“Most Western models of criminal justice are designed around retribution as the dominant way to achieve justice,” Blackwell said. “My whole dissertation is about how we can apply theories of justice to online content moderation to improve those systems. And restorative justice is a big, big component of that.”

Restorative justice is an alternative model that values mediation and agreement over punishment. The movement in North America originated from Indigenous models of justice — specifically those of the First Nations peoples of Canada, Native communities in the US, and the Māori of New Zealand — as well as from Christian faith communities like the Mennonites and the prison abolition and alternative dispute resolution movements.

Blackwell started thinking about how online communities could be designed to create an environment that encourages people to make amends instead of removing them from the platform. 

“I think people are really understanding more and more how much carceral philosophy leaks into many aspects of our society and our day-to-day lives,” she said. “A lot of the problems in the ways that these models of governance were translated into social media is because of the just-Western perspective that these companies are coming from.”

A broken system

The current method of content moderation focuses primarily on banning users who break the rules — a punishment-based system that’s in line with most Western criminal justice systems. But even though reporting and blocking tools have been available for years, incidents of hate and harassment haven’t decreased significantly on social media. In a 2020 survey, more than half of the people on Twitter who identify as part of a marginalized community said there is still too much trolling and harassment on the platform.

That’s what people like Christine Su are trying to change. Su, a queer woman of color and former activist, joined Twitter last year as the product manager for conversational safety. She immediately spotted areas of concern. 

“There is no sense of psychological or emotional safety if you know that algorithms are detecting some bad stuff, but there's still harassment happening every day. Particularly for women and folks from any marginalized identity. Your safety online is like your safety in real life,” she said.  

Su knows from personal experience that people from marginalized communities, such as women and LGBTQ+ people, could particularly benefit from an overhaul of how Twitter works. She recently spent eight hours using Twitter’s public reporting channels to report abuse directed at her sister, a journalist. 

This kind of harassment is exactly why Su wants to apply a restorative justice lens to how her team builds features that actually help people feel safe. 

“In the view of the average woman on Twitter, we're not doing enough as a platform,” she said.

Apologies and forgiveness

Around the same time Lindsay Blackwell was experimenting with restorative mediation on Reddit, another Twitter researcher noticed a curious trend: People on the platform were apologizing to each other on their own — without incentives and with no knowledge of social justice theory. 

Ruben Gomez is a researcher on the health user experience team, which works on what it calls “health empowerment” — interventions, remediation, and rehabilitation for Twitter customers who’ve been banned from the platform. His work includes speaking and listening with empathy to people who’ve violated Twitter’s rules. 

While researching how people felt about the tool that lets them hide offensive replies to their Tweets, Gomez heard from the original Tweet authors that they were receiving an influx of Direct Messages. The people who had offended the Tweet author were messaging them to acknowledge they had stepped out of line or to privately start a discussion about what they’d done wrong. Gomez saw how this could be shaped into a remediation tool for Twitter and brought the idea to Su and her team.

“That was one of the seeds for the idea that maybe we should formalize this ability for people to apologize,” Gomez said. “Because we're not giving the opportunity for people to do that through our existing tools.”

Su and the conversational safety product team built several concepts for an apology tool, basing them on practices from the therapy circles commonly used in restorative justice. An early concept was called the “Oops, ouch, forgive” model. 

“It was just like in therapy circles and community circles — if you cross over a line, you don’t make a huge deal out of it. You just say, ‘oops, ouch,’ or tell someone something hurt you in private feedback,” Su said.

Su’s team tested the model internally, and with specific target groups, but received mixed feedback. 

“It's not clear, given the current zeitgeist of Twitter and cancel culture, that people are ready to forgive. And so, depending on the severity of the perceived infraction, it kind of trailed off there,” she said. 

Another challenge came when the team tested the concept internationally, particularly in cultures where more formal apologies are customary. 

“‘Oops’ and ‘ouch’ did not land well. There was no translation from our Japanese local teams, and our Korean local teams where they felt like it was appropriate,” she said. 

Translating Twitter tools across cultures is a challenge, particularly for a company where many of the employees are based in North America, though that’s changing. Su, who grew up in Hong Kong and Taiwan, says one of the solutions is to continue to hire more international and bilingual employees. 

“If you hire someone who's bilingual, trilingual, or grew up internationally, you're getting by default two cultural experiences in one person,” said Su. 

More urgently, most of the people experiencing abuse told the team they just wanted ways to make it stop. So Su switched her team’s focus to building filters that could help people, like women journalists, better control their experience on the platform. She sees this work as based on another tenet of the restorative justice movement, which underlines the importance of healthy boundaries. 

“The common thread of this is what we're trying to do through the lens of restorative justice or transformative justice is give communities the power. So empower people to protect themselves with the flexibility of a suite of controls that they like,” she said.

The tools for apologizing on Twitter are on hold for now, but Su wants to involve the wider Twitter community in the next iteration of the product. 

“I do think that giving a pathway to healing and forgiveness is still really important, and I would love to explore that in dialogue with people,” she said. “I'd love to do another public Tweet of like, ‘Here's different ways to ask for forgiveness or ways to apologize. What do you all think?’”

Valuing community input to the point of inviting customers to comment on what tools they’d like to see is new for Twitter, but not for the people advocating for a safer space built on empathy and reconciliation. 

“I think the goals that we're trying to achieve are very difficult, because humans are just vastly complex. And so it's okay to have an overhaul of Twitter that meets the needs of largely marginalized communities,” Gomez said. “Hopefully we're working towards that. I certainly think we are.”


Common Thread

@TwitterSafety

Real talk about the health of the public conversation on Twitter and the work ahead of us.