
New study looks at effects of banning “hateful” online communities

With hateful Internet posts on the rise, how should websites respond? (Photo: Frederico Cintra / Flickr, HTTP://J.MP/1SPGCL0)

Some of us turn to social media to stay connected with friends and family. Others use social media as a megaphone to spread messages of hate.

Social media outlets wrestle with striking a balance. How do they allow free speech yet not let trolls and haters wreak their havoc? Just banning the troll doesn’t make much of a difference. It’s easy to change a screen name and jump right back out there lobbing hate-filled posts.

What if the focus shifted from cracking down on the trolls to taking away their online hangouts? A recent study by researchers at Emory University, Georgia Institute of Technology and the University of Michigan explored that very question.

Listen above for the full conversation or read highlights below.

On the study's goal and findings

“There’s been this long-standing idea in Internet communities that if you stamp out a bad behavior, it’s just going to spread all over the place,” said Eric Gilbert, an associate professor at the University of Michigan School of Information and one of the researchers behind the study. “One argument for letting it sit there and do what it’s doing is that if you try to take it away, it’ll spread to more mainstream locations. And we wanted to test whether that was true.”

The study looked at the effects of Reddit banning “hate communities” on its site. “We found that, one, on the first question, individuals went away at a higher frequency than expected, they stopped using the hate speech they were using before, and in the second case, there was no discernible effect on ‘nearby’ communities,” said Gilbert.

While members of the “hate communities” moved to other communities on the website, called subreddits, they did not replicate the same vile language in these new communities. “We have some hunches about why it would have worked,” said Gilbert. “There are social norms that are in place in these communities that tamp down bad behavior.” Meanwhile, others left Reddit and moved to other platforms that allowed more hurtful speech.

On free speech

“We say free speech, but whose free speech are we talking about here? One of the things we have to point out is all this stuff is transnational, so it’s not even clear whose laws apply,” said Gilbert. “And even among western countries, there’s considerable variation in how free speech is interpreted.”

“The other thing that gets lost sometimes in this freedom of speech conversation is the fact there are companies involved…. Reddit does not have to uphold the First Amendment. It is not a public space. It is a privately owned space,” said Gilbert. “I’m not sure it’s the best idea to cede the regulation and enforcement of free speech to private firms, and that’s currently what we have.”

(Subscribe to the Stateside podcast on iTunes, Google Play, or with this RSS link)

Stateside is produced daily by a dedicated group of producers and production assistants. Listen daily, on-air, at 3 and 8 p.m., or subscribe to the daily podcast wherever you like to listen.