
Online extremism is taking a mental toll on researchers studying it



Social media platforms do an indispensable job of connecting us with our families, friends, and like-minded people. But of late, they have also become conduits for bad actors to hijack users’ attention, manipulate news, spew conspiracy theories, and propagate toxic viewpoints with the intent of radicalizing impressionable minds.

Be it Facebook, Instagram, Twitter, or YouTube, problems surrounding extremist content, lapses in platform moderation, and general misuse run rampant. The public feeds, which incentivized us to share more and to connect and engage with as many people as possible, have also emerged as the biggest source of headaches for the companies running them.

But it’s not just users and platforms. Researchers investigating the ills of online extremism appear to be bearing the brunt as well, according to an extensive piece in Wired.

Online extremism and media manipulation researchers spend their days sifting through hate-speech-ridden Reddit threads, dehumanizing YouTube videos, and toxic chat rooms where death threats and active harassment campaigns are par for the course. The deluge of hate and extremist content takes a toll on their mental health and leaves some with PTSD-like symptoms, much like those experienced by content moderators at Facebook, they say.

Part of the problem stems from the fact that studying these issues inevitably amplifies them, calling more attention to the subject matter and posing a moral dilemma for those researching it. Whitney Phillips, professor of media literacy and online ethics at Syracuse University, summed it up well:

Just by calling attention [to the fact that] a narrative frame is being established means that it becomes more entrenched. And in a digital media environment it becomes more searchable. It becomes literally Google search indexed alongside whatever particular story [is written about it] … What do you do with that knowledge once you realize that any single thing that [you] do or don’t do is going to be part of how the story unfolds? That’s a huge cognitive burden.

Paris Martineau’s whole piece is well worth a read, and sheds light on the need for personal responsibility and the challenges posed by toxic behavior.

