Deep learning can help us eradicate suicide – but only if we let it

Humanity’s mental health crisis has reached pandemic proportions. Bluntly put: we don’t seem capable of solving the problem on our own. Cutting-edge AI research shows a clear path forward, but society as a whole will have to accept that mental illness is real before we can take the first steps.

Machine learning researchers in academia, government, and industry are all scrambling to adapt and apply modern artificial intelligence techniques – deep learning in particular – to the suicide problem. The big idea seems to be that AI can act as a sort of early detection system for mental health distress by identifying and monitoring specific markers that indicate suicidal behavior.

For humans to intervene in mental health-related medical emergencies, we have to be aware of the problem. That means understanding which physical and verbal cues to look for, knowing the right questions to ask, and being in the right place at the right time. Computers have a leg up on us in all these respects, now that we live in a world where nearly everyone carries a smartphone, tablet, or computer at all times.

Here’s a recent NYU study in which scientists built a natural language processing AI (basically the same technology that powers Alexa, Assistant, and Siri) that can detect PTSD in veterans with 89 percent accuracy just by listening to audio recordings of their speech. And here are articles from Forbes and HealthITAnalytics that both detail the Pentagon’s efforts to use AI to cross-reference the medical records of veterans who’ve died by suicide in search of any possible links.

In the past 18 months, at least 22 US veterans have killed themselves at Veterans Affairs facilities in the US. Those deaths have finally pushed the federal government to invest in finding technological interventions for suicide.

The bottom line is that we have the technology to deal with the problem, but suicide isn’t the disease; it’s merely a symptom. Machines can’t force us to deal with the root cause: cultural ignorance of how depression, anxiety, and trauma affect the human brain.

TNW spoke with Jason Reid, the founder of Chooselife.org, about the suicide epidemic. He told us:

People don’t need suicide awareness. They’re pretty aware of suicide. What people really want to know is what they can do. We need suicide prevention and we need to talk about suicide with each other and our children. We have to start treating mental health like physical health, and consider depression like the flu or any other illness. Because we all know bad things can happen when illness goes untreated.

Reid’s son Ryan took his own life in March of 2018. He was only 14 years old. According to Reid:

What I’m trying to do is wake parents up a little bit to what’s going on. Because we all would think “I know my kids,” and “that’ll never happen to me.” I thought the same thing. I spent time with my kids. For God’s sake I wrote a book on how to be a better parent. But it happened to me.

The personal stories of families like Reid’s are always devastating, but just how bad is the suicide problem? Here’s some perspective: some 17,234 people were murdered in the US in 2017, about 5 out of every 100,000 people. It’s a number that should bother anyone. Now compare that to the number of suicides in the US in the same year: over 47,000.

Suicide is the tenth leading cause of death in the US and second among teens and young adults. Elderly people are considered the highest at-risk group, but under-reporting makes it difficult to discern how many people over 80 take their own lives. And about 22 US veterans kill themselves every single day.

But you probably knew that already. Maybe you didn’t have the exact figures – but most modern cultures are inundated with information and statistics related to the suicide epidemic. What we need is help.

In the meantime, Reid says one of the first things we need to do is rethink the access we give our children to technology. He believes social media is far more dangerous than even the media hype suggests, and that big tech should start by taking another look at age limits. He told us:

Is 13 the right age to be on Facebook? I don’t think it is. I know social media companies think it is because they get more money that way. But here’s the thing: my son wasn’t even on social media … He was on the internet with his phone. In his suicide note he told us he had searched for ways to kill himself on the web. When you give children these devices, you don’t know it, but parents think the worst thing that’ll happen is they’ll come across porn. Porn’s not the worst thing on the internet.

The current paradigm for dealing with suicide involves placing blame on the victim, or on companies like Google and Facebook. But perhaps it’s time to start implementing AI guardian angels on our devices themselves.

Apple turned the Apple Watch into a life-saving device when it added an AI-powered feature that can detect irregular heart rhythms such as atrial fibrillation. Imagine if the camera in your phone or the microphone in your smart speaker could detect depression or PTSD and alert someone who could help.

We have the tools to make it happen. Now we just need to convince people to take mental health crises as seriously as we do physical illness and injury.

If you or someone you know is struggling with depression or suicidal thoughts, contact the Suicide Prevention Lifeline at 1-800-273-8255 or online here. For more information on Chooselife.org, visit its website here.
