There’s no shortage of people online looking to exploit and manipulate the vulnerable among us. One such group is anorexia coaches, or “anacoaches”.
They are typically middle-aged, male sexual predators who go online to find impressionable young people to exploit under the guise of providing weight-loss “coaching”.
I have been researching how anacoaches operate. I’ve found they are facilitated by flaws within social media algorithms, as well as large numbers of young people seeking weight-loss help online.
My ongoing research, coupled with other media reports, indicates opportunities for anacoaches have risen in the past few years. My analysis showed that on Twitter alone, about 300 unique requests for anacoaches are made around the world each day.
Anacoaches operate across numerous channels, including established social platforms such as Twitter, TikTok, Tumblr, and Kik. Yet these platforms have not adequately addressed the problem.
Targeting teens
An estimated 4% of Australians, or roughly one million people, are affected by eating disorders. And almost two-thirds (63%) of these people are thought to be female.
Teenagers with eating disorders are more likely to experience poor mental health and impaired functioning in social environments — which leaves them more vulnerable to the influence of anacoaches.
Also, research has shown social media use can exacerbate the extent to which teenagers and young adults chase a “thin” ideal.
One study published by a Dutch human rights law group on the predatory behaviors of anacoaches found self-reporting victims had been sexually assaulted and even raped.
And with anacoaching comes the potential for other forms of criminal abuse, such as pedophilia, forced prostitution, and even human trafficking.
Social media provides the platform
With the rise of online platforms, there has been an emergence of communities pursuing a thin ideal. These networks tend to share content that endorses extreme thinness.
Group identity is formed through interactions and hashtag sharing, with a focus on terms used regularly in the context of eating disorders. Common hashtags include #proana (pro-anorexia), #bonespo (bone inspiration), #edtw (eating disorder trigger warning), #promia (pro bulimia), #bulimia, #thighgap, #uw (ultimate weight), #cw (current weight), #gw (goal weight) and #tw (trigger warning).
As highlighted in my previous research, communication in these communities includes exchanging weight-loss tips, diet plans, extreme exercise plans, imagery of thin bodies, and emotional “support”.
Anacoaches lurk in chat forums focused on thin ideals. Each coach will tend to be present in numerous chatrooms, luring teenagers with stories of their past “successes” from coaching.
They market themselves with dubious claims. Some will assign themselves labels such as “strict coach” or “mean coach”. The screenshots below show messages posted on the app Kik.
The coaching predominantly involves sharing pictures and videos for nude body checks (or in undergarments), weekly weigh-ins, and enforcing strict rules on what foods to eat and avoid.
While there’s currently no way to know how long coaching lasts on average, the harms are extensive. TikTok, which has a massive young following, will start recommending accounts centered on eating disorders once a user first seeks out such content, because of the way its recommendation algorithms work.
There are currently not enough regulations in place by platforms to prevent anacoaches from operating, despite an array of reports highlighting the issue.
What is being done?
Best efforts so far have involved Instagram, TikTok, and Pinterest filtering out selected words such as “proana” or “thinspo” and banning searches for content that promotes extreme thinness.
A TikTok spokesperson told The Conversation the platform does not allow content depicting, promoting, or glorifying eating disorders.
“When a user searches for terms related to eating disorders, we don’t return results and instead we direct them to the Butterfly Foundation and provide them with helpful and appropriate advice. We’ve also introduced permanent public service announcements (PSAs) on related hashtags to help provide support for our community,” the spokesperson said.
The spokesperson said accounts found to be engaging in sexual harassment may be banned. Platforms generally ban users who violate community guidelines, but anacoaches often reappear under new account names.
According to Twitter, evading account bans is against the rules. Earlier this year Twitter announced it would enable a safety mode that will allow users to turn on the proactive screening of spammy and abusive content. It remains to be seen what role this will play in curbing targeted attacks from anacoaches.
A research-based report released this month by the 5Rights Foundation has detailed how minors online are targeted with sexual and suicide-related content. It references platforms including Twitter, TikTok, Instagram, Snapchat, Facebook, Discord, Twitch, Yubo, YouTube, and Omegle.
The research showed children as young as 13 are directly targeted with harmful content within 24 hours of creating an account online. They may receive unsolicited messages from adults offering pornography, as well as recommendations for eating disorder content, extreme diets, self-harm, suicide, and sexualized or distorted body images.
Australia’s policies on social media need to be overhauled, to ensure platforms adhere to community guidelines and are held accountable when violations occur.
The government should prescribe set rules, informed by the eSafety office, on how vulnerable young people online should be helped.
A nuanced intervention approach would generate better outcomes for users with eating disorders as each user would have a different set of circumstances and a different mental health state.
Anacoaches on social media should be treated as criminals. And platforms that fail to act against them should face fines for failing to provide a safe environment for vulnerable users.
In the past, the European Union has fined platforms for allowing terrorist content. Social media giants have also hired contract workers to screen content for examples of terrorism, pedophilia, and abuse. This effort should be extended to include anacoaches.
The Conversation approached Tumblr for comment but did not receive replies within the deadline allocated. Popular messaging app Kik was acquired by MediaLab in 2019. The Conversation approached MediaLab for comment but did not receive a response within the allocated time frame.
Article by Suku Sukunesan, Senior Lecturer in Information Systems, Swinburne University of Technology
This article is republished from The Conversation under a Creative Commons license. Read the original article.