The Australian has obtained confidential internal Facebook documents that purportedly show how the company could exploit teenagers’ insecurities for the benefit of advertisers.
The leaked document, written by two Australian Facebook executives, describes how monitoring users’ posts, comments, and interactions could help determine when people feel “defeated”, “overwhelmed”, “stressed”, “anxious”, “stupid”, “nervous”, “silly”, “useless” and like a “failure”.
Information about users’ moods could therefore be added to the data Facebook sells advertisers. Facebook already provides ad buyers with users’ personal information, including relationship status, location, age, and how often and in what manner they use the social media website. Emotional state could be a lucrative addition to this data.
The methods described in the document could be used on kids as young as 14 (Facebook’s minimum age is 13) who “need a confidence boost”. The documents also show that Facebook has been developing covert tools to better obtain useful insights into how Australian and New Zealand teenagers are feeling.
The accusations clearly have merit, as Facebook has already issued an apology for targeting such a young audience. Though the company admitted it was wrong to target children in this way and ordered an internal investigation, it did not address whether exploiting people when they feel vulnerable is ethical.
Facebook has stated that its research complied with privacy and legal protections, but news.com.au reports that the company may have breached Australian guidelines for advertising and marketing to children.
Update May 1: Facebook responded to The Australian’s story saying that the company does not offer tools to target people based on their emotional state. The social media giant claims that the analysis was only done to help marketers understand how people express themselves on Facebook.
However, Facebook admits that the research did not follow the company’s established process for reviewing research. Facebook says it is “reviewing the details to correct the oversight”.