Editor’s note: This article by Dmitri Sarle originally appeared on ArcticStartup, an independent tech blog that reports on digital startups and growth entrepreneurship from the Nordic and Baltic countries.
The Latvia-based company behind Ask.fm has recently been under a lot of pressure due to allegations of enabling cyberbullying that led to a number of teenage suicides. Last week we raised the question of why companies such as Ask.fm are the targets, and whether everybody should be concentrating on the root of the problem instead.
To date, Ask.fm has not formally responded to any of the allegations, but we managed to get an interview with CEO Ilja Terebin to discuss the matter:
Ilja Terebin: Well, the truth is that parents do not know where kids socialize. They think that when kids go to school, for example, all they do is solve math problems. If parents knew what kids actually talk about, they would be a lot more scared.
This is what happens with Ask.fm, where it all plays out in the open. The truth, however, is that it happens everywhere, both online and offline.
But have you done any research or analysis into the matter?
We have analyzed some keywords such as “kill yourself” and tried to spot patterns in who was asking the questions. It turns out that about 90% of these types of questions are asked by the users themselves.
So does that mean that it is a cry for help, or some sort of attention-seeking behavior?
Yes. FormSpring had the same problem. Kids lack attention, mainly because parents are doing other things such as watching TV, drinking beer and reading the tabloids.
When they come to sites like these, they start trolling themselves so that their peers start protecting them.
In this absurd way, they get the attention.
Still, does Ask.fm try to somehow minimize this sort of behavior?
In the press, they claim that we do not even try to deal with this. This is absolutely not the case.
We do have a “report abuse” feature that any user can use. All of these reports are reviewed. At the moment we have 50 moderators.
The same goes for pornography and other types of content: it is either deleted or the user is blocked. This is easier to do with pictures and video. However, it is much harder with text, as we have 30 million questions and 30 million answers every single day.
To put that into perspective, it is like trying to control Gmail. We do have a database of negative keywords; moderators look through the flagged content and decide what action to take.
We might make the moderation controls even stricter in the future as well.
So do you think that this problem can be eliminated on Ask.fm?
Our main audience is teenagers, and it is often a problematic and aggressive target market. These problems arise not only on Ask.fm but also on other social media. It is very hard to get rid of this altogether.
Especially when teenagers are involved.
As we wrote last week, we think the problem is larger than individual sites such as Ask.fm, and that more global initiatives should take place to educate parents and children about the perils of the Internet in general.
For instance, Mashable recently wrote about a Kickstarter project called HomePage, which aims to educate kids about the social Web. As the project’s website says: “Driver’s ed for the social web”.
This could be a much more effective solution than trying to shut down individual sites, especially since many of these suicides can actually be prevented thanks to social media, where teenagers choose to openly cry for help about their issues.
Top image credit: Thinkstock