
This article was published on June 10, 2022

Virtual child sexual abuse material is almost as dangerous as the real thing

Although it depicts fictitious children, it can serve as a gateway to real child abuse


Child sexual abuse material (previously known as child pornography) can be a confronting and uncomfortable topic.

Child sexual abuse material refers specifically to images or videos containing sexual or offensive material involving children; related offenses include possessing, viewing, sharing, and creating such content.

But less publicized is another form of child sexual abuse material: virtual child sexual abuse material (VCSAM).

What’s virtual child sexual abuse material (VCSAM)?

VCSAM is sexual content depicting fictitious children in formats such as text, drawings, deepfakes, or computer-generated graphics. It’s also known as fictional child pornography, pseudo pornography, or fantasy images.


Recent technological advancements mean fictitious children can now be virtually indistinguishable from real children in child sexual abuse material.

Some offenders create VCSAM through morphing, a technique that digitally transforms real images into exploitative ones.

A non-sexual image of a real child could be visually altered to include sexual content. For example, an image of a child holding a toy could be altered to depict the child holding adult genitals.

Morphing can also happen in reverse, where an image of an adult is morphed to look like a child – for example, adult breasts altered to appear prepubescent.

Another type of VCSAM involves photo-editing multiple images to create a final, more realistic airbrushed image.

But what might be most troubling about VCSAM is that it may still feature images and videos of real children being sexually abused.

In fact, certain software can be used to make images and videos of real victims look like “fictional” drawings or cartoons.

In this way, offenders can effectively disguise a real act of child sexual abuse, potentially preventing law enforcement from bringing victims to safety.

It may also enable repeat offenders to avoid detection.

Why do some people engage with VCSAM?

There’s limited evidence revealing why some people might engage with VCSAM.

To learn more about this offending group, we recently investigated the possible psychological basis for people who engage with such material.

We discovered several potential reasons why offenders might use VCSAM.

Some used it for relationship-building.

Although this offending group is diverse, some offenders who use child sexual abuse material have been found to have limited intimate relationships and heightened loneliness.

Online communities of other deviant but like-minded people may therefore provide offenders with a greater sense of belonging, social validation, and support. Such interactions may also, in turn, serve as positive reinforcement for their criminal behavior.

Others may use this material to achieve sexual arousal.

It could be argued the material may also normalize the sexualization of children.

In fact, professionals in child welfare and law enforcement seem to share the concern that VCSAM may “fuel the abuse” of children by framing the offenders’ criminal behavior as acceptable.

Sometimes the material is used for “grooming”.

Adult offenders may show child sexual abuse material to children, breaking down the child’s inhibitions to falsely normalize the abusive act being depicted.

This is one form of grooming – that is, predatory conduct aimed at facilitating later sexual activity with a child.

Such material can also be used to teach children how to engage in sexual activities.

For example, offenders may use VCSAM to show children material depicting young – and, most alarmingly, happy – cartoon characters engaging in sexual activities.

An urgent cause for concern

Clearly, VCSAM is incredibly harmful.

It can be used to disguise the abuse of real children, as a gateway to “contact offending” against children (meaning abusing them in real life), and as a grooming technique.

Child welfare and law enforcement officials have sounded the alarm about the increasing creation and distribution of VCSAM for over a decade.

And it seems this problem will only escalate with the development of increasingly sophisticated software and digital technologies.

So while VCSAM remains illegal and offenders are frequently prosecuted, detecting – and ultimately preventing – these often obscure acts of abuse remains a challenge.

This article by Larissa Christensen, Senior Lecturer in Criminology & Justice | Co-leader of the Sexual Violence Research and Prevention Unit (SVRPU), University of the Sunshine Coast; Ashley Pearson, Lecturer in Law, University of the Sunshine Coast, and Dominique Moritz, Senior Lecturer in Law, University of the Sunshine Coast, is republished from The Conversation under a Creative Commons license. Read the original article.
