Sextortion Against Young People Is On The Rise W/ Image-Based Abuse Spiking By 117%

Image-based abuse is on the rise, especially sextortion.
CONTENT WARNING: This article discusses image-based abuse.

Young people have reported an increase in sextortion on popular social media apps and the stats are harrowing.

In case you haven’t come across the term before, sextortion is a form of image-based abuse in which someone is approached online by a stranger, who initiates a conversation and escalates it until the two begin to share sexual images or videos. The stranger then blackmails the person with these images, threatening to publish them or send them to family and friends if their demands are not met.

According to the eSafety Commissioner, the majority of sextortion victims are aged 18 to 24 years old — though a growing number of children are also making reports — and people are targeted most on Instagram and Snapchat.

“Kids are caught up in it really quickly,” Frank Rayner, AFP’s acting commander of the Australian Centre to Counter Child Exploitation (ACCCE), said per Guardian Australia.

“It’s not uncommon for it to be a time period of only 20 to 30 minutes between the first contact, images being sent and then demands being made.”

He said the majority of perpetrators were not operating from Australia and were unknown to their victims.

According to the eSafety Commissioner, more than 9000 instances of image-based abuse have been reported in the last year, which is a 117 per cent increase on the year before. Alarmingly, more than two thirds (68 per cent) of these instances were related to sextortion.

Last December, reports of sextortion to the ACCCE spiked by 60 per cent. The AFP said in a statement that the ACCCE receives about 300 reports per month from young people aged under 18, and that it was expecting this number to spike again in December. It also noted that despite these horrifically high numbers, it’s likely only about 10 per cent of people actually report this abuse.

Rayner said it was imperative children know they aren’t committing a crime if this happens to them, and that they can report it to authorities who will help them, like the police and the eSafety Commissioner.

Given this also happens on Instagram and Facebook, Meta has introduced a new feature that scrubs child exploitation material off the web, in an effort to help young people combat image-based abuse.

Its algorithm creates a code from a victim’s images, sends that code to Facebook and Instagram as well as other websites like PornHub (which has been known to harbour child abuse material), and then deletes content that matches.
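For a rough sense of how that kind of matching works: a “digital fingerprint” (a hash) is generated from the victim’s image, and only that fingerprint — not the image itself — is shared with participating platforms, which compare it against uploads. The Python sketch below is purely illustrative and is not Meta’s actual system; real tools use perceptual hashes that survive resizing and re-compression rather than a simple file hash, and the `reported_hashes` set and file paths here are made up.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprints a victim has submitted for their own images.
# In practice, only these hash values are shared with platforms, never the images.
reported_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(path: Path) -> str:
    """Return a SHA-256 fingerprint of an image file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def should_remove(path: Path) -> bool:
    """Flag an uploaded file for removal if its fingerprint matches a reported one."""
    return image_fingerprint(path) in reported_hashes
```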

While it can’t undo the trauma of abuse, it’s a great step towards helping young people take action when they might otherwise not feel comfortable going to the authorities.
