Meta Is Partnering With A New Service That Hunts Down & Removes Leaked Nudes Off The Internet

Meta partners with Take It Down to scrub exploitation material off social media

In news that will come as a relief to Aussie teenagers frantically trying to get their leaked nudes off the internet, Meta is partnering with a service designed to do exactly that.

By now, there’s no shortage of horror stories about the image-based abuse young Aussies face, be it nudes being leaked without consent, explicit pics accidentally shared with the wrong person, or non-consensual deepfake porn. The issue is growing by the day, and few countries’ laws are keeping up.

Here’s where Take It Down comes in.

Take It Down is a new service developed by the US National Center for Missing & Exploited Children which helps people under the age of 18 get sexually explicit content of themselves taken off the internet. It can also be used by adults worried that pictures or videos of themselves from when they were under 18 are circulating online.

The platform is American and was built for US users, but by partnering with it, Meta is opening the service up to millions more people, including Aussies.

How it works is kinda strange though.

Basically, Take It Down generates a unique code (a hash) from your image on your own device, a bit like a digital fingerprint, so the image itself is never uploaded to the service.

That code is then shared with participating platforms like Meta’s Facebook and Instagram, as well as Pornhub and other sites, which find and remove any content that matches it.
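In spirit, that hash-matching step looks something like the simplified Python sketch below. It uses an exact cryptographic hash for illustration; the real service’s hashing is more sophisticated (robust image hashes can tolerate small edits), and the function names and data here are invented, not Take It Down’s actual code.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Hash the raw image bytes on the user's device; only this
    # fingerprint string, never the image itself, gets shared.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_uploads(uploads, reported_hashes):
    # A platform compares each upload's fingerprint against the
    # list of reported hashes and flags any exact match.
    return [name for name, data in uploads
            if fingerprint(data) in reported_hashes]

# Hypothetical example: one image is reported, then uploads are scanned.
reported = {fingerprint(b"private-photo")}
uploads = [("post_1", b"holiday-pic"), ("post_2", b"private-photo")]
print(scan_uploads(uploads, reported))  # ['post_2']
```

The key design point is that the platform only ever sees the fingerprint, not the picture, which is why the service can check for matches without anyone re-uploading the sensitive image.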

And yes, the code can also work for deepfakes, which is good to know given nonconsensual deepfake porn is on the rise.

The move is a huge leap in the right direction from Meta, but Australia’s eSafety Commissioner Julie Inman Grant reckons more can still be done.

“The service relies on user-reporting, rather than the companies proactively detecting image-based abuse or known child sexual exploitation and abuse material,” she told the ABC.

“We maintain the view that companies need to be doing much more in this area.”

It’s also perhaps a little disappointing that the service only covers child exploitation material, because plenty of adults, particularly women, are also victims of abuse material, especially when it comes to deepfakes and “revenge porn”.

Let’s hope that this service will lead to a rise in services for all people who experience image-based abuse.

Recording, sharing, or threatening to share an intimate image without consent is a criminal offence across much of Australia. 

If you’d like to report image-based abuse to police or get help removing an intimate image from social media, reach out to the Australian eSafety Commissioner here.

If you’d like to speak to someone about image-based abuse, please call the 1800 Respect hotline on 1800 737 732 or chat online.

Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.
