In extremely shitty news, an Aussie influencer living in the US says someone used a photo of her without her permission to make an NFT with a trademarked porn logo.

Alanah Pearce is a writer for the video game developer Sony Santa Monica Studio and a personality within the gaming media space. On Sunday she said on Twitter that she had been informed that a porn site had used a photo of her to make an NFT with their trademarked logo and profit off it. Obviously, she was never asked for permission.

In a conversation with PEDESTRIAN.TV, she explained that the photo used was a picture taken to promote her position as an award presenter for a prestigious gaming awards ceremony. Now it’s being sold on OpenSea, an NFT marketplace, under Blacked.com.

“I found out about this when someone left a YouTube comment on one of my videos warning me about it,” she explained.

“They said they were prompted to search my name on OpenSea, which is currently the largest NFT marketplace, after a number of YouTube content creators tweeted that they had discovered their YouTube channels and logos were being sold on the site.”

James Stephanie Sterling, a popular video game YouTuber and Twitch streamer, was also targeted. Here’s their tweet about it from this week.

Pearce added that while she’s experienced an overwhelming number of people publishing fake pornography of her, in most cases sites like PornHub are “very friendly” regarding the removal of deep fakes. OpenSea, meanwhile, has a “history of ignoring takedown requests”.

“A very disturbing part of the current sales pitch for NFTs is that they are ‘unregulated’. That’s encouraging a lot of theft.”

“Theft in NFTs is already rampant and undeniably increasing as more bad actors look to the technology as a ‘get rich quick’ scheme,” Pearce added.

As proof of this, a user on Twitter has been documenting the almost daily instances where artists have noticed their content stolen and sold as an NFT. That account attributes most of the issues on OpenSea to the fact the marketplace is allegedly monitored by bots rather than human workers.

Pearce thinks more needs to be done to stop those who produce these kinds of NFTs.

“I’m not at all surprised that part of the community overlaps with the community who spends their free time trying to sexually harass women, and don’t doubt the ‘minting’ of this kind of content will increase before it is properly regulated.”

Pearce clarified that while she has no plans to take legal action regarding this particular NFT, there’s a real issue percolating online regarding the theft and scams within the digital currency space.

“They argue that they’re in favour of artists, yet repeatedly defend those caught blatantly stealing, minting and selling art,” she explained.

“The regulations will come — because they’re clearly not self-regulating effectively — and those who have scammed people out of thousands of dollars will be made an example of.

“I think that action is more likely to come from a company who can afford the time and cost of a lawsuit than from an individual, though.”

In a statement shared with PEDESTRIAN.TV, a spokesperson for OpenSea said: “It is against our policy to sell NFTs that violate the publicity rights of others. We regularly enforce this in multiple ways, including delisting and banning accounts when we are notified that usage of a likeness is not authorized.

“Furthermore, we have a zero-tolerance policy for NCII (non-consensual intimate imagery). NFTs using NCII or similar images (including images doctored to look like someone that they are not) are prohibited, and we move quickly to ban accounts that post this material. We are actively expanding our efforts across customer support, trust and safety, and site integrity so we can move faster to protect and empower our community and creators.”

Image: Alanah Pearce