CONTENT WARNING: This article discusses disordered eating.

A joint investigation by Four Corners and Triple J Hack has highlighted how TikTok can expose users to pro-eating disorder content with real-life harms, while appearing to ‘shadow-ban’ other content it deems harmful.

While personalised feeds are nothing new, the investigation found that TikTok’s particularly powerful algorithm had pushed some young Aussies into developing eating disorders.

“What really sets TikTok apart […] is just how accurate and how up-to-the-minute this For You page recommender system seems to be,” QUT researcher Dr Bondy Kaye told Four Corners.

“It is very hard to break that cycle, and it’s by design that you never really get to the end of the content.”

The investigation spoke with two young Aussie women who had pro-eating disorder content repeatedly appear in their feeds.

“Before TikTok, calorie counting had never crossed my path,” one 19-year-old told Four Corners. She developed an eating disorder four months after downloading the app.

Another 22-year-old who had been in and out of hospital over the past five years said the app contributed to her relapsing.

“As I got sicker and I got more obsessive, all I could do was just flick through my phone and look at this footage,” she said.

“I spent hours on it and just fixated on it.”

TikTok has mechanisms in place to stop the spread of this kind of content. For example, searching for terms related to eating disorders doesn’t return any actual videos, and instead links to the Butterfly Foundation’s helpline.

“Our teams consult with NGOs and other partners to continuously update the list of keywords on which we intervene,” a TikTok spokesperson told the ABC. The app also bans “content depicting, promoting, normalising, or glorifying activities that could lead to suicide, self-harm, or eating disorders.”

Here’s what happens when you search for ED-related keywords on TikTok. (Supplied)

However, there are ways to get around these safety mechanisms, such as deliberate misspellings and coded language. That’s how this kind of harmful content ended up on those two users’ For You pages.

The flip side of this content moderation, the investigation found, was that the app seemed to have no problem hiding content that constructively discussed issues like racism and disability.

Perth TikToker Unice Wani (@unicewani) has over 595,000 followers. However, she said her videos didn’t perform well when she spoke about race and racism as a Black woman.

“You tend to get a lot of shadow bans for speaking up about stuff such as racism,” Wani told Four Corners.

“I guess they focus more on the white girls dancing and stuff like that.”

Sydney-based TikToker Paniora Nukunuku (@pnuks), who has around 190,000 followers, said “it definitely feels like TikTok has some preference on what content should be posted on the platform” when he discusses things like race or his disability.

Earlier this year, a bunch of Black creators, mainly in the US, went on strike, fed up with white TikTokers building success off their dance moves without seeing that same success themselves.

Again, this isn’t unique to TikTok – many social media platforms, most notably TikTok’s spiritual predecessor Vine, reflect the biases present in society – but it’s the unavoidable backdrop to these ongoing discussions about what kind of content is allowed and what seemingly isn’t.

You can read the full investigation into “the TikTok spiral” over at the ABC. The episode will air on the ABC tonight at 8:30pm AEST.


If you need support, give Butterfly Foundation a call on 1800 33 4673 or chat online.

If you are in distress, please call Lifeline on 13 11 14 or chat online. 

Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.

Image: Getty Images / Future Publishing