Seventeen-year-old Katya Jaski is in recovery for anorexia after struggling with the condition for more than three years. But every time she opens her TikTok app, she finds the battle much harder.
“You can just be scrolling and come across pro-eating disorder content. It can be disguised in different forms as well, which makes it hard to regulate,” she says.
Katya Jaski says social media often shows her content that makes her anorexia recovery harder – even though she tries to block it.
Sometimes it’s people counting calories; other times it’s people who say they are in recovery but remain in a very unwell state. She finds that content the most challenging.
“It fuels this thought: ‘They’re in recovery and they’re still really skinny, why am I not really skinny in recovery?’ You have to really stop yourself,” Jaski says.
Even when she tries blocking content or unfollowing accounts, the videos return.
“As someone trying every day to already battle my own thoughts, it’s so unhelpful – there’s no way for me to say definitely: I don’t want any of this in any of my feeds.”
Jaski will share her thoughts on Thursday at a roundtable convened by independent MP Zoe Daniel to look at what the government and eSafety agency can do to improve protections for people with an eating disorder on social media platforms such as Instagram and TikTok.
They will be presented with a preview of a Melbourne University study that found users with eating disorders are shown far more eating disorder, dieting and appearance-related content than other users, which can hinder their recovery.
The study, which analysed about 1 million TikTok videos delivered to individuals with and without eating disorders over a month, found the former were delivered a median of 497 appearance-oriented, 460 dieting-oriented and 461 exercise-oriented videos every week.
Users with eating disorders were shown 3850 per cent more videos on the topic than those without, as well as 384 per cent more dieting content, 222 per cent more appearance-oriented videos and 78 per cent more exercise videos.
Both groups were less likely to “like” those types of videos relative to other content on the app, which the study said showed “it is not the case that users are simply ‘liking’ their way into harmful echo chambers”.
A second study, of people who created fresh TikTok accounts with untrained algorithms, found it took just three minutes on average for them to be shown an appearance-oriented video. The new users were then delivered about 19 of those videos within their first 30 minutes.
“Taken together, our findings strongly implicate the TikTok algorithm in the intensification of eating disorders among young people,” the paper’s abstract said.
Daniel said social media was compounding the difficulties faced by the more than 1 million Australians who have experienced eating disorders, and by their families.
“No company in any industry wants to be regulated. I understand, too, there are a lot of grey areas when you start trying to manage what’s on social media,” she said.
“But increasingly the evidence and data is showing that [social media] is compounding disorders. There are so many issues affecting our young people: COVID, lockdowns, time spent in the tunnel of social media. We can’t just throw up our hands. We all need to say: What can we do about it?”
A TikTok spokesperson said it did not allow content that normalised or glorified eating disorders and had put safeguards in place.
“This includes banning ads for fasting apps and weight-loss supplements, and putting restrictions on ads that promote harmful or negative body image,” she said.
“Anyone searching for terms directly relating to eating disorders is shown information from [eating disorder charity Butterfly Foundation], including helpful and appropriate advice.”
But while platforms have their own safety measures, they are challenged by a high volume of users, fast-moving trends and hashtags designed to evade bans on particular terms.
The roundtable will probe options including empowering the eSafety commissioner to remove content promoting eating disorders; a new industry code or standard; stronger industry self-regulation; or tasking a government working group to come up with recommendations.
Daniel said online safety was a space where the federal government had a clear remit to intervene.
But she said more work was needed at both levels of government to improve treatment pathways.
“There are a lot of people on the cliff who need somewhere to go to get help and avoid the revolving door of hospitalisation, forced feeding and poor mental health,” she said.
Butterfly National Helpline 1800 33 4673