The popular app is setting a dangerous precedent with its hyper-sexualized and thin-obsessed content, readily available for young girls to absorb
Trigger warning: this article contains information about eating disorders.

In the last week, I have spent far too much time on TikTok.
At first, it was fun: cute dog videos, awesome hair and clothing hacks, and #relatable content. But that was the happy, feel-good surface of the app. The fact is, TikTok is rife with degrading, hyper-sexual, thin-obsessed content that is far too accessible to, well, anyone.
It seems that these days, everyone from your 14-year-old neighbour to your real estate agent is on the app. In fact, as of last summer, it had over 800 million users. Sure, it’s great for a lot of things, and is definitely fun to use, but that doesn’t change the fact that the app is facing a serious problem.
Last year, the media exploded with reports of TikTok being riddled with pro-eating disorder (ED) content. While TikTok representatives told the BBC that the health and safety of its users is a top priority, the company encourages users to report "any inappropriate content," with anything seeking to glorify or promote eating disorders violating its guidelines. This essentially dumps the responsibility onto the users. But do the people who would be negatively influenced by this content actually report it?
Beat, a UK-based charity dedicated to eating disorder recovery and awareness, told the BBC it would advocate for having real people search for and take down harmful content on the app, something TikTok has yet to do.
Last year, The Guardian published an investigation into TikTok videos that promoted starvation and anorexia, reporting that even after TikTok had banned certain hashtags, its journalists could still easily find pro-ED content by "putting the same words into a search for profiles" without the hashtag. In fact, finding that content takes little more than 30 seconds.
TikTok responded by saying they planned to ban all harmful phrases, including those that could be found when searching for users, and they did follow through with this.
Today, if you search, for instance, “pro-ana” (a pro-anorexia term that originated from the dark web) or “thinspo,” all that comes up are support resources for eating disorders. If you search “what I eat in a day,” videos still appear, but with a disclaimer at the top, notifying users that TikTok values its users’ safety, and they reference the ED resources again.
But this is where it gets tricky. Content that isn't outwardly promoting EDs or negative body image still shows up. For instance, there are "what I eat in a day" videos that clearly show too little food (a small smoothie, a couple of strawberries and a couple of carrots, a burrito), and others that tout consuming only 1,200 calories per day, which falls below the recommended intake for most adults and can be very unsafe.
However, if you search for words like "skinny," hashtags like "skinny girl check" come up, presenting an endless stream of videos showing small and XS clothing sizes, girls wrapping their fingers around their wrists and ankles to show how small they are, and girls fitting into children's clothes. In fact, the first video I saw featured a 24-year-old woman who claims to be 160 cm tall and only 77 lbs. According to the National Heart, Lung, and Blood Institute, the absolute lowest healthy weight for someone of that height is about 110 lbs.

So while TikTok may have cracked down on direct ED-related content, videos that promote negative body image and unhealthy, disordered eating are still out there and readily available.
Here is why that is especially concerning: according to the BBC, just over 40 per cent of users are between 16 and 24. As of 2021, Wallaroo Media estimated that 32 per cent of users in the U.S. are between 10 and 19, that users spend an average of 52 minutes on the app per day, and that kids aged four to 15 spend 80 minutes per day on it.
That means a vast amount of this content is being shown to children and teenagers for over an hour, every single day.
It’s also important to note that the For You page algorithm means users don’t have to actively search this stuff — if they look at something out of curiosity, or even by accident (for instance, a video that touts healthy eating but is, in fact, not), that content will continue to be suggested to them.
Users are able to indicate if they want to see less of a certain type of content, which removes it from their feed. But any content that doesn’t outwardly promote an ED will slip through the cracks, so this method doesn’t work well.
Other social media platforms are, of course, problematic as well. For instance, Facebook was recently criticized after users suffering from EDs were bombarded with dangerous dieting content. Over the years, studies have also shown that Facebook usage is often correlated with negative feelings and increased self-consciousness.
In 2019, Facebook did block certain weight loss content from users under 18, though, like TikTok, content is still able to slip through the cracks.
Critics argue that Instagram isn't doing enough to regulate content either; in fact, studies have linked Instagram use to an increased risk of orthorexia, an eating disorder involving an unhealthy obsession with healthy eating that can take over and negatively impact a person's life.

With Snapchat, users are given control over who can see their content and who can contact them, through built-in parental controls. It will also prevent certain content from being shown to young users.
The issue with many of these other social apps is largely the same as with TikTok: face-augmenting filters, body-editing tools, and the like. Those, obviously, haven't gone away at all; even Zoom has adopted an editing feature so you can touch up your face.
However, the prevalence of harmful or potentially harmful content on TikTok is more concerning when it comes to youth, as it has become the most popular app (next to Snapchat) among teenagers in the U.S. It could also be argued that TikTok is more appealing to teenagers and children, simply because it’s fun: compared to other apps like Instagram and Facebook, it’s almost like a game, where you can participate in dance challenges and test your acting or lip-syncing chops.
Body positive content creators have also reported having their videos removed, for instance for showing cellulite or fat rolls.
Danae Mercer is a journalist and body positive Instagrammer who has had content removed from TikTok repeatedly.
On her Instagram and Twitter, she shared some of the more sickening TikTok challenges and trends she had found, including videos where girls wrap headphones around their waists to check their size, teenage boys talking about choking girls and only being attracted to girls who weigh under a certain amount, and 13-year-olds sharing their BMIs; the list goes on. One video featured a young girl who posted a poll where users could rate whether or not she was "pretty" as she danced around her room.
Mercer shared stories of numerous content creators who are larger (read: have any amount of fat on their bodies) and have had their videos taken down for “breaching community guidelines,” despite being the exact same videos that thin girls and women are posting, which stay up. As she wrote in one tweet, “The platform doesn’t censor bikinis, but it sure as heck censors certain bodies.”
In another, she wrote “what a dangerous, toxic space we are creating for our littles. We are showing them bodies but only a very particular type. We are telling them only one look is ok, normal, worthy. We are silencing all others.”
Not to mention that critics have deemed TikTok a hunting ground for sexual predators: the more they interact with videos of children (say, dancing sexually), the more such videos show up on their For You page, making it easier for them to find victims. Parents have even reported learning that their young children were being groomed by pedophiles on the app, according to the online publication Evie.
Evie reported that parents have also found their young daughters (as in, 11-year-olds) posting videos calling themselves "sluts" and using very sexually explicit language.
In my own time navigating these darker recesses of TikTok, I saw comments from girls under 15 using that same sexually explicit and degrading language.
Now, if young girls are spending an average of 80 minutes a day on the app, and what they see is video after video of thin girls oversexualizing themselves and bragging about their dangerously low weight, young boys discussing their degrading sexual exploits, and videos presenting undereating as healthy, how do you think they're going to view the world?
The media we consume shapes us and our realities, and if this is the content we’re seeing day in and day out, with little to no variation, that is the reality we are creating for ourselves, and more importantly, for young girls who may have difficulty differentiating the false nature of social media from real life.
So the question becomes, how do we balance the good with the dangerous content? It’s hard to say, and it’s certainly not a one-size-fits-all fix. However, until the app brings in real people to monitor the content and differentiate between the good and bad, the responsibility falls on us. We can actively report any inappropriate content and be mindful about the content we engage with and post; or, maybe ditch TikTok altogether until they find a way to make it a safe and positive environment.
For eating disorder recovery resources visit nedic.ca.
A version of this article appeared in print in The Ontarion issue 190.4 on March 25, 2021
