A recent report by the Center for Countering Digital Hate (CCDH) details how YouTube's recommendation algorithm actively promotes harmful content to users interested in diet and weight loss. The study, titled YouTube's Anorexia Algorithm, found that nearly 70% of videos YouTube recommends to such users are likely to exacerbate body-image anxieties. These videos, averaging 344,000 views each, often glamorize eating disorders, promote weight-based bullying, or encourage unhealthy behaviors, and many carry ads from major brands such as Nike, T-Mobile, and Grammarly.
Findings of the CCDH study
Imran Ahmed, founder and CEO of CCDH, criticized the platform, stating, “Kids today are essentially reeducated by algorithms, by companies teaching and persuading them to starve themselves.”
The researchers created a YouTube profile for a hypothetical 13-year-old girl, ran 100 searches using popular eating disorder keywords, and analyzed the first 1,000 "Up Next" recommendations (a rough tally of such an audit is sketched after the findings). Key findings include:
Content push
Almost two-thirds of recommended videos intensified the focus on eating disorders or extreme weight loss.
Harmful recommendations
One-third of the videos were deemed harmful, featuring glorification of eating disorders, bullying, or imitable harmful behavior.
Self-harm content
Fifty of the flagged videos included themes of self-harm or suicide.
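To give a rough sense of how an audit like this might be tallied, here is a minimal sketch assuming a hypothetical CSV of the 1,000 labeled recommendations. The file name, column names, and label values are all illustrative, not from the report, and the sketch treats labels as mutually exclusive, which the report does not state.

```python
# Hypothetical tally of an algorithmic-audit dataset: assumes a CSV of
# labeled "Up Next" recommendations with columns "video_id" and "label",
# where label is one of: benign, ed_or_weight_loss, harmful, self_harm.
import csv
from collections import Counter

def tally(path: str) -> None:
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    counts = Counter(row["label"] for row in rows)  # missing keys count as 0
    total = len(rows)
    pushed = counts["ed_or_weight_loss"] + counts["harmful"] + counts["self_harm"]
    flagged = counts["harmful"] + counts["self_harm"]
    print(f"Recommendations analyzed: {total}")
    print(f"Intensified ED/weight-loss focus: {pushed / total:.0%}")
    print(f"Deemed harmful: {flagged / total:.0%}")
    print(f"Self-harm or suicide themes: {counts['self_harm']} videos")

tally("up_next_recommendations.csv")  # hypothetical file name
```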
Despite YouTube's policies prohibiting such content, the study found that harmful videos often bypass restrictions through altered search terms, such as replacing letters with symbols.
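This evasion technique is easy to demonstrate. The sketch below, using an illustrative blocklist and substitution map rather than YouTube's actual (non-public) filtering, shows how an exact-match blocklist misses a symbol-substituted variant while a basic normalization pass catches it.

```python
# Why exact-match keyword blocking fails: substituting symbols or digits
# for letters ("th1nspo") slips past a literal blocklist. Folding common
# substitutions back to letters catches simple variants. Both the
# blocklist and the substitution map here are illustrative.
BLOCKLIST = {"thinspiration", "thinspo"}

LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})

def is_blocked_naive(query: str) -> bool:
    return query.lower() in BLOCKLIST

def is_blocked_normalized(query: str) -> bool:
    return query.lower().translate(LEET_MAP) in BLOCKLIST

print(is_blocked_naive("th1nspo"))       # False: variant evades the filter
print(is_blocked_normalized("th1nspo"))  # True: normalization catches it
```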
YouTube algorithm case studies
Anna Mockel, now 18, recalls how YouTube perpetuated her struggle with anorexia. At 14, she was drawn to videos of skinny girls, which soon led the platform to recommend extreme dieting content. "YouTube became this community of people who are competitive with eating disorders," she shared, emphasizing how the platform normalized her harmful behavior.
Similarly, a 17-year-old plaintiff in a lawsuit against YouTube revealed how her feed transitioned from benign content to a stream of extreme diet and weight-loss videos. Over time, she became immersed in eating disorder culture, which severely impacted her health and education. “It’s just taken my life away pretty much,” she reflected.
Both cases highlight the real-world consequences of YouTube’s algorithmic recommendations, which experts say are designed to maximize user engagement, often at the expense of mental health.
Criticism of YouTube's algorithmic practices
James P. Steyer, founder of Common Sense Media, warned against allowing platforms to “experiment on new generations” with potentially harmful algorithms. Despite YouTube’s claims of working with mental health experts and refining its policies, the CCDH study found significant gaps in enforcement.
For instance, out of 100 flagged harmful videos, YouTube removed or age-restricted only 18. Additionally, while some search terms like “thinspiration” are blocked, users can easily circumvent restrictions by using variations of the terms.
“YouTube, owned by Google, is violating its own policies by allowing this content,” Ahmed asserted, pointing to the systemic nature of the problem.
Restricting eating disorder-related content
In April 2023, YouTube updated its policies to restrict content related to eating disorders and self-harm for users under 18. However, the CCDH report suggests that these measures are insufficient, as harmful content continues to proliferate on the platform.
The Social Media Victims Law Center has also taken action, filing thousands of lawsuits against platforms, including YouTube, for perpetuating eating disorders. These lawsuits allege that social media companies intentionally design algorithms to be addictive, prioritizing profits over user safety.