YouTube Algorithms Have Alarming Impact On Teen Girls, Push Eating Disorders, Study Finds

YouTube algorithms are under scrutiny for promoting harmful content to teens, fueling eating disorders and body image anxieties. A new report reveals how the platform's recommendations intensify exposure to dangerous diet trends and self-harm content.


A recent report by the Center for Countering Digital Hate (CCDH) has found that YouTube's algorithms actively promote harmful content to users interested in diet and weight loss. The study, titled YouTube's Anorexia Algorithm, found that nearly 70% of videos YouTube recommends to users exhibiting such interests are likely to exacerbate body image anxieties. These videos, averaging 344,000 views each, often glamorize eating disorders, promote weight-based bullying, or encourage unhealthy behaviors, and many feature ads from major brands such as Nike, T-Mobile, and Grammarly.

Findings of the CCDH study

Imran Ahmed, founder and CEO of CCDH, criticized the platform, stating, “Kids today are essentially reeducated by algorithms, by companies teaching and persuading them to starve themselves.”

The researchers created a YouTube profile for a hypothetical 13-year-old girl, conducted 100 searches using popular eating disorder keywords, and analyzed the first 1,000 "Up Next" recommendations. Key findings include:

Content push

Almost two-thirds of recommended videos intensified the focus on eating disorders or extreme weight loss.

Harmful recommendations

One-third of the videos were deemed harmful, featuring glorification of eating disorders, bullying, or imitable harmful behavior.

Self-harm content

Fifty of the flagged videos included themes of self-harm or suicide.

Despite YouTube's policies prohibiting such content, the study revealed that harmful videos often bypass restrictions through altered search terms, such as substituting letters with symbols.

YouTube algorithm case studies

Anna Mockel, now 18, recalls how YouTube perpetuated her struggle with anorexia. At 14, she was drawn to videos of skinny girls, which soon led to extreme dieting content recommended by the platform. "YouTube became this community of people who are competitive with eating disorders," she shared, describing how the platform normalized her harmful behavior.

Similarly, a 17-year-old plaintiff in a lawsuit against YouTube revealed how her feed transitioned from benign content to a stream of extreme diet and weight-loss videos. Over time, she became immersed in eating disorder culture, which severely impacted her health and education. “It’s just taken my life away pretty much,” she reflected.

Both cases highlight the real-world consequences of YouTube’s algorithmic recommendations, which experts say are designed to maximize user engagement, often at the expense of mental health.

Criticism of YouTube's algorithmic practices

James P. Steyer, founder of Common Sense Media, warned against allowing platforms to “experiment on new generations” with potentially harmful algorithms. Despite YouTube’s claims of working with mental health experts and refining its policies, the CCDH study found significant gaps in enforcement.

For instance, of 100 flagged harmful videos, YouTube removed or age-restricted only 18. And while some search terms like "thinspiration" are blocked, users can easily circumvent the restrictions by using variations of those terms.

“YouTube, owned by Google, is violating its own policies by allowing this content,” Ahmed asserted, pointing to the systemic nature of the problem.

Restricting eating disorder-related content

In April 2023, YouTube updated its policies to restrict content related to eating disorders and self-harm for users under 18. However, the CCDH report suggests that these measures are insufficient, as harmful content continues to proliferate on the platform.

The Social Media Victims Law Center has also taken action, filing thousands of lawsuits against platforms, including YouTube, for perpetuating eating disorders. These lawsuits allege that social media companies intentionally design algorithms to be addictive, prioritizing profits over user safety.
