In the dystopian circus of modern social media, a grotesque spectacle unfolds: "What I Eat as a Fat Person" videos, where obese individuals devour thousands of calories of ultra-processed junk—soda, fries, cakes—while platforms like YouTube and TikTok profit from ads and algorithmic boosts.
This isn’t entertainment; it’s the monetization of self-harm, glamorizing binge eating and chronic disease. Social media companies aren’t bystanders—they’re accomplices, amplifying this content for engagement while pocketing the cash. This is a moral and legal outrage. The evidence is clear, the harm is real, and these platforms must face civil and criminal liability now.
The Toxic Genre
These videos feature obese creators—often with BMIs over 40—consuming 5,000 to 10,000 calories in one sitting: fried chicken, sugary soda, donuts, all eaten with a grin. Monetized via ads and platform payouts, they’re a cash cow for creators and companies alike. Science shows this isn’t harmless fun—it’s self-destruction. Ultra-processed foods, high in sugar and fat, drive obesity, diabetes, and heart disease (Monteiro et al., 2019). A 5,000-calorie binge is two to two-and-a-half times a typical adult’s daily needs of 2,000 to 2,500 calories, fueling visceral fat and metabolic ruin (Hall et al., 2019). Social media hands creators the tools to harm themselves and collects the profits.
Algorithms as Accomplices
Platforms design algorithms to maximize "time on site," pushing these videos into feeds based on engagement—likes, shares, comments (Anderson et al., 2021). YouTube earns from ad views; TikTok pays creators for viral hits. The more shocking the calorie count, the more money flows. Leaked internal Meta documents show Instagram prioritizes sensational content, even when harmful, to drive retention (Wells et al., 2021). These platforms don’t just host self-harm—they incentivize it, turning creators into pawns in a profit-driven game where health is the loser.
The Human Toll
Beyond creators, these videos normalize binge eating disorder (BED), a condition marked by uncontrollable overeating (APA, 2013). A creator eating 8,000 calories models behavior tied to shame and ruin, with ultra-processed foods triggering addiction-like responses (Gearhardt et al., 2011). Viewers—especially youth—mimic this under "body positivity," a guise that obscures the truth: a waist-to-height ratio over 0.6, common in these creators, doubles premature death risk (Browning et al., 2010). Platforms romanticize obesity, hiding its toll—inflammation, insulin resistance, cancer (Lauby-Secretan et al., 2016)—while cheering each bite toward collapse.
Legal Accountability
Civilly, platforms are negligent: they owe users a duty of care, breached by promoting harmful content for profit, causing predictable harm—obesity, BED, illness. Section 230 may not shield them when they curate and monetize, a question the Supreme Court left open in Gonzalez v. Google (2023). A harmed viewer or exploited creator could sue: platforms knew the risks and cashed in anyway. Criminally, this is reckless endangerment—pushing content that foreseeably leads to bodily harm or death meets the "conscious disregard" for life standard (e.g., People v. Watson, 1981). A CEO in cuffs would wake them up.
Weak Defenses
"Free speech"? Monetized self-harm isn’t protected—it’s commerce, and incitement to illegal acts like BED loses First Amendment cover (Brandenburg v. Ohio, 1969). "Personal responsibility"? Platforms exploit addiction to likes and algorithmic nudges, eroding choice (Zuboff, 2019). "Not our job"? They ban nudity and hate speech—obesity kills more (WHO, 2020). It’s not inability; it’s greed.
Ethical Collapse
Ethically, this is predation. Platforms profit as creators head toward disability and viewers spiral into disordered eating. Their algorithms whisper "eat more, post more, die faster" for ad dollars—a modern Milgram experiment with ad revenue as the shock button. This isn’t capitalism; it’s exploitation.
A Reckoning
Social media must face justice. Civil suits—class actions, billions in damages—should cripple them. Criminal charges for reckless endangerment should follow, with executives held accountable. Section 230 must be gutted for monetized harm; algorithms must be reined in. Every such video should be flagged, demonetized, banned—until the profits vanish. This isn’t a request; it’s a demand. Platforms are killing people for cash, turning self-harm into a viral product. The science is settled, the harm is real—hold them liable, or watch society pay the price.
Wrong Speak is a free-expression platform that allows varying viewpoints. All views expressed in this article are the author's own.
References (Condensed):
APA. (2013). DSM-5.
Anderson et al. (2021). New Media & Society.
Browning et al. (2010). Nutrition Research Reviews.
Gearhardt et al. (2011). Archives of General Psychiatry.
Hall et al. (2019). Cell Metabolism.
Lauby-Secretan et al. (2016). NEJM.
Monteiro et al. (2019). Public Health Nutrition.
Wells et al. (2021). WSJ.
WHO. (2020). Obesity Facts.
Zuboff, S. (2019). Surveillance Capitalism.
Reader Comments
Well, watching porn is an act of self-harm when it leads to addiction. Good luck regulating that. You can’t regulate people out of behaviors, and arresting CEOs isn’t going to improve people’s mental or physical health. It just doesn’t work.
While I appreciate your assessment, and the exploitation is real, it ignores that this is often family abuse. No one becomes grossly obese without someone enabling them. You might become obese on your own, but you can’t reach the sizes described here without help. And wouldn’t the analysis be the same for someone drinking themselves to death? Unfortunately, if your solution came to pass, it would be seized upon by bad actors and used to do more harm than good.