A healthy digital ecosystem prioritizes community, truth, and mental well-being over engagement metrics and advertising revenue.
Social media was supposed to connect us. Remember that? A grand digital agora where ideas flowed freely, friendships flourished, and humanity reached new heights of understanding. Instead, we got doomscrolling, misinformation pandemics, and the occasional viral video of someone fighting a goose in a parking lot. (The goose usually wins.)
The problem with social media isn’t just what it does—it’s how it’s designed. Platforms like Facebook, Twitter, and TikTok aren’t neutral tools; they’re meticulously engineered to maximize engagement. And what drives engagement? Emotion. Specifically, outrage, fear, and envy. As Tristan Harris, a former Google design ethicist, explains in The Social Dilemma, “If you’re not paying for the product, you are the product.” Your attention is harvested, packaged, and sold to advertisers with the efficiency of a dystopian factory where the workers don’t even realize they’re on the assembly line.
Consider the role of algorithms. These aren’t passive lines of code; they’re active curators of reality. The content you see isn’t random—it’s selected because it keeps you scrolling, clicking, and, crucially, buying. A 2018 study published in Science found that false information spreads six times faster than the truth on Twitter. Why? Because lies are often more emotionally provocative, and the algorithm doesn’t care about facts; it cares about engagement.
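The dynamic is easy to see in miniature. The following is a toy sketch, not any platform’s actual ranking code: the `outrage_score` and `accuracy_score` fields are hypothetical signals invented for illustration. The point is structural—when the ranking objective is engagement alone, accuracy simply never enters the sort.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    outrage_score: float   # hypothetical: how emotionally provocative (0..1)
    accuracy_score: float  # hypothetical: how factually accurate (0..1)

def engagement_rank(posts):
    # An engagement-maximizing feed sorts on whatever predicts clicks.
    # Note that accuracy_score appears nowhere in the ranking key.
    return sorted(posts, key=lambda p: p.outrage_score, reverse=True)

feed = engagement_rank([
    Post("Careful fact-check of a viral claim", outrage_score=0.2, accuracy_score=0.9),
    Post("Outrageous (and false) rumor", outrage_score=0.9, accuracy_score=0.1),
])
print([p.text for p in feed])  # the false rumor ranks first
```

The false post wins not because anyone chose lies over truth, but because the objective function never asked.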
Social media is linked to rising rates of anxiety, depression, and loneliness, particularly among young people. In iGen, psychologist Jean Twenge details how the generation raised on smartphones reports unprecedented levels of mental health issues. It’s not just FOMO (fear of missing out); it’s the constant, algorithmically curated reminder that you’re not enough—not thin enough, rich enough, popular enough. Social comparison isn’t a bug of social media; it’s a feature.
Then there’s the issue of misinformation. The 2016 U.S. presidential election, Brexit, COVID-19 conspiracies—social media didn’t create these problems, but it amplified them at warp speed. As Zeynep Tufekci explores in Twitter and Tear Gas, social platforms can mobilize movements for justice, but can just as easily spread propaganda, radicalize extremists, and erode trust in democratic institutions.
Speaking of democracy, social media platforms wield power that rivals nation-states. They set the terms of public discourse, decide who gets heard, and can silence voices with the flick of an algorithmic switch. They’re private companies with more influence over global communication than the United Nations, yet their decision-making processes are opaque, unaccountable, and driven by profit.
One of the most insidious effects of social media is its impact on the human attention span. The constant stream of bite-sized content—tweets, reels, TikToks—fragments the mind, training it to crave instant gratification and shallow engagement rather than deep thought. Research suggests that heavy social media use correlates with a declining ability to focus, increased impulsivity, and a reduced capacity for complex reasoning. This isn’t an accident; it’s a byproduct of platforms designed to keep users scrolling indefinitely, rewarding quick emotional reactions over sustained contemplation. When every thought must be distilled into a character limit, when every argument is reduced to a meme, public discourse suffers. Nuance is lost. Critical thinking erodes. The result is a population more susceptible to simplistic narratives, less capable of patience, and increasingly polarized.
Another overlooked consequence is the reshaping of human memory itself. Before the digital age, people remembered what was meaningful—stories, personal experiences, important conversations. Now, social media floods the mind with ephemeral, algorithmically selected snippets of information, prioritizing engagement over significance. The brain, overwhelmed, offloads more and more of its memory onto external devices, trusting platforms to store what it once internalized. But these platforms are not neutral vaults of information—they manipulate what is stored, what resurfaces, and what disappears. A moment of personal significance can be buried under the noise of trending topics, while an impulsive mistake can be immortalized and weaponized. Social media doesn’t just distort the present—it reshapes how we recall the past, altering collective and individual memory in ways we barely understand.
The problem isn’t just the technology; it’s our relationship to it. The Confucian principle of the mean (zhongyong) teaches that excess leads to imbalance, whether it’s too much power, too much noise, or too much screen time. Social media, in its current form, is excess incarnate—a constant stream of stimulation that drowns out reflection, nuance, and stillness. The goal isn’t to destroy it, but to restore balance.
Consider platforms that prioritize community over clicks. Wikipedia, for example, operates without ads, relying on donations and volunteer contributions. It’s not perfect, but it’s a model of what the internet can be: collaborative, informative, and relatively free from corporate influence. Or Mastodon, a decentralized social network where communities set their own rules, challenging the idea that one algorithm should rule them all.
Even within mainstream platforms, small changes could have massive impacts. Chronological feeds instead of algorithmic ones. Limits on data collection. Transparent content moderation policies. Features designed to encourage logging off, rather than endless scrolling. These aren’t radical ideas; they’re design choices. The fact that they seem radical speaks to how deeply we’ve normalized digital addiction.
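How small a design choice? A chronological feed is literally one line of code that sorts by timestamp instead of by a model’s prediction. The sketch below is hypothetical—`predicted_engagement` stands in for whatever score a real platform’s model would produce—but it shows that the two feeds differ only in the sort key.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    text: str
    timestamp: datetime
    predicted_engagement: float  # hypothetical model output (0..1)

def chronological_feed(posts):
    # Newest first: no model, no personalization, nothing to game.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def algorithmic_feed(posts):
    # Engagement-optimized ordering, shown for contrast.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

now = datetime(2024, 1, 1, 12, 0)
posts = [
    Post("a friend's quiet update", now, 0.10),
    Post("week-old viral outrage", now - timedelta(days=7), 0.95),
]
print([p.text for p in chronological_feed(posts)])  # friend's update first
print([p.text for p in algorithmic_feed(posts)])    # viral outrage first
```

Same data, same code shape, opposite incentives—which is exactly why calling such changes “radical” says more about our norms than about the engineering.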
Therefore, under Folklaw:
Social media platforms shall be regulated as public utilities, with legal obligations to protect user well-being, data privacy, and democratic integrity. Algorithmic transparency will be mandatory, with users given the option to disable personalized content curation. Data collection beyond what is necessary for basic functionality will be prohibited, and all platforms must offer ad-free versions without manipulative engagement tactics.
Strict limits will be placed on political advertising, with real-time disclosure of funding sources. Content moderation decisions must be transparent, subject to public oversight, and include mechanisms for appeal.
Social media companies will be held legally accountable for amplifying harmful content, including hate speech, misinformation, and incitement to violence. Additionally, mandatory digital literacy education will be implemented in schools, empowering individuals to navigate online spaces critically and responsibly.