Instagram debuted in 2010 and quickly became popular as a place to share photos with friends. Facebook acquired the company just two years later, keeping it a juggernaut against new competitors vying for young people’s precious attention. But people grew increasingly frustrated with Instagram as they missed simpler, more direct ways to stay in touch with friends.
Instagram introduced “Reels,” its TikTok-like vertical video feed, in 2020, and by 2022, as the app reorganized itself around the format, the frustration was so widespread that even the Kardashians expressed disappointment with one of their core tools as global influencers. Kim and Kylie both reposted a viral meme begging the platform to “Make Instagram Instagram again (stop trying to be TikTok, I just want to see cute photos of my friends).” Kylie added “Please” in her post.
Still, Instagram head Adam Mosseri maintained that users love Reels because video views continue to grow. But as an Instagram user myself, I kept noticing dark patterns in the user experience design even as I grew more and more addicted to Reels. For example, before Reels, you could mute or unmute short videos in your feed by tapping anywhere on the video. With Reels, mute shrank to a small button in the corner of the video, and tapping anywhere else opened the full-screen Reels feed.
Like a casino with no clocks on the walls, this full-screen view hides all menus and notifications, abandons curated feeds, and autoplays videos algorithmically to grab our attention. I found myself watching video after video with disgust until I finally pulled myself away.
In a recent essay, technologists Maria Farrell and Robin Berjon use concepts from ecology to critique the fragility created by the consolidation of internet infrastructure. They cite “rewilding,” the practice of reintroducing ecological diversity to overly cultivated land, as a conservation strategy that “restores healthy ecosystems by creating wild, biodiverse spaces.” Applying the concept to the internet and web, which are dominated by a few global monopolies and duopolies, the authors advocate for a diversity of providers, protocols, and services that would bring resilience and choice to people and organizations communicating over the internet.
While Farrell and Berjon’s article is about internet services in general, encompassing the entire technology stack, from cables and servers to protocols, browsers, and search, I was intrigued by their comments about social media monoculture and its impact on those who have grown up with it. This consolidation, they point out, has created a digital social experience characterized by passivity: “The internet has become something that is done to us, rather than something that we collaboratively reinvent every day.”
As ad-supported platforms converge into a functional singularity optimized for addiction, our culture has narrowed. Friends no longer drop in to say hello, and phone calls require appointments (yes, message me first). Instead of exchanging contact details to talk, we follow each other on Instagram and LinkedIn and become content creators on each other’s feeds. By design, in-person interaction is discouraged, and we spin in a parasocial panopticon, always watching, never reaching out. Passivity becomes the default posture as content and friends are delivered to us by algorithms.
Protecting children from the internet
In 2022, we already knew that recommendation algorithms from Meta to YouTube to TikTok were prioritizing extremist, politically polarizing, addictive, and inflammatory content. That’s great for entertainment platforms that make money by serving ads. But by last summer, Instagram’s harms had grown so severe that a Wall Street Journal headline read simply, “Instagram Connects Vast Pedophile Network.” In a hyper-targeted and purposefully addictive world, sex offenders turned out to be whales.
In May of this year, a New York Times investigation found that even products advertised to girls, such as a children’s jewelry line, were shown primarily to adult men interested in the child models, not the jewelry. And just last week, another Wall Street Journal investigation reported that a young teenage dance influencer’s audience was 92% adult men, despite her mother’s efforts to protect her on Instagram.
Amid rising youth anxiety, child safety has emerged as a focal point for the national conversation about the harms of social media design and information distribution. While new bills such as KOSA (the Kids Online Safety Act) have bipartisan support, civil society organizations like the EFF, the ACLU, and some LGBTQ+ advocacy groups warn that the bill uses a parochial understanding of youth safety to push pro-censorship legislation that could be weaponized by state attorneys general to censor content along political lines. As Stanford University tech policy researcher Riana Pfefferkorn told me, both KOSA and the slate of state bills “position the internet as something kids should be protected from,” but fail to recognize the more nuanced, research-backed reality that the internet “can be an important tool for belonging and growth.” (Note: Pfefferkorn is an advisor to my startup, Germ Network.)
KOSA, the advancing TikTok ban, and a plethora of state-level bills reflect a national mindset that seeks to protect children by censoring information and limiting access to private spaces that aren’t subject to surveillance and data analysis. As Pfefferkorn puts it, “This is the infantilization of the internet for everyone, relabeling it as a gesture toward youth only.” Efforts to legislate safety by dictating what can be found online are the digital equivalent of book bans, and progressive lawmakers should exercise caution when working with colleagues who also support literal book bans.
It’s true that many social media products are dangerous for teens, and for all of us. But they’re not dangerous because they may give teens privacy, as Nevada’s pending effort to ban Meta’s end-to-end encrypted messaging suggests, or because they allow teens to discover new information online. They’re dangerous because the price of connecting with friends and learning new things is the risk of being targeted by addictive content and predatory individuals.
The alternative: building agency and rewilding ourselves
Harvard psychologist Emily Weinstein and sociologist Carrie James warn parents and teachers that keeping teens safe online, as in the real world, doesn’t come from increasing surveillance and control. Instead, teens need to learn to exercise agency — a greater sense of control over their environments and the outcomes of their actions — and make healthy choices for themselves, because adults can’t always make those decisions for them.
Weinstein and James remind us that agency is at the heart of mental well-being: “Psychologists have recognized that individuals fare better when they believe they can influence what happens to them and can shape outcomes through their actions…. Conversely, feeling out of control on a daily basis can threaten our well-being.” Environments that eliminate choice, boundaries, and consent are simply not healthy.
Weinstein and James identify three types of agency that empower young people and foster resilience and well-being: individual agency, where individuals make choices about how they interact; collective agency, where they work together as a community; and vicarious agency, where adults or other authority figures step in when needed. Yet today’s social media ecosystem systematically denies this agency to teens and their families.
In the case of the teenage Instagram dance influencer profiled in the aforementioned Wall Street Journal investigation, her mother realized that for her daughter to gain a following, she would have to let a ton of grown men follow the account and even respond to their creepy comments: “If you want to be an influencer and work with brands and get paid, you have to follow an algorithm, and that’s determined by the number of people who like and interact with your posts.”
As Meta cuts the content moderation teams that protect users from growing scams and predation, Meta’s rollout of end-to-end encryption certainly deserves scrutiny — not because privacy is inherently dangerous, but because it’s being rolled out on a platform whose recommendation systems systematically introduce users to dangerous people. When we push tech users into addictive feeds, introduce them to people they don’t want to talk to, and fire the teams tasked with keeping them safe, we’re creating a docile population vulnerable to harm on a world-historical scale.
Rewilding offers an aspirational vision for restoring digital social interactions to healthier, more self-sustaining forms. For Farrell and Berjon, “Rewilders build resilience by restoring autonomous natural processes.” We humans are living creatures, after all. Healthy humans are able to make choices, form safe connections, and set boundaries without navigating recommendation systems that lower our defenses with every new video or introduction. Teens, families, and all people crave tools that help them discover, connect, share, and disconnect with balance and choice.
What if we built tools that allowed teenagers, and all of us, to decide for ourselves who can speak to us and what information we want to engage with on any given day? What if we developed online systems that encouraged agency and decision-making, rather than passivity and vulnerability? What if we harnessed advances in machine learning and cryptography to empower our relationships with each other, information, and time?
Safety is narrower than health, but rewilding teaches us that health comes from complexity, diversity, and choice. Individuals, companies, and policymakers must prepare for a new internet that lets each of us find our own way, rather than one forced upon us. Let’s build a digital world with less stalking and more conversation. To rewild the internet, we must rewild ourselves.