Pinterest CEO: Governments Should Ban Social Media for Kids Under 16
by Bill Ready · TIME

Children today are living through the largest social experiment in history. For years, kids around the world have been given unfiltered access to social media platforms. The companies building these platforms gave insufficient forethought to the consequences, the worst of which include exposing children to strangers and fueling screen addiction.
This social experiment has been conducted at scale, and the results are now painfully clear: rising anxiety and depression, eroding concentration, and classrooms forced to compete with screens for students' attention.
Australia has stepped in with a bold answer: banning social media for children under 16. I believe if tech companies fail to prioritize youth safety, other governments should follow Australia’s lead.
Most of my fellow tech CEOs dismissed Australia’s ban. Some have even suggested it is a “premature and performative” measure. But I see this moment differently. Now is the time to apply the same creativity and innovation that built the social media ecosystem to the vital task of protecting kids online. And if we can’t do this effectively, we lose any credibility to oppose a ban.
As both a tech CEO and parent, I know legal compliance is not the same as safety. And I understand broad restrictions come with difficult tradeoffs. But social media, as it’s configured today, is not safe for young people under 16. Instead, it’s been designed to maximize view time, keeping kids glued to a screen with little regard for their well-being. In courtrooms, we’ve seen how social media companies put profit over young people’s safety, sometimes with tragic outcomes.
Now, AI chatbots are being layered onto social media experiences. These AI bots can influence behavior, emotions, and identity. Yet these powerful, persuasive tools are being handed to young users who are still developing the maturity to handle them.
Businesses have faced this kind of regulatory challenge before. We set age limits on driving, smoking, and consuming alcohol, knowing rules are imperfect and will sometimes be broken. And yet we still set them because we know such policies can improve, and sometimes save, lives.
Now, we must give our children a chance to develop before making consequential choices that could significantly affect their well-being. To do this, we must understand that we have made similarly hard choices before—and that they paid off.
Imperfect protection is better than none. When we make excuses for not acting in the public’s best interest, tech CEOs sound like 20th-century tobacco executives who had to be shamed and sued into submission. This is why many, including myself, have called social media the New Big Tobacco.
The public is well aware of the problem, and people want solutions. Ipsos found broad support for restricting teen access to social media. Pew reported that 70% of parents worry about explicit content or excessive time online, and two-thirds say parenting is harder now than 20 years ago, often citing social media as a major reason. According to a survey conducted by Jonathan Haidt, the social psychologist and author of The Anxious Generation, together with the Harris Poll, nearly half of Gen Z respondents said they wished certain social media platforms didn't exist.
This leads us to an important question: when we defend the status quo, are we protecting teens or protecting the existing social media business model?
Our industry has had years to mitigate these harms, but has time and again failed. The time for self-regulation has passed, and if tech companies don’t change, then the path should be obvious to lawmakers. We need a clear standard: no social media for teens under 16, backed by real enforcement, and accountability for mobile phone operating systems and the apps that run on them.
Australia has taken the lead with its under-16 ban. Other countries across Europe—including the UK, Spain, and France—have considered similar actions. Meanwhile, the U.S. has taken a different approach by pushing for app store age verification, which Pinterest has supported. This approach creates a uniform and privacy-safe framework that is easily controlled by parents and could be used to hold all apps more accountable for delivering age-appropriate experiences.
Critics call bans paternalistic or unworkable. They say teens will find workarounds or move to less safe platforms. Others point out that social media can offer connection and community. But today’s products too often pair those benefits with serious harm: unwanted outreach from strangers, constant comparison, body image pressure, bullying, and exposure to content even many adults struggle with. These are arguments for stronger, smarter safeguards—not for doing nothing.
When Pinterest removed social features for teens and made every account under 16 private—meaning no discoverability, messaging, likes, or comments from strangers—people said we’d lose the next generation of users. But Gen Z says the opposite. Today, they make up over 50% of our users. Our experience shows that prioritizing safety and well-being doesn’t push young people away; it builds trust.
The cost of inaction is a generation of young people overwhelmed by anxiety and depression. Right now, adolescence is playing out inside a global social experiment run by tech companies. But Australia has set a boundary.
It’s time to raise the bar on safety and well-being for kids. We need clearer rules, better tools for parents, and stronger accountability for platforms and social media apps.