Adam Mosseri suggests highlighting ‘real media’ rather than AI content on social media
by Sofia Elizabella Wyciślik-Wilson · BetaNews

Instagram head Adam Mosseri has ended 2025 in a reflective mood, looking at the social media trends he sees as the hallmarks of the year ahead. Perhaps unsurprisingly, he sees the onslaught of AI continuing, and has some thoughts about what this means and how to handle it.
Mosseri makes a couple of statements that are difficult to disagree with: “Deepfakes are getting better and better”, and “AI is generating photographs and videos indistinguishable from captured media”. Interestingly, though, he does not necessarily see this as a problem.
He says that the changing face of technology and social media means that “everything that made creators matter—the ability to be real, to connect, to have a voice that couldn’t be faked—is now suddenly accessible to anyone with the right tools”. Despite this, he goes on to assert that “creators matter more”.
Mosseri sees AI content as something to work with, not fix or avoid:
We haven’t truly grappled with synthetic content yet. We are now seeing an abundance of AI generated content, and there will be much more content created by AI than captured by traditional means in a few years’ time. We like to talk about “AI slop,” but there is a lot of amazing AI content that thankfully lacks the disturbing properties of twisted limbs and absent physics. Even the quality AI content has a look though: it tends to feel fabricated somehow. The imagery today is too slick, people’s skin is too smooth. That will change; we are going to start to see more and more realistic AI content.
Authenticity is fast becoming a scarce resource, which will in turn drive more demand for creator content, not less. The creators who succeed will be those who figure out how to maintain their authenticity whether or not they adopt new technologies. That’s harder now—not easier—because everyone can simulate authenticity. The bar is going to shift from “can you create?” to “can you make something that only you could create?” That’s the new gate.
The raw aesthetic
Just as AI makes polish cheap, phone cameras have made professional-looking imagery ubiquitous, and both trends devalue that polished aesthetic.
Unless you're under 25 and use Instagram, you probably think of the app as a feed of square photos. The aesthetic is polished: lots of makeup, skin smoothing, high-contrast photography, beautiful landscapes.
That feed is dead. People largely stopped sharing personal moments to the feed years ago. Stories are alive and well as they provide a less pressurized way to share with your followers, but the primary way people share, even photos and videos, is in DMs. That content is unpolished; it’s blurry photos and shaky videos of people’s daily experiences. Think shoe shots and unflattering candids.
Looking to the future, Mosseri sees AI tools emerging that can create any look, not just the polished AI aesthetic we’ve become used to. This is already happening, and it is leading to what he refers to as people “defaulting to skepticism” rather than assuming anything is real.
While Mosseri still sees value in labeling AI content as such, he pinpoints the need to fingerprint genuine content as well:
Labeling content as authentic or AI-generated is only part of the solution though. We, as an industry, are going to need to surface much more context about not only the media on our platforms, but the accounts that are sharing it in order for people to be able to make informed decisions about what to believe. Where is the account? When was it created? What else have they posted?
So what?
In a world of infinite abundance and infinite doubt, the creators who can maintain trust and signal authenticity—by being real, transparent, and consistent—will stand out.
As for Instagram, we’re going to have to evolve in a number of ways, and fast. We need to build the best creative tools, AI-driven and traditional, for creators so that they can compete with content fully created by AI. We need to label AI-generated content clearly, and work with manufacturers to verify authenticity at capture—fingerprinting real media, not just chasing fake. We need to surface credibility signals about who’s posting so people can decide who to trust.
Mosseri promises further long-form posts on the topic, but this first venture can be seen here.