How missing information can misinform

by

Stephanie Baum

Scientific Editor


Andrew Zinin

Lead Editor

Editors' notes

This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, peer-reviewed publication, trusted source, proofread.


Readers don't need false information to get the wrong idea. In the online attention economy, UC San Diego research finds that making science more clickable or shareable can help some readers learn more—but leaves many others with an incomplete understanding. The study is published in the journal American Economic Review.

To get people to pay attention to science, you have to make it engaging. But what makes content engaging often comes at the cost of detail—shaping what people learn and what they think they've learned. The result: People can come away with the wrong idea, even when what they read isn't factually wrong.

That tension sits at the core of research from Marta Serra-Garcia, a behavioral economist at the University of California San Diego's Rady School of Management. The study examines how incentives in the online attention economy shape the way scientific information is communicated—and what readers ultimately take away from it.

A trade-off in the attention economy

You don't need bad actors for people to get the wrong idea. Incomplete information can be enough.

Crucially, the research finds that attention-grabbing summaries are not more likely to be factually inaccurate. Instead, they tend to include less information—especially key details about how studies were conducted.

"This is not a simple story that clickbait is bad," said Serra-Garcia, associate professor of economics and strategy and Phyllis and Daniel Epstein Chancellor's Endowed Faculty Fellow at UC San Diego's Rady School. "You need to get people's attention in order for them to learn something, and it's good to encourage curiosity. Yet there's a trade-off: Material designed to engage can also unintentionally contribute to the kinds of misunderstandings that can fuel misinformation."

The finding comes from a large, multi-stage experimental study in which freelance writers produced nearly 600 summaries of actual scientific research, and more than 3,700 participants were then tested on what they learned from them.

Why 'in mice' matters

In one study used in the experiment, a compound in broccoli reduced cancer cell growth—in mice. Leave out those last two words, and the finding can sound far more directly relevant to human health than it actually is.

"Why can't we say 'in mice'?" Serra-Garcia said. "It's not very hard to add. It's two words. But once you say 'in mice,' maybe fewer people will click."

The results were consistent across the study: Summaries written to attract attention were shorter, easier to read and more engaging—but included less detailed information, especially about sample sizes and methods.

Given the option to seek out more information, most readers did not. That mirrors real-world behavior: Studies of social media use suggest that most content is shared without users ever clicking through to read more.

Among those who relied on summaries alone in Serra-Garcia's study, knowledge dropped by about six to seven percentage points. Readers were also more likely to draw incorrect conclusions—such as assuming findings applied to humans or reflected firm medical guidance.


Inside the experiments

To isolate these effects, Serra-Garcia conducted a multi-stage experimental study. In the first stage, 149 freelance writers produced nearly 600 summaries of the same set of studies—covering topics such as cancer, sleep, vaccines and climate—under different instructions: to inform readers accurately, or to attract attention by encouraging clicks or shares.

In the second stage, more than 3,700 participants read those summaries under different conditions, including whether they could click through for more information.

The results held across experiments: Attention-driven summaries increased engagement and prompted some readers to learn more—but left many others with less complete understanding.

AI and the attention economy

The same pattern emerged when a human wasn't doing the writing. In additional tests, when a large language model was prompted to attract attention, it also produced less detailed summaries—suggesting the effect is driven less by who creates the content than by the objective it's optimized for.

For Serra-Garcia, the findings point to an ongoing challenge for researchers, journalists and institutions alike.

"How do you make science engaging and important to readers," she said, "without missing the essentials that convey the full picture?"

Publication details

Marta Serra-Garcia, The Attention–Information Trade-Off, American Economic Review (2026). DOI: 10.1257/aer.20240850

Journal information: American Economic Review

Provided by University of California - San Diego