Verdicts against Meta, YouTube could be a turning point, expert says
by Cesareo Contreras, Northeastern University · Tech Xplore

A landmark California verdict that found the social media company Meta and video-sharing service YouTube liable for the depression and mental health challenges of a young woman could be "the beginning of a tidal wave," a social media expert said.
This isn't Meta's only legal defeat this week. A court in New Mexico has ordered the company to pay a $375 million penalty after a jury found the company misled users about the safety of its platform.
These court losses could be a turning point for holding social media companies accountable for the dangers of their products and services, explained Laura Edelson, a professor in the Khoury College of Computer Sciences whose research on Meta's safety features was used in the New Mexico trial.
In the California case, a young woman accused the platforms of intentionally addicting users, and the jury agreed. As part of the verdict, the jury ordered the companies to pay a total of $6 million in compensatory and punitive damages, according to media reports. Both Meta and Google, the parent company of YouTube, said they plan to appeal the decision. Meta said it also plans to appeal the New Mexico case.
Edelson said the cases demonstrate "to the platforms that they not just can but will be found liable and will face financial penalties if their products harm users. I think that this is a very good thing. Creating these types of financial disincentives isn't [sufficient] by itself, but it was probably a necessary condition. Now platforms have the incentive to figure out how to be safer, and I believe in their ability to do it."
These verdicts show how similar court cases in other states could play out, she said.
Edelson said there are many ways these companies can make their platforms safer and less habit-forming, but it will require them to think strategically.
"Safer" means it would be harder for adults to contact teens whom they don't know. Safer means that a platform algorithm is less likely to recommend eating disorder content to a 14-year-old," she said.
"Safer means it is less likely to recommend scam ads to 75-year-olds in Florida. Safer means a lot of different things, and these platforms are going to have to tackle all these problems."
Edelson said the good news is that these companies are well aware of the risks their platforms pose to users because we "know they have been studying those risks internally for years."
Given that both Meta and Google are appealing these cases, it could be a while before we see any changes in our social media feeds, explained John Wihbey, an associate professor of journalism at Northeastern and the author of "Governing Babel," a book centered on social media regulations.
But these cases should be seen as "a major wake-up call" for these companies, Wihbey said.
"It's clear that the public, as represented by the jury system, has a new set of norms and expectations for the industry," he said.
For their part, both Meta and YouTube have done work to make their platforms safer for young adults through the creation of children- and teen-specific services, explained Wihbey.
For example, in September 2024, Meta introduced Instagram Teen Accounts, which automatically apply stricter safety settings to users between the ages of 13 and 17.
But the social media companies could certainly be doing more to make their platforms safer for vulnerable populations, such as children, teens and young adults, he said.
He pointed to social media companies bombarding users with notifications on their mobile devices, interfaces that offer infinite scrolling, and algorithms that promote misleading and harmful content.
"All of these things should come under particular scrutiny when they are served to people under the age of 18," he said.
Yet one challenge, Wihbey said, is that these platforms make very limited data accessible to the academic researchers who could study such questions more rigorously.
That sentiment is shared by Edelson, who said that in her experience platforms have done little to share important safety data with researchers like herself.
"There are very hard questions to study because platforms have just made it really hard to access data," she said.
Wihbey said that if more lawsuits like these succeed, it's possible that in the years to come we will no longer see so many people engrossed in their phones.
He compared the situation to tobacco lawsuits in the 1980s, which hurt that industry significantly through financial penalties and by forcing companies to issue public statements about the health risks of smoking cigarettes.
"In 10 years, will we look around and people won't be scrolling on their phones anymore?" he asked. "Do you go on The T and not see people staring at their phones the same way? This could have a real physical impact on our sociology and way of being."
Provided by Northeastern University