
Adam Mosseri, head of Meta’s Instagram, testified Wednesday in a landmark social media trial in Los Angeles that he does not agree with the notion that people can be clinically addicted to social media platforms.

Addiction is central to the case, in which plaintiffs are attempting to hold social media companies accountable for harms to children who use their services. Meta Platforms and Google’s YouTube are the remaining defendants after TikTok and Snap reached settlements.

At the heart of the Los Angeles proceedings is a 20-year-old identified only by the initials “KGM.” Her lawsuit could influence how thousands of similar claims against social media companies are resolved. She and two other plaintiffs were chosen for bellwether trials — test cases designed to show both sides how their arguments may fare before a jury.

Mosseri, who has led Instagram since 2018, emphasized the importance of distinguishing between clinical addiction and what he described as problematic use. The plaintiffs' attorney, Mark Lanier, however, cited remarks Mosseri made in a podcast interview several years ago in which he used the word addiction in connection with social media. Mosseri said in court that he had likely been using the term "too casually," as many people do.

When questioned about his qualifications to assess whether social media addiction is legitimate, Mosseri acknowledged he is not a medical expert. He added that someone “very close” to him has struggled with serious clinical addiction, which is why he said he was “being careful with my words.”

He explained that he and his colleagues use the term "problematic use" to describe "someone spending more time on Instagram than they feel good about," adding, "and that definitely happens."

“It’s not good for the company, over the long run, to make decisions that profit for us but are poor for people’s well-being,” Mosseri said.

Mosseri and Lanier also engaged in an extended exchange over Instagram's cosmetic filters, which altered users' appearances in ways critics say promoted plastic surgery.

“We are trying to be as safe as possible but also censor as little as possible,” Mosseri said.

In the courtroom, parents who had lost children after struggles linked to social media appeared visibly emotional during testimony about body dysmorphia and cosmetic filters. After the reactions from the gallery, the judge reminded observers not to show agreement or disagreement with testimony, stating it would be "improper to indicate some position." Meta discontinued all third-party augmented reality filters in January 2025.

Under cross-examination by Meta attorney Phyllis Jones, Mosseri sought to counter Lanier's suggestion that the company specifically targets teens for profit.

Mosseri testified that Instagram earns “less money from teens than from any other demographic on the app,” explaining that teens are less likely to click on ads and often lack disposable income to spend on advertised products. When given another opportunity to question Mosseri, Lanier pointed to research indicating that users who join social media at a young age are more likely to remain long-term, arguing that teens represent valuable long-term profit potential.

“Often people try to frame things as you either prioritize safety or you prioritize revenue,” Mosseri said. “It’s really hard to imagine any instance where prioritizing safety isn’t good for revenue.”

Meta CEO Mark Zuckerberg is expected to testify next week.

In recent years, Instagram has rolled out numerous features and tools aimed at improving safety for young users. Still, concerns remain. A report last year found that teen accounts created by researchers were recommended age-inappropriate sexual material, including “graphic sexual descriptions, the use of cartoons to describe demeaning sexual acts, and brief displays of nudity.”

The same report said teen accounts were also shown a “range of self-harm, self-injury, and body image content” that “would be reasonably likely to result in adverse impacts for young people, including teenagers experiencing poor mental health, or self-harm and suicidal ideation and behaviors.” Meta dismissed the findings as “misleading, dangerously speculative,” saying the report mischaracterized its work on teen safety.

Separately, Meta is facing another trial in New Mexico that began this week.