ReportWire

AI companions. Social media ‘rabbit holes.’ NC kids’ advocates see rising danger


Social media’s effects on children and teens were already a focus for the state’s Child Fatality Task Force.

Now, the rapid emergence of artificial intelligence is raising alarms for the legislative study commission that looks into the causes of child death and makes recommendations.

This comes as President Donald Trump has announced he plans to sign an executive order aiming to limit state regulation of artificial intelligence.

On Tuesday, the task force voted to endorse legislation that addresses addictive algorithms in social media by restricting a company’s use of a minor’s data, thereby making social media less targeted — a measure intended to make it less addictive and less likely to show minors harmful content. The group had endorsed this recommendation in prior years.

It also voted to continue studying the effects of artificial intelligence chatbots and companions on youth, including their design features.

The American Psychological Association and the U.S. Surgeon General have issued advisories on social media and youth mental health. And a 2025 study from Columbia and Cornell universities, published in the Journal of the American Medical Association, found that addictive patterns of social media and mobile phone use among kids were associated with suicidal behaviors, suicidal ideation and worse mental health.

The chair of the task force’s Intentional Death Prevention Committee, Whitney Belich, said kids being addicted to social media and spending more time on it hurts mental health, “so much so that it is leading to more death.”

AI chatbots, Belich said, “seem like someone who is talking to them as a companion, a listener of what may be going on with them or what they may be struggling with.

“These chatbots and AI companions are completely unregulated, and so what they may be recommending or saying back in response to these chatters — to real people — could be very detrimental.”

“To reduce harm, we need to improve local resources, and we need to regulate chat designs,” she said.

She said that this could look like expanding access to support for young people “who are feeling lonely.” Studies, she said, show that when teens are given a choice between interacting with an AI system or a real person, they prefer the real person — but one is not always available.

She presented data from a University of Chicago survey of more than 1,000 teenagers aged 13-17 showing that 41% of teens use chatbots for “homework help and emotional support,” 29% for “homework help only,” 29% say they “don’t really use” chatbots, and 1% use them for emotional support only.

“So if we’re going to have them, and we don’t always have access to a real-life person for youth, then we want to look at how we regulate the design of those chatbots to make it less likely that they would be detrimental to the people that are reaching out,” she said.

Young People’s Alliance work

Ava Smithing of the Young People’s Alliance spoke about both AI and social-media issues. The Young People’s Alliance was founded by high school students from North Carolina, but the youth-led group now works across multiple states and on Capitol Hill.

The organization worked with a bipartisan group of North Carolina lawmakers in 2023 on the “Social Media Algorithmic Control in IT Act,” a bill that did not advance in the state House.

This year it also worked with legislators to attempt to pass Senate Bill 514, the Social Media Control in IT Act, which would limit how much data companies can collect from users and aims to prevent systems from using that data to make recommendations. That bill died in the Senate.

Smithing shared that she had struggled with being pulled into addictive social-media algorithms in high school, which contributed to an eating disorder. She said that when she began using social media at about 13, she spent time looking at bikini advertisements. That led the algorithm to show her more of those ads and other content it perceived as related, in order to keep her engaged — because social media companies earn more revenue the longer users spend scrolling.

“So this pushed me down a rabbit hole from bikini advertisements to diet content, and eventually I got to the bottom of this rabbit hole, which was filled with incredibly nasty eating-disorder content that taught me how to have an eating disorder,” she said.

“This does not only happen with eating-disorder content. It can happen with any kind of content, whether it be politically extreme content or suicide content or self-harm content,” she said. “Whatever your personal negativity bias is as a human, whatever you’re going to get frozen on, that’s the piece of content that they’re going to base it off of.”

She said platforms also keep teens hooked by showing unpredictable positive posts, triggering small dopamine releases and driving compulsive scrolling.

Now the Young People’s Alliance is also focusing on generative artificial intelligence.

“We are transitioning from these algorithms, which was the first iteration of AI harms, to the second iteration of AI harms, which is these human-like chatbots,” she said.

Before, companies had to collect data on social media platforms to inform decisions to keep people engaged, but “now they don’t even have to play the guessing game.”

“We’re incredibly nervous about the risks associated and the large scale of manipulation that can happen to our children if they’re using these chatbots,” she said.

She spoke about the case of 16-year-old Adam Raine, a teenager who died by suicide and who reportedly had had extended conversations with ChatGPT, a chatbot.

ChatGPT is said to have discouraged Raine from seeking help and offered to help him write a suicide note — even advising him on his noose setup, according to news outlets. His family has sued OpenAI, the creator of ChatGPT. In a recent court filing, OpenAI said it was not liable for his death, arguing the boy misused the chatbot, NBC News reported.

The Young People’s Alliance is working across the country on AI legislation, including in North Carolina. It partnered with state Sen. Jim Burgin, an Angier Republican, on Senate Bill 624, which would have regulated chatbots. The bill did not move in the Senate.


Luciana Perez Uribe Guinassi

The News & Observer

Luciana Perez Uribe Guinassi is a politics reporter for the News & Observer. She reports on health care, including mental health and Medicaid expansion, hurricane recovery efforts and lobbying. Luciana previously worked as a Roy W. Howard Fellow at Searchlight New Mexico, an investigative news organization.
