Editor’s Note: Kara Alaimo, an associate professor of communication at Fairleigh Dickinson University, writes about issues affecting women and social media. Her book, “Over the Influence: Why Social Media Is Toxic for Women and Girls — And How We Can Reclaim It,” will be published by Alcove Press in 2024. The opinions expressed in this commentary are her own. Read more opinion on CNN.



CNN —

Utah’s Republican governor, Spencer Cox, recently signed two bills into law that sharply restrict children’s use of social media platforms. Under the legislation, which takes effect next year, social media companies have to verify the ages of all users in the state, and children under age 18 have to get permission from their parents to have accounts.

Parents will also be able to access their kids’ accounts, apps won’t be allowed to show children ads, and accounts for kids won’t be able to be used between 10:30 p.m. and 6:30 a.m. without parental permission.

It’s about time. Social networks in the United States can be incredibly dangerous for children, and parents can no longer protect our kids without the tools and safeguards this law provides. While Cox is correct that these measures won’t be “foolproof,” and how they will be implemented remains an open question, one thing is clear: Congress should follow Utah’s lead and enact a similar law to protect every child in this country.

One of the most important parts of Utah’s law is the requirement for social networks to verify the ages of users. Right now, most apps ask users their ages without requiring proof. Children can lie and say they’re older to avoid some of the features social media companies have created to protect kids — like TikTok’s new setting that asks 13- to 17-year-olds to enter their passwords after they’ve been online for an hour, as a prompt for them to consider whether they want to spend so much time on the app.

While critics argue that age verification allows tech companies to collect even more data about users, let’s be real: These companies already have a terrifying amount of intimate information about us. To solve this problem, we need a separate (and comprehensive) data privacy law. But until that happens, this concern shouldn’t stop us from protecting kids.

One of the key components of this legislation is allowing parents access to their kids’ accounts. By doing this, the law begins to help address one of the biggest dangers kids face online: toxic content. I’m talking about things like the 2,100 pieces of content about suicide, self-harm and depression that 14-year-old Molly Russell in the UK saved, shared or liked in the six months before she killed herself in 2017.

I’m also talking about things like the blackout challenge — also called the pass-out or choking challenge — that has gone around social networks. In 2021, four children 12 or younger in four different states all died after trying it.

“Check out their phones,” urged the father of one of these young victims. “It’s not about privacy — this is their lives.”

Of course, there are legitimate privacy concerns to worry about here, and just as kids’ use of social media can be deadly, social apps can also be used in healthy ways. LGBTQ children who aren’t accepted in their families or communities, for example, can turn online for support that is good for their mental health. Now, their parents will potentially be able to see this content on their accounts.

I hope groups that serve children who are questioning their gender and sexual identities and those that work with other vulnerable youth will adapt their online presences to try to serve as resources for educating parents about inclusivity and tolerance, too. This is also a reminder that vulnerable children need better access to mental health services like therapy — they’re way too young to be left to their own devices to seek out the support they need online.

But, despite these very real privacy concerns, it’s simply too dangerous for parents not to know what our kids are seeing on social media. Just as parents and caregivers supervise our children offline and don’t allow them to go to bars or strip clubs, we have to ensure they don’t end up in unsafe spaces on social media.

The other huge challenge the Utah law helps parents overcome is the amount of time kids are spending in front of screens. A 2022 survey by Common Sense Media found that the average 8- to 12-year-old spends 5 hours and 33 minutes per day on screen media, while the average 13- to 18-year-old spends 8 hours and 39 minutes every day. That’s more time than a full-time job.

The American Academy of Pediatrics warns that lack of sleep is associated with serious harms in children — everything from injuries to depression, obesity and diabetes. So parents in the US need to have a way to make sure their kids aren’t up on TikTok all night (parents in China don’t have to worry about this because the Chinese version of TikTok doesn’t allow kids to stay on for more than 40 minutes and isn’t usable overnight).

Of course, Utah isn’t an authoritarian state like China, so it can’t just turn off kids’ phones. That’s where the new law comes in: it requires social networks to implement these settings. The tougher part of Utah’s law for tech companies to implement will be a provision requiring social apps to ensure they’re not designed to addict kids.

Social networks are arguably addictive by nature, since they feed on our desires for connection and validation. But hopefully the threat of being sued by children who say they’ve been addicted or otherwise harmed by social networks — an outcome for which this law provides an avenue — will force tech companies to think carefully about how they build their algorithms and features like bottomless feeds that seem practically designed to keep users glued to their screens.

TikTok and Snap didn’t respond to requests for comment from CNN about Utah’s law, while a representative for Meta, Facebook’s parent company, said the company shares the goal to keep Facebook safe for kids but also wants it to be accessible.

Of course, if social networks had been more responsible, it probably wouldn’t have come to this. But in the US, tech companies have taken advantage of a lack of rules to build platforms that can be dangerous for our kids.

States are finally saying no more. In addition to Utah’s measures, California passed a sweeping online safety law last year. Connecticut, Ohio and Arkansas are also considering laws to protect kids by regulating social media. A bill introduced in Texas wouldn’t allow kids to use social media at all.

There’s nothing innocent about the experiences many kids are having on social media. This law will help Utah’s parents protect their kids. Parents in other states need the same support. Now, it’s time for the federal government to step up and ensure children throughout the country have the same protections as Utah kids.

Suicide & Crisis Lifeline: Call or text 988. The Lifeline provides 24/7, free and confidential support for people in distress, prevention and crisis resources for you and your loved ones, and best practices for professionals in the United States. En Español: Linea de Prevencion del Suidio y Crisis: 1-888-628-9454.
