(Associated Press) – Instagram will start alerting parents if their kids repeatedly search for terms clearly associated with suicide or self-harm.
The alerts will only go to parents who are enrolled in Instagram’s parental supervision program.
Instagram says it already blocks such content from showing up in teen accounts’ search results and directs people to helplines instead.
The announcement Thursday comes as Meta faces two trials over harms to children.
A trial underway in Los Angeles questions whether Meta’s platforms deliberately addict and harm children.
Another, in New Mexico, seeks to determine whether Meta failed to protect children from sexual exploitation on its platforms.
Grant McHill