ReportWire

Tag: Kids Online Safety Act

  • Meta gives Australian kids 2-week warning to delete accounts as world-first social media age restrictions loom

    Melbourne, Australia — Technology giant Meta on Thursday began sending thousands of young Australians a two-week warning to download their digital histories and delete their accounts from Facebook, Instagram and Threads before a world-first social media ban on accounts of children younger than 16 takes effect.

    The Australian government announced two weeks ago that the three Meta platforms plus Snapchat, TikTok, X and YouTube must take reasonable steps to exclude Australian account holders younger than 16, beginning Dec. 10.

    California-based Meta on Thursday became the first of the targeted tech companies to outline how it will comply with the law. Meta contacted thousands of young account holders via SMS and email to warn that suspected children will start to be denied access to the platforms from Dec. 4.

    “We will start notifying impacted teens today to give them the opportunity to save their contacts and memories,” Meta said in a statement.

    Meta said young users could also use the notice period to update their contact information “so we can get in touch and help them regain access once they turn 16.”

    Meta has estimated there are 350,000 Australians aged 13-to-15 on Instagram and 150,000 in that age bracket on Facebook. Australia’s population is 28 million.

    Account holders aged 16 and older who were mistakenly notified that they would be excluded can contact Yoti Age Verification and verify their age by providing government-issued identity documents or a “video selfie,” Meta said.

    Terry Flew, co-director of Sydney University’s Center for AI, Trust and Governance, said such facial-recognition technology had a failure rate of at least 5%.

    “In the absence of a government-mandated ID system, we’re always looking at second-best solutions around these things,” Flew told the Australian Broadcasting Corp.

    The government has warned platforms that demanding that all account holders prove they are older than 15 would be an unreasonable response to the new age restrictions. The government maintains the platforms already had sufficient data about many account holders to ascertain they were not young children.

    Social media companies will face fines of up to 50 million Australian dollars (about $33 million) if they are found to be failing to prevent people under 16 from creating accounts on their platforms.

    Meta’s vice president and global head of safety, Antigone Davis, said she would prefer that app stores, including the Apple App Store and Google Play, collect age information when users sign up and verify on behalf of app operators such as Facebook and Instagram that those users are at least 16 years old.

    “We believe a better approach is required: a standard, more accurate, and privacy-preserving system, such as OS/app store-level age verification,” Davis said in a statement.

    “This combined with our investments in ongoing efforts to assure age … offers a more comprehensive protection for young people online,” she added.

    Dany Elachi, founder of the parents’ group Heaps Up Alliance that lobbied for the social media age restriction, said parents should start helping their children plan how they will spend the hours currently absorbed by social media.

    He was critical of the government only announcing the complete list of platforms that would become age-restricted on Nov. 5.

    “There are aspects of the legislation that we’re not entirely supportive of, but the principle that children under the age of 16 are better off in the real world, that’s something we advocated for and are in favor of,” Elachi said. “When everybody misses out, nobody misses out. That’s the theory. Certainly we expect that it would play out that way. We hope parents are going to be very positive about this and try to help their children see all the potential possibilities that are now open to them.”

    There was significant resistance to the legislation last year, however, including from some children’s advocacy groups.

    The CEO of the Save the Children charity, Mat Tinkler, said in a statement a year ago, when the ban was approved by Australian lawmakers, that while he welcomed the government’s efforts to protect children from harm online, the solution should be regulating social media companies rather than a blanket ban.

    He said the government should “instead use the momentum of this moment to hold the social media giants to account, to demand that they embed safety into their platforms rather than adding it as an afterthought, and to work closely with experts and children and young people themselves to make online spaces safer, as opposed to off-limits.”

    The Australian Human Rights Commission, an independent government body, also expressed “serious reservations” over the law before it was approved, saying last year that there were “less restrictive alternatives available that could achieve the aim of protecting children and young people from online harms, but without having such a significant negative impact on other human rights. One example of an alternative response would be to place a legal duty of care on social media companies.”

  • Disney to pay $10 million to settle FTC lawsuit over collecting kids’ data

    Disney will pay $10 million to settle allegations by the Federal Trade Commission that the entertainment company facilitated the “unlawful collection” of children’s personal data.

    In a complaint filed on Tuesday, the FTC said that Disney Worldwide Services and Disney Entertainment Operations — two entities that offer technical support and media content — violated the Children’s Online Privacy Protection Rule, known as COPPA, by failing to properly label some videos uploaded to YouTube as “made for kids.” The mislabeling also exposed children to “age-inappropriate YouTube features,” the FTC said in a statement.

    “Our order penalizes Disney’s abuse of parents’ trust, and, through a mandated video-review program, makes room for the future of protecting kids online — age assurance technology,” FTC Chairman Andrew N. Ferguson said in a statement. 

    Signed into law in 1998, COPPA requires commercial website operators to disclose to parents of children under 13 that they are collecting personal data and obtain the parents’ prior consent.

    The videos in question included content from Disney movies including “Coco,” “Frozen” and “Toy Story,” as well as music from “The Incredibles.”

    A spokesperson for Disney told CBS News that the settlement does not involve Disney-owned and operated digital platforms and that it is limited to some of the content on the company’s YouTube platform.

    “Disney has a long tradition of embracing the highest standards of compliance with children’s privacy laws, and we remain committed to investing in the tools needed to continue being a leader in this space,” the spokesperson said in a statement. 

    YouTube requires videos to be labeled as “made for kids” if children are the video’s primary audience or if the content reflects “an intent to target children,” according to the Alphabet-owned platform. YouTube also says on its website that failure to properly label videos could lead to “legal consequences under COPPA and other laws.” 

    YouTube began requiring video uploaders to add the “made for kids” label after it reached a similar settlement in 2019 with the FTC over COPPA violations. 

    Disney’s agreement with the FTC also calls for the company to create a program to review whether videos posted to YouTube should be designated as made for children, the agency said.

  • Senate panel forced to use U.S. marshals to subpoena CEOs of X and Discord to testify on protecting kids online

    Washington — A Senate committee has issued bipartisan subpoenas to the CEOs of Discord, Snap and X demanding that the heads of the three companies testify at a December hearing on protecting children online.

    Senate Judiciary Committee Chairman Dick Durbin, D-Ill., and South Carolina Sen. Lindsey Graham, the top Republican on the panel, announced Monday that they had issued the subpoenas to Discord CEO Jason Citron, Snap CEO Evan Spiegel and Linda Yaccarino, the CEO of X, formerly known as Twitter, “after repeated refusals to appear” during weeks of negotiations.

    “Big Tech’s failure to police itself at the expense of our kids cannot go unanswered,” the two senators said in a statement.

    The committee said that “in a remarkable departure from typical practice,” Discord and X refused to accept service of the subpoenas and the panel was forced to enlist the U.S. Marshals Service to personally subpoena the CEOs.

    According to the Reuters news service, X’s head of US & Canada Government Affairs, Wifredo Fernandez, said in a statement that X has been “working in good faith to participate in the Judiciary committee’s hearing … as safety is our top priority at X. Today we are communicating our updated availability to participate in a hearing on this important issue.”

    And Reuters cites Discord as saying in a statement that, “Keeping our users safe, especially young people, is central to everything we do at Discord. We have been actively engaging with the Committee on how we can best contribute to this important industry discussion.”

    The Dec. 6 hearing will focus on child sexual exploitation online. Durbin and Graham said the committee remains in discussions with both Meta and TikTok and expects their CEOs, Mark Zuckerberg and Shou Zi Chew, to testify voluntarily.

    Social media companies have faced criticism from lawmakers, regulators and the public for harms their platforms cause to children and teenagers. Most recently, Meta was sued by 41 states and Washington, D.C., for contributing to the youth mental health crisis by knowingly designing features on Instagram and Facebook that addict teenagers to the platforms.

  • Senators reintroduce bill to help protect kids from harmful online content

    Senators have reintroduced the Kids Online Safety Act, which would give parents and minors new controls. Senators Richard Blumenthal and Marsha Blackburn say the bill would also require social media companies to provide options for minors to protect their information and disable addictive features. Congressional correspondent Nikole Killion reports.
