The UK’s data watchdog has fined TikTok £12.7mn for failing to protect children’s privacy, amid growing global concerns over the use of data at the Chinese-owned social media app.

The Information Commissioner’s Office on Tuesday said it estimated that up to 1.4mn UK children aged under 13 had used the viral-video app in 2020, despite TikTok’s own rules preventing children younger than 13 from creating accounts.

The platform had also failed to gain parental consent to use children’s data, contravening UK data protection laws, the regulator said.

The fine comes as TikTok faces an onslaught of security concerns from governments around the world.

Last month, TikTok’s chief executive was grilled by US legislators as the social media app attempted to head off a potential US ban over security fears linked to its Chinese ownership.

A growing number of governments, including the UK, EU, Canada and the US, have also banned the app from government devices.

In response to mounting pressure, TikTok last month laid out new measures to protect users’ data in Europe. This will see it open two data centres in Dublin, and a third in Norway, to store videos, messages and personal information generated by 150mn European users of the platform.

The ICO investigation found that TikTok “did not respond adequately” when a concern was raised internally with senior employees about children under 13 using the platform.

“TikTok should have known better, TikTok should have done better,” said John Edwards, the UK information commissioner.

TikTok said it had taken steps to prevent children from accessing its platform. It also publishes information on how many accounts are linked to users it suspects are under 13.

The social media app said it removed more than 17mn accounts in the last three months of 2022.

“We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community,” said TikTok.

Tech companies around the world face more stringent regulations on how they protect children’s data online.

The UK government is set to introduce its Online Safety Bill, which will bring in tougher requirements for social media companies and could see tech executives who fail to protect children online face prison sentences.

The ICO’s investigation into TikTok covered activity from May 2018 to July 2020, before the regulator introduced its Children’s Code in September 2021, which imposes stricter requirements on the processing of children’s data.

TikTok’s fine was reduced from the £27mn penalty proposed in September 2022, after the regulator dropped findings relating to the processing of “special category data”, such as biometric data.

“We will continue to review the decision and are considering next steps,” said TikTok.

Baroness Beeban Kidron, who first introduced an amendment for the Children’s Code in the House of Lords and founded children’s privacy charity 5Rights Foundation, said the tech sector should “accept the principle of delivering products and services with basic safety built in by design”.

“The future of tech will not be built on the back of children’s anxiety, inappropriate content and dangerous activities — but will be an accountable and regulated sector that prioritises the impact on children over profits,” she added.
