CNN —

The world’s largest tech companies must comply with a sweeping new European law starting Friday that affects everything from social media moderation to targeted advertising and counterfeit goods in e-commerce — with possible ripple effects for the rest of the world.

The unprecedented EU measures for online platforms will apply to companies including Amazon, Apple, Google, Meta, Microsoft, Snapchat and TikTok, among many others, reflecting one of the most comprehensive and ambitious efforts by policymakers anywhere to regulate tech giants through legislation. It could lead to fines for some companies and to changes in software affecting consumers.

The rules seek to address some of the most serious concerns that critics of large tech platforms have raised in recent years, including the spread of misinformation and disinformation; possible harms to mental health, particularly for young people; rabbit holes of algorithmically recommended content and a lack of transparency; and the spread of illegal or fake products on virtual marketplaces.

Although the European Union’s Digital Services Act (DSA) passed last year, companies have had until now to prepare for its enforcement. Friday marks the arrival of a key compliance deadline — after which tech platforms with more than 45 million EU users will have to meet the obligations laid out in the law.

The EU also says the law intends “to establish a level playing field to foster innovation, growth and competitiveness both in the European Single Market and globally.” The action reinforces Europe’s position as a leader in checking the power of large US tech companies.

For all platforms, not just the largest ones, the DSA bans data-driven targeted advertising aimed at children, as well as targeted ads to all internet users based on protected characteristics such as political affiliation, sexual orientation and ethnicity. The restrictions apply to all kinds of online ads, including commercial advertising, political advertising and issue advertising. (Some platforms had already in recent years rolled out restrictions on targeted advertising based on protected characteristics.)

The law bans so-called “dark patterns,” or the use of subtle design cues that may be intended to nudge consumers toward giving up their personal data or making other decisions that a company might prefer. An example of a dark pattern commonly cited by consumer groups is when a company tries to persuade a user to opt into tracking by highlighting an acceptance button with bright colors, while simultaneously downplaying the option to opt out by minimizing that choice’s font size or placement.

The law also requires all online platforms to offer ways for users to report illegal content and products and to appeal content moderation decisions. And it requires companies to spell out their terms of service in an accessible manner.

For the largest platforms, the law goes further. Companies designated as Very Large Online Platforms or Very Large Online Search Engines will be required to undertake independent risk assessments focused on, for example, how bad actors might try to manipulate their platforms, or use them to interfere with elections or to violate human rights — and companies must act to mitigate those risks. And they will have to set up repositories of the ads they’ve run and allow the public to inspect them.

Just a handful of companies are considered very large platforms under the law. But the list finalized in April includes the most powerful tech companies in the world, and, for those firms, violations can be expensive. The DSA permits EU officials to issue fines worth up to 6% of a very large platform’s global annual revenue. That could mean billions in fines for a company as large as Meta, which last year reported more than $116 billion in revenue.
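To put that 6% cap in concrete terms, here is a back-of-the-envelope sketch using the roughly $116 billion revenue figure the article cites for Meta; the variable names and the rounding are illustrative assumptions, and any real penalty would be set by EU regulators case by case.

```python
# Illustrative only: the DSA caps fines at up to 6% of a very large
# platform's global annual revenue. Using Meta's reported ~$116 billion
# in revenue last year (figure cited in the article) as an example.

MAX_FINE_RATE = 0.06            # DSA ceiling: up to 6% of global annual revenue
meta_annual_revenue = 116e9     # ~$116 billion, as reported last year

max_possible_fine = MAX_FINE_RATE * meta_annual_revenue
print(f"Maximum possible DSA fine: ${max_possible_fine / 1e9:.1f} billion")
# -> Maximum possible DSA fine: $7.0 billion
```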

Companies have spent months preparing for the deadline. As recently as this month, TikTok rolled out a tool for reporting illegal content and said it would give EU users specific explanations when their content is removed. It also said it would stop showing ads to teens in Europe based on the data the company has collected on them, all to comply with the DSA rules.

“We’ve been supportive of the objectives of the DSA and the creation of a regulatory regime in Europe that minimizes harm,” said Nick Clegg, Meta’s president of global affairs and a former deputy prime minister of the UK, in a statement Tuesday. He said Meta assembled a 1,000-person team to prepare for DSA requirements. He outlined several efforts by the company including limits on what data advertisers can see on teens ages 13 to 17 who use Facebook and Instagram. He said advertisers can no longer target the teens based on their activity on those platforms. “Age and location is now the only information about teens that advertisers can use to show them ads,” he said.

In a statement, a Microsoft spokesperson told CNN the DSA deadline “is an important milestone in the fight against illegal content online. We are mindful of our heightened responsibilities in the EU as a major technology company and continue to work with the European Commission on meeting the requirements of the DSA.”

Snapchat parent Snap told CNN that it is working closely with the European Commission to ensure the company is compliant with the new law. Snap has appointed several dedicated compliance employees to monitor whether it is living up to its obligations, the company said, and has already implemented several safeguards.

And Apple said in a statement that the DSA’s goals “align with Apple’s goals to protect consumers from illegal and harmful content. We are working to implement the requirements of the DSA with user privacy and security as our continued North Star.”

Google and Pinterest told CNN they have also been working closely with the European Commission.

“We share the DSA’s goals of making the internet even more safe, transparent and accountable, while making sure that European users, creators and businesses continue to enjoy the benefits of the web,” a Google spokesperson said.

A Pinterest spokesperson said the company would “continue to engage with the European Commission on the implementation of the DSA to ensure a smooth transition into the new legal framework.” The spokesperson added: “The wellbeing, safety and privacy of our users is a priority and we will continue to build on our efforts.”

Many companies should be able to comply with the law, given their existing policies, teams and monitoring tools, according to Robert Grosvenor, a London-based managing director at the consulting firm Alvarez & Marsal. “Europe’s largest online service providers are not starting from ground zero,” Grosvenor said. But, he added: “Whether they are ready to become a highly regulated sector is another matter.”

EU officials have signaled they will be scrutinizing companies for violations. Earlier this summer, European officials performed preemptive “stress tests” of X, the company formerly known as Twitter, as well as Meta and TikTok to determine the companies’ readiness for the DSA.

For much of the year, EU Commissioner Thierry Breton has been publicly reminding X of its coming obligations as the company has backslid on some of its content moderation practices. Even as Breton concluded in June that X was taking its stress test seriously, the company had just lost a top content moderation official and had withdrawn from a voluntary EU commitment on disinformation that European officials had said would be part of any evaluation of a platform’s compliance with the DSA.

X told CNN ahead of Friday’s deadline that it was on track to comply with the new law.

Analysts anticipate that the EU will be watching even more closely after the deadline — and some hope the rules will either encourage tech platforms to voluntarily extend their EU practices to the rest of the world or drive policymakers elsewhere to adopt similar measures.

“We hope that these new laws will inspire other jurisdictions to act because these are, after all, global companies which apply many of the same practices worldwide,” said Agustin Reyna, head of legal and economic affairs at BEUC, a European consumer advocacy group. “Europe got the ball rolling, but we need other jurisdictions to win the match against tech giants.”

Already, Amazon has sought to challenge the very large platform label in court, arguing that the DSA’s requirements are geared toward ad-based online speech platforms, that Amazon is a retail platform and that none of its direct rivals in Europe have likewise been labeled, despite being larger than Amazon within individual EU countries.

The legal fights could present the first major test of the DSA’s durability in the face of Big Tech’s enormous resources. Amazon told CNN that it plans to comply with the EU General Court’s decision, either way.

“Amazon shares the goal of the European Commission to create a safe, predictable and trusted online environment, and we invest significantly in protecting our store from bad actors, illegal content, and in creating a trustworthy shopping experience,” an Amazon spokesperson said. “We have built on this strong foundation for DSA compliance.”

TikTok did not immediately respond to a request for comment on this story.
