DSA: what are the new rules?

At the end of April 2022, the European institutions finalized their agreement on a new regulation that will replace the 2000 e-commerce directive.

From its entry into force, probably in early 2023, the Digital Services Act (DSA) will apply to online marketplaces, social networks, app stores, price comparison sites and search engines, including those not established in the European Union but that have users there.

What does it bring?

A “light-touch” liability principle

The DSA reaffirms that digital intermediaries are under no general obligation to monitor content. They incur no liability if they are not the source of the illegal content and if they promptly remove it once they become aware of it. Nothing new on this point, which is of considerable importance in guaranteeing freedom of expression and of enterprise on the Internet. The DSA confirms that this exemption from liability applies even if intermediaries carry out their own investigations to detect illegal content.

Two main questions are left unanswered by the new text:

  • the definition of “illegal content”: there is still no uniform definition of this notion across Europe;
  • the deadlines for removing illegal content: the national laws and case law of each Member State will have to settle this essential issue.

The DSA's real breakthroughs

Where the DSA really innovates is in clearly reinforcing the obligations of intermediaries, from whom genuine preventive action is now expected. To this end, the DSA distinguishes four increasingly extensive circles of obligations, each adapted to the intermediary's activity and size. Some apply to everyone; others only to hosts; others still only to platforms; and the last only to very large platforms (more than 45 million monthly active users in the EU, i.e. 10% of its current population).

All intermediaries must prepare an annual report on the notifications of illegal content they receive and on their moderation activities.

Hosts will also have to make it easier to notify illegal content and publish their removal decisions in a public database.

Platforms will also have to suspend the accounts of users who frequently provide illegal content or frequently submit unfounded notifications.

They will have to publish their anti-misuse policy and give priority to notifications from “trusted flaggers”. Platforms will also have to collect more information from the professionals who use them, about both those professionals and their products. Advertising transparency will have to be reinforced: internet users must, in particular, be able to identify the advertiser behind any given advertisement.

Very large platforms will have to carry out, once a year, an analysis of the systemic risks arising from their use (“fake news”, fake profiles, etc.) and mitigate those risks by adapting their moderation policies.

They will also undergo an independent audit once a year, giving the auditor effective access to relevant data. Their recommendation systems, which often favor divisive content, will have to be made transparent, and users must be able to modify their parameters. In the event of non-compliance with these obligations, very large platforms may be sanctioned by the European authorities, with fines of up to 6% of turnover.

It is therefore certainly in the strengthening and oversight of the obligations imposed on intermediaries that the DSA innovates the most.


Article written by


Anne Cousin has been a partner at Herald since 2010. She practices new technologies, internet and media law, and assists her clients, in both advisory work and litigation, in …

This text is published under the responsibility of its author. Its content does not in any way commit the Les Echos Solutions editorial team.
