In brief

On 4 October 2022, the Council of the European Union gave its final approval to the Digital Services Act (DSA), leaving the text approved by the European Parliament unchanged. On 5 July 2022, the European Parliament had also approved the Digital Markets Act (DMA), which was still pending a final vote in the Council. The DMA and the DSA regulate the legal status of providers of intermediary services (e.g., online platforms such as marketplaces, search engines, social networks, hosting services, etc.) and thus also affect other actors (users and businesses of all sizes) that interact through their services.


  1. DSA, the Digital Services Act
  2. DMA, the Digital Markets Act
  3. Conclusions

Neither regulation needs to be transposed by the Member States: both are directly applicable. They repeal earlier rules (for example, the e-Commerce Directive 2000/31/EC is partially repealed as regards the liability of intermediary service providers) and complement others (as in the case of consumer law or the rules that contribute to defining terms such as “unlawful content”). As regards the timeframe, there are different intervals: some DSA obligations will apply immediately, other rules will apply roughly four months after the Commission notifies the “very large online platforms” concerned, and further duties will not apply until January 2024. Most DMA obligations will apply six months after its entry into force (which occurs 20 days after its publication in the Official Journal of the European Union).

As with the General Data Protection Regulation (GDPR), the DSA and the DMA apply, under certain conditions, to companies that are not established in the European Union but offer their services there (such companies are also required to appoint a legal representative in the European Union).

DSA, the Digital Services Act

The DSA regulates the liability of information society intermediary service providers, or “providers of intermediary services”, which we will refer to simply as “Platforms” in this article. In summary, the DSA:

  1. Harmonises and imposes obligations on these Platforms (with nuances depending on their size). In particular, it imposes stricter due diligence obligations regarding content that (i) is unlawful, (ii) constitutes disinformation or (iii) is considered a risk to society. This results in the following obligations for Platforms, among others:
    • Establish notice-and-action mechanisms and a single point of contact, and publish relevant information on how to contact that point electronically (including the languages to be used in such communications), so that illegal content and/or content that does not comply with the Platform’s terms and conditions can be reported. These mechanisms must allow for anonymous reporting and meet certain minimum requirements;
    • Provide sufficient resources to respond quickly and efficiently to such complaints;
    • Clearly state in their terms and conditions the grounds for restricting the service (e.g., removal of content) and inform the user whose reported content has been removed of the specific application of those terms (stating the reasons and the available means of redress);
    • Allow the user whose content has been removed to challenge the decision before the Platform itself and before the courts;
    • Publish, depending on the size of the Platform company, an annual report detailing the measures taken as a result of its content moderation activities;
  2. Maintains and approximates the exemptions from liability for intermediary service providers set out in Directive 2000/31/EC, specifying some situations where they will not apply and clarifying others where they will. An intermediary will be liable for illegal content if it does not promptly remove it once there are clear indications of its presence on the Platform;
  3. Retains the prohibition on imposing a general duty to monitor content;
  4. Establishes cases where “marketplace”-type platforms are liable under consumer law for what is sold or transacted through them if an average consumer would perceive the platform to be the seller;
  5. Places further reporting obligations, especially on online search engines;
  6. Prohibits “dark patterns”, i.e. misleading interfaces that use certain visual tactics to steer users towards decisions that may harm them, as well as other bad practices related to usability;
  7. Imposes greater transparency obligations on Platforms in relation to advertising (on whose behalf an advertisement is displayed, what is considered the target audience of the advertising, etc.);
  8. Explicitly prohibits targeted advertising based on profiling (a) using special categories of personal data (e.g., racial, political or health data) or (b) where the recipient of the service is a minor;
  9. Imposes transparency obligations on Platforms regarding how they organise and recommend the content they offer to users;
  10. Requires Platforms to retain information about the contractual terms and conditions that their merchant users offer to their consumer users (and merchant users must provide those terms and conditions);
  11. Provides for very large online platforms to be subject to annual independent audits of their compliance with the obligations under this Regulation.

The DSA also stipulates that each Member State must designate an administrative authority responsible for supervising and enforcing the obligations imposed on Platforms; that authority may fine them up to 6% of their annual global turnover and require them to adopt various measures.

DMA, the Digital Markets Act

The DMA legislates for large digital players, which it calls “gatekeepers”, and, given their great economic power, seeks to prevent them from engaging in unfair practices, thereby complementing competition law.

In summary, the DMA:

  1. Sets strict criteria that an information society service provider must meet to be considered a “gatekeeper” and thus fall within the scope of the DMA;
  2. States that the Commission will designate which companies will be considered as “gatekeepers”;
  3. Indicates that, in addition to their rights under the GDPR, users of gatekeepers’ services should be able to use those services (at least their basic functions) in a less personalised way that does not involve the processing of personal data. This alternative must offer the same level of quality as the personalised option;
  4. Prohibits practices that harm users of gatekeepers’ services, such as discontinuing products without clear reasons; making certain content inaccessible or non-interoperable unless it is purchased through the gatekeeper’s channels (e.g., its app store); setting predatory prices to restrict competition; or favouring certain software solutions by installing or uninstalling them by default. Gatekeepers are also prohibited from using aggregated information when they play a dual role (e.g., as provider of promotional services for a technology and as a competitor offering a similar technology) that places them in a situation akin to a conflict of interest;
  5. Establishes information requirements for gatekeepers to inform the various stakeholders in the online advertising system of their conditions (e.g., the method of calculating prices and remuneration);
  6. Provides that gatekeepers must facilitate the portability of information and data between Platforms;
  7. States that gatekeepers must report to the Commission and the general public on the measures taken to comply with this Regulation and the GDPR;
  8. Stipulates that gatekeepers must inform the Commission of any intended acquisition of other companies in the sector;
  9. Requires gatekeepers to (a) submit to the Commission an independently audited description of the profiling techniques they use and (b) publish a summary version of that audited description.

The Commission may impose fines of up to 10% of a gatekeeper’s annual global turnover for non-compliance with the DMA, rising to 20% in the event of repeated infringements.


Both regulations are very far-reaching and complex. They provide for very high penalties and therefore give rise to significant legal risks. Consequently, in some cases it will be advisable to carry out an audit, implement the necessary changes and establish an action plan to mitigate those risks.

Link to related content: EU: The European Commission Digital Services Act – What does the future hold?


Cristina Duch joined the Firm’s Intellectual Property Practice Group in 2006. She has significant experience in a wide range of intellectual property matters, with particular emphasis on trademarks, designs, unfair competition and advertising. Ms. Duch is a member of the Spanish Institute of Chartered Industrial Property Agents.


Itziar Osinaga is an Associate in Baker McKenzie’s Barcelona office.


David Molina is a Senior Associate in Baker McKenzie’s Barcelona office.


Patricia Perez is a Team Leader in Baker McKenzie’s Madrid office.


Pablo Usle is a Team Leader in Baker McKenzie’s Madrid office.
