The virtual space of the online platforms will be turned upside down by the Digital Services Act (DSA). Once the law is fully rolled out, by the beginning of 2024, social networks will never be the same as before. The basic principle is: what is illegal offline must also be illegal online. Internet giants such as Meta will also have to allow inspection of the algorithms that determine which messages a user sees on Facebook or Instagram. Social bots and click farms will be stopped. It is impossible to be exhaustive in this review: the text runs to 310 pages. Lawyers, human rights and other civil society organizations should study the law for themselves. What’s in it? Let’s have a closer look.
Preface and some definitions
Since 15 June 2022, the full text of the Digital Services Act (DSA) has been available on the EP website as ‘Text of the provisional agreement on the digital services act’, with a download link. The primary aim of the DSA is to harmonise the regulation of digital services across Europe; as an EU Regulation and not a Directive, it is directly applicable in every member state. The DSA requires an enforcement regime that operates seamlessly to ensure equal application of the regulation throughout the Union.
The preamble is numbered by paragraph up to number (106a), followed by the 74 numbered articles of the law. I limit myself to rules for social networks that are important to users, and only deal with matters applicable in Belgium, because once the seat of one of the major platforms is located in an EU country, such as Ireland or Luxembourg, things get very complicated. An elaborate series that also covers the global impact on online platform governance and enforcement can be found on the website of the Center for Democracy and Technology: first, second and third.
The term “social networks” (Meta… and TikTok) or “internet platforms” (Apple App Store, Amazon… and Google) is used in the preamble as we use it in everyday language. In the legal texts, the term ‘intermediary services’ is often used as a container for both, so I will also sometimes use that term when the law applies to both. VLOP stands for ‘Very Large Online Platforms’; VLOSE stands for ‘Very Large Online Search Engines’. In both cases this means services with more than 45 million users in the EU (see also article 25).
The definition of “illegal content” is as follows:
“In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that the applicable rules make illegal in view of the fact that it relates to activities that are illegal. Illustrative examples include the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of noncompliant or counterfeit products, the sale of products or the provision of services in infringement of consumer protection law, the non-authorised use of copyright protected material, the illegal offer of accommodation services or illegal sale of live animals.” (paragraph 12)
Roll-out and implementation
“This Regulation shall enter into force on the twentieth day following that of its publication in the Official Journal of the European Union. It shall apply from … [15 months after entry into force or 1 January 2024, whichever is later]. However, article 23(2) and (3), article 25(4) to (6) and Sections 3 to 5 of Chapter IV shall apply from … [the date of entry into force of this Regulation].
By derogation from the first subparagraph of this paragraph, this Regulation shall apply to providers of very large online platforms and very large online search engines designated pursuant to Article 25(4) from four months after the notification to the provider concerned referred to in Article 25(6) in case where that date is earlier than …[15 months after entry into force or 1 January 2024, whichever is later].” (article 74 copied)
So the following legal provisions take effect immediately, twenty days after publication: article 25, which defines the VLOPs and VLOSEs; Sections 3 to 5 of Chapter IV, which deal with supervision, detection, enforcement and monitoring of VLOPs and VLOSEs; and article 23(2) and (3), which deal with transparency reporting obligations for providers of online platforms, with the first report due at the latest three months after entry into force. The other rules can only be applied more than a year later, because the EU still has to set up a whole apparatus to control and enforce them. Article 74 is real legal lingo; I had to read it several times before I understood it. To make it easier for the reader, I will state “immediately applicable” for the articles that apply immediately. For the others, it will therefore have to wait until 1 January 2024 at the earliest.
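To make the “whichever is later” timing rule of Article 74 concrete, here is a minimal sketch of the date arithmetic, assuming a placeholder publication date (not the actual date of publication in the Official Journal):

```python
from datetime import date, timedelta

def add_months(d: date, months: int) -> date:
    # Naive month addition; good enough for this illustration.
    total = d.month - 1 + months
    return date(d.year + total // 12, total % 12 + 1, d.day)

# Placeholder publication date in the Official Journal (an assumption, not the real date).
publication = date(2022, 10, 1)

# Entry into force: the twentieth day following publication (Article 74).
entry_into_force = publication + timedelta(days=20)

# General application: 15 months after entry into force or 1 January 2024, whichever is later.
general_application = max(add_months(entry_into_force, 15), date(2024, 1, 1))

print("Entry into force (rules marked 'immediately applicable'):", entry_into_force)
print("General application of the remaining obligations:", general_application)
```

With any realistic publication date, the later of the two dates falls in early 2024, which is why the bulk of the obligations only bite from then.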
Layered approach to enforcement and punishment of violations
Enforcement is split between Digital Services Coordinators, DSCs, and the European Commission itself, which is supported in this by the European Board for Digital Services Coordinators, hereinafter referred to as the Board. The Board consists of a meeting of all DSCs chaired by the Commission, and is responsible for the consistent application of the Digital Services Act.
Each EU Member State must designate a national authority as the Digital Services Coordinator, which will be responsible for overseeing and enforcing the DSA at national level (article 38). Since an existing authority can be assigned the role of DSC, discussions among Member States show that the audiovisual regulators are the preferred candidates to fulfil this role, although specific tasks can be assigned to other competent authorities, such as regulatory authorities for electronic communications or consumer protection authorities. The DSC must coordinate cooperation between all relevant competent authorities and must be clearly independent and free from political or private influences (article 39).
The DSCs’ investigative powers include, but are not limited to, requiring intermediary services to provide information and the ability to carry out inspections of premises (or to have a judicial authority order one) in order to seize information related to suspected breaches. Their enforcement powers include imposing fines or periodic penalty payments, requesting a court to temporarily restrict access to an online service, and the power to take interim measures (article 41).
The European Commission retains investigative powers that may be exercised through an investigation on its own initiative, or at the request of a DSC that suspects a possible infringement of the Regulation by a VLOP/VLOSE affecting users within its Member State.
Similarly, the Commission may require VLOPs/VLOSEs to provide information relevant to their investigation of a suspected infringement; within these requests, the Commission must provide the legal basis, specify what information is required and set a deadline for the information to be provided (article 52 immediately applicable).
In addition, the Commission can issue non-compliance decisions (article 58, immediately applicable); impose fines not exceeding 6% of the total worldwide annual turnover of the service provider (article 59, immediately applicable); and make the commitments entered into by VLOPs/VLOSEs legally binding to ensure compliance (article 56, immediately applicable). However, the European Commission will have to consistently consult the Board on various enforcement measures.
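To give a sense of the scale of the 6% ceiling, here is a trivial sketch applying it to a made-up worldwide annual turnover figure (the amount is purely hypothetical):

```python
# Hypothetical worldwide annual turnover of a VLOP, in euros (a made-up figure).
annual_turnover_eur = 100_000_000_000  # 100 billion euros

MAX_FINE_RATE = 0.06  # fines may not exceed 6% of total worldwide annual turnover

max_fine_eur = annual_turnover_eur * MAX_FINE_RATE
print(f"Maximum fine under the 6% cap: EUR {max_fine_eur:,.0f}")  # EUR 6,000,000,000
```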
The European Board for Digital Services Coordinators is a completely new and totally independent body within the EU. Its role is to support the consistent application of the DSA. The Board consists of all DSCs, all of whom have voting rights in this formation, and is chaired by the European Commission, which has no voting rights. The Board’s role is mainly advisory: it will, among other things, contribute to drafting codes of conduct and support joint investigations between DSCs and the European Commission.
The Board may advise the European Commission and the DSCs on enforcement action, in particular with regard to VLOPs/VLOSEs, and may provide implementation guidance to the DSCs. While these guidelines are not binding, the opinions and drafts issued by a DSC should be taken into account when the European Commission is drafting a relevant regulation.
Reports by trusted flaggers must be given priority by the providers of online platforms. The Digital Services Coordinator may award trusted flagger status to a nominated entity provided that (a) it has particular expertise and competence in detecting, identifying and reporting illegal content; (b) it is independent of any online platform provider; and (c) it conducts its reporting activities in a diligent, accurate and objective manner.
The trusted flagger must be notified to the Commission, which then registers the entity’s address and email in a publicly accessible database. The entity must prepare an annual report on its activities. In the event of complaints from online platform operators about too many unnecessary or incorrect reports, the Digital Services Coordinator must investigate that entity; during the investigation, its trusted flagger status is suspended. This investigation must be carried out as soon as possible, and the trusted flagger status can ultimately be revoked (article 19).

Transparency and accountability are required of VLOPs and VLOSEs
This rule puts an end to the arbitrariness often experienced when content is removed. VLOPs and VLOSEs must clearly communicate their terms of use to their users. This includes information about all policies, procedures, measures and tools used to moderate content, including algorithmic decision-making and human review, as well as the procedural rules of their internal complaint-handling system. It must be written in plain, intelligible, user-friendly and unambiguous language and be available to the public in an easily accessible and readable form. VLOPs and VLOSEs must provide recipients of their services with a concise, easily accessible summary of the terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous terms.
When a provider removes content because it is illegal, it must provide a sufficiently substantiated explanation of why the content in question is considered illegal.
Intermediary service providers must provide clear and specific justification to all affected recipients of the service for any of the following restrictions imposed:
(a) any restrictions on the visibility of specific information provided by the recipient of the service, including removing content, disabling access to or downgrading content;
(b) suspension, termination or other limitation of monetary payments (monetization);
(c) suspension or termination of the service in whole or in part;
(d) suspension or termination of the recipient’s accounts.
Point (a) refers to hiding messages online without deleting them.
(articles 10, 11, 12, 13, 14, 15)
See more in the text of the law
“At the latest by … [three months after date of entry into force of this Regulation] and at least once every six months thereafter, providers of online platforms shall publish in a publicly available section of their online interface information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and, where such methodology has been established, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2)” (article 23 section 2 and 3 immediately applicable)
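A minimal sketch of the calculation described in article 23(2), using made-up monthly figures and the 45-million threshold from article 25; the official methodology may still be refined by delegated acts:

```python
# Hypothetical monthly active recipients in the Union over the past six months.
monthly_active_recipients = [
    41_200_000, 42_800_000, 44_100_000,
    45_900_000, 46_300_000, 47_000_000,
]

average = sum(monthly_active_recipients) / len(monthly_active_recipients)
VLOP_THRESHOLD = 45_000_000  # article 25: more than 45 million users in the EU

print(f"Average monthly active recipients: {average:,.0f}")
print("Above the VLOP/VLOSE threshold" if average > VLOP_THRESHOLD
      else "Below the VLOP/VLOSE threshold")
```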
No mandatory permanent monitoring of users, and conditional liability for online platforms
An important victory for digital rights is the introduction of the conditional liability regime and the ban on a general monitoring obligation. This ensures that people’s online content is protected from mass arbitrary deletion, safeguarding the free flow of information. The internet is crucial to the exercise of our right to freedom of expression, which is especially important for the work of human rights defenders, whistleblowers and journalists. We cannot allow arbitrary, unscrutinised decisions about which content should remain online or be removed (Articles 3, 4, 5, 6 and 7).
See more in the text of the law
Notice of Illegal Content and Liability of the Online Platforms
Online platform services must provide notice-and-action mechanisms that any person or entity can use to flag content they consider illegal. These mechanisms must be easily accessible and user-friendly, and must allow notices to be submitted electronically. It is through this notification mechanism that ‘actual knowledge’, and thus potential liability for the dissemination of illegal content, can arise. However, such a notice must be sufficiently precise and substantiated for an online platform operator to establish illegality without conducting a detailed legal investigation (Article 14).
See also the complete text of this article.
Actual knowledge of illegal content can also be obtained through a platform’s own voluntary investigations, through notices from a trusted flagger or a Digital Services Coordinator, or through the receipt of court orders.
DSA restricts targeted advertising and limits the categories that can be used for it
Online platform providers that present advertisements on their online interfaces must ensure (a) that it is clear that the content is an advertisement; (b) that it is clear which natural or legal person the advertisement is presented on behalf of, or, if different, who paid for it; and (c) that meaningful information about the main parameters used to determine the recipient of the advertisement is disclosed and, where applicable, how those parameters can be changed. That information must be directly and easily accessible from the advertisement.
Online platform providers are not allowed to serve advertising based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 using the special categories of sensitive data referred to in Article 9(1) of Regulation (EU) 2016/679. Here’s a literal quote:
“Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.”
Providers of online platforms accessible to minors must take appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors in their services (Articles 24 and 30).
DSA stands for rapid removal of illegal content and denial of access for repeat offenders
Upon receipt of an order to act against illegal content, issued by the relevant national judicial or administrative authorities on the basis of applicable EU or national law and in accordance with EU law, social network service providers must inform the authority that issued the order, or any other authority specified in the order, without delay of any follow-up given to it, indicating if and when the order was applied (Articles 8 and 9).
Online platform providers must suspend, for a reasonable period of time and after a prior warning, the provision of their services to recipients who frequently provide manifestly illegal content (Article 20).
See more in the text of the law.
VLOPs and VLOSEs must assess and mitigate risks
Providers of very large online platforms must carefully analyse and assess all systemic risks arising from their design, including their algorithmic systems, in order to assess the risks of the operation and use of their services in the EU. The following systemic risks are important: (a) the dissemination of illegal content; (b) any actual or foreseeable negative impact on the exercise of fundamental rights, in particular the fundamental right to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including media freedom and pluralism, the prohibition of discrimination, the rights of the child and consumer protection; (c) any actual or foreseeable negative impact on public debate and electoral processes, and on public safety.
In conducting risk assessments, providers of very large online platforms shall in particular consider whether and how the following factors influence any of these systemic risks: (a) the design of their recommendation systems and any other relevant algorithmic system; (b) their content moderation systems; (c) the applicable terms and conditions and their enforcement; (d) advertising selection and presentation systems; (e) data-related practices of the provider.
Providers of very large online platforms shall keep the supporting documents of the risk assessments for at least three years after the risk assessments have been carried out and communicate them to the Commission and the Digital Services Coordinator upon request (Article 26).
Providers of very large online platforms should take reasonable, proportionate and effective mitigation measures targeting specific systemic risks identified in accordance with Article 26, paying particular attention to the fundamental rights impact of such measures (Article 27).
See more in the text of the law.
VLOPs and VLOSEs must allow independent audits on an annual basis
VLOPs and VLOSEs must annually permit independent audits of their obligations laid down in (a) Chapter III (Articles 10 to 37) and (b) all commitments entered into on the basis of the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
Providers of very large online platforms shall provide the organizations conducting the audits under this Article with the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by granting access to all relevant data and premises and by answering oral or written questions. They shall refrain from hindering, unduly influencing or undermining the performance of the audit.
The organizations conducting the audits shall ensure sufficient confidentiality and professional secrecy with regard to the information obtained in the context of the audits from the providers of very large online platforms and third parties, even after the audits have ended. (Article 28)
See more in the text of the law.
Notes and provisional evaluation
For an overall assessment shared by several members of the Digital Services Act Human Rights Alliance, DSAHRA, here is a quote from Eliska Pirkova of Access Now:
“Yet, the hardest part for the DSA is likely ahead: its successful enforcement. The coming years will prove the strength of the EU content moderation rulebook and whether it can live up to its job. Access Now will continue to monitor the DSA enforcement mechanism, and will push for a human rights-centric approach to content moderation across Europe.”
The DSA does provide weapons to put an end to click farms and social bots fairly quickly, but the EU will have to act decisively. If you look at what is asked of the VLOPs, it is a lot: transparency and accountability, audits, assessing and limiting risks; all the opposite of what they are used to doing. The EU has a big stick behind the door, but it cannot simply close the shop; users would hate that, by the way. In any case, it would be smart to stimulate the development of alternative social networks in such a way that users can switch from old to new; there are opportunities for that now.
An important argument for stimulating the development of alternative social networks can be found in the evaluation by EDRi. EDRi is also positive, but points out that the DSA regulates visible targeted advertising while leaving the invisible tracking systems untouched. Literal quote:
“But despite the human rights improvements the DSA can bring to people, real alternatives to the current dominant surveillance business model are still needed. While the ad tech industry often claims to be useful to people by providing more “relevant” ads, it is most of all characterised by an omnipresent system of pervasive 24/7 online corporate surveillance.
Google trackers are embedded in almost 90% of free-of-charge Android apps and in over 87% of all websites. People’s data are being massively extracted to create detailed profiles, and given that these trackers are invisible to users, people are unable to meaningfully object to being surveilled by the ad tech industry. Losing control of the content you see directly limits your freedom of information and expression, enables discriminatory practices by advertisers and amplifies social stereotyping.
That’s why it is disappointing that the EU did not fully phase out surveillance-based online advertising.”
According to the Global Disinformation Index, at least $235 million in annual revenue is generated from advertisements displayed on extremist and disinformation websites. This is an issue that has not been resolved by DSA.
The DSAHRA didn’t get everything, but it got a lot. The Alliance had also proposed to exclude police forces as candidate ‘trusted flaggers’, because it rightly fears that in countries such as Poland and Hungary police forces could abuse such an appointment. This proposal failed. Since the ‘trusted flaggers’ must be notified to the Commission, and this data is made public, action can still be taken: citizens can submit a complaint to the Board in the event of abuse.
Article 7 (1a), proposed by the European Parliament, which wanted to prohibit legally mandated automated decision-making from being imposed on online platforms, has not been retained in the final law.
A proposal that would have strengthened the protection of encrypted online communications also failed to reach the final text, but the use of encryption has not been restricted in Europe. In the US it’s a different story: there, even Signal isn’t safe from the NSA.
The requirement for the interoperability of messaging applications, part of the Digital Markets Act, DMA, offers an opening for alternative messaging services. But more research is needed into the feasibility of such a large project on a national scale. What does not help is what a small part of the European intelligentsia is proclaiming in the media. When Patrick Van Eecke, professor of ICT law at the University of Antwerp, says, for example, that there is a protectionist odour around the DSA legislation, that is below par. How so? The EU receives congratulations from Obama and from the American Frances Haugen. And the American Center for Democracy & Technology wants to propagate the DSA in the US. Protectionist?
A very interesting note for researchers was published by Mathias Vermeulen, PhD in EU law, about the scope of the data-sharing regime under the Digital Services Act, as well as which researchers will be able to access data under the framework, and using what process. It then evaluates the guidance and commitments contained in the European Union’s Code of Practice on Disinformation and the European Digital Media Observatory’s Code of Conduct, including how these instruments relate to one another and operate within the broader regime. You will find it at:
https://tsjournal.org/index.php/jots/article/view/84/31