The EU Data Governance Act is officially a go — the regulation became applicable on 24 Sept. The purpose of the DGA was not to confuse privacy professionals with another acronym but, rather, to create a mechanism to enable the safe use of public sector data, establish rules for providers of data intermediation services, introduce the concept of data altruism and establish the European Data Innovation Board (EDIB). I highly recommend this contributed article for untangling thorny DGA issues for privacy pros.

Aside from implementation challenges, there are at least two additional elements to contemplate when understanding the DGA. The EDIB is still in the making, but we know it will be a 40-member advisory body working to ensure harmonized practices and guidelines on the implementation of the DGA across the EU and European Economic Area. The European Commission ran a call for applications in July to select members who will be seated alongside representatives of member states and other authorities, including the European Data Protection Board, European Data Protection Supervisor and the European Union Agency for Cybersecurity.

Another point still largely left to the imagination is identifying the competent authorities that will be charged with enforcing the DGA. Each member state was set to designate — or set up, as the case may be — the appropriate authority by 24 Sept. Little information has surfaced so far. Spain announced the Ministry of Economic Affairs and Digital Transformation will be the competent authority for data intermediation services.

Elsewhere, the European Commission launched its Digital Services Act Transparency Database. Article 17 of the DSA, which entered into force 16 Nov. 2022, requires providers of hosting services, like cloud and web hosting services, to provide users with clear and specific information whenever they remove or restrict access to certain content. The database displays the full list of "statements of reasons" provided by 16 platforms to date, along with analytics on the categories, grounds and restrictions applied, including any monetary restrictions and whether the account was suspended or terminated.

The database launched this week and appears to reflect data from the last 20 days. A few statistics over that period:

  • The total number of statements of reasons issued exceeds 8.7 million. TikTok issued more than 4.3 million and Pinterest issued approximately 1.7 million, compared to Zalando at the other end of the spectrum, which appears to have issued none. X Corp, formerly known as Twitter, issued 23,087 statements.
  • The removal of content — as opposed to disabling access to content, applying an age restriction or other measures — is by far the most applied restriction, occurring in 5.5 million cases. The termination of monetary payments has been applied in a very limited number of cases, at 23,087. Accounts were suspended or terminated in 5.5% of cases.
  • Statements of reasons issued fall primarily into these categories: scope of platform service (3 million statements covering age-specific restrictions, goods/services not permitted to be offered on the platform, nudity, etc.); illegal or harmful speech (2 million); and pornography or sexualized content (1.5 million). Data protection and privacy violations — such as biometric data breaches, missing grounds for processing, right to be forgotten and data falsification — account for 311,952 cases. Animal welfare accounts for 2,994 statements.
  • Under the DSA's Article 17, hosting service providers can issue a statement of reasons on either of two grounds: illegal content or incompatibility with their terms and conditions. Interestingly, analytics show a split of 0.5% and 95.5%, respectively.

You can find additional information about the submission of statements and the applicable criteria here.