Martin's DSA Newsletter #8
Dec 19, 2025
The latest newsletter looks at the period from July 2025 to November 2025. It covers the first DSA cases before the General Court, new EC and national investigations, national enforcement cases, and other developments.
First batch of DSA cases is out
The General Court issued its Zalando and Amazon rulings, as well as rulings in fee cases initiated by Meta and TikTok. The Commission prevailed in Zalando and Amazon, but it lost in the fee cases. That does not tell the entire story, however. So let's look at these cases individually.
And as always, appropriate disclosure: I am involved as an attorney representing NGOs intervening on the side of the European Commission in Zalando, Technius and Apple [DMA]. With that out of the way, what do these cases say?
In Zalando v Commission T‑348/23 from September, the Seventh Chamber of the General Court, sitting in its extended composition of five judges, ruled that Zalando was properly designated as a VLOP. Call me biased, but I think it is the most important of all the pending cases. The General Court correctly understood the goals behind the DSA and how it relates to pre-existing law. It clearly distinguished accountability obligations from liability exemptions, and explained how the notion of neutrality relates to the latter, but not the former (§§ 35, 41). The case also clarified that the concept of average monthly active recipients (AMARs), as explained by the recitals, covers 'any exposure' to third-party information (§ 60). For AMAR counting, it clearly states that the inability to distinguish relevant from irrelevant users (e.g. because they were not exposed to user-generated content) is not a problem per se. The Commission, according to the Court, is 'entitled to consider that all those persons were deemed to have been actually exposed to that information' based on the submissions from companies (§ 67). As I have argued before, this means that the Commission only needs to assess the methodology used and its credibility, and reject all irrelevant criteria, such as Zalando's volume-based discounting of user numbers; it does not need to determine the right numbers itself. For designation purposes, that is the company's job (and cost).
In Amazon EU v Commission T-367/23 from November, the same Seventh Chamber of the General Court, again sitting with five judges, ruled that Amazon's illegality pleas regarding various provisions of the DSA are unfounded. Now, this should be good news, but I'm less sure it entirely is, for two reasons. In the EU judicial system, annulment actions against the Commission's designation decisions are based on Article 263 TFEU. The ECJ has accepted that the provisions of an act of general application that constitute the basis of those decisions or that have a direct legal connection with such decisions may legitimately form the subject matter of an objection of illegality based on Article 277 TFEU (Commission and Council v Carreras Sequeros and Others, C‑119/19 P and C‑126/19 P, § 69 ff.). This means that companies can use the annulment action to seek review of the connected provisions of the law; however, such provisions must have a 'direct legal connection'. Amazon argued that Articles 38 (opt-out) and 39 (ad archives) should be declared inapplicable for violating equal treatment and fundamental rights.
The General Court did not resolve the admissibility question for Articles 38 and 39 DSA, but it held that Article 33 DSA, the designation provision, is 'directly legally connected' and can thus be reviewed. So far, not that surprising. What comes next, however, is. The Court essentially reviewed, indirectly, the legality of Articles 38, 39, and even Article 40 DSA against fundamental rights and equal treatment. If this technique holds on appeal, newly designated companies or services will have a major incentive to seek judicial review. Not because they think the designation decision or Article 33 is illegal, but because they wish to seek judicial review of the VLOP-specific provisions of the DSA. To be clear, such review is inevitable once the provisions are applied by public authorities in fines or by national courts. But in those cases, there will be a clear context for such challenges.
On the substance, the General Court held that Articles 38, 39 and 40 do not violate Articles 7, 11(1), 16, 17, and 20 of the EU Charter, and as a result Article 33 is not illegal. On one hand, the Court clearly signals that it does not see any unconstitutionality in the underlying provisions, especially concerning alleged discrimination or the right to conduct business. I think the analysis is fairly convincing here. On the other hand, other parts are less so. For instance, the Court too readily accepts that companies' freedom of expression is interfered with when the state imposes an opt-out obligation for recommender systems (§ 160 ff.), or that privacy rights extend to the protection of companies' confidential information, which the existence of ad archives then limits (§ 177 ff.). Accepting this logic means that the EU legislature, when legislating in a content-neutral way to protect the freedom of expression of its citizens (e.g., through design mandates such as opt-outs), must satisfy a much higher test under freedom of expression than under the freedom to conduct business. To be clear, if such a mandate were content-specific (e.g., must-carry rules), applying freedom of expression would be fully warranted, as recently confirmed also by Google v Russia. Finally, the lack of a proper exchange of views on the subject probably explains why the Court sometimes also unwittingly mischaracterises Article 40(12) by injecting too much of the logic of Article 40(4) when describing them together [Article 40 is not even part of the pleas].
Fee cases. Finally, the Commission has clearly lost two fee cases: Meta Platforms Ireland v Commission T-55/24 and TikTok Technology v Commission T-58/24. The Court invalidated the fee decisions on the basis that the methodology should have been included in the delegated act (§§ 40-41). In other words, the Court held that the Commission is not permitted to put the methodology in the individual acts but must properly establish it in the legislative process. This shows that the Court will very closely scrutinise rule-of-law considerations, which is, in my view, very important, particularly for the enforcement of Articles 34-35. For Meta and TikTok, however, the decisions might not necessarily change how much they will have to pay, at least for now. The Court decided not to address the complaint about loss-making VLOPs whose share of the fees is effectively covered by profit-making VLOPs, and it maintained the effects of the fee decisions until the Commission remedies the problem with the legal basis. The deadline is 12 months. Once it does, if the fee does not differ, the companies might seek judicial review and raise the above complaint again. For the 2025 fees, we already see Google joining the fray alongside Meta and TikTok. You can find all the new fee cases in our DSA case law tracker.
The ECJ is also shaking up the ground a bit with its new preliminary reference decisions. The Russmedia C-492/23 case, in particular, is potentially highly problematic. In short, it carves the GDPR out of the scope of the DSA liability exemptions and develops a new system of due diligence obligations based entirely on the GDPR. Some think these obligations go as far as to undermine the prohibition of general monitoring, but the Court claims otherwise. I am already working on a case comment with a data protection colleague, which I hope to share with you sometime in early 2026. Among other pending cases, AGCOM C‑421/24 (on liability exemptions), Coyote C-190/24 (on country of origin), and Webgroup C‑188/24 and C‑190/24 (also on country of origin) are worth following.
EC supervision
As you have surely heard, X/Twitter has been fined €120 million by the European Commission for violating Articles 25, 39 and 40(12). This is an important case for a number of reasons. First, it concerns the uncontroversial parts of the DSA. So, screaming that it constitutes censorship simply unmasks the speaker as equating any regulation with censorship. Second, the response to the fine clearly shows how unhinged some tech CEOs have become. The ensuing furore underscores the importance of platform accountability regulation by the state, and why it is not just some fancy nice-to-have, but an indispensable instrument to uphold the rule of law and fundamental rights in the 21st century. We should always question whether everything we do under this rubric makes sense, but empowering individuals against private companies is simply a must if we are serious about protecting their fundamental rights. Third, the game that some CEOs play is clear: intimidation of European regulators. As I told The New York Times:
“The intention is clearly to intimidate regulators,” said Martin Husovec, an associate professor of law at London School of Economics and Political Science who is an expert on internet regulation. “If you link yourself up with the president of the United States, then you can intimidate European regulators from doing their job.”
Now, I do not want to be misunderstood. I do not put the legal actions of companies in the same basket. On the contrary. I appreciate that companies bring cases and thus clarify the law. With their help, we are creating clarity and improving the legitimacy of the law, where this is warranted. Think of the legislative process as the first round of a sports match. People speak through their legislatures, but parliaments can abuse their power, and so can regulators. Court cases are thus the second round, a re-match, to make sure that the first round was played fairly. If you lose round after round, it means that the people's case was rock solid.
The European Commission also accepted another commitment by TikTok, this time related to its advertising archive. Unfortunately, we are still unable to read even the AliExpress commitments from the summer because they have not been published, probably due to the lengthy redaction process. Given how important these commitments are, I really hope that the publication timeline improves in the future to avoid giving the Musks of the world the ability to criticise them as 'illegal secret deals', which they are not.
The Commission also issued new preliminary findings about DSA violations concerning Temu (several obligations related to the sale of illegal products), and TikTok and Meta (obligations relating to notification of illegal content, appeals, and data access for researchers). Unfortunately, although these developments date from July and October, I still cannot find anything on the EC's dedicated website beyond the press releases.
Finally, the EC sent information requests to Apple, Booking.com, Google and Microsoft regarding financial scams, and to Snapchat, YouTube, the Apple App Store and Google Play regarding the protection of minors.
DSCs supervision
The Irish regulator, CNaM, has started a number of investigations and won an important court case. You may recall that CNaM initiated a broad industry review back in September 2024 into several platforms, focusing on how they handle reports of illegal content. Now, CNaM has commenced formal investigations into TikTok and LinkedIn (Dec 2025), regarding compliance with Articles 16(1), 16(2)(c), and 25 of the DSA, and into X/Twitter (Nov 2025), regarding potential failures in its content moderation appeals process and user-friendly complaint-handling systems (Article 20 of the DSA).
The case against X/Twitter is particularly interesting. CNaM explains that the investigation relates to the ability to appeal ToS flagging and to get feedback on submitted ToS flags, as opposed to notices of illegal content. CNaM highlights that information from HateAid, a German NGO, has been a very useful resource in this regard. In my book (see Chapter 10 and Chapter 11, now online thanks to OUP), I have argued that appeals of unsuccessful ToS flags fall under Article 20 and thus indirectly require equivalent procedural rights. Admittedly, the DSA is not drafted well in this respect, and the courts will likely have to clarify this point. That being said, a big part of the industry already interprets the provision in this way, which shows that X's interpretation is an outlier (even if not entirely indefensible). Thus, it is great that CNaM picked this case.
Data Access for Researchers
The big news is that, since 29 October 2025, when the delegated act on data access came into force, researchers can submit their requests via the EU Data Access Portal. In this regard, I should mention that GFF published a new GDPR compliance tool for those who want to use data access requests. The tool can be found here in English and here in German. The Mozilla Foundation published another report on Article 40(12) data access, along with a model data sharing agreement prepared by AWO.
As promised, I am slowly working on a free course for researchers who wish to access platform data using the DSA. If you have any resources that you think I should be aware of, please let me know.
National enforcement of the DSA
A lot is also happening in the national courts. I cannot cover everything exhaustively today, but here are some noteworthy cases. I am slowly compiling national cases, so if you know about something new in your jurisdiction, just email me, please.
- The Irish High Court ([2025] IEHC 442) rejects X's challenge to the Irish Online Safety Code, a code for video-sharing platform services under audiovisual media legislation. The core of X's argument was that the code, by going beyond the AVMS Directive and singling out specific types of content as harmful to minors or adults, is preempted by the DSA. More specifically, it was argued to undermine the full harmonisation created by Articles 14 and 28 DSA. The ruling is very interesting, and the Irish scheme is definitely worth studying more closely for various reasons; however, on the preemption point, I think the court got it right. Article 14 DSA does not preempt national content laws. In fact, they are not even caught by it. The more interesting argument relates to Article 28 DSA, which was not developed in the judgment beyond some general statements about consistency. Implicitly, however, the court's reading is that Article 28 does not require any content-specific interventions, and as a result content laws, which are content-specific by definition, do not collide with it in any way. If this is the understanding, then it is, in my view, entirely right, which is why I have been saying that Article 28 guidelines or enforcement cannot develop new classes of prohibited or otherwise restricted content.
- The Dutch courts continue to show why private enforcement of the DSA is going to be important, and why the enforcement model is 'hybrid', as I called it in my book. In this case, a Dutch NGO, Bits of Freedom, sued Meta over the design of its opt-out from profiling-based recommendations (Article 38), which would switch off in many situations and never turn back on. In other words, users would have to re-select the non-profiling option very often to keep that experience up and running; the choice would not stick. This, according to the Dutch court (see the ruling of the Amsterdam court), undermines the effectiveness of Article 38 and might constitute a dark pattern under Article 25. As a result, for now, such a design has been restrained through an interim order. Based on what I can read in the judgment, I fail to see how Meta could prevail in this case. Even Article 25 DSA fits perfectly in this situation as a complementary rule.
New developments
- The Council of Europe's Expert Committee approved a new Recommendation on Online Safety and Content Creator and User Empowerment (more on this in the coming months).
- The EC issued its DSA report, looking at how the DSA relates to other regimes and whether its VLOP designation process works.
- The EDPB issued draft GDPR/DSA guidelines, which will now have to be rewritten not only in light of the comments received but also in light of the Russmedia case.
- ACE, an ODS body, published its first transparency report; User Rights, another ODS body, published the first batch of anonymised decisions.
- X sues ACE in Ireland, asking the court to declare that: (i) ACE's certification is only for ToS and not for illegal content; (ii) ACE can only apply standards that are expressly incorporated in X's ToS; and (iii) ACE misled the public and defamed X.
- Open Terms Archive now archives all risk assessments and mitigation reports here. Also, a new batch of VLOP risk assessments and mitigation reports was recently published by companies.
- Reinhold Kesler shows that too many traders on app stores, including some of the biggest companies, identify as ... non-traders on these marketplaces. This speaks volumes about compliance with Article 30 DSA.
- Paddy Leerssen published a fantastic paper on the political economy of platform ownership.
- Daphne Keller published several great articles about data access for researchers, see here and here.
- I hope to see you in Amsterdam at IViR's platform conference in February, where I plan to present my paper on trusted content creators, a new concept that could allow us to improve what we read online. I have been thinking about it for years, but I needed a sabbatical to finally nail my ideas down.
Have a wonderful Christmas holiday, everyone!
Psst! If you find any mistakes or know about something I have missed, just email me; and if you wish to upskill during the holidays, our DSA masterclass is now dirt cheap.