https://www.techdirt.com/2025/12/04/eus-top-court-just-made-it-literally-impossible-to-run-a-user-generated-content-platform-legally/

EU's Top Court Just Made It Literally Impossible To Run A User-Generated Content Platform Legally

Legal Issues - from the seems-like-a-problem dept
Thu, Dec 4th 2025 10:46am - Mike Masnick

The Court of Justice of the EU -- likely without realizing it -- just completely shit the bed and made it effectively impossible to legally run any website in the entirety of the EU that hosts user-generated content.

For decades now, we've been talking about issues related to intermediary liability and what standards are appropriate there. I am an unabashed supporter of the US's approach with Section 230, as it was initially interpreted, which said that any liability should land on the party who contributed the actual violative behavior -- in nearly all cases the speaker, not the host of the content. The EU has always held itself to a lower standard of intermediary liability, first with the E-Commerce Directive and more recently with the Digital Services Act (DSA), which still generally tries to put more liability on the speaker but has some ways of shifting the liability to the platform.

No matter which of those approaches you think is preferable, I don't think anyone could (or should) favor what the Court of Justice of the EU came down with earlier this week, which is basically "fuck all this shit: if there's any content at all on your site that includes personal data of someone, you may be liable." As with so many legal clusterfucks, this one stems from a case with bad facts, which then leads to bad law.
You can read the summary as the CJEU puts it:

    The applicant in the main proceedings claims that, on 1 August 2018, an unidentified third party published on that website an untrue and harmful advertisement presenting her as offering sexual services. That advertisement contained photographs of that applicant, which had been used without her consent, along with her telephone number. The advertisement was subsequently reproduced identically on other websites containing advertising content, where it was posted online with the indication of the original source.

    When contacted by the applicant in the main proceedings, Russmedia Digital removed the advertisement from its website less than one hour after receiving that request. The same advertisement nevertheless remains available on other websites which have reproduced it.

And, yes, no one is denying that this absolutely sucks for the victim in this case. But if there's any legal recourse, it seems like it should be aimed at whoever created and posted that fake ad. Instead, the CJEU finds that Russmedia is liable for it, even though it responded within an hour and took down the ad as soon as it found out about it.

The lower courts went back and forth on this, with a Romanian tribunal (on first appeal) finding, properly, that there's no fucking way Russmedia should be held liable, seeing as it was merely hosting the ad and had nothing to do with its creation:

    The Tribunalul Specializat Cluj (Specialised Court, Cluj, Romania) upheld that appeal, holding that the action brought by the applicant in the main proceedings was unfounded, since the advertisement at issue in the main proceedings did not originate from Russmedia, which merely provided a hosting service for that advertisement, without being actively involved in its content. Accordingly, the exemption from liability provided for in Article 14(1)(b) of Law No 365/2002 would be applicable to it.
    As regards the processing of personal data, that court held that an information society services provider was not required to check the information which it transmits or actively to seek data relating to apparently unlawful activities or information. In that regard, it held that Russmedia could not be criticised for failing to take measures to prevent the online distribution of the defamatory advertisement at issue in the main proceedings, given that it had rapidly removed that advertisement at the request of the applicant in the main proceedings.

With the case sent up to the CJEU, things get totally twisted. The court argues that, under the GDPR, the inclusion of "sensitive personal data" in the ad suddenly makes the host a "joint controller" of the data under that law. As a controller of data, the much stricter GDPR rules on data protection now apply, and the more careful calibration of intermediary liability rules gets tossed right out the window. And out the window, right with it, goes the ability to have a functioning open internet.

The court basically shreds basic intermediary liability principles here:

    In any event, the operator of an online marketplace cannot avoid its liability, as controller of personal data, on the ground that it has not itself determined the content of the advertisement at issue published on that marketplace. Indeed, to exclude such an operator from the definition of 'controller' on that ground alone would be contrary not only to the clear wording, but also the objective, of Article 4(7) of the GDPR, which is to ensure effective and complete protection of data subjects by means of a broad definition of the concept of 'controller'.

Under this ruling, it appears that any website that hosts any user-generated content can be strictly liable if any of that content contains "sensitive personal data" about any person. But how the fuck are they supposed to handle that?
The basic answer is to pre-scan all user-generated content for anything that might later be deemed sensitive personal data and make sure it never gets posted. How would a platform do that? ¯\_(ツ)_/¯ There is no way this is even remotely possible for any platform, no matter how large or how small.

And it's even worse than that. As intermediary liability expert Daphne Keller explains, the Court said the host has to:

    + pre-check posts (i.e. do general monitoring)
    + know who the posting user is (i.e. no anonymous speech)
    + try to make sure the posts don't get copied by third parties (um, like web search engines??)

All three of those are effectively impossible. Think about what the court is actually demanding here. Pre-checking posts means full-scale automated surveillance of every piece of content before it goes live -- not just scanning for known CSAM hashes or obvious spam, but making subjective legal determinations about what constitutes "sensitive personal data" under the GDPR. Requiring user identification kills anonymity entirely, which is its own massive speech issue. And somehow preventing third parties from copying content? That's not even a technical problem -- it's a "how do you stop the internet from working like the internet" problem.

Some people have said this ruling isn't so bad, because it's about advertisements and because it's limited to "sensitive personal data." But it's difficult to see how either of those things limits the ruling at all. Nothing in the law or the ruling confines its conclusions to "advertisements." The same underlying factors would apply to any third-party content on any website that is subject to the GDPR.
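To see why "pre-scan everything" breaks down, here is a minimal, purely illustrative sketch of what such a filter might look like. Everything in it (the pattern names, the `flag_personal_data` helper) is hypothetical; the point it demonstrates is that mechanical scanning can only catch *formatted* identifiers, while GDPR Article 9 "sensitive data" (health, sexuality, religion, etc.) usually lives in free prose that no pattern match can reliably classify:

```python
import re

# Hypothetical, naive pre-publication scanner. These patterns only catch
# formatted identifiers like phone numbers and email addresses. They make
# no legal determination at all -- which is exactly what the ruling demands.
PATTERNS = {
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
}

def flag_personal_data(post: str) -> list[str]:
    """Return the names of the patterns that matched, i.e. reasons to block."""
    return [name for name, rx in PATTERNS.items() if rx.search(post)]

# A post containing a phone number is flagged...
assert flag_personal_data("Call me at +40 721 555 123") == ["phone"]
# ...but prose revealing Article 9 sensitive data sails straight through.
assert flag_personal_data("She is offering sexual services") == []
```

A platform trying to close that second gap has no option but to over-block, since the alternative is a classifier that performs the court's legal analysis in advance, on every post.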
As for the "sensitive personal data" part, that makes little difference, because sites will have to scan all content before anything is posted to guarantee no "sensitive personal data" is included, and then accurately determine what a court might later deem to be such sensitive personal data. That means it's highly likely that any website that tries to comply under this ruling will block a ton of content on the off chance that maybe that content will be deemed sensitive. As the court noted:

    In accordance with Article 5(1)(a) of the GDPR, personal data are to be processed lawfully, fairly and in a transparent manner in relation to the data subject. Article 5(1)(d) of the GDPR adds that personal data processed must be accurate and, where necessary, kept up to date. Thus, every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay. Article 5(1)(f) of that regulation provides that personal data must be processed in a manner that ensures appropriate security of those data, including protection against unauthorised or unlawful processing.

Good luck figuring out how to do that with third-party content. And the court is pretty clear that every website must pre-scan every bit of content.
The court claims it's about "marketplaces" and "advertisements," but there's nothing in the GDPR that limits this ruling to those categories:

    Accordingly, inasmuch as the operator of an online marketplace, such as the marketplace at issue in the main proceedings, knows or ought to know that, generally, advertisements containing sensitive data in terms of Article 9(1) of the GDPR, are liable to be published by user advertisers on its online marketplace, that operator, as controller in respect of that processing, is obliged, as soon as its service is designed, to implement appropriate technical and organisational measures in order to identify such advertisements before their publication and thus to be in a position to verify whether the sensitive data that they contain are published in compliance with the principles set out in Chapter II of that regulation.

    Indeed, as is apparent in particular from Article 25(1) of that regulation, the obligation to implement such measures is incumbent on it not only at the time of the processing, but already at the time of the determination of the means of processing and, therefore, even before sensitive data are published on its online marketplace in breach of those principles, that obligation being specifically intended to prevent such breaches.
No more anonymity allowed:

    As regards, in the second place, the question whether the operator of an online marketplace, as controller of the sensitive data contained in advertisements published on its website, jointly with the user advertiser, must verify the identity of that user advertiser before the publication, it should be recalled that it follows from a combined reading of Article 9(1) and Article 9(2)(a) of the GDPR that the publication of such data is prohibited, unless the data subject has given his or her explicit consent to the data in question being published on that online marketplace or one of the other exceptions laid down in Article 9(2)(b) to (j) is satisfied, which does not, however, appear to be the case here.

    On that basis, while the placing by a data subject of an advertisement containing his or her sensitive data on an online marketplace may constitute explicit consent, within the meaning of Article 9(2)(a) of the GDPR, such consent is lacking where that advertisement is placed by a third party, unless that party can demonstrate that the data subject has given his or her explicit consent to the publication of that advertisement on the online marketplace in question.

    Consequently, in order to be able to ensure, and to be able to demonstrate, that the requirements laid down in Article 9(2)(a) of the GDPR are complied with, the operator of the marketplace is required to verify, prior to the publication of such an advertisement, whether the user advertiser preparing to place the advertisement is the person whose sensitive data appear in that advertisement, which presupposes that the identity of that user advertiser is collected.
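Reduced to logic, the consent check the court reads into Article 9(2)(a) looks something like the sketch below (the function and its parameters are hypothetical, not anything from the ruling). Note the precondition it smuggles in: to evaluate the first branch at all, the platform must already know who the poster is:

```python
# Sketch of the court's Article 9(2)(a) reasoning: publication of sensitive
# data is permitted only if the poster *is* the data subject (self-publication
# implying explicit consent), or the poster can demonstrate the data
# subject's explicit consent. Either branch requires verified identity.
def may_publish(poster_id: str, data_subject_id: str,
                has_documented_consent: bool) -> bool:
    if poster_id == data_subject_id:   # self-publication = explicit consent
        return True
    return has_documented_consent      # third party must prove consent

assert may_publish("alice", "alice", False) is True   # posting about herself
assert may_publish("bob", "alice", False) is False    # the fake-ad scenario
assert may_publish("bob", "alice", True) is True      # documented consent
```

The `poster_id == data_subject_id` comparison is where anonymity dies: it is unanswerable unless identity "is collected" for every user, exactly as the court says.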
Finally, as Keller noted above, the CJEU seems to think it's possible to require platforms to make sure content is never displayed on any other platform as well:

    Thus, where sensitive data are published online, the controller is required, under Article 32 of the GDPR, to take all technical and organisational measures to ensure a level of security apt to effectively prevent the occurrence of a loss of control over those data. To that end, the data controller must consider in particular all technical measures available in the current state of technical knowledge that are apt to block the copying and reproduction of online content.

Again, the CJEU appears to be living in a fantasy land that doesn't exist. This is what happens when you over-index on the idea of "data controllers" needing to keep data "private." Whoever revealed sensitive data should have the liability placed on them. Putting it on the intermediary is misplaced and ridiculous.

There is simply no way to comply with the law under this ruling. In such a world, the only options are to ignore it, shut down EU operations, or geoblock the EU entirely. I assume most platforms will simply ignore it -- and hope that enforcement will be selective enough that they won't face the full force of this ruling. But that's a hell of a way to run the internet, where companies just cross their fingers and hope they don't get picked for an enforcement action that could destroy them.

There's a reason why the basic simplicity of Section 230 makes sense. It says "the person who creates the content that violates the law is responsible for it." As soon as you open things up to say the companies that provide the tools for those who create the content can be liable, you're opening up a can of worms that will create a huge mess in the long run. That long run has arrived in the EU, and with it, quite the mess.
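Of the three options -- ignore, shut down, or geoblock -- the last one is depressingly mechanical. A hedged sketch of what it might look like, assuming a GeoIP lookup has already resolved the request to a country code (the `gate` helper is hypothetical; HTTP 451 is the real "Unavailable For Legal Reasons" status from RFC 7725):

```python
# Hypothetical geoblocking gate: turn away any request that a GeoIP lookup
# resolves to one of the 27 EU member states, using HTTP 451.
EU = {"AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
      "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
      "PL", "PT", "RO", "SK", "SI", "ES", "SE"}

def gate(country_code: str) -> tuple[int, str]:
    """Return (status, reason) for a request from the given country."""
    if country_code in EU:
        return 451, "Unavailable For Legal Reasons"
    return 200, "OK"

assert gate("RO") == (451, "Unavailable For Legal Reasons")
assert gate("US") == (200, "OK")
```

That a ruling about data protection makes "wall off 450 million users" the cleanest compliance path is the whole problem in miniature.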
Filed Under: cjeu, controller, data protection, dsa, gdpr, intermediary liability, section 230, sensitive data, user generated content
Companies: russmedia

Comments on "EU's Top Court Just Made It Literally Impossible To Run A User-Generated Content Platform Legally" (6 Comments)

Anonymous Coward says:
December 4, 2025 at 10:52 am

Not to mention this butts up against the EU's own rules banning general monitoring requirements, creating quite the ouroboros situation. Good luck figuring that out, EU.

Anonymous Coward says:
December 4, 2025 at 11:34 am

Since git commits have emails (private info), and possibly names, I would guess this means that git forges would be problematic in the EU too. Especially since forging a git commit isn't hard.
And open source communities often apply patches from other people (where the committer is not the author). Also, the Linux kernel's MAINTAINERS file has other contact info too. Basically, maybe the EU should just disconnect itself from the internet to ensure compliance.

Anonymous Coward says:
December 4, 2025 at 1:02 pm

This is mind-bendingly stupid. It burns. It burns so much I think a call to Red Adair is in order. The rules for processing PII are for (checks notes)... processing! A user posting something isn't PII the site asked for and processed. And the further rules generated by this court in order to find a way to hold a company liable are somehow worse. It's like they are trying to make more Euroskeptics out of people who believed in the mission.

GHB (profile) says:
December 4, 2025 at 1:18 pm

The EU government doesn't know how the internet works. If you are an artist or any content creator using a website based in the EU, or a worldwide international site like YouTube using servers subject to their laws, you'll experience:

* Having to give out your personal info. It's bad enough we have crappy age verification mandates to view content; now we have UGC verification mandates. And don't think it will be exclusive to the EU -- trash laws tend to spread like cancer across the globe.
* After your content is submitted, you either wait many years, or have AI moderate your content much sooner (and it is prone to errors), before your content becomes public for anyone to see. Your content is illegal by default, like being judged guilty until proven innocent.
* Accepting the fact you may not even appear on search engines or even social media sites, if they have link previews. Quoting a post or a video is also banned.

These laws/bills might as well directly ban UGC.
Ethin Probst (profile) says:
December 4, 2025 at 1:37 pm

Well, this is a problem. I was going to move my infra to Scaleway's AMS1 data center, mainly because their elastic metal offerings are really good. But this ruling is a big problemo. EU, please stop making nonsensical rulings like this...

Arianity (profile) says:
December 4, 2025 at 2:21 pm

"There's a reason why the basic simplicity of Section 230 makes sense."

Simplicity is great if all you care about is protecting against liability going overboard, and are willing to sacrifice the cases where the host contributes to the violative behavior. That's also one hell of a way to run the internet.

"As soon as you open things up to say the companies that provide the tools for those who create the content can be liable, you're opening up a can of worms that will create a huge mess in the long run."

Doing the law properly often isn't simple, because life is messy. Determining who contributed to something is fundamentally a messy endeavor. The CJEU seems to have screwed up here, though.

"Instead, the CJEU finds that Russmedia is liable for it, even though they responded within an hour and took down the ad as soon as they found out about it."

Under the simplicity of 230, it wouldn't have to take it down at all. (Never mind that an anonymous speaker probably wouldn't be found, since they're, you know, anonymous.)

"then accurately determine what a court might later deem to be such sensitive personal data. That means it's highly likely that any website that tries to comply under this ruling will block a ton of content on the off chance that maybe that content will be deemed sensitive."

It's more likely to be the opposite problem. The GDPR defines personal data extremely broadly.
If it's an "off chance", odds are it's actually included under GDPR.