[HN Gopher] CJEU has made it effectively impossible to run a use...
       ___________________________________________________________________
        
       CJEU has made it effectively impossible to run a user-generated
       platform legally
        
       Author : alsetmusic
       Score  : 82 points
       Date   : 2025-12-04 19:55 UTC (3 hours ago)
        
 (HTM) web link (www.techdirt.com)
 (TXT) w3m dump (www.techdirt.com)
        
       | free_bip wrote:
        | Seems like a pretty big overreaction IMO. Advertisements deserve
        | stricter regulation than general user-generated content because
        | they tend to reach far more people. The fact that they aren't
        | regulated that way has resulted in something like 10% of all ads
        | shown being outright scams or fraud[0]. And the platform should
        | never have allowed the ad to air in the first place - it was
        | patently and obviously illegal even without considering the GDPR.
       | 
       | If these companies aren't willing to put basic measures in place
       | to stop even the most obviously illegal ads from airing, I have a
        | lot of trouble having sympathy for them getting their just
        | deserts in court.
       | 
       | [0]: https://www.msn.com/en-us/money/personalfinance/meta-
       | showed-...
        
         | zahlman wrote:
         | > Advertisements deserve more strict regulation than general
         | user-generated content because they tend to reach far more
         | people.
         | 
         | They deserve strict regulation because _the carrier is actively
          | choosing_ who sees them, and because there are explicit
          | financial incentives in play. The entire point of Section 230
          | is that
         | carriers can claim to be just the messenger; the only way to
         | make sense of absolving them of responsibility for the content
         | is to make the argument that their conveyance of the content
         | does not constitute _expression_.
         | 
         | Once you have auctions for ads, and "algorithmic feeds", that
         | becomes a lot harder to accept.
        
           | xoa wrote:
           | > _The entire point of Section 230 is that carriers can claim
           | to be just the messenger_
           | 
           | Incorrect, and it's honestly kinda fascinating how this meme
           | shows up so often. What you're describing is "common carrier"
           | status, like an ISP (or Fedex/UPS/post office) would have.
            | The point of Section 230 was specifically to enable _not_
            | being "just the messenger"; it was part of the overall
            | Communications Decency Act, intended to aid in stopping bad
           | content. Congress added Section 230 in direct reaction to two
           | court cases (against Prodigy and CompuServe) which made
            | service providers liable for their users' content when they
           | didn't act as pure common carriers but rather tried to
           | moderate it but, obviously and naturally, could not perfectly
           | get everything. The specific fear was that this left only two
           | options: either ban all user content, which would brutalize
           | the Internet even back then, or cease all moderation, turning
           | everything into a total cesspit. Liability protection was
           | precisely one of the rare genuine "think of the children!"
           | wins, by enabling a 3rd path where everyone could do their
           | best to moderate their platforms without becoming the
           | publisher. Not being a common carrier is the whole point!
        
             | zahlman wrote:
             | > Congress added Section 230 in direct reaction to two
             | court cases (against Prodigy and CompuServe) which made
              | service providers liable for their users' content when they
             | didn't act as pure common carriers but rather tried to
             | moderate it but, obviously and naturally, could not
             | perfectly get everything.
             | 
             | I know that. I spoke imprecisely; my framing is that this
             | imperfect moderation doesn't take away their immunity --
              | i.e. they are still treated _as if_ they were "just the
             | messenger" (per the previous rules). I didn't use the
             | actual "common carrier" phrasing, for a reason.
             | 
             | It doesn't change the argument. Failing to apply a content
             | policy consistently is not, logically speaking, an act of
             | expression; choosing to show content preferentially is.
             | 
              | ... And so is _setting_ a content policy. For example, if a
              | forum explicitly for hateful people set a content policy
              | explicitly banning statements inclusive or supportive of
              | the target group, I don't see why the admin should be held
              | harmless (even if they don't also post). Importantly,
              | though, setting (and _attempting_ to enforce) the policy
              | only expresses the view _of the policy_, not that of any
              | permitted content; in US law it would be hard to imagine a
              | content policy expressing anything illegal.
             | 
             | But my view is that if they act deliberately to show
             | something, _based on knowing and evaluating_ what it is
              | that they're showing, to someone who hasn't requested it
             | (as a recommendation), then they really should be liable.
             | The point of not punishing platforms for failing at
             | moderation is to let them claim plausible ignorance of what
             | they're showing, because they can't observe and evaluate
             | everything.
        
         | tensor wrote:
          | Except this isn't limited to ads, is it? From the post it
          | sounds like the ruling covers any user content. If someone
          | uploads personal data to Github, now Github is liable. In
          | fact, why wouldn't author names on open source licenses count
          | as PII?
        
           | free_bip wrote:
           | You could always sue GitHub to find out.
           | 
           | Personally, I'm not buying the slippery slope argument. I
           | could be wrong of course but that's the great thing about
           | opinions: you're allowed to be wrong :)
        
           | o11c wrote:
           | The author of the _article_ is claiming it extends beyond
           | ads.
           | 
           | That does not appear to be what the court actually said,
           | however.
           | 
           | And I 100% believe that all advertisements should require
           | review by a documented human before posting, so that someone
           | can be held accountable. In the absence of this it is
           | perfectly acceptable to hold the entire organization liable.
        
           | fweimer wrote:
           | The judgement is a bit more nuanced than that: https://curia.
           | europa.eu/juris/document/document_print.jsf?mo...
           | 
           | The court uses the phrase "an online marketplace, as
            | controller" in key places. This suggests to me that there
            | can be online marketplaces that are not data controllers.
           | 
            | The court cites several contributing factors for treating
            | the platform as data controller: it reserved additional
            | rights in uploaded content, and it selected the ads to
            | display. Github
           | only claims limited rights in uploaded content, and I'm not
           | sure if they have any editorialized ("algorithmic") feeds
           | where Github selects repository content for display. That may
           | make it less likely that they would be considered data
           | controllers. On the other hand, licensing their repository
           | database for LLM training could make them liable if personal
           | data ends up in models. I don't think that's necessarily a
           | bad thing.
        
           | bonzini wrote:
           | > why wouldn't author names on open source licenses count as
           | PII?
           | 
            | They are, but you can keep PII if it is relevant to the
            | purpose of your activity. In this case the author needs you
            | to share his PII in order to exercise his moral and
            | copyright rights.
        
       | SilverElfin wrote:
       | > Under this ruling, it appears that any website that hosts any
       | user-generated content can be strictly liable if any of that
       | content contains "sensitive personal data" about any person.
       | 
       | Could this lead to censorship as well? For example you could go
       | to a website or community you don't like, and share information
       | that could be seen as "sensitive personal data" and then file an
       | anonymous complaint so they get into legal trouble or get shut
       | down?
        
       | xg15 wrote:
       | > _I am an unabashed supporter of the US's approach with Section
       | 230, as it was initially interpreted, which said that any
       | liability should land on the party who contributed the actual
       | violative behavior--in nearly all cases the speaker, not the host
       | of the content._
       | 
       | I never really understood how that system is supposed to work.
       | 
       | So on the one hand, Section 230 absolves the hoster of liability
       | and tells an affected party to go after the author directly.
       | 
       | But on the other hand, we all rally for the importance of
       | anonymity on the internet, so it's very likely that there will be
       | no way to _find_ the author.
       | 
       | Isn't this a massive vacuum of responsibility?
        
         | Retric wrote:
         | Anonymous authors have very little reach without external
         | promotion or a long lasting reputation.
         | 
         | If someone builds up a reputation anonymously then that
         | reputation itself is something that can be destroyed when a
         | platform destroys their account etc.
        
           | probably_wrong wrote:
           | Your premise goes precisely against the base of the lawsuit
           | itself:
           | 
           | > _(...) an unidentified third party published on that
           | website an untrue and harmful advertisement presenting her as
           | offering sexual services. That advertisement contained
           | photographs of that applicant, which had been used without
            | her consent, along with her telephone number. (...) The same
           | advertisement nevertheless remains available on other
           | websites which have reproduced it._
           | 
           | Anonymous author, great reach, enough damage for the victim
           | to take a lawsuit all the way to the CJEU.
        
             | Retric wrote:
             | > great reach
             | 
              | What exactly provided great reach here? Was it the creator
              | or something else?
             | 
             | > The same advertisement nevertheless remains available on
             | other websites which have reproduced it.
             | 
              | This presumably involved the actions of 3rd parties, not
              | just the original content creator.
        
               | tsimionescu wrote:
               | > What exactly provided great reach here? Is it the
               | creator or something else.
               | 
               | Irrelevant, in the initial understanding of Section 230
               | that this thread was advocating for returning to. Only
               | the author is responsible for the content in that
               | framework, not the platforms distributing and/or
               | promoting it.
        
               | Retric wrote:
                | Under Section 230, platforms only maintain protection
                | as a neutral party; anything getting promoted by them
                | puts them at risk.
               | 
               | "letters to the editor" are a very old form of user
               | generated content, but the act of selecting a letter and
               | placing it in a prominent position isn't a neutral act.
               | 
               | They do maintain limited editorial discretion. _Section
               | 230(c)(2) states that service providers and users may not
               | be held liable for voluntarily acting in good faith to
               | restrict access to "obscene, lewd, lascivious, filthy,
               | excessively violent, harassing, or otherwise
               | objectionable" material._
               | 
               | Further, _Section 230 contains statutory exceptions. This
               | federal immunity generally will not apply to suits
               | brought under federal criminal law, intellectual property
               | law, any state law "consistent" with Section 230, certain
               | privacy laws applicable to electronic communications, or
               | certain federal and state laws relating to sex
               | trafficking._
        
         | unyttigfjelltol wrote:
         | The combo effectively enshittified swaths of the Internet,
         | which now is full of robo-pamphleteers acting with anonymous
         | impunity, in ways they never would if sitting face-to-face.
         | 
          | I love the Internet, but it normalizes bad behavior, and to
          | the extent the CJEU was tracking toward a new and more
          | stringent standard, that standard is well earned by the
          | Internet and its trolls.
        
         | xoa wrote:
         | A few things:
         | 
          | > _But on the other hand, we all rally for the importance of
          | anonymity on the internet, so it's very likely that there will
          | be no way to find the author._
         | 
         | So:
         | 
          | 1) We all rally for the importance of anonymity (wrt general
          | speech) _EVERYWHERE_, since before even (and _critical to_)
          | the founding of America. Writings like the Federalist Papers
          | were absolutely central to arguments for the US Constitution,
          | and they were anonymous. "The Internet" is not anything
          | special or novel here per se when it comes to the philosophy
          | and politics of anonymous speech. There has always been a
          | tension between the risks and value of anonymous speech, and
          | America has come down quite firmly on the value side of that.
         | 
          | 2) That said, "anonymous" on the internet very rarely actually
          | rises to the level of "no way to find the author _with the aid
          | of US court-ordered process_". Like, I assume that just as my
          | real name is not "xoa", your real name is not "xg15", and to
          | the extent we have made some effort at maintaining our
          | pseudonymity, it'd be somewhat difficult for any random member
          | of the general public to connect our HN speech to our
          | meatspace bodies. But the legal process isn't limited to
          | public information. If you have a colorable lawsuit against
          | someone, you can sue their placeholder and then attempt to
          | discover their identity via private data. HN has our IP
          | addresses if nothing else, as do intermediaries between the
          | system we're posting from and HN, as well as possibly a valid
          | email address. Those can potentially by themselves be enough
          | breadcrumbs to follow back to a person, and enough cause to
          | engage in specific discovery against them. And this is without
          | any money getting involved; if there are any payments of any
          | kind, that leaves an enormous number of ripples. And that's
          | assuming nobody left any other clues, and that you can't make
          | any inferences about who would be doing defamatory speech
          | against you and narrow it down further that way.
         | 
         | Yes, it's possible someone at random is using a massive number
         | of proxies from a camera-less logless public access point with
         | virgin monero or whatever and perfect opsec, but that really is
         | not the norm.
         | 
         | 3) Hosters not being directly liable doesn't make them immune
         | to court orders. If something is defamatory you can secure an
         | order to have it removed even without finding the person in
         | question. And in practice most hosters are probably going to
         | remove something upon notification as fast as possible, as in
         | this case, and ban the poster in question on top.
         | 
          | So no, I don't think it's "a massive vacuum of responsibility"
          | any more than it ever was, and the contrast is that
          | eliminating anonymous speech is a long-proven massive risk to
          | basic freedoms.
        
       | smartbit wrote:
        | Another analysis, by Heise Verlag, publisher of _c't_, Europe's
        | largest IT and tech magazine: https://heise.de/-11102550
        | 
        |  _The Russmedia ruling of the ECJ: Towards a "Cleannet"?
        | 
        | A change in liability privilege for online providers will lead
        | to a "cleaner", but also more rigid, monitored internet, says
        | Joerg Heidrich._
        
       | hexo wrote:
        | Websites have to be held responsible for the ads they serve.
        | Otherwise they tend to make unfounded excuses we couldn't care
        | less about. Like scam ads on YouTube.
       | 
       | But user generated content? LOL, no.
        
         | RunSet wrote:
         | What a relief this does not apply to https://uk.LokiList.com
        
       | Tor3 wrote:
        | Sigh. "..that personal data processed must be accurate and,
        | where necessary, kept up to date."
       | 
       | How do they think a hosting provider can check if personal data
       | is accurate? Maybe if privacy didn't exist and everybody could be
       | scrutinized.. but the ruling refers to the GDPR to justify this,
       | and the GDPR is about _protecting_ privacy. So, what is it?
       | 
       | And for everything else.. is the material sensitive or not? How
       | can anyone know, in advance?
       | 
       | I suggest every web site host simply forward all and every input
       | to an EU Court address, and let them handle it. They're the ones
       | suggesting that hosts should make sure that personal data on
       | someone is "accurate", they're the ones demanding that the data
        | should not be "sensitive", so they may as well be responsible for
       | vetting the data.
       | 
       | But they're all crazy anyway, as they demand that a website must
       | block anyone from copying the content.. so how, at the same time,
       | can you even have a website? A website which people can watch?
       | 
        | If the ruling were about _collecting_ data which isn't for
        | displaying, i.e. what a net shop does (address, credit card
        | number), then this would be understandable. But provisions for
        | that already exist; instead they use this (GDPR) as a tool to
        | extend it to user-created content. It's not limited to ads, and
        | ads do need something done. Something totally different from
        | this.
        
       | nomercy400 wrote:
        | The party which decides to show the advertisement in exchange
        | for payment should be more responsible for what they are showing
        | than a free user posting content.
        | 
        | Now things become interesting when a user pays for ranking or
        | 'verification' checkmarks. What makes that content different
        | from a paid advertisement?
        
         | jay_kyburz wrote:
          | I came to the comments to express the same sentiment,
          | expecting it to be an unpopular opinion. Pleasantly surprised
          | to find your comment at the top.
         | 
         | Hosts should make sure they know who is posting content on
         | their platforms, so that in the event they are sued, they can
         | countersue the creator of the content.
        
       | jjcm wrote:
       | I've long held a belief that if countries want to mandate
       | compliance, they should be required to provide the mechanism for
       | compliance.
       | 
        | Want to deem whether content is allowed or not? Fine, provide an
        | API that scans content and returns a bool.
       | 
       | Want to age-gate content? Fine, provide an identity service.
       | 
        | While both of these will reduce privacy, they'll achieve one of
        | two objectives: either those making these policies will realize
        | the law they wrote is impossible to achieve, or, if they
        | succeed, it will at least provide a level playing field for
        | startups vs incumbents.
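        | 
        | A hypothetical sketch of the scan-endpoint idea, in Python. The
        | ScanResult type, the scan_content function, and the banned-term
        | set are all invented for illustration; no such government API
        | exists today:

```python
# Hypothetical sketch of the "scan content, get back a bool" endpoint
# described above. Everything here (ScanResult, scan_content, BANNED)
# is invented for illustration; no government provides such an API.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ScanResult:
    allowed: bool          # the single bool the regulator would return
    reason: Optional[str]  # optional explanation, useful for appeals


def scan_content(text: str, banned_terms: set) -> ScanResult:
    """Stand-in for the regulator's scan endpoint: reject content
    containing any banned term, allow everything else."""
    lowered = text.lower()
    for term in banned_terms:
        if term in lowered:
            return ScanResult(allowed=False,
                              reason=f"matched banned term: {term}")
    return ScanResult(allowed=True, reason=None)


# A platform would gate user posts on the returned bool:
BANNED = {"example-scam-term"}


def accept_post(text: str) -> bool:
    return scan_content(text, BANNED).allowed
```

        | The hard part a real regulator-run endpoint would face is that
        | "is this content allowed" is nothing like a substring match; the
        | sketch only shows the bool-returning interface shape the comment
        | proposes.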
        
         | snapplebobapple wrote:
          | But if they did that, they couldn't justify all the
          | government BS jobs the current system creates, and they also
          | couldn't go after companies for fines, because liability
          | would be on the government. It would be totally unprofitable
          | for the government leech class.
        
         | o11c wrote:
         | That sounds backwards in the general case.
         | 
         | If your business cannot stand in the face of basic realities
         | like "knowing what we are doing", then _your business has no
         | right to exist_.
        
           | bpodgursky wrote:
           | Do you enjoy telling businesses they have no right to exist?
           | Does it make you hard? Is it a sexual thing? Does it make you
           | feel big and powerful?
        
       ___________________________________________________________________
       (page generated 2025-12-04 23:01 UTC)