[HN Gopher] In memoriam
       ___________________________________________________________________
        
       In memoriam
        
       Author : ColinWright
       Score  : 180 points
       Date   : 2025-02-23 19:13 UTC (3 hours ago)
        
 (HTM) web link (onlinesafetyact.co.uk)
 (TXT) w3m dump (onlinesafetyact.co.uk)
        
       | chris_wot wrote:
       | Apparently the law is dreadfully written. I was reading the
       | lobste.rs thread and wow, it's like they took a programming
        | course in goto and if statements and applied it to the law...
        
         | ChrisKnott wrote:
         | I had the complete opposite impression from that thread. It
          | seemed like people were politically motivated to interpret the
         | law in a certain way, so they could act like they were being
         | coerced.
         | 
         | These closures are acts of protest, essentially.
         | 
         | I agree with @teymour's description of the law. It is totally
         | normal legislation.
        
       | Izmaki wrote:
       | "Furry.energy"? With a total of 49 members? My World of Warcraft
       | guild has more active players...
        
         | AnthonyMouse wrote:
         | This is exactly the point, isn't it? The smallest websites are
         | destroyed, leaving only the megacorps.
        
           | twinkjock wrote:
           | That is not the stated purpose of the law and there is
            | recourse built into it. Too often folks view these laws as
            | a binary where none exists.
        
             | AnthonyMouse wrote:
             | It's never the _stated_ purpose of the law, but we might do
             | well to be concerned with what it actually does rather than
             | what the proponents claim it would do.
             | 
             | Recourse doesn't matter for a sole proprietorship. If they
             | have to engage with a lawyer whatsoever, the site is dead
             | or blocked because they don't have the resources for that.
        
             | kelnos wrote:
              | What recourse? A small, 50-member community doesn't have
             | the resources to ensure they're in compliance, and Ofcom's
             | statement about how smaller players are "unlikely" to be
             | affected is not particularly reassuring.
             | 
             | The "stated purpose" is irrelevant. Even if they are being
             | honest about their stated purpose (questionable), the only
             | thing that matters is how it ends up playing out in
             | reality.
        
           | Izmaki wrote:
           | I'm sure they can find a community elsewhere. Discord comes
           | to mind... "Oh but it's illegal", trust me on this: Discord
           | only cares if somebody actually reports the server and the
           | violations are severe enough.
        
       | jsheard wrote:
        | Hexus is a big one: being UK-based and UK-centric, they are
        | just deleting 24 years of history rather than trying to
        | geoblock around it.
        
         | edwinjones wrote:
          | Hexus shut down years ago, did it not?
        
           | stoobs wrote:
           | The reviews/news side did, but the forums kept going until
           | this.
        
       | IanCal wrote:
        | Right or wrong, judging by people's reasoning, I think many
        | have misread the legislation or read poor coverage of it.
        | 
        | Much of this boils down to doing a risk assessment and deciding
       | on mitigations.
       | 
       | Unfortunately we live in a world where if you allow users to
       | upload and share images, with zero checks, you are disturbingly
       | likely to end up hosting CSAM.
       | 
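        | To make "checks" concrete, here's a minimal sketch of an
        | upload-time gate. The hash feed is hypothetical, and real
        | services use perceptual matching (e.g. PhotoDNA) because exact
        | hashes are defeated by re-encoding the image:
        | 
        |   import hashlib
        | 
        |   # Hypothetical blocklist of SHA-256 digests; in practice it
        |   # would be populated from an industry hash feed.
        |   KNOWN_BAD_HASHES = set()
        | 
        |   def accept_upload(data: bytes) -> bool:
        |       # Reject any upload whose digest is on the blocklist.
        |       digest = hashlib.sha256(data).hexdigest()
        |       return digest not in KNOWN_BAD_HASHES
        | 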
        | Ofcom have guides, risk assessment tools and more; if you think
        | any of this is relevant to you, that's a good place to start.
       | 
       | https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...
        
         | docflabby wrote:
          | It's not that simple - illegal and harmful content can include
          | things like hate speech - worth a longer read...
         | https://www.theregister.com/2025/01/14/online_safety_act/
         | 
          | If I ran a small forum in the UK I would shut it down - not
          | worth the risk of jail time for getting it wrong.
        
            | docflabby wrote:
            | The new rules cover any kind of illegal content that can
            | appear online, but the Act includes a list of specific
            | offences that you should consider. These are:
            | 
            |   terrorism
            |   child sexual exploitation and abuse (CSEA) offences,
            |     including grooming, image-based child sexual abuse
            |     material (CSAM), and CSAM URLs
            |   hate
            |   harassment, stalking, threats and abuse
            |   controlling or coercive behaviour
            |   intimate image abuse
            |   extreme pornography
            |   sexual exploitation of adults
            |   human trafficking
            |   unlawful immigration
            |   fraud and financial offences
            |   proceeds of crime
            |   drugs and psychoactive substances
            |   firearms, knives and other weapons
            |   encouraging or assisting suicide
            |   foreign interference
            |   animal cruelty
        
             | sepositus wrote:
             | > hate
             | 
             | Is it really just listed as one word? What's the legal
             | definition of hate?
        
               | tene80i wrote:
                | https://en.wikipedia.org/wiki/Hate_speech_laws_in_the_United...
        
               | sepositus wrote:
               | Thanks.
               | 
               | > Something is a hate incident if the victim or anyone
               | else think it was motivated by hostility or prejudice
               | based on: disability, race, religion, gender identity or
               | sexual orientation.
               | 
               | This probably worries platforms that need to moderate
               | content. Sure, perhaps 80% of the cases are clear cut,
               | but it's the 20% that get missed and turn into criminal
               | liability that would be the most concerning. Not to
               | mention a post from one year ago can become criminal if
               | someone suddenly decides it was motivated by one of these
               | factors.
               | 
                | Further, the language of prejudice changes often.
               | As bad actors get censored based on certain language,
               | they will evolve to use other words/phrases to mean the
               | same thing. The government is far more likely to be aware
               | of these (and be able to prosecute them) than some random
               | forum owner.
        
               | sapphicsnail wrote:
                | Just want to add that I couldn't find any references to
                | gender identity in the linked Wikipedia article or in
                | the article on hate incidents in the UK.
        
               | CamperBob2 wrote:
               | Whatever the current government says it means. What did
               | you think it meant?
        
               | bdzr wrote:
               | I don't see what the big deal is - Governments don't
               | change hands or selectively prosecute.
        
               | throwaway48476 wrote:
               | Hate is whatever I don't like.
        
             | Winblows11 wrote:
              | From that list I don't see HN being affected, although I
              | read somewhere that a report button on user-generated
              | content is required for smaller sites to comply.
        
           | guax wrote:
            | The good thing about forums is their moderation. It seems
            | like most of what the law covers is already enforced by
            | most forums anyway.
        
             | Tuna-Fish wrote:
             | A forum that merely has good moderation is not
             | automatically compliant with the act. It requires not just
             | doing things, but paperwork that shows that you are doing
             | things. The effort to do this well enough to be sure you
             | will be in compliance is far beyond what is reasonable to
             | ask of hobbyists.
        
           | nsteel wrote:
           | I might be falling for what I've read second-hand but isn't
           | one of the issues that it doesn't matter where the forum is
           | based, if you've got significant UK users it can apply to
           | your forum hosted wherever. You've got to block UK users.
        
         | aimazon wrote:
         | You're right. Plus, the overreactions have been walked back or
          | solved in some cases, e.g. LFGSS is going to continue on as a
          | community-run effort which will comply with the risk assessment
         | requirements. Most of the shutdowns are on long-dead forums
         | that have been in need of an excuse to shutter. The number of
         | active users impacted by these shutdowns probably doesn't break
         | 100.
        
         | pmlnr wrote:
         | > Much of things boils down to doing a risk assessment and
         | deciding on mitigations.
         | 
         | So... paperwork, with no real effect, use, or results. And
         | you're trying to defend it?
         | 
          | I do agree we need something, but this is most definitely not
          | the solution.
        
           | IanCal wrote:
           | Putting in mitigations relevant to your size, audience and
           | risk factors is not "no real effect".
           | 
           | If you've never considered what the risks are to your users,
           | you're doing them a disservice.
           | 
           | I've also not defended it, I've tried to correct
           | misunderstandings about what it is and point to a reliable
           | primary source with helpful information.
        
         | zimpenfish wrote:
         | > if you allow users to upload and share images
         | 
          | On my single-user Fedi server, the only person who can directly
          | upload and share images is me. But because my profile is
          | public, it's entirely possible that someone I'm following posts
          | something objectionable (either intentionally or via
          | exploitation) and it would be visible via my server (albeit
          | fetched from the remote site). Does that come under
          | "moderation"? Ofcom haven't been clear.
          | 
          | And if someone can post pornography, your site needs age
          | verification. Does my single-user Fedi instance now need age
          | verification because a random child might look at my profile
          | and see a remotely-hosted pornographic image that someone (not
          | on my instance) has posted? Ofcom, again, have not been clear.
         | 
         | It's a crapshoot with high stakes and only one side knows the
         | rules.
        
       | guax wrote:
        | Seems like an overreaction in some of these cases. Perhaps the
        | people running them were already close to the edge, and the
        | extra mental burden just pushed them over it.
       | 
       | It's like local US news websites blocking European users over
       | GDPR concerns.
        
         | whatshisface wrote:
         | Feel free to put a stop to it by buying liability insurance for
         | all of these service providers, which you may have to persuade
         | the underwriter should be free. ;-)
        
         | ivanmontillam wrote:
          | > _It's like local US news websites blocking European users
          | over GDPR concerns._
         | 
         | I don't know if you said this sarcastically, but I have a
          | friend in Switzerland who reads U.S. news websites via the Web
          | Archive or archive.is _exactly_ because of that.
         | 
          | Accessing some of these news sites returns Cloudflare's "not
          | available in your region" message or similar.
        
           | homebrewer wrote:
           | It's not just the EU; I'm in a poorer region outside the EU
           | and seeing "not available in your region" is quickly becoming
           | the norm. Site administrators try to cut down on bot traffic
           | (scraping, vulnerability scanners, denial of service, etc)
           | and block whole regions they're not interested in.
           | 
           | Hell, we do that ourselves, but only for our own
           | infrastructure that isn't expected to be used outside the
            | country. Whitelisting your own country and blocking everything
           | else cuts out >99% of scrapers and script kiddies.
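            | 
            | The logic is simple enough; a sketch in Python (the range
            | below is a documentation placeholder, not real data - a
            | real deployment loads your country's CIDR blocks from RIR
            | delegation files, usually at the firewall):
            | 
            |   import ipaddress
            | 
            |   # Placeholder; load the blocks delegated to your country.
            |   HOME_RANGES = [ipaddress.ip_network("203.0.113.0/24")]
            | 
            |   def allow(ip: str) -> bool:
            |       # Whitelist posture: drop anything not provably local.
            |       addr = ipaddress.ip_address(ip)
            |       return any(addr in net for net in HOME_RANGES)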
        
           | guax wrote:
            | No sarcasm. I totally understand why a local news website in
            | the US would just block: any traffic from outside the
            | country is irrelevant to them, and they have few resources.
            | I don't judge them for blocking.
            | 
            | The fact is, it's very unlikely they would ever face any
            | issues for not blocking.
        
             | ryandrake wrote:
             | So much for the "world wide" web.
        
       | Fokamul wrote:
        | I will host a public proxy site for these websites, open to UK
        | people, just to troll them :D
        
         | ivanmontillam wrote:
         | Whilst I don't condone being unlawful (are you sure you want to
         | run that risk?), that's the hacker spirit one needs these days.
         | 
         | Being silly to ridicule overreaching laws is top-trolling! Love
         | it.
        
           | AnthonyMouse wrote:
           | The trouble here is that the law is _so_ crazy that third
           | parties allowing users in the relevant jurisdiction to access
            | the site could result in the site still being liable, so then
           | they would have the same reason to block your proxy service
           | if a non-trivial number of people were using it.
           | 
           | To do any good you don't want to cause grief for the victims
           | of the crazy law, you want to cause grief to its
           | perpetrators.
        
             | ivanmontillam wrote:
              | Then I guess that'd be a use case for technologies like
              | Tor or I2P, used properly and securely.
        
               | bazzargh wrote:
               | Worth mentioning that the lawyer who runs
               | onlinesafetyact.co.uk, Neil Brown, has its onion address
               | in his profile.
               | 
               | https://mastodon.neilzone.co.uk/@neil
               | 
                | http://3kj5hg5j2qxm7hgwrymerh7xerzn3bowmfflfjovm6hycbyfuhe6l...
        
             | Mindwipe wrote:
              | Ofcom have said in writing that they consider geoblocking
              | to be sufficient, so at the least they would probably lose
              | any legal case brought against a site that geoblocks.
        
               | AnthonyMouse wrote:
               | Which would in turn cause the whole thing to be a farce,
               | because then the solution would be for every site to
               | geoblock the UK and then every person in the UK to use a
               | proxy.
        
               | Mindwipe wrote:
               | It is.
        
           | kelnos wrote:
           | If GP is not a UK citizen and does not live in the UK, how
           | would that be unlawful? They're not beholden to or subject to
           | UK law. The UK's belief that they can enforce this law on
           | non-UK entities is ridiculous.
        
       | amiga386 wrote:
       | Charlie Stross's blog is next.
       | 
       | Liability is unlimited and there's no provision in law for being
       | a single person or small group of volunteers. You'll be held to
        | the same standards as a behemoth with full-time lawyers (the
       | _stated target_ of the law but the least likely to be affected by
       | it)
       | 
       | http://www.antipope.org/charlie/blog-static/2024/12/storm-cl...
       | 
        | The entire law is weaponised unintended consequences.
        
         | wkat4242 wrote:
         | > unintented consequences
         | 
         | Intended consequences no doubt.
        
         | tene80i wrote:
         | What standards would you want individuals or small groups to be
         | held to? In a context where it is illegal for a company to
         | allow hate speech or CSAM on their website, should individuals
         | be allowed to? Or do you just mean the punishment should be
         | less?
        
           | AnthonyMouse wrote:
           | The obvious solution is to have law enforcement enforce the
           | law rather than private parties. If someone posts something
           | bad to your site, the police try to find who posted it and
            | arrest _them_, and the only obligation on the website is to
           | remove the content in response to a valid court order.
        
             | tene80i wrote:
             | I don't have a strong view on this law - I haven't read
             | enough into it. So I'm interested to know why you believe
             | what you've just written. If a country is trying to, for
             | example, make harder for CSAM to be distributed, why
             | shouldn't the person operating the site where it's being
             | hosted have some responsibility to make sure it can't be
             | hosted there?
        
               | mananaysiempre wrote:
               | For one thing, because that person is not obliged to
               | follow due process and will likely ban everything that
                | might even vaguely require them to involve a lawyer.
               | See for example YouTube's copyright strikes, which are
               | _much_ harsher on the uploader than any existing
               | copyright law.
        
               | tene80i wrote:
               | Your argument is that it's better to have the illegal
               | stuff (say, CSAM) online than for a site owner to, for
               | practical reasons, ban a lot of legal stuff too? Why?
        
               | noah_buddy wrote:
               | Some sorts of goods should be prioritized over some sorts
               | of bads. There would be no terrorism if we locked every
               | human in a box and kept them there, yet you do not
               | support this position, why? I jest, but I think public
               | discourse is an unalloyed good and I would rather we not
               | compromise informal small discourse for the sake of anti-
               | terrorism, anti-CSAM, etc. These things won't be fully
               | rooted out, they'll just go to ground. Discourse will be
               | harmed though.
        
               | dcow wrote:
               | That is not the argument. The argument is that, with
               | appropriate court order, a site operator must take down
               | the illegal material (if it hasn't already been moderated
               | out). However, the site owner should _not be liable_ for
               | that content appearing on their site since it was not put
               | there by them and since there is value in uncensored
               | /unmoderated online communities. The _person who posted_
               | the content should be liable, not the site owner. In
                | neither case is the content just freely sitting there
               | harming the public and unable to be removed because
               | nobody is liable for punishment.
               | 
               | I think an interesting alternate angle here would be to
               | require unmoderated community admins to keep record of
               | real identity info for participants, so if something bad
               | shows up the person who posted it is trivially
               | identifiable and can easily be reprimanded. This has
               | other problems, of course, but is interesting to
               | consider.
        
               | AnthonyMouse wrote:
               | Let's consider two ways of dealing with this problem:
               | 
               | 1) Law enforcement enforces the law. People posting CSAM
               | are investigated by the police, who have warrants and
               | resources and so on, so each time they post something is
               | another chance to get caught. When they get caught they
               | go to jail and can't harm any more children.
               | 
                | 2) Private parties try to enforce the law. The people
                | posting CSAM get banned, but the site has no ability to
                | incarcerate them, so they just make a new account and do
                | it again. Since the penalty is only having to create a
                | new account, which they don't really care about, it
                | becomes a cat and mouse game, except that even if the
                | cat catches the mouse, the mouse just reappears under a
                | different name with new knowledge of how to avoid
                | getting caught. Since being detected carries minimal
                | risk, they get to try lots of strategies until they
                | learn how to evade the cat, instead of getting eaten
                | (i.e. going to prison) the first time they get caught.
                | So they get better at evading detection, which makes it
                | harder for law enforcement to catch them too.
                | 
                | Meanwhile the site comes under increasing pressure to
                | "do something" because the problem has been made worse
                | rather than better, so they turn up the false positives
                | and cause more collateral damage to innocent people.
                | But that doesn't change the dynamic; it only causes the
                | criminals to evolve their tactics, which they can try an
                | unlimited number of times until they learn how to evade
                | detection again. And as soon as they do, the site,
                | despite its best efforts, is hosting the material again.
                | 
                | The combined cost of the heroic efforts to try and the
                | liability from inevitably failing destroys smaller sites
                | and causes market consolidation. The megacorps then
                | become a choke point for other censorship, some by
                | various governments, some by the corporations
                | themselves. That is an evil in itself, but if you like
                | to take it from the other side: that evil causes
                | ordinary people to chafe, so they start to develop and
                | use anti-censorship technology. As that technology
                | becomes more widespread, with greater public support,
                | the perpetrators of the crimes you're trying to prevent
                | find it easier to avoid detection.
               | 
               | You want the police to arrest the pedos. You don't want a
               | dystopian megacorp police state.
        
           | amiga386 wrote:
           | How about:
           | 
           | Individuals and small groups not held directly liable for
            | comments on their blog unless it's proven they're responsible
           | for inculcating that environment.
           | 
           | "Safe harbour" - if someone threatens legal action, the host
           | can pass on liability to the poster of the comment. They can
           | (temporarily) hide/remove the comment until a court decides
           | on its legality.
        
           | ta8645 wrote:
           | How about have separate laws for CSAM and "hate speech".
           | Because CSAM is most likely just a fig-leaf for the primary
           | motivation of these laws.
        
         | 0xbadcafebee wrote:
         | big DMCA energy
        
         | aimazon wrote:
         | There has been new information since that blog post which has
         | reaffirmed the "this is much ado about nothing" takes because
         | Ofcom have said that they do not want to be a burden on smaller
         | sites.
         | 
         | https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...
         | 
         | "We've heard concerns from some smaller services that the new
         | rules will be too burdensome for them. Some of them believe
         | they don't have the resources to dedicate to assessing risk on
         | their platforms, and to making sure they have measures in place
         | to help them comply with the rules. As a result, some smaller
         | services feel they might need to shut down completely.
         | 
         | So, we wanted to reassure those smaller services that this is
         | unlikely to be the case."
        
           | ColinWright wrote:
           | "... unlikely ..."
           | 
           | Political winds shift, and if someone is saying something the
           | new government doesn't like, the legislation is there to
           | utterly ruin someone's life.
        
           | pembrook wrote:
           | Nothing more reassuring than a vague "we're unlikely to go
           | after you [if you stay on our good side.]"
           | 
           | It's clear the UK wants big monopolistic tech platforms to
           | fully dominate their local market so they only have a few
           | throats to choke when trying to control the narrative...just
           | like "the good old days" of centralized media.
           | 
            | I wouldn't stand in the way of authoritarians, not if you
            | value your freedom (or the ability to have a bank account).
           | 
           | The risk just isn't worth it. You write a blog post that rubs
           | someone power-adjacent the wrong way and suddenly you're
           | getting the classic _"...nice little blog you have
           | there...would be a shame to find something that could be
              | interpreted as violating 1 of our 17 problem areas..."_
        
             | throwaway48476 wrote:
              | For my friends, everything; for my enemies, the law.
             | 
             | Uneven enforcement is the goal.
        
             | HPsquared wrote:
             | Sovereign is he who makes the exception.
        
             | owisd wrote:
              | Changing the code of practice is a years-long statutory
              | consultation process; they're not going to be able to
              | change the rules to go after you on a whim.
        
           | amiga386 wrote:
           | Ofcom need to change the law then.
           | 
           | Unless Ofcom actively say "we will NOT enforce the Online
           | Safety Act against small blogs", the chilling effect is still
           | there. Ofcom need to own this. Either they enforce the bad
           | law, or loudly reject their masters' bidding. None of this
           | "oh i don't want to but i've had to prosecute this crippled
           | blind orphan support forum because one of them insulted islam
            | but my hands are tied..."
        
           | rkachowski wrote:
           | > So, we wanted to reassure those smaller services that this
           | is unlikely to be the case
           | 
            | This is the flimsiest paper-thin reassurance. They've built a
           | gun with which they can destroy the lives of individuals
           | hosting user generated content, but they've said they're
           | unlikely to use it.
        
           | transcriptase wrote:
           | The Canadian government did the same thing when they
           | accidentally outlawed certain shotguns by restricting bore
           | diameter without specifying it was for rifles.
           | 
           | A minister tweeted that it didn't apply to shotguns, as if
              | that's legally binding as opposed to, you know, the law as
              | written.
        
             | throwaway48476 wrote:
             | The democrats wrote a bill to hire 60k new armed IRS agents
             | and promised they wouldn't be used to go after anyone with
             | an income less than 250k. Senator Mike Crapo tried to add
              | an amendment to put that in the bill but they blocked it.
             | We have a serious problem with politicians lying about the
             | text of bills.
        
               | kelnos wrote:
               | While I certainly would prefer that the IRS first and
                | foremost go after tax evasion perpetrated by the wealthy
                | (if for no other reason than there's likely more bang for
               | the buck there), tax law is tax law. If someone making
               | less than $250k/yr is evading paying taxes, the IRS
               | should go after them just the same as if it was someone
               | making $5M/yr.
        
               | throwaway48476 wrote:
               | Usually people complain that the IRS doesn't go after
               | >250k. I've never heard anyone argue that they don't go
                | after <250k enough. This is why the democrats promised it
               | would only be used to go after >250k.
               | 
               | The problem is the dishonesty, saying the intent is one
               | thing but being unwilling to codify the stated intent.
        
           | mlfreeman wrote:
           | The use of "unlikely" just screams that Ofcom will eventually
           | pull a Vader..."We are altering the deal, pray we don't alter
           | it any further".
        
           | incompatible wrote:
           | "Unlikely," I suppose if you don't have any significant
           | assets to be seized and don't care about ending up in prison,
           | you may be willing to take the chance.
        
           | fweimer wrote:
           | You can try the digital toolkit and see for yourself if this
           | is a realistic pathway for a small site (such as a blog with
           | a comment function). Personally, I find it puzzling that
           | Ofcom thinks what they provide is helpful to small sites.
           | Furthermore, they make it pretty clear that they see no
           | reason for a purely size-based exemption ("we also know that
           | harm can exist on the smallest as well as the largest
           | services"). They do not explore ways to reach their goals
           | without ongoing collaboration from small site owners, either.
        
         | bdzr wrote:
         | > the stated target of the law but the least likely to be
         | affected by it
         | 
         | The least likely to be _negatively_ affected. This will
         | absolutely be good for them in that it just adds another item
         | to the list of things that prevents new entrants from competing
         | with them.
        
         | eminent101 wrote:
          | This is an honest question. Why does a blog need to shut down?
          | If they moderate every comment before it is published on the
          | website, what's the problem? I ask because I've got a UK-based
          | blog too. It has a comments feature. Wouldn't enabling
          | moderation for all comments be enough?
        
           | Mindwipe wrote:
            | No, you still need to do things like write an impact
            | assessment etc., and you're still on the hook for "illegal"
            | comments, where you aren't a judge but have to arbitrarily
            | decide what might be illegal when you have no legal
            | expertise whatsoever.
        
             | eminent101 wrote:
             | If I'm moderating all comments before they're published on
             | the website, what's the problem? I mean, I've got a simple
             | tech blog. I'm not going to publish random drive-by
             | comments. Only comments that relate to my blog are ever
             | going to be published. Am I making sense?
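              | 
              | In other words, something like this hold-everything
              | queue (a toy sketch; the names are mine, not any real
              | blog engine's):
              | 
              |   from dataclasses import dataclass
              | 
              |   @dataclass
              |   class Comment:
              |       author: str
              |       body: str
              | 
              |   pending = []    # every comment lands here first
              |   published = []  # only reviewed comments are shown
              | 
              |   def submit(author, body):
              |       # Nothing goes live without a human look first.
              |       pending.append(Comment(author, body))
              | 
              |   def approve(comment):
              |       pending.remove(comment)
              |       published.append(comment)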
        
               | Mindwipe wrote:
               | Does anyone in your blog comments ever discuss
               | circumvention of DRM?
               | 
               | That's a criminal offence in the UK (two year prison
               | sentence in some circumstances). Do you have a good
               | feeling for what might count as incitement in those
               | circumstances?
        
         | throwaway48476 wrote:
        | It's very much intended. It's easier for the powers that be to
        | deal with a few favored oligarchs. They're building a great
        | British firewall, like China's.
        
       | whatshisface wrote:
       | It seems like governments around the world are shifting their
       | priorities away from their domestic economies.
        
         | logicchains wrote:
         | Shifting their priorities towards stifling the speech of anyone
         | who tries to complain about the domestic conditions.
        
       | KennyBlanken wrote:
       | So, how does all this apply to community discords, slacks, Matrix
       | rooms, IRC chats, etc?
       | 
        | Is it Discord's responsibility to comply, the admins/moderators,
       | or all of the above?
        
         | twinkjock wrote:
         | Yes, at least for platforms like Discord, they bear the
         | responsibility based on my non-lawyer reading of the plain
         | English. YMMV, IANAL.
        
         | kelnos wrote:
         | The hosting platform is responsible for compliance. For Discord
         | or Slack it's easy, but for Matrix, it might be more fuzzy.
         | Certainly the homeserver that is hosting a room would be
         | responsible, but would other homeservers that have users who
         | are members of the room also be responsible?
        
       | KennyBlanken wrote:
       | You know what's really rich about the OSA?
       | 
       | One of the exemptions is for "Services provided by persons
       | providing education or childcare."
        
         | ColinWright wrote:
         | Do you have an explicit reference for that?
         | 
         | Not doubting it, but if you have a reference to hand it will
         | save me having to search.
         | 
          | If it's just something you remember but _don't_ have a
         | reference then that's OK, I'll go hunting based on your clue.
        
           | bazzargh wrote:
            | In the text of the act, schedule 1, part 1, paragraph 10:
            | https://www.legislation.gov.uk/ukpga/2023/50/schedule/1/para...
           | 
           | ... unlike the issue of what size of service is covered, this
           | isn't a pinky swear by Ofcom.
        
             | ColinWright wrote:
             | Super ... many thanks.
        
       | tac19 wrote:
       | Safety from dissent, for an authoritarian government. This is
       | just weaponized "empathy".
        
       | sepositus wrote:
       | Doesn't this act effectively create a new form of DDoS? A bad
        | actor can flood a platform with enough hate content
       | that the moderation team simply cannot keep up. Even if posts
       | default to not show, the backlog could be enough to harm a
       | service.
       | 
       | And of course, it will turn into yet another game of cat and
       | mouse, as bad actors find new creative ways to bypass automatic
       | censors.
        
         | riwsky wrote:
         | Aka a "heckler's veto":
         | https://en.m.wikipedia.org/wiki/Heckler's_veto
        
         | throwaway48476 wrote:
         | This already happens but attackers go after hosts and
         | registrars.
        
       | maxed wrote:
       | Is Hacker News also affected by this act?
        
         | Mindwipe wrote:
         | Yes.
        
         | kelnos wrote:
         | Yes, but I don't really understand how the UK can expect to
         | enforce this law against non-UK entities that don't have any
         | employees or physical presence in the UK.
         | 
         | HN/YC could just tell them to go pound sand, no? (Assuming YC
         | doesn't have any operations in the UK; I have no idea.)
        
           | milesrout wrote:
           | pg lives in Britain if I'm not mistaken.
        
       | mattvr wrote:
        | This list should be ordered by number of users affected rather
        | than alphabetically, IMO. The 275K-monthly-user platform is
        | almost hidden next to the 49- and 300-user examples.
        
       | v3xro wrote:
       | Just another bad UK law not worth knowing about ;)
        
       | logicallee wrote:
       | The State of Utopia has published this report on the source of
       | funding of Ofcom, the U.K. statutory regulator responsible for
       | enforcing the Online Safety Act:
       | 
       | https://medium.com/@rviragh/ofcom-and-the-online-safety-act-...
        
       | cgcrob wrote:
       | I am part of a small specialist online technical community. We
       | just moved it over to a Hetzner box in Germany and someone there
       | is paying for it instead of it being hosted in the UK.
       | 
       | What are you going to do Ofcom?
        
         | kelnos wrote:
         | If you live in the UK and can still be linked as an
         | operator/organizer of the site (or if it's not you, other UK
         | residents), can't they still come after you directly? I don't
         | know about you, but I don't think running an online community
         | would be worth huge fines to me.
        
           | cgcrob wrote:
           | There are no UK residents involved in the organisation or
           | operation of it now even though we came up with it.
        
       | throwaway48476 wrote:
        | Non-UK sites should IP-block UK IPs and create a block page
        | that advertises VPNs.
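        | 
        | A minimal sketch of the idea, assuming Flask and MaxMind's
        | geoip2 reader (the database path and block-page text are
        | placeholders):
        | 
        |   import geoip2.database
        |   from geoip2.errors import AddressNotFoundError
        |   from flask import Flask, request
        | 
        |   app = Flask(__name__)
        |   # Placeholder path; GeoLite2 needs a free MaxMind account.
        |   reader = geoip2.database.Reader("GeoLite2-Country.mmdb")
        | 
        |   BLOCK_PAGE = ("Unavailable in the UK due to the Online "
        |                 "Safety Act. A reputable VPN restores access.")
        | 
        |   @app.before_request
        |   def geoblock_uk():
        |       # Behind a proxy, take the first X-Forwarded-For hop.
        |       fwd = request.headers.get("X-Forwarded-For", "")
        |       ip = fwd.split(",")[0].strip() or request.remote_addr
        |       try:
        |           if reader.country(ip).country.iso_code == "GB":
        |               # 451: unavailable for legal reasons
        |               return BLOCK_PAGE, 451
        |       except (AddressNotFoundError, ValueError):
        |           pass  # unknown origin: let it through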
        
       | dang wrote:
       | Related ongoing thread: _Lobsters blocking UK users because of
       | the Online Safety Act_ -
       | https://news.ycombinator.com/item?id=43152178
        
       | darnthenuggets wrote:
       | What concept allows the UK to (attempt to) enforce this against
        | non-citizens whose business or life has no ties to their
        | country? Plenty of small countries have odd censorship laws but
        | have escaped similar legal hand-wringing.
        
       | JFingleton wrote:
       | The Chaos Engine forums - a site for game developers to discuss,
       | moan, and celebrate fellow and former colleagues... Now moved to
       | Discord due to this act. It really is a strange time we are
       | living through.
       | 
       | https://thechaosengine.com/index.php
        
       ___________________________________________________________________
       (page generated 2025-02-23 23:00 UTC)