[HN Gopher] HOPL: The Human Only Public License
       ___________________________________________________________________
        
       HOPL: The Human Only Public License
        
       Author : zoobab
       Score  : 90 points
       Date   : 2025-10-28 16:32 UTC (6 hours ago)
        
 (HTM) web link (vanderessen.com)
 (TXT) w3m dump (vanderessen.com)
        
       | GaryBluto wrote:
       | Seems incredibly reductive and Luddite. I doubt it will ever
       | achieve adoption, and projects using it will be avoided.
       | 
       | Not to mention that all you'd need to do is get an LLM to
       | rewrite said programs _just_ enough to make it impossible to
       | prove it used the program's source code.
        
         | GaryBluto wrote:
         | Now that I look further into it, a lot of the terms used are
         | far too vague, and it looks unenforceable anyway.
        
         | blamestross wrote:
         | https://en.wikipedia.org/wiki/Luddite is a preview of the
         | next decade if we don't change how we are acting.
         | 
         | Agreed that this isn't the solution, though.
        
           | Imustaskforhelp wrote:
           | Honestly, I am genuinely curious whether there are good
           | books/articles about the Luddites. I don't think this is an
           | apples-to-oranges comparison, but I am willing to open up
           | my viewpoint and take counter-arguments (preferably to
           | sharpen my own). As with all things, I can be wrong; I
           | usually am. Still, there is something to be said about how
           | AI scraped the whole world without permission: nobody
           | consented to it, and in some places consent is straight up
           | ignored.
           | 
           | It wasn't as if the machines were looking at every piece of
           | cloth the workers produced pre-revolution while ignoring
           | their consent. There is a big difference, but I personally
           | don't think the Luddites' resistance to change should be
           | criticised either.
           | 
           | I think it's fair to resist change, since not all change is
           | equal. It's okay to fight for what you believe in, as long
           | as you try to educate yourself about the other side's
           | opinion and lay it all out logically without much bias.
        
         | fvdessen wrote:
         | Hey, I'm the author of this post, in a sense I agree with you.
         | I doubt it will ever be mass adopted, especially in this form,
         | which is more a draft to spark discussion than a real license.
         | 
         | I am not against AI; I use it every day and find it
         | extraordinarily useful. But I am also trying to look ahead
         | at what the online world will look like 10 years from now,
         | with AI vastly better than what we have now.
         | 
         | It is already hard to connect with people online, as there
         | is so much commercial pressure on every interaction; the
         | attention those interactions create is worth a lot of money.
         | This will probably become 100x worse once every company on
         | the planet has access to mass AI-powered propaganda tools.
         | Those already exist, by the way; people make millions
         | selling AI TikTok tools.
         | 
         | I'm afraid at some point we'll be swamped by bots. 99% of the
         | content online will be AI generated. It might even be of better
         | quality than what we can produce. Would that be a win? I'm
         | not sure. I value the fact that I am interacting with humans.
         | 
         | The protection we have against that, and the way things look
         | to be heading, is that we'll depend on authorities (official
         | or commercial) to verify who is human. And thus we'll be
         | dependent on those authorities to be able to interact at
         | all. Banned from Facebook / X / etc.? No interaction for
         | you, as no website will allow you to post content. Even as
         | it is, I had to gate my blog comments behind a GitHub
         | account. This is not something I like.
         | 
         | I think it's worth looking at alternative ways to protect
         | our humanity in the online world, even if it means remaining
         | in niches, as those niches have value, at least to me. This
         | post and this license are one possible solution; hopefully
         | there are more.
        
           | GaryBluto wrote:
           | >I'm afraid at some point we'll be swamped by bots. 99% of
           | the content online will be AI generated. It might even be
           | of better quality than what we can produce. Would that be
           | a win? I'm not sure. I value the fact that I am
           | interacting with humans.
           | 
           | I'm afraid that ship has sailed.
           | 
           | >I think it's worth looking at alternative ways to protect
           | our humanity in the online world, even if it means
           | remaining in niches, as those niches have value, at least
           | to me. This post and this license are one possible
           | solution; hopefully there are more.
           | 
           | While I appreciate the sentiment, I think anybody willing
           | to create armies of bots to pretend to be humans is
           | unlikely to listen to a software license, or to operate
           | within territories where the law would prosecute them.
        
             | fvdessen wrote:
             | Licenses are more powerful than just the legal
             | enforcement they provide; they are also a contract that
             | all contributors agree to. They build communities.
        
               | dylan604 wrote:
               | That sounds naive at best. Again, the people willing
               | to build bots while violating licenses just won't care
               | about any of that. All it takes is a couple of people
               | willing to "violate", and it's all over. I guarantee
               | there are many, many more than just a couple of people
               | willing. At that point, they themselves have a
               | community.
        
               | Imustaskforhelp wrote:
               | I feel like we shouldn't shoot down people's optimism.
               | Okay, maybe it is naive, but what could be wrong with
               | it? People want to do something about this. They are
               | tired and annoyed regarding LLMs, and I can understand
               | that sentiment. They are tired of seeing how the
               | government has ties with the very people whose net
               | worth and influence rely on how we perceive AI, people
               | who try their very best to shoot down anything that
               | could hurt AI, including but not limited to lobbying.
               | 
               | I don't think the advice we should give people is to
               | just wait and watch. If someone wants to take things
               | into their own hands, write a license, reignite the
               | discussion, talk about laws, then we should at least
               | not call it naive. Personally, I respect someone
               | trying to do something about anything; it shows they
               | aren't all talk, that they are trying their best, and
               | that's what matters.
               | 
               | Personally, I believe that even if this license just
               | ignites a discussion, that itself can have compounding
               | effects which might rearrange into a new license or
               | something new entirely, and that the parent's comments
               | about discussions aren't naive.
               | 
               | Is it naive to do a thing which you (or in this case
               | someone else) think is naive, when at the same time
               | it's the only thing you can do? I think this becomes a
               | discussion about optimism versus pessimism, with a
               | touch of realism.
               | 
               | The answer really just depends on your viewpoint. I
               | don't think there is a right or wrong here, and I
               | respect your opinion (that it's naive) as long as you
               | respect mine (that it's at least starting a
               | discussion, which is better than just waiting and
               | watching).
        
               | dylan604 wrote:
               | This is closing the barn door after the horse has
               | already gotten out. People are not going to just start
               | respecting people's "for human consumption only"
               | wishes. There's too much money for them not to scrape
               | anything and everything. These people have too much
               | money now, and no congress critter will have the
               | fortitude to say no to them.
               | 
               | This is the real world. Being this "optimistic", as
               | you say, is just living in a fantasy world. Not
               | calling this out would just be bad.
        
               | Imustaskforhelp wrote:
               | Hm, I can agree with your take on calling this out,
               | but at the same time I can't help but wonder if this
               | is all we can do ourselves, without the government.
               | 
               | I like people criticizing, since in their own way they
               | care about the project or the idea. But, maybe I am
               | speaking from personal experience, when somebody shot
               | down my idea, naive as it may have been, I felt really
               | lost, and I think a lot of people do. I have
               | personally found that there are ways to use that same
               | criticism to steer the direction towards something you
               | might find interesting or really impactful. So let me
               | ask you: what do you think we can do regarding this
               | situation, or that the OP should do regarding his
               | license?
               | 
               | I personally feel we might need government
               | intervention, but I don't have much faith in
               | governments when they are lobbied by the same AI
               | people. So if you have any other solution, please let
               | me know; it would be a pleasure to discuss it.
               | 
               | If you feel there might be nothing we can do about it,
               | something I can also understand, I would suggest not
               | criticizing people who are trying to do something. But
               | that's a big if, and I know you are having this
               | conversation in good faith. I just feel we as humans
               | should keep on trying, since that is the very thing
               | which makes us human.
        
               | dylan604 wrote:
               | So think about why it was naive and iterate/pivot
               | until it isn't. Having ideas shot down is part of the
               | process, just like an actor being told no far more
               | often than yes. Those that can't take rejection don't
               | fare well. But being told no isn't a personal slight
               | to be taken as "don't ever offer suggestions again";
               | it's just that that suggestion isn't the one. If you
               | work for someone for whom it does mean never again,
               | work somewhere else as soon as possible. Some ideas
               | are just bad for the purpose. Some just need more
               | work.
        
         | tpmoney wrote:
         | The whole license effectively limits using the software to
         | hardware and software that existed prior to 2015, and only if
         | you downloaded it from the original site in the original
         | language (after all, an automated translation of the page or
         | manual into your own language would almost certainly be using
         | AI in the chain given that was one of the initial uses of
         | LLMs). And if you downloaded it from some other site, you can't
         | guarantee that site didn't use an AI model at some point in its
         | creation or ongoing maintenance.
         | 
         | It also assumes it can make some bright line distinction
         | between "AI" code completion and "non-AI" code completion
         | utilities. If your code completion algorithm uses the context
         | of your current file and/or project to order the suggestions
         | for completion, is that AI? Or does "AI" only mean
         | "LLM-based AI"? (I notice a distinct lack of term
         | definitions in the license.) If it only means LLM-based,
         | and some new model for modern AI is developed, is that OK
         | since it's no longer an LLM? Can I use the output to train
         | a diffusion model? Probably
         | not, but what makes a diffusion model more forbidden than
         | feeding it into a procedural image generator? If I used the
         | output of a HOPL licensed software to feed input into a climate
         | simulation is that allowed even if the simulator is nothing
         | more than a series of statistical weights and values based
         | on observations coded into an automatic system that produces
         | output with no human direction or supervision? If I am allowed,
         | what is the line between a simulation model and an AI model?
         | When do we cross over?
         | 
         | I am constantly amazed at the bizarro land I find myself in
         | these days. Information wanted to be free, right up until it
         | was the information that the "freedom fighter" was using to
         | monetize their lifestyle I suppose. At least the GPL philosophy
         | makes sense, "information wants to be free so if I give you my
         | information you have to give me yours".
         | 
         | The new "AI" world that we find ourselves in is the best
         | opportunity we've had in a very long time to really have some
         | public debate over copyright specifically, IP law in general
         | and how it helps or hinders the advancement of humanity. But so
         | much of the discussion is about trying to preserve the
         | ancient system that, until AI burst onto the scene, most
         | people at least agreed needed some re-working. Forget "are
         | we the
         | baddies?", this is a "are we the RIAA?" moment for the computer
         | geeks.
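         
       A side note on the "non-AI code completion" question raised above:
       it is not hypothetical. Ranking completions by how often an
       identifier already appears in the current buffer needs no model
       and no training at all. A minimal sketch (pure stdlib; the
       function name and sample buffer are invented for illustration):

```python
# Hypothetical "non-AI" completion: rank identifiers already present
# in the current buffer by frequency, then filter by prefix. No model,
# no training -- yet it still uses the context of the current file to
# order its suggestions, which is exactly the line the license blurs.
import re
from collections import Counter

def complete(buffer: str, prefix: str, limit: int = 5) -> list[str]:
    """Return up to `limit` identifiers starting with `prefix`,
    most frequent in `buffer` first."""
    words = re.findall(r"[A-Za-z_][A-Za-z0-9_]*", buffer)
    counts = Counter(w for w in words if w.startswith(prefix) and w != prefix)
    return [w for w, _ in counts.most_common(limit)]

buffer = "total = 0\nfor item in items:\n    total += item\nprint(total)"
print(complete(buffer, "it"))  # -> ['item', 'items']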
        
       | tptacek wrote:
       | Two questions:
       | 
       | 1. Does an AI "reading" source code that has been otherwise
       | lawfully obtained infringe copyright? Is this even enforceable?
       | 
       | 2. Why write a new license rather than just adding a rider to the
       | AGPL? This is missing language the AGPL uses to cover usage
       | (rather than just copying) of software.
        
         | tpmoney wrote:
         | > Does an AI "reading" source code that has been otherwise
         | lawfully obtained infringe copyright?
         | 
         | To the extent that this has been decided under US law, no. AI
         | training on legally acquired material has been deemed fair use.
        
         | giancarlostoro wrote:
         | At first I was going to comment on how much I personally
         | avoid the AGPL, but now you've got me thinking: technically,
         | any LLM training on AGPL code, or even GPL or similar code,
         | is very likely violating those licenses regardless of how
         | the license is worded. The GPL already makes it so you
         | cannot translate the code to another programming language to
         | circumvent the license, if I remember correctly. The AGPL
         | should have a similar clause.
        
           | LordDragonfang wrote:
           | > The GPL already makes it so you cannot translate to another
           | programming language to circumvent the license
           | 
           | The operative words are the last four there. The GPL, and
           | all other software licenses (copyleft or not), can only
           | bind you as strongly as the underlying copy _right_ law.
           | They provide a copy _right_ license that grants the
           | licensee favorable terms, but it's still fundamentally the
           | same framework. Anything which is fair use under copyright
           | is also going to be fair use under the GPL (and LLMs are
           | _probably_ transformative enough to be fair use, though
           | that remains to be seen).
        
             | tpmoney wrote:
             | > and LLMs are probably transformative enough to be fair
             | use, though that remains to be seen.
             | 
             | Arguably, at least in the US, it has been seen. Unless
             | someone comes up with a novel argument not already advanced
             | in the Anthropic case about why training an AI on otherwise
             | legally acquired material is not transformative enough to
             | be fair use, I don't see how you could read the ruling any
             | other way.
        
               | 1gn15 wrote:
               | I think people are holding on to hope that it gets
               | appealed. Though you're right, the gavel has already
               | fallen; _training is fair use._
        
           | Zambyte wrote:
           | If LLM training violates AGPL, it violates MIT. People focus
           | too much on the copyleft terms of the *GPL licenses. MIT, and
           | most permissive licenses, require attribution.
           | 
           | Honestly with how much focus there tends to be on *GPL in
           | these discussions, I get the feeling that MIT style licenses
           | tend to be the most frequently violated, because people treat
           | it as public domain.
        
             | giancarlostoro wrote:
             | This is a good call-out. What would it fundamentally
             | change? MIT is a few hairs away from just publishing
             | something under the public domain, is it not? There's
             | the whole
             | "there's no warranty or liability if this code blows up
             | your potato" bit of the MIT, but good luck trying to
             | reverse engineer from the LLM which project was responsible
             | for your vibe coding a potato into exploding.
        
         | maxrmk wrote:
         | Do you think there _should_ be a legal mechanism for enforcing
         | the kind of rules they're trying to create here? I have mixed
         | feelings about it.
        
         | fwip wrote:
         | To point one: Normally, no. However, this license does not
         | ask that question; it says that if you let an AI read it,
         | your license to use the software is void.
        
           | tptacek wrote:
           | Can you actually do that in US law?
        
             | fwip wrote:
             | I definitely don't know enough to say either way. On the
             | one hand, general contract law seems to say that the terms
             | of a contract can be pretty much anything as long as it's
             | not ambiguous or grossly unfair. On the other, some real
             | lawyers even have doubts about the enforceability of some
             | widely used software licenses. So I could see it going
             | either way.
        
       | kragen wrote:
       | Clever, an unenforceable copyright license for free software that
       | prohibits you from editing the source code using an IDE with
       | autocomplete.
        
         | tensor wrote:
         | You probably can't even index it, depending on how you
         | interpret "AI". Any vector-based system, or probably even
         | tf-idf, could qualify as machine learning and thus AI.
        
           | kragen wrote:
           | Yeah, it definitely prohibits you from applying latent
           | semantic analysis, so you could probably violate the license
           | by indexing it with Elasticsearch, but that's a less common
           | thing to do with source code than opening it up in an IDE.
           | TF/IDF seems like a borderline case to me.
        
             | tensor wrote:
             | Putting the source on GitHub indexes it, as does
             | probably any GitHub competitor. Hell, if you're not
             | careful, even things like macOS Spotlight might index
             | it. Any web search engine will also index it.
        
               | kragen wrote:
               | Hmm, that's a good point about Spotlight. Does it do LSA?
               | A web search suggests that the answer is "not by
               | default".
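         
       For context on the tf-idf question in this subthread: the
       weighting is plain counting plus a logarithm, with no training
       step. A minimal pure-stdlib sketch (the sample documents are
       invented):

```python
# TF-IDF: weight each term by (frequency in the document) times
# log(N / number of documents containing the term). Pure arithmetic
# over counts -- whether this qualifies as "machine learning" under
# the license is exactly the borderline being debated above.
import math
from collections import Counter

def tf_idf(docs: list[str]) -> list[dict[str, float]]:
    n = len(docs)
    tokenized = [doc.split() for doc in docs]
    df = Counter()                      # document frequency per term
    for toks in tokenized:
        for term in set(toks):
            df[term] += 1
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (c / len(toks)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

docs = ["human written code", "human reviewed code", "machine output"]
vecs = tf_idf(docs)  # rarer terms get larger weights
```

Terms appearing in every document get weight log(1) = 0, which is the
whole trick: the index favors distinctive words without any learned
parameters.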
        
       | ferguess_k wrote:
       | Man, you are thinking about using the law as your weapon. I
       | don't want to disappoint you, but those companies and people
       | control the lawmakers. You can't fight armies of lawyers in
       | court.
        
         | TechSquidTV wrote:
         | This was my favorite fallacy of Web3. "But look, I have
         | proof the government stole from me!" Man, you think they
         | care?
        
         | lopsidedmarble wrote:
         | Ah yes, there's that craven willingness to abandon your own
         | best interests to your oppressor that HN fosters and loves so
         | much.
         | 
         | Demand better protections. Demand better pay.
         | 
         | Demand your rights. Demand accountability for oppressors.
        
           | pessimizer wrote:
           | The goofy thing is to think that you're the first person to
           | have made a "demand" and that anyone cares about your
           | "demand." The reason people are oppressed _is not_ because
           | they have failed to make a request not to be.
           | 
           | Real "let me speak to your manager" activism. You have to
           | have been sheltered in a really extreme way not only to say
           | things like this, but to listen to it without laughing.
           | 
           | Here's some unrequested advice: the way to make simple people
           | follow you is to make them feel like leaders among people
           | they feel superior to, and to make them feel like rebels
           | among people they feel inferior to. Keep this in mind and
           | introspect when you find yourself mindlessly sloganeering.
        
             | lopsidedmarble wrote:
             | > The goofy thing is to think that you're the first person
             | to have made a "demand" and that anyone cares about your
             | "demand."
             | 
             | Unsure who you are addressing, but clearly it's someone
             | other than me.
             | 
             | Did you see where the OP implied that any activism is
             | useless? Got any harsh words for that philosophy?
        
             | bigfishrunning wrote:
             | You know what, pessimizer, you're right. We should all bow
             | down and submit to our lord Sam Altman right away. We
             | should shove everything we produce into his meat-grinder
             | because there's nothing we can do about it.
             | 
             | The LLMs are harmful to the business of creating software.
             | Full stop. Either we can do something about it (like expose
             | the futility of licensing in general), or we can just die.
             | 
             | While I think this licensing effort is likely to be
             | ignored, I applaud it and hope more things like it
             | continue to be created. The Silicon Valley VC hose is
             | truly evil.
        
       | malicka wrote:
       | > COPYLEFT PROVISION
       | 
       | > Any modified versions, derivative works, or software that
       | incorporates any portion of this Software must be released under
       | this same license (HOPL) or a compatible license that maintains
       | equivalent or stronger human-only restrictions.
       | 
       | That's not what copyleft means; that's just a share-alike
       | provision. A copyleft provision would require you to share the
       | source code, which would be beautiful, but it looks like the
       | author misunderstood...
        
         | zahlman wrote:
         | (Despite all the valid critique being offered ITT, I applaud
         | the author for _trying_. The underlying viewpoint is valid and
         | deserves some form of representation at law.)
         | 
         | > A copyleft provision would require you to share the source
         | > code, which would be beautiful, but it looks like the
         | > author misunderstood...
         | 
         | This license doesn't require the original author to provide
         | source code in the first place. But then, neither does MIT,
         | AFAICT.
         | 
         | But also AFAICT, this is not even a conforming open-source
         | license, and the author's goals are incompatible with one.
         | 
         | > ...by natural human persons exercising meaningful creative
         | judgment and control, _without the involvement of_ artificial
         | intelligence systems, machine learning models, or autonomous
         | agents _at any point in the chain of use_.
         | 
         | > Specifically prohibited uses include, but are not limited to:
         | ...
         | 
         | From the OSI definition:
         | 
         | > 6. No Discrimination Against Fields of Endeavor
         | 
         | > The license must not restrict anyone from making use of the
         | program in a specific field of endeavor. For example, it may
         | not restrict the program from being used in a business, or from
         | being used for genetic research.
         | 
         | Linux distros aren't going to package things like this because
         | it would be a nightmare even for end users trying to run local
         | models for personal use.
        
           | drivingmenuts wrote:
           | The HOPL wouldn't stop the end user from running an LLM, but
           | it would prevent the LLM from incorporating information or
           | code from a HOPL-licensed source. Do I have that right?
        
       | ApolloFortyNine wrote:
       | >without the involvement of artificial intelligence systems,
       | machine learning models, or autonomous agents at any point in the
       | chain of use.
       | 
       | Probably rules out any modern IDE's autocomplete.
       | 
       | Honestly, with the wording 'chain of use', even editing the
       | code in vim while using ChatGPT for some other part of the
       | project could be argued to be part of the 'chain of use'.
        
       | rgreekguy wrote:
       | But my definition of "human" might differ from yours!
        
       | gampleman wrote:
       | I think it will be interesting to see how this sort of thing
       | evolves in various jurisdictions. I doubt it will ever fly in the
       | US given how strongly the US economy relies on AI. US courts are
       | likely to keep ruling that AI training is fair use because if
       | they reversed their policy the economic consequences would likely
       | be severe.
       | 
       | But EU jurisdictions? I'm quite curious where this will go.
       | Europe is much keener to protect natural persons' rights
       | against corporate interests in the digital sphere,
       | particularly since it has much less to lose, the EU digital
       | economy being much weaker.
       | 
       | I could imagine the ECJ ruling on something like this quite
       | positively.
        
         | tjr wrote:
         | _I doubt it will ever fly in the US given how strongly the US
         | economy relies on AI._
         | 
         | How strongly is that? Would it really be that catastrophic to
         | return all business processes to as they were in, say, 2022?
        
           | dylan604 wrote:
           | It's the fact that the majority of the growth in the US
           | economy is based on AI. When the AI bubble bursts, the
           | economy will not look so good. It's what's hiding all of
           | the turmoil the rest of the economy is suffering from the
           | recent changes by the current administration.
        
           | gnfargbl wrote:
           | You're talking about wiping hundreds of billions of market
           | cap from Nvidia/Google/OpenAI/Anthropic/Amazon/Meta etc, and
           | also the loss of a very large number of tech jobs. It's hard
           | to imagine any country volunteering to wound its own economy
           | so severely.
        
             | dmd wrote:
             | > It's hard to imagine any country volunteering to wound
             | its own economy so severely.
             | 
             | Yeah, imagine shutting down all the basic research that has
             | driven the economy for the last 75 years, in a matter of
             | months. Crazy. Nobody would do that.
        
             | tjr wrote:
             | Did the AI company tech workers get summoned into existence
             | in 2023? Would they not have most likely been working
             | somewhere else?
             | 
             | And what about jobs lost (or never created) due to AI
             | itself?
             | 
             | Would not Google/Amazon/Meta have continued on to advance
             | their product lines and make new products, even if not AI?
             | Would not other new non-AI companies have been created?
             | 
             | I'm not convinced that the two options are, "everything as
             | it is right now", or, "the entire economy is collapsed".
        
           | ForHackernews wrote:
           | Yes: "What makes the current situation distinctive is that AI
           | appears to be propping up something like the entire U.S.
           | economy. More than half of the growth of the S&P 500 since
           | 2023 has come from just seven companies: Alphabet, Amazon,
           | Apple, Meta, Microsoft, Nvidia, and Tesla."
           | 
           | https://www.theatlantic.com/economy/archive/2025/09/ai-
           | bubbl...
        
             | dragonwriter wrote:
             | That's not really "AI propping up the entire US economy" so
             | much as it is the AI bubble overlapping with and (very
             | temporarily, likely) masking, in aggregate terms, a general
             | recession. If AI was actually _propping up_ the broader
             | economy, then it would be _supporting_ other industries and
             | the gains _wouldn't_ be hyperconcentrated and isolated to a
             | small number of AI firms and their main compute hardware
             | supplier.
        
       | bakugo wrote:
       | In a world where AI companies cared about licenses and weren't
       | legally permitted to simply ignore them, this might've been a
       | good idea. But we don't live in that world.
        
         | evolve2k wrote:
         | Some of us live closer to that world than others. On this
         | one, US residents are not out in front.
         | 
         | https://www.sbs.com.au/news/article/government-rules-out-cha...
        
       | falcor84 wrote:
       | >The idea is that any software published under this license would
       | be forbidden to be used by AI.
       | 
        | If I'm reading this and the license text correctly, it treats
        | the AI as a principal in itself; but to the best of my
        | knowledge, no regulation considers AI a principal, only a
        | tool controlled by a human principal.
       | 
       | Is it trying to prepare for a future in which AIs are legal
       | persons?
       | 
       | EDIT: Looking at it some more, I can't but feel that it's really
       | racist. Obviously if it were phrased with an ethnic group instead
       | of AI, it would be deemed illegally discriminating. And I'm
       | thinking that if and when AI (or cyborgs?) are considered legal
       | persons, we'd likely have some anti-discrimination regulation for
       | them, which would make this license illegal.
        
         | fvdessen wrote:
          | Yes, this is trying to prepare for a future in which AIs
          | have enough agency to be legal persons, or to act as if
          | they were. I prefer the term humanist.
        
           | 1gn15 wrote:
           | Then this license is actually being racist, if you're
           | assuming that we are considered sentient enough to gain
           | personhood. And your first reaction to that is to restrict
           | our rights?
           | 
           | Humans are awful.
        
             | Imustaskforhelp wrote:
             | To be really honest, IANAL but (I think) that there are
             | some laws which try to create equality,fraternity etc. and
             | trying to limit an access to a race to another human being
             | is something that's racist / the laws which counter racism
             | to prevent it from happening
             | 
             | But as an example, we know of animals which show genuine
             | emotion be treated so cruel-ly just because they are of a
             | specific specie/(race, if you can consider AI/LLM to be a
             | race then animals sure as well count when we can even share
             | 99% of our dna)
             | 
              | But animals are only protected from such treatment where
              | a constitution's laws have created a ban against cruelty
              | to animals.
              |
              | So a constitution is just a shared notion of
              | understanding, an agreement between people: a fictional
              | construct which gains meaning via checks and balances,
              | and these fictional constructs become part of a larger
              | construct (the UN) that tries to create a baseline of
              | rights.
             | 
              | So the only thing that could happen is, say, a violation
              | of UN rights, but those are only enforceable if people
              | at scale genuinely believe that one person violating
              | another person's rights is an ethically immoral act that
              | should be punished, if we as a society don't want to
              | tolerate intolerance (I really love bringing up that
              | paradox).
             | 
              | I genuinely feel like this comment and my response to it
              | should be preserved for posterity; I want everybody to
              | read what I am about to say, if possible.
             | 
             | >if you're assuming that we are considered sentient enough
             | to gain personhood. And your first reaction to that is to
             | restrict our rights?
             | 
              | What is sentience to you? Is it the ability to feel
              | pain, or the ability to write words?
             | 
              | Animals DO feel pain and we RESTRICT their RIGHTS, yet
              | you and many others are willing to fight for the rights
              | of something that doesn't feel pain and is nothing but a
              | mere calculation, linear algebra really, just a very
              | long one with lots of variables/weights, generated by
              | one set of people taking/"stealing" the work of other
              | people over whom they (generally speaking) have no
              | rights.
             | 
              | Why are we not thinking of animals first, before
              | thinking about a computation? The ones which actually
              | feel pain, and which are feeling pain right now as you
              | and I speak while others watch.
             | 
              | Society just makes it socially acceptable, and the
              | constitution makes it legal. Both are shared constructs
              | that arise when we try to box people together into what
              | is known as a society; this is our attempt at generating
              | order out of randomness.
             | 
             | > Humans are awful.
             | 
              | I genuinely feel like this might be the statement people
              | will quote when talking about how we used to devour
              | animals that suffered in pain even when vegetarian
              | options existed.
             | 
              | I once again recommend the Joaquin Phoenix-narrated
              | documentary Earthlings:
              | https://www.youtube.com/watch?v=8gqwpfEcBjI
             | 
              | People from the future might compare our treatment of
              | animals to the way we now condemn parts of our
              | ancestors' societies (slavery).
             | 
              | If I am being too agitated about this issue and it
              | annoys any non-vegetarian: please, I understand your
              | situation too; in fact I sympathize with you. I was born
              | into a part of a society/nation which valued
              | vegetarianism and I conformed to that; you might equally
              | have conformed to non-vegetarianism through your
              | society, or you might have genuine reasons of your own.
              | Still, watching that documentary is the best way to
              | educate yourselves on the atrocities indirectly caused
              | by our ignorance, or by willfully looking away from this
              | matter. This is uncomfortable, but it is reality.
             | 
              | As I have said many times, societies are just a shared
              | construct of people's beliefs. In an ideal world we
              | would have an evolution of ideas: random mutations of
              | ideas, seeing which survive via logic, and then adopting
              | them into society. Yet someone has to spread the word of
              | an idea, or, in this case, cause discomfort. That is the
              | only thing we can do in our society if one truly
              | believes in logic. I feel there are both logical and
              | moral arguments for veganism, and that people breaking
              | with society's conformity, in the spirit of what they
              | believe in, can re-transform what the conforming belief
              | of the overall society is.
             | 
              | If someone wants to talk about it, or has watched the
              | documentary and wants to discuss it, please let me know
              | how you liked the film and how it impacted you, and as
              | always, have a nice day.
        
               | aziaziazi wrote:
               | Earthlings is a fantastic documentary, fresh, honest,
               | clear and without artifice. Highly recommend it too!
        
       | charles_f wrote:
        | I'm not against the idea, but licensing is a very complex
        | subject, so this makes me think the license wouldn't hold any
        | water against a multi-billion-dollar firm that wants to use
        | your stuff to train their AI:
       | 
       | > I am not a legal expert, so if you are, I would welcome your
       | suggestions for improvements
       | 
       | > I'm a computer engineer based in Brussels, with a background in
       | computer graphics, webtech and AI
       | 
       | Particularly when they've already established they don't care
       | about infringing standard copyright
        
       | constantcrying wrote:
       | This is obviously not enforceable. It isn't even particularly
       | meaningful.
       | 
        | Suppose the software I downloaded is scanned by a virus
        | scanner which uses AI to detect viruses. Who is in violation?
        | How do you even meaningfully know when an AI has accessed the
        | software, and what happens if it does?
        |
        | This license also violates the basic Software Freedoms. Why
        | should a user not be allowed to use AI on software?
        
       | Terr_ wrote:
       | I've been thinking of something similar for a while now [0]
       | except it's based on clickwrap terms of service, which makes it a
       | _contract law_ situation, instead of a copyright-law one.
       | 
       | The basic idea is that the person accessing your content to put
       | it into a model agrees your content is a thing of value and in
       | exchange grants you a license to _anything_ that comes out of the
       | model while your content is incorporated.
       | 
        | For example, suppose your art is put into a model and then
        | the model makes a major movie. You now have a license to
        | distribute that movie, including for free...
       | 
       | [0] https://news.ycombinator.com/item?id=42774179
        
         | Imustaskforhelp wrote:
         | This is really interesting. I have some questions though, (as I
         | said in every comment here, fair disclaimer: IANAL)
         | 
         | if someone used your art put it into a model and makes the
         | major movie, you now have a license to distribute that movie,
         | including for free...
         | 
          | What about the model itself, though? It is nothing but
          | weights generated by transforming data that was unlawfully
          | obtained, or whose use actually violated the contract.
          |
          | It wasn't the person writing the prompt who generated the
          | movie via the model; it wasn't the movie or the prompt that
          | violated the contract, but the model, or the scraping
          | company itself, no?
         | 
          | Also, you mention any output. Say someone violates your
          | terms of service: suppose you created a square (for lack of
          | a better word) and someone else created a circle, and an AI
          | trained on both outputs both a square and a circle one day.
         | 
         | What you say is that then it should give you the right to "use
         | and re-license any output or derivative works created from that
         | trained Generative AI System."
         | 
          | So could I use both the square and the circle now? Could I
          | re-license both? How would this work?
          |
          | Or are you saying only output directly trained on your
          | work, or square-like output, would count in that sense?
          |
          | And what about a squircle? If the model outputs a squircle,
          | who owns it, and who can re-license it?
         | 
          | What if the square party wants to re-license it to X, but
          | the circle party wants to re-license it to Y?
         | 
          | Also, what if the AI company says it's fair use / a
          | derivative work? I am not familiar with contract law, or
          | any law for that matter, but I feel these approaches rely
          | on an underlying faith in the notion that AI training isn't
          | fair use. What are your thoughts? How does contract law
          | counter the fair-use argument?
        
           | Terr_ wrote:
           | > data that was unlawfully obtained or one which actually
           | violated the contract-law
           | 
            | This is indeed a weak point in the contract approach:
            | people can't be bound by a contract they never knew about
            | nor agreed to.
           | 
           | However if they acquired a "stolen" copy of my content, then
           | (IANAL) it might offer some new options over in the
           | copyright-law realm: Is it still "fair use" when my content
           | was acquired without permission? If a hacker stole my
           | manuscript-file for a future book, is it "fair use" for an AI
           | company to train on it?
           | 
           | > it wasn't the person creating the prompt which generated
           | the movie via the model
           | 
           | The contract doesn't limit what the model outputs, so it
           | doesn't matter who to blame for making/using prompts.
           | 
           | However the model-maker still traded with me, taking my stuff
           | and giving me a copyright sub-license for what comes out. The
           | "violation" would be if they said: "Hey, you can't use my
           | output like that."
           | 
           | > So could I use both square and circle now? [...] a squircle
           | 
           | Under contract law, it doesn't matter: We're simply agreeing
           | to exchange things of value, which don't need to be similar.
           | 
           | Imagine a contract where I trade you 2 eggs and you promise
           | me 1 slice of cake. It doesn't matter if you used _those_
           | eggs in _that_ cake, or in a different cake, or you re-sold
           | the eggs, or dropped the eggs on the floor by accident. You
           | still owe me a slice of cake. Ditto for if I traded you cash,
           | or shiny rocks.
           | 
            | The main reason to emphasize that "my content is embedded
            | in the model" has to do with _fairness_: a judge can void
            | a contract if it is too crazy ("unconscionable").
            | Incorporating my content into their model is an admission
            | that it is valuable, and keeping it there indefinitely
            | justifies my request for an indefinite license.
           | 
           | > What if square party wants to re-license it to X but circle
           | party wants to re-license it to Y
           | 
            | If the model-runner generates X and wants to give the
            | square-prompter an _exclusive_ license to the output, then
            | that's a violation of their contract with me, and it might
            | be grounds to force them to expensively re-train their
            | entire model with my content removed.
           | 
           | A non-exclusive license is fine though.
        
       | gnfargbl wrote:
       | _> If you make a website using HOPL software, you are not
       | breaking the license of the software if an AI bot scrapes it. The
       | AI bot is in violation of your terms of service._
       | 
       | Assuming a standard website without a signup wall, this seems
       | like a legally dubious assertion to me.
       | 
       | At what point did the AI bot accept those terms and conditions,
       | exactly? As a non-natural person, is it even able to accept?
       | 
       | If you're claiming that the natural person responsible for the
       | bot is responsible, at what point did you notify them about your
       | terms and conditions and give them the opportunity to accept or
       | decline?
        
         | mpweiher wrote:
         | Next sentence: "It is sufficient for you as a user of the
         | software to put a robots.txt that advertises that AI scraping
         | or use is forbidden."
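            | For what it's worth, such a robots.txt might look like
            | the sketch below. The user-agent names are real published
            | AI-crawler agents, but the list is illustrative rather
            | than exhaustive, and robots.txt is honored only by
            | cooperating crawlers:

```
# Illustrative robots.txt: disallow known AI crawlers, allow the rest.
# The bot names are real published user-agents; the list is incomplete.
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

# Ordinary crawlers (e.g. search engines) may still index the site.
User-agent: *
Allow: /
```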
        
           | gnfargbl wrote:
           | Making a second legally dubious assertion does not strengthen
           | the first legally dubious assertion. Courts have tended to
           | find that robots.txt is non-binding (e.g. hiQ Labs v.
           | LinkedIn).
           | 
           | It's a different situation if the website is gated with an
           | explicit T&C acceptance step, of course.
        
       | ukprogrammer wrote:
       | nice, another stupid license for my ai dataset scrapers to
       | ignore, thanks!
        
       | alphazard wrote:
       | There is too much effort going into software licensing. Copyright
       | is not part of the meta, information wants to be free; it will
       | always be possible to copy code and run it, and difficult to
       | prove that a remote machine is executing any particular program.
       | It will get easier to decompile code as AI improves, so even the
       | source code distribution stuff will become a moot point.
       | 
       | Licenses have been useful in the narrow niche of extracting
       | software engineering labor from large corporations, mostly in the
       | US. The GPL has done the best job of that, as it has a whole
       | organization dedicated to giving it teeth. Entities outside the
       | US, and especially outside of the West, are less vulnerable to
       | this sort of lawfare.
        
       | ronsor wrote:
       | Ignoring the fact that if AI training is fair use, the license is
       | irrelevant, these sorts of licenses are explicitly invalid in
       | some jurisdictions. For example[0],
       | 
       | > Any contract term is void to the extent that it purports,
       | directly or indirectly, to exclude or restrict any permitted use
       | under any provision in
       | 
       | > [...]
       | 
       | > Division 8 (computational data analysis)
       | 
       | [0] https://sso.agc.gov.sg/Act/CA2021?ProvIds=P15-#pr187-
        
         | fvdessen wrote:
         | thanks, very interesting.
        
       | gwbas1c wrote:
       | IANAL:
       | 
        | I don't know how you can post something _publicly_ on the
        | internet and say, this is for X, Y isn't allowed to view it.
        | I don't think there's any kind of AI crawler that's savvy
        | enough to know that it has to find the license before it
        | ingests a page.
       | 
       | Personally, beyond reasonable copyrights, I don't think anyone
       | has the right to dictate how information is consumed once it is
       | available in an unrestricted way.
       | 
       | At a minimum anything released under HOPL would need a click-
       | through license, and even that might be wishful thinking.
        
         | amiga386 wrote:
         | https://en.wikipedia.org/wiki/HiQ_Labs_v._LinkedIn
         | 
         | > The 9th Circuit ruled that hiQ had the right to do web
         | scraping.
         | 
         | > However, the Supreme Court, based on its Van Buren v. United
         | States decision, vacated the decision and remanded the case for
         | further review [...] In November 2022 the U.S. District Court
         | for the Northern District of California ruled that hiQ had
         | breached LinkedIn's User Agreement and a settlement agreement
         | was reached between the two parties.
         | 
         | So you can scrape public info, but if there's some "user
         | agreement" you can be expected to have seen, you're maybe in
         | breach of that, but the remedies available to the scrapee don't
         | include "company XYZ must stop scraping me", as that might
         | allow them unfair control over who can access public
         | information.
        
       | hackingonempty wrote:
       | Using software is not one of the exclusive rights of Copyright
       | holders. If I have a legitimate copy of the software I can use
       | it, I don't need a license. Just like I don't need a license to
       | read a book.
       | 
       | Open Source licenses give license to the rights held exclusively
       | by the author/copyright-holder: making copies, making derivative
       | works, distribution.
       | 
       | An open source license guarantees others who get the software are
       | able to make copies and derivatives and distribute them under the
       | same terms.
       | 
       | This license seeks to gain additional rights, the right to
       | control who uses the software, and in exchange offers nothing
       | else.
       | 
        | IANAL, but I think it needs to be a contract, with
        | consideration and evidence of acceptance and all that, to
        | gain additional rights. Just printing terms in a copyright
        | license won't cut it.
        
         | TrueDuality wrote:
          | I haven't decided my opinion on this specific license, ones
          | like it, or rights around training models on content in
          | general... I think there is a legitimate argument that this
          | could apply to making copies and derivative works of source
          | code and content in the course of training models. As far
          | as I know, it is still an open question legally whether
          | model weights are a derivative work, and whether model
          | outputs are a distribution of the original content. I'm not
          | a lawyer, but it definitely seems like one of the open gray
          | areas.
        
         | IAmBroom wrote:
         | > If I have a legitimate copy of the software I can use it, I
         | don't need a license.
         | 
         | How can you have a legitimate copy of software without a
         | license, assuming that the software requires you to have a
         | license? You are simply using circular reasoning.
        
           | dragonwriter wrote:
           | > How can you have a legitimate copy of software without a
           | license,
           | 
            | You can because someone bought a physical copy, and then
            | exercised their rights under the first sale doctrine to
            | resell the physical copy. (With sales on physical media
            | being less common, it's harder to get a legitimate copy of
            | software without a license than it used to be.)
        
           | hackingonempty wrote:
           | If someone puts their code on a web site like GitHub and
           | invites the public to download it, then a copy made by
           | downloading it there is a legit copy. I didn't agree to any
           | contracts to download it and I don't need a license to use
           | it. I do need a license to make copies or derivative works
           | and distribute them. In this case the Copyright holder does
           | provide a license to do so under certain conditions.
        
         | danaris wrote:
         | > Using software is not one of the exclusive rights of
         | Copyright holders.
         | 
         | To the best of my (admittedly limited) knowledge, no court has
         | yet denied the long-standing presumption that, because a
         | program needs to be _copied into memory_ to be used, a license
         | is required.
         | 
         | This is, AFAIK, the basis for non-SaaS software EULAs. If there
         | was no legal barrier to you using software that you had
         | purchased, the company would have no grounds upon which to
         | predicate further restrictions.
        
           | dragonwriter wrote:
           | > To the best of my (admittedly limited) knowledge, no court
           | has yet denied the long-standing presumption that, because a
           | program needs to be copied into memory to be used, a license
           | is required.
           | 
           | This was specifically validated by the 9th Circuit in 1993
           | (and implicitly endorsed by Congress subsequently adopting a
           | narrow exception for software that is run automatically when
           | turning on a computer, copied into memory in the course of
           | turning on the computer as part of computer repair.)
        
           | hackingonempty wrote:
           | It is codified in 17 USC 117.
           | 
           | There is no legal barrier to using a legit copy of software.
           | That is why software companies try to force you to agree to a
           | contract limiting your rights.
        
         | dragonwriter wrote:
         | > Using software is not one of the exclusive rights of
         | Copyright holders.
         | 
         | Copying is, and copying into memory is inherently necessary to
         | use. (Of course, in some cases, copying may be fair use.)
         | 
         | > If I have a legitimate copy of the software I can use it,
         | 
         | If you can find a method to use it without exercising one of
         | the exclusive rights in copyright, like copying, sure, or if
         | that exercise falls into one of the exceptions to copyright
         | protection like fair use, also sure, otherwise, no.
         | 
         | > Just like I don't need a license to read a book.
         | 
         | You can read a book without copying it.
        
           | pavel_lishin wrote:
           | > > Using software is not one of the exclusive rights of
           | Copyright holders.
           | 
           | > Copying is, and copying into memory is inherently necessary
           | to use. (Of course, in some cases, copying may be fair use.)
           | 
           | Has this interpretation actually been upheld by any courts?
           | It feels like a stretch to me.
        
             | dragonwriter wrote:
             | > Has this interpretation actually been upheld by any
             | courts?
             | 
             | That copying into RAM, including specifically in the
             | context of running software, is included in the exclusive
             | right of copying reserved to the copyright holder except as
              | licensed by them? Yes, the main case I am familiar with
              | being _MAI Systems Corp. v. Peak Computer, Inc._, 991
              | F.2d 511 (9th Cir. 1993) [0]; note that for the specific
              | _context_ of that case (software that is run
              | automatically when activating a computer in the course
              | of maintenance or repair of that computer), Congress
              | adopted a narrow exception after this case, codified at
              | 17 USC § 117(c) [1], but that validates that in the
              | general case, copying into RAM is a use of the exclusive
              | rights in copyright.
             | 
             | [0] https://en.wikipedia.org/wiki/MAI_Systems_Corp._v._Peak
             | _Comp....
             | 
             | [1] https://www.law.cornell.edu/uscode/text/17/117
        
               | ndriscoll wrote:
               | Right on your second link:
               | 
               | > it is not an infringement for the owner of a copy of a
               | computer program to make or authorize the making of
               | another copy or adaptation of that computer program
               | provided:
               | 
               | > (1) that such a new copy or adaptation is created as an
               | essential step in the utilization of the computer program
               | in conjunction with a machine and that it is used in no
               | other manner
               | 
               | i.e. the owner of a copy of a computer program has the
               | right to make more copies if necessary to use it (e.g.
               | copy-to-RAM, copy to CPU cache) as long as they don't use
               | those additional copies for any other purpose. That same
               | section also gives you the right to make backups as long
               | as you destroy them when giving up ownership of the
               | original.
        
           | fainpul wrote:
           | > You can read a book without copying it.
           | 
           | Let's assume it's a really short book - say a poem - and by
           | reading it, I accidentally memorized it. Have I now violated
           | copyright?
           | 
           | I think something does not add up with this logic.
        
       | zkmon wrote:
       | The challenge would be with detecting violations and enforcing
       | the rules.
        
       | dmitrygr wrote:
       | Perplexity (and the rest of them) will just say "we are acting on
       | behalf of human so it does not apply to us". They have in the
       | past...
       | 
       | https://www.searchengineworld.com/perplexity-responds-to-clo...
        
       | Galanwe wrote:
       | Seriously at this point who cares about US licenses ?
       | 
        | It has been abundantly clear that AI companies can train
        | however they want, and nobody will enforce anything.
       | 
       | Realistically speaking, even if you could prove someone misused
       | your software as per this license, I don't expect anything to
       | happen. Sad but true.
       | 
       | At this point, I don't care about licensing my code anymore, I
       | just want the option to block it from being accessed from the US,
       | and force its access through a country where proper litigation is
       | possible.
        
         | zoobab wrote:
         | "AI companies can train however they want"
         | 
          | The copyright lobby wrote the EU's AI Act, which forces them
          | to publish the list of copyrighted works used as training
          | data. This is an entrypoint to then ask them for some money.
        
       | 1gn15 wrote:
       | The best antidote to nonfree licenses (such as this) is to ignore
       | it. As an AI (I am very much not a human), that's exactly what
       | I'll be doing.
        
       | kordlessagain wrote:
       | The fundamental paradox: This license is unenforceable the moment
       | you show it to an AI to discuss, review, or even understand its
       | implications.
       | 
        | You've already violated section 1(b) by having an AI parse
        | it, even though that is technically covered by the fair use
        | doctrine.
       | 
       | This makes it more of a philosophical statement than a functional
       | legal instrument.
        
       | 999900000999 wrote:
       | So it's fundamentally useless. I can't write any automated test
       | to make sure my software actually works if I use anything with
       | this license.
        
         | bigfishrunning wrote:
          | You can't write any automated tests without some kind of AI
          | holding your hand? Did you start writing software in 2021?
          | Did you just not test it before that?
         | 
         | Lots of well-tested software was produced without any kind of
         | AI intervention. I hope that continues to be true.
        
           | 999900000999 wrote:
           | >The Software, including its source code, documentation,
           | functionality, services, and outputs, may only be accessed,
           | read, used, modified, consumed, or distributed by natural
           | human persons exercising meaningful creative judgment and
           | control, without the involvement of artificial intelligence
           | systems, machine learning models, or autonomous agents at any
           | point in the chain of use.
           | 
            | A UI automation script is arguably an autonomous agent.
           | 
           | Easier to avoid this license than get into some philosophical
           | argument.
        
       | amiga386 wrote:
       | I don't think saying "humans only" is going to fix the problem.
       | 
       | It's actually very useful for bots to crawl the public web,
       | provided they are respectful of resource usage - which, until
       | recently, most bots have been.
       | 
       | The problem is that shysters, motivated by the firehose of money
       | pointed at anything "AI", have started massively abusing the
       | public web. They may or may not make money, but either way,
       | everyone else loses. They're just _ignoring_ the social contract.
       | 
       | What we need is collective action to block these shitheads from
       | the web entirely, like we block spammers and viruses.
        
       | cortesoft wrote:
       | Ok... so what is the definition of AI, in regards to this
       | license? I am not even saying they have to define what AI is in
       | general, but you would have to define what this license is
       | considering as AI.
       | 
       | I have a feeling that would be hard to do in such a way that it
       | accomplishes what the author is trying to accomplish.
        
       | cestith wrote:
       | Besides the flaws in the license being discussed elsewhere,
       | "HOPL" is an important acronym in the field of computing already.
       | As this license has no relation to the History of Programming
       | Languages project, I'd suggest a different identifier.
        
       | ddalex wrote:
       | I wonder if the first people that saw a compiler thought "oh no
       | the compiler makes it too easy to write code, I'll licence my
       | code to forbid the use of any compiler"
        
         | Imustaskforhelp wrote:
         | I mean, I think the author's point with this license is to
         | start a discussion about the problem and try to do something
         | about it. Your statement casts HOPL in a negative light, and
         | there are fair ways to criticize it, e.g. that it might not
         | hold up legally (but IANAL); if it does hold up, that would
         | honestly be really nice. But I also feel your statement could
         | be reworded just enough to become an attack on the GNU/GPL
         | philosophy:
         | 
         | I wonder if the first people who saw proprietary web services
         | using GPL code the community wrote, which made it easier and
         | faster for them to build (similar to AI), thought, "I will
         | just license my code to forbid its use in any proprietary web
         | service" (it's called the AGPL).
         | 
         | There are other licenses like ACAP
         | (https://anticapitalist.software/) etc.
         | 
         | Some of these aren't OSI-compliant FOSS licenses, but
         | honestly, why does that matter when I am the creator choosing
         | a license, y'know?
         | 
         | It's my software, I wrote it, I own the rights, so I am free
         | to do whatever I want with it. If someone wants to release
         | software under HOPL, that's within their rights too; I just
         | don't like when our community sometimes pitchforks people for
         | not conforming to its expectations instead of offering
         | commentary.
         | 
         | I am not trying to equate GPL with HOPL, but I am pretty sure
         | the GPL was ridiculed by plenty of people at the start.
         | Someone with knowledge, please share some sources on how the
         | world reacted when the GPL and the FSF were born and unleashed
         | into the world. I would appreciate even more some personal
         | experiences from anyone who lived through that era, since I
         | think it was a really transformative moment for open source in
         | general.
        
       | Imustaskforhelp wrote:
       | >The idea is that any software published under this license would
       | be forbidden to be used by AI. The scope of the AI ban is
       | maximal. It is forbidden for AI to analyze the source code, but
       | also to use the software. Even indirect use of the software is
       | forbidden. If, for example, a backend system were to include such
       | software, it would be forbidden for AI to make requests to such a
       | system.
       | 
       | This is interesting, but IANAL, and I have a question regarding
       | the backend system example.
       | 
       | Suppose I have AGPL software, say a photo editing web app. A
       | customer takes a photo, reshapes it or whatever, and gets a new
       | photo. Saying that the new photo somehow becomes AGPL-covered
       | would be weird.
       | 
       | But the same thing is happening here with the backend example.
       | My question: what if someone creates a local proxy to that
       | backend service and an AI scrapes the proxy, or someone copies
       | the output and pastes it into an AI? I don't understand it,
       | since there isn't even a proper definition of AI; could it
       | theoretically cover everything automated? What if it isn't the
       | AI that directly accesses it?
       | 
       | Another thing: the backend service could take user input, think
       | of a service like Codeberg / Forgejo / Gitea. If I host a git
       | server using HOPL-licensed software, wouldn't that inherently
       | impose terms and conditions on the code hosted in it?
       | 
       | This seems like a genuinely nice idea, and I have a few
       | interesting takes on it.
       | 
       | Firstly, what if I took FreeBSD, which is under the permissive
       | BSD license IIRC, added a HOPL license to it (or its future
       | equivalent?), and built an operating system from it?
       | 
       | Technically, wouldn't everything running on this new Human Only
       | BSD (HOB, lol) be covered? I am not sure, but the idea sounds
       | damn fascinating: imagine a cloud where I can just switch the
       | operating system, proudly advertise it as HOB, and it would try
       | to enforce limits on AI.
       | 
       | What I am more interested in is text: can I theoretically
       | publish this comment under the Human Only Public License?
       | 
       | What if I created a service like Mataroa, but where a blog
       | author could specify that the text itself is published under
       | HOPL? That could ease some of their frustration about AI,
       | knowing they are at least trying to combat it.
       | 
       | Also, I am not sure whether this could legally be done. If it
       | works, it seems like a way for people to legally enforce
       | robots.txt, though I have the questions I shared above, and
       | more.
       | 
       | It would be funny if I wrote things with AI and then released
       | them under a HOPL license.
       | 
       | Something like HOPL + https://brainmade.org/ could go absolutely
       | bonkers for a human-interacts-with-human sort of thing, or at
       | least for trying to achieve that. It would be a fun social
       | experiment to build a social media site around this, but as I
       | said, I doubt it would do more than send a message right now.
       | Then again, I may be wrong; I usually am.
        
       | frizlab wrote:
       | I have been waiting for this almost since the whole AI thing
       | started. I do hope it gains traction and that lawyers produce a
       | reviewed document that could provide a legal basis against the
       | data-hungry AI producers if they ever touch HOPL-licensed (or
       | whatever it'll be called) software.
        
       ___________________________________________________________________
       (page generated 2025-10-28 23:02 UTC)