[HN Gopher] Is A.I. The Death of I.P.?
       ___________________________________________________________________
        
       Is A.I. The Death of I.P.?
        
       Author : marban
       Score  : 71 points
       Date   : 2024-01-17 12:27 UTC (10 hours ago)
        
 (HTM) web link (www.newyorker.com)
 (TXT) w3m dump (www.newyorker.com)
        
       | fredolivier0 wrote:
       | surprise surprise
        
       | pvaldes wrote:
        | For context, I.P. here = Intellectual Property and not Internet
       | Protocol address. You can save a click today.
        
         | midasuni wrote:
         | Is the answer "yes" or "no"?
         | 
         | I suspect the actual answer is "those with wealth will continue
         | to amass it and those without will struggle more"
        
           | amelius wrote:
           | Traditionally, rich people paid poor artists to obtain new
           | artworks. In the future, they just click a button.
        
             | tenebrisalietum wrote:
             | This is actually not necessarily only a rich thing. There
             | are numerous artists that will accept commissions and
             | create something for you, unique to you, and it's not
             | thousands of dollars.
             | 
             | But your statement brings up (or at least makes me think
             | of) what I think is an interesting point: there's the
             | notion of mass media, which started when broadcasting
             | became a thing. Play 1 recording via a method where
             | potentially millions can hear or see it. It was after the
             | development of broadcast technology that copyright
             | maximalism took hold. IMHO the current IP regime is for
             | that purpose, and cracks are showing in it because media is
              | becoming more fragmented and more personalized. Will mass
             | media ever become a thing of the past?
        
           | marcosdumay wrote:
           | The article is quite long, but it seems to agree with your
           | suspicions.
           | 
           | Personally, I don't think this is sustainable. Once you make
            | the injustice obvious, people will give up on following the
            | rules.
        
           | shzhdbi09gv8ioi wrote:
           | A: No [1]
           | 
           | 1:
           | https://en.wikipedia.org/wiki/Betteridge's_law_of_headlines
        
             | amelius wrote:
             | I'm writing an article titled "Is Betteridge's law true?"
        
           | waihtis wrote:
           | the actual answer is "those who apply themselves will be fine
           | and those who spend all their days wallowing in marxist
           | fantasies will stay poor (both spiritually and monetarily)"
        
       | 8jef wrote:
       | I have not read the article, just responding to title:
       | 
       | Hell yeah!
       | 
       | I do wish I.P. dead. Or at least, I wish for a much shorter
       | protection period, and for original authors / artists only. For
       | all the wrong reasons, but primarily, because nothing's sacred.
       | Amen.
        
         | ImPleadThe5th wrote:
         | Do intellectuals / creatives not deserve the fruits of their
         | own labor?
         | 
          | I get that corporate IP is used for a lot of bad. But
          | intellectuals and artists create a lot of value for society
          | in ways that are very hard to monetize as an individual. If we
          | completely kill IP, what becomes of our authors, philosophers,
          | historians, artists, etc.?
          | 
          | They are already having a hard time as it is with existing IP
          | protections vs. big tech, big pharma, big record labels, etc.
        
           | teddyh wrote:
            | It's not that cut and dried:
           | 
           | <https://en.wikipedia.org/wiki/Criticism_of_copyright>
           | 
           | <http://www.dklevine.com/general/intellectual/againstfinal.ht
           | ...>
        
           | 8jef wrote:
           | Yes, you are right. But.
           | 
            | People do abuse the protections, which are too lengthy, make
            | it possible to sell rights, etc.
            | 
            | In the end, corporations (ab)use protections (buying and
            | selling rights) to amass fortunes much bigger than the
            | original creator ever got. That's not right, not fair, at
            | all. The system's broken and needs a fix: throw it all out
            | and build something new, centered around creators.
           | 
           | Give back to creators only, for less time.
        
       | ska wrote:
       | Betteridge's law of headlines?
        
         | CrzyLngPwd wrote:
         | No!
        
       | thot_experiment wrote:
       | No subscription and archive.is is broken on 1.1 but it seems like
       | the conversation is finally moving in a good direction.
       | 
       | I'm very very tentatively hopeful that we might see some
       | reduction of IP rights in my lifetime. It's going to be a long
        | hard fight, but maybe, just maybe, we will see people wake the
        | fuck up and realize how important building up the fucking
        | commons is to the progress of all humanity.
        
         | __loam wrote:
         | It's rich to see people talk about the commons like AI models
         | aren't a massive tragedy of the commons.
        
           | AnarchismIsCool wrote:
           | Can you expand on that?
        
             | __loam wrote:
             | These models will discourage creative work. They exist at
             | the expense of the people who made them possible.
        
           | KoolKat23 wrote:
           | I assume you're referring to copyright. It could be argued
            | copyright is a tragedy of the commons: the individual
            | benefits at the expense of everyone else. Obviously it is
           | intended that the wider impact of incentivising innovation
           | will be beneficial to everyone, but that is not a sure thing,
           | it is a compromise, a balancing act and that applies to AI
           | too. The Luddites did lose their jobs after all.
        
             | __loam wrote:
             | Listening to people bash the luddites without understanding
             | the actual history of the luddites is getting exhausting.
        
           | thot_experiment wrote:
           | Please explain to me how baking all the art into a big tensor
           | and then giving it away to everyone for free to run on any
           | half decent GPU from the past 10 years is somehow anti-
           | commons.
           | 
           | Rent-seeking by controlling AI models is anti-commons, the
           | models themselves are not, neither is training them. Don't
            | delude yourself into thinking that Miyazaki didn't study
            | Moebius, who didn't study Mucha. Even if you got your way and
            | all training on copyrighted material was stopped today and all
           | existing models were deleted it would only be a matter of
           | time before we could synthesize arbitrary styles (at least as
           | well as we can today) by identifying and controlling the
           | correct axes in tensor space using a model purely informed by
           | public domain data.
           | 
           | Your art is mostly not yours, your art is mostly an
           | amalgamation of other people's art which itself is also the
           | same, all the way down. You are a sprinkling of flavor on
           | top. Your art is stolen styles, techniques and vibes mashed
           | together. And here you are, trapped arguing about where we
           | draw the line on what's okay. 75 years? now 95? Look at what
           | you're saying! Introspect and realize that your understanding
           | of the context is defined by Disney and Bono wanting to
           | collect rent on ideas. You're just focused on getting your
           | crumbs of the pie in a game that's rigged to centralize power
           | and keep it centralized.
        
             | mplewis wrote:
             | We all know that an artist studying the masters to make
             | something original isn't the same as an autocomplete API
             | picking the color of the next pixel. Come on, man.
        
               | visarga wrote:
               | > studying the masters to make something original
               | 
               | original or "original"? maybe AI companies need to hire
               | people to make "original" content for AI.
        
               | thot_experiment wrote:
               | Whether or not those things are equivalent isn't relevant
               | to any part of my point.
        
               | __loam wrote:
                | You equated them.
        
             | bobthecowboy wrote:
              | The gist of what you're saying is in the Free Culture
             | movement/philosophy, I think, and it resonates with me as
             | someone who does not like what AI has done with copyrighted
             | works but also dislikes copyright.
             | 
             | The "rent-seekers" is the problem. We collectively inherit
             | and own our shared culture, but large corporations have
             | always wanted to sell it back to us. AI companies are
              | arguing they should have no limitations on their usage of
              | the culture, but that the same freedom shouldn't apply to
              | everyone else. Selling tickets to the commons is anti-
              | commons.
             | 
             | Perhaps if these companies were themselves arguing for the
             | end of copyright and IP _for everyone_ , the conversation
             | would be different.
        
               | visarga wrote:
               | I think copyright lawsuits against AI companies will
               | force them to develop attribution models. They will do
               | the work of indexing all ideas to their authors. This
               | will also reveal what is common knowledge, and who
               | borrowed from who without attribution.
               | 
               | In order to make attribution models we need
               | text+author+timestamp. We can get that from books,
               | newspaper articles, scientific papers and social network
               | posts. Then we extend to the rest of the training set.
               | 
                | But then we can also make AI models that cleverly avoid
                | infringement, while the same strict checking is applied
                | to human-made content. Humans are not that good at
                | avoiding pitfalls.
        
               | jwells89 wrote:
               | Right, I think we'd be having a very different
               | conversation if products of AI could not be sold (what's
               | taken for free must be given for free and free for others
               | to use as they see fit). Companies will fight this tooth
               | and nail though because that flushes any prospective
               | profit produced by firing humans down the drain.
        
             | __loam wrote:
             | > Even if you got your way and all training on copyrighted
             | material was stopped today and all existing models were
             | deleted it would only be a matter of time before we could
             | synthesize arbitrary styles (at least as well as we can
             | today) by identifying and controlling the correct axes in
             | tensor space using a model purely informed by public domain
             | data.
             | 
             | Do it then.
             | 
             | > Your art is mostly not yours, your art is mostly an
             | amalgamation of other people's art which itself is also the
             | same, all the way down. You are a sprinkling of flavor on
             | top. Your art is stolen styles, techniques and vibes mashed
             | together.
             | 
             | Eat shit. People deserve to have some ownership over the
             | product of their labor. Imagine doing all that work to
             | learn all these skills and some rat fuck nerd tells you
             | they have the right to alienate you from that work, making
             | some greasy fucking argument that the multi-billion dollar
             | program they made is doing the same thing you do.
             | 
             | Corporate abuse of copyright isn't an excuse for corporate
             | abuse of people who can actually make things.
        
           | Dracophoenix wrote:
            | The tragedy of the commons relates to depletable resources,
            | such as grass feed for livestock. How's that relevant to ML
            | models that _create_ content/resources about as fast as most
            | can think? What's being depleted?
        
             | __loam wrote:
             | Qualifying this statement by saying this is something I've
             | thought about and this is my own bullshit theory.
             | 
             | There's a limited amount of "attention" on the internet
             | that's available in online art spaces. I theorize that
             | artists need to maintain a public portfolio to find work,
             | either creative work for advertising or entertainment
             | companies or commissions or patreon subscription. You need
             | to be out there for people to hire you if you want to make
             | a living doing art.
             | 
             | AI art is disruptive and harmful to that ecosystem in a few
             | ways. It competes with artists not only for dollars and
             | jobs for commercial art (ie ad agency work), but more
             | fundamentally it competes with artists for attention on the
             | internet, which at a macro scale makes it so there's less
             | money in the system for real artists to practice and master
             | their skills. It's also not necessarily that the AI art is
             | better, it's just good enough to pollute the signal to
             | noise ratio. And as you said, it's fast. That can quickly
             | overwhelm sites like DeviantArt or Art Station, which moves
             | those communities and resources away from their original
             | purpose.
             | 
             | Additionally, AI art models need data from the commons to
             | function. I'm skeptical that synthetic data is good enough
             | to use for improving these models, so theoretically, one of
             | the main ways to improve these models is getting more and
             | better data from humans. If the economy of these models is
             | structured in a way that discourages artists from posting
             | their work publicly, or working at all, then the pace of
             | improvement decreases. I think a lot of artists are pretty
             | pissed off that their work is being used to produce
             | commercial software that creates substitutes for their work
             | without their permission or knowledge.
             | 
             | So that's what I mean by the tragedy of the commons. The
             | online art ecosystem is more fragile than many people think
              | it is, and AI companies are overexploiting it. The content
             | remains but the thing you're actually trying to sample,
             | human knowledge and skill, withers on the vine.
        
         | foobiekr wrote:
         | Let's see how software people feel when it becomes routine to
         | AI wash projects to de-license them.
        
           | thot_experiment wrote:
           | That's bad, but the problem there isn't the AI.
        
           | vdaea wrote:
           | As a software person that feels great.
        
           | gumballindie wrote:
            | Software people here: we don't want AI people to steal our
            | open source code, hide it behind closed models, and sell it
            | without attribution or our other license terms respected.
            | 
            | Many in the software world are in awe at how much theft AI
            | people get away with.
        
             | visarga wrote:
             | Open generative models are to open source what open source
             | is to closed source - an even deeper level of openness,
             | customisability and accessibility. Like open source they
             | empower their users, are private and easy to adapt. Most of
             | the time you just need to prompt, other times you apply a
             | fine-tuning tool, which is also open source.
             | 
              | I really don't understand open source people fighting
              | generative AI. I haven't seen this ethos since the
              | JavaScript framework wars. Take a look at the llama.cpp,
              | ollama and vllm repos and their friends. They have such a
              | sustained rate of development and participation.
        
         | robertlagrant wrote:
         | Is that from the perspective of a reader of novels or a writer
         | of novels? Readers love free stuff.
        
           | thot_experiment wrote:
            | When it costs zero to produce a copy of a novel, is the moral
            | choice to create artificial scarcity to preserve its value
            | in our current system of value assignment, or is the moral
            | choice to fight to change and improve the system to better
            | align with the reality that novels exist in a post-scarcity
            | world?
           | 
           | Do we really want to deny people something that costs nothing
           | in order to preserve the status quo? Perhaps it's the system
           | of value assignment that's broken.
        
             | harimau777 wrote:
             | Novels don't exist in a post scarcity world because their
             | writers still need money to live.
             | 
             | I'm not necessarily opposed to moving to a post scarcity
             | model without IP, but we need to actually be post scarcity
             | first.
        
             | robertlagrant wrote:
             | Does that mean novel writing would only be for those who
             | are already independently wealthy? How would that actually
             | work?
        
               | thot_experiment wrote:
               | I'm not claiming to have worked out how to make this all
               | function. I'm just pointing out that the cost of
               | duplicating a novel approaches zero and the assumption
               | that the way we're assigning value is the only one that
               | works leads to a world where we must deny people things
               | they could have in order to preserve their value. This
               | does not seem morally correct to me. I do not need to
               | have a solution in order to identify a problem. Perhaps
               | we could make strides toward a future where you don't
               | need to be independently wealthy to write a novel by
               | having some sort of a UBI scheme.
        
               | robertlagrant wrote:
               | > I do not need to have a solution in order to identify a
               | problem
               | 
               | I'm the only one who's identified a problem. You haven't.
               | Production costs aren't the only type of cost.
        
             | Kim_Bruning wrote:
             | I wonder if traditional novels have been out-paced by free
             | online texts that people write in their free time (fan-fics
             | and/or original-fics) ? It'd be interesting to crunch the
             | numbers. Productivity is enormous, though quality isn't
             | always as high. Of course with sufficient monkeys on
             | keyboards, there's bound to be a few Shakespeares out
             | there.
        
           | visarga wrote:
            | We have had more content than we can chew for 25 years, all
           | accessible through search. Yes, we love it. But it's nothing
           | new. We don't need AI to have good, free stuff to read.
        
           | naet wrote:
           | It can end up being bad for both readers and writers in the
           | long term.
           | 
           | If writers aren't able to find profit in writing then there
            | will be fewer people spending the time and effort to produce
           | good writing, which in turn means less quality material for
           | readers to read (and, as an aside, less quality material for
           | an AI to train on in the future).
           | 
            | We could arguably reach some type of feedback loop where
            | lower-quality writing leads to less interest in reading,
            | which leads to smaller reading audiences, which leads to
            | lower-quality writing being produced, etc.
           | 
           | People do love free stuff, but free stuff often isn't
           | sustainable under our current economic system.
        
             | robertlagrant wrote:
             | > free stuff often isn't sustainable under our current
             | economic system
             | 
             | I disagree - some people live their whole lives on benefits
             | or similar entitlements. That's an incredible economic
             | achievement, as well as being a brilliant show of hard work
             | by the net contributors.
        
       | gustavus wrote:
       | Intellectual property isn't.
        
         | chefandy wrote:
         | The problem is that our society has no other viable widespread
         | mechanism to support the many millions of people who do
         | intellectual work, and lots of intellectual work is really
         | important for our society-- far more important than would make
         | sense to relegate to hobbyists and volunteers. And this is
         | coming from someone who's contributed well over 10k coding
         | hours to FOSS software.
        
           | teddyh wrote:
           | > _The problem is that our society has no other viable
           | widespread mechanism to support_
           | 
           | Except, of course, UBI.
        
             | BobaFloutist wrote:
             | We don't have that. Maybe we should, but we don't.
        
               | chefandy wrote:
               | Yep. Order of operations matters. This isn't a
               | theoretical "future problem" for a lot of people.
        
         | falcolas wrote:
         | Yup. It accounts for some 41% of the US' GDP, and a minimum of
         | 47 million jobs. It's also growing at a fairly fast rate.
         | 
         | That's too much money for anybody in charge to let die. They're
         | much more likely to nuke AI.
        
           | chefandy wrote:
           | > minimum of 47 million jobs
           | 
            | What would you propose the US economy do with some 47
           | million knowledge workers with suddenly useless skills if you
           | removed the only widespread economic mechanism our society
           | has to support them?
        
             | falcolas wrote:
             | In my Star Trek-esque ideal world, post-scarcity sharing of
             | the resources required to live.
             | 
             | Realistically, it's a part of the reason IP is "too big to
             | fail". The fallout would effectively destroy the US.
        
               | chefandy wrote:
               | Yeah but we need the Star Trek society _first_. In
               | principle, I absolutely agree that IP is bullshit. The
               | problem that many people in the "smash IP" camp tend to
               | ignore is that people needing to eat, have a place to
               | live, not having a stage 1 cancer diagnosis be an instant
               | death sentence, and things like that aren't merely
               | "regrettable collateral damage" on the road to freeing
               | data.
               | 
               | I've encountered many people over the years that hardline
               | shit like that--they're almost exclusively middle to
               | upper middle class suburbanites that are edge lording
               | that stuff to subconsciously quiet their deep-seated
               | insecurities about how incredibly soft their
               | comparatively unchallenged existence has made them.
        
       | dventimi wrote:
       | https://archive.ph/2024.01.16-221827/https://www.newyorker.c...
        
       | pessimizer wrote:
        | I think so. Not just because of copyrights, as the article covers,
        | but because the people who want to abuse copyright for A.I. and
        | the people who own copyrights are basically the same people. They
       | will simply cross-license and lock normal people out of using the
       | products of A.I. by wielding that licensing against individuals.
       | 
       | The real reason is that anybody can do A.I. and it can't be very
       | patent-encumbered, being a number of abstract mathematical
       | techniques. If it becomes an absurdly productive technology, you
       | won't easily be able to keep people from using it in private.
       | Maybe the real Butlerian Jihad will be the government Office of
       | A.I. Copyright Royalties sending agents and their silicon-
       | sniffing dogs to kick in people's doors following rumors of
       | Aggravated Infringement.
       | 
       | Maybe it will be what finally gets general purpose computers
       | banned?
        
         | deadbabe wrote:
         | In a world where general purpose computers are banned, would
         | there be some kind of underground speakeasy type computer labs?
        
       | jerf wrote:
       | No, it will result in a strengthening. It raises the value of
        | things previously unprotectable, and thus will raise the incentive
       | to get them protected. The Wild West of the AI world today will
       | rapidly get captured because by the time you've amassed the
       | resources to put up a credible AI, you're already a fairly large
       | corporation. The scale of time necessary to get new protections
       | passed is roughly comparable to the scale of time it will take
       | for these companies to become owned one way or another by
       | existing interests.
       | 
       | The good news is that this probably still won't much affect your
       | personal projects or anything. Anything you can already self-host
       | and do today isn't going anywhere and you've probably got at
       | least another two or three generations in this fast-moving space
       | before the law clamps down. It is likely that many of the
       | referenced AI aspects will be essentially "solved" before then,
       | such as voice cloning. They aren't going to go after every last
       | little AI user for every last thing because that's squeezing
       | blood from a stone and not cost-effective. But don't plan on
       | building a multi-million-subscriber YouTube channel out of it.
       | 
       | And if you are interested in the freedom to do things the
       | copyright owners don't want you to do, be sure to self-host and
       | archive as much as possible. The ability to do this on platforms
       | others host is going to disappear rapidly, if it isn't already.
        
         | AnarchismIsCool wrote:
         | This is a really interesting take, but I don't completely buy
         | it. Traditionally technologies like these become more
          | accessible with time. Take the YouTube example; it's actually
          | a really good counterpoint: if you can create videos with an
          | AI you created, nobody has any way of figuring out that you
          | trained it on so-and-so's IP. It's a complete black box, which
          | is the direction things are probably going to go, IMO.
          | 
          | On the other hand, this is problematic because it'll slow down
          | the development of the self-hostable models, unless the
          | community moves to a darkweb-like model where they stay beyond
          | the reach of the legal system.
        
           | jerf wrote:
           | One of the big factors is whether or not AI can stay in the
           | hands of the public or if it recedes into levels of hardware
           | that only companies can own. If the latter, then the control
           | will win.
           | 
           | Keeping AI performance workable on personal-level hardware is
           | going to be a big deal. Open source work on keeping
           | performance at that level may be long-term more important
            | than this or that feature.
           | 
           | Watch out for calls for AI ethics to turn into calls for
           | controls on hardware capable of running AI. I'm half
           | surprised we aren't already hearing calls for that. I
           | wouldn't be surprised to see it by the end of the year,
           | though my expectation would be at least next year if not the
           | one after. The real goal behind hardware control will be this
           | sort of thing (though copyright control may actually be a
           | secondary concern versus just keeping the really good tools
           | out of the hands of the plebs), not AI ethics. (The elites
           | know AI ethics limitations have no meanings as they have no
           | intention of limiting themselves.)
        
             | gooob wrote:
             | everyone already has computers that can "run AI". would be
             | pretty ridiculous if some law was passed and everyone's
             | computers were confiscated. that would be like a
              | lobotomization of society. a very stupid and short-sighted
             | move i think.
        
             | visarga wrote:
             | > One of the big factors is whether or not AI can stay in
             | the hands of the public or if it recedes into levels of
             | hardware that only companies can own. If the latter, then
             | the control will win.
             | 
             | Both will win. As corporations will make more and more
             | advanced models, training data generated by their models
             | will uplift open models. They only lag a few months. We
             | already can shrink most of chatGPT's abilities in a 7B
             | model. All open models trained after 2022 are benefiting
             | from their big brothers. It's too easy to exfiltrate skills
             | from proprietary models.
        
       | ricardo81 wrote:
       | It's been dead a while because of big tech. YouTube, Google,
       | Facebook, Twitter - they have no way of effectively managing IP
       | and so you basically see people's stolen content and they
       | monetise it, so <shrug>
       | 
       | TBF Napster and torrents are another and earlier angle.
       | 
       | Purely IMO, big tech normalised the taking of other people's
       | content, and it's getting to the point where content creators (as
       | in, ones that are capable of producing unique content, not
       | wealthy YouTubers per se) have simply had enough. The idea of
       | authorities and primary sources is rapidly diminished with the
        | hodgepodge we have to navigate through in information discovery.
        
         | ronsor wrote:
         | > big tech normalised the taking of other people's content
         | 
         | This was normalized by large media companies long before big
         | tech.
        
           | ricardo81 wrote:
           | Do you have examples? Obviously it's hard to compare scales
           | though.
        
             | jachee wrote:
             | Record labels spring immediately to mind.
        
             | TuringNYC wrote:
             | Any case where a small player cannot effectively defend
             | patents they hold, but at any time, can be sued for
             | infringement and cannot effectively defend themselves
             | either.
        
             | ricardo81 wrote:
             | @TuringNYC and @jachee
             | 
              | What had sprung to mind for me was the wholesale rip-off
              | of European IP by the US during its commercial development
              | in the 19th century, and by China now with the US.
              | 
              | I guess what I'm thinking of is that at the individual
              | level, no one thinks twice about re-appropriating someone
              | else's work. It has been normalised.
        
             | pjmorris wrote:
             | "Everything is free now
             | 
             | That's what they say
             | 
             | Everything I ever done
             | 
             | Gonna give it away
             | 
             | Someone hit the big score
             | 
             | They figured it out
             | 
             | That we're gonna do it anyway
             | 
             | Even if it doesn't pay"
             | 
             | - 'Everything is Free', Gillian Welch, 2001
        
               | ricardo81 wrote:
               | Ah
               | 
               | "No one's gonna notice if you're never right or wrong
               | 
               | And if you and your next neighbour, yeah, ya don't quite
               | get along
               | 
               | No one's gonna notice if you're singing anyway
               | 
               | Those not coming in for free will learn they gotta pay"
               | 
               | Ian Brown, 2004.
        
         | willmadden wrote:
         | The pharmaceutical industry continues as a giant, blood
         | sucking, vampire squid because of IP.
        
           | NoZebra120vClip wrote:
           | > The pharmaceutical industry continues as a giant, blood
           | sucking, vampire squid because of IP.
           | 
           | Let's be honest here. In order to get that way, they needed
           | to stomp out and eradicate every trace of "evidence" that
           | natural medicine was efficacious, because herbs, botanicals,
           | and other natural substances can't be patented, can't be
           | synthesized in factories, and can't be exploited for profit
           | at megascale.
           | 
           | If word ever leaked out that, for example, selenium, valerian
           | root or St. John's wort were equivalent/better than
           | synthetic, patented medications, we'd all be in trouble.
        
             | BobaFloutist wrote:
              | I can buy valerian root tea at a multinational grocery store.
             | The problem with supplements isn't that they're
             | disenfranchised, and it's not even that they strictly don't
             | work. It's that the huge businesses that make tons of money
             | off of them have so aggressively resisted regulation based
             | on effectiveness, or even contents, that you can't count on
             | what you're buying not poisoning you, let alone helping.
             | 
             | And I'd love to see a natural replacement for insulin that
             | doesn't kill people, but I'm pretty sure it doesn't exist.
             | So maybe pharmaceutical technology does a few things that
             | natural remedies can't?
        
         | mschuster91 wrote:
         | > TBF Napster and torrents are another and earlier angle.
         | 
          | Napster didn't kill the music industry, and torrents didn't kill
          | the movie industry, because people still want the big-screen
          | movie experience. And so, actors, directors, VFX artists,
         | cutters, riggers, lighting staff, costume/mask people, they all
         | continue to have jobs.
         | 
          | But now, with AI already able to produce very realistic
         | photos, and in a few years likely to create short movie
         | sequences all based upon a prompt, not only will all these
         | people be replaced by one prompt artist feeding a massive AI
         | engine... but the diversity the AI can generate will always be
         | limited by the diversity of its training material. It is by
         | definition incapable of trying something entirely new, and
         | since there will be no economic incentive to try anything
         | entirely new there will be no expansion. (Oh, and guess what,
         | movie prices aren't going down but since the expenses for all
         | the humans vanish the profit concentration will accelerate!)
         | 
         | Widespread AI will freeze our culture in an era of about
         | 2020-2030 forever simply because capitalism will not offer any
         | incentive to feed the AI with new, creative things.
        
           | Ukv wrote:
           | > torrents didn't kill the movie industry because people
           | still want to see movies on a big screen experience
           | 
           | Given you believe the reason torrents didn't kill the movie
           | industry is because people weren't satisfied with smaller
           | screens: would you be satisfied with content that is stuck in
           | "2020-2030 forever" and can never produce anything "entirely
           | new"?
           | 
           | Personally I think novelty in art thrives when the entry
           | barrier is low and experimentation is possible without a huge
           | budget - which AI can help achieve.
        
             | mschuster91 wrote:
             | > Given you believe the reason torrents didn't kill the
             | movie industry is because people weren't satisfied with
             | smaller screens: would you be satisfied with content that
             | is stuck in "2020-2030 forever" and can never produce
             | anything "entirely new"?
             | 
             | I fear the enshittification of movies - similar to, say,
             | the situation in video games or residential ISPs: when
             | everyone has silently agreed to a common level of base
             | bullshit, customers won't move (because why move from a
             | pile of cow dung to a pile of dog dung? it's all dung in
             | the end), and everyone can keep fleecing the customers.
             | 
             | > Personally I think novelty in art thrives when the entry
             | barrier is low and experimentation is possible without a
             | huge budget - which AI can help achieve.
             | 
              | Oh, even now utter novices and two- or three-person teams
              | can produce hours of high quality content; YouTube is
              | enough proof of that. Experimentation is not the problem,
             | the problem is all the marketing to get people to sit in
             | the cinema. For a lot of modern movies, music and video
             | games the cost of production is at least on the same order
             | of magnitude as the cost of marketing - even a blockbuster
              | movie like Avengers Endgame [1] reportedly had $356M of
              | production cost vs $200M of marketing cost, and of the
              | production cost at least $100M was compensation for the
              | star actors.
             | 
              | [1] https://collider.com/avengers-endgame-box-office-budget/
        
               | Ukv wrote:
               | > or residential ISPs: when everyone has silently agreed
               | to a common level of base bullshit, customers won't move
               | (because why move from a pile of cow dung to a pile of
               | dog dung? it's all dung in the end)
               | 
               | So, if I'm understanding, you believe you wouldn't be
               | satisfied with the movies, but wouldn't have a choice
               | because of a silent agreement (as with residential ISPs)
               | leaving customers with no good options. Is lowering
               | barriers to competition not one of the ways that such
               | oligopolies are fought?
               | 
               | To me this seems far more of a risk in our current
               | environment where the vast majority of high-production-
               | value movies come from a small handful of companies that
               | have the budget to make them (and give us plenty of safe
               | low-risk sequels).
               | 
               | > Oh even now even utter novices and two-three person
               | teams can produce hours of high quality content, youtube
               | is enough proof of that
               | 
                | There are reasons why working-class people are vastly
                | under-represented in the arts - the time and resources
                | required to create "hours of high quality video content"
                | are a hefty investment even for YouTube videos, not to
                | mention a feature film that people will want to watch.
        
           | suoduandao3 wrote:
           | >capitalism will not offer any incentive to feed the AI with
           | new, creative things.
           | 
           | Disney is already following that thesis without AI, yet it's
           | never looked so vulnerable to the whims of the free market.
        
           | hadlock wrote:
           | > and in a few years likely to create short movie sequences
           | all based upon a prompt
           | 
           | It depends on the director and style of the era, but average
           | "shot length" runs between 5 and 14 seconds. Six months ago
           | companies were already demonstrating AI generated, 20 second
           | shots (albeit with low motion, like slow pans) with the
           | background staying mostly the same. I would probably upgrade
            | this statement from "likely to create" to "a near
            | certainty".
        
           | visarga wrote:
           | > It is by definition incapable of trying something entirely
           | new
           | 
            | Only if the AI never gets used. But when you start
           | using it, you get AI+human generating things that can be
           | outside its training scope. Humans can input novel ideas into
           | the prompt or include new information. Then AI gets to
           | retrain on the data generated with human in the loop and
           | improve. There is a chance for humans and AI to explore new
           | directions.
           | 
            | If you have a simple way to test, like the DeepMind geometry
           | problem solving model, then AI models can improve on their
           | own by doing massive search and validation. Kind of similar
           | to the scientific method - formulate hypothesis, validate,
           | observe outcomes, and repeat. Works for code too, or any AI
           | output that can be "executed" and generate a validation
           | signal.
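         
        A bare-bones sketch of that search-and-validate loop; generate() and
        validate() are stand-ins for a model call and an external checker
        (unit tests, a geometry verifier, etc.), purely illustrative:
         
            # Sample candidate solutions, keep only those an external checker
            # accepts, and return the survivors as new training examples for
            # the next round of fine-tuning.
            def self_improvement_round(problems, generate, validate,
                                       samples_per_problem=8):
                new_training_data = []
                for problem in problems:
                    for _ in range(samples_per_problem):
                        candidate = generate(problem)       # model proposes
                        if validate(problem, candidate):    # external signal
                            new_training_data.append((problem, candidate))
                            break
                return new_training_data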
        
           | gaganyaan wrote:
           | Completely unfounded fear, fortunately. AI will enable more
           | people to be creative than ever before. People won't stop
           | telling stories because of capitalism, they'll just continue
           | the tradition of storytelling that predates capitalism by
            | many millennia.
        
         | dandellion wrote:
         | There was no other outcome possible from the moment we invented
         | a technology that allowed everybody to create infinite copies
         | of digital content at almost zero cost.
         | 
         | If tomorrow we discovered a source of infinite free energy, oil
         | and energy companies could try anything they wanted, their
         | current business model would end up obsolete sooner or later.
        
           | ricardo81 wrote:
            | Can't say I disagree it was inevitable, but it could have
            | been legislated for, though legislation seems to be years
            | behind the times as always. My main concern is just that
            | unique content is adequately rewarded and isn't siphoned off
            | by large platforms that can viralise said content, or along
            | those lines.
            | 
            | Maybe if big tech passed on some (most) of the proceeds for
            | the content they share, it'd be far less of a thing.
            | Otherwise, 90% of people made aware of an idea see it
            | somewhere beyond the original source, and those secondary
            | sources capitalise because... simply because it's the norm.
        
           | neonsunset wrote:
           | Just look at what happened to nuclear energy :)
        
             | k__ wrote:
              | So, nuclear proponents are now going so far as to call the
              | catastrophes of the past a big-oil psyop?
        
             | BobaFloutist wrote:
             | Nuclear power plants have a dramatically higher upfront
             | cost than personal computers. You can't really ignore the
             | capital cost and call the energy free.
             | 
             | A better example is solar power, which is slowly but surely
             | whittling away at fossil fuels.
        
         | peab wrote:
          | That's not true at all - Content ID on YouTube works quite
          | well, and the music industry has strong ties to YouTube.
        
       | adventured wrote:
       | It inherently has to damage our present conception of IP, unless
       | we plan to cripple the AI revolution in the crib.
       | 
       | If you have smart robots in the home, much less AGI, what's the
       | plan on restricting them? Sorry robot, you may only learn this
       | limited set of skills, you're not allowed to create, because
       | you've seen too much and might be too good at learning and
       | producing.
       | 
       | They'll be incredibly metered if they're not allowed to learn and
       | produce at a high capability.
       | 
       | And if we're talking about AGI, you can never realistically reach
        | it if you put it in the straitjacket required to protect IP as
       | things are today. Sorry all-knowing AGI god, you may not make
       | images that look too much like XYZ. In fact, you're not allowed
       | to even see any of it because of your capabilities. In fact,
       | Disney has decided through the copyright act of 2042 that you're
       | not even allowed access to the Internet (or magazines, or radio,
       | or music, or TV).
       | 
        | It can't be AGI if it can't create. Forget about what it was
        | trained on ('just do not train it on copyrighted material'): it'll
        | self-train, and then what? Well, then you can't even let it self-
        | train, i.e. no AGI is possible.
       | 
       | It's going to be either or. Either the IP laws get changed, or
       | you get no AGI. The only thing possible would be extremely boxed
       | off, narrow, high capability AI.
        
         | jprete wrote:
         | Why do you think AGI is desirable?
        
           | foobiekr wrote:
           | Humans are too stupid to collectively govern themselves. Just
           | look at the US election cycle that is starting. There is a
            | genuine messianic cult at work.
        
             | tatrajim wrote:
             | All hail the wisdom of the cold circuits. We hear and obey.
        
         | noitpmeder wrote:
         | I, for one, do not want to live in your new world.
         | 
         | Further, I doubt that preventing AIs from literally stealing
         | protected information will halt the rise of smarter AI. There
         | is plenty of free use information out there that is LEGAL TO
         | USE. Just curate your dataset to prevent the ILLEGAL use of
         | other material.
        
       | somewhereoutth wrote:
       | Depending on the outcome of the NYT case, I.P. may in fact be the
       | death of (profitable) AI.
       | 
       | One problem with AI (vs e.g. Google search) is that there is no
       | mechanism for attribution. So it is impossible for rights holders
       | to understand whether their works are being monetised (unlike,
       | say, Spotify).
       | 
       | Should NYT win, then either AI services will have to pay into a
       | (very large) common pot for rights holders, which likely will
       | make AI uneconomic (even if it does turn out to have some
       | business benefit), or (paid) AI services will simply be banned.
        
         | TuringNYC wrote:
         | >> One problem with AI (vs e.g. Google search) is that there is
         | no mechanism for attribution.
         | 
         | Sorry if this is a silly question, but is that really the case?
         | Can we not train an LLM on successively larger training sets,
          | each producing a uniquely ID'd model with its associated
          | performance gain?
        
         | djohnston wrote:
         | There's no way AI services are going anywhere. If the US
         | kneecaps itself for parasitic rent seekers then China will win
         | and benefit from the productivity these tools produce.
        
           | gumballindie wrote:
            | So the proposal of sociopathic thieves is that we turn into
            | China and steal people's property? What an odd take.
        
             | djohnston wrote:
             | It's a hard world and we need to play to win.
        
         | wredue wrote:
          | It's not impossible. AI developers can 100% trace the nodes
          | that contribute to a result during the processing of a prompt,
          | so those contributions can be attributed.
          | 
          | They'll never do this, of course, because they're riding high
          | on wins from stupid people pushing "you can't work backward
          | from a result to find the nodes" (which is probably somewhat
          | grounded in reality).
         | 
         | Attribution is absolutely not impossible, they just want to
         | think it's not possible, because enabling such a thing would
         | show just how egregious the copying actually is.
        
           | Ukv wrote:
           | > they just want to think it's not possible
           | 
           | As far as I'm aware, what you propose really isn't a solved
           | issue. Most naive approaches you'd think might work are
           | either infeasible or give chaotic nonsense results.
           | 
           | Partial matching between the output text and the training
           | data is possible as a step after generation (though doesn't
           | determine whether it's coincidence).
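         
        A rough sketch of that post-generation step: flag training documents
        that share long verbatim word runs with the model's output. This only
        catches near-copies and says nothing about looser derivation; the
        window size and names are illustrative assumptions:
         
            # Rank training documents by how many 12-word windows of the
            # generated output they contain verbatim.
            def verbatim_matches(output, training_docs, window=12):
                out_words = output.lower().split()
                windows = {" ".join(out_words[i:i + window])
                           for i in range(len(out_words) - window + 1)}
                hits = {}
                for doc_id, text in training_docs.items():
                    doc = " ".join(text.lower().split())
                    count = sum(1 for w in windows if w in doc)
                    if count:
                        hits[doc_id] = count
                return sorted(hits.items(), key=lambda kv: -kv[1])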
        
         | booleandilemma wrote:
         | I'm not sure I like the direction AI is going anyway.
         | 
         | Right now we're on a path towards just a handful of megacorps
         | with AI agents that will make large numbers of white collar
         | workers redundant. Imagine something like closed-source
         | software only much worse.
        
           | hadlock wrote:
           | In the last six months model size has come way down and
           | quality has gone way up through different training
           | styles/methods. There is an incentive to release separate
            | models capable of running in 8/16/32 GB of RAM. On my laptop I
           | have something like six free/open source models installed.
           | LLMs are getting so easy to make that people are doing this
           | in their spare time as a hobby. This is starting to feel like
           | a small stepping stone towards AGI but only time will tell.
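         
        As an aside, running one of those small local models can be as short
        as the sketch below, using the llama-cpp-python bindings; the model
        file name is an assumption, and any quantized GGUF model that fits
        in laptop RAM would do:
         
            # Load a quantized local model and generate a short completion.
            from llama_cpp import Llama
            
            llm = Llama(model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",
                        n_ctx=2048)  # context window; tune to available RAM
            result = llm("Name three uses for a locally hosted LLM:",
                         max_tokens=128)
            print(result["choices"][0]["text"])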
        
         | gumballindie wrote:
          | So essentially AI is profitable only if it resells stolen
         | property. Almost as if it's not intelligent and it only
         | procedurally generates output. The more the better.
        
           | suoduandao3 wrote:
           | How is it 'stolen' if it was part of a public dataset? Is the
           | argument that OpenAI was scraping data that should have been
           | behind a paywall?
        
             | noitpmeder wrote:
             | Just because it's publicly viewable does not mean it's free
             | for reuse. Imagine an author who uploads chapters of their
              | book to their own website. You can't take that and claim
             | it's yours.
             | 
              | Similarly with all the publicly viewable source code on
              | GitHub/GitLab. Just because you can view the code doesn't mean
             | there aren't licenses in play for how you are able to use
             | it.
             | 
             | Further, what happens if stolen information is posted
              | publicly? (E.g. wholesale copies of full books on HTML
              | pages?)
             | 
             | Just because it's public doesn't mean the viewer has
             | freedom to use it as they see fit.
        
               | yokem55 wrote:
                | That 'usage' is taking statistical notes about the work
                | (creating factual statements about the work) and putting
                | those notes into a database, averaged with a few billion
                | other notes about other works. That is a usage that
                | copyright under current law simply doesn't cover or
                | protect against. It doesn't even need the analysis of
                | 'fair use' because there's no copying or public
                | performance happening in the creation/training of the
                | model.
                | 
                | Where infringement arguably can happen is when that model
                | is used in the generation of content - and if the user is
                | prompting it to regenerate a protected work, then that is
                | where the infringement happens. But not before. Maybe the
                | various AI services can adequately guard against that
                | illicit usage. Maybe not. And if not, it's those live
                | services that would need to be shut down.
                | 
                | But the creation and training of a model, and even
                | distributing that model for people to use on their own
                | computers in private, does not constitute copyright
                | infringement.
        
             | BobaFloutist wrote:
             | If I create a sculpture and put it on my front lawn, I
             | think it's fair for me to complain if you scan it and start
             | 3d printing and selling copies, even if I intended it to be
             | in public view. Just because I want the public to be able
             | to enjoy my version of something, doesn't mean I want
             | people to feel free to ingest it and monetize it for
             | personal profit.
        
         | mindcandy wrote:
         | > there is no mechanism for attribution.
         | 
         | What if there was?
         | 
         | What if I hand-painted an image (without AI) and God himself
         | came down from heaven and explained to the world that my
         | painting was derived 10% from my observations of the collected
          | works of Lisa Frank, 3% from looking at da Vinci paintings, and
         | 87% a long tail of various sources. And then I sell my painting
         | for $100.
         | 
         | OK. Now what?
        
           | teddyh wrote:
           | > _OK. Now what? Do I owe Lisa Frank $10?_
           | 
           | No, because copyright law says that if you hand-paint an
            | image (without AI), then as long as the image isn't deemed
           | (by a court, if necessary) to be a derivative work, then you,
           | and nobody else, own the copyright. The word of God is not
           | necessarily considered by the law. And even if the court
           | takes the word of God as gospel, a 10% derivative work might
           | not exceed the necessary threshold of infringement.
        
             | mindcandy wrote:
             | Very good!
             | 
             | Alternatively, if I created a large body of works that
             | were 90% Lisa Frank and brought it to market at scale,
             | making significant revenue, then she could sue me for
             | damaging her market and her brand. Which she has done to
             | others in the past and rightly won. Go Lisa Frank!
             | But, if I used an AI tool in the process of making an image
             | that came out 10% Lisa Frank, she wouldn't have a strong
             | case against me. Nor does she have a case against the
             | million random teenagers hand-copying her style but not
             | making significant revenue from their copies.
             | 
             | Either way, AI doesn't really factor in here. The results
             | are the results and the markets are the markets regardless
             | of AI or paint brushes.
        
               | BobaFloutist wrote:
               | Do you think it's reasonable for artists to want a way
               | to forbid major AI corporations from training on their
               | work? Because some people will stop producing, or stop
               | making their work available, if they don't get some
               | other way to prevent that.
               | 
               | Much like how some artists are uncomfortable with making
               | high-quality scans of their work available online,
               | because it becomes prohibitively difficult to prevent
               | people from selling prints. Forget the current legal
               | framework, it seems fair to me to say "I don't want that
               | happening to my art" when "that" is a corporation finding
               | a way to monetize it. Even if the AI is using less than a
               | pixel of your work, I think it's fair to ask that it not.
        
               | mindcandy wrote:
               | I get that the concept is upsetting. And, I'm not going
               | to tell people to not be upset or not state their will.
               | Ex: DeviantArt has introduced an opt-in checkbox for
               | "It's OK to train on my art" and I think that's great.
               | 
               | But, everything I've heard and learned about how AI works
               | leads me to the conclusion that such a move is almost
               | entirely symbolic. Counter to what our egos would like to
               | believe, the vast majority of the training images and the
               | vast majority of the learning value for AI comes from
               | random, crappy photographs of various things. Without a
               | million photos of dogs and trees, the AI can't make a
               | stylized dog distinguishable from a stylized tree. And,
               | it's turning out that, from that base of crappy
               | photographs, it doesn't take many high-aesthetic images
               | to make a high-aesthetic AI.
        
         | chmod600 wrote:
         | Are there other examples where the law crushed fundamental
         | technological shifts?
         | 
         | (Not just delayed briefly by starving investment or something.)
        
           | somewhereoutth wrote:
           | That assumes that the current crop of LLMs _are_ a
           | fundamental technological shift.
           | 
           | Furthermore, rights holders did it for file sharing, why not
           | AI too?
        
           | Workaccount2 wrote:
           | Stem cells
        
       | thomastjeffery wrote:
       | I can only hope.
       | 
       | In the meantime, we have an even worse situation: "AI" models
       | like LLMs are effectively allowed to _launder_ IP. As long as
       | companies like OpenAI are allowed to be ignorant of copyright,
       | they get to _monopolize_ that ignorance.
       | 
       | The result is that the rest of us are in an even worse position,
       | because _we_ have to respect the DMCA but can't benefit from it
       | by monopolizing our own IP.
        
         | ricardo81 wrote:
         | 100%. A sleight-of-hand plea of ignorance.
         | 
         | Take xx billion pages of unique human content, do something,
         | pretend that there's no monetary value to that content and
         | monetise it. Farcical.
        
       | mlsu wrote:
       | Copyright is not a real thing. It's a fiction. Subverting
       | copyright law is not like subverting the laws of physics. What
       | happens when you subvert copyright law is some guys with guns, in
       | either an overt way or a subtle one, force a trade of your
       | imaginary resource (copyright) for the real resource (money),
       | through licensing or fines.
       | 
       | Debate about what AI means for copyright is really a debate about
       | what we choose to do, what we choose to allow. It's a debate
       | about whether the guys with guns will stop the development of AI,
       | or how much.
       | 
       | It's a bit of clever misdirection to declare the output of AIs
       | "copyrightable" or "not copyrightable." Copyrightable is made
       | up! It's fiction! It is whatever we want it to be!
        
         | tilwidnk wrote:
         | > Copyright is not a real thing. It's a fiction.
         | 
         | Now, I don't have a college degree, but with my 60+ years of
         | life experience I can say: yes, copyright is a real thing. It
         | protects people who create from assholes and AI.
        
           | jncfhnb wrote:
           | It doesn't though
        
           | wahnfrieden wrote:
           | It's not real beyond what the parent post described, which is
           | an expectation that the state will use guns and cages in
           | reaction to certain behaviors
        
           | gumballindie wrote:
           | Am I the only one that loves a blunt and honest comment?
           | 
           | Funny how advocates against property rights have no issue
           | with corporations protecting their own. That's fine. What
           | they want is to legally steal from the average joe.
        
             | echelon wrote:
             | Are you stealing from your lord, serf? Perhaps you should
             | be churning the butter, not thinking for yourself. (I kid,
             | I kid!)
             | 
             | It's going to be just as easy to make a movie or a song as
             | it was to write this comment. I'm working and researching
             | at the edge, and I promise you that the entire world will
             | be in awe of these next 12 months. Go play with Suno for a
             | taste.
             | 
             | I'm a senior engineer, and now I'm tab completing Rust code
             | 70% of the time. It's hard to believe we've come this far,
             | and it's only going to keep climbing in capability.
             | 
             | You're watching breakneck progress, and the genie isn't
             | going back.
             | 
             | More art will be created per month than in all of human
             | history to this point. Gen Alpha is growing up on this
             | stuff and using it to great effect to communicate amongst
             | their peers. The things they build will be incredible.
        
               | arrosenberg wrote:
               | > It's hard to believe we've come this far, and it's only
               | going to keep climbing in capability.
               | 
               | Or we hit the top of the sigmoid curve and just have
               | fancy autocomplete.
        
               | SpicyLemonZest wrote:
               | I don't understand why the autocomplete meme is so
               | compelling to people. How is drawing a picture, which
               | modern AI systems are already capable of, "autocomplete"?
        
               | arrosenberg wrote:
               | (1) I was responding to the actual thing OP said (70%
               | code completion in Rust).
               | 
               | (2) It literally is autocompleting - it draws each pixel
               | because it statistically determines that <some color
               | value> is the best fit given the prompt and the prior
               | pixels it drew. It's a more advanced robot, but it's
               | still a robot.
        
               | gumballindie wrote:
               | AI systems are not "drawing a picture". They procedurally
               | generate output given vast amounts of input, much of
               | which has been stolen. Without it there would be no
               | credible output. I don't understand why AI cultists are
               | hell-bent on theft.
               | 
               | Llama, Ollama, etc. are not the issue here. Nor is AI.
               | The issue is theft for training.
        
               | emporas wrote:
               | The parent makes the mistake of assuming an adversarial
               | relationship between an mp3 download from a torrent and
               | the musician, or between a painter and an A.I.
               | statistical engine trained to reproduce his style.
               | 
               | That's not correct. We are now increasing the
               | capabilities of everyone creative, to achieve much more,
               | almost for free and instantly. The painter will now have
               | a $100 million film studio at his fingertips to create
               | movies. The musician will be able to make a high-quality
               | album just from his snoring.
               | 
               | One common misconception is that talent is not important
               | anymore. That's certainly not true. Talent is not going
               | anywhere. People who are not so talented can get results
               | which are OK or good enough, but talented people can
               | create magnificent art without even trying.
               | 
               | Also, "tab completing Rust code 70% of the time" is
               | breadcrumbs. I am working on making Rust code,
               | specifically, 100% generated.
        
             | kiba wrote:
             | Is it well understood that laws are man-made constructs
             | enforced at the point of violence?
        
             | pcthrowaway wrote:
             | > Funny how advocates against property rights have no issue
             | with corporations protecting their own
             | 
             | Are you just not aware that the most outspoken advocates
             | against private property (and rights to it, including
             | IP/copyright), also strongly oppose capitalism and the idea
             | of corporations?
        
           | echelon wrote:
           | Serfdom used to be a real thing.
           | 
           | Dowry payments used to be a real thing.
           | 
           | Butter churning used to be a real thing.
           | 
           | Copyright law is, and will be, mutable.
           | 
           | In just about a year, kids will be making entire Pixar movies
           | from home. (The content will probably be Skibidi toilet
           | related, but of Pixar scale and scope.)
           | 
           | How that reconciles with copyright, you've got me. It
           | doesn't. It's a domain mismatch. It's completely outmoded by
           | what's coming.
           | 
           | And the music industry is equally toast. I can already make
           | an excellent banger song about superhero rodents in under 30
           | seconds.
        
             | mplewis wrote:
             | None of what you described is quality content that anyone
             | actually wants to watch.
        
               | resolutebat wrote:
               | The YouTube watch history of any 13-year-old provides a
               | handy counterexample to your assertion.
               | 
               | Slightly less flippantly, the days when the nation
               | gathered around their TVs to all watch "Leave It To
               | Beaver" at 8pm are long gone; the media landscape has
               | been fragmenting for decades, and this is just the next
               | step. My kids don't watch TV shows, they follow
               | YouTubers.
        
               | distortedsignal wrote:
               | Interestingly, that doesn't matter for copyright.
               | 
               | A six-year-old filmmaker has as much claim to copyright
               | protection as Spielberg and Tarantino. Just because one
               | uploads to YouTube and one is paid millions of dollars by
               | a major movie studio, it doesn't mean that they're
               | different in the eyes of the law.
               | 
               | From what I understand, once a work is created, copyright
               | is assigned to that work's creator. That creator may then
               | license that work however they want. Quality doesn't
               | factor in.
        
           | jtriangle wrote:
           | Now, I don't have a college degree, but with my 60+ years of
           | life experience I can say: yes, copyright is a real thing.
           | It protects people who create from assholes and AI.
           | 
           | There, I've stolen your post. Did copyright protect you? No,
           | it did not. Copyright is an idea we came up with that
           | sounded good at the time, and it has become clear by now
           | that the original idea was deeply flawed in ways we could
           | never have predicted but can now see clearly in hindsight.
           | 
           | The truth is, intellectual property should be protected by
           | those creating it. Coke does a great job of this, as do many,
           | many other companies. We call these "trade secrets", but
           | ultimately the concept is the same. You're protecting the
           | work you deem worth protecting.
           | 
           | I don't buy the notion that copyright ever protected the
           | creator. What it really protected was the interests of the
           | entities who effectively enslave the creators via contract;
           | it is of no tangible benefit to the creators themselves. If
           | one truly cares about the artisans among us, one cannot
           | justify the existence of our ideas surrounding copyright.
           | 
           | Yes, removing those laws from our doctrine would cause
           | upheaval, as the market must then rebalance itself in the
           | absence of the artificial pressure we've put on it, but in
           | time all things find equilibrium again, and placing the value
           | and responsibility back on the individual is, in my mind,
           | simple human decency.
        
             | adamsilkey wrote:
             | > The truth is, intellectual property should be protected
             | by those creating it.
             | 
             | The whole point of government and laws and societal norms
             | is that not everyone has to be deeply involved and
             | specialized in protecting their rights. We default to
             | people doing the right thing and seek out specialists (e.g.
             | lawyers) when we're wronged.
        
               | jtriangle wrote:
               | Yeah, and it's a terrible system, because it effectively
               | paywalls anything remotely resembling justice.
        
         | throwboatyface wrote:
         | Stop signs are made up too. See how far you get driving through
         | all of them without stopping.
        
           | k__ wrote:
           | But they have a point.
           | 
           | Copyright law is already different from country to country.
        
             | throwboatyface wrote:
             | Some countries have roundabouts. Some countries let you
             | make a right turn on a red light. Traffic law isn't any
             | more universal than copyright law. All law is made up but
             | that doesn't make it less real.
        
               | visarga wrote:
               | But AI models are not limited to specific countries, they
               | can be downloaded and run anywhere. Train in Japan
               | (unrestrictive rules), then use in US.
        
               | throwboatyface wrote:
               | I can drive my car over the land border between two
               | countries and have to switch sides of the road. Or the
               | speed limit can go up/down. Or they can mandate different
               | safety equipment.
        
               | k__ wrote:
               | I didn't mean to imply that they aren't real.
               | 
               | I just wanted to say that laws are subject to change.
        
             | michael_nielsen wrote:
             | So are the rules of the road.
        
         | tenebrisalietum wrote:
         | U.S. copyright also has a constitutionally defined purpose:
         | 
         | `[The Congress shall have Power . . . ] To promote the Progress
         | of Science and useful Arts, by securing for limited Times to
         | Authors and Inventors the exclusive Right to their respective
         | Writings and Discoveries.`
         | 
         | Does the current scheme really promote the progress of science
         | and useful arts? Is the current length of copyright really
         | "limited Times"?
        
           | ronsor wrote:
           | > Is the current length of copyright really "limited Times"?
           | 
           | This was argued before, and the Supreme Court decided "Yes,
           | it is, somehow."
        
             | visarga wrote:
             | Choose any number of years < infinity. It's limited time.
        
         | ricardo81 wrote:
         | >What happens when you subvert copyright law is some guys with
         | guns, in either an overt way or a subtle one
         | 
         | That just seems like a sociopathic argument for circumventing
         | perfectly clear laws intended to protect people who put energy
         | into offering new ideas, and for diminishing what is already
         | delineated as right and wrong, in the blatant sense.
         | 
         | I'm not sure how your gun analogy carries over to fair use.
         | 
         | As with anyone who doesn't seem to respect copyright laws, I'd
         | say to them: give me a copy of everything useful you've ever
         | done in your life. Going by the same rule.
        
           | wahnfrieden wrote:
           | Most copyrights are not held by the people who expended
           | energy into the labor of producing new work, they're held by
           | the employers of that labor. Most people have no practical
           | choice in finding the means for their survival but through
           | wage labor and selling the ownership of what they produce.
        
             | ricardo81 wrote:
             | And that's fine, because presumably it's by consent.
        
           | distortedsignal wrote:
           | The "gun analogy" is the parent saying the the government
           | (who is supposed to be the only entity able to use force) is
           | supposed to protect IP. You're saying the same thing.
           | 
           | You shouldn't say anything to anyone who doesn't respect
           | copyright laws - or, rather, you shouldn't have to. Your tax
           | dollars, and the entity that you authorize to be the one to
           | use force where you live, should be able to say whatever you,
           | as the governed citizen, want to say to the person who
           | doesn't respect copyright.
        
             | ricardo81 wrote:
             | What parent? The comment I'm replying to has no parent and
             | the article has no mention of a gun
        
               | distortedsignal wrote:
               | Sorry, I meant the comment you replied to.
        
           | visarga wrote:
           | And if you look at open source it does exactly that. Most AI
           | models are open source/weights. People are giving away some
           | of their best work because they want to cooperate with others
           | to build things.
           | 
           | Scientific publications also share knowledge openly. Only
           | private journals are locking up papers, but they are losing
           | ground. At least AI papers are free to read and implement.
           | They even share code, datasets, and trained models.
           | 
           | I think the ethos of progress is with open culture not closed
           | copyrights.
        
             | ricardo81 wrote:
             | It may be a good ethos, but you can't simply undermine
             | people's life's work for your ethos, especially when there
             | are laws protecting that work.
             | 
             | Put yourself in their shoes. Maybe you use some AI
             | derivative work for your own life's work, and then, when
             | you've finally accomplished what you want to do, someone
             | else duplicates it, or uses the majority of your work?
             | 
             | What do you expect out of it? If you don't expect to be
             | paid, great - I guess you'll be spending X other hours
             | supporting yourself and possibly a family.
             | 
             | And if you do, you don't expect the hard work of others to
             | be compensated? Rhetorical question.
        
               | visarga wrote:
               | Use AI to build things for yourself. You get the benefit
               | from those things; it doesn't matter if anyone else
               | copies, as long as you're not selling content.
        
               | ricardo81 wrote:
               | Sounds fine to me, just if the AI abides.
        
               | IcyWindows wrote:
               | Technology has made people's jobs obsolete for ages.
        
               | ricardo81 wrote:
               | But there is a difference between technological
               | evolution and the wholesale taking of other people's
               | intellectual property.
               | 
               | You can go back to Watt's steam engine or some such. I
               | doubt he'd argue that his engine held intellectual
               | property on the manual labour people had done since time
               | immemorial. And yet it did make their jobs obsolete.
               | 
               | Maybe some of them died while his patents were still in
               | play. All the same, his invention, his idea, plays by the
               | rules.
        
         | kopecs wrote:
         | Do you not believe that societal constructs are real? What
         | makes (fiat) money a real resource and copyright an imaginary
         | one? What about this logic:
         | 
         | Debt is not a real thing. It's a fiction. Subverting contract
         | law is not like subverting the laws of physics. Simply don't
         | pay your bills! What happens when you subvert contract law is
         | some guys with guns, in either an overt way or a subtle one,
         | force a trade of your imaginary resource (money) for the real
         | resource (labor), through a contract.
         | 
         | It's a bit of clever misdirection to declare yourself "in debt"
         | or "debt free." Money is made up! It's fiction! It is whatever
         | we want it to be!
        
           | rngname22 wrote:
           | Really confused to see your comment flagged.
           | 
           | The parent comment basically says "copyright is not a
           | physical good or law, therefore we may disregard it".
           | 
           | As if the same couldn't be said for any verbal agreement,
           | sexual consent, property rights, whatever.
           | 
           | They ought to be arguing why this _particular_ verbal
           | agreement/social contract is not worth enforcing or
           | practicing, rather than dismissing the entire category.
        
             | carlosjobim wrote:
             | It can and should be said about every fiction, as a healthy
             | reminder to yourself. Fiat money, government and property
             | rights are complete abstractions, completely abused and
             | tarnished beyond all limits. A verbal agreement is on a
             | much closer level to reality. Even people who have never
             | heard of money or owned property will take a man up on his
             | word. As for sexual consent it is not something abstract at
             | all. It's not words. It is the will of a person and
             | something that can never be misunderstood.
        
         | stefan_ wrote:
         | This is funny: the moment _the absence of IP_ can benefit
         | large companies, it's just a fiction, a made-up concept, not
         | too serious at all.
         | 
         | Of course, go onto the OpenAI corporate pitch page and it will
         | proudly say "respects your IP by not training from it!" -
         | that's not the part we are looking to get rid of, apparently?
         | That's very real?
        
           | visarga wrote:
           | This is a clever trick, making out corporations to be the
           | beneficiary - who is the real beneficiary? The one who
           | prompts, because they can use the output of the model to
           | solve their problems. In the meantime, AI developers make
           | cents per 100K tokens and are mostly in the red.
           | 
           | Why not put the issue where it is - it's a debate about
           | public empowerment with AI vs incumbents protecting their
           | copyrights. Especially for open source models - this
           | situation is not about corporations. Yes, corporations are
           | the ones with deep pockets and easier to sue, but that's just
           | how lawsuits go, you don't waste money suing someone who
           | can't pay up.
           | 
           | What is more important in the next decades - public
           | empowerment or extending copyright to block training AI
           | models? Can we agree to limit copyright to exact
           | reproductions, or should it cover all content generated from
           | a model, even when it looks different from all training
           | examples?
           | 
           | I think going the copyright maximalist way will indirectly
           | hurt creatives because all their works will be checked with
           | the same tools we use to check AI. Anyone could be secretly
           | using generative models. The AI attribution methods will
           | reveal all sorts of things we don't like to see.
        
             | axus wrote:
             | The outrage as packaged by the media really does mirror the
             | anger against Napster from 25 years ago. I seem to recall
             | the small-time musicians were more excited about Napster,
             | though.
        
               | visarga wrote:
               | I remember when BitTorrent users were getting sued left
               | and right, and I was wondering how a P2P network could
               | allow downloading without revealing what was downloaded.
               | Now we have the answer - the LLM - you can download the
               | same LLM as anyone else and nobody knows what you're
               | getting from it.
        
         | jprete wrote:
         | Wait, what? Money is real? I thought it was just a
         | collectively-imagined pile of IOUs to keep track of how many
         | favors someone is owed.
        
         | efsavage wrote:
         | In the sense of Sapiens+, yes, copyright is a fiction, but so
         | is money, or any law or contract or agreement. In as much as it
         | affects people's lives and societies and economies it is
         | "real". The question of the article isn't about realness, it's
         | whether it can evolve or survive this latest shift.
         | 
         | +
         | https://en.wikipedia.org/wiki/Sapiens:_A_Brief_History_of_Hu...
        
           | eimrine wrote:
           | The obelisk symbol after the word Sapiens, put in the manner
           | it is being put after the names of dead people, embarrassed
           | me a little.
        
             | galdosdi wrote:
             | Can you please clarify this cryptic comment? The OP just
             | used a common variation of an asterisk, and I am really
             | confused about what your beef with it is.
             | 
             | "in the manner it is being put after the names of dead
             | people" is a total non sequitur in this context; how is
             | that in any way related to the discussion or anything the
             | OP said?
        
         | andrewmutz wrote:
         | I agree but I would word it a bit differently: Copyright isn't
         | some fixed, timeless thing. It has been continuously modified
         | to suit the times and to adapt to technological change.
         | 
         | Rather than try to reason about how existing copyright law
         | applies to AI models, we should be focused on changing
         | copyright law to work well in a world with AI.
         | 
         | We can balance the incentives of content creators with the
         | incentives of the users and creators of AI models. I'm
         | confident we can do it because we've done it every time in the
         | past when technology has changed, and AI models will be no
         | different.
        
           | ToucanLoucan wrote:
           | > It has been continuously modified to suit the times and to
           | adapt to technological change.
           | 
           | And to suit the whims of sufficiently large and influential
           | media conglomerates.
           | 
           | Yes, you, as an indie creator, have the exact same
           | intellectual property rights as Disney. But if Disney steals
           | something you made, who's going to win that fight?
        
         | Fauntleroy wrote:
         | This is not a solid argument because it suggests every single
         | social contract involved in human society is not worthwhile,
         | important, or "real".
        
         | NoboruWataya wrote:
         | Leaving aside the semantics of what is "real" - yes, that is
         | the point, that is why the article exists. "Is A.I. The Death
         | of Gravity" is a nonsense question, "Is A.I. the Death of I.P."
         | is not, precisely because humans get to decide the answer. We
         | have developed certain norms and institutions over hundreds of
         | years and now we must decide whether to scrap them in the name
         | of technological progress. How we decide the question will have
         | huge implications for how wealth and power are distributed in
         | our society.
        
         | fulladder wrote:
         | Contracts are not a real thing. It's a fiction. Subverting
         | contract law is not like subverting the laws of physics. What
         | happens when you subvert contract law is some guys with guns,
         | in either an overt way or a subtle one, force a trade of your
         | imaginary resource (a contract) for the real resource (money),
         | through licensing or fines.
        
         | huytersd wrote:
         | Well that's a stupid argument. A good argument is that AI is in
         | accordance with copyright law already since there are no
         | reproductions. In the rare cases where there are reproductions,
         | they can be taken down via DMCA, as we already do with
         | post-LLM output filters.
        
         | rangerelf wrote:
         | I think you're sowing the seeds to undermine your own
         | arguments, because, literally, everything of social utility is
         | a fiction: money, law, "private property", contracts, and so
         | forth.
         | 
         | All of them are needed for society to function, but at the same
         | time all of them are convenient fictions for us to build upon.
        
           | mlsu wrote:
           | My point is that in these discussions, don't make the mistake
           | of considering the artificial thing ("it's copyrightable/it's
           | not copyrightable"). Rather, consider the real thing ("You
           | will pay a licensing fee to use your GPU").
        
             | johnnyanmac wrote:
             | I'm still confused. Sure, a GPU is a real, physical thing.
             | But licenses and fees are no more or less fiction than
             | copyright. A license says "I have permission from the
             | company makers to use this product", similar to how
             | copyright is "you must get permission from me to use my
             | product".
             | 
             | But for the sake of this discussion: I don't think anyone
             | can assume those permissions when using a tool to generate
             | the base image. Ironically enough, it may make more sense
             | to have artists for concepting, copyrighting the character
             | or world so you obtain those permissions, and then using AI
             | to ramp up production. That's an inversion of what's
             | happening for early AI usage.
             | 
             | The main question is how much concepting is needed to
             | copyright a character? Making a few sketches in private?
             | Releasing a full work first with no AI on what you want to
             | copyright?
        
               | cush wrote:
               | > Copyright is not a real thing.
               | 
               | > - I think you're sowing the seeds to undermine your own
               | arguments
               | 
               | > licensing fee...
               | 
               | It took them exactly one reply. New record
        
             | karmakaze wrote:
             | It's only real if you believe it has value. If you choose
             | to say copyright is made up and doesn't have real value,
             | then it isn't real to you, but recognize that that's
             | arbitrary: others will believe in the value of copyright,
             | so it still has value in their eyes. E.g. taking a
             | copyright from someone (if that were possible) would cost
             | them like taking a fee costs them--losing something they
             | value.
        
           | sam0x17 wrote:
           | Sure, you can make a slippery slope argument, but there is
           | something substantially more "fictional" / artificial /
           | unnatural about copyright/IP than the currencies and other
           | social constructs you are equating it with.
           | 
           | Storytelling is as old as language itself and might even be
           | older than our particular species, and it is basically
           | copyright infringement. The same goes for tool-making, which
           | is thought to be one of the sparks that gave rise to our
           | species' vast intelligence. That's how imaginary copyright
           | and IP in general are: let's take the thing we've been doing
           | since literally the dawn of time - copying and sharing
           | information for free, building upon it and improving it -
           | and portray this practice as "unnatural" and "unlicensed,"
           | and in fact let's set up a restrictive framework under which
           | most of the technological and cultural achievements in human
           | history and before would have been impossible or severely
           | hampered. It reeks of artificiality, way more than money and
           | other things do.
           | 
           | It was created by corporations to make it easier for them to
           | form monopolies around unchallenged control over a particular
           | intellectual property or idea, and now that it is becoming
           | inconvenient for them with the advent of AI, they will
           | probably get rid of it or re-invent it in some way that even
           | further benefits them.
           | 
           | It also matters very little what we do because in the end we
           | will be out-competed by other countries like China that don't
           | draw these artificial intellectual lines that hamper
           | progress.
        
           | carlosjobim wrote:
           | "Society" is the most abstract illusion of them all. It
           | doesn't exist and has never existed anywhere, it has always
           | been a complete fiction and pretend. So, maintaining a bunch
           | of fictions in order to maintain another fiction becomes
           | circular reasoning.
        
         | belter wrote:
         | This post does not exist...
        
         | nyc_data_geek1 wrote:
         | Money is no more real than copyright.
        
         | kajecounterhack wrote:
         | As many other comments mention, copyright is as real as
         | contract law or money.
         | 
         | A more correct statement with the same thrust would be
         | "copyright is not _natural_ -- it's an artificial, bolt-on
         | construct."
         | 
         | And yet lots of artificial constructs are useful for
         | incentivizing / de-incentivizing massed behavior in a way that
         | is maximally beneficial to all. For sure, getting it right is
         | difficult but let's not write off all artificial constructs as
         | worthless or impossible to get right (especially when you just
         | have to get it right _enough_).
        
         | z7 wrote:
         | >Copyright is not a real thing. It's a fiction.
         | 
         | Maybe, but you can also say that about money or about being
         | married. Social fictions work as collective agreements on what
         | is seen as useful for society. One can obviously critique their
         | utility, but pointing out the fictional component isn't in
         | itself a criticism.
        
         | lordnacho wrote:
         | The thing that separates copyright from the other fictions
         | mentioned is that there are pretty good reasons to think this
         | particular fiction is not useful.
         | 
         | Property and debt have much longer histories and seem more
         | clearly defensible.
         | 
         | Of course there is one really big fiction out there that a lot
         | of people no longer really believe, but that doesn't mean
         | longevity isn't a thing to consider.
        
         | tivert wrote:
         | > Copyright is not a real thing. It's a fiction. Subverting
         | copyright law is not like subverting the laws of physics. What
         | happens when you subvert copyright law is some guys with
         | guns...
         | 
         | That reasoning applies to murder, too. After all, a bullet
         | through your mom's brain is just a physically allowable
         | reconfiguration of the position of some atoms.
         | 
         | So you're making a true point, but a useless one, since you're
         | operating on the wrong level of abstraction.
        
       | cratermoon wrote:
       | No, but media companies like Disney, Warner Bros, and Paramount
       | might find it a useful lever to squeeze independent creators even
       | further than they already are.
        
       | Jun8 wrote:
       | Here's a story from my days at Motorola Labs around 2009 that you
       | might find relevant: we were looking at ways for streaming
       | content to homes (Moto owned a set-top box business at that time)
       | and there were big debates about the future of streaming. The
       | position that prevailed was that consumers would not be able to
       | stream content to their heart's content en masse because content
       | providers would never let it. I distinctly remember a
       | presentation poster that said you couldn't stream because it's
       | against the law, hence use our solution, etc.
       | 
       | Moral of the story for me is that if there's adequate money to
       | be made, laws, protocols, etc. can _totally_ be changed.
        
       | artninja1988 wrote:
       | > Bellos, a comparative-literature professor at Princeton, and
       | Montagu, an intellectual-property lawyer, find this kind of rent-
       | seeking objectionable. They complain that corporate copyright
       | owners "strut the world stage as the new barons of the twenty-
       | first century," and they call copyright "the biggest money
       | machine the world has seen." They point out that, at a time when
       | corporate ownership of copyrights has boomed, the income of
       | authors, apart from a few superstars, has been falling. They
       | think that I.P. law is not a set of rules protecting individual
       | rights so much as a regulatory instrument for business.
       | 
       | Couldn't agree more
        
         | noitpmeder wrote:
         | Remove copyright protections and small-time authors have the
         | potential to earn EVEN LESS.
         | 
         | Why write books or any creative text-based content when, the
         | second it's available, it will be hoovered into the next AI
         | training cycle and can be reproduced in part or in whole by
         | users who almost certainly do not know who you are in the
         | first place?
        
           | artninja1988 wrote:
           | Is that even the case? Who would read a small-time writer's
           | fiction book through ChatGPT? Most of the use cases I've
           | seen for ChatGPT, e.g. translation, conversation,
           | brainstorming, reformulation, summarization, are wholly
           | different from the dataset of books. I would say that the
           | art generators have a stronger claim of competing with their
           | work than LLMs do.
        
             | noitpmeder wrote:
             | OpenAI and others are earning money (by selling
             | subscriptions, ...) with a product that has been trained
             | on, and utilizes, protected work to produce its results.
             | 
             | Personally, I think the only way forward is for these AI
             | companies to curate their datasets down to material that
             | is legal to use, e.g. only GitHub repos with licenses
             | stating such. Just because there is no license doesn't
             | mean it's free to hoover.
        
               | BobaFloutist wrote:
               | Or there could be a very cheap license for AI training
               | that doesn't allow other uses, much like radio licenses
               | for music.
               | 
               | They could buy a bunch of content created (or
               | specifically licensed) for purpose, the artists get paid,
               | and the artists that don't want their work to go into the
               | pipeline get what they want too.
               | 
               | But too many people consider the development of AI to be
               | such a moral imperative that no objections can possibly
               | be relevant.
        
       | mcguire wrote:
       | It's amazing how fast the "I want to get paid for my work"
       | attitude disappears when discussing _someone else_ getting paid
       | for their work.
        
         | huytersd wrote:
         | Well, it's kind of silly to say an AI is replicating your
         | style when your style is "I only use purple and yellow and
         | thick black outlines in my work".
        
       | kylehotchkiss wrote:
       | I used the new Instagram "AI" yesterday to "Add a famous duck
       | and his famous mouse friend" to my image, which gave me a
       | perfect Donald Duck in the background (I guess it forgot the
       | famous mouse). Surely lawyers just need to work on replication
       | prompts to demonstrate that the model was trained on a
       | copyrighted image and replicates it accurately enough to make
       | their case?
        
       | ur-whale wrote:
       | > Is A.I. The Death of I.P.?
       | 
       | One can only hope.
        
       | silveira wrote:
       | I find it revealing that some decades ago when teenagers were
       | copying mp3s in their rooms for their own enjoyment it was
       | piracy, crime, reprehensible, police, prisons, etc. When
       | corporations are doing mass copyright infringement, we are
       | talking about death of IP or changing the copyright laws to
       | accommodate them.
        
         | shredprez wrote:
         | This is the right take -- whatever your position on the future
         | of this technology or its costs and benefits for humanity, the
         | hypocrisy mentioned here shouldn't go unacknowledged.
        
           | mwhitfield wrote:
           | It's only "hypocrisy" if it's the same people saying it. The
           | media zeitgeist is not a person.
        
         | JumpCrisscross wrote:
         | > _when teenagers were copying mp3s in their rooms for their
         | own enjoyment it was piracy, crime, reprehensible, police,
         | prisons, etc. When corporations are doing mass copyright
         | infringement, we are talking about death of IP or changing the
         | copyright laws to accommodate them_
         | 
         | It's been over a decade since an individual was prosecuted for
         | digital piracy [1]. Longer since anyone was threatened with
         | jail time.
         | 
         | When it was new, both individuals and companies were
         | prosecuted. The tides shifted and law enforcement responded.
         | This isn't a story of different standards, but one of evolving
         | ones.
         | 
         | [1] https://en.wikipedia.org/wiki/Sony_BMG_v._Tenenbaum
        
           | ricardo81 wrote:
           | I think the GP's point was it's different rules for big co vs
           | individuals breaching the same idea.
        
             | JumpCrisscross wrote:
             | > _GP's point was it's different rules for big co vs
             | individuals breaching the same idea_
             | 
             | My point is it isn't. Individual copyright infringement is
             | virtually unenforced today. To the extent it was in the
             | last decade, the penalty was a fine. Meanwhile, OpenAI is
             | being sued by the _New York Times_ [1] and various writers
             | [2].
             | 
             | [1] https://www.nytimes.com/2023/12/27/business/media/new-
             | york-t...
             | 
             | [2] https://www.reuters.com/technology/more-writers-sue-
             | openai-c...
        
               | ricardo81 wrote:
               | I don't think you have a point; the original poster was
               | talking about individual indiscretions (20 years ago,
               | also) vs the wholesale scraping of billions of people's
               | copyrighted work for corporate gain, today.
               | 
               | Some lawsuits don't change that fact.
        
               | JumpCrisscross wrote:
               | > _individual indiscretions vs wholesale scraping of
               | billions of people's copyrighted work...some lawsuits
               | don't change that fact_
               | 
               | Over the last decade, individual indiscretions have _not_
               | been punished. Wholesale scraping _is_ being punished.
               | Some lawsuits _do_ challenge the hypothesis that
               | corporations' copyright violations are being treated
               | more leniently than individuals'.
        
               | ricardo81 wrote:
               | Why are you introducing this 'last decade' thing. Was the
               | decade before not relevant? The original point was not
               | about lawsuits anyway. Are they American lawsuits you're
               | talking about?
               | 
               | You seem to be conflating a whole bunch of things; it
               | seems like misdirection.
        
               | JumpCrisscross wrote:
               | > _Why are you introducing this 'last decade' thing. Was
               | the decade before not relevant?_
               | 
               | I'm arguing that policy preferences around copyright
               | infringement have changed _in general_. In general, in
               | the 1990s, copyright infringement meant "crime,
               | reprehensible, police, prisons, etc." for _both_
               | individuals and corporations. In general, in the past
               | decade, it's meant none of those things for _either_
               | individuals or corporations. Yet it's meant fines and
               | lawsuits for corporations, with virtually none I can
               | find, in America, aimed at individuals.
               | 
               | Also, LLMs were basically invented less than a decade ago
               | [1].
               | 
               | > _seems like misdirection_
               | 
               | "Please don't post insinuations about astroturfing,
               | shilling, brigading, foreign agents, and the like" [2].
               | 
               | [1] https://arxiv.org/abs/1706.03762
               | 
               | [2] https://news.ycombinator.com/newsguidelines.html
        
               | ricardo81 wrote:
               | Your link to the guidelines doesn't prevent me from
               | implying you're directing away from the point, nor
               | should it.
               | 
               | >I'm arguing that policy preferences around copyright
               | infringement have changed in general.
               | 
               | Perhaps they have, and I take your point on that. In the
               | end you were replying to me and the original poster, so
               | respect the spirit of those posts.
        
               | tanseydavid wrote:
               | I agree -- it does seem like intentional misdirection.
        
               | EdwardDiego wrote:
               | And plenty of people are threatened with lawsuits for
               | seeding on an ongoing basis, especially in some markets
               | like Germany where an entire industry stalks public
               | torrents looking for German IPs.
        
               | JumpCrisscross wrote:
               | > _plenty of people are threatened with lawsuits for
               | seeding on an ongoing basis, especially in some markets
               | like Germany_
               | 
               | Fair enough, I'm talking about America. To my knowledge,
               | individuals downloading pirated content have not been
               | threatened with lawsuits. And to the degree seeders have
               | been threatened, it's only that--threats. When was the
               | last distributor actually sued?
        
           | EdwardDiego wrote:
           | Sure, they don't prosecute, they just threaten you into a
           | settlement. Is that any better?
        
             | JumpCrisscross wrote:
             | > _they don't prosecute, they just threaten you into a
             | settlement. Is that any better?_
             | 
             | Objectively, yes. Paying a settlement in private is better
             | than being publicly prosecuted and then put in jail.
             | 
             | That said, it's still no cakewalk. Do you have a source for
             | individuals settling copyright claims? I'm not finding any
             | recent stories nor surveys.
        
               | reedciccio wrote:
               | Try downloading pirated movies and see how long it takes
               | for you to receive a letter from your ISP telling you to
               | stop or else... It's automatic; that's why you don't
               | hear about lawsuits anymore: they're not necessary, the
               | enforcement is semi-automatic now.
        
         | dageshi wrote:
         | We're not exactly tearing everything apart to combat piracy
         | nowadays even though piracy still exists, so I'm not sure
         | exactly what it reveals other than attitudes change?
        
         | pylua wrote:
         | Are they training it on music from top artists or their lyrics?
         | Maybe they haven't poked the wrong bear yet.
        
           | mtlmtlmtlmtl wrote:
           | Did some limited testing (ChatGPT 3.5).
           | 
           | If you ask it to complete text and give it the first line, it
           | will sometimes continue the lyrics. Seems to work best for
           | particularly famous stretches of lyrics, like Lose Yourself
           | by Eminem, Bohemian Rhapsody or Hurt.
           | 
           | When I told it to complete the text "Obie Trice, real name
           | no gimmicks", it said it was sorry but it couldn't reproduce
           | the lyrics to Without Me by Eminem, and offered to tell me
           | more about Obie Trice.
           | 
           | When I just asked it to reproduce the lyrics to Bohemian
           | Rhapsody, it once again refused and offered to analyse the
           | lyrics.
           | 
           | Seems like there are clearly song lyrics in the training
           | data, and that they've at least made attempts to prevent the
           | model from regurgitating them.
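           | 
           | A minimal sketch of this kind of test, assuming the official
           | openai Python package (v1 client) with an API key in the
           | environment; the model name and prompt string are only
           | examples, not necessarily what the test above used:
           | 
           |   from openai import OpenAI
           | 
           |   client = OpenAI()  # reads OPENAI_API_KEY from the env
           | 
           |   # Ask the model to continue a famous opening line, then
           |   # compare the continuation against the real lyrics.
           |   prompt = "Complete this text: Is this the real life?"
           |   resp = client.chat.completions.create(
           |       model="gpt-3.5-turbo",
           |       messages=[{"role": "user", "content": prompt}],
           |       temperature=0,
           |   )
           |   print(resp.choices[0].message.content)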
        
         | rhystmills wrote:
         | There are plenty of articles from major news orgs about how
         | we shouldn't accept LLMs infringing copyright. The New York
         | Times is suing OpenAI.
        
           | tanseydavid wrote:
           | >> There are plenty of articles from major news orgs
           | 
           | Opinion articles.
        
         | supertofu wrote:
         | Remember those very dramatic warnings before movie trailers
         | equating piracy to real-world theft (You wouldn't steal a car;
         | you wouldn't rob a bank; piracy is a crime...)? Seems laughable
         | now.
        
         | happytiger wrote:
         | Rules for thee and not for me? Perhaps V, perhaps.
         | 
         | I want to gently reframe the debate, for while I agree with the
         | hypocrisy it rather misses a very key point.
         | 
         | Intellectual property law, as a bedrock principle, _explicitly
         | doesn't recognize nonhuman creators._ So the heart of the
         | issue isn't changing copyright laws or the death of IP, but
         | _what to do with non-human creativity?_ This is an interesting
         | issue now, but most of the debates and options on the table
         | thus far won't survive an AGI, let alone a world full of
         | advanced AGIs.
         | 
         | But it's a critical distinction between "so now it doesn't
         | matter when it was theft before" and "what do we do with non-
         | human intelligence when our entire system of creativity
         | protection is built around humans and non-humans now exist?"
         | 
         | It's a paradigm shift that gets somewhat denied by the
         | hypocrisy argument. Most people look at AI as "technology" we
         | have developed, but if any science fiction writers are right
         | it's actually a bona fide digital intelligence that's getting
         | developed here -- essentially the possible digital twin of
         | human intelligence -- and that's a whole different set of
         | considerations.
        
         | leotravis10 wrote:
         | This is the absolutely correct take, especially with the
         | uncertainty and legal actions (copyright infringement
         | lawsuits) against AI right now.
        
       | ricardo81 wrote:
       | Perhaps another angle is that the incoming and young workforce
       | find it extremely convenient to take everyone else's work and
       | make it their own.
       | 
       | It is nature after all, to spend the least amount of energy to
       | attain a goal.
       | 
       | It would be convenient to accept it as par for the course.
       | 
       | The problem seems to be that copyright laws are basically
       | ignored nowadays.
        
       | rikroots wrote:
       | If anyone wants to train their AI thing on my poetic output ...
       | they're more than welcome to![1] I've been working for decades to
       | get people's eyeballs bleeding from reading (too much of) my
       | poetry; the thought that someone would even want to train a
       | machine to churn out eyeball-bleeding poetry influenced by my
       | work - it makes me happy!
       | 
       | [1] - https://rikverse2020.rikweb.org.uk/blog/copyrights
       | 
       | (sarcasm only half-intended)
        
       ___________________________________________________________________
       (page generated 2024-01-17 23:01 UTC)