[HN Gopher] Who knew the first AI battles would be fought by art...
___________________________________________________________________
Who knew the first AI battles would be fought by artists?
Author : dredmorbius
Score : 312 points
Date : 2022-12-15 11:49 UTC (11 hours ago)
(HTM) web link (vmst.io)
(TXT) w3m dump (vmst.io)
| cardanome wrote:
| I don't see the point. There is copyright (and in that regard
| most of these images are fine), and then there is trademark,
| which they might violate.
|
| Regardless, the human generating and publishing these images is
| obviously responsible for ensuring they are not violating any IP
| rights. So they might get sued by Disney. I don't get why the
| AI companies would be affected in any way. Disney is not suing
| Blender if I render an image of Mickey Mouse with it.
|
| Though I am sure that artists might find a likely ally in Disney
| against the "AI"s when they tell them about their idea of making
| art styles copyrightable. Being able to monopolize art styles
| would indeed be a dream come true for those huge corporations.
| xg15 wrote:
| If those mouse images are generated, that implies that Disney
| content is _already_ part of the training data and models.
|
| So in effect, they are pitting Disney's understanding of
| copyright (maximally strict) against that of the AI companies
| (maximally loose).
|
| Even if it's technically the responsibility of the user not to
| publish generated images that contain copyrighted content, I
| can't imagine that Disney is very happy with a situation where
| everyone can download Stable Diffusion and generate their own
| arbitrary artwork of Disney characters in a few minutes.
|
| So that strategy might actually work. I wish them good luck and
| will restock my popcorn reserves just in case :)
|
| The problem I see though is that both sides are billion dollar
| companies - and there is probably a lot of interest in AI tech
| within Disney themselves. So it might just as well happen that
| both sides find some kind of agreement that's beneficial for
| both of them and leaves the artists holding the bag.
| wnkrshm wrote:
| You can search the LAION-5B CLIP space and find a lot of
| Mickey in it: lots of fan art mixed in with photos of actual
| merch. If you search with a high aesthetic score, you'll find
| lots of actual Disney illustrations etc. in the neighbourhood. [0]
|
| [0] https://rom1504.github.io/clip-retrieval/
| xg15 wrote:
| Yes, and probably the copyrighted art of lots of other
| artists as well. That's the entire point.
| astrange wrote:
| > If those mouse images are generated, that implies that
| Disney content is already part of the training data and
| models.
|
| It doesn't mean that. You could "find" Mickey in the latent
| space of any model using textual inversion and an hour of GPU
| time. He's just a few shapes.
|
| (Main example: the most popular artist StableDiffusion 1
| users like to imitate is not in the StableDiffusion training
| images. His name just happens to work in prompts by
| coincidence.)
| Taywee wrote:
| If you can find a copyrighted work in that model that
| wasn't put there with permission, then why would that model
| and its output not violate the copyright?
| mcv wrote:
| The idea behind that is probably that any artist learns
| from seeing other artists' copyrighted art, even if
| they're not allowed to reproduce it. This is easily seen
| from the fact that art goes through fashions; artists
| copy styles and ideas from each other and expand on that.
|
| Of course that probably means that those copyrighted
| images exist in some encoded form in the data or neural
| network of the AI, and also in our brain. Is that legal?
| With humans it's unavoidable, but that doesn't have to
| mean that it's also legal for AI. But even if those
| copyrighted images exist in some form in our brains, we
| know not to reproduce them and pass them off as original.
| The AI has no such restraint. Maybe it needs a feedback
| mechanism to ensure its generated images don't look too
| much like copyrighted images from its data set. Maybe art-AI
| necessarily also has to become a bit of a legal-AI.
| astrange wrote:
| https://en.wikipedia.org/wiki/The_Library_of_Babel
|
| A latent space that contains every image contains every
| copyrighted image. But the concept of sRGB is not
| copyrighted by Disney just yet.
| Taywee wrote:
| Sure, but this isn't philosophy. An AI model that
| contains every image is a copyright derivative of all
| those images and so is the output generated from it. It's
| not an abstract concept or a human brain. It's a pile of
| real binary data generated from real input.
| astrange wrote:
| StableDiffusion is 4GB which is approximately two bytes
| per training image. That's not very derivative, it's
| actual generalization.
|
| "Mickey" does work as a prompt, but if they took that
| word out of the text encoder he'd still be there in the
| latent space, and it's not hard to find a way to
| construct him out of a few circles and a pair of red
| shorts.
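The bytes-per-image figure above is easy to sanity-check (a rough sketch; the ~2.3 billion image count for the LAION-2B training set is an assumption, not a figure given in this thread):

```python
# Back-of-envelope check: how many bytes of model weight exist
# per training image, if a ~4 GB checkpoint "contained" them all?
model_bytes = 4 * 1024**3          # ~4 GiB Stable Diffusion checkpoint
training_images = 2_300_000_000    # approximate LAION-2B image count (assumption)

bytes_per_image = model_bytes / training_images
print(f"~{bytes_per_image:.2f} bytes per training image")
```

Even a heavily compressed thumbnail needs kilobytes, so the weights cannot store the images verbatim; whether that settles the derivative-work question is the legal debate running through this thread.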
| mcv wrote:
| How do you get that coincidence? To be able to accurately
| respond to the cue of an artist's name, it has to know the
| artist, doesn't it?
|
| In any case, in the example images here, the AI clearly
| knew who Mickey is and used that to generate Mickey Mouse
| images. Mickey has got to be in the training data.
| esrauch wrote:
| For other artist cases the corpus can include many images
| that includes a description with phrases like "inspired
| by Banksy". Then the model can learn to generate images
| in the style of Banksy without having any copyrighted
| images by Banksy in the training set.
|
| The Mickey Mouse case though is obviously bs, the
| training data definitely does just have tons of
| infringing examples of Mickey Mouse, it didn't somehow
| reinvent the exact image of him from first principles.
| taeric wrote:
| This is a bit silly, though? Search Google images for Mickey
| Mouse, is the results page a possible liability for Google?
| Why not?
|
| Go to a baker and commission a Mickey Mouse cake. Is that a
| violation if the bakery didn't advertise it? (To note, a
| bakery can't advertise it due to trademark, not copyright.
| Right?)
|
| For that matter, any privately commissioned art? Is that
| really what artists want to lock away?
| crote wrote:
| > Is the results page a possible liability for Google?
|
| Absolutely. Google previously had a direct link to the
| full-size image, but it has removed this due to potential
| legal issues. See [0].
|
| > Is that a violation if the bakery didn't advertise it?
|
| According to Disney, it is. See [1].
|
| > Any privately commissioned art?
|
| Not _any_ art, no. Only that which uses IP /material they
| do not have a license to.
|
| [0]: https://www.ghacks.net/2018/02/12/say-goodbye-to-the-
| view-im...
|
| [1]: https://en.wikipedia.org/wiki/Cake_copyright#Copyright
| _of_ar...
| taeric wrote:
| I started to go down the rabbit hole of commissioned fan
| art. To say that that is a quagmire is an understatement.
| :(
| Macha wrote:
| I mean, isn't most of that "It's trademark infringement,
| but it is both financially tedious and a PR disaster to
| go after any but the most prominent cases"
|
| Which is why e.g. Bethesda is not going to slap you for
| your Mr House or Pip-Boy fanart, but will slap the
| projects that recreate Fallout 3 in engine X.
| wongarsu wrote:
| The right to citation is already part of the 1886 Berne
| Convention, a precedent that enables services like Google
| images.
|
| The matters of the baker and the privately commissioned art
| are more complicated. The artist and baker hold copyright
| for their creations, but their products are also derived
| from copyrighted work, so Disney also has rights here [1].
| This is just usually not enforced by copyright holders,
| because who in their right mind would punish free
| marketing?
|
| 1: https://en.wikipedia.org/wiki/Derivative_work
| sigmoid10 wrote:
| >is the results page a possible liability for Google?
|
| That's actually a tricky question and lengthy court battles
| were held over this in both the US and Europe. In the end,
| all courts decided that the image result page is
| questionable when it comes to copyright, but generally
| covered by fair use. The question is how far fair use goes
| when people are using the data in derivative work. Google
| specifically added licensing info about images to further
| cover their back, but this whole fair use stuff gets really
| murky when you have automatic scrapers using Google Images
| to train AIs, which in turn create art for sale eventually.
| There are a lot of actors in that process that profit
| indirectly from the provided images. This will probably
| once again fall back to the courts sooner or later.
| red_trumpet wrote:
| Europe has no concept of Fair Use. How did the courts
| argue there?
| sigmoid10 wrote:
| Fair use is just a limitation of copyright in case of
| public interest. Europe has very similar exclusions, even
| though they are spelled out more concretely. But they
| don't make this particular issue any less opaque.
| FinnKuhn wrote:
| Not a lawyer, but from how I understand it, the German
| courts argued that if you don't use any technology to
| prevent web crawlers from accessing the pictures on your
| website, you need to accept that they are used for preview
| images (which is what the Google picture search technically
| is), as this is a usual use case.
|
| -> here is the actual judgement though:
| https://juris.bundesgerichtshof.de/cgi-
| bin/rechtsprechung/do...
| logifail wrote:
| > Search Google images for Mickey Mouse, is the results
| page a possible liability for Google?
|
| In 2018[0], didn't Getty force Google to change how Google
| Images presented results, following a lawsuit in 2016[1]?
|
| [0] https://arstechnica.com/gadgets/2018/02/internet-rages-
| after... [1] https://arstechnica.com/tech-
| policy/2016/04/google-eu-antitr...
| jameshart wrote:
| There's nothing wrong with the model knowing what Mickey
| Mouse looks like.
|
| There are non-infringing use cases for generating images
| containing Mickey Mouse - not least, Disney themselves
| produce thousands of images containing the mouse's likeness
| every year; but parody use cases exist as well.
|
| But even if you are just using SD to generate images and
| want to make sure to avoid treading on Disney's toes, the AI
| would need to know what Mickey Mouse looks like in order to
| _avoid_ infringing the trademark, too. You can already feed
| it negative weights if you want to get 'cartoon mouse' but
| not have it look like Mickey.
|
| The AI draws what you tell it to draw. You get to choose
| whether or not to publish the result (the AI doesn't
| automatically share its results with the world). You have the
| ultimate liability and credit for any images so produced.
| xg15 wrote:
| Not a lawyer (and certainly no disney lawyer), but my
| understanding was that copyright is specifically concerned
| with _how_ an image is created, less so _that_ it is
| created. Which is why you can copyright certain recordings
| that only consist of silence. It just prevents you from
| using _this_ record to base your own record of silence on;
| it doesn't generally block you from recording silence.
|
| In the same way, making the model deliberately unable to
| generate Mickey Mouse images would be much more far-reaching
| than just removing Mickey imagery from the training set.
| jameshart wrote:
| Most Mickey Mouse image usage problems will be trademark
| infringement not copyright.
|
| Copyright infringement does generally require you to have
| been _aware_ of the work you were copying. So for sure
| there's an issue with using AI to generate art: you could
| use the tool to generate an image which you think looks
| original, because you are unaware of a similar original
| work, so _you_ could not be guilty of copyright
| infringement - but if the AI model was trained on a
| dataset that includes a similar original copyrighted
| work, it obviously seems like someone has infringed
| something there.
|
| But that's not what we're talking about in the case of
| mickey mouse imagery, is it? You're not asking for images
| of 'utterly original uncopyrighted untrademarked cartoon
| mouse with big ears' and then unknowingly publishing a
| mouse picture that the evil AI copied from Disney without
| your knowledge.
| xg15 wrote:
| > _But that's not what we're talking about in the case
| of mickey mouse imagery, is it? You're not asking for
| images of 'utterly original uncopyrighted untrademarked
| cartoon mouse with big ears' and then unknowingly
| publishing a mouse picture that the evil AI copied from
| Disney without your knowledge._
|
| I think this is exactly the problem that many artists
| have with image generators. Yes, we could all easily
| identify if a generated artwork contained popular Disney
| characters - but that's because it's Disney, owners of
| some of the most well-known IP in the world. The same
| isn't true for small artists: There is a real risk that a
| model reproduces parts of a lesser known copyrighted work
| and the user doesn't realise it.
|
| I think this is what artists are protesting: Their works
| have been used as training data and will now be parts of
| countless generated images, all with no permission and no
| compensation.
| TchoBeer wrote:
| This already happens all the time in the current status
| quo with no need for AI.
| jameshart wrote:
| Right.
|
| So Disney don't need to worry about AI art tools - so
| 'attacking' them with such tools does nothing.
| palata wrote:
| Well Disney would probably sue Blender if there was a "generate
| Mickey Mouse model" button in it. It's not a totally fair
| comparison.
| poulpy123 wrote:
| But you can already make mickey mouse models, and people do
| it all the time https://www.youtube.com/watch?v=CqVXoGCTfuk&a
| b_channel=Ashle...
| subw00f wrote:
| I'm sure it's easy to write an addon for that.
| gl-prod wrote:
| Then the author of that addon would be liable. Not blender.
| SiempreViernes wrote:
| Try it and see if it is blender or you as the addon creator
| that gets sued.
| idlehand wrote:
| These AI models are closer to Google in that regard: yes,
| you can instruct them to generate a Mickey Mouse image, but
| you can instruct them to generate any kind of image, just
| like you can search for anything on Google, including
| Mickey Mouse. When using these models you are essentially
| performing a search in the model weights.
| thih9 wrote:
| Google Image results have a note that says: "Images may be
| subject to copyright".
| Tepix wrote:
| It boils down to this: Do you need permission if you train your
| AI model with copyrighted things or not?
| residualmind wrote:
| I would argue that if people are allowed to see your art for
| free, AI models should be allowed to as well.
| bakugo wrote:
| AI models are not people.
| dotancohen wrote:
| Bad argument. Being allowed to see art and being allowed to
| copy art are two different things. Being allowed to _copy_
| is a reserved _right_, that's the root of the word
| copyright.
| CuriouslyC wrote:
| Bad argument. Copying art is not the crime, distributing
| the copied art is the crime. The Disney Gestapo can't
| send storm troopers to your house if your kid draws a
| perfect rendition of Mickey, but they can if your kid
| draws a bunch of perfect renditions and sells them
| online.
| concordDance wrote:
| Except they aren't copying it, but instead drawing
| inspiration from it. Which all humans have done forever.
| AlexandrB wrote:
| This falls apart for 2 reasons. First, I don't think
| there's any technical definition of "inspiration" that
| applies to a deeply nested model of numerical weights.
| It's a machine. A hammer does not draw inspiration from
| nails that have been hammered in before. Second, an AI is
| not a human under the law and there's no reason to think
| that an activity that would be considered
| "transformative" (e.g. learning then painting something
| similar) when done by a human would still be considered
| such if performed by an AI.
| mejutoco wrote:
| Following your logic: if AI is like humans why don't we
| tax its work?
| CuriouslyC wrote:
| If an AI ever gets paid for the work it does, I'm sure we
| will.
| peoplefromibiza wrote:
| people are allowed to take a walk in the park, so why aren't
| cars or tanks or bulldozers?
| residualmind wrote:
| A bulldozer destroys the park and other people's ability
| to enjoy it -- active, destructive. Passively training a
| model on an artwork does not change the art in the
| slightest -- passive, non-destructive.
|
| Mind you, this is not talking about the usage rights of
| images generated from such a model, that's a completely
| different story and a legal one.
| 6P58r3MXJSLi wrote:
| > A bulldozer destroys the park and other people's
| ability to enjoy it
|
| hear hear...
|
| > Passively training a model on an artwork does not
| change the art in the slightest
|
| copyright holders, I mean individual authors, people who
| actually produced the content being used, disagree.
|
| They say AI is like a bulldozer destroying the park to
| them.
|
| Which technically is true, it's a machine that someone
| (some interested party maybe?) is trying to disguise as a
| human, doing human stuff.
|
| But it's not.
|
| > passive, non-destructive
|
| Passive, non-destructive, in this context means
|
| - passive: people send the images to you, you don't go
| looking for them
|
| - non-destructive: people authorized you, otherwise it's
| destructive of their rights.
| gt565k wrote:
| Ehhh, that's like saying an artist who studies other art
| pieces and then creates something using combined techniques
| and styles from those pieces is what? Now liable?
| Taywee wrote:
| An AI is not a person. Automated transformation does not
| remove the original copyright, otherwise decompilers would
| as well. That the process is similar to a real person is
| not actually important, because it's still an automated
| transformation by a computer program.
|
| We might be able to argue that a computer program taking
| art as input and automatically generating art as output is
| the same as an artist once general intelligence is reached;
| until then, it's still a machine transformation and should
| be treated as such.
|
| AI shouldn't be a legal avenue for copyright laundering.
| jefftk wrote:
| _> Automated transformation does not remove the original
| copyright_
|
| Automated transformation is not guaranteed to remove the
| original copyright, and for simple transformations it
| won't, but it's an open question (no legal precedent,
| different lawyers interpreting the law differently)
| whether what these models are doing is so transformative
| that their output (when used normally, not trying to
| reproduce a specific input image) passes the fair use
| criteria.
| idlehand wrote:
| Now we are in Ship of Theseus territory. If I downsample
| an image and convert it into a tiny delta in the model
| weights, from which the original image can never be
| recovered, is that infringement?
| CyanBird wrote:
| Except the machine is not automatically generating an
| input
|
| > automatically generating art as output
|
| The user is navigating the latent space to obtain said
| output, I don't know if that's transformative or not, but
| it is an important distinction
|
| If the program were wholly automated, as in it had a random
| number/word generator added to it and no navigation of
| the latent space by users happened, then yeah, I would
| agree, but that's not the case, at least so far as ML
| algos like Midjourney or Stable Diffusion are concerned.
| Taywee wrote:
| That's still automated in the same way that a compiler is
| automated. A compiler doesn't remove the copyright,
| neither does a decompiler. This isn't different enough to
| have different copyright rules. There are more layers to
| the transformation, but it's still a program with input
| and output. I'm not sure what you mean by "navigation of
| latent space". It's generating a model from copyrighted
| input and then using that model and more input to
| generate output. It's a machine transformation in more
| steps.
| Retric wrote:
| The output is probably irrelevant here, the model itself
| is a derivative work from a copyright standpoint.
|
| Going painting > raw photo (derivative work), raw photo >
| jpg (derivative work), jpg > model (derivative work),
| model > image (derivative work). At best you can make a
| fair use argument at that last step, but that falls apart
| if the resulting images harm the market for the original
| work.
| strken wrote:
| The question for me is whether "jpg > model" is
| derivative or transformative. It's not clear it would be
| derivative.
| Retric wrote:
| You seem to be confused: transformative works are still
| derivative works. Being sufficiently transformative can
| allow for a fair use exception but you may need a court
| case to prove something is sufficiently transformative to
| qualify.
| strken wrote:
| Sorry, yes.
| PeterisP wrote:
| It's not clear at all whether the model is a derivative
| work from a copyright standpoint. Maybe it is, maybe it
| is not - it's definitely not settled, the law isn't
| very explicit and, as far as I know, there is no
| reasonable precedent yet - and arguably _that_ would be
| one of the key issues decided (and set as precedent) in
| these first court battles. I also wouldn't be surprised
| if it eventually doesn't matter what current law says, as
| the major tech companies may lobby to pass a law that
| explicitly defines the rules of the game; I mean, if Disney
| could lobby for multiple copyright laws to protect their
| interests, then the ML-heavy tech companies, being much
| larger and wealthier than Disney, can do it as well.
|
| But currently, first, there is a reasonable argument that
| the model weights may be not copyrightable at all - it
| doesn't really fit the criteria of what copyright law
| protects, no creativity was used in making them, etc, in
| which case it can't be a derivative work and is
| effectively outside the scope of copyright law. Second,
| there is a reasonable argument that the model is a
| collection of facts about copyrighted works, equivalent
| to early (pre-computer) statistical ngram language models
| of copyrighted books used in e.g. lexicography - for
| which we have solid old legal precedent that creating
| such models is not creating a derivative work (again, a
| collection of facts isn't copyrightable) and thus can be
| done against the wishes of the authors.
|
| Fair use criteria comes into play as conditions when it
| is permissible to violate the exclusive rights of the
| authors. However, if the model is not legally considered
| a derivative work according to copyright law criteria,
| then fair use conditions don't matter because in that
| case copyright law does not assert that making them is
| somehow restricted.
|
| Note that in this case the resulting image might still be
| considered derivative work of an original image, even if
| the "tool-in-the-middle" is not derivative work.
| Retric wrote:
| You seem to be confused as to nomenclature: transformative
| works are still derivative works. Being
| sufficiently transformative can allow for a fair use
| exception, the distinction is important because you can't
| tell if something is sufficiently transformative without
| a court case.
|
| Also, a jpg seemingly fits your definition, as "no
| creativity was used in making them, etc", but it clearly
| embodies the original work's creativity. Similarly, a model
| can't be trained on random data; it needs to extract
| information from its training data to be useful.
|
| The specific choice of algorithm used to extract
| information doesn't change if something is derivative.
| jamesdwilson wrote:
| finally, a good use for a blockchain: decentralized
| defeating of copyright
| Double_a_92 wrote:
| That's still the question that it boils down to, even if
| the answer is a "No".
| TaupeRanger wrote:
| Not at all, for many reasons.
|
| 1) The artist is not literally copying the copyrighted
| pixel data into their "system" for training.
|
| 2) An individual artist is not a multi-billion-dollar
| company with a computer system that spits out art rapidly
| using copyrighted pixel data. A categorical difference.
| brushfoot wrote:
| Those reasons don't make sense to me.
|
| On 1, human artists _are_ copying copyrighted pixel data
| into their system for training. That system is the brain.
| It's organic RAM.
|
| On 2, money shouldn't make a difference. Jim Carrey
| should still be allowed to paint even though he's rich.
|
| If Jim uses Photoshop instead of brushes, he can spit out
| the style ideas he's copied and transformed in his brain
| more rapidly - but he should still be allowed to do it.
| AlexandrB wrote:
| I think the parent's point about (2) wasn't about money,
| but _category_. A human is a human and has rights, an AI
| model is a tool and does not have rights. The two would
| not be treated equally under the law in any other
| circumstances, so why would you equate them when
| discussing copyright?
| astrange wrote:
| > On 1, human artists are copying copyrighted pixel data
| into their system for training. That system is the brain.
| It's organic RAM.
|
| They probably aren't doing that. Studying the production
| methods and WIPs is more useful for a human. (ML models
| basically guess how to make images until they produce one
| that "looks like" something you show it.)
| Mezzie wrote:
| They do sometimes, or at least they used to. I have some
| (very limited) visual art training, and one of the things
| I/we did in class was manually mash up already existing
| works. In my case I smushed the Persistence of Memory and
| the Arnolfini portrait. It was pretty clear copycat; the
| work was divided into squares and I poorly replicated the
| Arnolfini Portrait from square to square.
| Taywee wrote:
| A human can grow and learn based on their own experiences
| separate from their art image input. They'll sometimes
| get creative and develop their own unique style. Through
| all analogies, the AI is still a program with input and
| output. Point 1 doesn't fit for the same reason it
| doesn't work for any compiler. Until AI can innovate
| itself and hold its own copyright, it's still a machine
| transformation.
| endorphinbomber wrote:
| Have to disagree with point 1: often this is what artists
| are doing. More strictly in music (literally playing
| others' songs), less strictly in drawing. But copying,
| incorporating and developing are some of the core
| foundations of art.
| astrange wrote:
| Diffusion models don't copy the pixels you show them. You
| cannot generally tell which training images inspired
| which output images.
|
| (That's as opposed to a large language model, which does
| memorize text.)
|
| Also, you can train it to imitate an artist's style just
| by showing it textual descriptions of the style. It
| doesn't have to see any images.
| mejutoco wrote:
| > Also, you can train it to imitate an artist's style
| just by showing it textual descriptions of the style. It
| doesn't have to see any images.
|
| And the weights. The weights it has learned come
| originally from the images.
| alt227 wrote:
| Depends if the artist creates something new which looks
| exactly like one of the things he has studied.
| Tepix wrote:
| That's like saying that creating a thing that looks at one
| artist's artwork and then copies her unique style ad
| infinitum may need permission first.
| pigsty wrote:
| Copying an artist's style is very much not considered
| copyright infringement and is how artists learn.
|
| Copying a work itself can be copyright infringement if
| it's very close to the original to the point people may
| think they're the same work.
| Gigachad wrote:
| You don't need permission. Style is not an owned thing.
| bakugo wrote:
| No, it's not the same thing at all, in fact it's entirely
| unrelated.
|
| Say it with me: Computer algorithms are NOT people. They
| should NOT have the same rights as people.
| ben_w wrote:
| If you do need permission, is Page Rank a copyright
| infringing AI, or just a sparkling matrix multiplication
| derived entirely from everyone else's work?
| peoplefromibiza wrote:
| Page Rank doesn't reproduce any content claiming it's new.
|
| You can, however, disallow Google from indexing your
| content using robots.txt, a meta tag in the HTML, or an
| HTTP header.
|
| Or you can ask Google to remove it from their indexes.
|
| Your content will disappear from then on.
|
| You can't un-train what's already been trained.
|
| You can't disallow scraping for training.
|
| The damage is already done and it's irreversible.
|
| It's like trying to unbomb Hiroshima.
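For reference, the existing opt-out mechanisms mentioned above look like this (standard robots-exclusion directives; note they govern crawling and search indexing, and none of them currently expresses "do not train on this"):

```
# robots.txt - block a crawler from fetching images at all
User-agent: Googlebot
Disallow: /images/

<!-- HTML meta tag - allow fetching, disallow indexing -->
<meta name="robots" content="noindex">

# HTTP response header - same effect as the meta tag
X-Robots-Tag: noindex
```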
| CyanBird wrote:
| That's actually interesting, adding Metadata to the
| images as a check for allowing or disallowing ai usage
|
| That might be a good way to go about it
| ben_w wrote:
| If you can make the metadata survive cropping, format
| shifts, and screenshots.
|
| Can probably do all that well enough (_probably_ doesn't
| need to be perfect) by leaning on FAANG, with or
| without legislation.
|
| But: opt-in by default, or opt-out by default?
| Lalabadie wrote:
| The output of PageRank for a given page is not another new
| page that's curiously close in style and execution but
| laundered of IP concerns.
|
| A tool that catalogues attributed links can't really be
| evaluated the same way as a pastiche machine.
|
| You'd be much closer using the example of Google's first-
| page answer snippets, which are pulled out of a site's
| content with minimal attribution.
| jefftk wrote:
| Which is also what the GitHub co-pilot suit is about:
| https://githubcopilotlitigation.com
|
| If you have views on whether they'll win, the prediction
| market is currently at 49%:
| https://manifold.markets/JeffKaufman/will-the-github-
| copilot...
| cardanome wrote:
| As a human, I can use whatever I want for reference for my
| drawings. Including copyrighted material.
|
| Now, as for training "AI" models, who knows. You can argue
| it is the same thing a human is doing, or you could argue it
| is something new and different that should be under
| different rules. Regardless, the current copyright laws were
| written before "AI" models were in widespread use, so
| whatever is allowed or not is more of a historical accident.
|
| So the discussion needs to be about the intention of
| copyright laws and what SHOULD be.
| vgatherps wrote:
| This would be a fairly novel law as it would legislate not
| just the release of an AI but the training as well? That
| would imply legislating what linear algebra is legal and
| illegal to do, no?
|
| And practically speaking, putting aside whether a
| government should even be able to legislate such things,
| enforcing such a law would be near impossible without wild
| privacy violations.
| CyanBird wrote:
| > That would imply legislating what linear algebra is
| legal and illegal to do, no?
|
| No, it would just legislate which images are and are not
| in the training data to be parsed; artists want a
| copyright which makes their images unusable for
| machine-learning derivative works.
|
| The trick here is that eventually the algorithms will get
| good enough that it won't be necessary for said images to
| even be on the training data in the first place, but we
| can imagine that artists would be OK with that
| astrange wrote:
| > The trick here is that eventually the algorithms will
| get good enough that it won't be necessary for said
| images to even be on the training data in the first
| place, but we can imagine that artists would be OK with
| that
|
| They shouldn't be OK with that and they probably aren't.
| That's a much worse problem for them!
|
| The reason they're complaining about copyright is most
| likely coping because this is what they're actually
| concerned about.
| stale2002 wrote:
| > but we can imagine that artists would be OK with that
|
| No they won't. If AI art was just as good as it is today,
| but didn't use copyrighted images in the training set,
| people would absolutely still be finding some other thing
| to complain about.
|
| Artists just don't want the tech to exist entirely.
| manimino wrote:
| I am not allowed to print $100 bills with my general-
| purpose printer. Many printing and copy machines come
| with built-in safeguards to prevent users from even
| trying.
|
| It's quite possible to apply the same kind of protections
| to generative models. (I hope this does not happen, but
| it is fully possible.)
| bootsmann wrote:
| Entirely different scales apply here. You can hardcode
| into a printer the 7 different bills each country puts
| out, no problem, but you cannot hardcode the billions of
| "original" art pieces that the model is supposed to check
| against during training; it's just infeasible.
| CuriouslyC wrote:
| Not exactly true. Given an image, you can find the
| closest point in the latent space that image corresponds
| to. It is totally feasible to do this with every image in
| the training set, and if that point in the latent space
| is too close to the training image, just add it to a set
| of "disallowed" latent points. This wouldn't fly for
| local generation, as the process would take a long time
| and generate a multi-gigabyte (maybe even terabyte)
| "disallowed" database, but for online image generators
| it's not insane.
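The filtering described above can be sketched in a few lines. This is a hypothetical illustration, not how any real generator works: tiny 3-dimensional tuples stand in for the model's latent space, and plain Euclidean distance stands in for whatever similarity test a deployment would actually use.

```python
import math

# Hypothetical latent points for protected training images.
# In a real system these would come from encoding each
# training image into the model's latent space.
disallowed = [
    (0.9, 0.1, 0.0),
    (0.2, 0.8, 0.1),
]

# Generated points closer than this to any protected point
# are rejected; the radius is an arbitrary made-up value.
DISALLOWED_RADIUS = 0.25

def is_allowed(latent):
    """Return False if the generated latent point falls within
    DISALLOWED_RADIUS of any protected training point."""
    for point in disallowed:
        if math.dist(point, latent) <= DISALLOWED_RADIUS:
            return False
    return True
```

As the comment notes, the cost is in scale: one entry per training image means billions of entries, which is why this is plausible for a hosted service but not for local generation.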
| peoplefromibiza wrote:
| > As a human
|
| you have rights.
|
| AIs don't.
|
| Because they don't have will.
|
| It's like arresting a gun for killing people.
|
| So, as a human, the individual(s) training the AI or
| using the AI to reproduce copyrighted material are
| responsible for the copyright infringement, unless
| explicitly authorized by the author(s).
| dredmorbius wrote:
| Among the goals seems to be a bit of well-poisoning. Artists
| have done this previously by creating art saying, say, "This
| site sells STOLEN artwork, do NOT buy from them", and
| encouraging followers to reply with "I want this on a t-shirt",
| which had previously been used by rip-off sites to pirate
| artwork. See:
|
| <https://waxy.org/2019/12/how-artists-on-twitter-tricked-
| spam...>
|
| If art streams are tree-spiked with copyrighted or trademarked
| works, then AI generators might be a bit more gun-shy about
| training with abandon on such threads.
|
| It's a form of monkeywrenching.
|
| <https://en.wikipedia.org/wiki/Tree_spiking>
|
| <https://en.wikipedia.org/wiki/Sabotage#As_environmental_acti..
| .>
| gwd wrote:
| Not sure about Stable Diffusion / Metawhatsit, but OpenAI's
| training set is already curated to make sure it avoids
| violence and pornography; and in any case, the whole thing
| relies on humans to come up with descriptions. Not clear how
| this sort of thing would "spike the well" in that sense.
| wokwokwok wrote:
| Are you being deliberately obtuse?
|
| It's blatantly obvious that, regardless of whether it will
| _work_ or not, they're trying to get companies with enough
| money to file lawsuits to make a move and do so.
|
| > I don't see the point.
|
| ...or you don't agree with the intent?
|
| I'm fine with that, if so, but you'd have to be
| deliberately trying very hard not to understand what
| they're trying to do.
|
| Quite obviously they're hoping to establish, as with
| software that lets you download videos from YouTube, that
| tools which enable things are bad, not neutral.
|
| Agree / disagree? Who cares. I can't believe anyone who
| "doesn't get it" is being earnest in their response.
|
| Will it make any difference? Well, it may or may not, but
| there's a fair precedent of it happening, and bluntly, no one
| is immune to lawsuits.
| dredmorbius wrote:
| _When disagreeing, please reply to the argument instead of
| calling names. "That is idiotic; 1 + 1 is 2, not 3" can be
| shortened to "1 + 1 is 2, not 3."_
|
| <https://news.ycombinator.com/newsguidelines.html>
| yellow_lead wrote:
| > I don't get why the AI companies would be affected in
| any way.
|
| It doesn't necessarily matter if they're affected. My thought
| when seeing this is that they want some _legal precedent_ to be
| set which determines that this is not fair use.
| seydor wrote:
| The problem is copyright laws, not the models (which are
| inevitable and impossible to stop anyway). The sketch of a
| mouse should not be protected any more than the artistic
| style of any guy. IP laws are an ancient concept, and it's
| a mystery why people still cling to them so tightly.
| rangersanger wrote:
| They aren't. The first battles were fought by victims of
| deepfakes.
| jelliclesfarm wrote:
| Art is now low value. It adds no value. Technology in the
| palm of our hands and a higher quality of life are also
| reasons.
|
| Let's not forget the very impressive population explosion
| of the past century. Every 'job' is a skill that has been
| streamlined so that the needs of the population are
| satisfied by the skills of the population, and resources
| are distributed evenly.
|
| Art is no longer a need, and there are way too many artists
| relative to the population.
|
| Further, a lot of 'art' taught is technique. It's not creativity.
| Can creativity be taught? I don't think so.
|
| Culture played a part in preserving artists and honoring their
| skills. But as 'culture' becomes global, mainstream is adopted
| more as it's more accessible. And mainstream is subject to the
| vagaries of market as well as vulnerable to market manipulation.
|
| Contrary to popular notions, our world is very homogenous.
| Somehow the promotion of diversity has ended up with the
| tyranny of conformity. How did this happen? This is the
| biggest puzzle of the past few decades.
| Der_Einzige wrote:
| This is the kind of post that I come on HN to read.
| mattdesl wrote:
| Challenging to navigate. These demonstrations are technically
| copyright infringement if done for financial gain (selling a
| T-shirt with Mickey Mouse icon). The same would be true if you
| were to draw by hand Mickey Mouse with a gun and sold it on a
| T-shirt. The only exception would be if it is a clear derivative,
| or satire, or parody, or personal use of course.
|
| The challenging part is that these artists are protesting the use
| of 'style' in AI synthesized media. That is, an artist's style is
| being targeted (or, even, multiple artists' styles are combined
| in a prompt to create a new AI-original work). This is not
| protected by copyright--if you draw a new scene in another
| artist's style, it would be perhaps unethical, but legally
| derivative work.
|
| If the artists who are challenging these AI systems do get their
| way, and they are able to legally copy-protect their "style"
| (like a certain way of brush strokes), this would inevitably
| backfire against them. To give an example: any artist whose work
| now too closely resembles the "style" of Studio Ghibli might be
| liable for copyright infringement, where before the work would be
| clearly derivative, or just influenced by another work, as is the
| case with most art over time.
| concordDance wrote:
| Trademark infringement, not copyright.
| mattdesl wrote:
| Technically, sure, but the artists (who are not trademarking
| their work) are putting this in the context of copyright
| infringement.
| MetaWhirledPeas wrote:
| Challenging legally, and challenging philosophically. I would
| think an artist doing it _for the art_ would embrace the
| fleeting nature of all things, including art. Some artists
| demonstrate this by creating temporary art, or even throwing
| their own art away after making it. The desire to make money
| from art is certainly reasonable, but accepting a world where
| all art styles are immediately mimicked, where art is
| trivialized and commoditized, and where there's no recognition
| to be had let alone money... that's going to be a tough
| philosophical pill to swallow.
| meebob wrote:
| I've been finding that the strangest part of discussions around
| art AI among technical people is the complete lack of
| identification or empathy: it seems to me that most computer
| programmers should be just as afraid as artists, in the face of
| technology like this!!! I am a failed artist (read: I studied
| painting in school and tried to make a go at being a commercial
| artist in animation and couldn't make the cut), and so I decided
| to do something easier and became a computer programmer, working
| for FAANG and other large companies and making absurd (to me!!)
| amounts of cash. In my humble estimation, making art is _vastly_
| more difficult than the huge majority of computer programming
| that is done. Art AI is terrifying if you want to make art for a
| living- and, if AI is able to do these astonishingly difficult
| things, why shouldn't it, with some finagling, also be able to
| do the dumb, simple things most programmers do for their jobs?
|
| The lack of empathy is incredibly depressing...
| rightbyte wrote:
| "The cut" for artists is just way closer to 100% than for
| programmers. It is that simple.
| mrbombastic wrote:
| I feel like I am missing something or holding it wrong. I would
| personally love it if we had a tool to which I could describe
| problems at a high level and out would come a high quality,
| fully functional app. Most software is shit, and if we are
| honest with ourselves there is a huge amount of inessential
| complexity in this field built up over the years. I would
| gladly never spend weeks building something someone else
| already built in a slightly different way because it doesn't
| meet requirements; I would gladly not end up in rabbit holes
| wrestling with some dependency compatibility issue when I am
| just trying to create value for the business. If the tools get
| better, the software gets better and the complexity we can
| manage gets larger. That said, while these tools are incredibly
| impressive, having messed with them for a few days trying to do
| even basic stuff, what am I missing here? They are a nice
| starting point and can be a productivity boost, but the code
| produced is often wrong and it feels a long way away from
| automating my day to day work.
| kypro wrote:
| > I would personally love if we had a tool that i could
| describe problems at a high level and out comes a high
| quality fully functional app.
|
| I'm sure your employer would love that more than you. That's
| the issue here.
|
| > That said while these tools are incredibly impressive,
| having messed with this for a few days to try to even do
| basic stuff, what am I missing here? It is a nice starting
| point and can be a productivity boost but the code produced
| is often wrong and it feels a long way away from automating
| my day to day work.
|
| This is the first iteration of such a tool and it's already
| very competent. I'm not even sure I'm better at writing code
| than GPT, the only thing I can do that it can't is compile
| and test the code I produce. If you asked me to create a
| React app from a two sentence prompt and didn't allow me to
| search the internet, compile or test it I'm sure I'd probably
| make more mistakes than GPT to be honest.
| alt227 wrote:
| If I had the tool that did that, I would be the employer!
| mrbombastic wrote:
| Exactly. Code has always been a means to an end, not the
| end itself. Further, our industry has been more than happy
| to automate inefficiency away from other fields; it feels
| pretty hypocritical to want it to stop for ours.
| pixl97 wrote:
| I mean, then so would everyone else, and we just fall
| back to a capital problem: advertising your creations.
| OctopusLupid wrote:
| If everyone is able to make their own app, then there is
| no need to advertise their apps, because everyone will
| just be using their own.
|
| The real battle there would be protocols; how everyone's
| custom apps communicate. Here, we can fall back to
| existing protocols such as email, ActivityPub, Matrix,
| etc.
| mrbombastic wrote:
| Have you actually tried to get an app working using gpt? A
| lot of shared stuff is heavily curated. It is no doubt an
| extremely impressive tool but I think we always
| underestimate the last 10% in AI products. We had
| impressive self-driving demos over a decade ago, yet we are
| all still driving, and L5 still seems a ways away.
| r3trohack3r wrote:
| I'm empathetic, but my empathy doesn't overcome my excitement.
|
| This is a moment where individual humans substantially increase
| their ability to effect change in the world. I'm watching as
| these tools quickly become commoditized. I'm seeing low income
| first generation Americans who speak broken English using
| ChatGPT to translate their messages to "upper middle class
| business professional" and land contracts that were off limits
| before. I'm seeing individuals rapidly iterate and explore
| visual spaces on the scale of 100s to 1000s of designs using
| stable diffusion, a process that was financially infeasible
| even for well funded corps due to the cost of human labor this
| time last year. These aren't fanciful dreams of how this tech
| is going to change society - I've observed these outcomes in
| real life.
|
| I'm empathetic that the entire world is moving out from under
| all of our feet. But the direction it's moving is unbelievably
| exciting. AI isn't going to replace humans, humans using AI are
| going to replace humans who don't.
|
| Be the human that helps other humans wield AI.
| kmeisthax wrote:
| Computer programmers have a general aversion to copyright, for
| a few reasons:
|
| 1. Proprietary software is harmful and immoral in ways that
| proprietary books or movies are _not_.
|
| 2. The creative industry has historically used copyright as a
| tool to tell computer programmers to stop having fun.
|
| So the lack of empathy is actually pretty predictable. Artists
| - or at least, the people who claim to represent their economic
| interests - have consistently used copyright as a cudgel to
| smack programmers about. If you've been marinating in Free
| Software culture and Cory Doctorow-grade ressentiment for half
| a century, you're going to be more interested in taking revenge
| against the people who have been telling you "No, shut up,
| that's communism" than mere first-order self-preservation[1].
|
| This isn't just "programmers don't have fucks to give", though.
| In fact, your actual statements about computer programmers are
| wrong, because there's already an active lawsuit against OpenAI
| and Microsoft over GitHub Copilot and its use of FOSS code.
|
| You see, AI actually breaks the copyright and ethical norms of
| programmers, too. Most public code happens to be licensed under
| terms that permit reuse (we hate copyright), but only if
| derivatives and modifications are also shared in the same
| manner (because we _really hate copyright_). Artists are
| worried about being paid, but programmers are worried about
| keeping the commons open. The former is easy: OpenAI can offer
| a rev share for people whose images were in the training set.
| The latter is far harder, because OpenAI's business model is
| _charging people for access to the AI_. We don't want to be
| paid, we want OpenAI to not be paid.
|
| Also, the assumption that "art is more difficult than computer
| programming" is also hilariously devoid of empathy. For every
| junior programmer crudely duct-taping code together you have a
| person drawing MS Paint fanart on their DeviantART page. The
| two fields test different skills and you cannot just say one is
| harder than the other. Furthermore, the consequences are
| different here. If art is bad, it's bad[0] and people
| potentially lose money; but if code is bad it gets hacked or
| kills people.
|
| [0] I am intentionally not going to mention the concerns
| Stability AI has with people generating CSAM with AI art
| generators. That's an entirely different can of worms.
|
| [1] Revenge can itself be thought of as a second-order self-
| preservation strategy (i.e. you hurt me, so I'd better hurt you
| so that you can't hurt me twice).
| tshaddox wrote:
| My (admittedly totally non-rigorous) intuition is that the
| advances in AI might "grow the pot" of the software
| engineering, IT, and related industries at roughly the same
| rate that they can "replace professionals" in those industries.
| If that's the case, then there wouldn't be some existential
| threat to the industry. Of course, that doesn't mean that
| certain individuals and entire companies aren't at risk, and I
| don't want to minimize the potential hardship, but it doesn't
| seem like a unique or new problem.
|
| As a crude analogy, there are a lot of great free or low-cost
| tools to create websites that didn't exist 15 years ago and can
| easily replace what would be a much more expensive web
| developer contract 15 years ago. And yet, in those last 15
| years, the "size of the web pot" has increased enough that I
| don't think many professional web developers are worried about
| site builder tools threatening the entire industry. There seem
| to be a lot more web developers now than there were 15 years
| ago, and they seem to be paid as well or better than they were
| 15 years ago. And again, that doesn't mean that certain individuals
| or firms didn't on occasion experience financial hardship due
| to pressure from cheaper alternatives, and I don't want to
| minimize that. It just seems like the industry is still
| thriving.
|
| To be clear, I really have no idea if this will turn out to be
| true. I also have no idea if this same thing might happen in
| other fields like art, music, writing, etc.
| mtrower wrote:
| Consider the compiler.
|
| There's an awful lot of analogy there, if you think about it.
| odessacubbage wrote:
| it's been very frustrating to see how much ignorance and
| incuriosity is held by what i assume to be otherwise very
| worldly, intelligent and technical people in regards to what
| working artists actually _do_.
| toldyouso2022 wrote:
| The arithmetic a computer can do instantly is much more
| difficult to me than writing this sentence. Point being: we
| can't compare human and computer skills. As for whether I'm
| worried, I'm not, because, barring government intervention to
| ruin things, even if I lose my job as a programmer, society
| becomes richer and I can always move on to something else
| while having access to cheaper goods.
|
| People should stop giving work all this meaning, and they
| should also study economics so they can chill.
|
| Learn and chill.
| chii wrote:
| The empathy you imply might also require that the artists (or
| programmer's) jobs be preserved, for the sake of giving them
| purpose and a way to make a living.
|
| I don't think that is absolutely something a society must
| guarantee. People are made obsolete all the time.
|
| What needs to be done is to produce new needs that currently
| cannot be serviced by the new AIs. I'm sure it will come - as
| it has for the past hundred years when technology supplanted an
| existing corpus of workers. A society can make this transition
| smoother - such as a nice social safety-net, and low-cost/free
| education for retraining into a different field.
|
| In fact, these things are all sorely needed today, even
| without the AIs' disruptions.
| [deleted]
| mattr47 wrote:
| Art is not harder than coding. What is hard is for an artist to
| make a living, because the market for artwork is very, very small.
| ben_w wrote:
| I'm mostly seeing software developers looking at the textual
| equivalent, GPT-3, and giving a spectrum of responses from
| "This is fantastic! Take my money so I can use it to help me
| with my work!" to "Meh, buggy code, worse than dealing with a
| junior dev."
|
| I think the two biggest differences between art AI and code AI
| are that (a) code that's only 95% right is just wrong, whereas
| art can be very wrong before a client even notices [0]; and (b)
| we've been expecting this for ages already, to the extent that
| many of us are cynical and jaded about what the newest AI can
| do.
|
| [0] for example, I was recently in the Cambridge University
| Press Bookshop, and they sell gift maps of the city. The
| background of the poster advertising these is pixelated and has
| JPEG artefacts.
|
| It's highly regarded, and the shop has existed since 1581, and
| yet they have what I think is an amateur-hour advert on their
| walls.
| edanm wrote:
| > code that's only 95% right is just wrong,
|
| I know what you mean, but thinking about it critically, this
| is just wrong. _All_ software has bugs in it. Small bugs, big
| bugs, critical bugs, security bugs, everything. No code is
| immune. The largest software used by millions every day has
| bugs. Library code that has existed and been in use for 30
| years has bugs.
|
| I don't think you were actually thinking of this in your
| comparison, but I think it's actually a great analogy - code,
| like art, can be 95% complete, and that's usually enough.
| (For art, "looks good and is what I wanted" is enough; for
| code, "does what I want right now, never mind edge cases" is
| enough.)
| CuriouslyC wrote:
| Two issues. First, when a human gets something 5% wrong,
| it's more likely to be a corner case or similar "right most
| of the time" scenario, whereas when AI gets something 5%
| wrong, it's likely to look almost right but never produce
| correct output. Second, when a human writes something wrong
| they have familiarity with the code and can more easily
| identify the problem and fix it, whereas fixing AI code
| (either via human or AI) is more likely to be fraught.
| edanm wrote:
| You (and everyone else) seem to be making the classic
| "mistake" of looking at an early version and not
| appreciating that _things improve_. Ten years ago, AI-
| generated art was at 50%. 2 years ago, 80%. Now it's at
| 95% and winning competitions.
|
| I have no idea if the AI that's getting code 80% right
| today will get it 95% right in two years, but given
| current progress, I wouldn't bet against it. I don't
| think there's any _fundamental_ reason it can't produce
| better code than I can, at least not at the "write a
| function that does X" level.
|
| Whole systems are a _way_ harder problem that I wouldn't
| even think of making guesses about.
| idontpost wrote:
| yamtaddle wrote:
| _To be fair_ to those assumptions, there've been a lot
| of cases of machine-learning (among other tech) looking
| very promising, and advancing so quickly that a huge
| revolution seems imminent--then stalling out at a local
| maximum for a really long time.
| ben_w wrote:
| It might improve like Go AI and shock everyone by beating
| the world expert at everything, or it might improve like
| Tesla FSD which is annoyingly harder than "make creative
| artwork".
|
| There's no fundamental reason it can't be the world
| expert at everything, but that's not a reason to assume
| we know how to get there from here.
| namelessoracle wrote:
| What scares me is a death-of-progress situation. Maybe it
| can't be an expert, but it can be good enough, and now the
| supply pipeline of people who could become experts basically
| gets shut off, because to become an expert you needed to
| do the work and gain the experience that is now
| completely owned by AI.
| tintor wrote:
| But it could also make it easier to train experts, by
| acting as a coach and teacher.
| nonrandomstring wrote:
| Exactly this.
|
| The problem of a vengeful god who demands the slaughter
| of infidels lies not in his existence or nonexistence,
| but peoples' belief in such a god.
|
| Similarly, it does not matter whether AI works or it
| doesn't. It's irrelevant how good it actually is. What
| matters is whether people "believe" in it.
|
| AI is not a technology, it's an ideology.
|
| Given time it will fulfil its own prophecy as "we who
| believe" steer the world toward that.
|
| That's what's changing now. It's in the air.
|
| The ruling classes (those who own capital and industry)
| are looking at this. The workers are looking too. Both of
| them see a new world approaching, and actually everyone
| is worried. What is under attack is not the jobs of the
| current generation, but the value of human skill itself,
| for all generations to come. And, yes, it's the tail of a
| trajectory we have been on for a long time.
|
| It isn't the only way computers can be. There is IA
| instead of AI. But intelligence amplification goes
| against the principles of capital at this stage. Our
| trajectory has been to make people dumber in service of
| profit.
| CadmiumYellow wrote:
| > What is under attack is not the jobs of the current
| generation, but the value of human skill itself, for all
| generations to come. And, yes, it's the tail of a
| trajectory we have been on for a long time.
|
| Wow, yes. This is exactly what I've been thinking but you
| summed it up more eloquently.
| snickerbockers wrote:
| Maybe for certain domains it's okay to fail 5% of the time
| but a lot of code really does need to be _perfect_. You
| wouldn't be able to work with a filesystem that loses 5%
| of your files.
| mecsred wrote:
| Or a filesystem that loses all of your files 5% of the
| time.
| scarmig wrote:
| No need to rag on btrfs.
| GoblinSlayer wrote:
| And GPT can't fix a bug; it can only generate new text that
| will have a different collection of bugs. The catch is that
| programming isn't text generation. But AI should be able to
| make good, actually intelligent fuzzers - that would be
| realistic and useful.
| alar44 wrote:
| Yes it can, I've been using it for exactly that. "This
| code is supposed to do X but does Y or has Z error; fix
| the code."
|
| Sure you can't stick an entire project in there, but if
| you know the problem is in class Baz, just toss in the
| relevant code and it does a pretty damn good job.
| UnpossibleJim wrote:
| Sure, but now you only need testers and one coder to fix
| bugs, where you used to need testers and 20 coders. AI
| code generators are force multipliers, maybe not strict
| replacements. And the level of creativity needed to fix a
| bug relative to programming something wholly original is
| worlds apart.
| Ajedi32 wrote:
| > GPT can't fix a bug
|
| It can't? I could've sworn I've seen (cherry-picked)
| examples of it doing exactly that, when prompted. It even
| explains what the bug is and why the fix works.
| ipaddr wrote:
| Which examples - the ones where it was right, or wrong?
| It goes back to trusting the source not to introduce new,
| ever-evolving bugs.
| soerxpso wrote:
| Those are cherry picked, and most importantly, all of the
| examples where it can fix a bug are examples where it's
| working with a stack trace, or with an extremely small
| section of code (<200 lines). At what point will it be
| able to fix a bug in a 20,000 line codebase, with only
| "When the user does X, Y unintended consequence happens"
| to go off of?
|
| It's obvious how an expert at regurgitating StackOverflow
| would be able to correct an NPE or an off-by-one error
| when given the exact line of code that error is on. Going
| any deeper, and actually being able to _find_ a bug,
| requires understanding of the codebase as a whole and the
| ability to map the code to what the code actually _does_
| in real life. GPT has shown none of this.
|
| "But it will get better over time" arguments fail for
| this because the thing that's needed is a fundamentally
| new ability, not just "the same but better."
| Understanding a codebase is a different thing from
| regurgitating StackOverflow. It's the same thing as
| saying in 1980, "We have bipedal robots that can hobble,
| so if we just improve on that enough we'll eventually
| have bipedal robots that beat humans at football."
| tintor wrote:
| It can, in some cases. Have you tried it?
| mlboss wrote:
| It is only a matter of time. It can understand an error
| stacktrace and suggest a fix. Somebody has to plug it into
| an IDE, and then it will start converting requirements to
| code.
| mr_toad wrote:
| When AI can debug its own code I'll start looking for
| another career.
| CapmCrackaWaka wrote:
| This depends entirely on _how_ the code is wrong. I asked
| chatGPT to write me code in python that would calculate
| SHAP values when given a sklearn model the other day. It
| returned code that ran, and even _looked_ like it did the
| right thing at a cursory glance. But I've written a SHAP
| package before, and there were several manipulations it got
| wrong. I mean completely wrong. You would never have known
| the code was wrong unless you knew how to write the code in
| the first place.
|
| To me, code that is 95% correct will either fail
| catastrophically or give very wrong results. Imagine if the
| code you wrote was off 5% for every number it was supposed
| to generate. Code that is 99.99% correct will introduce
| subtle bugs.
|
| * No shade to chatGPT, writing a function that calculates
| shap values is tough lol, I just wanted to see what it
| could do. I do think that, given time, it'll be able to
| write a day's worth of high quality code in a few seconds.
| KIFulgore wrote:
| I experienced ChatGPT confidently giving incorrect
| answers about the Schwarzschild radius of the black hole
| at the center of our galaxy, Sagittarius A-star. Both
| when asked about "the Schwarzschild radius of a black hole
| with 4 million solar masses" (a calculation) and "the
| Schwarzschild radius of Sagittarius A-star" (a simple
| lookup).
|
| Both answers were orders of magnitude wrong, and vastly
| different from each other.
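For reference, the calculation in question is a one-liner, r_s = 2GM/c^2; a quick sketch of what the right answer looks like (constants rounded, so the result is approximate):

```python
# Schwarzschild radius r_s = 2GM/c^2 for a 4-million-solar-mass
# black hole, roughly Sagittarius A*.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # one solar mass, kg

def schwarzschild_radius(mass_kg):
    """Radius of the event horizon for a non-rotating mass."""
    return 2 * G * mass_kg / C**2

r_s = schwarzschild_radius(4e6 * M_SUN)
# roughly 1.2e10 m, i.e. on the order of 12 million km
```

Any answer orders of magnitude away from that is simply wrong, whichever way it was asked.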
|
| JS code suggested for a simple database connection had
| glaring SQL injection vulnerabilities.
|
| I think it's an ok tool for discovering new libraries and
| getting oriented quickly to languages and coding domains
| you're unfamiliar with. But it's more like a forum post
| from a novice who read a tutorial and otherwise has
| little experience.
| mcguire wrote:
| My understanding is that ChatGPT (and similar things) are
| purely _language models_; they do not have any kind of
| "understanding" of anything like reality. Basically, they
| have a complex statistical model of how words are
| related.
|
| I'm a bit surprised that it got a lookup wrong, but for
| any other domain, describing it as a "novice" is
| understating the situation a lot.
| nmfisher wrote:
| Over the weekend I tried to tease out of ChatGPT a sed
| command that would fix an uber simple compiler error [0].
| I gave up after 4 or 5 tries - while it got the root
| cause correct ("." instead of "->" because the property
| was a pointer), it just couldn't figure out the right sed
| command. That's such a simple task that its failure doesn't
| inspire confidence in getting more complicated things
| correct.
|
| This is the main reason I haven't actually incorporated
| any AI tools into my daily programming yet - I'm mindful
| that I might end up spending more time tracking down
| issues in the auto-generated code than I saved using it
| in the first place.
|
| [0] You can see the results here:
| https://twitter.com/NickFisherAU/status/1601838829882986496
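For what it's worth, the rewrite being asked for (turning `foo.bar` into `foo->bar` for one known pointer variable) is a single substitution. A hypothetical Python equivalent of the sed command, with the variable name `obj` made up for illustration:

```python
import re

def fix_member_access(source, pointer_name):
    """Rewrite `ptr.member` as `ptr->member` for one known
    pointer variable (the fix described in the comment above)."""
    # \b keeps us from matching names that merely end in
    # pointer_name; \.(\w+) captures the member being accessed.
    pattern = r"\b%s\.(\w+)" % re.escape(pointer_name)
    return re.sub(pattern, pointer_name + r"->\1", source)
```

For example, `fix_member_access("obj.size = obj.capacity;", "obj")` yields `"obj->size = obj->capacity;"` while leaving other variables untouched.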
| malandrew wrote:
| Who is going to debug this code when it is wrong?
|
| Whether 95% or 99.9% correct, when there is a serious
| bug, you're still going to need people that can fix the
| gap between almost correct and actually correct.
| cool_dude85 wrote:
| Sure, but how much of the total work time in software
| development is writing relatively straightforward,
| boilerplate type code that could reasonably be copied
| from the top answer from stackoverflow with variable
| names changed? Now maybe instead of 5 FTE equivalents
| doing that work, you just need the 1 guy to debug the
| AI's shot at it. Now 4 people are out of work, or
| applying to be the 1 guy at some other company.
| woah wrote:
| Or the company just delivers features when they are
| estimated to be done, instead of it taking 5 times longer
| than expected
| mcguire wrote:
| Does anyone remember the old maxim, "Don't write code as
| cleverly as you can because it's harder to debug than it
| is to write and you won't be clever enough"?
| Workaccount2 wrote:
| The thing about ChatGPT is that it is a warning shot. And
| all these people I see talking about it are laughing about
| how the shooter missed them.
|
| Clearly ChatGPT is going to improve, and AI development
| is moving at a breakneck pace and accelerating. Dinging
| it for totally fumbling 5% or 10% of written code is
| completely missing the forest for the trees.
| jhbadger wrote:
| Sure, it will improve, but I think a lot of people think
| "Hey, it almost looks human quality now! Just a bit more
| tweaking and it will be human quality or better!". But a
| more likely case is that the relatively simple
| statistical modeling tools (which are very different from
| how our brains work, not that we fully understand how our
| brains work) that chatGPT uses have a limit to how well
| they work and they will hit a plateau (and are probably
| near it now). I'm not one of those people who believe
| strong AI is impossible, but I have a feeling that strong
| AI will take more than that just manipulating a text
| corpus.
| ben_w wrote:
| I'd be surprised if it did only take text (or even
| language in general), but if it does only need that, then
| given how few parameters even big GPT-3 models have
| compared to humans, it will strongly imply that PETA was
| right all along.
| woeirua wrote:
| Yeah, but people were also saying this about self-driving
| cars, and guess what: that long tail is super long, and
| it's also far fatter than we expected. 10 years ago people
| were saying AI was coming for taxi drivers, and as far as
| I can tell we're still 10 years away.
|
| I'm unimpressed by ChatGPT because the hype around it is
| largely the same as it was for Github Copilot, and Copilot
| fizzled badly. (Full disclosure: I pay for Copilot
| because it is somewhat useful.)
| pleb_nz wrote:
| I wonder if some of this is the 80/20 rule. We're seeing
| the easy 80 percent of the solutions, which has taken 20%
| of the time. We still have the hard 80% of the time (or
| most of it) to go for some of these new techs.
| rightbyte wrote:
| Replacing 80% of a truck driver's skill would suck but
| replacing 80% of our skill would be an OK programmer.
| lostmsu wrote:
| Considering the deep conv nets that melted the last AI
| winter happened in 2012, you are basically giving it 40
| years till 100%.
| kerkeslager wrote:
| Tesla makes self-driving cars that drive better than
| humans. The reason you have to touch the steering wheel
| periodically is political/social, not technical. An
| acquaintance of mine reads books while he commutes 90
| minutes from Chattanooga to work in Atlanta once or twice
| a week. He's sitting in the driver's seat but he's
| certainly not driving.
|
| The political/social factors which apply to the life-and-
| death decisions made driving a car, don't apply to
| whether one of the websites I work on works perfectly.
|
| I'm 35, and I've paid to write code for about 15 years.
| To be honest, ChatGPT probably writes better code than I
| did at my first paid internship. It's got a ways to go to
| catch up with even a junior developer in my opinion.
|
| The expectation in the US is that my career will last
| until I'm 65ish. That's 30 years from now. Tesla has only
| been around 19 years and now makes self-driving cars.
|
| So yeah, I'm not immediately worried that I'm going to
| lose my job to ChatGPT in the next year, but I am quite
| confident that my role will either cease existing or
| drastically change because of AI before the end of my
| career.
| tarranoth wrote:
| The thing is though, it's trained on human text. And most
| humans are, by definition, very fallible. Unless someone
| made it so that it can never get trained on subtly wrong
| code, how will it ever improve? Imho AI can be great for
| suggestions as for which method to use (visual studio has
| this, and I think there is an extension for visual studio
| code for a couple of languages). I think fine grained
| things like this are very useful, but I think code
| snippets are just too coarse to actually be helpful.
| tintor wrote:
| Improve itself through experimentation with reinforcement
| learning. This is how humans improve too. AlphaZero does
| it.
| lostmsu wrote:
| The amount of work in that area of research is
| substantial. You will see world shattering results in a
| few years.
|
| Current SOTA: https://openai.com/blog/vpt/
| throwaway82388 wrote:
| Anyone who has doubts has to look at the price. It's free
| for now, and will be cheap enough when openai starts
| monetizing. Price wins over quality. It's demonstrated
| time and time again.
| ben_w wrote:
| Depends on the details. Skip all the boring health and
| safety steps, you can make very cheap skyscrapers. They
| might fall down in a strong wind, but they'll be cheap.
| pixl97 wrote:
| After watching lots of videos from 3rd world countries
| where skyscrapers are built and then torn down a few
| years later, I think I know exactly how this is going to
| go.
| idontpost wrote:
| This is magical thinking, no different than a cult.
|
| The fundamental design of transformer architecture isn't
| capable of what you think it is.
|
| There are still radical, fundamental breakthroughs
| needed. It's not a matter of incremental improvement over
| time.
| mejutoco wrote:
| I agree with you. Even software that had no bugs today (if
| that is possible) could start having bugs tomorrow, as the
| environment changes (new law, new hardware, etc.)
| tablespoon wrote:
| >> code that's only 95% right is just wrong,
|
| > I know what you mean, but thinking about it critically,
| this is just wrong. All software has bugs in it. Small
| bugs, big bugs, critical bugs, security bugs, everything.
| No code is immune. The largest software used by millions
| every day has bugs. Library code that has existed and been
| in use for 30 years has bugs.
|
| All software has bugs, but it's usually _far_ better than
| "95% right." Code that's only 95% right probably wouldn't
| pass half-ass testing or a couple of days of actual use.
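| A quick back-of-the-envelope illustrates the point
| (illustrative numbers only, not a measurement of any model):
| if a program has 50 independent pieces and each piece is
| 95% likely to be correct, the chance the whole program
| works is tiny:

```shell
# 0.95 ^ 50: probability that all 50 pieces are correct at once,
# assuming independence. Numbers are illustrative only.
awk 'BEGIN { printf "%.3f\n", 0.95 ^ 50 }'   # prints 0.077
```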
| toomanydoubts wrote:
| The other day I copied a question from leetcode and asked GPT
| to solve it. The solution had the correct structure to be
| interpreted by leetcode (Solution class, with the correct
| method name and signature, and with the same implementation
| of a linked list that leetcode would use). It made me feel
| like GPT was not actually solving anything, just copying
| and pasting some code it had read on the internet.
| edanm wrote:
| EDIT: I posted this comment twice by accident! This comment
| has more details but the other has more answers, so please
| check the other one!
|
| > code that's only 95% right is just wrong,
|
| I know what you mean, but thinking about it critically, this
| is just wrong. _All_ software has bugs in it. Small bugs, big
| bugs, critical bugs, security bugs, everything. No code is
| immune. The largest software used by millions every day has
| bugs. Library code that has existed and been in use for 30
| years has bugs.
|
| I don't think you were actually thinking of this in your
| comparison, but I think it's actually a great analogy - code,
| like art, can be 95% complete, and that's usually enough.
| (For art, looks good and is what I wanted is enough, for
| code, does what I want right now, nevermind edge cases is
| enough.)
|
| The reason ChatGPT isn't threatening programmers is for other
| reasons. Firstly, its code isn't 95% good, it's like 80%
| good.
|
| Secondly, we do a lot more than write one-off pieces of code.
| We write much, much larger systems, and the connections
| between different pieces of code, even on a function-to-
| function level, are very complex.
| yourapostasy wrote:
|  _> The reason ChatGPT isn't threatening programmers is for
| other reasons. Firstly, its code isn't 95% good, it's like
| 80% good._
|
| The role that could be highly streamlined by a near-
| future ChatGPT/CoPilot is the requirements-gathering
| business analyst, but replacing developers at Staff level
| on up sits closer to requiring AGI to even become 30%
| good. We'll likely see a bifurcation/barbell: Moravec's
| Paradox on one end, AGI on the other.
|
| An LLM that can transcribe a verbal discussion with a
| domain expert about a particular business process at high
| fidelity, give a precis of domain jargon to a developer
| in a sidebar, extract further jargon created by the
| conversation, summarize the discussion into
| documentation, and distill the how's and why's like a
| judicious editor might at 80% fidelity, then put out
| semi-working code at even 50% fidelity, all while working
| 24x7x365 and automatically incorporating everything it
| created for you on GitHub before and that your team
| polished into working code and final documentation?
|
| I have clients who would pay for an initial deployment of
| that as an appliance/container head end, one that routes
| the processing through the vendor SaaS' GPU farm but
| holds the model data at rest within their network /
| cloud account boundary. Being able to condense weeks or
| even months of work by a team into several hours, plus a
| tightening and polishing pass by a handful of developers,
| would be interesting to explore as a new way to work.
| Kye wrote:
| >> _" I think the two biggest differences between art AI and
| code AI are that (a) code that's only 95% right is just
| wrong, whereas art can be very wrong before a client even
| notices [0];"_
|
| Art can also be extremely wrong in a way everyone notices and
| still be highly successful. For example: Rob Liefeld.
| jhbadger wrote:
| And, like the AIs, Liefeld has a problem drawing hands!
| Maybe he was actually ahead of us all and had an AI art
| tool before the rest of us.
| itronitron wrote:
| >> whereas art can be very wrong before a client even notices
|
| No actually, that's not how that works. You're demonstrating
| the lack of empathy that the parent comment brings up as
| alarming.
|
| Regarding programming, code that's only 95% right can just be
| run through code assist to fix everything.
| meebob wrote:
| I do appreciate that the way in which a piece of code "works"
| and the way in which a piece of art "works" is in some ways
| totally different- but, I also think that in many cases,
| notably automated systems that create reports or dashboards,
| they aren't so far apart. In the end, the result just has to
| seem right. Even in computer programming, amateur hour level
| correctness isn't so uncommon, I would say.
|
| I would personally be astonished if any of the distributed
| systems I've worked on in my career were even close to 95%
| correct, haha.
| azornathogron wrote:
| A misleading dashboard is really, really bad. This is
| absolutely not something where I would be happy to give it
| to an AI to do just because "no one will notice". The fact
| that no one will notice errors until it's too late is why
| dashboards need _extra_ effort by their author to actually
| test the thing.
|
| If you want to give programming work to an AI, give it the
| things where incorrect behaviour is going to be really
| obvious, so that it can be fixed. Don't give it the stuff
| where everyone will just naively trust the computer without
| thinking about it.
| lycopodiopsida wrote:
| Understanding what you are plotting and displaying in the
| dashboard is the complicated part, not writing the
| dashboard. Programmers are not very afraid of AI because it
| is still just a glorified frontend to stackoverflow, and SO
| has not destroyed the demand for programmers so far. Also,
| understanding the subtle logical bugs and errors introduced
| by such boilerplate AI-tools requires no less expertise
| than knowing how to write the code upfront. Debugging is not a
| very popular activity among programmers for a reason.
|
| It may be that one day AI will also make their creators
| obsolete. But at that point so many professions will be
| replaced by it already, that we will live in a massively
| changed society where talking about the "job" has no
| meaning anymore.
| jimmaswell wrote:
| > code that's only 95% right is just wrong
|
| It's still worth it on the whole but I have already gotten
| caught up on subtly wrong Copilot code a few times.
| asdf123wtf wrote:
| A lot depends on the business costs of that wrong 5%.
|
| If the actual business costs are less than the price of a
| team of developers... welp, it was fun while it lasted.
| BurningFrog wrote:
| This has been going on for 250 years, and humanity still hasn't
| quite grasped it.
|
| The steady progress of the Industrial Revolution that has made
| the average person unimaginably richer and healthier several
| times over, looks in the moment just like this:
|
| "Oh no, entire industries of people are being made obsolete,
| and will have to beg on the streets now".
|
| And yet, as jobs and industries are automated away, we keep
| getting richer and healthier.
| koshnaranek wrote:
| "It hasn't happened in the past, therefore it won't happen in
| the future" is simply a fallacy.
| thedorkknight wrote:
| Collectively, sure. How did that go for the people whose
| livelihoods got replaced though? I've had family members be
| forced to change careers from white-collar work after being
| laid off and unable to find engineering jobs due to people
| decades younger taking them all nearby. I saw firsthand the
| unbelievable amount of stress and depression they went
| through, and it took them years to accept that their previous
| life and career were gone.
|
| "It'll massively suck for you, but don't worry, it'll be
| better for everyone else" is little comfort for most of us
| yamtaddle wrote:
| Especially when promises and plans to use some of those
| windfalls of progress to help those harmed by it, seem
| never to see much follow-through.
|
| Progress is cool if you're on the side of the wheel that's
| going up. It's the worst fucking thing in the world if
| you're on the side that's going down and are about to get
| smashed into the mud.
| spitBalln wrote:
| BurningFrog wrote:
| Well, most of us are in the benefiting group so I'd
| definitely take that gamble.
|
| But you're of course right that the benefits are unevenly
| distributed, and for some it truly does suck.
| [deleted]
| beardedetim wrote:
| > making art is vastly more difficult than the huge majority of
| computer programming that is done.
|
| I'd reframe this to: making a living from your art is far more
| difficult than making money from programming.
|
| > also be able to do the dumb, simple things most programmers
| do for their jobs?
|
| I'm all for Ai automating all the boring shit for me. Just like
| frameworks have. Just like libraries have. Just like DevOps
| have. Take all the plumbing and make it automated! I'm all for
| it!
|
| But. At some point. Someone needs to take business speak and
| turn it into input for this machine. And wouldn't ya know it,
| I'm already getting paid for that!
| 2OEH8eoCRo0 wrote:
| I think that programmers here have a lot riding on the naive
| belief that all new tools are neutral, and there is no pile of
| bodies under these advances.
| unshavedyak wrote:
| I think i have a fair bit of empathy in this area and, well
| like you said, i think my job (software) is likely to be
| displaced too. Furthermore, i think companies have data sets
| regardless of if we allow public use or not. Ie if we ban
| public use, then _only_ massive companies (Google /etc) will
| have enough data to train these. Which.. seems worse to me.
|
| At the end of the day though, i think i'm an oddball in this
| camp. I just don't think there's that much difference between
| ML and Human Learning (HL). I believe we are nearly infinitely
| more complex but as time goes on i think the gulf between ML
| and HL complexity will shrink.
|
| I recently saw some of MKBHD's critiques of ML and my takeaway
| was that he believes ML cannot possibly be creative. That it's
| just inputs and outputs.. and, well, isn't that what i am?
| Would the art i create (i am also trying to get into art) not
| be entirely influenced by my experiences in life, the memories
| i retain from it, etc? Humans also unknowingly reproduce work
| all the time. "Inspiration" sits in the back of their minds and
| then we regurgitate it out thinking it as original.. but often
| it's not, it's derivative.
|
| Given that all creative work is learned, though, the line
| between derivative and originality seems to just be about how
| close it is to pre-existing work. We mash together ideas, and
| try to distance it from other works. It doesn't matter what we
| take as inspiration, or so we claim, as long as the output
| doesn't overlap too much with pre-existing work.
|
| ML is coming for many jobs and we need to spend a lot of time
| and effort thinking about how to adapt. Fighting it seems an
| uphill battle. One we will lose, eventually. The question is
| what will we do when that day comes? How will society function?
| Will we be able to pay rent?
|
| What bothers me personally is just that companies get so much
| free-reign in these scenarios. To me it isn't about ML vs HL.
| Rather it's that companies get to use all our works for their
| profit.
| wnkrshm wrote:
| > We mash together ideas, and try to distance it from other
| works. It doesn't matter what we take as inspiration, or so
| we claim, as long as the output doesn't overlap too much with
| pre-existing work.
|
| I feel a big part what makes it okay or not okay here is
| intention and capability. Early in an artistic journey things
| can be highly derivative but that's due to the student's
| capabilities. A beginner may not intend to be derivative but
| can't do better.
|
| I see pages of applications of ML out there being derivative
| on purpose (Edit: seemingly trying, with glee, to
| 'outperform' specific freelance artists in their own styles).
| unshavedyak wrote:
| But the ML itself doesn't have intention. The author of the
| ML does, and that i would think is no different than an
| artist that purposefully makes copied/derived work.
|
| TBH given how derivative humans tend to be, with such a
| deeper "Human Learning" model and years and years of
| experiences.. i'm kinda shocked ML is even capable of
| appearing non-derivative. Throw a child in a room, starve
| it of any interaction and somehow (lol) only feed it select
| images and then ask it to draw something.. i'd expect it to
| perform similarly. A contrived example, but i'm
| illustrating the depth of our experiences when compared to
| ML.
|
| I half expect that the "next generation" of ML will be fed
| a dataset many orders of magnitude larger, one more closely
| matching our own. A video feed of years' worth of data,
| simulating the complex inputs that Human Learning gets to
| benefit from. If/when that day comes i can't imagine we
| will seem that much more unique than ML.
|
| I should be clear though; i am in no way defending how
| companies are using these products. I just don't agree that
| we're so unique in how we think, how we create, and if
| we're truly unique in any way shape or fashion. (Code,
| Input) => Output is all i think we are, i guess.
| MrPatan wrote:
| Because we know it's not going to happen any time soon, and
| when it does happen it won't matter only to devs, that's the
| singularity.
|
| You'll find out because you're now an enlightened immortal
| being, or you won't find out at all because the thermonuclear
| blast (or the engineered plague, or the terminators...) killed
| you and everybody else.
|
| Does that mean there won't be some enterprising fellas who will
| hook up a chat prompt to some website thing? And that you can
| demo something like "Add a banner. More to the right. Blue
| button under it" and that works? Sure. And when it's time to
| fiddle with the details of how the bloody button doesn't do the
| right thing when clicked, it's back to hiring a professional
| that knows how to talk to the machine so it does what you want.
| Not a developer! No, of course not, no, no, we don't do
| development here, no. We do _prompts_.
| calebcannon wrote:
| This has happened to developers multiple times. Frankly it's
| happened so many times that it's become mundane. These programs
| are tools, and after a while you realize having a new tool in
| the bag doesn't displace people. What it does is make the old
| job easy and new job has a higher bar for excellence. Everyone
| who has been writing software longer than a few years can name
| several things that used to take them a long time and a lot of
| specialization, and now take any amateur 5 minutes. It might
| seem scary, but it's really not. It just means that talented
| artists will be able to use these tools to create even more
| even cooler art, because they don't need to waste their time on
| the common and mechanical portions.
| cwmoore wrote:
| AI art will become most valuable posthumously.
| klooney wrote:
| https://githubcopilotlitigation.com/
|
| Some programmers are upset and already filing suit.
| CyanBird wrote:
| The lack of empathy is because we are discussing systems,
| not feelings.
|
| At the dawn of mechanization, these same arguments were being
| used by the luddites. I'd recommend reading about them; it
| was quite an interesting situation, same as now.
|
| The reality is that advances such as these can't be stopped,
| even if you forbid ML via legislation in the US, there are
| hundreds of other countries which won't care, same as
| happens with piracy.
| odessacubbage wrote:
| the luddites may be one of the most singularly wrongly
| vilified groups in human history.
| jacoblambda wrote:
| Remember, luddites largely weren't against technology.
|
| What they were however was against was companies using that
| technology to slash their wages in exchange for being forced
| to do significantly more dangerous jobs.
|
| In less than a decade, textile work went from a safe job with
| respectable pay for artisans and craftsmen into one of the
| most dangerous jobs of the industrialised era with often less
| than a third of the pay and the workers primarily being
| children.
|
| That's what the luddites were afraid of. And the government
| response was military/police intervention, breaking of any
| and all strikes, and harsh punishments such as execution for
| damaging company property.
| astrange wrote:
| I recommend reading the part in Capital where Marx makes fun
| of them for being opposed to productivity.
| hippie_queen wrote:
| I don't disagree, except I don't get what you mean by
| "because we are discussing systems, not feelings."
|
| I think artists feeling like shit in this situation is
| totally understandable. I'm just a dilettante painter and
| amateur hentai sketcher, but some of the real artists I know
| are practically in the middle of an existential crisis.
| Feeling empathy for them is not the same as thinking that we
| should make futile efforts to halt the progress of this
| technology.
| Kalium wrote:
| I agree, but we should pay attention when we are asked for
| empathy. In this very thread we have an excellent
| demonstration of how easy it is for an appeal to feel
| empathy for people's position to change into an appeal to
| protect the same people's financial position.
|
| I'll go so far as to say that in many cases, displaying
| empathy for the artists without also advocating for futile
| efforts to halt the progress of this technology will be
| regarded as a lack of empathy.
| meroes wrote:
| If the advances create catastrophic consequences there will
| be a stop by definition. Death of art(ists) and coders may
| not be a catastrophe, but it could be coincident with one.
| From OP, "Art AI is terrifying if you want to make art for a
| living". Empathize a little with that to see coding AI making
| coding not a way of life. Empathize even more and see few
| people having productive ways of life due to general purpose
| AI. The call to empathize is not about "feelings"
| necessarily, it is a cognitive exercise to imagine future
| consequences that aren't obvious yet.
| locopati wrote:
| sacado2 wrote:
| In art, you can afford a few mistakes. Like, on many photo-
| realistic pictures generated by midjourney, if you look closely
| you'll see a thing or two that are odd in the eyes of
| characters. In an AI-generated novel, you can accept a typo
| here and there, or not even notice it if it's really subtle.
|
| In a program, you can't really afford that. A small mistake can
| have dramatic consequences. Now, maybe in the next few years
| you'll only need one human supervisor fixing AI bugs where you
| used to need 10 high-end developers, but you probably won't be
| able to make reliable programs just by typing a prompt, the way
| you can currently generate a cover for an e-book just by asking
| midjourney.
|
| As for the political consequences of all of this, this is yet
| another issue.
| mullingitover wrote:
| I'm not sure that humans are going to beat AI in terms of
| defect rate in software, especially given that with AI you
| produce code at a fast enough rate that the corner cutting
| (like skipping TDD) often done by human developers is off
| the table.
|
| I don't think this is going to put developers out of work,
| however. Instead, lots of small businesses that couldn't
| afford to be small software companies suddenly will be able
| to. They'll build 'free puppies,' new applications that are
| easy to start building, but that require ongoing development
| and maintenance. As the Cambrian explosion of new software
| happens we'll only end up with more work on our hands.
| exceptione wrote:
| Will you be happy to curate bot output?
|
| Could the bot not curate its own output? It has been shown
| that feeding back into the model results in improvement. I
| got the idea that better results come from increments. The
| AI overlords (model owners) will make sure they learn from
| all that curating you might do too, making your job even
| less skilled. Read: you are more replaceable.
|
| Please prove me wrong! I hope I am just anxious. History
| has proven that increases in productivity tend to go to
| capital owners, unless workers have bargaining power. Mine
| workers were paid relatively well here, back in the day.
| Complete villages and cities thrived around this business.
| When those workers were no longer needed the government had
| to implement social programs to prevent a societal collapse
| there.
|
| Look around, Musk wants you to work 10 hours per day
| already. Don't expect an early retirement or a more relaxed
| job..
| mullingitover wrote:
| I don't think it's a matter of blindly curating bot
| output.
|
| I think it's more a matter of enlarging the scope of what
| one person can manage. Think of moving from the pure
| manual labor era, limited by how much weight a human body
| could move from point A to point B, to the steam engine
| era. Railroads totally wrecked the industry of people
| moving things on their backs or in mule trains, and that
| wasn't a bad thing.
|
| > Don't expect an early retirement or a more relaxed
| job..
|
| That's kinda my point, I don't think this is going to
| make less work, it'll turbocharge productivity. When has
| an industry ever found a way to increase productivity and
| just said cool, now we'll keep the status quo with our
| output and work less?
| exceptione wrote:
| Thanks for sharing your thoughts.
|
| You describe stuff that is harmful or boring. In another
| comment I touched upon this, as there seems to be a clear
| distinction between people that love programming and
| those that just want to get results. The former do not
| enjoy being manager of something larger per se if they
| lose what they love.
|
| I can see a (short term?) increase in demand for software,
| but it is not infinite. So when productivity increases
| and demand does not keep at least the same pace, you will
| see jobless people and you will face competition.
|
| What no one has touched yet is that the nature of
| programming might change too. We try to optimize for the
| dev experience now, but it is not unreasonable to expect
| that we have to bend towards being AI-friendly. Maybe
| being human-friendly becomes less of a concern (enough
| desperate people out there); AI-friendliness and
| performance might be more important metrics to the owner.
| exceptione wrote:
| I have to add that software is also something that can be
| copied without effort. If we can have 2000 drawing apps
| instead of 20, the chances that none of those 2000 will
| fit the bill get close to zero.
|
| Industries have traditionally solved this with planned
| obsolescence. Maybe JavaScript might be our saviour here for
| a while. :)
|
| There is also a natural plateau of choice we can handle.
| Of those 2000, only a few will be winners and with reach.
| It might soon be that the AI model becomes more valuable
| than any of those apps. Case in point: try to make a
| profitable app on Android these days.
| mullingitover wrote:
| > You describe stuff that is harmful or boring. In
| another comment I touched upon this, as there seems to
| be a clear distinction between people that love
| programming and those that just want to get results.
|
| There's nothing stopping anyone from coding for fun, but
| we get paid for delivering value, and the amount of value
| that you can create is hugely increased with these new
| tools. I think for a lot of people their job satisfaction
| comes from having autonomy and seeing their work make an
| impact, and these tools will actually provide them with
| even more autonomy and satisfaction from increased impact
| as they're able to take on bigger challenges than they
| were able to in the past.
| exceptione wrote:
| "having autonomy and seeing their work make an impact"
|
| I think we are talking about a different job. I mentioned
| it somewhere else, but strapping together piles of bot
| generated code and having to debug that will feel more
| like a burden for most I fear.
|
| If a programmer wanted to operate on a level where "value
| delivering" and "impact" are the most critical criteria
| for job satisfaction, one would be better off in a product
| management or even project management role. A good
| programmer will care a lot about her product, but she
| still might derive the most joy out of having built it
| mostly by herself.
|
| I think that most passionate programmers want to build
| something by themselves. If api mashups are already not
| fun enough for them, I doubt that herding a bunch of code
| generators will bring that spark of joy.
| spitBalln wrote:
| strken wrote:
| My empathy for artists is fighting with my concern for everyone
| else's future, and losing.
|
| It would be very easy to make training ML models on publicly
| available data illegal. I think that would be a very bad thing
| because it would legally enshrine a difference between human
| learning and machine learning in a broader sense, and I think
| machine learning has huge potential to improve everyone's
| lives.
|
| Artists are in a similar position to grooms and farriers
| demanding the combustion engine be banned from the roads for
| spooking horses. They have a good point, but could easily screw
| everyone else over and halt technological progress for decades.
| I want to help them, but want to unblock ML progress more.
| allturtles wrote:
| > My empathy for artists is fighting with my concern for
| everyone else's future, and losing.
|
| My empathy for artists is aligned with my concern for
| everyone else's future.
|
| > I want to help them, but want to unblock ML progress more.
|
| But progress towards what end? The ML future looks very bleak
| to me, the world of "The Machine Stops," with humans perhaps
| reduced to organic effectors for the few remaining tasks that
| the machine cannot perform economically on its own: carrying
| packages upstairs, fixing pipes, etc.
|
| We used to imagine that machines would take up the burden of
| our physical labor, freeing our minds for more creative and
| interesting pursuits: art, science, the study of history, the
| study of human society, etc. Now it seems the opposite will
| happen.
| dangond wrote:
| Work like this helps us work towards new approaches for the
| more difficult issues involved with replacing physical
| labor. The diffusion techniques that have gained popularity
| recently will surely enable new ways for machines to learn
| things that simply weren't possible before. Art is getting
| a lot of attention first because many people (including the
| developers working on making this possible) want to be able
| to create their own artwork and don't have the talent to
| put their mental images down on paper (or tablet). You
| worry that this prevents us from following more creative
| and interesting pursuits, but I feel that this enables us
| to follow those pursuits without the massive time
| investment needed to practice a skill. The future you
| describe is very bleak indeed, but I highly doubt those
| things won't be automated as well.
| BudaDude wrote:
| I don't get this argument. Artists will not be replaced by
| AI. AI will become a tool like Photoshop for artists. AI
| will not replace creativity.
| yamtaddle wrote:
| I see two realistic possibilities:
|
| 1) It'll no longer be possible to work as an artist
| without being _incredibly_ productive. Output, output,
| output. The value of each individual thing will be so low
| that you have to be both excellent at what you do (which
| will largely be curating and tweaking AI-generated art)
| and extremely prolific. There will be a very few
| exceptions to this, but even fewer than today.
|
| 2) Art becomes another thing lots of people in the office
| are expected to do simply as a part of their non-artist
| job, like a whole bunch of other things that used to be
| specialized roles but become a little part of everyone's
| job thanks to computers. It'll be like being semi-OK at
| using Excel.
|
| I expect a mix of both to happen. It's not gonna be a
| good thing for artists, in general.
| yunwal wrote:
| 3) The scope and scale of "art" that gets made gets
| bigger and we still have plenty of pro artists,
| designers. AKA art eats the world
| yamtaddle wrote:
| Maybe. But art was already so cheap, and talent so
| abundant, that it was notoriously difficult to make
| serious money doing it, so I doubt it'll have that effect
| in general.
|
| It might in a few areas, though. I think film making is
| poised to get _really weird_, for instance, possibly in
| some interesting and not-terrible ways, compared with
| what we're used to. That's mostly because automation
| might replace entire teams that had to spend thousands of
| hours before anyone could see the finished work or pay
| for it, not just a few hours of one or two artists' time
| on a more-incremental basis. And even that's not quite a
| revolution--we _used to_ have very-small-crew films,
| including tons that were big hits, and films with credits
| lists like the average Summer blockbuster these days were
| unheard of, so that's more a return to how things were
| _before_ computer graphics entered the picture (even 70s
| and 80s films, after the advent of the spectacle- and FX-
| heavy Summer blockbuster, had crews so small that it's
| almost hard to believe, when you're used to seeing the
| list of hundreds of people who work on, say, a Marvel
| film).
| wnkrshm wrote:
| It does just that though? Don't tell me nobody is
| surprised sometimes while prompting a diffusion model;
| that can only happen if a significant portion of the
| creation happens in a way that is non-intuitive for the
| user - what you could describe as 'coming up with
| something'.
| lovehashbrowns wrote:
| > We used to imagine that machines would take up the burden
| of our physical labor, freeing our minds for more creative and
| interesting pursuits: art, science, the study of history,
| the study of human society, etc.
|
| You're like half a step away from the realization that
| almost everything you do today is done better, if not by
| AI, then by someone who can do it better than you - but
| you still do it because you enjoy it.
|
| Now just flip those two, almost everything you do in the
| future will be done better by AI if not another human.
|
| But that doesn't remove the fact that you enjoy it.
|
| For example, today I want to spend my day taking
| photographs and trying to do stupid graphic design in After
| Effects. I can promise you that there are thousands of
| humans and even AI that can do a far better job than me at
| both these things. Yet I have over a terabyte of
| photographs and failed After Effects experiments. Do I stop
| enjoying it because I can't make money from these hobbies?
| Do I stop enjoying it because there's some digital artist
| at corporation X that can take everything I have and do it
| better, faster, and get paid while doing it?
|
| No. So why would this change things if instead of a human
| at corporation X, it's an AI?
| kevingadd wrote:
| > It would be very easy to make training ML models on
| publicly available data illegal
|
| This isn't the only option though? You could restrict it to
| data where permission has been acquired, and many people
| would probably grant permission for free or for a small fee.
| Lots of stuff already exists in the public domain.
|
| What ML people seem to want is the ability to just scoop up a
| billion images off the net with a spider and then feed it
| into their network, utilizing the unpaid labor of thousands-
| to-millions for free and turning it into profit. That is
| transparently unfair, I think. If you're going to enrich
| yourself, you should also enrich the people who made your
| success possible.
| yamtaddle wrote:
| Everyone else's future?
|
| I see this as another step toward having a smaller and
| smaller space in which to find our own meaning or "point" to
| life, which is the only option left after the march of
| secularization. Recording and mass media / reproduction
| already curtailed that really badly on the "art" side of
| things. Work is staring at glowing rectangles and tapping
| clacky plastic boards--almost nobody finds it satisfying or
| fulfilling or engaging, which is why so many take pills to be
| able to tolerate it. Work, art... if this tech fulfills its
| promise and makes major cuts to the role for people in those
| areas, what's left?
|
| The space in which to find human meaning seems to shrink by
| the day, the circle in which we can provide personal value
| and joy to others without it becoming a question of cold
| economics shrinks by the day, etc.
|
| I don't think that's great for everyone's future. Though
| admittedly we've already done so much harm to that, that this
| may hardly matter in the scheme of things.
|
| I'm not sure the direction we're going looks like success,
| even if it happens to also mean medicine gets really good or
| whatever.
|
| Then again I'm a bit of a technological-determinist and
| almost nobody agrees with this take anyway, so it's not like
| there's anything to be done about it. If we don't do [bad but
| economically-advantageous-on-a-state-level thing], someone
| else will, then we'll _also_ have to, because fucking Moloch.
| It'll turn out how it turns out, and no meaningful part in
| determining that direction will be whether it puts us
| somewhere _good_, except "good" as blind-ass Moloch judges
| it.
| lovehashbrowns wrote:
| What role exactly is it going to take? The role we
| currently have, where the vast majority of people do work
| not because they particularly enjoy it but because they're
| forced to in order to survive?
|
| That's really what we're protecting here?
|
| I'd rather live in the future where automation does
| practically everything not for the benefit of some
| billionaire born into wealth but because the automation is
| supposed to. Similar to the economy in Factorio.
|
| Then people can derive meaning from themselves rather than
| whatever this dystopian nightmare we're currently living
| in.
|
| It's absurdly depressing that some people want to stifle
| this progress only because it's going to remove this god
| awful and completely made up idea that work is freedom or
| work is what life is about.
| stcroixx wrote:
| Every other living thing on the planet spends most of
| its time just fighting to survive. I think that's
| evidence it's not a 'made up idea' and likely may be what
| life is actually about.
| deathgripsss wrote:
| This is the dictionary definition of the appeal-to-nature
| fallacy.
| lovehashbrowns wrote:
| What're you doing on the internet? No other living thing
| on this planet spends time on the internet. Or maybe we
| shouldn't be copying things from nature just because.
|
| Also kinda curious how you deal with people that have
| disabilities and can't exactly fight to survive. Me, I'm
| practically blind without glasses/contacts, so I'll not
| be taking life lessons from the local mountain lion,
| thanks.
| exceptione wrote:
| I am happy to write code for a hobby. Who is going to pay
| for that? The oligarchs of our time pay their tax to
| their own 'charities'. Companies with insane profits buy
| their own shares.
|
| AI powered surveillance and the ongoing destruction of
| public institutions will make it hard to stand up for the
| collective interest.
|
| We are not in hell, but the road to it has not been
| closed.
| lovehashbrowns wrote:
| The ideal situation is that nobody pays for it. Picture a
| scenario where the vast majority of resource gathering,
| manufacturing, and production are all automated.
| Programmers are out of a job, factory workers are out of
| a job, miners are out of a job, etc.
|
| Basically the current argument of artists being out of a
| job but taken to its extreme.
|
| Why would these robots get paid? They wouldn't. They'd
| just mine, manufacture, and produce on request.
|
| Imagine a world where chatgpt version 3000 is connected
| to that swarm of robots and you can type "produce a 7
| inch phone with an OLED screen, removable battery, 5
| physical buttons, a physical shutter, and removable
| storage" and X days later that phone arrives, delivered
| by automation, of course.
|
| Same would work with food, where automation plants the
| seeds, waters the crops, removes pests, harvests the
| food, and delivers it to your home.
|
| All of these are simply artists going out of a job,
| except it's not artists, it's practically every job humans
| are forced to do today.
|
| There'd be very little need to work for almost every
| human on earth. Then I could happily spend all day taking
| shitty photographs that AI can already replicate far
| better than I could photograph in real life, without
| feeling like a waste of a life, because I'd be doing it
| for fun and not because I'm forced to in order to
| survive.
| exceptione wrote:
| Look, I like the paradise you created. You only forgot
| about who we are.
|
| > There'd be very little need to work for almost every
| human on earth.
|
| When mankind made a pact with the devil, the burden we
| got was that we had to earn our bread through sweat and
| hard labor. This story has survived millennia, there is
| something to it.
|
| Why is the bottom layer in society not automated by
| robots? No need to if people are cheaper than robots. If
| you don't care about humans, you can get quite some labor
| for a little bit of sugar. If you can work one job to pay
| your rent, you can possibly do two or three even. If you
| don't have those social hobbies like universal healthcare
| and public education, people will be competitive for a
| very long time with robots. If people are less valuable,
| they will be treated as such.
|
| Hell is nearer than paradise.
| lovehashbrowns wrote:
| Humans have existed for close to 200,000 years. Who we
| 'are' is nothing close to what we have today. What humans
| actually are is an invasive species capable of
| subjugating nature to fit its needs. I want to just push
| that further and subjugate nature with automation that
| can feed us and manufacture worthless plastic and metal
| media consumption devices for us.
|
| Your diatribe about not caring about humans is ironic. I
| don't know where you got all that from, but it certainly
| wasn't my previous comment.
|
| I also don't know what pact you're on about. The idea of
| working for survival is used to exploit people for their
| labor. I guess people with disabilities that aren't able
| to work just aren't human? Should we let them starve to
| death since they can't work a 9-5 and work for their
| food?
| exceptione wrote:
| > Who we 'are' is nothing close to what we have today.
|
| I am wondering why you define being in terms of having.
| Is that a slip, or is that related to this:
|
| > I want to just push that further and subjugate nature
| with automation that can feed us and manufacture
| worthless plastic and metal media consumption devices for
| us.
|
| Because I can hear sadness in these words. I think we can
| feel thankful for having the opportunity to observe
| beauty and the universe and feel belonging to where we
| are and with who we are. Those free smartphones are not
| going to substitute that.
|
| I do not mean we have to work because it is our fate or
| something like that.
|
| > Your diatribe about not caring about humans is ironic.
|
| A pity you feel that way. Maybe you interpreted "If you
| don't care about humans" as literally you, whereas I
| meant it as "If one doesn't care".
|
| What I meant was: contrary to the assumption you seem to
| make, when a few have plenty of production means without
| needing the other 'human resources' anymore, those few
| will not spontaneously share their wealth with the world
| so the others can have free smartphones and a life of
| consumption. Instead, those others will have to double
| down and start to compete with increasingly cheaper
| robots.
|
| ----
|
| The pact in that old story I was talking about deals with
| the idea that we as humans know how to be evil. In the
| story, the consequence is that those first people had to
| leave paradise and from then on have to work for their
| survival.
|
| I just mentioned it because of the fact that we exploit
| not only nature, but other humans too, if we are evil
| enough.
| People that end up controlling the largest amounts of
| wealth are usually the most ruthless. That's why we need
| rules.
|
| ----
|
| > I guess people with disabilities that aren't able to
| work just aren't human? Should we let them starve to
| death since they can't work a 9-5 and work for their
| food?
|
| On the contrary, I think I have been misunderstood. :)
| lovehashbrowns wrote:
| I hear more sadness in your words that are stuck on the
| idea of having to compete. The idea is to escape that and
| make exploiting people not an option. If you feel evil
| and competition for survival is what defines humans,
| that's truly sad.
|
| I like my ideal world a lot better.
| exceptione wrote:
| > The idea is to escape that and make exploiting people
| not an option.
|
| I am in, but just wanted to let you know many had this
| idea before. People thought in the past we would barely
| work these days anymore. What they got wrong is that
| productivity gains didn't reach the common man. They were
| partly lost through mass consumption, fueled by
| advertising, and wealth concentration. Instead, people at
| the bottom of the pyramid have to work harder.
|
| > I like my ideal world a lot better.
|
| Me too, though without being consumption-oriented.
| Nonetheless, people who turn a blind eye to the
| weaknesses of humankind often run into unpleasant
| surprises. It requires work, lots of work.
| antonvs wrote:
| > The space in which to find human meaning seems to shrink
| by the day
|
| I don't understand this. It reminds me of the Go player who
| announced he was giving up the game after AlphaGo's
| success. To me that's exactly the same as saying you're
| going to give up running, hiking, or walking because horses
| or cars are faster. That has nothing to do with human
| meaning, and thinking it does is making a really obvious
| category error.
| yamtaddle wrote:
| A lot of human meaning comes from providing value to
| others.
|
| The more computers and machines and institutions take
| that over, the fewer opportunities there are to do that,
| and the more doing that kind of thing feels forced, or
| even like an _indulgence_ of the person providing the
| "service" and an _imposition_ on those served.
|
| Vonnegut wrote quite a bit about this phenomenon in the
| arts--how recording, broadcast, and mechanical
| reproduction vastly diminished the social and even
| economic value of small-time artistic talent. Uncle Bob's
| storytelling can't compete with Walt Disney Corporation.
| Grandma's piano playing stopped mattering much when we
| began turning on the radio instead of having sing-alongs
| around the upright. Nobody wants your cousin's quite good
| (but not _excellent_) sketches of them, or of any other
| subject--you're doing _him_ a favor if you sit for him,
| and when you pretend to give a shit about the results.
| Aunt Gertrude's quilt-making is still kinda cool and you
| don't mind receiving a quilt from her, but you always
| feel kinda bad that she spent dozens of hours making
| something when you could have had a functional equivalent
| for perhaps $20. It's a nice gesture, and you may
| appreciate it, but she needed to give it more than you
| needed to receive it.
|
| Meanwhile, social shifts shrink the set of people for
| whom any of this might even apply, for most of us. I
| dunno, maybe online spaces partially replace that, but
| most of that, especially the creative spaces, seem full
| of fake-feeling positivity and obligatory engagement, not
| the same thing at all as meeting another person you
| know's _actual_ needs or desires.
|
| That's the kind of thing I mean.
|
| The areas where this isn't true are mostly ones that
| machines and markets are having trouble automating, so
| they're still expensive relative to the effort to do it
| yourself. Cooking's a notable one. The last part of our
| pre-industrial social animal to go extinct may well be
| meal-focused major holidays.
| 6gvONxR4sf7o wrote:
| > I think that would be a very bad thing because it would
| legally enshrine a difference between human learning and
| machine learning in a broader sense, and I think machine
| learning has huge potential to improve everyone's lives.
|
| How about we legally enshrine a difference between human
| learning and corporate product learning? If you want to use
| things others made for free, you should give back for free.
| Otherwise if you're profiting off of it, you have to come to
| some agreement with the people whose work you're profiting
| off of.
| Negitivefrags wrote:
| Well Stable Diffusion did give back.
|
| This doesn't seem to satisfy the artists.
| 6gvONxR4sf7o wrote:
| I'm thinking about the people who use SD commercially.
| There's a transitive aspect to this that upsets people.
| If it's unacceptable for a company to profit off your
| work without compensating you or asking for your
| permission, then it doesn't become suddenly acceptable if
| some third party hands your work to the company.
|
| Ideally we'd see something opt-in to decide exactly how
| much you have to give back, and how much you have to
| constrain your own downstream users. And in fact we do
| see that. We have copyleft licenses for tons of code and
| media released to the public (e.g. GPL, CC-BY-SA, CC-BY-NC,
| etc). It lets you define how someone can use your stuff
| without talking to you, and lays out the parameters for
| exactly how/whether you have to give back.
| kevingadd wrote:
| "Giving back" is cute but it doesn't make up for taking
| without permission in the first place. Taking someone's
| stuff for your own use and saying "here's some
| compensation I decided was appropriate" is called Eminent
| Domain when the government does it and it's not popular.
|
| Many people would probably happily allow use of their
| work for this _if asked first_, or would grant it for a
| small fee. Lots of stuff is in the public domain. But you
| have to actually go through the trouble of getting
| permission/verifying PD status, and that's apparently Too
| Hard.
| spitBalln wrote:
| [deleted]
| conviencefee999 wrote:
| It's not that terrifying: the way these models work, they
| aren't really creating new works, just taking other ones
| and basically copying them. Honestly, new copyright laws
| have to be made; I wonder when that will happen, and how
| the judicial systems of the world will deal with it. Or
| if big tech has enough in its pockets to pretend it isn't
| an issue.
| lxe wrote:
| > In my humble estimation, making art is vastly more difficult
| than the huge majority of computer programming that is done.
|
| You're comparing apples to oranges. Digging a trench by hand is
| also vastly more difficult than art or programming.
|
| There's just as much AI hype around code generation, and some
| programmers are also complaining
| (https://www.theverge.com/2022/11/8/23446821/microsoft-
| openai...).
|
| Overall though the sentiment is that AI tools are useful and
| are a sign of progress. The fact that they are stirring so much
| contention and controversy is just a sign of how revolutionary
| they are.
| segmondy wrote:
| Tools happen, folks get automated away and need to retool to
| make themselves useful. It will happen in computing; as a
| matter of fact, it already has.
|
| What do you think cloud computing did? A lot of sysadmins,
| networking, backups, ops went the way of the dinosaurs. A lot
| of programmers have also fallen by the wayside, replaced by
| tech, and need to catch up.
|
| Wallowing in pity is not going to help; we saw a glimpse
| of this with github-copilot. Some people built the hardware,
| the software behind these AIs, some others are constructing the
| models, applying it to distinct domains. There's work to be
| done for those who wish to find their place in the new world.
| spinach wrote:
| But people aren't being automated away - their work is input,
| and for the AI generated art to remain fresh and relevant
| instead of rehashing old stuff it would need artists to
| continue creating new art. It's not a tool that exists
| independently of people's creative work (although this is
| true of most AI, though it seems particularly terrible with
| art).
| davidguetta wrote:
| Or it's a big non-event... tech changes, culture changes,
| people change #shrug
|
| What about the horse-powered carrioles devastated by cars!!
| eiiot wrote:
| > making art is vastly more difficult than the huge majority of
| computer programming that is done
|
| Creating art is not that much harder than programming, creating
| good art is much harder than programming. That's the reason
| that a large majority of art isn't very good, and why a large
| majority of artists don't make a living by creating art.
|
| Just like the camera didn't kill the artist, neither will AI.
| For as long as art is about the ideas behind the piece as
| opposed to the technical skills required to make it (which I
| would argue has been true since the rise of impressionism) then
| AI doesn't change much. The good ideas are still required, AI
| only makes creating art (especially bad art) more accessible.
| CamperBob2 wrote:
| _The lack of empathy is incredibly depressing..._
|
| You're projecting your own fears on everyone else. I'm a
| programmer, too, among other things. I write code in order to
| get other things done. (Don't you?) It's fucking _awesome_ if
| this thing can do that part of my job. It means I can spend my
| time doing something even more interesting.
|
| What we call "programming" isn't defined as "writing code," as
| you seem to think. It's defined as "getting a machine to do
| what we (or our bosses/customers) want." That part will never
| change. But if you expect the tools and methodologies to remain
| the same, it's time to start thinking about a third career,
| because this one was never a good fit for you.
|
| This argument has come up many times in history, and your
| perspective has never come out on top. Not once. What do you
| expect to be different this time?
| xwdv wrote:
| I have zero empathy for "artists". Art produced for commercial
| purposes is no art at all, a more apt title for such a job is
| "asset creator", and these people are by no means banned from
| using AI generation tools to make their work easier. Already
| artists will generate some logo off a prompt that takes a few
| minutes and charge full price for it. Why cry about it?
|
| I would argue because most AI imagery right now is made for fun
| and not monetary gains, so it is actually a purer form of art.
| nonameiguess wrote:
| The entire history of computer programming is using code
| generation tools to increase the level of abstraction most
| programmers work at. Having yet another one of those doesn't
| seem to present any realistic chance of replacing all of the
| development, testing, maintenance, and refinement of that
| entire stack. If your job is literally just being handed over a
| paragraph or so written requirement for a single function or
| short script, giving back that function/script, and you're
| done, then sure, worry.
|
| But at least every job I've had so far also entailed
| understanding the entire system, the surrounding ecosystem,
| upstream and downstream dependencies and interactions, the
| overall goal being worked toward, and playing some role in
| coming up with the requirements in the first place.
|
| ChatGPT can't even currently update its fixed-in-time knowledge
| state, which is entirely based on public information. That
| means it can't even write a conforming component of a software
| system that relies on any internal APIs! It won't know your
| codebase if it wasn't in its training set. You can include the
| API in the prompt, but then that is still a job for a human
| with some understanding of how software works, isn't it?
| golemotron wrote:
| > The lack of empathy is incredibly depressing...
|
| The thing is, empathy doesn't really do anything. Pandora's Box
| is open and there's no effective way of shutting it that is
| more than a hopeful dream. Stopping technology is like every
| doomed effort that has existed to stop capitalism.
| nitwit005 wrote:
| It's a bit tiresome having people demand you demonstrate
| empathy in every single post. Do you truly want everyone typing
| up a paragraph of how sad they are in every comment? It won't
| actually help anything.
| A4ET8a8uTh0 wrote:
| I don't think there is no empathy here, but there are clear
| divisions on whether this tech will help advance humankind or
| further destabilize the society as a whole.
|
| To be perfectly honest, I absolutely love that particular
| attempt by artists, because it will likely force 'some'
| restrictions on how AI is used and maybe even limit the
| amount of 'blackboxiness' it entails (disclosure of the
| model, the data set used, parameters -- I might be
| dreaming though).
|
| I disagree with your statement in general. HN has empathy and
| not just because it could affect their future world. It is a
| relatively big shift in tech and we should weigh it carefully.
| CuriouslyC wrote:
| If you were transported back to the 19th century, would you
| have empathy for the loom operators smashing mechanical looms?
|
| Art currently requires two skills - technical rendering
| ability, and creative vision/composition. AI tools have
| basically destroyed the former, but the latter is still
| necessary. Professional artists will have to adjust their
| skillset, much like they had to adjust their skillset when
| photography killed portrait painting as a profession.
| yunwal wrote:
| > AI tools have basically destroyed the former
|
| Do you people think art is relegated to digital images only?
| No video? No paintings, sculptures, mixed media, performance
| art, lighting, woodwork, etc., etc. How is it possible
| that everyone seems to ignore that we still need massive
| leaps in AI and robotics to match the technical ability
| of 99% of artists?
| LarryMullins wrote:
| > _The lack of empathy_
|
| Probably has something to do with years of artists trash
| talking engineers.
| meroes wrote:
| Forgive me but I would be lucky to have artists saying
| anything, positive or negative, about my way of life. Being
| knowledgeable in something critically studied is very
| rewarding. You are forsaking opportunity if I dare say so.
| LarryMullins wrote:
| I don't understand your response, maybe I should clarify my
| comment. What I'm saying is there has historically been a
| fair amount of animosity and mean hearted banter between
| engineer types and artistic types. Particularly, artists
| sharing and promoting negative stereotypes about engineers.
| Claims that engineers are antisocial, can't design
| interfaces for 'real people', etc. Now that the fruit of
| engineering labor has threatened artists, it doesn't
| surprise me that engineers have little sympathy for the
| artists.
| dr-detroit wrote:
| elektrontamer wrote:
| There's nothing to be depressed about. It's not a lack of
| empathy, it's recognition of the inevitable. Developers realize
| that there is no going back. AI art is here to stay. You can't
| ban or regulate it. It would be extremely hard to police. All
| there is left to do is adapt to the market like you did, even
| if it's extremely difficult. It's not like AI made it
| significantly harder anyway. The supply of artists far
| surpassed the demand for them before the advent of AI art.
|
| Edit: Typo
| FamosoRandom wrote:
| Being afraid is the best way to run away from what's coming. If
| a computer can easily do some work, simply use that work to
| your advantage, and do something more complicated. If a
| computer can generate art, use what's generated to your
| advantage and do something better.
|
| As long as the world is not entirely made of AI, there will
| always be some expertise to add, so instead of being afraid,
| you should just evolve with your time
| telesilla wrote:
| Exactly! Didn't the rise of abstract art coincide with the
| ubiquity of photography? Realism in painting was no longer
| needed by the populace to the previous extent.
|
| Artists will survive through innovation.
| ImHereToVote wrote:
| What if everything more complicated can have a neural tool
| that is equivalent in some respects?
| amelius wrote:
| You are assuming that AI will always be "open" and accessible
| to anyone.
| FamosoRandom wrote:
| I do, because I don't see why it wouldn't be. If it's
| revolutionary and a lot of people need it, or if it
| completely changes the way people work/live/are
| entertained, it will certainly evolve to be as accessible
| as possible. A successful product is a product used by
| many.
|
| If it's not, why worry about it?
| pixl97 wrote:
| I mean, nuclear bombs are not free and available to
| anyone, yet quite a few people on this planet are
| involved in worrying about them.
| koshnaranek wrote:
| If the demand is high, so will be the price. And for AI,
| data is everything. And data is the domain of the biggest
| companies.
| amelius wrote:
| > I do, because I don't see why it wouldn't be.
|
| Because of how capitalism works and people always try to
| corner markets, extract value from other people, etc.
| etc.?
|
| > If it's not, why worry about it ?
|
| Because we can choose different professions that are less
| susceptible to automation? Or we can study DL to
| implement our own AI.
| ajmurmann wrote:
| > we can choose different professions that are less
| susceptible to automation
|
| What are those? It seems it's low-margin, physical work
| that's seeing the least AI progress. Like berry picking.
| Maybe also work that will be kept AI-free longer by
| regulators like being a judge?
| amelius wrote:
| Perhaps surgery. Or cooking.
| TchoBeer wrote:
| idk about cooking, but surgery is already seeing AI-aid.
| boredemployee wrote:
| >> making art is vastly more difficult than the huge majority
| of computer programming that is done.
|
| I completely agree with it. Take a contemporary pianist for
| example, the amount of dedication to both theory and practice,
| posture, mastering the instrument and what not, networking
| skills, technology skills, video recording, music recording,
| social media management, etc.
| drinfinity wrote:
| You think music theory is more demanding than CS? I've
| dedicated decades and probably 75% of my youth to mastering
| this instrument called a computing device. It has numerous
| layers, each completely different and each significant enough
| to build a standalone career out of (OS, networking, etc). I
| feel insulted if you think playing and mastering a piano is
| the same thing.
|
| Extreme specialists are found everywhere. Mastering
| skateboarding at world level will eat your life too, but it's
| not "harder" than programming. At least, for any
| commonsensical interpretation of "harder".
|
| All the rest, we do too. Except I don't record videos and I'm
| sure it is not childishly easy, but it will not eat my life.
| odo1242 wrote:
| have done (doing?) both, music theory is several times
| harder at least
| quonn wrote:
| Again, it depends on the level. Maybe you took trivial CS
| courses. Many parts of CS are indistinguishable from
| mathematics, is that so easy as well? What about the
| various open problems that have remained unsolved for
| decades now in theoretical CS? You think these are
| simpler than music? Really?
| boredemployee wrote:
| >> You think music theory is more demanding than CS?
|
| Of course it is.
| quonn wrote:
| Can't you see that your statement is just as silly, or even
| sillier?
|
| Have you actually looked into CS deeply? Obviously not.
| (I'm not saying this cannot also be true for music, which
| I don't know.)
| boredemployee wrote:
| try to study both and then come back :)
| quonn wrote:
| I couldn't, but I could also not study many other things
| and not because of what you call difficulty. Quite simply
| different people are good at some things and less good at
| others.
|
| Maybe you are better at CS than music and therefore
| perceive it as easy and the other one as hard.
| boredemployee wrote:
| ok man
| CatWChainsaw wrote:
| Speaking as one of the outsiders that the other commenters
| warned you made SV/programmers look bad... yeah, you do
| look bad.
| CadmiumYellow wrote:
| This comment is so arrogant I have to laugh. This kind of
| attitude is exactly why people outside of our industry
| don't think highly of Silicon Valley.
| boredemployee wrote:
| I think today he/she learned an important lesson for
| his/her career: there are things more difficult than the
| epitome, the apogee, the quintessence of professions,
| called computer science.
| monsterbasher wrote:
| I'm literally speechless. What an arrogant and egotistical
| comment. This is why us tech workers have such a bad rep as
| a culturally ignorant/bubbled community. Do a bit of
| research into jazz theory and counterpoint theory before
| you make this kind of blatant over generalization.
| dbfx wrote:
| Thanks for the copypasta
| meroes wrote:
| This is why I come to HN! Thank you!
| dangond wrote:
| This exact comment could be made by a jazz soloist with a
| few words changed and be just as valid. I think you're
| underestimating how deep other fields, including artistic
| fields, are. Anything as competitive as an artistic field
| will always result in amounts of mastery needed at the top
| level that are barely noticeable to outside observers.
| CuriouslyC wrote:
| It isn't harder to be an artist or pianist, it's just that
| the cutoff of employability for these professions is much
| higher. It's like saying playing baseball is harder than
| programming because only a few thousand people are good
| enough to play baseball for a living.
| Der_Einzige wrote:
| A lot of coders have radical open source beliefs.
|
| Basically, the argument is that you should not have ever
| charged for your art, since its viewing and utility is
| increased when more people see it.
|
| The lack of empathy comes from our love of open source. That's
| why. These engineers have been pirating books, movies, games
| for a long time. Artists crying for copyright has the same
| sound as the MPAA suing grandma 20 years ago.
| meroes wrote:
| This could easily be flipped on its head. Artists wanting
| more control over their creations ensures bad actors can't
| use/misuse as easily. Freely creating tools for any bad
| actors to use/misuse appears incredibly naive in this light.
|
| Now, was Aaron Swartz (who I view as an ultimate example of
| this open source idea you cite) naive? No. Maybe he knew in
| his heart the greater good would outweigh anything.
|
| But I don't think we should judge too harshly merely falling
| on one side of this issue or not. Perhaps it's down to a
| debate about what creation/truth/knowledge actually are.
| Maybe some creators (of which artists and computer scientists
| are) view creations as something they bring into the world,
| not reveal about the world.
| AlexandrB wrote:
| Setting aside questions of whether there is copyright
| infringement going on, I think this is an unprecedented case in
| the history of automation replacing human labor.
|
| Jobs have been automated since the industrial revolution, but
| this usually takes the form of someone inventing a widget that
| makes human labor unnecessary. From a worker's perspective, the
| automation is coming from "the outside". What's novel with AI
| models is that the workers' own work is used to create the
| thing that replaces them. It's one thing to be automated away,
| it's another to have your own work used against you like this,
| and I'm sure it feels extra-shitty as a result.
| gottebp wrote:
| We need a better way to reward the contributing artists
| making the diffusion models possible. Might we be able to
| come up with a royalty model, where the artist that made the
| original source content used in training the diffusion model,
| gets a fractional royalty based on how heavily it is used
| when generating the prompted art piece? We want to
| incentivize artists to feed their works, and original styles,
| into future AI models.
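A rough sketch of the split being proposed, in Python. Everything here is hypothetical: it assumes some upstream attribution step can already assign each training work an influence score for a given generated image, which is the genuinely hard and unsolved part; the function only shows the easy bookkeeping of dividing a per-image fee by those scores.

```python
# Hypothetical royalty split for one generated image. The
# influence scores are assumed inputs; producing them reliably
# from a diffusion model is the unsolved part of the proposal.

def split_royalty(fee, influence):
    """Divide a flat per-image fee among artists by influence.

    fee: total royalty collected for one generated image
    influence: dict mapping artist -> non-negative influence score
    Returns a dict mapping artist -> payout, summing to fee
    (or to zero if no work had any measured influence).
    """
    total = sum(influence.values())
    if total == 0:
        return {artist: 0.0 for artist in influence}
    return {artist: fee * score / total
            for artist, score in influence.items()}

# Example: a $0.10 fee split across three contributing artists.
payouts = split_royalty(0.10, {"alice": 3.0, "bob": 1.0, "carol": 1.0})
```

Whether per-image influence can be estimated at all for a diffusion model, and at what cost, is the open question the whole royalty idea hinges on.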
| Archelaos wrote:
| > From a worker's perspective, the automation is coming from
| "the outside".
|
| Not if the worker is an engineer or similar: some engineers
| built tools that improved the building of tools.
|
| And this started even earlier than the industrial revolution.
| Think for example of Johannes Gutenberg. His real important
| invention was not the printing press (this already existed)
| and not even moveable types, but a process by which a printer
| could mold his own set of identical moveable types.
|
| I see a certain analogy between what Gutenberg's invention
| meant for scribes then and what Stable Diffusion means for
| artists today.
|
| Another thought: In engineering we do not have extremely
| long-lasting copyright, but much shorter protection periods
| via patents. I have never understood why software has to be
| protected for such long copyright periods and not for much
| shorter patent-like periods. Perhaps we should look for
| something similar for AI and artists: an artist has
| copyright as usual for close reproductions, but 20 years
| after publication the work may be used without her or his
| consent for training AI models.
| wwweston wrote:
| Absolutely this -- and in many (maybe most cases), there was
| no consent for the use of the work in training the model, and
| quite possibly no notice or compensation at all.
|
| That's a huge ethical issue whether or not it's explicitly
| addressed in copyright/ip law.
| api wrote:
| I really think there's likely to be gigantic class action
| lawsuits in the near future, and I support them. People did
| not consent for their data and work to be used in this way.
| In many cases people have already demonstrated using custom
| tailored prompts that these models have been trained on
| copyrighted works that are not public domain.
| archontes wrote:
| Consent isn't required if they're making their work
| available for public viewing.
| granshaw wrote:
| For VIEWING. This is like blatantly taking your GPL-
| licensed code and using it for commercial purposes
| archontes wrote:
| A thing that can be viewed can be learned from.
|
| I can't copy your GPL code. I might be able to write my
| own code that does the same thing.
|
| I'm going to defend this statement in advance. A lot of
| software developers white knight more than they strictly
| have to; they claim that learning from GPL code
| unavoidably results in infringing reproduction of that
| code.
|
| Courts, however, apply a test [1], in an attempt to
| determine the degree to which the idea is separable from
| the expression of that idea. Copyright protects
| particular expression, not idea, and in the case that the
| idea cannot be separated from the expression, the
| expression _cannot be copyrighted_. So either I'm able
| to produce a non-infringing expression of the idea, or
| the expression cannot be copyrighted, and the GPL license
| is redundant.
|
| [1] https://en.wikipedia.org/wiki/Abstraction-Filtration-
| Compari...
| OctopusLupid wrote:
| It's already explicitly legal to train AI using
| copyrighted data in many countries. You can ignore opt-
| outs too, especially if you're training AI for non-
| commercial purposes. Search up TDM exceptions.
| archontes wrote:
| It is not a huge ethical issue. The artists have _always_
| been at risk of someone learning their style if they make
| their work available for public viewing.
|
| We've just made "learning style" easier, so a thing that
| was always a risk is now happening.
| ilammy wrote:
| This is like saying that continuously surveilling people
| when they are outside of their private property and live-
| reporting it to the internet is not a huge ethical issue.
| For you are always at risk of being seen when in public
| and the rest is merely exercising freedom of speech.
|
| Something being currently legal and possible doesn't mean
| being morally right.
|
| Technology enables things and sometimes the change is
| qualitatively different.
| wwweston wrote:
| Let's shift your risk of immediate assault and death up
| by a few orders of magnitude. I'm sure that you'll see
| that as "just" something that was always a risk, pretty
| much status quo, right?
|
| Oh, life & death is different? Don't be so sure; there's
| good reasons to believe that livelihood (not to mention
| social credit) and life are closely related -- and also,
| the fundamental point doesn't depend on the specific
| example: you can't point to an orders-of-magnitude change
| and then claim we're dealing with a situation that's
| qualitatively like it's "always" been.
|
| "Easier" doesn't begin to honestly represent what's
| happened here: we've crossed a threshold where we have
| technology for production by automated imitation at
| scale. And where that tech works primarily because of
| imitation, the work of those imitated has been a crucial
| part of that. Where that work has a reasonable claim of
| ownership, those who own it deserve to be recognized &
| compensated.
| archontes wrote:
| The 'reasonable claim of ownership' extends to
| restricting transmission, not use after transmission.
|
| Artists are poets, and they're railing against Trurl's
| electronic bard.
|
| [https://electricliterature.com/wp-
| content/uploads/2017/11/Tr...]
| wwweston wrote:
| > The 'reasonable claim of ownership' extends to
| restricting transmission, not use after transmission.
|
| It's not even clear you're correct by the apparent (if
| limited) support of your own argument. "Transmission" of
| _some_ sort is certainly occurring when the work is given
| as input. It 's probably even tenable to argue that a
| copy is created in the representation of the model.
|
| You _probably_ mean to argue something to the effect that
| dissemination by the model is the key threshold by which
| we'd recognize something like the current copyright law
| might fail to apply, the transformative nature of output
| being a key distinction. But some people have already
| shown that some outputs are much less transformative than
| others -- and even that's not the overall point, which is
| that this is a qualitative change much like those that
| gave birth to industrial-revolution copyright itself, and
| calls for a similar kind of renegotiation to protect the
| underlying ethics.
|
| People should have a say in how the fruits of their labor
| are bargained for and used. Including into how machines
| and models that drive them are used. That's part of
| intentionally creating a society that's built for humans,
| including artists and poets.
| archontes wrote:
| I wasn't speaking about dissemination by the model at
| all. It's possible for an AI to create an infringing
| work.
|
| It's not possible for _training_ an AI using data that
| was obtained legally to be copyright infringement. This
| is what I was talking about regarding transmission.
| Copyright provides a legal means for a rights holder to
| limit the creation of a copy of their image in order to
| be transmitted to me. If a rights holder has placed their
| image on the internet for me to view, then copyright does
| not provide them a means to restrict how I choose to
| consume that image.
|
| The AI may or may not create outputs that can be
| considered derivative works, or contain characters
| protected by copyright.
|
| You seem to be making an argument that we should be
| changing this somehow. I suppose I'll say "maybe". But it
| is apparent to me that many people don't know how
| intellectual property works.
| myrryr wrote:
| That is a hard fight to have, since it is the same for
| people. An artist will have watched some Disney movie, and
| that could influence their art in some small way. Does
| Disney have a right to take a small amount from every bit
| of art which they produce from then on? Obviously not.
|
| The real answer is that AIs are not people, and it is ok to
| have
| different rules for them, and that is where the fight would
| need to be.
| MSFT_Edging wrote:
| I don't know why we keep framing artists like they're textile
| workers or machinists.
|
| The whole point of art is human expression. The idea that
| artists can be "automated away" is just sad and disgusting
| and the amount of people who want art but don't want to pay
| the artist is astounding.
|
| Why are we so eager to rid ourselves of what makes us human
| to save a buck? This isn't innovation, its self destruction.
| eikenberry wrote:
| The idea that artists can be automated away is really just
| kind of dumb, not because people like AI created art and
| can get it cheap, but because it has no real impact on the
| "whole point" of the art... for the creation of the art.
| Pure art, as human expression, has no dependency on money.
| Anecdotally I very much enjoy painting and music (and
| coding) as art forms but have never sold a painting nor a
| song in my life. Just because someone won't pay you for
| something doesn't mean it has no value.
|
| As far as money goes... long run artists will still make
| money fine as people will value the people generated
| (artisanal) works. Just as people like hand-made stuff
| today, even though you can get machine-made stuff way
| cheaper. You may not have the generic jobs of cranking out
| stuff for advertisements (and such) but you'll still have
| artists.
| krapp wrote:
| The conversation isn't about you or your hobby, it's
| about _professional_ artists and illustrators, who are
| already being automated away by AI.
| astrange wrote:
| Professional artists have no chance of being automated
| away. They need all the productivity tools they can get.
|
| The ones at risk (and complaining the most) are semipro
| online artists who sell one image at a time, like fanart
| commissions.
| hunter2_ wrote:
| > The whole point of art is human expression.
|
| For someone seeking sound/imagery/etc. resulting from human
| expression (i.e., art), it makes sense that it can't be
| automated away.
|
| For someone seeking sound/imagery/etc. without caring
| whether it's the result of human expression (e.g., AI
| artifacts that aren't art), it can be automated away.
| lolinder wrote:
| Most art consumed today isn't about human expression, and
| it hasn't been for a very long time. Most art is produced
| for commercial reasons with the intent of making as much
| profit as possible.
|
| Art-as-human-expression isn't going anywhere because it's
| intrinsically motivated. It's what people do because they
| love doing it. Just like people still do woodworking even
| though it's cheaper to buy a chair from Walmart, people
| will still paint and draw.
|
| What _is_ going to go away is design work for low-end
| advertising agencies or for publishers of cheap novels or
| any of the other dozens of jobs that were never bastions of
| human creativity to begin with.
| PhasmaFelis wrote:
| I think fine artists and others who make and sell
| individual art pieces for a living will probably be fine,
| yeah. (Or at least won't be struggling much worse than
| they are already.)
|
| There are a _lot_ of working commercial artists in
| between the fine art world and the "cheap novels and
| low-end advertising agencies" you dismiss, and there's no
| reason to think AI art won't eat a lot of their
| employment.
| lolinder wrote:
| Just like AI can't replace programmers completely because
| most people are terrible at defining their own software
| requirements, AI won't replace middle-tier commercial
| artists because most people have no design sense.
|
| Commercial art needs to be eye catching and on brand if
| it's going to be worth anything, and a random intern
| isn't going to be able to generate anything with an AI
| that matches the vision of stakeholders. Artists will
| still be needed in that middle zone to create things that
| are on brand, that match stakeholder expectations, and
| that stand out from every other AI generated piece. These
| artists will likely start using AI tools, but they're
| unlikely to be replaced completely any time soon.
|
| That's why I only mentioned the bottom tier of commercial
| art as being in danger. The only jobs that can be
| replaced by AI with the technology that we're seeing
| right now are in the cases where it really doesn't matter
| exactly what the art looks like, there just has to be
| _something_.
| archontes wrote:
| Of course it will. Their employment isn't sacred. They
| have a skill, we're teaching that skill to computers, and
| their skill will be worth less.
|
| I don't pay someone to run calculations for me, either,
| also a difficult and sometimes creative process. I use a
| computer. And when the computer can't, _then_ I either
| employ my creativity, or hire a creative.
| nescioquid wrote:
| It's an important distinction you make and hard to talk
| about without a vocabulary. The terms I've seen music
| historians use for this concept were:
|
| - generic expression: commercial/pop/entertainment;
| audience makes demands on the art
|
| - autonomous expression: artist's vision is paramount;
| art makes demands on the audience
|
| Obviously these are idealized antipodes. The question
| about whether it is the art making the demands on the
| audience or the audience making demands on the art is
| especially insightful in my opinion. Given this rubric,
| I'd say AI-generated art must necessarily belong to
| "generic expression" simply because its output has to
| meet fitness criteria.
| soerxpso wrote:
| You're defining the word "art" in one sentence and then
| using a completely different definition in the next
| sentence. Where are these people who want art, as you've
| defined it, but don't want to pay? Most of the people
| you're referring to want visual representations of their
| fursonas, or D&D characters, or want marketing material for
| their product. They're not trying to get human expression.
|
| In the sense that art is a 2D visual representation of
| something, or a marketing tool that evokes a biological
| response in the viewer, art is easy to automate away. This
| is no different than when the camera replaced portraitists.
| We've just invented a camera that shows us things that
| don't exist.
|
| In the sense that art is human expression, nobody has even
| tried to automate that yet and I've seen no evidence that
| expressionary artists are threatened.
| BoiledCabbage wrote:
| Because when people discuss "art" they are really
| discussing two things.
|
| Static 2D images that usually serve a commercial purpose.
| Ex logos, clip art, game sprites, web page design and the
| like.
|
| And the second is pure art whose purpose is more for the
| enjoyment of the creator or the viewer.
|
| Business wants to fully automate the first case, and most
| people view it as having nothing to do with the essence of
| humanity. It's simply dollars for products - but it's also
| one of the very few ways that artists can actually have
| paying careers for their skills.
|
| The second will still exist, although almost nobody in the
| world can pay bills off of it. And I wouldn't be shocked if
| ML models start encroaching there as well.
|
| So a lot of what's being referred to is more like textile
| workers. And anyone who can type a few sentences can now
| make "art" significantly lowering barriers to entry. Maybe
| a designer comes and touches it up.
|
| The short-sighted part is people thinking that this will
| somehow stay specific to art and that their cherished field
| is immune.
|
| Programming will soon follow. Any PM "soon enough" will be
| able to write text to generate a fully working app. And
| maybe a coder comes in to touch it up.
| andrepew wrote:
| I wouldn't say automation coming from the inside is unique
| to AI art. You very much need a welder's understanding of
| welding in order to be able to automate it, for example.
|
| I'd just say the scale is different. Old school automation
| just required one expert to guide the development of an
| automation. AI art requires the expertise of thousands.
| fckgnad wrote:
| drinfinity wrote:
| Making art is not "vastly more difficult" or at least it is
| (IMO) highly debatable. Some parts of it require decades of
| experience to do with any kind of excellence, yes. That's also
| the case with powerlifting, figure skating and raising children
| and indeed programming. It's just that your boss made a money
| printer that takes in bullshit and outputs bullshit which gives
| you your cosy job.
|
| But that is not "programming". That is glueing together
| bullshit until it works and the results of that "work" are
| "blessing" us everyday. The gift that keeps on giving. You
| FAANG people are indeed astronomically, immorally, overpaid and
| actively harm the world.
|
| But, luckily, the world has more layers than that. Programming
| for Facebook is not the same as programming for a small
| chemical startup or programming in any resource-restricted
| environment where you can't just spin up 1000 AWS instances at
| your leisure and you actually have to know what you're doing
| with the metal.
| lordfrito wrote:
| I want to apologize in advance if my response here seems
| callous considering your personal experience as an artist. I'm
| trying to talk about AI and labor in general here, and don't
| mean to minimize your personal experience.
|
| That said, I don't think AIs ability to generate art is a major
| milestone in the progress of things, I think it's more of the
| same, automating low value-add processes.
|
| I agree that AI is/will-be an incredibly disruptive technology.
| And that automation in general is putting more and more people
| out of jobs, and extrapolated forward you end up in a world
| where most humans don't have any practical work to do other
| than breed and consume resources at ever increasing rates.
|
| As much as I'm impressed by AI art (it's gorgeous), at the end
| of the day it's mainly just copying/pasting/smoothing out
| objects it's seen before (training set). We don't think of it
| as clipart, but that's essentially what it is underneath it
| all, just a new form of clipart. Amazing in its ability to
| reposition, adjust, smooth images, have some sense of artistic
| placement, etc. It's lightyears beyond where clipart started
| (small vector and bitmap libraries). But at the end of the day
| it's just automating the creation of images using clipart.
| Rearranging images you've seen before is not going to make
| anyone big $$$. End of the day the quality of the output is
| entirely subjective, just about anything reasonable will do.
|
| This reminds me a lot of GPT-3... looks like it has substance
| but not really. GPT-3 is great at making low value clickbait
| articles of cut-and-paste information on your favorite band or
| celebrity. GPT-3 will never be able to do the job of a real
| journalist, pulling pieces together to identify and expose
| deeper truths, to say, uncover the Theranos fraud. It's just
| Eliza [1] on steroids.
|
| The AI parlor tricks started with Eliza, and have gotten quite
| elaborate as of late. But they're still just parlor tricks.
|
| Comparing it to the challenges of programming, well yes I agree
| AI will automate portions of it, but with major caveats.
|
| A lot of what people call "programming" today is really just
| plumbing. I'm a career embedded real-time firmware engineer,
| and it continues to astonish me that there's an entire
| generation of young "programmers" who don't understand basic
| computing principles, stacks, interrupts, I/O operations.. at
| the end of the day their knowledge base seems comprised of
| knowing which tool to use where in orchestration, and how to
| plumb it together. And if they don't know the answer they
| simply google and stack overflow will tell them. Low code, no
| code, etc. (python is perfect for quickly plumbing two systems
| together). This skill set is very limited and wouldn't even get
| you a junior dev position when I started out. I'm not surprised
| it's easy to automate, as it will generally have the same
| quality code (and make the same mistakes) as a human dev that
| simply copies/pastes Stack Overflow solutions.
|
| This is in stark contrast to the types of problems that most
| programmers used to solve in the old days (and a smaller number
| still do). Stuff that needed an engineering degree and complex
| problem solving skills. But when I started out 30 years ago,
| "programmers" and "software engineers" were essentially the
| same thing. They aren't now, there is a world of difference
| between your average programmer and a true software engineer
| today.
|
| Not saying plumbers aren't valuable.. they absolutely are as
| more and more of the modern world is built on plumbing things
| together. Highly skilled software engineers are needed less and
| less, and that's a net-good thing for humanity. No one needs to
| write operating systems anymore; let's add value building on
| of them. Those are the people making the big $$$, their
| skillset is quite valuable. We're in the middle of a
| bifurcation of software engineering careers. More and more
| positions will only require limited skills, and fewer and fewer
| (as a percentage) will continue to be highly skilled.
|
| So is AI going to come in and help automate the plumbing? Heck
| yes, and rightly so... They've automated call centers,
| warehouse logistics, click-bait article writing, carry-out
| order taking, the list goes on and on. I'd love to have an AI
| plumber I could trust to do most of the low-level work right
| (and in CI/CD world you can just push out a fix if you missed
| something).
|
| I don't believe for a second that today's latest and greatest
| "cutting edge" AI will ever be able to solve the hard problems
| that keep highly skilled people employed. New breakthroughs are
| needed, but I'm extremely skeptical. Like fusion promises,
| general purpose AI always seems just a decade or two away.
| Skilled labor is safe, for now.. maybe for a while yet.
|
| The real problem as I see it, is that AI automation is on
| course to eliminate most low skilled jobs in the next century,
| which puts it on a collision course with the fact that most
| humans aren't capable of performing highly skilled work (half
| are below average by definition). A single parent working
| the GM line in the '50s could give an average family a decent
| life. Not so much where technology is going. At the end of the
| day the average human will have little to contribute to
| civilization, but still expects to eat and breed.
|
| Universal basic income has been touted as a solution to the
| coming crisis, but all that does is kick the can down the road.
| It leads to a world of too much idle time (and the devil will
| find work for idle hands) and ever growing resource
| consumption. A perfect storm.... at the end of the day what's
| the point of existing when all you do is consume everything
| around you and don't add any value? Maybe that's someone's idea
| of utopia, but not mine.
|
| This has been coming for a long time, AI art is just a small
| step on the current journey, not a big breakthrough but a new
| application in automation.
|
| /rant
|
| [1] https://en.wikipedia.org/wiki/ELIZA
| unity1001 wrote:
| > entire generation of young "programmers" who don't
| understand basic computing principles, stacks, interrupts,
| I/O operations
|
| Why would software engineers who work on web apps,
| kubernetes, and the internet in general need to understand
| interrupts? Not only will they never deal with any of that,
| they aren't supposed to. All of that has been automated away
| so that what we call the Internet can be possible.
|
| All of that turned into specializations as the tech world
| progressed and the ecosystem grew. A software engineer
| specialized in hardware would need to know interrupts but
| wouldn't need to know how to do devops. For the software
| engineer who works on Internet apps, it's the opposite.
| lordfrito wrote:
| I'm not dissing cloud engineering. I've learned enough to
| really respect the architects behind these large scale
| systems.
|
| My point was about skill level, not specialization.
| Specialization is great.. we can build bigger and bigger
| things not having to engineer/understand what's beneath
| everything. We stand on the shoulders of giants as they
| say.
|
| And I agree, there is no one job specialization that's more
| valuable than the other. It's contextual. If you have a
| legal problem, a specialized lawyer is more valuable than a
| specialized doctor. So yeah I agree that if you have a
| cloud problem, you want a cloud engineer and not a firmware
| engineer. Although I should add that things like
| interrupts/events/synchronization and I/O operations are
| fairly universal computing concepts even in the cloud
| world. If you're a cloud programmer and you don't know how
| long an operation takes / its big-O complexity, how much
| storage it uses / its persistence, etc., you're probably
| going to have some explaining to do when your company gets
| next month's AWS bill.
|
| And yes plumbing is useful! Someone has to hook stuff up
| that needs hooking up! But which task requires more skill;
| the person that designs a good water flow valve, or the
| person hooking one up? I'd argue the person designing the
| valve needs to be more skilled (they certainly need more
| schooling). The average plumber can't design a good flow
| valve, while the average non-plumber can fix a leaky sink.
|
| AI is eating unskilled / low-skill work. In the '80s,
| production line workers were afraid of robots. Well, here
| we are. No more pools of typists, automated call centers
| handling huge volumes of people, dark factories.
|
| It's a terrible time to be an artist if AI can clipart
| compose images of the same quality much faster than you can
| draw by hand.
|
| Back to original comment: I'm merely suggesting that some
| programming jobs require a lot more skill than others. If
| software plumbing is easy, then it can and will be
| automated. If that were the only skill I possessed, I'd be
| worried about my job.
|
| Like fusion, I just don't see general-purpose AI being a
| thing in my lifetime. For highly skilled programmers, it's
| going to be a lot longer before they're replaced.
|
| Welcome to our digital future. It's very stressful for the
| average skilled human.
| rhn_mk1 wrote:
| Not being afraid of AI is not necessarily due to a lack of
| empathy. It could be due to acceptance: perhaps AI will make
| programmers obsolete. That is fine, programming is really
| boring most of the time, when it's just cobbling things
| together. Even if it will be to the short term disadvantage of
| some people (including the speaker), AI taking over tedious
| programming tasks will make humanity richer.
|
| It's up to us to distribute those gains back.
| alxlu wrote:
| I think the issue is that our laws and economy are not
| structured in a way that makes it likely for those gains to
| be distributed back to anyone other than the ultra wealthy.
| Not that I expect AI to take over most programming jobs
| anytime soon (or ever), but if it does, it would almost
| certainly happen long before society manages to agree on a
| system to distribute those gains back in a way that benefits
| the average person.
| rhn_mk1 wrote:
| I believe that was the concern of the Luddite movement.
| While they failed, we can learn from them this time.
| imgabe wrote:
| I've played around a bit with Stable Diffusion and as far as I
| can tell, it's just a new tool, like a much better paintbrush.
|
| It still needs a human to tell it what to paint, and the best
| outputs generally require hours of refinement and then possibly
| touch-up in photoshop. It's not generating art on its own.
|
| Artists still have a job in deciding what to make and using
| their taste to make it look good, that hasn't changed. Maybe
| the fine-motor skills and hand-eye coordination are not as
| necessary as they were, but that's it.
| kecupochren wrote:
| > require hours of refinement
|
| Not disagreeing with your comment, but this is not the case
| with Midjourney. Very little is needed to produce stunning
| images. But afaik they modify/enhance the prompts behind the
| scenes.
| mtrower wrote:
| There's a big difference though between "a stunning image"
| and "the stunning image you wanted".
| kecupochren wrote:
| That's very true, I stand corrected. I see people tuning
| their prompts for hours on public MJ channels
| yamtaddle wrote:
| A key difference is someone with some prompt-writing
| skills and a tiny amount of aesthetic taste can now
| compete with trained artists who actually know how to
| create such images from scratch. Sally in Sales and Tom
| in Accounting can also do art as part of their job,
| whenever it calls for art. And copy-writing, etc. Or
| will be able to in the near future. Fewer dedicated
| artists, fewer dedicated writers, and so on. One artist
| can do the work of ten, and almost anyone in the office
| can pinch-hit to do a little art or writing here and
| there (by which I mean, tell a computer to make some art,
| then select which of that art is best).
| nikanj wrote:
| Coders have been using "AI" for ages. You used to write
| assembly by hand, then got a compiler that you could just
| instruct to generate the code for you. I don't worry about my
| job, even though a single prompt to REPL can now replace
| thousands of hand-crafted machine instructions
| eddiewithzato wrote:
| yea no, the difficulty in programming as a career is
| interaction with other humans. I would like AI to reach the
| stage where it can comprehend solutions that stakeholders don't
| know themselves.
|
| Because in my time the stakeholders in companies have never
| actually been decisive when scoping features.
|
| Co-pilot is indeed the endgame for AI assisted programming. So
| I would say for art, someone mindful could train an AI on their
| own dataset and use that to accelerate their workflow. Imagine
| it drawing outlines instead of the full picture.
| mysterydip wrote:
| > the difficulty in programming as a career is interaction
| with other humans
|
| It would be great if there was an AI that could be a liaison
| between developers and stakeholders, translating the
| languages of each side for mutual understanding.
| imknewhere wrote:
| What I find interesting is how people literally cannot see any
| alternative besides, "This is just the way capitalism works",
| which implicitly acknowledges "capitalism is the only way it
| can work".
| mallvinegar wrote:
| Reminds me of this quote from Mark Fisher:
|
| " _Observing humans under capitalism and concluding it 's
| only in our nature to be greedy is like observing humans
| under water and concluding it's only in our nature to
| drown._"
| exceptione wrote:
| Spot on. Our thinking on these matters is more adherence to
| faith than reason. We are stuck in a collective meme.
|
| A belief system that centers around human well-being sounds
| more reasonable than *unbounded* capitalism. We know it; we
| just don't know what to do with it.
| mypastself wrote:
| Tangentially, this is something I think about from time to
| time: in tech, you can be mediocre and live a very comfortable
| life. In art (and many other areas), you often have to be
| extraordinary just to make ends meet.
|
| So I don't think art is "harder". It's just harder for the
| average practitioner/professional to find "success" (however
| you like to define it).
| ajmurmann wrote:
| I wonder if this is due to existing forms of automation in
| art. Artists have been competing with reproductions of art in
| the form of recordings and prints for a long time now. That
| creates a really high floor. How many people who play an
| instrument have people around them genuinely want to listen
| to them play rather than a recording? How much lower would
| the bar be if recordings didn't exist?
|
| Of course software gets copied all the time, but we have jobs
| because so much bespoke software is needed. Looking at some
| of what AI can do now, I wouldn't be surprised if our floor
| gets raised a lot in the next few years as well.
| anticristi wrote:
| I think about this too, and I wonder why.
|
| Are artists really "doomed"? Or are they just worse at
| redistribution?
| MomoXenosaga wrote:
| Artists will exist as long as they can entertain the elite
| with their clown antics.
|
| Be entertaining. Be outrageous. Be endearing. An AI can't
| cut off its own ear.
| eulers_secret wrote:
| IMO, the 'why' is due to how mature the industry is - it'll
| absolutely be the future for every profession, given enough
| time. It's the natural distribution of wealth in our
| society: Few have too much, most have not enough.
|
| We're all "doomed" if this is the case.
| VoodooJuJu wrote:
| Oh don't worry, they'll learn empathy real fast when Co-pilot
| becomes just Pilot and they have to take a passenger seat.
| CuriouslyC wrote:
| Except engineers won't become passengers, they will become
| air traffic controllers.
| dumbaccount123 wrote:
| aaroninsf wrote:
| These are society-wide problems, not a failure of empathy on
| the part of "technical people."
|
| The lack you find depressing is natural defensiveness in the
| face of hostility rooted in fear and, in most cases, broad
| ignorance of both the legal and technical context and
| operation of these systems.
|
| We might look at this and say, "there should have been a roll
| out with education and appropriate framing, they should have
| managed this better."
|
| This may be true, but of course there is no "they"; so here
| we are.
|
| I understand the fear, but my own empathy is blocked by
| hostility in specific interactions.
| ReptileMan wrote:
| I think that programmers are safe for now - because of the law
| of leaky abstractions. And there is hardly a bigger, leakier
| abstraction than AI-generated code.
| jhbadger wrote:
| I think it's more a lack of historical perspective on the part
| of artists. I remember when Photoshop and other digital art
| tools became available and many artists were of the opinion
| "Feh! Digital art isn't really art. Real artists work with
| pens, brushes, and paper!". Fast forward a couple of decades
| and you won't find many artists still saying that. Instead
| they've embraced the tools. I expect the future won't be AI art
| vs human art but rather a hybrid as art tools incorporate the
| technique and artists won't think it is any less art than using
| other digital tools.
| odessacubbage wrote:
| the issue at hand has nothing to do with gatekeeping, elitism
| or any kind of pseudo-debate about what constitutes real art.
|
| people are mad because job & portfolio sites are being
| flooded with aishit which is making them unusable for both
| artists and clients.
|
| people are mad because their copyright is being scraped and
| resold for profit by third parties without their consent.
|
| whether ai is _the future_ is an utterly meaningless
| distraction until these concerns are addressed. as an aside,
| ai evangelists telling working professionals that they
| 'simply don't get' their field of expertise has been an
| incredibly poor tack for generating goodwill towards this
| technology or the operations attempting to extract massive
| profit from its implementation.
| broast wrote:
| I'm both a digital artist and programmer. I never thought it
| would happen before, but I accept that this technology can
| easily replace some aspects of my professional value. But I
| don't let it take away from my experience and capacity to be
| creative, so I still think I have an advantage when leveraging
| these tools, and I've started to use them every day.
|
| Rendering was only ever a small part of the visual arts process
| anyway. And you can still manually add pixel perfect details to
| these images by hand that you wouldn't know how to create an AI
| prompt for. And further, you can mash together AI outputs in
| beautifully unique and highly controlled ways to produce
| original compositions that still take work to reproduce.
|
| To me, these AIs are just a tool for increased speed, like
| copy and paste.
| threatofrain wrote:
| One of the things that I find problematic is that we enjoy
| so many conveniences and efficiencies that taking a step
| back feels unimaginable. We used to have human computers;
| going back on that to rescue an old profession would seem
| unimaginable. Paying individual taxes is very easy in many
| nations. Going to what the US has just to rescue accounting
| jobs seems absurd.
|
| Now imagine a future where AI can assist in law. Or should we
| not have that because lawyers pay so much for education and
| they work so bitterly? Should we do away with farm equipment as
| well? Should we destroy musical synths so that we can have more
| musicians?
|
| It's one thing to say we should have a government program to
| ease transitions in industry. It's something else to say that
| we should hold back technological progress because jobs will be
| destroyed.
|
| How do we develop a coherent moral framework to address this
| matter?
| majani wrote:
| It's quite typical of devs in my experience. I remember during
| the MegaUpload/Pirate Bay arrests, devs were quite up in arms
| about big media going after pirates, but when it came to devs
| going after app pirates with everything they've got, they were
| real quiet
| themagician wrote:
| I never thought leopards would eat MY face!
|
| Creative professionals might take the first hit in professional
| services, but AI is going to come for engineers at a much
| faster and more furious pace. I would even go so far as to say
| that some (probably a small number) of the people who have
| recently gotten laid off at big tech companies may never see a
| paycheck as high as they previously had.
|
| The vast majority of software engineering hours that are
| actually paid are for maintenance, and this is where AI is
| likely to come in like a tornado. Once AI hits upgrade and
| migration tools it's going to eliminate entire teams
| permanently.
| willsmith72 wrote:
| > The vast majority of software engineering hours that are
| actually paid are for maintenance
|
| Do you have a source for that? Doesn't match my experience
| unless your definition of maintenance is really broad
| themagician wrote:
| Just experience, but my definition is pretty broad. Once
| you get out of the valley most of what pays well (banking,
| finance, telecom, analytics, industrial, etc.) is
| maintenance code IMO. Basically anything that doesn't come
| out of a real R&D budget, even if it is a "new feature", is
| maintenance to me at this point.
| itronitron wrote:
| The caveat there is 'paid hours'. The current working model
| for the industry is that all software engineers
| _leetcodeuberhack_ on open source repos at night and by day
| have paying jobs maintaining companies' systems that use
| open source.
| SkyPuncher wrote:
| > The vast majority of software engineering hours that are
| actually paid are for maintenance, and this is where AI is
| likely to come in like a tornado.
|
| I have the exact, almost completely opposite opinion.
| Greenfield is where AI is going to shine.
|
| Maintenance is riddled with "gotchas", business context, and
| legacy issues that were all handled and negotiated over
| outside of the development workflow.
|
| By contrast, AI can pretty easily generate a new file based
| on some form of input.
| grandmczeb wrote:
| > The vast majority of software engineering hours that are
| actually paid are for maintenance, and this is where AI is
| likely to come in like a tornado. Once AI hits upgrade and
| migration tools it's going to eliminate entire teams
| permanently.
|
| There's been huge improvements in automating maintenance, and
| yet I've never once heard someone blame a layoff on e.g.
| clang-rename (which has probably made me 100x more productive
| at refactoring compared to doing it manually.)
|
| I'd even say your conclusion is exactly backwards. The
| implicit assumption is that there's a fixed amount of
| engineering work to do, so any automation means fewer
| engineers. In reality there is no such constraint. Firms hire
| when the marginal benefit of an engineer is larger than the
| cost. Automation increases productivity, causing firms to
| hire _more_, not fewer.
| scj wrote:
| I believe the current generation of AI would be better suited
| to augmenting human understanding of code (through static
| analysis tools and the like), rather than generating it.
|
| On an infinite timeline humans will no longer be needed in
| the generation of code (we hopefully will still study and
| appreciate it for leisure), but I doubt we're there yet.
| bryanrasmussen wrote:
| Much of the history of programming has been programmers making
| other jobs obsolete, and indeed there is a saying that a good
| programmer makes themselves obsolete.
| dumbaccount123 wrote:
| If tech that makes programmers obsolete comes, then we are
| living in a new era. Pretty much every single job will be
| obsolete by then.
| dlkf wrote:
| > why shouldn't it, with some finagling, also be able to do the
| dumb, simple things most programmers do for their jobs?
|
| Because those things, while dumb and simple, are not continuous
| in the way that visual art is. Subtle perturbations to a piece
| of visual art stay subtle. There is room for error. By
| contrast, subtle changes to source code can have drastic
| implications for the output of a program. In some domains this
| might be tolerable, but in any domain where you're dealing
| with significant sums of money it won't be.
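|
| A toy sketch of that discontinuity (a hypothetical interest
| calculation, not anything from the thread): a one-character
| edit shifts the result by an entire compounding period.

```python
# One-character perturbations in code are not "subtle":
# changing "<" to "<=" below adds a whole extra year of
# interest, a drastic jump in output for a tiny edit.

def balance(principal: float, rate: float, years: int) -> float:
    """Compound `principal` at `rate` once per year."""
    total = principal
    y = 0
    while y < years:  # "<=" here would silently charge one extra year
        total *= 1 + rate
        y += 1
    return total

print(round(balance(1000.0, 0.05, 10), 2))  # prints 1628.89
```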
| cardanome wrote:
| I mean sure things will get harder for some artists but what is
| to be done about it? What will feeling sorry for them
| accomplish?
|
| The job market will always keep on changing; you have to
| adapt to it to a certain degree.
|
| Now we can talk about supporting art as a public good and I am
| all for that but I don't see how artists are owed a corporate
| job. Many of my current programming skills will be obsolete
| one day; that's part of the game.
| gus_massa wrote:
| I think the correct way to get empathy is to use an equivalent
| that technical people understand, like Copilot:
|
| * Can a Copilot-like generator be trained with the GPL code of
| RMS? What is the license of the output?
|
| * Can a Copilot-like generator be trained with the leaked
| source code of MS Windows? What is the license of the output?
| Terretta wrote:
| Your example is like saying we should have empathy for people
| who can whittle when a 3D printer can now extrude the same
| design in bulk. Or like empathy for London cabbies having to
| learn roads when "anyone" can A-to-B now with a phone.
|
| Code should not need to be done by humans at all. There's no
| reason coding as it exists today should exist as a job in the
| future.
|
| Any time I or a colleague are "debugging" something, I'm just
| sad we are so "dark ages" that the IDE isn't saying "THERE,
| humans, the bug is THERE!" in flashing red. The IDE has the
| potential to have perfect information, so "where is the bug"
| is solvable.
|
| The job of coding today should continue to rise up the stack
| tomorrow to where modules and libraries and frameworks are
| just things machines generate in response to a dialog about
| _"the job to be done"_.
|
| The primary problem space of software is in the business
| domain, today requiring people who speak barely abstracted
| machine language to implement -- still such painfully early
| days.
|
| We're cavemen chipping at rocks to make fire, still amazed at
| the trick. No empathy; just self-awareness sufficient to
| provoke us into researching fusion.
| Kalium wrote:
| We can and should have empathy for all those people.
|
| The question is perhaps not if we should have empathy for
| them. The question is what we should do with it once we
| have it. I have empathy for the cabbies with the Knowledge
| of London, but I don't think making any policy based on or
| around that empathy is wise.
|
| This is tricky in practice. A surprising number of people
| regard prioritizing the internal emotional experience of
| empathy in policy as experiencing empathy.
| PeterisP wrote:
| I don't think that's a road to empathy, because if we're
| talking about the matter of empathy i.e. "emotional should's"
| instead of nuances of current legal policy, then I'd expect a
| nontrivial part of technical people to say that a morally
| reasonable answer to both these scenarios could (or should)
| be "Yes, and whatever you want - not treated as derivative
| work bound by the license of the training data", which
| probably is the opposite of what artists would want.
|
| While technically both artists and developers make their
| living by producing copyrighted works, our relationship to
| copyright is very different; while artists rely on copyright
| and overwhelmingly support its enforcement as-is, many
| developers (including myself) would argue for a significant
| reduction of its length or scale.
|
| For tech workers (tech company owners could have a different
| perspective) copyright is just an accidental fact of life,
| and since most paid development work is done as work-for-hire
| for custom stuff needed by one company, that model would
| work just as well even if copyright didn't exist or didn't
| extend to software. While in many cases copyright benefits
| our profession, in many other cases it harms our profession,
| and while things like GPL rely on copyright, they are also in
| large part a reaction to copyright that wouldn't be needed if
| copyright for code didn't exist or was significantly
| restricted.
| gus_massa wrote:
| It depends a lot on the type of software you are making. If
| it's custom software for a single client, then copyright is
| probably not important. (Anyway, I think a lot of custom
| software is shipped without the source code or with
| obfuscated code, so they have to hire the developer again.)
|
| Part of my job is something like that. I make custom
| programs for my department at the university. I don't care
| how long the copyright lasts. Anyway, I like to milk the
| work for a few years. There are some programs I made 5 or 10
| years ago that we are still using and that save my coworkers
| time, and I like to use that leverage to get more freedom
| with my time. (How many 20% projects can I have?) Anyway,
| most of them need some updating because the requirements
| change or the environment changes, so it's not zero work on
| them.
|
| There are very few projects that have long-term value.
| Games sell a lot of copies in a short time. MS Office gets
| an update every other year (Hello Clippy! Bye Clippy!),
| and the online version is eating it. I think it's very hard
| to think of programs that will have a lot of value in 50
| years, but I'm still running some code in Classic VB6.
| imgabe wrote:
| If a human learns to program by reading GPL code, what is the
| license of future code they write?
| 6gvONxR4sf7o wrote:
| Why's this matter? Corporations aren't people.
| zorked wrote:
| A language model is not a human. You at least have the
| possibility that the human learned something. The language
| model is a parrot with a large memory.
|
| That said Microsoft didn't allow their kernel developers to
| look at Linux code for a reason.
| ben_w wrote:
| What definition of learning are you using that makes
| humans _not_ parrots and a deep learning system _not_
| learning?
|
| I know current AI is very different from an organic brain
| at many levels, but I don't know if any of those
| differences really matter.
| NateEag wrote:
| And since you don't know if they matter, you should not
| presume that they don't.
| zorked wrote:
| Go to a judge in a copyright case and argue that humans
| are parrots. Then tell me how it went.
| AlexandrB wrote:
| Humans have rights, machines don't. Copyright is a system
| for protecting human intellectual property rights. You
| can't copyright things created by a monkey[1] for example.
| Thus it's not a contradiction to say that an action
| performed by a human is "transformative" while the same
| action performed by a machine is not.
|
| But that is giving AI too much credit. As advanced as
| modern AI models are, they are not AGIs comparable to human
| cognition. I don't get the impulse to elevate/equate the
| output of trained AI models to that of human beings.
|
| [1] https://thecopyrightdetective.com/animal-copyrights/
| imgabe wrote:
| The AI did not create anything. It responded to a prompt
| given by a human to generate an output. Just like
| photoshop responds to someone moving the mouse and
| clicking or a paintbrush responds to being dragged across
| a canvas.
|
| So any transformativity of the action should be
| attributed to the human and the same copyright laws would
| apply.
| AlexandrB wrote:
| But under this model, the comparisons to human learning
| don't apply either. What matters is whether the output is
| transformative - so it's fair to compare the outputs of
| AI systems to one of the many inputs and say "these are
| too similar, therefore infringement occurred". It doesn't
| matter what kind of mixing happened between inputs and
| outputs, just like it doesn't matter how many Photoshop
| filters I apply to an image if the result resembles what
| I started with "too much".
| amanaplanacanal wrote:
| I believe that _you_ can copyright the image; it's the
| monkey that can't copyright it.
| gus_massa wrote:
| It's more complicated, even if humans are involved. From
| https://wiki.winehq.org/Developer_FAQ#Copyright_Issues
|
| > _Who can't contribute to Wine?_
|
| > _Some people cannot contribute to Wine because of
| potential copyright violation. This would be anyone who has
| seen Microsoft Windows source code (stolen, under an NDA,
| disassembled, or otherwise). There are some exceptions for
| the source code of add-on components (ATL, MFC, msvcrt);
| see the next question._
|
| I've seen a few MIT/BSD projects that ask people not to
| contribute if they have seen the equivalent GPL project.
| It's a problem because Copilot has seen "all" GPL projects.
| [deleted]
| Mountain_Skies wrote:
| While it was far from all of them, lots of the people who are
| decrying AI art were recently gleefully cheering the
| destruction of blue-collar jobs held by people with what they
| view as unacceptable value systems. "Learn to code" was a
| middle finger both to the people losing their jobs and to those
| who already code and don't want to see the value of their
| skills diluted. There's been plenty of "lack of empathy" going
| around lately, mostly because of ideological fault lines.
| Perhaps this will be a wake-up call that monsters rarely obey
| their masters for very long before turning on them.
| thedorkknight wrote:
| >lots of the people who are decrying AI art were recently
| gleefully cheering the destruction of blue-collar job
|
| I hear these sorts of statements a lot, and always wonder how
| people come to the conclusion that "people who said A were
| the ones who were saying B". Barring survey data, how would
| you know that it isn't just the case that it seems that way?
|
| The idea that people who would tell someone else to learn to
| code are now luddites seems super counter-intuitive to me.
| Wouldn't people opposing automation now likely be the same
| ones opposing it in the past? Why would you assume they're
| the same group without data showing it?
|
| I know a bunch of artists personally and none of them seem to
| oppose blue-collar work
| knighthack wrote:
| If _" making art is vastly more difficult than the huge
| majority of computer programming that is done"_ - then I'm
| sorry, you must not be doing very difficult computer
| programming.
| odo1242 wrote:
| the vast majority of computer programming is "not very
| difficult" computer programming
| quonn wrote:
| Right, and why is that? Because there is often no budget to
| solve the interesting parts and because of a lack of skills
| and because of terrible management - all of these mutually
| reinforcing.
|
| The same is true, by the way, for writing. So? That doesn't
| mean writing well is easy.
| NateEag wrote:
| What's the most difficult art project you've produced?
|
| Comparing these is very "apples and oranges", but I think
| you'd better have a strong background in both if you're gonna
| try.
| CadmiumYellow wrote:
| I have a strong background in both and I think creating
| good art is worlds more difficult than writing good code.
| It's both technically difficult and intellectually
| challenging to create something that people actually want
| to look at. Learning technical skills like draughtsmanship
| is harder than learning programming because you can't just
| log onto a free website and start getting instant &
| accurate feedback on your work. I do agree that it's very
| apples and oranges though - creating art requires a level
| of intuition and emotion that's mostly absent from
| technical pursuits like programming, and this very
| distinction is both the reason technical people can be so
| dismissive of the arts AND the reason why I think making
| art is ultimately more difficult.
| quonn wrote:
| This is a very strange thing to say since great art is
| often not technically difficult at all. Much of modern
| and contemporary art is like that, nevertheless the art
| is superb.
|
| > Learning technical skills like draughtsmanship is
| harder than learning programming because you can't just
| log onto a free website and start getting instant &
| accurate feedback on your work.
|
| Really? I sometimes wonder what people think programming
| really is. Not what you describe, obviously.
| CadmiumYellow wrote:
| I actually think a lot of modern and contemporary art is
| more technically difficult than it appears (though
| certainly not as technically difficult as making a marble
| sculpture or something). But fair point.
|
| Not sure I fully understand your second point: are you
| implying that I don't really know what programming is?
| quonn wrote:
| I'm not judging since I don't know you. I see programming
| as the profession, grounded in CS and with coding being
| usually not the problem (instead designing the solution
| is the problem).
| 6gvONxR4sf7o wrote:
| It's especially absurd that they have no empathy when this
| exploits artists' work, and then get upset when it spits out
| GPL code.
|
| The people who generated the training data should have a say in
| how their work is used. Opt-in, not opt-out.
| exceptione wrote:
| You have my sympathy.
|
| I think you need to see there are two types of people:
|
| - those who want to generate results ("get the job done,
| quickly"), and
|
| - those who enjoy programming for its own sake.
|
| The first group are the ones who can't see what is getting
| lost. They see programming as an obstacle. Strangely, some
| of them believe on the one hand that many more people can
| produce much more software because of AI, and simultaneously
| expect to keep being in demand.
|
| They might think your job is producing pictures, which is just
| a burden.
|
| I am from the second group. I never chose this profession
| for the money, or out of dreams about a big business I could
| create. I dread pasting generated code all over the place.
| The only one happy would be the owner of that software. And
| the AI model overlord, of course.
|
| I hope that technical and artistic skill will gain
| appreciation again and that you will have a happy life doing
| what you like the most.
| astrange wrote:
| If you think code generating AI will take your job, you
| should also never hire junior engineers because one of them
| might take your job.
|
| Nevertheless, having more engineers around actually causes
| you to be more valuable, not less. "Taking your job" isn't a
| thing; the Fed chairman is the only thing in our economy that
| can do that.
| exceptione wrote:
| > If you think code generating AI will take your job,
|
| It might take away the joy of programming, feeling of
| ownership and accomplishment.
|
| People who today complain about having to program a bunch
| of API calls might be in for a rude awakening, tending and
| debugging the piles of chatbot output that got mashed
| together. Or do we expect that in the future we will
| suddenly value quality over speed or #features?
|
| I love coaching juniors. These are humans, I can help them
| with their struggles and teach them. I try to understand
| them, we share experiences in life. We laugh. We find
| meaning by being with each other on this lonely, beautiful
| planet in the universe.
|
| ---
|
| Please do not take offense: observe the language in which
| we are already conflating human beings with bots. If we do
| it already now, we will collectively do it in the future.
|
| We are not prepared.
| shadowgovt wrote:
| Software engineers are in the business of self-replacement. The
| idea they could be replaced by an AI doesn't engender fear; it
| marks a success.
| XorNot wrote:
| No one is in programming to "do programming". They're in it to
| get things done. I didn't learn C++ in high school to learn
| C++, I learned it to make games (then C++ changed and became
| new and scary to me and so I no longer say I know C++, possibly
| I never did).
|
| If an AI will take care of most of the finicky details for me
| and let me focus on defining what I want and how I want it to
| work, then that is nothing but an improvement for everyone.
| meebob wrote:
| I would point out that many (most?) people are in programming
| to make money, rather than get things done per se.
|
| If an AI were to make it impossible to make a living doing
| programming, would that be an improvement for most readers of
| this site?
| ZetaZero wrote:
| It _should_ be an improvement for people to get a career in
| something they enjoy, instead of what pays the most money.
| ajmurmann wrote:
| Yes, and there will be far fewer of those jobs, and they
| might not pay.
|
| Ultimately, though, this isn't a technical problem but an
| economic one about how we as a society decide to share
| our resources. AI grows the pie, but removes the leverage
| some have to claim their slice. Automation is why we'll
| inevitably need UBI at some point.
| meebob wrote:
| What we're talking about here is the imminent arrival of
| it being impossible for a very large number of people to
| get a career in something they enjoy (making images by
| hand).
|
| It's fair to suppose (albeit based on a _very_ small
| sample size, i.e., the last couple hundred abnormal
| years of history) that all sorts of new jobs will arise
| as a result of these changes- but it seems to me
| unreasonable to suppose that these new jobs of the future
| will necessarily be more interesting or enjoyable than
| the ones they destroyed. I think it's easy to imagine a
| case in which the jobs are all much less pleasant (even
| supposing we all are wealthier, which also isn't
| necessarily going to be true)- imagine a future where the
| remaining jobs are either managerial/ownership based in
| nature or manual labor. To me at least, it's a bleak
| prospect.
| Kalium wrote:
| At the risk of demonstrating a total lack of empathy and
| failure to identify, we long ago passed the arrival of it
| being impossible for a very large number of people to get
| a career in something they enjoy (making images by hand).
| Art has been a famously difficult career path for quite a
| long time now. This does not really seem like a dramatic
| shift in the character of the market.
|
| Now, I have empathy. I paused a moment before writing
| this comment to identify with artists, art students, and
| those who have been unable to reach their dreams for
| financial reasons. I emphatically empathize with them. I
| understand their emotional experiences and the pain of
| having their dreams crushed by cold and unfeeling
| machines and the engineers who ignore those they crush.
|
| Yet I must confess I am uncertain how this is supposed to
| change things for me. I have no doubt that there used to
| be a lot of people who deeply enjoyed making carriages,
| too.
| anothernewdude wrote:
| I don't care. After decades of having no TV, film, books or
| video games aimed at me, they might finally be generated
| instead of the bullshit written by committees.
| yunwal wrote:
| Oh yeah I'm sure the AI that was trained on decades of tv,
| movies, and books that didn't appeal to you will do a great
| job of creating things that appeal to you.
| Mezzie wrote:
| I find it weird that they're considered separate talents.
| Programming is a creative task for me, and one reason I never
| took it up as a full time job is that I learned I hate trying
| to do creative work on demand. (I've been paid for both fiction
| writing and dev work and they produce very similar feelings in
| me.)
|
| Programming is definitely easier to make a _living_ from. I'm
| a very mediocre artist _and_ developer and I'm never making
| enough off of art to live on, but I could get a programming job
| at a boring company and it would pay a living wage. In that
| sense, it's definitely 'easier'.
| sciclaw wrote:
| The thing with programming is that it either works or does not
| work, but there is a huge window of what can be called art.
|
| With no training, I, or even a 1-year-old, could make something
| and call it art. I wouldn't claim it's very good but I think
| most people would accept it as art. The same cannot be said for
| programming.
| helsinkiandrew wrote:
| > making art is vastly more difficult than the huge majority of
| computer programming that is done
|
| Art and programming are hard for different reasons.
|
| The difference in the AI context is that a computer program has
| to do just about exactly what's asked of it to be useful,
| whereas a piece of art can go many ways and still be a piece of
| art. If you know what you want, it's quite hard to get DALL-E
| to produce that exactly (or it has been for me), but it still
| generates something that is very good looking.
| pkdpic wrote:
| Sidenote, you don't sound like a failed artist to me man. You
| sound like someone who survived the art machine and worked hard
| to make a smart career transition capable of supporting
| whatever kind of art you want to make. PS I did the same thing,
| painting MFA --> software development. Wish I was making FAANG
| money tho...
| furyofantares wrote:
| > it seems to me that most computer programmers should be just
| as afraid as artists, in the face of technology like this!!!
|
| I'm just as excited for myself as I am for artists. The current
| crop of these tools look like they could be powerful enablers
| for productivity and new creativity in their respective spaces.
|
| I happen to also welcome being fully replaced, which is another
| conversation and isn't really where I see these current tools
| going, though it's hard to extrapolate.
| orbital-decay wrote:
| Artists have all my sympathy. I'm also a hobbyist painter. But
| I have very little sympathy _for those perpetuating this
| tiresome moral panic_ (a small amount of actual artists,
| whatever the word "artist" means), because I think that:
|
| a) the panic is entirely misguided and based on two wrong
| assumptions. The first is that textual input and treating the
| model as a function (command in -> result out) are sufficient
| for anything. No, this is a fundamentally deficient way to give
| artistic directions, which is further handicapped by primitive
| models and weak compute. Text alone is a toy; the field will
| just become more and more complex and technically involved,
| just like 3D CGI did, because if you don't use every trick
| available, you're missing out. The second wrong assumption is
| that it's going to _replace_ anyone, instead of making many
| people re-learn a new tool and produce what was previously
| unfeasible due to the amount of mechanistic work involved. This
| second assumption stems from the fundamental misunderstanding
| of the value artists provide, which is conceptualization, even
| in a seemingly routine job.
|
| b) the panic is entirely blown out of proportion by social
| media. Most people have neither time nor desire to actually
| dive into this tech and find out what works and what doesn't.
| They just believe that a magical machine steals their works to
| replace them, because that's what everyone reposts on Twitter
| endlessly.
| dtn wrote:
| > But I have very little sympathy for those perpetuating this
| tiresome moral panic (a small amount of actual artists,
| whatever the word "artist" means)
|
| > A small amount of actual artists
|
| It's extremely funny that you say this, because taking a look
| at the _Trending on Artstation_ page tells a different story.
|
| https://www.artstation.com/?sort_by=trending
| thordenmark wrote:
| You are demonstrating that lack of empathy. Artist's works
| are being stolen and used to train AI, which then produces
| work that will affect that artist's career. The advancement
| of this tech in the past 6 months, if it maintains this
| trajectory, demonstrates this.
| Permit wrote:
| > Artist's works are being stolen
|
| It has been fascinating to watch "copyright infringement is
| not theft" morph into "actually yes it's stealing" over the
| last few years.
|
| It used to be incredibly rare to find copyright maximalists
| on HackerNews, but with GitHub Co-pilot and StableDiffusion
| it seems to have created a new generation of them.
| wahnfrieden wrote:
| Copyright should not exist, but artists do need support
| somehow and doing away with copyright without other
| radical changes to economy/society leaves them high and
| dry. Copyright not existing should pair with other forms
| of support such as UBI or worker councilization, instead
| of abolishing it while clutching capitalist pearls and
| ultimately only accelerating capitalism at their expense.
| kevingadd wrote:
| "copyright infringement is not theft" is not an
| especially common view among artists or musicians, since
| copyright infringement threatens their livelihood. I
| don't think there's anything inconsistent about this.
| Yes, techies tend to hold the opposite view.
|
| Personally, I think "copyright infringement is not theft"
| but I also think that using artists' work without their
| permission for profit is never OK, and that's what's
| happening here.
| blamestross wrote:
| Individual humans copying corporate products vs
| corporations copying the work of individual humans they
| didn't pay.
|
| The confusion is that "copyright infringement is not
| theft" really was about being against corporate abuse of
| individuals. It's still the same situation here.
| [deleted]
| pfisch wrote:
| So I employ quite a few artists, and I don't see the
| problem. This whole thing basically seems more like a
| filter in Photoshop than something that will take a person's
| job.
|
| If artists I employ want to incorporate this stuff into
| their workflow, that sounds great. They can get more done.
| There won't be fewer artists on payroll, just more and
| better art will be produced. I don't even think it is at
| the point of incorporating it into a workflow yet though,
| so this really seems like a nothing burger to me.
|
| At least GitHub Copilot is useful. This stuff is really not
| useful in a professional context, and the idea that it is
| going to take artists' jobs really doesn't make any sense to
| me. I mean, if there aren't any artists then who exactly do
| I have that is using these AI tools to make new designs? If
| you think the answer to that is just some intern, then you
| really don't know what you're talking about.
| kevingadd wrote:
| With respect, you need to pay more attention to how and
| why these networks are used. People write complex prompts
| containing things like "trending on artstation" or
| "<skilled artist's name>" then use unmodified AI output
| in places like blog articles, profile headers, etc where
| you normally would have put art made by an artist.
|
| Yes, artists _can_ also utilize AI as a photoshop filter,
| and some artists have started using it to fill in
| backgrounds in drawings, etc. Inpainting can also be used
| to do unimportant textures for 3d models. But that doesn't
| mean that AI art is no threat to artists' livelihoods,
| especially for scenarios like "I need a dozen
| illustrations to go with these articles" where quality
| isn't so important to the commissioner that they are
| willing to spend an extra few hundred bucks instead of
| spending 15 minutes in midjourney or stable diffusion.
|
| As long as these networks continue being trained on
| artists' work without permission or compensation, they
| will continue to improve in output quality and muscle the
| actual artists out of work.
| pfisch wrote:
| If you are looking for a bunch of low-quality art, there
| are tons of free sources for that already. If this is
| what you mean when you say "putting artists out of work",
| you are really talking about less than 1% of where artist
| money is spent.
| _0ffh wrote:
| So who's that mythical artist that hasn't seen and learned
| from the works of other artists? After all, these works
| will have left an imprint in their neural connections, so
| by the same argument their works are just as derivative, or
| "stolen".
| blincoln wrote:
| As someone who's shifted careers twice because disruptive
| technologies made some other options impractical, I can
| definitely appreciate that some artists are very upset
| about the idea of maybe having to change their plans for
| the future (or maybe not, depending on the kind of art they
| make), but all art is built on art that came before.
|
| How is training AI on imagery from the internet without
| permission different than decades of film and game artists
| borrowing H. R. Giger's style for alien technology?[1]
|
| How is it different from decades of professional and
| amateur artists using the characteristic big-eyed
| manga/anime look without getting permission from Osamu
| Tezuka?
|
| Copyright law doesn't cover general "style". Try to imagine
| the minefield that would exist if it were changed to work
| that way.
|
| [1] No, I don't mean Alien, or other works that actually
| involved Giger himself.
| idiotsecant wrote:
| Is 'looking at something' equivalent to stealing it? The
| use by all these diffusion networks is pretty much the
| definition of transformative. If a person were doing this, it
| wouldn't even be interesting enough to talk about. When
| a machine does it, is that somehow morally distinct?
| berniedurfee wrote:
| Existing art trains the neural nets in human artists as
| well. All art is derivative. No art is wholly unique.
|
| Will human artists be able to compete with artificial
| artists commercially? If not, is that bad or is it
| progress, like Photoshop or Autotune?
| lolinder wrote:
| Making money through art is already not a feasible career, as
| you yourself learned. If you want a job that _millions_ of
| people do for fun in their free time, you can expect that job
| to be extremely hard to get and to pay very little.
|
| The solution isn't to halt technological progress to try to
| defend the few jobs that are actually available in that sector;
| the solution is to fight forward to a future where _no one_ has
| to do dull and boring things just to put food on the table.
| Fight for a future where people can pursue what they want
| regardless of whether it's profitable.
|
| Most of that fight is social and political, but progress in ML
| is an important precursor. We can't free _everyone_ from the
| dull and repetitive until we have automated _all_ of it.
| stemlord wrote:
| >The solution isn't to halt technological progress
|
| Technological progress is not a linear deterministic
| progression. We _decide_ _how_ to progress every step of the
| way. The problem is that we are making dogshit decisions for
| some reason
|
| Maybe we lack the creativity to envision alternative futures.
| How does a society become so uncreative, I wonder?
| [deleted]
| MSFT_Edging wrote:
| You'll find it's nearly impossible to imagine a world
| without capitalism.
|
| Capitalism is particularly good at weaponizing our own
| ideas against us. See large corporations co-opting anti-
| capitalist movements for sales and PR.
|
| PepsiCo was probably mad that they couldn't co-opt "defund
| the police", "fuck 12", and "ACAB" like they could with
| "black lives matter".
|
| Anything near and dear to us will be manipulated into a
| scientific formula to make a profit, and anything that
| cannot is rejected by any kind of mainstream media.
|
| See: Capitalist Realism and Manufacturing Consent (for how
| advertising affects freedom of speech in any media
| platform).
| CatWChainsaw wrote:
| Perhaps it would be better to say you can't imagine "the
| future" without capitalism, as history prior to maybe the
| 1600s offers a less technologically advanced
| illustration.
| astrange wrote:
| It's pretty easy to imagine a world without capitalism.
| It's the one where the government declares you a
| counterrevolutionary hedonist for wanting to do art and
| forces you to work for the state owned lithium mine.
|
| Mixed social-democratic economies are nice and better
| than plutocracies, but they have capitalism; they just
| have other economic forms alongside it.
|
| (Needing to profit isn't exclusive to capitalism either.
| Socialist societies also need productivity and profit,
| because they need to reinvest.)
| godelski wrote:
| But do you know what reducing the progress of generative
| modeling will do? Because there seems to be this confusion
| that generative modeling is about art/music/text.
| visarga wrote:
| > We decide how to progress every step of the way.
|
| I think the wheels are turning. It's just a resultant
| movement from thousands of small movements, but nobody is
| controlling it. If you take a look not even wars dent the
| steady progress of science and technology.
| kevingadd wrote:
| If it's so important, we could at least pay the people who
| create the training set. Otherwise, we're relying on unpaid
| labor for this important progress and if the unpaid labor
| disappears, we're screwed. How does it seem sensible to
| construct a business this way?
| gitfan86 wrote:
| Most of us in technology have had to learn new skills. I used
| to rack up and wire servers in a lab as part of my dev work. I
| don't do that anymore and instead had to learn AWS and
| Terraform. Personally, I don't expect any empathy due to my lab
| racking skills no longer being as relevant to many jobs.
| medellin wrote:
| The lack of empathy in general on online forums is incredible.
| I don't think HN is any worse than other places, but it would
| be nice if we could be a little better, as it would lead to
| some more interesting and nuanced topics.
|
| As a developer/manager I am not yet scared of AI, because I
| have already had to correct multiple people this week who
| tried to use ChatGPT to figure something out.
|
| It's actually pretty good, but when it's wrong it seems to be
| really wrong, and when you don't have the background to figure
| that out, a ton of time is wasted. It's just a better
| Stack Overflow at the end of the day, imo.
| eatsyourtacos wrote:
| >it seems to me that most computer programmers should be just
| as afraid as artists
|
| That is absurd. Sure, some basic AI tools have been helpful,
| like Copilot, and it's sometimes really impressive how it can
| help me autofill some code instead of typing it out... but
| come on, there is no way we are anywhere close to AI replacing
| 99.99% of developers.
|
| >making art is vastly more difficult than the huge majority of
| computer programming that is done
|
| I don't know.. art is "easy" in the sense that we all know what
| art looks like. You want a picture of a man holding a cup with
| a baby raven in it? I can picture that in my head to some
| degree right away, and then it's just "doing the process" to
| draw it in some way using shapes we know.
|
| How in the heck can you correlate that to 99% of business
| applications? Most of the time no one even knows exactly what
| they want out of a project.. so first there is the massive
| amount of constant changes just from using stuff. Then there is
| the actual way the code is created itself. Let's even say you
| could tell it "Make me an angular website with two pages and a
| live chat functionality" and it worked. Well, ok great it got
| you a starting template.. but first, maybe the code is so weird
| or unintuitive that it's almost impossible to really keep
| building upon - not helpful. Now let's say it is "decent
| enough"; well, fine.. then it's almost like an advanced
| Copilot at this point. It helps with boring boilerplate
| templates.
|
| But comparing this all to art is still just ridiculous. Again,
| everyone can look at a picture and say "this is what I wanted"
| or "this is not what I wanted at all". Development is so crazy
| intricate that it's nothing like art.. I could look at two
| websites (similar to art) and say "these look the same", but
| under the hood it could be a million times different in
| functionality, how it works, how well it's structured to evolve
| over time.. etc etc. But if I look at two pictures that look
| exactly the same, I don't _care_ how it got there or how it was
| created - it's done and exactly the same. Not true of
| development for 99% of cases.
| quonn wrote:
| This comment is downvoted, but it makes an important point.
| AI systems that produce an outcome that can be easily
| verified by non-experts are far more practical. If my mom can
| get an illustration out of the AI that she wants, she is
| done. Not so for software, where she cannot really verify
| that it's going to reliably do what was specified.
|
| This is especially true for complex pieces.
|
| If an AI could produce a world-class totally amazing
| illustration or even a book I will afterwards easily see or
| read it.
|
| On the other hand, real-world software systems consist of
| hundreds of thousands of lines in distributed services. How
| would a layman really judge if they work?
|
| Nevertheless, I also expect AI to have a big impact, since
| fewer engineers can do much more.
| Kalium wrote:
| What's going to happen if technologists collectively come to
| the table and engage in sincere discussion rooted in
| kindness, compassion, and empathy?
|
| I fully expect there will be zero reciprocation. There will,
| instead, be a strong expectation that that empathy turns into
| centering of _fear_ and a resulting series of economic choices.
| AI systems are now threatening the ability of some artists to
| get paid and those artists would like that to stop.
|
| I think we're seeing it right now. You shift effortlessly from
| talking about empathy to talking about the money. You consider
| the one the way to get the other, so you deplore the horrifying
| lack of empathy.
|
| Let me put it another way. Would you be happy if you saw an
| outpouring of empathy, sympathy, and identification with
| artists coupled with exactly the same decisions about machine
| learning systems?
| Mezzie wrote:
| I do find it funny that artists are complaining about things
| like AI generated art clogging up art sites/reducing
| commissions/etc. because my particular artistic outlet of
| choice is _writing_ and visual art has completely overtaken
| text-based content online, particularly for anything fandom
| or nerd adjacent. The visual artists are also responsible for
| the monetization of fandom to begin with, which I'm still
| pretty salty about. We moved from discussions and fanfic to
| 500+ 'commission me to draw your OTP!' and 'Look at this
| skimpy character art!' daily posts.
|
| Shoe's on the other foot now and they don't like it.
| lilactown wrote:
| Yes, an outpouring of sympathy, empathy, etc combined with
| the same unilateral decision making that technologists make
| would be terrible. I would call continuing to do that
| unempathetic.
|
| Technologists acting like technocrats and expecting everyone
| to give them sympathy, empathy and identification is
| laughably rude and insulting.
| syntheweave wrote:
| I have crossed over in the other direction, from coding to
| drawing, and suspect that neither side understands their craft
| well enough to assess what'll happen.
|
| Most of coding is routine patterns that are only perceived as
| complex because of the presence of other coders and the need to
| "talk" with them, which creates a need for reference
| materials (common protocols, documentation, etc.).
|
| Likewise, most of painting is routine patterns complicated by a
| mix of human intent (what's actually communicated) and the need
| for reference materials to make the image representational.
|
| Advancements in Western painting between the Renaissance and
| the invention of photography track with developments in optics;
| the Hockney-Falco thesis is the "strong" version of this,
| asserting that specific elements in historical paintings had to
| have come through the use of optical projections, not through
| the artist's eyes. A weaker form of this would say that the
| optics were tools for study and development of the artist's
| eye, but not always the go-to tool, especially not early on
| when their quality was not good.
|
| Coding has been around for a much shorter time, but mostly
| operates on the assumptions of bureaucracy: that which is
| information is information that can be modelled, sorted,
| searched. And the need for more code exists relative to having
| more categories of modelled data.
|
| Art already faced its first crisis of purpose with the
| combination of photography and mass reproduction. Photos
| produced a high level of realism, and as it became cheaper to
| copy and print them, the artist moved from a necessary role
| towards a specialist one - an "illustrator" or "fine artist".
|
| What an AI can do - given appropriate training, prompt
| interfaces and supplementary ability to test and validate its
| output - is produce a routine result in a fraction of the time.
| And this means that it can sidestep the bureaucratic mode
| entirely in many circumstances and be instructed "more of this,
| less of that" - which produces features like spam filters and
| engagement-based algorithms, but also means that entire
| protocols are reduced to output data if the AI is a
| sufficiently good compiler; if you can tell the AI what you
| want the layout to look like and it produces the necessary CSS,
| then CSS is more of a commodity. You can just draw a thing,
| possibly add some tagging structure, and use that as the
| compiler's input. Visual coding.
|
| But that makes the role a specialized one; nobody needs a "code
| monkey" for such a task, they need a graphic designer...which
| is an arts job.
|
| That is, the counterpoint to "structured, symbolic prompts
| generating visual data" is "visual prompts generating
| structured, symbolic data". ML can be structured in either
| direction, it just takes thoughtful engineering. And if the
| result is a slightly glitchy web site, it's an acceptable
| tradeoff.
|
| Either way, we've got a pile of old careers on their way out
| and new careers replacing them.
| incrudible wrote:
| > In my humble estimation, making art is vastly more difficult
| than the huge majority of computer programming that is done.
|
| The value of work is not measured by its difficulty. There's a
| small number of people who make a living doing contract work
| that may be replaced by an AI, but these people were in a
| precarious position in the first place. The well-to-do artists
| are not threatened by AI art. The value of their work is
| derived from _them_ having put their name on it.
|
| If you assume that most programming work could be done by an AI
| "soon", then we really have to question what sort of dumb
| programming work people are doing today and whether it
| wouldn't have disappeared anyway once funding runs dry.
| Mindlessly assembling snippets from Stack Overflow may well be
| threatened by AI very soon, so if that's your job, consider
| the alternatives.
| runald wrote:
| Sorry, I have no reason to be afraid of AI taking my job, not
| now, not ever. You seem to have a condescending idea of what
| programming is, given how you describe it as simple and dumb,
| but I can assure you, programming would be one of the last jobs
| to be deprecated by AI. If you think ChatGPT is enough to put
| programmers on the street, I would question what kind of
| programming you do.
|
| I would turn this around on you: if a braindead AI can make
| this astonishingly difficult art, maybe art was never
| difficult to begin with, and artists are merely finagling
| dumb, simple things into their work. Sounds annoying and
| condescending, right? If you disagree with what I said about
| art, maybe you ought to be more aware of your own lack of
| empathy.
| [deleted]
| akiselev wrote:
| It's not about empathy but about the fundamental nature of the
| job.
|
| Developers will be fine because software engineering is an arms
| race - a rather unique position to be in as a professional. I
| saw this play out during the 2000s offshoring scare when many
| of us thought we'd get outsourced to India. Instead of getting
| outsourced, the industry exploded in size globally and
| everything that made engineers more productive also made them a
| bigger threat to competitors, forcing everyone to hire or die.
|
| Businesses only need so much copy or graphic design, but the
| second a competitor gains a competitive advantage via software
| they have to respond in kind - even if it's a marginal
| advantage - because software costs so little to scale out. As
| the tech debt and the revenue that depends on it grows, the
| baseline number of staff required for maintenance and upkeep
| grows because our job is to manage the complexity.
|
| I think software is going to continue eating the world at an
| accelerated pace because AI opens up the uncanny valley:
| software that is too difficult to implement using human
| developers writing heuristics but not so difficult it requires
| artificial general intelligence. Unlike with artists,
| improvements in AI don't threaten us, they instead open up
| entire classes of problems for us to tackle.
| oldstrangers wrote:
| Technically I'd imagine AI threatens developers
| (https://singularityhub.com/2022/12/13/deepminds-alphacode-
| co...) a lot more than artists because there's a tangible (or
| 'objectively correct') problem being solved by the AI.
| Whereas art is an entirely subjective endeavor, and
| ultimately the success of what is being made is left up to
| how someone is feeling. I also imagine humans will begin to
| look at AI generated art very cynically. Maybe we all
| collectively agree we hate AI art, and it becomes as cliche
| as terrible stock photography. Or, we just choose not to
| appreciate anything that doesn't come with a 'Made By Humans'
| authentication... Pretty simple solution for the artists.
|
| Obviously a lot of money will be lost for artists in a
| variety of commercial fields, but the ultimate "success of
| art" will be unapproachable by AI given its subjective
| nature.
|
| Developers, though, will be struggling to compete from both a
| speed and technical point of view, and those hurdles can't be
| simply overcome with a shift in how someone feels. And you're
| right about the arms race, it just won't be happening with
| humans. It'll be computing power, AIs and the people capable
| of programming those AIs.
| akiselev wrote:
| If there's a "tangible problem" people solve it with a SaaS
| subscription. That's not new.
|
| We developers are hired because our coworkers _can't
| express what they really want._ No one pays six figures to
| solve glorified advent of code prompts. The prompts are
| much more complex, ever changing as more information comes
| in, and in someone's head to be coaxed out by another human
| and iterated on together. They are no more going to be
| prompt engineers than they were backend engineers.
|
| I say this as someone who used TabNine for over a year
| before CoPilot came out and now use ChatGPT for
| architectural explorations and code scaffolding/testing.
| I'm bullish on AI but I just don't see the threat.
| oldstrangers wrote:
| I'm just arguing that it's a lot easier for AI to replace
| something that has objectively or technically correct
| solutions vs something as subjective as art (where we can
| just decide we don't like it on a whim).
| akiselev wrote:
| I'm arguing that there are no objectively or technically
| correct solutions to the work engineers are hired to do.
| You don't "solve" a startup CEO or corp VP who changes
| their mind about the direction of the business every
| week. Ditto for consumers and whatever the latest fad
| they're chasing is. They are agents of chaos and we are
| the ones stuck trying to wrangle technology to do their
| bidding. As long as they are _human_, we'll need the
| general intelligence of humans (or equivalent) to figure
| out what to code or prompt or install.
| oldstrangers wrote:
| In the sense that someone asks "I need a program that
| takes x and does y" and the AI is able to solve that
| problem satisfactorily, it's an objectively correct
| solution. There will be nuance to that problem, and how
| it's solved, but the end results are always objectively
| correct answers of "it either works, or it doesn't."
| akiselev wrote:
| Case in point, I guess :-)
| Lichtso wrote:
| I think in both domains there are parts which are purely
| technical (wrong or right) and others which are well ...
| an art.
|
| In art these parts are often overlooked, but they are
| significant nonetheless. E.g. getting the proportions
| right is an objective metric and really off-putting if it
| is wrong.
|
| And in programming the "art" parts are often overlooked
| and precisely the reason why I feel that most software of
| today is horrible. It is just made to barely "work" and
| get the technical parts right up to spec and that's it.
| Beyond that nobody cares about resource efficiency,
| performance, security, maintainability, let alone
| elegance.
| cyborgx7 wrote:
| To be honest, I have been forced to choose a side during all
| those debates about copyright and advertising/adblocking. And
| it was artists who forced me to make that choice. It's hard not
| to see this as just another way in which artists are trying
| limit how people use their own computing devices in a way that
| provides the most value to them.
|
| All these talking points about lack of empathy for poor
| suffering artists have already been made a million times in
| those other debates. They just don't pack much of a punch
| anymore.
| netheril96 wrote:
| > if AI is able to do these astonishingly difficult things, why
| shouldn't it, with some finagling, also be able to do the dumb,
| simple things most programmers do for their jobs?
|
| Art is more difficult than programming for people with talents
| in programming but not in arts. Art is easier than programming
| for people with talents in arts but not in programming.
| Granted, those two sentences are tautologies, but nonetheless a
| reminder that the difficulty of art and programming does not
| form a total order.
| turpialito wrote:
| Luddites hopping on the bandwagon for reasons unclear to
| themselves.
|
| EDIT: Would Andy Warhol be sued by Campbell or Brillo?
| crote wrote:
| No, but his estate _was_ sued by Lynn Goldsmith over his use of
| a photo of Prince - and lost.
|
| Warhol himself said that art "is anything you can get away
| with." He was clearly very much aware of the dubious legality
| of some of his work.
| dredmorbius wrote:
| Context, too long to fit into the HN title: "In order to protest
| AI image generators stealing artists' work to train AI models, the
| artists are deliberately generating AI art based on the IP of
| corporations that are most sensitive to protecting it."
| yreg wrote:
| Interesting approach, but is drawing fan art illegal?
|
| I would think that generating those images is okay by Disney,
| the same as if I painted them. The moment Disney would object
| is when I start selling them on merch, at which point it is
| irrelevant how they were created.
|
| Am I mistaken?
| onetrickwolf wrote:
| Fan art is pretty much illegal, or at least infringing;
| it's just not really enforced by most companies. There are some
| caveats for fair use but generally most fan art could be
| successfully taken down if a company was motivated enough in
| my opinion. Nintendo is pretty notorious for this but it has
| rarely gone to court as most people are too scared to fight
| takedown requests.
| Taywee wrote:
| Copyright isn't legal vs illegal, it's infringing vs
| non-infringing. Fan art very often could be argued to be
| infringing, but no company has any reason to pursue it in the
| vast majority of cases, so they just don't.
|
| It's very confusing, especially when you have to consider
| trademark as related but separate.
| jefftk wrote:
| I don't get your distinction: copyright infringement is
| illegal, so "infringing" implies "illegal"
| Taywee wrote:
| It's civil vs criminal law. Illegal usually implies
| breaking a law and committing a crime. Copyright
| infringement is a civil matter, not criminal.
| dredmorbius wrote:
| False.
|
| <https://news.ycombinator.com/item?id=33999561>
| dredmorbius wrote:
| Infringement carries both civil (noncriminal) and
| criminal proscriptions and liabilities under much law,
| e.g., under US law, 17 USC Chapter 5:
|
| <https://www.law.cornell.edu/uscode/text/17/chapter-5>
| Taywee wrote:
| From that link, criminal copyright infringement depends
| on specific circumstances that don't directly apply here:
| https://www.law.cornell.edu/uscode/text/17/506
| dredmorbius wrote:
| It's unclear whether "here" refers to the artists
| spoofing Disney, or other actors pirating / duplicating
| artists' work for commercial use.
|
| In the former case, I'd agree.
|
| In the second, there's a clear violation of 17 USC
| 506(a)(1)(A).
| astrange wrote:
| Artists have a complicated ethical system where 1.
| reposting/tracing a solo artist's images without "citing the
| artist" is "stealing" (copyright violation) 2. imitating
| their style is also "stealing" but 3. drawing fanart of any
| series without asking is fine and 4. any amount of copyright
| violation is not only fine but encouraged as long as it's
| from a corporation.
|
| The punishment for breaking any of these rules is that a lot
| of people yell at you on Twitter. Unfortunately, they've been at
| it so long that they now think these are actual laws of the
| universe, although of course they have pretty much nothing to
| do with the actual copyright law.
|
| That actual law doesn't care if you're selling it or not
| either, at least not as a bright line test.
|
| (Japanese fanartists have a lot more rules, like they won't
| produce fan merch of a series if there is official merch
| that's the same kind of object, or they'll only sell fan
| comics once on a specific weekend, and the really legally
| iffy ones have text in the back telling you to burn after
| reading or at least not resell it. Some more popular series
| like Touhou have explicit copyright grants for making fanart
| as long as you follow a few rules. Western fanartists don't
| read or respect any of these rules.)
| dotnet00 wrote:
| Japan doesn't have fair use, so the only thing ensuring
| that copyright owners don't go after fanartists is that
| fanart is generally either beneficial to them or is not
| worth going after. However that would change if the artist
| were attempting to directly interfere with their revenue,
| which is why they won't do things like producing imitations
| of merch.
|
| Copying an artist's style isn't in and of itself looked
| down upon, any artist will tell you that doing so is an
| important part of figuring out what aspects of it one likes
| for their own style. The problem with AI copying it is that
| the way the vast majority of users are using it isn't in
| artistic expression. The majority of them are simply
| spamming images out in an attempt to gain a popularity
| "high" from social media, without regard for any of the
| features of typical creative pursuits (an enjoyment of the
| process, an appreciation for others' effort, a desire to
| express something through their creativity, having some
| unique intentional and unintentional identifying features).
|
| Honestly maybe the West messed up having such broad fair
| use protections since it seems people really have no
| respect for any creative effort, judging by all the AI art
| spam and all the shortsighted people acting smug about it
| despite the questions around it being pretty important to
| have a serious conversation about, especially for pro-AI
| folk.
|
| The AI art issue has several difficult problems that we are
| seemingly too immature to deal with, it makes it clear how
| screwed we'd be as a society if anything approaching true
| AGI happened to be stumbled upon anytime soon.
| BeFlatXIII wrote:
| > the West messed up having such broad fair use
| protections since it seems people really have no respect
| for any creative effort
|
| That is based on the fallacy that derivative creativity
| is somehow lesser than so-called "original" creativity.
| dotnet00 wrote:
| I'm not saying that because I think all derivative
| creativity is lesser than 'original' creativity. Rather,
| we've gotten so used to such broad protections on all
| creativity that a good chunk of us genuinely think that
| their dozens of minor variations on a popular prompt
| entirely spat out by a tool and published to a site every
| hour are at the same level of creativity as something
| even just partially drawn by a person (eg characters
| drawn into an AI generated background or AI generated
| character designs then further fixed up).
|
| The vast majority of AI art I've seen on sites like Pixiv
| has been 'generic' to the level of the 'artist' being
| completely indistinguishable from any other AI-using
| 'artist'. There has been very little of the sort where
| the AI seemed to truly just be a tool and there was
| enough uniqueness to the result that it was easy to guess
| who the creator was. The former is definitely less
| creative than the latter.
| gwd wrote:
| But the premise is just bad law. Disney does, in fact, hold a
| copyright on the Mickey Mouse character (at least until the end
| of 2023) [1]. It doesn't matter where the art comes from.
| Anyone making copies of something with Mickey Mouse in it --
| whether drawn by a Disney artist, or drawn by someone else, or
| "drawn" by an AI -- is violating their copyright (at least for
| another year).
|
| On the other hand, nobody owns a copyright on a specific style.
| If I go study how to make art in the style of my favorite
| artist, that artist has no standing to sue me for making art in
| their style. So why would they have standing to sue for art
| generated by an AI which is capable of making art in their
| style?
|
| [1] https://fishstewip.com/mickey-mouse-copyright-expires-at-
| the...
| dredmorbius wrote:
| https://news.ycombinator.com/item?id=33999491
| hectorlorenzo wrote:
| I'm still organising my thoughts on the subject so please feel
| free to push back.
|
| This ongoing discussion feels classist. I've never seen such
| strong emotions about AI (and automation) taking blue-collar
| jobs, some shrugs at most. It's considered an unavoidable given,
| even though it has been happening for decades. The only
| difference now is that AI is threatening middle-upper class jobs,
| which nobody saw coming.
|
| I do not see the difference between the two. Can somebody who does
| explain to me why now is "critical" and not so much before?
| nbzso wrote:
| As an artist, I already realized that the war is lost, without a
| fight. There is no way to stop the removal of human labor. At
| first, A.I. tools will need supervision and optimization, but
| soon they will do this by themselves. I moved all of my art
| related work into a real medium. If someone in the future finds
| value in owning actual art, I will provide it.
|
| If people are happy with metaverse A.I. generated images,
| projected in their minds, so be it. It is over. The rest is just
| an echo of human civilization. Transhumanistic clones are coming
| to town:)
| wnkrshm wrote:
| I'm thinking the same way, plein air painting is a nice
| activity. You get something nothing can take away from you, any
| kind of mark you make with your own body is yours. At least at
| the moment, using prompt- or inpainting-based tools feels like
| talking through Microsoft Sam (voice synth).
| Taywee wrote:
| The war is not lost. The goal isn't to try to force people to
| never be able to use AI to generate art, but to force them to
| use only input that they have permission to use.
|
| AI replacing artists functionally is just the surface fear. The
| real problem is using AI as an automated method of copyright
| laundering. There's only so much hand waving one can do to
| excuse dumping tons of art that you didn't make into a program
| and transform it into similar art and pretend like you own it.
| People like to pretend that it's like a person learning and
| replicating a style, but it's not. It's a computer program and
| it's automated. That the process is similar is immaterial.
| [deleted]
| marmetio wrote:
| My memory of this is really fuzzy, so I'm probably getting the
| details wrong.
|
| I watched a documentary in roughly the early oughts about AI. The
| presenter might have been Alan Alda.
|
| In one segment, he visited some military researchers who were
| trying to get a vehicle to drive itself. It would move only a few
| inches or feet at a time as it had to stop to recalculate.
|
| In another segment, he visited some university researchers who
| set up a large plotter printer to make AI-generated art. It was
| decent. He saw it could depict things like a person and a pot, so
| he asked if it would ever do something silly to us like put a
| person in a pot. The professor said not to be silly.
|
| To jokingly answer the title question: everyone who saw that one
| specific documentary 20 years ago knew that AI art was way ahead
| of AI machines.
|
| Art is useful when someone subjectively finds it enjoyable or
| meaningful. While it might not achieve all of what humans can,
| the barrier to entry is relatively lower.
| fullshark wrote:
| If it was Alan Alda it was probably a Scientific American
| Frontiers episode
|
| https://en.m.wikipedia.org/wiki/Scientific_American_Frontier...
|
| Edit: confused SAF with Nova!
| avereveard wrote:
| "images trascend copyright"
| https://cdn.vmst.io/media_attachments/files/109/512/541/929/...
|
| You can still copyright characters separately. He's feigning
| ignorance of how copyright works to make a sensationalistic
| point, which pretty much invalidates and poisons what is
| otherwise an interesting argument at the boundary between
| derivative work and generative art.
| namelessoracle wrote:
| The slice I'm curious about is what happens when you let loose
| your AI art generator and start copyrighting/trademarking
| everything it creates, to basically make sure all kinds of art
| that could have been created are potentially infringing on you.
|
| The art equivalent of patent trolling or domain squatting
| basically. Is that possible legally?
| alexfromapex wrote:
| It seems like the least-regulated professions will be the front
| lines, due of course to the friction created by getting AI
| operating in regulated environments.
| cobertos wrote:
| I hope that AI companies don't end up implementing another system
| like Youtube's DMCA system. Rights holders and trolls alike can
| scrub these "black boxes" of whatever content they want, adding
| more garbage and uncertainty to their output.
|
| Then again, there should be some sort of solution so this can
| coexist with artists, and not replace them
| sdiupIGPWEfh wrote:
| So they're protesting alleged _copyright_ violations in the form
| of AI copying artistic styles (presuming an artistic style alone
| rises to the level of copyright protection) by committing
| _trademark_ violations? Yeah, I don't get it.
|
| I can appreciate that there are all kinds of potential
| "intellectual property" issues with the current glut of AI
| models, but the level of misunderstanding in some affected
| communities is concerning.
| sidlls wrote:
| Outside of lawyers, what communities do you think should have
| an "understanding" of intellectual property law, and to what
| degree? Or, maybe the fact that it takes a lawyer to truly
| understand it indicates that the complexity of applicable laws
| and regulations isn't beneficial to the communities they're
| ostensibly meant to protect?
| avereveard wrote:
| I fully expect self-proclaimed artists to know the laws that
| protect their own means of living
| Double_a_92 wrote:
| People that complain very vocally about some issue, should at
| least bother to research what they are talking about...
| Karawebnetwork wrote:
| When I took my graphic design class in college, there was a
| big chunk about copyright and trademark. We had to be very
| cautious about images we were using and the difference
| between the two was drilled into our heads.
| sdiupIGPWEfh wrote:
| Communities that generate and/or profit off of "intellectual
| property" ought to have a rudimentary understanding of the
| laws involved. Doubly so if they're protesting what they see
| as violations of those laws. It honestly does not take a
| lawyer to understand the distinctions at play here.
| Tao3300 wrote:
| ITT everyone similarly conflating multiple types of IP law and
| calling it all copyright. Palm to the face.
| mensetmanusman wrote:
| AI enables fast-fashion-like competition. There will still be
| winners and losers.
|
| Use these tools to 10x your own output and create new markets
| that arise due to the 10x modifier.
| cwkoss wrote:
| If a human wrote the prompt, how is AI different from a
| paintbrush or any other tool of the trade?
|
| Every tool makes some of the 'decisions' about how the artwork
| turns out by adding constraints and unexpected results. If anything
| I'd argue that AI art allows for more direct human expression:
| going from mental image to a sharable manifestation has the
| potential to be less lossy with art than with paint.
|
| This feels like a bunch of misplaced ludditism. We need to
| implement a UBI because 99.9% of human labor is going to be
| valued below the cost of survival in the next 50-100 years.
| Always fun to see people thumbing their nose at Disney though.
| rperez333 wrote:
| I think it is different because you don't need any pictures to
| create a paintbrush or a pencil. You can still have the AI code
| as a tool, but without the dataset (images), it won't go
| anywhere.
| 4bpp wrote:
| Surely, if the next Stable Diffusion had to be trained from a
| dataset that has been purged of images that were not under a
| permissive license, this would at most be a minor setback on AI's
| road to obsoleting painting that is more craft than art. Do
| artists not realise this (perhaps because they have some kind of
| conceit along the lines of "it only can produce good-looking
| images because it is rearranging pieces of some Real Artists'
| works it was trained on"), are they hoping to inspire overshoot
| legislation (perhaps something following the music industry model
| in several countries: AI-generated images assumed pirated until
| proven otherwise, with protection money to be paid to an artists'
| guild?), or is this just a desperate rearguard action?
| wruza wrote:
| There's only one way to figure it out - train on a properly
| licensed content and show them that.
|
| Your line of reasoning sounds like "ah, we already won so your
| protest doesn't matter anyway", but have you actually won?
| Do you really _not_ need all their work to draw at the same
| level? Just show that.
| 4bpp wrote:
| I'm not in AI and my GPU barely runs games from 10 years ago,
| so I'll pass. To be more precise, though, I think that it
| _seems_ that their protest won't matter, but the one way in
| which I see that it may (the second out of three options)
| leads to an outcome that I would just consider bad in the
| short term (for society, and for artists that are not
| established enough to benefit from any emerging
| redistribution system; we observe cases in Germany every so
| often where pseudonymous musicians are essentially forced to
| charge for their own performances and redirect proceeds to
| rent-seekers and musicians that are not them, because they
| can't prove ownership of their own work to GEMA's
| satisfaction).
| chrisco255 wrote:
| But human beings themselves are influenced by licensed
| content. And remix it just the same as AI.
| gedy wrote:
| But they're _Artists_ and that makes the same approach
| all better
|
| /s
| wruza wrote:
| https://news.ycombinator.com/item?id=33998736
| gpderetta wrote:
| Also if a theoretical purged-dataset SD were released, it would
| still be easy and cheap for users to extend it to imitate any
| art style they want. As they wouldn't be redistributing the
| model and presumably they would use art they have already
| licensed the copyright issue would be further muddled.
|
| I think attempting to prevent this is a losing battle.
| Gigachad wrote:
| I'm not too sure how it works but someone commented that you
| can take the model and "resume training" it on the extra
| dataset you want to add.
|
| Given most of the heavy lifting is already done, this seems
| like a pretty easy thing for anyone to do.
| mejutoco wrote:
| It is called fine-tuning or transfer learning, and you
| usually retrain only the last layer.
|
| Here is an example for keras (a popular ML framework).
| https://keras.io/guides/transfer_learning/
| gpderetta wrote:
| https://dreambooth.github.io/
|
| edit: the examples are all about objects, but my
| understanding is that it is capable of style transfers as
| well.
| nwoli wrote:
| I'm sure artists realise that. They also realise the power of
| these things and I see this more as a fight for survival.
| They're up against the wall and they know it, and they're
| incredibly well connected and have invested their lives up to
| now into this so they won't just lie down without a fight
| (trying anything).
| Tepix wrote:
| Imagine you are an artist and you have developed your unique
| style.
|
| Would you mind if AI starts creating art like yours?
|
| What if your clients tell you they bought the AI generated art
| instead of yours?
| FeepingCreature wrote:
| Would you mind if there was another person who copied your
| style? What if your clients...?
|
| Yeah, sure you'd mind. However, we have decided as a society
| that "style" is not protected.
| wruza wrote:
| "We" decide on today's issues, not on all future
| possibilities. The reason for that decision in the past was
| to allow many creators to create without being too held
| back by "private property" signs everywhere. The current
| situation allows AI to create but demotivates creators. Now
| it's time to think about what we will do when AI can't pick
| up a new style and there are no longer enough creators who
| can or want to create one, whether that is a near-future
| problem or not a problem at all, and what we should decide
| this time.
|
| Simply hiding behind an obsolete technicality is surely the
| wrong way to handle it.
| concordDance wrote:
| By the time we're tired of the existing styles I suspect
| we'll have AGI and the entire question will be moot.
| oneoff786 wrote:
| Style is entirely subjective and impossible to define.
| Van Gogh had a style. Are we going to say that we would
| want a society where only Van Gogh is allowed to make
| Impressionist paintings? Who decides if your painting is
| similar enough to Van Gogh that it's illegal? What if
| your style is simplistic. Are you going to need to
| compare your art to all published art to make sure a
| court couldn't find it "too similar"? What if we make a
| painting with AI that is a mix of Picasso and Van Gogh?
| Style?
|
| It's a stupid concept. It would never work. Even the
| visualizations we see that are explicitly attempting to
| copy another artist's style are often still clearly not
| exactly the same.
| wruza wrote:
| I don't think style will be a subject here at all. Maybe
| we'll settle on AI users needing explicit permission before
| training on someone's content while humans don't.
| Nadya wrote:
| I still don't see how this isn't the "Realistic
| Portrait/Scenic Painters vs Photography" argument rehashed.
|
| Imagine you are a painter and you have developed your
| expertise in photorealistic painting over your entire
| lifetime.
|
| Would you mind if someone snaps a photograph of the same
| subject you just painted?
|
| What if your commissioners tell you they decided to buy a
| photograph instead of your painting because it looked more
| realistic?
|
| Every argument I've seen against AI art is an appeal to
| (human) ego or an appeal to humanity. I don't find either
| argument compelling. Take this video [0] for example and half
| of the counterarguments are an appeal to ego - and one
| argument tries to paint the "capped profit" as a shady
| dealing of circumventing laws without realizing (1) it's been
| done before, OpenAI just tried slapping a label on it and (2)
| nonprofits owning for-profit subdivisions is commonplace.
| Mozilla is both a nonprofit organization (the Foundation) and
| a for-profit company (the Corporation).
|
| E:
|
| I'm going to start a series of photographs that are
| intentionally bad and poorly taken. Poor framing, poor
| lighting, poor composition. Boring to look at, poor white
| balance, and undersaturated photos like the kind taken on
| overcast days. With no discernable subjects or points of
| interest. I will call the photos art - things captured solely
| with the press of a button by pointing my camera in a
| direction seemingly at random. I'm afraid many won't
| understand the point I am making but if I am making a point
| it does make the photographs art - does it not? I'm pretty
| sure that is how modern art works. I will call the collection
| "Hypocrisy".
|
| E2:
|
| The first photo of the collection to set the mood - a picture
| of the curtain in my office:
| https://kimiwo.aishitei.ru/i/mUjQ5jTdeqrY3Vn0.jpg
|
| Chosen because it is grey and boring. The light is not
| captured by the fabric in any sort of interesting manner -
| the fabric itself is quite boring. There is no pattern or
| design - just a bland color. There is nothing to frame - a
| section of the curtain was taken at random. The photo isn't
| even aligned with the curtain - being tilted some 40 odd
| degrees. Nor is the curtain ever properly in focus. A perfect
| start for a collection of boring, bland photos.
|
| [0]
| https://www.youtube.com/watch?v=tjSxFAGP9Ss&feature=youtu.be
| mtrower wrote:
| Your art is fascinating; how can I donate to the cause?
| Nadya wrote:
| A second photo has been added to the collection - for
| anyone who thought I might be joking about doing this.
|
| Photos will periodically be added to the collection - not
| that I expect anyone whatsoever to ever be interested in
| following a collection of photos that is meant to be boring
| and uninspired. However - feel free to use this collection
| of photos as a counterargument to the argument that "art
| requires some effort". I promise that I will put far less
| thought and effort into the photos of this collection than
| I have in any writing of prompts for AI generated art that
| I've done.
|
| Art is little more than a statement and sometimes a small
| statement can carry a large message.
|
| https://imgur.com/a/Oez2w64
|
| Tomorrow I will work on setting up a domain and gallery for
| the images - to facilitate easier discussion and sharing.
| Is the real artistic statement the story behind the
| collection and not the collection itself? How can the two
| be separated? Can one exist without the other?
| Brushfire wrote:
| Imagine you are a startup business owner and you have
| developed a unique product or service.
|
| And then someone comes along and competes with you?
|
| --
|
| No one is bothered by competition in markets.
|
| Why do we have more or less empathy of this type for some
| professions?
| MomoXenosaga wrote:
| The appeal of art is the artist. Unless computers gain
| sentience they cannot replace the humanity and ego of
| artists.
|
| Ever wondered why artists have to show up at gallery
| parties to sell their stuff?
| mtrower wrote:
| No, the appeal of the artist is the artist. The art does
| offer a means to connect with the artist. It does not
| follow that the art may not offer its own appeal besides.
| BeFlatXIII wrote:
| > The appeal of art is the artist.
|
| To some. To others, the artistic object is all that
| matters.
| MomoXenosaga wrote:
| That must be why every painting is signed.
| Artists are selling a brand- Rembrandt already understood
| that 400 years ago.
| Taywee wrote:
| If they competed with me by throwing my product through a
| decompiler, feeding it into an AI model, and selling the
| generated output, I'd be pretty upset about it.
|
| Which is pretty close to the actual issue here, that
| artists did not give their permission to use their own work
| to generate their competition.
| mtrower wrote:
| Wouldn't that say more about the client than the
| competitor?
| [deleted]
| onetrickwolf wrote:
| To quote another comment: "Instead of replacing crappy
| jobs and freeing up peoples time to enjoy their life, we're
| actually automating enjoyable pursuits."
|
| I think this isn't just a simple discussion on competition
| and copyright, I think it's a much larger question on
| humanity. It just seems like potentially a bleak future if
| enjoyable and creative pursuits are buried and even
| surpassed by automation.
| BeFlatXIII wrote:
| Some people enjoy looking at images more than creating
| them.
| onetrickwolf wrote:
| Yeah maybe, but I think we also already have a problem
| with overconsumption of media though. I am not sure this
| is helping.
|
| It seems inevitable and I don't think we can stop it, but
| I just am kind of worried about the collective mental
| health of humanity. What does a world look like where
| people have no jobs and even creative outlets are
| dominated by AI? Are people really just happy only
| consuming? What even is the point of humanity existing at
| that point?
| mtrower wrote:
| If the pursuit is enjoyable, it should continue to be
| enjoyable as a hobby, no?
|
| Meanwhile, where is my bevy of custom artists willing to
| do free commission work for me? It's enjoyable, right?
|
| I see a lot of discussion about money and copyright, and
| little to no discussion about the individual whose life
| is enriched by access to these tools and technologies.
|
| As for your bleak future... will that even come to pass?
| I don't know. Maybe it depends on your notion of
| "surpass", and what that looks like.
| onetrickwolf wrote:
| > If the pursuit is enjoyable, it should continue to be
| enjoyable as a hobby, no?
|
| I think for most people the enjoyable and fulfilling part
| of life is feeling useful or having some expression and
| connection through their work. There's definitely some
| people who can create in a vacuum with no witness and be
| fulfilled, but I think there's a deep need for human
| appreciation for most people.
|
| > As for your bleak future... will that even come to
| pass? I don't know. Maybe it depends on your notion of
| "surpass", and what that looks like.
|
| I don't know either, maybe it will be fine. Maybe this
| will pass like the transition from traditional to
| digital. But something about this feels different...like
| it's actually stealing the creative process rather than
| just a paradigm shift.
| bigbacaloa wrote:
| In most markets everyone is bothered by competition and
| tries to eliminate it.
| [deleted]
| astrange wrote:
| Sure, artists don't like having competition, but that doesn't
| mean their competitors should listen to them.
| PurpleRamen wrote:
| Would they mind if another artist would create the same art-
| style independent of them? Or something 99% alike? 95%? How
| many art-styles are even possible without overlapping too
| much?
| CyanBird wrote:
| The big issue is precisely this, yeah: living artists are
| upset that an AI can take their own names as input and
| output their artistic styles. That's the big thorn with
| these ML systems
|
| There is a secondary issue in that other people are now able
| to craft high-quality images with strong compositions
| without spending the "effort/training" that artists had to
| put in over years to produce them, so they are
| bitter about that too, but that's generally a minor cross-
| section of the public outcry, though they are quite vitriolic.
|
| Photobashing, tracing, etc.: there has always been a layer
| of purists who look down on anyone who doesn't "put the
| effort in" yet gets great results in a timely manner. These
| purists will always exist, just as when digital painting was
| starting and digital painters were looked down on by oil
| painters for not putting the effort in, even though oil
| painters themselves used tricks like projecting onto the
| blank canvas to get perspective-perfect images. But that's
| just human nature to a degree: trying to put down other
| people while yourself using tricks to speed up processes.
| sdiupIGPWEfh wrote:
| > Would you mind if AI starts creating art like yours?
|
| The law isn't there to protect my feelings, so whether I mind
| or not is irrelevant. Artists have had to deal with shifting
| art markets for as long as art has been a profession.
|
| > What if your clients tell you they bought the AI generated
| art instead of yours?
|
| I'd be sad and out of a source of income. Much the same way I
| would be if my clients hired another similar but cheaper
| artist. The law doesn't guarantee me a livelihood.
| 4bpp wrote:
| The idea that the AI will compete with you by copying your
| unique style seems like exactly the sort of short-sighted
| conceit that I alluded to in my post above. As an artist,
| would you be much happier if, rather than the AI copying your
| style, the AI generated infinitudes of pictures in a style
| that the overwhelming majority of humans prefers to yours, so
| that you couldn't hope to ever create anything that people
| outside of a handful of hipsters and personal friends will
| value?
| deelly wrote:
| > The idea that the AI will compete with you by copying
| your unique style seems like exactly the sort of short-
| sighted
|
| Could you please elaborate, why it's "short-sighted"?
|
| > As an artist, would you be much happier if, rather than
| the AI copying your style, the AI generated infinitudes of
| pictures in a style that the overwhelming majority of
| humans prefers to yours, so that you couldn't hope to ever
| create anything that people outside of a handful of
| hipsters and personal friends will value?
|
| You mean that any artist should be just happy that his work
| is used by other people / rich corporation / AI without
| consent? Cool, cool.
| 4bpp wrote:
| > Could you please elaborate, why it's "short-sighted"?
|
| Because it's barely been a year since we've gone from
| people confidently asserting that AI won't be able to
| produce visual art on the level of human professionals at
| all to the current situation. Predictions on ways in
| which AI performance will not catch up to or overtake
| human performance have a bad track record at the moment,
| and it has not been long enough to even suspect that the
| current increase in performance might be plateauing.
| Cutting-edge image generation AI appears to often imitate
| human artists in obvious ways _now_, but it seems quite
| plausible that the gap between this and being
| "original"/as non-obvious in your imitation of other
| humans as those high-performing human artists that are
| considered to be original is merely quantitative and will
| be closed soon enough.
|
| > You mean that any artist should be just happy that his
| work is used by other people / rich corporation / AI
| without consent? Cool, cool.
|
| I don't know how you get that out of what I said. Rather,
| I'm claiming that artists will have enough to be unhappy
| about being obsoleted, and the current direction of their
| ire at being "copied" by AI may be a misdirection of
| effort, much as if makers of horse-drawn carriages had
| tried to forestall the demise of their profession by
| complaining that the design of the Ford Model T was
| ripped off of theirs (instead of, I don't know, lobbying
| to ban combustion engines altogether, or sponsoring Amish
| proselytism).
| chrisco255 wrote:
| Many skilled and talented programmers work on open source
| software for the explicit purpose of allowing it to be copied
| and extended in any fashion.
| mejutoco wrote:
| > in any fashion.
|
| Several open source licenses do not agree with this (they
| enforce restrictions on how it is to be shared).
| mtrower wrote:
| This is true, and many bitter wars are fought over OSS
| licensing. I'm not sure it derails his point - there's an
| awful lot of BSD, MIT etc. licensed code out there.
| orbifold wrote:
| I think this drastically overestimates what current AI
| algorithms are actually capable of, there is little to no hint
| of genuine creativity in them. They are currently severely
| limited by the amount of high quality training data not the
| model size. They are really mostly copying whatever they were
| trained on, but on a scale that it appears indistinguishable
| from intelligent creation. As humans we don't have to agree
| that our collective creative output can be harvested and used
| to train our replacements. The benefits of allowing this will
| be had by a very small group of corporations and individuals,
| while everyone else will lose out if this continues as is. This
| will and can turn into an existential threat to humanity, so it
| is different from workers destroying mechanical looms during
| the industrial revolution. Our existence is at stake here.
| XorNot wrote:
| > They are really mostly copying whatever they were trained
| on
|
| People keep saying this without defining what _exactly_ they
| mean. This is a technical topic, and it requires technical
| explanations. What do _you_ think "mostly copying" means
| when you say it?
|
| Because there isn't a shred of original pixel data reproduced
| from training data through to output data by any of the
| diffusion models. In fact there isn't enough data in the
| model weights to reproduce any images at all, without adding
| a random noise field.
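|
A toy numerical sketch of that last point (purely illustrative, not any real diffusion architecture; the "weights" below are made-up numbers): the fixed parameters only steer the output, while the injected noise field determines what you actually get, so the same weights yield different results for different seeds.

```python
import random

# Toy stand-in for a diffusion sampler, purely illustrative: the
# "weights" below are made-up numbers, not a real model. The point is
# that fixed weights plus a required random noise field means the
# weights alone don't encode any single output.
LEARNED_MEAN = [0.2, 0.7, 0.5]  # hypothetical fixed model parameters

def sample(seed, steps=50, step_size=0.1):
    rng = random.Random(seed)
    # Start from pure random noise, as diffusion samplers do.
    x = [rng.gauss(0, 1) for _ in LEARNED_MEAN]
    for _ in range(steps):
        # Each step nudges the sample toward what the weights encode,
        # while fresh noise keeps different seeds on different paths.
        x = [xi + step_size * (m - xi) + rng.gauss(0, 0.01)
             for xi, m in zip(x, LEARNED_MEAN)]
    return x

# Same weights, different noise -> different outputs; neither is a
# stored copy of anything in particular.
print(sample(seed=1) != sample(seed=2))  # True
```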
|
| > The benefits of allowing this will be had by a very small
| group of corporations and individuals
|
| You are also grossly mistaken here. The benefits of heavily
| restricting this, will be had by a very small group of
| corporations and individuals. See, everyone currently comes
| around to "you should be able to copyright a style" as the
| solution to the "problem".
|
| Okay - let's game this out. US Copyright lasts for the life
| of author plus 70 years. No copyright work today will enter
| public domain until I am dead, my children are dead, and
| probably my grandchildren as well. But copyright can be
| traded and sold. And unlike individuals, who do die,
| corporations as legal entities do not. And corporations can
| own copyright.
|
| What is the probability that any particular artistic "style"
| - however you might define that (whole other topic really) -
| is truly unique? I mean, people don't generally invent a
| style on their own - they build it up from studying other
| sources, and come up with a mix. Whatever originality is in
| there is more a function of mutation of their ability to
| imitate styles than anything else - art students, for
| example, regularly will do studies of famous artists and
| intentionally try to copy their style as best they can. A
| huge amount of content tagged "Van Gogh" in Stable Diffusion
| is actually Van Gogh look-alikes, or content literally
| labelled "X in the style of Van Gogh". It had nothing to do
| with the original man at all.
|
| The answer, I'd argue, is zero: there are no truly
| original art styles. Which means in a world with
| copyrightable art styles, _all_ art styles eventually end up
| as a part of corporate owned styles. Or the opposite is also
| possible - maybe they _all_ end up as public domain. But in
| both cases the answer is the same: if "style" becomes a
| copyrightable term, and AIs can reproduce it in some way
| which you can prove, then literal "prior art" of any
| particular style will invariably be an existing part of an AI
| dataset. Any new artist with a unique style will invariably
| be found to simply be 95% a blend of other known styles from
| an AI which has existed for centuries and been producing
| output constantly.
|
| In the public domain world, we wind up approximately where we
| are now: every few decades old styles get new words keyed
| into them as people want to keep up with the times of some
| new rising artist who's captured a unique blend in the
| zeitgeist. In the corporate world though, the more likely
| one, Disney turns up with its lawyers and says "we're taking
| 70% or we're taking it all".
| alan-crowe wrote:
| Trying to be _exact_ about "mostly copying", I want to
| contrast Large Language Models (LLM) with Alpha Go learning
| to play super human Go through self play.
|
| When Alpha Go adds one of its own self-vs-self games to its
| training database, it is adding a genuine game. The rules
| are followed. One side wins. The winning side did something
| right.
|
| Perhaps the standard of play is low. One side makes some
| bad moves, the other side makes a fatal blunder, the first
| side pounces and wins. I was surprised that they got
| training through self play to work; in the earlier stages
| the player who wins is only playing a little better than
| the player who loses and it is hard to work out what to
| learn. But the truth of Go is present in the games and not
| diluted beyond recovery.
|
| But a LLM is playing a post-modern game of intertextuality.
| It doesn't know that there is a world beyond language to
| which language sometimes refers. Is what a LLM writes true
| or false? It is unaware of either possibility. If its own
| output is added to the training data, that creates a
| fascinating dynamic. But where does it go? Without Alpha
| Go's crutch of the "truth" of which player won the game
| according to the hard coded rules, I think the dynamics
| have no anchorage in reality and would drift, first into
| surrealism and then psychosis.
|
| One sees that AlphaGo is copying the moves that it was
| trained on and a LLM is also copying the moves that it was
| trained on and that these two things are not the same.
| orbifold wrote:
| Ok, let me try to be technical. These models fundamentally
| can be understood as containing a parametrised model of an
| intractable probability distribution ("human created
| images", "human created text"), which can be conditioned on
| a user-provided input ("show me three cats doing a tango",
| "give me a summary of the main achievements of Richard
| Feynman") and sampled from. The way they achieve their
| impressive performance is by being exposed to as much
| human-created content as possible; once that has happened,
| they have limited to no ways of self-improvement.
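|
A minimal sketch of that framing (a toy stand-in: the prompts, the single scalar "image feature", and all the numbers are made up for illustration, not a real image model): the "model" is just parameters of a conditional distribution p(x | prompt), and generation is conditioning on the user's input and sampling.

```python
import random

# Toy sketch of the conditional-distribution framing: the "model" is
# just parameters of p(x | prompt), and generation is conditioning on
# the input and sampling. Everything here is made up for illustration.
MODEL = {
    "three cats doing a tango": (0.8, 0.1),  # (mean, std), hypothetical
    "summary of Richard Feynman": (0.3, 0.2),
}

def generate(prompt, seed=None):
    rng = random.Random(seed)
    mean, std = MODEL[prompt]      # condition on the provided input
    return rng.gauss(mean, std)    # sample from the distribution

# Repeated sampling clusters around the conditional mean induced by
# training -- there is no single stored answer being retrieved.
samples = [generate("three cats doing a tango", seed=i) for i in range(1000)]
print(round(sum(samples) / len(samples), 1))  # 0.8
```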
|
| I disagree that there is no originality in art styles,
| human creativity amounts to more than just copying other
| people. There is no way a current gen AI model would be
| able to create truly original mathematics or physics; it is
| just able to reproduce facsimiles and convincing bullshit
| that look like it. Before long the models will probably be
| able to do formal reasoning in a system like Lean 4, but
| that is a long way off from truly inventive mathematics or
| physics.
|
| Art is more subtle, but what these models produce is mostly
| "kitsch". It is telling that their idea of "aesthetics"
| involves anime fan art and other commercial work. Anyways,
| I don't like the commercial aspects of copyright all that
| much, but what I like is humans over machines. I believe in
| freely reusing and building on the work of others, but not
| on machines doing the same. Our interests are simply not
| aligned at this point.
| idlehand wrote:
| This has been a line of argument from every Luddite since the
| start of the industrial revolution. But it is not true.
| Almost all the productivity gains of the last 250 years have
| been dispersed into the population. A few early movers have
| managed to capture some fraction of the value created by new
| technology, the vast majority has gone to improve people's
| quality of life, which is why we live longer and richer lives
| than any generation before us. Some will lose their jobs and
| that is fine because human demand for goods and services is
| infinite, there will always be jobs to do.
|
| I really doubt that AI will somehow be our successors.
| Machines and AI need microprocessors so complex that it took
| us 70 years of exponential growth and multiple trillion-
| dollar tech companies to train even these frankly quite
| unimpressive models. These AI are entirely dependent on our
| globalized value chains with capital costs so high that there
| are multiple points of failure.
|
| A human needs just food, clean water, a warm environment and
| some books to carry civilization forward.
| orbifold wrote:
| There is a significant contingent of influential people
| that disagree. "Why the future doesn't need us"
| (https://www.wired.com/2000/04/joy-2/), Ray Kurzweil etc.
| This is qualitatively different than what the Luddites
| faced, it concerns all of us and touches the essence of
| what makes us human. This isn't the kind of technology that
| has the potential to make our lives better in the long run;
| it will almost surely be used for more harm than good. Not
| only are these models trained on the collectively created
| output of humanity, the key application areas are to
| subjugate, control and manipulate us. I agree with you that
| this will not happen immediately, because of the very real
| complexities of physical manufacturing, but if this part of
| the process isn't stopped in its tracks, the resulting
| progress is unlikely to be curtailed. I at least
| fundamentally think that the use of all of our data and
| output to train these models is unethical, especially if
| the output is not freely shared and made available.
| yeknoda wrote:
| It seems we are running out of ways to reinvent ourselves
| as machines and automation replace us. At some point,
| perhaps approaching, the stated goals of improving quality
| of life and reducing human suffering ring false. What is a
| human being if we have nothing to do? Where are the vast
| majority of people supposed to find meaning?
| yeknoda wrote:
| I've been lucky enough to build and make things and work
| in jobs where I can see the product of my work - real,
| tangible, creative, and extremely satisfying. I can only
| do this work as long as people want and need the work to be
| done.
| ChadNauseam wrote:
| I don't see why machines automatically producing art
| takes away the meaning of making art. There's already a
| million people much better at art than you or I will ever
| be producing it for free online. Now computers can do it
| too. Is that supposed to take away my desire to make art?
| snordgren wrote:
| Where do you find meaning in life today? What do you do
| on weekends and vacations?
|
| Another place to look is the financially independent.
| What are they doing with their time?
| rperez333 wrote:
| Exactly this, and it was clear from the backlash SD 2.0
| got after they removed artist labels and it became 'less
| creative'. Most people are not interested in the creative
| aspect, just looking for an easy way to copy art from
| people they admire.
| netheril96 wrote:
| > They are really mostly copying whatever they were trained
| on, but on a scale that it appears indistinguishable from
| intelligent creation.
|
| Which is what most humans do, and what most humans need.
| Tao3300 wrote:
| None of the above. They don't like it being trained on and
| occasionally regurgitating their work.
| 1auralynn wrote:
| To me, it's an art vs. craft issue and there are many shades of
| gray to the discussion, because the root is really based in the
| question that every first-year art student is tasked with
| answering for themselves "What is art?"
|
| If art for you is primarily centered on fidelity of
| implementation (i.e. "craft") then you will be very threatened by
| AI, particularly if you've made it your livelihood. However, if
| your art is more about communication/concepts, then you might
| even feel empowered by having such a toolset and not having to
| slog through a bunch of rote implementation when developing your
| ideas/projects. Not to mention that a single person will be able
| to achieve much much more.
|
| I feel like it's possibly a good thing for art/humanity overall
| to stop conflating craft with art, because new ideas will rise
| above all of the AI-generated images. i.e. splashiness alone will
| no longer be rewarded.
|
| In an ideal future when we all live in the Star Trek universe,
| none of it will matter and whoever loves crafting stuff can do it
| all day long. Until then of course, it's tragic and lots of
| people will be out of jobs.
| 1auralynn wrote:
| Not to mention it also may spur innovations in different
| mediums: More time-based art, installations, video games, etc.
| mwigdahl wrote:
| We already have a great example of a group that has fought
| technological development of a synthetic alternative to their
| product -- the diamond industry.
|
| For years DeBeers and other diamond moguls have run extensive
| propaganda campaigns to try to convince people that lab-grown
| diamonds are physically, emotionally, and morally inferior. They
| had a lot of success at first. Based on lobbying, the US FTC
| banned referring to lab-grown diamonds as "real", "genuine", or
| even "stone". It required the word "diamond" be prefixed with
| "lab-grown" or "synthetic" in any marketing materials.
|
| Technology kept improving, economies of scale applied, and
| consumer demand eventually changed the balance. The FTC reversed
| its rulings and in 2022 demand for lab-grown stones (at small
| fractions of equivalent natural prices) is at an all-time high.
|
| Artists (and writers, and programmers) can fight against this all
| they like, and may win battles in the short term. In the end the
| economic benefits accruing to humankind as a result of these
| technologies are inexorably going to normalize them.
| WanderPanda wrote:
| I think what these generative models reveal is that the vast
| majority of art is just interpolation.
| crote wrote:
| Was there ever any doubt about that? There are _literally_
| entire graduate studies on it.
|
| However, art isn't _solely_ interpolation. The critical part is
| that art styles shift around due to innovations or new
| viewpoints, often caused by societal development. AI might be
| able to make a new Mondriaan when trained on pre-existing
| Mondriaans but it won't suddenly generate a Mondriaan out of a
| Van Gogh training set - and yet that's still roughly what
| happened historically.
| beezlebroxxxxxx wrote:
| Lots of people in these comments trying to reduce art in a
| way that is pretty hilarious. You hit the nail on the head.
| Art is only interpolation if you....remove the human that
| created it, in which case you would not call the image art.
| AI "art" is computational output, to imply otherwise is to
| mistakenly imply a family resemblance to human (and uniquely
| human I would argue) creation.
| dymk wrote:
| The human brain is just a model with weights and a lifelong
| training step. Seems like a distinction without a
| difference - even more so as ML models advance further.
| beezlebroxxxxxx wrote:
| > Seems like a distinction without a difference
|
| This is giving ML models more credit than they are due.
| They are unable to imagine; they might convincingly
| seem to produce novel outputs, but their outputs are
| ultimately circumscribed by their inputs, datasets, and
| programming. They're machines. Humans can learn like
| machines, but humans are also able to imagine as agents.
| "AI" "art" is just neither of its namesakes. That doesn't
| mean it isn't impressive, but implying they are the same
| is granting ML more powers and abilities than it is
| capable of.
| dymk wrote:
| Humans imagine mostly by interpolating things they've
| seen before. Add in some randomness and you get novel
| output (creativity).
| beezlebroxxxxxx wrote:
| You're oversimplifying imagination. It _could_ be related
| to something they've seen before, or it could _not_ be.
| It could be entirely invented and novel in a way that has
| no antecedent to senses. Nor is it mere randomness added
| in. Imagining is something an agent _does_ and is capable
| of. The fly in the ointment is still that ML models
| simply do not have agency in a fundamental way; they are
| programmed and they're limited by that programming,
| that's what makes them and computers so effective as
| tools: they do _exactly_ as they are programmed, which
| can't be said for humans. _We_, as humans, might find
| the output imaginative or novel or even surprising, but
| the ML model hasn't done anything more than follow
| through on its programming. The ML programmer simply
| didn't expect (or can't explain the programming) the
| output and is anthropomorphizing their own creation as a
| means of explanation.
| mtrower wrote:
| But you know. Everything you said can easily be imagined
| to apply to humans as well. You can't see your own
| programming, and so can't fully understand it, and so you
| imagine it to be something more than what it is.
| beezlebroxxxxxx wrote:
| The problem you run into with that is that saying "humans
| are programmed" in the _identical sense_ as "computers
| are programmed" is nonsensical. We have powers that
| computers simply do not, like agency, imagination, we are
| capable of understanding, etc. So, the concept of
| programming a computer and "programming a human" would
| mean different things, which they do in our language. You
| run into either fundamentally redefining what programming
| means, placing sentient, agential humans on the same
| plane as non-sentient, non-agential machines; or you run
| into a situation where it makes no sense to say "Humans
| are programmed identically to computers."
|
| But if you say "humans are programmed" in a metaphorical
| sense, then yeah sure that's an interesting thought
| experiment. But it's still a thought experiment.
| xikrib wrote:
| The human experience is an embodied one, it is not just
| information processing
| dymk wrote:
| Do we know for a fact that a sufficiently stateful and
| complex ML model won't experience subjective
| consciousness?
| kmeisthax wrote:
| Vaguely related: Mickey Mouse will actually be hitting the public
| domain in 2024. That's a year and a few weeks away.
| fallingfrog wrote:
| To be honest: I'm not generally a luddite, but in this case- I
| think we should nip this in the bud. I can see where this is
| going. You can argue back and forth about whether this will make
| the economy grow, but that's not the point. The profits from
| increased productivity do not accrue to the workforce but to the
| owners of the capital, in the absence of concerted, organized
| resistance, so I would not expect the quality of life for the
| majority of people to improve because of this.
|
| The question is: do you _like_ human beings? Because there is
| really no job that can't be replaced, if the technology goes far
| enough. And then the majority of the population, or _all_ of the
| population, becomes dead weight. I'm a musician; how long before
| an AI can write better songs than I can in a few seconds?
|
| This is fundamentally different than past instances of technology
| replacing human labor, because in the past, there was always
| something else that humans could do that the machines still could
| not. Now- that may not be the case.
|
| There is only one choice: I think we should outlaw all machine
| learning software, worldwide.
| throwawayoaky wrote:
| See you guys in five years!
| Tycho wrote:
| What about an AI that can write unit tests for any codebase...
| seems like the overall benefit of that would be huge.
| spikeagally wrote:
| Does anybody else find the whole AI art generation thing both
| amazing and incredibly depressing at the same time? I've played
| around with it and it's lots of fun. But I can also see a deluge
| of mediocre "content" taking over the internet in the near
| future. "Real art" will become a niche underground discipline.
| Most popular music will be AI generated and will have fake
| performers also generated to go along with it. And most people
| will be fine with that.
|
| I don't think "real art" will disappear. People will always want
| to create (although monetising that will now be exceedingly more
| difficult).
|
| It feels like we are ripping the humanity out of life on a
| greater and greater scale with tech. Instead of replacing crappy
| jobs and freeing up people's time to enjoy their life, we're
| actually automating enjoyable pursuits.
|
| NB: when I'm referring to art I mean of all types as that's where
| we are heading.
| rco8786 wrote:
| > But I can also see a deluge of mediocre "content"
|
| Have you been to the internet?
|
| In all seriousness, the cream will rise to the top. The
| mediocre "content" will get generated and we will get better at
| filtering it out which will decrease the value in generating
| mediocre content, etc etc. The tools being produced just
| further level the playing field for humanity and allow more
| people to get "in the arena" more easily.
|
| Humans are still the final judge of the value being produced,
| and the world/internet will respond accordingly.
|
| For a thought exercise, take your argument and apply it to the
| internet as a whole, from the perspective of a book or
| newspaper publisher in the 1990s.
| crote wrote:
| Have _you_ been to the internet?
|
| High-quality content rarely rises to the top. The internet as
| of 2022 optimizes for mediocrity: the most popular content
| is whatever is best at psychological manipulation, using things
| like shock value and sexuality. Just take a look at Twitter,
| Facebook, or Reddit: it is _extremely_ rare to see genuine
| masterpieces on there. Everything is just posted to farm as
| many shares and likes as possible.
|
| If anything, this will result in the cream getting drowned in
| shit. Not to mention that artists do not get the space to
| develop from mediocre to excellent - as the mediocre market
| will have been replaced with practically free AI.
| rco8786 wrote:
| This is a truly cynical take - but by your own account the
| problem already exists and is widespread before AI even
| came along.
| moron4hire wrote:
| Not to distract too much from your point, because I agree that
| the obviously imminent explosion of AI generated work will
| probably lead to a generation of stylistic stagnation, but...
|
| We already live in a time of artistic stagnation. With how much
| audio engineers manipulate pop music in Pro Tools, "fake"
| singers have been a practical reality for 20 years. Look at
| Marvel movies. Go to any craft fair on a warm day, or any
| artists' co-op, in a major city and try, try to find one booth
| that is not exactly like 5 other booths on display.
|
| People have been arguing about what is "real art" for
| centuries. Rap music wasn't real because it didn't follow
| traditional, European modes and patterns. Photography wasn't
| real because it didn't take the skill of a painter. Digital
| photography wasn't real because it didn't take laboring in a
| dark room. 3D rendering wasn't real. Digital painting wasn't
| real. Fractal imagery wasn't real. Hell, anything sold to the
| mass market instead of one-off to a collector still isn't "real
| art" to a lot of people.
|
| Marcel Duchamp would like to have a word.
|
| If anything, I think AI tools are one of the only chances we
| have of seeing anything interesting break out. I mean, 99% of
| the time it's just going to be used to make some flat-ui,
| corporate-memphis, milquetoast creative for a cheap-ass startup
| in a second rate co-working space funded by a podunk city's
| delusions they could ever compete with Silicon Valley.
|
| But if even just one person uses the tool to stick out their
| neck and try to question norms, how can that not be art?
| netheril96 wrote:
| If generated art just replaces human-created art, then it
| can be construed as depressing.
|
| But what if AI generates art where humans do not scale?
|
| For example, what if the AAA game you are expecting gets done
| in half of the time, or has ten times the size of explorable
| area, because it is cheap and fast to generate much of the art
| it needs with AI?
|
| Or if some people excellent at storytelling but mediocre at
| drawing can now produce world-class manga due to the assistance
| of AI?
| adamhp wrote:
| I've been complaining about this with AI generated content in
| general as well, especially Twitter and blog posts. I worry
| that we're in a sort of downward spiral, creating a feedback
| loop of bad content. Eventually models will get trained on this
| badly generated content, and it will reduce the overall
| vocabulary of the Internet. Take this to the extreme, and we'll
| keep going until everything is just regurgitated nonsense.
| Essentially sucking the soul out of humanity (not that tweets
| and blog posts are high art or anything). I know that sounds a
| little drastic but I really think there's a lurking evil that
| we don't have our eye on here, in terms of humanity and AI.
| We've already seen glimpses of it even with basic ad targeting
| and various social media "algorithms".
| CadmiumYellow wrote:
| I've been thinking the same thing. I wonder if this might
| give rise to some kind of analog renaissance as people get
| sick of all the digitally regurgitated garbage. There has to
| be a point of diminishing returns for this kind of content,
| right? Maybe there will be some kind of Made By Humans
| verification that will make certain content much more
| valuable again simply by differentiating it from all the AI-
| generated simulacra.
| carlmr wrote:
| >we'll keep going until everything is just regurgitated
| nonsense.
|
| I feel like this about the mostly-human-created fashion. In
| my not so long lifetime I've seen everything from the 90s
| making a comeback. Ultimately I guess in terms of clothing
| that is practical with the materials that are available,
| we've already cycled through every style there is, such that
| the cycle time is now <30 years.
| jeremyjh wrote:
| Considering the level of discourse in almost any Twitter
| thread on any popular topic, it's hard to be sure it hasn't
| already happened.
| Workaccount2 wrote:
| To me it's terrifying and gives me a bit of panic playing with
| it. This is still early stuff, like dial-up or 100Mhz
| processors. We all know the trajectory tech takes nowadays, and
| the writing on the wall here is an event horizon where it's
| impossible to see the full scope of how this tech will change
| the world.
|
| We're like people getting the very first electric light bulbs
| in their home, trying to speculate how electricity will change
| the world. The pace of change however will be orders of
| magnitude faster than that.
| lemoncookiechip wrote:
| > But I can also see a deluge of mediocre "content" taking over
| the internet in the near future.
|
| This has always been the case. Most entertainment regardless of
| form (music, art, tv, games...) is mediocre or below mediocre,
| with the occasional good or even rarer exceptional that we all
| buzz about.
|
| AI image gen is only allowing a wider range of people to
| express their creativity, just like every other tool that came
| before it lowered the bar of entry for new people to get in on
| the medium (computer graphics, for example, allowed those who had
| no talent for pen and paper to flourish).
|
| Yes, there will be a lot of bad content, but that's nothing out
| of the ordinary.
|
| https://en.wikipedia.org/wiki/Sturgeon%27s_law
| [deleted]
| nonbirithm wrote:
| > Instead of replacing crappy jobs and freeing up peoples time
| to enjoy their life, we're actually automating enjoyable
| pursuits.
|
| But in my case, I don't happen to find drawing or painting
| enjoyable. I simply don't, for nature- or nurture-based
| reasons. I also don't believe that everyone can become a
| trained manual artist, because not everyone is _interested_ in
| doing so, even if they still (rightly or wrongly) cling to the
| idea of having instant creative output and gratification.
|
| I think this lack of interest is what makes me and many other
| people a prime target for addiction to AI-generated art. Due to
| my interest in programming I can tweak the experience using my
| skills without worrying about the baggage people of three years
| ago _had_ to deal with if they wanted a similar result.
|
| So without any sort of generation, how does one solve the
| problem of not wanting to draw, but still wanting one's own
| high-quality visual product to enjoy? I guess it would be
| learning to be interested in something one is not. And that
| probably requires virtuosity and integrity, a willingness to
| move past mistakes, and a positive mindset. The sorts of things
| that have little to do with the specific mechanics of writing
| code in an IDE to provoke a dopamine response. Also, the
| ability to stop focusing so hard on the end result, a detriment
| to creativity that so many (manual) art classes have pointed
| out for decades.
|
| I sometimes feel I lack some of those kinds of qualities, and
| yet I can somehow still generate interesting results with
| Stable Diffusion. It feels like a contradiction, or an
| invalidation of a set of ideas many people have held as sacred
| for so long, a path to the advancement of one's own inner
| being.
|
| I will relish the day when an AI is capable of convincing me
| that drawing with my own two hands is more interesting than
| using its own ability to generate a finished piece in seconds.
|
| So I agree that, on a bigger scale beyond the improvement of
| automated art, this line of thinking will do more harm to
| humanity than good. An AI can take the fall for people who
| can't or don't want to fight the difficult battles needed to
| grow into better people, and that in turn validates that kind
| of mindset. It gives even the people who detest the artistic
| process a way to have the end result, and a _decent_ one at
| that.
|
| I think this is part of the reason why the anti-AI-art movement
| has pushed back so loudly. AI art teaches us the wrong lessons
| of what it means to be human. People could become convinced to
| not want to go outside and walk amongst the trees and
| experience the world if an AI can hallucinate a convincing
| replacement from the comfort of their own rooms.
| AlexandrB wrote:
| > Instead of replacing crappy jobs and freeing up peoples time
| to enjoy their life, we're actually automating enjoyable
| pursuits.
|
| This feels like the natural outcome of Moravec's paradox[1]. I
| can imagine a grim future where most intellectually stimulating
| activities are done by machines and most of the work that's
| left for humans is building, cleaning, and maintaining the
| physical infrastructure that keeps these machines running.
| Basically all the physical grunt work that has proven hard to
| find a general technological solution for.
|
| [1] https://en.wikipedia.org/wiki/Moravec%27s_paradox
| woeirua wrote:
| We've seen this before when CGI first came out, then with the
| proliferation of Photoshop and other cheap editors. Now fake
| garbage is everywhere on the internet. Did that make human life
| substantially different? Nope. Everyone just ignores most of it
| and only believes stuff that comes from "reputable sources."
| That will be the end game here too. A flight to quality.
| nonbirithm wrote:
| But also, the explosion in interest means there had been a
| latent interest in instantly generating pictures to begin with.
|
| I think this situation says a lot about the nature of human
| desire, not just about the few people ingenious enough to
| come up with the idea of diffusion models. A lot of ingenious
| inventions turn out to be relatively boring when exposed to
| the broader populace, and don't hit on such an appealing
| latent desire.
|
| What does this say about the limitless yet-to-be-invented
| ideas that humanity is just raring to give itself, if only
| someone would hit on the correct chain of breakthroughs?
| Would anyone today be interested in building a backyard
| nuclear warhead in an afternoon, and would they attempt it if
| the barrier of difficulty were removed?
| onetrickwolf wrote:
| Yeah, I agree. I was generally pretty pro-AI-art, and on a
| logical basis I still agree with a lot of the pro-AI
| sentiments here, but as the tech develops I drift more and
| more towards thinking this may be a bleak path for humanity.
|
| > Instead of replacing crappy jobs and freeing up peoples time
| to enjoy their life, we're actually automating enjoyable
| pursuits.
|
| Yeah, this really hits the nail on the head. I thought a lot
| of the backlash against AI was due to workers not reaping the
| benefits of automation, which is a solvable problem. But I've
| seen a lot of artists who are retired or don't need to work
| still dive into despair over this. It's taking their passion
| away, not just their jobs.
|
| I don't really know how we could stop it though without doing
| some sweeping Dune-level "Thou shalt not make a machine in the
| likeness of the human mind" type laws.
| sidlls wrote:
| Near future? The internet is a cesspool of mediocre and
| terrible content already. AI is going to have an impact on
| art and everything else in general. Artists may (and likely
| will be forced to) adapt to and adopt its use.
| karmakurtisaani wrote:
| If you think about how much content we're already getting from
| mediocre artists and writers, how many tv shows are complete
| garbage, how much governments and corporations are promoting
| and trolling in online discussions, how many search results are
| already ruined by lazy copied content, it's difficult to see
| things getting orders of magnitude worse.
|
| Good stuff will still be good stuff, and it will keep being
| rare. The biggest change will be that producing mediocre
| content will be cheaper and more accessible, but we're already
| drowning in it, so .. meh?
|
| > Instead of replacing crappy jobs and freeing up peoples time
| to enjoy their life, we're actually automating enjoyable
| pursuits.
|
| That's an interesting observation.
| dredmorbius wrote:
| Fair assessment, and I agree with much of your premise,
| though regards "it's difficult to see things getting orders
| of magnitude worse": Please _don't challenge them_.
| Lichtso wrote:
| I totally agree that there is a lot of low-effort and
| consequently low-quality stuff out there in the world
| already. However, it still costs something to make. With this
| form of automation getting better, it will simply become a
| lot cheaper to produce, and is thus going to happen a lot
| more. So I expect the ratio to become worse, maybe even
| "orders of magnitude" worse.
| astrange wrote:
| If you find everything incredibly depressing, that may simply
| mean you have depression, not that it's actually objectively
| bad.
| MomoXenosaga wrote:
| Museums already have basements filled with thousands of art
| pieces nobody has seen in decades. There's already too much
| content.
| kmlx wrote:
| > But I can also see a deluge of mediocre "content" taking over
| the internet
|
| I noticed this mediocrity decades ago, when artists started
| using computers to create art. For me, that's when it went
| downhill.
| beezlebroxxxxxx wrote:
| I will say, the kind of art intended for corporate needs
| (much of which in the last decade in particular has been a
| deluge of bland vector art with weird blob people) is not the
| same as the art that many artists make in their own time, or
| would regard as good.
|
| The through line for a lot of mediocre stuff is the
| artist/creator's intention to appeal to as broad a
| demographic/audience as possible, which dissolves away
| anything that makes the art interesting, challenging, and
| good.
| PurpleRamen wrote:
| The majority of everything is always mediocre at best. There
| is no absolute value in these things; they are always pitted
| against each other. Something mediocre today could have been
| a masterpiece some decades ago, and a masterpiece from
| decades ago could be hot garbage today. These things are a
| constantly moving target and will always shift. People will
| just adapt their taste and figure out some new rules for why
| something that was a masterpiece yesterday became mediocre
| today, and so on.
| fleddr wrote:
| I sympathize with artists on this matter, but they're really bad
| at protesting.
|
| AI Mickey Mouse is a possible copyright as well as trademark
| violation which would likely be enforced in the exact same way if
| you were to hand draw it. This type of violation is not AI
| specific.
|
| The main threat AI poses is not that it outputs copyrighted
| characters, but that it outputs brand new works that are
| either totally new (the idea has never been drawn before,
| only the style is derived) or different enough from a known
| character to be considered a derived work.
|
| Another way to put it: artists' current job is not to draw
| Mickey. It is to draw new works, which is the part AI is
| threatening to replace. Sure, Disney may chase the AI
| companies to remove Mickey from the training set, and then we
| lose AI Mickey. That doesn't solve any problem, because there
| are no artist jobs that consist of drawing Mickey.
|
| Even in the case of extreme success, where it becomes illegal
| to train on a copyrighted image without explicit consent, the
| AI problem doesn't go away. They'll just use public domain
| images. Or sneak in consent without you knowing it, as was
| the case with your "free and unlimited" Google Photos.
|
| Finally, if there's any player interested in AI art, it has to be
| Disney. Imagine the insane productivity gains they can make. It's
| not reasonable to expect that they would fight AI art very hard.
| Maybe a little, for the optics.
| itronitron wrote:
| I think you are giving the AI too much credit in being able
| to pull out the trademarked bits. Artists can introduce
| trademarked iconography into their work as a poison pill.
| Sort of like the GPL, but with more powerful allies.
| fleddr wrote:
| I don't really believe that. An example: "Robot owl in the
| style of van Gogh".
|
| This will closely mimic van Gogh's style but nobody cares
| because style cannot be copyrighted in itself. So it draws a
| robot owl, which for the sake of this example, is a new
| character.
|
| Zero copyright violations.
|
| My point remains that AI users aren't going to aim for output
| that directly looks like an existing character. These artists
| are now intentionally doing that for the sake of the protest,
| but this is not how AI is used. It's used to create new works
| or far-derived works.
| rini17 wrote:
| LOL. I am on midjourney discord and this really is how it's
| being used half of the time, users asking for existing
| characters.
| 29athrowaway wrote:
| The previous generation of illustrators got disrupted by digital
| illustration.
|
| The number of people required to publish a magazine, or to create
| an ad, went down significantly with digital tools.
| ur-whale wrote:
| Art has historically always been about copy and improve.
|
| This whole copyright / intellectual property idea is something
| that unfortunately cropped up in the 20th century, and the fact
| that it was codified into law is certainly not something 20th
| century humanity should be proud of or regard as progress.
| Juliate wrote:
| Nope.
|
| Intellectual property concepts in their current form started to
| appear as soon as prints, so about the 15th century.
|
| https://en.wikipedia.org/wiki/History_of_copyright#Early_dev...
| Valmar wrote:
| > Intellectual property concepts in their current form
| started to appear as soon as prints, so about the 15th
| century.
|
| Copyright is not the same as intellectual property.
|
| Copyright is not an intellectual property concept.
|
| They're very different things, though often conflated.
| Juliate wrote:
| Perhaps not on your side of the planet, but in Europe,
| copyright is a part of intellectual property legislation.
| OctopusLupid wrote:
| I find it weird that I don't see any mention of the TDM
| ("Text and Data Mining") exceptions that already explicitly
| allow AI companies to train on copyrighted data, in some
| cases even allowing them to ignore opt-outs (such as in
| research institutions). These are already implemented in the
| UK, EU, Japan and Singapore.
|
| It seems to me that the online discourse is very US-centric,
| assuming that the AI regulatory battles lie in the future,
| when in some other countries they're already over.
| standardly wrote:
| It is pretty ironic.
|
| "AI will outdo us at repetitive, mindless tasks, but it will
| NEVER be able to compete with humans at, like, ART, and stuff"
| anonyfox wrote:
| Abolish copyright entirely. Unrestricted exchange boosts the
| learning curves of societies and benefits everyone in the
| long run, except that a few people won't become as rich in
| the process. There are several downsides attached to that,
| but I am willing to accept them.
| nwoli wrote:
| People should be rewarded for finding a unique artistic
| innovation that lots of people enjoy, I'd think.
| someNameIG wrote:
| Yes, people, not AI.
|
| I wonder if that could be a solution: anything AI-generated
| is public domain, and no one can own the IP to it. It would
| allow use for research, education, and hobbyists, but hinder
| how large corporations could use it.
|
| Maybe even make it like the GPL: anything using AI-generated
| material must also be public domain.
| JetAlone wrote:
| quonn wrote:
| It's amusing how so many on this thread assume they can know
| what will happen to this or that profession.
|
| We don't know. We just don't.
|
| It's too difficult to predict what, say, software developers
| will do in a few years, or what demand, salaries, or
| competition will look like.
|
| Look at this final video of the 2012 Deep Learning course by
| Hinton that I still remember from a long time ago:
| https://m.youtube.com/watch?v=FOqMeBM3EIE
|
| What I do know however is this:
|
| - Short term, nothing special will happen.
|
| - In the actually interesting projects I worked on, I always
| ran out of time. So much more could have been done, but there
| was no time or budget to do it. I'm looking forward to AI
| making a bit of a dent in this.
| rafaelero wrote:
| Chill out, people. Humans are still great generalists. We are
| perfectly capable of leveraging these tools to amplify our
| productivity. It's only the specialists among us who are
| going to suffer from these new developments. All this AI
| innovation is really showing us how limited our ability to
| deeply understand and specialize in something is. We are
| always going to lose to computers, be it in chess, go, or
| art. Therefore, we should cultivate our generalist skills and
| stop fighting AI progress.
| mdrzn wrote:
| This should have (2019) in the title
| dredmorbius wrote:
| Um, why? (As submitter.)
|
| The linked item was posted within the past 24 hours. The
| referenced images also appear to be current so far as I can
| tell.
|
| (I'd looked for a more substantial post or article without luck
| when submitting this.)
| mdrzn wrote:
| My bad, the mastodon thread is actually fresh, I got it mixed
| up with the article linked in the 3rd reply to it, which is a
| 2019 story:
|
| https://waxy.org/2019/12/how-artists-on-twitter-tricked-
| spam...
| dredmorbius wrote:
| Thanks.
| silent_cal wrote:
| The way these image-generating neural nets are trained is
| illegal. They copy and use other artists' work without asking
| them or paying them. There's a lot of legal exposure here - why
| hasn't anyone taken advantage of that yet?
| OctopusLupid wrote:
| What makes you say it's illegal?
|
| In the EU, UK, Japan and Singapore, it is explicitly legal to
| train AI on copyrighted work. I saw another comment say that AI
| companies train in those countries.
| ChadNauseam wrote:
| In the US we have fair use, and it's not clear at all to me
| that this wouldn't count. If I took every image on artstation
| and averaged all of them (creating a muddy mess), I think I
| would be legally able to distribute the result without
| compensating or crediting the original artists.
| benreesman wrote:
| I'm going to be a broken record here: both of the words
| "artificial" and "intelligent" are hellaciously difficult to
| define; put them together and you've got a real
| epistemological quandary on your hands.
|
| What we're actually always talking about is "applied
| computational statistics", otherwise known as ML.
|
| And if an artist wants to sample from the distribution of
| beautiful images, paintings, and photographs as a source of
| inspiration, why not? We do it in other fields.
|
| But using a computer to sample from that same distribution and
| adding nothing will be rightly rewarded by nothing.
| BurningFrog wrote:
| A fascinating angle I heard recently is that when the new tech of
| photography swept the world, it made tons of painters unemployed.
|
| And that was the main reason for "modern art". A camera can
| produce a portrait or landscape instantly and more precisely
| than a painter, but it can't compete on abstract or imagined
| pictures.
|
| Will something analogous happen when AI takes over other
| industries? I have no clue, but it will, as always, be
| interesting to see what happens.
| dredmorbius wrote:
| Any references you can recall on the emergence of modern art as
| a response to photography?
| BurningFrog wrote:
| Not really. Don't remember where I read it. It was a few
| months ago.
|
| I like the explanation a lot, and I think the timelines line
| up pretty well.
|
| But sure, it could be one of those stories that _sound_ true
| but isn't.
| dredmorbius wrote:
| Thanks.
| Karawebnetwork wrote:
| If someone builds an AI self-driving car and feeds it images
| of Honda cars, should the company be required, under threat
| of legal action, to remove the Hondas from the model? What if
| this makes the model less accurate and causes more accidents?
|
| In other words, I am wondering if the current issue here is the
| model being trained or the model being able to generate images.
|
| Coming back to my example: if the car displayed the closest
| vehicle on the HUD, would Honda ask the car company to
| replace the likeness of their car with a generic car icon, or
| would they ask for the model to be scrubbed?
| anothernewdude wrote:
| So either they are hypocrites, or art can be made by AI. Probably
| both.
| sircastor wrote:
| I feel for artists who feel like they're losing their livelihood.
| Art has always been a tough profession, and this doesn't help
| because late-stage capitalism all but guarantees that a lot of
| potential customers will just skip the human-made article in
| favor of the "good-enough" mechanical production.
|
| That said, automation is coming for all of us. The problem is
| not "we need to stop these AIs/robots from replacing humans."
| It's "we need to figure out the rules for taking care of
| humans when their work is automated."
| agomez314 wrote:
| Probably because they disdain AI being used to copy their IP
| and distribute it at "machine" scale? I'm not an artist
| myself, but I can imagine I'd be pissed off that a bot is
| replicating my art with random changes.
|
| HOWEVER, if a person were to ask for permission to use my
| pictures to feed into an AI to generate a number of images, and
| that person _selected_ a few and decided to sell them, I wouldn't
| have a problem with that. Something to do with the permission
| provided to the artist and an editing/filtering criteria being
| used by a human makes me feel ok with such use.
| alxlaz wrote:
| What you're describing is basically copyright, which is
| exactly what artists are demanding: the legal protection to
| which they are entitled.
|
| Edit: Silicon Valley exceptionalism seems to prevent some
| thought leaders in the field from remembering the full
| definition of copyright: it's an artist's exclusive right to
| copy, distribute, _adapt, display, and perform a creative
| work_.
|
| A number of additional provisions, like fair use, are meant to
| balance artists' rights against _public_ interest. Private
| _commercial_ interest is not meant to be covered by fair use.
|
| No one is disputing that everyone, including companies in the
| private sector, is entitled to using artists' images for AI
| research. But things like e.g. using AI-generated images for
| promotional purposes are not research, and not covered by fair
| use. You want to use the images for that, great -- ask for
| permission, and pay royalties. Don't mooch.
| phpisthebest wrote:
| Copyright (in the US) also includes fair use provisions,
| under which education and research are fair uses of
| copyrighted work for which no permission from the artist is
| needed.
| beezlebroxxxxxx wrote:
| > fair use provisions of which education and research is a
| fair use
|
| I don't think people are debating fair use for education
| and research. It's the obvious corporate and for profit use
| which many see coming that is the issue. Typically,
| licensing structures were a solution for artists, but "AI"
| images seem to enable for-profit use by skirting around who
| created the image by implying the "AI" did, a willful
| ignorance of the way that the image was
| generated/outputted.
| phpisthebest wrote:
| >>I don't think people are debating fair use for
| education and research. It's the obvious corporate and
| for profit use
|
| Sounds like you are, because in copyright law there is no
| carve-out for non-profit education/research only. Research
| and education can be both for-profit and non-profit;
| copyright law does not distinguish between the two. It sounds
| like your claim is that research can only ever be non-profit,
| but given that the computing sector in large part owes itself
| to commercial research (e.g. Bell Labs), I find that a bit
| odd.
| beezlebroxxxxxx wrote:
| Doesn't fair use make a distinction in the _use_, though?
| Fair use for commentary, for instance, is not the same as a
| company presenting marketing images as _theirs_ in the
| selling of a product. If someone has legally protected their
| artwork, you can't just apply a Photoshop layer to it and
| claim it is _yours_ as fair use, right? The issue seems to be
| almost more about provenance.
| phpisthebest wrote:
| >If someone has legally protected their artwork, you
| can't just apply a photoshop layer to it and claim it is
| yours as fair use though, right?
|
| That depends on what the layer was, and there are current
| cases heading to the Supreme Court that involve something
| similar, so we may see.
|
| However, commentary is just one type of fair use and would
| not be a factor here, nor is anyone claiming the AI is
| reselling the original work. The claim is that copyright law
| prevents the unauthorized use of a work in the training of
| AI. But AI training could (and likely would) be treated as
| research, and the result of that research is a derivative
| work wholly separate from the original and created under fair
| use.
| Double_a_92 wrote:
| The copyright to what exactly though? Imagine you're an
| artist that draws abstract paintings of trees. If an AI uses
| those, the results it produces will be generic abstract trees
| _in your style_. And since I doubt that you can copyright
| trees, you would have to copyright your specific style. But
| is that possible?
| astrange wrote:
| It is not possible. And EU law (which is where these models
| were trained) has explicit allowances for machine learning
| anyway.
| yreg wrote:
| Copyright doesn't protect your art against being copied (heh)
| by other artists.
|
| Artists have always been inspired by each other and copied
| each other's styles and ideas.
| patientplatypus wrote:
| amelius wrote:
| Not just artists, also product designers like Jony Ive.
| esotericsean wrote:
| This will be a losing battle for artists. Anyone can train on
| any data they want. It's the equivalent of a human learning
| to draw in someone else's art style or to take photos like a
| famous photographer. There is no stopping it now, and it's
| only going to get better and easier. Video is getting close
| to being just as accessible as image or text generation.
| Regardless of how you feel about all this, there's no
| stopping it. It's the future.
| vgatherps wrote:
| As a thought experiment, let's say that the next version of
| stable diffusion is able to integrate large text datasets into
| the training set and can generate an accurate Mickey Mouse
| without ever having to be trained on an image of Mickey Mouse
| since it's integrated enough information from the text.
|
| What then? Certainly an individual artist can't go and sell
| images of Mickey Mouse since it's still copyright infringement,
| but what claim would Disney have against the AI company?
|
| I wrote in another comment that if you make the training of such
| models illegal regardless of distribution, it's essentially
| making certain mathematics illegal. That poses some very
| interesting questions around rights, whether others will do it
| anyways, and the practicality of enforcing such a rule in the
| first place.
| crote wrote:
| Would an SVG image count as text? How about an SVG
| automatically transformed into human-readable language?
| [deleted]
| adenozine wrote:
| Nobody, because these aren't the first. Aviation, gaming,
| finance, logistics, etc.: there are huge industries already
| inundated with AI tools.
| smrtinsert wrote:
| This is like monks fighting the printing press. Sorry guys, this
| is only going one direction.
| rozgo wrote:
| In our game studio, engineers are creating lots of developer
| art on their own. But the real productivity booster is coming
| from artists using language models to generate entire art
| pipeline scripts: several Python scripts that automate
| Blender3D and offline asset post-processing. Many artists are
| also changing shaders by asking language models to modify
| existing code.
| jmyeet wrote:
| One bone to pick: this says "artists" are fighting this and
| mentions Disney, Nintendo and Marvel. "Corporations" would be
| more accurate than "artists".
|
| Training a model with artists' work seems completely fine to me.
| If something is out in the world and you can see it, you can't
| really control how that affects a person or a model or whatever.
|
| The actual issue is reproduction of trademarked and copyrighted
| material. There are already restrictions on how you can use
| Mickey Mouse's likeness in any derivative work. That's not an AI
| issue. It's an IP issue. The derivative works are no different
| than if I, a person, produced the same derivative work.
|
| It would be funny to me if we had to turn our attention to
| training AIs in IP law.
| yeknoda wrote:
| If regulation is found to be necessary, here are some options
|
| - government could treat OpenAI like an electricity utility,
| with regulated profits
|
| - OpenAI could be forced to come up with compensation schemes
| for the human source images; the more the weights get used,
| the higher the payout
|
| - the users of the system could be licensed to ensure proper use
| and that royalties are paid to the source creators. We issue
| driving licenses, gun licenses, factory permits etc. Licenses are
| for potentially dangerous activities and powers. This could be
| one of those.
|
| - special taxation class for industries like this that are more
| parasitic and less egalitarian than small businesses or
| manufacturing
|
| - outright ban on using copyrighted work in ai training
|
| - outright ban on what can be considered an existential
| technology, as has been the case for some of the most
| important technologies of the last 100 years, including
| nuclear weapons
| charlescearl wrote:
| The title is an erasure of the minoritized workers who've been
| exploited in labeling and curation and moderation who've been
| raising concerns; it's an erasure of the many who've been raising
| concerns about the misogyny and predation involved in the
| construction of the data sets (e.g. https://www.image-net.org/)
| which make these models possible
| https://arxiv.org/abs/2006.16923.
|
| Marx makes the case in the Grundrisse
| https://thenewobjectivity.com/pdf/marx.pdf that the
| automation of work could improve the lives of workers -- to
| "free everyone's time for their own development". Ruth Wilson
| Gilmore observes that capital's answer is to build complexes
| of mass incarceration & policing to deal with the workers
| rendered jobless by automation https://inquest.org/ruth-
| wilson-gilmore-the-problem-with-inn... -- that is, those who
| have too much "free" time. In such a world, Marx speculates
| that wealth is "not command over surplus labour time" (real
| wealth) "but rather, disposable time outside that needed in
| direct production", but Gilmore reminds us that capital's
| apparent answer to date has been fascism.
| Imnimo wrote:
| If you create a trademark-violating image using an AI model,
| does that demonstrate anything more than that the particular
| image is infringing? It's also infringing if I hand-draw
| those images; the fact that they're AI-generated doesn't
| enter into it.
| residualmind wrote:
| You don't need AI to create these, you just have to be a d*ck.
| Juliate wrote:
| Lawyers.
|
| Lawyers are going to have a lot of fun$$ with the
| copyright/trademark violation flood that is coming (and not only
| for high profiles).
| bigbacaloa wrote:
| One traditional way of learning to make art was to go to the
| museum and copy the works of masters ... What's the
| difference in principle if one trains an AI on them?
| dredmorbius wrote:
| Work-factor, most obviously.
|
| Targeting and distribution as well. AI has the edge on
| individual creators here.
| rvz wrote:
| People who are not techies but have a clue that Stable
| Diffusion and DALL-E were trained on copyrighted images
| without permission or attribution/credit knew this? This was
| absolutely unsurprising [0] [1].
|
| Stability AI knew they would be sued into the ground if they
| trained their music-generating equivalent, the 'Dance
| Diffusion' model, on thousands of musicians' works without
| their permission, so they used public domain music instead.
|
| So of course they think it is fine to do it to artists'
| copyrighted images without permission or attribution, as many
| AI grifters continue to drive everything digital to zero.
| That also includes Copilot being trained on AGPL code.
|
| [0] https://news.ycombinator.com/item?id=33902341
|
| [1] https://news.ycombinator.com/item?id=33005559
| phpisthebest wrote:
| Anything that weakens copyright should be supported.
| Copyright has expanded well beyond its original goals and in
| fact now harms them.
|
| Copyright (in the US) was NOT created to protect creators; it
| was created to encourage creation and advance science. Today,
| copyright is being used to curb and monopolize creation and
| to prevent advancement (case in point: this very story).
| crote wrote:
| On the other hand, copyleft licenses _are_ being used to
| protect creators. Without copyright protection, what is
| stopping companies from blatantly violating even more open-
| source licenses?
| phpisthebest wrote:
| Note I never said anything about eliminating copyright
| completely; I said weaken it.
|
| The original copyright term was 14 years, not life.
___________________________________________________________________
(page generated 2022-12-15 23:01 UTC)