[HN Gopher] Adobe is buying videos for $3 per minute to build AI...
___________________________________________________________________
Adobe is buying videos for $3 per minute to build AI model
Author : marban
Score : 115 points
  Date   : 2024-04-11 11:05 UTC (1 day ago)
(HTM) web link (www.bloomberg.com)
(TXT) w3m dump (www.bloomberg.com)
| SilverBirch wrote:
| I'm kind of surprised by this. I understand there's basically two
| views: Content creators say "Hey, this is my stuff, you don't
| have permission to use it, you need to pay me" and there's
| Silicon Valley: "I'm allowed to look at your images, feeding your
| images into my machine is the same thing". Legally, the first
| view seems probably correct, but from a "history of silicon
| valley" view, breaking the rules in order to gain a competitive
| advantage in the market has _always_ been the better strategy.
|
| It'd be like if Uber launched by applying for NY taxi medallions.
| So this seems like a crazy risk:reward ratio here. Adobe is going
| to end up with a massive bill, a weak model and the _hope_ that
 | the guys who steal everything will face consequences. We've
 | _never_ seen Silicon Valley face consequences like that in the
 | past, so I don't see why you would bet it's going to happen this
| time.
| artninja1988 wrote:
 | The whole differentiator of Firefly, the Adobe image
 | generator, is that it's trained on their licensed stock
 | image library and is commercially safe. Besides that, even
 | OpenAI is licensing some high-quality content from
 | proprietary sources. Even if the open Internet is deemed
 | fair use, there's still a lot of content locked away to license.
| dehrmann wrote:
| Adobe is an established player with real customers who don't
| want to deal with copyright issues. Telling them "this is safe
 | to use _because we paid for it_" solves that issue.
|
| With Uber, no one's suing individual riders or drivers over an
| illegal taxi ride.
| HeatrayEnjoyer wrote:
| This is such a tectonic shift for what the concept of media
| even means that any assumptions are on uncertain ground.
|
| The courts could find that it isn't possible to sign away
| these rights. There are many other rights that the law does
| not permit to be signed away for much more tame reasons than
| "enables further creations of 'your' work (or in the case of
| actors, literally 'you') not only without your actual
| involvement, but even long after your death". I would expect
| that it at least comes with a time limit. "Artists can sign a
| license to use these rights but only until their death + 10
| years, after that the licensee can't use anything trained on
 | their work. Artists also have an irrevocable right to cancel
| the license at any time with 90 day notice." might be one
| possible outcome.
| alemanek wrote:
| That seems like a crazy stretch to me. Not a lawyer but the
| rights courts typically don't let you sign away are things
| like your freedom. Signing away your right to profit or
| control a video you created is super common already.
|
| Content licensing is really well established and already
| allows for licensing for specific use and different royalty
| structures based on usage. An example of this is streaming
| vs theater vs syndication.
|
| So, now there is a new venue to take into account and have
| the lawyers add a few pages to their contracts moving
| forward.
| HeatrayEnjoyer wrote:
| >courts typically don't let you sign away are things like
| your freedom.
|
| This is exactly my point. An AI that can replace you,
| personally, is closer to signing away your identity or
| freedom than the rights to display a specific already
| completed work.
| spaced-out wrote:
| The court telling an artist they're not allowed to sell
| their videos to the highest bidder seems like a greater
| infringement on their freedom. What if no one wants to
| pay for this person's videos except for an AI company?
| You're basically telling them they're not allowed to
| profit off of their work.
| bongodongobob wrote:
| You can't just make up new definitions of freedom, that's
| not what freedom is.
| tylerchilds wrote:
| i was at adobe summit a couple weeks back, the push was ai
| heavy-- the strategy is 100% around no copyright issues.
|
| observationally they're in an interesting position, balancing
| their artistic clients and their executive clients. a clean
| model is the easiest way to hedge their portfolio and
| reputation.
| Ekaros wrote:
 | Artists don't want their work taken for free. And executive
 | clients know that court systems can be extremely fickle...
 | It can go one way or another depending on the jurisdiction,
 | and even one big enough case going wrong can be expensive.
| whywhywhywhy wrote:
| The Artists it was trained on already signed rights away
| to Adobe when they put their work for sale on Adobe Stock
| so they'll get what they're given really.
| Ekaros wrote:
| That is fair. But I was talking in general, mostly about
| material that was not sold on Adobe Stock...
| SJC_Hacker wrote:
| They would have to do the leg work to ensure that the seller
| is the legitimate rights holder.
| surfingdino wrote:
| I am struggling with the Silicon Valley's latest business model
| that seems to be based on stealing all content in order to
| train AI to replace the very creators who created that content.
| If we then replace white-collar workers with AI and blue-collar
| workers with robots... and most of the population are jobless
| who's going to be able to pay for the content, the services,
 | and the goods produced by AI and robots? Is that why the VCs
 | are in favour of universal basic income? But if we all go on
 | UBI, then what's the point of selling to us if that money
 | could go to the VCs... but then... what do they do with the
 | money if it ceases to circulate and incentivise people to
 | work and trade?
| AnthonyMouse wrote:
| > If we then replace white-collar workers with AI and blue-
| collar workers with robots...
|
| ...we would be living in post-scarcity and everything would
| be free. But that doesn't happen in the absence of AGI, what
| actually happens is that technology replaces some jobs and
| then people do the remaining jobs, which are now in higher
| demand because the things done by technology become cheaper
| and the money that had gone to pay for labor there now gets
| spent on something else, increasing demand for the other
| thing.
|
| Technology has been replacing jobs for hundreds of years and
| we still have low unemployment.
| numpad0 wrote:
| > "Hey, this is my stuff, you don't have permission to use it,
| you need to pay me"
|
| I see a lot of the first and second parts, but nowhere near as
 | often the third part: The rights holders aren't seeking
 | financial growth, they just want control in perpetuity. I suspect
| that's the part proving difficult to solve.
| AnthonyMouse wrote:
| > Legally, the first view seems probably correct
|
| It's not obvious why that would be.
|
| Artists aren't going to like this technology because it
| competes with them, but it competes with them regardless of
| whether it was their work or someone else's in the training
| data.
|
| This leads to a visceral response where they want to call this
| "stealing" and hope that the creators of the technology can be
| sued into non-existence so their competition can be eliminated.
| But as Adobe is demonstrating, that isn't going to happen
| anyway. So the question isn't whether the technology will
| exist, it's if it will be locked up behind the walls of major
| corporations. The latter doesn't do artists any good but harms
| the public -- including artists who want to leverage the
| technology in their art. So why should the law protect Adobe's
| moat from the public?
| SilverBirch wrote:
| The reason I think the first view seems more correct is that
 | it's like downloading a song from spotify. Yes, you would
 | think streaming a song from spotify is technologically
 | identical to downloading it, but legally there is a
| distinction. If you found a way of ripping a copy of a song
| from spotify there would be a record company ready to sue you
| and a law they could use to do it.
|
| It's theoretically true that the models could be trained with
| someone else's training data, but there's a flaw in that
| argument. If you _can_ train with other data without these
 | legal issues, why _don't_ they? And the answer is actually
| because they do assign some value to that training data, and
| there's not an infinite supply and it's actually quite
| difficult to get large sets of quality data.
|
 | I think it's a pretty open question how this will resolve. It
 | could be like streaming music, where companies like spotify
 | are little more than puppets for the major record labels. It
 | could end up like youtube, where the model started with "We're
 | going to steal stuff" and ended up "We're going to strongly
 | enforce copyright now that we're the incumbent", or some other
| third way. But I don't expect the "We're going to take
| everything with no regard to the existing legal framework"
| will sustain long term.
| AnthonyMouse wrote:
| > The reason I think the first view seems more correct is
| that it's like downloading a song from spotify. Yes you
 | would think streaming a song from spotify is
 | technologically identical to downloading it, but legally
 | there is a distinction. If you found a way of ripping a
| copy of a song from spotify there would be a record company
| ready to sue you and a law they could use to do it.
|
| But the law would probably be DMCA 1201 for circumventing
| the DRM rather than normal copyright for making the copy,
| which is much more ambiguous. Also, record companies like
| to sue people, that doesn't mean they're right and provides
| no indication of what the law _should_ be. You could just
| as easily pick some other example, like whether you can rip
| a music CD or vinyl you bought to put it on your iPod,
 | which the record companies might not _like_ to be allowed,
 | but that doesn't mean that it isn't.
|
| > It's theoretically true that the models could be trained
| with someone else's training data, but there's a flaw in
| that argument. If you _can_ train with other data without
 | these legal issues, why _don't_ they?
|
| To which the answer is that they do. A lot of models are
| trained on arbitrary content from the internet.
|
| As to why Adobe in particular is doing this, think about
| it. It's because that interpretation benefits them rather
| than the artists, by creating a moat where companies who
| already have licenses to bulk stock images etc. are the
| only ones who can create a model, rather than having lots
| of competitors because anyone can create one and many of
| them are free and can be run locally.
|
| > I don't expect the "We're going to take everything with
| no regard to the existing legal framework" will sustain
| long term.
|
 | Publishing companies don't like public libraries either;
 | anybody can go there and borrow a copy of any book for
 | free. That doesn't mean that libraries are bad or should
 | be illegal.
|
| Also, laws are created through the political process, which
| is not always great. The outcome "individual artists
| somehow benefit from this" isn't even in the room there.
| The two most plausible outcomes are that "tech companies"
| win and anybody can train a model on anything they can get
| their hands on, and that "content conglomerates" win and
| then this technology gets locked up as a service from only
| megacorps and the artists still don't get anything
| meaningful, but now the world has another abusive cartel
| imposing arbitrary censorship and using control over this
| to cement control over adjacent markets etc. Of these, the
| second is clearly worse.
| troq13 wrote:
| Weird spot. $3 per minute seems like a lot more than most AI
| companies are willing to pay, and a lot less than most creators
| who are not making slop would be willing to take.
| devoutsalsa wrote:
| If I'd posted 10 minute videos weekly for 5 years and someone
| offered me $7500-ish to use them as training data, I'd think
| that was a pretty good deal. YMMV.
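As a sanity check, the commenter's back-of-envelope figure works out like this. The posting cadence is the commenter's own hypothetical; only the $3/minute rate comes from the article:

```python
# Hypothetical figures from the comment above: a 10-minute video
# every week for 5 years, sold at Adobe's reported $3 per minute.
minutes_per_video = 10
videos_per_week = 1
weeks = 52 * 5
rate_usd_per_minute = 3

total_usd = minutes_per_video * videos_per_week * weeks * rate_usd_per_minute
print(total_usd)  # 7800, i.e. the "$7500-ish" ballpark
```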
| endisneigh wrote:
 | Depending on the content, that's a terrible deal. A lowball
 | estimate is an hour of work to produce a single minute of
 | video, which means you're being paid 3 bucks an hour. If
 | it's animation it could take a day to animate a single
 | minute, in which case you're being screwed even harder.
|
| Suckers who don't properly value their time is why BigCo
| generally wins.
| altdataseller wrote:
| Did they say they're buying only high quality videos? What
| if it's just a video of me talking through my coding
| sessions? It would be rather low effort and sounds like
| "free money" to me
| doublerabbit wrote:
 | Would it though? If they train on your video, they can
 | produce future videos based off that video and charge
 | for the next one.
 |
 | That then takes away your next video. So you got paid a
 | lowball amount only to be ripped off.
| replygirl wrote:
| would it though? nothing would be stopping them. the idea
| something would relies on an assumption that we'll soon
| be able to generate long, coherent, and useful
| instructional videos with fully resolved text on demand,
| with such high quality and low cost that no one will be
| able to compete. but we already have people out there who
| can do this instruction/review live and off the cuff, and
| who would certainly be able to make use of this stuff in
| their own work
| doublerabbit wrote:
| Yes it would.
|
 | Because it restricts you from creating such style videos.
 | You're being paid a pittance for a bot to learn your style.
|
| Why wouldn't you want more money?
| AnthonyMouse wrote:
| > Why wouldn't you want more money?
|
| Because that style isn't that unique, so the difference
| is between getting paid nothing and now there is a bot
| that can do the same thing, or getting paid something and
| now there is a bot that can do the same thing.
|
| It hasn't even been established that they're required to
| pay you at all.
| doublerabbit wrote:
 | My style is unique. If I'm teaching folks something I'm
 | skilled in, I wouldn't want to be learned from for $3,
 | especially by a $$$ company that abuses trust.
 |
 | Style is how teachers make learning happen. If teachers
 | follow generic textbook mundaneness, students learn far
 | less than if they apply their own style.
|
| Besides, if they were using the content without my
| permission I should be allowed to seek costs for such.
| AnthonyMouse wrote:
| > My style is unique.
|
| "You're unique, just like everybody else."
|
| It doesn't matter if the thing is using your exact style
| or one which is enough of a substitute for it that the
| difference isn't going to make up the difference between
| your $50 fee and the $0.01 in electricity it takes to
| have the AI do it.
|
| > Besides, if they were using the content without my
| permission I should be allowed to seek costs for such.
|
| On what basis? How is it different than someone teaching
| their students your style, so the students can make their
| own original works in the same style? It's directly
| analogous to classroom use, which is an explicit example
| of fair use from the copyright statute.
| doublerabbit wrote:
 | This is moot. For the sake of sanity: I've said what I
 | wanted to say and I'll agree to disagree.
 |
 | This just shows that anyone is willing to be cloned for
 | less than their actual worth, which each person calculates
 | on their own basis.
 |
 | If you'd rather be ripped off, with some AI bot teaching a
 | class for $5 in your own teaching style while you earn a
 | single $3, than teach yourself and earn $5 from each
 | class, be my guest.
|
| Edit: I'm now post capped, so can't comment/reply on HN
| for another four hours anyway. Old news.
| replygirl wrote:
| keeping in mind $0.01 for something like an hour lesson,
| or a full class, is entirely theoretical
| AnthonyMouse wrote:
| The AI is the thing being taught, not the thing teaching
| a class. Once you have a model, $0.01 is the correct
| order of magnitude for the cost of generating an image
| from a prompt. If anything it's an overestimate.
| replygirl wrote:
 | it's barely short of what a 1920x1080 image costs from
| openai, but we're in a thread about instructional video,
| which is neither economical nor available yet
| AnthonyMouse wrote:
| Video would cost more than still images for the obvious
| reasons, but still likely much less than the cost of
| however many frames per second times that number of
| images, because nearly all of the frames will be minor
| variations on the previous frame. Meanwhile it's going to
| be a couple years before that technology exists because
| you'd have to develop something that can sync video with
| audio etc., by which point the hardware would be more
| power efficient.
|
| So now we're speculating on the cost of something that
| doesn't exist yet, but it's highly likely that hardware
| is going to get more power efficient over time, so the
| question then isn't whether "AI can do this for a lower
| price than humans" will happen, it's just a question of
| how long before it does.
| replygirl wrote:
| it restricts one from making more videos? how? i
| understand your comment as reiterating the assumption i
| was responding to then reframing that assumption with a
| generalization. of course fair pay is good, and my point
| is we don't have a solid foundation to assume ai will
 | impact that more than any other creative technology we've seen
| on computers in the last few decades
| devoutsalsa wrote:
| You can sell it more than once.
| replygirl wrote:
| i reckon nearly zero of these creators are making videos
| for a single viewer. and as a film student, or even as a
| professor, you wouldn't spend $300 per movie just to
| reference in study. demanding even more than that can only
| be hedging against the idea that one will be out of a job
| sebzim4500 wrote:
| Yeah but if you have already made the content then you're
| getting $7500 for free.
| endisneigh wrote:
| Per the article you're paid to make new content.
|
| Even if that were not true, releasing the rights for so
| little is dumb.
| 0cf8612b2e1e wrote:
| As opposed to the Pile where the authors got nothing for
| the privilege of their content being used to train the
| LLMs.
| lightedman wrote:
| "Low ball is it takes an hour to produce a single minute of
| video."
|
| As some of us like to say, "Fuck it, we'll do it live." As
| in, that's the only take you get and it goes right to
| production zero editing and processing. Everything gets set
| once, fired once, and it gets uploaded immediately.
| asah wrote:
| Price is determined by the value to the purchaser, not the
| cost of production. Particularly for digital products with
| no per-use COGS.
| rakoo wrote:
| If I learn through the press that Adobe is willing to pay
 | amount A and this information is very public, I can at the
 | very least assume they are making 10x more, so I'm willing
 | to sell it for 20x more and see where this goes
| baobabKoodaa wrote:
| These AI video systems will need huge amounts of random
 | videos showing things like leaves rustling in the wind, cars
 | driving, etc., so you don't even need to put in 10 hours of
| effort to create a 10 minute video, you can just grab your
| camera and go for a walk and publish that as it is, and it
| will be valuable training material.
|
| $3 per minute would be a pretty high price to pay for "man
| walks outside" video.
| doublerabbit wrote:
| Why wouldn't you want more?
| ysavir wrote:
| If I was in that situation, the last thing I'd want is to
| provide an AI video generator the means of creating videos
| like mine and that would compete with me.
| paxys wrote:
| If it's a non exclusive deal then of course they'll take it.
| troq13 wrote:
| "you already used these socks, why wouldn't you sell them to
| a stranger for $3"
| voxic11 wrote:
 | It's a non-exclusive deal so you can film something for your
| youtube channel and make money there while also getting some
| extra bucks from adobe.
| Legend2440 wrote:
| Videos that are good for training data are very different from
| videos that are good content.
|
| They do not want theatrical productions, they want raw footage
| of the real world.
| htrp wrote:
| have they fixed the image generation in their firefly models yet?
| retskrad wrote:
| So are we just going to let OpenAI, Google, Facebook, Microsoft
| and others to just steal the whole internet, including videos on
| YouTube, and train their models for free without compensating
| creators? Or will they get away with it because there were no
| laws that prohibited their behaviour?
| jsheard wrote:
| Adobe is pretty much banking on the "scrape everything"
| approach getting too bogged down in legal problems to be
| commercially viable in the long term. They're going out of
| their way to only train on licensed material so they (and Getty
| Images) will be the last ones standing if the worst case
| scenario happens for the companies who thought they could just
| take everything, and in the meantime they get the business of
 | risk-averse customers who don't want to chance it until it's
| settled one way or the other.
|
| It's a decent lobbying tactic too - companies like OpenAI claim
| their technology can only exist with unrestricted scraping, and
| requiring them to license training data would kill the whole
| industry in the crib, but Adobe and Getty can point to their
| own products as existence proofs that it can be done but OpenAI
| just doesn't want to.
| Dalewyn wrote:
| I applaud their approach, since it's obviously in line with
| traditional concepts such as _fucking paying someone for
| their time and work_.
| pavlov wrote:
| Still, it's possible to end up in a situation where people
| are getting paid and in theory it's an improvement over a
| previous lawless IP situation, but in practice it still
| leaves creators in the cold.
|
| That's basically what happened with Spotify. Artists are
| getting paid something for streaming, but it's fractions of
| pennies on the dollars they used to make from music sales.
| Dalewyn wrote:
| It's better to be good than not be perfect in the face of
| being bad.
| cma wrote:
| Didn't they quickly change terms before people knew what
 | was going on with the Stock AI stuff, and only do any press
| for it after it was available but too late to opt out?
| paxys wrote:
| And by being the first mover they get to dictate prices as
| well. Content creators will sell for whatever because the
| alternative is $0. Once more companies get involved and start
| bidding the value of these catalogs will undoubtedly go up.
| andy_ppp wrote:
| It's amazing how quickly laws get changed when there's
| potentially billions of dollars that can be extracted.
| Hoasi wrote:
| Sound strategy by Adobe.
| doctorpangloss wrote:
| Adobe's stuff isn't only trained on licensed material. They
| are as fucked as everyone else.
|
 | There doesn't exist an LLM trained only on expressly licensed
| data that works well enough for the image diffusion
| conditioning task. So unless they made some kind of huge
| scientific discovery, which they didn't, they are using a
| text model like T5 or CLIP to achieve the "text" part of
| "text to image", which of course is trained on not expressly
| licensed data.
| chriskanan wrote:
| I'm guessing they will then lobby to ban open source AI
| models, especially their usage in commercial applications.
| One would need a huge bank account to create models in the
| future. I think this would have a chilling effect on the AI
| ecosystem. In essence it's a form of regulatory capture.
| jsheard wrote:
| You already need a huge bank account to create models, the
| (quality) open source ones are all hand-me-downs from
| companies burning through huge stacks of investor cash and
| giving away the spoils. It should go without saying that
| that won't last.
| artninja1988 wrote:
 | I mean, it's in the crowdsourcing range for diffusion
 | models and going to get cheaper. It currently costs 50k
| to retrain stable diffusion 2
| (https://www.databricks.com/blog/stable-diffusion-2). So
| yeah, mandatory licensing of data on the open web would
| absolutely kill university projects and small startups.
| The road to hell is paved with good intentions, as they
| say
| jsheard wrote:
| That's 50k if you already know the exact architecture
| you're going to use, and the exact training data you're
| going to use, and nail it all on the first try. Multiply
| that by however many failed attempts it takes while
| experimenting with new ideas, especially when the
| corporate players stop being so eager to publish the
| finer details of their research, and that 50k could very
| quickly turn into millions.
| artninja1988 wrote:
| Yeah, it's not going to be SOTA but existing models are
| already good enough for a reasonable portion of use
| cases. Even if they lag behind it's important to have
| free models.
| kozikow wrote:
 | No no, let them steal the internet first, generate synthetic
 | training data, and then outlaw it to ensure no one else catches
| up.
| mewpmewp2 wrote:
| Is there going to be a new term called "Data Laundering"?
|
| You create the synthetic data, move it through 10s of
| entities and then buy it back cheaply from an entity.
| datadrivenangel wrote:
| It's the fireside monopoly approach.
|
| Break the law so you can undercut the real producers, and
| then buy the rights when they're broke and everything works
| out for you!
| winstonprivacy wrote:
 | Data Laundering is an old term... I was using it more than
| a decade ago in talks about how unethically gathered
| private data was being sold to Israeli companies, who then
| licensed it back to US corporations. It was (probably still
| is) a way to side step privacy laws.
| whywhywhywhy wrote:
| This is why open models trained with disregard for IP are
| important.
|
| It's going to happen anyway, the tech can either be free to
| everyone so we all benefit or behind a paywall. Those are your
| two options there isn't a 3rd if we're talking reality not
| fantasy.
| kristopolous wrote:
| It's called enclosing the commons and the answer is a
| resounding "yes!"
| deadbabe wrote:
| Most of a creator's content is consumed for free as is by their
| audience. Their compensation arrives in the form of payments
| from a few large subscribers or from an ad distribution
| platform or other such sponsorship deals.
|
| I'm not sure why you and others think that using someone's work
| as inspiration for creating some new work requires payment. Are
| we required to pay someone when sharing a meme, or using a
| sound for a reel or a TikTok? A lot of these creators are just
| creating shit they saw someone else create.
| ChrisMarshallNY wrote:
| _> Are we required to pay someone when sharing a meme_
|
| I suspect that, if the image in the meme is copyrighted, the
| answer is "yes."
|
| I haven't come across too many memes with Mickey Mouse, but I
| do see the occasional Marvel one (I have forwarded these,
| myself).
| deadbabe wrote:
| Under the OP's logic, you and everyone who shared those
| memes owes money. Everyone.
| ChrisMarshallNY wrote:
| Yup. Welcome to the RIAA's worldview.
|
| It can get nasty.
| qeternity wrote:
| There is a little something called "fair use".
| ChrisMarshallNY wrote:
| Yeah...that gets fuzzy.
|
| Satire of the work is allowed, but I don't think using
| the work, itself, in satire of something else, is
| allowed.
|
| I think that the example a lot of folks use is the
| "Pissing Calvin" decal.
|
| Watterson never allowed any commercialization of his
| characters. Every "Pissing Calvin" decal is actually a
| copyright violation, but I think that they never enforced
| it, so it may be considered "OK" to use, as the copyright
| has been allowed to wither.
|
| But IANAL. Things get sticky, here.
| deadbabe wrote:
| The original work isn't used. It's satirized in the form
| of training data that a model will understand.
| rightbyte wrote:
| Family Guy made a scene for scene recreation of Star
| Wars.
| AmericanChopper wrote:
| I don't think there's any legal issue with using data for
| training. I think the problem is how do you subsequently
| prevent your model from creating copyright violations. If you
| watch a Spider-Man movie and take some inspiration from it,
| you know that you go and make a movie with similar themes or
| whatever, but you can't just go out and create your own
| Spider-Man movie. An AI model doesn't know this, and I don't
| know how you'd teach it this concept. Especially when
| properly educated humans frequently have disputes about what
| is/isn't allowed.
| deadbabe wrote:
| The copyright isn't violated unless shared. So it's the end
| user's responsibility. Not the AI model.
| AmericanChopper wrote:
| It was shared when OpenAI (or whoever else) created it
| and then shared it with you (or likely sold it to you if
| you're a subscriber).
|
| But even if you think that's fine it makes using AI
| models for your own commercial applications more risky.
| SJC_Hacker wrote:
 | Most likely these models are going to be hidden behind the
 | network. Aside from the issues of size, which will
 | probably run into the terabytes, I can't see companies
 | willing to risk including them in downloadable code,
 | given how expensive they are to generate, for fear of
 | being copied.
|
 | A single network transfer alone is enough to qualify as
 | "sharing".
|
| Heck some of the content creators are even arguing that
| the models are essentially a form of lossy compression,
| because no one quite knows exactly how they work.
|
| Also the users probably won't be the ones with deep
 | pockets, unless it was a studio. And the lawyers are
 | generally going to go after the ones with deep pockets.
| SJC_Hacker wrote:
 | Spider-Man was introduced in 1962. Copyright runs out in 2057.
|
 | What you could do is what some of the original comic
 | book creators did when they ripped off each other's work
 | (Doom Patrol/X-Men, Quicksilver/Flash, etc.): create an
 | alternative version that was close to the original but was
 | different enough that it wasn't a straight-up copy. So it
| wouldn't be Spiderman, but Venom or something, and the suit
| wouldn't be red/blue but like maybe purple/green or
| something.
| bobcostas55 wrote:
| You know what, I _would_ download a car.
| djeastm wrote:
| I think the strategy is to succeed at all costs now then pay
| out a fine or settlement as a cost of doing business later when
| it's a fraction of their profits. Seems to be how things go.
| aqme28 wrote:
| > train their models for free without compensating creators
|
| Isn't $3/minute exactly the compensation for creators you're
| looking for?
| chinathrow wrote:
| Why would that be a fair price?
| aqme28 wrote:
| They're offering that price to creators. Beyond that,
| you're getting into the philosophical argument of what
| defines a fair price.
| SJC_Hacker wrote:
 | Because it's the highest bid?
| slyall wrote:
| Google owns youtube. I'd be surprised if their user agreement
| doesn't allow them to train on the uploads. Same with Apple and
| Facebook.
|
 | I'm not sure a dozen companies (before consolidation and
 | licensing) having control of 90-something percent of the
 | world's content is a good idea either
| glimshe wrote:
| The article is paywalled, but isn't _paying for videos_ the
| very opposite of "stealing" and "without compensating"?
| CameronFromCA wrote:
| We can only hope so.
| rchaud wrote:
| Open borders for me, paywalls and perpetual surveillance for
| thee.
| rchaud wrote:
| Open borders for me, DMCA enforcement for thee.
| 2Gkashmiri wrote:
| I demand adobe buy ALL copyrighted material at $3/minute.
|
| Disney, WB,etc etc
|
| That'll show them
| LightBug1 wrote:
| Times like this, I'm glad a lot of my shit is still on hard
| drives!
| spaceman_2020 wrote:
| Personally, Photoshop's AI tools are the weakest among all that
| I've tried. Even free tools like Remove.bg and Canva are running
| circles around Adobe's background removal
|
| Leonardo's AI tools are even better and the native background
| removal is just 10x better than Adobe's
| ackbar03 wrote:
| yea those guys completely missed the bus. I worked on some AI
| image editing tool a few years back with a partner and we were
| constantly worried Photoshop was going to suddenly make us
| obsolete, which is now kind of funny looking at where they're
| at
| HarHarVeryFunny wrote:
| $3/min = $180/hr. Pretty good pay!
|
| Do they care what the content is?!
| sentfromrevolut wrote:
| They're not buying slop; they're buying high-quality content
| like animations. It can take hours to create one minute of
| footage, so the math really runs the other direction: divide
| $3/minute by the hours of work, down to something like
| $0.50/hour, not multiply up from $3/min to $180/hr.
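A quick sanity check of the two readings of that arithmetic (the 6 hours of work per finished minute is an assumed figure for illustration, not from the article):

```python
rate_per_minute = 3.0            # Adobe's offer: $3 per finished minute
hours_per_finished_minute = 6.0  # assumed labor for one minute of animation

# Naive reading: treat footage length as labor time
naive_hourly = rate_per_minute * 60  # $180/hr

# Realistic reading: divide by actual hours of work per finished minute
effective_hourly = rate_per_minute / hours_per_finished_minute  # $0.50/hr
```

The gap between the two numbers is the entire disagreement in this subthread.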
| HarHarVeryFunny wrote:
| Biggest source of video on the net is maybe porn. Is that the
| path to AGI?
| ackbar03 wrote:
| That is beyond dystopian
| CyberDildonics wrote:
| It's also not true, so the whole idea is nonsense. YouTube
| alone in 2022 was taking _in_ 30,000 hours of video per hour.
| Every day YouTube receives more video than anyone could watch
| in their entire lifetime.
| ackbar03 wrote:
| Says cyberdildonics
| marcodiego wrote:
| Would they be interested in buying a video of paint drying[1]?
|
| [1] https://en.wikipedia.org/wiki/Paint_Drying?useskin=vector
| callamdelaney wrote:
| Tbh AI is a derivative thing; a human can read something and be
| inspired in exactly the same way. I don't see how copyright is
| enforceable here.
| lionkor wrote:
| The difference is that it's not a human, it's a machine
| designed to do that. There was a choice, that's the difference.
| everforward wrote:
| I don't think this is a great line of argument. It hinges on
| choice, which rapidly degrades into an argument about free
| will and the nature of whether anyone truly makes a choice or
| if our apparent "choices" are just a rationalization of
| deterministic response to external stimuli.
|
| I don't think there's a subjectively true answer to that
| question, so we just end up bickering over free will instead
| of AI.
|
| If you just want to declare humans to be special, I would
| just go straight to that. Humans are already legally special
| in all kinds of ways.
| graypegg wrote:
| I think making humans legally special is a good argument
| though.
|
| The whole point of copyright, is to allow artists to be
| able to generate a little income with exclusive rights to
| the works they produce. It's not just there because it's
| some inherent property of artwork: us humans decided that
| courts should punish people that threaten the livelihood of
| artists (in a very narrow scope of ways) via the judicial
| system.
|
| Of course the "spirit" of the law is meaningless, but I at
| least don't think understanding copyright law as a purely
| legal tool, divorced from the human element, is
| participating in the same conversation artists are having.
|
| Should people who just make art be able to eat? In early human
| history the answer was no. Then we advanced to a point where
| people could specialize in skills, and art made life worth
| living, so the answer became yes. Now where are we?
| visarga wrote:
| Now that generative AI is here, there will be artists who
| will try to differentiate themselves from AI, and artists
| that will embrace it to explore new possibilities. It's
| not a matter of protecting artists, but of choosing what
| artists to protect, the first or second group? Should we
| ban photography as well? How about synthesizers? Should
| homeowners ban new construction in their neighborhood
| to protect their investments?
|
| In my opinion any attempt at restricting AI training is
| also restricting people who would use it to create new
| things. And it will backfire spectacularly: if we scan AI
| output for derivative works, we automatically need to do
| the same for every work claimed to be created by a human,
| because we don't know who's using AI secretly.
|
| And it will create a chilling effect, incidental
| similarity would be too high a risk to take. Even old
| works will be revealed, their secret influences laid bare
| for everyone to see. Excessive scrutiny could deter
| artists from experimenting with new forms of expression,
| possibly leading to a homogenized art world where
| innovation is curtailed.
| graypegg wrote:
| I can get that. There's never a path forward that doesn't
| leave something on the table. I of course want artistic
| freedom + technology to advance alongside each other.
|
| But we're also talking about MAKING generative AI models.
| Not USING generative AI models. Making a model is so far
| outside of the scope of a single artist it's almost
| comical. We're talking about a task that, given no
| bounds, would include the subtask "ingest all human
| creative output". That's hard to do, so it's restricted
| to the realm of well-funded companies.
|
| But even as well off and talented as these companies are,
| they did very little of the actual work. The total scale
| of effort that goes into these big lists of coefficients,
| is beyond human comprehension at this point. We've
| literally NEVER had to contend with tasks of this
| magnitude before. Many thousands of centuries of human
| life-hours distilled. Copyright, as it's precisely
| inscribed in law, is ill-prepared to achieve its end
| goals in this situation.
|
| But considering:
|
| 1) that's kind of cool we can do this
|
| 2) it makes interesting things that could better people's
| lives
|
| 3) we still think that humans should be able to make art
| (ai or not) and not die of starvation ideally
|
| ... we need to figure out what to do. Maybe it's not
| copyright! That was created to solve a specific problem.
| Now we have new problems. I just think people latch too
| hard onto the legalese as if it's some magic property of
| the universe, and NOT a set of systems humans made, for
| very human goals. I don't want a system where we just say
| "well look at copyright, it says this is fine, so we will
| just use everything for free". It's missing the point.
|
| I agree with you that it's asking the hard question of
| "which artists do we protect". That, I'm not feeling like
| I have a good answer for right now. I definitely heavily
| lean towards the talented artists who've honed their
| craft. That's just more interesting. But I wish more
| people were thinking about it in those terms.
| visarga wrote:
| > We've literally NEVER had to contend with tasks of this
| magnitude before. Many thousands of centuries of human
| life hours distilled.
|
| And one poem or a page of text is but a drop in this
| ocean, an infinitesimal contribution. The model is much
| smaller than the training set. For example Stable
| Diffusion packs 5B images in 5GB, that's a compression
| factor of 100,000:1, about 1 byte per training image, not
| even a pixel. Taking so little from everyone, I am amazed
| how strong a reaction it creates. The final model steals
| almost nothing; it doesn't rely on any single training
| example too heavily, just a compressed abstraction of the
| training data. You have to overfit it and also prime it
| to make it regurgitate some training examples, and most
| attempts fail, resulting in novel outputs.
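The commenter's compression figures can be checked in a few lines (the ~100 KB average size for a compressed source image is an assumption, not from the thread; it's what makes the 100,000:1 ratio consistent with 1 byte per image):

```python
model_bytes = 5e9       # ~5 GB of weights, per the comment
training_images = 5e9   # ~5B training images, per the comment

# Weight budget per training image
bytes_per_image = model_bytes / training_images  # 1.0 byte per image

# Implied compression ratio against an assumed ~100 KB source image
avg_source_bytes = 100_000
compression_ratio = avg_source_bytes / bytes_per_image  # 100,000:1
```

One byte per image cannot hold even a single pixel, which is the commenter's point: the weights can only encode shared abstractions, not the images themselves.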
| gentleman11 wrote:
| Why do you think artists will be the ones to use ai to
| create art? Rather than big tech and clusters of
| corporate ais working together?
| jsemrau wrote:
| One could argue that all creativity is derivative as it is
| based on the art that came before it.
| phone8675309 wrote:
| At least Adobe is asking for consent instead of scraping a
| bunch of questionably copyrighted things without the consent of
| their creators.
|
| If you did that to Disney then you'd be in jail, but since it's
| the little guys getting fucked I guess it doesn't matter.
| gentleman11 wrote:
| A machine owned by a corporation is not a person. They don't
| get the same rights as people
| tgbugs wrote:
| Amusingly, this policy might mean that the AI model they train
| on this data will never produce video worth more than $3 per
| minute. They are probably unintentionally filtering for content
| created by people willing to sell at that price or below, and
| that bias will be present in any downstream model. You get what
| you pay for, I guess?
| summerlight wrote:
| I wonder if this is an exclusive deal for Adobe? If not, it's
| just an additional $3 per minute. Though I agree that they need
| to be flexible on the pricing.
| cheriot wrote:
| Free money for people that already have video lying around.
| They're not asking for exclusivity.
| graypegg wrote:
| How much will AI generated video cost per minute? Not from Adobe,
| just generally, where it's obtainable.
|
| There's a good scam here if you can automate video generation and
| sell it to Adobe. By setting a price for AI training video, they
| may have just set the floor price for buying AI-generated video.
|
| As people with more money than me say: arbitrage!
| whycome wrote:
| What if users started poisoning the data set by automating and
| submitting AI-created videos? lol What's the price for
| production? Set up a pipeline right through to Adobe. AI-
| laundering?
| JieJie wrote:
| Alternate non-paywalled link from MSN^0:
|
| "The software company is offering its network of photographers
| and artists $120 to submit videos of people engaged in everyday
| actions such as walking or expressing emotions including joy and
| anger, according to documents seen by Bloomberg. The goal is to
| source assets for artificial intelligence training, the company
| wrote."
|
| 0: https://www.msn.com/en-us/money/other/adobe-is-buying-
| videos...
___________________________________________________________________
(page generated 2024-04-12 23:02 UTC)