[HN Gopher] Valve is not willing to publish games with AI genera...
___________________________________________________________________
Valve is not willing to publish games with AI generated content
anymore?
Author : Wouter33
Score : 596 points
Date : 2023-06-29 16:27 UTC (6 hours ago)
(HTM) web link (old.reddit.com)
(TXT) w3m dump (old.reddit.com)
| lost_tourist wrote:
| Sounds like a market opportunity.
| lyu07282 wrote:
| This is obviously nonsense, because games and artists have been
| using AI and procedurally generated content for decades, everything
| from textures, models, maps, and animation to music and sound.
| Generative models are now even integrated into the NVIDIA drivers
| for upscaling, and every photo you take with a recent Samsung
| phone uses generative AI.
|
| Just because generative AI has now made a significant leap
| doesn't mean it's anything new. And copyright is irrelevant,
| because models are clearly derivative works in the same way artists
| remix existing works of art; if that were to change, copyright
| law would destroy the majority of all creative endeavors.
| SrslyJosh wrote:
| > games and artists have been using AI and procedural generated
| content for decades
|
| These are not the same things. Procedural generation is not the
| same as feeding different prompts to a model until it vomits up
| something that looks sort-of like what you want.
| shadowgovt wrote:
| Does that mean _High on Life_ is now banned? If I recall
| correctly, they used AI on purpose to give ads and billboards a
| nonsense alien feel.
| GaggiX wrote:
| Atomic Heart, High on Life, Hawken Reborn, Observation Duty, and
| probably many others, but no, they are not banned.
| imdsm wrote:
| Hypocrisy is alive and well
| mcpackieh wrote:
| Or maybe the developers of those games have given their
| assurance to Valve that they own all the relevant
| copyright, and therefore the same standard is being
| applied?
|
| > As the legal ownership of such AI-generated art is
| unclear, we cannot ship your game while it contains these
| AI-generated assets, _unless you can affirmatively confirm
| that you own the rights to all of the IP used in the data
| set that trained the AI to create the assets in your game._
| whywhywhywhy wrote:
| If they claimed that, then they lied, because we all know
| those game studios didn't train their own image-gen
| models without relying on the base SD models
| mcpackieh wrote:
| Maybe they did, maybe they didn't. But if the developer
| is willing to sign a document saying they own the rights,
| that's probably good enough for Valve to feel their ass
| has been covered.
| oblio wrote:
| Did you read the article?
| whywhywhywhy wrote:
| This will change within 6 months I promise you, EA/Ubisoft/etc
| will ALL be shipping AI generated textures in games before the
| end of the year.
| TaupeRanger wrote:
| That's not a change from this though, as long as those AI
| generated textures are from systems that were trained on images
| that they have permission to use (or are copyright free).
| whywhywhywhy wrote:
| No such system exists so they won't.
|
| Even Adobes system has questionable training data mixed in
| TillE wrote:
| Every major game publisher has an enormous archive of in-
| house artwork, and probably a decent amount of hardware
| they could use for training. They'd be stuck with open
| source software like Stable Diffusion - or have to cut a
| deal with Midjourney or whoever - but there's no barrier to
| creating such a system.
| nickthegreek wrote:
| Neither Stable Diffusion nor Midjourney can be used to create
| artwork under these Valve guidelines.
| teaearlgraycold wrote:
| And only a matter of time until a major game lets you talk to
| NPCs by generating dialog with an LLM.
| oneeyedpigeon wrote:
| It'll be interesting to see if a game can make good use of
| that, but it sounds like it would lead to some very
| frustrating interactions.
| wlesieutre wrote:
| The first versions of this won't be using it to create
| characters with plot significance and great depth, they'll
| be generating infinitely large piles of background
| character chatter. No matter how big your budget is, you'll
| never be able to make a world full of random civilians
| who don't sound repetitive if every line has to be recorded
| in advance.
|
| Any prerecorded line having a slight bit of personality to
| it ("I used to be an adventurer like you, but then I took
| an arrow to the knee") becomes very noticeable after the
| first time you've heard it, so the safe thing to do is have
| lots of the most inane chatter possible ("Nice weather
| today!") to make it stand out less.
|
| If the game can write new lines and synthesize voices on
| the fly, you could have more interesting lines without the
| repetition.
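|
| As a rough sketch of what that loop could look like (assuming the
| pre-1.0 openai Python client is installed and configured; the
| persona text, model name, and TTS hand-off are just illustrative):
|
|     import openai  # assumes the pre-1.0 openai client with an API key set
|
|     PERSONA = (
|         "You are Mara, a street vendor in the harbor district. "
|         "You know nothing about the player's quest. "
|         "Answer in one or two short sentences."
|     )
|
|     history = [{"role": "system", "content": PERSONA}]
|
|     def npc_reply(player_line: str) -> str:
|         history.append({"role": "user", "content": player_line})
|         resp = openai.ChatCompletion.create(
|             model="gpt-3.5-turbo", messages=history
|         )
|         line = resp.choices[0].message.content
|         history.append({"role": "assistant", "content": line})
|         return line  # hand this off to a TTS voice for the spoken version
|
|     print(npc_reply("Nice weather today, isn't it?"))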
| olddustytrail wrote:
| Have you tried it? As in, prompt the LLM that it's the
| character you want and then converse?
| evandale wrote:
| Of course it will change and Valve will even release their own
| game that's generated with AI.
|
| Governments and companies everywhere are trying to lock out small-
| time people today before they get too much traction with AI-
| generated content. They know indie devs will never be able to
| prove their model is only trained on their art. Only massive
| companies with billions of dollars can do that right now.
|
| Every big company is trying to create rules to ban AI but keep
| a small enough loophole that they can use it when the time is
| right.
| MagicMoonlight wrote:
| That seems pretty sensible. There have been lawsuits from artists
| before. Do you want to risk a game selling 10m copies and then it
| turning out that all the art was just copied and pasted by the "AI",
| with Valve now on the hook?
|
| Also, from a store perspective, games where shortcuts like this
| are used tend to be shit games. They don't want spam games to be
| pumped out. There's already enough indie trash platformers that
| nobody wants.
| vlunkr wrote:
| > Also from a store perspective, any game where shortcuts like
| this are used tend to be shit games.
|
| Yeah, they indicate that they have already submitted multiple
| games with AI generated assets, and submitted this one "with a
| few assets that were fairly obviously AI generated." Maybe I'm
| being unfair and they are making really good games, but these
| are not good indicators to me.
| LanceH wrote:
| These tools have only now become available. I can imagine a
| game where these tools are used to develop a large world and
| backstory previously not possible. The main images and text
| may be hand-crafted, but you might walk down a street where
| the other buildings are all unique, with their own names and
| descriptions. It could really flesh out some of the
| procedurally generated games out there. Or it could be
| terrible. Or it could be good for being so terrible. It
| shouldn't be rejected entirely just yet.
| eropple wrote:
| It's _not_ being "rejected entirely". That is mendacious
| editorializing. Generative AI products are being rejected
| _unless you affirm that you have rights to the entirety of
| the training data set_.
|
| What's wrong with that?
| Animats wrote:
| That you probably don't _need_ rights to the training set
| in the US, unless Congress changes copyright law. This is
| being litigated.[1]
|
| Humans can look at a collection of copyrighted images and
| draw a new picture. The legal basis for holding AIs to a
| higher standard is weak.
|
| Current litigation: [1]
|
| [1]
| https://www.theverge.com/2023/1/16/23557098/generative-
| ai-ar...
| mcpackieh wrote:
| > That you _probably_ don't need rights [...]
|
| And if Valve doesn't want to take this risk? To reiterate
| eropple, what's wrong with that?
| eropple wrote:
| _> That you probably don't need rights to the training
| set in the US_
|
| Even if that is true--and it's not sure to be--Valve is
| within their rights to demand additional coverage.
|
| _> Humans can look at a collection of copyrighted images
| and draw a new picture. The legal basis for holding AIs
| to a higher standard is weak._
|
| Horseshit and worse words. Computers aren't people. They
| create derivative works from pushing inputs through
| mathematical models. The inputs are unerasable, and the
| claims to the contrary by the AI hustler class exist
| only to let them profit off human effort without paying
| for it.
| Animats wrote:
| > Valve is within their rights to demand additional
| coverage.
|
| As a near-monopoly gatekeeper, Valve is vulnerable to
| antitrust charges.
|
| FTC is apparently about to go after Amazon.[1] US
| antitrust policy did far too little for too many years.
| EU competition policy is more aggressive.[2] The EU
| competition authorities already fined Valve for selling
| geo-blocked content that only worked in some EU
| countries. That's a violation of the basic Single Market
| rules of the EU.
|
| [1] https://arstechnica.com/tech-policy/2023/06/ftc-
| prepares-the...
|
| [2] https://competition-
| policy.ec.europa.eu/sectors/ict/cases_en
| lastangryman wrote:
| > Also from a store perspective, any game where shortcuts like
| this are used tend to be shit games. They don't want spam games
| to be pumped. There's already enough indie trash platformers
| that nobody wants.
|
| I find this hard to agree with. A game engine is a "shortcut"
| too, I can imagine people saying at some point anything
| developed with Unity would "tend to be shit games".
|
| Associating quality with visual fidelity is wrong anyway; look
| at Terraria. I'm pretty sure anyone semi-competent with AI
| generation could produce better assets, but it wouldn't help
| them produce a better game.
|
| People will use gen AI art in good games, and people will use
| gen AI art in terrible games.
| Vermeulen wrote:
| There are already successful Steam games known to have used AI
| art. And those are only the cases where the developers publicly
| admitted it.
|
| High On Life used Midjourney for its in-game posters -
| https://store.steampowered.com/agecheck/app/1583230/ -
| https://www.thegamer.com/high-on-life-ai-generated-art/
|
| And Firmament
| https://store.steampowered.com/app/754890/Firmament/ -
| https://www.pcgamer.com/firmament-ai-generated-content/
|
| So is Valve now going to remove these games from the store? This
| seems like a very terrible way to handle this - they need to
| make clear rules and make a public statement, not just start
| banning apps that they sense use AI art.
| dragonwriter wrote:
| No, the developers just need to "affirmatively confirm" that
| they own the copyright on all the works in Midjourney's
| training set, and they are good.
| sebzim4500 wrote:
| I'm sure some indie studios will sign whatever, but as soon
| as a large studio uses a public model, Steam will have to
| roll over on this one.
| eropple wrote:
| Can you explain to me how "you must affirmatively state that
| you own or have licensed rights to the training data (and if
| you're lying, the legal responsibility is yours and not
| Valve's)" is not a clear rule?
|
| And yeah, they should kick those games off for using
| copyrighted materials that they do not own.
| Vermeulen wrote:
| This is a rule developers are just finding out now from a
| game getting rejected. Pretty major deal if multi-million
| dollar budget games like High On Life should now be banned
| (even worse if they don't ban it now, making the rules
| unclear). It should have been a public statement, with a
| clear change to their developer terms.
| mock-possum wrote:
| AI doesn't copy and paste art, though; it generates new art
| based on patterns it has seen in its training material. If
| its training material heavily features red squares, and you
| prompt it to generate a new piece, chances are you'll get
| something with a red square - not because it copied that red
| square from any particular piece, but because it was a common
| element. There's a difference between reproducing common
| elements in pursuit of adhering to a style, and copying and
| pasting.
| mehlmao wrote:
| https://arxiv.org/pdf/2301.13188.pdf
|
| Stable Diffusion spits out slightly blurrier versions of the
| pictures in its training set.
| oceanofsolaris wrote:
| But that's not the default behaviour of these models at
| all. Instead you have to work pretty hard to extract these
| original images.
|
| When using the models normally, they do generate new
| content, even if the style and subjects are of course
| interpolated from training data.
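|
| For reference, "using the models normally" is just a text prompt
| in and a freshly synthesized image out; a minimal sketch with the
| Hugging Face diffusers library (the model id and prompt here are
| only examples):
|
|     import torch
|     from diffusers import StableDiffusionPipeline
|
|     pipe = StableDiffusionPipeline.from_pretrained(
|         "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
|     ).to("cuda")
|
|     # An ordinary prompt produces a newly generated image, not a
|     # lookup of a training picture; the extraction results above
|     # required far more contrived setups.
|     image = pipe("a watercolor painting of a red lighthouse").images[0]
|     image.save("generated.png")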
| Zetobal wrote:
| Without seeing the art there is nothing to judge. I bet it's a
| visual story and they are using characters from established IPs.
| Sivart13 wrote:
| 100%, my guess is this is someone putting Disney or Nintendo
| characters in compromising situations and Valve would have
| rejected it, AI or not
| brucethemoose2 wrote:
| Well what if someone ships a model exclusively trained on legal
| content?
| eropple wrote:
| Then you tell Valve that and they say "OK"?
| brucethemoose2 wrote:
| Would they though?
|
| I think the legality of the dataset would be hard to prove.
| And I can see some game devs straight up lying, shipping SD
| 1.5/Llama, and dragging Valve into court when one of them
| explodes in popularity.
| eropple wrote:
| _> I think the legality of the dataset would be hard to
| prove._
|
| Could be. I hope it is. It's a reason not to use it!
| jasonlotito wrote:
| Of note: they aren't banning AI generated graphics. Rather:
|
| > we are declining to distribute your game since it's unclear if
| the underlying AI tech used to create the assets has sufficient
| rights to the training data.
|
| It's not AI-generated graphics per se. Instead, it's AI-generated
| graphics where the rights to the training data cannot be
| established. I think that's an important distinction.
| smoldesu wrote:
| > At this time, we are declining to distribute your game since
| it's unclear if the underlying AI tech used to create the assets
| has sufficient rights to the training data.
|
| This seems like a completely fair response from Valve. On top of
| that, they gave them notice and an opportunity to remove the
| offending content (with that content explicitly called out) and
| offered to refund if that was not a viable option.
|
| If this was an iOS/Android app, they would have just been told to
| pound sand and swallow the dev fee. Good on Valve for not lapsing
| communication here.
| A4ET8a8uTh0 wrote:
| Organic only content -- here we come.
|
| Not that it bothers me, but I feel oddly validated that this appears
| to be the path taken. It makes sense, even from a purely 'we
| can't review it all' perspective.
| smoldesu wrote:
| Well, it's not like they're The App Store and controlling
| everything you can install. You can still put AI-assisted
| software on any machine that can install Steam, they just
| don't want to deal with the legal implications of hosting the
| dubiously-generated content themselves.
| colechristensen wrote:
| I think it's fine.
|
| You have to have rights to do AI things with the content of
| your datasets. No more "download the whole internet" or
| "create image generation models from the scraped contents of
| a stock image provider".
|
| I think it's going to turn into a new class of copyright
| permissions.
|
| Along the lines of
|
| > thou shalt not make a machine in the likeness of a human
| mind
|
| More like
|
| > License is hereby given for the consumption of these
| contents by human minds
| slimsag wrote:
| It sounds like they even looked into the specific AI the
| gamedev said they used:
|
| > we reviewed [Game Name Here] and took our time to better
| understand the AI tech used to create it.
|
| And offered a refund on the $100 app submission credit:
|
| > App credits are usually non-refundable, but we'd like to make
| an exception here and offer you a refund. Please confirm and
| we'll proceed.
|
| Seems incredibly reasonable.
| stuckinhell wrote:
| This just shows me the future is people using AI tools to make
| their own games, customized for them.
| lobo_tuerto wrote:
| Actual title is:
|
| "Valve is not willing to publish games with AI generated content
| anymore"
| GreedClarifies wrote:
| I guess we need clarity on whether using copyrighted material is
| covered under fair use.
|
| GenAI clearly meets the "transformative" standard.
|
| On one hand, it seems likely that it will have difficulty with the
| "Amount and substantiality" factor, as it considers the whole artwork;
| on the other hand, this is not necessarily a hard barrier given the
| "transformative" nature.
|
| My guess is that the "Effect upon work's value" standard vs. the
| "transformative" standard will be the area where there is most
| action. Clearly, in aggregate, GenAI will have great impact upon
| works value. However, this is not the usual standard (it is about
| individual works), and I would argue that this would be creating
| new law by the courts.
|
| Hopefully we will get a case to the supreme court to resolve
| this, quickly. I think that this is a boon for humanity and I
| would like to see the cuffs taken off as quickly as possible.
| janosdebugs wrote:
| > GenAI clearly meets the "transformative" standard.
|
| IANAL, but the problem is that fair use is an affirmative
| defense and is decided for each case separately. One GenAI may
| be transformative, while others may not, depending on how much
| of the original training data they throw back at you.
| pierat wrote:
| By that definition, any roguelike should be banned. And, well,
| we're not seeing that.
|
| I'll watch, but I disbelieve the reddit poster. Probably a CEO
| bot drumming up obvious bait comments over current computer
| events.
| theknocker wrote:
| [dead]
| kevinh wrote:
| I'd wait for more information before making any assumptions about
| what Valve is doing here. So often these stories here are lacking
| context due to only one side trying to paint the situation in a
| very one-sided light.
| Wouter33 wrote:
| Valve is quite clear about their reasoning. Since AI models use
| all kinds of sources for their training, they don't want those
| assets on their platform because they are afraid of copyright
| claims.
| peoplearepeople wrote:
| That seems very sensible to me, I hope other platforms follow
| this example
| kevinh wrote:
| For all we know, the game in question had images clearly
| aping some licensed characters. We don't know how stringent
| the policies are without clarification or examples of art
| found infringing. How did Valve know that the art was AI-
| generated? Did the developer tell them or include it in their
| marketing materials? It's basically just reading tea leaves
| without that information.
| yomlica8 wrote:
| It actually sounds like if you claim to have ownership of the
| training data you can still use AI generated assets. For most
| people this is a distinction without a difference however.
| NelsonMinar wrote:
| Agreed. The link here is to a Reddit post from a
| relatively unknown person claiming to be quoting a private
| email from Valve. Kotaku has a bit more reporting, including a
| second report from a developer on Reddit. Also some comments on
| skepticism. https://kotaku.com/valve-ai-art-generator-steam-
| crypto-ban-m...
|
| I'm pretty sure if this is Valve policy they'll have no trouble
| saying so publicly. I miss the old days of journalism where
| someone made an effort to get the story correct including
| responses from the named parties.
| bun_at_work wrote:
| To be fair there are still examples of quality journalism,
| it's just that the internet doesn't care as much for that
| content, as it doesn't generate the outrage present in this
| thread. Unfortunately the incentives are aligned with ad
| revenue instead of accuracy.
| xk_id wrote:
| Kotaku have proven themselves to be a garbage tabloid in
| their reporting of NFTs. I would never bother to read
| anything they had to post.
| minimaxir wrote:
| This just creates a moral hazard to not disclose the use of any
| AI-generated assets, which is something other creative industries
| have already learnt the hard way.
|
| Recent text-to-image models have improved enough that it's
| possible to get realistic generations without the Midjourney
| dreaminess with a modicum of effort, so banning obviously-AI-
| generated images is shortsighted and unsustainable.
| OscarTheGrinch wrote:
| This also just creates an incentive to lie about what AI model
| / training data you used; how could anyone possibly prove your
| deception?
|
| What would satisfy an audit that no tainted AI data has made it
| into a digital image? It would involve a chain of attribution per
| fraction of a pixel through all of its past iterations.
| minimaxir wrote:
| > It would involve a chain of attribution per fraction of a
| pixel through all of its past iterations.
|
| Which wouldn't be sufficient since, as stated many times
| before, the diffusion process most text-to-image AIs use is
| not collaging.
| meheleventyone wrote:
| Valve isn't worried about you lying; they just want your
| attestation so that if someone tries to sue them, they can redirect
| them to you and, if they think it's worthwhile, sue you
| themselves for lying and breaching the contract. In the same
| way, they want you to attest to owning rights to all the IP in
| the products you put on their platform. It's just that IP
| ownership around AI content is murky right now, so it gets
| treated as a special case.
| pavon wrote:
| That strategy didn't work well for the submitter. Steam is
| rather stingy in allowing resubmissions, so the better strategy
| is to make your best effort to comply with the terms on the
| first try.
| BlueTemplar wrote:
| Shortsighted, unsustainable... but still the best thing for the
| company to do in the meantime, in a very uncertain situation?
| bob1029 wrote:
| > This just creates a moral hazard to not disclose the use of
| any AI-generated assets
|
| The whole space is somewhat amusing to me. What is the bigger
| moral hazard: openly disclosing everything about your content
| pipeline and getting your team's efforts shitcanned, or keeping
| everything private unless a court order shows up?
| namaria wrote:
| No one is being protected from consequences of risky
| behavior. Moral hazard doesn't apply.
| ethbr0 wrote:
| So, the OpenAI model?
| candiddevmike wrote:
| And GitHub Copilot
| Kiro wrote:
| That's OpenAI.
| mr_coleman wrote:
| I prefer to think of it as the Uber/AirBnB model. Just do
| illegal things so much that you clog the enforcement
| mechanisms. Then it becomes such an unreasonable burden
| that they change the laws in your favor.
| minimaxir wrote:
| It has been widely speculated that the primary reason
| OpenAI never disclosed the full training dataset for GPT-3
| or GPT-4 was to avoid potential legal backlash.
| hot_gril wrote:
| > This just creates a moral hazard to not disclose the use of
| any AI-generated assets, which is something other creative
| industries have already learnt the hard way.
|
| This has been the legal street smarts for a while and doesn't
| seem like a big development to me. As usual, you don't admit or
| allude to anything. It's like when I'd write code and say no,
| I've never even visited Stack Overflow.
| moogly wrote:
| How about Firmament?
| justinclift wrote:
| Wonder if using (Japanese) anime based AI assets would be
| workable instead, as the licensing situation there sounds a bit
| clearer?
|
| aka "copyright doesn't apply":
|
| https://cacm.acm.org/news/273479-japan-goes-all-in-copyright...
| dleslie wrote:
| The models which can prove the provenance, and valid licensing, of
| their source assets will become increasingly valuable with time.
|
| This gives an edge to social networks, which often have EULAs that
| allow the business to use uploaded content _at least_ internally,
| if not commercially.
|
| _And_, in the short term, there's an opportunity for someone to
| pay armies of artists to create _decent renditions_ of existing
| styles and known works. It's not a copyright violation if a human
| being mimics another human being in creating a new, original
| work.
| snowman647 wrote:
| That's the wrong path; soon 90%+ of the Internet will be mixed with
| AI. Btw, what if I write my game using ChatGPT?
| dragonwriter wrote:
| As long as you just use it for generating the code and not the
| assets, Valve doesn't seem to care.
| [deleted]
| janosdebugs wrote:
| I recently tried to use it to learn the Blender API, without
| much success. It hallucinates left, right, and center, so I had to
| go back to reading the docs, plus trial and error as well as
| reverse engineering. I'd be honestly surprised if you could use
| it to create an entire game.
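|
| For what it's worth, the real API is small enough to sanity-check
| by hand; here are a couple of bpy calls that do exist (the object
| name and values are just examples), as opposed to the
| plausible-looking ones the model kept inventing:
|
|     import bpy  # only available inside Blender's bundled Python
|
|     # Add a cube and tweak it through the documented API.
|     bpy.ops.mesh.primitive_cube_add(size=2, location=(0.0, 0.0, 1.0))
|     cube = bpy.context.active_object
|     cube.name = "ExampleCube"
|     cube.scale = (1.0, 2.0, 0.5)
|
|     # List everything in the scene's data.
|     for obj in bpy.data.objects:
|         print(obj.name, obj.location)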
| justrealist wrote:
| This seems utterly impossible to enforce. Are you really going to
| guarantee that your design firm didn't use AI to generate the
| assets?
| Nursie wrote:
| So you're in breach of contract, and if it turns out later that
| you don't own the copyright for what you're distributing, Valve
| gets to pin that firmly on you.
| jasonlotito wrote:
| It doesn't need to be 100%. It just needs to be a reasonable
| attempt. We all too often let perfect be the enemy of good.
| Impossible to enforce? That's not true. They did so right here.
|
| Yes, people will work around it, and some will slip through the
| cracks. That doesn't mean it's a useless policy with no impact.
| Eji1700 wrote:
| It's a CYA thing.
|
| Steam says "we don't allow AI content".
|
| Someone shoves AI content on the platform anyways.
|
| If it can be proved they violated the TOS, they then have the
| ability to nuke their game and stop someone from suing them. If
| they can't prove it, well they can't prove it and the game
| stays.
|
| To do otherwise opens up the door to steam having to "vet" all
| the AI content. So yes, AI content will slip through (in massive
| droves), but it will be in the indie scene.
|
| The biggest impact here is going to be AAA devs who can't just
| neglect to mention they used AI at some point. This is actually
| the first thing that could "kill" steam or give Epic a
| competitive advantage. There's 0 doubt that companies like
| EA/Activision/whatever want to jump all over AI to make yearly
| releases like FIFA even cheaper, and if Epic is willing to say
| "come on over" we might see epic exclusivity for that reason
| alone, rather than the current "here's a pile of money to make
| up for all the sales you'll miss out on when no one remembers
| your game released"
| freedomben wrote:
| If Steam (or probably more likely Steam's lawyers) is/are
| worried about liability, shouldn't Epic be also? The lawsuits
| are firing up, I wouldn't want to be the whipping boy chosen
| by the copyright holders to punish.
| Eji1700 wrote:
| I would assume so, but I'm faaaar from an expert.
|
| My amateur opinion is there's too much money to be made for
| this to be stopped forever (we SHOULD rework copyright but
| we'll probably just make some dumb rule when Disney decides
| how they want to handle it), so Epic might just say "the
| courts will side with us eventually and the benefits are
| waaay too high to ignore"
| Verdex wrote:
| I don't think it's really about enforcement. It's more about
| liability.
|
| Valve has an official position that they don't allow AI content
| (apparently). When the lawsuits show up they can say that they
| don't serve any AI content as official policy. When someone
| points out the AI content that they do serve, then they pull
| out their expert witness that testifies that their AI detection
| method is as good as possible and they couldn't have been
| expected to do any better. Meanwhile, they're more than happy
| to remove anything explicitly flagged that falls through the
| cracks.
|
| Finally, I suspect that anyone who can prove that they're able
| and willing to indemnify Valve against lawsuits for AI content
| that their game contains will be allowed to have AI content.
| mcpackieh wrote:
| > _Finally, I suspect that anyone who can prove that they 're
| able and willing to indemnify Valve against lawsuits for AI
| content that their game contains will be allowed to have AI
| content._
|
| Yes, they're quoted as saying that AI generated assets are
| permitted if the developer can "affirmatively confirm" they
| own all the IP in the training set. Seems reasonable to me.
| Verdex wrote:
| Yeah, I saw that quoted part, although, I suspect that if I
| show up with a bunch of AI assets that I can prove are 100%
| mine, then the reviewer is likely to err on the side of
| Valve not being sued.
|
| Meanwhile, a AAA blockbuster studio will almost definitely be
| given a pass, with a wink and a handshake, for using assets
| that throw up multiple red flags, after saying, "Hey, if anyone
| figures it out, we'll take the blame."
| kemayo wrote:
| > After reviewing, we have identified intellectual property in
| [Game Name Here] which appears to belongs to one or more third
| parties. In particular, [Game Name Here] contains art assets
| generated by artificial intelligence that appears to be relying
| on copyrighted material owned by third parties. As the legal
| ownership of such AI-generated art is unclear, we cannot ship
| your game while it contains these AI-generated assets, unless you
| can affirmatively confirm that you own the rights to all of the
| IP used in the data set that trained the AI to create the assets
| in your game.
|
| Valve's worried that AI-generated art is in a murky copyright
| state, and doesn't want to open itself up to being sued.
| XCSme wrote:
| Well, that was the entire point of banning AI generated
| content, right? Or, are you implying that the article states
| different reasons for the ban?
| nickelcitymario wrote:
| I may be wrong, but I believe they were just intending to
| summarize.
| freedomben wrote:
| > _that was the entire point of banning AI generated content,
| right?_
|
| But AI generated content is _NOT_ banned. You just have to
| prove you have the copyright (or permission) for the training
| data.
| mrweasel wrote:
| Which honestly sounds rather reasonable and should be the
| expectation of all current AI products.
|
| The only reason it wouldn't be easy enough to provide is if
| you just scraped any available data set with a complete
| disregard for intellectual property.
| Matticus_Rex wrote:
| Humans take in inputs and transform what they learn into
| distinct outputs. Training on data is essentially just a
| machine doing the same thing. Knowingly scraping pirated
| material would be one thing, but essentially having the
| machine view publicly-available material is not clearly
| at odds with existing IP law.
| mrweasel wrote:
| And if your output isn't distinct enough from the inputs,
| you too aren't allowed to claim copyright or sell your
| work without proper licensing.
|
| With the AI we can at least be 100% certain of which
| input you trained it on and under which license, making
| the whole thing a lot easier to deal with, as compared to
| humans. The liability is the same, but it's much easier
| to avoid legal implications, so why not play ball and
| ensure that you have the correct licenses?
| Matticus_Rex wrote:
| Have _what_ correct licenses?
|
| It's not clear that you need _any_ license to train on
| data in the vast majority of cases. Having a license to
| train on it won't guarantee that you can grant your
| users a license to any particular output, especially
| given the addition of user input. And most of the utility
| is in creating outputs that are indeed distinct.
|
| So the answer to "why not play ball?" is:
|
| 1. It's not clear that it's legally required.
|
| 2. It would be incredibly expensive, and/or slow progress
| dramatically, or limit you to pre-existing licensed content
| (e.g. Adobe), which drastically reduces some types of
| capabilities.
|
| 3. Given #1, for any company that doesn't have an Adobe-style
| library, "playing ball" is essentially betting the company that
| it will become legally required, because on top of developing
| an AI model you're going to have to become an expert content
| licensing and documentation studio.
| kemayo wrote:
| I'm quoting the relevant bit from Valve's email in the reddit
| post, to indicate exactly what they're actually forbidding.
|
| Notably: _not_ all AI-generated content, but rather AI-
| generated content from models that were trained on material
| that's not owned by the person submitting the game.
| XCSme wrote:
| Can that even be proven/tested?
| BlueTemplar wrote:
| Depending on the court, burdens of proof can vary a
| lot...
| dragonwriter wrote:
| In court, it would be the entity claiming infringement
| that would have the burden of proof that their exclusive
| rights under copyright were violated by the assets in
| question, not the distributor of the asset that had the
| burden of showing ownership of every item in the training
| set of the model used in some part of the workflow.
| kemayo wrote:
| What I'd assume Valve is worried about is that it only
| takes one major decision against Stable Diffusion in
| court to suddenly leave us in a state where "this game
| used Stable Diffusion" _is_ the proof that 's needed.
|
| Given the whole "Stable Diffusion reproduces the Getty
| Images watermark" lawsuit[1] that's still ongoing, it's
| not an idle concern.
|
| [1]: https://www.theverge.com/2023/1/17/23558516/ai-art-
| copyright...
| dragonwriter wrote:
| > What I'd assume Valve is worried about is that it only
| takes one major decision against Stable Diffusion in
| court to suddenly leave us in a state where "this game
| used Stable Diffusion" is the proof that's needed.
|
| Hard to see how any plausible outcome would have
| that result for users of SD (if model training isn't fair
| use, that's definitely a blanket-liability issue for
| Stability.AI -- and Midjourney, and OpenAI, and lots of
| people training their own models, either from scratch or
| fine-tuning, using others' copyright-protected works).
|
| But "using a tool that violates copyright in the
| workflow" is not itself infringement; whether and in what
| situations prompting SD to produce output makes the
| output a violation of copyright (and whose) would be a
| completely different decision, and while I can certainly
| see cases (such as deliberately seeking to reproduce a
| particular copyright-protected element, like a
| character, from the source data) where it might be
| (irrespective of the copyright status of the model
| itself), I haven't seen anyone propose a rule that could
| be applied (much less an argument that would justify it
| as likely) based on copyright law that gets you to "used
| SD, in violation".
|
| Lots of blanket ethical arguments about using it, but
| that's a different domain than law.
| nemomarx wrote:
| They can ask you to say that you own all the relevant
| rights, and then if it turns out not to be true later
| they can say they didn't know.
|
| Which seems like as much as you can hope for in a policy?
| XCSme wrote:
| But how does this work for YouTube for example? People
| still upload copyrighted stuff, even if the ToS says you
| shouldn't.
| gbear605 wrote:
| The key is that it protects YouTube (and Valve in this
| case) since they can say "we weren't allowing our users
| to upload copyrighted content but they snuck it in"
| mrweasel wrote:
| It just has to be enough that Valve's legal team can claim
| that they approved the game in good faith and that they
| can't be held responsible for a game developer lying and
| knowingly violating the terms of service.
| ravenstine wrote:
| Why would Valve be on the hook for copyright? If a particular
| game developer happens to get sued (which happens regardless of
| AI), all Valve has to do is remove the game from Steam.
|
| Assuming this is even real, it may have more to do with
| preventing another 1983-style video game crash resulting from the
| market being overwhelmed with crappy games.
|
| Then again, most AAA games today are broken pieces of suck, so
| IDK.
| henryfjordan wrote:
| If Valve takes a % of sales from a game that is full of
| copyright infringement, they can be sued for their cut from
| the sales plus maybe more and they need to pay the lawyers.
|
| They also care about their reputation amongst content-
| producers (game makers). Youtube faced this exact dynamic
| back in the day and have found it better to side with the
| large creators who care very much about protecting IP rights
| and so they exercise a heavy hand against copyrighted
| material.
| usrusr wrote:
| "all Valve has to do is remove the game from Steam."
|
| And take the reputation hit that would go along with that.
| Valve's business is 30% technology and 70% the reputation of
| being much less untrustworthy than the alternatives. If they
| lose that they can close shop.
| shultays wrote:
| Valve's business is its user base, as long as it has that I
| think they will survive.
| raincole wrote:
| It's just a random reddit post. The OP on reddit didn't even
| post his game.
|
| Three possibilities:
|
| 1. It's just fictional. Probably written by a troll or
| generated by ChatGPT.
|
| 2. Steam refused to publish the game due to some obvious
| copyright issues (like they told Midjourney to generate
| superman or one-piece characters)
|
| 3. Steam is banning any AI generated assets.
|
| My bet is 1 > 2 > 3.
| thrashh wrote:
| I bet it's 2 because OP said they used "admittedly obviously
| AI generated from the hands" and a lot of AI generation makes
| really funky hands that you have to fix because it looks so
| bad.
|
| So it sounds like OP slapped some half-assed generated images
| into a game and tried to submit it. Valve now can't really
| trust someone that does that to have done any due diligence.
| naet wrote:
| The OP's only other reddit post says that they want to
| start a corporation to avoid associating their name with
| the game, which is a pretty big red flag to me.
|
| "I've been developing a game for a while now, and am near
| ready to release it on Steam. I'd prefer it not to be
| associated with my name (as in I'd prefer people googling
| my name and future employers being unable to find out i
| developed this game )."
| bobthepanda wrote:
| There's a high likelihood that it's pornographic
| or otherwise controversial.
|
| Use copyrighted material in porn that you charge money
| for and you'll get slapped with a lawsuit faster than you
| can load the home screen.
| jstarfish wrote:
| Given the tech involved and allegations I agree with the
| others that this is a smut game, but shielding yourself
| with an LLC is a smart move for anybody doing _anything_
| controversial, commercially.
|
| Ask Alex Jones...
| sangnoir wrote:
| Depending on the genre, that may be a reasonable action.
| If I were to uncharitably assume OP submitted an adult
| game with rip-off characters, that would explain both OP
| & Valve's behaviors. If this transpired at all.
| thrillgore wrote:
| I don't visit reddit anymore so I need to see more
| substantiated claims before I give this any thought.
|
| Besides my already established biases towards AI: It's
| threatening to creative endeavors, not because it exists, but
| because it will impact the earning potential of creatives.
| lolinder wrote:
| It's a really strong indictment of the state of journalism
| that "I read it on reddit" has become sufficient to turn into
| a news story.
|
| The internet is flooded with content right now to the effect
| of "Valve might be doing this thing", but not one of those
| sources has actually reached out to Valve for comment.
| Instead they all cite a random commenter on Reddit (or they
| cite each other).
| coffeebeqn wrote:
| What journalism? This is a link to reddit
| cmiles74 wrote:
| Here's one from Ars Technica.
|
| https://arstechnica.com/gaming/2023/06/steam-mods-
| reportedly...
| yk wrote:
| > Neither Valve nor potterharry97 were immediately
| available to respond to a request for comment.
|
| I wanted to point out that journalists at least check
| their sources...
| lolinder wrote:
| "Immediately available" could mean "we sent out the email
| right before we pressed submit". That's better than
| nothing, but it's still not journalism.
| lyu07282 wrote:
| "Journalists" are like vultures now, they just run
| towards anything to publish as much as possible with
| absolutely zero regard for any journalistic ethic.
|
| Might as well just replace journalists with AI at this
| point; to a large group of people (me included) they've
| all made themselves more hated and untrustworthy than any
| company, economist, politician, or civil servant.
| SantalBlush wrote:
| By and large, it's true. But Reuters still does some
| decent investigative journalism, imo.
| [deleted]
| lolinder wrote:
| https://news.google.com/search?q=valve
|
| There are dozens of articles that are just summaries of
| this reddit thread with no further effort put into them,
| and that's pretty much the norm these days for a lot of
| content.
| coffeebeqn wrote:
| Ah very true, thanks
| netsharc wrote:
| I see three relevant articles (the rest aren't relevant,
| e.g. talking about Steam summer sale), from sites that
| look more like content farms than legitimate news
| websites.. but yeah, there are too many content farms
| nowadays.
| palata wrote:
| Soon they will be auto-generated with ChatGPT from reddit
| threads...
| varelse wrote:
| [dead]
| DirkH wrote:
| A lot were already automated prior to ChatGPT
| palata wrote:
| Sure, but I like to think that I could pretty quickly
| recognize them. Much harder with GPT.
| AnIdiotOnTheNet wrote:
| 4. This is intentionally floated with nebulous veracity in
| order to gauge public reaction before making it official.
|
| If so, the reaction I've seen is quite positive. Very
| unlikely though.
| ilyt wrote:
| >Steam refused to publish the game due to some obvious
| copyright issues (like they told Midjourney to generate
| superman or one-piece characters)
|
| from post:
|
| > contains art assets generated by artificial intelligence
| that appears to be relying on copyrighted material owned by
| third parties.
|
| So I'm guessing 2
| Macha wrote:
| I'd be less sure about that, Valve has taken stances for
| reasons of public demand and/or their personal ethical
| standards (depending on what you believe) before, for example
| the ban on Blockchain games.
| judge2020 wrote:
| But not for Tyrone vs. Cops[0]
|
| "When a game comes up as problematic, it gets flagged a
| bunch by steam users, and there's a meeting that takes
| place where they decide if these things stay on steam"[1].
|
| 0:
| https://store.steampowered.com/app/1853200/TYRONE_vs_COPS/
|
| 1: https://youtu.be/hDjxBrgtJXc?t=974
| Macuyiko wrote:
| Well... Valve is a very interesting study in that regard.
| They have voted for violence, topics bordering on meme
| hate speech, pornography, but have voted against crypto
| shit and now AI.
|
| Not making a verdict either way but I find it
| interesting. I'd like to know more about the internal
| discussion(s) that took place to establish their
| frameworks. Especially given the company is private.
| bmicraft wrote:
| I think it makes perfect sense that they care more about
| their users not falling for a crypto scam on their
| platform than the actual content of the games
| cmeacham98 wrote:
| For what it's worth, I think I would vote the same way,
| perhaps with the difference being against the hate speech
| depending on how bad it was.
|
| Games with significant Crypto and AI art components bring
| significant risks to Valve in both legal and social
| contexts (99.9% of modern crypto-related projects are an
| intentional scam, and AI art is a legal minefield right
| now).
|
| On the other hand, violence and pornography are much more
| accepted by society (in the context of fictional
| entertainment).
| adnzzzzZ wrote:
| This developer's next game was banned though:
| https://twitter.com/Team_SNEED/status/1651022411368628224.
| They're fairly inconsistent about it.
| cma wrote:
| Valve's ban on blockchain games was more related to their
| cut of revenue and a desire not to bolster systems that
| bypass it, probably along with KYC concerns about converting
| Steam wallet money into crypto.
| [deleted]
| intrasight wrote:
| I usually read HN comments before articles. It's a good filter.
| Seeing that the article was just a Reddit post, I stopped reading
| comments at this one.
| shultays wrote:
| But it is just a random HN post!
| bunga-bunga wrote:
| Sounds like a cop-out. They can't possibly verify that any
| content they're hosting isn't already copyrighted, let alone in
| a "murky copyright state"
| [deleted]
| wahnfrieden wrote:
| Well, they verify. Think again or say something that isn't
| just "I know something I made up about them that I didn't
| look up".
| mcpackieh wrote:
| What are they copping out of? What are you suggesting their
| real motivation is?
| evandale wrote:
| I'll suggest a real motivation: they want to make their own
| AI-generated game and have first-mover advantage while
| they work out all the scary AI copyright issues, issues they
| already have to deal with anyway because the same
| problem exists with human-generated art.
|
| Why do I say that? They want the developer to prove they
| only used material they created to do the training, and
| Valve has the resources to follow that rule, unlike the rest
| of us.
| mcpackieh wrote:
| > _They want the developer to prove they only used
| material_
|
| No, they only want the developer to "affirmatively
| confirm" it. It doesn't say anything about Valve
| demanding some sort of proof.
| pinkcan wrote:
| yea, and the game is half-life 3
| mardifoufs wrote:
| I don't think valve wanting to make a game is a realistic
| argument for anything in 2023. I'd believe any other
| reason than that lol.
| [deleted]
| urda wrote:
| > Sounds like a cop-out.
|
| Adhering to legal and copyright standards isn't a "cop-out"
| vkou wrote:
| And neither is choosing to act in a situation where the
| legality isn't clear.
|
| I understand that OpenAI et al would like to assure all
| their investors and customers that there's nothing legally
| problematic with using an AI to launder away copyright
| infringement, but we're going to need a few lawsuits to have
| the matter settled.
| weatherlight wrote:
| someone failed their ethics class in college....
| [deleted]
| doctorwho42 wrote:
| And Valve leadership has made a decent decision: 'we
| don't want to be the ones who end up as defendants in what
| could be a costly and time-consuming lawsuit over
| something we didn't make'
| hyperhopper wrote:
| Okay, show me the standard that says that output from AI
| trained on copyrighted materials cannot be sold in this
| manner.
|
| They aren't following standards, they are being ultra super
| conservatively cautious.
|
| If you go that far you can rationalize a lot of things
| Matticus_Rex wrote:
| Being cautious when not being cautious could mean lots of
| big lawsuits against you doesn't seem that ultra-super
| conservative. I hope this ends up going the other way,
| but I understand Valve's calculus here.
| adamc wrote:
| IANAL. But... show me the case law establishing there is
| near-zero risk to them if they let it go through.
|
| People make business decisions all the time to avoid
| murky areas that may hold peril. Unless there is a big
| benefit to them, why take the risk?
| gabeio wrote:
| > ultra super conservatively cautious.
|
| This has nothing to do with politics.
|
| This has everything to do with CYA, the issue is AI
| trained with copyrighted material is a huge gray area and
| they don't want to be in the gray area. That's rational
| and has zero to do with "conservative".
|
| This is likely not set in stone and after the copyright
| laws and courts catch up and decide what to do, Valve
| will likely go back and update their policies
| accordingly.
| saurik wrote:
| >> ultra super conservatively cautious.
|
| > This has nothing to do with politics.
|
| > This has everything to do with CYA, the issue is AI
| trained with copyrighted material is a huge gray area and
| they don't want to be in the gray area. That's rational
| and has zero to do with "conservative".
|
| The word "conservative" isn't a political word in all (or
| even, I would have thought, in most) contexts: its normal
| meaning is similar to "chosen so as to be careful". For
| example, a "conservative estimate" isn't "an estimate
| that leans to the right of the political spectrum": it is
| an estimate which has been padded out in the direction
| you are more likely wrong.
|
| When someone says they are being "ultra super
| conservatively cautious" they are merely being super
| extra _extra_ doubly-cautious, as we are stacking similar
| adjectives (as one might do with something else
| such as "carefully"). So, wanting to avoid being in a
| gray area is dead center to being "conservative" in one's
| curation or legal strategy.
| Rexxar wrote:
| > This has nothing to do with politics.
|
| Please tell me what, in your opinion, a conservative
| garbage collector is, without looking it up on Google.
| imchillyb wrote:
| > https://www.artnews.com/art-news/news/ai-generator-art-
| text-...
|
| US Copyright Office has stated unequivocally that AI
| works cannot be copyrighted, or otherwise protected
| legally.
|
| The US patent office is studying the effects of AI on the
| patent system and asking citizens and businesses for
| comment.
|
| If that's not enough for you, I don't know what would be.
| oneeyedpigeon wrote:
| That's surprising. Do you know if their definition of
| 'AI' includes things like generative fill in Photoshop?
| dragonwriter wrote:
| > US Copyright Office has stated unequivocally that AI
| works cannot be copyrighted, or otherwise protected
| legally.
|
| The "or otherwise legally protected" piece is outright
| false (and would be out of their scope of competence if
| true), the other part is true but potentially misleading
| (a work cannot be protected to the extent that AI, and
| not the human user, "determines the expressive elements
| of the work", but a work made with some use of AI where
| the human user does that can be protected to the extent
| of the human contribution.)
|
| The duty to disclose elements that are created by
| generative AI in the same guidance is going to prove
| unworkable, too, as generative AI is increasingly
| embedded into toolchains with other features and not
| sharply distinguished, and with nontrivial workflows.
| [deleted]
| kube-system wrote:
| That is a shallow regurgitation of their opinion that has
| been repeated out of context in headlines, but it misses
| their point. The Copyright Office's opinion can be better
| summed up as:
|
| 1. Copyright protects work that humans create
|
| 2. Humans sometimes use tools to create their works, that
| is okay
|
| 3. Y'all make up your mind whether your AI is some
| sentient being or whether it's just a tool. We're just
| lawyers.
|
| If the wind blows and your typewriter falls off a shelf
| and writes a novel, it isn't subject to copyright either.
| That doesn't mean that _all_ works written using a
| typewriter aren't subject to copyright. It means a human
| must be part of the creative process.
| hedora wrote:
| But what if the wind blows, and my laptop falls off a
| shelf and writes the source code for windows 95, but
| reindented, with some implementation details and variable
| names changed?
|
| It's pretty clear that the "neural networks are just a
| tool" ruling is going to have to be revisited eventually
| (and probably soon).
| kube-system wrote:
| > But what if the wind blows, and my laptop falls off a
| shelf and writes the source code for windows 95, but
| reindented, with some implementation details and variable
| names changed?
|
| Simple. If it wasn't created by a human, it's not
| eligible for copyright. The law is quite clear about
| this.
|
| Microsoft gets the copyright to Windows 95 because they
| wrote it with humans. You wouldn't get it because you
| didn't write it. Your laptop wouldn't get it because it
| isn't a human.
|
| > It's pretty clear that the "neural networks are just a
| tool" ruling
|
| I think you misinterpreted the above. There is no
| ""neural networks are just a tool" ruling".
|
| The copyright office never said neural networks were or
| were not a tool.
|
| They said if a human makes a creative work, and they
| happen to use use a tool, then it is eligible for
| copyright. As it always has been.
|
| All they said is what every lawyer _already knows_,
| which is that a work has to have an element of human
| creativity in order to be eligible for copyright.
| jamilton wrote:
| That's meaningfully different. "Can't be copyrighted"
| doesn't mean "can't be sold", or "someone else owns the
| copyright". It just means someone can copy and resell the
| generated portions without payment/licensing.
| chefandy wrote:
| Sure-- the method of making the image, such as being AI
| generated, is entirely irrelevant in terms of IP
| enforcement. You could cut a cross-section from a log
| that had coincidentally formed the Nike symbol with its
| rings, and if you slapped a picture of it on your line of
| sportswear, you'd better believe you're going to get owned.
|
| But if they see an increased risk of IP violations from
| AI generated assets-- and given the Getty red carpet
| debacle that's entirely reasonable-- banning it will
| probably save them a whole lot of money on manual game
| reviews.
| tgsovlerkhgsel wrote:
| The Nike example is trademark rights, not copyright.
|
| If you give a worker 5 examples of cars, and tell him
| "draw me a new car in this style", and he does so (from
| memory without clearly copying any individual example),
| it's unlikely to be a copyright or other IP violation.
| blibble wrote:
| regardless of legality: the odds are games with AI
| generated materials are going to be much lower quality
|
| (shovelware)
| ekianjo wrote:
| And which standard is that for AI-generated art?
| georgeecollins wrote:
| A lot of AI-generated images retain the watermark of the
| copyrighted images the model was trained on. If you sell
| something with that image with no agreement from the rights
| holder, it is not fair use.
|
| It is completely reasonable for Valve to forbid this
| until it is sorted out. Keep in mind they are a company of
| IP creators, creating a marketplace for IP creators. The
| whole reason Steam was created was to establish a DRM
| that fought the piracy of Half Life. I am on the side of
| Valve in this.
| tgsovlerkhgsel wrote:
| I believe the AI _generates_ a watermark because so many
| examples contained it.
|
| Imagine taking a really dumb gig worker, showing him
| 10000 images, some of them with watermarks, and then
| telling him "draw a red car, kinda like the kind of
| images you saw". There's a decent chance you'll get a red
| car that looks nothing like any cars with the data set
| (original work), and yet he'll paint a memorable
| watermark on top because so many examples contained it,
| you said "kinda like the kind of images you saw", and he
| doesn't understand that the watermark isn't meant to be
| part of the picture. I believe that's whats happening.
| jameshart wrote:
| They don't 'retain' a watermark. They 'reproduce' the
| watermark.
|
| It's entirely possible for a diffusion model to produce
| an original work and yet still hallucinate a
| 'shutterstock' watermark onto it, in much the same way as
| GPT can hallucinate valid-looking citations for legal
| cases that never happened.
| sdenton4 wrote:
| To correct the common misconception: Sometimes AI image
| generators insert a watermark because they have seen a
| lot of watermarks on certain kinds of images during
| training. This does not mean that the image itself is a
| copy of any particular image in the training data.
|
| Producing (distorted) copies of images in the training
| data takes some real effort, and typically only occurs
| for images which are heavily repeated in the training
| data... Most of the complaints along these lines can be
| compared to complaints that cars cause massive bodily
| harm if you steer them into lightposts: The problem is
| easily preventable by not driving into a lightpost.
| Tuna-Fish wrote:
| There are multiple jurisdictions where there have been
| rumblings that an AI-generated work is possibly a derived
| work from every single work that the AI was trained with.
| This hasn't been properly tested in court, but I would
| give very high odds that the standard will be upheld at
| least somewhere where Steam sells things.
|
| If this is true, then ordinary copyright law means that
| AI-generated media cannot be used unless you have a
| release from every bit of training data you used. At
| least some of the currently existing AIs were trained
| with datasets for which such releases are impossible, so
| they should not be used.
|
| Also, for the love of god, do not use any of the AI
| coding assistants, or if you do, at least never publicly
| admit you do.
| 99_00 wrote:
| It's incorrect to say that Valve can't verify whether any
| content they are hosting is copyrighted.
|
| They are obviously able to identify some copyrighted material.
| [deleted]
| ethbr0 wrote:
| > _Sounds like a cop-out._
|
| Sounds like due diligence.
| methehack wrote:
| And a sound business decision until the copyright law works
| itself out.
| justapassenger wrote:
| Risk management. AI generated content has a high likelihood of
| copyright infringement.
| wiz21c wrote:
| funny, a while ago MSFT said Copilot was not stealing code,
| it was merely reading it...
| alpaca128 wrote:
| I'd say that too if I was Microsoft, but that doesn't
| make it true.
| usrusr wrote:
| The best case scenario for Microsoft would be supplying
| the world with programming tools far ahead of all others
| (no idea, haven't tried any of that stuff), while maybe
| not getting sued to bits. The best case scenario for
| Valve would be not getting sued to bits while being
| spammed with even more low-effort money-grab attempts that
| hope to luck into virality than they already are.
|
| At first approximation, yeah, the risk of getting sued to
| bits might be roughly the same. But the upside is not.
| kibwen wrote:
| Microsoft wants to leverage LLMs to expand their
| influence in the software development market. For them,
| Copilot is both a revenue source and a moat, so it behooves
| them to claim that these models don't constitute
| copyright infringement. But there's no business benefit
| to Valve in allowing AI-generated art assets on Steam,
| and a small (though nonzero) amount of risk.
| doctorwho42 wrote:
| And Microsoft isn't the government. So I see no bearing
| on the actual issue at hand, which is Valve protecting
| its own ass from lawsuits that are in the realm of murk
| at best.
| bee_rider wrote:
| It isn't a settled legal issue yet. It could be that
| Valve and Microsoft are responding to different
| incentives, because they have different business models.
| But it could also just be that their lawyers have
| different legal opinions.
| [deleted]
| kemayo wrote:
| They have a manual review step when you submit a game.
| Although you're right that they can't catch everything, they
| can certainly catch obvious things.
|
| I'm sure they mostly just don't want to wind up in court with
| a lawyer being able to say that they let [blatant example
| here] get published on their store. So long as they can
| credibly claim that there was no way for them to _tell_
| something was in an objectionable category, I'd imagine
| they're fine with it.
|
| Their rules, if you're curious:
| https://partner.steamgames.com/steamdirect
| ryathal wrote:
| I doubt their manual review actually does much of anything.
| There are already tons of "games" that don't actually
| function that are just pre-built engine assets shoved
| together.
| CommitSyn wrote:
| I wonder how automated their system is. They obviously
| wouldn't boot the game up and start walking around because
| they can just extract the media files and check. But I'm
| curious if there is a system that identifies copyrighted
| images/video stills and searches for copyrighted words.
| gnopgnip wrote:
| Wouldn't the DMCA safe harbor apply? It's user-submitted
| content.
| paxys wrote:
| This is a bullshit argument. There is zero liability for Valve
| here. They are a publisher, and are fully protected by the
| DMCA. Have they received a single takedown request for AI
| generated art? Why are they judging it to be possible
| infringement then?
| pcai wrote:
| Consider for just a moment that they have almost certainly
| thought about this much more deeply and thoughtfully than you
| have
| paxys wrote:
| "No big company can ever do a bad thing because I'm sure
| they've thought deeply and thoughtfully about it."
| pproe wrote:
| I agree with your sentiment but this is a poor way to
| dismiss an argument.
| scrps wrote:
| https://www.law.cornell.edu/wex/contributory_infringement
|
| _The Copyright Act does not expressly impose liability for
| contributory infringement. According to the U.S. Supreme
| Court, the "absence of such express language in the copyright
| statute does not preclude the imposition of liability for
| copyright infringements on certain parties who have not
| themselves engaged in the infringing activity."
|
| One who knowingly induces, causes or materially contributes
| to copyright infringement, by another but who has not
| committed or participated in the infringing acts themselves,
| may be held liable as a contributory infringer if they had
| knowledge, or reason to know, of the infringement. See, e.g.,
| Metro-Goldwyn-Mayer Studios Inc. v. Grokster, Ltd., 545 U.S.
| 913 (2005); Sony Corp. v. Universal City Studios, Inc., 464
| U.S. 417 (1984)._
|
| IANAL. Considering Valve not only gives games a retail
| platform but also has to approve games before sale and takes a
| cut of that sale, and assuming the reddit post isn't a lie, I'm
| gonna guess Valve's probably well-staffed legal dept
| decided not to take a seemingly iffy legal gamble on a game
| that probably wasn't going to rake in a ton of sales anyway.
| bmoxb wrote:
| That may be true, but it undoubtedly won't stop people from
| trying to sue - clearly they've decided the effort and
| legal fees required to deal with that aren't worth it.
| [deleted]
| emveeoh wrote:
| [dead]
| lallysingh wrote:
| Man, Getty Images is sitting on a goldmine opportunity to get
| into AI here. They have enough images to train something quite
| useful!
| krapp wrote:
| AI is already trained on Getty's image set... it's why people
| have to exclude watermarks from their prompts.
| baobabKoodaa wrote:
| Sure, but that doesn't relate in any way to the legal
| problem that this post is about.
| wincy wrote:
| Interesting. So products that use AI generation as part of an API,
| say using a diffusion model to generate different stylings for
| the walls and textures for a level creator, would fall under
| this?
|
| Guess it's time to ask for forgiveness rather than ask for
| permission and not let Valve know where my art assets are coming
| from in my web-based API.
|
| If I were making a game I'd just lie and lie at this point.
| axus wrote:
| AI code generation is OK though! Because they can't detect it?
|
| It's cool to see the development of new ethical standards in
| response to new technology. If I could get an option for
| ethically-sourced AI, which only uses public-domain art / text
| / code for training, that'd be nice.
| smoldesu wrote:
| For a very long time music producers would pirate their
| samples, plugins and presets. The idea was that nobody could
| tell how illegal these tools were in the finished product, so
| there was no reason _not_ to steal. It was genuinely the gold
| standard for a while, and even established artists like Diplo,
| Porter Robinson and Kanye West were caught pirating content en-
| masse.
|
| Nowadays there isn't the same attitude so much. Many people
| still pirate sounds, but skeptical listeners will sometimes ask
| musicians to show off their project files to embarrass them over
| how many pirated Cymatics drums they use and their version of
| Sylenth licensed to "RuTorrent".
|
| It wouldn't surprise me if the same thing happened today. AI-
| assisted development will take off for a while, and then people
| will ask self conscious questions like "nice art, who's your
| art director?"
| sebzim4500 wrote:
| > Many people still pirate sounds, but skeptic listeners will
| sometimes ask musicians to show off their project files
|
| And the musicians comply? Weird.
| smoldesu wrote:
| I mean, not always. It's hard to be super secretive in a
| live situation, but I'd wager many musicians have
| successfully hidden their pirated plugins.
|
| A lot of people have been caught anyways. Steve Aoki
| accidentally left a visibly pirated Sylenth VST in a promo
| vid, Porter Robinson and Skrillex both got caught with
| pirated plugins during track breakdowns, Kanye West posted
| a video with 30 tabs of The Pirate Bay open to download
| Logic Pro... the list goes on. It was extremely common in
| the early days of digital music production (and still is
| today, to an extent), but the backlash has pushed most
| legit production houses to properly licensed software.
| mavu wrote:
| [flagged]
| stainablesteel wrote:
| I really hope the US copies Japan's ruling on this kind of thing.
| emveeoh wrote:
| [dead]
| bilalq wrote:
| Where do they draw the line? What about DLSS? Doesn't any game
| using that have "AI generated graphics"? I guess their email
| wording focused specifically on assets. Does that mean if you
| don't pre-bake assets in as artifacts, you're fine?
| pavon wrote:
| With DLSS Steam isn't hosting and distributing AI generated
| content, Nvidia is distributing it with their drivers. So any
| liability regarding the source of the training datasets falls
| on Nvidia, and there is no reason for Steam to get involved.
| bilalq wrote:
| That seems reasonable at first glance, but how is that
| different from the game downloading generated content or
| models used for inference from its own servers after Steam
| distribution? If you argue that the code being used to access
| it is distributed, then what about the code to integrate with
| DLSS APIs on Steam distributed games?
| qmarchi wrote:
| DLSS is trained on the games themselves no?
| entropicdrifter wrote:
| It used to be trained on specific games, but I think most of
| the time it isn't nowadays outside of a few major titles
| giobox wrote:
| DLSS requires no training on your own game to use - you just
| use the pre-trained system NVidia provides, IIRC. So it has
| been trained on game data, but not necessarily yours.
| pawelduda wrote:
| There isn't one trained model per game, I think; to update
| DLSS manually you just replace a .dll file.
| giobox wrote:
| There is _very_ clearly a line most people will understand
| between what DLSS does, and generating completely new art that
| mimics existing intellectual property.
|
| No one is ever going to accuse DLSS of creating new art works
| containing some other legal entity's existing IP, for example;
| it's literally just a (very clever) upscaling of the original
| art. If it did, it would presumably render the game being
| upscaled almost unplayable as it would be changing the output
| to a state unrecognizable from the input frame.
| bilalq wrote:
| There may be a line in there somewhere, but it's not at
| all clear where it is. What about generating foliage or
| varied ground textures? What about generating buildings? Or
| NPCs? Also, the "just upscaling" relies on training data from
| outside your own game. Why is that okay when the rest of this
| isn't?
|
| I totally get why Valve is taking the stance that they are. I
| imagine it's hard even for them to know where to draw the line
| (evidenced by how long the turnaround time was on the support
| response).
| giobox wrote:
| > What about generating foliage or varied ground textures?
| What about generating buildings? Or NPCs?
|
| This isn't how NVidia DLSS or AMD FSR works at all.
| DLSS/FSR can't create new buildings or foliage, or NPCs, so
| hard to foresee problems of the kind Valve are concerned
| with. Same for varied ground textures- the entire point of
| the technology is to sharpen and upscale the original
| image, or in the case of DLSS 3, inject new matching frames.
|
| The only "risk" to DLSS is in Nvidia's own training data,
| but there is no "risk" of another company's existing IP
| leaking into the final frame - again, if there was, gamers
| wouldn't want to use it, as it would be destroying the original
| frame! If the resulting frame isn't a near perfect match
| for the original, DLSS has failed. Thankfully it does
| nearly perfectly match the original in use and alongside
| AMD's FSR 2.0 stuff has been one of the best advancements
| in gaming technology of recent years - effectively a
| significant FPS boost "for free" on the same hardware.
|
| > https://www.nvidia.com/en-us/geforce/technologies/dlss/
|
| > https://www.amd.com/en/technologies/fidelityfx-super-
| resolut...
|
| While the line may be gray for other AI technologies in
| gaming, such as using it to create new original textures or
| models, DLSS/FSR is _just a really good upscaler_ - no
| "new" content being created and therefore no risk of IP
| infringement. To be really blunt; FSR and DLSS are in
| almost every new game for the last couple of years on both
| PC and console across literally hundreds of games now - if
| there was IP infringement issues, we would know by now - we
| are already onto second/third generations of these
| upscalers.
| bilalq wrote:
| Apologies if I was unclear. I didn't mean to suggest that
| DLSS generates textures. I was raising the question about
| whether or not generated textures would be in violation
| of the policy. And if they're fine, what about generated
| buildings or NPCs? It's fine that DLSS is considered
| compliant. My point is that it's not at all clear where
| we draw the line.
| [deleted]
| [deleted]
| guy98238710 wrote:
| Engineers have created it and lawyers have ruined it. It's
| interesting how whole professions can be inherently constructive
| or inherently destructive.
| a_cardboard_box wrote:
| Or are engineers trying to destroy the livelihood of millions
| of artists, and lawyers are protecting it?
| GreedClarifies wrote:
| Smash the looms! Smash the looms!
| pessimizer wrote:
| Looms weren't trained unwillingly by the people they
| replaced. You're thinking about outsourcing.
| tzekid wrote:
| But they were. Weavers improved on their processes for a
| long time before the engineers swooped in and put that
| accumulated knowledge into an automated form.
| mrweasel wrote:
| Engineers, well companies, created an entire industry without
| any regard for how that might affect artists... again.
|
| Musicians are still being screwed over because engineers wanted
| to change how music is distributed. The goals are noble enough,
| just as with the music, but large corporations inserted
| themselves in the middle to capitalize on the work of the
| artists. I can't fault artists teaming up with lawyers again in
| an attempt to be paid for their work. It didn't really work
| out for the music industry, but hey, what can they do?
|
| As engineers we clearly aren't on the side of the artists; we
| help companies in the middle, not the artists. When developers
| created ChatGPT or Stable Diffusion, did any of the
| developers insist on building in license tracking, to ensure
| that only work in the public domain or under appropriate
| licenses was used, or at least tracked?
|
| We're once again trying to build a new industry, but we don't
| care how that might affect others. It's dumb, it's not like
| there wasn't enough publicly available material, it's just that
| it's cheaper to ignore licensing.
| ronsor wrote:
| Musicians were always screwed over by the music industry, not
| engineers.
| rngname22 wrote:
| This is just legal cover until such time as it's possible to
| enforce no child exploitation imagery, no copyright stuff, etc.
|
| It doesn't matter if they are able to enforce it, Valve can use
| this policy as cover if they ever get sued.
|
| Don't overthink the motivation. They will not even have a
| bulletproof way to detect AI imagery as it evolves every single
| day in an arms race, and detection is a full-time job. Even a
| FAANG or a state actor would need to dedicate team(s) to
| detection technology and still have false negatives.
|
| The same sorts of things already happen, for example, on YouTube
| and Twitch, where some types of content are against TOS or
| copyright but enforcement is sporadic and selective: smaller
| operations often fly under the radar, bigger creators who are
| netting the org sufficient revenue will likely be able to get
| away with more, and the automated tools for detection are
| flawed.
| birdyrooster wrote:
| Going public about your awareness of a problem necessitates an
| enforcement response if you want to be taken seriously.
|
| Imagine you are a trademark holder and someone is using your IP
| but you don't enforce your trademark by litigating. Your claim
| is weakened.
|
| It shows the public and the court how significant this problem
| is for your party.
|
| Edit: copyright -> trademark
| epakai wrote:
| You seem to be confusing copyright and trademark. Copyright
| isn't diminished by non-enforcement. A trademark risks being
| invalidated or genericized when not enforced.
|
| Intellectual property is an encompassing term that seems to
| lead to this sort of confusion.
| birdyrooster wrote:
| Thank you!!
| abejfehr wrote:
| > This is just legal cover until such time as its possible to
| enforce no child exploitation imagery, no copyright stuff, etc.
|
| If the art used in a game violates copyright or contains
| imagery of exploited children, ban it of course, but what does
| that have to do with whether it was generated via AI or created
| in another manner?
|
| If anything AI generated art should be _less_ susceptible to
| copyrighted stuff because everything is original (even if it's
| not in an original style).
| hot_gril wrote:
| Because AI IP law is murky, and that's all Valve cares about.
| MetaWhirledPeas wrote:
| > They will not even have a bulletproof way to detect AI
| imagery as it evolves every single day as an arm's race and
| detection is a full-time job. Even a FAANG or a state actor
| would need to dedicate team(s) to detection technology and
| still have false negatives.
|
| Are people actually trying to detect AI-generated content? That
| would be not only pointless and futile; the threat of false
| positives would also be enormously detrimental to anyone creating
| legitimate work.
|
| It is such a ridiculously bad idea I'm dumbfounded that anyone
| _smart_ would be trying to do it.
| kbelder wrote:
| Yes, multiple teams are working on it, private and public.
|
| >It is such a ridiculously bad idea I'm dumbfounded that
| anyone smart would be trying to do it.
|
| Agree with you there.
| cwkoss wrote:
| Yep. There is a cohort of #noaiart amateur-but-wants-to-be-
| professional artists on Twitter who believe their mediocre
| talents would be paying their expenses if it only wasn't for
| that pesky AI imggen. (Ignoring that a vanishingly small
| proportion of imggen art is replacing commissioned art - most
| is art that would simply never have been made). Like a horde
| of locusts, they will randomly pile onto AI artwork with hate
| comments for a brief period of time before moving onto the
| next one.
|
| People are 'offering their services' where you can DM them a
| link to an image and they'll eyeball it and tell you if its
| made by AI. Laughable hubris, if it wasn't for the inevitable
| ramifications of false positives.
| [deleted]
| slikrick wrote:
| no there isnt
| cwkoss wrote:
| I have first hand experience with them.
| coeneedell wrote:
| Yes people are working on it. The thing you're missing is
| that many of the contexts where the money actually is being
| spent is not really relevant to the public discussion around
| AI generated content. It's more about making sure that nobody
| gets a bank loan using an AI generated voice and face, or
| that people don't get scammed by a deep fake of their
| relatives, or that your government office isn't being slammed
| with subtle propaganda, for instance. The trick to your
| concern is to change your expectations of accuracy. Flagging
| something as fraudulent with an ML model is not treated by these
| systems as proof that it's actually fraudulent.
| oneeyedpigeon wrote:
| I know companies that are trying to do this, yes. The
| detection tools are terrible, but places really do not want
| writers submitting AI-written content, for example.
| MetaWhirledPeas wrote:
| > but places really do not want writers submitting AI-
| written content, for example
|
| And I want the power of flight, but it isn't going to
| happen!
| rngname22 wrote:
| You need to think about different contexts.
|
| Think about security or trust and safety or anti-scam or
| anti-fraud.
|
| AI generated image, video, and audio can be used to
| circumvent a lot of systems used in these domains. Many of
| these domains are for protecting users from being scammed,
| being impersonated, being tracked, etc.
|
| Think about criminal court. Evidence may become inadmissible
| if it can't be proven whether an image or video or audio
| document is a forgery or captured reality.
|
| It's a bit flippant and absurd to insult the intelligence of
| people working on AI detection. I'd be a bit dumbfounded by
| someone dismissing an effort w/o spending time to think about
| why that effort may exist.
| MetaWhirledPeas wrote:
| > Think about security or trust and safety or anti-scam or
| anti-fraud.
|
| It doesn't matter what context I think about it in. It
| isn't going to work! And it will make things worse for
| everyone involved.
|
| Hypothetically let's say we get to a point where everyone
| believes the detection is 100% accurate. Well that's all
| that means: everyone _believes_ it. Meanwhile AI has just
| gotten better, and we're all more fooled than we were
| before. All we are really accomplishing is enhancing the
| training necessary for AI to _elude_ detection.
|
| And there will be an inherent bias toward false positives,
| because high detection rate will be the selling point. The
| truth is secondary, and there's no way to verify the
| results.
| rngname22 wrote:
| It does work. If you absolutely need to know if an image
| is AI generated, you can just have a central authority in
| the system watch the person draw the picture on a piece
| of paper. Or drive to your house and hand you the paper
| and pencil and watch you draw it in person.
|
| There are workflows or system designs that absolutely can
| and will solve for human-verified creation, they just
| might be incredibly costly or unscalable compared to
| existing solutions. It's all just tradeoffs. Might make
| existing business models no longer work. Might open new
| ones.
| MetaWhirledPeas wrote:
| > If you absolutely need to know if an image is AI
| generated, you can just have a central authority in the
| system watch the person draw the picture on a piece of
| paper. Or drive to your house and hand you the paper and
| pencil and watch you draw it in person. There are
| workflows or system designs that absolutely can and will
| solve for human-verified creation, they just might be
| incredibly costly or unscalable compared to existing
| solutions. It's all just tradeoffs. Might make existing
| business models no longer work. Might open new ones.
|
| This is pretty much my point. Like you said, incredibly
| costly and unscalable. A non-solution! We're better off
| not pretending we can compute what is and isn't real.
| rngname22 wrote:
| I don't agree, I think it's just a matter of the right
| mix of distributed trust and creating incentives for
| honesty and penalties for dishonesty. As well as those
| costlier mechanisms for verification to be available for
| a subset of cases.
| permo-w wrote:
| if there's money to be made, then there'll be people who'll
| try and make it. doesn't matter how aware of the philosophy
| or ethics you are. humanist intelligence rarely comes into
| the equation when money is on the table
| Madmallard wrote:
| Strong disapprove. Artificially attempting to slow progress just
| creates a massive power disparity for those who do not care.
| seanw444 wrote:
| If this was politically or ethically motivated, I'd be inclined
| to agree. But it seems this is just a safeguard against
| lawsuits, which I can understand at least.
| freedomben wrote:
| > _Artificially attempting to slow progress_
|
| Why do you think Valve is just trying to slow progress? Don't
| they win when people on their store win?
|
| It seems more likely to me that this is CYA against the major
| lawsuits that are happening right now from copyright holders.
| Madmallard wrote:
| Ah yeah that is probably right. Chinese indie industry boom?
| zzzzzzzza wrote:
| misleading title, hn should ban reddit links
| Der_Einzige wrote:
| Good luck enforcing this. You can generate textures all day with
| no evidence that they were AI generated.
| justahuman74 wrote:
| Depends if the game publisher is willing to run the risk of
| their game getting yanked from sale
| pessimizer wrote:
| You can also ban games from your platform for even the vaguest
| suspicion that they contain AI generated assets, or probably
| even for complaining about the policy.
| [deleted]
| raincole wrote:
| Random reddit anecdote.
|
| And from 23 days ago.
|
| AND misleading clickbait title.
| scohesc wrote:
| How would you be able to know if something is AI generated if
| it's not outright stated in the product description?
|
| "Yes, I intentionally designed the static image of this man to
| have 5 and a half fingers on one hand with a distorted logo on
| their t-shirt, please allow this game, Valve."
|
| How can you prove that something is AI generated? Would creating
| graphics in Adobe's photoshop AI filler tool count as AI-
| generated content to Valve, or is Adobe's AI data-set using
| copyright-free graphics?
|
| I wonder if this is Valve trying to also somewhat cater to or
| attract artists on the platform, as I'm sure artists are against
| AI on the grounds that it'd "steal their jobs/hamper creativity".
| mcintyre1994 wrote:
| I think the idea is that if someone gets sued for the AI art in
| their game, Steam plans to point to their terms of service and
| say the legalese equivalent of "they promised us that it didn't
| have AI art, if they didn't lie to us we wouldn't have hosted
| their game", and not also get sued.
| bugglebeetle wrote:
| Seems entirely reasonable. Just because Stability (who seems to
| be crashing and burning) decided to try and do a Napster for
| image generation doesn't mean everyone else should run into the
| lawsuit abyss alongside them.
| seydor wrote:
| good news for their competitors
| bogwog wrote:
| Not really, or at least, not yet. If Epic allows AI generated
| content, it will just attract devs that use AI generated
| content. I think those are more likely to be shovelware garbage
| today than anything else.
|
| Until there's a killer/must-have game built with AI content, I
| don't think this is going to have much of a noticeable impact.
| paulmd wrote:
| https://i.imgur.com/JWjLFlO.png
|
| now that's how you know when a comments section is gonna be
| amazing
| ccheney wrote:
| Seems shortsighted and overly limiting to me. Perhaps in this
| specific case it makes sense?
|
| What's the difference?
|
| A) Human creates artwork in the style of [insert artist here]
|
| B) Computer creates artwork in the style of [insert artist here]
|
| Both "trained" against existing copyrighted works except one is
| human. Is this to "save jobs"?
| [deleted]
| alexdeloy wrote:
| After seeing that Unity Muse[0] AI presentation yesterday and the
| following backlash regarding the source material[1], this seems
| to be a huge legal minefield to be solved first.
|
| [0]: https://www.youtube.com/watch?v=dR4IuN2tF78
|
| [1]: https://nitter.net/unitygames/status/1673650585860489217
| slongfield wrote:
| This doesn't seem to have anything really related to the AI-
| generation of the graphics--it's 100% about copyright. The
| statement from Valve even says that explicitly: if this user had
| owned the copyright to the training data, they would have been
| fine using the AI generated graphics and text.
| jasonjmcghee wrote:
| Why editorialize?
|
| "Valve is not willing to publish games with AI generated content
| anymore"
|
| Your title changes the meaning- they didn't ban games afaict.
|
| It's also a misleading post, as it's specifically GenAI where
| authors can't prove or don't have rights to content.
|
| If you use ProcGen etc or have full rights to the data used, I
| can't imagine there would be any issues.
| dwringer wrote:
| > it's specifically GenAI where authors can't prove or don't
| have rights to content.
|
| Even more specifically, the author admitted to the images being
| "obviously AI generated" and Valve alleges that the images
| themselves in the game's initial submission contained
| copyrighted third-party content.
| i_like_apis wrote:
| Yeah this seems like the title should be edited. @dang (not
| sure how to get his attention)
| dang wrote:
| @dang is a no-op. The only way to get reliable message
| delivery is to email hn@ycombinator.com. Fortunately someone
| did that. I'll take a look at the title situation now.
| [deleted]
| dang wrote:
| Thanks. The submitted title ("Valve bans games using AI
| generated graphics or text from Steam") broke the HN
| guidelines: " _Please use the original title, unless it is
| misleading or linkbait; don't editorialize._"
|
| Even the original title seems questionable until properly
| substantiated, so I've reverted to it plus tacked on a question
| mark.
| Wouter33 wrote:
| No bad intentions with the title, was just using "banned"
| since Valve used it in their response to the Reddit poster.
| Change it to whatever you think is better! :)
| dang wrote:
| I haven't followed any of the details so you could well be
| right (in which case, sorry!)
| schnebbau wrote:
| [flagged]
| gcampos wrote:
| > we cannot ship your game while it contains these AI-generated
| assets, unless you can affirmatively confirm that you own the
| rights to all of the IP used in the data set that trained the
| AI to create the assets in your game.
|
| Yep, you are absolutely right.
| hospitalJail wrote:
| Beginning of the end for Valve. This is a warning shot. Might
| want to stop buying games on steam sales.
| fyrn_ wrote:
| Title is very misleading for something for which the only
| evidence is an anecdote from reddit. Was expecting a statement
| from Valve based on the title.
| floomk wrote:
| So using Adobe Firefly is fine, since they only trained on data
| they had the rights to?
| acomjean wrote:
| Yes,
|
| You can assert you own or have the rights to those images,
| based on your license with Adobe.
| GaggiX wrote:
| How do you prove that the images were generated using Adobe
| Firefly instead of SD or MJ?
| delecti wrote:
| They don't seem to be asking for proof.
|
| > we are declining to distribute your game since it's
| unclear if the underlying AI tech used to create the assets
| has sufficient rights to the training data
|
| So it seems that asserting "assets X and Y were generated
| by tool Z that has rights to its training data" would be
| sufficient. Presumably AI tools will also start to
| formalize that declaration alongside their terms of
| service.
| GaggiX wrote:
| Are there similar things where you have to declare
| something that is in no way provable in game development?
| It feels kinda silly.
| dragonwriter wrote:
| Yes, it's fairly normal for distribution platforms to
| require you to declare that you have the rights for
| _everything_ in your game (which is in no way provable).
| Extending it to everything in the training set(s) for the
| model(s) used for generative AI is ludicrous, but it
| just amounts to the same thing plus the (almost certainly
| false) assumption that every work produced by AI
| generation is legally a derivative work of every work in
| the training set(s) of the AI model(s) used.
| GaggiX wrote:
| >which is in no way provable
|
| I guess in that sense it's not provable by the platform,
| but it is provable by the actual copyright owner if
| you're infringing one.
|
| What I meant before is that no human can know whether you
| used Firefly or SD or MJ or some other custom model.
| preommr wrote:
| Sort of - the person uploading to Adobe Stock can upload
| copyrighted material. Adobe will handle copyright claims and be
| liable for up to $10k worth of damages.
| bastardoperator wrote:
| Only 10K?
| add-sub-mul-div wrote:
| We're on the cusp of a profound content shovelware crisis. It
| will happen regardless, but any oasis of real content will become
| important.
|
| Automation will be a force multiplier for laziness and predation
| more so than for creativity.
| com2kid wrote:
| We are also on the cusp of individual developers being able to
| produce works that used to take entire teams.
|
| I'm working on simulating a small town using Generative AI
| agents, schedules, social interactions, realistic reactions to
| outside events, dialogue between characters, the whole shebang.
|
| A year ago that wasn't an "after work side project".
|
| I just did a full launch on
| https://www.generativestorytelling.ai/ - a side project that
| was only possible because of AI help. Between art assets and
| also coding in brand new areas that I hadn't used before, AIs
| are an obscene boost to what individuals can do.
|
| The price and complexity of software development projects have
| been increasing for years now; AI is a huge reset on the amount
| of effort needed to make stuff.
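|
| To make the town simulation concrete: each agent "tick" is
| roughly a prompt built from the agent's persona, the time of
| day, and its recent memories, sent to an LLM, with the reply
| fed back in as a new memory. A minimal sketch (hypothetical
| names, assuming the pre-1.0 OpenAI Python client):
|
|     import openai
|
|     def next_action(name, persona, time_of_day, nearby, memories):
|         # Build a prompt from persona, time and recent memories,
|         # then ask the model for the agent's next action.
|         prompt = (
|             f"You are {name}, {persona}. It is {time_of_day}. "
|             f"Nearby: {', '.join(nearby)}. "
|             f"Recent memories: {'; '.join(memories[-5:])}. "
|             "In one sentence, what do you do next?"
|         )
|         resp = openai.ChatCompletion.create(
|             model="gpt-3.5-turbo",
|             messages=[{"role": "user", "content": prompt}],
|         )
|         action = resp["choices"][0]["message"]["content"]
|         memories.append(action)  # the action becomes a new memory
|         return action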
| xk_id wrote:
| > AIs are an obscene boost to what individuals can do.
|
| Depends what the individual is trying to do. Making memes?
| Blog spam? Sure. But for non-trivial content I haven't seen
| an example that was compelling.
| asveikau wrote:
| I think game art for programmers or people with a game
| design idea but no visual arts chops is a very fitting use.
| As would be generating text for such a use, like dialog for
| an NPC.
| com2kid wrote:
| > But for non-trivial content I haven't seen an example
| that was compelling.
|
| ChatGPT helped me write my websocket code, I'd never used
| websockets before and it saved me hours (if not more) of
| time learning a new API.
|
| I had a concurrency bug in some code, I threw it at GPT4
| and asked it what the problem was, a few seconds later it
| spat out a solution.
|
| I had some complex state management code: "Hi this code has
| an off by 1 error in it, can you find it?"
|
| "My page demonstrates this visual problem, here is the CSS
| file for the page, what is wrong?"
|
| The color picker component used on my above linked site was
| 80% written by ChatGPT4.
|
| AI is a huge productivity booster.
|
| Heck I use it to steel man opposing views of my blog posts
| to try and make sure I have sound arguments.
| [deleted]
| schroeding wrote:
| In this context: Generative ML models allow e.g. a single
| motivated writer with almost no budget to make a Visual
| Novel which they then could publish on Steam (before the
| policy change) for the world to see.
|
| Write the script yourself, generate and curate 2D art
| assets, optionally generate and curate your OST / BGM,
| optionally generate and curate voice lines, put everything
| into Ren'Py. Done.
|
| It's still very much not easy if you do not want to make
| shovelware, but it's _possible_ now for a sole developer
| (or very small team) with no great artistic and musical
| talent.
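|
| The asset-generation step itself is tiny these days - a minimal
| sketch using the diffusers library (hypothetical prompt and
| filename, and note that SD 1.5 is exactly the kind of base model
| whose training data Valve is asking about):
|
|     from diffusers import StableDiffusionPipeline
|
|     # Generate one visual-novel background, then curate by hand.
|     pipe = StableDiffusionPipeline.from_pretrained(
|         "runwayml/stable-diffusion-v1-5"
|     )
|     image = pipe("watercolor town square, visual novel bg").images[0]
|     image.save("bg_town.png")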
| minimaxir wrote:
| Massive amounts of shovelware on Steam is most definitely not a
| new phenomenon.
| imdsm wrote:
| Amazon is more like eBay than eBay now
|
| Steam is like eBay but for low quality games
|
| It happens to them all. There are big games on steam, but 95%
| of the stuff on there is low value, low cost content
| SV_BubbleTime wrote:
| >We're on the cusp of a profound content shovelware crisis.
|
| Everywhere. Games, porn, text, articles, music, etc. For the
| generation that grew up with the internet already existing,
| this is their epoch moment, lives pre and post generative AIs.
|
| I was thinking the other day that original artworks are going
| to be far more valued amid a glut of AI-generated work.
|
| Thinking slightly ahead, you can find your absolute favorite
| artist, and in seconds use their style you love so much to make
| the family portrait you would never be able to commission them
| to do. But going forward even more...
|
| It's just not the same as a print right? Well, thanks to AI
| being able to learn and determine where brush strokes would
| land, we take advancements from 3D printers and your desktop
| painting rig picks up a brush and paints it just as the artist
| would have.
|
| Then going forward even more.. the artist himself needs some
| cash and knocks out 100 of these customs while they sleep,
| signs them, and now they are originals, sort of.
|
| So... verifiable originals are going to be the hot thing. A
| painting with a video of the artist painting it... but not an
| AI generated video of course!
|
| Maybe the artist will have to print it on location while you
| watch.
| draugadrotten wrote:
| > Maybe the artist will have to print it on location while
| you watch.
|
| For the music industry, this happened already. Artists makes
| a lot more money from live performances than from album sales
| or streaming fees.
| pengaru wrote:
| You must have never seen the original Blade Runner if you
| find this line of thinking original...
|
| "you think I'd be working in a place like this if I could
| afford a real snake?!?"
| xsmasher wrote:
| To me that implies that real animals are scarce; the price
| is not related to any kind of "artistic realness."
|
| If they said that hand-crafted robotic snakes were more
| expensive than run-of-the-mill live-bred snakes, that would
| support your point about artistic realness.
| crazygringo wrote:
| > _We 're on the cusp of a profound content shovelware crisis._
|
| No, we've been inundated with low-quality content for decades
| now. This is nothing new.
|
| Fortunately, ratings and reviews and popularity have always
| been an extremely effective antidote.
|
| Even if 99% of stuff on a platform is total crap, nobody cares.
| It's a non-issue. Whether you're talking about music, books, TV
| shows, or whatever. The 1% rises to the top and you don't
| honestly need to pay attention to the rest.
|
| If you choose to pay $20 for something that has 3 reviews that
| are all 1-star, then that's more your problem than the system's
| problem.
| add-sub-mul-div wrote:
| > No, we've been inundated with low-quality content for
| decades now. This is nothing new.
|
| The problem with this argument is that scale matters.
|
| When a small number of people were trading mp3 files on FTP
| servers it was not seen as a problem. When Napster came out,
| it was seen as a problem. It was correctly seen as a
| qualitative shift in the effect it would have on society.
| oehpr wrote:
| I have to agree reviews work very well. But it distresses me
| that, knowing the absolute tsunami of garbage that awaits us
| without it, we are SO laissez faire when it comes to
| protecting that system. We allow companies to game reviews
| with kickbacks, and we let spammers post fake reviews.
| This is currently our one wall of defense and cracks in it
| should terrify us.
| asdff wrote:
| >No, we've been inundated with low-quality content for
| decades now. This is nothing new.
|
| Trends matter too. If it gets to be an order of magnitude
| cheaper to shill your products on social media and forums,
| set up SEO crap articles to phish users from search results,
| or churn out good old fashioned email scams, then expect an
| order of magnitude worse signal to noise ratio on the
| internet as a result. There could be a point reached where
| the internet is functionally broken, with the signal to noise
| ratio too low to make it useful for anything, save for
| navigating directly to known good hosts that themselves will
| become increasingly more lucrative targets for
| enshittification.
| TOMDM wrote:
| This has been an issue on Steam for ages, people call them
| "asset flip" games, because someone can buy a few $10 asset
| packs and piece together a game out of them.
|
| AI generated content is not meaningfully better or worse than
| these low effort games, though taking the time to generate
| passable content with AI is probably a lot more effort than
| just using $50 worth of assets that are already packaged up for
| unity.
| newobj wrote:
| Asset store is still a more effective tool to create shovelware
| than AI. It's been this way for years. Recommendation systems
| (digital or otherwise) are already coping with it.
| jncfhnb wrote:
| Outlast Trials appears to be a fairly large title that utilized
| AI art. I sincerely doubt it'll get axed.
| TheCaptain4815 wrote:
| I wonder if they'll do the same for AI generated text? Why
| shouldn't they, honestly? I could easily fine-tune an LLM on the
| writings of a certain author or maybe the content of Mass Effect
| 1-3 and have the outputs be similar.
| FloatArtifact wrote:
| Basically valve is saying no AI generated content. The premise is
| that all AI generated content violates copyright which isn't
| necessarily true. To me, it sounds like they're sidestepping a
| potential issue rather than an actual issue with a copyright
| holder complaining of infringement with a particular IP.
| rileymat2 wrote:
| They are saying no AI content trained on copyrighted work you
| don't own. Thats a very different framing.
| FloatArtifact wrote:
| Yes, I can see that now, but that seems too broad.
| Hypothetically, if a model is trained on a thousand images, of
| which only 10 are copyrighted, does that mean all generated
| images violate copyright law...?
| LinuxBender wrote:
| I do not believe this will be limited to Valve. I expect more
| companies to start _covering their backside_ by implementing
| similar rules to avoid copyright lawsuits. I can 't say I would
| blame them either. I am not a lawyer but I think one of the risks
| is that LLMs do not show their work, so proving where something
| came from is likely to end up in court after a lot of expensive
| discovery is performed.
| charles_f wrote:
| What sounds very weird to me, is that I doubt Valve is verifying
| the copyright for all the graphics and text you submit. Why would
| they reject something because "it looks AI generated"? The
| potential for legal hazard is probably less on these re-mashed
| works than on purely copy-pasted content.
| deskamess wrote:
| Interesting... what would an AI trained only on the
| Commons/public domain be like? Would it be a clean source for new
| images? And would new images need to inherit a public/Commons
| license (GPL style)?
| Kuinox wrote:
| > And would new images need to inherit a public/Commons
|
| Well first we need to know if using images for AI training can
| be considered fair use.
| Kiro wrote:
| I thought Adobe Firefly did that.
| xsmasher wrote:
| And some non-public-domain images of their own.
|
| > The current Firefly generative AI model is trained on a
| dataset of Adobe Stock, along with openly licensed work and
| public domain content where copyright has expired.
| Havoc wrote:
| >I improved those pieces by hand, so there were no longer any
| obvious signs of AI
|
| Steam's objection is to other parties' copyrighted material,
| even indirectly in the AI training dataset, and they want it
| _removed_, not concealed better.
|
| Tricky copyright questions aside, inability to follow basic
| instructions is definitely a disadvantage when going through
| approval processes
| cinntaile wrote:
| This opens up one hell of an opportunity for Epic or a startup.
| add-sub-mul-div wrote:
| As I understand it, Epic already has a smaller and more curated
| catalog by intention. They're already trying to keep out the
| tens of thousands of low quality and hobbyist titles.
| al_be_back wrote:
| Copyright issues aside, a platform has to consider 'spam': AI
| generated content could quickly and easily overwhelm a platform.
| samstave wrote:
| We need an "AI generated web game tower defense" FULL FN STOP.
| euix wrote:
| Midjourney has been really helpful to me as a one man dev. I can
| mock up art much faster than I can in Photoshop. I still
| intend to at some point do a complete pass using a professional
| artist (or learn to draw myself) - because generative art is not
| consistent thematically from asset to asset. But if I just want
| to see what my tile assets look like if they were all done in
| 30's art deco style, I can do it in 20 minutes.
|
| As placeholders or to create little bits and doodles (like a
| mouse cursor in the style of an armored fist), there are lots of
| little graphical icons in a game that would otherwise have to be
| created by a graphical artist. Generative art is really useful in
| my experience.
|
| It's reduced the work to the point where I can toy with it in my
| off time and spend most of my effort in the actual programming
| and development.
|
| The other idea I have toyed with, coming from professional ML
| experience - was to build my own generative model and use it to
| create my own art assets. Here I wonder how the copyright rules
| would work - would the assets I train on be subject to
| copyright? This is a much bigger conversation at that point and I
| won't be the only one affected.
| kitsunesoba wrote:
| Yeah I don't see too much issue in using generative art for
| more trivial things, like some banner art to sit atop a
| blogpost or something. Placeholders also seem like a really
| good application, particularly for cases where the
| randomization might expose issues that real users would face. I
| wouldn't have spent money on these things anyway.
|
| Direct incorporation of generative art into a commercial
| product is much more murky.
| c-hendricks wrote:
| And no one is trying to take that away from you. You are using
| it as a tool, and intend to pay someone to create the final
| version, or do it yourself.
|
| The issue people have is when you just use a dataset trained on
| someone else's work and pass it off as your own, and in the
| case of Steam games, most likely profit from it.
| valine wrote:
| What if I look at other people's art and learn from it? Seem
| unfair to pass that work off as my own. All artists should be
| banned from looking at copyrighted images, we can't risk them
| incorporating copyrighted elements into their own work. /s
| Jolter wrote:
| That's not a good analogy and you should know it.
| valine wrote:
| It's a terrific analogy. The alternative is to believe
| that a 5GB model somehow contains a database of 160
| million images.
| TillE wrote:
| It's a fine _analogy_ , but the map is not the territory.
| Machine learning is not human learning, even if it works
| in a vaguely comparable way.
|
| It's still a computer program that uses an enormous
| amount of copyrighted work as its input.
| greysphere wrote:
| "With four parameters I can fit an elephant, and with
| five I can make him wiggle his trunk"
|
| It seems like you could calculate how much data is within
| X% error of a 5GB model, and what X% should be for
| 'visual data'.
|
| I bet it's pretty big.
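|
| The really rough version of that calculation, taking the 5GB and
| 160 million image figures from above (it ignores compression
| tricks, but it shows the order of magnitude):
|
|     # Bytes of model capacity per training image, assuming a
|     # 5 GB model and the ~160 million training images above.
|     model_bytes = 5 * 10**9
|     num_images = 160_000_000
|     print(model_bytes / num_images)  # ~31 bytes per image
|
| A few dozen bytes per image is nowhere near enough to store even
| a thumbnail of each training image.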
| pessimizer wrote:
| Not only does this keep them safe from copyright fallout, I think
| its real goal is to hold back a flood of shit games until the
| tech matures.
| tuckerpo wrote:
| I imagine this is a stopgap measure until there's more concrete
| legislation in place for AI generated IP.
| newobj wrote:
| Fake story, guaranteed
___________________________________________________________________
(page generated 2023-06-29 23:02 UTC)