AI's Consent Problem
Date: 2024-05-22

OpenAI's newest chatbot has a voice that sounds suspiciously like that of actor Scarlett Johansson. This was clearly intentional — Sam Altman, the head of OpenAI, tweeted "her," a reference to the film Her, in which Johansson plays an AI that several people fall in love with. It turns out, however, that Johansson herself had denied OpenAI the use of her voice. They now claim, after changing the voice, that it was done by an imitator, but Johansson isn't happy. Nor should she be: AI companies clearly have a problem with consent.

As Johansson makes clear, Altman approached her twice about using her voice in the new product. The first time, she turned him down cleanly. The second time, the product launched just two days after the request, leaving no time for a serious answer. Even if the voice was done by an impersonator — something that seems unlikely given the history and the sudden plea for permission two days before launch — they still went ahead and did everything they could to use her voice without her permission and to make it seem as if their product was following in her footsteps. It is, at best, a scummy move and, at worst, a direct theft of her likeness.

This is, of course, not the first time OpenAI has been caught with its hand in the cookie jar. They have openly stated that they cannot create their products unless they are allowed to ignore copyright law. They also appear to have trained their latest video tool on YouTube videos but don't want to admit the appropriation. Consent is clearly not something OpenAI values.

It is not just OpenAI that has this problem, however. Google's CEO, Sundar Pichai, recently sat down for an interview with The Verge to discuss the company's new AI products. It is not an especially hard-hitting interview (Patel, the interviewer, is not known for pressing tech execs especially hard), but it does have a couple of interesting points; I want to focus on Pichai's allergy to consent for now. Pichai dodges the question when pressed on how his AI systems take from artists in the same way that YouTube had its material taken by OpenAI (something Pichai alludes to in the interview):

    That's really the thing I'm asking about: how do you bring value back to them? How do you bring incentives back to the small creator or the independent business that's saying, "Look, this feels like a taking."

    Look. [Sighs] The whole reason we've been successful on platforms like YouTube is we have worked hard to answer this question. You'll continue to see us dig deep about how to do this well. And I think the players who end up doing better here will have more winning strategies over time. I genuinely believe that. Across everything we do, we have to sort that out. Anytime you're running a platform, it's the basis on which you can build a sustainable long-term platform. Through this AI moment, over time, there'll be players who will do better by the content creators that support their platforms, and whoever does it better will emerge as the winner. I believe that to be a tenet of these things over time.
Note that Pichai doesn't say that his AI isn't being trained on artists' work without their permission, nor does he claim that he will stop doing it, nor does he promise to fairly compensate artists for the work taken from them. He merely says that whoever makes artists like them best will win out. He doesn't have to run fast enough to escape the bear that is artists' anger at AI theft — he merely has to run faster than his competitors.

This attitude likely exists because AI is not making a profit now, and it may never. These products are expensive, and they do not work well. The Sora demo, for example, required extensive human intervention to even begin to approach usability. And of course, hallucinations are a constant and likely unsolvable problem for imitative AI. Not to mention the out-and-out plagiarism.

Google sees the value that artists' work provides — Pichai's defense of YouTube's terms of service makes that clear, as does the fact that these companies are striking deals with newsrooms, Slack, and Reddit to use their content in AI training sets. The hypocrisy is interesting only insofar as it proves that they know they are taking value from people without paying for it. They are, again, bad at consent.

And that should tell us a lot about the AI boomlet. It is not going to be about creating value for anyone other than the companies involved. If it were, they would not be taking from artists; they would be partnering with them, as they partner with the deep-pocketed companies that can afford to push back on the taking.

It also tells us that either the "market" for AI is completely broken or there is likely to be far less money in it than the hype wants us to believe. If there were so much value to be created, surely one company would make itself look great and solve some legal headaches by sharing the wealth with the people it takes from in order to train its models. That no one is even suggesting this, and that the big players instead argue that copyright should be abolished in the case of AI training data, is quite telling.

"Watch what people do, not what they say" is always good advice. And in this instance, what the AI companies are doing suggests very strongly that they don't know how to get to profitability with imitative AI, at least not at the levels the hype requires, and that they aren't interested in making artists' lives easier — just in taking from artists until there is nothing left to take.

Want more oddities like this? You can follow my RSS feed or free newsletter.