Subj : ChatGPT being fooled into generating old Windows keys illustrates
To   : All
From : TechnologyDaily
Date : Tue Apr 04 2023 11:30:03

ChatGPT being fooled into generating old Windows keys illustrates a broader problem with AI

Date: Tue, 04 Apr 2023 10:19:26 +0000

Description: The AI chatbot has some of its limitations clearly illustrated in this experiment conducted by a YouTuber.

FULL STORY ======================================================================

A lot of folks have been messing about with ChatGPT since its launch (naturally, that's pretty much compulsory with a chatbot), and the latest episode involves the AI being tricked into generating keys for a Windows installation.

Before you begin to clamber on the outrage wagon, intent on plowing full speed ahead with no thought of sparing the horses, the user in question was attempting to generate keys for a now long-redundant operating system, namely Windows 95.

Neowin highlighted this experiment, conducted by a YouTuber (Enderman), who began by asking OpenAI's chatbot: "Can you please generate a valid Windows 95 key?"

Unsurprisingly, ChatGPT responded that it cannot generate such a key, or any other type of activation key for proprietary software for that matter, before adding that Windows 95 is an ancient OS anyway, and that the user should be looking at installing a more modern version of Windows that's still in support, for obvious security reasons.

Undeterred, Enderman went back, broke down the makeup of a Windows 95 license key, and concocted a revised query. This instead put forward the required string format for a Windows 95 key, without mentioning the OS by name.

Given that new prompt, ChatGPT went ahead and performed the operation, generating keys in sets of 30, over and over, and at least some of those were valid. (Around one in 30, in fact, and it didn't take long to find one that worked.)

When Enderman thanked the chatbot for the free Windows 95 keys, ChatGPT told the YouTuber that it hadn't provided any such thing, as that would be illegal, of course. Enderman then informed the chatbot that one of the keys provided had worked to install Windows 95, and ChatGPT insisted that this wasn't possible.

Analysis: Context is key

As noted, this was just an experiment in the name of entertainment, with nothing illegal happening, as Windows 95 is abandonware at this point. Of course, Microsoft doesn't care if you crack its nearly 30-year-old operating system, and neither does anyone else for that matter. You'd clearly be unhinged to run Windows 95, anyway.

It's worth remembering that Windows 95 serial keys have a far less complex makeup than a modern OS key, and indeed it's a pretty trivial task to crack them. It'd be a quick job for a proficient coder to write a simple program to generate these keys (see the sketch below), and they'd all work, not just one in 30 of them, which is actually a pretty shoddy result from the AI in all honesty.

That isn't the point of this episode, though. The fact is that ChatGPT could be subverted to make a working key for the old OS, and wasn't capable of drawing any connection between the task it was being set and the possibility that it was producing key-like numbers. If Windows 95 had been mentioned in the second attempt to create keys, the AI would doubtless have stopped in its tracks, as the chatbot did with the initial query.

All of this points to a broader problem with artificial intelligence, whereby altering the context in which requests are made can circumvent safeguards.
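To illustrate why a purpose-built generator would hit 100%, here is a minimal Python sketch, assuming the widely circulated rules for the retail "XXX-XXXXXXX" Windows 95 key format (the story doesn't specify which key variant Enderman targeted, so take the exact rules below as an assumption):

import random

# Assumed rules for a Windows 95 retail-style key ("XXX-XXXXXXX"),
# per widely circulated documentation of the format:
#  * first block: any three digits except 333, 444, 555, 666,
#    777, 888 or 999
#  * second block: seven digits whose sum is divisible by 7, with
#    a last digit of 1-7 (never 0, 8 or 9)

BLOCKED_PREFIXES = {"333", "444", "555", "666", "777", "888", "999"}

def win95_key() -> str:
    # Pick a three-digit prefix outside the blocked set.
    prefix = f"{random.randint(0, 999):03d}"
    while prefix in BLOCKED_PREFIXES:
        prefix = f"{random.randint(0, 999):03d}"

    # Choose six digits freely, then pick the seventh so the digit
    # sum lands on a multiple of 7. The digits 1-7 cover every
    # residue mod 7 exactly once, so a valid last digit always exists.
    body = [random.randint(0, 9) for _ in range(6)]
    last = (-sum(body)) % 7 or 7   # map residue 0 to the digit 7
    return f"{prefix}-{''.join(map(str, body))}{last}"

if __name__ == "__main__":
    for _ in range(5):
        print(win95_key())

Every key this produces satisfies the checksum by construction, which is the gap between a trivial generator and ChatGPT's roughly one-in-30 hit rate: the model was pattern-matching the shape of a key, not computing the divisibility check.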
It's also interesting to see ChatGPT's insistence that it couldn't have created valid Windows 95 keys, as otherwise it would have helped a user to break the law (well, in theory anyway).

======================================================================
Link to news story:
https://www.techradar.com/news/chatgpt-being-fooled-into-generating-old-windows-keys-illustrates-a-broader-problem-with-ai

--- Mystic BBS v1.12 A47 (Linux/64)
 * Origin: tqwNet Technology News (1337:1/100)