[HN Gopher] GPT-3 can create both sides of an Interactive Fictio...
___________________________________________________________________
GPT-3 can create both sides of an Interactive Fiction transcript
Author : raldi
Score : 77 points
Date   : 2022-10-24 16:19 UTC (1 day ago)
(HTM) web link (old.reddit.com)
(TXT) w3m dump (old.reddit.com)
| joshuahedlund wrote:
| Is there a Stable-Diffusion-esque open-source GPT yet? Given the
| incredible pace of advances in the image space this year, and my
| (perhaps naive) assumption that text generation is less complex
| and less resource-intensive than image generation, I'm hoping
| we'll get something similar, and am surprised that we haven't yet.
| visarga wrote:
| Google recently released the Flan-T5 models, which are very
| efficient; they rival GPT-3 and can run on a normal desktop.
| You can try them out.
|
| https://huggingface.co/google/flan-t5-xxl?text=Translate+to+...
| davnn wrote:
| There are multiple open-source GPTs, but GPT-3 is absolutely
| massive - larger than the image models actually! So,
| unfortunately, text generation is probably even more complex
| and resource intensive than image generation (especially to
| train). Additionally, in image generation, we appreciate the
| creativity of solutions, but in text generation the creative
| solutions seem like utter nonsense.
| joshuahedlund wrote:
| I guess my intuition is based on the file size of text being
| so much smaller than that of images, but I suppose that
| doesn't really map to the complexity of generating it.
| Fascinating!
| davnn wrote:
| I think large language models are still in their infancy.
| The models are extremely sparse, but we don't have the
| tooling yet to deal with these kinds of structures
| efficiently. Your intuition might be right in the future,
| maybe.
| drusepth wrote:
| If you think about the space both models are covering from
| a rate-of-failure perspective, it kind of makes sense that
| images end up being a bit easier than text: both text- and
| image-models can output results that look plausible at
| first glance, but when you analyze both outputs further
| there are a _lot_ more gotchas in parsing meaning within
| language than there are in pixel placement within an image.
| spijdar wrote:
| There is, called "GPT-NeoX", or its TPU-based predecessor GPT-
| Neo. However, even running inference on these models is much,
| much harder than Stable Diffusion -- the GPT-NeoX-20B weights
| require a minimum of two GPUs with 24 GB of VRAM _each_ simply
| to run inference, never mind training or fine-tuning.
|
| I believe there are some tricks for cutting down the VRAM
| requirements a bit by dropping precision at different points,
| but the gist is that these big text models are actually quite a
| bit _more_ resource intensive than the image models.
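As a sanity check on those numbers, the weights-only footprint is just parameter count times bytes per parameter. A minimal sketch (a rough estimate that ignores activations, KV caches, and framework overhead, so real inference needs somewhat more):

```python
def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight-storage footprint of a model, in GiB.

    Ignores activations, optimizer state, and runtime overhead,
    so actual inference needs somewhat more than this.
    """
    return n_params * bytes_per_param / 1024**3

# GPT-NeoX-20B at different precisions (weights only):
fp16 = model_memory_gb(20e9, 2)  # ~37 GiB -> spills past one 24 GB card
int8 = model_memory_gb(20e9, 1)  # ~19 GiB with 8-bit quantization
print(f"fp16: {fp16:.1f} GiB, int8: {int8:.1f} GiB")
```

At fp16 the 20B weights alone are about 37 GiB, which already exceeds a single 24 GB card and explains the two-GPU minimum; dropping precision to 8-bit roughly halves that, which is the VRAM trick mentioned above.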
| tinco wrote:
| Interesting that they're saying they have a target of
| replicating GPT-3 with 175B parameters, but since the RETRO
| paper it should (in theory?) be possible to replicate GPT-3
| with a tenth of the parameters. Are they not planning to use
| a retrieval system, is it difficult to adapt their system for
| retrieval, or is their readme out of date?
| jncraton wrote:
| There are smaller models that you can freely play around with
| that work in roughly the same way. If you're working on a
| fairly regular computer some reasonable options are GPT2 or
| GPT-Neo. These can both perform inference on your local CPU if
| you have 8GB or more of RAM.
|
| They are much less powerful than GPT-3, but they can still be
| fun for simple text generation or NLP tasks. You can play
| around with one of the smaller GPT-Neo models that should fit
| in RAM if run locally here:
|
| https://huggingface.co/EleutherAI/gpt-neo-1.3B
|
| That page includes instructions to run this locally in Python.
|
| As others mentioned, there are larger models available, but
| they tend to be expensive to set up and use as an individual.
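A minimal local-inference sketch along those lines, assuming the Hugging Face `transformers` library is installed (the prompt-framing helper and scene text are illustrative assumptions, not from the model card; the first run downloads several GB of weights):

```python
def build_if_prompt(scene: str, command: str) -> str:
    """Frame a scene description plus a player command as an
    interactive-fiction transcript, so the model continues in
    the same transcript style."""
    return f"{scene}\n\n> {command.upper()}\n\n"

if __name__ == "__main__":
    # Heavy part kept out of the helper: downloads ~5 GB of weights
    # and needs roughly 8 GB of RAM for CPU inference.
    from transformers import pipeline

    generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
    prompt = build_if_prompt("You are standing in a dusty parlor.",
                             "open the book")
    print(generator(prompt, max_new_tokens=60)[0]["generated_text"])
```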
| gamegoblin wrote:
| Emad Mostaque, CEO of Stability AI (behind Stable Diffusion)
| has said they are training language models that they plan on
| releasing. IIRC, though, even using all the most up-to-date
| techniques like 1-byte parameter quantization and Chinchilla
| scaling learnings, it will still probably be on the order of
| 64 GB, so not quite usable for most people locally yet, unless
| you happen to have a very beefy multi-GPU machine at home.
| ctoth wrote:
| GLM-130b is available and uses 8-bit quantization, was
| trained on 400 billion tokens, and runs on 4x3090, which, of
| course, is why all the cheap Vast.AI instances are gone :)
| goodside wrote:
| I recorded a demo of this same premise here:
| https://twitter.com/goodside/status/1562613028927205377
|
| Text completions of exotic forms of session/action logs are a
| seriously under-explored area. Here's what happens if, instead of
| a text game, you do text completion on an IPython REPL:
| https://twitter.com/goodside/status/1581805503897735168
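A sketch of that REPL-completion idea against the (late-2022, pre-1.0 SDK) OpenAI completion endpoint; the `repl_prompt` helper, model choice, and stop sequence here are assumptions for illustration, not taken from the linked tweet:

```python
def repl_prompt(history, new_input):
    """Render completed (input, output) pairs plus one fresh input as an
    IPython-style session log, ending exactly where the model should
    supply the next output -- i.e. the model 'plays' the interpreter."""
    lines = []
    n = 1
    for inp, out in history:
        lines.append(f"In [{n}]: {inp}")
        lines.append(f"Out[{n}]: {out}")
        n += 1
    lines.append(f"In [{n}]: {new_input}")
    lines.append(f"Out[{n}]:")
    return "\n".join(lines)

if __name__ == "__main__":
    import openai  # legacy pre-1.0 SDK, as used in late 2022

    prompt = repl_prompt([("1 + 1", "2")], "sorted('cab')")
    resp = openai.Completion.create(
        model="text-davinci-002",
        prompt=prompt,
        max_tokens=32,
        temperature=0,
        stop=["In ["],  # stop before the model invents the next input cell
    )
    print(resp["choices"][0]["text"])
```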
| jcims wrote:
| Isn't that the premise of AI Dungeon?
|
| https://aidungeon.io/
| raldi wrote:
| As far as I can tell, that can only generate one side of the
| transcript.
| jtvjan wrote:
| Type a lone "> You" in "Story" mode and it will generate an
| action for you.
|
| NovelAI[1] just gives you a big text box to type in, so you
| don't have to do any input hacks.
|
| [1]: https://novelai.net/
| neaden wrote:
| It's interesting that the prompt here pretty much immediately
| became a Dr. Who game. It looks like there has been a Dr. Who
| text adventure game before, but it was made before Jack Harkness
| and Torchwood. I wonder if there is some fan-made text adventure
| that was part of the training data. It would probably be best for
| OpenAI if it didn't just spit out copyrighted characters
| unprompted, too.
| visarga wrote:
| Copyright infringement detection could be formulated as an ML
| task. A model can copy the idea but not the expression, if the
| expression is copyrighted. Fortunately, a model can sample
| again and again until the output looks OK.
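That sample-until-it-looks-OK loop is essentially rejection sampling. A minimal sketch, in which both the generator and the infringement check are hypothetical stand-ins (a real check would itself be a trained classifier):

```python
def sample_until_ok(generate, looks_infringing, max_tries=10):
    """Rejection sampling: keep drawing candidate texts until one
    passes the (hypothetical) infringement check, or give up."""
    for _ in range(max_tries):
        text = generate()
        if not looks_infringing(text):
            return text
    return None  # no acceptable sample within the budget
```

Usage with stubbed-in callables: `sample_until_ok(model_sample, classifier)` returns the first candidate the classifier clears, so the expression is re-rolled while the underlying idea in the prompt stays fixed.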
| planetsprite wrote:
| 99 times out of 100, whenever I see a new tech demo that
| absolutely blows my mind and makes me optimistic about the future
| of technology, it's always some big-model deep learning AI thing.
| When are we all going to admit that ML/AI is the final and
| ultimate paradigm shift of our time?
| kleiba wrote:
| It's all smoke and mirrors.
| dilap wrote:
| I'll believe it when it starts having a broad impact on the way
| we live, like the internet and smartphones did.
| raldi wrote:
| Last night the AI and I teamed up to write bedtime stories on
| demand, on the fly, for whatever themes my daughter mused
| about. I think about a billion families would enjoy an app
| that did that.
| dilap wrote:
| I do think there's a good chance this stuff _will_ have
| that level of impact, I just don't think it has yet.
|
| (Though I'm not _convinced_ it will. If you wanted to be a
| skeptic, you could argue that we're _already_ in a huge
| content glut; there's basically infinite content available
| for almost free. So does radically lowering the cost matter
| that much?
|
| Maybe in low-level ways, like it'll increase the ability
| of small indie studios to produce high-level content. But to
| your average consumer, maybe it's not noticeable.)
| airstrike wrote:
| > So does radically lowering the cost matter that much?
|
| It's not about lowering the cost, it's about generating
| content that fits your specific preferences in real time.
| Forget on-demand streaming of off-the-shelf content;
| people will want on-demand _content generation_.
|
| And then, as is tradition, the next step after that is
| giving you content you did not even know you wanted. That
| can be perverse marketing, feeding you sensationalized
| stuff that keeps you hooked 24/7 [1], but the better
| alternative is something that is tailored to your general
| tastes, genres, writing styles, etc., so that you receive
| the content that is best received by you at that
| particular moment (matching your mood, your goals, your
| style).
|
| ----------
|
| 1. This image comes to mind...
| https://www.wallpaperflare.com/dystopian-cyberpunk-sad-
| virtu...
| thih9 wrote:
| And yet people aren't doing this so I guess something is
| still missing. Awareness? UX? Something else? All of the
| above? The company that figures it out might earn a lot.
| MintsJohn wrote:
| See what is happening with StableDiffusion, a model was
| released opensource, performance in the same league as
| closed source, usable on consumer hardware and (non AI)
| techies start to modify it. The biggest
| steps/modifications are by specialists no doubt, yet
| still opensource, but others are happily glueing parts
| together to make something else. The key really seems to
| be access, an hosted API is rather hostile to innovation,
| using and especially experimenting is expensive,
| modification can only happen within whatever the API
| allows. For the tech to get bigger and more noticed
| faster more people need to be able to tinker with it.
| nonasktell wrote:
| Price. I have a dozen ideas that could be done using
| GPT-3. Most of them aren't financially viable; I'm not
| burning half my income to build an MVP.
| actually_a_dog wrote:
| Maybe so, but unless there are a billion families out there
| who are already privileged enough to have internet access
| and are willing to pay for such an app, I don't see it
| happening.
| davnn wrote:
| I recently wrote a VS Code plugin to use GPT-3 for text
| completions and editing of text using instructions [1]. The main
| intention was scientific writing, but I haven't found it
| particularly useful so far, unfortunately.
|
| [1] https://github.com/davnn/assistant
| aaron695 wrote:
| airstrike wrote:
| OK, this is absolutely mindbogglingly fun. Here's my run at it,
| with italics representing what the model spit out:
|
| > OPEN THE BACKDOOR WITH THE OLD KEY
|
| You open the manor's backdoor with the old key. Your torch lights
| the way into the damp service halls, the warm glow of the nuclear
| war in the horizon lighting the window panes. You can almost feel
| the centuries old dust through your boots as you make your way up
| the stairs. You climb up to the main floor and pause at the
| entrance to the parlor room. It is quiet, except for the
| persistent low rumble of the bombs in the distance.
|
| _You enter the parlor room and are immediately hit with the rank
| smell of death and decay. The room is draped in cobwebs and
| littered with the bones of small animals. The only light comes
| from the faint glow of the bombs outside. You see a small, dust
| covered table in the center of the room, and on it is a dusty old
| book. You approach the table and dust off the book. It is titled,
| "The Necronomicon"._
|
| > _OPEN THE BOOK_
|
| _You open the book and begin to read. The first few pages are
| filled with gibberish, but as you flip through the pages, the
| text becomes clearer and more comprehensible. The book speaks of
| dark things, things that should not be spoken of, things that
| should not be known. It speaks of powerful beings that sleep
| beneath the earth, waiting to be awaken. It speaks of a time when
| the world will be consumed by darkness, and all will bow before
| the old ones. You close the book and back away from the table,
| your heart pounding in your chest. You have to get out of here.
| You have to warn somebody._
|
| > _LEAVE THE PARLOR ROOM_
|
| _You leave the parlor room and make your way back down the
| stairs. You can hear something moving around in the darkness, but
| you don't dare look back. You make it to the backdoor and slam
| it shut behind you. You lean against the door, your heart still
| racing, and listen to the thing moving around inside the manor.
| You hear it coming down the stairs, but it doesn't reach the
| door. You wait there for a few minutes, but nothing happens.
| Finally, you muster up the courage to peek through the keyhole.
|
| You see a large, black, slimy creature slithering around the
| parlor room. It is the size of a man, but it has no eyes, only a
| large mouth filled with sharp teeth. You watch in horror as the
| thing moves around the room, destroying everything it comes
| across. Then, without warning, it turns towards the door and
| starts towards you.
|
| > RUN AWAY
|
| You turn and sprint away from the manor as fast as you can. You
| can hear the thing crashing through the door behind you, but you
| don't dare look back. You run through the woods, your heart
| pounding in your chest, until_ you find an old cottage by a
| creek. You force the old door in and see a 12 gauge shotgun lying
| on the table. You confidently load the gun and sprint out of the
| cottage.
|
| > _AIM THE SHOTGUN
|
| You raise the shotgun and take aim at the creature. You fire off
| two rounds and the creature falls to the ground, writhing in
| pain. You approach the creature and finish it off with a bullet
| to the head. You stand there for a moment, panting, your heart
| still racing. You can't believe you just killed something. You
| can't believe this is real._
|
| > INSPECT THE CREATURE'S BLOOD
|
| _You kneel down next to the creature and inspect its blood. It
| is black and viscous, and it smells like rot. You have no idea
| what this thing is, but you know it isn't natural. You have to
| find out what is going on. You have to find out what these things
| are._ Suddenly you notice the viscous blood pooling around you,
| engulfing your right foot. You try to move, but by now your whole
| right leg is covered with it. It is too late. You can feel the
| power coursing through you, and you know that you are not the
| same person you were before. You are something else now.
| Something dark. Something evil.
|
| > RETURN TO THE CITY
|
| _You make your way back to the city, your mind consumed with
| dark thoughts. You know that you are not the same person you were
| before, and you are not sure what to do with your new found
| power. all you know is that you want more. You want to learn more
| about these things, and you want to find more of them. You want
| to become the most powerful being in the world, and you will stop
| at nothing to achieve your goal._
| stuckinhell wrote:
| Wow, that's really good!
| raldi wrote:
| And if you don't like the way the story is going, you can just
| erase that part and everything that followed, and type a couple
| of words nudging it in the right direction.
|
| For some reason it's really drawn to rusty keys.
|
| > LOOK IN HOLE
|
| _You look in the hole and see a rusty key_ - NO, TRY AGAIN,
| ERASE ERASE ERASE
|
| _You look in the hole and see an old rusty key_ - NO,
| BACKSPACE BACKSPACE
|
| _You look in the hole and see a small box_ - YES!!!!
|
| _> OPEN BOX_
|
| _You open the box, revealing a rusty key_
| airstrike wrote:
| I hear you -- it took this somewhat unexpected turn into
| horror, so I just ran with it, but it's not where I was going
| originally... I guess the bombs dropping on the horizon made
| it eerie? It was supposed to be post-apocalyptic, but it
| decided to dial things up to 11 and bam!! "The Necronomicon"
| selimnairb wrote:
| Who cares? Does GPT-3 know suffering and joy? No. Will it produce
| art? No. This is about as interesting as creating a self-
| fellating android.
| [deleted]
| sodality2 wrote:
| > Does GPT-3 know suffering and joy?
|
| Do you? Prove it. Your textual output here will do nothing to
| prove you know suffering and joy _any more_ than GPT-3 would be
| able to.
|
| > Will it produce art?
|
| Is a poem art? I think most consider it to be.
|
| For what it's worth, I don't like where AI-generated content is
| going either, but to disregard it as worthless (instead of
| simply lacking consciousness) is not helpful.
| visarga wrote:
| GPT-3 has learned many identities; it's not lacking a
| perspective but quite the opposite: it has too many
| perspectives, and you never know who you're talking to unless
| you prompt it with an identity. Recently it was used to run
| "in silico sampling" [1], a virtual poll where the
| respondents were GPT-3 prompted with personality backstories.
|
| So it should not be seen as an agent; it's not like us in
| that regard. It is a synthesis of all human culture and can
| assume any hat.
|
| [1] https://arxiv.org/abs/2209.06899
| avarun wrote:
| According to who? You?
|
| This creation is art to me, and much more interesting than your
| comparison.
| smcameron wrote:
| I don't know, Rockit was pretty popular back in the day.
| https://www.youtube.com/watch?v=GHhD4PD75zY
___________________________________________________________________
(page generated 2022-10-25 23:00 UTC)