Subj : Re: AI and BBSing
To   : claw
From : unc0nnected
Date : Tue Oct 03 2023 20:04:32

cl> Tr> Depending on the LLM, yes... Many/most LLMs run on GPU hardware, and
cl> Tr> good ones need a card with 16gb or more ram on that GPU in order to r
cl> Tr> Even a "low" memory, CPU based LLM will generally want at least a cou
cl> Tr> cores and a few gigs of ram, so if you're running multiple conversati
cl>
cl> Interesting. That makes sense so you need something like a 6950 with
cl> loads of ram?
cl>
cl> That explains why no one is doing this. Dedicate a piece of hardware
cl> like that to a BBS.

Nope. Alternatively, you could just get credits at openai.com and go directly through ChatGPT's API. If you limited it to GPT-3 the costs would be fairly minimal: around $0.002 per 750 words or so, or 20 cents for 75,000 words, and you could just cap it at $10/month for safety.

Unc0nnected @ Bottomless Abyss BBS «bbs.bottomlessabyss.net:2023»
--- Mystic BBS v1.12 A45 2020/02/18 (Linux/64)
 * Origin: The Bottomless Abyss BBS * bbs.bottomlessabyss.net (21:1/172)
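A quick sanity check of that pricing, sketched in Python. The $0.002-per-750-words figure is from the message above; the tokens-per-word ratio (roughly 1000 tokens per 750 English words) is an assumption, and real API billing is per token, not per word:

```python
# Rough cost math for capping GPT-3-class API usage on a BBS.
PRICE_PER_1K_TOKENS = 0.002   # USD; assumed gpt-3.5-turbo-era rate
TOKENS_PER_WORD = 1000 / 750  # assumption: ~750 English words per 1000 tokens

def monthly_cost(words_per_month: int) -> float:
    """Estimated USD cost for a month's worth of generated words."""
    tokens = words_per_month * TOKENS_PER_WORD
    return tokens / 1000 * PRICE_PER_1K_TOKENS

def words_for_budget(budget_usd: float) -> int:
    """How many words a monthly budget buys at this rate."""
    tokens = budget_usd / PRICE_PER_1K_TOKENS * 1000
    return round(tokens / TOKENS_PER_WORD)
```

At that rate, 75,000 words comes out to about $0.20, and a $10/month cap would cover several million words, which is far more chat than any single BBS door is likely to see.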