Subj: Re: AI and BBSing
To  : claw
From: Tracker1
Date: Sun Oct 01 2023 17:59:13

Tr>> I hadn't really dug into it though. You'd really need to either
Tr>> make requests to an external service, or write your own
Tr>> terminal/console application to use as the chat bot itself, outside
Tr>> synchronet/js.

Tr>> It's definitely doable though. That said, it could be easy to
Tr>> overwhelm a system's resources, or incur some significant costs.

cl> What kind of resource cost are we talking about? 8 - 16 cores and 16G
cl> of RAM, or much more?

Depending on the LLM, yes... Many/most LLMs run on GPU hardware, and the
good ones need a card with 16GB or more RAM on that GPU in order to run.
Even a "low"-memory, CPU-based LLM will generally want at least a couple
of cores and a few gigs of RAM, so if you're running multiple
conversations at once, it's entirely possible to hit those kinds of
limits.

--
Michael J. Ryan +o roughneckbbs.com
tracker1@roughneckbbs.com
--- SBBSecho 3.15-Linux
 * Origin: Roughneck BBS - roughneckbbs.com (21:3/149)
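[Editor's note: not part of the original message. The 16GB-GPU figure above can be sanity-checked with a back-of-envelope estimate: an LLM's memory footprint is dominated by its weights, roughly parameter count times bytes per parameter, plus runtime overhead. The function below is a sketch under those assumptions; the 20% overhead factor and the example model sizes are illustrative, not from the message.]

```python
def model_ram_gb(params_billions: float, bytes_per_param: float,
                 overhead: float = 1.2) -> float:
    """Rough RAM/VRAM needed to run a model, in GB.

    bytes_per_param: 2.0 for fp16 weights, ~0.5 for 4-bit quantization.
    overhead: assumed fudge factor for KV cache and runtime buffers.
    """
    return params_billions * bytes_per_param * overhead

# A 7B-parameter model at fp16 needs roughly 7 * 2.0 * 1.2 = 16.8 GB,
# which is why a 16GB GPU is about the floor for the larger models.
print(round(model_ram_gb(7, 2.0), 1))

# The same model 4-bit quantized fits in ~4.2 GB, i.e. "a few gigs of
# RAM" on CPU -- but each concurrent conversation adds its own cost.
print(round(model_ram_gb(7, 0.5), 1))
```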