Post AZAfDAoIvSVMVNxGT2 by FrayJay@mastodon.social
(DIR) Post #AZAHw1Yfm2WPwRUsMK by killyourfm@layer8.space
2023-08-27T18:54:06Z
0 likes, 0 repeats
Looking for an AI chatbot that runs locally, doesn't send your info to corporate overlords, doesn't need a GPU, and doesn't need internet? GPT4ALL is what you're looking for: https://github.com/nomic-ai/gpt4all #AI #ChatGPT #OpenSource
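GPT4All also ships Python bindings alongside the desktop app; the sketch below shows the general shape of local inference (the model filename is illustrative, and the first run downloads the weights, so it needs internet once — after that, everything runs offline on CPU):

```shell
# Sketch, not an official quickstart: install the Python bindings and do a
# one-off local generation. The model name is illustrative; GPT4All fetches
# it on first use, then all inference happens locally.
pip install gpt4all
python -c 'from gpt4all import GPT4All; m = GPT4All("orca-mini-3b-gguf2-q4_0.gguf"); print(m.generate("Hello", max_tokens=32))'
```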
(DIR) Post #AZAIHcgmuvcjBMalPM by shipp@mastodon.coffee
2023-08-27T18:58:05Z
0 likes, 0 repeats
@killyourfm OpenAI definitely are the bad guys at this point, and somehow Meta is saving the day. "Open"AI ended up being 100% the opposite of their initial mission and goal and is the most closed AI lol. They need a rename.
(DIR) Post #AZAIgmlga4SxgIshcm by killyourfm@layer8.space
2023-08-27T19:02:40Z
0 likes, 0 repeats
@shipp It also sounds like they're about to get sued into the ground on multiple fronts. I guess this is where profit-driven goals and greed without any oversight gets you, huh?
(DIR) Post #AZAJESihcqjVLhXnrU by razze@osna.social
2023-08-27T19:08:43Z
0 likes, 0 repeats
@killyourfm did you compare with LocalAI? That's integrated with Nextcloud.
(DIR) Post #AZAJcjwZEx4HiWbcv2 by killyourfm@layer8.space
2023-08-27T19:13:09Z
0 likes, 0 repeats
@razze No I haven't. I'm just barely starting to drink from the open-source AI firehose.
(DIR) Post #AZAJvAPRa0it0mtoTh by radasbona@social.vivaldi.net
2023-08-27T19:16:25Z
0 likes, 0 repeats
@killyourfm could this be the beginning of an "open source Siri" on Android?
(DIR) Post #AZAKKeTevOu3VP1UPI by paninid@mastodon.world
2023-08-27T19:21:04Z
0 likes, 0 repeats
@killyourfm FWIW, it will eat up processor bandwidth when running. Multi-tasking not an option.
(DIR) Post #AZAKLTsDppr2uLuKH2 by killyourfm@layer8.space
2023-08-27T19:21:05Z
0 likes, 0 repeats
@radasbona I'm not a developer or an AI expert, so I'd hesitate to give any kind of concrete answer. But I do have hope! Mozilla AI is working on some special things that use this as their foundation. (Also, knowing that Firefox Translations runs locally and is only about 17MB inspires even more hope)
(DIR) Post #AZAKP8XV659ttoxTzU by killyourfm@layer8.space
2023-08-27T19:21:53Z
0 likes, 0 repeats
@paninid That's interesting to know, thanks! I haven't taken a look at its performance footprint yet.
(DIR) Post #AZAMTUY1Pc51DwPGeO by razze@osna.social
2023-08-27T19:45:04Z
0 likes, 0 repeats
@killyourfm was just curious if you could report back
(DIR) Post #AZAOrSy78PyvBVAdzU by plwt@mstdn.social
2023-08-27T20:11:49Z
0 likes, 0 repeats
@killyourfm Strongly recommend this for a weekend project - I have tried it and it is good fun.
(DIR) Post #AZAaWlmJCleWtFXw8W by Blort@social.tchncs.de
2023-08-27T22:22:32Z
0 likes, 0 repeats
@killyourfm Note: only some of the models available to #GPT4All are offline. For example, if you want to use the ChatGPT4 model, you still need an API key and it states that it will send all of your chats to #openAI. That said, it does have a number of models (e.g. Wizard) that are local only. Still, a nice find! Now to see if I can fool these models into creating horrific lovecraftian nightmare texts as easily as with #ChatGPT4 (e.g. appendectomy instructions, roadkill recipes, etc.)
(DIR) Post #AZAectTHukNSIvpDyC by bill88t@c.im
2023-08-27T23:08:26Z
0 likes, 0 repeats
@killyourfm You may also wanna take a look at koboldcpp, which is a jack of all trades. It pretty much supports every model type, context size, and setting you could imagine. Doesn't play nice with CUDA 12 atm, but runs well with just CPU. Also plays very nicely inside of Docker (even with GPU acceleration).
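For reference, a CPU-only koboldcpp invocation typically looks like the sketch below (flag names are assumptions based on the project's README; the model path and values are illustrative):

```shell
# Sketch: load a GGUF model with koboldcpp on CPU and serve its web UI
# (port 5001 by default). Path, context size, and thread count are
# illustrative, not recommendations.
python koboldcpp.py --model ./models/mymodel.gguf --contextsize 4096 --threads 8
```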
(DIR) Post #AZAfDAoIvSVMVNxGT2 by FrayJay@mastodon.social
2023-08-27T23:15:01Z
0 likes, 0 repeats
@killyourfm @razze oof, that's a deep rabbit hole. Recommend looking at Orca and WizardLM models ;)
(DIR) Post #AZB9VZ1qcNpfSFyWHI by ReverseModule@mstdn.games
2023-08-28T04:54:28Z
0 likes, 0 repeats
@killyourfm Oh wow! It's also in the AUR/Chaotic AUR. Time for some fun! Thanks so much for this! :)
(DIR) Post #AZC43BeBeSsWDVW8VE by hayesstw@c.im
2023-08-28T15:28:02Z
0 likes, 0 repeats
@killyourfm Will it run on Windows XP?
(DIR) Post #AZE8lKLFTVoM3askdc by killyourfm@layer8.space
2023-08-29T15:30:15Z
0 likes, 0 repeats
@hayesstw I honestly don't know as I couldn't find hardware requirements at a glance. Might be a good question to ask the developers.
(DIR) Post #AZE8nk4mlXlmRyKino by killyourfm@layer8.space
2023-08-29T15:30:44Z
0 likes, 0 repeats
@ReverseModule My pleasure! (I miss sharing cool open source apps...)
(DIR) Post #AZE9063sqmBy6znokC by killyourfm@layer8.space
2023-08-29T15:32:54Z
0 likes, 0 repeats
@tamitha A cursory search tells me they are donated (for certain models?) but I couldn't find specific information on this.
(DIR) Post #AcxMVJmJFLCT1MuuUS by Idcrafter@fosstodon.org
2023-12-19T04:42:34Z
0 likes, 0 repeats
@killyourfm llama.cpp or llamafile would be small and quite easy ways to do so
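A llamafile bundles llama.cpp and a model's weights into one self-contained executable, so usage can be as small as the sketch below (the filename is illustrative; real files come from the project's releases page):

```shell
# Sketch: make the downloaded llamafile executable and run it. By default
# it starts a local web chat UI in the browser; no separate runtime or
# model download is needed. Filename is illustrative.
chmod +x mistral-7b-instruct.llamafile
./mistral-7b-instruct.llamafile
```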