Post #AcKmhWNLFTBB7XitP6 by brodriguesco@fosstodon.org
2023-11-30T13:59:39Z
0 likes, 0 repeats
What’s an LLM I can run on my machine to assist me with coding on #Emacs?
Post #AcKmhXKXhMXC59QAgC by galdor@emacs.ch
2023-11-30T14:02:29Z
0 likes, 0 repeats
@brodriguesco It mostly depends on your GPU. Models that can run on a consumer GPU (a couple of gigabytes of video memory), or without a GPU at all (e.g. using llama.cpp), are very far from the state of the art.
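
For context, a minimal sketch of the CPU-only route mentioned above, using the llama-cpp-python bindings to llama.cpp. The model file name, thread count, and prompt are illustrative placeholders, not recommendations from the thread; any quantized GGUF model would fit the same pattern.

    # Minimal sketch: CPU-only inference via llama-cpp-python (bindings to llama.cpp).
    from llama_cpp import Llama

    llm = Llama(
        model_path="./codellama-7b-instruct.Q4_K_M.gguf",  # placeholder GGUF file
        n_ctx=2048,    # context window size
        n_threads=8,   # CPU threads; no GPU required
    )

    out = llm(
        "Write an Emacs Lisp function that reverses the words in the region.",
        max_tokens=256,
        temperature=0.2,
    )
    print(out["choices"][0]["text"])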