Post AXcd5IyRshGVymfxfk by simon@fedi.simonwillison.net
 (DIR) Post #AXcaEqh9Zp4fJ6RfrU by simon@fedi.simonwillison.net
       2023-07-12T14:45:02Z
       
       0 likes, 0 repeats
       
        Huge new release of my LLM CLI tool (and Python library) for accessing Large Language Models: it now supports additional models via plugins, so you can "llm install llm-gpt4all" to get models that run on your own machine! https://simonwillison.net/2023/Jul/12/llm/
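In outline, trying a local model with that plugin looks something like this (the model ID below is an assumption for illustration; `llm models list` shows what the plugin actually registered):

```shell
# Install the plugin into llm's environment
llm install llm-gpt4all

# See which models are now available
llm models list

# Run a prompt against one of them (model ID is a placeholder -
# pick one from the list above)
llm -m orca-mini-3b "What is the capital of France?"
```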
       
 (DIR) Post #AXcaRH8P6KOOmb7JEu by simon@fedi.simonwillison.net
       2023-07-12T14:46:29Z
       
       0 likes, 0 repeats
       
       I want to make local models (and remote API-driven models) as easy to try out as possible, so I put together this detailed tutorial about how to build an LLM plugin that adds support for a new model: https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html
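The development loop from that tutorial is roughly the following sketch (the tutorial's worked example is a toy Markov-chain model; the directory name and file names here are placeholders, and `llm install -e` is assumed to pass through to pip's editable install):

```shell
# Scaffold a new plugin package (names are placeholders)
mkdir llm-markov && cd llm-markov
# ... write pyproject.toml and llm_markov.py following the tutorial ...

# Editable install into the same environment as llm itself
llm install -e .

# The new model should now appear alongside the built-in ones
llm models list
```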
       
 (DIR) Post #AXcah9amuVly1Zxs5w by simon@fedi.simonwillison.net
       2023-07-12T14:48:32Z
       
       0 likes, 0 repeats
       
        I released three new LLM plugins this morning:
        - llm-gpt4all adds 17 models from the amazing https://gpt4all.io/ project - https://github.com/simonw/llm-gpt4all
        - llm-mpt30b adds the MPT-30B model (a 19GB download) - https://github.com/simonw/llm-mpt30b
        - llm-palm adds support for Google's PaLM 2 model, via their API - https://github.com/simonw/llm-palm
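A sketch of installing all three (the API key name for llm-palm is an assumption; check that plugin's README for the exact `llm keys` name it expects):

```shell
# llm install passes through to pip, so several plugins at once works
llm install llm-gpt4all llm-mpt30b llm-palm

# The PaLM plugin is API-driven, so it needs a key first
# (key name "palm" assumed here)
llm keys set palm

llm models list
```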
       
 (DIR) Post #AXcbBQuX5LqaQMddxo by nick@amok.recoil.org
       2023-07-12T14:55:53Z
       
       0 likes, 0 repeats
       
       @simon Are you mostly doing your LLM work on the M2?
       
 (DIR) Post #AXcd5IyRshGVymfxfk by simon@fedi.simonwillison.net
       2023-07-12T15:17:16Z
       
       0 likes, 0 repeats
       
       @nick yup, entirely on the M2 - I really should get myself a CUDA environment somewhere though
       
 (DIR) Post #AXcdq3ppz7ucdZwWES by daaain@fosstodon.org
       2023-07-12T15:25:38Z
       
       0 likes, 0 repeats
       
       @simon this is absolutely excellent work Simon, especially like that it's possible to install with Homebrew instead of the usual manual Conda setup! But in general, having ergonomic CLI and Python interfaces to do a lot of common tasks (download models, save chats and soon plugins) is amazing!
       
 (DIR) Post #AXce2X6uxtsUuOsZfs by simon@fedi.simonwillison.net
       2023-07-12T15:26:13Z
       
       0 likes, 0 repeats
       
       @daaain thanks! I got frustrated with how hard it was to try out local models
       
 (DIR) Post #AXcgx8qo2Ee65BSpEW by windhamdavid@universeodon.com
       2023-07-12T15:59:59Z
       
       0 likes, 0 repeats
       
       @simon Dude, you are awesome... but make sure you take a break sometime.
       
 (DIR) Post #AXckaMTZbtE5UrzRi4 by daaain@fosstodon.org
       2023-07-12T16:41:20Z
       
       0 likes, 0 repeats
       
       @simon I tried a few myself and after install errors, unexplainable crashes and bad performance I just gave up, so there's definitely space for something that's easy to use and nicely fits into the wider Unix tool (piping, yes!) and Python library ecosystem.
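The Unix-piping fit mentioned above looks roughly like this in practice (assuming `llm` combines piped stdin with the prompt argument, as its docs describe):

```shell
# Pipe a file's contents in alongside a prompt
cat setup.py | llm "Explain what this file does"

# Chain with other Unix tools
git diff | llm "Write a commit message for this diff"
```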
       
 (DIR) Post #AXdP21BYMGKtjwItSi by cydonian@social.vivaldi.net
       2023-07-13T00:14:30Z
       
       0 likes, 0 repeats
       
        @simon All awesomeness and can’t wait to try them out! Some random ideas to build on top of this:
        1) Support for Claude (if it’s not there already)
        2) Some way of adding a file to the prompt via pipe (Claude and Code Analyser take file input, so it could be interesting to enable that at a CLI itself)
        3) Generate the output (data analysis pieces in particular) as QMD files (that’s Quarto)
       
 (DIR) Post #AXoCcE3unYH9R7E1J2 by simon@fedi.simonwillison.net
       2023-07-18T05:16:53Z
       
       0 likes, 1 repeats
       
       New LLM plugin today: llm-replicate, which provides support for running prompts against any model hosted on https://replicate.com - including falcon-40b-instruct https://github.com/simonw/llm-replicate
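A usage sketch for llm-replicate (the `replicate add` subcommand, flags, and model path are recalled from the plugin's README rather than stated in this thread, so verify them there before relying on this):

```shell
llm install llm-replicate

# Replicate is an API-driven host, so set its key first
llm keys set replicate

# Register a hosted model locally and give it a short alias
# (subcommand and flags assumed - see the llm-replicate README)
llm replicate add joehoover/falcon-40b-instruct --chat --alias falcon

llm -m falcon "Ten great names for a pet pelican"
```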
       
 (DIR) Post #AXpfSXAuuFknkJpBFQ by simon@fedi.simonwillison.net
       2023-07-18T22:15:08Z
       
       0 likes, 0 repeats
       
       LLM 0.6 is out: https://llm.datasette.io/en/stable/changelog.html#v0-6