Post AZG2AE9TJcCSjgDgJc by zellyn@hachyderm.io
(DIR) Post #AZF4xAnSaEgVGoQm0m by zellyn@hachyderm.io
2023-08-30T01:15:44Z
0 likes, 0 repeats
@simon is it normal for `llm` to forget about (some of) my downloaded models and/or plugins when python gets updated via `homebrew`?
(DIR) Post #AZF4xBcrV7njqETp8C by simon@fedi.simonwillison.net
2023-08-30T02:20:12Z
0 likes, 0 repeats
@zellyn yeah that probably means the virtual environment for LLM was recreated and you lost your installed plugins. The models themselves should still be there - running "llm install llm-mlc" or whatever plugin you use should cause them to show up in "llm models" again
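A minimal sketch of the recovery steps described above, assuming the plugin in question is llm-mlc (substitute whichever plugin you had installed):

```
# Reinstall the plugin into llm's recreated virtual environment;
# the previously downloaded model files on disk are not affected.
llm install llm-mlc

# Confirm the plugin is registered and its models are listed again.
llm plugins
llm models
```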
(DIR) Post #AZG2AE9TJcCSjgDgJc by zellyn@hachyderm.io
2023-08-30T13:24:03Z
0 likes, 0 repeats
@simon Ah, thanks. That was indeed the case. btw, is it normal for the output to have other random things?

```
llm 'what color is grass?'
The color of grass is typically green.
ggml_metal_free: deallocating
llama_new_context_with_model: max tensor size = 87.89 MB
```
(DIR) Post #AZGD95ZEqU2LJlvobw by simon@fedi.simonwillison.net
2023-08-30T15:27:08Z
0 likes, 0 repeats
@zellyn that's a really annoying feature of llama-cpp that I've not been able to completely work around yet - see also this related issue https://github.com/mlc-ai/mlc-llm/issues/740
(DIR) Post #AZGDLl8U7XfDiPSY2C by simon@fedi.simonwillison.net
2023-08-30T15:28:11Z
0 likes, 0 repeats
My code that attempts to fix that is here, but there's still some output that leaks to stderr somehow https://github.com/simonw/llm-llama-cpp/blob/b0c2f25165adde7204c7dd9eb80535447fd333f6/llm_llama_cpp.py#L255
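Since the stray lines end up on stderr, one blunt workaround (a shell-level suggestion, not part of the plugin code linked above) is to discard stderr when you only care about the model's reply:

```
# Silence anything llama-cpp writes to stderr; note this also hides
# genuine error messages, so only use it for clean one-off output.
llm 'what color is grass?' 2>/dev/null
```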