Post Aco5TXqTPgxAwDTeRk by vidar@m.galaxybound.com
 (DIR) Post #Acn1zgOGaJ7RjROPjc by simon@fedi.simonwillison.net
       2023-12-14T05:03:55Z
       
       0 likes, 1 repeats
       
       New release of my llm-anyscale-endpoints plugin, which can now talk to Mixtral-8x7B-Instruct. It's a really impressive model. It gave me an excellent response to my favorite coding test prompt: "Write a Python function that accepts a URL to a CSV file, downloads it and loads it into a SQLite database, creating a table with the correct columns"
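
       For reference, here is a minimal sketch of the kind of function that prompt asks for (not the model's actual output), using only the Python standard library:

           import csv
           import io
           import sqlite3
           import urllib.request

           def load_csv_to_sqlite(url, db_path="data.db", table="data"):
               # Download the CSV and decode it as text
               with urllib.request.urlopen(url) as response:
                   text = response.read().decode("utf-8")
               rows = list(csv.reader(io.StringIO(text)))
               headers, data = rows[0], rows[1:]
               # Create a table whose columns match the CSV header row
               quoted = ", ".join('"{}"'.format(h) for h in headers)
               conn = sqlite3.connect(db_path)
               conn.execute('CREATE TABLE IF NOT EXISTS "{}" ({})'.format(table, quoted))
               placeholders = ", ".join("?" for _ in headers)
               conn.executemany('INSERT INTO "{}" VALUES ({})'.format(table, placeholders), data)
               conn.commit()
               conn.close()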
       
 (DIR) Post #Acn70A1qtBCdQOxKMq by simon@fedi.simonwillison.net
       2023-12-14T06:00:00Z
       
       0 likes, 1 repeats
       
       I figured out how to run Mixtral on my own laptop! I used the latest llama-cpp-python (released 2 hours ago) and my llm-llama-cpp plugin, grabbed the 38.4GB GGUF file from https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/blob/main/mixtral-8x7b-instruct-v0.1.Q6_K.gguf and ran the following incantation:

           llm -m gguf -o path mixtral-8x7b-instruct-v0.1.Q6_K.gguf '[INST] Write a Python function that accepts a URL to a CSV file, downloads it and loads it into a SQLite database, creating a table with the correct columns[/INST]'
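
       The same GGUF file can also be loaded from Python through llama-cpp-python's own API rather than the llm CLI; a rough sketch, with the context size and token limit as placeholder values:

           from llama_cpp import Llama

           # Load the downloaded GGUF file directly (n_ctx and max_tokens are arbitrary here)
           model = Llama(model_path="mixtral-8x7b-instruct-v0.1.Q6_K.gguf", n_ctx=4096)

           # Mixtral Instruct expects the [INST] ... [/INST] prompt format
           prompt = (
               "[INST] Write a Python function that accepts a URL to a CSV file, "
               "downloads it and loads it into a SQLite database, "
               "creating a table with the correct columns [/INST]"
           )
           output = model(prompt, max_tokens=512)
           print(output["choices"][0]["text"])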
       
 (DIR) Post #Acn7WMS6EcBdDoaTOS by simon@fedi.simonwillison.net
       2023-12-14T06:05:54Z
       
       0 likes, 1 repeats
       
       If you then run this:

           llm llama-cpp add-model mixtral-8x7b-instruct-v0.1.Q6_K.gguf --llama2-chat -a mi

       It will use the [INST] prompt format automatically, and you can execute the model using the new "mi" alias like so:

           llm -m mi "Crack a joke about a walrus who likes selling balloons"
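
       Assuming the "mi" alias registered above, roughly the same call can be made from Python through llm's API instead of the CLI:

           import llm

           # "mi" is the alias registered with `llm llama-cpp add-model ... -a mi`
           model = llm.get_model("mi")
           response = model.prompt("Crack a joke about a walrus who likes selling balloons")
           print(response.text())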
       
 (DIR) Post #Acn7s6hzRxr6lSOCJM by jannem@fosstodon.org
       2023-12-14T06:09:26Z
       
       0 likes, 0 repeats
       
       @simon Don't keep us in suspense - what joke did you get?
       
 (DIR) Post #Acn82ljblqu8ttEK6S by simon@fedi.simonwillison.net
       2023-12-14T06:11:03Z
       
       0 likes, 0 repeats
       
       @jannem Why did the walrus go out of business selling balloons? Because he was terrible at "sealing the deal"! (since walruses have whiskers, not seals!)
       
 (DIR) Post #Acn9w7wH18J7eB7B0S by jannem@fosstodon.org
       2023-12-14T06:32:56Z
       
       0 likes, 0 repeats
       
       @simon Well... It *is* better than my sister's favourite joke.
       
 (DIR) Post #AcntXh4bqZW1GaIjUO by knur@tilde.zone
       2023-12-14T15:03:30Z
       
       0 likes, 0 repeats
       
       @simon one day you will have to explain your obsession with walruses.
       
 (DIR) Post #AcnumOtNpGT3B1cvI0 by simon@fedi.simonwillison.net
       2023-12-14T15:17:35Z
       
       0 likes, 0 repeats
       
       @knur not much to explain, they're clearly excellent! I particularly love this one: https://www.niche-museums.com/78
       
 (DIR) Post #AcnuzfypbYpzwFVOXg by simon@fedi.simonwillison.net
       2023-12-14T15:20:11Z
       
       0 likes, 0 repeats
       
       @knur plus they have absolutely the best skulls: https://www.niche-museums.com/100
       
 (DIR) Post #Aco5TXqTPgxAwDTeRk by vidar@m.galaxybound.com
       2023-12-14T17:14:27Z
       
       0 likes, 0 repeats
       
       @simon @knur Love the Horniman. Such an odd mixture of stuff and great location.
       
 (DIR) Post #Aco66CA9xiMUz77CmO by happyborg@fosstodon.org
       2023-12-14T17:24:31Z
       
       0 likes, 0 repeats
       
       @simon Does this work without the API key?
       
 (DIR) Post #Aco84hwJ2wgsrp3Om8 by happyborg@fosstodon.org
       2023-12-14T17:44:33Z
       
       0 likes, 0 repeats
       
       @simon also... not sure if it interests you but the model is being downloaded in order to upload it to the current Safe Network testnet so the community there can try downloading it and trying it out with your instructions. Ref: https://safenetforum.org/t/closernet-13-12-23-testnet/38945/131?u=happybeing

       Meantime, thanks for sharing your explorations and how-tos. I've been playing a bit with Llama thanks to you 🙏
       
 (DIR) Post #AcoDfuhl311IwYoFIO by simon@fedi.simonwillison.net
       2023-12-14T18:46:48Z
       
       0 likes, 0 repeats
       
       @happyborg Yes, running on a laptop doesn't need an API key at all.
       
 (DIR) Post #AcoJmh5UfpmwSkWngm by knur@tilde.zone
       2023-12-14T19:56:54Z
       
       0 likes, 0 repeats
       
       @simon wow... I didn't think the round part of their snout had bones beneath. I really thought they were just really puffy, squishy cheeks.
       
 (DIR) Post #AcoTGH4iqpEw3sLaEq by gregors@mastodon.world
       2023-12-14T21:44:16Z
       
       0 likes, 0 repeats
       
       @simon hey Simon, love your work! I guess you need a 64GB Mac? Would it run on a 32GB one?
       
 (DIR) Post #Acoajeznh2AMKL0KI4 by simon@fedi.simonwillison.net
       2023-12-14T23:07:58Z
       
       0 likes, 0 repeats
       
       @gregors Might be a bit tight! The Mixtral models are pretty big, but there are quantized ones on https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/tree/main that might just work
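
       A back-of-the-envelope way to check whether a given quantized GGUF might fit is to compare its file size (plus some headroom for context and overhead) against total RAM; a sketch assuming the third-party psutil package and a hypothetical quantized filename:

           import os
           import psutil  # third-party: pip install psutil

           def might_fit_in_ram(gguf_path, headroom=8 * 1024**3):
               # Rough heuristic: model file size plus headroom vs. total physical RAM
               return os.path.getsize(gguf_path) + headroom <= psutil.virtual_memory().total

           # Hypothetical filename for one of the smaller quantizations
           print(might_fit_in_ram("mixtral-8x7b-instruct-v0.1.Q3_K_M.gguf"))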
       
 (DIR) Post #ActWHqjY8yuw5O0Egy by gregors@mastodon.world
       2023-12-17T08:11:46Z
       
       0 likes, 0 repeats
       
       @simon works with the q3 model. Of course, I had to ask a tribute-to-Simon question :) It's really cool to see how RAM usage ramps up when the weights get loaded.