[HN Gopher] Accessing Llama 2 from the command-line with the LLM...
___________________________________________________________________
Accessing Llama 2 from the command-line with the LLM-replicate
plugin
Author : simonw
Score : 40 points
Date : 2023-07-18 19:33 UTC (3 hours ago)
(HTM) web link (simonwillison.net)
(TXT) w3m dump (simonwillison.net)
| simonw wrote:
| More about my LLM tool (and Python library) here:
| https://llm.datasette.io/
|
| Here's the full implementation of that llm-replicate plugin:
| https://github.com/simonw/llm-replicate/blob/0.2/llm_replica...
|
| If you want to write a plugin for some other LLM I have a
| detailed tutorial here:
| https://llm.datasette.io/en/stable/plugins/tutorial-model-pl... -
| plus a bunch of examples linked from here:
| https://github.com/simonw/llm-plugins
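For context, the workflow described in the linked post boils down to a few commands. This is a sketch, not an exact transcript: the model name and alias follow the blog post's example, the `llm replicate add` subcommand comes from the llm-replicate plugin, and you need your own Replicate API token.

```shell
# Install the LLM CLI and the Replicate plugin (sketch; assumes
# Python/pip are available and you have a Replicate account).
pip install llm
llm install llm-replicate

# Store your Replicate API token so the plugin can authenticate.
llm keys set replicate

# Register the Llama 2 chat model under a short alias (model path
# is the a16z-infra hosted model referenced in the post).
llm replicate add a16z-infra/llama13b-v2-chat --chat --alias llama2

# Run a prompt against it from the command line.
llm -m llama2 "Ten great names for a pet pelican"
```

Each run is billed by Replicate per second of compute, which is what the pricing question below is about.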
| Anticlockwise wrote:
| Can you or anyone else comment on how replicate's per-second
| pricing ends up comparing to OpenAI's per token pricing when
| using Llama2?
| simonw wrote:
| My hunch is that OpenAI is a lot cheaper. I've spent $0.26 on
| 115 seconds of compute with Llama 2 on Replicate so far,
| which is only a dozen test prompts.
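A rough back-of-envelope check supports that hunch. The Replicate figures come from the comment above; the OpenAI side uses the July-2023 gpt-3.5-turbo list prices, and the 500-token prompt/completion sizes are purely illustrative assumptions.

```python
# Compare Replicate's per-second billing with OpenAI's per-token
# billing, using the numbers from the thread.

replicate_spend = 0.26      # dollars spent so far (from the comment)
replicate_seconds = 115     # seconds of compute billed
test_prompts = 12           # "only a dozen test prompts"

per_second = replicate_spend / replicate_seconds
per_prompt_replicate = replicate_spend / test_prompts

# gpt-3.5-turbo rates per 1K tokens (July 2023); request size assumed.
input_rate, output_rate = 0.0015, 0.002
input_tokens, output_tokens = 500, 500
per_prompt_openai = (input_rate * input_tokens
                     + output_rate * output_tokens) / 1000

print(f"Replicate: ${per_second:.4f}/s, ~${per_prompt_replicate:.3f}/prompt")
print(f"OpenAI:    ~${per_prompt_openai:.4f}/prompt")
print(f"Ratio:     ~{per_prompt_replicate / per_prompt_openai:.0f}x")
```

Under these assumptions a Llama 2 prompt on Replicate works out to roughly an order of magnitude more than a comparable gpt-3.5-turbo call, though real token counts and run times vary widely.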
| peatmoss wrote:
 | I feel like Simon's been on a tear with these LLM postings.
 | Simon, I'm really enjoying you swashbuckling through this and
 | then documenting your travels.
___________________________________________________________________
(page generated 2023-07-18 23:00 UTC)