 (DIR) Post #AmqILx8zAIr1Vn3HrE by stfn@fosstodon.org
       2024-10-09T21:34:15Z
       
       0 likes, 0 repeats
       
       OK, that was funny. I tested out Ollama's newest model, llama3.2, to run an AI chat locally on my PC. First I asked it a few questions about Python and databases and was pleasantly surprised at the quality of the responses. Then I asked it if it can access the Internet. It said yes. So I asked it to summarize one of my blog posts. The answer was very generic. Turns out it did not access my blog, it just guessed the answer based on the URL. Actually, Ollama cannot access the Internet at all.
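
       [A rough sketch of what "locally" means here: Ollama serves an HTTP API on localhost (port 11434 by default), so a chat never leaves the machine unless you wire it up yourself. The helper below assumes the default endpoint and an already-pulled llama3.2 model; the function names are illustrative, not part of any library.]

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str, host: str = OLLAMA_HOST):
    """Build the URL and JSON payload for Ollama's /api/generate endpoint."""
    url = f"{host}/api/generate"
    # stream=False asks for one complete JSON response instead of chunks
    payload = {"model": model, "prompt": prompt, "stream": False}
    return url, payload

def ask(model: str, prompt: str, host: str = OLLAMA_HOST) -> str:
    """Send one prompt to a locally running Ollama server and return its reply."""
    url, payload = build_generate_request(model, prompt, host)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and the model pulled):
# print(ask("llama3.2", "Explain Python generators briefly."))
```

       [Note the request only ever targets localhost, which is exactly why the model could not fetch the blog post: the server has no outbound browsing, it only generates from its weights.]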
       
 (DIR) Post #AmqIYh1WPaHvHPdTu4 by stfn@fosstodon.org
       2024-10-09T21:36:33Z
       
       0 likes, 0 repeats
       
       All in all, I am impressed by what it can do, using less than 5GB of VRAM and providing long answers (several paragraphs) in a few seconds, all that on my rather old RTX 2060.
       
 (DIR) Post #AmqIjArOKoP4iDtZM8 by stfn@fosstodon.org
       2024-10-09T21:38:27Z
       
       0 likes, 0 repeats
       
       Yes, I know about all the issues with LLMs. I would not trust it with any factual information. But as a tool to give ideas, or reiterate, or summarize, or just point me in a direction, it can actually be useful. And everything happens locally.
       
 (DIR) Post #AmqJWaLQNBPLwMVIv2 by xylophilist@mastodon.online
       2024-10-09T21:47:21Z
       
       0 likes, 1 repeats
       
       @stfn Exactly so. Ignore the hype and run the open technology locally and privately. Another form of interaction I'm exploring atm, related to summarizing: you can go far with the idea of a directory-of-markdown as a knowledge database. Use Obsidian or VSCodium as a frontend, a clipper plugin in $BROWSER, and Ollama becomes a query engine. RSS feeds, fields of research, former attempts at running a blog, you name it. All potential input. #LLM #geek
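
       [The directory-of-markdown idea can be sketched very simply: gather the notes into one context string and prepend it to each question before handing the whole thing to the local model. This is an illustrative sketch, not any plugin's actual code; the function names and the naive size cap are assumptions.]

```python
from pathlib import Path

def gather_notes(notes_dir: str, max_chars: int = 4000) -> str:
    """Concatenate markdown notes from a directory into one context string."""
    chunks = []
    total = 0
    for path in sorted(Path(notes_dir).glob("*.md")):
        snippet = f"## {path.name}\n{path.read_text(encoding='utf-8')}\n"
        if total + len(snippet) > max_chars:
            break  # naive cap so the prompt stays within the model's context
        chunks.append(snippet)
        total += len(snippet)
    return "".join(chunks)

def build_query(notes_dir: str, question: str) -> str:
    """Frame the local model as a query engine over the notes directory."""
    return (
        "Answer using only these notes:\n\n"
        + gather_notes(notes_dir)
        + "\nQuestion: " + question
    )

# The resulting string would then be sent as the prompt to a local
# Ollama instance, e.g. via its /api/generate endpoint.
```

       [A real setup would chunk and rank notes rather than concatenate everything, but even this crude version keeps the whole pipeline, notes and model, on the local machine.]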
       
 (DIR) Post #AmqLN2s3eppZQSLYxM by thenets@fosstodon.org
       2024-10-09T22:08:03Z
       
       0 likes, 0 repeats
       
       @stfn Qwen 2.5 is the most impressive I've seen so far. The 7b runs very fast on my RTX 3070 and produces really great results: https://ollama.com/library/qwen2.5
       
 (DIR) Post #Amr2TMw0GpWJQ0SCoK by stfn@fosstodon.org
       2024-10-10T06:11:01Z
       
       0 likes, 0 repeats
       
       @xylophilist interesting, I might try it out, thanks!
       
 (DIR) Post #Amr2bqd3kcuHpZdVJI by stfn@fosstodon.org
       2024-10-10T06:12:32Z
       
       0 likes, 0 repeats
       
       @thenets thanks, I'll compare them when I have a moment