[HN Gopher] Ollama has a native front end chatbot now
       ___________________________________________________________________
        
       Ollama has a native front end chatbot now
        
       Author : BUFU
       Score  : 69 points
       Date   : 2025-07-30 21:19 UTC (1 hour ago)
        
 (HTM) web link (ollama.com)
 (TXT) w3m dump (ollama.com)
        
       | rihegher wrote:
       | "Ollama's new app is now available for macOS and Windows" linux
       | sounds out for now
        
         | amelius wrote:
         | Shouldn't the LLM be able to code the linux version?
        
           | permalac wrote:
           | Vibe coding?
        
         | pkaye wrote:
         | Click the download button and you will see Linux as an option.
        
           | mchiang wrote:
           | Linux does not have the interface right now, and there are a
           | lot of options for users. Open WebUI is an awesome project.
           | 
           | https://github.com/open-webui/open-webui
        
       | 8thcross wrote:
        | a little too late, I think.
        
         | thimabi wrote:
         | It came very late indeed! By now, I'm already used to LM Studio
         | as a UI for local LLMs... it even seems to have more features
         | than ollama.
         | 
          | But I'm glad to see that ollama developed a GUI as well --
          | more options are always better, and maybe it will improve in
          | the future.
        
       | IceWreck wrote:
        | Why not Linux? The UI looks to be some kind of Chrome-based
        | thingy - probably Electron - so it should be easy to port to
        | Linux.
       | 
       | Also is there a link to the source?
        
         | johncolanduoni wrote:
         | For all of Electron's promise in being cross-platform, "I'll
         | just press this button and ship this Electron app on Linux and
         | everything will be fine" is not the current state of things. A
         | lot of it is papercuts like glibc version aggravation, but GPU
         | support is persistently problematic.
        
           | zettabomb wrote:
           | The Element app on Linux is currently broken (if you want to
           | use encryption, so basically for everyone) due to an issue
           | with Electron. Luckily it still works in a regular browser.
           | I'm really baffled by how that can happen.
        
         | ceroxylon wrote:
          | I am guessing that the Linux version came first (or the
          | announcement was worded strangely), since it is available on
          | their download page:
         | 
         | https://ollama.com/download
        
         | nicce wrote:
          | Electron... I wonder how this can be marketed as native,
          | then.
        
       | swyx wrote:
        | finally, what took so long lmao
        | 
        | if I'm being honest, I care more about multiple local AI apps
        | on my desktop all hooking into the same ollama instance,
        | rather than each one downloading its own models as part of the
        | app, so I end up with multiple tens of GBs of repeated weights
        | all over the place because apps don't talk to each other.
        | 
        | what does it take for THAT to finally happen?
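        | 
        | (a minimal sketch of the "shared instance" idea, assuming
        | Ollama's documented HTTP API on its default port 11434 -- an
        | app could reuse whatever models the local daemon has already
        | pulled instead of bundling its own weights; the model name
        | below is only illustrative:)
        | 
        |     # Python sketch: talk to the shared local Ollama daemon
        |     import requests
        | 
        |     OLLAMA = "http://localhost:11434"
        | 
        |     # list models the daemon has already pulled, so the app
        |     # doesn't re-download weights that exist elsewhere
        |     tags = requests.get(f"{OLLAMA}/api/tags").json()
        |     names = [m["name"] for m in tags.get("models", [])]
        | 
        |     # chat against an existing model through the shared daemon
        |     resp = requests.post(f"{OLLAMA}/api/chat", json={
        |         "model": names[0] if names else "llama3.2",
        |         "messages": [{"role": "user", "content": "hello"}],
        |         "stream": False,
        |     }).json()
        |     print(resp["message"]["content"])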
        
         | noman-land wrote:
         | I, too, dream of this.
        
         | mchiang wrote:
          | this is something we are working on. I don't have a specific
          | timeline since it's done when it's done, but it is being
          | worked on.
        
       | syspec wrote:
        | I've been using Open WebUI and have been blown away -- it's a
        | better ChatGPT interface than ChatGPT!
       | 
       | https://github.com/open-webui/open-webui
       | 
       | Curious how this compares to that, which has a ton of features
       | and runs great
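        | 
        | (for comparison's sake, a quick sketch of why front ends are
        | so swappable here: Ollama also exposes an OpenAI-compatible
        | endpoint at /v1, so any client that speaks that API can sit in
        | front of the same local models -- this assumes the openai
        | Python package and an already-pulled model tag:)
        | 
        |     # Python sketch: OpenAI-compatible path into local Ollama
        |     from openai import OpenAI
        | 
        |     client = OpenAI(base_url="http://localhost:11434/v1",
        |                     api_key="ollama")  # key is ignored locally
        |     reply = client.chat.completions.create(
        |         model="llama3.2",  # any model the local daemon holds
        |         messages=[{"role": "user", "content": "hello"}],
        |     )
        |     print(reply.choices[0].message.content)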
        
       ___________________________________________________________________
       (page generated 2025-07-30 23:00 UTC)