[HN Gopher] Jan - Ollama alternative with local UI
       ___________________________________________________________________
        
       Jan - Ollama alternative with local UI
        
       Author : maxloh
       Score  : 163 points
       Date   : 2025-08-09 09:54 UTC (13 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | roscas wrote:
        | Tried to run Jan but it does not start the llama server. It
        | also tries to allocate 30 GB, the size of the model, but my
        | VRAM is only 10 GB and my machine has 32 GB of RAM, so it does
        | not make sense. Ollama works perfectly with 30B models.
        | Another thing that is not good is that it makes constant
        | connections to GitHub and other sites.
        
         | SilverRubicon wrote:
          | Did you see the feature list? It does not deny that it makes
          | connections to other sites.
         | 
         | - Cloud Integration: Connect to OpenAI, Anthropic, Mistral,
         | Groq, and others
         | 
         | - Privacy First: Everything runs locally when you want it to
        
         | hoppp wrote:
          | It probably loads the entire model into RAM at once, while
          | Ollama avoids this with a better loading strategy.
        
           | blooalien wrote:
           | Yeah, if I remember correctly, Ollama loads models in
           | "layers" and is capable of putting some layers in GPU RAM and
           | the rest in regular system RAM.
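The offloading idea described above can be sketched with a bit of arithmetic. This is an illustrative model only (it assumes equally sized layers and uses made-up numbers), not Ollama's or llama.cpp's actual scheduling logic:

```python
# Illustrative sketch of GPU/CPU layer offloading, as described above.
# Assumes all layers are equally sized; real runtimes also account for
# the KV cache, context length, and per-layer size differences.

def split_layers(n_layers: int, model_gb: float, vram_gb: float):
    """Return (gpu_layers, cpu_layers) for a simple even split."""
    per_layer_gb = model_gb / n_layers
    gpu = min(n_layers, int(vram_gb // per_layer_gb))
    return gpu, n_layers - gpu

# A 30 GB model with 48 layers on a 10 GB GPU: 16 layers fit in VRAM,
# the remaining 32 stay in system RAM.
print(split_layers(n_layers=48, model_gb=30.0, vram_gb=10.0))  # (16, 32)
```

This is why a 30 GB model can run on a 10 GB card at all: only the layers that fit are kept on the GPU, and the rest are evaluated from system RAM at a speed penalty.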
        
       | biinjo wrote:
        | I'm confused. Isn't the whole premise of Ollama that it's
        | locally run? What's the difference or USP when comparing the
        | two?
        
         | moron4hire wrote:
         | That's not the actual tagline being used in the repo. The repo
         | calls itself an alternative to ChatGPT. Whoever submitted the
         | link changed it.
        
         | hoppp wrote:
          | I think it's an alternative because Ollama has no UI and is
          | hard to use for non-developers who will never touch the CLI.
        
           | simonw wrote:
           | Ollama added a chat UI to their desktop apps a week ago:
           | https://ollama.com/blog/new-app
        
             | apitman wrote:
             | Their new app is closed source right?
        
               | simonw wrote:
               | Huh, yeah it looks like the GUI component is closed
               | source. Their GitHub version only has the CLI.
        
               | diggan wrote:
                | I think at this point it's fair to say that most of
                | what Ollama does is closed source. AFAIK, only the CLI
                | is open source; everything else isn't.
        
               | conradev wrote:
               | Yeah, and they're also on a forked llama.cpp
        
             | accrual wrote:
             | I have been using the Ollama GUI on Windows since release
             | and appreciated its simplicity. It recently received an
             | update that puts a large "Turbo" button in the message box
             | that links to a sign-in page.
             | 
             | I'm trying Jan now and am really liking it - it feels
             | friendlier than the Ollama GUI.
        
               | dcreater wrote:
                | And Ollama's founder was on here posting that they're
                | still focused on local inference... I don't see Ollama
                | as anything more than a funnel for their subscription
                | now.
        
       | bogdart wrote:
        | I tried Jan last year, but the UI was quite buggy. Maybe
        | they've fixed it since.
        
         | diggan wrote:
          | Please do try it out again. If things used to be broken but
          | no longer are, that's a good signal that they're gaining
          | stability :) And if it's still broken, that's a signal that
          | they're not addressing bugs, which would be worse.
        
           | esafak wrote:
           | So you're saying bugs are good?!
        
             | diggan wrote:
              | No, but their opinion would be a lot more insightful if
              | they provided a comparison between then and now, instead
              | of leaving it at "it was like that before; now I don't
              | know".
        
       | mathfailure wrote:
       | Is this an alternative to OpenWebUI?
        
         | apitman wrote:
         | Not exactly. OWUI is a server with a web app frontend. Jan is a
         | desktop app you install. But it does have the ability to run a
         | server for other apps like OWUI to talk to.
        
           | ekianjo wrote:
            | OpenWebUI does not include a server.
        
             | apitman wrote:
             | I was referring to Jan.
        
             | cristoperb wrote:
             | It starts a webserver to serve its UI, which is what your
             | comment parent meant. It doesn't provide its own openai-
             | style API, which I guess is what you meant.
        
         | PeterStuer wrote:
          | More an alternative to LM Studio, I think, from the
          | description.
        
           | apitman wrote:
           | Jan also supports connecting to remote APIs (like
           | OpenRouter), which I don't think LM Studio does
        
       | semessier wrote:
       | still looking for vLLM to support Mac ARM Metal GPUs
        
         | baggiponte wrote:
         | Yeah. The docs tell you that you should build it yourself,
         | but...
        
       | apitman wrote:
       | I really like Jan, especially the organization's principles:
       | https://jan.ai/
       | 
       | Main deal breaker for me when I tried it was I couldn't talk to
       | multiple models at once, even if they were remote models on
       | OpenRouter. If I ask a question in one chat, then switch to
       | another chat and ask a question, it will block until the first
       | one is done.
       | 
       | Also Tauri apps feel pretty clunky on Linux for me.
        
         | c-hendricks wrote:
         | Yeah, webkit2gtk is a bit of a drag
        
         | diggan wrote:
         | > Also Tauri apps feel pretty clunky on Linux for me.
         | 
          | All of them, or this one specifically? I've developed a
          | bunch of tiny apps for my own use (on Linux) with Tauri (the
          | largest is maybe 5-6K LoC) and they've always felt snappy to
          | me, mostly doing all the data processing in Rust and the UI
          | with ClojureScript+Reagent.
        
         | _the_inflator wrote:
          | Yep. I really see them as an architecture blueprint with a
          | reference implementation, and not so much as a
          | one-size-fits-all app.
         | 
         | I stumbled upon Jan.ai a couple of months ago when I was
         | considering a similar app approach. I was curious because
         | Jan.ai went way beyond what I considered to be limitations.
         | 
          | I haven't tried Jan.ai yet; I see it as an implementation,
          | not a solution.
        
         | signbcc wrote:
         | > especially the organization's principles
         | 
         | I met the team late last year. They're based out of Singapore
         | and Vietnam. They ghosted me after promising to have two
         | follow-up meetings, and were unresponsive to any emails, like
         | they just dropped dead.
         | 
         | Principles and manifestos are a dime a dozen. It matters if you
         | live by them or just have them as PR pieces. These folks are
         | the latter.
        
           | dcreater wrote:
            | With a name like Menlo Research, I assumed they were based
            | in Menlo Park. They probably intended that.
        
       | venkyvb wrote:
       | How does this compare to LM studio ?
        
         | rmonvfer wrote:
          | I use both, and Jan is basically the OSS version of LM
          | Studio with some added features (e.g., you can use remote
          | providers).
          | 
          | I first used Jan some time ago and didn't really like it,
          | but it has improved a lot since, so I encourage everyone to
          | try it. It's a great project.
        
         | angelmm wrote:
          | For me, the main difference is that LM Studio's main app is
          | not OSS. They're similar in terms of features, although I
          | didn't use LM Studio that much.
        
       | reader9274 wrote:
        | Tried to run gpt-oss:20b in Ollama (it runs perfectly) and
        | tried to connect Ollama to Jan, but it didn't work.
        
         | thehamkercat wrote:
         | Exactly: https://github.com/menloresearch/jan/issues/5474
         | 
         | Can't make it work with ollama endpoint
         | 
          | This seems to be the problem, but they're not focusing on
          | it: https://github.com/menloresearch/jan/issues/5474#issuecommen...
        
         | accrual wrote:
         | I got Jan working with Ollama today. Jan reported it couldn't
         | connect to my Ollama instance on the same host despite it
         | working fine for other apps.
         | 
          | I captured loopback traffic and noticed Ollama returning an
          | HTTP 403 Forbidden response to Jan.
          | 
          | The solution was to set two environment variables:
          | OLLAMA_HOST=0.0.0.0 and OLLAMA_ORIGINS=*
         | 
         | Here's the rest of the steps:
         | 
         | - Jan > Settings > Model Providers
         | 
         | - Add new provider called "Ollama"
         | 
         | - Set API key to "ollama" and point to
         | http://localhost:11434/v1
         | 
         | - Ensure variables above are set
         | 
         | - Click "Refresh" and the models should load
         | 
          | Note: Even though an API key is not required for local
          | Ollama, Jan apparently doesn't consider the endpoint valid
          | unless a key is provided. I set mine to "ollama" and then it
          | allowed me to start a chat.
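The steps above amount to treating Ollama as just another OpenAI-compatible provider. A minimal sketch of the request Jan would then issue (the endpoint path and the "ollama" placeholder key come from the steps above; the model name is only an example):

```python
# Hedged sketch: what a chat request to Ollama's OpenAI-compatible
# endpoint looks like once Jan is configured as described above.
import json

BASE_URL = "http://localhost:11434/v1"   # Ollama's OpenAI-compatible endpoint

def chat_request(model: str, prompt: str):
    """Build the headers and JSON body for POST <BASE_URL>/chat/completions."""
    headers = {
        "Content-Type": "application/json",
        # Local Ollama ignores the key, but Jan requires one to be set.
        "Authorization": "Bearer ollama",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return headers, body

headers, body = chat_request("gpt-oss:20b", "Hello!")
print(body)
```

Any client that speaks this format (Jan, OWUI, a script) should work the same way against that endpoint, which is why fixing the 403 at the Ollama side was enough.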
        
       | klausa wrote:
       | So this is how women named Siri felt in 2011.
        
         | lagniappe wrote:
         | Hello Jan ;)
        
       | jwildeboer wrote:
       | My name is Jan and I am not an AI thingy. Just FTR. :)
        
         | underlines wrote:
         | Jan here too, and I work with LLMs full time and I'm a speaker
         | about these topics. Annoying how many times people ask me if
         | Jan.ai is me lol
        
           | dsp_person wrote:
           | We need a steve.ai
        
             | ithkuil wrote:
             | I want a Robert Duck AI
        
       ___________________________________________________________________
       (page generated 2025-08-09 23:01 UTC)