[HN Gopher] Show HN: Onit - open-source ChatGPT Desktop with loc...
___________________________________________________________________
Show HN: Onit - open-source ChatGPT Desktop with local mode,
Claude, Gemini
Hey Hacker News, it's Tim Lenardo and I'm launching v1 of Onit
today! Onit is ChatGPT Desktop, but with local mode and support
for other model providers (Anthropic, GoogleAI, etc.). It's also
like Cursor Chat, but everywhere on your computer, not just in
your IDE!

Onit is open-source! You can download a pre-built version from our
website: www.getonit.ai
Or build directly from the source code:
https://github.com/synth-inc/onit

We built this because we believe:

Universal Access: AI assistants should be accessible from anywhere
on your computer, not just in the browser or in specific apps.

Provider Freedom: Consumers should have a choice between providers
(Anthropic, OpenAI, etc.), not be locked into a single one (ChatGPT
Desktop only has OpenAI models).

Local First: AI is more useful with access to your data. But that
doesn't count for much if you have to upload personal files to an
untrusted server. Onit will always provide options for local
processing. No personal data leaves your computer without approval.

Customizability: Onit is your assistant. You should be able to
configure it to your liking.

Extensibility: Onit should allow the community to build and share
extensions, making it more useful for everyone.

The features for V1 include:

Local mode - chat with any model running locally on Ollama! No
internet connection required.

Multi-provider support - top models from OpenAI, Anthropic, xAI,
and GoogleAI.

File upload - add images or files for context (bonus: drag & drop
works too!).

History - revisit prior chats through the history view or with a
simple up/down arrow shortcut.

Customizable shortcut - you pick your hotkey to launch the chat
window (Command+Zero by default).
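Multi-provider support here means the client calls each provider's API directly, with no Onit server in between. A minimal Python sketch against Anthropic's public Messages API; the function name and defaults are illustrative assumptions, not Onit's actual (Swift) code:

```python
import json

ANTHROPIC_API_URL = "https://api.anthropic.com/v1/messages"

def build_anthropic_request(api_key: str, prompt: str,
                            model: str = "claude-3-5-sonnet-20241022") -> dict:
    """Build the headers and JSON body for a direct Messages API call.

    The request goes straight from the client to api.anthropic.com,
    so the prompt never passes through a third-party server.
    The API key is supplied by the user and stored locally.
    """
    return {
        "url": ANTHROPIC_API_URL,
        "headers": {
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }
```

Swapping providers is then just a matter of swapping the endpoint, headers, and body schema; the same client-direct pattern applies to OpenAI, xAI, and GoogleAI.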
Anticipated questions:

What data are you collecting? Onit V1 does not have a server. Local
requests are handled locally, and remote requests are sent to model
providers directly from the client. We collect crash reports
through Firebase and a single "chat sent" event through PostHog
analytics. We don't store your prompts or responses.

How does Onit support local mode? To use local mode, run Ollama.
You can get Ollama here: https://ollama.com/ Onit gets the list of
your local models through Ollama's API.

Which models do you support? For remote models, Onit V1 supports
Anthropic, OpenAI, xAI, and GoogleAI. Default models include o1,
o1-mini, GPT-4o, Claude 3.5 Sonnet, Claude 3.5 Haiku, Gemini 2.0,
Grok 2, and Grok 2 Vision. For local mode, Onit supports any model
you can run locally on Ollama!

What license is Onit under? We're releasing V1 under a Creative
Commons Non-Commercial license. We believe the transparency of open
source is critical. We also want to make sure individuals can
customize Onit to their needs (please submit PRs!). However, we
don't want people to sell the code as their own.

Where is the monetization? We're not monetizing V1. In the future
we may add paid premium features. Local chat will, of course,
always remain free. If you disagree with a monetized feature, you
can always build from source!

Why not Linux or Windows? Gotta start somewhere! If the reception
is positive, we'll work hard to add support for them.

Who are we? We are Synth, Inc., a small team of developers in San
Francisco building at the frontier of AI progress. Other projects
include Checkbin (www.checkbin.dev) and Alias (deprecated -
www.alias.inc). We'd love to hear from you! Feel free to reach out
at contact@getonit dot ai.

Future roadmap includes:

Autocontext - automatically pull context from your computer, rather
than having to upload it repeatedly.

Local-RAG - let users index and create context from their files
without uploading anything.

Local-typeahead - i.e. Cursor Tab, but for everywhere.

Additional support - Linux/Windows, Mistral/DeepSeek, etc.

(Maybe) Bundle Ollama to avoid a double download.

And lots more!
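Discovering local models, as described above, boils down to one HTTP call: GET /api/tags on Ollama's default port (11434). A minimal Python sketch; the helper names are mine for illustration, not Onit's actual (Swift) code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def parse_model_names(tags_json: str) -> list[str]:
    """Extract installed model names from an /api/tags response body.

    The response is JSON of the form {"models": [{"name": ...}, ...]}.
    """
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

def list_local_models() -> list[str]:
    """Ask the local Ollama daemon which models are installed."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return parse_model_names(resp.read().decode())
```

Because everything stays on localhost, no prompt or model data leaves the machine; the app only needs Ollama running to populate its model picker.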
Author : telenardo
Score : 13 points
Date : 2025-01-24 22:15 UTC (45 minutes ago)
(HTM) web link (github.com)
(TXT) w3m dump (github.com)
| jjmaxwell4 wrote:
| The idea of a universal AI assistant across the desktop is cool.
  | I like the emphasis on local processing and provider choice.
|
| I have tried out V1 and while it's a bit barebones, the planned
| features like 'Autocontext' and 'Local-RAG' sound promising.
| Devil's in the implementation details though.
| emacsen wrote:
  | I was so excited to read this, then I saw the headline is
  | deceptive. It's not Open Source; it uses a Creative Commons
  | "Non-Commercial" license.
|
| CC licenses are not meant for software. They explicitly say so on
| their FAQ: https://creativecommons.org/faq/#can-i-apply-a-
| creative-comm...
|
| And non-commercial licenses are not Open Source, period. This has
| been well established since the 1990s, both by the FSF and the
| OSI.
|
| It's such a promising piece of software, but deceptive
| advertising is a bad way to start off a relationship of any sort.
| elashri wrote:
| I would like to add that this is probably not deceptive
| advertising. At least not intentional deceptive as many people
| including me didn't know that CC licenses are not meant for
| software and is not considered open source. I don't know if it
| is common misunderstanding or not but I think there is strong
| case to say that some people intuitively would think so.
| airstrike wrote:
| [delayed]
___________________________________________________________________
(page generated 2025-01-24 23:00 UTC)