[HN Gopher] Convo-Lang: LLM Programming Language and Runtime
       ___________________________________________________________________
        
       Convo-Lang: LLM Programming Language and Runtime
        
       Author : handfuloflight
       Score  : 67 points
       Date   : 2025-08-14 05:40 UTC (17 hours ago)
        
 (HTM) web link (learn.convo-lang.ai)
 (TXT) w3m dump (learn.convo-lang.ai)
        
       | benswerd wrote:
       | How do you think about remote configurability?
       | 
       | Stuff like a lot of this needing to be A/B tested, models hot
       | swapped, and versioned in a way that's accessible to non-technical
       | people?
       | 
       | How do you think about this in relation to tools like BAML?
        
       | yewenjie wrote:
       | What is a motivating use case that this solves?
        
         | otabdeveloper4 wrote:
         | Riding the LLM hype train to its exhaustion.
        
           | N_Lens wrote:
           | ChooChoo!
        
       | bn-l wrote:
       | It's a noisy / busy syntax. Just my own opinion.
        
       | machiaweliczny wrote:
       | Why not library?
        
       | croes wrote:
       | Next step, an LLM that writes convo-lang programs to program
       | with an LLM
        
       | gnubee wrote:
       | This looks a lot like another effective way of interacting with
       | LLMs: english-lang. Some of english-lang's features are that it
       | can be used to convey meaning, and it's largely accepted (network
       | effect!). I'm excited to see what convo brings to the table /s
        
         | ttoinou wrote:
         | You're absolutely right!
        
       | mrs6969 wrote:
       | Nice try. We will eventually get there, but I think this can and
       | needs to get better.
        
       | Disposal8433 wrote:
       | The new COBOL. The next step is obviously to add syntax when you
       | need to specify the type of the variables: put the type first,
       | then the name and its value, and finish with a semicolon because
       | it's fun, like "int n = 0;"
        
         | taneq wrote:
         | COBOL ? Hurrah! If there's anything that would improve vibe
         | coding, it's a "come from" statement. :P
        
           | Y_Y wrote:
           | MARKETING DIVISION
        
             | warkdarrior wrote:
             | "Divide by zero error encountered."
        
       | devops000 wrote:
       | Why not as a library in Ruby or Python?
        
       | khalic wrote:
       | Cool concept that brings a little structure to prompts. I
       | wouldn't use the semantic part that much, English is fine for
       | this, but there is a real need for machine instructions. There is
       | no need for an LLM to guess whether "main" is a function or a file,
       | for example.
        
       | dmundhra wrote:
       | How is it different than DSPy?
        
         | xwowsersx wrote:
         | I haven't used DSPy that much, but as I understand it: this
         | lang is more like an orchestration DSL for writing and running
         | LLM conversations and tools, whereas DSPy is a framework that
         | compiles and optimizes LLM programs into better-performing
         | prompts...like DSPy has automatic improvement of pipelines
         | using its compilers/optimizers. With DSPy you deal with modules
         | and signatures.
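         | 
         | For a rough picture of the DSPy side, a minimal Python sketch from
         | memory (the exact calls and the model string are assumptions and
         | may differ between DSPy versions):
         | 
         |   import dspy
         | 
         |   # Point DSPy at a model (assumed model name).
         |   dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))
         | 
         |   # A signature declares typed inputs/outputs instead of a prompt string.
         |   class AnswerQuestion(dspy.Signature):
         |       """Answer the question in one sentence."""
         |       question = dspy.InputField()
         |       answer = dspy.OutputField()
         | 
         |   # A module (Predict, ChainOfThought, ...) makes the signature callable;
         |   # DSPy's optimizers can then rewrite the prompt behind it.
         |   qa = dspy.Predict(AnswerQuestion)
         |   print(qa(question="What is Convo-Lang?").answer)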
        
       | trehans wrote:
       | I'm not sure what this is about; would anyone mind an ELI5?
        
         | xwowsersx wrote:
         | Not sure I'm sold on this particular implementation, but here's
         | my best steelman: working with LLMs through plain text prompts
         | can be brittle...tiny wording changes can alter outputs,
         | context handling is improvised, and tool integration often
         | means writing one-off glue code. This is meant to be a DSL to add
         | structure: break workflows into discrete steps, define vars,
         | manage state, explicitly control when and how the model acts,
         | and so on.
         | 
         | It basically gives you a formal syntax for orchestrating multi-
         | turn LLM interactions, integrating tool calls + managing
         | context in a predictable, maintainable way...essentially trying to
         | bring some structure to "prompt engineering" and make it a bit
         | more like a proper, composable programming discipline/model.
         | 
         | Something like that.
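         | 
         | For contrast, the hand-rolled version it aims to replace tends to
         | look like this Python sketch (the OpenAI client is used purely as
         | an example; the model name and wording are placeholders):
         | 
         |   from openai import OpenAI
         | 
         |   client = OpenAI()
         |   history = []  # context handling is improvised: just a list we append to
         | 
         |   def ask(text: str) -> str:
         |       history.append({"role": "user", "content": text})
         |       resp = client.chat.completions.create(
         |           model="gpt-4o-mini",  # placeholder model name
         |           messages=history,
         |       )
         |       answer = resp.choices[0].message.content
         |       history.append({"role": "assistant", "content": answer})
         |       return answer
         | 
         |   # Brittle by construction: the prompt wording, the "state", and any
         |   # tool calls all live in ad hoc strings and one-off glue code.
         |   print(ask("Summarize this ticket and decide if it needs escalation."))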
        
       | brainless wrote:
       | I have thought of this issue quite a few times. I use Claude
       | Code, Gemini CLI, etc. for all my new projects. Each of the
       | typical CLAUDE.md/GEMINI.md files exists. I do not use MCPs. I ask
       | agents to use `gh` command, all my work happens around
       | Git/GitHub.
       | 
       | But text is just that, while scripts are easier to rely on. I can
       | prompt and document all mechanisms to, say, check code format.
       | But once I add something, say a pre-commit hook, it becomes
       | reliable.
       | 
       | I am looking for a human readable (maybe renderable) way to
       | codify patterns.
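       | 
       | To make the contrast concrete, the "reliable" version of a format
       | check is just an executable .git/hooks/pre-commit; a minimal Python
       | sketch (black is only an example formatter, use whatever check you
       | currently prompt the agent about):
       | 
       |   #!/usr/bin/env python3
       |   # .git/hooks/pre-commit -- codified instead of prompted
       |   import subprocess
       |   import sys
       | 
       |   result = subprocess.run(["black", "--check", "."])
       |   if result.returncode != 0:
       |       print("Formatting check failed; run `black .` before committing.")
       |       sys.exit(1)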
        
       | zuzuen_1 wrote:
       | Perhaps when LLMs introduce a lot more primitives for modifying
       | behavior, such a programming language would be necessary.
       | 
       | As anyone working with LLMs knows, most of the work happens
       | before and after the LLM call, like doing REST calls,
       | saving to database, etc. Conventional programming languages work
       | well for that purpose.
       | 
       | Personally, I like JSON when the data is not too huge. It's easy
       | to read (since it is hierarchical like most declarative formats)
       | and parse.
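       | 
       | In plain Python that before/after work is a few lines; a minimal
       | sketch (the endpoint is a placeholder and call_llm is a hypothetical
       | stand-in for whatever client you already use):
       | 
       |   import json
       |   import sqlite3
       |   import requests
       | 
       |   def call_llm(prompt: str) -> str:
       |       # hypothetical stand-in for a real LLM client call
       |       return '{"label": "standard"}'
       | 
       |   # Before the call: a REST fetch (placeholder endpoint).
       |   order = requests.get("https://api.example.com/orders/42", timeout=10).json()
       | 
       |   # The call itself, asking for JSON because it is easy to read and parse.
       |   data = json.loads(call_llm(f"Classify this order as JSON: {json.dumps(order)}"))
       | 
       |   # After the call: save to a database.
       |   con = sqlite3.connect("results.db")
       |   con.execute("CREATE TABLE IF NOT EXISTS labels (order_id INTEGER, label TEXT)")
       |   con.execute("INSERT INTO labels VALUES (?, ?)", (order["id"], data["label"]))
       |   con.commit()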
        
         | zuzuen_1 wrote:
         | One pain point such a PL could address is encoding tribal
         | knowledge about optimal prompting strategies for various LLMs,
         | which changes with each new model release.
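         | 
         | Outside a dedicated language that tends to end up as a hand-kept
         | table; a minimal Python sketch (the model names and the "knowledge"
         | are purely illustrative):
         | 
         |   # Per-model prompting conventions, updated on every model release.
         |   PRESETS = {
         |       "gpt-4o-mini": {"style": "terse bullets", "json_nudge": False},
         |       "small-local-model": {"style": "explicit numbered steps", "json_nudge": True},
         |   }
         | 
         |   def system_prompt(model: str, task: str) -> str:
         |       p = PRESETS.get(model, {"style": "plain prose", "json_nudge": True})
         |       extra = " Respond with valid JSON only." if p["json_nudge"] else ""
         |       return f"Use {p['style']}. Task: {task}.{extra}"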
        
       | meindnoch wrote:
       |   @on user
       |   > onAskAboutConvoLang() -> (
       |       if(??? (+ boolean /m last:3 task:Inspecting message)
       |           Did the user ask about Convo-Lang in their last message
       |       ???) then (
       |           @ragForMsg public/learn-convo
       |           ??? (+ respond /m task:Generating response about Convo-Lang)
       |           Answer the users question using the following information about Convo-Lang
       |           ???
       |       )
       |   )
       | 
       |   > user
       | 
       | Who in their right mind would come up with such a "syntax"? An
       | LLM?
        
         | lnenad wrote:
         | I have to agree, it looks wild, even the simpler examples don't
         | feel ergonomic.
        
         | ljm wrote:
         | ... I think I'll just stick with pydantic AI for now
        
       | swoorup wrote:
       | Money Incinerator Lang would be a fitting name as well.
        
       | brabel wrote:
       | I like it. Much nicer than having to use some python SDK in my
       | opinion. Is this a standalone language or does it require Python or
       | other languages to run it?
        
       | aurumque wrote:
       | This is a really great experiment that gets a lot of things
       | right!
        
       | pryelluw wrote:
       | Like terraform for prompts.
       | 
       | Put that on the landing page.
        
       ___________________________________________________________________
       (page generated 2025-08-14 23:02 UTC)