[HN Gopher] Exploring biphasic programming: a new approach in la...
       ___________________________________________________________________
        
       Exploring biphasic programming: a new approach in language design
        
       Author : chriscbr
       Score  : 44 points
       Date   : 2024-07-02 17:20 UTC (5 hours ago)
        
 (HTM) web link (rybicki.io)
 (TXT) w3m dump (rybicki.io)
        
       | warpspin wrote:
       | He missed one of the earliest examples of "languages and
       | frameworks that enable identical syntax to express computations
       | executed in two distinct phases" - immediate words in Forth:
       | https://www.forth.com/starting-forth/11-forth-compiler-defin...
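        | 
        | As a loose analogy in C++ terms (a sketch, not Forth): a C++20
        | `consteval` function -- which the standard literally calls an
        | "immediate function" -- must execute during compilation, much
        | as an immediate word runs while the compiler is still
        | processing a definition:
        | 
        |     // C++20 "immediate function": every call must be
        |     // evaluated at compile time, or compilation fails
        |     consteval int square(int n) { return n * n; }
        | 
        |     int table_entry = square(12); // computed by the compiler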
        
         | Munksgaard wrote:
         | Maybe I'm missing something, but isn't Lisp the original
         | version of this?
        
           | warpspin wrote:
            | Well, Lisp macros are at least mentioned in a footnote, and
            | yes, they may well be the oldest version of this idea.
        
             | kragen wrote:
             | lisp macros are probably newer than forth, but i don't know
             | that they're newer than the words [ and ] in forth. but
             | lisp did permit arbitrary compile-time computation from the
             | beginning, i believe
        
           | TeMPOraL wrote:
           | Pretty much. And not just via macros.
           | 
            | Lisp languages tend to blend together "runtime" with "load
            | time" and, in the case of compiled languages, also "compile
            | time". You can write code that executes during any one of
            | these phases, or any combination of them. You can reuse
            | code between those phases. You can interleave them at will
            | - e.g. by loading more code at runtime, or by invoking the
            | compiler, etc.
        
           | kragen wrote:
           | probably so, yes, but forth would be #2
        
         | Someone wrote:
         | Immediate words don't cover the _"while maintaining consistent
         | behavior (i.e., semantics) across phases"_ of the definition
         | given, do they?
         | 
         | I think normal forth words are way closer to that. They (1)
         | normally just do whatever their definition implies, but inside
         | a colon definition, they (1) compile code that does whatever
         | their definition implies.
         | 
          | The article does miss C++ _constexpr_
          | (https://en.cppreference.com/w/cpp/language/constexpr). I
          | haven't read the Zig docs, but it seems highly similar to
          | Zig's _comptime_ to me.
         | 
         | (1) technically, it's not "they" doing that themselves but
         | whatever code processes them.
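          | 
          | A minimal C++ sketch of that parallel: a _constexpr_
          | function has a single definition with consistent semantics,
          | and the same code can be forced to compile time or left to
          | run time:
          | 
          |     #include <cstdio>
          | 
          |     // may run in either phase, with identical semantics
          |     constexpr unsigned fib(unsigned n) {
          |         return n < 2 ? n : fib(n - 1) + fib(n - 2);
          |     }
          | 
          |     int main() {
          |         constexpr unsigned a = fib(10); // compile time
          |         static_assert(a == 55);         // compiler-checked
          |         unsigned n;
          |         if (std::scanf("%u", &n) == 1)
          |             std::printf("%u\n", fib(n)); // run time
          |     }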
        
           | kragen wrote:
           | yeah, the forth feature that actually does this is not
           | immediate words in general, but the two immediate words [ and
           | ], which allow you to do arbitrary ad hoc computations at
           | compile time instead of at run time
           | 
            | but also you can just put the code you would have put between
            | the [ ] delimiter words into an immediate word, and call it
            | where you would have put the [ ] block. the effect is not
            | _exactly_ the same, but its semantics are slightly more
            | consistent with the usual ones, because your immediate word
            | is in fact compiled just like non-immediate words are
        
       | cobbal wrote:
       | Racket has a sophisticated system for dealing with phases in its
       | macro system: https://docs.racket-lang.org/guide/phases.html I
        | don't know if other Schemes use a similar system.
        
       | graypegg wrote:
       | I wonder if something like Ruby could fit into this category too,
       | even though there isn't a clean line between the two phases. (I'm
       | stretching the concept a bit heh)
       | 
        | The block inside of a class or module definition is executed
        | first, and then the application can work on the resulting
        | structure generated by that pass. Sorbet (a Ruby static typing
        | library) uses this first pass to generate its type metadata,
        | without running application code. (I think by stubbing the
        | Class and Module classes themselves?)
        
       | Svoka wrote:
        | To be honest, `comptime` seems excessive. If something can be
        | calculated at compile time, it should be. Why the extra
        | keywords for that? Rust mostly does this already.
        
         | staunton wrote:
         | > if something can be calculated at compile time, it should be
         | 
          | Often you don't actually want some things done at compile
          | time, even though they could be. It can lead to, e.g.,
          | excessive executable sizes or excessive compile times. If
          | you've ever considered using `-ftemplate-depth` in C++,
          | you've probably encountered such a case.
          | 
          | Maybe it sounds like I'm splitting hairs and you would say
          | "of course in _such crazy cases_ it's not true", but if you
          | look at C++ projects and what _can_ be done at compile time
          | with modern C++, you'll find it's not rare at all.
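          | 
          | A small sketch of the executable-size point in C++ (a
          | hypothetical table, not from any real project): forcing the
          | whole computation to compile time bakes the result into the
          | binary, where a runtime loop would have cost microseconds.
          | 
          |     #include <array>
          |     #include <cstddef>
          | 
          |     constexpr std::size_t N = 1 << 16;
          | 
          |     constexpr std::array<unsigned, N> make_table() {
          |         std::array<unsigned, N> t{};
          |         for (std::size_t i = 0; i < N; ++i)
          |             t[i] = static_cast<unsigned>(i * i);
          |         return t;
          |     }
          | 
          |     // 256 KB of initialized data in the executable image
          |     constexpr auto table = make_table();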
        
           | tomjakubowski wrote:
            | Yes. Earlier in my life as a programmer, my attitude was
            | that moving as much as possible to "compile time" was a
            | universal good. The truth, of course, is that like
            | everything it involves trade-offs.
            | 
            | I once worked on an Elixir application whose configuration
            | was accessed almost exclusively at compile time. This was
            | done with the well-intentioned notion that saving runtime
            | cycles was a good thing. It meant that changing any config
            | (e.g. "name of s3 bucket") meant recompiling the entire
            | application. It also meant we had to wait for a full
            | application rebuild in CI to deploy fixes for simple
            | configuration errors. Not so super.
        
         | trealira wrote:
          | One reason you might want an explicit keyword is so that the
          | code fails to compile if it can't be evaluated at compile
          | time, as intended, rather than silently falling back to
          | runtime evaluation. It also seems useful as an ad hoc way of
          | annotating pure functions: such functions are guaranteed not
          | to modify global variables or do I/O at runtime.
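          | 
          | In C++ terms, that's roughly the `constexpr`/`consteval`
          | split (a sketch; `runtime_value` is a stand-in for any
          | non-constant input):
          | 
          |     constexpr int twice(int n) { return 2 * n; }
          |     consteval int twice_ct(int n) { return 2 * n; }
          | 
          |     int runtime_value(); // defined elsewhere
          | 
          |     // OK: silently evaluated at run time
          |     int a = twice(runtime_value());
          | 
          |     // error: call is not a constant expression
          |     // int b = twice_ct(runtime_value());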
        
           | Svoka wrote:
            | I guess so... That's the reason for `consteval` and such.
        
       | cb321 wrote:
        | Nim & D also have the compile-time function evaluation he
        | mentions for Zig. Nim also has a full macro system wherein
        | macros are written in Nim - just taking & producing ASTs. I've
        | known people to refer to this and Julia's macro systems as
        | "homoiconic". Nim also has a JavaScript backend enabling the
        | same syntax on client & server, like his React & Clojure
        | examples.
        
       | taliesinb wrote:
       | The end-game is just dissolving any distinction between compile-
       | time and run-time. Other examples of dichotomies that could be
       | partially dissolved by similar kinds of universal acid:
       | 
        | * dynamic typing vs static typing, a continuum that JIT-ing and
        | compiling attack from either end -- in some sense dynamically
        | typed programs are ALSO statically typed -- with all function
        | types being dependent function types and all value types being
        | sum types. After all, a term of a dependent sum, a dependent
        | pair, _is_ just a boxed value.
       | 
        | * monomorphisation vs polymorphism-via-
        | vtables/interfaces/protocols, which, roughly speaking, trade
        | instruction cache density for data cache density (see the
        | sketch after this list)
       | 
        | * RC vs GC vs heap allocation, via compiler-assisted proofs of
        | the memory ownership relationships that describe how this is
        | supposed to happen
       | 
       | * privileging the stack and instruction pointer rather than
       | making this kind of transient program state a first-class data
       | structure like any other, to enable implementing your own co-
       | routines and whatever else. an analogous situation: Zig deciding
       | that memory allocation should NOT be so privileged as to be an
       | "invisible facility" one assumes is global.
       | 
       | * privileging _pointers_ themselves as a global type constructor
       | rather than as typeclasses. we could have pointer-using functions
       | that transparently monomorphize in more efficient ways when you
       | happen to know how many items you need and how they can be
       | accessed, owned, allocated, and de-allocated. global heap
       | pointers waste _so_ much space.
       | 
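        | A C++ sketch of the monomorphisation trade-off mentioned above
        | (`Shape` is a hypothetical example type):
        | 
        |     #include <vector>
        | 
        |     struct Shape {
        |         virtual ~Shape() = default;
        |         virtual double area() const = 0;
        |     };
        | 
        |     // vtable polymorphism: one copy of the code, indirect
        |     // calls, and every object carries a vtable pointer
        |     double total_area(const std::vector<const Shape*>& v) {
        |         double sum = 0;
        |         for (auto* s : v) sum += s->area();
        |         return sum;
        |     }
        | 
        |     // monomorphisation: one specialized copy per T (more
        |     // instructions), direct calls the optimizer can inline
        |     template <typename T>
        |     double total_area_mono(const std::vector<T>& v) {
        |         double sum = 0;
        |         for (const auto& s : v) sum += s.area();
        |         return sum;
        |     }
        | 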
       | Instead, one would have code for which it makes more or less
       | sense to spend time optimizing in ways that privilege memory
       | usage, execution efficiency, instruction density, clarity of
       | denotational semantics, etc, etc, etc.
       | 
       | Currently, we have these weird siloed ways of doing _certain_
       | kinds of privileging in _certain_ languages with rather arbitrary
       | boundaries for how far you can go. I hope one day we have
       | languages that just dissolve all of this decision making and
       | engineering into universal facilities in which the language can
        | be anything you need it to be -- it's just a neutral substrate
       | for expressing computation and how you want to produce machine
       | artifacts that can be run in various ways.
       | 
       | Presumably a future language like this, if it ever exists, would
       | descend from one of today's proof assistants.
        
       | kragen wrote:
       | this 'biphasic programming' thing is item #9 in pg's list of
       | 'what made lisp different' from 02001:
       | https://paulgraham.com/diff.html
       | 
       | it's interesting to read this biphasic programming article in the
       | context of pg's tendentious reading of programming language
       | history
       | 
       | > _Over time, the default language, embodied in a succession of
       | popular languages, has gradually evolved toward Lisp. 1-5 are now
       | widespread. 6 is starting to appear in the mainstream. Python has
        | a form of 7, though there doesn't seem to be any syntax for it.
       | 8, which (with 9) is what makes Lisp macros possible, is so far
       | still unique to Lisp, perhaps because (a) it requires those
       | parens, or something just as bad, and (b) if you add that final
       | increment of power, you can no longer claim to have invented a
        | new language, but only to have designed a new dialect of Lisp
        | ;-)_
       | 
       | it of course isn't _absolutely_ unique to lisp; forth also has it
       | 
       | i think the academic concept of 'staged programming' is a
       | generalization of this, and partial evaluation is a very general
       | way to blur the lines between compile time and run time
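        | 
        | a tiny illustration of that blurring, in c++ rather than a
        | proper staged language (just a sketch): the exponent lives in
        | the compile-time stage, the base in the run-time stage
        | 
        |     // the classic staged-programming example: pow specialized
        |     // on its exponent at compile time
        |     template <unsigned N>
        |     double power(double x) {
        |         if constexpr (N == 0) return 1.0;
        |         else return x * power<N - 1>(x);
        |     }
        | 
        |     double cube(double x) { return power<3>(x); } // x * x * x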
        
       | funcDropShadow wrote:
        | This is also a special case of what MetaOCaml calls multi-stage
        | programming. It supports not only two phases but arbitrarily
        | many. A similar prototype also exists for an older Scala
        | version. And Lisp and Forth obviously also support n phases of
        | computation.
        
       | jalk wrote:
       | "Biphasic programming" is also present in frameworks like Apache
       | Spark, Tensorflow, build tools like Gradle and code-first
       | workflow engines. Execution of the first phase generates a DAG of
       | code to be executed later. IMO the hardest thing for newcomers is
       | when phase 1 and phase 2 code is interleaved with no immediate
       | clear boundaries, (phase 1 code resembles an internal DSL). The
       | docs need to teach this early on to avoid confusion. A prime
       | offender of this is SBT, with its (perhaps no longer true) 3
       | stage rocket, which is not really described in the docs (see
       | https://www.lihaoyi.com/post/SowhatswrongwithSBT.html#too-ma...)
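        | 
        | A minimal sketch of that shape in C++ (not any real Spark or
        | Gradle API; `lit` and `add` are made-up node builders):
        | 
        |     #include <cstdio>
        |     #include <functional>
        |     #include <memory>
        |     #include <vector>
        | 
        |     // phase 1 builds a graph of deferred steps;
        |     // phase 2 walks the graph and runs them
        |     struct Node {
        |         std::function<int(const std::vector<int>&)> op;
        |         std::vector<std::shared_ptr<Node>> deps;
        |         int run() const {
        |             std::vector<int> in;
        |             for (auto& d : deps) in.push_back(d->run());
        |             return op(in);
        |         }
        |     };
        |     using NodePtr = std::shared_ptr<Node>;
        | 
        |     NodePtr lit(int v) {
        |         return std::make_shared<Node>(
        |             Node{[v](auto&) { return v; }, {}});
        |     }
        | 
        |     NodePtr add(NodePtr a, NodePtr b) {
        |         return std::make_shared<Node>(
        |             Node{[](auto& in) { return in[0] + in[1]; },
        |                  {a, b}});
        |     }
        | 
        |     int main() {
        |         // phase 1: only graph construction, no arithmetic
        |         auto g = add(lit(2), add(lit(3), lit(4)));
        |         std::printf("%d\n", g->run()); // phase 2: prints 9
        |     }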
        
       | AlexErrant wrote:
        | Another example of biphasic programming is parser generators
        | whose DSLs are compiled into parsers, e.g. Tree-sitter or Lezer.
        
       | ithkuil wrote:
        | Also interesting is the Singeli language
       | https://github.com/mlochbaum/Singeli/tree/master
        
       | bitwize wrote:
       | > And compared to Lisps like Scheme and Racket which support
        | hygienic macros, well, Zig doesn't require everything to be a
       | list.
       | 
       | Found the smug anti-Lisp weenie. Not everything is a list in
        | Lisp. In fact, Lisp and Forth are among the most powerful
        | "biphasic" languages, in that in both, expressions in the full
        | language can be evaluated at compile time. Pre-Scheme, a GC-less,
       | statically typed, "systems" subset of Scheme, lets you use the
       | full Scheme language for any expression that can be provably
       | evaluated at compile time (for example, using DEFINE to introduce
       | a variable at top level).
        
       | StiffFreeze9 wrote:
       | Other "biphasic"-like aspects of programming languages and code:
       | 
       | - Documentation generated from inline code comments (Knuth's
       | literate programming)
       | 
       | - Test code
       | 
       | We could expand to
       | 
        | - security (beyond Perl taint)
       | 
       | - O(n) runtime and memory analysis
       | 
       | - parallelism or clustering
       | 
       | - latency budgets
       | 
        | And for those academically inclined, formal language semantics
        | like https://en.wikipedia.org/wiki/Denotational_semantics versus
        | operational semantics and others.
        
       ___________________________________________________________________
       (page generated 2024-07-02 23:00 UTC)