[HN Gopher] What was the last breakthrough in computer programmi...
       ___________________________________________________________________
        
       What was the last breakthrough in computer programming? (2019)
        
       Author : ColinWright
       Score  : 95 points
       Date   : 2021-08-17 14:22 UTC (8 hours ago)
        
 (HTM) web link (www.quora.com)
 (TXT) w3m dump (www.quora.com)
        
       | jimbokun wrote:
       | I think the biggest programming productivity boosts since 1984
       | haven't been about programming languages, but about tools.
       | 
       | Specifically, distributed version control and dependency
       | management tools.
       | 
       | Being able to collaborate with developers anywhere in the world,
       | and being able to pull in any library with a single declarative
       | line in a configuration file, increases productivity more than
       | any improvement to a programming language ever could.
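        | 
        | For a concrete example (just one ecosystem's flavor of this), a
        | single declarative line in a Rust project's Cargo.toml pulls in
        | a library and all of its transitive dependencies:
        | 
        |     [dependencies]
        |     serde = "1.0"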
        
         | ComodoHacker wrote:
         | >Specifically, distributed version control and dependency
         | management tools.
         | 
         | Indeed, without these tools dependency hell wouldn't be
         | possible! /s
        
           | kenjackson wrote:
           | Oh, we still had it!
        
         | chasd00 wrote:
          | I agree. Could you imagine doing something like Java Spring
          | without Maven or the equivalent? It wouldn't be possible.
        
       | karmakaze wrote:
        | There isn't so much in language features themselves, but in
       | their implementations. GC can be largely pauseless for many
       | practical purposes and a GC language can be within 2x-3x the
       | performance of C-like languages.
       | 
        | Also, the amount/cost of memory has improved so that we can use
        | immutable data structures and functional style in many contexts,
        | which definitely feels like a 'level-up'.
       | 
       | Concurrency has been getting easier too, with many languages
       | supporting coroutines/async and/or threads. Reference
       | capabilities are exciting as in Pony, Rust, or Clean.
       | 
       | In general there's a great convergence where ergonomics are
       | improving (editors, compilers, build/package systems) and each
       | language evolves to adopt features of other languages.
       | 
        | I just dabbled in the Sorbet type checker after not writing any
        | C++ since the 90s, and it was surprisingly browseable/readable; I
        | could map the concepts onto recent Ruby, Java, and Go knowledge.
        
         | ragnese wrote:
         | It's nice to read a positive comment like yours occasionally,
         | because the vast majority of the time, I'm just disappointed in
         | how bad our programming tools (including languages) are.
         | 
         | It's become a meme in my office that I'm the guy constantly
         | bitching about how stupid our languages are. This week I was
         | back on my soap box about the fact that almost zero mainstream
          | (statically typed) programming languages even let you write
         | down that you want a non-empty string. In some languages you
         | can implement your own class/type that wraps around the built-
         | in string type, but most of the time you are now introducing
         | memory and CPU overhead, can't use your type with other APIs
         | that expect strings, etc. So nobody does that. But ask yourself
         | this: how often have you ever written a function that requested
         | a string as input and actually wanted an empty string? How many
         | times did you not even think about what would happen if someone
         | DID pass an empty string?
         | 
         | Same goes for positive and non-negative numbers. How many times
         | did you write "int" when you actually only wanted a positive
         | number? If I have to type "if x <= 0 throw FooException" as the
         | first line of one more function, I'm going to scream. (A lot of
         | languages do have unsigned ints, to be fair. But some still
         | don't, or their unsigned ints are almost useless.)
         | 
         | People make all kinds of Stockholm Syndrome-y excuses for it
         | ("validate inputs at the edges is all you need"), but the truth
         | is that (most of) our languages are so deficient for expressing
         | really basic ideas.
         | 
         | Thank goodness there are some languages that do try to make it
         | possible to write "newtypes" and try to make concurrency safer.
         | Now if we could just get everyone to adopt those in less than a
         | decade, maybe we'd be able to get to the next generation after
         | that, and then maybe we'll have good programming languages
         | before I die.
         | 
         | Sorry that turned into a rant... :/
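          | 
          | (To make the wish concrete: a minimal sketch of the kind of
          | "newtype" I mean, in Rust, where a wrapper like this adds
          | essentially no runtime overhead. The names are made up, not
          | from any library.)
          | 
          |     /// A string that is statically known to be non-empty.
          |     /// The only way to get one is `new`, which rejects "".
          |     #[derive(Debug, Clone, PartialEq, Eq)]
          |     pub struct NonEmptyString(String);
          | 
          |     impl NonEmptyString {
          |         pub fn new(s: String) -> Option<NonEmptyString> {
          |             if s.is_empty() {
          |                 None
          |             } else {
          |                 Some(NonEmptyString(s))
          |             }
          |         }
          | 
          |         /// Interop with APIs that just want a &str.
          |         pub fn as_str(&self) -> &str {
          |             &self.0
          |         }
          |     }
          | 
          |     fn create_user(name: NonEmptyString) {
          |         // No "if name is empty then throw" boilerplate here:
          |         // the type has already ruled that case out.
          |         println!("creating user {}", name.as_str());
          |     }
          | 
          |     fn main() {
          |         match NonEmptyString::new(String::from("alice")) {
          |             Some(name) => create_user(name),
          |             None => eprintln!("refusing an empty name"),
          |         }
          |     }
          | 
          | The check still runs once, at construction, but every function
          | downstream gets the guarantee for free.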
        
           | kwertyoowiyop wrote:
           | It'd also be handy to put other limits on my numeric
           | variables, for instance to automatically throw an exception
           | if an angle delta goes out of the range -180 to 180, or
           | whatever.
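            | 
            | (A sketch of the usual workaround in languages without built-
            | in range types: a wrapper that does the check once, at
            | construction. Rust here, names made up; the check is still a
            | runtime one, unlike a language with built-in ranged types
            | such as Ada.)
            | 
            |     /// An angle delta constrained to [-180.0, 180.0] degrees.
            |     #[derive(Debug, Clone, Copy)]
            |     struct AngleDelta(f64);
            | 
            |     impl AngleDelta {
            |         fn new(degrees: f64) -> Result<AngleDelta, String> {
            |             if (-180.0..=180.0).contains(&degrees) {
            |                 Ok(AngleDelta(degrees))
            |             } else {
            |                 Err(format!("{} is out of range", degrees))
            |             }
            |         }
            | 
            |         fn degrees(self) -> f64 {
            |             self.0
            |         }
            |     }
            | 
            |     fn main() {
            |         assert!(AngleDelta::new(90.0).is_ok());
            |         assert!(AngleDelta::new(270.0).is_err());
            |         println!("{:?}", AngleDelta::new(-45.0)
            |             .map(AngleDelta::degrees));
            |     }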
        
             | Jtsummers wrote:
             | Yes, it is handy. This is why I like Ada (or the idea of
             | it, it's rarely been used in my work because it's hard to
             | sell others on) for safety critical systems. With SPARK/Ada
             | you can even work towards proofs that your code won't
             | assign outside of that range so that you don't end up with
             | runtime exceptions.
        
           | [deleted]
        
           | sharikous wrote:
            | Have you tried Ada? Its type system is the closest I could
            | think of to your use case.
        
           | mtreis86 wrote:
            | Common Lisp can do that:
            | 
            |     CL-USER> (defun non-empty-string-p (string)
            |                (and (stringp string)
            |                     (plusp (length string))))
            |     NON-EMPTY-STRING-P
            |     CL-USER> (deftype non-empty-string ()
            |                `(satisfies non-empty-string-p))
            |     NON-EMPTY-STRING
            |     CL-USER> (typep "" 'non-empty-string)
            |     NIL
            |     CL-USER> (typep " " 'non-empty-string)
            |     T
        
           | amelius wrote:
           | > the vast majority of the time, I'm just disappointed in how
           | bad our programming tools (including languages) are
           | 
            | If you are disappointed in programming tools, you haven't
            | seen hardware design tools yet.
        
           | ComodoHacker wrote:
           | >how often have you ever written a function that requested a
           | string as input and actually wanted an empty string?
           | 
           | To be fair, I do it quite often. Most of the strings I deal
           | with in my code are coming from user input, and most of them
           | are optional. They are usually just passed to/from a
            | database. If the string has some internal meaning (like URLs
           | or file paths), it usually gets wrapped in an object anyway.
           | 
           | If you're processing some formal language or network
           | protocol, that's another story.
        
             | ragnese wrote:
             | Let me ask you this, though. If your strings that come from
             | user inputs are optional, doesn't that mean they could also
             | just not be present (as in null)? Why do you need or want
             | two different ways to express "nothing"? Are all of the
             | text fields just funneled right into the database without
             | checking/validating any of them? I've written a number of
             | RESTy/CRUDy APIs and I can't count the number of "check
             | that username isn't empty" checks I've written over the
             | years.
        
               | still_grokking wrote:
               | > I've written a number of RESTy/CRUDy APIs and I can't
               | count the number of "check that username isn't empty"
               | checks I've written over the years.
               | 
                | But you do it only once; thereafter you hopefully have
                | some "Username" type of object which is guaranteed to
                | contain a valid username.
        
               | feoren wrote:
               | The argument for disallowing nulls is much stronger than
               | the argument for demanding a compiler-enforced non-empty
               | string. I definitely support the ability to declare
               | variables, including strings, as non-nullable. An empty
               | string is simply analogous to the number 0. It doesn't
               | really overlap in meaning with null. It's true it would
               | be useful to occasionally disallow the number 0, but only
               | very occasionally. The obvious example is division, but
               | having a representation of +/- infinity alleviates some
               | cases.
               | 
               | > I've written a number of RESTy/CRUDy APIs and I can't
               | count the number of "check that username isn't empty"
               | checks I've written over the years.
               | 
               | Paraphrasing: "I've written the same kind of method over
               | and over throughout my career and have been unable to (or
               | made no attempt to) abstract it away." I love strong type
               | systems, but it doesn't sound like the type system is
               | your problem here. The problem is that you're constantly
               | re-implementing the same business logic.
        
               | onemoresoop wrote:
                | I hear your dismay, but you could easily build your own
                | library of string validations, which you can extend
                | however you want and reuse as much as you need.
        
           | still_grokking wrote:
           | You're looking for refinement-types?
           | 
           | Scala to the rescue:
           | 
           | https://github.com/fthomas/refined
           | 
           | And a new one:
           | 
           | https://github.com/Iltotore/iron/
           | 
           | The latter will be zero overhead once a compiler ticket is
           | closed.
           | 
           | Both solutions will yield compile time checks where possible,
           | and fall back to runtime checks when the data isn't
           | statically known.
        
           | lamontcg wrote:
           | > How many times did you write "int" when you actually only
           | wanted a positive number?
           | 
           | subtracting one from zero and getting max_uint can be its own
           | brand of fucking horribly awful.
           | 
           | having the language itself throw in that circumstance can
           | also be exactly what you don't want.
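            | 
            | (For what it's worth, this is one place where at least some
            | newer languages make you pick the behavior explicitly rather
            | than silently wrapping; a small Rust sketch of the options:)
            | 
            |     fn main() {
            |         let zero: u32 = 0;
            | 
            |         // Wrap-around, stated explicitly: 0 - 1 == u32::MAX.
            |         assert_eq!(zero.wrapping_sub(1), u32::MAX);
            | 
            |         // Detect underflow instead of wrapping.
            |         assert_eq!(zero.checked_sub(1), None);
            | 
            |         // Clamp at the boundary.
            |         assert_eq!(zero.saturating_sub(1), 0);
            | 
            |         // A plain `zero - 1` panics in debug builds and wraps
            |         // in release builds, i.e. "whatever the language
            |         // decided", which is the complaint here.
            |     }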
        
             | Jtsummers wrote:
              | In some languages you can get a choice. Broken record time,
              | but Ada:
              | 
              |     type Byte is mod 2**3;
              | 
              | This will permit any value in the range [0,7] and when you
              | exceed it (in either direction) it will wrap around (the
              | desired action if you choose this type). In contrast:
              | 
              |     type Byte is range 0..7;
              | 
              | will give you a runtime error when you exceed the bounds by
              | trying to increase beyond 7 or decrease below 0. Having
              | this choice is nice; you get to decide the semantics for
              | your system.
        
               | still_grokking wrote:
                | Are those static or runtime checks (or both)?
        
               | Jtsummers wrote:
                | Definitely runtime, potential for compile time. The
                | compile time checks will at least prevent obvious cases
                | (assigning a value out of the range, like using either
                | Byte type above: _T := -1_ will get a compile time
                | error). Using the second Byte type, paired with
                | SPARK/Ada, this bit of code should set off the proof
                | system and prevent compilation:
                | 
                |     procedure Foo is
                |        type Byte is range 0..7;
                |        T : Byte := 0;
                |     begin
                |        T := T - 1;
                |     end Foo;
                | 
                | (Not that that's useful code, but a basic example.) That
                | shouldn't make it past the SPARK/Ada system to
                | compilation. Now change it to this:
                | 
                |     function Foo(T : Byte) return Byte is
                |     begin
                |        return T + 1;
                |     end Foo;
                | 
                | and SPARK/Ada should warn (been a bit, but it should also
                | fail to compile) that this could cause overflow.
        
               | lamontcg wrote:
               | Yeah I took Ada for CS 210/211 back in 1991, are we there
               | yet?
        
               | Jtsummers wrote:
               | Ada's good, but it will never win. The open source
               | compilers are solid, but no one wants to learn it. It's
               | (depending on who you ask): Too old, too verbose, BDSM
               | programming, not Rust, not Haskell, not C.
               | 
               | It has a lot of positive features going for it, but it
               | _is_ verbose. That verbosity is a major distraction for
               | people who can 't handle it, they want their very short
               | keywords or, better, no keywords just symbols. Curly
               | braces are somehow better than begin/end even though
               | begin/end really aren't hard to type. Ada shines,
               | particularly, in the long tail of system maintenance, not
               | in the writing (arguable: the type system certainly helps
               | a lot in the writing, the syntax doesn't). So I press for
               | it where it belongs, and don't where it doesn't. But when
               | someone laments the state of type systems, I point it
               | out.
        
               | still_grokking wrote:
               | Give it a new syntax, and try again...
               | 
               | It's not so difficult nowadays to write a "syntax-
               | transpiler". The hard part, the language, would remain
               | the same.
               | 
                | People seem to have done so successfully with OCaml; see
                | ReScript.
        
           | raspasov wrote:
            | Check out Clojure spec for a very expressive way of defining
           | data requirements. It allows you to use arbitrary functions
           | to describe data requirements. That way you are not limited
           | by static, compile-time only descriptions of data flowing
           | through your program.
        
             | ragnese wrote:
             | I used Clojure on a project while spec was still alpha/beta
             | or something, so I never used it. It does sound
             | interesting, but I'm skeptical. Even the way you described
             | it- I'm still just writing a function to validate my data,
             | aren't I? Is that truly any different than just calling
             | `validateFoo()` at the top of my functions in any other
             | language?
        
               | raspasov wrote:
               | There's more power than that in spec. For example, you
               | can globally define :address/zip to be a string that's
               | more than, say, N characters long. Now anytime you
                | encounter an :address/zip, regardless of whether it is
                | inside, say, a :billing-address or :shipping-address
                | dictionary/map, it can be checked against those predicate
                | functions.
        
               | eckesicle wrote:
               | 'Maybe Not' is a great talk on this subject matter.
                | https://m.youtube.com/watch?v=YR5WdGrpoug
        
           | taeric wrote:
           | I'm not so sure that I sympathize with your example. Why not
           | a type for even numbers? Odd, prime, not-prime, etc?
           | 
           | You really are asking for a type that is "valid data."
           | Commendable, but not a static property of data. As a fun
           | example, what is a valid email address? Once established as
           | valid, how long will it stay that way? If invalid, how long
           | until it can become valid?
           | 
           | Do I think better typing can be a boon? Absolutely! Can it
           | also be a burden? Absolutely!
        
             | initplus wrote:
             | Validating an email is different from validating a
             | primitive.
             | 
              | You can validate a primitive like an int based on its own
              | state alone.
        
               | taeric wrote:
               | Different validation, but yes?
               | 
               | And then there is validating it is a valid int, but is it
               | valid to use it somewhere? 7642468 is a valid int, but is
               | it a valid address?
               | 
               | That is, the primitives are unlikely to be your concern
               | in programming. Handy tools, but not your problem domain.
        
             | WalterBright wrote:
             | > Why not a type for even numbers? Odd, prime, not-prime,
             | etc?
             | 
             | Why not indeed? See my other comment
             | https://news.ycombinator.com/item?id=28214776 about how to
             | create such types.
        
               | taeric wrote:
               | I meant that line as a bit of a tease to other tricks.
               | 
               | And it isn't like this isn't done often. Take the lowly
                | format string in C languages. With the requirement that
               | the format string has to be static, it is common to fail
               | builds if you give it a bad format string or supply the
               | wrong number of arguments.
        
             | simonh wrote:
             | That's easy, a valid email address is one that is well
             | formed, conforming to the specification for email
             | addresses.
             | 
             | I know what you're trying to get at, but that's just a
             | category error. It's a misuse of the term valid in this
             | context. For example my mail archive contains many emails
             | from valid addresses for which there happens to be no
             | currently active mail box end point. They're still valid
              | data though. The fact that people sometimes use this term
              | valid to mean something completely different is just an
              | unfortunate linguistic accident, but it's a mistake to think
              | it's meaningful.
        
               | still_grokking wrote:
               | > conforming to the specification for email addresses
               | 
               | You mean something that roughly matches this obvious
               | regex:                 \A(?:[a-z0-9!#$%&'*+/=?^_'{|}~-]+(
               | ?:\.[a-z0-9!#$%&'*+/=?^_'{|}~-]+)*        |
               | "(?:[\x01-\x08\x0b\x0c\x0e-\x1f\x21\x23-\x5b\x5d-\x7f]
               | |  \\[\x01-\x09\x0b\x0c\x0e-\x7f])*")       @ (?:(?:[a-z0
               | -9](?:[a-z0-9-]*[a-z0-9])?\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-
               | 9])?         |
               | \[(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}
               | (?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[a-z0-9-]*[a-z0-9
               | ]:
               | (?:[\x01-\x08\x0b\x0c\x0e-\x1f\x21-\x5a\x53-\x7f]
               | |  \\[\x01-\x09\x0b\x0c\x0e-\x7f])+)            \])\z
               | 
               | [ Source: https://www.regular-expressions.info/email.html
               | ]
               | 
                | That somehow doesn't look "easy" to my eyes. I'm not sure
                | the parent's comment was serious.
        
             | DennisP wrote:
             | Dependent typing can do that sort of thing. In fact, here's
             | a stackoverflow answer that uses even numbers as an example
             | of a dependent type:
             | https://stackoverflow.com/questions/9338709/what-is-
             | dependen...
        
               | taeric wrote:
               | Right. And I look forward to advances in them. I can
               | count how many times this would have actually helped my
               | program, though...
        
               | exdsq wrote:
               | Why do you think that? It might require a different
               | mental model of types in order to see the benefits but I
               | can't believe anyone is working on a project that doesn't
               | have any sort of properties or 'business' logic that
                | dependent types would help encode. Have you read
               | anything about type-driven development?
               | https://blog.ploeh.dk/2015/08/10/type-driven-development/
        
           | hota_mazi wrote:
           | Mmmh... these are not very convincing examples.
           | 
           | Statically typed languages give you u8/i8 types of numbers.
           | 
            | Maybe having a non-empty string, or non-empty list, type is
            | useful now and then, but in practice, just have your code
            | work equally well on both empty and non-empty values, and
            | you're good to go.
           | 
           | I'm pretty happy with the languages we have today overall
           | (Kotlin and Rust at the top, but C#, Swift, and Java get an
           | honorable mention).
        
             | ragnese wrote:
              | Your comment kind of reinforces my view that most of us
             | suffer from Stockholm Syndrome with respect to our
             | programming languages.
             | 
             | As another commenter said, _some_ statically typed
             | languages give you unsigned numbers. Maybe most of them do.
             | But definitely not some of the most popular ones. And out
             | of the ones that do, they are often really unhelpful.
             | 
             | C's unsigned numbers and implicit conversions are full of
             | foot-guns.
             | 
             | Java basically doesn't have unsigned ints. It does kind of
             | have this weird unsigned arithmetic API over signed ints,
             | but it's really awkward and still bug-prone to use.
             | 
             | Kotlin's unsigned numbers API is very poor. Very. Kotlin
             | does not give a crap if I write: `Int.MIN_VALUE.toUInt()`,
             | so it's perfectly happy to just pretend a negative number
             | _is_ an unsigned number. Not to mention that unsigned
              | number types are implemented as inline/value classes which
              | don't even actually work correctly in the current version
              | of the language (just go look at the bug tracker; I
             | literally can't use value classes in my project because
             | I've encountered multiple DIFFERENT runtime crash bugs
             | since 1.5 was released). It, like Java and C, etc, is
             | perfectly happy to wrap around on arithmetic overflow,
             | which means that if you didn't guess your types correctly,
             | you're going to end up with invalid data in your database
             | or whatever.
             | 
             | Rust and Swift have good unsigned number APIs.
             | 
             | Notice also that I didn't say anything about non-empty
             | collections. Yes, I absolutely want non-empty collection
              | types, but that is nowhere NEAR as important and useful as
             | non-empty strings, even though they might seem conceptually
             | similar. I'm willing to assert, like the bold internet-man
             | I am, that you and all of the rest of us almost NEVER
             | actually want an empty string for anything. I never want to
             | store a user with an empty name, I never want to write a
             | file with an empty file name, I never want to try to
             | connect to a URL made from an empty host name, etc, etc,
             | etc. It is very often perfectly acceptable to have empty
             | collections, though. It's also very frequent that you DO
             | want a non-empty collection, which is why we should have
             | both.
             | 
             | We don't even need empty strings (or collections, really)
             | if we have null.
             | 
             | You say you like Kotlin and Rust and I work with both of
             | them extensively. I can point out a great many shortcomings
             | of Kotlin in particular. I used to be enamored with it, but
             | the more I use it, the more shortcomings, edge cases, bugs,
             | and limitations I run into. Rust is pretty great, but even
             | that has some real issues- especially around how leaky the
             | trait abstraction is. But at least Rust's excuse is that
             | it's a low-level-ish systems language. It's these "app
             | languages" that irritate me the most.
        
               | isotropy wrote:
               | > We don't even need empty strings (or collections,
               | really) if we have null.
               | 
               | This feels exactly backwards to me: I almost always want
               | my sequence-like types to have a well-defined zero-length
               | element, and I almost never want to allow a NULL value
               | for a variable. NULL is so much worse than [] or ''.
               | Think about concat(). When the trivial members of a type
               | support most of the same behaviors as the nontrivial
               | ones, that makes error checking so much easier.
        
             | AnimalMuppet wrote:
              | _Some_ statically typed languages give you u8/i8. Java
             | doesn't, despite it being one of your "honorable mention"
             | languages.
        
           | WalterBright wrote:
           | > almost zero mainstream (statically typed) programming
           | languages can even let you write down that you want a non-
           | empty string.
           | 
            | Today D will fulfill your dreams!
            | 
            |     struct MyString {
            |         private string s = "error";
            |         alias s this;
            | 
            |         this(string s) {
            |             assert(s.length);
            |             this.s = s;
            |         }
            |     }
            | 
            |     void main() {
            |         MyString ms = "hello";
            |         test(ms);
            |         MyString mserror = ""; // runtime assert fail
            |     }
            | 
            |     void test(string s) { }
        
             | Leherenn wrote:
             | Would it be possible to have a static assert? In that case
             | I would expect it to fail at compile time, not run time.
        
               | WalterBright wrote:
                | You can use static asserts if you turn the constructor
                | into a template and pass the initializer as a compile-
                | time parameter.
        
         | scythe wrote:
         | >GC can be largely pauseless for many practical purposes and a
         | GC language can be within 2x-3x the performance of C-like
         | languages.
         | 
         | To that end, it seems like only recently we've seen automatic
         | reference counting [Obj-C, Rust, Swift] and/or compile-time
         | garbage collection [Mercury] in a non-toy implementation.
         | "Breakthrough" is a difficult word because it refers to
         | discovery _and_ impact, with the latter coming long after the
          | former, and it's not clear if ARC is really a game-changer for
         | any serious applications, but it seems interesting at least.
        
           | steveklabnik wrote:
           | (Rust does not do automatic reference counting)
        
             | dcow wrote:
             | Sure it does: `Arc::new(thing)`
        
               | steveklabnik wrote:
               | That is atomic reference counting, not automatic
               | reference counting. With automatic reference counting,
               | you do not need to wrap the variables, and you do not
               | need to increment or decrement the counter. Rust requires
               | that you actively make your values reference counted by
               | wrapping them explicitly, and makes you bump the count
               | explicitly. It uses RAII to decrement the count
               | automatically though.
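                | 
                | A tiny sketch to make the difference concrete (everything
                | below is written by the programmer; nothing is inserted
                | by the compiler):
                | 
                |     use std::sync::Arc;
                | 
                |     fn main() {
                |         // You opt in by wrapping the value yourself...
                |         let a = Arc::new(String::from("shared"));
                | 
                |         // ...and you bump the count yourself; a plain
                |         // `let b = a;` would move the Arc, not clone it.
                |         let b = Arc::clone(&a);
                | 
                |         println!("{} / {} ({} strong refs)",
                |                  a, b, Arc::strong_count(&a));
                |     } // RAII: both Arcs drop here, decrementing the count.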
        
           | alexisread wrote:
           | For me, Composita https://concurrency.ch/Content/publications
           | /Blaeser_Componen...
           | 
           | and ASAP https://www.semanticscholar.org/paper/ASAP%3A-As-
           | Static-As-P...
           | 
            | seem like a good way forward wrt memory management and
           | concurrency, using ASAP inside a component, and delegating
           | concurrency and component cleanup to Composita.
           | 
           | Other languages like Maude
           | http://maude.cs.illinois.edu/w/index.php/The_Maude_System
           | 
            | are pushing in new directions wrt proof checking.
        
           | wizeman wrote:
           | As far as I understand, the Mercury compiler doesn't do
           | compile-time garbage collection (see the LIMITATIONS file).
        
         | amelius wrote:
         | Another problem is that no language hits the sweetspot of a
         | truly general purpose language. For example Rust doesn't allow
         | freestyle functional programming (Haskell relies on a garbage
         | collector for a reason), whereas at the other end of the
         | spectrum Haskell doesn't allow precise control of CPU usage.
        
           | pharmakom wrote:
           | Is such a language even possible? Seems like you have
           | identified two desirable traits that conflict with each
           | other.
        
       | DesiLurker wrote:
        | I'd say being able to program GPUs as general purpose compute
        | devices, and general SIMD vectorization in high level languages,
        | is pretty significant. It has opened up many applications, like
        | machine learning, that were previously out of reach.
       | 
       | IMO some C++20 features like coroutines rank pretty high in
       | introducing new ways of programming.
        
         | ColinWright wrote:
         | I have a question about that, and despite asking in several
          | places, and reading lots of documents, I've never been able to
         | find an answer that makes sense. Maybe you can help me.
         | 
         | I have an algorithm that I used to run on a SIMD machine. It's
         | a single, simple algorithm that I want to run on lots of
         | different inputs. How can I run this on a GPU? From what I read
         | about GPUs it should be possible, but nothing I've read has
         | made any sense about how to do it.
         | 
         | Can you point me in a suitable direction?
         | 
         | My contact details are in my profile if you'd like to email to
         | ask for more information.
         | 
         | Thanks.
        
           | emllnd wrote:
            | Hi Colin. There are a few ways to go about it; all require
            | getting some initial tedium out of the way. I would recommend
            | the course https://ppc.cs.aalto.fi/ as a resource; it goes through
           | specifics of implementing an example data-crunching program
           | using both a vectorized/SIMD CPU approach (ch1-2) and a GPU
           | approach (ch4, using Nvidia Cuda specifically). Another
           | approach would be to upload data to GPU in
           | dx/vulkan/metal/opengl buffers or textures and run shaders on
            | them; plenty of resources out there, but I understand it's
           | tricky to find a suitable one. Happy to discuss more
        
       | pharmakom wrote:
       | IMO probably monads. Next might be Unison lang or dependent
       | types.
        
         | hollerith wrote:
         | >>What was the _last_ breakthrough . . . ?
         | 
         | >IMO probably monads.
         | 
         | So, you believe there have probably been no breakthroughs since
         | 1990?
        
           | pharmakom wrote:
           | Well it depends how you define breakthrough doesn't it. There
           | have absolutely been advances.
        
       | RivieraKid wrote:
       | Maybe calling these "breakthroughs" is a stretch, but anyway...
       | 
       | Swift. I think that this is the best general purpose language
       | ever created. By "best" I mean that it has the highest
       | productivity (which is determined by readability, static safety,
       | expressiveness, etc.) after normalizing for factors outside of
       | the language spec, e.g. tooling, libraries, compilation and
       | runtime speed.
       | 
       | React and Svelte. The first breakthrough was React, the second
       | generation is Svelte and similar frameworks.
       | 
        | Async / await. This is a major improvement to the readability and
       | mental model simplicity of the most common type of concurrency
       | code.
        
       | lordnacho wrote:
       | My problem with this is it sort of implicitly assumes that
       | languages are the unit of progress. It's unlikely that we will
       | get progress solely by coming up with better languages, it's more
       | like different people invent different tools to suit different
        | problems. In that explosion, we are more likely to find what we
       | need.
       | 
       | I don't think I've ever used just one language for a project.
       | 
       | The progress at least personally is that there are now so many
       | resources to use a bunch of different languages to knit together
       | a solution that a lot more can get done. You can write your low
       | latency code in a variety of languages, and ship the data via a
       | web server to a browser or a mobile app. For every part you have
       | several choices of tech to use, and they're realistic choices
       | with lots of help online. Almost everything is available for
       | free, so you can try all sorts of libs without committing to one.
       | There's no longer a need to sit down with a textbook and learn a
       | bunch of stuff before starting, you can just jump in with a brief
       | tutorial and explore the language or lib as you discover them.
       | 
       | The plethora of choices also teaches you a lot about languages in
       | general, because you see so many that you can start to make
       | generalizations.
        
       | gcanyon wrote:
       | I think Kay's complaint about engineering rigor ignores the
       | explosive growth of programming. Sure, bridge-builders have
       | rigor; there's also probably about the same number of them today
       | as there were 50 years ago.
       | 
       | The number of programmers has grown by at least two, maybe three
       | orders of magnitude over the last half century. And more
       | importantly, almost anyone can do it. A kid whose closest
       | approach to structural engineering is building a balsa bridge for
       | a weight competition can also write and release an app for
       | Android or iOS that will be seen by millions. Even if it's not a
       | success, it's still just as real a program as MS Office.
       | 
       | That level of access guarantees amateur-level code, and the rigor
       | Kay is suggesting would kill the software industry as we know it.
        
         | joe_the_user wrote:
         | _...the rigor Kay is suggesting would kill the software
         | industry as we know it._
         | 
         | You say that like it's a bad thing.
         | 
         | Anyone's plans to make software sane would kill the software
         | industry as people love and hate it. A substantial portion of
         | this industry involves supporting horrific abortions of a
         | system that seem to live far too long. That includes the
         | functional programming people and anyone who believes some
         | method of theirs will produce an explosion of productivity.
         | Hopefully, the effect will be people quickly rewriting old
          | systems to be sane and creating many new systems.
         | 
         | Unfortunately, such starry eyed idealism is unlikely to be
         | realized and the rolling a 1000 pounds of jello up a hill jobs
         | are safe. But this kind of idealism is still needed to motivate
         | the systems builders so it's not a complete loss.
        
         | DennisP wrote:
         | Note that Kay actually complained about lack of "real
         | engineering _vigor_ " which maybe was a typo or maybe not.
        
         | Joker_vD wrote:
          | Well, rigor and quality in the building industry are location
          | dependent. I've read a blog by a builder who describes how
         | architects regularly produce dangerous (too thin or just simply
         | missing load-bearing beams) or straight up impossible (gable of
         | negative size, yep) designs. The solution is that the builders
         | just build whatever makes sense and sometimes the contractors
         | simply don't notice.
        
           | Fantosism wrote:
           | I'm ignorant in this domain, but wouldn't it be up to the
           | structural engineer to make sure the plans are sound? I
           | always thought that architects dream it up and engineers are
           | responsible for the physics.
        
         | Jtsummers wrote:
         | > That level of access guarantees amateur-level code, and the
         | rigor Kay is suggesting would kill the software industry as we
         | know it.
         | 
         | I don't believe this follows. The level of rigor he's lamenting
         | could be constrained to certain categories of software, based
         | on their impact or information content. Amateur or informally
         | specified systems can still satisfy everything else. There is
         | no reason for most systems that don't touch PII or critical
         | systems or other similar categories of software to have serious
         | engineering behind them if people don't want them to.
        
           | gcanyon wrote:
           | Sure, I meant if that rigor was applied to the whole software
           | industry, not selectively, exactly as you say. And that level
           | of rigor _is_ applied sometimes. The most rigorous example I
           | know of is the Space Shuttle guidance system (I think that
           | was it; I read about it twenty years ago). Two independent
           | teams write two entirely separate programs, and then in
           | practice (again, from memory) two versions of program A run,
           | and if they disagree program B is the tie breaker.
           | 
           | Also their QA process was _completely_ adversarial. Finding a
           | bug was a major success for QA, and a major failure for the
           | dev team. They found something crazy like 1 bug per million
           | lines of code.
        
         | DubiousPusher wrote:
          | Yeah, and I'll tell you, as someone who gets to look at a lot of
          | those CAD models that are supposedly introducing rigor, they're
          | often in exactly the same kind of condition as internal
         | codebases.
        
       | analog31 wrote:
       | I would consider microcontrollers and programmable logic. They
       | outnumber "real" computers and completely reshaped electronics.
        
       | rurban wrote:
       | SAT solvers applied.
       | 
       | You can now solve unsolvable problems, like practical formal
       | verification in big C/C++ codebases. There's no need anymore to
       | write test cases, as formal verification tries all possible input
        | values, not just a few selected ones. It also checks all possible
       | error cases, not just the ones you thought about.
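        | 
        | (To make "all possible input values" concrete: tools like CBMC
        | translate a C/C++ function plus its assertions into a SAT/SMT
        | problem and reason about every input symbolically. The toy Rust
        | sketch below only brute-forces a small domain, which no real
        | verifier does, but it shows the contrast with a handful of hand-
        | picked unit-test cases.)
        | 
        |     // Overflow-free midpoint; requires a <= b.
        |     fn midpoint(a: u16, b: u16) -> u16 {
        |         a + (b - a) / 2
        |     }
        | 
        |     fn main() {
        |         // A unit test would check a few pairs. This checks the
        |         // property for every ordered pair of u16 inputs.
        |         for a in 0..=u16::MAX {
        |             for b in a..=u16::MAX {
        |                 let m = midpoint(a, b);
        |                 assert!(a <= m && m <= b);
        |             }
        |         }
        |         println!("midpoint stays in [a, b] for all inputs");
        |     }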
        
         | AnimalMuppet wrote:
         | Could you expand on what you mean by this? Because it sounds
         | like a level of hype bordering on nonsense.
         | 
         | 1. How is a SAT solver going to do formal verification? How do
         | you turn a formal verification into a SAT problem?
         | 
         | 2. You can only formally verify what your formal verifier can
         | handle, which is usually somewhat less than "everything". Can
         | your SAT-driven formal verifier verify that the worst case
         | response time of the system is less than X?
         | 
         | 3. Formal verification tends to be slow. If I can write and run
         | unit tests orders of magnitude faster than the formal verifier,
         | then the formal verifier is going to not be used much.
        
       | Fire-Dragon-DoL wrote:
       | Isn't Rust's compile time memory and concurrency safety a decent
       | breakthrough?
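        | 
        | (For anyone who hasn't seen it, a minimal sketch of the
        | concurrency half: sharing a plain mutable counter across threads
        | is rejected at compile time, so you have to say how it is
        | shared, e.g. with Arc<Mutex<_>>.)
        | 
        |     use std::sync::{Arc, Mutex};
        |     use std::thread;
        | 
        |     fn main() {
        |         // A bare `let mut counter = 0;` captured by several
        |         // threads would not compile; the wrapper makes the
        |         // sharing explicit and checked.
        |         let counter = Arc::new(Mutex::new(0));
        |         let mut handles = Vec::new();
        |         for _ in 0..4 {
        |             let counter = Arc::clone(&counter);
        |             handles.push(thread::spawn(move || {
        |                 *counter.lock().unwrap() += 1;
        |             }));
        |         }
        |         for h in handles {
        |             h.join().unwrap();
        |         }
        |         println!("count = {}", *counter.lock().unwrap());
        |     }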
        
         | DetroitThrow wrote:
          | I'm not sure that was a breakthrough made by Rust, but I think
          | people agree that Rust was the first language providing those
          | guarantees that people actually wanted to use outside academia
          | and narrow industry groups.
        
       | agent327 wrote:
       | Since I left university (eighties) I have only been positively
       | impressed by two languages. One was the Wolfram language. I
       | haven't used it; I'm just going on the demo here, but the idea of
       | having not just a powerful language, but also a massive database
       | with useful information to draw from, seems to elevate it above
       | the usual sad collection of new ways to spell variable
       | declarations and loops.
       | 
       | The other is Inform. I haven't used that either, and of course it
       | is highly domain-specific, but within that domain it seems a
       | pretty damn cool way to write text adventures.
       | 
       | All of the graphical programming systems seem to fail the moment
       | they scale up above what roughly fits on one screen. Labview is a
       | disaster once it is more than just a few boxes and lines.
       | 
       | And everything else is, as far as I can tell, just the same bits
       | and pieces we already had, arranged slightly differently. We were
       | promised hyper-advanced fifth-generation programming languages,
       | but the only thing that seems to come close is Wolfram, and
       | that's hardly mainstream.
       | 
       | I'm occasionally wondering if the whole field might not improve
       | mightily if we stopped focusing so much on languages, and instead
       | focused on providing powerful, easy to use, elegant, well-
       | documented APIs for common (and less common) problems. Using
       | something like OpenSSL, or OpenGL, or even just POSIX sockets, is
       | just an exercise in (largely unnecessary) pain.
        
         | feoren wrote:
         | It sounds like you and Alan Kay are both expecting novel
         | problems to be solved by new programming languages. That is an
         | extremely inefficient way to do it: you need to come up with
         | new compilers, documentation, standard libraries, communities,
         | etc. Instead, programming languages have become general enough
         | that most new problems are being solved within existing
         | languages, instead of by inventing new ones.
         | 
         | I have no idea what a "hyper-advanced fifth-generation
         | programming language" is even supposed to look like.
         | 
         | > I'm occasionally wondering if the whole field might not
         | improve mightily if we stopped focusing so much on languages,
         | and instead focused on providing powerful, easy to use,
         | elegant, well-documented APIs for common (and less common)
         | problems.
         | 
         | But a programming language is nothing but a well-documented
         | API! How would your suggested solution even differ from a
         | programming language?
        
       | beckman466 wrote:
       | holochains
        
       | echelon wrote:
       | Kay is looking at a microscopic part of a vast and evolving
       | ecosystem.
       | 
       | Take a look at deep learning, Kay. (In a sense, building models
       | is a new type of programming, solving problems that are difficult
       | to describe.)
       | 
       | Or maybe something a little closer to home, like Rust. It vastly
       | improves the state of the world at the boundaries of the software
       | layer.
        
       | sfblah wrote:
       | This is at least part of the reason that reasonably strong
       | engineers can learn a new programming language in under a day.
       | The paradigms just aren't that different.
       | 
       | I realize there's a cottage industry of folks creating new
       | languages all the time. But when you read the docs they all
       | bucket into a few categories, and the differences are syntax,
       | tooling and how system services are accessed.
       | 
       | All that being said, these three categories do matter.
       | Programming using tools like autocompletion, syntax highlighting
       | and the like does speed productivity.
       | 
       | But, like human language, at some point the "right" set of
       | primitives are discovered. From that point on changes become more
       | about culture and fads than concepts.
        
         | gcanyon wrote:
         | Try learning J in a day. It...won't go well -- but in a good
         | way. :-) As you say, for many languages the paradigm is some
         | variation of C, and the concepts are largely interchangeable.
         | Then you approach something like J, where (trivial example)
         | 
         | +/%#
         | 
          | returns the average of a list by composing three functions: +
          | sums; % divides; # counts; with the adverb / that inserts +
          | between the items of the list; and you begin to realize you're
          | not in Kansas anymore.
         | 
         | https://en.wikipedia.org/wiki/J_(programming_language)
         | 
         | I don't think the "right" set of primitives is as obvious as
         | you say. Obviously branching and loops are foundational to
         | _many_ languages, but even those are optional given the right
         | design, and they can be implemented in very different ways.
        
           | raxxorrax wrote:
            | ÅF - Fibonacci numbers in 05AB1E that are less than the
            | current number on the stack. Good tip for when you get THAT
            | code interview and can pick the language. You just need to
            | know the code for Å if you are not from Denmark.
        
           | dahfizz wrote:
           | That's just a syntax over map() and reduce(). There are no
           | new / different primitives in J, just a new syntax that
           | emphasizes the use of arrays.
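            | 
            | For example, the same average in ordinary iterator style (a
            | Rust one-liner here, just to show the equivalence):
            | 
            |     fn main() {
            |         let xs = [3.0_f64, 1.0, 4.0, 1.0, 5.0];
            | 
            |         // J's  +/ % #  reads as: (sum of xs) / (count of xs).
            |         let average = xs.iter().sum::<f64>() / xs.len() as f64;
            | 
            |         println!("{}", average); // 2.8
            |     }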
        
             | gcanyon wrote:
             | It's a bit ironic that we're essentially having a blub
             | language debate on ycombinator's web site. I'll defer to
             | Paul Graham's response: http://www.paulgraham.com/avg.html
        
           | IncRnd wrote:
           | J is essentially APL. I think the point is that J is
           | functionally equivalent with many other languages, only
           | differing in syntax but not in paradigm.
        
             | karmakaze wrote:
                | I would say that you have to think in a different style, so
                | it would classify as a different paradigm and not just syntax.
        
               | blacktriangle wrote:
               | Going back to the original comment, the statement was
               | that any experienced engineer could learn a language in a
               | day. At risk of gatekeeping, I'd argue that by definition
               | any experienced engineer would have some experience with
               | an array based language like APL or MATLAB so that no, J
               | does not qualify as a different paradigm.
        
               | gcanyon wrote:
               | The implication of the original comment was that any
               | C-style language engineer could learn any C-style
               | language in a day. I'd even argue that point, but it
                | definitely doesn't get you J in a day, except that J
                | supports most C-style syntax.
               | 
               | So you can program in J like a C-programmer in a day. You
               | definitely can't program like a J-programmer. And to say
               | a competent engineer would already have array-language
               | experience sidesteps the point.
        
               | IncRnd wrote:
               | The original comment said nothing about C-style
               | languages. That must be something you read into their
               | comment.
               | 
               | Learning J is like learning Perl or regular expressions.
               | Nobody really wants to engage in such an activity, but
               | people do what they need to do. Depending on their level
               | of experience, a person who understands imperative and
               | declarative paradigms along with the language's execution
               | model can absolutely learn J within a day, because it
               | only differs in syntax from the existing languages.
        
               | gcanyon wrote:
               | "This is at least part of the reason that reasonably
               | strong engineers can learn a new programming language in
               | under a day. The paradigms just aren't that different."
               | 
               | This literally says "the paradigms aren't that
               | different." So if you accept that C and J are different
               | paradigms, then because the paradigms aren't that
               | different, a C programmer could pick up J in a day.
               | 
                | What it doesn't say is that truly different paradigms
                | take more than a day to learn, or that competent
                | programmers already understand all different paradigms.
               | 
               | Perl is C-like. As I said elsewhere, J supports C-style
               | syntax, so sure you can program J in C-style in a day.
               | But that's not J. This is J:
               | 
               | for. }.z do. b=. -. (i.n) e."1 ,. z +"1 _ ((-i.){:$z) */
               | _1 0 1 z=. ((+/"1 b)#z),.n|I.,b end.
               | 
               | To a J programmer, that's not just clear, it's obvious.
               | I'd like to meet the C-programmer who can learn to write
               | and read that in a day.
               | 
               | And also as I responded elsewhere, this is a blub
               | language discussion: http://www.paulgraham.com/avg.html
        
               | IncRnd wrote:
               | You are the only person who is talking about C. It is a
               | leap to go from "experienced programmer" to "that means
               | C-only programmer". You are using the term "programming
               | paradigm" incorrectly. C is not a programming paradigm.
               | The paradigm of C is that it is an imperative procedural
               | language. In the strictest sense it is a functional
               | language, since functions are first class citizens, but
               | it is not really a functional language as used in
               | practice, such as when discussing function application.
        
               | gcanyon wrote:
               | I'm using "C-style language" as a metaphor for
               | "imperative procedural language". I'm not the first to do
               | so.
        
               | IncRnd wrote:
               | Clearly, C is not a functional, array language. Nobody
               | said it is. What is your point?
        
               | goatlover wrote:
               | I guess it depends on how much past experience they had
               | with an array-based, stack-based, logic-based, pure
               | functional or lisp family language, and how long ago that
               | was. I can believe an experienced engineer can learn the
               | basics of any language in a day. But being proficient and
               | idiomatic is another matter.
        
             | gcanyon wrote:
             | J is functionally equivalent with APL, and a few other
             | related languages like K. In the end you can call anything
             | "syntax," but I'd invite you to give it a try. If C is
             | British English and Python is online-English, then J is
             | Russian, or maybe even Mandarin.
        
           | karmakaze wrote:
           | But even this is an example of finding 'a' (not 'the') right
           | set of primitives as APL is from 1966.
        
           | derefr wrote:
           | Definition by function/combinator composition isn't weird to
           | a modern programmer (Haskell does a lot weirder); and vector
           | processing isn't really weird either -- even Java programmers
           | are familiar with chaining transformations and reductions on
           | Streams these days.
           | 
           | Instead, not knowing J, the only+ thing that's weird about
            | that J expression _to me_, is that both +/ and # are
           | receiving the same implicit vector argument, without any
           | Forth-like "dup" operator needing to precede them.
           | 
           | Are these operators defined to take an implicit single
           | receiver vector? (Maybe whichever one's on top of a "vector
           | result stack" separate from a "scalar result stack"? Maybe
           | the first one passed to the implicit lambda expression this
           | code would be wrapped in?) Or is syntax sugar here hiding the
           | argument, the way Smalltalk hides the receiver in successive
           | expressions chained using ; ?
           | 
           | What would a compact-as-possible J expression look like, to
           | sum up the contents of one vector, and then divide the result
           | by the cardinality of a _different_ vector?
           | 
           | ----------
           | 
           | + Well, there is one other, more technical thing that's
           | "weird" about the expression above: it's the fact that,
           | unless +/ is a identifier separate from +, then the lexeme-
           | sequence + / has to either derive its AST by lookahead, or by
           | + being a stack literal and / an HOF. But then % is,
           | seemingly, a binary infix operator. You don't usually find
           | both of those syntax features in the same grammar, both
           | operating on arbitrary identifier-class lexemes, as it would
           | usually cause an LALR parser for the language to have shift-
           | reduce ambiguity. Unless, I suppose, the lexeme / isn't lexed
           | as "an identifier" terminal, but rather its own terminal
            | class -- _and_ one that's illegal anywhere _but_ postfix of an
           | identifier.
        
             | gcanyon wrote:
             | Frankly, I suck at J. I've solved a few dozen project Euler
             | problems with it, but that's about it. If you want to see
             | how deep the weeds get, check out this essay on solving the
             | n-queens problem:
             | https://code.jsoftware.com/wiki/Essays/N_Queens_Problem
             | 
             | J programmers just think differently.
        
             | gcanyon wrote:
              | + is a "verb"; / is an "adverb" -- it modifies the
             | functionality of whatever verb it is applied to.
             | 
             | Almost all verbs can be binary or unary, sometimes with
             | surprising (to me, a newbie) consequences. I have no idea
              | how it gets handled under the hood.
        
         | raxxorrax wrote:
         | But you don't know common compiler/interpreter pitfalls,
         | tooling, frameworks... so in the end you still need a long time
         | to get familiar with it for productive work.
         | 
         | Sure, you know the vocabulary, but it will still take time to
         | learn the rest.
        
       | api wrote:
       | I wouldn't call it a breakthrough as the ideas are not new, but I
       | consider Rust to be a major practical advancement that brings
       | ideas into the mainstream that were formerly just for "academic"
       | languages.
       | 
       | It's the first IMO viable alternative to C and C++ for systems
       | programming _ever_ , and its safety features represent a
       | successful transfer of provable-safety ideas of academia into a
       | practical language that people actually want to use for real
       | stuff.
       | 
       | As for true breakthroughs I'd say the last big one was modern
       | deep learning.
        
       | mark_l_watson wrote:
       | Nice read. Personally, I am not as unhappy with the state of
       | tooling. With LSP, many programming languages are getting better
       | support in VSCode, Emacs, Vi, etc. Xcode is actually nice to use
       | on a fast M1 Mac. I think wide adoption of deep learning is the
       | biggest breakthrough in recent years.
       | 
       | EDIT: I have used Lisp heavily for 40 years, and I have done a
       | few Smalltalk projects.
        
       | N1H1L wrote:
       | LLVM could very well be thought of as a major breakthrough. The
       | intermediate representation format has enabled so many compilers
       | and languages now that it's pretty insane, in my opinion.
        
       | platz wrote:
       | So if Pizza Hut wants a checkout cart they need to rent a
       | supercomputer to plan the implementation?
        
         | mlosgoodat wrote:
         | He was speaking as if it were still the 1980s. Now we have
         | cloud-based ML finding the optimizations.
         | 
         | ML will (eventually) be able to generate biz logic code, most
         | assuredly infrastructure config (a cloud API has a limited set
         | of possible configs of value to our sorting patterns), and UI
         | code (we gravitate towards a limited set of UX, it seems), and
         | so take over much of our daily programming work.
         | 
         | ML can't invent future ideas, and it can't evolve itself
         | without us making new hardware for it. But it will implode the
         | blue-collar dev job market eventually.
        
           | platz wrote:
            | How many years is "eventually"?
        
             | ausbah wrote:
             | 5-10 years, every year
        
       | abetusk wrote:
       | It always feels like Alan Kay is wistfully talking about how
       | there could have been an alternate future where he and his ilk
       | would do programming and computer engineering the "right way" and
       | somehow the world has lost its path. He waxes eloquent about how
       | he enabled the geniuses at Xerox PARC to do deep work and how
       | they changed the world.
       | 
       | I find his talks inspiring, but I also find it tone-deaf that he
       | doesn't understand the situation he was in, nor does he see how
       | the world has changed. Xerox PARC was, in my opinion, a
       | byproduct of top-down, "waterfall"-like business practices that
       | he happened to be at the apex of. Most of the rest of us have to
       | get by with minimal funding and try to push ideas to an
       | oversaturated market.
       | 
       | What really irks me is how he still has this view that if all
       | the world's intellectuals just got together, or just got
       | funding, they would somehow come up with the next genius ideas
       | and deliver them to the rest of the world, like god's
       | messengers.
       | 
       | Here's what he missed:
       | 
       | * Free and Open source. It's not software that's eating the
       | world, it's free and open source software that's eating the
       | world. Most of the world's infrastructure runs on FOSS; without
       | it, we'd be living in a developer hellscape, choosing which
       | crippling Apple or Microsoft licensing fees we'd need to pay
       | just to have the privilege of compiling our programs.
       | 
       | * Git. Git has made project management and software sharing easy
       | like nothing before. Even though GitHub is a proprietary
       | enterprise, it has created massive value for the community
       | through its ease of sharing.
       | 
       | * Javascript. Write once, run anywhere? Javascript is the only
       | language that comes even close. Data representation that's
       | portable? JSON is the only one that comes close. Javascript has
       | its warts, but it brings the best of functional languages in a
       | procedural language's skin. Javascript is the only reason I
       | don't dismiss functional languages out of hand, because
       | Javascript actually makes those concepts useful.
       | 
       | * Docker (and perhaps some WebAssembly/Javascript environment). I
       | think we're closing in on the idea that Linux boxes are the
       | analogues of "cells", where each has its own local environment
       | and can be composed to form larger structures. Linux images may
       | seem bloated now, but once space and compute pass a threshold of
       | cheap/fast, it won't matter.
       | 
       | * GPUs. Moore's law is alive and well with GPUs. "Out of the
       | box" you get a 10x-100x speedup on algorithms ported from CPUs;
       | I've heard of 1000x-10000x in some cases. It's not just raw
       | performance; it's also designing data structures that work on
       | GPUs, are more data-centric, and are designed for small,
       | independent and highly parallel workloads.
       | 
       | And I'm sure there are many more. I would add cryptocurrency but
       | maybe that falls outside of the "programming" scope.
       | 
       | For all his preaching, Alan Kay still doesn't "get it". If
       | there's a billion-dollar monopoly that has a small, independent
       | research group with no ceiling to its funding and Alan Kay at
       | its head, great, I'm sure they'll come up with some wonderful
       | ideas. We don't live in that world anymore, and the playing field
       | has been leveled because of the points above and many more (maybe
       | even more simply just because cheaper compute is available).
       | 
       | It never really occurs to Alan Kay that maybe his type of
       | research is not the only way. Better yet, maybe Alan Kay's style
       | of research is gone _because_ we've discovered better ways.
       | 
       | If Alan Kay really wants to see the world's next ideas birthed
       | into existence, why isn't he at the forefront of free and open
       | source, championing the software that enables people to do his
       | type of fundamental research? If you really want a
       | trillion-dollar ROI, invest a few million in FOSS.
        
         | gcanyon wrote:
         | Write once, run anywhere: I used to demo for LiveCode
         | https://livecode.com/ at MacWorld and WWDC where I would code a
         | basic app and build separate single-file standalones for Mac,
         | Windows, and Linux, all while holding my breath.
        
         | bumbada wrote:
         | >If Alan Kay really wants to see the worlds next ideas birthed
         | into existence, why isn't he at the forefront of free and open
         | source?
         | 
         | He was; he created Smalltalk, released the source code and made
         | it as open as it can be. You can see the source of everything.
         | 
         | The only problem was that very few people were interested in
         | that. Most children were interested in just one thing: games
         | (the best quality they could get). And adults wanted to run
         | professional programs on their inexpensive machines.
         | 
         | He just could not understand why people chose other languages,
         | like C, that were anathema to him but let them extract all the
         | juice from their cheap computers to create and play games, and
         | to run serious programs that previously ran only on
         | mainframes.
         | 
         | As an academic with early access to "hundreds of thousands of
         | dollars per machine" (and that's not accounting for
         | inflation), he was too isolated from the rest of the world to
         | understand what happened later.
        
         | DetroitThrow wrote:
         | I really enjoyed this critique of AK, but you could really
         | fold Docker and JavaScript into "portability", which would
         | also cover technologies that provide the same benefits to
         | areas outside of application programming. Intermediate
         | representations like LLVM and WASM, and even environments like
         | Wine and the Windows Subsystem for Linux, have felt like major
         | paradigm shifts.
        
       | gcanyon wrote:
       | I think to understand the "next" level of programming it's
       | important to broaden our definition. "Programming" should be more
       | like "defining a system to turn a specific input from a defined
       | input set into a corresponding action or output."
       | 
       | That's too broad, because it includes the formula =A1*2 in Excel.
       | But at some greater level of complexity, an Excel spreadsheet
       | transforms from "a couple formulas" to "a structured tool
       | designed to keep track of our orders and invoices" -- in other
       | words, a program.
       | 
       | On that basis, the recent advances include spreadsheets, database
       | tools, and JavaScript along with other scripting languages.
        
       | mikewarot wrote:
       | Metamine represents the latest breakthrough in programming: it
       | offers a mix of traditional declarative programming and reactive
       | programming. The "magical equals", for lack of a better term,
       | lets you do "reactive evaluation", the opposite of lazy
       | evaluation.
       | 
       | If any of the terms that a term depends on change, the result is
       | updated, along with all its dependents, etc. You can use the
       | system clock as a term and thus have a chain of things that
       | update once a second, etc.
       | 
       | Being able to use both reactive and normal programming together
       | without breaking your brain is a whole new level of power.
       | 
       | It's brilliant stuff, and it seems to have been yoinked from the
       | internet. 8(
       | 
       | Here's a previous thread about it:
       | https://news.ycombinator.com/item?id=27555940
        
         | pharmakom wrote:
         | Can't this be expressed as a library in Haskell or even F#
         | though?
        
           | AnimalMuppet wrote:
           | That style sounds like it's mutating variables when other
           | variables change (that is, mutate). That doesn't sound like a
           | good fit for Haskell...
        
             | kian wrote:
             | Haskell is great at mutating variables. Reactive
             | programming was even pioneered in it. It's just that the
             | Monads that allow you to do this are somewhat like sticky
             | tar - everything they touch becomes a part of them.
             | 
             | The traditional structure of Haskell programs is to build a
             | pure functional 'core', and layer around that the 'sticky'
             | parts of the code that need to interact with the outside
             | world.
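             | 
             | A minimal sketch of that shape (the function and the
             | input file are just made-up examples):
             | 
             |     -- pure functional core: no IO anywhere in the logic
             |     summarize :: [Int] -> String
             |     summarize xs = "total = " ++ show (sum xs)
             | 
             |     -- the "sticky" shell: only this layer touches IO
             |     main :: IO ()
             |     main = do
             |       ns <- map read . lines <$> readFile "numbers.txt"
             |       putStrLn (summarize ns)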
        
               | AnimalMuppet wrote:
               | OK, but the "reactive" programming sounds like how you
               | build the core. So I still question whether that style
               | fits well with Haskell.
        
               | pharmakom wrote:
               | Haskell has do notation and monads (F# has computation
               | expressions, which are similar). These allow you to
               | implement things as libraries that in most other
               | languages would require changes to the compiler. Lisp
               | macros can do it too.
               | 
               | You could push the reactivity monad right down to the
               | core of your application to get the benefits of this
               | reactive language.
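                | 
                | As a toy illustration, here is an IORef-based sketch
                | (not Metamine's semantics and not a real FRP library;
                | every name is made up) in which writing a cell re-runs
                | whatever was registered against it, so derived values
                | stay up to date:
                | 
                |     import Data.IORef
                | 
                |     -- a value plus actions to run when it changes
                |     data Cell a = Cell (IORef a) (IORef [IO ()])
                | 
                |     newCell :: a -> IO (Cell a)
                |     newCell x = Cell <$> newIORef x <*> newIORef []
                | 
                |     readCell :: Cell a -> IO a
                |     readCell (Cell v _) = readIORef v
                | 
                |     -- a write re-runs every registered dependent
                |     writeCell :: Cell a -> a -> IO ()
                |     writeCell (Cell v ls) x = do
                |       writeIORef v x
                |       readIORef ls >>= sequence_
                | 
                |     -- make a cell that is recomputed when src changes
                |     derive :: (a -> b) -> Cell a -> IO (Cell b)
                |     derive f src@(Cell _ ls) = do
                |       out <- readCell src >>= newCell . f
                |       let upd = readCell src >>= writeCell out . f
                |       modifyIORef ls (++ [upd])
                |       return out
                | 
                |     main :: IO ()
                |     main = do
                |       x <- newCell (1 :: Int)
                |       y <- derive (* 2) x    -- y reacts to x
                |       writeCell x 21
                |       readCell y >>= print   -- prints 42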
        
         | machiaweliczny wrote:
         | So it's like MobX in UI world?
        
       | wmanley wrote:
       | I feel like Kay has taken a rather too narrow view of what counts
       | as programming. IMO here are the breakthroughs in the last 20
       | (ish) years:
       | 
       | 1. Stack Overflow - search for your problem, copy and paste the
       | answer.
       | 
       | 2. git - Revision control that is low enough overhead that you
       | need to have a reason not to use it.
       | 
       | 3. Open-source software as a commodity - Unless you've got very
       | specific requirements there's probably a system that can do most
       | of your heavy lifting. (Almost) no-one writes their own JSON
       | parsers or databases or web frameworks. Using open-source
       | software is low overhead and low risk compared to engaging with
       | some vendor for the same.
       | 
       | 4. Package managers - By making it easy to include other
       | people's code, we lower the bar to doing so.
       | 
       | The common thread here is code-reuse. None of the above are
       | programming languages, but all have driven productivity in
       | building systems and solving problems.
        
         | kenjackson wrote:
         | YouTube. My son learns a lot of his programming from searching
         | YouTube and watching videos. Doesn't seem like it would be high
         | density, but I'm amazed that there are really good
         | videos/tutorials on pretty obscure topics. And since all
         | YouTube videos have dates, it's actually easier to find current
         | tutorials.
        
           | josephcsible wrote:
           | Are those tutorials good because they're videos, or despite
           | being videos? I'm inclined to believe the latter. What sort
           | of content is there in programming lessons that wouldn't be
           | better presented as text?
        
         | orthoxerox wrote:
         | Code reuse was the holy grail of the '90s. People expected
         | classes to be the unit of code reuse, then enterprise services,
         | then, after a dozen years of disillusionment, we finally got
         | the recipe right.
        
           | still_grokking wrote:
           | We finally got the recipe right?
           | 
            | How? With micro-services? Or Kubernetes clusters?
           | 
           | Or copy-paste from SO?
           | 
           | Or is it more about Linux packages? Maybe NPM?
           | 
            | I'm really not sure we've got this right yet.
           | 
            | Actually, everything gets rewritten over and over, be it
            | because of the language used, the frameworks, or the
            | architectures.
           | 
           | Polyglot runtimes that could enable true code reuse at least
           | across language boundaries (like GraalVM) are just emerging.
           | 
            | For the higher building blocks, though, there's still
            | nothing that enables efficient reuse. (People try, of
            | course, so mentioning micro-services was not a joke.)
        
         | AnimalMuppet wrote:
         | I would add: Better, more complete libraries shipping standard
         | with languages.
        
         | citrin_ru wrote:
         | > Stack Overflow
         | 
         | It's a blessing and a curse. I think software would be better
         | if coders read man pages and other documentation more often
         | (and standards like RFCs where applicable).
         | 
         | > git - Revision control that is low enough overhead that you
         | need to have a reason not to use it.
         | 
         | RCS - 1982, CVS - 1990
         | 
         | They are limited compared to git, but they perform the main
         | function - tracking changes in text files, letting you see
         | previous versions, a diff for each change, and commit
         | messages. CVS, compared to tarballs for each release (or
         | worse, a mess of .bak, .bak2, etc. files), is a breakthrough.
         | Subversion, Mercurial and git are IMHO just an evolution of
         | earlier VCSes.
         | 
         | > Package managers - By making it easy to include other peoples
         | code we lower the bar to doing so.
         | 
         | CPAN - 1993, FreeBSD pkg_add - 1993
         | 
         | > Open-source software as a commodity
         | 
         | Here I fully agree. Open source started to get some traction
         | 20ish years ago (probably thanks to more widely available
         | Internet and support from corporations like IBM), but its use
         | is still growing.
         | 
         | When I look back, it seems to me that the 1990s were very
         | fruitful and that over the next 20 years progress in software
         | was somewhat slower, but progress in hardware enabled
         | previously impossible stuff without revolutionary changes in
         | software.
        
           | wmanley wrote:
           | The point was that these were either dismissed or weren't
           | considered by Kay when describing breakthroughs in computer
           | programming. They aren't breakthroughs in computer
           | programming languages, but IMO are breakthroughs in computer
           | programming.
           | 
           | > RCS - 1982 CVS - 1990
           | 
           | I don't accept CVS as a breakthrough in the same way as git
           | has been. Back in 2000 - 10 years after CVS - using source
           | control wasn't a given. We had articles like "The Joel
           | Test"[1] encouraging teams to use source control. CVS was a
           | pain to set up and limited once you did. Thanks to git (and
            | DVCS in general) using source control is the default for
            | everything from one-person throwaway projects up to
            | projects with thousands of contributors and millions of
            | lines of code.
           | 
           | [1]: https://www.joelonsoftware.com/2000/08/09/the-joel-
           | test-12-s...
        
       | svachalek wrote:
       | Early programming language advancements were about abstracting
       | away the basic repetitive stuff (function call stack
       | manipulation) and the hardware details (register selection,
       | memory addresses). They did it in a way that was minimally
       | "leaky"; debugging a C program you may be aware of the call stack
       | and registers, but most of the time you'll be just fine working
       | at the abstraction level of function parameters and local
       | variables.
       | 
       | Since then we've added tons more boilerplate and hardware to the
       | standard application deployment: it runs over multiple servers
       | and clients using various network protocols, interacts with
       | databases and file systems, etc. But modern solutions to these
       | are mostly code generation and other leaky layers; it's likely
       | you can't debug a typical problem without reading or stepping
       | through generated code or libraries.
       | 
       | What I'd like to see in a new programming language is some
       | abstraction of an application that has persistent data and is
       | distributed, with the details being more or less compiler flags.
       | It should also come with debugging tools that allow the
       | programmer to stay at that level of abstraction. But most new
       | language announcements come down to some new form of syntactic
       | sugar or data typing.
        
       ___________________________________________________________________
       (page generated 2021-08-17 23:01 UTC)