[HN Gopher] Mostly dead influential programming languages (2020)
___________________________________________________________________
Mostly dead influential programming languages (2020)
Author : azhenley
Score : 234 points
Date : 2025-07-12 20:50 UTC (4 days ago)
(HTM) web link (www.hillelwayne.com)
(TXT) w3m dump (www.hillelwayne.com)
| Rochus wrote:
| How can COBOL be a "dead" or "mostly dead" language if it still
| handles over 70% of global business transactions (with ~800
| billion lines of code and still growing). See e.g.
| https://markets.businessinsider.com/news/stocks/cobol-market....
| dlachausse wrote:
| BASIC is the scripting language used by Microsoft Office.
| Saying that it powers millions of businesses is probably not an
| exaggeration.
|
| Pascal, particularly the Delphi/Object Pascal flavor, is also
| still in widespread use today.
| Rochus wrote:
| Also Smalltalk is still in wide use; ML is also used; there
| are even many PL/I applications in use today and IBM
| continues to give support.
| fuzztester wrote:
| I don't know, I heard somewhere that even the C language is
| in wide use, still ... ;)
| ranger_danger wrote:
| Maybe their definition uses recent popularity or how many new
| projects are started with it. Under that definition, I think
| it's pretty safe to call it "dead".
| Rochus wrote:
| If you redefine language, anything is possible.
| pessimizer wrote:
| Yes. "Dead" normally means "to be devoid of life," but it's
| often extended to metaphorically cover things like computer
| languages.
|
| edit: for ancient Greek to become a dead language, will we
| be required to burn all of the books that were written in
| it, or can we just settle for not writing any new ones?
| eviks wrote:
| Same with a programming language - is no one is wiring
| code in it, it's dead
| duskwuff wrote:
| No one's starting new projects in COBOL.
| Rochus wrote:
| One of the most significant new COBOL projects in 2025 was
| the integration of a new COBOL front-end into the GNU
| Compiler Collection. There are indeed quite many new projects
| being started in COBOL, though they primarily focus on
| modernization and integration with contemporary technologies
| rather than traditional greenfield development. Also, let's
| not forget some cloud providers now offer "COBOL as a service"
| (see e.g.
| https://docs.aws.amazon.com/m2/latest/userguide/what-
| is-m2.h...).
| iLoveOncall wrote:
| > There are indeed quite many new projects being started in
| COBOL
|
| No.
|
| You have to put this relative to projects started in other
| languages, at which point new projects started in COBOL are
| even less than a rounding error; it probably wouldn't result
| in anything other than 0 as a float.
| Rochus wrote:
| The claim was "_No one's starting new projects in COBOL._"
| iLoveOncall wrote:
| And everyone of good faith understood what the claim
| actually was.
| Rochus wrote:
| And everyone with relevant fintech project experience
| knows that new projects on the existing core banking
| systems are started all the time and that COBOL continues
| to be a relevant language (whether we like it or not).
| duskwuff wrote:
| By "new COBOL projects" I mean green-field development of
| entirely new projects written in that language - not the
| continued development of existing COBOL codebases, or
| development of tools which interact with COBOL code.
|
| As an aside, the article you linked to is pretty obvious AI
| slop, even aside from the image ("blockchin infarsucture"
| and all). Some of the details, like claims that MIT is
| offering COBOL programming classes or that banks are using
| COBOL to automatically process blockchain loan agreements,
| appear to be entirely fabricated.
| duskwuff wrote:
| (Rochus edited their post after I replied to it; the
| article I was referring to was
| https://zmainframes.com/zlog/cobol-modernization-
| in-2025-bre....)
| alwinw wrote:
| Interesting read; it would have been good to see the author's
| definition of 'mostly dead'. Some are still used widely in niche
| areas like COBOL for banking. If a language itself isn't
| receiving any updates nor are new packages being developed by
| users, is it mostly dead?
| Rochus wrote:
| In any case, the author claims that each of these languages is
| "dead". There is a "Cause of Death" section for each language,
| which allows for no other conclusion. By listing languages
| like ALGOL, APL, CLU, or Simula, the author implies that by
| "dead" he means "no longer in practical use, or surviving only
| as an academic/historic curiosity". The article contradicts itself
| by listing languages like COBOL, BASIC, PL/I, Smalltalk,
| Pascal, or ML, for which there is still significant practical
| use, even with investments for new features and continuation of
| the language and its applications. The article disqualifies
| itself by listing COBOL or Pascal as "mostly dead",
| because there is still a large market and significant
| investment in these languages (companies such as Microfocus and
| Embarcadero make good money from them). It is misleading and
| unscientific to equate "no longer mainstream" with "no longer
| in use." This makes the article seem arbitrary, poorly
| researched, and the author not credible.
| addaon wrote:
| Seeing Smalltalk on these lists and not Self always seems...
| lacking. Besides its direct influence on Smalltalk, and its
| impact on JIT research, its prototype-based object system led
| to Javascript's object model as well.
| joshmarinacci wrote:
| Self was influenced by Smalltalk, not the other way around.
| Smalltalk was developed in the 1970s. Self in the 1980s.
| addaon wrote:
| Thanks for the correction.
| vincent-manis wrote:
| There is one very _BIG_ thing that Cobol pioneered: the
| requirement that not only the programs, but also the data, must
| be portable across machines. At a time when machines used
| different character codes, let alone different numeric formats,
| Cobol was designed to vastly reduce (though it did not completely
| eliminate) portability woes.
|
| We take this for granted now, but at the time it was
| revolutionary. In part, we've done things like mandating Unicode
| and IEEE 754, but nowadays most of our languages also encourage
| portability. We think very little of moving an application from
| Windows on x86_64 to Linux on ARMv8 (apart from the GUI mess),
| but back when Cobol was being created, you normally threw your
| programs away ("reprogramming") when you went to a new machine.
|
| I haven't used Cobol in anger in 50 years (40 years since I even
| taught it), but for that emphasis on portability, I am very
| grateful.
| froh wrote:
| the other big cobol feature is high precision (i.e. many
| digest) fixed point arithmetic. not loosing pennies on large
| sums, and additionally with well defined arithmetics, portably
| so as you point out, is a killer feature in finance.
|
| you need special custom numerical types to come even close in,
| say, java or C++ or any other language.
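The penny-loss point can be sketched in a few lines of Python, using the standard decimal module as a stand-in for COBOL-style fixed-point types (the amounts here are made up for illustration):

```python
from decimal import Decimal

# Binary floats drift on decimal cash amounts: a million additions
# of $0.10 accumulate representation error.
total_float = sum([0.10] * 1_000_000)

# Fixed-point decimal arithmetic keeps every penny exact, which is
# the kind of guarantee COBOL's fixed-point types give by default.
total_dec = sum([Decimal("0.10")] * 1_000_000, Decimal("0"))

print(total_float)   # off from 100000.0 by accumulated rounding error
print(total_dec)     # exactly 100000.00
```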
| fuzztester wrote:
| >the other big cobol feature is high precision (i.e. many
| digest) fixed point arithmetic. not loosing pennies on large
| sums, and additionally with well defined arithmetics,
| portably so as you point out, is a killer feature in finance.
|
| I guess you mean:
|
| >digest -> digits
|
| >loosing -> losing
|
| Is that the same as BCD? Binary Coded Decimal. IIRC, Turbo
| Pascal had that as an option, or maybe I am thinking of
| something else, sorry, it's many years ago.
| imoverclocked wrote:
| Binary Coded Decimal is something else.
|
| 1100 in "regular" binary is 12 in decimal.
|
| 0001 0010 in BCD is 12 in decimal.
|
| ie: bcd is an encoding.
|
| High precision numbers are more akin to the decimal data
| type in SQL or maybe bignum in some popular languages. It
| is different from (say) float in that you are not losing
| information in the least significant digits.
|
| You could represent high precision numbers in BCD or
| regular binary... or little endian binary... or trinary, I
| suppose.
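The encoding difference can be sketched in a few lines of Python (an unsigned packed-BCD toy; real IBM packed decimal also carries a sign nibble, which is omitted here):

```python
def to_bcd(n: int) -> bytes:
    """Pack a non-negative integer into packed BCD: one decimal
    digit per nibble, two digits per byte."""
    digits = str(n)
    if len(digits) % 2:            # pad to an even digit count
        digits = "0" + digits
    return bytes(int(digits[i]) << 4 | int(digits[i + 1])
                 for i in range(0, len(digits), 2))

def from_bcd(b: bytes) -> int:
    """Unpack packed BCD back into an integer."""
    return int("".join(f"{byte >> 4}{byte & 0xF}" for byte in b))

# 12 in plain binary is 0b1100; in BCD it is the nibbles 0001 0010:
assert to_bcd(12) == bytes([0b0001_0010])
assert from_bcd(to_bcd(987654321)) == 987654321
```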
| froh wrote:
| yes and... bcd arithmetics are directly supported on CPUs
| for financial applications, like Z:
|
| IBM z Systems Processor Optimization Primer
|
| https://share.google/0b98AwOZxPvDO6k15
| lucas_membrane wrote:
| There are some regulations in bond pricing or international
| banking or stuff like that that require over 25 decimal
| places. IIRC, the best COBOL or whatever could do on the
| IBM 360's was 15 digits. The smaller, cheaper, and older
| 1401 business machines didn't have any limit. Of course,
| for nerdy financial applications, compound interest and
| discounting of future money would require exponentiation,
| which was damn-near tragic on all those old machines. So
| was trying to add or subtract 2 numbers that were using the
| maximum number of digits, but with a different number of
| decimal places, or trying to multiply or divide numbers
| that each used the maximum number of decimal places, with the
| decimal point in various positions, and it was suicide-
| adjacent to try to evaluate any expression that included
| multiple max precision numbers in which both multiplication
| and division each happened at least twice.
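Modern arbitrary-precision decimal libraries make that digit limit configurable; here is a sketch in Python's decimal module (purely illustrative of working past 25 decimal places, not what those machines ran):

```python
from decimal import Decimal, getcontext

# Ask for 40 significant digits, comfortably past 25 decimal places.
getcontext().prec = 40

rate = Decimal("1.0000000000000000000000001")   # 25 decimal places
principal = Decimal("1000000")
future = principal * rate ** 365                # daily compounding

print(future)   # grows by roughly 3.65e-17 over the year
```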
| carlos_rpn wrote:
| > There are some regulations in bond pricing or
| international banking or stuff like that that require
| over 25 decimal places.
|
| Sounds interesting. Is there anywhere you know I can read
| about it, or is there something specific I can search
| for? All results I'm getting are unrelated.
| lucas_membrane wrote:
| Sorry, not that I know of. They never let me near that
| stuff, and the various really important standards bodies
| are monopoly providers of information, so they inevitably
| charge more for copies of their standards than non-
| subservients like me can afford. I just tried to quickly
| scan the 25,000+ ISO standards and did not find anything
| even possibly related under $100. The Securities Industry
| Association was the maven of bonds when I was trying to
| figure them out, but, knock me over with a feather, they
| were dissolved almost 20 years ago. You might start with
| the Basel Committee on Banking Supervision, the Financial
| Markets Standards Board, the Bank for International
| Settlements, the Fixed Income Clearing Corporation, or
| https://www.bis.org/publ/mktc13.pdf.
| froh wrote:
| thx for the typo fixes
|
| indeed, it's exactly BCD arithmetic which is part of the
| standard, with fixed decimal size and decimal point position
|
| and yes, Turbo Pascal had some limited support for them.
|
| you needed them in the 1990s for data exchange with banks
| in Germany: "Datenträgeraustauschformat", the data carrier
| exchange format. one of my first coding gigs was automatic
| collection of membership fees. the checksum for the data
| file was the sum of all bank account numbers and the sum of
| all bank ID numbers (and the sum of all transferred
| amounts)... trivial in Cobol. not so much in Turbo C++ :-)
|
| I wasn't aware of the BCD in turbo Pascal... those were the
| days :-D
| fuzztester wrote:
| welcome.
|
| those were the days, indeed :)
| altcognito wrote:
| Is Python indentation at some level traced back to Cobol?
| somat wrote:
| I would guess not. Indentation in python serves a very
| different purpose to the mandatory indentation as found in
| early cobol/fortran.
|
| I am not really an expert but here is my best shot at
| explaining it based on a 5 minute web search.
|
| Cobol/fortran were designed to run off punch cards, specific
| columns of the punch card were reserved for specific tasks,
| things like the sequence number, if the line is a comment,
| continuation lines.
|
| https://en.wikipedia.org/wiki/COBOL#Code_format
|
| https://web.ics.purdue.edu/~cs154/lectures/lecture024.htm
|
| In python the indentation is a sort of enforced code style
| guide (sort of like go's refusal to compile unused imports):
| by making the indentation you would do normally part of the
| block syntax, all python programs have to have high quality
| indentation, whether the author wants to or not.
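That enforcement is visible in the language itself; a small illustrative example:

```python
# In python, indentation *is* the block structure: dedenting ends
# the block, and a missing indent is a syntax error, not bad style.
def classify(n):
    if n % 2 == 0:
        kind = "even"    # inside the if-branch by indentation alone
    else:
        kind = "odd"
    return kind          # the dedent ends both branches

print(classify(4))   # even
print(classify(7))   # odd

# By contrast, this source string fails to even compile:
bad = "def f():\nreturn 1"    # body not indented
try:
    compile(bad, "<demo>", "exec")
except IndentationError as e:
    print("rejected:", e.msg)
```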
| HelloNurse wrote:
| Interestingly, punch cards and early terminals in the
| 80-132 columns range reached the limits of readable line
| length and early programming languages were obviously
| pushed to the limits of human comprehension, making the
| _shape_ of text in old and new programming languages
| strikingly consistent (e.g. up to 4 or 5 levels of 4
| characters of indentation is normal).
| ewgoforth wrote:
| I don't know much about COBOL, but I did code quite a bit
| in Fortran. In Fortran, the first five columns were for an
| optional line number, so these would mostly be blank. The
| sixth column was a flag that indicated that the line was
| continued from the one before, this allowed for multiline
| statements, by default a carriage return was a statement
| terminator. Like you said, all this came from punch cards.
|
| Columns seven through 72 were for your code.
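That card layout is simple enough to sketch as a toy parser (illustrative Python, not a real Fortran front end; a '0' in column six also counts as "not a continuation"):

```python
# Fixed-form Fortran card: cols 1-5 optional label, col 6
# continuation flag, cols 7-72 statement text.
def parse_card(card: str) -> dict:
    card = card.ljust(72)
    return {
        "label": card[0:5].strip() or None,
        "continuation": card[5] not in (" ", "0"),
        "statement": card[6:72].rstrip(),
    }

print(parse_card("   10 X = Y +"))              # labelled statement
print(parse_card("     1    Z")["continuation"])  # continuation line
```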
| KWxIUElW8Xt0tD9 wrote:
| I've heard it was from Haskell?
| gnufx wrote:
| Block structure as indentation was introduced in Landin's
| ISWIM. I think the first actual implementation was in
| Turner's SASL (part of the ancestry of Haskell). Note that
| Haskell doesn't have Python's ":" and it also has an
| alternative braces and semicolons block syntax.
| colanderman wrote:
| Instead now, you throw everything away when moving to a new
| language ecosystem. Would love to see parts of languages become
| aligned in the same manner that CPUs did, so some constructs
| become portable and compatible between languages.
| brabel wrote:
| Great point. But some newer languages do keep compatibility,
| with Java (Scala, Groovy, Kotlin, Clojure) and .Net (C#, F#,
| Visual Basic, Powershell) "platforms" being an example, but
| also with system languages that normally have some simple (no
| binding required) ABI compatibility with C , like D, Zig and
| Nim I think.
|
| The newest attempt seems to be revolving around WASM, which
| should make language interoperability across many languages
| possible if they finally get the Component Model (I think
| that's what they are calling it) ready.
| SideburnsOfDoom wrote:
| Today, with several of those languages, we really don't
| care from a code point of view if the deployment target is
| Linux or Windows, we know it will work the same. That's an
| achievement.
|
| And many of them can target WASM now too.
| michaelmrose wrote:
| What about graal and truffle?
| pdw wrote:
| Another fascinating aspect of COBOL is that it's the one
| programming language that actively rejected ALGOL influence.
| There are no functions or procedures, no concept of local
| variables. In general, no feature for abstraction at all, other
| than labelled statements. Block-structured control flow
| (conditionals and loops) was only added in the late 80s.
| skissane wrote:
| Contemporary COBOL (the most recent ISO standard is from
| 2023) has all those things and more.
|
| Rather than rejecting such features, COBOL was just slower to
| adopt them, because of conservatism, inertia, and its use in
| legacy systems. But there are 20+ year old COBOL compilers
| that support full OO (classes, methods, inheritance, etc.).
| stefan_ wrote:
| Until someone decided we shall have big and little endianness,
| and that this newfangled internet shall use the correct big
| endian ordering.
| ameliaquining wrote:
| Previously: https://news.ycombinator.com/item?id=22690229
|
| (There are a few other threads with a smaller number of
| comments.)
| mud_dauber wrote:
| Kinda surprised to not see Forth listed.
| drweevil wrote:
| Or Lisp. Lisp is definitely not dead, but was definitely very
| influential.
| tempaway43563 wrote:
| The article does touch on that:
|
| "COBOL was one of the four "mother" languages, along with
| ALGOL, FORTRAN, and LISP."
| bitwize wrote:
| Imho Lisp is deader than COBOL. Especially now that we've
| learned you can do the really hard and interesting bits of AI
| with high-performance number crunching in C++ and CUDA.
| kstrauser wrote:
| I wrote Lisp this morning to make Emacs do a thing. In
| other venues, people use Lisp to script AutoCAD.
|
| Lisp isn't as widely used as, say, Python, but it's still
| something _a lot_ of people touch every single day.
| lucasoshiro wrote:
| And Clojure
| duskwuff wrote:
| Forth was neat, but it was a bit of an evolutionary dead end.
| I'm not aware of any significant concepts from Forth which were
| adopted by other, later programming languages.
| ks2048 wrote:
| PostScript
| microtherion wrote:
| The only thing PostScript and Forth have in common is RPN.
| Other than that, they are very different in philosophy -
| Forth is very bit banging, close to the metal, while
| PostScript is much more symbol oriented and high level.
| ks2048 wrote:
| That's true, PostScript is much higher-level and feels
| like a stack-based LISP. But, saying they just have RPN
| in common makes it seem like a small choice about the
| syntax - instead of a whole stack-oriented approach,
| which affects everything.
| microtherion wrote:
| Well, yes, the stack oriented approach does matter. But
| even there, there are big differences with Forth having a
| user accessible return stack, which is implicit in
| PostScript, while PostScript has an explicit dictionary
| stack, which exists only in a very primitive form in
| Forth.
| tengwar2 wrote:
| RPL (Reverse Polish Lisp, a high level language for HP
| calculators) possibly drew on it a bit, though the main
| antecedents are RPN and Lisp, and possibly Poplog (a Poplog
| guru was at HP at the time, but I don't know if he
| contributed).
| usgroup wrote:
| I was almost sure that Prolog would be on the list, but
| apparently not.
| coredog64 wrote:
| Because it's dead or because it's influential?
| usgroup wrote:
| It was very influential and had all kinds of interesting
| stories (e.g. https://en.wikipedia.org/wiki/Fifth_Generation_
| Computer_Syst...).
|
| I'm not sure what qualifies as dead. Prolog is still around
| although as a small and specific community, perhaps
| comparable in size to the APL community at least within an
| order of magnitude.
| DrNosferatu wrote:
| The (literal) first and foremost ASCII descendant of APL was
| MATLAB.
|
| I feel that the article should have made this a lot more clear -
| as so many people code along the APL -> Matlab / R (via S) ->
| NumPy family tree.
| ansgri wrote:
| R/S is also heavily influenced by Lisp. Haven't written it in
| 10 years, but AFAIR it even has proper macros where argument
| expressions are passed without evaluation.
| bradrn wrote:
| Technically those aren't macros; they're what the Lisp world
| calls 'fexprs'.
| ck45 wrote:
| Modula-3 should be on that list as well. Unfortunately pretty
| dead (compiler support is rather abysmal), though pretty
| influential. Wikipedia lists a couple of languages that it
| influenced; I think it should also include Go (though Go is
| allegedly influenced by Modula-2, according to its wikipedia
| article)
| asplake wrote:
| What other languages have been influenced by Go?
| brabel wrote:
| I think they meant Go should be in the list of languages
| influenced by ... but one language influenced by Go I believe
| is Odin.
| jasperry wrote:
| Okay, I'll bite. ML did not mostly die; it morphed into two main
| dialects, SML and OCaml. OCaml is still going strong, and it's
| debatable whether SML is mostly dead.
|
| My main beef, however, is that the last sentence in the section
| seems to suggest that the birth of Haskell killed SML on the vine
| because suddenly everybody only wanted pure, lazy FP. That's just
| wrong. The reality is that these two branches of Functional
| Programming (strict/impure and lazy/pure) have continued to
| evolve together to the present day.
| dboreham wrote:
| Isn't F# ML-influenced?
| bgr-co wrote:
| F# used to be OCaml.NET AFAIK
| kqr wrote:
| Not quite. They explicitly drew a lot of inspiration from
| OCaml, but they never intended it to be an OCaml compiler
| for .NET.
|
| F# was - from the start - a functional language designed
| specifically for the .NET Framework Common Language
| Runtime. Whenever OCaml and CLR diverged in how they did
| things, they went the CLR route.
|
| (See e.g. https://entropicthoughts.com/dotnet-on-non-
| windows-platforms... for more information, or the Don Syme
| history of F#.)
| tialaramex wrote:
| I have a colleague who is very good with C# and F#. We
| use a lot of C# for work, and for him F# is just a hobby
| (he has a math background). Because Rust is basically an
| ML dressed up to look like a semicolon language, and I
| know I grokked Rust almost immediately with an academic
| background in SML/NJ and decades of experience writing C,
| my guess is that this colleague would pick up Rust very
| easily, but I can't confirm that.
|
| Every Xmas when people are picking up new languages for
| Advent of Code (which like me he does most years) I
| wonder if he'll pick Rust and go "Oh, that's nice" - I
| was going to write a relatively contemporary spoiler here
| but instead let's say - it's like watching an Outer Wilds
| player staring at the starting night sky for their fifth
| or hundredth time wondering if they're about to say "Oh!
| Why is that different each time?". No. Maybe next time?
| FrustratedMonky wrote:
| So he didn't immediately take to Rust? What was his
| feedback? I like F# and have also wanted to dive into
| Rust.
| tialaramex wrote:
| He's never tried it.
|
| Sorry it may not have been clear, I was comparing the
| experience of knowing he might love Rust (or not) but not
| knowing if he'll decide to learn it - against the
| experience of watching unspoiled people playing a
| discovery game such as Outer Wilds where you know _what_
| they don't know yet and so you're excited to watch them
| discover it. I dunno that's maybe not an experience most
| people have.
|
| If you either enjoy learning new languages or have a
| purpose for which Rust might be applicable I encourage
| you to try it. As you're an F# user it won't be as
| revelatory as it is for someone with say only C as a
| background, but on the other hand if you have no bare
| metal experience it might also be a revelation how _fast_
| you can go without giving up many of the nice things you're
| used to.
|
| If you're a polyglot you probably won't encounter much
| that's new to you because Rust never set out to be
| anything unprecedented, I've heard it described as an
| "industrialization of known best practice" and it's ten
| years since Rust 1.0 drew a line in the sand.
| porcoda wrote:
| Yeah, I saw that and was tempted to say the same thing. Ocaml
| is alive and well, and SML still is in active use. Ocaml has a
| relatively broad application space, whereas SML is more or less
| limited to the world of theorem provers and related tools
| (e.g., PolyML is used to build things like HOL4, and CakeML is
| a pretty active project in the verified compilers space that
| targets SML and is built atop HOL4). SML is likely regarded as
| irrelevant to industrial programmers (especially those in the
| communities that frequent HN), but it's still alive. F# is
| still alive and kicking too, and that's more or less an Ocaml
| variant.
| doublerabbit wrote:
| Pascal is alive too with Lazarus and FPC.
| msgodel wrote:
| I'd argue Rust is modern ML in many ways, it just uses a C-like
| syntax. It's really the non-pure Haskell alternative.
| empath75 wrote:
| Rust is really missing higher kinded types, though. You can
| do a lot of neat haskell tricks with it, but you still can't
| write functions that are generic over "monads" for example.
| You have to write a function for lists, a function for
| options and a function for results, you can't really treat
| them as the _same thing_, even though they all have similar
| methods.
| jasperry wrote:
| Rust semantics owes a lot to ML, but the borrow checker makes
| programming in Rust very different from other ML-derived
| languages, which are almost all garbage-collected.
| Jimmc414 wrote:
| COBOL - "mostly dead" but still somehow the backbone of the
| global financial system
| waldopat wrote:
| One day Perl will be on this list
| Qem wrote:
| If we assume peak Perl was in the 00s, say 2005, an
| impressionable teenager of ~15 learning by then probably will
| keep using it for the rest of their life, even in absence of
| uptake by new people. Assuming a lifespan of 85, I estimate
| this day won't arrive before the 2070s.
| macintux wrote:
| I started using it in the mid-90s, and used it extensively at
| work as long as I could, but by 2012 I gave up the fight. I
| still break it out once in a great while for a quick text
| transformation, but it's so rusty in my memory that I rely on
| an LLM to remind me of the syntax.
| nkozyra wrote:
| I think peak Perl was before then, but that's about when Perl
| fell off the map and started getting replaced by Python, or by
| PHP for CGI since PHP had some syntactic overlap.
|
| This is when I started professionally and we were asked to
| replace "slow, old Perl scripts" As a new entrant, I didn't
| ask many questions, but I also didn't see any of the
| replacements as improvements in any way. I think the # of
| devs left to take over messy Perl projects was shrinking.
|
| As you might imagine, this job involved a lot of text
| processing. People still point to that as the arrow in Perl's
| quiver, but it seems especially quaint today since any
| language I'd reach for would blow it out of the water in
| terms of flexibility and ease of use.
| tayo42 wrote:
| That's how I remember the timeline.
|
| But I thought maybe the end of the 00s was when RoR started
| showing up.
|
| Mid 2000s I think I was learning PHP and the LAMP stack. Perl
| was already kind of old
| hpcjoe wrote:
| As will Python and many others.
| kqr wrote:
| I'm not convinced. Which languages have Perl influenced,
| really? All the things that other languages seem to draw from
| Perl could just as well have come from e.g. awk, Smalltalk, and
| other languages that came before Perl.
|
| Most of the uniquely Perly things (topicalisation,
| autovivification, regexes in the language syntax, context
| sensitivity) haven't been picked up by other languages.
|
| The only really Perly thing that was picked up elsewhere is
| CPAN, but that's not part of Perl-the-programming-language but
| Perl-the-community.
|
| (Oh I guess PHP picked up sigils from Perl but misunderstood
| them and that was the end of that.)
| lmz wrote:
| Regexes in syntax / regex literals made it into JS and Ruby
| (at least).
| Qem wrote:
| > Which languages have Perl influenced, really?
|
| Raku is a direct descendant.
| johannes1234321 wrote:
| Did Raku see any uptake from people outside the shrinking
| Perl world?
| p_l wrote:
| Perl's big influence is Perl-Compatible Regular Expressions,
| to the point many younger people just assume "regular
| expression" means "pcre".
|
| In terms of direct language impact, Ruby code rarely shows it
| these days, but Ruby was essentially designed to make
| migrating from Perl easy
| lmz wrote:
| Besides the regex literals, Ruby does have sigils of a sort
| (@ and $) but with different meanings. Also the quotelike
| operators (q, qx, qw becomes %q, %x, %w).
| cturner wrote:
| I came here to write - I think awk would fit in the list.
|
| Awk is sold on pattern matching, and there are earlier
| technologies that do pattern-matching - ML, SNOBOL.
|
| But awk's historic significance is something else: it was the
| embryonic scripting language. You could use it in an imperative
| manner, and in 1977 that showed a new path to interacting with
| a unix system. It allowed you to combine arithmetic, string
| manipulation, and limited forms of structured data processing
| in a single process without using a compiler.
|
| Two language schools grew from imperative awk. (1) The
| scripting languages that expose convenient access to the
| filesystem and OS syscalls, like perl/pike/python/ruby; (2) The tool
| control languages like tcl/lua/io.
|
| It may also have influenced shell programming. Note that awk
| was released before the Bourne shell.
| kqr wrote:
| I would like to agree - I'm always surprised when I realise
| how old awk is. It feels like an incredibly modern language.
| It's also obvious that it inspired the likes of dtrace and
| bpftrace.
|
| That said, I don't know how many other languages explicitly
| have cited awk as an inspiration, which was the criterion for
| this list.
| assimpleaspossi wrote:
| You think awk would fit the list but then go on to show how
| useful it was and still is today.
|
| I often read answers to questions all over the internet where
| awk is part of the solution. Mainly serious programmers using
| BSD and Linux.
| cturner wrote:
| My comment did not talk about where awk is useful today.
|
| Unix gurus will recommend awk as a pattern matching and
| substitution tool.
|
| But my comment was about awk the vanguard imperative
| scripting language. I don't know of anyone who recommends
| use of awk's imperative style over python in 2025.
|
| As an exercise, I tried writing a simple roguelike in awk
| in an imperative style. Within twenty minutes, it felt
| obvious where perl came from.
| waldopat wrote:
| Classic HN. A comment meant in half jest is dissected
| technically and literally and down voted. LOL Anyways, I'm glad
| the Perl crowd is alive and well!
| somat wrote:
| "Significance: In terms of syntax and semantics we don't see much
| of COBOL in modern computing."
|
| Would I be wrong in saying that SQL has what feels to me to be a
| very COBOL-y syntax? By which I mean, I know it is not directly
| related to COBOL, but someone definitely looked at COBOL's clunky
| attempt at natural language and said "that, I want that for my
| query language".
| geophile wrote:
| I agree completely. They were from the same era (COBOL is a few
| years older), and they do have that dweeby, earnest, natural
| language influence.
| _glass wrote:
| COBOL is very much alive as ABAP, the SAP scripting language.
| mikewarot wrote:
| >An accurate analysis of the fall of Pascal would be longer than
| the rest of this essay.
|
| I put the blame solely on the management of Borland. They had
| the world-leading language, and went off chasing C++ and
| "Enterprise" instead of just riding the wave.
|
| When Anders gave the world C#, I knew it was game over for
| Pascal, and also Windows native code. We'd all have to get used
| to waiting for compiles again.
| Falkon1313 wrote:
| Agreed. And it was partly how they ignored or snubbed two+
| generations of rising developers, hobbyists, etc.
|
| No kid or hobbyist or person just learning was spending $1400+
| on a compiler. Especially as the number of open-source
| languages and tools were increasing rapidly by the day, and
| Community Editions of professional tools were being released.
|
| Sure they were going for the Enterprise market money, but
| people there buy based on what they're familiar with and can
| easily hire lots of people who are familiar to work with it.
|
| Last I looked they do have a community edition of Delphi now,
| but that was slamming the barn door long after the horses had
| all run far away and the barn had mostly collapsed.
| macawfish wrote:
| Dang I wanted it to keep going
| geophile wrote:
| Wow, that was a trip down memory lane! I have used six of those
| languages: BASIC, APL, COBOL, Pascal, Algol-W (a derivative of
| Algol 60), PL/1. Mostly in school. My first dollars earned in
| software (brief consulting gig in grad school) had me debug a
| PL/1 program for a bank.
|
| For some reason I remember an odd feature of PL/1: Areas and
| offsets. If I am remembering correctly, you could allocate
| structures in an area and reference them by offset within that
| area. That stuck in my mind for some reason, but I never found a
| reason to use it. It struck me as a neat way to persist pointer-
| based data structures. And I don't remember seeing the idea in
| other languages.
|
| Maybe the reason it stayed with me is that I worked on Object
| Design's ObjectStore. We had a much more elegant and powerful way
| of persisting pointer-based structures, but an area/offset idea
| could have given users some of the capabilities we provided right
| in the language.
| justsomehnguy wrote:
| >> An area is a region in which space for based variables
| can be allocated. Areas can be cleared of their allocations
| in a single operation, thus allowing for wholesale freeing.
| Moreover, areas can be moved from one place to another by
| means of assignment to area variables, or through input-output
| operations.
|
| >> Based variables are useful in creating linked data
| structures, and also have applications in record input-output.
| A based variable does not have any storage of its own;
| instead, the declaration acts as a template and describes a
| generation of storage.
|
| http://www.iron-spring.com/abrahams.pdf p. 19, 74
|
| *shrug_emoji*
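The area/offset idea maps closely onto what is now called an arena allocator. A loose Python sketch of the concept (all names here are invented for illustration): records are allocated inside one contiguous area and referenced by their offset within it, so everything can be freed in one operation, and the whole area can be moved without invalidating the references.

```python
# Sketch of a PL/I-style "area": records live inside one buffer and are
# referenced by offset within it, not by absolute address.
class Area:
    def __init__(self, size):
        self.buf = bytearray(size)
        self.top = 0  # next free offset

    def allocate(self, data):
        """Copy data into the area; return its offset (a relocatable 'pointer')."""
        off = self.top
        self.buf[off:off + len(data)] = data
        self.top += len(data)
        return off

    def read(self, off, n):
        return bytes(self.buf[off:off + n])

    def clear(self):
        """Wholesale freeing: one operation drops every allocation."""
        self.top = 0


a = Area(64)
p = a.allocate(b"hello")
q = a.allocate(b"world")

# Because references are offsets, the whole area can be moved
# ("assignment to an area variable") and they remain valid:
b = Area(64)
b.buf[:] = a.buf
b.top = a.top
assert b.read(q, 5) == b"world"
```

Since offsets survive relocation, such an area could also be written out and read back, which is the persistence trick geophile describes.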
| kqr wrote:
| Ada supports storage pools as well. In the right circumstances,
| it's a safer way to deal with dynamic allocation.
|
| I believe it also starts to creep into things like C#.
| elcapitan wrote:
| Area-based allocation seems to be quite popular recently in
| game development (under the name "arena allocator"). I saw some
| talks referencing the concept at the "Better Software"
| (gamedev) conference over the weekend [1].
|
| [1] https://bettersoftwareconference.com/
| geophile wrote:
| Serious question: Is Ada dead? I actually had to google Ada, and
| then "Ada language" to find out. It's not dead, and it has a
| niche.
|
| When I was in grad school in the late 70s, there was a major
| competition to design a DoD-mandated language, to be used in all
| DoD projects. Safety and efficiency were major concerns, and the
| sponsors wanted to avoid the proliferation of languages that
| existed at the time.
|
| Four (I think) languages were defined by different teams, DoD
| evaluated them, and a winner was chosen. It was a big thing in
| the PL community for a while. And then it wasn't. My impression
| was that it lost to C. Ada provided much better safety (memory
| overruns were probably impossible or close to it). It would be
| interesting to read a history of why Ada never took off the way
| that C did.
| throwaway81523 wrote:
| Ada isn't dead and it's superior to Rust in many ways, but it
| is less trendy. adacore.com is the main compiler developer
| (they do GNAT). adahome.com is an older site with a lot of
| links.
| pjc50 wrote:
| How does Ada solve dynamic allocation?
| throwaway81523 wrote:
| It kind of doesn't at the moment. That's an area where Rust
| is ahead. They are working on a borrow-checker-like thing
| for Ada. But the archetypal Ada program allocates at
| initialization and not after that. That way it can't die
| from malloc failing, once it is past initialization.
| kqr wrote:
| Assuming "solve" is meant loosely: much like in C++, with
| RAII-style resource management. The Ada standard library
| has what are called _controlled types_ which come with
| three methods: Initialize, Adjust, and Finalize. The
| Finalize method, for example, is automatically called when a
| value of a controlled type goes out of scope. It can do
| things like deallocate dynamically allocated memory.
|
| That said, Ada also has features that make C-style dynamic
| allocation less common. Ada does not have pointers but
| access types, and these are scoped like anything else. That
| means references cannot leak, and it is safer to allocate
| things statically, or in memory pools.
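Python has no scope-based finalization, but as a rough analogue of the Initialize/Finalize pairing of an Ada controlled type, a context manager gives the same guarantee that cleanup runs when control leaves a region (the class here is invented for illustration):

```python
# Rough analogue of an Ada controlled type: Initialize on entry to the
# region, Finalize guaranteed on exit, even if the body raises.
class ControlledBuffer:
    def __init__(self, size):
        self.size = size
        self.buf = None

    def __enter__(self):       # ~ Initialize
        self.buf = bytearray(self.size)
        return self

    def __exit__(self, *exc):  # ~ Finalize
        self.buf = None        # release the "dynamically allocated" storage
        return False


events = []
with ControlledBuffer(16) as cb:
    events.append(cb.buf is not None)  # storage exists inside the region
events.append(cb.buf is None)          # and is gone on exit
assert events == [True, True]
```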
| kqr wrote:
| I don't think the Ada story is particularly interesting:
|
| (1) It was very expensive to license at a time when C was
| virtually free.
|
| (2) It was a complicated language at a time when C was
| (superficially) simple. This made it harder to port to other
| platforms, harder to learn initially, etc.
|
| (3) All major operating systems for the PC and Mac happened to
| be written in C.
|
| Ada had virtually nothing going for it except being an
| amazingly well-designed language. But excellence is not
| sufficient for adoption, as we have seen repeatedly throughout
| history.
|
| Today? Virtually nothing stops you from using Ada. For lower
| level code, it's hands-down my favourite. Picking up Ada taught
| me a lot about programming, despite my experience with many
| other languages. There's something about its design that just
| clarifies concepts.
| elcapitan wrote:
| > Virtually nothing stops you from using Ada
|
| What would you recommend for getting started with it? Looks
| like there's GNAT and then also GNAT Pro and then the whole
| SPARK subset, which one would be best for learning and toying
| around?
| kqr wrote:
| GNAT. Upgrade to gprbuild when you start to find gnatmake
| limiting.
|
| SPARK is best considered a separate language. It gives up
| some of the things that make Ada great in exchange for
| other guarantees that I'm sure are useful in extreme cases,
| but not for playing around.
| thesuperbigfrog wrote:
| If you are playing with Ada, be sure to try out Alire, a
| 'Rust cargo'-style tool for Ada libraries:
|
| https://alire.ada.dev/
|
| One thing to be aware of is that GNAT is part of GCC:
|
| https://gcc.gnu.org/wiki/GNAT
|
| AdaCore is the primary developer of GNAT, SPARK, and
| related tools:
|
| https://blog.adacore.com/a-new-era-for-ada-spark-open-
| source...
|
| There are other Ada compilers as well:
|
| https://forum.ada-lang.io/t/updated-list-of-ada-
| compilers/10...
| elcapitan wrote:
| Oh man that style guide is triggering hard :D
|
| > Use three spaces as the basic unit of indentation for
| nesting.
|
| https://en.wikibooks.org/wiki/Ada_Style_Guide/Source_Code
| _Pr...
| thesuperbigfrog wrote:
| gnatformat is a handy tool for formatting working Ada
| code:
|
| https://github.com/AdaCore/gnatformat
|
| I had forgotten about the three spaces for indentation.
| Yuck.
| Someone wrote:
| > All major operating systems for the PC and Mac happened to
| be written in C.
|
| They were written in "not Ada"; the original OS for the Mac
| was written in assembly and Pascal.
| Aloha wrote:
| (3) wasn't relevant (or even really true) in 1977-1983 when
| Ada was being standardized.
|
| MS-DOS was mostly x86 assembly, Classic MacOS was a mix of
| 68k assembly and Pascal, CP/M was written in PL/M, UCSD
| P-System was Pascal, and this leaves out all of the OS
| options for the Apple II - none of which were written in C.
| I'm hard pressed to identify a PC OS from that time period
| that was written in C, other than something Unix-derived (and
| even some of the Unix-derived things were not C; Domain/OS,
| for example, was written in Pascal).
|
| If we leave the PC space, it gets even less true - TOPS10/20:
| not C, RSX-11: not C, VMS: also not C - and I can keep going
| from there. The only OS from the period that I can point to
| that _was_ C is UNIX.
|
| I'd actually argue that C/C++ were not enshrined as the de
| facto systems programming languages until the early '90s - by
| that time Ada had lost for reasons (1) and (2) that you noted.
| Pamar wrote:
| https://en.m.wikipedia.org/wiki/Ada_(programming_language) -
| it was _mandated_ for some projects both by NATO and US DoD.
| porcoda wrote:
| No. Just google for NVIDIA and Adacore to see how Ada is quite
| alive in NVIDIA land. Ada is quite a nice language that more or
| less anticipated a lot of the current trends in languages that
| the safe languages like Rust and friends are following. Spark
| is quite a cool piece of work too. I think the perception of
| old-ness is the biggest obstacle for Ada.
| ab_testing wrote:
| Where does Perl fit in this scheme of dying languages? I see
| fewer and fewer new packages written in Perl and lots of
| unmaintained packages on CPAN. It seems obvious that the
| language is dying a slow death.
| pjc50 wrote:
| The 5 -> 6 transition took out a lot of its momentum, and it
| was gradually eclipsed by two other dynamically typed
| languages which were less alarming to read: Python and
| JavaScript.
| AnimalMuppet wrote:
| "Less alarming to read". Sir, that is _beautifully_ said.
| reddit_clone wrote:
| Sad thing is Perl6/Raku is a nice kitchen-sink language with
| some pretty sophisticated constructs!
|
| Unfortunately it has no mind-share.
| Taniwha wrote:
| Algol-68 gets a bum rap here - it brought us 'struct', 'union'
| (and tagged unions), a universal type system, operator
| declarations, a standard library, and a bunch more. Wirth worked
| on the original design committee, and Pascal reads like someone
| implementing the easy parts of Algol-68.
|
| The things it got wrong were mostly its rigorous mathematical
| definition (syntax and semantics), which was almost unreadable
| by humans ... and its use of two character sets (this was in
| the days of cards) rather than reserved words.
| p_l wrote:
| structs were first introduced in Algol 58 variant called
| JOVIAL, AFAIK
| jamesfinlayson wrote:
| > Of the four mother languages, ALGOL is the most "dead";
| Everybody still knows about LISP, COBOL still powers tons of
| legacy systems, and most scientific packages still have some
| FORTRAN.
|
| I've heard of enough Cobol and Fortran jobs existing, and Lisp
| continues to exist in some form or other, but Algol really does
| seem dead. I remember someone telling me about an Algol codebase
| that was decommissioned in 2005 and that seemed like a very late
| death for an Algol codebase.
| kqr wrote:
| Algol is in the funny position of being both very dead, yet not
| dead at all given its legacy. I suspect that's sort of
| inevitable: if all of a language's spirit is cannibalised by
| newer languages, it will be hard to argue for using the old
| language and it dies.
|
| (In contrast, Lisp retains some unique ideas that have not been
| adopted by other languages, so it survives by a slim margin.)
| brabel wrote:
| There are several well maintained Common Lisp compilers, some
| of which are paid for and can sustain whole businesses,
| Clojure is fairly big as niche languages go, and Guile Scheme
| is used across GNU projects. There's also some usage of
| Racket and Scheme in education, and I believe Janet is having
| success in gaming. So I wouldn't say Lisp survives by just a
| slim margin.
| kqr wrote:
| Correct. That was bad wording on my part. I meant it in the
| sense of "it survives at a scale proportional to the
| fraction of it that is unique".
|
| It used to be that things like GC, REPL, flexible syntax,
| the cond form etc. made Lisp unique, but these days the
| list is down mainly to homoiconicity (and an amazing hacker
| culture).
| vindarel wrote:
| the image-based REPL, including the interactive debugger,
| is still unique (matched maybe by Smalltalk), CLOS and
| the condition system too... pair them with stability,
| fast implementations, instantaneous compile-time errors
| and warnings a keystroke away, a Haskell-like type system
| on top of the language (Coalton), macros... and the whole
| is still pretty unique.
| skissane wrote:
| > but Algol really does seem dead
|
| Unisys still actively maintains their MCP mainframe operating
| system, which is written in an Algol superset (ESPOL/NEWP), and
| comes with a maintained Algol compiler -
| https://public.support.unisys.com/aseries/docs/ClearPath-MCP...
| - and they continue to add new features to Algol (even if
| minor)
|
| So, no, Algol isn't dead. Getting close but still not quite
| there. There are better candidates for dead languages than
| Algol, e.g. HAL/S (programming language used for the Space
| Shuttle flight software)
| p_l wrote:
| There are some legacy programs in the DoD, developed in a
| variant of Algol 58 called JOVIAL, that are still supported and
| edited despite over 40 years now of mandates to migrate to
| something newer (Ada). JOVIAL was also used in some other
| places, like UK Air Traffic Control, at least until 2016 (the
| last date I've seen for a replacement plan).
| askvictor wrote:
| In the Smalltalk section, it says that Python isn't 'true' OO
| like Smalltalk... who considers this to be the case? In Python,
| everything (including functions and classes) is an object.
| a_bonobo wrote:
| I think this refers to encapsulation in Python, 'private'
| methods aren't really private, any user can poke around in
| there.
|
| Old justification:
| https://mail.python.org/pipermail/tutor/2003-October/025932....
|
| >Nothing is really private in python. No class or class
| instance can keep you away from all what's inside (this makes
| introspection possible and powerful). Python trusts you. It
| says "hey, if you want to go poking around in dark places, I'm
| gonna trust that you've got a good reason and you're not making
| trouble."
|
| >After all, we're all consenting adults here.
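Concretely, Python's double-underscore "private" names are only mangled, not hidden, so a determined caller can always reach them:

```python
class Account:
    def __init__(self):
        self.__balance = 100  # "private" by convention and name mangling only

    def balance(self):
        return self.__balance


acct = Account()

# Outside the class, the unmangled name does not exist...
try:
    acct.__balance
    reached = True
except AttributeError:
    reached = False
assert not reached

# ...but the mangled name is there for anyone who looks:
acct._Account__balance = 999
assert acct.balance() == 999
```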
| johannes1234321 wrote:
| I don't think that is the relevant point. Smalltalk doesn't
| have visibility either.
|
| The difference is that in smalltalk everything is a message.
| Operators are messages to an object. Even things we commonly
| assume to be control structures in other languages like if or
| while (or rather ifTrue, whileTrue) are messages. Python is a
| lot less "pure" but so are all commonly used OO languages.
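A toy Python sketch (invented names) of what "control flow as a message" means: in Smalltalk, ifTrue:ifFalse: is a message sent to a Boolean object, with blocks as arguments, and the Boolean itself decides which block to evaluate.

```python
# Toy illustration of Smalltalk's ifTrue:ifFalse: as a message send.
class SmalltalkBool:
    def __init__(self, value):
        self.value = value

    def if_true_if_false(self, true_block, false_block):
        # No built-in `if` needed at the call site: the receiver
        # chooses which block to run.
        return true_block() if self.value else false_block()


result = SmalltalkBool(3 > 2).if_true_if_false(
    lambda: "greater",
    lambda: "not greater",
)
assert result == "greater"
```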
| brodo wrote:
| I think this comes from the fact that Alan Kay does not think
| it is OO. There is no formal definition, but Python does not
| have Smalltalk-like 'method_missing' or 'responds_to' methods.
| If you think OOP means messages and late binding, that feature
| is important.
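For what it's worth, Python's closest hook in this direction is `__getattr__`, which fires when ordinary attribute lookup fails, loosely in the spirit of Smalltalk's doesNotUnderstand: (or Ruby's method_missing), though it is a lookup fallback rather than true message interception (the class and names below are invented for illustration):

```python
# A proxy that absorbs any unknown "message" instead of raising.
class Proxy:
    def __init__(self):
        self.received = []

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails.
        def handler(*args):
            self.received.append((name, args))
        return handler


prox = Proxy()
prox.make_coffee("espresso")  # no such method, but the send still "lands"
assert prox.received == [("make_coffee", ("espresso",))]
```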
| igouy wrote:
| _That 's the opposite of what it says_ -- We
| sometimes think that Smalltalk is "true" OOP and things like
| Java and Python aren't "real" OOP, but that's not true.
| zzo38computer wrote:
| I still sometimes use BASIC and Pascal (both in DOS), although
| they are not the programming languages I mostly use.
| PeterStuer wrote:
| In the finance world COBOL is still very much not dead. It's not
| as dominant as it once was, but core mainframe code is very often
| still COBOL.
|
| Replacing that is a _very_ hard problem, as thousands and
| thousands of (abused and overloaded) integrations are layered
| in several strata around it, relying each night on _exact_
| behaviour in the core, warts and all.
|
| As for Smalltalk, I know at least one company around here still
| running some legacy, but still maintained afaik, code on it (
| https://mediagenix.tv ).
| balderm wrote:
| I used to work in a bank IT department 13 years ago and all
| their systems still ran on Java 1.4.2 and COBOL procedures. At
| the time, IBM was basically forcing them to migrate to Java 1.6
| by the end of the year because the new LTS version no longer
| supported that Java version.
| lucas_membrane wrote:
| My recollection, contrary to TFA, is that development of PL/I or
| PL1 (both names were used) started in 1964 or a bit earlier with
| the intention of coming up with a replacement for Fortran. I
| think some referred to it as Fortran V or Fortran VI. IBM
| introduced PL1 pretty quickly after it rolled out the 360
| mainframes, maybe because it might help sell more megabuck
| hardware. But there were two different compilers that compiled
| two different versions of the language (a fork that remained a
| wretched stumbling block for at least 20 or 25 years): one for
| the big mainframes (model 50 or bigger) and one for the
| economical mid-line mainframes, and none for the model 20. In
| that period, Gerald Weinberg published some good stuff about PL1,
| including a strong warning against concurrent processing. There
| was also a PL1 on Multics that tried to be a little more suited
| to academia than the IBM versions. In the middle 1980s there was
| a PL1 subset that ran on PCs. It couldn't do anything that Turbo
| Pascal couldn't do, but it did it much slower.
| zozbot234 wrote:
| The point of PL/I was to unify the userbases of both FORTRAN
| and COBOL, thus tending to "replace" both languages. There was
| some influence from PL/I to the CPL language, which in turn led
| to BCPL, B and C.
| nxobject wrote:
| I've always wondered what it would be like to "live" in ALGOL 68
| for a while, because of its ambition and idiosyncrasy. With a
| modern perspective, what would I be surprised to actually have,
| and what would I be surprised to miss?
|
| (Apart from the wild terminology. File modes are called "moods",
| and coincidentally, ALGOL 68's three standard "channels" (i.e.
| files) are "stand in", "stand out", and "stand back" -- I kid you
| not, close enough to 'stdin', 'stdout'.)
| eddieh wrote:
| Basically, writing a CS paper in LaTeX using the "alg*"
| packages (algorithm2e, algorithmicx, algorithms, algpseudocode,
| algxpar, algpseudocodex, etc.) is living in the ALGO 60 / ALGO
| 68 world. For the most part, nobody uses anything from the
| standards that would require superfluous explanations in a
| paper.
| sandworm101 wrote:
| Seeing more than one of the languages i have used on this list
| makes me feel very old.
|
| +1 for basic, first used in gradeschool.
|
| +1 for pascal (highschool)
|
| +1 for lisp (compsci 101)
| KWxIUElW8Xt0tD9 wrote:
| Ah, APL. I remember a network synthesis EE course in the 70s
| where we had an assignment to implement matrix exponentiation. I
| had just finished a matrix algebra class and learned APL in the
| process, so my implementation was 2 lines of APL. Those were
| the days...
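The algorithm itself is short in any language; a pure-Python version of matrix power by repeated squaring (which APL would indeed compress to a line or two):

```python
def mat_mul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]


def mat_pow(m, e):
    """Raise square matrix m to a non-negative integer power by squaring."""
    n = len(m)
    result = [[int(i == j) for j in range(n)] for i in range(n)]  # identity
    while e:
        if e & 1:
            result = mat_mul(result, m)
        m = mat_mul(m, m)
        e >>= 1
    return result


# Classic check: powers of [[1,1],[1,0]] generate Fibonacci numbers.
assert mat_pow([[1, 1], [1, 0]], 10)[0][1] == 55
```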
| xelxebar wrote:
| You should come back. APL has considerably modernized in the
| intervening 50 years, and with data parallelism gaining ever
| more market share, the timing is perhaps ripe...
| bombcar wrote:
| How is it to read and write? It always looked to me like
| something that made Perl look verbose and clear.
| bear8642 wrote:
| > How is it to read and write?
|
| Fairly straight-forward _once_ you 've learnt the character
| set.
|
| See here for details: https://aplwiki.com/wiki/Typing_glyphs
| FrustratedMonky wrote:
| Surprised F# wasn't on the list.
|
| F# was like the test bed for many features that got moved to
| C# once proven.
| claytonaalves wrote:
| Pascal isn't dead. At least here in Brazil it's widespread. I've
| heard it's used in Russia and China too.
|
| Freepascal [1] is up and running, targeting a lot of platforms:
| x86-16, x86-32, AMD64, RISC-V (32/64), ARM, AArch64, AVR, JVM,
| Javascript...
|
| ... and operating systems: Windows, Linux, MacOS, iOS, web and
| others.
|
| I don't know an easier way to build multiplatform desktop
| applications other than Freepascal/Lazarus. I mean, real
| desktop apps, not that Electron bullsh*.
|
| [1] https://www.freepascal.org/
| breve wrote:
| C# with Avalonia is pretty good for cross platform development:
|
| https://avaloniaui.net/
| weinzierl wrote:
| When I saw the Smalltalk paragraph I could not help thinking Rust
| will be to Haskell what Java has been to Smalltalk.
| troad wrote:
| I'm struggling to follow your logic a little.
|
| Java has virtually nothing in common with Smalltalk, other than
| in the most superficial way (things called objects exist, that
| work nothing alike). The closest thing to Smalltalk in serious
| use today is Ruby, which hews very close to Smalltalk's
| philosophy of message passing, though it does abandon the idea
| of a holistic programmable environment.
|
| If one considers the actor model to be a natural evolution of
| Smalltalk's original idea of message-oriented, dynamic objects,
| then the BEAM might be Smalltalk's natural successor (Erlang,
| Elixir, Gleam, etc). Genservers are effectively isolated green
| threads that communicate solely by sending one another
| messages, which strikes me as very close to the spirit of
| Smalltalk's objects.
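The genserver pattern the parent describes can be sketched in a few lines of Python (a toy, not BEAM semantics: no supervision, no preemption): private state owned by one thread, reachable only through its mailbox.

```python
import queue
import threading

# Minimal actor: state is private to one thread; the only way in or out
# is a message through the mailbox.
class Counter:
    def __init__(self):
        self.mailbox = queue.Queue()
        self._count = 0  # never touched from outside the actor thread
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while True:
            msg, reply = self.mailbox.get()
            if msg == "incr":
                self._count += 1
            elif msg == "get":
                reply.put(self._count)
            elif msg == "stop":
                return

    def call(self, msg):
        """Synchronous send: wait for the actor's reply."""
        reply = queue.Queue()
        self.mailbox.put((msg, reply))
        return reply.get()

    def cast(self, msg):
        """Asynchronous send: fire and forget."""
        self.mailbox.put((msg, None))


c = Counter()
c.cast("incr")
c.cast("incr")
assert c.call("get") == 2  # mailbox is FIFO, so both incrs land first
c.cast("stop")
```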
| fuzztester wrote:
| What about Eiffel?
|
| What are people's thoughts about it?
|
| I know a little about its history, and had tried it out a bit,
| some years ago, via the EiffelStudio trial edition, IIRC. I had
| also read a good chunk of Bertrand Meyer's book, Object Oriented
| Software Construction. Thought the book was good.
___________________________________________________________________
(page generated 2025-07-16 23:01 UTC)