[HN Gopher] What did Ada Lovelace's program actually do? (2018)
       ___________________________________________________________________
        
       What did Ada Lovelace's program actually do? (2018)
        
       Author : mitchbob
       Score  : 171 points
       Date   : 2024-12-16 16:58 UTC (6 hours ago)
        
 (HTM) web link (twobithistory.org)
 (TXT) w3m dump (twobithistory.org)
        
       | uberman wrote:
       | What I think is the coolest part is her actual work in the
       | "notes" she attached to the translation.
       | 
       | See:
       | https://upload.wikimedia.org/wikipedia/commons/c/cf/Diagram_...
       | 
       | and: https://en.wikipedia.org/wiki/Note_G
       | 
       | The article also references this python translation of her work:
       | 
       | https://enigmaticcode.wordpress.com/tag/bernoulli-numbers/
        
         | 0points wrote:
         | Half the article is about Note A and Note G.
        
           | uberman wrote:
           | Sure, but it is spoken about in the abstract. I enjoyed the
           | article, but why not at least include "some" of the actual
           | notes she wrote or at least a screenshot?
        
             | stnmtn wrote:
              | There was a link directly to Note G in the article;
              | in fact, it's the exact same URL that you linked to.
        
               | uberman wrote:
               | Yes, and I said that explicitly in my post.
               | 
               | The difference is in my post it is one of the featured
               | things. In the article that claims to show what the
               | program actually did it is buried in the text.
               | 
               | I'm not really sure what you are defending. The article
               | is a good read but does a terrible job of illuminating
               | what her program actually did.
        
       | StableAlkyne wrote:
       | > The Difference Engine was not a computer, because all it did
       | was add and subtract.
       | 
        | The definition of computer is pretty grey for the pre-digital
        | era, and it wasn't Turing complete, but is it actually
        | controversial whether it was a computer?
        
         | exitb wrote:
          | The Difference Engine basically implemented one algorithm
          | in hardware, while the Analytical Engine was supposed to
          | run a program. I believe that could make the latter one a
          | computer.
        
           | jandrese wrote:
           | The first stored program computer is a remarkable
           | achievement, even if they didn't actually build it.
        
             | Manuel_D wrote:
             | The analytical engine wasn't a stored program computer. It
             | most closely follows the Harvard architecture, with
             | instructions read from punch card memory. The analytical
             | engine's claim to fame is that it was the first Turing
             | complete computer to be designed.
        
               | area51org wrote:
               | > with instructions read from punch card memory
               | 
               | If that isn't a stored program, I don't know what is.
        
               | Manuel_D wrote:
               | A stored program computer refers to the computer
               | architecture where program instructions and data are
               | stored in the same memory. This is also referred to as
               | the Von Neumann architecture.
               | 
               | In contrast, a lot of early computers were built with
               | separate instruction memory like punch cards. This is
               | called the Harvard Architecture. If the instructions were
               | immutable, which they usually were, then things like
               | modifying the program at runtime were not possible.
               | 
                | A concrete example of this difference is the Harvard
                | Mark I versus the Manchester Mark I: the former is a
                | Harvard architecture computer, while the latter is a
                | stored program computer, i.e. a von Neumann
                | architecture.
        
           | Wowfunhappy wrote:
           | > Difference Engine basically implemented one algorithm in
           | hardware
           | 
           | So, did Pong run on a computer?
        
         | Avicebron wrote:
          | Probably not. As stated in TFA, the controversy exists
          | because Lovelace was a woman and some people think propping
          | her up is basically a DEI retcon of history; the rest of us
          | don't care. But I don't think it has anything whatsoever to
          | do with actual computers.
        
           | nilamo wrote:
           | That's so funny...
           | 
           | Mathematicians for 150 years: Ada Lovelace is kind of on top
           | of it.
           | 
           | Random from 2024: probably just a diversity footnote.
        
             | mrguyorama wrote:
             | Seriously. As the article states, while everyone else was
             | like "Wow cool we will make a machine that makes
             | calculating things easier"
             | 
             | Meanwhile Ada over here going "Oh shit this can do
             | literally anything that can be done by steps of math.
             | Someday machines following that formula will _make music_ "
             | 
             | Ada is not the first programmer. Ada is the first _computer
             | scientist_. She understood the ramifications of what we
              | would eventually call "Turing complete" systems, and
             | understood the value of "general purpose" in a general
             | purpose computer, and seemingly understood that more than
             | just numbers could be represented and calculated in a
             | computer.
        
               | grey-area wrote:
               | Yes this is the most interesting thing about her writing
               | - she foresaw a lot of later work.
        
             | metalman wrote:
              | Funny indeed. Ada Lovelace has been persistently
              | recognised for a very long time, but has never been
              | held up as a suffragette-type martyr; by all accounts,
              | she enjoyed herself out on the bleeding edge and is
              | still making people uncomfortable 150 years after not
              | fitting into any stereotypes then. It's clear from the
              | footnotes that whatever crowd was around Babbage and
              | Lovelace grasped the possibilities. Also interesting
              | is that during the Apollo moon missions, the memory
              | modules for the guidance computers were crafted by
              | hand by some of the last lace makers to survive the
              | introduction of the Jacquard looms and their punch
              | cards.
        
           | Manuel_D wrote:
           | https://en.wikipedia.org/wiki/Ada_Lovelace#Controversy_over_.
           | ..
           | 
           | > All but one of the programs cited in her notes had been
           | prepared by Babbage from three to seven years earlier. The
           | exception was prepared by Babbage for her, although she did
           | detect a "bug" in it. Not only is there no evidence that Ada
           | ever prepared a program for the Analytical Engine, but her
           | correspondence with Babbage shows that she did not have the
           | knowledge to do so.
           | 
           | > Bruce Collier wrote that Lovelace "made a considerable
           | contribution to publicizing the Analytical Engine, but there
           | is no evidence that she advanced the design or theory of it
           | in any way"
           | 
           | The common claims are that Ada Lovelace was the first person
           | to write a computer program, or that she was actually the
           | primary driver in developing the analytical engine. Both such
           | claims fall into the area "DEI retcon" as you choose to
           | phrase it.
           | 
           | Although on a more pedantic note, Babbage wasn't the first
           | person to program a computer either. Computers that aren't
           | Turing complete are still computers. The Jacquard loom is one
           | such example, and unlike the analytical engine it was
           | actually built and put to practical use.
        
           | kevin_thibedeau wrote:
            | An entire programming language was named after her in
            | 1980 (by a man), back when such things didn't exist.
        
         | SilasX wrote:
         | I'm not sure I have a direct answer, but I agree something
         | shouldn't be called a computer if it just does a one-shot,
         | fixed-length calculation before requiring further human
         | intervention. To be a "computer", and be associated with that
         | general conceptspace, it should be Turing-complete and thus
         | capable of running arbitrarily long (up to the limits of memory
         | and hardware rot).
         | 
         | Earlier comment expressing annoyance at a mislabeling:
         | 
         | https://news.ycombinator.com/item?id=40077408
        
           | SilasX wrote:
           | Separate comment to address a subtlety that comes up a lot:
           | 
           | Often you'll hear about fully homomorphic encryption (FHE)
           | being Turing-complete. But you can't actually have a Turing
           | complete system with variable-run-time loops that's
           | homomorphically encrypted, because that leaks information
           | about the inputs.
           | 
           | When they say FHE is Turing-complete, what they mean is that
           | you can take an arbitrary program requiring Turing
           | completeness, then time-bound it, unroll it into a fixed-
           | length circuit, and run _that_ homomorphically. Since you can
           | keep upping the time bound, you can compute any function. So
           | the _system_ that translates your programs into those
           | circuits, with no limit on the bound you set, could then be
           | called Turing-complete -- but you couldn 't say that about
           | any of those circuits individually.
           | 
           | Earlier related comment:
           | https://news.ycombinator.com/item?id=40078494
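A minimal Python sketch of that unrolling idea (illustrative only; the names and the example loop are assumptions, and a real FHE circuit would use an oblivious select on encrypted data rather than a plaintext branch):

```python
def run_unrolled(step, halted, state, bound):
    """Run `step` for exactly `bound` iterations, regardless of input.

    Every input takes the same number of steps, so the run time leaks
    nothing about the data. Once `halted` holds, the remaining steps
    keep the state fixed (in real FHE, this branch would be an
    oblivious multiplexer rather than a visible `if`).
    """
    for _ in range(bound):
        state = state if halted(state) else step(state)
    return state


# Example: Collatz iteration starting from 6, time-bounded at 100 steps.
result = run_unrolled(lambda n: n // 2 if n % 2 == 0 else 3 * n + 1,
                      lambda n: n == 1, 6, 100)
```

If the bound is too small for some input, the computation simply stops short, which is why the surrounding system (which can keep raising the bound) is the Turing-complete part, not any one circuit.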
        
         | jcranmer wrote:
          | That the Difference Engine and Analytical Engine belong on
          | the timeline of computing history isn't particularly
          | controversial, but I've never seen anyone try to claim the
          | Difference Engine itself was a computer (it's a mechanical
          | calculator). The Wikipedia page doesn't even try to link it
          | directly to the history of computers; you have to go to the
          | Analytical Engine to see the Difference Engine's place in
          | the "history of computing" timeline.
        
         | UniverseHacker wrote:
          | I don't think there is anything controversial here: the
          | Difference Engine was a calculator that could only do a
          | predefined set of hardwired computations, while the
          | Analytical Engine was a true Turing complete computer.
        
         | retrac wrote:
         | Is an early 20th century mechanical desk calculator a computer?
         | There is no consensus on definition but for me, a computer
         | follows a program. Maybe even only one fixed program. But a
         | program. If there is no stepping through a program it is not a
         | computer.
         | 
         | Does the iterative method used by the difference engine
         | constitute a program?
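For reference, the iterative method in question is the method of finite differences, which tabulates a polynomial using only repeated addition, the single operation the Difference Engine mechanized. A minimal Python sketch, with an example polynomial chosen for illustration:

```python
def tabulate(initial_diffs, n):
    """Tabulate n values of a polynomial by finite differences.

    `initial_diffs` holds the starting value and its forward
    differences, [p(0), Dp(0), D2p(0), ...]. Each step produces the
    next value using additions only, no multiplication needed.
    """
    diffs = list(initial_diffs)
    values = []
    for _ in range(n):
        values.append(diffs[0])
        # Add each difference into the one above it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values


# Example: p(x) = 2x^2 + 3x + 5, so p(0)=5, Dp(0)=p(1)-p(0)=5, D2p=4.
# tabulate([5, 5, 4], 4) -> [5, 10, 19, 32]
```

Whether the fixed add-and-carry cycle above counts as "a program" is exactly the definitional question being debated.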
        
       | jandrese wrote:
       | > In her "diagram of development," Lovelace gives the fourth
       | operation as v5 / v4. But the correct ordering here is v4 / v5.
       | This may well have been a typesetting error and not an error in
       | the program that Lovelace devised. All the same, this must be the
       | oldest bug in computing. I marveled that, for ten minutes or so,
       | unknowingly, I had wrestled with this first ever bug.
       | 
       | The real mark of a non-trivial program is that it doesn't work on
       | the first try.
       | 
       | It's incredible how Babbage, frustrated that the mass production
       | precision machining technology necessary to make his simple
       | engine work didn't exist yet, decides that the best way forward
       | is to design a new system an order of magnitude more complex and
       | then go to Italy to find more advanced manufacturing somehow.
        
         | ChrisMarshallNY wrote:
         | I had an employee like that.
         | 
         | He'd want to do something, and hit a roadblock, so he'd design
         | his own tool (He wrote his own font, once, because he didn't
         | like the way the built-in ones worked at teeny point sizes).
         | 
         | Best damn engineer I ever knew, but I had to keep an eye out
         | for rabbitholing.
        
           | jandrese wrote:
           | Babbage would have likely had more success if he stayed in
           | England and opened his own precision machine shop.
        
           | m463 wrote:
           | > He wrote his own font, once, because he didn't like the
           | way...
           | 
           | Wonder how many folks here have done the same thing, building
           | and discarding in the throes of creation like tibetan monks:
           | 
           | https://en.wikipedia.org/wiki/Sand_mandala
        
       | dkdbejwi383 wrote:
       | > In fact, aside from the profusion of variables with unhelpful
       | names, the C translation of Lovelace's program doesn't look that
       | alien at all.
       | 
       | Clearly the author never met my coworkers.
        
         | seanhunter wrote:
         | ...or worked with any mathematicians/physicists/engineers who
         | program. As soon as I saw that, I thought "typical quant".
         | 
          | Like my dad (a chemical engineer) learned to program in
         | FORTRAN, which used to insist variable names were 1 letter and
         | up to 2 digits. He later learned Basic, but his code was still
         | spiritually FORTRAN so the one-letter-two digits thing stuck. I
         | thought that was just him but then much later I went to work on
         | Wall St and had to work with quants who were copying code out
         | of "Numerical Recipes" and it was exactly the same just now in
         | C.
        
           | glouwbug wrote:
            | That naming convention makes perfect sense to the
            | mathematician, so why not? It's why we use `for (int i =
            | 0; i < n; i++)` in for loops; it's the same naming
            | convention as the mathematical sigma sum.
        
             | seanhunter wrote:
             | Oh yeah. And if you're like my dad you call them "do loops"
             | not "for loops"
        
             | lukan wrote:
              | The question to me always was: does it make sense in
              | the way that it is intuitively understandable, or does
              | it only make sense if it was drilled into you long
              | enough?
              | 
              | (I suspect the latter)
        
             | kevin_thibedeau wrote:
             | A loop counter doesn't carry much semantic weight so it
             | gets a short name. Doing that for important things that
             | deserve a descriptive name is the problem. Maybe passable
             | with literate programming, but even Knuth's code is pretty
             | inscrutable due to transclusions everywhere.
        
           | stevenalowe wrote:
           | I helped port a physicist's assembly code long ago; variables
           | were named alphabetically in the order encountered in the
           | code, e.g. A, B, ...A1, ..., AA1, etc. up to ZZ23.
           | 
           | Still amazed that the nearly-incomprehensible code (and the
           | port) worked
        
           | vincent-manis wrote:
           | Not sure which Fortran this refers to. I never used Fortran
           | I, but as I understand it, names were up to 6 characters
           | long, first character alphabetic; names with initial letter
           | A-H and O-Z were REAL, I-M INTEGER (Fortran II added
           | declarations to override the defaults). Dartmouth Basic
           | restricted names to a single letter and an optional digit.
           | 
           | Incidentally, the various Autocode languages of the 1950s in
           | Britain had 1-character variable names.
        
         | kirkules wrote:
         | I'm reminded of a high school programming class where a project
         | partner named variables with the most crude and lewd words he
         | could imagine. Not that I was prudish, but he unsurprisingly
         | never remembered what "butts" was for and somehow never figured
         | out why he kept getting confused by his own code.
        
       | neuronet wrote:
       | A really cool article. From the Intro:
       | 
       | > She thought carefully about how operations could be organized
       | into groups that could be repeated, thereby inventing the loop.
       | She realized how important it was to track the state of variables
       | as they changed, introducing a notation to illustrate those
       | changes. As a programmer myself, I'm startled to see how much of
       | what Lovelace was doing resembles the experience of writing
       | software today.
       | 
       | > So let's take a closer look at Lovelace's program. She designed
       | it to calculate the Bernoulli numbers. To understand what those
       | are, we have to go back a couple millennia to the genesis of one
       | of mathematics' oldest problems.
       | 
       | It does a nice job getting into just enough detail to make you
       | appreciate what she did. If she were alive today, you could
       | imagine her down the hall grinding away on some problem in Rust
       | (I have a feeling she'd have a strong preference for statically
       | typed languages).
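To get a feel for what her program computes, here is a minimal Python sketch of the Bernoulli numbers via the standard recurrence (an illustrative modern reconstruction, not Lovelace's actual procedure; note too that her numbering of the B's differs from the modern convention):

```python
from fractions import Fraction
from math import comb


def bernoulli(n):
    """Return [B_0, ..., B_n] using the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        B[m] = -sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1)
    return B


# B_0..B_4 = 1, -1/2, 1/6, 0, -1/30
```

Exact rational arithmetic (`Fraction`) matters here: the B's are ratios that floating point mangles quickly as n grows.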
        
         | int_19h wrote:
         | However much credit Ada deserves for her programming
         | techniques, to me the thing that always stood out is her
         | ability to see the big picture wrt computation:
         | 
         | > Again, it [Analytical Engine] might act upon other things
         | besides number, were objects found whose mutual fundamental
         | relations could be expressed by those of the abstract science
         | of operations, and which should be also susceptible of
         | adaptations to the action of the operating notation and
         | mechanism of the engine. Supposing, for instance, that the
         | fundamental relations of pitched sounds in the science of
         | harmony and of musical composition were susceptible of such
         | expression and adaptations, the engine might compose elaborate
         | and scientific pieces of music of any degree of complexity or
         | extent.
         | 
         | Imagine coming up with this idea in 1842, a whole century
         | before the first actual programmable computers would be built,
         | based solely on a description of a prototype of a mechanical
         | computer. This is hacking extraordinaire.
        
           | JohnMakin wrote:
           | Programmable looms (which used a type of punchcard) such as
           | the Jacquard Loom had existed for a little while - if I
           | recall she specifically referenced this as inspiration for
           | some of her ideas. Not trying to diminish how impressive her
           | work was, but I do believe some form of primitive mechanical
           | computation had already been done for a little while.
        
             | int_19h wrote:
             | Jacquard loom was indeed well-known, and one of the sources
             | of inspiration for Babbage, but it is still fundamentally
             | about designing a system around a specific task - the cards
             | directly encode operations on hooks.
             | 
             | What Ada is saying here is that, once you have a machine
              | that lets you do generic operations on numbers, you can use
             | it to do all kinds of non-math stuff so long as you can
             | come up with ways to encode other things as numbers (=
             | finite sets of symbols). This was not at all obvious to
             | other people who worked on the Engine, including Babbage
             | himself.
        
           | kevin_thibedeau wrote:
           | Tide prediction machines came about 30 years later as an
           | application of the "science of harmony".
        
           | coldpie wrote:
           | I agree, this is the thing that stood out to me. There's this
           | kind of amazing leap you have to do to understand how
           | computers do what they do. How does a thing that adds and
           | subtracts numbers _paint pictures_? Once you grasp that you
           | can transform those things _into_ numbers and then operate on
           | them, the whole world of computation opens up. It 's amazing
           | Ada was thinking about this 100 years before computers really
           | existed.
        
         | TomatoCo wrote:
         | > She realized how important it was to track the state of
         | variables as they changed, introducing a notation to illustrate
         | those changes.
         | 
         | The thing that really stuck out to me was how similar it was to
         | static single assignment.
         | https://en.wikipedia.org/wiki/Static_single-assignment_form#...
         | 
         | I think this is a state-of-the-art technique today and she had
         | it, what, 180 years ago?
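A tiny illustration of the parallel (the variables here are assumed examples, not from the article): Lovelace's diagram gives a variable a new superscript each time its contents change, much as SSA gives every assignment a fresh name instead of overwriting:

```python
# Mutable form:          SSA-style form, echoing how Lovelace's
#   v = a * b            diagram records each successive value a
#   v = v + c            variable holds rather than overwriting it.
a, b, c = 2, 3, 4
v1 = a * b   # "v superscript 1": the first value held by v
v2 = v1 + c  # "v superscript 2": a fresh name; v1 is never clobbered
```

SSA proper dates to IBM research in the 1980s, but the bookkeeping instinct, tracking every state a variable passes through, is recognizably the same.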
        
       | dang wrote:
       | Discussed at the time (of the article):
       | 
       |  _What Did Ada Lovelace's Program Actually Do?_ -
       | https://news.ycombinator.com/item?id=17797003 - Aug 2018 (52
       | comments)
        
         | olddustytrail wrote:
         | > Discussed at the time (of the article)
         | 
         | Thank you for that careful clarification. The discussion in
         | "Bell's Life in London and Sporting Chronicle" was far less
         | enlightening.
        
         | eesmith wrote:
         | Also relevant is "Untangling the Tale of Ada Lovelace" from
         | December, 2015 at
         | https://writings.stephenwolfram.com/2015/12/untangling-the-t...
         | with 35 comments from the time at
         | https://news.ycombinator.com/item?id=10709730 .
        
       | glouwbug wrote:
       | Has anyone built a virtual machine out of Babbage's instruction
       | set and then tried Ada's program?
        
         | cchianel wrote:
          | John Walker built a virtual machine for Babbage's
          | instruction set, and it has a web emulator:
          | https://fourmilab.ch/babbage/emulator.html.
          | 
          | I don't think Ada's program is available as an example,
          | though, so you'll need to input it manually.
         | 
          | Fun fact: my compiler course project was creating a C
          | compiler targeting the emulator:
          | https://github.com/Christopher-Chianelli/ccpa
          | (warning, said code is terrible).
        
         | alanjay wrote:
         | Not quite, but this emulates her program.
         | 
         | https://github.com/MarquisdeGeek/Ada-Origins
        
       | LeroyRaz wrote:
        | Good article. This is the clearest explanation I've read of how
       | and why Ada was meaningfully innovative, and worthy of her
       | recognition.
        
       ___________________________________________________________________
       (page generated 2024-12-16 23:00 UTC)