[HN Gopher] Turing Oversold?
       ___________________________________________________________________
        
       Turing Oversold?
        
       Author : anielsen
       Score  : 333 points
       Date   : 2021-09-14 10:48 UTC (1 day ago)
        
 (HTM) web link (people.idsia.ch)
 (TXT) w3m dump (people.idsia.ch)
        
       | pantulis wrote:
       | The article makes a lot of true points, but it does not
       | explore the reason _why_ this is.
       | 
       | I feel the era of great thinkers who single-handedly performed
       | disruptive breakthroughs in their field, the Galileos and
       | Newtons, ended with the Einstein era (and even Einstein
       | stood on the shoulders of giants).
       | 
       | No one works in isolation any more, and that is not a bad thing.
       | You can subject any relevant figure to a similar analysis and
       | come up with the same results; it's absurd to expect an
       | overwhelming figure like Albert Einstein to emerge these
       | days.
       | 
       | But if you need to choose a Founding Father of Computing Science
       | for the general public, I'd say Alan Turing is the best
       | candidate. Scholars will give due credit to Church, Zuse, von
       | Neumann and all the others.
        
         | colechristensen wrote:
         | Nobody exists in a vacuum, but I think Kuhn was right:
         | scientific progress is made up of long periods of incremental
         | work punctuated by short bursts of paradigm shifts. Those
         | shifts are more likely to rest on a few very influential people
         | who take the current state and look at it in a considerably
         | different way. We haven't had that in physics in quite a long
         | time and might not again.
        
         | OskarS wrote:
         | > But if you need to choose a Founding Father of Computing
         | Science for the general public, I'd say Alan Turing is the best
         | candidate. Scholars will give due credit to Church, Zuse, von
         | Neumann and all the others.
         | 
         | I agree with this. It's certainly the case that I wish more
         | people knew of Alonzo Church and Kurt Godel, but you have to
         | realize in a "PR" sense that it's simply not going to be
         | feasible to teach the general public about their contributions.
         | 
         | And Turing's contributions were genuinely ground-breaking,
         | there's a reason that computer science is lousy with concepts
         | named after or by him (Turing machines, Turing-completeness,
         | even the word "computing" was arguably coined in "On Computable
         | Numbers"). He also thought long and hard about the
         | philosophical implications of computing in a way that others
         | didn't (the "Turing test" being the obvious example).
         | 
         | In addition: when a mathematically inclined person describes
         | any kind of mathematical concept to laymen, the first question
         | is always "Yeah, but what is that actually useful for?", asked
         | with a certain amount of disdain. With Turing, the answer is
         | powerful: "How about defeating the Nazis and laying the
         | foundation for modern society?". That case is harder to make
         | for Church or Godel: they obviously didn't work for GC&CS,
         | and "lambda calculus" as a concept is a much more abstract
         | thing than Turing machines, which laymen can readily understand
         | (i.e. it's "just" a computer).
         | 
         | Add to that the fact that Turing's story is not just about
         | computing, or code-breaking, it's also the story of the
         | suffering that society inflicted on gay men. The fact that he
         | was shamed into suicide is just all the more reason to
         | celebrate him now.
         | 
         | I agree with the basic point of the article, but I have no
         | issue with giving Alan Turing this title. He earned it.
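
A Turing machine really is simple enough for a layman to grasp: a tape, a read/write head, a state, and a small rule table. A minimal sketch in Python (the simulator and the unary-increment program below are illustrative, not from the thread):

```python
# Minimal Turing machine: a tape, a head, a state, and a rule table
# mapping (state, symbol_read) -> (symbol_to_write, move, next_state).

def run_tm(tape, rules, state="scan", blank="_", max_steps=1000):
    cells = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        if head >= len(cells):        # extend the tape with blanks on demand
            cells.append(blank)
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells).strip(blank)

# Example program: append one '1' to a unary number.
rules = {
    ("scan", "1"): ("1", "R", "scan"),  # skip over the existing 1s
    ("scan", "_"): ("1", "R", "done"),  # write one more 1 at the first blank
    ("done", "_"): ("_", "R", "halt"),  # then halt
}

print(run_tm("111", rules))  # 1111
```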
        
           | legostormtroopr wrote:
           | > The fact that he was shamed into suicide is just all the
           | more reason to celebrate him now.
           | 
           | Please don't diminish his legacy by repeating this lie.
           | Turing's suicide is contentious, and the evidence for it is
           | circumstantial at best. His documented behaviour showed none
           | of the common signs of suicide - there was no note, he had
           | plans for later in the
           | week, and none of his close friends noted any change in
           | behaviour.
        
             | UncleMeat wrote:
             | Suicide or not, his treatment by society was no less
             | heinous and repulsive. Even if he had lived a complete and
             | happy life, his story would have been a bright example of
             | the terror and evil of homophobia.
        
           | thanatos519 wrote:
           | Very much agreed about thinking deeply and having earned the
           | title!
           | 
           | He applied computational thinking all over the place, showing
           | great foresight in
           | https://en.wikipedia.org/wiki/Turing_pattern
        
         | kreetx wrote:
         | This is just a theory, but I think this (assigning some major
         | leap of science to few specific persons) is how society
         | remembers things. I.e., it is difficult, or even impossible, to
         | go into the intricate histories of how things actually
         | developed in middle or high school (and perhaps even in
         | college), thus the people teaching us simplify it to make it
         | easier to study and remember.
         | 
         | Once you start digging, you realize that nothing is that simple.
         | For example for physics, "Physics for Poets" by Robert H. March
         | is an eye opener.
        
           | netcan wrote:
           | A typical medieval depiction of a great siege might be one
           | king and two or three famous knights with a ladder assaulting
           | a 5 foot castle manned by another king and a knight.
           | Distilling stories to a handful of characters seems to make
           | it easier for us to digest. I suppose it's easier for us to
           | imagine ourselves as one of these people.
        
         | chubot wrote:
         | Tangential, but one thing I learned from dense computer history
         | book _The Dream Machine_ is that the term "von Neumann
         | architecture" improperly assigns credit:
         | 
         | https://en.wikipedia.org/wiki/Von_Neumann_architecture
         | 
         | von Neumann simply _described_ the work of Eckert and Mauchly
         | on the ENIAC in a written report. His name was on the report,
         | which made people think that he came up with the idea, which
         | was false. It also resulted in a patent dispute -- it's
         | interesting to imagine what would have happened if the concept
         | had been patented. The book goes into detail on this.
         | 
         | The wikipedia article also talks about Turing machines as a
         | precedent that store data and code in the same place. But
         | ironically I'd say that probably gives him too much credit! I
         | think a large share should go to the people who designed a
         | working machine, because it's easy to come up with the idea
         | of an airplane and much harder to make one work :) And to me it
         | seems unlikely that the Turing Machine, which was an idea
         | created to prove mathematical facts, was a major inspiration
         | for the design of the ENIAC.
         | 
         | Finally, even though the author of this web page has his own
         | credit dispute, I appreciate this elaboration on the credit
         | assigned to Turing.
        
         | pvg wrote:
         | _even Einstein stood on the shoulders of giants_
         | 
         | People have had that perched-on-giants feeling for some time:
         | 
         |  _This concept has been traced to the 12th century, attributed
         | to Bernard of Chartres. Its most familiar expression in English
         | is by Isaac Newton in 1675: "If I have seen further it is by
         | standing on the shoulders of Giants."_
         | 
         | https://en.wikipedia.org/wiki/Standing_on_the_shoulders_of_g...
        
           | OrderlyTiamat wrote:
           | I've heard this quote explained as an insult directed at one
           | of Newton's enemies (either Leibniz or Hooke), referencing
           | their short height. I'm not convinced it's true, but it is an
           | amusing possibility.
        
         | elzbardico wrote:
         | The problem is that the general public thinks that CS ==
         | Computers.
         | 
         | So, Founding Fathers of computing science becomes mixed -
         | starting with those low-brow thinkers we call journalists -
         | with the idea of Founding Father of computing. And this is not
         | only unfair, but technically wrong.
        
         | erhk wrote:
         | Disruption is a canary word to me now.
        
         | hoseja wrote:
         | Because those others mostly aren't anglos helping the war
         | effort.
        
         | donkeybeer wrote:
         | Everything builds on past work. Educated people, at least in
         | europe, were quite well connected and aware of each others
         | works in those times too.
        
         | yann2 wrote:
         | No one worked in isolation in the past either.
         | 
         | Move Newton, Faraday, Maxwell and Einstein 10kms away from
         | where they were born, surround them by a different set of
         | chimps, and the story doesn't end the same way.
         | 
         | A good book by Niall Ferguson - The Square and the Tower -
         | makes the case that historians have traditionally studied
         | individuals instead of groups because it's easier to collect
         | data on one chimp than on the entire troupe.
        
           | cxr wrote:
           | "I am, somehow, less interested in the weight and
           | convolutions of Einstein's brain than in the near certainty
           | that people of equal talent have lived and died in cotton
           | fields and sweatshops."
           | 
           | <https://graph.global/?id=2851>
        
           | mushishi wrote:
           | Yup. The influences on Newton - happening to delve into
           | Archimedes, Descartes and Fermat, then synthesizing their
           | inventions in his mind with a lot of time on his hands - or,
           | for that matter, Leibniz getting math tutoring from
           | Christiaan Huygens, seem to have been crucial to the
           | invention of fluxions/infinitesimals. (Approximately from
           | memory of reading Infinite Powers by Steven Strogatz.)
           | 
           | Doesn't diminish their achievement in my mind.
        
           | vmilner wrote:
           | Having lived both 10kms north and 10kms south of Newton's
           | birthplace (in more flat Lincolnshire farmland) I'm not sure
           | he's the best example for that argument!
        
           | denton-scratch wrote:
           | The idea that history is wrong to focus on "chaps" derives
           | from marxism; and Ferguson is very much anti-marxist. The
           | marxian view would be that historical change is the result of
           | economic forces; that if (e.g.) Turing hadn't done it someone
           | else would have, because economics was driving history in
           | that direction.
           | 
           | I'm sympathetic to the marxian view of Great Men; I think
           | it's no coincidence that the related work of Godel and Turing
           | was published within a couple of decades of one another, or
           | that the ideas of Copernicus, Kepler and Galileo emerged
           | around the same time as one another.
           | 
           | I'm certainly impressed by the greatness of Great Men; but
           | I'm hard-pressed to find one whose discoveries were so
           | remarkable, _in the context of their times_, that no one else
           | could have been expected to make similar discoveries around
           | the same time.
        
             | cxr wrote:
             | Alternative angle: among their insights and discoveries,
             | the successes will be shaped by survivorship bias. When
             | deciding what part of one's work to focus on, a person will
             | pursue the things that are close enough to other
             | contemporary work, because that provides a short
             | path to buy-in.
        
       | heurisko wrote:
       | I watched the "Imitation Game" and read the biography of Turing
       | by Andrew Hodges.
       | 
       | What I found fascinating about the biography was recognising what
       | Turing was describing in theoretical terms, as actual hardware
       | and software concepts today. I know some parts are somewhat
       | obvious, but it was still nice to trace the concepts back to
       | excerpts from his work.
       | 
       | The Imitation Game is a nice fiction, but doesn't portray Turing
       | as having very much agency, in comparison to the real Turing. I
       | think the fictional and real-life characters were completely
       | different.
        
         | jgrahamc wrote:
         | The Imitation Game is awful and should never be screened.
        
           | mlajtos wrote:
           | It is a good _movie_. It introduced Turing to the masses in a
           | compassionate way. It also showed normal people that
           | cryptography, despite being rather boring and mathy, is
           | extremely important in everyday life.
           | 
           | I also enjoyed Breaking the Code -
           | https://www.imdb.com/title/tt0115749/
        
             | arethuza wrote:
             | I would agree that it is an enjoyable movie but also
             | terrible history - but then most movies that deal with
             | historical events are wildly inaccurate, usually for fairly
             | understandable reasons.
             | 
             | I was lucky enough to see the stage version of _Breaking
             | the Code_ - Derek Jacobi was outstanding:
             | 
             | https://www.youtube.com/watch?v=vyDe8IWAxaY&t=168s
        
       | mhh__ wrote:
       | Relative to the 0 times I have ever heard his name mentioned vs.
       | the countless times I have read about him on manic midnight
       | wikipedia binges, I think Zuse seems very unsung.
        
         | jgrahamc wrote:
         | Agreed. Zuse did a lot of cool stuff.
        
         | amacbride wrote:
         | Not completely surprising (from Wikipedia):
         | 
         | "While Zuse never became a member of the Nazi Party, he is not
         | known to have expressed any doubts or qualms about working for
         | the Nazi war effort."
        
         | hutzlibu wrote:
         | "I think Zuse seems very unsung"
         | 
         | From what I know, this might be because he worked for the Nazis
         | and his prototypes were mostly destroyed in the war, somewhat
         | forgotten, and seem not to have been influential on the general
         | branch of computing.
        
           | TheOtherHobbes wrote:
           | Von Braun also worked for the Nazis.
           | 
           | Interesting to speculate what would have happened if Zuse had
           | been paper-clipped to the US and given a huge budget.
        
             | schlupa wrote:
             | One funny anecdote about Zuse during the war is that he
             | managed to save his Z4 because it was listed as V4 in the
             | paperwork. The Wehrmacht officers thought it was one of
             | the retaliation weapons V1, V2, V3, so V4 was very
             | important and got high priority to be hidden away
             | somewhere.
        
           | mhh__ wrote:
           | Oh sure. I know _why_, but I find computer engineering a bit
           | more interesting than theory so I'd rather read about him
           | than quite a few others.
        
           | adrian_b wrote:
           | There is an indirect influence of Zuse in the daily life of
           | most programmers.
           | 
           | The "for" keyword used in the "for" loops comes from Algol
           | 60, which inherited it from its predecessor from 1958, IAL.
           | 
           | The "for" loops were introduced in IAL by Heinz Rutishauser,
           | who had first used "fuer" loops (i.e. "for" in German) in
           | 1951, in a published paper describing a computer programming
           | language, 3 years before Fortran introduced the "do" loops
           | in 1954.
           | 
           | In his 1951 paper, Rutishauser cites Zuse's paper about a
           | programming language (published in 1948) as the inspiration
           | for his own improved programming language, which included
           | the "for" loops.
        
       | jmull wrote:
       | Well, this article certainly shoots itself right in the foot.
       | 
       | If the problem is that some important early contributions to CS
       | are being overlooked, the solution is to promote those
       | contributions.
       | 
       | By framing this as Turing vs. others, the focus is squarely on
       | Turing and his contributions. It puts itself in the position of
       | having to beat down and minimize Turing's contributions before
       | raising up other contributions. Pretty much setting itself up to
       | fail to convince very many.
       | 
       | Instead, e.g., present the narrative of those early
       | contributions, showing how they provided the foundation Turing
       | worked from.
       | 
       | (edit: I should add: I understand perfectly that the point of
       | this article probably isn't to actually convince anyone of
       | anything, but is just a hot take meant to get people worked up
       | for the page views. So mission accomplished, from that
       | perspective, I guess.)
        
       | bobthechef wrote:
       | I wonder how much of this is due to anglocentrism. Wouldn't be
       | the first time.
        
       | blazespin wrote:
       | The article is an immature and childish distraction from a
       | potentially interesting discussion (where do good ideas come
       | from).
       | 
       | There is no such thing as _The_ father of anything. That's just
       | one of the many incarnations of the insipid fantasy society has
       | that "Daddy" will solve all your problems.
       | 
       | Turing is absolutely _a_ father of computer science, and his
       | contributions are interesting.
       | 
       | This can be said for any scientist, historically speaking. Even
       | Einstein stood on the shoulders of giants.
       | 
       | Giving this article oxygen is a colossal waste of time. Just
       | because someone somewhere said some stupid thing isn't a reason
       | to write a 10,000-word essay on it with endless citations.
       | 
       | Kinda like what we are doing now...
        
       | NtochkaNzvanova wrote:
       | Of course Turing didn't invent computer science. Everyone knows
       | that Jurgen Schmidhuber invented computer science.
        
       | liorben-david wrote:
       | Name me a single mathematician - or honestly a single scientist
       | or thinker ever - that didn't stand on the shoulders of giants.
        
       | OhNoMyqueen wrote:
       | He's not oversold. His major feat (breaking Enigma using a
       | computer) is miscategorized into Computer Science, where it
       | should be classified into Computer Engineering and Software
       | Engineering, alongside other stars like Charles Babbage and Ada
       | Lovelace. His second major feat (the Turing Machine model) is one
       | stone among the many foundational stones of Computer Science.
        
         | iainmerrick wrote:
         | _He's not oversold. His major feat (breaking Enigma using a
         | computer)_
         | 
         | You're overselling him right there. That wasn't _his_ feat; he
         | was just on the team (and joined after Enigma had initially
         | been broken).
        
         | adrian_b wrote:
         | Yes, "X oversold" is a poor choice of words for cases like
         | Turing or Einstein.
         | 
         | The contributions of people like Turing and Einstein were as
         | important as they are claimed to be; they are not oversold.
         | 
         | On the other hand, I am also very annoyed that the majority of
         | people have some idea about the importance of Turing or
         | Einstein, but they are completely unaware that there were many
         | other contemporaneous scientists whose contributions were
         | equally important and neither Turing nor Einstein nor any of
         | their many peers could have created their own original work
         | without using the work of the others.
         | 
         | These myths about the singular geniuses are reflected in the
         | stupid laws about the so-called "intellectual property".
         | 
         | There exists no new "intellectual property" that does not
         | incorporate 99% of old "intellectual property", but this is not
         | handled correctly in patent claims or in fair use conditions.
         | 
         | If the current laws about patents and copyrights had been
         | applied during the initial evolution of the electronics
         | and computer industries, those industries would never have
         | progressed to the level of today.
        
       | TimTheTinker wrote:
       | This article gives far too little credit to John von Neumann (not
       | to mention Turing).
        
       | KingOfCoders wrote:
       | "especially in the Anglosphere".
       | 
       | Just like the Wright brothers. And on and on and on. It pays off
       | if you control culture.
        
       | ocschwar wrote:
       | No.
       | 
       | What Turing did in 1937 might not have been an advance over what
       | Godel and Church did. But if you want to make the case for
       | building a general purpose computer out of mechanical relays or
       | vacuum tubes in 1946, you need a semantic advance. Turing
       | machines did that.
       | 
       | Nobody would read Godel and say "let's build ENIAC." Tommy
       | Flowers did read Turing and say that. THAT is the difference.
       | 
       | It's like Einstein versus Lorentz: same mathematics, different
       | semantic interpretation.
        
       | pyentropy wrote:
       | The average person has no clue what theoretical computer science
       | is.
       | 
       | But science fields do need marketing. All children have heroes
       | they look up to. Putting focus on Turing's achievements is merely
       | creating a pop star figure in the mainstream, which I think is a
       | good thing: a smart dude works on a problem that helps win World
       | War 2 and now powers your phone and your TikTok app. Once you are
       | actually interested in the field you can work out the nuances and
       | the falsehoods in that claim.
       | 
       | Evaluating earlier work in some field throughout history always
       | leads to a complex graph of achievements, but you cannot put that
       | graph in the name of an annual prize. Do we change "Turing Award"
       | to "Frege-Cantor-Godel-Church-Turing"?
        
         | OneEyedRobot wrote:
         | >All children have heroes they look up to.
         | 
         | After reading that I sat here for a minute and racked my brain
         | as to who my childhood 'hero' might be. I can't remember a
         | single person.
         | 
         | It's amusing to me how much intellectual work deals in a
         | currency of status. Getting/giving credit for things appears to
         | be the Prime Directive, at least among the observers. We've now
         | graduated to not only stressing who is responsible but what
         | demographic groups they are a part of.
         | 
         | Now, it could be that the real deal groundbreaking folks don't
         | give a damn. Tip o' the hat to those people.
        
         | Banana699 wrote:
         | >a smart dude works on a problem that helps win World War 2 and
         | now powers your phone and your TikTok app.
         | 
         | The vast majority of people working anywhere near mathematics,
         | physical sciences or electrical engineering (the 3 founding
         | pillars of CS) in the 1920s and 1930s probably worked on
         | problems related to WW2 during WW2. You can equally state that
         | motivating claim for a _lot_ of other people.
         | 
         | I think Turing gets the Media Treatment^TM because there's a
         | lot of Tragic Hero Energy in his story:
         | 
         | <A gay man in an era that summarily rejected him [and we tell
         | this story in an era that is extremely oversensitive and hyper-
         | reactive to this particular sort of injustice]; a smart, shy
         | pupil whose closest childhood friend (and suspected lover) died
         | early of a now-extinct illness; a mathematician who dreamed of
         | how numbers and lookup tables could hold a conversation, saw
         | them used and counter-used to destroy cities and murder
         | millions, then was finally rewarded with prison and humiliation
         | by the people he fought for.>
         | 
         | Turing himself of course deserves all praise and glory and the
         | righteous anger for how he was treated in his last years, but I
         | think our era's affinity for him is just the old mechanism of
         | people digging the past for battles that reflect the moral
         | values they're currently (fighting for|winning|losing), see
         | also the misguided claim that Ada Lovelace is the first
         | "computer programmer", usually followed by a looong screed
         | about Women In Tech.
         | 
         | We just like a good story to exaggerate and make it reflect our
         | current moral memes, and the idea of a Man|Woman Ahead Of Their
         | Times is a catch by this standard.
        
           | OneEyedRobot wrote:
           | >a smart dude works on a problem that helps win World War 2
           | and now powers your phone and your TikTok app.
           | 
           | So much for the Polish Cipher Bureau. Not so many tragic hero
           | opportunities there.
        
           | UncleMeat wrote:
           | Oversensitive and hyper-reactive?
           | 
           | Jailing or sterilizing gay people for having sex is evil. End
           | of story. It has only been 20 years since this was the law in
           | some US states. I see no reason why vigorous rejection of
           | this sort of policy as monstrous can possibly be seen as
           | "oversensitive and hyper-reactive".
        
             | Banana699 wrote:
             | You missed the entire point.
             | 
             | The point isn't that this oversensitivity is misplaced, the
             | point is that it's moral outrage porn that the tellers of
             | the story use in a smart way to get a reaction from you.
             | 
             | This isn't necessarily a bad thing if it's just one or two
             | stories among others; after all, the purpose of art is to
             | get strong reactions out of its audience. But when every
             | such story has to lean hard into the moral aspect to the
             | exclusion of all else, it becomes a trite children's story
             | told for nothing but feel-good points.
             | 
             | Consider the number of articles on Trump during his
             | presidency. How much of it was high-quality investigative
             | journalism telling you things you don't know, and how much
             | was "Trump tweeted something shockingly stupid, here are a
             | list of experts you don't need telling you this is
             | shockingly stupid, this means End Of Democracy (EOD)"? The
             | latter articles are technically true, but they're trite and
             | accomplish nothing but pulling on your memetic levers to
             | get you to like/share/feel-bad|good-all-day.
        
       | ur-whale wrote:
       | Jurgen Schmidhuber forging on in his never-ending quest to slay
       | historical misattributions and generally right all wrongs.
       | 
       | We can count ourselves lucky this time: at least, the topic isn't
       | himself.
       | 
       | On a positive note, he can't be accused of being inconsistent.
        
       | Animats wrote:
       | This has changed over time. In early computing, the people
       | mentioned were Eckert and Mauchly, who designed the ENIAC, and
       | von Neumann, who figured out how a practical CPU should be
       | organized. Also Shannon, who figured out that relays basically
       | did Boolean algebra and could be analyzed formally.
       | 
       | Bletchley Park work was still classified in those days, so not
       | much was known about that. Much to the annoyance of Tommy Flowers,
       | who built the Colossus code-breaking machine but couldn't mention
       | it in post-war job-hunting. (Incidentally, Colossus was not a
       | general-purpose computer. It was a key-tester, like a Bitcoin
       | miner ASIC.)
       | 
       | As I've pointed out before, the big problem was memory. IBM had
       | electronic addition and multiplication, with tubes, in test
       | before WWII. But the memory situation was really bad. Two tubes
       | per bit. Or electromagnetic relays. Or electromechanical counter
       | wheels, IBM's mainstay in the tabulator era. To store N bits, you
       | had to build at least N somethings. Babbage's Analytical Engine
       | called for a memory of a ring of rows of counter wheels, a
       | mechanism the size of a locomotive. Access time would have been
       | seconds.
       | 
       | Much of early computing was about kludges to get some kind of
       | memory. The "Manchester Baby" had a CRT-based memory, the
       | Williams tube. That was the first computer to actually run a
       | program stored in memory. Frederic C. Williams, Tom Kilburn, and
       | Geoff Tootill, 1948.
       | 
       | After that, everybody got in on the act. Mercury tanks and long
       | metal rods were built. Really slow, and the whole memory had to
       | go past for each read, so twice the memory meant twice the access
       | time. Then there were magnetic drum machines. Magnetic tape.
       | Finally, magnetic cores. At last, random access, but a million
       | dollars a megabyte as late as 1970. Memory was a choice between
       | really expensive and really slow well into the 1980s.
       | 
       | Turing was involved with one of the early machines, Pilot
       | ACE/ACE. But he quit before it was finished.
        
       | netcan wrote:
       | Just the references here are probably an amazing resource for
       | early computer science, and I'm not going to argue against such a
       | force.
       | 
       | Seems to be a lot of uneasiness, of late, about the way credit is
       | allocated in science. IMO, it's mistaken to point this at the
        | top: Nobel laureates, heroic icons like Einstein or Turing. These
        | figures are _supposed_ to be idolized and idealized. Yes, this is
        | "untrue," technically. But, it serves many purposes. A Nobel
        | Prize win elevates _science_ by singling out scientists for hero
       | status. Achilles elevated Greece by giving Greeks something to
       | collectively aspire to or adulate.
       | 
       | If you're already deeply interested in computer science, of
       | course the detailed narrative recognizing dozens of brilliant
       | early computer scientists is richer. Of course!
       | 
       | Where poor credit allocation matters isn't historical hero
       | narratives, it's at the working scientist level. The grants &
       | positions level. Here, it's important to be accurate, fair, etc.
       | Being inaccurate, unfair or corrupt at this level creates actual
       | deficits.
        
         | mcguire wrote:
         | " _Seems to be a lot of uneasiness, of late, about the way
         | credit is allocated in science._ "
         | 
          | This has always been the case---medieval and Renaissance
         | thinkers would publish anagrams of their key findings because
         | they didn't want to give someone else the advantage of knowing
         | the finding but also wanted to prove that they thought of the
         | idea. IIRC, Isaac Newton did not publish any of his findings
         | until someone else threatened to publish their independent
         | results. And he's known as the creator of calculus because the
         | British Royal Society got into an academic slap-fight with the
         | French.
        
         | paganel wrote:
         | Not scientists but some of the best writers of the 20th century
         | never got a Nobel, I'm thinking especially about Proust and
         | Kafka (and I would say Celine was more worthy of the Nobel than
         | Camus and especially Sartre), I'm sure the same thing happens
         | in science in regards to this Swedish prize.
        
           | netcan wrote:
           | True, but writing has many forms of hero culture. Tolkien
            | doesn't need a Nobel, neither does Kafka. They became heroes
           | regardless.
        
         | simorley wrote:
         | > Seems to be a lot of uneasiness, of late, about the way
         | credit is allocated in science.
         | 
         | Of late? You should read up on Newton/Leibniz hysterics over
         | who invented calculus. The arguments over who invented the
          | first light bulb, car, etc. Whether Greek knowledge (the
          | foundation of Western civilization) originated in the Near
          | East or even in India. Heck, people still argue about who
          | "discovered" America first. There is national, religious,
          | ethnic, racial, gender, sexuality pride tied to "priority".
          | It's not just in science/math; it applies to everything.
         | 
         | > These figures are supposed to be idolized and idealized
         | 
         | Why? They weren't particularly good people. Neither were
         | saints.
         | 
         | > Achilles elevated Greece by giving Greeks something to
         | collectively aspire to or adulate.
         | 
         | Are you talking about science/knowledge or politics? But you
         | are right on the point. It's what this is all about at the end
         | of the day. Politics.
         | 
         | Without politics, the discovery/knowledge would be what is
         | important. Because of politics, the people become the focal
         | point.
        
         | breuleux wrote:
         | Why do we need idols, though?
         | 
         | If there was no narrative, no idols, no celebrities, would
         | people be less motivated to do science? Why do we need to lie
         | to ourselves so?
         | 
         | > If you're already deeply interested in computer science, of
         | course the detailed narrative recognizing dozens of brilliant
         | early computer scientists is richer. Of course!
         | 
         | Personally I'm mostly uninterested in who did what, but maybe
         | that's just me. It seems obvious to me that nearly every
         | scientific discovery could have been done equally well by
         | millions of people, it's just a matter of who had the resources
         | to be educated, who decided to research the problem, who
         | managed to snipe the answer first, and who had the right
         | connections to get it acknowledged. They're still great
         | achievements, for sure, but they're not the markers of
         | exceptional genius we want to think they are, not for Turing or
         | Einstein, but not for anyone at all, really.
        
           | reedf1 wrote:
           | Idolatry seems like an emergent property of human collective
           | consciousness. You can try to ignore it (it's been tried),
            | downplay it (also been tried), or ban it (and again).
        
           | netcan wrote:
           | >>They're not the markers of exceptional genius we want to
           | think they are, not for Turing or Einstein, but not for
           | anyone at all, really.
           | 
           | The point isn't to prove that they're special. The point is
           | that something special happened and these people are
           | designated symbols for that... and they're kind of selected
           | for being good at this. We're not doing this for them,
           | they're dead. The celebrity of Einstein is a deification of
           | his relativity theories. We need idols for our symbolic
           | world, to work without them in the real one.
        
             | breuleux wrote:
             | But what purpose do these idols or symbols serve, exactly?
             | I'm speaking as someone who doesn't care who came up with
             | relativity and doesn't care whether there is a founding
             | person of computer science or not let alone who that would
             | be, and would like to know what others see. Is it an
             | inspiration thing? A motivation thing?
        
               | PartiallyTyped wrote:
               | Not necessarily a motivational thing, but events such as
                | these become widespread and allow for easier
               | dissemination of information.
               | 
               | It's easy to see then that such events allow for the
               | eventual "recruitment" of other scientists, and in
               | showing society that "science is working" and "solves
               | important problems".
               | 
               | Both of which serve to enrich the scientific world with
               | new researchers and funding to keep the engine running.
        
               | bentonite wrote:
                | I'd say it's a bit of both inspiration and motivation.
               | That said, I think the main motivators for these kinds of
               | idols/heroes are to craft ethical or normative stories
               | for how people should (or shouldn't) behave as well as to
               | assist with teaching people theories and concepts.
               | 
                | Learning about why correlation doesn't equal causation
                | (and spurious correlations) is more impactful if you
                | also learn about Wakefield's sins at the same time. He's
                | a villain.
               | 
               | Archimedes and the bathtub is a great story - and I
               | learned it in elementary school and still remember it and
               | the lessons it teaches. We like to associate people with
               | events and they help for learning and retaining
               | information.
        
           | slibhb wrote:
           | > Why do we need idols, though?
           | 
           | Because we're flesh and blood, i.e. utterly irrational.
           | 
           | > If there was no narrative, no idols, no celebrities, would
           | people be less motivated to do science? Why do we need to lie
           | to ourselves so?
           | 
           | Yes, definitely, a huge amount of what motivates scientists
           | is desire for fame, being considered a genius, Nobel prizes,
           | scientific immortality, and so on. It is entirely unrealistic
           | to imagine that we can stop being like this, it's almost a
           | religious belief, akin to thinking that, one day, people can
           | live without sin.
           | 
           | > Personally I'm mostly uninterested in who did what, but
           | maybe that's just me. It seems obvious to me that nearly
           | every scientific discovery could have been done equally well
           | by millions of people, it's just a matter of who had the
           | resources to be educated, who decided to research the
           | problem, who managed to snipe the answer first, and who had
           | the right connections to get it acknowledged. They're still
           | great achievements, for sure, but they're not the markers of
           | exceptional genius we want to think they are, not for Turing
           | or Einstein, but not for anyone at all, really.
           | 
            | This may be an accurate description of your personality, in
            | which case you're one in a million, or it may be that you're
           | ignorant about the things that actually drive you. The vast
           | majority of people are driven by some kind of desire for
           | fame, recognition, status, upvotes, and so on.
           | 
           | Suggesting that Turing and Einstein were not "exceptional
           | geniuses" is bizarre. Even in proper context, they were
           | exceptional geniuses, just among other, lesser-known,
           | exceptional geniuses. If we take your view seriously, we
           | remove all human agency and uniqueness, we remove the idea of
           | an "achievement" and we can only give credit to luck, the
           | historical process, and various contingent circumstances.
           | Even if your view is accurate, people simply cannot live that
           | way. Creating narratives is part of what makes us human and
           | narratives need protagonists (idols, heroes, whatever).
        
             | breuleux wrote:
             | > This may be an accurate description of your personality,
             | in which you're one in a million, or it may be that you're
             | ignorant about the things that actually drive you. The vast
             | majority of people are driven by some kind of desire for
             | fame, recognition, status, upvotes, and so on.
             | 
             | Or it might be that people who are driven by fame and
             | recognition are more likely to become famous than those who
             | aren't, which skews our idea of what motivates people.
             | Given how emphatic society is about fame and money as
              | markers of success, I feel people tend to be mistaken in
              | the other direction: many people think they are, or should
              | be, driven by fame or money even when it simply contradicts
              | their personality.
             | 
              | Even if it were indeed the case that most people are
             | motivated by fame, I think those who aren't are more like 1
             | in 3 or 1 in 4 than 1 in a million. It might be 1 in a
              | million among actually famous people, but not in the
             | population at large.
             | 
             | > Even in proper context, they were exceptional geniuses,
             | just among other, lesser-known, exceptional geniuses.
             | 
             | If I am correct that millions of people had the capability,
             | that would place "exceptional genius" at 1 in 1000, or 1 in
             | 10000. I think that's a reasonable ballpark.
             | 
             | > If we take your view seriously, we remove all human
             | agency and uniqueness, we remove the idea of an
             | "achievement" and we can only give credit to luck, the
             | historical process, and various contingent circumstances.
             | 
             | Whether we acknowledge exceptional geniuses or not, it
             | remains the case that 99.99% of people are not exceptional
             | geniuses. Are you saying these people don't have agency, or
             | that they aren't unique? I think we all have agency, we're
             | all unique, and we all have achievements. Some achievements
             | are more impactful than others, some achievements are more
             | impressive than others, but these are not necessarily the
             | same, and neither is necessarily remembered, because what
             | matters most is not the person or the achievement, but how
             | the story fits in the narrative. In any case, you don't
             | need to care about that narrative to care about or
             | acknowledge agency, uniqueness or achievement.
        
         | hyperpallium2 wrote:
         | An aspect of lionization is a wish and motivation to emulate
         | and become heroic oneself...
         | 
         | But one soon realizes you probably won't get heroic credit even
         | if you do contribute something heroic, neutralizing that
         | encouragement.
         | 
         | Therefore, you'd better do it for the love of the work itself
         | or for how it helps others.
         | 
          |  _There's no limit to what you can accomplish if you don't
          | mind who gets the credit._
        
         | simonh wrote:
         | Right, we shouldn't underestimate the importance of narratives.
         | We need narratives about the theoretical foundations of
         | computer science, and Turing is the perfect figure to weave
         | many of those narratives around. It's good for young people and
         | the general public, and good for the field.
         | 
         | The Turing machine is a key conceptual model for understanding
         | the basics of computation. The Turing Test is a great model for
         | thinking about what being intelligent means. Hardly a week goes
         | by without the term Turing Complete appearing somewhere in a HN
         | comment. The fact that he also played an important role in the
         | design and construction of actual practical computing machines,
        | and did so to fight the Nazis, seals the deal.
         | 
         | Of course there's more to it, there's plenty of credit to go
         | around, but Turing is the perfect entry point for people to
         | appreciate and learn more about all the work that went into the
         | founding of computer science. It elevates the profile of the
         | whole field.
        
           | netcan wrote:
           | Don't forget the tragic way society later betrayed him.
        
           | roenxi wrote:
           | We also shouldn't underestimate the importance of truth.
           | Dealing with the world as-it-is has better results than
           | interacting with a story we'd like to be true but isn't.
           | 
           | People waste their lives in service of causes and ideas that
           | just are not grounded in reality. Not just in the
           | philosophical sense that we cannot know truth, but in the
           | practical sense of "the outcome you want will not flow from
           | the actions you are taking today". Narratives are inferior to
           | truth when it comes to making decisions.
        
             | simonh wrote:
             | I'm not advocating telling lies. Sometimes we simplify, and
             | doing so can be perfectly appropriate. Unfortunately that
             | does open the stage for nitpicking and pedantry.
        
               | nzmsv wrote:
               | Disagree. A finely crafted but ultimately false story can
               | be actively harmful. A young person may think that they
               | are not of the same caliber as "the greats" and cannot
               | make their mark on a field, which would discourage them
               | from trying. All the while in reality "the greats" were
               | never as great as the historians later depicted them.
               | "Come on in, collaborate, and make a difference" would be
               | a much more positive message and wouldn't be any harder
               | to explain than what amounts to the creation of
               | personality cults.
        
               | UncleMeat wrote:
               | This is where the humanities has the tech world beat.
               | While we quibble over correct narratives and seek one
               | option, the humanities has been completely soaked in the
               | idea that there are nearly unlimited narratives that
               | describe any given human endeavor and they weave together
               | into a rich and ever-changing tapestry.
               | 
               | This is why a historian can read, understand (both the
               | pros and the cons), and respect books that represent an
               | economic history, a social history, an information
               | history, a microhistory, and even a great-man history of
               | a given subject without trouble.
               | 
               | More reason for engineers to take humanities courses!
        
               | nzmsv wrote:
               | So can I continue to prefer my narrative? It seems to
               | gather some upvotes and some downvotes, so at least it is
               | interesting and elicits a reaction :)
               | 
               | Also, the more I learn about my heroes the more I realize
               | that they never saw themselves as ubermensch. If
               | anything, self doubt seems to be the common thread. I
               | think this angle does not get enough attention.
               | 
               | However, I agree with you on a broader point. This is
               | just one perspective. Here is another one: Turing the
               | historical figure is necessarily oversold because many
               | more people than Turing the real person contributed to
               | his aggrandizement. Like all cultural icons, Turing the
               | idea outlived and outshined Turing the man.
        
               | Dylan16807 wrote:
               | > So can I continue to prefer my narrative? It seems to
               | gather some upvotes and some downvotes, so at least it is
               | interesting and elicits a reaction :)
               | 
               | I think most of that reaction isn't coming from your
               | narrative on history, it's from accusing other
               | commenter's narratives of being _false_.
        
               | simonh wrote:
               | Nobody here is advocating telling false stories. Saying
               | that Turing laid the foundations for computer science is
               | not false. It's a perfectly valid opinion to hold. We
               | might say it's a simplification, or even an exaggeration,
               | arguably saying he's one of them might be better, but
               | it's not a false statement.
        
               | sandwall wrote:
                | Storytelling is how our society has transferred
                | information since we started to communicate--
                | understanding that the map is not the territory, nor
                | should it be. A beautiful narrative can convey important
                | kernels more efficiently than endless minutiae.
                | Awareness of this is important, and elaborations are
                | helpful for those interested in the details.
               | 
                | I'm reminded of "The Glass Bead Game", which discusses
                | an academic society that forgets the names of
                | contributors, since they're all just part of the flow of
                | humanity.
        
               | TeMPOraL wrote:
               | Arguably our storytelling was an efficient hack in an age
               | before writing. A story is a very high-overhead, low SNR
               | way of communicating kernels of truth, but it's robust
               | over time, so it allowed transfer and accumulation of
               | knowledge across societies and generations.
               | 
               | But then we've invented and perfected writing, developed
               | symbolic languages and notations (e.g. math, musical),
               | long-duration storage media for text, and eventually
               | networked digital computers. In terms of communicating
               | and preserving knowledge, stories are pretty much the
               | _worst possible option you can choose_.
               | 
               | We're comfortable with narratives because we didn't have
               | anything else for hundreds of thousands of years. Stories
               | are pretty much hardwired into our brains. But that
               | doesn't make them the right choice, now that we've
               | figured out much better alternatives.
               | 
               | More than that, I'm personally suspicious of stories
               | being used in communication. There's no good reason to
               | use them, and there's plenty of bad ones - it so happens
               | that what makes a good story robust over time is the same
               | thing you need to manipulate people into believing lies
               | and shut off critical thinking.
        
               | NineStarPoint wrote:
               | The main benefit of stories is that they are easier for
               | people to remember than dry details. In terms of
                | communicating knowledge, they are the form most likely
                | to stick with us as opposed to going in one ear and out
                | the other, especially when it comes to areas where
                | someone doesn't have expertise. This is, as you noted,
                | incredibly prone to manipulation, but it doesn't change
                | that if you want a random person picked off the
               | street to actually synthesize the knowledge you're trying
               | to tell them, a story is by far the way most likely to
               | work. And I'd say that's important, since knowledge
               | written down somewhere that no one remembers or cares
               | about does nothing to change the way people act.
               | 
               | As far as preserving information goes, no argument there.
               | Stories aren't a good way to preserve the truth of
               | matters for future generations. To look and determine if
               | the stories told have truth in them requires more
               | detailed writing.
        
               | mrtranscendence wrote:
               | Stories place ideas into context, not only making them
               | easier to remember (as mentioned by another comment) but
               | also easier to understand. Analytic philosophers are used
               | to dry, precise language, but even they often rely on
               | scenarios and narratives -- this can help reveal what the
               | reader thinks intuitively and bring that into sharper
                | contrast. By remaining story-free you're giving pedagogy
                | short shrift.
               | 
               | What has empirically brought more folks into careers in
               | science, dry textbooks foisted by teachers or Star Trek?
               | I'd argue Star Trek and science fiction more generally.
               | You can chalk that up to human failings if you like, but
               | inspiration is a need that can't be avoided if you wish
               | to convince.
        
               | Retric wrote:
               | The issue with stories is they focus on unimportant bits
               | often for propaganda reasons. Pick some arbitrary first
               | and every country can find someone to play up as a home
                | town hero. The US just happens to be rather quiet about
                | who "invented" electricity, but longer-lasting
                | incandescent lightbulbs and kites in lightning storms,
                | that's the ticket. The British tend to streamline the
                | Bletchley Park narrative by dropping the preceding Polish
                | contribution, etc.
               | 
               | In that context narratives end up glorifying endless
               | minutiae.
        
               | nextaccountic wrote:
               | Setting the record straight on this matter isn't
               | nitpicking and pedantry, it's just giving credit where it
               | is due. Since the intent of the "simplification" isn't to
               | deceive, this shouldn't be a problem.
        
       | wirthjason wrote:
        | A fascinating book on Turing and Church is "The Annotated
        | Turing". It's a walk-through of Turing's paper with analysis and
        | commentary.
        | 
        | It was written by Charles Petzold, who also wrote the immensely
        | popular book "CODE".
       | 
       | https://www.amazon.com/Annotated-Turing-Through-Historic-Com...
        
         | arduinomancer wrote:
         | +1
         | 
         | It's pretty easy to understand from an average programmer's
         | perspective
         | 
          | I found the proofs at the end a bit hard to follow, but it's
          | not really critical to understand them if you just want to know
          | what a Turing Machine is and the history/context behind it.
         | 
          | I thought it was really interesting how Turing defines what
          | are essentially "macros" for the machine, for example copy or
          | erase.
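          | 
          | A toy simulator (illustrative Python, not Turing's original
          | skeleton-table notation; the state names and symbol encoding
          | here are made up) shows how such a macro expands into
          | primitive transition rules:

```python
# Minimal Turing machine simulator illustrating a "macro": an erase
# routine that blanks the tape rightward until it hits a blank cell.
# State names and encoding are illustrative, not Turing's notation.

def run(tape, rules, state="start", head=0, max_steps=10_000):
    tape = dict(enumerate(tape))               # sparse tape, blank = ""
    for _ in range(max_steps):
        if state == "halt":
            break
        sym = tape.get(head, "")
        write, move, state = rules[(state, sym)]
        tape[head] = write
        head += {"R": 1, "L": -1, "N": 0}[move]
    return [tape.get(i, "") for i in range(min(tape), max(tape) + 1)]

def erase_rules(done_state):
    """Expand the 'erase' macro into primitive rules, parameterized by
    the state to enter afterwards (skeleton tables did the same)."""
    return {
        ("erase", "1"): ("", "R", "erase"),    # blank the cell, move on
        ("erase", "0"): ("", "R", "erase"),
        ("erase", ""):  ("", "N", done_state), # first blank: finished
    }

rules = {("start", "1"): ("1", "N", "erase"),
         ("start", "0"): ("0", "N", "erase"),
         **erase_rules("halt")}

print(run(list("1011"), rules))   # every cell blanked
```

          | The point of the book's treatment is the same: once you can
          | name a routine like "erase" and splice it in, machine tables
          | start to look like programs.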
        
       | 0x456 wrote:
       | There is also the related "Matthew effect"
       | 
       | https://en.wikipedia.org/wiki/Matthew_effect#Sociology_of_sc...
        
       | kazinator wrote:
       | I feel that this article wastes effort on attacking a bit of a
       | strawman.
       | 
        | In the fields of computing and mathematics, nobody (hopefully)
        | believes bunk like the claim that Turing started computer
        | science or invented the first computer. My reaction to all that
        | was, "what? who thinks that, and why don't they check
        | Wikipedia?"
       | 
       | If such beliefs are circulating among laypeople, it is good to
       | debunk them. But doing so in this article (especially while
        | failing to acknowledge that they are strictly lay myths that no
        | mathematician or computer scientist believes) detracts from its
        | main thesis, which is about the excessive attribution to
        | particular individuals, while others are ignored/forgotten.
       | 
       | Turing is not excessively attributed with anything in our field.
       | He's frequently referenced, mainly because he articulated a
       | concrete model of computation which can be simulated and using
       | which proofs can be made about computability.
       | 
        | I mean, academia doesn't habitually make up lies about who did
        | what, especially lies that don't come from that person or
        | persons themselves. I.e. if you don't misrepresent your work
        | yourself, the field is generally not going to step in and do
        | that for you.
       | Embellishing the exploits and contributions of some eminent
       | persona may be what some careless journalists or bloggers
       | sometimes do, but that's neither here nor there.
        
       | yodelshady wrote:
       | It's certainly true that Turing didn't invent computer science.
       | As a point of order:
       | 
       | > A popular British movie [The Imitation Game] even went so far
       | as to say he invented the computer.
       | 
       | The Imitation Game that was financed by a US production company,
       | with an American screenwriter and Norwegian director? _That_
       | British movie?
        
       | ajkjk wrote:
       | I feel like it's really weird to call what Godel was doing
       | computer science.
        
         | amelius wrote:
         | I feel that computer science is really the wrong word.
         | 
         | It's like calling astronomy "telescopy", to paraphrase
         | Dijkstra.
        
           | jessaustin wrote:
           | "Computation" would have been a more felicitous term for the
           | field, but that ship has probably sailed.
        
         | stepbeek wrote:
          | Encoding logical statements into numbers is foundational, but I
         | do see your point. I don't know of any evidence that says that
         | Godel was interested in automating computation through his
         | encoding.
         | 
          | That being said, I view Godel's contribution as so mind-blowing
          | that I can't help but privately think of him as the founder.
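          | 
          | The encoding itself fits in a few lines. A toy version
          | (illustrative Python; the prime-power scheme is Gödel's, the
          | helper names are mine):

```python
# Toy Gödel numbering: encode a symbol sequence as a product of prime
# powers. Illustrative only; Gödel's actual scheme also encodes whole
# formulas and proofs this way.

def primes(n):
    """First n primes by trial division (fine for toy inputs)."""
    ps, k = [], 2
    while len(ps) < n:
        if all(k % p for p in ps):
            ps.append(k)
        k += 1
    return ps

def godel_number(codes):
    """Sequence (c1, c2, ...) -> 2^c1 * 3^c2 * 5^c3 * ..."""
    g = 1
    for p, c in zip(primes(len(codes)), codes):
        g *= p ** c
    return g

# "0 = 0" using Gödel's own symbol codes (0 -> 6, = -> 5):
print(godel_number([6, 5, 6]))    # 2^6 * 3^5 * 5^6 = 243000000
```

          | Unique factorization guarantees the encoding is reversible,
          | which is what lets statements about numbers talk about
          | statements.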
        
         | jameshart wrote:
         | Right - in Godel's mind, he was trying to model how mathematics
         | works - the same applies to the problem Hilbert and co were
         | framing in the Entscheidungsproblem. They were wrestling with
         | the foundational philosophy of how powerful mathematics could
         | ever claim to be.
         | 
         | Turing I think resonates as a computer scientist more than a
         | mathematician for the same reason that Ada Lovelace does: both
         | of them shared, and made explicit in how they approached
         | problems, the insight of the generalizability of computability
         | beyond mathematics. Where Babbage saw a calculating machine
         | spitting out numerical tables, Lovelace saw music and pictures.
         | And not in the implicit reductive way most mathematicians
         | assume that anything important can be modeled as mathematics:
         | because if it can't be modeled mathematically, it can't be
         | important. Turing and Lovelace both seemed to get that this is
         | a mathematics that can act on _anything_ , and that that's what
         | makes the mathematics interesting.
         | 
         | The extension beyond 'and this lets you derive any provable
         | true statement within a formal system in finite time' to 'and
         | this lets you carry out any deterministic transformation on an
         | arbitrary piece of text' to something like 'if machine
         | translation is possible this machine can do it' or 'a
         | sufficiently sophisticated machine like this could hold a
         | convincingly human conversation' is a through-line you can only
         | make with Turing's insights, not just Church or Godel's.
         | 
         | And sure, you need Shannon to give you a framework for
         | information representation that extends it to _any form of
         | data_ and dozens of other contributions of _course_ , and
         | without the mathematical foundations of Church and Godel et al
         | there's no foundation to build it on at all. But Turing's
         | bridge out of the mathematical philosophy world into the realm
         | of stuff engineers like Zuse were building seems like as good a
         | moment as any to draw a line and say 'this is computer
         | science'.
        
         | progre wrote:
          | It makes sense to me. The "computer science" to "practical
          | computer work" relationship is about as distant as fundamental
          | physics is from industrial chemistry.
        
           | ajkjk wrote:
           | yeah, but what he was doing was much more in the realm of
           | logic and set theory, which is to say mathematics, not
           | concerned with anything to do with computation.
        
       | qwerty456127 wrote:
       | All politicians and state officers should be required to pass the
       | Turing test.
        
       | Labo333 wrote:
       | I'm not going to comment on the actual content that is mostly [1]
       | scientifically correct, but Schmidhuber (the author) has a record
       | of wanting to be the center of attention [2] (even though LeCun
       | is not better on that matter). Also, a third of the sources are
       | written by him...
       | 
       | Just look at his previous blog post [3], in which he explains
       | that the most cited neural networks all cite works by him. These
       | papers cite dozens of papers, so a lot of other groups that are
       | active in AI can claim the same thing...
       | 
       | [1]: For example, Turing published an independent proof of the
       | undecidability of the Entscheidungsproblem, in the [TUR]
       | article, just a month after Church, a fact the article neglects
       | to highlight.
       | 
       | [2]: https://en.wikipedia.org/wiki/J%C3%BCrgen_Schmidhuber#Views
       | 
       | [3]: https://people.idsia.ch/~juergen/most-cited-neural-nets.html
        
         | IshKebab wrote:
         | > of wanting to be the center of attention
         | 
         | It seemed more like he felt he was being unfairly denied
         | credit. Which is probably why he wrote this - he now cares
         | deeply about giving credit to the right people.
        
           | bodge5000 wrote:
            | Surely the more noble cause for that would be giving more
            | credit to others, rather than attempting to take away
            | credit from a well known figure. This article is somewhat
            | about the other important figures on whose work Turing's
            | was built, but its central point is that Turing gets too
            | much credit.
           | 
           | I understand why he'd care about that if he'd been uncredited
           | and watched peers be overcredited, but I'd hardly call it a
           | noble work, even if it is understandable.
        
             | 6gvONxR4sf7o wrote:
             | The article is full of credit given to a huge number of
             | people.
        
               | bodge5000 wrote:
                | The article is called Turing Oversold, and it is all
                | about who should be getting credit instead of Turing.
                | This isn't "Hey, are you aware of all these people who
                | helped develop computer science", it's "Turing is
                | overcredited, here's a list of other people to support
                | my argument".
        
         | YeGoblynQueenne wrote:
         | >> I'm not going to comment on the actual content that is
         | mostly [1] scientifically correct, but Schmidhuber (the author)
         | has a record of wanting to be the center of attention [2] (even
         | though LeCun is not better on that matter).
         | 
         | You_again wants his work _and that of others_ properly
         | recognised. For example, his article titled _Critique of
         | Paper by "Deep Learning Conspiracy" (Nature 521 p 436)_ [1],
         | which is referenced by your link to Wikipedia, cites a couple
         | dozen pioneers of deep learning, quite apart from Schmidhuber
         | himself. Quoting from it:
         | 
         | >> _2. LBH discuss the importance and problems of gradient
         | descent-based learning through backpropagation (BP), and cite
         | their own papers on BP, plus a few others, but fail to mention
         | BP's inventors. BP's continuous form was derived in the early
         | 1960s (Bryson, 1961; Kelley, 1960; Bryson and Ho, 1969).
         | Dreyfus (1962) published the elegant derivation of BP based on
         | the chain rule only. BP's modern efficient version for discrete
         | sparse networks (including FORTRAN code) was published by
         | Linnainmaa (1970). Dreyfus (1973) used BP to change weights of
         | controllers in proportion to such gradients. By 1980, automatic
         | differentiation could derive BP for any differentiable graph
         | (Speelpenning, 1980). Werbos (1982) published the first
         | application of BP to NNs, extending thoughts in his 1974 thesis
         | (cited by LBH), which did not have Linnainmaa's (1970) modern,
         | efficient form of BP. BP for NNs on computers 10,000 times
         | faster per Dollar than those of the 1960s can yield useful
         | internal representations, as shown by Rumelhart et al. (1986),
         | who also did not cite BP's inventors._
         | 
         | That is not "wanting to be the center of attention". It is very
         | much asking for proper attribution of research results. Failing
         | to do so is a scientific scandal, especially when the work
         | cited has contributed towards a Turing award.
         | 
         | __________
         | 
         | [1] https://people.idsia.ch/~juergen/deep-learning-
         | conspiracy.ht...
        
         | albertzeyer wrote:
         | He just wants to get the facts right, esp the correct
         | attribution to the original scientific contributions (who did
         | it first).
         | 
         | Originality is easily defined: who did something first.
         | 
         | This might not be the same as the influence of some work. It
         | might be that someone else does a lot of groundbreaking work
         | which actually makes something work (e.g. Goodfellow et al.
         | for GANs). You can say the GAN paper had more influence than
         | Schmidhuber's Adversarial Curiosity principle.
         | 
         | Also, of course some newer authors might not know of all the
         | old work, so people can arrive at the same ideas
         | independently. When Goodfellow got the idea for GANs, he
         | might not have known about Schmidhuber's Adversarial
         | Curiosity.
         | 
         | The problem is, sometimes people did know about the original
         | work but intentionally do not cite it. You cannot really
         | know; people will of course tell you they did not know. But
         | this can be fixed by simply adding the citation. It looks
         | bad, of course, when there are signs that they should have
         | known, i.e. that the omission was intentional.
         | 
         | There is also a lot of arguing over whether something is the
         | same idea or a genuinely novel one. This can be ambiguous.
         | But in most cases discussed by Schmidhuber, when you look at
         | the core idea, it is actually not so ambiguous. And even then
         | it is not much of a problem: there is far less argument about
         | whether something is at least related, and related work
         | should still be cited.
         | 
         | The question then is which work one should cite. I would say
         | all the relevant references: definitely the original work,
         | but also other influential work. Many people just do the
         | latter, and this is one of Schmidhuber's criticisms: that
         | people give too little credit (or none) to the original work.
        
       | exporectomy wrote:
       | He seems a bit self-contradictory saying both "value those who
       | ensure that science is self-correcting." and "when it comes to
       | credit assignment in science, the important question is: Who did
       | it first?"
       | 
       | Probably worth stepping back and asking why we have heroes in the
       | first place. It obviously doesn't serve the hero himself because
       | he's dead. It's for the benefit of other people. Perhaps if it
       | seems unfair, people won't be so inspired by them, but that's
       | more about perception and modern culture than reality.
       | 
       | We already have a million and one scientists trying to be the
       | first to discover some novel phenomenon, but in a useless, non-
       | self-correcting way, not even bothering to follow up on their
       | own "discovery", perhaps hoping someone else will do the hard
       | work while they enjoy the credit of being first. I'd also say
       | that perhaps it doesn't count if you do it first but your
       | results remain hidden and unused. I'll bet some indigenous
       | Venezuelan discovered Angel Falls before Jimmie Angel did, but
       | that knowledge didn't get out to the wider world, so what was
       | the point of it?
        
       | okl wrote:
       | Related books with more depth by Edgar Daylight:
       | 
       | - Turing Tales: https://dijkstrascry.com/turingtales
       | 
       | - The Dawn of Software Engineering: https://dijkstrascry.com/dawn
        
       | bee_rider wrote:
       | I guess to some extent I can understand some defensiveness of
       | Leibniz, given that he probably deserves a ton of the credit we
       | normally give Newton (and there was actually some bad blood
       | between the two).
       | 
       | But Godel is incredibly famous, isn't he? I don't see why there's
       | a need to give him more credit.
       | 
       | Also, it is weird that this article doesn't have a couple of
       | paragraphs on von Neumann; he's underappreciated and also very
       | cool.
        
       | tegeek wrote:
       | If you take away all the media hype around Alan Turing and just
       | look at his contributions, he will still stand out.
       | 
       | On Computable Numbers [1] is perhaps one of the top 3 papers in
       | the history of mathematics. One of the most remarkable things
       | about the Turing machine is its simplicity.
       | 
       | Then again, in 1950, "Can machines think?" [2] is perhaps among
       | the top 3 papers in the history of philosophy. And again, one
       | of the most remarkable things about the Turing test and the
       | Imitation Game is its simplicity.
       | 
       | The impact of these two papers on academia, industry and our
       | lives is huge.
       | 
       | Alan Turing is easily one of the top 3 mathematicians and
       | philosophers of all time.
       | 
       | [1]. https://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf
       | 
       | [2]. https://academic.oup.com/mind/article/LIX/236/433/986238
        
         | wheaties wrote:
         | Mathematicians? No. Computer scientists? Yes.
         | 
         | If you want to group him with math, you're going to have to
         | compete with the likes of Euclid, Riemann, Bayes, Newton,
         | Gauss, Cantor, Erdos, Fermat, Pascal, Leibniz, Bernoulli,
         | Euler, Lagrange, Laplace, Fourier, Cauchy, Jacobi, Hamilton,
         | Galois, Weierstrass, Cayley, Dedekind, Klein, Hilbert,
         | Brouwer, Godel...
        
       | bjornsing wrote:
       | Having an educational background in physics I find the Turing
       | Machine a much more intuitive model of computation than say
       | lambda calculus. To me this is Turing's main contribution:
       | linking the abstract world of computation to the physical world,
       | and proving that a very simple physical machine can perform any
       | computation (Turing completeness). That's no small contribution.
        
         | lou1306 wrote:
         | This dualism in CS still carries on to this day. Essentially,
         | the lambda calculus is the forefather of the functional
         | approach to computation, while the Turing machine epitomizes
         | the imperative paradigm.
        
         | kerblang wrote:
         | So okay yeah it's Turing Completeness that matters the most to
         | me as computer science, on a purely pragmatic basis: When
         | people are pitching the latest whiz-bang doodad, the answer is
         | always, "This isn't doing anything we couldn't already do, so
         | how does it make things easier and do it efficiently enough for
         | our purposes?" That's the proper question when it comes to
         | monads, coroutines, actors, non-blocking blah blah, etc. etc.
         | 
         | That's really important in an industry saturated with hype,
         | elitism and general nonsense. Anything I can do in Rust, you
         | can do in Assembly, so I've got some explaining to do (I can
         | probably succeed in this example, others maybe not).
         | 
         | If Turing actually failed to deliver the goods on
         | "completeness", I'd really like to resolve that.
        
         | skissane wrote:
         | > I find the Turing Machine a much more intuitive model of
         | computation than say lambda calculus
         | 
         | I think register machines are more intuitive than Turing
         | machines - they are much closer to how real world computers
         | work.
        
           | tsimionescu wrote:
           | But that's the opposite direction from Turing's intention.
           | The point of the Turing machine model is for the machine to
           | be both mechanically plausible (no magic required) and also
           | equivalent to what a human does when performing computation.
           | 
           | The Turing machine model is a mechanically plausible
           | abstraction of a human performing a computation by following
           | some steps in their mind and with a notebook. The tape stands
           | in for the notebook. The read head stands in for the human's
           | eyes. The write head stands in for their pencil hand. The
           | internal state of the machine stands in for their mental
           | state. At every step, you either read a symbol from the
           | notebook tape and change your mental state in relation to
           | this symbol, or write a symbol based on the current mental
           | state. The procedure itself can be written on the tape, and
           | you can occasionally refer back to it.
           | 
           | The original paper spends a good few pages working out this
           | metaphor and showing that the machine model perfectly
           | abstracts the mathematician's work of computation.
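
            The read-symbol / change-state / write-symbol cycle described
            above can be sketched as a small simulator (an illustrative
            Python sketch; the increment machine and its encoding are my
            own, not from Turing's paper or the comment):

```python
# A minimal Turing machine loop: read a symbol, consult the current
# (state, symbol) rule, write a symbol, move the head, change state.
def run_tm(rules, tape, state="start", blank="_", max_steps=10_000):
    """rules maps (state, symbol) -> (write, move, next_state);
    move is -1 (left) or +1 (right). Returns the final tape contents."""
    cells = dict(enumerate(tape))  # sparse tape: unbounded, finite at any moment
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example machine: increment a binary number. Scan right to the end of
# the input, then propagate the carry back to the left.
inc_rules = {
    ("start", "0"): ("0", +1, "start"),
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),  # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", -1, "done"),   # 0 + carry -> 1, done
    ("carry", "_"): ("1", -1, "done"),   # overflow: new leading 1
    ("done", "0"): ("0", -1, "done"),
    ("done", "1"): ("1", -1, "done"),
    ("done", "_"): ("_", +1, "halt"),
}

print(run_tm(inc_rules, "1011"))  # 1011 (11) + 1 -> 1100 (12)
```

            Note that nothing in the loop depends on how a symbol is
            recognized or stored, which is exactly the point made above.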
        
           | Dylan16807 wrote:
           | In the days where you can buy a ram chip, a register machine
           | is a really easy abstraction.
           | 
           | If you're trying to imagine something you can mechanically
           | assemble out of discrete components, it's not so great. You
           | need an unlimited number of components hooked up in
           | complicated ways.
           | 
           | A turing machine is a fixed-size and relatively simple box,
           | plus a long tape that feeds through.
        
           | bjornsing wrote:
           | Yes, on the abstract computation side of the link register
           | machines are much more intuitive.
           | 
           | But on the physical side of the link they are much less
           | intuitive IMHO: it's much less clear that "this is just a
           | machine that I could build in my garage".
        
             | tsimionescu wrote:
             | The Turing machine is definitely not some machine that
             | you could build in your garage. None of the mechanisms
             | are specified or even specifiable. The important part, to
             | Turing at least, is that it perfectly matches what a
             | human does while computing a number, and that there are
             | no magical steps like 'thinking'. Read symbol, change
             | internal state, write another symbol down, rinse and
             | repeat.
             | 
             | All of the details of how a symbol is read and
             | recognized, how it alters the internal state, how the
             | next symbol is chosen, or even how many symbols you
             | actually need are not mentioned, and are even considered
             | irrelevant. Turing wasn't building one of these; he was
             | proving that this model captures all known computation,
             | and that even so it is undecidable whether such a machine
             | would ever stop for an arbitrary computation.
        
               | bjornsing wrote:
               | No "the Turing Machine" isn't a machine you can build in
               | your garage. It's an abstraction.
               | 
               | But any individual Turing machine is. Building a simple
               | one is not very hard, and you can imagine supplying it
               | with more and more tape as it needs it.
               | 
               | It's thus the only model of computation that I can fully
               | imagine "working". And that to me is the beauty of
               | Turing's contribution.
        
             | skissane wrote:
             | The "physical side" was probably more important when Turing
             | first came up with the idea, and people struggled to
             | conceive of computers because none had yet been built.
             | Nowadays it is arguably less necessary because they are an
             | essential part of everyday life, and most people learn some
             | programming before learning theoretical computer science.
        
         | anta40 wrote:
         | Back in my undergraduate days, more than a decade ago, I
         | learnt both lambda calculus and Turing machines in the same
         | class: Formal Languages and Automata Theory.
         | 
         | Of course the Turing machine is easier to understand
         | because... it's a theoretical machine, after all. On the
         | other hand, lambda calculus was weirder; I didn't get it
         | until learning Haskell :D
        
         | nightcracker wrote:
         | > and proving that a very simple physical machine can perform
         | any computation (Turing completeness)
         | 
         | Not proving, conjecturing. It remains unproven to this day:
         | https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis
        
           | skissane wrote:
           | It can never be "proven" because the notion of "any
           | computation" being referred to is informal.
           | 
           | Also, it can't perform _any_ computation, if we say
           | hypercomputation is a form of computation. Hypercomputation
           | is (as far as we know) physically impossible, but so strictly
           | speaking are Turing machines - a true Turing machine has
           | unlimited storage, and an unlimited amount of time in which
           | to complete its computations - any Turing machine you could
           | physically construct would only be a finite approximation of
           | a real one.
        
           | bjornsing wrote:
           | Ah, ok, I should have said "any computation that can be
           | performed by any Turing Machine".
        
         | tsimionescu wrote:
         | > and proving that a very simple physical machine can perform
         | any computation (Turing completeness)
         | 
         | This is a misunderstanding of the Turing machine model. The
         | Turing machine is not designed to be a realistically
         | implementable physical machine, and indeed there are no details
         | in Turing's paper on how such a physical machine could be
         | achieved.
         | 
         | Instead, the Turing machine model is designed to be a
         | mechanistic model of _what a mathematician does_ (in rote day-
         | to-day tasks at least). The tape is a representation of the
         | notebook where the mathematician writes down notes. The read
         | head is a representation of the mathematician reading from the
         | notebook.
         | 
         | It's fascinating to read the paper because of this, since it
         | spends quite a few paragraphs showing that this simplification
         | doesn't lose anything of the mathematician's work. It spends
         | some time noting that even though paper is two-dimensional, it
          | can be losslessly encoded on one-dimensional tape. It spends
         | time noting that writing/reading one symbol at a time is a good
         | enough approximation for human writing/reading. It spends time
         | noting that the next step doesn't need to depend on more than
         | the current symbol + internal state, as human attention is also
         | focused on a limited number of symbols.
         | 
         | This is actually why Turing's paper is so convincing on the
         | argument of universal computation - not because the machine is
         | realizable, but because it's hard to invent anything that a
         | mathematician does while calculating that is not captured by
         | the Turing machine model.
         | 
         | I very much recommend reading the first part of the paper [0]
          | to see this argument (the second part, where it is proven
          | that this abstract machine can in fact compute all known
          | computable numbers, is more technical and flew over my own
          | head).
         | 
         | [0] PDF
         | https://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf
        
           | codeflo wrote:
           | I'm with you, I also found Turing's argument that his machine
           | model captures _all_ of computation very convincing and
           | pointed that out in another thread.
           | 
           | However, for this argument to work, we need to accept both
           | that all computation is captured by Turing machines, and also
           | that what Turing machines do is in fact computable. In
           | essence, Turing machine <=> realizable machine. Maybe some
           | people are more impressed by one, others more by the other
           | direction of that double implication.
        
           | bobthechef wrote:
           | > the Turing machine model is designed to be a mechanistic
           | model of what a mathematician does (in rote day-to-day tasks
           | at least)
           | 
           | Or more accurately, what _human computers_ did in those days
           | (i.e. the rooms full of people algorithmically working out
           | numerical calculations for e.g. physicists or whatever
           | without understanding what they were doing beyond the
           | mechanical steps they were taking). In other words a
           | formalization of so-called effective methods.
        
           | bjornsing wrote:
           | > This is a misunderstanding of the Turing machine model. The
           | Turing machine is not designed to be a realistically
           | implementable physical machine, and indeed there are no
           | details in Turing's paper on how such a physical machine
           | could be achieved.
           | 
           | I've read the paper. I think we just take different things
           | from it, possibly because you have a background in
           | mathematics?
           | 
           | To me, the main takeaway (if I imagine reading it in 1936) is
           | that a universal Turing machine is not all that complicated,
           | and arouses the "I could build this thing"-intuition.
           | 
           | That of course doesn't mean that Turing intended it to be
           | realizable, that's not my point. But he appeals to an
           | engineer's intuition. It's that intuitive link that's
           | valuable and unique IMHO.
           | 
           | BTW, I think your takeaway is probably clearer in Godel's
           | work.
        
             | eynsham wrote:
             | The Turing machine has a tape of unbounded size so can't be
             | built simpliciter.
             | 
             | Moreover, although it turns out that that model of
             | computation is very robust and sufficient for all
             | purposes in physics (unless black holes or something
             | allow hypercomputation), Turing does not really
             | definitively show that, and in a way it can't be
             | definitively shown. All we have is a lack of
             | counterexamples (admittedly a very convincing one).
             | 
             | I don't see why this intuition is that helpful generally
             | either; Turing machines don't really help at an
             | implementation level with modern engineering problems as
             | far as I can tell. Most of the time you know that what you
             | want to do is possible in finite time &c.--presumably the
             | difficulty is doing what you want to do, and going via the
             | Turing formalism would be a bit odd.
        
               | bjornsing wrote:
               | > The Turing machine has a tape of unbounded size so
               | can't be built simpliciter.
               | 
               | On the contrary, I think this is one of the advantages of
               | Turing's model: I can imagine standing there in my garage
               | looking on as my universal Turing machine is running low
               | on tape on the left side, and then simply attaching a new
               | roll of fresh empty tape at the end, holding it as it is
               | fed into the machine. :)
               | 
               | It's simply the least leaky abstraction of computation.
        
               | codeflo wrote:
               | Exactly this. Unbounded doesn't mean infinite, and people
               | are sometimes confused by the distinction.
        
               | hardlianotion wrote:
               | Including me. What is the difference?
        
               | hansvm wrote:
               | The usual difference is just predicate ordering -- (1)
               | for every program there exists a tape big enough vs (2)
               | there exists a tape big enough for every program. In the
               | first case, each individual (valid) program can get by
               | with a tape of _some_ fixed length, but there's no bound
               | on how big that requisite length might be. In the second
               | case, since the tape requirements can be arbitrarily high
               | you would need a legitimately infinite tape to service
               | all possible programs.
               | 
               | IMO the given example muddies the waters a bit by
               | conflating the conceptual tape a given machine is running
               | on (which might be infinite for non-halting programs)
               | with the physical tape you've used so far (for which it
               | suffices for such tape to be finite at any fixed moment
               | in time, though the amount needed might grow unboundedly
               | as time increases).
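
                The predicate-ordering distinction above can be written
                out symbolically (my notation, not from the comment),
                with tape(p) the tape a halting computation p uses:

```latex
% Unbounded tape: every halting computation p has *some* finite bound L
\forall p \;\exists L \in \mathbb{N} \;:\; \mathrm{tape}(p) \le L
% A truly infinite tape would be needed to fix one bound for *all*
% computations at once, since no single finite L works:
\neg\, \exists L \in \mathbb{N} \;\forall p \;:\; \mathrm{tape}(p) \le L
```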
        
               | codeflo wrote:
               | Note that tape usage typically depends on the input, so I
               | would distinguish programs and computations (program +
               | input).
        
               | Dylan16807 wrote:
               | In a theoretical sense, an unbounded number is always
               | finite.
               | 
               | In a practical sense, turing machines don't voraciously
               | consume tape. Adding extra feet of tape gives you an
               | exponential increase in what you can compute. So if you
               | set up a program to be reasonably judicious with its tape
               | use, you can just say that if it reaches an end you pause
               | it for a day, head to the shop, and buy another reel. Big
               | computations take a lot of time anyway.
        
               | hardlianotion wrote:
                | Any given number is always bounded. I am not sure it
                | makes sense to talk about an unbounded number.
        
               | [deleted]
        
               | codeflo wrote:
               | Some parts of mathematics deal with infinite sequences,
               | that is, actually infinite lists of numbers. It's usually
               | assumed, and important for analysis, that these numbers
               | are considered to be "all there" right from the
               | beginning. You can do operations like: Compute the limit.
               | Add up all of its elements. Determine whether two
               | sequences are identical for all entries after the
               | trillionth.
               | 
               | I think this is often part of the misunderstanding when
               | you stumble into a post by someone who's confused about
               | 0.999... = 1. People sometimes write things like:
               | "0.999... only moves closer and closer to 1, it never
               | reaches 1." I think that highlights a deeper point than
               | people usually give these comments credit for. The thing
               | is, 0.999... doesn't "move" anywhere, it's considered a
               | completed object right from the beginning.
               | 
                | Anyway, the point is that Turing machines are not like
                | this at all. They only look at a fixed-size part of
                | the tape during each step, from which it follows that
                | they have used only a finite amount of tape at each
                | point of their execution.
               | 
               | So for any given (halting) computation, you don't
               | actually need an infinite tape, you just need "enough",
               | without changing the result. This is important because it
               | makes Turing machines a model for practical computers.
               | For example, the device you're reading this on has
               | gigabytes of tape, and that's big enough for many, many,
               | many kinds of computation.
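
                The "completed object" view of 0.999... mentioned above
                can be stated as a standard limit identity (my
                formatting, not from the comment):

```latex
0.\overline{9}
  \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
  \;=\; \lim_{n\to\infty}\left(1 - 10^{-n}\right)
  \;=\; 1
```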
        
       | titzer wrote:
       | FTA:
       | 
       | > They did not exploit this--Turing used his (quite inefficient)
       | model only to rephrase the results of Godel and Church on the
       | limits of computability.
       | 
       | That's an infuriating sentence, as the author clearly has no clue
       | how insanely inefficient Church numerals are. For those who don't
       | know, Church numerals are literally a _unary_ encoding of
       | integers as repeatedly applied functions.
       | 
       | I read a lot of this article and at no point was I nodding
       | along, but at this point I just had to pause and vent. This
       | person has something _against_ Turing to the point that it's
       | just weird.
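
        For readers who haven't seen them, Church numerals encode n as
        "apply f n times", i.e. a unary representation; a few lines of
        Python make the inefficiency concrete (an illustration, not from
        the article or the comment):

```python
# Church numerals: the number n is "a function that applies f n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))              # n + 1
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # m + n

def to_int(n):
    # Decode by counting applications: apply "+1" to 0, n times.
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(three))              # 3
print(to_int(add(three)(three)))  # 6

# Every arithmetic step walks the whole unary chain: adding m and n
# costs on the order of m + n function applications, unlike positional
# (binary/decimal) notation.
```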
        
       | motohagiography wrote:
       | Nice to acknowledge the work of lesser known lights, but the
       | article kind of reduces to another variation of, "your co-
       | ordinates for belief and identity are arbitrary and unstable, the
       | protagonists of the story you tell yourself are villains, it's
       | probably because you are a shameful example of a person and
       | should expect punishment, but you can redeem yourself by
       | renouncing belief and aligning to the narrative of progress in
       | the present."
       | 
       | It passes the popular bar for new knowledge because it
       | illustrates a conflict and resolves it, but it's in a cognitive
       | style that seems to conflate debasing knowledge with discovering
       | it. It is how people are educated these days, where it is
       | sufficient to destabilize and neutralize target ideas instead of
       | augmenting or improving them, so it's no surprise, but the
       | article seemed like a good example to articulate what is
       | irritating about this trend in educated thinking.
        
       | jesuslop wrote:
       | An _ad fontes_ breadcrumb trail, I think, passes through Godel
       | 1934, SS9 "General recursive functions": "...this leads to the
       | question what one would mean by 'every recursive function'. One
       | _may attempt_ to define this notion as follows...", which refers
       | to footnote 34: "This was suggested by Herbrand in private
       | communication". Finally the note points to the article's
       | postscript, where Godel (later) appraises Turing's work as
       | settling what would become known as Church's thesis: that Turing
       | machines exhaust all 'mechanical procedures'. "In consequence of
       | later advances, in particular of the fact that, due to A. M.
       | Turing's work, a precise and unquestionably adequate definition
       | of the general concept of formal system [by defining 'mechanical
       | procedure'] can now be given..."
        
       | rg111 wrote:
       | Also see this: https://news.ycombinator.com/item?id=27536974
       | 
       | It's almost like Schmidhuber hates Turing and wants the world to
       | worship Godel.
       | 
       | It's _personal_ for him.
       | 
       | He also wrote a tweet, and maybe another post, where he ranked
       | the countries of Europe and argued that Europeans collectively
       | won the most Olympic medals and are hence superior.
       | 
       | He suffers from this weird and serious case of European
       | Supremacy.
       | 
       | I know about his actual works (not what he claims to have done).
       | 
       | And based on that I respect him.
       | 
       | He also has got a cult of people following him and who will gang
       | up on any dissenters, i.e. people not thinking of him as a
       | walking god.
       | 
       | Schmidhuber has actual contributions, I know.
       | 
       | He has now moved to a country with an extremely bad human rights
       | record, whose policies and their implementation put the Taliban
       | to shame.
       | 
       | Now he promotes that institution at every opportunity he gets.
       | 
       | He is very hard to take seriously.
        
       | amw-zero wrote:
       | This did not convince me that Turing's accomplishments are
       | exaggerated. Anyone who knows computing history knows that there
       | are many people involved. But Turing's contributions are not
       | overstated; they are just stated accurately. That doesn't mean
       | that he was the only computer scientist in the world.
        
       | ogogmad wrote:
       | I need to find a reference for this:
       | 
       | Goedel apparently believed that it was impossible to explicitly
       | enumerate all the partial computable functions over the integers,
       | until Turing proved him wrong. He reasoned as follows:
       | 
       | - You can't enumerate the total functions because Cantor's
       | diagonal argument is constructive. Given a claimed enumeration of
       | N^N, you can perform Cantor's diagonal construction to produce a
       | computable element of N^N not in the enumeration. So N^N is
       | computably uncountable.
       | 
       | - Classical set theory says that a superset of an uncountable set
       | is also uncountable.
       | 
       | The problem is that while a superset of an uncountable set is
       | uncountable, a superset of a _computably uncountable_ set may
       | instead be computably countable. The partial functions over the
       | integers show that this is indeed the case.
       | 
       | Cantor's construction is blocked because a partial function can
       | output a degenerate value called non-termination. A computable
       | function must map the non-termination value to itself. Since it
       | isn't mapped to something different, Cantor's construction can't
       | produce something outside an enumeration.
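The first bullet, that Cantor's diagonal argument is constructive, can be sketched directly: from any claimed enumeration of total functions you compute a function it misses. A toy Python illustration (mine, not from the comment; the example enumeration is made up):

```python
# Cantor's diagonal argument, constructively: given any (computable)
# enumeration 'en' of total functions N -> N, the function
# n -> en(n)(n) + 1 is itself computable but differs from en(n) at n.

def diagonal(en):
    """Return a total function not listed by the enumeration 'en'."""
    return lambda n: en(n)(n) + 1

# A toy "enumeration" listing the functions x -> x + k for k = 0, 1, 2, ...
en = lambda k: (lambda x: x + k)

d = diagonal(en)
# d differs from the k-th listed function at input k:
print([d(k) != en(k)(k) for k in range(5)])  # -> [True, True, True, True, True]
```

The argument only breaks for partial functions, where `en(n)(n)` may fail to terminate, which is exactly the point of the parent comment.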
        
         | bsedlm wrote:
         | > The problem is that while a superset of an uncountable set is
         | uncountable, a superset of a computably uncountable set may
         | instead be computably countable. The partial functions over the
         | integers show that this is indeed the case.
         | 
         | > The computable countability of the partial maps from N to N.
         | 
         | Can anybody give an example?
         | 
         | does this have anything to do with rationals? or is it more
         | related to limits and calculus?
        
           | ogogmad wrote:
           | > does this have any to do with rationals? or is it more
           | related to limits and calculus?
           | 
           | No.
           | 
           | I'm going to use Haskell, which I'm going to assume you know.
           | I'm using it because it seems closer to the math. The type
           | for naturals is:
           | 
           | data Nat = Zero | Succ Nat
           | 
           | and then all Haskell functions of type `Nat -> Nat` represent
           | partial functions. They're not total functions because they
           | might enter an infinite loop for some inputs. You can clearly
           | enumerate all Haskell functions which are syntactically
           | correct and which have type Nat->Nat, so the partial
           | functions of that type are computably enumerable.
           | 
           | But consider the total functions of type Nat->Nat (i.e. those
           | that never enter an infinite loop). Assume you can have a
           | function `en :: Nat -> (Nat -> Nat)` which can output every
           | total function of type Nat->Nat.
           | 
           | Then the function `counterexample n = Succ (en n n)` is a
           | function that cannot be outputted by `en`, and therefore `en`
           | fails to enumerate all total functions.
           | 
           | I've got other things to do, unfortunately, so I can't say
           | more than this.
           | 
           | [edit] Fixed the counterexample.
        
         | codeflo wrote:
         | I think the more widespread term for what you call "computably
         | countable" is "computably enumerable", perhaps in part to make
         | the distinction more clear. For example, the set of all natural
         | numbers is (trivially) c.e., but has many non-c.e. subsets. But
         | to understand this _is_ understanding the Entscheidungsproblem,
         | so it shouldn't be surprising that this was less clear before
         | that was resolved.
        
           | ogogmad wrote:
           | Computably countable is also a correct term. In line with
           | this approach to computability theory:
           | http://math.andrej.com/asset/data/synthetic-slides.pdf
           | 
           | It shows that the unsolvability of the halting problem
           | follows from:
           | 
           | - The computable uncountability of N^N, which can be proved
           | using an identical argument to the one Cantor used to prove
           | the set-theoretic uncountability of N^N.
           | 
           | - The computable countability of the partial maps from N to
           | N.
           | 
           | If the halting problem were solvable, then the second bullet
           | point would contradict the first. So it's essentially a weird
           | twist on set theory that uses constructive logic.
        
             | codeflo wrote:
             | I don't doubt that the term is in use, and I understood
             | what you meant. But it's not listed among seven (!)
              | synonyms for computably enumerable on Wikipedia, and more
              | to the point, the slides you linked to also don't contain
              | that term.
             | 
             | However, that's not the point I wanted to make. I wouldn't
             | like calling it computably countable even if everyone else
             | did, simply because it gives the wrong intuition about
             | subsets.
        
               | ogogmad wrote:
               | > the slides you linked to also don't contain that term
               | 
               | The term "countable" is used in the slides, where it
               | means computably enumerable. The adjective "computably"
               | is used when there's a need to distinguish set-theoretic
               | notions from similarly behaved computability-theoretic
               | notions. Otherwise the meaning of the term "countable"
               | can be context-dependent.
               | 
               | > it gives the wrong intuition about subsets
               | 
               | In constructive logic, a countable set can contain an
               | uncountable subset. The misleading intuition (in the
               | context of non-classical logics) is based on classical
               | logic where countability is a statement about the size of
               | a set. Whether you think constructive logic is a good way
               | of explaining computability theory is another question,
               | but it's certainly a viable way of doing it.
               | 
               | It's like how the term "line" can mean something
               | different in hyperbolic geometry from what it means in
               | Euclidean geometry. You could argue that it might mislead
               | people about the nature of parallel lines, but that's why
               | hyperbolic geometry is not Euclidean geometry. Another
               | example is using "multiplication" to refer to an
               | operation on matrices, which might make people think that
               | AB=BA when that usually isn't true. Mathematics is all
               | about re-using terms and pointing out that there are
               | differences.
               | 
               | [edit]
               | 
               | Admittedly, the slides do use the term "enumerable" as
               | well, so that's another option. When there's a
               | possibility for confusion with set theory, you can say
               | "computably enumerable" as you suggested.
               | 
               | [edit] Made lots of edits. Hopefully, that's it.
        
       | codeflo wrote:
       | It seems to be in style now to try to tear down the public
       | perception of past great minds, I recently read a similar article
       | about Hawking. And while this article may have some points, I
       | don't think the overall framing is fair.
       | 
       | I think everyone with an interest in theoretical CS should work
       | through Turing's 1936 paper at one point in their life. For me,
       | the important part of that paper is how convincingly it argues
       | that the proposed machine model is not just yet another model of
       | computation, but a complete model of computation: that everything
       | you could reasonably call computation is encompassed by these
       | machines.
       | 
       | So there's a finality to Turing's definition of computation:
       | these problems are unsolvable not just with this model, but with
       | _any_ reasonable machine model. It's very hard to make the same
       | case for lambda calculus, which is (in my opinion) a large part
       | of what made Turing's paper so groundbreaking.
        
         | [deleted]
        
       | bo1024 wrote:
       | We've been over this. Godel's mu-recursive functions were a poor
       | model of computation because it's completely unclear how to
       | physically implement the arbitrary-function minimization
       | operator. So people didn't see how to build a machine that
       | calculates this way. Similarly, there was no clear way to
       | mechanize the lambda calculus.
       | 
       | Turing Machines, on the other hand, were instantly obviously
       | mechanizable. It was clear that one could build a physical
       | machine to run any Turing program without human input. By proving
       | that they could simulate the other systems, Turing showed that
       | those other systems could be mechanizable as well.
       | 
       | I don't understand why Schmidhuber continues to ignore this
       | crucial point.
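The "instantly obviously mechanizable" claim can be made concrete: every step of a Turing machine is a finite table lookup plus a one-cell tape move. A minimal interpreter sketch (my illustration; the function name and the bit-flipping example machine are made up):

```python
# Each step is a purely mechanical lookup:
# (state, symbol) -> (new state, symbol to write, head move).
from collections import defaultdict

def run_tm(delta, start, accept, tape_input, max_steps=10_000):
    """Run the transition table until the accept state (or a step cap)."""
    tape = defaultdict(lambda: '_')  # '_' is the blank symbol
    for i, c in enumerate(tape_input):
        tape[i] = c
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            return ''.join(tape[i] for i in sorted(tape)).strip('_')
        state, write, move = delta[(state, tape[head])]
        tape[head] = write
        head += 1 if move == 'R' else -1
    raise RuntimeError("step cap reached (machine may not halt)")

# Example machine: flip every bit of the input, then accept.
flip = {
    ('s', '0'): ('s', '1', 'R'),
    ('s', '1'): ('s', '0', 'R'),
    ('s', '_'): ('done', '_', 'R'),
}
print(run_tm(flip, 's', 'done', '0110'))  # -> 1001
```

The same loop, realized in relays or gears, is essentially why the model read as physically buildable in 1936 in a way the minimization operator did not.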
        
         | KingOfCoders wrote:
         | "For example, it was claimed that Turing founded computer
         | science.[...] Turing's 1936 paper provided the "theoretical
         | backbone" for all computers to come."
         | 
         | So your argument is, because it is unclear how to "physically
         | implement the arbitrary-function minimization operator", Turing
         | is the better "theoretical backbone" and has founded computer
         | science?
        
           | jimhefferon wrote:
           | The question in the air in 36-ish was something like, "OK,
           | clearly we can mechanically compute things like the sum of
           | two numbers or the prime factorization of a number. But are
           | there other things that can be computed with a discrete and
           | deterministic mechanism?" (At the time they called these
           | "effective" functions.)
           | 
           | Church had piles of stuff that he and his students produced
           | that were computable with the lambda calculus. Basically, all
           | of the natural number functions that a person thinks are
           | intuitively mechanically computable, those folks had showed
           | how to lambda compute. With this evidence, he proposed to
           | Godel (they were working together at Princeton at the time),
           | who was considered the world's expert, taking "lambda-
           | calculable" as a mathematically precise version of
           | "mechanically computable." But Godel was notoriously careful,
           | and he did not accept the thought as perfectly clear.
           | 
           | That is, they had a subset of the things that could be
           | mechanically computed. But was it the entire set? Or was
           | there something that some discrete and deterministic
           | mechanism could be made to do that would lead to more than
           | Church's set?
           | 
           | Imagine you are Dedekind and you are looking at the primitive
           | recursive functions and (1) any such function is intuitively
           | mechanically computable, and (2) you are able to work out how
           | to define a pile of things like prime factorization of an
            | integer using this system. You might well conjecture that
           | this is it. But we know (and Godel and Church and Turing
           | knew) that this is not it, that you need to add unbounded
           | search of some kind (this is what minimization does) to get
           | more things that are intuitively mechanically computable.
           | 
           | I agree that the minimization operator is not as easy to
           | picture with gears and levers as some of the other
           | operations. But the issue in 36 was that a person could worry
           | that there was even more. Just as minimization is not as easy
           | to picture and the need for it didn't hit Dedekind with great
           | force, could there be something else out there that we have
           | all missed?
           | 
           | That worry disappeared when Godel read Turing's masterful
           | analysis. It convinced him that this _is_ what a machine can
            | do. He wrote, "That this really is the correct definition
            | of mechanical computability was established beyond any
            | doubt by Turing." Church felt the same way, writing that
            | Turing machines have "the advantage of making the
            | identification with effectiveness ... evident immediately."
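The "unbounded search of some kind" that minimization adds can be written in a few lines; `mu` is the standard name for the operator, while the example predicate is my own illustration:

```python
# The mu (minimization) operator: search n = 0, 1, 2, ... and return the
# first n with p(n). The search is unbounded -- if no such n exists the
# loop never ends. This single operator is what lifts the primitive
# recursive functions to full (partial) computability.
from itertools import count

def mu(p):
    """Least n such that p(n) holds; may not terminate if none exists."""
    for n in count():
        if p(n):
            return n

# Example: the integer square root of 10, found by unbounded search.
print(mu(lambda n: (n + 1) ** 2 > 10))  # -> 3
```

The possible non-termination is exactly why it is harder to picture this operator "with gears and levers" than the bounded operations.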
        
             | earleybird wrote:
             | In a nutshell, Church asserted effective computability by
             | saying "look what you can do with the lambda calculus".
             | Turing took the philosophical approach saying "this is what
             | it means to compute". To Godel, Church's argument was
             | incomplete. Turing provided the direct argument. Godel was
             | convinced.
        
             | generalizations wrote:
             | I'm still a little confused. It seems like Turing came up
             | with something that works, and clearly fulfills precisely
             | what Godel and Church and Turing were all looking for; but
             | it also seems like it's a mathematically inelegant
             | solution. Is it possible that in the future we'll find a
             | way to show that Godel's mu-recursive functions or Church's
             | lambda calculus also precisely describe 'what a machine can
             | do'? If so, it seems that from a mathematical standpoint,
             | either of those would be a better foundation to start from.
             | (I'll totally agree that from a teacher's perspective, the
             | Turing machine is the better explanation.)
        
               | acchow wrote:
               | > Is it possible that in the future we'll find a way to
               | show that Godel's mu-recursive functions or Church's
               | lambda calculus also precisely describe 'what a machine
               | can do'?
               | 
               | See also https://en.wikipedia.org/wiki/Church%E2%80%93Tur
               | ing_thesis
        
               | dwohnitmok wrote:
               | > we'll find a way to show that Godel's mu-recursive
               | functions or Church's lambda calculus also precisely
               | describe 'what a machine can do'?
               | 
               | This is already proven to be the case. Mu-recursion
               | (which IIRC is _not_ Godel 's general recursive
               | functions, despite what Wikipedia says; Kleene was the
               | originator of the mu operator. Godel's general recursive
               | functions are defined in a separate, but yet again
               | equivalent way that directly extends primitive
               | recursion), Turing machines, and the lambda calculus are
               | all proven to be exactly equivalent to each other. The
               | fact that these three independent approaches to
               | computability are all equivalent is why we have strong
               | informal justification that the Church-Turing Thesis (a
               | non-mathematical statement) holds.
               | 
               | Separately, there's a sentiment that I've seen come up
               | several times on HN that somehow the lambda calculus, mu-
               | recursion, or general recursion is more "mathematical"
               | and Turing machines are less "mathematical."
               | 
               | I want to push back on that. The mathematical field of
               | computability is based almost entirely off of Turing
               | machines because there are many classes of mathematical
               | and logical problems that are easy to state with Turing
               | machines and extremely awkward/almost impossible to state
               | with the lambda calculus and mu-recursion (this is
               | consistent with my previous statement that the three are
               | all equivalent in power because computability theory
               | often deals with non-computable functions, in particular
               | trying to specify exactly how non-computable something
               | is). The notion of oracles, which then leads to a rich
               | theory of things like Turing jumps and the arithmetical
               | hierarchy, is trivial to state with Turing machines and
               | very unwieldy to state in these other formalisms.
               | 
               | Likewise the lambda calculus and mu-recursion (but not
               | general recursion) provide a very poor foundation to do
               | complexity theory in CS. Unlike Turing machines, where it
               | is fairly easy to discern what is a constant time
               | operator, the story is much more complicated for the
               | lambda calculus, where to the best of my knowledge,
               | analyzing complexity in the formalism of the lambda
               | calculus, instead of translating it to Turing machines,
               | is still an open problem.
               | 
               | There is indeed a mathematical elegance to Turing
               | machines that makes it so that _most_ of the mathematics
               | of computability is studied with Turing machines rather
               | than the lambda calculus.
               | 
               | The lambda calculus on the other hand is invaluable when
               | studying programming language theory, but we should not
               | mistake PLT to be representative of the wider field of
               | mathematics or theoretical CS.
               | 
               | EDIT: I should perhaps make clear that if I put on my
               | mathematical hat, mu-recursive functions seem like the
               | most familiar formalism, because they align with a common
               | way families of things are defined in mathematics
               | (specify individual members and then generate the rest
               | through some relationship). However, I would contend that
               | for the majority of mathematicians outside of
               | computability theory, the lambda calculus and Turing
               | machines seem equally "strange."
        
               | cybernautique wrote:
                | I'm convinced that David Harland's Rekursiv[1] machine
               | _is_ the manner by which lambda et al. might be
               | implemented at the machine level. Unfortunately, Rekursiv
               | seems to have died an ignominious death, with the last
               | Rekursiv chip having fallen off the side of a steamboat
               | (apocrypha; I remember having read this, but I'm unable
               | to find the original citation.)
               | 
               | The Rekursiv advantage is its ability to do recursion on
               | the bare-metal. It's described as an object-oriented
               | architecture. Its memory was a persistent store of
               | objects, with each object having its size, type, and
               | position known in memory.
               | 
               | [1] https://en.wikipedia.org/wiki/Rekursiv
        
               | thom wrote:
               | Thanks for this link, really interesting stuff. For a
               | while there was a fashion in OS research for orthogonal
               | persistence but this is much deeper and more elegant.
        
               | cybernautique wrote:
               | Absolutely! As mentioned, I'm very much a novice of
               | computer architecture. It kinda blew my mind that
               | Rekursiv was unique in its abilities. Also fascinating
               | are LISP Machines, which seem to have received quite a
               | bit of love from HN already.
               | 
               | If I may ask, from your perspective, what's more elegant
               | about Rekursiv's design? Is it the philosophy or
               | implementation? I'd also love any links to orthogonal
               | persistence!
        
               | Karrot_Kream wrote:
               | I guess I'm confused as to why this is more than simply
               | an interesting formalism. Complexity theory and models of
               | computation are largely based around the Turing Machine
               | model. Lambda calculus is an effective lens to design
               | programming languages and prove equivalence between
               | programs. We know by way of the Church-Turing Thesis that
               | these two models are equivalent. The Turing Machine model
               | is both better studied from a computation perspective and
               | has much more practical realization; what's the point in
               | creating something like this? It feels a bit like silly
               | lambda calculus fetishism, but again maybe the value here
               | is the actual computation formalism itself.
        
               | cybernautique wrote:
               | For me, personally, I really like the lambda calculus as
               | a tool to organize and better my computational thinking.
               | 
               | I came into programming/computer science from a
               | mathematics degree; I read some old treatises on
               | recursion theory[1] and fell in love. I couldn't ever
               | quite wrap my head around the Turing Machine formalism,
               | but kept at it for a while. Finding Barendregt's paper
               | [2] was a huge shock! I grasped it much quicker. So, yes,
               | lambda calculus and the Turing Machine formalism are
               | equivalent in explanatory power, but there are also
               | reasons someone might prefer one to the other. So, yes,
               | for me, the value _is_ the formalism.
               | 
               | As to why I think the Rekursiv would provide a good
               | platform for implementing lambda calculus on the bare-
               | metal, that's entirely due to Rekursiv's memory model
               | advantage and the fact that it has a user-writable ISA.
               | Why would someone choose to implement the lambda calculus
               | on bare-metal? You call it "fetishism," I call it fun!
               | 
               | More generally, I just really like the idea of having a
               | machine with a user-writable ISA.
               | 
               | [1] Theory of Recursive Functions and Effective
               | Computability: https://openlibrary.org/books/OL2738948M/T
               | heory_of_recursive...
               | 
               | [2] Introduction to Lambda Calculus: https://www.cse.chal
               | mers.se/research/group/logic/TypesSS05/E...
        
               | Karrot_Kream wrote:
               | FWIW Nothing wrong with having fun with computing, and
               | implementing lambda calculus on bare-metal can be as fun
               | as any other computational exploration, so good on ya!
               | 
               | Thanks for clearing up that it's the formalism you find
               | interesting. Also, to offer a counterpoint, I'm also from
               | a math background, but I was more of an analysis person
               | (as much as one can be in mathematics where it's all
               | related) than an algebra person, and when I did some FP
               | research, it often felt like where all the algebraists go
               | to play computer science. I feel like analysts are
               | underrepresented in PLT (and overrepresented in
               | complexity theory!) but this is already going off-topic,
               | so cheers.
        
               | cybernautique wrote:
               | Nah mate, s'all good! It's great to hear your feedback; I
               | am very much of the algebraist spirit myself (I barely
               | passed my Rudin-based real analysis course). Our
               | experiences definitely align. FP feels much more like my
               | favorite parts of math.
               | 
               | Out of curiosity, can you identify any areas in PLT that
               | could be made more analyst-friendly?
               | 
               | Intuitively, it feels that PLT is almost necessarily of
               | the algebraist; to me, one of the big divides is the
               | discreteness of algebra vs the continuity of analysis.
               | Would it help if there was a PLT that exhibited a greater
               | degree of continuousness? If so, what do you think that
               | might look like?
        
               | WorldMaker wrote:
                | That's the central claim of the Church-Turing thesis,
                | isn't it? Church felt very strongly that the difference
                | between his and his students' "elegance" and Turing's
               | "practical" was a difference only in abstraction and that
               | the two models were equivalent and translatable (you can
               | write in one abstraction and convert it to the other).
               | 
                | That thesis continues to bear fruit, as the history of
               | programming languages is almost entirely about bringing
               | new and "better" abstractions to problems and then
               | translating them to "dumber, more practical" machines. We
               | have programming languages today modeled directly off the
               | elegance of (though now sometimes still a few steps
               | removed from being direct implementations of) the lambda
               | calculus and mu-recursive functions, and the amazing
               | thing is that they work great even given how "dumb" and
               | "inelegant" our machines can be in practice.
        
               | jimhefferon wrote:
               | > Is it possible
               | 
               | Yes, the set of functions computable with mu recursion,
               | or with lambda calculus, is the same as the set of
               | functions computable with a Turing machine. (Turing in
               | his original paper showed that the set is the same for
               | his system and Church's, and the proofs for mu recursion
               | and many other systems are well-known.)
               | 
               | > I'll totally agree that from a teacher's perspective,
               | the Turing machine is the better explanation.
               | 
               | When I learned this, the instructor did not use Turing
               | machines. They used mu recursion. That has the advantage
               | that you don't first define a machine and then derive the
               | set of functions, instead you go straight to the
               | functions. But I agree that Sipser (which I understand to
               | be the most popular text today) defines a computable
               | function as one that is computed by Turing machine, and
               | to my mind that has the advantage of honestly suggesting
               | to students what they are about.
        
               | generalizations wrote:
               | Thanks, I think I understand now. I thought there was a
               | distinction between the mathematical idea of what
               | computation is, and the engineering we've invented to
               | implement that computation, and so I didn't really get
               | the significance/literalness of what you were saying
               | about mechanization.
               | 
               | It does seem weird to me though that we're letting our
               | engineering limitations determine how we think about
               | these things mathematically.
        
             | cfcf14 wrote:
             | Wonderful explanation, thank you.
        
             | lordnacho wrote:
             | You seem like someone who might have a good book about this
             | bit of history? Or perhaps a blog?
        
               | jimhefferon wrote:
               | About the Theory of Computation in general. A draft:
               | https://hefferon.net/computation/index.html :-)
        
           | 3pt14159 wrote:
           | Not OP, but I agree with them.
           | 
            | The word computer means multiple things. In one sense it's the
           | abstraction of universal computation. Imagine a world where
           | actual physical computers didn't progress to universal
           | computation, but were stuck being purpose built to the
           | present day. The field of computer science would be utterly
           | different because they couldn't actually _compute_ anything
           | with their science. They could just discuss computability in
            | an abstract sense. It'd be like physics without the particle
           | colliders or telescopes or lasers.
           | 
           | I think of the founders of computer science more like the
           | founding fathers of America, rather than a single guy named
           | Turing, but some are more memorable than others.
        
             | KingOfCoders wrote:
              | The article addresses your point about general computation
              | and purpose-built computers:
             | 
             | "Likewise, Konrad Zuse never got a Turing award despite
             | having created the world's first working programmable
             | general computer 1935-41. [...] It was pointed out that
             | none of the computers built during the 1940s were
             | influenced in any way by Turing's 1936 theoretical paper,
             | [...]"
        
           | bo1024 wrote:
           | I don't think it's important to quibble over who's overrated
           | or underrated among these giants of math and CS who already
           | get tons of recognition (I'm glad Schmidhuber brings many
           | other historical names into the narrative).
           | 
           | However, yes, I do think that 'mechanization' or physical
           | implementation is a crucial piece of Turing's contribution
           | that is wrongly ignored in this article. And I think without
           | mechanization, there is no CS as we understand it.
        
             | KingOfCoders wrote:
              | I can only repeat my comment from below:
             | 
             | "Likewise, Konrad Zuse never got a Turing award despite
             | having created the world's first working programmable
             | general computer 1935-41. [...] It was pointed out that
             | none of the computers built during the 1940s were
             | influenced in any way by Turing's 1936 theoretical paper,
             | [...]"
        
               | mcguire wrote:
               | I don't believe anyone has received a Turing award for
               | creating a working computer.
        
               | obastani wrote:
               | As far as I know, Konrad Zuse didn't prove that this
               | strategy was a universal model of computation. In
               | contrast, Turing proved that his universal machine could
               | emulate any other machine, given the right program.
               | 
               | In my view, Turing's contribution is providing a
               | plausible definition of computation along with a deep and
               | comprehensive theoretical characterization of the
               | properties of this model of computation. This is why
               | Turing machines form the basis of theoretical computer
               | science, and not other models such as lambda calculus. I
                | think saying that Turing machines were adopted merely
                | because they were more convenient is highly misleading.
               | 
               | I think this pattern repeats a lot: There are many cases
               | where you can point to multiple people who invented
               | similar ideas around the same time, but it is typically
                | the person who provided the deepest and most comprehensive
               | treatment of the subject that ultimately gets most of the
               | credit. This depth is not conveyed in pop science
               | attributions such as "Turing invented computation", but
               | this doesn't mean Turing doesn't deserve the credit.
        
           | w-j-w wrote:
           | I think that proving computers could exist in physical
            | reality satisfies this claim very well.
        
         | dnautics wrote:
         | > I don't understand why Schmidhuber continues to ignore this
         | crucial point.
         | 
         | From TFA:
         | 
         | > There is a seemingly minor difference whose significance
         | emerged only later. Many of Godel's instruction sequences were
         | series of multiplications of number-coded storage contents by
         | integers. Godel did not care that the computational complexity
         | of such multiplications tends to increase with storage size.
         | Similarly, Church also ignored the context-dependent spatio-
         | temporal complexity of the basic instructions in his
         | algorithms. Turing and Post, however, adopted a traditional,
         | reductionist, minimalist, binary view of computing. Their
         | machine models permitted only very simple elementary binary
         | instructions with constant complexity, like the early binary
         | machine model of Leibniz (1679)[L79][LA14][HO66] and Zuse's
         | 1936 patent application...
         | 
         | I presume by "seemingly minor" Schmidhuber implies "it turns
         | out to be very important".
        
           | lostmsu wrote:
           | It is important for engineering, but as far as I understand
           | it is not that important for math. E.g. Godel solved the
           | completeness and consistency problems, and Church first
            | solved the decidability problem.
        
             | jrochkind1 wrote:
             | The OP itself documents:
             | 
             | > Nevertheless, according to Wang,[WA74-96] it was Turing's
             | work (1936) that convinced Godel of the universality of
             | both his own approach (1931-34) and Church's (1935).
             | 
             | Unless "according to Wang" is meant as "I don't know if I
             | believe it", then apparently it's documented that Godel
             | himself thought Turing's contribution was major and shed
              | important light on the mathematical implications of Godel's
             | own work.
             | 
             | There's never any one person that invents anything, it's
             | always built on work that came before.
             | 
             | Reading the OP, I got increasingly bored and tired... ok,
             | what's your point? Yes, clearly Godel and Church especially
             | did foundational work without which Turing's work would not
             | be possible -- and I don't think anyone denies it, anyone
             | with any kind of computer science education is surely
             | familiar with Godel and Church. It's not like they are
             | languishing in obscurity, they are both very well known and
             | respected! Godel especially is widely considered a real
              | giant in his fields. I am confident that neither Godel
              | nor Church will be forgotten for many generations.
             | 
             | But Turing made important contributions that took important
             | steps necessary for the development of CS as a field. It's
             | not a mystery or undeserved why Turing's name ends up
             | remembered.
             | 
             | The OP's point is just that any singular "inventor" is
              | always building on work of many who have come before? OK,
             | sure. Boring. So we should never promote anyone as making
             | important foundational contributions? Well, people aren't
             | gonna stop. Boring.
        
           | mananaysiempre wrote:
           | Not to dispute your point, but note the lambda-calculus-as-a-
           | reasonable-machine papers from the last couple of years: it
           | turns out (despite the seeming general understanding to the
           | contrary in the past) that polynomial interpreters for some
           | meanings of "lambda calculus" (including IIRC a weird very
           | general one, call-by-need on open terms) are perfectly
           | possible, meaning that many fundamental questions of
           | computational complexity _can_ be straightforwardly expressed
           | in terms of the lambda-calculus machine as well. Linear-time
           | interpretation (thus e.g. classes L and NL) seemed out of
           | reach last I checked, though.
           | 
           | To echo my other comment here, it's not that Church himself
           | knew that. Even five years ago people did not know that. It's
           | not a question of priority. But I find it fascinating and
           | reassuring nevertheless, as actually programming _e.g._ a
           | self-interpreter for a Turing machine--or for general
           | recursive functions, or for combinatory calculus--is an
           | absolute slog.
        
             | anatoly wrote:
             | Can you give some links to these recent papers? Sounds
             | interesting. Thanks!
        
         | Grimm1 wrote:
         | As someone said to me recently, Schmidhuber may have a few
         | points about lack of credit regarding his work but him having
          | invented everything to do with modern NNs is a wild claim;
          | regardless, he has cemented himself as a boor and continues to
         | double down. This is just another example of that doubling
         | down.
        
         | cs702 wrote:
         | Jurgen Schmidhuber has a track record of complaining about
         | unfairness in the conventionally accepted view of the
         | scientific record, especially in the English-speaking world.
         | For instance, he has claimed for years that his own early
         | research on AI has been unfairly overlooked or ignored by a
         | "conspiracy" (his word, not mine).[a] At one point, the NY
         | Times called him the "Rodney Dangerfield of AI research"
         | because he doesn't get the respect he deserves.[b]
         | 
         | Ultimately, as always, credit will be assigned by future
         | generations of scientists and historians _long after everyone
         | who is alive today has disappeared from the face of the earth_.
         | Many disagreements that seem important now will be forgotten.
          | Many findings that seem important now will turn out not to be.
         | Many names that are very prominent now will fade into
         | obscurity. As always.
         | 
         | [a] https://people.idsia.ch//~juergen/deep-learning-
         | conspiracy.h...
         | 
         | [b] https://www.nytimes.com/2016/11/27/technology/artificial-
         | int...
        
           | coldtea wrote:
           | > _Jurgen Schmidhuber has a track record of complaining about
           | unfairness in the conventionally accepted view of the
           | scientific record, especially in the English-speaking world._
           | 
           | So, he has
           | 
           | > _Ultimately, as always, credit will be assigned by future
           | generations of scientists and historians long after everyone
           | who is alive today has disappeared from the face of the
           | earth._
           | 
            | Well, credit for the field of CS is already being assigned
            | now, as it was yesterday and 20 years ago. It's not like this
            | future credit would be the first such, or that it would be
            | more accurate (or that it would not feed off the current
            | popular views).
        
           | etaioinshrdlu wrote:
           | See also:
           | https://www.urbandictionary.com/define.php?term=Schmidhubris
        
         | AnimalMuppet wrote:
         | > Similarly, there's no clear way how to mechanize lambda
         | calculus.
         | 
         | Is Lisp such a mechanization?
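          | (Arguably yes -- Lisp's eval mechanizes reduction directly. As
          | a hedged illustration of lambda terms running on a machine,
          | here are the standard Church numerals, with Python lambdas
          | standing in for Lisp's:)

```python
# Church numerals: natural numbers encoded as pure lambda terms.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Decode a numeral by applying it to "+1", starting from 0.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```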
        
         | sytelus wrote:
          | Computers we use are nothing like a Turing machine, and if you
          | want to credit one person for computer design then it would be
          | von Neumann. Media needs hero worship. It doesn't matter who
          | the hero is, but they must have one. This applies to every
          | field. Every scientist, entrepreneur, artist is overblown. The
          | older they are, the more overblown they are. The worst case is
          | movie actors, who literally everyone _knows_ are not "heroes"
          | but just human puppets who move and behave as directors ask
          | and writers dictate, but in the eyes of the public they are
          | literally called "hero" and "star". The people moving these
          | puppets are only mentioned in credits that no one reads.
        
           | wedowhatwedo wrote:
           | Best I can tell, von Neumann stole the design that bears his
           | name from John Mauchly and J. Presper Eckert. von Neumann
            | doesn't seem to deserve as much credit as he is given.
           | I'm not saying he does not deserve credit for his other work.
        
             | toyg wrote:
             | Both you and parent are correct: Von Neumann knew how to
             | play the game to end up with "the credits people do read".
        
         | SquishyPanda23 wrote:
         | > Turing Machines, on the other hand, were instantly obviously
         | mechanizable. It was clear that one could build a physical
         | machine to run any Turing program without human input.
         | 
         | Harold Abelson points out in one of his lectures [0] that
         | computer science isn't about computers any more than biology is
         | about microscopes.
         | 
         | From that perspective, it is clear that Turing found an
         | existing discipline of computer science and made major
         | contributions to it. But it doesn't really seem right to say
         | that he invented or founded computer science.
         | 
         | [0]
         | https://www.youtube.com/watch?v=-J_xL4IGhJA&list=PLE18841CAB...
        
           | Ar-Curunir wrote:
           | Computer Science is about what computers can do. To decide
           | the latter, you have to first decide what a computer is.
           | Turing Machines were the first abstractions that intuitively
           | captured what it means to compute something
        
             | SquishyPanda23 wrote:
             | Sure, if you take "computer" to mean "something that
             | computes". In that case it would include humans. There was
             | a great deal of research into things that can be
             | effectively computed that goes back even before the focus
             | of this article. And of course "computer" used to refer to
             | humans who computed before the invention of mechanical
             | computers.
             | 
             | But it's certainly not the study of what mechanical
             | computers can do. Among other things, mechanical computers
             | all have bounded resources unlike Turing machines.
        
             | dnautics wrote:
             | that's not the case, according to the article. If anything
             | the article implies the opposite. Turing machines were a
             | re-abstraction of Godel's computational model that provided
             | a path to mechanical realization.
             | 
              | Also if you ever work with the Turing machine (a
              | deterministic finite-state control hooked up to an infinite
              | recording tape) it is not at all "intuitive" that this
              | construction comprehensively captures the world of
              | computation.
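              | (For concreteness, a toy simulator -- my own sketch, not
              | from the article -- showing how little machinery the finite
              | control plus unbounded tape needs:)

```python
from collections import defaultdict

def run(delta, state, tape, accept, max_steps=10_000):
    """Run a deterministic Turing machine.

    delta maps (state, symbol) -> (next_state, write_symbol, move),
    with move in {"L", "R"}; "_" is the blank symbol.
    """
    t = defaultdict(lambda: "_", enumerate(tape))  # sparse, unbounded tape
    head = 0
    for _ in range(max_steps):
        if state in accept:
            return "".join(t[i] for i in sorted(t)).strip("_")
        state, t[head], move = delta[(state, t[head])]
        head += 1 if move == "R" else -1
    raise RuntimeError("no halt within step bound")

# Toy machine: flip every bit, halt at the first blank.
delta = {
    ("s", "0"): ("s", "1", "R"),
    ("s", "1"): ("s", "0", "R"),
    ("s", "_"): ("halt", "_", "R"),
}
print(run(delta, "s", "1011", {"halt"}))  # 0100
```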
        
               | Ar-Curunir wrote:
               | Godel's computational model does not intuitively capture
               | what it means to compute. Godel himself was unconvinced
               | that his model was universal, and it took Turing's paper
               | to convince him that his model, lambda calculus, and
               | Turing machines were equivalent and universal.
        
         | yesOfCourse9 wrote:
         | He does not?
         | 
          | He is crafting a tapestry, not painting in black and white.
         | 
         | They all worked together to refine an overall concept and get
         | somewhere useable.
         | 
         | IMO it's the simpler mind that needs a stark black and white
         | assignment. Talk about missing the forest for a tree.
         | 
         | Constantly iterating which daddy figure should be on top of the
         | mental hierarchy, when clearly it's network effects.
         | 
         | Humanity is still emerging from its mental models of society of
         | old.
        
         | wombatmobile wrote:
         | > I don't understand why Schmidhuber continues to ignore this
         | crucial point.
         | 
         | It is difficult to get a man to understand something when his
         | salary depends upon his not understanding it.
         | 
         | But wait a minute, you might say, facts are facts.
         | 
         | And if everyone had the time and resources to discover and
         | digest every fact, your understanding would be the end of it.
         | 
         | But everyone doesn't have time and resources. To compensate, we
         | rely on others to curate facts for us. When we encounter an
         | internally consistent subset of facts that suits our ideals and
         | our interests, we adopt that point of view.
         | 
         | There are infinitely many subsets of curated facts that can be
         | presented as internally consistent. That's why there are so
         | many different points of view.
         | 
         | What bearing does this have on Turing's role in computer
         | science, and his latter day fame in a society which came to be
          | defined by silicon logic?
         | 
         | The First Computers Were Human (and Mostly Women)
         | 
         | https://mjosefweber.medium.com/the-first-computers-were-huma...
         | 
         | Turing, in addition to stating an ontology of computing, dared
         | to invite the question, what is the difference between a
         | computer and a human?
        
           | jterrys wrote:
           | >https://mjosefweber.medium.com/the-first-computers-were-
           | huma...
           | 
           | I cringe when I read ill-researched essays like these because
            | it implies a relationship between women and computing that
           | back then genuinely did not exist.
           | 
            | The vast, vast majority of women in the field of computing
            | at this time were nothing more than glorified
           | calculators. Yes, there were a few women scientists and
           | mathematicians (by few, I mean literally handfuls). Yes, it
           | was a male dominated field.
           | 
           | But the overwhelming majority of women working in this
           | industry at this time did secretarial style busywork. It
           | wasn't glorious. It was underappreciated. It sucked.
           | 
            | These types of essays are a genuine attempt to invent a
            | history that did not exist. It is literary gaslighting, the
           | likes of which the article we are discussing right now is
           | attempting to rectify.
        
         | mananaysiempre wrote:
         | And Schonfinkel's combinatory logic ten years earlier (we _are_
         | talking priority here, right?) was even more awkward.
         | 
         | There's also the point (mentioned _e.g._ by Wadler in his
         | "Propositions as types" talk) that Godel didn't actually
         | realize how pervasive universal computation was and actively
          | pushed back against Turing proclaiming the equivalence. This is
         | not to accuse him--he wasn't particularly obstinate or unfair
         | about it and, furthermore, came to understand the idea fairly
         | quickly. But it's not at all uncommon in science for the one
         | who actually invented a thing to fail to realize what it was
         | that they just invented, and somebody else comes years or in
         | some unfortunate cases decades later and announces the founding
         | of a new area of knowledge on that already-discovered site.
         | Whom the later naming will associate with the discovery is a
         | toss-up, but it's generally fair and in good taste, I think, to
         | mention both.
        
         | dralley wrote:
         | Likewise, people dispute that Ada Lovelace was the first
         | programmer, because Babbage and Menabrea had previously created
         | a few simple example programs.
         | 
         | But that downplays her accomplishments too much. She didn't
         | write the "first program" but she was the first to understand
         | what computers would be capable of doing (for example, that by
         | assigning numbers to letters and symbols, computers could do
         | more than simply perform numerical computations), and she was
         | the first to invent foundational control flow structures such
         | as loops and conditionals. Her program was much more rigorously
         | defined and sophisticated than any previous examples.
         | 
         | >The longest program that Menabrea presented was 11 operations
         | long and contained no loops or branches; Lovelace's program
         | contains 25 operations and a nested loop (and thus branching).
         | 
         | https://twobithistory.org/2018/08/18/ada-lovelace-note-g.htm...
         | 
         | https://writings.stephenwolfram.com/2015/12/untangling-the-t...
         | 
         | https://projectlovelace.net/static_prod/img/Diagram_for_the_...
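          | (For context on what that program computed: Note G tabulated
          | Bernoulli numbers. A hedged modern sketch using the standard
          | recurrence, not Lovelace's exact operation sequence:)

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n+1 Bernoulli numbers (convention B_1 = -1/2), via
    B_m = -(1/(m+1)) * sum_{k<m} C(m+1, k) * B_k.
    """
    B = []
    for m in range(n + 1):
        if m == 0:
            B.append(Fraction(1))
        else:
            s = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
            B.append(-s / (m + 1))
    return B

print([str(b) for b in bernoulli(4)])  # ['1', '-1/2', '1/6', '0', '-1/30']
```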
        
           | haihaibye wrote:
           | People that say Ada was the first programmer must think
            | Babbage came up with the first general-purpose computer and
            | then never wrote any instructions for it.
           | 
           | Maybe the first programmer who wasn't also a hardware
           | engineer.
        
           | sebastialonso wrote:
           | Jeremy Campbell's The Grammatical Man would like a word with
            | you on Lovelace's actual documented contributions.
        
           | coldtea wrote:
            | > _She didn't write the "first program" but she was the
           | first to understand what computers would be capable of doing_
           | 
           | Or merely the first to express it? I'm pretty sure Babbage
           | himself, as the inventor, understood well what computers
           | would be capable of doing.
        
             | chmod600 wrote:
             | I'm not sure that inventors always understand the
             | consequences of their inventions. Often, they are either
             | focused on first-order capabilities and neglect the larger
             | significance; or focused on visions of the future but
             | unable to turn them into useful products in the short term.
        
           | HideousKojima wrote:
           | Since the notes attributed to Lovelace were written as part
           | of some scribe work she was doing for Babbage, what
           | indication is there that the notes are her own original work,
           | and not something that Babbage asked her to write? Don't get
           | me wrong, she was clearly a very intelligent woman.
        
           | VBprogrammer wrote:
           | Defining the first person to do anything is almost futile. No
           | one exists in a vacuum and most first were standing on the
           | shoulders of technological accomplishments far outside of
           | their own field.
           | 
           | That said, I'm sure in the case of Ada Lovelace there is at
           | least some element of my misogyny involved.
        
             | VBprogrammer wrote:
             | Awkward typo.
        
           | jhgb wrote:
           | > because Babbage and Menabrea had previously created a few
           | simple example programs.
           | 
           | That almost sounds to me like saying that the Wright brothers
           | "made a few simple flights".
           | 
           | > first to invent foundational control flow structures such
           | as loops
           | 
           | I wonder how sigma notation fits into this. Clearly the
           | notion of expressing arbitrarily repeated operations using a
           | fixed amount of information (which is what a loop is,
           | essentially) was known at least to Euler.
           | 
           | Also, the fact that the machine enabled these things in the
           | first place (unlike even some of the later machines such as
           | Z3) suggests that its designer was either aware of this
           | necessity to begin with, or at the very least in possession
           | of preternatural prescience. In that case the use of these
              | features in some programs but not in others would not be a
           | matter of inventing them in the former programs but instead a
           | matter of choosing to exploit existing hardware features, or
           | declining to do so, depending on what program you're looking
           | at.
        
             | bhaak wrote:
             | > I wonder how sigma notation fits into this. Clearly the
             | notion of expressing arbitrarily repeated operations using
             | a fixed amount of information (which is what a loop is,
             | essentially) was known at least to Euler.
             | 
             | You can even go further back. Algorithms with loops were
             | known already to Babylonian mathematicians. So you don't
             | need to resort to preternatural prescience.
             | 
             | The Z3 was not intended as a general computing device but
             | as a practical help for engineers. Because of that you
             | can't say it was missing something it didn't need to do its
              | job. Whereas when Zuse designed Plankalkul, loops and
              | conditional branches were naturally included in the
             | design.
        
             | teruakohatu wrote:
             | > That almost sounds to me like saying that the Wright
             | brothers "made a few simple flights".
             | 
              | Richard Pearse gets written off in the same way to elevate
              | the Wright brothers' flying accomplishments.
             | 
              | Pearse was just pursuing powered flight as a hobby in rural
             | New Zealand, didn't bother informing the press and didn't
             | bother even telling the government until WWII, 40 years
             | later, about his flights and engineering designs.
             | 
             | https://en.wikipedia.org/wiki/Richard_Pearse
        
           | dilap wrote:
           | I really enjoyed Stephen Wolfram's mini-bio of her.
           | 
           | https://writings.stephenwolfram.com/2015/12/untangling-
           | the-t...
           | 
              | I very much recognized from it that she had the attitude
           | and experience of a "programmer," so I would say she was the
           | first programmer, in the modern sense.
        
             | extr wrote:
             | Wow, thanks for the link. Really interesting story,
             | fascinating to think about what could have been if she
             | hadn't died so young.
        
               | e12e wrote:
               | "It is desirable to guard against the possibility of
               | exaggerated ideas that might arise as to the powers of
               | the Analytical Engine. In considering any new subject,
               | there is frequently a tendency, first, to overrate what
               | we find to be already interesting or remarkable; and,
               | secondly, by a sort of natural reaction, to undervalue
               | the true state of the case, when we do discover that our
               | notions have surpassed those that were really tenable.
               | 
               | The Analytical Engine has no pretensions whatever to
               | originate anything. It can do whatever we know how to
               | order it to perform. It can follow analysis; but it has
               | no power of anticipating any analytical relations or
               | truths. Its province is to assist us in making available
               | what we are already acquainted with. This it is
               | calculated to effect primarily and chiefly of course,
               | through its executive faculties; but it is likely to
               | exert an indirect and reciprocal influence on science
               | itself in another manner. For, in so distributing and
               | combining the truths and the formulae of analysis, that
               | they may become most easily and rapidly amenable to the
               | mechanical combinations of the engine, the relations and
               | the nature of many subjects in that science are
               | necessarily thrown into new lights, and more profoundly
               | investigated. This is a decidedly indirect, and a
               | somewhat speculative, consequence of such an invention.
               | It is however pretty evident, on general principles, that
               | in devising for mathematical truths a new form in which
               | to record and throw themselves out for actual use, views
               | are likely to be induced, which should again react on the
               | more theoretical phase of the subject. There are in all
               | extensions of human power, or additions to human
               | knowledge, various collateral influences, besides the
               | main and primary object attained." -- Ada Lovelace, 1842
               | http://www.fourmilab.ch/babbage/sketch.html
        
             | chagen wrote:
             | Ada Lovelace's father was Lord Byron?! TIL
        
               | teruakohatu wrote:
               | Sadly for her she never knew him. The parents split up,
               | he left England and she stayed with her mother. Byron
               | died when she was just eight years old.
        
               | rg111 wrote:
               | I was surprised to learn that as well.
               | 
               | I learned about it in Walter Isaacson's _Innovators_.
        
               | toyg wrote:
               | Look for Sydney Padua's comics for a lot of weird and
               | strange facts about Lovelace and Babbage. (To be fair,
               | Babbage was much weirder)
        
       | tomxor wrote:
        | I started reading "The Computer and the Brain", derived from the
        | unperformed Silliman Lectures by John von Neumann.
       | 
       | The foreword of this small book is almost as large as the main
       | content and does an excellent job of contextualising it in terms
       | of more recognisable modern technology; but also in terms of the
       | contributions, the cyclical relationship between contributors to
        | the field (Church, Turing, von Neumann) and how they inspired and
        | fed off each other. Admittedly the writer has a bias towards von
        | Neumann's contributions, yet it is still clear that although he
        | was clearly an incredibly smart, inventive and forward-thinking
        | individual ahead of his time - the parts that happened to land
        | in his lap were a product of collaboration between many brilliant
       | people.
        
       | jopsen wrote:
       | It's not necessarily about being the first to discover a result.
       | 
       | It's about being able to communicate it clearly. Being concise
       | and presenting relatable models is important.
       | 
        | Results have limited value on their own. If they are not
        | presented clearly and concisely, you can't reach a wide
        | audience.
        
       | [deleted]
        
       | RcouF1uZ4gsC wrote:
       | > In 1935, Alonzo Church derived a corollary / extension of
       | Godel's result by showing that Hilbert & Ackermann's famous
       | Entscheidungsproblem (decision problem) does not have a general
       | solution.[CHU] To do this, he used his alternative universal
       | coding language called Untyped Lambda Calculus, which forms the
       | basis of the highly influential programming language LISP.
       | 
       | >In 1936, Alan Turing introduced yet another universal model
       | which has become perhaps the most well-known of them all (at
       | least in computer science): the Turing Machine.[TUR]
       | 
        | Seems like even then, people embraced the imperative model
        | (Turing) over the functional model (Church).
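The functional model Church used can be sketched in any language with first-class functions. As a toy illustration (my own, not from the article), Church numerals encode a number n as "apply a function n times":

```python
# Toy sketch of untyped lambda calculus ideas in Python
# (illustrative only -- not from Church's paper or the article).
# A Church numeral n is the function that applies f to x n times.

zero = lambda f: lambda x: x                      # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how often f is applied."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
```

Here `to_int(add(two)(two))` evaluates to 4: arithmetic emerges purely from function application, which is the sense in which the lambda calculus is a universal coding language.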
        
       | timeoperator wrote:
       | Agreed. Steve Jobs invented the first computer.
        
       | mark_l_watson wrote:
        | The author complained loudly that he didn't get sufficient credit
        | for some of his early work in deep learning. Then he received
        | credit, but keeps touting his early contributions. I know of no
        | one else in the field who spends so much energy on the past,
        | rather than on present and future work. To be honest, his trying
        | to bring Alan Turing down a notch annoys me.
        
       | teknopaul wrote:
        | I smell a rat. There are damn good reasons Turing has been
        | posthumously awarded prizes and accolades. A fact I doubt the
        | author of this article is ignorant of.
        | 
        | It makes a lot of sense that kids of the next generation grow
        | up with Alan Turing and Ada Lovelace as the heroes of the
        | computer revolution.
       | 
       | Naturally many other people (perhaps you too one day) have made
       | contributions to the world's collective computer knowledge.
        
       | jgrahamc wrote:
        | There's some confusion towards the end about Enigma and Colossus:
       | 
       |  _However, his greatest impact came probably through his
       | contribution to cracking the Enigma code, used by the German
       | military during the Second World War. He worked with Gordon
       | Welchman at Bletchley Park in the UK. The famous code-breaking
       | Colossus machine, however, was designed by Tommy Flowers (not by
       | Turing). The British cryptographers built on earlier foundational
       | work by Polish mathematicians Marian Rejewski, Jerzy Rozycki and
       | Henryk Zygalski who were the first to break the Enigma code (none
       | of them were even mentioned in the movie). Some say this was
       | decisive for defeating the Third Reich._
       | 
        | Yes, Turing worked on Enigma and the Bombe to automate breaking
        | of the code. However, Colossus wasn't for Enigma (it was for
        | Tunny) and Turing didn't work on it. This paragraph seems
        | confused about which is which.
       | 
       | Also, the fact that the Polish who did the original work weren't
       | mentioned in the movie is just one of many things horribly wrong
       | with that movie. It's so bad it shouldn't be considered canon.
        
         | ekianjo wrote:
         | > It's so bad it shouldn't be considered canon.
         | 
         | Since when are movies supposed to be accurate historical
         | references? They are made to be entertaining, so facts get
         | kicked out of the door from Day 1.
        
           | mannykannot wrote:
           | It is not like you cannot tell a good story here without
           | embellishing and distorting it.
           | 
           | As it happens, Verity Stobb panned the movie (justifiably,
           | IMHO), in her splendidly British style, for much more than
           | just getting the facts wrong.
           | 
           | https://www.theregister.com/2015/01/26/verity_stob_turing_mo.
           | ..
        
         | vmilner wrote:
         | It seems a bit surprising not to mention specifically the
         | harder _naval_ Enigma and the U-boat threat in the context of
         | Turing.
        
         | cma wrote:
         | > Yes, Turing worked on Enigma and the Bombe
         | 
         | Since it wasn't linked, Bombe was based on the Polish Bomba
         | machine:
         | 
         | https://en.wikipedia.org/wiki/Bomba_(cryptography)
        
         | rincewind wrote:
         | Canon? Isn't Turing being canonised the problem in the first
         | place?
        
           | RicoElectrico wrote:
           | In a fictional universe a medium (book, movie etc.) being
           | "canon" is the original author's seal of approval claiming
           | "this happened".
           | 
            | I think this is a joking extrapolation to real life, where
            | the "author" is, in the best case, real life itself, or,
            | as an approximation, historians' position based on the
            | evidence.
        
             | jgrahamc wrote:
             | Correct. That's what I meant.
        
         | jagrsw wrote:
          | > Polish mathematicians Marian Rejewski, Jerzy Rozycki and
          | > Henryk Zygalski who were the first to break the Enigma
          | > code
         | 
          | Around 2010 I was at the Computer History Museum in Mountain
          | View, CA. There was some exhibition related to Enigma there,
          | or maybe an Enigma device on display (I don't remember
          | exactly what it was now).
          | 
          | The person who toured us around told us a brief story of how
          | Enigma was broken, starting with Bletchley Park. My friend
          | or I asked who was the first to break Enigma, and he
          | immediately answered that it was Turing, then noticed our
          | puzzled expressions, and added something along the lines of
          | 'ah.. yeah.. and the Polish did some minor work too' :).
          | Just an anecdote.
        
       | tiagoleifert wrote:
       | This article is super misleading.
       | 
        | He first says that Church was Turing's advisor without noting
        | that Church became Turing's advisor only after both of them
        | had independently solved the Entscheidungsproblem.
        | 
        | Also, it was only after Turing became aware of Church's
        | solution that he wrote an appendix to his paper citing Church
        | and showing that both solutions were equivalent. And even
        | though the solutions were equivalent, the techniques employed
        | were very different. Church even stated that Turing machines
        | were more intuitive and praised Turing's work.
       | 
        | About Turing citing Godel, the problems they solved were
        | related but not the same. To put it informally, Godel showed
        | that in every useful axiomatic system there would be theorems
        | that could neither be proved nor disproved. To do this, he
        | codified math and constructed such a theorem. Turing, on the
        | other hand, showed that there is no general way of deciding
        | whether a theorem is true, false, or unprovable. To do this,
        | Turing defined a model of computation that we now call Turing
        | machines.
       | 
       | Both Turing and Church defined the first two (and equivalent)
       | models of computation. After this, many new models emerged and
       | none of them were shown to be more powerful (in relation to what
       | it computes) than Turing machines or Lambda Calculus. That is why
       | we now believe in what we call the Church-Turing thesis:
       | Everything that is computable is computable by a Turing machine.
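The undecidability claim above is easiest to feel through Turing's diagonal argument. Here is a hedged Python sketch (my own illustration; `halts` is a hypothetical decider, not a real function):

```python
# Sketch of the diagonal argument behind the Entscheidungsproblem
# result (illustrative; a total, correct `halts` cannot exist).

def halts(prog, arg):
    """Hypothetical decider: would prog(arg) terminate?

    No implementation can be both total and correct -- that is
    the theorem, so this placeholder just raises.
    """
    raise NotImplementedError("no such total decider exists")

def paradox(prog):
    """Do the opposite of whatever `halts` predicts about prog(prog)."""
    if halts(prog, prog):
        while True:       # halts says it halts -> loop forever
            pass
    # halts says it loops -> halt immediately

# paradox(paradox) halts exactly when halts(paradox, paradox)
# says it doesn't -- contradictory either way, so `halts` is
# impossible as a total, correct function.
```

The contradiction in the final comment is the whole proof: any proposed decider can be fed this self-referential construction.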
       | 
        | I see many people in the comments praising the article because
        | of the number of citations. This is not a good metric for
        | judging one's work. I do not have time to give a reference for
        | everything I said above, but if one is curious, they can find
        | all of it in Charles Petzold's book The Annotated Turing. If
        | one is interested in this topic, I also recommend the Stanford
        | Encyclopedia of Philosophy and the book Godel's Proof.
       | 
        | About the rant the article makes about over-attributing a
        | person's contributions, I can only say that this is a problem
        | that has no solution. One example I like to give is Wiles'
        | proof of Fermat's Last Theorem. We say that Wiles proved
        | Fermat's Last Theorem, which is true, but we may also say that
        | Wiles "only" did the last step in proving it; before him, many
        | other people tried, advanced and contributed to our
        | understanding of the problem so that Wiles could finally prove
        | it. Personally, I know that every advance in science is made
        | on the shoulders of giants, and I do not bother with
        | attribution, since this is a topic we could discuss all day
        | without going anywhere. About the particular example of
        | Turing: I am a computer scientist working in the theory of
        | computation and computational complexity, familiar with the
        | history and pioneering work done by Turing, and I do not
        | believe he is oversold, neither in the academic community nor
        | in popular culture.
        
       | jmartrican wrote:
        | Honest question here: how much of a role do CS theories play
        | in the history of the practical computers we use today? For
        | example, if we play back the history of the modern computers
        | we use from the perspective of the CPU (with its instruction
        | set), RAM, disk, high-level languages, and operating
        | systems... do they have a foundation in CS theories? Or do
        | these constructs come from a more practical engineering basis,
        | devoid of theory?
        
       | fmajid wrote:
       | His cryptographic achievements during WW2 were also vastly
       | oversold. Most of the theoretical breakthroughs needed to crack
       | Enigma were made by Polish mathematicians, but it was more
       | palatable for the British government to put a Brit forward, just
       | as they did for penicillin.
        
         | rich_sasha wrote:
          | I think this is true of Enigma as of the start of WWII. But
          | Enigma was modified many times over the duration of the war,
          | needing not just "number crunching" but genuinely new
          | methods.
         | 
         | So Enigma was genuinely re-broken, a few times, at Bletchley
         | Park, indeed by an all-British team, but yes, easy to forget
         | the little people who did all the initial work.
        
           | bobthechef wrote:
           | "Little people" seems like a strange choice of words. In this
           | case, these were foundational contributions.
        
           | ellimilial wrote:
            | The team being 'all-British' for obvious security reasons.
            | Which I imagine might have felt like adding insult to
            | injury for the 'little people', who, despite cracking the
            | code, were not permitted to continue working on it. Making
            | them, you know, 'little people'.
        
         | r_c_a_d wrote:
         | The brits didn't put anyone forward at all, they kept it secret
         | for as long as they could and long after Turing's death.
         | 
         | There is now a memorial at Bletchley Park to the Polish
         | mathematicians who worked on Enigma.
        
           | mhh__ wrote:
           | Indeed, we basically shoved Colossus in a warehouse
           | (scientifically at least - I assume they were used for more
           | cold war stuff)
        
         | billyruffian wrote:
          | The work at Bletchley wasn't declassified until the 70s. The
          | house and the site were a near ruin when I visited in the
          | 90s. Hardly a government promoting anyone or anything. There
          | is some, quite rightly, national disgust at the way Turing
          | was treated, which probably plays into his myth.
        
       | simorley wrote:
       | If hollywood made a movie about it, then it most likely is
       | oversold.
       | 
       | > especially in the Anglosphere.
       | 
       | What does he expect. We've always done this. Whether it is with
       | thomas edison, the wright brothers, etc. Each nation/ethnic group
       | oversells their achievements and minimizes others. We also did
       | that in terms of gender/race. Women and minorities have been
       | undersold.
       | 
        | The germans do the same thing. Gutenberg didn't invent
        | movable type. It existed in china, korea, etc. for centuries
        | before gutenberg.
       | 
       | It's human nature. All nations/groups do this. Imagine how native
       | americans feel when we tell them columbus discovered the
       | americas.
       | 
        | Everything the author wrote appears valid though. Turing
        | didn't invent the computer. He didn't create computer science.
        | He didn't win ww2. What he did was solve Hilbert's decision
        | problem (which church did before turing) in a very interesting
        | manner. With or without Turing, we'd still have computers,
        | we'd still have computer science and we'd still have won ww2.
       | 
       | But sadly, the masses need heroes and heroes are
       | created/manufactured by their advocates. Turing for a variety of
       | reasons has more advocates than church, godel, etc. And
       | unfortunately for the writer, it's an anglo world and our message
       | will be the loudest and most visible.
       | 
        | The author states the truth but he is a lone voice in the
        | wilderness. A few thousand will probably read his factual
        | article while millions will watch the misleading "The
        | Imitation Game" and read the misleading articles from the
        | media. It must be frustrating for the author, but truth is at
        | a disadvantage to fiction here.
        
       | amelius wrote:
       | Steve Jobs is also oversold.
       | 
       | Same for Elon Musk, etc.
        
       | helsinkiandrew wrote:
       | You can say something similar about a lot of inventions or
       | discoveries that happened in a time when many others were working
       | in the same area. Light bulbs, powered flight, calculus, the EMF
       | laws etc.
       | 
       | History seems to like a single origin story.
        
         | flohofwoe wrote:
         | Not just history and not just inventions (ok, this quote is
         | pretty much history by now too):
         | 
         | "There's a tendency among the press to attribute the creation
         | of a game to a single person," says Warren Spector, creator of
         | Thief and Deus Ex.
         | 
         | I guess whoever wrote that line didn't even get the irony.
         | 
         | https://www.ign.com/articles/2001/11/12/deus-ex-2
        
       | kryptiskt wrote:
        | Eh, he worked on important problems and made significant
        | contributions. It's a rare case that scientists are actually
        | oversold, and that's mostly a matter of the public mistaking
        | excellent popularizers of science for top scientists. It might
        | seem that some get too much attention, but that is because
        | scientists in general aren't known at all and are undeservedly
        | obscure. Pulling down an icon wouldn't help them get any more
        | public recognition; it would just leave the field without any
        | publicly known names, like most areas of science (like, what
        | are the big heroes of solar physics?).
        
       | umutisik wrote:
       | Being the first one to claim a formal proof to a theorem is not
       | the most important thing. What's really important is to prove
       | things that improve our understanding, which both Goedel and
       | Turing have done, in the fields of logic and computing
       | respectively.
        
       | rtpg wrote:
       | After watching the Imitation Game, I did some googling/trying to
       | find out how the Bombe worked. I expected it to not be very
       | exact, but I also kinda felt like the entire narrative around
       | that history in the industry was just super off!
       | 
        | - The core mechanisms of the machine for running the Enigma
        | "quickly" came from the Poles
        | 
        | - The machine wasn't even a generalized computer!
       | 
       | I just felt really misled! Perhaps the biggest thing is Turing
       | probably ended up doing good amounts of contributions to the
       | academic/theoretical side and the practical side, but it feels
       | like we are missing opportunities to describe the confluence of
       | so many people's ideas in this period of history to end up at the
       | current "machines that read instructions and then act upon them
       | in a generalized fashion, very quickly".
       | 
        | This article seems to be that, and it's super interesting.
        
         | jgrahamc wrote:
          | The Imitation Game was inaccurate and horrible in every way
          | you look at it.
         | 
         | https://en.wikipedia.org/wiki/The_Imitation_Game#Historical_...
        
           | formerly_proven wrote:
            | What I find so strange about The Imitation Game is that
            | all of this is pretty well-known; anyone who has skimmed
            | the Wikipedia article on Turing and the overview article
            | on breaking Enigma knows that the movie is pretty much
            | complete horseshit. Most of the alterations in the movie
            | removed things that would have made the movie more
            | interesting, instead of the utterly bland story they made
            | up.
        
             | goto11 wrote:
             | Given the movie was a major box office hit _and_ critically
             | acclaimed, I suspect the producers knew what they were
             | doing.
             | 
             | Just don't expect historical accuracy from a Hollywood
             | movie. Cleopatra didn't look like Elizabeth Taylor either.
        
               | HideousKojima wrote:
               | Cleopatra most likely looked like a horribly inbred
               | Greek, seeing as her family tree has a literal circle in
               | it.
        
             | arethuza wrote:
             | I'm struggling to think of any movies that are really
             | historically accurate - the point is to tell a good story
             | to get people to watch it to make a profit.
             | 
             | Edit: I'm Scottish so Braveheart is the obvious example -
             | entertaining movie but wildly inaccurate and even manages
             | to get who/what the term "braveheart" refers to wrong.
        
               | HideousKojima wrote:
               | Gettysburg and its prequel (Gods and Generals). The
               | dialogue and character motivations may or may not be
               | accurate, but the battles it depicts are pretty accurate.
        
               | endgame wrote:
               | Master and Commander (the Russell Crowe film)? While it's
               | a fictional story, I've heard it said that it captures
               | the period extremely well.
        
           | rootbear wrote:
           | When I visited Bletchley Park a few years ago, I got into a
           | conversation with one of the docents about the film and it
           | was clear that they had a very low opinion of it there.
           | Turing deserved a better film.
        
         | iainmerrick wrote:
         | I knew some of the real history beforehand and the movie really
         | annoyed me, so I'm glad to hear you were able to uncover the
         | facts yourself and had a similar reaction!
         | 
         | Among _so much_ that's just plain wrong, I really dislike the
         | insidious idea that Turing's horrible punishment at the hands
         | of the state was wrong _because he was a unique genius and war
         | hero._ No, it was wrong because he was a human being and being
         | gay should not be a crime!
         | 
         | That line of thought makes it harder to argue that no, Turing
         | may have been a genius but wasn't unique, he was just a
         | significant player in a rich field. That doesn't make him any
         | less interesting.
        
           | rtpg wrote:
           | > I really dislike the insidious idea that Turing's horrible
           | punishment at the hands of the state was wrong because he was
           | a unique genius and war hero. No, it was wrong because he was
           | a human being and being gay should not be a crime!
           | 
            | 100% agree, an unfortunate mentality all too present in
            | society, where we tend to build narratives of feeling bad
            | for people because they are exceptional, and not because
            | they are people.
            | 
            | See the classic kids' story of "oh the store tried to kick
            | the hobo out but actually he was a millionaire!!!" How
            | about treating all people like human beings even if they
            | aren't, like... valuable to you.
        
           | johncearls wrote:
           | I think the attraction to the Turing story is that it is a
           | classical tragedy. If what happened to him happened to any
           | gay man, it would be wrong. But since it happened to one of
           | the greatest geniuses of the 20th century, who may have had
           | other breakthroughs that could have pushed mankind forward,
           | it is a tragedy. A tragedy for all mankind. Mankind suffered
           | a huge loss due to its own moral failures.
        
       | mannykannot wrote:
        | Alan Turing's life and work present an interesting case which
        | makes him rather different from most of the heroes of the
        | 'great men' meme.
        | 
        | His halting theorem and work on computability are too abstract
        | and abstruse for most people to recognize their significance
        | (at least until and unless they put their minds to
        | understanding them), but his work in defeating the Nazis, and
        | his subsequent suicide, apparently brought on by what we now
        | regard (rightly) as harmful bigotry, are the essence of drama.
        | His is a modern Campbellian hero's tale begging to be told.
       | 
       | https://en.m.wikipedia.org/wiki/The_Hero_with_a_Thousand_Fac...
        
       | bodge5000 wrote:
        | The problem with technically correct allocation of credit is
        | that to be truly technically correct, it gets very messy very
        | quickly, as all knowledge is built on other knowledge. The
        | credit for founding computer science would be "[absolutely
        | massive list of people] and finally, of course, the one we
        | call Ung, who discovered the wheel".
       | 
        | That might seem pedantic, and it is, but you need to define
        | exactly where the line is drawn and, more so, give a good
        | reason why. In fact it's not even that simple: WE need to
        | decide and agree on where the line is drawn, and all of us
        | must agree why. Otherwise one man's pedantry is another's
        | important attribution.
       | 
       | Obviously that's not going to happen anytime soon, so for now,
       | figureheads like Einstein and Turing do the job. And they do
       | certainly deserve credit to some degree. That or we stop giving
       | credit completely, which a). seems like a good way to destroy
       | knowledge and b). isn't going to happen anytime soon.
       | 
        | Edit: As another commenter pointed out, if Einstein or the
        | like were born somewhere else and lived around a different
        | group of people, there's a chance he wouldn't have become a
        | figurehead, or he would have made fewer or more or different
        | discoveries. Therefore, there's a third option for
        | attribution, in which everyone who ever lived up until those
        | discoveries gets equal credit. If I were 60 or so years older,
        | I'd deserve as much credit for the Turing machine as Turing
        | himself. So would you. Of course, this is pretty much as good
        | as no credit to anyone at all, and fixing it again requires a
        | joint agreement on where the line is drawn.
        
         | mbostleman wrote:
         | Agreed. The mainstream needs a single or at least smaller list
         | of heroes that represent a larger effort. Turing is especially
         | suited for this given the oppression he suffered for his sexual
         | orientation combined with the impact of lives saved in WWII as
         | a result of his work.
        
         | 6gvONxR4sf7o wrote:
          | Those aren't the only options. We can give credit without
          | creating mythic heroes. Giving technically correct, precise
          | allocation of credit is messy, you're right. But so is
          | defining what 'tall' means, so the precision is beside the
          | point. You _don't_ need to define exactly where the line is
          | drawn.
         | 
         | It reminds me of voting systems, but maybe that's just because
         | of the election yesterday. If you want to give singular
         | nontransferrable credit, the things you say are important
         | because giving someone credit takes it away from someone else.
         | Division and fighting become the right answers. But if you
         | spread the credit around, saying Leibniz and Newton both get
         | calculus credit (and probably not just those two!), then
         | discussions of which one should get the title of The One And
         | Only Calculus Hero just seems absurd.
        
           | bodge5000 wrote:
            | Defining tall doesn't matter so much, a) because we have a
            | precise measure of height and b) because we all know and
            | fully understand that the definition of tall is completely
            | subjective. You might say the same thing about credit, but
            | then you also need to accept that a lot of people will
            | gravitate towards these mythic heroes as the appropriate
            | recipients of credit.
           | 
           | The problem with my wheel example is that it demonstrates the
           | absurdity of trying to assign credit to anyone involved, but
           | it doesn't quite demonstrate how difficult it is to even draw
           | a rough line. Does the inventor of the abacus get credit in
           | the creation of computer science? The discovery of
           | electricity? And what about all the (comparatively) minor
           | inventions and discoveries off the back of those?
           | 
           | As far as I can see it, it either needs to be completely
           | subjective, or there needs to be a line. Maybe it doesn't
           | need to be incredibly specific, although at that point some
           | subjectivity creeps back in
        
       ___________________________________________________________________
       (page generated 2021-09-15 23:01 UTC)