[HN Gopher] A brief history of computers
___________________________________________________________________
A brief history of computers
Author : zdw
Score : 159 points
Date : 2023-07-22 13:50 UTC (9 hours ago)
(HTM) web link (www.lesswrong.com)
(TXT) w3m dump (www.lesswrong.com)
| kalverra wrote:
| Not specifically computers, but if you want a very deep dive into
| the creation of the internet (including some bits about the
| earliest computers), The Dream Machine is a great and extensive
| look at the history of the internet through the lens of J.C.R.
| Licklider's life. It was rather mind-blowing to me in various
| ways, one of the big ones being that it seems a lot of early
| computer pioneers weren't only mathematicians and physicists, but
| also psychologists.
| nappy wrote:
| Agreed. It's an excellent book. But perhaps a little long if
| you are purely interested in computer history and want an
| introduction in a shorter volume. I recommend these two:
| https://en.wikipedia.org/wiki/The_Idea_Factory
| https://www.amazon.com/Dealers-Lightning-Xerox-PARC-Computer...
| olooney wrote:
| Maybe mention Pascal[1] and Leibniz[2] as important predecessors
| to Babbage?
|
| [1]: https://en.wikipedia.org/wiki/Pascal%27s_calculator
|
| [2]: https://en.wikipedia.org/wiki/Stepped_reckoner
| chubot wrote:
| One interesting question (which is very hard to answer) is how
| many of the ideas were passed down vs. re-invented, and how much
| theory influenced practice:
|
| - I think Mauchly and Eckert (of ENIAC) in the 1940's were
| unaware of Babbage (1810's)
|
| - There was the (in)famous von Neumann paper describing the ENIAC
| and patent lawsuit -
| https://www.historyofinformation.com/detail.php?id=639 - this
| page says that "most likely" Von Neumann and Mauchly/Eckert
| developed it together
|
| https://en.wikipedia.org/wiki/Honeywell,_Inc._v._Sperry_Rand....
|
| Hm, these sources are a bit vague -- my memory is that "The Dream
| Machine" was more critical of von Neumann. Basically it ended up
| that he put his name on work that wasn't really his, or he gave
| that impression.
|
| i.e. the name "von Neumann architecture" doesn't give the proper
| credit
|
| - Did they need Turing or Church to build a real computer?
| Probably not, I guess Babbage also proves that. Computation is
| just "out there"; it's part of nature; it's been discovered many
| times.
|
| - That said, I would guess that Boolean logic is the most
| important theory/math in building a computer, though Babbage
| didn't have that either!!!
| dboreham wrote:
| Most written history is "bunk" in that it's a long-after
| interpretation of events in the context of what we know today.
| That's why Turing features heavily -- because he's a colorful
| character about which many books have been written and movies
| made that people watch today. But Tommy Flowers is never
| mentioned.
|
| The computer wasn't really invented at all. It evolved from
| earlier things in a step-wise manner. There were computing
| machines for decades prior. E.g. before WW1 there were
| sophisticated gunnery computers that could aim shells, taking
| into account the vector velocity of a ship, wind, distance
| measured optically, and the movement of the attacking ship.
| Boolean logic was
| used in telephone switching systems. Boolean circuits already
| existed both in electromechanical form (relays) and electronics
| (vacuum tubes|valves). So when Turing decided he needed a
| machine to do so and so calculations on some kind of data,
| Flowers didn't need to invent Boolean logic nor design Boolean
| circuits -- those already existed off the shelf. Teleprinters
| existed. Paper tape existed.
| chubot wrote:
| Yeah definitely agree, Taleb has warned us about such
| teleological explanations.
|
| As far as I remember, Woz's biography is good evidence that
| you don't need the idea of "boolean logic" to design
| circuits. That did come after the fact -- the way it's
| taught, not the way it was invented
|
| I think he just said he figured it all out himself
| essentially, and often did a lot better than the pros. Some
| of his claims were suspect to me, but I do think his claimed
| ignorance of prior work is genuine :)
|
| Shannon did come decades earlier though, so the designs of
| somebody who was influenced by Shannon probably influenced
| Woz. It's hard to tease apart, but I agree with "evolution"
| and "tinkering" as the main explanations.
|
| The entertaining explanations are the ones that tend to stick
| in our minds, but they're not necessarily true
|
| ---
|
| The other example I think of is when I look at Debian -- a
| lot of it is not at all what the Unix inventors had in mind.
| Debian/Linux basically looked at an artifact and then came up
| with their own divergent ideas around them
|
| Likewise Woz probably looked at a lot of computers, but he
| didn't have much of an idea what the creators were thinking
| -- just the artifacts themselves
| AsmiKittu wrote:
| [flagged]
| dmvdoug wrote:
| > I'm confused though. Babbage came up with the blueprints for
| his Difference Engine in the 1820s. I know he never succeeded at
| actually building it, but he never really had a fair chance if
| you ask me.
|
| Babbage had a chance to build his difference engine. The problem
| was that the engineering was a lot harder than he thought it was
| going to be, and he was a mathematician/economist, not so much a
| working engineer. The idea that if only he had had a better
| chance the difference engine would've been successful is just
| simply a misreading of what happened. It didn't help that after
| the British government poured tens of thousands of pounds into
| the project Babbage suddenly decided to start pushing analytical
| engine before he had even finished the difference engine. That
| made it look like these were just wild, cockeyed schemes, when
| Babbage was supposed to be engaged in a practical, mechanical
| calculating project (to help reduce the labor expended on
| computing, for example, navigational tables).
| lproven wrote:
| A Swedish family, the Scheutzes, finished a working difference
| engine within a few years after Babbage gave up, and sold
| machines profitably for decades.
|
| https://en.wikipedia.org/wiki/Per_Georg_Scheutz
|
| It wasn't too hard for the time... Babbage just kept getting
| distracted.
| dboreham wrote:
| > The problem was that the engineering was a lot harder than he
| thought it was going to be
|
| This is true of many (most?) innovations. E.g. the steam
| engine: people knew about steam power, and built primitive
| steam engines. Watt succeeded eventually in manufacturing one
| that had the right mix of reliability, power, cost,
| maintainability to be widely useful. E.g. the jet engine :
| Whittle conceived of turbine aircraft power during WW1, but
| didn't succeed in manufacturing a viable engine and putting it
| in a plane until the end of WW2.
| ghaff wrote:
| A lot of similar things come into play if you ask questions
| like "Could the Romans have invented $X?"
|
| Some cultural factors like slavery meant they were less
| interested in e.g. labor-saving inventions. And there
| probably were health and life sciences concepts you could
| introduce--but might have limited ability to prove. But, for
| the most part, there are technology trees that you can't
| really shortcut and, even with the right high-level
| knowledge, it's hard to accelerate things too much.
| jazzyjackson wrote:
| I read a good description of the affair in James Essinger's
| "Jacquard's Web" - it came down to the machinist cutting the
| gears out of spec; for the gear trains to function smoothly they
| would have needed precision not achieved for decades. The Royal
| Society was willing to fund the project but it became a
| bottomless money pit as Babbage kept sending the parts back to
| the kitchen so to speak.
| d_silin wrote:
| Reminds me of fusion energy promises, or at least "the
| bottomless money pit". The science is solid, but the
| engineering challenges delayed even a proof-of-concept
| experiment literally until the next century.
| nemo wrote:
| >It's not what Gödel was hoping for. Gödel was hoping to add some
| support to that shaky Jenga piece at the bottom so that they
| could continue building their beautiful tower taller and taller,
| consisting of structures ever more elegant.
|
| Gödel was a Lutheran Platonist who was personally and morally
| opposed to logicism and the mathematical program of his
| contemporaries. He was an odd man, really, but he was in no
| respect a booster of, or a person working to promote, Hilbert's
| program. He was tearing it down very deliberately.
| Sharlin wrote:
| Boole didn't introduce propositional logic; what he did was come
| up with an _algebra_ that encodes propositional logic.
|
| Abstract algebra was a new and developing thing back then, the
| idea that you can generalize from numbers and addition and
| multiplication to other structures that have something like
| numbers and addition and multiplication.
|
| Boole found that if you take the two-element set {0, 1} and
| choose saturating addition as the addition-like operation and
| normal multiplication as the multiplication-like operation, you
| get an algebra (a Boolean algebra; with XOR as the addition you
| get a ring instead) that is isomorphic to propositional logic,
| with addition playing the role of OR and multiplication of AND.
|
| So the idea that the number 1 can represent true and the number 0
| false was Boole's insight and the foundation of modern digital
| circuits.
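Boole's construction is small enough to check exhaustively. Here is a minimal Python sketch (function names like `b_or` are mine, purely illustrative): saturating addition plays OR, ordinary multiplication plays AND, and the usual propositional identities hold on {0, 1}.

```python
def b_or(a, b):
    """Saturating addition on {0, 1}: behaves like logical OR."""
    return min(a + b, 1)

def b_and(a, b):
    """Ordinary multiplication on {0, 1}: behaves like logical AND."""
    return a * b

def b_not(a):
    """Complement: 1 - a behaves like logical NOT."""
    return 1 - a

# Check a few propositional identities over every input.
bits = (0, 1)
for a in bits:
    for b in bits:
        # De Morgan: NOT(a OR b) == (NOT a) AND (NOT b)
        assert b_not(b_or(a, b)) == b_and(b_not(a), b_not(b))
        for c in bits:
            # Distributivity: a AND (b OR c) == (a AND b) OR (a AND c)
            assert b_and(a, b_or(b, c)) == b_or(b_and(a, b), b_and(a, c))
```

Since the carrier set has only two elements, every law of propositional logic can be verified by brute force like this.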
| javajosh wrote:
| And actually Boole is only dealing with two distinguishable
| states. Any two distinguishable physical states can be used to
| define binary 1 and 0. But in general, any N distinguishable
| states can be used to form an algebra. N=2 is just the
| simplest, most elegant, and what most real computers are based
| on. I've heard rumors that an N=3 computer exists, aka
| 'ternary' (EDITed) but I have my doubts. On the math side you
| can turn the integers into a discrete N-part thing with
| modular arithmetic.
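The "discrete N-part thing with modulus" point can be sketched in a few lines. N = 3 here is an arbitrary choice to match the ternary example; any N works the same way.

```python
N = 3  # three states: 0, 1, 2 -- the value set of a ternary digit

def add_mod(a, b):
    """Addition wrapped around modulo N."""
    return (a + b) % N

def mul_mod(a, b):
    """Multiplication wrapped around modulo N."""
    return (a * b) % N

# Addition table for the 3-state system:
for a in range(N):
    print([add_mod(a, b) for b in range(N)])
# → [0, 1, 2]
#   [1, 2, 0]
#   [2, 0, 1]
# Each row is a permutation of {0, 1, 2}: the system is closed,
# so N-state "digits" compose just like binary bits do.
```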
| helf wrote:
| [dead]
| grahamlee wrote:
| No, Boole was dealing with probabilities. The first half of
| his _An Investigation of the Laws of Thought_ is all ones and
| zeroes, but the second half admits any value in between.
| ccppurcell wrote:
| The word is ternary, and Ternary Computer is a Wikipedia page
| you could read if you're interested.
| dreamcompiler wrote:
| Boole figured out Boolean Algebra, but nobody paid attention
| until Claude Shannon realized we could use Boolean Algebra to
| design digital circuits. In his _master's thesis_.
|
| Very few master's theses have changed the world, but Claude
| Shannon's was one of them.
| raspyberr wrote:
| Seems pretty reasonable for someone doing their own high level
| research. Notably missing any references to telegrams and
| telephone systems.
| DiscourseFan wrote:
| Yeah, there was a lot of work on logic in both India and the
| West in the medieval period, which was very influential in
| later thought; it's just that what we conceive of as the
| "modern" form of logic only took shape in the 19th century. But
| reading medieval tracts on logic is fascinating.
| jazzyjackson wrote:
| Wish I had a source but I've read Mr and Mrs Boole were both
| involved in the study of Indian mathematics, and that "Bool"
| as a datatype might be more true to its namesake if it
| included "maybe" or "null"
| jll29 wrote:
| Also between the middle ages and the 19th century: e.g. The
| Logic and Grammar of Port Royal were very influential - c.f.
| https://plato.stanford.edu/entries/port-royal-logic/
| DiscourseFan wrote:
| I meant "medieval" here to refer to any time between the
| beginning of neoplatonism and the industrial revolution:
| quite a large space, but it seems like those are the only
| two time periods people are generally aware of in the
| history of thought, since we teach kids that everything
| between then was the "dark ages."
| retrac wrote:
| In terms of causes and forces, the influence of
| telecommunication on computing is really hard to overstate. In
| the Internet era, I think some might be unaware that
| telecommunication definitively came first, and in my view,
| effectively led directly to computers.
|
| From the electrical engineering perspective, the earliest relay
| and vacuum tube computers were built out of elements developed
| for radio and telephone exchanges. Bell Labs developed the
| transistor with their phone system primarily in mind. Same with
| high speed digital data circuits. (Digital audio was demo'd in
| the late 1940s, deployed in the phone network in the late 50s).
|
| It's not just the physical circuits; much of the theory, too.
| Claude Shannon was trying to optimize subnets of switches in
| the phone network, when he proved that binary switching logic
| is equivalent to Boolean algebra and so such systems could be
| described, manipulated, and optimized symbolically.
|
| Similarly, both frequency and time division multiplexing date
| to the late 1800s with telegraphy. One of the first uses of
| vacuum tubes as switching elements was for multiplexing
| telegraph lines c. 1940 or so. (The terminology from that era
| is quite charming - modern Wi-Fi might be described as
| supersonic harmonic multiplexed radiotelegraphy.)
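Shannon's equivalence between switch networks and Boolean algebra can be shown with a toy model (the names here are illustrative, not from any source): switches in series behave like AND, switches in parallel like OR, so a relay network is a Boolean expression in disguise.

```python
def series(*switches):
    """Current flows only if every switch is closed (AND). 1 = closed."""
    return int(all(switches))

def parallel(*switches):
    """Current flows if any switch is closed (OR)."""
    return int(any(switches))

# A small network: (A in series with B), in parallel with C.
def network(a, b, c):
    return parallel(series(a, b), c)

# The network conducts exactly when (A AND B) OR C is true:
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert network(a, b, c) == int((a and b) or c)
```

This is the move that lets a circuit be described, manipulated, and optimized symbolically instead of by trial and error.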
| kitd wrote:
| _From the electrical engineering perspective, the earliest
| relay and vacuum tube computers were built out of elements
| developed for radio and telephone exchanges._
|
| Indeed. A "main frame" was originally the housing used for
| the relay switches in the original telephone exchanges.
| kens wrote:
| I've done a ton of research on this, and "main frame" in
| the computer sense originated with the IBM 701, which was
| built from various frames such as the power frame, the
| storage frame, and the main frame (which performed
| computation). There is a direct (but complicated) path from
| this to the modern meaning of "mainframe". The "main
| distribution frame" in a telephone exchange was unrelated.
| [deleted]
| pjmlp wrote:
| For example, during the 1800s sports results were already
| being telegraphed and printed onto dotted paper.
| citelao wrote:
| What would be a good, more academic overview of computing
| history? Do you have any specific book recommendations? I'd
| love to read a more "citation-based" version.
| nappy wrote:
| Not sure about academic history, but in a single volume, this
| does a good job on early 20th century computer history:
| https://en.wikipedia.org/wiki/The_Idea_Factory
| cfmcdonald wrote:
| IMO the best overall "soup to nuts" survey of the history of
| computers is Campbell-Kelly et al., "Computer: A History of
| the Information Machine."
|
| Specifically on the relation of telecommunications to
| computers I will toot my own horn and recommend my book,
| McDonald, "How the Telegraph, Telephone, and Radio Created
| the Computer."
| mjbrusso wrote:
| The First Computers
|
| History and Architectures
|
| Edited by Raul Rojas and Ulf Hashagen
|
| https://mitpress.mit.edu/9780262681377/the-first-computers/
| 123pie123 wrote:
| no mention of Konrad Zuse or the Z1?
|
| https://en.wikipedia.org/wiki/Z1_(computer)
| jchw wrote:
| > But the biggest thing is probably that it made assemblers and
| compilers possible. Well, I'm not sure if that's strictly true.
| Maybe you could still have them without a shared memory
| architecture. But the shared memory architecture made them a
| whole lot easier.
|
| I think the actual important part is _being able to_ address and
| manipulate code like it's data somehow, rather than the specific
| architecture. Having two separate address spaces for code and
| data doesn't necessarily prevent that, though it's surely simpler
| with only one.
| nappy wrote:
| I don't recommend reading this. There are many gaps and a lot of
| important history missing, including:
|
| 1. Computation before ~1800. Abacus, Napier's Bones, slide
| rules, Pascal's Calculator, motivations from celestial navigation
| and astronomy.
|
| 2. Modern analog computers ~1900-1950. The author seems to refer
| to them as "math machines" and leaves it at that, without
| exploring what they were used for beyond calculating firing
| solutions for artillery. I think the author
| lacks a solid grasp of how mathematical tables were used from
| 1614 onwards, and that analog computers were used to create much
| more accurate and complex tables which could be used for more
| accurate firing solutions. And for other purposes as well, beyond
| code-breaking.
|
| >"It's hard for me to wrap my head around the fact that early,
| pre-general purpose computers (~1890-1950s) weren't computers in
| the way that we think about computers today. I prefer to think
| about them as "math machines"."
|
| >"But subsequent machines were able to do math. From what I'm
| seeing, it sounds like a lot of it was military use. A ton of
| code-breaking efforts during World War II. Also a bunch of
| projectile calculations for artillery fire."
|
| 3. Poor description of the advent of electronic computers.
|
| >"Then in the 1940s, there was a breakthrough.[10] The vacuum
| tube took computers from being mechanical to being electric. In
| doing so, they made computers cheaper, quieter, more reliable,
| more energy efficient, about 10-100x smaller, and about 100x
| faster. They enabled computations to be done in seconds rather
| than hours or days. It was big."
|
| It was certainly a breakthrough, but the idea that computers
| immediately became quieter, cheaper, and more reliable is false.
| They were _much_ larger, initially, compared to analog computers
| of the era. By almost any measure, they were also much less
| efficient with energy, though this may depend on what sort of
| calculations you are doing - I'm less sure of this.
|
| 4. Incomplete and incorrect descriptions of programming languages
| and the history of digital logic. No mention of information
| theory and Claude Shannon, digital circuits.
|
| This is a poor analogy that misleads a reader who is unfamiliar
| with programming languages, it obscures the abstraction:
|
| >"Think of it like this. It's translating between two languages.
| Assembly is one language and looks like this: LOAD R1, #10.
| Machine code is another language and looks like this:
| 10010010110101010011110101000010101000100101. Just like how
| English and Spanish are two different languages."
|
| 5. Lack of understanding of digital hardware.
|
| The author never describes why or how vacuum tubes and then
| transistors allowed computers to use logic that is both _digital_
| and _electronic_.
|
| The author jumbles a lot of ideas into one and does not seem to
| understand the relationship and distinction between the evolution
| of transistor technology (point-contact -> BJT -> FET -> MOSFET)
| and the creation of integrated circuits.
|
| >"Before 1966, transistors were a thing, but they weren't the
| transistors that we imagine today. Today we think of transistors
| as tiny little things on computer chips that are so small you
| can't even see them. But before 1966, transistors were much
| larger. Macroscopic. Millimeters long. I don't really understand
| the scientific or engineering breakthroughs that allowed this to
| happen, but something called photolithography allowed them to
| actually manufacture the transistors directly on the computer
| chips."
|
| 6. Lack of historical context. No mention of the motivations for
| creating the vacuum tube or transistor: amplification and
| switching for use in telegraph and phone networks. No mention of
| the role the US government played beyond the 1860 Census, no
| mention of continued investments motivated by the Cold War,
| Apollo Program, ICBMs, etc. They briefly cover artillery firing
| solutions and mention code-breaking.
|
| 7. Over-reliance on LLMs to research and write this.
|
| Hard to take a history which includes this seriously:
|
| >"And from what ChatGPT tells me, it's likely that this would
| have been an investment with a positive ROI. It'd make the
| construction of mathematical tables significantly faster and more
| reliable, and there was a big demand for such tables. It makes
| sense to me that it'd be a worthwhile investment. After all, they
| were already employing similar numbers of people to construct the
| tables by hand."
|
| >"Anyway, all of this goodness lead to things really picking up
| pace. I'm allowed to quote Claude, right?"
| kaycebasques wrote:
| The sections on the super early years were great. A lot of ideas
| and perspectives in there that I have not heard before. The later
| sections personally didn't do as much for me, but that's probably
| only because I am more familiar with those topics.
|
| Thank you to the author for creating this! This style of super
| personal historical overview is very enjoyable. I like how the
| author says outright to take everything with a grain of salt and
| I like how they call out things about the narrative that don't
| make sense to them.
| sdfgionionio wrote:
| >And from what ChatGPT tells me, it's likely that this would have
| been an investment with a positive ROI.
|
| Wonderful.
|
| It's interesting to me that, in just a few months, I've already
| developed muscle memory for checking whether or not things I read
| online are machine-generated. The first thing I do on any website
| is search for "GPT", "Bing", and "AI" and stop reading if I find
| them.
|
| Reading someone's writing is an exercise in trust. If they claim
| something, I have to be able to trust that they have done enough
| of their homework to back it up. If they cite a source, I have to
| be able to trust it says what they claim. Otherwise what's the
| point? If I can't rely on the author, then reading their writing
| requires checking everything they've said. Their writing is
| useless to me since I'll need to do my own research anyway.
|
| If you write something and ask me to read it, you are asking me
| to trust that you have done the legwork. If you really just typed
| it into ChatGPT, that's more than just stupid. It's a betrayal.
| gerikson wrote:
| At least this author states openly that they used ChatGPT. In a
| couple of months such honesty will be rare.
| MostlyStable wrote:
| It's weird to me that you are conflating asking GPT a question
| related to your article and writing your article. Would you
| have a similar reaction if he had said "according to google"?
| There does not seem to be any evidence at all that the author
| didn't write this entire article, and the fact that they
| explicitly reference that they consulted GPT on some related
| point seems further evidence that they _didn't_ have GPT write
| it (I think if they _had_ had GPT write it, they would have
| avoided mentioning GPT at all)
| howenterprisey wrote:
| Not them, but no, I wouldn't have a similar reaction if it
| were "according to google", because in the context of a blog
| posted to HN I'd expect it to mean a cursory bit of research,
| which is way better.
| RyanAdamas wrote:
| A decent brief history. Author should look into Vannevar Bush.
| https://en.wikipedia.org/wiki/Vannevar_Bush
| spenrose wrote:
| Agreed. My history emphasized the Bush/memex ->
| Engelbart/Online System -> Alto -> Mac and Engelbart ->
| Berners-Lee/WWW lineages:
| http://whatarecomputersfor.net/machines-for-millions/
| RcouF1uZ4gsC wrote:
| For a more humorous overview of this, take a look at Verity
| Stob's account
|
| https://www.theregister.com/2012/12/22/verity_stob_8086_and_...
| nashashmi wrote:
| Not sure what it is about computer history, but it garners a lot
| of interest among computer science students. Electives on
| computer history have usually been packed classes.
___________________________________________________________________
(page generated 2023-07-22 23:00 UTC)