[HN Gopher] NSA releases 1982 Grace Hopper lecture
___________________________________________________________________
NSA releases 1982 Grace Hopper lecture
Author : gaws
Score : 875 points
Date : 2024-08-26 12:37 UTC (1 day ago)
(HTM) web link (www.nsa.gov)
(TXT) w3m dump (www.nsa.gov)
| petercooper wrote:
| Just watched this and it was fantastic. The first half is much
| like public lectures of hers I've watched before, but the second
| half goes into more depth in a variety of areas that were pretty
| cutting edge for 1982 like cybersecurity, loose
| coupling/modularity in software, VLSI/SoC, and programming
| language standardization.
| philistine wrote:
| I loved her few extremely specific references. She mentions the
| cheapest _computer_ one can buy, the Intel 8021, a chip sold
| for 13 cents apiece if you buy a hundred. That's a great
| visualization of how cheap her system of computers can be.
| 12_throw_away wrote:
| Ok, this got me curious, how much have things changed for
| low-end embedded microcontrollers?
|
| Some numbers from a few minutes of searching:
|
| - Then: Intel 8021: 1 kB ROM, 64 B RAM, 11MHz, about $.40-.50
| in 2024 dollars
|
| - Now: ATtiny25: 1kB ROM, 128 B of RAM, 20 MHz, maybe
| $.70-$.80 each for a huge order
|
| Not sure if this is the right comparison, and I'm sure there
| are lots of other differences that the topline numbers don't
| capture and that I don't know about (e.g, power consumption,
| instruction set, package size, etc. etc.)
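|
| (The "2024 dollars" figure is just a CPI adjustment; a rough
| sketch using approximate annual-average CPI-U values, which are
| my assumption here, not from the thread:)
|
|     price_1982 = 0.13
|     cpi_1982, cpi_2024 = 96.5, 314.0        # approximate CPI-U averages
|     print(price_1982 * cpi_2024 / cpi_1982) # ~0.42, i.e. about 40-50 cents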
| msl wrote:
| You might want to check out The Amazing $1 Microcontroller
| [1] which explores multiple microcontrollers that could be
| had for less than $1 (when buying a hundred of them) in
| 2020. I haven't checked how much the prices have dropped
| since (if at all) but it could still be a good starting
| point when looking for parts at the 8021's price range.
|
| [1] https://jaycarlson.net/microcontrollers/
| danhor wrote:
| An ATtiny really isn't good value for money, except when
| looking for something simple to use.
|
| The CH32V003 is $.10-$.20, quite usable in my experience
| and features a 16 kB ROM, 2 kB RAM and a 32-bit 48 MHz
| RISC-V.
|
| The PMS150 is available for <$.05 with ~1.5 kB ROM, 60 B
| RAM and an 8-bit 8 MHz CPU.
|
| If you're excluding Chinese manufacturers, the STM32G030 is
| sub-$.80 in quantity for 32 kBs of ROM, 8 kBs of RAM and a
| 32-bit 64 MHz ARM CPU.
| fragmede wrote:
| which is to say, an MMU to run a full blown operating
| system, and not a small C program with some interrupt
| handlers.
| kragen wrote:
| none of these have mmus, even the stm32g030 is a
| cortex-m0+
| fragmede wrote:
| dang. thanks for the correction. Looks like the cheapest
| chip with an MMU is the Allwinner F1C100S which is $2.20
| in quantity.
| kragen wrote:
| happy to explore with you!
|
| mmus are a performance hack; they make your memory-
| protected code run faster than if you use a jit compiler
| that inserts memory bounds checks. but suppose running
| code that way costs you a factor of 10x in performance.
| so maybe your 30-dhrystone-mips processor
| (https://www.lcsc.com/product-detail/Microcontrollers-
| MCU-MPU..., say, or https://www.lcsc.com/product-
| detail/Microcontrollers-MCU-MPU... for 2x) performs
| roughly like a 3 dhrystone mips processor. if we believe
| https://netlib.org/performance/html/dhrystone.data.col0.h
| tml that's roughly the performance of a sun 3/160, an
| amiga 2000, or a 40 megahertz amd clone 80386 pc (though
| much slower than an intel 386)
|
| that's much faster than many multiuser machines i've used
|
| the promise of java (and oberon) was that such large
| runtime overhead would be unnecessary with better static
| checking, and j2me and oberon seem to have largely borne
| that out
|
| the bigger issue is i think that cheap microcontrollers
| don't have much ram or off-chip bandwidth. if you want a
| megabyte on-chip, you end up with things like https://www
| .digikey.com/en/products/detail/stmicroelectronic... (a
| 480-megahertz cortex-m7 with 128 kibibytes of flash and a
| mebibyte of ram for usd11.27),
| https://www.digikey.com/en/products/detail/infineon-
| technolo... (a dual-core 150-megahertz cortex-m0+ with 2
| mebibytes of flash and a mebibyte of ram for usd12.92),
| https://www.digikey.com/en/products/detail/nxp-usa-
| inc/MIMXR... (a 600-megahertz cortex-m7 using external
| program memory and a mebibyte of ram for usd14.48), or
| https://www.digikey.com/en/products/detail/renesas-
| electroni... (a 240-megahertz renesas rx72n with 4
| mebibytes of flash and a mebibyte of ram for usd20.85)
|
| i'm pretty sure none of these have mmus either but i
| forget the rx architecture
|
| a whole esp32 module like https://www.lcsc.com/product-
| detail/Development-Boards-Kits_... is cheaper and has
| more ram though. that one has 8 megs of psram and costs
| usd4.93
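|
| (a minimal sketch, with hypothetical names, of the kind of
| guard a jit can insert before each memory access when there's
| no mmu to enforce protection:)
|
|     RAM_SIZE = 4 * 1024
|     ram = bytearray(RAM_SIZE)
|
|     def checked_load(addr: int) -> int:
|         # the inserted software bounds check, standing in for
|         # the page protection an mmu would do in hardware
|         if not (0 <= addr < RAM_SIZE):
|             raise MemoryError(f"out-of-bounds load at {addr:#x}")
|         return ram[addr]
|
| every load and store pays for that comparison and branch, which
| is where the assumed ~10x slowdown above comes from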
| kragen wrote:
| all the following prices are for "under 100 dollars"
| quantities
|
| https://www.lcsc.com/product-detail/Microcontrollers-MCU-
| MPU... 1.5¢, 32 kibibytes in-application-programmable
| flash, 4 kibibytes sram, 48 megahertz, nearly 1 32-bit arm
| instruction per clock
|
| https://jlcpcb.com/partdetail/NyquestTech-NY8A051H/C5143390
| 1.58¢, 1 kibiword otp prom, 48 bytes of ram, 20 megahertz,
| nearly 1 8-bit pic16-like instruction per clock, english
| datasheet https://www.nyquest.com.tw/upload/2024_02_293/NY8
| A051H_v1.6....
|
| https://www.lcsc.com/product-detail/Microcontroller-Units-
| MC... 10.5¢, 1 kibiword otp prom, 60 bytes ram, two
| hardware threads ('fppa') context-switching every cycle so
| you can get better real-time response, 16 megahertz, nearly
| 1 8-bit instruction per clock. english datasheet https://ww
| w.padauk.com.tw/upload/doc/PMC251%20datasheet%20V0...
|
| https://www.lcsc.com/product-detail/Microcontrollers-MCU-
| MPU... 8.7¢, 20 kibibytes flash, 3 kibibytes ram, 24
| megahertz, nearly 1 32-bit arm instruction per clock.
| english datasheet https://download.py32.org/Datasheet/en/PY
| 32F002A%C2%A0datash...
|
| https://www.lcsc.com/product-detail/Microcontrollers-MCU-
| MPU... 12.45¢, 16 kibibytes of flash, 2 kibibytes sram, 24
| megahertz, nearly 1 32-bit risc-v (rv32ec) instruction per
| clock
|
| these are generally much lower power than the 8021, but
| really the place to look for power consumption is ambiq;
| these are all conventional cmos rather than the
| subthreshold logic ambiq uses
|
| they also incorporate a lot more peripherals
| kragen wrote:
| i thought i'd check out the 8021 to see how big the
| difference is. it's bigger than i imagined
|
| https://en.m.wikipedia.org/wiki/Intel_MCS-48 says it's a
| cut-down 8048. the 8048 itself has a max clock speed 11
| megahertz, 15 clocks per machine cycle, with about 70% of
| instructions taking one machine cycle, 30% taking two,
| for about half a mip. not vax mips, tho, an 8-bit mip.
| there's a manual for the chip family at
| https://manualsdump.com/en/download/manuals/intel-
| mcs-48/253...
|
| the biggest omission is that they cut it down from 28 to
| 21 i/o pins and eliminated interrupts, which would be a
| big loss in a modern microcontroller, but it was nmos
| rather than cmos, so you're looking at unholy power
| consumption anyway; 40 milliamps (typ., p. 192/478
| (6-49)) at 5 volts is 200 milliwatts. but they also cut
| the clock speed, the minimal machine cycle time on the
| 8021 is listed as 10 µs (with a 3 megahertz crystal)
| rather than the 8048's 2.5 µs or the 8049's 1.36 µs. so you
| get 0.07 8-bit mips. it uses dynamic logic and dynamic
| ram to save space so you can't clock it at less than 20%
| of that, so you can't do low-power sleep, ever
|
| they also omitted the subtract instruction, which the
| 8048 doesn't have either. i guess you can use cpl, inc,
| add. (i see that what the manual suggests is cpl, add,
| cpl.) there's enough space for multiply and divide
| subroutines but they ain't gonna be fast
|
| also, you can forget about programming an 8021's program
| memory. it's mask-programmable only; you have to do your
| test programming on an 8748 before you place your order
| with intel for a batch of 8021s with a custom silicon
| mask encoding your 1024 bytes of already tested and
| debugged firmware. so those 42-cent[?] prices were
| necessarily in rather large batches. the 8051 and
| 8048/8049 had an 'ea' pin you can pull high to get it to
| execute code from external memory instead, but i don't
| think the 8021 did; the manual says, "no external rom
| expansion capability is provided."
|
| 0.07 8-bit mips is about 0.005 dhrystone mips, although i
| think the 8021 is too small to run dhrystone. the cypress
| chip i linked above is about 60 dhrystone mips at 48
| megahertz, so about 12000 times faster, for about a 25x
| lower price. it also has 4096 bytes of ram instead of 64
| (16x), 32 kibibytes of nonvolatile program memory instead
| of 1 (32x), plus 8 kibibytes of rom, and you can program
| the flash in-application ( _i.e._ , under the control of
| the program it's running). it has a hardware multiplier,
| which is about a 10x additional speedup for dsp type
| stuff. in deep sleep it uses 2.5 microamps. at full speed it's
| a bit of a power hog by current standards, slurping a
| hefty 13 milliamps (at 1.8 volts if you like, so 23
| milliwatts, almost 1/8 the 8021, but if you're in deep
| sleep most of the time you can go another 5000x lower).
| and it's 1.6mmx2mm. and you can program it in c. it has
| only 9 gpio pins, though!
|
| despite nominally being a psoc the cypress chip has no
| analog peripherals, not even a comparator. it does have
| internal oscillators (with no external components), pwm
| generation, i2c, spi, uart, and quadrature input, and its
| gpio pins have seven drive strength modes
|
| so depending on whether cpu speed or memory space is the
| bigger bottleneck for your application, price/performance
| has improved between 400x and 300000x since the 8021. for
| things that were constrained by battery life or
| reprogrammability, the difference isn't quantitative,
| it's just that the 8021 couldn't do the job at all
|
| in the metric of interest to hopper, though, which was
| computers per buck rather than mips per buck, it's only
| about 25x better than then
|
| ______
|
| [?] https://data.bls.gov/cgi-
| bin/cpicalc.pl?cost1=.13&year1=1982... says 42¢
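|
| (a rough sanity check of the ratios above, using only the
| figures quoted in this comment; a sketch, not
| datasheet-verified:)
|
|     mips_8021 = 0.005   # ~0.07 8-bit mips ~= 0.005 dhrystone mips
|     mips_new  = 60      # the 48 megahertz cypress part
|     usd_8021  = 0.42    # 13 cents in 1982, cpi-adjusted to 2024
|     usd_new   = 0.015   # ~1.5 cents in quantity
|
|     speedup = mips_new / mips_8021   # ~12,000x
|     cheaper = usd_8021 / usd_new     # ~28x, call it ~25x
|     print(speedup, cheaper, speedup * cheaper)  # ~300,000x mips per dollar
|
| the computers-per-buck metric at the end uses only the second
| ratio, which is why it comes out around 25x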
| mikewarot wrote:
| I was disappointed that she didn't really talk about multilevel
| security, which had solved the computer security problem by the
| time of this talk. However, her focus on breaking things into
| individual systems instead of multiprogrammed ones could be
| _seen_ as an effective approach at the time.
|
| It wasn't until persistent internet connections became the
| norm that this was shown to be an illusion. An illusion we
| continue to suffer from to this day.
| AdmiralAsshat wrote:
| Does this lecture include her famous nanosecond/microsecond
| dioramas? The existing videos on it seem to be fairly low
| quality. [0]
|
| [0] https://www.youtube.com/watch?v=gYqF6-h9Cvg
| taxborn wrote:
| Yes! About 40 minutes into the first posted lecture [0]
|
| [0] https://www.youtube.com/watch?v=si9iqF5uTFk&t=2400s
| samstave wrote:
| >" _I 'm beginning to push the velocity of light_"
|
| Wow.
|
| 11.8" is a nanosecond.
|
| So much wisdom and understanding:
|
| "Get a molocule set, red balls can be computers, blue balls
| can be databases..."
|
| "Get out of the domain of the paper, you cant draw in paper
| any more, they've got to be in three diminsions."
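|
| (The 11.8" is just how far light travels in a nanosecond; a
| quick check, assuming vacuum:)
|
|     c = 299_792_458           # speed of light, m/s
|     metres = c * 1e-9         # distance covered in one nanosecond
|     inches = metres / 0.0254  # ~11.8
|     print(round(inches, 1))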
| kranke155 wrote:
| Was she right about three-dimensional databases? I'm not
| in IT
| samstave wrote:
| Yup - that's what an AI vector DB is all about - matrices.
|
| --
|
| EDIT: @-probably_wrong
|
| You have to think that she also has to put things into
| the minds of others who don't have her prescient
| forethought of systems.
|
| Listen to all she says about "systems of computers" where
| "you need a computer to run all these other computers"
| and that you have ~160KB overhead to run a 32KB
| program... and how she breaks down all the constituents
| in a cluster, down to security auth...
|
| And she tells you to buy a Scientific Molecule Model set
| to be able to design computer systems in 3d.
|
| She is fn the foundation of cloud.
| Bluestein wrote:
| Incredible.-
|
| I wholeheartedly wish certain folks - us all,
| undoubtedly, but particularly certain people, such as
| Hopper - were immortal.-
| probably_wrong wrote:
| I agree that she's right regarding how distributed
| systems will eventually work. My disagreement is not with
| the "what" nor the "why", but rather with the "how". If a
| "better Molecule Kit" were the solution then I think we
| would have built one today in VR.
|
| IMO the fundamental problem is that visualizing complex
| systems fails because the result is either too cumbersome
| to be useful or too simplified. UML tried to solve that
| issue (in 2D) by allowing you to go into more/less detail
| as you need it, and yet its adoption in modern software
| development is uneven at best. And there's a reason why
| we use flowcharts mostly for beginner's problems.
|
| The actual solution, I believe, was getting _away_ from
| visualizations by making robust software (well...) with
| clear interfaces to abstract the complexity away.
| Reaching this conclusion took a _lot_ of work by plenty
| of brilliant minds, so I'm not faulting her for not
| being _that_ accurate in that particular prediction.
| samstave wrote:
| NVIDIA named one of their chip platforms after her.
|
| I mean, at the time she was saying this, you couldn't
| fill a Trump rally with the number of people on the
| planet who knew the future of compute the way she did.
| probably_wrong wrote:
| I'm going to say "no". In her example she mentions that
| the problem she's trying to solve is that flowcharts [1]
| need to be 3D to model multiple systems and components
| operating in parallel, but that's just trying to push a
| single-system tool beyond its usefulness. Trying to
| model multiple systems like that would lead to an
| explosion in the number of transitions very quickly.
|
| The closest we have nowadays to her "3D flowcharts" idea
| would be UML in general [2] and Orthogonal State Machines
| [3] in particular, but I think that what her problem
| really _needed_ was better encapsulation and interfaces
| between systems.
|
| [1] https://en.wikipedia.org/wiki/Flowchart
|
| [2]
| https://en.wikipedia.org/wiki/Unified_Modeling_Language
|
| [3] https://en.wikipedia.org/wiki/UML_state_machine#Ortho
| gonal_r...
| johncessna wrote:
| She also talks about the first bug during this one as well as
| some general lore, history, lessons learned and brings up some
| good future problems - some of which got solved, some didn't.
| Def worth watching the whole thing.
| ssklash wrote:
| Is this the same video that was found via FOIA, but was on an old
| tape format of some kind that the NSA couldn't/wouldn't read?
| KenoFischer wrote:
| Yes. Linked press release says they borrowed equipment from
| NARA to play it.
| toomuchtodo wrote:
| Relevant Animats comment at the time.
|
| https://news.ycombinator.com/item?id=40958494
|
| Related:
|
| _The NSA Is Defeated by a 1950s Tape Recorder. Can You Help
| Them?_ - https://news.ycombinator.com/item?id=40957026 - July
| 2024 (24 comments)
|
| _Admiral Grace Hopper's landmark lecture is found, but the
| NSA won't release it_
| - https://news.ycombinator.com/item?id=40926428 - July 2024 (9
| comments)
| iNate2000 wrote:
| https://www.nsa.gov/Press-Room/Press-Releases-
| Statements/Pre...
|
| > able to retrieve the footage contained on two 1' APEX tapes
|
| I'm no expert, but I think they meant 1-inch AMPEX tape.
|
| Also, perhaps they should record it in doubly.
| #ThisIsSpinalTap #Stonehenge #InchsToFeet
| qingcharles wrote:
| The page now says AMPEX. Did they read your comment?
|
| p.s. I'm imagining ECHELON flagging your comment and
| someone from the NSA quickly modifying the document on the
| sly...
| philipwhiuk wrote:
| Or someone in the NSA reads Hacker News.
|
| Hi, anonymous NSA employee!
| fragmede wrote:
| haha now we're all on a list, aren't we? shit.
| barathr wrote:
| Amazing how prescient her talk is on so many levels -- things
| that, in 1982, few folks were likely thinking about deeply and
| holistically.
| TomK32 wrote:
| Don't forget she was born in 1906; having this kind of thinking
| at 76 is something only a few of us will be able to do. She was
| born in an age before so many things that were important to the
| 20th century but are already outdated and being replaced in our
| 21st century. Amazing!
| refibrillator wrote:
| I love her sense of humor! One story she tells is about the
| world's first computer bug [1]; I had never heard it, nor the
| history of the word.
|
| She also mentions they were using computers to enhance satellite
| photos, it took 3 days to process but they could determine the
| height of waves in the middle of the Pacific and the temperature
| 20 feet below the surface.
|
| [1] The Bug in the Computer Bug Story
|
| https://daily.jstor.org/the-bug-in-the-computer-bug-story/
| mighmi wrote:
| Without detracting from her humor, Thomas Edison and others
| before him wrote about bugs in the 19th century:
| https://en.wikipedia.org/wiki/Bug_(engineering)#History
| vidarh wrote:
| Hopper's involvement is not about being the one to coin the
| term, but about being the first to find an actual, physical
| _computer_ bug. It's clear from the story that the implication
| was not that it was the first use of the _term_, as in that
| case the joke would make no sense.
| brandall10 wrote:
| Right, otherwise the word "actual" wouldn't be in the
| notebook, which implies computer scientists were actively
| using the term prior to the event.
| 2OEH8eoCRo0 wrote:
| She goes on to say:
|
| "I think it's rather nice that the Navy is keeping a few of the
| early artifacts like the first bug and me and a few other
| things."
|
| :)
| igleria wrote:
| > early artifacts like the first bug and me and a few other
| things
|
| lovely, reminds me of an Argentinian TV presenter that we
| make jokes about regarding her age (97 currently and going
| strong)
| cryptonector wrote:
| Go on.
| igleria wrote:
| sorry for no subtitles, but the context is her saying she
| got married by one of the Argentinian early patriots (?)
| https://youtu.be/AS3aue2B7Ak
| cryptonector wrote:
| Gracias!
| adastra22 wrote:
| Reminds me of the French supercentenarian who credited her
| good health to having quit smoking (at the age of 103).
| jkaptur wrote:
| I think that article buries the most interesting part! It's
| true that "bug" _in that sense_ dates back to the 19th century,
| but before that, it didn't necessarily mean "insect" - it
| could mean something like "malevolent spirit", as in Hamlet's
| "bugs and goblins".
|
| I wrote a little more about this: https://jkaptur.com/bugs/
| rufus_foreman wrote:
| >> I love her sense of humor!
|
| She appeared on the David Letterman show a few years later in
| 1986, https://hackcur.io/grace-hopper-on-letterman/.
| saintradon wrote:
| Who knows how many terabytes of incredible lectures like this our
| government is sitting on... Makes me sad to think about, frankly.
| toomuchtodo wrote:
| Don't be sad, get busy digging and liberating.
|
| https://www.muckrock.com
| robotnikman wrote:
| Yep, the archives are there, it's just a matter of going
| through the processes and finding them. The archives are HUGE
| though, so don't expect things to happen quickly.
| cbm-vic-20 wrote:
| This may be a good job for AI/LLM.
| mark-r wrote:
| I can only think about that scene from the end of "Raiders of
| the Lost Ark". Life imitates art.
| mrinfinitiesx wrote:
| 'I've gotten the most amount of blank stares I've ever gotten'
| - with regard to how people value their information.
|
| I mention two things outside of social media, which is what most
| people think is the internet, about what I can do with a computer
| and people stare at me like I'm speaking alien languages. I come
| to hacker news and realize I'm not even 1% as smart as most of
| you.
|
| A good video to watch. She's really funny. Really smart.
| joshstrange wrote:
| I can't wait to watch this later. I watched just a few minutes
| starting at this timestamp [0] (it was linked in another comment)
| and it was gold, I love her sense of humor.
|
| [0] https://www.youtube.com/watch?v=si9iqF5uTFk&t=2400s
| mrandish wrote:
| Wow! This being released is wonderful and unexpected. I first
| heard six weeks ago that these tapes had been found but that
| the NSA was unable to release them due to not having a suitable
| working 1-inch VTR machine (via this article:
| https://www.muckrock.com/news/archives/2024/jul/10/grace-hop...)
|
| That article was re-posted here on HN and elsewhere but didn't
| seem to get much attention and I feared the worst, since 1-inch
| magnetic video tape degrades with time. Very frustrating since
| such vintage VTRs do exist in working order in the hands of
| museums, video preservationists and collectors. Now six weeks
| later we get the best possible news! Hopefully, that article and
| the re-postings helped spread the word and someone in control of
| access to the tape got connected to someone with the gear.
|
| And what an amazing piece of history to have preserved. I'm only
| ten minutes into the first tape but she's obviously a treasure -
| clear thinking, great communication and a sharp wit. Even
| captured here later in life you can clearly see why she was so
| successful and highly regarded by her peers (including some of
| the most notable people in early computing history).
| molticrystal wrote:
| The press release on the page[0] explains that they got a
| machine from the National Archives. Though it probably would
| have been more fun if they had cooperated directly with members
| of the public to decode the tapes.
|
| >While NSA did not possess the equipment required to access the
| footage from the media format in which it was preserved, NSA
| deemed the footage to be of significant public interest and
| requested assistance from the National Archives and Records
| Administration (NARA) to retrieve the footage. NARA's Special
| Media Department was able to retrieve the footage contained on
| two 1' APEX tapes and transferred the footage to NSA to be
| reviewed for public release.
|
| [0] https://www.nsa.gov/Press-Room/Press-Releases-
| Statements/Pre...
| dtx1 wrote:
| The NSA being the good guys for once feels strange.
| Especially caring for public interest.
| aftbit wrote:
| That used to be the norm! My personal favorite story along
| those lines was how they proposed changes to DES S-boxes
| without any detailed explanation. The open community was
| skeptical but it later turned out that the changes they
| proposed protected against differential cryptanalysis[1],
| which was at the time not known outside the intelligence
| community. That said, they did cut the key size
| dramatically, which ended up weakening DES to the point that
| it could be trivially brute forced by the early 2000s,
| which led to 3DES and AES.
|
| 1: https://www.schneier.com/blog/archives/2004/10/the_legac
| y_of...
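|
| (For a sense of scale on that key-size cut -- the key counts
| are exact, the feasibility comments are the usual rules of
| thumb:)
|
|     des_keys  = 2 ** 56    # brute-forced in days by EFF's Deep Crack in 1998
|     tdes_keys = 2 ** 112   # 3DES effective strength after meet-in-the-middle
|     aes_keys  = 2 ** 128   # AES-128
|     print(f"{des_keys:.1e}")  # ~7.2e+16, small enough to search exhaustively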
| adastra22 wrote:
| Yeah they unfortunately abused the good will they got
| from that. Once differential cryptanalysis was known and
| it was clear the NSA had strengthened the DES S-boxes,
| people started trusting them. And they started making
| lots of suggestions to various standards. Only now they
| were inserting back doors. It wasn't until Snowden that
| the pendulum of public paranoia swung back the other way.
| tptacek wrote:
| You're using the plural for "backdoors" there; what's the
| other one you're aware of?
| dekhn wrote:
| the morris hexabox shunt
| adastra22 wrote:
| https://www.atlasobscura.com/articles/a-brief-history-of-
| the...
| tptacek wrote:
| Unless you count Clipper as a "backdoor", this article
| asks the same question I am. The whole point of Clipper,
| of course, was that keys were escrowed.
| adastra22 wrote:
| Clipper was deliberately backdoored (the key exchange had
| a trap door), with that backdoor only publicly found
| after its release. This was more than just key escrow.
| Why would that not count?
| tptacek wrote:
| The entire point of Clipper was to field cryptography
| that NSA could break. That wasn't a later revelation. It
| was the understanding at the time. It's why there _were_
| "the crypto wars".
| kragen wrote:
| they did strengthen the s-boxes against differential
| cryptanalysis, yes, but since 02004 we have evidence that
| they _also_ sabotaged it as part of a deliberate policy
| they'd put in place in 01968:
| https://blog.cr.yp.to/20220805-nsa.html
| tptacek wrote:
| The sleight of hand here is to equate publicly reducing
| the key size, which was known (presumably at the time as
| well) to be a weakening of the system, with a supposed
| weakness injected cryptically into the S-boxes --- which
| we now know is the opposite of what happened.
|
| Further, the truncated version of DES that got
| standardized _far_ outlasted its expected lifetime ---
| the National Bureau of Standards expected DES to have a
| useful lifetime of about 5 years. And even at the time it
| was understood that you could expand the keysize by
| tripling up the DES core.
|
| I think there's a really big difference between publicly
| weakening a standard, in effect telling the world "we
| want a standard that is adequate for commercial purposes
| but inadequate for military purposes, so as to retain our
| national edge", and doing what they did with Dual-EC,
| where it was impossible (apparently) for people to reason
| about what NSA was up to.
| philodeon wrote:
| > and doing what they did with Dual-EC, where it was
| impossible (apparently) for people to reason about what
| NSA was up to.
|
| Schneier was clearly able to reason about what NSA was up
| to, and told everyone in 2007 not to use Dual-EC, 6 years
| before the Snowden revelations.
|
| I believe you have admitted that you thought that "Dual-
| EC has a backdoor" was a wild conspiracy theory until the
| Snowden revelations? Which makes the "impossible
| (apparently)" part a classic case of projection.
| tptacek wrote:
| The (apparently) was a dunk on me.
|
| (I thought nobody should use Dual EC! But that was my
| reason for thinking it wasn't an NSA backdoor, because it
| was too dumb to be one. I underestimated the industry's
| capacity for "dumb". Also: I was dumb! I am dumb a lot.)
| philodeon wrote:
| And now you believe it's impossible for any of the NIST
| PQC submissions to have been backdoored or weakened. I
| feel safer already. :D
| tptacek wrote:
| NIST didn't design any of the PQC submissions. It did
| design Dual EC.
| philodeon wrote:
| NIST didn't design Dual-EC, NSA did. But NIST did the
| really hard work, which involved slapping their
| organization's name on it, and not asking any
| inconvenient questions.
|
| Thankfully we found a better way that ensures
| cryptographic security, which is to get former NSA
| interns to write the PQC standards, instead of proper NSA
| employees.
| tptacek wrote:
| As a shorthand for this site, I'm not distinguishing
| between the two organizations. Which former NSA interns
| are you talking about? You can get their names from the
| pq-crystals.org site. Which one should we not be
| trusting?
| tptacek wrote:
| Is it maybe Tancrede Lepoint? He always seemed shady to
| me. Or Peter Schwabe?
| philodeon wrote:
| A wonderful question that exposes me to legal action if I
| answer.
|
| A better question: why do you think so many of your
| cryptographic feline friendz were so excited about
| isogenies for the past decade? Where do you think they
| all obtained that identical enthusiasm from? Why do you
| think SIKE made it so far in the contest and only got
| eliminated through luck?
| tptacek wrote:
| Your theory here is that NSA coordinated an action
| whereby the PQC standard selected could be broken by
| anybody in the world with a Python script, based on
| research disclosed to the public in the 1990s.
|
| I'm guessing this isn't a conversation that's going to
| take us into Richelot isogenies.
| philodeon wrote:
| You obviously know that the Python script wasn't
| submitted to NIST along with the draft standard.
|
| Is Dual-EC-DRBG fine because we never saw the FVEY Python
| exploit that breaks it?
|
| I think my theory here is that NSA coordinated an action
| whereby they figured no one was reading obscure algebraic
| geometry papers from 1997. In our low-attention-span
| world, it's not the worst plan.
|
| (Hell, folks didn't realize TAOSSA contained 0day for a
| long time. Simply putting something in front of the
| public doesn't mean they'll read or comprehend it.)
| tptacek wrote:
| It is literally the worst plan, because it leaves every
| PQC-protected system in the world exposed to _everybody
| in the world_. It 's a theory that depends on NSA just
| wanting to watch the world burn.
|
| Dual EC isn't broken by an exploit script. It's broken
| _with a secret key_.
| philodeon wrote:
| > It is literally the worst plan, because it leaves every
| PQC-protected system in the world exposed to _everybody
| in the world_.
|
| No, it leaves every SIKE-protected system in the world
| exposed to _everybody who reads obscure algebraic
| geometry papers from 1997._ We got really lucky that the
| two dorks who do read those papers decided to share their
| insights.
|
| For all you know, there's a paper sitting at the
| Institute For Advanced Study that would let you write a
| marvelous pq-crystals-shattering Python script, but
| they'll never tell you the combination to the safe.
|
| (Again: TAOSSA contained 0day exploits, and few noticed
| for a decade.)
| tptacek wrote:
| You seem to believe the only thing preventing people from
| exploiting Dual EC is not having read the right
| cryptography papers. No; the reason why that's not the
| case is plainly evident from Dual EC's structure (if that
| were true, the NSA would presumably have no need of Dual
| EC!). Our premises are too far apart to usefully discuss
| this.
| dadrian wrote:
| Of the SCW hosts, I'm actually the NSA plant. You got me.
| tptacek wrote:
| What people on these threads aren't prepared to grok is
| that cryptography engineers (even the older ones) are
| gothy af, and the isogeny graph diagrams all looked like
| black magic stuff out of the Lesser Key of Solomon.
| Sorry, there isn't more to it than that.
| lvh wrote:
| I don't know if I count as a "feline friend", but: SIDH
| kept the DH shape. Being able to upgrade the protocols we
| had relatively closely is appealing. "Structure is useful
| but seems precarious" wasn't exactly secret knowledge.
| aftbit wrote:
| I never understood the Dual-EC backdoor. What was the
| point? Who would be dumb enough to use that as their
| CSPRNG when so many simpler, faster, and less sus options
| were available?
|
| I suppose they did (allegedly) pay RSA Security to make
| this the default choice in BSAFE but that seems like an
| awful lot of work to hack one product.
| tptacek wrote:
| That was my take too, but in fairness to everyone else
| who was right about this, once you stepped back and
| looked at the design for what it was, rather than as a
| weird concoction that happened to spit out random
| numbers, it was extremely obvious what the purpose of the
| design was. Another thing happening with me and Dual EC:
| I just know a lot more about cryptography today than I
| did 13 years ago. (I'm not a cryptographer; I'm a
| vulnerability person that happens to specialize a bit in
| cryptography vulnerabilities. It's a great rhetorical
| hedge.)
|
| Another thing I was very certain (and certainly wrong)
| about was that no competent team was using BSAFE in 2010.
| The more I've learned about cryptography the less
| confidence I've held onto in industry cryptography
| practices outside of Google, Apple, and Microsoft. I
| would have assumed the major networking vendors were
| playing at roughly the same level. Yikes, no.
| reaperducer wrote:
| _The NSA being the good guys for once feels strange.
| Especially caring for public interest._
|
| Only if everything you know about the NSA comes from the
| evil, cackling, mustache-twirling caricatures of it
| promulgated by angry people on the internet.
|
| Once you look beyond the politics, propaganda, and axe-
| grinding that is endemic to the online world you find out
| all sorts of fascinating things about the U.S. government.
| snapcaster wrote:
| You think the dominant propaganda in the US is _against_
| the US war state and intelligence community?!
| nullityrofl wrote:
| It depends on the circles you run in.
|
| If you consume news primarily from, say, Hacker News,
| then sure.
| jknoepfler wrote:
| I don't think the Federal government has had much control
| over public perception of itself for quite some time now.
| We're not living in an age of manufactured consent in
| which the dominating central tendency is more or less
| obvious.
| DiscourseFan wrote:
| The concept of manufactured consent always felt a bit
| suspect, but Kamala Harris' presidential candidacy has
| been covered by say, the NYTimes and The Guardian with
| little to no criticism, and they seem to be intentionally
| masking the fact that she has no real policies or any
| sort of platform. What else, if anything, points you
| towards the image of a state in whose operations it wants
| to appear as ambiguous as possible? The real threat, the
| known threat to state security is Trump, because he and
| his followers are crazy.
|
| If the NSA, and other intelligence agencies, had any
| influence on the election, why wouldn't they do _exactly_
| what it would appear they are doing now and get a
| milquetoast liberal elected to office who will easily
| capitulate to their demands?
| defrost wrote:
| What, exactly, did they do to further their evil plans?
|
| Did they inject Biden with a dementia drug to force a
| withdrawal and engineer the timing such that the current
| Vice President was pretty much the only viable option for
| the US Democrats to rally behind?
|
| Seems like a tightrope feat of Rube Goldberg Heath
| Robinson needle threading.
| DiscourseFan wrote:
| >Did they inject Biden with a dementia drug to force a
| withdrawal and engineer the timing such that the current
| Vice President was pretty much the only viable option for
| the US Democrats to rally behind?
|
| It's not a one-way relation to power. Intelligence
| agencies are nothing if not opportunistic, they can
| _influence_ elections but if one of the candidates is
| clearly incompetent there isn't much they can do
| _unless_ he drops out. You're forgetting that Jill Stein
| would've never been endorsed by Biden; what appears to be
| chaotic and contingent actually has a strong set of
| boundary conditions of possibility that all the
| contingency is contained within, and intelligence
| agencies, including even the state department for foreign
| affairs, try to control _that_. Not individual actions,
| but the ability to perform them, the rationality of it.
| The fact that you can't even imagine a candidate
| _besides Donald Trump_ who poses a serious threat to the
| state intelligence apparatus shows you that they've
| already won, or at least nearly so.
| defrost wrote:
| > The fact that you can't even imagine a candidate
| besides Donald Trump who poses a serious threat to the
| state intelligence
|
| ?
|
| How'd you get this incorrect insight into what I think
| ... and what makes you think that Trump is a serious
| threat to the US state intelligence apparatus?
| DiscourseFan wrote:
| He encouraged a group of his supporters to overthrow the
| government to allow him to stay in elected office, and
| his political advisors have developed a plan for him to
| wipe out the executive branch in its current form if he
| gets re-elected? There won't be a security state under
| Trump, at least as it exists now.
|
| I think you claimed that Harris was somehow not an ideal
| candidate for the current hegemonic forces in the US, or
| at least those forces of power wouldn't do what they
| could to make sure she gets elected. One of Chomsky's
| points was precisely this, they goad you with progressive
| political candidates who don't actually threaten power.
| The two main forces of power in the US are capitalist
| industry and the state, but the truth is that _both_ have
| an interest in maintaining power relations such as they
| are, and so what we are witnessing in most elections is
| just a sort of balancing act between direct and indirect
| means of control. With Trump you have someone who is so
| insanely narcissistic that he is completely unreliable
| and there is essentially no way of using him to maintain
| state control as such.
| defrost wrote:
| Overthrowing an elected government doesn't threaten the
| longevity of security agencies or "the security state"
| and having read Project 2025 I see no threat to "the
| security state" .. if anything he'd be bringing more work
| their way.
|
| Trump is a threat to democracy, not to TLA's.
|
| > I think you claimed that Harris was somehow not an
| ideal candidate for the current hegemonic forces in the
| US,
|
| I made no such claim. Perhaps you might like to scroll
| back and identify where I did, I suspect you've confused
| me for another.
| DiscourseFan wrote:
| >Trump is a threat to democracy, not to TLA's.
|
| As if America is a democracy
| knowaveragejoe wrote:
| > intentionally masking the fact that she has no real
| policies or any sort of platform
|
| What you're suggesting doesn't exist - and is being
| skirted around by the news - is in fact widely available.
| Google's right there.
|
| > If the NSA, and other intelligence agencies, had any
| influence on the election, why wouldn't they do exactly
| what it would appear they are doing now and get a
| milquetoast liberal elected to office who will easily
| capitulate to their demands?
|
| This strikes me as working backwards from a conclusion.
| If in your view the intelligence community would operate
| in that way, how would you ever know one way or the
| other?
|
| One thing we can certainly agree on is that Trump is the
| real threat. It is pretty damning of our age that "not
| having a platform"(to your satisfaction) is supposed to
| be met as a serious criticism, but her opponent's openly
| unhinged behavior is just "how it is".
| shiroiushi wrote:
| >and they seem to be intentionally masking the fact that
| she has no real policies or any sort of platform
|
| She doesn't need one: the fact that she's not Trump, and
| she's not old enough to be senile or on death's door, is
| all she needs for most voters. It's not like the
| Democratic Party had a bunch of other viable candidates
| in a position to mount a presidential campaign this close
| to the election.
|
| If you want to criticize the US for having a crappy FPTP
| election system that basically guarantees only two viable
| parties on the national stage, that's fair, but that's
| not the fault of journalism outlets, it's baked into the
| Constitution and other legislation.
|
| <The real threat, the known threat to state security is
| Trump, because he and his followers are crazy. If the
| NSA, and other intelligence agencies, had any influence
| on the election...
|
| Also, those news outlets may very well have their own
| agenda they're pushing, without any help from the
| intelligence agencies or anyone else: back in 2015, the
| media did help to make Hillary look bad. Perhaps they're
| blaming themselves partially for Trump getting elected,
| so this time around they want to make sure they don't
| turn off voters to the non-crazy candidate just because
| she isn't perfect. (And granted, Kamala doesn't have
| nearly as much baggage as Hillary did, which helps a
| lot.)
| emilamlom wrote:
| Of course the NSA (and arguably any topic) is more
| nuanced than internet discourse likes to admit. That
| said, they've done plenty to warrant people's paranoia of
| them and not a lot to dissuade it.
| psunavy03 wrote:
| It's entertaining how many people online think government
| intelligence agencies actually care about them at all,
| considering the limited amount of time in the day and all
| the info that said agencies need to know about adversary
| countries and other important topics.
|
| For 99 44/100 percent of the online outrage bait, I'm
| like "you're not that interesting, and they almost
| certainly don't care about you anyway."
| TiredOfLife wrote:
| https://en.wikipedia.org/wiki/Ghidra
| HybridCurve wrote:
| With the type of work the NSA does, I can't imagine many of
| them didn't know who Grace Hopper was. I expect they did it
| out of respect for her, rather than for the benefit of the
| general public.
| Animats wrote:
| I wrote, about a month ago:
|
| _" The National Archives and Records Administration has
| standard procedures and approved vendors for this.[1] One of
| their approved vendors, Colorlab, has 1" type C equipment.
| Colorlab is conveniently located just outside the Capitol
| Beltway, about 20 miles west of NSA HQ at Fort Meade.
| Colorlab does preservation and conversion work for the
| Library of Congress, Warner Bros., Universal, NBC, The New
| York Public Library, Paramount, HBO, etc. NARA has a standard
| form for government agencies requesting this service.[3] It
| looks like it's not even charged against the sending agency -
| Archives picks up the bill."_[1]
|
| Maybe somebody got the message.
|
| [1] https://news.ycombinator.com/item?id=40957026
| hnpolicestate wrote:
| So it's interesting to note why people say what they do _when_
| (1982) they do. At 5:23 Rear Adm. Hopper says "they are dumping
| polychlorinated biphenyls (PCBs) around the countryside".
|
| Toxic waste was a highly relevant cultural phenomenon at the time.
| I believe she was referencing the "Valley of the Drums" toxic
| waste site which was proposed as a superfund site in 12/82. Love
| Canal made the subject popular 5 years earlier.
|
| For some reason I'm extremely interested in toxic waste. Anyway,
| a bit of reference.
| MisterTea wrote:
| > For some reason I'm extremely interested in toxic waste.
| Anyway, a bit of reference.
|
| Toxic waste is a real-life monster that makes for the best
| horror stories. Fictional monsters as in creatures ain't got
| shit on real-life willful poisoning of entire communities
| causing the suffering and deaths of millions. And it's all done
| intentionally because someone wants more money - greed. The
| real monsters take on a dual form - the head of the beast being
| the people responsible and the body being the invisible poisons
| carelessly tossed onto the earth. Makes Lovecraft and others
| look like Mickey Mouse.
| hnpolicestate wrote:
| I agree. Well said.
|
| For anyone interested, this is far and away the best book
| I've read on the subject of toxic waste. It became rare over
| the past 5 years. Used to be available on Open Library but I
| think they received a DMCA. Even the NYC public library only
| has one copy located at the main branch. Library wouldn't let
| me loan it out.
|
| https://books.google.com/books/about/The_Road_to_Love_Canal..
| ..
| MisterTea wrote:
| I should also have mentioned that as a child this issue of
| Nat Geo truly scared me more than any silly
| horror movie could: https://archive.org/details/edg-
| ng-1981/edg%20NG%201985-03%2...
| Bluestein wrote:
| > It became rare over the past 5 years.
|
| This needs to be torrent-seeded to death, in the public
| interest.-
| russellbeattie wrote:
| Here in Silicon Valley there's often surprise at my reluctance
| to eat fruit grown from back yard trees. It's sort of assumed
| that fruit right off the tree is somehow organic and healthy.
|
| "Have you checked to see if this area is sitting on EPA
| Superfund designated land, or down stream?" The response is
| usually a blank look.
|
| So I ask them why is this area called Silicon Valley? Then I
| ask if they realize how incredibly toxic the solvents used in
| chip manufacturing are? And then I ask how much 1950s and 60s
| companies cared about environmental concerns? Most people
| connect the dots pretty quickly. "Holy shit." Is the usual
| response.
|
| It really wouldn't surprise me if Fairchild, Intel and the rest
| just took barrels of used chemicals out back and dumped them
| into holes in the ground back when.
|
| Google got hit by this a few years ago when they built an
| office building on top of toxic waste and now have to have 24/7
| basement ventilation to make sure workers there don't get sick.
|
| There are whole neighborhoods built on that same polluted land.
| I'll get my orange from Safeway, thanks.
| kragen wrote:
| my old boss worked at a fairchild site. he said it was now a
| superfund site because, yes, that's what they did
| EvanAnderson wrote:
| > Here in Silicon Valley there's often surprise at my
| reluctance ...
|
| It's definitely not just the Valley. I live in rural western
| Ohio. It was really eye-opening to see how much contamination
| there is even here, in a relatively sparsely-populated area.
| Once I knew the extent my feelings about local real estate
| changed dramatically. Everybody should research toxic sites
| in their area.
|
| No doubt the manufacturing center in Dayton, OH, helped drive
| local contamination. I'm in a suburb 30 miles away in another
| county, however. We've got fun Superfund sites like the old
| county incinerator (PCB), two contaminated aquifers
| (tetrachloroethene and trichloroethylene) under the largest
| town in the County (from three sources, too!), and lead from
| a battery "recycler".
|
| I simply can't understand the mentality earlier generations
| had re: environmental contamination. I hear it in my father
| (71) re: anthropogenic climate change ("I can't believe the
| activities of humans could change such a large system...")
| and I imagine similar sentiments were in the minds of people
| dumping PCB or lead into the ground. It's chilling to me.
| ikiris wrote:
| The difference between the valley and Ohio is Ohio never
| even tried to change / clean it up.
| coldpie wrote:
| I do woodworking in some shop space I rent in northern
| Minneapolis. The building complex used to be a General
| Mills research laboratory from about the 20s through the
| 50s. Reportedly, Cheerios were invented there. At the time
| it was a relatively rural area, so how do you dispose of
| your research chemicals? Dump 'em in a pit out back!
|
| Now, it's a fairly densely populated urban neighborhood. It
| was declared a superfund site in the 80s, and they're still
| monitoring the site and working on nearby properties for
| remediation, decades later. https://www.health.state.mn.us/
| communities/environment/hazar...
|
| > I simply can't understand the mentality earlier
| generations had re: environmental contamination. I hear it
| in my father (71) re: anthropogenic climate change ("I
| can't believe the activities of humans could change such a
| large system...") and I imagine similar sentiments were in
| the minds of people dumping PCB or lead into the ground.
| It's chilling to me.
|
| It is bonkers, but you still see it all the time. "Oh, my
| choice to drive a 10 MPG SUV to work every day doesn't
| matter. I'm only one person."
| eddyfromtheblok wrote:
| yes, this applies anywhere the land has changed hands. who
| knows what a farmer or rancher used the land for before they
| sold it to a housing developer? or did a previous homeowner
| use pesticides or herbicides that have since been banned?
| dekhn wrote:
| In the biography of Gordon Moore, he mentioned that when he
| was inventing Intel's process chemistry they just routinely
| poured all their solvents down the same drain. The strong
| acids ate away the concrete (which wasn't noticed until long
| after), so nearly everything they poured down, went into the
| ground and hit the water table, then spread out. Moore's
| excuse was that they didn't really teach chemistry safety
| when he was in school.
|
| Some additional reading here:
| https://semspub.epa.gov/work/09/100018492.pdf
| photochemsyn wrote:
| Unpolluted quality soil is a valuable commodity. Anyone
| growing food in an area with a history of electronics and
| semiconductor fabrication would be wise to haul in a few
| truckloads of soil from an organic farmer and grow all their
| plants in raised beds.
|
| It's possible to build semiconductor devices without
| polluting the soil and water table, but it means every
| factory needs to build at least a small chemical waste
| processing plant onsite, or (better) design new closed-loop
| manufacturing processes that minimize or eliminate waste.
|
| https://www.sourcengine.com/blog/growing-sustainability-
| effo...
| drjasonharrison wrote:
| Popular Science in 1963 recommended disposing of used motor
| oil in a hole in your backyard: https://books.google.co.in/bo
| oks?id=myADAAAAMBAJ&lpg=PP1&pg=...
| knowaveragejoe wrote:
| This is, unfortunately, still ongoing even in Silicon Valley.
| Apple had a skunkworks office building that was caught venting
| its byproducts to the atmosphere with completely inadequate
| filtration.
| coldpie wrote:
| My recollection is the only source for this was a
| questionable rant on Elon's mass misinformation website.
| Was this ever actually investigated by professionals?
| mindcrime wrote:
| This story[1] would have been in the news around that time as
| well, FWIW. I'm guessing there were probably many other related
| cases around the country also.
|
| _The landfill was created in 1982 by the State of North
| Carolina as a place to dump contaminated soil as result of an
| illegal PCB dumping incident._
|
| The "illegal PCB dumping incident" refers to dumping of PCB
| contaminated oil from the Ward Transformer Factory along the
| sides of highways in several (14) NC counties, back in 1978.
|
| [1]: https://en.wikipedia.org/wiki/Warren_County_PCB_Landfill
| mulmen wrote:
| PCB contaminated oil or oil contaminated PCB? They both seem
| awful.
| huijzer wrote:
| "We're now at what will be the largest industry of the united
| states." (at 4:50)
|
| That aged pretty well.
| scrlk wrote:
| Great quote at 45:26 regarding vertical vs. horizontal scaling:
| https://youtu.be/si9iqF5uTFk?t=2726
|
| > "Now back in the early days of this country, when they moved
| heavy objects around, they didn't have any Caterpillar tractors,
| they didn't have any big cranes. They used oxen. And when they
| got a great big log on the ground, and one ox couldn't budge the
| darn thing, they did not try to grow a bigger ox. They used two
| oxen! And I think they're trying to tell us something. When we
| need greater computer power, the answer is not "get a bigger
| computer", it's "get another computer". Which of course, is what
| common sense would have told us to begin with."
| oasisbob wrote:
| Additionally, with early pioneer logging, another way to avoid
| logs too large to handle was to not drop them in the first
| place.
|
| In the Pacific Northwest, US, early loggers would leave the
| huge ones - to the point where pioneers could complain about a
| lack of available timber in an old-growth forest.
|
| When the initial University of Washington was built, land-
| clearing costs were a huge portion of the overall capital
| spend. The largest trees on the site weren't used for anything
| productive; rather, they were climbed, chained together, and
| domino felled at the same time. By attaching the trees
| together, they only needed to fell one tree which brought the
| whole mess down into a pile and they burned it.
|
| I think there's a lesson here about choosing which logs you
| want to move.
| pests wrote:
| Having not read the article yet, this was one confusing
| comment until I realized by "logging" you meant actual trees
| and not log files.
| rootsudo wrote:
| That's an interesting tidbit I did not know about UW. Sad the
| wood wasn't used.
| brutal_chaos_ wrote:
| Perhaps an analogy can be made with respect to information
| priority and which trees to fell.
| interroboink wrote:
| > they did not try to grown a bigger ox
|
| Actually, humans have been doing exactly that through breeding
| over the millennia. They were just limited in their means.
|
| This analogy has some "you wouldn't download a car!" vibes --
| sure I would, if it were practical (: And vertical scaling of
| computers _is_ practical (up to some limits).
| nequo wrote:
| This is irrelevant to the example cited by Hopper. If you
| have a large log, you don't have time to breed a larger ox.
| You need to solve the problem with the oxen you have.
| interroboink wrote:
| I'm all in favor of making the best of what's available.
| But at the same time, if such thinking is taken as dogma,
| innovation suffers.
|
| You spoke of one log, and the time scales involved. But
| suppose you have an entire _forest_ of logs. Then it may
| indeed be worth breeding bigger oxen (or rather, inventing
| tractors).
|
| I don't mean to accuse Hopper of shortsightedness, but when
| quotes by famous people, like the above, are thrown around
| without context, they encourage that dogmatic thinking.
|
| So, I was more replying to that quote as it appeared here,
| rather than as it appeared in her talk.
| gffrd wrote:
| > if such thinking is taken as dogma, innovation suffers.
|
| I don't think there's anything about the original post,
| with quote about oxen, that reads as dogmatic, or invites
| such perspective.
|
| Also, I think we can all agree most innovation happens as
| an extension of "making the best of what's available"
| rather than independent of it, on a fully separate track.
|
| Using two oxen can lead to realizing a bigger ox would be
| beneficial.
| interroboink wrote:
| I don't mean to wear out this thread, and I totally
| respect your different viewpoint, but when I see:
| ... they're trying to tell us something. When we need
| greater computer power, the answer is not "get a
| bigger computer", it's "get another computer".
|
| that _does_ read as dogmatic advice to me, taken in
| isolation. It boils down to "the answer is X." Not
| "consider these factors" or "weigh these different
| options," but just "this is the answer, full stop."
|
| That's dogma, no?
|
| (that aside, I do slightly regret the snarkiness of my
| initial comment :)
| kortilla wrote:
| It is extremely rare that your compute workload has
| scaling properties that need just a little bit faster
| computer. The vast majority of the time if you are bound
| by hardware at all, the answer is to scale horizontally.
|
| The only exception is really where you have a bounded
| task that will never grow in compute time.
| interroboink wrote:
| Perhaps I misunderstand you, but what about those decades
| where CPUs were made faster and faster, from a few MHz up
| to several GHz, before hitting physical manufacturing and
| power/heat limits?
|
| Was that all just a bunch of wasted effort, and what they
| _should_ have been doing was build more and more 50MHz
| chips?
|
| Of course not. There are lots of advantages to scaling up
| rather than out.
|
| Even today, there are clear advantages to using an
| "xlarge" instance on AWS rather than a whole bunch of
| "nano" ones working together.
|
| But all this seems so straightforward that I suspect I
| really don't understand your point...
| kortilla wrote:
| >Perhaps I misunderstand you, but what about those
| decades where CPUs were made faster and faster, from a
| few MHz up to several GHz, before hitting physical
| manufacturing and power/heat limits?
|
| If you waited for chips to catch up to your workload, you
| got smoked by any competitors who parallelized. Waiting
| even a year to double speed when you could just use two
| computers was still an eternity.
|
| > Was that all just a bunch of wasted effort, and what
| they should have been doing was build more and more 50MHz
| chips?
|
| No, that's a stupid question and you know it. You set it
| up as a strawman to attack.
|
| Hardware improvements are amazing and have let us do tons
| for much cheaper.
|
| However, the ~4ghz CPUs we have now are not meaningfully
| faster in single thread performance compared to what you
| could buy literally a decade ago. If you're sitting
| around waiting for 32ghz that should only be "3 years
| away", you're dead in the water. All modern improvements
| are power savings and density of parallel cores, which
| require you to face what Grace presented all those years
| ago.
|
| Faster CPUs aren't coming.
|
| xlarge on AWS is a ton of parallel cores. Not faster.
| interroboink wrote:
| I just want to make one last attempt to get my point
| across, since I think you are discussing in good faith,
| even if I don't like your aggressive timbre.
|
| There is risk in reinforcing a narrow-minded approach
| that "all we need is more oxen." It limits one's
| imagination. That's the essence of what I've been
| advocating against in this thread, though perhaps my
| attempts and examples have merely chummed your waters.
| Ironically, I'd say Grace Hopper rather agrees, elsewhere
| in the linked talk[1].
|
| > Faster CPUs aren't coming.
|
| Not with that attitude, ya dingus (:
|
| [1] "https://www.youtube.com/watch?v=si9iqF5uTFk&t=1420s
| I think the saddest phrase I ever hear in a computer
| installation is that horrible one "but we've always
| done it that way." That's a forbidden phrase in my
| office.
| hughesjj wrote:
| Por que no los dos?
|
| I liked grace hopper's comments as a rebuttal against
| "only vertical! No horizontal!" but I'd agree that
| reading that rebuttal dogmatically would be just as bad
| of a decision.
|
| Bigger is better in terms of height _and_ girth when it
| comes to capabilities. At any given time, figure out the
| most cost efficient number of oxen of varying breeds for
| your workload and redundancy needs and have at it. In
        | another year, if you're still travelling the Oregon Trail,
        | you can reconsider doing the math again and trading in
        | the last batch's oxen for some new ones; repeat ad
        | infinitum, or as long as you're in business.
| kortilla wrote:
| You clearly aren't working in the constraints of
| computing in reality. The clock speed ceiling has been in
| place for nearly 20 years now. You haven't posted
| anything suggesting alternatives are possible.
|
| Your point has been made and I'm telling you very
| explicitly that it's bad. The years of waiting for faster
| processors have been gone for basically a generation of
| humans. When you hit the limit of a core, you don't wait
| a year for a faster core, you parallelize. The entire GPU
| boom is exemplary of this.
| vel0city wrote:
| I agree. And it is interesting too that the ceiling for
| the faster computer still goes back to her visualization
| of a nanosecond. Keep cutting that wire smaller and
| smaller, and there's almost nothing left to cut. But if
| we want it to go faster, we'd need to keep halving the
| wires.
|
        | Despite the very plain language, her talk has a lot of
        | depth to it, and it's interesting how on the money she
        | was with her thoughts all the way back then.
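        |
        | As a rough back-of-the-envelope (assuming vacuum
        | light speed; signals in real wires are a bit
        | slower):
        |     c = 299_792_458        # speed of light, m/s
        |     print(c * 1e-9)        # ~0.30 m (~11.8 in) per ns
        |     print(c * 1e-9 / 4)    # ~7.5 cm per 4 GHz cycle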
| interroboink wrote:
| I think the misunderstanding here (and I apologize where
| I've contributed to it) is that you think I'm talking
| specifically and only about CPU clock rates.
|
| The scale-up/scale-out tradeoff applies to many things,
| both in computing and elsewhere. I was trying to make a
| larger point.
|
| I guess it's appropriate, in this discussion about
| logging, that we got into some mixup between the forest
| and the trees (:
| kortilla wrote:
| >But suppose you have an entire forest of logs. Then it
| may indeed be worth breeding bigger oxen
|
        | That's idiotic unless you have other constraints.
        | Parallelism also lets you split up the team of oxen to
        | haul multiple smaller logs at the same time when their
        | combined force isn't needed.
| pbhjpbhj wrote:
        | In the context of an analogy for parallelism, a tractor
        | is just a bigger ox. The whole point seems to be that
        | instead of making a bigger X to do function Y, one has
        | the option of using multiple Xs at the same time.
| bunderbunder wrote:
| The thing is, Hopper said this in 1982. This was a time
| when, to keep stretching the analogy, it wasn't hard to
| find a second ox, but _yokes_ were still flaky bleeding
          | edge technology that mostly didn't work very well in
| practice.
|
| One potentially more likely solution back in the day was to
| just accept the job was going to take a while. This would
| be analogous to using a block and tackle. The ox can do the
| job but they're going to pull for twice as long to get it
| done. Imagine pulleys cost $10, but a second ox costs $1000
| and a yoke costs $5000, and getting the job done in less
| time is not worth $5,990 to you.
| gpm wrote:
| To an extent, but there's also a reason why beasts of burden
| didn't get to arbitrarily large sizes. Scaling has limits
| (particularly in this case both thermal limits and material
| strength limits).
| bunderbunder wrote:
| That "they didn't have any big cranes" forces the analogy in a
| way that breaks it. The solution wherever cranes are used is
| absolutely to get a bigger crane. And also, oxen were
| absolutely bred to be bigger. That's kind of the defining thing
| that distinguishes draft oxen from other kinds of cattle. But
| that process was limited by some factors that are peculiar to
| domesticated animals. And, of course, if you need to solve the
| problem right now, you make do with the current state of the
| art in farm animal technology.
|
| Admiral Hopper's lecture wasn't delivered too long after 1976,
| which saw the release of both the CRAY-1 (single CPU) and the
| ILLIAC IV (parallel). ILLIAC IV, being more expensive, harder
| to use, and slower than the CRAY-1, was a promising hint at
| future possibility, but not particularly successful. Cray's
| quip on this subject was (paraphrasing) that he'd rather plow a
| field with one strong ox than $bignum chickens. Admiral Hopper
| was presumably responding to that.
|
| What they both seem to miss is that the best tool for the job
| depends on _both_ the job and the available tools. And they
  | both seem to be completely missing that, if you know what
  | you're doing, scale up and scale out are complementary: _first_
| you scale up the individual nodes as much as is practical, and
| _then_ you start to scale out once scale up loses steam.
| hbosch wrote:
| Yeah, but trying to create a perfect analogy is like trying
| to improve the poundcake, just isn't worth it.
| dylan604 wrote:
| If you served me two pound cakes at the same time, I would
| say things were improved. However, it wouldn't be very
| efficient as I would still only eat them serially instead
| of in parallel
| interroboink wrote:
| Now I'm considering whether I'd rather have two pound
| cakes, or one big one...
|
| One normal-size, versus 10 miniature ones?
|
| Needs research (:
| dylan604 wrote:
| Depending on how you look at it, 10 is just a second one.
| Shadowmist wrote:
| Is one hand for each cake hyper-threading?
| kortilla wrote:
| >And also, oxen were absolutely bred to be bigger. That's
| kind of the defining thing that distinguishes draft oxen from
| other kinds of cattle.
|
    | In your attempt to take down the analogy you just reinforced
    | it. They quickly hit the limits of large oxen and had to
    | scale out, since demand grew far faster than any selective
    | breeding could keep up.
|
| The exact same thing happened in computing even during the
    | absolute heyday of Moore's Law. Workloads would very quickly
| hit the ceiling of a single server and the way to unblock
| yourself was not to wait for next gen chips but to
| parallelize.
| crmd wrote:
| The Sun E10k/15k was the last big ox in my tech career. I
| miss the big ox days.
| bunderbunder wrote:
| It's not that multiprocessing systems didn't exist at the
| time Hopper delivered this lecture; it's that they remained
| fairly niche products for computing researchers and fairly
| deep-pocketed organizations. At the time, multiprocessing
| was still very difficult to pull off. It wasn't necessarily
| analogous to just yoking two oxen to the same cart. It was
| maybe more like a world where the time it takes to breed an
| ox that's twice as strong is comparable to the time it
| takes to develop a working yoke for state-of-the-art oxen,
| and also nobody's quite sure how to drive a two-oxen team
| because it's still such a new idea. So the parallel option
| wasn't as sure of a bet from a business perspective as it
| is now.
| dumbo-octopus wrote:
    | Interestingly, there are cases where a "support crane" is
    | used to lift crane components up to a higher altitude, where
    | a different "primary crane" can do the remainder of the
    | heavy lifting. At that point the lifting can theoretically be
    | parallelized efficiently, with two items able to be hoisted
    | at any given moment.
|
| This technique famously remodeled the iconic Tiffany building
    | in NYC. https://www.mgmclaren.com/projects/crane-lift-at-
| tiffanys/
| pbhjpbhj wrote:
| How you describe this sounds more like pipelining.
| Avamander wrote:
| Both parts of the lecture are in general really good.
|
| Though I have to say the part about the cost of not
| implementing standards, the cost of not doing something, felt
| scarily relevant right now.
| agapon wrote:
| Well, if growing or breeding bigger oxen were as feasible as
| building bigger (more powerful) computers was/is, perhaps
| people would take a different path? In other words, perhaps the
| analogy is flawed?
| dpcx wrote:
| At some point, a bigger computer either doesn't exist or is
| too cost prohibitive to get. But getting lots of "small"
| computers is somewhat easier.
| bunderbunder wrote:
| Sure, but, as I was (rather unpopularly) pointing out in
| another comment, that point was pretty hard to reach in
| 1982. Specifically the point where you've met both
| criteria: bigger computer is too cost prohibitive to get,
| and lots of smaller computers is easier. At the time of
| this lecture, parallel computers had a nasty tendency to
| achieve poorer real-world performance on practical
| applications than their sequential contemporaries, despite
| greater theoretical performance.
|
| It's still kind of hard even now. To date in my career I've
| had more successes with improving existing systems'
| throughput by removing parallelism than I have by adding
| it. Amdahl's Law plus the memory hierarchy is one heck of a
| one-two punch.
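    |
    | For a sense of how hard that one-two punch hits, a quick
    | sketch of Amdahl's Law (the 90% figure is just an
    | illustrative assumption):
    |     # speedup from n workers when a fraction p of the
    |     # work parallelizes
    |     def speedup(p, n):
    |         return 1.0 / ((1.0 - p) + p / n)
    |     for n in (2, 8, 64, 1024):
    |         print(n, round(speedup(0.90, n), 2))
    |     # even at 90% parallel, 1024 workers stay under 10x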
| PaulHoule wrote:
| In 1982 you still had "supercomputers" like
|
| https://en.wikipedia.org/wiki/Cray_X-MP
|
| because you could still make bipolar electronics that
| beat out mass-produced consumer electronics. By the mid
| 1990s even IBM abandoned bipolar mainframes and had to
| introduce parallelism so a cluster of (still slower) CMOS
| mainframes could replace a bipolar mainframe. This great
| book was written by someone who worked on this project
|
| https://campi.cab.cnea.gov.ar/tocs/17291.pdf
|
| and of course for large scale scientific computing it was
| clear that "clusters of rather ordinary nodes" like the
|
| https://www.cscamm.umd.edu/facilities/computing/sp2/index
| .ht...
|
| we had at Cornell were going to win (ours was way bigger)
| because they were scalable. (e.g. the way Cray himself
| saw it, a conventional supercomputer had to live within a
| small enough space that the cycle time was not unduly
| limited by the speed of light so that kind of
| supercomputer had to become physically smaller, not
| larger, to get faster)
|
| Now for very specialized tasks like codebreaking, ASICs
      | are a good answer, and you'd probably stuff a large number
      | of them into expansion cards in rather ordinary computers;
      | clusters today possibly also have some ASICs for glue and
      | communications, such as
|
| https://blogs.nvidia.com/blog/whats-a-dpu-data-
| processing-un...
|
| ----
|
      | The problem I see with people who attempt parallelism for
      | the first time is that the task size has to be larger
      | than the overhead of transferring tasks between cores or
      | nodes. That is, if you are processing most CSV files you
| can't round-robin assign rows to threads but 10,000 row
| chunks are probably fine. You usually get good results
| over a large range of chunk size but _chunking is
| essential_ if you want most parallel jobs to really get a
| speedup. I find it frustrating as hell to see so many
| blog posts pushing the idea that some programming scheme
| like Actors is going to solve your problems and meeting
      | people that treat chunking as a mere optimization you'll
| apply after the fact. My inclination is you can get the
| project done faster (human time) if you build in chunking
| right away but I've learned you just have to let people
| learn that lesson for themselves.
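      |
      | A minimal sketch of that chunking pattern (the chunk
      | size, file handling, and per-row work are placeholder
      | assumptions):
      |     from concurrent.futures import ProcessPoolExecutor
      |     from itertools import islice
      |     import csv
      |     def work(rows):          # stand-in per-chunk job
      |         return sum(len(r) for r in rows)
      |     def chunks(it, size=10_000):
      |         it = iter(it)
      |         while chunk := list(islice(it, size)):
      |             yield chunk
      |     def total(path):
      |         with open(path, newline="") as f, \
      |                 ProcessPoolExecutor() as pool:
      |             # one task per 10,000-row chunk, not per row
      |             rows = chunks(csv.reader(f))
      |             return sum(pool.map(work, rows))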
| 0xcde4c3db wrote:
        | > The problem I see with people who attempt parallelism
        | for the first time is that the task size has to be
        | larger than the overhead of transferring tasks between
        | cores or nodes.
|
| My big sticking point is that for some key classes of
| tasks, it's not clear that this is even possible. I've
| seen no credible reason to think that throwing more
| processors at the problem will ever build that one tool-
| generated template-heavy C++ file (IYKYK) in under a
| minute, or accurately simulate an old game console with a
| useful "fast forward" button, or fit an FPGA design
| before I decide to take a long coffee-and-HN break.
|
| To be fair, some things that _do_ parallelize well (e.g.
| large-scale finite element analysis, web servers) are
| extremely important. It 's not as though these techniques
| and architectures and research projects are simply a
| waste of time. It's just that, like so many others before
| it, parallelism has been hyped for the past decade as
| "the" new computing paradigm that we've got to shove
| absolutely everything into, and I don't believe it.
| bunderbunder wrote:
| It isn't for a great many tasks. Basically, whenever
| you're computing f(g(x)), you can't execute f and g
| concurrently.
|
          | What you can do is run g and h concurrently in something
| that looks like f(g(), h()). And you can vectorize.
|
| A lot of early multiprocessor computers only gave you
| that last option. They had a special mode where you'd
| send exactly the same instructions to all of the CPUs,
| and the CPUs would be mapped to different memory. So in
          | many respects it was more like a primitive version of SSE
          | instructions than like what modern multiprocessor
          | computers do.
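          |
          | A toy sketch of the difference (g, h, and f here are
          | placeholders): in f(g(x)) the inner call must finish
          | first, but in f(g(x), h(x)) the two inner calls are
          | independent and can overlap:
          |     from concurrent.futures import ThreadPoolExecutor
          |     def g(x): return x * x     # placeholder work
          |     def h(x): return x + 1     # placeholder work
          |     def f(a, b): return a - b
          |     def combined(x):
          |         with ThreadPoolExecutor() as pool:
          |             fg = pool.submit(g, x)   # g and h run
          |             fh = pool.submit(h, x)   # concurrently
          |             # f still has to wait for both results
          |             return f(fg.result(), fh.result())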
| bunderbunder wrote:
| To your last point, it's been interesting to watch people
| struggle to effectively use technologies like Hadoop and
| Spark now that we've all moved to the cloud.
|
| Originally, the whole point of the Hadoop architecture
| was that the data were pre-chunked and already sitting on
| the local storage of your compute nodes, so that the
| overhead to transfer at least that first map task was
| effectively zero, and your big data transfer cost was
| collecting all the (hopefully much smaller than your
| input data) results of that into one place in the reduce
| step.
|
| Now we're in the cloud and the original data's all
| sitting in object storage. So shoving all your raw data
        | through a comparatively slow network interface is an
        | essential first step of any job, and it's not nearly so
        | easy to get speedups as impressive as what people were
        | doing 15 years ago.
|
| That said I wouldn't want to go back. HDFS clusters were
| such a PITA to work with and I'm not the one paying the
| monthly AWS bill.
| dylan604 wrote:
| I was getting into 3D around the time the Pentium was out,
| and I took a lot of time looking at the price of a single
| Pentium computer or multiple used 486s. The logic being a
| mini render farm would still be faster than a single
| Pentium. Never pulled the trigger on either option
| Animats wrote:
| How to organize them is a hard problem. For general-purpose
| use, we have only three architectures today - shared memory
| multiprocessors, GPUs, and clusters. When Hopper gave that
| talk, people were proposing all sorts of non-shared memory
| multiprocessor setups. The ILLIAC IV and the BBN Butterfly
| predate that talk, while the NCube, the Transputer, and the
| Connection Machine followed it by a year or two. This was a
| hot topic at the time.
|
| All of those were duds. Other than the PS3 Cell, also a
| dud, none of those architectures were built in quantity.
| They're really hard to program. You have to organize your
| program around the data transfer between neighbor units. It
| really works only for programs that have a spatial
| structure, such as finite element analysis, weather
| prediction, or fluid dynamics calculations for nuclear
| weapons design. Those were a big part of government
| computing when Hopper was active. They aren't a big part of
| computing today.
|
| It's interesting that GPUs became generally useful beyond
| graphics. But that's another story.
| mikaraento wrote:
| (Some) TPUs look more like those non-shared memory
| systems. The TPU has compute tiles with local memory and
| the program needs to deal with data transfer. However,
| the heavy lifting is left to the compiler, rather than
| the programmer.
|
| Some TPUs are also structured around fixed dataflow
| (systolic arrays for matrix multiplication).
| sillywalk wrote:
| > non-shared memory multiprocessor setups
|
      | Tandem's NonStop was this, starting in ~1976 or 1977 -
      | though it's not really comparable to the other examples
      | you cite (the nCube/Transputer/Connection Machine), in
      | that it was programmed conventionally, without a special
      | parallel language. Loosely coupled, shared-
| nothing processors, communicating with messages over a
| pair of high-speed inter-processor busses. It was
| certainly a niche product, but not a dud. It's still
| around having been ported from a proprietary stack-
| machine to MIPS to Itanium to X86.
|
| EDIT: I suppose it can be compared to a Single System
| Image cluster.
| toast0 wrote:
| Sure, but it's important to notice that the biggest off the
| shelf computers keep getting bigger.
|
| Dual socket Epyc is pretty big these days.
|
| If you can fit your job on one box (+ spares, as needed),
| you can save a whole lot of complexity vs spreading it over
| several.
|
| It's always worth considering what you can fit on one box
| with 192-256 cores, 12TB of ram, and whatever storage you
| can attach to 256 lanes of PCIe 5.0 (minus however many
| lanes you need for network I/O).
|
    | You can probably go bigger with exotic computers, but if
    | you have bottlenecks with the biggest off-the-shelf
    | computer you can get, you might be better off scaling
    | horizontally. Assuming you aren't growing 4x a year,
    | though, you should have plenty of notice that you're
    | coming to the end of easy vertical scaling. And sometimes
    | you get lucky and AMD or Intel makes a nicely timed
    | release to get you some more room.
| amy-petrik-214 wrote:
      | exactly, that's what it is as we hit the end of Moore's
      | Law (which we won't, but we'll hit the end as far as
      | feature-size shrinkage goes)... one of the optimizations
      | they will do is rote, trivial process optimization. So if
      | the chip failure rate on the assembly line is 40%, they
      | drop it to 10%. Costs will drop accordingly, because there
      | are x-fold more transistors per dollar, thus sustaining
      | Moore's Law.
| Animats wrote:
| This is a reaction to Grosch's Law, "Computing power increases
| as the square of the price". In the early 1980s, people
| still believed that. Seymour Cray did. John McCarthy did when I
| was at Stanford around then. It didn't last into the era of
| microprocessors.
|
| Amusingly, in the horse-powered era, once railroads started
| working, but trucks didn't work yet, there was a "last mile"
| problem - getting stuff from the railroad station or dock to
| the final destination. The 19th century solution was to develop
| a bigger breed of horse - the Shire Horse.[1]
|
| [1] https://en.wikipedia.org/wiki/Shire_horse
| SECProto wrote:
| The article you linked doesn't support your anecdote about
| the root of the Shire Horse. It describes their history
| dating back centuries before railways. Their biggest use
| seems to have been hauling material to and from ports, not
| trains.
| TomatoCo wrote:
| Well, would the anecdote work if you bumped it back a few
| centuries and swapped trains for ships?
| Animats wrote:
| Sure it does. Shires go back a ways, but they were not bred
| in quantity until the 1850s or so. "In the late nineteenth
| and early twentieth centuries, there were large numbers of
| Shires, and many were exported to the United States."
| Before and after that period, those big guys were an exotic
| breed. That's the railroad but pre-truck period.
|
| (I've owned a Percheron, and have known some Shires.)
| SECProto wrote:
| When you cite a source for something, the source should
| justify the thing you're claiming. The relevant part of
| your post to the thread was that "The 19th century
| solution was to develop a bigger breed of horse." It is
| entirely contrary to what the Wikipedia article says -
| "The breed was established in the mid-eighteenth century,
| although its origins are much older".
|
      | (relatives have owned Friesians and Clydesdales and
      | Norwegian Fjord Horses, but it's neither here nor
      | there)
| jessekv wrote:
        | Seems like you are reading the engineer's definition of
        | "develop", and OP is using the general English
        | definition.
| K0balt wrote:
| I think op is using "develop" as in :
|
        | To advance; to further; to perfect; to make to increase;
| promote the growth of. "We must develop our own resources
| to the utmost. Jowett (Thucyd)."
|
| Rather than develop as in the contemporary software
| engineering sense synonymous with create.
|
| In this sense, the breed was further developed to serve
| railway terminals from the original breed created to
| service maritime ports.
| SECProto wrote:
| That other definition doesn't really seem to fit either,
| but I acknowledge that if they had used a different word
| ("adopted a bigger breed" or "popularized a bigger breed"
| or something) then it would fit with the anecdote.
| jessekv wrote:
| It's not just software engineering, a mechanical engineer
| might develop a new kind of coffee machine.
|
| In engineering we are accustomed to getting involved
| early in the creation process, and our usage of "develop"
| reflects this bias.
|
| Outside of that bubble, "develop" is very explicit that
| the thing already exists. For example, developing a
| musical theme, a muscle, or a country.
| inopinatus wrote:
| There's an inflection point, however. Hence Seymour Cray's
| famous quip, "If you were plowing a field, which would you
| rather use? Two strong oxen or 1024 chickens?"
| jimmySixDOF wrote:
| My favorite quote/story was "Never Never Never take the First
| No!" (16:20) because some people are just obstructionists and
| others just want to see how serious you are. As she says,
| appreciation for how to do this comes with age, but to me a
| good definition of management in general is learning if, when,
| how, and how much to push back against pushback.
| hello_computer wrote:
| These days, "get another computer" and "get a bigger computer"
| are basically the same thing; differences primarily residing in
| packaging and interconnects, but boy howdy can those
| interconnects make a difference.
| begueradj wrote:
| Grace Hopper: she devised the first "true and modern" compiler
| ever (A1 was its name, if I recall).
| begueradj wrote:
  | Just checked: her compiler was called A-0. If I remember
  | correctly, she did it when she was teaching mathematics to
  | help her
| students.
|
| (As a part of my English language exam when I was a student, I
| had to write a text about a subject of my choice. I wrote about
| the history of programming languages: that's where I discovered
| Grace Hopper and mentioned her work in my essay).
| 2OEH8eoCRo0 wrote:
| "I have already received the highest award that I will ever
| receive no matter how long I live, no matter how many more jobs
| that I have- and that has been the privilege and responsibility
| of serving very proudly in the United States Navy."
|
| What a peach. She inspired this jarhead.
| cm2187 wrote:
| It's almost a stand up comedy special, but a smart version.
| kranke155 wrote:
| Her humor was what blew me away the most. She is clearly an
| amazing public speaker.
|
  | And I mean this - amazing - she is at a level that few CEOs
  | and public figures ever reach. She's got tremendous charm,
  | intelligence, and wit.
| biofox wrote:
| The proof she presents in Part 2 (t=15:00) on software changes
| propagating through a system is perhaps the best theoretical
| justification I have ever seen for Object Oriented design
| principles and encapsulation:
|
| https://youtu.be/AW7ZHpKuqZg?si=Dzt6JeoX7MDT8D9M&t=899
| mrkeen wrote:
| I think anyone can walk away from that explanation with their
| own methodology vindicated.
| MajimasEyepatch wrote:
| It's a strong case for loose coupling, but that can be achieved
| with object oriented programming, functional programming, or
| any number of other paradigms.
| rkagerer wrote:
| A timeless piece of advice comes toward the end, where she
| describes all the smart, young professionals out there who are
| looking for positive leadership. It means respect those above and
| keep them informed, and look after your crew.
|
| The zeitgeist of the time was shifting emphasis onto management
| (MBA-type stuff), but the army had a saying: you can't manage a
| soldier into war, you lead them.
|
| You manage _things_, you _lead_ people.
| russellbeattie wrote:
| > _" I think we forget that the four and five year olds are
| learning arithmetic. The little professor. The six year olds are
| getting Speak and Spell. You better look out, there's going to be
| a generation coming, that will know how to spell."_
|
| > _"The seven-year-olds, of course, are learning BASIC, running
| the computers. I know one man that bought a computer and took it
| home, his son is teaching him BASIC. His son is seven. Of course
| I know another guy that took a computer home, now he has to apply
| to his three children for computer time."_
|
| > _"They're tremendously bright and they are out there, the
| brightest youngsters we have ever had."_
|
| As a GenXer who was a 10yo computer nerd programming BASIC in
| 1982, I'm proud to know that she was talking about me and my
| peers!
|
| But then she goes on to predict that the brightest kids will come
| from rural areas because they have "good schools". No idea where
| she got that idea from - I moved from a city to a rural area
| around that time and the education wasn't any better and my
| access to computers was totally gone. In my experience, most
| rural areas, especially in flyover states, neither had the money
| for a computer lab, nor the teachers that knew how to use them.
| Kon-Peki wrote:
| > But then she goes on to predict that the brightest kids will
| come from rural areas because they have "good schools". No idea
| where she got that idea from
|
| She is speaking from the perspective of someone leading a team
| of people that volunteered for the Navy and were then selected
| to do technical work for her team.
|
  | I think that, given the years in which she lived, she would
  | certainly have been seeing the effects of the growing divide
  | in opportunity between the urban and rural areas. A
| smart, hardworking kid in the city or suburbs is going to have
| a fulfilling job or go off to college at a higher rate than the
| equivalently talented and hardworking kid from farm country,
| where joining the military is probably the best chance at
| advancement that they're going to get.
| WillAdams wrote:
| Something like that --- my going to Stanford was torpedoed by
| my rural high school not being able to find a teacher for
| calculus my senior year, so, having aced the ASVAB, EDPT, and
| DLPT, I enlisted.
|
| That rural school system was a marked change from the one
| near Columbus AFB I had previously attended --- most students
| were from the base and the school received a generous amount
| of DoD funding to offset that, so all of the teachers had
| Masters degrees, and a number of them were accredited as
| faculty at a nearby college --- classes were strongly divided
| between social such as homeroom, P.E., social studies, &c.
| (attended at one's grade level) and academic (attended at
| one's grade level with a cap on 4 years ahead if in grade 8
| or lower --- said cap was removed at 8th grade and students
| could begin taking college courses --- many graduated high
| school and were simultaneously awarded a 4 year college
| degree).
| WillAdams wrote:
| Typo in that
|
| >attended at one's grade level with a cap on 4 years ahead
| if in grade 8 or lower
|
| should be:
|
| >attended at one's _ability_ level with a cap on 4 years
| ahead if in grade 8 or lower
| acdha wrote:
| Seconding this: even in the 90s, the central California high
| school I went to had most of the top students looking at the
| military if they didn't love FFA - the costs were already
| high enough thanks to Reagan's cuts that they couldn't afford
| tuition at the state schools without significant loans,
| weren't from families where taking on that kind of debt was
| something you did, and the military benefits looked great in
| an era where you were unlikely to deploy at all or if you did
| it'd be something like peacekeeping in Bosnia. One of the
| guys I was in a few classes with ended up deploying into
| Afghanistan early on. I didn't see his name in the news after
| that so I hope he's okay.
| Kon-Peki wrote:
| I grew up in a very blue collar rust belt type of area, and
| if you were a smart kid that showed an interest in the
| military, the school guidance staff would try to keep you
| away from the standard military recruiters and point you
| towards ROTC while studying engineering at State U. A
| person taking that route would never have been assigned to
| Hopper's team.
|
      | And again, in the wealthy area where I live now, if you are
      | a bright kid who shows an interest in the military, they are
      | going to try to get you to do ROTC at the best engineering
      | school possible, or to set up interviews with the
      | Congressman or Senator to try to get a nomination into one
      | of the academies.
|
      | So by _percentages_, the best enlisted men and women are
| likely to be from more rural areas.
| karmicthreat wrote:
| What computer is Hopper talking about here?
| https://www.youtube.com/watch?v=AW7ZHpKuqZg&t=375s
| gigel82 wrote:
| 160 KB of overhead was outrageous. I'm glad she didn't get to
| experience a monstrosity like Windows 11 booting up with 12 GB of
| overhead...
| mulmen wrote:
| 42 years later the storage cost of this information has not yet
| surpassed its value.
| thenegation wrote:
| I haven't watched the videos yet (saved for later), but I did a
| fun experiment:
|
| Jump to a random timestamp. Listen for 3-4 seconds. Repeat it 10
| times.
|
| Quite impressive.
| markunivac95 wrote:
| Partial stenographic record. Sept. 19, 1985.
|
| https://www.osti.gov/servlets/purl/6566336
| huppeldepup wrote:
| At 32 min she talks about a book called "Everything You Ever
| Wanted to Know About Microcomputers, but didn't know who to ask"
| by Slater. I found a reference on archive but nothing more, also
| nothing on the author.
|
| Leland W. Slater, "Everything You Ever Wanted to Know About
| Microcomputers," Computop i cs . 3/82, pp. 38ff.
| pdw wrote:
| Not a book, but a paper. I found two citations, but I can't
| find either version.
|
| Slater, L.W. (1982). Everything you ever wanted to know about
| microcomputers (but didn't know WHO to ask). Navy Regional Data
| Automation Center Publ., Norfolk, Virginia, 19 pp.
|
| Leland W. Slater, "Everything You Ever Wanted to Know About
| Microcomputers," Computopics, 3/82, pp. 38ff.
|
| (Computopics appears to have been the magazine of the
| Washington DC chapter of the ACM.)
| erk__ wrote:
    | There seems to be a copy at the University of Virginia,
| School of Nursing: https://nursing.virginia.edu/nursing-
| history/collections-cnh... So it may be possible to either
| ask them or go there and get a scan of it.
| erk__ wrote:
| I have sent them a mail about it and will update somewhere
| if I hear back.
|
| Update: I have heard back and should have a scan of it
| soon.
| sword_smith wrote:
| the user "erk__" from this forum managed to get a copy:
| https://user.fm/files/v2-5bf26ed8f02e4cac68357e4ed895b3fc/Sl...
| mewse-hn wrote:
| I'm so glad the NSA released this despite trying to stonewall the
| FOIA request by saying they couldn't access it.
| hank808 wrote:
| "...and you can write programs that will run on anybody's
| computer." https://youtu.be/AW7ZHpKuqZg?t=1694
| hank808 wrote:
| Just fantastic!!! She was amazing! I'm so glad that they managed
| to get this digitized and released.
___________________________________________________________________
(page generated 2024-08-27 23:02 UTC)