[HN Gopher] When did computer science theory get so hard?
       ___________________________________________________________________
        
       When did computer science theory get so hard?
        
       Author : furcyd
       Score  : 186 points
       Date   : 2021-11-15 05:08 UTC (1 days ago)
        
 (HTM) web link (blog.computationalcomplexity.org)
 (TXT) w3m dump (blog.computationalcomplexity.org)
        
       | jeffreyrogers wrote:
       | On the math page that he linked he has some ideas:
       | 
        | > 1) When you got older math got harder.
       | 
       | > 2) When math got more abstract it got harder. Blame
       | Grothendieck.
       | 
        | > 3) When math stopped being tied to the real world it got harder.
       | Blame Hardy.
       | 
       | > 4) Math has always been hard. We NOW understand some of the
       | older math better so it seems easy to us, but it wasn't at the
       | time.
       | 
       | > 5) With the web and more people working in math, new results
        | come out faster so it's harder to keep up.
       | 
       | > 6) All fields of math have a period of time when they are easy,
       | at the beginning, and then as the low-hanging fruit gets picked
       | it gets harder and harder. So if a NEW branch was started it
       | might initially be easy. Counterthought- even a new branch might
       | be hard now since it can draw on so much prior math. Also, the
       | low hanging fruit may be picked rather quickly.
       | 
       | Personally I think it is mostly 2 and 3. And a lot of it has to
       | do with how math is communicated. The very abstract way that math
       | textbooks use to communicate is not how most mathematicians think
       | when solving problems. A lot of things I struggled to learn from
       | books became much more understandable once someone gave me the
       | little insight I needed to make sense of it in conversation.
        
         | ericbarrett wrote:
         | I completely agree regarding books. I've been refreshing my
         | university undergrad-level math, and older textbooks have been
         | much more accessible than most written after (roughly) the
         | 1990s, even though the subjects haven't changed at all--this
         | isn't avant-garde stuff. Seems no field is exempt from the
         | modern "churn for churn's sake" phenomenon.
        
           | blocked_again wrote:
           | > older textbooks have been much more accessible than most
           | written after (roughly) the 1990s
           | 
           | Explain?
        
             | ericbarrett wrote:
             | I don't have an explanation; it's entirely
             | empirical/anecdotal. And there are certainly exceptions in
             | both directions, e.g. I have a 1978 textbook on linear
             | algebra that never mentions the relationship between matrix
             | determinants and area.
             | 
             | Overall what I've found is that older textbooks (edit: not
             | just math) are more likely to (1) explain concepts in
             | smaller steps; (2) not bury them in symbols and
             | abstractions; (3) introduce and explain physical intuitions
             | for subjects when they are relevant.
        
               | GeorgeTirebiter wrote:
               | I think it's just familiarity. Those are the books where
               | you learned the stuff. So their way (style) is 'what you
               | have been programmed to like'.
               | 
                | To learn anything, so you can teach it to somebody else
               | (the REAL def of Learning I'd say), you have to Do The
               | Exercises.
               | 
               | ---- I recently went over one of my older books on
               | Matrices and it was a curious mix of trying to be sort-of
               | precise, with trying to bring the 'wonder' and 'magic' of
               | the concepts to life early -- so as to maintain reader
               | interest. Ultimately, it was the _way_ of thinking
               | (mathematically) that was mostly taught; the  'matrix
               | intro' was just fodder to support the pedagogy!
        
               | spc476 wrote:
               | One book I found that was incredible was _Mathematics for
               | the Million_ (3rd edition, from 1951) that's part
               | mathematics book, part history book. It goes over the
               | history of mathematics from ancient Egypt going forward,
                | showing not only _how_ math works (geometry, trig,
                | algebra, the calculus) but also how it was discovered and
               | evolved over time.
        
           | hungryforcodes wrote:
           | In Grade 12, I had a revelation when a good friend explained
           | to me in 5 minutes that trigonometry was largely just ratios.
           | I guess either it hadn't been explained to me before properly
           | or I didn't understand it at a DEEP level, but when he laid
           | it all out for me on the board, it was suddenly all so
           | simple.
           | 
           | I had three reactions:
           | 
            | 1) I was elated to finally understand sine and cosine.
           | 
            | 2) I was thankful to him and also amazed at how simple it
            | was, in 5 minutes, to explain something that seemed to have
            | eluded me for 5 years of high school. It made me wonder a lot
           | what other difficult subjects could be easily taught by just
           | a good explanation.
           | 
            | 3) I wondered why math class had failed so specifically to
            | give me that knowledge, or at least to test whether I
            | understood it and, if I didn't, to make sure I eventually
            | did. The answer, I think, is that math class just
            | encouraged me to
           | use a calculator, thus short circuiting any real
           | understanding.
           | 
           | Anyways he went on to be a physics professor at Cambridge.
           | 
           | I always appreciated that moment.
        
             | 2-718-281-828 wrote:
             | > Anyways he went on to be a physics professor at
             | Cambridge.
             | 
             | such a shame. sounds like a talented fellow who really
             | could have made something of his life. maybe even a chair
             | at Oxford :/
        
               | 908B64B197 wrote:
               | Had he added a "New" in front of the country of his
               | institution he might also have tripled his impact factor
               | and salary!
        
             | dataflow wrote:
             | How were sine and cosine taught to you as anything other
             | than ratios? I just assumed they were always taught as
             | "opposite over hypotenuse" and "adjacent over hypotenuse"
             | etc.
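              | 
              | A quick sanity check of that in Python (my own toy example,
              | a 3-4-5 right triangle):
              | 
              |     import math
              | 
              |     # angle opposite the side of length 3
              |     theta = math.atan2(3, 4)
              |     print(math.sin(theta), 3 / 5)  # ~0.6 and 0.6
              |     print(math.cos(theta), 4 / 5)  # ~0.8 and 0.8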
        
               | mcbuilder wrote:
               | I was originally taught the standard SOHCAHTOA in high
               | school, and it didn't really click for me at all until I
               | got to my undergrad introductory functions and graphs
                | course. Near the end we got to Euler's formula and our
                | professor started graphing e^{ix} in an extended complex
               | plane. So apparently people learn things differently, as
               | I prefer to work with identities and conceptual
               | understanding rather than disconnected concepts.
        
               | hungryforcodes wrote:
               | They were clear to me intellectually, but the way he
               | explained it made me SEE the ratios. Instead of just
                | being abstract functions, there was a clear relationship
                | between the angles and the sides of the triangle. After
                | that I was able to reason about them, whereas before they
                | were just things I plugged into my calculator.
        
               | mixmastamyk wrote:
               | Sometimes we are not ready for specific bits of
               | information. i.e. brains have a "limited bandwidth."
               | 
               | It's possible back in the old days the math teacher did
               | mention it, but we couldn't understand it at the time
               | because we had little sense of the subject.
               | 
               | A common take on this phenomenon is discussed here
               | occasionally:
               | 
               | http://habitatchronicles.com/2004/04/you-cant-tell-
               | people-an...
               | 
               | For a demonstration, go back and watch a good movie that
               | requires focus, say three times in a row. Every viewing
               | should uncover details you didn't notice the first time,
               | because we can't take them all in at once.
               | 
               | It's also why repetition is a good learning technique.
               | Unfortunately few people have the patience to start over
               | from the absolute beginning.
        
               | r00fus wrote:
               | This is very true. You are not the person you were in HS
               | - everyone is a "Ship of Theseus" [1] therefore you can
               | only guess at what HS version of you was like based on
               | fuzzy remembrances.
               | 
               | At your current age, you are more capable in some ways,
               | less in others, and thus your learning capacity is
               | different.
               | 
               | [1] https://en.wikipedia.org/wiki/Ship_of_Theseus
        
               | tzs wrote:
               | In high school any other way would probably be borderline
               | insane.
               | 
               | If you somehow managed to skip trigonometry in high
               | school but still got into college calculus, then there
                | are a few other ways they might be defined.
               | 
               | One way is to put off dealing with trig functions until
               | you get to differential equations. Then sin is the unique
               | solution of y'' + y = 0, y(0) = 0, y'(0) = 1, and cos is
               | the unique solution of y'' + y = 0, y(0) = 1, y'(0) = 0.
               | 
               | Another way is to define them axiomatically. These are
               | the axioms for sin and cos:
               | 
               | 1. They are both defined everywhere on the real line,
               | 
               | 2. cos(0) = sin(pi/2) = 1, cos(pi) = -1,
               | 
               | 3. cos(y-x) = cos(y) cos(x) + sin(y) sin(x)
               | 
               | 4. for 0 < x < pi/2, 0 < cos(x) < sin(x)/x < 1/cos(x).
               | 
               | From that you can deduce all the usual trig identities
               | and the limits you need to do calculus with them.
               | 
               | You still need to prove that functions satisfying those 4
               | axioms actually exist. One approach is to put that off
                | until later when you've got more tools under your belt.
                | For example, once you do Taylor series, you can work out
                | what the Taylor series would be for sin and cos if they
               | exist. The functions defined by those Taylor series _do_
               | exist, so then you just need to show that those functions
               | satisfy the axioms.
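                | 
                | For reference, those two series are (standard results,
                | written in LaTeX):
                | 
                |     \sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots
                |     \cos x = 1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \cdots
                | 
                | Differentiating either one term by term shows it
                | satisfies y'' + y = 0 with the right initial values.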
        
               | yumraj wrote:
               | That is how I learned too, however I believe the point of
               | parent was that you can memorize it as "opposite over
               | hypotenuse" and "base/adjacent over hypotenuse" OR you
               | can have a realization that they are constant ratios in a
               | right angled triangle.
               | 
               | It's subtle, but I think it's a very valid point since I
                | too never thought of it in terms of ratios, merely in terms
               | of here's the angle, this is what you need to find, you
               | apply sin/cos/tan and get the answer.
        
             | L0in wrote:
              | > 2) I was thankful to him and also amazed at how simple
              | it was, in 5 minutes, to explain something that seemed to
              | have eluded me for 5 years of high school. It made me wonder
              | a lot what other difficult subjects could be easily taught
             | by just a good explanation.
             | 
              | I didn't understand what functions are / how they work; my
              | brain _absolutely_ refused to understand, until I started
              | programming. Then I got it in 5 minutes.
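              | 
              | Presumably something like this is what clicks (a minimal
              | sketch in Python): a function is just a rule mapping each
              | input to exactly one output.
              | 
              |     # f(x) = x^2 + 1
              |     def f(x):
              |         return x * x + 1
              | 
              |     print(f(2))  # 5
              |     print(f(3))  # 10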
        
               | justinlloyd wrote:
                | I started programming at a young age, and got introduced
                | to enumeration, products, functions, absolute value, and
                | so forth in math, but they weren't explained in any clear
                | and succinct way that made sense. And then I had that
                | "Aha!" moment of "oh, those are just these computing
                | concepts." But when I later talked to the math teacher,
                | "no, no, those are wrong, you don't understand" he would
                | say. We were most definitely talking about the same
                | concepts, just with different language, and I, at that
                | young age, didn't understand his language, and he, not
                | having any computing or software experience, didn't
                | understand my language.
        
               | mmmpop wrote:
               | Yep, I think CS 101 should be a coreq with engineering
               | calculus for this exact reason.
        
               | L0in wrote:
                | I was so pissed... It took me years of trying to
                | understand why I couldn't understand something so vital.
                | 
                | I don't know if it has to do with how my brain works. I
                | wouldn't say I have an especially technical mind.
        
               | jacquesm wrote:
               | Conversely, I had been programming in a BASIC dialect
               | that used DEF FN to add user defined functions to the
               | language and that made the concept of functions in math
               | trivial for me. Being exposed to computers at a very
               | young age was a huge advantage in school in the 70's.
        
           | runnerup wrote:
           | As a related side note, I'm looking for a post on HN from
           | earlier this year which linked to a textbook from the early
           | 1900's or late 1800's about visualizing data. If anyone can
           | help me find it, I'd greatly appreciate it.
        
             | wrycoder wrote:
             | google visualizing data book
        
             | jacquesm wrote:
             | This one?
             | 
             | https://hyperallergic.com/306559/w-e-b-du-boiss-modernist-
             | da...
        
           | bigthymer wrote:
           | > older textbooks have been much more accessible than most
           | written after (roughly) the 1990s
           | 
           | Why? How are they different?
        
         | bumby wrote:
         | > _A lot of things I struggled to learn from books became much
         | more understandable once someone gave me the little insight I
         | needed to make sense of it in conversation._
         | 
         | Don't just leave us hanging :-) Care to elaborate on what
         | helped gain that insight?
        
         | mcguire wrote:
         | I don't think 6 has applied to math for hundreds of years, but
         | I do see it as having applied to CS relatively recently.
        
         | Jensson wrote:
         | How much math have you studied? I think they are talking about
          | grad level math courses where things start to get really
         | abstract. Calculus, linear algebra and discrete math are all
         | super concrete, pretty sure they are talking about things
         | described as "generalized abstract nonsense":
         | https://en.wikipedia.org/wiki/Abstract_nonsense
         | 
         | Try reading the example on that page. That is what makes math
         | hard. Not the way calculus is described.
        
           | hxksleo wrote:
           | It's a mistake to just think of linear algebra as being super
           | concrete. Linear algebra can be very theoretical. Obviously,
           | there are a lot of categorical applications to linear algebra
           | as well.
        
           | m0lecules wrote:
           | Or try reading the ethereum yellow paper:
           | https://ethereum.github.io/yellowpaper/paper.pdf
           | 
           | When it gets into the symbolics of the ethereum state machine
           | (section 4+), things get hairy fast. Unfortunately papers
           | like this only make sense once you understand the paper. It's
           | a succinct representation of knowledge for someone skilled in
           | the field, but not a sensible way to teach others about the
           | EVM.
        
             | interroboink wrote:
             | > It's a succinct representation of knowledge for someone
             | skilled in the field, but not a sensible way to teach
             | others
             | 
             | Sounds like manpages (:
        
           | jeffreyrogers wrote:
           | I had a math minor in college and took some upper level
           | analysis courses, so I know what abstract means in the
           | context of higher mathematics. I wasn't talking about
           | calculus.
        
             | Jensson wrote:
             | But how would you study abstract math without those
             | abstractions? The math is just abstractions and nothing
             | else, the abstractions are what you study, there is no
             | concrete thing there to describe. So your original comment
             | tells me you haven't studied a lot of very abstract math
             | since you still want those explanations, but they stop
             | existing after a point.
             | 
             | > took some upper level analysis courses
             | 
             | Real Analysis is still not very abstract, but yeah that is
             | a step. The next step is to remove everything concrete
             | about it and just study abstractions. And then you start
             | studying abstractions of those abstractions, hence
             | "abstract nonsense". Maybe sometimes in the future someone
             | will invent abstractions of those.
             | 
              | Edit: I meant the stuff you learn in Rudin's Principles of
              | Mathematical Analysis. Not sure exactly what the course
              | would be called in English, but I called it real
              | analysis here.
        
               | jeffreyrogers wrote:
               | > But how would you study abstract math without those
               | abstractions?
               | 
               | I'm not saying you would, I'm just saying the
                | abstractions are directly related to why it's harder.
               | 
               | > since you still want those explanations, but they stop
               | existing after a point.
               | 
               | You're misunderstanding me.
               | 
               | > The next step is to remove everything concrete about it
               | and just study abstractions. And then you start studying
               | abstractions of those abstractions, hence "abstract
               | nonsense". Maybe sometimes in the future someone will
               | invent abstractions of those.
               | 
               | Yes, I understand all this. I have the textbooks and I've
               | read parts of them. The reason I didn't study math more
               | seriously (I considered it) is because I realized I
               | didn't have the brain for it and am not that interested
               | in studying really abstract stuff.
        
       | doofin wrote:
        | As math is becoming harder, we also have better tools to tackle
        | such complexity. For example, it's probably easier to use a
        | theorem prover which supports homotopy type theory to study
        | algebraic topology and synthetic homotopy theory. However, the
        | problem is that we have to understand homotopy type theory first.
        
       | drewg123 wrote:
       | As far as I'm concerned, it was always hard. I took 2 theory
       | classes (undergrad and grad) in the early 90s. I dreaded the
       | proofs, and I hated the classes, and could barely stay awake for
       | them. I'd much rather stay up all night coding a project for
       | another class than do one proof for a theory class. Ugh.
        
       | Jensson wrote:
       | Computer science is an extremely young field. The parts of the
       | field that number theorists could dabble in quickly got fleshed
       | out and got hard as they could apply the millennia of accumulated
       | number theorist knowledge, the parts more related to how to work
       | with computers (such as algorithms or program structure) is still
       | young today.
       | 
       | Edit: Note that the frontier of the field will always be
       | extremely hard no matter what time you live in. The reason is
       | that things will be poorly understood and written, there will be
       | no good textbooks or explanations, you have to trudge through all
       | that often without anyone to help you understand things or
       | correct you. It took centuries for calculus to become as easy as
       | it is today.
        
         | bsedlm wrote:
         | > It took centuries for calculus to become as easy as it is
         | today.
         | 
         | it's easy to use, but it's (IMO) even harder to understand
         | correctly than before. I'm referring to a mathematical-level
         | understanding; i.e. you could re-invent it because of how well
          | you understand all of it.
        
           | P_I_Staker wrote:
           | How so?
        
             | danuker wrote:
             | I would guess the sheer size of it.
             | 
             | For instance, I can't take a derivative without the rules
             | in front of me, though the concept of derivatives is
             | simple.
        
               | sudosysgen wrote:
                | I mean, you probably forgot how to, but taking a
                | derivative without using any differentiation rules has
                | always been an exam question for me whenever I take a
                | calculus class.
        
               | 1980phipsi wrote:
               | Derivatives are usually quite a bit easier than
               | integrals.
        
             | bsedlm wrote:
             | As far as I've understood so far, back when calculus was
             | invented people would think about it using infinitesimals.
             | But nowadays it is taught without mentioning them; in
              | short, calculus is now taught emphasizing the ability to
             | use the techniques and that's it.
             | 
             | I think that during the 19th century when it (along with
             | logic) was being formalized they decided to throw away the
             | old original ways to think about it. This ends up making it
             | really hard to understand in exchange for making it
             | "formally sound" and "user friendly".
        
               | sudosysgen wrote:
               | I don't know about you but proving various derivative
               | rules using the limit (ie, infinitesimals) has been a
               | recurring exam question for me, and so were
               | differentials.
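                | 
                | For instance, the standard limit computation for
                | f(x) = x^2 (in LaTeX):
                | 
                |     f'(x) = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h}
                |           = \lim_{h \to 0} (2x + h) = 2x
                | 
                | so the "rule" (x^2)' = 2x falls straight out of the
                | definition.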
        
               | east2west wrote:
                | The modern approach to calculus using limits does not use
                | infinitesimals. According to Richard Courant,
                | "Mathematicians of first rank operated ... sometimes even
                | with mystical associations as in references to
                | 'infinitesimals' or 'infinitely small quantities.'" I
                | believe we now use the "epsilon-delta" definition of
                | limits.
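                | 
                | For reference, that definition reads (in LaTeX):
                | 
                |     \lim_{x \to a} f(x) = L \iff \forall \varepsilon > 0
                |     \, \exists \delta > 0 : 0 < |x - a| < \delta
                |     \implies |f(x) - L| < \varepsilon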
               | 
               | "Introduction to Calculus and Analysis" Volume 1
        
               | sudosysgen wrote:
               | Limits are conceptually equivalent to infinitesimals.
               | Another way to view it is that a limit is a way to
               | evaluate an equation of infinitesimals.
               | 
               | Epsilon Delta is not our definition of limits, it's the
               | definition of the derivative that uses limits as the
               | "backend". But even using infinitesimals you can use the
               | epsilon-delta definition of the derivative.
        
               | [deleted]
        
               | east2west wrote:
               | Perhaps you are referring to modern nonstandard analysis,
                | which I am sure has an equivalent definition of limits. I
               | will note that in standard calculus Epsilon-Delta is not
               | only used in defining derivatives, but also used in
               | defining function limits, which of course means used in
               | defining integrals.
               | 
               | Just out of curiosity, in nonstandard analysis is there
               | an equivalent of Lebesgue integral?
        
               | jasomill wrote:
               | While certainly not the usual approach, calculus in terms
                | of infinitesimals _has_ been placed on formal
               | foundations[1]. To be fair to Courant, however, the
               | approach to infinitesimals in nonstandard analysis is
               | pretty far removed from the views of Leibniz, _et al.,_
               | and  "mystical associations" is a fair, if not
               | necessarily impartial, criticism of the latter.
               | 
               | IIRC, even _Newton_ was at least a bit skeptical of the
                | validity of his _method of fluxions_, even going so far
               | as to formulate his _Principia_ in terms of (nearly
               | impenetrable IMO) arguments from classical geometry
               | instead, and the non-rigorous application of related
               | ideas led even the best mathematicians to questionable
                | conclusions at times ( _e.g.,_ Euler's argument that 1 +
                | 2 + 4 + ... = -1, which may or may not have influenced
               | computer science in terms of two's complement
               | arithmetic).
               | 
               | As for abstraction as a source of seemingly artificial
               | difficulty in modern mathematics, from the introduction
                | to Courant's _Differential and Integral Calculus_, the
               | first edition of _Introduction to Calculus and Analysis_
               | [2],
               | 
               |  _The presentation of analysis as a closed system of
               | truths without reference to their origin and purpose has,
               | it is true, an aesthetic charm and satisfies a deep
               | philosophical need. But the attitude of those who
               | consider analysis solely as an abstractly logical,
               | introverted science is not only highly unsuitable for
               | beginners but endangers the future of the subject; for to
               | pursue mathematical analysis while at the same time
                | turning one's back on its applications and on intuition
               | is to condemn it to hopeless atrophy._
               | 
               | For similar arguments that over-reliance on formalism
               | begins far earlier in the modern mathematical curriculum,
               | see Feynman's _New Textbooks for the "New" Mathematics_
               | [3], in which he discusses his frustrations as a working
               | (theoretical!) physicist reviewing grade school
               | textbooks.
               | 
               | As a partial counterpoint, I personally often have an
               | easier time understanding the more abstract treatments.
               | For example, I struggled with the traditional
               | presentation of multivariable calculus in terms of
               | "physically meaningful" differential operators until
               | working through the first couple books of Spivak's
               | _Comprehensive Introduction to Differential Geometry_
               | after-hours, at which point everything sort of clicked.
               | Had my intro (college) calculus course _not_
               | coincidentally been taught by a differential geometer
               | with a habit of presenting some of the basic ideas as
               | asides, I may have never made it any further in the field
               | (in terms of learning; I work in software, not maths).
               | 
               | Nevertheless, I'd never propose Bourbaki as a good model
               | for elementary education!
               | 
               | Revised ( _Introduction to..._ ) editions of Courant's
               | calc textbooks are particularly noteworthy in this
               | respect; to me, at least, they strike a very good balance
               | between classical and modern formalisms, and between
               | applications to pure and applied mathematics.
               | 
               | As for the application of "general abstract nonsense" to
               | computer science and engineering, I have no doubt Courant
               | would be pleased, with the applications if not in their
               | presentation, as the interplay between pure mathematics
               | and its applications was always an important subject to
               | him (see also the introduction to his [and, nominally,
               | Hilbert's] _Methods of Mathematical Physics_ [4], the
               | address he gave on variational methods in PDEs [5], often
               | cited as a foundational work in finite element analysis,
               | and, well, pretty much the entirety of _What is
               | Mathematics?_ [6]).
               | 
               | [1] https://en.wikipedia.org/wiki/Nonstandard_analysis
               | 
               | [2] https://onlinelibrary.wiley.com/doi/pdf/10.1002/97811
               | 1803324...
               | 
               | [3]
               | http://calteches.library.caltech.edu/2362/1/feynman.pdf
               | 
               | [4] https://onlinelibrary.wiley.com/doi/pdf/10.1002/97835
               | 2761721...
               | 
               | [5] https://www.ams.org/journals/bull/1943-49-01/S0002-99
               | 04-1943...
               | 
               | [6] https://archive.org/details/WhatIsMathematics
        
               | tubby12345 wrote:
               | Infinitesimals can't be used to prove anything - it's
               | really easy to write equations like
               | 
               | x + y + z = 0
               | 
               | dx/dy * dy/dz * dz/dx = -1
               | 
               | that make no sense if you don't know epsilon-delta
               | calculus.
               | 
               | You can use infinitesimal analysis but that's a
               | completely different thing that you probably won't enjoy
               | if you don't think epsilon-delta is straightforward.
               | 
               | https://en.m.wikipedia.org/wiki/Smooth_infinitesimal_anal
               | ysi...
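                | 
                | Spelled out for that example (each partial taken with
                | the remaining variable held fixed), the correct
                | computation is
                | 
                |     \frac{\partial x}{\partial y} =
                |     \frac{\partial y}{\partial z} =
                |     \frac{\partial z}{\partial x} = -1
                | 
                | so the product really is -1, whereas naively cancelling
                | the d's as if they were fractions would suggest +1.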
        
               | gmadsen wrote:
               | limits are still taught... which do the same thing as
               | infinitesimals
        
               | NyxWulf wrote:
               | I think infinitesimals are more like the partitions in a
               | Riemann Sum. The limit is where you are heading, whereas
               | infinitesimals are how you get there.
        
               | sudosysgen wrote:
               | In the finite Riemann sum written as a limit the
               | partition is just the limit of (x1 - x2)/m. I'd say it's
               | the reverse, you use the limit to resolve an equation of
               | two infinitesimals.
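                | 
                | Concretely, for a uniform partition (the standard
                | formulation, in LaTeX):
                | 
                |     \int_a^b f(x)\,dx = \lim_{n \to \infty}
                |     \sum_{i=1}^{n} f(x_i)\,\Delta x,
                |     \quad \Delta x = \frac{b - a}{n}
                | 
                | The \Delta x is the infinitesimal-like piece; the limit
                | is what makes the sum exact.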
        
           | fantod wrote:
           | If you're referring to the underlying rigorous analysis / set
           | theory / foundations of math, it's arguably still easier to
           | understand now that those things have been developed. Before
           | these theories existed, your only way of truly
           | "understanding" calculus to a similar level of detail would
           | have been to actually develop the foundations of math, which
           | would be impossibly hard for most people.
        
             | [deleted]
        
         | amusedcyclist wrote:
         | The author is actually talking about the opposite situation,
         | where CS theory departments now have a lot more number
         | theorists (and other mathematicians) dabbling in CS theory,
          | which has led to the use of far more complex tools than before.
         | (Fourier analysis of Boolean functions is one example I'm
         | familiar with)
        
           | Jensson wrote:
           | It might use more connections to number theory today, but the
           | connection was always there and to understand it you had to
           | understand number theory to a deep level from the start. The
           | other parts of computer science had to slowly build depth as
           | it was new ground.
        
         | sircastor wrote:
         | Once, during one of my classes I was reading an assigned paper
         | and realized "hey, I know this guy..." it was an exciting
         | little moment. I shared it with a friend who noted that unlike
         | most fields, CS is young enough that many of the pioneers in it
         | are still around.
        
           | ddlutz wrote:
           | This has happened a few times, I've ended up reading research
           | papers of people I work(ed) with professionally.
        
         | eru wrote:
         | The frontier isn't always hard. When someone discovers a new
         | powerful technique, there's often a wave of people just
         | applying it to everything, and seeing what sticks.
         | 
         | See https://slatestarcodex.com/2016/11/17/the-alzheimer-photo/
        
           | wittycardio wrote:
           | The frontier was breached when someone found that powerful
           | technique. Then it was no longer the frontier
        
             | Frost1x wrote:
              | I like to view it more as a leap in knowledge. Someone
              | jumped not just a foot into the frontier but half a mile,
              | and set up camp.
              | 
              | Now, there's half a mile of land for people to safely and
              | easily scout through, knowing they can always fall back on
              | the newly established camp and work back towards previous
              | understanding, or work from previous understanding out to
              | the new camp. There's often a lot of low-hanging fruit to
              | be had here once a leap, not a small incremental step into
              | the frontier, has been made.
        
             | dekhn wrote:
             | no, in science, technique isn't frontier, discovery is.
             | technique is just what you build to expand the frontier.
        
               | wittycardio wrote:
               | Not in cs theory or math, it is almost entirely a study
                | of technique. If you look at papers, quite often the
                | abstract will emphasize the novelty of their technique
                | rather than just the result.
        
               | dekhn wrote:
               | neither cs theory nor math is science. CS theory is a
               | form of applied math, and math is not science, it's a
               | distinct process (that is used by science, as a
               | technique). My comment doesn't apply there.
        
               | wittycardio wrote:
               | This is an article about CS theory...
        
               | dekhn wrote:
               | this subthread is about science, I'm responding to
               | https://slatestarcodex.com/2016/11/17/the-alzheimer-
               | photo/
        
         | nemo44x wrote:
          | Right - novel thought is always difficult. It looks easy and
         | obvious looking back but that's because, as you've pointed out,
         | effective teaching exists for it that draws obvious
         | connections.
        
           | Frost1x wrote:
            | It's interesting how in education and academia this is well
            | known and understood; however, the second you transition out
            | of education/academia to the real world in industry, it's as
            | if all this is lost/unknown, or the burden is simply ignored
            | to create pressure on everyone doing the work. Businesses
            | and business leaders often think novel work should be simple
            | because that's what you trained for. You studied computer
            | science, maybe even have your Ph.D., and therefore should be
            | able to do anything and everything possible using computers,
            | quickly and efficiently.
           | 
           | There's a large trench between what people expect and what
           | reality is here.
        
       | grumple wrote:
       | The background color on that site makes it very painful to read.
       | Author, if you're on HN, please make this something more normal.
        
         | jodrellblank wrote:
         | " _Please don 't complain about website formatting, back-button
         | breakage, and similar annoyances. They're too common to be
         | interesting. Exception: when the author is present. Then
         | friendly feedback might be helpful._" -
         | https://news.ycombinator.com/newsguidelines.html
        
       | actually_a_dog wrote:
        | Mention of Fourier transforms over fields other than the complex
        | numbers brought this to mind:
        | https://en.wikipedia.org/wiki/Sch%C3%B6nhage%E2%80%93Strasse...
        | 
        | TL;DR: It's a way to multiply n-digit integers in O(n log n
        | * log log n) time, developed in the early 70s, using DFTs in the
        | ring Z mod 2^k + 1, with a slightly clever choice of k. Since the
        | runtime was so tantalizingly close to O(n log n), it fueled
        | speculation that O(n log n) was the optimal runtime. That runtime
        | was finally achieved in 2019: https://hal.archives-
        | ouvertes.fr/hal-02070778v2
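        | 
        | A toy sketch of the FFT-multiplication idea in Python (my own
        | example; it convolves base-10 digits over complex floats, not
        | the exact ring Z mod 2^k + 1 the real algorithm uses, so it is
        | only good for smallish inputs):
        | 
        |     import numpy as np
        | 
        |     def fft_mul(a, b):
        |         # least-significant digit first
        |         da = [int(d) for d in str(a)][::-1]
        |         db = [int(d) for d in str(b)][::-1]
        |         n = 1
        |         while n < len(da) + len(db):
        |             n *= 2
        |         # pointwise product of spectra = convolution of digits
        |         spec = np.fft.fft(da, n) * np.fft.fft(db, n)
        |         coef = np.rint(np.fft.ifft(spec).real).astype(np.int64)
        |         # summing with powers of 10 handles the carries
        |         return sum(int(v) * 10**i for i, v in enumerate(coef))
        | 
        |     print(fft_mul(123456789, 987654321))  # 121932631112635269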
        
       | throwaway2016a wrote:
       | Anecdotally:
       | 
        | I went to a college in Boston in the early 2000s for my CS degree
        | and it was undergoing extra ABET accreditation to put it in line
        | with the engineering programs. The prior year had mostly algebra-
        | based coursework, but due to the change all our math (including
        | physics) was switched to calculus-based, just like the other
        | engineering programs. And the theory courses got more in-depth.
       | 
       | So if I had to guess based on my personal experience I'd say
       | added rigor is possibly due to the field becoming less like a
       | free-for-all and more like engineering.
       | 
       | In other words, knowing why something works vs just experimenting
       | until you stumble upon something that works.
       | 
       | I've always appreciated a B.Sc. or M.Sc. in Computer Science as
       | the academically rigorous cousins of Associate degrees, B.A., and
       | bootcamps. You can get a job with both types of education but one
       | understands the theory behind things far better.
       | 
       | Edit: Not sure if it is like this globally so just as an FYI...
        | B.Sc. and M.Sc. are Bachelor and Master of "science" degrees,
        | whereas a B.A. is in the liberal arts. In the US you have CS
        | degrees of both kinds. The B.A. is more well-rounded but the
        | B.Sc. is more rigorous in theory and math.
        
         | sumtechguy wrote:
          | What degree you get kind of depends on which college you went
          | to and when. The difference in my school between the eng track
          | and the arts track was basically 2 classes; both programs were
          | run by the same professors. If I had gotten the degree 10 years
          | earlier I would not have been able to get it at all and could
          | only have done math with a study in CS. As it was, I think my
          | CS program was 6 hours away from an additional math degree,
          | which makes sense as the math programs were typically supersets
          | of the CS degree for a long time. I do not know about the time
          | after, but from what I heard they simplified the program a bit
          | and went more algebra/Java like you said.
        
         | lanstin wrote:
         | The original article is not about these things. It is about the
         | difficulty of the rigor increasing. From some sort of easy
         | encoding of computation into integer math to doing Fourier
         | analysis over booleans. From maths you can grok in a few hours
         | to the horrendously complex stuff in number theory that takes
         | days and you aren't really sure you get it.
         | 
         | In neither case is it a matter of theory vs practice. It is
          | that the mechanics used by theorists are a lot more difficult
         | than a few decades ago.
         | 
         | Actual software development becoming like engineering, which
         | seems counter to my experience, is a different issue.
         | 
         | In my experience software veers away from engineering because
         | as soon as something is standardized enough to not be novel,
         | it's completely captured by a library or service or something
         | and is a solved problem. The new software is always doing
         | something new. And there are no lessons learned, except,
         | ironically, at the deep theory level.
        
       | fortran77 wrote:
       | It was really easy when I got my graduate degree in 1984. Don't
       | know if I'd be able to do it today. (And yes, I'm serious).
       | 
        | My undergraduate degree was in math, and I switched to CS for
       | graduate work because math got too hard.
        
       | dekhn wrote:
       | I took only one CS class in college- taught by David Huffman. It
       | was a wonderful class, most of it beyond me (I was a bio major
       | who liked hacking) and I ultimately failed. So, IMHO it was
       | definitely hard math (sphere packing, prefix-free codes) back in
       | the 90s and I'm sure before that.
       | 
       | Amusingly, that failure so drove me to learn computer science
       | that I'm now an expert, but I often get stuff like order
       | calculations wrong because they don't correspond well with actual
       | computer performance.
        
       | [deleted]
        
       | Otternonsenz wrote:
        | As someone whose childlike wonder is piqued by the technology I
        | grew up with, I've always tried to delve into any parts of
        | knowledge I could get my head around; I read
        | textbooks/papers/articles until
       | I come across something I don't get, then drill down into that
       | until I get it.
       | 
       | However, because my education/training has not been part of an
       | institution, I do feel like I am on the outside of understanding
       | when it comes to wrangling the lightning inside the rocks we work
        | on day in/day out. I've both worked DevOps (helpdesk jockey to
        | admin) and dabbled in frontend work, but mostly fell into those
        | positions and grew into them.
       | 
       | Couldn't tell you how to work with Kubernetes or Docker, but have
       | been following discourse on here enough to understand how they
       | are useful for big stuff. I used PuTTY back in HS to work on a
       | backend internship, but have no clue why it works. I have made at
       | least 2 programs still running internally at a big real estate
       | tech company, but cannot deploy my own apps on my own computer.
       | 
       | Is this disconnect with deeper Computer Science theory just my
        | issue as a visual and kinesthetic learner? Is the understanding
        | of
       | modern technology only suitably captured in a classroom setting
       | that I do not have access or time for?
        
         | 908B64B197 wrote:
         | Docker will make sense if you had a good OS class. The summary
         | is that it makes it easier to run and deploy software and gives
          | extra control over resource access and isolation between
         | programs.
         | 
         | Kubernetes will immediately make sense if you take a serious
         | Distributed Systems class.
        
         | winternett wrote:
          | Docker is basically a way to segment/emulate control,
          | configuration, and access to cloud host infrastructure on an
          | individual basis, because in shared hosting environments you
          | can't change settings without possibly compromising security
          | and negatively impacting everyone else hosted on the platform.
          | It adds a bunch of software on top of other software, requires
          | massive hardware and power consumption to drive it overall,
          | and still is not particularly more reliable or secure than the
          | old method of hosting your own site. It drives consumerism and
          | environmental pollution probably just as badly as before, if
          | not worse, here and in many other parts of the country, but it
          | generally feels less guilty because you don't have to see the
          | factories and plants -- they're hidden overseas and in rural
          | areas that you don't go to.
         | 
          | No one wants to admit that things were probably better when we
          | ran our own infrastructure and bought hosting at fixed pricing,
          | because they're busy making money off of the new and overly-
          | complex "utility priced" cloud model for hosting web sites and
          | apps, and because the learning curve for their competition is
          | now steeper, reducing the threat to their profit pipelines...
          | There, I said it.
         | 
         | These days people have a huge problem with getting to the point
         | in speech and in writing. If you put the premise up front, it
          | allows people to get it and move on, or to argue with you online
         | more easily without reading the elaboration below it.
         | 
         | There is usually a reason why software and hardware is made, to
         | solve a specific problem, or to solve a group of problems, yet
         | modern-day engineers don't see the importance of putting the
          | detail about the problem(s) that their tools solve forward
         | FIRST AND FOREMOST before gushing about how to use their tools.
         | A lot of the time they don't put the premise first because it's
         | an ugly solution, or in reality simply useless, or based on
         | being costly, and usually just too damn overly-complicated to
         | really work consistently and accurately.
         | 
         | We also have marketers, managers, and sales people working to
         | promote ideas without any real understanding about the reason
         | why those things were created. Many people simply don't care as
         | long as their share values increase or if they make a paycheck,
         | so there's even more of a barrier to meaningful answers to why
         | in the he|| things exist, to determining real meaningful fixes
         | to modern-day problems, and to why certain solutions are
         | different than other things used to solve already over-
         | complicated IT problems.
         | 
          | Life is better when it's simpler; simplifying things frees
         | up time to innovate beyond just ideas that are geared towards
         | making profit. The world would be a far better place if we
         | began a shift towards bringing technology back down to earth,
         | and if everyone would stop chasing the annual "tech leader
         | supreme monopolistic empire douchebag" awards.
         | 
         | :|
        
       | analog31 wrote:
       | I don't know if this is an analogy, but the first thing that came
       | to mind was: When did jazz improvisation get so hard? The answer
       | is the Bebop era. And it could just come down to, players got
       | sick of playing the easy stuff, so they started making up hard
       | stuff.
        
       | cweill wrote:
        | Slightly OT: the best Math for Computer Science book/course I ever
       | read is the one from MIT https://ocw.mit.edu/courses/electrical-
       | engineering-and-compu...
       | 
       | Prof. Tom Leighton is an incredible teacher who teaches the
       | intuition before the theory.
        
       | evancoop wrote:
       | I might argue that computer science theory is not fundamentally
       | harder, so much as the available tooling is far more
       | sophisticated. Not so many years ago, the function for mean and
       | median needed to be written rather than accessed within native
       | libraries.
       | 
       | Now, complex neural network architectures are available to all.
        | Alas, simply gaining that access does not in any way imply deeper
        | understanding of what is occurring under the hood. Perhaps the
        | higher technical demands are simply a byproduct of the more
        | complex "off-the-shelf" solutions and the need to do more than
       | simply pip install, point, and click?
        
         | mjburgess wrote:
         | I'd say its more the opposite. The lack of much "academic &
         | public understanding" of neural networks comes down to the
         | severe lack of statistical modelling and empirical theory-
         | building knowledge in the compu-sci community.
         | 
          | NNs are not a hard thing to understand, it's just regression --
          | there's parametric and non-parametric. And NNs are, just like
          | any generic approximator algorithm, a largely non-parametric
          | method. The lack of understanding of even what a parametric
          | method is, in CS, is very telling: NB. it has nothing to do
          | with the final model having parameters. It is whether the
          | method assumes a parametrised distribution of its input data.
          | Non-parametric methods are well-understood, they aren't magic,
         | and they aren't very hard to characterise.
         | 
         | Rather, it is exactly in those areas where compu-sci people are
         | most qualified that the mathematics gets the hardest. Much of
         | the low-hanging fruit has been picked (Turing, et al.) and
         | today comes the hard part of the thorniest problems.
        
           | CrazyStat wrote:
           | > The lack of understanding of even what a parametric method
           | is, in CS, is very telling: NB. it has nothing to do with the
           | final model having parameters. It is whether the method
           | assumes a parametrised distribution of its input data.
           | 
           | Having done my PhD work on (Bayesian) nonparametric methods
           | I'm struggling to parse this.
           | 
           | What is the input data? Just the explanatory (independent)
           | variables? Both explanatory and response (dependent)
           | variables? Are we talking about a joint or conditional
           | parametric distribution?
           | 
           | Many parametric methods (e.g. OLS regression) make no
           | assumption about the distribution of the explanatory
           | variables. Many "nonparametric" methods do make parametric
           | assumptions about the conditional distribution of the
           | response variable(s) conditioned on the explanatory
           | variable(s) (e.g. GP regression).
           | 
           | I don't see how this works as a classification for whether a
           | method is "nonparametric" or not.
        
             | mjburgess wrote:
              | So I'd put parametric vs. non-para on a scale. It can be
             | done in terms of the final model parameters, even -- but
             | this may seem initially weird.
             | 
             | If the parameters of the predictive model are a weakly
             | compressive function of the dataset, then your method is
             | non-parametric. If your parameters are extremely
             | compressive it's parametric. Subject to both being low-loss
             | models of the data.
             | 
             | Why? Well a non-parametric method is basically one which
             | "uses the data points as its statistical model"; and a
             | parametric method _fits_ the data to a prior low-
             | parameterised model.
             | 
             | Eg., linear regression essentially fits the data to a
             | normal distribution = parametric on the mean/std, ie.,
             | pdf(Y|X) = N(ax +b, stdev). You fit (a,b) and thus
             | essentially are just "finding a mean".
             | 
             | Eg., knn just remembers the dataset itself.
             | 
             | So there's a scale from "linear regression to knn", ie.,
             | Weights = (Mean, Stdev)... to Weights = (X, Y).
             | 
             | The terms parametric and non-parametric are fairly
             | overloaded, so this way of characterising the distinction
             | may either improve or worsen that.
             | 
              | Either way, my point is that a NN model with a very high
             | parameter count is essentially best analysed as KNN on a
             | weakly compressed feature space. In that sense it is an
             | incredibly obvious and simple algorithm.
             | 
             | Incidentally, NN can just be linear regression or KNN if
             | you set it up appropriately. So NN is an alg. which runs
             | "from knn to linear regression" depending on, eg.,
             | activation-fns, how hard you regularise it, etc.
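              | 
              | A minimal sketch of the two ends of that scale (my own toy
              | example in Python, numpy only):
              | 
              |     import numpy as np
              | 
              |     rng = np.random.default_rng(0)
              |     X = rng.uniform(-3, 3, 200)
              |     y = np.sin(X) + rng.normal(0, 0.1, 200)
              | 
              |     # parametric: the whole dataset is compressed into two
              |     # numbers (and is only as good as the assumed form)
              |     a, b = np.polyfit(X, y, 1)
              | 
              |     # non-parametric: the "weights" are the dataset itself
              |     def knn(x, k=5):
              |         return y[np.argsort(np.abs(X - x))[:k]].mean()
              | 
              |     print(a * 1.0 + b, knn(1.0), np.sin(1.0))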
        
       | kevmo314 wrote:
       | > I tend to think this is an outlier- for the most part, CS
        | theory back in the 1970's did not use hard math.
       | 
        | CS theory always had hard math. Software engineering didn't (and
        | still mostly doesn't), but also in the 1970's software standards
        | didn't exist yet, so you could literally do anything and it would
        | become the standard if it was pushed hard enough.
       | 
       | It's not like there's an intrinsic reason TCP packets are the way
       | they are now, it's largely an arbitrary choice that we
       | standardized on.
        
         | NikolaNovak wrote:
         | I found math depends on location as well. I moved around in the
         | 90's during my University years; what was billed as exactly the
         | same 3rd year networking course:
         | 
         | * University of Toronto, ON, Canada: HEAVY math; queuing
         | theory; no computer lab, no mention on any specific stack or
         | technology
         | 
          | * University of Athabasca, AB, Canada: No math; discussion of
          | OSI layers, TCP/IP, SMTP, Ethernet; gateways, routers, switches,
          | virtualization; lots of fun computer labs. Not technology
         | specific - it wasn't doing Cisco certs or anything. But good
         | working understanding of what's happening in networking world.
         | 
          | I always felt one university taught working software engineers;
          | the other university taught the dozen people worldwide who need
          | to create the next IPv8 :D
        
           | naasking wrote:
           | That's weird, because I was at U of T engineering in the late
           | 90s, took networking and they covered some math for various
           | calculations on ring and wireless networks, had labs, and
           | covered everything the Athabasca course covered as well. Was
           | that for computer science or engineering?
        
             | NikolaNovak wrote:
             | Computer Science; by all accounts, Computer Engineering was
             | less ivory tower / theory oriented.
        
         | titzer wrote:
         | > It's not like there's an intrinsic reason TCP packets are the
         | way they are now, it's largely an arbitrary choice that we
         | standardized on.
         | 
          | IP and TCP were _very_ carefully designed and were not much
          | like the networking standards of the time. It wasn't an
          | arbitrary choice any more than designing a rocket engine is
          | an "arbitrary choice".
        
           | bbulkow wrote:
           | I fought in the networking wars. Implemented a lot of
           | protocols and a lot of protocol gateways. Calling the reason
           | for TCP winning 'arbitrary' or even softening that to 'not
           | based entirely on technical merit' does not do justice to the
           | entire land war in Asia that happened between about '85 and
            | '95. If anyone cares about that period, they can do some
            | research; otherwise, I would propose not uttering unfounded
            | opinions.
        
           | [deleted]
        
           | Fellshard wrote:
           | I believe what's being stated is that TCP isn't some
           | mathematical inevitability, falling out of some discovered
            | law, but rather a tool that was adopted because it had
            | utility, even though it was certainly informed by formal
            | findings and hypotheses.
        
           | kazinator wrote:
           | > _IP and TCP were very carefully designed_
           | 
           | How did it happen that the byte order of the multi-byte
           | numeric fields in TCP/IP headers is big endian, which is
           | opposite of what most machines use now? Back then, did it
           | look as if big endian would become prevalent?
           | 
           | Today, almost every host attached to an IP network has to
           | byte swap.
           | 
           | The IPv6 people definitely ought to have designed a little
           | more carefully.
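            |
            | You can watch the swap from Python; a little sketch (the
            | output shown assumes a little-endian host):
            |
            |     import socket, sys
            |
            |     print(sys.byteorder)  # 'little' on most hosts today
            |
            |     # Multi-byte header fields go to network (big-endian)
            |     # order on the way out...
            |     port = 8080  # 0x1f90
            |     print(hex(socket.htons(port)))  # 0x901f
            |
            |     # ...and come back on receipt.
            |     print(socket.ntohs(socket.htons(port)) == port)  # True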
        
             | tuatoru wrote:
             | > Back then, did it look as if big endian would become
             | prevalent?
             | 
              | Yes. Big endian was "natural" and x86 was weird and
              | awkward, a niche low-end design that would never be
              | connected to networks.
        
             | GuB-42 wrote:
             | Big endian is better for routing applications. In the same
             | way that little endian is better for addition.
             | 
              | For example, say I send a message to 1.2.3.4: the first
              | router decides what the next node will be based on the
              | first number, the next one on the second, the next on the
              | third, etc. Because it is big endian, the first router
              | only has to read the first number to do its job; if it
              | were little endian, it would have to read the entire
              | address first.
             | 
              | Of course, it is a detail; I mean, the terms "little/big
              | endian" are a parallel to the pointless fights in
              | Gulliver's Travels. But even if it could have gone either
              | way, for IP, big endian is more natural.
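              |
              | A toy illustration in Python (nothing router-grade, just
              | the byte-order point):
              |
              |     import ipaddress, struct
              |
              |     addr = ipaddress.ip_address("1.2.3.4")
              |
              |     # Network (big-endian) order puts the most
              |     # significant octet first on the wire, so a /8
              |     # route is decided after reading one byte:
              |     print(addr.packed)     # b'\x01\x02\x03\x04'
              |     print(addr.packed[0])  # 1 -> next hop chosen
              |
              |     # Little-endian packing would put that octet last:
              |     print(struct.pack("<I", int(addr)))
              |     # b'\x04\x03\x02\x01'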
        
             | lanstin wrote:
             | When IP and TCP were deployed, most machines were big-
             | endian.
        
           | dahart wrote:
           | Parent wasn't taking away from the intention of design; they
           | were talking about what designs got canonized by accident.
           | This happens everywhere: very good and very intentional
           | designs become the de facto standard for arbitrary reasons,
           | often just being first. Electrons being negative, Leibniz
           | notation, musical notation and the four piece band, binary
           | digital circuits, hamburgers, the car, and even the structure
           | of TCP packets, are all things that got overwhelmingly
           | popular despite the presence of good and intentional
           | alternatives, they are largely arbitrary choices for
           | standardization.
        
             | titzer wrote:
             | Even still, I don't agree that IP became a standard for
             | arbitrary reasons. For the most part, IP did not "compete"
             | with other protocols; it encapsulates and bridges other
              | protocols, from ethernet to IPX to what-have-you. TCP is
              | properly understood as a layer above IP, the connection-
              | oriented stream protocol to match UDP, the packet-
              | oriented protocol. Both are encapsulated in IP packets.
              | IP was specifically designed to bridge heterogeneous
              | networks, and thus its name: _inter_-network protocol.
        
               | dahart wrote:
               | I hear your point, and you're right about TCP. I'm only
               | clarifying that even though you disagree, I (and parent I
                | believe) don't disagree with you. There's a lot of room
                | for vagueness and potential for miscommunication hidden
                | behind the word "arbitrary" - it can be a shorthand for
                | too many or too complex or too ad hoc to account for.
               | 
               | The point was basically that history is 'arbitrary', at a
               | larger scale than the intention of the TCP design or the
               | reasons it got chosen for standardization. Maybe the
               | primary reason that TCP stuck is because the military
               | committed to it first, and that signaled to the rest of
               | the world that it was good enough.
               | 
               | It doesn't necessarily require competition for something
               | to be arbitrarily canonized or standardized, things just
               | happen the way they happen. There can be good reasons all
               | along the way, but it still could have happened a
               | different way, and just didn't. TCP does choose some
               | specific tradeoffs, it's not the only way the internet
               | could have come about. Being first without competition is
               | one way it happens. When this happens it adds enormous
               | friction to later alternatives that have or make
               | different tradeoffs.
        
               | titzer wrote:
                | > There's a lot of room for vagueness and potential for
                | miscommunication hidden behind the word "arbitrary" -
                | it can be a shorthand for too many or too complex or
                | too ad hoc to account for.
               | 
               | Yeah, I get that, and of course everything is a spectrum,
               | including the _level_ of arbitrariness. But I am just
               | going by the definition of arbitrary in a dictionary:
               | 
               | > arbitrary, adj. based on random choice or personal
               | whim, rather than any reason or system.
               | 
               | Whereas my analogy with rocket engines was meant to
               | illustrate that IP was designed to solve a very specific
               | problem given a set of constraints. Most (let's say,
               | liquid-fueled) rocket engines end up with very similar
               | designs and engineering decisions are made to optimize
               | performance rather than whim. It's a very empirically-
               | driven engineering exercise.
               | 
                | And as for the OP's mentioning of the TCP packet
                | _format_, I still think we shouldn't regard that as
                | arbitrary. The location of the source and destination
                | fields, for example, was carefully chosen so that
                | routers of the time could minimize the logic necessary
                | to decode them and start to make routing decisions
                | before even most of the header had arrived. It was
                | literally optimized down to the bit position. Now, the
                | choice of byte order (big endian) might be considered
                | somewhat arbitrary (maybe even wrong) today, but even
                | that was motivated by the machines of the time: most
                | were big endian, while nowadays almost all remaining
                | architectures are little endian (causing no end of
                | confusion to poor networking students).
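                |
                | For instance, a toy sketch in Python of the fixed-
                | offset point (the values are made up): the destination
                | address of a minimal IPv4 header always sits at bytes
                | 16..20, so a router can slice it out without parsing
                | anything variable-length first.
                |
                |     import struct
                |
                |     # Minimal IPv4 header (20 bytes, no options),
                |     # fields packed in network byte order:
                |     hdr = struct.pack("!BBHHHBBH4s4s",
                |                       0x45, 0, 20, 0, 0, 64, 6, 0,
                |                       bytes([10, 0, 0, 1]),  # src
                |                       bytes([1, 2, 3, 4]))   # dst
                |
                |     dst = hdr[16:20]  # fixed offset, no parsing
                |     print(".".join(map(str, dst)))  # 1.2.3.4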
               | 
               | I don't disagree that history overall, and computer
               | history in particular is full of arbitrary choice points,
               | but I think TCP/IP isn't a good example of that.
        
               | dahart wrote:
               | FWIW, I didn't read the top comment as saying anything
               | about the TCP format being arbitrary. I could be wrong,
               | but wearing my glasses with benefit-of-the-doubt
               | strongest-plausible-interpretation lenses, here's what I
               | see above. It's two separate clauses in the last
               | sentence:
               | 
               | > It's not like there's an intrinsic reason TCP packets
               | are the way they are now
               | 
               | This to me invokes the idea of mathematical intrinsic
               | properties. For example the idea of the number zero
               | didn't always exist, but to this day we don't have
               | multiple representations or any alternatives. Something
               | about the number zero feels intrinsic. Addition,
               | multiplication, derivatives and integrals seem like
               | operations that might have different notations, but are
               | unchanging intrinsic concepts. TCP formats don't seem to
               | share this intrinsicness. It is in that sense that I
               | agree with the top comment; there is nothing intrinsic
               | about the TCP format, it is the result of conscious
               | engineering choices, and today there are reasonable
               | alternatives. In this sense I see your argument as
               | supporting this idea that there's nothing intrinsic about
               | TCP.
               | 
               | > it's largely an arbitrary choice that we standardized
               | on.
               | 
               | The word "arbitrary" was applied to the choice of
               | standard, not the format. I'm choosing to see a second,
               | different idea in that second clause. I see how and why
                | it implies, to you and others, that the format itself
                | is being called arbitrary, but my defense of the
                | comment is based on interpreting it differently than
                | you did.
               | 
               | BTW, the dictionary definition starts with "based on
               | random choice". If we stop there, it still adequately
               | matches what I was talking about, and it also
               | unfortunately doesn't add much color either. Then the
               | keyword becomes "random" rather than "arbitrary", but
               | "random" is also commonly used to mean the reason is
               | unknown or complex.
               | 
               | Total tangent, but being a fan of Monte Carlo methods, I
               | frequently use "arbitrary" to disambiguate from truly
               | "random". When people are asked to pick random numbers,
               | the results are arbitrary but not random in the
               | mathematical sense. ;)
        
               | hluska wrote:
               | Honestly friend, I think you're both correct. You're
               | correct in that TCP and IP are the product of smart,
               | driven design decisions. The people you're arguing with
               | are correct in that the internet won a standards war so
               | other protocols disappeared. The standards war wasn't
               | always fought over technical merits and
               | philosophy/politics were both part of it.
        
             | bbulkow wrote:
             | Definition of arbitrary: 'based on random choice or
             | personal whim, rather than any reason or system.' (Google)
             | 
             | Being first is not arbitrary. It is hard to achieve, and
             | valuable. It is a reason.
             | 
             | There were networking systems before TCP and alongside TCP,
             | TCP was clearly superior for a large number of reasons.
             | 
              | Given that everyone on Hacker News's livelihood stems
              | from the awesome of TCP (not an overstatement), I propose
              | a bit more respect.
        
             | JohnHaugeland wrote:
             | > Parent wasn't taking away from the intention of design
             | 
             | Yes they were, very literally, and incorrectly to boot
        
           | [deleted]
        
         | rabite wrote:
         | >It's not like there's an intrinsic reason TCP packets are the
         | way they are now, it's largely an arbitrary choice that we
         | standardized on.
         | 
          | This is incredibly off-base. There was a huge effort to force
          | an OSI networking protocol stack designed by a committee of
          | government bureaucrats and Big Blue sales engineers. It was a
          | total disaster, but the resources dedicated to forcing the
          | OSI memes ran into the billions of dollars. Large amounts of
          | actual physical hardware were produced conforming to protocol
          | standards that nobody doing anything real wanted to use,
          | because a lot of government agencies required all network
          | equipment to be compatible with the OSI protocol stack.
         | 
         | TCP, UDP, and the modern routing protocols (BGP, OSPF, etc) and
         | their associated address resolution systems were not an
         | arbitrary choice -- they represented orders of magnitude level
         | improvements over the government-backed OSI standards and
         | getting them to succeed was the product of intense collective
         | effort of some of the most brilliant minds of the past century.
         | 
         | https://en.wikipedia.org/wiki/Protocol_Wars
        
           | walshemj wrote:
            | As a former OSI person (UK third-line X.400/500 support), I
            | think that TCP/IP won as it was simpler and good enough.
            | 
            | TCP/IP was in no way a technical improvement over OSI.
        
         | Ar-Curunir wrote:
         | No. NP-Completeness was first defined in the 70s. That's
         | literally a foundation of much subsequent work. The depth of
         | techniques used back then was mostly just elementary
          | combinatorics, probability, and number theory. In comparison,
          | much work nowadays requires complex algebra, advanced
          | combinatorics, and deeper mathematics like measure theory.
         | 
         | (This is not to downplay the work of the pioneers who
         | discovered these foundational concepts; just to point out that
         | they used less sophisticated mathematical tools)
        
           | AnimalMuppet wrote:
           | What in computer science requires measure theory?
        
             | Ar-Curunir wrote:
              | Eg: https://simons.berkeley.edu/talks/michael-mislove-2016-08-30
        
               | AnimalMuppet wrote:
               | No, I'm not going to watch an hour-long video for the
               | answer. The intro seems to be saying that "domains"
               | (undefined) are related to CS, and also related to
               | measure theory. That's... not much to go on...
        
               | bidirectional wrote:
               | There's an abstract right there:
               | 
               | > Measure theory has played an important role for domains
               | in modeling probabilistic computation.
        
               | AnimalMuppet wrote:
               | Yup. Read it. Without a definition or context for
               | "domains", it doesn't tell me much.
        
         | SavantIdiot wrote:
          | Feynman used a differential equation to work out the correct
          | size of the buffers on the first Connection Machine [1].
         | 
         | [1] https://en.wikipedia.org/wiki/Connection_Machine#Designs
        
         | jessriedel wrote:
         | The author explains pretty clearly what he means by "hard math"
         | and cites many specific developments that introduced
         | sophisticated techniques into academic CS where they weren't
         | used before.
        
         | JohnHaugeland wrote:
         | > It's not like there's an intrinsic reason TCP packets are the
         | way they are now
         | 
         | What are you talking about? It was a three year argument based
         | extremely heavily on prior knowledge
         | 
         | Why does everyone think that just because they don't know what
         | craft went into something, there wasn't any?
        
         | throwvirtever wrote:
         | > [I]n the 1970's software didn't exist
         | 
         | There actually was software in the 1970s, and earlier.
        
         | aduitsis wrote:
          | But let us please note that the choice of the low-pass filter
          | gain constant in the original round-trip estimator of the TCP
          | congestion avoidance algorithm by Van Jacobson is not
          | arbitrary.
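          |
          | That estimator boils down to a one-line low-pass filter; a
          | sketch in Python, using the RFC 793 gain of 0.9 purely for
          | illustration:
          |
          |     ALPHA = 0.9  # the gain constant in question
          |
          |     def update_srtt(srtt, sample):
          |         # exponentially weighted moving average of RTT
          |         return ALPHA * srtt + (1 - ALPHA) * sample
          |
          |     srtt = 100.0  # ms
          |     for sample in [100, 100, 300, 100, 100]:
          |         srtt = update_srtt(srtt, sample)
          |         print(round(srtt, 1))  # the 300 ms spike is damped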
        
         | bo1024 wrote:
         | The author may have a different definition of "hard math" than
         | you...
        
           | fragmede wrote:
           | I think it's a reference to the kind of math involved, like
           | hard science vs soft science. Not that I'm sure what soft
           | math would be. Probabilities?
        
             | [deleted]
        
             | Ar-Curunir wrote:
             | No, the difference is whether you require deep theorems
             | from algebra, measure theory, combinatorics, etc.
        
             | hammock wrote:
             | Statistics. Accounting. Etc
        
             | thanatos519 wrote:
             | Economics.
        
             | leoc wrote:
             | 'Hard' is meant as an antonym of 'easy' rather than 'soft'
             | here.
        
             | Nasrudith wrote:
              | Soft math vs hard math has me thinking approximations,
              | estimations, and heuristics, as opposed to exact
              | solutions. Neither type of solution is inherently easy or
              | difficult, of course.
        
       | bob331 wrote:
       | Who cares. I'm earning fat bucks and I know none of this
        
       | kaymanb wrote:
       | I think it's interesting that we seem to equate CS Theory being
       | hard with involving sophisticated math. Are there any examples of
       | concepts in Theory that are considered hard, but not because of
       | their dependence on math?
       | 
       | The article seems to focus on complexity theory, but I couldn't
       | think of any from that sub-field. My first thought is the Paxos
       | algorithm for distributed consensus [1], which definitely has a
       | reputation of being difficult to understand, and fits under the
       | Theory umbrella.
       | 
       | [1] https://en.m.wikipedia.org/wiki/Paxos_(computer_science)
        
         | lanstin wrote:
          | Yeah, I think a model of computation set in Minkowski
          | spacetime is novel compared to the classic theory of
          | computation, which is sort of set in Plato land.
        
         | titanomachy wrote:
         | I wonder if Paxos is actually considered hard to understand by
         | academic computer scientists, or if it just happens to be
         | harder than most of the algorithms that practicing software
         | engineers regularly try to understand.
         | 
         | Disclaimer: I'm a software engineer who doesn't understand
         | Paxos.
        
       | srvmshr wrote:
        | Computational theory was always hard. What is different now is
        | that we see a lot of confluence with practical applications,
        | and people who haven't specifically trained for it find it hard
        | to parse. It was previously in the realm of math - now there is
        | some drift towards modern software engineering.
        
       | nostrebored wrote:
       | Talking about theory as a single subdomain of CS doesn't really
       | make sense anymore. It's a wide and varied field. I have friends
       | who work mainly in automata and complexity and have worked with
       | people focused exclusively on a single class of algorithms.
       | 
       | They're remarkably different in difficulty.
       | 
        | My friend in complexity is clearly a mathematician. My friend
        | who is an expert in randomized algorithms is doing a lot more
       | darts at a dartboard with enough theory chops to make his papers
       | interesting.
       | 
       | I studied CS theory and modeling in university and my degree was
       | effectively a math degree. Without the math and analysis piece I
       | don't think the theory ever would've been intuitive.
        
       | mindcrime wrote:
       | I'm not sure what the answer to this question is in general, but
       | from my perspective (was a CS major in the mid 1990's) there were
       | definitely parts of this that were hard (by my standards) by
       | then. All of the computational complexity stuff (P vs NP, etc.)
       | feels akin to black magic to me. Especially so when you get into
       | that whole "complexity zoo" thing where there are 10 bazillion
        | different complexity classes. To this day I have no idea what
        | most of them mean and have to visit the corresponding Wikipedia
        | page and read up on any that I encounter.
        
       | hprotagonist wrote:
        | Considering the founders' intellectual merits, I'm going to go
        | out on a limb and say that CS theory has been hard since day 0.
        
         | tech_tuna wrote:
         | Since day -1
        
         | arethuza wrote:
         | Out of interest, who do you regard as the founders of CS theory
         | - I'd probably go for Church and Turing, but I can see
         | arguments for others as well.
        
           | criddell wrote:
           | Off topic, but Charles Petzold wrote a really great book that
           | walks you through Turing's historic paper on computability
           | and the Turing Machine. The book is called _The Annotated
           | Turing_.
           | 
           | I've just started reading it and Petzold claims all you need
           | is a decent grasp of high school mathematics.
        
             | bengale wrote:
             | I've had that book on my shelf for so long, I really do
             | need to shift it higher in my reading list.
        
               | mindcrime wrote:
                | _I've had that book on my shelf for so long, I really
                | do need to shift it higher in my reading list_
               | 
               | Glad to see I'm not the only one...
        
           | voldacar wrote:
           | Leibniz technically
        
           | wongarsu wrote:
           | What does "founder" mean to you? Leibniz did a lot of
           | important foundational work 300 years ago, but at the same
           | time CS as a separate field didn't emerge until the 1940s
           | with pioneers like Turing.
        
           | varjag wrote:
           | Shannon, von Neumann
        
           | tgv wrote:
           | Might want to add Post, Chomsky and Kleene, too.
        
           | loraxclient wrote:
           | Tangential but may be of interest: a cool book (compilation?)
           | that somewhat addresses this topic is 'Ideas That Created the
           | Future' edited by Harry R. Lewis [0].
           | 
           | [0] https://mitpress.mit.edu/books/ideas-created-future
        
             | dc-programmer wrote:
             | Working through it now - reading the Laws of Thought by
             | George Boole was a spiritual experience for me. The
             | connection between natural language, logic, and math
              | finally made sense to me. Before, I had a hard time
              | understanding the saliency of things like Analytic
              | Philosophy (what does language have to do with truth?) or
              | why the work of someone like Chomsky would be so important
              | to computer scientists. Boole wasn't the only person at
              | the time to work on a project to reduce logic to solvable
              | equations (clearly that time in history was ripe for this
              | work, and also Leibniz had laid some of the spiritual
              | groundwork, himself inspired by Ramon Llull and Confucian
              | texts), but the way he wrote about it really resonated
              | with me.
             | 
             | The other surprising thing to me was that it didn't really
             | resemble modern algebra at all. As a former math student I
             | think it's easy to create a mental model of math history
             | where when new fields are created they resemble the modern
             | incarnation. But when you actually look at the primary
              | founding texts it's clear the ideas and notation are
              | pretty unrefined, at least compared to today's standards.
        
           | DeathArrow wrote:
            | Gottfried Leibniz, David Hilbert, Kurt Gödel.
        
             | mindcrime wrote:
             | Should Russell and Whitehead count as well, in terms of
             | this path which seems focused on the formal logic aspect?
        
         | tialaramex wrote:
         | The nice thing about an open frontier is that it's really easy
         | to be an innovator. If you put more than a passing thought into
         | almost anything in the vast Terra Incognita you're faced with,
         | you are now the world expert.
         | 
         | Today Grace Hopper's "compiler" wouldn't merit the name, it's
         | just a linker loader, what's the big deal? Well the big deal
         | was nobody had actually programmed the computer to help them
         | program the computer before. A huge sub-discipline of computer
         | science simply didn't exist.
         | 
         | Consider another discipline, metrology. Pioneers in metrology
         | weren't doing advanced physics like you would see in a
         | metrology lab today, they just had the insight that it sure is
         | easier if everybody agrees how much corn this is, so they used
         | prototypes, which is pretty much the simplest strategy but it's
         | already a huge improvement on nothing at all. It took until
         | this century for the last prototype (the Kilogram) to be
         | replaced. On the route there those prototypes got a lot more
         | sophisticated, and we learned a lot more about what it actually
         | means to have a certain amount of a thing, so metrology got
         | much harder.
        
       | kazinator wrote:
       | When did security get so hard? It used to be that you could
       | assume that the supervisor mode of your CPU was safe from the
       | user mode. Ah, the good old days.
        
       | wespiser_2018 wrote:
       | I'd say it was "hard" all along. The question of computability is
       | challenging in that it requires a good deal of formalism and
       | theory to get anywhere near problems we deal with every day.
       | 
        | Take NP-Completeness, for instance. We know a problem is NP-
        | Complete if we can do a Karp reduction from another NP-Complete
        | problem and also prove the problem is in NP. Sure, that's fine.
        | But how was the first NP-Complete problem bootstrapped? Well,
        | using automata and generalized Turing machine languages! You
        | can use NP-Completeness as a concept at work and never touch
        | the original proof using a non-deterministic Turing machine.
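        |
        | (A toy example of the reduction mechanics, not the Cook-Levin
        | bootstrap itself: Independent Set reduces to Clique by
        | complementing the graph. A sketch in Python:)
        |
        |     from itertools import combinations
        |
        |     # G has an independent set of size k iff the complement
        |     # graph has a clique of size k.
        |     def reduce_is_to_clique(n, edges):
        |         all_pairs = set(combinations(range(n), 2))
        |         return all_pairs - {tuple(sorted(e)) for e in edges}
        |
        |     # Brute-force checker, just to watch the reduction work.
        |     def has_clique(n, edges, k):
        |         return any(all(tuple(sorted(p)) in edges
        |                        for p in combinations(s, 2))
        |                    for s in combinations(range(n), k))
        |
        |     # Path 0-1-2: {0, 2} is an independent set of size 2.
        |     comp = reduce_is_to_clique(3, [(0, 1), (1, 2)])
        |     print(has_clique(3, comp, 2))  # True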
       | 
        | That's at least one course's worth of material to teach in
        | order to get students to understand automata. To me, that's a
        | complex approach to a simple question: what can we compute? We
        | have to invent a series of automata of increasing complexity
        | and corresponding theories/proofs. I don't think it's bad;
        | it's just the nature of the problem!
        
         | Ar-Curunir wrote:
          | The point is that one could catch up to the state of the art
          | of 90s CS theory with basically one upper-division course.
          | That is not a lot, from a mathematical perspective.
         | 
         | Till a few years ago, undergrads could contribute to CS theory
         | research, whereas in math only senior grad students can do
         | that.
         | 
          | What the article is saying is that CS theory is slowly moving
          | towards the latter model, as more work is done and the low-
          | hanging fruit is picked off.
        
       | black_13 wrote:
        | When the middle class disappeared. When the professions in the
        | middle tier went bye-bye, everyone flocked to this as a career;
        | now the great winnowing occurs.
        
       ___________________________________________________________________
       (page generated 2021-11-16 23:01 UTC)