[HN Gopher] Ask HN: Was programming more interesting when memory...
       ___________________________________________________________________
        
       Ask HN: Was programming more interesting when memory usage was a
       concern?
        
       I got into programming professionally around 2014, seemingly well
       after memory and the like stopped being much of a concern for 99%
       of applications.  I always look enviously back at when you had to
       figure out bootstrappy ways to solve problems because of the
       inherent limitations of the hardware, instead of just sitting
       there gluing frameworks together like a loser.  Am I wrong, or was
       programming just way cooler back then?
        
       Author : morph123
       Score  : 19 points
       Date   : 2023-04-02 19:08 UTC (3 hours ago)
        
       | PaulHoule wrote:
       | Like not having enough or managing the memory you do have?
       | 
       | I'd say that garbage collection was key to large-scale software
       | reuse: without it, you'd need a lot of cooperation between
       | libraries and applications (how does the application know the
       | library no longer needs a piece of memory, or vice versa?).
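       | 
       | A minimal, made-up C sketch of that ownership question (the
       | library function below is invented for illustration, not from
       | any real API):
       | 
       |   #include <stdlib.h>
       |   #include <string.h>
       | 
       |   /* Hypothetical library routine: hands back a heap-allocated
       |      string.  Without GC, the header has to spell out who is
       |      responsible for freeing it. */
       |   char *lib_greeting(void)
       |   {
       |       char *s = malloc(6);
       |       if (s)
       |           memcpy(s, "hello", 6);
       |       return s;          /* contract: caller must free() */
       |   }
       | 
       |   int main(void)
       |   {
       |       char *g = lib_greeting();
       |       /* If that contract is missing or misread, this free() is
       |          either mandatory or a double free in waiting.  A GC
       |          makes the whole question go away. */
       |       free(g);
       |       return 0;
       |   }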
        
         | morph123 wrote:
         | I guess a bit of both. Managing memory itself is not that
         | interesting, as it is mostly just a nice shotgun to shoot
         | yourself in the foot with. I just like the idea of working in
         | a more limited environment where I am forced to come up with
         | perhaps more novel solutions than at my job now, which is
         | just stitching together random libraries and creating
         | unnecessarily complicated SQL queries.
        
       | revelio wrote:
       | Memory stopped being a big limiter on the desktop way before
       | 2014. You'd want to go back to the 90s for it to be a major
       | change in how things were done. So, back when people had 8 MB
       | of RAM, or maybe 16 MB if the machine was high spec, and when
       | in theory Windows could run in 4 MB. Swap existed, but disks
       | were so slow that if you actually hit swap your machine would
       | drag to a halt.
       | 
       | Anyway. No, it really wasn't cool.
       | 
       | The thing to realize is that software wasn't really that much
       | better optimized for RAM than today. Maybe a bit, but mostly it
       | just created a lot of pain:
       | 
       | - Every single API was specified to be able to fail due to OOM,
       | and a lot of programmers tried to check for this and recover.
       | However, there were no good ways to simulate or test it, and unit
       | testing was in its infancy as a practice (e.g. no widely adopted
       | frameworks for testing), and so in practice this just yielded a
       | ton of codepaths and boilerplate that never really got tested and
       | probably didn't work.
       | 
       | - Because RAM was so tight, you pretty much had to use the
       | operating system's APIs for everything, even if they sucked or
       | were full of bugs. It wasn't just RAM of course, it was also disk
       | space, CD space, nearly non-existent network bandwidth.
       | Duplication of what Windows had wasn't feasible. It meant
       | everything was pretty consistent, which had its upsides, but it
       | also meant that everything was consistently quite ugly and the
       | horrible Windows/macOS APIs were all you got.
       | 
       | - Memory limits resulted in awkward APIs. No type safety, no
       | enums (only bit flags), no reflection, lots of annoying (and by
       | the late 90s obsolete) memory locking protocols that were
       | holdovers from Win16, and "rerendering" literally meant
       | redrawing small areas of the screen because pixels weren't
       | cached. In particular, a lot of APIs required you to call them
       | twice: once to figure out how much memory something would
       | require, and then a second time to fill out that memory once it
       | was allocated (a rough sketch of this follows after the list).
       | 
       | - Error messages? Logging? Hah, no. You got 32-bit error codes,
       | because there wasn't enough memory or disk space for all the
       | strings that proper logging and error messages would require.
       | So you got really good at decoding HRESULTs.
       | 
       | - Garbage collection existed as a tech, but swap was so slow
       | and RAM so tight that it was very easy for GC to trigger swap
       | storms, so in common practice automatic memory management
       | (when it existed) was all based on refcounting. You couldn't
       | quite just forget about memory, because you could still leak
       | it pretty easily even using a higher-level language like VB.
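       | 
       | To make the "call it twice" pattern and the hand-rolled OOM
       | handling above concrete, here's a minimal sketch around
       | GetEnvironmentVariableW (purely illustrative, not from any
       | particular codebase):
       | 
       |   #include <windows.h>
       |   #include <stdio.h>
       |   #include <stdlib.h>
       | 
       |   int main(void)
       |   {
       |       /* Call 1: ask how big the buffer must be, in wide chars
       |          including the terminating NUL. */
       |       DWORD needed = GetEnvironmentVariableW(L"PATH", NULL, 0);
       |       if (needed == 0)
       |           return 1;           /* variable missing, or error */
       | 
       |       /* The allocation itself can fail, and every path like
       |          this was supposed to be checked and unwound by hand. */
       |       wchar_t *buf = malloc(needed * sizeof(wchar_t));
       |       if (buf == NULL)
       |           return 1;           /* the dreaded OOM path */
       | 
       |       /* Call 2: actually fill the buffer. */
       |       if (GetEnvironmentVariableW(L"PATH", buf, needed) == 0) {
       |           free(buf);
       |           return 1;
       |       }
       | 
       |       wprintf(L"%ls\n", buf);
       |       free(buf);
       |       return 0;
       |   }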
       | 
       | The transition towards relative memory abundance was fairly
       | painful. For example, Java became popular with devs because it
       | had all those nice things that used a lot of memory but it led to
       | slow/swappy apps that were painful for users. You still see the
       | same complaints w.r.t. Electron apps, though today machines can
       | take it and so it's more about people feeling it's wasteful
       | rather than it literally killing the responsiveness of the
       | machine like it used to be.
       | 
       | Of course in the above I say "was" and "used to be" but if you
       | ever have to do any Win32 or low level POSIX programming you'll
       | be right back in that world.
        
       | DamonHD wrote:
       | It still is cool, because embedded computing still has memory,
       | CPU (and power), and code constraints.
        
         | morph123 wrote:
         | I do want to get into that, but where I live embedded
         | computing jobs are kind of rare. I do like programming in C
         | quite a bit.
        
           | gostsamo wrote:
           | You can try some home projects to decide if you like it.
        
           | DamonHD wrote:
           | It's also possible to (for example) build 'normal' Web pages
           | that load very fast, e.g. some of the main readable content is
           | within the first few TCP packets, and the total page weight
           | is <1% of typical. You won't get a medal, but it's fun, and
           | sometimes the perf experts notice unprompted!
        
           | livueta wrote:
           | It's nearly the exact opposite of resource-constrained
           | computing, but HPC / exabyte-scale storage type stuff has
           | some of the same vibe of needing to know your hardware and
           | wanting to squeeze the most out of it that you can.
           | Consistently saturating a 100GbE link is a very different
           | problem from dynamic data structure allocation on a
           | microcontroller or whatever, but you have to care about some
           | of the same sorts of things.
        
       | GianFabien wrote:
       | Programming embedded systems using Atmel and similar low-cost
       | SoCs is very much like working on computers in the 1980s and
       | 1990s, but with PC-hosted dev tools it's far easier and cheaper.
       | 
       | Since you mention an interest in this space, consider buying an
       | Arduino kit and building some cool personal projects. Once you
       | have some experience, look around your part of the world and
       | identify any _needs_ that you could solve with an embedded
       | solution.
        
       | bell-cot wrote:
       | YES, it was for-sure cooler.
       | 
       | Memory mattered. Ditto CPU cycles, disk, and network bandwidth.
       | (Not that your code could count on there _being_ a network, in a
       | lot of cases.) Security wasn't a Garden of Eden... but now that
       | can feel more like being a night-shift ICU nurse during a COVID
       | surge that never ends. And the software stack under your code was
       | orders of magnitude smaller, and slower-changing. These days -
       | that can feel like you aren't a programmer, but instead a Wall
       | Street attorney who's trying to pilot some giga-corporate merger
       | through the shifting regulatory and political obstacles in 37
       | different countries.
        
       | helph67 wrote:
       | Yes, knowing your hardware always helped. Subroutines used
       | frequently and placed early in the code helped reduce memory use
       | and improve speed. Back in the early 1980s, "80 Microcomputing"
       | magazine ran an annual competition for one-line BASIC programs.
       | Some entries proved just how clever some coders could be.
       | https://en.wikipedia.org/wiki/80_Micro
        
       | [deleted]
        
       | Dwedit wrote:
       | You had to deal with dynamic loading of code or data when needed
       | rather than preloading everything into RAM all at once.
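       | 
       | A minimal C sketch of the idea (a hypothetical asset record, not
       | from any particular game or app): load on first use, free under
       | memory pressure, reload from disk the next time it's needed.
       | 
       |   #include <stdio.h>
       |   #include <stdlib.h>
       | 
       |   struct asset {
       |       const char *path;
       |       void       *data;   /* NULL = not currently resident */
       |       size_t      size;
       |   };
       | 
       |   /* Load on demand; callers treat NULL as "couldn't load". */
       |   void *asset_get(struct asset *a)
       |   {
       |       if (a->data == NULL) {
       |           FILE *f = fopen(a->path, "rb");
       |           if (!f)
       |               return NULL;
       |           fseek(f, 0, SEEK_END);
       |           a->size = (size_t)ftell(f);
       |           rewind(f);
       |           a->data = malloc(a->size);
       |           if (a->data)
       |               fread(a->data, 1, a->size, f);
       |           fclose(f);
       |       }
       |       return a->data;
       |   }
       | 
       |   /* Called when RAM is tight; the asset can be re-fetched. */
       |   void asset_evict(struct asset *a)
       |   {
       |       free(a->data);
       |       a->data = NULL;
       |   }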
        
       | zxcvbnm wrote:
       | Yes. Also, back then in the offline-ish times people shipped
       | working software. Nowadays stuff is huge, slow, craps out all the
       | time, and the risk of stuff breaking with the next update is
       | comparable to the security risks of turning off updates
       | completely.
        
       | markus_zhang wrote:
       | Definitely interesting. I'd love to do everything my way, however
       | shitty it is. But it's MINE! Plus, if you ever scroll back to
       | 1991 and take a look at the first version of any Linux
       | command-line program, I bet it's not pretty.
        
       | czzr wrote:
       | No
        
       | [deleted]
        
       | nathants wrote:
       | these things still matter for the interesting/challenging
       | sections of problem space.
        
       | xboxnolifes wrote:
       | Constraints are what make puzzles interesting. Low
       | memory/compute/throughput/etc availability makes software more
       | like a good puzzle.
       | 
       | The more constraints you remove, the more something transitions
       | from being a puzzle to being a canvas. Each end appeals to
       | different kinds of people.
        
       | mitchellpkt wrote:
       | There are still some performance-sensitive niches that you might
       | enjoy. Large scale numeric simulation workloads can have a
       | variety of bottlenecks, and the difference between the easiest
       | solution and the fastest solution can be orders of magnitude.
       | Smart contract development also requires tight memory
       | management; otherwise, the fees can become prohibitively high.
        
       | alkonaut wrote:
       | It's still a huge concern. Cache today is what memory was in
       | the 80s and 90s. Memory today is what disk was. And your L1
       | cache today might be 64 KB! That's basically what you can work
       | with if you want to use your CPU at full speed.
       | 
       | For anything that isn't IO bound you are very often bound by
       | memory access time. CPUs are incredibly fast and feeding them
       | data to work on is very difficult, more so today than before!
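       | 
       | A standard illustration of being memory-bound rather than
       | compute-bound (a sketch, not anything from a real workload): the
       | two loops below do the same arithmetic, but the column-major one
       | misses the cache on nearly every access once the matrix is
       | bigger than the caches.
       | 
       |   #include <stdio.h>
       |   #include <stdlib.h>
       | 
       |   #define N 2048
       | 
       |   /* Row-major traversal: consecutive addresses, streams
       |      through the cache. */
       |   long sum_row_major(int m[N][N])
       |   {
       |       long s = 0;
       |       for (size_t i = 0; i < N; i++)
       |           for (size_t j = 0; j < N; j++)
       |               s += m[i][j];
       |       return s;
       |   }
       | 
       |   /* Column-major traversal: strides N*sizeof(int) bytes per
       |      access, defeating the cache and the prefetcher. */
       |   long sum_col_major(int m[N][N])
       |   {
       |       long s = 0;
       |       for (size_t j = 0; j < N; j++)
       |           for (size_t i = 0; i < N; i++)
       |               s += m[i][j];
       |       return s;
       |   }
       | 
       |   int main(void)
       |   {
       |       int (*m)[N] = calloc(N, sizeof *m);  /* N x N ints */
       |       if (!m)
       |           return 1;
       |       printf("%ld %ld\n", sum_row_major(m), sum_col_major(m));
       |       free(m);
       |       return 0;
       |   }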
       | 
       | The big difference is that a category of programming jobs has
       | appeared where this is almost never a concern because it's about
       | shoving text around between servers.
        
       | dave4420 wrote:
       | It was more interesting in the sense of "May you live in
       | interesting times."
       | 
       | I don't miss low level programming in a work context at all.
        
         | roundandround wrote:
         | I don't miss restrictions when writing code, I miss them when
         | reviewing code.
        
       | mistrial9 wrote:
       | A huge amount of effort went into managing the "soft loading"
       | of assets into RAM pre-1995 or so, and even later. Swapping
       | allocated blocks in and out well was very much a thing. Games
       | have gotten away with different tricks since forever, and then
       | desktop work apps with non-trivial parts did this sort of thing
       | too. Does it make you sharper, or the work more interesting, to
       | have to jump through hoops on a regular basis? YMMV.
        
       | eesmith wrote:
       | For the people who found that interesting, yes, it was more
       | interesting back then. Most projects were tight on memory, so
       | those skills were needed, and easy to find comrades in the
       | struggle.
       | 
       | However, people go into programming for many reasons. For those
       | who were not interested in squeezing out every word, it was not
       | interesting. Probably some people realized their tasks couldn't
       | be done in the limited hardware of the time, so didn't even try
       | until memory became significantly cheaper.
       | 
       | The past can seem like a golden era of mythic heroes.
       | 
       | Some from the generation you envy are themselves envious of the
       | previous generation, where programmers were expected to write
       | their own operating system - the vendor didn't provide one - and
       | pull out a soldering iron and oscilloscope for debugging.
       | 
       | Meanwhile, many of them were just writing yet another COBOL
       | program for accounts processing.
        
       ___________________________________________________________________
       (page generated 2023-04-02 23:01 UTC)