[HN Gopher] The decline of computers as a general-purpose techno...
___________________________________________________________________
The decline of computers as a general-purpose technology (2021)
Author : yeesian
Score : 227 points
Date : 2023-10-21 21:24 UTC (1 day ago)
(HTM) web link (cacm.acm.org)
(TXT) w3m dump (cacm.acm.org)
| waterheater wrote:
| This fragmentation was predicted well over a decade ago, but
| we're finally seeing mass market realization of these
| predictions.
|
| The most relevant idea here is Koomey's Law [1], which postulates
| that the "number of computations per joule of energy dissipated
| doubled about every 1.57 years." This law is essentially a
| combination of Moore's Law and Dennard scaling. As transistors
| get smaller, Moore's Law predicts increases in the number of
| transistors on a chip, while Dennard scaling predicts decreasing
| power usage per transistor. For many years, Moore's Law really
| just tracked with Dennard scaling; CPU on-die areas weren't
| getting substantially larger (to my knowledge), but transistors
| were getting smaller and more efficient, so more could fit in a
| given area.
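|
| A back-of-the-envelope sketch of what that doubling rate implies
| (my own illustration, not from the article):
|
|     # Koomey's law: computations per joule double every ~1.57 years
|     def koomey_gain(years, doubling_period=1.57):
|         return 2 ** (years / doubling_period)
|
|     print(koomey_gain(10))  # ~83x efficiency gain over a decade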
|
| However, Dennard scaling also says that the power density of a
| transistor remains constant. As transistors became smaller, more
| of the die became occupied by transistors, causing more heat
| dissipation per unit area. After exhausting cooling options
| (think back to the TDP of those CPUs back in the late-90s/early
| 2000s), Dennard scaling died in the mid-to-late 2000s because of
| thermal issues.
|
| However, just because heat can't be dissipated doesn't mean CPU
| manufacturers won't try to cram more and more transistors in a
| given area. As a result, a modern CPU will not use all its
| transistors at once because it would overheat, creating "dark
| silicon". Once in that paradigm, the next logical design approach
| was prioritizing application-specific transistor clusters, the
| extent of which we're now seeing.
|
| While I generally agree with the authors, the key in these
| circumstances is to find ways to combine the fragments and make a
| better whole. Perhaps we'll have a "brain board" attached to the
| motherboard, where sockets on the brain board allow user-specific
| processor cores to be installed. Not everyone needs an i9 or even
| an i7, and maybe not everyone needs advanced ML performance.
|
| [1] https://en.wikipedia.org/wiki/Koomey%27s_law
| Qwertious wrote:
| The dark silicon isn't necessarily application-specific, it's
| also used by duplicating a particular circuit and then rotating
| which duplicate does the computing, so that the other areas
| have time to cool down a bit before they're used again.
| ZoomerCretin wrote:
| > However, Dennard scaling also says that the power density of
| a transistor remains constant. As transistors became smaller,
| more of the die became occupied by transistors, causing more
| heat dissipation per unit area. After exhausting cooling
| options (think back to the TDP of those CPUs back in the
| late-90s/early 2000s), Dennard scaling died in the mid-to-late
| 2000s because of thermal issues.
|
| Dennard scaling says that power usage per transistor per clock
| cycle is constant, not power density. As transistors became
| smaller, an x% reduction in transistor area meant an x%
| reduction in power per transistor per clock cycle, which
| allowed clock frequencies to increase dramatically to consume the
| same fixed heat dissipation budget.
|
| The breakdown was not from there being too many transistors per
| square millimeter, but from the breakdown of the direct
| relationship of transistor size to power consumption per clock
| cycle due to current leakage. An x% reduction in transistor
| area no longer causes an x% reduction in power consumption per
| clock cycle, meaning operating clock frequency is now limited by
| your ability to dissipate heat, not by your ability to shrink
| transistors.
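|
| For reference, the textbook dynamic-power relation (an idealized
| sketch, not a model of real silicon): P = C * V^2 * f. Shrinking
| dimensions by a factor s < 1 ideally scales C and V by s, and f
| by 1/s:
|
|     def dennard_power(C, V, f, s):
|         """Dynamic power per transistor after an ideal shrink by s."""
|         return (s * C) * (s * V) ** 2 * (f / s)  # = s**2 * C*V**2*f
|
| Power per transistor falls as s^2, matching the s^2 drop in area,
| so W/mm^2 stays flat; once leakage keeps V from scaling, that
| cancellation is gone.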
| waterheater wrote:
| Good point. Leakage current is the main issue these days, but
| that problem is only indirectly linked to power density.
|
| I've explained it your way in the past, but the Wikipedia
| article for Dennard scaling [1] says that power density for a
| transistor remains constant with time, which is true given that
| the underlying transistor technology has constant resistive
| losses. My understanding is that transistors using GaN have
| higher power density than "traditional" transistors, which means
| Dennard scaling certainly breaks down.
|
| [1] https://en.wikipedia.org/wiki/Dennard_scaling
| ethbr1 wrote:
| I understand this is ACM, but I would say the decline has
| strictly been a failure of _software_.
|
| Specifically, corporate IT's failure to deliver general purpose
| development tools usable by anyone in a company to make computers
| do work for them (aka programming).
|
| There's still a ton of general _value_ that could be delivered by
| general-purpose processors/computers.
|
| But instead, most non-programmers live in a Kafkaesque reality
| where the only programs they have access to don't do what they
| want, and so their work becomes "working around the bad program."
|
| I helped a lady on Friday at a midsized bank whose job it was to
| take a set of dates in an Excel file, search for each one
| individually in an internal search tool, then Ctrl+F the resulting
| report and verify that every code listed in the original document
| exists in the report.
|
| For 100+ entries.
|
| Every month.
|
| In 2023.
|
| What the fuck are we doing as a society, that this lady, with a
| machine that could compute the above in a fraction of a second,
| has to spend a decent chunk of her work time doing this?
|
| Example specifically chosen because I've seen innumerable similar
| "internal development will never work on this, because it's too
| small of an impact" issues at every company.
|
| Computers should work for us. _All_ of us. Not the other way
| around.
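|
| For concreteness, the whole job above is a membership check. A
| minimal sketch (hypothetical file names and a plain-text report
| dump; pandas assumed):
|
|     import pandas as pd
|
|     # Hypothetical exports from the two systems
|     codes = pd.read_excel("monthly_codes.xlsx")["code"].astype(str)
|     report = open("search_report.txt").read()
|
|     missing = [c for c in codes if c not in report]
|     print("all present" if not missing else f"missing: {missing}")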
| glandium wrote:
| Kind of a counterpoint: I've heard countless stories (albeit,
| in Japan) of people doing things like "manually" doing
| additions of numbers they already have in a spreadsheet instead
| of just asking Excel to do the addition in what, two clicks,
| because they "can't trust the computer to do it right" which
| sounds like a BS rationalization to justify that they wouldn't
| have much of a job if the computer was doing it.
| cratermoon wrote:
| Perhaps some folks here are too young to remember the Pentium
| FDIV bug[1], but it's not complete BS to distrust the
| calculations.
|
| [1] https://en.wikipedia.org/wiki/Pentium_FDIV_bug
| bee_rider wrote:
| I'm sure FDIV-like bugs are absolutely eclipsed by typos
| and misclicks from doing things manually.
| TeMPOraL wrote:
| Yeah, but typos and misclicks don't scale. What was that
| saying? "To err is human, but to really foul things up
| you need a computer."
| RajT88 wrote:
| Back in 2009, I visited Tokyo for work. What I learned from
| the folks in my local office is the deployment software my
| company made tooling for was not very popular.
|
| The reasons were mainly that most of that software was made
| by Western (English speaking, specifically) countries and
| they did not trust it. They were not exactly wrong - software
| tends to work best on English systems, unless it was
| developed by a local company. I heard of some really
| embarrassing Korean translation issues; even with Google as my
| only Korean language skill, I could confirm them. Like Korean 101
| kinda mistakes.
|
| So Japanese companies were using vastly inferior deployment
| software, which would basically RDP into an interactive
| session and replay inputs, because that software worked fine
| in Japanese.
|
| ...Or also very common was sneakernet deployment with DVDs
| and thumb drives, making the lowest-level workers do the running
| around.
|
| *I do not know the current state of software deployment in
| the Japanese market.
| MichaelZuo wrote:
| If I was a busy decision maker at a major Japanese firm
| with a dozen direct reports and a hundred balls in the air
| simultaneously, and if I could even spot translation issues
| in a few minutes on the polished demo presented to upper
| management, then there's no way I would ever assume
| anything is correct with the software when I'm not looking.
|
| Let alone my subordinate's subordinate's subordinate who
| would actually be using the software day in day out with
| their job on the line.
|
| So it's a very sensible heuristic. If it's supposed to be a
| serious enterprise software product, offered for sale in
| Japan, then spending a few million dollars to hire expert
| technical translators to double check everything should not
| be an issue.
| SoftTalker wrote:
| I mean not exactly the same thing but just last night Apple
| Maps was directing me to turn down a street that had clearly
| been closed for some time. Computers "make mistakes" all the
| time in most peoples' experience and after you get burned a
| few times (especially if you get badly burned, like getting
| fired over the "mistake") you learn to not blindly trust
| computers.
| JohnFen wrote:
| Being a developer for a number of decades has firmly taught
| me to never blindly trust computers. Even if the software
| is flawless (which is literally never the case), there's
| still the GIGO problem which your Apple Maps example
| demonstrates.
| genewitch wrote:
| I once got fired because I upgraded a company's wifi
| network from a single apple wifi bridge (the residential
| one!) to a managed linksys or netgear deployment, and there
| was a bug in the firmware. I don't recall the bug, but it
| made the chief lawyer's apple laptop drop connectivity to
| their printer or something a few times the first day. I
| opened a support ticket, and got confirmation of the bug -
| at around 8PM, forwarded the email to my boss, and was
| fired the next afternoon.
|
| what a crap company. They folded within 4 months IIRC, and
| listed me as the "financial contact" for the fiber optic
| internet links - got a bunch of nasty phone calls later in
| the year about needing to pay 5 digits in billing arrears!
| gavinhoward wrote:
| IIRC, Excel uses floating point for calculations, so they're
| not wrong.
| ben_w wrote:
| It also uses them for phone numbers, if you're not careful.
|
| I have, in real life, received an SMS from... let's see...
| 4.4786e+11 welcoming me to Germany and giving me a link to
| the German COVID quarantine and testing rules (I was
| already in Germany at the time, I just turned off Airplane
| mode for the first time in 6 months).
| matheusmoreira wrote:
| Anyone who trusts computers to do non-integer math correctly
| does not know anything about floating point numbers.
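|
| The classic demonstration (0.1 has no exact binary
| representation):
|
|     >>> 0.1 + 0.2 == 0.3
|     False
|     >>> 0.1 + 0.2
|     0.30000000000000004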
| fomine3 wrote:
| I trust, but they compute in a different form of numbers
| MrLeap wrote:
| A large amount of inefficiency is actually a chain of
| accountability. Trust, and failing that, assigning blame, and
| trusting those assignments, is the root of so. much.
| dTal wrote:
| While this is true and arguably of more importance, the thesis
| of the article was something quite specific: "the economic
| cycle that has led to the usage of a common computing platform,
| underpinned by rapidly improving universal processors, is
| giving way to a fragmentary cycle, where economics push users
| toward divergent computing platforms driven by special purpose
| processors". This is likely to exacerbate the issue you
| describe, but it is not caused by it.
|
| I don't think your observation was any _less_ true 20 years ago
| either. There hasn't been a "decline", as such - more of a
| failure to realize potential. Office Space (1999) was full of
| people doing mindless busywork that could easily be automated.
| ethbr1 wrote:
| From a hardware perspective, the ACM cycle discussed is
| tightly tied to continued general purpose CPU performance
| gains.
|
| There are many pure-performance classes of software, where
| more performance = more value. Those classes have been
| diverging since the 80s (media playback), 90s (graphics), 00s
| (mobile), and ~10s (gpgpu).
|
| But there are other classes that are functionality limited.
| E.g. electronic medical record or enterprise resource
| planning.
|
| If software functionality were more plastic or expanded in
| those, the same general purpose performance would then be
| more valuable, and investment would also be incentivized.
|
| Accepting the inevitability of divergent, harder to user-
| program platforms, when we still have a lot of value on the
| table feels premature.
|
| And, like, it bodes badly for further losses of user computing
| sovereignty as hyper-optimized hardware makes cloud platforms
| more attractive and kills off user-owned processing.
| karmakaze wrote:
| This sounds like a perfect use-case for automation using
| recorded macros. The failure (or perhaps solutions exist) is the
| lack of such facilities in our desktop OSes.
| glandium wrote:
| AFAIK, macOS is the only OS to come with this sort of thing
| out of the box (and that is supposed to work consistently
| across all applications; good luck with that on Linux).
|
| https://en.m.wikipedia.org/wiki/Automator_(macOS)
|
| https://en.m.wikipedia.org/wiki/AppleScript
| karmakaze wrote:
| That is what I had in mind as I wrote it. Sad that many
| macOS apps now are built with Electron, or we simply use
| web browsers to access public or internal apps.
| depressedpanda wrote:
| Using web browsers to access apps is actually a boon in
| this regard: the whole UI is exposed via the DOM, and you
| can use JavaScript to interact with it for macros -- and
| it's cross platform as it works on any OS that can run a
| web browser!
|
| In contrast Automator/AppleScript only works on macOS
| afaict.
| skydhash wrote:
| But then you've got the ugliest DOM. I left Facebook because
| I was trying to curate my friends list (2k+) down to only
| the people I know in the real world. But the Unfriend
| process was a dark pattern and automating it was more
| work than it was worth. Instagram was worse.
| dr_kiszonka wrote:
| There is a free Power Automate Desktop for Windows.
| salawat wrote:
| Macros are generally locked down in enterprise network
| environments due to the risk of acting as a malware vector.
| bee_rider wrote:
| It would be nice if there was a general expectation by now that
| people would have a sort of "intro to scripting" type class in
| high school. Not everyone has to be a programmer, in the same
| way not everyone needs to be an author, but everybody ought to
| be able to write a memo, email, or shopping list.
| asdff wrote:
| Take it a step further and teach actual data analysis and
| statistics that are not coin flip/compound interest plug and
| chug into the calculator problems. No matter what sort of job
| you do in life, being able to pull a csv from all your
| different credit or debit accounts and draw some inferences
| from that would be so useful.
| eru wrote:
| Just because something fancy (or even basic) is on the
| curriculum doesn't mean that anyone is learning anything.
|
| No matter how well meaning, high school barely teaches most
| people anything in practice.
| bee_rider wrote:
| I'm not sure what "actual data analysis" is exactly, but it
| sounds like something that would be better handled in a
| stats or science class.
|
| Though, if programming were taught as a basic problem
| solving tool like writing and arithmetic, just stuck in the
| toolbox, a science class would probably benefit from having
| it there. For example, if it was expected that a kid could
| do programming as well as they can write a lab report, then
| it would probably be easy enough to add some sort of "call
| Numpy to do a linear regression" type section at the end of
| every lab report.
|
| This sort of programming is an everyday force multiplier,
| and it ought to be taught as such.
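|
| The whole section could be a two-liner (a sketch; made-up
| measurements):
|
|     import numpy as np
|
|     x = np.array([0.0, 1.0, 2.0, 3.0])      # e.g. time (s)
|     y = np.array([0.1, 2.1, 3.9, 6.2])      # e.g. distance (m)
|     slope, intercept = np.polyfit(x, y, 1)  # least-squares line fit
|     print(f"y = {slope:.2f}x + {intercept:.2f}")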
| rottencupcakes wrote:
| What I learnt from the emergence of ChatGPT was that way more
| people were allergic to writing even a small note than I
| thought.
| eru wrote:
| Look at all the other classes people are already sitting
| through for high school and not learning anything either.
| What makes you think this one would be any different?
|
| Already around the world high schools all have something
| highfalutin like 'teaching critical thinking' in their
| curriculum.
| TeMPOraL wrote:
| > _What makes you think this one would be any different?_
|
| Because unlike almost all other classes, that one would be
| _immediately beneficial in daily life_, in a direct and
| apparent way.
| smokel wrote:
| No it won't, unless we build infrastructure for them to
| work with.
|
| Even worse, given modern security standards, I wouldn't
| be surprised if programming and scripting were to be
| prohibited in the next ten years or so.
| bee_rider wrote:
| It seems like a negative feedback loop or something:
| nobody uses these sorts of features, so nobody implements
| them, and then the skills are less useful...
| eru wrote:
| Microsoft Excel (or cloud-based spreadsheets) will always
| be allowed. Even if they remove scripting abilities, a
| clever person can remove a lot of office busy work with
| just basic Excel functionalities like pivot tables.
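|
| The pandas equivalent of that pivot-table trick, for flavor
| (made-up data):
|
|     import pandas as pd
|
|     df = pd.DataFrame({"region": ["N", "S", "N", "S"],
|                        "month":  ["Jan", "Jan", "Feb", "Feb"],
|                        "sales":  [10, 20, 30, 40]})
|     print(df.pivot_table(index="region", columns="month",
|                          values="sales", aggfunc="sum"))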
| eru wrote:
| Basic scripting abilities would be beneficial years later
| for the subset of the kids who become office drones.
|
| I would agree with your assessment about 'immediately
| beneficial in daily life', if you were talking about a
| course offered to adults who are already working a white
| collar job.
| bee_rider wrote:
| Literacy rates in the US are pretty high I think, compared
| to where we were before k-12 schooling became standard.
|
| And most people can do basic arithmetic, right?
|
| The goal, I think, to start should be to get to the point
| where an office of 10 or so people has at least one or two
| who can string together basic scripts.
| eru wrote:
| > Literacy rates in the US are pretty high I think,
| compared to where we were before k-12 schooling became
| standard.
|
| 'Post hoc ergo propter hoc.'
| https://en.wikipedia.org/wiki/Post_hoc_ergo_propter_hoc
|
| By and large, kids from well-off parents were always more
| literate, and they still are. And the US is a lot richer
| now than it used to be.
| BeFlatXIII wrote:
| Add it to reach the average-smart kids, those without
| internal knowledge of where to look or internal motivation
| to learn (but who will still take the advanced classes for a
| grade).
| oytis wrote:
| Teaching people a scripting language when the language has
| already been chosen for them, the platform has been prepared, the
| data preformatted, etc. might be easy - though a non-trivial share
| of my classmates couldn't comprehend a for loop.
|
| To have a real transferable programming skill - I am not
| talking about one that will get you a software engineering
| job, just basic problem solving - you need to know an awful
| lot about how computers work. Otherwise you'll quickly get
| overwhelmed by errors you can't comprehend at all stages,
| not being able to get data in the right format, not knowing
| what tools are suitable for what problems, etc.
| bee_rider wrote:
| It might be the case that a better language for this sort
| of thing would have to be designed. Maybe somebody needs to
| take another crack at the sort of Visual Basic/Excel type
| environment.
|
| I think someone on this site chimes in occasionally with a
| sort of "spreadsheet, but each cell runs a line of Python"
| type program they've been working on (although I can't
| remember the name unfortunately). That could be a good
| basis...
| genewitch wrote:
| VB (6?) may have been copy-and-paste back in the day, but
| with the advent of and continuing "work" on .NET, VB is just
| as impenetrable as all the others now.
| eru wrote:
| Spreadsheets might already be that better environment.
| You can solve a load of problems with some smarts and
| just spreadsheets (no Visual Basic or so necessary).
|
| But that doesn't mean _teaching_ people how to solve
| problems is any easier.
| opportune wrote:
| But surely she had the tools to do that in Excel itself, no?
| This is something I see a lot in software, where people devalue
| and trivialize the skills and abilities needed to make software
| simply because they know something is possible to do in
| software and because, as a software worker, they most likely
| had a natural predilection towards the kind of thought
| processes that led them toward pursuing software work.
|
| I think it's fallacious to think everybody has the natural
| disposition and curiosity to want to write software. Writing
| has been around forever, and most people don't want to do it
| beyond purely functional communication, and even those who do
| want to go beyond that are mostly terrible at it. That lady
| could have automated her job if she really wanted to and was
| motivated enough to do it; she simply preferred to be a ditch
| digger rather than design a ditch-digging machine.
|
| It doesn't help that "designing ditch digging machines" is in
| such high demand that most companies are really only able to
| hire mediocre people to fill those roles. Leonardo da Vinci is
| not working on SOX compliance at a fucking bank.
|
| There's a good chance that the payoff for that bank automating
| "take a set of dates in an Excel file, search each one
| individually into an internal search tool, then Ctrl+f the
| result report and verify that every code listed in the original
| document exists in the report" is not even there, because the
| first attempt will have insufficient monitoring or
| verification to catch silent failures, and will introduce an
| expensive issue caught months after the fact that leads to a
| huge remediation scramble and requires more development than
| initially expected to address, not to mention the constant need
| for maintenance in updating the software and making sure it
| continues to run (which also involves a rotating cast of
| programmers joining, needing to learn the software, and
| leaving). Leonardo might do it right the first time, but he's
| not doing it for cheap and not going to make it his life's
| work. Joe Schmoe can do it cheaper but it's going to take a
| while, have an ongoing maintenance burden, and probably some
| mistakes made along the way. That lady can do it for even
| cheaper than Joe Schmoe can without many headaches.
| conception wrote:
| You see this in manufacturing as well. You'd think robots
| would be everywhere but often, like way more than you'd
| imagine, it's cheaper to hire a human to do it.
| tuatoru wrote:
| The way fertility rates are going, not for much longer.
| opportune wrote:
| Yeah, it's easy to forget that humans themselves are pretty
| amazing technology. You can program them with relatively
| vague instructions using natural language, they come with
| built-in general intelligence so you can delegate things
| for them to figure out and expect them to adapt their
| instructions to their situation, and they come
| with amazingly precise and dextrous arms. They generally
| cost under $150k/unit/year for most manual applications and
| you pretty much know what you're getting up front vs
| hardware/software purchased off the shelf, not to mention
| the capital cost of acquiring one is pretty low because
| they already exist and just need to be hired.
| asdff wrote:
| I think part of that is how these sorts of businesses
| choose to organize their money. A manufacturer might budget
| themselves to be quite lean, so lean that for any sort of
| improvement like automation with robots, they have to hire
| an expensive consultant to tell them they have to hire an
| expensive robot company to build them expensive robots with
| expensive service contracts. So of course in that model the
| human is cheaper. However, if the manufacturer instead took
| all that money they would have spent on the expensive
| consultant, and opened up their own "internal" automating
| wing that would just look for things within the company to
| replace with a robot they create and service internally,
| maybe the math would pencil out another way.
| JohnFen wrote:
| There's another aspect to it as well.
|
| It's really common to see jobs that people are doing that
| appear to be easy to replace with robots. But if you
| actually try to develop automation to do it, you very
| commonly hit a bunch of gotchas that make the automation
| infeasible. Sometimes this is a systemic thing (you can't
| effectively automate this without having to change too
| much of the rest of the process at the same time, which
| makes it economically a nonstarter), but more often than
| you'd think it's because there's an aspect of the job
| which doesn't seem particularly hard or important because
| humans do it without even thinking about it -- but to
| computers and robots, it's incredibly difficult.
| smokel wrote:
| This has to do with the fact that robots are still pretty
| bad at basic tasks such as picking items from a container,
| or handling fabric. Just have a look at the state of the
| art in garment manufacturing robots. For most tasks, human
| muscle and sense of touch are way ahead of any robot.
| eichin wrote:
| Thus, over time, most people working on commercial robots
| favor increasing the minimum wage :-)
| makeitdouble wrote:
| There are many situations where manually dealing with 100+
| records is a decent tradeoff. I feel people in our field
| underestimate the actual complexity of many dumb-looking tasks.
|
| From your bank employee example, it looks like your solution
| would be to open the bank's internal tool to her Excel
| instance, have it somehow find the right records and inject
| the right values, while checking the listed codes. That looks
| like a series of APIs if they care about data exposure, or huge
| CSV exports that they would then need to track?
|
| And as it seems to be important enough to warrant verification,
| it means your program is not some random batch; it needs
| significant validation.
|
| Then picture the internal tool changing its output format.
| Reports getting categorized differently because of
| organizational change. New fields being added to follow new
| legislation. The tool could break at any of these changes, and
| the lady's left with a broken system: a lot of money has
| been invested, it's supposed to have solved her problem, she
| might not have the same access to data she had before the
| integration, and overall the situation could be worse than
| before.
|
| That feels like a bleak scenario, but it happens all the time
| as well.
| jiggawatts wrote:
| It's basically a JOIN, the most fundamental data manipulation
| operator. If you can't automate this, then you don't "get"
| computers at all.
|
| I once had two guys arguing next to me about how it was
| impossible to produce a simple report about VM storage
| utilisation _in under a month_. I produced the report,
| printed it out, and shoved it in their hands just to shut
| them up while they were still busy arguing. A "month" of work
| required five joins.
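|
| The join itself, sketched in pandas (invented frame and column
| names):
|
|     import pandas as pd
|
|     # Stand-ins for the two systems' exports
|     original = pd.DataFrame({"code": ["A1", "B2", "C3"]})
|     report = pd.DataFrame({"code": ["A1", "C3"]})
|
|     # Left join with indicator: "left_only" rows failed the check
|     m = original.merge(report, on="code", how="left", indicator=True)
|     print(m.loc[m["_merge"] == "left_only", "code"])  # -> B2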
| johnmaguire wrote:
| > It's basically a JOIN
|
| The comment you're replying to just explained why this is not
| "basically a JOIN." There are two unrelated systems (not
| SQL databases) which don't expose any APIs to the analyst.
| tuatoru wrote:
| It's still basically a join. If it's on a computer, it
| can be converted to text files, and then it can be
| JOINed, one way or another.
| ethbr1 wrote:
| My sibling from another mother. That's exactly what we
| did.
| csomar wrote:
| In these cases, the work is usually about connecting the
| systems together than the data modification itself.
| TeMPOraL wrote:
| No, that's just made up work that's not required to solve
| a problem. Systems are already connected enough by virtue
| of being accessible by the same person on the same
| computer at the same time. Turning this into a software
| project and building Proper Integration only creates
| a tight coupling between the systems, which will
| keep creating bugs and require ongoing maintenance.
| ethbr1 wrote:
| We were able to fix it up with some automation in a couple
| days.
|
| Critically, still running visually on her machine, and with
| internal resources for lifecycle support if the process
| changes.
| johnmaguire wrote:
| Finding internal resources to fix the tool when things
| change is very prescient.
|
| Growing up, my mom performed a job where she would
| essentially copy car crash data from one system into
| another, row by row. As a teenager learning to program, it
| was obvious to me this could be easily automated. She
| (perhaps rightfully) didn't want to run anything I cooked
| up on her company's computer!
|
| A year or so later she was laid off along with many others
| in the company performing similar roles.
|
| And this is the problem, isn't it? Once the job is
| automated, the analyst is no longer necessary. You really
| need to "learn to fish."
| troupe wrote:
| Do you have any guess how long it will take to break even?
| Presumably you charged some amount to spend a few days
| automating it and hopefully it frees her up to do something
| more valuable for the company.
|
| Sometimes the reason things aren't automated is because
| there isn't anything more valuable for that person to do so
| automation doesn't have a positive return on investment.
| ethbr1 wrote:
| It was an internal citizen developer initiative that I
| was supporting externally. Don't know her salary, so
| couldn't say.
|
| > _because there isn't anything more valuable for that
| person to do_
|
| I've seen an interesting shift post-COVID where
| (remaining) ops workers are often more coordinators of /
| responsible for their area of responsibility.
|
| Which is a distinction because in that sort of job, not
| having Task X leads directly into finding something else
| you can be doing to improve your area.
| concordDance wrote:
| Can't speak for them, but I've done office automation
| work with a breakeven measured in days.
| criddell wrote:
| I've automated things where the breakeven will occur
| decades from now. It was fun though, the chance of errors
| is decreased, and I learned a bunch of new stuff.
|
| Sometimes it's about the journey and not the destination.
| ethbr1 wrote:
| Calculating breakeven ROI is part of the push from a lot
| of companies to start citizen developer initiatives.
|
| The point being that the "I" is also flexible -- an
| existing employee fooling around to learn automation
| whenever they have downtime is _much_ cheaper than hiring
| a consultant.
|
| Sure, the automation is worse and delivered more slowly.
| But you can get automations from a lot of people in
| parallel, plus you're upskilling your workforce.
| makeitdouble wrote:
| > Critically, still running visually on her machine, and
| with internal resources for lifecycle support if the
| process changes.
|
| That's such an elegant and perfect solution. She keeps
| track of what's happening, it all runs through her GUI so
| worst* case scenario she does it herself again, and she can
| get help if/when she needs it.
|
| * actual worst case would be the script going berserk, but
| as she's not the one who wrote it, she should be shielded
| from the blame if that ever happens
| bloak wrote:
| I think you're right. I've sometimes been in the situation of
| having to maintain a script that depends on APIs or data
| formats that change from time to time without warning. A fair
| amount of skill and effort is required. (There is skill
| involved in writing the script so that it is maintainable and
| not too fragile but still detects when something has gone
| wrong.) If the script is doing a task that has to be
| completed on a particular day every month, rather than
| whenever the script hacker is next available, then you'd have
| to choose between paying the hacker a reasonable retainer or
| having someone around who you know can still do the task
| manually, which means you might have to get them to do the
| task manually from time to time to stay in practice.
|
| A dilemma I keep facing in practice is converting a text that
| someone else has produced in their own special way, like the
| ghastly cluttered HTML from a word processor. For example, do
| I attempt to use a Perl script to automatically convert some
| of the mark-up, or do I throw away all the mark-up and put
| back the italics by hand, with the risk of making a mistake?
| (If anyone knows of a good _configurable or modifiable_ tool
| for converting cluttered HTML into sane HTML/XML, I'd be
| interested to hear about it. My Perl scripts for doing it are
| not great.)
| danielbln wrote:
| Sounds like a good use case for an LLM.
| ben_w wrote:
| Perhaps, but while LLMs may be slightly better (or
| slightly worse) than average humans, 20 instances of
| the same LLM will have correlated errors whereas 20
| humans will be... well, not _entirely_ [0]
| uncorrelated, but _less_ correlated, in their errors.
|
| [0] We'll still sometimes get an entire office of
| telephone help-desk people who are all convinced that
| $1/megabyte is the same as $0.01/megabyte, or whatever
| that story was a few decades back (I can't Google it any
| more to find it, there was a recording of the phone
| call).
| genewitch wrote:
| it was 0.01¢ and the rep / software was billing it as
| $0.01 - saying 0.01 of a penny is $0.01.
|
| I may be off by a factor of 10.
| ethbr1 wrote:
| Over the years, I've internalized a few rules to soften
| what you point out.
|
| * Understand both the cost of making a mistake and of the
| process not running. Be aggressive on things that are 'nice
| to have' or have downstream manual re-validation. Be
| conservative on things that are 'must have' and immediately
| affect an outcome.
|
| * First, do no harm -- program defensively and break
| execution on _any_ violated expectations
|
| * Understandable error messages at the user level are gold.
| The user has no idea what "schema mismatch" means. They can
| understand "Column 3 had unexpected value. Was expecting
| 'X'. Saw: 'Y'"
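|
| The last rule, as a sketch (invented names; the point is the
| message, not the mechanism):
|
|     def expect(row, col, expected, saw):
|         if saw != expected:
|             raise SystemExit(f"Row {row}, column {col}: was "
|                              f"expecting {expected!r}, saw {saw!r}. "
|                              f"Stopping; nothing was written.")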
|
| --
|
| On ugly HTML, that's a bad one, because of how ugly it can
| get.
|
| I've only done toy work with it, but I hear good things
| about Beautiful Soup (Python):
| https://beautiful-soup-4.readthedocs.io/en/latest/
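|
| A minimal sketch of the strip-everything-but-italics idea with
| it (the tag whitelist is illustrative):
|
|     from bs4 import BeautifulSoup
|
|     cluttered = "<div style='a'><span class='b'><i>hi</i> x</span></div>"
|     KEEP = {"i", "em", "b", "strong", "p"}  # illustrative whitelist
|
|     soup = BeautifulSoup(cluttered, "html.parser")
|     for tag in soup.find_all(True):
|         tag.attrs = {}              # drop style/class clutter
|         if tag.name not in KEEP:
|             tag.unwrap()            # keep the text, lose the tag
|     print(soup)                     # -> <i>hi</i> x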
| userbinator wrote:
| _What the fuck are we doing as a society, that this lady, with
| a machine that could compute the above in a fraction of a
| second, has to spend a decent chunk of her work time doing
| this?_
|
| To put it bluntly: do you want her to be unemployed instead?
| tuatoru wrote:
| This is the "lump of labor" fallacy.
|
| If she turns up to work reliably, and stays on-task, there
| are no end of other things she could be doing that increase
| general welfare more than this.
| johnmaguire wrote:
| Maybe, but I'm not sure everyone has the inclination to
| become a programmer.
|
| And the whole thread was about how computers should work
| for everyone, including this lady whose job could be
| automated.
|
| Which begs the question: do most people actually want
| general purpose computing? Or does the average human prefer
| "apps"?
| tuatoru wrote:
| Who said anything about her becoming a programmer?
|
| An interior decorator, or a gardener, or a nurse aide, or
| a yoga instructor, or a nail technician: they all add
| more to human welfare than this task.
|
| If she wants to become an architect, or a water system
| engineer, more power to her!
| rrr_oh_man wrote:
| Not disagreeing with your point in general, but:
|
| Not everyone can become an architect or water systems
| engineer at 50, after having worked a "general assistant"
| type office job for many years.
|
| I think that (and its consequences) might be the biggest
| short term societal risk of automation in an aging
| society.
|
| How would you solve this problem?
| tuatoru wrote:
| If there are resources available, some entrepreneur will
| figure out a way to make use of them.
| asdff wrote:
| Why can't you become these things at 50? Considering you
| can become them at 23, having worked perhaps in fast food
| for a few years prior?
| auggierose wrote:
| Good question! Wait until you are 50, maybe you can
| answer it then ;-)
| JohnFen wrote:
| I (and most of my social circle) am over 50, and I can
| answer that question: there's no solid reason you can't
| train for and be successful in a different career later
| in life. I've seen it happen too often to think
| otherwise.
|
| Whether or not you _want_ to is an entirely different
| question, of course.
| auggierose wrote:
| Not there yet, but slowly getting there. Of course you
| can train for a different career after you are 50, but
| you will also have a good idea what kind of career is not
| a good fit for you (anymore). So just because certain
| careers are now looking for people, doesn't mean that
| these are a good fit for you.
| JohnFen wrote:
| Yes indeed!
|
| Our youth is when the majority of what we do is try
| things out to see what fits and what doesn't. In our
| older years, most of that experimentation is behind us
| and we have a pretty solid idea of what fits us and what
| doesn't.
|
| The trick is that the amount of experimentation should
| never be reduced to zero.
| rrr_oh_man wrote:
| I appreciate that question!
|
| Something something about age and responsibilities and
| tiredness and not having trained for that, but... it's
| worth it to think about it, anyway -- at least as risk-
| mitigation for yourself, going forward.
|
| Maybe we'll all end up as plumbers?
| TeMPOraL wrote:
| Whatever you list here, she'll be competing for an
| _entry-level position_ with much younger people, who have
| more time, more energy, lower expenses and no
| obligations.
|
| The older one is, the more time one spent in one line of
| work before being forced to find something else to do,
| the harder it hits. So sure, maybe you _can_ switch to
| landscaping in your 50s, but that also means you and your
| family suddenly being kicked down one or two economic
| classes.
| ethbr1 wrote:
| In this scenario, she had plenty of other high-value work
| to perform if she never had to do this again.
|
| Which is the other side of the coin of the ubiquity of
| this problem -- modern businesses are so optimized on the
| people side that roles are collapsed down to 1-2 people,
| and therefore those people inevitably catch "oh, and also
| do this" could-be-automated work.
| asdff wrote:
| If the average human were allowed to achieve their most
| base preferences, we'd all be a half ton in weight,
| floating around in chairs, stupefied at some screen, just
| as depicted in _Wall-e_.
| JohnFen wrote:
| I actually don't think this is true at all. While there
| will always be a percentage of people who would prefer
| that, my observations are that most people don't. They
| tend to have interests that drive them to put time and
| effort into doing things, instead.
| NikolaNovak wrote:
| I was going to say that - I'm increasingly subscribing to
| "bullshit jobs" theory :
|
| * productivity for goods is high and availability is high (we
| can discuss inequality and distribution)
|
| * we just don't need everybody to work 40hrs a week to obtain
| a collectively good standard of living
|
| * but as a society we are uncomfortable with universal basic
| income
|
| * so instead we basically do UBI through inefficient,
| unnecessary jobs
|
| That's my current perspective at least. As I continue to grow
| older and slide into cynicism though, I'm likely to switch to
| "everybody is idiots and everything is screwed up" theory
| more and more :-D
| opportune wrote:
| Who's saying that lady's job is bullshit just because it
| can be automated? There is real value in paying some up
| front cost in detection/monitoring/validation of data to
| prevent mistakes. Just because software engineers would
| consider her job "beneath" them because they could do it
| faster or better does not make it not worth doing.
|
| There are mega-efficient factory farms in Nebraska that are
| immensely productive per-human involved, but that doesn't
| mean what hobby farmers in Connecticut or subsistence
| farmers in Africa are doing is bullshit.
|
| I think "bullshit jobs" are less of a thing than people
| believe. It's just that people devalue assembly-line or
| cog-in-machine work, even when they're the ones doing it -
| especially in the US where we grow up wanting to be a
| unique success story like an entertainment celebrity or
| rich entrepreneur and so that kind of work is antithetical
| to our idea of a successful person. Fact of the matter is,
| machines need cogs, and we don't have an infinite capacity
| to replace human cogs with automated ones.
| concordDance wrote:
| Does she find that task enjoyable and fulfilling? If not,
| it's bullshit.
|
| And I'd bet money on it NOT being enjoyable and
| fulfilling. Humans almost universally hate being cogs
| doing the same repetitive trivial action over and over.
| oldsecondhand wrote:
| Do janitors find their job enjoyable and fulfilling?
| Probably not. Their job is still important for the
| functioning of society, so I wouldn't call it bullshit.
| JohnFen wrote:
| I was a janitor for a long time in my younger years, and
| I actually did find it enjoyable and fulfilling. Just
| sayin'.
| hn_throwaway_99 wrote:
| > so instead we basically do UBI through inefficient,
| unnecessary jobs.
|
| Your first 3 bullet points make sense, but this last one is
| where I think the normal theory behind "bullshit jobs"
| really falls apart. Every individual business has a large,
| strong economic incentive to _not_ create these bullshit jobs
| if they don't need to, so why should we think they would
| be so generous to engage in this "bullshit jobs charity"?
|
| I think what is really happening is that as society
| advances, lots of rules, regulations and processes just
| build up and up over time, and this complexity eventually
| becomes self-sustaining, even if it greatly diverges from
| the original intent of the original rules.
|
| Case in point (which is a common case I know) is the
| complete, total insanity of the US healthcare system,
| specifically how healthcare is paid for. Literally every
| single person I know that has ever gotten into some aspect
| of healthcare payments (potentially with an idea to improve
| the madness) eventually comes to the conclusion "The whole
| system is completely fucked, it should all just be torn
| down and greatly simplified." The problem, though, is now
| there are a _ton_ of entrenched interests who depend on
| that complexity for their livelihood (I heard it referred
| to as "an abusive relationship" - Company A exists to
| "simplify some aspects of healthcare payments", which means
| they depend on the underlying complexity in the first place
| to exist), so there are no real incentives from the people
| that control the levers to simplify. So a lot of those
| bullshit jobs come about to manage that complexity, but
| it's not like some company thought "let's hire people to do
| busywork so they'll have employment."
| ludston wrote:
| Whilst business owners have a financial incentive not to
| employ people, there are competing incentives. For
| example, non-owners are incentivised to have more people
| reporting to them because it increases their status and
| status is usually correlated with salary. In fact, when
| you are rich enough that money doesn't mean anything any
| more, you may as well have a bunch of people employed to
| do nothing but make you feel important and powerful.
| spease wrote:
| > In fact, when you are rich enough that money doesn't
| mean anything any more, you may as well have a bunch of
| people employed to do nothing but make you feel important
| and powerful.
|
| I'd just like to proactively coin the term "Ego Engineer"
| for the post-AGI world.
| ludston wrote:
| Traditionally they'd be called "elevator operator" or
| "doorman" or some equally mundane/superfluous job.
| tempodox wrote:
| GP's last point isn't to be taken literally. It's just
| the short and snappy summary of what took you many words
| to describe.
| zweifuss wrote:
| Yes, that's how it is. For every reform that would
| benefit society as a whole, there is now a tiny minority
| who are certain to lose, with a deeply entrenched lobby against
| the new and for the old. Be it fossil fuels, health care,
| banking, peace in the Middle East, nuclear technology,
| the use of genetic engineering in plant breeding,
| electric vehicles, and so on.
|
| I don't think UBI would change that, but UBI might have a
| chance to change the perception of one's job as a
| bullshit job (they say that's 40% of the workforce).
| mvncleaninst wrote:
| > I think what is really happening is that as society
| advances, lots of rules, regulations and processes just
| build up and up over time, and this complexity eventually
| becomes self-sustaining, even if it greatly diverges from
| the original intent of the original rules.
|
| This is one theory, I think a slightly different
| explanation could be that most corporations are too large
| for the people making decisions regarding things like
| layoffs to be able to have a clear picture of what each
| employee is doing
|
| Also like a sibling comment said, there are also
| conflicting incentives like middle management engaging in
| empire building. Because of this, there isn't any
| vertical enforcement or clarity
|
| Really interesting how much better complexity scales in
| software than it does in business
| asdff wrote:
| She will not be unemployed in this situation, nor will her
| peers in similar situations. Available labor is always
| gobbled up and not left idle for long. Case in point: we've
| obviated jobs like the horseshoe maker and the stable boy and
| yet unemployment rate today at 3.8% or so is half of what it
| was in the 1900s when we had all these horse and stable jobs.
| yard2010 wrote:
| Jobs do not disappear, they just change
| ben_w wrote:
| They can be fully automated out of existence, e.g. the
| jobs of herb strewer and link-boy.
|
| And, indeed, that old job called "computer".
| ffgjgf1 wrote:
| You also have to take into account income, not just the
| unemployment rate. Growth has been close to non-existent,
| if not negative, over the last 40-50 years or so if you
| earn below the median.
| JohnFen wrote:
| In the long term, yes. But shouldn't we have some
| compassion for those who get screwed in the short term? The
| horseshoe makers who were thrown out of work at the time
| may not have been able to find decent new employment. We
| have copious modern-day examples of jobs that have been
| obsoleted and those who were immediately affected still
| haven't been able to recover.
|
| These sorts of labor shifts can't avoid harming those who
| get the short end of the stick.
|
| Saying this is not to say that such changes shouldn't
| happen. It's inevitable and necessary in order for society
| to adapt to changing conditions. But I very often see a
| disregard for the reality that people really do get hurt
| badly in these shifts.
| nonrandomstring wrote:
| > do you want her to be unemployed instead?
|
| Yes.
|
| Compassion and reason both repel me, as a humanist and
| computer scientist, from the thought of another human being
| pointlessly wasting her life for the mere "necessity" of
| money to live because broken software has become an excuse
| for this.
|
| It is an inhuman spectacle, and the notion of "employment"
| here is so weak as to have no force in the argument. An
| unemployed person at least has the opportunity to redefine
| their life, whereas one who is tricked daily into believing
| their bullshit job is "valuable" is being robbed, no matter
| the remuneration.
|
| Further, as a technical person, it disgusts me to see the
| tools we created so comprehensively misused by idiots who are
| unable to intelligently deploy them - or rather deploy them
| as instruments of abuse rather than for human development and
| progress.
| spease wrote:
| > To put it bluntly: do you want her to be unemployed
| instead?
|
| Fine. Then write her a script and continue to pay her a
| salary. Hell, let her keep coming into work if she wants to.
| Functionally, what's the difference to the business?
|
| Presumably she's still going to want to maintain social
| relationships in her life, which means she's got an intrinsic
| incentive to do things to benefit someone else even absent
| financial incentive.
|
| But now we've effectively got an atomic form of UBI, which
| scares people. Yet 1 person's worth of work is being done for
| the cost of 1 salary.
|
| In fact, if she starts doing something else productive to
| society with her time, you've got 2 people's worth of work
| being done for the cost of 1 salary.
|
| And if someone automates that and she switches to something
| else, it increases to a 3:1 ratio.
|
| Where is her employment status a problem, except to the
| sensibilities of people who believe that everyone needs to be
| "employed"?
|
| If no one ever automated what she was doing for fear of her
| losing her livelihood, then the maximum productivity we'd
| ever expect to see is 1:1.
|
| Seems like _not_ having UBI or a stronger safety net is
| creating a perverse incentive for economic inefficiency.
| matheusmoreira wrote:
| > continue to pay her a salary
|
| Why would any company do that?
| solatic wrote:
| Yes. A human mind is a terrible thing to waste. Plenty of
| people don't look for more fulfilling work because what they
| already have is "good enough" and it brings home a paycheck
| that supports the people who depend on them. Oftentimes,
| people need a push, and doing so can be an act of compassion.
| Laying people off is not some death sentence; it's a
| disruptive event that well-adjusted people will recover from.
| That's ultimately their private responsibility, not the
| company's, certainly not the company's in a day and age where
| few people work their entire career for the same employer
| anymore.
|
| Furthermore, a society where there is massive unemployment
| but society's economic engines are humming along, mostly
| automated, for the benefit of a select private few, is a
| society where such engines are ripe for nationalization,
| whether that be through public ownership or public seizure of
| the majority of the profits via taxation, either one of which
| could support basic income. So again, not necessarily a bad
| thing in the end.
|
| And if you think people blowing their basic income checks on
| panem et circenses represents some kind of civic death, I
| would point to the current alternative, people working
| bullshit jobs of questionable value, and ask if that _really_
| represents the civic virtue you're trying to uphold.
| JohnFen wrote:
| > Oftentimes, people need a push, and doing so can be an
| act of compassion.
|
| Only if those people have asked for the push. If they
| haven't, then it's not at all an act of compassion.
| ThrowAway1922A wrote:
| > it's a disruptive event that well-adjusted people will
| recover from.
|
| Some of us are not well adjusted. Some of us are barely
| hanging on, barely able to manage the demands of our lives
| and our society.
|
| Being laid off at this point for me would be worse than
| disruptive it would be an outright disaster.
| omscs99 wrote:
| Hypothetically, if she figured out how to automate it on her
| own, would she get any kind of reward for it? If not, why
| should she automate it?
|
| Another example, if there's some repetitive task that you
| automate at a software dev job, would you get rewarded for
| figuring out how to automate it? The answer is obviously
| dependent on culture, at my current gig you just get brushed
| off
|
| Seems to me like you're assuming that the economy incentivizes
| efficiency. Imo it doesn't, it's just a bunch of stupid rich
| people sloshing money around (for the most part). None of it
| makes any sense
| arvinsim wrote:
| > Hypothetically, if she figured out how to automate it on
| her own, would she get any kind of reward for it? If not, why
| should she automate it?
|
| The reward is more time. Now where that time is spent is
| another story.
| HHC-Hunter wrote:
| Sounds like you're implying that that newly spared time
| would be hers to spend and not her employer's?
| eru wrote:
| If she keeps her mouth shut about it, definitely.
| TeMPOraL wrote:
| Of course. She was presumably hired for regular office
| work, not R&D.
| dingi wrote:
| Why not? She needs to be rewarded for that automation
| somehow.
| atoav wrote:
| Realistically that reward is going to be:
|
| 1. more work
|
| 2. nice words and then more work
|
| 3. nice words, more work and then she is asked to do the
| same for another person's task who is then fired
|
| Sorry to be cynical here, but it is very rare for
| management to reward the few people that do a better than
| good job with more free time or less work.
| dingi wrote:
| She doesn't have to tell anyone
| atoav wrote:
| Exactly my point. But if you are a manager, you might
| wanna consider if that is the incentive structure you
| wanna have at your organization: People who do good work
| get "punished" with more work, people who keep it low
| don't.
|
| I certainly have left jobs because of that.
| ethbr1 wrote:
| I think about incentive structures from an opportunity
| cost standpoint too.
|
| Everyone has finite amounts of time they can spend on
| work.
|
| Ceteris paribus, if you spend 90% of your time working
| and 10% politicking, at most companies you will be
| out-promoted by someone who spends 60% of their time working
| and 40% politicking.
|
| The parallel IC track that tech popularized solves this
| to some degree... but most non-tech companies don't have
| that track.
| maayank wrote:
| What mechanisms successfully address this?
| yodelshady wrote:
| I believe dingi is perfectly aware of the realistic
| outcomes, and is instead describing the normative ones,
| i.e. what you need if you want an enterprise that
| actually self-improves, where the best employees _aren't_
| actively looking for the door whilst concealing
| things from management.
|
| It is, however, _wholly_ a management problem to find
| those actually-rewarding rewards.
|
| Crazy idea: what if employees _retained_ IP of any
| spontaneous, self-directed R&D? You could then license
| their tech and make the role redundant, something any
| ruthless capitalist would consider a win. The employee
| can go job-hunting with a small passive income and
| glittering CV, which means they're much more likely to
| _actually tell you_ so you can get that outsourcing win.
|
| In reality, it seems far too many businesses have moats
| as a result of excellent decisions made by founders in
| the past, and as a result can't be outcompeted by
| companies that _do_ manage talent better.
| Defletter wrote:
| More time, yes, probably doing something else equally
| tedious, or looking for a job.
| atoav wrote:
| I had a job like this as a student. I automated it and didn't
| tell anyone and used the newly won time to do other stuff.
| russfink wrote:
| I was a temporary worker at a field dispatch office. We had
| to telephone all of our technicians and enter their time into
| a time sheet. This was approximately 1990, and it turns out
| all the technicians had email. Since the time recording
| format was very simple, I worked out a shell script to let
| them enter their time in an email message, mail it to me,
| then my forward rules would pipe it through a shell script, a
| couple of ANSI codes later, and the time was entered
| automatically into the system. I would check it for accuracy,
| but it saved me having to Tab/enter about 25 times per
| employee just to get to the part where I entered their time.
| Literally, it was five minutes per person (including voice
| comms time) reduced to 30 seconds.
|
| A senior dispatcher got wind of it. She went to the boss'
| boss' boss, and complained that "this boy is going to
| computerize us out of a job."
|
| It wasn't long before I was summoned to appear. Three things
| happened at that meeting. The uber-boss told me to keep it on
| the down low. He also gave me a copy of his shell script
| programming book. And finally, he told me to get the heck out
| of there, go back to school and stop putzing around at a job
| like this before you end up becoming a middle manager like
| himself in a career with low reward.
| nonrandomstring wrote:
| > if she figured out how to automate it on her own, would she
| get any kind of reward for it?
|
| More likely she would be hauled up in front of a gang of
| rabid, frightened, myopic and punitive ICT managers where
| she'd be reprimanded for breaking a litany of "policies".
|
| In these kinds of places you are not supposed to think,
| there, computers are not intelligence amplifiers or "bicycles
| for the mind"; they are more akin to the presses and looms in
| dark satanic mills.
| _dain_ wrote:
| _> Hypothetically, if she figured out how to automate it on
| her own, would she get any kind of reward for it? If not, why
| should she automate it?_
|
| the reward is the intrinsic satisfaction of a job well done,
| and something to put on the CV for the next job.
|
| automating dumb bureaucratic shit is how I learned to code in
| the "real world".
| gryn wrote:
| > something to put on the CV for the next job.
|
| spoiler alert: you can put anything in your CV without
| actually doing it. I've dealt with people who I strongly
| doubt actually did what they claimed, and from the looks of
| it they have been rewarded for it for more than a decade.
|
| > the reward is the intrinsic satisfaction of a job well
| done,
|
| just because that applies to you does not mean it is a
| universally shared experience.
|
| for some it's not even satisfaction that they get from it;
| it's frustration and a feeling of being cu*ed, where they
| put in the effort and someone else gets the reward.
| II2II wrote:
| > Hypothetically, if she figured out how to automate it on
| her own, would she get any kind of reward for it? If not, why
| should she automate it?
|
| One would hope she would be able to move on to more
| meaningful work, rather than doing drudgery while waiting
| to be eliminated once someone figures out it can be
| automated.
|
| That said, a lot of people don't know how to automate
| processes, and it would likely face some pushback since the
| process would be non-standard and tied to an individual. That
| can have long-term costs should the knowledge be lost or a
| software upgrade break something.
| rhn_mk1 wrote:
| What the fuck are we doing as a society that we have such a
| system of perverse incentives in place?
| JohnFen wrote:
| Our society is geared -- at all levels -- towards
| minimizing expense regardless of the impact on quality or
| even on society in general. We're all racing to the bottom.
| AnthonyMouse wrote:
| Giving people raises for automating things _is_
| minimizing expense.
|
| Suppose Alice is making $40,000/year and comes up with a
| way to automate a third of her job. So you start paying
| her $50,000/year and give her some other work to do. Then
| Bob and Carol each find a way to automate a third of
| their own jobs, so now they all make $50,000 and have
| made Don redundant.
|
| The company is now paying $150,000 in total salary
| instead of $160,000 and only has to pay insurance and
| provide office space for three employees instead of four.
| Meanwhile the workers who found ways to improve
| efficiency are making more money and have the incentive
| to do it again and get another raise.
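|
| Spelling out that arithmetic in a quick sketch (same numbers
| as above):
|
|     payroll_before = 4 * 40_000  # Alice, Bob, Carol, Don
|     payroll_after = 3 * 50_000   # three at $50k, Don redundant
|     print(payroll_before - payroll_after)  # 10000 -- $10k saved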
|
| Companies may not actually do this, but those companies
| are mismanaged and putting themselves at a competitive
| disadvantage.
| imtringued wrote:
| It's called capitalism. Some confuse that with the more
| neutral concept of a market economy.
|
| In capitalism, individual wealth accumulation is the
| primary goal, not employing and paying people living wages.
|
| Since capitalism is a disequilibrium state, full employment
| is not expected. Busy work is part of the system.
| orangepurple wrote:
| It is not entirely correct to describe the social
| contract within a corporation as capitalist if you are
| salaried and compensation is indirectly tied to
| performance.
| Hgelo wrote:
| I hate this take on optimization.
|
| She would of course get a raise, or would otherwise succeed
| in our economy.
|
| The only reason I'm as successful as I am is that people
| understand I'm so good at optimizing shit that they give me
| raises.
|
| The time alone that I free up to skill up, instead of
| wasting it on repetitive things, is ridiculous if you think
| about return on investment and reinvestment.
| deergomoo wrote:
| > She would of course get a raise or would succeed
| otherwise in our economy.
|
| Are you kidding? I'm happy your optimisations have been
| recognised and rewarded, because that's how it _should_ be,
| but this lady would almost certainly just get more work to
| fill that newly freed time, for no additional compensation.
| userinanother wrote:
| Many companies don't even give top performers raises large
| enough to cover inflation. Who are you kidding here?
| Hgelo wrote:
| I get bonuses because my manager knows very well he has
| much bigger issues when I leave.
|
| You have to have a certain amount of flexibility of
| course and be willing to quit.
|
| I'm still very sure that good performance is overall much
| more beneficial than coasting along at the bottom.
|
| You are hurting yourself doing mundane tasks while the other
| person gets compound interest and new skills.
| userinanother wrote:
| Your situation is unusual and you should be grateful. The
| person described by OP is a quasi-government employee, likely
| working for a too-big-to-fail institution that does not do
| this.
| JohnFen wrote:
| > if she figured out how to automate it on her own, would she
| get any kind of reward for it? If not, why should she
| automate it?
|
| Because it makes her job easier or improves her performance?
|
| That's why I automate things at work, anyway. Being rewarded
| by my company for doing it doesn't really enter into the
| equation for me.
| LtWorf wrote:
| What happens to her if her performance is improved?
| JohnFen wrote:
| That entirely depends on the company she works for, and
| her own disposition.
| leephillips wrote:
| Quite likely she will be punished for it, directly or
| indirectly. This is why it's bad to be an employee. If
| you earn a living by knitting scarves in your house and
| work out a way to make them faster or better, or both,
| you'll make more money or have more free time. If you
| knit scarves for a salary you'll probably suffer for
| doing it faster or better.
| ethbr1 wrote:
| > _Quite likely she will be punished for it, directly or
| indirectly._
|
| Not at most modern companies in the real world.
|
| Jobs have already been so hyper-specialized that you have
| minimal staff "managing" large portions of the company,
| amortized over a large number of locations / amount of
| business.
|
| Consequently, if a job task is taken off their plate,
| there are innumerable additional tasks to backfill the
| free time.
|
| And critically, tasks that are probably more
| intellectually fulfilling than the lowest-hanging-fruit
| rote tasks that are automated.
| pacificmaelstrm wrote:
| "assuming the economy incentivizes efficiency"
|
| In the long run it does. Today's stupid rich people are
| tomorrow's "my grandfather was X and now I'm middle class."
|
| It happens more often, and faster, than you think.
|
| Economic disparity metrics suffer from selection bias.
|
| But beyond that, I think that attitude comes from
| misunderstanding, confusing an ideal of "fairness" with
| efficiency.
|
| Efficiency and fairness are far and away not the same thing.
|
| Autocracies, for example, can be very efficient.
|
| The same applies to economic competition. Scams and cons are
| efficient. Crime is efficient. Corruption... All are very
| efficient at redistributing wealth. So government is needed
| to enforce fairness.
|
| But when it comes to the example problem here of adding
| excess value to a low-value job, the efficiency of the market
| is usually acting at the level of the firm, rather than
| within it.
|
| People are naturally lazy, and for most people, the imagined
| ingratitude of their company, paired with an inflated view of
| their own value, causes them to not even try, and certainly
| not to persist, at innovating.
| rtz121 wrote:
| > Another example, if there's some repetitive task that you
| automate at a software dev job, would you get rewarded for
| figuring out how to automate it? The answer is obviously
| dependent on culture, at my current gig you just get brushed
| off
|
| The only reward I need for automating a tedious task is my
| own sanity.
| throwaway14356 wrote:
| I used to point at office buildings and tell people none of
| the jobs there are real. They are all software that didn't
| get written. One logistics friend argued his well-paid job
| was very important, but with very little help he automated it
| in two weeks; the idea had just never occurred to him. I told
| him not to tell anyone, but he did, and 300 people got fired.
| mnky9800n wrote:
| And don't forget that automating this process could also
| include a host of error checking that she is simply unable to
| do, because she is a human and not a computer.
| SanderNL wrote:
| Someone told me their job was mainly checking Word documents
| for discrepancies. Basically diffing them, but manually, by
| reading.
|
| I showed them how you could diff them automatically. This was
| not appreciated.
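|
| (The demo was roughly this -- a sketch using the python-docx
| package; the file names are placeholders:)
|
|     import difflib
|     from docx import Document  # pip install python-docx
|
|     def paragraphs(path):
|         return [p.text for p in Document(path).paragraphs]
|
|     diff = difflib.unified_diff(
|         paragraphs("old.docx"), paragraphs("new.docx"),
|         fromfile="old.docx", tofile="new.docx", lineterm="")
|     for line in diff:
|         print(line)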
|
| "Cool, IT dude, now I lost my job." And they were probably
| right if anybody cared enough to look at what they actually do.
| dingi wrote:
| That lady could have automated that with just a little glue
| code. Let the program do the work and use that time to do
| nothing. Nobody's complaining.
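|
| Something like this sketch, say, assuming the dates sit in
| the first column with the codes next to them; the internal
| search tool is obviously a hypothetical stub:
|
|     from openpyxl import load_workbook
|
|     def search_report(date):
|         # Stand-in for the bank's internal search tool.
|         return ""
|
|     wb = load_workbook("dates.xlsx", read_only=True)
|     for row in wb.active.iter_rows(values_only=True):
|         date, codes = row[0], row[1:]
|         report = search_report(date)
|         missing = [c for c in codes
|                    if c and str(c) not in report]
|         if missing:
|             print(date, "missing:", missing)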
| delusional wrote:
| >I helped a lady on Friday at a midsized bank whose job it was
| to take a set of dates in an Excel file, search each one
| individually into an internal search tool, then Ctrl+f the
| result report and verify that every code listed in the original
| document exists in the report.
|
| I happen to work in a midsized bank, and we have TONS of these
| sorts of manual processes. My favorite unintuitive fact is that
| 90% of them turn out to be some outdated compliance process
| that nobody remembered a person even did, and that is no longer
| necessary.
|
| That's also usually the reason why something is "too small of
| an impact". Having 2 engineers come in to figure out that
| something is obviously just busywork, and then try to run some
| political process to convince leadership, is very expensive.
| ethbr1 wrote:
| When I stumble across things like this, I always remember
| this scene from Babylon 5 (and Peter Jurasik's delivery!)
|
| (Context: they've found an attache and friend they thought
| was dead, alive but forgotten in a prison cell)
| https://m.youtube.com/watch?v=kCj-Rnd5SsA
| leephillips wrote:
| That was pretty amazing. I have to watch this show some
| day.
| ethbr1 wrote:
| 150% would recommend. I've watched it through 3 times
| over the years. Can't say that for any other show!
|
| If you do, commit to watching through the end of the
| second season, before you make a call.
|
| It takes that long for the show to find its feet and the
| overarching plot to get really going.
|
| And expect a lot of cheesy 90s stuff. Although _way_ less
| than Star Trek!
|
| But on the whole, IMHO, it's incredibly modern and
| prescient for when it was released. Far fewer
| disconnected "monster/problem of the day" episodes than
| e.g. X-Files, and even those usually still have character
| interactions that do important world building.
| dmvdoug wrote:
| There should be a Tell HN that is just an 800-comment-long
| thread full of these stories because I love them.
| npsimons wrote:
| > There should be a Tell HN that is just an 800-comment-long
| thread full of these stories because I love them.
|
| As gratifying as they are, they get repetitive. Once you've
| heard one, you've heard them all.
|
| I'd much rather hear about the place that fixed a policy
| problem that was blocking such progress in the first place.
| Software is easy, getting people (and orgs) to change is
| hard, tedious, boring, but absolutely necessary, and more
| enduring than a lot of software fixes.
| dmvdoug wrote:
| I believe that about repetitiveness. It's just that I'm not
| in software or computers, so I haven't been subject to them
| all.
|
| Although I was a lawyer, and I can tell you that human
| beings come up with an infinite variety of ways to do weird
| things, which gives me hope that there may be a weird
| software story out there for you, that you've never heard.
| :)
| JohnFen wrote:
| I've been a dev for a very long time, and I still
| regularly hear weird software stories from colleagues
| that are not just "the same old thing" again. They're a
| pretty small percentage -- maybe 2-5%? -- but despite the
| low flow rate, the well never seems to actually empty.
| smokel wrote:
| I agree that software is easy, and managing (human)
| processes is hard. Just the other day I advocated for a
| Free Management Foundation [1], in the same vein as the
| Free Software Foundation. Anyone care to spend their life
| on that?
|
| [1] https://news.ycombinator.com/item?id=37968979
| dmvdoug wrote:
| For some reason that reminded me of the professor I had
| for administrative law in law school. He was a member of
| something called the Administrative Office of the United
| States Courts or something like that. Their entire job
| was to be a government run think tank that produced
| reports about government administration.
|
| He really, _really_ loved administrative law.
| npsimons wrote:
| > but I would say the decline has strictly been a failure of
| software.
|
| I would argue it's a failure of _policy_, and more
| specifically _certain people enforcing a policy top-down on
| an industry they have little, if any, competence in_.
|
| The tools are still out there, and those of us who know what we
| are doing balk at those managers (not programmers, BTW),
| telling us "you don't need that."
|
| We need to wrest control back from the managers, at all levels,
| and tell them to fuck off while we get real work done.
| crabbone wrote:
| > set of dates in an Excel file
|
| PTSD intensifies
| baetylus wrote:
| I've seen this before. My experience is that the leadership who
| approves improvements is too removed to care. This is one
| reason B2B SaaS generally targets decision makers.
| vore wrote:
| This article has absolutely nothing to do with what you're
| saying. This article is about how certain types of calculations
| are better suited and are more efficient on specialized
| processors. I don't know what finding dates in Excel has to do
| with this.
| pacificmaelstrm wrote:
| Perhaps this is either:
|
| A: Where AI comes in to do this, and then the rest of her job
| as well.
|
| B: Why programming should be taught as a basic subject in
| primary school.
| RachelF wrote:
| The failure is primarily due to stagnation in single-core
| speed improvements since 2003, compared to previous decades.
|
| Yes, we have more cores now, and single cores have gotten
| faster, but nothing like the scale of the 1990s, when raw
| clock speeds increased 150x.
|
| Moore's law is about the number of transistors, not the speed of
| the device. More cores are good, but software is hard to
| parallelize, despite decent efforts from the CS community.
|
| There have also been efforts to put FPGAs into general purpose
| CPUs. These have also not taken off.
|
| For most users, a single core CPU running at 30GHz would appear
| faster than 32 cores running at 2GHz.
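|
| (Amdahl's law makes the gap concrete -- a sketch with
| illustrative parallel fractions, not measurements:)
|
|     # Speedup from n cores when a fraction p of the work
|     # can be parallelized (Amdahl's law).
|     def speedup(p, n):
|         return 1 / ((1 - p) + p / n)
|
|     print(speedup(0.90, 32))  # ~7.8x -- nowhere near 32x
|     print(speedup(0.99, 32))  # ~24.4x even at 99% parallel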
| williamtrask wrote:
| Machine learning is a massive counterexample to this trend.
| Specialised deep learning hardware is just supporting an
| even more general GPT.
| karmakaze wrote:
| I largely don't blame the specialized processors, they exist to
| fill a need. The transition to mobile/tablet has been very much a
| producer-consumer relationship. In the old days, normal users
| would one day hit "View source", dabble in HTML, make their
| own web pages, and journey into making their own programs.
| There's no parallel to this on mobile today. Perhaps the next
| generation that grew up on Roblox and on not-yet-successful
| end-user visual development tools will.
|
| Is there a web-app like MS Access or Visual Basic today, maybe
| Airtable, others?
| opportune wrote:
| It's way too premature to say we are entering a fragmentation
| cycle regarding bespoke hardware. GPUs are still universal
| computers, they just use a different model with different
| performance characteristics (and applications, which stem from
| these performance characteristics and also inertia/maturity of
| tooling) than CPUs. ASICs and specialized processors have been a
| thing for a long time and their application towards some "new"
| things like crypto, hft, and a handful of cloud optimizations is
| hardly a new trend when seen in the larger context of their being
| used all the time for now-boring, once also-new technologies like
| networking hardware and embedded applications.
|
| I'd argue that the last three decades have actually had massive
| shifts towards more generalized and ubiquitous computing that may
| continue even further. First, with the introduction of PCs the
| computing landscape shifted away from bespoke hardware and
| software to more standardized plug-and-play hardware and
| software. With the internet, everybody got a middleware called a
| "web browser" that standardized UI development on a single
| computing model. With mobile computing, we got a more restricted
| model of userspace that standardized _what_ applications could do
| and even _how_ (app review, software signing) you could do it.
| Cloud did the same things to various extents to server software.
| Except at the height of Windows' PC dominance, it's never
| been easier to write software usable by such a large portion of
| the market, and the absolute number of devices and users that
| applies to is probably an order of magnitude more than existed
| during peak-Windows.
|
| Everybody in tech knows single-core CPU performance gains are
| slowing down and that the remaining incremental improvements
| will eventually end too, so what comes next is on people's
| minds. IMO
| this article is jumping the gun though - it's still way too big
| an undertaking to use specialized computers (and I don't count
| GPUs as these) for all but the largest scale or most performance
| sensitive use cases, as it's always been.
| 8note wrote:
| The last two decades at least have moved towards a
| purpose-built consumer device, with a general-purpose server.
|
| GPUs, CPUs, and ASICs are run by vendors rather than end
| customers. The end customer runs a full computer, but has no
| rights to it, because it's called a smartphone rather than a
| computer.
| dang wrote:
| Discussed at the time (of the article):
|
| _The decline of computers as a general-purpose technology_ -
| https://news.ycombinator.com/item?id=26238376 - Feb 2021 (218
| comments)
| seydor wrote:
| ditch your phone, go back to your desktop, problem solved.
| Mobile has been a regression in pretty much everything except
| corporate profits.
|
| The focus of the article is wrong. PCs, for example, are not
| getting more specialized; they are getting new capabilities
| with the addition of specialized GPUs. As others have said,
| the problem does not lie in the processors but in the
| software and UI, which have been steadily dumbing down since
| "smart"phones were introduced. Phones have perfectly capable
| general-purpose computing processors, but corporate decisions
| limit their usage.
| pixelpoet wrote:
| Amen to this, I absolutely hate what mobile phones and their
| attendant giant megacorps did to the internet and computing in
| general.
| JohnFen wrote:
| I agree -- this is exactly why when my current smartphone dies,
| I'm going to be switching to the dumbest phone I can find and
| start carrying a real pocket computer along with it.
| genewitch wrote:
| and a camera; and a wallet with all the cards you use? maybe
| the landscape of feature phones is different now, but I only
| knew of one or two (Kyocera) that allowed tethering, so now
| you also have to figure out how to get internet onto your
| pocket computer - another SIM, another bill. Perhaps you need
| a GPS device too (my car has one built in, but my wife's
| doesn't, for example).
|
| I was going to just post a link to a picture of my cellphone
| graveyard drawer; I've been saying this same sentiment for
| the 12 or so years I've owned a smartphone. I even bought two
| Nikon 1 cameras to obviate the need for a decent cellphone
| camera. I have a serviceable netbook (eeePC), too.
|
| The most expensive cellphone I ever bought was $300. In
| theory, I could carry a Nintendo Switch, Nikon 1, feature
| phone, GPS unit, and a netbook around with me... or this
| absolute garbage OnePlus that I hate using, enough that I
| only use it when necessary, like to SMS/MMS or something as
| silly as making or receiving a telephone call.
| JohnFen wrote:
| I realize I'm unusual for the HN demographic, but...
|
| I don't really use the camera on my smartphone, so see no
| need to replace it. I carry a wallet with the few cards I
| need daily anyway, so no change there (and even if I didn't
| need to carry cards, I still need somewhere to keep cash). I
| don't use my phone to pay for things or access services, so
| I'm not concerned about doing those things when mobile. I
| can do what I need to do online from my desktop machine at
| home.
|
| I do need GPS, but it will be in my pocket computer, so
| that's fine. I don't need to have online access for this to
| be useful because I really only use GPS for specific
| planned activities anyway, so I can preload any maps I
| might need.
|
| > i only knew of 1 or two (kyocera) that allowed tethering,
| so now you also have to figure out how to get internet onto
| your pocket computer - another sim, another bill.
|
| I'm not particularly worried about having the pocket
| computer be always connected to the internet. I can easily
| get by without having a constant connection. The pocket
| computer will have Wifi, which will cover situations where
| I want to have the machine connect to the internet.
| fsflover wrote:
| > and start carrying a real pocket computer along with it
|
| So why wouldn't you buy a Librem 5 or a PinePhone, which are
| pocket computers running GNU/Linux that have phone
| functionality?
| sergeykish wrote:
| The PinePhone has a lot of software releases, and
| postmarketOS has plenty of supported hardware.
|
| "Dumbing down" is a necessity for extended demographics;
| that's a trend from long ago.
|
| The article explains reality and predicts a proliferation of
| specialized hardware. For example, was sound ever viable on
| the CPU alone? GPUs were specialized, became more universal
| with compute, and were extended with video encoders, RTX, and
| DLSS.
| blueblimp wrote:
| I question the article's framing of CPUs as "universal" and GPUs
| as "specialized". In theory, they can both do any computation, so
| they differ only in their performance characteristics, and the
| deep learning revolution has shown that there is a wide range
| of practical workloads that is non-viable on CPUs. The reason
| OpenAI runs GPT-4 on GPUs isn't that it's faster than running
| it on CPUs -- they do it because they _can't_ practically run
| GPT-4 on CPUs.
|
| So what's going on is not a shift away from the universality of
| CPUs, but a realization that CPUs weren't as universal as we
| thought. It would be nice though if a single processor could
| achieve the best of both worlds.
| tjoff wrote:
| > _[...] differ only in their performance characteristics
| [...]_
|
| ... but that is exactly why CPUs are considered "universal"
| and GPUs as "specialized".
|
| The whole concept of specialized hardware is to do fewer
| things more efficiently, and in tons of applications that
| means the problem suddenly becomes feasible. That has always
| been the case. Not sure what the deep learning revolution has
| shown in regard to this.
| Jensson wrote:
| You can't run a GPU without a CPU, but you can run a CPU
| without a GPU.
|
| Could we change GPUs so they become more general purpose?
| Yeah, and we already have, by gluing the GPU to a CPU, i.e.
| integrated graphics; lots of CPUs have that. But when you do
| that we call it a CPU and not a GPU, so as soon as you make a
| GPU general purpose we start calling it a CPU.
| JohnFen wrote:
| > In theory, they can both do any computation, so they differ
| only in their performance characteristics
|
| But surely, the exact same thing can be said when comparing any
| two different machines that engage in computations and can do
| conditional branching.
|
| GPUs can be used for any computational task that a general CPU
| can be used for, but a GPU is optimized so it will do certain
| sorts of tasks much better (and as a consequence will be worse
| at other kinds of tasks). CPUs are meant to be adequate (if not
| spectacular) at any sort of task.
|
| It seems to me characterizing GPUs as "specialized" and CPUs as
| "not specialized" is entirely correct.
| piyh wrote:
| I would also question the framing, because the systems the
| specialized hardware is running are the most general software
| systems we've ever created.
| graffix wrote:
| Indeed. Though I did like the rest of the article, three of the
| authors' pillars to define specialization are dubious:
|
| > 1. substantial numbers of calculations can be parallelized
|
| > 2. the computations to be done are stable and arrive at
| regular intervals ('regularity')
|
| > 3. relatively few memory accesses are needed for a given
| amount of computation ('locality')
|
| Where (1) fails, any modern multicore + SIMD + ILP
| desktop/console/mobile CPU will run at a tiny fraction of its
| peak throughput. While sufficiently small serial tasks still
| complete in "good enough" time, the same could be said of
| running serial programs on GPU (in fact this is sometimes
| required in GPU programming). People routinely (and happily)
| use PL implementations which are ~100x slower than C. The
| acceptability of ludicrous under-utilization factors depends on
| the tininess of your workload and amount of time to kill.
| Parallelism is used broadly for performance; it's about as un-
| specialized as you can get!
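|
| (A back-of-the-envelope sketch of that under-utilization;
| all numbers are illustrative, for a hypothetical desktop
| CPU:)
|
|     cores = 8
|     simd_lanes = 8  # 8 x float32 per 256-bit vector
|     fma = 2         # fused multiply-add counts as 2 FLOPs
|     ports = 2       # two vector FMA units per core
|
|     peak = cores * simd_lanes * fma * ports  # FLOPs/cycle
|     serial_scalar = 1  # plain serial code, 1 FLOP/cycle
|     print(f"{serial_scalar / peak:.2%} of peak")  # 0.39%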
|
| (2) and (3) are really extensions of (1), but both remain major
| issues for serial implementations too. There mostly aren't
| serial or parallel applications, rather it's a factor in
| algorithm selection and optimization. Almost anything can be
| made parallel. Naturally you specialize HW to extract high
| performance, and that requires parallelism for specialized HW
| just as it does everywhere else.
|
| The authors somewhat gesture towards the faults of their
| definition of "specialized" later on. Truly specialized HW
| trades away much (or all) programmability in favor of
| performance, a criterion which excludes GPUs of the last ~15
| years:
|
| > [The] specialization with GPUs [still] benefited a broad
| range of applications... We also expect significant usage from
| those who were not the original designer of the specialized
| processor, but who re-design their algorithm to take advantage
| of new hardware, as deep learning users did with GPUs.
| kaycebasques wrote:
| > the economic cycle that has led to the usage of a common
| computing platform, underpinned by rapidly improving universal
| processors, is giving way to a fragmentary cycle, where economics
| push users toward divergent computing platforms driven by special
| purpose processors
|
| Where does RISC-V and all its extensions land on the GPT <->
| fragmentation spectrum?
| sinuhe69 wrote:
| The title is misleading. It should read: The Decline of the
| CPU as a General Computing Device.
|
| But then, I guess not many people would mind it.
|
| A computer? I consider all forms of computing units today to
| be computers: microcontrollers, single-board computers, CPUs
| +/- GPUs +/- Neural Engines, etc.
|
| What about the math coprocessors of the 90s? Did they cause a
| decline of the computer as a general computing device? Of
| course not.
| Madmallard wrote:
| Is this because people, in efforts to secure their
| long-standing jobs, have made software more complex and less
| effective?
| benrutter wrote:
| I don't know if there's good justification yet for the claim
| that computers are shifting from general (CPU) chips to more
| specialized (GPU) chips.
|
| I'm not sure classifying the GPU as "specialized" is right -
| it's essentially a chip that can do lots of small
| computations, rather than big ones like the CPU does. To me,
| the trend looks more like a shift from centralized to
| distributed models of computing. Which I think is also backed
| up by data processing tools like Spark, which distribute work
| over multiple computers.
|
| I saw a "SQL process unit" announced recently[1] which I guess
| _really is_ a move towards specialized compute. I haven 't heard
| of much uptake in it yet though, so I guess time will tell.
|
| [1] https://www.neuroblade.com/product/
| deafpolygon wrote:
| The decline is due to vested interests. More specifically: to
| make money.
| bullen wrote:
| The problem was the smart phone.
|
| With that closed platforms became the standard.
|
| Non-productive hardware is now ubiquitous, and most of it
| can't be fixed.
|
| The open RISC-V tablet is the only exit from this mess.
|
| Forget the ARM and x86 empires; they will only exploit you as
| a consume-only slave.
|
| It needs a working GPU and a swappable battery, though!
| piperswe wrote:
| What difference does the ISA make in this respect?
| babypuncher wrote:
| I think the theory is that an open ISA like RISC-V would
| promote a similar culture of openness around the software and
| hardware implemented with it.
|
| I'm skeptical that is how it would play out in practice
| though.
| bullen wrote:
| Very little. Broad-stroke RISC ISA design was nailed down
| between the 6502 and the ARM-1 (1975-85); the conclusion is
| that 32-bit is enough for eternity. Same with IPv4: just add
| a byte for the internal mask (/24) and be done with it; long
| (64-bit) is way overkill... and IPv6 chose 128-bit. I say 4GB
| RAM per process is enough.
|
| Competition (also eternal growth) only works when you have
| infinite free energy. Now we are at the end of the curve.
| Current RISC-V (64-bit) is not perfect by any means (it lacks
| vector/SIMD), BUT it removes the real morons (lawyers and
| "owners") from the picture somewhat.
|
| The practical evidence: VisionFive 2 software is moving
| forward fast, but the GPU is stalled for some reason,
| probably related to Imagination (PowerVR, dumped by Apple and
| bought by the Chinese government) being British, but who
| knows.
|
| The PineTab-V uses the same chip (JH7110), so it's also
| gridlocked by the GPU drivers. The TH1520 also uses an
| Imagination GPU, but a bit beefier; that one will have more
| trouble since the CPU is a new architecture, while the JH7110
| uses the same one as the SiFive Unmatched.
|
| I tried to help by building my 3D MMO engine for them. I sent
| the profiling data; they wanted to see the OpenGL call stack,
| and I lost hope because it works on ALL machines I have tried
| (PC Nvidia/Radeon, Raspberry Pi 4, and Jetson Nano).
|
| I mean, they have the game executable right there, a 10MB
| download away?
|
| Popcorn time at the end of the world, at least I tried.
| sergeykish wrote:
| But x86 is the PC, there are ARM notebooks, and a smartphone
| is just a computer with a touchscreen and telephony.
| tim333 wrote:
| It's an odd use of language. I'm typing this from a MacBook
| that seems pretty general purpose, but the article seems to
| be arguing it's not, because it has both a regular CPU and
| some GPU/parallel stuff, and the latter apparently doesn't
| count. But it mostly seems quite general purpose to me.
___________________________________________________________________
(page generated 2023-10-23 09:01 UTC)