[HN Gopher] Hyperbolic Growth
       ___________________________________________________________________
        
       Hyperbolic Growth
        
       Author : kristiandupont
       Score  : 53 points
       Date   : 2022-09-17 11:55 UTC (11 hours ago)
        
 (HTM) web link (kristiandupont.medium.com)
 (TXT) w3m dump (kristiandupont.medium.com)
        
       | Negitivefrags wrote:
       | There is a famous quote from Douglas Adams:
       | 
       |  _Anything that is in the world when you're born is normal and
       | ordinary and is just a natural part of the way the world works.
       | Anything that's invented between when you're fifteen and thirty-
       | five is new and exciting and revolutionary and you can probably
       | get a career in it. Anything invented after you're thirty-five is
       | against the natural order of things. Apply this list to movies,
       | rock music, word processors and mobile phones to work out how old
       | you are._
       | 
       | These feelings are just the cohort of which we are a part getting
       | older.
        
         | kristiandupont wrote:
         | I guess that is comforting in a way :-)
        
         | luis_cho wrote:
         | There is a famous quote from Kenneth Boulding "Anyone who
         | believes in indefinite growth in anything physical, on a
         | physically finite planet, is either mad or an economist."
        
       | carapace wrote:
       | Mark Miller said something like, "Moore's law from the CPU's POV:
       | humans are getting exponentially more expensive."
       | 
       | The fundamental shape of the future is clear: either you have
       | some sort of legal claim on or equity in the primary producers
       | and/or "rent extractors" or you don't.
       | 
       | As a programmer, the next iterations of e.g. GitHub Copilot
       | _will_ eventually eat my career.
       | 
       | Large corporations are effectively already AIs, and they have
       | much more ability and capacity to leverage technology advances
       | than the little people.
       | 
       | I don't really know what to do about any of it, even assuming I
       | could...
       | 
       | I've thought about trying to start a kind of member-owned
       | "general automation" corp that builds houses and neighborhoods
       | and ecologically savvy farms, but the simple fact is, I'm not a
       | people person.
       | 
       | I've also thought about buying one of those cheap desert plots, a
       | couple of acres for a few thousand dollars, and going out there
       | with a "kernel" or "seed" of, y'know, solar panels and chickens
       | and a little earth-mover, etc. I could pretend it was like I was
       | colonizing an alien planet and make a kind of performance art
       | piece out of it, I dunno.
        
       | BlargMcLarg wrote:
       | >Creativity is a big part of what I feel defines me, and if that
       | loses all value then it will surely affect me.
       | 
        | Isn't that the whole problem with these tools? That they _can't_
       | capture creativity and context, and therefore only work as
       | augments to competent users rather than replacing users entirely?
       | 
       | >What will be a viable career in 20 years? Will the concept of a
       | "career" even make sense by then? I have no idea, and I am
       | uncomfortable with that.
       | 
       | Which is why society should answer this ASAP. It's evident that
       | putting the means of production in the hands of a few becomes a
       | problem for the many, once the many are no longer necessary. This
        | has been going on since pre-industrial times. The premise has
        | always been "we will make more jobs than we remove". What if
        | that premise is no longer true? What if that premise wasn't
        | true to begin with?
        
       | seibelj wrote:
        | Advanced auto-complete and sort-of-right concept art are not
        | going to disrupt the world or the software industry. We already
        | have machine-learning algorithms doing, IMO, far more socially
        | consequential things, like deciding what news and videos we
        | watch, who gets approved for loans, and what politicians say -
        | but this gets way less press or thought.
       | 
       | You will never replace software engineers just as you will never
       | replace writers and artists no matter how "good" the machine
       | generated stuff gets. There is no soul there and it only appears
       | interesting at a superficial, "that's sort of neat" level that
        | doesn't hold up. No one is buying AI masterworks; people spend
        | their weekends reading real books and blogs by real people.
       | 
        | It's 2022 - where is my self-driving car? Where are the tens of
        | millions of AI-driven job losses that were predicted and that
        | supposedly necessitated UBI (see the entire presidential thesis
        | of Andrew Yang)?
       | 
       | It does get a lot of hype and raises a lot of VC money though!
        
         | nmca wrote:
         | https://bounded-regret.ghost.io/ai-forecasting-one-year-in/
        
         | kristiandupont wrote:
         | May I ask if you have tried using Copilot? Because this sounds
         | exactly like the kind of response I would have had half a year
         | ago. It makes many mistakes and can by no means write code by
         | itself. But that doesn't change the fact that it's orders of
         | magnitude better at it than I thought possible.
        
           | copenja wrote:
            | I'm not the parent poster and I have not tried Copilot,
            | but I do have a question.
            | 
            | Does Copilot make you orders of magnitude more productive?
        
             | kristiandupont wrote:
             | No, it does not.
        
           | AlotOfReading wrote:
           | One issue with copilot is that debugging broken code is far
           | harder than writing it in the first place. We've seen similar
           | transitions before though. When good optimizing compilers
           | started appearing you had plenty of programmers like Mel [1]
           | that said 'there's no art in it, a compiler couldn't do the
           | kind of optimizations I do'. And yeah, it's true, but
           | compilers got good enough that 99% of people stopped caring
           | and hardcore optimization became a niche skill few people
           | have. What does the industry look like when all that most
           | programmers know is debugging generated code and what kinds
           | of tools need to exist to support that?
           | 
           | [1] http://catb.org/jargon/html/story-of-mel.html
        
       | vsareto wrote:
       | >What will be a viable career in 20 years? Will the concept of a
       | "career" even make sense by then?
       | 
        | I think AIs have made some amazing incremental progress, but I
       | feel like the fears of being made obsolete are still premature.
       | If you needed to pick though, learning the thing that's replacing
       | you is a good start, so pick AI research or development. For that
       | to be non-viable as a job or career, AIs would need to have the
       | ability to build and improve themselves, and at that point, it's
       | game over, man(tm).
       | 
       | None of these AI things are going to replace politicians,
       | governments, or economies, and so you still have a vast amount of
       | soft skill careers available as well.
        
         | api wrote:
         | Every AI I have tested requires the user to become clever and
         | proficient at getting it to generate good results.
         | 
          | Programming with AI is simply going to elevate the
          | programmer cognitively, just like compilers and high-level
          | languages did. It will be an even higher level of abstraction
          | at which developers can operate.
         | 
         | These things are powerful tools but just tools. Same with the
         | "art generators." So far I see nothing capable of autonomous
         | reasoning at a level that would actually replace the tool
         | wielding operator. Someone still has to hold the hammer.
        
         | ben_w wrote:
         | While I would agree that the fears are in many cases premature,
         | 20 years is a long time in tech while being short enough to
         | matter for careers.
         | 
         | By 2042 we're likely to have single-atom transistors in
         | consumer devices, while people mid-career now are likely to be
         | retiring _or_ getting radical life extension to confuse all
         | this even further, and newborns today will be just about to
         | graduate from universities.
         | 
         | Will we have AI as general-purpose and flexible as a human,
         | even if only limited to an IQ of 85? Dunno, but even limited to
         | that IQ an AGI would render about 15% of the population
         | permanently unemployable, which would be enough to cause all
         | sorts of social and political problems by itself.
         | 
         | But I think the limit on AGI right now isn't compute resources,
         | it's that we don't know the right algorithms to make the AI we
         | have as data-efficient as humans (or even, AFAICT, dogs, cats,
         | or mice).
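
          Where the "about 15%" figure above presumably comes from: IQ
          scores are conventionally normed to a normal distribution with
          mean 100 and standard deviation 15, so the share of the
          population below 85 is the normal CDF one standard deviation
          under the mean. A quick back-of-the-envelope check (editorial
          sketch, not part of the comment):

              from statistics import NormalDist

              # IQ is conventionally normed to mean 100, std dev 15.
              iq = NormalDist(mu=100, sigma=15)

              # Fraction of the population scoring below 85
              # (one standard deviation below the mean).
              print(f"{iq.cdf(85):.1%}")  # prints ~15.9%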
        
           | bumby wrote:
           | > _Will we have AI as general-purpose and flexible as a
           | human, even if only limited to an IQ of 85?_
           | 
           | While I agree to a certain extent on this premise, there will
           | still be some areas that AI won't replace because humans also
           | have an emotional value component.
           | 
            | Consider if you'd rather have a massage from a robot or
           | person. Or, if you are infirm and would prefer a caretaker
           | who is human or machine. It's not just a matter of processing
           | data when humans are emotional creatures.
           | 
           | If anything, AGI may free up humans to focus on human
           | relational jobs which we're much more evolved to be good at
           | and value.
        
             | ben_w wrote:
              | > Consider if you'd rather have a massage from a robot or
             | person. Or, if you are infirm and would prefer a caretaker
             | who is human or machine.
             | 
             | Sure, I'm also implicitly assuming robotics will improve.
             | At some point -- don't know when -- we can't tell the
             | difference any more, and at that point, the humans
             | indistinguishable from machines are likely to be made
             | redundant by the machines.
        
               | bumby wrote:
               | I'm saying there's something innately different
               | emotionally about interacting with a human.
               | 
               | So if you know it's a machine, it will inherently change
                | the subjective interaction. If you're saying that it
                | won't matter because we will get to a point where humans
                | and machines are indistinguishable, that implies robots
                | will be emotional and no longer just processing data in
                | the way we currently think of AGI. Essentially it means
                | they will either be conscious, with subjective experience
                | of their own, or just philosophical zombies. Both are
                | essentially fictitious thought experiments at this point.
        
               | throwaway743 wrote:
               | That's what it seems like they're saying. At some point
               | it's likely they will be indistinguishable from us on
               | many levels, including emotions. There's a demand for it
                | and a solution/product will eventually be created to
               | address that demand, once the means of doing so are
               | technically and financially feasible/scalable.
               | 
                | Currently, yes, there's an innate difference, but the
                | idea that there will eventually be no difference is only
                | fictitious until it isn't. It's plausible speculation.
                | Writing it off as fictitious now feels like a
                | head-in-the-sand way of clinging to a static perception
                | of reality.
               | 
               | Things will change, we're not special, labor as we know
               | it will be replaced, machines will at some point acquire
               | the same abilities as us, and we'll have to strive to
               | figure out as a whole if we need to change the means
               | required to carry on and/or individually adapt/overcome
               | such sea changes in labor/income.
               | 
               | It will probably cause many issues within our societies,
               | but many will also see opportunities. It's going to be a
               | lot of grey, likely more so than now.
        
               | bumby wrote:
               | I agree that there's hubris in thinking we're somehow
                | special. But pretending we will get there "someday"
                | isn't much different from saying one day we'll time
                | travel. Even if plausible, it's so far from where we
                | currently are that it belongs more to philosophical
                | thought experiments than to serious technological
                | inquiry.
        
               | ben_w wrote:
               | I'd agree that they would either be conscious or
               | p-zombies, but I think humans are sufficiently easy to
               | fool that the latter may well happen. I offer panpsychism
               | and celebrity fandoms/parasocial relationships as
               | examples of people's emotional sense of connection being
               | misleading.
               | 
               | I don't think anyone has a sufficiently concrete
               | definition of consciousness to even tell either how far
               | we are from that now, nor how to get there from here.
               | 
               | https://kitsunesoftware.wordpress.com/2022/06/18/lamda-
               | turin...
        
             | gnaritas99 wrote:
             | Neither of those examples are a good illustration of your
             | point as many and probably most will prefer the machine in
             | both scenarios. I don't get a massage for an emotional
             | experience with the masseuse; I'd prefer the likely much
              | cheaper, higher-quality, and more consistent machine
              | massage. Ditto with a permanent caretaker, as it'll allow
              | you to feel far more independent; most of us don't want to
              | feel like a
             | burden on other people. However, you do have a valid point,
             | just not good examples imho.
             | 
             | AGI will make the vast majority unemployable, it'll be a
             | disaster and those jobs won't simply be replaced by new
              | ones: people who think this haven't thought it through
              | very deeply at all.
        
               | bumby wrote:
               | Maybe. My original example that came to mind was a
               | caretaker for someone terminally ill, I.e., hospice.
               | Having volunteered in that area, people want human
               | contact, even if it's not performing any pragmatic
               | function. I tend to think automation would just make
               | somebody feel more isolated. People yearn for connection,
               | not automation.
               | 
               | I believe the research shows there is a strong
               | psychological component to massage that goes beyond the
               | mechanical manipulation. I guess it would depend on
                | whether somebody uses it for stress relief or merely
               | recovery.
        
         | chrisco255 wrote:
         | Really I would think that politician would be one of the
         | easiest roles to automate. It's a very simple algorithm that
         | votes yes based on whoever gives it the most money.
        
           | vsareto wrote:
           | Getting them to hand over their jobs is the hard part :)
        
         | BlargMcLarg wrote:
         | Those jobs aren't being replaced by AI because the people in
          | charge don't _want_ them to be replaced. It's not the AI they
         | should be worried about, but society shifting the paradigm and
         | rendering half of all bureaucracy across the globe obsolete.
         | 
          | Endless bikeshedding is deemed very important amongst its
          | practitioners, and is pretty lucrative to boot, even though
          | we have never measured the benefits.
        
       | random314 wrote:
        | Why the word "hyperbolic"? It doesn't seem to make sense.
       | Exponential automation seems better.
        
         | kristiandupont wrote:
         | Hyperbolic growth has a singularity in finite time. I.e. there
         | is a "crucial" point where things truly go haywire. Not a very
         | important distinction in a philosophical discussion like this,
         | but that was the reasoning.
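
          A minimal sketch of that distinction, using the standard
          textbook ODEs for the two growth modes (editorial illustration,
          not taken from the article):

              % Exponential growth: stays finite for every finite t.
              \frac{dy}{dt} = k y
                  \;\Rightarrow\; y(t) = y_0 \, e^{k t}

              % Hyperbolic growth: diverges (the "singularity") at the
              % finite time t^* = 1 / (k y_0).
              \frac{dy}{dt} = k y^{2}
                  \;\Rightarrow\; y(t) = \frac{y_0}{1 - k \, y_0 \, t}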
        
       | mmargerum wrote:
       | I reluctantly moved to an architecture position because developer
       | salaries have stagnated and remote work normalization will put
       | even more pressure on western developer wages.
       | 
       | Coding isn't nearly as much fun anymore. Agile, CI/CD, Pull
        | Requests, code reviews, constant software updates, and
        | locked-down laptops put so many barriers in the way of my
        | creative flow.
       | 
        | I still code on my own equipment, building apps for myself, and
        | I still love it, but I'm done doing it professionally.
        
       | dasil003 wrote:
       | Being made obsolete by technology is nothing new (see the
       | Luddites). But regardless of how replaceable any individual is,
       | we're still in a human society. Maximizing profits by laying off
       | humans can work for a while, and systems can rebalance, but if
       | prevailing dynamics collectively lead to a large percentage of
       | the population being "unemployable", then the political situation
       | will become very explosive. People need to feel some purpose in
       | life. Once things destabilize to a certain point, the global
       | productivity that allows for the enormous compute power behind
       | modern tech and AI will come crashing down pretty fast. Sure,
       | there are authoritarian scenarios to worry about, but I have a
       | hard time convincing myself that tech enables authoritarian
       | control beyond what was possible throughout history.
        
         | sawyna wrote:
          | One would think people would have more freedom, explore life
          | far more than they do today, and get into philosophical
          | pursuits. But the reality is messy. Imagine a single
          | individual who leads quite a busy life and yearns for some
          | free time to be more "free" and do things that matter. You
          | know what happens when that person gets a whole lot of free
          | time? They are not sure what to do, and it takes a lot to
          | start doing something new. If this happens on a large scale,
          | I'm really not sure it'll make the world a better place to
          | live. Maybe I'm pessimistic, but people would have ample time
          | to do more useless political things, like manipulating and
          | influencing others for their own gain.
         | 
         | There's this anime called Log Horizon which kind of plays this
         | scenario out. Everyone gets stuck in this metaverse sort of
         | game where people generally come for fun. Once stuck, people
         | start killing each other and doing random things because they
         | don't know what to do. There's no purpose in their "life".
        
         | fullshark wrote:
         | Why? Tech enables a small group of people to have extreme
         | power/leverage. In the case of military/control you don't even
         | need the labor class to serve in the military in large numbers
         | anymore for global supremacy.
        
           | dasil003 wrote:
           | For all the leverage the US has militarily, what control did
           | it actually exert in Afghanistan? What about Russia in
           | Ukraine? War and occupation is one thing, but I'm speaking
           | more of the consent of the governed. Do you think the
           | billionaire class can keep hoovering up all the economic
           | gains as jobs are automated based solely on their self-
           | interested narrative of capital ownership? What happens when
           | unemployment hits 20%, 30%, 50%? Are they going to send
           | drones against the populace?
           | 
           | Even in China, the CCP's policies only work as long as they
           | deliver a certain measure of prosperity to the people.
           | Propaganda and police brutality can only take you so far
           | before the people have had enough.
        
           | peyton wrote:
           | Yeah, I'm fairly certain things will be fine. Everyone's
           | consuming curated propaganda on their phones every single
           | day. The little people won't be doing much.
        
         | warent wrote:
         | It feels like part of this may be some needed mental shift from
         | a scarcity mindset to a prosperity mindset.
         | 
         | My mind always goes to the very optimistic, Utopian world in
         | Star Trek, where humans have identified that we can afford for
         | people to stop working for the purpose of survival (the bottom
         | of Maslow's hierarchy of needs) and instead start working just
         | for the personal challenge and growth.
         | 
         | Not saying we're there yet, but with so many jobs rapidly
         | becoming obsolete by machines, it seems to me to be the
         | direction we're going. Rather than meaninglessness,
         | joblessness, poverty, homelessness, I hope society can shift
         | into this paradigm that automation could mean the exact
         | opposite: empowerment to more freedom.
         | 
         | This is probably a very scary perspective for many capitalists!
          | The majority of people actually do want to work; they just
          | don't want to work for the sake of work itself. A huge
          | benefit of capital is that it means we don't really have to
          | trust each other at a high level, because fundamentally our
          | incentives all align around survival, and money is a tool
          | for surviving.
         | 
          | One day maybe we can make a world where there is more trust
          | in each other; where we trust that just about everyone wants
          | to do the best they can, with the exception of a small
          | minority with malignant personality disorders.
         | 
          | This new trust will open some amazing doors.
        
           | monkeydust wrote:
            | You should read Trekonomics if you're into this way of
            | thinking.
        
             | warent wrote:
              | Interesting! Thank you, I'll check it out.
        
       ___________________________________________________________________
       (page generated 2022-09-17 23:01 UTC)