[HN Gopher] Managed by Bots: surveillance of gig economy workers
___________________________________________________________________
Managed by Bots: surveillance of gig economy workers
Author : giuliomagnifico
Score : 161 points
Date : 2021-12-13 09:47 UTC (13 hours ago)
(HTM) web link (privacyinternational.org)
(TXT) w3m dump (privacyinternational.org)
| monkeydust wrote:
| Recommend reading this paper which introduces the concept of
| 'pre-automation'
|
| https://sociologica.unibo.it/article/view/11657
|
| It helps (me at least) understand what's going on here.
| fxtentacle wrote:
| That video at the end is interesting both for its content and for
| its delivery technology.
|
| And I do agree with him: if whether you get work or not is
| decided by a black-box AI, and your only support channel is
| another black-box AI, that is pretty much a Kafkaesque
| nightmare.
|
| I'd guess we here on HN are the people best equipped to prevent
| the worst. So the next time you help someone automate their
| customer support, ask yourself: How would I feel if my well-being
| depended on this? Because for some poor soul, it might. Is there
| a clear fall-back in case the AI fails horribly? Because, you
| know, they always do.
|
| I once had my own support automation problem with Amazon, but
| luckily I had no stakes in it. They accidentally sent me someone
| else's parcel. So I filed a support request to inform them. They
| very politely apologized within a minute and informed me that
| they were sorry about my lost parcel and would send another one.
| So I got the same wrong parcel again. After waiting a while, I
| opened up the two parcels, each roughly 10x5x5 inches (25x13x13
| cm). Each contained a single pencil eraser.
|
| But boy would I have been furious if I had received the same
| support quality for a lost high-value parcel... Also, I did
| ponder if it is OK for Amazon to waste my time if I'm not even
| their customer. I mean their support forms are difficult to
| reach, no matter why you need to contact them.
| PragmaticPulp wrote:
| > I'd guess we here on HN are the people best equipped to
| prevent the worst.
|
| I agree that individuals should be conscious of which companies
| they support through their labor and/or spending habits.
|
| But I also think that HN overestimates the ability for single
| engineers to upend the goals of an entire organization. In the
| real world, if a company wants to automate anything and they're
| paying well, there will be a line of engineers out the door
| applying to get it done.
|
| > So the next time you help someone automate their customer
| support
|
| I think this strikes at a false dichotomy that occurs
| frequently in these conversations: There's an idea that if we
| simply removed the automated solutions then companies would be
| forced to replace them with the idealized solutions that we
| want. In practice, companies know very well that automated
| customer support and similar solutions aren't comparable to
| having a well-paid, highly-trained person pick up the phone.
| But they weren't going to pay for the expensive solution
| anyway.
|
| And the real driving factor isn't just the companies, it's the
| customers. If given the choice between two identical products
| where the only difference is automated versus human customer
| support (and associated higher price for human customer
| support), the majority of customers will choose the cheaper
| option every time. There are a few people who will proudly pay
| more for the better CS, but they are a tiny minority. Overall,
| customers _want_ the cheaper option even if it comes with
| tradeoffs, and they will vote with their wallets.
| [deleted]
| 1vuio0pswjnm7 wrote:
| Who expects "single engineers to upend the goals of an entire
| organization". Maybe I am missing something but I could not
| detect that suggestion anywhere in the parent's comment. Nor
| did I detect any suggestion that companies would ever adopt
| "idealized solutions". The comment appears to ask others to
| refrain from helping companies automate customer support,
| whereas the reasoning of the reply is something along the
| lines of "If you do not help the company automate customer
| support, then whatever customer support the company provides
| will not be "ideal", therefore, you should continue to help
| the company automate customer support." WTF.
|
| What is perplexing about these types of replies, which seem to
| occur any time there is a suggestion of taking personal
| responsibility for one's actions, is that people even bother
| to make them. Because if we accept the premise as true that
| one person's actions can have no effect on organisational
| change, then why would anyone else care about, let alone try
| to discourage, someone acting according to some ethical
| principles? What prompts people to respond?
|
| Perhaps there is something else going on here, especially
| when these replies often (a) try to point to some other
| focus, e.g., management, customers, price, etc., besides
| their own decision-making and (b) do not present any
| alternative courses of action. "It will not make a
| difference. Thus, keep doing what you are doing." If it does
| not make a difference, then why would anyone care about
| someone doing it?
|
| Whatever the motivation (maybe someone else can explain it),
| something about a person on HN who suggests following ethical
| principles triggers a counter-reply from someone who
| generally tries to discourage this line of
| thinking/behaviour, alleging something along the lines of "it
| will not make a difference."
| ekanes wrote:
| > the majority of customers will choose the cheaper option
| every time
|
| This is ultimately behind everything that goes to zero.
| Airlines must compete almost exclusively on price, so
| everything gets worse to make that happen. We buy goods
| mostly on price, and so are reliant on China.
| glitchc wrote:
| Maybe the only choice then is to regulate the profession so
| that every Tom, Dick and Harry with a two-week bootcamp
| _can't_ walk in the door to replace you.
|
| Engineers in other professions are required to behave
| ethically. Those who can't get there need to stop calling
| themselves engineers, until they qualify to use the
| designation (see regulation above).
| BolexNOLA wrote:
| I know this may come off as cheeky, but I'm being very
| sincere: if not us, then who? Sure I alone can't upend
| Amazon, but we aren't all alone and we aren't all working for
| the juggernaut that is the Bezos machine.
|
| Maybe I sound naive but every bit matters, especially given
| how many of us probably work at start-ups. You never know
| when your decisions will have an impact on a multibillion
| dollar company that simply hasn't emerged yet.
| geodel wrote:
| I think it would be like the early industrial revolution,
| when people would vandalize railway tracks or factories. It
| can slow things down a bit at a particular place and time,
| but otherwise these things are inevitable.
| belval wrote:
| The people? Like it or not developing stuff like this is
| not illegal, and expecting people to grow a conscience when
| their livelihood depends on not having one is pointless.
|
| The unfortunately cynical answer is that we need to have
| representatives that will put laws in place to prevent
| this. It's both simple and complex due to the political
| landscape, but that's the only real long-term solution.
| pessimizer wrote:
| > But I also think that HN overestimates the ability for
| single engineers to upend the goals of an entire
| organization. In the real world, if a company wants to
| automate anything and they're paying well, there will be a
| line of engineers out the door applying to get it done.
|
| There's a similar effect in journalism, where the loudest
| voices saying that individual journalists have the duty and
| ability to maintain the integrity of journalism are often
| other journalists.
|
| It's a survivorship thing: the journalists/engineers who
| stay employed as such are the ones who probably honestly
| believe in the same things that their employers do, or at
| least produce output close enough to what their employers
| want that the employers won't feel the need to heavily edit
| it or nix it entirely, which comes to nearly the same thing.
| It creates an illusion of control.
|
| But ultimately the author of anything is the person whose
| yeses and nos cannot be overridden, and that's the boss.
|
| > In practice, companies know very well that automated
| customer support and similar solutions aren't comparable to
| having a well-paid, highly-trained person pick up the phone.
|
| And even if it's not particularly expensive in itself, the
| fact is that having bad customer service reduces complaints.
| Having anonymous/automated customer service even reduces
| complaints about customer service - because there's no way to
| identify or isolate a particular bad customer service
| experience, like a _name._ Avoiding the complaints probably
| creates so much savings that it would be worth automating
| customer service even if that cost _more_ than having good,
| live customer service.
|
| We do the same thing in means testing/bureaucracy in
| welfare/social services. The more difficult or confusing it
| is to prove one's qualification, the fewer _qualified_ people
| will have to be serviced. In that case, spending _more_ on
| "customer service" (i.e. adding more steps) ultimately
| reduces costs.
|
| > There are a few people who will proudly pay more for the
| better CS, but they are a tiny minority.
|
| There are also very few ways to know that the customer
| service on a product will be terrible before you need it.
| People aren't actively picking bad customer service any more
| than people choose to watch bad movies at theaters. It's
| something you find out about after you've already bought in.
| fxtentacle wrote:
| There are plenty of little things that you can add to make
| even an automated system more humane.
|
| For example, assign a unique ID to every automated decision
| and display it in the automated reply email. That way,
| affected humans can at least sue to get the basis of the
| decision disclosed.
|
| Or why should noreply-support@company.com actually be a no-
| reply? Let's forward it to the manager who is responsible so
| that if people are genuinely upset, their "feedback" reaches
| the correct person.
|
| Or attach a footer to those emails, which contains an email
| address for legal matters. Make sure to say "Please only
| contact this address for urgent legal matters" so that
| everyone correctly infers that it'll reach an actual human.
|
| As for those gig apps, it's probably good enough to just
| include the actual reason in the JSON reply that the server
| sends, without ever displaying it in the app. Of course,
| that's for internal debugging only, but nobody's going to get
| the production deployment flags set correctly.
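|
| As a minimal sketch of that idea (all names and fields here are
| made up for illustration, not any real API), the decision payload
| could carry its own ID and reason:
|
|     interface AutomatedDecision {
|       decisionId: string;     // unique ID, quotable in a dispute
|       outcome: "approved" | "denied";
|       reason: string;         // the actual basis for the decision
|       appealContact: string;  // an address that reaches a human
|     }
|
|     // Example of what the server could return alongside the
|     // user-facing message, even if the app never displays it:
|     const decision: AutomatedDecision = {
|       decisionId: "d-2021-12-13-0042",
|       outcome: "denied",
|       reason: "claim amount exceeds auto-approval threshold",
|       appealContact: "legal@example.com",
|     };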
|
| Or you can make the decision really obvious, by instantly
| loading it via AJAX. The customer fills out a form, the spinner
| starts, and the result is displayed. "We're terribly sorry
| but your claim has been denied." But keep the inputs visible,
| so the customer can tweak the values and submit again.
| They'll surely figure out what makes the decision flip.
|
| And I wouldn't worry too much about colleagues who'd do
| anything for money. They will probably go to FAANG for better
| pay anyway. And if all non-FAANG companies have significantly
| better service, the overall public will notice.
|
| Bonus tip: If you ever want to reach a human yourself, there
| are services like hunter.io that'll procure the email address
| of a human for you. And then you just circumvent all the
| automation and directly CC the CTO to say "f* - oops - thank
| you for your great customer service!"
| milkytron wrote:
| > I agree that individuals should be conscious of which
| companies they support through their labor and/or spending
| habits.
|
| Agreed, but this is a lot of work. After doing it myself, I
| find that the options are almost always: choose the least
| evil, choose something expensive, or don't buy the thing at
| all.
|
| Choosing something expensive is a sacrifice I'm willing to
| make to support a company I believe is doing good. The hard
| part becomes not buying anything at all when there are no
| good choices. When I had a car, filling it up with gas felt
| like a crime against humanity's future.
| alisonkisk wrote:
| You can get a car that doesn't run on gas.
| fxtentacle wrote:
| > The hard part becomes not buying anything at all when
| there are no good choices
|
| That pretty much describes me and my Android phone right
| now. I don't want to use an iPhone, but I also want a phone
| that still fits into my hand and my pocket. Sadly, there's
| nothing comparable to the 2016 iPhone SE on the market. I
| could easily get a new phone for free from work, but I
| would hate to support a product that I actively dislike. So
| I wait and hope for a small Android phone to be released in
| the future.
| Mezzie wrote:
| I feel similarly, and recently I've been focusing on
| acquiring tools that enable me to disengage from our
| rotting culture by making things for myself. Of course not
| everything can be done, but a lot can be, especially if you
| start letting other people use the tools.
|
| Instead of fast fashion, I'm getting a sewing machine and
| fabric and just letting several people use it in return for
| making me clothes (I have money, they have time). Likewise,
| I'm investing in knitting/crocheting stuff and lending it
| out. Instead of plastic doohickeys, I'm getting a 3D
| printer: What I make may be cheap crap on par with what I'd
| get at Walmart, but at least I won't be exploiting or
| enslaving anyone.
|
| Also buying used and repairing is an option. Basically I
| like to ask myself 'will this help me need less from big
| corporations?' before I buy something. Not that small
| business is inherently more moral, but it's easier to hold
| to account as an individual consumer. I can actually
| sue/raise enough of a ruckus to put a local business out of
| business if they hurt me, whereas Amazon doesn't care.
| burnished wrote:
| I want to argue this point:
|
| >>But I also think that HN overestimates the ability for
| single engineers to upend the goals of an entire
| organization. In the real world, if a company wants to
| automate anything and they're paying well, there will be a
| line of engineers out the door applying to get it done.
|
| Either the work is difficult and few people can do it (hence
| the price tag on engineering salaries), or it isn't. But if
| it is, then even a few refusals can effectively shut down a
| project. 'Oh, there's a line out the door' gets given as a
| weak excuse by people who face zero threat to their security,
| and I'd rather never hear it again.
| et-al wrote:
| > But I also think that HN overestimates the ability for
| single engineers to upend the goals of an entire
| organization. In the real world, if a company wants to
| automate anything and they're paying well, there will be a
| line of engineers out the door applying to get it done.
|
| So my experience may not be the norm because I've only worked
| at smaller orgs, but my experience with all my product
| managers has been collaborative. Personally, I wouldn't be
| able to work at a place where I can't affect product
| decisions. But then again, maybe that mythical $300k total
| comp is really about "shut up and just follow the specs."
| kwhitefoot wrote:
| I think Amazon support forms are difficult to reach by design.
| It helps keep the traffic down. They are simply not at all
| interested in people who are not customers, and not especially
| interested in those who are. There was a time when I got a lot
| of email from Amazon about things that someone else had
| purchased. How my email address became associated with their
| account I have no idea. I tried to notify Amazon but there just
| wasn't any proper way to do it and after sending an email or
| two I just gave up. It was more than six months later that I
| stopped getting such emails.
| 6gvONxR4sf7o wrote:
| > So the next time you help someone automate their customer
| support...
|
| In my experience people working on these things are well
| intentioned. So it comes down to a need to shift priorities.
| Maybe you need to launch something now, and the PM tries to
| make you delay some important thing for the next release,
| launching without it at first. That's the kind of thing we need
| to change. No more promises of "We are working to make this
| better" while launching without crucial pieces first. We're
| used to making that kind of choice when it's not people's
| livelihoods and working conditions at stake, and we have to
| make those choices differently when the stakes are people.
| spaetzleesser wrote:
| "I'd guess we here on HN are the people best equipped to
| prevent the worst. So the next time you help someone automate
| their customer support, ask yourself: How would I feel if my
| well-being depended on this? Because for some poor soul, it
| might. Is there a clear fall-back in case the AI fails
| horribly? Because, you know, they always do."
|
| I don't think we are that well equipped to prevent the worst.
| Maybe some entrepreneurs are but most engineers develop tools
| that can be used for good and bad. I think automation provides
| enormous benefits to society and we should keep pursuing it.
| Making sure it doesn't end up as an abusive system is a
| political problem and should be addressed by rules and laws.
|
| In the last few decades the mantra was that the more money the
| psychopaths can make the better for society and screw the
| victims. That has to change. Society overall suffers if we have
| a few billionaires lording over the working masses.
|
| I am pretty hopeful that we can create systems that balance
| technological progress with concerns of social welfare. We have
| made big progress with environmental regulation and I hope we
| will do the same in other areas.
| durnygbur wrote:
| > So I got the same wrong parcel again. After waiting a while,
| I opened up the two parcels, each roughly 10x5x5 inches
| (25x10x10 cm) large. It was two single pencil erasers.
|
| The monstrosities that emerge at this scale always amaze me.
| For individuals, sending an item means worrying about
| packaging, shipping costs and time. Amazon and others? They
| just furiously and repeatedly ship oversized boxes filled
| mostly with filler, while eco-shaming the customer.
| jevoten wrote:
| > I'd guess we here on HN are the people best equipped to
| prevent the worst.
|
| Best _positioned_ , but how well equipped are you? Are you
| organized enough to collectively refuse such requests, or will
| you fall into the "If I don't do it someone else will" trap?
| inter_netuser wrote:
| Why is that a trap?
|
| The "trap" seems instead to be the stable equilibrium state.
| pessimizer wrote:
| It's only a trap if you consider doing things for money
| that you, yourself, think are bad things, as a bad thing.
| spaetzleesser wrote:
| You often don't even know how your tools will be used
| eventually. Most technology is pretty neutral. How it's being
| applied defines whether it's positive or negative.
| goodpoint wrote:
| > Best positioned, but how well equipped are you?
|
| More importantly: how willing? The HN crowd is known for
| worshipping tech companies.
| konschubert wrote:
| Any automated process always needs human fallback.
| the_snooze wrote:
| Or else what? It's not like companies lose revenue or
| reputation for relying on cheap frustrating automation. If
| anything, it's in their best interests to keep doing so
| (i.e., to dissuade people from unsubscribing or complaining).
| konschubert wrote:
| Or else we the people make it law.
| Maf1 wrote:
| Or will we live in a world where the computer is always
| right?
| ksdale wrote:
| It's unreal the number of times a support person has said
| they "can't" do something because they don't have an option
| to do it in their user interface. And like, I understand
| that they don't have the authority to do it and it's not
| their fault, but someone at HQ basically decided that
| something was too expensive or inconvenient for the company
| and made it quite literally "not an option" and it just
| becomes a law of the universe that the customer can't get
| that remedy.
| syshum wrote:
| Google, Amazon, and the rest clearly disagree
| konschubert wrote:
| Yes, and it's a Kafkaesque nightmare.
|
| I try to avoid Google for this reason.
| bigodbiel wrote:
| The irony is that it's middle management that will be the first
| to be laid off. What globalization did to blue-collar work,
| management automation will do to middle management.
|
| Making this a "workers' issue" won't lead to anything,
| especially when it's gig economy workers. Obviously these AIs
| will make their way into non-gig sectors of the economy, and
| from there spread like wildfire.
|
| This will only exacerbate socio-economic inequalities to a
| point of no return (revolution of the precariat aside).
|
| Creepy how it's all too similar to the short story Manna [1]
| (substituting the gig economy for fast food).
|
| [1] https://marshallbrain.com/manna1
| C19is20 wrote:
| I'm rather amazed people purchase single erasers from Amazon.
| raxxorrax wrote:
| I firmly state my disapproval of engineers working on such
| systems. It doesn't really work; you always find someone who
| prostitutes himself.
|
| Production surveillance can be essential for quality control,
| but you don't need individual surveillance for that, and line
| managers are better at evaluating people.
|
| It is mostly useless managers who want to show off a nice
| Excel sheet.
| i_like_waiting wrote:
| To be honest, I just realized I am culpable in this area. So far
| it's small things for a small company: e.g. automated alerts when
| a checklist is not filled in for a selected day, or when
| information in one system doesn't match the info in a second
| system.
|
| By themselves those things look innocent, and they just automate
| my headache of writing emails to get the info corrected.
|
| What will it be 2-3 years from now? Most likely an automated
| penalization system for when you don't do those things correctly.
|
| Do I have enough morality to skip those alerts / "small
| productivity hacks" for myself and do it manually for the greater
| good instead? Tbh I need to think about it.
| simion314 wrote:
| I would say the first thing you should do is make sure the
| emails/notifications you send contain all the information the
| affected person needs to know what they did wrong, and also
| contain a way to dispute the decision.
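|
| As a rough sketch of that advice (the function, field names and
| the disputes address below are just placeholders), a notification
| could always spell out what was flagged and how to appeal:
|
|     function buildNotification(person: string, rule: string,
|                                evidence: string): string {
|       // Name the rule, show what triggered it, and always give
|       // a route to a human who can overturn the decision.
|       return [
|         `Hello ${person},`,
|         `Our automated check flagged the following rule: ${rule}.`,
|         `What triggered it: ${evidence}`,
|         `If you think this is wrong, reply to this email or write`,
|         `to disputes@example.com and a person will review it.`,
|       ].join("\n");
|     }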
|
| We need to push back against broken systems that abuse us. In my
| case I got an email from Sony saying I had violated a super
| generic rule and my account was suspended for 2 months, with no
| way to dispute it and, sure enough, no refund or compensation for
| my currently paid and active yearly subscription. (This can
| happen to any of us; it is not only Google/Apple/PayPal/Amazon
| that can screw you with their automation and shit customer
| support. The list is much larger and probably growing.)
| spaetzleesser wrote:
| "I would say that first thing you should do is to make sure
| the emails/notifications you send contain all information
| needed that the affected person knows what he did wrong and
| also contain a way to dispute."
|
| This should become law. Otherwise we end up in Kafka's "The
| Trial" world.
| commandlinefan wrote:
| > This should become law
|
| I'd like to see that too, but I think it's... unlikely. In
| the case of OP, he lost access to his Sony account (I
| assume that's an online gaming platform) and there were
| monetary damages included, but imagine a generic "you can't
| be suspended from an online account without a way to
| dispute with an actual human person" law applied to, say,
| Facebook or Twitter or Reddit?
| simion314 wrote:
| So my account was banned for 60 days; I could use the
| console offline but no multiplayer/chat features. I am
| paying a yearly subscription for PlayStation Plus that
| includes some online features that were made useless by
| the ban, so I was paying for nothing (in my case I was a
| paying customer). My son told me that he got threats
| from some guy saying that if he did not gift him Fortnite
| stuff he would file many fake abuse reports with Sony. I
| could see this system getting abused, and on top of that
| the chats were not in English, which could have been the
| cause (I can't be 100% sure, and Sony did not tell me
| what exactly triggered this, so for all I know it could
| have been a screenshot from a video game that triggered
| Sony's bots).
|
| The law should force the company to refund completely or
| partially when such a ban happens, or in this case they
| could have extended the subscription by 60 days. The law
| should also force them to tell me what phrase or image or
| report triggered the system. It might make Sony's job
| harder, but the law should make our lives easier, not a
| giant corporation's.
|
| In this case Sony lost a loyal customer. I will do my
| best to avoid giving them any more money in the future,
| consoles or other products. So if anyone wants to
| implement a similar system that screws with customers,
| keep in mind that the ones you screw over will not stay
| silent; they will spread the word, and you will lose more
| than one potential customer.
| mikro2nd wrote:
| Thing is, it's not just automated systems that are broken. As
| worrying to me, or even more so, are those huge, lumbering,
| old-school systems called corporations, where the humans who
| work in them are merely functionaries, cog-wheels that make
| the machine work, without discretion or leeway to exercise
| judgement, and they're getting worse because those little
| cog-wheels are increasingly tasked, measured, hired, promoted
| and fired by algorithms, too. You may be dealing with a
| human, but they have absolutely no way to exercise their
| humanity, even when their job-title is something like "Bank
| Manager".
| [deleted]
| Animats wrote:
| Machines should think. People should work. That's the new
| reality.
|
| How far away is an AI system that supervises web developers?
| ghgr wrote:
| It reminds me of the short story "Manna - Two Views of Humanity's
| Future" [1], which describes the steady establishment of AI
| process optimizers as the managers of human employees, who
| increasingly became a replaceable commodity. Fascinating read.
|
| [1] https://marshallbrain.com/manna1
| JoachimS wrote:
| Good pointer, and a good short story.
| mikro2nd wrote:
| When I first read Dune, lo these many years ago, I always
| thought of the Butlerian Jihad as being somewhere in our far
| future. Now, I'm not so sure...
| pessimizer wrote:
| When I first read it, I realized I was probably a Butlerian
| jihadist.
|
| "Computers" are unfortunately named after the least important
| thing that they do (accounting). What's important about them
| is that they allow us to generalize and add infinite
| complexity to autonomous control systems, like valves that
| open when a certain pressure is reached, or switches that
| toggle at a certain light level. The ultimate computer
| network would keep the king that implemented it in power for
| eternity, even after his death, by keeping his will alive. If
| you automated his corpse, you wouldn't even need to know that
| the king _could_ die.
|
| edit: "Ordenador" (from Spain) is probably better, i.e the
| thing/person that gives orders, or puts things in order.
| Maybe "mandonador"...
| AussieWog93 wrote:
| Funny you mention that. I've felt the same thing when going on
| Uber Eats runs with my mate who drives for them.
|
| At the moment, the technology is in the "chapter 1" stage where
| things are pretty smooth and chill. It's genuinely low-stress,
| well-paced work.
|
| Once self-driving tech hits the market, though, it's clear that
| the human employees will be replaced without a care or thought
| for their well-being.
| hellbannedguy wrote:
| Not if they unionize, and it's hard to offshore delivery
| drivers. I guess autonomous driving cars will be an option?
| retube wrote:
| So? Ever since the industrial revolution new technology has
| obsoleted jobs. 98% of us used to be farmers. Excel put
| gazillions of bookkeepers out of work. But we're still at
| basically full employment: there are thousands of jobs now that
| no one could have conceived of 50 years ago, or even 20 years
| ago. Self-driving cars (if indeed they actually materialise)
| will be hugely transformative for society, and create whole
| new businesses and industries, i.e. new jobs to replace the
| old. Progress fundamentally relies on this "creative
| destruction", without it we'd still be in caves scavenging
| for nuts and berries.
|
| Edit: it's kind of bonkers that pointing out the basic
| mechanics of technological progress is getting downvoted on a
| site like HN, where I imagine most of us are gainfully
| employed trying to do just this.
| Epa095 wrote:
| I am also mildly positive, but also quite worried, for the
| following reasons.
|
| 1: It's easy to make fun of the Luddites who fought the
| mills and fought against progress, but for many of them it
| was a real fight for their lives, and many were thrown into
| deep poverty. Society as a whole progressed, yes, but many
| were significantly worse off for the rest of their lives.
| Maybe their kids, or eventually their kids' kids, got a
| better life for it. We must not be so naive as to think the
| same will not happen (/ is not happening now).
|
| 2: Not everyone is cut out to be a
| developer/nurse/teacher/whatever-the-future-job-is. Many
| are, but not all. How do we take care of the boys (and
| some girls) who before would go to sea, or work on a
| farm, or do whatever simple job they could get? Maybe in
| the end it will all "settle down" in a good way, but we
| live now, not then.
|
| 3: It affects the "power struggle" between labour and
| capital. It's easier to move machines than people. Robots
| don't strike. Yes, you need a few expensive engineers, but
| in a world where soon "everyone" has an MSc they are not
| *that* hard to replace if they get too demanding.
|
| 4: And related to the last, as a society we collectively
| produce a bunch of stuff, and we have decided that selling
| our labour is the way to distribute this wealth. Automation
| produces more stuff, but reduces the value of labour. Will
| we find a way to handle this?
| pc86 wrote:
| 1. I struggle not to just respond "So?" to this one as
| well. Society progressing is the goal, isn't it? There
| will always be people in poverty. Poverty caused by AI
| isn't worse than poverty caused by industrial or
| agricultural automation, and it isn't worse than poverty
| caused by the fact that you're born into a family of serfs
| instead of lords.
|
| 2. It will all settle down eventually, it always does.
| The poorest today have access to more food, education,
| and (usually) healthcare than even the rich just a few
| centuries ago. Being knocked to steerage of a quickly
| rising ship isn't a death sentence.
|
| 3 (and 4). Isn't this just more of the same compared to
| the industrial revolution? The cotton gin certainly
| shifted the power struggle more toward capital but I
| don't think anyone today would argue we should be
| processing all our textiles by hand.
| credittw2021 wrote:
| On the other hand, the last few decades have seen automation
| replacing well-paying jobs with McJobs.
| retube wrote:
| Maybe in some localised areas. Globally, though, absolutely
| not; in recent years huge numbers of people have been
| pulled into the middle classes, probably 1bn+ in India
| and China, and Africa is now the fastest-growing region,
| with the World Bank expecting most countries to reach
| "middle income" status by 2025.
| JoachimS wrote:
| And replacing entry-level academic jobs with machines,
| adding a barrier to entry for freshly minted lawyers.
| [deleted]
| derekjdanserl wrote:
| >So?
|
| So much for humanity benefitting from technology. Instead
| of our workdays becoming fewer and shorter, a tiny number
| of people who don't work at all will become richer while
| our lives become more alienated from our labor. This is the
| opposite of what we tell each other technology is supposed
| to accomplish. If it's not a problem, why do we keep lying
| about it?
| FDSGSG wrote:
| You are describing some speculative future that doesn't
| really correlate with historical progress. Same could've
| been said about the industrial revolution, but somehow it
| did end up dramatically benefitting all of humanity.
| nopenopenopeno wrote:
| Again, that is the story we tell, but it's a very
| reckless claim. Generations in early industrialization
| who still remembered life as a peasant almost universally
| preferred life as a peasant. Industrialization's need for
| labor forced other peasants from their land to build
| competing modern nation states. Very few people asked for
| this and only a small wealthy few would tell you that
| dragging these people into brutal toiling factory labor
| improved everybody's lives, but those are the people who
| funded the history books.
| ilikerashers wrote:
| The Industrial Revolution also played a heavy hand in WW1
| and Industrial Warfare.
|
| I don't think anyone is saying progress is wrong.
| Unfettered capitalism which robs people of their rights
| and alienates people from society is disastrous.
|
| Putting in an algorithm that treats people like machines is
| not innovation. Uber Eats is not our generation's "spinning
| jenny". They should be forced to provide basic working
| benefits.
| retube wrote:
| > Instead of our workdays becoming fewer and shorter,
|
| If that happened any increase in productivity would be
| offset by the reduced time worked, so no one would get
| richer (very simplistically).
|
| At the end of the day, humans are competitive, so of
| course hours worked will never really fall, but it's
| that competition which drives innovation.
|
| And we're all getting richer. Sure, there are more
| billionaires than ever, but if we're all getting richer,
| so what? (And yes, we are getting richer: the economic
| progress in Asia and Africa over the last two decades has
| been incredible. And the West continues to develop also.
| The world is richer than it has ever been.)
| throwaway2331 wrote:
| Normally, it would be poor form to pick apart someone's
| well thought-out reasoning, but your reasoning is not
| well thought out: it's middle-class talking points,
| effectively propaganda, lacking any overarching
| cohesion.
|
| > If that happened any increase in productivity would be
| offset by the reduced time worked, so no one would get
| richer (very simplistically).
|
| This is a useless point for the common man.
| "Productivity," in statistical economics jargon, means
| "how much profit is squeezed out of an employee." It's a
| substitute for the much more charged "surplus value of
| labor." When productivity goes up and wages stagnate, as
| we've seen happen, it means the common employee is not
| getting richer, but instead being fleeced -- usually due
| to a moral depression that does not leave him able to
| negotiate (struggle/fight/etc.) for higher wealth (both
| material and non-material).
|
| This false equilibrium falls apart when you can simply
| cut hours worked, and lower wages just enough to have
| a better productivity-to-hours ratio. Cannily, this is
| what has happened to most large low-skilled labor
| markets, where MBA "consultants" come in to grift and
| sell management/the executive team on how to make more
| money -- at the cost of everything else. See: McDonalds
| and Walmart as a prime example.
|
| This also has the added bonus that you can now decide not
| to offer full-time benefits (human rights), because your
| workers are no longer "full time."
|
| I.e. economic propaganda does not reflect real life.
|
| > Humans are competitive
|
| Humans with way less are also hungry. Humans with way
| more are desensitized degenerates that need hyper-
| stimulating avenues of pleasure, like extreme avarice, to
| sate their desires.
|
| Most people do not work because they want to, but because
| they need to. The only "competitive" people you'll see in
| the workforce are the greedy, who've passed the point
| where money really means anything but have still
| decided to dedicate their lives to watching that number
| go up. They are degenerates. Using them to paint all of
| humanity with a broad stroke is ill thought-out.
|
| > And we're all getting richer. Sure there's more
| billionaires then ever, but if we're all getting richer
| so what? (and yes we are getting richer: the economic
| progress in Asia and Africa over the last two decades has
| been incredible. And the west continues to develop also.
| The world is richer than it has ever been)
|
| Materially, yes. Spiritually, morally, intellectually,
| emotionally, socially: no.
|
| Colonization of foreign lands, effectively enslaving
| these people to be a part of the dominating
| civilization's "economic machine", is not a good thing.
| These people had a way of life before Western
| Civilization (TM) came in and obliterated their culture.
| They were presumably content, otherwise they would've
| sought out "innovation," "competition," and
| "productivity" on their own -- without the help of
| colonizers.
|
| If the pieces of what I've quoted are authentic, and
| genuinely from the heart, then it is an example of moral
| bankruptcy (most likely due to environmental causes --
| rather than individual).
|
| There are other things in the world besides money. The
| only civilization that keeps on beating the war-drum of
| money money money is the Western Civilization. Everyone
| else has other things that are important to them
| (notably: family, social connections, spirituality,
| living a good life, being a decent human being, and so
| on). But in fairness, these are all slowly disappearing,
| as the U.S. keeps on pumping inflation out to the world,
| and forcing everyone else to "join" or "suffer"
| economically (materially).
| JoachimS wrote:
| The fallacy here is to assume that, because this is what
| has happened before, it will always happen in the future.
|
| Yes, farming got efficient with tractors, allowing farmers
| to move to the cities to fill jobs in the factories that
| were starting up at the dawn of mass production and the
| industrial revolution. But that does not mean that all
| future resources (i.e. humans) freed up by automation and
| tech development will have new jobs to move to, that jobs
| suitable for humans will be created, or that humans will be
| needed as part of the production of goods, services,
| content, etc.
|
| Also, I would love to see a reference for why progress
| relies on creative destruction. Please provide pointers.
| FredPret wrote:
| People are scared, which is understandable. These great
| technological leaps aren't called revolutions for nothing.
| monkeybutton wrote:
| I've been seeing more and more references to this story over
| the last few years, which is a sign... of something. It would
| be really interesting to see the HTTP referrer logs his
| website gets for it. Probably a great collection of threads
| about everywhere this kind of automation is happening.
| ammonammonammon wrote:
| One step closer to Elysium ...
| ape4 wrote:
| A small example: I had an Uber driver who was afraid to deviate
| from the app-selected route even though he could see an
| obstruction.
| Bayart wrote:
| That page had me download over 11000 elements and 80MB.
| agustif wrote:
| https://marshallbrain.com/manna1
|
| https://marshallbrain.com/manna2
|
| https://marshallbrain.com/manna3
| stevenjgarner wrote:
| There is an avalanche of gig economy employee management and
| monitoring software (Teramind, ActivTrak, Ekran System,
| BrowseReporter, workpuls, Cerebral, monday.com, DeskTime, Time
| Doctor, etc.). Many of these systems contain algorithms that
| surveil and manage workers. These tools eliminate much of the
| risk of outsourcing work to remote workers who cannot be
| directly supervised. They enable remote workers by making
| them eligible for employment and by engineering a higher
| degree of trust into the relationship.
| afandian wrote:
| I would dearly like to see legislation that means that non-
| human supervision constitutes an _elevated_ risk to the
| employer.
|
| e.g. UK Data Protection Act 2018: "A controller may not take
| a significant decision based solely on automated processing
| unless that decision is required or authorised by law."
|
| https://www.legislation.gov.uk/ukpga/2018/12/part/3/chapter/...
| stevenjgarner wrote:
| ... and to the employee?
| Applejinx wrote:
| For the time being that's a given.
| willcipriano wrote:
| I don't think it even needs to go that far. If you choose to
| delegate your decision-making to a machine, you should be just
| as responsible for whatever results from that; it is your role
| to ensure the machine is making the right choices.
| afandian wrote:
| I think it's a harmful act of technological hubris to
| suggest that it's possible. Employment is a human
| relationship. I don't see much evidence that humans are
| able to construct a system that can respond to all of the
| human interaction that goes into a human relationship. Look
| at the state of the art of HR software, let alone spyware.
|
| Let the robot managers manage robots.
| willcipriano wrote:
| If someone invented such a system, I wouldn't want to
| stop them from using it; I'd just hold them accountable. I
| personally wouldn't trust a computer to fire someone if I
| were on the hook for its choices.
| afandian wrote:
| I don't trust any of the feedback loops, or the good
| faith of any of the kinds of organizations that would use
| them. Show me a time when a company was truly accountable
| for a "computer mistake" and didn't just shrug it off
| when the damage was done. The fable of the scorpion and
| the frog springs to mind.
|
| https://en.wikipedia.org/wiki/The_Scorpion_and_the_Frog
| willcipriano wrote:
| I don't think it's actually a defence now, except maybe
| in the court of public opinion. For example, I'm certain
| courts would find that it's still sexual harassment if
| you build a machine to go around looking up women's
| dresses instead of doing it yourself. The problem isn't a
| lack of accountability for the machine so much as it is a
| lack of accountability generally.
| streamofdigits wrote:
| How can it be that the most predatory elements of society have
| been given so much leeway?
|
| This is not an intrinsic pattern of our species / societies or we
| would have devoured ourselves, literally or figuratively, a long
| time ago.
|
| Obviously restoring forces that would push this evil genie back
| in its bottle exist but are somehow dormant, neutralized or
| otherwise missing-in-action.
|
| This fight and push-back should not be a thing for "activists".
| It affects every single person. We are all "gig workers".
| pessimizer wrote:
| Because lacking ethical prohibitions always gives one an
| advantage. People who lack ethics can always choose to act
| ethically when it confers a social advantage, or to act
| otherwise when that's more advantageous. Ethical people are by
| definition often forced to act at a disadvantage.
|
| The only advantage that individual ethical behavior gives is
| through supporting the health of the group, thereby supporting
| members who depend on the group. People who aren't loyal to a
| group can shop around, play one against the other, etc. Even
| better, if you automate the interactions between people,
| there's no need or benefit for ethical behavior. Profit comes
| from pleasing the algorithm rather than any sort of loyalty.
|
| Edit: The algorithm that capitalism uses is maximizing value
| against cost. All ethical complaints about it are asking people
| to accept less value/more cost. Capitalism insists that if
| everyone maximizes value against cost personally, that sends a
| control signal to the collective that will maximize the greater
| good. This is the ethic that we're automating.
| caddybox wrote:
| This is my personal opinion.
|
| We have been devouring each other for ages now. Entire cultures
| have vanished, destroyed by more powerful members of the same
| species. The entire history of our species has been one of violent
| conflict where a stronger group removes a weaker group, only to
| be removed later by another group that emerges stronger.
| Previously it was racial, clan-based violence. Now it's
| economic servitude where a perpetually unfortunate lower class
| keeps on greasing the wheels of industry with sweat and blood.
| The rich stay rich, the poor stay poor.
|
| We are not benevolent or kind creatures. We are programmed to
| survive. As long as the poor live, they will rarely question
| the rationality behind the inequality that keeps them poor.
| kmlx wrote:
| > The rich stay rich, the poor stay poor.
|
| this has been false since at least the industrial revolution.
|
| https://ourworldindata.org/grapher/life-expectancy?tab=chart
|
| https://ourworldindata.org/grapher/world-gdp-over-the-
| last-t...
|
| https://ourworldindata.org/grapher/world-population-in-
| extre...
| Applejinx wrote:
| No, you overlook the mutual aid factor. We are non-benevolent
| AND benevolent, unkind AND kind creatures. Hence the
| inability ever to settle on one explanation. Either direction
| you go, you're in error unless you account for both.
|
| It becomes an existential crisis when humanity designs bots
| and AI and then behaves like humanity must COMPETE with the
| AI in order to survive, when we absolutely can't. We can't
| lift better than a forklift, run faster than a Veyron, we
| can't think better than Big Data. For the time being we're
| still able to dream better. For now.
|
| Android dreams won't speak to us.
|
| But there's no reason we can't pass on our capacity for
| mutual aid to the AI, as well as the competitiveness. And
| there's no reason the AI can't have heart and soul. When it
| fails it's because WE try to construct it in a false image of
| what we think we are.
| streamofdigits wrote:
| This is probably too pessimistic. If it were true we would not
| be writing this right now; we'd still be in some semi-wild
| state, ripping each other's hearts out as battle trophies to
| dedicate to some imaginary god. If predatory behavior really
| had the upper hand, it would not allow the kind of social
| structures that solved difficult technology problems and
| enabled mass health, mass education, etc.
|
| But destructive, rapacious behavior feels just one step away
| even now that we are supposedly "civilized" and we have
| excruciating records of our bloody history.
|
| If we don't find a way to bottle this instinct up we will not
| see many more decades.
|
| Time is running out because the system is not time-invariant:
| we had explosive population and technology growth. The same
| regressive traits that might have "only" annihilated isolated
| cultures in the past will bring our collective demise.
| BingoAhoy wrote:
| Is it too pessimistic? Human history is filled with empires
| dominating their weaker neighbors, culling many of them
| (though not all) for the in-group's self-interest (it's
| also useful to keep subordinates around to do the menial
| labor). Even the less ruthless of these "imperialistic"
| entities, such as the United States, have an outstanding
| record. The Native Americans can bear witness.
|
| Not to mention what we do to animals. I read that on Super
| Bowl Sunday alone 500 million chickens are slaughtered for
| the delight of eating chicken wings in the living room. If
| animals are nonconscious entities that don't suffer, akin to
| robots, then no harm, no foul, but that is quintessential
| mass-scale devouring of another.
| mschuster91 wrote:
| > How can it be that the most predatory elements of society
| have been given so much leeway?
|
| Partially because we as collective societies let them, by not
| voting out, or by actively voting in, those who campaigned on
| neoliberal platforms many decades ago.
|
| Partially because not many (and those who did were mostly on
| the hard-left side, which was and still is vilified) saw the
| "boiling frog" effect at play... stuff like the end of the
| Fairness Doctrine that gave way to today's mass media and its
| problems with propaganda, the Citizens United decision that
| paved the way for utterly absurd amounts of money in political
| campaigning, gerrymandering and other forms of voter
| manipulation, the building/consolidation of media empires like
| Sinclair [1], or the total lack of any regulation for social
| media and the blindness of everyone regarding Russian and
| Chinese propaganda on these platforms.
|
| Now, the water is boiling: support for democracy - not just in
| the US but worldwide - has been eroding, as have democratic
| freedoms and lessons-learned from WW2 (such as, especially in
| Europe, the rights of those fleeing from war, destruction and
| hunger, or the need for international cooperation). Countries
| in the European Union (Poland, Hungary) begin to qualify as
| quasi-dictatorships, the UK dropped out in an epic clusterfuck,
| the US suffered from a (thankfully incompetent) putsch attempt,
| and as the COVID crisis has shown people don't even trust
| _vaccinations_ any more. Meanwhile, the rich have grown ever
| richer (sometimes to utterly absurd degrees).
|
| How do we get back to civilization? I have no idea if it is /
| will be possible without some kind of global "reset" event...
| we Germans caused the last one, this time the cause will likely
| be either somewhere in Asia (invasion of Taiwan) or Russia
| (which is moving enough troops to the Ukrainian border to run
| an actual _blitzkrieg_ style invasion [2]).
|
| [1] https://nymag.com/intelligencer/2018/04/anchors-reciting-
| sin...
|
| [2] https://www.euronews.com/2021/11/24/russia-s-military-
| build-...
| Havoc wrote:
| I happened to be reading a photo essay about delivery drivers [0]
| today and was struck by how intensely vulnerable the workers are.
| E.g. one driver says he knows 5 people who died and none who have
| filed insurance claims. I.e. they're not even bothering with the
| shiny "support" processes the companies put in place. Sure, that
| is in a dangerous country, but it still says a lot about the
| people who are (by necessity) attracted to these jobs.
|
| It's not just the algo angle... the entire system isn't fit for
| purpose.
|
| [0] https://www.theguardian.com/world/2021/dec/13/ghost-
| riders-t...
___________________________________________________________________
(page generated 2021-12-13 23:01 UTC)