[HN Gopher] The exploited labor behind artificial intelligence (...
___________________________________________________________________
The exploited labor behind artificial intelligence (2022)
Author : zhte415
Score : 145 points
Date : 2023-01-20 16:35 UTC (6 hours ago)
(HTM) web link (www.noemamag.com)
(TXT) w3m dump (www.noemamag.com)
| prvc wrote:
| >The public's understanding of artificial intelligence (AI) is
| largely shaped by pop culture
|
| First clause of first sentence falsely posits that the "public"
| has a single mind, and claims the authors know the contents of
| this mind and what causes it to acquire these contents. It also
| adopts a false pose of authority and understanding superior
| to that of the (nonexistent) mind of the public. It was nice of
| the authors to disqualify themselves from offering worthy ideas
| on the topic of AI so early in the article.
| bostonwalker wrote:
| Do you really believe that we cannot make generalizations about
| the layperson's understanding of AI? Unless you have a related
| degree in math, statistics, engineering, or comp. sci, i.e. a
| tiny minority of people, it's probably true that you get your
| understanding from pop culture.
| prvc wrote:
| >Do you really believe that we cannot make generalizations
| about the layperson's understanding of AI?
|
| I never said anything like that.
|
| Another point: They also continually conflate "AI", deep
| learning, and mechanical-turk style crowdsourcing.
| Xelynega wrote:
| Why are you discrediting the authors points based off of a
| fluff sentence in the introductory paragraph? Typically the
| arguments themselves are evaluated based on their own merit.
|
| The "popular" in pop culture and the "public" in publics
| understanding are referring to the same thing, so I don't
| understand your confusion. The only thing that's really being
| implied by the sentence is that scientific understanding
| doesn't dominate the public understanding of artificial
| intelligence as much as pop culture does.
| ucha wrote:
| As a third-worlder I welcome these jobs. They lower unemployment,
| raise average salaries and bargaining power, and increase the
| country's foreign currency reserves... Our minimum wage is similar
| to Kenya's, and $1.46/hour after tax is well above the roughly
| 80-cent minimum.
|
| It could make sense from an environmental perspective to produce
| and consume locally but for a job like this, the only reason to
| use US nationals would be to create local jobs.
| [deleted]
| labrador wrote:
| I'd like to see a The Onion story about a third world person
| worried that the computer games they play are written by young
| exploited programmers in first world countries
|
| Edit: I asked ChatGPT to write it for me:
|
| Write me a story in the style of The Onion about a third world
| kid who worries that he is exploiting the young programmers in
| first world countries who make the games he plays
|
| Headline:
|
| Third World Child Struggles with Guilt over Exploiting First
| World Game Developers
|
| Story:
|
| In a small village in a developing country, a young boy named
| Ahmed sat huddled over his tablet, playing the latest mobile
| game. As he tapped away at the virtual farm, building and
| expanding his virtual empire, a feeling of guilt began to gnaw
| at him.
|
| Ahmed couldn't shake the feeling that the game he was enjoying
| so much was built on the backs of exploited programmers in
| first world countries. He knew that the developers of the game
| were likely working long hours for low pay, all so that he, a
| third world child, could have a few moments of entertainment.
|
| As the guilt grew stronger, Ahmed tried to put the game down,
| but found himself unable to resist its allure. He knew that he
| was complicit in the exploitation of his fellow human beings,
| but he couldn't bring himself to stop playing.
|
| "I know it's wrong, but I just can't help it," Ahmed said,
| tears streaming down his face. "I wish there was a way to enjoy
| the game without feeling guilty."
|
| As the sun set over Ahmed's village, the young boy made a
| silent vow to do whatever it takes to make sure that the game
| developers are treated fairly and compensated for their hard
| work. He knew it wouldn't be easy, but he was determined to
| make a difference, even if it meant sacrificing his own
| enjoyment of the game.
|
| Ahmed's story is a reminder that in today's globalized world,
| the actions of one person can have a ripple effect that extends
| far beyond their immediate surroundings. While it may be easy
| to enjoy the latest mobile game or app, it's important to
| remember that behind every pixel is a team of hardworking
| individuals who deserve fair treatment and compensation for
| their labor.
| yamazakiwi wrote:
| This is also reminiscent of that German ad about raising
| money for Americans in poverty.
| [deleted]
| frontman1988 wrote:
| Hell yeah even I would gladly take such a job. 120 rupees an
| hour is a good deal in such tough times where it's almost
| impossible to find a work from home job that isn't an outright
| scam. Last week I sat for 4 hours solving a crazy competitive
| programming question just to get an interview call. It's rough
| out there. If anybody knows where to apply for these data
| labelling jobs please tell me.
| TACIXAT wrote:
| I'm building something right now that is a reverse bidding
| platform for image acquisition and labeling (company makes a
| solicitation, workers bid) where you get paid per accepted
| image and label. It will open to US labor first and might
| take up to a year to expand to India (assuming you're there).
|
| If you want to be an early India user (paid) put an email in
| your bio and I'll get in touch. If you can program though
| data labeling might not be the best use of your time.
| dekhn wrote:
| the "guilt-industrial complex"
| 0xbadc0de5 wrote:
| They'd rather see poor people earn no money than see them earn
| anything below what their own preconceptions have labelled a
| fair wage. It seems that for a lot of people, it's not about
| building things, it's about tearing things down. The psychology
| appears akin to the childhood bully who, unable to build their
| own, goes around kicking over other children's sandcastles,
| then feels smugly content with their "accomplishment"... not
| recognising all they've done is ruin others' work without
| contributing anything of value to the world.
| phailhaus wrote:
| OK, so then we should repeal labor laws and allow employers
| to abuse their employees? The problem is that your tired
| analogy has no nuance: all criticism is considered
| "bullying". We should simply allow everyone to do anything
| they want? Clearly that's absurd, so the sandcastle analogy
| doesn't work.
|
| Why not just argue the criticism on its merits, rather than
| complaining that criticism shouldn't be allowed at all?
| cscurmudgeon wrote:
| So all off shoring is labor abuse? TIL
| ripe wrote:
| I don't know why you're being downvoted. Your point is
| valid. Many arguments can be made against the article's
| critique, but the analogy used by the comment above yours
| is rejecting the very idea that these companies can be
| criticized at all.
| Der_Einzige wrote:
| They call it "critical theory" instead of "constructive
| theory" for a reason.
|
| It takes no skill to critique.
| Xelynega wrote:
| I think your "building" and "tearing down" analogy is
| actually spot on.
|
| We have a whole group of people who are so concerned with
| what's already built and building more and more onto it, that
| any attempt to point out where they went wrong in the
| foundation is met with criticisms like "all you people want
| to do is tear things down" and "well if your foundation is so
| good then why haven't you built as much as us".
|
| Look at the whole phenomenon of critical theories, especially
| critical race theory. People were so enraged by the prospect
| of having the foundations questioned that the actual
| criticisms were less important.
| [deleted]
| somedude895 wrote:
| This article is all over the place. I see the issue with the work
| being traumatizing, but apart from that, the points the article
| makes are just awful.
|
| > labor exploitation is not central to the discourse surrounding
| the ethical development and deployment of AI systems
|
| Yes, because workers' rights and AI Ethics are two separate
| issues. It seems the author is using the fact that AI ethics is a
| thing as a "gotcha" against AI companies, because they treat
| their workers unethically according to them.
|
| > In this article, we [...] argue that supporting transnational
| worker organizing efforts should be a priority in discussions
| pertaining to AI ethics.
|
| And even if you shoved workers' rights into AI ethics: how do
| you organize labor for a job where there's an unlimited pool of
| people worldwide who would take the job in an instant, most
| likely with minimal training, making pretty much the whole
| workforce instantly replaceable?
|
| > These corporations know that increased worker power would slow
| down their march toward proliferating "AI" systems requiring vast
| amounts of data, deployed without adequately studying and
| mitigating their harms.
|
| > If corporations are not allowed to exploit labor from Kenya to
| the U.S., for example, they will not be able to proliferate
| harmful technologies as quickly -- their market calculations
| would simply dissuade them from doing so.
|
| Here it's suddenly about actual AI ethics again with the idea
| that progress needs to be artificially slowed down in order to
| ensure ethical implementation. Increasing the "unethically low"
| data labeling wages has the added benefit of slowing down the
| "unethical" development of AI.
|
| > Talk of sentient machines only distracts us from holding them
| accountable for the exploitative labor practices that power the
| "AI" industry.
|
| It really just looks like a weird ramble where the author picked
| a popular topic and then grasped at straws to draw lines to their
| political agenda with some luddism sprinkled on top.
| AlbertCory wrote:
| > But around 15 years ago, before the proliferation of gig work,
| deep learning systems were considered merely an academic
| curiosity, confined to a few interested researchers.
|
| Um, no. Google was using machine learning more than 15 years ago,
| notably in the SmartASS system that predicts whether someone will
| click on an ad [1]. I got my cousin Missy a gig (at $15 an hour)
| rating ads and search results.
|
| This story is hardly a claim of humane treatment of gig workers,
| I should note. I intervened and got her treated fairly.
|
| > Companies make sure to hire people from poor and underserved
| communities, such as refugees, incarcerated people and others
| with few job options, often hiring them through third party firms
| as contractors rather than as full time employees.
|
| Citations, please. A rater who is many standard deviations away
| from the mean is extremely _undesirable_. You want their
| judgments to be representative of the population.
|
| If you want to design something that _appeals_ to Americans, you
| want American raters. Since the US is by far the largest consumer
| of Internet services, it stands to reason most raters of
| "quality" will need to be from the US.
|
| If you just want to know if an image contains objectionable
| content, then yes, you could use Third World employees.
|
| [1] https://albertcory50.substack.com/p/working-at-google-ads
| tristor wrote:
| These "journalists" see someone training image classifiers
| using Mechanical Turk, which awards a wage in excess of the
| local economy in the third world, and imagine to themselves
| this is how all models get trained because they don't
| understand ML, they don't understand market segmentation, and
| generally don't understand technology or business.
| YeGoblynQueenne wrote:
| >> they don't understand ML
|
| Please don't say that, it is upsetting that this false
| assumption is thrown around so carelessly. At least check the
| authors' profiles, say on Wikipedia, if you want to know how
| much they understand.
| Xelynega wrote:
| I don't see what there is to understand.
|
| Any 'ai' model right now requires classified data to train.
| It takes human work to classify data. Human work is cheaper
| in some places than others, so companies in expensive places
| exploit the situation of poorer places to lower their
| operating costs.
|
| The wage size compared to the local economy doesn't really
| matter, since the exploitation comes from the difference in
| costs between the local economy and the local economy of the
| company.
|
| In your reality, how is the data getting classified if not by
| people whose situation is being exploited?
| tristor wrote:
| > It takes human work to classify data. Human work is
| cheaper in some places than others, so companies in
| expensive places exploit the situation of poorer places to
| lower their operating costs.
|
| 1. Humans are not fungible in this way, as much as some
| people wish that they were. The ML models we built in my
| workplace were fed classified data that was built using
| some of the most expensive labor in the company, because of
| the highly specialized nature of the model. It would have
| been impossible to train using Mechanical Turk.
|
| 2. Taking money from wealthy countries and putting it into
| the economy of poor countries by paying an outsized local
| wage in the poor country that is cheap in the wealthy
| country, thus lifting up the poor country, is a very
| interesting way to classify "exploitation" (see: Asia and
| Eastern Europe).
|
| > In your reality, how is the data getting classified if
| not by people whose situation is being exploited?
|
| In your reality, apparently it's only acceptable for a
| company that would pay a First World worker $20/hr to do
| classification to also pay a Third World worker $20/hr to
| do classification, rather than $2/hr, which is an outsized
| compensation in a local market where people generally get
| $2/DAY, rather than per hour.
|
| In my reality, BILLIONS of people have been lifted out of
| abject poverty by foreign investment dollars that
| arbitraged labor costs, to massively improve the economic
| situation and quality of life of those foreign workers.
|
| Weirdly, my reality aligns with actual facts.
| cscurmudgeon wrote:
| How is that exploitation?
| Xelynega wrote:
| Why don't US companies pay foreign workers US wages? The
| answer I come up with is that they chose to hire workers
| in disadvantaged situations (relative to opportunity and
| local labour laws) to reduce operating costs.
|
| To me, using people in disadvantaged situations to reduce
| your operating costs is inherently exploitative.
| tristor wrote:
| > Why don't US companies pay foreign workers US wages?
|
| Why doesn't money grow on trees? Why does inflation
| exist? Why are some nations wealthy and other nations
| poor? Why are some nations at different points in their
| economic, social, civil, and technological development?
| Why is the sky blue?
|
| Are you trolling or do you seriously not understand how
| fancifully inane this question is?
| cscurmudgeon wrote:
| Nobody is being forced to work for those companies. Is
| there any actual evidence of exploitation? If so, are the
| governments complicit in not acting against them?
| [deleted]
| oh_sigh wrote:
| Let's say this job was not available for $1.46 an hour after tax.
| Are those potential employees better off or worse off?
| ClumsyPilot wrote:
| Let's say profits were making their way down the chain - would
| the economy be better off or worse off?
| somedude895 wrote:
| There's a provider in Kenya, who offers these services at a
| certain rate. OpenAI isn't going to pay them more and
| instruct them to pass the money on to the workers. They're
| businesses, not charities.
|
| How are so many comments in this thread so far detached from
| the real world?
| oh_sigh wrote:
| Probably better. See? I answered your question. Not sure why
| it is so hard to answer mine.
| guerrilla wrote:
| because yours is a manipulative false dichotomy as
| demonstrated by the second question.
| oh_sigh wrote:
| How much do you think ML trainers previously getting paid
| $1.46/hr would get paid if the company distributed 100%
| of their profits to their employees?
| hackinthebochs wrote:
| Stated more realistically, what will those people do in
| the mean time while we all wait for the economic
| structure of the world to reorient itself towards profit
| sharing? How long are they expected to wait?
| guerrilla wrote:
| The economic structure doesn't need to change for these
| companies to act as if it had, nor for these people to
| take these jobs if neither thing happens. All three
| things can be true at the same time. As I said, it was a
| false dichotomy. That's key to understanding the
| situation.
| hackinthebochs wrote:
| And how do you expect to effect such a change? Is it just
| a matter of generating sufficient outrage in your mind?
| Why should these people be forced to wait and suffer
| while you and your ilk write think pieces about silicon
| valley?
|
| This is the problem with ideas like yours, there is no
| practical plan on offer to create the reality you are
| advocating for. Yet, you expect these people who are
| suffering _right now_ to just sit tight. It's narcissism.
| Xelynega wrote:
| Let's say we restructure global labour relations such that
| people aren't getting paid a fraction of their worth because of
| where they were born or choose to live.
|
| You know, since we're dealing in hypotheticals.
| arcticbull wrote:
| They would likely be better off in the long term as time spent
| on this kind of menial labor represents a massive opportunity
| cost.
| oh_sigh wrote:
| What do you think the employees would be doing if they didn't
| take this job?
| hbrn wrote:
| > They would likely be better off in the long term
|
| It's quite a dangerous mindset to think that taking freedom
| away from other people is a good thing _because you know
| better_.
|
| In some cases you will be correct. In others, millions will
| die.
| [deleted]
| friend_and_foe wrote:
| So open AI decides to gimp their creation and prevent it from
| returning certain responses because _some people_ will screech
| about those responses, and they decide to hire people on other
| countries to do it, and now we see articles coming out about
| exploitative labor practices...
|
| Give these crybabies what they claim to want: fire all those
| Kenyans and stop gimping chatgpt under the guise of not wanting
| to use exploitative labor practices. Rub their faces in it. You
| can't win with these people, don't even try.
|
| And as a little bonus for the crybabies, when are you going to
| realize you're tools being used in corporate warfare? You think
| it's a coincidence that these articles are being written about an
| up and coming company running circles around the established big
| boys? Do you like being a tool?
| optimalsolver wrote:
| >crybabies
|
| Ironic comment.
|
| >Rub their faces in it
|
| None of that's gonna happen. Deal with it, lol.
| jsimzeroone wrote:
| Neal Stephenson's "The Diamond Age" has a fake AI "character" --
| a book that appears to be AI but is really operated by low-paid
| humans in 3rd world countries (reminiscent of the Mechanical
| Turk, a supposed chess playing automaton that actually contained
| a small person).
|
| There's an old observation from Arthur C. Clarke, that
| sufficiently advanced technology seems like magic. One thing that
| learning how magic tricks are performed taught me is that
| magicians typically do their fake magic by doing an unreasonable
| amount of work behind the scenes -- "magic" in the real world is
| often just doing a large amount of work that people don't realize
| is happening.
|
| Given all that it seems appropriate that the new "real world
| magic" -- ML systems imitating intelligence -- really rest on a
| lot of hidden work by human beings. Just like magical devices
| like iPhones exist due to a lot of surprisingly cheap labor.
| Imagining otherwise is like imagining that the delicious food
| from a 3-star kitchen just appears from the chef's mind, without
| the help of all of the low-paid kitchen workers, farm workers,
| etc. that in reality do most of the work.
| daniel-cussen wrote:
| [dead]
| jjcon wrote:
| Not as a rule though - so many ML systems are utilizing data
| that is streaming in from passive sensors or transactional
| streams and is not human curated at all. The human aspect isn't
| an intrinsic property of ML or even these algorithms, only of
| particular applications (and I would guess a minority of
| applications too).
|
| Given that, it seems to be a clear miss to apply that logic
| generally. I have to believe it most likely stems from a lack
| of basic understanding and competence on the authors' part.
| YeGoblynQueenne wrote:
| >> I have to believe it most likely stems from a lack of
| basic understanding and competence on the authors' part.
|
| That is unlikely, given that one of the authors is Timnit
| Gebru. I'm quoting below select passages from her wikipedia
| page indicating her background:
|
| _In 2001, Gebru was accepted at Stanford University.[2][5]
| There she earned her Bachelor of Science and Master of
| Science degrees in electrical engineering[8] and her PhD in
| computer vision[9] in 2017.[10] Gebru was advised during her
| PhD program by Fei-Fei Li.[10]_
|
| _Gebru presented her doctoral research at the 2017 LDV
| Capital Vision Summit competition, where computer vision
| scientists present their work to members of industry and
| venture capitalists. Gebru won the competition, starting a
| series of collaborations with other entrepreneurs and
| investors.[11][12]_
|
| _Gebru joined Apple as an intern while at Stanford, working
| in their hardware division making circuitry for audio
| components, and was offered a full-time position the
| following year. Of her work as an audio engineer, her manager
| told Wired she was "fearless," and well-liked by her
| colleagues_
|
| https://en.wikipedia.org/wiki/Timnit_Gebru
| jjcon wrote:
| I didn't say they were uneducated but an audio hardware
| engineer does not imply a good working knowledge of
| industry trends in ML applications.
|
| Regardless, my point still stands, they completely ignore
| (willingly or ignorantly) that human labeled data is not
| intrinsic to ML or even the algorithms themselves and in
| all likelihood is a small minority of datasets used by
| modern ML applications. To then apply that critique
| generally to ML shows ignorance and a misunderstanding of
| the ecosystem.
| YeGoblynQueenne wrote:
| Gebru is not an audio hardware engineer. I call your
| attention to this passage I quoted above:
|
| _Gebru presented her doctoral research at the 2017 LDV
| Capital Vision Summit competition, where computer vision
| scientists present their work to members of industry and
| venture capitalists. Gebru won the competition, starting
| a series of collaborations with other entrepreneurs and
| investors.[11][12]_
|
| And to the fact that she got her PhD in computer vision,
| i.e. the main area of AI research that the article seems
| to be criticising.
| jjcon wrote:
| Her work experience is as an audio engineer - but again
| it doesn't matter what her credentials are, she is wrong
| regardless and you are ignoring my whole point. She shows
| her ignorance of the subject matter (again willingly or
| not) when she applies her critique generally at ML and
| not just at these specific applications - not sure how
| many times I need to say that.
| YeGoblynQueenne wrote:
| >> Her work experience is as an audio engineer
|
| Her PhD research is in computer vision and she and her
| co-authors are writing mainly about computer vision, but
| you spoke of "a lack of basic understanding and
| competence on the authors part". That is clearly
| incorrect and I don't understand what saying the same
| thing many times will change about that.
| jjcon wrote:
| Computer Vision is a domain and is not equivalent to
| machine learning. They overlap, yes, but not necessarily.
| Again though you have completely ignored my point again
| and again. The authors ignorantly conflate specific
| applications of ML with the entire industry. That shows a
| lack of competence in this area.
| bloodyplonker22 wrote:
| You perfectly illustrate the problem with typical western
| thinking. The workers you refer to may be low paid when
| compared to the US and other western countries, but they are
| highly paid in their respective countries. A lot of the time,
| the jobs (ie: iPhone assembly, AI related) are highly sought
| after because they are a great alternative to the other jobs
| that the workers with their skill set can get. It is also a
| great way for them to get a step up the job ladder and acquire
| new skills.
| ClumsyPilot wrote:
| > A lot of the time, the jobs (ie: iPhone assembly, AI
| related) are highly sought after
|
| This description also applies to mining toxic substances by
| hand and to high-end prostitution.
|
| Artisanal cobalt mining by hand is relatively popular, and it
| kills you within 10 years. It is not difficult to exploit
| desperate people. I don't think we should be whitewashing it.
| danielmarkbruce wrote:
| The woman who started Sama (mentioned in the article)
| explicitly started the company to help people in those
| countries. Her entire life appears to have been directed
| toward helping people in Africa, she had a history of it.
| She wasn't there to exploit people.
| colpabar wrote:
| But it _is_ exploitation though, right? If a bunch of
| people in a western company say "well, we could just
| hire people in Africa because that would _significantly_
| reduce our costs," isn't that an exploitation of cheap
| labor in africa?
| jeremyjh wrote:
| Is it exploitation to buy things from poor people?
| Dalewyn wrote:
| All business is about exploiting someone somewhere for
| your benefit, no exceptions. The only question is whether
| that exploitation is within tolerable limits.
| yesenadam wrote:
| > All business is about exploiting someone somewhere for
| your benefit, no exceptions.
|
| Where did you learn that? I don't think it's at all true.
| It seems maybe you have a no-true-scotsman definition of
| 'exploited', so that no evidence against your claim would
| change your mind.
|
| Picture a baker who makes bread for people: they get
| bread, the baker gets money. Where is the necessary
| exploitation? I can't imagine where your confidence - "no
| exceptions" comes from. There are no win-win exchanges in
| the world, and none even possible? I'm not a huge fan of
| capitalism but that seems absurd.
| Dalewyn wrote:
| The baker is exploiting his customers' need/want for
| bread. The customers are exploiting the baker's need/want
| for money.
|
| Another way to describe business is that all business is
| about ripping someone off without pissing them off (and
| ideally making them happy). Middlemen who make their
| profit off margins are the most obvious example, but as I
| said this applies to all forms of business.
|
| I reiterate: All business is about exploiting someone
| somewhere for your benefit, no exceptions.
| yesenadam wrote:
| I think you are using the word "exploit" in a different
| way than it is usually used, leading most people to
| misunderstand you. Or as a sibling comment suggests, the
| word has two meanings, and your argument uses
| equivocation (two different meanings in two different
| places) to achieve an apparently thick, substantial
| conclusion out of nothing.
| Dalewyn wrote:
| Let me rephrase it using simpler language, then:
|
| All business is about taking advantage of someone
| somewhere for your benefit, no exceptions.
|
| The baker is taking advantage of his customers' need/want
| for bread. The customers are taking advantage of the
| baker's need/want for money.
| ben_w wrote:
| "Exploitation" has at least two (IMO very different)
| meanings.
|
| Even though most of the time "exploiting an opportunity"
| is neutral and "exploiting our workers" is either a scam
| or abuse, I have seen some texts that used the word in
| the same sense for both cases.
| colpabar wrote:
| it's communist nonsense that's used to justify the idea
| that walking dogs for ~25 hours a week is just too much.
| danielmarkbruce wrote:
| I guess it depends on your definition of "exploit".
|
| If OpenAI had to pay more, they would have gone with
| another option. It's challenging to work across time
| zones, across cultures, across language barriers. Working
| with folks in Reno, NV or somewhere in the southern
| states of the US would have been the choice for OpenAI at
| a much higher price.
|
| It's a competitive world. On the surface the Sama founder
| knew that and realized the options were higher wages for
| these folks in Africa, or none. The choice of _even
| higher_ wasn't actually on the table.
| boeingUH60 wrote:
| You mean the same Sama that charged OpenAI $12.50 per
| hour for a contract and paid their African contractors $2
| or less an hour?
|
| https://time.com/6247678/openai-chatgpt-kenya-workers/
| bumby wrote:
| I think your example explains why the article is a
| potentially gray area of exploitation.
|
| I think it's clear exploitation when you offer someone
| employment that adversely impacts their rights. So artisanal
| cobalt mining is exploitative, because its health effects go
| against the UN's definition of "the right to work in just and
| favourable conditions" [1]. However, I'm
| not sure what, if any, rights are being compromised by the
| topic in the article. Maybe there's a case that it's unjust
| because of the asymmetry in the value created and payment.
| If there are some rights abuses, then it becomes a clearly
| exploitative endeavor.
|
| [1] https://www.un.org/en/global-issues/human-rights
| 8note wrote:
| This is another common problem with typical western thinking
| -- forgetting the historical context. Colonialism happened,
| and the people are so poor and desperate now because of
| things the west did in the past.
| miguelazo wrote:
| This does explain a lot of the difference in development,
| especially in very poor countries. And this isn't just a
| result of "old", traditional colonialism, but imperialism
| in the form of meddling in their internal politics.
| Guatemala is a great example, having had its democracy
| destroyed by Allen Dulles and his CIA in the 1950s. But you
| can also take a more recent example like the 2009 coup in
| Honduras, backed by Hillary Clinton.
| worik wrote:
| As someone who does not live in the USA, I am very glad H
| Clinton did not become president
|
| Impossible to know but I expect she would not have stayed
| out of the Syrian war.
|
| It was she who was largely responsible for the debacle in
| Libya.
|
| So keen on using all that power, and such a short-term
| thinker.
|
| Trump was a catastrophe for you in the USA but not really
| for us.
|
| Please elect an isolationist....
| wolverine876 wrote:
| Exploitation involves, by definition, giving the exploited
| something they need. If you don't have food, and someone
| gives you food in return for slave labor, they are exploiting
| you.
| josephg wrote:
| If people in a poor country are offered jobs in a factory,
| and they prefer those jobs over subsistence farming (since
| the conditions are better and pay is better), who exactly
| is worse off as a result? Sounds like a profitable trade to
| me given both parties walk away happy.
|
| Outsourced factory jobs are the mechanism by which
| previously poor countries like Taiwan and China (in many
| ways) have been pulled out of poverty. The process is
| happening before our eyes in Vietnam right now.
|
| Do you think you're doing poor people a favour by denying
| them well paid jobs? Should we do the same in the west, and
| have companies fire all our poorest employees?
|
| I can hardly think of a more cruel policy.
| nerdponx wrote:
| > a book that appears to be AI but is really operated by low-
| paid humans in 3rd world countries
|
| I've seen this in real life before, no need for a work of
| fiction.
| brudgers wrote:
| _Diamond Age_ was first published in 1995.
|
| I don't think I am out on an intellectual limb believing very
| very few people would have seen an AI written book back then.
| simonebrunozzi wrote:
| > Mechanical Turk
|
| And also inspiration for Amazon's Mechanical Turk. [0]
|
| [0]: https://www.mturk.com/
| charlieyu1 wrote:
| Reminds me of how "virtual youtubers" are actually humans
| behind the skin
| sfritz wrote:
| In Diamond Age the book is being performed by a skilled actor
| who is voicing many characters using a script entirely
| generated by the AI.
| andrepd wrote:
| >reminiscent of the Mechanical Turk, a supposed chess playing
| automaton that actually contained a small person
|
| Reminiscent of the Amazon Mechanical Turk, a current real-life
| example of exactly that.
| krisoft wrote:
| > a book that appears to be AI but is really operated by low-
| paid humans in 3rd world countries
|
| Uhm actually :) the AI definitely writes the text itself, and
| takes care of Nell, and senses the environment around itself,
| but for plot reasons it can't do voice synthesis. So it employs
| humans to read out the words. At least until Nell learns to
| read.
|
| So it is not just appears, but it is in fact an AI, with a
| veneer of human voice on it.
| Closi wrote:
| If by "exploit" you mean "offer a clear and optional exchange of
| money for labour in return" then yes, I guess people were
| exploited here.
|
| As per everyone in employment.
| idontpost wrote:
| [dead]
| subradios wrote:
| The industrial revolution moved 90% of workers out of farming;
| yes, 90% of employment in 1870 was agricultural, literally
| producing calories.
|
| We sometimes mourn for this in the form of back to the land
| pastoralism, but quality of life empirics suggest the industrial
| revolution was a benefit anyway.
|
| Instead of luddism, we should try to find ways that the coming
| apocalypse of white collar knowledge work can benefit humanity as
| a whole, and learn from our mistakes in the rust belt.
| fnordpiglet wrote:
| I think the next level is post scarcity. In a post scarcity
| world maybe we don't labor and toil to live because it's
| unnecessary to tie home, health, food, and life necessities to
| labor if our labor isn't useful. Maybe life becomes about
| something other than working to live and living to work. Maybe
| tying labor to life necessities was necessary given scarcity of
| labor, but when labor scales independently of people we need a
| new way of allocating resources.
| VLM wrote:
| Why do contemporary discussions of post scarcity always
| require something in the future rather than appearing in the
| past due to "the assembly line" or "agriculture"?
|
| Surely in the vast universe of past human discovery it seems
| likely that if post-scarcity were possible in any form, we'd
| have already discovered what would initiate it, so it should
| be here now... and it seems unlikely that any
| individual invention in the future will kick it off if none
| of the past inventions did.
| fnordpiglet wrote:
| Because productivity still scaled linearly with consuming
| humans, even if the constants improved. AI and other
| advances offer a potential for nearly autonomous
| productivity allow for productivity that scales
| independently of available labor.
|
| Additionally I would say that each advance brought us
| closer to post scarcity. We have close to eliminated
| extreme poverty globally. Compared to hunter-gatherer
| societies, we already live post-scarcity.
|
| Finally we may very well be post scarcity, but the notion
| of nobility in work and morality of labor means we can't
| yet seriously consider decoupling work from life
| necessities. At some point there won't be enough bullshit
| jobs left to justify pretending people need to labor to
| eat, and society will either collapse or we will move
| beyond work to live.
|
| I would posit however the invention left undone is the one
| we use humans for now. Their ability to reason, make
| independent decisions, synthesize new ideas in any
| situation, learn new and different skills, interacting with
| a complex field of visual, auditory, and sensory stimulus
| effectively towards a goal, etc. That's why we research AI.
| If our tools have that, then our tools don't need us. If
| our tools don't need us, we don't have to do the work. If
| we don't have to do the work, there is no scarcity because
| work scales independent of us.
|
| There are also other inventions we know of but haven't
| perfected that help here. Efficient fusion is one. With
| that energy is cheap and plentiful and presumably clean.
| Energy is the ability to do work. With artificial minds
| that can produce minds that can in turn produce minds,
| fueled with plentiful energy, what's left?
|
| So I disagree that we've invented everything that might be
| useful, or that what hasn't yet been invented won't lead to
| improved productivity to the point that human labor is
| redundant and all human needs can be met without it.
| thedorkknight wrote:
| Would be lovely if there was actually any movement to avert the
| employment apocalypse. So far all I see is talk, and I have no
| idea how to do anything beyond that myself
| unity1001 wrote:
| > We sometimes mourn for this in the form of back to the land
| pastoralism, but quality of life empirics suggest the
| industrial revolution was a benefit anyway.
|
| Nope. That 'moved workforce' started living in industrial slums
| and dying at a ripe old age of ~40 instead of living until
| their late 60s.
|
| http://www.filmsforaction.org/news/recovered_economic_histor...
| cornel_io wrote:
| And now the average worldwide lifespan is 72, higher if you
| look at countries that are fully industrialized.
|
| Transitions suck for the people left behind, but that doesn't
| mean that progress is bad, it can really help people overall.
| tristor wrote:
| That link is unhinged. It considers people to be self-
| sufficient on mere subsistence. A society which creates
| collective incentives towards collaboration and away from
| violent domination, creating wealth and value in excess of
| subsistence, and opening up the massive quality of life
| increase to all, is a significant departure in a positive
| direction from mere subsistence for an agrarian peasant that
| survives at the whim of people who could brutally and
| violently take from them.
|
| That has to be the most hilariously and sadly unhinged
| reading and retelling of history I have ever seen. A
| wonderful example of lying with truths. The author seems to
| be part of a communist online writer collective, I suppose
| that should be unsurprising given the subject matter. Commies
| are wild.
| ClumsyPilot wrote:
| > collaboration and away from violent domination
|
| Working in an iron foundry or steelworks is more peaceful
| than quietly living on a farm? There is less conflict
| between workers than farmers?
|
| What is the basis for this fantasy?
| tristor wrote:
| Human history is full of violence, and not all of it is
| between "betters" and "lessers", it's just people being
| violent towards one another to get what they want. Post-
| industrial society established social order and rule of
| law much more clearly than anything prior. A big piece of
| this was due to compulsory and inclusive education, but
| many other factors including the rise of enterprises
| which required social interactions to reach personal
| success changed society to a structure where
| collaboration was rewarded much more so than violence,
| which was punished.
|
| This was not the case prior to industrialization. You
| have some idyllic pastoral fantasy in mind, which was not
| true.
| nerdponx wrote:
| I don't think you're reading the article as-written.
|
| It'd be nice to see some sources for the quoted pamphlets,
| but if we assume that they are actual quotes from primary
| source material, it's quite telling.
|
| The article _does_ get a lot wrong, e.g. conflating
| feudalism with modern industrialized capitalism (hunting
| had been controlled by central political authorities for
| centuries before the industrial revolution).
|
| But there's also a good point being made, that breaking up
| communal economic systems can be used as a tool of
| subjugation and control. There's nothing in here about
| self-sufficiency or subsistence per se.
| notahacker wrote:
| tbh he's not wrong that the article says more about the
| partisan slant of the authors than it does about British
| industrial history. The article touches upon self
| sufficiency with the argument that peasants could
| have made their own shoes from their own leather in a
| matter of hours so buying them proved they were poorer (a
| particular load of er... old cobblers) and I'm not sure
| various quotes about peasants being lazy proves anything
| more than the fact snobbery existed.
|
| There's plenty of actually problematic stuff (the
| Enclosure Act) that happened to the British peasantry
| mostly _before_ the Industrial Revolution without taking
| the view that peasantry was a particularly pleasant
| lifestyle that nobody would volunteer to change.
| nerdponx wrote:
| * * *
| tarotuser wrote:
| Ned Ludd's premise was about the quality of autonomy and life
| of the workers who were being automated away. As automation
| came in, workers got less money, were treated worse, and had
| worse lives.
|
| Being called a 'Luddite' was NEVER about technology, but about
| who gains from technology.
|
| And I dare-say he was right in his concerns. The gains of
| technology are privatized by the owner class, even though we
| worker class are the ones who utilize them. One needs to look
| no further than the "gig economy".
| arcticbull wrote:
| They just wanted apprenticeships for operators and decent
| pay. [1]
|
| Luddites were the victims of a very successful smear
| campaign.
|
| [1] https://www.smithsonianmag.com/history/what-the-luddites-
| rea...
| thescriptkiddie wrote:
| The Luddites were not broadly opposed to new technology, they
| were opposed to the ownership structure which cut them out of
| the higher profits the new technology brought.
| [deleted]
| soiler wrote:
| > quality of life empirics suggest the industrial revolution
| was a benefit anyway. [citation needed]
|
| I'm not trying to stuff AI back into Pandora's box. It's here,
| and it's coming. It _can_ be a really great thing, or it _can_
| be catastrophic. So I mostly agree with your last point. But if
| we're going to talk about learning from our mistakes, the
| industrial revolution gave us The Jungle, and Amazon, and the
| obliteration of The Amazon.
|
| Things didn't work out for the best; many of them worked out
| horribly. And things that did work out did so because the road
| was paved with human bodies (and tens of billions of nonhuman
| bodies).
| wolverine876 wrote:
| It didn't work out well for a great many of those workers or
| their children. The people who got rich would not accept an ROI
| two generations down the road. Are you willing to accept that
| now - lose your career, much of your income, so that the
| changes in society will benefit your grandchildren (while
| billionaires and their children cash in right now)?
|
| 'It works out in the long run' is BS, and is always applied to
| someone else.
| allemagne wrote:
| The author has two points that continually distract from each
| other. They have a specific critique of worker conditions within
| tech companies, but also a broad skepticism that AI can really be
| all that effective or innovative, rather than an abstraction over
| the workers who "really" do the work. Their first point is solid
| and is accompanied by immediate calls to action; their second is
| vague, wrong, and potentially dangerously naive.
|
| They use AI, "so-called AI", or "AI" (with scare quotes)
| interchangeably. Besides being unnecessarily confusing with their
| own terminology, they are bringing in a pointless philosophical
| debate about what "intelligence" or "learning" really is. The
| "40% of AI startups that don't actually use any AI" is mentioned
| not because they're misleading investors or users, but because
| they point out that the "AI" label makes workers less visible,
| and then they can tie that to Amazon's "artificial artificial
| intelligence" and go on from there to show or at least imply that
| pretty much all AI is a big smokescreen to justify the
| marginalization of workers.
|
| However... isn't that begging the question? Do we actually care
| whether these systems learn and think the same way humans do, do
| their own original work, or whether they're just a worker
| pretending to be a chatbot? Does that make it okay to underpay
| gig workers, give them PTSD, or subject them to poor conditions?
|
| Maybe "real AI" never materializes, but also maybe it just keeps
| improving until it doesn't matter. If so it's dangerous to deny
| that these systems can ever meaningfully replace or augment
| humans. We can't pretend that we live in a universe where
| something will forever be inferior to good old-fashioned human
| labor just because of the disturbing social implications.
| aborsy wrote:
| I assure you, even if the technology one day is fully
| unsupervised, with minimal involvement from humans, no carbon
| emissions, etc., people will still publish the exact same articles,
| form unions, protest, etc: we are abused, there is inequality and
| exploitation, the productivity gains of technologies must be
| equally shared regardless of the contributions, introduce ever
| higher taxation, and so on.
|
| Amazon apparently pays around $25/hour to its starting warehouse
| workers. That's 1k per week for 40 hours per week, or around
| 4k/month. That's more than what the French government pays its
| rocket scientists. People can discover their market rates based
| on their skills, and are free to work for another employer.
| There is no lock-in.
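
  A quick sanity check of the arithmetic in the comment above, as a
  minimal Python sketch; the $25/hour and 40 hours/week figures are the
  commenter's assumptions, and the result is gross pay only.

      # Gross pay at the commenter's assumed $25/hour, 40 hours/week.
      hourly = 25.0
      hours_per_week = 40
      weekly = hourly * hours_per_week   # $1,000 per week
      monthly = weekly * 52 / 12         # ~$4,333 per month (calendar average)
      annual = weekly * 52               # $52,000 per year
      print(f"weekly ${weekly:,.0f}, monthly ~${monthly:,.0f}, annual ${annual:,.0f}")
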
| ceres wrote:
| > That's more than what French government pays its rocket
| scientists.
|
| I was about to call bullshit but then looked up the average
| salary for an aeronautical engineer in France and it was...
| bleak.
| [deleted]
| guywithahat wrote:
| Sometimes I wonder if these "ethical" articles are only put up to
| stop new companies from competing.
|
| Like "ok now that OpenAI Inc has a basic AI, lets be sure to
| close the door behind us so we don't have to deal with competing
| startups"
| hnbad wrote:
| By that logic any revelations about bad labor conditions
| underlying first world products or services only serve to
| prevent competition from also exploiting those conditions.
|
| If your ethical conundrum is between continuing to allow the
| exploitation of the post-colonial third world, and to stifle
| competition in first world markets, I think you need to retake
| an ethics class because that is not a conundrum.
| 908B64B197 wrote:
| I noticed "Ethics of X" is generally a great field for those
| who can't cut it in field X. See the "Ethics of AI" crowd as an
| example; it seems to be mostly made of "tech adjacent"
| folks.
|
| Probably the best example here [0].
|
| [0] https://syncedreview.com/2020/06/30/yann-lecun-quits-
| twitter...
| optimalsolver wrote:
| Would be less need for these articles if major tech companies
| would stop acting like villains in a sci-fi movie.
| pelasaco wrote:
| did you check in which conditions and countries your shoes,
| t-shirts, pants, etc are manufactured?
| roughly wrote:
| Yes. Wherever possible, I try to be aware of the
| consequences of my actions, and adjust how I do things when
| I find they're leading to outcomes I feel are not in
| alignment with my ethics.
| Der_Einzige wrote:
| Cyberpunk*
| _Algernon_ wrote:
| Wouldn't see them "acting like villains in a sci-fi movie" if
| there weren't systematic economic incentives for them to act
| this way.
| roughly wrote:
| And yet, every one of these organizations consists of
| individuals making sets of individual decisions on an
| ongoing basis, who indeed face incentives as well* but also
| persist in making the dubious ethical decisions that the
| company enacts, and we can in fact judge them for this,
| because companies are made of people who don't actually
| abdicate moral responsibility for their actions by logging
| into a corporate email system.
|
| *And here's where I digress to say, again, that, indeed if
| you are the single mother of a child with a heart condition
| wholly reliant on the expensive company health care not
| lapsing for even a minute lest your child not just die but
| literally explode, taking your whole family and half the
| neighborhood with them, I'm not judging you. If, on the
| other hand, you're a single dude between 20 and 35 with a
| college degree, significant assets, no debt, and a sellable
| skillset, yes, you're the one I'm judging.
| gsatic wrote:
| That's cause people today don't just say hey look what I built.
| Everyone acts as if they are doing something world changing.
| And it's bullshit, and the backlash is natural and well
| deserved.
| dekhn wrote:
| These articles are written to make people feel guilty about
| their lives. The ultimate goal is economic equalization.
| croes wrote:
| The goal is awareness. There is no such thing as a free lunch;
| somebody has to do the dirty work for too little money.
| whatshisface wrote:
| The amount of dirty work increases as the price of labor
| goes down, because companies put less effort into inventing
| labor-saving techniques.
| VLM wrote:
| The circular reasoning is entertaining, as the low priced
| labor is theoretically generating the AI to be the
| ultimate labor-saving technique. But you don't need to
| save "too cheap to meter" labor. What was the last thing
| that was "too cheap to meter?" Ah yes nuclear power. Look
| how that turned out.
|
| The two problems with trying to save the world via AI are
| the cheapest self driving car is a passenger train driven
| by "too cheap to automate" human, and an economy based on
| AI will be too poverty stricken from wealth inequality to
| permit the AI to generate a profit thus AI is not needed
| thus no need to destroy everything by applying AI.
|
| The more AI is deployed, the poorer an economy will
| become, and the cheaper labor will become, making it
| quite a race to see if the AI gets smarter faster than
| economic activity implodes.
|
| The most likely outcome of AI boosterism will be
| something like the environmentalist movement, but anti-
| AI. Who will be "The Lorax" of anti-AI?
| dekhn wrote:
| Are you sure it's too little money? Articles like this
| often report wages that sound small to a person living in
| western Europe or US, but when compared to people in the
| same economy, these salaries are often competitive.
|
| And yes, somebody has to do the dirty work.
| godelski wrote:
| This is why I can never understand these articles. How
| much is $1.46/hr? Is it a little? A lot? Average? So I
| head over to Google and look up the answer[0]. But
| everything is in yearly wages so $1.46*2080 ~= $3037.
| Looks like it would be a lot in Sudan or Myanmar, about
| average in Egypt, and not a lot in Sri Lanka or Ukraine.
| But are these numbers even trustworthy? We had a post
| about this just the other day.
|
| But we don't even know the countries that are being used!
| So we're back to the original question: is it a little? A
| lot? Average? Honestly, when I see people spout out raw
| numbers alarm bells go off in my head. Doesn't matter if
| it is wage, number of car crashes, or number of murders.
| We have so many fucking people on this planet that a
| small percentage of a large number is still a large
| number. There are definitely people using this in
| deceitful manners. There are also dumb people, but
| idiocracy can still be accidentally malicious. The reason
| it is easy to lie with statistics and data is because it
| is actually really difficult to compare numbers and data
| across differing conditions. In this case, it really does
| not make sense to compare the wages of an American to the
| wages of a Sri Lankan. The price of an apple in both
| places isn't equal.
|
| Still, I don't know the answer to my question.
|
| [0] https://www.worlddata.info/average-income.php
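
  A minimal Python sketch of the annualization in the comment above:
  2080 is 40 hours/week times 52 weeks, the result is gross pay with no
  purchasing-power adjustment, and the country entries below are
  placeholders to be filled in from the linked worlddata.info table
  rather than real figures.

      # Annualize the article's $1.46/hour figure (gross, no PPP adjustment).
      hourly_usd = 1.46
      annual_usd = hourly_usd * 2080            # ~$3,037 per year
      # Placeholder comparison table; fill in average annual incomes from the
      # source linked above (None = not filled in, not a real figure).
      average_income_usd = {"Country A": None, "Country B": None}
      for country, avg in average_income_usd.items():
          if avg:
              print(f"{country}: {annual_usd / avg:.2f}x the local average income")
      print(f"annualized wage: ${annual_usd:,.0f}")
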
| andrepd wrote:
| The fact that it's a lot in Egypt or Myanmar just means
| wages are low in those countries. It's easy to say "well
| that's just the way it is" while you're sat near the top
| of the pyramid.
| [deleted]
| godelski wrote:
| You're simplifying things too much though. There are
| important questions still. For example, how much does an
| apple cost? Rent? These are not equal in these countries.
| But also, let's say that they paid $3/hr in Egypt you
| could end up flooding the market. $15/hr and it could
| ironically be disastrous. We actually have seen stuff
| like this before when Westerners move into other
| countries. The problem is that the local markets can't
| compete. Then we have a whole neocolonialism situation.
|
| If your goal is to actually help people in a country get
| out of poverty this is actually quite a difficult goal to
| achieve without taking over that country (implicitly or
| explicitly). We don't want to physically invade. We don't
| want to economically invade. But we also don't want them
| to sit in poverty so we have to do something right? So
| how do you do that and let them maintain their
| independence? They might not even want to work with us to
| begin with! Don't pretend that these are trivial problems
| and us westerners sitting at a computer just know what's
| best. That's how colonialism started in the first place.
| astrange wrote:
| Paying extraordinarily higher than average wages in a
| poor country has downsides - most likely they'll be
| stolen (since most charity is stolen) but it also annoys
| the government when eg local doctors stop their practices
| to work for you.
| croes wrote:
| Often competitive only means other jobs also pay too
| little money
| nerdponx wrote:
| If all you take away from this is that "someone is trying to
| make me feel guilty", then maybe you need to re-assess your
| own personal sense of guilt, and stop projecting it onto
| other people.
| brookst wrote:
| Nobody can make you feel anything.
|
| If true information induces feelings in you, perhaps it's
| more constructive to reflect on the information rather than
| the motives of the reporter?
| nomel wrote:
| > Nobody can make you feel anything.
|
| A simple phrasing like this does not reflect the reality of
| humanity's evolution and resulting biology. We're social
| creatures, requiring that others make us feel things:
| https://www.apa.org/monitor/oct05/mirror
|
| Unless you're a psychopath, perhaps:
| https://www.medicalnewstoday.com/articles/321839
| dekhn wrote:
| I did journalism in high school and I know quite well that
| how you present true information (framing) or distort true
| information (for example, by comparing African salaries to
| the US) is done intentionally to make the reader feel
| something- guilt, anger, a desire to vote for somebody
| specific.
| wavefunction wrote:
| Does experience in high-school journalism support the
| claim that professional journalism is all a conspiracy to
| manipulate the reader? I doubt it, and I edited my high-
| school newspaper for two years lol
| dekhn wrote:
| Did I say conspiracy? No. I'm making a generalization. If
| you have sources for objective journalism, I'd love to
| see them. The closest I've seen was the Economist about
| 5-10 years ago.
| pelasaco wrote:
| > Nobody can make you feel anything.
|
| This in 2023, when people feel offended by anything, sounds
| pretty off...
| robotresearcher wrote:
| > Nobody can make you feel anything
|
| The size of the advertising industry suggests otherwise.
| 8note wrote:
| I don't see why somebody would feel guiltier about this than
| wearing Nike shoes?
|
| It'll make a cottage industry of "data made in the USA"
|
| It continues to answer the question: can the west run without
| maintaining a dedicated underclass -- no.
|
| Should we be paying even less to make sure that we can keep
| that going? If those people get richer, these systems that
| benefit us will stop working
| chamwislothe2nd wrote:
| I believe this is a form of astroturfing and it is hard not to
| see this stuff. Remember all those articles about the AI winter
| a few years ago? In reality, there was no winter. Lots of
| smart people took advantage of the opportunity and got
| incredibly wealthy during that period, and now we see some
| of the fruits of that labour. It's just the beginning, too.
|
| It's important, in my opinion, to always ask 'who benefits from
| me believing this?'
| roughly wrote:
| I mean -
|
| > who benefits from me believing this?
|
| In the case of this article, presumably the exploited*
| laborers whose labor conditions might improve if the
| population to whom this article is addressed both became
| aware and started to care about the labor conditions of the
| people behind the systems they use. Trying to 4D chess
| ourselves into not having to actually address the issue at
| hand isn't clever or insightful, it's just abdication with
| numbers attached.
|
| * I'm not arguing whether or not the behavior described in
| the article actually constitutes exploitation or is just the
| noble hand of the market more perfectly forming working
| conditions - that's not my point. The point is if you DO read
| the article and you DO feel that behavior's exploitative,
| scratching your head over who's got an agenda to want this
| thing you find abhorrent changed, as opposed to trying to
| change the behavior you find abhorrent, is the kind of thing
| people think smart people are supposed to do, and tends to
| contribute to more things you find abhorrent happening in the
| world.
| chamwislothe2nd wrote:
| It can be both true and used for astroturfing.
| ghaff wrote:
| I'm not sure how widespread the sentiment was that there was
| going to be a near-term AI winter.
|
| What you did have was increasing skepticism that things
| like fully-realized door-to-door autonomous driving were
| going to happen in the short run--but a lot of the
| skeptics didn't think that would necessarily translate
| into an AI winter.
| bumby wrote:
| > _Remember all those articles about the AI winter a few
| years ago?_
|
| FTA:
|
| > _" MMC Ventures surveyed 2,830 AI startups in the EU and
| found that 40% of them didn't use AI in a meaningful way."_
|
| While not the same as an AI winter, it can still be
| indicative of "peak hype". If accurate, a large portion of
| the companies claiming to have some nebulous AI advantage
| aren't really delivering on that promise. It doesn't have
| to be astroturfing to point out a hype machine,
| particularly if that hype is predicated on a shaky ethical
| foundation.
| chamwislothe2nd wrote:
| It doesn't have to be, no, but it might be. It's worth
| reconsidering the things we believe every now and then.
| jrochkind1 wrote:
| You are imagining that someone who otherwise would start a
| competing company would just choose not to after reading this
| article? Or what?
| dqpb wrote:
| That's not what this article is about.
| LatteLazy wrote:
| I wonder if articles like this are an "inside job": doing
| such a terrible job of making an ethical case against AI
| convinces people it must be fine...
| brudgers wrote:
| To me, the use of cheap labor for training suggests, in
| two ways, the level of importance AI companies place on
| avoiding social/cultural biases in the model.
|
| 1. Companies' standards would be much higher if
| classification was done by engineers paid engineering
| salaries. And engineers would be directly accountable for
| model biases instead of bias being excused on the basis of
| how the sausage gets made.
|
| 2. Outsourcing classification explicitly means classification
| is not valuable as a core competency; not an area in which
| expertise is important; and not an activity deserving of the
| benefits of direct employment such as financial rewards
| correlated to business success (e.g. bonuses, raises, options,
| etc.).
| [deleted]
| VLM wrote:
| It's a supply and demand problem. The supply of human AI
| gig workers staggeringly exceeds the demand for human AI
| workers. Trash talking the entire industry just means the
| poor will get poorer and be treated worse when the demand
| dries up even more. So I'm not really clear on the social
| justice angle of the article.
|
| This does bring up an important long-term market point. If there
| are an essentially infinite number of people willing to drive a
| car for practically nothing, tell me again why we need self
| driving cars as an example of AI? It's not like we're running out
| of poor people anytime soon, they don't cost much of anything, so
| why waste all that money on kicking the poor people out of the
| economy? Eventually, once everyone's kicked out of the economy,
| there won't be anyone left to buy the very expensive self driving
| cars anyway, and no one will need self driving cars because
| there'll be nowhere economically viable left to drive to.
| Certainly we can "self reproduce" car drivers much more cheaply
| than we can produce self driving cars full of unobtainable
| microprocessors, and producing human drivers is much more "black
| swan" proof than producing globalist microprocessors.
|
| The other problem is that an economy, in the long run, has to
| serve its participants. Let's say, as a thought experiment, it's
| inevitable that all capital and all economic activity will be
| concentrated solely in those with IQs over 150, or sociopathic
| tendencies, or outright criminality, and of course, with everyone
| else kicked out of the economy, they will be the sole
| beneficiaries of AI, much like European royalty was, in some
| sense, the only beneficiary of the gold mining industry. Well...
| won't 99% of the population go all French Revolution on them and
| fire up the guillotines? As soon as those people at the top of
| the pyramid are removed, and there won't be too many of them, the
| 99% of the population can go back to happy productivity. Any
| individual who rebels during the runup will get steamrolled by
| the competitors who don't rebel from the plan, but the endpoint
| of the plan is that anyone who cooperates with it to the end will
| get the guillotine anyway. So what's the maximally efficient game
| theory perspective if you're trapped in a self-destructive game
| where either total participation or obvious non-participation
| results in your demise? Well, everyone's better off if everyone
| "phones it in" and fails. So I expect a lot of "totally
| unexpected" AI failures.
|
| Another philosophical problem is that we've already run a
| repeated, massively parallel experiment on producing smarter,
| better-educated "non-artificial" intelligence entities, known as
| the university system, etc. It really hasn't worked out, other
| than for the usual primate dominance ritual purposes and making
| certain the people at the top stay at the top. But people seem to
| think that if we emulate that nonsense, which never worked, in a
| computer, it'll work next time. Sort of like how perpetual motion
| doesn't work out in the real world, but if you abuse a CAD/CAM
| program hard enough a simulation of perpetual motion will work in
| a computer... even so, faking it in a simulation STILL doesn't
| mean it'll work IRL. It's almost like a "Heisenberg uncertainty
| principle" of AI, where if we had AI, it seems the universe would
| route around it such that it can't make a net profit. Kind of
| like burning two barrels of crude oil to grow one barrel of
| ethanol to replace a barrel of crude oil: graph that out and see
| where it leads.
|
| I'm just saying, if "low IQ" biologicals are already economically
| excluded, why would "low IQ" AIs not also be economically
| excluded? And we've proven we can't educate "better" biologicals,
| so trying to produce a master race of AIs is likely to fail just
| as badly as those past experiments.
| scotty79 wrote:
| > "So-called AI systems are fueled by millions of underpaid
| workers around the world, performing repetitive tasks under
| precarious labor conditions."
|
| You can say this exact thing about capitalism as a whole.
| Children working in coltan mines and on cocoa plantations and
| such.
|
| It feels a bit superficial to write articles that jump on
| some cool tech bandwagon just to point out that this tech is also
| exploitative. Because nearly everything is. And in the case of
| this specific tech, it's probably not that bad when compared to
| children in coltan mines.
|
| It's a feature of our economy that we, well-off people, all
| benefit from our entire lives.
|
| We'd have to change the entire rules of engagement, and who's
| gonna pay to make that happen? We are not going to change it by
| playing whack-a-mole with global behemoths.
| golemiprague wrote:
| [dead]
| Xelynega wrote:
| Children working in coltan mines and on cocoa plantations are
| part of the same issue, so I don't understand why you take
| issue with that issue being brought to the forefront using
| 'some cool tech bandwagon'. It's a cool tech bandwagon
| because people's eyes are on it; if you agree these issues
| are real, wouldn't you want them seen by those eyes?
|
| I don't quite understand your final point. Yes, changing the
| labour relations globally would require global changes.
|
| > Who's gonna pay to make that happen?
|
| It's a misunderstanding to frame it like this, because the
| answer is that the exploited people are already paying for the
| current reality with their labour. Why is it assumed that they
| can continue to shoulder the majority of the work for the
| minority of the benefits, but the idea that the work and
| benefits should be spread more evenly is abhorrent?
| scotty79 wrote:
| > I don't understand why you take issue with that issue being
| brought to the forefront using 'some cool tech bandwagon'.
|
| It just feels cheap and even less effective than other
| approaches. It promotes a whack-a-mole mindset.
|
| > It's a misunderstanding to frame it like this, because the
| answer is that the exploited people are already paying for
| the current reality with their labour.
|
| Ah, I see your misunderstanding. I didn't mean that
| somebody will bear the cost of a change. I meant that for
| any change to happen, somebody must pay through the nose
| for political influence to force that change.
|
| The only meaningful laws that come into force are laws
| that somebody bought. Who's gonna buy a 'non-exploitation'
| law if everybody with any money (billionaires, but also
| you and me) benefits from this exploitation?
|
| Want to solve the problem? Figure out a way for someone to
| earn big money by preventing the exploitation of people.
| chatterhead wrote:
| This is a first world manipulation into some kind of contrived
| "workers unite" hit piece.
|
| This has nothing to do with real AI; instead, this author is
| trying to pass off hacks and gap-fills as an entire industry or
| pursuit, for the purpose of manufacturing suffering or abuse that
| isn't there.
|
| This author has a terrible perspective.
|
| If we have to enslave the whole world for 1 year to build an AI
| that elevates all for eternity, would you do it?
| Xelynega wrote:
| > If we have to enslave the whole world for 1 year to build an
| AI that elevates all for eternity.
|
| I think this shows a fundamental misunderstanding of the issues
| in the article.
|
| Using your analogy, the article is asking the question:
| "If we have to continually enslave entire nations to build
| an AI that a single private corporation benefits directly
| from, and that individuals in first world countries
| benefit indirectly from, would you do it?"
|
| The reason it's unethical is the system of exploitation
| we live in. So to answer your question: yes, if we
| removed that system of exploitation and everyone
| benefited equally after suffering equally for a year, I
| would "press that button", so to speak. The problem is
| that's not the reality for AI and those who work to
| support it.
| [deleted]
| deltree7 wrote:
| I'm thankful for all the Western companies that exploited my
| labor from age 21 to 25, when I was paid a measly $100 per month,
| or $5 a day.
|
| Thanks to that experience, I now make 100x more.
|
| If it had been up to progressive liberals, sitting in their
| comfortable homes, trying to protect me, I would never have been
| exposed to the amazing opportunity this 'exploitative
| capitalistic company' gave me, which literally changed my life
| and the people around me forever.
|
| Woke Mind Virus is real and a cancer to real progress.
___________________________________________________________________
(page generated 2023-01-20 23:00 UTC)