[HN Gopher] I was wrong about the ethics crisis
___________________________________________________________________
I was wrong about the ethics crisis
Author : bikenaga
Score : 190 points
Date : 2024-12-29 16:23 UTC (6 hours ago)
(HTM) web link (cacm.acm.org)
(TXT) w3m dump (cacm.acm.org)
| kelseyfrog wrote:
| Our intelligence agencies have long recognized that individuals
| burdened by debt are vulnerable to coercion and manipulation.
| It's time we acknowledge that the H1-B visa program creates a
| similar dynamic. The program's restrictive rules effectively hang
| over visa holders like the sword of Damocles, leaving them
| perpetually at risk and easily controlled.
|
| We've already seen how Twitter, under Musk's leadership, has
| exploited this system to erode user protections in favor of
| appeasing his ego. When such moral compromises are normalized at
| the top, their effects inevitably cascade downward, influencing
| broader organizational norms and behaviors.
| Swizec wrote:
| > It's time we acknowledge that the H1-B visa program creates a
| similar dynamic. The program's restrictive rules effectively
| hang over visa holders like the sword of Damocles, leaving them
| perpetually at risk and easily controlled.
|
| This is why I went through the pain and cost of sponsoring my
| own O1 and later EB instead of relying on an employer or
| spousal visa. You just cannot be a full participant with
| someone who can get you kicked out of the country on 10 day
| notice.
| Simon_O_Rourke wrote:
| > leaving them perpetually at risk and easily controlled.
|
| I'd use a stronger term here, since some of the more nefarious
| companies can both exploit and abuse employees under H-1B visa
| limitations. Now go work 60-hour weeks for less than your peers!!
| kelseyfrog wrote:
| I walked it back from stronger language originally in the
| hope that it would be more palatable. I completely agree
| with your point.
| abduhl wrote:
| What is the relationship between this blog post and the H1-B
| visa program? And are you saying that Twitter has exploited the
| H-1B program to erode user protections?
|
| It seems like you're just trying to shoehorn some kind of
| unrelated anti-Musk sentiment into a discussion that has
| nothing to do with H-1B visas or Elon Musk?
| swiftcoder wrote:
| No, they (and every other BigTech) exploit it to erode
| _worker_ protections.
|
| Which in turn contributes to eroding user protections, since
| unprotected workers aren't really in a position to put up a
| fight when management tells them to do something unethical.
| ralfd wrote:
| What "user protections" were eroded at Twitter/X? Or do you
| just mean it became less woke?
| rrix2 wrote:
| just yesterday the owner of twitter was getting his employees
| to delete the accounts of posts he disagreed with
| lazyeye wrote:
| What do you mean?
| DonHopkins wrote:
| Pretending to be ignorant doesn't change the facts,
| reinforce your point, or bolster your ridiculous argument
| that Musk supports free speech. You couldn't possibly be
| more wrong, and pretending to be ignorant doesn't make
| you right.
|
| You have access to the same internet everyone else does.
| Look it up yourself instead of trying to argue with
| people who are paying attention and put in the time to be
| informed.
|
| https://www.axios.com/2024/12/27/musk-x-loomer-h1b-maga-
| veri...
|
| MAGA vs. Musk: Right-wing critics allege censorship, loss
| of X badges.
|
| A handful of conservative critics of Elon Musk are
| alleging censorship and claiming they were stripped of
| their verification badges on X after challenging his
| views on H-1B visas for highly skilled foreign workers.
| WillPostForFood wrote:
| You want to go on the record supporting Laura Loomer as a
| credible source? Nothing in the article about account
| deletions, and nothing but one notorious crank claiming,
| without evidence, they are being censored.
| DonHopkins wrote:
| Laura Loomer's credibility isn't the issue. It's already
| quite well established and non-debatable that despite
| hypocritically gaslighting and declaring himself a "Free
| Speech Absolutist", that he is so thin-skinned and anti-
| free-speech that he shadowbans, demonetizes, and kicks
| off many many people he doesn't agree with, and promotes
| and amplifies the White Supremacists and Nazis and
| racists he does agree with, and he doesn't support
| anyone's free speech except his own.
|
| The thing about him censoring Laura Loomer only
| illustrates what a ridiculous point it's gotten to. It's
| not his censorship and anti-free-speech she's complaining
| about, it's that it's now to a point that it finally
| applies to her. She's not against leopards eating
| people's faces, she's just against leopards eating HER
| face.
|
| If you still believe Elon Musk supports free speech
| because you're skeptical of Laura Loomer, you're just as
| gullible and ignorant and dishonest and unethical as she
| is.
|
| Of course, just like Musk and Loomer, you're not even
| arguing in good faith, since your own words prove you
| obviously didn't read the article. You said "Nothing in
| the article about account deletions, and nothing but one
| notorious crank claiming, without evidence, they are being
| censored.", but right up at the top the article clearly
| states that THREE people were complaining, and he's
| deleted or threatened to delete the accounts of several
| other people and organizations:
|
| >Driving the news: Trump's conspiracy-minded ally Laura
| Loomer, New York Young Republican Club president Gavin
| Wax and InfoWars host Owen Shroyer _all said_ their
| verification badges disappeared after they criticized
| Musk's support for H-1B visas, railed against Indian
| culture and attacked Ramaswamy, Musk's DOGE co-chair.
|
| And also:
|
| >He threatened to reassign NPR's account handle last year
| and marked some links to the site as "unsafe" when users
| click through.
|
| >Musk also removed the verification badge of The New York
| Times in 2023.
|
| >X also suspended independent journalist Ken
| Klippenstein's account after he shared Sen. JD Vance's
| vetting document from the alleged Iranian hack of Trump's
| campaign.
|
| And as someone who's not arguing in good faith, you know
| very well it's absolutely true Elon Musk doesn't support
| free speech, and the list of people and organizations
| he's banned or demonetized because he doesn't approve of
| THEIR free speech goes on and on, and there's nowhere
| near enough room in a typical article or attention span
| in a typical reader to list them all. You have a lot of
| nerve to be that blatantly dishonest in a discussion
| about ethics.
|
| But you're so intellectually lazy, you didn't even read
| the article you're facetiously pretending to have read,
| so don't demand other people write and read exhaustive 50
| page well researched detailed articles enumerating every
| fact and scrap of evidence for you, if you're too lazy to
| read a one page article yourself. Because you risk
| embarrassing yourself again by having your own words and
| the article's words quoted back to you in juxtaposition.
| metabagel wrote:
| She's not the only one reporting loss of blue check
| status, subscribers, and ability to monetize her account.
| warkdarrior wrote:
| How is that unethical? X is Musk's toy to do with as he
| pleases. You as a user of X need to understand that he has
| absolutely no responsibility to you as a user.
| kelseyfrog wrote:
| Any time there are consequences to actions, the matter of
| ethics arises. The very act of making decisions that have
| consequences demands responsibility. This is the reality
| of being human.
|
| Writing consequences off as meaningless, whether you or
| anyone else does it, is a moral abdication. The first
| thing an immoral person does is to justify the
| consequences of their actions as inconsequential. This
| happens to such a degree that doing so is a signal of
| immorality. Immorality doesn't look like choosing evil;
| it looks like choosing inconsequentialism.
| Uehreka wrote:
| (Regina George voice) So you agree? There are no "user
| protections" on X?
| WarOnPrivacy wrote:
| > How is that unethical? X is Musk's toy to do with as he
| pleases.
|
| I offer that rights aren't ethics. Musk has a reasonable
| right to censor speech on his platform that he doesn't
| agree with.
|
| However, when someone establishes themselves as a free
| speech absolutist, it is arguably unethical for them to
| remove, suppress and continually work to eliminate speech
| they disagree with.
| metabagel wrote:
| Your statement is basically "might makes right", which is
| antithetical to ethics.
| StanislavPetrov wrote:
| I'm no fan of Musk but this has been standard practice
| at Twitter since it was founded. They are just using
| slightly different "standards" to decide who to
| delete/suppress/shadowban.
| bdangubic wrote:
| you can't be that naive... he bought it apparently for
| "free speech reasons" which he repeats every chance he
| gets (even on the same day he is silencing his critics).
| Xi Jinping is more for free speech than Elon is :)
| metabagel wrote:
| Moderation isn't censorship.
|
| The previous site was pretty well moderated. The current
| site is pretty awful, and the site owner is capricious
| about meting out punishment to those who offend him. It's
| all personal, whereas before it was based on moderation
| policy.
| fsckboy wrote:
| what hypocrisy!
|
| it's like my alcoholic doctor telling me I need to cut my
| drinking: his advice may be sound, but it's rich coming
| from him.
|
| I'm referring to the people who denied or did not decry the
| previous Twitter administration deleting huge volumes of
| tweets they didn't like, the people who now populate Bluesky
| and the fediverse. They themselves are quite open about it,
| because it's a cozy little echo-chamber world where the
| people who disagree are erased from their view.
| wrs wrote:
| The previous Twitter administration was quite open about
| the censorship they were doing and the reasons they did
| it. You may not like the result, but at least they
| _tried_ to deal with their inherent conflict of interest
| (commerce vs. societal good) in a thoughtful way. The
| current one, on the other hand, constantly trumpets its
| free-speech absolutism while Elon tells the staff to
| delete whatever he wants whenever he wakes up in a bad
| mood, and artificially boost his own trolling.
| metabagel wrote:
| Moderation isn't censorship. Many comments are "killed"
| on this very website. You can see them with the "show
| dead" option.
| chaps wrote:
| Around the time I was born, my dad was in the army and was
| taught in an intelligence class that "financial problems" is
| one of the most exploitable facets of a person by nation
| states. I don't really know much about his work, but it sounded
| like his role was particularly at-risk from nation states
| trying to pull information from him.
|
| What's interesting though is that around that time we basically
| had no money and no support from the military! We lived in a
| roach-infested home and barely had money for groceries! It
| absolutely blows me away that my family could barely support
| itself considering the known-and-taught risks of such a
| situation.
|
| When he told me about that I asked him why they didn't pay the
| family more, considering the risks. He hadn't considered it
| even once before that conversation.
| lazyeye wrote:
| Couldn't disagree more about Elon. I'm just glad there was
| someone able to open up free speech again on a social media
| platform and reveal for all to see the level of censorship (by
| surrogacy) by the govt.I think we might be in a very dark place
| indeed if this level of govt corruption was allowed to persist
| for even a few more years.
| DonHopkins wrote:
| What a terribly unfortunate time for you to foolishly choose
| to die on that particular hill trying to defend Elon
| Musk's dedication to free speech. It says as much about your
| lack of situational awareness and tenuous grasp of reality
| and current events, as it does about Elon Musk's thin skinned
| hypocrisy and contempt for the free speech of anyone but
| himself.
|
| https://www.nbcnews.com/tech/social-media/elon-musk-
| accused-...
|
| >Elon Musk accused of censoring conservatives on X who
| disagree with him about immigration. The claims came after
| Elon Musk was involved in a public feud with some Republicans
| over immigration.
| trallnag wrote:
| Image boards like 4chan are quite free, even without Elon
| webdoodle wrote:
| > I'm just glad there was someone able to open up free speech
| again on a social media platform
|
| It ain't free if someone can buy it.
| bilbo0s wrote:
| Huh?
|
| There is still censorship by surrogacy. It's just that now
| they censor things you don't like being said, so you don't
| mind as much. For you it's not a problem as long as the
| people being censored have a world view or narrative contrary
| to your own. But that's not the same as being a free speech
| supporter.
|
| That's being a supporter of free speech for you. Not for
| anyone else.
| eh_why_not wrote:
| _> It's time we acknowledge that the H1-B visa program creates
| a similar dynamic... leaving them perpetually at risk and
| easily controlled._
|
| By associating this to the subject of the post, are you
| implying that the perpetrators of unethical tech in the U.S.
| are mainly foreign workers, and not "homegrown" citizens?
| kelseyfrog wrote:
| No, to interpret it in a way that suggests it's mainly
| foreign workers would be extrapolating beyond what I said. I
| believe the worker-employee dynamic is fundamentally
| unbalanced[1] in favor of employer leverage over employees. I
| simply believe that this same dynamic is exaggerated when it
| comes to H1-B workers. It's simply easier to examine a social
| relation when it's more apparent.
|
| 1. Workers' choice of employment does not come close to
| ameliorating the disadvantage. Every argument against this is
| a coping mechanism.
| eh_why_not wrote:
| Thanks for clarifying.
| vunderba wrote:
| From the article:
|
| > _But I have yet, until now, to point at the elephant in the
| room and ask whether it is ethical to work for Big Tech, taking
| all of the above into consideration._
|
| People often highlight "boycotting" as the most effective action
| an individual can take to drive change, but for those who work in
| tech, the most powerful message you can send is _denying your
| labor_.
|
| To me, this isn't even about whether "Big Tech" companies are
| ethical; it's a matter of ideological principle. FAANG companies
| already wield far too much power, and I refuse to contribute to
| that imbalance.
| parpfish wrote:
| Wouldn't it be even more effective to take a job there and
| provide them half-assed labor?
| baobun wrote:
| No.
|
| How do you think we got here?
|
| Turns out the network effect can compensate for a lot of
| incompetence and lethargy. Many (most?) big tech engineers
| are likely already cruising.
|
| Try to do something you actually believe is good instead of
| coping by telling yourself you are intentionally failing to
| do something bad.
| ChrisMarshallNY wrote:
| I would consider that unethical, and would not do it.
|
| For many reasons, I live a life of extremely rigorous
| personal ethics.
|
| I don't insist that others do the same, but I do need to
| protect myself from others that assume my ethical stance to
| be weakness.
|
| For example; I make it a point to always keep my word.
|
| Unethical folks that know this of me, are _constantly_ trying
| to get me to make commitments, without divulging the costs to
| me, or the boundaries of said commitments.
|
| It's my responsibility to make sure that I have full
| disclosure, before making a commitment.
|
| Many people become quite jaded and misanthropic, when faced
| with this. I tend to find it amusing, watching people try
| weaseling out of giving full information. Often, these
| efforts tell me more about things, than full disclosure up
| front will.
|
| I like people, and can call some really rapacious bastards
| friend. My ethical stance is truly entirely personal, and I
| have worked closely, with some spectacularly flawed people.
|
| Scott Adams _(He Whose Name Has Been Struck From The Lists)_
| wrote an extremely cynical book, called _The Way of the
| Weasel_, which is downright prescient.
| gryn wrote:
| A -100x engineer, straight from the CIA's sabotage field
| manual PDF.
| spencerflem wrote:
| that's different from half-assing though
| leeoniya wrote:
| > denying your labor.
|
| that's still boycotting. but it needs to be active rather than
| passive to send a message. not applying for work is not enough,
| you have to decline at offer stage on stated principles. i don't
| think most would go through that effort.
|
| only the highest level individuals who Big Tech tries to poach
| can do this without much time investment because they
| effectively have offers at first contact.
| swiftcoder wrote:
| I don't think anyone is obligated to go through the entire
| interview process just so they can decline the offer.
|
| Yes, you'd waste a few hours of some expensive engineers'
| time, and more hours of relatively cheap recruiter time - but
| sending the recruiter a big ol' fuck you on first contact
| gets the message across just fine.
| leeoniya wrote:
| not sure if recruiters will give a shit. they cast really
| wide, low-effort nets. until you get to offer stage you're
| a nobody, and to get to offer you usually have to expend
| non-trivial time/effort.
| pera wrote:
| You'd be surprised: Next time a recruiter from some big
| tech corp sends you an e-mail try explaining why you
| don't want to join them.
| int_19h wrote:
| I've had a recruiter try to convince me that working for
| Facebook isn't all that bad.
|
| But I'm not sure how much of this sentiment that they
| hear from people is actually routed to the companies
| themselves.
| tensor wrote:
| I don't think any of these things ultimately work. There will
| always be someone who will take the money or the deal.
| Advocating for regulation is the most effective way forward.
| But probably the first thing to advocate for is some notion of
| "equal speech," not just "free speech," otherwise it will be
| very hard to get any new regulation passed.
|
| By equal speech I mean that people should have equal
| opportunity to be heard and pitch ideas when it comes to
| political advocacy. If the rich can send a million messages for
| every one of yours, no one will ever hear or listen to you.
| simoncion wrote:
| > There will always be someone who will take the money or the
| deal.
|
| And even if there wasn't, that'd give them even MORE ammo to
| go screaming for easier access to H-1B and similar such
| imported labor.
| pera wrote:
| I agree. This is what I have been doing:
|
| - Respond with a template explaining why I don't want to work
| for Google|Amazon|Meta|Microsoft|Apple
|
| - Include information about some tech unions the recruiter could
| join and give reasons to do so
|
| - Talk to colleagues about concerns and what we can do to
| mitigate current power imbalances
|
| - Talk to family and friends about the industry, its impact on
| society, and provide help if they would like to try alternative
| technologies
| Gimpei wrote:
| I dunno. I'm kind of with the sentiment in his original column,
| or at least how he paraphrased it. I think it's naive to believe
| that you can bring about any real change in tech through moral
| suasion alone. The monetary payoffs are too large and there are
| too many people who will work for them no matter what. If you
| want to change some behavior that you find immoral, your best bet
| is to organize politically and pass laws.
| swiftcoder wrote:
| > there are too many people who will work for them no matter
| what
|
| BigTech was already struggling to hire the caliber of engineers
| they needed when I worked there (and I left 5 years ago), and a
| fair number of the best candidates were refusing on ethical
| grounds (in that era, mostly around Cambridge Analytica and
| Facebook's involvement in Myanmar, but also due to concerns
| about blatant marketing to teens).
|
| I don't think it's a given that these companies can maintain a
| staff of thousands of top-tier engineers as they sink
| themselves ever deeper into the various ethical quagmires.
| epgui wrote:
| I think you totally misunderstand what he is saying.
|
| You can't make people do things with ethics. That's not what
| ethics is for, and that's not what he is talking about.
| zitterbewegung wrote:
| The person is arguing whether it is good or bad to work for Big
| Tech. I wouldn't hate the players when you really should hate the
| game. Most of the populace is largely unaware of surveillance or
| why they are using products that have a negative influence.
| Refusing to work for Big Tech does not change this lack of
| education. The ACLU and EFF have tried advocating for privacy and
| supporting laws that would regulate large technology companies,
| but it isn't really practical when you can hire lobbyists for
| around $1 million, which gets nearly anything you want passed.
| Also, Big Tech may need fewer people to achieve its goals, so I
| think this post is too little and too late.
| swiftcoder wrote:
| > Also, Big Tech may need fewer people to achieve its goals
|
| Someone has drunk the ChatGPT-will-replace-$500,000-engineers
| koolaid, I see
| ivjw wrote:
| I agree with the author that questions of ethics are social
| optimization problems.
|
| > We must balance optimizing for oneself with optimizing for
| others
|
| Yet if each person optimizes for themselves, then the
| balancing is automatically taken care of. The invisible hand is
| even more free and dexterous on the social scale than the
| economic.
|
| > the belief in the magical power of the free market always to
| serve the public good has no theoretical basis. In fact, our
| current climate crisis is a demonstrated market failure.
|
| The power of the free market is at least as theoretically and
| empirically sound as the climate crisis.
| epgui wrote:
| I think this is naive, because a flaw of such free market
| thinking is its failure to price in externalities. That's what
| the reference to the climate crisis was about.
| ivjw wrote:
| How is the environment, which is directly of concern to the
| primary economic sector, and to the entire economic
| enterprise in the long run, an externality?
| epgui wrote:
| It's actually the classic textbook example.
|
| https://www.investopedia.com/terms/e/externality.asp
| ivjw wrote:
| Unless you are studying an a priori science, textbook
| examples are pedagogical simplifications. Yes, the cost
| of environmental pollution is paid neither by factories
| nor their customers, yet both suffer the consequences,
| and as such are not "uninvolved" with the third party, as
| the definition goes.
| machinestops wrote:
| The problem with optimising exclusively for oneself is that you
| definitionally optimise at the expense of others. Gaps are
| easily widened, and your balancing idea falls apart when the
| scales are tipped from the start.
| ivjw wrote:
| It is not as simple as "my profit" vs. "others' expense". The
| elegance of the invisible hand theory is that it also
| accounts for the cases where others' expense is my expense
| and others' benefit is my benefit just as well as the others.
|
| The scales sure can be tipped on the individual level, but
| you are only considering the "one individual vs. one
| individual" case. Many cliques of extreme power have been
| taken down by the weaker majority, which is also one of the
| processes contributing to the collapse of monopolies.
| machinestops wrote:
| No, it isn't, which is why I added "definitionally". Let's
| say we have a limited resource, X, that is beneficial to
| hold, and it is more beneficial to hold more of it. As it
| is limited, acquiring necessarily means depriving another
| of it. Assuming one has the means to acquire more without
| impacting oneself negatively, in which situation (taking
| optimising for oneself as a maxim) would you not seek to
| acquire more?
| ivjw wrote:
| None, but that's exactly the point. _Everyone_ would like
| to have more of it.
|
| This is a unifaceted way of posing problems, often also
| done with monopolization.
| machinestops wrote:
| Precisely. As such, those with increased capacity for
| access will deprive others of access. No balance of care
| forms. Your recommended ethic is what Kant wished to
| address with his categorical imperative.
|
| Of course this is a unifaceted way of posing a problem:
| it's a model, given we're dealing with philosophical
| ideas. I should hope that I needn't provide examples for
| the model, given the state of the world at present won't
| let you swing a cat without hitting one.
| ivjw wrote:
| You need not. It's evident to any reader that some models
| can take more into account without overloading, including
| the "access" variable you introduced ex post facto.
|
| What I suggested is an instance of Kant's categorical
| imperative: "Act by the maxim whereby you can at once
| will that it should become a universal law." The maxim in
| this case being "optimize for your own benefit."
| uikoleawrfgolmp wrote:
| > Is Big Tech supporting the public good, and if not, what should
| Big Tech workers do about it?
|
| The problem is not whether Big Tech supports something or not.
| The problem is that they have any opinion at all! The pitch is
| that they are "platforms" and "arbiters" who decide like the
| highest court. They should not have any opinions at all!
|
| This whole oligopoly needs to be dissolved!
| Townley wrote:
| > All of us must navigate the trade-off between "me" and "we." A
| famous Talmudic quote states: "If I am not for myself, who will
| be for me? If I am only for myself, what am I?" We must balance
| optimizing for oneself with optimizing for others, including the
| public good... To take an extreme example, Big Tobacco surely
| does not support the public good, and most of us would agree that
| it is unethical to work for Big Tobacco. The question, thus, is
| whether Big Tech is supporting the public good, and if not, what
| should Big Tech workers do about it.
|
| The duty to align your professional life ethically scales with
| your ability to do so. I personally don't cast aspersions on
| anyone working in tobacco farms or in a gas station selling
| cigarettes; they're just trying to get by. But if you're one or
| two levels up Maslow's Pyramid, it's right to weigh your personal
| needs against the impact of your work. You'll also be better off
| for it, knowing that the world would be worse off if you decided
| to switch gears and become a carpenter/baker/bartender/choose
| your adventure.
|
| I'll also say: there are ways to contribute morally outside of
| your 9-5. Volunteer to teach a neighborhood kid to code. Show
| your local sandwich shop how to set their hours online, or maybe
| even build them a cookie cutter Squarespace site. Donate a small
| fraction of your salary (e.g. 0.5% local, 0.5% global) to causes
| you believe in, and scale up over the years.
| equestria wrote:
| That last part makes me a bit nervous. It's dangerously close
| to the EA belief that it's actually OK to be a ruthless exec
| for a tobacco company, because you can do good things with your
| money that you wouldn't be able to do if you quit the job.
|
| I don't think that's the point you're making, but it's good to
| be careful with that. You can do good after hours, but it
| doesn't absolve you from what you're doing 9-to-5.
|
| As to your first point: yes, but it's all relative. Most tech
| workers are "trying to get by" in their minds. Just look at the
| SFBA rents and the PG&E bills! And wait until you hear about
| their college loans... most people in the top 1% don't think
| about themselves as the top 1%.
|
| In the end, making good decisions often requires sacrifice,
| pretty much no matter how much you make. And we often find ways
| to rationalize why it's not the right time for that.
| adamtaylor_13 wrote:
| But if you won't be the big tobacco exec, someone else will.
|
| So I actually agree with the notion that being the big
| tobacco exec and doing good things with your money, plus
| helping steer things from the inside is a better proposition
| than becoming a baker and letting someone who has NO moral
| qualms with tobacco run the ship.
|
| It's rarely as effective to push change from the outside as
| it is the inside.
| Ar-Curunir wrote:
| "If I don't work with the Nazis, someone else will, so I
| should be a good Nazi"
| marky1991 wrote:
| What would the alternative in this hypothetical be?
| I'm not clear what the argument here really is.
| kelseyfrog wrote:
| Are there any social norms that allow immoral CEOs to
| exist? What incubates an immoral CEO?
| labster wrote:
| Honestly a better question would be if there are any
| social norms that allow for a moral CEO to exist? Pretty
| much all of our norms are tilted towards producing
| immoral executives.
| lotsofpulp wrote:
| Unless you are suggesting selling tobacco is as unethical
| as torturing and murdering people of different tribes for
| the sake of them being in different tribes, I do not see
| what your point could be.
|
| Should people simply never be able to sell or consume
| tobacco? Even if one's consumption of tobacco does not
| negatively affect anyone else?
| rNULLED wrote:
| Nazis were political leaders. So yes, you should try to
| be a good political leader to prevent the growth in power
| of bad ones.
| davidgay wrote:
| https://en.m.wikipedia.org/wiki/Wilhelm_Canaris
| adamtaylor_13 wrote:
| "If I don't work for the Nazi's they will kill my family,
| so I will work for the Nazi's"
|
| There, I fixed your uninspired and incorrect anecdote.
|
| Big tobacco execs are quite literally killing absolutely
| no one. Last I checked they aren't sticking cigarettes in
| anyone's mouth. Personal responsibility for your own
| actions is unfortunately lacking in many discussions
| surrounding things like this.
| WD-42 wrote:
| I think the idea is that if all good people refuse to
| become a tobacco exec the pool of people willing to take
| the job will be small and full of bad people, eventually
| they will run the business into the ground and the problem
| solves itself. How well this works in practice is
| debatable.
| moolcool wrote:
| > But if you won't be the big tobacco exec, someone else
| will.
|
| In the public discourse, you'll often see CEOs and founders
| lauded as incredibly brilliant and rare. As soon as you
| start to talk about ethics though, they're suddenly
| fungible. "Someone else would run the orphan crushing
| factory if not for me"
| gruez wrote:
| >It's dangerously close to the EA belief that it's actually
| OK to be a ruthless exec for a tobacco company, because you
| can do good things with your money that you wouldn't be able
| to do if you quit the job.
|
| You're saying this as if it's a given, but why wouldn't this
| work?
| Hasu wrote:
| For the same reason people don't think it's OK to rob a
| bank and donate the money to charity.
| gruez wrote:
| That analogy fails because robbing a bank is
| straightforwardly illegal and norm-breaking (by the
| majority of the population), whereas being a tobacco
| executive isn't.
| layer8 wrote:
| Unethical behavior not being treated as norm-breaking
| unless it's illegal is part of what's being criticized
| here, I think.
| ClumsyPilot wrote:
| Not if you ask the younger generation
| worik wrote:
| > For the same reason people don't think it's OK to rob a
| bank and donate the money to charity.
|
| I have a problem with violence...
| Avicebron wrote:
| Because it's sociopathic at its core. I don't have time to
| pull up the HN back and forth where it was debated during
| the FTX stuff
|
| but basically it comes across as, "I am willing to
| sacrifice others (but not myself) to achieve my goals
| because I know better."
| gruez wrote:
| >but basically it comes across as, "I am willing to
| sacrifice others (but not myself) to achieve my goals
| because I know better."
|
| Since money is fungible but finite, basically any sort of
| donation decision involves sacrificing someone. Donating
| to fund malaria nets when you'd otherwise have funded
| your local little league team means you're in effect,
| sacrificing the local little league team. Moreover, by
| donating their own money, they're by definition
| "sacrificing myself".
| int_19h wrote:
| This line of thinking (and EA in general) taken to its
| logical conclusion results in stuff like LW's famous
| "moral dilemma" about torturing someone for 50 years
| being justifiable if it prevents sufficiently many people
| from the discomfort of having a speck in their eye.
| CobrastanJorji wrote:
| Because enabling evil on a large scale to pay for doing
| good on a small scale doesn't achieve net good.
| jefftk wrote:
| It wouldn't work because being a tobacco exec is just
| _really_ harmful: https://80000hours.org/2016/01/just-how-
| bad-is-being-a-ceo-i...
|
| Anyone who could do that job has many far better ways they
| could apply their career.
| Avicebron wrote:
| > But if you're one or two levels up Maslow's Pyramid, it's
| right to weigh your personal needs against the impact of your
| work. You'll also be better off for it, knowing that _the
| world would be worse off if you decided to switch gears_
| and become a carpenter/baker/bartender/choose your adventure.
|
| To highlight this part of the original in support of this
| comment: this comes off as somewhat arrogant and is a pretty
| big red flag...
| dvdkon wrote:
| If you've changed your career to support some goal, here
| the public good, isn't it natural to be strongly convinced
| that your work is advancing that goal?
| ajkjk wrote:
| What confuses me is how many people are evidently in the job
| of "ruthless exec" and then they do it amorally. I can't
| think of any time in my life that I've seen an exec say: no,
| we could do that, but we shouldn't because it's wrong. No
| doubt because anyone who acts that way gets naturally-
| selected out of the job.
|
| But also there seems to be a pervasive belief, which if
| anything feels way stronger than it was when I was younger
| (maybe because the moral-majority christian-nation vibes have
| fully disappeared, in the US at least? sure, it was always
| fairly hollow, but at least it was a thing at all), that a
| business leader is not _supposed_ to do moral things, because
| it 's not their job description; their job truly is "increase
| shareholder value on a 6-12 month timescale", and if they try
| to do something different they are judged negatively!
|
| So maybe there is in theory good to be done by being an exec
| and being more moral than average (maybe not a tobacco exec,
| but, say, in tech?). But the system is basically designed to
| prevent you from doing it? It almost seems as though the
| modern model of shareholder capitalism is designed to keep
| things this way: to eliminate the idea at any point that a
| person should feel bad if they just do the "efficient",
| shareholder-value-maximizing thing. Nobody has any agency in
| the big machine, which means no one is accountable for what
| it does. Perfect, just how we like it? Whereas at least a
| private enterprise which is beholden to the principles of its
| leader could _in principle_ do something besides the most
| cynical possible play at every turn.
| navane wrote:
| There seems to be a new system in place which takes these
| amoral CEOs and does make them accountable.
| Earw0rm wrote:
| Exec., meet exec.?
| hackable_sand wrote:
| It's the truth, and we've had these systems since the
| dawn of civilization. Idk why people are acting surprised
| now when we've been doing this for thousands of years.
|
| If people in power don't provide and protect a democratic
| process for removing poor leadership then they _do not_
| get to complain when people make those decisions on their
| own.
| Earw0rm wrote:
| Financial companies figured out how to do this in the run-
| up to the GFC, and everyone else learned it from them in
| the immediate aftermath.
|
| "They did all that, and literally none of them went to
| jail? We got to get us some..."
|
| Post-2008 tech companies were built that way from the get-
| go.
| ajkjk wrote:
| I think it's useless to believe that the explanation
| behind everything is "greed". It's so easy to blame
| greed; it's amorphous and meaningless; it gives you
| nothing you can do; it's the logic of a people who are
| sure nothing can change, that the way things are is
| inherent: the rich are greedy, the bad things in the
| world are powerful people taking advantage of us for
| benefit, sad for us.
|
| It seems pretty clear that the forces at work are
| designed to incentivize, reward, and rationalize "greed",
| and so if one just does their job, so to speak, they will
| end up doing the greedy thing at every turn. And really
| we are fine with it! -- what we value more than anything
| is value creation (on paper). No matter if the actual
| world is getting worse as long as it appears to be
| getting better: the economy/investment accounts/stock
| grants are going up.
| int_19h wrote:
| I think the cause and effect here are reversed. Thing is,
| in a society like ours, you pretty much have to be a shitty
| human being to become a CEO of anything even remotely big.
| It inevitably requires walking on heads and abusing people
| to the extent that no moral person would be comfortable
| with.
|
| So we have a system that puts selection pressure on
| economic elites to be sociopathic. And then those same
| people write the books on "how to be a good CEO" etc, so
| _of course_ they are going to say that you 're not supposed
| to do things that they themselves don't do.
| jefftk wrote:
| _> It 's dangerously close to the EA belief that it's
| actually OK to be a ruthless exec for a tobacco company,
| because you can do good things with your money that you
| wouldn't be able to do if you quit the job._
|
| That's not an EA belief. While EAs have made arguments
| somewhat in this direction, being a tobacco exec is just
| incredibly harmful and no one should do it:
| https://80000hours.org/2016/01/just-how-bad-is-being-a-
| ceo-i...
|
| (80000 Hours is the primary EA career advising organization)
| wat10000 wrote:
| The obvious answer is to be a tobacco exec, sabotage the
| organization from within, _and_ donate to charity.
| light_hue_1 wrote:
| Yeah, it's definitely an EA belief! If you look at the end
| of the article they show you a link to a response on the EA
| forum.
|
| https://forum.effectivealtruism.org/posts/4N5BsDkcWjr5MRSQy
| /...
|
| EA is one of the most evil ideologies out there.
| jefftk wrote:
| The post you're linking to is not arguing that you should
| become a tobacco exec, it's arguing that 80k has not
| sufficiently made the case that a tobacco exec who
| donated all their income thoughtfully would still be
| causing net harm.
|
| Reading both articles, I think it depends a lot what
| strategy the exec employs. If they optimize for getting
| people to become addicted to smoking or increase how much
| they smoke (growing the market) then I think it's really
| unlikely they could donate enough to make up for that
| enormous harm. On the other hand, if they optimize for
| increasing profitability by increasing prices and
| advocating for regulation that acts as barriers to new
| entrants, and especially if the person who would
| otherwise have the role would be optimizing for growing
| the market, then it's likely their work is positive on
| its own, regardless of donating.
| int_19h wrote:
| So, you're saying that from a EA perspective, it _can_ in
| fact be okay to be a tobacco executive. QED.
| jefftk wrote:
| What matters is the difference between how the world
| would be with your actions and how it would be otherwise.
|
| Would you also say "so you're saying it's ok to be a
| member of the Nazi party who runs a munitions factory
| [1], QED"?
|
| [1] https://en.wikipedia.org/wiki/Oskar_Schindler
| femiagbabiaka wrote:
| Morality is completely subjective. Prior to certain events in
| the last year I would've said that there were some objective
| standards like minimizing harm to children, but that's out the
| window now, with most of Big Tech implicated.
|
| As a moderately less contentious example, Alex Karp argues
| fervently that it is immoral to not produce weapons of war for
| western countries and the U.S. in particular. Many people agree
| with him. Ultimately people justify their method of making a
| living in whichever way they choose, and tech workers are no
| different. History is the log of the winners and losers of the
| war between the adherents of different moral codes.
| gruez wrote:
| >Prior to certain events in the last year I would've said
| that there were some objective standards like minimizing harm
| to children, but that's out the window now, with most of Big
| Tech implicated.
|
| You're saying this as if it's indisputable that "Big Tech" was
| harming children, but we're nowhere close to that. At best,
| the current literature shows a very weak negative
| correlational relationship between social media use and
| mental health. That's certainly not enough to lambast "Big
| Tech" for failing to abide by "objective standards like
| minimizing harm to children".
|
| Moreover I question whether "objective standards like
| minimizing harm to children" existed to begin with, or we're
| just looking at the past with rose-tinted glasses. During the
| industrial revolution kids worked in factories and mines. In
| the 20th century they were exposed to lead and particulate
| pollution. Even if you grant that "Big Tech" was harming kids
| in some way, I doubt they're doing it in some unprecedented
| way like you implied.
| femiagbabiaka wrote:
| I'm certainly going to get downvoted for this, but I'm
| referring to the use of computing resources for AI
| surveillance systems used in target selection in Gaza. That
| alongside the fact that Microsoft, Amazon, Google, NVIDIA
| etc. all vie for contracts with militaries domestic and
| global, implicates a large chunk of all tech workers in
| global strife.
| the_af wrote:
| > _" It is immoral to not produce weapons of war for western
| countries and the US in particular"_
|
| I cannot imagine that a substantial "many" people believe
| this. How does it work exactly? If you have any expertise
| even adjacent to weapons building (e.g. being a programmer)
| and you are not building weapons for the US due to a lack of
| effort (as opposed to failing the interview) you're doing
| something immoral?
|
| I don't think many would agree with this. I suppose his
| stance is somehow more nuanced? (I wouldn't agree with it
| either, but at least it would be slightly more reasonable).
| femiagbabiaka wrote:
| https://youtu.be/EZLr6EGGTPE?si=5ome3QBCQk20hpJD
|
| This describes it fairly well, although I was thinking of a
| CNBC interview in particular. He does so many that it's
| hard to catalogue.
|
| The argument is roughly that "the west" and "western
| morality" are critical institutions to be protected, and
| refusing to protect them is immoral.
|
| And yes, a lot of people support his ideals. Major chunks
| of the tech investment class, thousands of workers at
| Palantir, the U.S. State Department, the Acela corridor,
| etc. It is probably a minority viewpoint amongst normal
| Americans, but we're talking about tech workers here. :)
| the_af wrote:
| Well, ok, people in the defense industry would agree it's
| not immoral to make weapons, and the more extremist may
| even call it immoral not to make weapons (though I doubt
| many would, this is an extreme view. I also wonder if
| it's truly heartfelt or simply convenient while they hold
| defense industry jobs, and forgotten when they start
| working elsewhere).
|
| It doesn't follow at all that the best way to defend
| Western institutions is to build weapons.
|
| (Yes, I realize these aren't your views and that you're
| merely describing them. But this Alex Karp guy isn't here
| to debate directly with him...)
| femiagbabiaka wrote:
| I think Karp would say that events like 9/11 or 10/7
| represent attacks on the west by vicious enemies who
| can't be negotiated with, and that the only way to defend
| ourselves is to build weapons and surveillance systems
| that outstrip their capacity to harm.
|
| To your point about his beliefs not being mine, I think
| he has a fundamental misunderstanding of how both of
| those events happened, which is ironic, because the
| prelude and aftermath of both attacks are revisions on
| the same theme.
| the_af wrote:
| I just googled who Alex Karp is and... well, he has a
| vested interest in DoD applications. Of course he'd say
| this. A businessman telling us his business model is a
| moral imperative...
| wazoox wrote:
| > The argument is roughly that "the west" and "western
| morality" are critical institutions to be protected, and
| refusing to protect them is immoral.
|
| "The West" as a collective lost all of the moral high
| ground it was supposed to have during the past few
| decades and particularly last year.
|
| https://mearsheimer.substack.com/p/the-moral-bankruptcy-
| of-t...
| metabagel wrote:
| The moral high ground was lost when the U.S. and its
| allies invaded Iraq on a pretext.
| femiagbabiaka wrote:
| Agreed.
| devjab wrote:
| Morality is not completely subjective.
| https://www.journals.uchicago.edu/doi/10.1086/701478
| femiagbabiaka wrote:
| From the abstract this is a very interesting paper. I'll
| spend this afternoon digging in. But I see a problem
| already: reality trumps academic exercise or humanity's
| aggregated self-description of its morality.
|
| "Morality-as-cooperation draws on the theory of non-zero-
| sum games to identify distinct problems of cooperation and
| their solutions, and it predicts that specific forms of
| cooperative behavior--including helping kin, helping your
| group, reciprocating, being brave, deferring to superiors,
| dividing disputed resources, and respecting prior
| possession--will be considered morally good wherever they
| arise, in all cultures."
|
| Who is kin? Who are one's superiors? What is prior
| possession? These are all questions of ideology and power.
| The only universal code all humanity agrees on is might
| makes right.
| labster wrote:
| Everyone loves helping kin except when helping kin on the
| public dime. Morals are funny that way.
| jwarden wrote:
| > The only universal code all humanity agrees on is might
| makes right.
|
| This is a cynical and unjustifiable claim.
|
| Obviously some people disagree. In fact, in my experience
| people almost universally agree that might does not make
| right.
| femiagbabiaka wrote:
| Cynical? Maybe, I think you're right on that point.
| Unjustifiable? Look at the current and historical state
| of humanity and our global institutions.
| worik wrote:
| I did development work for casino bosses.
|
| Clearly immoral. IMO more so than weapons.
|
| I _really_ needed the job
| avmich wrote:
| > The duty to align your professional life ethically scales
| with your ability to do so.
|
| I think this implies that we should all aim for everybody to
| have those abilities. That is, if somebody is unable, in
| this sense, to be ethical because he's just trying to get by,
| it's actually our problem - e.g. he sells cigarettes and that
| harms us. So we need to some extent work on the goal of
| everybody having abilities to live ethically.
| plagiarist wrote:
| There's a lot of beneficial things that might happen if we,
| as a society, worked at helping the Invisible Hand manifest.
| Especially if we also ceased putting so much effort into
| fighting it.
|
| One of the basic tenets of capitalism is that the exchanges
| are all voluntary. In practice they are quite clearly not.
| int_19h wrote:
| That's a basic tenet of the free market, not of capitalism.
| The two are not the same.
| amelius wrote:
| Yes, we could tax higher incomes until we've reached that
| goal.
| mmooss wrote:
| Tangentially, the solution would not be higher 'income
| tax', but higher capital gains tax.
|
| The confusion largely is that 'income tax' is really a
| '_wage_ tax'. Income common to wealthy people with lots of
| capital is return on their capital investment, which is
| excluded from that tax.
| songqin wrote:
| I can fill in the blanks in my head, but I doubt they are
| what you're thinking. Would you mind elaborating on the
| cause/effect you have in mind? It is difficult for me to
| imagine this in and of itself being successful. We would
| also need to solve the allocation of those collected funds,
| as in many countries it would likely go to welfare,
| defense, corruption, etc.
| abyssin wrote:
| I made the choice to change my occupation for a more moral one.
| One issue is, you lose a lot of social credit doing so. It's
| seen as a personal failure rather than a choice. It might also
| be that implicitly challenging their choices makes people
| uncomfortable.
|
| How do you meet people who take responsibility for their life
| design?
| guerrilla wrote:
| > One issue is, you lose a lot of social credit doing so.
| It's seen as a personal failure rather than a choice.
|
| Hmmm, why?
| aziaziazi wrote:
| When someone shares their choice, listeners naturally relate
| and compare it to their own choices. If someone is putting
| morals or ethics before money (a very common first
| criterion), the listeners who didn't will feel judged even
| if the speaker didn't say anything about them. It's a
| natural but uncomfortable reaction that triggers defensive
| mode, and it can translate into judging back the one who
| tries to do a good thing.
|
| Vegans experience this often.
|
| Edit: the "ethical choicers" can reduce such behavior with
| carefully controlled communication (if someone has more
| tips please share!):
|
| - Don't say "ethic" but "my ethics"
|
| - Keep it concise; don't give details if not asked
|
| - Change subject as soon as you feel the listener is
| uncomfortable
|
| - Say ASAP that you're not trying to convince or change
| anyone
| guerrilla wrote:
| This doesn't seem at all to be what the GP was referring
| to in the part I quoted. How is it seen as a personal
| failure on their part?
| aziaziazi wrote:
| My bad, I missed the 'personal' part of 'personal
| failure'. I think my comment is pertinent regarding the
| first phrase you quoted.
| zugi wrote:
| Most people want to judge others and rationalize their own
| behavior, while piling on to whatever views happen to be
| popular at the time.
|
| What's worse, working for Big Tobacco, or working for Big Tech,
| or working for the DEA and spending your days forcefully "civil
| forfeituring" innocent people's money without charges? The
| former are at least taking money from people who voluntarily
| surrender it in exchange for some service, with fairly good
| knowledge of what they're getting themselves into. While the
| latter are basically highway robbers. Yet society has chosen to
| popularize the first one as immoral, and is now working on
| vilifying the second, with only scant mention of the third.
|
| I'm sure I'm guilty of selective outrage myself. If we're going
| to quote religious references, how about Christ admonishing
| those who point out the speck in their neighbor's eye, while
| ignoring the log in their own.
|
| More focus on one's own morality, and less on judging others,
| just might make the world a slightly better place.
| andrepd wrote:
| Highly highly disagree. It seems to me the opposite!
|
| People (incl. here) want to rationalise their behaviour by
| giving excuses -- such as the very popular "but X is even
| worse and people don't complain about it" that you yourself
| are doing -- for the fact that they work on unethical stuff,
| because the honest answer is simply "this pays cartloads of
| money, fuck you got mine", which is unpalatable to their own
| self-perception.
| zugi wrote:
| Sure, yet you're exemplifying the "judge others" stuff by
| calling what others do unethical, and judging without
| evidence that they only do it because of the money, and not
| because their moral world-view differs from yours.
|
| I guess we're all guilty.
| kstrauser wrote:
| It's at least plausible that someone at the DEA genuinely
| wants to make a nicer, safer world for themselves and their
| neighbors. Yes, the agency does the terrible things you
| mention, but it also gets some horrific stuff off the
| streets. (Think fentanyl and meth, not weed. I couldn't care
| less about that.)
|
| No one working for Big Tobacco thinks they're making the
| world better unless they're an idiot.
| zugi wrote:
| True, and I'm sure many, even most, folks working for Big
| Tech want to make the world a better place.
|
| We likely disagree about the merits of the DEA's War To
| Destroy the Lives of American Meth Users. That's a topic
| for another post perhaps, but the point is people have
| wildly different moral frameworks.
|
| I'm sure there _are_ people working for Big Tobacco who
| think they 're making the world better by helping people
| enjoy themselves. Heck, some people who work in online
| gambling, or sports betting, or run state lotteries, or
| make ice cream, might even believe that!
| riehwvfbk wrote:
| Some people who preach an ascetic and parsimonious way of
| life and judge the choices of others probably also think
| they are making the world a better place, one all-work-
| and-no-play comment at a time ;)
| mtlmtlmtlmtl wrote:
| > Yes, the agency does the terrible things you mention, but
| it also gets some horrific stuff off the streets. (Think
| fentanyl and meth, not weed. I couldn't care less about
| that.)
|
| Do they, though? Some of it, sure, but enough to make a
| positive impact? Probably not. Indeed, efforts to get drug
| X off the street often lead to a proliferation of more
| dangerous drug Y. There's plenty of reason to believe the
| DEA is only making things worse and causing more deaths.
| ChrisMarshallNY wrote:
| I worked for almost 27 years, for a company that aligned with
| my personal morals. The pay was substantially less than what I
| could have made at less-circumspect outfits, and there was a
| nonzero amount of _really annoying_ overhead, but I don't
| regret it, at all. I slept well at night, made good friends,
| never wrote any software that I regretted, learned _heaps_ of
| stuff, and helped to develop and launch the careers of a few
| others.
|
| Mentioning that here, elicits scorn.
| kstrauser wrote:
| Not from me, it doesn't. That's enviable and I'm glad to hear
| you were able to have that.
| dgfitz wrote:
| I imagine the scorn would occur if you planted a flag about
| how the company morals aligned with your own.
| ChrisMarshallNY wrote:
| Actually, this is the first time I've done that.
|
| I pretty much enjoy the world I live in. That upsets some
| folks.
| talldayo wrote:
| I don't think people are scornful of your work. It makes me
| happy to hear that people still find meaningful employment
| within their means of living. It's increasingly rare that
| someone is paid to do something impactful these days. You
| should feel happy.
|
| The part that will attract scorn is pretending that
| _everyone_ can do that. In the same way that religion spread
| by preying on the poor and lecherous portions of society, so
| too does the tech industry offer the downtrodden and
| mistreated a better life in exchange for moral leniency. It's
| not even the "revenge of the nerd" stuff past a certain
| point - if a $60,000/year software engineer in America turns
| up their nose to a contract, you can simply send it to a
| development firm in Pakistan for pennies on the dollar and
| get roughly equivalent results. There is no moral bartering
| with at-will employment. It's an illusion.
|
| As individuals, you and I are both powerless to stop the
| proliferation and success of harmful businesses. America's
| number one lesson from the past 4 centuries of economic
| planning is that laissez-faire policy does not course-correct
| without government intervention. Collective bargaining only
| works when you're bargaining on a market you control -
| boycotting certain employers is _entirely ineffective_ when
| you compare it to legislative reform.
|
| So, with that being said, saving your dignity is not enough
| to save society. You have every right to take comfort in
| working a job that you respected - but nobody here owes you
| any more respect than their dairy farmers or the guy in
| Thailand that made their $55 Izod sweatshirt. If you come
| around expecting the hero treatment, then you're bound to
| feel shortchanged. Sorry.
| wizzwizz4 wrote:
| > _if a $60,000/year software engineer in America turns up
| their nose to a contract, you can simply send it to a
| development firm in Pakistan for pennies on the dollar and
| get roughly equivalent results._
|
| People who live in Pakistan are also capable of making
| moral decisions, you know. Your argument only holds if
| there are infinitely-many people in some kind of idealised
| labour market, but in the real world there are less than a
| million people capable of that kind of work.
|
| If you plan to take an immoral job and then work-to-rule
| while sabotaging the evil schemes, charismatically
| deflecting all blame to those who _were_ trying to make it
| succeed (or, better still, keeping the organisation as a
| whole from _understanding_ that their plan has been
| sabotaged), then that's a different question, and I'd wish
| you the best of luck. (Not that such a person would be
| bragging about it here, anyway.)
| BurningFrog wrote:
| Given how easy it is to recruit contract killers all over
| the world, I think any unethical software with money
| behind it will be built. Maybe paying some premium
| for the worst stuff.
| wizzwizz4 wrote:
| It's easy to recruit a hitman, but hard to recruit a
| _competent_ hitman. (See: the subcontracting hitmen in
| 2019.) And killing people is, in general, much easier
| than writing software.
| problemsolver12 wrote:
| Can confirm.
| staunton wrote:
| You're saying that you've killed people for money? Or
| that you hired killers before?
| ChrisMarshallNY wrote:
| And you think I posted that, expecting "hero treatment"?
|
| That's the problem, right there, I guess. We can't even
| mention things that should not elicit anything much more
| than "That's nice," without someone thinking that it's
| tubthumping. I wasn't inviting criticism of my decision.
| Sorry.
|
| Sometimes (most times, actually), I post stuff, just to say
| "Me too," or "Here's my experience with that. Maybe it
| might help." I'd like to think that it helps others to
| maybe feel less alone, in their world.
|
| People mention that they do stuff, _all the time_, here,
| with the direct expectation of being lauded and cheered. In
| many cases, I'm really happy to laud them, and cheer them
| on. There's some cool stuff that goes down, here.
|
| I'm not really into that kind of thing, for myself. I'm
| retired, and follow my own muse. I've made some big
| impacts, but not really ones that most folks here would
| care about. What people here, think of me, doesn't really
| matter that much. I'm just not that important, and most
| folks here, aren't as important as they might think they
| are. We're all just Bozos on this bus. I have a fairly rich
| social life, and have a lot of people that like me (and,
| also, dislike me), because they _actually know me_.
|
| People also post some stuff that reveals some fairly warped
| and mutated personal worldviews. Most times, I just ignore
| that. I don't think attacking someone in public does much
| to help the world; especially in a professional context
| like HN.
|
| We live in a strange society.
| talldayo wrote:
| > And you think I posted that, expecting "hero
| treatment"?
|
| I mean, yeah. This is absolutely something that should
| make you feel wonderful as an individual, being able to
| help people that are aligned with your moral
| understanding. But it's also something you can't exactly
| share - you'll never communicate the happiness other
| people felt from your assistance, and you're almost
| certainly not going to find people that universally
| respect your own moral compass. On the flip side, there
| are people with extremely perverse senses of justice that
| consider murder and automated attacks on civilian
| populations to be an unparalleled moral imperative - I've
| seen them right here on HN.
|
| It's your life, I can't tell you how to live it. My point
| is to tell you why people _everywhere_ will bristle at
| that type of rhetoric, the holier-than-thou "this is how
| we transcend suffering" memoir written by hands that
| spent more time touching a smartphone than doing manual
| labor to feed a family. If you are in a position where
| you are emotionally, financially and politically secure
| enough to sponsor a life that you are satisfied with
| living, then your satisfaction begins and ends with you.
| It's like announcing your valiant donation to charity on
| a public soapbox - to whom does it serve? Will you be
| donating the soapbox to charity too?
|
| Look out on the world as it is today, and you'll see a
| society of people that reject causal opportunity and
| change. We don't boycott companies when they send death
| squads to kill dissident plantation workers because their
| products taste too good. We can't boycott our tech
| companies when they drive margins low enough to install
| suicide nets and sell user data for profit, because the
| immediate access to porn and Facebook is too enthralling.
|
| You're a little guy, a cog in that great big machine. If
| you know that playing your part had great impact on the
| world, then it should bring you a profound sense of
| personal justice. The part that makes people scornful is
| when you zoom out and look at the machine, then conclude
| "we should all be cogs, imagine how much more efficient
| the whole thing would run!" Many of us aren't made of
| steel, and have too few spokes to fill the same role that
| you do.
| ChrisMarshallNY wrote:
| _> holier-than-thou_
|
| All I said, was that I worked for a company for a long
| time, was basically happy, the work environment was not
| perfect, I found their ethics attractive, and don't have
| any regrets.
|
| We live in a _really_ sick world, if that can be
| interpreted as "holier-than-thou." I know _dozens_ of
| people, personally, that can say _exactly_ the same
| thing. They don't consider themselves "special," and I
| don't really care that much. Almost none are in the tech
| industry, though, so maybe that's the difference.
|
| I also know a lot of folks that work at jobs they hate;
| often, for big money. I don't waste time judging them,
| and am just happy to have them in my life.
|
| I tend to avoid folks that are actively trying to be
| unethical, but I'm not on a mission to convert them. If
| they ever want to do things differently, I might have
| something they could use.
|
| It's sad to think that someone, saying what I did, is
| somehow "wrong." It's really not a big deal.
| drewcoo wrote:
| > And you think I posted that, expecting "hero
| treatment"?
|
| Seemed more masochistic to me. Different strokes for
| different folks.
| ncr100 wrote:
| This is likely a misinterpretation.
|
| It's not "pretending" or seeking "moral leniency" for
| individuals to use their agency to identify the potential
| for meaningful work, even within constraints. Recognizing
| the impact of work, and making conscious choices about how
| one contributes is more the point.
|
| There exist systemic exploitations of labor certainly.
|
| On being the change ...
|
| It is not heroic idol-seeking to share one's experience,
| nor to ask others to consider the values dimensions of
| their work.
|
| Even on a small scale, change can be made. It's worthy to
| highlight it, and moreover celebrating good can motivate
| values based thinking in others.
| robertlagrant wrote:
| > Mentioning that here, elicits scorn.
|
| No it doesn't. "Woe is ethical me" comments like this might.
| ChrisMarshallNY wrote:
| Seriously?
|
| I've not really mentioned ethics, before.
|
| The scorn is for working for one employer for that amount
| of time.
| phil21 wrote:
| From a casual observer who (used to) mostly lurk, it
| absolutely does.
|
| Maybe the tide has been turning the past few years but it
| was endemic from my point of view a decade ago when I first
| started reading HN.
|
| Folks who didn't chase career maximization were typically
| treated like naive children at best. Working for a third of
| the wages in some flyover state at a boring company vs some
| adtech company with an options package was panned on the
| regular.
|
| It was always part of the zeitgeist you switch jobs early
| and often to maximize your career progression vs. chill
| with the same company for most of your life.
| ghaff wrote:
| I'm not sure it was so much of an ethical statement as
| you should be switching jobs every couple years to
| maximize your paycheck.
|
| Of course, now, it's more about being happy to have a
| well-paying job as opposed to working a "full-time" job
| with two or three paychecks from different companies.
| globalnode wrote:
| My opinion after reading HN for quite a while is that your
| average HN poster is well educated and knows a lot of theory
| but struggles with ethics. Perhaps even seeing a debate
| against ethics as a game to be won.
| obscurette wrote:
| Almost same here, but I also understand that it's a luxury.
| At this point in my life I can afford it, but I've also seen
| times when I would do anything to get food on the table for my
| family. Luckily for me those times didn't last.
| analog31 wrote:
| My counter argument is that the US is already the most
| charitable country in the world in terms of private
| contributions, yet there are maybe 10-20 countries where the
| common people are better off than we are (note the large error
| bar there). I speculate that private charity versus private
| anti-charity is like bringing a knife to a gun fight.
| CobrastanJorji wrote:
| Yes. However, just like someone considering dinner might
| falsely convince themselves 'I will eat this broccoli and this
| cheesecake and it will balance out to mostly healthy,' teaching
| some neighborhood kids to code won't ethically offset evil
| professional work, nor will donating a trivial fraction of your
| share of the ill-gotten proceeds.
| calf wrote:
| I think Moshe is right but chose a really poor analogy in Big
| Tobacco - I want to say because working in tech is not at all
| like a farmer working laboriously in a physical field, which is
| a lot less ideological and more driven by being in a poor 3rd
| world country, etc.
| hereonout2 wrote:
| > knowing that the world would be worse off if you decided to
| switch gears and become a carpenter/baker/bartender/choose your
| adventure.
|
| I don't understand what you're weighing this against? A job
| that is literally saving lives maybe, or really leading in a
| field of science or technology?
|
| Most of us don't have that though, even here on hacker news.
| Most of us are part of a larger effort that will progress just
| as well without us; our personal impact is marginal at best.
|
| I've worked in tech for two decades for a company I deem
| "moral" and I feel I've had impact. But I could have fitted
| kitchens or made wedding cakes for that time and had just as
| positive an impact on the world and people I serve
| professionally. Hell, if I was a carpenter my work could
| probably outlast anything I've done in tech.
| BurningFrog wrote:
| This is a point too rarely made.
|
| Most work that produces something people are willing to pay
| for _does_ make the world a better place!
|
| Not enormously so for the vast majority of us, but what one
| person out of 8 billion can do.
| bumby wrote:
| I'm not sure I buy the premise as it reads as a Libertarian
| pipe dream. There are just too many examples of people
| willing to pay for something immoral or unethical to think
| that transactions can be broadly painted as a net good.
|
| Capitalist transactions are a reflection of value systems
| and our own shortcomings/biases. To the extent that humans
| are flawed, many of those transactions are going to be
| ethically flawed as well.
| hackable_sand wrote:
| You are in agreement with the comment you're responding
| to.
| bumby wrote:
| Being generous, I'm maybe in agreement if the word "most"
| gets some clarification or nuance.
| hereonout2 wrote:
| There's an aspect of longevity to our impact I love to
| contemplate.
|
| For most of us, our tech work will be long forgotten and
| obsolete 20 years from now. At best it will have provided
| some small intangible advance - hopefully for the better.
|
| But the people that built my house died before I was born,
| yet their work has a tangible ongoing impact to this day.
|
| The people who built some European cathedrals lived over
| 800 years ago, yet that padstone laid by some nameless
| apprentice still holds an entire functional building in
| place.
| quotemstr wrote:
| > The duty to align your professional life ethically scales
| with your ability to do so.
|
| Humans respond to incentives. We seek rewards that may be
| monetary, social, or intellectual: we optimize our behavior for
| them all the same. Trying to improve the world by scolding
| people for acting according to their incentives will not work.
| It's not a serious position. "If everyone would just..." ---
| no, everyone is not going to just, and if they were, they'd
| have already done it. Your exhortation will make no difference.
|
| If you want to change the world, change the incentive
| structure. Don't expect people to act against their personal
| interests because you say so. At best, they'll ignore you. At
| worst, they'll maliciously comply and cause even more harm.
| layer8 wrote:
| One's conscience is part of one's incentives. And talking to
| people can actually affect their conscience. Public discourse
| like the one taking place here is part of the factors that
| can cause cultural shifts.
| WalterBright wrote:
| Trying to stop people from growing and selling marijuana didn't
| work out so well.
| CharlieDigital wrote:
| > there are ways to contribute morally outside of your 9-5.
| Volunteer to teach a neighborhood kid to code.
|
| This simplistic view of the world does not scale -- especially
| so in today's global economy. Imagine we never had public
| education and instead relied on the good nature of individuals
| to teach their neighborhood kids. Imagine competing at a global
| level without a coordinated educational system with baseline
| standards. Instead, what we need is to teach _every kid_ how to
| code (many may not end up coding as a profession, but that's
| fine; every kid that has the affinity and talent and wants
| to do it should have the chance).
|
| That's nominally why we have government of the people, by the
| people, for the people. That's why we have taxes. _These scale_
| when the interests are aligned. We've seen them scale.
|
| The problem arises when (as Mitt Romney famously expressed) we
| think of corporations as people, too, and assign them rights
| associated with personhood.
|
| They are of "some" people, by "some" people, for "some" people.
|
| This is the crisis I think the US is having now. This is what
| I think was punctuated by COVID; there is no longer the
| spirit of "we" and the US is in the era of "me".
| bhouston wrote:
| And he doesn't even get around to mentioning that Google (and
| Amazon) are providing AI computing to Israel even though Google's
| own lawyer warned that they could be used to violate human
| rights. Their lawyers wrote: "Google Cloud services could be used
| for, or linked to, the facilitation of human rights violations,
| including Israeli activity in the West Bank."
|
| It gets worse, they got advice and then didn't follow it:
|
| "Google reportedly sought input from consultants including the
| firm Business for Social Responsibility (BSR). Consultants
| apparently recommended that the contract bar the sale and use of
| its AI tools to the Israeli military 'and other sensitive
| customers,' the report says. Ultimately, the [Google] contract
| reportedly didn't reflect those recommendations."
|
| https://www.theverge.com/2024/12/3/24311951/google-project-n...
|
| The end result is Lavender which HRW details here:
| https://www.hrw.org/news/2024/09/10/questions-and-answers-is...
| arghandugh wrote:
| Big Tech subverted the world's longest running democracy and
| tipped a majority of the global population into authoritarian
| rule. An essay handwringing the question doesn't seem very useful
| at this point.
| lazyeye wrote:
| The govt was changed despite big tech not because of it. And
| the majority of people disagree with your characterization of
| "authoritarian rule".
| metabagel wrote:
| Trump didn't win a majority of those who voted, let alone the
| idea that a majority of the country supports him.
| rexpop wrote:
| In light of these contexts, what would you rather see at the
| top of HN?
| richrichie wrote:
| Is that why people overwhelmingly voted for change in 2024,
| ironically to bring back the "dictator"? Popular mandate, all
| of the swing states, majority of governorships, house and the
| senate - seems as decisive a democratic choice as it can get!
| int_19h wrote:
| FWIW, Putin was also overwhelmingly elected by people "voting
| for change" back in the day.
| metabagel wrote:
| It was a narrow victory. Trump didn't win a majority, and his
| win had little effect on down ballot races.
|
| https://www.pbs.org/newshour/politics/the-size-of-donald-
| tru...
|
| > Trump's 2024 raw vote margin was smaller than any popular
| vote winner since 2000, and the fifth-lowest since 1960
| nixosbestos wrote:
| I'd bet $500 that richrichie knows that already and seeing
| it again still won't keep them from repeating the known
| lie. It's kind of part and parcel.
| mgobl wrote:
| Not only do I disagree with the premise, but I think the article
| is poorly argued.
|
| Was working on the Manhattan project unethical because it
| furnished the ability for us to kill humans on an even more vast
| industrial scale than we previously could have imagined? Perhaps,
| but it's hard to square this with the reality that the capability
| of mutually assured destruction has ushered in the longest period
| of relative peace and global stability in recorded history,
| during a period of time we might otherwise expect dramatically
| increased conflict and strife (because we are sharing our limited
| planet with an additional order of magnitude of humans). Had
| everyone at Los Alamos boycotted the effort, would we be in a
| better place when some other power inevitably invented the atomic
| bomb? Somehow I doubt it.
|
| The world is a complex system. While there are hopefully an
| expanding set of core "values" that we collectively believe in,
| any single person is going to be challenged by conflicting values
| at times. This is like the Kegan stages of psychological
| development [1], but societally. I can believe that it's net bad
| for society that someone is working on a cigarette manufacturing
| line, without personally holding them accountable for the ills
| that are downstream of their work. There are competing systems
| (family, society) that place competing values (good - we can
| afford to live, bad - other people get sick and die) on the exact
| same work.
|
| If people want to boycott some types of work, more power to them,
| but I don't think the line between "ethical" and "unethical"
| tasks is so clear that you can put whole corporations on one side
| or another of that line.
|
| Sometimes I try and put a dollar amount on how much value I have
| received from Google in my lifetime. I've used their products for
| at least 20 years. Tens of thousands of dollars seems like an
| accurate estimate. I'm happy to recognize that two things are
| true: that there are societal problems with some big tech
| businesses that we would collectively benefit from solving AND
| that I (and millions of other people less fortunate than me, that
| couldn't "afford" the non-ad-supported cost of these services)
| have benefited tremendously from the existence of Google and its
| ilk.
|
| [1]: https://imgur.com/a/LSkzutj
| g-b-r wrote:
| > that I (and millions of other people less fortunate than me,
| that couldn't "afford" the non-ad-supported cost of these
| services) have benefited tremendously from the existence of
| Google and its ilk.
|
| People who were into Google seem to tremendously overestimate
| the value it provided.
|
| The only Google thing I ever used is Android, and only because
| it's too hard to avoid it.
|
| Had there not been Google you'd have used alternative services,
| and your life would not have been much worse.
|
| Yes, a similarly good search engine would have emerged, similar
| products would have been devised, and the internet would have
| been ad-supported as it already was before Google.
| stickfigure wrote:
| I used Altavista, Lycos, Yahoo, etc in the era before Google
| - and it was worse.
|
| If you're suggesting that some other company besides Google
| would have worked out the same algorithms and business plan,
| then this seems incoherent. Even if true, we'd be here
| discussing how much value we've gotten from Notgoogle. It's
| still a tremendous amount of value, whatever the company is
| named.
| g-b-r wrote:
| > I used Altavista, Lycos, Yahoo, etc in the era before
| Google - and it was worse.
|
| I guess you were only talking about the search engine,
| then.
|
| The technology was ready, PageRank was inspired by other
| work, and Google came to a good degree out of government
| grants.
|
| And by the way, the search engine I was using when Google
| came out (I think it was Northern Light, but I might be
| mistaken) was not significantly worse; Altavista and Yahoo
| were definitely among the worst engines by then.
|
| > If you're suggesting that some other company besides
| Google would have worked out the same algorithms and
| business plan, then this seems incoherent.
|
| Why incoherent?
|
| Had another company done exactly the same but with a
| different name, yeah, not much would have changed...
|
| But there was no need for things to go this way, for the
| products you love to emerge; they just, probably, would
| have been made by several companies, rather than all by
| one.
|
| But actually, there have always been alternatives to
| Google's products, it was just your choice to not use them;
| you could probably have gotten a similar value without ever
| touching a Google product.
| spencerflem wrote:
| I've been trying to live a relatively de-google'd life
| right now, and much like you say, it's not so hard.
| Google Maps is the big exception for me.
| quotemstr wrote:
| > Was working on the Manhattan project unethical because it
| furnished the ability for us to kill humans on an even more
| vast industrial scale than we previously could have imagined?
| Perhaps, but it's hard to square this with the reality that the
| capability of mutually assured destruction has ushered in the
| longest period of relative peace and global stability in
| recorded history
|
| Ah, consequentialist versus deontological ethics: neither camp
| can even hear the other. Some people just pattern-match making
| thing X (weapons, profits, patents, non-free software,
| whatever) against individual behavior and condemn individuals
| doing these things regardless of the actual effects on the real
| world. Sure, invading Japan instead of bombing it would have
| killed a million Americans and who knows how many Japanese
| (real WW2 allied estimate), but ATOM BOMB BAD and PEOPLE WHO DO
| BAD, and so we get people who treat Los Alamos as some kind of
| moral black hole.
|
| The world makes sense only when we judge actions by their
| consequences. The strident and brittle deontological rules that
| writers of articles that feature the word "ethics" in the
| headline invariably promote are poor approximations of the
| behaviors that lead to good consequences in the world.
| int_19h wrote:
| > invading Japan instead of bombing it would have killed a
| million Americans and who knows how many Japanese (real WW2
| allied estimate),
|
| Most people who believe that nuclear strikes on Japan were
| morally wrong also believe that Japan would have surrendered
| regardless, and nukes were thus redundant (and hence, wrong).
|
| If you studied this question, you should know that there's a
| compelling argument that Japanese were motivated just as much
| if not more by Soviets entering the fray with considerable
| success. Now, you may personally disagree with this
| assessment, but surely you can at least recognize that others
| can legitimately hold this opinion and base their ethical
| calculus on it?
| aporetics wrote:
| I don't really understand the categories you've set up or the
| traditions you're referring to, but it seems like
| consequentialist ethics would be good as a historical
| exercise, but not much else. Because we mostly don't know
| what will happen when we act, at least not with the clarity
| that that kind of analysis would need. I think the implicit
| ethical problem here is that there's not much any individual
| can do that will have a measurable effect when it comes to
| entities as large and powerful as big tech (or any other
| industry). So then how do you think about making ethical
| decisions?
| karaterobot wrote:
| > The world makes sense only when we judge actions by their
| consequences.
|
| I'm not sure I agree with this part. To quote Gene Wolfe:
| "until we reach the end of time we don't know whether
| something is good or bad, we can only judge the intentions of
| those who acted." Judging morals by outcome seems like a
| tricky path down a slippery slope. The Manhattan Project _is_
| morally complicated, both because the intentions of those
| involved was complicated, and because the outcome was
| complicated. What's wrong to do, I think, is simplifying it
| down to "was good" or "was bad".
| habosa wrote:
| I worked in Big Tech and it changed my life financially so I
| can't judge anyone else for doing it, but I will say that I had a
| moral reckoning while I was there and I am (right now) unwilling
| to go back.
|
| At the time (2012-2022) the things about the business model that
| bothered me were surveillance culture, excessive advertising, and
| monopoly power. Internally I was also horrified at the abuse of
| "vendor/contractor" status to maintain a shadow workforce which
| did a lot of valuable work while receiving almost none of the
| financial benefits that the full-time workforce received.
|
| 3 years later all of those concerns remain but for me they're a
| distant second behind the rise of AI. There's a non-zero chance
| that AI is one of the most destructive creations in human history
| and I just can't allow myself to help advance it until I'm
| convinced that chance is much closer to zero. I'm in the minority
| I know, so the best case scenario for me is that I'm wrong and
| everyone getting rich on AI right now has gotten rich for
| bringing us something good, not our doom.
| devjab wrote:
| I'm curious as to how you think it'll be our doom. As for the
| ethics in it, there are two ways to look at it in my opinion.
| One is yours, the other is to accept that AI is coming and at
| least work to help your civilisation "win". I doubt we'll see
| any form of self aware AI in our lifetimes, but the AI tools
| are obviously going to become extremely powerful tools. I
| suspect we'll continue heading down the "cyberpunk" road
| leading to the dystopian society we read about in the '80s/'90s,
| but that's not really the doom of mankind as such. It just
| sucks.
|
| As a former history major I do think it'll be interesting to
| follow how AI shifts the power balances. We like to pretend
| that we've left the "might makes right" world behind, but AI is
| an arms race, and it'll make some terrible weapons. Ethics
| aside you're going to want to have the best of those if you
| want your civilisation to continue being capable of deciding
| which morals it wants to follow.
| ANewFormation wrote:
| I don't think most are _primarily_ concerned about war
| applications, but simply driving mass unemployment.
|
| This even seems to be the exact goal of many who then
| probably imagine the next step would then be some sort of
| basic income to keep things moving, but the endless side
| effects of this transition make it very unclear if this is
| even economically feasible.
|
| At best, it would seem to be a return to de facto feudalism. I
| think 'The Expanse' offered a quite compelling vision of what
| "Basic" would end up being like in practice.
|
| Those who are seen (even if through no fault of their own) as
| providing no value to society - existing only to consume,
| will inevitably be marginalized and ultimately seen as
| something less than.
| bitmasher9 wrote:
| To expand on 'The Expanse' "Basic".
|
| The expanse was a 9+ book series that won several literary
| awards that takes place in an interplanetary humanity
| several centuries in the future.
|
| Roughly one half of the population of earth, or 30 billion
| people, live on basic assistance from The United Nations.
| The only way to leave basic is to get a job or get an
| education, and there are significant hurdles to both of
| those routes. People on basic do not get money, but they do
| receive everything they need to live a life. A barter
| economy exists among those on basic, and some small
| industry is available to those on basic if it flies under
| the government's radar. Some (unspecified population size)
| undocumented people do not receive basic, and may resort to
| crime in order to make ends meet.
| int_19h wrote:
| It's also worth remembering that in "Expanse", there's
| also Mars, which is a separate state that does _not_ have
| this arrangement - everyone is employed, but conversely
| there's no unconditional welfare.
|
| However, it is made pretty clear in the books that the
| reason why this is possible for Mars is because they have
| this huge ongoing terraforming project that will take a
| century to complete. So there's always more jobs than
| people to fill them, basically, and it's all ultimately
| still paid for by the government, just not directly (via
| contracts to large enterprises).
| martin-t wrote:
| I love The Expanse and it gets things right more than
| other sci-fi. However, I think it vastly _underestimates_
| the amount of injustice that can be caused by powerful
| people with the help of advanced technology and ML.
|
| 1) You can literally cover the planet with sensors and
| make privacy impossible. Cameras and microphones are
| already cheap and small. What will they look like in
| several hundred years? You can already eavesdrop on a
| conversation in a closed room, e.g. by bouncing a laser
| off the window to amplify air vibrations. What will be
| possible in several hundred years?
|
| 2) Right now, suppressing the population by force
| requires control of a sufficient number of serviles.
| These serviles are prone to joining the revolution if you
| ask them to harm their own friends and families (China
| only managed to massacre Tiananmen Square after
| reinforcements from other regions arrived, because the
| initial wave joined the protesters). They are prone to
| only serving as long as you can offer them money or
| threaten them credibly.
|
| In the near future, it will be possible to suppress any
| uprising (if you're willing to use violence) by a small
| number of people controlling a large number of automated
| tools (e.g. killbots, the drone war in Ukraine is a taste
| of what's to come).
|
| Spoilers ahead.
|
| The story vastly underestimates the competence of state
| level bad actors.
|
| In the books, Holden and his group were attacked on Eros
| by a small number (single digits) of covert agents and
| only managed to survive thanks to Miller. In reality, you
| don't send 4 people to apprehend 4 people, you send 40.
|
| Later, Holden and other people were apprehended on
| Ganymede and again, managed to get out of it by
| overpowering their captors because the government just
| didn't send enough people. This is not gonna happen in
| reality.
|
| (Though you might be able to kill one if you're also
| willing to die in the process. A Belarusian citizen had
| several KGB agents break into his flat but because it
| took them a while to break the door down, he managed to
| grab his gun, ambushed them and shot one in the stomach.
| The aggressor later bled out but the citizen was also
| killed.)
| int_19h wrote:
| Proper UBI is absolutely economically feasible if we start
| taxing things like, say, capital gains properly.
|
| "The Expanse" shows the kind of UBI that Big Tech bros
| would like to see, absolutely. Which is to say, the
| absolute minimum you need to give people to prevent a
| revolt and maintain a status quo. But you shouldn't assume
| that this is the only possibility.
|
| As far as "seen as providing no value to society", that is
| very much a cultural thing and it is not a constant, so it
| can and should be changed. OTOH if we insist on treating
| that particular aspect as immutable, our society is always
| going to be shitty towards a large number of people in one
| way or another.
| Unearned5161 wrote:
| While unemployment certainly deserves a conversation of its
| own, I think the more overlooked aspects of education and
| democracy will erode our society deeper into a hole by
| themselves.
|
| I'm rather fearful for the future of education in this
| current climate. The tools are already powerful enough to
| wreak havoc and they haven't stopped growing yet! I don't
| think we'll properly know the effect for some years now, not
| until the kids that are currently in 5th, 6th, or 7th start
| going into the workforce. While the individual optimist in me
| would like to see AI as this great equalizer, personal tutor
| for everyone, equal opportunity deliverance, I think we've
| fumbled it for all but a select few. Certainly there will be
| cases of great success, students who leverage AI to it's
| fullest extent. But I urge one to think of the other side of
| the pie. How will that student respond to this? And how many
| students are really in this section?
|
| AI in its current state presents a pact with the devil for
| all but the most disciplined and passionate of us. It makes
| it far too easy to resign all use of your critical mental
| faculties, and to stagnate in your various abilities to
| navigate our complex modern world. Skills such as critical
| reading, synthesizing, and writing are just a few of the most
| notable examples. Unrestrained use of tools that help us so
| immensely in these categories can bring nothing but slow
| demise for us in the end.
|
| This thought pattern pairs nicely with the discussion of AIs
| effects on democracy. Hopefully the step taken from assuming
| the aforementioned society, with its rampant inabilities to
| reason critically about its surroundings, to saying that this
| is categorically bad for democracy, isn't too large.
| Democracy, an imperfect form of government that is the best
| we have at this moment, only works well with an educated
| populace. An uneducated democracy governs on borrowed time.
| One can already see the paint start to peel (there is a
| larger effect that the Internet has on democracy that I'll
| leave out of this for now, but is worth thinking about as
| it's the one responsible for the current decline in our
| political reality).
|
| The unfortunate conclusion that I reach when I think of all
| of this, is that it comes down to the ability of government
| and corporations to properly restrain this technology and
| foster its growth in a manner that is beneficial for society.
| And that restraint is hard to see coming with our current set
| up. This is to avoid being overly dramatic and saying that
| it's impossible.
|
| If you look at the history of the United States, and truly
| observe the death grip that its baby, capitalism, has on its
| governance, if you look at this, you find it hard to believe
| that this time will be any different from times past. There
| is far too much money and national security concern at stake
| here to do anything but put the pedal to the floor and
| rapidly build an empire in this wild west of AI. The
| unfortunate conclusion is that perhaps this could have been a
| wonderful tool for humanity, and allowed us to realize our
| collective dreams, but due to the reasons stated above I
| believe this is unachievable with our current set up of
| governance and understanding of ethics, globally.
| martin-t wrote:
| > how you think it'll be our doom
|
| There's 2 main possibilities:
|
| 1) Self aware AI with its own agency / free will / goals.
| This is much harder to predict and is IMO less likely with
| the current approaches so I'll skip it.
|
| 2) A"I" / ML tools will become a force multiplier and the
| powerful will be even more so. Powerful people and
| organizations (including governments) already have access to
| much more data about individuals than ordinary citizens. But
| currently you usually need loyal people to sift through data
| and to act on it.
|
| With advanced ML tools, you can analyze every person's entire
| personality, beliefs, social status, etc. And if they align
| with your goals, you can promote them, if not, you can
| disadvantage them.
|
| 2a) This works if you're a rich person deciding whose medical
| bills you will pay (and one such person was recently killed
| for abusing this power).
|
| 2b) This works if you're a rich person owning a social
| network by deciding whose opinions will be more or less
| visible to others. You can shape entire public discourse and
| make entire opinions and topics invisible to those who have
| not already been exposed to them. For example one such
| censored topic in western discourse is when the use of
| violence is justified and moral. The west, at least for now,
| is willing to celebrate moral acts of violence in the past
| (French revolution, American civil war, assassination of
| Reinhard Heydrich) but discussion of situations where
| violence should be used in recent times is taboo and "banned"
| on many centrally moderated platforms.
|
| 2c) And obviously nation states have an insane amount of info
| on both their own citizens and those from other nation states.
| This already leads to selective enforcement (everybody is
| guilty of something) and it can get even worse when the
| government becomes more totalitarian. Can you imagine current
| China ever having a revolution and reinstating democracy? I
| can't because any dissent will be stopped before it reaches
| critical mass.
|
| So states which are currently totalitarian are very unlikely
| to restore democracy and states which are currently
| democracies are prone to increasingly totalitarian rule by
| manipulation from rich individuals - see point 2b.
| bongodongobob wrote:
| So you worked there 10 years, made your stack of cash, and then
| had a moral reckoning? So brave.
| ornornor wrote:
| Not ideal I guess but better than no reckoning at all.
| palata wrote:
| Have you been offered a job in a BigTech that pays 3 times
| your current salary, and did you decline it?
|
| It's easy to criticise others when you are not confronted to
| the situation.
| bdangubic wrote:
| 100% - however, if you are great enough to get such an
| offer from bigtech you won't really worry about your
| finances...
|
| younger-me, I would 100% take the money. older-and-wiser me
| would not even apply to begin with
| ninalanyon wrote:
| > There's a non-zero chance that AI is one of the most
| destructive creations in human history
|
| Geoffrey Hinton was interviewed by Sajid Javid on BBC R4 on
| Friday [1] and was considerably more pessimistic. If I heard it
| correctly he reckoned that there is a 10% to 30% chance that AI
| wipes us out within the next 30 years.
|
| [1] https://www.bbc.co.uk/programmes/p0kbsg05
| imglorp wrote:
| Regarding his ask that ACM dedicate itself to the public good,
| the IEEE is already there in its code of ethics.
|
| > hold paramount the safety, health, and welfare of the public,
| to strive to comply with ethical design and sustainable
| development practices, to protect the privacy of others, and to
| disclose promptly factors that might endanger the public or the
| environment
|
| That code is pretty squarely at odds with big tech's latest
| malevolent aims.
| ajuc wrote:
| I really like Timothy Snyder's take on this.
|
| Breakthroughs in information technology always cause disruption
| in the political meaning (wars and chaos). It was like that when
| writing was invented (making big organized religions possible),
| it was the same with the printing press (allowing the Reformation
| and big political movements), it was similar with radio (which
| allowed 20th-century-style totalitarian regimes).
|
| Each time the legacy powers struggled to survive and wars
| started. It took some time for the societies to adapt and
| regulate the new technologies and create a new stable
| equilibrium.
|
| It's not surprising that it's the same with internet. We have
| unstable wild-west style information oligarchy forming before our
| eyes. The moguls build continent-spanning empires. There's no
| regulation, the costs are negligible, and the only ones trying to
| control it are the authoritarians. And the new oligarchs are
| obviously fighting with their thought-control powers against the
| regulation with all they've got.
|
| It won't end without fireworks.
| richrichie wrote:
| > In fact, our current climate crisis is a demonstrated market
| failure.
|
| This is wrong on so many levels. There is neither a climate
| crisis nor a market failure. If anything, central economies
| exhibited (and
| exhibit) higher levels of pollution and destruction of public
| good.
|
| Mindless repetition of the climate crisis trope has done more
| damage to the cause than carbon emissions.
| metabagel wrote:
| So, there's no climate crisis, and also it's caused by "central
| economies"? (I don't know what those are, but they sound bad.)
| CalChris wrote:
| I didn't think the Cambridge Analytica scandal had anything at
| all to do with computer science. I thought it had to do with
| business and hence business ethics.
| lisper wrote:
| > It is difficult to get a man to understand something, when his
| salary depends on his not understanding it.
|
| I'm going to have to add that to my list of favorite aphorisms. And
| it's not just salaries that drive this dynamic. It is difficult
| to get someone to understand something when their entire identity
| is invested in not understanding it. This applies to religions,
| political ideologies, and even to a lot of self-styled
| rationalism.
| 1vuio0pswjnm7 wrote:
| "But in my January 2019 Communications column,^b I dismissed the
| ethical-crisis vibe. I wrote, "If society finds the surveillance
| business model offensive, then the remedy is public policy, in
| the form of laws and regulations, rather than an ethics outrage."
| I now think, however, I was wrong."
| calibas wrote:
| > the belief in the magical power of the free market always to
| serve the public good has no theoretical basis
|
| This needs to be repeated more often.
|
| Early on, there was this idea that free market capitalism was
| inherently amoral, and we had to do things like "vote with your
| wallet" to enforce some kind of morality on the system. This has
| been gradually replaced with a pseudo-religious idea that there's
| some inherent "virtue" to capitalism. You just need to have faith
| in the system, and everything will magically work itself out.
| snnsbsnshs wrote:
| I work on software for managing casinos. I feel morally superior.
| Big tech has real problems if working with gambling and weaponry
| is preferable to big tech.
| sadeshmukh wrote:
| How do you feel morally superior? Casinos are built to only
| extract money from people - they provide minimal value.
| recursivedoubts wrote:
| "In a sort of ghastly simplicity we remove the organ and demand
| the function. We make men without chests and expect of them
| virtue and enterprise. We laugh at honour and are shocked to find
| traitors in our midst. We castrate and bid the geldings be
| fruitful."
| csours wrote:
| 'Never attribute to ethics that which can be explained by
| incentives' - Hanlon's Hammer
|
| 'Show me an organization's stupidity and I'll show you their
| malice' - Munger's Psychology of Human Misquotations
| neilv wrote:
| One ethical thing that some people on HN do, and more should:
| criticize big companies when they do something unethical, even if
| you'd want to work for them.
|
| Yes, presumably, you will get on some company-wide hiring
| denylists. (Not because you're prominent, but because there will
| be routine LLM-powered "corporate fit" checks, against massive
| corpora and streams of ongoing surveillance capitalism monitoring
| of most things being said.)
|
| Some things need to be said. And people need to not just hear it
| once, and forget it, but to hear it from many people, on an
| ongoing basis. So not saying it is being complicit.
| efitz wrote:
| How do I reason ethically about this?
|
| I am a security professional. My work directly affects the
| security of the systems I am responsible for. If I do my job
| well, people's data is less likely to be stolen, leaked,
| intentionally corrupted, or held for ransom. I also influence
| privacy related decisions.
|
| I work for a Mag7 company. The company has many divisions; the
| division I work for doesn't seem to be doing anything that I
| would perceive as unethical, but other divisions of my company do
| behave in a way I consider unethical.
|
| I'm not afraid to take an ethical stance; in a previous job at
| another company I have directly confronted my management chain
| about questionable behavior and threatened to quit (I ended up
| convincing them my position was correct).
|
| So how do I reason about that? Really the sticking point is that
| large companies are not monoliths. Am I acting unethically for
| working for an ethical division of an imperfect company?
| redelbee wrote:
| There are many ways to reason ethically about your situation,
| and you could start by using historical philosophers as
| inspiration.
|
| Bentham might apply if you consider the overall outcome: is the
| work your company does positive or ethical for the majority of
| people the majority of the time? It seems like the "greatest
| good for the greatest number" would allow for some small
| unethical aspects so long as the outcome is good for the
| majority. This could also be seen as a shortcoming in that
| philosophy because it justifies some pretty terrible actions
| for the greater good (some of which, like the Manhattan project
| and its outcome, are mentioned elsewhere in this thread).
|
| Kant might make you look at your company and imagine that all
| companies acted that way as a way to reason ethically. If all
| companies acted the way your company acts would that be good or
| bad for humanity? Kind of like the golden rule, but more
| rational.
|
| There are many more to consider but it's my view that most of
| them will get you to the point where you probably shouldn't
| work for an unethical company, even if your particular work or
| area of focus is perfectly ethical. Mainly because you working
| for the company allows or helps it to exist in some way, and we
| don't want unethical companies to exist. So maybe you could
| reason your way into working there if your sole focus was
| finding a way to destroy the company somehow. Otherwise it's
| probably better to work elsewhere.
| efitz wrote:
| Thank you!
|
| As an aside, I consider anything that actively subverts the
| company, beyond whistleblowing, as unethical, and in fact,
| it's a threat that people like me have to defend against, so
| I would never involve myself in such activities.
| dclowd9901 wrote:
| An ethical absolutist would say "yes." But you might guess such
| a person is not very popular, as there is almost no aspect of
| simply being alive that could be considered ethical.
| ryukoposting wrote:
| Thank you! Great to see this message getting a bit of a platform.
|
| As industry practitioners, we have the agency to force positive
| change in our field. If the government is too encumbered and the
| executives are too avaricious, that leaves us. If you want tech
| to do good things for people, work for a company that makes tech
| that does good things for people.
| petermcneeley wrote:
| You can't have an ethical crisis if you don't have any morality. Do
| what thou wilt is our only modern ethos.
| Animats wrote:
| _" I bemoaned that humanity seems to be serving technology rather
| than the other way around. I argued that tech corporations have
| become too powerful and their power must be curtailed."_
|
| That's a generic problem with corporatism and monopoly, not
| "tech".
|
| It shows up in "tech" because "tech" scales so well and has such
| strong network effects. But the US's tolerance of monopoly is the
| real cause. There need to be about four major players before
| markets push prices down. The US has three big banks, two big
| drugstore chains, etc.
|
| Tough antitrust enforcement would help. Google should be broken
| up into Search, Browsers, Mobile Devices, Ads, and Services, and
| the units prohibited from contracting with each other.
|
| Tough labor law enforcement would help. No more "gig worker" jobs
| that are exempt from labor law. No more "wage shaving". No more
| unpaid overtime. Prorate medical insurance payments based on
| hours, so companies that won't pay people for more than 30 hours
| a week pay their fraction of medical insurance. A minimum wage
| high enough that people making it don't need food stamps.
| ToucanLoucan wrote:
| I think an under-discussed issue is how companies are allowed
| to own sub-companies that don't need to necessarily disclose
| they're owned by a larger conglomerate. Like I don't know how
| you necessarily solve this, but I think if people, for example,
| knew the sheer _number_ of snack brands owned by Nabisco, there
| would be a lot more discussion about corporate consolidation
| and monopolies.
| Animats wrote:
| > sheer number of snack brands owned by Nabisco
|
| Also Yum and Roark, which, together, own much of fast food.
| WalterBright wrote:
| > Prorate medical insurance payments based on hours, so
| companies that won't pay people for more than 30 hours a week
| pay their fraction of medical insurance
|
| Attaching medical insurance to one's job is a market distortion
| caused by government tax policy. I.e. it enables one to buy
| insurance with pre-tax dollars rather than after-tax dollars.
| Making medical insurance premiums fully tax-deductible would
| fix that.
|
| > A minimum wage high enough that people making it don't need
| food stamps.
|
| That just makes those people unemployable; they will need food
| stamps even more. Nobody is going to hire people who cost more
| than the value they produce.
|
| > Google should be broken up into Search, Browsers, Mobile
| Devices, Ads, and Services, and the units prohibited from
| contracting with each other.
|
| Google is already in trouble because AI is disrupting their
| search/advertisement business model.
|
| I'd be careful about destroying big business. The US is only
| part of the world. Destroying US big business means other
| countries will have those companies, and it's lose lose for the
| US. Do you want Big Tech to be American companies, or foreign
| companies?
| ClumsyPilot wrote:
| > That just makes those people unemployable, and will need
| food stamps even more. Nobody is going to hire people who
| cost more than the value they produce.
|
| Good, a job that cannot support biological needs should not
| exist. It's not a viable business.
|
| Why should I pay a stealth subsidy to whatever business it
| is?
|
| > Do you want Big Tech to be American companies, or foreign
| companies?
|
| This excuse was used to start wars, trample civil rights and
| employment rights. It basically means we must become like
| China to beat China. What would be the point?
| ghshephard wrote:
| > Good, a job that cannot support biological needs should
| not exist.
|
| There was a time in the not so distant past, that close to
| 100% of those "Minimum Wage" jobs were held by teenagers
| and youths with close to zero market value as employees,
| who needed their first few jobs to develop the skills,
| knowledge, resume and references so they _could_ get an
| actual job.
|
| Places like McDonalds and Summer Resorts and Amusement
| parks - were great places for youth to learn these skills.
| The real distortion is when you started having _adults_
| working in McDonalds. It was never a job to support a
| family - it was a minimum-wage job for kids to get started.
| PaulDavisThe1st wrote:
| This is just historically inaccurate (and a regrettably
| common claim among older conservative-ish folk).
|
| Those "minimum wage" jobs that you had a teenager in the
| 1950-1986 time period? They paid more than minimum wage
| does now, on an inflation adjusted basis. That $2/hr job
| in 1962 would be paying $21/hr if it had kept up with
| CPI.
|
| That's the whole reason why adults started working in
| them.
|
| Over time, federal minimum wage did not keep up even with
| national inflation rates, let alone regional cost of
| living changes. The result is that these employers, who
| were once forced to pay even their lowest level employees
| a living wage, can avoid paying even that.
| grayhatter wrote:
| all the sources I see say minimum wage should be around
| 12 USD. Where did you source the 21 USD number?
| PaulDavisThe1st wrote:
| $21 is an MIT-provided living wage number for many parts
| of the country (including Santa Fe, where I live (or
| close by)). There are places where that's still not
| enough: I think $35/hr just about covers anywhere in the
| US at this point.
|
| It's also the CPI-adjusted equivalent of 1960s minimum
| wage numbers.
| grayhatter wrote:
| 1963 @ $1.25 or 1957 @ $1.00
|
| giving me $1.25 * (304.702/30.6) = $12.44 or $1 *
| (304.702/28.1) = $10.84
|
| my sources:
| https://www.dir.ca.gov/iwc/minimumwagehistory.htm
| https://www.usinflationcalculator.com/inflation/consumer-
| pri...
| PaulDavisThe1st wrote:
| https://www.bls.gov/data/inflation_calculator.htm
|
| January 1962 $2 => Nov 2024 $21.03
|
| Using $1.25 in Jan 1963 gives me $12.97 ...
|
| I used $2 in 1962 because in the 2016 Republican
| primaries one of the candidates made a reference to their
| job working in a burger store using these numbers.
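|
| A minimal sketch of the CPI adjustment both calculations above
| are doing (Python). The index values for 1957, 1963, and the
| current reading are the ones quoted above; the ~30.0 figure for
| January 1962 is an assumption for illustration, not official
| BLS data, and the BLS calculator's more recent "now" reading is
| why it reports ~$21 for $2 in 1962 while this sketch gives ~$20:
|
|     # Scale a dollar amount by the ratio of CPI index values.
|     # Illustrative index values taken from the figures quoted
|     # above, plus an assumed ~30.0 for January 1962.
|     CPI = {1957: 28.1, 1962: 30.0, 1963: 30.6, "now": 304.702}
|
|     def adjust(amount, from_year, to_year="now"):
|         """Inflation-adjust `amount` from one year to another."""
|         return amount * CPI[to_year] / CPI[from_year]
|
|     print(round(adjust(1.25, 1963), 2))  # 12.45 (~the $12.44 above)
|     print(round(adjust(1.00, 1957), 2))  # 10.84, matching the above
|     print(round(adjust(2.00, 1962), 2))  # 20.31 with these indices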
| WalterBright wrote:
| > Good
|
| It's not better to have people have no jobs and require
| 100% assistance.
|
| > subsidy
|
| Regardless of how you define terms, you'll be paying
| much more to help them when they are jobless.
|
| > become like China
|
| China has a largely state run economy, with the resulting
| problems.
| idiotsecant wrote:
| Is there some evidence that big american companies are less
| negative than big [insert adversary of the decade] are?
|
| Companies are not bound by morals, national identity, or any
| interest other than self-perpetuation. They are a virus that
| we harness to do good. When the virus overwhelms its host,
| it's time for medicine.
| DanHulton wrote:
| > That just makes those people unemployable, and will need
| food stamps even more.
|
| This doesn't actually bear out. Minimum wage increases really
| don't have a history of making minimum wage employees
| unemployable, or destroying the companies affected. In fact,
| the opposite tends to happen, as these businesses tend to be
| frequented by other minimum wage employees as customers, so
| it ends up being a rising tide that lifts all boats.
|
| > Do you want Big Tech to be American companies, or foreign
| companies?
|
| I'd argue that you can't just airdrop these companies into
| another country and have them be as successful as they are.
| Even with much stricter monopoly laws, there is a LOT about
| America that incentivizes these companies to locate there,
| and frankly I'm not convinced they'd move.
|
| And as a Canadian, I don't even want Big Tech to be American.
| =) The US is only part of the world, as you said, but your
| lax and corrupt legal system is polluting the world with
| these dangerous megacorps.
|
| Don't get me wrong, we're no better; our system would allow
| for nearly the same abuse, were it not for the fact that our
| whole country is smaller in population than California. But
| the point remains that there's a lot of the world that is
| looking on in horror at these rampaging monster companies and
| is not in any way assured by the "at least they're American"
| defense.
| nox101 wrote:
| > This doesn't actually bear out. Minimum wage increases
| really don't have a history of making minimum wage
| employees unemployable, or destroying the companies
| affected. In fact, the opposite tends to happen, as these
| businesses tend to be frequented by other minimum wage
| employees as customers, so it ends up being a rising tide
| that lifts all boats.
|
| It's not nearly that settled
|
| https://www.youtube.com/watch?v=Wvr0NhYfkO4
|
| https://www.youtube.com/watch?v=8H4yp8Fbi-Y
| grayhatter wrote:
| I'll give you the economist as an acceptable source...
| but reasontv has a habit of omitting easy to access data
| that's detrimental to the argument they started with.
| While that lack of candor might not be disqualifying,
| it's dishonest when you also present yourself as a
| journalist.
| PaulDavisThe1st wrote:
| > Making medical insurance premiums fully tax-deductible
| would fix that.
|
| They more or less are, for those who pay for their own health
| insurance.
|
| Why would we have the same rule for those who receive health
| insurance as part of an overall W2/labor-based compensation
| contract?
| WalterBright wrote:
| No, the insurance premiums aren't tax deductible.
| PaulDavisThe1st wrote:
| Line 17, Schedule 1: "Self-employed health insurance
| deduction"
|
| https://www.irs.gov/instructions/i1040gi#en_US_2024_publi
| nk1...
|
| "You may be able to deduct the amount you paid for health
| insurance (which includes medical, dental, and vision
| insurance and qualified long-term care insurance) for
| yourself, your spouse, and your dependents. "
|
| "One of the following statements must be true.
| You were self-employed and had a net profit for the year
| reported on Schedule C or F. You were a
| partner with net earnings from self-employment.
| You used one of the optional methods to figure your net
| earnings from self-employment on Schedule SE.
| You received wages in 2024 from an S corporation in which
| you were a more-than-2% shareholder. Health insurance
| premiums paid or reimbursed by the S corporation are
| shown as wages on Form W-2.
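|
| A rough sketch (Python) of how those eligibility conditions
| combine; the flag names here are my own shorthand for the IRS
| statements quoted above, not official terminology.
|
|     def self_employed_premiums_deductible(
|             net_profit_on_schedule_c_or_f: bool,
|             partner_with_se_earnings: bool,
|             used_optional_se_method: bool,
|             s_corp_over_2pct_shareholder: bool) -> bool:
|         """At least one of the listed statements must be true."""
|         return any([
|             net_profit_on_schedule_c_or_f,
|             partner_with_se_earnings,
|             used_optional_se_method,
|             s_corp_over_2pct_shareholder,
|         ])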
| drewcoo wrote:
| > Attaching medical insurance to one's job is a market
| distortion caused by government tax policy
|
| Here in the US, FDR had a wage freeze as part of his policies
| [1] to deal with the continuing Great Depression that WWII
| had not stopped yet by 1942. Because of that, companies
| needed to get inventive about ways to increase benefits but
| not illegally increase wages. Companies started offering
| insurance plans.
|
| That's where the employment/insurance coupling started.
|
| 1. https://en.wikipedia.org/wiki/Stabilization_Act_of_1942
| WalterBright wrote:
| That's correct.
| narski wrote:
| >Google should be broken up into Search, Browsers, Mobile
| Devices, Ads, and Services, and the units prohibited from
| contracting with each other.
|
| I admire the general spirit of your comment, but this specific
| example seems off to me. Search and browsers, for example,
| don't make sense as independent businesses. Rather, they are
| products based off of Ads.
|
| Maybe the idea would be for Ads to pay Search to include their
| ads, and for Search to pay Browsers to be the default search
| engine?
| from-nibly wrote:
| Having monopolies is not a symptom of government negligence.
| It's the system working as intended.
| dmoy wrote:
| > The US has three big banks
|
| ? I thought we still had the big four? Chase, BoA, Citi, WF?
| And if you're talking about just consumer banking, US Bank is
| only ~30% behind #4 (Citi).
| dghlsakjg wrote:
| The US also has a relatively huge amount of small banks that
| are specialized in financing niches. It's actually a huge
| competitive advantage. If I want a bank that specializes in
| loans for PNW fishing boats, that exists, and they are able
| to competitively price a loan that BoA won't even consider.
|
| The flip side is that big banks are great at driving down
| costs for standard operations (when there are enough of them
| to be competitive). If all I need is a business checking
| account as a consultant, I can access that for no cost via
| one of the giants.
| Animats wrote:
| Wells Fargo, maybe.[1]
|
| And we need Glass-Steagall back. Banks and brokerages should
| be separate. There is no good reason that Goldman Sachs
| should be a bank, other than for bailouts, which is why they
| became a bank.
|
| [1] https://www.macrotrends.net/stocks/charts/WFC/wells-
| fargo/to...
| oooyay wrote:
| The hyperfocus on shareholder returns is also worth mentioning.
| It's tangentially related to a monopolistic trajectory. Instead
| of a company being _really good_ at solving problems in a
| particular domain, it attempts to offer _many_ mediocre
| solutions across a variety of domains. Shareholders, VCs, and
| the like encourage this lack of focus on quality and replace
| it with a focus on margins. The solution for low margins is
| lower head count and greater diversification of SKUs. For many
| companies, it's a recipe for enshittification and a spiral
| into mediocrity. Not to mention that when a company enters
| this phase, the lives of its employees begin to suffer greatly.
| TacticalCoder wrote:
| > Google should be broken up into Search, Browsers, Mobile
| Devices, Ads, and Services, and the units prohibited from
| contracting with each other.
|
| Why Google? Every single day there are articles here on HN with
| many comments explaining that Google is done due to LLMs
| replacing search.
|
| Google market cap: $2.3 trillion
|
| Microsoft market cap: $3.2 trillion
|
| Break up Microsoft. And for good this time.
| PaulDavisThe1st wrote:
| > There need to be about four major players before markets push
| prices down
|
| Not at all sure that prices are the problem here, nor that
| markets can solve the actual problems.
| nox101 wrote:
| Apple should also be broken up into hardware/os, app store,
| services, payments, media
|
| Your car shouldn't decide who you can do business with, nor
| should it get a fee from every store you drive to. It shouldn't
| push its own payment system for all purchases. And neither
| should a pocket computer do these things.
| timoth3y wrote:
| > "I bemoaned that humanity seems to be serving technology
| rather than the other > way around. I argued that tech
| corporations have become too powerful and their > power must be
| curtailed." > That's a generic problem with corporatism and
| monopoly, not "tech".
|
| If you would enjoy a deep and rigorous treatment of this
| subject, I strongly recommend Martin Heidegger's "The Question
| Concerning Technology."
|
| He argues that modern technology is fundamentally different
| from historical technology, and that corporatism and monopoly
| are the inevitable result of technology.
| lazzlazzlazz wrote:
| The "ethics crisis", as described here, is the complaining of one
| ruling elite (traditional media, universities, bureaucrats, etc.)
| against another upcoming elite (tech). The problem is that all of
| the power is accruing to tech -- at the expense of the competing,
| traditional elites.
|
| An even bigger problem is that most of the economic and social
| benefits have come from technology. This even includes shorter
| work weeks and paid leave (typically falsely credited to unions)
| and greater disposable income, which have come from technology
| (broadly speaking) and not from activism.
|
| A tech "ethics crisis" and the "dangerous" profit motive are just
| renewed attacks against capitalism, and "tech" is itself just the
| tip of the spear of capitalism (and the cultural nom de guerre of
| capitalism's elites).
| palata wrote:
| > "It is difficult to get a man to understand something, when his
| salary depends on his not understanding it,"
|
| This says it all.
| WalterBright wrote:
| > But the belief in the magical power of the free market always
| to serve the public good has no theoretical basis. In fact, our
| current climate crisis is a demonstrated market failure.
|
| It's not a free market failure. It's an example of the _Tragedy
| of the Commons_.
|
| https://en.wikipedia.org/wiki/Tragedy_of_the_commons
| metabagel wrote:
| It's both.
| WalterBright wrote:
| The Commons is not owned by anyone, and so property rights
| are not in play. Property rights are essential to the proper
| functioning of the free market.
| BitterCritter wrote:
| I think that's like saying a square is not a rectangle.
| WalterBright wrote:
| The Tragedy of the Commons is characteristic of Marxist
| economies, not free markets.
| aristofun wrote:
| I wonder why it is often the case that people talking about
| ethics are the ones that look the least competent or decent to
| do so.
___________________________________________________________________
(page generated 2024-12-29 23:01 UTC)