[HN Gopher] OpenAI employees did not want to go work for Microsoft
___________________________________________________________________
OpenAI employees did not want to go work for Microsoft
Author : apsec112
Score : 192 points
Date : 2023-12-07 18:40 UTC (4 hours ago)
(HTM) web link (www.businessinsider.com)
(TXT) w3m dump (www.businessinsider.com)
| catchnear4321 wrote:
| > We all left these big corporations to move fast and build
| exciting things...
|
| sama found a flock. this will go poorly.
| greenyoda wrote:
| https://archive.ph/hgoST
| sylware wrote:
| ... and after the latest openai episode, we now know that msft
| was pulling the strings in the "shadows" (the "nah! I am just
| supporting without doing anything...").
|
| It is not a conspiracy theory to think it is certainly happening
| elsewhere.
|
| And there is so much critical open source stuff on github... that
| said, github is at least noscript/basic (x)html friendly for its
| core functions.
| schemescape wrote:
| Sadly, viewing source code no longer appears to work without
| JavaScript...
|
| Edit: tested in w3m
| chollida1 wrote:
| > ... and after the latest openai episode, we now know that
| msft was pulling the strings in the "shadows" (the "nah! I am
| just supporting without doing anything..."
|
| Do we?
|
| What evidence is there that Microsoft was "pulling the strings
| in the shadows"?
|
| As far as we know Microsoft only found out a minute before Sam
| did that he was being fired.
| rossdavidh wrote:
| Well, we found out that Microsoft is able to reverse
| essentially any decision they feel strongly about, which is
| essentially what "pulling the strings" means, in common
| usage.
| chollida1 wrote:
| > Well, we found out that Microsoft is able to reverse
| essentially any decision they feel strongly about, which is
| essentially what "pulling the strings" means, in common
| usage.
|
| What specific decision did Microsoft reverse?
|
| We already know that they had no say in Sam's firing or any
| specific pull in his rehiring.
|
| Or is there any proof we have that Microsoft forced the
| reversal of Sam's firing?
| whatshisface wrote:
| Microsoft is a trillion-dollar company; they could hire
| away everyone at a McDonald's restaurant and buy the
| location if they didn't like how it was being run. They
| arguably have less power over OpenAI than they do over
| the average startup because they could probably buy any
| given startup outright but they ended up with nonvoting
| shares in OpenAI.
| VirusNewbie wrote:
| "No one wanted to go to Microsoft." This person called the
| company "the biggest and slowest" of all the major tech companies
| -- the exact opposite of how OpenAI employees see their startup.
|
| Lol. I was wondering about this...
| ausbah wrote:
| they didn't want to leave because of OpenAI's great compensation
| packages ($300k+)
|
| I do think it is a little unfair to characterize MSFT as "slow
| and boring" when they've been the ones to make the fastest pivot
| to supporting generative AI as a product line
| soulbadguy wrote:
| I think the deal was for employees to keep their OpenAI
| compensation even after moving to MSFT
| jandrese wrote:
| Even if they did you know the annual performance review would
| go something like:
|
| "You did a great job this year and met or exceeded all of the
| metrics we measured, but your compa ratio is just too high so
| instead of a raise we are going to give you a lump sum
| instead."
|
| Big company culture vs. startup culture is a known issue.
| People choose to work for startups to avoid that big company
| culture, so if a big company buys you out then it's time to
| move on.
| gwern wrote:
| OP discusses this and how it was a hollow promise. Even if it
| was kept in the full spirit of the nonbinding verbal
| agreement (which would utterly infuriate MS staff and
| demoralize them), it would be a bad deal to swap ultra-hot
| private OA PPUs for the same nominal (but low-growth) amount
| of MS stock and then have to work at MS.
| andy99 wrote:
| These are all people who could make good money anywhere. Few
| are presumably there solely because it's the best paying job.
| Part of it is certainly identity, OpenAI or $hot_startup sounds
| way cooler than Microsoft to a lot of people. And part would be
| wanting to work at a startup and not a legacy SaaS company and
| all the baggage that entails. Whatever carve-out they were
| going to get, there's no way you'd be as unconstrained working
| at MS as you would at OpenAI. It's precisely because the money
| isn't that important that a lot would have probably bailed if
| they all were absorbed into Microsoft.
| boringg wrote:
| Have you looked at MSFT's corporate record and the current state
| of their bread-and-butter products?
| Racing0461 wrote:
| Everyone already knew that. It was just to get sam on the board
| with as little chaos as possible.
| kvee wrote:
| pg talks about how Sam Altman is the most powerful person he's
| ever met. Seems we have a super powerful psychopath running
| perhaps the most important company in human history.
|
| I do think he legitimately believes he's doing the right thing
| though all throughout, which maybe makes it more scary.
|
| Sorta like how Mark Zuckerberg seemed to truly believe in
| Facebook's mission and wound up having all sorts of negative
| externalities for the world. Mark Zuckerberg just isn't quite as
| effective as Sam Altman, and it's easier to be suspicious of his
| motives.
|
| Not to say that psychopaths are necessarily bad. Peter in Ender's
| Shadow turned out great!
|
| But it does seem dangerous for 1 person to hold so much power
| over the future of humanity.
|
| Sam Altman's reasoning for him having all the power, I think, is
| that "short timelines and slow takeoff is likely the safest
| quadrant of the short/long timelines and slow/fast takeoff
| matrix."
|
| If you believe that and believe that Sam Altman having complete
| control of OpenAI is the best way to accomplish that, everything
| seems fine.
|
| I'd personally have preferred trying to optimize for long
| timelines and a slow takeoff too, which I think might have been
| doable if we'd devoted more resources to neglected approaches to
| AI alignment-like enhancing human capabilities with BCI and other
| stuff like that.
| superb_dev wrote:
| There's a lot to be said about Altman, but calling him a
| "psychopath" is just wrong. It's a legitimate medical term and
| should not be used for hyperbole
| miraculixx wrote:
| Look up Annie Altman.
| replwoacause wrote:
| Are you saying this because of the diddling accusations or
| for some other reason?
| throw_away_584 wrote:
| https://www.lesswrong.com/posts/QDczBduZorG4dxZiW/sam-
| altman...
| encoderer wrote:
| Everything else aside - in what world is Sam Altman "more
| effective" than Zuck? How do you even define effective?
| kvee wrote:
| In this case I think I just mean more effective at seeming
| good to others.
|
| I think they both believe they are good and doing good.
|
| People tend to be more suspicious of Mark Zuckerberg's
| motives than Sam Altman's.
|
| Sam Altman himself even said he can't be trusted, but that it was
| OK because of the company structure; and then, when he needed
| to, he overpowered the very structure he claimed was necessary:
| https://x.com/tobyordoxford/status/1727624526450581571?s=20
| ben_w wrote:
| I think you're using the word "psychopath" when you're talking
| about something different, though I can't guess what.
|
| Psychopathy is a personality disorder indicated by a pattern of
| lying, cunning, manipulating, glibness, exploiting,
| heedlessness, arrogance, delusions of grandeur, sexual
| promiscuity, low self-control, disregard for morality, lack of
| acceptance of responsibility, callousness, and lack of empathy
| and remorse.
|
| (Which, now I read it, is disappointingly pattern matching the
| billionaire who invested in both OpenAI and also a BCI startup
| currently looking for human test subjects).
|
| I can see arguments for _either_ saying Altman has delusions of
| grandeur _or_ lack of acceptance of responsibility depending on
| if you believe OpenAI is going too fast or if it's slowing
| things down unnecessarily, but they can't both be true at the
| same time.
| kvee wrote:
| You may be right here.
|
| However, there seems to be a decent amount of evidence that
| Sam has done exactly what you're talking about.
|
| He manipulated and was "not consistently candid" with the
| board, he got all the OpenAI employees to support him in his
| power struggles, he made them afraid to stand up to him
| (https://x.com/tobyordoxford/status/1727631406178672993?s=20), he
| exhibited delusions (though I guess they were correct) of
| grandeur with pg with a glint in his eye making clear to pg
| that he wanted to take over yc, he did little things like
| making it seem that he was cool with Eliezer Yudkowsky with a
| photo op but didn't really chat with him, etc.
|
| Again, I am not sure this perspective is necessarily right
| (and I may be convinced just because he's such an effective
| psychopath).
|
| In any case, I think this is a pretty good explanation of
| this perspective:
| https://x.com/erikphoel/status/1731703696197599537?s=20
| gwern wrote:
| > (Which, now I read it, is disappointingly pattern matching
| the billionaire who invested in both OpenAI and also a BCI
| startup currently looking for human test subjects).
|
| Elon Musk actually matches several of those poorly, and
| matches bipolar disorder _much_ better (most of those are
| also bipolar or billionaire symptoms, while psychopathy is
| inconsistent with many Musk symptoms like catatonia):
| https://gwern.net/note/musk
| miraculixx wrote:
| Look up Annie Altman. Be seated.
| wintogreen74 wrote:
| One old guy in a bubble says another young guy in the same
| bubble (who he just happened to mentor) is "the most powerful
| person he's ever met."
| zlg_codes wrote:
| That whole first part disgusted me. "most powerful person he's
| met"? Good lord does that come off as tone deaf, almost
| groveling.
|
| And the most important company in human history? The hell is
| that guy smoking, because I've got good shit and that's some
| serious hyperbole.
|
| Is the hype machine in the room with us right now?
| tester756 wrote:
| >perhaps the most important company in human history.
|
| holy shit, hype is unreal :D
| JohnFen wrote:
| > I do think he legitimately believes he's doing the right
| thing though all throughout, which maybe makes it more scary
|
| I really think the opposite. I think he's after the biggest
| payday/most power he can get, and anything else is a secondary
| consideration.
| gkoberger wrote:
| I think you can fairly ascribe a lot of negative attributes
| to Sam, but an unnatural thirst for money isn't it. Nothing
| about anything he does makes me think he's motivated by
| increasing his personal net worth.
| kvee wrote:
| He has said in podcasts he is motivated not by the money
| but by the power he has at OpenAI
| gkoberger wrote:
| Do you have an actual quote? I've listened to him talk a
| lot, and this feels like a misquote or misinterpretation.
| (I'm not saying it's not true; I just don't see Sam
| saying he personally likes power)
| kvee wrote:
| Here are a couple I could find in notes I took while
| listening to podcasts, though there are more -
|
| "I get like all of the power of running OpenAI" "I don't
| think it's particularly altruistic. Like it would be if I
| didn't already have a bunch of money. The money is gonna
| pile up faster than I can spend it anyway."
|
| Those I think are either from
| https://www.youtube.com/watch?v=3sWH2e5xpdo or
| https://www.youtube.com/watch?v=_hpuPi7YZX8
| kvee wrote:
| Just went to the first video and got the following from
| 1:36, here's a link which starts at that point:
| https://youtu.be/3sWH2e5xpdo?si=bmum-8B02FLoVkWj&t=96
|
| "I mean I have like lots of selfish reasons for doing
| this and as you've said I get like all the power of
| running OpenAI, but I can't think of anything more
| fulfilling to work on and I don't think it's particularly
| altruistic, it would be if I didn't already have a bunch
| of money, yeah, the money is gonna pile up faster than I
| can spend it"
|
| Some other fascinating and relevant stuff in that video
| too.
| gkoberger wrote:
| To me, that sounds like he acknowledged his power but
| disagreed with the person who said it. He's just
| repeating the question but shifted to it being fulfilling
| (and not about money). Without the question being
| included, I think it's hard to use this quote as proof.
|
| He's also said something similar in another interview:
|
| "One of the takeaways I've learned is that this concept
| of having enough money is not an idea that's easy to get
| across to people. I have enough money. What I want more
| of is, like, an interesting life, impact; access to be in
| the conversation. So I still get a lot of selfish benefit
| from this. What else am I going to do with my time? This
| is really great. I cannot imagine a more interesting life
| than this one and a more interesting thing to work on."
| kvee wrote:
| I think I can see how you interpret it that way.
|
| I certainly didn't interpret it as him disagreeing with his
| statement "I mean I have like lots of selfish reasons for
| doing this"
|
| It's the "as you've said" part of "as you've said I get
| like all the power of running OpenAI" that would make me
| inclined to read it the way you did.
|
| But I do think there's a greater chance that he is saying
| that he does like the power.
|
| There's also another quote either in this video or the
| other one I shared I think where he's asked why he's
| doing this, or what motivates him, or something like
| that, and he responds with something like "I'd be lying
| if I didn't say I really like the power"
| JohnFen wrote:
| I don't claim to know what motivates him. I don't know him
| and have no view into his thinking. I'm just going by what
| his actions look like to me.
|
| I can't distinguish between a thirst for money and a thirst
| for power because above a certain level, they're
| essentially the same thing.
| Laaas wrote:
| > This person called the company "the biggest and slowest" of all
| the major tech companies
|
| Could not be further from the truth.
| soulbadguy wrote:
| why ?
| Nthringas wrote:
| because IBM is both bigger and slower?
| nostrebored wrote:
| I don't think IBM is something that comes to mind when
| people are talking "Major tech companies" anymore.
| zlg_codes wrote:
| They ought to, considering they acquired Red Hat a while
| back.
| xcv123 wrote:
| The Red Hat deal was relatively small at $34B.
|
| Microsoft has a market cap of $2.8 Trillion.
|
| IBM is only at $146 Billion. But IBM still has a higher
| head count.
| eikenberry wrote:
| Not after the Redhat acquisition.
| satvikpendem wrote:
| IBM is not "major" anymore in the sense of major tech
| companies that the person above meant.
| soulbadguy wrote:
| I don't think IBM is a "major tech companie" in the modern
| lingo
| airstrike wrote:
| Hard to argue IBM is bigger
| Nthringas wrote:
| I'm factoring in their legacy and history
|
| i.e. since they're older they're bigger
| xcv123 wrote:
| Microsoft is now 20x bigger than IBM in terms of market
| cap.
| ethbr1 wrote:
| https://m.youtube.com/watch?v=3t6L-FlfeaI
| hughesjj wrote:
| IMHO, today, Google takes the cake for slowest. Meta is still
| probably the most agile, and Amazon is super hit or miss (so
| most variance)
|
| Idk where Microsoft would fit in that hierarchy, like Amazon
| it's kind of hit or miss but with less variance and extrema.
| From what I've seen and heard, both Gaming and Azure are
| pretty darn competent these days from an engineering and
| product perspective. Not perfect of course, but nothing is.
| valine wrote:
| Sounds about right to me. They missed the entire smartphone
| revolution because they were too slow to adapt their OS
| (literally their main product) to run on mobile devices.
| Laaas wrote:
| They didn't miss the cloud revolution, and certainly not the
| AI revolution either.
|
| Microsoft is doing absolutely great under Satya Nadella.
| soulbadguy wrote:
| > and certainly not the AI revolution either.
|
| The AI revolution is still underway, we don't know who
| missed what yet. In terms of research output, most of the
| groundbreaking work did not come from msft. They just
| bought their way in. Valid strategy, but definitely not
| novel.
| valine wrote:
| That's more down to dumb luck partnering with OpenAI.
|
| You have a point with cloud computing. I'd hesitate to
| call it a revolution though. I'd never willingly use a
| microsoft cloud product if I wasn't forced to by my
| employer.
| ipaddr wrote:
| They were not first in either area. They leveraged
| existing products like office to catch up.
| Infinitesimus wrote:
| Apple has taught us time and time again that you rarely
| have to be first to the market. Executing well and
| keeping people locked in are pretty important
| zlg_codes wrote:
| Weird, Microsoft's been invisible to me since the early
| 2000s. I don't think they could sell me anything. Their
| whole corporate image is just slimy. Like that one friend
| that wants you to set up an account on something he's got
| going, or wants you to join him on an investment. You can
| _totally_ trust him. You'll have a site, some cloud
| storage, office software, the works. And the investment?
| Yeah dude it'll rocket any year now.
|
| There's just a _feeeew_ things. Yeah, we're gonna tell you
| when to restart, we'll update it for you. We'll tell you
| what you can and can't run on your own machine. We'll
| "protect you" by trying to keep you inside the Microsoft
| Store, so you can adjust to buying all of your software
| instead of getting anything for free. Freedom is bad!
|
| What can Microsoft do for a programmer who self-hosts and
| doesn't trust proprietary software? They co-opted GitHub
| but that really just reduced trust in GitHub more than
| anything. And GitHub itself is proprietary software built
| on top of Git. VS Code is a laggy mess. Azure has nothing
| to hook me. LLMs are toys or another privacy invasion
| vector to "analyze". I don't trust them to store my e-mail.
|
| Everything MS touches dies a slow death. Look at Skype.
| djur wrote:
| > What can Microsoft do for a programmer who self-hosts
| and doesn't trust proprietary software?
|
| Ignore them and market themselves to the remaining 99% of
| programmers.
| constantly wrote:
| *99.95% of programmers
| zlg_codes wrote:
| that's a funny assertion. Why was Microsoft chasing WSL
| and PowerShell if they weren't trying to appeal to
| markets that were smaller than the existing market but
| might still be worth revenue? It's not as stark and
| simple as 99/1 split. They've tried hard to prove they're
| friendly to free software and yet there's still plenty of
| people that don't trust them.
|
| Many people use proprietary software because they're
| forced to for work or social circumstances (i.e. banking
| and messaging apps), not by choice. In other words,
| they're socially engineered into it. Why trust software
| you can't inspect? Because it has a big brand behind it?
| Because no software company has ever betrayed the trust
| of its customers? Companies never do deals with shady
| partners that end up betraying _them_, and in turn their
| customers too? Transparency fixes this, and it can
| mitigate a company going rogue on its own product and
| customers.
|
| Libre software can certainly suck in some areas of the
| stack, but at least I can do more than send an e-mail to
| an unmonitored inbox at a corporation where my ass
| doesn't matter as soon as they have my money. I'll get a
| form letter containing a response to what some random bot
| _thinks_ I'm talking about, and my concern will not be
| addressed, even as a paying customer. In a world of
| software that I can inspect the code of, I might be able
| to research my way out of a tech issue. The contention is
| that even as a customer, I can't count on Microsoft
| addressing any problems I encounter. If you're already
| coming at me with "get ignored, go away" then what's the
| incentive to engage at all?
|
| You're right that some sectors _are_ ignored, probably
| because they believe they should be paid for providing
| certain things. Fair to have that opinion, but it's also
| fair for me to disagree with it, criticize it, and share
| that criticism.
|
| I still give Microsoft software a shot from time to time,
| to make sure my understanding of it is at least semi-
| recent. But by and large, I don't enjoy my interactions
| with their ecosystem. Setting up Minecraft Realms and
| mitigating differences between Java and Bedrock was a
| PITA. The migration from Mojang was a pain. I _did_ enjoy
| getting Bedrock for free with my transfer. That was a
| solid move that showed they understood it was a pain and
| were willing to part with a little revenue in the short
| term in the hopes I'd buy content in the long term. (I
| did, for the record.)
|
| But that's video games. I don't mind them being
| proprietary because I'm there to have fun. As long as
| it's not chatting over the network or doing analytics and
| other shit, I'm fine with a game not coming with source
| code. But I also love that gzDoom, darkplaces, TADS,
| LOVE2D, TIC-80, and other open engines exist where people
| can bring their own creations to life in a non-profit
| environment. Microsoft itself started in a bougie garage,
| they ought to appreciate the spirit of DIY.
|
| I've paid for things so it's not like I won't buy
| _anything_. My experience and view are not any less
| deserving of expression simply because it's less
| popular, though. There are plenty of people out there
| similar to me in distrusting Microsoft and other big tech
| companies' software. Minecraft isn't perfect, it has
| analytics and a forced word filter, and no way to self-
| host a Bedrock server. None of those were a thing when I
| bought it back in 2011 or so. They want me to pay for a
| Realm for that. And I did, for a little while, due to
| family members wanting to play, but I realized I could
| host it for cheaper. Eventually we stopped playing
| anyway, so I didn't have to worry about it.
|
| Most people simply don't care about software freedom, and
| chide and ridicule those who do. Unsurprising, but still
| no less disappointing when encountering it among those
| who like to develop an air of distinction and taste but
| come up lacking in conversation. If they care so little,
| why be shitty to those who do?
| chrisoverzero wrote:
| > Why trust software you can't inspect?
|
| Because for nearly everyone, that's _all_ software.
| nijave wrote:
| >They didn't miss the cloud revolution
|
| Imo they barely kept pace largely by leveraging existing
| software and repurposing it for cloud. They still seem to
| be playing catch-up with multiple solutions to the same
| problem--some deprecated, some preview.
|
| Take Postgres, for instance. Azure had Single Server, built
| on Windows containers. They acqui-hired Citus to build out a
| distributed offering, and they released Flexible Server as a
| Linux-based replacement for Single Server. Flexible still
| doesn't have feature parity with Single Server, nor with
| other cloud vendors (pg_wait_sampling roll-up was missing
| last time I checked, a few months back).
|
| To make matters worse, their data migration tools are
| sorely lacking as well.
| John23832 wrote:
| You realize that was literally 20 years ago?
| valine wrote:
| More like 10 years ago, that's when Microsoft dropped the
| ball.
| chimeracoder wrote:
| > Sounds about right to me. They missed the entire smartphone
| revolution because they were too slow to adapt their OS
| (literally their main product) to run on mobile devices.
|
| Remember the whole antitrust thing? Microsoft was under a
| consent decree that expired between 2007-2009 (different
| provisions expired at different times).
|
| That decree, combined with the material threat of additional
| action during that time window, limited their ability to
| compete (because that's, well, the entire point of a consent
| decree motivated by an antitrust settlement).
|
| There's a reason that you see a notable difference in
| Microsoft's market position and strategy from 2012-present
| compared to the previous decade (2001-2012). The timing of
| the Ballmer-to-Nadella transition is not coincidental; it's
| indicative of the larger shift the company made as they
| repositioned themselves to aggressively expand again.
| baz00 wrote:
| Actually they did build a really competent smartphone
| operating system that was as good as iOS at the time, quite
| frankly, and was affordable.
|
| The issue is they went and rewrote a chunk of it, breaking
| APIs and fucked all the developers off, then abandoned it.
|
| The only thing they are is fucking stupid morons.
| duped wrote:
| Windows was running on smartphones back in 2002.
|
| The failure of Windows Mobile, and later Windows phone, had
| little to do with being "slow."
| tfehring wrote:
| My impression is that it's pretty true of Microsoft in general,
| but not of the AI research teams I'm familiar with. As with any
| tech company of its size, there's lots of variation from team
| to team.
| baz00 wrote:
| Yeah I mean I can think of a fair few better adjectives that are
| applicable to MSFT than biggest and slowest...
| gwern wrote:
| It doesn't need to be true, just needs to be what a nontrivial
| number of OAers think. And given how many people I still see
| putting down 'M$', I can absolutely believe a dislike of
| Microsoft is widespread.
| zer00eyz wrote:
| During that whole mess it seemed to slip out that MS has access
| to all the IP up to AGI. And that has a definition that might be
| "replacing people at work", so not passing the Turing test fully
| but close enough.
|
| There are some problems that need to get cleared up for that to
| happen. The system needs to lose the cutoff date, be a bit more
| deterministic and still function. The whole quagmire around
| copyright needs to get resolved. (Because it looks like the
| output of LLMs is immediately in the public domain.)
|
| If I worked at OpenAI, I would be looking for that contract and
| reading it myself. Because giving away all the IP for the half
| assed runway where you have to get to AGI... doesn't sound like
| it ends in a massive pay day. MS may have cleaned up its public
| image in recent years, or been displaced by things people
| hate and fear more. But there is this:
| https://foundation.mozilla.org/en/campaigns/microsoft-ai/ and the
| underlying ToS looks a lot like old school M$ and shady
| dealings.
| joe_the_user wrote:
| Your implication that OpenAI has "AGI" is unsupported and
| implausible imo.
|
| LLMs are impressive and can increase productivity for certain
| workers in certain industries, etc., yes, but please avoid
| reaching like that.
| zer00eyz wrote:
| I don't imply they have it. I imply that their deal with MS
| has a definition that we would not call "AGI". One that has
| implications that may make cutting them (MS) off impossible.
|
| From: https://openai.com/our-structure ...
|
| "While our partnership with Microsoft includes a multibillion
| dollar investment, OpenAI remains an entirely independent
| company governed by the OpenAI Nonprofit. Microsoft has no
| board seat and no control. And, as explained above, AGI is
| explicitly carved out of all commercial and IP licensing
| agreements."
| joe_the_user wrote:
| Sorry,
|
| I see how your post could be read as you state but the
| simplest reading is what I said.
| the__prestige wrote:
| TFA talks about a tender offer, which allows employees to sell
| their shares at almost a 3x valuation compared to earlier this
| year. This already is a "massive pay day".
| zer00eyz wrote:
| The only time that 3x trade would be a good deal is if you
| think that's the best you're going to do. If you think you're
| gonna be the next Amazon/Facebook/Google then selling is
| foolish. The MS deal may limit or wreck that possibility.
| ghaff wrote:
| Is it actually news that 70% (or whatever) of the employees at a
| hot startup wouldn't go to work for Microsoft even if they kept
| their compensation packages (which would probably have a lot of
| asterisks attached) because of executive suite drama?
| andy99 wrote:
| It's "business insider". Don't underestimate how poorly
| management understands "technical resources". I can certainly
| see lots of leadership just assuming that pay is the only
| relevant variable and ignoring culture entirely during an
| acquihire, assuming people are fine doing whatever as long as
| they're paid.
| JohnFen wrote:
| To be fair, there are a lot of devs for whom compensation is
| the only thing that really matters -- they are
| overrepresented here on HN, even. It would be pretty easy to
| assume that's the majority point of view.
|
| It also may be that's exactly the sort of person that
| Microsoft prefers, too, but I don't know.
| ghaff wrote:
| It's an open question whether moving to Microsoft would
| have been a good deal or a bad deal. But I'm pretty sure
| that most people seriously contemplating a move were
| definitely considering the dollars, even if not
| exclusively. (Although I expect most people signing a
| petition were not actually serious.)
| ElevenLathe wrote:
| Even from a compensation point of view, they presumably have
| a better chance for a big exit of some kind at a startup vs
| as a Microsoft employee.
| rob74 wrote:
| It's not just the compensation however, the article explains
| that by switching to Microsoft they would have lost their
| equity packages, which are worth even more than their salary...
| ghaff wrote:
| _Potential_ equity packages. But hence my comment about
| asterisks. Maybe moving to Microsoft would be a good
| financial deal, maybe a bad deal, but certainly a different
| deal. Especially in the absence of a formal deal.
| hardlianotion wrote:
| I dunno, the company had already demonstrated its ability to
| drastically affect its value to the downside.
| gweinberg wrote:
| How does crap like this get published? Not a shred of evidence is
| given for its assertions, and they sound pretty preposterous.
| Nobody actually wanted to go to Microsoft, and they didn't even
| think Altman was that great as a CEO. So why did they all sign
| the letter threatening to quit? Mysterious unspecified
| "pressure". Why did Microsoft claim it was willing to hire the
| OpenAI team at their current compensation levels, pissing off
| their own employees, if it really wasn't? Umm, no reason.
| ghaff wrote:
| >So why did they all sign the letter threatening to quit?
|
| I probably wouldn't in an employment situation where it could
| come back to bite me. But lots of people sign, virtually or
| otherwise, petitions in the heat of the moment because emotions
| or it's just the path of least resistance. And "at current
| compensation levels" wasn't a contractual promise and would
| probably have had plenty of strings attached.
| mschuster91 wrote:
| > How does crap like this get published?
|
| That is easy to answer: BI belongs to the infamous German media
| conglomerate Axel Springer [1], hosting one of Europe's most
| vile, disgusting and scandal-ridden [2][3] tabloids called
| "BILD" and its barely veiled sister publication "WELT".
|
| [1] https://en.wikipedia.org/wiki/Business_Insider
|
| [2] https://www.swr.de/swr2/wissen/70-jahre-bild-zeitung-
| zwische...
|
| [3] https://de.statista.com/infografik/2588/publikationen-mit-
| de...
| sgift wrote:
| Lol, really? BI is Axel Springer .. wow. And I always
| wondered why I felt iffy about their articles. That makes it
| clear. Into the "ignore this garbage" list they go.
| gwern wrote:
| > and they didn't even think Altman was that great as a CEO.
|
| There are plenty of people who think that. Even the OA
| executives apparently weren't nearly as enthused with Altman as
| all those social-media hearts might lead one to assume. See the
| Time article yesterday:
| https://news.ycombinator.com/item?id=38550240 specifically:
| "The board expected pressure from investors and media. But they
| misjudged the scale of the blowback from within the company, in
| part because they had reason to believe the executive team
| would respond differently, according to two people familiar
| with the board's thinking, who say the board's move to oust
| Altman was informed by senior OpenAI leaders, who had
| approached them with a variety of concerns about Altman's
| behavior and its effect on the company's culture."
| xazzy wrote:
| Am an employee, signed the letter.
|
| I can't read the article so maybe the content is more nuanced,
| but the framing irks me.
|
| This all happened really fast and the offer was informal. I can
| only speak for myself but I do have a lot of respect for modern
| MS, and would have seriously considered the move if that's what
| kept the team together and comp was reasonable. I would be
| surprised if most people felt differently.
| wintogreen74 wrote:
| >> and comp was reasonable
|
| A lot of complex issues and perspectives packed into those few
| words...
| replwoacause wrote:
| I thought the offer from MSFT, albeit unofficial, was that
| anyone who made the switch kept their current salary.
| shwaj wrote:
| A big part of the comp is in equity, and since OpenAI has
| an uncommon equity structure it is unclear how that would
| translate to Microsoft stock.
| rgbrgb wrote:
| i would guess a lot of OpenAI employees are sitting on some
| pretty valuable stock/options if the company doesn't
| implode
| imjonse wrote:
| Hence the overwhelming number of hearts in the Twitter
| messages asking for the return of Sam Altman.
| dmazzoni wrote:
| What about the non-salary part of comp?
|
| For most employees, their OpenAI stock would have been
| worth even more than their salary at its current valuation,
| and it has the potential to be worth quite a
| bit more in the future.
|
| Replacing it with Microsoft stock would have made it a sure
| thing - but also with much less growth potential.
|
| I'd be really curious to hear if Microsoft actually got so
| far as to figure out what to offer OpenAI employees in
| terms of an equity offer.
| vthallam wrote:
| > >> and comp was reasonable
|
| there's no way you guys would get the same comp though? Like,
| not even close. MSFT, irrespective of how much it wants you, is
| not going to honor the 10x jump to the upcoming $90 billion
| valuation.
|
| Even if they do, you will miss on the upside. MSFT stock is not
| going to 10X but OpenAI's might.
| anupsurendran wrote:
| 100% vthallam. The upside for OpenAI is much higher.
| smileysteve wrote:
| .... Unless Sam Altman, major research leaders and 50%+ of
| the employees had transferred to Microsoft
| blagie wrote:
| I think if things went that far, OpenAI's valuation would very
| quickly pop to zero.
|
| Real MSFT stock beats a theoretical could-have-been with a
| 10x upside.
| ponector wrote:
| Don't forget that with new rounds and new valuation there is
| also a dilution of shares.
|
| If the valuation goes up 10x after the next round, the average
| developer will probably have the same amount of money locked
| in RSUs/options.
| aetherson wrote:
| Uh, what? Are you claiming that if a company experiences a
| 10x (!) increase in valuation, that the dilution fully
| destroys that upside and the average developer experiences
| no increase in their comp? That is not even vaguely close
| to true in my experience.
| filoleg wrote:
| > Are you claiming that if a company experiences a 10x
| (!) increase in valuation, that the dilution fully
| destroys that upside and the average developer
| experiences no increase in their comp?
|
| Not the person you are replying to, but it depends on how
| diluted it gets. If they print 9x of currently
| outstanding shares as the value goes 10x, it would result
| in those original shares being worth exactly the same as
| before the 10x jump.
|
| But I agree with you overall, in terms of the actual
| reality. I don't think any company with half a brain
| would do that.
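|
| A quick back-of-the-envelope sketch of that edge case, with
| made-up numbers rather than anything from an actual cap
| table:
|
|     # hypothetical pre-round position
|     outstanding = 1_000_000
|     my_shares = 1_000
|     valuation = 100_000_000                 # $100M
|     price = valuation / outstanding         # $100/share
|
|     # 10x valuation, but 9x new shares issued on top
|     new_valuation = 10 * valuation          # $1B
|     new_outstanding = 10 * outstanding      # 10M shares
|     new_price = new_valuation / new_outstanding
|
|     print(my_shares * price)                # 100000.0
|     print(my_shares * new_price)            # 100000.0
|
| In practice a round dilutes existing holders far less than
| that, so a 10x valuation jump still leaves them well ahead.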
| IshKebab wrote:
| > Even if they do, you will miss on the upside. MSFT stock is
| not going to 10X but OpenAI's might.
|
| I'm pretty sure it doesn't work like that due to the
| existence of leverage. You can make MSFT have the potential
| to 10x (or /10) if you want.
| zamalek wrote:
| > there's no way you guys would get the same comp though?
|
| Nadella was directly involved and he's way smarter than that.
| Comping one of the best ML teams in the world correctly is
| child's play compared to undoing Ballmer's open source mess.
| capableweb wrote:
| LinkedIn data says ~44% of the OpenAI team is
| "Engineering", the rest is operations, human resources and
| sales.
|
| So most likely, if most of the employees moved over to
| Microsoft, they wouldn't get the same comp, at most 44% of
| the company would.
| zamalek wrote:
| > 44% of the OpenAI team is "Engineering"
|
| Yes, the "you guys" average HN demographic per the gp
| comment.
| HWR_14 wrote:
| Why wouldn't MSFT honor the increased comp? They are still
| investing money in OpenAI at that multiple.
|
| They would have to create some kind of crazy structure to
| avoid wrecking their levels. But of course it could be done.
| mrtksn wrote:
| Obviously MS does some great things, but I wonder what you
| think about their culture?
|
| MS is trying to make me use Edge Browser by randomly refusing
| to let me use Bing chat in any other browser. This makes me
| think that MS is still evil and will force me to do things the
| moment they can. I gave up Bing Chat because of this and was
| using Perplexity.ai instead, until Bing got integrated into
| ChatGPT.
|
| Another thing is that the feel of MS products is very different
| from OpenAI's. For example, Bing chat, again, would have a strange
| UX where it counts down my message allowance on the topic as I
| keep asking follow up questions as if it was designed for their
| convenience and not mine. OpenAI products on the other hand
| feel much more pleasant to use and don't stress me out even
| when things don't work as expected, which makes me think that
| the approach to product design is very different from MS's.
|
| Same tech running in the Azure servers, completely different
| product experiences. IMHO this points to completely different
| mindsets.
| staunton wrote:
| > MS is still evil
|
| All big corporations are "evil", as in, their decisionmaking
| is scaled and institutionalized enough to effectively
| implement the goal of "maximizing shareholder value" above
| what's good for you or society.
| mrtksn wrote:
| No, not all companies try to exploit their strength in one
| area to make me use some other product that I have no
| interest in.
|
| I'm not talking about "hey, we have this product that you
| might like", I'm talking about "if you want to use this
| product, you must also use this product". There's also no
| technical reason for it, sometimes they will let me use it.
|
| Not O.K.
| kbenson wrote:
| > No, not all companies try to exploit their strength in
| one area to make me use some other product that I have no
| interest in.
|
| I'm not sure of any large one that doesn't. The
| incentives are aligned such that it's hard for them not
| to. It's easy to find examples of similar (sometimes
| almost identical) behavior from most large companies.
| mrtksn wrote:
| Okay, what other company does it? You may say Apple & App
| Store but that's not the same.
| andsoitis wrote:
| the conversation in this thread makes me wonder whether the
| right incentive for OpenAI should be to make tons of money.
|
| Personally, I prefer to make tons of money, but as someone who
| will not have a say in whether to participate in an AI-driven
| world, I would prefer if there were non-profit counterweights.
| Perhaps governments will have the wherewithal to steer things
| but I am somewhat concerned.
| strangattractor wrote:
| Smart - If it ain't written down it is not an offer.
| VirusNewbie wrote:
| If you're ok with sharing, did you sign it because you are more
| aligned with the productizing of GPT etc, or is it that you
| truly believe Sama is the CEO to follow? Or a combination of
| both?
| port515 wrote:
| Microsoft will be the leader in AI with or without OpenAI. Mark
| my words, you can take that to the bank. Come back to this post
| in 5 years, you'll see I predicted the future.
| miraculixx wrote:
| Apple more likely. Or some other co we don't know about yet.
| OpenAI is going to crumble. Microsoft will be challenged for
| trust. We'll see what happens.
| replwoacause wrote:
| Apple? They seem to be asleep at the wheel as far as AI goes.
| I thought their focus was hardware. Siri is a piece of junk.
| catchnear4321 wrote:
| apple only has one focus. ecosystem. that is the reason for
| the slow movements and the appearance of sleep. different
| scale. different game.
| neilv wrote:
| > _A scheduled tender offer, which was about to let employees
| sell their existing vested equity to outside investors, would
| have been canceled. All that equity would have been worth
| "nothing," this employee said._
|
| > _The former OpenAI employee estimated that, of the hundreds of
| people who signed the letter saying they would leave, "probably
| 70% of the folks on that list were like, 'Hey, can we, you know,
| have this tender go through?'"_
|
| If that one person's speculation is true, does the non-profit
| have an alignment problem, with employees who are doing the
| technical work -- that the employees are motivated more by
| individual financial situations, than by the non-profit's
| mission?
|
| (Is it possible to structure things such that the people doing
| the work don't have to think about their individual financial
| situations, and can focus 100% on the actual mission? Can they
| hire enough of the best people for their mission that way? And
| maybe also keep key technical talent away from competitors that
| way?)
| CaveTech wrote:
| I think it's relatively easy to prove that the main motivation
| for the majority of employees is not mission alignment, simply
| by the fact that salaries within OpenAI are in the top few
| percentile of the field.
|
| I don't believe this is necessarily in conflict with the
| mission though. Employees are mercenaries, it's up to
| management/leadership to enforce the mission and make sure
| employees contribute to it positively. The employee becomes
| forcefully aligned with the mission because that is the key to
| their personal enrichment. They are paid to contribute, their
| personal beliefs are not all that important.
| JohnFen wrote:
| > it's up to management/leadership to enforce the mission and
| make sure employees contribute to it positively.
|
| But surely, the primary tool that leadership has to do that
| is by selecting employees who are on the same page as them. A
| purely mercenary workforce is very undesirable unless the
| company is also mercenary.
| neilv wrote:
| > _Employees are mercenaries, it's up to
| management/leadership to enforce the mission and make sure
| employees contribute to it positively. The employee becomes
| forcefully aligned with the mission because that is the key
| to their personal enrichment. They are paid to contribute,
| their personal beliefs are not all that important._
|
| I think that might be the norm, but it sounds like an awful
| dynamic.
|
| It's also unfortunate if you want to do something better. We
| have many mercenary companies that have appropriated some
| of the language we might use to characterize something
| better.
|
| So, say you're trying to found a company with grand ideals,
| made up of people who care about the mission, where you
| actively want a diversity of ideas, etc. Almost every sentence
| you can think of to communicate that has a bunch of candidates
| nodding, "Yeah, yeah, whatever, we've heard this a hundred
| times, just tell me what the TC is, for the 18 months I'll
| stay here".
| dragonwriter wrote:
| > If that one person's speculation is true, does the non-profit
| have an alignment problem, with employees who are doing the
| technical work -- that the employees are motivated more by
| individual financial situations, than by the non-profit's
| mission?
|
| Yes, and moreover they've created a compensation structure
| which actively creates incentives that are _contrary_ to the
| charity 's mission.
|
| This was probably the easiest way to attract talent that had
| high paying alternatives and weren't particularly interested in
| the charity's mission, but that was always a fundamental
| problem with choosing a for-profit entity with those kinds of
| needs as the primary funding vehicle for the charity _and also_
| the primary means by which it would achieve research, etc.,
| directed at its charitable purpose.
|
| The problem -- taking OpenAI's stated charitable mission at
| face value [0] -- is that there was nowhere close to enough
| money available from people concerned with that mission to pay
| for it, and OpenAI's response was to go all-in on the most
| straightforward path of raising sufficient funds given what
| resources it had and what the market looked like without
| sufficiently considering the alignment of its fundraising
| mechanism with the purpose for which it was raising funds.
|
| [0] which I should emphasize that I do for the sake of
| argument, not because I necessarily believe that it represents
| the genuine purposes of the people involved even initially.
| ghaff wrote:
| >Is it possible to structure things such that the people doing
| the work don't have to think about their individual financial
| situations, and can focus 100% on the actual mission?
|
| Mostly no. People may not only care about money. But money is
| pretty important--at least until you get into pretty large
| numbers. And then it's still a pretty important keeping score
| metric.
| reso wrote:
| I am still so confused by this whole saga. No one has explained
| how or why the entire board decided to do a coup, and then simply
| changed their minds a few days later.
|
| I have to assume that the individuals involved are under a mix of
| social and legal pressure to not talk about the circumstances.
| richbell wrote:
| The speculation seems to be that Sam was being duplicitous and
| trying to oust a board member he didn't like. Members of the
| board compared notes and realized he had misrepresented his
| conversations with other members in an attempt to build
| consensus. Throw in the rapid expansion of OpenAI
| (commercialization being in conflict with the board's vague
| mandate to do what's best for humanity) and the rumored (now
| confirmed) deal he was arranging with a company that he had a
| financial stake in, and they felt like they needed to remove
| him.
|
| However, by trying to do so swiftly and not allowing him a
| chance to retaliate they pissed off Microsoft and the
| employees. At that point, they were basically forced to
| reinstate him or the entire company would collapse -- and if
| they do that, they can't also go on record clarifying why he's
| bad.
|
| * this is my vague recollection based on reading past
| discussions. I'm on my phone right now and unfortunately don't
| have any sources I can link, take this with a grain of salt.
|
| Edit: a few links
|
| https://news.ycombinator.com/item?id=38559770
|
| https://news.ycombinator.com/item?id=38548404
| JumpCrisscross wrote:
| Their communication strategy was also juvenile at best.
| ethbr1 wrote:
| Thanks for the citations.
|
| That was my guess... but only because it was the only
| scenario I could think of where the board being curiously and
| obviously intentionally vague about 'why' in the
| announcement, but still saying more than nothing, made sense.
| pixelmonkey wrote:
| This recent TIME article lays out the saga pretty
| straightforwardly and makes it a bit less confusing.
|
| https://time.com/6342827/ceo-of-the-year-2023-sam-altman/
|
| At least, that was how I felt after reading it.
|
| Basically, within the span of a year, OpenAI transformed from a
| research lab inside a non-profit that was pursuing a seemingly
| quixotic dream of artificial general intelligence (AGI)...
| into one of the fastest-growing for-profit software companies
| of all time via its creation of the chat-based generative AI
| category (aka ChatGPT) and its consumer/enterprise SaaS and API
| offerings.
|
| The board -- or, at least, its 4 remaining non-CEO members --
| thought that this was too much, too fast, and too soon, and
| that there was a narrow window of time where they could slow
| things down and refocus on the original non-profit mission.
| They also felt that Altman was a bit of a force of nature, had
| his own ideas about where OpenAI was going, and treated "board
| management" as one of his CEO skills to route around obstacles.
|
| Once a board loses trust in its CEO, unfortunately, there is
| usually only one blunt and powerful tool left: firing the CEO.
|
| And this happens pretty often. As the investor Jerry Neumann
| once put it, "Your board of directors is probably going to fire
| you."[1] Boards have very few ways to actually take action when
| they are worried about a company or institution; firing
| management is one of the few "course correction" actions they
| can take quickly.
|
| In OpenAI's case, if they had a for-profit board, that board
| would probably have been ecstatic with Altman and the company's
| progress. But this was not a for-profit board. It was a non-
| profit (mission-oriented) board meant to oversee the safe
| rollout of AGI. Those board members weren't sure the best way
| to do that was to become one of the world's largest for-profit
| software companies.
|
| I'd speculate that it was probably an emotional decision and
| the full implications were not entirely thought through until
| it was too late. I'd also speculate that this explains why Ilya
| Sutskever felt some immediate regret, because his goal wasn't
| to destroy OpenAI (or inspire an employee revolt) but to put
| its non-profit mission back into focus. I like to practice the
| principle of charity[2], and, in this case, I think the non-
| profit board was not acting maliciously but simply did not
| realize the knock-on effects of trying to replace a CEO when
| everything at the company seems to be "going right."
|
| I suspect Altman thought the best way to roll out AI was via
| iterative product development and fast revenue growth to
| finance the GPU demands, utilizing corporate partnerships
| (Microsoft), viral word-of-mouth marketing, and SaaS/API fees
| (ChatGPT). Running out of data center compute started to become
| a primary concern, so it wouldn't surprise me if safety took a
| backseat to this. Remember, all this growth happened in the
| span of a year. Perhaps Altman thought he was satisfying the
| safety concerns simply by talking to regulators, making
| iterative releases, and going on a speaking tour about it, but
| the board thought the only way to go safely was to go slower.
| I'm sure we'll learn more after some books are written about
| the episode.
|
| [1]: https://reactionwheel.net/2021/11/your-boards-of-
| directors-i...
|
| [2]: https://en.wikipedia.org/wiki/Principle_of_charity
| dmazzoni wrote:
| I think that all makes sense.
|
| Things would have played out very differently if the board
| was more experienced and thoughtful. Their hearts might have
| been in the right place, but their actions were reckless and
| ultimately backfired.
| JoshTko wrote:
| Interesting how you don't place any blame on Altman for not
| understanding and addressing board concerns. A more experienced
| CEO would have read the tea leaves.
| alecst wrote:
| That's a great summary, and I feel less confused after having
| read it. Thanks.
| tivert wrote:
| > I am still so confused by this whole saga. No one has
| explained how or why the entire board decided to do a coup, and
| then simply changed their minds a few days later.
|
| I don't know how anyone can call the board's action a "coup."
| Calling it that seems to be a propagandistic abuse of the term.
|
| The board was in charge, and it's not a coup if it fires a
| subordinate (the CEO). If anything, the coup was getting the
| board ousted.
| darkerside wrote:
| I think the coup is on the part of other employees who
| advocated for the board to fire Altman. No judgement in
| whether that was justified or not.
| tivert wrote:
| > I think the coup is on the part of other employees who
| advocated for the board to fire Altman. No judgement in
| whether that was justified or not.
|
| The GP was pretty clear that he thought "the entire board
| decided to do a coup," which does not fit that
| interpretation.
|
| But even the scenario you describe isn't something that can
| be properly described as a "coup." In that case, the
| employees are just appealing to a _legitimate_ higher
| authority, which is a totally OK thing to do (e.g. it's
| not wrong to report a bribe-taking boss to the company
| ethics hotline). IMHO, a coup is where subordinates
| illegitimately _usurp and depose_ the highest authority
| from below.
| PepperdineG wrote:
| >I don't know how anyone can call the board's action a
| "coup." Calling it that seems to be a propagandistic abuse of
| the term.
|
| >The board was in charge, and it's not a coup if it fires a
| subordinate (the CEO). If anything, the coup was getting the
| board ousted.
|
| The CEO was a member of the board and the ones that fired the
| CEO also fired the Chairman of the Board, so the board went
| from 6 to 4. So far there's been even less of an explanation
| for the firing of the Chairman of the Board - who they
| offered to let remain as a regular employee - than there has
| been for Altman, though I see the removal of the Chairman as
| potentially the most egregious and coup-like.
| bogomipz wrote:
| There was a bit of insight today from a board member on her
| perspective. See:
|
| https://archive.is/Sy3Xm
| nicce wrote:
| > No one has explained how or why the entire board decided to
| do a coup, and then simply changed their minds a few days
| later.
|
| The board's job is to hire or fire the CEO. Technically the CEO
| pulled off the coup, since he managed to throw his bosses out
| of their positions.
| 6gvONxR4sf7o wrote:
| > Given the absence of interest in joining Microsoft, many OpenAI
| employees "felt pressured" to sign the open letter, the employee
| admitted. The letter itself was drafted by a group of longtime
| staffers who have the most clout and money at stake with years of
| industry standing and equity built up, as well as higher pay.
| They began calling other staffers late on Sunday night, urging
| them to sign, the employee explained.
|
| What a clusterfuck. I feel bad for anyone who supported the
| board.
| Tenoke wrote:
| This isn't the board though. This is the side against the board.
| what_ever wrote:
| That's why they feel bad for those who supported the board, as
| the ones who opposed the board may not have done it based
| solely on the board's actions.
| staunton wrote:
| Anyone who supported the board was in a situation where they
| deemed they could afford to go against the peer pressure. This
| is a combination of acting according to their beliefs/values
| and economic security. Those people should be envied, not
| pitied.
| kbenson wrote:
| We should envy people that stand firm on their beliefs and
| are vindicated, not those that may experience backlash in the
| future because people that had a different view won in the
| end. Suffering hardship for doing what you think is right is
| not something to envy.
|
| Don't envy martyrs, wish for a world without the need for
| them.
| fwungy wrote:
| As if OpenAI employees would have any problem landing a new
| gig...
|
| Going from an agile startup environment to an F50 is a huge leap
| culturally. It's like going from Summer science camp to the army.
| wolverine876 wrote:
| > As if OpenAI employees would have any problem landing a new
| gig...
|
| They'd have problems landing new gigs where they worked on
| OpenAI, with OpenAI's resources, team, etc.
| badrabbit wrote:
| You know, I was just thinking how if I was a google exec I'd
| poach openai and/or attempt to gain some controlling shares of
| the org by throwing money at sam/the board.
| simplypeter wrote:
| Why would OpenAI employees want to jump ship to Microsoft,
| especially when Microsoft's been slashing jobs and freezing pay?
| Doesn't really add up for me.
| slantedview wrote:
| Even in normal times, Microsoft does stack ranking. It's not
| great.
| wvenable wrote:
| I might be wrong but Microsoft ended stack ranking in 2013.
| Havoc wrote:
| Not sure that part actually matters?
|
| The message of the letter was clear, and the gun being held at
| the board's head was credible. And 90%+ supported it on paper.
|
| Even if many of the 90% are lukewarm & half-arsed, from a
| leadership perspective that's conclusively game over.
| Crosseye_Jack wrote:
| Well, that's the benefit when you have multiple entities bidding
| for your labor. You just need to say "Company A will give me the
| same terms (if not better)"; no need to list A, B, C, D, etc.
| Even if you just want to stay where you are, you are still in a
| position to cherry-pick who you want to work for.
|
| Heck, I've done it myself. I was happy where I was, but wanted a
| pay bump, shopped around, went back to my employer and said if
| you will pay (highest bid + percentage) I'll stay, but otherwise
| I'm out the door. They paid up.
|
| It's the gamble you take. Granted, when most of the staff also
| have the same demands, it puts the company more on the back foot.
|
| Just a note, I've also taken work for less pay, but (imo) better
| working conditions. It all just depends on what you want your
| working conditions to be and what you're willing to accept in terms
| of comp.
___________________________________________________________________
(page generated 2023-12-07 23:00 UTC)