[HN Gopher] Greg (2017)
___________________________________________________________________
Greg (2017)
Author : admp
Score : 137 points
Date : 2023-11-18 18:22 UTC (4 hours ago)
(HTM) web link (blog.samaltman.com)
(TXT) w3m dump (blog.samaltman.com)
| moralestapia wrote:
| :')
| unglaublich wrote:
| I see an analogy with AI becoming _so good_ that it will attempt
| to remove the human to improve its reward.
| someperson wrote:
| Very effusive and well-written praise for Sam Altman's friend and
| colleague, Greg Brockman.
| thepablohansen wrote:
| > with an average email response time of about 5 minutes to
| anything.
|
| Seems like he's always considered this a good measure of a
| founder's quality.
|
| From a 2019 interview-
| https://conversationswithtyler.com/episodes/sam-altman/
|
| > You know, years ago I wrote a little program to look at this,
| like how quickly our best founders -- the founders that run
| billion-plus companies -- answer my emails versus our bad
| founders. I don't remember the exact data, but it was mind-
| blowingly different. It was a difference of minutes versus days
| on average response times
| jacquesm wrote:
 | That seems a bit simplistic, and 'bad founders' begs for a
 | definition; it can't just mean 'answers email slowly'.
| shrimpx wrote:
| I take it bad founders means the founders that run less than
| billion-plus companies.
| kevinmchugh wrote:
| I would absolutely believe that founders who (go on to) run
| billion dollar companies make a point of replying to YC
| partners very quickly. "Successful executives are highly
| responsive to investor and advisor emails" seems eminently
| plausible. It doesn't suggest that they're equally responsive
| to all emails, but they've got a sense of who's
| important/needs to feel important.
| jacquesm wrote:
| I'm sure they do. But you can also interpret that as
| 'bootlickers are the kind of people I like'. And that is
| not necessarily equivalent to 'good founders'. So I think
| it is a bit of a thin element to judge people by.
| nwiswell wrote:
| I think it is maybe best reframed as "good founders from
| the perspective of those who control capital".
|
| Whether these people ultimately improve society, or
| create a better sense of purpose for their employees, or
| provide visionary direction for the company at a higher
| rate than other founders is kind of orthogonal (or
| perhaps anticorrelated) to being good stewards of
| invested capital.
|
| Founders can, to some extent, get help with the other
| things, but I think it can be reasonably argued that if
| they're not _personally_ regarded as good stewards of
| capital, then the whole enterprise is in doubt.
| jacquesm wrote:
| Precisely. Good founders, bad founders, in the eyes of
| the beholder.
|
| I'm pretty sure I have a completely different opinion on
| what constitutes a good founder and what constitutes a
| bad one compared to Sam Altman, fortunately I don't have
| enough clout to make authoritative statements on the
| subject.
| sprobertson wrote:
 | P(A|B) != P(B|A)
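 | A toy numerical sketch of that asymmetry, with invented counts
 | (suppose 900 founders reply fast, 10 run billion-dollar
 | companies, and 9 of those 10 reply fast):

```python
# Invented counts illustrating that P(fast reply | success) and
# P(success | fast reply) are very different quantities.
fast_and_successful = 9   # billion-dollar founders who reply fast
successful_total = 10     # all billion-dollar founders
fast_total = 900          # all founders who reply fast

p_fast_given_success = fast_and_successful / successful_total  # 0.9
p_success_given_fast = fast_and_successful / fast_total        # 0.01

print(p_fast_given_success, p_success_given_fast)
```

 | Responsiveness being near-universal among good founders says
 | almost nothing about how many responsive founders are good.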
| snowwrestler wrote:
| One thing to keep in mind is that this was the email response
| time to _Sam Altman, the head of YC._ What competent startup
| founder waits to reply to that?
|
| Responsiveness as a general approach to all email is a bad
| idea. But one needs to know who are the high-priority emailers,
| and how much they value quick replies.
| nopromisessir wrote:
| I read alot. Saw many rumors. I'm aware of the various 'insider
| scoops'. I still maintain we really don't know what happened,
| more or less.
|
| I'm certain of this though... when Greg Brockman walked out the
| door, they lost a major piece of talent.
|
| That guy was a true believer. His enthusiasm was infectious. It
| traveled across the video link... You could feel how passionate
 | he was about the future of artificial intelligence and its
 | capacity to change humanity for the better.
|
| I'm sure he'll throw himself at something very cool for the next
| run.
| andygeorge wrote:
| > I read alot.
| jackblemming wrote:
 | So what did Sam actually do? Besides founding a social media
 | startup that was more or less a flop, he was gifted a high
 | position at Y Combinator, where he then had a few good
 | investments during literally the easiest time to invest in
 | tech history. I'm sure there's something I'm missing, but
 | there's not much public info.
| Mithriil wrote:
 | Do you even understand how much work it takes to massively
 | influence a whole industry, together with its biggest
 | players?
| jackblemming wrote:
| Maybe some want to celebrate BSers like Adam Neumann or
| Elizabeth Holmes who are good at pretending to be important
| and conning investments, but it never really impressed me,
| sorry.
|
| I'll stick to celebrating the actual brains, like Ilya
| Sutskever.
| csours wrote:
| As the farmer said, "We'll see"
|
| https://impossiblehq.com/well-see/ (or google "we'll see
| story")
| ignoramous wrote:
| This is a common retort, but after his run at YC (hand-picked
| by Paul Graham) and OpenAI (taking on Google at AI is no mean
| feat, despite the backing), and his ongoing work with Helion
| Energy and WorldCoin, it is safe to say Sam has more than
 | earned his place, perhaps twice over, among SV royalty.
| And he's not even 40.
|
| http://paulgraham.com/5founders.html
| polygamous_bat wrote:
| > WorldCoin
|
| Is that supposed to make him look good? Because it doesn't,
| in fact it makes him look very out of touch at best and a
| complete fool at worst.
| torginus wrote:
| I just realized it's THAT Helion. Their fusion experiments
| are not without controversy.
|
| Here's a video on explaining how it works:
|
| https://www.youtube.com/watch?v=_bDXXWQxK38
|
| And here's a video explaining what's wrong with the scheme:
|
| https://www.youtube.com/watch?v=3vUPhsFoniw
|
 | But the TLDW version is that the environment required for
 | fusing He3 with deuterium also leads to deuterium fusing with
| itself, a reaction that creates neutron radiation that
| irradiates its environment.
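 | In standard notation, the reactions being discussed (the
 | branching ratios are the textbook roughly-50/50 split, not a
 | claim about Helion's specific machine):

```latex
% Primary aneutronic reaction targeted by Helion:
\begin{align*}
  \mathrm{D} + {}^{3}\mathrm{He} &\to {}^{4}\mathrm{He} + \mathrm{p} \\
  % Deuterium-deuterium side reactions (roughly 50/50 branches);
  % the first branch produces the neutron radiation in question:
  \mathrm{D} + \mathrm{D} &\to {}^{3}\mathrm{He} + \mathrm{n} \\
  \mathrm{D} + \mathrm{D} &\to \mathrm{T} + \mathrm{p}
\end{align*}
```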
| shakow wrote:
| > that irradiates its environment.
|
| What is the issue, as long as the containment containers
| are properly designed?
| codethief wrote:
| Radiation damage to the reactor structure and radioactive
| waste, among other things.
| joak wrote:
 | The second video is just nonsense trolling; if you want
 | well-informed discussion and skepticism about Helion, you'd
 | do better to check what the fusion subreddit says about it:
 | https://old.reddit.com/r/fusion/search/?q=helion&restrict_sr...
|
| Long story short: Helion plans a net-electricity demo in
| 2024 and to start selling to the grid in 2028. The timeline
| seems too good to be true but no one says it's impossible.
 | Many say they don't have enough publications and that there
| are many scientific unknowns. Failure is a possibility,
| success also. Given the timeline we'll know soon.
| stonogo wrote:
| "Hand-picked by so-and-so" used to have another name: "one of
| the good ole boys." Before that, in England, one was "sound."
| It's not a qualification, it's an anointing.
|
| So we have "handing out money," OpenAI, a typical fusion
| outfit (breakeven next year, every year), and a
| cryptocurrency that has already been chased out of the one
| country that tried to adopt it.
|
| I like Sam Altman and he seems to be a genuine person with
| laudable goals, but OpenAI is the only place where he really
| seemed to deliver, and even then there are a lot of people
| unhappy with the non-profit/private subsidiary surprise
| structure.
| bmitc wrote:
| I just want to clarify that it isn't "his" work, is it? It's
| more that he's attached himself to those projects.
| roflyear wrote:
| I haven't worked for Sam, and expect most people commenting
| on him haven't either, so they only have his interviews and
| his public commentary to judge him by. From that commentary
| he seems extremely ... vanilla? But that is probably good for
| an exec.
|
 | I haven't read any of his blogs and thought "wow, how
 | insightful" -- rather, they read like the press releases I
| see constantly on LinkedIn. "You have to put something out
| there" type of stuff. Just doing it to do it, not to share
| insight.
|
| That's my take, anyway, from basically all I've seen of him,
| and this gives a "not special" vibe, but my gut tells me
| that's very, very intentional ...
| shmatt wrote:
| Sounds like they were the perfect fit in the pre GPT-3, ChatGPT,
| DALL-E world
|
| Honestly I don't understand the drama around 2 executives who
| have not done any transformer research or whatever will come
| after transformers
|
| The GenAI world will be fine without the ceo of cryptoballs or
| whatever his other company is
| polygamous_bat wrote:
| > whatever will come after transformers
|
| I feel like this is a point that is not being talked about
| enough. Yes, OpenAI gave us GPT and DALL-E. But had sama and
| gdb remained there, would we have gotten anything new that is
| as groundbreaking as the original GPT and DALL-E, or would we
| have continued getting GPT-12 and DALL-E-19? Sure, iPhone 15
| sells, but some may say Apple has stagnated since iPhone was
| released.
| AussieWog93 wrote:
| But now, are we even going to get GPT-4 or GPT-5 with the
| same level of polish that sama would have put into it?
|
| I'd argue right now that we're at the "iPhone 3G" point on
| the technology curve, with significant improvements to come
| over the next few years as the tech gets polished.
| dlivingston wrote:
| Sorry to nitpick, but --
|
| OpenAI was releasing innovations in the GenAI space at a
| breakneck pace. Remember, GPT-1 didn't change the world, it
| was GPT-3.5/4 from _earlier this year_. OpenAI was at peak
| innovation when sama and gdb left.
|
| And folks used to say Apple was stagnant, but after Apple
| Silicon completely upended the personal computer world (along
| with some other things) the dissidents have been mostly
| silent.
| davidy123 wrote:
| Apple Silicon made x86 silicon look bad, but what has it
| really upended? Macs are taking over more of the personal
| computer market, but hard to say what the factor is there.
| I think it's mostly network effects, partially due to their
 | shameless proprietary approach. PCs, Apple or otherwise, are
 | generally good no matter the price or configuration, and at
 | the same time they're fading from view: a lot can be done
 | with just a browser on any foundation.
| behind or nonexistent where things are really changing, AI
| and cloud.
| bmitc wrote:
| > Apple Silicon completely upended the personal computer
| world
|
| How did it upend the personal computer world? Apple's chip
| developments are an amazing technological achievement, but
| they don't have anything innovative to put them in. Apple
| slaps them in grossly thermally limited form factors, where
| the chips can't operate anywhere close to their capability.
| It's kind of a silly exercise, in my opinion. At the end of
| the day, Apple has made the same computers, phones, and
| tablets for the past 10 years. I'm not sure where the
| innovation is.
| gdhkgdhkvff wrote:
| From the various sources it appears that they're being fired
| because they were trying to push the envelope TOO HARD, not
| the other way around.
|
| And, outside of the Cynicism-Is-Intelligence hackernews
| crowd, basically everyone has been fawning over the breakneck
| speed of progress coming out of OpenAI, even at the recent
| OpenAI devdays.
| sctb wrote:
| I don't know gdb very well, but I did get to chat with him a
| bit about what he was working on around the time of this
| article, which was mostly infrastructure grunt work, removing
| obstacles and procedural rough edges--basically anything to
| make the researchers and engineers as happy and productive as
| possible. It is so, so easy to undervalue that kind of work
| done by a totally brilliant and capable technologist. For an
| early stage startup it's gold. For a later stage startup it's
| gold.
| avindroth wrote:
| Why do you think people undervalue it? Very curious.
| sctb wrote:
| It's the impression I got from "who have not done any
| transformer research", as well as the fact that sama wrote
| this article.
| d3ckard wrote:
| You can't demo it and sell it.
| swatcoder wrote:
| It's gold, but it's not singular. There are many people who
| have been doing that work for decades and are able to step
| into the role. The same can't be said for the R&D work he's
| supporting, as comparatively few have deep insight or
| experience for working with the innovative tech yet.
|
 | So while Greg's work would have been extremely valuable, its
 | value is on a lesser order of magnitude than that of many of the
| other researchers and engineers who OpenAI had collected into
| its ranks. More essential innovative value will be lost to
| the bleed of loyalists and startup bettors who will peel off
| from those ranks.
| sctb wrote:
| I'm suggesting that there are not many people who have been
| doing that work, at least not at the same level or to the
| same effect. He did it with Stripe and OpenAI, back to
| back.
| Nidhug wrote:
| I think that there is some kind of elitism around AI
| researchers. Yes they are very valuable, but someone
| helping everyone else be more productive is absolutely
| critical.
| swatcoder wrote:
| Having a car might be critical and acquiring a car might
| be expensive, but there are a lot of them and they are
| ultimately replaceable. If yours is lost and you still
| have cash, you can generally go find a new one the same
| day and borrow a ride from someone if you really need to.
|
| That's not necessarily true for (say) the rare high-end
| graphics board you use for running local inferences. It's
 | also expensive -- even if less expensive than the car -- but
| replacing it can be a bigger deal and cause a complete
| interruption.
|
| There are countless experienced late-career generalists
 | who can keep projects moving by contributing critical,
| smart support. I'm one of them. We're extremely valuable
| indeed.
|
| But there really are far fewer people who were ahead of
| the curve and years-deep into the AI research central to
| OpenAI's entire existence. Those people are beyond
| _critical_ , they're _essential_.
|
| That doesn't make them better people, or smarter people,
| or in any other way elite. It just means that _in the
| context of OpenAI_ those people are much harder to come
| by and can be much more disruptive when lost.
| mnky9800n wrote:
| That's the job of any good professor for their PhDs and
| postdocs.
| bmitc wrote:
| I don't even know how people like this get valued so much. Why
| do people treat Silicon Valley "entrepreneurs" and investors as
| if they're made out of some sort of intellectual adamantium?
| Aren't they, generally speaking, just people looking to make a
| name and buck for themselves, primarily driven by ego rather
| than intellectual or philosophical pursuits? Most of them got
| lucky with some relatively dumb or straightforward product in
| the middle of a bubble and are not responsible for some major
| leap forward in technology.
| someperson wrote:
| So my understanding from reading the drama the past day is Sam
| Altman was fired from OpenAI due to being too inclined to 'move
| fast and break things' by commercializing OpenAI technology, with
| Greg Brockman (cofounder/board member/close friend/ally) choosing
| to resign in solidarity. The board coup was organized by
 | cofounder/Chief Scientist Ilya Sutskever, who apparently wants
 | to return to OpenAI's original slow-moving, safety-first vision.
|
| It's speculated Sam Altman and Greg Brockman may start a new AI
| company.
|
| So now seems like a good time to mention a few very high-level
| points in case they read it:
|
 | 1. I love Sam Altman's ship-early-and-often inclination, even if
| that apparently got him fired. OpenAI was such a breath of fresh
| air compared to sclerotic companies like Google that can invent
| the Transformer architecture yet be organizationally incapable of
| shipping ChatGPT-level tools for years due to overly conservative
| safety concerns
|
| 2. I hate OpenAI (or Sam Altman's?) apparently puritanical
| inclination to anything considered Not Safe For Work, especially
| for paid API usage. Why not allow people to build and sell
| virtual partner chat bots with explicit NSFW content?
|
| 3. I dislike his apparent inclination to build a regulatory moat
| to block others from developing advanced AI -- it's easy to
| interpret this as purely in the self-interest of OpenAI
| shareholders
|
| Without Sam Altman's inclination to move fast I imagine OpenAI
| may become slow, sclerotic and less capable of shipping early,
| like what Google has become.
|
| Good luck Sam, and keep on shipping!
| gkoberger wrote:
| Minor note: Greg was first fired as chairman, and then
| subsequently resigned from the company. They were separate
| actions.
|
| Source: https://twitter.com/gdb/status/1725736242137182594
| browningstreet wrote:
| And the board was ridiculous thinking they could demote him
| and have him stick around. That was either weirdly short-
| sighted or strategic theater. I kind of think it might have
| been the former.
| gkoberger wrote:
| They knew he was going to leave. It's likely a combination
| of the following:
|
| 1. They couldn't fire him as an employee (or felt it was an
| overreach of their mandate)
|
| 2. They wanted to signal a clear distinction that they lost
| faith in him as Chairman, while not losing faith in his
| work as an employee.
|
| 3. They felt like it would play better with the company if
| his ultimate departure was his decision rather than theirs.
|
| 4. Mira, as the new acting CEO (and someone who had nothing
| to do with the actions), declined to fire him even though
| she knew it was ultimately futile.
| sroussey wrote:
| They don't have to pay him an exit fee.
| gkoberger wrote:
| I doubt they care about this. This move already signals
| they're not optimizing for financial outcomes, and the
| independent board members (3/4 involved in this decision)
| have no equity in OpenAI.
| roflyear wrote:
| My take is, Sam and Greg are not the executives they want
| people to think they are. This was recognized, and they got
| upset because of this, and things shook out this way.
| gkoberger wrote:
| Over the past decade, I've never heard a single bad thing
| about Sam or Greg from anyone who has worked with them.
|
| The board may know something nobody else does, but I think
| (given the current information) it's significantly more
| likely that they _are_ who they purport to be... it's just
| that the board wanted something different.
| capableweb wrote:
| > 2. I hate OpenAI (or Sam Altman's?) apparently puritanical
| inclination to anything considered Not Safe For Work,
| especially for paid API usage. Why not allow people to build
| and sell virtual partner chat bots with explicit NSFW content?
|
 | I don't think that is either's fault; the US is just very
 | puritanical, and a lot of it is because credit card companies
 | and banks don't like it.
|
| > it's easy to interpret this as purely in the self-interest of
| OpenAI shareholders
|
| My guess is that this is probably what the board didn't like,
| Altman focused too much on profits in various ways.
| jsyang00 wrote:
| An OpenAI which allowed NSFW content literally could not
| exist. It would be shut down in under a week. Maybe possible
| under a different regulatory regime (France?) but even then I
| doubt it... any model developed by a company and offered as a
| product will have some censorship which gets baked in.
| wslh wrote:
 | Could you expand on why? There is a lot of NSFW content on
 | the Internet. What could be different, regulation-wise, this
| time?
| someperson wrote:
 | But there are plenty of successful US-based sites that host
| both SFW and NSFW content: Reddit, Twitter, Tumblr (before
| Yahoo), DeviantArt, etc
|
 | Even Patreon, it seems (which I've actually heard described
 | as an "NSFW launderer"), is fundamentally built upon
 | interactions with credit cards and banks.
|
| I don't know how true it is, but I've read that payment
| processors like Visa and Mastercard are actually agnostic --
| it's the high-rates of chargebacks that they have a problem
| with.
| pixl97 wrote:
| Reddit isn't what I would call successful in making money,
| so there is that.
| jug wrote:
 | The idea that the profit focus was wrong seems so weird to
 | me. It's an
| awfully expensive operation to run GPT-4 at scale and even
| now, it's rumored they are running the services at a loss. I
| understand the philosophical side, sure, but you can't just
| disregard all those massive GPU farms and staff tuning their
 | models. AI's energy use is said to rival that of an entire
 | country, and OpenAI no doubt accounts for a large portion of
 | that.
| someperson wrote:
 | Certainly Microsoft's GPT-4 infrastructure is still eye-
 | wateringly expensive:
|
| > GitHub Copilot has reportedly been costing Microsoft up
| to $80 per user per month in some cases as the company
| struggles to make its AI assistant turn a profit.
|
| > According to a Wall Street Journal report, the figures
| reportedly come from an unnamed individual familiar with
| the company, who noted that the Microsoft-owned platform
| was losing an average of $20 per user per month in the
| first few months of 2023.
|
 | https://www.techradar.com/pro/microsoft-is-reportedly-losing...
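 | Back-of-envelope on those figures, assuming Copilot's
 | then-current $10/month individual price (my assumption, not
 | stated in the report):

```python
# Implied per-user monthly costs of GitHub Copilot from the reported
# loss figures. The $10/month subscription price is an assumption.
price = 10       # USD/month, assumed individual subscription price
avg_loss = 20    # USD/month, reported average loss per user
peak_cost = 80   # USD/month, reported cost for the heaviest users

avg_cost = price + avg_loss    # ~$30/month to serve an average user
peak_loss = peak_cost - price  # ~$70/month lost on the heaviest users

print(avg_cost, peak_loss)
```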
| jmerz wrote:
| I'm hacking on some GPT-for-long-form-text stuff right
| now and it is _eye wateringly_ expensive once you start
| generating at anything close to "professional human"
 | token outputs. $80 per month already sounds pretty
 | optimized.
| alsodumb wrote:
 | This article about Copilot is BS. Nat Friedman refuted it on
 | Twitter and made clear that Copilot wasn't losing money.
| earthboundkid wrote:
 | The web API based licensing scheme is dumb and bad. It's
 | buggy-whip-holder-on-a-Model-T thinking. The license should
 | be that they sell a license for use of the weights. They can
 | also sell a SaaS that offers a web API to use the weights.
 | But the weights are the thing other businesses actually want,
 | and refusing to sell them is controlling and obviously a
 | monopoly play. Other businesses have an obvious incentive to
 | only work with companies that sell weights, so as to avoid
 | being mere serfs on someone else's SaaS farm.
| joanfihu wrote:
| It's not about Ilya wanting to slow things down.
|
| Ilya is the technical mastermind behind OAI. The technical
| breakthroughs needed for AGI are not there yet. Ilya, Yann,
| Demis and many others are aware of it.
|
| An aggressive push for applied research and commercialisation
| means less resources for technical breakthroughs.
|
| This is a tricky situation.
| ChrisArchitect wrote:
| (2017)
|
| Previous discussion:
| https://news.ycombinator.com/item?id=13811403
| catlover76 wrote:
| > Elon and I were both busy with day jobs
|
| Herr Musk was involved at the beginning?
| mkl wrote:
| Not sure why you think he's German, but yes:
| https://en.wikipedia.org/wiki/OpenAI#History
| charlie0 wrote:
| Yup, my boss at old co had some similarities to Greg. He worked
| long hours, was pretty much aware of everything happening at the
| company both in the US and off-shore, could talk technical
| details with the tech peeps, and business stuff with the non-tech
| folks. He was always on top of things and unblocking people all
 | over the org. He also had this great ability to remember
 | things very well, even after months had passed.
|
| Even though he was CTO and incredibly busy, he would find time to
| spend with individual engineers. Once he spent an hour pair-
| programming with me on a difficult issue. Even though his time
| was obviously not spent coding, it was a very productive session.
|
| The founder acknowledged that he really couldn't have done it
| without him, or someone like him on the team. I 100% agree they
| couldn't have built the org without him. He was just on a
| different level and it was awesome seeing him in action.
| shrimpx wrote:
| Near-100% certainty that Altman and Brockman cofound a new AI
| company in the coming days. The question is will they be able to
| recruit a team that can actually build competitive models? Ilya
 | Sutskevers don't grow on trees. Maybe they'll just get a team
| good enough to specialize Llama2, since Altman/Brockman seem to
| think what's lacking in this space is glitzy products, app
| stores, b2b integrations, etc. Maybe OpenAI starts to open source
| everything and Altman/Brockman can have their cake and eat it,
| too.
| drexlspivey wrote:
 | It will be very ironic if their new startup gets dragged down
 | by regulation because they lack the kind of AI license that
 | sama pushed for heavily in Congress.
| willsmith72 wrote:
| > Maybe OpenAI starts to open source everything
|
| I really doubt it
___________________________________________________________________
(page generated 2023-11-18 23:00 UTC)