[HN Gopher] Macs to Get AI-Focused M4 Chips Starting in Late 2024
___________________________________________________________________
Macs to Get AI-Focused M4 Chips Starting in Late 2024
Author : alwillis
Score : 44 points
Date : 2024-04-11 21:33 UTC (1 hour ago)
(HTM) web link (www.macrumors.com)
(TXT) w3m dump (www.macrumors.com)
| roody15 wrote:
| Was looking at purchasing a PC laptop for a staff member today
| and was surprised just how much better Apple products are in the
| current M generation of devices. Better screens, no fans, better
| battery, better performance ... surprising there is not much
| competition. If you take a Dell XPS and add an OLED screen and a
| higher-resolution display ... you quickly find yourself near
| $2,000 with a device that still has less battery life, runs
| hotter, and although it may match or exceed the raw performance
| numbers ... is still arguably not as versatile or as good a
| machine.
| Rinzler89 wrote:
| Excuse me but what does this have to do with the next gen AI
| chips topic?
|
| Are there any threads left on HN that can be held on topic and
| not be invaded by the same ol' beaten to death "Apple M1 good,
| PCs bad" trope? It's been 4 years since the M series launched,
| everyone already knows what they can do, so what exactly are you
| bringing to this topic with your pointless shopping anecdote that
| has nothing to do with it?
|
| Not everyone is a web dev. Many jobs require you to use tools
| exclusive to Windows or Linux, which is why the market for
| expensive PC laptops is still very large despite them being worse
| on paper than MacBooks.
|
| macOS doesn't solve everyone's needs no matter how good Apple's
| chips are; if your software doesn't run on it then it's useless,
| so Apple is no threat to that market.
| lylejantzi3rd wrote:
| He's hoping for somebody to get a clue and build better PCs.
| Rinzler89 wrote:
| How is his comment helping the Intel engineers build better
| CPUs? Does he think that Intel and AMD already solved the
| immense challenges of building a better CPU but were just
| waiting for roody15's low-effort comment to finally ship them?
| lylejantzi3rd wrote:
| > We vent online in pointless comments and poof, magic
| happens?
|
| More than you might think. You never know who's reading.
| Rinzler89 wrote:
| So what if "they" read his comment? Do you think if Pat
| Gelsinger himself reads this, they'll just "engineer"
| faster now or what?
| lylejantzi3rd wrote:
| Apple seemed to manage just fine. You're saying there's
| nobody else out there that can do the same? And what's
| your solution? Suffer in silence? Or do you think yelling
| at other people on an online forum is somehow more
| productive?
| Rinzler89 wrote:
| _> Suffer in silence?_
|
| You must live in some cuckoo land of privileged entitlement
| if you think that not using Apple M laptops somehow equates
| to "suffering". Go out and get some perspective on what
| suffering actually feels like.
| lylejantzi3rd wrote:
| You know that's not what I meant. What do you think
| you're accomplishing here?
| ReverseCold wrote:
| Yeah... hopefully the upcoming Qualcomm Snapdragon X chips get
| integrated properly by someone so there's actual competition.
| Mobile data in my laptop also sounds like it would be pretty
| nice. Not holding my breath though, Microsoft may still mess it
| up on the Windows side :(
| wtallis wrote:
| Lenovo and/or Qualcomm managed to mess things up enough on
| the ThinkPad X13s that the cellular connectivity was provided
| by an M.2 card rather than the Snapdragon SoC.
| kingkongjaffa wrote:
| The battery life alone is worth kitting out your team with them,
| to be honest. My Windows work laptop is dead in under 2 hours.
| My M1 MBP will last all day, 8-9+ hours, under normal loads. The
| only app that eats the battery is Football Manager, and even then
| it lasts 3-4 hours.
| modeless wrote:
| I agree the lowest specced fanless MacBook Air is very
| appealing at its price point. But for laptops with fans, if you
| want to do gaming or AI the performance of a laptop with Nvidia
| will blow away anything from Apple, and the 2024 version of the
| ROG Zephyrus G14 looks pretty good.
| danielheath wrote:
| For gaming, absolutely; few enough games even run on OSX.
|
| For AI, unified memory is _way_ cheaper than high-ram GPUs.
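| To put rough numbers on that, here is a quick cost-per-GB sketch
| in Python. The GPU and upgrade prices are assumptions for
| illustration, not figures from this thread:
|
|     # Hypothetical prices: RTX 4090 ~$1,800 for 24 GB of VRAM,
|     # A100 80GB ~$15,000, Apple RAM upgrades ~$200 per 8 GB of
|     # unified memory. All of these are assumed, round numbers.
|     options = {
|         "RTX 4090 (24 GB VRAM)": 1800 / 24,
|         "A100 80GB": 15000 / 80,
|         "Apple unified memory upgrade": 200 / 8,
|     }
|     for name, per_gb in options.items():
|         print(f"{name}: ~${per_gb:.0f} per GB of model-visible memory")
|     # Roughly $75/GB and $188/GB for the GPUs vs ~$25/GB for
|     # unified memory, which is the gap the parent is pointing at.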
| andy99 wrote:
| Macs are better but absurdly expensive and I've only ever had
| an awful customer experience with them. For a daily development
| laptop, I don't think it's worth the extra money over Lenovo
| and I don't want to support Apple.
|
| I do think it would be cool to have an M_x chip to try and run
| ML stuff on.
| airstrike wrote:
| For a daily development laptop, I actually find Mac OS
| incredibly superior to the alternatives, but I guess ymmv
| wil421 wrote:
| Is AI the new 3D monitor fad? Am I going to have to pay for AI in
| places I don't care to, such as cars, refrigerators, and espresso
| machines?
| flir wrote:
| Would you like some toast?
| curious_cat_163 wrote:
| > Apple also plans to add a much improved Neural Engine that has
| an increased number of cores for AI tasks.
|
| This makes total sense on paper, but does it? Does Apple really
| have an application roadmap to ultimately utilize these tensor
| cores?
|
| For all the M1, M2, and M3s out there, Siri still sucks. We keep
| seeing the latest arXiv papers [1] coming out of their research
| efforts hinting at potential improvements, but there is not much
| in those papers that needs to wait for an M4 or M5...
|
| What gives?
|
| [1] https://arxiv.org/abs/2403.20329
| apetresc wrote:
| I mean, the rumours at this point are deafening that Siri is
| getting a complete overhaul (or an outright replacement, which
| probably amounts to the same thing) at WWDC in just two months,
| so I'd reserve judgment on Apple's AI deployment capabilities
| until then.
| Rinzler89 wrote:
| I think their main AI case will be photo and video editing or
| maybe some LLM bots to help you organize and search through
| files, but Siri feels really abandoned.
| knodi123 wrote:
| If they just gave Siri an off-the-shelf LLM with a few extra
| capabilities that plugged into their ecosystem, it would be
| amazing.
| Art9681 wrote:
| They recently published a paper where an on-device model
| has UI awareness.
|
| https://arxiv.org/abs/2404.05719
|
| Siri is about to get swole.
| sroussey wrote:
| Siri moved on-device, which was a huge engineering effort.
| But capabilities stagnated, likely because of the move
| on-device.
| digitcatphd wrote:
| Seems like marketing hype? Surely running local LLMs isn't that
| big a market.
| binkHN wrote:
| > Surely running local LLMs isn't that big a market.
|
| Agreed. That said, if you do AI/LLMs, few, if any, portable
| non-Mac systems have the ability to pull this off. The Macs
| just crush it when it comes to GPU and memory bandwidth
| performance, and they do this while sipping battery life.
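| For anyone who wants to try it, here is a minimal local-inference
| sketch on Apple silicon using Apple's MLX tooling. It assumes the
| mlx-lm package is installed, and the model name is just an example
| quantized checkpoint, not something from this thread:
|
|     # Load a 4-bit quantized model into unified memory and
|     # generate text entirely on-device.
|     from mlx_lm import load, generate
|
|     model, tokenizer = load(
|         "mlx-community/Mistral-7B-Instruct-v0.2-4bit")
|     response = generate(
|         model, tokenizer,
|         prompt="Explain unified memory in one sentence.",
|     )
|     print(response)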
| aiauthoritydev wrote:
| Generally if there are enough people, there will be an
| ecosystem building software for it.
| highwaylights wrote:
| > What gives?
|
| You give. More money.
|
| These new models aren't going to sell themselves.
| _boffin_ wrote:
| I think that Apple's biggest sell with this stuff is going to
| be utilizing all the SQLite databases on macOS, iOS, and iPadOS,
| along with all the on-device analytics that are constantly
| running. I have a feeling that we're going to see Journal become
| more of a central focus over the coming years.
|
| Question: Is it illegal for Apple to do something on-device
| that runs queries against all the databases for the user's
| applications?
| JSDevOps wrote:
| Ugh! M3s will hold their value, since not everyone will
| want this.
| lvl102 wrote:
| I guess they needed to support the stock today? This is just
| saying they will roll out M4 chips after M3. How prescient!
| highwaylights wrote:
| I predict that, after that, they will roll out Macs with the M5
| series of chips.
|
| Just a hunch.
| modeless wrote:
| I'm surprised by the almost universally bearish opinion here so
| far. AI is going to change everything about how we interact with
| computers. An M4 with a terabyte of RAM will run GPT-4-level
| models locally. Forget Siri. You will be able to converse
| directly with your computer as if it were a person. Not only in
| text, but by voice and even video, and it will be incredibly
| responsive. The computer will be able to see using the webcam and
| have conversations about objects you show it. It will be able to
| read your screen and give you advice about what you're working
| on, and even perform tasks for you. It will be able to browse the
| web for you, summarize content, and take actions on your behalf.
| It will be able to render an avatar for itself and show emotions,
| and it will see and react to your facial expressions and
| gestures.
|
| This is not speculative; almost all of this has already been
| shown to work, and it is all improving incredibly quickly as we
| speak! It just needs to be assembled into an actual product,
| which is exactly what Apple does best. With their silicon
| advantage they are best positioned to ship these kinds of
| features, and the best part is it can all run offline locally. No
| data sent to servers. No privacy concerns. Zero network latency.
| jsheard wrote:
| > A M4 with a terabyte of RAM will run GPT-4 level models
| locally.
|
| Perhaps, but a 1TB RAM upgrade would probably cost about as
| much as a new car. This is Apple we're talking about.
| sroussey wrote:
| The original Macintosh 128K was about $7,300 in today's dollars.
|
| Apple's Lisa was about $30,000 in today's dollars.
| jsheard wrote:
| I don't know what the relevance of prices from 40 years ago
| is, but today they want +$800 just to add +64GB RAM to a
| machine.
| Etheryte wrote:
| For context, if you get the beefiest MacBook Pro right now,
| you'd pay an extra $1,000 to upgrade from 48GB to 128GB of RAM.
| For lower-end machines you pay more for RAM: $200 for each
| 8GB you add on top of the 8GB baseline.
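| Extrapolating those upgrade rates to the hypothetical 1 TB
| configuration discussed upthread (Apple sells no such option;
| this is purely illustrative arithmetic):
|
|     # Naive extrapolation of the per-GB upgrade prices quoted
|     # above to an imaginary 1 TB configuration.
|     low_end_rate = 200 / 8             # $25/GB ($200 per 8 GB)
|     high_end_rate = 1000 / (128 - 48)  # $12.50/GB (48GB -> 128GB)
|     for rate in (low_end_rate, high_end_rate):
|         print(f"Hypothetical 1 TB upgrade: ~${rate * 1024:,.0f}")
|     # ~$25,600 at the low-end rate, ~$12,800 at the high-end rate,
|     # so "about as much as a new car" is in the right ballpark.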
| sebastiansm wrote:
| Maybe that M4 was the real electric Apple car project.
| seanmcdirmid wrote:
| This seems to suggest that you can install it on a 2019 Mac
| Pro at least: https://support.apple.com/en-us/102742.
|
| $4,859.99 for 8x 128GB LR-DIMMs on Amazon:
| https://www.amazon.com/8x128GB-DDR4-3200-PC4-25600-NEMIX-RAM...
|
| The new Mac Pro (with the M2 Ultra) uses unified memory, so I
| guess no more third party upgrades? Looking here:
|
| https://everymac.com/actual-maximum-mac-ram/actual-maximum-m...
|
| The new Mac Pro is limited to 192GB while the previous Mac
| Pro can go up to 1.5TB.
| binkHN wrote:
| > terabyte of RAM
|
| LOL. Have you seen the pricing of Mac memory?
| jprete wrote:
| I don't really want any of those things. Predictability and
| control are what I want from a computer, not an ersatz
| companion that back-seat-drives everything I do.
| ok_dad wrote:
| > You will be able to converse directly with your computer as
| if it was a person.
|
| No thanks, I like that my computer only speaks an incredibly
| specific language which I can use to tell it _exactly_ what to
| do, and know it's not going to do anything else. If I wanted a
| PA, I would hire one.
| glial wrote:
| Perhaps you aren't the target market for this.
| Loveaway wrote:
| Hope they can deliver. Right now Apple hardware is silly compared
| to PC+Nvidia if you wanna play around with GenAI. Both in price
| and performance. Worst case, Macs end up as thin clients, with
| all AI running on Nvidia in the cloud. That would eat into their
| competitive advantage a lot, I think.
| oidar wrote:
| If I wanted to build a PC today that could run the big models
| that were released recently (For example, Mixtral 8x22B and
| Command-R with as little quantization as possible) what would I
| buy?
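| One way to frame it is pure weight memory. A rough sizing sketch
| follows; the parameter counts are approximate public figures, the
| quantization factors are simplifications, and real runtimes also
| need room for the KV cache and activations:
|
|     # Approximate weight memory for the models mentioned above.
|     models_b_params = {"Mixtral 8x22B": 141, "Command-R": 35}
|     bytes_per_param = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}
|     for name, params in models_b_params.items():
|         for quant, bpp in bytes_per_param.items():
|             print(f"{name} @ {quant}: ~{params * bpp:.0f} GB")
|     # Mixtral 8x22B comes out around 282 GB at fp16 and 141 GB
|     # at int8, so you are looking at either a very large
|     # unified-memory Mac or multiple big-VRAM GPUs, as the
|     # replies note.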
| wmf wrote:
| _Right now Apple hardware is silly compared to PC+Nvidia if you
| wanna play around with GenAI._
|
| You mean it's silly how far ahead Apple is since they offer 192
| GB of VRAM while Nvidia only allows 24 GB for reasonable
| prices? Or do you mean it's silly to compare <$10K Macs with
| >$30K Nvidia setups in the first place?
| sroussey wrote:
| If they ship the M4 this year, then they might be on a yearly
| schedule from here on out.
| russellbeattie wrote:
| I predict Apple's launch events in the fall are going to be off
| the charts in terms of productizing AI.
|
| I'm not an Apple fanboy - I just think Apple as a company has
| been preparing for and thinking about this stuff for literally
| decades. They set the bar for user-friendly products. It won't be
| the first time that Apple is late to the market, but they've
| always redefined it when they arrive, becoming the new standard.
| It's what they do.
|
| I'm a firm believer in AI at the edge: low latency, privacy,
| personalization, device integration, and no need to invest in
| massive AI server farms. It's in Apple's best interest to bring
| AI computation to the masses.
|
| But we'll see if Tim can pull it off.
___________________________________________________________________
(page generated 2024-04-11 23:01 UTC)