[HN Gopher] Apple introduces M4 chip
       ___________________________________________________________________
        
       Apple introduces M4 chip
        
       Author : excsn
       Score  : 961 points
       Date   : 2024-05-07 14:37 UTC (8 hours ago)
        
 (HTM) web link (www.apple.com)
 (TXT) w3m dump (www.apple.com)
        
       | stjo wrote:
       | Only in the new iPads though, no word when it'll be available in
       | Macs.
        
         | speg wrote:
          | In the video event, Tim mentions more updates at WWDC next
          | month - I suspect we will see an M4 MacBook Pro then.
        
           | atonse wrote:
            | Haven't they been announcing Pros and Maxes around December?
           | I don't remember. If they're debuting them at WWDC I'll
           | definitely upgrade my M1. I don't even feel the need to, but
           | it's been 2.5 years.
        
             | Foobar8568 wrote:
              | November 2023 for the M3 refresh, M2 was January 2023 if I
              | remember correctly.
        
             | hmottestad wrote:
             | Mac Studio with M4 Ultra. Then M4 Pro and Max later in the
             | year.
        
         | throwaway5959 wrote:
         | Hopefully in the Mac Mini at WWDC.
        
       | ramboldio wrote:
        | If only macOS would run on the iPad...
        
         | user90131313 wrote:
          | All that power, and the iPad stays as limited as a toy.
        
         | Eun wrote:
         | then a lot of people would buy it, including me :-)
        
         | vbezhenar wrote:
          | Please no. I don't want touch support in macOS.
        
           | jwells89 wrote:
           | It might work if running in Mac mode required a reboot (no on
           | the fly switching between iOS and macOS) and a connected
           | KB+mouse, with the touch part of the screen (aside from
           | Pencil usage) turning inert in Mac mode.
           | 
           | Otherwise yes, desktop operating systems are a terrible
           | experience on touch devices.
        
             | vbezhenar wrote:
             | > It might work if running in Mac mode required a reboot
             | (no on the fly switching between iOS and macOS) and a
             | connected KB+mouse, with the touch part of the screen
             | (aside from Pencil usage) turning inert in Mac mode.
             | 
              | Sounds like a strictly worse version of a MacBook. It might
              | be useful for occasional work, but I expect people who would
              | use this mode continuously to just switch to a MacBook.
        
               | jwells89 wrote:
               | The biggest market would be for travelers who essentially
               | want a work/leisure toggle.
               | 
               | It's not too uncommon for people to carry both an iPad
               | and MacBook for example, but a 12.9" iPad that could
               | reboot into macOS to get some work done and then drop
               | back to iPadOS for watching movies or sketching could
                | replace both without too much sacrifice. There are
                | tradeoffs, but nothing worse than what you see on PC
                | 2-in-1s, plus no questionable hinges to fail.
        
               | ginko wrote:
                | MacBooks, even the Air, are too large and heavy IMO. A
                | 10-11 inch tablet running a real OS would be ideal for
                | travel.
        
           | Nevermark wrote:
           | Because it would be so great you couldn't help using it? /h
           | 
            | What would be the downside to others using it?
           | 
           | I get frustrated that Mac doesn't respond to look & pinch!
        
             | vbezhenar wrote:
             | Because I saw how this transformed Windows and GNOME.
             | Applications will be reworked with touch support and become
             | worse for me.
        
           | jonhohle wrote:
           | Why would you need it? Modern iPads have thunderbolt ports
           | (minimally USB-C) and already allow keyboards, network
           | adapters, etc. to be connected. It would be like an iMac
           | without the stand and an option to put it in a keyboard
           | enclosure. Sounds awesome.
        
         | tcfunk wrote:
          | I'd settle for some version of Xcode, or some other way of not
          | requiring a macOS machine to ship iOS apps.
        
           | JimDabell wrote:
           | Swift Playgrounds, which runs on the iPad, can already be
           | used to build an app and deploy it to the App Store without a
           | Mac.
        
             | alexpc201 wrote:
              | You can't make a decent iOS app with Swift Playgrounds; it's
              | just a toy for kids to learn to code.
        
               | interpol_p wrote:
               | You're probably correct about it being hard to make a
               | decent iOS app in Swift Playgrounds, but it's definitely
               | not a toy
               | 
               | I use it for work several times per week. I often want to
               | test out some Swift API, or build something in SwiftUI,
               | and for some reason it's way faster to tap it out on my
               | iPad in Swift Playgrounds than to create a new project or
               | playground in Xcode on my Mac -- even when I'm sitting
               | directly in front of my Mac
               | 
                | The iPad just doesn't have the clutter of windows and open
                | communication apps that my Mac does, which makes it hard
                | to focus on resolving one particular idea
               | 
               | I have so many playground files on my iPad, a quick
               | glance at my project list: interactive gesture-driven
               | animations, testing out time and date logic, rendering
               | perceptual gradients, checking baseline alignment in SF
               | Symbols, messing with NSFilePresenter, mocking out a UI
               | design, animated text transitions, etc
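                | 
                | To give a sense of scale, a hypothetical example of the
                | kind of throwaway sketch I mean (the names are made up;
                | it's just a quick SwiftUI gesture/animation test you could
                | tap out as an App page in Playgrounds):
                | 
                |   import SwiftUI
                | 
                |   // Drag the circle, let go, and see how the spring
                |   // animation feels.
                |   struct DragBounce: View {
                |       @State private var offset: CGSize = .zero
                | 
                |       var body: some View {
                |           Circle()
                |               .frame(width: 80, height: 80)
                |               .offset(offset)
                |               .gesture(
                |                   DragGesture()
                |                       .onChanged { offset = $0.translation }
                |                       .onEnded { _ in
                |                           withAnimation(.spring()) {
                |                               offset = .zero
                |                           }
                |                       }
                |               )
                |       }
                |   }
                | 
                |   @main
                |   struct SketchApp: App {
                |       var body: some Scene {
                |           WindowGroup { DragBounce() }
                |       }
                |   }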
        
           | hot_gril wrote:
           | It needs a regular web browser too.
        
         | havaloc wrote:
         | Maybe this June there'll be an announcement, but like Lucy with
         | the football, I'm not expecting it. I would instabuy if this
         | was the case, especially with a cellular iPad.
        
         | swozey wrote:
         | Yeah, I bought one of them a few years ago planning to use it
         | for a ton of things.
         | 
         | Turns out I only use it on flights to watch movies because I
         | loathe the os.
        
         | umanwizard wrote:
          | They make a version of the iPad that runs macOS; it's called a
          | MacBook Pro.
        
           | zuminator wrote:
           | MacBook Pros have touchscreens and Apple Pencil compatibility
           | now?
        
             | umanwizard wrote:
             | Fair enough, I was being a bit flippant. It'd be nice if
             | that existed, but I suspect Apple doesn't want it to for
             | market segmentation reasons.
        
               | kstrauser wrote:
               | I just went to store.apple.com and specced out a 13" iPad
               | Pro with 2TB of storage, nano-texture glass, and a cell
               | modem for $2,599.
               | 
               | MacBook Pros start at $1,599. There's an enormous overlap
               | in the price ranges of the mortal-person models of those
               | products. It's not like the iPad Pro is the cheap
               | alternative to a MBP. I mean, I couldn't even spec out a
               | MacBook Air to cost as much.
        
         | ranyefet wrote:
         | Just give us support for virtualization and we could install
         | Linux on it and use it for development.
        
           | LeoPanthera wrote:
           | UTM can be built for iOS.
        
       | tosh wrote:
       | Did they mention anything about RAM?
        
         | tosh wrote:
         | > faster memory bandwidth
        
           | zamadatix wrote:
           | The announcement video also highlighted "120 GB/s unified
           | memory bandwidth". 8 GB/16 GB depending on model.
        
         | dmitshur wrote:
         | I don't think they included it in the video, but
         | https://www.apple.com/ipad-pro/specs/ says it's 8 GB of RAM in
         | 256/512 GB models, 16 GB RAM in 1/2 TB ones.
        
       | Takennickname wrote:
       | Why even have an event at this point? There's literally nothing
       | interesting.
        
         | antipaul wrote:
         | Video showing Apple Pencil Pro features was pretty sick, and I
         | ain't even an artist
        
           | Takennickname wrote:
           | The highlight of the event was a stylus?
        
           | gardaani wrote:
           | I think they are over-engineering it. I have never liked
           | gestures because it's difficult to discover something you
           | can't see. A button would have been better than an invisible
           | squeeze gesture.
        
             | swozey wrote:
              | I used Android phones forever until the iPhone 13 came out
              | and I switched to iOS because I had to de-Google my life
             | completely after they (for no reason at all, "fraud" that I
             | did not commit) blocked my Google Play account.
             | 
             | The amount of things I have to google to use the phone how
             | I normally used Android is crazy. So many gestures required
             | with NOTHING telling you how to use them.
             | 
             | I recently sat around a table with 5 of my friends trying
             | to figure out how to do that "Tap to share contact info"
             | thing. Nobody at the table, all long term IOS users, knew
             | how to do it. I thought that if we tapped the phones
             | together it would give me some popup on how to finish the
             | process. We tried all sorts of tapping/phone versions until
             | we realized we had to unlock both phones.
             | 
             | And one of the people there with the same phone as me (13
             | pro) couldn't get it to work at all. It just did nothing.
             | 
             | And the keyboard. My god is the keyboard awful. I have
             | never typoed so much, and I have _no_ idea how to copy a
             | URL out of Safari to send to someone without using the
              | annoying Share button, which doesn't even have the app I
              | share to the most without clicking the More... button to
             | show all my apps. Holding my finger over the URL doesn't
             | give me a copy option or anything, and changing the URL
             | with their highlight/delete system is terrible. I get so
             | frustrated with it and mostly just give up. The cursor
             | NEVER lands where I want it to land and almost always
             | highlights an entire word when I want to make a one letter
             | typo fix. I don't have big fingers at all. Changing a long
             | URL that goes past the length of the Safari address bar is
             | a nightmare.
             | 
             | I'm sure (maybe?) that's some option I need to change but I
             | don't even feel like looking into it anymore. I've given up
              | on learning about the phone's hidden gestures and just use
             | it probably 1/10th of how I could.
             | 
             | Carplay, Messages and the easy-to-connect-devices ecosystem
             | is the only thing keeping me on it.
        
               | Apocryphon wrote:
               | Sounds like the ideal use case for an AI assistant. Siri
               | ought to tell you how to access hidden features on the
               | device. iOS, assist thyself.
        
               | antipaul wrote:
               | To copy URL from Safari, press and hold (long-press) on
               | URL bar
               | 
               | Press and hold also allows options elsewhere in iOS
        
         | throwaway11460 wrote:
         | 2x better performance per watt is not interesting? Wow, what a
         | time to be alive.
        
           | LoganDark wrote:
           | To me, cutting wattage in half is not interesting, but
           | doubling performance is interesting. So performance per watt
           | is actually a pretty useless metric since it doesn't
           | differentiate between the two.
           | 
            | Of course efficiency matters for a battery-powered device,
           | but I still tend to lean towards raw power over all else.
           | Others may choose differently, which is why other metrics
           | exist I guess.
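            | 
            | A toy calculation (made-up numbers) of why the headline figure
            | is ambiguous on its own:
            | 
            |   // An old chip doing 100 "units" of work at 10 W.
            |   let oldPower = 10.0, oldPerf = 100.0
            |   let oldEff = oldPerf / oldPower       // 10 units per watt
            | 
            |   // "2x performance per watt" only pins down this number:
            |   let newEff = 2 * oldEff               // 20 units per watt
            | 
            |   // Both of these are consistent with that claim:
            |   let wattsForSamePerf = oldPerf / newEff    // 5 W, same speed
            |   let perfAtSamePower = newEff * oldPower    // 200 units, same 10 W
            |   print(wattsForSamePerf, perfAtSamePower)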
        
             | throwaway11460 wrote:
             | This still means you can pack more performance into the
             | chip though - because you're limited by cooling.
        
               | LoganDark wrote:
               | Huh, never considered cooling. I suppose that contributes
               | to the device's incredible thinness. Generally thin-and-
               | light has always been an incredible turnoff for me, but
               | tech is finally starting to catch up to thicker devices.
        
               | hedora wrote:
               | Thin and light is easier to cool. The entire device is a
               | big heat sink fin. Put another way, as the device gets
               | thinner, the ratio of surface area to volume goes to
               | infinity.
               | 
               | If you want to go thicker, then you have to screw around
               | with heat pipes, fans, etc, etc, to move the heat a few
               | cm to the outside surface of the device.
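                | 
                | Back-of-the-envelope: for a slab of face area A and
                | thickness t, the volume is A*t and the surface area is
                | roughly 2A (ignoring the edges), so surface/volume is
                | about 2/t, which blows up as t shrinks.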
        
               | LoganDark wrote:
               | That's not why thin-and-light bothers me. Historically,
               | ultrabooks and similarly thin-and-light focused devices
               | have been utterly insufferable in terms of performance
               | compared to something that's even a single cm thicker.
               | But Apple Silicon seems extremely promising, it seems
               | quite competitive with thicker and heavier devices.
               | 
               | I never understood why everyone [looking at PC laptop
               | manufacturers] took thin-and-light to such an extreme
               | that their machines became basically useless. Now Apple
               | is releasing thin-and-light machines that are incredibly
               | powerful, and that is genuinely innovative. I hadn't seen
               | something like that from them since the launch of the
               | original iPhone, that's how big I think this was.
        
             | throwaway5959 wrote:
             | It means a lot to me, because cutting power consumption in
             | half for millions of devices means we can turn off power
             | plants (in aggregate). It's the same as lightbulbs; I'll
             | never understand why people bragged about how much power
             | they were wasting with incandescents.
        
               | yyyk wrote:
               | >cutting power consumption in half for millions of
               | devices means we can turn off power plants
               | 
                | It is well known that software inefficiency doubles every
                | couple of years; that is, the same scenario ends up taking
                | 2x as much compute across the entire software stack (as
                | opposed to a disembodied algorithm, which will indeed be
                | faster).
               | 
               | The extra compute will be spent on a more abstract UI
               | stack or on new features, unless forced by physical
                | constraints (e.g. the inefficient batteries of early
                | smartphones), which is not the case at present.
        
               | throwaway11460 wrote:
               | That's weird - if software gets 2x worse every time
               | hardware gets 2x better, why did my laptop in 2010 last 2
               | hours on battery while the current one lasts 16 doing
               | _much_ more complex tasks for me?
        
               | yyyk wrote:
                | Elsewhere in the comments, it is noted that Apple's own
                | battery-life estimates are identical despite the allegedly
                | 2x better hardware.
               | 
                | As an aside, 2 hours is very low even for 2010. There's a
                | strong usability advantage in going to 16. But going
               | from 16 to 128 won't add as much. The natural course of
               | things is to converge on a decent enough number and
               | 'spend' the rest on more complex software, a lighter
               | laptop etc.
        
               | Nevermark wrote:
               | They like bright lights?
               | 
               | I have dimmable LED strips around my rooms, hidden by
               | cove molding, reflecting off the whole ceiling, which
               | becomes a super diffuse, super bright "light".
               | 
               | I don't boast about power use, but they are certainly
               | hungry.
               | 
                | For that I get softly diffuse lighting with a max
               | brightness comparable to outdoor clear sky daylight.
               | Working from home, this is so nice for my brain and
               | depression.
        
               | codedokode wrote:
                | First, only CPU power consumption is reduced, not that of
                | other components. Second, I doubt tablets contribute
                | significantly to global power consumption, so I don't
                | think any power plants will be turned off.
        
           | Takennickname wrote:
           | That's bullshit. Does that mean they could have doubled
           | battery life if they kept the performance the same?
           | Impossible.
        
             | throwaway11460 wrote:
             | Impossible why? That's what happened with M1 too.
             | 
             | But as someone else noted, CPU power draw is not the only
             | factor in device battery life. A major one, but not the
             | whole story.
        
               | Takennickname wrote:
               | Intel to M1 is an entire architectural switch where even
               | old software couldn't be run and had to be emulated.
               | 
               | This is a small generational upgrade that doesn't
               | necessitate an event.
               | 
               | Other companies started having events like this because
                | they were copying Apple's amazing events. Apple's events
                | now are just parodies of what Apple was.
        
       | praseodym wrote:
       | "With these improvements to the CPU and GPU, M4 maintains Apple
       | silicon's industry-leading performance per watt. M4 can deliver
       | the same performance as M2 using just half the power. And
       | compared with the latest PC chip in a thin and light laptop, M4
       | can deliver the same performance using just a fourth of the
       | power."
       | 
       | That's an incredible improvement in just a few years. I wonder
       | how much of that is Apple engineering and how much is TSMC
       | improving their 3nm process.
        
         | cs702 wrote:
         | Potentially > 2x greater battery life for the same amount of
         | compute!
         | 
         | That _is_ pretty crazy.
         | 
         | Or am I missing something?
        
           | krzyk wrote:
           | Wait a bit. M2 wasn't as good as the hype was.
        
             | fallat wrote:
             | This. Remember folks Apple's primary goal is PROFIT. They
             | will tell you anything appealing before independent tests
             | are done.
        
             | modeless wrote:
             | That's because M2 was on the same TSMC process generation
             | as M1. TSMC is the real hero here. M4 is the same
             | generation as M3, which is why Apple's marketing here is
             | comparing M4 vs M2 instead of M3.
        
               | sys_64738 wrote:
               | I thought M3 and M4 were different processes though.
               | Higher yield for the latter or such.
        
               | jonathannorris wrote:
               | Actually, M4 is reportedly on a more cost-efficient TSMC
               | N3E node, where Apple was apparently the only customer on
               | the more expensive TSMC N3B node; I'd expect Apple to
               | move away from M3 to M4 very quickly for all their
               | products.
               | 
               | https://www.trendforce.com/news/2024/05/06/news-
               | apple-m4-inc....
        
               | geodel wrote:
                | And why are other PC vendors not latching on to the hero?
        
               | mixmastamyk wrote:
               | Apple often buys their entire capacity (of a process) for
               | quite a while.
        
               | modeless wrote:
               | Apple pays TSMC for exclusivity on new processes for a
               | period of time.
        
               | mensetmanusman wrote:
                | Saying TSMC is a hero ignores the thousands of suppliers
                | that improved everything required for TSMC to operate.
                | TSMC is the biggest, so they get the most experience on
               | all the new toys the world's engineers and scientists are
               | building.
        
               | whynotminot wrote:
               | It's almost as if every part of the stack -- from the
               | uArch that Apple designs down to the insane machinery
               | from ASML, to the fully finished SoC delivered by TSMC --
               | is vitally important to creating a successful product.
               | 
               | But people like to assign credit solely to certain spaces
                | if it suits their narrative (lately, _Apple isn't
               | actually all that special at designing their chips, it's
               | all solely the process advantage_)
        
               | modeless wrote:
               | Saying TSMC's success is due to their suppliers ignores
               | the fact that all of their competitors failed to keep up
               | despite having access to the same suppliers. TSMC
               | couldn't do it without ASML, but Intel and Samsung failed
               | to do it even with ASML.
               | 
               | In contrast, when Apple's CPU and GPU competitors get
               | access to TSMC's new processes after Apple's exclusivity
               | period expires, they achieve similar levels of
               | performance (except for Qualcomm because they don't
               | target the high end of CPU performance, but AMD does).
        
           | eqvinox wrote:
            | Sadly, this is only processor power consumption; you need to
            | put power into a whole lot of other things to make a useful
            | computer... a display backlight and the system's RAM come to
           | mind as particular offenders.
        
             | cs702 wrote:
             | Thanks. That makes sense.
        
             | treesciencebot wrote:
                | The backlight is now the main bottleneck for consumption-
                | heavy uses. I wonder what advancements are happening there
                | to optimize the wattage.
        
               | sangnoir wrote:
               | Is the iPad Pro not yet on OLED? All of Samsung's
                | flagship tablets have had OLED screens for well over a
                | decade now. It eliminates the need for backlighting, has
                | superior contrast, and is pleasant to use in low-light
                | conditions.
        
               | kbolino wrote:
               | I'm not sure how OLED and backlit LCD compare power-wise
               | exactly, but OLED screens still need to put off a lot of
               | light, they just do it directly instead of with a
               | backlight.
        
               | callalex wrote:
               | The iPad that came out today finally made the switch.
               | iPhones made the switch around 2016. It does seem odd how
               | long it took for the iPad to switch, but Samsung
               | definitely switched too early: my Galaxy Tab 2 suffered
               | from screen burn in that I was never able to recover
               | from.
        
               | devsda wrote:
                | If the use cases involve working on dark terminals all day
                | or watching movies with dark scenes, or if the general
                | theme is dark, maybe the new OLED display will help reduce
                | the display power consumption too.
        
               | whereismyacc wrote:
                | QD-OLED reduces it by like 25% I think? But maybe that
               | will never be in laptops, I'm not sure.
        
               | eqvinox wrote:
               | QD-OLED is an engineering improvement, i.e. combining
               | existing researched technology to improve the result
               | product. I wasn't able to find a good source on what
               | exactly it improves in efficiency, but it's not a
               | fundamental improvement in OLED electrical-optical energy
               | conversion (if my understanding is correct.)
               | 
               | In general, OLED screens seem to have an efficiency
                | around 20-30%. Some research departments seem to be
               | trying to bump that up
               | [https://www.nature.com/articles/s41467-018-05671-x]
               | which I'd be more hopeful on...
               | 
               | ...but, honestly, at some point you just hit the limits
               | of physics. It seems internal scattering is already a
               | major problem; maybe someone can invent pixel-sized
               | microlasers and that'd help? More than 50-60% seems like
               | a pipe dream at this point...
               | 
               | ...unless we can change to a technology that
               | fundamentally doesn't emit light, i.e. e-paper and the
               | likes. Or just LCD displays without a backlight, using
               | ambient light instead.
        
               | neutronicus wrote:
               | Please give me an external ePaper display so I can just
               | use Spacemacs in a well-lit room!
        
               | mszcz wrote:
                | Onyx makes a 25" HDMI eInk display [0]. It's pricey.
               | 
               | [0] https://onyxboox.com/boox_mirapro
               | 
                | edit: 25", not 27"
        
               | vsuperpower2020 wrote:
               | I'm still waiting for the technology to advance. People
               | can't reasonably spend $1500 on the world's shittiest
               | computer monitor, even if it is on sale.
        
               | neutronicus wrote:
               | Dang, yeah, this is the opposite of what I had in mind
               | 
               | I was thinking, like, a couple hundred dollar Kindle the
               | size of a big iPad I can plug into a laptop for text-
               | editing out and about. Hell, for my purposes I'd love an
               | integrated keyboard.
               | 
               | Basically a second, super-lightweight laptop form-factor
               | I can just plug into my chonky Macbook Pro and set on top
               | of it in high-light environments when all I need to do is
               | edit text.
               | 
               | Honestly not a compelling business case now that I write
               | it out, but I just wanna code under a tree lol
        
               | eqvinox wrote:
               | A friend bought it & I had a chance to see it in action.
               | 
                | It is nice for some _very specific use cases_. (They're
               | in the publishing/typesetting business. It's... idk,
               | really depends on your usage patterns.)
               | 
               | Other than that, yeah, the technology just isn't there
               | yet.
        
               | mholm wrote:
               | I think we're getting pretty close to this. The
                | Remarkable 2 tablet is $300, but it can't take video
                | input, and software support for non-notetaking is near non-
               | existent. There's even a keyboard available. Boox and
               | Hisense are also making e-ink tablets/phones for
               | reasonable prices.
        
               | craftkiller wrote:
               | If that existed as a drop-in screen replacement on the
               | framework laptop and with a high refresh rate color
               | gallery 3 panel, then I'd buy it at that price point in a
                | heartbeat.
               | 
               | I can't replace my desktop monitor with eink because I
               | occasionally play video games. I can't use a 2nd monitor
               | because I live in a small apartment.
               | 
               | I can't replace my laptop screen with greyscale because I
               | need syntax highlighting for programming.
        
               | gumby wrote:
               | Maybe the $100 nano-texture screen will give you the
                | visibility you want. Not the low power of an epaper screen
               | though.
               | 
               | Hmm, emacs on an epaper screen might be great if it had
               | all the display update optimization and "slow modem mode"
               | that Emacs had back in the TECO days. (The SUPDUP network
               | protocol even implemented that at the client end and
               | interacted with Emacs directly!)
        
               | craftkiller wrote:
               | AMD gpus have "Adaptive Backlight Management" which
               | reduces your screen's backlight but then tweaks the
               | colors to compensate. For example, my laptop's backlight
               | is set at 33% but with abm it reduces my backlight to 8%.
               | Personally I don't even notice it is on / my screen seems
               | just as bright as before, but when I first enabled it I
                | did notice some slight difference in colors, so it's
               | probably not suitable for designers/artists. I'd 100%
               | recommend it for coders though.
        
               | ProfessorLayton wrote:
               | Strangely, Apple seems to be doing the opposite for some
               | reason (Color accuracy?), as dimming the display doesn't
                | seem to reduce the backlight as much, and they appear to
                | be using software dimming in combination with it, even at
                | "max" brightness.
                | 
                | Evidence can be seen when opening up iOS apps, which seem
                | to glitch out and reveal the brighter backlight [1].
               | Notice how #FFFFFF white isn't the same brightness as the
               | white in the iOS app.
               | 
               | [1] https://imgur.com/a/cPqKivI
        
             | naikrovek wrote:
             | that's still amazing, to me.
             | 
             | I don't expect an M4 macbook to last any longer than an M2
             | macbook of otherwise similar specs; they will spend that
             | extra power budget on things other than the battery life
             | specification.
        
           | Hamuko wrote:
           | Comparing the tech specs for the outgoing and new iPad Pro
           | models, that potential is very much not real.
           | 
           | Old: 28.65 Wh (11") / 40.88 Wh (13"), up to 10 hours of
           | surfing the web on Wi-Fi or watching video.
           | 
           | New: 31.29 Wh (11") / 38.99 Wh (13"), up to 10 hours of
           | surfing the web on Wi-Fi or watching video.
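          | 
          | Back-of-the-envelope from those figures: the spec implies an
          | average system draw of roughly 2.9-4.1 W for the old models and
          | 3.1-3.9 W for the new ones over those 10 hours, so the SoC's
          | share of that budget is presumably small to begin with.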
        
             | binary132 wrote:
             | Ok, but is it twice as fast during those 10 hours, leading
             | to 20 hours of effective websurfing? ;)
        
             | jeffbee wrote:
             | A more efficient CPU can't improve that spec because those
             | workloads use almost no CPU time and the display dominates
             | the energy consumption.
        
               | Hamuko wrote:
               | Unfortunately Apple only ever thinks about battery life
               | in terms of web surfing and video playback, so we don't
               | get official battery-life figures for anything else.
               | Perhaps you can get more battery life out of your iPad
               | Pro web surfing by using dark mode, since OLEDs should
               | use less power than IPS displays with darker content.
        
             | codedokode wrote:
              | Isn't this weird? A new chip consumes half the power,
             | but the battery life is the same?
        
               | masklinn wrote:
               | It's not weird when you consider that browsing the web or
               | watching videos has the CPU idle or near enough, so 95%
               | of the power draw is from the display and radios.
        
               | rsynnott wrote:
               | The OLED likely adds a fair bit of draw; they're
               | generally somewhat more power-hungry than LCDs these
               | days, assuming like-for-like brightness. Realistically,
               | this will be the case until MicroLEDs are available for
               | non-completely-silly money.
        
               | CoastalCoder wrote:
               | This surprises me. I thought the big power downside of
               | LCD displays is that they use filtering to turn unwanted
               | color channels into waste heat.
               | 
               | Knowing nothing else about the technology, I assumed that
               | would make OLED displays more efficient.
        
               | mensetmanusman wrote:
               | Can't beat the thermodynamics of exciton recombination.
               | 
               | https://pubs.acs.org/doi/10.1021/acsami.9b10823
        
               | sroussey wrote:
               | OLED will use less for a screen of black and LCD will use
               | less for a screen of white. Now, take whatever average of
               | what content is on the screen and for you, it may be
               | better or may be worse.
               | 
               | White background document editing, etc., will be worse,
               | and this is rather common.
        
               | beeboobaa3 wrote:
                | No, they have a "battery budget". If the CPU power draw
                | goes down, that means the budget goes up and you can spend
               | it on other things, like a nicer display or some other
               | feature.
               | 
               | When you say "up to 10 hours" most people will think "oh
               | nice that's an entire day" and be fine with it. It's what
               | they're used to.
               | 
               | Turning that into 12 hours might be possible but are the
               | tradeoffs worth it? Will enough people buy the device
               | because of the +2 hour battery life? Can you market that
               | effectively? Or will putting in a nicer fancy display
               | cause more people to buy it?
               | 
               | We'll never get significant battery life improvements
               | because of this, sadly.
        
             | fvv wrote:
             | this
        
             | masklinn wrote:
             | Yeah double the PPW does not mean double the battery,
             | because unless you're pegging the CPU/SOC it's likely only
             | a small fraction of the power consumption of a light-use or
             | idle device, especially for an SOC which originates in
             | mobile devices.
             | 
             | Doing basic web navigation with some music in the
             | background, my old M1 Pro has short bursts at ~5W (for the
             | entire SoC) when navigating around, a pair of watts for
             | mild webapps (e.g. checking various channels in discord),
             | and typing into this here textbox it's sitting happy at
             | under half a watt, with the P-cores essentially sitting
             | idle and the E cores at under 50% utilisation.
             | 
             | With a 100Wh battery that would be a "potential" of 150
             | hours or so. Except nobody would ever sell it for that,
             | because between the display and radios the laptop's
             | actually pulling 10~11W.
        
               | pxc wrote:
               | So this could be a bit helpful for heavier duty usage
               | while on battery.
        
               | tracker1 wrote:
               | On my M1 air, I find for casual use of about an hour or
               | so a day, I can literally go close to a couple weeks
               | without needing to recharge. Which to me is pretty
               | awesome. Mostly use my personal desktop when not on my
               | work laptop (docked m3 pro).
        
           | Dibby053 wrote:
           | 2x efficiency vs a 2 year old chip is more or less in line
           | with expectations (Koomey's law). [1]
           | 
           | [1] https://en.wikipedia.org/wiki/Koomey%27s_law
        
           | BurningFrog wrote:
           | Is the CPU/GPU really dominating power consumption that much?
        
             | masklinn wrote:
             | Nah, GP is off their rocker. For the workloads in question
             | the SOC's power draw is a rounding error, low single-digit
             | percent.
        
         | onlyrealcuzzo wrote:
         | On one hand, it's crazy. On the other hand, it's pretty typical
         | for the industry.
         | 
         | Average performance per watt doubling time is 2.6 years:
         | https://newsroom.arm.com/blog/performance-per-
         | watt#:~:text=T....
        
           | praseodym wrote:
           | M2 was launched in June 2022 [1] so a little under 2 years
           | ago. Apple is a bit ahead of that 2.6 years, but not by much.
           | 
           | [1] https://www.apple.com/newsroom/2022/06/apple-
           | unveils-m2-with...
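            | 
            | For scale: at a 2.6-year doubling time, the roughly 23 months
            | between M2 (June 2022) and M4 (May 2024) would predict about
            | 2^(23/31) ~ 1.7x, so a claimed 2x is ahead of the trend line
            | but not wildly so.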
        
             | halgir wrote:
             | If they maintain that pace, it will start compounding
             | incredibly quickly. If we round to 2 years vs 2.5 years,
             | after just a decade you're an entire doubling ahead.
        
           | luyu_wu wrote:
            | Note that performance per watt is 2x higher at both chips'
            | peak performance. This is in many ways an unfair comparison
           | for Apple to make.
        
         | 2OEH8eoCRo0 wrote:
         | > I wonder how much of that is Apple engineering and how much
         | is TSMC improving their 3nm process.
         | 
         | I think Apple's design choices had a huge impact on the M1's
         | performance but from there on out I think it's mostly due to
         | TSMC.
        
         | izacus wrote:
          | Apple usually massively exaggerates their tech spec comparisons
          | - is it REALLY half the power use at all times (so we'll get
          | double the battery life), or is it half the power use in some
          | scenarios (so we'll get like... 15% more battery life total)?
        
           | alphakappa wrote:
           | Any product that uses this is more than just the chip, so you
           | cannot get a proportional change in battery life.
        
             | izacus wrote:
              | Sure, but I also remember them comparing the M1 chip to the
              | RTX 3090, and my MacBook M1 Pro doesn't really run games
              | well.
             | 
             | So I've become really suspicious about any claims about
             | performance done by Apple.
        
               | servus45678981 wrote:
                | That is the fault of the devs, because optimization for
                | dedicated graphics cards is either integrated in the game
                | engine or they just have a version for RTX users.
        
               | filleduchaos wrote:
               | I mean, I remember Apple comparing the M1 _Ultra_ to
               | Nvidia 's RTX 3090. While that chart was definitely
               | putting a spin on things to say the least, and we can
               | argue from now until tomorrow about whether power
               | consumption should or should not be equalised, I have no
               | idea why anyone would expect the M1 _Pro_ (an explicitly
               | much weaker chip) to perform anywhere near the same.
               | 
               | Also what games are you trying to play on it? All my
               | M-series Macbooks have run games more than well enough
               | with reasonable settings (and that has a lot more to do
               | with OS bugs and the constraints of the form factor than
               | with just the chipset).
        
               | achandlerwhite wrote:
               | They compared them in terms of perf/watt, which did hold
               | up, but obviously implied higher performance overall.
        
           | illusive4080 wrote:
           | From their specs page, battery life is unchanged. I think
           | they donated the chip power savings to offset the increased
           | consumption of the tandem OLED
        
             | xattt wrote:
             | I've not seen discussion that Apple likely scales
             | performance of chips to match the use profile of the
             | specific device it's used in. An M2 in an iPad Air is very
             | likely not the same as an M2 in an MBP or Mac Studio.
        
               | mananaysiempre wrote:
               | A Ryzen 7840U in a gaming handheld is not (configured)
               | the same as a Ryzen 7840U in a laptop, for that matter,
               | so Apple is hardly unique here.
        
               | refulgentis wrote:
               | Surprisingly, I think it is: I was going to comment that
               | here, then checked Geekbench, single core scores match
               | for M2 iPad/MacBook Pro/etc. at same clock speed. i.e. M2
               | "base" = M2 "base", but core count differs, and with the
               | desktops/laptops, you get options for M2 Ultra Max SE bla
               | bla.
        
               | joakleaf wrote:
               | The GeekBench [1,2] benchmarks for M2 are:
               | 
                | Single Core:
                | iPad Pro (M2): 2539
                | MacBook Air (M2): 2596
                | MacBook Pro (M2): 2645
                | 
                | Multi Core:
                | iPad Pro (M2 8-core): 9631
                | MacBook Air (M2 8-core): 9654
                | MacBook Pro (M2 8-core): 9642
               | 
               | So, it appears to be almost the same performance (until
               | it throttles due to heat, of course).
               | 
                | 1. https://browser.geekbench.com/ios-benchmarks
                | 2. https://browser.geekbench.com/mac-benchmarks
        
           | aledalgrande wrote:
           | Well battery life would be used by other things too right?
           | Especially by that double OLED screen. "best ever" in every
           | keynote makes me laugh at this point, but it doesn't mean
           | that they're not improving their power envelope.
        
           | philistine wrote:
           | Quickly looking at the press release, it seems to have the
           | same comparisons as in the video. None of Apple's comparisons
           | today are between the M3 and M4. They are ALL comparing the
           | M2 and M4. Why? It's frustrating, but today Apple replaced a
           | product with an M2 with a product with an M4. Apple always
           | compares product to product, never component to component
           | when it comes to processors. So those specs are far more
           | impressive than if we could have numbers between the M3 and
           | M4.
        
             | jorvi wrote:
             | Didn't they do extreme nitpicking for their tests so they
             | could show the M1 beating a 3090 (or M2 a 4090, I can't
             | remember).
             | 
             | Gave me quite a laugh when Apple users started to claim
             | they'd be able to play Cyberpunk 2077 maxed out with maxed
             | out raytracing.
        
               | philistine wrote:
               | I'll give you that Apple's comparisons are sometimes
               | inscrutable. I vividly remember that one.
               | 
               | https://www.theverge.com/2022/3/17/22982915/apple-m1-ultr
               | a-r...
               | 
               | Apple was comparing the power envelope (already a
               | complicated concept) of their GPU against a 3090. Apple
               | wanted to show that the peak of their GPU's performance
               | was reached with a fraction of the power of a 3090. What
               | was terrible was that Apple was cropping their chart at
               | the point where the 3090 was pulling ahead in pure
               | compute by throwing more watts at the problem. So their
               | GPU was not as powerful as a 3090, but a quick glance at
               | the chart would completely tell you otherwise.
               | 
               | Ultimately we didn't see one of those charts today, just
               | a mention about the GPU being 50% more efficient than the
               | competition. I think those charts are beloved by Johny
               | Srouji and no one else. They're not getting the message
               | across.
        
               | izacus wrote:
               | Plenty of people on HN thought that M1 GPU is as powerful
               | as 3090 GPU, so I think the message worked very well for
               | Apple.
               | 
               | They really love those kind of comparisons - e.g. they
               | also compared M1s against really old Intel CPUs to make
               | the numbers look better, knowing that news headlines
               | won't care for details.
        
               | philistine wrote:
               | They compared against really old intel CPUs because those
               | were the last ones they used in their own computers!
               | Apple likes to compare device to device, not component to
               | component.
        
               | oblio wrote:
               | You say that like it's not a marketing gimmick meant to
               | mislead and obscure facts.
               | 
               | It's not some virtue that causes them to do this.
        
               | threeseed wrote:
               | It's funny because your comment is meant to mislead and
               | obscure facts.
               | 
               | Apple compared against Intel to encourage their previous
               | customers to upgrade.
               | 
               | There is nothing insidious about this and is in fact
               | standard business practice.
        
               | izacus wrote:
               | No, they compared it because it made them look way better
               | for naive people. They have no qualms comparing to other
               | competition when it suits them.
               | 
                | Your explanation is a really baffling case of corporate
               | white knighting.
        
               | w0m wrote:
               | > not component to component
               | 
               | that's honestly kind of stupid when discussing things
               | like 'new CPU!' like this thread.
               | 
               | I'm not saying the M4 isn't a great platform, but holy
               | cow the corporate tripe people gobble up.
        
               | refulgentis wrote:
               | Yes, can't remember the precise combo either, there was a
               | solid year or two of latent misunderstandings.
               | 
               | I eventually made a visual showing it was the same as
               | claiming your iPhone was 3x the speed of a Core i9: Sure,
               | if you limit the power draw of your PC to a battery the
               | size of a post it pad.
               | 
               | Similar issues when on-device LLMs happened, thankfully,
               | quieted since then (last egregious thing I saw was stonk-
               | related wishcasting that Apple was obviously turning its
               | Xcode CI service into a full-blown AWS competitor that'd
               | wipe the floor with any cloud service, given the 2x
               | performance)
        
             | homarp wrote:
              | Because the previous iPad was M2. So 'remember how fast your
              | previous iPad was'; well, this one is N times better.
        
             | kiba wrote:
              | I like the comparison between much older hardware and brand
              | new to highlight how far we've come.
        
               | chipdart wrote:
                | > I like the comparison between much older hardware and
                | > brand new to highlight how far we've come.
               | 
               | That's ok, but why skip the previous iteration then?
               | Isn't the M2 only two generations behind? It's not that
               | much older. It's also a marketing blurb, not a
               | reproducible benchmark. Why leave out comparisons with
               | the previous iteration even when you're just hand-waving
               | over your own data?
        
               | FumblingBear wrote:
                | In this specific case, it's because iPads never got the
               | M3. They're literally comparing it with the previous
               | model of iPad.
               | 
               | There were some disingenuous comparisons throughout the
               | presentation going back to A11 for the first Neural
               | Engine and some comparisons to M1, but the M2 comparison
               | actually makes sense.
        
               | philistine wrote:
               | I wouldn't call the comparison to A11 disingenuous, they
               | were very clear they were talking about how far their
               | neural engines have come, in the context of the
               | competition just starting to put NPUs in their stuff.
               | 
               | I mean, they compared the new iPad Pro to an iPod Nano,
               | that's just using your own history to make a point.
        
               | FumblingBear wrote:
               | Fair point--I just get a little annoyed when the
               | marketing speak confuses the average consumer and felt as
               | though some of the jargon they used could trip less
               | informed customers up.
        
             | yieldcrv wrote:
              | Personally I think this is the comparison most people want.
              | The M3 had a lot of compromises compared to the M2.
              | 
              | That aside, the M4 is about the Neural Engine upgrades above
              | anything else (which probably should have been compared to
              | the M3).
        
               | dakiol wrote:
               | What are such compromises? I may buy an M3 mbp, so would
               | like to hear more
        
               | fh9302 wrote:
               | The M3 Pro had some downgrades compared to the M2 Pro,
                | fewer performance cores and lower memory bandwidth. This
               | did not apply to the M3 and M3 Max.
        
             | sod wrote:
             | Yes, kinda annoying. But on the other hand, given that
              | Apple releases a new chip every 12 months, we can grant
              | them some slack here, given that from AMD, Intel, or Nvidia
              | we usually see a 2-year cadence.
        
               | dartos wrote:
                | There are probably easier problems to solve in the ARM
               | space than x86 considering the amount of money and time
               | spent on x86.
               | 
               | That's not to say that any of these problems are easy,
               | just that there's probably more lower hanging fruit in
               | ARM land.
        
               | kimixa wrote:
               | And yet they seem to be the only people picking the
               | apparently "Low Hanging Fruit" in ARM land. We'll see
               | about Qualcomm's Nuvia-based stuff, but that's been
               | "nearly released" for what feels like years now, but you
               | still can't buy one to actually test.
               | 
               | And don't underestimate the investment Apple made - it's
               | likely at a similar level to the big x86 incumbents. I
               | mean AMD's entire Zen development team cost was likely a
               | blip on the balance sheet for Apple.
        
               | re-thc wrote:
               | > We'll see about Qualcomm's Nuvia-based stuff, but
               | that's been "nearly released" for what feels like years
               | now, but you still can't buy one to actually test.
               | 
               | That's more bound by legal than technical reasons...
        
               | transpute wrote:
               | _> Qualcomm 's Nuvia-based stuff, but that's been "nearly
               | released" for what feels like years now_
               | 
               | Launching at Computex in 2 weeks,
               | https://www.windowscentral.com/hardware/laptops/next-gen-
               | ai-...
        
               | 0x457 wrote:
               | Good to know that it's finally seeing the light. I
                | thought they were still in a legal dispute with ARM about
               | Nuvia's design?
        
               | transpute wrote:
               | Not privy to details, but some legal disputes can be
               | resolved by licensing price negotiations, motivated by
               | customer launch deadlines.
        
               | dartos wrote:
               | Again, not saying that they are easy (or cheap!) problems
               | to solve, but that there are more relatively easy
               | problems in the ARM space than the x86 space.
               | 
               | That's why Apple can release a meaningfully new chip
               | every year where it takes several for x86 manufacturers
        
               | blackoil wrote:
                | Maybe for GPUs, but for CPUs both Intel and AMD release on
                | a yearly cadence. Even when Intel has nothing new to
                | release, the generation is bumped.
        
             | MBCook wrote:
             | It's an iPad event and there were no M3 iPads.
             | 
             | That's all. They're trying to convince iPad users to
             | upgrade.
             | 
             | We'll see what they do when they get to computers later
             | this year.
        
               | epolanski wrote:
                | I have a Samsung Galaxy Tab S7 FE tablet, and I can't
                | think of any use case where I'd need more power.
               | 
               | I agree that iPad has more interesting software than
               | android for use cases like video or music editing, but I
               | don't do those on a tablet anyway.
               | 
                | I just can't imagine anyone upgrading their M2 iPad for
                | this except a tiny niche that really wants that extra
                | power.
        
               | MBCook wrote:
               | The A series was good enough.
               | 
               | I'm vaguely considering this, but entirely for the
               | screen. The chip has been irrelevant to me for years;
               | it's long past the point where I even notice it.
        
               | nomel wrote:
               | The A series was definitely not good enough. It really
               | depends on what you're using it for. Netflix and web?
               | Sure. But any old HDR tablet that can maintain 24Hz is
               | good enough for that.
               | 
               | These are 2048x2732, 120Hz displays that support 6K
               | external displays. Gaming and art apps push them pretty
               | hard. For the iPad user in my house, going from the 2020
               | non-M* iPad to a 2023 M2 iPad made a _huge_ difference
               | for the drawing apps. Better latency is always better
               | for drawing, and complex brushes (especially newer
               | ones), selections, etc., would get fairly unusable.
               | 
               | For gaming, it was pretty trivial to dip well below 60Hz
               | with a non-M* iPad in some of the higher-demand games
               | like Fortnite, Minecraft (high view distance), Roblox
               | (it ain't what it used to be), etc.
               | 
               | But, the apps will always gravitate to the performance of
               | the average user. A step function in performance won't
               | show up in the apps until the adoption follows, years
               | down the line. Not pushing the average to higher
               | performance is how you stagnate the future software of
               | the devices.
        
               | MBCook wrote:
               | You're right, it's good enough _for me_. That's what I
               | meant but I didn't make that clear at all. I suspect a
               | ton of people are in a similar position.
               | 
               | I just don't push it at all. The few games I play are not
               | complicated in graphics or CPU needs. I don't draw, 3D
               | model, use Logic or Final Cut or anything like that.
               | 
               | I agree the extra power is useful to some people. But
               | even there we have the M1 (what I've got) and the M2
               | models. But I bet there are plenty of people like me who
               | mostly bought the pro models for the better screen and
               | not the additional grunt.
        
               | r00fus wrote:
               | AI on the device may be the real reason for an M4.
        
               | MBCook wrote:
               | Previous iPads have had that for a long time. Since the
               | A12 in 2018. The phones had it even earlier with the A11.
               | 
               | Sure, this is faster, but is it enough to make people
               | care?
               | 
               | It may depend heavily on what they announce is in the
               | next version of iOS/iPadOS.
        
               | grujicd wrote:
               | I don't know who would prefer to do music or video
               | editing on smaller display, without keyboard for
               | shortcuts, without proper file system and with
               | problematic connectivity to external hardware. Sure, it's
               | possible, but why? OK, maybe there's some use case on
               | the road where every gram counts, but that seems niche.
        
             | mlsu wrote:
             | They know that anyone who has bought an M3 is good on
             | computers for a long while. They're targeting people who
             | have M2 or older Macs. People who own an M3 are basically
             | going to buy anything that comes down the pipe, because who
             | needs an M3 over an M2 or even an M1 today?
        
               | abnercoimbre wrote:
               | I'm starting to worry that I'm missing out on some huge
               | gains (M1 Air user). But as a programmer who's not making
               | games or anything intensive, I think I'm still good for
               | another year or two?
        
               | richiebful1 wrote:
               | I have an M1 Air and I test drove a friend's recent M3
               | Air. It's not very different performance-wise for what I
               | do (programming, watching video, editing small memory-
               | constrained GIS models, etc)
        
               | mlsu wrote:
               | I wanted to upgrade my M1 because it was going to swap a
               | lot with only 8 gigs of RAM and because I wanted a
               | machine that could run big LLMs locally. Ended up going
               | 8GB MacBook Air M1 -> 64GB MacBook Pro M1. My other
               | reasoning was that it would speed up compilation, which
               | it has, but not by too much.
               | 
               | The M1 air is a very fast machine and is perfect for
               | anyone doing normal things on the computer.
        
               | giantrobot wrote:
               | You're not going to be missing out on much. I had the
               | first M1 Air and recently upgraded to an M3 Air. The M1
               | Air has years of useful life left and my upgrade was for
               | reasons not performance related.
               | 
               | The M3 Air performs better than the M1 in raw numbers but
               | outside of some truly CPU or GPU limited tasks you're not
               | likely to actually notice the difference. The day to day
               | behavior between the two is pretty similar.
               | 
               | If your current M1 works you're not missing out on
               | anything. For the power/size/battery envelope the M1 Air
               | was pretty awesome, it hasn't really gotten any worse
               | over time. If it does what you need then you're good
               | until it doesn't do what you need.
        
             | mh8h wrote:
             | That's because the previous iPad Pros came with M2, not M3.
             | They are comparing the performance with the previous
             | generation of the same product.
        
             | raydev wrote:
             | > They are ALL comparing the M2 and M4. Why?
             | 
             | Well, the obvious answer is that those with older machines
             | are more likely to upgrade than those with newer machines.
             | The market for insta-upgraders is tiny.
             | 
             | edit: And perhaps an even more obvious answer: there are no
             | iPads that contained the M3, so the comparison would be
             | even less useful. The M4 was just launched today,
             | exclusively in iPads.
        
             | loongloong wrote:
             | It doesn't seem plausible to me that Apple would release
             | an "M3 variant" that can drive "tandem OLED" displays. So
             | it's probably logical to package whatever chip progress
             | they had (including process improvements) into the "M4".
             | 
             | And it can signal that "We are serious about iPad as a
             | computer", using their latest chip.
             | 
             | Logical alignment with progress in engineering (and
             | manufacturing), packaged smartly to generate marketing
             | capital for sales and brand value creation.
             | 
             | Wonder how the newer Macs will use these "tandem OLED"
             | capabilities of the M4.
        
             | mkl wrote:
             | > Apple always compares product to product, never component
             | to component when it comes to processors.
             | 
             | I don't think this is true. When they launched the M3 they
             | compared it primarily to the M1 to make it look better.
        
             | dyauspitr wrote:
             | The iPads skipped the M3 so they're comparing your old iPad
             | to the new one.
        
           | cletus wrote:
           | IME Apple has always been the most honest when it makes
           | performance claims. Like when they said a MacBook Air would
           | last 10+ hours and third-party reviewers would get 8-9+
           | hours. All the while, Dell or HP would claim 19 hours and
           | you'd be lucky to get 2, e.g. [1].
           | 
           | As for CPU power use, of course that doesn't translate into
           | doubling battery life because there are other components. And
           | yes, it seems the OLED display uses more power so, all in
           | all, battery life seems to be about the same.
           | 
           | I'm interested to see an M3 vs M4 performance comparison in
           | the real world. IIRC the M3 was a questionable upgrade. Some
           | things were better but some weren't.
           | 
           | Overall the M-series SoCs have been an excellent product
           | however.
           | 
           | [1]: https://www.laptopmag.com/features/laptop-battery-life-
           | claim...
           | 
           | EDIT: added link
        
             | ajross wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims
             | 
             | That's just laughable, sorry. No one is particularly honest
             | in marketing copy, but Apple is for sure one of the worst,
             | historically. Even more so when you go back to the PPC
             | days. I still remember Jobs on stage talking about how the
             | G4 was the fastest CPU in the world when I knew damn well
             | that it was half the speed of the P3 on my desk.
        
               | zeroonetwothree wrote:
               | Indeed. Have we already forgotten about the RDF?
        
               | coldtea wrote:
               | No, it was just always a meaningless term...
        
               | dijit wrote:
               | You can claim Apple is dishonest for a few reasons.
               | 
               | 1) Graphs are often unannotated.
               | 
               | 2) Comparisons are rarely against latest-generation
               | products (their argument has been that they do not
               | expect people to upgrade yearly, so they're showing the
               | difference across the intended upgrade path).
               | 
               | 3) They have conflated performance with performance per
               | watt.
               | 
               | However, when it comes to battery life, performance (for
               | a task) or specification of their components (screens,
               | ability to use external displays up to 6k, port speed
               | etc) there are almost no hidden gotchas and they have
               | tended to be trustworthy.
               | 
               | The first wave of M1 announcements was met with similar
               | suspicion to what you're showing here, but it was swiftly
               | dispelled once people actually got their hands on the
               | machines.
               | 
               | *EDIT:* Blaming a guy who's been dead for 13 years for
               | something he said decades ago, and primarily, it seems,
               | for internal use, is weird. I had to look up the
               | context, but it _seems_ it was more about internal
               | motivation in the '70s than anything relating to today,
               | especially when it comes to concrete claims.
        
               | Brybry wrote:
               | "This thing is incredible," Jobs said. "It's the first
               | supercomputer on a chip.... We think it's going to set
               | the industry on fire."
               | 
               | "The G4 chip is nearly three times faster than the
               | fastest Pentium III"
               | 
               | - Steve Jobs (1999) [1]
               | 
               | [1] https://www.wired.com/1999/08/lavish-debut-for-
               | apples-g4/
        
               | dijit wrote:
               | That's cool, but that was literally last millennium.
               | 
               | And again, the guy has been dead for the better part of
               | _this_ millennium.
               | 
               | What have they shown about any product currently on the
               | market, especially when backed with a concrete claim,
               | that has been proven untrue?
               | 
               |  _EDIT:_ After reading your article and this one:
               | https://lowendmac.com/2006/twice-as-fast-did-apple-lie-
               | or-ju... it looks like it was true in floating point
               | workloads.
        
               | mort96 wrote:
               | Interesting, by what benchmark did you compare the G4 and
               | the P3?
               | 
               | I don't have a horse in this race; Jobs lied or bent the
               | truth all the time, so it wouldn't surprise me. I'm just
               | curious.
        
               | dblohm7 wrote:
               | I remember that Apple used to wave around these SIMD
               | benchmarks showing their PowerPC chips trouncing Intel
               | chips. In the fine print, you'd see that the benchmark
               | was built to use AltiVec on PowerPC, but without MMX or
               | SSE on Intel.
        
               | 0x457 wrote:
               | Ah so the way Intel advertises their chips. Got it.
        
               | mort96 wrote:
               | Yeah, and we rightfully criticize Intel for the same and
               | we distrust their benchmarks
        
               | mc32 wrote:
               | Didn't he have to use two PPC procs to get the equivalent
               | perf you'd get on a P3?
               | 
               | Just add them up, it's the same number of Hertz!
               | 
               | But Steve that's two procs vs one!
               | 
               | I think this was when Adobe was optimizing for
               | Windows/Intel and the app was single-threaded, but Steve
               | put out some graphs showing better perf on the Mac.
        
               | leptons wrote:
               | Apple marketed their PPC systems as "a supercomputer on
               | your desk", but it was nowhere near the performance of a
               | supercomputer of that age. Maybe similar performance to a
               | supercomputer from the 1970's, but that was their
               | marketing angle from the 1990's.
        
               | galad87 wrote:
               | From https://512pixels.net/2013/07/power-mac-g4/: the ad
               | was based on the fact that Apple was forbidden to export
               | the G4 to many countries due to its "supercomputer"
               | classification by the US government.
        
               | m000 wrote:
               | It seems the US government was buying too much into
               | tech hype at the turn of the millennium. Around the same
               | period, PS2 exports were also restricted [1].
               | 
               | [1] https://www.latimes.com/archives/la-
               | xpm-2000-apr-17-fi-20482...
        
               | georgespencer wrote:
               | > Apple marketed their PPC systems as "a supercomputer on
               | your desk"
               | 
               | It's certainly fair to say that _twenty years ago_ Apple
               | was marketing some of its PPC systems as  "the first
               | supercomputer on a chip"[^1].
               | 
               | > but it was nowhere near the performance of a
               | supercomputer of that age.
               | 
               | That was not the claim. Apple did not argue that the G4's
               | performance was commensurate with the state of the art in
               | supercomputing. (If you'll forgive me: like, _fucking
               | obviously?_ The entire reason they made the claim is
               | precisely because the latest room-sized supercomputers
               | with leapfrog performance gains were in the news very
               | often.)
               | 
               | The claim was that the G4 was capable of sustained
               | gigaflop performance, and therefore met the narrow
               | technical definition of a supercomputer.
               | 
               | You'll see in the aforelinked marketing page that Apple
               | compared the G4 chip to UC Irvine's Aeneas Project, which
               | in ~2000 was delivering 1.9 gigaflop performance.
               | 
               | This chart[^2] shows the trailing average of various
               | subsets of supercomputers, for context.
               | 
               | This narrow definition is also why the machine could not
               | be exported to many countries, which Apple leaned
               | into.[^3]
               | 
               | > Maybe similar performance to a supercomputer from the
               | 1970's
               | 
               | What am I missing here? Picking perhaps the most famous
               | supercomputer of the mid-1970s, the Cray-1,[^4] we can
               | see performance of 160 MFLOPS, which is 160 million
               | floating point operations per second (with an 80 MHz
               | processor!).
               | 
               | The G4 was capable of delivering ~1 GFLOP performance,
               | which is a billion floating point operations per second.
               | 
               | Are you perhaps thinking of a different decade?
               | 
               | [^1]: https://web.archive.org/web/20000510163142/http://w
               | ww.apple....
               | 
               | [^2]: https://en.wikipedia.org/wiki/History_of_supercompu
               | ting#/med...
               | 
               | [^3]: https://web.archive.org/web/20020418022430/https://
               | www.cnn.c...
               | 
               | [^4]: https://en.wikipedia.org/wiki/Cray-1#Performance
        
               | leptons wrote:
               | >That was not the claim. Apple did not argue that the
               | G4's performance was commensurate with the state of the
               | art in supercomputing.
               | 
               | This is _marketing_ we're talking about: people see
               | "supercomputer on a chip" and they get hyped up by it.
               | Apple was 100% using the "supercomputer" claim to make
               | their luddite audience think they had a performance
               | advantage, which they did not.
               | 
               | > The entire reason they made the claim is
               | 
               | The reason they marketed it that way was to get people to
               | part with their money. Full stop.
               | 
               | In the first link you added, there's a photo of a Cray
               | supercomputer, which makes the viewer equate Apple =
               | Supercomputer = _I am a computing god if I buy this
               | product_. Apple's marketing has always been a bit shady
               | that way.
               | 
               | And soon after that period Apple jumped off the PPC
               | architecture and onto the x86 bandwagon. Gimmicks like
               | "supercomputer on a chip" don't last long when the
               | competition is far ahead.
        
               | threeseed wrote:
               | I can't believe Apple is marketing their products in a
               | way to get people to part with their money.
               | 
               | If I had some pearls I would be clutching them right now.
        
               | georgespencer wrote:
               | > This is marketing we're talking about, people see
               | "supercomputer on a chip" and they get hyped up by it.
               | 
               | That is _also_ not in dispute. I am disputing your
               | specific claim that Apple somehow suggested that the G4
               | was of commensurate performance to a modern
               | supercomputer, which does not seem to be true.
               | 
               | > Apple was 100% using the "supercomputer" claim to make
               | their luddite audience think they had a performance
               | advantage, which they did not.
               | 
               | This is why context is important (and why I'd appreciate
               | clarity on whether you genuinely believe a supercomputer
               | from the 1970s was anywhere near as powerful as a G4).
               | 
               | In the late twentieth and early twenty-first century,
               | megapixels were a proxy for camera quality, and megahertz
               | were a proxy for processor performance. More MHz = more
               | capable processor.
               | 
               | This created a problem for Apple, because the G4 ran at
               | lower clock speeds even though its SPECfp_95 (floating
               | point) benchmarks crushed the Pentium III.
               | 
               | PPC G4 500 MHz - 22.6
               | 
               | PPC G4 450 MHz - 20.4
               | 
               | PPC G4 400 MHz - 18.36
               | 
               | Pentium III 600 MHz - 15.9
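               | 
               | (Normalized per clock, that works out to about 0.045
               | SPECfp per MHz for the 500 MHz G4 (22.6/500) versus
               | about 0.027 for the 600 MHz Pentium III (15.9/600),
               | i.e. roughly 1.7x the per-clock floating point
               | throughput.)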
               | 
               | For both floating point and integer benchmarks, the G3
               | and G4 outgunned comparable Pentium II/III processors.
               | 
               | You can question how this translates to real world use
               | cases - the Photoshop filters on stage were real, but
               | others have pointed out in this thread that it wasn't an
               | apples-to-apples comparison vs. Wintel - but it is
               | inarguable that the G4 had some performance advantages
               | over Pentium at launch, and that it met the (inane)
               | definition of a supercomputer.
               | 
               | > The reason they marketed it that way was to get people
               | to part with their money. Full stop.
               | 
               | Yes, marketing exists to convince people to buy one
               | product over another. That's why companies do marketing.
               | IMO that's a self-evidently inane thing to say in a
               | nested discussion of microprocessor architecture on a
               | technical forum - especially when your interlocutor is
               | establishing the historical context you may be unaware of
               | (judging by your comment about supercomputers from the
               | 1970s, which I am surprised you have not addressed).
               | 
               | I didn't say "The reason Apple markets its computers," I
               | said "The entire reason they made the claim [about
               | supercomputer performance]..."
               | 
               | Both of us appear to know that companies do marketing,
               | but only you appear to be confused about the specific
               | claims Apple made - given that you proactively raised
               | them, and got them wrong - and the historical backdrop
               | against which they were made.
               | 
               | > In the first link you added, there's a photo of a Cray
               | supercomputer
               | 
               | That's right. It looks like a stylized rendering of a
               | Cray-1 to me - what do you think?
               | 
               | > which makes the viewer equate Apple = Supercomputer = I
               | am a computing god if I buy this product
               | 
               | The Cray-1's compute, as measured in GFLOPS, was
               | approximately 6.5x lower than the G4 processor.
               | 
               | I'm therefore not sure what your argument is: you started
               | by claiming that Apple deliberately suggested that the G4
               | had comparable performance to a modern supercomputer.
               | That isn't the case, and the page you're referring to
               | contains imagery of a much less performant supercomputer,
               | as well as a lot of information relating to the history
               | of supercomputers (and a link to a Forbes article).
               | 
               | > Apple's marketing has always been a bit shady that way.
               | 
               | All companies make tradeoffs they think are right for
               | their shareholders and customers. They accentuate the
               | positives in marketing and gloss over the drawbacks.
               | 
               | Note, too, that Adobe's CEO has been duped on the page
               | you link to. Despite your emphatic claim:
               | 
               | > Apple was 100% using the "supercomputer" claim to make
               | their luddite audience think they had a performance
               | advantage, which they did not.
               | 
               | The CEO of Adobe is quoted as saying:
               | 
               | > "Currently, the G4 is significantly faster than any
               | platform we've seen running Photoshop 5.5," said John E.
               | Warnock, chairman and CEO of Adobe.
               | 
               | How is what you are doing materially different to what
               | you accuse Apple of doing?
               | 
               | > And soon after that period Apple jumped off the PPC
               | architecture and onto the x86 bandwagon.
               | 
               | They did so when Intel's roadmap introduced Core Duo,
               | which was significantly more energy-efficient than
               | Pentium 4. I don't have benchmarks to hand, but I suspect
               | that a PowerBook G5 would have given the Core Duo a run
               | for its money (despite the G5 being significantly older),
               | but only for about fifteen seconds before thermal
               | throttling and draining the battery entirely in minutes.
        
               | Vvector wrote:
               | Blaming a company TODAY for marketing from the 1990s is
               | crazy.
        
               | brokencode wrote:
               | Have any examples from the past decade? Especially in the
               | context of how exaggerated the claims are from PC and
               | Android brands they are competing with?
        
               | lynndotpy wrote:
               | Apple recently claimed that RAM in their MacBooks is
               | equivalent to 2x the RAM in any other machine, in defense
               | of the 8GB starting point.
               | 
               | In my experience, I can confirm that this is just not
               | true. The secret is heavy reliance on swap. It's still
               | the case that 1GB = 1GB.
        
               | dmitrygr wrote:
               | > The secret is heavy reliance on swap
               | 
               | You are entirely (100%) wrong, but, sadly, NDA...
        
               | ethanwillis wrote:
               | How convenient :)
        
               | monsieurbanana wrote:
               | Regardless of what you can't tell, he's absolutely right
               | regarding Apple's claims: saying that an 8GB Mac is as
               | good as a 16GB non-Mac is laughable.
        
               | dmitrygr wrote:
               | That was never said. They said an 8GB Mac is similar to
               | a 16GB non-Mac.
        
               | Zanfa wrote:
               | My entry-level 8GB M1 MacBook Air beats my 64GB 10-core
               | Intel iMac in my day-to-day dev work.
        
               | sudosysgen wrote:
               | Memory compression isn't magic and isn't exclusive to
               | macOS.
        
               | dmitrygr wrote:
               | I suggest you go and look at HOW it is done in Apple
               | silicon Macs, and then think long and hard about why
               | this might make a huge difference. Maybe the Asahi Linux
               | guys can explain it to you ;)
        
               | sudosysgen wrote:
               | I understand that it can make a difference to performance
               | (which is already baked into the benchmarks we look at),
               | I don't see how it can make a difference to compression
               | ratios; if anything, in similar implementations (e.g.
               | console APUs) it tends to lead to worse compression
               | ratios.
               | 
               | If there's any publicly available data to the contrary
               | I'd love to read it. Anecdotally I haven't seen a
               | significant difference between zswap on Linux and macOS
               | memory compression in terms of compression ratios, and on
               | the workloads I've tested zswap tends to be faster than
               | no memory compression on x86 for many-core machines.
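               | 
               | For what it's worth, a rough way to eyeball the ratio on
               | macOS is the compressor counters in vm_stat (assuming
               | those counter names are stable across versions):
               | 
               |     # ratio ~ "Pages stored in compressor" divided by
               |     #         "Pages occupied by compressor"
               |     vm_stat | grep -i compressor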
        
               | lynndotpy wrote:
               | I do admit the "reliance on swap" thing is speculation on
               | my part :)
               | 
               | My experience is that I can still tell when the OS is
               | unhappy when I demand more RAM than it can give. macOS
               | is still relatively responsive in that range, which I
               | just attributed to super-fast swapping. (I'd assume
               | memory compression too, but I usually run into this
               | trouble when working with large amounts of poorly-
               | compressible data.)
               | 
               | In either case, I know it's frustrating when someone is
               | confidently wrong but you can't properly correct them, so
               | you have my apologies
        
               | brokencode wrote:
               | Sure, and they were widely criticized for this. Again,
               | the assertion I was responding to is that Apple does this
               | "laughably" more than competitors.
               | 
               | Is an occasional statement that they get pushback on
               | really worse than what other brands do?
               | 
               | As an example from a competitor, take a look at the
               | recent firestorm over Intel's outlandish anti-AMD
               | marketing:
               | 
               | https://wccftech.com/intel-calls-out-amd-using-old-cores-
               | in-...
        
               | ajross wrote:
               | > Sure, and they were widely criticized for this. Again,
               | the assertion I was responding to is that Apple does this
               | "laughably" more than competitors.
               | 
               | FWIW: the language upthread was that it was laughable to
               | say Apple was the _most_ honest. And I stand by that.
        
               | brokencode wrote:
               | Fair point. Based on their first sentence, I
               | mischaracterized how "laughable" was used.
               | 
               | Though the author also made clear in their second
               | sentence that they think Apple is one of the worst when
               | it comes to marketing claims, so I don't think your
               | characterization is totally accurate either.
        
               | rahkiin wrote:
               | There is also memory compression, and their insane swap
               | speed due to the SoC memory and SSD.
        
               | anaisbetts wrote:
               | Every modern operating system now does memory compression
        
               | astrange wrote:
               | Some of them do it better than others though.
        
               | Rinzler89 wrote:
               | Apple uses Magic Compression.
        
               | adamomada wrote:
               | Not sure what Windows does, but the popular method on
               | e.g. Fedora is to split memory into main and swap and
               | then compress the swap. The way Apple does it could be
               | more efficient by not having to partition main memory.
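               | 
               | A quick way to see that split on Fedora (assuming the
               | default zram-backed swap it ships with):
               | 
               |     # compressed swap device: data vs. compressed size
               |     zramctl --output NAME,DISKSIZE,DATA,COMPR,TOTAL
               |     # confirm the zram device is in use as swap
               |     swapon --show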
        
               | throwaway744678 wrote:
               | This is a revolution
        
               | seec wrote:
               | Yeah, that was hilarious; my basic workload borders on
               | the 8GB limit without even pushing it. They have fast
               | swap, but nothing beats real RAM in the end, and
               | considering their storage pricing is as stupid as their
               | RAM pricing, it really makes no difference.
               | 
               | If you go for the base model, you are in for a bad time:
               | 256GB with heavy swap and no dedicated GPU memory
               | (making the 8GB even worse) is just plain stupid.
               | 
               | This is what the Apple fanboys don't seem to get: their
               | base models at a somewhat affordable price are deeply
               | compromised, and if you start to load them up the
               | pricing just does not make a lot of sense...
        
               | n9 wrote:
               | You know that the RAM in these machines has more
               | differences than similarities with the "RAM" in a
               | standard PC? Apple's SoC RAM is more or less part of the
               | CPU/GPU and is super fast. And for obvious reasons it
               | cannot be expanded.
               | 
               | Anyway, I manage a few M1 and M3 machines with 256/8
               | configs and they all run just as fast as 16GB and 32GB
               | machines EXCEPT for workloads that need more than 8GB
               | for a process (virtualization) or workloads that need
               | lots of video memory (Lightroom can KILL an 8GB machine
               | that isn't doing anything else...)
               | 
               | The "8GB is stupid" discussion isn't wrong in the
               | general case, but it is wrong for maybe 80% of users.
        
               | ajross wrote:
               | > EXCEPT for workloads that need more than 8GB for a
               | process
               | 
               | Isn't that exactly the upthread contention? Apple's
               | magic compressed swap management is still _swap
               | management_ that replaces O(1) fast(-ish) DRAM access
               | with thousands-of-cycles page decompression operations.
               | It may be faster than storage, but it's still extremely
               | slow relative to a DRAM fetch. And once your working set
               | gets beyond your available RAM you start thrashing just
               | like VAXen did on 4BSD.
        
               | kcartlidge wrote:
               | > _If you go for the base model, you are in for a bad
               | time, 256GB with heavy swap and no dedicated GPU memory
               | (making the 8GB even worse) is just plain stupid ...
               | their base model at somewhat affordable price are deeply
               | incompetent_
               | 
               | I got the base model M1 Air a couple of years back and
               | whilst I don't do much gaming I do do C#, Python, Go,
               | Rails, local Postgres, and more. I also have a (new last
               | year) Lenovo 13th gen i7 with 16GB RAM running Windows 11
               | and the performance _with the same load_ is night and day
               | - the M1 walks all over it whilst easily lasting 10hrs+.
               | 
               |  _Note that I 'm not a fanboy; I run both by choice. Also
               | both iPhone and Android._
               | 
               | The Windows laptop often gets sluggish and hot. The M1
               | never slows down and stays cold. There's just no
               | comparison (though the Air keyboard remains poor).
               | 
               | I don't much care about the technical details, and I know
               | 8GB isn't a lot. I care about the _experience_ and the
               | underspecced Mac wins.
        
               | cwillu wrote:
               | If someone is claiming "<foo> has always <barred>", then
               | I don't think it's fair to demand a 10 year cutoff on
               | counter-evidence.
        
               | bee_rider wrote:
               | Clearly it isn't the case that Apple has always been more
               | honest than their competition, because there were some
               | years before Apple was founded.
        
               | windowsrookie wrote:
               | While certainly misleading, there were situations where
               | the G4 was incredibly fast for the time. I remember being
               | able to edit video in iMovie on a 12" G4 laptop. At that
               | time there was no equivalent x86 machine.
        
               | jmull wrote:
               | If you have to go back 20+ years for an example...
        
               | n9 wrote:
               | Worked in an engineering lab at the time of the G4
               | introduction and I can attest that the G4 was a very,
               | very fast CPU for scientific workloads.
               | 
               | Confirmed here:
               | https://computer.howstuffworks.com/question299.htm (and
               | elsewhere.)
               | 
               | A year later I was doing bonkers (for the time)
               | Photoshop work on very large compressed TIFF files, and
               | my G4 laptop running at 400MHz was more than 2x as fast
               | as the PIIIs on my bench.
               | 
               | Was it faster all around? I don't know how to tell. Was
               | Apple as honest as I am in this commentary about how it
               | mattered what you were doing? No. Was it a CPU that was
               | able to do some things very fast vs others? I know it
               | was.
        
             | Aurornis wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims.
             | 
             | Okay, but your example was about battery life:
             | 
             | > Like when they said a MacBook Air would last 10+ hours
             | and third-party reviewers would get 8-9+ hours. All the
             | while, Dell or HP would claim 19 hours and you'd be lucky
             | to get 2, e.g. [1]
             | 
             | And even then, they exaggerated their claims. And your link
             | doesn't say anything about HP or Dell claiming 19 hour
             | battery life.
             | 
             | Apple has definitely exaggerated their performance claims
             | over and over again. The Apple silicon parts are indeed
             | fast and low power, but they've made ridiculous claims like
             | comparing their chips to an Nvidia RTX 3090 with completely
             | misleading graphs.
             | 
             | Even the Mac sites have admitted that the Nvidia 3090
             | comparison was completely wrong and designed to be
             | misleading: https://9to5mac.com/2022/03/31/m1-ultra-gpu-
             | comparison-with-...
             | 
             | This is why you have to take everything they say with a
             | huge grain of salt. Their chip may be "twice" as power
             | efficient in some carefully chosen unique scenario that
             | only exists in an artificial setting, but how does it fare
             | in the real world? That's the question that matters, and
             | you're not going to get an honest answer from Apple's
             | marketing team.
        
               | vel0city wrote:
               | You're right, it's not 19 hours they claimed. It was
               | even more than that.
               | 
               | > HP gave the 13-inch HP Spectre x360 an absurd 22.5
               | hours of estimated battery life, while our real-world
               | test results showed that the laptop could last for 12
               | hours and 7 minutes.
        
               | seaal wrote:
               | The absurdity was the difference between claimed battery
               | life and actual battery life. 19 vs. 2 is more absurd
               | than 22.5 vs. 12.
               | 
               | > Speaking of the ThinkPad P72, here are the top three
               | laptops with the most, er, far out battery life claims of
               | all our analyzed products: the Lenovo ThinkPad P72, the
               | Dell Latitude 7400 2-in-1 and the Acer TravelMate P6
               | P614. The three fell short of their advertised battery
               | life by 821 minutes (13 hours and 41 mins), 818 minutes
               | (13 hours and 38 minutes) and 746 minutes (12 hours and
               | 26 minutes), respectively.
               | 
               | Dell did manage to be one of the top 3 most absurd claims
               | though.
        
               | ribit wrote:
               | The M1 Ultra did benchmark close to the 3090 in some
               | synthetic gaming tests. The claim was not outlandish,
               | just largely irrelevant for any reasonable purpose.
               | 
               | Apple does usually explain their testing methodology and
               | they don't cheat on benchmarks like some other companies.
               | It's just that the results are still marketing and should
               | be treated as such.
               | 
               | Outlandish claims notwithstanding, I don't think anyone
               | can deny the progress they achieved with their CPU and
               | especially GPU IP. Improving performance on complex
               | workloads by 30-50% in a single year is very impressive.
        
             | moogly wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims.
             | 
             | I guess you weren't around during the PowerPC days...
             | Because that's a laughable statement.
        
               | imwillofficial wrote:
               | All I remember is tanks in the commercials.
               | 
               | We need more tanks in commercials.
        
               | oblio wrote:
               | I have no idea who's down voting you. They were lying
               | through their teeth about CPU performance back then.
               | 
               | A PC half the price was smoking their top of the line
               | stuff.
        
               | seec wrote:
               | That's funny you say that, because this is precisely
               | when I started buying Macs (I was gifted a Pismo
               | PowerBook G3 and then bought an iBook G4). And my
               | experience was that, for sure, if you put as much money
               | into a PC as into a Mac you would get MUCH better
               | performance.
               | 
               | What made it worth it at the time (I felt) was the
               | software. Today I really don't think so: software has
               | improved overall across the industry and there aren't a
               | lot of "Mac specific" things that make it a clear-cut
               | choice.
               | 
               | As for the performance, I can't believe all the Apple
               | silicon hype. Sure, it gets good battery life provided
               | you use strictly Apple software (or software heavily
               | optimized for it), but in mixed-workload situations it's
               | not that impressive.
               | 
               | Using a friend's M2 MacBook Pro, I figured I could get
               | maybe 4-5 hours out of it in a best-case scenario, which
               | is better than the 2-3 hours you would get from a PC
               | laptop, but also not that great considering the price
               | difference.
               | 
               | And when it comes to performance it is extremely uneven
               | and very lackluster for many things. There is more lag
               | launching Activity Monitor on a $2K+ MacBook Pro than
               | launching Task Manager on a $500 PC. This is a small,
               | somewhat stupid example, but it does tell the overall
               | story.
               | 
               | They talk a big game but in reality, their stuff isn't
               | that performant in the real world.
               | 
               | And they still market games when one of their $2K
               | laptops plays Dota 2 (a very old, relatively
               | resource-efficient game) worse than a cheapo PC.
        
               | skydhash wrote:
               | > Using the M2 MacBook Pro of a friend I figured I could
               | get maybe 4-5 hours out of its best case scenario which
               | is better than the 2-3 hours you would get from a PC
               | laptop but also not that great considering the price
               | difference.
               | 
               | Any electron apps on it?
        
               | dylan604 wrote:
               | Oh those megahertz myths! Their marketing department is
               | pretty amazing at their spin control. This one was right
               | up there with "it's not a bug; it's a feature" type of
               | spin.
        
             | syncsynchalt wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims
             | 
             | Yes and no. They'll always be honest with the claim, but
             | the scenario for the claimed improvement will always be
             | chosen to make the claim as large as possible, sometimes
             | with laughable results.
             | 
             | Typically something like "watch videos for 3x longer
             | <small>when viewing 4k h265 video</small>" (which really
             | means the previous gen's silicon could only hardware-decode
             | h264).
        
             | moooo99 wrote:
             | They are pretty honest when it comes to battery life
             | claims, they're less honest when it comes to benchmark
             | graphs
        
               | underlogic wrote:
               | I don't think "less honest" covers it; I can't believe
               | anything their marketing says after the 3090 claims.
               | Maybe it's true, maybe not. We'll see from the reviews -
               | well, assuming the reviewers weren't paid off with an
               | "evaluation unit".
        
             | bvrmn wrote:
             | BTW I get 19 hours from my Dell XPS and Latitude. It's
             | Linux with a custom DE and Vim as the IDE, though.
        
               | amarka wrote:
               | I get about 21 hours from mine, it's running Windows but
               | powered off.
        
               | bee_rider wrote:
               | This is why Apple can be slightly more honest about
               | their battery specs: they don't have the OS working
               | against them. Unfortunately most Dell XPS machines will
               | be running Windows, so it is still misleading to provide
               | specs based on what the hardware could do if it weren't
               | sabotaged.
        
               | wklm wrote:
               | can you share more details about your setup?
        
               | bvrmn wrote:
               | Arch Linux, CPU mitigations (Spectre and the like) off,
               | X11, Openbox, bmpanel with only a CPU/IO indicator.
               | Light theme everywhere. Opera in power-save mode.
               | `powertop --auto-tune` and `echo 1 | sudo tee
               | /sys/devices/system/cpu/intel_pstate/no_turbo`. Current
               | laptop is a Latitude 7390.
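               | 
               | Roughly, the persistent bits as a sketch (the sysfs path
               | is the intel_pstate one above; adjust if your kernel
               | uses a different cpufreq driver):
               | 
               |     #!/bin/sh
               |     # run as root
               |     # apply powertop's suggested power-saving tunables
               |     powertop --auto-tune
               |     # keep the CPU out of turbo so it stays in its
               |     # efficient frequency range
               |     echo 1 > /sys/devices/system/cpu/intel_pstate/no_turbo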
        
               | ribit wrote:
               | Right, so you are disabling all performance features and
               | effectively turning your CPU into a low-end low-power
               | SKU. Of course you'd get better battery life. It's not
               | the same thing though.
        
               | fransje26 wrote:
               | I get 20 minutes from my Dell (not the XPS), with Vim.
               | When it was brand-new, I got 40 minutes. A piece of hot
               | garbage, with an energy-inefficient Intel CPU.
        
             | bee_rider wrote:
             | Controlling the OS is probably a big help there. At least,
             | I saw lots of complaints about my Zenbook model's battery
             | not hitting the spec. It was easy to hit or exceed it in
             | Linux, but you have to tell it not to randomly spin up the
             | CPU.
        
             | izacus wrote:
             | > LIke when they said an Macbook Air would last 10+ hours
             | and third-party reviewers would get 8-9+ hours.
             | 
             | For literal YEARS, Apple battery life claims were a running
             | joke on how inaccurate and overinflated they were.
        
             | nabakin wrote:
             | Maybe for battery life, but definitely not when it comes to
             | CPU/GPU performance. Tbf, no chip company is, but Apple is
             | particularly egregious. Their charts assume best case
             | multi-core performance when users rarely ever use all cores
             | at once. They'd have you thinking it's the equivalent of a
             | 3090 or that you get double the frames you did before when
             | the reality is more like 10% gains.
        
             | dylan604 wrote:
             | Yeah, the assumption seems to be that one component using
             | less power means that the power will just magically go
             | unused. As with everything else in life, as soon as
             | something stops using a resource, something else fills the
             | vacuum to take advantage of it.
        
             | treflop wrote:
             | Apple is always honest but they know how to make you
             | believe something that isn't true.
        
           | bagels wrote:
           | The CPU is not the only way power is consumed in a portable
           | device. It is a large fraction, but you also have displays
           | and radios.
        
           | coldtea wrote:
           | Apple might use simplified and opaque plots to drive their
           | point home, but they all too often undersell the differences.
           | Independent reviews, for example, find that they not only hit
           | the mark Apple mentions for things like battery, but often
           | do slightly better...
        
           | michaelmior wrote:
           | > is it REALLY half the power use of all times (so we'll get
           | double the battery life)
           | 
           | I'm not sure what you mean by "of all times" but half the
           | battery usage of the processor definitely doesn't translate
           | into double the battery life since the processor is not the
           | only thing consuming power.
        
           | smith7018 wrote:
           | You wouldn't necessarily get twice the battery life. It could
           | be less than that due to the thinner body causing more heat,
           | a screen that utilizes more energy, etc
        
           | mcv wrote:
           | I don't know, but the M3 MBP I got from work already gives
           | the impression of using barely any power at all. I'm really
           | impressed by Apple Silicon, and I'm seriously reconsidering
           | my decision from years ago to never ever buy Apple again. Why
           | doesn't everybody else use chips like these?
        
             | jacurtis wrote:
             | I have an M3 for my personal laptop and an M2 for my work
             | laptop. I get ~8 hours if I'm lucky on my work laptop, but
             | I have attributed most of that battery loss to all the
             | "protection" software they put on my work laptop that is
             | always showing up under the "Apps Using Significant Power"
             | category in the battery dropdown.
             | 
             | I can have my laptop with nothing on screen, and the
             | battery still points to TrendMicro and others as the cause
             | of heavy battery drain while my laptop seemingly idles.
             | 
             | I recently upgraded my personal laptop to the M3 MacBook
             | Pro and the difference is astonishing. I almost never use
             | it plugged in because I genuinely get close to that
             | 20-hour reported battery life. Last weekend I played a AAA
             | video game through Xbox Cloud Gaming (awesome for Mac
             | gamers btw)
             | and with essentially max graphics (rendered elsewhere and
             | streamed to me of course), I got sucked into a game for
             | like 5 hours and lost only 8% of my battery during that
             | time, while playing a top tier video game! It really blew
             | my mind. I also use GoLand IDE on there and have managed to
             | get a full day of development done using only about 25-30%
             | battery.
             | 
             | So yeah, whatever Apple is doing, they are doing it right.
             | Performance without all the spyware that your work gives
             | you makes a huge difference too.
        
               | bee_rider wrote:
               | For the AAA video game example, I mean, it is interesting
               | how far that kind of tech has come... but really that's
               | just video streaming (maybe slightly more difficult
               | because latency matters?) from the point of view of the
               | laptop, right? The quality of the graphics there has
               | more-or-less nothing to do with the battery.
        
             | jhickok wrote:
             | I think the market will move to using chips like this, or
             | at least have additional options. The new Snapdragon SOC is
             | interesting, and I would suspect we could see Google and
             | Microsoft play in this space at some point soon.
        
           | wwilim wrote:
           | Isn't 15% more battery life a huge improvement on a device
           | already well known for long battery life?
        
           | can16358p wrote:
           | Apple is one of the few companies that underpromise and
           | overdeliver and never exaggerate.
           | 
           | Compared to the competition, I'd trust Apple much more than
           | the Windows laptop OEMs.
        
         | mvkel wrote:
         | And here it is in an OS that can't even max out an M1!
         | 
         | That said, the function keys make me think "and it runs macOS"
         | is coming, and THAT would be extremely compelling.
        
           | a_vanderbilt wrote:
           | We've seen a slow march over the last decade towards the
           | unification of iOS and macOS. Maybe not a "it runs macOS",
           | but an eventual "they share all the same apps" with adaptive
           | UIs.
        
             | asabla wrote:
             | I think so too. Especially after the split from iOS to
             | iPadOS. Hopefully they'll show something during this year's
             | WWDC
        
             | DrBazza wrote:
             | They probably saw the debacle that was Windows 8 and
             | thought merging a desktop and touch OS is a decade-long
             | gradual task, if that is even the final intention.
             | 
             | Unlike MS, which went with the big-bang, in-your-face
             | approach that was oh-so-successful.
        
               | zitterbewegung wrote:
               | People have complained about why Logic Pro / Final Cut
               | wasn't ported to the iPad Pro line. The obvious answer is
               | that getting the workflows done properly takes time.
        
               | a_vanderbilt wrote:
               | Even with the advantage of time, I don't think Microsoft
               | would have been able to do it. They can't even get their
               | own UI situated, much less adaptive. Windows 10/11 is
               | this odd mishmash of old and new, without a consistent
               | language across it. They can't unify what isn't even
               | cohesive in the first place.
        
               | mvkel wrote:
               | I'd be very surprised if Apple is paying attention to
               | anything that's happening with windows. At least as a
               | divining rod for how to execute.
        
               | bluescrn wrote:
                | At this point, there are two fundamentally different types
               | of computing that will likely never be mergeable in a
               | satisfactory way.
               | 
               | We now have 'content consumption platforms' and 'content
               | creation platforms'.
               | 
               | While attempts have been made to try and enable some
               | creation on locked-down touchscreen devices, you're never
               | going to want to try and operate a fully-featured version
               | of Photoshop, Maya, Visual Studio, etc on them. And if
               | you've got a serious workstation with multiple large
               | monitors and precision input devices, you don't want to
               | have dumbed-down touch-centric apps forced upon you
               | Win8-style.
               | 
               | The bleak future that seems likely is that the 'content
               | creation platforms' become ever more niche and far more
               | costly. Barriers to entry for content creators are raised
               | significantly as mainstream computing is mostly limited
               | to locked-down content consumption platforms. And Linux
               | is only an option for as long as non-locked-down hardware
               | is available for sensible prices.
        
               | eastbound wrote:
                | On the other hand, a $4000 mid-range MacBook doesn't have
                | a touchscreen, and that's heresy. Granted, you can get
               | the one with the emoji bar, but why interact using touch
               | on a bar when you could touch the screen directly?
               | 
               | Maybe the end game for Apple isn't the full convergence,
               | but just having a touch screen on the Mac.
        
               | bluescrn wrote:
               | Why would you want greasy finger marks on your Macbook
               | screen?
               | 
               | Not much point having a touchscreen on a Macbook (or any
               | laptop really), unless the hardware has a 'tablet mode'
               | with a detachable or fold-away keyboard.
        
               | dghlsakjg wrote:
               | Mouse and keyboard is still a better interface for A LOT
               | of work. I have yet to find a workflow for any of my
               | professional work that would be faster or easier if you
               | gave me a touchscreen.
               | 
               | There are plenty of laptops that do have touchscreens,
               | and it has always felt more like a gimmick than a useful
               | hardware interface.
        
               | dialup_sounds wrote:
               | Kinda weird to exclude Procreate, Affinity, Final Cut,
               | Logic, etc. from your definition of content creation. The
               | trend has clearly been more professional and creative
               | apps year over year and ever more capable devices to run
               | them on. I mean, you're right that nobody wants to use
               | Photoshop on the iPad, but that's because there are
               | better options.
               | 
               | Honestly, the biggest barrier to creativity is thinking
               | you need a specific concept of a "serious workstation" to
               | do it. Plenty of people are using $2k+ desktops just to
               | play video games.
        
               | bluescrn wrote:
               | In these cases, it still seems that tablet-based tools
               | are very much 'secondary tools', more of a sketchpad to
               | fiddle with ideas while on the move, rather than
               | 'production tools'.
               | 
               | Then there's the whole dealing with lots of files and
               | version control side of things, essential for working as
               | part of a team. Think about creating (and previewing, and
               | finally uploading) a very simple web page, just HTML and
               | a couple of images, entirely on an iPad. While it's
               | probably quite possible these days, I suspect the
               | workflow would be abysmal compared to a 'proper computer'
               | where the file system isn't hidden from you and where
               | you're not constantly switching between full-screen apps.
               | 
               | And that's before you start dealing with anything with
               | significant numbers of files in deep directory
               | structures, or doing more technical image creation (e.g.
               | dealing with alpha channels). And of course, before
               | testing your webpage on all the major browsers. Hmm...
        
               | dghlsakjg wrote:
               | > Barriers to entry for content creators are raised
               | significantly as mainstream computing is mostly limited
               | to locked-down content consumption platforms. And Linux
               | is only an option for as long as non-locked-down hardware
               | is available for sensible prices.
               | 
               | Respectfully, I disagree partially. It has never been
               | easier or more affordable to get into creating content.
               | You can create cinema grade video with used cameras that
                | sell for a few hundred dollars. You can create Pixar-level
                | animation with open-source software and a pretty cheap
                | computer. A computer that can edit 4K video costs
               | less than the latest iPhone. There are people that create
                | plenty of content with just a phone. Simply put, it is
               | orders of magnitude cheaper and easier to create content
               | than it was less than two decades ago, which is why we
               | are seeing so much content getting made. I used to work
               | for a newspaper and it used to be a lot harder and more
               | expensive to produce audio visual media.
               | 
               | My strong feeling is that the problem of content being
               | locked into platforms has precious little to do with
               | consumption oriented hardware, and more to do with the
                | platforms. Embrace -> extinguish -> exclusivity ->
               | enshittify seems to be the model behind basically
               | anything that hosts user content these days.
        
               | kmeisthax wrote:
               | You're right about the reason but wrong about the
               | timeline: Jobs saw Windows XP Tablet Edition and built a
               | skunkworks at Apple to engineer a tablet that did not
               | require a stylus. This was purely to spite a friend[0] of
               | his that worked at Microsoft and was very bullish on XP
               | tablets.
               | 
               | Apple then later took the tablet demo technology, wrapped
               | it up in a _very_ stripped-down OS X with a different
               | window server and UI library, and called it iPhone OS.
                | Apple was very clear from the beginning that Fingers Can't
                | Use Mouse Software, Damn It, and that the whole ocean
               | needed to be boiled to support the new user interface
               | paradigm[1]. They even have very specific UI rules
               | _specifically_ to ensure a finger never meets a desktop
               | UI widget, including things like iPad Sidecar just not
               | forwarding touch events at all and only supporting
               | connected keyboards, mice, and the Apple Pencil.
               | 
               | Microsoft's philosophy has always been the complete
               | opposite. Windows XP through 7 had tablet support that
               | amounted to just some affordances for stylus users
               | layered on top of a mouse-only UI. Windows 8 was the
               | first time they took tablets seriously, but instead of
               | just shipping a separate tablet OS or making Windows
               | Phone bigger, they turned it into a parasite that ate the
               | Windows desktop from the inside-out.
               | 
               | This causes awkwardness. For example, window management.
               | Desktops have traditionally been implemented as a shared
               | data structure - a tree of controls - that every app on
               | the desktop can manipulate. Tablets don't support this:
               | your app gets one[2] display surface to present their
               | whole UI inside of[3], and that surface is typically
               | either full-screen or half-screen. Microsoft solved this
               | incongruity by shoving the entire Desktop inside of
               | another app that could be properly split-screened against
               | the new, better-behaved tablet apps.
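                | 
                | (A purely illustrative C sketch of the two models - not
                | any real window-server API - just to make the contrast
                | concrete:)
                | 
                |     /* Desktop: one shared window-server tree that any
                |        app can walk and reparent nodes in. */
                |     struct desktop_node {
                |         struct desktop_node  *parent;
                |         struct desktop_node **children;
                |         int   child_count;
                |         void *owning_app;   /* any app may attach */
                |     };
                | 
                |     /* Tablet: each app owns an isolated surface; the
                |        compositor only sees whole surfaces. */
                |     struct tablet_surface {
                |         void *owning_app;   /* opaque to other apps */
                |         int   width, height;
                |         enum { FULL, HALF } layout;
                |     };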
               | 
               | If Apple _were_ to decide  "ok let's support Mac apps on
               | iPad", it'd have to be done in exactly the same way
               | Windows 8 did it, with a special Desktop app that
               | contained all the Mac apps in a penalty box. This is so
               | that they didn't have to add support for all sorts of
               | incongruous, touch-hostile UI like floating toolbars,
               | floating pop-ups, global menus, five different ways of
               | dragging-and-dropping tabs, and that weird drawer thing
               | you're not supposed to use anymore, to iPadOS. There
               | really isn't a way to gradually do this, either. You can
               | gradually add _feature parity_ with macOS (which they
                | should), but you can't gradually find ways to make
               | desktop UI designed by third-parties work on a tablet.
               | You either put it in a penalty box, or you put all the
               | well-behaved tablet apps in their own penalty boxes, like
               | Windows 10.
               | 
               | Microsoft solved Windows 8's problems by going back to
               | the Windows XP/Vista/7 approach of just shipping a
               | desktop for fingers. Tablet Mode tries to hide this, but
               | it's fundamentally just window management automation, and
               | it has to handle all the craziness of desktop. If a
               | desktop app decides it wants a floating toolbar or a
               | window that can't be resized[4], Tablet Mode has to honor
               | that request. In fact, Tablet Mode needs a lot of
               | heuristics to tell what floating windows pair with which
               | apps. So it's a lot more awkward for tablet users in
               | exchange for desktop users having a usable desktop again.
               | 
               | [0] Given what I've heard about Jobs I don't think Jobs
               | was psychologically capable of having friends, but I'll
               | use the word out of convenience.
               | 
               | [1] Though the Safari team was way better at building
               | compatibility with existing websites, so much so that
               | this is the one platform that doesn't have a deep
               | mobile/desktop split.
               | 
               | [2] This was later extended to multiple windows per app,
               | of course.
               | 
               | [3] This is also why popovers and context menus _never_
               | extend outside their containing window on tablets. Hell,
                | also on websites. Even when you have multiwindow, there's
                | no API surface for "I want to have a control floating
               | on top of my window that is positioned over here and has
               | this width and height".
               | 
               | [4] Which, BTW, is why the iPad has no default calculator
               | app. Before Stage Manager there was no way to have a
               | window the size of a pocket calculator.
        
               | pram wrote:
               | Clip Studio is one Mac app port I've seen that was
               | literally the desktop version moved to the iPad. It
               | uniquely has the top menu bar and everything. They might
               | have made an exception because you're intended to use the
               | pencil and not your fingers.
        
               | Bluecobra wrote:
               | Honestly, using a stylus isn't that bad. I've had to
               | support floor traders for many years and they all still
               | use a Windows-based tablet + a stylus to get around.
               | Heck, even Palm devices were a pleasure to use. Not sure
               | why Steve was so hell bent against them, it probably had
               | to do with his beef with Sculley/Newton.
        
             | skohan wrote:
             | Unfortunately I think "they share all the same apps" will
             | not include a terminal with root access, which is what
             | would really be needed to make iPad a general purpose
             | computer for development
             | 
             | It's a shame, because it's definitely powerful enough, and
             | the idea of traveling with just an iPad seems super
             | interesting, but I imagine they will not extend those
              | features to any devices besides Macs.
        
               | LordDragonfang wrote:
               | I mean, it doesn't even have to be true "root" access.
               | Chromebooks have a containerized linux environment, and
               | aside from the odd bug, the high end ones are actually
               | great dev machines while retaining the "You spend most of
               | your time in the browser so we may as well bake that into
               | the OS" base layer.
        
               | a_vanderbilt wrote:
               | I actually do use a Chromebook in this way! Out of all
                | the Linux machines I've used, it's the one I like best
                | for exactly that reason: give me a space to work and an
                | OS I don't have to babysit or mentally maintain.
        
               | presides wrote:
               | Been a while since I've used a chromebook but iirc
               | there's ALSO root access that's just a bit more difficult
               | to access, and you do actually need to access it from
               | time to time for various reasons, or at least you used
               | to.
        
               | LordDragonfang wrote:
                | You're thinking of Crouton, the old method of using
                | Linux on a Chromebook (which involved disabling boot
                | protection and setting up a second Linux install in a
                | chroot, with a keybind that allowed you to toggle
                | between the two environments).
                | 
                | Crostini is the new containerized version that is both
                | officially supported and integrated into ChromeOS.
        
             | 0x457 wrote:
              | I would settle for being able to connect two monitors to
              | an iPad and to select which audio device the sound goes
              | through. If I could run IntelliJ and compile Rust on the
              | iPad, I would promise to upgrade to the new iPad Pro as
              | soon as it is released, every time.
        
             | bonestamp2 wrote:
             | Agreed, this will be the way forward in the future. I've
             | already seen one of my apps (Authy) say "We're no longer
             | building a macOS version, just install the iPad app on your
             | mac".
             | 
              | That's great, but you need an M-series chip in your Mac for
              | that to work, so backwards compatibility only goes back a
             | few years at this point, which is fine for corporate
             | upgrade cycles but might be a bit short for consumers at
             | this time. But it will be fine in the future.
        
             | beeboobaa3 wrote:
             | Until an "iPhone" can run brew, all my developer tools,
             | steam, epic games launcher, etc it's hardly interesting.
        
             | plussed_reader wrote:
             | The writing was on the wall with the introduction of Swift,
              | IMO. Since then it's been overcomplicating the iPad and
             | dumbing down the macOS interfaces to attain this goal. So
             | much wasted touch/negative space in macOS since Catalina to
              | compensate for fingers and adaptive interfaces; so many
             | hidden menus and long taps squirreled away in iOS.
        
             | reaperducer wrote:
             | _Maybe not a "it runs macOS", but an eventual "they share
             | all the same apps" with adaptive UIs_
             | 
             | M-class MacBooks can already run many iPhone and iPad apps.
        
           | criddell wrote:
           | > And here it is in an OS that can't even max out an M1
           | 
           | Do you really want your OS using 100% of CPU?
        
         | bitwize wrote:
         | Ultimately, does it matter?
         | 
         | Michelin-starred restaurants not only have top-tier chefs. They
         | have buyers who negotiate with food suppliers to get the best
         | ingredients they can at the lowest prices they can. Having a
         | preferential relationship with a good supplier is as important
         | to the food quality and the health of the business as having a
         | good chef to prepare the dishes.
         | 
         | Apple has top-tier engineering talent but they are also able to
         | negotiate preferential relationships with their suppliers, and
         | it's both those things that make Apple a phenomenal tech
         | company.
        
           | makeitdouble wrote:
            | Qualcomm also fabs with TSMC, and their newer 4nm processor is
           | expected to stay competitive with the M series.
           | 
           | If the magic comes mostly from TSMC, there's a good chance
           | for these claims to be true and to have a series of better
           | chips coming on the other platforms as well.
        
             | hot_gril wrote:
             | This info is much more useful than a comparison to
             | restaurants.
        
             | 0x457 wrote:
              | Does Qualcomm have any new CPU cores besides the one that
              | ARM claims they can't make due to licensing?
        
               | transpute wrote:
               | The one being announced on May 20th at Computex?
               | https://news.ycombinator.com/item?id=40288969
        
             | stouset wrote:
             | "Stay" competitive implies they've _been_ competitive.
             | Which they haven't.
             | 
             | I'm filing this into the bin with all the other "This next
             | Qualcomm chip will close the performance gap" claims made
             | over the past decade. Maybe this time it'll be true. I
             | wouldn't bet on it.
        
         | GeekyBear wrote:
         | They don't mention which metric is 50% higher.
         | 
         | However, we have more CPU cores, a newer core design, and a
         | newer process node which would all contribute to improving
         | multicore CPU performance.
         | 
         | Also, Apple is conservative on clock speeds, but those do tend
         | to get bumped up when there is a new process node as well.
        
         | philistine wrote:
         | Actually, TSMC's N3E process is somewhat of a regression on the
         | first-generation 3nm process, N3. However, it is simpler and
         | more cost-efficient, and everyone seems to want to get out of
         | that N3 process as quickly as possible. That seems to be the
         | biggest reason Apple released the A17(M3) generation and now
         | the M4 the way they did.
         | 
         | The N3 process is in the A17 Pro, the M3, M3 Pro, and M3 Max.
          | The A17 Pro name seems to imply it won't trickle down to the
          | regular iPhones next year. So we'll see that processor in
          | phones only this year, since Apple discontinues its Pro range
          | of phones every year; only the regular phones move downrange
          | at lower prices. The M3 devices are all Macs that needed an
          | upgrade due to their popularity: the MacBook Pro and MacBook
          | Air. They made three chips for them, but they did not make an
          | M3 Ultra for the lower-volume desktops. With the announcement
          | of an M4 chip in iPads today, we can expect to see the MacBook
          | Air and MacBook Pro upgraded to M4 soon, with an M4 Ultra to
          | match later. We can now expect those M3 devices to be
          | discontinued instead of going downrange in price.
         | 
         | That would leave one device with an N3 process chip: the iMac.
          | At its sales volume, I wouldn't be surprised if all the M3
          | chips that will ever go into it are made this year, with the
          | model staying around for a year or two, running on fumes.
        
           | GeekyBear wrote:
           | The signs certainly all point to the initial version of N3
           | having issues.
           | 
           | For instance, Apple supposedly required a deal where they
           | only paid TSMC for usable chips per N3 wafer, and not for the
           | entire wafer.
           | 
           | https://arstechnica.com/gadgets/2023/08/report-apple-is-
           | savi...
        
             | dehrmann wrote:
             | My read on the absurd number of Macbook M3 SKUs was that
             | they had yield issues.
        
               | GeekyBear wrote:
               | There is also the fact that we currently have an iPhone
               | generation where only the Pro models got updated to chips
               | on TSMC 3nm.
               | 
               | The next iPhone generation is said to be a return to form
               | with all models using the same SOC on the revised version
               | of the 3nm node.
               | 
               | > Code from the operating system also indicates that the
               | entire iPhone 16 range will use a new system-on-chip -
               | t8140 - Tahiti, which is what Apple calls the A18 chip
               | internally. The A18 chip is referenced in relation to the
               | base model iPhone 16 and 16 Plus (known collectively as
               | D4y within Apple) as well as the iPhone 16 Pro and 16 Pro
               | Max (referred to as D9x internally)
               | 
               | https://www.macrumors.com/2023/12/20/ios-18-code-four-
               | new-ip...
        
           | dhx wrote:
           | N3E still has a +9% logic transistor density increase on N3
           | despite a relaxation to design rules, for reasons such as
           | introduction of FinFlex.[1] Critically though, SRAM cell
           | sizes remain the same as N5 (reversing the ~5% reduction in
           | N3), and it looks like the situation with SRAM cell sizes
           | won't be improving soon.[2][3] It appears more likely that
           | designers particularly for AI chips will just stick with N5
           | as their designs are increasingly constrained by SRAM.
           | 
           | [1] https://semiwiki.com/semiconductor-
           | manufacturers/tsmc/322688...
           | 
           | [2] https://semiengineering.com/sram-scaling-issues-and-what-
           | com...
           | 
           | [3] https://semiengineering.com/sram-in-ai-the-future-of-
           | memory/
        
             | sroussey wrote:
             | SRAM has really stalled. I don't think 5nm was much better
              | than 7nm. On ever-smaller nodes, SRAM will be taking up a
             | larger and larger percent of the entire chip. But the cost
             | is much higher on the smaller nodes even if the performance
             | is not better.
             | 
             | I can see why AMD started putting the SRAM on top.
        
               | magicalhippo wrote:
               | It wasn't immediately clear to me why SRAM wouldn't scale
                | like logic. This[1] article and this[2] paper shed some
               | light.
               | 
               | From what I can gather the key aspects are that decreased
               | feature sizes lead to more variability between
               | transistors, but also to less margin between on-state and
               | off-state. Thus a kind of double-whammy. In logic
               | circuits you're constantly overwriting with new values
               | regardless of what was already there, so they're not as
               | sensitive to this, while the entire point of a memory
               | circuit is to reliably keep values around.
               | 
               | Alternate transistor designs such as FinFET, Gate-all-
               | around and such can provide mitigation of some of this,
               | say by reducing transistor-to-transistor variability by a
                | factor, but can't get around the root issue.
               | 
               | [1]: https://semiengineering.com/sram-scaling-issues-and-
               | what-com...
               | 
               | [2]:
               | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9416021/
        
         | mensetmanusman wrote:
         | Also the thousands of suppliers that have improved their
          | equipment and supplies that feed into the TSMC fabs.
        
         | tuckerpo wrote:
         | It is almost certainly half as much power in the RMS sense, not
         | absolute.
        
         | smrtinsert wrote:
         | Breathtaking
        
         | beeboobaa3 wrote:
         | > And compared with the latest PC chip in a thin and light
         | laptop, M4 can deliver the same performance using just a fourth
         | of the power
         | 
         | It can deliver the same performance as itself at just a fourth
          | of the power it's using? That's incredible!
        
         | nblgbg wrote:
          | That doesn't seem to be reflected in the battery life of these.
          | They have the exact same battery life. Does that mean it's not
          | entirely accurate? Since they don't list the battery capacity
          | in their specs, it's hard to confirm.
        
         | thih9 wrote:
         | They mention just M2 and M4 - curious, how does M3 fit into
         | that?
         | 
         | I.e. would it sit between, or closer to M2 or M4?
        
       | mlhpdx wrote:
       | I'm far from an expert in Apple silicon, but this strikes me as
        | having somewhat conservative improvements. Any in-depth info out
       | there yet?
        
         | Lalabadie wrote:
         | 2x the performance per watt is a great improvement, though.
        
           | refulgentis wrote:
           | Clever wording on their part: 2x performance per watt over
            | M2. Took me a minute; I had to reason through that this is
            | their 2nd-generation 3nm chip, so it wasn't from a die
            | shrink, then go spelunking.
        
           | jeffbee wrote:
           | This claim can only be evaluated in the context of a specific
           | operating point. I can 6x the performance per watt of the CPU
           | in this machine I am using by running everything on the
           | efficiency cores and clocking them down to 1100MHz. But
           | performance per watt is not the only metric of interest.
        
         | gmm1990 wrote:
          | It surprised me they called it an M4 vs. an M3-something. The
          | display engine seems to be the largest change, though I don't
          | know what that looked like on previous processors. Completely
          | hypothesizing, but it could be a significant efficiency
          | improvement if it's offloading display work.
        
           | aeonik wrote:
           | 3 isn't a power of two, maybe the M8 is next.
        
           | duxup wrote:
            | I'd rather they just keep counting up than end up like some
            | companies that get into wonky product-line naming-convention
            | hell.
           | 
            | It's ok whether or not 3 to 4 is a big jump; knowing it's
            | the next one is really all I want. If I need to peek at the
            | specs, the name really won't tell me anything anyhow and
            | I'll be on a webpage.
        
         | shepherdjerred wrote:
         | I expect the pro/max variants will be more interesting. The
         | improvements do look great for consumer devices, though.
        
         | Findecanor wrote:
         | I'm guessing that the "ML accelerator" in the CPU cores means
         | one of ARM's SME extensions for matrix multiplication. SME in
         | ARM v8.4-A adds dot product instructions. v8.6-A adds more,
         | including BF16 support.
         | 
         | https://community.arm.com/arm-community-blogs/b/architecture...
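          | 
          | For reference, a minimal scalar sketch in plain C (not the
          | actual instructions) of the widening BF16 dot product that
          | such extensions accelerate; the hardware simply performs
          | many of these multiply-accumulates per instruction:
          | 
          |     #include <stdint.h>
          |     #include <string.h>
          | 
          |     /* bf16 is the top 16 bits of an IEEE-754 float. */
          |     static float bf16_to_f32(uint16_t h) {
          |         uint32_t bits = (uint32_t)h << 16;
          |         float f;
          |         memcpy(&f, &bits, sizeof f);
          |         return f;
          |     }
          | 
          |     /* Widening dot product: bf16 in, f32 accumulator. */
          |     float dot_bf16(const uint16_t *a, const uint16_t *b,
          |                    int n) {
          |         float acc = 0.0f;
          |         for (int i = 0; i < n; i++)
          |             acc += bf16_to_f32(a[i]) * bf16_to_f32(b[i]);
          |         return acc;
          |     }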
        
           | hmottestad wrote:
           | Apple has the NPU (also called Apple Neural Engine), which is
           | specific hardware for running inference. Can't be used for
           | LLMs though at the moment, maybe the M4 will be different.
           | They also have a vector processor attached to the performance
            | cluster of the CPU; they call the instruction set for it AMX.
           | I believe that that one can be leveraged for faster LLM
           | inferencing.
           | 
           | https://github.com/corsix/amx
        
       | EduardoBautista wrote:
        | The 256GB and 512GB models have 8GB of RAM. The 1TB and 2TB
        | models have 16GB. Not a fan of tying RAM to storage.
       | 
       | https://www.apple.com/ipad-pro/specs/
        
         | praseodym wrote:
         | And also one less CPU performance core for the lower storage
         | models.
        
           | 05 wrote:
           | Well, they have to sell the dies with failed cores somehow..
        
         | Laaas wrote:
         | The economic reasoning behind this doesn't make sense to me.
         | 
         | What do they lose by allowing slightly more freedom in
         | configurations?
        
           | fallat wrote:
           | The reasoning is money. Come on.
        
           | blegr wrote:
           | It forces you to buy multiple upgrades instead of just the
           | one you need.
        
             | jessriedel wrote:
             | But why does this make them more money than offering
             | separate upgrades at higher prices?
             | 
             | I do think there is a price discrimination story here, but
             | there are some details to be filled in.
        
               | foldr wrote:
               | It's not obvious to me that Apple does make a significant
               | amount of money by selling upgrades. Almost everyone buys
               | the base model. The other models are probably little more
               | than a logistical pain in the butt from Apple's
               | perspective. Apple has to offer more powerful systems to
               | be credible as a platform, but I wouldn't be surprised if
               | the apparently exorbitant price of the upgrades reflects
               | the overall costs associated with complicating the
               | production and distribution lines.
        
           | jessriedel wrote:
           | I think it's a price discrimination technique.
        
           | a_vanderbilt wrote:
            | Margins and profit. Fewer variations in production make for
           | higher efficiency. Segmenting the product line can push
           | consumers to purchase higher tiers of product. It's iOS
           | anyways, and the people who know enough to care how much RAM
           | they are getting are self-selecting for those higher product
           | tiers.
        
           | vinkelhake wrote:
           | It's just Apple's price ladder. The prices on their different
            | SKUs are laid out carefully so that there's never too big of
            | a jump to the next level.
           | 
           | https://talkbackcomms.com/blogs/news/ladder
        
           | naravara wrote:
           | Logistical efficiencies mostly. It ends up being a lot of
           | additional SKUs to manage, and it would probably discourage
           | people from moving up a price tier if they would have
           | otherwise. So from Apple's perspective they're undergoing
           | more hassle (which costs) for the benefit of selling you
           | lower margin products. No upside for them besides maybe
           | higher customer satisfaction, but I doubt it would have moved
           | the needle on that very much.
        
           | izacus wrote:
           | They push you to buy the more expensive model with higher
           | margins.
           | 
            | This is what they did when I was buying an iPad Air - it
            | starts with an actually problematically low 64GB of
            | storage... and the 256GB model is the next one up, with a
            | massive price jump.
            | 
            | It's the same kind of "anchoring" (marketing term) that car
            | dealers use to lure you into deciding on their car based on
            | the cheapest $29,999 model, which with "useful" equipment
            | will end up costing you more like $45,000.
        
             | aeyes wrote:
             | Honest question: What data do you store on an iPad Air? On
             | a phone you might have some photos and videos but isn't a
             | tablet just a media consumption device? Especially on iOS
             | where they try to hide the filesystem as much as possible.
        
               | izacus wrote:
                | No data, but iOS apps have gotten massive, caches have
                | gotten massive, and if you install a game or two, 64GB
                | is gone.
                | 
                | Not to mention that occasionally it's nice to have a
                | set of downloaded media available for vacation/travel,
                | and 64GB isn't enough to download a week's worth of
                | content from Netflix.
               | 
               | This is why this is so annoying - you're right, I don't
               | need 512GB or 256GB. But I'd still like to have more than
               | "You're out of space!!" amount.
        
               | trogdor wrote:
               | Where is 64GB coming from?
        
               | izacus wrote:
                | The base iPad Air model - the one whose price is most
                | often quoted - is 64GB.
        
               | trogdor wrote:
               | No it's not.
               | 
               | https://www.apple.com/ipad-air/specs/
        
               | nozzlegear wrote:
                | I've had the original iPad Pro with 64GB since it first
               | released and have somehow never run out of storage. Maybe
               | my problem is that I don't download games. I'd suggest
               | using a USB drive for downloaded media though if you're
               | planning to travel. All of the media apps I use (Netflix,
               | YouTube, Crunchyroll, etc.) support them. That's worked
               | well for me and is one reason I was comfortable buying
                | the 64GB model.
        
               | sroussey wrote:
               | How do you get Netflix to use an external drive?
        
               | nozzlegear wrote:
               | Sorry, I thought I had done this with Netflix but I tried
               | it just now and couldn't find the option. Then I googled
               | it and it looks like it was never supported, I must've
               | misremembered Netflix being an option.
        
               | imtringued wrote:
               | As he said, you buy excess storage so that you don't have
               | to think about how much storage you are using. Meanwhile
               | if you barely have enough, you're going to have to play
               | data tetris. You can find 256GB SSDs that sell for as low
               | as 20EUR. How much money is it worth to not worry about
                | running out of space? Probably more than the cost of the
               | SSD at these prices.
        
               | lotsofpulp wrote:
               | Email, password manager, iOS keychain, photos, videos,
               | etc should all be there if synced to iCloud.
        
               | maxsilver wrote:
               | > What data do you store on an iPad Air?
               | 
               | Games. You can put maybe three or four significant games
               | on an iPad Air before it maxes out. (MTG Arena is almost
                | 20GB all on its own, Genshin Impact is like 40+ GB)
        
               | jdminhbg wrote:
               | > isn't a tablet just a media consumption device?
               | 
               | This is actually most of the storage space -- videos
               | downloaded for consumption in places with no or bad
               | internet.
        
               | magicalhippo wrote:
               | Surely it supports USB OTG? Or is that just an Android
               | thing[1]?
               | 
               | [1]: https://liliputing.com/you-can-use-a-floppy-disk-
               | drive-with-...
        
               | izacus wrote:
               | Even on Android you can't download streaming media to OTG
               | USB storage.
        
               | adamomada wrote:
               | Once again, pirates win, paying customers lose
        
               | kmeisthax wrote:
               | Yes. However, applications have to be specifically
               | written to use external storage, which requires popping
               | open the same file picker you use to interact with non-
               | Apple cloud storage. If they store data in their own
               | container, then that can only ever go on the internal
               | storage, iCloud, or device backups. You aren't allowed to
               | rugpull an app and move its storage somewhere else.
               | 
               | I mean, what would happen if you yanked out the drive
               | while an app was running on it?
        
               | wincy wrote:
               | We use PLEX for long trips in the car for the kids. Like
               | 24 hour drives. We drive to Florida in the winter and the
               | iPads easily run out of space after we've downloaded a
               | season or two of Adventure Time and Daniel Tiger.
               | 
               | I could fit more if I didn't insist on downloading
               | everything 1080p I guess.
        
               | skydhash wrote:
               | VLC or Infuse + external storage.
        
               | lozenge wrote:
               | iPad OS is 17 GB and every app seems to think it'll be
               | the only one installed.
        
           | gehsty wrote:
           | Fewer iPad SKUs = more efficient manufacturing and logistics,
           | at iPad scale probably means a very real cost saving.
        
           | michaelt wrote:
           | Because before you know it you're Dell and you're stocking 18
           | different variants of "Laptop, 15 inch screen, 16GB RAM,
           | 512GB SSD" and users are scratching their heads trying to
           | figure out WTF the difference is between a "Latitude 3540" a
           | "Latitude 5540" and a "New Latitude 3550"
        
             | gruez wrote:
             | I can't tell whether this is serious or not. Surely adding
             | independently configurable memory/storage combinations
             | won't confuse the user, any more than having configurable
              | storage options confuses users about which iPhone to get?
        
               | its_ethan wrote:
               | Configuring your iPhone storage is something every
               | consumer has a concept of, it's some function of "how
               | many pictures can I store on it"? When it comes to
               | CPU/GPU/RAM and you're having to configure all three, the
               | average person is absolutely more likely to be confused.
               | 
               | It's anecdotal, but 8/10 people that I know over the age
               | of 40 would have no idea what RAM or CPU configurations
               | even theoretically do for them. This is probably the case
               | for _most_ iPad purchasers, and Apple knows this - so why
                | would they provide expensive/confusing configurability
               | options just for the handful of tech-y people who may
                | care? There are still high/mid/low-performance variants
                | that those people can choose from, and the number of
                | people whom that would sour away from a sale is
                | vanishingly small; they would likely not even be
                | looking at Apple in the first place.
        
             | skeaker wrote:
             | Yes, the additional $600 they make off of users who just
             | want extra RAM is just an unfortunate side effect of the
             | unavoidable process of not being Dell. Couldn't be any
             | other reason.
        
             | kmeisthax wrote:
             | Apple already fixed this with the Mac: they stock a handful
             | of configurations most likely to sell, and then everything
             | else is a custom order shipped direct from China. The
             | reason why Apple has to sell specific RAM/storage pairs for
             | iPads is that they don't have a custom order program for
             | their other devices, so everything _has_ to be an SKU,
             | _and_ has to sell in enough quantity to justify being an
             | SKU.
        
           | abtinf wrote:
           | The GP comment can be misleading because it suggests Apple is
            | tying storage to RAM. That is not the case (at least not
           | directly).
           | 
           | The RAM and system-on-chip are tied together as part of the
           | system-on-package. The SoP is what enables M chips to hit
           | their incredible memory bandwidth numbers.
           | 
            | This is not an easy thing to make configurable. They can't
            | just plug in a different memory chip as a final assembly step
           | before shipping.
           | 
           | They only have two SoPs as part of this launch: 9-core CPU
            | with 8GB, and a 10-core CPU with 16GB. The RAM is unified for
            | CPU/GPU (and I would assume the Neural Engine too).
           | 
           | Each new SoP is going to reduce economies of scale and
            | increase supply chain complexity. The 256/512GB models are
            | tied to the first package, and the 1/2TB models to the
           | second. Again, these are all part of the PCB, so production
           | decisions have to be made way ahead of consumer orders.
           | 
           | Maybe it's not perfect for each individual's needs, but it
           | seems reasonable to assume that those with greater storage
           | needs also would benefit from more compute and RAM. That is,
           | you need more storage to handle more video production so you
           | are probably more likely to use more advanced features which
           | make better use of increased compute and RAM.
        
           | dragonwriter wrote:
           | > What do they lose by allowing slightly more freedom in
           | configurations?
           | 
           | More costs everywhere in the chain; limiting SKUs is a big
           | efficiency from manufacturing to distribution to retail to
           | support, and it is an easy way (for the same reason) to
           | improve the customer experience, because it makes it a lot
           | easier to not be out of or have delays for a customer's
           | preferred model, as well as making the UI (online) or
           | physical presentation (brick and mortar) for options much
           | cleaner.
           | 
            | Of course, it can feel worse if you are a power user with
            | detailed knowledge of your particular needs in multiple
            | dimensions and you feel like you are paying extra for
            | features you don't want, but the efficiencies may make that
            | feeling an illusion -- with more freedom, you would be
            | paying for the additional costs that freedom created, so a
            | higher cost for the same options, and possibly just as much
            | or more for the particular combination you would prefer
            | under multidimensional freedom as for the one with extra
            | features without it. Though that counterfactual is
            | impossible to test.
        
           | Aurornis wrote:
           | Several things:
           | 
           | 1. Having more SKUs is expensive, for everything from
           | planning to inventory management to making sure you have
           | enough shelf space at Best Buy (which you have to negotiate
           | for). Chances are good that stores like Best Buy and Costco
           | would only want 2 SKUs anyway, so the additional configs
           | would be a special-order item for a small number of
           | consumers.
           | 
           | 2. After a certain point, adding more options actually
           | _decreases_ your sales. This is confusing to people who think
            | they'd be more likely to buy if they could get exactly what
           | they wanted, but what you're not seeing is the legions of
           | casual consumers who are thinking about maybe getting an
           | iPad, but would get overwhelmed by the number of options.
           | They might spend days or weeks asking friends which model to
           | get, debating about whether to spend extra on this upgrade or
           | that, and eventually not buying it or getting an alternative.
           | If you simplify the lineup to the "cheap one" and the "high
           | end one" then people abandon most of that overhead and just
           | decide what they want to pay.
           | 
           | The biggest thing tech people miss is that they're not the
           | core consumers of these devices. The majority go to casual
           | consumers who don't care about specifying every little thing.
           | They just want to get the one that fits their budget and move
           | on. Tech people are secondary.
        
         | samatman wrote:
         | It's fairly absurd that they're still selling a 256gb "Pro"
         | machine in the first place.
         | 
         | That said, Apple's policy toward SKUs is pretty consistent: you
         | pay more money and you get more machine, and vice versa. The
         | MacBooks are the only product which has separately configurable
         | memory / storage / chip, and even there some combinations
         | aren't manufactured.
        
           | phkahler wrote:
           | >> It's fairly absurd that they're still selling a 256gb
           | "Pro" machine in the first place.
           | 
           | My guess is they want you to use their cloud storage and pay
           | monthly for it.
        
             | CharlieDigital wrote:
             | That doesn't make any sense.
             | 
             | I'm not storing my Docker containers and `node_modules` in
             | the cloud.
             | 
             | Pro isn't just images and videos.
        
               | davedx wrote:
               | This is a tablet not a laptop
        
             | skydhash wrote:
             | Or use an external storage. I'd be wary of using my iPad as
              | primary storage anyway. It only holds work in progress and
              | whatever media I'm currently watching/reading.
        
             | samatman wrote:
             | If that were the goal (I don't think it is), they'd be
             | better off shipping enough storage to push people into the
             | 2TB tier, which is $11 vs. $3 a month for 200GB.
             | 
             | I said this in a sibling comment already, but I think it's
             | just price anchoring so that people find the $1500 they're
             | actually going to pay a bit easier to swallow.
        
           | dijit wrote:
           | Real creative pros will likely be using a 10G Thunderbolt NIC
           | to a SAN; local video editing is not advised unless it's only
           | a single project at a time.
           | 
           | Unless you are a solo editor.
        
           | azinman2 wrote:
           | I have a 256G iPhone. I think I'm using like 160G. Most stuff
           | is just in the cloud. For an iPad it wouldn't be any
           | different, modulo media cached for flights. I could see some
           | cases like people working on audio to want a bunch stored
           | locally, but it's probably in some kind of compressed format
           | such that it wouldn't matter too much.
           | 
           | What is your concern?
        
             | samatman wrote:
             | I don't know about 'concern' necessarily, but it seems to
             | me that 512GB for the base Pro model is a more realistic
             | minimum. There are plenty of use cases where that amount of
             | storage is overkill, but they're all served better by the
              | Air, which comes in the same sizes and as little as 128GB
             | storage.
             | 
             | I would expect most actual users of the Pro model, now that
             | 13 inch is available at the lower tier, would be working
             | with photos and video. Even shooting ProRes off a pro
             | iPhone is going to eat into 256 pretty fast.
             | 
             | Seems like that model exists mainly so they can charge
             | $1500 for the one people are actually likely to get, and
             | still say "starts at $1299".
             | 
             | Then again, it's Apple, and they can get away with it, so
             | they do. My main point here is that the 256GB model is bad
             | value compared to the equivalent Air model, because if you
             | have any work where the extra beef is going to matter, it's
             | going to eat right through that amount of storage pretty
             | quick.
        
               | its_ethan wrote:
               | I think you're underestimating the number of people who
               | go in to buy an iPad and gravitate to the Pro because it
               | looks the coolest and sounds like a luxury thing. For
               | those people, who are likely just going to use it for web
               | browsing and streaming videos, the cheapest configuration
               | is the only one they care about.
               | 
                | That type of buyer is a very significant % of sales for
                | iPad Pros. Despite the marketing, there are really not
                | that many people (as a % of sales) who will be pushing
                | these iPads anywhere even remotely close to their
               | computational/storage/spec limits.
        
         | giancarlostoro wrote:
          | Honestly though, that's basically every tablet: you can't
          | change the RAM, you get what you get and that's it. Maybe they
          | should call them by different names, like Pro Max for the ones with
         | 16GB in order to make it more palatable? Small psychological
         | hack.
        
           | dotnet00 wrote:
           | The Samsung tablets at least still retain the SD card slot,
           | so you can focus more on the desired amount of RAM and not
           | worry too much about the built-in storage size.
        
             | Teever wrote:
             | It would be cool if regulators mandated that companies like
             | Apple are obligated to provide models of devices with SD
             | card slots and a seamless way to integrate this storage
             | into the OS/applications.
             | 
             | That combined with replaceable batteries would go a long
             | way to reduce the amount of ewaste.
        
               | mschuster91 wrote:
               | And then people would stick alphabet-soup SD cards into
               | their devices and complain about performance and data
               | integrity, it's enough of a headache in the Android world
               | already (or has been before Samsung and others finally
               | decided to put in enough storage for people to not rely
               | on SD cards any more).
               | 
               | In contrast, Apple's internal storage to my knowledge
               | always is very durable NVMe, attached logically and
               | physically directly to the CPU, which makes their
               | shenanigans with low RAM size possible in the first place
               | - they swap like hell but as a user you barely notice it
               | because it's so blazing fast.
        
               | Teever wrote:
               | Yeah jackasses are always gonna jackass. There's still a
               | public interest in making devices upgradable for the
               | purpose of minimizing e-waste.
               | 
                | I'd just love to buy a device with a moderate amount of
                | non-upgradeable SSD and an SD slot so that I can put
                | more storage in it later so the device can last longer.
        
               | mschuster91 wrote:
               | Agreed but please with something other than microSD
               | cards. Yes, microSD Express is a thing, but both cards
               | and hosts supporting it are rare, the size format doesn't
               | exactly lend itself to durable flash chips, thermals are
               | questionable, and even the most modern microSD Express
               | cards barely hit 800 MB/sec speed, whereas Apple's stuff
               | has hit twice or more that for years [2].
               | 
               | [1] https://winfuture.de/news,141439.html
               | 
               | [2] https://9to5mac.com/2024/03/09/macbook-
               | air-m3-storage-speeds...
        
               | davedx wrote:
               | Not everything has to be solved by regulators. The walled
               | garden is way more important to fix than arbitrary
               | hardware configurations
        
             | zozbot234 wrote:
              | Doesn't the iPad come with a USB-C port nowadays? You can
             | attach an external SD card reader.
        
               | amlib wrote:
               | Just like I don't want an umbilical cord hanging out of
               | me just to perform the full extent of my bodily
               | functions, I also wouldn't want a dongle hanging off my
               | tablet for it to be deemed usable.
        
         | paulpan wrote:
         | I don't think it's strictly for price gouging/segmentation
         | purposes.
         | 
         | On the Macbooks (running MacOS), RAM has been used as data
         | cache to speed up data read/write performance until the actual
         | SSD storage operation completes. It makes sense for Apple to
          | account for that with a higher RAM spec on the 1TB/2TB
         | configurations.
        
           | pantalaimon wrote:
           | > RAM has been used as data cache to speed up data read/write
           | performance until the actual SSD storage operation completes.
           | 
           | I'm pretty sure that's what all modern operating systems are
           | doing.
        
             | eddieroger wrote:
             | Probably, but since we're talking about an Apple product,
              | comparing it to macOS makes sense, since they all share the
             | same bottom layer.
        
               | 0x457 wrote:
                | Not "probably" - that's just how any "modern" OS works. It
                | also uses RAM as a cache to avoid reads from storage, just
                | like any other modern OS.
               | 
               | Apple uses it for segmentation and nothing else.
               | 
               | Modern being - since the 80s.
        
             | wheybags wrote:
             | I'm writing this from memory, so some details may be wrong
             | but: most high end ssds have dram caches on board, with a
             | capacitor that maintains enough charge to flush the cache
             | to flash in case of power failure. This operates below the
             | system page cache that is standard for all disks and oses.
             | 
              | Apple doesn't do this, and uses their tight integration to
             | perform a similar function using system memory. So there is
             | some technical justification, I think. They are 100% price
             | gougers though.
        
               | beambot wrote:
               | One company's "Price Gouging" is another's "Market
               | Segmentation"
        
               | gaudystead wrote:
               | > writing this from memory
               | 
               | Gave me a chuckle ;)
        
               | kllrnohj wrote:
               | Using host memory for SSD caches is part of the NVMe
               | spec, it's not some Apple-magic-integration thing:
               | https://www.servethehome.com/what-are-host-memory-buffer-
               | or-...
               | 
               | It's also still typically just worse than an actual dram
               | cache.
        
           | __turbobrew__ wrote:
           | That is called a buffer/page cache and has existed in
           | operating systems since the 1980s.
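            | 
            | If you want to watch the page cache at work, here's a minimal
            | sketch (Python; the big-file path is a placeholder) - the
            | second read comes from RAM instead of the SSD:
            | 
            |   import time
            | 
            |   PATH = "/tmp/big.bin"  # placeholder: any multi-GB file
            | 
            |   def timed_read(path):
            |       start = time.monotonic()
            |       with open(path, "rb") as f:
            |           while f.read(1 << 20):  # 1 MiB chunks
            |               pass
            |       return time.monotonic() - start
            | 
            |   cold = timed_read(PATH)  # likely hits the disk
            |   warm = timed_read(PATH)  # likely served from the page cache
            |   print(f"cold: {cold:.2f}s  warm: {warm:.2f}s")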
        
             | riazrizvi wrote:
              | No, this is caching with SSDs, it's not in the same league.
        
             | btown wrote:
             | With hardware where power-off is only controlled by
             | software, battery life is predictable, and large amounts of
             | data like raw video are being persisted, they might have a
             | very aggressive version of page caching, and a large amount
             | of storage may imply that a scale-up of RAM would be
             | necessary to keep all the data juggling on a happy path.
              | That said, there are no non-business reasons why they
              | couldn't extend that large RAM to smaller storage systems
              | as well.
        
             | astrange wrote:
             | It's unified memory which means the SSD controller is also
             | using the system memory. So more flash needs more memory.
        
               | spixy wrote:
                | Then give me more memory: 512 GB storage with 16 GB RAM.
        
           | gruez wrote:
           | How does that justify locking the 16GB option to 1TB/2TB
           | options?
        
             | 0x457 wrote:
             | Since memory is on their SoC it makes it challenging to
              | maintain multiple SKUs. This segmentation makes sense to me
              | as a consumer.
        
           | lozenge wrote:
           | The write speed needs to match what the camera can output or
           | the WiFi/cellular can download. It has nothing to do with the
           | total size of the storage.
        
           | bschne wrote:
           | Shouldn't the required cache size be dependent on throughput
           | more so than disk size? It does not necessarily seem like
           | you'd need a bigger write cache if the disk is bigger, people
           | who have a 2TB drive don't read/write 2x as much in a given
           | time as those with a 1TB drive. Or am I missing something?
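            | 
            | As a back-of-the-envelope sketch (all numbers made up for
            | illustration), the buffer only has to absorb the gap between
            | incoming writes and what the flash can sustain, so its size
            | tracks throughput and burst length, not capacity:
            | 
            |   incoming_mb_s = 1100   # e.g. sustained video capture
            |   flash_mb_s = 1000      # sustained flash write speed
            |   burst_seconds = 5      # how long the burst lasts
            | 
            |   backlog_mb = max(0, (incoming_mb_s - flash_mb_s) * burst_seconds)
            |   print(f"buffer needed: ~{backlog_mb} MB")  # same for 1TB or 2TB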
        
           | hosteur wrote:
           | > I don't think it's strictly for price gouging/segmentation
           | purposes.
           | 
           | I think it is strictly for that purpose.
        
           | Aerbil313 wrote:
           | Do people not understand that Apple's 'price gouging' is
           | about UX? A person who has the money to buy a 1TB iPad is
            | worth more than the average customer. 16GB of RAM doubtless
            | results in a faster UX, and that person is more likely to
            | keep purchasing.
        
         | KeplerBoy wrote:
          | If these things ever get MacOS support, they will be useless
          | with 8 GB of RAM.
         | 
         | Such a waste of nice components.
        
           | greggsy wrote:
           | This comes up frequently. 8GB is sufficient for most casual
           | and light productivity use cases. Not everyone is a power
           | user, in fact, most people aren't.
        
             | regularfry wrote:
             | My dev laptop is an 8GB M1. It's fine. Mostly.
             | 
             | I can't run podman, slack, teams, and llama3-8B in
             | llama.cpp at the same time. Oddly enough, this is rarely a
             | problem.
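              | 
              | For context, a 4-bit quantized 8B model is roughly 4-5 GB of
              | weights before the KV cache, which is why it's a squeeze next
              | to podman and the Electron apps. A minimal sketch with the
              | llama-cpp-python bindings (model path and settings are
              | placeholders):
              | 
              |   from llama_cpp import Llama  # pip install llama-cpp-python
              | 
              |   # placeholder path to a ~4.5 GB Q4_K_M quantization
              |   llm = Llama(model_path="llama-3-8b-instruct.Q4_K_M.gguf",
              |               n_ctx=2048)  # small context, modest KV cache
              | 
              |   out = llm("Q: What is unified memory? A:", max_tokens=64)
              |   print(out["choices"][0]["text"])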
        
               | prepend wrote:
               | It's the "Mostly" part that sucks. What's the price
               | difference between 8 and 16? Like $3 in wholesale prices.
               | 
               | This just seems like lameness on Apple's part.
        
               | KeplerBoy wrote:
               | It's not quite like that. Apple's RAM is in the SoC
                | package, it might be closer to $20, but still.
        
               | TremendousJudge wrote:
               | They have always done this, for some reason people buy it
               | anyway, so they have no incentive to stop doing it.
        
               | Aurornis wrote:
               | > What's the price difference between 8 and 16? Like $3
               | in wholesale prices.
               | 
               | Your estimates are not even close. You can't honestly
               | think that LPDDR5 at leading edge speeds is only $3 per
               | 64 Gb (aka 8GB), right?
               | 
                | Your estimate is off by an order of magnitude. The memory
               | Apple is using is closer to $40 for that increment, not
               | $3.
               | 
               | And yes, they include a markup, because nobody is
               | integrating hardware parts and selling them at cost. But
               | if you think the fastest LPDDR5 around only costs $3 for
               | 8GB, that's completely out of touch with reality.
        
               | moooo99 wrote:
                | Even taking rising market prices into account, your
                | estimate for the RAM module price is waaaaaaay off.
                | 
                | You can get 8GB of good-quality DDR5 DIMMs for $40; there
                | is no way in hell that Apple is paying anywhere near
                | that.
                | 
                | Going from 8 to 16GB is probably somewhere between $3-8
                | purely in material costs for Apple, not taking into
                | account any other associated costs.
        
               | smarx007 wrote:
               | GP said "LPDDR5" and that Apple won't sell at component
               | prices.
               | 
               | You mention DIMMs and component prices instead. This is
               | unhelpful.
               | 
               | See https://www.digikey.com/en/products/filter/memory/mem
               | ory/774... for LPDDR5 prices. You can get a price of
               | $48/chip at a volume of 2000 chips. Assuming that Apple
               | got a deal of $30-40-ish at a few orders of magnitude
               | larger order is quite fair. Though it certainly would be
               | nicer if Apple priced 8GB increments not much above
               | $80-120.
        
               | moooo99 wrote:
                | I am aware that there are differences; I just took RAM
                | DIMMs as a reference because there's a decent chance that
                | someone reading this has actually bought a comparable
                | product themselves.
               | 
               | As for prices, the prices you cited are not at all
               | comparable. Apple is absolutely certainly buying directly
               | from manufacturers without a middleman since we're
               | talking about millions of units delivered each quarter.
               | Based on those quantities, unit prices are guaranteed to
               | be substantially lower than what DigiKey offers.
               | 
                | Based on what little public information I was able to
                | find, spot market prices for LPDDR4 RAM seem to be
                | somewhere in the $3-5 range for 16GB modules. Let's be
                | generous and put LPDDR5 at triple the price, at $15 per
                | 16GB module. Given that the upgrade price for going from 8
                | to 16GB is 230 EUR, Apple is surely making a huge profit on
                | those upgrades alone by selling an essentially unusable
                | base configuration for a supposed "Pro" product.
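                | 
                | Rough math on those assumptions (the $15 figure is a guess,
                | as above, and EUR/USD conversion is ignored):
                | 
                |   component_cost = 15.0  # guessed cost of the extra LPDDR5
                |   upgrade_price = 230.0  # Apple's 8GB -> 16GB upgrade price
                | 
                |   margin = (upgrade_price - component_cost) / upgrade_price
                |   print(f"gross margin on the upgrade: ~{margin:.0%}")  # ~93%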
        
               | flawsofar wrote:
               | Local LLMs are sluggish on my M2 Air 8GB,
               | 
                | but up until these things I felt I could run
               | whatever I wanted, including Baldur's Gate 3.
        
               | Aurornis wrote:
               | Same here. My secondary laptop is 8GB of RAM and it's
               | fine.
               | 
               | As devs and power users we'll always have an edge case
               | for higher RAM usage, but the average consumer is going
               | to be perfectly fine with 8GB of RAM.
               | 
               | All of these comments about how 8GB of RAM is going to
               | make it "unusable" or a "waste of components" are absurd.
        
               | camel-cdr wrote:
                | Programming has a weird way of requiring basically
                | nothing some of the time, but other times you need to build
                | the latest version of your toolchain, or you are working on
                | some similarly huge project that takes ages to compile.
                | 
                | I was using my 4GB RAM Pinebook Pro on public transport
                | yesterday, and decided to turn off all cores except for a
                | single Cortex-A53 to save some battery. I had no
                | problems for my use case of a text editor + shell to
                | compile in, for doing some SIMD programming.
        
               | internet101010 wrote:
               | At this point I don't think the frustration has much to
               | do with the performance but rather RAM is so cheap that
               | intentionally creating a bottleneck to extract another
               | $150 from a customer comes across as greedy, and I am
               | inclined to agree. Maybe the shared memory makes things
               | more expensive but the upgrade cost has always been
               | around the same amount.
               | 
               | It's not quite in the same ballpark as showing apartment
               | or airfare listings without mandatory fees but it is at
               | the ticket booth outside of the stadium.
        
               | oblio wrote:
               | I imagine you don't have browsers with many tabs.
        
               | adastra22 wrote:
               | I could never understand how people operate with more
               | than a dozen or so open tabs.
        
               | touristtam wrote:
                | Those are the "I'll go back to it later" type. The
                | workflow on modern browsers is broken. Instead of
                | leveraging the bookmark functionality to improve the UX,
                | we have this situation of users having 50+ tabs open,
                | because they can. It takes quite a bit of discipline to
                | close down tabs to a more manageable number.
        
               | robin_reala wrote:
               | The number of tabs you have doesn't correlate to the
               | number of active web views you have, if you use any
               | browser that unloads background tabs while still saving
               | their state.
        
               | GOONIMMUNE wrote:
               | how many is "many"? I'm also on an M1 Mac 8 GB RAM and I
               | have 146 chrome tabs open without any issues.
        
               | gs17 wrote:
               | Mine is 8GB M1 and it is not fine. But the actual issue
               | for me isn't RAM as much as it is disk space, I'm pretty
               | confident if it wasn't also the 128 GB SSD model it would
               | handle the small memory just fine.
               | 
               | I'm still getting at least 16 GB on my next one though.
        
               | regularfry wrote:
               | Yeah, that's definitely a thing. Podman specifically eats
               | a lot.
        
             | Pikamander2 wrote:
             | That would be fine if the 8GB model was also _priced_ for
             | casual and light productivity use cases. But alas, this is
              | Apple we're talking about.
        
               | wiseowise wrote:
               | MacBook Air starts at 1,199 euro. For insane battery
               | life, amazing performance, great screen and one of the
                | lightest chassis. Find me a comparable laptop, I'll wait.
        
               | touristtam wrote:
                | The screen is the killer. You can have a nice-ish 2nd
                | corporate laptop with a decent and swappable battery on
                | which you can install a decent OS (non-Windows) and get
                | good mileage, but the screen is something else.
        
             | dialup_sounds wrote:
             | Most people that consider themselves "power users" aren't
             | even power users, either. Like how being into cars doesn't
             | make you a race car driver.
        
               | 0x457 wrote:
               | Race car drivers think they are pros and can't even
               | rebuild the engine in their car.
               | 
               | There are different categories of "power users"
        
             | wvenable wrote:
             | Isn't this the "Pro" model?
        
           | reustle wrote:
           | > If these things will ever get MacOS support
           | 
           | The Macbook line will get iPadOS support long before they
           | allow MacOS on this line. Full steam ahead towards the walled
           | garden.
        
             | bluescrn wrote:
             | iOS has become such a waste of great hardware, especially
             | in the larger form factor of the iPad.
             | 
             | M1 chips, great screens, precise pencil input and keyboard
             | support, but we still aren't permitted a serious OS on it,
             | to protect the monopolistic store.
             | 
              | App Stores have been around long enough to prove that
             | they're little more than laboratories in which to carry out
             | accelerated enshittification experimentation. Everything so
             | dumbed down and feature-light, yet demanding subscriptions
             | or pushing endless scammy ads. And games that are a
             | shameless introduction to gambling addiction, targeted at
             | kids.
             | 
             | Most of the 'apps' that people actually use shouldn't need
             | to be native apps anyway, they should be websites. And now
             | we get the further enshittification of trying to force
             | people out of the browser and into apps, not for a better
             | experience, but for a worse one, where more data can be
             | harvested and ads can't so easily be blocked...
        
             | arecurrence wrote:
             | If the iPad could run Mac apps when docked to Magic
             | Keyboard like the Mac can run iPad apps then there may be a
             | worthwhile middle ground that mostly achieves what people
             | want.
             | 
             | The multitasking will still be poor but perhaps Apple can
             | do something about that when in docked mode.
             | 
             | That said, development likely remains a non-starter given
             | the lack of unix tooling.
        
           | Aurornis wrote:
            | I have a minimum of 64GB on all my main developer machines
           | (home, work, laptop), but I have a spare laptop with only 8GB
           | of RAM for lightweight travel.
           | 
            | Despite the entire internet telling me it would be "unusable"
            | and a complete disaster, it's actually 100%
           | perfectly fine. I can run IDEs, Slack, Discord, Chrome, and
           | do dev work without a problem. I can't run a lot of VMs or
           | compile giant projects with 10 threads, of course, but for
           | typical work tasks it's just fine.
           | 
           | And for the average consumer, it would also be fine. I think
           | it's obvious that a lot of people are out of touch with
           | normal people's computer use cases. 8GB of RAM is fine for
           | 95% of the population and the other 5% can buy something more
           | expensive.
        
             | mananaysiempre wrote:
             | For me personally, it's not an issue of being out of touch.
             | I did, in fact, use a 2014 Macbook with an i5 CPU and 16 GB
             | of RAM for nearly a decade and know how often I hit swap
             | and/or OOM on it even without attempting multicore
             | shenanigans which its processor couldn't have managed
             | anyway.
             | 
             | It's rather an issue of selling deliberately underpowered
             | hardware for no good reason other than to sell actually up-
             | to-date versions for a difference in price that has no
             | relation to the actual availability or price of the
             | components. The sheer disconnect from any kind of reality
             | offends me as a person whose job and alleged primary
             | competency is to recognize reality then bend it to one's
             | will.
        
               | KeplerBoy wrote:
                | I don't think we were ever at a point in computing where
                | you could buy a high-end (even entry-level MacBooks have
                | high-end pricing) laptop with the same amount of RAM as
                | you could 10 years earlier.
                | 
                | 8 GB was the standard back then.
        
             | elaus wrote:
             | But why did you configure 3 machines with 64+ GB, if 8 GB
             | RAM are "100% perfectly fine" for typical work tasks?
             | 
             | For me personally 16 or 32 GB are perfectly fine, 8 GB was
             | too little (even without VMs) and I've never needed 64 or
             | more. So it's curious to see you are pretty much exactly
             | the opposite.
        
               | joshmanders wrote:
               | > But why did you configure 3 machines with 64+ GB, if 8
               | GB RAM are "100% perfectly fine" for typical work tasks?
               | 
               | Did you miss this part prefixing that sentence?
               | 
               | > I can't run a lot of VMs or compile giant projects with
               | 10 threads, of course
        
             | leptons wrote:
             | Be honest, that 8GB computer isn't running MacOS, is it.
        
               | adastra22 wrote:
               | That's the standard configuration of a MacBook Air.
        
             | grecy wrote:
             | I'm editing 4k video and thousands of big RAW images.
             | 
             | The used M1 MacBook Air I just bought is by far the fastest
             | computer I have ever used.
        
           | jiehong wrote:
            | People always complain that devs should write more efficient
            | software, so maybe that's one way!
            | 
            | At the very least, Chrome wouldn't run that many tabs on an
            | iPad if it used the same engine as desktop Chrome.
        
           | leptons wrote:
            | These specific models of tablet won't ever get MacOS support.
           | Apple will tell you when you're allowed to run MacOS on a
           | tablet, and they'll make you buy a new tablet specifically
           | for that.
        
         | sambazi wrote:
         | > 8gb of ram
         | 
         | not again
        
         | ilikehurdles wrote:
         | The iPod classic had 160 GB of storage fifteen years ago.
         | 
         | No device should be measuring storage in the gigabytes in 2024.
         | Let alone starting at $1000 offering only 256GB. What
         | ridiculousness.
        
           | dmitrygr wrote:
           | So browse the web and play modern games on an iPod classic
        
           | vrick wrote:
           | To be fair the ipod classic used a platter drive and ipads
            | are high-speed SSD storage. That being said, it's been years
            | of the same storage options, and at those prices the base
            | storage should be much higher, along with their iCloud
            | storage offerings.
        
           | LeoPanthera wrote:
           | The iPod Classic had spinning rust. Don't pretend it's
           | comparable with a modern SSD.
        
         | intrasight wrote:
         | "Think Different" ;)
        
         | littlestymaar wrote:
         | > 8gb of ram.
         | 
         | WTF? Why so little? That's insane to me, that's the amount of
         | RAM you get with a mid-range android phone.
        
           | MyFirstSass wrote:
            | We've reached a point where their chips have become so amazing
            | they have to introduce "fake scarcity" and "fake limits" to
            | sell their pro lines, dividing their customers into haves and
            | have-nots while actively stalling the entire field for the
            | masses.
        
             | littlestymaar wrote:
             | Yes, but we're talking about the "pro" version here which
             | makes even less sense!
        
             | its_ethan wrote:
             | You could, alternatively, read less malice into the
             | situation and realize that the _majority of people buying
             | an iPad pro don 't even need 8gb of RAM_ to do what they
             | want to do with the device (web browsing + video
             | streaming).
        
           | deergomoo wrote:
           | I'm not defending Apple's absurd stinginess with RAM (though
           | I don't think it's much of an issue on an iPad given how
           | gimped the OS is), but I've never understood why high-end
           | Android phones have 12/16+ GB RAM.
           | 
           | What needs that amount on a phone? 8GB on a desktop is...well
            | it's not great, but it's _usable_, and usable for a damn
           | sight more multi-tasking than you would ever do on a
           | smartphone. Is it just because you need to look better on the
           | spec sheet, like the silly camera megapixel wars of the
           | 2010s?
        
       | sroussey wrote:
       | "The next-generation cores feature improved branch prediction,
       | with wider decode and execution engines for the performance
       | cores, and a deeper execution engine for the efficiency cores.
       | And both types of cores also feature enhanced, next-generation ML
       | accelerators."
        
         | gwd wrote:
         | I wonder to what degree this also implies, "More opportunities
         | for speculative execution attacks".
        
           | sroussey wrote:
           | I thought the same thing when I read it, but considering the
           | attacks were known when doing this improvement, I hope that
           | it was under consideration during the design.
        
         | anentropic wrote:
         | I wish there was more detail about the Neural Engine updates
        
           | sroussey wrote:
           | Agreed. I think they just doubled the number of cores and
           | called it a day, but who knows.
        
         | a_vanderbilt wrote:
         | Fat decode pipelines have historically been a major reason for
         | their performance lead. I'm all for improvements in that area.
        
       | snapcaster wrote:
       | Apple was really losing me with the last generation of intel
       | macbooks but these m class processors are so good they've got me
       | locked in all over again
        
         | atonse wrote:
         | The M1 Max that I have is easily the greatest laptop I've ever
         | owned.
         | 
         | It is fast and handles everything I've ever thrown at it (I got
         | 32 GB RAM), it never, ever gets hot, I've never heard a fan in
         | 2+ years (maybe a very soft fan if you put your ear next to
         | it). And the battery life is so incredible that I often use it
         | unplugged.
         | 
         | It's just been a no-compromise machine. And I was thinking of
         | upgrading to an M3 but will probably upgrade to an M4 instead
         | at the end of this year when the M4 maxes come out.
        
           | grumpyprole wrote:
           | Unlike the PC industry, Apple is/was able to move their
           | entire ecosystem to a completely different architecture,
           | essentially one developed exactly for low power use. Windows
           | on ARM efforts will for the foreseeable future be plagued by
           | application support and driver support. It's a great shame,
           | as Intel hardware is no longer competitive for mobile
           | devices.
        
           | teaearlgraycold wrote:
           | Now that there's the 13 inch iPad I am praying they remove
           | the display notch on the Macbooks. It's a little wacky when
           | you've intentionally cut a hole out of your laptop screen
           | just to make it look like your phones did 2 generations ago
           | and now you sell a tablet with the same screen size without
           | that hole.
        
             | tiltowait wrote:
              | I really hate the notch[0], but I do like that the screen
              | stretches into the top area that would otherwise be empty.
              | It's unsightly, but we did gain from it.
             | 
             | [0] Many people report that they stop noticing the notch
             | pretty quickly, but that's never been the case for me. It's
             | a constant eyesore.
        
               | kjkjadksj wrote:
               | A big issue with the thin bezels I am now noticing is you
               | lose what used to be a buffer for fingerprints from
               | opening the lid.
        
               | teaearlgraycold wrote:
               | What I've done is use a wallpaper that is black at the
                | top. On the MBP's mini-LED screen that means the black bezel
               | perfectly blends into the now black menu bar. It's pretty
               | much a perfect solution but the problem it's solving is
               | ridiculous IMO.
        
               | doctor_eval wrote:
               | I do the same, I can't see the notch and got a surprise
               | the other day when my mouse cursor disappeared for a
               | moment.
               | 
               | I don't get the hate for the notch tho. The way I see it,
               | they pushed the menus out of the screen and up into their
               | own dedicated little area. We get more room for content.
               | 
               | It's like the touchbar for menus. Oh, ok, now I know why
               | people hate it. /jk
        
               | adamomada wrote:
               | Here's a handy free utility to automate this for you:
               | https://topnotch.app
               | 
               | Personally I never see the desktop background so I just
               | set desktop to Black, it's perfect for me.
        
               | teaearlgraycold wrote:
               | Thanks!
        
           | kjkjadksj wrote:
            | That's surprising you haven't heard the fans. Must be the use
            | case. There are a few games that will get it quite hot and
            | spool up the fans. I have also noticed it's got somewhat poor
            | sleep management and remains hot while asleep. Sometimes I
            | pick up the computer for the first time that day and it's
            | already very hot from whatever kept it out of sleep with a
            | shut lid all night.
        
             | artificialLimbs wrote:
             | Not sure what app you've installed to make it do that, but
             | I've only experienced the opposite. Every Windows 10 laptop
             | I've owned (4 of them) would never go to sleep and turn my
             | bag into an oven if I forgot to manually shut down instead
             | of closing the lid. Whereas my M1 MBP has successfully gone
             | to sleep every lid close.
        
               | deergomoo wrote:
               | The Windows 10 image my employer uses for our Dell
               | shitboxes has sleep completely disabled for some reason I
               | cannot possibly comprehend. The only options in the power
               | menu are Shut Down, Restart, and Hibernate.
               | 
               | If I forget to hibernate before I put it in my bag it
               | either burns through its battery before the next day, or
               | overheats until it shuts itself down. If I'm working from
               | home and get up to pee in the night, I often walk past my
               | office and hear the fans screaming into an empty room,
               | burning god knows how much electricity. Even though the
               | only thing running on it was Slack and an editor window.
               | 
               | It's an absolute joke of a machine and, while it's a few
               | years old now, its original list price was equivalent to
               | a very well specced MacBook Pro. I hope they were getting
               | a substantial discount on them.
        
       | paxys wrote:
       | They keep making iPads more powerful while keeping them on the
       | Fisher-Price OS and then wonder why no one is buying them for
       | real work.
       | 
       | Who in their right mind will spend $1300-$1600 on this rather
       | than a MacBook Pro?
        
         | throwaway2562 wrote:
         | This.
        
         | wizzwizz4 wrote:
         | Everyone knows the Fisher-Price OS is Windows XP (aka
         | Teletubbies OS, on account of Bliss). iPads run Countertop OS
         | (on account of their flat design).
        
         | hedora wrote:
         | You can install linux on the one with a fixed hinge and
         | keyboard, but without a touchscreen. It's the "book" line
         | instead of the "pad" line.
         | 
         | I'm also annoyed that the iPad is locked down, even though it
         | could clearly support everything the macbook does.
         | 
         | Why can't we have a keyboard shortcut to switch between the
         | iPad and Mac desktops or something?
        
       | PreInternet01 wrote:
       | > M4 makes the new iPad Pro an outrageously powerful device for
       | artificial intelligence
       | 
       | Yeah, well, I'm an enthusiastic M3 user, and I'm sure the new AI
        | capabilities are _nice_, but hyperbole like this is just
        | _asking_ for snark like "my RTX4090 would like a word".
       | 
       | Other than that: looking forward to when/how this chipset will be
       | available in Macbooks!
        
         | mewpmewp2 wrote:
          | Although as I understand it, M3 chips with more VRAM handle
          | larger LLMs better because they can load more into VRAM
          | compared to a 4090.
        
           | freedomben wrote:
           | This is true, but that is only an advantage when running a
           | model larger than the VRAM. If your models are smaller,
           | you'll get substantially better performance in a 4090. So it
           | all comes down to which models you want to run.
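            | 
            | A rough way to ballpark the fit (weights only; KV cache and
            | runtime overhead ignored):
            | 
            |   def approx_gb(params_billion, bits_per_weight):
            |       return params_billion * bits_per_weight / 8
            | 
            |   for params in (8, 13, 70):
            |       gb = approx_gb(params, 4)  # ~4-bit quantization
            |       print(f"{params}B @ 4-bit: ~{gb:.0f} GB, fits in 24 GB: {gb < 24}")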
        
             | mewpmewp2 wrote:
              | It seems like 13B was running fine on the 4090, but all the
              | more fun or intelligent ones I tried became very slow and
              | would have performed better on an M3.
        
           | PreInternet01 wrote:
           | Yes, M3 chips are _available_ with 36GB unified RAM when
           | embedded in a MacBook, although 18GB and below are the _norm
           | for most models_.
           | 
           | And even though the Apple press release does not even
           | _mention_ memory capacity, I can _guarantee_ you that it will
           | be even less than that on an iPad (simply because RAM is very
            | battery-hungry and most consumers won't care).
           | 
           | So, therefore my remark: it will be interesting to see how
           | this chipset lands in MacBooks.
        
             | mewpmewp2 wrote:
              | But the M3 Max should be able to support up to 128GB.
        
         | SushiHippie wrote:
         | Disclosure: I personally don't own any apple devices, except a
         | work laptop with an M2 chip
         | 
         | I think a comparison to the 4090 is unfair, as there is no
          | laptop/tablet with an RTX 4090, and the power consumption of a
          | 4090 is ~450W on average.
        
           | PreInternet01 wrote:
           | > I think a comparison to the 4090 is unfair
           | 
           | No, when using wording like "outrageously powerful", that's
           | exactly the comparison you elicit.
           | 
           | I'd be fine with "best in class" or even "unbeatable
           | performance per Watt", but I can absolutely _guarantee_ you
           | that an iPad does not outperform any current popular-with-
           | the-ML-crowd GPUs...
        
       | simonbarker87 wrote:
        | Seeing an M-series chip launch first in an iPad must be the
        | result of some mad supply chain and manufacturing-related
        | hangovers from COVID.
       | 
       | If the iPad had better software and could be considered a first
       | class productivity machine then it would be less surprising but
       | the one thing no one says about the iPads is "I wish this chip
       | were faster"
        
         | asddubs wrote:
         | Well, it also affects the battery life, so it's not entirely
         | wasted on the ipad
        
         | margalabargala wrote:
         | Maybe they're clocking it way down. Same performance, double
         | the battery life.
        
           | cjauvin wrote:
           | I very rarely wish the battery of my iPad Pro 2018 would last
           | longer, as it's already so good, even considering the age
           | factor.
        
             | 0x457 wrote:
             | Yeah, I don't think about charging my iPad throughout the
             | day, and I constantly use it. Maybe it's in the low 20s
             | late at night, but it never bothered me.
        
         | Zigurd wrote:
         | My guess is that the market size fit current yields.
        
           | a_vanderbilt wrote:
           | I think this is the most likely explanation. Lower volume for
           | the given product matches supply better, and since it's
           | clocked down and has a lower target for GPU cores it has
           | better yields.
        
           | hajile wrote:
           | They already released all their macbooks and latest iphone on
           | N3B which is the worst-yielding 3nm from TSMC. I doubt yields
           | are the issue here.
           | 
           | It's suspected that the fast release for M4 is so TSMC can
           | move away from the horrible-yielding N3B to N3E.
           | 
           | Unfortunately, N3E is less dense. Paired with a couple more
           | little cores, an increase in little core size, 2x larger NPU,
           | etc, I'd guess that while M3 seems to be around 145mm2, this
           | one is going to be quite a bit larger (160mm2?) with the size
           | hopefully being offset by decreased wafer costs.
        
         | DanHulton wrote:
         | I'm wondering if it's because they're hitting the limits of the
         | architecture, and it sounds way better to compare M4 vs M2 as
         | opposed to vs M3, which they'd have to do if it launched in a
         | Macbook Pro.
        
           | mason55 wrote:
           | Eh, they compared the M3 to the M1 when they launched it.
           | People grumbled and then went on with their lives. I don't
           | think they'd use that as a reason for making actual product
           | decisions.
        
         | MuffinFlavored wrote:
         | To me it just feels like a soft launch.
         | 
         | You probably have people (like myself) trying to keep up with
         | the latest MacBook Air who get fatigued having to get a new
         | laptop every year (I just upgraded to the M3 not too long ago,
         | from the M2, and before that... the M1... is there any reason
         | to? Not really...), so now they are trying to entice people who
         | don't have iPads yet / who are waiting for a reason to do an
         | iPad upgrade.
         | 
         | For $1,300 configured with the keyboard, I have no clue what
         | I'd do with this device. They very deliberately are keeping
         | iPadOS + MacOS separate.
        
           | low_common wrote:
           | You get a new laptop every year?
        
             | Teever wrote:
             | If you replace your laptop every year or two and sell the
              | old one online you can keep up with the latest technology for
             | only a slight premium.
        
             | MuffinFlavored wrote:
             | I'm sort of "incentivized" to by Apple because as soon as
             | they release a new one, the current device you have will be
             | at "peak trade in value" and deteriorate over time.
             | 
             | It's a negligible amount of money. It's like, brand new
             | $999, trade in for like $450. Once a year... $550
              | remainder/12 months is ~$45.83/mo to have the latest and
             | greatest laptop.
        
               | fwip wrote:
               | How much is a 2-year old laptop worth? Because if you buy
               | a new laptop every two years and don't even sell the old
               | one, you're only spending $500 a year, which is less than
               | you are now.
        
             | bombcar wrote:
             | It's not terribly expensive if you trade-in or otherwise
             | sell or hand down the previous.
             | 
             | I went from M1 to M1 Pro just to get more displays.
        
           | mvkel wrote:
           | I think (hope) wwdc changes this. The function keys on the
           | Magic Keyboard give me hope.
           | 
           | Also, you know you don't HAVE to buy a laptop every year,
           | right?
        
           | jasonjmcghee wrote:
           | Still using my M1 Air and had no interest in updating to M3.
           | Battery life has dropped a fair amount, but still like 8+
           | hours. That's going to be the trigger to get a new one. If
           | only batteries lasted longer.
        
             | foldr wrote:
             | I don't think it costs that much to have the battery
             | replaced compared to the price of a new laptop.
        
               | jasonjmcghee wrote:
               | Curious how much it would cost. I think parts are on the
                | order of $150? So maybe $400-500 for official repair?
               | 
               | If I can hold out another year or two, would probably end
               | up just getting a new one
        
               | foldr wrote:
               | Oh no, it's way less than that. It should be about $159.
               | You can get an estimate here:
               | https://support.apple.com/en-us/mac/repair
        
           | baq wrote:
           | I feel like I bought the M1 air yesterday. Turns out it was
           | ~4 years ago. Never felt the need to upgrade.
        
             | dialup_sounds wrote:
             | Interestingly, Apple still sells M1 Airs through Walmart,
             | but not their own website.
        
             | Toutouxc wrote:
             | Same here, my M1 Air still looks and feels like a brand new
             | computer. Like, I still think of it as "my new MacBook".
             | It's my main machine for dev work and some hobby
             | photography and I'm just so happy with it.
        
             | wil421 wrote:
             | The only reason I upgraded is my wife "stole" my M1 air. I
             | bought a loaded M3 MBP and then they came out with a 15"
             | Air with dual monitor capabilities. Kinda wish I had the
             | air again. It's not like I move it around much but the form
             | factor is awesome.
        
               | skohan wrote:
               | I love the air form factor. I do serious work on it as
               | well. I have used a pro, but the air does everything I
               | need without breaking a sweat, and it's super convenient
               | to throw in a bag and carry around the house.
        
         | alexpc201 wrote:
         | To offer something better to those who have an iPad Pro M2 and
         | a more powerful environment to run heavier games.
        
         | andrewmunsell wrote:
         | My current assumption is that this has to do with whatever "AI"
         | Apple is planning to launch at WWDC. If they launched a new
         | iPad with an M3 that wasn't able to sufficiently run on-device
         | LLMs or whatever new models they are going to announce in a
         | month, it would be a bad move. The iPhones in the fall will
         | certainly run some new chip capable of on-device models, but
         | the iPads (being announced in the Spring just before WWDC) are
         | slightly inconveniently timed since they have to announce the
         | hardware before the software.
        
           | owenpalmer wrote:
           | interesting theory, we'll see what happens!
        
         | eitally wrote:
         | My guess is that the M4 and M3 are functionally almost
         | identical so there's no real reason for them to restrict the
         | iPad M4 launch until they get the chip into the MacBook / Air.
        
         | mort96 wrote:
         | To be honest, I wish my iPad's chip was slower! I can't do
         | anything other than watch videos and use drawing programs on an
         | iPad, why does it need a big expensive power hungry and
         | environmentally impactful CPU when one 1/10 the speed would do?
         | 
         | If I could actually _do_ something with an iPad there would be
         | a different discussion, but the operating system is so
          | incredibly gimped that the most demanding task it's really
          | suited for is... decoding video.
        
           | Shank wrote:
           | > Why does it need a big expensive power hungry and
           | environmentally impactful CPU when one 1/10 the speed would
           | do?
           | 
           | Well, it's not. Every process shrink improves power
           | efficiency. For watching videos, you're sipping power on the
           | M4. For drawing...well if you want low latency while drawing,
           | which generally speaking, people do, you...want the processor
           | and display to ramp up to compensate and carry strokes as
           | fast as possible?
           | 
           | Obviously if your main concern is the environment, you
           | shouldn't upgrade and you should hold onto your existing
           | model(s) until they die.
        
             | mort96 wrote:
             | From what I can tell, the 2020 iPad has perfectly fine
             | latency while drawing, and Apple hasn't been advertising
             | lower latencies for each generation; I think they pretty
             | much got the latency thing nailed down. Surely you could
             | make something with the peak performance of an A12Z use
             | less power on average than an M4?
             | 
             | As for the environmental impact, whether I buy or don't buy
             | this iPad (I won't, don't worry, my 2020 one still works),
             | millions of people will. I don't mind people buying
             | powerful machines when the software can make use of the
             | performance, but for iPad OS..?
        
               | Kon-Peki wrote:
               | The M4 is built on the newest, best, most expensive
               | process node (right?). They've got to amortize out those
               | costs, and then they could work on something cheaper and
               | less powerful. I agree that they probably won't, and
               | that's a shame. But still, the M4 is most likely one of
               | the best options for the best use of this new process
               | node.
        
           | steveridout wrote:
           | I'm under the impression that this CPU is faster AND more
           | efficient, so if you do equivalent tasks on the M4 vs an
           | older processor, the M4 should be less power hungry, not
           | more. Someone correct me if this is wrong!
        
             | mort96 wrote:
             | It's more power efficient than the M3, sure, but surely it
             | could've been even more power efficient if it had worse
             | performance simply from having fewer transistors to switch?
             | It would certainly be more environmentally friendly at the
             | very least!
        
               | Kon-Peki wrote:
               | The most environmentally friendly thing to do is to keep
               | your A12Z for as long as you can, ignoring the annual
               | updates. And when the time comes that you must do a
               | replacement, get the most up to date replacement that
               | meets your needs. Change your mindset - you are not
               | required to buy this one, or the next one.
        
               | mort96 wrote:
               | Of course, I'm not buying this one or any other until
               | something breaks. After all, my _current_ A12Z is way too
               | powerful for iPadOS. It just pains me to see amazing
               | feats of hardware engineering like these iPads with M4 be
                | completely squandered by a software stack which doesn't
               | facilitate more demanding tasks than decoding video.
               | 
               | Millions of people will be buying these things regardless
               | of what I'm doing.
        
               | _ph_ wrote:
               | Look at the efficiency cores. They are all you are
               | looking for.
        
               | mort96 wrote:
               | I agree!
               | 
               | So what are the performance cores doing there?
        
           | kylehotchkiss wrote:
           | > Environmentally impactful CPU when one 1/10 the speed would
           | do
           | 
           | Apple's started to roll out green energy charging to devices:
           | https://support.apple.com/en-us/108068
           | 
           | If I had to ballpark estimate this, your iPad probably uses
           | less energy per year than a strand of incandescent holiday
           | lights does in a week. Maybe somebody can work out that math.
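            | 
            | A stab at that math, with every figure a rough guess (battery
            | size, one full charge a day, ~85% charger efficiency, bulb
            | wattage, 6 hours lit per night):
            | 
            |   ipad_kwh_year = 38 * 365 / 0.85 / 1000    # 38 Wh daily: ~16 kWh/yr
            |   c9_kwh_week = 25 * 7 * 6 * 7 / 1000       # 25 C9 bulbs @ 7 W: ~7.4 kWh/wk
            |   mini_kwh_week = 100 * 0.4 * 6 * 7 / 1000  # 100 minis @ 0.4 W: ~1.7 kWh/wk
            |   print(ipad_kwh_year, c9_kwh_week, mini_kwh_week)
            | 
            | So it roughly holds against a couple of weeks of an old-style
            | C9 strand; modern mini or LED strands use far less.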
        
             | mort96 wrote:
             | The environmental concerns I have aren't really power
             | consumption. Making all these CPUs takes _a lot_ of
             | resources.
        
               | pertymcpert wrote:
               | How do you suggest they make new iPads for people who
               | want them? Someone has to make new CPUs and if you can
               | improve perf/W while you're doing so you might as well.
        
           | aurareturn wrote:
           | >To be honest, I wish my iPad's chip was slower! I can't do
           | anything other than watch videos and use drawing programs on
           | an iPad, why does it need a big expensive power hungry and
           | environmentally impactful CPU when one 1/10 the speed would
           | do?
           | 
           | A faster SoC can finish the task with better "work
           | done/watt". Thus, it's more environmentally friendly. Unless
           | you're referring to the resources dedicated to advancing
           | computers such as the food engineers eat and the electricity
           | chip fabs require.
        
             | mort96 wrote:
             | A faster and more power hungry SoC can finish the task with
             | better work done per joule if it is fast enough to offset
             | the extra power consumption. It is my understanding that
             | this is often not the case. See e.g efficiency cores
             | compared to performance cores in these heterogeneous
             | design; the E cores can get more done per joule AFAIU. If
             | my understanding is correct, then removing the P cores from
             | the M4 chip would let it get more work done per joule.
             | 
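              | To illustrate with made-up numbers (these are not Apple's
              | figures), a core that draws less power but takes longer can
              | still win on energy per task:
              | 
              |   p_core = {"watts": 4.0, "seconds": 1.0}  # fast but power-hungry
              |   e_core = {"watts": 0.8, "seconds": 3.0}  # slower but frugal
              | 
              |   for name, core in (("P core", p_core), ("E core", e_core)):
              |       joules = core["watts"] * core["seconds"]
              |       print(f"{name}: {joules:.1f} J for the same task")
              |   # P core: 4.0 J, E core: 2.4 J, despite taking 3x as long
              | 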
             | Regardless, the environmental impact I'm thinking about
             | isn't mainly power consumption.
        
               | pertymcpert wrote:
               | The P cores don't get used if they're not needed. You
                | don't need to worry. Most of your everyday use and
               | background work gets allocated to E cores.
        
           | _ph_ wrote:
            | It has 6 efficiency cores. Every single one of them is
            | extremely power efficient, but still faster than an iPad 2-3
            | generations back. So unless you go full throttle, an M4 is
            | going to be by far the most efficient CPU you can have.
        
         | drexlspivey wrote:
         | Meanwhile the mac mini is still on M2
        
         | themagician wrote:
         | Welcome to being old!
         | 
         | Watch a 20-year old creative work on an iPad and you will
         | quickly change your mind. Watch someone who has, "never really
         | used a desktop, [I] just use an iPad" work in Procreate or
         | LumaFusion.
         | 
         | The iPad has amazing software. Better, in many ways, than
         | desktop alternatives _if_ you know how to use it. There are
          | some things they can't do, and the workflow can be less
         | flexible or full featured in some cases, but the speed at which
         | some people (not me) can work on an iPad is mindblowing.
         | 
         | I use a "pro" app on an iPad and I find myself looking around
         | for how to do something and end up having to Google it half the
         | time. When I watch someone who really knows how to use an iPad
         | use the same app they know exactly what gesture to do or where
         | to long tap. I'm like, "How did you know that clicking on that
         | part of the timeline would trigger that selection," and they
         | just look back at you like, "What do you mean? How else would
         | you do it?"
         | 
          | There is a bizarre and almost undocumented design language of
          | iPadOS that some people simply seem to know. It often pops up
          | in those little "tap-torials" when a new feature rolls out that
         | I either ignore or forget... but other people internalize them.
        
           | quasarj wrote:
           | They can have my keyboard when they pry it from my cold dead
           | hands! And my mouse, for that matter.
        
             | themagician wrote:
             | Oh, I'm with you. But the funny thing is, they won't even
             | want it.
             | 
             | I have two iPads and two pencils--that way each iPad is
              | never without a pencil--and yet I rarely use the pencil. I
             | just don't think about it. But then when I do, I'm like,
             | "Why don't I use this more often? It's fantastic."
             | 
              | I have tried and tried to adapt and I cannot. I need a
              | mouse, keyboard, separate numpad, and two 5K displays to
              | mostly arrive at the same output that someone can do with a
              | single 11" or 13" screen and a bunch of different spaces
              | that can be flicked through.
             | 
              | I desperately wanted to make the iPad my primary machine
             | and I could not do it. But, honestly, I think it has more
             | to do with me than the software. I've become old and
             | stubborn. I want to do things my way.
        
               | skydhash wrote:
               | > "Why don't I use this more often? It's fantastic."
               | 
                | PDF markup and Procreate are my main uses for it. And
                | using the iPad on a flat surface.
        
             | Terretta wrote:
             | Magic Keyboard is both, and the current (last, as of today)
             | iteration is great.
             | 
             | It is just fine driving Citrix or any web app like
             | VSCode.dev.
        
               | btown wrote:
               | The existence of vscode.dev always makes me wonder why
               | Microsoft never released an iOS version of VSCode to get
               | more users into its ecosystem. Sure, it's almost as
               | locked down as the web environment, but there's a lot of
               | space in that "almost" - you could do all sorts of things
               | like let users run their code, or complex extensions, in
               | containers in a web view using
               | https://github.com/ktock/container2wasm or similar.
        
           | aurareturn wrote:
            | I'm with you. I think HN (and the conversational internet)
            | disproportionately contains more laptop people than the
            | general public.
           | 
           | A lot of of the younger generation does all their work on
           | their phone and tablet and does not have a computer.
        
             | Tiktaalik wrote:
              | That's if those younger folks are, as the parent says, in
              | the creative sector.
              | 
              | The iPad has been a workflow gamechanger for folks who use
              | Photoshop etc., but users are still prevented from coding on
              | it.
        
               | themagician wrote:
               | It's actually come a long way. The workflow is still...
               | sub-optimal, but there are some really _nice_ terminal
               | apps (LaTerminal, Prompt, ShellFish, iSH) which are
               | functional too. Working Copy is pretty dope for working
                | with git once you adapt to it.
               | 
               | I do most of my dev on a Pi5 now, so actually working on
               | the iPad is not that difficult.
               | 
               | If they ever release Xcode for iPadOS that would be a
               | true gamechanger.
        
             | wiseowise wrote:
             | Majority of people don't know what programming is and do
              | a shitton of manual things that can be automated by a simple
             | bash/Python script, so what?
        
       | Lalabadie wrote:
       | Key values in the press release:
       | 
       | - Up to 1.5x the CPU speed of iPad Pro's previous M2 chip
       | 
       | - Octane gets up to 4x the speed compared to M2
       | 
       | - At comparable performance, M4 consumes half the power of M2
       | 
       | - A high-performance AI engine, claimed to be 60x the speed of
       | Apple's first Neural Engine (in the A11 Bionic)
        
         | mrtesthah wrote:
         | > _- Up to 1.5x the CPU speed of iPad Pro 's previous M2 chip_
         | 
         | What I want to know is whether that ratio holds for single-core
         | performance measurements.
        
         | codedokode wrote:
         | > Up to 1.5x the CPU speed
         | 
         | Doesn't it mean "1.5x speed in rare specific tasks which were
         | hardware optimized, and 1x everywhere else"?
        
           | rsynnott wrote:
           | I mean, we'll have to wait for proper benchmarks, but that
           | would make it a regression vs the M3, so, er, unlikely.
        
         | BurningFrog wrote:
         | At this point I read "up to" as "not"...
        
         | stonemetal12 wrote:
         | Apple claimed the M3 was 1.35x the speed of the M2, so the
         | implied M4 vs. M3 comparison isn't that impressive. Certainly
         | not bad by any means, just pointing out why it is compared to
         | the M2 here.
        
           | SkyPuncher wrote:
           | While I completely agree with your point, the M chips are a
           | series of chips. The iPad's M2 is different from the MBP's
           | M2 or the MacBook Air's.
           | 
           | It's all just marketing to build hype.
        
             | astrange wrote:
             | No, it's the same as the MB Air.
        
           | kjkjadksj wrote:
           | People make this comment after every single M-series
           | release. It's true for Intel too, worse even. Changes
           | between, like, 8th and 9th and 10th gen were nil: a small
           | clock bump, same iGPU even.
        
           | frankchn wrote:
           | The other reason it is compared to the M2 is that there are
           | no iPads with M3s in them, so it makes sense to compare to
           | the processor used in the previous generation product.
        
           | tiltowait wrote:
           | It seems pretty reasonable to compare it against the last-
           | model iPad, which it's replacing.
        
       | dsign wrote:
       | > M4 makes the new iPad Pro an outrageously powerful device for
       | artificial intelligence.
       | 
       | Isn't there a ToS prohibition on "custom coding" in iOS? Like,
       | the only way you can ever use that hardware directly is as a
       | developer who goes through the Apple Developer Program, which
       | last time I heard was a bitter lemon? Tell me if I'm wrong.
        
         | freedomben wrote:
         | Well, this is the heart of the "appliance" model. iPads are
         | _appliances_. You wouldn't ask about running custom code on
         | your toaster or your blender, so you shouldn't ask about that
         | for your iPad. Also all the common reasons apply: Security and
         | Privacy, Quality Control, Platform Stability and Compatibility,
         | and Integrated User Experience. All of these things are harmed
         | when you are allowed to run custom code.
         | 
         | (disclaimer: My personal opinion is that the "appliance" model
         | is absurd, but I've tried to steel-man the case for it)
        
           | jebarker wrote:
           | Lots of people ask about running custom code on other
           | appliances. I think they call them hackers.
        
             | freedomben wrote:
             | I think you're reinforcing Apple's point about how security
             | is harmed by allowing custom code.
        
           | kibwen wrote:
           |  _> You wouldn't ask about running custom code on your
           | toaster or your blender, so you shouldn't ask about that for
           | your iPad._
           | 
           | Of course I would, and the only reason other people wouldn't
           | is because they're conditioned to believe in their own innate
           | powerlessness.
           | 
           | If you sell me a CPU, I want the power to program it, period.
        
             | freedomben wrote:
             | I mean this sincerely, are you really an Apple customer
             | then? I feel exactly the same as you, and for that reason I
             | don't buy Apple products. They are honest about what they
             | sell, which I appreciate.
        
               | judge2020 wrote:
               | Some arguments are that you shouldn't be able to create
               | appliances, only general purpose machines.
        
               | taylodl wrote:
               | Ever notice people don't build their own cars anymore?
               | They used to even up through the 60's. I mean ordering a
               | kit or otherwise purchasing all the components and
               | building the car. Nowadays it's very rare that people do
               | that.
               | 
               | I'm old enough to remember when people literally built
               | their own computers, soldering iron in hand. People
               | haven't done that since the early 80's.
               | 
               | Steve Jobs' vision of the Mac, released in 1984, was for
               | it to be a computing appliance - "the computer for the
               | rest of us." The technology of the day prevented that.
               | Though they pushed that as hard as they could.
               | 
               | Today's iPad? It's the fulfillment of Steve Jobs'
               | original vision of the Mac: a computing appliance. It
               | took 40 years, but we're here.
               | 
               | If you don't want a computing appliance then don't buy an
               | iPad. I'd go further and argue don't buy any tablet
               | device. Those that don't want computing appliances don't
               | have to buy them. It's not like laptops, or even
               | desktops, are going anywhere anytime soon.
        
               | nordsieck wrote:
               | > Some arguments are that you shouldn't be able to create
               | appliances, only general purpose machines.
               | 
               | I sincerely hope that you live as much of your life in
               | that world as possible.
               | 
               | Meanwhile, I'll enjoy having a car I don't have to mess
               | with every time I start it up.
        
               | bluescrn wrote:
               | In a world concerned with climate change, we should see
               | many of these 'appliances' as inherently wasteful.
               | 
               | On top of the ugly reality that they're designed to
               | become e-waste as soon as the battery degrades.
        
             | theshrike79 wrote:
             | Can you do that to your car infotainment system btw?
        
               | blacklion wrote:
               | Why not?
               | 
               | It MUST (RFC2119) be airgapped from ABS and ECU, of
               | course.
        
             | taylodl wrote:
             | _> If you sell me a CPU, I want the power to program it,
             | period._
             | 
             | Uhhh, there are CPUs in your frickin' wires now, dude!
             | There are several CPUs in your car for which you generally
             | don't have access. Ditto for your fridge. Your microwave.
             | Your oven. Even your toaster.
             | 
             | We're literally awash in CPUs. You need to update your
             | thinking.
             | 
             | Now, if you said something like "if you sell me a general-
             | purpose computing device, then I want the power to program
             | it, period" then I would fully agree with you. BTW, _you
             | can_ develop software for your own personal use on the
             | iPad. It's not cheap or easy (doesn't utilize commonly-
             | used developer tooling), but it can be done without having
             | to jump through any special hoops.
             | 
             | Armed with that, we can amend your statement to "if you
             | sell me a general-purpose computing device, then I want the
             | power to program it using readily-available, and commonly-
             | utilized programming tools."
             | 
             | I think that statement better captures what I presume to be
             | your intent.
        
               | talldayo wrote:
               | > but it can be done without having to jump through any
               | special hoops.
               | 
               | You are really stretching the definition of "special
               | hoops" here. On Android sideloading is a switch hidden in
               | your settings menu; on iOS it's either a municipal
               | feature or a paid benefit of their developer program.
               | 
               | Relative to every single other commercial, general-
               | purpose operating system I've used, I would say yeah,
               | Apple practically defines what "special hoops" look like
               | online.
        
               | duped wrote:
               | I do actually want the ability to program the CPUs in my
               | car the same way I'm able to buy parts and mods for every
               | mechanical bit in there down to the engine. In fact we
               | have laws about that sort of thing that don't apply to
               | the software.
        
             | umanwizard wrote:
             | That may be your personal preference, but you should accept
             | that 99% of people don't care about programming their
             | toaster, so you're very unlikely to ever make progress in
             | this fight.
        
               | mort96 wrote:
               | 99% of people don't care about programming anything, that
               | doesn't make this gatekeeping right.
        
               | kjkjadksj wrote:
               | You aren't wrong, but businesses aren't in the market to
               | optimize for 1% of their customers.
        
               | timothyduong wrote:
               | You could apply this to anything complex and packaged.
               | 
               | I'm annoyed that I can't buy particular engines off the
               | shelf and use them in my bespoke build; why don't car
               | manufacturers take the approach that crate-engine
               | providers do?
        
               | doctor_eval wrote:
               | Yeah, if I have to program my toaster, I'm buying a new
               | toaster.
               | 
               | I write enough code during the day to make me happy. I
               | really don't want to be thinking about the optimal
               | brownness of my bagel.
        
             | owenpalmer wrote:
             | the desire to program one's toaster is the most HN thing
             | I've seen all day XD
        
               | BrianHenryIE wrote:
               | I really wish I could program my dishwasher because it's
               | not cleaning very well and if I could add an extra rinse
               | cycle I think it would be fine.
        
               | kjkjadksj wrote:
               | Start by cleaning the filters
        
             | kjkjadksj wrote:
             | And engineer your own bagel setting without buying a bagel
             | model? Dream on.
        
           | shepherdjerred wrote:
           | If I could deploy to my blender as easily as I can to AWS,
           | then I would _definitely_ at least try it.
        
           | paxys wrote:
           | An appliance manufacturer isn't doing an entire press event
           | highlighting how fast the CPU on the appliance is.
        
             | worthless-trash wrote:
             | If it's advertised like a general-purpose computer,
             | expectations should be met.
        
             | freedomben wrote:
             | Agree completely. I think it's absurd that they talk about
             | technical things like CPU and memory in these
             | announcements. It seems to me like an admission that it's
             | not really an "appliance" but trying to translate Apple
             | marketing into logical/coherent concepts can be a
             | frustrating experience. I just don't try anymore.
        
           | ben-schaaf wrote:
           | I appreciate the steel-man. A strong counter argument for me
           | is that you actually _can_ run any custom code on an iPad, as
           | long as it's in a web browser. This is very unlike an
           | appliance where doing so is not possible. Clearly the
           | intention is for arbitrary custom code to run on it, which
           | makes it a personal computer and not an appliance (and should
           | be regulated as such).
        
             | freedomben wrote:
             | That's a fair point, although (steel-manning) the "custom
             | code" in the browser is severely restricted/sandboxed,
             | unlike "native" code would be. So from that perspective,
             | you could maybe expand it to be like a toaster that has
             | thousands of buttons that can make for hyper-specific
             | stuff, but can't go outside of the limits the manufacturer
             | built in.
        
         | eqvinox wrote:
         | As with any Apple device -- or honestly, any computing device
         | in general -- my criteria of evaluation would be the resulting
         | performance if I install Linux on it. (If Linux is not
         | installable on the device, the performance is zero. If Linux
         | driver support is limited, causing performance issues, that is
         | also part of the equation.)
         | 
         | NB: those are _my_ criteria of evaluation. _Very personally._
         | I'm a software engineer, with a focus on systems/embedded. Your
         | criteria are yours.
         | 
         | (But maybe don't complain if you buy this for its "AI"
         | capabilities only to find out that Apple doesn't let you do
         | anything "unapproved" with it. You had sufficient chance to see
         | the warning signs.)
        
           | pbronez wrote:
           | It looks like Asahi Linux can run on Apple Silicon iPads...
           | but you have to use an exploit like checkm8 to get past the
           | locked bootloader
           | 
           | https://www.reddit.com/r/AsahiLinux/comments/ttsshm/asahi_li.
           | ..
        
         | killerstorm wrote:
         | It means you can deliver AI apps to users. E.g. generate
         | images.
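         | 
         | Roughly, an app would ship a Core ML model and ask for the
         | Neural Engine; a minimal sketch (the model name and bundling
         | here are placeholders, only Core ML itself is real):
         | 
         |     import Foundation
         |     import CoreML
         | 
         |     // Prefer the Neural Engine (plus GPU/CPU) when available.
         |     let config = MLModelConfiguration()
         |     config.computeUnits = .all
         | 
         |     // "ImageGen" is a placeholder for a compiled .mlmodelc
         |     // shipped inside the app bundle.
         |     if let url = Bundle.main.url(forResource: "ImageGen",
         |                                  withExtension: "mlmodelc") {
         |         do {
         |             let model = try MLModel(contentsOf: url,
         |                                     configuration: config)
         |             // Inputs/outputs depend on the model's own
         |             // feature descriptions.
         |             print(model.modelDescription.inputDescriptionsByName.keys)
         |         } catch {
         |             print("Failed to load model: \(error)")
         |         }
         |     }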
        
         | sergiotapia wrote:
         | You're not wrong. It's why I don't use apple hardware anymore
         | for work or play. On Android and Windows I can build and
         | install whatever I like, without having to go through mother-
         | Apple for permission.
        
         | wishfish wrote:
         | There's the potential option of Swift Playgrounds which would
         | let you write / run code directly on the iPad without any
         | involvement in the developer program.
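         | 
         | As a minimal illustration (the numbers are made up), the kind
         | of plain Swift you can type and run on-device in Playgrounds:
         | 
         |     import Foundation
         | 
         |     // Moving average over some sample readings.
         |     let readings = [12.0, 15.5, 14.2, 18.9, 17.3]
         |     let window = 3
         |     let averages = (0...(readings.count - window)).map { i in
         |         readings[i..<i+window].reduce(0, +) / Double(window)
         |     }
         |     print(averages)  // ~[13.9, 16.2, 16.8]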
        
         | philipwhiuk wrote:
         | C'mon man, it's 2024, they can't just not mention AI in a press
         | release.
        
       | fallingsquirrel wrote:
       | Is it just me or is there not a single performance chart here?
       | Their previous CPU announcements have all had perf-per-watt
       | charts, and that's conspicuously missing here. If this is an
       | improvement over previous gens, wouldn't they want to show that
       | off?
        
         | a_vanderbilt wrote:
         | Since the Intel->M1 jump, the performance gains haven't been
         | the headliners they once were, although the uplifts haven't
         | been terrible. It also lets them hide behind a more
         | impressive-sounding multiplier, which can reference something
         | specific but not necessarily applicable to broader tasks.
        
       | giancarlostoro wrote:
       | Call me crazy, but I want all that power in a 7" tablet. I like
       | 7" tablets most because they feel less clunky to carry around and
       | take with you. Same with 13" laptops: I'm willing to sacrifice
       | screen real estate to save myself the back pain of carrying a
       | 15" or larger laptop.
       | 
       | Some of this is insanely impressive. I wonder how big the OS ROM
       | (or whatever) is with all these models. For context, even if the
       | entire OS is about 15GB, getting some of these features locally
       | just for an LLM on its own is about 60GB or more for something
       | ChatGPT-esque, which requires me to spend thousands on a GPU.
       | 
       | Apologies for the many thoughts, I'm quite excited by all these
       | advancements. I always say I want AI to work offline and people
       | tell me I'm moving the goalpost, but it is truly the only way it
       | will become mainstream.
        
         | onlyrealcuzzo wrote:
         | > Call me crazy, but I want all that power in a 7" tablet
         | 
         | Aren't phones getting close to 7" now? The iPhone pro is 6.2",
         | right?
        
           | jsheard wrote:
           | Yeah, big phones have become the new small tablet.
           | 
           | https://phonesized.com/compare/#2299,156
           | 
           | Take away the bezels on the tablet and there's not a lot of
           | difference.
        
           | giancarlostoro wrote:
           | I'm not a huge fan of it, but yeah they are. I actually
           | prefer my phones to be somewhat smaller.
        
           | jwells89 wrote:
           | Biggest difference is aspect ratio. Phones are taller and
           | less pleasant to use in landscape, tablets are more square
           | and better to use in landscape.
           | 
           | You could technically make a more square phone but it
           | wouldn't be fun to hold in common positions, like up to your
           | ear for a call.
        
         | ChrisMarshallNY wrote:
         | I've been using the iPad Mini for years.
         | 
         | I'd love to see them add something to that form factor.
         | 
         | I do see _a lot_ of iPad Minis out there, but usually, as part
         | of dedicated systems (like PoS, and restaurant systems).
         | 
         | On the other hand, I have heard rumblings that Apple may
         | release an _even bigger_ phone, which I think might be overkill
         | (but what do I know. I see a lot of those monster Samsung
         | beasts, out there).
         | 
         | Not sure that is for me. I still use an iPhone 13 Mini.
         | 
         | I suspect that my next Mac will be a Studio. I guess it will be
         | an M4 Studio.
        
           | wiredfool wrote:
           | I loved my ipad mini. It's super long in the tooth now, and I
           | was hoping to replace it today. oh well...
        
             | giancarlostoro wrote:
             | I wish they would stop doing this weird release cycle where
             | some of their tablets don't get the updated chips. It's
             | really frustrating. Makes me hesitant to buy a tablet if I
             | feel like it could get an upgrade a week later or whatever.
        
               | JohnBooty wrote:
               | It certainly seems less than ideal for pro/prosumer
               | buyers who care about the chips inside.
               | 
               | I would guess that Apple doesn't love it either; one
               | suspects that the weird release cycle is at least
               | partially related to availability of chips and other
               | components.
        
               | wiredfool wrote:
               | I probably would have pulled the trigger on a price drop,
               | but at 600+ EUR for an old version, I'm just not as into
               | that, as I really expect it to last many years.
        
             | Menu_Overview wrote:
             | I was ready to buy one today, too. Disappointing.
             | 
             | I miss my old iPad mini 4. I guess I could try the 11"
             | iPad, but I think I'd prefer it to be smaller.
        
               | wiredfool wrote:
               | Yeah, We've got a full sized iPad here and it's really
               | strange to hold and use. It's all what you're used to.
        
           | giancarlostoro wrote:
           | I wanted to buy a Mini, but they had not updated its
           | processor when I was buying, and it cost way more than a
           | regular iPad at the time, and I wanted to be budget
           | conscious. I still sometimes regret not just going for the
           | Mini, but I know I'll get one sooner or later.
           | 
           | You know what's even funnier: when the Mini came out
           | originally, I made fun of it. I thought it was a dumb
           | concept. Oh, my ignorance.
        
             | ChrisMarshallNY wrote:
             | I have an iPad Pro 13", and never use it (It's a test
             | machine).
             | 
             | I use the Mini daily.
             | 
             | It's a good thing they made the Pro lighter and thinner.
             | May actually make it more useful.
        
             | ectospheno wrote:
             | I have access to multiple iPad sizes and I personally only
             | use the Mini. It's almost perfect. In the last year of its
             | long life cycle you start to feel the age of the processor,
             | but it's still better than holding the larger devices.
             | Can't wait for it to be updated again.
        
         | r0fl wrote:
         | The next iPhone Pro Max will be 6.9 inches.
         | 
         | That fits all your wants.
        
         | sulam wrote:
         | If your back is hurting from the ~1lb extra going from 13" to
         | 15", I would recommend some body weight exercises. Your back
         | will thank you, and you'll find getting older to be much less
         | painful.
         | 
         | Regarding a small iPad, isn't that the iPad mini? 8" vs 7" is
         | pretty close to what you're asking for.
        
           | teaearlgraycold wrote:
           | I _highly_ recommend doing pull-ups for your posture and
           | health. It was shocking to me how much the state of my spine
           | improved after doing pull-ups as a daily exercise.
        
         | alexpc201 wrote:
         | You can't have all that power in a 7" tablet because the
         | battery will last half an hour.
        
           | JohnBooty wrote:
           | Well, maybe. The screen (and specifically the backlight) is a
           | big drain. Smaller screen = less drain.
        
         | Foobar8568 wrote:
         | I am not a large person by any means, yet I have no problem
         | carrying a 16" MBP... But then I have a backpack and not a
         | messenger-like bag, which, I would agree, would be a pain to
         | carry.
        
         | bschmidt1 wrote:
         | > I always say I want AI to work offline
         | 
         | I'm with you, I'm most excited about this too.
         | 
         | Currently building an AI creative studio (make stories, art,
         | music, videos, etc.) that runs locally/offline
         | (https://github.com/bennyschmidt/ragdoll-studio). There is a
         | lot of focus on cloud with LLMs but I can't see how the cost
         | will make much sense for involved creative apps like video
         | creation, etc. Present day users might not have high-end
         | machines, but I think they all will pretty soon - this will
         | make them buy them the way MMORPGs made everyone buy more RAM.
         | Especially the artists and creators. Remember, Photoshop was
         | once pretty difficult to run; you needed a great machine.
         | 
         | I can imagine offline music/movies apps, offline search
         | engines, back office software, etc.
        
         | ant6n wrote:
         | I've got an iPad mini. The main issue is the screen scratches.
         | The other main issue is the screen is like a mirror, so it
         | can't be used everywhere to watch videos (which is the main
         | thing the iPad is useful for). The third main issue is that
         | videos nowadays are way too dark and you can't adjust
         | brightness/gamma on the iPad to compensate.
         | 
         | (Notice a theme?)
        
         | notatoad wrote:
         | a 7" tablet was a really cool form factor back in the day when
         | phones were 4".
         | 
         | but when 6.7" screens on phones are common, what really is the
         | point of a 7" tablet?
        
         | Aurornis wrote:
         | > Call me crazy, but I want all that power in a 7" tablet. I
         | like 7" tablets most because they feel less clunky to carry
         | around and take with you.
         | 
         | iPhone Pro Max screen size is 6.7" and the the upcoming iPhone
         | 16 Pro Max is rumored to be 6.9" with 12GB of RAM. That's your
         | 7" tablet right there.
         | 
         | The thing is - You're an extreme edge case of an edge case.
         | Furthermore, I'm guessing if Apple did roll out a 7" tablet,
         | you'd find some other thing where it isn't exactly 100%
         | perfectly meeting your desired specifications. For example,
         | Apple _is_ about to release a high-powered 6.9" tablet-like
         | device (the iPhone 16 Pro Max) but I'm guessing there's another
         | reason why it doesn't fit your needs.
         | 
         | Which is why companies like Apple ignore these niche use cases
         | and focus on mainstream demands. The niche demands always gain
         | a lot of internet chatter, but when the products come out they
         | sell very poorly.
        
       | daniel31x13 wrote:
       | Well, at the end of the day the processor is bottlenecked by the
       | OS. What real value does an iPad bring that a typical iPhone +
       | Mac combo misses? (Other than being a digital notebook...)
        
         | cooper_ganglia wrote:
         | Digital artists can get a lot of use out of it, I'd assume.
         | The Apple Pencil seems pretty nice with the iPad.
        
           | daniel31x13 wrote:
           | This. If you're anything other than a digital artist/someone
           | who genuinely prefers writing over typing, an iPad is just an
           | extra tool for you to waste your money on.
           | 
           | I had one of the earlier versions and this was pretty much
           | its only use case...
        
         | JohnBooty wrote:
         | I wound up getting a 2019 iPad Pro for 50% off, so $500 or so.
         | Thought I would use it as a work/play hybrid.
         | 
         | Surprisingly (at least to me) I feel that I've more than gotten
         | my money's worth out of it _despite_ it being almost entirely
         | a consumption device.
         | 
         | I tote it around the house so I can watch or listen to things
         | while I'm doing other things. It's also nice to keep on the
         | dining room table so I can read the news or watch something
         | while we're eating. I could do every single one of these things
         | with my laptop, but... that laptop is my _primary work tool._ I
         | don't like to carry it all over the place, exposing it to
         | spills and dust, etc.
         | 
         | The only real work-related task is serving as a secondary
         | monitor (via AirPlay) for my laptop when I travel.
         | 
         | $500 isn't pocket change, but I've gotten 48 months of
         | enjoyment and would expect at least another 24 to 36 months.
         | That's about $6 a month, or possibly more like $3-4 per month
         | if I resell it eventually.
         | 
         | Worth it for me.
        
         | hot_gril wrote:
         | My wife has a new iPad for grad school, and I'm convinced it's
         | mainly an extra category for some customers to spend more money
         | on if they already have a Mac and iPhone. The school supplied
         | it, then she spent $400+ on the keyboard and other damn dongles
         | to bring the hardware sorta up to par with a laptop, hoping to
         | replace her 2013 MBP.
         | 
         | In the end, she still has to rely on the MBP daily because
         | there's always _something_ the iPad can't do. Usually
         | something small like a website not fully working on it.
        
         | tiltowait wrote:
         | I often prefer (as in enjoy) using my iPad Pro over my 16" M1
         | MBP, but I think the only thing my iPad is actually better for
         | is drawing.
        
       | pier25 wrote:
       | This is great but why even bother with the M3?
       | 
       | The M3 Macs were released only 7 months ago.
        
         | ls612 wrote:
         | Probably they had some contractual commitments with TSMC and
         | had to use up their N3B capacity somehow. But as soon as N3E
         | became available it's a much better process overall.
        
           | a_vanderbilt wrote:
           | Ramping up production on a new die also takes time. The lower
           | volume and requirements of the M4 as used in the iPad can
           | give them time to mature the line for the Macs.
        
         | SkyPuncher wrote:
         | So far, I haven't seen any comparison between the iPad M4 and
         | the computer M3. Everything was essentially compared to the
         | last iPad chip, the M2.
         | 
         | Your laptop M3 chip is still probably more powerful than this.
         | The laptop M4 will be faster, but not groundbreakingly so.
        
       | lopkeny12ko wrote:
       | I always wonder how _constraining_ it is to design these chips
       | subject to thermal and energy limitations. I paid a lot of money
       | for my hardware and I want it to go as fast as possible. I don't
       | want my fans to be quiet, and I don't want my battery life to be
       | 30 minutes longer, if it means I get _more raw performance_ in
       | return. But instead, Apple's engineers have unilaterally decided
       | to handicap their own processors for no real good reason.
        
         | Workaccount2 wrote:
         | The overwhelming majority of people who buy these devices will
         | just use them to watch netflix and tiktok. Apple is well aware
         | of this.
        
         | boplicity wrote:
         | Why not go with a Windows based device? There are many loud and
         | low-battery life options that are very fast.
        
           | jwells89 wrote:
           | Yeah, one of my biggest frustrations as a person who likes
           | keeping around both recent-ish Mac and Windows/Linux laptops
           | is that x86 laptop manufacturers seem to have a severe
           | allergy to building laptops that are good all-rounders...
           | they always have one or multiple specs that are terrible,
           | usually heat, fan noise, and battery life.
           | 
           | Paradoxically this effect is the worst in ultraportables,
           | where the norm is to cram in CPUs that run too hot for the
           | chassis with tiny batteries, making them weirdly bad at the
           | one thing they're supposed to be good at. Portability isn't
           | just physical size and weight, but also runtime and if one
           | needs to bring cables and chargers.
           | 
           | On that note, Apple really needs to resurrect the 12" MacBook
           | with an M-series or even A-series SoC. There'd be absolutely
           | nothing remotely comparable in the x86 ultraportable market.
        
         | etchalon wrote:
         | The reason is that battery life is more important to the
         | vast majority of consumers.
        
         | jupp0r wrote:
         | Thermal load has been a major limiting design factor in high
         | end CPU design for two decades (remember Pentium 4?).
         | 
         | Apart from that, I think you might be in a minority if you want
         | a loud, hot iPad with a heavy battery to power all of this (for
         | a short time, because physics). There are plenty of Windows
         | devices that work exactly like that though if that's really
         | what makes you happy. Just don't expect great performance
         | either, because of diminishing returns of using higher power
         | and also because the chips in these devices usually suck.
        
         | shepherdjerred wrote:
         | You're in the minority
        
         | Perceval wrote:
         | Most of what Apple sells goes into mobile devices: phone,
         | tablet, laptop. In their prior incarnation, they ran up real
         | hard against the thermal limits of what they could put in their
         | laptops with the IBM PowerPC G5 chip.
         | 
         | Pure compute power has never been Apple's center of gravity
         | when selling products. The Mac Pro and the XServe are/were
         | minuscule portions of Apple's sales, and the latter product was
         | killed after a short while.
         | 
         | > Apple's engineers have unilaterally decided to handicap their
         | own processors for no real good reason
         | 
         | This is a misunderstanding of what the limiting factor is of
         | Apple products' capability. The mobile devices all have the
         | battery as the limiting factor. The processors being energy
         | efficient in compute-per-watt isn't a handicap, it's an
         | enabler. And it's a very good reason.
        
         | wiseowise wrote:
         | > I don't want my fans to be quiet, and I don't want my battery
         | life to be 30 minutes longer
         | 
         | I agree with you. I don't want fans to be quiet, I want them
         | completely gone. And with battery life too, not 30 minutes, but
         | 300 minutes. Modern chips are plenty fast, developers need to
         | optimize their shit instead of churning crapware.
        
       | pxc wrote:
       | Given that recent Apple laptops already have solid all-day
       | battery life, with such a big performance per watt improvement, I
       | wonder if they'll end up reducing how much battery any laptops
       | ship with to make them lighter.
        
         | asadotzler wrote:
         | No, because battery life isn't just about the CPU. The CPU sits
         | idle most of the time and when it's not idle, it's at workloads
         | like 20% or whatever. It's the screens that eat batteries
         | because they're on most or all of the time and sucking juice.
         | Look at Apple's docs and you'll see the battery life is the
         | exact same as the previous model. They have a battery budget
         | and if they save 10% on CPU, they give that 10% to a better
         | screen or something. They can't shrink the battery by half
         | until they make screens twice as efficient, not CPUs which
         | account for only a small fraction of power draw.
        
       | zenethian wrote:
       | This is pretty awesome. I wonder if it has a fix for the
       | GoFetch security flaw?
        
       | ionwake wrote:
       | Sorry to be a noob, but does anyone have a rough estimate of when
       | this m4 chip will be in a macbook air or macbook pro?
        
         | a_vanderbilt wrote:
         | If I had to venture a guess, maybe WWDC '24 that's coming up.
        
       | slashdev wrote:
       | I've got a Mac Pro paperweight because the motherboard went. It's
       | going to the landfill. I can't even sell it for parts because I
       | can't erase the SSD. If they didn't solder everything to the
       | board you could actually repair it. When I replace my current
       | Dell laptop, it will be with a repairable framework laptop.
        
         | stuff4ben wrote:
         | Just because you lack the skills to fix it, doesn't mean it's
         | not repairable. People desolder components all the time to fix
         | phones and ipads and laptops.
         | 
         | https://www.youtube.com/watch?v=VNKNjy3CoZ4
        
           | nicce wrote:
           | In this case, you need to find a working motherboard without
           | soldered parts to be able to fix it cost-efficiently.
           | Otherwise you need to buy a factory component (for an extra
           | price, with soldered components...)
        
             | slashdev wrote:
             | Yeah, it's not worth it
        
           | AzzyHN wrote:
             | With any other computer I could simply replace the
             | motherboard with one of several compatible motherboards, no
             | soldering or donor board needed.
        
             | kaba0 wrote:
             | It's almost like "any other computer" is not thin as a
             | finger, packed to the brim with features that require
             | miniaturization.
             | 
             | Can you just fix an F1 engine with a wrench?
        
               | Rebelgecko wrote:
               | I'm not sure which gen Mac Pro they have, but the current
               | ones aren't that much thinner than the OG cheese grater
               | Macs from 15 years ago.
               | 
               | In fact the current Gen is bigger than the trashcan ones
               | by quite a bit (although IIRC the trash can Macs had user
               | replaceable SSDs and GPUs)
        
               | mort96 wrote:
               | That stuff makes it more difficult to work on, but it
               | doesn't make it impossible for Apple to sell replacement
               | motherboards... nor does making a "thin desktop" require
               | soldering on SSDs, M.2 SSDs are plenty thin for any small
               | form factor desktop use case.
        
               | slashdev wrote:
               | They do it deliberately. They want you to throw it out
               | and buy a new one
        
               | hot_gril wrote:
               | It's not that small: https://www.cnet.com/a/img/resize/65
               | 262f62ac0f1aa5540aca7cf9...
               | 
               | I totally missed that they released a new Apple Silicon
               | Mac Pro. Turns out it has PCIe slots.
        
               | slashdev wrote:
               | My Dell laptop is much more repairable. I changed the RAM
               | and added second SSD myself.
        
               | sniggers wrote:
               | The mental gymnastics Apple fanboys will do to defend
               | being sold garbage are amazing.
        
               | its_ethan wrote:
               | The inability of Apple "haters" to appreciate that
               | optimizing a design can mean not using COTS parts is also
               | amazing...
        
           | slashdev wrote:
           | There's always some wiseass saying "skill issue"
        
         | ipqk wrote:
         | hopefully at least electronics recycling.
        
           | slashdev wrote:
           | Where do you usually take it for that?
           | 
           | If I find a place in walking distance, maybe.
        
             | kylehotchkiss wrote:
               | You could try to stick it in the phone drop-off thingy at
               | Target. That's my go-to for all non-valuable electronics.
        
               | slashdev wrote:
               | I don't have that here, but maybe there's something
               | similar
        
           | slashdev wrote:
           | Nothing close enough, I checked
        
         | kjkjadksj wrote:
         | Even "repairable" only buys you a few years of repair that
         | actually makes sense. Something similar happened to me: I lost
         | the Mac mobo on a model from before the solder addiction. Only
         | thing is, guess how much a used mobo costs for an old Mac:
         | nearly as much as the entire old Mac in working shape. Between
         | the prices of OEM parts and the depreciation of computers, it
         | makes no sense to repair once the machine hits a certain age.
        
         | kylehotchkiss wrote:
         | Why don't you take it to the Apple Store to recycle it instead
         | of dropping it in the trash can?
        
           | slashdev wrote:
           | They don't accept computers for recycling. That's what I
           | found when I looked it up
        
             | kylehotchkiss wrote:
             | They accept Apple branded computers for recycling if it has
             | no trade in value (they'll try to get you an offer if it
             | has any value). I have recycled damaged apple computers at
             | the store before without trading in.
        
             | crazygringo wrote:
             | They absolutely do. You must have looked it up wrong.
             | 
             | Here:
             | 
             | https://www.apple.com/recycling/nationalservices/
             | 
             | I've even done it before personally with an old MacBook
             | that wouldn't turn on.
        
         | acdha wrote:
         | > I can't even sell it for parts because I can't erase the SSD
         | 
         | The SSD is encrypted with a rate-limited key in the Secure
         | Enclave - unless someone has your password they're not getting
         | your data.
        
           | slashdev wrote:
           | Not worth the liability. I'd rather the landfill and peace of
           | mind than the money
        
             | crazygringo wrote:
             | But what liability?
             | 
             | That's the whole _point_ of encrypted storage. There is no
             | liability if you used a reasonable password.
             | 
             | Why not accept you _have_ peace of mind and resell on eBay
             | for parts?
             | 
             | Assuming you didn't use "password123" or something.
        
       | bigdict wrote:
       | 38 TOPS in the Neural Engine comes dangerously close to the
       | Microsoft requirement of 40 TOPS for "AI PCs".
        
       | 999900000999 wrote:
       | I'm extremely happy with my M1 iPad.
       | 
       | The only real issue is, aside from the screen eventually wearing
       | out (it already has a bit of flex), I can't imagine a reason to
       | upgrade. It's powerful enough to do anything you'd use an iPad
       | for. I primarily make music on mine; I've made full songs with
       | vocals and everything (although without any mastering - I think
       | this is possible in Logic on iPad).
       | 
       | It's really fun for quick jam sessions, but I can't imagine what
       | else I'd do with it. IO is really bad for media creation: you
       | have a single USB-C port (this bothers me the most; the moment
       | that port dies it becomes e-waste), no headphone jack...
        
         | MuffinFlavored wrote:
         | Any apps that work with MIDI controller on iPad?
         | 
         | Also, can't you just use a USB-C hub for like $10 from Amazon?
        
           | 999900000999 wrote:
           | I have more USB hubs than I can count.
           | 
           | You still only have one point of failure for the entire
           | device that can't be easily fixed.
           | 
           | And most midi controllers work fine via USB or Bluetooth
        
         | tootie wrote:
         | I have an iPad that predates M1 and it's also fine. It's a
         | media consumption device and that's about it.
        
       | onetimeuse92304 wrote:
       | As an amateur EE it is so annoying that they reuse names of
       | already existing ARM chips.
       | 
       | ARM Cortex-M4, or simply M4, is a quite popular ARM architecture.
       | I am using M0, M3 and M4 chips from ST on a daily basis.
        
         | jupp0r wrote:
         | It's not like the practice of giving marketing names to chips
         | is generally a world of logical sanity if you look at Intel
         | i5/i7/i9 etc.
        
         | zerohp wrote:
         | As a professional EE, I know that ARM Cortex-M4 is not a chip.
         | It's an embedded processor core that is put into an SoC (which
         | is a chip), such as the STM32 family from ST.
        
       | diogenescynic wrote:
       | So is the iPad mini abandoned due to the profit margins being too
       | small or what? I wish they'd just make it clear so I could
       | upgrade without worrying a mini replacement will come out right
       | after I buy something. And I don't really understand why there
       | are so many different iPads now (Air/Pro/Standard). It just feels
       | like Apple is slowly becoming like Dell... offer a bunch of SKUs
       | and barely differentiated products. I liked when Apple had fewer
       | products but they actually had a more distinct purpose.
        
         | downrightmike wrote:
         | They refresh it like every 3 years
        
       | grzeshru wrote:
       | Are these M-class chips available to be purchased on Digi-Key and
       | Mouser? Do they have data sheets and recommended circuitry? I'd
       | love to play with one just to see how difficult it is to
       | integrate compared to, say, an stm8/32 or something.
        
         | exabrial wrote:
         | Absolutely not, and even if they were, they are not documented
         | in the least and require an extraordinary amount of custom OS
         | code and other blobs to run.
        
           | grzeshru wrote:
           | Darn it. Oh well.
        
         | downrightmike wrote:
         | lol
        
           | metaltyphoon wrote:
           | Legit made me chuckle
        
         | culopatin wrote:
         | Did you really expect a yes?
        
           | grzeshru wrote:
           | I didn't know what to expect. I thought they may license it
           | to other companies under particular clauses or some such.
        
       | tibbydudeza wrote:
       | Only 4 P-cores???
        
         | a_vanderbilt wrote:
         | I'm hoping the higher efficiency gains and improved thermals
         | offset that. The efficiency cores tend to have more impact on
         | the Macs where multitasking is heavier.
        
         | antonkochubey wrote:
         | It's a tablet/ultrabook chip; are you expecting a Threadripper
         | in it?
        
       | treesciencebot wrote:
       | ~38 TOPS at fp16 is amazing, if the quoted number is fp16. (The
       | ANE is fp16 according to this [1], but that honestly seems like a
       | bad choice when people are going smaller and smaller even on
       | higher-end datacenter cards, so I'm not sure why Apple would use
       | it instead of fp8 natively.)
       | 
       | [1]: https://github.com/hollance/neural-
       | engine/blob/master/docs/1...
        
         | imtringued wrote:
         | For reference, the llama.cpp people are not going smaller. Most
         | of those models run on 32-bit floats with the dequantization
         | happening on the fly.
        
       | haunter wrote:
       | I love when Gruber is confidently wrong
       | https://daringfireball.net/linked/2024/04/28/m4-ipad-pros-gu...
        
         | alberth wrote:
         | Especially about Gurman, who he loves to hate on.
        
           | atommclain wrote:
           | Never understood the animosity, especially because it seems
           | to only go one direction.
        
             | tambourine_man wrote:
             | He spills Apple's secrets. Gruber had him on his podcast
             | once and called him a supervillain in the Apple universe,
             | or something like that. It was cringeworthy.
        
           | MBCook wrote:
           | As a longtime reader/listener I don't see him as hating
           | Gurman at all.
        
         | bombcar wrote:
         | Wasn't it relatively well known that the M3 is on an expensive
         | process and quickly getting to an M4 on a cheaper/higher yield
         | process would be worth it?
        
           | MBCook wrote:
           | Yes but Apple has never gone iPad first on a new chip either,
           | so I was with him in that I assumed it wouldn't be what they
           | would do.
           | 
           | "Let's make all our Macs look slower for a while!"
           | 
           | So I was surprised as well.
        
         | TillE wrote:
         | > or Apple's silicon game is racing far ahead of what I
         | considered possible
         | 
         | Gruber's strange assumption here is that a new number means
         | some major improvements. Apple has never really been consistent
         | about sticking to patterns in product releases.
        
           | tiffanyh wrote:
           | This is a major improvement (over the M3).
           | 
           | It's on a new fab node size.
           | 
           | It also has more CPU cores than its predecessor (the M3 with
           | 8 cores vs. the M4 with 10 cores).
        
             | edward28 wrote:
             | It's on TSMC N3E, which is slightly less dense but better
             | yielding than the previous N3B.
        
       | qwertyuiop_ wrote:
       | Does anyone know how much of this "giant leap" in performance, as
       | Apple puts it, is really useful and perceptible to end users of
       | the iPad? I am thinking of gaming and art applications. What
       | other major iPad use cases are out there that need this kind of
       | performance boost?
        
         | musictubes wrote:
         | Making music. The iPad is much better for performing than a
         | computer. There is a huge range of instruments, effects,
         | sequencers, etc. available on the iPad. Things like physical
         | modeling and chained reverb can eat up processor cycles so more
         | performance is always welcomed.
         | 
         | Both Final Cut Pro and Davinci resolve can also use as much
         | power as you can give them though it isn't clear to me why
         | you'd use an iPad instead of a Mac. They also announced a crazy
         | multicam app for iPads and iPhones that allows remote control
         | of a bunch of iPhones at the same time.
        
         | bschmidt1 wrote:
         | I imagine running LLMs and other AI models to produce a variety
         | of art, music, video, etc.
        
       | troupo wrote:
       | "The M4 is so fast, it'll probably finish your Final Cut export
       | before you accidentally switch apps and remember that that
       | cancels the export entirely. That's the amazing power performance
       | lead that Apple Silicon provides." #AppleEvent
       | 
       | https://mastodon.social/@tolmasky/112400245162436195
        
         | dlivingston wrote:
         | Ha. That really highlights what an absurd toy iPadOS is
         | compared to the beasts that are the M-series chips.
         | 
         | It's like putting a Ferrari engine inside of a Little Tikes toy
         | car. I really have no idea who the target market for this
         | device is.
        
         | LeoPanthera wrote:
         | This is a straight-up lie, yes? Switching apps doesn't cancel
         | the export.
        
           | troupo wrote:
           | Can neither confirm nor deny :) I've seen people complain
           | about this on Twitter and Mastodon though.
           | 
           | It's possible people are running into iOS limitations: it
           | _will_ kill apps when it thinks there's not enough memory.
        
       | satertek wrote:
       | Are there enough cores to allow user switching?
        
       | daft_pink wrote:
       | Who would buy a MacBook Air or mini or studio today with its
       | older chips?
        
         | rc_mob wrote:
         | people on a budget
        
         | alexpc201 wrote:
         | People with a MacBook. You use the MacBook to work and the iPad
         | to play, read, watch movies, draw, etc. Plus you can use it as
         | a second monitor for the MacBook.
        
         | antonkochubey wrote:
         | Someone who needs a MacBook Air or mini or studio, not an iPad
        
           | daft_pink wrote:
           | I'm just venting that their processor strategy doesn't make
           | much sense. The iPad gets the M4, but the Mini and Studio and
           | Mac Pro are still on M2 and the MacBooks are on M3.
           | 
           | They've essentially undercut every Mac they currently sell by
           | putting the M4 in the iPad and most people will never use
           | that kind of power in an iPad.
           | 
           | If you are going to spend $4k on a Mac don't you expect it to
           | have the latest processor?
        
             | victorbjorklund wrote:
             | People who care about having the latest probably are
             | waiting already anyway.
        
             | lotsofpulp wrote:
             | Probably 80%+ of the population can do everything they need
             | or want to do for the next 5 (maybe even 8) years on an M2
             | Air available for less than $1,500.
             | 
             | I write this on a $1,000 late 2015 Intel MacBook Air.
        
       | shepherdjerred wrote:
       | Am I wrong, or is raytracing on an iPad an _insane_ thing to
       | announce? As far as I know, raytracing is the holy grail of
       | computer graphics.
       | 
       | It's something that became viable on consumer gaming desktops
       | just a few years ago, and now we have real-time ray tracing on a
       | tablet.
        
         | vvvvvvvvvvvvv wrote:
         | iPhones with A17 already have hardware ray tracing. Few
         | applications/games support it at present.
        
         | bluescrn wrote:
         | Gotta make the loot boxes look even shinier, keep the gamblers
         | swiping those cards.
        
         | luyu_wu wrote:
         | Why would it be? They announced the same for the A17 in the
         | iPhone. Turns out it was a gimmick that caused over 11W of
         | power draw. Raytracing is a brute force approach that cannot be
         | optimized to the same level as rasterization. For now at least,
         | it is unsuitable for mobile devices. Now if we could use the RT
         | units for Blender that'd be great, but it's iPadOS...
        
         | pshc wrote:
         | It is kind of crazy to look back on. In the future we might
         | look forward to path tracing and more physically accurate
         | renderers. (Or perhaps all the lighting will be hallucinated by
         | AI...?)
        
       | alexpc201 wrote:
       | I understand that they have delayed the announcement of these
       | iPads until the M4 is ready, otherwise there is nothing
       | interesting to offer to those who have an iPad Pro M2. I don't
       | see the convenience of having a MacBook M3 and an iPad M4. If I
       | can't run Xcode on an iPad M4, the MacBook is the smartest
       | option; it has a bigger screen, more memory, and if you
       | complement it with an iPad Air, you don't miss out on anything.
        
       | dhx wrote:
       | M2's Neural Engine had 15TOPS, M3's 18TOPS (+20%) vs. M4's 38TOPS
       | (+111%).
       | 
       | In transistor counts, M2 had 20BTr, M3 25BTr (+25%) and M4 has
       | 28BTr (+12%).
       | 
       | M2 used TSMC N5P (138MTr/mm2), M3 used TSMC N3 (197MTr/mm2, +43%)
       | and M4 uses TSMC N3E (215MTr/mm2, +9%).[1][2]
       | 
       | [1]
       | https://en.wikipedia.org/wiki/5_nm_process#%225_nm%22_proces...
       | 
       | [2]
       | https://en.wikipedia.org/wiki/3_nm_process#%223_nm%22_proces...
        
         | ttul wrote:
         | An NVIDIA RTX 4090 generates 73 TFLOPS. This iPad gives you
         | nearly half that. The memory bandwidth of 120 GBps is roughly
         | 1/10th of the NVIDIA hardware, but who's counting!
        
           | kkielhofner wrote:
           | TOPS != TFLOPS
           | 
           | RTX 4090 Tensor 1,321 TOPS according to spec sheet so roughly
           | 35x.
           | 
            | RTX 4090 is 191 Tensor TFLOPS vs the M2's 5.6 TFLOPS (specs
            | for the M3 are tough to find).
           | 
           | RTX 4090 is also 1.5 years old.
        
             | imtringued wrote:
             | Yeah where are the bfloat16 numbers for the neural engine?
             | For AMD you can at least divide by four to get the real
             | number. 16 TOPS -> 4 tflops within a mobile power envelope
             | is pretty good for assisting CPU only inference on device.
             | Not so good if you want to run an inference server but that
             | wasn't the goal in the first place.
             | 
             | What irritates me the most though is people comparing a
             | mobile accelerator with an extreme high end desktop GPU.
             | Some models only run on a dual GPU stack of those. Smaller
             | GPUs are not worth the money. NPUs are primarily eating the
             | lunch of low end GPUs.
        
           | lemcoe9 wrote:
           | The 4090 costs ~$1800 and doesn't have dual OLED screens,
           | doesn't have a battery, doesn't weigh less than a pound, and
           | doesn't actually do anything unless it is plugged into a
           | larger motherboard, either.
        
             | talldayo wrote:
             | From Geekbench: https://browser.geekbench.com/opencl-
             | benchmarks
             | 
             | Apple M3: 29685
             | 
             | RTX 4090: 320220
             | 
             | When you line it up like that it's kinda surprising the
             | 4090 is _just_ $1800. They could sell it for $5,000 a pop
             | and it would still be better value than the highest end
             | Apple Silicon.
        
               | nicce wrote:
               | A bit off-topic since not applicable for iPad:
               | 
                | Also adding the M3 Max: 86072
                | 
                | I wonder what the results would be if the test were run
                | on Asahi Linux some day. Apple's OpenCL implementation
                | is fairly unoptimized AFAIK.
        
               | haswell wrote:
               | Comparing these directly like this is problematic.
               | 
               | The 4090 is highly specialized and not usable for general
               | purpose computing.
               | 
               | Whether or not it's a better value than Apple Silicon
               | will highly depend on what you intend to do with it.
               | Especially if your goal is to have a device you can put
               | in your backpack.
        
               | talldayo wrote:
               | I'm not the one making the comparison, I'm just providing
               | the compute numbers to the people who _did_. Decide for
               | yourself what that means, the only conclusion I made on
               | was compute-per-dollar.
        
             | janalsncm wrote:
             | And yet it's worth it for deep learning. I'd like to see a
             | benchmark training Resnet on an iPad.
        
           | brigade wrote:
           | It would also blow through the iPad's battery in 4 minutes
           | flat
        
           | jocaal wrote:
           | > The memory bandwidth of 120 GBps is roughly 1/10th of the
           | NVIDIA hardware, but who's counting
           | 
           | Memory bandwidth is literally the main bottleneck when it
            | comes to the types of applications GPUs are used for, so
            | everyone is counting.
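            | 
            | A rough sketch of why (illustrative numbers only,
            | assuming a 7B-parameter model with 4-bit weights,
            | and that every generated token has to stream all
            | the weights from memory once):
            | 
            |     weight_bytes = 7e9 * 0.5  # 7B params at 4-bit
            |     for name, bw in [("iPad ~120 GB/s", 120e9),
            |                      ("4090 ~1008 GB/s", 1008e9)]:
            |         print(name, round(bw / weight_bytes), "tok/s")
            |     # -> ~34 vs ~288 tokens/s upper bound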
        
         | bearjaws wrote:
         | We will have M4 laptops running 400B parameter models next
         | year. Wild times.
        
           | visarga wrote:
           | And they will fit in the 8GB RAM with 0.02 bit quant
        
             | gpm wrote:
             | You can get a macbook pro with 128 GB of memory (for nearly
             | $5000).
             | 
             | Which still implies... a 2 bit quant?
        
               | freeqaz wrote:
               | There are some crazy 1/1.5 bit quants now. If you're
               | curious I'll try to dig up the papers I was reading.
               | 
               | 1.5bit can be done to existing models. The 1 bit (and
               | less than 1 bit iirc) requires training a model from
               | scratch.
               | 
               | Still, the idea that we can have giant models running in
               | tiny amounts of RAM is not completely far fetched at this
               | point.
        
               | gpm wrote:
               | Yeah, I'm broadly aware and have seen a few of the
               | papers, though I definitely don't try and track the state
               | of the art here closely.
               | 
               | My impression and experience trying low bit quants (which
               | could easily be outdated by now) is that you are/were
               | better off with a smaller model and a less aggressive
               | quantization (provided you have access to said smaller
               | model with otherwise equally good training). If that's
               | changed I'd be interested to hear about it, but
               | definitely don't want to make work for you digging up
               | papers.
        
               | moneywoes wrote:
               | eli5 quant?
        
               | gpm wrote:
               | Quant is short for "quantization" here.
               | 
               | LLMs are parameterized by a ton of weights, when we say
               | something like 400B we mean it has 400 billion
               | parameters. In modern LLMs those parameters are basically
               | always 16 bit floating point numbers.
               | 
               | It turns out you can get nearly as good results by
               | reducing the precision of those numbers, for instance by
               | using 4 bits per parameter instead of 16, meaning each
               | parameter can only take on one of 16 possible values
               | instead of one of 65536.
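                | 
                | A minimal sketch of the idea (symmetric 4-bit
                | quantization of one weight tensor; real schemes
                | add per-group scales, outlier handling, etc.):
                | 
                |     import numpy as np
                |     w = np.random.randn(4096).astype("float16")
                |     scale = np.abs(w).max() / 7   # int4: -8..7
                |     q = np.clip(np.round(w / scale), -8, 7)
                |     w_hat = q * scale  # what inference then uses
                |     # q needs only 4 bits per weight instead of 16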
        
         | adrian_b wrote:
         | > The Most Powerful Neural Engine Ever
         | 
         | While it is true that the claimed performance for M4 is better
         | than for the current Intel Meteor Lake and AMD Hawk Point, it
         | is also significantly lower (e.g. around half) than the AI
         | performance claimed for the laptop CPU+GPU+NPU models that both
         | Intel and AMD will introduce in the second half of this year
         | (Arrow Lake and Strix Point).
        
           | whynotminot wrote:
           | > will introduce
           | 
           | Incredible that in the future there will be better chips than
           | what Apple is releasing now.
        
             | hmottestad wrote:
             | Don't worry. It's Intel we're talking about. They may say
             | that it's coming out in 6 months, but that's never stopped
             | them from releasing it in 3 years instead.
        
               | adrian_b wrote:
               | AMD is the one that has given more precise values (77
               | TOPS) for their launch, their partners are testing the
               | engineering samples and some laptop product listings seem
               | to have been already leaked, so the launch is expected
               | soon (presentation in June, commercial availability no
               | more than a few months later).
        
               | spxneo wrote:
               | I literally don't give a fck about Intel anymore they are
               | irrelevant
               | 
               | The taiwanese silicon industrial complex deserves our
               | dollars. Their workers are insanely hard working and it
               | shows in its product.
        
               | benced wrote:
               | There's no Taiwanese silicon industrial complex, there's
               | TSMC. The rest of Taiwanese fabs are irrelevant. Intel is
               | the clear #3 (and looks likely-ish to overtake Samsung?
               | We'll see).
        
             | adrian_b wrote:
             | The point is that it is a very near future, a few months
             | away.
             | 
             | Apple is also bragging very hyperbolically that the NPU
             | they introduce right now is faster than all the older NPUs.
             | 
             | So, while what Apple says, "The Most Powerful Neural Engine
             | Ever" is true now, it will be true for only a few months.
              | Apple has done a good job, so, as is normal, their NPU is
              | the fastest at launch. However, this does not deserve any
              | special praise; it is just as normal as the fact that the
              | next NPU launched by a competitor will be faster.
             | 
              | Only if the new Apple NPU had been slower than the older
              | models would that have been a newsworthy failure. A
             | newsworthy success would have been only if the new M4 would
             | have had at least a triple performance than it has, so that
             | the competitors would have needed more than a year to catch
             | up with it.
        
               | whynotminot wrote:
               | Is this the first time you're seeing marketing copy? This
               | is an entirely normal thing to do. Apple has an advantage
               | with the SoC they are releasing today, and they are going
               | to talk about it.
               | 
               | I expect we will see the same bragging from Apple's
               | competitors whenever they actually launch the chips
               | you're talking about.
               | 
               | Apple has real silicon shipping right now. What you're
               | talking about doesn't yet exist.
               | 
               | > A newsworthy success would have been only if the new M4
               | would have had at least a triple performance than it has,
               | so that the competitors would have needed more than a
               | year to catch up with it.
               | 
               | So you decide what's newsworthy now? Triple? That's so
               | arbitrary.
               | 
               | I certainly better not see you bragging about these
               | supposed chips later if they're not three times faster
               | than what Apple just released today.
        
               | adrian_b wrote:
               | I said triple, because the competitors are expected to
                | have double the speed in a few months.
               | 
               | If M4 were 3 times faster than it is, it would have
               | remained faster than Strix Point and Arrow Lake, which
               | would have been replaced only next year, giving supremacy
               | to M4 for more than a year.
               | 
                | If M4 were twice as fast, it would have continued to share
               | the first position for more than a year. As it is, it
               | will be the fastest for one quarter, after which it will
               | have only half of the top speed.
        
               | whynotminot wrote:
               | And then Apple will release M5 next year, presumably with
               | another increase in TOPS that may well top their
               | competitors. This is how product releases work.
        
               | spxneo wrote:
               | strongly doubt we will see M5 so soon
        
               | handsclean wrote:
               | I can't tell what you're criticizing. Yes, computers get
               | faster over time, and future computers will be faster
               | than the M4. If release cycles are offset by six months
               | then it makes sense that leads only last six months in a
               | neck-and-neck race. I'd assume after Arrow Lake and Strix
               | Point the lead will then go back to M5 in six months,
               | then Intel and AMD's whatever in another six, etc. I
               | guess that's disappointing if you expected a multi-year
               | leap ahead like the M1, but that's just a bad
               | expectation, it never happens and nobody predicted or
               | claimed it.
        
           | intrasight wrote:
           | > The Most Powerful Neural Engine Ever
           | 
           | that would be my brain still - at least for now ;)
        
           | spxneo wrote:
           | damn bro thanks for this
           | 
           | here i am celebrating not pulling the trigger on M2 128gb
           | yesterday
           | 
           | now im realizing M4 ain't shit
           | 
           | will wait a few more months for what you described. will
           | probably wait for AMD
           | 
           | > Given that Microsoft has defined that only processors with
           | an NPU with 45 TOPS of performance or over constitute being
           | considered an 'AI PC',
           | 
           | so already with 77 TOPS it just destroys M4. Rumoured to hit
           | the market in 2 months or less.
        
         | paulpan wrote:
         | The fact that TSMC publishes their own metrics and target goals
         | for each node makes it straightforward to compare the
         | transistor density, power efficiency, etc.
         | 
          | The most interesting aspect of the M4 is simply that it's
          | debuting on the iPad lineup, whereas historically new chips
          | have debuted on the iPhone (for A-series) and MacBook (for
          | M-series). Makes sense given the low expected yields of the
          | newest node and that this is one of Apple's lower volume
          | products.
         | 
         | For the curious, the original TSMC N3 node had a lot of issues
         | plus was very costly so makes sense to move away from it:
         | https://www.semianalysis.com/p/tsmcs-3nm-conundrum-does-it-e...
        
           | spenczar5 wrote:
           | iPads are actually much higher volume than Macs. Apple sells
           | about 2x to 3x as many tablets as laptops.
           | 
           | Of course, phones dwarf both.
        
             | andy_xor_andrew wrote:
             | The iPad Pros, though?
             | 
             | I'm very curious how much iPad Pros sell. Out of all the
             | products in Apple's lineup, the iPad Pro confuses me the
             | most. You can tell what a PM inside Apple thinks the iPad
             | Pro is for, based on the presentation: super powerful M4
             | chip! Use Final Cut Pro, or Garageband, or other desktop
             | apps on the go! Etc etc.
             | 
             | But in reality, who actually buys them, instead of an iPad
             | Air? Maybe some people with too much money who want the
             | latest gadgets? Ever since they debuted, the general
             | consensus from tech reviewers on the iPad Pro has been
             | "It's an amazing device, but no reason to buy it if you can
             | buy a MacBook or an iPad Air"
             | 
             | Apple really wants this "Pro" concept to exist for iPad
             | Pro, like someone who uses it as their daily work surface.
             | And maybe _some_ people exist like that (artists?
             | architects?) but most of the time when I see an iPad in a
             | "pro" environment (like a pilot using it for nav, or a
             | nurse using it for notes) they're using an old 2018
             | "regular" iPad.
        
               | transpute wrote:
               | iPadOS 16.3.1 can run virtual machines on M1/M2 silicon, 
                | https://old.reddit.com/r/jailbreak/comments/18m0o1h/tutorial...
               | 
               | Hypervisor support was removed from the iOS 16.4 kernel,
               | hopefully it will return in iPadOS 18 for at least some
               | approved devices.
               | 
               | If not, Microsoft/HP/Dell/Lenovo Arm laptops with
               | M3-competitive performance are launching soon, with
               | mainline Linux support.
        
               | dmitrygr wrote:
               | > Microsoft/HP/Dell/Lenovo Arm laptops with
               | M3-competitive performance are launching soon, with
               | mainline Linux support.
               | 
               | I have been seeking someone who'll be willing to put
               | money on such a claim. I'll bet the other way. Perchance
               | you're the person I seek, if you truly believe this?
        
               | transpute wrote:
               | Which part - launch timing, multicore performance or
               | mainline Linux support?
        
               | zarzavat wrote:
               | I presume the sequence of events was: some developer at
               | Apple thought it would be a great idea to port hypervisor
               | support to iPad and their manager approves it. It gets
               | all the way into the OS, then an exec gets wind of it and
               | orders its removal because it allows users to subvert the
               | App Store and Apple Rent. I doubt it's ever coming back.
               | 
               | This is everything wrong with the iPad Pro in a nutshell.
               | Fantastic hardware ruined by greed.
        
               | intrasight wrote:
               | Totally agree about "Pro". Imagine if they gave it a real
                | OS. Someone yesterday suggested dual-booting. At first I
               | dismissed that idea. But after thinking about it, I can
               | see the benefits. They could leave ipadOS alone and
               | create a bespoke OS. They certainly have the resources to
               | do so. It would open up so many new sales channels for a
               | true tablet.
        
               | kmeisthax wrote:
               | >artists? architects?
               | 
               | Ding ding ding ding ding! The iPad Pro is useful
               | _primarily_ for those people. Or at least it _was_. The
               | original selling point of the Pro was that it had[0] the
               | Apple Pencil and a larger screen to draw on. The 2021
               | upgrade gave the option to buy a tablet with 16GB of RAM,
               | which you need for Procreate as that has very strict
               | layer limits. If you look at the cost of dedicated
               | drawing tablets with screens in them, dropping a grand on
               | an iPad Pro and Pencil is surprisingly competitive.
               | 
               | As for every other use case... the fact that all these
               | apps have iPad versions now is great, _for people with
                | cheaper tablets_. The iPad Air comes in 13" now and
                | that'll satisfy all but the most demanding Procreate
                | users _anyway_, for about the same cost as the Pro had
               | back in 2016 or so. So I dunno. Maybe someone at Apple's
               | iPad division just figured they need a halo product? Or
               | maybe they want to compete with the Microsoft Surface
               | without having to offer the flexibility (and
               | corresponding jank) of a real computer? I dunno.
               | 
               | [0] sold separately, which is one of my biggest pet
               | peeves with tablets
        
             | wpm wrote:
             | iPads as a product line sure, but the M4 is only in the
             | Pros at the moment which are likely lower volume than the
             | MacBook Air.
        
       | exabrial wrote:
       | All I want is more memory bandwidth at lower latency. I've learnt
        | that's the vast majority of felt responsiveness today. I
        | couldn't care less about AI and Neural Engine party tricks,
        | stuff I might use once a day or week.
        
       | bschmidt1 wrote:
       | Bring on AI art, music, & games!
        
       | oxqbldpxo wrote:
       | All this powerful hardware on a laptop computer is like driving a
       | Ferrari at 40 mph. It is begging for better use. If apple ever
       | releases an ai robot that's going to change everything. Long ways
       | to go, but when it arrives, it will be chatgptx100.
        
       | adonese wrote:
        | Imagine a device as powerful as this new iPad, yet so useless. It
        | baffles me that we have this great hardware while the software
        | side is lacking.
        
       | tiffanyh wrote:
       | Nano-Texture
       | 
       | I really hope this comes to all Apple products soon (iPhones, all
       | iPads, etc).
       | 
        | It's some of the best anti-reflective tech I've seen that still
        | keeps colors deep and brightness high.
        
         | kylehotchkiss wrote:
         | Will be interesting to see how it holds up on devices that get
         | fingerprints and could be scratched though. Sort of wish Apple
         | would offer it as a replaceable screen film.
        
       | rnikander wrote:
       | Any hope for a new iPhone SE? My 1st gen's battery is near dead.
        
       | api wrote:
       | Looks great. Now put it in a real computer. Such a waste to be in
       | a jailed device that can't run anything.
        
       | obnauticus wrote:
       | Looks like their NPU (aka ANE) takes up about 1/3 of the die area
       | of the GPU.
       | 
       | Would be interesting to see how much they're _actually_ utilizing
       | the NPU versus their GPU for AI workloads.
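        | 
        | On macOS you can at least nudge a Core ML model toward one
        | engine or the other and time it yourself. A rough sketch (the
        | model file and input name here are placeholders):
        | 
        |     import time
        |     import numpy as np
        |     import coremltools as ct
        | 
        |     x = {"input": np.random.rand(1, 3, 224, 224).astype("float32")}
        |     for cu in (ct.ComputeUnit.CPU_AND_NE,
        |                ct.ComputeUnit.CPU_AND_GPU):
        |         m = ct.models.MLModel("Model.mlpackage",
        |                               compute_units=cu)
        |         start = time.time()
        |         m.predict(x)
        |         print(cu, time.time() - start)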
        
       | noiv wrote:
       | I got somewhat accustomed to new outrageous specs every year, but
       | reading near the end that by 2030 Apple plans to be 'carbon
       | neutral across the entire manufacturing supply chain and life
       | cycle of every product' makes me hope one day my devices are not
       | just a SUV on the data highway.
        
       | gavin_gee wrote:
       | I'm still rocking an iPad 6th generation. It's a video
       | consumption device only. A faster CPU doesn't enable any new use
       | cases.
       | 
       | The only reason is the consumer's desire to buy more.
        
         | cyberpunk wrote:
         | New oled does look like quite a nice display though...
        
           | doctor_eval wrote:
           | Yep I have an OLED TV but was watching a movie on my (M1)
           | iPad Pro last night, and realised how grey the blacks were.
           | 
           | Once you see it, etc.
        
         | nortonham wrote:
          | And I still have an original first-gen iPad Air... still
          | works for basic things. Unsupported by Apple now, but still
          | usable.
        
       | rvalue wrote:
       | No mention of battery life. They keep making stuff thin, and
       | unupgradeable. What's the point of buying an apple device that is
       | going to wear out in 5 years?
        
         | namdnay wrote:
         | To be fair every MacBook I've had has lasted 10 years minimum
        
         | TillE wrote:
         | Battery replacement costs $200. It's not like you just have to
         | throw it in the trash if the battery dies.
        
       | EugeneOZ wrote:
        | 1 TB model = EUR 2750.
       | 
       | For iPad, not an MBP laptop.
        
       | JodieBenitez wrote:
       | And here I am with my MB Air M1 with no plan to upgrade
       | whatsoever because I don't need to...
       | 
        | (yes, I understand this is about the iPad, but I guess we'll see
        | the M4 on the MB Air as well?)
        
       | NorwegianDude wrote:
       | > M4 has Apple's fastest Neural Engine ever, capable of up to 38
       | trillion operations per second, which is faster than the neural
       | processing unit of any AI PC today.
       | 
       | I always wonder what crazy meds Apple employees are on. Two RTX
       | 4090s is quite common for hobbyist use, and that is 1321 TOPS
        | each, making the pair over 69 times faster than what Apple claims
        | is the fastest in the world. The M4's performance is literally
        | less than 1% of a single H200.
       | 
       | Talk about misleading marketing...
        
         | akshayt wrote:
          | They are referring to integrated NPUs in current CPUs like the
          | Intel Core Ultra.
          | 
          | They explicitly mentioned in the event that the industry refers
          | to the neural engine as an NPU.
        
           | mort96 wrote:
           | But the word they used isn't "NPU" or "neural engine" but "AI
           | PC"??? If I build a PC with a ton of GPU power with the
           | intention of using that compute for machine learning then
           | that's an "AI PC"
        
             | fwip wrote:
             | The technicality they're operating on is that the "AI PC"
             | doesn't have a "neural processing unit."
             | 
             | > faster than the neural processing unit of any AI PC
             | today.
        
               | mort96 wrote:
               | Ah. I guess you could argue that that's technically not
               | directly false. That's an impressive level of being
               | dishonest without being technically incorrect.
               | 
               | By comparing the non-existent neural engine in your
               | typical AI PC, you could claim that the very first SoC
               | with an "NPU" is infinitely faster than the typical AI PC
        
               | sroussey wrote:
               | The phrase AI PC used by Intel and AMD is about having an
               | NPU like in the Intel Ultra chips. These are ML only
               | things, and can run without activating the GPU.
               | 
               | https://www.theverge.com/2023/12/14/23998215/intel-core-
               | ultr...
        
             | SllX wrote:
             | On paper you're absolutely correct. AI PC is marketing
             | rubbish out of Wintel. Apple's doing a direct comparison to
             | that marketing rubbish and just accepting that they'll
             | probably have to play along with it.
             | 
             | So going by the intended usage of this marketing rubbish,
             | the comparison Apple is making isn't to GPUs. It's to
              | Intel's chips that, like Apple's, integrate CPU, GPU, and
             | NPU. They just don't name drop Intel anymore when they
             | don't have to.
        
               | mort96 wrote:
               | If they literally just said that the iPad's NPU is faster
               | than the NPU of any other computer it'd be fine, I would
               | have no issue with it (though it makes you wonder, maybe
               | that wouldn't have been true? Maybe Qualcomm or Rockchip
               | have SoCs with faster NPUs, so the "fastest of any AI PC"
               | qualifier is necessary to exclude those?)
        
             | aurareturn wrote:
             | "AI PC" is what Microsoft and the industry has deemed SoCs
             | that have an NPU in it. It's not a term that Apple made up.
             | It's what the industry is using.
             | 
             | Of course, Apple has had an NPU in their SoC since the
             | first iPhone with FaceID.
        
             | numpad0 wrote:
              | Microsoft/Intel have been trying to push this "AI-enabled
              | PC" or whatever for a few months, to obsolete laptops
              | without an NPU stuffed into unused I/O die space on the
              | CPU. Apple weaponized that in this instance.
             | 
             | 1: https://www.theregister.com/2024/03/12/what_is_an_ai_pc/
        
             | stetrain wrote:
             | "AI PC" is a specific marketing term from Intel and
             | Microsoft. I don't think their specs include dual RTX
             | 4090s.
             | 
             | https://www.tomshardware.com/pc-components/cpus/intel-
             | shares...
        
               | oarth wrote:
               | An AI PC is a PC suited to be used for AI... Dual 4090 is
               | very suited for small scale AI.
               | 
               | It might be a marketing term by Microsoft, but that is
               | just dumb, and has nothing to do with what Apple says. If
                | this was in relation to Microsoft's "AI PC" then Apple
                | should have written "Slower than ANY AI PC." instead, as
                | the minimum requirement for an "AI PC" by Microsoft seems
               | to be 45 TOPS, and the M4 is too slow to qualify by the
               | Microsoft definition.
               | 
                | Are you heavily invested in Apple stock or something?
               | When a company clearly lies and tries to mislead people,
               | call them out on it, don't defend them. Companies are not
               | your friend. Wtf.
        
               | pertymcpert wrote:
               | > Are you heavily invested in Apple stock or somehting?
               | 
               | This isn't a nice thing to say.
        
           | NorwegianDude wrote:
            | The text clearly states faster than any AI PC, not that it's
            | faster than any NPU integrated into a CPU.
           | 
           | They could have written it correctly, but that sounds way
           | less impressive, so instead they make up shit to make it
           | sound very impressive.
        
             | aurareturn wrote:
             | https://www.microsoft.com/en-us/americas-partner-
             | blog/2024/0...
             | 
             | It's the term Microsoft, Intel, AMD, and Qualcomm decided
             | to rally around. No need to get upset at Apple for using
             | the same term as reference for comparison.
             | 
             | Ps. Nvidia also doesn't like the term because of precisely
             | what you said. But it's not Apple that decided to use this
             | term.
        
           | janalsncm wrote:
           | I've never heard anyone refer to an NPU before. I've heard of
           | GPU and TPU. But in any case, I don't know the right way to
           | compare Apple's hardware to a 4090.
        
         | syntaxing wrote:
         | Definitely misleading but they're talking about "AI CPU" rather
         | than GPUs. They're pretty much taking a jab at Intel.
        
         | talldayo wrote:
         | Watching this site recover after an Apple press release is like
         | watching the world leaders deliberate Dr. Strangelove's
         | suggestions.
        
         | make3 wrote:
          | (An H200 is a five-figure datacenter GPU without a display
          | port; it's not what they mean by PC, but your general point
          | still stands)
        
         | MBCook wrote:
         | That's not a neural processing unit. It's a GPU.
         | 
         | They said they had the fastest NPU in a PC. Not the fastest on
         | earth (one of the nVidia cards, probably). Not the fastest way
         | you could run something (probably a 4090 as you said). Just the
         | fastest NPU shipping in a PC. Probably consumer PC.
         | 
         | It's marketing, but it seems like a reasonable line to draw to
         | me. It's not like when companies draw a line like "fastest car
         | under $70k with under 12 cylinders but available in green from
         | the factory".
        
           | NorwegianDude wrote:
            | Of course a GPU from Nvidia is also an NPU. People are
           | spending billions each month on Nvidia, because it's a great
           | NPU.
           | 
           | The fact is that a GPU from Nvidia is a much faster NPU than
           | a CPU from Apple.
           | 
           | It is marketing as you say, but it's misleading marketing, on
           | purpose. They could have simply written "the fastest
           | integrated NPU of any CPU" instead. This is something Apple
           | often does on purpose, and people believe it.
        
             | MBCook wrote:
             | A GPU does other things. It's designed to do something
              | else. That's why we call it a _G_PU.
              | 
              | It just happens that it's good at neural stuff too.
              | 
              | There's another difference too. Apple's NPU is integrated
              | in their chip. Intel and AMD are doing the same. A 4090 is
             | not integrated into a CPU.
             | 
             | I'm somewhat guessing. Apple said NPU is the industry term,
             | honestly I'd never heard it before today. I don't know if
             | the official definition draws a distinction that would
             | exclude GPUs or not.
             | 
             | I simply think the way Apple presented things seemed
             | reasonable. When they made that claim the fact that they
             | might be comparing against a 4090 never entered my mind. If
             | they had said it was the fastest way to run neural networks
             | I would have questioned it, no doubt. But that wasn't the
             | wording they used.
        
               | sudosysgen wrote:
               | NVidia GPUs basically have an NPU, in the form of Tensor
               | units. They don't just happen to be good at matmul, they
                | have specific hardware designed to run neural networks.
               | 
               | There is no actual distinction. A GPU with Tensor
               | cores(=matmul units) really does have an NPU just as much
               | as a CPU with an NPU (=matmul units).
        
               | oarth wrote:
               | > A GPU does other things.
               | 
               | Yes, and so does the M4.
               | 
               | > It just happens to be it's good at neural stuff too.
               | 
               | No, it's no coincidence. Nvidia has been focusing on
               | neural nets, same as Apple.
               | 
               | > There's another difference too. Apple's NPU is
               | integrated in their chip.
               | 
               | The neural processing capabilities of Nvidia
                | products (Tensor Cores) are also integrated in the chip.
               | 
               | > A 4090 is not integrated into a CPU.
               | 
               | Correct, but nobody ever stated that. Apple stated that
               | M4 was faster than any AI PC today, not that it's the
               | fastest NPU integrated into a CPU. And by the way, the M4
               | is also a GPU.
               | 
               | > I don't know if the official definition draws a
               | distinction that would exclude GPUs or not.
               | 
                | An NPU can be part of a GPU, a CPU, or its own chip.
               | 
               | > If they had said it was the fastest way to run neural
               | networks I would have questioned it,
               | 
               | They said fastest NPU, neural processing unit. It's the
               | term Apple and a few others use for their AI accelerator.
                | The whole point of an AI accelerator is performance and
               | efficiency. If something does a better job at it then
               | it's a better AI accelerator.
        
               | lostmsu wrote:
                | You know the G in GPU stands for Graphics, right? So if
                | you want to play a game of words, NVidia's device
                | dedicated to something else is 30 times faster than
                | Apple's "fastest" device dedicated specifically to
                | neural processing.
        
             | dyauspitr wrote:
              | At that point you could just call a GPU a CPU. There are
              | meaningful distinctions to be made based on what the chip
              | is used for exclusively.
        
           | adrian_b wrote:
           | Both Intel's and AMD's laptop CPUs include NPUs, and they are
           | indeed slower than M4.
           | 
           | Nevertheless, Apple's bragging is a little weird, because
           | both Intel and AMD have already announced that in a few
           | months they will launch laptop CPUs with much faster NPUs
           | than Apple M4 (e.g. 77 TOPS for AMD), so Apple will hold the
           | first place for only a very short time.
        
             | MBCook wrote:
             | But do you expect them to say it's the "soon to be second
             | fastest"?
             | 
             | It's the fastest available today. And when they release
              | something faster (M4 Pro or Max or Ultra or whatever)
             | they'll call that the fastest.
             | 
             | Seems fair to me.
        
             | jjtheblunt wrote:
             | Why do you believe that? Announcements about future
              | releases by Intel and AMD are not facts yet. If they
              | deliver, then fine, but you speak as if they're already
              | factual.
        
         | phren0logy wrote:
         | I have a MacBook M2 and a PC with a 4090 ("just" one of them) -
         | the VRAM barrier is usually what gets me with the 4090 when I
         | try to run local LLMs (not train them). For a lot of things, my
         | MacBook is fast enough, and with more RAM, I can run bigger
         | models easily. And, it's portable and sips battery.
         | 
         | The marketing hype is overblown, but for many (most? almost
         | all?) people, the MacBook is a much more useful choice.
        
           | ProllyInfamous wrote:
           | Expanding on this, I have an M2Pro (mini) & a tower w/GPU...
           | but for daily driving the M2Pro idles at 15-35W whereas the
           | tower idles at 160W.
           | 
           | Under full throttle/load, even though the M2Pro is rated as
           | less-performant, it is only using 105W -- the tower/GPU are
           | >450W!
        
         | SkyPuncher wrote:
         | All of the tech specs comparisons were extremely odd. Many
         | things got compared to the M1, despite the most recent iPad
         | having the M2. Heck, one of the comparisons was to the A11 chip
         | that was introduced nearly 7 years ago.
         | 
         | I generally like Apple products, but I cannot stand the way
         | they present them. They always hide how it compares against the
         | directly previous product.
        
         | Aurornis wrote:
          | It's a marketing trick. They're talking about _NPUs_
         | specifically, which haven't really been rolled out on the PC
         | side.
         | 
         | So while they're significantly slower than even casual gaming
         | GPUs, they're technically the fastest _NPUs_ on the market.
         | 
         | It's marketing speak.
        
         | smith7018 wrote:
         | You're calling $3,600 worth of GPUs "quite common for hobbyist
         | use" and then comparing an iPad to a $40,000 AI-centric GPU.
        
           | sudosysgen wrote:
           | It's almost 70x more powerful. A 4 year old 3070 laptop was
           | cheaper when it came out and has about 200 TOPS, 7 times as
           | much. It's just factually incorrect to call it "faster than
           | any AI PC", it's far slower than a cheaper laptop from 4
           | years ago.
        
             | astrange wrote:
             | "Powerful" isn't the thing that matters for a battery-
             | powered device. Power/perf is.
        
               | sudosysgen wrote:
               | If they thought that peak performance didn't matter, they
               | wouldn't quote peak performance numbers in their
               | comparison, and yet they did. Peak performance clearly
               | matters, even in battery powered devices: many workloads
               | are bursty and latency matters then, and there are
               | workloads where you can be expected to be plugged in. In
               | fact, one such workload is generative AI which is often
               | characterized by burst usage where latency matters a lot,
               | which is exactly what these NPUs are marketed towards.
        
             | acdha wrote:
             | AI PC is a specific marketing term which Intel is using for
             | their NPU-equipped products where they're emphasizing low-
             | power AI:
             | 
             | https://www.intel.com/content/www/us/en/newsroom/news/what-
             | i...
             | 
             | In that context it seems fair to make the comparison
             | between a MacBook and the PC version which is closest on
             | perf/watt rather than absolute performance on a space
             | heater.
        
               | sudosysgen wrote:
               | Then make a comparison on perf/watt. As it is, we have no
               | way of knowing if it's better on a perf/watt basis than
               | something like an RTX4050 which is 10 times faster and
               | uses about 10 times the power.
               | 
               | The PC version of accelerated AI workloads, in 2024, is a
               | GPU with optimized matmul cores. It's the most powerful
               | and most efficient way of accelerating neural network
               | loads right now. Comparing to a suboptimal implementation
               | and making it sound like you're comparing to the industry
               | is misleading.
               | 
               | If they are referring to a specific marketing term for a
               | single company, they should do so explicitly. Otherwise,
               | it's just being misleading, because it's not even
               | including Apple's main competition, which is AMD and
               | NVidia, and using a generic sounding term.
        
         | password54321 wrote:
         | Just two 4090s? If you don't have at least 8 4090s do not even
         | call yourself a hobbyist.
        
         | numpad0 wrote:
         | Also the 38 TOPS figure is kind of odd. Intel had already shown
         | laptop CPUs with 45 TOPS NPU[1] though it hasn't shipped, and
         | Windows 12 is rumored to require 40 TOPS. If I'm doing math
         | right, (int)38 falls short of both.
         | 
         | 1: https://www.tomshardware.com/pc-components/cpus/intel-
         | says-l...
        
         | citizenpaul wrote:
          | This is standard Apple advertising: the best whatever in the
          | world that is the same as some standard thing with a different
          | name. Apple is like clothing makers that "vanity size" their
          | clothes. If you don't know, that basically means a size 20 is
          | really a 30, a size 21 is a 31, and so on.
          | 
          | Neural processing unit is basically a made-up term at this
          | point, so of course they can have the fastest in the world.
        
       | ProfessorZoom wrote:
       | Apple Pencil Pro...
       | 
       | Apple Pencil Ultra next?
       | 
       | Apple Pencil Ultra+
       | 
       | Apple Pencil Pro Ultra XDR+
        
       | tsunamifury wrote:
        | It's really saying something about how the tech sector has
        | shifted due to the recent AI wave that Apple is announcing a
        | chipset entirely apart from a product.
       | 
        | This has never happened, to my knowledge, in this company's
        | history? I could be wrong though; even the G3/G4s were launched
       | as PowerMacs.
        
       | marinhero wrote:
       | I get frustrated seeing this go into the iPad and knowing that we
       | can't get a shell, and run our own binaries there. Not even as a
       | VM like [UserLAnd](https://userland.tech). I could effectively
       | travel with one device less in my backpack but instead I have to
       | carry two M chips, two displays, batteries, and so on...
       | 
       | It's great to see this tech moving forward but it's frustrating
       | to not see it translate into a more significant impact in the
       | ways we work, travel and develop software.
        
         | bschmidt1 wrote:
         | Think the play is "consumer AI". Would you really write code on
         | an iPad? And if you do, do you use an external keyboard?
        
           | e44858 wrote:
           | Tablets are the perfect form factor for coding because you
           | can easily mount them in an ergonomic position like this:
           | https://mgsloan.com/posts/comfortable-airplane-computing/
           | 
           | Most laptops have terrible keyboards so I'd be using an
           | external one either way.
        
             | bschmidt1 wrote:
             | Those keyboards are absolutely ridiculous, sorry.
        
         | LeoPanthera wrote:
         | UTM can be built for iOS.
        
           | zamadatix wrote:
           | Hypervisor.framework is not exposed without a jailbreak which
           | makes this quite limited in terms of usability and
           | functionality.
        
       | therealmarv wrote:
       | So why should I buy any Apple Laptop with M3 chip now (if I'm not
       | in hurry)? lol
        
         | MBCook wrote:
          | That's why a lot of people weren't expecting this and even
          | questioned Mark Gurman's article saying it would happen.
        
         | _ph_ wrote:
          | If you are not in a hurry, you should almost never buy new
          | hardware, as the next generation will be around the corner. On
          | the other hand, it could be up to 12 months until the M4 is
          | available across the line. And for most tasks, an M3 is a
          | great value too. One might watch how many AI features that
          | would benefit from an M4 are presented at WWDC. But then, the
          | next macOS release won't be out before October.
        
           | therealmarv wrote:
            | The MacBook Airs with M3 were launched 2 months ago. 2
           | months is really not that long ago, even in the Apple
           | universe. For sure I'm waiting on what happens on WWDC!
        
         | wiseowise wrote:
         | They've just released MacBook Air 15 inch, new one is at least
         | a year away.
        
       | czbond wrote:
        | Any idea when the M4 will be in a Mac Pro?
        
       | asow92 wrote:
       | Why are we running these high end CPUs on tablets without the
       | ability to run pro apps like Xcode?
       | 
       | Until I can run Xcode on an iPad (not Swift Playgrounds), it's a
       | pass for me. Hear me out: I don't want to bring both an iPad and
       | Macbook on trips, but I need Xcode. Because of this, I have to
       | pick the Macbook every time. I want an iPad, but the iPad doesn't
       | want me.
        
         | elpakal wrote:
         | "It's not you, it's me" - Xcode to the iPad
        
           | asow92 wrote:
           | In all seriousness, you're right. Sandboxing Xcode but making
           | it fully featured is surely a nightmare engineering problem
           | for Apple. However, I feel like some kind of containerized
           | macOS running in the app sandbox could be possible.
        
         | al_borland wrote:
         | WWDC is a month away. I'm hoping for some iPadOS updates to let
         | people actually take advantage of the power they put in these
         | tablets. Apple has often released new hardware before showing
         | off new OS features to take advantage of it.
         | 
         | I know people have been hoping for that for a long time, so I'm
         | not holding my breath.
        
           | asow92 wrote:
           | My guess is that WWDC will be more focused on AI this year,
           | but I will remain hopeful.
        
         | smrtinsert wrote:
         | Yep I have also no use for a touch screen device of that size.
         | Happy to get an m4 mac air or whatever it will be called but
         | I'm done with pads.
        
         | kjkjadksj wrote:
         | Didn't you want to play a reskinned bejewelled or subway surfer
         | with 8k textures?
        
         | Naomarik wrote:
         | Didn't have to look long to find a comment mirroring how I feel
         | about these devices. To me it feels like they're just adding
         | power to an artificially castrated device I can barely do
         | anything with. See no reason to upgrade from my original iPad
         | Pro that's not really useful for anything. Just an overpowered
         | device running phone software.
        
           | asow92 wrote:
           | I feel the same way. I just can't justify upgrading from my
           | 10.5" Pro from years ago. It's got pro motion and runs most
           | apps fine. Sure, the battery isn't great after all these
           | years, but it's not like it's getting used long enough to
           | notice.
        
         | MuffinFlavored wrote:
         | > I don't want to bring both an iPad and Macbook on trips, but
         | I need ______
         | 
         | Why not just make the iPad run MacOS and throw iPadOS into the
         | garbage?
        
           | asow92 wrote:
           | I like some UX aspects of iPadOS, but need the functionality
           | of macOS for work.
        
             | reddalo wrote:
             | iPadOS is still mainly a fork of iOS, a glorified mobile
             | interface. They should really switch to a proper macOS
             | system, now that the specs allow for it.
        
         | kylehotchkiss wrote:
         | Visual studio code running from remote servers seemed like it
         | was making great progress right until the AI trendiness thing
         | took over... and hasn't seemed to advance much since. Hopefully
         | the AI thing cools down and the efforts on remote tooling/dev
         | environments continues onwards.
        
           | deergomoo wrote:
           | If we're going down that route then what's the point in
           | putting good hardware in the device? It might as well just be
           | a thin client. Having the same SoCs as their laptops and
           | desktops but then relegating the iPad to something that needs
           | to be chained to a "real" computer to do anything useful in
           | the development space seems like a tremendous waste of
           | potential.
        
         | timmg wrote:
         | Just wait until you buy an Apple Vision Pro...
         | 
         | [It's got the same restrictions as an iPad, but costs more than
         | a MacBookPro.]
        
           | gpm wrote:
           | This is in fact the thing that stopped me from buying an
           | Apple Vision Pro.
        
         | deergomoo wrote:
         | I've been saying this for years, I would love to get a desktop
         | Mac and use an iPad for the occasional bit of portable
         | development I do away from a desk, like when I want to noodle
         | on an idea in front of the TV.
         | 
         | I'm very happy with my MacBook, but I don't like that the mega
         | expensive machine I want to keep for 5+ years needs to be tied
         | to a limited-life lithium battery that's costly and labour
         | intensive to replace, just so I can sometimes write code in
         | other rooms in my house. I know there's numerous remote options
         | but...the iPad is right there, just lemme use it!
        
           | pjot wrote:
           | I've had success using cloud dev environments with an iPad -
           | the key for me was also using a mouse and keyboard - after
            | that, things weren't _that_ different
        
         | w1nst0nsm1th wrote:
          | I love and hate Apple like almost everyone else and have an
          | iPad for 'consumption' only (reading, browsing, video), but on
          | Android you have IDEs for game dev (Godot), a real Android app
          | IDE (through F-Droid), and Python, Java and C/C++ IDEs (through
          | the Android store) which are close enough to the Linux way...
          | 
          | So iPad devices could handle that too if Apple allowed it...
          | 
          | Once Apple complies with the European Union requirement to
          | allow 'sideloading' on the iPad, maybe we will be able to have
          | nice things on it as well.
          | 
          | That could also be a good thing for Apple itself. A lot of
          | people in Europe have a bad opinion of Apple (partly?) because
          | of the closed (walled) garden of iPad/iOS and other
          | technology/IP which set their portable devices apart from the
          | Android ecosystem.
        
         | paulcole wrote:
         | As hard as it might be to believe, software developers are not
         | the "pros" Apple is choosing to appeal to with the iPad Pro.
         | 
         | Other jobs exist!
         | 
         | People with disposable income who just want to buy the
         | nicest/most expensive thing exist!
        
       | eterevsky wrote:
       | They are talking about iPad Pro as the primary example of M4
       | devices. But iPads don't really seem to be limited by
       | performance. Nobody I know compiles Chrome or does 3D renders on
       | an iPad.
        
         | SkyPuncher wrote:
         | It's all marketing toward people who aspire to be these
          | creative types. Very few people actually need it, but it feels
         | good when the iPad Air is missing a few key features that push
         | you to the Pro.
         | 
         | More practically, it should help with battery life. My
         | understanding is energy usage scales non-linearly with demand.
         | A more powerful chip running at 10% may be more battery
          | efficient than a less powerful chip running at 20%.
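         | 
         | A toy illustration of that intuition (dynamic power is
         | roughly C*V^2*f, and voltage has to rise with clock speed,
         | so energy per unit of work climbs steeply as a smaller chip
         | gets pushed harder; the numbers below are made up):
         | 
         |     def energy(work, freq, volts, cap=1.0):
         |         power = cap * volts**2 * freq
         |         return power * (work / freq)  # joules-ish
         | 
         |     # wide chip loafing vs small chip straining
         |     print(energy(1.0, freq=1.0, volts=0.7))  # ~0.49
         |     print(energy(1.0, freq=2.0, volts=1.0))  # ~1.00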
        
       | FredPret wrote:
       | Why does a tablet have a camera bump!? Just take out the camera.
       | And let me run VSCode and a terminal.
        
       | visarga wrote:
       | LLaMA 3 tokens/second please, that's what we care about.
        
         | bschmidt1 wrote:
         | Hahah yes
        
         | mlboss wrote:
         | Only spec that I care about
        
       | amai wrote:
       | 16GB RAM ought to be enough for anybody! (Tim Cook)
        
       | bmurphy1976 wrote:
       | I love these advances and I really want a new iPad but I can't
       | stand the 10"+ form factor. When will the iPad Mini get a
       | substantial update?
        
       | radicaldreamer wrote:
       | Unfortunate that they got rid of the SIM card slot, Google Fi
       | only supports physical sims for their "data only" sim feature.
        
       | Dowwie wrote:
       | Can anyone explain where the media engine resides and runs?
        
         | wmf wrote:
         | The whole iPad is basically one chip so... the media engine is
         | in the M4. AFAIK it's a top-level core not part of the GPU but
         | Marcan could correct me.
        
       | nojvek wrote:
       | I am awaiting the day when a trillion transistors will be put on
       | a mobile device chewing 5W of peak power.
       | 
       | It's going to be a radical future.
        
       | vivzkestrel wrote:
       | any benchmarks of how it stacks up to m1, m2 and m3?
        
       | TheRealGL wrote:
       | Who wrote this? "A fourth of the power", what happened to a
       | quarter of the power?
        
       | lenerdenator wrote:
       | So long as it lets me play some of the less-intense 00's-10's era
       | PC games in some sort of virtualization framework at decent
       | framerates one day, and delivers great battery life as a backend
       | web dev workstation-on-the-go the next, it's a good chip. The M2
       | Pro does.
        
       | rsp1984 wrote:
       | _Together with next-generation ML accelerators in the CPU, the
       | high-performance GPU, and higher-bandwidth unified memory, the
       | Neural Engine makes M4 an outrageously powerful chip for AI._
       | 
       | In case it is not abundantly clear by now: Apple's AI strategy is
       | to put inference (and longer term even learning) on edge devices.
       | This is completely coherent with their privacy-first strategy
       | (which would be at odds with sending data up to the cloud for
       | processing).
       | 
       | Processing data at the edge also makes for the best possible
       | user experience because it is completely independent of network
       | connectivity and hence has minimal latency.
       | 
       | If (and that's a big if) they keep their APIs open to run any
       | kind of AI workload on their chips it's a strategy that I
       | personally really really welcome as I don't want the AI future to
       | be centralised in the hands of a few powerful cloud providers.
        
         | krunck wrote:
         | Yes, that would be great. But without the ability for us to
         | verify this, who's to say they won't use the edge resources
         | (your computer and electricity) to process data (your data)
         | and then send the results to their data center? It would
         | certainly save
         | them a lot of money.
        
           | astrange wrote:
           | You seem to be describing face recognition in Photos like
           | it's a conspiracy against you. You'd prefer the data center
           | servers looking at your data?
        
           | IggleSniggle wrote:
           | When you can do all inference at the edge, you can keep it
           | disconnected from the network if you don't trust the data
           | handling.
           | 
           | I happen to think they wouldn't, simply because sending this
           | data back to Apple in any form that they could digest it is
           | not aligned with their current privacy-first strategies. But
           | if they make a device that still works if it stays
           | disconnected, the neat thing is that you can just...keep it
           | disconnected. You don't have to trust them.
        
             | chem83 wrote:
             | Except that's an unreasonable scenario for a smartphone.
             | It doesn't prove that the minute the user goes online it
             | won't be egressing data, willingly or not.
        
               | IggleSniggle wrote:
               | I don't disagree, although when I composed my comment I
               | had desktop/laptop in mind, as I think genuinely useful
               | on-device smartphone AI is a ways off yet, and who knows
               | what company Apple will be by then.
        
             | bee_rider wrote:
             | To use a proprietary system and not trust the vendor, you
             | have to _never_ connect it. That's possible of course, but
             | it seems pretty limiting, right?
        
           | chem83 wrote:
           | +1. The idea that "it's on device, hence it's privacy-
           | preserving" is Apple's marketing machine speaking, and that
           | doesn't fly anymore. They have to do better to convince any
           | security and privacy expert worth their salt that their
           | claims and guarantees can be independently verified on behalf
           | of iOS users.
           | 
           | Google did some of that on Android, which means open-sourcing
           | their on-device TEE implementation, publishing a paper about
           | it etc.
        
           | robbomacrae wrote:
           | They already do this. It's called federated learning and it's
           | a way for them to use your data to help personalize the model
           | for you and also (to a much lesser extent) the global model
           | for everyone whilst still respecting your data privacy. It's
           | not to save money, it's so they can keep your data private on
           | device and still use ML.
           | 
           | https://www.technologyreview.com/2019/12/11/131629/apple-
           | ai-...
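           For the unfamiliar, a minimal toy sketch of the federated-
           averaging idea behind this (illustrative only; the model, data,
           and update rule here are made up and are not Apple's actual
           implementation):

               import numpy as np

               # Each device improves the model on its own local data;
               # only the resulting weights (not the data) leave the
               # device and get averaged centrally.
               def local_update(weights, local_data, lr=0.1):
                   grad = np.mean(local_data, axis=0) - weights
                   return weights + lr * grad

               global_weights = np.zeros(3)
               device_data = [np.random.randn(20, 3) + i for i in range(5)]

               for _ in range(10):
                   updates = [local_update(global_weights, d)
                              for d in device_data]
                   # The server only ever sees averaged weights.
                   global_weights = np.mean(updates, axis=0)

               print(global_weights)  # drifts toward the devices' average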
        
           | victorbjorklund wrote:
           | If you trust that Apple doesn't film you with the camera
           | when you use the phone while sitting on the toilet, why
           | wouldn't you trust Apple now?
           | 
           | It would have to be a huge conspiracy involving all of
           | Apple's employees. And you can easily just listen to the
           | network and see if they do it or not.
        
             | xanderlewis wrote:
             | I find it somewhat hard to believe that wouldn't be in
             | contravention of some law or other. Or am I wrong?
             | 
             | Of course we can then worry that companies are breaking the
             | law, but you have to draw the line somewhere... and what
             | have they to gain anyway?
        
         | joelthelion wrote:
         | > In case it is not abundantly clear by now: Apple's AI strategy
         | is to put inference (and longer term even learning)
         | 
         | I'm curious: is anyone seriously using Apple hardware to train
         | AI models at the moment? Obviously not the big players, but I
         | imagine it might be a viable option for AI engineers in
         | smaller, less ambitious companies.
        
           | andrewmcwatters wrote:
           | Yes, it can be more cost effective for smaller businesses to
           | do all their work on Mac Studios, versus having a dedicated
           | Nvidia rig plus Apple or Linux hardware for your workstation.
           | 
           | Honestly, you can train basic models just fine on M-Series
           | Max MacBook Pros.
        
             | nightski wrote:
             | A decked out Mac Studio is like $7k for far less GPU power.
             | I find that highly unlikely.
        
               | inciampati wrote:
               | But you get access to a very large amount of RAM for that
               | price.
        
               | softfalcon wrote:
               | Don't attack me, I'm not disagreeing with you that an
               | nVidia GPU is far superior at that price point.
               | 
               | I simply want to point out that these folks don't really
               | care about that. They want a Mac for more reasons than
               | "performance per watt/dollar" and if it's "good enough",
               | they'll pay that Apple tax.
               | 
               | Yes, yes, I know, it's frustrating and they could get
               | better Linux + GPU goodness with an nVidia PC running
               | Ubuntu/Arch/Debian, but macOS is painless for the average
               | science AI/ML training person to set up and work with.
               | There are also known enterprise OS management solutions
               | that business folks will happily sign off on.
               | 
               | Also, $7000 is chump change in the land of "can I get
               | this AI/ML dev to just get to work on my GPT model I'm
               | using to convince some VC's to give me $25-500 million?"
               | 
               | tldr; they're gonna buy a Mac cause it's a Mac and they
                | want a Mac and their business uses Macs. No amount of
               | "but my nVidia GPU = better" is ever going to convince
               | them otherwise as long as there is a "sort of" reasonable
               | price point inside Apple's ecosystem.
        
               | brookst wrote:
               | What Linux setup do you recommend for 128GB of GPU
               | memory?
        
               | TylerE wrote:
               | A non-decked out Mac Studio is a hell of a machine for
               | $1999.
               | 
               | Do you also compare cars by looking at only the super
               | expensive limited editions, with every single option box
               | ticked?
               | 
               | I'd also point out that said 3 year old $1999 Mac Studio
               | that I'm typing this on already runs ML models usefully,
               | maybe 40-50% of the old 3000-series Nvidia machine it
               | replaces, while using literally less than 10% of the
               | power and making a tiny tiny fraction of the noise.
               | 
               | Oh, and it was cheaper. And not running Windows.
        
               | bee_rider wrote:
               | They are talking about training models, though. "Run" is
               | a bit ambiguous; is that also what you mean?
        
               | TylerE wrote:
               | No.
               | 
               | For training the Macs do have some interesting advantages
               | due to the unified memory. The GPU cores have access to
               | all of system RAM (and also the system RAM is
               | _ridiculously_ fast - 400GB/sec when DDR4 is barely
               | 30GB/sec, which has a lot of little fringe benefits of
               | its own, part of why the Studio feels like an even more
               | powerful machine than it actually is. It's just super
               | snappy and responsive, even under heavy load.)
               | 
               | The largest consumer NVidia card has 22GB of usable RAM.
               | 
               | The $1999 Mac has 32GB, and for $400 more you get 64GB.
               | 
               | $3200 gets you 96GB, and more GPU cores. You can hit the
               | system max of 192GB for $5500 on an Ultra, albeit with
               | the lesser GPU.
               | 
               | Even the recently announced 6000-series AI-oriented
               | NVidia cards max out at 48GB.
               | 
               | My understanding is that a lot of enthusiasts are using
               | Macs for training because for certain things having more
               | RAM is just enabling.
        
               | andrewmcwatters wrote:
               | Not all of us who own small businesses are out here
               | speccing AMD Ryzen 9s and RTX 4090s for workstations.
               | 
               | You can't lug around a desktop workstation.
        
             | skohan wrote:
             | > a dedicated Nvidia rig
             | 
             | I am honestly shocked Nvidia has been allowed to maintain
             | their moat with CUDA. It seems like AMD would have a ton to
             | gain just spending a couple million a year to implement all
             | the relevant ML libraries with a non-CUDA back-end.
        
           | Q6T46nT668w6i3m wrote:
           | Yes, there are a handful of apps that use the Neural Engine
           | to fine-tune models on their data.
        
           | alfalfasprout wrote:
           | Not really (I work on AI/ML Infrastructure at a well known
           | tech company and talk regularly w/ our peer companies).
           | 
           | That said, inference on Apple products is a different story.
           | There's definitely interest in inference on the edge. So far
           | though, nearly everyone is still opting for inference in the
           | cloud for a few reasons:
           | 
           | 1. There's a lot of extra work involved in getting ML/AI
           | models ready for mobile inference. And this work is different
           | for iOS vs. Android.
           | 
           | 2. You're limited on which exact device models will run the
           | thing optimally. Most of your customers won't necessarily
           | have that. So you need some kind of fallback.
           | 
           | 3. You're limited on what kind of models you can actually
           | run. You have way more flexibility running inference in the
           | cloud.
        
             | teaearlgraycold wrote:
             | Pytorch actually has surprisingly good support for Apple
             | Silicon. Occasionally an operation needs to use CPU
             | fallback but many applications are able to run inference
             | entirely off of the CPU cores.
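             A minimal sketch of what that looks like (assuming a recent
             PyTorch build on macOS; "mps" is PyTorch's Metal-backed Apple
             Silicon device):

                 import torch

                 # Use the Apple Silicon GPU via Metal if available.
                 device = torch.device(
                     "mps" if torch.backends.mps.is_available() else "cpu")

                 model = torch.nn.Linear(512, 10).to(device)
                 x = torch.randn(32, 512, device=device)

                 with torch.no_grad():
                     out = model(x)
                 print(out.shape, out.device)

                 # Ops the MPS backend doesn't cover yet can fall back to
                 # the CPU by setting PYTORCH_ENABLE_MPS_FALLBACK=1 in the
                 # environment.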
        
               | rcarmo wrote:
               | And there is a lot of work being done with mlx.
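               A minimal sketch with MLX, Apple's array framework
               (assuming the mlx package is installed; exact APIs may
               differ between versions, and arrays live in unified memory
               so CPU and GPU see the same buffers):

                   import mlx.core as mx

                   a = mx.random.normal((1024, 1024))
                   b = mx.random.normal((1024, 1024))

                   c = a @ b   # computation is recorded lazily...
                   mx.eval(c)  # ...and materialized here, on the GPU
                   print(c.shape)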
        
               | ein0p wrote:
               | I've found it to be pretty terrible compared to CUDA,
               | especially with Huggingface transformers. There's no
               | technical reason why it has to be terrible there though.
               | Apple should fix that.
        
             | throwitaway222 wrote:
             | Inference on the edge is a lot like JS - just drop a crap
             | ton of data to the front end, and let it render.
        
             | gopher_space wrote:
             | A cloud solution I looked at a few years ago could be
             | replicated (poorly) in your browser today. In my mind the
             | question has become one of determining _when_ my model is
             | useful enough to detach from the cloud, not whether that
             | should happen.
        
             | ethbr1 wrote:
             | Power for power, any thoughts on what mobile inference
             | looks like vs doing it in the cloud?
        
           | deanishe wrote:
           | Isn't Apple hardware too expensive to make that worthwhile?
        
             | brookst wrote:
             | For business-scale model work, sure.
             | 
             | But you can get an M2 Ultra with 192GB of UMA for $6k or
             | so. It's very hard to get that much GPU memory at all, let
             | alone at that price. Of course the GPU processing power is
             | anemic compared to a DGX Station 100 cluster, but the Mac
             | is $143,000 less.
        
           | cafed00d wrote:
           | I like to think back to 2011 and paraphrase what people were
           | saying: "Is anyone seriously using gpu hardware to write nl
           | translation software at the moment?"
           | 
           | "No, we should be use cheap commodity abundantly available
           | cpus and orchestrate then behind cloud magic to write our nl
           | translation apps"
           | 
           | or maybe "no we should build purpose built high performance
           | computing hardware to write our nl translation apps"
           | 
           | Or perhaps in the early 70s "is anyone seriously considering
           | personal computer hardware to ...". "no, we should just buy
           | IBM mainframes ..."
           | 
           | I don't know. I'm probably super biased. I like the idea of
           | all this training work breaking the shackles of
           | cloud/mainframe/servers/off-end-user-device and migrating to
           | run on peoples devices. It feels "democratic".
        
           | dylan604 wrote:
           | Does one need to train an AI model on specific hardware, or
           | can a model be trained in one place and then used somewhere
           | else? Seems like Apple could just run their fine tuned model
           | called Siri on each device. Seems to me like asking for
           | training on Apple devices is missing the strategy. Unless of
           | course, it's just for purely scientific $reasons like "why
           | install Doom on the toaster?" vs doing it for a purpose.
        
             | xanderlewis wrote:
             | It doesn't _require_ specific hardware; you can train a
             | neural net with pencil and paper if you have enough time.
             | Of course, some pieces of hardware are more efficient than
             | others for this.
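             To make that concrete, a toy example: one parameter fit by
             gradient descent in plain Python, arithmetic you could do by
             hand on paper (no special hardware, just slow):

                 data = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0)]  # learn y = 2x
                 w = 0.0
                 for step in range(100):
                     # Gradient of the mean squared error with respect to w.
                     grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
                     w -= 0.1 * grad
                 print(w)  # converges to ~2.0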
        
           | robbomacrae wrote:
           | I don't think this is what you meant but it matches the spec:
           | federated learning is being used by Apple to train models for
           | various applications and some of that happens on device
           | (iPhones/iPads) with your personal data before it's hashed and
           | sent up to the mothership model anonymously.
           | 
           | https://www.technologyreview.com/2019/12/11/131629/apple-
           | ai-...
        
           | avianlyric wrote:
           | Apple are. Their "Personal Voice" feature fine tunes a voice
           | model on device using recordings of your own voice.
           | 
           | An older example is the "Hey Siri" model, which is fine tuned
           | to your specific voice.
           | 
           | But with regards to on device training, I don't think anyone
           | is seriously looking at training a model from scratch on
           | device, that doesn't make much sense. But taking models and
           | fine tuning them to specific users makes a whole ton of
           | sense, and an obvious approach to producing "personal" AI
           | assistants.
           | 
           | [1] https://support.apple.com/en-us/104993
        
         | legitster wrote:
         | > This is completely coherent with their privacy-first strategy
         | (which would be at odds with sending data up to the cloud for
         | processing).
         | 
         | I feel like people are being a bit naive here. Apple's "Privacy
         | First" strategy was a _marketing_ spin developed in response to
         | being dead-last in web development/cloud computing/smart
         | features.
         | 
         | Apple has had no problem changing their standards by 180
         | degrees and being blatantly anti-consumer whenever they have a
         | competitive advantage to do so.
        
           | seec wrote:
           | Don't bother, the fanboys think Apple can't do anything
           | wrong/malicious. At this point it's closer to a religion than
           | ever.
           | 
           | You would be amazed at the response of some of them when I
           | point out some shit Apple does that makes their products
           | clearly lacking for the price, the cognitive dissonance is so
           | strong they don't know how to react in any other way than
           | lying or pretending it doesn't matter.
        
             | acdha wrote:
             | If you're annoyed about quasi-religious behavior, consider
             | that your comment has nothing quantifiable and contributed
             | nothing to this thread other than letting us know that you
             | don't like Apple products for non-specific reasons. Maybe
             | you could try to model the better behavior you want to see?
        
             | n9 wrote:
             | Your comment is literally more subjective, dismissive, and
             | full of FUD than any other one in this thread. Check
             | yourself.
        
           | IggleSniggle wrote:
           | Of course! The difference is that, for the time being, my
           | incentives are aligned with theirs in regards to preserving
           | my privacy.
           | 
           | The future is always fungible. Anyone can break whatever
           | trust they've built _very_ quickly. But, like the post you
           | are replying to, I have no qualms about supporting companies
           | that are currently doing things in my interest and don't
           | have any clear strategic incentive to violate that trust.
           | 
           | Edit: that same incentive structure would apply to NVIDIA,
           | afaik
        
             | jajko wrote:
             | I can't agree with your comment. Apple has all the
             | incentives to monetize your data; that's the whole value of
             | Google and Meta. And they are already heading into the ad
             | business, earning billions last I checked. Hardware ain't
             | selling as much as before, and this isn't going to change
             | for the better in the foreseeable future.
             | 
             | The logic is exactly the same as what Meta claims - we will
             | pseudoanonymize your data, so technically your specific
             | privacy is just yours, see, nothing changed. But you are in
             | various target groups for ads, plus we know how 'good'
             | those anonymization efforts are when money is at play and
             | corporations are only there to earn as much money as
             | possible. The rest is PR.
        
               | IggleSniggle wrote:
               | Persuasive, thank you
        
               | legitster wrote:
               | I'll disagree with your disagreement - in part at least.
               | Apple is still bigger than Meta or Google. Even if they
               | had a strong channel to serve ads or otherwise monetize
               | data, the return would represent pennies on the dollar.
               | 
               | And Apple's privacy stance is a _moat_ against these
               | other companies making money off of their customer base.
               | So for the cost of pennies on the dollar, they protect
               | their customer base and ward off competition. That's a
               | pretty strong incentive.
        
           | robbomacrae wrote:
           | Having worked at Apple I can assure you it's not just spin.
           | It's nigh on impossible to get permission to even compare
           | your data with another service inside of Apple and even if
           | you do get permission the user ids and everything are
           | completely different so there's no way to match up users.
           | Honestly it's kind of ridiculous the lengths they go to, and
           | it makes development an absolute PITA.
        
             | briandear wrote:
             | As an Apple alum, I can agree with everything you've said.
        
             | legitster wrote:
             | That could very well be true, but I also think it could
             | change faster than people realize. Or that Apple has the
             | ability to compartmentalize (kind of like how Apple can
             | advocate for USB C adoption in some areas and fight it in
             | others).
             | 
             | I'm not saying this to trash Apple - I think it's true of
             | any corporation. If Apple starts losing revenue in 5 years
             | because their LLM isn't good enough because they don't have
             | enough data, they are still going to take it and have some
             | reason justifying why _theirs_ is privacy focused and
             | everyone else is not.
        
         | croes wrote:
         | It isn't privacy if Apple knows.
         | 
         | They are the gatekeeper of your data for their benefit not
         | yours.
        
           | jajko wrote:
           | Yes, in the end it's just some data representing the user's
           | trained model. Is there a contractual agreement with users
           | that Apple will never, ever transfer a single byte of it,
           | with huge penalties otherwise? If not, it's a pinky PR
           | promise that sounds nice.
        
             | threeseed wrote:
             | Apple publicly documents their privacy and security
             | practices.
             | 
             | At minimum, laws around the world prevent companies from
             | knowingly communicating false information to consumers.
             | 
             | And in many countries the rules around privacy are much
             | more stringent.
        
               | croes wrote:
               | I bet Boeing also has documentation about their security
               | practices.
               | 
               | Talk is cheap and in Apple's case it's part of their PR.
        
               | bamboozled wrote:
               | What is wrong with Boeing's security?
        
               | dudeinjapan wrote:
               | > What is wrong with Boeing's security?
               | 
               | Too many holes.
        
               | threeseed wrote:
               | But what does that have to do with the price of milk in
               | Turkmenistan?
               | 
               | Because Boeing's issues have nothing to do with privacy
               | or security and, since they are not consumer-facing, they
               | have no relevance to what we are talking about.
        
         | dheera wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | Apple has never been privacy-first in practice. They give you
         | the illusion of privacy but in reality it's a closed-source
         | system and you are forced to trust Apple with your data.
         | 
         | They also make it a LOT harder than Android to execute your own
         | MITM proxies to inspect what exact data is being sent about you
         | by all of your apps including the OS itself.
        
           | deadmutex wrote:
           | Yeah, given that they resisted putting RCS in iMessage for so
           | long, I am a bit skeptical about the whole privacy narrative.
           | Especially when Apple's profit is at odds with user privacy.
        
             | notaustinpowers wrote:
             | From my understanding, the reason RCS was delayed is
             | because Google's RCS was E2EE only in certain cases (both
             | users using RCS). But also because Google's RCS runs
             | through Google servers.
             | 
             | If Apple enabled RCS in messages back then, but the
             | recipient was not using RCS, then Google now has the
             | decrypted text message, even when RCS advertises itself as
             | E2EE. With iMessage, at least I know all of my messages are
             | E2EE when I see a blue bubble.
             | 
             | Even now, RCS is available on Android if using Google
             | Messages. Yes, it's pre-installed on all phones, but OEMs
             | aren't required to use it as the default. It opens up more
             | privacy concerns because now I don't know if my messages
             | are secure. At least with the green bubbles, I can assume
             | that anything I send is not encrypted. With RCS, I can't be
             | certain unless I verify the messaging app the recipient is
             | using and hope they don't replace it with something else
             | that doesn't support RCS.
        
               | vel0city wrote:
               | You know what would really help Apple customers increase
               | their privacy when communicating with non-Apple devices?
               | 
               | Having iMessage available to everyone regardless of their
               | mobile OS.
        
               | notaustinpowers wrote:
               | Agreed. While I have concerns regarding RCS, Apple's
               | refusal to make iMessage an open platform due to customer
               | lock-in is ridiculous and anti-competitive.
        
             | fabrice_d wrote:
             | How is RCS a win on the privacy front? It's not even e2e
             | encrypted in an interoperable way (Google implementation is
             | proprietary).
        
             | acdha wrote:
             | RCS is a net loss for privacy: it gives the carriers
             | visibility into your social graph and doesn't support end
             | to end encryption. Google's PR campaign tried to give the
             | impression that RCS supports E2EE but it's restricted to
             | their proprietary client.
        
           | ben_w wrote:
           | You say that like open source isn't also an illusion of
           | trust.
           | 
           | The reality is, there's too much to verify, and not enough
           | interest for the "many eyeballs make all bugs shallow"
           | argument.
           | 
           | We are, all of us, forced to trust, forced to go without the
           | genuine capacity to verify. It's not great, and the best we
           | can do is look for incentives and try to keep those aligned.
        
             | dheera wrote:
             | I don't agree with relying on the many eyeballs argument
             | for security, but from a privacy standpoint, I do think at
             | least the availability of source to MY eyeballs, as well as
             | the ability to modify, recompile, and deploy it, is better
             | than "trust me bro I'm your uncle Steve Jobs and I know
             | more about you than you but I'm a good guy".
             | 
             | If you want to, for example, compile a GPS-free version of
             | Android that appears like it has GPS but in reality just
             | sends fake coordinates to keep apps happy thinking they got
             | actual permissions, it's fairly straightforward to make
             | this edit, and you own the hardware so it's within your
             | rights to do this.
             | 
             | Open-source is only part of it; in terms of privacy, being
             | able to see what all is being sent in/out of my device is
             | arguably more important than open source. Closed source
             | would be fine if they allowed me to easily inject my own
             | root certificate for this purpose. If they aren't willing
             | to do that, including a 1-click replacement of the
             | certificates in various third-party, certificate-pinning
             | apps that are themselves potential privacy risks, it's a
             | fairly easy modification to any open source system.
             | 
             | A screen on my wall that flashes every JSON that gets sent
             | out of hardware that I own should be my right.
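             Something in that spirit can already be approximated with
             mitmproxy, assuming the device routes its traffic through the
             proxy and trusts its root certificate (the script name below
             is made up for illustration):

                 # log_requests.py -- run with: mitmdump -s log_requests.py
                 # Prints every request the proxied device sends.
                 from mitmproxy import http

                 def request(flow: http.HTTPFlow) -> None:
                     print(flow.request.method, flow.request.pretty_url)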
        
               | ben_w wrote:
               | > Open-source is only part of it; in terms of privacy,
               | being able to see what all is being sent in/out of my
               | device is is arguably more important than open source.
               | 
               | I agree; unfortunately it feels as if this ship has not
               | only sailed, but the metaphor would have to be expanded
               | to involve the port as well.
               | 
               | Is it even possible, these days, to have a functioning
               | experience with no surprise network requests? I've tried
               | to limit mine via an extensive hosts file list, but that
               | _did_ break stuff even a decade ago, and the latest
               | version of macOS doesn't seem to fully respect the hosts
               | file (weirdly it _partially_ respects it?)
               | 
               | > A screen on my wall that flashes every JSON that gets
               | sent out of hardware that I own should be my right.
               | 
               | I remember reading a tale about someone, I think it was a
               | court case or an audit, who wanted every IP packet to be
               | printed out on paper. Only backed down when the volume
               | was given in articulated lorries per hour.
               | 
               | I sympathise, but you're reminding me of that.
        
             | ajuc wrote:
             | Open source is like democracy. Imperfect and easy to fuck
             | up, but still by far the best thing available.
             | 
             | Apple is absolutism. Even the so called "enlightened"
             | absolutism is still bad compared to average democracy.
        
           | wan23 wrote:
           | > Apple has never been privacy-first in practice
           | 
           | > They also make it a LOT harder than Android to execute your
           | own MITM proxies
           | 
           | I would think ease of MITM and privacy are opposing concerns.
        
         | sergiotapia wrote:
         | > privacy-first strategy
         | 
         | That's just their way of walled-gardening Apple customers. Then
         | they can bleed devs and other companies dry without any
         | middlemen.
        
         | MyFirstSass wrote:
         | I've been saying the same thing since the ANE and the
         | incredible new chips with shared RAM: suddenly everyone could
         | run capable local models. But then Apple decided to be
         | catastrophically stingy once again, putting a ridiculous 8GB of
         | RAM in these new iPads and their new MacBook Airs, destroying
         | the prospect of a widespread "intelligent local Siri" because
         | now half the new generation can't run anything.
         | 
         | Apple is an amazing powerhouse but also disgustingly elitist
         | and wasteful, if not straight up vulgar in its profit motives.
         | There's really zero idealism there despite their romantic and
         | creative legacy.
         | 
         | There's always some straight-up idiotic limitation in their
         | otherwise incredible machines, with no other purpose than to
         | create planned obsolescence, "PRO" exclusivity, and piles of
         | e-waste.
        
         | KaiserPro wrote:
         | > This is completely coherent with their privacy-first strategy
         | (which would be at odds with sending data up to the cloud for
         | processing).
         | 
         | I mean yeah, that makes good marketing copy, but it's more due
         | to reducing latency and keeping running costs down.
         | 
         |  _but_ as this is mostly marketing fluff we'll need to
         | actually see how it performs before casting judgment on how
         | "revolutionary" it is.
        
         | lunfard000 wrote:
         | Probably because they are like super-behind in the cloud space;
         | it's not like they wouldn't like to sell the service. They
         | ignored photo privacy quite a few times in iCloud.
        
           | dylan604 wrote:
           | Is it surprising, since they've effectively given the finger
           | to data center hardware designs?
        
         | jablongo wrote:
         | So for hardware-accelerated training with something like
         | PyTorch, does anyone have a good comparison between Metal and
         | CUDA, both in terms of performance and capabilities?
        
         | s1k3s wrote:
         | For everyone else who doesn't understand what this means, he's
         | saying Apple wants you to be able to run models on their
         | devices, just like you've been doing on nvidia cards for a
         | while.
        
           | nomel wrote:
           | I think he's saying they want to make local AI a first-class,
           | _default_ capability, which is _very_ unlike buying a $1k
           | peripheral to enable it. At this point (though everyone seems
           | to be working on it), other companies need to include a
           | gaming GPU in every laptop, _and tablet_ now (lol), to enable
           | this.
        
         | andsoitis wrote:
         | > In case it is not abundantly clear by now: Apple's AI
         | strategy is to put inference (and longer term even learning) on
         | edge devices. This is completely coherent with their privacy-
         | first strategy (which would be at odds with sending data up to
         | the cloud for processing).
         | 
         | Their primary business goal is to sell hardware. Yes, they've
         | diversified into services and being a shopping mall for all,
         | but it is about selling luxury hardware.
         | 
         | The promise of privacy is one way in which they position
         | themselves, but I would not bet the bank on that being true
         | forever.
        
           | bamboozled wrote:
           | As soon as the privacy thing goes away, I'd say a major part
           | of their customer base goes away too. Most people use it over
           | Android so they don't get "hacked"; if Apple is doing the
           | hacking, I'd just buy a cheaper alternative.
        
             | Draiken wrote:
             | At least here in Brazil, I've never heard such arguments.
             | 
             | Seems even more unlikely for non technical users.
             | 
             | It's just their latest marketing campaign, as far as I can
             | tell. The vast majority of people buy iPhones because of
             | the status it gives.
        
               | everly wrote:
               | They famously had a standoff with the US gov't over the
               | Secure Enclave.
               | 
               | Marketing aside, all indications point to the iOS
               | platform being the most secure mobile option (imo).
        
               | elzbardico wrote:
               | This is a prejudiced take. Running AI tasks locally on
               | the device definitely is a giant improvement for the user
               | experience.
               | 
               | But not only that, Apple CPUs are objectively leagues
               | ahead of their competition in the mobile space. I am
               | still using an iPhone released in 2020 with absolutely no
               | appreciable slowdown or loss in perceived performance.
               | That's because even a 4-year-old iPhone still has specs
               | that don't lag much behind equivalent Android phones,
               | because I still receive the latest OS updates, and
               | because, frankly, Android OS is a mess.
               | 
               | If I cared about status, I would have changed my phone
               | already for a new one.
        
               | kernal wrote:
               | >Apple CPUs are objectively leagues ahead of their
               | competition in the mobile space
               | 
               | This is a lie. The latest Android SoCs are just as
               | powerful as the A series.
               | 
               | >Because even a 4 years old IPhone still has specs that
               | don't lag behind by much the equivalent Android phones, I
               | still receive the latest OS updates, and because frankly,
               | Android OS is mess.
               | 
               | Samsung and Google offer 7 years of OS and security
               | updates. I believe that beats the Apple policy.
        
               | martimarkov wrote:
               | Strangely, Android 14 seems to not be available for the
               | S20, which was released in 2020?
               | 
               | Or am I mistaken here?
        
               | Jtsummers wrote:
               | > Samsung and Google offer 7 years of OS and security
               | updates. I believe that beats the Apple policy.
               | 
               | On the second part:
               | 
               | https://en.wikipedia.org/wiki/IPadOS_version_history
               | 
               | The last iPads to stop getting OS updates (including
               | security, to be consistent with what Samsung and Google
               | are pledging) got 7 and 9 years of updates each (5th gen
               | iPad and 1st gen iPad Pro). The last iPhones to lose
               | support got about 7 years each (iPhone 8 and X). 6S, SE
               | (1st), and 7 got 9 and 8 years of OS support with
               | security updates. The 5S (released in 2013) last got a
               | security update in early 2023, so also about 9 years, the
               | 6 (2014) ended at the same time so let's call it 8 years.
               | The 4S, 2011, got 8 years of OS support. 5 and 5C got 7
               | and 6 years of support (5C was 5 in a new case, so was
               | always going to get a year less in support).
               | 
               | Apple has not, that I've seen at least, ever established
               | a long term support policy on iPhones and iPads, but the
               | numbers show they're doing at least as well as what
               | Samsung and Google are _promising_ to do, but have not
               | yet done. And they've been doing this for more than a
               | decade now.
               | 
               | EDIT:
               | 
               | Reworked the iOS numbers a bit, down to the month (I was
               | looking at years above and rounding, so this is more
               | accurate). iOS support time by device for devices that
               | cannot use the current iOS 17 (so the XS and above are
               | not counted here), in months:
               | 
               |   1st - 32
               |   3G  - 37
               |   3GS - 56
               |   4   - 48
               |   4S  - 93
               |   5   - 81
               |   5C  - 69
               |   5S  - 112
               |   6   - 100
               |   6S  - 102
               |   SE  - 96
               |   7   - 90
               |   8   - 78
               |   X   - 76
               | 
               | The average is 72.5 months, just over 6 years. If we
               | knock out the first 2 phones (both have somewhat
               | justifiable short support periods, massive hardware
               | changes between each and their successor) the average
               | jumps to just shy of 79 months, or about 6.5 years.
               | 
               | The 8 and X look like regressions, but their last updates
               | were just 2 months ago (March 21, 2024) so still a good
               | chance their support period will increase and exceed the
               | 7 year mark like every model since the 5S. We'll have to
               | see if they get any more updates in November 2024 or
               | later to see if they can hit the 7 year mark.
        
               | kernal wrote:
               | >The last iPads to stop getting OS updates (including
               | security, to be consistent with what Samsung and Google
               | are pledging) got 7 and 9 years of updates each (5th gen
               | iPad and 1st gen iPad Pro). The last iPhones to lose
               | support got about 7 years each (iPhone 8 and X). 6S, SE
               | (1st), and 7 got 9 and 8 years of OS support with
               | security updates. The 5S (released in 2013) last got a
               | security update in early 2023, so also about 9 years, the
               | 6 (2014) ended at the same time so let's call it 8 years.
               | The 4S, 2011, got 8 years of OS support. 5 and 5C got 7
               | and 6 years of support (5C was 5 in a new case, so was
               | always going to get a year less in support).
               | 
               | These are very disingenuous numbers that don't tell the
               | complete story. An iPhone 7 getting a single critical
               | security patch does not take into account the hundreds of
               | security patches it did not receive when it stopped
               | receiving support. It received that special update
               | because Apple likely was told or discovered it was being
               | exploited in the wild.
               | 
               | Google and Samsung now offer 7 years of OS upgrades and
               | 84 months of full security patches. Selectively patching
               | a phone that is out of the support window with a single
               | security patch does not automatically increase its EOL
               | support date.
        
               | Jtsummers wrote:
               | They made that pledge for the Pixel 8 (2023). Let's
               | revisit this in 2030 and see what the nature of their
               | support is at that point and how it compares to Apple's
               | support for iPhone devices. We can't make a real
               | comparison since they haven't done anything yet, only
               | made promises.
               | 
               | What we can do _today_ is note that Apple never made a
               | promise, but did provide very long security support for
               | their devices despite that. They've already met or come
               | close to the Samsung/Google pledge (for one device) on
               | almost half their devices, and those are all the recent
               | ones (so it's not a downward trend of good support then
               | bad support, but rather mediocre/bad support to improving
               | and increasingly good support).
               | 
               | Another fun one:
               | 
               | iPhone XS was released in September 2018, it is on the
               | current iOS 17 release. In the absolute worst case of it
               | losing iOS 18 support in September, it will have received
               | 6 full years of support in both security and OS updates.
               | It'll still hit 7 years (comfortably) of security
               | updates. If it does get iOS 18 support in September, then
               | Apple will hit the Samsung/Google pledge 5 years before
               | Samsung/Google can even demonstrate their ability to
               | follow through (Samsung has a chance, but Google has no
               | history of commitment).
               | 
               | I have time to kill before training for a century ride:
               | 
               | Let's ignore everything before iPhone 4S, they had short
               | support periods that's just a fact and hardly worth
               | investigating. This is an analysis of devices released in
               | 2011 and later, when the phones had, mostly, matured as a
               | device so we should be expecting longer support periods.
               | These are the support periods when the phones were able
               | to run the still-current iOS versions, not counting later
               | security updates or minor updates but after the major iOS
               | version had been deprecated. As an example, for the
               | iPhone 4S it had support from 2011-2016. In 2016 its OS,
               | iOS 9, was replaced by iOS 10. Here are the numbers:
               | 
               |   4S      - 5 years
               |   5       - 5 years
               |   5C      - 4 years (decreased; 5 hardware but released a
               |             year later in a different case)
               |   5S      - 6 years
               |   6       - 5 years (decreased, not sure why)
               |   6S      - 7 years (hey, Apple did it! 2015 release,
               |             lost iOS upgrades in 2022)
               |   SE(1st) - 5 years (like 5C, 6S hardware but released
               |             later)
               |   7       - 6 years (decreased over 6S, not sure why)
               |   8       - 6 years
               |   X       - 6 years
               | 
               | The 6S is a bit of an outlier, hitting 7 years of full
               | support running the current iOS. 5C and SE(1st) both got
               | less total support, but their internals were the same as
               | prior phones and they lost support at the same time as
               | them (this is reasonable, if annoying, and does drag down
               | the average). So Apple has clearly trended towards 6
               | years of full support, the XS (as noted above) will get
               | at least 6 years of support as of this coming September.
               | We'll have to see if they can get it past the 7 year
               | mark, I know they haven't promised anything but the trend
               | suggests they can.
        
               | fl0ki wrote:
               | I look forward to these vendors delivering on their
               | promises, and I look forward to Apple perhaps formalizing
               | a promise with less variability for future products.
               | 
               | Neither of these hopes retroactively invalidates the fact
               | that Apple has had a much better track record of
               | supporting old phone models up to this point. Even if you
               | do split hairs about the level of patching some models
               | got in their later years, they still got full iOS updates
               | for years longer than most Android phones got any patches
               | at all, regardless of severity.
               | 
               | This is not an argument that somehow puts Android on top,
               | at best it adds nuance to just how _much_ better iOS
               | support has been up to this point.
               | 
               | Let's also not forget that if Apple wasn't putting this
               | kind of pressure on Google, they wouldn't have even made
               | the promise to begin with, because it's clear how long
               | they actually care to support products with no outside
               | pressure.
        
               | patall wrote:
               | > I am still using a IPhone released in 2020 with
               | absolutely no appreciable slow down or losses in
               | perceived performance.
               | 
               | My Pixel 4a here is also going strong, only the battery
               | is slowly getting worse. I mean, it's 2024, do phones
               | really still get slow? The 4a is now past android
               | updates, but that was promised after 3 years. But at 350
               | bucks, it was like 40% less than the cheapest iPhone mini
               | at that time.
        
               | onemoresoop wrote:
               | > I mean, it's 2024, do phones really still get slow?
               | 
               | Hardware is pretty beefed up but bloat keeps on growing,
               | that is slowing things down considerably.
        
               | moneywoes wrote:
               | what about security updates?
        
               | tick_tock_tick wrote:
               | > I am still using a IPhone released in 2020 with
               | absolutely no appreciable slow down or losses in
               | perceived performance.
               | 
               | Only because Apple lost a lawsuit; otherwise they'd have
               | kept intentionally slowing it down.
        
               | dijit wrote:
               | I never understood this argument.
               | 
               | Theres no "status" to a brand of phone when the cheapest
               | point of entry is comparable and the flagship is cheaper
               | than the alternative flagship.
               | 
               | Marketing in most of europe is chiefly not the same as
               | the US though so maybe its a perspective thing.
               | 
               | I just find it hard to really argue "status" when the
               | last 4 iPhone generations are largely the same and
               | cheaper than the Samsung flagships.
               | 
               | At Elgiganten a Samsung S24 Ultra is 19,490 SEK[0].
               | 
               | The most expensive iPhone 15 pro max is 18,784 SEK at the
               | same store[1].
               | 
               | [0]: https://nya.elgiganten.se/product/mobiler-tablets-
               | smartklock...
               | 
               | [1]: https://nya.elgiganten.se/product/mobiler-tablets-
               | smartklock...
        
               | pompino wrote:
               | It's not an argument; just ask why people lust after the
               | latest iPhones in poor countries. They do it because they
               | see rich people owning them. Unless you experience that,
               | you won't really understand it.
        
               | Draiken wrote:
               | My take is that it's like a fashion accessory. People buy
               | Gucci for the brand, not the material or comfort.
               | 
               | Rich people ask for the latest most expensive iPhone even
               | if they're only going to use WhatsApp and Instagram on
               | it. It's not because of privacy or functionality, it's
               | simply to show off to everyone they can purchase it. Also
               | to not stand out within their peers as the only one
               | without it.
               | 
               | As another comment said: it's not an argument, it's a
               | fact here.
        
               | Aerbil313 wrote:
               | I have an iPhone so I guess I qualify as a rich person by
               | your definition. I am also a software engineer. I cannot
               | state enough how bogus that statement is. I've used both
               | iPhone and Android, and recent flagships. iPhone is by
               | far the easiest one to use. Speaking in more objective
               | terms, iPhones have a coherent UI which maintains its
               | consistency both throughout the OS and over the years.
               | They're the most dumbed down phones and easiest to
               | understand. I recommend iPhone to all my friends and
               | relatives.
               | 
               | There's obviously tons of people who see iPhone as a
               | status item. They're right, because iPhone is expensive
               | and only the rich can buy them. This doesn't mean iPhone
               | is not the best option out there for a person who doesn't
               | want to extensively customize his phone and just use it.
        
               | ClumsyPilot wrote:
               | > iPhone and Android, and recent flagships. iPhone is by
               | far the easiest one to use. Speaking in more objective
               | terms, iPhones have a coherent UI
               | 
               | It's not about whether you've used Android, it's about
               | whether you've been poor-ish or stingy.
               | 
               | To some people those are luxuries - the most expensive
               | phone they buy is a mid-range Motorola for $300 with a
               | Snapdragon 750G or whatever. They run all the same apps
               | after all, and they take photos.
               | 
               | iPhones are simply outside of their budget.
        
               | hindsightbias wrote:
               | It's fashion and the kids are hip. But there is an
               | endless void of Apple haters here who want to see it
               | burn. They have nothing in common with 99.9% of the
               | customer base.
        
               | ClumsyPilot wrote:
               | I was thinking about this for a while: the problem is
               | not about Apple, it's the fact that the rest of the
               | industry is gutless and has zero vision or leadership.
               | Whatever
               | Apple does, the rest of the industry will follow or
               | oppose - but will be defined by it.
               | 
               | It's like how people who don't like US and want nothing
               | to do with US still discuss US politics, because it has
               | so much effect everywhere.
               | 
               | (Ironically, not enough people discuss China at any
               | coherent level of understanding.)
        
               | briandear wrote:
               | The vast majority of people don't. They buy because the
               | ecosystem works. Not sure how I get status from a phone
               | that nobody knows I have. I don't wear it on a chain.
        
               | Draiken wrote:
               | Could it possibly be different in Brazil?
               | 
               | iPhones are not ubiquitous here, and they're way more
               | expensive than other options.
        
             | jamesmontalvo3 wrote:
             | Maybe true for a lot of the HN population, but my teenagers
             | are mortified by the idea of me giving them android phones
             | because then they would be the pariahs turning group
             | messages from blue to green.
        
               | WheatMillington wrote:
               | This is a sad state of affairs.
        
               | adamomada wrote:
               | Interesting that some people would take that as an Apple
               | problem and others would take it as a Google problem
               | 
               | Who's at fault for not having built-in messaging that
               | works with rich text, photos, videos, etc?
               | 
               | Google has abandoned more messaging products than I can
               | remember while Apple focused on literally the main
               | function of a phone in the 21st century. And they get
               | shit for it
        
               | simonh wrote:
               | I'm in Europe and everyone uses WhatsApp, and while
               | Android does have a higher share over here, iPhones
               | still dominate the younger demographics. I'm not denying
               | blue/green is a factor in the US but it's not even a
               | thing here. It's nowhere near the only or even a dominant
               | reason iPhones are successful with young people.
        
               | adamc wrote:
               | Snobbery is an expensive pastime.
        
               | lolinder wrote:
               | And just to elaborate on this: it's not just snobbery
               | about the color of the texts, for people who rely on
               | iMessage as their primary communication platform it
               | really is a severely degraded experience texting with
               | someone who uses Android. We Android users have long
               | since adapted to it by just avoiding SMS/MMS in favor of
               | other platforms, but iPhone users are accustomed to just
               | being able to send a video in iMessage and have it be
               | decent quality when viewed.
               | 
               | Source: I'm an Android user with a lot of iPhones on my
               | in-laws side.
        
             | littlestymaar wrote:
             | Apple only pivoted into the "privacy" branding relatively
             | recently [1] and I don't think that many people came for
             | that reason alone. In any case, most are now trapped in
             | the walled garden and the effort to escape is likely too
             | big. And there's no escape anyway, since Google will
             | always make Android worse in that regard...
             | 
             | [1] in 2013 they even marketed their "iBeacon" technology
             | as a way for retail stores to monitor and track their
             | customers, which...
        
               | adamomada wrote:
               | Ca 2013 was the release of the Nexus 5, arguably the
               | first really usable android smartphone.
               | 
               | Privacy wasn't really a concern because most people
               | didn't have the privacy-eroding device yet. The years
               | following the Nexus 5 were when smartphones went into
               | geometric growth and the slow realization of the privacy
               | nightmare became apparent.
               | 
               | Imho I was really excited to get a Nexus 4 at the time,
               | just a few short years later the shine wore off and I was
               | horrified at the smartphone enabled future. And I have a
               | 40 year background in computers and understand them
               | better than 99 out of 100 users - if I didn't see it, I
               | can't blame them either
        
               | mkl wrote:
               | > Ca 2013 was the release of the Nexus 5, arguably the
               | first really usable android smartphone.
               | 
               | What a strange statement. I was late to the game with a
               | Nexus S in 2010, and it was really usable.
        
               | adamomada wrote:
               | Define usable. Imho before the Nexus 4 everything was
               | crap, the Nexus 4 was barely enough (4x1.4 GHz), and the
               | Nexus 5 (4x2.2 GHz) plus the software at the time
               | (post-KitKat) was when it was really ready for the
               | mainstream.
        
             | moneywoes wrote:
             | is that still the case?
        
             | tick_tock_tick wrote:
             | I'd say from my experience the average Apple user cares
             | less about privacy than the general public. It's a status
             | symbol first and foremost; 99% of what people do on their
             | phones is basically identical on both platforms at this
             | point.
        
           | serial_dev wrote:
           | It doesn't need to stay true forever.
           | 
           | The alternative is Google / Android devices and OpenAI
           | wrapper apps, both of which usually offer a half baked UI,
           | poor privacy practices, and a completely broken UX when the
           | internet connection isn't perfect.
           | 
           | Pair this with the completely subpar Android apps, Google
           | dropping support for an app about once a month, and suddenly
           | I'm okay with the lesser of two evils.
           | 
           | I know they aren't running a charity, I even hypothesized
           | that Apple just can't build good services so they pivoted to
           | focusing on this fake "privacy" angle. In the end, iPhones
           | are likely going to be better for edge AI than whatever is
           | out there, so I'm looking forward to this.
        
             | jocaal wrote:
             | > better for edge AI than whatever is out there, so I'm
             | looking forward to this
             | 
             | What exactly are you expecting? The current hype for AI is
             | large language models. The word 'large' has a certain
             | meaning in that context: much larger than can fit on your
             | phone. Everyone is going crazy about edge AI, so what am I
             | missing?
        
               | jchanimal wrote:
               | It fits on your phone, and your phone can offload battery
               | burning tasks to nearby edge servers. Seems like the path
               | consumer-facing AI will take.
        
               | jitl wrote:
               | Quantized LLMs can run on a phone, like Gemini Nano or
               | OpenLLaMA 3B. If a small local model can handle simple
               | stuff and delegate harder tasks to a model in the data
               | center, then with better connectivity you could get an
               | even better experience.
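               | 
               | A minimal sketch of that kind of routing in Python;
               | both model calls are made-up stand-ins, not any real
               | Apple or Google API:
               | 
               |     # hypothetical stand-ins for both models
               |     def local_generate(p):
               |         return "local: " + p    # on-device
               | 
               |     def cloud_generate(p):
               |         return "cloud: " + p    # data center
               | 
               |     def is_hard(p):
               |         # cheap heuristic; a tiny classifier
               |         # could sit here instead
               |         return len(p.split()) > 50
               | 
               |     def answer(p, online=True):
               |         if online and is_hard(p):
               |             return cloud_generate(p)
               |         return local_generate(p)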
        
               | SmellTheGlove wrote:
               | > If a small local model can handle simple stuff and
               | delegate to a model in the data center for harder tasks
               | and with better connectivity you could get an even better
               | experience.
               | 
               | Distributed mixture of experts sounds like an idea. Is
               | anyone doing that?
        
               | cheschire wrote:
               | Sounds like an attack vector waiting to happen if you
               | deploy enough competing expert devices into a crowd.
               | 
               | I'm imagining a lot of these LLM products on phones will
               | be used for live translation. Imagine a large crowd event
               | of folks utilizing live AI translation services being
               | told completely false translations because an actor
               | deployed a 51% attack.
        
               | jagger27 wrote:
               | I'm not particularly scared of a 51% attack between the
               | devices attached to my Apple ID. If my iPhone splits
               | inference work with my idle MacBook, Apple TV, and iPad,
               | what's the problem there?
        
               | moneywoes wrote:
               | what about in situations with no bandwidth?
        
               | callalex wrote:
               | In the hardware world, last year's large has a way of
               | becoming next year's small. For a particularly funny
               | example of this, check out the various letter soup names
               | that people keep applying to screen resolutions. https://
               | en.m.wikipedia.org/wiki/Display_resolution_standards...
        
               | gopher_space wrote:
               | > Everyone is going crazy about edge AI, what am I
               | missing?
               | 
               | If you clone a model and then bake in a more expensive
               | model's correct/appropriate responses to your queries,
               | you now have the functionality of the expensive model in
               | your clone. For your specific use case.
               | 
               | The resulting case-specific models are small
               | enough to run on all kinds of hardware, so everyone's
               | seeing how much work can be done on their laptop right
               | now. One incentive for doing so is that your approaches
               | to problems are constrained by the cost and security of
               | the Q&A roundtrip.
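               | 
               | A rough sketch of that clone-and-bake loop in Python;
               | the teacher call is a hypothetical stand-in for the
               | expensive model, and the fine-tune step is left out:
               | 
               |     # stand-in for the expensive model's API
               |     def teacher(prompt):
               |         return "expensive answer to: " + prompt
               | 
               |     def build_dataset(prompts):
               |         # one paid round trip per prompt; after
               |         # this, no more Q&A round trips needed
               |         return [(p, teacher(p)) for p in prompts]
               | 
               |     pairs = build_dataset([
               |         "summarize my notes",
               |         "draft a reply to this email",
               |     ])
               |     # fine-tune a small open model on `pairs`
               |     # (not shown), then run that student model
               |     # locally for this narrow use case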
        
             | kernal wrote:
             | >subpar Android apps
             | 
             | Care to cite these subpar Android apps? The app store is
             | filled to the brim with subpar and garbage apps.
             | 
             | >Google dropping support for an app about once a month
             | 
             | I mean if you're going to lie why not go bigger
             | 
             | >I'm okay with the lesser of two evils.
             | 
             | So the more evil company is the one that pulled out of
             | China because they refused to hand over their users' data to
             | the Chinese government on a fiber-optic silver platter?
        
               | martimarkov wrote:
               | Google operates in China albeit via their HK domain.
               | 
               | They also had project DragonFly if you remember.
               | 
               | The lesser of two evils is that one company doesn't try
               | to actively profile me (in order for their ads business
               | to be better) with every piece of data it can find and
               | forces me to share all possible data with them.
               | 
               | Google is famously known to kill apps that are good and
               | used by customers: https://killedbygoogle.com/
               | 
               | As for the subpar apps: there is a massive difference
               | in network traffic between iOS and Android even when
               | just sitting on the Home Screen.
        
               | kernal wrote:
               | >Google operates in China albeit via their HK domain.
               | 
               | The Chinese government has access to the iCloud account
               | of every Chinese Apple user.
               | 
               | >They also had project DragonFly if you remember.
               | 
               | Which never materialized.
               | 
               | >The lesser of two evils is that one company doesn't try
               | to actively profile me (in order for their ads business
               | to be better) with every piece of data it can find and
               | forces me to share all possible data with them.
               | 
               | Apple does targeted and non targeted advertising as well.
               | Additionally, your carrier has likely sold all of the
               | data they have on you. Apple was also sued for selling
               | user data to ad networks. Odd for a Privacy First company
               | to engage in things like that.
               | 
               | >Google is famously known to kill apps that are good and
               | used by customers: https://killedbygoogle.com/
               | 
               | Google has been around for 26 years I believe. According
               | to that link 60 apps were killed in that timeframe.
               | According to your statement that Google kills an app a
               | month that would leave you 252 apps short. Furthermore,
               | the numbers would indicate that Google has killed 2.3
               | apps per year or .192 apps per month.
               | 
               | >As for the subpar apps: there is a massive difference
               | between the network traffic when on the Home Screen
               | between iOS and Android.
               | 
               | Not sure how that has anything to do with app quality,
               | but if network traffic is your concern there's probably a
               | lot more an Android user can do than an iOS user to
               | control or eliminate the traffic.
        
             | rfoo wrote:
             | > The alternative is Google / Android devices
             | 
             | No, the alternative is Android devices with everything
             | except firmware built from source and signed by myself. And
             | at the same time, being secure, too.
             | 
             | You just can't have this on Apple devices. On the Android
             | side choices are limited too; I don't like Google and
             | especially their disastrous hardware design, but their
             | Pixel line is the most approachable one able to do all of
             | this.
             | 
             | Heck, you can't even build your own app for your own iPhone
             | without buying additional hardware (a Mac; this is not a
             | software issue, it's a legal issue, since the iOS SDK is
             | licensed to you on the condition that it is used on Apple
             | hardware only) and a yearly subscription. How is this
             | acceptable at all?
        
               | adamomada wrote:
               | The yearly subscription is for publishing your app on
               | Apple's store and definitely helps keep some garbage out.
               | Running your own app on your own device is basically
               | solved with free third-party solutions now (see AltStore,
               | and since then a newer method I can't recall atm).
        
               | simfree wrote:
               | WebGPU and many other features on iOS are unimplemented
               | or implemented in half-assed or downright broken ways.
               | 
               | These features work on all the modern desktop browsers
               | and on Android tho!
        
               | Aloisius wrote:
               | > WebGPU and many other features
               | 
               | WebGPU isn't standardized yet. Hell, _most_ of the
               | features people complain about aren't part of any
               | standard, but for some reason there's this sense that if
               | it's in Chrome, it's standard - as if Google dictates
               | standards.
        
               | moooo99 wrote:
               | > but for some reason there's this sense that if it's in
               | Chrome, it's standard - as if Google dictates standards.
               | 
               | Realistically, given the market share of Chrome and
               | Chromium based browsers, they kind of do.
        
               | yencabulator wrote:
               | Meanwhile, Apple has historically dictated that Google
               | can't publish Chrome for iOS, only a reskinned Safari.
               | People in glass-walled gardens shouldn't throw stones.
        
               | ToucanLoucan wrote:
               | > Heck, you can't even build your own app for your own
               | iPhone without buying another hardware (a Mac, this is
               | not a software issue, this is a legal issue, iOS SDK is
               | licensed to you on the condition of using on Apple
               | hardware only) and a yearly subscription. How is this
               | acceptable at all?
               | 
               | Because they set the terms of use of the SDK? You're not
               | required to use it. You aren't required to develop for
               | iOS. Just because Google gives it all away for free
               | doesn't mean Apple has to.
        
               | ClumsyPilot wrote:
               | > You aren't required to develop for iOS
               | 
               | Do you have a legal right to write software or run your
               | own software for hardware you bought?
               | 
               | Because it's very easy to take away a right by erecting
               | artificial barriers, just like how you could
               | discriminate by race at work but pretend you are doing
               | something else.
        
               | ToucanLoucan wrote:
               | > Do you have a legal right to write software or run your
               | own software for hardware you bought?
               | 
               | I've never heard of such a thing. Ideally I'd _like_
               | that, but I don't have such freedoms with the computers
               | in my cars, for example, or the one that operates my
               | furnace, or even for certain parts of my PC.
        
               | ClumsyPilot wrote:
               | So you bought "a thing" but you can't control what it
               | does or how it does it, and you don't get to decide what
               | data it collects or who can see that data.
               | 
               | You aren't allowed to repair the "thing" because the
               | software can detect you changed something and will refuse
               | to boot. And whenever it suits the manufacturer, they
               | will decide when the "thing" is declared out of support
               | and stops functioning.
               | 
               | I would say you are not an owner then; you (and I) are
               | just suckers that are paying for the party. Maybe it's a
               | lease. But then we also pay when it breaks, so it's more
               | of a digital feudalism.
        
               | nrb wrote:
               | > How is this acceptable at all?
               | 
               | Because as you described, the only alternatives that
               | exist are terrible experiences for basically everyone, so
               | people are happy to pay to license a solution that solves
               | their problems with minimal fuss.
               | 
               | Any number of people could respond to "use Android
               | devices with everything except firmware built from source
               | and signed by myself" with the same question.
        
               | mbreese wrote:
               | _> No, the alternative is Android devices with everything
               | except firmware built from source and signed by myself_
               | 
               | Normal users will not do this. Just because many of the
               | people here can build and sign a custom Android build
               | doesn't mean that is a viable _commercial_ alternative.
               | It is great that this is an option for those of us who can
               | do it, but don't present it as a viable alternative to the
               | iOS/Google ecosystems. The fraction of people who can and
               | will be willing to do this is really small. And even if
               | you can do it, how many people will want to maintain
               | their custom built OSes?
        
             | cbsmith wrote:
             | Google has also been working on (and provides kits for)
             | local machine learning on mobile devices... and they run on
             | both iOS and Android. The Gemini App does send data in to
             | Google for learning, but even that you can opt out of.
             | 
             | Apple's definitely pulling a "Heinz" move with privacy, and
             | it is true that they're doing a better job of it overall,
             | but Google's not completely horrible either.
        
           | nox101 wrote:
           | Their primary business is transitioning to selling services
           | and extracting fees. It's their primary growth driver.
        
             | brookst wrote:
             | Hey, I'm way ahead of Apple. I sell my services to my
             | employer and extract fees from them. Do you extract fees
             | too?
        
               | nox101 wrote:
               | I'm not sure what your point is. My point (which I
               | failed to make) is that Apple's incentives are changing
               | because their growth is dependent on services and
               | extracting fees so they will likely do things that try to
               | make people dependent on those services and find more
               | ways to charge fees (to users and developers).
               | 
               | Providing services is arguably at odds with privacy since
               | a service with access to all the data can provide a
               | better service than one without, so there will be a
               | tension between trying to provide the best services,
               | fueling their growth, and privacy.
        
               | brookst wrote:
               | I apologize for being oblique and kind of snarky.
               | 
               | My point was that it's interesting how we can frame a
               | service business as "extracting fees" to imply
               | wrongdoing, when it's pretty normal for all services to
               | charge ongoing fees for ongoing delivery.
        
               | ClumsyPilot wrote:
               | It's not about the money, it's about perverse incentives
               | and the propensity of service businesses to get away with
               | unfair practices. We have decent laws about your rights
               | as a consumer when you buy stuff, but basically no
               | regulation of services.
        
             | adamomada wrote:
             | So the new iPad & M4 was just some weekend project that
             | they shrugged and decided to toss over to their physical
             | retail store locations to see if anyone still bought
             | physical goods eh
        
           | stouset wrote:
           | Nothing is true forever. Google wasn't evil forever, Apple
           | won't value privacy forever.
           | 
           | Until we figure out how to have guarantees of forever, the
           | best we can realistically do is evaluate companies and their
           | products by their behavior _now_ weighted by their behavior
           | in the past.
        
           | klabb3 wrote:
           | > but it is about selling luxury hardware.
           | 
           | Somewhat true, but things are changing. While there are
           | plenty of "luxury" Apple devices like the Vision Pro or fully
           | decked-out MacBooks used for web browsing, we no longer live
           | in a world where tech products are just lifestyle gadgets.
           | People spend hours a day on their phones, and often run their
           | lives and businesses through them. Even with the $1000+/2-3y
           | price tag, it's simply not that much given how central a role
           | it plays in your life. This is especially true for younger
           | generations who often don't have laptops or desktops at home,
           | and also increasingly in poorer-but-not-poor countries (say,
           | Eastern Europe). So the iPhone (their best selling product)
           | is far, far, far more a commodity utility than typical luxury
           | consumption like watches, purses, sports cars, etc.
           | 
           | Even in the higher end products like the MacBooks you see a
           | lot of professionals (engineers included) who choose it
           | because of its price-performance-value, and who don't give a
           | shit about luxury. Especially since the M1 launched, where
           | performance and battery life took a giant leap.
        
             | _the_inflator wrote:
             | I disagree.
             | 
             | Apple is selling hardware, and scaling AI by utilizing that
             | hardware is simply a smart move.
             | 
             | Instead of building huge GPU clusters and having to deal
             | with NVIDIA for GPUs (Apple kicked NVIDIA out years ago
             | because of disagreements), Apple is building mainly on
             | existing hardware.
             | 
             | In other terms, this is utilizing CPU power.
             | 
             | On the other hand, this helps their marketing keep high
             | price points, since Apple is now going to differentiate
             | hardware prices by AI functionality, which correlates with
             | CPU power. This is also consistent with Apple stopping the
             | MHz comparisons years ago.
        
             | KptMarchewa wrote:
             | No computers in eastern Europe? WTF? Are you confusing us
             | with Indians programming on their phones?
        
               | vr46 wrote:
               | Countering a lazy reference with some weird racist
               | stereotype was the best you could do?
        
           | hehdhdjehehegwv wrote:
           | As a privacy professional for many, many years this is 100%
           | correct. Apple wouldn't be taking billions from Google for
           | driving users to their ad tracking system, they wouldn't give
           | the CCP access to all Chinese user data (and maybe beyond),
           | and they wouldn't be on-again-off-again flirting with
           | tailored ads in Apple News if privacy was a "human right".
           | 
           | (FWIW my opinion is it is a human right, I just think Tim
           | Cook is full of shit.)
           | 
           | What Apple calls privacy more often than not is just putting
           | lipstick on the pig that is their anticompetitive walled
           | garden.
           | 
           | Pretty much everybody in SV who works in privacy rolls their
           | eyes at Apple. They talk a big game but they are as full of
           | shit as Meta and Google - and there's receipts to prove it
           | thanks to this DoJ case.
           | 
           | Apple want to sell high end hardware. On-device computation
           | is a better user experience, hands down.
           | 
           | That said, Siri is utter dogshit so on-device dogshit is just
           | faster dogshit.
        
             | moneywoes wrote:
             | any private guides for todays smartphone user?
        
           | kortilla wrote:
           | The MacBook Air is not a luxury device. That meme is out of
           | date
        
             | pseufaux wrote:
             | Curious what criteria you're using for qualifying luxury.
             | It seems to me that materials, software, and design are all
             | on par with other more expensive Apple products. The main
             | difference is the chipset, which I would argue is on an
             | equal quality level with the Pro chips but designed for a
             | less power-hungry audience.
        
             | ozim wrote:
             | Maybe for you, but I still see sales guys who refuse to
             | work on WinTel when basically all they do is browse the
             | internet and do spreadsheets - mainly just because they
             | would not look cool compared to other sales guys rocking
             | MacBooks.
        
               | stevage wrote:
               | I don't buy this "looking cool" argument.
               | 
               | I have used both. I think the Mac experience is
               | significantly better. No one is looking at me.
        
             | lolinder wrote:
             | I can't buy a MacBook Air for less than $999, and that's
             | for a model with 8GB RAM, an 8-core CPU and 256GB SSD. The
             | equivalent (based on raw specs) in the PC world runs for
             | $300 to $500.
             | 
             | How is something that is twice as expensive as the
             | competition _not_ a luxury device?
        
               | spurgu wrote:
               | Really? You can find a laptop with the equivalent of
               | Apple Silicon for $300-500? And while I haven't used
               | Windows in ages, I doubt it runs as well with 8 GB as
               | macOS does.
        
               | lolinder wrote:
               | Sure, then try this one from HP with 16GB RAM and a CPU
               | that benchmarks in the same ballpark as the M2, for $387:
               | 
               | https://www.amazon.com/HP-Pavilion-i7-11370H-Micro-Edge-
               | Anti...
               | 
               | The point isn't that the MacBook Air isn't _better_ by
               | some metrics than PC laptops. A Rolls-Royce is  "better"
               | by certain metrics than a Toyota, too. What makes a
               | device luxury is if it costs substantially more than
               | competing products that the average person would consider
               | a valid replacement.
        
               | datadrivenangel wrote:
               | How much does it cost to get a device with comparable
               | specs, performance, and 18 hour battery life?
               | 
               | Closer to $999 than $500.
        
               | lolinder wrote:
               | This CPU benchmarks in the same ballpark as the M2 and it
               | runs for $329:
               | 
               | https://www.amazon.com/Lenovo-IdeaPad-
               | Ryzen5-5500U-1920x1080...
               | 
               | An 18 hour battery life _is_ a luxury characteristic, not
               | something penny pinchers will typically be selecting on.
        
               | outworlder wrote:
               | What about the rest of the system? The SSD, for example?
               | 
               | Apple likes to overcharge for storage, but the drives are
               | _really_ good.
        
               | lolinder wrote:
               | When you're breaking out SSD speeds you're _definitely_
               | getting into the  "luxury" territory.
               | 
               | As I said in another comment:
               | 
               | The point isn't that the MacBook Air isn't better by some
               | metrics than PC laptops. A Rolls-Royce is "better" by
               | certain metrics than a Toyota, too. What makes a device
               | luxury is if it costs substantially more than competing
               | products that the average person would consider a valid
               | replacement.
        
               | pquki4 wrote:
               | What's the point of comparison? Aren't 18-hour battery
               | life and the Genius Bar part of the "luxury"?
               | 
               | It's like if I say an Audi is a luxury car because a
               | Toyota costs less than half as much, and you ask "what
               | about a Toyota with leather seats"?
        
           | moritzwarhier wrote:
           | > > In case it is not abundantly clear by now: Apple's AI
           | strategy is to put inference (and longer term even learning)
           | on edge devices. This is completely coherent with their
           | privacy-first strategy (which would be at odds with sending
           | data up to the cloud for processing).
           | 
           | > Their primary business goal is to sell hardware.
           | 
           | There is no contradiction here. No need for luxury. Efficient
           | hardware scales, Moore's law has just been rewritten, not
           | defeated.
           | 
           | Power efficiency combined with shared and extremely fast RAM
           | is still a formula for success, as long as they are able to
           | deliver.
           | 
           | By the way, M-series MacBooks have crossed into bargain
           | territory by now compared to WinTel in some specific (but
           | large) niches, e.g. the M2 Air.
           | 
           | They are still technically superior in power efficiency and
           | still competitive in performance in many common uses, be it
           | traditional media decoding and processing, GPU-heavy tasks
           | (including AI), single-core performance...
           | 
           | By the way, this includes web technologies / JS.
        
           | jimbokun wrote:
           | For all their competitors it's not true right now.
        
         | SpaceManNabs wrote:
         | This comment is odd. I wouldn't say it is misleading, but it is
         | odd because it borders on that definition.
         | 
         | > Apple's AI strategy is to put inference (and longer term even
         | learning) on edge devices
         | 
         | This is pretty much everyone's strategy. Model distillation is
         | huge because of this. This goes in line with federated
         | learning. This goes in line with model pruning too. And
         | parameter efficient tuning and fine tuning and prompt learning
         | etc.
         | 
         | > This is completely coherent with their privacy-first strategy
         | 
         | Apple's marketing for their current approach is privacy-first.
         | They are not privacy first. If they were privacy first, you
         | would not be able to use app tracking data on their first party
         | ad platform. They shut it off for everyone else but themselves.
         | Apple's approach is walled garden first.
         | 
         | > Processing data at the edge also makes for the best possible
         | user experience because of the complete independence of network
         | connectivity
         | 
         | as long as you don't depend on graph centric problems where
         | keeping a local copy of that graph is prohibitive. Graph
         | problems will become more common. Not sure if this is a problem
         | for apple though. I am just commenting in general.
         | 
         | > If (and that's a big if) they keep their APIs open to run any
         | kind of AI workload on their chips
         | 
         | Apple does not have a good track record of this; they are quite
         | antagonistic when it comes to this topic. Gaming on Apple was
         | dead for nearly a decade (and pretty much still is) because
         | Steve Jobs did not want people gaming on Macs. Apple has eased
         | up on this, but it very much seems that if they want you to use
         | their devices (not yours) in a certain way, then they make it
         | expensive to do anything else.
         | 
         | Tbf, I don't blame apple for any of this. It is their strategy.
         | Whether it works or not, it doesn't matter. I just found this
         | comment really odd since it almost seemed like evangelism.
         | 
         | edit: weird to praise Apple for on-device training when it is
         | not publicly known if they have trained any substantial model
         | even in the cloud.
        
           | nomel wrote:
           | > This is pretty much everyone's strategy.
           | 
           | I think this is being too charitable on the state of
           | "everyone". It's everyone's _goal_. Apple is actively
           | achieving that goal, with their many-year _strategy_ of
           | in-house silicon/features.
        
             | SpaceManNabs wrote:
             | > Apple is actively achieving that goal, with their many
             | year strategy of in house silicon/features
             | 
             | So are other companies, with their many-year strategy of
             | actually building models that are accessible to the public.
             | 
             | yet Apple is "actively" achieving the goal without any
             | distinct models.
        
               | nomel wrote:
               | No. "On edge" is not a model existence limitation, it is
               | a hardware capability/existence limitation, by
               | definition, and by the fact that, as you point out, the
               | models already _exist_.
               | 
               | You can already run those open weight models on Apple
               | devices, on edge, with huge improvements on the newer
               | hardware. Why is a distinct model required? Do the rumors
               | appease these thoughts?
               | 
               | If others are making models, with no way to actually run
               | them, that's not a viable "on edge" strategy, since it
               | involves waiting for someone else to actually accomplish
               | the goal first (as is being done by Apple).
        
               | SpaceManNabs wrote:
               | > "On edge" is not a model existence limitation
               | 
               | It absolutely is. Model distillation will still be
               | pertinent. And so will be parameter efficient tuning for
               | edge training. I cannot emphasize more how important this
               | is. You will need your own set of weights. If apple wants
               | to use open weights, then sure, ignore this. It doesn't
               | seem like they want to long-term... And even if they use
               | open weights, they will still be behind other companies
               | that have done model distillation and federated learning
               | for years.
               | 
               | > Why is a distinct model required?
               | 
               | Ask apple's newly poached AI hires this question. Doesn't
               | seem like you would take an answer from me.
               | 
               | > If others are making models, with no way to actually
               | run them
               | 
               | Is this the case? People have been running distilled
               | llamas on rPis with pretty good throughput.
        
           | jameshart wrote:
           | Everyone's strategy?
           | 
           | The biggest players in commercial AI models at the moment -
           | OpenAI and Google - have made absolutely no noise about
           | pushing inference to end user devices at all. Microsoft,
           | Adobe, other players who are going big on embedding ML models
           | into their products, are not pushing those models to the
           | edge, they're investing in cloud GPU.
           | 
           | Where are you picking up that this is _everyone's_ strategy?
        
             | SpaceManNabs wrote:
             | > Where are you picking up that this is everyone's
             | strategy?
             | 
             | Read what their engineers say in public. Unless I
             | hallucinated years of federated learning.
             | 
             | Also apple isn't even a player yet and everyone is
             | discussing how they are moving stuff to the edge lol. Can't
             | critique companies for not being on the edge yet when apple
             | doesn't have anything out there.
        
         | kernal wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | How can they have a privacy first strategy when they operate an
         | Ad network and have their Chinese data centers run by state
         | controlled companies?
        
           | KerrAvon wrote:
           | How can I have mint choc and pineapple swirl ice cream when
           | there are children starving in Africa?
        
           | n9 wrote:
           | ... I think that the more correct assertion would be that
           | Apple is a sector leader in privacy. If only because their
           | competitors make no bones about violating the privacy of
           | their customers, as it is the basis of their business model.
           | So it's not that Apple is A+ so much as the other students
           | are getting Ds and Fs.
        
         | strangescript wrote:
         | The fundamental problem with this strategy is model size. I
         | want all my apps to be privacy first with local models, but
         | there is no way they can share models in any kind of coherent
         | way. Especially when good apps are going to fine tune their
         | models. Every app is going to be 3GB+
        
           | tyho wrote:
           | Foundation models will be the new .so files.
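           | 
           | One way to read that analogy, with made-up sizes: a
           | shared base model on the OS plus small per-app adapters,
           | instead of every app bundling its own multi-GB model:
           | 
           |     # back-of-envelope for the ".so" analogy
           |     BASE_GB, ADAPTER_GB = 3.0, 0.02
           | 
           |     def total_gb(apps, shared=True):
           |         if shared:
           |             return BASE_GB + apps * ADAPTER_GB
           |         return apps * BASE_GB
           | 
           |     print(total_gb(20))                # ~3.4 GB
           |     print(total_gb(20, shared=False))  # 60 GB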
        
           | SpaceManNabs wrote:
           | I don't think HN understands how important model distillation
           | still is for federated learning. Hype >> substance ITT
        
         | macns wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | You mean .. with their _said_ privacy-first strategy
        
         | ptman wrote:
         | Apple privacy is marketing https://www.eurekalert.org/news-
         | releases/1039938
        
         | xipix wrote:
         | How is local more private? Whether AI runs on my phone or in a
         | data center I still have to trust third parties to respect my
         | data. That leaves only latency and connectivity as possible
         | reasons to wish for endpoint AI.
        
           | chatmasta wrote:
           | If you can run AI in airplane mode, you are not trusting any
           | third party, at least until you reconnect to the Internet.
           | Even if the model was malware, it wouldn't be able to
           | exfiltrate any data prior to reconnecting.
           | 
           | You're trusting the third party at training time, to build
           | the model. But you're not trusting it at inference time (or
           | at least, you don't have to, since you can airgap inference).
        
         | thefourthchime wrote:
         | Yes, it is completely clear. My guess is they do something like
         | "Siri-powered shortcuts", where you can ask it to do a couple
         | of things and it'll dynamically create a script and execute it.
         | 
         | I can see a smaller model trained to do that working well
         | enough; however, I've never seen any real working examples of
         | this. That Rabbit device is heading in that direction, but
         | it's mostly vaporware now.
        
           | _boffin_ wrote:
           | Pretty much my thoughts too. They're going to have a model
           | that's smaller than 3B built in. They'll have tokens that
           | directly represent functions / shortcuts.
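           | 
           | A toy sketch of what "tokens that map to functions"
           | could look like; the token names and shortcuts below
           | are invented, not anything Apple has shipped:
           | 
           |     # the model emits a tool token plus an argument;
           |     # the OS routes it to the matching shortcut
           |     SHORTCUTS = {
           |         "<set_timer>": lambda m: f"timer: {m} min",
           |         "<send_msg>":  lambda t: f"sent: {t}",
           |     }
           | 
           |     def run(model_output):
           |         token, _, arg = model_output.partition(" ")
           |         action = SHORTCUTS.get(token)
           |         if action is None:
           |             return model_output  # plain text reply
           |         return action(arg)
           | 
           |     print(run("<set_timer> 10"))  # -> timer: 10 min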
        
         | choppaface wrote:
         | On "privacy": If Apple owned the Search app versus paying
         | Google, and used their own ad network (which they have for App
         | Store today), Apple would absolutely use your data and location
         | etc. to target you with ads.
         | 
         | It can even be third party services sending ad candidates
         | directly to your phone and then the on-device AI chooses which
         | is relevant.
         | 
         | Privacy is a contract not the absence of a clear business
         | opportunity. Just look at how Apple does testing internally
         | today. They have no more respect for human privacy than any of
         | their competitors. They just differentiate through marketing
         | and design.
        
         | LtWorf wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | Is this the same apple whose devices do not work at all unless
         | you register an apple account?
        
           | yazzku wrote:
           | Some people really seem to be truly delusional. It's obvious
           | that the company's "privacy" is a marketing gimmick when you
           | consider the facts. Do people not consider the facts anymore?
           | How does somebody appeal to the company's "privacy-first
           | strategy" with a straight face in light of the facts? I
           | suppose they are not aware of the advertising ID that is
           | embedded in all Apple operating systems. That one doesn't
           | even require login.
        
             | LtWorf wrote:
             | Considering the facts is much harder when admitting a
             | mistake is involved.
        
         | aiauthoritydev wrote:
         | Edge inference and cloud inference are not mutually exclusive
         | and chances are any serious player would be dipping their toes
         | in both.
        
         | 7speter wrote:
         | >I personally really really welcome as I don't want the AI
         | future to be centralised in the hands of a few powerful cloud
         | providers.
         | 
         | Watch out for being able to use AI on your local machine while
         | those AI services use telemetry to send your data (recorded
         | conversations, for instance) to their motherships.
        
           | happyopossum wrote:
           | > and those ai services using telemetry to send your data
           | (recorded conversations, for instance) to their motherships
           | 
           | This doesn't require AI, and I am not aware of any instances
           | of this happening today, so what exactly are we watching out
           | for?
        
         | Powdering7082 wrote:
         | Also they don't have to pay _either_ the capex or opex costs
         | for training a model if they get users' devices to train the
         | models.
        
         | aiauthoritydev wrote:
         | I think these days everyone links their products with AI. Today
         | even the BP CEO linked his business with AI. Edge inference and
         | cloud inference are not mutually exclusive choices. Any serious
         | provider will provide both, and the improvement in quality of
         | services comes from you giving more of your data to the service
         | provider. Most people are totally fine with that, and that will
         | not change any time soon. Privacy paranoia is mostly a fringe
         | thing in consumer tech.
        
         | gtirloni wrote:
         | _> complete independence of network connectivity and hence
         | minimal latency._
         | 
         | Does it matter that each token takes additional milliseconds on
         | the network if the local inference isn't fast? I don't think it
         | does.
         | 
         | The privacy argument makes some sense, if there's no telemetry
         | leaking data.
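         | 
         | A back-of-envelope comparison; all the numbers below are
         | assumed, just to show when the network round trip stops
         | mattering:
         | 
         |     def reply_seconds(tokens, tok_per_s, rtt_s=0.0):
         |         # total time for one reply: round trip
         |         # plus per-token generation time
         |         return rtt_s + tokens / tok_per_s
         | 
         |     # 200-token reply, slow local model vs fast
         |     # remote model behind a 0.1 s round trip
         |     print(reply_seconds(200, 10))        # 20.0 s local
         |     print(reply_seconds(200, 50, 0.1))   #  4.1 s remote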
        
         | kshahkshah wrote:
         | Why is Siri still so terrible though?
        
         | w1nst0nsm1th wrote:
         | > to put inference on edge devices...
         | 
         | It will take a long time before you can put performant
         | inference on edge devices.
         | 
         | Just download one of the various open source large(st) language
         | models and test it on your desktop...
         | 
         | Compute power and memory and storage requirements are insane if
         | you want decent results... I mean not just Llama gibberish.
         | 
         | Until such requirements are satisfied, remote models are the
         | way to go, at least for conversational models.
         | 
         | Aside from LLMs, AlphaGo would not run on any end user device,
         | by a long shot, even though it is already 'old' technology.
         | 
         | I think a 'neural engine' on an end user device is just
         | marketing nonsense at the current state of the art.
        
         | tweetle_beetle wrote:
         | I wonder if BYOE (bring your own electricity) also plays a part
         | in their long term vision? Data centres are expensive in terms
         | of hardware, staffing and energy. Externalising this cost to
         | customers saves money, but also helps to paint a green(washing)
         | narrative. It's more meaningful to more people to say they've
         | cut their energy consumption by x than to say they have a
         | better server obsolescence strategy, for example.
        
           | dyauspitr wrote:
           | Why would it be green washing? Aren't their data centers and
           | commercial operations run completely on renewable energy?
        
             | ironmagma wrote:
             | If you offload data processing to the end user, then your
             | data center uses less energy on paper. The washing part is
             | that work is still being done and spending energy, just
             | outside of the data center.
        
               | dyauspitr wrote:
               | Which honestly is still good for the environment to have
               | the work distributed across the entire electricity grid.
               | 
               | That work needs to be done anyways and Apple is doing it
               | in the cleanest way possible. What's an alternative in
               | your mind, just don't do the processing? That sounds like
               | making progress towards being green. If you're making
               | claims of green washing you need to be able to back it up
               | with what alternative would actually be "green".
        
               | ironmagma wrote:
               | I didn't make any claims, I just explained what the
               | parent was saying. There could be multiple ways to make
               | it more green: one being not doing the processing, or
               | another perhaps just optimizing the work being done. But
               | actually, no, you don't need a viable way to be green in
               | order to call greenwashing "greenwashing." It can just be
               | greenwashing, with no alternative that is actually green.
        
           | timpetri wrote:
           | That is an interesting angle to look at it from. If they're
           | gonna keep pushing this they end up with a strong incentive
           | to make the iPhone even more energy efficient, since users
           | have come to expect good/always improving battery life.
           | 
           | At the end of the day, though, AI workloads in the cloud will
           | always be a lot more compute-effective, meaning a lower
           | combined footprint. However, in the server-based model, there
           | is more incentive to pre-compute (waste inference on) things
           | to make them appear snappy on device. Analogous would be all
           | that energy spent doing video encoding for YouTube videos
           | that never get watched. Although, it's "idle" resources for
           | budgeting purposes.
        
           | benced wrote:
           | Apple has committed that all of its products will be carbon-
           | neutral - including emissions from charging during their
           | lifetime - by 2030. The Apple Watch is already there.
           | 
           | https://www.apple.com/newsroom/2023/09/apple-unveils-its-
           | fir...
        
         | aborsy wrote:
         | What are examples of the edge devices made by Apple?
        
           | floam wrote:
           | MacBook, iPhone, iPad?
        
         | dancemethis wrote:
         | Apple is privacy last, if anything. Forgotten PRISM already?
        
         | royaltjames wrote:
         | Yes this began with the acquisition of xnor.ai. Absolutely
         | amazing what will be done (and is being done) with edge
         | computing.
        
         | Pesthuf wrote:
         | Honestly, if they manage this, they have my money. But to get
         | actually powerful models running, they need to supply the
         | devices with enough RAM - and that's definitely not what Apple
         | likes to do.
        
         | jonplackett wrote:
         | I think it will be a winning strategy. Lag is a real killer for
         | LLMs.
         | 
         | I think they'll have another LLM on a server (maybe a deal for
         | openai/gemini) that the one on the device can use like ChatGPT
         | uses plugins.
         | 
         | But on device Apple have a gigantic advantage. Rabbit and
         | Humane are good ideas humbled by shitty hardware that runs out
         | of battery, gets too hot, has to connect to the internet to do
         | literally anything.
         | 
         | Apple is in a brilliant position to solve all those things.
         | 
         | I hope they announce something good at WWDC
        
           | threeseed wrote:
           | There really isn't enough emphasis on the downsides of server
           | side platforms.
           | 
           | So many of these are only deployed in the US, so if you're,
           | say, in country Australia, not only do you have all your
           | traffic going to the US, but it goes via slow and
           | intermittent cellular connections.
           | 
           | It makes using services like LLMs unusably slow.
           | 
           | I miss the 90s and having applications and data reside
           | locally.
        
         | throwaway48476 wrote:
         | >Apple's AI strategy is to put inference (and longer term even
         | learning) on edge devices
         | 
         | Ironic, given that AI requires lots of VRAM.
        
         | lagt_t wrote:
         | Has nothing to do with privacy, google is also pushing gemini
         | nano to the device. The sector is discovering the diminishing
         | returns of LLMs.
         | 
         | With the ai cores on phones they can cover your average user
         | use cases with a light model without the server expense.
        
           | SpaceManNabs wrote:
           | don't bother. apple's marketing seems to have won on here. i
           | made a similar point only for people to tell me that apple is
           | the only org seriously pushing federated learning.
        
       | zincmaster wrote:
       | I own M1, A10X and A12X iPad Pros. I have yet to see any of them
       | ever max out their processor or get slow. I have no idea why
       | anyone would need an M4 one. Sure, it's because Apple no longer
       | has M1s being fabbed at TSMC. But seriously, who would upgrade?
       | 
       | Put macOS on the iPad Pro, then it gets interesting. The most
       | interesting things my iPad Pros do are look at security cameras
       | and read OBD-II settings on my vehicle. Hell, they can't even
       | maintain an SSH connection correctly. Ridiculous.
       | 
       | I see Apple always show videos of people editing video on their
       | iPad Pro. Who does that??? We use them for watching videos
       | (kids). One is in a car as a mapping system - that's a solid use
       | case. One I gave my Dad and he didn't know what to do with it -
       | so it's collecting dust. And one lives in the kitchen doing
       | recipes.
       | 
       | Functionally, a 4 year old Chromebook is 3x as useful as a new
       | iPad Pro.
        
       | jsaltzman20 wrote:
       | Who built the M4 chip? https://www.linkedin.com/posts/jason-
       | salt_ai-apple-semicondu...
        
         | doctor_eval wrote:
         | Great People Units. OK. Nice recruitment pitch.
        
       | phkahler wrote:
       | >> And with AI features in iPadOS like Live Captions for real-
       | time audio captions, and Visual Look Up, which identifies objects
       | in video and photos, the new iPad Pro allows users to accomplish
       | amazing AI tasks quickly and on device. iPad Pro with M4 can
       | easily isolate a subject from its background throughout a 4K
       | video in Final Cut Pro with just a tap, and can automatically
       | create musical notation in real time in StaffPad by simply
       | listening to someone play the piano. And inference workloads can
       | be done efficiently and privately...
       | 
       | These are really great uses of AI hardware. All of them benefit
       | the user, where many of the other companies doing AI are somehow
       | trying to benefit themselves. AI as a feature vs AI as a service
       | or hook.
        
       | leesec wrote:
       | Why are all the comparisons with the M2? Apple did this with the
       | M3 -> M1 as well right?
        
       | spxneo wrote:
       | almost went for M2 128gb to run some local llamas
       | 
       | glad I held out. M4 is going to put downward pressure across all
       | previous gen.
       | 
       | edit: nvm, AMD is coming out with twice the performance of M4 in
       | two months or less. If the M2s become super cheap I will consider
       | it but M4 came far too late. There's just way better alternatives
       | now and very soon.
        
       | winwang wrote:
       | I would want to know if the LPDDR6 rumors are substantiated, i.e.
       | memory bus details.
       | 
       | https://forums.macrumors.com/threads/lpddr6-new-beginning-of...
       | 
       | If M4 Max could finally break the 400GBps limit of the past few
       | years and hit 600 GBps, it would be huge for local AI since it
       | could directly translate into inference speedups.
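       | 
       | Rough intuition for why, assuming decode is memory-bound
       | (every generated token streams the full weights once);
       | the numbers below are illustrative, not benchmarks:
       | 
       |     def tokens_per_sec(bw_gb_s, params_b, b_per_param=0.5):
       |         model_gb = params_b * b_per_param  # ~4-bit quant
       |         return bw_gb_s / model_gb
       | 
       |     print(tokens_per_sec(400, 70))   # ~11 tok/s, 70B
       |     print(tokens_per_sec(600, 70))   # ~17 tok/s, 70B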
        
         | hmottestad wrote:
         | The M4 has increased the bandwidth from 100 to 120 GB/s. The M4
         | Max would probably be 4x that at 480 GB/s, but the M4 Ultra
         | would be 960 GB/s compared to M2 Ultra at 800 GB/s.
        
           | winwang wrote:
           | Dang. +20% is still a nice difference, but not sure how they
           | did that. Here's hoping M4 Max can include more tech, but
           | that's copium.
           | 
           | 960 GB/s is 3090 level, so that's pretty good. I'm curious
           | whether the MacBooks right now are actually more compute-
           | limited due to tensor throughput being relatively weak; not
           | sure about real-world perf.
        
       | animatethrow wrote:
       | Only iPad Pro has M4? Once upon a time during the personal
       | computer revolution in the 1980s, little more than a decade after
       | man walked on the moon, humans had sufficiently technologically
       | developed that it was possible to compile and run programs on the
       | computers we bought, whether the computer was Apple (I,II,III,
       | Mac), PC, Commodore, Amiga, or whatever. But these old ways were
       | lost to the mists of time. Is there any hope this ancient
       | technology will be redeveloped for iPad Pro within the next 100
       | years? Specifically within Q4 of 2124, when Prime will finally
       | offer deliveries to polar Mars colonies? I want to buy an iPad
       | Pro M117 for my great-great-great-great-granddaughter but only if
       | she can install a C++ 212X compiler on it.
        
       | GalaxyNova wrote:
       | I hate how Apple tends to make statements about their products
       | without clear benchmarks.
        
       | abhayhegde wrote:
       | What's the endgame with iPads though? I mainly use mine for
       | consumption, taking notes, and jotting annotations on PDFs.
       | Well, it's a significant companion for my work, but I can't see
       | any reason to upgrade from my iPad Air 5, especially given the
       | incompatibility of the 2nd-gen Pencil.
        
       | biscuit1v9 wrote:
       | > the latest chip delivering phenomenal performance to the all-
       | new iPad Pro
       | 
       | What a joke. They have the M4 and they still run iPadOS? Why
       | can't they run macOS instead?
       | 
       | Taking it a bit further: if an iPad had a keyboard, a mouse, and
       | macOS, it would basically be a 10/12-inch MacBook.
        
       | jshaqaw wrote:
       | The heck do I do with an M4 in an iPad? Scroll hacker news really
       | really fast?
       | 
       | Apple needs to reinvest in software innovation on the iPad. I
       | don't think my use case for it has evolved in 5 years.
        
         | hmottestad wrote:
         | I was hoping they would come out and say "and now developers
         | can develop apps directly on their iPads with our new release
         | of Xcode" but yeah, no. Don't know if the M4 with just 16GB of
         | memory would be very comfortable for any pro workload.
        
           | doctor_eval wrote:
           | There's no way Apple would announce major new OS features
           | outside of WWDC.
           | 
           | So, perhaps it's no coincidence that a new iPadOS will be
           | announced in exactly one month.
           | 
           | Here's hoping anyway!
        
       | hmottestad wrote:
       | 120 GB/s memory bandwidth. The M4 Max will probably top out at 4x
       | that and the M4 Ultra at 2x that again. The M4 Ultra will be very
       | close to 1TB/s of bandwidth. That would put the M4 Ultra in line
       | with the 4090.
       | 
       | Rumours are that the Mac Studio and Mac Pro will skip M3 and go
       | straight to M4 at WWDC this summer, which would be very
       | interesting. There has also been some talk about an M4 Extreme,
       | but we've heard rumours about the M1 Extreme and M2 Extreme
       | without any of those showing up.
        
       | jgiacjin wrote:
       | Is there an SDK to work on gaming with Unity on the iPad Pro M4?
        
       | gigatexal wrote:
       | lol, I just got an M3 Max and this chip does more than 2x the
       | TOPS that my NPU does
        
       | gigatexal wrote:
       | Anyone know if the RAM options for the M4 are better than the
       | M3's? Example: could a base-model M4 sport more than 24 GB of
       | RAM?
        
         | LtdJorge wrote:
         | Not for the iPad (8/16 GB)
        
           | gigatexal wrote:
           | Right, I was thinking more about laptops and desktops, but
           | yeah, 8/16 makes a ton more sense for just the iPad.
        
       | thih9 wrote:
       | When this arrives in MacBooks, what would that mean in practice?
       | Assuming a base M4 config (not Max, not Ultra - those were already
       | powerful in earlier iterations), what kind of LLM could I run on
       | it locally?
        
         | talldayo wrote:
         | > what kind of LLM could I run on it locally?
         | 
         | Anything that fits in the memory configuration it ships with.
         | So for a base-model M4, that likely means 8 GB of memory, with
         | 4-5 GB of it realistically usable.
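         | 
         | A rough way to sanity-check what fits (weights only; real
         | usage also needs KV cache and runtime overhead, and the model
         | sizes below are just illustrative assumptions):
         | 
         |     def weights_gb(params_b, bytes_per_param):
         |         # approximate in-memory size of the weights alone
         |         return params_b * bytes_per_param
         | 
         |     print(weights_gb(7, 0.5))   # 7B at ~4-bit: ~3.5 GB, fits
         |     print(weights_gb(13, 0.5))  # 13B at ~4-bit: ~6.5 GB, no
         | 
         | So with ~4-5 GB usable you're realistically looking at
         | 7B-class models with 4-bit quantization, or smaller.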
        
       | thih9 wrote:
       | > while Apple touts the performance jump of the 10-core CPU found
       | inside the new M4 chip, that chip variant is exclusive to the 1
       | TB and 2 TB iPad Pro models. Lower storage iPads get a 9-core
       | CPU. They also have half the RAM
       | 
       | https://9to5mac.com/2024/05/07/new-ipad-pro-missing-specs-ca...
        
       | ycsux wrote:
       | Everyone seems as confused as I am about Apple's strategy here. I
       | wasn't sure the M4 existed; now it can be bought in a format
       | no one wants. How will this bring in a lot of revenue?
        
       | TheMagicHorsey wrote:
       | The only reason I'm buying a new iPad Pro is the screen and
       | because the battery on my 2021 iPad Pro is slowly dying.
       | 
       | I couldn't care less that the M4 chip is in the iPad Pro ... all I
       | use it for is browsing the web, watching movies, playing chess,
       | and posting on Hacker News (and some other social media as well).
        
       | Topfi wrote:
       | Generally, I feel it's presumptuous to tell a company how to
       | handle a product line as successful as the iPads, but I beg you:
       | please make Xcode available on iPadOS, or provide an optional
       | and separate macOS mode similar to DeX on Samsung tablets. Being
       | totally honest, I don't like macOS that much compared to other
       | options, but we have to face the fact that even with the M1, the
       | iPad's raw performance was far beyond the vast majority of
       | laptops and tablets in a wide range of use cases, yet the
       | restrictive software made it all for naught. Consider that the
       | "average" customer is equally happy with, and due to pricing
       | generally steered towards, the iPad Air - a great device that
       | covers the vast majority of use cases essentially identically to
       | the Pro.
       | 
       | Please find a way beyond local transformer models to offer a
       | true use case that differentiates the Pro from the Air (ideally
       | development). The second that gets announced, I'd order the
       | 13-inch model straight away. As it stands, Apple's stance is at
       | least saving me from spending 3.5k, as I've resigned myself to
       | accepting that the best hardware in tablets simply cannot be
       | used in any meaningful way. Xcode would be a start, macOS a
       | bearable compromise (unless they address recent instability and
       | bugs, which would make it more than that), Asahi a ridiculous
       | yet beautiful pipe dream. Fedora on an iPad: the best of
       | hardware and software, at least in my personal opinion.
        
       ___________________________________________________________________
       (page generated 2024-05-07 23:00 UTC)