[HN Gopher] Apple introduces M4 chip
       ___________________________________________________________________
        
       Apple introduces M4 chip
        
       Author : excsn
       Score  : 1539 points
       Date   : 2024-05-07 14:37 UTC (2 days ago)
        
 (HTM) web link (www.apple.com)
 (TXT) w3m dump (www.apple.com)
        
       | stjo wrote:
       | Only in the new iPads though, no word when it'll be available in
       | Macs.
        
         | speg wrote:
         | In the video event, Tim mentions more updates at WWDC next
         | month - I suspect we will see a M4 MacBook Pro then.
        
           | atonse wrote:
            | Haven't they been announcing Pros and Maxes around December?
           | I don't remember. If they're debuting them at WWDC I'll
           | definitely upgrade my M1. I don't even feel the need to, but
           | it's been 2.5 years.
        
             | Foobar8568 wrote:
             | November 2023 for the M3 refresh, M2 was January 2023 if I
              | remember correctly.
        
             | hmottestad wrote:
             | Mac Studio with M4 Ultra. Then M4 Pro and Max later in the
             | year.
        
         | throwaway5959 wrote:
         | Hopefully in the Mac Mini at WWDC.
        
       | ramboldio wrote:
       | if only macOS would run on iPad..
        
         | user90131313 wrote:
          | All that power and the iPad stays limited like a toy.
        
         | Eun wrote:
         | then a lot of people would buy it, including me :-)
        
         | vbezhenar wrote:
            | Please no. I don't want touch support in macOS.
        
           | jwells89 wrote:
           | It might work if running in Mac mode required a reboot (no on
           | the fly switching between iOS and macOS) and a connected
           | KB+mouse, with the touch part of the screen (aside from
           | Pencil usage) turning inert in Mac mode.
           | 
           | Otherwise yes, desktop operating systems are a terrible
           | experience on touch devices.
        
             | vbezhenar wrote:
             | > It might work if running in Mac mode required a reboot
             | (no on the fly switching between iOS and macOS) and a
             | connected KB+mouse, with the touch part of the screen
             | (aside from Pencil usage) turning inert in Mac mode.
             | 
              | Sounds like a strictly worse version of a MacBook. Might
              | be useful for occasional work, but I expect people who
              | would use this mode continuously to just switch to a
              | MacBook.
        
               | jwells89 wrote:
               | The biggest market would be for travelers who essentially
               | want a work/leisure toggle.
               | 
               | It's not too uncommon for people to carry both an iPad
               | and MacBook for example, but a 12.9" iPad that could
               | reboot into macOS to get some work done and then drop
               | back to iPadOS for watching movies or sketching could
                | replace both without too much sacrifice. There are
                | tradeoffs, but nothing worse than what you see on PC
                | 2-in-1s, plus no questionable hinges to fail.
        
               | ginko wrote:
                | MacBooks, even the Air, are too large and heavy IMO. A
                | 10-11 inch tablet running a real OS would be ideal for
                | travel.
        
             | gorbypark wrote:
             | This is what I want, but with an iPhone (with an iPad would
             | be cool, too). Sell me some insanely expensive dock with a
             | USB-C display port output for a monitor (and a few more for
             | peripherals) and when the phone is plugged in, it becomes
             | macOS.
        
           | Nevermark wrote:
           | Because it would be so great you couldn't help using it? /h
           | 
            | What would be the downside to others using it?
           | 
           | I get frustrated that Mac doesn't respond to look & pinch!
        
             | vbezhenar wrote:
             | Because I saw how this transformed Windows and GNOME.
             | Applications will be reworked with touch support and become
             | worse for me.
        
           | jonhohle wrote:
            | Why would you need it? Modern iPads have Thunderbolt ports
           | (minimally USB-C) and already allow keyboards, network
           | adapters, etc. to be connected. It would be like an iMac
           | without the stand and an option to put it in a keyboard
           | enclosure. Sounds awesome.
        
           | pquki4 wrote:
            | That's your argument for not allowing others to use macOS on
           | an iPad?
        
         | tcfunk wrote:
          | I'd settle for some version of Xcode, or some other way of not
         | requiring a macOS machine to ship iOS apps.
        
           | JimDabell wrote:
           | Swift Playgrounds, which runs on the iPad, can already be
           | used to build an app and deploy it to the App Store without a
           | Mac.
        
             | alexpc201 wrote:
              | You can't make a decent iOS app with Swift Playgrounds;
              | it's just a toy for kids to learn to code.
        
               | interpol_p wrote:
               | You're probably correct about it being hard to make a
               | decent iOS app in Swift Playgrounds, but it's definitely
               | not a toy
               | 
               | I use it for work several times per week. I often want to
               | test out some Swift API, or build something in SwiftUI,
               | and for some reason it's way faster to tap it out on my
               | iPad in Swift Playgrounds than to create a new project or
               | playground in Xcode on my Mac -- even when I'm sitting
               | directly in front of my Mac
               | 
                | The iPad just doesn't have the clutter of windows and
                | open communication apps that my Mac does, which makes it
                | hard to focus on resolving one particular idea
               | 
               | I have so many playground files on my iPad, a quick
               | glance at my project list: interactive gesture-driven
               | animations, testing out time and date logic, rendering
               | perceptual gradients, checking baseline alignment in SF
               | Symbols, messing with NSFilePresenter, mocking out a UI
               | design, animated text transitions, etc
        
           | hot_gril wrote:
           | It needs a regular web browser too.
        
         | havaloc wrote:
         | Maybe this June there'll be an announcement, but like Lucy with
         | the football, I'm not expecting it. I would instabuy if this
         | was the case, especially with a cellular iPad.
        
         | swozey wrote:
         | Yeah, I bought one of them a few years ago planning to use it
         | for a ton of things.
         | 
         | Turns out I only use it on flights to watch movies because I
          | loathe the OS.
        
         | umanwizard wrote:
            | They make a version of the iPad that runs macOS; it's called
            | a MacBook Pro.
        
           | zuminator wrote:
           | MacBook Pros have touchscreens and Apple Pencil compatibility
           | now?
        
             | umanwizard wrote:
             | Fair enough, I was being a bit flippant. It'd be nice if
             | that existed, but I suspect Apple doesn't want it to for
             | market segmentation reasons.
        
               | kstrauser wrote:
               | I just went to store.apple.com and specced out a 13" iPad
               | Pro with 2TB of storage, nano-texture glass, and a cell
               | modem for $2,599.
               | 
               | MacBook Pros start at $1,599. There's an enormous overlap
               | in the price ranges of the mortal-person models of those
               | products. It's not like the iPad Pro is the cheap
               | alternative to a MBP. I mean, I couldn't even spec out a
               | MacBook Air to cost as much.
        
         | ranyefet wrote:
         | Just give us support for virtualization and we could install
         | Linux on it and use it for development.
        
           | LeoPanthera wrote:
           | UTM can be built for iOS.
        
       | tosh wrote:
       | Did they mention anything about RAM?
        
         | tosh wrote:
         | > faster memory bandwidth
        
           | zamadatix wrote:
           | The announcement video also highlighted "120 GB/s unified
           | memory bandwidth". 8 GB/16 GB depending on model.
        
         | dmitshur wrote:
         | I don't think they included it in the video, but
         | https://www.apple.com/ipad-pro/specs/ says it's 8 GB of RAM in
         | 256/512 GB models, 16 GB RAM in 1/2 TB ones.
        
       | Takennickname wrote:
       | Why even have an event at this point? There's literally nothing
       | interesting.
        
         | antipaul wrote:
         | Video showing Apple Pencil Pro features was pretty sick, and I
         | ain't even an artist
        
           | Takennickname wrote:
           | The highlight of the event was a stylus?
        
           | gardaani wrote:
           | I think they are over-engineering it. I have never liked
           | gestures because it's difficult to discover something you
           | can't see. A button would have been better than an invisible
           | squeeze gesture.
        
             | swozey wrote:
              | I used Android phones forever until the iPhone 13 came out
              | and I switched to iOS because I had to de-Google my life
             | completely after they (for no reason at all, "fraud" that I
             | did not commit) blocked my Google Play account.
             | 
             | The amount of things I have to google to use the phone how
             | I normally used Android is crazy. So many gestures required
             | with NOTHING telling you how to use them.
             | 
             | I recently sat around a table with 5 of my friends trying
             | to figure out how to do that "Tap to share contact info"
              | thing. Nobody at the table, all long-term iOS users, knew
             | how to do it. I thought that if we tapped the phones
             | together it would give me some popup on how to finish the
              | process. We tried all sorts of taps and phone positions
              | until we realized we had to unlock both phones.
             | 
             | And one of the people there with the same phone as me (13
             | pro) couldn't get it to work at all. It just did nothing.
             | 
             | And the keyboard. My god is the keyboard awful. I have
             | never typoed so much, and I have _no_ idea how to copy a
             | URL out of Safari to send to someone without using the
              | annoying Share button, which doesn't even have the app I
             | share to the most without clicking the More.. button to
             | show all my apps. Holding my finger over the URL doesn't
             | give me a copy option or anything, and changing the URL
             | with their highlight/delete system is terrible. I get so
             | frustrated with it and mostly just give up. The cursor
             | NEVER lands where I want it to land and almost always
             | highlights an entire word when I want to make a one letter
             | typo fix. I don't have big fingers at all. Changing a long
             | URL that goes past the length of the Safari address bar is
             | a nightmare.
             | 
             | I'm sure (maybe?) that's some option I need to change but I
             | don't even feel like looking into it anymore. I've given up
              | on learning about the phone's hidden gestures and just use
             | it probably 1/10th of how I could.
             | 
              | CarPlay, Messages, and the easy-to-connect-devices
              | ecosystem are the only things keeping me on it.
        
               | Apocryphon wrote:
               | Sounds like the ideal use case for an AI assistant. Siri
               | ought to tell you how to access hidden features on the
               | device. iOS, assist thyself.
        
               | antipaul wrote:
               | To copy URL from Safari, press and hold (long-press) on
               | URL bar
               | 
               | Press and hold also allows options elsewhere in iOS
        
               | swozey wrote:
               | Oh, I see what I'm doing wrong now. If I hold it down it
               | highlights it and gives me the "edit letter/line" thing,
               | and then if I let go I get the Copy option. I guess in
               | the past I've seen it highlight the word and just stopped
               | before it got to that point.
               | 
               | Thanks!
        
         | throwaway11460 wrote:
         | 2x better performance per watt is not interesting? Wow, what a
         | time to be alive.
        
           | LoganDark wrote:
           | To me, cutting wattage in half is not interesting, but
           | doubling performance is interesting. So performance per watt
           | is actually a pretty useless metric since it doesn't
           | differentiate between the two.
           | 
            | Of course efficiency matters for a battery-powered device,
           | but I still tend to lean towards raw power over all else.
           | Others may choose differently, which is why other metrics
           | exist I guess.
        
             | throwaway11460 wrote:
             | This still means you can pack more performance into the
             | chip though - because you're limited by cooling.
        
               | LoganDark wrote:
               | Huh, never considered cooling. I suppose that contributes
               | to the device's incredible thinness. Generally thin-and-
               | light has always been an incredible turnoff for me, but
               | tech is finally starting to catch up to thicker devices.
        
               | hedora wrote:
               | Thin and light is easier to cool. The entire device is a
               | big heat sink fin. Put another way, as the device gets
               | thinner, the ratio of surface area to volume goes to
               | infinity.
               | 
               | If you want to go thicker, then you have to screw around
               | with heat pipes, fans, etc, etc, to move the heat a few
               | cm to the outside surface of the device.
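                | 
                | A quick back-of-the-envelope sketch of that ratio (a
                | slab-shaped device with made-up dimensions, purely
                | illustrative):
                | 
                |     # Python: surface-area-to-volume ratio of a
                |     # w x d x t slab as thickness t shrinks
                |     def sa_over_v(w, d, t):
                |         surface = 2 * (w * d + w * t + d * t)
                |         volume = w * d * t
                |         return surface / volume
                | 
                |     for t in (10, 5, 2.5):  # thickness in mm
                |         print(t, round(sa_over_v(250, 180, t), 2))
                |     # 10 -> 0.22, 5 -> 0.42, 2.5 -> 0.82: the ratio
                |     # roughly doubles each time thickness halves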
        
               | LoganDark wrote:
               | That's not why thin-and-light bothers me. Historically,
               | ultrabooks and similarly thin-and-light focused devices
               | have been utterly insufferable in terms of performance
               | compared to something that's even a single cm thicker.
                | But Apple Silicon seems extremely promising; it seems
               | quite competitive with thicker and heavier devices.
               | 
               | I never understood why everyone [looking at PC laptop
               | manufacturers] took thin-and-light to such an extreme
               | that their machines became basically useless. Now Apple
               | is releasing thin-and-light machines that are incredibly
                | powerful, and that is genuinely innovative. I haven't
                | seen something like that from them since the launch of
                | the original iPhone; that's how big I think this is.
        
             | throwaway5959 wrote:
             | It means a lot to me, because cutting power consumption in
             | half for millions of devices means we can turn off power
             | plants (in aggregate). It's the same as lightbulbs; I'll
             | never understand why people bragged about how much power
             | they were wasting with incandescents.
        
               | yyyk wrote:
               | >cutting power consumption in half for millions of
               | devices means we can turn off power plants
               | 
                | It is well known that software inefficiency doubles
                | every couple of years; that is, the same scenario would
                | take 2x as much compute, given the entire software stack
                | (not a disembodied algorithm, which will indeed be
                | faster).
               | 
               | The extra compute will be spent on a more abstract UI
               | stack or on new features, unless forced by physical
               | constraints (e.g. inefficient batteries of early
                | smartphones), which is not the case at present.
        
               | throwaway11460 wrote:
               | That's weird - if software gets 2x worse every time
               | hardware gets 2x better, why did my laptop in 2010 last 2
               | hours on battery while the current one lasts 16 doing
               | _much_ more complex tasks for me?
        
               | yyyk wrote:
                | Elsewhere in the comments, it is noted that Apple's own
                | battery-life estimates are identical despite allegedly
                | 2x better hardware.
               | 
                | Aside: 2 hours is very low even for 2010. There's a
                | strong usability advantage in going to 16. But going
               | from 16 to 128 won't add as much. The natural course of
               | things is to converge on a decent enough number and
               | 'spend' the rest on more complex software, a lighter
               | laptop etc.
        
               | Nevermark wrote:
               | They like bright lights?
               | 
               | I have dimmable LED strips around my rooms, hidden by
               | cove molding, reflecting off the whole ceiling, which
               | becomes a super diffuse, super bright "light".
               | 
               | I don't boast about power use, but they are certainly
               | hungry.
               | 
                | For that I get softly diffuse lighting with a max
               | brightness comparable to outdoor clear sky daylight.
               | Working from home, this is so nice for my brain and
               | depression.
        
               | codedokode wrote:
                | First, only the CPU's power consumption is reduced, not
                | that of other components. Second, I doubt tablets
                | contribute significantly to global power consumption, so
                | I think no power plants will be turned off.
        
           | Takennickname wrote:
           | That's bullshit. Does that mean they could have doubled
           | battery life if they kept the performance the same?
           | Impossible.
        
             | throwaway11460 wrote:
             | Impossible why? That's what happened with M1 too.
             | 
             | But as someone else noted, CPU power draw is not the only
             | factor in device battery life. A major one, but not the
             | whole story.
        
               | Takennickname wrote:
               | Intel to M1 is an entire architectural switch where even
               | old software couldn't be run and had to be emulated.
               | 
               | This is a small generational upgrade that doesn't
               | necessitate an event.
               | 
                | Other companies started having events like this because
                | they were copying Apple's amazing events. Apple's events
                | now are just parodies of what Apple was.
        
               | echoangle wrote:
               | You know the main point of the event was the release of
               | new iPads, right?
        
       | praseodym wrote:
       | "With these improvements to the CPU and GPU, M4 maintains Apple
       | silicon's industry-leading performance per watt. M4 can deliver
       | the same performance as M2 using just half the power. And
       | compared with the latest PC chip in a thin and light laptop, M4
       | can deliver the same performance using just a fourth of the
       | power."
       | 
       | That's an incredible improvement in just a few years. I wonder
       | how much of that is Apple engineering and how much is TSMC
       | improving their 3nm process.
        
         | cs702 wrote:
         | Potentially > 2x greater battery life for the same amount of
         | compute!
         | 
         | That _is_ pretty crazy.
         | 
         | Or am I missing something?
        
           | krzyk wrote:
           | Wait a bit. M2 wasn't as good as the hype was.
        
             | modeless wrote:
             | That's because M2 was on the same TSMC process generation
             | as M1. TSMC is the real hero here. M4 is the same
             | generation as M3, which is why Apple's marketing here is
             | comparing M4 vs M2 instead of M3.
        
               | sys_64738 wrote:
               | I thought M3 and M4 were different processes though.
               | Higher yield for the latter or such.
        
               | jonathannorris wrote:
                | Actually, M4 is reportedly on the more cost-efficient
                | TSMC N3E node, whereas Apple was apparently the only
                | customer on the more expensive TSMC N3B node; I'd expect
                | Apple to
               | move away from M3 to M4 very quickly for all their
               | products.
               | 
               | https://www.trendforce.com/news/2024/05/06/news-
               | apple-m4-inc....
        
               | modeless wrote:
               | Yeah and M2 was on N5P vs M1's N5, but it was still N5.
               | M4 is still N3.
        
               | geodel wrote:
                | And why are other PC vendors not latching on to the hero?
        
               | mixmastamyk wrote:
               | Apple often buys their entire capacity (of a process) for
               | quite a while.
        
               | modeless wrote:
               | Apple pays TSMC for exclusivity on new processes for a
               | period of time.
        
               | mensetmanusman wrote:
                | Saying TSMC is a hero ignores the thousands of suppliers
                | that improved everything required for TSMC to operate.
                | TSMC is the biggest, so they get the most experience on
                | all the new toys the world's engineers and scientists are
                | building.
        
               | whynotminot wrote:
               | It's almost as if every part of the stack -- from the
               | uArch that Apple designs down to the insane machinery
               | from ASML, to the fully finished SoC delivered by TSMC --
               | is vitally important to creating a successful product.
               | 
               | But people like to assign credit solely to certain spaces
                | if it suits their narrative (lately, _Apple isn't
                | actually all that special at designing their chips, it's
               | all solely the process advantage_)
        
               | modeless wrote:
               | Saying TSMC's success is due to their suppliers ignores
               | the fact that all of their competitors failed to keep up
               | despite having access to the same suppliers. TSMC
               | couldn't do it without ASML, but Intel and Samsung failed
               | to do it even with ASML.
               | 
               | In contrast, when Apple's CPU and GPU competitors get
               | access to TSMC's new processes after Apple's exclusivity
               | period expires, they achieve similar levels of
               | performance (except for Qualcomm because they don't
               | target the high end of CPU performance, but AMD does).
        
               | mensetmanusman wrote:
                | TSMC being the biggest let them experiment at 10x the
                | rate. It turns out they had the right business model
                | that Intel didn't notice was there; it just requires
                | dramatically lower margins, higher volumes, and far
                | lower-paid engineers.
        
           | eqvinox wrote:
           | Sadly, this is only processor power consumption, you need to
            | put power into a whole lot of other things to make a useful
           | computer... a display backlight and the system's RAM come to
           | mind as particular offenders.
        
             | cs702 wrote:
             | Thanks. That makes sense.
        
             | treesciencebot wrote:
              | The backlight is now the main bottleneck for
              | consumption-heavy uses. I wonder what advancements are
              | happening there to optimize the wattage.
        
               | sangnoir wrote:
               | Is the iPad Pro not yet on OLED? All of Samsung's
                | flagship tablets have had OLED screens for well over a
                | decade now. It eliminates the need for backlighting, has
                | superior contrast, and is pleasant to use in low-light
                | conditions.
        
               | kbolino wrote:
               | I'm not sure how OLED and backlit LCD compare power-wise
                | exactly, but OLED screens still need to put out a lot of
               | light, they just do it directly instead of with a
               | backlight.
        
               | callalex wrote:
               | The iPad that came out today finally made the switch.
               | iPhones made the switch around 2016. It does seem odd how
               | long it took for the iPad to switch, but Samsung
               | definitely switched too early: my Galaxy Tab 2 suffered
                | from screen burn-in that I was never able to recover
               | from.
        
               | sangnoir wrote:
                | LineageOS has an elegant solution for OLED burn-in:
                | imperceptibly shift persistent UI elements by a few
                | pixels over time
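                | 
                | A minimal sketch of that idea (illustrative only, not
                | LineageOS's actual code):
                | 
                |     # Python: wander static UI elements around a small
                |     # grid of offsets so no pixel shows the same
                |     # content indefinitely
                |     def burn_in_offset(minutes_on, max_shift=2):
                |         span = 2 * max_shift + 1
                |         step = minutes_on // 30  # move every 30 min
                |         dx = step % span - max_shift
                |         dy = (step // span) % span - max_shift
                |         return dx, dy
                | 
                |     print(burn_in_offset(0))   # (-2, -2)
                |     print(burn_in_offset(90))  # (1, -2)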
        
               | devsda wrote:
                | If the use cases involve working on dark terminals all
                | day or watching movies with dark scenes, or if the
                | general theme is dark, maybe the new OLED display will
                | help reduce the display power consumption too.
        
               | whereismyacc wrote:
                | QD-OLED reduces it by like 25% I think? But maybe that
               | will never be in laptops, I'm not sure.
        
               | eqvinox wrote:
               | QD-OLED is an engineering improvement, i.e. combining
                | existing researched technology to improve the resulting
               | product. I wasn't able to find a good source on what
               | exactly it improves in efficiency, but it's not a
               | fundamental improvement in OLED electrical-optical energy
               | conversion (if my understanding is correct.)
               | 
               | In general, OLED screens seem to have an efficiency
                | around 20-30%. Some research departments seem to be
               | trying to bump that up
               | [https://www.nature.com/articles/s41467-018-05671-x]
               | which I'd be more hopeful on...
               | 
               | ...but, honestly, at some point you just hit the limits
               | of physics. It seems internal scattering is already a
               | major problem; maybe someone can invent pixel-sized
               | microlasers and that'd help? More than 50-60% seems like
               | a pipe dream at this point...
               | 
               | ...unless we can change to a technology that
               | fundamentally doesn't emit light, i.e. e-paper and the
               | likes. Or just LCD displays without a backlight, using
               | ambient light instead.
        
               | neutronicus wrote:
               | Please give me an external ePaper display so I can just
               | use Spacemacs in a well-lit room!
        
               | mszcz wrote:
                | Onyx makes an HDMI 25" eInk display [0]. It's pricey.
                | 
                | [0] https://onyxboox.com/boox_mirapro
                | 
                | edit: 25", not 27"
        
               | vsuperpower2020 wrote:
               | I'm still waiting for the technology to advance. People
               | can't reasonably spend $1500 on the world's shittiest
               | computer monitor, even if it is on sale.
        
               | neutronicus wrote:
               | Dang, yeah, this is the opposite of what I had in mind
               | 
               | I was thinking, like, a couple hundred dollar Kindle the
               | size of a big iPad I can plug into a laptop for text-
               | editing out and about. Hell, for my purposes I'd love an
               | integrated keyboard.
               | 
               | Basically a second, super-lightweight laptop form-factor
               | I can just plug into my chonky Macbook Pro and set on top
               | of it in high-light environments when all I need to do is
               | edit text.
               | 
               | Honestly not a compelling business case now that I write
               | it out, but I just wanna code under a tree lol
        
               | eqvinox wrote:
               | A friend bought it & I had a chance to see it in action.
               | 
               | It is nice for some _very specific use cases_. (They 're
               | in the publishing/typesetting business. It's... idk,
               | really depends on your usage patterns.)
               | 
               | Other than that, yeah, the technology just isn't there
               | yet.
        
               | mholm wrote:
               | I think we're getting pretty close to this. The
                | Remarkable 2 tablet is $300, but it can't take video
                | input, and software support for non-notetaking is near
                | nonexistent. There's even a keyboard available. Boox and
               | Hisense are also making e-ink tablets/phones for
               | reasonable prices.
        
               | craftkiller wrote:
               | If that existed as a drop-in screen replacement on the
                | Framework laptop and with a high-refresh-rate color
                | Gallery 3 panel, then I'd buy it at that price point in
                | a heartbeat.
               | 
               | I can't replace my desktop monitor with eink because I
               | occasionally play video games. I can't use a 2nd monitor
               | because I live in a small apartment.
               | 
               | I can't replace my laptop screen with greyscale because I
               | need syntax highlighting for programming.
        
               | gumby wrote:
               | Maybe the $100 nano-texture screen will give you the
                | visibility you want. Not the low power of an epaper screen
               | though.
               | 
               | Hmm, emacs on an epaper screen might be great if it had
               | all the display update optimization and "slow modem mode"
               | that Emacs had back in the TECO days. (The SUPDUP network
               | protocol even implemented that at the client end and
               | interacted with Emacs directly!)
        
               | craftkiller wrote:
                | AMD GPUs have "Adaptive Backlight Management", which
                | reduces your screen's backlight but then tweaks the
                | colors to compensate. For example, my laptop's backlight
                | is set at 33%, but with ABM it is reduced to 8%.
                | Personally I don't even notice it is on / my screen seems
                | just as bright as before, but when I first enabled it I
                | did notice some slight difference in colors, so it's
                | probably not suitable for designers/artists. I'd 100%
                | recommend it for coders though.
        
               | ProfessorLayton wrote:
                | Strangely, Apple seems to be doing the opposite for some
                | reason (color accuracy?), as dimming the display doesn't
                | seem to reduce the backlight as much, and they're using a
                | combination of backlight and software dimming, even at
                | "max" brightness.
               | 
               | Evidence can be seen when opening up iOS apps, which seem
                | to glitch out and reveal the brighter backlight [1].
               | Notice how #FFFFFF white isn't the same brightness as the
               | white in the iOS app.
               | 
               | [1] https://imgur.com/a/cPqKivI
        
               | superb_dev wrote:
               | The max brightness of the desktop is gonna be lower than
               | the actual max brightness of the panel, because the panel
               | needs to support HDR content. That brightness would be
               | too much for most cases
        
               | ProfessorLayton wrote:
               | This was a photo of my MBA 15" which doesn't have an HDR
               | capable screen afaik. Additionally, this artifacting
               | happens at all brightness levels, including the lowest.
               | 
               | It also just doesn't seem ideal that some apps (iOS)
               | appear much brighter than the rest of the system. HDR
               | support in macOS is a complete mess, although I'm not
               | sure if Windows is any better.
        
             | naikrovek wrote:
             | that's still amazing, to me.
             | 
             | I don't expect an M4 macbook to last any longer than an M2
             | macbook of otherwise similar specs; they will spend that
             | extra power budget on things other than the battery life
             | specification.
        
           | Hamuko wrote:
           | Comparing the tech specs for the outgoing and new iPad Pro
           | models, that potential is very much not real.
           | 
           | Old: 28.65 Wh (11") / 40.88 Wh (13"), up to 10 hours of
           | surfing the web on Wi-Fi or watching video.
           | 
           | New: 31.29 Wh (11") / 38.99 Wh (13"), up to 10 hours of
           | surfing the web on Wi-Fi or watching video.
        
             | binary132 wrote:
             | Ok, but is it twice as fast during those 10 hours, leading
             | to 20 hours of effective websurfing? ;)
        
             | jeffbee wrote:
             | A more efficient CPU can't improve that spec because those
             | workloads use almost no CPU time and the display dominates
             | the energy consumption.
        
               | Hamuko wrote:
               | Unfortunately Apple only ever thinks about battery life
               | in terms of web surfing and video playback, so we don't
               | get official battery-life figures for anything else.
               | Perhaps you can get more battery life out of your iPad
               | Pro web surfing by using dark mode, since OLEDs should
               | use less power than IPS displays with darker content.
        
             | codedokode wrote:
                | Isn't it weird that the new chip consumes half the
                | power, but the battery life is the same?
        
               | masklinn wrote:
               | It's not weird when you consider that browsing the web or
               | watching videos has the CPU idle or near enough, so 95%
               | of the power draw is from the display and radios.
        
               | rsynnott wrote:
               | The OLED likely adds a fair bit of draw; they're
               | generally somewhat more power-hungry than LCDs these
               | days, assuming like-for-like brightness. Realistically,
               | this will be the case until MicroLEDs are available for
               | non-completely-silly money.
        
               | CoastalCoder wrote:
               | This surprises me. I thought the big power downside of
               | LCD displays is that they use filtering to turn unwanted
               | color channels into waste heat.
               | 
               | Knowing nothing else about the technology, I assumed that
               | would make OLED displays more efficient.
        
               | mensetmanusman wrote:
               | Can't beat the thermodynamics of exciton recombination.
               | 
               | https://pubs.acs.org/doi/10.1021/acsami.9b10823
        
               | sroussey wrote:
                | OLED will use less for a screen of black and LCD will
                | use less for a screen of white. Now, take the average
                | of whatever content is on your screen, and for you it
                | may be better or may be worse.
               | 
               | White background document editing, etc., will be worse,
               | and this is rather common.
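                | 
                | A toy model of that tradeoff (the wattage figures are
                | made up for illustration; real panels differ):
                | 
                |     # Python: OLED power scales with average picture
                |     # level (APL); LCD backlight power is roughly
                |     # constant regardless of content
                |     def oled_w(apl, full_white_w=6.0):
                |         return apl * full_white_w  # apl in [0, 1]
                | 
                |     def lcd_w(backlight_w=4.0):
                |         return backlight_w
                | 
                |     for apl in (0.1, 0.5, 0.9):
                |         print(apl, oled_w(apl), lcd_w())
                |     # OLED wins below ~0.67 APL with these numbers,
                |     # loses on mostly-white documents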
        
               | beeboobaa3 wrote:
                | No, they have a "battery budget". If the CPU power draw
                | goes down, that means the budget goes up and you can spend
               | it on other things, like a nicer display or some other
               | feature.
               | 
               | When you say "up to 10 hours" most people will think "oh
               | nice that's an entire day" and be fine with it. It's what
               | they're used to.
               | 
               | Turning that into 12 hours might be possible but are the
               | tradeoffs worth it? Will enough people buy the device
               | because of the +2 hour battery life? Can you market that
               | effectively? Or will putting in a nicer fancy display
               | cause more people to buy it?
               | 
               | We'll never get significant battery life improvements
               | because of this, sadly.
        
             | fvv wrote:
             | this
        
             | masklinn wrote:
             | Yeah double the PPW does not mean double the battery,
             | because unless you're pegging the CPU/SOC it's likely only
             | a small fraction of the power consumption of a light-use or
             | idle device, especially for an SOC which originates in
             | mobile devices.
             | 
             | Doing basic web navigation with some music in the
             | background, my old M1 Pro has short bursts at ~5W (for the
             | entire SoC) when navigating around, a pair of watts for
             | mild webapps (e.g. checking various channels in discord),
             | and typing into this here textbox it's sitting happy at
             | under half a watt, with the P-cores essentially sitting
             | idle and the E cores at under 50% utilisation.
             | 
             | With a 100Wh battery that would be a "potential" of 150
             | hours or so. Except nobody would ever sell it for that,
             | because between the display and radios the laptop's
             | actually pulling 10~11W.
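              | 
              | Plugging those rough numbers in makes the point concrete
              | (figures are the ones eyeballed above, nothing measured):
              | 
              |     # Python: battery life = capacity / total draw;
              |     # halving SoC power barely matters when display
              |     # and radios dominate
              |     battery_wh = 100
              |     soc_w, rest_w = 0.5, 10.5
              | 
              |     before = battery_wh / (soc_w + rest_w)
              |     after = battery_wh / (soc_w / 2 + rest_w)
              |     print(round(before, 1), round(after, 1))
              |     # 9.1 vs 9.3 hours: a ~2% gain from a 2x better SoC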
        
               | pxc wrote:
               | So this could be a bit helpful for heavier duty usage
               | while on battery.
        
               | tracker1 wrote:
               | On my M1 air, I find for casual use of about an hour or
               | so a day, I can literally go close to a couple weeks
               | without needing to recharge. Which to me is pretty
               | awesome. Mostly use my personal desktop when not on my
               | work laptop (docked m3 pro).
        
           | Dibby053 wrote:
           | 2x efficiency vs a 2 year old chip is more or less in line
           | with expectations (Koomey's law). [1]
           | 
           | [1] https://en.wikipedia.org/wiki/Koomey%27s_law
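            | 
            | A quick sanity check on that (2.6-year doubling from the
            | article above; the M2-to-M4 gap is roughly June 2022 to
            | May 2024):
            | 
            |     # Python: expected gain under Koomey's law
            |     doubling_years = 2.6
            |     gap_years = 1.9  # M2 launch to M4 launch
            |     print(round(2 ** (gap_years / doubling_years), 2))
            |     # ~1.66x expected; Apple claims 2x, slightly ahead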
        
           | BurningFrog wrote:
           | Is the CPU/GPU really dominating power consumption that much?
        
             | masklinn wrote:
             | Nah, GP is off their rocker. For the workloads in question
             | the SOC's power draw is a rounding error, low single-digit
             | percent.
        
         | onlyrealcuzzo wrote:
         | On one hand, it's crazy. On the other hand, it's pretty typical
         | for the industry.
         | 
         | Average performance per watt doubling time is 2.6 years:
         | https://newsroom.arm.com/blog/performance-per-
         | watt#:~:text=T....
        
           | praseodym wrote:
           | M2 was launched in June 2022 [1] so a little under 2 years
           | ago. Apple is a bit ahead of that 2.6 years, but not by much.
           | 
           | [1] https://www.apple.com/newsroom/2022/06/apple-
           | unveils-m2-with...
        
             | halgir wrote:
             | If they maintain that pace, it will start compounding
             | incredibly quickly. If we round to 2 years vs 2.5 years,
             | after just a decade you're an entire doubling ahead.
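              | 
              | The arithmetic checks out (using the rounded doubling
              | times from the comment above):
              | 
              |     # Python: compounding two doubling times for a decade
              |     years = 10
              |     apple = 2 ** (years / 2.0)     # 32x
              |     industry = 2 ** (years / 2.5)  # 16x
              |     print(apple / industry)        # 2.0: one doubling ahead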
        
           | luyu_wu wrote:
            | Note that performance per watt is 2x higher at both chips'
           | peak performance. This is in many ways an unfair comparison
           | for Apple to make.
        
           | faeriechangling wrote:
           | It's a shame performance per watt doesn't double every 2.6
           | years for modems and screens.
        
             | onlyrealcuzzo wrote:
             | Watts per pixel probably did something close for a long
             | time for screens.
             | 
             | Same for Watts per bit.
             | 
             | There's just a lot more pixels and bits.
        
         | 2OEH8eoCRo0 wrote:
         | > I wonder how much of that is Apple engineering and how much
         | is TSMC improving their 3nm process.
         | 
         | I think Apple's design choices had a huge impact on the M1's
         | performance but from there on out I think it's mostly due to
         | TSMC.
        
         | izacus wrote:
          | Apple usually massively exaggerates their tech spec
          | comparisons - is it REALLY half the power use at all times
          | (so we'll get double the battery life) or is it half the
          | power use in some scenarios (so we'll get like... 15% more
          | battery life total)?
        
           | alphakappa wrote:
           | Any product that uses this is more than just the chip, so you
           | cannot get a proportional change in battery life.
        
             | izacus wrote:
              | Sure, but I also remember them comparing the M1 chip to
              | the RTX 3090, and my MacBook M1 Pro doesn't really run
              | games well.
             | 
             | So I've become really suspicious about any claims about
             | performance done by Apple.
        
               | servus45678981 wrote:
                | That is the fault of the devs, because optimization for
                | dedicated graphics cards is either integrated in the
                | game engine or they just have a version for RTX users.
        
               | filleduchaos wrote:
               | I mean, I remember Apple comparing the M1 _Ultra_ to
                | Nvidia's RTX 3090. While that chart was definitely
               | putting a spin on things to say the least, and we can
               | argue from now until tomorrow about whether power
               | consumption should or should not be equalised, I have no
               | idea why anyone would expect the M1 _Pro_ (an explicitly
               | much weaker chip) to perform anywhere near the same.
               | 
               | Also what games are you trying to play on it? All my
                | M-series MacBooks have run games more than well enough
               | with reasonable settings (and that has a lot more to do
               | with OS bugs and the constraints of the form factor than
               | with just the chipset).
        
               | achandlerwhite wrote:
               | They compared them in terms of perf/watt, which did hold
               | up, but obviously implied higher performance overall.
        
           | illusive4080 wrote:
           | From their specs page, battery life is unchanged. I think
           | they donated the chip power savings to offset the increased
           | consumption of the tandem OLED
        
             | xattt wrote:
              | I've not seen discussion of how Apple likely scales the
              | performance of its chips to match the use profile of the
              | specific device they're used in. An M2 in an iPad Air is
              | very likely not the same as an M2 in an MBP or Mac Studio.
        
               | mananaysiempre wrote:
               | A Ryzen 7840U in a gaming handheld is not (configured)
               | the same as a Ryzen 7840U in a laptop, for that matter,
               | so Apple is hardly unique here.
        
               | jakjak123 wrote:
                | The manufacturer often targets a TDP that is reasonable
                | for thermals and battery life, but the CPU package is
                | often the same.
        
               | seec wrote:
               | Yeah, but the difference is that you usually don't get
               | people arguing that it's the same thing or that it can be
               | performance competitive in the long run. When it comes to
               | Apple stuff, people say some irrational stuff that is
               | totally bonkers...
        
               | refulgentis wrote:
               | Surprisingly, I think it is: I was going to comment that
               | here, then checked Geekbench, single core scores match
               | for M2 iPad/MacBook Pro/etc. at same clock speed. i.e. M2
               | "base" = M2 "base", but core count differs, and with the
               | desktops/laptops, you get options for M2 Ultra Max SE bla
               | bla.
        
               | joakleaf wrote:
                | The GeekBench [1,2] benchmarks for M2 are:
                | 
                | Single core:
                |   iPad Pro (M2): 2539
                |   MacBook Air (M2): 2596
                |   MacBook Pro (M2): 2645
                | 
                | Multi core:
                |   iPad Pro (M2 8-core): 9631
                |   MacBook Air (M2 8-core): 9654
                |   MacBook Pro (M2 8-core): 9642
                | 
                | So, it appears to be almost the same performance (until
                | it throttles due to heat, of course).
                | 
                | 1. https://browser.geekbench.com/ios-benchmarks
                | 2. https://browser.geekbench.com/mac-benchmarks
        
             | jurmous wrote:
             | Likely there is also a smaller battery as the iPad Pro is
             | quite a bit thinner
        
           | aledalgrande wrote:
            | Well, battery life would be affected by other things too,
            | right?
           | Especially by that double OLED screen. "best ever" in every
           | keynote makes me laugh at this point, but it doesn't mean
           | that they're not improving their power envelope.
        
           | philistine wrote:
           | Quickly looking at the press release, it seems to have the
           | same comparisons as in the video. None of Apple's comparisons
           | today are between the M3 and M4. They are ALL comparing the
           | M2 and M4. Why? It's frustrating, but today Apple replaced a
           | product with an M2 with a product with an M4. Apple always
           | compares product to product, never component to component
           | when it comes to processors. So those specs are far more
           | impressive than if we could have numbers between the M3 and
           | M4.
        
             | jorvi wrote:
              | Didn't they do extreme cherry-picking for their tests so
              | they could show the M1 beating a 3090 (or M2 a 4090, I
              | can't remember)?
             | 
             | Gave me quite a laugh when Apple users started to claim
             | they'd be able to play Cyberpunk 2077 maxed out with maxed
             | out raytracing.
        
               | philistine wrote:
               | I'll give you that Apple's comparisons are sometimes
               | inscrutable. I vividly remember that one.
               | 
               | https://www.theverge.com/2022/3/17/22982915/apple-m1-ultr
               | a-r...
               | 
               | Apple was comparing the power envelope (already a
               | complicated concept) of their GPU against a 3090. Apple
               | wanted to show that the peak of their GPU's performance
               | was reached with a fraction of the power of a 3090. What
               | was terrible was that Apple was cropping their chart at
               | the point where the 3090 was pulling ahead in pure
               | compute by throwing more watts at the problem. So their
               | GPU was not as powerful as a 3090, but a quick glance at
               | the chart would completely tell you otherwise.
               | 
               | Ultimately we didn't see one of those charts today, just
               | a mention about the GPU being 50% more efficient than the
               | competition. I think those charts are beloved by Johny
               | Srouji and no one else. They're not getting the message
               | across.
        
               | izacus wrote:
               | Plenty of people on HN thought that M1 GPU is as powerful
               | as 3090 GPU, so I think the message worked very well for
               | Apple.
               | 
                | They really love those kinds of comparisons - e.g. they
               | also compared M1s against really old Intel CPUs to make
               | the numbers look better, knowing that news headlines
               | won't care for details.
        
               | philistine wrote:
                | They compared against really old Intel CPUs because those
               | were the last ones they used in their own computers!
               | Apple likes to compare device to device, not component to
               | component.
        
               | oblio wrote:
               | You say that like it's not a marketing gimmick meant to
               | mislead and obscure facts.
               | 
               | It's not some virtue that causes them to do this.
        
               | threeseed wrote:
               | It's funny because your comment is meant to mislead and
               | obscure facts.
               | 
               | Apple compared against Intel to encourage their previous
               | customers to upgrade.
               | 
               | There is nothing insidious about this and is in fact
               | standard business practice.
        
               | oblio wrote:
               | Apple's the ONLY tech company that doesn't compare
               | products to their competitors.
               | 
               | The intensity of the reality distortion field and hubris
               | is mind boggling.
               | 
               | Turns out, you fell for it.
        
               | izacus wrote:
               | No, they compared it because it made them look way better
               | for naive people. They have no qualms comparing to other
               | competition when it suits them.
               | 
                | Your explanation is a really baffling case of corporate
               | white knighting.
        
               | w0m wrote:
               | > not component to component
               | 
               | that's honestly kind of stupid when discussing things
               | like 'new CPU!' like this thread.
               | 
               | I'm not saying the M4 isn't a great platform, but holy
               | cow the corporate tripe people gobble up.
        
               | refulgentis wrote:
               | Yes, can't remember the precise combo either, there was a
               | solid year or two of latent misunderstandings.
               | 
               | I eventually made a visual showing it was the same as
               | claiming your iPhone was 3x the speed of a Core i9: Sure,
               | if you limit the power draw of your PC to a battery the
               | size of a post it pad.
               | 
               | Similar issues when on-device LLMs happened, thankfully,
               | quieted since then (last egregious thing I saw was stonk-
               | related wishcasting that Apple was obviously turning its
               | Xcode CI service into a full-blown AWS competitor that'd
               | wipe the floor with any cloud service, given the 2x
               | performance)
        
             | homarp wrote:
              | Because the previous iPad was M2. So 'remember how fast
              | your previous iPad was'; well, this one is N times better.
        
             | kiba wrote:
              | I like the comparison of much older hardware with brand
              | new to highlight how far we've come.
        
               | chipdart wrote:
                | > I like the comparison of much older hardware with
                | brand new to highlight how far we've come.
               | 
               | That's ok, but why skip the previous iteration then?
               | Isn't the M2 only two generations behind? It's not that
               | much older. It's also a marketing blurb, not a
               | reproducible benchmark. Why leave out comparisons with
               | the previous iteration even when you're just hand-waving
               | over your own data?
        
               | FumblingBear wrote:
                | In this specific case, it's because iPads never got the
                | M3. They're literally comparing it with the previous
                | model of iPad.
               | 
               | There were some disingenuous comparisons throughout the
               | presentation going back to A11 for the first Neural
               | Engine and some comparisons to M1, but the M2 comparison
               | actually makes sense.
        
               | philistine wrote:
               | I wouldn't call the comparison to A11 disingenuous, they
               | were very clear they were talking about how far their
               | neural engines have come, in the context of the
               | competition just starting to put NPUs in their stuff.
               | 
               | I mean, they compared the new iPad Pro to an iPod Nano,
               | that's just using your own history to make a point.
        
               | FumblingBear wrote:
               | Fair point--I just get a little annoyed when the
                | marketing speak confuses the average consumer, and I felt
                | as though some of the jargon they used could trip up less
                | informed customers.
        
             | yieldcrv wrote:
              | Personally, I think this is the comparison most people
              | want. The M3 had a lot of compromises compared to the M2.
              | 
              | That aside, the M4 is about the Neural Engine upgrades
              | above anything else (which probably should have been
              | compared to the M3)
        
               | dakiol wrote:
               | What are such compromises? I may buy an M3 mbp, so would
               | like to hear more
        
               | fh9302 wrote:
                | The M3 Pro had some downgrades compared to the M2 Pro:
                | fewer performance cores and lower memory bandwidth. This
                | did not apply to the M3 and M3 Max.
        
             | sod wrote:
              | Yes, kinda annoying. But on the other hand, given that
              | Apple releases a new chip every 12 months, we can grant
              | them some slack here, since from AMD, Intel, or Nvidia we
              | usually see a 2-year cadence.
        
               | dartos wrote:
               | There are probably easier problems to solve in the ARM
               | space than in x86, considering the amount of money and
               | time already spent on x86.
               | 
               | That's not to say that any of these problems are easy,
               | just that there's probably more low-hanging fruit in
               | ARM land.
        
               | kimixa wrote:
               | And yet they seem to be the only people picking the
               | apparently "low-hanging fruit" in ARM land. We'll see
               | about Qualcomm's Nuvia-based stuff, but that's been
               | "nearly released" for what feels like years now, and
               | you still can't buy one to actually test.
               | 
               | And don't underestimate the investment Apple made - it's
               | likely at a similar level to the big x86 incumbents. I
               | mean AMD's entire Zen development team cost was likely a
               | blip on the balance sheet for Apple.
        
               | re-thc wrote:
               | > We'll see about Qualcomm's Nuvia-based stuff, but
               | that's been "nearly released" for what feels like years
               | now, but you still can't buy one to actually test.
               | 
               | That's more bound by legal than technical reasons...
        
               | transpute wrote:
               | _> Qualcomm's Nuvia-based stuff, but that's been "nearly
               | released" for what feels like years now_
               | 
               | Launching at Computex in 2 weeks,
               | https://www.windowscentral.com/hardware/laptops/next-gen-
               | ai-...
        
               | 0x457 wrote:
               | Good to know that it's finally seeing the light of
               | day. I thought they were still in a legal dispute with
               | ARM about Nuvia's design?
        
               | transpute wrote:
               | Not privy to details, but some legal disputes can be
               | resolved by licensing price negotiations, motivated by
               | customer launch deadlines.
        
               | paulmd wrote:
               | speaking of which, whatever happened to qualcomm's
               | bizarre assertion that ARM was pulling a _sneak move_ in
               | all its new licensing deals to outlaw third-party IP
               | entirely and force ARM-IP-only?
               | 
               | there was one quiet "we haven't got anything like that in
               | the contract we're signing with ARM" from someone else,
               | and then radio silence. And you'd _really think_ that
               | would be major news, because it's massively impactful on
               | pretty much everyone, since one of the major use-cases of
               | ARM is as a base SOC to bolt your custom proprietary
               | accelerators onto...
               | 
               | seemed like obvious bullshit at the time from a company
               | trying to "publicly renegotiate" a licensing agreement
               | they probably broke...
        
               | dartos wrote:
               | Again, not saying that they are easy (or cheap!) problems
               | to solve, but that there are more relatively easy
               | problems in the ARM space than the x86 space.
               | 
               | That's why Apple can release a meaningfully new chip
               | every year, where it takes x86 manufacturers several.
        
               | blackoil wrote:
               | Maybe for GPUs, but for CPUs both Intel and AMD
               | release on a yearly cadence. Even when Intel has
               | nothing new to release, the generation is bumped.
        
             | MBCook wrote:
             | It's an iPad event and there were no M3 iPads.
             | 
             | That's all. They're trying to convince iPad users to
             | upgrade.
             | 
             | We'll see what they do when they get to computers later
             | this year.
        
               | epolanski wrote:
               | I have a Samsung Galaxy Tab S7 FE, and I can't think
               | of any use case where I'd need more power.
               | 
               | I agree that the iPad has more interesting software
               | than Android for use cases like video or music
               | editing, but I don't do those on a tablet anyway.
               | 
               | I just can't imagine anyone upgrading their M2 iPad
               | for this except a tiny niche that really wants that
               | extra power.
        
               | MBCook wrote:
               | The A series was good enough.
               | 
               | I'm vaguely considering this, but entirely for the
               | screen. The chip has been irrelevant to me for years;
               | it's long past the point where I'd notice it.
        
               | nomel wrote:
               | The A series was definitely not good enough. It really
               | depends on what you're using it for. Netflix and web?
               | Sure. But any old HDR tablet that can maintain 24Hz is
               | good enough for that.
               | 
               | These have 2048x2732, 120Hz displays and support 6K
               | external displays. Gaming and art apps push them
               | pretty hard. For the iPad user in my house, going from
               | the 2020 non-M* iPad to a 2023 M2 iPad made a _huge_
               | difference in the drawing apps. Better latency is
               | always better for drawing, and complex brushes
               | (especially newer ones), selections, etc., would get
               | fairly unusable.
               | 
               | For gaming, it was pretty trivial to dip well below
               | 60Hz on a non-M* iPad with some of the more demanding
               | games like Fortnite, Minecraft (high view distance),
               | Roblox (it ain't what it used to be), etc.
               | 
               | But, the apps will always gravitate to the performance of
               | the average user. A step function in performance won't
               | show up in the apps until the adoption follows, years
               | down the line. Not pushing the average to higher
               | performance is how you stagnate the future software of
               | the devices.
        
               | MBCook wrote:
               | You're right, it's good enough _for me_. That's what I
               | meant but I didn't make that clear at all. I suspect a
               | ton of people are in a similar position.
               | 
               | I just don't push it at all. The few games I play are not
               | complicated in graphics or CPU needs. I don't draw, 3D
               | model, use Logic or Final Cut or anything like that.
               | 
               | I agree the extra power is useful to some people. But
               | even there we have the M1 (what I've got) and the M2
               | models. But I bet there are plenty of people like me who
               | mostly bought the pro models for the better screen and
               | not the additional grunt.
        
               | placeholderTest wrote:
               | The AX series, which is what iPads were using before the
               | M series, were precisely the chip family that got
               | rebranded as the M1, M2, etc.
               | 
               | The iPads always had a lot of power; people simply
               | started paying more attention when the chip family was
               | brought to the Mac.
        
               | MBCook wrote:
               | Yeah. I was just using the A to M chip name transition as
               | an easy landmark to compare against.
        
               | r00fus wrote:
               | AI on the device may be the real reason for an M4.
        
               | MBCook wrote:
               | Previous iPads have had that for a long time. Since the
               | A12 in 2018. The phones had it even earlier with the A11.
               | 
               | Sure, this one is faster, but is that enough to make
               | people care?
               | 
               | It may depend heavily on what they announce is in the
               | next version of iOS/iPadOS.
        
               | r00fus wrote:
               | That's my point - if there's a real on-device LLM it may
               | be much more usable with the latest chip.
        
               | grujicd wrote:
               | I don't know who would prefer to do music or video
               | editing on a smaller display, without a keyboard for
               | shortcuts, without a proper file system, and with
               | problematic connectivity to external hardware. Sure,
               | it's possible, but why? OK, maybe there's some use
               | case on the road where every gram counts, but that
               | seems niche.
        
             | mlsu wrote:
             | They know that anyone who has bought an M3 is good on
             | computers for a long while. They're targeting people who
             | have M2 or older Macs. People who own an M3 are
             | basically going to buy anything that comes down the
             | pipe, because who needs an M3 over an M2 or even an M1
             | today?
        
               | abnercoimbre wrote:
               | I'm starting to worry that I'm missing out on some huge
               | gains (M1 Air user.) But as a programmer who's not making
               | games or anything intensive, I think I'm still good for
               | another year or two?
        
               | richiebful1 wrote:
               | I have an M1 Air and I test drove a friend's recent M3
               | Air. It's not very different performance-wise for what I
               | do (programming, watching video, editing small memory-
               | constrained GIS models, etc)
        
               | mlsu wrote:
               | I wanted to upgrade my M1 because it would swap a lot
               | with only 8 gigs of RAM, and because I wanted a
               | machine that could run big LLMs locally. I ended up
               | going from an 8G MacBook Air M1 to a 64G MacBook Pro
               | M1. My other reasoning was that it would speed up
               | compilation, which it has, but not by too much.
               | 
               | The M1 air is a very fast machine and is perfect for
               | anyone doing normal things on the computer.
        
               | giantrobot wrote:
               | You're not going to be missing out on much. I had the
               | first M1 Air and recently upgraded to an M3 Air. The M1
               | Air has years of useful life left and my upgrade was for
               | reasons not performance related.
               | 
               | The M3 Air performs better than the M1 in raw numbers but
               | outside of some truly CPU or GPU limited tasks you're not
               | likely to actually notice the difference. The day to day
               | behavior between the two is pretty similar.
               | 
               | If your current M1 works you're not missing out on
               | anything. For the power/size/battery envelope the M1 Air
               | was pretty awesome, it hasn't really gotten any worse
               | over time. If it does what you need then you're good
               | until it doesn't do what you need.
        
               | windowsrookie wrote:
               | I have a 2018 15" MBP and an M1 Air, and honestly they
               | both perform about the same. The only noticeable
               | difference is the MBP takes ~3 seconds to wake from
               | sleep and the M1 is instant.
        
             | mh8h wrote:
             | That's because the previous iPad Pros came with M2, not M3.
             | They are comparing the performance with the previous
             | generation of the same product.
        
             | raydev wrote:
             | > They are ALL comparing the M2 and M4. Why?
             | 
             | Well, the obvious answer is that those with older machines
             | are more likely to upgrade than those with newer machines.
             | The market for insta-upgraders is tiny.
             | 
             | edit: And perhaps an even more obvious answer: there are
             | no iPads that contained the M3, so that comparison would
             | be useless. The M4 was launched today exclusively in
             | iPads.
        
             | loongloong wrote:
             | It doesn't seem plausible to me that Apple would release
             | an "M3 variant" that can drive "tandem OLED" displays.
             | So it's probably logical to package whatever chip
             | progress they had (including process improvements) into
             | an "M4".
             | 
             | And it can signal that "We are serious about iPad as a
             | computer", using their latest chip.
             | 
             | It's a logical alignment to progress in engineering (and
             | manufacturing), packaged smartly to generate marketing
             | capital for sales and brand value.
             | 
             | Wonder how the newer Macs will use these "tandem OLED"
             | capabilities of the M4.
        
             | mkl wrote:
             | > Apple always compares product to product, never component
             | to component when it comes to processors.
             | 
             | I don't think this is true. When they launched the M3 they
             | compared primarily to M1 to make it look better.
        
             | dyauspitr wrote:
             | The iPads skipped the M3 so they're comparing your old iPad
             | to the new one.
        
           | cletus wrote:
           | IME Apple has always been the most honest when it makes
           | performance claims. Like when they said a MacBook Air
           | would last 10+ hours and third-party reviewers would get
           | 8-9+ hours. All the while, Dell or HP would claim 19 hours
           | and you'd be lucky to get 2, e.g. [1].
           | 
           | As for CPU power use, of course that doesn't translate into
           | doubling battery life because there are other components. And
           | yes, it seems the OLED display uses more power so, all in
           | all, battery life seems to be about the same.
           | 
           | I'm interested to see an M3 vs M4 performance comparison in
           | the real world. IIRC the M3 was a questionable upgrade. Some
           | things were better but some weren't.
           | 
           | Overall the M-series SoCs have been an excellent product
           | however.
           | 
           | [1]: https://www.laptopmag.com/features/laptop-battery-life-
           | claim...
           | 
           | EDIT: added link
        
             | ajross wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims
             | 
             | That's just laughable, sorry. No one is particularly honest
             | in marketing copy, but Apple is for sure one of the worst,
             | historically. Even more so when you go back to the PPC
             | days. I still remember Jobs on stage talking about how the
             | G4 was the fastest CPU in the world when I knew damn well
             | that it was half the speed of the P3 on my desk.
        
               | zeroonetwothree wrote:
               | Indeed. Have we already forgotten about the RDF
               | (reality distortion field)?
        
               | coldtea wrote:
               | No, it was just always a meaningless term...
        
               | jimbokun wrote:
               | It was simply a phrase to acknowledge that Jobs was
               | better at giving demos than anyone who ever lived.
        
               | dijit wrote:
               | You can claim Apple is dishonest for a few reasons.
               | 
               | 1) Graphs are often unannotated.
               | 
               | 2) Comparisons are rarely against latest-generation
               | products. (Their argument for that has been that they
               | do not expect people to upgrade yearly, so it's
               | showing the difference across the intended upgrade
               | path.)
               | 
               | 3) They have conflated performance with performance
               | per watt.
               | 
               | However, when it comes to battery life, performance (for
               | a task) or specification of their components (screens,
               | ability to use external displays up to 6k, port speed
               | etc) there are almost no hidden gotchas and they have
               | tended to be trustworthy.
               | 
               | The first wave of M1 announcements was met with
               | similar suspicion to what you've shown here, but it
               | was swiftly dispelled once people actually got their
               | hands on them.
               | 
               | *EDIT:* Blaming a guy who's been dead for 13 years for
               | something he said decades ago, and primarily, it
               | seems, for internal use, is weird. I had to look up
               | the context, but it _seems_ it was more about internal
               | motivation back then than about anything today,
               | especially when referring to concrete claims.
        
               | Brybry wrote:
               | "This thing is incredible," Jobs said. "It's the first
               | supercomputer on a chip.... We think it's going to set
               | the industry on fire."
               | 
               | "The G4 chip is nearly three times faster than the
               | fastest Pentium III"
               | 
               | - Steve Jobs (1999) [1]
               | 
               | [1] https://www.wired.com/1999/08/lavish-debut-for-
               | apples-g4/
        
               | dijit wrote:
               | That's cool, but that was literally last millennium.
               | 
               | And again, the guy has been dead for the better part of
               | _this_ millennium.
               | 
               | What have they shown of any product currently on the
               | market, especially when backed with any concrete
               | claim, that has been proven untrue?
               | 
               |  _EDIT:_ After reading your article and this one:
               | https://lowendmac.com/2006/twice-as-fast-did-apple-lie-
               | or-ju... it looks like it was true in floating point
               | workloads.
        
               | seanmcdirmid wrote:
               | The G4 was a really good chip if you used Photoshop.
               | It took Intel a while to catch up.
        
               | mort96 wrote:
               | Interesting, by what benchmark did you compare the G4 and
               | the P3?
               | 
               | I don't have a horse in this race, Jobs lied or bent the
               | truth all the time so it wouldn't surprise me, I'm just
               | curious.
        
               | dblohm7 wrote:
               | I remember that Apple used to wave around these SIMD
               | benchmarks showing their PowerPC chips trouncing Intel
               | chips. In the fine print, you'd see that the benchmark
               | was built to use AltiVec on PowerPC, but without MMX or
               | SSE on Intel.
        
               | 0x457 wrote:
               | Ah so the way Intel advertises their chips. Got it.
        
               | mort96 wrote:
               | Yeah, and we rightfully criticize Intel for the same and
               | we distrust their benchmarks
        
               | mc32 wrote:
               | Didn't he have to use two PPC procs to get the equivalent
               | perf you'd get on a P3?
               | 
               | Just add them up, it's the same number of Hertz!
               | 
               | But Steve that's two procs vs one!
               | 
               | I think this was when Adobe was optimizing for
               | Windows/Intel and Photoshop was single-threaded, but
               | Steve put out some graphs showing better perf on the
               | Mac.
        
               | leptons wrote:
               | Apple marketed their PPC systems as "a supercomputer on
               | your desk", but it was nowhere near the performance of a
               | supercomputer of that age. Maybe similar performance to a
               | supercomputer from the 1970's, but that was their
               | marketing angle from the 1990's.
        
               | galad87 wrote:
               | From https://512pixels.net/2013/07/power-mac-g4/: the ad
               | was based on the fact that Apple was forbidden to export
               | the G4 to many countries due to its "supercomputer"
               | classification by the US government.
        
               | m000 wrote:
               | It seems the US government was buying too much into
               | tech hype at the turn of the millennium. Around the
               | same period, PS2 exports were also restricted [1].
               | 
               | [1] https://www.latimes.com/archives/la-
               | xpm-2000-apr-17-fi-20482...
        
               | actionfromafar wrote:
               | The PS2 was used in supercomputing clusters.
        
               | georgespencer wrote:
               | > Apple marketed their PPC systems as "a supercomputer on
               | your desk"
               | 
               | It's certainly fair to say that _twenty years ago_ Apple
               | was marketing some of its PPC systems as  "the first
               | supercomputer on a chip"[^1].
               | 
               | > but it was nowhere near the performance of a
               | supercomputer of that age.
               | 
               | That was not the claim. Apple did not argue that the G4's
               | performance was commensurate with the state of the art in
               | supercomputing. (If you'll forgive me: like, _fucking
               | obviously?_ The entire reason they made the claim is
               | precisely because the latest room-sized supercomputers
               | with leapfrog performance gains were in the news very
               | often.)
               | 
               | The claim was that the G4 was capable of sustained
               | gigaflop performance, and therefore met the narrow
               | technical definition of a supercomputer.
               | 
               | You'll see in the aforelinked marketing page that Apple
               | compared the G4 chip to UC Irvine's Aeneas Project, which
               | in ~2000 was delivering 1.9 gigaflop performance.
               | 
               | This chart[^2] shows the trailing average of various
               | subsets of supercomputers, for context.
               | 
               | This narrow definition is also why the machine could not
               | be exported to many countries, which Apple leaned
               | into.[^3]
               | 
               | > Maybe similar performance to a supercomputer from the
               | 1970's
               | 
               | What am I missing here? Picking perhaps the most famous
               | supercomputer of the mid-1970s, the Cray-1,[^4] we can
               | see performance of 160 MFLOPS, which is 160 million
               | floating point operations per second (with an 80 MHz
               | processor!).
               | 
               | The G4 was capable of delivering ~1 GFLOP performance,
               | which is a billion floating point operations per second.
               | 
               | Are you perhaps thinking of a different decade?
               | 
               | [^1]: https://web.archive.org/web/20000510163142/http://w
               | ww.apple....
               | 
               | [^2]: https://en.wikipedia.org/wiki/History_of_supercompu
               | ting#/med...
               | 
               | [^3]: https://web.archive.org/web/20020418022430/https://
               | www.cnn.c...
               | 
               | [^4]: https://en.wikipedia.org/wiki/Cray-1#Performance
        
               | leptons wrote:
               | >That was not the claim. Apple did not argue that the
               | G4's performance was commensurate with the state of the
               | art in supercomputing.
               | 
               | This is _marketing_ we're talking about; people see
               | "supercomputer on a chip" and they get hyped up by it.
               | Apple was 100% using the "supercomputer" claim to make
               | their luddite audience think they had a performance
               | advantage, which they did not.
               | 
               | > The entire reason they made the claim is
               | 
               | The reason they marketed it that way was to get people to
               | part with their money. Full stop.
               | 
               | In the first link you added, there's a photo of a Cray
               | supercomputer, which makes the viewer equate Apple =
               | Supercomputer = _I am a computing god if I buy this
               | product_. Apple's marketing has always been a bit
               | shady that way.
               | 
               | And soon after that period Apple jumped off the PPC
               | architecture and onto the x86 bandwagon. Gimmicks like
               | "supercomputer on a chip" don't last long when the
               | competition is far ahead.
        
               | threeseed wrote:
               | I can't believe Apple is marketing their products in a
               | way to get people to part with their money.
               | 
               | If I had some pearls I would be clutching them right now.
        
               | georgespencer wrote:
               | > This is marketing we're talking about, people see
               | "supercomputer on a chip" and they get hyped up by it.
               | 
               | That is _also_ not in dispute. I am disputing your
               | specific claim that Apple somehow suggested that the G4
               | was of commensurate performance to a modern
               | supercomputer, which does not seem to be true.
               | 
               | > Apple was 100% using the "supercomputer" claim to make
               | their luddite audience think they had a performance
               | advantage, which they did not.
               | 
               | This is why context is important (and why I'd appreciate
               | clarity on whether you genuinely believe a supercomputer
               | from the 1970s was anywhere near as powerful as a G4).
               | 
               | In the late twentieth and early twenty-first century,
               | megapixels were a proxy for camera quality, and megahertz
               | were a proxy for processor performance. More MHz = more
               | capable processor.
               | 
               | This created a problem for Apple, because the G4's
               | SPECfp_95 (floating point) benchmarks crushed Pentium III
               | at lower clock speeds.
               | 
               | PPC G4 500 MHz - 22.6
               | 
               | PPC G4 450 MHz - 20.4
               | 
               | PPC G4 400 MHz - 18.36
               | 
               | Pentium III 600 MHz - 15.9
               | 
               | For both floating point and integer benchmarks, the G3
               | and G4 outgunned comparable Pentium II/III processors.
               | 
               | You can question how this translates to real world use
               | cases - the Photoshop filters on stage were real, but
               | others have pointed out in this thread that it wasn't an
               | apples-to-apples comparison vs. Wintel - but it is
               | inarguable that the G4 had some performance advantages
               | over Pentium at launch, and that it met the (inane)
               | definition of a supercomputer.
               | 
               | > The reason they marketed it that way was to get people
               | to part with their money. Full stop.
               | 
               | Yes, marketing exists to convince people to buy one
               | product over another. That's why companies do marketing.
               | IMO that's a self-evidently inane thing to say in a
               | nested discussion of microprocessor architecture on a
               | technical forum - especially when your interlocutor is
               | establishing the historical context you may be unaware of
               | (judging by your comment about supercomputers from the
               | 1970s, which I am surprised you have not addressed).
               | 
               | I didn't say "The reason Apple markets its computers," I
               | said "The entire reason they made the claim [about
               | supercomputer performance]..."
               | 
               | Both of us appear to know that companies do marketing,
               | but only you appear to be confused about the specific
               | claims Apple made - given that you proactively raised
               | them, and got them wrong - and the historical backdrop
               | against which they were made.
               | 
               | > In the first link you added, there's a photo of a Cray
               | supercomputer
               | 
               | That's right. It looks like a stylized rendering of a
               | Cray-1 to me - what do you think?
               | 
               | > which makes the viewer equate Apple = Supercomputer = I
               | am a computing god if I buy this product
               | 
               | The Cray-1's compute, as measured in GFLOPS, was
               | approximately 6.5x lower than the G4 processor.
               | 
               | I'm therefore not sure what your argument is: you started
               | by claiming that Apple deliberately suggested that the G4
               | had comparable performance to a modern supercomputer.
               | That isn't the case, and the page you're referring to
               | contains imagery of a much less performant supercomputer,
               | as well as a lot of information relating to the history
               | of supercomputers (and a link to a Forbes article).
               | 
               | > Apple's marketing has always been a bit shady that way.
               | 
               | All companies make tradeoffs they think are right for
               | their shareholders and customers. They accentuate the
               | positives in marketing and gloss over the drawbacks.
               | 
               | Note, too, that Adobe's CEO has been duped on the page
               | you link to. Despite your emphatic claim:
               | 
               | > Apple was 100% using the "supercomputer" claim to make
               | their luddite audience think they had a performance
               | advantage, which they did not.
               | 
               | The CEO of Adobe is quoted as saying:
               | 
               | > "Currently, the G4 is significantly faster than any
               | platform we've seen running Photoshop 5.5," said John E.
               | Warnock, chairman and CEO of Adobe.
               | 
               | How is what you are doing materially different to what
               | you accuse Apple of doing?
               | 
               | > And soon after that period Apple jumped off the PPC
               | architecture and onto the x86 bandwagon.
               | 
               | They did so when Intel's roadmap introduced Core Duo,
               | which was significantly more energy-efficient than
               | Pentium 4. I don't have benchmarks to hand, but I suspect
               | that a PowerBook G5 would have given the Core Duo a run
               | for its money (despite the G5 being significantly older),
               | but only for about fifteen seconds before thermal
               | throttling and draining the battery entirely in minutes.
        
               | seanmcdirmid wrote:
               | The G4 was 1999, Core Duo was 2006; 7 years isn't bad.
        
               | georgespencer wrote:
               | That is a long time - bet it felt even longer to the poor
               | PowerBook DRI at Apple who had to keep explaining to
               | Steve Jobs why a G5 PowerBook wasn't viable!
        
               | seanmcdirmid wrote:
               | Ya, I really wanted a G5 but power and thermals weren't
               | going to work and IBM/Moto weren't interested in making a
               | mobile version.
        
               | Vvector wrote:
               | Blaming a company TODAY for marketing from the 1990s is
               | crazy.
        
               | leptons wrote:
               | Except they still do the same kind of bullshit marketing
               | today.
        
               | brokencode wrote:
               | Have any examples from the past decade? Especially in the
               | context of how exaggerated the claims are from PC and
               | Android brands they are competing with?
        
               | lynndotpy wrote:
               | Apple recently claimed that RAM in their MacBooks is
               | equivalent to 2x the RAM in any other machine, in
               | defense of the 8GB starting point.
               | 
               | In my experience, I can confirm that this is just not
               | true. The secret is heavy reliance on swap. It's still
               | the case that 1GB = 1GB.
        
               | dmitrygr wrote:
               | > The secret is heavy reliance on swap
               | 
               | You are entirely (100%) wrong, but, sadly, NDA...
        
               | ethanwillis wrote:
               | How convenient :)
        
               | monsieurbanana wrote:
               | Regardless of what you can't tell, he's absolutely
               | right regarding Apple's claims: saying that an 8GB Mac
               | is as good as a 16GB non-Mac is laughable.
        
               | dmitrygr wrote:
               | That was never said. They said an 8GB Mac is similar
               | to a 16GB non-Mac.
        
               | Zanfa wrote:
               | My entry-level 8GB M1 Macbook Air beats my 64GB 10-core
               | Intel iMac in my day-to-day dev work.
        
               | sudosysgen wrote:
               | Memory compression isn't magic and isn't exclusive to
               | macOS.
        
               | dmitrygr wrote:
               | I suggest you go and look at HOW it is done in Apple
               | Silicon Macs, and then think long and hard about why
               | this might make a huge difference. Maybe the Asahi
               | Linux guys can explain it to you ;)
        
               | sudosysgen wrote:
               | I understand that it can make a difference to
               | performance (which is already baked into the
               | benchmarks we look at), but I don't see how it can
               | make a difference to compression ratios; if anything,
               | in similar implementations (e.g. console APUs) it
               | tends to lead to worse compression ratios.
               | 
               | If there's any publicly available data to the contrary
               | I'd love to read it. Anecdotally I haven't seen a
               | significant difference between zswap on Linux and macOS
               | memory compression in terms of compression ratios, and on
               | the workloads I've tested zswap tends to be faster than
               | no memory compression on x86 for many core machines.
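               | 
               | If anyone wants to eyeball the ratios themselves, a
               | minimal sketch (assuming a zram-backed swap on Linux
               | and that debugfs is mounted for the zswap counters;
               | exact paths and tooling vary by distro):
               | 
               |   # Linux, zram: util-linux's zramctl prints original
               |   # (DATA) vs compressed (COMPR) sizes per device.
               |   zramctl
               | 
               |   # Linux, zswap: raw counters live in debugfs
               |   # (root required; pages stored vs pool bytes used).
               |   sudo grep . /sys/kernel/debug/zswap/stored_pages \
               |               /sys/kernel/debug/zswap/pool_total_size
               | 
               |   # macOS: the achieved ratio is roughly "Pages stored
               |   # in compressor" / "Pages occupied by compressor".
               |   vm_stat | grep -i compressor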
        
               | lynndotpy wrote:
               | I do admit the "reliance on swap" thing is speculation on
               | my part :)
               | 
               | My experience is that I can still tell when the OS is
               | unhappy when I demand more RAM than it can give. MacOS is
               | still relatively responsive around this range, which I
               | just attributed to super fast swapping. (I'd assume
               | memory compression too, but I usually run into this
               | trouble when working with large amounts of poorly-
               | compressible data.)
               | 
               | In either case, I know it's frustrating when someone is
               | confidently wrong but you can't properly correct them, so
               | you have my apologies
        
               | brokencode wrote:
               | Sure, and they were widely criticized for this. Again,
               | the assertion I was responding to is that Apple does this
               | "laughably" more than competitors.
               | 
               | Is an occasional statement that they get pushback on
               | really worse than what other brands do?
               | 
               | As an example from a competitor, take a look at the
               | recent firestorm over Intel's outlandish anti-AMD
               | marketing:
               | 
               | https://wccftech.com/intel-calls-out-amd-using-old-cores-
               | in-...
        
               | ajross wrote:
               | > Sure, and they were widely criticized for this. Again,
               | the assertion I was responding to is that Apple does this
               | "laughably" more than competitors.
               | 
               | FWIW: the language upthread was that it was laughable to
               | say Apple was the _most_ honest. And I stand by that.
        
               | brokencode wrote:
               | Fair point. Based on their first sentence, I
               | mischaracterized how "laughable" was used.
               | 
               | Though the author also made clear in their second
               | sentence that they think Apple is one of the worst when
               | it comes to marketing claims, so I don't think your
               | characterization is totally accurate either.
        
               | rahkiin wrote:
               | There is also memory compression and their insane swap
               | speed due to SoC memory and fast SSDs.
        
               | anaisbetts wrote:
               | Every modern operating system now does memory compression
        
               | astrange wrote:
               | Some of them do it better than others though.
        
               | Rinzler89 wrote:
               | Apple uses Magic Compression.
        
               | adamomada wrote:
               | Not sure what Windows does, but the popular method on
               | e.g. Fedora is to carve out a compressed, RAM-backed
               | swap device (zram) and swap to that. It could be more
               | efficient the way Apple does it, by not having to
               | partition main memory.
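               | 
               | A quick way to see that setup on a stock Fedora box (a
               | sketch; the generator config path is the packaged
               | default and can differ per install):
               | 
               |   # The zram swap device shows up alongside any disk swap.
               |   swapon --show
               | 
               |   # systemd's zram-generator provisions it at boot;
               |   # overrides go in /etc/systemd/zram-generator.conf.
               |   cat /usr/lib/systemd/zram-generator.conf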
        
               | throwaway744678 wrote:
               | This is a revolution
        
               | anaisbetts wrote:
               | Citation needed?
        
               | astrange wrote:
               | Don't know if I'm allowed to. It's not that special
               | though.
        
               | seec wrote:
               | Yeah, that was hilarious; my basic workload borders on
               | the 8GB limit without even pushing it. They have fast
               | swap, but nothing beats real RAM in the end, and
               | considering their storage pricing is as stupid as
               | their RAM pricing, it really makes no difference.
               | 
               | If you go for the base model, you are in for a bad
               | time: 256GB with heavy swap and no dedicated GPU
               | memory (making the 8GB even worse) is just plain
               | stupid.
               | 
               | This is what the Apple fanboys don't seem to get:
               | their base models at a somewhat affordable price are
               | deeply compromised, and if you start to load them up
               | the pricing just does not make a lot of sense...
        
               | n9 wrote:
               | You know that the RAM in these machines is quite
               | different from "RAM" in a standard PC? Apple's SoC RAM
               | is more or less part of the CPU/GPU package and is
               | super fast. And for obvious reasons it cannot be added
               | to.
               | 
               | Anyway, I manage a few M1 and M3 machines with 256/8
               | configs and they all run just as fast as 16 and 32
               | machines EXCEPT for workloads that need more than 8GB
               | for a process (virtualization) or workloads that need
               | lots of video memory (Lightroom can KILL an 8GB
               | machine that isn't doing anything else...)
               | 
               | The "8GB is stupid" discussion isn't wrong in the
               | general case, but it is wrong for maybe 80% of users.
        
               | ajross wrote:
               | > EXCEPT for workloads that need more than 8GB for a
               | process
               | 
               | Isn't that exactly the upthread contention: Apple's
               | magic compressed swap management is still _swap
               | management_ that replaces O(1), fast(-ish) DRAM access
               | with thousands-of-cycles page decompression
               | operations. It may be faster than storage, but it's
               | still extremely slow relative to a DRAM fetch. And
               | once your working set gets beyond your available RAM
               | you start thrashing, just like VAXen did on 4BSD.
        
               | kcartlidge wrote:
             | > _If you go for the base model, you are in for a bad
             | time: 256GB with heavy swap and no dedicated GPU memory
             | (making the 8GB even worse) is just plain stupid ...
             | their base models at a somewhat affordable price are
             | deeply compromised_
               | 
               | I got the base model M1 Air a couple of years back and
               | whilst I don't do much gaming I do do C#, Python, Go,
               | Rails, local Postgres, and more. I also have a (new last
               | year) Lenovo 13th gen i7 with 16GB RAM running Windows 11
               | and the performance _with the same load_ is night and day
               | - the M1 walks all over it whilst easily lasting 10hrs+.
               | 
             | _Note that I'm not a fanboy; I run both by choice. Also
             | both iPhone and Android._
               | 
               | The Windows laptop often gets sluggish and hot. The M1
               | never slows down and stays cold. There's just no
               | comparison (though the Air keyboard remains poor).
               | 
               | I don't much care about the technical details, and I know
               | 8GB isn't a lot. I care about the _experience_ and the
               | underspecced Mac wins.
        
               | Phrodo_00 wrote:
               | None of that seems to be high load or stuff that needs
               | a lot of RAM.
        
               | cwillu wrote:
               | If someone is claiming "<foo> has always <barred>", then
               | I don't think it's fair to demand a 10 year cutoff on
               | counter-evidence.
        
               | bee_rider wrote:
               | Clearly it isn't the case that Apple has always been more
               | honest than their competition, because there were some
               | years before Apple was founded.
        
               | brokencode wrote:
               | For "always" to be true, the behavior needs to extend to
               | the present date. Otherwise, it's only true to say "used
               | to".
        
               | windowsrookie wrote:
               | While certainly misleading, there were situations
               | where the G4 was incredibly fast for the time. I
               | remember being able to edit video in iMovie on a 12"
               | G4 laptop. At that time there was no equivalent x86
               | machine.
        
               | jmull wrote:
               | If you have to go back 20+ years for an example...
        
               | n9 wrote:
               | I worked in an engineering lab at the time of the G4
               | introduction and I can attest that the G4 was a very,
               | very fast CPU for scientific workloads.
               | 
               | Confirmed here:
               | https://computer.howstuffworks.com/question299.htm (and
               | elsewhere.)
               | 
               | A year later I was doing bonkers (for the time)
               | Photoshop work on very large compressed TIFF files,
               | and my G4 laptop running at 400MHz was more than 2x as
               | fast as the PIIIs on my bench.
               | 
               | Was it faster all around? I don't know how to tell. Was
               | Apple as honest as I am in this commentary about how it
               | mattered what you were doing? No. Was it a CPU that was
               | able to do some things very fast vs others? I know it
               | was.
        
               | ajross wrote:
               | It's just amazing that this kind of nonsense persists.
               | There were no significant benchmarks, "scientific" or
               | otherwise, at the time or since showing that kind of
               | behavior. The G4 was a dud. Apple rushed out some
               | apples/oranges comparisons at launch (the one you link
               | appears to be the bit where they compared a SIMD-
               | optimized tool on PPC to generic compiled C on x86,
               | though I'm too lazy to try to dig out the specifics from
               | stale links), and the reality distortion field did the
               | rest.
        
             | Aurornis wrote:
             | > IME Apple has always been the most honest when it
             | makes performance claims.
             | 
             | Okay, but your example was about battery life:
             | 
             | > Like when they said a MacBook Air would last 10+ hours
             | and third-party reviewers would get 8-9+ hours. All the
             | while, Dell or HP would claim 19 hours and you'd be
             | lucky to get 2, e.g. [1]
             | 
             | And even then, they exaggerated their claims. And your link
             | doesn't say anything about HP or Dell claiming 19 hour
             | battery life.
             | 
             | Apple has definitely exaggerated their performance
             | claims over and over again. The Apple Silicon parts are
             | fast and low-power indeed, but they've made ridiculous
             | claims like comparing their chips to an Nvidia RTX 3090
             | with completely misleading graphs.
             | 
             | Even the Mac sites have admitted that the Nvidia 3090
             | comparison was completely wrong and designed to be
             | misleading: https://9to5mac.com/2022/03/31/m1-ultra-gpu-
             | comparison-with-...
             | 
             | This is why you have to take everything they say with a
             | huge grain of salt. Their chip may be "twice" as power
             | efficient in some carefully chosen unique scenario that
             | only exists in an artificial setting, but how does it fare
             | in the real world? That's the question that matters, and
             | you're not going to get an honest answer from Apple's
             | marketing team.
        
               | vel0city wrote:
               | You're right, it's not 19 hours claimed. It was even
               | more than that.
               | 
               | > HP gave the 13-inch HP Spectre x360 an absurd 22.5
               | hours of estimated battery life, while our real-world
               | test results showed that the laptop could last for 12
               | hours and 7 minutes.
        
               | seaal wrote:
             | The absurdity was the difference between claimed battery
             | life and actual battery life. 19 vs 2 is more absurd
             | than 22.5 vs 12.
               | 
               | > Speaking of the ThinkPad P72, here are the top three
               | laptops with the most, er, far out battery life claims of
               | all our analyzed products: the Lenovo ThinkPad P72, the
               | Dell Latitude 7400 2-in-1 and the Acer TravelMate P6
               | P614. The three fell short of their advertised battery
               | life by 821 minutes (13 hours and 41 mins), 818 minutes
               | (13 hours and 38 minutes) and 746 minutes (12 hours and
               | 26 minutes), respectively.
               | 
               | Dell did manage to be one of the top 3 most absurd claims
               | though.
        
               | hinkley wrote:
               | You're working hard to miss the point there.
               | 
               | Dell and IBM were lying about battery life before OSX was
               | even a thing and normal people started buying MacBooks.
               | Dell and IBM will be lying about battery life when the
               | sun goes red giant.
               | 
               | Reviewers and individuals like me have _always_ been able
               | to get 90% of Apple's official battery times without
               | jumping through hoops to do so. "If you were very
               | careful" makes sense for an 11% difference. A ten hour
               | difference is fucking bullshit.
        
               | dmz73 wrote:
               | So you are saying that a Dell with an Intel CPU could
               | get longer battery life than a Mac with an M1? What
               | does that say about the quality of Apple engineering?
               | Their marketeering is certainly second to none.
        
               | ribit wrote:
               | The M1 Ultra did benchmark close to a 3090 in some
               | synthetic gaming tests. The claim was not outlandish,
               | just largely irrelevant for any reasonable purpose.
               | 
               | Apple does usually explain their testing methodology and
               | they don't cheat on benchmarks like some other companies.
               | It's just that the results are still marketing and should
               | be treated as such.
               | 
               | Outlandish claims notwithstanding, I don't think anyone
               | can deny the progress they achieved with their CPU and
               | especially GPU IP. Improving performance on complex
               | workloads by 30-50% in a single year is very impressive.
        
               | tsimionescu wrote:
               | It did not get anywhere close to a 3090 in any test when
               | the 3090 was running at full power. They were only
               | comparable at specific power usage thresholds.
        
             | moogly wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims.
             | 
             | I guess you weren't around during the PowerPC days...
             | Because that's a laughable statement.
        
               | imwillofficial wrote:
               | All I remember is tanks in the commercials.
               | 
               | We need more tanks in commercials.
        
               | oblio wrote:
               | I have no idea who's downvoting you. They were lying
               | through their teeth about CPU performance back then.
               | 
               | A PC half the price was smoking their top of the line
               | stuff.
        
               | seec wrote:
               | That's funny you say that, because this is precisely
               | when I started buying Macs (I got a Pismo PowerBook G3
               | gifted and then bought an iBook G4). And my experience
               | was that, for sure, if you put as much money into a PC
               | as into a Mac you would get MUCH better performance.
               | 
               | What made it worth it at the time (I felt) was the
               | software. Today I really don't think so; software has
               | improved overall in the industry and there are not a
               | lot of things "Mac specific" that make it a clear-cut
               | choice.
               | 
               | As for the performance, I don't buy all the Apple
               | Silicon hype. Sure, it gets good battery life provided
               | you use strictly Apple software (or software heavily
               | optimized for it), but in mixed-workload situations
               | it's not that impressive.
               | 
               | Using a friend's M2 MacBook Pro, I figured I could get
               | maybe 4-5 hours out of it in my best-case scenario,
               | which is better than the 2-3 hours you would get from
               | a PC laptop but also not that great considering the
               | price difference.
               | 
               | And when it comes to performance it is extremely
               | uneven and very lackluster for many things. There is
               | more lag launching Activity Monitor on a 2K++ MacBook
               | Pro than launching Task Manager on a 500 PC. It's a
               | small, somewhat stupid example, but it does tell the
               | overall story.
               | 
               | They talk a big game, but their stuff isn't that
               | performant in the real world.
               | 
               | And they still market games when one of their 2K
               | laptops plays Dota 2 (a very old, relatively resource-
               | efficient game) worse than a cheapo PC.
        
               | skydhash wrote:
               | > Using a friend's M2 MacBook Pro, I figured I could
               | get maybe 4-5 hours out of it in my best-case
               | scenario, which is better than the 2-3 hours you would
               | get from a PC laptop but also not that great
               | considering the price difference.
               | 
               | Any Electron apps on it?
        
               | fragmede wrote:
               | Or VMs. They should be getting way better battery life
               | than that.
        
               | kagakuninja wrote:
               | Apple switched to Intel chips 20 years ago. Who fucking
               | cares about PowerPC?
               | 
               | Today, Apple Silicon is smoking all but the top end Intel
               | chips, while using a fraction of the power.
        
               | dylan604 wrote:
               | Oh those megahertz myths! Their marketing department is
               | pretty amazing at their spin control. This one was right
               | up there with "it's not a bug; it's a feature" type of
               | spin.
        
               | hinkley wrote:
               | Before macOS was rebuilt on NeXTSTEP it was
               | practically a different company. I've been using Apple
               | hardware for the 21 years since they got a real
               | operating system. Even the G4 did better than the
               | laptop it replaced.
        
             | syncsynchalt wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims
             | 
             | Yes and no. They'll always be honest with the claim, but
             | the scenario for the claimed improvement will always be
             | chosen to make the claim as large as possible, sometimes
             | with laughable results.
             | 
             | Typically something like "watch videos for 3x longer
             | <small>when viewing 4k h265 video</small>" (which really
             | means the previous gen's silicon could only hardware-
             | decode h264).
        
             | moooo99 wrote:
             | They are pretty honest when it comes to battery life
             | claims, they're less honest when it comes to benchmark
             | graphs
        
               | underlogic wrote:
               | I don't think less honest covers it and can't believe
               | anything their marketing says after the 3090 claims.
               | Maybe it's true, maybe not. We'll see from the reviews.
               | Well assuming the reviewers weren't paid off with an
               | "evaluation unit".
        
             | bvrmn wrote:
              | BTW I get 19 hours from a Dell XPS and a Latitude. It's
              | Linux with a custom DE and Vim as my IDE, though.
        
               | amarka wrote:
               | I get about 21 hours from mine, it's running Windows but
               | powered off.
        
               | bee_rider wrote:
                | This is why Apple can be slightly more honest about
                | their battery specs: they don't have the OS working
                | against them. Unfortunately most Dell XPS machines will
                | be running Windows, so it is still misleading to provide
                | specs based on what the hardware could do if not
                | sabotaged.
        
               | hinkley wrote:
               | I wonder if it's like webpages. The numbers are
               | calculated before marketing adds the crapware and ruins
               | all of your hard work.
        
               | wklm wrote:
               | can you share more details about your setup?
        
               | bvrmn wrote:
                | Arch Linux, mitigations (Spectre and the like) off, X11,
                | Openbox, bmpanel with only a CPU/IO indicator. Light
                | theme everywhere. Opera in power save mode. `powertop
                | --auto-tune` and `echo 1 | sudo tee
                | /sys/devices/system/cpu/intel_pstate/no_turbo`. Current
                | laptop is a Latitude 7390.
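                | 
                | As a minimal sketch, those two commands rolled into a
                | script (assuming the intel_pstate driver and root;
                | other cpufreq drivers don't expose the no_turbo knob):
                | 
                |     #!/bin/sh
                |     # Apply powertop's suggested power-saving tunables.
                |     powertop --auto-tune
                |     # Disable turbo boost (intel_pstate driver only).
                |     echo 1 > \
                |       /sys/devices/system/cpu/intel_pstate/no_turbo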
        
               | ribit wrote:
               | Right, so you are disabling all performance features and
               | effectively turning your CPU into a low-end low-power
               | SKU. Of course you'd get better battery life. It's not
               | the same thing though.
        
               | Sohcahtoa82 wrote:
               | > echo 1 | sudo tee
               | /sys/devices/system/cpu/intel_pstate/no_turbo
               | 
                | Isn't that going to torch performance? My i9-9900 has a
                | base frequency of 3.6 GHz and a turbo of 5.0 GHz.
                | Disabling the turbo would create a 28% drop in
                | performance.
                | 
                | I suppose if everything else on the system is configured
                | to use as little power as possible, then it won't even
                | be noticed. But seeing as CPUs underclock when idle
                | (I've seen my i9 go as low as 1.2 GHz), I'm not sure
                | disabling turbo makes a significant impact except when
                | your CPU is being pegged.
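                | 
                | (Worked out, the 28% is just the clock ratio:
                | 1 - 3.6/5.0 = 0.28. Treat it as a ceiling; it assumes
                | performance scales linearly with clock and that the
                | workload would otherwise sustain the full 5.0 GHz turbo
                | frequency.)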
        
               | bvrmn wrote:
               | That's the point. I have no performance bottleneck with
               | no_turbo. My i5 tends to turn on turbo mode and increased
               | power demand (heat leaks) even if it's no needed. For
               | example with no_turbo laptop is always cold and fan
               | basically stays silent. With turbo it easily gets 40C
               | warm while watching YT or doing my developer stuff,
               | building docker containers and so.
        
               | fransje26 wrote:
                | I get 20 minutes from my Dell (not the XPS), with Vim.
                | When it was brand-new, I got 40 minutes. A piece of hot
                | garbage, with an energy-inefficient Intel CPU.
        
               | ben-schaaf wrote:
               | Frankly that sounds like you got a lemon. Even the most
               | inefficient gaming laptops get over an hour _under a full
               | gaming workload_.
        
             | bee_rider wrote:
              | Controlling the OS is probably a big help there. At
              | least, I saw lots of complaints about my Zenbook model's
              | battery not hitting the spec. It was easy to hit or
              | exceed it in Linux, but you have to tell it not to
              | randomly spin up the CPU.
        
               | hinkley wrote:
                | I had to work my ass off on my Fujitsu Lifebook to get
                | 90% of the estimate, even on Linux. I even worked on a
                | kernel patch for the Transmeta CPU, based on unexploited
                | settings in the CPU documentation, but it made no or
                | only a negligible difference in power draw, which I
                | suppose is why Linus didn't do it in the first place.
        
             | izacus wrote:
              | > Like when they said a MacBook Air would last 10+ hours
              | and third-party reviewers would get 8-9+ hours.
             | 
             | For literal YEARS, Apple battery life claims were a running
             | joke on how inaccurate and overinflated they were.
        
               | hinkley wrote:
                | I've never known a time when Dell, IBM, Sony, Toshiba,
                | Fujitsu, or Alienware weren't lying through their teeth
                | about battery times.
               | 
               | What time period are you thinking about for Apple? I've
               | been using their laptops since the last G4 which is
               | twenty years. They've always been substantially more
               | accurate about battery times.
        
               | ruszki wrote:
               | The problem with arguing about battery life this way is
               | that it's highly dependent on usage patterns.
               | 
                | For example, I would be surprised if there were any
                | laptop that is sufficiently fast for my usage and whose
                | battery life is more than 2-3 hours tops. Heck, I have
                | several laptops and all of them die in one to one and a
                | half hours. But of course, I never optimized for battery
                | life, so who knows. So in my case, all of them are lying
                | equally. I haven't even checked battery life in 15
                | years. It's a useless metric for me, because all of them
                | are shit.
                | 
                | But of course, for people who don't need to use VMs, run
                | several "micro"services at once, have constant network
                | transfer, and have 5+ IntelliJ projects open at the same
                | time caching several million LOC while a gazillion web
                | pages are open, maybe there is a difference. For me it
                | doesn't matter whether it's one hour or one and a half.
        
             | nabakin wrote:
             | Maybe for battery life, but definitely not when it comes to
             | CPU/GPU performance. Tbf, no chip company is, but Apple is
             | particularly egregious. Their charts assume best case
             | multi-core performance when users rarely ever use all cores
             | at once. They'd have you thinking it's the equivalent of a
             | 3090 or that you get double the frames you did before when
             | the reality is more like 10% gains.
        
             | dylan604 wrote:
             | Yeah, the assumption seems to be that using less battery by
             | one component means that the power will just magically go
             | unused. As with everything else in life, as soon as
             | something stops using a resource something else fills the
             | vacuum to take advantage of the resource.
        
             | treflop wrote:
             | Apple is always honest but they know how to make you
             | believe something that isn't true.
        
             | bmitc wrote:
             | > IME Apple has always been the most honest when it makes
             | performance claims.
             | 
             | In nearly every single release, their claims are well above
             | actual performance.
        
           | bagels wrote:
           | CPU is not the only way that power is consumed in a portable
           | device. It is a large fraction, but you also have displays
           | and radios.
        
           | coldtea wrote:
           | Apple might use simplified and opaque plots to drive their
           | point, but they all too often undersell the differences.
           | Indepedent reviews for example find that they not just hit
           | the mark Apple mentions for things like battery but that
           | often do slightly better...
        
           | michaelmior wrote:
           | > is it REALLY half the power use of all times (so we'll get
           | double the battery life)
           | 
           | I'm not sure what you mean by "of all times" but half the
           | battery usage of the processor definitely doesn't translate
           | into double the battery life since the processor is not the
           | only thing consuming power.
        
           | smith7018 wrote:
           | You wouldn't necessarily get twice the battery life. It could
           | be less than that due to the thinner body causing more heat,
           | a screen that utilizes more energy, etc
        
           | mcv wrote:
           | I don't know, but the M3 MBP I got from work already gives
           | the impression of using barely any power at all. I'm really
           | impressed by Apple Silicon, and I'm seriously reconsidering
           | my decision from years ago to never ever buy Apple again. Why
           | doesn't everybody else use chips like these?
        
             | jacurtis wrote:
             | I have an M3 for my personal laptop and an M2 for my work
             | laptop. I get ~8 hours if I'm lucky on my work laptop, but
             | I have attributed most of that battery loss to all the
             | "protection" software they put on my work laptop that is
             | always showing up under the "Apps Using Significant Power"
             | category in the battery dropdown.
             | 
             | I can have my laptop with nothing on screen, and the
             | battery still points to TrendMicro and others as the cause
             | of heavy battery drain while my laptop seemingly idles.
             | 
              | I recently upgraded my personal laptop to the M3 MacBook
              | Pro and the difference is astonishing. I almost never use
              | it plugged in because I genuinely get close to that
              | 20-hour reported battery life. Last weekend I played a AAA
              | video game through Xbox Cloud Gaming (awesome for Mac
              | gamers, btw) with essentially max graphics (rendered
              | elsewhere and streamed to me, of course). I got sucked in
              | for like 5 hours and lost only 8% of my battery during
              | that time, while playing a top-tier video game! It really
              | blew my mind. I also use the GoLand IDE on there and have
              | managed to get a full day of development done using only
              | about 25-30% battery.
             | 
             | So yeah, whatever Apple is doing, they are doing it right.
             | Performance without all the spyware that your work gives
             | you makes a huge difference too.
        
               | bee_rider wrote:
               | For the AAA video game example, I mean, it is interesting
               | how far that kind of tech has come... but really that's
               | just video streaming (maybe slightly more difficult
               | because latency matters?) from the point of view of the
               | laptop, right? The quality of the graphics there have
               | more-or-less nothing to do with the battery.
        
               | mcv wrote:
               | Over the weekend, I accidentally left my work M3
               | unplugged with caffeinate running (so it doesn't sleep).
               | It wasn't running anything particularly heavy, but still,
               | on Monday, 80% charge left.
               | 
               | That's mindblowing. Especially since my personal laptop
               | is a Thinkpad X1 Extreme. I can't leave that unplugged at
               | all.
        
             | jhickok wrote:
             | I think the market will move to using chips like this, or
             | at least have additional options. The new Snapdragon SOC is
             | interesting, and I would suspect we could see Google and
             | Microsoft play in this space at some point soon.
        
           | wwilim wrote:
           | Isn't 15% more battery life a huge improvement on a device
           | already well known for long battery life?
        
           | can16358p wrote:
           | Apple is one of the few companies that underpromise and
           | overdeliver and never exaggerate.
           | 
           | Compared to the competition, I'd trust Apple much more than
           | the Windows laptop OEMs.
        
           | VelesDude wrote:
           | If there is any dishonesty, I would wager it is a case of it
           | can double the battery life in low power scenarios. Can go
           | twice as long when doing word processing for instance. Can
           | potentially idle a lot lower
        
           | Petersipoi wrote:
           | > so we'll get double the battery life
           | 
           | This is an absurd interpretation. Nobody hears that and says
           | "they made the screen use half the energy".
        
         | mvkel wrote:
         | And here it is in an OS that can't even max out an M1!
         | 
         | That said, the function keys make me think "and it runs macOS"
         | is coming, and THAT would be extremely compelling.
        
           | a_vanderbilt wrote:
           | We've seen a slow march over the last decade towards the
           | unification of iOS and macOS. Maybe not a "it runs macOS",
           | but an eventual "they share all the same apps" with adaptive
           | UIs.
        
             | asabla wrote:
              | I think so too. Especially after the split from iOS to
              | iPadOS. Hopefully they'll show something during this
              | year's WWDC.
        
             | DrBazza wrote:
              | They probably saw the debacle that was Windows 8 and
              | realized that merging a desktop and a touch OS is a
              | decade-long gradual task, if that is even the final
              | intention.
              | 
              | Unlike MS, which went with the big-bang, in-your-face
              | approach that was oh-so successful.
        
               | zitterbewegung wrote:
                | People have complained about why Logic Pro / Final Cut
                | wasn't ported to the iPad Pro line. The obvious answer
                | is that getting those workflows done properly takes
                | time.
        
               | a_vanderbilt wrote:
               | Even with the advantage of time, I don't think Microsoft
               | would have been able to do it. They can't even get their
               | own UI situated, much less adaptive. Windows 10/11 is
               | this odd mishmash of old and new, without a consistent
               | language across it. They can't unify what isn't even
               | cohesive in the first place.
        
               | mvkel wrote:
               | I'd be very surprised if Apple is paying attention to
               | anything that's happening with windows. At least as a
               | divining rod for how to execute.
        
               | bluescrn wrote:
                | At this point, there are two fundamentally different
                | types of computing that will likely never be mergeable
                | in a satisfactory way.
               | 
               | We now have 'content consumption platforms' and 'content
               | creation platforms'.
               | 
               | While attempts have been made to try and enable some
               | creation on locked-down touchscreen devices, you're never
               | going to want to try and operate a fully-featured version
               | of Photoshop, Maya, Visual Studio, etc on them. And if
               | you've got a serious workstation with multiple large
               | monitors and precision input devices, you don't want to
               | have dumbed-down touch-centric apps forced upon you
               | Win8-style.
               | 
               | The bleak future that seems likely is that the 'content
               | creation platforms' become ever more niche and far more
               | costly. Barriers to entry for content creators are raised
               | significantly as mainstream computing is mostly limited
               | to locked-down content consumption platforms. And Linux
               | is only an option for as long as non-locked-down hardware
               | is available for sensible prices.
        
               | eastbound wrote:
                | On the other hand, a $4000 mid-game MacBook doesn't
                | have a touchscreen, and that's heresy. Granted, you can
                | get the one with the emoji bar, but why interact using
                | touch on a bar when you could touch the screen directly?
                | 
                | Maybe the end game for Apple isn't full convergence,
                | but just having a touch screen on the Mac.
        
               | bluescrn wrote:
               | Why would you want greasy finger marks on your Macbook
               | screen?
               | 
               | Not much point having a touchscreen on a Macbook (or any
               | laptop really), unless the hardware has a 'tablet mode'
               | with a detachable or fold-away keyboard.
        
               | dghlsakjg wrote:
               | Mouse and keyboard is still a better interface for A LOT
               | of work. I have yet to find a workflow for any of my
               | professional work that would be faster or easier if you
               | gave me a touchscreen.
               | 
               | There are plenty of laptops that do have touchscreens,
               | and it has always felt more like a gimmick than a useful
               | hardware interface.
        
               | dialup_sounds wrote:
               | Kinda weird to exclude Procreate, Affinity, Final Cut,
               | Logic, etc. from your definition of content creation. The
               | trend has clearly been more professional and creative
               | apps year over year and ever more capable devices to run
               | them on. I mean, you're right that nobody wants to use
               | Photoshop on the iPad, but that's because there are
               | better options.
               | 
               | Honestly, the biggest barrier to creativity is thinking
               | you need a specific concept of a "serious workstation" to
               | do it. Plenty of people are using $2k+ desktops just to
               | play video games.
        
               | bluescrn wrote:
               | In these cases, it still seems that tablet-based tools
               | are very much 'secondary tools', more of a sketchpad to
               | fiddle with ideas while on the move, rather than
               | 'production tools'.
               | 
               | Then there's the whole dealing with lots of files and
               | version control side of things, essential for working as
               | part of a team. Think about creating (and previewing, and
               | finally uploading) a very simple web page, just HTML and
               | a couple of images, entirely on an iPad. While it's
               | probably quite possible these days, I suspect the
               | workflow would be abysmal compared to a 'proper computer'
               | where the file system isn't hidden from you and where
               | you're not constantly switching between full-screen apps.
               | 
               | And that's before you start dealing with anything with
               | significant numbers of files in deep directory
               | structures, or doing more technical image creation (e.g.
               | dealing with alpha channels). And of course, before
               | testing your webpage on all the major browsers. Hmm...
        
               | superb_dev wrote:
               | There are so many artists who exclusively work on their
               | iPad. It does seem cumbersome for a whole studio to use
               | iPads, but they can be a powerhouse for an individual
        
               | dialup_sounds wrote:
               | It seems weirdly arbitrary to say that tools people have
               | been using in production aren't "production tools".
        
               | dghlsakjg wrote:
               | > Barriers to entry for content creators are raised
               | significantly as mainstream computing is mostly limited
               | to locked-down content consumption platforms. And Linux
               | is only an option for as long as non-locked-down hardware
               | is available for sensible prices.
               | 
                | Respectfully, I partially disagree. It has never been
                | easier or more affordable to get into creating content.
                | You can create cinema-grade video with used cameras that
                | sell for a few hundred dollars. You can create
                | Pixar-level animation with open source software and a
                | pretty cheap computer. A computer that can edit 4K video
                | costs less than the latest iPhone. There are people who
                | create plenty of content with just a phone. Simply put,
                | it is orders of magnitude cheaper and easier to create
                | content than it was less than two decades ago, which is
                | why we are seeing so much content getting made. I used
                | to work for a newspaper, and it used to be a lot harder
                | and more expensive to produce audiovisual media.
                | 
                | My strong feeling is that the problem of content being
                | locked into platforms has precious little to do with
                | consumption-oriented hardware, and more to do with the
                | platforms. Embrace -> extinguish -> exclusivity ->
                | enshittify seems to be the model behind basically
                | anything that hosts user content these days.
        
               | fsflover wrote:
               | > At this point, there's two fundamentally different
               | types of computing that will likely never be mergeable in
               | a satisfactory way.
               | 
               | This is a completely artificial creation by Apple and
               | Google to extract more money from you. Nothing technical
               | prevents one from using a full OS on a phone today.
               | 
               | Sent from my Librem 5 running desktop GNU/Linux.
        
               | kmeisthax wrote:
               | You're right about the reason but wrong about the
               | timeline: Jobs saw Windows XP Tablet Edition and built a
               | skunkworks at Apple to engineer a tablet that did not
               | require a stylus. This was purely to spite a friend[0] of
               | his that worked at Microsoft and was very bullish on XP
               | tablets.
               | 
               | Apple then later took the tablet demo technology, wrapped
               | it up in a _very_ stripped-down OS X with a different
               | window server and UI library, and called it iPhone OS.
                | Apple was very clear from the beginning that Fingers
                | Can't Use Mouse Software, Damn It, and that the whole
                | ocean needed to be boiled to support the new user
                | interface paradigm[1]. They even have very specific UI
                | rules
               | _specifically_ to ensure a finger never meets a desktop
               | UI widget, including things like iPad Sidecar just not
               | forwarding touch events at all and only supporting
               | connected keyboards, mice, and the Apple Pencil.
               | 
               | Microsoft's philosophy has always been the complete
               | opposite. Windows XP through 7 had tablet support that
               | amounted to just some affordances for stylus users
               | layered on top of a mouse-only UI. Windows 8 was the
               | first time they took tablets seriously, but instead of
               | just shipping a separate tablet OS or making Windows
               | Phone bigger, they turned it into a parasite that ate the
               | Windows desktop from the inside-out.
               | 
               | This causes awkwardness. For example, window management.
               | Desktops have traditionally been implemented as a shared
               | data structure - a tree of controls - that every app on
               | the desktop can manipulate. Tablets don't support this:
               | your app gets one[2] display surface to present their
               | whole UI inside of[3], and that surface is typically
               | either full-screen or half-screen. Microsoft solved this
               | incongruity by shoving the entire Desktop inside of
               | another app that could be properly split-screened against
               | the new, better-behaved tablet apps.
               | 
                | If Apple _were_ to decide "ok, let's support Mac apps
                | on iPad", it'd have to be done in exactly the same way
               | Windows 8 did it, with a special Desktop app that
               | contained all the Mac apps in a penalty box. This is so
               | that they didn't have to add support for all sorts of
               | incongruous, touch-hostile UI like floating toolbars,
               | floating pop-ups, global menus, five different ways of
               | dragging-and-dropping tabs, and that weird drawer thing
               | you're not supposed to use anymore, to iPadOS. There
                | really isn't a way to gradually do this, either. You can
                | gradually add _feature parity_ with macOS (which they
                | should), but you can't gradually find ways to make
                | desktop UI designed by third parties work on a tablet.
               | You either put it in a penalty box, or you put all the
               | well-behaved tablet apps in their own penalty boxes, like
               | Windows 10.
               | 
               | Microsoft solved Windows 8's problems by going back to
               | the Windows XP/Vista/7 approach of just shipping a
               | desktop for fingers. Tablet Mode tries to hide this, but
               | it's fundamentally just window management automation, and
               | it has to handle all the craziness of desktop. If a
               | desktop app decides it wants a floating toolbar or a
               | window that can't be resized[4], Tablet Mode has to honor
               | that request. In fact, Tablet Mode needs a lot of
               | heuristics to tell what floating windows pair with which
               | apps. So it's a lot more awkward for tablet users in
               | exchange for desktop users having a usable desktop again.
               | 
               | [0] Given what I've heard about Jobs I don't think Jobs
               | was psychologically capable of having friends, but I'll
               | use the word out of convenience.
               | 
               | [1] Though the Safari team was way better at building
               | compatibility with existing websites, so much so that
               | this is the one platform that doesn't have a deep
               | mobile/desktop split.
               | 
               | [2] This was later extended to multiple windows per app,
               | of course.
               | 
                | [3] This is also why popovers and context menus _never_
                | extend outside their containing window on tablets.
                | Hell, also on websites. Even when you have multiwindow,
                | there's no API surface for "I want to have a control
                | floating on top of my window that is positioned over
                | here and has this width and height".
               | 
               | [4] Which, BTW, is why the iPad has no default calculator
               | app. Before Stage Manager there was no way to have a
               | window the size of a pocket calculator.
        
               | pram wrote:
               | Clip Studio is one Mac app port I've seen that was
               | literally the desktop version moved to the iPad. It
               | uniquely has the top menu bar and everything. They might
               | have made an exception because you're intended to use the
               | pencil and not your fingers.
        
               | Bluecobra wrote:
                | Honestly, using a stylus isn't that bad. I've had to
                | support floor traders for many years, and they all
                | still use a Windows-based tablet + a stylus to get
                | around. Heck, even Palm devices were a pleasure to use.
                | Not sure why Steve was so hell-bent against them; it
                | probably had to do with his beef with Sculley/Newton.
        
               | ukuina wrote:
               | > Palm devices were a pleasure to use.
               | 
               | RIP Graffiti.
        
               | faeriechangling wrote:
                | > Unlike MS, which went with the big-bang, in-your-face
                | approach that was oh-so successful.
                | 
                | It was kind of successful; touchscreen laptops see
                | pretty big sales nowadays. I don't know what crack they
                | were smoking with Windows 8.0 though.
        
             | skohan wrote:
              | Unfortunately I think "they share all the same apps" will
              | not include a terminal with root access, which is what
              | would really be needed to make the iPad a general-purpose
              | computer for development.
              | 
              | It's a shame, because it's definitely powerful enough, and
              | the idea of traveling with just an iPad seems super
              | interesting, but I imagine they will not extend those
              | features to any devices besides Macs.
        
               | LordDragonfang wrote:
                | I mean, it doesn't even have to be true "root" access.
                | Chromebooks have a containerized Linux environment, and
                | aside from the odd bug, the high-end ones are actually
                | great dev machines while retaining the "you spend most
                | of your time in the browser, so we may as well bake
                | that into the OS" base layer.
        
               | a_vanderbilt wrote:
                | I actually do use a Chromebook in this way! Out of all
                | the Linux machines I've used, it's the one I like best:
                | it gives me a space to work and provides an OS that I
                | don't have to babysit or mentally maintain.
        
               | presides wrote:
                | Been a while since I've used a Chromebook, but IIRC
                | there's ALSO root access that's just a bit more
                | difficult to get at, and you do actually need it from
                | time to time for various reasons, or at least you used
                | to.
        
               | LordDragonfang wrote:
                | You're thinking of Crouton, the old method of using
                | Linux on a Chromebook (which involved disabling boot
                | protection and setting up a second Linux install in a
                | chroot, with a keybind that allowed you to toggle
                | between the two environments).
                | 
                | Crostini is the new containerized version that is both
                | officially supported and integrated into ChromeOS.
        
             | 0x457 wrote:
              | I will settle for this: being able to connect two
              | monitors to the iPad and select which audio device sound
              | goes through. If I could run IntelliJ and compile Rust on
              | the iPad, I would promise to upgrade to the new iPad Pro
              | as soon as it is released, every time.
        
             | bonestamp2 wrote:
             | Agreed, this will be the way forward in the future. I've
             | already seen one of my apps (Authy) say "We're no longer
             | building a macOS version, just install the iPad app on your
             | mac".
             | 
              | That's great, but you need an M-series chip in your Mac
              | for that to work, so backwards compatibility only goes
              | back a few years at this point, which is fine for
              | corporate upgrade cycles but might be a bit short for
              | consumers at this time. But it will be fine in the future.
        
             | beeboobaa3 wrote:
             | Until an "iPhone" can run brew, all my developer tools,
             | steam, epic games launcher, etc it's hardly interesting.
        
             | plussed_reader wrote:
              | The writing was on the wall with the introduction of
              | Swift, IMO. Since then it's been a matter of
              | overcomplicating the iPad and dumbing down the macOS
              | interfaces to attain this goal. So much wasted
              | touch/negative space in macOS since Catalina to
              | compensate for fingers and adaptive interfaces; so many
              | hidden menus and long taps squirreled away in iOS.
        
             | reaperducer wrote:
             | _Maybe not a "it runs macOS", but an eventual "they share
             | all the same apps" with adaptive UIs_
             | 
             | M-class MacBooks can already run many iPhone and iPad apps.
        
           | criddell wrote:
           | > And here it is in an OS that can't even max out an M1
           | 
           | Do you really want your OS using 100% of CPU?
        
             | ric2b wrote:
             | They mean that this OS only runs iPad apps, it doesn't let
             | you run the kind of software you expect to take full
             | advantage of the CPU.
        
           | underdeserver wrote:
           | What function keys?
        
             | kstrauser wrote:
             | The new Magic Keyboard has a laptop-style row of function
             | keys (and esc!).
        
         | bitwize wrote:
         | Ultimately, does it matter?
         | 
         | Michelin-starred restaurants not only have top-tier chefs. They
         | have buyers who negotiate with food suppliers to get the best
         | ingredients they can at the lowest prices they can. Having a
         | preferential relationship with a good supplier is as important
         | to the food quality and the health of the business as having a
         | good chef to prepare the dishes.
         | 
         | Apple has top-tier engineering talent but they are also able to
         | negotiate preferential relationships with their suppliers, and
         | it's both those things that make Apple a phenomenal tech
         | company.
        
           | makeitdouble wrote:
            | Qualcomm also fabs with TSMC, and their newer 4nm processor
            | is expected to stay competitive with the M series.
            | 
            | If the magic comes mostly from TSMC, there's a good chance
            | these claims are true and that a series of better chips is
            | coming on the other platforms as well.
        
             | hot_gril wrote:
             | This info is much more useful than a comparison to
             | restaurants.
        
             | 0x457 wrote:
              | Does Qualcomm have any new CPU cores besides the one that
              | ARM claims they can't make due to licensing?
        
               | transpute wrote:
               | The one being announced on May 20th at Computex?
               | https://news.ycombinator.com/item?id=40288969
        
             | stouset wrote:
             | "Stay" competitive implies they've _been_ competitive.
             | Which they haven't.
             | 
             | I'm filing this into the bin with all the other "This next
             | Qualcomm chip will close the performance gap" claims made
             | over the past decade. Maybe this time it'll be true. I
             | wouldn't bet on it.
        
               | makeitdouble wrote:
                | Point taken. I used "stay" as in: their next
                | rumored/leaked chip wouldn't be a single anomalous
                | success but the start of a trend that could extend to
                | the X2 and X3 Elite chips coming after.
                | 
                | Basically, we'd need some basis to believe they'll keep
                | improving at more or less the same pace as Intel's or
                | Apple's chips before getting on board with ARM laptops
                | for Windows/Linux.
                | 
                | Otherwise I don't see software makers caring enough to
                | port their builds to ARM as well.
        
         | GeekyBear wrote:
         | They don't mention which metric is 50% higher.
         | 
         | However, we have more CPU cores, a newer core design, and a
         | newer process node which would all contribute to improving
         | multicore CPU performance.
         | 
         | Also, Apple is conservative on clock speeds, but those do tend
         | to get bumped up when there is a new process node as well.
        
         | philistine wrote:
          | Actually, TSMC's N3E process is somewhat of a regression from
          | the first-generation 3nm process, N3. However, it is simpler
         | more cost-efficient, and everyone seems to want to get out of
         | that N3 process as quickly as possible. That seems to be the
         | biggest reason Apple released the A17(M3) generation and now
         | the M4 the way they did.
         | 
         | The N3 process is in the A17 Pro, the M3, M3 Pro, and M3 Max.
          | The A17 Pro name seems to imply you won't find it trickling
          | down to the regular iPhones next year. So we'll see that
          | processor in phones only this year, since Apple discontinues
          | its Pro range of phones every year; only the regular phones
          | trickle downrange, lowering their prices. The M3 devices are
          | all Macs that needed an upgrade due to their popularity: the
          | MacBook Pro and MacBook Air. They made three chips for them,
          | but they did not make an M3 Ultra for the lower-volume
          | desktops. With the announcement of an M4 chip in iPads today,
          | we can expect to see the MacBook Air and MacBook Pro upgraded
          | to M4 soon, with the introduction of an M4 Ultra to match
          | later. We can now expect those M3 devices to be discontinued
          | instead of going downrange in price.
         | 
          | That would leave one device with an N3-process chip: the
          | iMac. Given its sales volume, I wouldn't be surprised if all
          | the M3 chips that will go into it are made this year, with
          | the model staying around for a year or two running on fumes.
        
           | GeekyBear wrote:
           | The signs certainly all point to the initial version of N3
           | having issues.
           | 
           | For instance, Apple supposedly required a deal where they
           | only paid TSMC for usable chips per N3 wafer, and not for the
           | entire wafer.
           | 
           | https://arstechnica.com/gadgets/2023/08/report-apple-is-
           | savi...
        
             | dehrmann wrote:
              | My read on the absurd number of MacBook M3 SKUs was that
              | they had yield issues.
        
               | GeekyBear wrote:
               | There is also the fact that we currently have an iPhone
               | generation where only the Pro models got updated to chips
               | on TSMC 3nm.
               | 
               | The next iPhone generation is said to be a return to form
               | with all models using the same SOC on the revised version
               | of the 3nm node.
               | 
               | > Code from the operating system also indicates that the
               | entire iPhone 16 range will use a new system-on-chip -
               | t8140 - Tahiti, which is what Apple calls the A18 chip
               | internally. The A18 chip is referenced in relation to the
               | base model iPhone 16 and 16 Plus (known collectively as
               | D4y within Apple) as well as the iPhone 16 Pro and 16 Pro
               | Max (referred to as D9x internally)
               | 
               | https://www.macrumors.com/2023/12/20/ios-18-code-four-
               | new-ip...
        
           | dhx wrote:
            | N3E still has a +9% logic transistor density increase over
            | N3 despite a relaxation of design rules, thanks in part to
            | the introduction of FinFlex.[1] Critically though, SRAM
            | cell sizes remain the same as on N5 (reversing the ~5%
            | reduction in N3), and it looks like the situation with
            | SRAM cell sizes won't be improving soon.[2][3] It appears
            | more likely that designers, particularly of AI chips, will
            | just stick with N5 as their designs are increasingly
            | constrained by SRAM.
           | 
           | [1] https://semiwiki.com/semiconductor-
           | manufacturers/tsmc/322688...
           | 
           | [2] https://semiengineering.com/sram-scaling-issues-and-what-
           | com...
           | 
           | [3] https://semiengineering.com/sram-in-ai-the-future-of-
           | memory/
        
             | sroussey wrote:
              | SRAM has really stalled. I don't think 5nm was much
              | better than 7nm. On ever-smaller nodes, SRAM will be
              | taking up a larger and larger percentage of the entire
              | chip. But the cost is much higher on the smaller nodes
              | even if the performance is not better.
              | 
              | I can see why AMD started putting the SRAM on top.
        
               | magicalhippo wrote:
                | It wasn't immediately clear to me why SRAM wouldn't
                | scale like logic. This article[1] and this paper[2]
                | shed some light.
               | 
               | From what I can gather the key aspects are that decreased
               | feature sizes lead to more variability between
               | transistors, but also to less margin between on-state and
               | off-state. Thus a kind of double-whammy. In logic
               | circuits you're constantly overwriting with new values
               | regardless of what was already there, so they're not as
               | sensitive to this, while the entire point of a memory
               | circuit is to reliably keep values around.
               | 
                | Alternate transistor designs such as FinFET and gate-
                | all-around can mitigate some of this, say by reducing
                | transistor-to-transistor variability by a factor, but
                | they can't get around the root issue.
               | 
               | [1]: https://semiengineering.com/sram-scaling-issues-and-
               | what-com...
               | 
               | [2]:
               | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9416021/
        
         | mensetmanusman wrote:
          | Also the thousands of suppliers that have improved their
          | equipment and supplies that feed into the TSMC fabs.
        
         | tuckerpo wrote:
         | It is almost certainly half as much power in the RMS sense, not
         | absolute.
        
         | smrtinsert wrote:
         | Breathtaking
        
         | beeboobaa3 wrote:
         | > And compared with the latest PC chip in a thin and light
         | laptop, M4 can deliver the same performance using just a fourth
         | of the power
         | 
         | It can deliver the same performance as itself at just a fourth
         | of the power than it's using? That's incredible!
        
         | nblgbg wrote:
          | That doesn't seem to be reflected in the battery life of
          | these: they have the exact same battery life. Does that mean
          | the claim isn't entirely accurate? Since they don't list the
          | battery capacity in their specs, it's hard to confirm.
        
           | davidee wrote:
           | I haven't paid too much attention today, but what I did see
           | with the iPad Pro was that they're using an OLED display
           | (maybe even some kind of double layer OLED for increased
           | brightness if I'm understanding the marketing jargon?).
           | 
           | I believe that OLED is much more power hungry than the
           | previous display type (LED backlit LCD of some type?). I
           | could be wrong, but in TV land that's the case...
           | 
           | Could explain, at least partly, why run time isn't greatly
           | increased.
        
         | thih9 wrote:
         | They mention just M2 and M4 - curious, how does M3 fit into
         | that?
         | 
         | I.e. would it sit between, or closer to M2 or M4?
        
         | tyrd12 wrote:
        | Considering the cost difference, would that still make the M4
        | better? Or are whatever savings in power offset by the price?
        
       | mlhpdx wrote:
        | I'm far from an expert in Apple silicon, but this strikes me as
        | having some conservative improvements. Any in-depth info out
        | there yet?
        
         | Lalabadie wrote:
         | 2x the performance per watt is a great improvement, though.
        
           | refulgentis wrote:
            | Clever wording on their part: 2x performance per watt over
            | the M2. Took me a minute; I had to reason through the fact
            | that this is their 2nd-generation 3nm chip, so it wasn't
            | from a die shrink, and then go spelunking.
        
           | jeffbee wrote:
           | This claim can only be evaluated in the context of a specific
           | operating point. I can 6x the performance per watt of the CPU
           | in this machine I am using by running everything on the
           | efficiency cores and clocking them down to 1100MHz. But
           | performance per watt is not the only metric of interest.
        
         | gmm1990 wrote:
          | It surprised me they called it an M4 rather than an M3
          | something. The display engine seems to be the largest change,
          | though I don't know what that looked like on previous
          | processors. Completely hypothesizing, but it could be a
          | significant efficiency improvement if it's offloading display
          | work.
        
           | aeonik wrote:
           | 3 isn't a power of two, maybe the M8 is next.
        
           | duxup wrote:
            | I'd rather they just keep counting up than turn into one of
            | those companies that get into wonky product-line naming
            | convention hell.
            | 
            | It's OK whether or not 3 to 4 is a big jump; that it's the
            | next one is really all I want to know. If I need to peek at
            | the specs, the name won't tell me anything anyhow and I'll
            | be on a webpage.
        
         | shepherdjerred wrote:
         | I expect the pro/max variants will be more interesting. The
         | improvements do look great for consumer devices, though.
        
         | Findecanor wrote:
         | I'm guessing that the "ML accelerator" in the CPU cores means
         | one of ARM's SME extensions for matrix multiplication. SME in
         | ARM v8.4-A adds dot product instructions. v8.6-A adds more,
         | including BF16 support.
         | 
         | https://community.arm.com/arm-community-blogs/b/architecture...
        
           | hmottestad wrote:
            | Apple has the NPU (also called the Apple Neural Engine),
            | which is dedicated hardware for running inference. It can't
            | be used for LLMs at the moment, though; maybe the M4 will
            | be different. They also have a vector processor attached to
            | the performance cluster of the CPU; they call its
            | instruction set AMX. I believe that one can be leveraged
            | for faster LLM inferencing.
           | 
           | https://github.com/corsix/amx
        
       | EduardoBautista wrote:
        | The 256GB and 512GB models have 8GB of RAM. The 1TB and 2TB
        | models have 16GB. Not a fan of tying RAM to storage.
       | 
       | https://www.apple.com/ipad-pro/specs/
        
         | praseodym wrote:
         | And also one less CPU performance core for the lower storage
         | models.
        
           | 05 wrote:
           | Well, they have to sell the dies with failed cores somehow..
        
         | Laaas wrote:
         | The economic reasoning behind this doesn't make sense to me.
         | 
         | What do they lose by allowing slightly more freedom in
         | configurations?
        
           | fallat wrote:
           | The reasoning is money. Come on.
        
           | blegr wrote:
           | It forces you to buy multiple upgrades instead of just the
           | one you need.
        
             | jessriedel wrote:
             | But why does this make them more money than offering
             | separate upgrades at higher prices?
             | 
             | I do think there is a price discrimination story here, but
             | there are some details to be filled in.
        
               | foldr wrote:
               | It's not obvious to me that Apple does make a significant
               | amount of money by selling upgrades. Almost everyone buys
               | the base model. The other models are probably little more
               | than a logistical pain in the butt from Apple's
               | perspective. Apple has to offer more powerful systems to
               | be credible as a platform, but I wouldn't be surprised if
               | the apparently exorbitant price of the upgrades reflects
               | the overall costs associated with complicating the
               | production and distribution lines.
        
               | blegr wrote:
               | It's not about the price of upgrades though, it's about
               | their bundling together and the ridiculously stingy base
               | specs that often make the upgrade non-optional. People
               | who buy a base MacBook Air probably aren't thinking about
               | keeping it for 8 years or using it for heavy workloads.
        
               | foldr wrote:
                | Sure, but bundling them together reduces supply chain
                | complexity and reduces Apple's costs. If the options
                | were more fine-grained, Apple would sell even fewer of
                | each model and it would be even less worth their while.
               | 
               | Also, I _have_ seen lots of people on HN complain about
               | the price itself, even if it 's not what you yourself
               | object to.
        
             | sneak wrote:
             | That's not what "force" means.
        
               | blegr wrote:
               | Yes, but you understood what I meant since you could
               | assert that it's not what it means.
        
           | jessriedel wrote:
           | I think it's a price discrimination technique.
        
           | a_vanderbilt wrote:
            | Margins and profit. Fewer variations in production make
            | for higher efficiency, and segmenting the product line can
            | push consumers to purchase higher tiers of product. It's
            | iOS anyway, and the people who know enough to care how much
            | RAM they're getting are self-selecting for those higher
            | product tiers.
        
           | vinkelhake wrote:
            | It's just Apple's price ladder. The prices of their
            | different SKUs are laid out carefully so that there's never
            | too big a jump to the next level.
           | 
           | https://talkbackcomms.com/blogs/news/ladder
        
           | naravara wrote:
            | Logistical efficiencies, mostly. It ends up being a lot of
            | additional SKUs to manage, and it would probably discourage
            | people from moving up a price tier when they otherwise
            | would have. So from Apple's perspective they're taking on
            | more hassle (which costs) for the benefit of selling you
            | lower-margin products. No upside for them besides maybe
            | higher customer satisfaction, and I doubt it would have
            | moved the needle on that very much.
        
           | izacus wrote:
           | They push you to buy the more expensive model with higher
           | margins.
           | 
            | This is what they did when I was buying an iPad Air: it
            | starts at a problematically low 64GB of storage... and the
            | 256GB model is the next one up, with a massive price jump.
            | 
            | It's the same kind of "anchoring" (marketing term) that car
            | dealers use to lure you into deciding on their car based on
            | the cheapest $29,999 model, which with "useful" equipment
            | will end up costing you something like $45,000.
        
             | aeyes wrote:
             | Honest question: What data do you store on an iPad Air? On
             | a phone you might have some photos and videos but isn't a
             | tablet just a media consumption device? Especially on iOS
             | where they try to hide the filesystem as much as possible.
        
               | izacus wrote:
                | No data, but iOS apps have gotten massive, caches have
                | gotten massive, and after you install a game or two,
                | 64GB is gone.
                | 
                | Not to mention that it's occasionally nice to have a
                | set of downloaded media available for vacation/travel,
                | and 64GB isn't enough to download a week's worth of
                | content from Netflix.
                | 
                | This is why this is so annoying - you're right, I don't
                | need 512GB or 256GB. But I'd still like to have more
                | than a "You're out of space!!" amount.
        
               | trogdor wrote:
               | Where is 64GB coming from?
        
               | izacus wrote:
               | The base iPad Air model - the one the price is most
               | quoted - is 64GB.
        
               | trogdor wrote:
               | No it's not.
               | 
               | https://www.apple.com/ipad-air/specs/
        
               | izacus wrote:
               | Not sure if you're pretending to not know, but all
               | previous base iPad models were 64GB.
        
               | nozzlegear wrote:
                | I've had the original iPad Pro with 64GB since it first
                | released and have somehow never run out of storage.
                | Maybe my problem is that I don't download games. I'd
                | suggest using a USB drive for downloaded media, though,
                | if you're planning to travel. All of the media apps I
                | use (Netflix, YouTube, Crunchyroll, etc.) support them.
                | That's worked well for me and is one reason I was
                | comfortable buying the 64GB model.
        
               | sroussey wrote:
               | How do you get Netflix to use an external drive?
        
               | nozzlegear wrote:
               | Sorry, I thought I had done this with Netflix but I tried
               | it just now and couldn't find the option. Then I googled
               | it and it looks like it was never supported, I must've
               | misremembered Netflix being an option.
        
               | imtringued wrote:
               | As he said, you buy excess storage so that you don't
               | have to think about how much storage you are using.
               | Meanwhile, if you barely have enough, you're going to
               | have to play data tetris. You can find 256GB SSDs that
               | sell for as little as 20EUR. How much money is it
               | worth to not worry about running out of space?
               | Probably more than the cost of the SSD at these
               | prices.
        
               | lotsofpulp wrote:
               | Email, password manager, iOS keychain, photos, videos,
               | etc should all be there if synced to iCloud.
        
               | maxsilver wrote:
               | > What data do you store on an iPad Air?
               | 
               | Games. You can put maybe three or four significant
               | games on an iPad Air before it maxes out. (MTG Arena
               | is almost 20GB all on its own, Genshin Impact is like
               | 40+ GB.)
        
               | jdminhbg wrote:
               | > isn't a tablet just a media consumption device?
               | 
               | This is actually most of the storage space -- videos
               | downloaded for consumption in places with no or bad
               | internet.
        
               | magicalhippo wrote:
               | Surely it supports USB OTG? Or is that just an Android
               | thing[1]?
               | 
               | [1]: https://liliputing.com/you-can-use-a-floppy-disk-
               | drive-with-...
        
               | izacus wrote:
               | Even on Android you can't download streaming media to OTG
               | USB storage.
        
               | adamomada wrote:
               | Once again, pirates win, paying customers lose
        
               | kmeisthax wrote:
               | Yes. However, applications have to be specifically
               | written to use external storage, which requires popping
               | open the same file picker you use to interact with non-
               | Apple cloud storage. If they store data in their own
               | container, then that can only ever go on the internal
               | storage, iCloud, or device backups. You aren't allowed to
               | rugpull an app and move its storage somewhere else.
               | 
               | I mean, what would happen if you yanked out the drive
               | while an app was running on it?
        
               | wincy wrote:
               | We use PLEX for long trips in the car for the kids. Like
               | 24 hour drives. We drive to Florida in the winter and the
               | iPads easily run out of space after we've downloaded a
               | season or two of Adventure Time and Daniel Tiger.
               | 
               | I could fit more if I didn't insist on downloading
               | everything 1080p I guess.
        
               | skydhash wrote:
               | VLC or Infuse + external storage.
        
               | lozenge wrote:
               | iPad OS is 17 GB and every app seems to think it'll be
               | the only one installed.
        
               | nomel wrote:
               | > but isn't a tablet just a media consumption device
               | 
               | In my sphere, everyone with an iPad uses it for the Apple
               | Pencil and/or video editing. Raw files for drawings get
               | surprisingly big, once you get up into the many tens of
               | layers, considering an artist can draw a few a day.
        
           | gehsty wrote:
           | Fewer iPad SKUs = more efficient manufacturing and logistics,
           | at iPad scale probably means a very real cost saving.
        
           | michaelt wrote:
           | Because before you know it you're Dell and you're stocking 18
           | different variants of "Laptop, 15 inch screen, 16GB RAM,
           | 512GB SSD" and users are scratching their heads trying to
           | figure out WTF the difference is between a "Latitude 3540" a
           | "Latitude 5540" and a "New Latitude 3550"
        
             | gruez wrote:
             | I can't tell whether this is serious or not. Surely
             | adding independently configurable memory/storage
             | combinations won't confuse the user any more than the
             | existing configurable storage options confuse users about
             | which iPhone to get?
        
               | its_ethan wrote:
               | Configuring your iPhone storage is something every
               | consumer has a concept of; it's some function of "how
               | many pictures can I store on it?" When it comes to
               | CPU/GPU/RAM and you're having to configure all three,
               | the average person is absolutely more likely to be
               | confused.
               | 
               | It's anecdotal, but 8/10 people that I know over the
               | age of 40 would have no idea what RAM or CPU
               | configurations even theoretically do for them. This is
               | probably the case for _most_ iPad purchasers, and Apple
               | knows this - so why would they provide
               | expensive/confusing configurability options just for
               | the handful of tech-y people who may care? There are
               | still high/med/low performance variants that those
               | people can choose from, and the number of people for
               | whom that would sour a sale is vanishingly small - and
               | they're likely not even looking at Apple in the first
               | place.
        
             | skeaker wrote:
             | Yes, the additional $600 they make off of users who just
             | want extra RAM is just an unfortunate side effect of the
             | unavoidable process of not being Dell. Couldn't be any
             | other reason.
        
             | kmeisthax wrote:
             | Apple already fixed this with the Mac: they stock a handful
             | of configurations most likely to sell, and then everything
             | else is a custom order shipped direct from China. The
             | reason why Apple has to sell specific RAM/storage pairs for
             | iPads is that they don't have a custom order program for
             | their other devices, so everything _has_ to be an SKU,
             | _and_ has to sell in enough quantity to justify being an
             | SKU.
        
           | abtinf wrote:
           | The GP comment can be misleading because it suggests Apple is
           | tying storage to ram. That is not the case (at least not
           | directly).
           | 
           | The RAM and system-on-chip are tied together as part of the
           | system-on-package. The SoP is what enables M chips to hit
           | their incredible memory bandwidth numbers.
           | 
           | This is not an easy thing to make configurable. They can't
           | just plug in a different memory chip as a final assembly
           | step before shipping.
           | 
           | They only have two SoPs as part of this launch: a 9-core
           | CPU with 8GB, and a 10-core CPU with 16GB. The RAM is
           | unified across CPU/GPU (and I would assume the Neural
           | Engine too).
           | 
           | Each new SoP is going to reduce economies of scale and
           | increase supply chain complexity. The 256/512gb models are
           | tied to the first package, the 1/2tb models are tied to the
           | second. Again, these are all part of the PCB, so production
           | decisions have to be made way ahead of consumer orders.
           | 
           | Maybe it's not perfect for each individual's needs, but it
           | seems reasonable to assume that those with greater storage
           | needs also would benefit from more compute and RAM. That is,
           | you need more storage to handle more video production so you
           | are probably more likely to use more advanced features which
           | make better use of increased compute and RAM.
        
           | dragonwriter wrote:
           | > What do they lose by allowing slightly more freedom in
           | configurations?
           | 
           | More costs everywhere in the chain; limiting SKUs is a big
           | efficiency from manufacturing to distribution to retail to
           | support, and it is an easy way (for the same reason) to
           | improve the customer experience, because it makes it a lot
           | easier to not be out of or have delays for a customer's
           | preferred model, as well as making the UI (online) or
           | physical presentation (brick and mortar) for options much
           | cleaner.
           | 
           | Of course, it can feel worse if you are a power user with
           | detailed knowledge of your particular needs in multiple
           | dimensions and you feel like you are paying extra for
           | features you don't want, but the efficiencies may make that
           | feeling an illusion -- with more freedom, you would be
           | paying for the additional costs that freedom created: a
           | higher cost for the same options, and possibly just as much
           | or more for the particular combination you would prefer
           | with multidimensional freedom as for the one with extra
           | features without it. Though that counterfactual is
           | impossible to test.
        
           | Aurornis wrote:
           | Several things:
           | 
           | 1. Having more SKUs is expensive, for everything from
           | planning to inventory management to making sure you have
           | enough shelf space at Best Buy (which you have to negotiate
           | for). Chances are good that stores like Best Buy and Costco
           | would only want 2 SKUs anyway, so the additional configs
           | would be a special-order item for a small number of
           | consumers.
           | 
           | 2. After a certain point, adding more options actually
           | _decreases_ your sales. This is confusing to people who think
           | they'd be more likely to buy if they could get exactly what
           | they wanted, but what you're not seeing is the legions of
           | casual consumers who are thinking about maybe getting an
           | iPad, but would get overwhelmed by the number of options.
           | They might spend days or weeks asking friends which model to
           | get, debating about whether to spend extra on this upgrade or
           | that, and eventually not buying it or getting an alternative.
           | If you simplify the lineup to the "cheap one" and the "high
           | end one" then people abandon most of that overhead and just
           | decide what they want to pay.
           | 
           | The biggest thing tech people miss is that they're not the
           | core consumers of these devices. The majority go to casual
           | consumers who don't care about specifying every little thing.
           | They just want to get the one that fits their budget and move
           | on. Tech people are secondary.
        
           | sneak wrote:
           | It benefits Apple to not have people thinking about or
           | worrying about the amount of ram in their iPad. The OS
           | doesn't surface it anywhere.
        
         | samatman wrote:
         | It's fairly absurd that they're still selling a 256GB "Pro"
         | machine in the first place.
         | 
         | That said, Apple's policy toward SKUs is pretty consistent:
         | you pay more money and you get more machine, and vice versa.
         | The MacBooks are the only products with separately
         | configurable memory/storage/chip, and even there some
         | combinations aren't manufactured.
        
           | phkahler wrote:
           | >> It's fairly absurd that they're still selling a 256GB
           | "Pro" machine in the first place.
           | 
           | My guess is they want you to use their cloud storage and pay
           | monthly for it.
        
             | CharlieDigital wrote:
             | That doesn't make any sense.
             | 
             | I'm not storing my Docker containers and `node_modules` in
             | the cloud.
             | 
             | Pro isn't just images and videos.
        
               | davedx wrote:
               | This is a tablet not a laptop
        
             | skydhash wrote:
             | Or use external storage. I'd be wary of using my iPad as
             | primary storage anyway. It only holds work in progress
             | and whatever I'm currently watching/reading.
        
             | samatman wrote:
             | If that were the goal (I don't think it is), they'd be
             | better off shipping enough storage to push people into the
             | 2TB tier, which is $11 vs. $3 a month for 200GB.
             | 
             | I said this in a sibling comment already, but I think it's
             | just price anchoring so that people find the $1500 they're
             | actually going to pay a bit easier to swallow.
        
           | dijit wrote:
           | Real creative pros will likely be using a 10G Thunderbolt
           | NIC to a SAN; local video editing is not advised unless
           | it's only a single project at a time.
           | 
           | Unless, that is, you are a solo editor.
        
           | azinman2 wrote:
           | I have a 256G iPhone. I think I'm using like 160G. Most stuff
           | is just in the cloud. For an iPad it wouldn't be any
           | different, modulo media cached for flights. I could see some
           | cases like people working on audio to want a bunch stored
           | locally, but it's probably in some kind of compressed format
           | such that it wouldn't matter too much.
           | 
           | What is your concern?
        
             | samatman wrote:
             | I don't know about 'concern' necessarily, but it seems
             | to me that 512GB is a more realistic minimum for the base
             | Pro model. There are plenty of use cases where that
             | amount of storage is overkill, but they're all served
             | better by the Air, which comes in the same sizes and with
             | as little as 128GB of storage.
             | 
             | I would expect most actual users of the Pro model, now that
             | 13 inch is available at the lower tier, would be working
             | with photos and video. Even shooting ProRes off a pro
             | iPhone is going to eat into 256 pretty fast.
             | 
             | Seems like that model exists mainly so they can charge
             | $1500 for the one people are actually likely to get, and
             | still say "starts at $1299".
             | 
             | Then again, it's Apple, and they can get away with it, so
             | they do. My main point here is that the 256GB model is bad
             | value compared to the equivalent Air model, because if you
             | have any work where the extra beef is going to matter, it's
             | going to eat right through that amount of storage pretty
             | quick.
        
               | its_ethan wrote:
               | I think you're underestimating the number of people who
               | go in to buy an iPad and gravitate to the Pro because it
               | looks the coolest and sounds like a luxury thing. For
               | those people, who are likely just going to use it for web
               | browsing and streaming videos, the cheapest configuration
               | is the only one they care about.
               | 
               | That type of buyer is a very significant % of sales
               | for iPad Pros. Despite the marketing, there are really
               | not that many people (as a % of sales) who will be
               | pushing these iPads anywhere even remotely close to
               | their computational/storage/spec limits.
        
         | giancarlostoro wrote:
         | Honestly though, that's basically every tablet: you can't
         | change the RAM, you get what you get and that's it. Maybe
         | they should call them by different names, like Pro Max for
         | the ones with 16GB, to make it more palatable? Small
         | psychological hack.
        
           | dotnet00 wrote:
           | The Samsung tablets at least still retain the SD card slot,
           | so you can focus more on the desired amount of RAM and not
           | worry too much about the built-in storage size.
        
             | Teever wrote:
             | It would be cool if regulators mandated that companies
             | like Apple provide models of devices with SD card slots
             | and a seamless way to integrate that storage into the
             | OS/applications.
             | 
             | That, combined with replaceable batteries, would go a
             | long way toward reducing the amount of e-waste.
        
               | mschuster91 wrote:
               | And then people would stick alphabet-soup SD cards into
               | their devices and complain about performance and data
               | integrity, it's enough of a headache in the Android world
               | already (or has been before Samsung and others finally
               | decided to put in enough storage for people to not rely
               | on SD cards any more).
               | 
               | In contrast, Apple's internal storage is, to my
               | knowledge, always very durable NVMe, attached
               | logically and physically directly to the CPU, which
               | makes their shenanigans with low RAM sizes possible in
               | the first place - they swap like hell, but as a user
               | you barely notice it because it's so blazing fast.
        
               | Teever wrote:
               | Yeah, jackasses are always gonna jackass. There's
               | still a public interest in making devices upgradable
               | for the purpose of minimizing e-waste.
               | 
               | I'd just love to buy a device with a moderate amount
               | of unupgradeable SSD and an SD slot, so that I can put
               | more memory in it later and the device can last
               | longer.
        
               | mschuster91 wrote:
               | Agreed, but please with something other than microSD
               | cards. Yes, microSD Express is a thing [1], but both
               | cards and hosts supporting it are rare, the size
               | format doesn't exactly lend itself to durable flash
               | chips, thermals are questionable, and even the most
               | modern microSD Express cards barely hit 800 MB/sec,
               | whereas Apple's stuff has hit twice that or more for
               | years [2].
               | 
               | [1] https://winfuture.de/news,141439.html
               | 
               | [2] https://9to5mac.com/2024/03/09/macbook-
               | air-m3-storage-speeds...
        
               | davedx wrote:
               | Not everything has to be solved by regulators. The walled
               | garden is way more important to fix than arbitrary
               | hardware configurations
        
             | zozbot234 wrote:
             | Doesn't the iPad come with a USB-C port nowadays? You can
             | attach an external SD card reader.
        
               | amlib wrote:
               | Just like I don't want an umbilical cord hanging out of
               | me just to perform the full extent of my bodily
               | functions, I also wouldn't want a dongle hanging off my
               | tablet for it to be deemed usable.
        
         | paulpan wrote:
         | I don't think it's strictly for price gouging/segmentation
         | purposes.
         | 
         | On the MacBooks (running macOS), RAM has been used as data
         | cache to speed up data read/write performance until the
         | actual SSD storage operation completes. It makes sense for
         | Apple to account for this with a higher RAM spec on the
         | 1TB/2TB configurations.
        
           | pantalaimon wrote:
           | > RAM has been used as data cache to speed up data read/write
           | performance until the actual SSD storage operation completes.
           | 
           | I'm pretty sure that's what all modern operating systems are
           | doing.
        
             | eddieroger wrote:
             | Probably, but since we're talking about an Apple product,
             | comparing it to macOS makes sense, since they all share
             | the same bottom layer.
        
               | 0x457 wrote:
               | Not "probably" - that's just how any modern OS works.
               | It also uses RAM as a cache to avoid reads from
               | storage, just like any other modern OS.
               | 
               | Apple uses it for segmentation and nothing else.
               | 
               | Modern being: since the 80s.
        
               | tomxor wrote:
               | Even on the Atari ST you would use a "RAM disk" when
               | working with "large" data before manually flushing it to
               | a floppy. Some people would use the trashcan icon to
               | emphasise the need to manually flush... Not quite a
               | cache, but the concept was there.
        
             | wheybags wrote:
             | I'm writing this from memory, so some details may be wrong
             | but: most high end ssds have dram caches on board, with a
             | capacitor that maintains enough charge to flush the cache
             | to flash in case of power failure. This operates below the
             | system page cache that is standard for all disks and oses.
             | 
             | Apple doesn't do this, and instead uses their tight
             | integration to perform a similar function using system
             | memory. So there is some technical justification, I
             | think. They are 100% price gougers though.
        
               | beambot wrote:
               | One company's "Price Gouging" is another's "Market
               | Segmentation"
        
               | gaudystead wrote:
               | > writing this from memory
               | 
               | Gave me a chuckle ;)
        
               | kllrnohj wrote:
               | Using host memory for SSD caches is part of the NVMe
               | spec, it's not some Apple-magic-integration thing:
               | https://www.servethehome.com/what-are-host-memory-buffer-
               | or-...
               | 
               | It's also still typically just worse than an actual dram
               | cache.
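               | 
               | For the curious, a rough sketch of the sizing (hedged:
               | per my reading of the NVMe base spec, the Identify
               | Controller fields HMPRE/HMMIN report the preferred and
               | minimum Host Memory Buffer size in 4 KiB units; the
               | raw value below is a made-up example, not from a real
               | device):
               | 
               |   # Python; illustrative only
               |   HMB_UNIT = 4096   # bytes; HMPRE/HMMIN are counted
               |                     # in 4 KiB units per the NVMe spec
               |   hmpre = 16384     # hypothetical device value
               |   mib = hmpre * HMB_UNIT / 2**20
               |   print(f"preferred HMB: {mib:.0f} MiB")  # -> 64 MiB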
        
           | __turbobrew__ wrote:
           | That is called a buffer/page cache and has existed in
           | operating systems since the 1980s.
        
             | riazrizvi wrote:
             | No, this is caching for SSDs; it's not in the same league.
        
             | btown wrote:
             | With hardware where power-off is only controlled by
             | software, battery life is predictable, and large amounts of
             | data like raw video are being persisted, they might have a
             | very aggressive version of page caching, and a large amount
             | of storage may imply that a scale-up of RAM would be
             | necessary to keep all the data juggling on a happy path.
             | That said, there are no non-business reasons why they
             | couldn't extend that large RAM to smaller storage systems
             | as well.
        
               | marshray wrote:
               | People without the "large amount of storage" model need
               | to record video from the camera too.
               | 
               | The justifications I see are to reduce the number of
               | models needed to stock and to keep the purchasing
               | decision simple for customers. These are very good
               | reasons.
        
             | astrange wrote:
             | It's unified memory which means the SSD controller is also
             | using the system memory. So more flash needs more memory.
        
               | spixy wrote:
               | Then give me more memory. 512gb storage with 16gb ram
        
               | astrange wrote:
               | This post starts with "then" but isn't responsive to
               | anything I said.
        
           | gruez wrote:
           | How does that justify locking the 16GB option to 1TB/2TB
           | options?
        
             | 0x457 wrote:
             | Since memory is on their SoC, it is challenging to
             | maintain multiple SKUs. This segmentation makes sense to
             | me as a consumer.
        
               | thejazzman wrote:
               | If that were the case why do they bother with an iPad,
               | iPad Air, iPad Pro, iPhone SE, iPhone, iPhone Pro, iPhone
               | Pro Max, ... each with their own number of colors and
               | storage variations.
               | 
               | But no 16GB without more SSD? lol?
        
               | 0x457 wrote:
               | iPad Air was created to make a new price category.
               | 
               | The iPhone SE exists because there is a market for this
               | form factor. If you look at the specs, you would notice
               | it uses hardware previously used by more expensive
               | models.
               | 
               | > iPhone, iPhone Pro, iPhone Pro Max
               | 
               | Again, different customers for different form-factors.
               | These phones differ more than just SoC in them.
               | 
               | You understand that having N different colors of iPads is
               | different from having N different SoCs for the same model
               | of an iPad.
        
           | lozenge wrote:
           | The write speed needs to match what the camera can output or
           | the WiFi/cellular can download. It has nothing to do with the
           | total size of the storage.
        
           | bschne wrote:
           | Shouldn't the required cache size be dependent on throughput
           | more so than disk size? It does not necessarily seem like
           | you'd need a bigger write cache if the disk is bigger, people
           | who have a 2TB drive don't read/write 2x as much in a given
           | time as those with a 1TB drive. Or am I missing something?
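           | 
           | A back-of-the-envelope sketch of that intuition (the
           | numbers are illustrative assumptions, not Apple's actual
           | design):
           | 
           |   # Python: a write cache is sized by ingest rate times
           |   # the stall time it must absorb, not by disk capacity.
           |   ingest_mb_s = 90   # e.g. a ProRes-class 4K recording
           |   stall_s = 10       # worst-case pause before a flush
           |   print(ingest_mb_s * stall_s, "MB of cache")  # 900 MB
           |   # Same answer whether the disk is 256 GB or 2 TB.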
        
             | BitBanging wrote:
             | IIRC SSD manufacturers are likely to store a mapping table
             | of LBAs (logical block addresses) to PBAs (physical block
             | addresses) in the DRAM or Host Memory Buffer.
             | 
             | Some calculation like:
             | 
             | total storage size / page size per LBA (512B or 4KiB
             | usually) * mapping data structure size
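             | 
             | Worked through with illustrative numbers (the usual
             | back-of-the-envelope, not a vendor-confirmed figure):
             | 
             |   # Python: FTL mapping table size for a 1 TiB drive
             |   total_storage = 2**40   # 1 TiB of NAND
             |   lba_size = 4096         # 4 KiB logical blocks
             |   entry_size = 4          # bytes per LBA->PBA entry
             |   table = total_storage // lba_size * entry_size
             |   print(table / 2**30, "GiB")  # -> 1.0 GiB
             | 
             | which is where the rule of thumb of roughly 1 GB of DRAM
             | per 1 TB of flash comes from.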
        
               | inkyoto wrote:
               | > SSD manufacturers are likely to store a mapping table
               | of LBAs (logical block addresses) to PBAs (physical block
               | addresses) in the DRAM or Host Memory Buffer.
               | 
               | Are LBAs a thing on SSDs nowadays? I thought they were
               | a legacy of the spinning rust.
               | 
               | SSDs operate on memory pages of the flash memory, and
               | the page management is a complicated affair that is
               | also entirely opaque to the host operating system due
               | to the behind-the-scenes page remapping. Since flash
               | memory is less durable (in the long term), SSDs come
               | overprovisioned and the true SSD capacity is always
               | more (up to double, if my memory serves me well). The
               | SSD controller also runs an embedded RTOS that monitors
               | failures in flash chips and proactively evacuates and
               | remaps ailing flash memory pages onto the healthy ones.
               | Owing to this behaviour, the memory pages that the SSD
               | controller reports back to the operating system have
               | another, entirely hidden, layer of indirection.
        
               | BitBanging wrote:
               | Yep, LBAs are the primary addressing scheme in the NVMe
               | spec, written into every single IO command. I would
               | imagine there could be a better way, but NVMe & OS
               | support still carries some baggage from SATA HDDs -> SATA
               | SSDs -> NVMe SSDs.
               | 
               | As you mentioned, over-provisioning and other NAND flash
               | memory health management techniques like garbage
               | collection and wear leveling are needed for usable modern
               | SSDs. Modern SSD controllers are complex beasts having
               | 3-7 microprocessor cores (probably double digit core
               | counts now with PCIe 5.0), encryption engines, power &
               | thermal management, error correction, multiple hardware
               | PHYs, etc.
               | 
               | Example product sheet:
               | https://www.marvell.com/content/dam/marvell/en/public-
               | collat...
        
               | surajrmal wrote:
               | On a physical level, flash deals with pages and erase
               | blocks. NVMe has LBAs defined but it's always been an
               | awkward legacy thing.
        
           | hosteur wrote:
           | > I don't think it's strictly for price gouging/segmentation
           | purposes.
           | 
           | I think it is strictly for that purpose.
        
           | Aerbil313 wrote:
           | Do people not understand that Apple's 'price gouging' is
           | about UX? A person who has the money to buy a 1TB iPad is
           | worth more than the average customer. 16GB of RAM doubtless
           | results in a faster UX, and that person is more likely to
           | continue purchasing.
        
             | zipping1549 wrote:
             | > Do people not understand that Apple's 'price gouging'
             | is about UX? A person who has the money to buy a 1TB iPad
             | is worth more than the average customer. 16GB of RAM
             | doubtless results in a faster UX, and that person is more
             | likely to continue purchasing.
             | 
             | And that decision somehow turns into making budget
             | conscious people's UX shittier? How is that a reason not to
             | make 16gb RAM, which is almost a bare minimum in 2024,
             | available to everyone?
        
           | nbsande wrote:
           | If I'm understanding your point correctly, that wouldn't
           | prevent them from offering higher RAM specs for the lower
           | storage tiers, e.g. 512GB Macs. So it seems like it is
           | just price gouging.
        
         | KeplerBoy wrote:
         | If these things ever get macOS support, they will be useless
         | with 8 GB of RAM.
         | 
         | Such a waste of nice components.
        
           | greggsy wrote:
           | This comes up frequently. 8GB is sufficient for most casual
           | and light productivity use cases. Not everyone is a power
           | user, in fact, most people aren't.
        
             | regularfry wrote:
             | My dev laptop is an 8GB M1. It's fine. Mostly.
             | 
             | I can't run podman, slack, teams, and llama3-8B in
             | llama.cpp at the same time. Oddly enough, this is rarely a
             | problem.
        
               | prepend wrote:
               | It's the "Mostly" part that sucks. What's the price
               | difference between 8 and 16? Like $3 in wholesale prices.
               | 
               | This just seems like lameness on Apple's part.
        
               | KeplerBoy wrote:
               | It's not quite like that. Apple's RAM is in the SoC
               | package, so it might be closer to $20, but still.
        
               | TremendousJudge wrote:
               | They have always done this, for some reason people buy it
               | anyway, so they have no incentive to stop doing it.
        
               | Aurornis wrote:
               | > What's the price difference between 8 and 16? Like $3
               | in wholesale prices.
               | 
               | Your estimates are not even close. You can't honestly
               | think that LPDDR5 at leading edge speeds is only $3 per
               | 64 Gb (aka 8GB), right?
               | 
               | Your estimate is off by an order of magnitude. The
               | memory Apple is using is closer to $40 for that
               | increment, not $3.
               | 
               | And yes, they include a markup, because nobody is
               | integrating hardware parts and selling them at cost. But
               | if you think the fastest LPDDR5 around only costs $3 for
               | 8GB, that's completely out of touch with reality.
        
               | moooo99 wrote:
               | Even taking rising market prices into account, your
               | estimate for the RAM module price is waaaaaaay off.
               | 
               | You can get 8GB of good quality DDR5 DIMMs for $40;
               | there is no way in hell that Apple is paying anywhere
               | near that.
               | 
               | Going from 8 to 16GB is probably somewhere between $3-8
               | purely in material costs for Apple, not taking into
               | account any other associated costs.
        
               | smarx007 wrote:
               | GP said "LPDDR5" and that Apple won't sell at component
               | prices.
               | 
               | You mention DIMMs and component prices instead. This is
               | unhelpful.
               | 
               | See https://www.digikey.com/en/products/filter/memory/mem
               | ory/774... for LPDDR5 prices. You can get a price of
               | $48/chip at a volume of 2000 chips. Assuming that Apple
               | got a deal of $30-40-ish at a few orders of magnitude
               | larger order is quite fair. Though it certainly would be
               | nicer if Apple priced 8GB increments not much above
               | $80-120.
        
               | moooo99 wrote:
               | I am aware that there are differences, I just took RAM
               | DIMMs as a reference because there is a >0% chance that
               | anyone reading this has actually ever bought a comparable
               | product themselves.
               | 
               | As for prices, the prices you cited are not at all
               | comparable. Apple is absolutely certainly buying directly
               | from manufacturers without a middleman since we're
               | talking about millions of units delivered each quarter.
               | Based on those quantities, unit prices are guaranteed to
               | be substantially lower than what DigiKey offers.
               | 
               | Based on what little public information I was able to
               | find, spot market prices for LPDDR4 RAM seem to be
               | somewhere in the $3 to $5 range for 16GB modules.
               | Let's be generous and put LPDDR5 at triple the price,
               | at $15 per 16GB module. Given that the upgrade price
               | for going from 8 to 16GB is 230 EUR, Apple is surely
               | making a huge profit on those upgrades alone by
               | selling an essentially unusable base configuration
               | for a supposed "Pro" product.
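               | 
               | The arithmetic, using the (admittedly rough) figures
               | above:
               | 
               |   # Python; inputs are the estimates from this thread,
               |   # treating USD and EUR as ~1:1 for simplicity.
               |   upgrade_price = 230  # Apple's 8 -> 16GB upgrade
               |   module_cost = 15     # generous LPDDR5 estimate
               |   print(upgrade_price - module_cost)  # -> 215 margin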
        
               | wtallis wrote:
               | Mind the difference between GB and Gb.
        
               | tuetuopay wrote:
               | DDR5 DIMMs and LPDDR chips as in the MacBooks are not the
               | same beasts at all.
               | 
               | A DIMM is 8 or 16 chips (9/18 for ECC), while the
               | LPDDR part is a single chip for the same capacity. The
               | wild difference in per-chip density (512MB or 1GB vs
               | 8GB) makes a huge difference, and explains how a stick
               | can be sold at retail for cheaper than the bare LPDDR
               | chip in volume.
        
               | flawsofar wrote:
               | Local LLMs are sluggish on my M2 Air 8GB,
               | 
               | but up until these things came along I felt I could
               | run whatever I wanted, including Baldur's Gate 3.
        
               | Aurornis wrote:
               | Same here. My secondary laptop is 8GB of RAM and it's
               | fine.
               | 
               | As devs and power users we'll always have an edge case
               | for higher RAM usage, but the average consumer is going
               | to be perfectly fine with 8GB of RAM.
               | 
               | All of these comments about how 8GB of RAM is going to
               | make it "unusable" or a "waste of components" are absurd.
        
               | luyu_wu wrote:
               | The point you're missing is that it's about the future. I
               | generally agree, but it's obvious everything becomes more
               | RAM intensive as time goes on. Hell, even games can
               | take more than 8 GB of VRAM alone these days.
        
               | 6SixTy wrote:
               | It's really not weird. The more you charge for the
               | base product and upgrades, the less acceptable serving
               | the bare minimum becomes. It also doesn't help that
               | the 4GB base models from years past aged super quickly
               | compared to their higher end cousins.
        
               | camel-cdr wrote:
               | Programming has a weird way of requiring basically
               | nothing sometimes, but other times you need to build
               | the latest version of your toolchain, or you are
               | working on some similarly huge project that takes ages
               | to compile.
               | 
               | I was using my 4GB RAM Pinebook Pro on public transport
               | yesterday, and decided to turn off all cores except for
               | a single Cortex-A53, to save some battery. I had no
               | problems with my use case of a text editor + shell to
               | compile, for doing some SIMD programming.
        
               | internet101010 wrote:
               | At this point I don't think the frustration has much to
               | do with the performance but rather RAM is so cheap that
               | intentionally creating a bottleneck to extract another
               | $150 from a customer comes across as greedy, and I am
               | inclined to agree. Maybe the shared memory makes things
               | more expensive but the upgrade cost has always been
               | around the same amount.
               | 
               | It's not quite in the same ballpark as showing apartment
               | or airfare listings without mandatory fees but it is at
               | the ticket booth outside of the stadium.
        
               | jonfromsf wrote:
               | The bigger problem is when you need a new machine fast,
               | the apple store doesn't have anything but the base models
               | in stock. In my org we bought a machine for a new
               | developer who was leaving town, and were forced to buy an
               | 8gb machine because the store didn't have other options
               | (it was going to be a 2 week wait). As you can imagine,
               | the machine sucked for running Docker etc and we had to
               | sell it on facebook marketplace for a loss.
        
               | gorbypark wrote:
               | I've never encountered an actual Apple Store not having
               | specced up machines on hand (maybe not EVERY possible
               | configuration, but a decent selection). If you go to a
               | non-Apple retailer, afaik, they are limited to the base
               | spec machines (RAM wise), it's not even a matter of them
               | being out of stock. If you want anything other than 8GB
               | (or whatever the base amount is for that model) of RAM
               | you need to go through Apple directly. This was the case,
               | at least in Canada a few years ago, correct me if I'm
               | wrong/things have changed.
        
               | oblio wrote:
               | I imagine you don't have browsers with many tabs.
        
               | adastra22 wrote:
               | I could never understand how people operate with more
               | than a dozen or so open tabs.
        
               | touristtam wrote:
               | Those are the "I'll go back to it later" tabs. The tab
               | workflow in modern browsers is broken. Instead of
               | leveraging the bookmark functionality to improve the
               | UX, we have this situation of users keeping 50+ tabs
               | open, because they can. It takes quite a bit of
               | discipline to close tabs down to a more manageable
               | number.
        
               | oblio wrote:
               | Well, there are how many browsers out there? 50? And
               | opening tabs with something like Tree Style Tabs is
               | still the best user experience.
               | 
               | > It takes quite a bit of discipline to close down tabs
               | to a more manageable numbers.
               | 
               | Or you just click the little chevron in Tree Style Tabs
               | or equivalent and 100 tabs are just hidden in the UI.
        
               | robin_reala wrote:
               | The number of tabs you have doesn't correlate to the
               | number of active web views you have, if you use any
               | browser that unloads background tabs while still saving
               | their state.
        
               | oblio wrote:
               | I'm fairly sure that if you open up the web messengers,
               | Gmail, etc, the browser can't and won't unload them,
               | because they're active in the background.
               | 
               | It's fairly easy to hit a few GB of RAM used up just with
               | those.
        
               | GOONIMMUNE wrote:
               | how many is "many"? I'm also on an M1 Mac 8 GB RAM and I
               | have 146 chrome tabs open without any issues.
        
               | gs17 wrote:
               | Mine is 8GB M1 and it is not fine. But the actual issue
               | for me isn't RAM as much as it is disk space, I'm pretty
               | confident if it wasn't also the 128 GB SSD model it would
               | handle the small memory just fine.
               | 
               | I'm still getting at least 16 GB on my next one though.
        
               | regularfry wrote:
               | Yeah, that's definitely a thing. Podman specifically eats
               | a lot.
        
               | gs17 wrote:
               | My exact-ish headache, I have to check my free disk space
               | before launching Docker.
        
               | kalleboo wrote:
               | Yeah, personally I find cheaping out on the storage
               | far more egregious than cheaping out on the RAM. Even
               | if you have most things offloaded onto the cloud, 128
               | GB was not even enough for that, and 256 GB is still
               | going to be a pain point even for many casual home
               | users. At the price point of Apple machines it's
               | inexcusable to not add another $25 of flash.
        
             | Pikamander2 wrote:
             | That would be fine if the 8GB model was also _priced_ for
             | casual and light productivity use cases. But alas, this is
             | Apple we're talking about.
        
               | wiseowise wrote:
               | MacBook Air starts at 1,199 euro. For insane battery
               | life, amazing performance, great screen and one of the
               | lightest chassis. Find me comparable laptop, I'll wait.
        
               | touristtam wrote:
               | The screen is the killer. You can have a nice-ish
               | second-hand corporate laptop with a decent, swappable
               | battery on which you can install a decent OS (non-
               | Windows) and get good mileage, but the screen is
               | something else.
        
               | wiseowise wrote:
               | Forgot to mention that it must be completely silent.
        
               | 6SixTy wrote:
               | Asking for a machine with "insane battery life, amazing
               | performance, great screen and one of the lightest
               | chassis" and oh, it must be completely silent is a loaded
               | set of demands. Apple in the current market is
               | essentially the only player that can actually make a
               | laptop that can meet your demands, at least without doing
               | a bunch of research into something that's equivalent and
               | hoping the goal posts don't move again.
        
               | paulmd wrote:
               | this is extremely funny in the context of the protracted
               | argument up-thread about what you could reasonably be
               | comparing the macbook air against.
               | 
               | like, the $359 acer shitbox _probably doesn't do all
               | the exact same things as the MBA either_, but that's
               | actually
               | ok and really only demonstrates the MBA is an
               | unaffordable luxury product, basically the same as a
               | gold-plated diamond-encrusted flip-phone.
               | 
               | https://news.ycombinator.com/item?id=40292839
               | 
               | Not your circus, not your clowns, but this is sort of the
               | duality of apple: "it's all marketing and glitz, a luxury
               | product, there's no reason to buy it, and the fact that
               | they have a better product only PROVES it" vs "of course
               | no PC manufacturer could possibly be expected to offer a
               | top-notch 120hz mini-LED screen, a good keyboard, great
               | trackpad, good speakers, and good SOC performance in a
               | thin-n-light..."
        
             | dialup_sounds wrote:
             | Most people that consider themselves "power users" aren't
             | even power users, either. Like how being into cars doesn't
             | make you a race car driver.
        
               | 0x457 wrote:
               | Race car drivers think they are pros and can't even
               | rebuild the engine in their car.
               | 
               | There are different categories of "power users"
        
               | _gabe_ wrote:
               | Race car _drivers_. They _are_ pros. Professional
               | drivers. They definitely know how to drive a car much
               | more efficiently than I do, or anyone that's just into
               | cars. I assume the race car engineers are the pros at
               | rebuilding engines.
               | 
               | And as for the parent comment's point, being into cars
               | doesn't mean you're as good as a professional race car
               | driver.
        
             | wvenable wrote:
             | Isn't this the "Pro" model?
        
           | reustle wrote:
           | > If these things will ever get MacOS support
           | 
           | The Macbook line will get iPadOS support long before they
           | allow MacOS on this line. Full steam ahead towards the walled
           | garden.
        
             | bluescrn wrote:
             | iOS has become such a waste of great hardware, especially
             | in the larger form factor of the iPad.
             | 
             | M1 chips, great screens, precise pencil input and keyboard
             | support, but we still aren't permitted a serious OS on it,
             | to protect the monopolistic store.
             | 
             | App Stores have been around long enough to prove that
             | they're little more than laboratories in which to carry out
             | accelerated enshittification experimentation. Everything so
             | dumbed down and feature-light, yet demanding subscriptions
             | or pushing endless scammy ads. And games that are a
             | shameless introduction to gambling addiction, targeted at
             | kids.
             | 
             | Most of the 'apps' that people actually use shouldn't need
             | to be native apps anyway, they should be websites. And now
             | we get the further enshittification of trying to force
             | people out of the browser and into apps, not for a better
             | experience, but for a worse one, where more data can be
             | harvested and ads can't so easily be blocked...
        
             | arecurrence wrote:
             | If the iPad could run Mac apps when docked to Magic
             | Keyboard like the Mac can run iPad apps then there may be a
             | worthwhile middle ground that mostly achieves what people
             | want.
             | 
             | The multitasking will still be poor but perhaps Apple can
             | do something about that when in docked mode.
             | 
             | That said, development likely remains a non-starter given
             | the lack of unix tooling.
        
           | Aurornis wrote:
           | I have a minimum of 64GB on all my main developer machines
           | (home, work, laptop), but I have a spare laptop with only
           | 8GB of RAM for lightweight travel.
           | 
           | Despite the entire internet telling me it would be
           | "unusable" and a complete disaster, it's actually 100%
           | perfectly fine. I can run IDEs, Slack, Discord, Chrome, and
           | do dev work without a problem. I can't run a lot of VMs or
           | compile giant projects with 10 threads, of course, but for
           | typical work tasks it's just fine.
           | 
           | And for the average consumer, it would also be fine. I think
           | it's obvious that a lot of people are out of touch with
           | normal people's computer use cases. 8GB of RAM is fine for
           | 95% of the population and the other 5% can buy something more
           | expensive.
        
             | mananaysiempre wrote:
             | For me personally, it's not an issue of being out of touch.
             | I did, in fact, use a 2014 Macbook with an i5 CPU and 16 GB
             | of RAM for nearly a decade and know how often I hit swap
             | and/or OOM on it even without attempting multicore
             | shenanigans which its processor couldn't have managed
             | anyway.
             | 
             | It's rather an issue of selling deliberately underpowered
             | hardware for no good reason other than to sell actually up-
             | to-date versions for a difference in price that has no
             | relation to the actual availability or price of the
             | components. The sheer disconnect from any kind of reality
             | offends me as a person whose job and alleged primary
             | competency is to recognize reality then bend it to one's
             | will.
        
               | KeplerBoy wrote:
               | I don't think we were ever at a point in computing
               | where you could buy a high-end laptop (even entry
               | level MacBooks have high-end pricing) with the same
               | amount of RAM as you could 10 years earlier.
               | 
               | 8 GB was the standard back then.
        
               | brailsafe wrote:
               | 10 years ago the default for MacBook Pros was 4GB, and
               | those started showing their age very quickly for what
               | was not a small amount of money.
        
               | KeplerBoy wrote:
               | I'm not sure that's true:
               | 
               | https://support.apple.com/en-us/111942
        
               | brailsafe wrote:
               | Hmm, unexpected. I was quite sure my partner's 2015 mbp
               | was sitting at 4gb, but you win this one! ;)
               | 
               | Edit: I confirmed that I was indeed wrong, but the
               | payoff isn't great anyway, because that just means
               | that yes, in fact, they've kept the exact same RAM
               | floor for 10 years. Insane.
        
             | elaus wrote:
             | But why did you configure 3 machines with 64+ GB, if 8 GB
             | RAM are "100% perfectly fine" for typical work tasks?
             | 
             | For me personally 16 or 32 GB are perfectly fine, 8 GB was
             | too little (even without VMs) and I've never needed 64 or
             | more. So it's curious to see you are pretty much exactly
             | the opposite.
        
               | joshmanders wrote:
               | > But why did you configure 3 machines with 64+ GB, if 8
               | GB RAM are "100% perfectly fine" for typical work tasks?
               | 
               | Did you miss this part prefixing that sentence?
               | 
               | > I can't run a lot of VMs or compile giant projects with
               | 10 threads, of course
        
             | leptons wrote:
             | Be honest, that 8GB computer isn't running MacOS, is it.
        
               | adastra22 wrote:
               | That's the standard configuration of a MacBook Air.
        
               | leptons wrote:
               | That's all well and good but nowhere did OP mention that
               | it was an Apple computer at all. All they mentioned was
               | this:
               | 
               | >"I have a spare laptop with only 8GB of RAM"
        
             | grecy wrote:
             | I'm editing 4k video and thousands of big RAW images.
             | 
             | The used M1 MacBook Air I just bought is by far the fastest
             | computer I have ever used.
        
             | erksa wrote:
             | I have the base M2 Air with 8GB RAM, and it's really been
             | perfect for working on. The only time things have become
             | an issue is with dual user accounts logged in at the same
             | time. Which is very preventable.
        
             | hypercube33 wrote:
             | Problem is, yes, it does run, but it's probably paging to
             | disk more than you think. I wonder if that lowers both
             | performance and battery life.
        
             | trog wrote:
             | Half my organisation runs on 8GB Chromebooks. We were
             | testing one of our app changes the other day and it
             | performed better on the Chromebook than it did on my i7
             | machine with 32GB.
        
             | tamrix wrote:
             | Fine just doesn't cut it for a premium machine you expect
             | to last a few years at least. It's honestly just marketed
             | so you want to spend extra and upgrade. Let's be real.
        
               | jeffhuys wrote:
               | It will last more than a few years, AND it's marketed so
               | you want to spend extra.
        
             | walteweiss wrote:
              | I bought a second-hand office-grade PC about a year
              | ago. It was about $10 to $15, had no disks (obviously)
              | and just 2 GB of DDR3 RAM, plus an integrated GPU and
              | some low-grade Intel CPU (a Pentium, if I'm not
              | mistaken). The generation isn't current either; it's
              | about a decade old, maybe a bit more.
             | 
              | I put in a spare 120 GB SSD, a cheap no-name brand
              | that was just lying around for testing purposes, and
              | found a similar off-the-shelf 2 GB DDR3 stick. I'd
              | thought that stick was faulty, but it turned out to be
              | working, so in it went.
             | 
              | I need the computer for basic so-called office work (a
              | browser, some messengers, an email client and a couple
              | of other utilities). I thought I'd buy at least two
              | 4GB sticks after testing it, because, you know, 8 GB
              | is just the bare imaginable minimum! I've had 16 GB
              | everywhere since, idk, maybe 2012 or something.
             | 
             | And you know what?! It works very well with 4 GB of RAM and
             | default Fedora (it's 40 now, but I started with 38, iirc).
             | It has the default Gnome (also, 46 now, started with 44,
             | iirc). And it works very well!
             | 
              | It doesn't allow me to open a gazillion browser tabs,
              | but my workflow is designed to avoid that, so I have
              | like 5 to 10 open simultaneously.
             | 
             | Before throwing Fedora at the PC, I thought I would just
             | install a minimal Arch Linux with swaywm and be good. But I
             | decided I don't want to bother, and I'll just buy 8 GB
             | later on, and be done with it.
             | 
             | And here I am, having full-blown Gnome and just 4 GB of
             | RAM. I don't restrict myself too much, the only time I
             | notice it's not my main PC is when I want to do some heavy
             | web-browsing (e.g. shopping on some different heavy
             | websites with many tabs opened). Then it slows down
             | significantly, till I close the unnecessary tabs or apps.
              | All the software is updated and current, so it's not
              | like it's some ancient PC from the '00s.
             | 
              | Also, I have my iPad Pro 12.9 1st Gen with just 4 GB
              | of RAM too, and I never feel it's slow for me.
             | 
              | I understand that some tasks would require a lot of
              | RAM, and this isn't for everyone. Having had a lot of
              | RAM everywhere for a significant part of my career
              | (over a decade now), I'm quite used to not thinking
              | about it at all, so I may have something open for
              | weeks that I don't have any need for.
             | 
              | So, it's 2024, and I'm surprised to say that 4 GB of
              | RAM is plenty when you're focused on some tasks and
              | don't multitask heavily, which is never productive for
              | me anyway. I've even noticed that I enjoy my
              | low-memory PC more, as its slowdowns remind me that
              | I'm entering the multitasking state.
              | 
              | I use swaywm on my Arch Linux laptop, and most of the
              | time it uses less than 3-4 GB (I have 16 GB).
        
             | paulddraper wrote:
             | > 8gb is actually 100% perfectly fine
             | 
             | Thus making your three other machines 400% perfectly fine?
        
             | cubefox wrote:
             | > 8GB of RAM is fine for 95% of the population and the
             | other 5% can buy something more expensive.
             | 
             | This argument is self defeating in the context of the M4
             | announcement. "Average consumers" who don't need 16 GB of
             | RAM don't need an M4 either. But people who _do_ need an M4
             | chip probably also need 16 GB of RAM.
             | 
              | I think more people actually need 16 GB of RAM than
              | need a top M4 chip. Having only 8 GB can be a serious
              | limitation in some memory-heavy circumstances, while
              | having (say) an M2 SoC rather than an M4 SoC probably
              | doesn't break any workflow at all; it just makes it
              | somewhat slower.
        
           | jiehong wrote:
            | People always complain that devs should write more
            | efficient software, so maybe that's one way!
            | 
            | At the very least, Chrome wouldn't run that many tabs on
            | an iPad if it used the same engine as desktop Chrome.
        
           | leptons wrote:
            | These specific models of tablet won't ever get macOS
            | support. Apple will tell you when you're allowed to run
            | macOS on a tablet, and they'll make you buy a new tablet
            | specifically for that.
        
           | tamrix wrote:
            | Why can't they just make it 12GB? It would sell far more
            | easily. It's all soldered on anyway.
        
             | KeplerBoy wrote:
              | It probably has a 128-bit bus width. You need 32 bits
              | per memory chip, so you end up with 4 chips.
              | 
              | 3 GB RAM chips probably exist, but they're definitely
              | not common.
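              | 
              | The arithmetic, as a sketch (bus width and chip
              | densities assumed, per the above):
              | 
              |   bus_width_bits, bits_per_chip = 128, 32
              |   chips = bus_width_bits // bits_per_chip  # 4 chips
              |   for gb_per_chip in (2, 3, 4):
              |       total = chips * gb_per_chip
              |       print(f"{chips} x {gb_per_chip} GB = {total} GB")
              |   # -> 8, 12, 16 GB for 2, 3, 4 GB parts
              | 
              | So 12GB is only reachable with the uncommon 3 GB
              | parts; the natural steps are 8 and 16.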
        
             | TheBigSalad wrote:
             | The point is: 8GB of RAM is really, really cheap. Like $20
             | retail.
        
           | trvz wrote:
           | Unlike seemingly everyone making this claim, I have used an
           | M1 Mac mini with 8GB RAM. It's fine, and certainly not
           | useless.
        
           | emodendroket wrote:
           | Is the base M1 Macbook Air "useless"?
        
         | sambazi wrote:
         | > 8gb of ram
         | 
         | not again
        
           | paulddraper wrote:
           | Sadly so
        
         | ilikehurdles wrote:
         | The iPod classic had 160 GB of storage fifteen years ago.
         | 
         | No device should be measuring storage in the gigabytes in 2024.
         | Let alone starting at $1000 offering only 256GB. What
         | ridiculousness.
        
           | dmitrygr wrote:
           | So browse the web and play modern games on an iPod classic
        
           | vrick wrote:
            | To be fair, the iPod classic used a platter drive, while
            | iPads use high-speed SSD storage. That said, it's been
            | years of the same storage options, and at those prices
            | they should be much higher, along with their iCloud
            | storage offerings.
        
             | sneak wrote:
             | People don't generally run out of storage that often. I
             | think perhaps you overestimate how much local data everyday
             | people store.
        
           | LeoPanthera wrote:
           | The iPod Classic had spinning rust. Don't pretend it's
           | comparable with a modern SSD.
        
           | marshray wrote:
           | I just checked, and my iPhone is using 152 GB of storage.
           | 
           | I have years of photos and videos on there. Apparently the
           | 256 GB model was the right choice for me?
        
             | sneak wrote:
             | It stores lower res proxies on the device and full res in
             | iCloud, iirc.
        
               | elAhmo wrote:
               | Only if it is explicitly turned on.
               | 
               | I have the 256 GB model as well, full resolution of 5+
               | years of photos (around 20k), around 200 GB used, 180 on
               | iCloud via Family Sharing.
        
           | emodendroket wrote:
           | I agree, the new iPad should have a spinning platter hard
           | drive.
        
             | ilikehurdles wrote:
              | If 15 years of technological progress hasn't made it
              | cost-effective to fit more than 256GB of solid state
              | storage in a $1000 device, then what are we even doing
              | here?
             | 
             | A 1 TB consumer-oriented SSD is about $50 today. At Apple's
             | manufacturing scale, do you have any doubt that the cost to
             | them is nearly negligible?
        
         | intrasight wrote:
         | "Think Different" ;)
        
         | littlestymaar wrote:
         | > 8gb of ram.
         | 
         | WTF? Why so little? That's insane to me, that's the amount of
         | RAM you get with a mid-range android phone.
        
           | MyFirstSass wrote:
            | We've reached a point where their chips have become so
            | amazing that they have to introduce "fake scarcity" and
            | "fake limits" to sell their pro lines, dividing their
            | customers into haves and have-nots while actively
            | stalling the entire field for the masses.
        
             | littlestymaar wrote:
             | Yes, but we're talking about the "pro" version here which
             | makes even less sense!
        
             | its_ethan wrote:
              | You could, alternatively, read less malice into the
              | situation and realize that the _majority of people
              | buying an iPad Pro don't even need 8GB of RAM_ to do
              | what they want to do with the device (web browsing +
              | video streaming).
        
           | deergomoo wrote:
           | I'm not defending Apple's absurd stinginess with RAM (though
           | I don't think it's much of an issue on an iPad given how
           | gimped the OS is), but I've never understood why high-end
           | Android phones have 12/16+ GB RAM.
           | 
            | What needs that amount on a phone? 8GB on a desktop
            | is... well, it's not great, but it's _usable_, and
            | usable for a damn sight more multitasking than you would
            | ever do on a smartphone. Is it just about looking better
            | on the spec sheet, like the silly camera megapixel wars
            | of the 2010s?
        
             | pquki4 wrote:
             | I think that is very obvious. You have a browser window and
             | a few apps open -- messaging, youtube, email, podcast etc.
             | When you switch between apps and eventually back to your
             | browser, you don't want the page to reload due to other
             | apps eating up the memory. As simple as that. It's about
             | having a good experience.
        
           | sneak wrote:
           | Phones and tablets are effectively single-tasking. They don't
           | need much more ram than that in practice.
           | 
           | I use my iPad Pro constantly for heavy stuff and I don't know
           | how much ram is in it; it has never been something I needed
           | to think about. The OS doesn't even expose it.
        
           | filleduchaos wrote:
            | Is a Pixel 8 a mid-range Android phone? Mine shipped
            | with 8GB of RAM, and even with my frankly insane Chrome
            | habits (I'm currently sitting at around 200 tabs) I'm
            | only using about five and a half gigabytes, _and_ it
            | runs a hell of a lot smoother than other phones with
            | more RAM that I've used.
           | 
           | There's absolutely nothing a mobile device does that should
           | require that much memory. That shitty OEMs bloat the hell out
           | of their ROMs and slap on more memory to match isn't a good
           | thing or something to emulate in my opinion.
        
       | sroussey wrote:
       | "The next-generation cores feature improved branch prediction,
       | with wider decode and execution engines for the performance
       | cores, and a deeper execution engine for the efficiency cores.
       | And both types of cores also feature enhanced, next-generation ML
       | accelerators."
        
         | gwd wrote:
         | I wonder to what degree this also implies, "More opportunities
         | for speculative execution attacks".
        
           | sroussey wrote:
            | I thought the same thing when I read it, but considering
            | these attacks were known when this improvement was being
            | made, I hope it was taken into account during the
            | design.
        
         | anentropic wrote:
         | I wish there was more detail about the Neural Engine updates
        
           | sroussey wrote:
           | Agreed. I think they just doubled the number of cores and
           | called it a day, but who knows.
        
         | a_vanderbilt wrote:
         | Fat decode pipelines have historically been a major reason for
         | their performance lead. I'm all for improvements in that area.
        
       | snapcaster wrote:
        | Apple was really losing me with the last generation of
        | Intel MacBooks, but these M-class processors are so good
        | they've got me locked in all over again.
        
         | atonse wrote:
         | The M1 Max that I have is easily the greatest laptop I've ever
         | owned.
         | 
         | It is fast and handles everything I've ever thrown at it (I got
         | 32 GB RAM), it never, ever gets hot, I've never heard a fan in
         | 2+ years (maybe a very soft fan if you put your ear next to
         | it). And the battery life is so incredible that I often use it
         | unplugged.
         | 
          | It's just been a no-compromise machine. And I was thinking
          | of upgrading to an M3 but will probably upgrade to an M4
          | instead at the end of this year when the M4 Maxes come
          | out.
        
           | grumpyprole wrote:
            | Unlike the PC industry, Apple is/was able to move its
            | entire ecosystem to a completely different architecture,
            | essentially one developed exactly for low-power use.
            | Windows-on-ARM efforts will be plagued by application
            | and driver support issues for the foreseeable future.
            | It's a great shame, as Intel hardware is no longer
            | competitive for mobile devices.
        
           | teaearlgraycold wrote:
            | Now that there's the 13-inch iPad, I am praying they
            | remove the display notch on the MacBooks. It's a little
            | wacky to intentionally cut a hole out of your laptop
            | screen just to make it look like your phones did 2
            | generations ago, while selling a tablet with the same
            | screen size without that hole.
        
             | tiltowait wrote:
              | I really hate the notch[0], but I do like that the
              | screen stretches into the top area that would
              | otherwise be empty. It's unsightly, but we did gain
              | from it.
             | 
             | [0] Many people report that they stop noticing the notch
             | pretty quickly, but that's never been the case for me. It's
             | a constant eyesore.
        
               | kjkjadksj wrote:
                | A big issue I'm now noticing with the thin bezels is
                | that you lose what used to be a buffer against
                | fingerprints when opening the lid.
        
               | teaearlgraycold wrote:
                | What I've done is use a wallpaper that is black at
                | the top. On the MBP's mini-LED screen that means the
                | black bezel blends almost perfectly into the
                | now-black menu bar. It's pretty much a perfect
                | solution, but the problem it's solving is ridiculous
                | IMO.
        
               | doctor_eval wrote:
               | I do the same, I can't see the notch and got a surprise
               | the other day when my mouse cursor disappeared for a
               | moment.
               | 
               | I don't get the hate for the notch tho. The way I see it,
               | they pushed the menus out of the screen and up into their
               | own dedicated little area. We get more room for content.
               | 
               | It's like the touchbar for menus. Oh, ok, now I know why
               | people hate it. /jk
        
               | Chilko wrote:
               | > The way I see it, they pushed the menus out of the
               | screen and up into their own dedicated little area. We
               | get more room for content.
               | 
                | Exactly - I laughed at first, but it quickly made
                | sense if they're prioritizing decent webcam quality.
                | Before my M1 Pro 14 I had a Dell XPS 13 that also
                | had tiny bezels but squeezed the camera into the
                | very thin top bezel. The result was a terrible
                | webcam that I gladly traded for a notch and better
                | camera quality.
                | 
                | However, that Dell did still fit Windows Hello (face
                | unlock) capability into that small bezel, so the
                | absence of FaceID despite having the notch is a bit
                | shit.
        
               | adamomada wrote:
               | Here's a handy free utility to automate this for you:
               | https://topnotch.app
               | 
               | Personally I never see the desktop background so I just
               | set desktop to Black, it's perfect for me.
        
               | teaearlgraycold wrote:
               | Thanks!
        
           | kjkjadksj wrote:
            | That's surprising you haven't heard the fans - it must
            | be the use case. There are a few games that will get it
            | quite hot and spool up the fans. I have also noticed
            | it's got somewhat poor sleep management and remains hot
            | while asleep. Sometimes I pick up the computer for the
            | first time that day and it's already very hot from
            | whatever kept it out of sleep with a shut lid all night.
        
             | artificialLimbs wrote:
             | Not sure what app you've installed to make it do that, but
             | I've only experienced the opposite. Every Windows 10 laptop
             | I've owned (4 of them) would never go to sleep and turn my
             | bag into an oven if I forgot to manually shut down instead
             | of closing the lid. Whereas my M1 MBP has successfully gone
             | to sleep every lid close.
        
               | deergomoo wrote:
               | The Windows 10 image my employer uses for our Dell
               | shitboxes has sleep completely disabled for some reason I
               | cannot possibly comprehend. The only options in the power
               | menu are Shut Down, Restart, and Hibernate.
               | 
               | If I forget to hibernate before I put it in my bag it
               | either burns through its battery before the next day, or
               | overheats until it shuts itself down. If I'm working from
               | home and get up to pee in the night, I often walk past my
               | office and hear the fans screaming into an empty room,
               | burning god knows how much electricity. Even though the
               | only thing running on it was Slack and an editor window.
               | 
               | It's an absolute joke of a machine and, while it's a few
               | years old now, its original list price was equivalent to
               | a very well specced MacBook Pro. I hope they were getting
               | a substantial discount on them.
        
       | paxys wrote:
       | They keep making iPads more powerful while keeping them on the
       | Fisher-Price OS and then wonder why no one is buying them for
       | real work.
       | 
       | Who in their right mind will spend $1300-$1600 on this rather
       | than a MacBook Pro?
        
         | throwaway2562 wrote:
         | This.
        
         | wizzwizz4 wrote:
         | Everyone knows the Fisher-Price OS is Windows XP (aka
         | Teletubbies OS, on account of Bliss). iPads run Countertop OS
         | (on account of their flat design).
        
         | hedora wrote:
         | You can install linux on the one with a fixed hinge and
         | keyboard, but without a touchscreen. It's the "book" line
         | instead of the "pad" line.
         | 
         | I'm also annoyed that the iPad is locked down, even though it
         | could clearly support everything the macbook does.
         | 
         | Why can't we have a keyboard shortcut to switch between the
         | iPad and Mac desktops or something?
        
       | PreInternet01 wrote:
       | > M4 makes the new iPad Pro an outrageously powerful device for
       | artificial intelligence
       | 
       | Yeah, well, I'm an enthusiastic M3 user, and I'm sure the new AI
       | capabilities are _nice_ , but hyperbole like this is just
       | _asking_ for snark like  "my RTX4090 would like a word".
       | 
        | Other than that: looking forward to seeing when and how
        | this chipset will be available in MacBooks!
        
         | mewpmewp2 wrote:
          | Although, as I understand it, M3 chips with more unified
          | memory handle larger LLMs better than a 4090 because they
          | can load more into VRAM.
        
           | freedomben wrote:
            | This is true, but that is only an advantage when running
            | a model larger than the 4090's VRAM. If your models are
            | smaller, you'll get substantially better performance on
            | a 4090. So it all comes down to which models you want to
            | run.
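            | 
            | A rough fit check (illustrative numbers only: ~0.5
            | bytes/param at 4-bit quantization, ignoring KV cache and
            | runtime overhead):
            | 
            |   def needs_gb(params_billions, bytes_per_param=0.5):
            |       # params in billions ~ GB at 1 byte/param
            |       return params_billions * bytes_per_param
            | 
            |   vram_gb = 24  # RTX 4090
            |   for p in (13, 70):
            |       need = needs_gb(p)
            |       ok = "fits" if need <= vram_gb else "spills"
            |       print(f"{p}B ~ {need:.1f} GB -> {ok}")
            |   # 13B ~ 6.5 GB  -> fits
            |   # 70B ~ 35.0 GB -> spills (big unified memory wins)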
        
             | mewpmewp2 wrote:
              | It seems like 13B ran fine on the 4090, but all the
              | more fun or intelligent ones I tried became very slow
              | and would have performed better on an M3.
        
           | PreInternet01 wrote:
           | Yes, M3 chips are _available_ with 36GB unified RAM when
           | embedded in a MacBook, although 18GB and below are the _norm
           | for most models_.
           | 
            | And even though the Apple press release does not even
            | _mention_ memory capacity, I can _guarantee_ you that it
            | will be even less than that on an iPad (simply because
            | RAM is very battery-hungry and most consumers won't
            | care).
           | 
           | So, therefore my remark: it will be interesting to see how
           | this chipset lands in MacBooks.
        
             | mewpmewp2 wrote:
              | But the M3 Max should be able to support up to 128GB.
        
         | SushiHippie wrote:
         | Disclosure: I personally don't own any apple devices, except a
         | work laptop with an M2 chip
         | 
          | I think a comparison to the 4090 is unfair, as there is no
          | laptop/tablet with an RTX 4090, and the power consumption
          | of a 4090 is ~450W on average.
        
           | PreInternet01 wrote:
           | > I think a comparison to the 4090 is unfair
           | 
           | No, when using wording like "outrageously powerful", that's
           | exactly the comparison you elicit.
           | 
           | I'd be fine with "best in class" or even "unbeatable
           | performance per Watt", but I can absolutely _guarantee_ you
           | that an iPad does not outperform any current popular-with-
           | the-ML-crowd GPUs...
        
       | simonbarker87 wrote:
        | Seeing an M-series chip launch first in an iPad must be the
        | result of some mad supply-chain and manufacturing hangovers
        | from COVID.
        | 
        | If the iPad had better software and could be considered a
        | first-class productivity machine, it would be less
        | surprising, but the one thing no one says about iPads is "I
        | wish this chip were faster".
        
         | asddubs wrote:
          | Well, it also benefits battery life, so it's not entirely
          | wasted on the iPad.
        
         | margalabargala wrote:
         | Maybe they're clocking it way down. Same performance, double
         | the battery life.
        
           | cjauvin wrote:
           | I very rarely wish the battery of my iPad Pro 2018 would last
           | longer, as it's already so good, even considering the age
           | factor.
        
             | 0x457 wrote:
              | Yeah, I don't think about charging my iPad throughout
              | the day, and I use it constantly. Maybe it's in the
              | low 20s late at night, but it has never bothered me.
        
         | Zigurd wrote:
          | My guess is that the market size fits current yields.
        
           | a_vanderbilt wrote:
           | I think this is the most likely explanation. Lower volume for
           | the given product matches supply better, and since it's
           | clocked down and has a lower target for GPU cores it has
           | better yields.
        
           | hajile wrote:
            | They already released all their MacBooks and the latest
            | iPhone on N3B, which is the worst-yielding 3nm process
            | from TSMC. I doubt yields are the issue here.
            | 
            | It's suspected that the fast release of the M4 is so
            | TSMC can move away from the horrible-yielding N3B to
            | N3E.
            | 
            | Unfortunately, N3E is less dense. Paired with a couple
            | more little cores, an increase in little-core size, a 2x
            | larger NPU, etc., I'd guess that while the M3 seems to
            | be around 145mm2, this one is going to be quite a bit
            | larger (160mm2?), with the size hopefully offset by
            | decreased wafer costs.
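            | 
            | Back-of-envelope on what that area bump costs, using the
            | standard dies-per-wafer approximation (die sizes are the
            | guesses above; defect yield ignored):
            | 
            |   import math
            | 
            |   def dpw(die_mm2, wafer_mm=300):
            |       # wafer area term minus an edge-loss term
            |       area = math.pi * (wafer_mm / 2) ** 2
            |       edge = math.pi * wafer_mm
            |       return int(area / die_mm2
            |                  - edge / math.sqrt(2 * die_mm2))
            | 
            |   for die in (145, 160):
            |       print(die, "mm2:", dpw(die), "dies/wafer")
            |   # 145 mm2 -> ~432 dies; 160 mm2 -> ~389 dies
            | 
            | So ~10% more area means ~10% fewer candidate dies per
            | wafer, before any yield effects.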
        
         | DanHulton wrote:
          | I'm wondering if it's because they're hitting the limits
          | of the architecture, and it sounds way better to compare
          | the M4 vs the M2 as opposed to vs the M3, which they'd
          | have to do if it launched in a MacBook Pro.
        
           | mason55 wrote:
           | Eh, they compared the M3 to the M1 when they launched it.
           | People grumbled and then went on with their lives. I don't
           | think they'd use that as a reason for making actual product
           | decisions.
        
         | MuffinFlavored wrote:
         | To me it just feels like a soft launch.
         | 
         | You probably have people (like myself) trying to keep up with
         | the latest MacBook Air who get fatigued having to get a new
         | laptop every year (I just upgraded to the M3 not too long ago,
         | from the M2, and before that... the M1... is there any reason
         | to? Not really...), so now they are trying to entice people who
         | don't have iPads yet / who are waiting for a reason to do an
         | iPad upgrade.
         | 
         | For $1,300 configured with the keyboard, I have no clue what
         | I'd do with this device. They very deliberately are keeping
         | iPadOS + MacOS separate.
        
           | low_common wrote:
           | You get a new laptop every year?
        
             | Teever wrote:
              | If you replace your laptop every year or two and sell
              | the old one online, you can keep up with the latest
              | technology for only a slight premium.
        
             | MuffinFlavored wrote:
              | I'm sort of "incentivized" to by Apple, because as
              | soon as they release a new one, the device you
              | currently have is at "peak trade-in value" and will
              | deteriorate from there.
              | 
              | It's a negligible amount of money. It's like: brand
              | new $999, trade in for like $450. Once a year... the
              | $549 remainder over 12 months is $45.75/mo to have the
              | latest and greatest laptop.
        
               | fwip wrote:
                | How much is a 2-year-old laptop worth? Because if
                | you buy a new laptop every two years and don't even
                | sell the old one, you're only spending ~$500 a year,
                | which is less than you're spending now.
        
               | faeriechangling wrote:
                | You really shouldn't trade in your laptop on the
                | basis of maximising its trade-in value; that doesn't
                | make economic sense.
                | 
                | You should be incentivised to minimise depreciation.
                | You incur the greatest depreciation closest to the
                | date of purchase, so the longer you go between
                | purchases, the less depreciation you'll realise.
                | 
                | Say I expected to get $450 after 1 year and $250
                | after 2 years: by trading in every 2 years I'm using
                | a laptop that's a bit older, but saving about
                | $14.54/month on depreciation. If the trade-in after
                | a third year becomes $150, I'd be saving about
                | $22.17/month relative to yearly upgrades. Whether
                | the price is worth it is subjective; I'm just saying
                | that chasing maximal trade-in value doesn't really
                | make sense, since you save more money the lower the
                | trade-in value you accept.
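                | 
                | A sketch of that math ($999 new; the resale values
                | are the guesses above):
                | 
                |   price = 999
                |   resale = {1: 450, 2: 250, 3: 150}
                |   for years, value in resale.items():
                |       mo = (price - value) / (12 * years)
                |       print(f"every {years}y: ${mo:.2f}/mo")
                |   # every 1y: $45.75/mo
                |   # every 2y: $31.21/mo (saves $14.54 vs 1y)
                |   # every 3y: $23.58/mo (saves $22.17 vs 1y)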
        
             | bombcar wrote:
             | It's not terribly expensive if you trade-in or otherwise
             | sell or hand down the previous.
             | 
             | I went from M1 to M1 Pro just to get more displays.
        
           | mvkel wrote:
            | I think (hope) WWDC changes this. The function keys on
            | the Magic Keyboard give me hope.
            | 
            | Also, you know you don't HAVE to buy a laptop every
            | year, right?
        
           | jasonjmcghee wrote:
           | Still using my M1 Air and had no interest in updating to M3.
           | Battery life has dropped a fair amount, but still like 8+
           | hours. That's going to be the trigger to get a new one. If
           | only batteries lasted longer.
        
             | foldr wrote:
             | I don't think it costs that much to have the battery
             | replaced compared to the price of a new laptop.
        
               | jasonjmcghee wrote:
                | Curious how much it would cost. I think parts are on
                | the order of $150? So maybe $400-500 for an official
                | repair?
                | 
                | If I can hold out another year or two, I'd probably
                | end up just getting a new one.
        
               | foldr wrote:
               | Oh no, it's way less than that. It should be about $159.
               | You can get an estimate here:
               | https://support.apple.com/en-us/mac/repair
        
               | jasonjmcghee wrote:
               | Good to know- also didn't know about that tool.
               | Appreciate it!
        
           | baq wrote:
           | I feel like I bought the M1 air yesterday. Turns out it was
           | ~4 years ago. Never felt the need to upgrade.
        
             | dialup_sounds wrote:
             | Interestingly, Apple still sells M1 Airs through Walmart,
             | but not their own website.
        
             | Toutouxc wrote:
             | Same here, my M1 Air still looks and feels like a brand new
             | computer. Like, I still think of it as "my new MacBook".
             | It's my main machine for dev work and some hobby
             | photography and I'm just so happy with it.
        
             | wil421 wrote:
             | The only reason I upgraded is my wife "stole" my M1 air. I
             | bought a loaded M3 MBP and then they came out with a 15"
             | Air with dual monitor capabilities. Kinda wish I had the
             | air again. It's not like I move it around much but the form
             | factor is awesome.
        
               | skohan wrote:
               | I love the air form factor. I do serious work on it as
               | well. I have used a pro, but the air does everything I
               | need without breaking a sweat, and it's super convenient
               | to throw in a bag and carry around the house.
        
         | alexpc201 wrote:
          | To offer something better to those who have an M2 iPad
          | Pro, and a more powerful environment to run heavier games.
        
         | andrewmunsell wrote:
         | My current assumption is that this has to do with whatever "AI"
         | Apple is planning to launch at WWDC. If they launched a new
         | iPad with an M3 that wasn't able to sufficiently run on-device
         | LLMs or whatever new models they are going to announce in a
         | month, it would be a bad move. The iPhones in the fall will
         | certainly run some new chip capable of on-device models, but
         | the iPads (being announced in the Spring just before WWDC) are
         | slightly inconveniently timed since they have to announce the
         | hardware before the software.
        
           | owenpalmer wrote:
           | interesting theory, we'll see what happens!
        
         | eitally wrote:
         | My guess is that the M4 and M3 are functionally almost
         | identical so there's no real reason for them to restrict the
         | iPad M4 launch until they get the chip into the MacBook / Air.
        
         | mort96 wrote:
          | To be honest, I wish my iPad's chip was slower! I can't do
          | anything other than watch videos and use drawing programs
          | on an iPad, so why does it need a big, expensive,
          | power-hungry and environmentally impactful CPU when one
          | 1/10 the speed would do?
          | 
          | If I could actually _do_ something with an iPad there
          | would be a different discussion, but the operating system
          | is so incredibly gimped that the most demanding task it's
          | really suited for is... decoding video.
        
           | Shank wrote:
           | > Why does it need a big expensive power hungry and
           | environmentally impactful CPU when one 1/10 the speed would
           | do?
           | 
            | Well, it's not. Every process shrink improves power
            | efficiency. For watching videos, you're sipping power on
            | the M4. And for drawing... well, if you want low latency
            | while drawing, which generally speaking people do, you
            | want the processor and display to ramp up to compensate
            | and carry strokes as fast as possible.
           | 
           | Obviously if your main concern is the environment, you
           | shouldn't upgrade and you should hold onto your existing
           | model(s) until they die.
        
             | mort96 wrote:
             | From what I can tell, the 2020 iPad has perfectly fine
             | latency while drawing, and Apple hasn't been advertising
             | lower latencies for each generation; I think they pretty
             | much got the latency thing nailed down. Surely you could
             | make something with the peak performance of an A12Z use
             | less power on average than an M4?
             | 
             | As for the environmental impact, whether I buy or don't buy
             | this iPad (I won't, don't worry, my 2020 one still works),
             | millions of people will. I don't mind people buying
             | powerful machines when the software can make use of the
             | performance, but for iPad OS..?
        
               | Kon-Peki wrote:
               | The M4 is built on the newest, best, most expensive
               | process node (right?). They've got to amortize out those
               | costs, and then they could work on something cheaper and
               | less powerful. I agree that they probably won't, and
               | that's a shame. But still, the M4 is most likely one of
               | the best options for the best use of this new process
               | node.
        
           | steveridout wrote:
           | I'm under the impression that this CPU is faster AND more
           | efficient, so if you do equivalent tasks on the M4 vs an
           | older processor, the M4 should be less power hungry, not
           | more. Someone correct me if this is wrong!
        
             | mort96 wrote:
             | It's more power efficient than the M3, sure, but surely it
             | could've been even more power efficient if it had worse
             | performance simply from having fewer transistors to switch?
             | It would certainly be more environmentally friendly at the
             | very least!
        
               | Kon-Peki wrote:
               | The most environmentally friendly thing to do is to keep
               | your A12Z for as long as you can, ignoring the annual
               | updates. And when the time comes that you must do a
               | replacement, get the most up to date replacement that
               | meets your needs. Change your mindset - you are not
               | required to buy this one, or the next one.
        
               | mort96 wrote:
               | Of course, I'm not buying this one or any other until
               | something breaks. After all, my _current_ A12Z is way too
               | powerful for iPadOS. It just pains me to see amazing
               | feats of hardware engineering like these iPads with M4 be
                | completely squandered by a software stack which
                | doesn't facilitate more demanding tasks than
                | decoding video.
               | 
               | Millions of people will be buying these things regardless
               | of what I'm doing.
        
               | _ph_ wrote:
               | Look at the efficiency cores. They are all you are
               | looking for.
        
               | mort96 wrote:
               | I agree!
               | 
               | So what are the performance cores doing there?
        
               | _ph_ wrote:
                | They are for those tasks where you do need high
                | performance, where you would otherwise be waiting
                | for your device. A few tasks require all the CPU
                | power you can get; that is what the performance
                | cores are for. But most of the time, the chip will
                | consume a fraction of that power.
        
               | mort96 wrote:
                | My _whole point_ is that iPadOS is such that there's
                | really nothing useful to do with that performance.
                | No task an iPad can do requires CPU power (except
                | for _maybe_ playing games, but that'll throttle the
                | M4 to hell and back anyway).
        
               | _ph_ wrote:
               | Every time you perform complex computations on images or
               | video, you need any bit of performance you can get.
        
               | mort96 wrote:
               | No?
        
           | kylehotchkiss wrote:
           | > Environmentally impactful CPU when one 1/10 the speed would
           | do
           | 
           | Apple's started to roll out green energy charging to devices:
           | https://support.apple.com/en-us/108068
           | 
           | If I had to ballpark estimate this, your iPad probably uses
           | less energy per year than a strand of incandescent holiday
           | lights does in a week. Maybe somebody can work out that math.
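            | 
            | Rough math, all assumptions (a ~29 Wh iPad battery, one
            | full charge every other day, 80% charger efficiency; a
            | 250 W incandescent strand running 6 h/day):
            | 
            |   ipad_kwh_yr = 0.029 / 0.8 * (365 / 2)   # ~6.6
            |   lights_kwh_wk = 0.250 * 6 * 7           # ~10.5
            |   print(f"iPad/yr:   {ipad_kwh_yr:.1f} kWh")
            |   print(f"lights/wk: {lights_kwh_wk:.1f} kWh")
            | 
            | So under these assumptions the claim checks out: a year
            | of iPad use is well under a week of the lights.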
        
             | mort96 wrote:
             | The environmental concerns I have aren't really power
             | consumption. Making all these CPUs takes _a lot_ of
             | resources.
        
               | pertymcpert wrote:
               | How do you suggest they make new iPads for people who
               | want them? Someone has to make new CPUs and if you can
               | improve perf/W while you're doing so you might as well.
        
               | mort96 wrote:
               | They could start by making it possible to use the iPads
               | for something by opening up iPadOS
        
               | pertymcpert wrote:
               | You didn't answer my question.
        
           | aurareturn wrote:
           | >To be honest, I wish my iPad's chip was slower! I can't do
           | anything other than watch videos and use drawing programs on
           | an iPad, why does it need a big expensive power hungry and
           | environmentally impactful CPU when one 1/10 the speed would
           | do?
           | 
           | A faster SoC can finish the task with better "work
           | done/watt". Thus, it's more environmentally friendly. Unless
           | you're referring to the resources dedicated to advancing
           | computers such as the food engineers eat and the electricity
           | chip fabs require.
        
             | mort96 wrote:
              | A faster and more power-hungry SoC can finish the task
              | with better work done per joule only if it is fast
              | enough to offset the extra power consumption. It is my
              | understanding that this is often not the case. See
              | e.g. efficiency cores compared to performance cores in
              | these heterogeneous designs; the E cores can get more
              | done per joule, AFAIU. If my understanding is correct,
              | then removing the P cores from the M4 chip would let
              | it get more work done per joule.
             | 
             | Regardless, the environmental impact I'm thinking about
             | isn't mainly power consumption.
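              | 
              | The energy-per-task arithmetic, with toy numbers (the
              | watts and speedups are made up to show the shape of
              | the tradeoff, not measured):
              | 
              |   cores = {
              |       "E core":     (1.0, 10.0),  # watts, seconds
              |       "P core @3x": (4.0, 10.0 / 3),
              |       "P core @5x": (4.0, 10.0 / 5),
              |   }
              |   for name, (watts, secs) in cores.items():
              |       print(f"{name}: {watts * secs:.1f} J/task")
              |   # E core: 10.0 J; P @3x: 13.3 J (worse);
              |   # P @5x: 8.0 J (better only past ~4x speedup,
              |   # i.e. the power ratio)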
        
               | pertymcpert wrote:
                | The P cores don't get used if they're not needed, so
                | you don't need to worry. Most of your everyday use
                | and background work gets allocated to E cores.
        
               | mort96 wrote:
               | But they _do_ get used. And they take up space on die.
        
               | pertymcpert wrote:
               | When do they get used?
        
               | mort96 wrote:
               | I don't know the details of iOS's scheduler or how it
               | decides which tasks should go on which kind of core, but
               | the idea is to put tasks which benefit from high
               | performance on the P-cores, right?
        
           | _ph_ wrote:
            | It has 6 efficiency cores. Every single one of them is
            | extremely power-efficient, yet still faster than the
            | cores in an iPad 2-3 generations back. So unless you go
            | full throttle, an M4 is going to be by far the most
            | efficient CPU you can have.
        
           | janandonly wrote:
           | Then the lower price point of the iPad should entice you more
           | now.
        
             | mort96 wrote:
             | When the A12Z is too powerful, the M2 is as well
        
         | drexlspivey wrote:
         | Meanwhile the mac mini is still on M2
        
           | bombcar wrote:
           | So is the studio.
        
         | themagician wrote:
         | Welcome to being old!
         | 
          | Watch a 20-year-old creative work on an iPad and you will
          | quickly change your mind. Watch someone who has "never
          | really used a desktop, [I] just use an iPad" work in
          | Procreate or LumaFusion.
         | 
          | The iPad has amazing software. Better, in many ways, than
          | desktop alternatives _if_ you know how to use it. There
          | are some things they can't do, and the workflow can be
          | less flexible or full-featured in some cases, but the
          | speed at which some people (not me) can work on an iPad is
          | mind-blowing.
         | 
         | I use a "pro" app on an iPad and I find myself looking around
         | for how to do something and end up having to Google it half the
          | time. When I watch someone who really knows how to use an
          | iPad use the same app, they know exactly what gesture to
          | do or where to long-tap. I'm like, "How did you know that
          | clicking on that
         | part of the timeline would trigger that selection," and they
         | just look back at you like, "What do you mean? How else would
         | you do it?"
         | 
          | There is a bizarre and almost undocumented design language
          | to iPadOS that some people simply seem to know. It often
          | pops up in those little "tap-torials" when a new feature
          | rolls out, which I either ignore or forget... but other
          | people internalize them.
        
           | quasarj wrote:
           | They can have my keyboard when they pry it from my cold dead
           | hands! And my mouse, for that matter.
        
             | themagician wrote:
             | Oh, I'm with you. But the funny thing is, they won't even
             | want it.
             | 
              | I have two iPads and two pencils--that way each iPad
              | is never without a pencil--and yet I rarely use the
              | pencil. I just don't think about it. But then when I
              | do, I'm like, "Why don't I use this more often? It's
              | fantastic."
             | 
              | I have tried and tried to adapt and I cannot. I need a
              | mouse, keyboard, separate numpad, and two 5K displays
              | to mostly arrive at the same output that someone else
              | can get with a single 11" or 13" screen and a bunch of
              | different spaces that can be flicked through.
             | 
              | I desperately wanted to make the iPad my primary
              | machine and I could not do it. But, honestly, I think
              | it has more to do with me than the software. I've
              | become old and stubborn. I want to do things my way.
        
               | skydhash wrote:
               | > "Why don't I use this more often? It's fantastic."
               | 
                | PDF markup and Procreate are my main uses for it,
                | along with using the iPad on a flat surface.
        
             | Terretta wrote:
              | The Magic Keyboard is both, and the current (as of
              | today, the last) iteration is great.
              | 
              | It is just fine driving Citrix or any web app like
              | VSCode.dev.
        
               | btown wrote:
               | The existence of vscode.dev always makes me wonder why
               | Microsoft never released an iOS version of VSCode to get
               | more users into its ecosystem. Sure, it's almost as
               | locked down as the web environment, but there's a lot of
               | space in that "almost" - you could do all sorts of things
               | like let users run their code, or complex extensions, in
               | containers in a web view using
               | https://github.com/ktock/container2wasm or similar.
        
               | cromka wrote:
                | It irks me that they didn't print even the tiniest F
                | labels on the function keys.
        
           | aurareturn wrote:
            | I'm with you. I think HN (and the conversational
            | internet) disproportionately contains more laptop people
            | than the general public.
            | 
            | A lot of the younger generation does all their work on
            | their phone and tablet and doesn't have a computer.
        
             | Tiktaalik wrote:
              | That's if those younger folks are, as the parent says,
              | in the creative sector.
              | 
              | The iPad has been a workflow gamechanger for folks who
              | use Photoshop etc., but users are still prevented from
              | coding on it.
        
               | themagician wrote:
                | It's actually come a long way. The workflow is
                | still... sub-optimal, but there are some really
                | _nice_ terminal apps (LaTerminal, Prompt, ShellFish,
                | iSH) which are functional too. Working Copy is
                | pretty dope for working with git once you adapt to
                | it.
                | 
                | I do most of my dev on a Pi5 now, so actually
                | working on the iPad is not that difficult.
               | 
               | If they ever release Xcode for iPadOS that would be a
               | true gamechanger.
        
               | nomel wrote:
               | "Prevented from coding" is just not true. There are many
               | python IDEs, Swift Playgrounds, etc. Pythonista, and the
               | like, are neat because you get full access to all the
               | iPhone sensors.
        
               | amjnsx wrote:
               | And maybe that's fine? Look at it from the opposite side.
               | All those artists complaining about how terrible macbooks
               | are because you can't draw on them.
        
             | wiseowise wrote:
              | The majority of people don't know what programming is
              | and do a shitton of manual things that could be
              | automated by a simple bash/Python script, so what?
        
               | nomel wrote:
               | So, the product may not fit your use case or preference.
               | That's ok. Others love it. That's also ok.
               | 
               | What's silly is thinking that a device that makes
               | everyone happy is somehow trivial, or that a device
               | _intentionally_ made with a specific type of interface is
               | bad because of that intent. If things were as bad or as
               | trivial as some people suggest, someone else would have
                | made a successful competing product by now, rather
                | than the string of failures across the industry from
                | those who have tried.
        
               | Nathanba wrote:
                | What you are describing is certainly some kind of
                | professional, but we should look from a higher
                | vantage point. A professional in any field will
                | ultimately want to customize their tools or make
                | their own, and that is not possible with iPads, or
                | even really with software on iPads. It is a massive
                | step backwards to sit in a walled, proprietary
                | garden and claim that these people are productive
                | professionals, as if they are comparable to the
                | previous generation of professionals. They may be in
                | some sense, but from a more historical, higher
                | viewpoint they all seem herded into tightly
                | controlled, corporate tools that will be taken from
                | them whenever it is convenient for someone else.
                | iPad users are effectively like VMware users who are
                | just waiting for a Broadcom moment. The price hike
                | will always come eventually; the support will always
                | drop at some point. It is all borrowed time, with
                | tools someone else controls and makes. It might be
                | necessary in a world where we all need to make
                | money, but to positively support it is something
                | else entirely.
        
               | nomel wrote:
               | > A professional in any field will ultimately want to
               | customize their tool or make their own tools
               | 
               | I suspect you're a programmer. This is _not_ the
               | perspective or reality of most professional users. Most
               | professional apps are not, themselves, customizable. Most
               | professional users do not make their own tools, or want
               | to. If you're a programmer, you'll understand this,
               | because that's why you're employed: they want you to do
               | it.
        
               | Topfi wrote:
                | Sorry to be pedantic, but I feel the distinction is
                | important: that seems more like a UX job than a
                | programming one. Too often UX, UI, coding,
                | documentation, etc. are
               | thrown together, viewed as tasks that can be handled by
               | the same people interchangeably and it rarely yields
               | great results, in part because programmers often start
               | out with expectations that can differ from the vast
               | majority of users.
               | 
               | Also, "most" and "any" aren't all too helpful in this
               | discussion (not directed at anyone in particular, these
               | can be read in comments throughout this thread) because
               | there are going to be countless examples in either
               | direction, but from my limited experience, I have seen
               | professionals in various spaces, some which very much
               | prefer a default workflow and others that heavily
               | customize. I know talented professional programmers doing
               | great work in the out-of-the-box setup of VSCode combined
               | with GitHub Desktop, etc. but also have seen graphic
               | designers, video editors, and even people focused purely
               | on writing text that have created immensely impressive
               | workflows, stringing macros together and relying heavily
               | on templates and their preferred folder structures. Even
               | on iPad OS, people can have their custom-tailored
               | workflow regarding file placement, syncing with cloud
               | storage, etc., just in a restricted manner and for what
               | it's worth, I sometimes prefer using Alight Motion for
               | certain video editing tasks on my smartphone over
               | grabbing my laptop.
               | 
                | I have seen, and feel strongly, that any
                | professional from any field can have a customized
                | workflow and can benefit from the ability to
                | customize their toolset, even those outside
                | programming. But I also feel equally strongly that
                | sane defaults must remain, and the "iPad way of
                | doing things", as much as I in my ancient
                | mid-twenties will never fully adapt to it, must
                | remain for people who prefer and thrive in that
                | environment.
        
               | wiseowise wrote:
               | Define most professional apps.
               | 
               | A lot of professional applications include some form of
               | scripting engine.
        
               | nomel wrote:
               | Sure, which most professionals don't use. If you think
               | the average professional can program, or use scripting
               | engines, it's because you're a HN user, and probably a
               | programmer, not an average professional, and less likely
               | one that uses an iPad.
               | 
                | But there's nothing technically stopping an app
                | developer from implementing any of this, including
                | desktop-level apps. Compute, keyboard/mouse, and
                | stylus are all ready. I think the minuscule market
                | it would serve is what's stopping them.
        
               | Nathanba wrote:
                | It has nothing to do with programming. I know
                | mechanics who complain about car models where the
                | manuals cost massive amounts of money, if they are
                | even allowed to get them at all, and it takes weeks
                | to order them. This should be a familiar story in
                | every field. Do artists not have ateliers full of
                | custom brushes, things they found work for them and
                | customized? Not to mention that artists these days
                | are Maya, Autodesk and Photoshop users. Is that pen
                | really powerful enough? A pen is really close to a
                | mouse pointer anyway, so why even stick to an iPad;
                | you can simply buy a pen and tablet for a desktop
                | computer. This is not about whether I am a
                | programmer or not; this is about why some praise and
                | use Apple devices for professionals even though they
                | are not the best choice.
        
               | nomel wrote:
               | > A professional in any field will ultimately want to
               | customize their tool or make their own tools and that is
               | not possible with ipads or even really software on iPads.
               | 
               | I was responding to this main point.
               | 
               | Every professional drawing app, on the iPad, allows you
               | to make your own brushes. There's no limitation there.
               | That's just a fundamental requirement of a drawing app.
               | They're not customizing the workflow or tool/apps
                | _itself_, which is what I thought you were referring to.
               | 
               | > make their own tools
               | 
               | This requires programming, does it not? Do you have some
               | examples?
               | 
               | > why some praise and use Apple devices for professionals
               | even though they are not the best choice.
               | 
               | Especially for drawing, I think it would be best to ask
               | the professionals why they chose a ~1lb iPad in their
               | backpack with a pixel perfect stylus, over a desktop
               | computer and mouse. The answers might surprise you.
        
               | wiseowise wrote:
               | Just because people love it, doesn't mean that it can't
                | be better. It also doesn't mean that the current way of
                | doing things is efficient.
               | 
               | > What's silly is thinking that a device that makes
               | everyone happy is somehow trivial, or that a device
               | intentionally made with a specific type of interface is
               | bad because of that intent.
               | 
                | Add a proper native terminal, a proper virtualization
                | framework a la what we have on the Mac, sideloading, and
                | third-party browser support with plugins, and you'll shut
                | up 99% of the complaining users here.
               | 
               | > If things were as bad or as trivial as some people
               | suggest, someone else would have made a successful
                | competing product by now, rather than the string of
                | failures across the industry from those who have tried.
               | 
                | Right, let me use my couple billion in spare change on
                | R&D to create an iPad-compatible system with all that
                | I've mentioned before.
        
               | lm28469 wrote:
               | > Just because people love it, doesn't mean that it can't
               | be better.
               | 
                | Better for whom? They sell like hotcakes, 50+ million
                | of them per year.
               | 
                | According to Google there are 25M software devs in the
                | world. Even if half of them (extremely generous) would
                | buy a new iPad every single year (extremely generous)
                | once Apple implemented what you ask, it wouldn't change
                | much for Apple; iPads are about 10% of its revenue. So
                | at best we're talking about a 3% revenue increase.
               | 
               | > 99% of complaining users here.
               | 
                | 99% of not much is nothing for Apple; they're already
                | printing money too fast to know what to do with it.
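                | 
                | Back-of-the-envelope, with the rough numbers above (all
                | of them guesses, none of them Apple's figures):
                | 
                |     devs = 25e6            # claimed worldwide dev count
                |     buyers = devs / 2      # the "extremely generous" half
                |     ipad_units = 50e6      # rough annual iPad unit sales
                |     rev_share = 0.10       # iPads' share of Apple revenue
                | 
                |     extra = (buyers / ipad_units) * rev_share
                |     print(f"{extra:.1%}")  # ~2.5%, i.e. ~3% at best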
        
               | wiseowise wrote:
               | If all you care about is revenue - sure, don't see any
               | point arguing with you. It'll end with "it's not going to
               | yield additional revenue".
        
             | aqfamnzc wrote:
             | Yeah, apparently some people book flights from their
             | phones!? Nah man, that's a laptop activity. I'd never spend
             | more than a couple hundred dollars on my phone. Haha
        
           | steve_adams_86 wrote:
           | I watched someone do some incredibly impressive modelling on
           | an iPad Pro via shapr3D, and yeah, it was a young person.
           | 
           | I'm into the idea of modelling like this, or drawing, but the
           | reality is I spend most of my time and money on a desktop
           | work station because the software I need most is there. I'm
           | totally open to iPads being legit work machines, but they're
           | still too limited (for me) to make the time and cash
           | investment for the transition.
           | 
           | You're definitely right though. People are doing awesome work
           | on them without the help of a traditional desktop or laptop
           | computer.
        
           | Der_Einzige wrote:
           | The examples given are _always_ artists, whose jobs are
           | actively on the chopping block due to AI models and systems
            | which _checks notes_ don't even run that effectively on
            | Apple hardware yet!
           | 
           | Of course, SWE jobs are on the chopping block for the same
           | reasons, but I claim that AI art models are ahead of AI
           | coding models in terms of quality and flexibility.
        
           | GeneralMaximus wrote:
           | > Welcome to being old!
           | 
           | This has nothing to do with age. I have an iPad Pro that I
           | barely use because it has been designed for use cases that I
           | just don't have.
           | 
           | I don't do any digital art, don't take handwritten notes, and
           | don't need to scan and/or mark up documents very often. I
           | don't edit photos or videos often enough to need a completely
           | separate device for the task.
           | 
           | I mostly use my computers for software development, which is
           | impossible on an iPad. I tried running my dev tools inside an
           | iSH session, and also on a remote Linux box that I could SSH
           | into. It wasn't a great experience. Why do this when I could
           | just run VS Code or WebStorm on a Mac?
           | 
           | I also write a lot -- fiction, blog posts, journal entries,
           | reading notes -- which should technically be possible to do
           | well on the iPad. In practice there just aren't enough
           | powerful apps for serious long-form writing on a tablet.
           | Microsoft Word on iPad lacks most of the features of its
           | desktop counterpart, Scrivener doesn't support cloud sync
           | properly, iA Writer is too limited if you're writing anything
           | over a few thousand words, and Obsidian's UI just doesn't
           | work well on a touch device. The only viable app is Ulysses,
           | which is ... okay, I guess? If it floats your boat.
           | 
           | I sometimes do music production. This is now possible on the
           | iPad via Logic Pro. I suppose I could give it a try, but what
           | does that get me? I already own an Ableton license and a
           | couple of nice VSTs, none of which transfers over to the
           | iPad. I can also download random third-party apps to
           | manipulate audio on my Mac, or mess around with Max4Live, or
           | use my Push 2 to make music. Again, this stuff doesn't work
           | on an iPad and it never will, because the APIs to enable
           | these things simply don't exist.
           | 
           | There are tons of people who use Windows because they need to
           | use proprietary Windows-based CAD software. Or people who
           | need the full desktop version of Excel to do their jobs. Or
           | academics and researchers who need Python/R to crunch data.
           | All of these people might LOVE to use something like these
           | new iPads, but they can't because iPadOS just can't meet
           | their use cases.
           | 
           | I really like the idea of a convertible tablet that can
           | support touch, stylus, keyboard, and pointer input. The iPad
           | does a great job at being this device as far as hardware and
           | system software is concerned. But unfortunately, it's too
            | limited for the kinds of workflows people using
            | laptops/desktops need.
        
             | themagician wrote:
             | It's okay. It happens to all of us.
        
           | faeriechangling wrote:
           | It's not a matter of being young or old, it's that iPadOS is
           | not tooled to be a productive machine for software
            | developers, but IS tooled to be a productive machine for
           | artists.
        
         | robsh wrote:
         | I have the faintest of hope that WWDC will reveal a new hybrid
         | Mac/iPad OS. If it ever happens I won't hesitate to buy an iPad
         | Pro.
        
       | Lalabadie wrote:
       | Key values in the press release:
       | 
       | - Up to 1.5x the CPU speed of iPad Pro's previous M2 chip
       | 
       | - Octane gets up to 4x the speed compared to M2
       | 
       | - At comparable performance, M4 consumes half the power of M2
       | 
        | - A high-performance AI engine, claimed to be 60x the speed
        | of Apple's first engine (the A11 Bionic's)
        
         | mrtesthah wrote:
          | > _- Up to 1.5x the CPU speed of iPad Pro's previous M2 chip_
         | 
         | What I want to know is whether that ratio holds for single-core
         | performance measurements.
        
         | codedokode wrote:
         | > Up to 1.5x the CPU speed
         | 
          | Doesn't it mean "1.5x speed in rare specific tasks which were
          | hardware-optimized, and 1x everywhere else"?
        
           | rsynnott wrote:
           | I mean, we'll have to wait for proper benchmarks, but that
           | would make it a regression vs the M3, so, er, unlikely.
        
         | BurningFrog wrote:
         | At this point I read "up to" as "not"...
        
         | stonemetal12 wrote:
          | Apple claimed the M3 was 1.35x the speed of the M2, so the
          | M3 vs. M4 comparison isn't that impressive. Certainly not
          | bad by any means; just pointing out why it's compared to
          | the M2 here.
        
           | SkyPuncher wrote:
            | While I completely agree with your point, the M chips are
            | a series of chips. The iPad's M2 is different from the
            | MBP's M2 or the MacBook Air's.
           | 
           | It's all just marketing to build hype.
        
             | astrange wrote:
             | No, it's the same as the MB Air.
        
           | kjkjadksj wrote:
            | People make this comment after every single M-series
            | release. It's true for Intel too, worse even: the changes
            | between, say, the 8th, 9th, and 10th gens were nil; a
            | small clock bump, the same iGPU even.
        
           | frankchn wrote:
           | The other reason it is compared to the M2 is that there are
           | no iPads with M3s in them, so it makes sense to compare to
           | the processor used in the previous generation product.
        
           | tiltowait wrote:
           | It seems pretty reasonable to compare it against the last-
           | model iPad, which it's replacing.
        
       | dsign wrote:
       | > M4 makes the new iPad Pro an outrageously powerful device for
       | artificial intelligence.
       | 
       | Isn't there a ToS prohibition about "custom coding" in iOS? Like,
        | the only way you can ever use that hardware directly is to go
        | through the Apple Developer Program, which last I heard was a
        | bitter lemon? Tell me if I'm wrong.
        
         | freedomben wrote:
         | Well, this is the heart of the "appliance" model. iPads are
          | _appliances_. You wouldn't ask about running custom code on
         | your toaster or your blender, so you shouldn't ask about that
         | for your iPad. Also all the common reasons apply: Security and
         | Privacy, Quality Control, Platform Stability and Compatibility,
         | and Integrated User Experience. All of these things are harmed
          | when you are allowed to run custom code.
         | 
         | (disclaimer: My personal opinion is that the "appliance" model
         | is absurd, but I've tried to steel-man the case for it)
        
           | jebarker wrote:
           | Lots of people ask about running custom code on other
           | appliances. I think they call them hackers.
        
             | freedomben wrote:
             | I think you're reinforcing Apple's point about how security
             | is harmed by allowing custom code.
        
           | kibwen wrote:
            | _> You wouldn't ask about running custom code on your
            | toaster or your blender, so you shouldn't ask about that
            | for your iPad._
           | 
           | Of course I would, and the only reason other people wouldn't
           | is because they're conditioned to believe in their own innate
           | powerlessness.
           | 
           | If you sell me a CPU, I want the power to program it, period.
        
             | freedomben wrote:
             | I mean this sincerely, are you really an Apple customer
             | then? I feel exactly the same as you, and for that reason I
             | don't buy Apple products. They are honest about what they
             | sell, which I appreciate.
        
               | judge2020 wrote:
               | Some arguments are that you shouldn't be able to create
               | appliances, only general purpose machines.
        
               | taylodl wrote:
                | Ever notice people don't build their own cars anymore?
                | They used to, even up through the '60s. I mean ordering a
               | kit or otherwise purchasing all the components and
               | building the car. Nowadays it's very rare that people do
               | that.
               | 
               | I'm old enough to remember when people literally built
               | their own computers, soldering iron in hand. People
               | haven't done that since the early 80's.
               | 
               | Steve Jobs' vision of the Mac, released in 1984, was for
               | it to be a computing appliance - "the computer for the
               | rest of us." The technology of the day prevented that.
               | Though they pushed that as hard as they could.
               | 
               | Today's iPad? It's the fulfillment of Steve Jobs'
               | original vision of the Mac: a computing appliance. It
               | took 40 years, but we're here.
               | 
               | If you don't want a computing appliance then don't buy an
               | iPad. I'd go further and argue don't buy any tablet
               | device. Those that don't want computing appliances don't
               | have to buy them. It's not like laptops, or even
               | desktops, are going anywhere anytime soon.
        
               | kibwen wrote:
                | _> If you don't want a computing appliance then don't
                | buy an iPad._
                | 
                | If you _do_ want a computing appliance, then there's
                | nothing wrong with having a machine that _could_ be
                | reprogrammed that you simply choose _not_ to reprogram.
                | Please stop advocating for a worse world for the rest of
                | us when it doesn't benefit you in the slightest to have
                | a machine that you don't control.
        
               | taylodl wrote:
               | Stop being so damned melodramatic. I'm not advocating for
               | a "worse world for the rest of us." There are a
                | _plethora_ of choices for machines that aren't
                | appliances. In fact, _the overwhelming majority_ of
               | machines are programmable. Apple thinks the market wants
               | a computing appliance. The market will decide. Meanwhile,
               | you have lots of other choices.
        
               | nordsieck wrote:
               | > Some arguments are that you shouldn't be able to create
               | appliances, only general purpose machines.
               | 
               | I sincerely hope that you live as much of your life in
               | that world as possible.
               | 
               | Meanwhile, I'll enjoy having a car I don't have to mess
               | with every time I start it up.
        
               | kibwen wrote:
               | This is a false dichotomy. There's nothing stopping
               | anyone from shipping a device with software that works,
               | but that can still be reprogrammed.
        
               | bluescrn wrote:
               | In a world concerned with climate change, we should see
               | many of these 'appliances' as inherently wasteful.
               | 
               | On top of the ugly reality that they're designed to
               | become e-waste as soon as the battery degrades.
        
             | theshrike79 wrote:
             | Can you do that to your car infotainment system btw?
        
               | blacklion wrote:
               | Why not?
               | 
               | It MUST (RFC2119) be airgapped from ABS and ECU, of
               | course.
        
             | taylodl wrote:
             | _> If you sell me a CPU, I want the power to program it,
             | period._
             | 
             | Uhhh, there are CPUs in your frickin' wires now, dude!
              | There are several CPUs in your car for which you generally
             | don't have access. Ditto for your fridge. Your microwave.
             | Your oven. Even your toaster.
             | 
             | We're literally awash in CPUs. You need to update your
             | thinking.
             | 
             | Now, if you said something like "if you sell me a general-
             | purpose computing device, then I want the power to program
             | it, period" then I would fully agree with you. BTW, _you
             | can_ develop software for your own personal use on the
              | iPad. It's not cheap or easy (doesn't utilize commonly-
             | used developer tooling), but it can be done without having
             | to jump through any special hoops.
             | 
             | Armed with that, we can amend your statement to "if you
             | sell me a general-purpose computing device, then I want the
             | power to program it using readily-available, and commonly-
             | utilized programming tools."
             | 
             | I think that statement better captures what I presume to be
             | your intent.
        
               | talldayo wrote:
               | > but it can be done without having to jump through any
               | special hoops.
               | 
               | You are really stretching the definition of "special
               | hoops" here. On Android sideloading is a switch hidden in
                | your settings menu; on iOS it's either a region-locked
                | feature or a paid benefit of their developer program.
               | 
               | Relative to every single other commercial, general-
               | purpose operating system I've used, I would say yeah,
               | Apple practically defines what "special hoops" look like
               | online.
        
               | duped wrote:
               | I do actually want the ability to program the CPUs in my
               | car the same way I'm able to buy parts and mods for every
               | mechanical bit in there down to the engine. In fact we
               | have laws about that sort of thing that don't apply to
               | the software.
        
             | umanwizard wrote:
             | That may be your personal preference, but you should accept
             | that 99% of people don't care about programming their
             | toaster, so you're very unlikely to ever make progress in
             | this fight.
        
               | mort96 wrote:
               | 99% of people don't care about programming anything, that
               | doesn't make this gatekeeping right.
        
               | kjkjadksj wrote:
               | You aren't wrong but businesses aren't in the market to
                | optimize for 1% of their customers.
        
               | zipping1549 wrote:
                | It's not optimizing. It's opening. If I buy a $2,000
                | piece of hardware I should be able to do whatever I want
                | with it.
        
               | timothyduong wrote:
                | You could apply this to anything complex and packaged.
                | 
                | I'm annoyed that I can't buy particular engines off the
                | shelf and use them in my bespoke build; why don't car
                | manufacturers take the approach that crate-engine
                | providers do?
        
               | kibwen wrote:
               | Then I wish you the best of luck in your fight. In the
                | meantime, don't drag _me_ down or tell me that I'm wrong
               | just because you, personally, don't want something that I
               | want that also doesn't harm you in the slightest.
        
               | doctor_eval wrote:
               | Yeah, if I have to program my toaster, I'm buying a new
               | toaster.
               | 
               | I write enough code during the day to make me happy. I
               | really don't want to be thinking about the optimal
               | brownness of my bagel.
        
             | owenpalmer wrote:
             | the desire to program one's toaster is the most HN thing
             | I've seen all day XD
        
               | BrianHenryIE wrote:
               | I really wish I could program my dishwasher because it's
               | not cleaning very well and if I could add an extra rinse
               | cycle I think it would be fine.
        
               | kjkjadksj wrote:
               | Start by cleaning the filters
        
             | kjkjadksj wrote:
             | And engineer your own bagel setting without buying a bagel
             | model? Dream on.
        
           | shepherdjerred wrote:
           | If I could deploy to my blender as easily as I can to AWS,
           | then I would _definitely_ at least try it.
        
           | paxys wrote:
           | An appliance manufacturer isn't doing an entire press event
           | highlighting how fast the CPU on the appliance is.
        
             | worthless-trash wrote:
              | If it's advertised like a general-purpose computer,
              | expectations should be met.
        
             | freedomben wrote:
             | Agree completely. I think it's absurd that they talk about
             | technical things like CPU and memory in these
             | announcements. It seems to me like an admission that it's
             | not really an "appliance" but trying to translate Apple
             | marketing into logical/coherent concepts can be a
             | frustrating experience. I just don't try anymore.
        
           | ben-schaaf wrote:
            | I appreciate the steel-man. A strong counterargument for me
            | is that you actually _can_ run any custom code on an iPad,
            | as long as it's in a web browser. This is very unlike an
           | appliance where doing so is not possible. Clearly the
           | intention is for arbitrary custom code to run on it, which
           | makes it a personal computer and not an appliance (and should
           | be regulated as such).
        
             | freedomben wrote:
             | That's a fair point, although (steel-manning) the "custom
             | code" in the browser is severely restricted/sandboxed,
             | unlike "native" code would be. So from that perspective,
             | you could maybe expand it to be like a toaster that has
             | thousands of buttons that can make for hyper-specific
             | stuff, but can't go outside of the limits the manufacturer
             | built in.
        
         | eqvinox wrote:
         | As with any Apple device -- or honestly, any computing device
         | in general -- my criteria of evaluation would be the resulting
         | performance if I install Linux on it. (If Linux is not
         | installable on the device, the performance is zero. If Linux
         | driver support is limited, causing performance issues, that is
         | also part of the equation.)
         | 
          | NB: those are _my_ criteria of evaluation. _Very personally._
          | I'm a software engineer, with a focus on systems/embedded. Your
         | criteria are yours.
         | 
         | (But maybe don't complain if you buy this for its "AI"
         | capabilities only to find out that Apple doesn't let you do
         | anything "unapproved" with it. You had sufficient chance to see
         | the warning signs.)
        
           | pbronez wrote:
           | It looks like Asahi Linux can run on Apple Silicon iPads...
           | but you have to use an exploit like checkm8 to get past the
           | locked bootloader
           | 
           | https://www.reddit.com/r/AsahiLinux/comments/ttsshm/asahi_li.
           | ..
        
         | killerstorm wrote:
         | It means you can deliver AI apps to users. E.g. generate
         | images.
        
         | sergiotapia wrote:
          | You're not wrong. It's why I don't use Apple hardware anymore
         | for work or play. On Android and Windows I can build and
         | install whatever I like, without having to go through mother-
         | Apple for permission.
        
         | wishfish wrote:
          | There's the potential option of Swift Playgrounds, which
          | would let you write and run code directly on the iPad without
          | any involvement in the developer program.
        
         | philipwhiuk wrote:
         | C'mon man, it's 2024, they can't just not mention AI in a press
         | release.
        
       | fallingsquirrel wrote:
       | Is it just me or is there not a single performance chart here?
       | Their previous CPU announcements have all had perf-per-watt
       | charts, and that's conspicuously missing here. If this is an
       | improvement over previous gens, wouldn't they want to show that
       | off?
        
         | a_vanderbilt wrote:
         | Since Intel->M1 the performance gains haven't been the
         | headliners they once were, although the uplifts haven't been
          | terrible. It also lets them hide behind a more impressive-
          | sounding multiplier, which can reference something specific
          | but not necessarily applicable to broader tasks.
        
       | giancarlostoro wrote:
       | Call me crazy, but I want all that power in a 7" tablet. I like
       | 7" tablets most because they feel less clunky to carry around and
        | take with you. Same with 13" laptops: I'm willing to sacrifice
        | screen real estate to save myself the back pain of carrying a
        | 15" or larger laptop.
       | 
        | Some of this is insanely impressive. I wonder how big the OS
        | ROM (or whatever) is with all these models. For context, even
        | if the entire OS is about 15GB, getting some of these features
        | locally just for an LLM on its own is about 60GB or more for
        | something ChatGPT-esque, which requires me to spend thousands
        | on a GPU.
       | 
       | Apologies for the many thoughts, I'm quite excited by all these
       | advancements. I always say I want AI to work offline and people
       | tell me I'm moving the goalpost, but it is truly the only way it
       | will become mainstream.
        
         | onlyrealcuzzo wrote:
         | > Call me crazy, but I want all that power in a 7" tablet
         | 
         | Aren't phones getting close to 7" now? The iPhone pro is 6.2",
         | right?
        
           | jsheard wrote:
           | Yeah, big phones have become the new small tablet.
           | 
           | https://phonesized.com/compare/#2299,156
           | 
           | Take away the bezels on the tablet and there's not a lot of
           | difference.
        
           | giancarlostoro wrote:
           | I'm not a huge fan of it, but yeah they are. I actually
           | prefer my phones to be somewhat smaller.
        
           | jwells89 wrote:
           | Biggest difference is aspect ratio. Phones are taller and
           | less pleasant to use in landscape, tablets are more square
           | and better to use in landscape.
           | 
           | You could technically make a more square phone but it
           | wouldn't be fun to hold in common positions, like up to your
           | ear for a call.
        
         | ChrisMarshallNY wrote:
         | I've been using the iPad Mini for years.
         | 
         | I'd love to see them add something to that form factor.
         | 
          | I do see _a lot_ of iPad Minis out there, but usually as part
          | of dedicated systems (like PoS and restaurant systems).
         | 
         | On the other hand, I have heard rumblings that Apple may
         | release an _even bigger_ phone, which I think might be overkill
          | (but what do I know; I see a lot of those monster Samsung
          | beasts out there).
         | 
         | Not sure that is for me. I still use an iPhone 13 Mini.
         | 
         | I suspect that my next Mac will be a Studio. I guess it will be
         | an M4 Studio.
        
           | wiredfool wrote:
            | I loved my iPad Mini. It's super long in the tooth now, and I
           | was hoping to replace it today. oh well...
        
             | giancarlostoro wrote:
             | I wish they would stop doing this weird release cycle where
             | some of their tablets don't get the updated chips. It's
             | really frustrating. Makes me hesitant to buy a tablet if I
             | feel like it could get an upgrade a week later or whatever.
        
               | JohnBooty wrote:
               | It certainly seems less than ideal for pro/prosumer
               | buyers who care about the chips inside.
               | 
               | I would guess that Apple doesn't love it either; one
               | suspects that the weird release cycle is at least
               | partially related to availability of chips and other
               | components.
        
               | wiredfool wrote:
               | I probably would have pulled the trigger on a price drop,
               | but at 600+eur for an old version, I'm just not as into
               | that, as I really expect it to be lasting many years.
        
             | Menu_Overview wrote:
             | I was ready to buy one today, too. Disappointing.
             | 
             | I miss my old iPad mini 4. I guess I could try the 11"
             | iPad, but I think I'd prefer it to be smaller.
        
               | wiredfool wrote:
                | Yeah, we've got a full-sized iPad here and it's really
                | strange to hold and use. It's all about what you're used
                | to.
        
           | giancarlostoro wrote:
            | I wanted to buy a Mini, but they had not updated its
            | processors when I was buying, and they cost way more than
            | a regular iPad at the time, and I wanted to be budget-
            | conscious. I still sometimes regret not just going for the
            | Mini, but I know I'll get one sooner or later.
            | 
            | You know what's even funnier: when the Mini originally
            | came out, I made fun of it. I thought it was a dumb
            | concept. Oh, my ignorance.
        
             | ChrisMarshallNY wrote:
             | I have an iPad Pro 13", and never use it (It's a test
             | machine).
             | 
             | I use the Mini daily.
             | 
             | It's a good thing they made the Pro lighter and thinner.
             | May actually make it more useful.
        
             | ectospheno wrote:
              | I have access to multiple iPad sizes and I personally
              | only use the Mini. It's almost perfect. In the last year
              | of its long life cycle you start to feel the age of the
              | processor, but it's still better than holding the larger
              | devices. Can't wait for it to be updated again.
        
         | r0fl wrote:
          | The next iPhone Pro Max will be 6.9 inches.
          | 
          | That fits all your wants.
        
         | sulam wrote:
         | If your back is hurting from the ~1lb extra going from 13" to
         | 15", I would recommend some body weight exercises. Your back
         | will thank you, and you'll find getting older to be much less
         | painful.
         | 
         | Regarding a small iPad, isn't that the iPad mini? 8" vs 7" is
         | pretty close to what you're asking for.
        
           | teaearlgraycold wrote:
           | I _highly_ recommend doing pull-ups for your posture and
           | health. It was shocking to me how much the state of my spine
           | improved after doing pull-ups as a daily exercise.
        
             | brcmthrowaway wrote:
             | set and rep protocol?
        
               | teaearlgraycold wrote:
               | I just have a bar in my apartment in a doorway. Sometimes
               | when I walk by I do 3 - 8 pull ups, then go on my way. Do
               | that a few times a day and you're doing pretty good.
               | Sometimes I'll do a few L pull ups as well.
               | 
               | If I'm doing pull ups in the gym I'll do 3 sets of 7.
               | That's the most I can do at the moment.
        
             | Der_Einzige wrote:
             | The average HN based-boy apple user has almost negative arm
             | strength. You're asking them to start with pull ups? They
             | need to be able to do a real push-up first!
        
         | alexpc201 wrote:
          | You can't have all that power in a 7" tablet because the
          | battery would last half an hour.
        
           | JohnBooty wrote:
           | Well, maybe. The screen (and specifically the backlight) is a
           | big drain. Smaller screen = less drain.
        
         | Foobar8568 wrote:
          | I am not a large person by any means, yet I have no problem
          | carrying a 16" MBP... but then I have a backpack and not a
          | messenger-style bag, which, I would agree, would be a pain to
          | carry.
        
         | bschmidt1 wrote:
         | > I always say I want AI to work offline
         | 
         | I'm with you, I'm most excited about this too.
         | 
         | Currently building an AI creative studio (make stories, art,
         | music, videos, etc.) that runs locally/offline
         | (https://github.com/bennyschmidt/ragdoll-studio). There is a
         | lot of focus on cloud with LLMs but I can't see how the cost
         | will make much sense for involved creative apps like video
         | creation, etc. Present day users might not have high-end
         | machines, but I think they all will pretty soon - this will
         | make them buy them the way MMORPGs made everyone buy more RAM.
          | Especially the artists and creators. Remember, Photoshop was
          | once pretty difficult to run; you needed a great machine.
         | 
         | I can imagine offline music/movies apps, offline search
         | engines, back office software, etc.
        
         | ant6n wrote:
         | I've got an iPad mini. The main issue is the screen scratches.
         | The other main issue is the screen is like a mirror, so it
         | can't be used everywhere to watch videos (which is the main
         | thing the iPad is useful for). The third main issue is that
         | videos nowadays are way too dark and you can't adjust
         | brightness/gamma on the iPad to compensate.
         | 
         | (Notice a theme?)
        
           | dmitrygr wrote:
           | search amazon for matte glass screen protectors. thank me
           | later
        
         | notatoad wrote:
         | a 7" tablet was a really cool form factor back in the day when
         | phones were 4".
         | 
         | but when 6.7" screens on phones are common, what really is the
         | point of a 7" tablet?
        
         | Aurornis wrote:
         | > Call me crazy, but I want all that power in a 7" tablet. I
         | like 7" tablets most because they feel less clunky to carry
         | around and take with you.
         | 
          | iPhone Pro Max screen size is 6.7" and the upcoming iPhone
          | 16 Pro Max is rumored to be 6.9" with 12GB of RAM. That's your
         | 7" tablet right there.
         | 
          | The thing is, you're an extreme edge case of an edge case.
         | Furthermore, I'm guessing if Apple did roll out a 7" tablet,
         | you'd find some other thing where it isn't exactly 100%
         | perfectly meeting your desired specifications. For example,
          | Apple _is_ about to release a high-powered 6.9" tablet-like
          | device (the iPhone 16 Pro Max), but I'm guessing there's another
         | reason why it doesn't fit your needs.
         | 
         | Which is why companies like Apple ignore these niche use cases
         | and focus on mainstream demands. The niche demands always gain
         | a lot of internet chatter, but when the products come out they
         | sell very poorly.
        
       | daniel31x13 wrote:
        | Well, at the end of the day the processor is bottlenecked by
        | the OS. What real value does an iPad bring that a typical
        | iPhone + Mac combo misses? (Other than being a digital
        | notebook...)
        
         | cooper_ganglia wrote:
          | Digital artists can get a lot of use out of it, I'd assume.
          | The Apple Pencil seems pretty nice with the iPad.
        
           | daniel31x13 wrote:
           | This. If you're anything other than a digital artist/someone
           | who genuinely prefers writing over typing, an iPad is just an
           | extra tool for you to waste your money on.
           | 
           | I had one of the earlier versions and this was pretty much
           | its only use case...
        
         | JohnBooty wrote:
         | I wound up getting a 2019 iPad Pro for 50% off, so $500 or so.
         | Thought I would use it as a work/play hybrid.
         | 
          | Surprisingly (at least to me), I feel I've more than gotten
          | my money's worth out of it _despite_ it being almost entirely
          | a consumption device.
         | 
         | I tote it around the house so I can watch or listen to things
         | while I'm doing other things. It's also nice to keep on the
         | dining room table so I can read the news or watch something
         | while we're eating. I could do every single one of these things
          | with my laptop, but... that laptop is my _primary work tool._
          | I don't like to carry it all over the place, exposing it to
         | spills and dust, etc.
         | 
         | The only real work-related task is serving as a secondary
         | monitor (via AirPlay) for my laptop when I travel.
         | 
         | $500 isn't pocket change, but I've gotten 48 months of
         | enjoyment and would expect at least another 24 to 36 months.
         | That's about $6 a month, or possibly more like $3-4 per month
         | if I resell it eventually.
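          | 
          | (Rough math, with an assumed resale value:)
          | 
          |     cost = 500         # what I paid
          |     months = 48 + 30   # so far, plus midpoint of 24-36 more
          |     resale = 150       # assumed eventual resale value
          | 
          |     print(cost / months)             # ~6.4 -> about $6/month
          |     print((cost - resale) / months)  # ~4.5/month if resold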
         | 
         | Worth it for me.
        
           | beacon294 wrote:
            | I loved my 2017 iPad Pro, but I retired it because I
            | noticed my productivity went way down and my consumption
            | went way up.
        
             | seuraughty wrote:
             | Yeah I had the same experience and ultimately got rid of an
             | M1 iPad Pro for an M3 MacBook Air. I still have the ease
             | and portability for watching videos while doing the dishes
             | or on an airplane, with the added benefit of a keyboard and
             | OS for productivity in case the muse visits.
        
         | hot_gril wrote:
         | My wife has a new iPad for grad school, and I'm convinced it's
         | mainly an extra category for some customers to spend more money
         | on if they already have a Mac and iPhone. The school supplied
         | it, then she spent $400+ on the keyboard and other damn dongles
         | to bring the hardware sorta up to par with a laptop, hoping to
         | replace her 2013 MBP.
         | 
          | In the end, she still has to rely on the MBP daily because
          | there's always _something_ the iPad can't do. Usually
          | something small, like a website not fully working on it.
        
         | tiltowait wrote:
         | I often prefer (as in enjoy) using my iPad Pro over my 16" M1
         | MBP, but I think the only thing my iPad is actually better for
         | is drawing.
        
       | pier25 wrote:
       | This is great but why even bother with the M3?
       | 
       | The M3 Macs were released only 7 months ago.
        
         | ls612 wrote:
          | They probably had some contractual commitments with TSMC and
          | had to use up their N3B capacity somehow. But N3E, once it
          | became available, is a much better process overall.
        
           | a_vanderbilt wrote:
           | Ramping up production on a new die also takes time. The lower
           | volume and requirements of the M4 as used in the iPad can
           | give them time to mature the line for the Macs.
        
         | SkyPuncher wrote:
         | So far, I haven't seen any comparison between the iPad M4 and
         | the computer M3. Everything was essentially compared to the
         | last iPad chip, the M2.
         | 
          | Your laptop M3 chip is still probably more powerful than
          | this. The laptop M4 will be faster, but not groundbreakingly
          | faster.
        
       | lopkeny12ko wrote:
        | I always wonder how _constraining_ it is to design these chips
        | subject to thermal and energy limitations. I paid a lot of
        | money for my hardware and I want it to go as fast as possible.
        | I don't want my fans to be quiet, and I don't want my battery
        | life to be 30 minutes longer, if it means I get _more raw
        | performance_ in return. But instead, Apple's engineers have
        | unilaterally decided to handicap their own processors for no
        | real good reason.
        
         | Workaccount2 wrote:
          | The overwhelming majority of people who buy these devices
          | will just use them to watch Netflix and TikTok. Apple is well
          | aware of this.
        
         | boplicity wrote:
          | Why not go with a Windows-based device? There are many loud,
          | low-battery-life options that are very fast.
        
           | jwells89 wrote:
           | Yeah, one of my biggest frustrations as a person who likes
           | keeping around both recent-ish Mac and Windows/Linux laptops
           | is that x86 laptop manufacturers seem to have a severe
           | allergy to building laptops that are good all-rounders...
           | they always have one or multiple specs that are terrible,
           | usually heat, fan noise, and battery life.
           | 
           | Paradoxically this effect is the worst in ultraportables,
           | where the norm is to cram in CPUs that run too hot for the
           | chassis with tiny batteries, making them weirdly bad at the
           | one thing they're supposed to be good at. Portability isn't
           | just physical size and weight, but also runtime and if one
           | needs to bring cables and chargers.
           | 
           | On that note, Apple really needs to resurrect the 12" MacBook
           | with an M-series or even A-series SoC. There'd be absolutely
           | nothing remotely comparable in the x86 ultraportable market.
        
         | etchalon wrote:
          | The reason is that battery life is more important to the
          | vast majority of consumers.
        
         | jupp0r wrote:
         | Thermal load has been a major limiting design factor in high
         | end CPU design for two decades (remember Pentium 4?).
         | 
          | Apart from that, I think you might be in a minority if you
          | want a loud, hot iPad with a heavy battery to power all of
          | this (for
         | a short time, because physics). There are plenty of Windows
         | devices that work exactly like that though if that's really
         | what makes you happy. Just don't expect great performance
         | either, because of diminishing returns of using higher power
         | and also because the chips in these devices usually suck.
        
         | shepherdjerred wrote:
         | You're in the minority
        
         | Perceval wrote:
         | Most of what Apple sells goes into mobile devices: phone,
          | tablet, laptop. In their prior incarnation, they ran up hard
          | against the thermal limits of what they could put in their
          | laptops with the IBM PowerPC G5 chip.
         | 
         | Pure compute power has never been Apple's center of gravity
         | when selling products. The Mac Pro and the XServe are/were
         | minuscule portions of Apple's sales, and the latter product was
         | killed after a short while.
         | 
         | > Apple's engineers have unilaterally decided to handicap their
         | own processors for no real good reason
         | 
         | This is a misunderstanding of what the limiting factor is of
         | Apple products' capability. The mobile devices all have battery
         | as the limfac. The processors being energy efficient in
         | compute-per-watt isn't a handicap, it's an enabler. And it's a
         | very good reason.
        
         | wiseowise wrote:
         | > I don't want my fans to be quiet, and I don't want my battery
         | life to be 30 minutes longer
         | 
         | I agree with you. I don't want fans to be quiet, I want them
         | completely gone. And with battery life too, not 30 minutes, but
         | 300 minutes. Modern chips are plenty fast, developers need to
         | optimize their shit instead of churning crapware.
        
       | pxc wrote:
        | Given that recent Apple laptops already have solid all-day
        | battery life, and with such a big performance-per-watt
        | improvement, I wonder if they'll end up reducing how much
        | battery their laptops ship with to make them lighter.
        
         | asadotzler wrote:
         | No, because battery life isn't just about the CPU. The CPU sits
         | idle most of the time and when it's not idle, it's at workloads
         | like 20% or whatever. It's the screens that eat batteries
         | because they're on most or all of the time and sucking juice.
         | Look at Apple's docs and you'll see the battery life is the
         | exact same as the previous model. They have a battery budget
         | and if they save 10% on CPU, they give that 10% to a better
         | screen or something. They can't shrink the battery by half
         | until they make screens twice as efficient, not CPUs which
         | account for only a small fraction of power draw.
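          | 
          | To make that concrete with made-up but plausible numbers
          | (none of these are Apple's):
          | 
          |     battery_wh = 40.0  # Wh, roughly iPad-class (assumed)
          |     screen_w = 4.0     # W, display + backlight (assumed)
          |     cpu_w = 1.0        # W, CPU averaged over light use
          |     other_w = 1.0      # W, radios, RAM, etc. (assumed)
          | 
          |     before = battery_wh / (screen_w + cpu_w + other_w)
          |     after = battery_wh / (screen_w + cpu_w / 2 + other_w)
          |     print(before, after)  # ~6.7h vs ~7.3h: halving CPU
          |                           # power buys well under an hour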
        
       | zenethian wrote:
        | This is pretty awesome. I wonder if it has a fix for the
        | GoFetch security flaw?
        
       | ionwake wrote:
        | Sorry to be a noob, but does anyone have a rough estimate of
        | when this M4 chip will be in a MacBook Air or MacBook Pro?
        
         | a_vanderbilt wrote:
         | If I had to venture a guess, maybe WWDC '24 that's coming up.
        
           | ionwake wrote:
           | Thanks bro
        
       | slashdev wrote:
        | I've got a Mac Pro paperweight because the motherboard went.
        | It's going to the landfill. I can't even sell it for parts
        | because I can't erase the SSD. If they didn't solder everything
        | to the board, you could actually repair it. When I replace my
        | current Dell laptop, it will be with a repairable Framework
        | laptop.
        
         | stuff4ben wrote:
          | Just because you lack the skills to fix it doesn't mean it's
          | not repairable. People desolder components all the time to
          | fix phones and iPads and laptops.
         | 
         | https://www.youtube.com/watch?v=VNKNjy3CoZ4
        
           | nicce wrote:
            | In this case, you need to find a working motherboard
            | without soldered parts to be able to fix it cost-
            | efficiently. Otherwise you need to buy a factory component
            | (at extra cost, with soldered components...).
        
             | slashdev wrote:
             | Yeah, it's not worth it
        
           | AzzyHN wrote:
            | On any other computer I could simply replace the
            | motherboard with one of several compatible motherboards,
            | no soldering or donor board needed.
        
             | kaba0 wrote:
              | It's almost like "any other computer" isn't as thin as
              | a finger, packed to the brim with features that require
              | miniaturization.
             | 
             | Can you just fix an F1 engine with a wrench?
        
               | Rebelgecko wrote:
               | I'm not sure which gen Mac Pro they have, but the current
               | ones aren't that much thinner than the OG cheese grater
               | Macs from 15 years ago.
               | 
                | In fact the current gen is bigger than the trashcan
                | ones by quite a bit (although IIRC the trashcan Macs
                | had user-replaceable SSDs and GPUs).
        
               | mort96 wrote:
               | That stuff makes it more difficult to work on, but it
               | doesn't make it impossible for Apple to sell replacement
               | motherboards... nor does making a "thin desktop" require
               | soldering on SSDs, M.2 SSDs are plenty thin for any small
               | form factor desktop use case.
        
               | slashdev wrote:
               | They do it deliberately. They want you to throw it out
               | and buy a new one
        
               | hot_gril wrote:
               | It's not that small: https://www.cnet.com/a/img/resize/65
               | 262f62ac0f1aa5540aca7cf9...
               | 
               | I totally missed that they released a new Apple Silicon
               | Mac Pro. Turns out it has PCIe slots.
        
               | slashdev wrote:
                | My Dell laptop is much more repairable. I changed the
                | RAM and added a second SSD myself.
        
               | sniggers wrote:
               | The mental gymnastics Apple fanboys will do to defend
               | being sold garbage are amazing.
        
               | its_ethan wrote:
                | The inability of Apple "haters" to appreciate that
                | optimizing a design can mean not using COTS parts is
                | also amazing...
        
               | sniggers wrote:
               | It is being optimized, it's just that the optimization is
               | geared towards vacuuming money from brainwashed pockets
               | instead of making a product that's worth the money.
        
           | slashdev wrote:
           | There's always some wiseass saying "skill issue"
        
         | ipqk wrote:
         | hopefully at least electronics recycling.
        
           | slashdev wrote:
           | Where do you usually take it for that?
           | 
           | If I find a place in walking distance, maybe.
        
             | kylehotchkiss wrote:
                | You could try sticking it in the phone drop-off thingy
                | at Target. That's my go-to for all non-valuable
                | electronics.
        
               | slashdev wrote:
               | I don't have that here, but maybe there's something
               | similar
        
           | slashdev wrote:
           | Nothing close enough, I checked
        
           | gnabgib wrote:
            | Depending on where you are, a lot of electronics
            | "recyclers" are actually resellers. Some of them are even
            | cheeky enough to turn away electronics they know they
            | can't resell (if they're manned... many are cage drops in
            | the back of, e.g., Staples).
        
         | kjkjadksj wrote:
          | Even "repairable" only buys you a few years of repairs that
          | actually make sense. Something similar happened to me: I
          | lost the Mac motherboard on a pre-solder-addiction model.
          | Only thing is, guess how much a used motherboard for an old
          | Mac costs: nearly as much as the entire old Mac in working
          | shape. Between the prices of OEM parts and the depreciation
          | of computers, it makes no sense to repair one once it hits a
          | certain age.
        
           | paulmd wrote:
           | ok but now get this: what if we started a program where
           | people prepay part of the repair with an initial fee, and
           | then for a couple years they can have their laptop repaired
           | at a reduced, fixed price? That helps secure the supply
           | chain. You could then partner with a retail computer store
           | (or start your own!) and have a network of brick-and-mortar
           | stores with subject-matter experts to perform the repairs as
           | well as more minor troubleshooting etc. It'd basically be
           | like healthcare, but for your computer!
           | 
           | I think if you partnered with a major computer brand, that
           | kind of thing could really be huge. Maybe someone like
           | framework perhaps. Could be a big brand discriminator - bring
           | that on-site service feel to average consumers.
        
         | kylehotchkiss wrote:
         | Why don't you take it to the Apple Store to recycle it instead
         | of dropping it in the trash can?
        
           | slashdev wrote:
           | They don't accept computers for recycling. That's what I
           | found when I looked it up
        
             | kylehotchkiss wrote:
              | They accept Apple-branded computers for recycling if they
              | have no trade-in value (they'll try to get you an offer if
              | there's any value). I have recycled damaged Apple
              | computers at the store before without trading in.
        
             | crazygringo wrote:
             | They absolutely do. You must have looked it up wrong.
             | 
             | Here:
             | 
             | https://www.apple.com/recycling/nationalservices/
             | 
             | I've even done it before personally with an old MacBook
             | that wouldn't turn on.
        
               | slashdev wrote:
                | I went there; they gave me instructions to print labels
                | they'd send me, find a box, pad it appropriately, attach
                | the labels, and then ship it.
               | 
               | It's going in the landfill.
        
               | skupig wrote:
               | You being unwilling to spend the barest amount of effort
               | to recycle it is your problem, not Apple's.
        
               | slashdev wrote:
                | If they took it at their store, fine. But if they want
                | me to take an hour to go print a label (I don't have a
                | printer), and then another hour to package it up and
                | ship it, I'll pass.
               | 
               | They also say to erase the data before shipping it -
               | which I can't do.
        
               | _zoltan_ wrote:
               | sounds to me like a you problem.
        
               | urda wrote:
               | As another commenter put that I also agree with:
               | 
               | You being unwilling to spend the barest amount of effort
               | to recycle it is your problem, not Apple's.
        
               | crazygringo wrote:
               | First you said they don't accept recycling.
               | 
               | Now you claim you "went there" and discovered they _do_
               | accept recycling but only if you mail it.
               | 
               | One of those is necessarily false, since I doubt you went
               | to the Apple Store in between your comments.
               | 
               | However, I suspect _both_ your claims are wrong, because
               | Apple stores _absolutely_ accept old devices to recycle
               | directly. (They _also_ provide mail-in options for people
               | who don't have one they can visit directly.)
               | 
               | From your many comments, it seems like you have an
               | ideological axe to grind that somehow your device can't
               | be recycled, despite abundant evidence to the contrary
               | and lots of people here trying to help you.
        
         | acdha wrote:
         | > I can't even sell it for parts because I can't erase the SSD
         | 
         | The SSD is encrypted with a rate-limited key in the Secure
         | Enclave - unless someone has your password they're not getting
         | your data.
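         | 
         | The practical upshot is "crypto-erase": when data is
         | encrypted at rest, destroying the key is equivalent to
         | wiping the disk. A minimal sketch of the concept in Python,
         | using the pyca/cryptography library (this illustrates the
         | idea only, not Apple's actual implementation):
         | 
         |     import os
         |     from cryptography.hazmat.primitives.ciphers.aead import AESGCM
         | 
         |     # On a real Mac the key lives in the Secure Enclave
         |     key = AESGCM.generate_key(bit_length=256)
         |     nonce = os.urandom(12)
         |     ct = AESGCM(key).encrypt(nonce, b"data on the SSD", None)
         | 
         |     # "Erasing" the drive amounts to destroying the key;
         |     # the ciphertext alone is computationally useless.
         |     del key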
        
           | slashdev wrote:
           | Not worth the liability. I'd rather the landfill and peace of
           | mind than the money
        
             | crazygringo wrote:
             | But what liability?
             | 
             | That's the whole _point_ of encrypted storage. There is no
             | liability if you used a reasonable password.
             | 
             | Why not accept you _have_ peace of mind and resell on eBay
             | for parts?
             | 
             | Assuming you didn't use "password123" or something.
        
               | slashdev wrote:
               | Every system has vulnerabilities. Plus the password I
               | used has been in data leaks. I don't trust it.
        
               | lm28469 wrote:
               | If you're that paranoid you cannot trust any software or
               | hardware you haven't designed yourself
        
               | slashdev wrote:
               | I trust the stuff I design least of all
        
               | xvector wrote:
               | It will be easy to break in time. Eventually you'll just
               | be able to use a tool that shines a laser at the right
               | bit and breaks the rate limiting. We've already seen
               | similar attacks on hardware wallets previously thought
               | invulnerable.
               | 
               | I don't think any cryptography has stood the test of
               | time. It's unlikely anything today will survive post-
               | quantum.
        
         | kalleboo wrote:
         | > _I can't even sell it for parts because I can't erase the
         | SSD. If they didn't solder everything to the board you could
         | actually repair it._
         | 
         | The Mac Pro does not have a soldered-in SSD. They even sell
         | user-replaceable upgrades.
         | https://www.apple.com/shop/product/MR393AM/A/apple-2tb-ssd-u...
        
         | bustling-noose wrote:
         | Get a heat gun and remove the NAND. Then sell the rest of it
         | to a local repair store, or just give it away for free if
         | it's an old Mac Pro. The parts in your Mac Pro are something
         | someone can reuse to restore their own Mac Pro instead of
         | sending it to a landfill. Not every part is security
         | related. Also, Apple may take the Mac Pro itself and give
         | you store credit, because they do recycle them.
        
           | jpalawaga wrote:
           | i don't think you can do that. there was just a video on here
           | last week of a repair shop drilling the memory out, as that
           | was the only way to remove it without damaging the
           | motherboard.
        
       | bigdict wrote:
       | 38 TOPS in the Neural Engine comes dangerously close to the
       | Microsoft requirement of 40 TOPS for "AI PCs".
        
         | ycsux wrote:
         | That's a good reason why they didn't release it as a
         | MacBook M4
        
       | 999900000999 wrote:
       | I'm extremely happy with my M1 iPad.
       | 
       | The only real issue, aside from the screen eventually wearing
       | out (it already has a bit of flex), is that I can't imagine a
       | reason to upgrade. It's powerful enough to do anything you'd
       | use an iPad for. I primarily make music on mine; I've made
       | full songs with vocals and everything (although without any
       | mastering - I think this is possible in Logic on iPad).
       | 
       | It's really fun for quick jam sessions, but I can't imagine
       | what else I'd do with it. IO is really bad for media creation:
       | you have a single USB-C port (this bothers me the most - the
       | moment that port dies it becomes e-waste), no headphone
       | jack...
        
         | MuffinFlavored wrote:
         | Any apps that work with MIDI controller on iPad?
         | 
         | Also, can't you just use a USB-C hub for like $10 from Amazon?
        
           | 999900000999 wrote:
           | I have more USB hubs than I can count.
           | 
           | You still only have one point of failure for the entire
           | device that can't be easily fixed.
           | 
           | And most midi controllers work fine via USB or Bluetooth
        
             | bombcar wrote:
             | I wish it had two USB-C ports, one on the bottom and one
             | on the side. Even if they really were only one port
             | internally, at least you'd have more mounting options.
        
         | tootie wrote:
         | I have an iPad that predates M1 and it's also fine. It's a
         | media consumption device and that's about it.
        
         | kalleboo wrote:
         | The USB-C port is on its own separate board, so it's
         | repairable with minimal waste
         | https://www.ifixit.com/Guide/iPad+Pro+12.9-Inch+5th+Gen+USB-...
        
           | 999900000999 wrote:
           | It's only $100 or so to repair, which isn't as bad as I
           | thought.
           | 
           | https://simplyfixable.com/blog-detail/ipad-charging-port-
           | rep...
           | 
           | But it's still not something I'd do on my own.
        
       | onetimeuse92304 wrote:
       | As an amateur EE, it is so annoying that they reuse the names
       | of already existing ARM chips.
       | 
       | ARM Cortex-M4, or simply M4, is a quite popular ARM
       | architecture. I am using M0, M3 and M4 chips from ST on a
       | daily basis.
        
         | jupp0r wrote:
         | It's not like the practice of giving marketing names to chips
         | is generally a world of logical sanity if you look at Intel
         | i5/i7/i9 etc.
        
         | zerohp wrote:
         | As a professional EE, I know that the ARM Cortex-M4 is not a
         | chip. It's an embedded processor core that is put into an
         | SoC (which is a chip), such as the STM32 family from ST.
        
       | diogenescynic wrote:
       | So is the iPad mini abandoned due to the profit margins being too
       | small or what? I wish they'd just make it clear so I could
       | upgrade without worrying a mini replacement will come out right
       | after I buy something. And I don't really understand why there
       | are so many different iPads now (Air/Pro/Standard). It just feels
       | like Apple is slowly becoming like Dell... offer a bunch of SKUs
       | and barely differentiated products. I liked when Apple had fewer
       | products but they actually had a more distinct purpose.
        
         | downrightmike wrote:
         | They refresh it like every 3 years
        
           | kalleboo wrote:
           | Pretty much, yeah
           | https://buyersguide.macrumors.com/#iPad_Mini
        
       | grzeshru wrote:
       | Are these M-class chips available to be purchased on Digi-Key and
       | Mouser? Do they have data sheets and recommended circuitry? I'd
       | love to play with one just to see how difficult it is to
       | integrate compared to, say, an stm8/32 or something.
        
         | exabrial wrote:
         | absolutely not, and even if they were, they are not
         | documented in the least and require an extraordinary custom
         | OS and other BLOBs to run
        
           | grzeshru wrote:
           | Darn it. Oh well.
        
         | downrightmike wrote:
         | lol
        
           | metaltyphoon wrote:
           | Legit made me chuckle
        
         | culopatin wrote:
         | Did you really expect a yes?
        
           | grzeshru wrote:
           | I didn't know what to expect. I thought they may license it
           | to other companies under particular clauses or some such.
        
       | tibbydudeza wrote:
       | 4P cores only ???.
        
         | a_vanderbilt wrote:
         | I'm hoping the higher efficiency gains and improved thermals
         | offset that. The efficiency cores tend to have more impact on
         | the Macs where multitasking is heavier.
        
         | antonkochubey wrote:
         | It's a tablet/ultrabook chip, are you expecting a
         | Threadripper in them?
        
         | ulfw wrote:
         | It's the non-Pro, non-Max chip. How many performance cores do
         | you expect?
         | 
         | The bigger shocker is only three performance cores alive in
         | the 256GB and 512GB models, paired with 8GB of RAM
        
       | treesciencebot wrote:
       | ~38 TOPS at fp16 is amazing, if the quoted number is fp16 (the
       | ANE is fp16 according to this [1], but that honestly seems
       | like a bad choice when people are going smaller and smaller
       | even on the higher-end datacenter cards, so I'm not sure why
       | Apple would use it instead of fp8 natively)
       | 
       | [1]: https://github.com/hollance/neural-
       | engine/blob/master/docs/1...
        
         | imtringued wrote:
         | For reference, the llama.cpp people are not going smaller.
         | Most of those models run on 32-bit floats with the
         | dequantization happening on the fly.
        
       | haunter wrote:
       | I love when Gruber is confidently wrong
       | https://daringfireball.net/linked/2024/04/28/m4-ipad-pros-gu...
        
         | alberth wrote:
         | Especially about Gurman, who he loves to hate on.
        
           | atommclain wrote:
           | Never understood the animosity, especially because it seems
           | to only go one direction.
        
             | tambourine_man wrote:
             | He spills Apple's secrets. Gruber had him on his podcast
             | once and called him a supervillain in the Apple
             | universe, or something like that. It was cringeworthy.
        
           | MBCook wrote:
           | As a longtime reader/listener I don't see him as hating
           | Gurman at all.
        
         | bombcar wrote:
         | Wasn't it relatively well known that the M3 is on an
         | expensive process, and that quickly getting to an M4 on a
         | cheaper, higher-yield process would be worth it?
        
           | MBCook wrote:
           | Yes but Apple has never gone iPad first on a new chip either,
           | so I was with him in that I assumed it wouldn't be what they
           | would do.
           | 
           | "Let's make all our Macs look slower for a while!"
           | 
           | So I was surprised as well.
        
             | transpute wrote:
             | Nuvia/Qualcomm Elite X aspires to beat M3 and launches in 2
             | weeks.
             | 
             | Now Apple can keep their crown with this early M4 launch.
        
         | TillE wrote:
         | > or Apple's silicon game is racing far ahead of what I
         | considered possible
         | 
         | Gruber's strange assumption here is that a new number means
         | some major improvements. Apple has never really been consistent
         | about sticking to patterns in product releases.
        
           | tiffanyh wrote:
           | This is a major improvement (over the M3).
           | 
           | It's on a new fab node.
           | 
           | It also has more CPU cores than its predecessor (the M3
           | with 8 cores vs the M4 with 10 cores).
        
             | edward28 wrote:
             | It's on TSMC N3E, which is slightly less dense but
             | better-yielding than the previous N3B.
        
       | qwertyuiop_ wrote:
       | Does anyone know how much of this "giant leap" in performance,
       | as Apple puts it, is really useful and perceived by end users
       | of the iPad? I am thinking of gaming and art applications on
       | iPad. What other major iPad use cases are out there that need
       | this kind of performance boost?
        
         | musictubes wrote:
         | Making music. The iPad is much better for performing than a
         | computer. There is a huge range of instruments, effects,
         | sequencers, etc. available on the iPad. Things like physical
         | modeling and chained reverb can eat up processor cycles so more
         | performance is always welcomed.
         | 
         | Both Final Cut Pro and DaVinci Resolve can also use as much
         | power as you can give them, though it isn't clear to me why
         | you'd use an iPad instead of a Mac. They also announced a
         | crazy multicam app for iPads and iPhones that allows remote
         | control of a bunch of iPhones at the same time.
        
         | bschmidt1 wrote:
         | I imagine running LLMs and other AI models to produce a variety
         | of art, music, video, etc.
        
         | pquki4 wrote:
         | I have a 3rd gen iPad Pro 12.9 for reading and other light
         | activity. I haven't found any reason to upgrade for the past
         | few years. I don't see myself getting another iPad unless this
         | one dies or if Apple actually unlocks the potential of the
         | hardware.
        
       | troupo wrote:
       | "The M4 is so fast, it'll probably finish your Final Cut export
       | before you accidentally switch apps and remember that that
       | cancels the export entirely. That's the amazing power performance
       | lead that Apple Silicon provides." #AppleEvent
       | 
       | https://mastodon.social/@tolmasky/112400245162436195
        
         | dlivingston wrote:
         | Ha. That really highlights how absurd a toy iPadOS is
         | compared to the beasts that are the M-series chips.
         | 
         | It's like putting a Ferrari engine inside a Little Tikes toy
         | car. I really have no idea who the target market for this
         | device is.
        
         | LeoPanthera wrote:
         | This is a straight-up lie, yes? Switching apps doesn't cancel
         | the export.
        
           | troupo wrote:
           | Can neither confirm nor deny :) I've seen people complain
           | about this on Twitter and Mastodon though.
           | 
           | It's possible people are running into iOS limitations: it
           | _will_ kill apps when it thinks there's not enough memory.
        
           | kalleboo wrote:
           | The export progress dialog says "Keep Final Cut Pro open
           | until the export is complete", and the standard iPadOS
           | limitations are that background tasks are killed after
           | either 10 minutes or when some foreground app wants more
           | RAM. So it's not instantly cancelled, but it's a
           | precarious workflow compared to on a Mac.
        
       | satertek wrote:
       | Are there enough cores to allow user switching?
        
       | daft_pink wrote:
       | Who would buy a MacBook Air or mini or studio today with its
       | older chips?
        
         | rc_mob wrote:
         | people on a budget
        
         | alexpc201 wrote:
         | People with a MacBook. You use the MacBook to work and the
         | iPad to play, read, watch movies, draw, etc. Plus you can
         | use it as a second monitor for the MacBook.
        
         | antonkochubey wrote:
         | Someone who needs a MacBook Air or mini or studio, not an iPad
        
           | daft_pink wrote:
           | I'm just venting that their processor strategy doesn't make
           | much sense. The iPad gets the M4, but the Mini and Studio and
           | Mac Pro are still on M2 and the MacBooks are on M3.
           | 
           | They've essentially undercut every Mac they currently sell by
           | putting the M4 in the iPad and most people will never use
           | that kind of power in an iPad.
           | 
           | If you are going to spend $4k on a Mac don't you expect it to
           | have the latest processor?
        
             | victorbjorklund wrote:
             | People who care about having the latest probably are
             | waiting already anyway.
        
             | lotsofpulp wrote:
             | Probably 80%+ of the population can do everything they need
             | or want to do for the next 5 (maybe even 8) years on an M2
             | Air available for less than $1,500.
             | 
             | I write this on a $1,000 late 2015 Intel MacBook Air.
        
               | wayoverthecloud wrote:
               | Does it still work great?
        
               | lotsofpulp wrote:
               | Yes. It doesn't need a lot of horsepower to browse the
               | web, read and edit PDFs, edit spreadsheets, and video
               | call.
               | 
               | Obviously, the newer laptops are much more smooth and
               | responsive.
        
               | daft_pink wrote:
               | Honestly, the only reason I want a Studio is because I
               | run several monitors and my Mac Mini can't run all my
               | monitors unless I use DisplayLink, which doesn't allow
               | me to run any HDCP-protected content and is just
               | glitchy and hacky in general.
               | 
               | I think for the past 10 years you'd have been correct,
               | but we are currently entering the AI age. The base M4
               | has 38 TOPS (trillion operations per second), and
               | without a more recent chip we aren't going to be able
               | to run, at low latency, the on-device AI models they
               | will surely be releasing this summer. So I don't think
               | things are as future-proof as they used to be.
               | 
               | But that's not really the point. The point is that I
               | don't want to spend $4k on a Mac Studio with an M2
               | chip while the M3 MacBook Pro has on-par performance
               | and the iPad has an M4. Apple should come up with a
               | better update strategy than randomly updating devices
               | based on previous update cycles.
        
       | shepherdjerred wrote:
       | Am I wrong, or is raytracing on an iPad an _insane_ thing to
       | announce? As far as I know, raytracing is the holy grail of
       | computer graphics.
       | 
       | It's something that became viable on consumer gaming desktops
       | just a few years ago, and now we have real-time ray tracing on a
       | tablet.
        
         | vvvvvvvvvvvvv wrote:
         | iPhones with A17 already have hardware ray tracing. Few
         | applications/games support it at present.
        
         | bluescrn wrote:
         | Gotta make the loot boxes look even shinier, keep the gamblers
         | swiping those cards.
        
         | luyu_wu wrote:
         | Why would it be? They announced the same for the A17 in the
         | iPhone. Turns out it was a gimmick that caused over 11W of
         | power draw. Raytracing is a brute force approach that cannot be
         | optimized to the same level as rasterization. For now at least,
         | it is unsuitable for mobile devices. Now if we could use the RT
         | units for Blender that'd be great, but it's iPad OS...
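         | 
         | To make "brute force" concrete: the core operation RT
         | hardware accelerates is testing rays against scene geometry,
         | millions of times per frame. A toy ray-sphere intersection
         | test in Python (standard textbook math, nothing
         | Apple-specific):
         | 
         |     def hit_sphere(origin, direction, center, radius):
         |         # Solve |o + t*d - c|^2 = r^2 for t (a quadratic)
         |         oc = [o - c for o, c in zip(origin, center)]
         |         a = sum(d * d for d in direction)
         |         b = 2 * sum(o * d for o, d in zip(oc, direction))
         |         c = sum(o * o for o in oc) - radius * radius
         |         return b * b - 4 * a * c >= 0  # discriminant test
         | 
         |     # one of the millions of tests behind a single frame
         |     print(hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))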
        
         | pshc wrote:
         | It is kind of crazy to look back on. In the future we might
         | look forward to path tracing and more physically accurate
         | renderers. (Or perhaps all the lighting will be hallucinated by
         | AI...?)
        
       | alexpc201 wrote:
       | I understand that they delayed the announcement of these iPads
       | until the M4 was ready; otherwise there would be nothing
       | interesting to offer those who have an iPad Pro M2. I don't
       | see the appeal of having a MacBook M3 and an iPad M4. If I
       | can't run Xcode on an iPad M4, the MacBook is the smartest
       | option; it has a bigger screen and more memory, and if you
       | complement it with an iPad Air, you don't miss out on
       | anything.
        
       | dhx wrote:
       | M2's Neural Engine had 15TOPS, M3's 18TOPS (+20%) vs. M4's 38TOPS
       | (+111%).
       | 
       | In transistor counts, M2 had 20BTr, M3 25BTr (+25%) and M4 has
       | 28BTr (+12%).
       | 
       | M2 used TSMC N5P (138MTr/mm2), M3 used TSMC N3 (197MTr/mm2, +43%)
       | and M4 uses TSMC N3E (215MTr/mm2, +9%).[1][2]
       | 
       | [1]
       | https://en.wikipedia.org/wiki/5_nm_process#%225_nm%22_proces...
       | 
       | [2]
       | https://en.wikipedia.org/wiki/3_nm_process#%223_nm%22_proces...
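       | 
       | A quick sanity check of those percentages from the raw figures
       | (Python):
       | 
       |     gens = {"M2": (15, 20, 138),   # TOPS, BTr, MTr/mm2
       |             "M3": (18, 25, 197),
       |             "M4": (38, 28, 215)}
       |     prev = None
       |     for name, vals in gens.items():
       |         if prev:
       |             print(name, ["%+.0f%%" % (100 * (v / p - 1))
       |                          for v, p in zip(vals, prev)])
       |         prev = vals
       |     # M3 ['+20%', '+25%', '+43%']
       |     # M4 ['+111%', '+12%', '+9%']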
        
         | ttul wrote:
         | An NVIDIA RTX 4090 generates 73 TFLOPS. This iPad gives you
         | nearly half that. The memory bandwidth of 120 GBps is roughly
         | 1/10th of the NVIDIA hardware, but who's counting!
        
           | kkielhofner wrote:
           | TOPS != TFLOPS
           | 
           | The RTX 4090 does 1,321 Tensor TOPS according to the spec
           | sheet, so roughly 35x.
           | 
           | The RTX 4090 is 191 Tensor TFLOPS vs the M2's 5.6 TFLOPS
           | (the M3's spec is tough to find).
           | 
           | RTX 4090 is also 1.5 years old.
        
             | imtringued wrote:
             | Yeah, where are the bfloat16 numbers for the Neural
             | Engine? For AMD you can at least divide by four to get
             | the real number. 16 TOPS -> 4 TFLOPS within a mobile
             | power envelope is pretty good for assisting CPU-only
             | inference on device. Not so good if you want to run an
             | inference server, but that wasn't the goal in the first
             | place.
             | 
             | What irritates me the most though is people comparing a
             | mobile accelerator with an extreme high end desktop GPU.
             | Some models only run on a dual GPU stack of those. Smaller
             | GPUs are not worth the money. NPUs are primarily eating the
             | lunch of low end GPUs.
        
           | lemcoe9 wrote:
           | The 4090 costs ~$1800 and doesn't have dual OLED screens,
           | doesn't have a battery, doesn't weigh less than a pound, and
           | doesn't actually do anything unless it is plugged into a
           | larger motherboard, either.
        
             | talldayo wrote:
             | From Geekbench: https://browser.geekbench.com/opencl-
             | benchmarks
             | 
             | Apple M3: 29685
             | 
             | RTX 4090: 320220
             | 
             | When you line it up like that it's kinda surprising the
             | 4090 is _just_ $1800. They could sell it for $5,000 a pop
             | and it would still be better value than the highest end
             | Apple Silicon.
        
               | nicce wrote:
               | A bit off-topic since not applicable for iPad:
               | 
               | Adding also the M3 Max: 86072
               | 
               | I wonder what the results would be if the test were
               | run on Asahi Linux some day. Apple's implementation is
               | fairly unoptimized AFAIK.
        
               | haswell wrote:
               | Comparing these directly like this is problematic.
               | 
               | The 4090 is highly specialized and not usable for general
               | purpose computing.
               | 
               | Whether or not it's a better value than Apple Silicon
               | will highly depend on what you intend to do with it.
               | Especially if your goal is to have a device you can put
               | in your backpack.
        
               | talldayo wrote:
               | I'm not the one making the comparison, I'm just providing
               | the compute numbers to the people who _did_. Decide for
               | yourself what that means, the only conclusion I made on
               | was compute-per-dollar.
        
               | aurareturn wrote:
               | I think it would be simpler to compare cost/transistor.
        
               | pulse7 wrote:
               | This is true, but... RTX 4090 has only 24GB RAM and M3
               | can run with 192GB RAM... A game changer for largest/best
               | models...
        
               | talldayo wrote:
               | CUDA features unified memory that is only limited by the
               | bandwidth of your PCIe connector:
               | https://developer.nvidia.com/blog/unified-memory-cuda-
               | beginn...
               | 
               | People have been tiling 24GB+ models on a single (or
               | several) 3090/4090s for a while now.
        
               | trvz wrote:
               | That's for OpenCL, Apple gets higher scores through
               | Metal.
        
               | talldayo wrote:
               | And Nvidia annihilates _those_ scores with cuBLAS. I'm
               | going to play nice and post the OpenCL scores since
               | both sides get a fair opportunity to optimize for it.
        
               | trvz wrote:
               | Actually, I'd like to see Nvidia's highest Geekbench
               | scores. Feel free to link them.
               | 
               | It's stupid to look at OpenCL when that's not what's used
               | in real use.
        
             | janalsncm wrote:
             | And yet it's worth it for deep learning. I'd like to see
             | a benchmark training ResNet on an iPad.
        
           | brigade wrote:
           | It would also blow through the iPad's battery in 4 minutes
           | flat
        
           | jocaal wrote:
           | > The memory bandwidth of 120 GBps is roughly 1/10th of the
           | NVIDIA hardware, but who's counting
           | 
           | Memory bandwidth is literally the main bottleneck when it
           | comes to the types of applications GPUs are used for, so
           | everyone is counting.
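           | 
           | For LLM inference in particular, a common back-of-envelope
           | (ignoring KV cache and compute limits) is that every
           | generated token streams all the weights through memory
           | once, so bandwidth caps tokens/sec at roughly bandwidth
           | divided by model size. A rough sketch, with assumed
           | example numbers:
           | 
           |     def max_tok_per_s(bandwidth_gb_s, weights_gb):
           |         # memory-bound ceiling: weights read once/token
           |         return bandwidth_gb_s / weights_gb
           | 
           |     # hypothetical 7B model at 4-bit: ~3.5 GB of weights
           |     print(max_tok_per_s(120, 3.5))   # M4-class: ~34
           |     print(max_tok_per_s(1008, 3.5))  # 4090-class: ~288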
        
           | anvuong wrote:
           | This comment needs to be downvoted more. TFLOPS is not
           | TOPS; this comparison is meaningless. The 4090 has about
           | 40x the TOPS of the M4.
        
         | bearjaws wrote:
         | We will have M4 laptops running 400B parameter models next
         | year. Wild times.
        
           | visarga wrote:
           | And they will fit in the 8GB RAM with 0.02 bit quant
        
             | gpm wrote:
             | You can get a macbook pro with 128 GB of memory (for nearly
             | $5000).
             | 
             | Which still implies... a 2 bit quant?
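               | 
               | The arithmetic, for anyone checking (weights only,
               | ignoring activation/KV-cache overhead):
               | 
               |     params = 400e9
               |     for bits in (16, 8, 4, 2):
               |         gb = params * bits / 8 / 1e9
               |         print(f"{bits:2d}-bit: {gb:,.0f} GB")
               |     # 16-bit: 800 GB ... 2-bit: 100 GB,
               |     # which just squeaks under 128 GB.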
        
               | freeqaz wrote:
               | There are some crazy 1/1.5 bit quants now. If you're
               | curious I'll try to dig up the papers I was reading.
               | 
               | 1.5bit can be done to existing models. The 1 bit (and
               | less than 1 bit iirc) requires training a model from
               | scratch.
               | 
               | Still, the idea that we can have giant models running in
               | tiny amounts of RAM is not completely far fetched at this
               | point.
        
               | gpm wrote:
               | Yeah, I'm broadly aware and have seen a few of the
               | papers, though I definitely don't try and track the state
               | of the art here closely.
               | 
               | My impression and experience trying low bit quants (which
               | could easily be outdated by now) is that you are/were
               | better off with a smaller model and a less aggressive
               | quantization (provided you have access to said smaller
               | model with otherwise equally good training). If that's
               | changed I'd be interested to hear about it, but
               | definitely don't want to make work for you digging up
               | papers.
        
               | moneywoes wrote:
               | eli5 quant?
        
               | gpm wrote:
               | Quant is short for "quantization" here.
               | 
               | LLMs are parameterized by a ton of weights, when we say
               | something like 400B we mean it has 400 billion
               | parameters. In modern LLMs those parameters are basically
               | always 16 bit floating point numbers.
               | 
               | It turns out you can get nearly as good results by
               | reducing the precision of those numbers, for instance by
               | using 4 bits per parameter instead of 16, meaning each
               | parameter can only take on one of 16 possible values
               | instead of one of 65536.
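               | 
               | A toy version of the round trip in Python/NumPy (plain
               | symmetric 4-bit quantization; real schemes use
               | per-block scales, but the idea is the same):
               | 
               |     import numpy as np
               | 
               |     w = np.random.randn(8).astype(np.float32)
               |     scale = np.abs(w).max() / 7   # int4 range -8..7
               |     q = np.clip(np.round(w / scale), -8, 7)
               |     w_hat = q * scale             # dequantized
               |     print(np.abs(w - w_hat).max())  # small error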
        
               | sbierwagen wrote:
               | Interestingly enough, Llama3 suffers more performance
               | loss than Llama2 did at identical quantizations.
               | https://arxiv.org/abs/2404.14047
               | 
               | There's some speculation that a net trained for more
               | epochs on more data learns to pack more information into
               | the weights, and so does worse when weight data is
               | degraded.
        
               | Der_Einzige wrote:
               | Most claims of "nearly as good results" are massively
               | overblown.
               | 
               | Even the so called "good" quants of huge models are
               | extremely crippled.
               | 
               | Nothing is ever free, and even going from 16 to 8bit will
               | massively reduce the quality of your model, no matter
               | whatever their hacked benchmarks claim.
               | 
               | No, it doesn't help because of "free regularization"
               | either. Dropout and batch norm were also placebo BS
               | that didn't actually help back in the day when they
               | were still being used.
        
               | fennecfoxy wrote:
               | Quantization is reducing the number of bits to store a
               | parameter for a machine learning model.
               | 
               | Put simply, a parameter is a number that determines
               | how likely it is that something will occur, i.e. if
               | the number is < 0.5 say "goodbye", otherwise say
               | "hello".
               | 
               | Now, if the parameter is a 32-bit (unsigned) integer
               | it can have a value of 0-4,294,967,295.
               | 
               | If you were using this 32bit value to represent physical
               | objects, then you could represent 4,294,967,296 objects
               | (each object gets given its own number).
               | 
               | However, a lot of the time in machine learning, you
               | find after training that not so many different
               | "things" actually need to be represented by a
               | particular parameter. Say you were representing types
               | of fruit with this parameter (Google says there are
               | over 2000 types of fruit, but let's just say there are
               | exactly 2000). In that case, 4,294,967,296/2000 means
               | there are about 2.1 million distinct values assigned
               | to each fruit, which is such a waste! The perfect case
               | would be a number that represents just 0-2000 in the
               | smallest way possible for this job.
               | 
               | This is where quantization comes in: the size of the
               | number used to represent a parameter is reduced,
               | saving memory at the expense of a small hit to model
               | accuracy - it's known that many models don't really
               | take a large accuracy hit from this, meaning that the
               | way the parameter is used inside the model doesn't
               | really need or take advantage of being able to
               | represent so many values.
               | 
               | So what we do is reduce that 32-bit number to 16, 8,
               | or 4 bits. We go from being able to represent billions
               | or millions of distinct values/states to maybe 16
               | (with 4-bit quantization), and then we benchmark the
               | model's performance against the larger version with
               | 32-bit parameters - often finding that whatever
               | training decided to use that parameter for doesn't
               | really need an incredibly granular value.
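               | 
               | (For the fruit example, the minimum you'd actually
               | need is 11 bits, since 2^11 = 2048 >= 2000:
               | 
               |     import math
               |     print(math.ceil(math.log2(2000)))  # -> 11
               | 
               | and everything beyond that is the "waste" being
               | trimmed away.)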
        
               | pulse7 wrote:
               | Current 2 bit quant models are useless. Smaller models
               | yield better results.
        
         | adrian_b wrote:
         | > The Most Powerful Neural Engine Ever
         | 
         | While it is true that the claimed performance for M4 is better
         | than for the current Intel Meteor Lake and AMD Hawk Point, it
         | is also significantly lower (e.g. around half) than the AI
         | performance claimed for the laptop CPU+GPU+NPU models that both
         | Intel and AMD will introduce in the second half of this year
         | (Arrow Lake and Strix Point).
        
           | whynotminot wrote:
           | > will introduce
           | 
           | Incredible that in the future there will be better chips than
           | what Apple is releasing now.
        
             | hmottestad wrote:
             | Don't worry. It's Intel we're talking about. They may say
             | that it's coming out in 6 months, but that's never stopped
             | them from releasing it in 3 years instead.
        
               | adrian_b wrote:
               | AMD is the one that has given more precise values (77
               | TOPS) for their launch, their partners are testing the
               | engineering samples and some laptop product listings seem
               | to have been already leaked, so the launch is expected
               | soon (presentation in June, commercial availability no
               | more than a few months later).
        
               | spxneo wrote:
               | I literally don't give a fck about Intel anymore; they
               | are irrelevant.
               | 
               | The Taiwanese silicon industrial complex deserves our
               | dollars. Their workers are insanely hard working and
               | it shows in the product.
        
               | benced wrote:
               | There's no Taiwanese silicon industrial complex;
               | there's TSMC. The rest of the Taiwanese fabs are
               | irrelevant. Intel is the clear #3 (and looks
               | likely-ish to overtake Samsung? We'll see).
        
             | adrian_b wrote:
             | The point is that it is a very near future, a few months
             | away.
             | 
             | Apple is also bragging very hyperbolically that the NPU
             | they introduce right now is faster than all the older NPUs.
             | 
             | So, while what Apple says, "The Most Powerful Neural Engine
             | Ever" is true now, it will be true for only a few months.
             | Apple has done a good job, so as it is normal, at launch
             | their NPU is the fastest. However this does not deserve any
             | special praise, it is just normal, as normal as the fact
             | that the next NPU launched by a competitor will be faster.
             | 
             | Only if the new Apple NPU had been slower than the older
             | models would that have been a newsworthy failure. A
             | newsworthy success would have required the new M4 to
             | have at least triple the performance it has, so that the
             | competitors would have needed more than a year to catch
             | up with it.
        
               | whynotminot wrote:
               | Is this the first time you're seeing marketing copy? This
               | is an entirely normal thing to do. Apple has an advantage
               | with the SoC they are releasing today, and they are going
               | to talk about it.
               | 
               | I expect we will see the same bragging from Apple's
               | competitors whenever they actually launch the chips
               | you're talking about.
               | 
               | Apple has real silicon shipping right now. What you're
               | talking about doesn't yet exist.
               | 
               | > A newsworthy success would have been only if the new M4
               | would have had at least a triple performance than it has,
               | so that the competitors would have needed more than a
               | year to catch up with it.
               | 
               | So you decide what's newsworthy now? Triple? That's so
               | arbitrary.
               | 
               | I certainly better not see you bragging about these
               | supposed chips later if they're not three times faster
               | than what Apple just released today.
        
               | adrian_b wrote:
             | I said triple, because the competitors are expected to
             | have double the speed in a few months.
             | 
             | If the M4 were 3 times faster than it is, it would have
             | remained faster than Strix Point and Arrow Lake, which
             | would be replaced only next year, giving supremacy to
             | the M4 for more than a year.
             | 
             | If the M4 were twice as fast, it would have continued to
             | share the first position for more than a year. As it is,
             | it will be the fastest for one quarter, after which it
             | will have only half of the top speed.
        
               | whynotminot wrote:
               | And then Apple will release M5 next year, presumably with
               | another increase in TOPS that may well top their
               | competitors. This is how product releases work.
        
               | spxneo wrote:
               | strongly doubt we will see M5 so soon
        
               | thejazzman wrote:
               | The M3 was released Oct 30, 2023; the M4 was released
               | May 7, 2024.
               | 
               | [disco stu] If these trends continue, the M5 will be
               | out on November 14, 2024.
        
               | handsclean wrote:
               | I can't tell what you're criticizing. Yes, computers get
               | faster over time, and future computers will be faster
               | than the M4. If release cycles are offset by six months
               | then it makes sense that leads only last six months in a
               | neck-and-neck race. I'd assume after Arrow Lake and Strix
               | Point the lead will then go back to M5 in six months,
               | then Intel and AMD's whatever in another six, etc. I
               | guess that's disappointing if you expected a multi-year
               | leap ahead like the M1, but that's just a bad
               | expectation, it never happens and nobody predicted or
               | claimed it.
        
               | davej wrote:
               | Apple will also introduce the "Pro" line of their M4
               | chips later in the year and I expect that they will
               | improve the Neural Engine further.
        
           | intrasight wrote:
           | > The Most Powerful Neural Engine Ever
           | 
           | that would be my brain still - at least for now ;)
        
           | spxneo wrote:
           | damn bro thanks for this
           | 
           | here i am celebrating not pulling the trigger on M2 128gb
           | yesterday
           | 
           | now im realizing M4 ain't shit
           | 
           | will wait a few more months for what you described. will
           | probably wait for AMD
           | 
           | > Given that Microsoft has defined that only processors with
           | an NPU with 45 TOPS of performance or over constitute being
           | considered an 'AI PC',
           | 
           | so already with 77 TOPS it just destroys M4. Rumoured to hit
           | the market in 2 months or less.
        
         | paulpan wrote:
         | The fact that TSMC publishes their own metrics and target goals
         | for each node makes it straightforward to compare the
         | transistor density, power efficiency, etc.
         | 
         | The most interesting aspect of the M4 is simply that it's
         | debuting on the iPad lineup, whereas historically new chips
         | have always debuted on the iPhone (for the A-series) and the
         | MacBook (for the M-series). Makes sense given the low
         | expected yields on the newest node and that this is one of
         | Apple's lower-volume products.
         | 
         | For the curious, the original TSMC N3 node had a lot of issues
         | plus was very costly so makes sense to move away from it:
         | https://www.semianalysis.com/p/tsmcs-3nm-conundrum-does-it-e...
        
           | spenczar5 wrote:
           | iPads are actually much higher volume than Macs. Apple sells
           | about 2x to 3x as many tablets as laptops.
           | 
           | Of course, phones dwarf both.
        
             | andy_xor_andrew wrote:
             | The iPad Pros, though?
             | 
             | I'm very curious how much iPad Pros sell. Out of all the
             | products in Apple's lineup, the iPad Pro confuses me the
             | most. You can tell what a PM inside Apple thinks the iPad
             | Pro is for, based on the presentation: super powerful M4
             | chip! Use Final Cut Pro, or Garageband, or other desktop
             | apps on the go! Etc etc.
             | 
             | But in reality, who actually buys them, instead of an iPad
             | Air? Maybe some people with too much money who want the
             | latest gadgets? Ever since they debuted, the general
             | consensus from tech reviewers on the iPad Pro has been
             | "It's an amazing device, but no reason to buy it if you can
             | buy a MacBook or an iPad Air"
             | 
             | Apple really wants this "Pro" concept to exist for iPad
             | Pro, like someone who uses it as their daily work surface.
             | And maybe _some_ people exist like that (artists?
             | architects?) but most of the time when I see an iPad in a
             | "pro" environment (like a pilot using it for nav, or a
             | nurse using it for notes) they're using an old 2018
             | "regular" iPad.
        
               | transpute wrote:
               | iPadOS 16.3.1 can run virtual machines on M1/M2 silicon, 
               | https://old.reddit.com/r/jailbreak/comments/18m0o1h/tutor
               | ial...
               | 
               | Hypervisor support was removed from the iOS 16.4 kernel,
               | hopefully it will return in iPadOS 18 for at least some
               | approved devices.
               | 
               | If not, Microsoft/HP/Dell/Lenovo Arm laptops with
               | M3-competitive performance are launching soon, with
               | mainline Linux support.
        
               | dmitrygr wrote:
               | > Microsoft/HP/Dell/Lenovo Arm laptops with
               | M3-competitive performance are launching soon, with
               | mainline Linux support.
               | 
               | I have been seeking someone who'll be willing to put
               | money on such a claim. I'll bet the other way. Perchance
               | you're the person I seek, if you truly believe this?
        
               | transpute wrote:
               | Which part - launch timing, multicore performance or
               | mainline Linux support?
        
               | dmitrygr wrote:
               | perf >= M3 _while_ power consumption <= M3, while
               | booted into Linux, doing, say: 50% streaming a video
               | on youtube.com over wifi at minimum brightness, 50%
               | compiling some C project in a loop from and to the
               | internal SSD, at minimum brightness.
               | 
               | Compared to macOS on M3 doing the same
        
               | transpute wrote:
               | its_a_trap.jpg :)
               | 
               | At Qualcomm SoC launch, OSS Linux can't possibly compete
               | with the deep pockets of optimized-shenanigan Windows
               | "drivers" or vertically integrated macOS on Apple
               | Silicon.
               | 
               | But the incumbent landscape of Arm laptops for Linux is
               | so desolate, that it can only be improved by the arrival
               | of multiple Arm devices from Tier 1 PC OEMs based on a
               | single SoC family, with skeletal support in mainline
               | Linux. In time, as with Asahi reverse engineering of
               | Apple firmware interfaces, we can have mainline Linux
               | support and multiple Linux distros on enterprise Arm
               | laptops.
               | 
               | One risk for MS/Asus/HP/Dell/Lenovo devices based on
               | Qualcomm Nuvia/Oryon/EliteX is that Qualcomm + Arm
               | licensing fees could push device pricing into "premium"
               | territory. The affordable Apple Macbook Air, including
               | used M1 devices, will provide price and performance
               | competition. If enterprises buy Nuvia laptops in volume,
               | then Linux will have a used Arm laptop market in 2-3
               | years.
               | 
               | So.. your test case might be feasible after a year or two
               | of Linux development and optimization. Until then, WSL2
               | on Windows 11 could be a fallback. For iPad Pro users
               | desperate for portable Linux/BSD VM development with long
               | battery life, Qualcomm-based Arm laptops bring much
               | needed competition to Apple Silicon. If Nuvia devices can
               | run multiple OSS operating systems, it's already a win
               | for users, making possible the Apple-impossible. Ongoing
               | performance improvements will be a bonus.
        
               | dmitrygr wrote:
               | That's the point! In two years the M5 will exist.
               | 
               | But I'm happy to take that bet with "Linux" replaced
               | with "Windows"
        
               | transpute wrote:
               | Since the hardware already exists and has been
               | benchmarked privately, this is less of a bet and more of
               | an information asymmetry. So let's assume you would win
               | :) Next question is why - is it a limitation of the SoC,
               | power regulators, motherboard design, OS integration, Arm
               | licensing, Apple patents, ..?
        
               | zarzavat wrote:
               | I presume the sequence of events was: some developer at
               | Apple thought it would be a great idea to port hypervisor
               | support to iPad and their manager approves it. It gets
               | all the way into the OS, then an exec gets wind of it and
               | orders its removal because it allows users to subvert the
               | App Store and Apple Rent. I doubt it's ever coming back.
               | 
               | This is everything wrong with the iPad Pro in a nutshell.
               | Fantastic hardware ruined by greed.
        
               | transpute wrote:
               | It's been rumored for years that a touch-optimized
               | version of macOS has been in development for use within
               | iOS VMs.
        
               | ninkendo wrote:
               | Never. Not ever, ever ever.
               | 
               | Apple currently has 5 major build trains: macOS, iOS,
               | watchOS, tvOS (which also runs HomePod), and visionOS.
               | Huge amounts of the code are already the same between
               | them: they literally just build the same stuff with
               | different build settings... except for the UI. The UI has
               | actually unique stuff in each train.
               | 
               | This has become more true over time... teams are likely
               | sick of not having certain dependencies on certain
               | trains, so they're becoming more identical at the
               | foundation/framework level every release.
               | 
               | Saying they'll make a macOS with a touch UI is like
               | saying Honda is finally going to make a motorcycle with
               | four wheels and a full car frame. The UI is the
               | differentiating factor in the OS's. Everything else has
               | already converged or is rapidly doing so.
               | 
               | If the goal is to support macOS apps on iOS then there's
               | a dilemma: how do you suddenly make apps that are
               | designed from the ground up for a mouse, good for touch?
               | The answer is you don't: you just make the rest of the
               | system identical (make the same APIs available
               | everywhere) and ask developers to make the UI parts
               | different.
               | 
               | I could _almost_ believe that they'd make a macOS VM
               | available for use with a keyboard and mouse within iOS.
               | But to me it'd make more sense to do a sort of reverse
               | version of how iOS apps are supported on macOS... where
               | macOS apps are run natively on the iPad, but rendered
               | with the iPad's window management (modulo whatever
               | multitasking features they still need to implement to
               | make this seamless) and strictly require a keyboard and
               | mouse to be in this mode. There's just no reason to make
               | a VM if you're doing this: you can just run the binary
               | directly. The kernel is the same, the required frameworks
               | are the same. No VM is needed.
        
               | transpute wrote:
               | VMs are needed by professional developers who want to run
               | CLI tools and services (e.g. web server, database)
               | without the security restrictions of iOS, while retaining
               | the OS integrity of the iPad Pro device.
               | 
               | Even if a macOS VM had only a CLI terminal and a few core
               | apps made by Apple, using a Swift UI framework that was
               | compatible with a touch interface, it would be a huge
               | step forward for iPad owners who are currently limited to
               | slow and power-expensive emulation (iSH, ashell). Apple
               | could create a new app store or paid upgrade license
               | entitlement for iOS-compatible macOS apps, so that users
               | can pay ISVs for an app version with iOS touch input.
        
               | ninkendo wrote:
               | What you're talking about sounds great but it's not "a
               | touch optimized version of macOS". You're describing a
               | CLI environment in a sandbox.
               | 
               | Apple will never ever take macOS and change its UI to be
               | optimized for touch. Or at least if they do, it's time to
               | sell the stock. They already have a touch UI, and it's
               | called iOS. They're converging the two operating systems
               | by making the underlying frameworks the same... the UI is
               | literally the only thing they _shouldn't_ converge.
        
               | transpute wrote:
               | _> subvert the App Store and Apple Rent._
               | 
               | EU and US regulators are slowly eroding that service
               | monopoly.
               | 
               |  _> Fantastic hardware_
               | 
               | Hopefully Apple leadership stops shackling their hardware
               | under the ho-hum service bus.
               | 
               | It's been rumored for years that a touch-optimized
               | version of macOS has been in development for use in iOS
               | VMs. With the launch of M4 1TB 16GB iPad Pros for $2K
               | (the price of two MacBook Airs), Apple can sell
               | developers the freedom to carry one device instead of
               | two, without loss of revenue,
               | https://news.ycombinator.com/item?id=40287922
        
               | zarzavat wrote:
               | I bet that touch-optimized macOS will never see the light
               | of day, or if it does it will be insanely crippled. Too
               | much of an existential threat to Apple's stock price.
               | 
               | Apple is in the midst of a cold war with regulators now.
               | Every new feature will be scrutinized to check that it
               | offers no threat to their golden goose if regulators
               | force them to open it up. Allowing one type of VM means
               | that regulators could force them to allow any type of VM.
        
               | intrasight wrote:
               | Totally agree about "Pro". Imagine if they gave it a real
               | OS. Someone yesterday suggested dual-booting. At first I
               | dismissed that idea. But after thinking about it, I can
               | see the benefits. They could leave ipadOS alone and
               | create a bespoke OS. They certainly have the resources to
               | do so. It would open up so many new sales channels for a
               | true tablet.
        
               | tyre wrote:
               | Which sales channels?
        
               | transpute wrote:
               | _> They could leave ipadOS alone and create a bespoke
               | OS._
               | 
               | Asahi Linux already runs on Apple Silicon.
               | 
               | The EU could try to unlock Apple device boot of owner-
               | authorized operating systems.
        
               | intrasight wrote:
               | That's another path to having a real OS. And more likely
               | to be realized.
        
               | kmeisthax wrote:
               | >artists? architects?
               | 
               | Ding ding ding ding ding! The iPad Pro is useful
               | _primarily_ for those people. Or at least it _was_. The
               | original selling point of the Pro was that it had[0] the
               | Apple Pencil and a larger screen to draw on. The 2021
               | upgrade gave the option to buy a tablet with 16GB of RAM,
               | which you need for Procreate as that has very strict
               | layer limits. If you look at the cost of dedicated
               | drawing tablets with screens in them, dropping a grand on
               | an iPad Pro and Pencil is surprisingly competitive.
               | 
               | As for every other use case... the fact that all these
               | apps have iPad versions now is great, _for people with
               | cheaper tablets_. The iPad Air comes in 13" now and
               | that'll satisfy all but the most demanding Procreate
               | users _anyway_, for about the same cost as the Pro had
               | back in 2016 or so. So I dunno. Maybe someone at Apple's
               | iPad division just figured they need a halo product? Or
               | maybe they want to compete with the Microsoft Surface
               | without having to offer the flexibility (and
               | corresponding jank) of a real computer? I dunno.
               | 
               | [0] sold separately, which is one of my biggest pet
               | peeves with tablets
        
               | simonsquiff wrote:
               | What's sad about the Air is that it's only a 60Hz screen.
               | I'm spoilt now with 120Hz on the first-gen iPad Pro; the
               | iPad needs it even more than phones do (and they need
               | it). So I'm not a demanding user in any other way, but
               | the Air is not satisfying to me, yet.
        
             | wpm wrote:
             | iPads as a product line, sure, but the M4 is only in the
             | Pros at the moment, which are likely lower volume than
             | the MacBook Air.
        
           | srg0 wrote:
           | With Logic Pro for iPad they now have applications for all
           | their traditional Mac use cases on iPad. If anything, it
           | feels like Apple is pushing for a switch from low-tier Macs
           | to iPad Pro.
           | 
           | And they surely can sell more gadgets and accessories for an
           | iPad than for a laptop.
        
         | barbariangrunge wrote:
         | My M2 Pro is already more powerful than I can use. The
         | screen is too small for big work like using a DAW or doing
         | video editing, and the Magic Keyboard is uncomfortable, so I
         | stopped writing on it. All that processing power, and I
         | don't know what it will be used for on a tablet without even
         | a good file system. Lousy ergonomics.
        
       | exabrial wrote:
       | All I want is more memory bandwidth at lower latency. I've
       | learnt that's where the vast majority of felt responsiveness
       | comes from today. I couldn't care less about AI and Neural
       | Engine party tricks, stuff I might use once a day or week.
        
       | bschmidt1 wrote:
       | Bring on AI art, music, & games!
        
       | oxqbldpxo wrote:
       | All this powerful hardware on a laptop computer is like
       | driving a Ferrari at 40 mph. It is begging for better use. If
       | Apple ever releases an AI robot, that's going to change
       | everything. Long way to go, but when it arrives, it will be
       | ChatGPT x100.
        
       | adonese wrote:
       | Imagine a device as powerful as this new iPad, yet so
       | useless. It baffles me that we have this great hardware and
       | only the software bit is lacking.
        
       | tiffanyh wrote:
       | Nano-Texture
       | 
       | I really hope this comes to all Apple products soon (iPhones, all
       | iPads, etc).
       | 
       | It's some of the best anti-reflective tech I've seen that keeps
       | color and brightness deep & bright.
        
         | kylehotchkiss wrote:
         | Will be interesting to see how it holds up on devices that get
         | fingerprints and could be scratched though. Sort of wish Apple
         | would offer it as a replaceable screen film.
        
         | adultSwim wrote:
         | When they stopped offering matte displays I switched to
         | Thinkpads. I'd really like an Air but can't imagine looking at
         | myself all day.
        
       | rnikander wrote:
       | Any hope for a new iPhone SE? My 1st gen's battery is near dead.
        
       | api wrote:
       | Looks great. Now put it in a real computer. Such a waste to be in
       | a jailed device that can't run anything.
        
       | obnauticus wrote:
       | Looks like their NPU (aka ANE) takes up about 1/3 of the die area
       | of the GPU.
       | 
       | Would be interesting to see how much they're _actually_ utilizing
       | the NPU versus their GPU for AI workloads.
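       | 
       | One way to probe that split is to pin a Core ML model to
       | specific compute units and compare latency. A rough sketch
       | with coremltools (the model path and input name below are
       | placeholders):
       | 
       |     import time
       |     import numpy as np
       |     import coremltools as ct
       | 
       |     PATH = "model.mlpackage"  # any converted Core ML model
       | 
       |     def bench(units, n=50):
       |         m = ct.models.MLModel(PATH, compute_units=units)
       |         x = {"input": np.random.rand(1, 3, 224, 224)
       |                         .astype(np.float32)}
       |         m.predict(x)  # warm-up (compiles for the target units)
       |         t0 = time.perf_counter()
       |         for _ in range(n):
       |             m.predict(x)
       |         return (time.perf_counter() - t0) / n
       | 
       |     for units in (ct.ComputeUnit.CPU_AND_NE,
       |                   ct.ComputeUnit.CPU_AND_GPU):
       |         print(units, f"{bench(units) * 1e3:.1f} ms")
       | 
       | If the ANE-pinned run is much faster (or slower), that tells
       | you which block the workload actually favors.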
        
       | noiv wrote:
       | I got somewhat accustomed to new outrageous specs every year,
       | but reading near the end that by 2030 Apple plans to be
       | 'carbon neutral across the entire manufacturing supply chain
       | and life cycle of every product' makes me hope one day my
       | devices are not just an SUV on the data highway.
        
       | gavin_gee wrote:
       | I'm still rocking an iPad 6th generation. It's a video
       | consumption device only. A faster CPU doesn't enable any new
       | use cases.
       | 
       | The only reason to upgrade is the consumer's desire to buy
       | more.
        
         | cyberpunk wrote:
         | New oled does look like quite a nice display though...
        
           | doctor_eval wrote:
           | Yep I have an OLED TV but was watching a movie on my (M1)
           | iPad Pro last night, and realised how grey the blacks were.
           | 
           | Once you see it, etc.
        
         | nortonham wrote:
         | And I still have an original first-gen iPad Air... still
         | works for basic things. Unsupported by Apple now, but still
         | usable.
        
       | rvalue wrote:
       | No mention of battery life. They keep making stuff thin and
       | unupgradeable. What's the point of buying an Apple device
       | that is going to wear out in 5 years?
        
         | namdnay wrote:
         | To be fair every MacBook I've had has lasted 10 years minimum
        
         | TillE wrote:
         | Battery replacement costs $200. It's not like you just have to
         | throw it in the trash if the battery dies.
        
       | EugeneOZ wrote:
       | 1TB model = EUR 2,750.
       | 
       | For an iPad, not an MBP laptop.
        
       | JodieBenitez wrote:
       | And here I am with my MB Air M1 with no plan to upgrade
       | whatsoever because I don't need to...
       | 
       | (yes, I understand this is about the iPad, but I guess we'll
       | see the M4 on the MB Air as well?)
        
         | bschmidt1 wrote:
         | You'll want to upgrade to produce generative content - you just
         | don't know it yet.
        
           | JodieBenitez wrote:
           | I don't do such things, not on my laptop anyways. I'll
           | upgrade when the battery capacity is worn out, that's the
           | only reason I can foresee.
        
       | NorwegianDude wrote:
       | > M4 has Apple's fastest Neural Engine ever, capable of up to 38
       | trillion operations per second, which is faster than the neural
       | processing unit of any AI PC today.
       | 
       | I always wonder what crazy meds Apple employees are on. Two
       | RTX 4090s is quite common for hobbyist use, and that's 1321
       | TOPS each, so a pair is over 69 times what Apple claims is
       | the fastest in the world. That performance is literally less
       | than 1% of a single H200.
       | 
       | Talk about misleading marketing...
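       | 
       | Quick sanity check on those numbers (vendor-quoted TOPS;
       | precisions and sparsity settings differ, so it's a rough
       | comparison at best):
       | 
       |     m4_ane = 38     # Apple's quoted TOPS for the M4 ANE
       |     rtx4090 = 1321  # NVIDIA's quoted INT8 TOPS per 4090
       |     h200 = 3958     # NVIDIA's quoted INT8 TOPS for H200
       | 
       |     print(2 * rtx4090 / m4_ane)  # ~69.5x for two 4090s
       |     print(100 * m4_ane / h200)   # ~0.96% of one H200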
        
         | akshayt wrote:
         | They are referring to the integrated NPUs in current CPUs,
         | like the Intel Core Ultra.
         | 
         | They explicitly mentioned in the event that the industry
         | refers to the neural engine as an NPU.
        
           | mort96 wrote:
           | But the word they used isn't "NPU" or "neural engine" but "AI
           | PC"??? If I build a PC with a ton of GPU power with the
           | intention of using that compute for machine learning then
           | that's an "AI PC"
        
             | fwip wrote:
             | The technicality they're operating on is that the "AI PC"
             | doesn't have a "neural processing unit."
             | 
             | > faster than the neural processing unit of any AI PC
             | today.
        
               | mort96 wrote:
               | Ah. I guess you could argue that that's technically not
               | directly false. That's an impressive level of being
               | dishonest without being technically incorrect.
               | 
               | By comparing the non-existent neural engine in your
               | typical AI PC, you could claim that the very first SoC
               | with an "NPU" is infinitely faster than the typical AI PC
        
               | sroussey wrote:
               | The phrase AI PC used by Intel and AMD is about having
               | an NPU like in the Intel Ultra chips. These are ML-only
               | blocks, and can run without activating the GPU.
               | 
               | https://www.theverge.com/2023/12/14/23998215/intel-core-
               | ultr...
        
             | SllX wrote:
             | On paper you're absolutely correct. AI PC is marketing
             | rubbish out of Wintel. Apple's doing a direct comparison to
             | that marketing rubbish and just accepting that they'll
             | probably have to play along with it.
             | 
             | So going by the intended usage of this marketing rubbish,
             | the comparison Apple is making isn't to GPUs. It's to
             | Intel's chips that, like Apple's, integrate CPU, GPU, and
             | NPU. They just don't name-drop Intel anymore when they
             | don't have to.
        
               | mort96 wrote:
               | If they literally just said that the iPad's NPU is faster
               | than the NPU of any other computer it'd be fine, I would
               | have no issue with it (though it makes you wonder, maybe
               | that wouldn't have been true? Maybe Qualcomm or Rockchip
               | have SoCs with faster NPUs, so the "fastest of any AI PC"
               | qualifier is necessary to exclude those?)
        
             | aurareturn wrote:
             | "AI PC" is what Microsoft and the industry has deemed SoCs
             | that have an NPU in it. It's not a term that Apple made up.
             | It's what the industry is using.
             | 
             | Of course, Apple has had an NPU in their SoC since the
             | first iPhone with FaceID.
        
               | max51 wrote:
               | When they made up that term, they also made up a TOPS
               | requirement that is higher than what the M4 has. It's
               | not by much, but technically the M4 is not even fast
               | enough to qualify as an AI PC.
        
             | numpad0 wrote:
             | Microsoft/Intel have been pushing this "AI-enabled PC" or
             | whatever for a few months, to obsolete laptops without an
             | NPU stuffed into unused I/O-die space on the CPU. Apple
             | weaponized that in this instance.
             | 
             | 1: https://www.theregister.com/2024/03/12/what_is_an_ai_pc/
        
             | stetrain wrote:
             | "AI PC" is a specific marketing term from Intel and
             | Microsoft. I don't think their specs include dual RTX
             | 4090s.
             | 
             | https://www.tomshardware.com/pc-components/cpus/intel-
             | shares...
        
               | oarth wrote:
               | An AI PC is a PC suited to be used for AI... A dual
               | 4090 is very suited for small-scale AI.
               | 
               | It might be a marketing term by Microsoft, but that is
               | just dumb, and has nothing to do with what Apple says.
               | If this was in relation to Microsoft's "AI PC" then
               | Apple should have written "Slower than ANY AI PC."
               | instead, as the minimum requirement for an "AI PC by
               | Microsoft" seems to be 45 TOPS, and the M4 is too slow
               | to qualify by the Microsoft definition.
               | 
               | Are you heavily invested in Apple stock or something?
               | When a company clearly lies and tries to mislead
               | people, call them out on it, don't defend them.
               | Companies are not your friend. Wtf.
        
               | pertymcpert wrote:
               | > Are you heavily invested in Apple stock or something?
               | 
               | This isn't a nice thing to say.
        
               | stetrain wrote:
               | > Are you heavily invested in Apple stock or something?
               | When a company clearly lies and tries to mislead people,
               | call them out on it, don't defend them. Companies are
               | not your friend. Wtf.
               | 
               | I don't own any Apple stock, at least not directly. I'm
               | not defending Apple, just trying to understand what their
               | claim is. Apple does plenty of consumer un-friendly
               | things but they aren't dumb and they have good lawyers so
               | they tend not to directly lie about things in product
               | claims.
        
               | NorwegianDude wrote:
               | Fair enough. You are correct that Apple aren't dumb, but
               | they do mislead as much as they can in marketing, and by
               | their own words from previous court cases, no
               | "reasonable person" would take their claims as facts.
               | 
               | In this case they do straight up lie, without a question.
               | There is no reasonable explanation for the claim. If they
               | had some absurd meaning behind it then they should have
               | put a footnote on it.
        
           | NorwegianDude wrote:
           | The text clearly states faster than any AI PC, not that
           | it's faster than any NPU integrated into a CPU.
           | 
           | They could have written it correctly, but that sounds way
           | less impressive, so instead they make up shit to make it
           | sound very impressive.
        
             | aurareturn wrote:
             | https://www.microsoft.com/en-us/americas-partner-
             | blog/2024/0...
             | 
             | It's the term Microsoft, Intel, AMD, and Qualcomm decided
             | to rally around. No need to get upset at Apple for using
             | the same term as reference for comparison.
             | 
             | Ps. Nvidia also doesn't like the term because of precisely
             | what you said. But it's not Apple that decided to use this
             | term.
        
               | max51 wrote:
               | If you want to adopt this new terminology, remember that
               | Intel and Microsoft have a requirement of 40 TOPS for "AI
               | PCs". How can the m4 be faster than any AI PC if it's too
               | slow to even qualify as one?
        
               | pertymcpert wrote:
               | Source? IIRC that 40 TOPS was just Microsoft saying that
               | was the requirement for "next gen" AI PCs, not a
               | requirement for any AI PC to be classed as one.
        
           | janalsncm wrote:
           | I've never heard anyone refer to an NPU before. I've heard of
           | GPU and TPU. But in any case, I don't know the right way to
           | compare Apple's hardware to a 4090.
        
         | syntaxing wrote:
         | Definitely misleading, but they're talking about "AI PC"
         | NPUs rather than GPUs. They're pretty much taking a jab at
         | Intel.
        
         | talldayo wrote:
         | Watching this site recover after an Apple press release is like
         | watching the world leaders deliberate Dr. Strangelove's
         | suggestions.
        
         | make3 wrote:
         | (An H200 is a five-figure datacenter GPU without a display
         | port; it's not what they mean by PC, but your general point
         | still stands.)
        
         | MBCook wrote:
         | That's not a neural processing unit. It's a GPU.
         | 
         | They said they had the fastest NPU in a PC. Not the fastest on
         | earth (one of the nVidia cards, probably). Not the fastest way
         | you could run something (probably a 4090 as you said). Just the
         | fastest NPU shipping in a PC. Probably consumer PC.
         | 
         | It's marketing, but it seems like a reasonable line to draw to
         | me. It's not like when companies draw a line like "fastest car
         | under $70k with under 12 cylinders but available in green from
         | the factory".
        
           | NorwegianDude wrote:
           | Of course a GPU from Nvidia is also an NPU. People are
           | spending billions each month on Nvidia because it's a
           | great NPU.
           | 
           | The fact is that a GPU from Nvidia is a much faster NPU than
           | a CPU from Apple.
           | 
           | It is marketing as you say, but it's misleading marketing, on
           | purpose. They could have simply written "the fastest
           | integrated NPU of any CPU" instead. This is something Apple
           | often does on purpose, and people believe it.
        
             | MBCook wrote:
             | A GPU does other things. It's designed to do something
             | else. That's why we call it a _G_PU.
             | 
             | It just happens to be good at neural stuff too.
             | 
             | There's another difference too. Apple's NPU is integrated
             | in their chip. Intel and AMD are doing the same. A 4090
             | is not integrated into a CPU.
             | 
             | I'm somewhat guessing. Apple said NPU is the industry term,
             | honestly I'd never heard it before today. I don't know if
             | the official definition draws a distinction that would
             | exclude GPUs or not.
             | 
             | I simply think the way Apple presented things seemed
             | reasonable. When they made that claim the fact that they
             | might be comparing against a 4090 never entered my mind. If
             | they had said it was the fastest way to run neural networks
             | I would have questioned it, no doubt. But that wasn't the
             | wording they used.
        
               | sudosysgen wrote:
               | NVidia GPUs basically have an NPU, in the form of Tensor
               | units. They don't just happen to be good at matmul; they
               | have specific hardware designed to run neural networks.
               | 
               | There is no actual distinction. A GPU with Tensor cores
               | (= matmul units) really does have an NPU just as much
               | as a CPU with an NPU (= matmul units).
        
               | oarth wrote:
               | > A GPU does other things.
               | 
               | Yes, and so does the M4.
               | 
               | > It just happens to be it's good at neural stuff too.
               | 
               | No, it's no coincidence. Nvidia has been focusing on
               | neural nets, same as Apple.
               | 
               | > There's another difference too. Apple's NPU is
               | integrated in their chip.
               | 
               | The neural processing capabilities of Nvidia products
               | (Tensor Cores) are also integrated in the chip.
               | 
               | > A 4090 is not integrated into a CPU.
               | 
               | Correct, but nobody ever stated that. Apple stated that
               | M4 was faster than any AI PC today, not that it's the
               | fastest NPU integrated into a CPU. And by the way, the M4
               | is also a GPU.
               | 
               | > I don't know if the official definition draws a
               | distinction that would exclude GPUs or not.
               | 
               | An NPU can be part of a GPU, a CPU, or its own chip.
               | 
               | > If they had said it was the fastest way to run neural
               | networks I would have questioned it,
               | 
               | They said fastest NPU, neural processing unit. It's the
               | term Apple and a few others use for their AI
               | accelerators. The whole point of an AI accelerator is
               | performance and efficiency. If something does a better
               | job at it, then it's a better AI accelerator.
        
               | lostmsu wrote:
               | You know the G in GPU stands for Graphics, right? So if
               | you want to play a game of words, NVidia's device
               | dedicated to something else is 30 times faster than
               | Apple's "fastest" device dedicated specifically to
               | neural processing.
        
             | dyauspitr wrote:
             | At that point you could just call a GPU a CPU. There are
             | meaningful distinctions to be made based on what the chip
             | is used for exclusively.
        
           | adrian_b wrote:
           | Both Intel's and AMD's laptop CPUs include NPUs, and they are
           | indeed slower than M4.
           | 
           | Nevertheless, Apple's bragging is a little weird, because
           | both Intel and AMD have already announced that in a few
           | months they will launch laptop CPUs with much faster NPUs
           | than Apple M4 (e.g. 77 TOPS for AMD), so Apple will hold the
           | first place for only a very short time.
        
             | MBCook wrote:
             | But do you expect them to say it's the "soon to be second
             | fastest"?
             | 
             | It's the fastest available today. And when they release
             | something faster (M4 Pro or Mac or Ultra or whatever)
             | they'll call that the fastest.
             | 
             | Seems fair to me.
        
               | pulse7 wrote:
               | It is NOT the fastest available today. I have a 1 year
               | old PC below my desk which does faster neural network
               | processing than the M4. It has an Intel CPU and an
               | NVIDIA 4090. It runs Llama 7B models MUCH faster than
               | any Apple chip. And it is a PC. I am sorry, but the
               | sentence "It's the fastest available today." is a
               | straightforward lie...
        
             | jjtheblunt wrote:
             | Why do you believe that? Announcements of future releases
             | by Intel and AMD are announcements, not facts. If they
             | deliver, then fine, but you speak as if they're already
             | factual.
        
           | snypher wrote:
           | But I can't just say I have "the world's fastest GXZ", when
           | GXZ is just some marketing phrase. If we're willing to accept
           | a GPU ~= NPU then it's just a meaningless claim.
        
         | phren0logy wrote:
         | I have a MacBook M2 and a PC with a 4090 ("just" one of them) -
         | the VRAM barrier is usually what gets me with the 4090 when I
         | try to run local LLMs (not train them). For a lot of things, my
         | MacBook is fast enough, and with more RAM, I can run bigger
         | models easily. And, it's portable and sips battery.
         | 
         | The marketing hype is overblown, but for many (most? almost
         | all?) people, the MacBook is a much more useful choice.
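         | 
         | The back-of-the-envelope that bites you: weights alone need
         | roughly params x bytes-per-weight, before the KV cache and
         | activations. A quick sketch:
         | 
         |     def weight_gib(params_b, bits):
         |         # params_b = parameter count in billions
         |         return params_b * 1e9 * bits / 8 / 2**30
         | 
         |     for p in (7, 13, 70):
         |         print(f"{p}B: fp16 ~{weight_gib(p, 16):.0f} GiB, "
         |               f"4-bit ~{weight_gib(p, 4):.0f} GiB")
         | 
         | A 70B model even at 4 bits (~33 GiB) blows past a 4090's
         | 24GB, but fits comfortably in a 64GB MacBook's unified
         | memory.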
        
           | ProllyInfamous wrote:
           | Expanding on this, I have an M2 Pro (mini) & a tower
           | w/GPU... but for daily driving, the M2 Pro idles at 15-35W
           | whereas the tower idles at 160W.
           | 
           | Under full throttle/load, even though the M2 Pro is rated
           | as less performant, it only uses 105W -- the tower/GPU are
           | >450W!
        
         | SkyPuncher wrote:
         | All of the tech specs comparisons were extremely odd. Many
         | things got compared to the M1, despite the most recent iPad
         | having the M2. Heck, one of the comparisons was to the A11 chip
         | that was introduced nearly 7 years ago.
         | 
         | I generally like Apple products, but I cannot stand the way
         | they present them. They always hide how it compares against the
         | directly previous product.
        
         | Aurornis wrote:
         | It's a marketing trick. They're talking about _NPUs_
         | specifically, which haven't really been rolled out on the PC
         | side.
         | 
         | So while they're significantly slower than even casual gaming
         | GPUs, they're technically the fastest _NPUs_ on the market.
         | 
         | It's marketing speak.
        
           | kllrnohj wrote:
           | It's still lower than Qualcomm's Snapdragon X Elite (45
           | TOPS).
        
         | smith7018 wrote:
         | You're calling $3,600 worth of GPUs "quite common for hobbyist
         | use" and then comparing an iPad to a $40,000 AI-centric GPU.
        
           | sudosysgen wrote:
           | It's almost 70x more powerful. A 4 year old 3070 laptop was
           | cheaper when it came out and has about 200 TOPS, 7 times as
           | much. It's just factually incorrect to call it "faster than
           | any AI PC", it's far slower than a cheaper laptop from 4
           | years ago.
        
             | astrange wrote:
             | "Powerful" isn't the thing that matters for a battery-
             | powered device. Power/perf is.
        
               | sudosysgen wrote:
               | If they thought that peak performance didn't matter, they
               | wouldn't quote peak performance numbers in their
               | comparison, and yet they did. Peak performance clearly
               | matters, even in battery powered devices: many workloads
               | are bursty and latency matters then, and there are
               | workloads where you can be expected to be plugged in. In
               | fact, one such workload is generative AI which is often
               | characterized by burst usage where latency matters a lot,
               | which is exactly what these NPUs are marketed towards.
        
             | acdha wrote:
             | AI PC is a specific marketing term which Intel is using for
             | their NPU-equipped products where they're emphasizing low-
             | power AI:
             | 
             | https://www.intel.com/content/www/us/en/newsroom/news/what-
             | i...
             | 
             | In that context it seems fair to make the comparison
             | between a MacBook and the PC version which is closest on
             | perf/watt rather than absolute performance on a space
             | heater.
        
         | password54321 wrote:
         | Just two 4090s? If you don't have at least 8 4090s do not even
         | call yourself a hobbyist.
        
         | numpad0 wrote:
         | Also the 38 TOPS figure is kind of odd. Intel had already shown
         | laptop CPUs with 45 TOPS NPU[1] though it hasn't shipped, and
         | Windows 12 is rumored to require 40 TOPS. If I'm doing math
         | right, (int)38 falls short of both.
         | 
         | 1: https://www.tomshardware.com/pc-components/cpus/intel-
         | says-l...
        
           | kllrnohj wrote:
           | The Snapdragon X Elite is 45 TOPS as well, so Apple's isn't
           | even the fastest NPU in an ARM-based SoC.
        
         | citizenpaul wrote:
         | This is standard Apple advertising: the best whatever in
         | the world that is the same as some standard thing with a
         | different name. Apple is like clothing makers that "vanity
         | size" their clothes. If you don't know, that basically means
         | a size 20 is really a 30, a size 21 is really a 31, and so
         | on.
         | 
         | Neural processing unit is basically a made-up term at this
         | point, so of course they can have the fastest in the world.
        
         | DeathArrow wrote:
         | Apple has always had the fastest, the biggest, the best. Or
         | at least they have to claim that to justify the price
         | premium.
         | 
         | Previous iPads had the best screens in a tablet even though
         | they weren't OLEDs. Now that they finally use OLEDs, they
         | have the best OLED screens and the best screens in a tablet.
        
       | ProfessorZoom wrote:
       | Apple Pencil Pro...
       | 
       | Apple Pencil Ultra next?
       | 
       | Apple Pencil Ultra+
       | 
       | Apple Pencil Pro Ultra XDR+
        
       | tsunamifury wrote:
       | It's really saying something about how the tech sector has
       | shifted due to the recent AI wave that Apple is announcing a
       | chipset entirely apart from a product.
       | 
       | This has never happened in this company's history, to my
       | knowledge. I could be wrong though; even the G3/G4s were
       | launched as PowerMacs.
        
         | pram wrote:
         | They've done it for every single M processor release.
        
         | wmf wrote:
         | The M4 was announced with the iPad Pro that uses it.
        
       | marinhero wrote:
       | I get frustrated seeing this go into the iPad and knowing that we
       | can't get a shell, and run our own binaries there. Not even as a
       | VM like [UserLAnd](https://userland.tech). I could effectively
       | travel with one device less in my backpack but instead I have to
       | carry two M chips, two displays, batteries, and so on...
       | 
       | It's great to see this tech moving forward but it's frustrating
       | to not see it translate into a more significant impact in the
       | ways we work, travel and develop software.
        
         | bschmidt1 wrote:
         | Think the play is "consumer AI". Would you really write code on
         | an iPad? And if you do, do you use an external keyboard?
        
           | e44858 wrote:
           | Tablets are the perfect form factor for coding because you
           | can easily mount them in an ergonomic position like this:
           | https://mgsloan.com/posts/comfortable-airplane-computing/
           | 
           | Most laptops have terrible keyboards so I'd be using an
           | external one either way.
        
             | bschmidt1 wrote:
             | Those keyboards are absolutely ridiculous, sorry.
        
           | marinhero wrote:
           | Yes. If I'm plugging it to a thunderbolt dock I'd expect it
           | to work like a MacBook Air
        
         | LeoPanthera wrote:
         | UTM can be built for iOS.
        
           | zamadatix wrote:
           | Hypervisor.framework is not exposed without a jailbreak which
           | makes this quite limited in terms of usability and
           | functionality.
        
           | xyst wrote:
           | Best you can hope for is CPU passthrough. Good luck using
           | the rest of the chip.
        
         | xyst wrote:
         | Yup - I'm honestly tired of the Apple ~~~jail~~~ ecosystem.
         | 
         | I love the lower power usage/high efficiency of ARM chips,
         | but the locked-down ecosystem is a drag.
         | 
         | Just the other day, I was trying to get GPU acceleration to
         | work within a VM on my M1 Mac. I think it's working? But
         | compared to native it's slow.
         | 
         | I think it's just a misconfig somewhere (i.e., hypervisor,
         | QEMU, UTM, or maybe the emulation service in the VM).
         | 
         | On other systems (Intel/AMD + Nvidia/Radeon) this is more or
         | less a "pass through", but on a Mac it's a different beast.
        
           | paulmd wrote:
           | gpu passthrough for VMs is not supported on apple silicon
           | period afaik. there may be some "native" renderer built on
           | top of metal but apple doesn't support SR-IOV or "headless
           | passthrough".
           | 
           | https://chariotsolutions.com/blog/post/apple-silicon-gpus-
           | do...
           | 
           | otoh no, it is not "more or less [automatic]" in other
           | hardware either, SR-IOV has been on the enthusiast wishlist
           | for a ridiculously long time now because basically nobody
           | implements it (or, they restrict it to the most datacenter-y
           | of products).
           | 
           | intel iGPUs from the HD/UHD Intel Graphics Technology era
           | have a concept called GVT-g which isn't quite SR-IOV but
           | generally does the thing. Newer Xe-based iGPUs do not support
           | this, nor do the discrete graphics cards.
           | 
           | AMD's iGPUs do not have anything at all afaik. Their dGPUs
           | don't even implement reset properly, which is becoming a big
           | problem with people trying to set up GPU clouds for AI stuff
           | - a lot of times the AMD machines will need a hard power
           | reset to come back.
           | 
           | NVIDIA GPUs do work properly, and do implement SR-IOV
           | properly... but they only started letting you do passthrough
           | recently, and only 1 VM instance per card (so, 1 real + 1
           | virtual).
           | 
           | Curious what you're using (I'm guessing intel iGPU or nvidia
           | dGPU) but generally this is still something that gets Wendell
           | Level1techs hot and bothered about the mere _possibility_ of
           | this feature being in something without a five-figure
           | subscription attached.
           | 
           | https://www.youtube.com/watch?v=tLK_i-TQ3kQ
           | 
           | It does suck that Apple refuses to implement vulkan support
           | (or sign graphics drivers), I think that's de-facto how
           | people interact with most "hardware accelerated graphics"
           | solutions in vmware or virtualbox, but SR-IOV is actually
           | quite a rare feature, and "passthrough" is not sufficient
           | here since the outer machine still needs to use the GPU as
           | well. The feature point is SR-IOV not just passthrough.
        
         | transpute wrote:
         | _> Not even as a VM_
         | 
         | WWDC is next month. There's still a chance of iPadOS 18
         | including a Hypervisor API for macOS/Linux VMs on M4 iPads.
        
           | monocularvision wrote:
           | I hope for this every single year. I just don't see it
           | happening. But I hope I am wrong.
        
             | transpute wrote:
             | 2022, https://appleinsider.com/articles/22/10/20/apple-
             | rumored-to-...
             | 
             |  _> A leaker has claimed that Apple is working on a version
             | of macOS exclusive for the M2 iPad Pro ... the exclusivity
             | to M2 iPad Pro could be a marketing push. If the feature is
             | only available on that iPad, more people would buy it._
             | 
             | Based on the M4 announcement, vMacOS could be exclusive
             | to the 1TB/2TB iPad Pro with 16GB RAM, which would be
             | helpful for VMs.
        
             | Kelteseth wrote:
             | At this point, you would have a better chance of running
             | your own apps by relocating to the EU ;)
        
         | ragazzina wrote:
         | > instead I have to carry two M chips
         | 
         | What's the incentive for Apple to unify them, since you've
         | already given them the money twice?
        
       | therealmarv wrote:
       | So why should I buy any Apple Laptop with M3 chip now (if I'm not
       | in hurry)? lol
        
         | MBCook wrote:
         | That's why a lot of people weren't expecting this and even
         | questioned Mark Gurman's article saying it would happen.
        
         | _ph_ wrote:
         | If you are not in a hurry, you should almost never buy new
         | hardware, as the next generation will be around the corner.
         | On the other hand, it could be up to 12 months until the M4
         | is available across the line. And for most tasks, an M3 is a
         | great value too. One might watch how many AI features that
         | would benefit from an M4 are presented at WWDC. But then,
         | the next macOS release won't be out before October.
        
           | therealmarv wrote:
           | The MacBook Airs with M3 were launched 2 months ago. 2
           | months is really not that long, even in the Apple
           | universe. For sure I'm waiting to see what happens at WWDC!
        
         | wiseowise wrote:
         | They've just released the 15-inch MacBook Air; a new one is
         | at least a year away.
        
       | czbond wrote:
       | Any idea when the M4 will be in a Mac Pro?
        
       | asow92 wrote:
       | Why are we running these high-end CPUs on tablets without the
       | ability to run pro apps like Xcode?
       | 
       | Until I can run Xcode on an iPad (not Swift Playgrounds), it's
       | a pass for me. Hear me out: I don't want to bring both an iPad
       | and a MacBook on trips, but I need Xcode. Because of this, I
       | have to pick the MacBook every time. I want an iPad, but the
       | iPad doesn't want me.
        
         | elpakal wrote:
         | "It's not you, it's me" - Xcode to the iPad
        
           | asow92 wrote:
           | In all seriousness, you're right. Sandboxing Xcode but making
           | it fully featured is surely a nightmare engineering problem
           | for Apple. However, I feel like some kind of containerized
           | macOS running in the app sandbox could be possible.
        
         | al_borland wrote:
         | WWDC is a month away. I'm hoping for some iPadOS updates to let
         | people actually take advantage of the power they put in these
         | tablets. Apple has often released new hardware before showing
         | off new OS features to take advantage of it.
         | 
         | I know people have been hoping for that for a long time, so I'm
         | not holding my breath.
        
           | asow92 wrote:
           | My guess is that WWDC will be more focused on AI this year,
           | but I will remain hopeful.
        
           | data-ottawa wrote:
           | I've been hoping for the same thing since I bought my M1 iPad
           | just before WWDC.
           | 
           | As much as I'd like the OLED screen and lighter weight, I
           | don't have any compelling argument to buy a new one.
        
         | smrtinsert wrote:
         | Yep, I also have no use for a touchscreen device of that
         | size. Happy to get an M4 MacBook Air or whatever it will be
         | called, but I'm done with pads.
        
         | kjkjadksj wrote:
         | Didn't you want to play a reskinned Bejeweled or Subway
         | Surfers with 8K textures?
        
         | Naomarik wrote:
         | Didn't have to look long to find a comment mirroring how I feel
         | about these devices. To me it feels like they're just adding
         | power to an artificially castrated device I can barely do
         | anything with. See no reason to upgrade from my original iPad
         | Pro that's not really useful for anything. Just an overpowered
         | device running phone software.
        
           | asow92 wrote:
           | I feel the same way. I just can't justify upgrading from
           | my 10.5" Pro from years ago. It's got ProMotion and runs
           | most apps fine. Sure, the battery isn't great after all
           | these years, but it's not like it's getting used long
           | enough to notice.
        
             | al_borland wrote:
             | Something has changed with how the iPads behave at rest.
             | When I got my first iPad in 2010 I could leave it for
             | weeks, pick it up, and it would hardly use any battery at
             | all. Today, it seems like my iPad mini will eat 10% or more
             | per day just sitting on a table untouched. I don't like
             | leaving it plugged in all the time, but with it being dead
             | every time I go to pick it up, I simply stop picking it up.
             | 
             | Even a good battery isn't that good. That seems to be a
             | software problem.
             | 
             | My only theory is that it's turning on the screen every
             | time it gets a notification. However, I have a case that
             | covers the screen, which should keep the screen off, in
             | my opinion. I have thought about disabling 100% of the
             | notifications, but without a global toggle that seems
             | pretty annoying to do.
        
               | easton wrote:
               | My guess is something to do with Find My/ offline
               | finding. That would cause it to wake up all the time,
               | maybe Apple thought it was worth the trade off.
        
               | transpute wrote:
               | _> Today, it seems like my iPad mini will eat 10% or more
               | per day just sitting on a table untouched._
               | 
               | That's abnormal.
               | 
               | If it's malware, do a clean reinstall from DFU mode using
               | Apple Configurator on a Mac.
        
               | al_borland wrote:
               | My last 2 or 3 iPads have been this way. I'd be surprised
               | if it was malware.
        
               | transpute wrote:
               | It's unusual. Do they lose battery even in airplane mode?
               | 
               | What does Settings > Battery > "Battery Usage by App"
               | show as the top consumers of power?
               | 
               | Does "Low Power Mode" make any difference?
        
               | al_borland wrote:
               | I'll have to play more with it for the other things. I
               | haven't invested much time in troubleshooting, since it
               | seemed like that's the way iPads just are now. Hopefully
               | that's not actually true.
               | 
               | When I looked at the top battery consumers in the past
               | there wasn't anything that stood out. I think home screen
               | was at the top. It wasn't one or two apps killing it with
               | background activity.
        
               | transpute wrote:
               | _> home screen was at the top_
               | 
               | Since the biggest battery consumption associated with
               | home screen is the display, and users are only briefly on
               | the home screen, before using it to navigate elsewhere,
               | home screen should be near the bottom (1%) of power
               | consumption.
        
           | pulse7 wrote:
           | "device I can barely do anything with" -> Apple can do
           | anything with iPads, but we - regular SW developers - are cut
           | off... :)
        
             | filleduchaos wrote:
             | Software developers are not the only professionals to
             | exist, and are far from being the market for tablets of all
             | things.
        
         | MuffinFlavored wrote:
         | > I don't want to bring both an iPad and Macbook on trips, but
         | I need ______
         | 
         | Why not just make the iPad run macOS and throw iPadOS into
         | the garbage?
        
           | asow92 wrote:
           | I like some UX aspects of iPadOS, but need the functionality
           | of macOS for work.
        
             | reddalo wrote:
             | iPadOS is still mainly a fork of iOS, a glorified mobile
             | interface. They should really switch to a proper macOS
             | system, now that the specs allow for it.
        
         | kylehotchkiss wrote:
         | Visual Studio Code running from remote servers seemed like
         | it was making great progress right until the AI trendiness
         | thing took over... and hasn't seemed to advance much since.
         | Hopefully the AI thing cools down and the efforts on remote
         | tooling/dev environments continue onwards.
        
           | deergomoo wrote:
           | If we're going down that route then what's the point in
           | putting good hardware in the device? It might as well just be
           | a thin client. Having the same SoCs as their laptops and
           | desktops but then relegating the iPad to something that needs
           | to be chained to a "real" computer to do anything useful in
           | the development space seems like a tremendous waste of
           | potential.
        
           | pquki4 wrote:
           | If we are talking about running from remote servers, my 2018
           | iPad Pro with A12Z (or whatever letter) can do that almost
           | just as well.
        
           | cromka wrote:
           | AI? You mean like copilot and stuff? That runs locally?
           | 
           | Otherwise why would it be an obstacle here?
        
         | timmg wrote:
         | Just wait until you buy an Apple Vision Pro...
         | 
         | [It's got the same restrictions as an iPad, but costs more
         | than a MacBook Pro.]
        
           | gpm wrote:
           | This is in fact the thing that stopped me from buying an
           | Apple Vision Pro.
        
         | deergomoo wrote:
         | I've been saying this for years, I would love to get a desktop
         | Mac and use an iPad for the occasional bit of portable
         | development I do away from a desk, like when I want to noodle
         | on an idea in front of the TV.
         | 
         | I'm very happy with my MacBook, but I don't like that the mega
         | expensive machine I want to keep for 5+ years needs to be tied
         | to a limited-life lithium battery that's costly and labour
         | intensive to replace, just so I can sometimes write code in
         | other rooms in my house. I know there's numerous remote options
         | but...the iPad is right there, just lemme use it!
        
           | pjot wrote:
           | I've had success using cloud dev environments with an iPad.
           | The key for me was also using a mouse and keyboard - after
           | that, things weren't _that_ different.
        
           | mtoner23 wrote:
           | Get a MacBook then? They are similarly priced to an iPad
           | Pro.
        
           | lll-o-lll wrote:
           | Is there no RDP equivalent for the Mac? Just RDP into your
           | main workstation?
        
             | mycall wrote:
             | https://support.apple.com/en-il/guide/mac-help/mh11848/mac
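             | 
             | (macOS's built-in Screen Sharing is a VNC server, so
             | once it's enabled in the Mac's Sharing settings, any
             | VNC client on the iPad can connect to it.)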
        
           | moistoreos wrote:
           | I've been giving some thought to this. I wonder if an iPad
           | would suffice in front of the TV, just SSHing into a Mac
           | Mini for dev work. I'd love an iPad but I can't justify it
           | either, because of the limitation of hardware capabilities.
           | I also don't really want to purchase two machines just for
           | dev tasks and travel. But I think having that kind of
           | lifestyle will be expensive no matter the approach.
        
         | w1nst0nsm1th wrote:
         | I love and hate Apple like almost everyone else and have an
         | iPad for consumption only (reading, browsing, video), but on
         | Android you have IDEs for game dev (Godot), IDEs for real
         | Android apps (through F-Droid), and Python, Java, and C/C++
         | IDEs (through the Play Store), which are close enough to
         | the Linux way...
         | 
         | So iPad devices could handle that too, if Apple allowed
         | it...
         | 
         | Once Apple implements the European Union requirement to
         | allow 'sideloading' on iPad, maybe we will be able to have
         | nice things on it as well.
         | 
         | That could also be a good thing for Apple itself. A lot of
         | people in Europe have a bad opinion of Apple (partly?)
         | because of the closed (walled) garden of iPad/iOS and other
         | technology/IP, which set their portable devices apart from
         | the Android ecosystem.
        
         | paulcole wrote:
         | As hard as it might be to believe, software developers are not
         | the "pros" Apple is choosing to appeal to with the iPad Pro.
         | 
         | Other jobs exist!
         | 
         | People with disposable income who just want to buy the
         | nicest/most expensive thing exist!
        
         | mcfedr wrote:
         | I got an iPad a couple of years ago and was really
         | disappointed by how limited it felt in what I could do. Not
         | to mention the awful App Store.
        
         | codercotton wrote:
         | macOS should be an iPadOS app. The hardware is ready! Have a
         | folder mapped into iPadOS Files. Not sure much else is needed.
        
         | 0x38B wrote:
         | "...but the iPad doesn't want me" is exactly it; I used iPad
         | from the very first one - that chunky, hard-edged aluminum and
         | glass slate, and remained a heavy user up until a few years
         | ago. For half a decade, the iPad was my only computer on the
         | go. I spent two years abroad with a 12.9" Pro.
         | 
         | The conclusion I came to was that I loved the hardware but
         | found the software a huge letdown for doing real work; I tried
         | SSHing into VPSs and the like, but that wasn't enough.
         | 
         | But man, the power in these thin, elegant devices is huge, and
         | greater with the M4 chips. If Asahi ran on the M4 iPads I'd
         | probably give it a go! - in an alternate dream universe, that
         | is...
        
       | eterevsky wrote:
       | They are talking about iPad Pro as the primary example of M4
       | devices. But iPads don't really seem to be limited by
       | performance. Nobody I know compiles Chrome or does 3D renders on
       | an iPad.
        
         | SkyPuncher wrote:
         | It's all marketing toward people who aspire to be these
         | creative types. Very few people actually need it, but it
         | feels good when the iPad Air is missing a few key features
         | that push you to the Pro.
         | 
         | More practically, it should help with battery life. My
         | understanding is that energy usage scales non-linearly with
         | demand: a more powerful chip running at 10% may be more
         | battery efficient than a less powerful chip running at 20%.
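         | 
         | The usual first-order model (ignoring leakage and
         | race-to-sleep effects): dynamic power ~ C x f x V^2, and
         | voltage has to rise roughly with frequency, so power grows
         | roughly with the cube of clock speed:
         | 
         |     def rel_power(f):  # f = clock relative to nominal
         |         v = f          # crude assumption: V scales with f
         |         return f * v**2
         | 
         |     print(rel_power(0.5))  # 0.125x power at half the speed
         | 
         | That's why a big chip loafing along can beat a small chip
         | running flat out on battery life.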
        
       | FredPret wrote:
       | Why does a tablet have a camera bump!? Just take out the camera.
       | And let me run VSCode and a terminal.
        
         | kylehotchkiss wrote:
         | The new document scanning functionality the camera bump helps
         | enable is really nice. That previously required third party
         | apps, which started out great (Swiftscan) then got greedy and
         | turned into monthly subscriptions. I will happily enjoy Apple
         | erasing entire categories of simple apps that turned one time
         | purchases into monthly subscriptions.
        
       | visarga wrote:
       | LLaMA 3 tokens/second please, that's what we care about.
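       | 
       | Rough ceiling: single-stream decode is memory-bandwidth
       | bound, so tokens/s is at most bandwidth divided by the bytes
       | of weights read per token (taking the quoted ~120GB/s for M4
       | and ignoring the KV cache):
       | 
       |     def tok_per_s(bw_gbs, params_b, bytes_per_weight):
       |         return bw_gbs / (params_b * bytes_per_weight)
       | 
       |     # Llama 3 8B at 4-bit (~0.5 bytes/weight):
       |     print(tok_per_s(120, 8, 0.5))  # ~30 tok/s upper bound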
        
         | bschmidt1 wrote:
         | Hahah yes
        
         | mlboss wrote:
         | Only spec that I care about
        
       | amai wrote:
       | 16GB RAM ought to be enough for anybody! (Tim Cook)
        
       | bmurphy1976 wrote:
       | I love these advances and I really want a new iPad but I can't
       | stand the 10"+ form factor. When will the iPad Mini get a
       | substantial update?
        
       | radicaldreamer wrote:
       | Unfortunate that they got rid of the SIM card slot, Google Fi
       | only supports physical sims for their "data only" sim feature.
        
       | Dowwie wrote:
       | Can anyone explain where the media engine resides and runs?
        
         | wmf wrote:
         | The whole iPad is basically one chip so... the media engine is
         | in the M4. AFAIK it's a top-level core not part of the GPU but
         | Marcan could correct me.
        
       | nojvek wrote:
       | I am awaiting the day when a trillion transistors will be put on
       | a mobile device chewing 5W of peak power.
       | 
       | It's going to be a radical future.
        
       | vivzkestrel wrote:
       | Any benchmarks of how it stacks up to the M1, M2, and M3?
        
       | TheRealGL wrote:
       | Who wrote this? "A fourth of the power", what happened to a
       | quarter of the power?
        
       | lenerdenator wrote:
       | So long as it lets me play some of the less-intense 00's-10's era
       | PC games in some sort of virtualization framework at decent
       | framerates one day, and delivers great battery life as a backend
       | web dev workstation-on-the-go the next, it's a good chip. The M2
       | Pro does.
        
       | rsp1984 wrote:
       | _Together with next-generation ML accelerators in the CPU, the
       | high-performance GPU, and higher-bandwidth unified memory, the
       | Neural Engine makes M4 an outrageously powerful chip for AI._
       | 
       | In case it is not abundantly clear by now: Apple's AI strategy is
       | to put inference (and longer term even learning) on edge devices.
       | This is completely coherent with their privacy-first strategy
       | (which would be at odds with sending data up to the cloud for
       | processing).
       | 
       | Processing data at the edge also makes for the best possible user
       | experience because of the complete independence of network
       | connectivity and hence minimal latency.
       | 
       | If (and that's a big if) they keep their APIs open to run any
       | kind of AI workload on their chips it's a strategy that I
       | personally really really welcome as I don't want the AI future to
       | be centralised in the hands of a few powerful cloud providers.
        
         | krunck wrote:
         | Yes, that would be great. But without the ability for us to
         | verify this, who's to say they won't use the edge resources
         | (your computer and electricity) to process data (your data)
         | and then send the results to their data center? It would
         | certainly save them a lot of money.
        
           | astrange wrote:
           | You seem to be describing face recognition in Photos like
           | it's a conspiracy against you. You'd prefer the data center
           | servers looking at your data?
        
           | IggleSniggle wrote:
           | When you can do all inference at the edge, you can keep it
           | disconnected from the network if you don't trust the data
           | handling.
           | 
           | I happen to think they wouldn't, simply because sending this
           | data back to Apple in any form that they could digest it is
           | not aligned with their current privacy-first strategies. But
           | if they make a device that still works if it stays
           | disconnected, the neat thing is that you can just...keep it
           | disconnected. You don't have to trust them.
        
             | chem83 wrote:
             | Except that's an unreasonable scenario for a smartphone.
             | It doesn't prove that the minute the user goes online it
             | won't be egressing data, willingly or not.
        
               | IggleSniggle wrote:
                | I don't disagree, although when I composed my comment I
                | had desktop/laptop in mind, as I think genuinely useful
                | on-device smartphone AI is a ways off yet, and who knows
                | what company Apple will be by then.
        
             | bee_rider wrote:
             | To use a proprietary system and not trust the vendor, you
             | have to _never_ connect it. That's possible of course, but
             | it seems pretty limiting, right?
        
           | chem83 wrote:
            | +1 The idea that "it's on device, hence it's privacy-
            | preserving" is Apple's marketing machine speaking, and that
            | doesn't fly anymore. They have to do better to convince any
           | security and privacy expert worth their salt that their
           | claims and guarantees can be independently verified on behalf
           | of iOS users.
           | 
           | Google did some of that on Android, which means open-sourcing
           | their on-device TEE implementation, publishing a paper about
           | it etc.
        
           | robbomacrae wrote:
            | They already do this. It's called federated learning and it's
            | a way for them to use your data to help personalize the model
            | for you and also (to a much lesser extent) the global model
            | for everyone whilst still respecting your data privacy. It's
            | not to save money, it's so they can keep your data private on
            | device and still use ML.
           | 
           | https://www.technologyreview.com/2019/12/11/131629/apple-
           | ai-...
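            | 
            | To make the idea concrete, here is a toy sketch of federated
            | averaging (illustrative only, not Apple's actual pipeline):
            | each device takes a gradient step on its own data, and only
            | the resulting weights leave the device for averaging.
            | 
            |     import numpy as np
            | 
            |     rng = np.random.default_rng(0)
            | 
            |     def local_update(w, X, y, lr=0.1):
            |         # one gradient step on this device's own
            |         # data; the raw data never leaves the device
            |         grad = X.T @ (X @ w - y) / len(y)
            |         return w - lr * grad
            | 
            |     # five "devices", each holding private data
            |     devices = [(rng.normal(size=(20, 3)),
            |                 rng.normal(size=20)) for _ in range(5)]
            |     w_global = np.zeros(3)
            | 
            |     for _ in range(10):
            |         # each round: devices train locally, then the
            |         # server averages the returned weights
            |         w_locals = [local_update(w_global.copy(), X, y)
            |                     for X, y in devices]
            |         w_global = np.mean(w_locals, axis=0)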
        
           | victorbjorklund wrote:
           | If you trust that Apple doesn't film you with the camera when
           | you use the phone while sitting on the toilet. Why wouldn't
           | you trust Apple now?
           | 
            | It would have to be a huge conspiracy with all of Apple's
            | employees. And you can easily just listen to the network and
           | see if they do it or not.
        
             | xanderlewis wrote:
             | I find it somewhat hard to believe that wouldn't be in
             | contravention of some law or other. Or am I wrong?
             | 
             | Of course we can then worry that companies are breaking the
             | law, but you have to draw the line somewhere... and what
             | have they to gain anyway?
        
         | joelthelion wrote:
          | > In case it is not abundantly clear by now: Apple's AI
          | strategy is to put inference (and longer term even learning)
         | 
          | I'm curious: is anyone seriously using Apple hardware to train
          | AI models at the moment? Obviously not the big players, but I
          | imagine it might be a viable option for AI engineers in
          | smaller, less ambitious companies.
        
           | andrewmcwatters wrote:
           | Yes, it can be more cost effective for smaller businesses to
           | do all their work on Mac Studios, versus having a dedicated
           | Nvidia rig plus Apple or Linux hardware for your workstation.
           | 
           | Honestly, you can train basic models just fine on M-Series
           | Max MacBook Pros.
        
             | nightski wrote:
             | A decked out Mac Studio is like $7k for far less GPU power.
             | I find that highly unlikely.
        
               | inciampati wrote:
               | But you get access to a very large amount of RAM for that
               | price.
        
               | softfalcon wrote:
               | Don't attack me, I'm not disagreeing with you that an
               | nVidia GPU is far superior at that price point.
               | 
               | I simply want to point out that these folks don't really
               | care about that. They want a Mac for more reasons than
               | "performance per watt/dollar" and if it's "good enough",
               | they'll pay that Apple tax.
               | 
               | Yes, yes, I know, it's frustrating and they could get
               | better Linux + GPU goodness with an nVidia PC running
               | Ubuntu/Arch/Debian, but macOS is painless for the average
               | science AI/ML training person to set up and work with.
               | There are also known enterprise OS management solutions
               | that business folks will happily sign off on.
               | 
               | Also, $7000 is chump change in the land of "can I get
               | this AI/ML dev to just get to work on my GPT model I'm
               | using to convince some VC's to give me $25-500 million?"
               | 
                | tldr; they're gonna buy a Mac cause it's a Mac and they
                | want a Mac and their business uses Macs. No amount of
               | "but my nVidia GPU = better" is ever going to convince
               | them otherwise as long as there is a "sort of" reasonable
               | price point inside Apple's ecosystem.
        
               | brookst wrote:
               | What Linux setup do you recommend for 128GB of GPU
               | memory?
        
               | TylerE wrote:
               | A non-decked out Mac Studio is a hell of a machine for
               | $1999.
               | 
               | Do you also compare cars by looking at only the super
               | expensive limited editions, with every single option box
               | ticked?
               | 
                | I'd also point out that said 3-year-old $1999 Mac Studio
                | that I'm typing this on already runs ML models usefully,
                | at maybe 40-50% of the speed of the old 3000-series
                | Nvidia machine it replaces, while using literally less
                | than 10% of the power and making a tiny tiny fraction of
                | the noise.
               | 
               | Oh, and it was cheaper. And not running Windows.
        
               | bee_rider wrote:
               | They are talking about training models, though. Run is a
               | bit ambiguous, is that also what you mean?
        
               | TylerE wrote:
               | No.
               | 
               | For training the Macs do have some interesting advantages
                | due to the unified memory. The GPU cores have access to
                | all of system RAM (and also the system RAM is
                | _ridiculously_ fast - 400GB/sec when DDR4 is barely
                | 30GB/sec - which has a lot of little fringe benefits of
                | its own, part of why the Studio feels like an even more
                | powerful machine than it actually is. It's just super
                | snappy and responsive, even under heavy load.)
               | 
                | The largest consumer NVidia card has 22GB of usable RAM.
               | 
               | The $1999 Mac has 32GB, and for $400 more you get 64GB.
               | 
                | $3200 gets you 96GB, and more GPU cores. You can hit the
                | system max of 192GB for $5500 on an Ultra, albeit with
                | the lesser GPU.
               | 
               | Even the recently announced 6000-series AI-oriented
               | NVidia cards max out at 48GB.
               | 
               | My understanding is a that a lot of enthusiasts are using
               | Macs for training because for certain things having more
               | RAM is just enabling.
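                | 
                | As a minimal PyTorch sketch of what that unified memory
                | buys you (sizes are purely illustrative; scale them to
                | your machine), the "mps" device holds tensors sized
                | against system RAM rather than a fixed pool of VRAM:
                | 
                |     import torch
                | 
                |     assert torch.backends.mps.is_available()
                |     dev = torch.device("mps")
                | 
                |     # ~25GB of float32 -- bigger than any consumer
                |     # card's VRAM, fine on a 64GB Mac
                |     x = torch.randn(80_000, 80_000, device=dev)
                |     y = (x @ x[:, :128]).sum()
                |     torch.mps.synchronize()  # wait for the GPU
                |     print(y.item())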
        
               | Der_Einzige wrote:
                | The huge number of optimizations available on Nvidia and
                | _not_ available on Apple makes the reduced VRAM worth it,
                | because even the most bloated of foundation models will
                | have some magical 0.1-bit quantization technique invented
                | by a turbo-nerd which only works on Nvidia.
               | 
                | I keep hearing this meme of Macs being a big deal in LLM
               | training, but I have seen zero evidence of it, and I am
               | deeply immersed in the world of LLM training, including
               | training from scratch.
               | 
               | Stop trying to meme apple M chips as AI accelerators.
               | I'll believe it when unsloth starts to support a single
               | non-nvidia chip.
        
               | MBCook wrote:
                | If you work for a company willing to shell out, sure,
                | there are better options.
               | 
               | But for individual developers it's an interesting
               | proposition.
               | 
               | And a bigger question is: what if you already have (or
               | were going to buy) a Mac? You prefer them or maybe are
               | developing for Apple platforms.
               | 
               | Upping the chip or memory could easily be cheaper than
               | getting a PC rig that's faster for training. That may be
               | worth it to you.
               | 
               | Not everyone is starting from zero or wants the fastest
               | possible performance money can buy ignoring all other
               | factors.
        
               | arvinsim wrote:
                | Agreed. Although inference is good enough on the Mac,
                | there is no way I am training on one at all.
                | 
                | It's just more efficient to offload training to cloud
                | Nvidia GPUs.
        
               | singhrac wrote:
               | Yeah, and I think people forget all the time that
               | inference (usually batch_size=1) is memory bandwidth
               | bound, but training (usually batch_size=large) is usually
               | compute bound. And people use enormous batch sizes for
               | training.
               | 
               | And while the Mac Studio has a lot of memory bandwidth
               | compared to most desktops CPUs, it isn't comparable to
               | consumer GPUs (the 3090 has a bandwidth of ~936GBps) let
               | alone those with HBM.
               | 
               | I really don't hear about anyone training on anything
               | besides NVIDIA GPUs. There are too many useful features
               | like mixed-precision training, and don't even get me
               | started on software issues.
        
               | andrewmcwatters wrote:
               | Not all of us who own small businesses are out here
               | speccing AMD Ryzen 9s and RTX 4090s for workstations.
               | 
               | You can't lug around a desktop workstation.
        
             | skohan wrote:
             | > a dedicated Nvidia rig
             | 
              | I am honestly shocked Nvidia has been allowed to maintain
              | their moat with CUDA. It seems like AMD would have a ton to
              | gain just spending a couple million a year to implement all
              | the relevant ML libraries with a non-CUDA back-end.
        
               | bee_rider wrote:
               | AMD doesn't really seem inclined toward building
               | developer ecosystems in general.
               | 
               | Intel seems like they could have some interesting stuff
               | in the annoyingly named "OneAPI" suite but I ran it on my
               | iGPU so I have no idea if it is actually good. It was
               | easy to use, though!
        
               | jimmySixDOF wrote:
               | There are quite a few back and forth X/Twitter storms in
               | teacups between George Hotz / tinygrad and the AMD
               | management about opening up the firmware for custom ML
                | integrations to replace CUDA, but last I checked they
                | were running into walls.
        
               | skohan wrote:
               | I don't understand why you would need custom firmware. It
               | seems like you could go a long way just implementing
               | back-ends for popular ML libraries in openCL / compute
               | shaders
        
             | whimsicalism wrote:
             | smaller businesses have no business having a dedicated GPU
             | rig of any kind
        
           | Q6T46nT668w6i3m wrote:
            | Yes, there are a handful of apps that use the neural engine
            | to fine-tune models to their data.
        
           | alfalfasprout wrote:
           | Not really (I work on AI/ML Infrastructure at a well known
           | tech company and talk regularly w/ our peer companies).
           | 
           | That said, inference on apple products is a different story.
           | There's definitely interest in inference on the edge. So far
           | though, nearly everyone is still opting for inference in the
            | cloud for three reasons:
            | 
            | 1. There's a lot of extra work involved in getting ML/AI
            | models ready for mobile inference. And this work is
            | different for iOS vs. Android.
            | 
            | 2. You're limited on which exact device models will run the
            | thing optimally. Most of your customers won't necessarily
            | have that. So you need some kind of fallback.
            | 
            | 3. You're limited on what kind of models you can actually
            | run. You have way more flexibility running inference in the
            | cloud.
        
             | teaearlgraycold wrote:
              | PyTorch actually has surprisingly good support for Apple
              | Silicon. Occasionally an operation needs to use the CPU
              | fallback, but many applications are able to keep inference
              | entirely off the CPU cores.
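              | 
              | A minimal sketch of that setup: with the fallback flag
              | set, anything the MPS backend can't handle quietly runs on
              | the CPU instead of raising an error.
              | 
              |     import os
              |     # must be set before torch is imported
              |     os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"
              | 
              |     import torch
              | 
              |     dev = ("mps" if torch.backends.mps.is_available()
              |            else "cpu")
              |     model = torch.nn.Linear(512, 10).to(dev).eval()
              |     with torch.no_grad():
              |         out = model(torch.randn(1, 512, device=dev))
              |     print(out.shape)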
        
               | rcarmo wrote:
               | And there is a lot of work being done with mlx.
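                | 
                | For anyone curious, a taste of mlx (arrays live in
                | unified memory, and ops are lazy until evaluated):
                | 
                |     import mlx.core as mx
                | 
                |     a = mx.random.normal((4096, 4096))
                |     b = mx.random.normal((4096, 4096))
                |     c = a @ b   # builds a lazy graph
                |     mx.eval(c)  # runs it on the GPU
                |     print(c.shape)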
        
               | ein0p wrote:
               | I've found it to be pretty terrible compared to CUDA,
               | especially with Huggingface transformers. There's no
               | technical reason why it has to be terrible there though.
               | Apple should fix that.
        
               | teaearlgraycold wrote:
               | Yeah. It's good with YOLO and Dino though. My M2 Max can
               | compute Dino embeddings faster than a T4 (which is the
               | GPU in AWS's g4dn instance type).
        
               | ein0p wrote:
               | MLX will probably be even faster than that, if the model
               | is already ported. Faster startup time too. That's my
               | main pet peeve though: there's no technical reason why
                | PyTorch couldn't be just as good. It's just underfunding
                | and neglect.
        
               | whimsicalism wrote:
                | T4s are like 6 years old.
        
             | throwitaway222 wrote:
             | Inference on the edge is a lot like JS - just drop a crap
             | ton of data to the front end, and let it render.
        
             | gopher_space wrote:
             | A cloud solution I looked at a few years ago could be
             | replicated (poorly) in your browser today. In my mind the
             | question has become one of determining _when_ my model is
             | useful enough to detach from the cloud, not whether that
             | should happen.
        
             | ethbr1 wrote:
             | Power for power, any thoughts on what mobile inference
             | looks like vs doing it in the cloud?
        
               | alfalfasprout wrote:
               | Mobile can be more efficient. But you're making big
               | tradeoffs. You are very limited in what you can actually
               | run on-device. And ultimately you're also screwing over
               | your user's battery life, etc.
        
           | deanishe wrote:
           | Isn't Apple hardware too expensive to make that worthwhile?
        
             | brookst wrote:
             | For business-scale model work, sure.
             | 
             | But you can get an M2 Ultra with 192GB of UMA for $6k or
             | so. It's very hard to get that much GPU memory at all, let
             | alone at that price. Of course the GPU processing power is
              | anemic compared to a DGX Station A100 cluster, but the Mac
              | is $143,000 less.
        
             | MBCook wrote:
             | You want to buy a bunch of new equipment to do training?
              | Yeah, Macs aren't going to make sense.
             | 
             | You want your developers to be able to do training locally
             | and they already use Macs? Maybe an upgrade would make
             | business sense. Even if you have beefy servers or the cloud
             | for large jobs.
        
           | cafed00d wrote:
            | I like to think back to 2011 and paraphrase what people were
            | saying: "Is anyone seriously using GPU hardware to write NL
            | translation software at the moment?"
            | 
            | "No, we should use cheap, commodity, abundantly available
            | CPUs and orchestrate them behind cloud magic to write our NL
            | translation apps"
            | 
            | or maybe "no, we should build purpose-built high-performance
            | computing hardware to write our NL translation apps"
            | 
            | Or perhaps in the early 70s "is anyone seriously considering
            | personal computer hardware to ...". "no, we should just buy
            | IBM mainframes ..."
            | 
            | I don't know. I'm probably super biased. I like the idea of
            | all this training work breaking the shackles of
            | cloud/mainframe/servers/off-end-user-device and migrating to
            | run on people's devices. It feels "democratic".
        
             | seanmcdirmid wrote:
             | I remember having lunch with a speech recognition
             | researcher who was using GPUs to train DNNs to do speech
             | recognition in 2011. It really was thought of as niche back
             | then. But the writing was on the wall I guess in the
             | results they were getting.
        
               | diego_sandoval wrote:
               | AMD didn't read the wall, unfortunately.
        
             | fennecfoxy wrote:
              | I don't think the examples really apply, because it's more
              | a question of being on the "cutting edge" vs personal
              | hardware.
             | 
             | For example, running a local model and access to the
             | features of a larger more capable/cloud model are two
             | completely different features therefore there is no "no we
             | should do x instead".
             | 
              | I'd imagine that a dumber local model runs and defers to a
              | cloud model when it needs to/if the user has allowed it to
              | go to the cloud. Apple could not compete on "our models run
              | locally, privacy is a bankable feature" alone imo; the
              | TikTok install base has shown us that users prefer
              | content/features over privacy, so they'll definitely still
              | need SotA cloud-based models to compete.
        
           | dylan604 wrote:
           | Does one need to train an AI model on specific hardware, or
           | can a model be trained in one place and then used somewhere
           | else? Seems like Apple could just run their fine tuned model
           | called Siri on each device. Seems to me like asking for
           | training on Apple devices is missing the strategy. Unless of
           | course, it's just for purely scientific $reasons like "why
           | install Doom on the toaster?" vs doing it for a purpose.
        
             | xanderlewis wrote:
             | It doesn't _require_ specific hardware; you can train a
             | neural net with pencil and paper if you have enough time.
             | Of course, some pieces of hardware are more efficient than
             | others for this.
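              | 
              | For instance, every step of this toy "training run" is
              | arithmetic you could do on paper:
              | 
              |     # one neuron learning y = 2x by gradient descent
              |     w, lr = 0.0, 0.1
              |     data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
              | 
              |     for _ in range(50):
              |         for x, y in data:
              |             err = w * x - y    # prediction error
              |             w -= lr * err * x  # d(err^2/2)/dw
              |     print(w)  # converges to ~2.0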
        
           | robbomacrae wrote:
           | I don't think this is what you meant but it matches the spec:
           | federated learning is being used by Apple to train models for
            | various applications, and some of that happens on device
            | (iPhones/iPads) with your personal data before it's hashed
            | and sent up to the mothership model anonymously.
           | 
           | https://www.technologyreview.com/2019/12/11/131629/apple-
           | ai-...
        
           | avianlyric wrote:
            | Apple are. Their "Personal Voice" feature fine-tunes a voice
            | model on device using recordings of your own voice [1].
            | 
            | An older example is the "Hey Siri" model, which is fine-tuned
            | to your specific voice.
           | 
           | But with regards to on device training, I don't think anyone
           | is seriously looking at training a model from scratch on
           | device, that doesn't make much sense. But taking models and
           | fine tuning them to specific users makes a whole ton of
           | sense, and an obvious approach to producing "personal" AI
           | assistants.
           | 
           | [1] https://support.apple.com/en-us/104993
        
             | MBCook wrote:
             | They already do some "simple" training on device. The
             | example I can think of is photo recognition in the photo
              | library. It likely builds on something else, but being able
              | to identify which face is your grandma versus your
              | neighbor is not done in Apple's cloud. It's done when your
             | devices are idle and plugged into power.
             | 
              | A few years ago it wasn't shared between devices, so each
              | device had to do it itself. I don't know if it's shared
              | at this point.
              | 
              | I agree you're not going to be training an LLM or anything.
              | But smaller tasks limited in scope may prove a good fit.
        
         | legitster wrote:
         | > This is completely coherent with their privacy-first strategy
         | (which would be at odds with sending data up to the cloud for
         | processing).
         | 
         | I feel like people are being a bit naive here. Apple's "Privacy
         | First" strategy was a _marketing_ spin developed in response to
          | being dead-last in web development/cloud computing/smart
         | features.
         | 
         | Apple has had no problem changing their standards by 180
         | degrees and being blatantly anti-consumer whenever they have a
         | competitive advantage to do so.
        
           | seec wrote:
            | Don't bother; the fanboys believe Apple can't do anything
            | wrong/malicious. At this point it's closer to a religion than
            | ever.
            | 
            | You would be amazed at the response of some of them when I
            | point out some shit Apple does that makes their products
            | clearly lacking for the price; the cognitive dissonance is so
            | strong they don't know how to react in any other way than
            | lying or pretending it doesn't matter.
        
             | acdha wrote:
             | If you're annoyed about quasi-religious behavior, consider
             | that your comment has nothing quantifiable and contributed
             | nothing to this thread other than letting us know that you
             | don't like Apple products for non-specific reasons. Maybe
             | you could try to model the better behavior you want to see?
        
             | n9 wrote:
             | Your comment is literally more subjective, dismissive, and
              | full of FUD than any other on this thread. Check
             | yourself.
        
           | IggleSniggle wrote:
           | Of course! The difference is that, for the time being, my
           | incentives are aligned with theirs in regards to preserving
           | my privacy.
           | 
           | The future is always fungible. Anyone can break whatever
           | trust they've built _very_ quickly. But, like the post you
           | are replying to, I have no qualms about supporting companies
            | that are currently doing things in my interest and don't
           | have any clear strategic incentive to violate that trust.
           | 
           | Edit: that same incentive structure would apply to NVIDIA,
           | afaik
        
             | jajko wrote:
              | I can't agree with your comment. Apple has all the
              | incentives to monetize your data; that's the whole value of
              | Google and Meta. And they are already heading into the ad
              | business, earning billions last I checked. Hardware ain't
              | selling as much as before, and this isn't going to change
              | for the better in the foreseeable future.
              | 
              | The logic is exactly the same as, e.g., Meta's claims - we
              | will pseudoanonymize your data, so technically your
              | specific privacy is just yours, see, nothing changed. But
              | you are in various target groups for ads, plus we know how
              | 'good' those anon efforts are when money is at play and
              | corporations are only there to earn as much money as
              | possible. The rest is PR.
        
               | IggleSniggle wrote:
               | Persuasive, thank you
        
               | legitster wrote:
               | I'll disagree with your disagreement - in part at least.
               | Apple is still bigger than Meta or Google. Even if they
               | had a strong channel to serve ads or otherwise monetize
               | data, the return would represent pennies on the dollar.
               | 
               | And Apple's privacy stance is a _moat_ against these
               | other companies making money off of their customer base.
               | So for the cost of pennies on the dollar, they protect
                | their customer base and ward off competition. That's a
               | pretty strong incentive.
        
           | robbomacrae wrote:
           | Having worked at Apple I can assure you it's not just spin.
           | It's nigh on impossible to get permission to even compare
           | your data with another service inside of Apple and even if
           | you do get permission the user ids and everything are
            | completely different so there's no way to match up users.
            | Honestly it's kind of ridiculous the lengths they go to, and
            | it makes development an absolute PITA.
        
             | briandear wrote:
             | As an Apple alum, I can agree with everything you've said.
        
             | legitster wrote:
             | That could very well be true, but I also think it could
             | change faster than people realize. Or that Apple has the
             | ability to compartmentalize (kind of like how Apple can
             | advocate for USB C adoption in some areas and fight it in
             | others).
             | 
             | I'm not saying this to trash Apple - I think it's true of
             | any corporation. If Apple starts losing revenue in 5 years
             | because their LLM isn't good enough because they don't have
             | enough data, they are still going to take it and have some
             | reason justifying why _theirs_ is privacy focused and
             | everyone else is not.
        
         | croes wrote:
         | It isn't privacy if Apple knows.
         | 
         | They are the gatekeeper of your data for their benefit not
         | yours.
        
           | jajko wrote:
            | Yes, in the end it's just some data representing the user's
            | trained model. Is there a contractual agreement with users
            | that Apple will never ever transfer a single byte of it, with
            | huge penalties otherwise? If not, it's a pinky PR promise
            | that sounds nice.
        
             | threeseed wrote:
             | Apple publicly documents their privacy and security
             | practices.
             | 
             | At minimum, laws around the world prevent companies from
             | knowingly communicating false information to consumers.
             | 
             | And in many countries the rules around privacy are much
             | more stringent.
        
               | croes wrote:
               | I bet Boeing also has documentation about their security
               | practices.
               | 
               | Talk is cheap and in Apple's case it's part of their PR.
        
               | bamboozled wrote:
               | What is wrong with Boeing's security?
        
               | dudeinjapan wrote:
               | > What is wrong with Boeing's security?
               | 
               | Too many holes.
        
               | threeseed wrote:
               | But what does that have to do with the price of milk in
                | Turkmenistan?
               | 
               | Because Boeing's issues have nothing to do with privacy
                | or security, and since they are not consumer facing they
                | have no relevance to what we are talking about.
        
         | dheera wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | Apple has never been privacy-first in practice. They give you
         | the illusion of privacy but in reality it's a closed-source
         | system and you are forced to trust Apple with your data.
         | 
         | They also make it a LOT harder than Android to execute your own
         | MITM proxies to inspect what exact data is being sent about you
         | by all of your apps including the OS itself.
        
           | deadmutex wrote:
           | Yeah, given that they resisted putting RCS in iMessage so
           | long, I am a bit skeptical about the whole privacy narrative.
           | Especially when Apple's profit is at odds with user privacy.
        
             | notaustinpowers wrote:
             | From my understanding, the reason RCS was delayed is
             | because Google's RCS was E2EE only in certain cases (both
             | users using RCS). But also because Google's RCS runs
             | through Google servers.
             | 
              | If Apple had enabled RCS in Messages back then, but the
              | recipient was not using RCS, then Google would have had the
              | decrypted text message, even though RCS advertises itself
              | as E2EE. With iMessage, at least I know all of my messages
              | are E2EE when I see a blue bubble.
             | 
             | Even now, RCS is available on Android if using Google
             | Messages. Yes, it's pre-installed on all phones, but OEMs
             | aren't required to use it as the default. It opens up more
             | privacy concerns because now I don't know if my messages
             | are secure. At least with the green bubbles, I can assume
             | that anything I send is not encrypted. With RCS, I can't be
             | certain unless I verify the messaging app the recipient is
             | using and hope they don't replace it with something else
             | that doesn't support RCS.
        
               | vel0city wrote:
               | You know what would really help Apple customers increase
               | their privacy when communicating with non-Apple devices?
               | 
               | Having iMessage available to everyone regardless of their
               | mobile OS.
        
               | notaustinpowers wrote:
               | Agreed. While I have concerns regarding RCS, Apple's
               | refusal to make iMessage an open platform due to customer
               | lock-in is ridiculous and anti-competitive.
        
               | jodrellblank wrote:
               | > " _due to customer lock-in_ "
               | 
               | Their words or your words?
        
               | int_19h wrote:
               | "moving iMessage to Android will hurt us more than help
               | us."
        
             | fabrice_d wrote:
             | How is RCS a win on the privacy front? It's not even e2e
              | encrypted in an interoperable way (Google's implementation
              | is proprietary).
        
             | acdha wrote:
             | RCS is a net loss for privacy: it gives the carriers
             | visibility into your social graph and doesn't support end
             | to end encryption. Google's PR campaign tried to give the
             | impression that RCS supports E2EE but it's restricted to
             | their proprietary client.
        
               | kotaKat wrote:
               | On top of that, rooted devices are denied access to it,
               | which means Google is now gatekeeping a "carrier" service
                | even further.
        
               | dheera wrote:
               | > rooted devices are denied access to it
               | 
               | By what? It's impossible for a process to know for sure
               | if the system is rooted or not. A rooted system can
               | present itself to a process to look like a non-rooted
               | system if it's engineered well enough.
               | 
               | I'd bet that most of these apps probably just check if
               | "su" returns a shell, in which case perhaps all that's
               | needed is to modify the "su" executable to require "su
               | --magic-phrase foobar" before it drops into a root shell,
               | and returns "bash: su: not found" or whatever if called
               | with no arguments.
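                | 
                | Something like this hypothetical check (illustrative
                | only; real detection is increasingly attestation-based,
                | as noted below):
                | 
                |     import subprocess
                | 
                |     def naive_root_check() -> bool:
                |         # the naive test: does plain "su" hand
                |         # back a working root shell?
                |         try:
                |             r = subprocess.run(
                |                 ["su", "-c", "id"],
                |                 capture_output=True, timeout=5)
                |             return r.returncode == 0
                |         except (FileNotFoundError,
                |                 subprocess.TimeoutExpired):
                |             return False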
        
               | hollerith wrote:
               | >A rooted system can present itself to a process to look
               | like a non-rooted system if it's engineered well enough.
               | 
               | That was true 20 years ago, but most smartphones these
                | days have cryptographically-verified boot chains and
               | remote attestation of how the boot went.
        
           | ben_w wrote:
           | You say that like open source isn't also an illusion of
           | trust.
           | 
           | The reality is, there's too much to verify, and not enough
           | interest for the "many eyeballs make all bugs shallow"
           | argument.
           | 
           | We are, all of us, forced to trust, forced to go without the
           | genuine capacity to verify. It's not great, and the best we
           | can do is look for incentives and try to keep those aligned.
        
             | dheera wrote:
             | I don't agree with relying on the many eyeballs argument
             | for security, but from a privacy standpoint, I do think at
             | least the availability of source to MY eyeballs, as well as
             | the ability to modify, recompile, and deploy it, is better
             | than "trust me bro I'm your uncle Steve Jobs and I know
             | more about you than you but I'm a good guy".
             | 
             | If you want to, for example, compile a GPS-free version of
             | Android that appears like it has GPS but in reality just
             | sends fake coordinates to keep apps happy thinking they got
             | actual permissions, it's fairly straightforward to make
             | this edit, and you own the hardware so it's within your
             | rights to do this.
             | 
             | Open-source is only part of it; in terms of privacy, being
              | able to see what all is being sent in/out of my device is
              | arguably more important than open source. Closed source
             | would be fine if they allowed me to easily inject my own
             | root certificate for this purpose. If they aren't willing
             | to do that, including a 1-click replacement of the
             | certificates in various third-party, certificate-pinning
             | apps that are themselves potential privacy risks, it's a
             | fairly easy modification to any open source system.
             | 
             | A screen on my wall that flashes every JSON that gets sent
             | out of hardware that I own should be my right.
        
               | ben_w wrote:
               | > Open-source is only part of it; in terms of privacy,
               | being able to see what all is being sent in/out of my
               | device is is arguably more important than open source.
               | 
               | I agree; unfortunately it feels as if this ship has not
               | only sailed, but the metaphor would have to be expanded
                | to involve the port as well.
               | 
               | Is it even possible, these days, to have a functioning
               | experience with no surprise network requests? I've tried
               | to limit mine via an extensive hosts file list, but that
               | _did_ break stuff even a decade ago, and the latest
                | version of macOS doesn't seem to fully respect the hosts
               | file (weirdly it _partially_ respects it?)
               | 
               | > A screen on my wall that flashes every JSON that gets
               | sent out of hardware that I own should be my right.
               | 
               | I remember reading a tale about someone, I think it was a
               | court case or an audit, who wanted every IP packet to be
               | printed out on paper. Only backed down when the volume
               | was given in articulated lorries per hour.
               | 
               | I sympathise, but you're reminding me of that.
        
             | ajuc wrote:
             | Open source is like democracy. Imperfect and easy to fuck
             | up, but still by far the best thing available.
             | 
             | Apple is absolutism. Even the so called "enlightened"
             | absolutism is still bad compared to average democracy.
        
             | msla wrote:
             | Open Source is how that XZ hack got caught.
        
               | ben_w wrote:
               | Selection bias -- everyone only knows about the bugs that
               | do get caught.
               | 
               | I was one of many who reported a bug in Ubuntu that went
               | un-fixed for years, where the response smelled of nation-
               | state influence:
               | https://bugs.launchpad.net/ubuntu/+bug/1359836
               | 
                | And Log4Shell took about _8 years to notice_:
               | https://en.wikipedia.org/wiki/Log4Shell
        
           | wan23 wrote:
            | > Apple has never been privacy-first in practice
            | 
            | > They also make it a LOT harder than Android to execute
            | your own MITM proxies
           | 
           | I would think ease of MITM and privacy are opposing concerns
        
         | sergiotapia wrote:
         | > privacy-first strategy
         | 
          | That's just their way of walled-gardening Apple customers. Then
          | they can bleed devs and other companies dry without any
          | middle-men.
        
         | MyFirstSass wrote:
          | I've been saying the same thing since the ANE and the
          | incredible new chips with shared RAM arrived; suddenly everyone
          | could run capable local models - but then Apple decided to be
          | catastrophically stingy once again, putting a ridiculous 8GB of
          | RAM in these new iPads and their new MacBook Airs, destroying
          | any hope of a widespread "intelligent local Siri" because now
          | half the new generation can't run anything.
          | 
          | Apple is an amazing powerhouse but also disgustingly elitist
          | and wasteful, if not straight up vulgar in its profit motives.
          | There's really zero idealism there despite their romantic and
          | creative legacy.
          | 
          | There are always some straight-up idiotic limitations in their
          | otherwise incredible machines, with no other purpose than to
          | create planned obsolescence, "PRO" exclusivity, and piles of
          | e-waste.
        
         | KaiserPro wrote:
         | > This is completely coherent with their privacy-first strategy
         | (which would be at odds with sending data up to the cloud for
         | processing).
         | 
          | I mean yeah, that makes good marketing copy, but it's more due
          | to reducing latency and keeping running costs down.
          | 
          |  _But_ as this is mostly marketing fluff, we'll need to
          | actually see how it performs before casting judgment on how
          | "revolutionary" it is.
        
         | lunfard000 wrote:
          | Probably because they are super behind in the cloud space; it
          | is not like they wouldn't like to sell the service. They have
          | ignored photo privacy quite a few times in iCloud.
        
           | dylan604 wrote:
            | Is it surprising, since they've effectively given the finger
            | to data center hardware designs?
        
         | jablongo wrote:
          | So for hardware-accelerated training with something like
          | PyTorch, does anyone have a good comparison between Metal and
          | CUDA, both in terms of performance and capabilities?
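          | 
          | Not a full comparison, but as a rough first pass you can time
          | the same matmul through both backends yourself (a sketch;
          | synchronize so you measure the GPU, not the dispatch queue):
          | 
          |     import time
          |     import torch
          | 
          |     def bench(dev, n=4096, iters=20):
          |         sync = (torch.mps.synchronize if dev == "mps"
          |                 else torch.cuda.synchronize)
          |         a = torch.randn(n, n, device=dev)
          |         b = torch.randn(n, n, device=dev)
          |         for _ in range(3):  # warm-up
          |             a @ b
          |         sync()
          |         t0 = time.time()
          |         for _ in range(iters):
          |             a @ b
          |         sync()
          |         flops = 2 * n**3 * iters
          |         return flops / (time.time() - t0) / 1e12  # TFLOP/s
          | 
          |     if torch.backends.mps.is_available():
          |         print("mps :", round(bench("mps"), 2), "TFLOP/s")
          |     if torch.cuda.is_available():
          |         print("cuda:", round(bench("cuda"), 2), "TFLOP/s")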
        
         | s1k3s wrote:
         | For everyone else who doesn't understand what this means, he's
         | saying Apple wants you to be able to run models on their
         | devices, just like you've been doing on nvidia cards for a
         | while.
        
           | nomel wrote:
           | I think he's saying they want to make local AI a first class,
           | _default_ , capability, which is _very_ unlike buying a $1k
           | peripheral to enable it. At this point (though everyone seems
           | to be working on it), other companies need to include a
           | gaming GPU in every laptop, _and tablet_ now (lol), to enable
           | this.
        
           | Petersipoi wrote:
           | Awesome. I'm going to go tell my mom she can just pull her
           | Nvidia card out of her pocket at the train station to run
           | some models.
           | 
           | On second thought.. maybe it isn't "just like you've been
           | doing on Nvidia cards for a while"
        
         | andsoitis wrote:
         | > In case it is not abundantly clear by now: Apple's AI
         | strategy is to put inference (and longer term even learning) on
         | edge devices. This is completely coherent with their privacy-
         | first strategy (which would be at odds with sending data up to
         | the cloud for processing).
         | 
         | Their primary business goal is to sell hardware. Yes, they've
         | diversified into services and being a shopping mall for all,
         | but it is about selling luxury hardware.
         | 
         | The promise of privacy is one way in which they position
         | themselves, but I would not bet the bank on that being true
         | forever.
        
           | bamboozled wrote:
           | As soon as the privacy thing goes away, I'd say a major part
            | of their customer base goes away too. Most people use it over
            | Android so they don't get "hacked"; if Apple is doing the
            | hacking, I'd just buy a cheaper alternative.
        
             | Draiken wrote:
             | At least here in Brazil, I've never heard such arguments.
             | 
             | Seems even more unlikely for non technical users.
             | 
              | It's just their latest marketing campaign, as far as I can
             | tell. The vast majority of people buy iPhones because of
             | the status it gives.
        
               | everly wrote:
               | They famously had a standoff with the US gov't over the
               | Secure Enclave.
               | 
               | Marketing aside, all indications point to the iOS
               | platform being the most secure mobile option (imo).
        
               | elzbardico wrote:
               | This is a prejudiced take. Running AI tasks locally on
               | the device definitely is a giant improvement for the user
               | experience.
               | 
                | But not only that, Apple CPUs are objectively leagues
                | ahead of their competition in the mobile space. I am
                | still using an iPhone released in 2020 with absolutely no
                | appreciable slowdown or loss in perceived performance.
                | Even a 4-year-old iPhone still has specs that don't lag
                | much behind the equivalent Android phones, I still
                | receive the latest OS updates, and frankly, Android OS
                | is a mess.
               | 
               | If I cared about status, I would have changed my phone
               | already for a new one.
        
               | kernal wrote:
               | >Apple CPUs are objectively leagues ahead of their
               | competition in the mobile space
               | 
               | This is a lie. The latest Android SoCs are just as
               | powerful as the A series.
               | 
               | >Because even a 4 years old IPhone still has specs that
               | don't lag behind by much the equivalent Android phones, I
               | still receive the latest OS updates, and because frankly,
               | Android OS is mess.
               | 
               | Samsung and Google offer 7 years of OS and security
               | updates. I believe that beats the Apple policy.
        
               | martimarkov wrote:
                | Strangely, Android 14 seems not to be available for the
                | S20, which was released in 2020?
               | 
               | Or am I mistaken here?
        
               | Jtsummers wrote:
               | > Samsung and Google offer 7 years of OS and security
               | updates. I believe that beats the Apple policy.
               | 
               | On the second part:
               | 
               | https://en.wikipedia.org/wiki/IPadOS_version_history
               | 
               | The last iPads to stop getting OS updates (including
               | security, to be consistent with what Samsung and Google
               | are pledging) got 7 and 9 years of updates each (5th gen
               | iPad and 1st gen iPad Pro). The last iPhones to lose
               | support got about 7 years each (iPhone 8 and X). 6S, SE
               | (1st), and 7 got 9 and 8 years of OS support with
               | security updates. The 5S (released in 2013) last got a
               | security update in early 2023, so also about 9 years, the
               | 6 (2014) ended at the same time so let's call it 8 years.
               | The 4S, 2011, got 8 years of OS support. 5 and 5C got 7
               | and 6 years of support (5C was 5 in a new case, so was
               | always going to get a year less in support).
               | 
               | Apple has not, that I've seen at least, ever established
               | a long term support policy on iPhones and iPads, but the
               | numbers show they're doing at least as well as what
               | Samsung and Google are _promising_ to do, but have not
                | yet done. And they've been doing this for more than a
               | decade now.
               | 
               | EDIT:
               | 
               | Reworked the iOS numbers a bit, down to the month (I was
               | looking at years above and rounding, so this is more
               | accurate). iOS support time by device for devices that
               | cannot use the current iOS 17 (so the XS and above are
                | not counted here) in months:
                | 
                |   1st - 32    3G - 37     3GS - 56    4  - 48
                |   4S  - 93    5  - 81     5C  - 69    5S - 112
                |   6   - 100   6S - 102    SE  - 96    7  - 90
                |   8   - 78    X  - 76
               | 
               | The average is 72.5 months, just over 6 years. If we
               | knock out the first 2 phones (both have somewhat
               | justifiable short support periods, massive hardware
               | changes between each and their successor) the average
               | jumps to just shy of 79 months, or about 6.5 years.
               | 
               | The 8 and X look like regressions, but their last updates
               | were just 2 months ago (March 21, 2024) so still a good
               | chance their support period will increase and exceed the
               | 7 year mark like every model since the 5S. We'll have to
               | see if they get any more updates in November 2024 or
               | later to see if they can hit the 7 year mark.
        
               | kernal wrote:
               | >The last iPads to stop getting OS updates (including
               | security, to be consistent with what Samsung and Google
               | are pledging) got 7 and 9 years of updates each (5th gen
               | iPad and 1st gen iPad Pro). The last iPhones to lose
               | support got about 7 years each (iPhone 8 and X). 6S, SE
               | (1st), and 7 got 9 and 8 years of OS support with
               | security updates. The 5S (released in 2013) last got a
               | security update in early 2023, so also about 9 years, the
               | 6 (2014) ended at the same time so let's call it 8 years.
               | The 4S, 2011, got 8 years of OS support. 5 and 5C got 7
               | and 6 years of support (5C was 5 in a new case, so was
               | always going to get a year less in support).
               | 
               | These are very disingenuous numbers that don't tell the
               | complete story. An iPhone 7 getting a single critical
               | security patch does not take into account the hundreds of
               | security patches it did not receive when it stopped
               | receiving support. It received that special update
               | because Apple likely was told or discovered it was being
               | exploited in the wild.
               | 
               | Google and Samsung now offer 7 years of OS upgrades and
               | 84 months of full security patches. Selectively patching
               | a phone that is out of the support window with a single
               | security patch does not automatically increase its EOL
               | support date.
        
               | Jtsummers wrote:
               | They made that pledge for the Pixel 8 (2023). Let's
               | revisit this in 2030 and see what the nature of their
               | support is at that point and how it compares to Apple's
               | support for iPhone devices. We can't make a real
               | comparison since they haven't done anything yet, only
               | made promises.
               | 
               | What we can do _today_ is note that Apple never made a
               | promise, but did provide very long security support for
                | their devices despite that. They've already met or come
               | close to the Samsung/Google pledge (for one device) on
               | almost half their devices, and those are all the recent
               | ones (so it's not a downward trend of good support then
               | bad support, but rather mediocre/bad support to improving
               | and increasingly good support).
               | 
               | Another fun one:
               | 
               | iPhone XS was released in September 2018, it is on the
               | current iOS 17 release. In the absolute worst case of it
               | losing iOS 18 support in September, it will have received
               | 6 full years of support in both security and OS updates.
               | It'll still hit 7 years (comfortably) of security
               | updates. If it does get iOS 18 support in September, then
               | Apple will hit the Samsung/Google pledge 5 years before
               | Samsung/Google can even demonstrate their ability to
               | follow through (Samsung has a chance, but Google has no
               | history of commitment).
               | 
               | I have time to kill before training for a century ride:
               | 
               | Let's ignore everything before iPhone 4S, they had short
               | support periods that's just a fact and hardly worth
               | investigating. This is an analysis of devices released in
               | 2011 and later, when the phones had, mostly, matured as a
               | device so we should be expecting longer support periods.
               | These are the support periods when the phones were able
               | to run the still-current iOS versions, not counting later
               | security updates or minor updates but after the major iOS
               | version had been deprecated. As an example, for the
               | iPhone 4S it had support from 2011-2016. In 2016 its OS,
               | iOS 9, was replaced by iOS 10. Here are the numbers:
                |   4S      - 5 years
                |   5       - 5 years
                |   5C      - 4 years (decreased, 5 hardware but released
                |             a year later in a different case)
                |   5S      - 6 years
                |   6       - 5 years (decreased, not sure why)
                |   6S      - 7 years (hey, Apple did it! 2015 release,
                |             lost iOS upgrades in 2022)
                |   SE(1st) - 5 years (like 5C, 6S hardware but released
                |             later)
                |   7       - 6 years (decreased over 6S, not sure why)
                |   8       - 6 years
                |   X       - 6 years
               | 
               | The 6S is a bit of an outlier, hitting 7 years of full
               | support running the current iOS. 5C and SE(1st) both got
               | less total support, but their internals were the same as
               | prior phones and they lost support at the same time as
               | them (this is reasonable, if annoying, and does drag down
               | the average). So Apple has clearly trended towards 6
               | years of full support, the XS (as noted above) will get
               | at least 6 years of support as of this coming September.
               | We'll have to see if they can get it past the 7 year
               | mark, I know they haven't promised anything but the trend
               | suggests they can.
        
               | kernal wrote:
               | Sure. They also pledged to support Chromebooks for 10
                | years. My point is that I don't think they'll be
                | clawing back their new hardware support windows anytime
                | soon. Their data indicates that these devices were used
                | well beyond their initial support window metrics, so it
                | was in their, and their users', best interest to keep them
               | updated as long as they possibly could. 3 years of OS
               | updates and 4 years of security updates was always the
               | weak link in their commitment to security. And this
               | applies to all of their devices including the A series -
               | something I don't see other Android OEM's even matching.
               | 
               | BTW, my daily driver is an iPhone 13 and I was coming
               | from an iPhone X. So I'm well aware of the incredible
               | support Apple provides its phones. Although, I would
               | still like to see an 8+ year promise from them.
        
               | fl0ki wrote:
               | I look forward to these vendors delivering on their
               | promises, and I look forward to Apple perhaps formalizing
               | a promise with less variability for future products.
               | 
               | Neither of these hopes retroactively invalidates the fact
               | that Apple has had a much better track record of
               | supporting old phone models up to this point. Even if you
               | do split hairs about the level of patching some models
               | got in their later years, they still got full iOS updates
               | for years longer than most Android phones got any patches
               | at all, regardless of severity.
               | 
               | This is not an argument that somehow puts Android on top,
               | at best it adds nuance to just how _much_ better iOS
               | support has been up to this point.
               | 
               | Let's also not forget that if Apple wasn't putting this
               | kind of pressure on Google, they wouldn't have even made
               | the promise to begin with, because it's clear how long
               | they actually care to support products with no outside
               | pressure.
        
               | kernal wrote:
               | I agree. This is the type of competition I like to see
               | between these two companies. In the end the consumer wins
               | regardless of which one you buy. Google has also promised
               | 10 years of Chromebook support, so they've clearly got
               | the message on the importance of supporting hardware much
               | longer than a lot of people would use them for.
        
               | elzbardico wrote:
                | Google can't keep a product alive. You're welcome to
                | believe in their promises of extended support after
                | all those years of shitty update policies.
        
               | patall wrote:
               | > I am still using a IPhone released in 2020 with
               | absolutely no appreciable slow down or losses in
               | perceived performance.
               | 
               | My Pixel 4a here is also going strong, only the battery
               | is slowly getting worse. I mean, it's 2024, do phones
                | really still get slow? The 4a is now past Android
                | updates, but only 3 years of those were promised. And
                | at 350 bucks, it was like 40% less than the cheapest
                | iPhone mini at that time.
        
               | onemoresoop wrote:
               | > I mean, it's 2024, do phones really still get slow?
               | 
                | Hardware is pretty beefed up, but bloat keeps on
                | growing, and that slows things down considerably.
        
               | moneywoes wrote:
               | what about security updates?
        
               | tick_tock_tick wrote:
               | > I am still using a IPhone released in 2020 with
               | absolutely no appreciable slow down or losses in
               | perceived performance.
               | 
                | Only because Apple lost a lawsuit; otherwise they'd
                | have kept intentionally slowing it down.
        
               | noname120 wrote:
               | This has been debunked.
        
               | FireBeyond wrote:
               | .... by Apple.
        
               | noname120 wrote:
               | See https://en.wikipedia.org/wiki/Batterygate
        
               | FireBeyond wrote:
               | Right. "By Apple".
               | 
               | Apple says it made these changes for other reasons,
               | honestly, truly. And if it happened to have the same
               | effect, then that was unfortunate, and unintended.
               | 
                | Only Apple really knows. But there was a slew of
                | changes and reversals following the drama. "Oh, we'll
                | implement notifications now", "Oh, we'll change the
                | peak performance behavior", and "we will change and
                | add additional diagnostics to make sure issues are
                | battery related" certainly feel like ex post facto
                | rationalizations: if it had truly been a battery
                | thing all along, those would have been functional
                | requirements from the start.
        
               | dijit wrote:
               | I never understood this argument.
               | 
               | Theres no "status" to a brand of phone when the cheapest
               | point of entry is comparable and the flagship is cheaper
               | than the alternative flagship.
               | 
               | Marketing in most of europe is chiefly not the same as
               | the US though so maybe its a perspective thing.
               | 
               | I just find it hard to really argue "status" when the
               | last 4 iPhone generations are largely the same and
               | cheaper than the Samsung flagships.
               | 
               | At Elgiganten a Samsung S24 Ultra is 19,490 SEK[0].
               | 
               | The most expensive iPhone 15 pro max is 18,784 SEK at the
               | same store[1].
               | 
               | [0]: https://nya.elgiganten.se/product/mobiler-tablets-
               | smartklock...
               | 
               | [1]: https://nya.elgiganten.se/product/mobiler-tablets-
               | smartklock...
        
               | pompino wrote:
                | It's not an argument; just ask why people lust after the
               | latest iPhones in poor countries. They do it because they
               | see rich people owning them. Unless you experience that,
               | you won't really understand it.
        
               | Draiken wrote:
               | My take is that it's like a fashion accessory. People buy
               | Gucci for the brand, not the material or comfort.
               | 
               | Rich people ask for the latest most expensive iPhone even
               | if they're only going to use WhatsApp and Instagram on
               | it. It's not because of privacy or functionality, it's
                | simply to show off to everyone that they can purchase
                | it, and to not stand out among their peers as the
                | only one without it.
                | 
                | As another commenter said: it's not an argument, it's
                | a fact here.
        
               | Aerbil313 wrote:
               | I have an iPhone so I guess I qualify as a rich person by
               | your definition. I am also a software engineer. I cannot
               | state enough how bogus that statement is. I've used both
               | iPhone and Android, and recent flagships. iPhone is by
               | far the easiest one to use. Speaking in more objective
               | terms, iPhones have a coherent UI which maintains its
               | consistency both throughout the OS and over the years.
               | They're the most dumbed down phones and easiest to
               | understand. I recommend iPhone to all my friends and
               | relatives.
               | 
               | There's obviously tons of people who see iPhone as a
               | status item. They're right, because iPhone is expensive
                | and only the rich can buy them. This doesn't mean the
                | iPhone is not the best option out there for a person
                | who doesn't want to extensively customize their phone
                | and just wants to use it.
        
               | ClumsyPilot wrote:
               | > iPhone and Android, and recent flagships. iPhone is by
               | far the easiest one to use. Speaking in more objective
               | terms, iPhones have a coherent UI
               | 
                | It's not about whether you've used Android; it's
                | about whether you've been poor-ish or stingy.
                | 
                | To some people those are luxuries: the most expensive
                | phone they buy is a mid-range Motorola for $300 with
                | a Snapdragon 750G or whatever. It runs all the same
                | apps after all, and takes photos.
                | 
                | iPhones are simply outside their budget.
        
               | Draiken wrote:
                | Yes, by pure statistics you are probably rich
                | compared to everyone else. The average software
                | developer salary is way higher than the average
                | salary across the entire US, let alone the rest of
                | the world.
               | 
               | Sure, some people pick up the iPhone because they like
               | the specs, or the apps, or whatever else. That's why I
               | said the majority picks it up for status, not all. But
               | keep in mind nobody's judging the iPhone's specs or
               | capabilities here. We're talking about why people buy it.
               | 
               | Ask any teenager why they want an iPhone. I'd be very
               | surprised if even one said it's because of privacy. It's
               | because of the stupid blue bubble, which is a proxy for
               | status.
               | 
               | I'm pretty sure if Apple released the same phone again
               | with a new name and design, people would still buy it.
               | For the majority, it's not because of features, ease of
               | use, specs, etc: it's status.
        
               | hindsightbias wrote:
               | It's fashion and the kids are hip. But there is an
               | endless void of Apple haters here who want to see it
               | burn. They have nothing in common with 99.9% of the
               | customer base.
        
               | ClumsyPilot wrote:
                | I was thinking about this for a while. The problem is
                | not Apple; it's the fact that the rest of the
                | industry is gutless and has zero vision or
                | leadership. Whatever
               | Apple does, the rest of the industry will follow or
               | oppose - but will be defined by it.
               | 
               | It's like how people who don't like US and want nothing
               | to do with US still discuss US politics, because it has
               | so much effect everywhere.
               | 
                | (Ironically, not enough people discuss China with any
                | coherent level of understanding.)
        
               | hatsix wrote:
               | You're absolutely right, I'm so glad that Apple was the
               | first company to release a phone with a touch screen, or
               | a phone with an app store, or a smart watch or a VR
               | headset.
               | 
               | Apple doesn't release new products, they wait until the
               | actual brave and innovating companies have done the
               | exploration and then capitalize on all of their
               | learnings. Because they are never the first movers and
               | they have mountains of cash, they're able to enter the
               | market without the baggage of early adopters. They don't
               | have to worry about maintaining their early prototypes.
               | 
               | Apple doesn't innovate or show leadership, they wait
               | until the innovators have proven that the market is big
               | enough to handle Apple, then they swoop in with a product
               | that combines the visions of the companies that were
               | competing.
               | 
               | Apple is great at what they do, don't get me wrong. And
               | swooping in when the market is right is just good
               | business. Just don't mistake that for innovation or
               | leadership.
        
               | hatsix wrote:
               | The cheapest point of entry is absolutely not comparable.
               | The cheapest new iPhone on apple.com is $429. The
                | cheapest new Samsung on samsung.com is $199 (they do
                | have a phone listed for $159, but its button says
                | "Notify Me").
               | 
               | Granted, you may have been leaning very heavily on the
               | dictionary definition of "comparable", in that the two
               | numbers are able to be compared. However, when the
               | conclusion of that comparison is "More than twice the
               | price", I think you should lead with that.
               | 
                | Keep in mind, the iPhone SE is using a 3-year-old
                | processor, while the Samsung A15 was released 5
                | months ago with a brand-new processor.
        
               | molszanski wrote:
                | Is this brand-new CPU faster or more energy
                | efficient?
        
               | hatsix wrote:
               | Yes.
               | 
               | According to various sites, the Mediatek Dimensity 6100+
               | is a 6nm update to a core that was released 3 years ago
                | (Dimensity 700 on 7nm). It's 5-10% faster, likely due
               | to the update from 7 to 6nm, as the cores are the same
               | and run at the same speed. It contains an updated
               | bluetooth chipset (from 5.1 to 5.2) and supports a larger
                | max camera. The camera on the A15 is well below the
                | max size of the previous chipset; however, the
                | increased camera bandwidth should ensure that the
                | camera feels snappier (a common complaint on low-end
                | phones). The process improvement should increase
                | efficiency as well, though there are no benchmarks
                | able to test this.
        
               | briandear wrote:
               | The vast majority of people don't. They buy because the
               | ecosystem works. Not sure how I get status from a phone
               | that nobody knows I have. I don't wear it on a chain.
        
               | Draiken wrote:
               | Could it possibly be different in Brazil?
               | 
               | iPhones are not ubiquitous here, and they're way more
               | expensive than other options.
        
             | jamesmontalvo3 wrote:
             | Maybe true for a lot of the HN population, but my teenagers
              | are mortified by the idea of me giving them Android phones
             | because then they would be the pariahs turning group
             | messages from blue to green.
        
               | WheatMillington wrote:
               | This is a sad state of affairs.
        
               | adamomada wrote:
               | Interesting that some people would take that as an Apple
               | problem and others would take it as a Google problem
               | 
               | Who's at fault for not having built-in messaging that
               | works with rich text, photos, videos, etc?
               | 
               | Google has abandoned more messaging products than I can
               | remember while Apple focused on literally the main
               | function of a phone in the 21st century. And they get
               | shit for it
        
               | pseudalopex wrote:
               | Most of the world doesn't care about built in. Apple
               | decided against iMessage on Android for lock in. Android
               | had RCS in 2019.
        
               | int_19h wrote:
               | Apple get shit for it because they made it a proprietary
               | protocol for which clients are not available on anything
               | except their own hardware. The whole point of messaging
               | is that it should work with all my contacts, not just
               | those who drank the Apple-flavored Kool-Aid.
        
               | paulmd wrote:
               | Google's protocol is proprietary too - their encryption
               | extension makes it inaccessible for anyone else and
               | google will not partner or license (companies have
               | tried).
               | 
               | RCS as currently implemented is iMessage but with a coat
               | of google paint. There is no there there.
        
               | int_19h wrote:
               | Google should get plenty of shit too for closing down
               | GTalk in the first place. It's not an either-or. Big tech
               | in general hates open protocols and interoperability for
               | consumer stuff; Apple is just the most egregious offender
               | there.
        
               | simonh wrote:
                | I'm in Europe and everyone uses WhatsApp, and while
                | Android does have higher share over here, the iPhone
                | still dominates the younger demographics. I'm not
                | denying blue/green is a factor in the US, but it's
                | not even a thing here. It's nowhere near the only, or
                | even a dominant, reason iPhones are successful with
                | young people.
        
               | adamc wrote:
               | Snobbery is an expensive pastime.
        
               | lolinder wrote:
               | And just to elaborate on this: it's not just snobbery
               | about the color of the texts, for people who rely on
               | iMessage as their primary communication platform it
               | really is a severely degraded experience texting with
               | someone who uses Android. We Android users have long
               | since adapted to it by just avoiding SMS/MMS in favor of
               | other platforms, but iPhone users are accustomed to just
               | being able to send a video in iMessage and have it be
               | decent quality when viewed.
               | 
               | Source: I'm an Android user with a lot of iPhones on my
               | in-laws side.
        
               | bigstrat2003 wrote:
               | No, it really is just snobbery.
        
               | Dylan16807 wrote:
               | Be aware that iPhones degrade MMS more than necessary and
               | the only reason seems to be to punish Android use.
        
             | littlestymaar wrote:
             | Apple only pivoted into the "privacy" branding relatively
             | recently [1] and I don't think that many people came for
              | that reason alone. In any case, most are now trapped
              | inside the walled garden, and the effort to escape is
              | likely too big. And there's no escape anyway, since
              | Google will always make Android worse in that regard...
             | 
             | [1] in 2013 they even marketed their "eBeacon" technology
             | as a way for retail stores to monitor and track their
             | customers which...
        
               | adamomada wrote:
                | Circa 2013 saw the release of the Nexus 5, arguably
                | the first really usable Android smartphone.
                | 
                | Privacy wasn't really a concern because most people
                | didn't have the privacy-eroding device yet. The years
                | following the Nexus 5 were when smartphones went into
                | geometric growth and the slow realization of the
                | privacy nightmare became apparent.
                | 
                | Imho I was really excited to get a Nexus 4 at the
                | time; just a few short years later the shine wore off
                | and I was horrified at the smartphone-enabled future.
                | And I have a 40-year background in computers and
                | understand them better than 99 out of 100 users - if
                | I didn't see it, I can't blame them either.
        
               | mkl wrote:
                | > Circa 2013 saw the release of the Nexus 5, arguably
                | the first really usable Android smartphone.
               | 
               | What a strange statement. I was late to the game with a
               | Nexus S in 2010, and it was really usable.
        
               | adamomada wrote:
                | Define usable. Imho before the Nexus 4 everything was
                | crap; the Nexus 4 was barely enough (4x1.4 GHz); the
                | Nexus 5 (4x2.2 GHz) plus the software of the time
                | (post-KitKat) was when it was really ready for the
                | mainstream.
        
             | moneywoes wrote:
             | is that still the case?
        
             | tick_tock_tick wrote:
              | I'd say from my experience the average Apple user cares
              | less about privacy than the general public. It's a
              | status symbol first and foremost; 99% of what people do
              | on their phones is basically identical on both
              | platforms at this point.
        
           | serial_dev wrote:
           | It doesn't need to stay true forever.
           | 
           | The alternative is Google / Android devices and OpenAI
           | wrapper apps, both of which usually offer a half baked UI,
           | poor privacy practices, and a completely broken UX when the
           | internet connection isn't perfect.
           | 
           | Pair this with the completely subpar Android apps, Google
           | dropping support for an app about once a month, and suddenly
           | I'm okay with the lesser of two evils.
           | 
            | I know they aren't running a charity; I've even
            | hypothesized that Apple just can't build good services,
            | so they pivoted to focusing on this fake "privacy"
            | angle. In the end, iPhones
           | are likely going to be better for edge AI than whatever is
           | out there, so I'm looking forward to this.
        
             | jocaal wrote:
             | > better for edge AI than whatever is out there, so I'm
             | looking forward to this
             | 
             | What exactly are you expecting? The current hype for AI is
             | large language models. The word 'large' has a certain
              | meaning in that context: much larger than can fit on
              | your phone. Everyone is going crazy about edge AI, so
              | what am I missing?
        
               | jchanimal wrote:
                | It fits on your phone, and your phone can offload
                | battery-burning tasks to nearby edge servers. Seems
                | like the path consumer-facing AI will take.
        
               | jitl wrote:
               | Quantized LLMs can run on a phone, like Gemini Nano or
                | OpenLLAMA 3B. If a small local model can handle the
                | simple stuff and delegate harder tasks to a model in
                | the data center when connectivity is good, you could
                | get an even better experience.
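                | 
                | (A toy sketch of that hand-off in Python; both
                | "models" are invented stand-ins, not real APIs. In
                | practice local_generate would wrap a quantized
                | on-device model and remote_generate an RPC to a
                | data-center model:)
                | 
                |   def local_generate(prompt: str):
                |       # Pretend short prompts are "simple stuff" the
                |       # small model can answer confidently.
                |       conf = 0.9 if len(prompt.split()) < 12 else 0.3
                |       return f"[local: {prompt}]", conf
                | 
                |   def remote_generate(prompt: str) -> str:
                |       # Stand-in for the big model behind a network
                |       # call.
                |       return f"[data center: {prompt}]"
                | 
                |   def answer(prompt: str, online: bool) -> str:
                |       draft, conf = local_generate(prompt)
                |       # Keep the local draft when it looks good
                |       # enough or we're offline; otherwise escalate
                |       # the harder task to the data center.
                |       if conf >= 0.8 or not online:
                |           return draft
                |       return remote_generate(prompt)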
        
               | SmellTheGlove wrote:
                | > If a small local model can handle the simple stuff
                | and delegate harder tasks to a model in the data
                | center when connectivity is good, you could get an
                | even better experience.
               | 
               | Distributed mixture of experts sounds like an idea. Is
               | anyone doing that?
        
               | cheschire wrote:
               | Sounds like an attack vector waiting to happen if you
               | deploy enough competing expert devices into a crowd.
               | 
               | I'm imagining a lot of these LLM products on phones will
               | be used for live translation. Imagine a large crowd event
               | of folks utilizing live AI translation services being
               | told completely false translations because an actor
               | deployed a 51% attack.
        
               | jagger27 wrote:
               | I'm not particularly scared of a 51% attack between the
               | devices attached to my Apple ID. If my iPhone splits
               | inference work with my idle MacBook, Apple TV, and iPad,
               | what's the problem there?
        
               | moneywoes wrote:
               | what about in situations with no bandwidth?
        
               | mr_toad wrote:
                | Using RAG, a smaller local LLM combined with local
                | data (e.g. your emails, iMessages, etc.) can be more
                | useful than a large external LLM that doesn't have
                | your data.
                | 
                | No point asking GPT4 "what time does John's party
                | start?", but a local LLM can do better.
        
               | jwells89 wrote:
               | This is why I think Apple's implementation of LLMs is
               | going to be a big deal, even if it's not technically as
               | capable. Just making Siri better able to converse (e.g.
               | ask clarifying questions) and giving it the context
               | offered by user data will make it dramatically more
                | useful than siloed-off remote LLMs.
        
               | callalex wrote:
               | In the hardware world, last year's large has a way of
               | becoming next year's small. For a particularly funny
               | example of this, check out the various letter soup names
               | that people keep applying to screen resolutions. https://
               | en.m.wikipedia.org/wiki/Display_resolution_standards...
        
               | gopher_space wrote:
               | > Everyone is going crazy about edge AI, what am I
               | missing?
               | 
               | If you clone a model and then bake in a more expensive
               | model's correct/appropriate responses to your queries,
               | you now have the functionality of the expensive model in
               | your clone. For your specific use case.
               | 
                | The resulting case-specific models are small enough
                | to run on all kinds of hardware, so everyone's
               | seeing how much work can be done on their laptop right
               | now. One incentive for doing so is that your approaches
               | to problems are constrained by the cost and security of
               | the Q&A roundtrip.
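                | 
                | (A rough sketch of that "bake in" step in Python;
                | expensive_model is a placeholder for the costly
                | remote model, and the JSONL file is whatever format
                | your fine-tuning pipeline expects:)
                | 
                |   import json
                | 
                |   def expensive_model(prompt: str) -> str:
                |       # Placeholder for a call to the big model.
                |       return f"[teacher answer to: {prompt}]"
                | 
                |   # Record the teacher's answers to your real
                |   # queries, then fine-tune the small local clone
                |   # on the resulting pairs (e.g. with LoRA).
                |   queries = [
                |       "summarize this bug report ...",
                |       "draft a reply declining the meeting ...",
                |   ]
                |   with open("distill.jsonl", "w") as f:
                |       for q in queries:
                |           pair = {"prompt": q,
                |                   "completion": expensive_model(q)}
                |           f.write(json.dumps(pair) + "\n")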
        
             | kernal wrote:
             | >subpar Android apps
             | 
             | Care to cite these subpar Android apps? The app store is
             | filled to the brim with subpar and garbage apps.
             | 
             | >Google dropping support for an app about once a month
             | 
             | I mean if you're going to lie why not go bigger
             | 
             | >I'm okay with the lesser of two evils.
             | 
             | So the more evil company is the one that pulled out of
              | China because they refused to hand over their users'
              | data to the Chinese government on a fiber-optic silver
              | platter?
        
               | martimarkov wrote:
               | Google operates in China albeit via their HK domain.
               | 
               | They also had project DragonFly if you remember.
               | 
                | The lesser of two evils is the company that doesn't
                | try to actively profile me (in order for their ads
                | business to be better) with every piece of data it
                | can find, and doesn't force me to share all possible
                | data with it.
               | 
               | Google is famously known to kill apps that are good and
               | used by customers: https://killedbygoogle.com/
               | 
                | As for the subpar apps: there is a massive difference
                | in network traffic on the Home Screen between iOS and
                | Android.
        
               | kernal wrote:
               | >Google operates in China albeit via their HK domain.
               | 
               | The Chinese government has access to the iCloud account
               | of every Chinese Apple user.
               | 
               | >They also had project DragonFly if you remember.
               | 
               | Which never materialized.
               | 
               | >The lesser of two evils is that one company doesn't try
               | to actively profile me (in order for their ads business
               | to be better) with every piece of data it can find and
               | forces me to share all possible data with them.
               | 
                | Apple does targeted and non-targeted advertising as
                | well.
               | Additionally, your carrier has likely sold all of the
               | data they have on you. Apple was also sued for selling
               | user data to ad networks. Odd for a Privacy First company
               | to engage in things like that.
               | 
               | >Google is famously known to kill apps that are good and
               | used by customers: https://killedbygoogle.com/
               | 
               | Google has been around for 26 years I believe. According
               | to that link 60 apps were killed in that timeframe.
               | According to your statement that Google kills an app a
               | month that would leave you 252 apps short. Furthermore,
               | the numbers would indicate that Google has killed 2.3
               | apps per year or .192 apps per month.
               | 
               | >As for the subpar apps: there is a massive difference
               | between the network traffic when on the Home Screen
               | between iOS and Android.
               | 
               | Not sure how that has anything to do with app quality,
               | but if network traffic is your concern there's probably a
               | lot more an Android user can do than an iOS user to
               | control or eliminate the traffic.
        
               | FireBeyond wrote:
               | > Google has been around for 26 years I believe.
               | According to that link 60 apps were killed in that
               | timeframe. According to your statement that Google kills
               | an app a month that would leave you 252 apps short.
               | Furthermore, the numbers would indicate that Google has
               | killed 2.3 apps per year or .192 apps per month.
               | 
               | Most of the "Services" on that list are effectively apps,
               | too:
               | 
               | VPN by Google One, Album Archive, Hangouts, all the way
               | back to Answers, Writely, and Deskbar.
               | 
               | I didn't touch hardware, because I think that should be
               | considered separately.
               | 
               | The first of 211 services on that site was killed in
               | 2006.
               | 
               | The first of the 60 apps on that site was killed in 2012.
               | 
                | So even apps alone, about 5 a year.
               | 
               | But more inclusively, 271 apps or services in 17 years is
               | ~16/year, over one a month.
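                | 
                | (Recomputing from the figures cited in this thread,
                | in Python, with 2024 as the endpoint:)
                | 
                |   apps = 60                # first killed 2012
                |   apps_and_services = 271  # first killed 2006
                |   print(apps / 26)             # ~2.3/yr over
                |                                # Google's lifetime
                |   print(apps / (2024 - 2012))  # 5.0/yr since 2012
                |   print(apps_and_services / (2024 - 2006) / 12)
                |   # ~1.25/month, i.e. over one a month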
               | 
               | You need to remind yourself of the site guidelines about
               | assuming the worst. Your comments just come across
               | condescendingly.
        
               | kernal wrote:
               | >Most of the "Services" on that list are effectively
               | apps, too:
               | 
               | Even with the additional apps you've selected it still
               | doesn't come close to the one app per month claim.
               | 
               | >I didn't touch hardware, because I think that should be
               | considered separately.
               | 
               | So why even mention it? Is Apple impervious to
               | discontinuing hardware?
               | 
               | >The first of 211 services on that site was killed in
               | 2006.
               | 
               | So we're talking about services now? Or apps? Or apps and
               | services? The goal posts keep moving.
               | 
               | >You need to remind yourself of the site guidelines about
               | assuming the worst. Your comments just come across
               | condescendingly.
               | 
                | I suggest you also consult the guidelines with regard
                | to calling people names. My comments were never
                | intended to be taken that way.
        
               | wiseowise wrote:
               | > The Chinese government has access to the iCloud account
               | of every Chinese Apple user.
               | 
               | Source?
        
               | wiseowise wrote:
               | > I mean if you're going to lie why not go bigger
               | 
                | Google Podcasts, Stadia, a shitton of other
                | discontinued applications?
        
             | rfoo wrote:
             | > The alternative is Google / Android devices
             | 
             | No, the alternative is Android devices with everything
             | except firmware built from source and signed by myself. And
             | at the same time, being secure, too.
             | 
              | You just can't have this on Apple devices. On the
              | Android side choices are limited too; I don't like
              | Google, and especially their disastrous hardware
              | design, but their Pixel line is the most approachable
              | one able to do all this.
              | 
              | Heck, you can't even build your own app for your own
              | iPhone without buying additional hardware (a Mac; this
              | is not a software issue but a legal one, as the iOS SDK
              | is licensed to you on the condition that it's used on
              | Apple hardware only) and a yearly subscription. How is
              | this acceptable at all?
        
               | adamomada wrote:
               | The yearly subscription is for publishing your app on
               | Apple's store and definitely helps keep some garbage out.
                | Running your own app on your own device is basically
                | solved with free third-party solutions now (see
                | AltStore, and since then a newer method I can't
                | recall atm)
        
               | tsimionescu wrote:
               | Those are only available in the EU, and Apple has been
               | huffing and puffing even here.
        
               | dns_snek wrote:
               | Notice that parent never talked about _publishing_ apps,
               | just _building_ and running apps on their own device.
               | "Publishing on AltStore" (or permanently running the app
               | on your own device in any other way) still requires a
               | $100/year subscription as far as I'm aware.
        
               | simfree wrote:
               | WebGPU and many other features on iOS are unimplemented
               | or implemented in half-assed or downright broken ways.
               | 
               | These features work on all the modern desktop browsers
               | and on Android tho!
        
               | Aloisius wrote:
               | > WebGPU and many other features
               | 
                | WebGPU isn't standardized yet. Hell, _most_ of the
                | features people complain about aren't part of any
                | standard, but for some reason there's this sense that
                | if it's in Chrome, it's standard - as if Google
                | dictates standards.
        
               | moooo99 wrote:
               | > but for some reason there's this sense that if it's in
               | Chrome, it's standard - as if Google dictates standards.
               | 
               | Realistically, given the market share of Chrome and
               | Chromium based browsers, they kind of do.
        
               | rootusrootus wrote:
               | I didn't like it when Microsoft dominated browsers, and
               | I'm no happier now. I've stopped using Chrome.
        
               | notpushkin wrote:
               | Just curious - what are you using now?
        
               | feisuzhu wrote:
                | I've been using Firefox since the Quantum version
                | came out. It feels slightly slower than Chrome, but
                | it's negligible to me. Otherwise I can't tell a
                | difference (except some heavy web-based Office-like
                | solutions screaming 'Your browser is not supported!'
                | that actually work fine).
        
               | yencabulator wrote:
               | Meanwhile, Apple has historically dictated that Google
               | can't publish Chrome for iOS, only a reskinned Safari.
               | People in glass-walled gardens shouldn't throw stones.
        
               | simfree wrote:
               | Firefox has an implementation of WebGPU, why is Safari
               | missing in action?
        
               | ToucanLoucan wrote:
                | > Heck, you can't even build your own app for your
                | own iPhone without buying additional hardware (a Mac;
                | this is not a software issue but a legal one, as the
                | iOS SDK is licensed to you on the condition that it's
                | used on Apple hardware only) and a yearly
                | subscription. How is this acceptable at all?
               | 
               | Because they set the terms of use of the SDK? You're not
               | required to use it. You aren't required to develop for
               | iOS. Just because Google gives it all away for free
               | doesn't mean Apple has to.
        
               | ClumsyPilot wrote:
               | > You aren't required to develop for iOS
               | 
               | Do you have a legal right to write software or run your
               | own software for hardware you bought?
               | 
                | Because it's very easy to take away a right by
                | erecting artificial barriers, just like how you could
                | discriminate by race at work but pretend you are
                | doing something else.
        
               | ToucanLoucan wrote:
               | > Do you have a legal right to write software or run your
               | own software for hardware you bought?
               | 
               | I've never heard of such a thing. Ideally I'd _like_
                | that, but I don't have such freedoms with the computers
               | in my cars, for example, or the one that operates my
               | furnace, or even for certain parts of my PC.
        
               | ClumsyPilot wrote:
               | So you bought "a thing' but you can't control what it
               | does, how it does it, you don't get to decide what data
               | it collects or who can see that data.
               | 
               | You aren't allowed to repair the "thing' because the
               | software can detect you changed something and will refuse
               | to boot. And whenever it suits the manufacturer, they
               | will decide when the 'thing' is declared out of support
               | and stops functioning.
               | 
               | I would say you are not an owner then, you (and me) and
               | just suckers that are paying for the party. Maybe it's a
               | lease. But then we also pay when it breaks, so it more of
               | a digital feudalism.
        
               | paulmd wrote:
               | > Do you have a legal right to write software or run your
               | own software for hardware you bought?
               | 
               | No, obviously not. Do you have a right to run a custom OS
               | on your PS5? Do you have a right to run a custom
               | application on your cable set-top box? Etc. Such a right
               | obviously doesn't exist and most people generally are
               | somewhere between "don't care" and actively rejecting it
               | for various reasons (hacking in games, content DRM, etc).
               | 
               | It's fine if you think there _should_ be, but it
               | continues this weird trend of using apple as a foil for
               | complaining about random other issues that other vendors
               | tend to be just as bad or oftentimes even worse about,
               | simply because they're a large company with a large group
                | of anti-fans/haters who will readily nod along.
               | 
               | Remember when the complaint was that the pelican case of
               | factory OEM tools you could rent (or buy) to install your
                | factory replacement screen was _too big and bulky_,
               | meaning it was really just a plot to sabotage right to
               | repair?
               | 
               | https://www.theverge.com/2022/5/21/23079058/apple-self-
               | servi...
        
               | dns_snek wrote:
               | > Remember when the complaint was that the pelican case
               | of factory OEM tools you could rent (or buy) to install
               | your factory replacement screen was too big and bulky,
               | meaning it was really just a plot to sabotage right to
               | repair?
               | 
               | Yes, I do. That was and continues to be a valid
               | complaint, among all other anti-repair schemes Apple have
               | come up with over the years. DRM for parts, complete
               | unavailability of some commonly repaired parts,
               | deliberate kneecapping of "Apple authorized service
               | providers", leveraging the US customs to seize shipments
               | of legitimate and/or unlabeled replacement parts as
               | "counterfeits", gaslighting by official representatives
               | on Apple's own forums about data recovery, sabotaging
               | right to repair laws, and even denial of design issues[1]
               | to weasel out of warranty repair just to name a few.
               | 
               | All with the simple anti-competitive goal of making third
               | party repair (both authorized and independent) a less
               | attractive option due to artificially increased prices,
               | timelines to repair, or scaremongering about privacy.
               | 
               | https://arstechnica.com/gadgets/2022/12/weakened-right-
               | to-re...
               | 
               | https://www.pcgamer.com/ifixit-says-apples-iphone-14-is-
               | lite...
               | 
               | [1] Butterfly keyboards, display cables that were too
               | short and failed over time
        
               | paulmd wrote:
               | > Yes, I do. That was and continues to be a valid
               | complaint,
               | 
               | No, it doesn't - because you can simply not use the tools
               | if you don't want. You can just order a $2 spudger off
               | Amazon if you want, you don't need the tools at all.
               | 
                | It continues to be a completely invalid complaint
                | that shows just how bad-faith the discussion about
                | apple has become - it literally costs you nothing to
                | not use the tools, there is no downside to having
                | apple make them available to people, and yet you guys
                | still find a way to bitch about it.
               | 
               | Moreover, despite some "bold" proclamations from the
               | haters... no android vendors ever ended up making their
               | oem tooling available to consumers at all. You _have_ to
               | use the Amazon spudger on your pixel, and you _will_ fuck
               | up the waterproofing when you do your repair, because the
               | android phone won't seal properly against water without
               | the tools either. IPX sixtywho!?
               | 
               | It's literally a complete and total net positive: nothing
               | was taken away from you, and you don't need to use it,
               | and it makes your life easier and produces a better
               | repair if you want it. Apple went out of their way to
               | both make the tooling available to normies who want to
               | rent it or people who want to buy it for real. And people
               | still bitch, and still think they come off better for
               | having done so. Classic "hater" moment, in the Paul
               | Graham sense. Anti-fanboys are real.
               | 
               | https://paulgraham.com/fh.html
               | 
               | Literally, for some people - the pelican cases with the
               | tools are too big and heavy. And that's enough to justify
               | the hate.
               | 
               | Again, great example of the point I was making in the
               | original comment: people inserting their random hobby
               | horse issues using apple as a foil. You don't like how
               | phones are made in general, so you're using apple as a
               | whipping boy for the issue even if it's not really caused
               | or worsened by the event in question etc. Even if the
               | event in question is apple _making that issue somewhat
               | better,_ and is done worse by all the other vendors etc.
                | Can't buy tooling for a pixel _at all_, doing those
               | repairs will simply break waterproofing without it, and
               | you're strictly better off having the ability to get
               | access to the tooling if you decide you want it, but
               | apple offering it is a flashpoint you can exploit for
               | rhetorical advantage.
        
               | ToucanLoucan wrote:
               | > Moreover, despite some "bold" proclamations from the
               | haters... no android vendors ever ended up making their
               | oem tooling available to consumers at all. You have to
               | use the Amazon spudger on your pixel, and you will fuck
               | up the waterproofing when you do your repair, because the
               | android phone won't seal properly against water without
               | the tools either. IPX sixtywho!?
               | 
               | I think the dirty little secret here is that an iPhone is
               | just about the only phone, apart from maybe some of the
               | really nice Google and Samsung flagships, that anyone
                | _wants_ to repair, because they're bloody expensive.
               | Which is fine and dandy but then do kindly park your
               | endless bemoaning of the subjects of e-waste and non-
               | repairable goods, when Android by far and away is the
               | worse side of that equation, with absolute shit tons of
               | low yield, crap hardware made, sold, and thrown away when
               | the first software update renders it completely unusable
               | (if it wasn't already, from the factory).
        
               | dns_snek wrote:
               | Could you chill with the relentless insults? I'd
               | appreciate it.
               | 
               | Perhaps you haven't noticed, but once you tally up
               | overpriced parts together with their oversized, heavy,
                | expensive rental of tools _that you don't need_, you end
               | up with a sum that matches what you would pay to have it
               | repaired by Apple - except you're doing all of the work
               | yourself.
               | 
               | A curious consumer who has never repaired a device, but
               | might have been interested in doing so, will therefore
               | conclude that repairing their own device is 1. Far too
               | complicated, thanks to an intimidating-looking piece of
               | kit that they recommend, but is completely unnecessary,
               | and 2. Far too expensive, because Apple prices these such
               | that the repair is made economically nonviable.
               | 
               | So yes, I still believe that this is Apple fighting the
               | anti-repair war on a psychological front. You're giving
               | them benefit of the doubt even though they've established
               | a clear pattern of behavior that demonstrates their anti-
               | repair stance beyond any reasonable doubt - although you
               | dance around the citations and claim that I'm being
               | unreasonable about Apple genuinely making the repair
               | situation "better".
               | 
                | Furthermore, I'm not a fanboy or anti-fanboy of any
                | company. The only things I'm an anti-fanboy of are
                | anti-consumer practices. If Apple changed some of
                | their practices I'd go out and buy an iPhone and a
                | MacBook tomorrow.
               | 
               | The fact that I pointed out that Apple is hostile against
               | repair does not mean that I endorse Google, Samsung, or
               | any other brand - they all suck when it comes to repair,
               | yet you're taking it as a personal attack and calling me
               | names for it.
        
               | paulmd wrote:
               | Actually to be fully clear, in many cases you have an
               | anti-right: literally not only do you not have a right,
               | but it's illegal to circumvent technological restrictions
               | intended to prevent the thing you want to do.
               | 
               | As noxious as that whole thing is, it's literally the
               | law. I agree the outcome is horrifying of course...
               | stallman was right all along, it's either your device or
               | it's not.
               | 
               | And legally speaking, we have decided it's ok to go with
               | "not".
        
               | rfoo wrote:
               | > You aren't required to develop for iOS.
               | 
                | Sure, as a SWE I'm not going to buy a computer unable
                | to run my own code. A smartphone is an ergonomic
                | portable computer, so I say no to the iPhone, and I'd
                | like to remind others who haven't thought deeply
                | about this to do so.
        
               | nrb wrote:
               | > How is this acceptable at all?
               | 
               | Because as you described, the only alternatives that
               | exist are terrible experiences for basically everyone, so
               | people are happy to pay to license a solution that solves
               | their problems with minimal fuss.
               | 
               | Any number of people could respond to "use Android
               | devices with everything except firmware built from source
               | and signed by myself" with the same question.
        
               | mbreese wrote:
               | _> No, the alternative is Android devices with everything
               | except firmware built from source and signed by myself_
               | 
               | Normal users will not do this. Just because many of the
               | people here can build and sign a custom Android build
               | doesn't mean that is a viable _commercial_ alternative.
                | It is great that it's an option for those of us who
                | can do it, but don't present it as a viable
                | alternative to the iOS/Google ecosystems. The
                | fraction of people who can and will be willing to do
                | this is really small. And even if you can do it, how
                | many people will want to maintain their custom-built
                | OSes?
        
               | rodgerd wrote:
                | > Normal users will not do this.
               | 
               | Unfortunately a lot of the "freedom" crowd think that
               | unless you want to be an 80s sysadmin you don't deserve
               | security or privacy. Or computers.
        
               | muyuu wrote:
               | the main reason the masses don't have privacy and
               | security-centred systems is that they don't demand them
               | and they will trade it away for a twopence or for the
               | slightest increment in convenience
               | 
               | a maxim that seems to hold true at every level of
               | computing is that users will not care about security
               | unless forced into caring
               | 
                | with privacy they may care more, but they are easily
                | conditioned to assume it's there or that nothing can
                | realistically be done about losing it
        
               | notpushkin wrote:
                | I, an engineer, am not doing this myself either.
                | There is a middle ground though: just use a privacy-
                | oriented Android build, like DivestOS. [1]
               | 
               | There are a couple caveats:
               | 
               | 1. It is still a bit tricky for a non-technical person to
               | install. Should not be a problem if they know somebody
               | who can help, though. There's been some progress making
               | the process more user friendly recently (e.g. WebUSB-
               | based GrapheneOS installer).
               | 
               | 2. There are some papercuts if you don't install Google
               | services on your phone. microG [2] helps with most but
               | some still remain. My main concern with this setup is
               | that I can't use Google Pay this way, but having to bring
                | my card with me every time seems like an acceptable
                | trade-off to me.
               | 
               | [1]: https://divestos.org/
               | 
               | [2]: https://microg.org/
        
               | int_19h wrote:
               | The biggest problem with these kinds of setups is usually
               | the banking apps which refuse to run if it's not "safe".
        
               | fsflover wrote:
               | > No, the alternative is Android devices with everything
               | except firmware built from source and signed by myself
               | 
                | I wouldn't bet on this long term, since it fully
                | relies on Google hardware, and Google's long-term
                | strategy is to remove your freedom piece by piece and
                | cash in on it, not to support it.
               | 
               | The real alternative is GNU/Linux phones, Librem 5 and
               | Pinephone, without any ties to greedy, anti-freedom
               | corporations.
        
               | wiseowise wrote:
               | > No, the alternative is Android devices with everything
               | except firmware built from source and signed by myself.
               | And at the same time, being secure, too.
               | 
                | There are people who don't know how to use a file
                | explorer; a new generation is growing up in a world
                | of iPhones without ever seeing a file system. Any
                | other bright ideas?
        
             | cbsmith wrote:
             | Google has also been working on (and provides kits for)
             | local machine learning on mobile devices... and they run on
              | both iOS and Android. The Gemini app does send data to
              | Google for learning, but even that you can opt out of.
             | 
             | Apple's definitely pulling a "Heinz" move with privacy, and
             | it is true that they're doing a better job of it overall,
             | but Google's not completely horrible either.
        
           | nox101 wrote:
            | Their primary business is transitioning to selling
            | services and extracting fees. It's their primary growth
            | driver.
        
             | brookst wrote:
             | Hey, I'm way ahead of Apple. I sell my services to my
             | employer and extract fees from them. Do you extract fees
             | too?
        
               | nox101 wrote:
                | I'm not sure what your point is. My point (which I
                | failed to make clearly) is that Apple's incentives are
                | changing because their growth is dependent on services
                | and extracting fees, so they will likely do things that
                | try to make people dependent on those services and find
                | more ways to charge fees (to users and developers).
               | 
                | Providing services is arguably at odds with privacy,
                | since a service with access to all the data can provide
                | a better service than one without, so there will be
                | tension between providing the best services, fueling
                | their growth, and privacy.
        
               | brookst wrote:
               | I apologize for being oblique and kind of snarky.
               | 
                | My point was that it's interesting how we can frame a
                | service business as "extracting fees" to imply
                | wrongdoing, when it's pretty normal for all services to
                | charge ongoing fees for ongoing delivery.
        
               | ClumsyPilot wrote:
               | It's about the money, it's about perverse incentives and
               | propensity of service businesses to get away with unfair
               | practices. We have decent laws about your rights as a
               | consumer when you buy stuff, but like no regulation of
               | services
        
               | brookst wrote:
               | There is tons of regulation of services? Everything from
               | fraud / false advertising to disclosure of fees to length
               | and terms of contracts. What regulation do you think is
               | missing?
               | 
               | And as someone who presumably provides services for a
               | living, what additional regulations would you like to be
               | subject to?
        
             | adamomada wrote:
             | So the new iPad & M4 was just some weekend project that
             | they shrugged and decided to toss over to their physical
             | retail store locations to see if anyone still bought
             | physical goods eh
        
           | stouset wrote:
            | Nothing is true forever. Google didn't stay non-evil
            | forever, and Apple won't value privacy forever.
           | 
           | Until we figure out how to have guarantees of forever, the
           | best we can realistically do is evaluate companies and their
           | products by their behavior _now_ weighted by their behavior
           | in the past.
        
           | klabb3 wrote:
           | > but it is about selling luxury hardware.
           | 
            | Somewhat true, but things are changing. While there are
            | plenty of "luxury" Apple devices like the Vision Pro or
            | fully decked out MacBooks used for web browsing, we no
            | longer live in a world where tech products are just
            | lifestyle gadgets. People spend hours a day on their
            | phones, and often run their lives and businesses through
            | them. Even with the $1000+/2-3y price tag, it's simply not
            | that much given how central a role the phone serves in
            | your life. This is especially true for younger generations
            | who often don't have laptops or desktops at home, and also
            | increasingly in poorer-but-not-poor countries (e.g.
            | Eastern Europe). So the iPhone (their best selling
            | product) is far, far, far more a commodity utility than
            | typical luxury consumption like watches, purses, sports
            | cars etc.
           | 
           | Even in the higher end products like the MacBooks you see a
           | lot of professionals (engineers included) who choose it
           | because of its price-performance-value, and who don't give a
            | shit about luxury. Especially since the M1 launched, when
            | performance and battery life took a giant leap.
        
             | _the_inflator wrote:
             | I disagree.
             | 
              | Apple is selling hardware, and scaling AI by utilizing
              | that hardware is simply a smart move.
              | 
              | Instead of building huge GPU clusters and having to deal
              | with NVIDIA for GPUs (Apple kicked NVIDIA out years ago
              | because of disagreements), Apple is building mainly on
              | existing hardware.
              | 
              | In other terms, this is utilizing CPU power.
              | 
              | On the other hand, this helps their marketing keep price
              | points high, since Apple can now differentiate its CPU
              | power, and therefore hardware prices, by the AI
              | functionality that correlates with CPU power. This is
              | also consistent with Apple stopping the MHz comparisons
              | years ago.
        
               | klabb3 wrote:
               | Did you reply to the right comment? Feels like we're
               | talking about different things altogether.
        
               | bingbingbing777 wrote:
               | What AI is Apple scaling?
        
               | bionhoward wrote:
                | I've seen MLX folks post on X about nice results
                | running local LLMs. https://github.com/ml-explore/mlx
                | 
                | Also Siri. And consider: you're scaling AI on Apple's
                | hardware, too. You can develop your own local custom AI
                | on it; there's more memory available for linear algebra
                | in a maxed-out MBP than in the biggest GPUs you can
                | buy.
                | 
                | They scale the VRAM capacity with unified memory, and
                | that plus a ton of software is enough to make the Apple
                | stuff plenty competitive with the corresponding NVIDIA
                | stuff for the specific task of running big AI models
                | locally.
        
             | coffeebeqn wrote:
              | Engineers use MacBook Pros because they're the best-
              | built laptops, with the best screens and arguably the
              | best OS, and most importantly - they're not the ones
              | paying for them.
        
               | BobbyTables2 wrote:
               | I have one and hate it with a passion. A MacBook Air
               | bought new in the past 3 years should be able to use
               | Teams (alone) without keeling over. Takes over a minute
               | to launch Outlook.
               | 
               | My 15 year old Sony laptop can do better.
               | 
               | Even if Microsoft on Mac is an unmitigated dumpster fire,
               | this is ridiculous.
               | 
               | I avoid using it whenever possible. If people email me,
               | it'd better not be urgent.
        
               | riddlemethat wrote:
               | I avoid using Outlook on any device, but I wouldn't
               | complain about my Surface tablet's performance based on
               | how poorly iTunes performs...
        
               | josephg wrote:
               | Is it an Apple silicon or Intel machine? Intel macs are
               | crazy slow - especially since the most recent few
               | versions of macOS. And especially since developers
               | everywhere have upgraded to an M1 or better.
        
               | saagarjha wrote:
               | No MacBook Air from the last 3 years is Intel-based
        
               | josephg wrote:
                | You could certainly still buy new Intel MacBooks 3
                | years ago from Apple. Plenty of people did -
                | particularly given a lot of software was still running
                | through Rosetta at the time.
                | 
                | The M1 Air was only released in November 2020. With a
                | bit of slop in the numbers, it's very possible the
                | parent poster bought an Intel Mac just before the M1
                | launched.
        
               | jlarcombe wrote:
               | Yeah it's such a shame how much the performance has been
                | affected by recent macOS. I kept my 2019 MacBook Pro on
               | Catalina for years because everyone else was
               | complaining... finally upgraded directly to Sonoma and
               | the difference in speed was night and day!
        
               | rootusrootus wrote:
               | Sounds a bit like my Intel MBP, in particular after they
               | (the company I work for) installed all the lovely
               | bloatware/tracking crap IT thinks we need to be subjected
               | to. Most of the day the machine runs with the fans
               | blasting away.
               | 
               | Still doesn't take a minute to launch Outlook, but I
               | understand your pain.
               | 
               | I keep hoping it will die, because it would be replaced
               | with an M-series MBP and they are way, way, WAY faster
               | than even the best Intel MBP.
        
               | WWLink wrote:
               | > Even if Microsoft on Mac is an unmitigated dumpster
               | fire, this is ridiculous.
               | 
               | It is Microsoft. I could rant all day about the dumpster
               | fire that is the "NEW Microsoft Teams (Work or School)"
               | 
               | It's like the perfect shining example of how MS doesn't
               | give a flaming fuck about their end users.
        
               | larkost wrote:
                | I will pile on MS Teams. I am on a Mac and
               | periodically have to fight it because it went offline on
               | me for some reason and I am no longer getting messages.
               | Slightly less annoying is when my iPhone goes to sleep
               | and Teams on my iPhone then sets my status to "Away",
               | even though I am actively typing on Teams on my computer.
               | 
               | And while my particular problems might be partially
               | because I am on MacOS, I observe Windows-using colleagues
               | have just as many problems joining meetings (either total
               | refusal, no audio, or sharing issues). So I think using
               | Teams as a measure of any computer is probably not
               | warranted.
        
               | bigboy12 wrote:
               | I suppose you like bloatware and ads in your taskbar and
               | 49 years of patch Tuesday. Have fun with that. I'll take
                | Mac over any Windows.
        
               | 486sx33 wrote:
                | Outlook (old) is okay on Mac. Teams is a dumpster fire
                | on every platform.
        
               | safety1st wrote:
                | Meanwhile here I am, running Linux distros and XFCE on
               | everything. My hardware could be a decade old and I
               | probably wouldn't notice.
               | 
               | (In fact I DO have a spare 13 year old laptop hanging
               | around that still gets used for web browsing, mail and
               | stuff. It is not slow.)
        
               | freedomben wrote:
               | Indeed, I have a 15-year-old desktop computer that is
               | still running great on Linux. I upgraded the RAM to the
               | maximum supported by the motherboard, which is 8 GB, and
               | it has gone through three hard drives in its life, but
               | otherwise it is pretty much the same. As a basic web
               | browsing computer, and for light games, it is fantastic.
        
               | safety1st wrote:
               | It also performs pretty well for the particular brand of
               | web development I do, which basically boils down to
               | running VS Code, a browser, and a lot of ssh.
               | 
               | It's fascinating to me how people are still attached to
               | the hardware upgrade cycle as an idea that matters, and
                | yet for a huge chunk of people and scenarios, basically
                | an SSD, 8 GB of RAM and an Intel i5 from a decade ago
                | could have been the end of computing history with no
                | real loss to productivity.
               | 
               | I honestly look at people who use Apple or Windows with a
               | bit of pity, because those ecosystems would just give me
               | more stuff to worry about.
        
               | kolinko wrote:
                | That's not an issue with MacBooks but with MS. MS has
                | an incentive to deliver such a terrible experience on
                | Macs.
               | int_19h wrote:
               | MS has literally thousands of managers running Outlook
               | and Teams on their company-provided ARM MacBooks daily.
        
               | kagakuninja wrote:
               | Teams is shit, and hangs and crashes on my Mac. I blame
               | Microsoft for that.
        
               | al_borland wrote:
                | And they can typically set up their dev environment
               | without a VM, while also getting commercial app support
               | if they need it.
               | 
               | Windows requires a VM, like WSL, for a lot of people, and
               | Linux lacks commercial support. macOS strikes a good
               | balance in the middle that makes it a pretty compelling
               | choice.
        
               | ensignavenger wrote:
               | There are a plethora of companies offering commercial
               | support for various Linux distributions.
        
               | al_borland wrote:
               | I was thinking more about software like the Adobe suite,
               | Microsoft Office, or other closed source software that
               | hasn't released on Linux. Electron has made things a bit
                | better, but there are still a lot of big gaps for the
               | enterprise, unless the company is specifically choosing
               | software to maintain Linux support for end users.
               | 
               | Sure, Wine exists, but it's not something I'd want to
               | rely on for a business when there are alternatives like
               | macOS which will offer native support.
        
               | rob74 wrote:
                | Most people don't need the Adobe suite, and the web
                | version of M$ Office is more than OK for occasional
                | use. Most other enterprise software is a web app too
                | nowadays, so it's much less relevant what OS your
                | machine is running than it was ten years ago...
        
               | seec wrote:
               | Yep, that's pretty much it.
               | 
                | Apple fanboys like to talk about how cool and long-
                | lasting a MacBook Air is, but a $500 Chromebook will do
                | just as well while covering pretty much 90% of the use
                | cases. Sure, the top-end power is much lower, but at
                | the same time, considering the base RAM/storage combo
                | Apple gives, it is not that relevant. If you start
                | loading it up, that puts the pricing in an entirely
                | different category, and in my opinion the MacBook Air
                | becomes seriously irrelevant when compared to serious
                | computing devices in the same price range...
        
               | close04 wrote:
               | There's still a huge market for people who want higher
               | end hardware and to run workloads locally, or put a
               | higher price on privacy. For people who want to keep
               | their data close to their chest, and particularly now
               | with the AI bloom, being able to perform all tasks on
               | device is more valuable than ever.
               | 
               | A Chromebook "does the job" but it's closer to a thin
               | client than a workstation. A lot of the job is done
               | remotely and you may not want that.
        
               | nolist_policy wrote:
                | Not at all, a Chromebook lets you run Linux apps. I can
                | run full-blown IDEs locally without problems. And yes,
                | that is with 8 GB of RAM; ChromeOS has superb memory
                | management.
        
               | hollerith wrote:
               | Since the full blown IDE is running in a Linux VM, don't
               | you mean, "Linux has superb memory management"?
        
               | nolist_policy wrote:
                | Well, Google developed and deployed MGLRU to
                | Chromebooks long before upstreaming it. Plus they use
                | some magic to check the MGLRU working set size inside
                | the VMs and balance everything.
        
               | hollerith wrote:
               | Now I see. Interesting. (I'm planning to switch to
               | ChromeOS, BTW.)
        
               | pokerface_86 wrote:
                | What Chromebooks come with a mini-LED HDR screen and
                | insane battery life? I'd love to know.
        
               | nolist_policy wrote:
               | No mini LED, but you can configure the HP Elite Dragonfly
               | Chromebook with a 1000 nits IPS display.
               | 
               | And AFAIK, Google dictates 10+h of battery life with
               | mixed web browsing for all Chromebooks.
        
               | pokerface_86 wrote:
               | 1000 nits is useless without HDR.
        
               | pokerface_86 wrote:
               | and backlight control
        
               | gytdev wrote:
                | Excel is essential; in most businesses I've worked
                | with, most of the accounting and business side runs on
                | it. I switched to Windows from Linux just because of
                | Excel when WSL came out. If Linux had Excel and
                | Photoshop it would be a no-brainer to choose it, but
                | that will never happen.
        
               | qalmakka wrote:
               | You usually don't need either for software development
               | though, and if you do the free or online alternatives are
               | often good enough for the rare occasions you need them.
                | If you are a software developer and you have to spend
                | significant time using Office, it means either you are
                | developing extensions for Office, or your company's
                | management is somewhat lacking and you are forced to
                | handle things you should not (like bureaucracy, for
                | instance).
        
               | al_borland wrote:
               | Where I'm at my email is in Outlook. Having to use the
               | web version sounds annoying. I also end up getting a lot
               | of information in spreadsheets. Having to move all that
               | to the online version to open also sounds annoying. The
               | online version is also more limited, which could lead to
               | issues.
               | 
               | I could see a front end dev needing Photoshop for some
               | things, if they don't have a design team to give them
               | assets.
               | 
                | There is also security software the company says
                | laptops must have which isn't available for Linux.
                | They only buy and deploy this stuff with Windows and
                | macOS in mind.
               | 
               | A couple weeks ago on HN I saw someone looking for a
               | program to make a demo of their app (I think). The
               | comments were filled with people recommending an app on
               | macOS that was apparently far and away the best option,
               | and many were disappointed by the lack of availability
               | elsewhere. I find there are a lot of situations like
               | this, where I might be able to get the job done on
               | another OS, but the software I actually want to use is on
               | macOS. Obviously this one is a matter of taste to some
               | degree.
               | 
                | It's not as big an issue as it was 20 years ago, but
                | it's still an issue in many environments.
        
               | wojciii wrote:
                | I use Linux for work, with the MS apps used in a
                | browser. I use one specific app via remote desktop...
                | also in a browser.
               | 
               | So this can be done. I don't expect the IT support to
               | help me with any Linux issues.
               | 
               | My excuse for using Linux? It makes me more effective at
               | developing software.
        
               | overgard wrote:
                | If you mean WSL for containers, macOS needs a VM too.
                | If you're doing C++, macOS dev tools are... bleak.
                | Great for webdev though.
        
               | al_borland wrote:
               | WSL for normal stuff. My co-worker is on Windows and had
                | to set up WSL to get a linter working with VS Code. It
               | took him a week to get it working the first time, and it
               | breaks periodically, so he needs to do it all over again
               | every few months.
        
               | trimethylpurine wrote:
               | I'm developing on Windows for Windows, Linux, Android,
               | and web, including C, Go, Java, TSQL and MSSQL
               | management. I do not necessarily need WSL except for C.
               | SSH is built directly into the Windows terminal and is
               | fully scriptable in PS.
               | 
               | WSL is also nice for Bash scripting, but it's not
               | necessary.
               | 
                | It is a check box in the "Add Features" panel. There is
                | nothing to install or set up. Certainly not for
                | linting, unless, again, you're using a Linux toolchain.
               | 
               | But if you are, just check the box. No setup beyond VS
               | Code, bashrc, vimrc, and your tool chain. Same as you
               | would do on Mac.
               | 
               | If anything, all the Mac specific quirks make setting up
               | the Linux tool chains much harder. At least on WSL the
               | entire directory structure matches Linux out of the box.
               | The tool chains just work.
               | 
                | While some of the documentation is in its infancy, the
                | workflow and versatility of cross-platform development
                | on Windows is, I think, unmatched.
        
               | davrosthedalek wrote:
                | This. I have to onboard a lot of students to our
                | analysis toolchain (Nuclear Physics, ROOT-based, C++).
                | 10 years ago I prayed that the student had a Mac,
                | because it was so easy. Now I pray they have Windows,
                | because of WSL. The toolchain is all compiled from
                | source. Pretty much every major version, and often also
                | minor versions, of macOS breaks the compilation of
                | ROOT. I had several release upgrades of Ubuntu that
                | only required a recompile, if that, and it always
                | worked.
        
               | int_19h wrote:
               | Unless he is doing Linux development in the first place,
               | that sounds very weird. You most certainly don't need to
               | set up WSL to lint Python or say JS in VSCode on Windows.
        
               | pjmlp wrote:
               | As Windows/UNIX developer, I only use WSL for Linux
               | containers.
        
               | overgard wrote:
                | That sounds wild; you can run bash and Unix utils on
                | Windows with minimal fuss without WSL. Unless that
                | linter truly needed Linux (and I mean, VS Code
                | extensions are TypeScript...), that sounds like
                | overkill.
        
               | Noumenon72 wrote:
               | Don't you need Cygwin or Git Bash if you don't use WSL?
               | That's kind of fussy.
        
               | repelsteeltje wrote:
                | This!
                | 
                | I would love to buy Apple hardware, but not from Apple.
                | I mean: an M2 13-inch notebook with the ability to
                | swap/extend memory and storage, a regular US keyboard
                | layout, and a proper desktop Linux (Debian, Alpine,
                | Mint, Pop!_OS, Fedora Cinnamon) or Windows. macOS and
                | the Apple ecosystem just get in your way when you're
                | trying to maintain a multi-platform C++/Java/Rust code
                | base.
        
               | pineaux wrote:
                | WSL is not a VM. Edit: TIL WSL2 is a VM. I develop on
                | Mac and Linux computers, so I should have kept my mouth
                | shut anyway.
        
               | worthless-trash wrote:
                | Hey, you learned and corrected yourself, don't be so
                | hard on yourself, mate.
        
               | tomcam wrote:
               | Username highly inaccurate ;)
        
               | ddingus wrote:
                | Seriously! I agree. They just modeled the kind of
                | discussion with some of the highest value there is.
                | 
                | Being wrong is no big deal. Being unable to be right is
                | often a very big deal.
        
               | shandor wrote:
                | Just to make sure your TIL is complete, do note that
                | Linux containers _also_ run in a VM on macOS :)
        
               | forty wrote:
               | What do you mean without a VM? I guess you don't count
               | docker/podman as VMs then?
        
               | BytesAndGears wrote:
                | Likely because most devs want to use Unix tools --
                | terminal, etc.
        
               | dualboot wrote:
               | Those aren't VMs -- they're containers.
        
               | Hasu wrote:
                | Only on Linux - on macOS and Windows, you have to use
                | virtualization for containers.
        
               | anArbitraryOne wrote:
               | And the M1 chip on mine really alters productivity. Every
               | time we want to update a library, we need some kind of
               | workaround.
               | 
                | It's great having a chip that is so different from what
                | our production infrastructure uses.
        
               | gibolt wrote:
               | This should be a temporary problem solved with time. The
               | battery and performance gains are completely worth most
               | workarounds required.
        
               | anArbitraryOne wrote:
               | Not worth it at all. I rarely use battery power, so I'd
               | rather have an intel or AMD chip with more cores and a
               | higher clock speed at the expense of the battery. Oh, and
               | an OS that can actually manage its windows, and customize
               | keyboard settings, and not require an account to use the
               | app store
        
               | nxicvyvy wrote:
               | This hasn't been true for a long time.
        
               | resonious wrote:
                | M* has caused nothing but trouble for most Mac-using
                | engineers I know (read: most engineers I know) who
                | upgraded. Now not only are they building software for a
                | different OS, they're building for a different
                | architecture! They do all of their important compute in
                | Docker, wasting CPU cycles and memory on the VM. All
                | for what: a nice case? A nice UI (that pesters you to
                | try Safari)?
               | 
               | It looks like Apple's silicon and software is really good
               | for those doing audio/video. Why people like it for dev
               | is mostly a mystery to me. Though I know a few people who
               | don't really like it but are just intimidated by Linux or
               | just can't handle the small UX differences.
        
               | AtomicOrbital wrote:
                | I strongly suggest putting in the time to learn how to
                | install and maintain a Linux laptop... Ubuntu 24.04 is
                | a great engineering platform.
        
               | daviddever23box wrote:
               | It is, provided that the hardware vendor has reasonably
               | decent support for power management, and you're willing
               | to haul around an AC adapter if not. In general, I really
               | like AMD hardware with built-in graphics for this, or
               | alternately, Intel Tiger Lake-U based hardware.
               | 
               | Asahi Linux is shockingly great on Apple Silicon
               | hardware, though.
        
               | spullara wrote:
                | 1) Macs are by far the best hardware, and performance
                | running Intel code is faster than it was on the
                | previous Intel Macs:
                | https://discourse.slicer.org/t/hardware-is-apple-m1-much-
                | fas...
                | 
                | 2) They should use Safari to keep power usage low and
                | browser diversity high.
        
               | khaki54 wrote:
                | It's basically required for iOS development. Working
                | around it is extremely convoluted and annoying.
        
               | resonious wrote:
               | I forgot to mention that as an obvious exception. Of
               | course developing for Apple is best on Apple hardware.
        
               | niij wrote:
               | In my experience as a backend services Go developer (and
               | a bit of Scala) the switch to arm has been mostly
                | seamless. There was a little config at the beginning to
                | pull multi-arch Docker images (x64 and arm), but that
                | was a one-time configuration. Otherwise I'm still
                | targeting
               | Linux/x64 with Go builds and Scala runs on the JVM so
               | it's supported everywhere anyway; they both worked out of
               | the box.
               | 
               | My builds are faster, laptop stays cooler, and battery
               | lasts longer. I love it.
               | 
               | If I was building desktop apps I assume it would be a
               | less pleasant experience like you mention.
        
               | phlakaton wrote:
               | The pain for me has been in the VM scene, as VirtualBox
               | disappeared from the ecosystem with the switch to ARM.
        
               | herval wrote:
               | I don't know a single engineer who had issues with M
               | chips, and most engineers I know (me included) benefited
               | considerably from the performance gains, so perhaps your
               | niche isn't that universal?
        
               | resonious wrote:
               | My niche is Ruby on Rails web dev, which is definitely
               | not universal, but not all that narrow either!
        
               | fiddlerwoaroof wrote:
                | You must have an unusual setup because, between Rosetta
                | and Rosetta in Virtualization.framework VMs
                | (configurable in Docker Desktop or Rancher Desktop),
                | I've never had issues running Intel binaries on my Mac.
        
               | Lio wrote:
                | I'm doing Ruby on Rails dev too. I don't notice a huge
                | difference between macOS and Linux for how I work.
                | 
                | There are quirks to either OS.
                | 
                | E.g. on GNOME it drives me mad that it won't focus
                | recently launched apps.
                | 
                | On macOS it annoys me that I have to install a 3rd-
                | party util to move windows around.
                | 
                | Meh, you just adapt after a while.
        
               | herval wrote:
               | what's wrong w/ Rails on M chips? I don't recall having
               | had much trouble with it (except w/ nokogiri bindings
               | right when the M1 was first available, but that's a given
               | for any new release of OSX)
        
               | jetpks wrote:
                | I'm an engineer that has both an Apple silicon laptop
                | (MBP, M2) and a Linux laptop (Arch, ThinkPad X1 Yoga).
                | I choose the Mac every day of the week and it's not
                | even close. I'm sure it's not great for specific
                | engineering disciplines, but for me (web, Rails, SRE)
                | it really can't be beat.
                | 
                | The UX differences are absolutely massive. Even after
                | daily-driving that ThinkPad for months, GNOME always
                | felt kinda not quite finished. Maybe KDE is better, but
                | it didn't have Wayland support when I was setting that
                | machine up, which made it a non-starter.
                | 
                | The real killer though is battery life. I can work
                | literally all day unplugged on the MBP and finish up
                | with 40-50% remaining. When I'm traveling these days, I
                | don't even bring a power cable with me during the day.
                | The ThinkPad, despite my best efforts with powertop,
                | the most aggressive frequency scaling I could get, and
                | a bunch of other little tricks, lasts 2 hours.
                | 
                | There are niceties about Linux too. Package management
                | is better and the Docker experience is _way_ better.
                | Overall though, I'd take the Apple silicon MacBook 10
                | times out of 10.
        
               | resonious wrote:
               | Interesting. I do similar (lots of Rails) but have pretty
               | much the opposite experience (other than battery life -
               | Mac definitely wins there). Though I use i3/Sway more
               | than Gnome. The performance of running our huge monolith
               | locally is much better for Linux users than Mac users
               | where I work.
               | 
                | I used a Mac for a while back in 2015 but it never
                | really stood out to me UX-wise, even compared to GNOME.
                | All I
               | really need to do is open a few windows and then switch
               | between them. In i3 or Sway, opening and switching
               | between windows is very fast and I never have to drag
               | stuff around.
        
               | xarope wrote:
                | I'd like to offer a counterpoint: I have an oldish
                | T480s which runs Linux Mint and several LXD containers
                | for traefik, golang, python, postgres and sqlserver
                | (so not even dockerized, but full VMs running these
                | services), and I can go the whole morning (~4-5 hours).
                | 
                | I think the culprit is more likely the power-hungry
                | Intel CPU in your Yoga.
                | 
                | Going on a slight tangent: I've tried but do not like
                | the Mac keyboards; they feel very shallow to me, hence
                | why I'm still using my old T480s. The newer ThinkPad
                | laptop keyboards all seem to be going that way though
                | (going thinner), much to my dismay. Perhaps a P14s is
                | my next purchase, despite its bulk.
                | 
                | Anybody with a Framework 13 want to comment on their
                | keyboard?
        
               | freedomben wrote:
               | I really like the keyboards on my frameworks. I have both
               | the 13 and the new 16, and they are pretty good. Not as
               | good as the old T4*0s I'm afraid, but certainly usable.
        
               | klabb3 wrote:
               | > The thinkpad, [...], lasts 2 hours.
               | 
               | This echoes my experiences for anything that needs power
               | management. Not just that the battery life is worse, but
               | that it _degrades_ quickly. In two years it's barely
               | usable. I've seen this with non-Apple phones and laptops.
                | iPhone otoh is so good these days you don't need to
                | upgrade until EOL at ~6 years (and even if you need a
                | new battery, it's not more expensive than any other
                | proprietary battery). My last MacBook from 2011 failed
                | a couple of
               | years ago only because of a Radeon GPU inside with a
               | known hw error.
               | 
               | > There are niceties about Linux too.
               | 
                | Yes! If you haven't tried in years, the Linux _desktop_
                | experience is awesome (at least close enough) for me -
                | a dev who CAN configure stuff if I need to, but finds
                | it excruciatingly menial if it isn't related to my core
                | work. It's really an improvement from a decade ago.
        
               | pompino wrote:
               | >The UX differences are absolutely massive.
               | 
               | Examples?
        
               | jwells89 wrote:
               | Battery life followed by heat and fan noise have been my
               | sticking points with non-mac laptops.
               | 
                | My first-gen ThinkPad X1 Nano would be an excellent
                | laptop, if it weren't for the terrible battery life
                | even in power save mode (which, as an aside, slows it
                | down a _lot_) and its need to spin up a fan to do
                | something as trivial as driving a rather pedestrian
                | 2560x1440 60 Hz display.
               | 
               | It feels almost like priorities are totally upside down
               | for x86 laptop manufacturers. I totally understand and
               | appreciate that there are performance oriented laptops
               | that aren't supposed to be good with battery life, but
               | there's no good reason for there being so few
               | ultraportable and midrange x86 laptops that have good
               | battery life and won't fry your lap or sound like a jet
               | taking off when pushed a little. It's an endless sea of
               | mediocrity.
        
               | landswipe wrote:
                | This is going to change once Linux on Arm becomes a
                | thing with Qualcomm's new jazz. I am mostly tethered to
                | a dock with multiple screens. I have been driving
                | Ubuntu full time for work for over 4 years now.
        
               | freedomben wrote:
                | Interestingly enough, the trend I am seeing is all the
                | MacBook engineers moving back to native development
                | environments - basically, no longer using Docker. And
                | just as expected, developers are getting worse with
                | Docker and are finding it harder to use. They are
                | getting more and more reliant on devops help, or they
                | lean on the team member who is on Linux to handle all
                | of that stuff. We were on a really great path for a
                | while there in development, where we were getting
                | closer to the ideal of having development more closely
                | resemble production, and to having developers
                | understand the operations tools. Now we're cruising
                | firmly in the opposite direction because of this Apple
                | switch to ARM. It wouldn't bother me so much if people
                | would recognize that they are rationalizing because
                | they like the computers, but they don't. They just try
                | to defend logically a decision they made emotionally. I
                | do it too, every human does, but a little recognition
                | would be nice.
        
               | int_19h wrote:
               | It's not even a problem with MacBooks as such. They are
               | still excellent consumer devices (non-casual gaming
               | aside). It's this weird positioning of them as the
               | ultimate dev laptop that causes so many problems, IMO.
        
               | wiseowise wrote:
                | Why would an excellent machine be blamed for shitty
                | software?
        
               | int_19h wrote:
                | Because machines are tools meant to perform tasks, and
                | part of that is being interoperable with other tools
                | and de facto standards in the relevant field. For dev
                | work today, the MacBook is not good at that.
        
               | daviddever23box wrote:
               | Remember, though, that the binaries deployed in
               | production environments are not being built locally on
               | individual developer machines, but rather in the cloud,
               | as reproducible builds securely deployed from the cloud
               | to the cloud.
               | 
                | Modern language tooling (Go, Rust et al) allows one to
                | build and test on any architecture, and the native
                | macOS virtualization
                | (https://developer.apple.com/documentation/virtualization)
                | provides remarkably better performance compared to
                | Docker (which is a better explanation for its fading
                | from daily use).
               | 
               | Your "trend" may, in fact, not actually reflect the
               | reality of how cloud development works at scale.
               | 
               | And I don't know a single macOS developer that "lean(s)
               | on the team member who is on Linux" to leverage tools
               | that are already present on their local machine. My own
               | development environments are IDENTICAL across all three
               | major platforms.
        
               | tsimionescu wrote:
                | Virtualization and Docker are orthogonal technologies.
               | The reason you use docker, especially in dev, is to have
               | the exact same system libraries, dependencies, and
               | settings on each build. The reason you use virtualization
               | is to access hardware and kernel features that are not
               | present on your hardware or native OS.
               | 
               | If you deploy on docker (or Kubernetes) on Linux in
               | production, then ideally you should be using docker on
               | your local system as well. Which, for Windows or MacOS
               | users, requires a Linux VM as well.
        
               | daviddever23box wrote:
               | It seems that you're trying to "educate" me on how
               | containers and virtualization work, when in fact I've
               | been doing this for a while, on macOS, Linux and Windows
               | (itself having its own Hyper-V pitfalls).
               | 
               | I know you mean well, though.
               | 
               | There is no Docker on macOS without a hypervisor layer -
               | period - and a VM, though there are multiple possible
               | container runtimes not named Docker that are suitable for
               | devops-y local development deployments (which will
               | always, of course, be constrained in comparison to the
               | scale of lab / staging / production environments). Some
               | of these can better leverage the Rosetta 2 translation
               | layer that Apple provides, than others.
        
               | tsimionescu wrote:
                | I'm sorry that I came across as patronizing; I was
                | trying to explain my confusion and thought process
                | rather
               | than to teach you about virtualization and containers.
               | 
               | Specifically what confused me in your comment was that
               | you were saying Docker on Mac was superseded by their new
               | native virtualization, which just doesn't make sense to
               | me, for the reasons I was bringing up. I still don't
               | understand what you were trying to say; replacing docker
               | with podman or containerd or something else still doesn't
               | have anything to do with virtualization or Rosetta, or at
               | least I don't see the connection.
               | 
               | I should also say that I don't think anyone really means
               | specifically docker when they talk about it, they
               | probably mean containerization + image repos in general.
        
               | eloisant wrote:
                | We have to cross-compile anyway, because now we're
                | deploying to arm64 Linux (AWS Graviton) in addition to
                | x86 Linux.
                | 
                | So even if all the developers on your team are using
                | Linux, unless you want to waste money by ignoring arm64
                | cloud instances, you'll have to set up cross-
                | compilation.
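                | 
                | For Go, at least, that setup is mostly a matter of
                | environment variables. A minimal sketch, driving the
                | standard GOOS/GOARCH toolchain switches from Python
                | purely for illustration (the package path and binary
                | names are hypothetical; the Go toolchain is assumed to
                | be on PATH):
                | 
                |   import os
                |   import subprocess
                | 
                |   # Build the same package for both deployment targets:
                |   # x86 instances and arm64 (Graviton) instances.
                |   targets = [("linux", "amd64"), ("linux", "arm64")]
                | 
                |   for goos, goarch in targets:
                |       env = {**os.environ, "GOOS": goos,
                |              "GOARCH": goarch,
                |              "CGO_ENABLED": "0"}  # pure-Go static build
                |       out = f"myservice-{goos}-{goarch}"  # hypothetical
                |       subprocess.run(
                |           ["go", "build", "-o", out, "./cmd/myservice"],
                |           env=env, check=True)
                |       print("built", out)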
        
               | wubrr wrote:
               | No, no, no, yes.
        
               | synergy20 wrote:
               | no, no, NO and yes.
               | 
                | I actually rejected a job offer when I heard I would be
                | given a MacBook Pro.
                | 
                | Apple, being the most closed company these days, should
                | be avoided as much as you can, not to mention that its
                | macOS is useless for Linux developers like me; anything
                | else is better.
                | 
                | Its keyboard is dumb to me (that stupid command/ctrl
                | key difference), and the fact that you can not even
                | mouse-select and paste is enough for me to avoid macOS
                | at all costs.
        
               | 486sx33 wrote:
                | I think I had similar feelings, but I kept an open mind
                | and love my M2 Pro. Sometimes an open mind reaps
                | rewards, friend.
        
               | gibolt wrote:
               | I selected Mac + iOS devices when a job offered a choice,
               | specifically to try out the option, while personally
               | sticking with Windows and Android.
               | 
                | Now the performance of Mx Macs has convinced me to
                | switch, though I'll die on the hill of Android for
                | life.
        
               | insaneirish wrote:
               | > I actually rejected a job offer when heard I will be
               | given a macbook pro.
               | 
               | Probably best for you both.
        
               | fastball wrote:
                | macOS is clearly better for Linux devs than Windows,
                | given it is Unix under the hood.
               | 
               | I don't even know what you mean by mouse-select and
               | paste.
        
               | BenjiWiebe wrote:
               | On most Linux environments: text you highlight with the
               | mouse (or highlight by double/triple clicking) can be
               | "pasted" by middle-clicking.
        
               | Cyphase wrote:
               | And it's a separate clipboard from Ctrl+C/right-click-
               | and-copy. The number of times I miss that on non-Linux...
        
               | Lio wrote:
               | Personally, I use tmux on both Linux and macOS to get
               | multiple clipboards and the mouse behaviour I'm used to.
        
               | Reason077 wrote:
               | > _" I don't even know what you mean by mouse-select and
               | paste."_
               | 
               | Presumably they mean linux-style text select & paste,
               | which is done by selecting text and then clicking the
               | middle mouse button to paste it (no explicit "copy"
               | command).
               | 
               | macOS doesn't have built-in support for this, but there
               | are some third-party scripts/apps to enable it.
               | 
               | For example: https://github.com/lodestone/macpaste
        
               | int_19h wrote:
               | On Windows these days, you get WSL, which is actual
                | Linux, kernel and all. There are still some differences
                | from a standalone Linux system, but they are far
                | smaller than with macOS, where not only is the kernel
                | completely
               | different, but the userspace also has many rather
               | prominent differences that you will very quickly run
               | afoul of (like different command line switches for the
               | same commands).
               | 
               | Then there's Docker. Running amd64 containers on Apple
               | silicon is slow for obvious reasons. Running arm64
               | containers is fast, but the actual environment you will
               | be deploying to is almost certainly amd64, so if you're
               | using that locally for dev & test purposes, you can get
               | some surprises in prod. Windows, of course, will happily
               | run amd64 natively.
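                | 
                | A quick way to see the native-vs-emulated split is to
                | pin the platform explicitly. A minimal sketch using the
                | docker Python SDK (docker-py), assuming a version
                | recent enough to support the platform argument and an
                | emulation layer installed for the non-native image:
                | 
                |   import docker
                | 
                |   client = docker.from_env()
                | 
                |   # Same image, two platforms: the native one runs
                |   # directly; on Apple silicon the amd64 one goes
                |   # through emulation (hence the slowdown).
                |   for platform in ("linux/arm64", "linux/amd64"):
                |       logs = client.containers.run(
                |           "alpine:3.19", "uname -m",
                |           platform=platform, remove=True)
                |       print(platform, "->", logs.decode().strip())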
        
               | paulmd wrote:
               | > the actual environment you will be deploying to is
               | almost certainly amd64
               | 
                | That's up to your team of course, but Graviton is
                | generally cheaper than x86 instances nowadays, and
                | AFAIK the same is true on Google and the other clouds.
        
               | aforwardslash wrote:
                | Arm is an ISA, not a family of processors. You may
                | expect Apple chips and Graviton to be wildly different,
                | and to perform completely differently in the same
                | scenario. In fact, most Arm CPUs also have specific
                | extensions that are not found in other manufacturers'
                | chips. So yes, while both recognize a base set of
                | instructions, that's about it - expect that everything
                | else is different. I know, amd64 is also technically an
                | ISA, but there you have 2 major manufacturers with very
                | similar and predictable performance characteristics.
                | And even then, sometimes something on AMD behaves quite
                | differently from Intel.
                | 
                | For most devs, doing CRUD stuff or writing high-level
                | scripting languages, this isn't really a problem. For
                | some devs, working on time-sensitive problems or with
                | strict baseline performance requirements, this is
                | important. For devs developing device drivers,
                | emulation can only get you so far.
        
               | paulmd wrote:
               | What are you responding to here?
               | 
                | No, I said you won't always be deploying on amd64,
                | because arm64 is now the cheapest option and generally
                | faster than the Sandy Bridge vCPU unit that amd64
                | instances are indexed against (and really, constrained
                | to, intentionally, by AWS).
               | I never said anything about graviton not being arm64.
        
               | aforwardslash wrote:
                | It's not about price, it's about compatibility. Just
                | because software compiles on a different ISA doesn't
                | mean it behaves the same way. But if that isn't obvious
                | to you, good for you.
        
               | Reason077 wrote:
               | > _" userspace also has many rather prominent differences
               | ... (like different command line switches for the same
               | commands)."_
               | 
                | Very quickly remedied by installing the GNU versions of
                | those commands, e.g. "brew install coreutils
                | findutils", etc.
               | 
               | Then you'll have exactly the same command line switches
               | as on Linux.
        
               | ekimekim wrote:
               | > I actually rejected a job offer when heard I will be
               | given a macbook pro.
               | 
               | For what it's worth, I've had a good success rate at
               | politely asking to be given an equivalent laptop I can
               | put linux on, or provide my own device. I've never had to
               | outright reject an offer due to being required to use a
               | Mac. At worst I get "you'll be responsible for making our
               | dev environment work on your setup".
        
               | nobleach wrote:
                | I've had 50/50. These days I'm fairly okay with just
                | taking the MacBook Pro. I did have one instance where I
                | got one my first week and used my Dell XPS with Linux
                | for the entire 10 months I was at the place. I returned
                | the MacBook basically unused.
               | 
                | Only one time did I interview with a place where I
                | asked if I'd be given a choice of what hardware/OS I
                | could use.
               | The response was "We use Windows". My response was, "no
               | we do not. Either I will not be using Windows with you,
               | or I will not be using Windows NOT with you". I didn't
               | get an offer. I was cool with it.
        
               | fooblaster wrote:
                | What amazing laptop must an employer give you to not be
                | summarily rejected?
        
               | synergy20 wrote:
                | Anything that runs Linux; even WSL2 is fine. No macOS
                | is the key. And yes, it costs the employer about half
                | as much as the expensive Apple devices, which can not
                | even be upgraded; their hardware is as closed as their
                | software.
        
               | saagarjha wrote:
               | Employers typically also care about costs like "how hard
               | is it to provision the devices" and "how long is the
               | useful life of this" or "can I repurpose an old machine
               | for someone else".
        
               | p_l wrote:
                | Provisioning is a place where Windows laptops win hands
                | down, though.
                | 
                | Pretty much everything that goes wrong with
                | provisioning involves going extra weird on hardware
                | (usually for a cheap supplier) and/or pushing weird
                | third-party "security" crapware.
        
               | wiseowise wrote:
               | > its keyboard is dumb to me(that stupid command/ctrl key
               | difference)
               | 
                | Literally the best keyboard shortcuts out of all major
                | OSes. I don't know what weird crab hands you need to
                | have to comfortably use shortcuts on Windows/Linux. CMD
                | maps PERFECTLY onto my thumb.
        
               | pompino wrote:
               | "Engineers" - ironically the term used in the software
               | industry for people who never standardize anything, who
               | solve the same problems other "engineers" have already
               | solved over and over again (how many libraries do you
               | need for arrays and vectors and GUIs and buttons and
               | text boxes and binary trees and sorting, yada yada?)
               | while making the same mistakes and learning the hard
               | way each time, and who vehemently argue that software
               | is "art" - might like OSX, but even that is debatable.
               | Meanwhile actual Engineers (the ones with the license),
               | the people who need CAD and design tools for building
               | bridges and running manufacturing plants, stay far away
               | from OSX.
        
               | PaulHoule wrote:
               | If you look at creative pros such as photographers,
               | Hollywood 'film' editors, VFX artists, etc., you will
               | see a lot of Windows and Linux, as people are more
               | concerned with getting absolute power at a fair price
               | and don't care if it is big, ugly, etc.
        
               | pompino wrote:
               | Oh, I'm sure there are lots of creatives who use OSX -
               | I don't mean to suggest nobody does, and I'll admit it
               | was a bit in jest, to poke fun at the stereotype. I'm
               | definitely oldschool - but to me it's a bit cringe to
               | hear "Oh, I'm an engineer.." or "As an engineer.." from
               | people who sit at a coffee shop writing emails or doing
               | the most basic s/w dev work. I truly think silicon
               | valley people would benefit from talking to technical
               | people who are building bridges and manufacturing
               | plants and cars and hardware and chips and all this
               | stuff on r/engineeringporn that everyone takes for
               | granted. I transitioned from s/w to hardcore
               | manufacturing 15 years ago, and it was eye opening, and
               | very humbling.
        
               | shykes wrote:
               | "silicon valley people would benefit from talking to
               | people who build chips", that's a good one!
        
               | pompino wrote:
               | It would be funny, if it wasn't also sad to see the
               | decline.
        
               | 2muchcoffeeman wrote:
               | I'd assume a lot of this is because you can't get the
               | software on MacOS. Not a choice. Who is choosing to use
               | Windows 10/11 where you get tabloid news in the OS by
               | default? Or choosing to hide the button to create local
               | user accounts?
        
               | pompino wrote:
               | People overwhelmingly choose windows world-wide to get
               | shit done. That answers the who.
        
               | 2muchcoffeeman wrote:
               | So the same software exists on multiple platforms;
               | there are no legacy or hardware compatibility
               | considerations, no interoperability considerations, no
               | budget considerations; and the users have a choice in
               | what they use?
               | 
               | I.e. the same functionality exists with no drawbacks,
               | and money is no object.
               | 
               | And they chose Windows? Seriously, why?
        
               | pompino wrote:
               | We use the sales metrics and signals available to us.
               | 
               | I don't know what to say except to resign yourself to
               | the fact that the world is fundamentally unfair, and
               | you won't ever get to run the A/B experiment that you
               | want. So yes, Windows it is!
        
               | goosedragons wrote:
               | More choice in hardware. More flexibility in hardware. UI
               | preferences. You can't get a Mac 2 in 1 or a Mac foldable
               | or a Mac gaming notebook or a Mac that weighs less than a
               | kilogram. You can't get a Mac with an OLED screen or a
               | numpad. Some people just prefer the Windows UI too. I
               | usually use Linux but between MacOS and Windows, I prefer
               | the latter.
        
               | steve1977 wrote:
               | Who is choosing to use macOS, where non-Apple monitors
               | and other 3rd party hardware just stops working after
               | minor updates and then starts working again after another
               | update, without any official statement from Apple that
               | there was a problem and a fix?
        
               | mschuster91 wrote:
               | So what, Windows does the same. Printers [1], WiFi [2],
               | VPN [3], Bluetooth devices [4], audio [5] - and that's
               | just stuff I found via auto-completing "windows update
               | breaks" on Google in under 5 minutes.
               | 
               | The only problem is that Apple is even worse at
               | communicating issues than Microsoft is.
               | 
               | [1] https://www.bleepingcomputer.com/news/microsoft/micro
               | soft-wa...
               | 
               | [2] https://www.bleepingcomputer.com/news/microsoft/micro
               | soft-fi...
               | 
               | [3] https://www.bleepingcomputer.com/news/microsoft/micro
               | soft-sa...
               | 
               | [4] https://www.forbes.com/sites/gordonkelly/2019/06/12/m
               | icrosof...
               | 
               | [5] https://www.theregister.com/2022/08/22/windows_10_upd
               | ate_kil...
        
               | steve1977 wrote:
               | The big difference is that Microsoft - at least usually -
               | confirms and owns the issues.
               | 
               | With Apple, it's usually just crickets... nothing in the
               | release notes, no official statements, nothing. It's just
               | trial and error for the users to see if a particular
               | update fixed the issue.
        
               | lupire wrote:
               | That's anti-competitive and frustrating, but not an
               | argument against the value of a pure Apple hardware
               | ecosystem.
        
               | steve1977 wrote:
               | Which was not the point. The question was who would be
               | choosing Windows over macOS. I would and this is one of
               | the reasons why.
        
               | wiseowise wrote:
               | I do. Because for all the issues it has, it is still
               | much better than whatever Windows has to offer.
               | 
               | > where non-Apple monitors and other 3rd party hardware
               | just stops working after minor updates and then starts
               | working again after another update, without any official
               | statement from Apple that there was a problem and a fix?
               | 
               | At least my WiFi doesn't turn off indefinitely during
               | sleep, until I power cycle the whole laptop, because of
               | a shitty driver.
        
               | wiseowise wrote:
               | You seem to have some romanticized notion of engineers
               | and to be deeply offended by someone calling themselves
               | an engineer. Why do you even care if someone sits at a
               | coffee shop writing emails and calls themselves an
               | engineer? You think it somehow dilutes the prestige of
               | the word "engineer"? Makes it less elite, or what?
        
               | pompino wrote:
               | "deeply offended" - My default response to imposters is
               | laughter. Call yourself Lord, King, President, Doctor,
               | Lawyer, whatever - doesn't matter to me. I'd suggest
               | you lighten up.
        
               | lupusreal wrote:
               | They hate you because you speak the truth. Code monkeys
               | calling themselves engineers really is funny.
        
               | zaphirplane wrote:
               | Do you have an engineering degree ?
        
               | pompino wrote:
               | Yes, a bachelors and a masters.
               | 
               | Not that the degree means much - I learnt 90% of what I
               | know on the job. It certainly helped get my foot in the
               | door, through the university brand and alumni network.
               | 
               | You can call yourself anything you want: Doctor,
               | Lawyer, Engineer. I have the freedom to think my own
               | thoughts too.
        
               | cafed00d wrote:
               | I always likened "engineers"[1] to "people who are
               | proficient in calculus"; and "computers"[1] to "people
               | who are proficient at calculations".
               | 
               | There was a brief sidestep from the late 1980s to the
               | early 2010s (~2012) where the term "software engineer"
               | came into vogue and ran completely orthogonal to
               | "proficiency in calculus". I mean, literally 99% of
               | software engineers never learned calculus!
               | 
               | But it's nice to see that ever since ~2015 or so (and
               | perhaps even going forward) proficiency in calculus is
               | rising to the fore. We call those "software engineers"
               | "ML Engineers" nowadays, ehh fine by me. And all those
               | "computers" are not people anymore -- looks like
               | carefully arranged sand (silicon) in metal took over.
               | 
               | I wonder if it's just a matter of time before the
               | carefully-arranged-sand-in-metal form factor will take
               | over the "engineer" role too. One of those Tesla/Figure
               | robots becomes "proficient at calculus" and "proficient
               | at calculations" better than "people".
               | 
               | Reference: [1]: I took the terms "engineer" and
               | "computer" literally out of the movie "Hidden Figures"
               | https://en.wikipedia.org/wiki/Hidden_Figures#Plot
               | 
               | It looks like ever since humankind learned calculus there
               | was an enormous benefit to applying it in the engineering
               | of rockets, aeroplanes, bridges, houses, and eventually
               | "the careful arrangement of sand (silicon)". Literally
               | every one of those jobs required learning calculus at
               | school and applying calculus at work.
        
               | hyperadvanced wrote:
               | Most software engineering just doesn't require calculus,
               | though it does benefit from having the understanding of
               | functions and limit behaviors that higher math does. But
               | if you look at a lot of meme dev jobs they've
               | transitioned heavily away from the crypto craze of the
               | past 5 years towards "prompt engineering" or the like to
               | exploit LLMs in the same way that the "Uber for X" meme
               | of 2012-2017 exploited surface level knowledge of JS or
               | API integration work. Fundamentally, the tech ecosystem
               | desires low skill employees, LLMs are a new frontier in
               | doing a lot with a little in terms of deep technical
               | knowledge.
        
               | techcode wrote:
               | Why point out Calculus, as opposed to just Math?
               | 
               | Might be just my Eastern European background, where it
               | was all just "Math" and both equations (that's Algebra,
               | I guess) and simpler functions/analysis (Calculus?) are
               | taught in elementary school around age 14 or 15.
               | 
               | Maybe I'm missing/forgetting something - I think I used
               | Calculus more during electrical engineering than for
               | computer/software engineering.
        
               | pompino wrote:
               | True, we learnt calculus before college in my home
               | country too - but it was just basic stuff. I learnt a
               | lot more of it, including partial derivatives, in my
               | first year of engineering college.
               | 
               | >I think I used Calculus more during electrical
               | engineering than for computer/software engineering.
               | 
               | I think that was OP's point - most engineering
               | disciplines teach it.
        
               | cafed00d wrote:
               | Yeah computer science went through this weird offshoot
               | for 30-40 years where calculus was simply taught because
               | of tradition.
               | 
               | It was not really necessary through all of the app
               | developer eras. In fact, it's so much the case that
               | many software engineers graduating from 2000-2015 or so
               | work as software engineers without a BS degree.
               | Rather, they could drop the physics & calculus grind
               | and opt for a BA in computer science. They then went on
               | to become proficient software engineers in the
               | industry.
               | 
               | It's only after the recent advances in AI around
               | 2012-2015 that proficiency in calculus became crucial
               | to software engineering again.
               | 
               | I mean, there's a whole rabbit hole of knowledge on the
               | reason why ML frameworks deal with calculating vector-
               | Jacobian or Jacobian-vector products. Appreciating that
               | and their relation to gradient is necessary to design &
               | debug frameworks like PyTorch or MLX.
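               | 
               | For a taste, here's a minimal PyTorch sketch of a
               | vector-Jacobian product (the function f and the vector
               | v are made up purely for illustration):
               | 
               |     import torch
               | 
               |     # Toy function f: R^3 -> R^2
               |     def f(x):
               |         return torch.stack([x[0] * x[1], x[1] * x[2]])
               | 
               |     x = torch.randn(3, requires_grad=True)
               |     v = torch.tensor([1.0, 0.0])  # the "vector" in v^T J
               | 
               |     # Backprop computes v^T J without ever materializing
               |     # the full 2x3 Jacobian J
               |     vjp, = torch.autograd.grad(f(x), x, grad_outputs=v)
               |     print(vjp)  # first row of J: [x[1], x[0], 0]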
               | 
               | Sure, I will concede that a sans-calculus training (BA in
               | Computer Science) can still be sufficiently useful for
               | working as an ML engineer in data analytics,
               | api/services/framework design, infrastructure, systems
               | engineering, and perhaps even inference engineering. But
               | I bet all those people will need to be proficient in
               | calculus the more they have to deal with debugging
               | models.
        
               | KptMarchewa wrote:
               | In my central European university we learned "Real
               | Analysis", which was much more concerned with theorems
               | and proofs than with "calculating" something - if
               | anything, actually calculating derivatives or integrals
               | was a warmup problem before the meat of the subject.
        
               | cafed00d wrote:
               | Calculus, because all of engineering depends critically
               | on the modeling of real world phenomena using ordinary or
               | partial differential equations.
               | 
               | I don't mean to disregard other branches of math -- of
               | course they're useful -- but calculus stands out in
               | specific _applicability_ to engineering.
               | 
               | Literally every single branch of engineering. All of
               | them.
               | Petrochemical engineering to Biotech. They all use
               | calculus as a fundamental block of study.
               | 
               | Discovering new drugs using Pk/Pd modeling is driven by
               | modeling the drug<->pathogen interaction as cycles,
               | using Lotka (predator-prey) models.
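               | 
               | A minimal SciPy sketch of the idea - the coefficients
               | and starting populations below are arbitrary
               | illustrative values:
               | 
               |     import numpy as np
               |     from scipy.integrate import odeint
               | 
               |     # Classic Lotka-Volterra predator-prey equations
               |     def lotka(state, t, a=1.0, b=0.1, c=1.5, d=0.075):
               |         prey, pred = state
               |         return [a * prey - b * prey * pred,
               |                 -c * pred + d * prey * pred]
               | 
               |     t = np.linspace(0, 30, 500)
               |     # Rows of traj are [prey, predator] over time
               |     traj = odeint(lotka, [10.0, 5.0], t)
               | 
               | The populations cycle, which is the behavior being
               | borrowed for the drug<->pathogen dynamics.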
               | 
               | I'm not saying engineers don't need to learn stats or
               | arithmetic. IMO those are more fundamental to _all_
               | fields - janitors, physicians, any field really. But
               | calculus is fundamental to engineering alone.
               | 
               | Perhaps, a begrudging exception I can make is its
               | applications in Finance.
               | 
               | But every other field where people build rockets, cars,
               | airplanes, drugs, or ai robots, you'd need proficiency in
               | calculus just as much as you'd need proficiency in
               | writing or proficiency in arithmetic.
        
               | pompino wrote:
               | Hmm, that is an interesting take. Calculus does seem
               | like the uniting factor.
               | 
               | I've come to appreciate the fact that domain knowledge
               | plays a more dominant role in solving a problem than
               | technical/programming knowledge. I often wonder how s/w
               | could align with other engineering practices by
               | approaching design in a standardized way, so we can
               | just churn out code w/o an excessive reliance on
               | quality assurance. I'm really hoping visual programming
               | is going to be the savior here. It might allow SMEs and
               | domain experts to utilize a visual interface to
               | implement their ideas.
               | 
               | It's interesting how Python dominated C/C++ in the case
               | of the NumPy community. One would have assumed C/C++ to
               | be a more natural fit for performance-oriented code.
               | But domain knowledge overpowered technical knowledge,
               | and eventually people started asking funny questions
               | like
               | 
               | https://stackoverflow.com/questions/41365723/why-is-my-
               | pytho...
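               | 
               | The punchline, roughly, is that idiomatic NumPy
               | dispatches into optimized C/BLAS, so a naive hand-
               | rolled loop rarely wins. A minimal sketch of the effect
               | (timings will vary by machine):
               | 
               |     import time
               |     import numpy as np
               | 
               |     xs = np.random.rand(5_000_000)
               | 
               |     t0 = time.perf_counter()
               |     total = 0.0
               |     for x in xs:      # one interpreter dispatch per element
               |         total += x * x
               |     t1 = time.perf_counter()
               | 
               |     dot = np.dot(xs, xs)   # one call into optimized C/BLAS
               |     t2 = time.perf_counter()
               | 
               |     print(f"loop {t1 - t0:.2f}s vs dot {t2 - t1:.4f}s")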
        
               | chasd00 wrote:
               | there was some old commercial that had the tagline
               | "performance is nothing without control". If you can't
               | put the technology to work on your problems then the
               | technology, no matter how incredible, is worthless to
               | you.
        
               | cafed00d wrote:
               | I agree a hundred percent that domain knowledge is the
               | single most dominant influence on problem-solving
               | expertise.
        
               | ido wrote:
               | That 99% guess seems high considering calculus is
               | generally a required subject when studying computer
               | science (or software engineering) at most universities I
               | know of.
        
               | germandiago wrote:
               | In mine it was mandatory; there were 9 + 9 + 4.5
               | credits of calculus alone. There was way more: discrete
               | math, algebra...
        
               | cafed00d wrote:
               | You're right, it's a total guess. It's based on my
               | experience in the field.
               | 
               | My strong "opinion" here comes from an observation that
               | while calculus may have been a required subject of study
               | in awarding engineering degrees, the reality is, people
               | didn't really study it. They just brushed through a
               | couple of pages and wrote a few tests/exams.
               | 
               | In America there's a plethora of expert software
               | engineers who opt for a bachelor's degree in computer
               | science that is a BA, not a BS.
               | 
               | I think that's a completely reasonable thing to do if
               | you don't want to grind out the physics and calculus
               | courses. They are super hard after all. And let's face
               | it, all of the _useful to humanity_ work in software
               | doesn't require expertise in physics or calculus, at
               | least until now.
               | 
               | With AI going forward it's hard to say. If more of the
               | jobs shift over to model building then yes perhaps a back
               | to basics approach of calculus proficiency could be
               | required.
        
               | GeneralMaximus wrote:
               | This checks out. I'm a software developer who took math
               | all through high school and my first three years of
               | college. I barely scraped through my calculus exams, but
               | I excelled at combinatorics, probability, matrix math,
               | etc. (as long as it didn't veer into calculus for some
               | reason).
               | 
               | I guess I just enjoy things more when I can count them.
        
               | Shorel wrote:
               | For this kind of engineering, I think calculus is not
               | the main proficiency enhancer you claim it to be.
               | Linear Algebra, combinatorics, probability and number
               | theory are more relevant.
               | 
               | Calculus was important during the world wars because it
               | meant we could throw shells at the enemy army better,
               | and that was an important issue during that period.
               | 
               | Nowadays, calculus is just a stepping stone to more
               | relevant mathematics.
        
               | lupire wrote:
               | Calculus is continuous, analog math. Digital Computers
               | use discrete math.
               | 
               | Both are math, and both are still incredibly important.
               | Rockets haven't gone out of style.
        
               | cafed00d wrote:
               | Calculus has never gone out of style ;)
               | 
               | Today's ML frameworks grapple with the problem of
               | "Jacobian-vector products" & "vector-Jacobian products"
               | as a consequence of understanding the interplay between
               | gradients & derivatives, and the application of the
               | "chain rule". All three of those concepts are
               | fundamentally understood by being proficient in
               | calculus.
               | 
               | While I'm being the hype-man for calculus, I don't mean
               | to say proficiency in linear algebra or statistics is
               | "less necessary" or "less useful" or "less challenging"
               | or "less.." in any way.
               | 
               | I'm merely stating that, historically, calculus has been
               | the unique branch of study for engineering. Statistics
               | has always found value in many fields -- business,
               | finance, government policy etc.
               | 
               | Sure Linear algebra is one of those unique fields too --
               | I kinda like to think of it as "algebra" in general and
               | perhaps its utility has flowed in tandem with calculus.
               | Idk. I haven't thought super hard about it.
        
               | barrenko wrote:
               | You have precisely captured why I got interested in AI.
        
               | LudwigNagasena wrote:
               | Aren't you supposed to learn calculus to be able to
               | understand what O(n) even is? Is it not a standard part
               | of a CS major?
        
               | aae42 wrote:
               | They also drive trains
        
               | nineteen999 wrote:
               | Maybe we need a new moniker "webgineer". The average
               | HN/FAANG web programmer does appear to vastly
               | overestimate the value of their contributions to the
               | world.
        
               | peterleiser wrote:
               | 1999 indeed! I haven't heard that term since around 1999
               | when I was hired as a "web engineer" and derisively
               | referred to myself as a "webgineer". I almost asked if I
               | could change my title to "sciencematician".
        
               | techcode wrote:
               | Have we come full circle?
               | 
               | When I started doing this "Internet stuff" we were
               | called "webmasters", and the job would actually include
               | what today we call:
               | 
               |   - DevOps
               |   - Server/Linux sysadmin
               |   - DB admin
               |   - Full stack (backend and frontend) engineer
               | 
               | And I might have forgotten some things.
        
               | KptMarchewa wrote:
               | People who cobble together new printers or kettles
               | overestimate the value of their contributions to the
               | world too. The delineation isn't between JS devs and JPL
               | or ASML engineers.
        
               | mogiddy55 wrote:
               | From what I've heard (not an OSX user) Windows is the
               | best operating system for multiple screens; OSX and Linux
               | glitch way more. Most anyone doing 3D sculpture or
               | graphics/art on a professional level will eventually move
               | to working with 2-3 screens, and since there are no
               | exclusively Mac design programs, OSX will be suboptimal.
               | 
               | There's little things too, like some people using gaming
               | peripherals (multi-button MMO mice and left hand
               | controllers, etc.) for editing, which might not be
               | compatible with OSX.
               | 
               | And also, if you're mucking around with two 32 inch 4k
               | monitors and a 16 inch Wacom it might start to feel a
               | little ridiculous trying to save space with a Mac Pro.
        
               | egypturnash wrote:
               | I've been doing art on a pro level for twenty five years
               | and I dislike multiple monitors.
        
               | mogiddy55 wrote:
               | I am just commenting about what I've seen at concept
               | artist desks / animation studios / etc.
        
               | ddingus wrote:
               | Why is that?
               | 
               | I am not an artist and also dislike multiple monitors,
               | though I will employ two of them on occasion.
               | 
               | My reasons are:
               | 
               | If the window and application management doesn't suck,
               | one display is all one needs.
               | 
               | With cheap multiple displays and touch devices came an
               | ongoing enshitification of app and window management.
               | (And usually dumber focus rules)
               | 
               | Having to turn my head x times a day sucks.
        
               | Shorel wrote:
               | Besides Windows having more drivers for USB adapters than
               | Linux*, which is a reflection of the market, I find
               | Linux has far fewer glitches with multiple screens.
               | 
               | Once it works, Linux is more reliable than Windows. And
               | virtual desktops have always worked better on Linux than
               | on Windows. So I disagree with you on that front.
               | 
               | * In my case, this means I had to get an Anker HDMI
               | adapter, instead of any random brand.
        
               | KptMarchewa wrote:
               | >I find Linux having much fewer glitches using multiple
               | screens.
               | 
               | Maybe as long as you don't need working fractional
               | scaling with different DPI monitors, which is nothing
               | fancy now.
        
               | Toutouxc wrote:
               | Nitpick: it hasn't been called "OS X" for almost eight
               | years now, starting with macOS Sierra.
        
               | tlrobinson wrote:
               | Who do you think writes those CAD and design tools that
               | help "actual engineers" solve the same problems over and
               | over?
        
               | pompino wrote:
               | Would you like me to explain how it works to you? I'm not
               | sure why you added a question mark.
        
               | what wrote:
               | Yes, they were asking you a question. Do you not
               | understand question marks?
        
               | bee_rider wrote:
               | I did EE in college but we mostly just used Windows
               | because the shitty semi-proprietary SPICE simulator we
               | had to use, and stuff like that, only supported Windows.
               | The company that makes your embedded processor might only
               | support Windows (and begrudgingly at that).
               | 
               | I think engineers using software should not be seen as an
               | endorsement. They seem to have an incredible tolerance
               | for bad UI.
        
               | pompino wrote:
               | You seem to be suggesting that a chunk of the hundreds of
               | millions of people who use a UI that you don't like,
               | secretly hate it or are forced to tolerate it. Not a
               | position I'd personally want to argue or defend, so I'll
               | leave it at that.
        
               | paulmd wrote:
               | What an oddly aggressive and hostile response to such a
               | banal observation. Yes, millions of people use software
               | they hate, all the time, that's wildly uncontroversial.
        
               | pompino wrote:
               | It's not an "observation", it's someone making it up.
               | Why are you so upset if I disagree?
        
               | wiseowise wrote:
               | Making up what? Go drop by your nearby shop. My hair
               | stylist constantly complains about the management
               | software they use and the quality of the payment
               | integration. At work I constantly hear complaints about
               | shitty, slow IDEs. At the optician's, the guy has been
               | complaining about the inventory system.
               | 
               | People hate software that they're forced to use.
               | Professionals are better at tolerating crapware, because
               | there's usually sunk cost fallacy involved.
        
               | d0mine wrote:
               | There are only two types of software: those that people
               | hate and those that nobody uses (a paraphrase)
        
               | pompino wrote:
               | This is not a reasonable way to infer the sentiment of
               | hundreds of millions of people in different countries,
               | different businesses, different situations, etc, etc.
               | 
               | Disguising it as an "observation" is even more
               | ridiculous.
        
               | bee_rider wrote:
               | Indeed I'm not ready to defend it, it is just an
               | anecdote. I expected the experience of using crappy
               | professional software to be so universal that I wouldn't
               | have to.
        
               | pompino wrote:
               | Sure, and this is where I will ask you to post a list
               | of "good" professional software so I can google all the
               | bugs in that software :)
               | 
               | Nah, I'm good. Believe what you want to believe my
               | friend.
        
               | Rinzler89 wrote:
               | _> They seem to have an incredible tolerance for bad UI._
               | 
               | Irrelevant.
               | 
               | Firstly, it's a tool, not a social media platform
               | designed to sell ads and farm clicks, it needs to be
               | utilitarian and that's it, like a power drill or a pickup
               | truck, not look pretty since they're not targeting
               | consumers but solving a niche set of engineering
               | problems.
               | 
               | Secondly, the engineers are not the ones paying for
               | that software, so their individual tolerance is
               | irrelevant: their company pays for the tools, and their
               | tolerance of those tools is part of the job description
               | and the pay.
               | 
               | Unless you run your own business, you're not gonna turn
               | down lucrative employment because on site they provide
               | BOSCH tools and GM trucks while you personally prefer
               | the UX of Makita and Toyota. If those tools' UX slows
               | down the process and makes the project take longer,
               | it's not my problem; my job is to clock in at 9 and
               | clock out at 5, that's it. It's the company's problem
               | to provide the best possible tools for the job, if they
               | can.
        
               | macintux wrote:
               | > my job is to clock in at 9 and clock out at 5
               | 
               | Where can I find one of those jobs?
        
               | Rinzler89 wrote:
               | I meant that figuratively. Obviously everyone has
               | different
               | working hours/patterns depending on job market, skill set
               | and personal situation.
               | 
               | But since you asked, Google is famous for low workloads.
               | Or Microsoft. Or any other old and large slow moving
               | company with lots of money, like IBM, Intel, SAP, ASML,
               | Airbus, DHL, Siemens, manufacturing, aerospace, big
               | pharma, transportation, etc. No bootstrapped "agile"
               | start-ups and scale-ups, or failing companies that need
               | to compete in a race to the bottom.
               | 
               | Depends mostly on where you live though.
        
               | bee_rider wrote:
               | Do you disagree with the sentence before the one you
               | quoted? I think we basically agree, you came up with a
               | bunch of reasons that
               | 
               | > I think engineers using software should not be seen as
               | an endorsement.
        
               | ddingus wrote:
               | Is it truly bad UI?
               | 
               | They may be locked in, which just forces things. Not an
               | endorsement.
               | 
               | However, they may also be really productive with whatever
               | it is. This could be an endorsement.
               | 
               | In CAD, as an example, there are often very productive
               | interaction models that seem obtuse, or just bad to
               | people learning the tools first time.
               | 
               | Improving first-time ramp-up to competence nearly
               | always impacts the pro user too.
               | 
               | Where it plays out this way, I have always thought the UI
               | was good in that the pros can work at peak efficiency. It
               | is hard to beat them.
               | 
               | Fact is, the task complexity footprint is just large
               | enough to make "good" (as in simple, intuitive)
               | interfaces impossible.
        
               | fecal_henge wrote:
               | I'd say a lot of engineers (bridges, circuit boards,
               | injection mouldings) are kept far away from OSX (and
               | Linux). Honestly, I'd just love an operating system
               | that doesn't decide it's going to restart itself
               | periodically!
        
               | wiseowise wrote:
               | > Honestly, I'd just love an operating system that
               | doesn't decide it's going to restart itself
               | periodically!
               | 
               | My MBP has been running without any restart for over a
               | month.
        
               | wiseowise wrote:
               | You can shit all you want on so-called "engineers", but
               | they are the ones who make the CAD tools you're talking
               | about, the ones "real engineers" use. So get off your
               | high horse.
        
               | germandiago wrote:
               | I challenge you to take those people who make bridges
               | and have them build full software.
               | 
               | I am not saying whether software is engineering or not.
               | 
               | It is a fact, in terms of cost, that software and
               | bridge building are, most of the time, very different
               | activities with very different goals and cost-benefit
               | ratios.
               | 
               | All those things count when making decisions about the
               | level of standardization.
               | 
               | About standards... there are lots also and widely used,
               | from networking to protocols, data transfer formats...
               | with well-known strengths and limitations.
        
               | pauby wrote:
               | In my 30+ year career I can confidently say that
               | Software Engineers look towards standardisation by
               | default as it makes their lives easier.
               | 
               | It feels to me that you're bitter, or have had more
               | than one bad experience. Perhaps you keep working with,
               | or coming across, bad Engineers, as your generalising
               | is inaccurate.
        
               | pas wrote:
               | > who never standardize anything
               | 
               | IETF RFCs soon number over 10K; Java, win32, the Linux
               | kernel syscall API are famous for backward compatibility
               | 
               | not to mention the absurd success of standard libraries
               | of Python, Rust, PHP and certain "standard" projects like
               | Django, React, and ExpressJS
               | 
               | > (how many libraries do you need for arrays and vectors
               | and guis and buttons and text boxes and binary trees and
               | sorting, yada yada?)
               | 
               | considering the design space is enormous and the
               | tradeoffs are not trivial ... it's good to have
               | libraries that fundamentally solve similar things, but
               | in different, context-dependent ways
               | 
               | arguably we are using too many libraries and not enough
               | problem-specific in-situ DSLs (see the results of Alan
               | Kay's research, the STEPS project at VPRI -
               | https://news.ycombinator.com/item?id=32966987 )
        
               | pompino wrote:
               | I'd argue almost all NEW library development is about
               | politics and platform ownership. Every large company
               | wants to be the dependency that other projects tie into.
               | And if you don't want to hitch your wagon to google or
               | facebook or whoever, you roll your own.
               | 
               | Many if not most computational problems are fundamentally
               | about data and data transformation under constraints -
               | Throughput, Memory, Latency, etc, etc. And for the
               | situations where the tradeoffs are non-trivial, solving
               | this problem is purely about domain knowledge regarding
               | the nature of the data (video codec data, real-time
               | sensor data, financial data, etc) not about programming
               | expertise.
               | 
               | The various ways to architect the overall design at a
               | high level - client/server, P2P, distributed vs local,
               | threading model - are, IME, not what I would call crazy
               | complicated. There are standard ways of implementing
               | the various variations of the overall design which,
               | sadly, because of an overall roll-your-own mindset,
               | most devs are reluctant to adopt. Part of that is that
               | we don't have a framework of knowledge that allows us
               | to build a library of these designs in our heads, where
               | we can just pick the one that's right for our use case.
               | 
               | I don't agree with your characterization of the design
               | space as 'enormous'. I'd say most programmers just need
               | to know a handful of design types, because they're not
               | working on high-performance, low-latency, multi-million
               | endpoint scalable projects where, as you say, things
               | can get non-trivial.
               | 
               | I'll give a shot at an analogy (I'm hoping the
               | nitpickers are out to lunch). The design space for door
               | knobs is enormous because of the various hand shapes,
               | disability constraints, door sizes, applications,
               | security implications, etc. And yet we've standardized
               | on a few door knob types for most homes, which you can
               | go out and buy and install yourself. The special cases
               | - bank vaults and prisons and other domains - solve it
               | their own way.
        
               | sirsinsalot wrote:
               | You're kidding yourself if you think that mechanical,
               | structural or any other engineers don't do the same
               | thing. They do.
               | 
               | I worked for one of the UK's leading architecture /
               | construction firms writing software, and I am also an
               | amateur mechanic.
               | 
               | You'd be amazed at how many gasket types, nuts, bolts,
               | fasteners, unfasteners, glues, concretes, bonding
               | agents and so on there are ... all invented for edge
               | preferences, and most of which could be used
               | interchangeably.
               | 
               | Also standards? Hah. They're an absolute shitshow in any
               | engineering effort.
               | 
               | I mean ... even just units of measure. C'mon.
        
               | cpill wrote:
               | not machine learning devs
        
               | OtomotO wrote:
               | If it weren't for the OS I would've bought a MacBook
               | instead of a Lenovo laptop.
               | 
               | I've set up my OS exactly as I want it. (I use arch btw
               | ;-))
        
               | nirse wrote:
               | Same, but on gentoo :-p
        
               | ZiiS wrote:
               | Arch works fairly well on Apple silicon now, though
               | Fedora is easier/recommended. Limited emulation due to
               | the 16KB pages, and no Thunderbolt display out.
        
               | tlrobinson wrote:
               | I don't think it's at all unreasonable for an engineer
               | using a device for 8+ hours every day to pay an
               | additional, say, 0.5% of their income (assuming very
               | conservatively $100,000 income after tax, $1,000 extra
               | for a MacBook, 2 year product lifespan) for the best
               | built laptop, best screen, and best OS.
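               | 
               | (The arithmetic: $1,000 extra spread over a 2-year
               | lifespan is $500/year, i.e. 0.5% of $100,000.)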
        
               | yunobcool wrote:
               | $100,000 after tax does not seem conservative to me (at
               | least outside the US).
        
               | tlrobinson wrote:
               | $50,000 income, 4 year product lifespan?
               | 
               | Obviously doesn't apply to all engineers.
        
               | rfoo wrote:
               | > and best OS
               | 
               | I do networking stuff and macOS is on par with Windows -
               | I can't live on it without running into bugs or very
               | questionable behavior for longer than a week. Same as
               | Windows.
        
               | klabb3 wrote:
               | What stuff is weird? I have so far had very good
               | experiences with Apple (although not iOS yet). Almost
               | everything I do on my Linux workstation works on Mac too.
               | Windows though is beyond horrible and different in every
               | way.
               | 
               | > I do networking stuff
               | 
               | Me too, but probably very different stuff. I'm doing p2p
               | stuff over tcp and am affected mostly by sock options,
               | buffer sizes, tcp options etc.
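               | 
               | E.g. the kind of knobs I mean - a minimal sketch; the
               | values are arbitrary, and the defaults and behavior
               | differ across Linux, macOS and Windows, which is
               | exactly the pain:
               | 
               |     import socket
               | 
               |     s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
               |     # Disable Nagle for latency-sensitive p2p traffic
               |     s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
               |     # Ask for a 1 MiB send buffer; kernels may round or
               |     # cap this differently on each platform
               |     buf = socket.SO_SNDBUF
               |     s.setsockopt(socket.SOL_SOCKET, buf, 1 << 20)
               |     print(s.getsockopt(socket.SOL_SOCKET, buf))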
        
               | sensanaty wrote:
               | > Best OS
               | 
               | I like apple hardware, but their OS is fucking atrocious.
               | In the year 2024 it still doesn't have a native volume
               | mixer, or any kind of sensible window management
               | shortcuts. Half the things on it have to be fixed with
               | _paid software_. Complete joke of an OS; if it were up
               | to me I'd stick a Linux distro on the hardware and be
               | happy.
        
               | chipdart wrote:
               | > Engineers use MacBook pros because it's the best built
               | laptop, the best screen, arguably the best OS and most
               | importantly - they're not the ones paying for them.
               | 
               | I know engineers from a FANG who picked MacBook Pros in
               | spite of the specs and only because of the bling/price
               | tag. Then they spent their whole time using them as
               | remote terminals for Linux servers, and they still
               | complained about the things being extremely short on
               | RAM and disk.
               | 
               | One of them even tried to convince their managers to give
               | the Vision Pro a try, even though there were zero use
               | cases for it.
               | 
               | Granted, they drive multiple monitors well with a single
               | USB-C plug, at least with specific combinations of
               | monitors and hubs.
               | 
               | It's high time that the "Apple sells high end gear"
               | shtick is put to rest. Even their macOS treadmill is
               | becoming tiring.
        
               | theshrike79 wrote:
               | The build quality of Apple laptops is still pretty
               | unmatched in every price category.
               | 
               | Yes, there are $2k+ laptops from Dell/Lenovo that match
               | and exceed a similarly priced MacBook in pure power,
               | but they usually lack battery life and/or build
               | quality.
        
               | amias wrote:
               | The more they deviate from the BSD core, the worse it
               | gets.
        
               | theshrike79 wrote:
               | But I can still fire up a terminal and use all of my *nix
               | skills to operate.
               | 
               | I can't do that on Windows without either wrestling with
               | PowerShell or WSL2
        
               | yayr wrote:
               | Apple devices also work quite seamlessly together.
               | iPads, for example, work great as a wireless second
               | screen for the MBPs. I'd immediately buy a 14 inch iPad
               | just for that, since it is so useful when not at your
               | standard desk. Also, copy-paste between devices and
               | headphones just work...
               | 
               | If Apple came up with the idea of using an iPad as an
               | external compute unit, that would be amazing... just
               | double your RAM, compute and screen with it, in such a
               | lightweight form factor... should be possible if they
               | want to.
        
               | citiguy wrote:
               | You can use the iPad as a second monitor on Windows too
               | and it works nicely. I also use my AirPods Pro with my
               | Dell XPS and it's perfect.
        
               | yayr wrote:
               | Is there now a low-latency solution for a Windows 2nd
               | monitor? I was only aware of some software where the
               | latency is quite bad, and one company that provided a
               | wireless HDMI / DisplayPort dongle...
               | 
               | Also, the nice thing about headphones within Apple is
               | that the AirPods automatically switch to where the
               | attention is... meaning, e.g., if I'm watching
               | something on the laptop and pick up an iPhone call (no
               | matter whether via the phone or any app), the AirPods
               | automatically switch.
        
               | chipdart wrote:
               | > The build quality of Apple laptops is still pretty
               | unmatched in every price category.
               | 
               | I owned a MacBook Pro with the dreaded butterfly
               | keyboard. It was shit.
               | 
               | How many USB ports does the new MacBook Air have? The
               | old ones had two. And shipped with 8GB of RAM? These
               | are shit-tier specs.
               | 
               | The 2020 MacBook Pros had a nice thing: USB-C charging,
               | and you could charge them from either side. Current
               | models went back to MagSafe, only on one side. The
               | number of USB ports is still very low.
               | 
               | But they are shiny. I guess that counts as quality.
        
               | macintux wrote:
               | USB-C charging still works with the Pros (driving a M3
               | Max), and 3 ports seems reasonable to me.
        
               | nickv wrote:
               | I guess we can agree to disagree, but I find the 2020 rev
               | Macbook pros have a good number of USB-C ports (2 on the
               | left, 1 on the right -- all can do PD), a magsafe
               | charger, headphone jack, HDMI port and SD card slot. How
               | many USB-C ports do you need? Sometimes I wish there was
               | ethernet but I get why it's not there.
               | 
               | I agree, the butterfly keyboard was shitty, but I
               | absolutely love the keyboard on the 2020 rev. It's
               | still not as great as my mechanical desktop keyboard,
               | but for a laptop keyboard it's seriously chef's kiss.
               | Also, I have yet to find a trackpad that is anywhere
               | near as good as the MacBook's. Precision trackpads are
               | still way, way worse.
               | 
               | Finally, the thing that always brings me back to MBPs
               | (vs Surface Books or Razers) is battery life. I
               | typically get a good 10+ hours on my MBP. Battery life
               | on my old Razer Blade and Surface Books was absolutely
               | comically horrible.
        
               | tharkun__ wrote:
               | I'm absolutely not an Apple person. Privately own zero
               | Apple hardware.
               | 
               | However there are two awesome things about my work MBP I
               | would really want from my ThinkPad:
               | 
               | Magsafe charger - too many close calls!
               | 
               | And the track pad.
               | 
               | I can't work properly without an external mouse on my
               | ThinkPad. But on the MBP everything just has the right
               | size, location, proportions and handling on the track
               | pad. I had a mouse for the MBP too but I stopped using
               | it!
        
               | theshrike79 wrote:
               | > I owned a MacBook Pro with the dreaded butterfly
               | keyboard. It was shit.
               | 
               | Yea, the butterfly was utter shit. And they fucked up the
               | touchbar by not just putting it on TOP of the existing
               | F-keys.
               | 
               | But the rest of the laptop was still well machined :D
        
               | davrosthedalek wrote:
               | My 15 inch MacBook would disagree: it fried its display
               | twice (it didn't go to sleep properly, was put in a
               | backpack, and overheated - there is no way to see that
               | sleep didn't kick in), and then had the broken display
               | cable problem (widespread, and Apple wanted $900 for a
               | new display..). For comparison: the 4k touch display on
               | my XPS 15 that didn't survive a Diet Coke bath was
               | <$300 including labor, for a guy to show up in my
               | office and repair it while I was watching....
        
               | davedx wrote:
               | I'm freelance so I've absolutely paid for my last 3
               | Macbooks. They're best in class tools and assets for my
               | business.
        
               | pjmlp wrote:
               | US engineers, and those in countries of similar income.
               | The rest of the world is pretty much settled on a mix
               | of Windows and GNU/Linux desktops/laptops.
        
               | PeterStuer wrote:
               | If we are honest vanity signaling is a large part of it.
               | Basically the Gucci bag equivalent for techies.
        
               | lost_womble wrote:
               | Honestly not. My tests run WAY faster on Apple Silicon,
               | that's all I care about.
        
               | PeterStuer wrote:
               | Not being contrarian, but what are you comparing?
        
               | jwr wrote:
               | > Engineers use MacBook pros because it's the best built
               | laptop, the best screen, arguably the best OS and most
               | importantly - they're not the ones paying for them.
               | 
               | I am the one paying for my MacBook Pro, because my
               | company is a self-funded business. I run my entire
               | business on this machine and I love it. I always buy the
               | fastest CPU possible, although I don't max out the RAM
               | and SSD.
               | 
               | Amusingly enough, I talked to someone recently about
               | compilation speeds, and that person asked me why I
               | don't
               | compile my software (Clojure and ClojureScript) on
               | "powerful cloud servers". Well, according to Geekbench,
               | which always correlates very well with my compilation
               | speeds, there are very few CPUs out there that can beat
               | my M3 Max, and those aren't easily rentable as bare-metal
               | cloud servers. Any virtual server will be slower.
               | 
               | So please, don't repeat the "MacBooks are for spoiled
               | people who don't have to pay for them" trope. There are
               | people for whom this is simply the best machine for the
               | job at hand.
               | 
               | Incidentally, I checked my financials: a 16" MBP with M3
               | and 64GB RAM, amortized over 18 months (very short!)
               | comes out to around $150/month. That is not expensive at
               | all for your main development machine that you run your
               | business on!
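               | 
               | (That's roughly $2,700 of hardware spread over those 18
               | months.)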
        
               | traceroute66 wrote:
               | > comes out to around $150/month.
               | 
               | Which, incidentally, is probably about 10x less than you
               | would spend compiling your software on "powerful cloud
               | servers". :-)
        
               | renonce wrote:
               | For a fair comparison, what about comparing against the
               | cheapest "powerful cloud server"?
               | 
               | I mean Hetzner has a reputation for renting bare metal
               | servers at the cheapest price in the market. Try AX102
               | which has very close performance to a M3 Max (CPU only):
               | https://www.hetzner.com/dedicated-rootserver/matrix-ax/
               | 
               | The OP's solution has a lot of advantages, like owning
               | the device outright and including a GPU, but at least
               | we do have cloud servers with comparable costs
               | available.
        
               | DenseComet wrote:
               | I tried a lot to use remote servers for development when
               | I had an Intel MacBook and I found the experience to
               | always be so frustrating that I upgraded to the M series.
               | Have the tools gotten any better or is vscode remote
               | containers still the standard?
        
               | jwr wrote:
               | Indeed! That server is very close to my M3 Max. I stand
               | slightly corrected :)
               | 
               | Worth noting: the monthly cost is close to my 18-month
               | amortized cost.
        
               | dizhn wrote:
               | In your case it makes sense to get the most performant
               | machine you can get even if it means you're paying a ton
               | more for marginal gains. This is not usually true for the
               | general public.
        
               | alemanek wrote:
               | General public can buy a M1 MacBook Air for $799 if they
               | need a laptop at all. An air will serve them well for a
               | long time.
        
               | yatz wrote:
               | In addition to the hardware, the macOS software is so
               | much better, with flawless speed, productivity, and
               | multitasking with gestures. Try doing desktop switching
               | on Windows. On a side note, I would gladly use the
               | cloud if internet speeds and latency came down to a
               | negligible level; we developers are an impatient lot.
        
               | TheRealDunkirk wrote:
               | I think relatively few corporations are offering Macs to
               | people. It's all bog-standard POS Dells, with locked-down
               | Windows images that often do not even allow you to change
               | the screensaver settings or the background image, in the
               | name of "security." I'd love to be wrong about that.
        
               | barrenko wrote:
               | Engineers loving tools is peak HN :).
        
               | BossingAround wrote:
               | Arguably the best OS? For what? For browsing the web,
               | video editing, etc.? Maybe. For development? Jesus, macOS
               | doesn't even have native container support. All the devs
               | I know with macOS either get a second Linux laptop, or
               | spend a lot of their time SSH'd into a Linux server.
               | 
               | For dev (at least backend and devops), macOS is not that
               | great.
        
               | faeriechangling wrote:
               | Yeah it's funny for all the hoopla I've heard over the
               | glory of MacOS having a REAL UNIX TERMINAL, WSL works
               | better in practice simply because it's running an actual
               | Linux VM and thus the support is better.
               | 
               | Still, I just don't think it's that burdensome to get
               | containers running on MacOS, it's just annoying that it
               | happens to work worse than on Windows or Linux. Ignoring
               | the hardware, the only real advantage to MacOS
               | development is when you're targeting Apple products with
               | what you're developing.
        
               | kagakuninja wrote:
               | I don't know what you are talking about, I'm a back end
               | engineer, and every company I've worked for during the
               | last 12 years gives out MacBook pros to all devs. Even
               | the game company that used C# and Mono gave out MacBooks
               | (and dual booted them, which of course you can't do any
               | more; I never bothered with Windows since our servers
               | were written in Scala).
               | 
               | Not all teams run tons of containers on personal
               | computers. All our servers are running on AWS. I rarely
               | ssh into anything.
               | 
               | I like the fact that OS X is based on UNIX, and not some
               | half-assed bullshit bolted onto Windows. I still have bad
               | memories of trying to use Cygwin 15 years ago. Apparently
               | WSL is an improvement, but I don't care.
               | 
               | Mac runs all the software I need, and it has real UNIX
               | shells.
        
               | borissk wrote:
               | This statement is completely wrong. There are millions of
               | engineers in the world and most of them live in countries
               | like China, India and Russia. Very few of them use
               | MacBooks.
               | 
               | The vast majority of the software engineers in big
               | companies (that employ a lot more people than big tech
               | and startups combined) who use Java and C# also have
               | predominantly Windows laptops (as their employers can
               | manage Windows laptops a lot easier, have agreements with
               | vendors like Dell to buy them with a discount, have
               | software like AV that doesn't support MacOS, etc.).
               | 
               | On top of that MacBooks don't have the best screens and
               | are not the best built. Many Windows laptops have OLED
               | screens or 4K IPS screens. There are premium Windows
               | laptops made out of magnesium and carbon fiber.
        
               | kagakuninja wrote:
               | I'm an American, so maybe the situation is different
               | elsewhere.
               | 
               | Every company I've worked for during the last 12 years
               | gives out MacBook Pros. And I've been developing using
               | Scala / Java for the last 20 years.
               | 
               | Employers manage Macs just fine, this isn't 1999. There
               | have been studies showing that Macs have lower IT
               | maintenance costs compared to Windows.
               | 
               | I admit that I haven't dealt with Windows devices in a
               | long time, maybe there are some good ones available now,
               | but I find your statements to be beyond belief. Apple
               | Silicon Macs have blown the doors off the competition,
               | outperforming all but top-end Intel laptops, while using
               | a fraction of the power (and I never even hear the fans
               | come on).
        
               | xedrac wrote:
               | "best OS" is so subjective here. I'll concede that the
               | MacBook hardware is objectively better than any laptop
               | I've owned. But it's a huge leap to say Mac OS is
               | objectively better than Linux IMO.
        
               | amelius wrote:
               | They are perhaps only the best by a very small margin.
               | 
               | I am happy to not support Apple's ecosystem and use a
               | minimally worse laptop from a different brand.
        
               | Hammershaft wrote:
               | Apple's hardware these days is exceptional, but the
               | software is left wanting in comparison. macOS feels
               | like it's been taking two steps back for every step
               | forward for a decade now. I run macOS, Linux w/ i3, and
               | Windows every day, and outside of aesthetics and Apple
               | integration, macOS feels increasingly the least
               | coherent of the 3.
               | 
               | The same is true of the iPad, which is a miraculous
               | piece of hardware constrained by an impotent operating
               | system.
        
             | jojobas wrote:
             | Spending your life on a phone is still a lifestyle
             | "choice".
        
             | nox101 wrote:
             | price-performance is not a thing for the vast majority
             | of users. Sure, I'd like a $40k car but I can only
             | afford a $10k car. It's not nice but it gets me from a
             | to b on my min-wage salary. Similarly, I know plenty of
             | friends and family in that position. They can either get
             | 4 Macs for $1000 each (mom, dad, sister, brother), so
             | $4k total, or 4 Windows PCs for $250 each, so $1k total.
             | 
             | The cheap Windows PCs suck just like a cheap car sucks (ok,
             | they suck more), but they still get the job done. You can
             | still browse the web, read your email, watch a youtube
             | video, post a youtube video, write a blog, etc.. My dad got
             | some HP celeron. It took 4 minutes to boot. It still ran
             | though and he paid probably $300 for it vs $999 for a mac.
             | He didn't have $999.
        
               | klabb3 wrote:
               | I'm not saying one or the other is better for your
               | family members. But MacBooks last very long. We'll see
               | about the M series, but I got the fanless M1 Air, which
               | has the benefit of no moving parts or air inlets, so
               | even better. My last one, a MBP from 2011, lasted
               | pretty much 10 years. OS updates run 8-10 years.
               | 
               | > The cheap Windows PCs suck [...], but they still get
               | the job done
               | 
               | For desktop, totally. Although I would still wipe it with
               | Ubuntu or so because Windows is so horrible these days
               | even my mom is having a shit time with only browsing and
               | video calls.
               | 
               | A random laptop however is a different story. Except for
               | premium brands (closer to Apple prices) they tend to have
               | garbage battery life, infuriating track pad, massive
               | thermal issues, and preloaded with bloatware. Apple was
               | always better here, but now with the lower power/heat of
               | the ARM chips, they got soooo much better overnight.
        
               | nox101 wrote:
               | > A random laptop however is a different story. Except
               | for premium brands (closer to Apple prices) they tend to
               | have garbage battery life, infuriating track pad, massive
               | thermal issues, and preloaded with bloatware. Apple was
               | always better here, but now with the lower power/heat of
               | the ARM chips, they got soooo much better overnight.
               | 
               | To the person with no budget, all that doesn't matter.
               | They'll still get the $250 laptop and put up with the
               | garbage battery life (find a power outlet), infuriating
               | trackpad (buy an external mouse for $10), bloatware (most
               | users don't know this and just put up with it), etc....
               | 
               | I agree Apple is better. But if your budget is $250 and
               | not $1k then you get what you can get for $250 and
               | continue to feed your kids and pay your rent.
        
               | fragmede wrote:
               | But also you don't have to buy new. If I had $250, an
               | ancient MacBook might be better than a newer low-end
               | windows laptop. Though for my purposes I'd probably get
               | an oldish Chromebook and root it.
        
             | chipdart wrote:
             | > Somewhat true but things are changing. While there are
             | plenty of "luxury" Apple devices like Vision Pro or fully
             | decked out MacBooks for web browsing we no longer live in a
             | world where tech are just lifestyle gadgets.
             | 
             | I notice your use of the weasel word "just".
             | 
             | We undoubtedly live in a world where Apple products are
             | sold as lifestyle gadgets. Arguably it's more true today
             | than it ever was. It's also a world where Apple's range of
             | Veblen goods managed to gain footing in social circles to
             | an extent that we have kids being bullied for owning
             | Android phones.
             | 
             | Apple's lifestyle angle is becoming especially relevant
             | because they can no longer claim they sell high-end
             | hardware, as the difference in specs between Apple's
             | hardware and product ranges from other OEMs is no longer
             | noticeable. Apple's laughable insistence on shipping
             | laptops with 8GB of RAM is a good example.
             | 
             | > Even in the higher end products like the MacBooks you see
             | a lot of professionals (engineers included) who choose it
             | because of its price-performance-value, and who don't give
             | a shit about luxury.
             | 
             | I don't think so, and that contrasts with my personal
             | experience. All my previous roles offered a mix of
             | MacBooks and Windows laptops, and new arrivals opted for
             | the MacBooks because they were seen as perks, while the
             | particular Windows options on offer seemed less
             | impressive, even though they out-specced Apple's
             | offering (mid-range HP and Dell). In fact, in a recent
             | employee review the main feedback was that the MacBook
             | Pro line was under-specced because at best it shipped
             | with only 16GB of RAM, while the less impressive HP ones
             | already came with 32GB. In previous years, they called
             | for the replacement of the MacBook line due to the rate
             | of keyboard malfunctions. Meaning, engineers were
             | purposely picking the underperforming option for
             | non-technical reasons.
        
               | lynx23 wrote:
               | I bought my first Apple product roughly 11 years ago
               | explicitly because it had the best accessibility support
               | at the time (and that is still true). While I realize you
               | only see your slice of the world, I really cringe when I
               | see the weasel-word "lifestyle". This "Apple is for the
               | rich kids"-fairytale is getting really really old.
        
               | pembrook wrote:
               | Apparently you've never used Apple Silicon. There's no PC
               | equivalent in terms of specs.
               | 
               | Also, I think you're misunderstanding what a Veblen good
               | is and the difference between "premium" and "luxury."
               | Apple does not create luxury or "Veblen" goods like for
               | example, LVMH.
               | 
               | An easy way to discern the difference between premium and
               | luxury -- does the company advertise the product's
               | features or price?
               | 
               | For example, a Chanel handbag is almost entirely divorced
               | from its utility as a handbag. Chanel doesn't advertise
               | features or pricing, because it's not about the product's
               | value or utility, it's what it says about your personal
               | wealth that you bought it. That's a Veblen good.
               | 
               | Apple _heavily_ advertises features and pricing. Because
               | they sell premium products that _are not_ divorced from
               | their utility or value.
        
             | arvinsim wrote:
             | MacBooks are not bang-for-buck. Most engineers I know
             | buy them because it's like Windows but with Unix tools
             | built-in.
        
               | aplummer wrote:
               | I would be interested if there exists a single better
               | value machine in $ per hour than my partner's 2012
               | MacBook Air, which still goes.
        
               | jpc0 wrote:
               | Any decent laptop from the same era. My parents are
               | currently using both HP ProBooks and Lenovo ThinkPads
               | from that era; they are working perfectly and their
               | maintenance costs are lower than for same-era
               | MacBooks...
               | 
               | I own a MacBook Air. I won't be buying another, purely
               | because the moment I need to upgrade or repair
               | anything, it's effectively e-waste.
        
               | goguy wrote:
               | I've not found any good proxy on macOS which works well
               | with Cisco VPN software. Charles and Proxyman work
               | intermittently at best and require disconnecting from
               | the VPN and various such dances.
               | 
               | Fiddler on Windows works flawlessly.
        
             | ActorNightly wrote:
             | >Even in the higher end products like the MacBooks you see
             | a lot of professionals (engineers included) who choose it
             | because of its price-performance-value, and who don't give
             | a shit about luxury.
             | 
             | Most CS professionals who write code have no idea what
             | it takes to build a desktop, so the hardware they choose
             | is pretty much irrelevant because they aren't
             | specifically choosing for hardware. The reason Apple
             | gets bought by almost anyone, including tech people, is
             | the ecosystem. The truth is, nobody really cares that
             | much about actual specs as long as it's good enough to
             | do basic stuff, and when you are indifferent to the
             | actual difference but all your friends are in the
             | ecosystem, the choice is obvious.
             | 
             | You can easily see this yourself: ask these
             | "professionals" about the details of the Apple Neural
             | Engine, and there's a very high chance that they will
             | repeat some marketing material while failing to mention
             | that Apple does not publish any real docs for the ANE,
             | that you have to sign your code to run on the ANE, and
             | that you basically have to use Core ML to utilize it.
             | I.e. if they really cared about inference, all of them
             | would be buying laptops with discrete 4090s for almost
             | the same price.
             | 
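             | To make that concrete: the only public knob is Core
             | ML's compute-unit preference. A sketch (the model path
             | is a placeholder; you can only request the ANE, the
             | framework decides actual placement):
             | 
             |   import CoreML
             | 
             |   // Request ANE-eligible execution; there is no
             |   // direct "run this on the ANE" API.
             |   let config = MLModelConfiguration()
             |   config.computeUnits = .cpuAndNeuralEngine // macOS 13+
             |   let model = try? MLModel(
             |       contentsOf: URL(fileURLWithPath: "model.mlmodelc"),
             |       configuration: config)
             | 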
             | Meanwhile, if you look at people who came from EE/ECE (who
             | btw on the average are far better coders than people with
             | CS background, based on my 500+ interviews in the industry
             | across several sectors), you see a way larger skew towards
             | Android/custom built desktops/windows laptops running
             | Linux. If you have lived and breathed Linux and
             | low-level OS work, you tend to appreciate all the power
             | and customization it gives you, because you don't have
             | to go learn how to do things.
        
               | daviddever23box wrote:
               | Coming from both environments, I'd be wary of making some
               | of these assertions, especially when you consider that
               | any ecosystem that optimizes software and hardware
               | together (from embedded devices all the way to general-
               | purpose computing machines) is generally going to perform
               | well, given the appropriate engineering focus. This
               | applies regardless of (RT)OS / hardware choice, i.e.,
               | it's simply common sense.
               | 
               | The signing of binaries is a part of adult developer
               | life, and is certainly required for the platforms you
               | mention as well.
               | 
               | Unquestionably, battery life on 4090-based laptops sucks
               | on a good day, and if you're working long hours, the last
               | thing you want to have to do is park yourself next to
               | your 350W adapter just to get basic work done.
        
               | ActorNightly wrote:
               | >especially when you consider that any ecosystem that
               | optimizes software and hardware together (from embedded
               | devices all the way to general-purpose computing
               | machines) is generally going to perform well, given the
               | appropriate engineering focus.
               | 
               | Very much not true. Not to make this personal, but this
               | is exactly what I'm talking about: Apple fans not
               | understanding hardware.
               | 
               | Linux has been through the wringer of fighting its way
               | to general use, thanks to its open source nature and
               | constant development. So in terms of working well, it
               | has been optimized for hardware WAY further than
               | Apple's OS, which is why you find it on servers,
               | personal desktops, phones, portable gaming devices,
               | and even STM32 Cortex BLDC control boards, all of
               | which run different hardware.
               | 
               | Apple doesn't optimize for general use, it optimizes
               | for a specific business case. In the case of Apple
               | Silicon, it was purely battery life, which brings more
               | people into the ecosystem. Single core performance is
               | on par with all the other chips, because the
               | instruction set doesn't actually matter
               | (https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-
               | doesnt-...), multi core is behind, macOS software is
               | still a pile of junk (Rosetta still isn't good across
               | the board), the computers are not repairable, and you
               | have no privacy since Apple collects a shitload of
               | telemetry for themselves, and so on.
               | 
               | And Apple has no incentive to make any of this better:
               | prior to Apple Silicon, people were still buying Intel
               | Macs with worse specs and performance for the same
               | price, all for the ecosystem and vanity. And not only
               | was macOS still terrible (and much slower), you also
               | had hardware failures (plugging in the wrong USB-C hub
               | could blow the chip and brick your Mac, butterfly
               | keyboards failing) and questionable decisions like the
               | virtual Esc key.
               | 
               | >The signing of binaries is a part of adult developer
               | life,
               | 
               | ...for professional use, and the private key holder
               | should be the person who wrote that software. I hope you
               | understand how ridiculous it is to ask a developer to
               | sign code using the manufacturer's key to allow them to
               | run that code on a machine that they own.
               | 
               | >Unquestionably, battery life on 4090-based laptops sucks
               | on a good day,
               | 
               | Well yea, but you are not buying that laptop for battery
               | life. Also, with Ryzen CPUs and 4090s, most get like 6-8
               | hours depending on use due to Nvidia Prime, which is
               | pretty good for travel, especially if you have a backpack
               | with a charging brick.
               | 
               | If you want portability, there are plenty of lighter
               | weight options like the Lenovo Yoga, which can get
               | 11-12 hours of battery life for things like web
               | browsing.
        
             | hinkley wrote:
             | Most of Apple's money comes from iPhones.
        
             | kmacdough wrote:
             | It's not about price-performance value at all. Mac is still
             | the most expensive performance. And Apple is only
             | particularly popular in the US. Android phones dominate
             | most other markets, particularly poor markets.
             | 
             | Apple is popular in the US because a) luxury brands hold
             | sway b) they goad customers into bullying non-customers
             | (blue/green chats) and c) they limit features and
             | customizability in favor of simpler interfaces.
             | 
             | It's popular with developers because a) performance is
             | valuable even at Apple's steep cost b) it's Unix-based
             | unlike Windows so shares more with the Linux systems most
             | engineers are targeting.
        
               | rjha wrote:
               | I have never been an apple fanboy. Till 2022, I was on
               | android phones. Work issued either Thinkpad or XPS
               | variants. However, I have owned apple _books_ since 2004
               | starting from panther era. I sincerely believe that apple
               | provides best features and performance combination in the
               | given price for laptops.
               | 
               | Here I feel the I-hate-Apple crowd is just stuck on
               | this notion of a luxury overpriced brand when that is
               | clearly not the case. Apple has superior hardware at
               | better price points. Last time I was shopping for a
               | laptop, I could get similar features only at a 30%-40%
               | price premium from other brands.
               | 
               | I am typing this on an apple M2 air and try finding
               | similar performance under 2000 USD in other brands. The
               | responsiveness, the (mostly) sane defaults and superior
               | rendering and fonts make it worth it. The OS does not
               | matter so much as it used to do in 2004 and the fact that
               | I have a unix terminal in 2024 is just incidental. I have
               | turned off auto updates and I do not use much of phone
               | integration apart from taking backups and photo copying.
               | 
               | I switched to an iPhone in 2022 from a US$200 Samsung
               | handset. Here, I would say that not everyone needs an
               | iPhone. My old phone used to do all the tricks I need
               | on this one. However, the camera is really good and
               | the photos are really great. If I buy an iPhone next
               | time, it will be just for the photos it takes.
        
             | tomcar288 wrote:
             | You can get a laptop with a much bigger screen and a
             | keyboard for as little as $100 to $300, and it will be
             | much, much easier to get work done on than an Apple
             | phone. So I think Apple is still very much a luxury
             | product.
        
           | hehdhdjehehegwv wrote:
           | As a privacy professional for many, many years this is 100%
           | correct. Apple wouldn't be taking billions from Google for
           | driving users to their ad tracking system, they wouldn't give
           | the CCP access to all Chinese user data (and maybe beyond),
           | and they wouldn't be on-again-off-again flirting with
           | tailored ads in Apple News if privacy was a "human right".
           | 
           | (FWIW my opinion is it is a human right, I just think Tim
           | Cook is full of shit.)
           | 
           | What Apple calls privacy more often than not is just putting
           | lipstick on the pig that is their anticompetitive walled
           | garden.
           | 
           | Pretty much everybody in SV who works in privacy rolls their
           | eyes at Apple. They talk a big game but they are as full of
           | shit as Meta and Google - and there's receipts to prove it
           | thanks to this DoJ case.
           | 
           | Apple want to sell high end hardware. On-device computation
           | is a better user experience, hands down.
           | 
           | That said, Siri is utter dogshit so on-device dogshit is just
           | faster dogshit.
        
             | moneywoes wrote:
             | any private guides for todays smartphone user?
        
               | hehdhdjehehegwv wrote:
               | At this point call your government representatives and
               | ask for new laws, or if you live someplace with laws,
               | actual enforcement (looking at you EU).
               | 
               | The idea that user behavior or consumer choice will
               | change any of this is basically discredited in practice.
               | It will always be cat and mouse until the point that
               | CEOs go to jail; then it will stop.
        
               | lynx23 wrote:
               | CEOs don't go to jail. If they do, it's an exception
               | that is not relevant to the game.
        
           | kortilla wrote:
           | The MacBook Air is not a luxury device. That meme is out of
           | date
        
             | pseufaux wrote:
             | Curious what criteria you're using for qualifying
             | luxury. It seems to me that materials, software, and
             | design are all on par with other more expensive Apple
             | products. The main difference is the chipset, which I
             | would argue is on an equal quality level with the Pro
             | chips but designed for a less power-hungry audience.
        
             | ozim wrote:
             | Maybe for you, but I still see sales guys who refuse to
             | work on Wintel when basically all they do is browse the
             | internet and do spreadsheets; mainly because they would
             | not look cool compared to the other sales guys rocking
             | MacBooks.
        
               | stevage wrote:
               | I don't buy this "looking cool" argument.
               | 
               | I have used both. I think the Mac experience is
               | significantly better. No one is looking at me.
        
               | ozim wrote:
               | I provide laptops to people from time to time. They
               | expect to get a MacBook even if the company is a
               | Windows shop, and they don't have any real arguments
               | for it.
        
             | lolinder wrote:
             | I can't buy a MacBook Air for less than $999, and that's
             | for a model with 8GB RAM, an 8-core CPU and 256GB SSD. The
             | equivalent (based on raw specs) in the PC world runs for
             | $300 to $500.
             | 
             | How is something that is twice as expensive as the
             | competition _not_ a luxury device?
             | 
             | EDIT: Because there's repeated confusion in the replies:
             | I am _not_ saying that a MacBook Air is not objectively
             | a better device. I'm saying it is better by metrics that
             | fall strictly into the "luxury" category.
             | 
             | Better build quality, system-on-a-chip, better OS, better
             | battery life, aluminum case--all of these are luxury
             | characteristics that someone who is looking for a
             | functional device that meets their needs at a decent price
             | won't have as dealbreakers.
        
               | spurgu wrote:
               | Really? You can find a laptop with the equivalent of
               | Apple Silicon for $300-500? And while I haven't used
               | Windows in ages, I doubt it runs as well with 8 GB as
               | macOS does.
        
               | lolinder wrote:
               | Sure, then try this one from HP with 16GB RAM and a CPU
               | that benchmarks in the same ballpark as the M2, for $387:
               | 
               | https://www.amazon.com/HP-Pavilion-i7-11370H-Micro-Edge-
               | Anti...
               | 
               | The point isn't that the MacBook Air isn't _better_ by
               | some metrics than PC laptops. A Rolls-Royce is "better"
               | by certain metrics than a Toyota, too. What makes a
               | device luxury is if it costs substantially more than
               | competing products that the average person would consider
               | a valid replacement.
        
               | zlsa wrote:
               | I'm not sure a machine that benchmarks half as fast as an
               | M2 can be said to be in the same ballpark.
               | 
               | MacBook Air (2022):
               | https://browser.geekbench.com/macs/macbook-air-2022
               | 
               | Ryzen 5 5500U (CPU):
               | https://browser.geekbench.com/processors/amd-
               | ryzen-5-5500u
               | 
               | Ryzen 5 5500U (APU, similar laptop):
               | https://browser.geekbench.com/v5/compute/6751456
        
               | wiseowise wrote:
               | No offense, but you sound like a garbage salesman at a
               | flea market trying to sell his junk.
        
               | breuleux wrote:
               | > the average person would consider a valid replacement
               | 
               | But what is that, exactly? If you look at all aspects of
               | a laptop: CPU, RAM, SSD, battery life, screen quality,
               | build quality, touchpad, OS, and put them in order of
               | importance for the average consumer, what would be on
               | top? I don't think it's the tech specs.
               | 
               | For instance, I would be willing to bet that for a large
               | number of consumers, battery life is far more important
               | than the tech specs, which means that a valid replacement
               | for their MacBook must have equivalent battery life. You
               | also have to consider things like the expected lifespan
               | of the laptop and its resale value to properly compare
               | their costs. It's not simple.
        
               | datadrivenangel wrote:
               | How much does it cost to get a device with comparable
               | specs, performance, and 18 hour battery life?
               | 
               | Closer to $999 than $500.
        
               | lolinder wrote:
               | This CPU benchmarks in the same ballpark as the M2 and it
               | runs for $329:
               | 
               | https://www.amazon.com/Lenovo-IdeaPad-
               | Ryzen5-5500U-1920x1080...
               | 
               | An 18 hour battery life _is_ a luxury characteristic, not
               | something penny pinchers will typically be selecting on.
        
               | outworlder wrote:
               | What about the rest of the system? The SSD, for example?
               | 
               | Apple likes to overcharge for storage, but the drives are
               | _really_ good.
        
               | lolinder wrote:
               | When you're breaking out SSD speeds you're _definitely_
               | getting into "luxury" territory.
               | 
               | As I said in another comment:
               | 
               | The point isn't that the MacBook Air isn't better by some
               | metrics than PC laptops. A Rolls-Royce is "better" by
               | certain metrics than a Toyota, too. What makes a device
               | luxury is if it costs substantially more than competing
               | products that the average person would consider a valid
               | replacement.
        
               | hatsix wrote:
               | There is no user buying a lowest-tier Macbook Air who
               | would be able to tell the difference between the Lenovo
               | SSD and the Macbook SSD.
        
               | FireBeyond wrote:
               | When I bought my cheesegrater Mac Pro, I wanted 8TB of
               | SSD.
               | 
               | Except Apple wanted $3,000 for 7TB of SSD (considering
               | the base price already included 1TB).
               | 
               | Instead, I bought a 4xM.2 PCI card, and 4 2TB Samsung Pro
               | SSDs.
               | 
               | I paid $1,300 for it, got to keep the 1TB "system" SSD.
               | 
               | And I get faster speeds from it, 6.8GBps versus 5.5GBps
               | off the system drive.
               | 
               | For $2,000 I could have gotten the PCIe 4.0 version
               | and SSDs, and hit 26GBps.
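               | 
               | Back-of-envelope $/TB from those numbers (prices as
               | quoted above, rounded):
               | 
               |   // Swift sketch: cost per TB, Apple upgrade vs DIY.
               |   let apple = (usd: 3000.0, tb: 7.0) // 8TB minus base 1TB
               |   let diy   = (usd: 1300.0, tb: 8.0) // 4 x 2TB Samsung Pro
               |   print(apple.usd / apple.tb) // ~428.6 $/TB
               |   print(diy.usd / diy.tb)     // 162.5 $/TB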
        
               | ukuina wrote:
               | Expandability is no longer an option with Apple Silicon.
        
               | FireBeyond wrote:
               | Not technically true. The Mac Pro 2023 has 6 PCI slots...
               | 
               | ... for an eye watering $3,000 over the exact same spec
               | Mac Studio.
               | 
               | I liked my cheesegrater, though I didn't like the heat
               | output.
               | 
               | And I cannot justify that. I sacrificed half the
               | throughput (2800MBps) for $379 and got an external 4 x
               | M.2 TB3 enclosure.
               | 
               | Oh, and a USB 3 hub to replace one I had installed in the
               | cheesegrater to augment the built in ports. $400 give or
               | take.
        
               | goosedragons wrote:
               | They're average. A 512GB M3 MBA gets like 3000MBps for
               | read/write. A 1TB Samsung 990 Pro, which costs less than
               | the upgrade from 256GB to 512GB on the Air is over twice
               | as fast. And on base models Apple skimps and speeds are
               | slower.
        
               | pquki4 wrote:
               | What's the point of comparison? Aren't the 18 hour
               | battery and the Genius Bar part of the "luxury"?
               | 
               | It's like if I say an Audi is a luxury car because a
               | Toyota costs less than half as much, and you ask "what
               | about a Toyota with leather seats?"
        
               | FireBeyond wrote:
               | I am all in on Apple, to be clear. Mac Pros, multiple
               | MBPs, Studio, Pro Display XDR, multiple Watches, phones,
               | iPad Pro.
               | 
               | My experiences (multiple) with Genius Bar have been
               | decidedly more "meh" to outright frustrating, versus
               | "luxury", oftentimes where I know more than the Genius.
               | 
               | Logic Board issues where on a brand new macOS install I
               | could reproducibly cause a kernel panic around graphics
               | hardware. There was an open recall (finally, after
               | waiting MONTHS) on this. It covered my Mac. But because
               | it passed their diagnostic tool, they would only offer to
               | replace the board on a time and materials basis.
               | 
               | I had a screen delamination issue. "It's not that bad -
               | you can't see it when the screen is on, and you have to
               | look for it". Huh. Great "luxury" experience.
               | 
               | And then the multiple "we are going to price this so
               | outrageously, and use that as an excuse to try to
               | upsell". Like the MBA that wouldn't charge due to a
               | circuit issue. Battery fine, healthy. Laptop, fine,
               | healthy, on AC. Just couldn't deliver current to the
               | battery. Me, thinking sure, $300ish maybe with a little
               | effort.
               | 
               | "That's going to be $899 to repair. That's only $100 less
               | than a new MBA, maybe we should take a look at some of
               | the new models?" Uh, no. I'm not paying $900 for a laptop
               | that spends 99% (well, 100% now) of its life on AC power.
        
               | hatsix wrote:
               | Yes, but having all three of those things (well,
               | specs/performance is probably just one thing, but
               | treating them as separate as you did means that I don't
               | have to do the heavy lifting of figuring out what a third
               | thing would actually be) IS, in fact, a luxury.
               | 
               | Nobody is away from a power source for longer than 18
               | hours. MOST people don't need the performance that a
               | macbook air has, their NEEDS would be met by a raspberry
               | pi... that is, basic finances, logging into various
               | services, online banking, things that first world
               | citizens "rely" on.
               | 
               | The definition of luxury is "great comfort and
               | extravagance", and every current Apple product fits
               | that definition. Past Apple definitely had non-luxury
               | products, as recently as the iPhone 5c (discontinued
               | about 10 years ago)... but Apple has eliminated all
               | low-value options from their lineup.
        
               | sfmike wrote:
               | Good question. I think the answer is that even at
               | thousands of dollars, a Windows device's battery can't
               | hit 18 hour specs. Can someone name a Windows device,
               | even at $2k+, that acts like an M chip? In fact the
               | pricier Windows machines usually mean a GPU, and those
               | have worse battery than cheap ones (my 4090 gets an
               | hour or so off charge).
        
               | p_l wrote:
               | Thinkpad X250, admittedly at max specs, did 21 hours in
               | 2018. My T470 from 2020 did over 27 hours at max charge.
               | 
               | M-series Macs are when MacBooks _stopped sucking_ at
               | battery life without sleeping and wrecking state every
               | moment they could.
        
               | vampiresdoexist wrote:
               | Build quality, battery life, clean os install, and the
               | value held over time has no Windows equivalent even at
               | some much higher price points.
        
               | lolinder wrote:
               | The same thing can be (and is!) said about luxury car
               | brands. That's what makes the MacBook Air a luxury item.
               | 
               | Most people, when given the pitch you just gave me for a
               | 2x increase in price, will choose the cheaper item, just
               | like they choose the cheaper car.
        
               | vundercind wrote:
               | They're tools. This attempt to treat them as luxury goods
               | doesn't hold with those. It's entirely common for
               | people who want to do some home repair and aren't
               | clueless about DIY--let alone professionals--to spend
               | 2x the cheapest option, because they know the cheapest one
               | is actually worth $0. More will advocate spending way
               | more than 2x, as long as you're 100% sure you're going to
               | use it a lot (like, say, a phone or laptop, even for a
               | lot of non-computer-geeks). This is true even if they're
               | just buying a simple lowish-power impact driver, nothing
               | fancy, not the most powerful one, not the one with the
               | most features. Still, they'll often not go for the
               | cheapest one, because those are generally not even fit
               | for their intended purpose.
               | 
               | [edit] I mean sure there are people who just want the
               | Apple logo, I'm not saying there are zero of those, but
               | they're also excellent, reliable tools (by the standards
               | of computers--so, still bad) and a good chunk of their
               | buyers are there for that. Even the ones who only have a
               | phone.
        
               | lolinder wrote:
               | I didn't go for the cheapest option: I'm typing this on a
               | laptop that I bought a few months ago for $1200. It has
               | an aluminum case, 32GB RAM, an AMD Ryzen CPU that
               | benchmarks similar to the M3, and 1TB SSD. I can open it
               | up and replace parts with ease.
               | 
               | The equivalent from Apple would currently run me $3200.
               | If I'm willing to compromise to 24GB of RAM I can get one
               | for $2200.
               | 
               | What makes an Apple device a luxury item isn't that it's
               | more expensive, it's that no matter what specs you pick
               | it will _always_ be much more expensive than equivalent
               | specs from a non-luxury provider. The things that Apple
               | provides are _not_ the headline stats that matter for a
               | tool-user, they 're luxury properties that don't actually
               | matter to most people.
               | 
               | Note that there's nothing wrong with buying a luxury
               | item! It's entirely unsurprising that most people on HN
               | looking at the latest M4 chip prefer luxury computers,
               | and that's fine!
        
               | vundercind wrote:
               | Huh. Most of the folks I know on Apple stuff started out
               | PC (and sometimes Android--I did) and maybe even made fun
               | of Apple devices for a while, but switched after exposure
               | to them because they turned out to be far, far better
               | tools. And not even much more expensive, if at all, for
               | TCO, given the longevity and resale value.
        
               | lolinder wrote:
               | Eh, I have to use a MacBook Pro for work because of IT
               | rules and I'm still not sold. Might be because I'm a
               | Linux person who absolutely must have a fully
               | customizable environment, but MacOS always feels so
               | limited.
               | 
               | The devices are great and feel great. Definitely high
               | quality (arguably, luxury!). The OS leaves a lot to be
               | desired for me.
        
               | vundercind wrote:
               | I spent about a decade before switching using Linux as my
               | main :-) Mostly Gentoo and Ubuntu (man, it was good in
               | the first few releases)
               | 
               | Got a job in dual-platform mobile dev and was issued a
               | MacBook. Exposure to dozens of phones and tablets from
               | both ecosystems. I was converted within a year.
               | 
               | (I barely customize anything these days, fwiw--hit the
               | toggle for "caps as an extra ctrl", brew install
               | spectacle, done. Used to have opinions about my graphical
               | login manager, use custom icon sets, all that stuff)
        
               | musicale wrote:
               | > no matter what specs you pick it will always be much
               | more expensive than equivalent specs from a non-luxury
               | provider
               | 
               | On the phone side, I guess you would call Samsung and
               | Google luxury providers? On the laptop side there are a
               | number of differentiating features that are of general
               | interest.
               | 
               | > The things that Apple provides are not the headline
               | stats that matter for a tool-user, they're luxury
               | properties that don't actually matter to most people
               | 
               | Things that might matter to regular people (and tool
               | users):
               | 
               | - design and build for something you use all day
               | 
               | - mic and speakers that don't sound like garbage (very
               | noticeable and relevant in the zoom/hybrid work era)
               | 
               | - excellent display
               | 
               | - excellent battery life
               | 
               | - seamless integration with iPhone, iPad, AirPods
               | 
               | - whole widget: fewer headaches vs. Windows (ymmv);
               | better app consistency vs. Linux
               | 
               | - in-person service/support at Apple stores
               | 
               | It's hard to argue that Apple didn't reset expectations
               | for laptop battery life (and fanless performance) with
               | the M1 MacBook Air. If Ryzen has caught up, then
               | competition is a good thing for all of us (maybe not
               | intel though...) In general Apple isn't bleeding edge,
               | but they innovate with high quality, very usable
               | implementations (wi-fi (1999), gigabit ethernet (2001),
               | modern MacBook Pro design (2001), "air"/ultrabook form
               | factors (2008), thunderbolt (2011), "retina" display and
               | standard ssd (2012), usb-c (2016), M1: SoC/SiP/unified
               | memory/ARM/asymmetric cores/neural engine/power
               | efficiency/battery life (2020) ...and occasionally with
               | dubious features like the touchbar and butterfly keyboard
               | (2016).)
        
               | landswipe wrote:
               | Once the ARM and battery life shift occurs with Linux
               | and Windows, they (i.e. Apple) will be on the front
               | foot again with something new; that's the beauty of
               | competition.
        
               | wiseowise wrote:
               | > It has an aluminum case, 32GB RAM, an AMD Ryzen CPU
               | that benchmarks similar to the M3, and 1TB SSD.
               | 
               | How much does it weigh? Battery life? Screen quality?
               | Keyboard? Speakers?
        
               | Shtirlic wrote:
               | Lenovo ThinkPad P14s (T14) Gen 4, 7840U, $1300, OLED
               | 2.8K 400 nits P3, 64GB RAM, 1TB, keyboard excellent,
               | speakers shitty (using Sony WH-1000XM4), battery
               | (52.5Wh) life not good, not bad; the OLED screen draws
               | a huge amount of power. Weight ~3 lb.
        
               | wiseowise wrote:
               | This spec costs 2k euro in NL. A fully specced Air (15
               | inch) is 2.5k euro, with arguably better everything
               | except RAM, and is completely silent. Doesn't look that
               | much different to me in terms of price.
        
               | faeriechangling wrote:
               | >The things that Apple provides are not the headline
               | stats that matter for a tool-user, they're luxury
               | properties that don't actually matter to most people.
               | 
               | Here lies the rub, ARE those the stats that matter? Or
               | does the screen, touchpad, speakers, battery life,
               | software, support services, etc. matter more?
               | 
               | I feel people just TOTALLY gloss over the fact that Apple
               | is crushing the competition in terms of trackpads +
               | speakers + battery life, which are hardly irrelevant
               | parts of most people's computing experience. Many people
               | hardly use their computers to compute - they mostly use
               | them to input and display information. For such users,
               | memory capacity and processing performance ARE frills,
               | and Apple is a market leader where it's delivering value.
               | 
               | Also even in compute, apple is selling computers with a
               | 512-bit or 1024-bit LPDDR5x bus for a lower price than
               | you can get from the competition. Apple is also
               | frequently leading the pack in terms of compute/watt.
               | This has more niche appeal, but I've seen people buy
               | Apple to run LLM inferencing 24/7 while the Mac Studio
               | sips power.
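               | 
               | For a rough sense of why bus width matters, peak
               | bandwidth is just bytes per transfer times transfer
               | rate (a sketch assuming LPDDR5x at 6400 MT/s; actual
               | Apple configurations vary):
               | 
               |   // busBits/8 bytes per transfer; MT/s is mega-
               |   // transfers per second, so /1000 yields GB/s.
               |   func gbPerSec(busBits: Double, mt: Double) -> Double {
               |       busBits / 8.0 * mt / 1000.0
               |   }
               |   print(gbPerSec(busBits: 512, mt: 6400))  // ~409.6
               |   print(gbPerSec(busBits: 1024, mt: 6400)) // ~819.2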
        
               | bigstrat2003 wrote:
               | Also, those things aren't even true about Apple devices.
               | Apple fanboys have been convinced that their hardware
               | really is way better than everything else for decades. It
               | has never been true and still isn't.
        
               | lupire wrote:
               | Luxury car market is 20% of US car market. Even more
               | among people who can afford the option. It's not "most
               | people" but it's not an outlier either.
        
               | nxicvyvy wrote:
               | Clean OS install? You haven't used Windows in a while,
               | have you?
               | 
               | I'm a Linux guy but am forced to use Macs and Windows
               | every now and then.
               | 
               | Windows has outpaced macOS for a decade straight.
               | 
               | macOS looks like it hasn't been updated in years. It's
               | constantly bugging me for passwords for random things.
               | It is objectively the worst OS. I'd rather work on a
               | Chromebook.
        
               | theshackleford wrote:
               | I'm not a single operating system guy like you. I use all
               | three professionally. I've never had the bizarre struggle
               | you describe.
        
               | worthless-trash wrote:
               | I think he has different criteria on what bothers him;
               | that's okay though, isn't it? I get a little annoyed at
               | anything where I have to use a touchpad, not enough to
               | rant about it, but it definitely increases friction
               | (haha) in my thought process.
        
               | wiseowise wrote:
               | > Macos looks like it hasn't been updated in years.
               | 
               | Maybe the only reason Windows outpaced macOS in this
               | for you is because Windows started as crap and now
               | barely looks like a proper OS?
               | 
               | https://ntdev.blog/2021/02/06/state-of-the-windows-how-
               | many-...
               | 
               | Thank God Mac hasn't changed that much. I absolutely love
               | its UI.
        
               | pompino wrote:
               | What metrics are you using for build quality?
               | Admittedly I don't know a ton of Mac people (I'm an
               | engineer working in manufacturing), but with the Mac
               | people I know, stuff always breaks, and then they brag
               | about how Apple took care of it for free.
        
               | kotaKat wrote:
                | I can walk into many Walmarts in the US right now with
                | $699 and walk out with an MBA with the M1. That's a damn
                | good deal.
        
               | lolinder wrote:
               | That's still ~twice as expensive as the items I linked to
               | below, and that's at clearance prices.
               | 
               | A good deal on a luxury item still gets you a luxury
               | item.
               | 
               | And if we want to compare Walmart to Walmart, this thing
               | currently runs for $359 and has 16GB RAM, 512GB SSD, and
               | a CPU that benchmarks slightly faster than the _M2_ :
               | 
               | https://www.walmart.com/ip/Acer-Aspire-3-15-6-inch-
               | Laptop-AM...
        
               | epcoa wrote:
                | Oh dear. A 16:10 screen with superior resolution,
                | brightness, and gamut - and it still gets superior
                | battery life driving all those pixels. That's a headline
                | feature that even a non-propellerhead can observe (I was
                | honestly surprised, when I looked up that Acer screen,
                | what a dim, narrow piece of shit it is) - notably, there
                | are ballpark-priced systems with better screens.
                | 
                | I think you unjustifiably downplay how much of a selling
                | point a screen that looks great (or at least decent) on
                | the floor is. I know tons of devs that put up with the
                | 45% NTSC abominations on Thinkpads that aren't even
                | suitable for casual photo editing or web media. Just
                | because you make do with that doesn't automatically make
                | a halfway decent display on a laptop a "luxury".
                | 
                | Sorry, but I don't buy the "everything that isn't a $300
                | econo shit laptop is luxury" thesis repeated ad nauseam.
        
               | int_19h wrote:
               | What defines "luxury" exactly if not the combination of
               | price and "premium experience"?
        
               | tsimionescu wrote:
               | "Luxury" often includes some amount of pure status
               | symbols added to the package, and often on what is
               | actually a sub-par experience. The quintessential luxury
               | tech device were the Vertu phones from just before and
               | even early in the smartphone era - mid-range phones tech
               | and build quality-wise, with encrusted gems and gold
               | inserts and other such bling, sold at several thousand
               | dollars (Edit: they actually ranged between a few
               | thousand dollars all the way to 50,000+).
               | 
               | But the definition of luxury varies a lot by product
               | category. Still, high-end and luxury are separate
               | concepts, which ven when they do overlap.
        
               | darkwater wrote:
                | You just made up the "sub-par experience" as a defining
                | point of a luxury product. A luxury product is defined by
                | being a status symbol (check for all Apple devices) and
                | especially by its price. A luxury car like a Bentley will
                | still bring you from point A to point B just like the
                | cheapest Toyota.
        
               | tsimionescu wrote:
               | I didn't say that sub-par experience was a _requirement_
               | , I said it was often a part of luxury products. Or, more
               | precisely, I should have said that something being of
               | excellent build quality and offering excellent, top of
               | the line experience is neither sufficient nor necessary
               | for being a luxury good.
               | 
               | It is true though that luxury goods are, often, top of
               | the line as well. Cars and watches are often examples of
               | this. Clothes are a much more mixed bag, with some luxury
               | brands using excellent materials and craftsmanship, while
               | others use flashy design and branding with mediocre
               | materials and craftsmanship.
               | 
               | Exactly where Apple sits is very debatable in my
               | experience. I would personally say that many of their
               | products are far too affordable and simple to be
               | considered luxury products - the iPhone in particular.
               | The laptops I'm less sure about.
        
               | darkwater wrote:
                | Fair enough. Apple is clearly not in the same luxury
                | league as a Bentley or a yacht, but it's totally like a
                | Mercedes, to continue the car analogy. You get a "plus"
                | for the extra money, but then it's open for debate
                | whether that "plus" is worth it or not. And it's actually
                | the source of many flamewars on the Internet.
        
               | acdha wrote:
                | I think the Mercedes comparison (or the more common BMW
                | one) is also useful for getting across the idea that not
                | every manufacturer is competing for the same segments,
                | but that prices within segments are generally close. No
                | Mercedes is as cheap as a Camry, but a Lexus is similar.
               | 
               | This comes up so often in these flame wars where people
               | are really saying "I do/don't think you need that
               | feature" and won't accept that other people aren't
               | starting from the same point. I remember in the 90s
               | reading some dude on Fidonet arguing that Macs were
               | overpriced because they had unnecessary frills like sound
               | cards and color displays; I wasn't a Mac user then but
               | still knew this was not a persuasive argument.
        
               | lupire wrote:
               | Luxury means beauty and comfort, beyond the bare
               | necessity of function.
        
               | tsimionescu wrote:
               | By that definition, Zara is a luxury clothing brand,
               | Braun is a luxury appliance maker, and Renault is a
               | luxury car brand. I think it requires significantly more.
        
               | int_19h wrote:
               | That would also apply to Apple products then, and
               | especially so to their laptops. I actually bought a
               | MacBook Air recently and the thing that I like most about
               | it is how comfortable the keyboard and especially the
               | trackpad is compared even to high-end ThinkPads. And, on
               | the other hand, the trackpad on my T14s is certainly
               | quite sufficient to operate it, so this comfort that
               | MacBook offers is beyond the bare necessity of function.
        
               | positus wrote:
               | I doubt I am alone in saying that I would gladly pay
               | twice the price to avoid having to use Windows. It's the
               | most user-hostile, hand-holdy, second-guess-and-confirm-
               | my-explicit-command-ey os I've used to date. And
               | bloatware baked in? No thanks.
        
               | neoromantique wrote:
               | ...but that is a luxury.
        
               | positus wrote:
               | You're probably right. I am in the middle-class, maybe
               | lower middle-class, and I live in the US. I have
               | advantages and opportunities that many in other
               | circumstances do not and I am sincerely grateful for
               | them.
        
               | fl0ki wrote:
               | Damn right it is.
        
               | bigstrat2003 wrote:
               | Windows is pretty shit these days, but it's not the only
               | other option. Linux is far more sane than MacOS or
               | Windows.
        
               | Dylan16807 wrote:
               | Good news, not using windows is free.
        
               | eigen wrote:
               | > this thing currently runs for $359 and has 16GB RAM,
               | 512GB SSD, and a CPU that benchmarks slightly faster than
               | the M2:
               | 
                | I'm seeing significant differences in performance
                | between the Acer Aspire A315 and the M2 MacBook Air; the
                | Acer is ~33% of the M2 for 50% of the price.
               | 
               | https://browser.geekbench.com/v6/cpu/compare/5942766?base
               | lin...
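                | 
                | In per-dollar terms (same rough numbers, my arithmetic):
                | 
                |     # relative performance per dollar vs. the M2 Air
                |     acer_perf, acer_price = 0.33, 0.50
                |     print(acer_perf / acer_price)   # ~0.66x per dollar
                | 
                | So even normalized for price, the Acer delivers roughly
                | two thirds of the M2's performance per dollar.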
        
               | darkwater wrote:
               | Nowadays (well actually this has been true for the last
               | 10 years) a normal user won't care about that extra perf
               | ratio. The Aspire is "fast enough".
        
               | antifa wrote:
               | I'm gonna go out on a limb and guess we're also looking
               | at 10 hr battery vs 1hr battery.
        
               | darkwater wrote:
                | No brand new laptop has a 1h battery. Also, battery life
                | in the sense of "I can work a full day unplugged from
                | AC" is something that matters only to a subset of laptop
                | users, and mostly under specific conditions (i.e. long
                | travels).
        
               | AdamN wrote:
               | That's more like cheap vs middle of the road. There is no
               | luxury space in laptops - displays, iPads, and
               | workstations maybe but that's it (and those are more pro
               | than luxury).
               | 
                | $999 amortized over 3 years is about $28/mo, which is
                | less than what even middle class people spend on coffee.
        
               | amon22 wrote:
                | Sorry, but linking Acer crap doesn't help your point.
        
               | wiseowise wrote:
               | Weight
               | 
               | > 3.92 lb
               | 
               | Battery life
               | 
               | > 6.5 h
               | 
               | Probably half of that.
               | 
                | Fans that sound like a jet engine, screen quality that
                | would force me to stab my eyes, speakers sounding worse
                | than a toilet, plastic build.
               | 
               | I'm not convinced.
        
               | FireBeyond wrote:
                | The Walmart variant was introduced 6 weeks ago to offload
                | excess stock of a four-year-old discontinued model. I'm
                | not sure your argument of "at only 70% of the price of a
                | model two generations newer" is the sales pitch you think
                | it is.
        
               | vinkelhake wrote:
               | Walmart is currently selling Apple's old stock of M1
               | Airs. You can get the 8GB 256GB version for $699.
        
               | lolinder wrote:
               | See my reply to the person that beat you to it:
               | 
               | > https://news.ycombinator.com/item?id=40292804
               | 
               | tl;dr is that Walmart is _also_ selling an Acer for $359
               | that beats that device on every headline metric.
               | 
               | It's nice to know that I could get the old-gen model for
               | slightly cheaper, but that's still an outrageous price if
               | the MacBook Air isn't to be considered a luxury item.
        
               | zer0zzz wrote:
                | It's half the price because by the time the MacBook Air
                | dies you're on your second or third Acer.
        
               | lolinder wrote:
               | My last Acer lasted me six years until I decided to
               | replace it for more power (which, notably, I would have
               | done with a MacBook by then too). They're not as well
               | built as a MacBook, but they're well built enough for the
               | average laptop turnover rate.
        
               | zer0zzz wrote:
               | That's fair
        
               | Der_Einzige wrote:
               | The apple defense force would rather die than admit that
               | Apple hardware is overpriced and a bad value.
               | 
                | 8GB of RAM was pathetic in 2018, and is SUPER pathetic
                | in 2024.
        
               | fl0ki wrote:
               | If it was actually bad value they wouldn't sell as high
               | as they do and review with as much consumer satisfaction
               | as they do.
               | 
               | These products may not offer you much value and you don't
               | have to buy them. Clearly plenty of people and
               | institutions bought them because they believed they
               | offered the best value to them.
        
               | lolinder wrote:
               | Agreed. I'd definitely make the same arguments here as I
               | would for an Audi. There's clearly a market, and that
               | means they're not a bad value _for a certain type of
               | person_.
        
               | lupire wrote:
                | All I know about Audi is that it cost $14,000 to repair
                | a scrape from a parking garage wall.
        
               | bigstrat2003 wrote:
               | If people were actually rational that might be true, but
               | they aren't. Apple survives entirely on the fact that
               | they have convinced people they are cool, not because
               | they actually provide good value.
        
               | wiseowise wrote:
               | Any examples where they don't provide good value?
        
               | wiseowise wrote:
                | There are literally dozens of videos showing that 8 GB
                | is more than enough for casual or even entry-level
                | development use.
               | 
               | https://www.youtube.com/watch?v=kHKIcBWbnjo
        
               | wiseowise wrote:
               | > tl;dr is that Walmart is also selling an Acer for $359
               | that beats that device on every headline metric.
               | 
               | Try this:
               | 
               | https://news.ycombinator.com/item?id=40297295
        
               | zer0zzz wrote:
                | What you're arguing is that a product meeting the basic
                | criteria of a good product makes it luxury. That seems
                | pretty wild to me.
                | 
                | No one calls a Toyota Camry with base options luxury, but
                | it works well for a long time and has good quality.
        
               | lolinder wrote:
               | My Acer Aspire lasted me for tens of thousands of hours
               | of use and abuse by small children over 6 years until I
               | replaced it this year because I finally felt like I
               | wanted more power. _That 's_ the Toyota Camry of laptops.
               | 
               | The features that Apple adds on top of that are strictly
               | optional. You can very much prefer them and think that
               | they're essential, but that doesn't make it so. Some
               | people feel that way about leather seats.
        
               | zer0zzz wrote:
               | No, that's a Corolla or a Kia Forte of laptops.
        
               | zombiwoof wrote:
                | I bought a used ThinkPad X13 for 350 bucks. It won me
                | over from my M3 MacBook Pro that cost 4 times as much.
        
               | skibbityboop wrote:
               | I got a Latitude 9430 on eBay for $520. This thing is an
               | amazing laptop and I'd put it right there with the Macs I
               | have to work with at dayjob, as far as build
               | quality/feel.
        
               | insaneirish wrote:
               | > How is something that is twice as expensive as the
               | competition not a luxury device?
               | 
               | You can buy a version of <insert product here> from
               | Walmart at 1/2 price of a "normal" retailer. Does that
               | mean every "normal" retailer is actually a luxury goods
               | dealer?
               | 
               | Is my diner a luxury restaurant because a burger costs
               | twice as much as McDonald's?
               | 
               | Stop the silliness.
        
               | lolinder wrote:
               | All I'm learning from comments like this is that there
               | are a _lot_ of people who are very resistant to the idea
               | that they buy luxury goods.
        
               | golergka wrote:
               | When I buy a Rick Owens coat for $3k, sure it's a luxury
               | good. It protects from the elements just the same, I know
               | that I overpay only because it looks nice. But when I pay
               | the same for the device I need for my work and use for 12
               | hours a day, it's not luxury -- it's just common sense.
               | I've tried working with Windows and Linux, and I know
               | that I'm paying not only for specs, but because the sum
               | of all the qualities will result in a much better
               | experience -- which will allow me to work (and earn
               | money) faster and with less headache.
        
               | antifa wrote:
                | $1000 for a laptop that will last 10 years seems crazy to
                | call a luxury, when we have Alienware/Apple laptops that
                | go for 2k to 5k+ and demographics that buy them yearly.
        
               | Dylan16807 wrote:
               | > You can buy a version of <insert product here> from
               | Walmart at 1/2 price of a "normal" retailer. Does that
               | mean every "normal" retailer is actually a luxury goods
               | dealer?
               | 
               | What percent of that retailer's products does that
               | comparison apply to?
               | 
               | If it's more than half then yeah that's probably a luxury
               | goods dealer.
        
               | wiseowise wrote:
               | > The equivalent (based on raw specs) in the PC world
               | runs for $300 to $500.
               | 
                | Equivalent device?! Find me a Windows laptop in ANY price
                | category that can match the weight, fanless design,
                | screen quality, speaker quality, and battery life of the
                | Air.
        
           | moritzwarhier wrote:
           | > > In case it is not abundantly clear by now: Apple's AI
           | strategy is to put inference (and longer term even learning)
           | on edge devices. This is completely coherent with their
           | privacy-first strategy (which would be at odds with sending
           | data up to the cloud for processing).
           | 
           | > Their primary business goal is to sell hardware.
           | 
           | There is no contradiction here. No need for luxury. Efficient
           | hardware scales, Moore's law has just been rewritten, not
           | defeated.
           | 
            | Power efficiency combined with shared and extremely fast
            | RAM is still a formula for success, as long as they are
            | able to deliver.
           | 
            | By the way, M-series MacBooks have crossed into bargain
            | territory by now compared to WinTel in some specific (but
            | large) niches, e.g. the M2 Air.
           | 
           | They are still technically superior in power efficiency and
           | still competitive in performance in many common uses, be it
           | traditional media decoding and processing, GPU-heavy tasks
           | (including AI), single-core performance...
           | 
           | By the way, this includes web technologies / JS.
        
             | ewhanley wrote:
              | This is it. An M-series Air is an incredible machine for
              | most people - people who likely won't ever write a line of
              | JS or use a GPU. Email, banking, YouTube, etc. on a device
              | with incredible battery and hardware that will likely be
              | useful for a decade is perfect. The average user hasn't
              | even heard of HN.
        
               | xvector wrote:
               | It's great for power users too. Most developers really
               | enjoy the experience of writing code on Macs. You get a
               | Unix based OS that's just far more usable and polished
               | than a Linux laptop.
               | 
               | If you're into AI, there's objectively literally no other
               | laptop on the planet that is competitive with the GPU
               | memory available on an MBP.
        
               | wiseowise wrote:
               | It's an amazing machine for engineers too.
        
               | LettuceSand12 wrote:
               | Are newer airs good enough for development?
        
               | moritzwarhier wrote:
                | Depends on your workload. RAM and passive cooling are the
                | most likely issues, but afaik an M2/M3 with 16GiB still
                | performs a lot better than a similarly priced x64
                | laptop. Active cooling doesn't mean no throttling either.
                | 
                | If you don't explicitly want a laptop, a 32GB M2 Pro Mac
                | Mini would be a good choice, I think.
                | 
                | Personally, I have only used MBPs so far.
                | 
                | But the M-series Airs are not remotely comparable to the
                | old Intel Airs, that's for sure :)
        
           | jimbokun wrote:
           | For all their competitors it's not true right now.
        
           | VelesDude wrote:
            | I think it was Paul Thurrott on the Windows Weekly podcast
            | who said that all these companies don't really care about
            | privacy. Apple takes billions of dollars a year to direct
            | data towards Google via the search defaults. Clearly privacy
            | has a price. And I suspect it will only get worse with time
            | as they keep chasing the next quarter.
            | 
            | Tim Cook unfortunately is so captured in that quarterly
            | mindset of "please the shareholders" that it is only a
            | matter of time.
        
             | cvwright wrote:
             | It doesn't matter to me if they "really care" about privacy
             | or not. Megacorps don't "really care" about anything except
             | money.
             | 
             | What matters to me is that they continue to see privacy as
             | something they can sell in order to make money.
        
               | VelesDude wrote:
               | Yeah, some poor phrasing on my behalf.
               | 
               | I do hope that those working in these companies actually
               | building the tools do care. But unfortunately, it seems
               | that corruption is an emergent property of complexity.
        
             | Cthulhu_ wrote:
             | The Google payments are an interesting one; I don't think
             | it's a simple "Google pays them to prefer them", but a
             | "Google pays them to stop them from building a competitor".
             | 
              | Apple is in a position to build a competing search
              | product, but the amount Google pays is roughly what Apple
              | would have to earn from it, and that is improbable even
              | though it would mean they could set their own search
              | engine as the default.
        
             | spacebanana7 wrote:
             | Apple isn't ethically Mullvad, but they're much better than
             | some of their android competitors who allow adverts on the
             | lock screen.
        
           | raincole wrote:
            | Privacy is a luxury today, so yeah, selling luxury
            | hardware and the promise of privacy form a coherent
            | business position.
        
           | ngcc_hk wrote:
            | If you average it out, it is not that expensive - unlike
            | Android phones, and let's not even talk about Android
            | tablets.
            | 
            | The important thing is that a good starting AI learning
            | platform is what ... most Apple products don't touch those
            | prices.
            | 
            | Hence, with privacy, it is a good path.
            | 
            | You do not want communism even if it is not expensive in
            | the short term.
        
           | sneak wrote:
            | It's not even true now. Apple (and by extension the USG and
            | CCP) can read ~every iMessage. The e2ee is backdoored.
        
           | pompino wrote:
           | >The promise of privacy is one way in which they position
           | themselves, but I would not bet the bank on that being true
           | forever.
           | 
           | They failed with their ad-business so this is a nice pivot.
           | I'll take it, I'm not usually a cheerleader for Apple, but
           | I'll support anyone who can erode Google's surveillance
           | dominance.
        
           | m463 wrote:
           | > The promise of privacy
           | 
           | I have very little faith in apple in this respect.
           | 
            | For clarity, just install Little Snitch on your machine and
            | watch what happens with your system. Even without being
            | signed in with an Apple ID and everything turned off, Apple
            | phones home all the time.
        
             | transpute wrote:
              | You can block 17.0.0.0/8 at the router, opening up only
              | the notification servers. CDNs are a bit harder, but can
              | be done with dnsmasq allow/deny of wildcard domains.
              | Apple has documentation on network traffic from their
              | devices, https://support.apple.com/en-us/101555
        
               | m463 wrote:
                | You can also use Privoxy and set it in network settings.
        
           | lynx23 wrote:
           | Every company is selling one thing or another, and nothing is
           | going to last forever. I really fail to see what, except for
           | generic negativity, your comment adds to anything.
        
           | jsxlite wrote:
            | I wouldn't bank on that being true forever after 2012. A
            | corporation's goals are largely determined by its corporate
            | structure.
        
           | Refusing23 wrote:
            | Their 2nd-largest revenue source (at ... 20-25%, below only
            | the iPhone) is software services.
            | 
            | iCloud, App Store revenue, Apple TV, and so on.
        
           | ActorNightly wrote:
           | >The promise of privacy is one way in which they position
           | themselves, but I would not bet the bank on that being true
           | forever.
           | 
           | Everyone seems to have forgotten about the Celebrity iCloud
           | photo leak.
        
           | 8fingerlouie wrote:
           | > but it is about selling luxury hardware.
           | 
           | While Apple is first and foremost a hardware company, it has
           | more or less always been about the "Apple experience".
           | They've never "just" been a hardware company.
           | 
           | For as long as Apple has existed, they've done things "their
           | way" both with hardware and software, though they tend to
           | want to abstract the software away.
           | 
           | If it was merely a question of selling hardware, why does
           | iCloud exist ? or AppleTV+, or Handoff ? or iMessage, or the
           | countless other seemingly small life improvements that
           | somehow the remainder of the industry cannot seem to figure
           | out how to do well.
           | 
           | Just a "simple" thing as switching headphones seamlessly
           | between devices is something i no longer think about, it just
           | happens, and it takes a trip with a Windows computer and a
           | regular bluetooth headset to remind me how things used to be.
           | 
           | As part of their "privacy first" strategy, iMessage also fits
           | in nicely. Apple doesn't have to operate a huge instant
           | messaging network, which undoubtedly is not making a profit,
           | but they do, because having one entry to secure, encrypted
            | communication fits well with the Apple Experience. iMessage
            | did so well at abstracting the ugly details of encryption
            | that few people even realize that's what the blue bubble is
            | actually about; it more or less only means your message is
            | end-to-end encrypted. As a side effect you can also send
            | full resolution images (and more), but that's in no way
            | unique to iMessage.
        
           | thefz wrote:
           | > The promise of privacy is one way in which they position
           | themselves, but I would not bet the bank on that being true
           | forever.
           | 
            | Everybody is so quick to forget that Apple was/is part of
            | PRISM like any other company.
        
           | heresie-dabord wrote:
           | > The promise of privacy is one way in which they position
           | themselves, but I would not bet the bank on that being true
           | forever.
           | 
           | Indeed.
           | 
           | Privacy starts with architectural fundamentals that are very
           | difficult to retrofit...
           | 
           | If a supplier of products has not built the products this
           | way, it would be naive to bet bank or farm on the supplier.
           | Even if there were profound motivation to retrofit.
           | 
           | Add to this the general tendency of the market to exploit its
           | customers.
        
           | oorza wrote:
           | > The promise of privacy is one way in which they position
           | themselves, but I would not bet the bank on that being true
           | forever.
           | 
           | There are a ton of us out here that consciously choose Apple
           | because of their position on privacy. I have to imagine they
           | know how many customers they'll lose if they ever move on
           | this, and I want to believe that it's a large enough
           | percentage to prevent it from happening. Certainly my circle
           | is not a useful sample, but the Apple people in it are almost
           | all Apple people because of privacy.
        
           | Guthur wrote:
            | That is not where Apple's growth has been for quite some
            | time; it's services. And because of that, I'll be awaiting
            | the economic rent-extraction strategy to come at any
            | moment.
        
           | ethagknight wrote:
            | Nearly 1/4 of their revenue is Apple services, and it's
            | their primary source of revenue growth to feed the beast.
        
           | dabbz wrote:
            | When I was at Apple for a short time, there was a small joke
            | I heard from the ex-Amazonians there who would say: "What's
            | the difference between an Apple software engineer and an
            | Amazon software engineer? The Amazon engineer will spin up a
            | new service on AWS. An Apple engineer will spin up a new
            | app." Or something along those lines; I forget the exact
            | phrasing. It was a joke that Apple's expertise is in
            | on-device features, whereas Amazon thrives in the cloud
            | services world.
        
             | golergka wrote:
             | What would a Google engineer do? Write a design doc or
             | sunset a service?
        
           | amelius wrote:
           | What do you mean luxury? Samsung produces phones that are 2x
           | more expensive and are more luxurious.
           | 
            | E.g. Apple still doesn't offer a flip phone that turns into
            | a tablet when you open it.
        
         | SpaceManNabs wrote:
          | This comment is odd. I wouldn't say it is misleading, but it
          | is odd because it borders on that definition.
         | 
         | > Apple's AI strategy is to put inference (and longer term even
         | learning) on edge devices
         | 
         | This is pretty much everyone's strategy. Model distillation is
         | huge because of this. This goes in line with federated
         | learning. This goes in line with model pruning too. And
         | parameter efficient tuning and fine tuning and prompt learning
         | etc.
         | 
         | > This is completely coherent with their privacy-first strategy
         | 
         | Apple's marketing for their current approach is privacy-first.
         | They are not privacy first. If they were privacy first, you
         | would not be able to use app tracking data on their first party
         | ad platform. They shut it off for everyone else but themselves.
         | Apple's approach is walled garden first.
         | 
         | > Processing data at the edge also makes for the best possible
         | user experience because of the complete independence of network
         | connectivity
         | 
         | as long as you don't depend on graph centric problems where
         | keeping a local copy of that graph is prohibitive. Graph
         | problems will become more common. Not sure if this is a problem
         | for apple though. I am just commenting in general.
         | 
         | > If (and that's a big if) they keep their APIs open to run any
         | kind of AI workload on their chips
         | 
         | Apple does not have a good track record of this; they are quite
         | antagonistic when it comes to this topic. Gaming on apple was
         | dead for nearly a decade (and pretty much still is) because
         | steve jobs did not want people gaming on macs. Apple has eased
         | up on this, but it very much seems that if they want you to use
         | their devices (not yours) in a certain way, then they make it
         | expensive to do anything else.
         | 
         | Tbf, I don't blame apple for any of this. It is their strategy.
         | Whether it works or not, it doesn't matter. I just found this
         | comment really odd since it almost seemed like evangelism.
         | 
          | edit: weird to praise Apple for on-device training when it is
          | not publicly known if they have trained any substantial model
          | even in the cloud.
        
           | nomel wrote:
           | > This is pretty much everyone's strategy.
           | 
            | I think this is being too charitable on the state of
            | "everyone". It's everyone's _goal_. Apple is actively
            | achieving that goal, with their many-year _strategy_ of
            | in-house silicon/features.
        
             | SpaceManNabs wrote:
             | > Apple is actively achieving that goal, with their many
             | year strategy of in house silicon/features
             | 
              | So are other companies, with their many-year strategy of
              | actually building models that are accessible to the
              | public.
              | 
              | Yet Apple is "actively" achieving the goal without any
              | distinct models.
        
               | nomel wrote:
               | No. "On edge" is not a model existence limitation, it is
               | a hardware capability/existence limitation, by
               | definition, and by the fact that, as you point out, the
               | models already _exist_.
               | 
               | You can already run those open weight models on Apple
               | devices, on edge, with huge improvements on the newer
               | hardware. Why is a distinct model required? Do the rumors
               | appease these thoughts?
               | 
               | If others are making models, with no way to actually run
               | them, that's not a viable "on edge" strategy, since it
               | involves waiting for someone else to actually accomplish
               | the goal first (as is being done by Apple).
        
               | SpaceManNabs wrote:
               | > "On edge" is not a model existence limitation
               | 
                | It absolutely is. Model distillation will still be
                | pertinent, and so will parameter-efficient tuning for
                | edge training. I cannot emphasize enough how important
                | this is. You will need your own set of weights. If
                | Apple wants to use open weights, then sure, ignore
                | this. It doesn't seem like they want to long-term...
                | And even if they use open weights, they will still be
                | behind other companies that have done model
                | distillation and federated learning for years.
               | 
               | > Why is a distinct model required?
               | 
                | Ask Apple's newly poached AI hires this question. It
                | doesn't seem like you would take an answer from me.
               | 
               | > If others are making models, with no way to actually
               | run them
               | 
               | Is this the case? People have been running distilled
               | llamas on rPis with pretty good throughput.
        
               | nomel wrote:
                | > And even if they use open weights, they will still be
                | behind other companies that have done model distillation
                | and federated learning for years.
               | 
               | I'm sorry, but we're talking about "on edge" here though.
               | Those other companies _have no flipping hardware_ to run
               | it  "on edge", in a "generic" way, which is the goal.
               | Apple's strategy involves the generic.
               | 
               | > If apple wants to use open weights
               | 
               | This doesn't make sense. Apple doesn't dictate the models
               | you can use with their hardware. _You can already
               | accelerate LLAMA with the neural engines_. You can
               | download the app right now. _You can already deploy your
               | models on edge, on their hardware_. That _is_ the success
               | they 're achieving. You _cannot_ effectively do this on
               | competitor hardware, with good performance, from
               | "budget" to "Pro" lineup, which is a requirement of the
               | goal.
               | 
                | > they will still be behind other companies that have
                | done model distillation and federated learning for
                | years.
               | 
               | What hardware are they running it on? Are they taking
               | advantage of Apple (or other) hardware in their strategy?
               | Federated learning is an _application_ of  "on edge", it
               | doesn't *enable* on edge, which is part of Apple's
               | strategy.
               | 
               | > Ask apple's newly poached AI hires this question.
               | Doesn't seem like you would take an answer from me.
               | 
               | Integrating AI in _their_ apps /experience is not the
               | same as enabling a generic "on edge", default, capability
               | in all Apple devices (which they have been working
               | towards for years now). This is the end goal for "on
               | edge". You seem to be talking about OS integration, or
               | something else.
               | 
               | > People have been running distilled llamas on rPis with
               | pretty good throughput.
               | 
               | Yes, the fundamental limitation there being hardware
               | performance, not the model, with that "pretty good"
               | making the "pretty terrible" user experience. But,
               | there's also nothing stopping anyone from running these
               | distilled (a requirement of limited hardware) models on
               | Apple hardware, taking advantage of Apples fully defined
               | "on edge" strategy. ;) Again, you can run llamas on Apple
               | silicon, accelerated, as I do.
        
               | SpaceManNabs wrote:
               | > Those other companies have no flipping hardware to run
               | it "on edge", in a "generic" way, which is the goal
               | 
               | Maybe? This is why I responded to:
               | 
               | > It's everyone's goal. Apple is actively achieving that
               | goal
               | 
                | This is the issue I found disagreeable. Other
                | organizations and individual people are achieving that
                | goal too. Google says Gemini Nano is going on-device,
                | and if the benchmarks are to be believed, if it runs at
                | that level, their work so far is also actively achieving
                | that goal. Meta has released multiple distilled models
                | that people have already proven to run inference at the
                | device level. It cannot be argued that Meta is not
                | actively achieving that goal either. They don't have to
                | release the hardware because they went a different
                | route. I applaud Apple for the M chips. They are super
                | cool. People are still working on using them so Apple
                | can realize that goal too.
               | 
               | So when you go to the statement that started this
               | 
               | > Apple's AI strategy is to put inference (and longer
               | term even learning) on edge devices
               | 
                | Multiple orgs also share this. And I can't say that one
                | particular org is super ahead of the others. And I
                | can't elevate Apple in that race because it is not
                | clear that they are truly privacy-focused or that they
                | will keep APIs open.
               | 
               | > You cannot effectively do this on competitor hardware,
               | with good performance, from "budget" to "Pro" lineup,
               | which is a requirement of the goal
               | 
                | Why do you say you cannot do this with good performance?
                | How many tokens per second do you want from a device? Is
                | 30 T/s enough? You can do that on laptops running small
                | Mixtral.
               | 
               | > What hardware are they running it on? Are they taking
               | advantage of Apple (or other) hardware in their strategy?
               | 
                | I don't know. I have nothing indicating necessarily
                | Apple or Nvidia or otherwise. Do you?
               | 
               | > [Regarding the rest]
               | 
                | Sure, my point is that they definitely have an intent
                | for bespoke models, and that is why I raised the point
                | that not all computation will be feasible on edge for
                | the time being. What prompted this particular line of
                | inquiry is whether a pure edge experience truly enables
                | the best user experience. It is also why I raised the
                | point about Apple's track record with open APIs, which
                | is why "actively achieving" is something that I put
                | doubt on. And I also cast doubt on Apple being privacy
                | focused. Just emphasizing this to tie it back to the
                | reason I even commented.
        
           | jameshart wrote:
           | Everyone's strategy?
           | 
           | The biggest players in commercial AI models at the moment -
           | OpenAI and Google - have made absolutely no noise about
           | pushing inference to end user devices at all. Microsoft,
           | Adobe, other players who are going big on embedding ML models
           | into their products, are not pushing those models to the
           | edge, they're investing in cloud GPU.
           | 
           | Where are you picking up that this is _everyone's_ strategy?
        
             | SpaceManNabs wrote:
             | > Where are you picking up that this is everyone's
             | strategy?
             | 
             | Read what their engineers say in public. Unless I
             | hallucinated years of federated learning.
             | 
              | Also, Apple isn't even a player yet and everyone is
              | discussing how they are moving stuff to the edge, lol.
              | You can't critique companies for not being on the edge
              | yet when Apple doesn't have anything out there.
        
             | smj-edison wrote:
              | I believe at least Google is starting to do edge inference
              | - take a look at the Pixel 8 lineup they just announced.
              | It doesn't seem to be emphasized as much, but the Tensor
              | G3 chip certainly has built-in inference.
        
             | losvedir wrote:
             | Of course Google is. That's what Gemini Nano is for.
        
         | kernal wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | How can they have a privacy first strategy when they operate an
         | Ad network and have their Chinese data centers run by state
         | controlled companies?
        
           | KerrAvon wrote:
           | How can I have mint choc and pineapple swirl ice cream when
           | there are children starving in Africa?
        
             | edm0nd wrote:
             | Fuck them kids. It's delicious af.
        
           | n9 wrote:
            | ... I think the more correct assertion would be that Apple
            | is a sector leader in privacy, if only because their
            | competitors make no bones about violating the privacy of
            | their customers, as it is the basis of their business model.
            | So it's not that Apple is A+ so much as the other students
            | are getting Ds and Fs.
        
         | strangescript wrote:
          | The fundamental problem with this strategy is model size. I
          | want all my apps to be privacy-first with local models, but
          | there is no way they can share models in any kind of coherent
          | way, especially when good apps are going to fine-tune their
          | models. Every app is going to be 3GB+.
        
           | tyho wrote:
           | Foundation models will be the new .so files.
        
             | flawsofar wrote:
             | And fine tuning datasets will be compressed and sent rather
             | than the whole model
        
             | strangescript wrote:
              | This would be interesting, but it also feels a little
              | restrictive. Maybe something like LoRA could bridge the
              | capability gap, but if a competitor then drops a much more
              | capable model, you either have to ignore it or bring it
              | into your app.
             | 
             | (assuming companies won't easily share all their models for
             | this kind of effort)
        
           | SpaceManNabs wrote:
           | I don't think HN understands how important model distillation
           | still is for federated learning. Hype >> substance ITT
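            | 
            | The core of it is small (a numpy sketch; the temperature
            | is an arbitrary choice):
            | 
            |     # distillation loss: KL between temperature-softened
            |     # teacher and student output distributions
            |     import numpy as np
            | 
            |     def softmax(z, T):
            |         e = np.exp(z / T - np.max(z / T))
            |         return e / e.sum()
            | 
            |     def distill_loss(teacher_logits, student_logits, T=2.0):
            |         p = softmax(teacher_logits, T)
            |         q = softmax(student_logits, T)
            |         return float(np.sum(p * np.log(p / q)) * T * T)
            | 
            | The student learns to match the teacher's softened output
            | distribution, which is how a big cloud model gets squeezed
            | onto a device.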
        
           | Havoc wrote:
           | You could always mix and match. Do lighter task on device and
           | outsource to cloud if needed
        
           | mr_toad wrote:
            | Gemini Nano is 1.8B parameters at 4 bits, so a little under
            | a GB. And hopefully each app won't include a full copy of
            | its models.
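            | 
            | The arithmetic checks out:
            | 
            |     # 1.8B parameters at 4 bits each
            |     print(1.8e9 * 4 / 8 / 1e9)   # 0.9 GB of weights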
        
           | int_19h wrote:
           | You can do quite a lot with LoRA without having to replace
           | all the weights.
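            | 
            | A toy numpy sketch of why (dimensions and rank are
            | arbitrary assumptions):
            | 
            |     # LoRA: frozen base weight W plus low-rank update B @ A
            |     import numpy as np
            | 
            |     d, r = 1024, 8                    # width, LoRA rank
            |     W = np.random.randn(d, d) * 0.02  # frozen, shared
            |     A = np.random.randn(r, d) * 0.01  # trainable
            |     B = np.zeros((d, r))              # trainable, zero init
            | 
            |     def forward(x):
            |         # only A and B (2*d*r values) are app-specific
            |         return W @ x + B @ (A @ x)
            | 
            | Each app ships ~16K adapter values instead of a ~1M-value
            | base matrix; scale that up and apps stop needing their own
            | 3GB models.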
        
         | macns wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
          | You mean... with their _stated_ privacy-first strategy.
        
         | ptman wrote:
         | Apple privacy is marketing https://www.eurekalert.org/news-
         | releases/1039938
        
         | xipix wrote:
         | How is local more private? Whether AI runs on my phone or in a
         | data center I still have to trust third parties to respect my
         | data. That leaves only latency and connectivity as possible
         | reasons to wish for endpoint AI.
        
           | chatmasta wrote:
           | If you can run AI in airplane mode, you are not trusting any
           | third party, at least until you reconnect to the Internet.
           | Even if the model was malware, it wouldn't be able to
           | exfiltrate any data prior to reconnecting.
           | 
           | You're trusting the third party at training time, to build
           | the model. But you're not trusting it at inference time (or
           | at least, you don't have to, since you can airgap inference).
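            | 
            | Concretely, a sketch with llama-cpp-python (assumes the
            | package is installed and a GGUF file is already on disk):
            | 
            |     # loads and runs entirely from local disk;
            |     # works with the radio off
            |     from llama_cpp import Llama
            | 
            |     llm = Llama(model_path="model.gguf")
            |     out = llm("Summarize my notes:", max_tokens=64)
            |     print(out["choices"][0]["text"])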
        
         | thefourthchime wrote:
          | Yes, it is completely clear. My guess is they do something
          | like "Siri-powered shortcuts", where you can ask it to do a
          | couple of things and it'll dynamically create a script and
          | execute it.
          | 
          | I can see a smaller model trained to do that working well
          | enough; however, I've never seen any real working examples of
          | this. That Rabbit device is heading in that direction, but
          | it's mostly vaporware now.
        
           | _boffin_ wrote:
            | Pretty much my thoughts too. They're going to have a model
            | that's smaller than 3B built in. They'll have tokens that
            | directly represent functions / shortcuts.
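            | 
            | Something like this dispatch layer (token names and
            | actions are made up for illustration):
            | 
            |     # map special model-output tokens to local shortcuts
            |     ACTIONS = {
            |         "<set_timer>": lambda arg: print("timer:", arg),
            |         "<send_msg>":  lambda arg: print("message:", arg),
            |     }
            | 
            |     def dispatch(model_output: str):
            |         token, _, arg = model_output.partition(" ")
            |         ACTIONS.get(token, lambda a: None)(arg)
            | 
            |     dispatch("<set_timer> 10 minutes")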
        
         | choppaface wrote:
         | On "privacy": If Apple owned the Search app versus paying
         | Google, and used their own ad network (which they have for App
         | Store today), Apple will absolutely use your data and location
         | etc to target you with ads.
         | 
         | It can even be third party services sending ad candidates
         | directly to your phone and then the on-device AI chooses which
         | is relevant.
         | 
         | Privacy is a contract not the absence of a clear business
         | opportunity. Just look at how Apple does testing internally
         | today. They have no more respect for human privacy than any of
         | their competitors. They just differentiate through marketing
         | and design.
        
         | LtWorf wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | Is this the same apple whose devices do not work at all unless
         | you register an apple account?
        
           | yazzku wrote:
           | Some people really seem to be truly delusional. It's obvious
           | that the company's "privacy" is a marketing gimmick when you
           | consider the facts. Do people not consider the facts anymore?
           | How does somebody appeal to the company's "privacy-first
           | strategy" with a straight face in light of the facts? I
           | suppose they are not aware of the advertising ID that is
           | embedded in all Apple operating systems. That one doesn't
           | even require login.
        
             | LtWorf wrote:
             | Considering the facts is much harder when admitting a
             | mistake is involved.
        
               | yazzku wrote:
               | A "mistake" seems to be putting it lightly when the thing
               | has been reiterated multiple times throughout the years,
               | but yeah. Seems more like blind dogma. Obviously people
               | don't like the facts pointed out to them either as you
               | can tell by the down votes on my comment. If I am wrong,
               | please tell me how in a reply.
        
             | kalleboo wrote:
             | What are the facts that people are not considering?
             | 
             | The advertising ID is useless as of App Tracking
             | Transparency.
        
           | kalleboo wrote:
           | That's not true though? I reset and set up devices for
           | testing all the time, and you can skip logging into an Apple
           | ID.
        
             | LtWorf wrote:
             | For real usage, not for testing a single app. And I mean
             | phones.
        
         | aiauthoritydev wrote:
         | Edge inference and cloud inference are not mutually exclusive
         | and chances are any serious player would be dipping their toes
         | in both.
        
           | MBCook wrote:
           | Right. The difference is that Apple has a ton of edge
           | capacity, they've been building it for a long time.
           | 
           | Google and Samsung have been building it too, at different
           | speeds.
           | 
           | Intel and AMD seem further behind (at the moment) unless the
           | user has a strong GPU, which is especially uncommon on the
           | most popular kind of computer: laptops.
           | 
           | And if you're not one of those four companies... you probably
           | don't have much capable consumer edge hardware.
        
         | 7speter wrote:
         | >I personally really really welcome as I don't want the AI
         | future to be centralised in the hands of a few powerful cloud
         | providers.
         | 
          | Watch out for being able to use AI on your local machine
          | while those AI services use telemetry to send your data
          | (recorded conversations, for instance) to their motherships.
        
           | happyopossum wrote:
           | > and those ai services using telemetry to send your data
           | (recorded conversations, for instance) to their motherships
           | 
            | This doesn't require AI and I am not aware of any instances
            | of this happening today, so what exactly are we watching
            | out for?
        
           | shinycode wrote:
            | I agree, but for a different reason.
            | 
            | Right now the subscription is $20 a month and the API price
            | is accessible. What will happen when they all decide to
            | 100x or 1000x the price of their API? All the companies
            | that got rid of people in favor of AI might have lost the
            | knowledge as well. This is dangerous and might kill a lot
            | of companies, no?
        
         | Powdering7082 wrote:
          | Also they don't have to pay _either_ the capex or opex costs
          | for training a model if they get users' devices to train the
          | models.
        
         | aiauthoritydev wrote:
          | I think these days everyone links their products with AI;
          | today even BP's CEO linked his business with AI. Edge
          | inference and cloud inference are not mutually exclusive
          | choices. Any serious provider will offer both, and the
          | improvement in quality of service comes from you giving more
          | of your data to the service provider. Most people are totally
          | fine with that, and that will not change anytime soon.
          | Privacy paranoia is mostly a fringe thing in consumer tech.
        
           | twototango wrote:
            | Hey, could I get a source on the BP stuff please? Just
            | curious, as I couldn't find anything on the interwebs.
        
             | aiauthoritydev wrote:
             | https://x.com/zerohedge/status/1787822478686851122
        
               | aiauthoritydev wrote:
               | https://www.reddit.com/r/stocks/comments/1cmi5gj/bp_earni
               | ngs...
        
           | MBCook wrote:
           | I agree. Apple has been on this path for a while, the first
           | processor with a Neural Engine was the A11 in 2017 or so. The
           | path didn't appear to change at all.
           | 
           | The big differences today that stood out to me were adopting
           | AI as a term (they used machine learning before) and
           | repeating the term AI everywhere they could shove it in since
           | that's obviously what the street wants to hear.
           | 
           | That's all that was different. And I'm not surprised they
           | emphasized it given all the weird "Apple is behind on AI"
           | articles that have been going around.
        
         | gtirloni wrote:
         | _> complete independence of network connectivity and hence
         | minimal latency._
         | 
         | Does it matter that each token takes additional milliseconds on
         | the network if the local inference isn't fast? I don't think it
         | does.
         | 
         | The privacy argument makes some sense, if there's no telemetry
         | leaking data.
        
         | kshahkshah wrote:
         | Why is Siri still so terrible though?
        
         | w1nst0nsm1th wrote:
         | > to put inference on edge devices...
         | 
         | It will take a long time before you can put performant
         | inference on edge devices.
         | 
         | Just download one of the various open source large(st)
         | language models and test it on your desktop...
         | 
         | Compute power, memory, and storage requirements are insane if
         | you want decent results... I mean not just Llama gibberish.
         | 
         | Until such requirements are satisfied, remote models are the
         | way to go, at least for conversational models.
         | 
         | Aside from LLMs, AlphaGo would not run on any end user
         | device, by a long shot, even though it is already 'old'
         | technology.
         | 
         | I think a 'neural engine' on an end user device is just
         | marketing nonsense at the current state of the art.
        
         | tweetle_beetle wrote:
         | I wonder if BYOE (bring your own electricity) also plays a part
         | in their long term vision? Data centres are expensive in terms
         | of hardware, staffing and energy. Externalising this cost to
         | customers saves money, but also helps to paint a green(washing)
         | narrative. It's more meaningful to more people to say they've
         | cut their energy consumption by x than to say they have a
         | better server obsolescence strategy, for example.
        
           | dyauspitr wrote:
           | Why would it be green washing? Aren't their data centers and
           | commercial operations run completely on renewable energy?
        
             | ironmagma wrote:
             | If you offload data processing to the end user, then your
             | data center uses less energy on paper. The washing part is
             | that work is still being done and spending energy, just
             | outside of the data center.
        
               | dyauspitr wrote:
               | Which honestly is still good for the environment: the
               | work gets distributed across the entire electricity grid.
               | 
               | That work needs to be done anyways and Apple is doing it
               | in the cleanest way possible. What's an alternative in
               | your mind, just don't do the processing? That sounds like
               | making progress towards being green. If you're making
               | claims of green washing you need to be able to back it up
               | with what alternative would actually be "green".
        
               | ironmagma wrote:
               | I didn't make any claims, I just explained what the
               | parent was saying. There could be multiple ways to make
               | it more green: one being not doing the processing, or
               | another perhaps just optimizing the work being done. But
               | actually, no, you don't need a viable way to be green in
               | order to call greenwashing "greenwashing." It can just be
               | greenwashing, with no alternative that is actually green.
        
               | talldayo wrote:
               | > Which honestly is still good for the environment to
               | have the work distributed across the entire electricity
               | grid.
               | 
               | Sometimes, but parallelization has a cost. The power
               | consumption from 400,000,000 iPhones downloading a 2gb
               | LLM is not negligible, probably more than what you'd
               | consume running it as a REST API on a remote server. Not
               | to mention slower.
        
               | peapicker wrote:
               | Downloading 2gb of anything on my iPhone via wifi from my
               | in-home gigabit fiber barely puts a dent in my battery
               | life let alone much time.
               | 
               | The random ads in most phone games are much worse on my
               | battery life.
        
               | talldayo wrote:
               | Yeah it's a shame that mobile games are shit when console
               | and PC gaming gets taken so seriously by comparison. If
               | you want to blame that on developers and not Apple's
               | stupid-ass policies stopping you from emulating real
               | games, be my guest. That's a take I'd love to hear.
               | 
               | Keep downloadin' those ads. This is what Apple wants from
               | you, a helpless and docile revenue stream. Think
               | Different or stay mad.
        
               | peapicker wrote:
               | Blame? Simply saying downloading 2gb isn't the power
               | consumption hit you seem to think it is.
               | 
               | Not much of a gamer anyway, just an observation when I
               | tried a couple apparently dodgy games.
               | 
               | Not sure why your reply wasn't related to your original
               | comment. Felt rather knee-jerk reactionary to me instead.
               | Oh well.
        
               | Dylan16807 wrote:
               | > Which honestly is still good for the environment to
               | have the work distributed across the entire electricity
               | grid.
               | 
               | This doesn't make any sense.
               | 
               | > If you're making claims of green washing you need to be
               | able to back it up with what alternative would actually
               | be "green".
               | 
               | Sometimes there isn't an alternative. In which case you
               | don't get to look green, sorry. The person critiquing
               | greenwashing doesn't need to give an alternative, why
               | would that be their job? They're just evaluating whether
               | it's real or fake.
               | 
               | Though in this case using renewable energy can help.
        
               | nozzlegear wrote:
               | > Sometimes there isn't an alternative. In which case you
               | don't get to look green, sorry. The person critiquing
               | greenwashing doesn't need to give an alternative, why
               | would that be their job? They're just evaluating whether
               | it's real or fake.
               | 
               | Baselessly calling every greening and sustainability
               | effort "greenwashing", especially when there's
               | practically no thought put into what the alternative
               | might be, is trite and borderline intellectually
               | dishonest. They don't want to have a conversation about
               | how it could be improved, they just want to interject
               | "haha that's stupid, corporations are fooling all of you
               | sheeple" from their place of moral superiority. This shit
               | is so played out.
        
               | dyauspitr wrote:
               | This is what I wanted to get across but you said it so
               | much better.
        
               | Dylan16807 wrote:
               | > Baselessly calling every greening and sustainability
               | effort "greenwashing", especially when there's
               | practically no thought put into what the alternative
               | might be
               | 
               | Baseless? The foundation of this accusation is rock
               | solid. Offloading the exact same computation to another
               | person so your energy numbers look better is _not a
               | greening or sustainability effort_.
               | 
               | Fake green should always be called greenwashing.
               | 
               | You don't need to suggest an improvement to call out
               | something that is completely fake. The faker doesn't get
               | to demand a "conversation".
               | 
               | You've seen a bunch of people be incorrectly dismissive
               | and decided that dismissiveness is automatically wrong.
               | It's not.
               | 
               | For an extreme example, imagine a company installs a
               | "pollution-preventing boulder" at their HQ. It's very
               | valid to call that greenwashing and walk away. Don't let
               | them get PR for nothing. If they were actually trying,
               | and made a mistake, suggest a fix. But you can't fix
               | fake.
        
               | nozzlegear wrote:
               | > Baseless? The foundation of this accusation is rock
               | solid. Offloading the exact same computation to another
               | person so your energy numbers look better is not a
               | greening or sustainability effort.
               | 
               | Yes, I consider it baseless for the following reasons:
               | 
               | - First, consider the hardware running in data centers,
               | and the iDevices running at the edge - the iPhones, iPads
               | and presumably Macs. There's a massive difference in
               | power consumption between a data center full of GPUs, and
               | whatever the equivalent might be in iDevices. Few chips
               | come close to Apple's M-series in power usage.
               | 
               | - Second, Apple's commitment to making those devices
               | carbon neutral by 2030; I'm unaware of any commitment to
               | make cloud compute hardware carbon neutral, but I'll
               | admit that I don't really keep up with that kind of
               | hardware so I could be totally wrong there.
               | 
               | - Third, consider that an AI compute service (I'm not
               | sure what you call it) like OpenAI is always running and
               | crunching numbers in its data center, while the iDevices
               | are each individually running only when needed by the
               | user.
               | 
               | - Fourth, the people who own the iDevices may charge them
               | using more sustainable methods than would power a data
               | center. For example, Iowa - where I live - generates 62%
               | of its energy from wind power and nearly two-thirds of
               | its total energy from renewable resources [1], whereas
               | California only gets 54% of its energy from renewable
               | resources. Of course this cuts both ways, there are
               | plenty of states or even countries that get most of their
               | power from coal, like Ohio.
               | 
               | That said, neither of us have any real numbers on these
               | things so the best either of us can do is be optimistic
               | or pessimistic. But I'd rather do that and have a
               | discussion about it, instead of dismiss it out of hand
               | like everyone else does by saying "haha dumb, get
               | greenwashed".
               | 
               | You're right that improvements don't need to be suggested
               | to have a conversation about greening/greenwashing. My
               | irritation lies more in the fact that it's almost a trope
               | at this point that you can click into the comments on any
               | HN story that mentions greening/sustainability, and there
               | will be comments calling it fake greenwashing. I don't
               | disagree that it's easy for a company to greenwash if
               | they want to, but it's tiring to see _everything_ called
               | greenwashing without applying any critical thinking.
               | Everyone wants to be so jaded about corporations that
               | they'll never trust a word about it.
               | 
               | [1] Although this two-thirds total includes the bio-fuel
               | ethanol, so I feel like it shouldn't be included.
        
               | Dylan16807 wrote:
               | 1. Maybe, but wouldn't Apple want to use M-series chips
               | to do this either way?
               | 
               | 2. That's an interesting angle.
               | 
               | 3. It's the same total amount, and both will go idle when
               | there's less demand.
               | 
               | 4. I think the average data center gets cleaner energy
               | than the average house but I can't find a proper
               | comparison so maybe that's baseless.
               | 
               | Also as far as I'm aware, inference takes significantly
               | fewer resources when you can batch it.
               | 
               | > but it's tiring to see everything called greenwashing
               | without applying any critical thinking
               | 
               | That does sound tiring, but in this particular case I
               | think there was sufficient critical thinking, and it was
               | originally brought up as just a possibility.
        
           | timpetri wrote:
           | That is an interesting angle to look at it from. If they're
           | gonna keep pushing this they end up with a strong incentive
           | to make the iPhone even more energy efficient, since users
           | have come to expect good/always improving battery life.
           | 
           | At the end of the day, AI workloads in the cloud will always
           | be a lot more compute efficient, meaning a lower combined
           | footprint. However, in the server-based model, there is more
           | incentive to pre-compute (waste inference) things to make
           | them appear snappy on device. Analogous would be all that
           | energy spent doing video encoding for YouTube videos that
           | never get watched, although that's "idle" resources for
           | budgeting purposes.
        
           | benced wrote:
           | Apple has committed that all of its products will be carbon-
           | neutral - including emissions from charging during their
           | lifetime - by 2030. The Apple Watch is already there.
           | 
           | https://www.apple.com/newsroom/2023/09/apple-unveils-its-
           | fir...
        
             | dns_snek wrote:
             | From your link:
             | 
             | > "Apple defines high-quality credits as those from
             | projects that are real, additional, measurable, and
             | quantified, with systems in place to avoid double-counting,
             | and that ensure permanence."
             | 
             | Apple then pledged to buy carbon credits from a company
             | called Verra. In 2023, an investigation found that more
             | than 90% of Verra's carbon credits are a sham. Notably,
             | Apple made their pledge _after_ the results of this
             | investigation were known - so much for their greenwashing.
             | 
             | https://www.theguardian.com/environment/2023/jan/18/reveale
             | d...
        
           | MBCook wrote:
           | I'm not sure it's that (benced pointed out their carbon
           | commitment) so much as simple logistics.
           | 
           | Apple doesn't have to build the data centers. Apple doesn't
           | have to buy the AI capacity themselves (even if from TSMC for
           | Apple designed chips). Apple doesn't have to have the
           | personnel for the data centers or the air conditioning. They
           | don't have to pay for all the network bandwidth.
           | 
           | There are benefits to the user to having the AI run on their
           | own devices in terms of privacy and latency as mentioned by
           | the GP.
           | 
           | But there are also benefits to Apple simply because it means
           | it's no longer their resources being used up above and beyond
           | electricity.
           | 
           | I keep reading about companies having trouble getting GPUs
           | from the cloud providers and that some crypto networks have
           | pivoted to selling GPU access for AI work as crypto profits
           | fall.
           | 
           | Apple doesn't have to deal with any of that. They have
           | underused silicon sitting out there ready to light up to make
           | their customers happy (and perhaps interested in buying a
           | faster device).
        
             | vineyardmike wrote:
             | I agree with everything you said but the TSMC bit. They are
             | quite literally competing with Nvidia et al. for fab space
             | for customers' chips. Sure, they get the AI bits built in
             | to existing products, but surely those are bigger/more
             | expensive to manufacture and commit from TSMC because of
             | it.
        
               | MBCook wrote:
               | I think we're on the same page.
               | 
               | I was trying to say that there was still a cost to using
               | their own chips for server AI because they still had to
               | pay to have them made so they weren't "free" because
               | they're Apple products as opposed to buying nVidia parts.
               | 
               | You're right, there is a cost to them to put the AI stuff
               | on end user chips too since die space isn't free and
               | extra circuits mean fewer chips fit per wafer.
        
           | the_king wrote:
           | It makes sense for desktops but not for devices with
           | batteries. I think Apple should introduce a new device for
           | $5-10k that has 400GB of VRAM that all Macs on the network
           | use for ML.
           | 
           | If you're on battery, you don't want to do LLM inference on a
           | laptop. Hell, you don't really want to do transcription
           | inference for that long - but it would be nice not to have to
           | send it to a data center.
        
         | aborsy wrote:
         | What are the example of the edge devices made by Apple?
        
           | floam wrote:
           | MacBook, iPhone, iPad?
        
             | bingbingbing777 wrote:
             | So laptops are now edge devices?
        
               | MBCook wrote:
               | Doesn't it just refer to the end user device as opposed
               | to a server somewhere or a middle box?
        
               | Dylan16807 wrote:
               | Last time I looked for the definition, nobody can agree
               | on whether client devices count as edge or not.
        
               | aborsy wrote:
               | These don't sit on the edge of the internet, and
               | typically are not called edge devices.
               | 
               | It's usually a more powerful device such as a router or
               | mini server between LAN and internet.
        
         | dancemethis wrote:
         | Apple is privacy last, if anything. Forgotten PRISM already?
        
         | royaltjames wrote:
         | Yes this began with the acquisition of xnor.ai. Absolutely
         | amazing what will be done (and is being done) with edge
         | computing.
        
         | Pesthuf wrote:
         | Honestly, if they manage this, they have my money. But to get
         | actually powerful models running, they need to supply the
         | devices with enough RAM - and that's definitely not what Apple
         | likes to do.
        
         | jonplackett wrote:
         | I think it will be a winning strategy. Lag is a real killer for
         | LLMs.
         | 
         | I think they'll have another LLM on a server (maybe a deal for
         | openai/gemini) that the one on the device can use like ChatGPT
         | uses plugins.
         | 
         | But on device Apple have a gigantic advantage. Rabbit and
         | Humane are good ideas humbled by shitty hardware that runs out
         | of battery, gets too hot, has to connect to the internet to do
         | literally anything.
         | 
         | Apple is in a brilliant position to solve all those things.
         | 
         | I hope they announce something good at WWDC
        
           | threeseed wrote:
           | There really isn't enough emphasis on the downsides of server
           | side platforms.
           | 
           | So many of these are only deployed in the US, so if you're,
           | say, in country Australia, not only does all your traffic go
           | to the US, it goes via slow and intermittent cellular
           | connections.
           | 
           | It makes using services like LLMs unusably slow.
           | 
           | I miss the 90s and having applications and data reside
           | locally.
        
             | whimsicalism wrote:
             | Unusably slow? It's like 0.3 seconds to first token and
             | then pretty much all of the tokens can follow within a
             | second.
             | 
             | I find it hard to understand the edge use case for text-
             | based models.
        
             | EarthMephit wrote:
             | Even in Australia, is the lag to an LLM server noticeable?
             | 
             | Generally an LLM seems to take about 3s or more to respond,
             | and the network delay to the US is a couple of hundred
             | milliseconds.
             | 
             | The network delay seems minimal compared to the actual
             | delay of the LLM.
        
           | whimsicalism wrote:
           | Having used groq and other fast LLM services a fair bit, lag
           | seems negligible. You're literally just passing text at close
           | to the speed of light.
        
             | jonplackett wrote:
             | * when you have a good internet connection
             | 
             | ** when you live in the USA
        
               | whimsicalism wrote:
               | > * when you have a good internet connection
               | 
               | Or at least, a good enough internet connection to send
               | plaintext.
               | 
               | > * when you live in the USA
               | 
               | Even from Australia to USA is just ~300ms of latency for
               | first token and then the whole thing can finish in ~1s.
               | And making that faster doesn't require on-device
               | deployment, it just requires a server in Australia -
               | which is obviously going to be coming if it hasn't
               | already for many providers.
        
           | lolinder wrote:
           | > Lag is a real killer for LLMs.
           | 
           | I'm curious to hear more about this. My experience has been
           | that inference speeds are the #1 cause of delay by orders of
           | magnitude, and I'd assume those won't go down substantially
           | on edge devices because the cloud will be getting faster at
           | approximately the same rate.
           | 
           | Have people outside the US benchmarked OpenAI's response
           | times and found network lag to be a substantial contributor
           | to slowness?
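           | 
           | For what it's worth, here's a minimal sketch of how you could
           | measure it yourself against OpenAI's streaming chat API (the
           | model name and prompt are just placeholders): time-to-first-
           | token roughly bounds network plus queueing, while the rest of
           | the stream time is dominated by inference.
           | 
           |     import os, time
           |     import requests
           | 
           |     url = "https://api.openai.com/v1/chat/completions"
           |     key = os.environ["OPENAI_API_KEY"]
           |     headers = {"Authorization": f"Bearer {key}"}
           |     body = {
           |         "model": "gpt-4o",  # placeholder model name
           |         "messages": [{"role": "user", "content": "Say ten words."}],
           |         "stream": True,
           |     }
           | 
           |     start = time.perf_counter()
           |     first = None
           |     with requests.post(url, headers=headers, json=body, stream=True) as r:
           |         for line in r.iter_lines():
           |             if line.startswith(b"data: ") and not line.endswith(b"[DONE]"):
           |                 if first is None:
           |                     # ~network + queueing latency
           |                     first = time.perf_counter() - start
           |     total = time.perf_counter() - start  # mostly inference time
           |     print(f"first token: {first:.2f}s, total: {total:.2f}s")
           | 
           | Run from different regions, the first number is the part a
           | local server would remove; the rest wouldn't change.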
        
             | whimsicalism wrote:
             | of course not, it's just text. people here are just
             | spitballing.
             | 
             | groq is way better for inference speeds btw
        
           | vineyardmike wrote:
           | I run a few models (eg Llama3:8b) on my 2023 MacBook Air, and
           | there is still a fair bit of lag and delay, compared to a
           | hosted (and much larger) model like Gemini. A large source of
           | the lag is the initial loading of the model into RAM. Which
           | an iPhone will surely suffer from.
           | 
           | Humane had lag _and_ they used voice chat, which is a bad UX
           | paradigm. VUI is bad because it adds lag to the information
           | within the medium. Listening to preambles and lists is always
           | slower than a human eye's ability to scan a page of text.
           | Their lag is not due to LLMs, which can be much faster than
           | whatever they did.
           | 
           | We should remind ourselves that an iPhone can likely suffer
           | similar battery and heat issues - especially if it's running
           | models locally.
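           | 
           | If you want to see where the time goes, Ollama's local API
           | reports a per-request breakdown. A minimal sketch, assuming a
           | local Ollama install on its default port (field names are
           | from its /api/generate response and may change):
           | 
           |     import requests
           | 
           |     r = requests.post(
           |         "http://localhost:11434/api/generate",
           |         json={"model": "llama3:8b", "prompt": "hello", "stream": False},
           |     )
           |     stats = r.json()
           |     # load_duration is the weights-into-RAM cost that dominates
           |     # the first (cold) request; durations are in nanoseconds.
           |     for key in ("load_duration", "prompt_eval_duration", "eval_duration"):
           |         print(key, stats.get(key, 0) / 1e9, "s")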
        
             | whywhywhywhy wrote:
             | Humane's lag feels down to just bad software design too. It
             | almost feels like a two-stage thing is happening: it sends
             | your voice or transcription up to the cloud, figures out
             | where it needs to go to get it done, tells the device to
             | tell you it's about to do that, then finally does it. E.g.:
             | 
             | User: "What is this thing?"
             | 
             | Pin: "I'll have a look what that is" (It feels this
             | response has to come from a server)
             | 
             | Pin: "It's a <answer>" (The actual answer)
             | 
             | We're still a bit away from the iPhone running anything
             | viable locally. Even with small models today you can almost
             | feel the chip creaking under the load they incur, and the
             | whole phone begins to choke.
        
         | throwaway48476 wrote:
         | >Apple's AI strategy is to put inference (and longer term even
         | learning) on edge devices
         | 
         | Ironic, given that AI requires lots of VRAM.
        
         | lagt_t wrote:
         | Has nothing to do with privacy; Google is also pushing Gemini
         | Nano to the device. The sector is discovering the diminishing
         | returns of LLMs.
         | 
         | With the ai cores on phones they can cover your average user
         | use cases with a light model without the server expense.
        
           | SpaceManNabs wrote:
           | Don't bother. Apple's marketing seems to have won on here. I
           | made a similar point only for people to tell me that Apple is
           | the only org seriously pushing federated learning.
        
         | zachbee wrote:
         | If they're doing inference on edge devices, one challenge I see
         | is protecting model weights. If you want to deploy a
         | proprietary model on an edge AI chip, the weights can get
         | stolen via side-channel attacks [1]. Obviously this isn't a
         | concern for open models, but I doubt Apple would go the open
         | models route.
         | 
         | [1] https://spectrum.ieee.org/how-prevent-ai-power-usage-
         | secrets
        
           | Havoc wrote:
           | Nobody is taking particular care protecting weights for edge
           | class models
        
         | bmitc wrote:
         | > [Apple's] privacy-first strategy
         | 
         | That is a marketing and advertising strategy.
        
           | moosemess wrote:
           | - dozens of horrific 0-day CVEs every year because not
           | enough is invested in security, making privacy virtually
           | impossible
           | 
           | - credit card required to install free apps such as the
           | "private" Signal messenger
           | 
           | - location required just to show me the weather in a static
           | place, lol
           | 
           | - claims to be e2e but Apple controls all keys and
           | identities
           | 
           | - basically sells out all users' iCloud data in China, and
           | totally doesn't do the same in the US, because Tim pinky
           | swears
           | 
           | - everything is closed source
        
           | jascination wrote:
           | I'm a cross-platform app developer and can assure you iOS is
           | much stronger in terms of what data you can/can't get from a
           | user
        
             | bmitc wrote:
             | That says nothing about what Apple does with data.
        
         | cyanydeez wrote:
         | their privacy strategy is to make you feel comfortable with
         | their tech so you don't mind when they shop it around to the
         | highest bidder.
         | 
         | Make no mistake, they're just waiting for the right MBA to walk
         | through the door, see the sky high value of their users and
         | start chop shopping that.
         | 
         | Enshittification is always available to the next CEO, and this
         | is just going to be more and more tempting as the value of the
         | walled garden increases.
        
           | acdha wrote:
           | Yes, it's possible that they'll change in the future but that
           | doesn't make it inevitable. Everything you describe could
           | have happened at any point in the last decade or two but
           | didn't, which suggests that it's not "waiting for the right
           | MBA" but an active effort to keep the abusive ones out.
           | 
           | One thing to remember is that they understand the value of
           | long-term investments. They aren't going to beat Google and
           | Facebook at advertising and have invested billions in a
           | different model those companies can't easily adopt, and I'm
           | sure someone has done the math on how expensive it would be
           | to switch.
        
         | felixding wrote:
         | > This is completely coherent with their privacy-first strategy
         | 
         | Do not believe what they say, watch what they do.
        
         | SergeAx wrote:
         | Then putting only 256 GB into their cheaper devices is a
         | really bad move. Even simple models like Whisper require
         | hundreds of megabytes of storage.
        
         | the_king wrote:
         | I'm all for running as much on the edge as possible, but we're
         | not even close to being able to do real-time inference on
         | Frontier models on Macs or iPads, and that's just for vanilla
         | LLM chatbots. Low-precision Llama 3-8b is awesome, but it isn't
         | a Claude 3 replacer, totally drains my battery, and is slow (M1
         | Max).
         | 
         | Multimodal agent setups are going to be data center/home-lab
         | only for at least the next five years.
         | 
         | Apple isn't about to put 80 GB of VRAM in an iPad, for about
         | 15 reasons.
        
         | ChuckMcM wrote:
         | Something they _should_ be able to do now, but do not seem to,
         | is to allow you to train Siri to recognize _exactly_ your voice
         | and accent. Which is to say, to take the speech-to-text model
         | that is listening and feeding the Siri integration API, and
         | make it both 99.99% accurate for your speech and able to
         | recognize you and only you when invoking voice commands.
         | 
         | It could, if it chose to, continue to recognize all voices but
         | at the same time limit the things the non-owner could ask for
         | based on owner preferences.
        
           | jilijeanlouis wrote:
           | This is really easy to do: it's just an embedding of your
           | voice, so typically 10-30 sec max of your voice to configure
           | it. You already do a similar setup for Face ID. I agree with
           | you; I don't understand why they don't do it.
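           | 
           | A minimal sketch of the idea - enroll an averaged profile
           | embedding, then gate commands on similarity. The embed()
           | model and the 0.75 threshold here are hypothetical
           | placeholders, not anything Apple actually ships:
           | 
           |     import numpy as np
           | 
           |     def embed(audio: np.ndarray) -> np.ndarray:
           |         # Hypothetical speaker-embedding model (e.g. a
           |         # d-vector network); stands in for whatever encoder
           |         # the OS would ship.
           |         raise NotImplementedError
           | 
           |     def cosine(a, b):
           |         return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
           | 
           |     # Enrollment: average embeddings of a few short owner clips.
           |     def enroll(clips):
           |         return np.mean([embed(c) for c in clips], axis=0)
           | 
           |     # Verification: accept a command only if the voice matches.
           |     def is_owner(utterance, profile, threshold=0.75):
           |         return cosine(embed(utterance), profile) >= threshold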
        
         | eru wrote:
         | > Processing data at the edge also makes for the best possible
         | user experience because of the complete independence of network
         | connectivity and hence minimal latency.
         | 
         | That's one particular set of trade-offs, but not necessarily
         | the best. Eg if your network connection and server processing
         | speed is sufficiently faster than your local processing speed,
         | the latency would be higher for doing it locally.
         | 
         | Local inference can also use more battery power. And you need a
         | more beefy device, all else being equal.
        
         | chipdart wrote:
         | > This is completely coherent with their privacy-first strategy
         | (...)
         | 
         | I think you're trying too hard to rationalize this move as pro-
         | privacy and pro-consumer.
         | 
         | Apple is charging a premium for hardware based on performance
         | claims, which they need to create relevance and demand for it.
         | 
         | Beyond very niche applications, there is little demand for the
         | capacity to run computationally demanding workloads - at least
         | by the standards of the consumer-grade hardware sold over the
         | past two decades.
         | 
         | If Apple offloads these workloads to the customer's own
         | hardware, they don't have to provide this computing capacity
         | themselves. This means no global network of data centers, no
         | infrastructure, no staff, no customer support, no lawyers,
         | nothing.
         | 
         | More importantly, Apple claims to be pro privacy but their
         | business moves are in reality in the direction of ensuring that
         | they are in sole control of their users' data. Call it what you
         | want, but leveraging their position to ensure they hold a
         | monopoly over a market built on their userbase is not a pro
         | privacy move, just like Apple's abuse of their control over the
         | app store is not a security move.
        
         | carabiner wrote:
         | iPhone Photos app already does incredible image subject search
         | via ML locally. Versus Android which does it via cloud.
        
         | arvinsim wrote:
         | If inference on edge devices is their goal, then they would
         | have to rethink their pricing on storage and RAM.
        
         | fnord77 wrote:
         | They're a hardware company, so yes, they want to sell thick
         | clients.
        
         | ActorNightly wrote:
         | >Apple's AI strategy is to put inference (and longer term even
         | learning) on edge devices.
         | 
         | Apple's AI strategy is to put inference (and longer term even
         | learning) on edge devices...only for Apple stuff.
         | 
         | There is a big difference. ANE right now is next to useless for
         | anything not Apple.
        
           | davedx wrote:
           | I think that's going to change with WWDC
        
             | ActorNightly wrote:
             | Nah. Apple doesn't have incentive to provide any more dev
             | power. It will keep things locked down and charge people
             | for Apple branded software products. That has been their
             | business for the past decade.
        
               | davedx wrote:
               | I think there's always been a tension at Apple between
               | keeping everything as locked down as possible and opening
               | up parts because they need the developer driven app
               | ecosystem. My prediction is Neural Engine is going to
               | become more useful to third party developers. I could be
               | wrong
        
         | edanm wrote:
         | And yet Siri is super slow because it does the processing off-
         | device, and is far less useful than it could be because it is
         | hobbled by restrictions.
         | 
         | I can't even find a way to resume playing whatever Audible book
         | I was last playing. "Siri play audible" or something. As far as
         | I know, this is impossible to do.
        
         | blegr wrote:
         | I hope this means AI-accelerated frameworks get better support
         | on Mx. Unified memory and Metal are a pretty good alternative
         | for local deep learning development.
        
         | unusualmonkey wrote:
         | There is no guarantee that local processing is going to have
         | lower latency than remote processing. Given the huge compute
         | needs of some AI models (e.g. ChatGPT), the time saved by
         | using larger compute likely dwarfs the relatively small time
         | needed to transmit a request.
        
         | wilde wrote:
         | Ehhh at this point Apple's privacy strategy is little more than
         | marketing. Sure they'll push stuff to the edge to save
         | themselves money and book the win, but they also are addicted
         | to the billions they make selling your searches to Google.
         | 
         | Agreed on the UX improvements though.
        
         | neilsimp1 wrote:
         | The entire software stack is non-free and closed-source. This
         | means you'd be taking Apple at their word on "privacy". Do you
         | trust Apple? I wouldn't, given their track record.
        
           | madeofpalk wrote:
           | What track record?
        
             | Gh0stRAT wrote:
             | They fought the FBI over unlocking iPhones when they could
             | have just quietly complied with the request. I'd say they
             | have a decent track record.
        
               | MaxBarraclough wrote:
               | They might have been thinking of the recently discovered
               | hardware backdoor issue, CVE-2023-38606 (see also
               | _Operation Triangulation_ ). There was surprisingly
               | little reporting on it.
               | 
               | Discussion: https://news.ycombinator.com/item?id=38783112
               | 
               | Transcript of _Security Now_ podcast episode discussing
               | the issue: https://www.grc.com/sn/sn-955.htm
        
             | m463 wrote:
             | Just install Little Snitch and you'll see how MUCH gets
             | sent back to the mothership. And that is just macOS.
        
         | garydgregory wrote:
         | My cynical view is that doing AI on the client is the only way
         | they can try to keep selling luxury items (jewelry really) and
         | increasing prices for what are essentially and functionally
         | commodity devices.
        
         | marticode wrote:
         | Every chip coming this year (Intel, AMD, Qualcomm) has an AI
         | processor. I am not sure Apple is doing anything special here.
        
         | unboxingelf wrote:
         | Apple is UX-first, not privacy-first.
        
         | ethagnawl wrote:
         | > Processing data at the edge also makes for the best possible
         | user experience because of the complete independence of network
         | connectivity and hence minimal latency.
         | 
         | I know a shop that's doing this and it's a very promising
         | approach. The ability to offload the costs of cloud GPU time is
         | a _tremendous_ advantage. That's to say nothing of the
         | decreased latency, increased privacy, etc. The glaring downside
         | is that you are dependent upon your users to be willing and
         | able to run native apps (or possibly WASM, I'm not sure) on
         | bleeding edge hardware. However, for some target markets (e.g.
         | video production, photography, designers, etc.) it's a "safe"
         | assumption that they will be using the latest and greatest
         | Macs.
         | 
         | I've also been hearing people talk somewhat seriously about
         | setting up their own training/inference farms using Macs
         | because, at least for now, they're more readily available and
         | cheaper to buy/run than big GPUs. That comes with a host of ops
         | problems but it still may prove worthwhile for some use cases
         | and addresses some of the same privacy concerns as edge
         | computing if you're able to keep data/computation in-house.
        
         | beestripes wrote:
         | Privacy can actually be reduced with on-device AI too. Now,
         | without actually sending any data to iCloud, Apple can still
         | have a general idea of what you're doing. Imagine a state has a
         | law that makes certain subjects illegal to discuss. They could
         | compel Apple to have their local AI detect that content and
         | then to broadcast a ping in the AirTags network about the user
         | and their location. No internet connection required on the
         | target.
        
         | amelius wrote:
         | In case it is not abundantly clear by now: Apple's strategy is
         | to turn developers into slaves.
         | 
         | Run away from these temptations. You will never truly own the
         | hardware anyway.
        
         | demondemidi wrote:
         | Every embedded company is pushing ML at the edge with inference
         | engines. Check out MLPerfTiny. They've been benchmarking all
         | sorts of edge AI since 2019.
        
         | msla wrote:
         | Privacy from everyone but Apple, certainly.
        
       | zincmaster wrote:
       | I own M1, A10X and A12X iPad Pros. I have yet to see any of them
       | ever max out their processor or get slow. I have no idea why
       | anyone would need an M4 one. Sure, it's because Apple no longer
       | has M1s being fabbed at TSMC. But seriously, who would upgrade?
       | 
       | Put MacOS on iPad Pro, then it gets interesting. The most
       | interesting things my iPad Pros do are look at security cameras
       | or read OBD-II settings on my vehicle. Hell, they can't even
       | maintain an SSH connection correctly. Ridiculous.
       | 
       | I see Apple always show videos of people editing video on their
       | iPad Pro. Who does that??? We use them for watching videos
       | (kids). One is in a car as a mapping system - that's a solid use
       | case. One I gave my Dad and he didn't know what to do with it -
       | so it's collecting dust. And one lives in the kitchen doing
       | recipes.
       | 
       | Functionally, a 4 year old Chromebook is 3x as useful as a new
       | iPad Pro.
        
       | jsaltzman20 wrote:
       | Who built the M4 chip? https://www.linkedin.com/posts/jason-
       | salt_ai-apple-semicondu...
        
         | doctor_eval wrote:
         | Great People Units. OK. Nice recruitment pitch.
        
       | phkahler wrote:
       | >> And with AI features in iPadOS like Live Captions for real-
       | time audio captions, and Visual Look Up, which identifies objects
       | in video and photos, the new iPad Pro allows users to accomplish
       | amazing AI tasks quickly and on device. iPad Pro with M4 can
       | easily isolate a subject from its background throughout a 4K
       | video in Final Cut Pro with just a tap, and can automatically
       | create musical notation in real time in StaffPad by simply
       | listening to someone play the piano. And inference workloads can
       | be done efficiently and privately...
       | 
       | These are really great uses of AI hardware. All of them benefit
       | the user, where many of the other companies doing AI are somehow
       | trying to benefit themselves. AI as a feature vs AI as a service
       | or hook.
        
       | leesec wrote:
       | Why are all the comparisons with the M2? Apple did this with
       | the M3 -> M1 as well, right?
        
       | spxneo wrote:
       | almost went for M2 128gb to run some local llamas
       | 
       | glad I held out. M4 is going to put downward pressure across all
       | previous gen.
       | 
       | edit: nvm, AMD is coming out with twice the performance of the
       | M4 in two months or less. If the M2s become super cheap I will
       | consider it, but the M4 came far too late. There are just way
       | better alternatives now and very soon.
        
         | kalleboo wrote:
         | > _AMD is coming out with twice the performance of M4 in two
         | months or less_
         | 
         | M4 Pro/Max/Ultra variants with double+ the performance from
         | just scaling cores are probably also going to be announced at
         | WWDC in a month, when they also announce their AI roadmap
        
       | winwang wrote:
       | I would want to know if the LPDDR6 rumors are substantiated, i.e.
       | memory bus details.
       | 
       | https://forums.macrumors.com/threads/lpddr6-new-beginning-of...
       | 
       | If M4 Max could finally break the 400GBps limit of the past few
       | years and hit 600 GBps, it would be huge for local AI since it
       | could directly translate into inference speedups.
        
         | hmottestad wrote:
         | The M4 has increased the bandwidth from 100 to 120 GB/s. The M4
         | Max would probably be 4x that at 480 GB/s, but the M4 Ultra
         | would be 960 GB/s compared to M2 Ultra at 800 GB/s.
        
           | winwang wrote:
           | Dang. +20% is still a nice difference, but not sure how they
           | did that. Here's hoping M4 Max can include more tech, but
           | that's copium.
           | 
           | 960 GB/s is 3090 level so that's pretty good. I'm curious if
           | the Macbooks right now are actually more so compute limited
           | due to tensor throughput being relatively weak, not sure
           | about real-world perf.
        
             | wmf wrote:
             | LPDDR5-6400 + 20% ~= LPDDR5-7500 (same speed used in
             | Intel/AMD laptops)
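             | 
             | The arithmetic, as a quick sanity check (the Max/Ultra bus
             | widths below are the rumored multiples, not confirmed):
             | 
             |     def bandwidth_gbs(bus_bits, mega_transfers):
             |         # bytes per transfer * transfers per second
             |         return bus_bits / 8 * mega_transfers / 1000
             | 
             |     print(bandwidth_gbs(128, 6400))   # 102.4 -> M3-class base
             |     print(bandwidth_gbs(128, 7500))   # 120.0 -> matches M4
             |     print(bandwidth_gbs(512, 7500))   # 480.0 -> rumored M4 Max
             |     print(bandwidth_gbs(1024, 7500))  # 960.0 -> rumored M4 Ultra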
        
       | animatethrow wrote:
       | Only iPad Pro has M4? Once upon a time during the personal
       | computer revolution in the 1980s, little more than a decade after
       | man walked on the moon, humans were sufficiently technologically
       | advanced that it was possible to compile and run programs on the
       | computers we bought, whether the computer was Apple (I,II,III,
       | Mac), PC, Commodore, Amiga, or whatever. But these old ways were
       | lost to the mists of time. Is there any hope this ancient
       | technology will be redeveloped for iPad Pro within the next 100
       | years? Specifically within Q4 of 2124, when Prime will finally
       | offer deliveries to polar Mars colonies? I want to buy an iPad
       | Pro M117 for my great-great-great-great-granddaughter but only if
       | she can install a C++ 212X compiler on it.
        
       | GalaxyNova wrote:
       | I hate how Apple tends to make statements about their products
       | without clear benchmarks.
        
         | bschmidt1 wrote:
         | It's a 10-core CPU + 10-core GPU + 16-core "NPU" (neural
         | processing unit) for AI all on a consumer handheld. It's like a
         | Ferrari engine in a Honda Civic - all we know is it's going to
         | be fast and hopefully it doesn't catch on fire.
        
       | abhayhegde wrote:
       | What's the endgame with iPads though? I mainly use it for
       | consumption, taking notes and jotting annotations on PDFs. Well,
       | it's a significant companion for my work, but I cannot see any
       | reason to upgrade from the iPad Air 5, especially given the
       | incompatibility with the 2nd gen Pencil.
        
       | biscuit1v9 wrote:
       | > the latest chip delivering phenomenal performance to the
       | all-new iPad Pro
       | 
       | What a joke. They have M4 and they still run iOS? Why can't
       | they run MacOS instead?
       | 
       | If you take it a bit deeper: if an iPad had a keyboard, mouse,
       | and MacOS - it would basically be a 10/12 inch MacBook.
        
       | jshaqaw wrote:
       | The heck do I do with an M4 in an iPad? Scroll Hacker News
       | really really fast?
       | 
       | Apple needs to reinvest in software innovation on the iPad. I
       | don't think my use case for it has evolved in 5 years.
        
         | hmottestad wrote:
         | I was hoping they would come out and say "and now developers
         | can develop apps directly on their iPads with our new release
         | of Xcode" but yeah, no. Don't know if the M4 with just 16GB of
         | memory would be very comfortable for any pro workload.
        
           | doctor_eval wrote:
           | There's no way Apple would announce major new OS features
           | outside of WWDC.
           | 
           | So, perhaps it's no coincidence that a new iPadOS will be
           | announced in exactly one month.
           | 
           | Here's hoping anyway!
        
       | hmottestad wrote:
       | 120 GB/s memory bandwidth. The M4 Max will probably top out at 4x
       | that and the M4 Ultra at 2x that again. The M4 Ultra will be very
       | close to 1TB/s of bandwidth. That would put the M4 Ultra in line
       | with the 4090.
       | 
       | Rumours are that the Mac Studio and Mac Pro will skip M3 and go
       | straight to M4 at WWDC this summer, which would be very
       | interesting. There has also been some talk about an M4 Extreme,
       | but we've heard rumours about the M1 Extreme and M2 Extreme
       | without any of those showing up.
        
       | jgiacjin wrote:
       | Is there an SDK for working on games with Unity on the iPad
       | Pro M4?
        
       | gigatexal wrote:
       | lol I just got an M3 Max and this chip does more than 2x the
       | TOPS my NPU does
        
       | gigatexal wrote:
       | Anyone know if the RAM multiples of the M4 are better than the
       | M3's? Example: could a base model M4 sport more than 24 GB of
       | RAM?
        
         | LtdJorge wrote:
         | Not for the iPad (8/16 GB)
        
           | gigatexal wrote:
           | Right, I was thinking more laptops and desktops, but yeah,
           | 8/16 makes a ton more sense for the iPad.
        
         | wmf wrote:
         | The M3 can support 64 GB but Apple artificially limits it to 24
         | GB. There's no telling when they will decide to uncripple it.
        
           | gigatexal wrote:
           | How did you come up with that? Based on the core layouts or
           | something?
           | 
           | Regarding crippling: I only went with a Max because I wanted
           | more RAM. I would have loved to go with an M3 Pro, but 36 GB
           | wasn't enough.
        
             | wmf wrote:
             | 128-bit LPDDR5 supports 64 GB. You can see this in a few
             | laptops.
        
       | thih9 wrote:
       | When this arrives in MacBooks, what would that mean in practice?
       | Assuming base M4 config (not max, not ultra - those were already
       | powerful in earlier iterations), what kind of LLM could I run on
       | it locally?
        
         | talldayo wrote:
         | > what kind of LLM could I run on it locally?
         | 
         | Anything up to the memory configuration it is limited to. So
         | for a base model M4, that likely means you have 8 GB of memory
         | with 4-5 GB of it realistically usable.
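         | 
         | Back-of-the-envelope for what fits (weights only; the KV cache
         | and runtime overhead push the real number higher):
         | 
         |     def weights_gb(params_billions, bits_per_weight):
         |         # params * bits / 8 -> bytes; 1e9 bytes ~ 1 GB
         |         return params_billions * bits_per_weight / 8
         | 
         |     for params, bits in [(8, 16), (8, 4), (3, 4)]:
         |         print(f"{params}B @ {bits}-bit: {weights_gb(params, bits):.1f} GB")
         |     # 8B @ 16-bit: 16.0 GB -- no chance on an 8 GB machine
         |     # 8B @ 4-bit:   4.0 GB -- borderline with 4-5 GB usable
         |     # 3B @ 4-bit:   1.5 GB -- comfortable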
        
       | thih9 wrote:
       | > while Apple touts the performance jump of the 10-core CPU found
       | inside the new M4 chip, that chip variant is exclusive to the 1
       | TB and 2 TB iPad Pro models. Lower storage iPads get a 9-core
       | CPU. They also have half the RAM
       | 
       | https://9to5mac.com/2024/05/07/new-ipad-pro-missing-specs-ca...
        
       | ycsux wrote:
       | Everyone seems as confused as I am about Apple's strategy here. I
       | wasn't sure the M4 existed; now it can be bought in a format no
       | one wants. How will this bring in a lot of revenue?
        
       | TheMagicHorsey wrote:
       | The only reason I'm buying a new iPad Pro is the screen and
       | because the battery on my 2021 iPad Pro is slowly dying.
       | 
       | I couldn't care less that the M4 chip is in the iPad Pro ... all I
       | use it for is browsing the web, watching movies, playing chess,
       | and posting on Hacker News (and some other social media as well).
        
       | Topfi wrote:
       | Generally, I feel that telling a company how to handle a product
       | line as successful as the iPads doesn't make much sense (what
       | does my opinion matter vs their success), but I beg you, please
       | make Xcode available on iPadOS or provide an optional and
       | separate MacOS mode similar to DeX on Samsung tablets. Being
       | totally honest, I don't like MacOS that much in comparison to
       | other options, but we have to face the fact that even with the
       | M1, the iPad's raw performance was far beyond the vast majority of
       | laptops and tablets in a wide range of use cases, yet the
       | restrictive software made that all for naught. Consider that the
       | "average" customer is equally happy with and, due to pricing,
       | generally steered towards the iPad Air, which are great devices
       | that cover the vast majority of use cases essentially identical
       | to the Pro.
       | 
       | Please find a way beyond local transformer models to offer a true
       | use case that differentiates the Pro from the Air (ideally
       | development). The second that gets announced, I'd order the
       | 13-inch model straight away. As it stands, Apple's stance is at
       | least saving me from spending $3.5k, as I've resigned myself to
       | accept that the best hardware in tablets simply cannot be used in
       | any meaningful way. Xcode would be a start, MacOS a bearable
       | compromise (unless they start to address the instability and bugs
       | I deal with on my MBP, which would make MacOS more than just a
       | compromise), Asahi a ridiculous, yet beautiful pipedream. Fedora
       | on an iPad, the best of hardware and software, at least in my
       | personal opinion.
        
         | ragazzina wrote:
         | >a product line as successful as the iPads
         | 
         | iPad revenue has been declining for 9 out of the last 10
         | quarters.
        
       | 1024core wrote:
       | > capable of up to 38 trillion operations per second
       | 
       | Assuming these are BF16 ops, by comparison, an H100 from NVIDIA
       | will do 1979 teraFLOPS BF16.
       | 
       | So this "Neural Engine" from Apple is 50x slower than an H100
       | PCIe version.
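       | 
       | The ratio, spelled out (with the caveat that NPU "TOPS" figures
       | are typically INT8 while the H100 number is a peak BF16 figure,
       | so treat this as order-of-magnitude only):
       | 
       |     npu_tops = 38        # Apple's quoted Neural Engine figure
       |     h100_tflops = 1979   # NVIDIA's quoted peak BF16 figure
       |     print(h100_tflops / npu_tops)  # ~52x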
        
         | foobiekr wrote:
         | The H100 is $80k.
        
       | aetherspawn wrote:
       | No numbers on battery life improvements?
        
       | bschmidt1 wrote:
       | Anyone know more about the "NPU" (neural processing unit)?
       | Basically a GPU specialized for AI?
        
         | Havoc wrote:
         | Yeah. It basically specialises in fast matrix operations, so
         | it's focused on AI and ML. Vaguely like a GPU minus the
         | graphics output and the pipeline needed for that.
        
           | bschmidt1 wrote:
           | Awesome, thanks. Given it's an iPad, I'm guessing that means
           | consumer-facing AI, which is very cool.
        
       | dev1ycan wrote:
       | It's hilarious how they still push the "better for the
       | environment" garbage when they industrially destroy old iPhones.
        
         | Havoc wrote:
         | They have been working on tech to pull them apart via robots to
         | isolate component parts. Unsure if it's in use at scale though
        
       | 0xWTF wrote:
       | Dear Apple, please stop focusing on thinner. At this point you're
       | selling high-end breakables. To quote Catherine the Great:
       | Huzzah!
        
       | tzury wrote:
       | "M4 has Apple's fastest Neural Engine ever, capable of up to 38
       | trillion operations per second"
       | 
       | Let that sink in.
        
         | zamadatix wrote:
         | Built into an SoC*, that is; e.g. a 2060 from 5 years ago had
         | double that in tensor cores, it just wasn't part of an SoC.
         | Great improvement for its type/application, dubious marketing
         | claim as worded.
         | 
         | And it's really not that great, even though it's a welcome
         | improvement: Microsoft is pushing 45 TOPS as the baseline, you
         | just can't get that in an APU yet (well, at least for the next
         | couple of months).
        
       | zmmmmm wrote:
       | I'm most interested in what this means for the next Vision
       | device.
       | 
       | Half the power budget could well translate to a very significant
       | improvement in heat on the device, battery size and other
       | benefits. Could they even dispense with the puck and get the
       | battery back onto the headset for a consumer version that runs at
       | slightly lower resolution and doesn't have the EyeSight feature?
       | 
       | If they could do that for $2000 I think we'd have a totally
       | different ball game for that device.
        
       | xyst wrote:
       | Cool. Too bad it's Apple, and it's locked down to Apple products.
       | 
       | If Apple ever decides to pivot its business to chip manufacturing
       | with open access and sales, then I'd revisit.
       | 
       | We always need competition in this space, especially for
       | low-power CPUs/GPUs.
        
       | rustcleaner wrote:
       | I really, really hope [but doubt] Qubes OS runs on something like
       | this, with these caveats:
       | 
       | - using an add-in video card for the dom0 GPU
       | 
       | - Apple's on-die GPU can be passed through to a qube
       | 
       | - Apple's unified memory does not need OS X to adjust the
       | RAM/VRAM wall
       | 
       | Since I doubt these caveats are in play, I think I may be stuck
       | with enterprise hand-me-downs that I can pass through.
        
       | pmayrgundter wrote:
       | Am I missing something? 3x faster than an Xbox Series X from 2020
       | sounds like "just" Moore's-law growth.
       | 
       | Cool! Thanks Apple!
       | 
       | I guess this is to be expected given your market position and
       | billions in the bank?
       | 
       | Thanks so much for doing it. We'll definitely buy all your
       | products.
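       | 
       | Rough check of that claim (my own arithmetic, using the classic
       | doubling-every-two-years formulation):
       | 
       |     years = 2024 - 2020
       |     expected = 2 ** (years / 2)  # Moore's-law pace
       |     print(expected)  # 4.0, so 3x is actually a bit under pace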
        
       | mrpippy wrote:
       | Today's Xcode 15.4 RC suggests that "Donan" is the M4 core
       | codename, and that it may support ARM's SME instructions.
       | 
       | https://mastodon.social/@bshanks/112401605018159567
        
       | SkyMarshal wrote:
       | _> M4 has Apple's fastest Neural Engine ever, capable of up to 38
       | trillion operations per second, which is faster than the neural
       | processing unit of any AI PC today._
       | 
       | I didn't even realize there was other PC-level hardware with AI-
       | specific compute. What's the AMD and Intel equivalent of the
       | Neural Engine? (Not that it matters, since it seems the GPU is
       | where most of the AI workload is handled anyway.)
        
         | zamadatix wrote:
         | AMD/Intel just call it an NPU.
        
         | dragonwriter wrote:
         | > not that it matters since it seems the GPU where most of the
         | AI workload is handled anyway
         | 
         | GPUs can also have AI-specific compute (e.g., Nvidia's tensor
         | cores.)
        
       | SkyMarshal wrote:
       | I wonder if they've implemented any fixes or mitigations for
       | GoFetch (https://gofetch.fail/).
        
       | smallstepforman wrote:
       | As an engineer, I find it extremely frustrating to read Apple's
       | marketing speak. It almost sounds like ChatGPT and Star Trek
       | techno-babble. Engineers cannot stomach reading the text, and
       | non-engineers won't bother reading it anyway.
       | 
       | What's wrong with plain old bullet points and sticking to the
       | technical data?
        
         | dimask wrote:
         | The target audience is neither engineers nor the general
         | public; these announcements are meant for tech
         | journalists/YouTubers etc. to refer to when writing or talking
         | about it.
        
       | zer0zzz wrote:
       | Looking forward to when these new 10-core M4s end up in cheap Mac
       | desktops. I hope there's a max RAM boost from the existing 24GB.
        
       | JSDevOps wrote:
       | Why does this feel like it was hastily added at the last minute?
       | Developing a chip like the M4 presumably takes years, so hastily
       | incorporating "AI" to meet the hype and demand could easily lead
       | to problems.
        
         | mlyle wrote:
         | Apple has had "Neural Engine" hardware in its SoCs since 2017,
         | and has been building out its capabilities every generation.
        
       | helsinkiandrew wrote:
       | Maybe I'm getting blasé about the ever-improving technical
       | capabilities, but I find the most astounding thing is that the M4
       | chip, an OLED screen on one side, and an aluminium case on the
       | other can fit in 5.1mm!
        
       | koksik202 wrote:
       | Give me macOS booting on an iPad and I'm in.
       | 
       | I don't get the hype about the performance while being locked to
       | the iPad ecosystem.
        
       | fakelonmusk wrote:
       | Why is Apple hurrying to push the M4 before the A18 Pro? Can
       | anyone support the hypotheses below? 1) the M3 partly follows the
       | M2 and A16 Pro, and 2) the M4 follows the M2 and A17 Pro.
        
       | drstrangevibes wrote:
       | yes but will it blend?
        
       | sean_the_geek wrote:
       | Seems like we are clearly in the _"post-peak Apple"_ era now.
       | This update is just an update for its own sake; the iPad lineup
       | is (even more) confusing; the iPhone cash cow continues, but at a
       | slower growth rate; new product launches (ahem, Vision Pro) have
       | seen low to negligible adoption; and improvements to products are
       | so marginal that consumers are holding on to their devices
       | longer.
        
       | garydgregory wrote:
       | Posts like these, linking directly to Apple's PR dept, are just
       | ads IMO.
        
       | qwerty456127 wrote:
       | > M4 has Apple's fastest Neural Engine ever, capable of up to 38
       | trillion operations per second, which is faster than the neural
       | processing unit of any AI PC today.
       | 
       | How useful is this for free libraries? Can you invoke it from
       | your Python or C++ code in a straightforward manner? Or does it
       | rely on proprietary drivers only available for their own OS?
        
         | zamadatix wrote:
         | It's a piece of hardware, so it'll have its own driver, but you
         | can access it through the OS APIs just like any other piece of
         | hardware. What's new is the improvement in speed; Apple has
         | exposed its NPUs for years. You can, e.g., run Stable Diffusion
         | on them.
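         | 
         | For example, with coremltools (a minimal sketch; the model file
         | name is hypothetical, and the OS decides what actually lands on
         | the Neural Engine):
         | 
         |     import coremltools as ct
         | 
         |     # load a converted Core ML model and request the Neural
         |     # Engine (with CPU fallback)
         |     model = ct.models.MLModel(
         |         "model.mlpackage",
         |         compute_units=ct.ComputeUnit.CPU_AND_NE,
         |     )
         |     # out = model.predict({"input": x})  # names depend on model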
        
       | throwaway1194 wrote:
       | I dread using a Mac for any serious work; you lose a lot of the
       | advantages of Linux (proper package and window management, native
       | containers, built-in drivers for nearly all hardware out there,
       | excellent filesystem support, etc.). And what exactly do you get?
        
         | _zoltan_ wrote:
         | The best hardware running the world's best desktop OS. I
         | wouldn't trade my MBP.
        
           | throwaway1194 wrote:
           | Best according to whom, and for what reason? Give me some
           | real reasons why it's "best"; as it stands, that's just what
           | Apple's marketing department would say.
        
             | _zoltan_ wrote:
             | I don't need to fiddle with settings, and as a developer
             | all my tools are there (git, clang, ...). I don't need to
             | care about my kernel version and can click update
             | worry-free.
             | 
             | Plus it's beautiful hardware that's great to the touch.
        
       | insane_dreamer wrote:
       | 8GB RAM?? What is this, 2005? Seriously, how much would it cost
       | Apple to start its base M4 model at 16GB of RAM?
        
         | achandlerwhite wrote:
         | It's an iPad...
        
       | ponorin wrote:
       | Part of me wonders if the reason Apple went with an unusual
       | double jump in processor generation is that they fear, or at
       | least are trying to delay, comparison with other desktop-class
       | ARM processors. I wonder if the Mac lineup will get the M4 at
       | all, or will start with the M4 Pro or something. We'll see.
        
         | zamadatix wrote:
         | Doesn't add up: if you fear something about to launch will
         | outdo your product in a given segment, you push your launch
         | earlier, not later; delaying just lets your competitor be first
         | to market as well.
        
       | ethagknight wrote:
       | Interesting. It seems Apple knows they are up against a wall
       | where there isn't really much more their devices NEED to do. The
       | use cases Apple gives for these devices' incredible compute are
       | very marginal. Their phones, computers, and iPads are fantastic
       | and mainly limited by physics. I have basically one of
       | everything, love it all, and do not feel constrained by the
       | devices of the last 3-4 years. Vision and Watch still leave room
       | for improvement, but those are small lines. There's limited
       | opportunity to innovate, hence the Vision being pushed to market
       | without a real path forward for customer need/use. Very few
       | people read about the latest M4 iPad and think "oh wow, my
       | current iPad can't do that..."
       | 
       | Curious what steps they will take, or whether they shouldn't just
       | continue returning large amounts of cash to shareholders.
        
       | stackedinserter wrote:
       | All this power and I still can't play decent games on my MacBook.
        
       | alanh wrote:
       | > 3-nanometer technology
       | 
       | Wow. I remember being assured that we would never reach even low
       | double-digit nanometer processes.
        
       ___________________________________________________________________
       (page generated 2024-05-09 23:02 UTC)