[HN Gopher] M3 Macs: there's more to performance than counting c...
___________________________________________________________________
M3 Macs: there's more to performance than counting cores
Author : ingve
Score : 172 points
Date : 2023-11-03 07:59 UTC (15 hours ago)
(HTM) web link (eclecticlight.co)
(TXT) w3m dump (eclecticlight.co)
| stuff4ben wrote:
| This is good to know. I worry about the longevity of these new
| M-series processors from Apple since they're so (relatively) new.
| What is the usable lifespan of a base M1-MacBook Air for example?
| I guess I'll find out since I just bought one for my kid who is a
| junior in HS and I'm hoping this will last until they graduate
| college.
| dewote wrote:
| For context, my M1 Air feels as good as the day I bought it, and
| it's just about to hit 3 years old.
|
| I kept my previous 2014 MBP for 6 years and I can't see this
| one being any different.
| jghn wrote:
| Same. The only issue I have w/ my original M1 Air is that I
| was impatient and got the 8GB model because I could get it
| into my hands a month faster. But that's not something that's
| changed over time.
|
| Every new release I drool over getting the new shiny but the
| reality is that for my personal laptop it's AOK for almost
| everything I do. Sometimes a bit slow and frustrating due to
| the RAM but again, that's on me.
| Detrytus wrote:
| I'm running a Windows (ARM) virtual machine on my M1 Air, and
| it's constantly thermally throttled. It was sometimes hitting 95
| degrees Celsius. I even bought one of those cooling pads
| with three fans, to make up for the lack of cooling in the
| laptop itself, so it is now down to 65 degrees Celsius. That
| makes my life a little more bearable, but I'm switching to an M3
| MacBook Pro (with proper cooling) ASAP.
| deagle50 wrote:
| Add a thermal pad to connect the heatsink to the back cover
| and the thermal throttling will go away. Tons of tutorials
| on Youtube and it takes 5 minutes. I did it on my M1 Air
| and it doesn't go above 85-90C now.
| charrondev wrote:
| I've got a 16 inch Pro with 32g of RAM and I can see myself
| using this for development work at least a decade.
|
| I tend to run it with a large external 4k 120hz screen and
| don't see the need to upgrade for better IO for a long time and
| the thing is very, very fast.
| andreasley wrote:
| For a while now, the lifespan of Apple devices has mostly been
| limited by Apple ceasing to provide software updates. Most
| Macs from 20 years ago still work fine today. If some component
| died, it was usually the hard drive or the power supply in my
| experience.
|
| Apple Silicon will definitely be the architecture best
| supported by Apple for the foreseeable future. There is no
| downside to it in terms of reliability or longevity.
| kmmlng wrote:
| To make a somewhat exaggerated comparison: horses still work
| fine today, but it really doesn't make sense for people to be
| using them anymore outside of niche use cases or as a hobby.
|
| Sure, macs from 20 years ago still work "fine", but the
| reason they aren't widely used anymore is not the lack of
| software updates.
| SXX wrote:
| I bought an M1 Air (256GB, 16GB RAM) on release for light web
| development and professional 2D gamedev in Unity. I'm still
| using it, and it's really durable hardware; I've been using it
| daily for 3+ years.
|
| With so little RAM I had to sacrifice the ability to properly
| run VMs, but to this day there's just no comparable hardware on
| the market.
|
| The 8GB versions though... Apple should certainly be damned for
| selling laptops with 8GB RAM in 2023 when my x220i ran 16GB in
| 2011. At some point they will all become e-waste due to too
| little RAM, or a dead soldered SSD from extreme swapping.
| stuff4ben wrote:
| 8GB of RAM ought to be enough for anybody, right? I'm hoping
| that's true for someone who spends most of their time in
| Google Docs or Word and the occasional light web browsing.
| This MBA replaced a Chromebook which really was on its last
| legs.
| alpaca128 wrote:
| I'm using an M2 Air with 16GB and M1 Mini with 8GB. On the
| 8GB machine I have to restart Firefox every few days
| because otherwise it completely fills up RAM + Swap and the
| entire system starts getting sluggish. It's a very
| noticeable bottleneck while on the 16GB Macbook I don't
| really think about memory usage.
|
| I would not recommend the 8GB variant to anyone who plans
| on using it for more than a couple browser tabs and some
| light tasks.
| stuff4ben wrote:
| > I would not recommend the 8GB variant to anyone who
| plans on using it for more than a couple browser tabs and
| some light tasks.
|
| Which is basically all my kids do. Less than 10 tabs,
| nothing else.
| Synaesthesia wrote:
| I'm a heavy user, often with 50 tabs and lots of apps in the
| background. The 8GB M1 Air is really fast for me.
| astrange wrote:
| Check about:memory in Firefox. It's not reasonable to run
| out of swap; that means it's trying to keep something
| like 40GB in there after compression. You might have an
| extension or something leaking.
| stetrain wrote:
| From my experience with M1, M1 Max, and M2 Max laptops, I think
| the only real limitation to useful longevity of the M1 machines
| is configured RAM and Apple's willingness to provide software
| updates.
|
| A point in its favor is that Apple is still selling the M1
| MacBook Air, now 3 years old, as their entry-level laptop
| today. And it's still a great machine with better
| performance and battery life than a lot of competitors and it's
| completely fanless. I expect Apple to push for dropping new OS
| support for Intel machines fairly quickly, but the countdown
| clock on M1 support probably doesn't even start until they stop
| selling them as new.
|
| My previous Intel MacBook Pro was still perfectly serviceable 8
| years after purchase when I traded it in, although it wasn't
| going to receive major OS updates going forward.
|
| The only time I've had issues with my M1 MacBook Air 8GB is
| when I temporarily tried using it as a real dev machine while
| waiting for a backordered company laptop. As soon as you really
| hit that RAM ceiling due to running docker and big IDEs you
| really feel the performance drop, but until that point it was
| perfectly competent, and again this is a fully fanless machine.
| troupe wrote:
| My experience with buying lower-end Intel-based MacBook Airs is
| that they generally last 10 years before we replace them.
| Usually this comes down to asking whether it is worth replacing
| the battery or better to invest in a new machine. I'm assuming
| the M processors will be similar.
| kalleboo wrote:
| For me at least, the longevity is already far better than the
| Intel MacBook Pros. I bought the 2017 and 2019 models and for
| my use those felt obsolete from day 1, with 2-hour battery life
| and permanent full-speed fans and CPU throttling.
|
| I've had an M1 Pro since release day and I don't see myself
| wanting to replace it until probably the M5 is out.
| cubefox wrote:
| This seems to be a lot of effort to rationalize the surprisingly
| small performance increase from M2 to M3. Initially the
| assumption was that M2 to M3 would be a bigger step than M1 to
| M2, not a smaller one. Perhaps TSMC 3nm is showing the limits of
| scaling?
| Olreich wrote:
| Scaling based on node size has been limited for a long time. It
| feels like starting around 13nm, every shrink delivered less
| and less performance uplift, but it's likely been going on even
| longer and we just had much more room for improvement in chip
| design.
| scrlk wrote:
| It's a mixture of several problems:
|
| * TSMC N3B being a bit of a flop (yield issues, too expensive)
|
| * Brain drain from Apple's chip design teams over the last few
| years
|
| * Tim Cook trying to push the average selling price up to keep
| revenue growth going in the face of sales declines (e.g.
| hobbling memory bandwidth, reducing the number of performance
| cores for M3 Pro)
|
| I don't expect there to be a M1 style generational leap for a
| long time, expect 2010s Intel style yearly performance gains
| from here on out.
| pier25 wrote:
| GPU performance is increasing a lot though.
|
| Another point is that the CPU improvements are steady: in about
| two years, the base M3 is now on par with the M1 Max.
| hajile wrote:
| The GPU is critical to future profits though. Apple really
| wants your monthly subscription to Apple Arcade and they
| want to expand in other game areas which is why they've
| been paying AAA companies to optimize for Mac. This also
| ties into their VR headset where gaming will be one of the
| core features.
| the-golden-one wrote:
| Hardly any AAA companies are optimising for the Apple
| GPU. MoltenVK is where all the interest is.
|
| Even if they did, there is very little in Apple Arcade
| which taxes the GPU, most target the lowest common
| denominator in terms of supported iOS/phone combinations.
|
| The original Apple Arcade strategy was for AAA titles,
| but for whatever reason that wasn't pursued, so now we
| have a tonne of casual games and re-releases of old
| titles.
|
| Apple just seems to run hot and cold on gaming.
| zarzavat wrote:
| It's true but the problem with the GPU is Apple's addiction
| to RAM money. Yes the GPU is improving in performance but
| it does you no good if it has to share a tiny amount of
| system RAM with the CPU.
| kbd wrote:
| Tiny? My M2 Air has 24GB, which is more RAM than any other
| laptop video card I've had.
| zarzavat wrote:
| Sure, if you pay for it. The 14" starts at 8GB. Eight!
| $200 more for another 8GB.
| forrestthewoods wrote:
| > The GPU is increasing a lot though.
|
| Is it? Does it matter? Unified RAM is cool. But I'd rather
| have an Nvidia 4090.
| blktiger wrote:
| Weren't all of the new M3 Macs announced at the same prices as
| before, or lower? Same with the recently announced iPhones? Or
| am I misremembering? Given all the recent inflation, prices not
| increasing are effectively a price decrease pretty much across
| the board, not an increase in the average selling price.
| scrlk wrote:
| Entry level pricing for a MacBook Pro now starts at $1599
| rather than $1299.
| sroussey wrote:
| And it has a downgrade from a Pro chip to a non pro chip.
| npunt wrote:
| 13" Macbook Pro had M2 not M2 Pro
| eyelidlessness wrote:
| No it doesn't. The previous (cheaper) entry level MBPs
| were non-Pro M2 (and non-Pro M1 before that).
| GeekyBear wrote:
| > expect 2010s Intel style yearly performance gains from here
| on out
|
| Intel saw very little gain this year at all in return for a
| 400 watt power draw under load.
|
| The plain old M3 saw a 20% performance gain alongside
| efficiency gains.
|
| Having a 22 hour battery life is insane and you certainly
| aren't going to manage that with a 400 watt power draw.
| pretzel5297 wrote:
| 400 watts is on a desktop chip, where there is no concept of
| battery life.
|
| The 20% performance increase is compared to the M1, not the M2,
| which itself was a 20% increase over the M1.
| GeekyBear wrote:
| > 20% increase on performance is compared to M1, not M2
|
| Nope.
|
| > The M3 chip has single-core and multi-core scores of
| about 3,000 and 11,700, respectively, in the Geekbench 6
| database. When you compare these scores to those of the
| M2's single-core and multi-core scores (around 2,600 and
| 9,700, respectively), the M3 chip is indeed up to 20%
| faster like Apple claims.
|
| https://www.laptopmag.com/laptops/macbooks/apple-m3-bench
| mar...
|
| > 400 watts is on a desktop chip where there is no
| concept of battery life.
|
| Yes, and in exchange for that ridiculous 400 watt power
| draw, Intel saw negligible performance gains.
|
| > In some areas, the extra clock speeds available on the
| Core i9-14900K show some benefit, but generally speaking,
| it won't make much difference in most areas.
|
| https://www.anandtech.com/show/21084/intel-
| core-i9-14900k-co...
|
| Intel only wishes they could hit a 20% gain in exchange
| for all that increased power draw and heat. As that
| review noted the best improvement they saw in any of the
| common benchmarks was just 6%.
| llm_nerd wrote:
| Apple seems to be focusing on efficiency more than anything,
| and maybe a little more market segmentation where the lesser
| chips are less competitive with the bigger chips. The Pro
| offers fewer performance cores, trading them for efficiency
| cores. It has reduced memory bandwidth.
|
| From a generational perspective, the M3 Max offers the same
| level of performance as the M2 Ultra. That's amazing, as the M3
| Max is 12P/4E vs 16P/8E in the M2 Ultra. The M3 Ultra should be
| a substantial lift.
| Zetobal wrote:
| It's a tick and some people... especially marketing departments
| want to sell it as a tock.
| solardev wrote:
| Wait, is that the sound the minute hand makes?! I never
| realized.
| xattt wrote:
| Assuming you don't have a single-hand clock.
| vbezhenar wrote:
| https://en.wikipedia.org/wiki/Tick-tock_model
| camillomiller wrote:
| I don't know where you get the idea of a small performance
| increase. What I'm seeing so far, also in pre-review units, is
| actually the opposite, especially when ray tracing and mesh
| shading are taken into account. For anything GPU related these
| new machines are a giant leap.
| jeffbee wrote:
| Are these mainstream use cases? These always feel like they
| are chosen to demonstrate the processor's strengths, instead
| of talking about the ways the processor helps in real
| applications.
| brigadier132 wrote:
| Isn't it a 15% increase? I really don't consider any double
| digit percent increases to be small.
| hajile wrote:
| Actual IPC increase is 1-2%. The rest is from ramping the
| clockspeeds. This is a problem because power consumption goes
| up exponentially with frequency. Go up too high and they'll
| be doing what AMD or Intel does where a single core is using
| 50+ watts to hit those peak numbers.
| svnt wrote:
| Power is polynomial (it goes with the square of frequency),
| not exponential.
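| (As a rough illustration: dynamic CMOS switching power is
| P = C * V^2 * f, and since voltage typically has to rise along
| with frequency, power grows closer to the cube of the clock. A
| minimal sketch with illustrative numbers, not measured Apple
| silicon figures:)

```python
# Dynamic CMOS switching power: P = C * V^2 * f.
# All numbers below are illustrative, not measured chip figures.

def dynamic_power(cap: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power, proportional to C * V^2 * f."""
    return cap * volts**2 * freq_ghz

base = dynamic_power(cap=1.0, volts=1.0, freq_ghz=3.0)

# A 10% clock bump at fixed voltage costs ~10% more power (linear in f)...
same_volts = dynamic_power(1.0, 1.0, 3.3)

# ...but if voltage must also rise ~10% to sustain the higher clock,
# power grows roughly with the cube of frequency.
higher_volts = dynamic_power(1.0, 1.1, 3.3)

print(round(same_volts / base, 3))    # 1.1
print(round(higher_volts / base, 3))  # 1.331
```

| (So a 10% clock bump that also needs ~10% more voltage costs
| roughly a third more power, which is why clock ramps get
| expensive fast.)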
| _ph_ wrote:
| How is this a problem? It just looks like they didn't try to
| improve the general architecture this time, but instead used
| the process step to ramp up clock speeds without increasing
| power consumption. Which is a great achievement, because any
| tape-out on a new advanced process - here, for the first time,
| a "3nm" - is a big undertaking. One has to consider that Apple
| now does yearly updates of its processor lineup. The next step
| will probably introduce more architectural changes. Only if
| those stopped showing up for several years in a row would I get
| concerned.
| hajile wrote:
| With each recent node step, you basically get +10-15%
| clockspeed or -30% power consumption. They just blew
| their entire node on a small clockspeed ramp.
|
| Now, if they want a wider core for M4, that means more
| transistors and more heat. They are then forced to: not
| go wider, decrease max clockspeed, hold max clockspeed
| for a pitiful amount of time, or increase power
| consumption.
|
| On the whole, I'd rather have a wider core and lower
| clockspeeds, then turn the other power savings into either
| battery life or a few more E-cores.
| GeekyBear wrote:
| > Isn't it a 15% increase?
|
| At least according to Geekbench, it's a 20% performance
| increase.
|
| > The M3 chip has single-core and multi-core scores of about
| 3,000 and 11,700, respectively, in the Geekbench 6 database.
| When you compare these scores to those of the M2's single-
| core and multi-core scores (around 2,600 and 9,700,
| respectively), the M3 chip is indeed up to 20% faster like
| Apple claims.
|
| https://www.laptopmag.com/laptops/macbooks/apple-m3-benchmar.
| ..
|
| Alongside a battery life increase to 22 hours? It's been a
| pretty good showing.
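| (The quoted scores are easy to sanity-check; a quick sketch
| using the approximate Geekbench 6 figures cited above shows
| single-core at about +15% and multi-core at about +21%:)

```python
# Sanity-check the generational gain implied by the Geekbench 6
# scores quoted above (approximate figures from the article).
m2 = {"single": 2600, "multi": 9700}
m3 = {"single": 3000, "multi": 11700}

for kind in ("single", "multi"):
    gain_pct = (m3[kind] / m2[kind] - 1) * 100
    print(f"{kind}-core: +{gain_pct:.0f}%")
# single-core: +15%
# multi-core: +21%
```

| (Which squares the 15%-vs-20% debate: "up to 20%" holds for
| multi-core, while single-core is closer to 15%.)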
| pretzel5297 wrote:
| Chips get optimized specifically for benchmarks.
| wtallis wrote:
| Are you alleging that a chip which was in development for
| years was optimized specifically for a benchmark that was
| released a few months ago?
| deagle50 wrote:
| The surface area of the M3 Pro vs M2 Pro tells you everything
| you need to know.
| DesiLurker wrote:
| Do you have a handy link and a few more sentences for my
| friend who doesn't get it?
| deagle50 wrote:
| Search "M3 Pro" on Twitter and you'll see some decent
| comparisons.
| Keyframe wrote:
| Every time Apple oversells, there are apologists putting a spin
| on it. Nothing new to see.
| Tagbert wrote:
| And there are detractors downplaying improvements. Nothing
| new to see.
| tambourine_man wrote:
| 10-15% increase in single threaded code every year is pretty
| good these days.
| aeonik wrote:
| The graph needs a legend. Explaining the graph in the several
| paragraphs below it, and then cross-referencing shape names and
| spatial relationships, is not a fun game to play.
| whywhywhywhy wrote:
| Feels like we've gone from "it's just better, it's obviously
| better, you can see it with your eyes and feel it as you use
| it" when the M1/M2 dropped,
|
| to trying to justify why the M3 is 20% faster than the M1, when
| the M2 was also 20% faster than the M1 and, weirdly, Apple is
| only comparing it to their older processor.
|
| Like, people, maybe it's just an underwhelming update... no
| need to pretend it's not.
| dmix wrote:
| Every Apple update that's not a new product/major revision
| people rush to HN to say how underwhelmed they are. This says
| more about the high bar they set for announcements and people's
| own expectations... rather than how steady and efficient
| progress works in reality.
|
| This is what product iterations look like, M2 only came out a
| year ago and M3 is a notable speed increase and just as
| important bump in battery life (22hrs is insane for the
| performance you get).
|
| They compared to M1 and intel because people almost always wait
| 1-2 cycles before upgrading because MacBooks easily last 2yrs
| under heavy use and it's an expensive upgrade.
| runjake wrote:
| 1. I think you set your expectations too high.
|
| 2. I assume Apple compared the M3 to M1 because most of the
| customers they are targeting are still on the M1.
|
| M2 customers bought a computer less than a year ago and that
| pool of people is relatively small compared to the number on M1
| today.
| mcphage wrote:
| > I think you set your expectations too high.
|
| Well, they did announce it in an event called "scary fast". I
| mean, there's always marketing exaggeration, but I did expect
| more than trying to understand if it's even faster _at all_.
| Otherwise I'm not entirely certain what the point was.
| whynotminot wrote:
| Literally the only point of confusion here is M3 Pro versus
| M2 Pro, where Apple seems to have simply made different
| choices around what the Pro chip should be.
|
| M3 Max is a _massive_ gain over M2 Max. And M3 is a nice
| improvement over M2.
|
| Hope that helps.
| stetrain wrote:
| It felt like half the event was listing different types of
| workload and saying that the new chips were 15-20% faster
| than the M2 generation at that task, and 30-40% faster than
| the M1 generation.
|
| The only real exception to that is the M3 Pro, which is
| closer to the performance of the M2 Pro at a reduced core
| count.
| runjake wrote:
| > Well, they did announce it in an event called "scary fast".
|
| It was the night before Halloween.
|
| > I mean there's always marketing exaggeration
|
| Apple's marketing people are notorious for exaggeration.
|
| > I did expect more than trying to understand if it's even
| faster at all
|
| Their stated performance improvements seem to be accurate.
| Although I'm surprised at _how much_ faster the M3 Max seems
| to be.
|
| > I'm not entirely certain what the point was.
|
| The point was this year's revisions with a speed bump. They do
| this almost every year.
|
| That said, early benchmarks are indicating that the M3 Max
| performance is on par with the M2 Ultra. I consider that
| "scary fast" myself.
| FireBeyond wrote:
| > 2. I assume Apple compared the M3 to M1 because most of the
| customers they are targeting are still on the M1.
|
| I feel like this is giving Apple the benefit of the doubt.
|
| Most people here wouldn't say the same of Intel if they
| started comparing to 2 or 3 generations ago.
|
| No, most likely they're comparing to M1 because it's the
| biggest difference that is still plausible to explain.
| pohl wrote:
| The M3 Max performs similarly to an M2 Ultra. That feels pretty
| big to me as a current M1 Max user.
| steve1977 wrote:
| It's also more expensive than an M2 Max or M1 Max, if I'm not
| mistaken.
| stetrain wrote:
| 14" MBP with the highest M2 Max configuration, 64GB RAM,
| 1TB SSD was $3899.
|
| A new model with the same configuration is also $3899.
| steve1977 wrote:
| Interesting, they seem to be a bit more expensive where I
| live, although I cannot compare them 1:1 (I spec'd one with an
| M2 Max 30-core GPU and 64GB RAM some months ago; however, for
| 64GB RAM I would need to go with the M3 Max 40-core GPU now,
| so it's not a fair comparison).
|
| Edit: But in any case, the difference would not be huge from
| what I can see, so I guess my point is moot.
| speedgoose wrote:
| Wow, that's overpriced for such low specs.
| grecy wrote:
| Please show me a laptop from another manufacturer that
| has similar specs, similar screen, similar battery life
| that costs less.
| speedgoose wrote:
| It's more that those specs are not worth the price bump.
| I have a M1 that is slightly less capable but cost much
| much less. I use it to access machines that are faster
| than this M3 laptop when needed.
| stetrain wrote:
| For 16 core CPU, a dedicated GPU equivalent, 400GB/s
| LPDDR5 memory, 3024x1964 Mini-LED, and 18 hours of
| battery?
|
| The closest I can configure a Dell XPS 15 is $3099, and
| that's a 14 core CPU and likely lower memory bandwidth
| and lower performance SSD. They claim 18 hours of battery
| but only with the base screen, the upgraded screen is
| presumably less.
|
| And from personal experience using an XPS 15 is a
| significantly worse experience in stability, heat, fan
| noise, and real-world battery life.
|
| And here are benchmarks for the XPS vs the M3 Max:
|
| https://browser.geekbench.com/v6/cpu/3367184
|
| https://browser.geekbench.com/v6/cpu/3372431
| speedgoose wrote:
| Yes I guess if you absolutely need those specs in a small
| laptop and you don't care about value it makes sense. I
| think it's more that the laptop is not a good deal
| compared to other Apple laptops.
| stetrain wrote:
| Oh for sure. The value return per dollar gets worse the
| higher you go up Apple's options list. Especially their
| RAM and SSD upgrade prices.
| ttoinou wrote:
| So, with inflation, the actual price went down
| strangescript wrote:
| Exactly, how are people overlooking this? It's like saying:
| I'm going to drop a near-silent Windows laptop that is thin
| and cool, oh, and it has BETTER performance than last year's
| desktop Windows machines.
| abrouwers wrote:
| I agree it's impressive, in the same way Ferrari announcing
| a new super car is impressive. But, I drive a hatchback,
| and was just hoping they'd cave on 8gb/256gb for
| ram/storage, or support for multiple monitors.
| adolph wrote:
| Well, they just caved on the 128GB storage last year (2022
| [0]), so it seems premature to expect that to happen again
| soon. The last bump in base RAM was 2017 [1] and before that
| 2012, so one might think that it would increase soonish.
| However, the shift from Intel to Apple silicon changed the
| nature of RAM in these machines. It's too soon to tell what a
| base-level upgrade would look like.
|
| 0. https://everymac.com/systems/apple/macbook-
| air/specs/macbook...
|
| 1. https://everymac.com/systems/apple/macbook-
| air/specs/macbook...
| rahoulb wrote:
| I bought my M1 MacBook Pro not long after they were released.
| It's approaching 3 years old which is a pretty standard age for
| people to start looking at a replacement - I guess that's why
| they're comparing it to the M1.
|
| Slight tangent - even though the MBP has been struggling to
| handle my dev work (partly because the codebase I'm working on
| has grown significantly in the last two years) I won't be
| upgrading. Work bought me a MacStudio (M2 Max), which mainly
| runs headlessly on my desk (also running Homebridge and plugged
| into my big speakers for AirPlay). This means I'm keeping the
| MBP as my portable machine (using VSCode's remote extensions
| and/or CodeServer to do dev work from wherever). I also have a
| late-2015 27" iMac at the office (with OpenCore Legacy Patcher
| so it can run whatever the latest macOS is called nowadays).
| This also works perfectly well now all the hard stuff is done
| on the MacStudio - and the screen is still lovely. (My previous
| 27" iMac was in active service for 10 years, although it was
| almost unusable towards the end).
|
| Another tangent - I bet that's why they're not doing the larger
| "pro" iMacs. If it weren't for VSCode Remote Extensions and
| OCLP, I would have ended up with a beautiful monitor that was
| essentially useless (as was my original 27" iMac). People who
| need that extra power should probably avoid all-in-ones and I
| won't be getting one again.
|
| But for most people, who tend to have a single machine, a
| comparison to the 3 year old equivalent seems pretty fair to
| me.
| bee_rider wrote:
| Comparing against the device two generations back does seem to
| make more sense from a "should I upgrade?" point of view, and
| seems like a fine thing to put in marketing slides (as well as
| comparisons vs current peers, since those are relevant to the
| question of "what should I upgrade to?").
|
| The lackluster year-over-year gain is a bit of a bad sign for
| the long-term trajectory, but they've been doing this
| successfully for quite a while; they've recovered from worse,
| I suspect.
| dnissley wrote:
| Oldest MBPs with an M1 are from late 2021, so only 2 years
| old.
| rahoulb wrote:
| Those were the redesigned M1 Max/Pro versions.
|
| Mine is the M1-with-touchbar (same design as the intel
| version) which was released at the same time as the M1
| MacBook Air - late 2020.
| cesaref wrote:
| Having been all over the Apple range over the years, the M1
| Max MacBook Pro I'm currently using has been, I would say, the
| best of their laptops since the original G4 PowerBook. A
| couple of years into ownership of this machine, and it's been
| an excellent experience.
|
| I develop on it every day, and I've no need to upgrade it, so
| I'll wait for the M4 :)
| karolist wrote:
| This is the way, portable machine should have a good battery
| life, good screen and good keyboard for me, everything else
| gets done on a remote, network connected machine with gobs of
| RAM and Linux.
| Jaxan wrote:
| > It's approaching 3 years old which is a pretty standard age
| for people to start looking at a replacement
|
| Is this really true? Three years sounds very new to me, and
| no one in their right minds would replace it.
| fouc wrote:
| Apple's marketing compared the M3 to both the M2 and M1,
| saying it was 50% faster than the M1, and 30% (or 20%?) faster
| than the M2
| stetrain wrote:
| 20% performance increases generation over generation, every 18
| months or so, seems pretty great to me.
|
| If people were expecting gains like the Intel to M1 transition
| on an annual basis I'm not sure that was ever realistic.
| Frost1x wrote:
| >Like people, maybe it's just an underwhelming update... no
| need to pretend it's not.
|
| _" Buy the new M3, a small incremental improvement on the
| M1/M2"_ would probably get a lot of people fired as a marketing
| campaign.
|
| While I agree with you, we live in a reality where hype,
| hyperbole, and misleading, intentionally manipulative
| information are accepted and commonplace in selling things. It
| sure would be nice to look at a product or service and get a
| clear comparison without having to read between the lines,
| pick out subtly vague language, and keep up with the latest
| propaganda techniques, but alas, it's everywhere.
|
| On the bright side it should teach everyone you shouldn't trust
| the majority of information at face value, which is a useful
| skill in life. On the downside we continuously erode trust in
| one another and I worry it's wearing on social structures and
| relationships everywhere as more people see and mimic these
| behaviors everywhere for everything.
| fsociety wrote:
| Why would you market to people who just bought an M2 Pro this
| year? Those who are going to upgrade, will anyways. Seems silly
| to market to people telling them to buy new multi-thousand
| dollar laptops every year.
| creativenolo wrote:
| Yeah, and in addition, people know Intel to Apple silicon was
| a massive leap, so why market how much faster it is than the
| Intel version? That play has already been played.
|
| The stats comparing to the M2 are there. They are just not the
| headline/summary.
| lostlogin wrote:
| Most of the customers are going to be Mac users, and I'll bet
| that most already have M series machines.
|
| Comparing M3s to old Intel Macs would also be lame, and a
| fairly meaningless comparison.
|
| It would be interesting to know how many people compare M
| series performance to Intel performance when buying, as those
| are usually going to be different markets surely?
| jmull wrote:
| It seems like the M3 favors efficiency and thermals over raw
| power. This makes sense for the devices they've put it in.
|
| I wonder if we're going to see different processors (M3X or M4)
| for the next release of the pro desktops, that favor power over
| thermals and efficiency?
|
| Maybe there will be a kind of tick-tock, with odd numbered
| processors favoring efficiency and even numbered processors for
| power?
|
| M3, M5, M7 for the laptops
|
| M4, M6, M8 for the pro desktops
|
| ?
| pohl wrote:
| Doesn't prioritizing efficiency and thermals make sense for any
| device? Even a desktop has moments when it's doing almost
| nothing but has tasks that can adequately be handled by an E
| core where powering up a P core is overkill. Doesn't being
| efficient mean that more of the thermal budget can be held in
| reserve for when it's actually needed? Don't desktop users
| desire for their powerful machine to also be quiet?
| jeffbee wrote:
| The blue-logo CPU company gives you a choice. You can adopt
| their most aggressive power profiles and get marginal
| performance gains if you want them. Or, you can tune them
| down to levels where they are pretty fast and fairly quiet.
| Or, you can dial them all the way down to their most
| efficient operating point, which pretty much nobody wants
| because that point is around 1W/core and they are pretty
| slow.
|
| Apple is dictating how much power the CPU is allowed to draw
| and they don't let it scale up. They are also making a static
| choice about how much die area and power to spend on the GPU,
| which might not suit every user. I know I personally don't
| give a rip about the GPU in my M2 mac mini, beyond the fact
| that it can draw things on the screen.
| diffeomorphism wrote:
| Not really, no. Case in point:
|
| https://www.notebookcheck.net/AMD-Ryzen-9-7940HS-analysis-
| Ze...
|
| Adjusting the TDP from 80W all the way down to 35W costs you
| relatively little in performance and gives you about the same
| efficiency as an M2 Pro. That is not done because it does not
| sell.
|
| > Don't desktop users desire for their powerful machine to
| also be quiet?
|
| No. That is not an "also" but an "instead". People claim they
| would make that trade-off but that is just not the case. Same
| for "make it heavier but give me more battery life".
| bgirard wrote:
| If I can tuck my desktop away under my desk and it acts as a
| large space heater in the below-zero winter, I don't mind if
| it's helping me get my compiles done faster.
|
| There are some jobs I want a fast desktop for, and other jobs,
| like browsing cooking recipes, that I'll take my M1 MacBook
| Air for.
| arcticbull wrote:
| > That is not done because it does not sell.
|
| It's not done because it doesn't sell in the PC market.
| Apple's kind of your only choice and if they say you're
| getting perf/W you're getting perf/W :)
|
| > People claim they would make that trade-off but that is
| just not the case.
|
| It's not two-dimensional, TDP vs noise. It's TDP vs size
| vs noise vs price. Get a large ATX mid-tower, throw some
| Noctua 140mm fans in there, and you won't hear a thing as
| it dispenses 1 kW of heat into your feet area.
| crest wrote:
| > Doesn't prioritising efficiency and thermals make sense for
| any device?
|
| No. If you have the cooling to sustain the higher power
| mode, it can be worth paying the power bill to get work done
| faster. Compared to the productivity gains on compute-
| limited workloads, the increased power bill is little more
| than a rounding error, or even a net power saving, because
| you need less supporting infrastructure per worker to
| accomplish the same work (assuming the tasks can even be
| parallelized efficiently across multiple human workers).
| GeekyBear wrote:
| > Doesn't prioritizing efficiency and thermals make sense for
| any device?
|
| Intel did just ship 14th Gen Core i9 chips with a 400 watt
| power draw under load and very little in the way of a
| performance gain.
|
| > In some areas, the extra clock speeds available on the Core
| i9-14900K show some benefit, but generally speaking, it won't
| make much difference in most areas.
|
| The most significant performance win for the Core i9-14900K
| came in CineBench R23 MT... the Core i9-14900K sits 6% ahead
| in this benchmark
|
| https://www.anandtech.com/show/21084/intel-
| core-i9-14900k-co...
|
| Having your best benchmark result only see a 6% gain in
| return for a more than 400 watt power draw makes it pretty
| clear that just as with the Pentium IV of old, Intel isn't on
| a sustainable path.
|
| It also makes the M3's 20% performance gain alongside
| efficiency gains look pretty darn good in comparison.
| solardev wrote:
| You can just stack more of them for performance, no? You can't
| magically cool them down for idle periods though.
| jmull wrote:
| Dollars are never unlimited though, so you can't just add
| more chips/chiplets/cores.
| stetrain wrote:
| For the M3 and M3 Pro, they kept the same number of cores, or
| in the case of the Pro actually decreased it.
|
| On the other hand, the M3 Max gained 4 Performance cores for a
| total of 16 cores (12P 4E) vs the M2 Max's 12 cores (8P 4E).
|
| The M3 Max beats the M2 Ultra (2x M2 Max dies) in single and
| multicore benchmarks.
|
| Seems like Apple is leaning towards efficiency on their lower
| end chips, but also using the extra transistor density to push
| performance on M3 Max, increasing the gap between the low and
| high end chips.
|
| That M3 Max is presumably going to be turned into an M3 Ultra
| sometime next year with 32 cores, which would roughly double
| the M2 Ultra multicore performance.
| jltsiren wrote:
| The M3 Max beats the M2 Ultra in single-task benchmarks,
| which measure how well various kinds of software can take
| advantage of the hardware. Such benchmarks don't scale well
| with the number of CPU cores, because many tasks involve
| sequential bottlenecks.
|
| With higher-end hardware, you should get more meaningful
| results by running a few copies of the benchmark software in
| parallel and reporting the sum of multicore scores as the
| true multicore score. That would reflect the common use case
| of running several independent tasks in parallel.
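| The aggregation described above can be sketched in a few
| lines of shell (a hypothetical sketch; run_benchmark is a
| stub standing in for any benchmark binary that prints a
| numeric score):

```shell
#!/bin/bash
# Sketch of the suggested methodology: run N independent copies of a
# benchmark in parallel and report the sum of their scores as the
# aggregate multicore figure. run_benchmark is a stub; a real run
# would invoke the actual benchmark binary.
run_benchmark() { echo 100; }

N=4
tmpdir=$(mktemp -d)
for i in $(seq "$N"); do
    run_benchmark > "$tmpdir/score_$i" &   # launch all copies concurrently
done
wait  # let every copy finish before summing

total=0
for i in $(seq "$N"); do
    total=$(( total + $(cat "$tmpdir/score_$i") ))
done
rm -rf "$tmpdir"
echo "aggregate score across $N parallel copies: $total"
```

| With a real benchmark binary the per-copy scores would
| drop as copies contend for cores and memory bandwidth,
| which is exactly what this aggregate is meant to expose.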
| deagle50 wrote:
| Apple is favoring yields and margin over everything else. They
| cut the M3 Pro as much as possible so it's half the area of the
| M2 Pro.
| hajile wrote:
| I think it's more accurate to say that M3 favors smaller chips
| because N3B has atrocious yields (not to mention cost savings).
|
| When they switch to N3E and get better yields, I imagine that
| we'll see chip size grow. As we're getting close to the
| number of cores the average consumer can use or would want
| in a desktop, though, I imagine we'll see an increase in
| GPU, NPU, and other specialized units instead of more CPU
| cores.
| Fluorescence wrote:
| > I wonder if we're going to see different processors (M3X or
| M4) for the next release of the pro desktops
|
| I don't think they care much about pro desktops given the
| pretty embarrassing MacPro. They just don't mean much for the
| bottom line compared to consumer lifestyle devices.
|
| I expect they would be happily rid of the customer segment -
| annoying power users wanting low-level access and
| customisation. They keep their machines for too long and don't
| give much upselling opportunity for subscriptions, fashion
| accessories, house and family trinkets.
| jwells89 wrote:
| It can make sense for desktops too, like in the case of the Mac
| Studio. The role it plays there is keeping fans inaudible while
| maintaining good thermals rather than conserving battery.
| Dritzzka wrote:
| Apple needs to stop explaining stats in vague weeb speak and
| just give us the actual GCC compile times.
|
| >Then again, that takes courage.
| whynotminot wrote:
| Given what appears to be a reduced performance focus for the
| M3 Pro chip, and with the M3 Max taking off to the
| stratosphere in both performance and, unfortunately, price,
| I'm wondering whether for a lot of professional users
| getting a refurbished M2 Max machine is actually the best
| price-to-performance move right now.
|
| Really looking forward to some detailed M3 Pro analysis.
| asylteltine wrote:
| And yet the air can't support more than one monitor. I know Apple
| is a small company with only a few engineers so it must have been
| too hard.
|
| Sarcasm obviously. As an Apple enthusiast it's so annoying how
| they sniff farts.
| sneak wrote:
| The Air is not the computer for people who use a lot of
| external monitors. 99.9%+ of MBAir users will never connect
| even one external monitor.
|
| They make a small MacBook Pro that is suited for that task.
| Detrytus wrote:
| And yet it's a dick move from Apple to artificially limit
| the number of external screens supported.
| sneak wrote:
| Lots of compromises are made to get the Air to the size,
| weight, and most importantly, price point (ie sub-$1k) that
| it is.
|
| It is Apple's best selling computer by quite a bit. They
| sell metric assloads of the $999 base model each fall when
| school starts.
|
| IMO the $999 one (or $899 with education discount) is
| literally the best price:perf ratio of any computer
| available today or at any time in my recent memory. It's
| astounding how good it is for that price.
| user_7832 wrote:
| > Lots of compromises are made to get the Air to the
| size, weight, and most importantly, price point (ie
| sub-$1k) that it is.
|
| > IMO the $999 one (or $899 with education discount) is
| literally the best price:perf ratio of any computer
| available today or at any time in my recent memory. It's
| astounding how good it is for that price.
|
| It is possible for your comment to be correct while the
| poster above is also correct. The M chips are amazing,
| but let's not pretend that Apple doesn't knowingly gimp
| the hardware to upsell the more expensive versions.
|
| _" But that would make it more expensive."_
|
| Yes it would... by pennies. It is a bit ridiculous to
| sell a phone as expensive as the current iPhones and
| limit it to USB 2.
| sneak wrote:
| Generally speaking, people don't use the port on an
| iPhone for anything other than charging. The fact that it
| supports USB v-anything is mostly unnecessary cost.
|
| If you are thinking in any way whatsoever about the non-
| wireless data transfer speeds on your phone, you are not
| the market or user it was designed for. Same goes for
| external monitors on an MBA.
| user_7832 wrote:
| I agree that USB isn't commonly used, but that's kind of
| the point: it doesn't hurt Apple much to do it (and
| they're not saving anything remotely significant), yet
| they still do it to push the few who do want fast USB
| toward pricier models. A more common example, where nearly
| everyone would notice, would be the abysmal RAM in their
| phones until very recently.
| lukas099 wrote:
| > knowingly gimp the hardware to upsell the more
| expensive versions.
|
| I'm skeptical. A ton of money is dumped into optimizing
| everything and then they just throw it down the drain?
| That would drive more people to the competition than to
| higher-priced Macs.
| user_7832 wrote:
| I meant that bit in general and not specifically for the
| Mac, but for example they sold iPads with 32GB of storage
| as recently as 2020, IIRC. Even now the base-level Macs
| have much less RAM and storage than similarly priced
| Windows counterparts. I think part of it is just so they
| can say "MacBooks start at $xx!". There are more examples
| I can't think of right now.
| masklinn wrote:
| Come off it.
|
| - the 14" M3 is $1600, but limited to a single external
| display, and actually only has two TB ports; it's just
| sad
|
| - the M3 Pro is limited to two external displays, despite
| an entry price of $2000
|
| $3199 is the baseline to be able to use 3 external
| displays (although at that price you can plug up to 4).
| MikusR wrote:
| The 2000 Euro M3 MacBook Pro also supports only one
| external display, and is the same size and weight as the
| ones that support more.
| _ph_ wrote:
| They are not artificially limiting the number of external
| screens. To support more screens, they would have to
| reserve die area for an additional controller. That could
| mean one fewer GPU core, or something similar. This is
| about engineering tradeoffs.
| masklinn wrote:
| Considering the M3 has 55% more transistors than the M1,
| I'm sure they could have found some room for a third
| display controller, or a more capable one (one with MST
| support for instance), or just a more _flexible_ one.
|
| They're selling a $1600 machine which can't drive two
| external displays; it's sad. Intel's HD Graphics have
| been doing better since 2012, and they're trash.
| nomel wrote:
| > considering the M3 has 55% more transistors than the M1
|
| And that's why everyone loves node shrinks. You can stuff
| more transistors for the same cost.
| willseth wrote:
| It's a business decision. You don't think Apple knows the
| demand for multiple external displays? If they thought it
| would sell more Macs, they would do it.
| masklinn wrote:
| > It's a business decision.
|
| That doesn't make it less frustrating.
|
| > If they thought it would sell more Macs, they would do
| it.
|
| It's short-term thinking; it generates a bad feeling of
| nickel-and-diming and unnecessary upsells: on M3 you need
| to pay $2000 to be able to use two external displays, and
| $3200 for three, even if you have no need for the rest of
| the processing power.
|
| Not only that, but they managed to create more range
| confusion with the expensive but gimped entry-level 14".
| willseth wrote:
| What you describe is only considered by a vanishingly
| small number of their customers. The vast majority don't
| care or even own an external display. You could be right
| about range confusion, but I doubt it simply because,
| like everything else, Apple is so disciplined about
| market research. Time will tell
| matwood wrote:
| > The vast majority don't care or even own an external
| display.
|
| Or more than one external display. The vast majority own
| none or a single one. Few people have multiple, and those
| who do are generally enthusiasts who will also tend to
| pay more.
| freeAgent wrote:
| You can say, "it's a business decision," about nearly
| anything that a business does. That's not really an
| argument for or against something.
| wubrr wrote:
| Engineering tradeoffs to support something intel/amd PCs
| have supported for 15+ years? Maybe apple is just shit at
| engineering?
| shepherdjerred wrote:
| Intel/AMD chips don't approach the efficiency of what
| Apple makes.
| jamespo wrote:
| I don't understand this pretence; the vast majority of
| the time, laptops are plugged into power anyway.
| FireBeyond wrote:
| There's always one reason after another. "It won't fit,"
| you say sure it can, others do it. "Well, they don't do
| it efficiently," you say okay, how many times are you
| driving multiple monitors (most of which support power
| delivery these days) on battery? Then it'll be "Apple
| just understands this better".
| shepherdjerred wrote:
| You're effectively saying that you want Apple to design a
| chip without any compromises in features, cost,
| performance, or efficiency.
|
| Apple sure could design a chip that drives more monitors,
| but maybe they decided that making the chip x% more
| performant/efficient, or having feature y, was more
| important.
|
| Or, maybe this really truly is trivial and they decided
| to segment their chips by not including this feature.
| There is absolutely nothing wrong with that.
| tcmart14 wrote:
| Perhaps because until recently, battery life has sucked?
| I used to plug in my laptops all the time. Now with the
| M1, it only gets plugged in to charge, then I don't plug
| it in until I charge it again. I have been enjoying
| sitting on my couch when I want to, to get work done
| rather than at my desk.
| wubrr wrote:
| You can sit on the couch and have your laptop plugged in.
| I also don't keep my Lenovo plugged in all the time and
| don't notice any battery issues. Realistically, you're not
| sitting on a couch working for 6 hours straight either.
| deergomoo wrote:
| It's not just battery life, it's heat and noise too. Yeah
| I know not a lot of people actually use laptops on laps
| (though I do), but you typically can't hide it under a
| desk like you can with a desktop.
|
| My job provided me with a Dell laptop with an i7 and it
| can keep my coffee warm just idling. Sometimes, when it's
| decided it doesn't want to sleep, I can hear the fans
| from the next room. While the screen is off, not being
| touched.
|
| I could go on at length about just how shitty the Dell is
| compared to my MacBook Air (despite costing more and
| performing worse!), but Apple Silicon Macs are just so
| much _nicer_ when it comes to the practicalities of using
| a portable computer.
| postalrat wrote:
| So this conversation goes from "Apple shouldn't need more
| display controllers since most Air users don't use more
| displays" to "most people don't use their laptop on
| battery, but power usage should be a major concern".
|
| It's almost like we should focus on what Apple does best
| and ignore what Apple does worse.
| scarface_74 wrote:
| True. But when I was going to customer sites, it was
| really nice to plug in my MacBook Pro the night before
| and not have to think about battery life. It's also nice
| not to have the fans going constantly and being able to
| work with it on my lap without the fear of never being
| able to have little Scarface's.
| claytongulick wrote:
| Have you ever actually tried driving decent large
| monitors from those crappy intel chips?
|
| I have.
|
| Engineering tradeoffs indeed.
|
| For all its flaws, Apple focuses on making sure there is
| a good experience overall. In general, they'd rather have
| a more limited hardware ecosystem that works better than
| a broad ecosystem that's flaky.
|
| There are tradeoffs to both approaches. I go back and
| forth between windows and mac, and have for decades.
|
| I was a surfacebook/win10/WSL guy for a few years, and
| liked it a lot. Win 11? Not so much.
|
| Currently the M-series chips blow any PC laptop away for
| the combination of portability/performance/battery.
|
| The Snapdragon X might change that in the next couple of
| years; I'll take a look then and see what Win 12 offers.
| If MS decides to take a step back from being an ad-serving
| platform and refocuses on productivity, I may switch back.
|
| For now though, I'm planning on accepting the gut punch
| to my wallet and picking up a 14" M3 Max in a few weeks.
| wubrr wrote:
| > Have you ever actually tried driving decent large
| monitors from those crappy intel chips?
|
| Yes, I regularly drive my monitors with 1 intel laptop
| and/or 1 amd laptop (modern ones, linux). Not a single
| issue ever. I plug in my work m2 pro mac to these same
| monitors - constant issues.
|
| > For all its flaws, Apple focuses on making sure there
| is a good experience overall.
|
| macOS has had dozens of well-known, well-documented bugs
| for years that haven't been addressed, many of them
| mentioned in this thread. I don't see any focus from
| Apple on these whatsoever; their main focus seems to be
| advertising and visual design.
| d3w4s9 wrote:
| I have done dual external monitor setup for years on
| Intel and AMD laptops without any issues.
|
| The standard setup at my company is a ThinkPad connected
| to dual 27" monitors. My company has thousands of
| employees. (A Mac setup is also available for those who
| request it.) Many, many companies offer a similar setup
| for employees.
|
| Maybe you need a reality check first.
| cylinder714 wrote:
| If you're referring to the 13-inch model, that's been
| discontinued.
| sneak wrote:
| No, the 14. The Air is 13, and the "normal" MBP is now 16.
| The 14 MBP is not onerously large if you are one of the
| people who want an Air-sized computer with lots of IO.
|
| It would be annoying if you had to get a 16" laptop to be
| able to hook up to multiple external displays. I love my
| 16" MBP but my Air is the one that goes everywhere in my
| handbag.
| mananaysiempre wrote:
| The plain M3 version of the MBP doesn't support more than
| one external display either[1].
|
| [1] https://www.apple.com/macbook-pro/specs/ (scroll down
| to "Display Support")
| kylec wrote:
| You need to spend a minimum of $2000 though, the new $1600
| MacBook Pro also only supports one external display.
| sroussey wrote:
| Yeah, the new MacBook Pro with a non Pro or Max chip is
| essentially an Air with a couple extra ports. I don't think
| they should have released it.
| upget_tiding wrote:
| It's better than the 13" MacBook Pro with touchbar that
| it replaced.
| user_7832 wrote:
| While I can understand some engineering compromises, most
| appear to simply be anti-consumer cash grabs. Need more
| than 8GB of RAM? Hope you've got sufficient money. Want the
| high-refresh-rate screen present in cheap Android phones?
| Sorry, not for our base models.
| datpiff wrote:
| > Want a high refresh rate screen present in cheap android
| phones?
|
| What is this for? Games? I avoided this to save money the
| last time I bought a phone. I honestly can't think of a use
| case where a phone display needs to update at higher than
| 60 Hz.
| user_7832 wrote:
| High refresh rates aren't just for games; they make the
| entire interface smoother. Things like frame rate or
| latency are quality-of-life improvements that you don't
| _need_, but once you upgrade, the difference becomes clear.
| Sort of like going from a slowly accelerating station wagon
| to a Porsche or other sports car: even when capped by the
| speed limit, the difference is obvious once you've
| experienced it. Hopefully someone else has a better analogy.
| masklinn wrote:
| More importantly, VFR is not only useful for high refresh
| rates. I would argue that for most people on a phone, high
| refresh is the least useful part of the thing.
|
| Instead, VFR allows saving battery by slowing down the
| refresh rate, down to as low as 1 fps for static or high-
| latency content. That's part of why always-on display is
| only available on the 15 Pro: at 60Hz it's a battery
| killer.
|
| > Sort of like going from a slowly accelerating station
| wagon to a Porsche/sports car even when (capped by the
| speed limit) if you've experienced that.
|
| So... functionally useless.
| user_7832 wrote:
| > So... functionally useless.
|
| (Disclaimer, I haven't driven a sports car but I have
| driven cars between "nice" and "struggles to reach the
| speed limit")
|
| While it might appear to be very similar, having a sports
| car that you're capable of driving well/defensively can
| literally be a life saver. There was this video from a
| few years back on reddit of this guy in a 911. Vehicle in
| front suddenly stops/crashes, he needs to rapidly change
| lanes at highway speeds. The little 911 unsurprisingly
| handled it excellently. A slow/heavy car would've very
| likely spun out and crashed.
|
| Having more power if you don't need it doesn't hurt (if
| you're trained well). Less can hurt.
| masklinn wrote:
| > While it might appear to be very similar, having a
| sports car can literally be a life saver.
|
| Or it can be a life ender by planting you into the side
| of the road, something mustangs and BMWs are well known
| for, amongst others.
|
| > that you're capable of driving well/defensively
|
| If you need to be a skilled driver to handle the situation,
| it's less the car and more the driver that does the life
| saving. And even skilled drivers who know their cars well
| can fuck it up. Rowan Atkinson famously spun his F1 into
| a tree after 14 years of ownership, having been racing
| for two if not three decades.
|
| > There was this video from a few years back on reddit of
| this guy in a 911. Vehicle in front suddenly
| stops/crashes, he needs to rapidly change lanes at
| highway speeds.
|
| Something which ends up in a collision as often as not.
|
| > The little 911 unsurprisingly handled it excellently. A
| slow/heavy car would've very likely spun out and crashed.
|
| Or not. Or maybe with a less capable machine they'd have
| driven more prudently and would have kept more space.
|
| > Having more power if you don't need it doesn't hurt (if
| you're trained well).
|
| So having more power can literally hurt. And routinely
| does.
| lukas099 wrote:
| So basically while more sportiness can make you safer in
| theory, in practice you have a human in the mix who is
| all but guaranteed to drive less safely. I agree.
| turtlebits wrote:
| IME, the difference is negligible. I went from an iPhone
| 12 to a Galaxy S21 Ultra and yes, there is a difference,
| but it's not something I notice, even when occasionally
| using my wife's iPhone 12.
|
| The bigger difference I notice is in loading times for
| apps.
| user_7832 wrote:
| I'm not sure how the Galaxy handles it, but if it's
| similar to the stock Android situation, you're likely
| often only seeing a 60Hz refresh rate. You can go to
| developer settings (or perhaps screen settings) and force
| 90Hz or 120Hz on. If your eyesight is otherwise good
| (especially if you're younger than 50), you should be able
| to notice the difference quickly.
| culturestate wrote:
| _> yes, there is a difference, but it 's not something I
| notice_
|
| Anecdotally, I use the last gen pre-120hz iPad Pro as my
| daily driver and I notice the difference _immediately_
| when I switch from my phone to the iPad. It's not
| annoying enough to force me to upgrade immediately, but
| it'll absolutely be a requirement for me going forward.
| jwells89 wrote:
| For me the difference is visible, but the added
| smoothness is more of a cherry on top than anything. I
| switch between 60hz, 120hz, and 240hz panels every day
| and the 60hz panels don't bother me a bit. On desktop and
| laptop displays I'd take integer-scaling-friendly pixel
| density over high refresh rates any day.
|
| Variable refresh rates should be standard across the
| board however, not restricted to high refresh panels.
| There's no more reason for a static screen to be
| redrawing at 60hz than there is for it to be redrawing at
| 120hz or 240hz.
| ska wrote:
| To stick with your car analogy though, with screens it's
| more like the performance vs. gas mileage trade off. Your
| sports car will get you the same place at the same time
| (capped by speed limit) with more fun (if you like that
| sort of thing) but cost you at the fuel pump. This is
| fine if you don't care about $/gallon, but sucks if there
| is gas rationing.
|
| This analogy is also flawed, of course.
| willsmith72 wrote:
| Sounds like something I'd rather never get used to, then.
| More expensive, uses more power so it kills battery life
| faster, and worse for the environment. Where's the win?
| user_7832 wrote:
| > uses more power so kills battery life faster and worse
| for the environment. Where's the win?
|
| Depends on the device, but often the difference is very
| minimal, a few percentage points, perhaps 3-5%. I would
| think it can easily be compensated for by
| undervolting/power-limiting on laptops.
|
| The win is less eye strain and fewer headaches for a lot
| of people. No harm in turning it off if you don't need it,
| but many do benefit.
| tuetuopay wrote:
| > uses more power so kills battery life faster
|
| Not necessarily. On ProMotion iPhones (a fancy name for
| variable refresh rate), the panel is not stuck at 120Hz
| all the time. It varies from 1Hz when the screen is
| static up to 120Hz when it's animating, down to whatever
| framerate your content runs at (e.g. movies and videos).
| It's actually a battery saver whenever the phone's screen
| is static (common in most apps with text) or showing <60Hz
| content (YouTube, movies, etc.).
| varispeed wrote:
| I can certainly see a difference between 60Hz and say
| 120Hz. The latter is way easier on the eye.
| lvncelot wrote:
| I've recently splurged on a new phone (old one was >4 years
| old and started to run a little slow) and a high refresh
| screen just adds to the general "snappiness" of everything.
| I'm also using a high refresh-rate monitor for work and
| it's always a little weird to go back to a 60hz screen.
|
| This is obviously not a critical feature, mind you, but it
| does add to the experience of using a device, even if it's
| just productivity-related tasks.
| varispeed wrote:
| Even the lowest tier models probably beat any PC laptop when
| it comes to usability and comfort though.
| wubrr wrote:
| Strong disagree.
|
| UI is very buggy and inconsistent.
|
| CLI tools/shell is attempting to follow more-or-less
| standardish unix/linux setup but fails pretty hard, with
| many of the same commands existing but behaving just
| differently enough to be annoying.
|
| It uses its own set of shortcuts, different from what's
| standard on Windows/Linux and most other operating systems.
|
| Not sure what people mean when they say completely vague
| and ambiguous things like 'usability' and 'comfort' - are
| you referring to the fact that it has rounded buttons and
| corners and pretty ads?
|
| It basically falls in between a standard Windows setup and
| a highly modified Ubuntu setup, but does far worse at both.
| The UI, apps, etc. are far buggier and less intuitive than
| Windows, and the OS is far less "unix" and far less
| customizable than Ubuntu.
| stouset wrote:
| > Unix tools/shell is attempting to follow more-or-less
| standardish unix/linux setup but fails pretty hard, with
| many of the same commands existing but behaving just
| differently enough to be annoying.
|
| I'm pretty sure this is simply you expecting GNU and
| actually getting FreeBSD-based tooling. It's not wrong,
| it's just different than what you expected.
| wubrr wrote:
| Not at all. As a single example: macOS comes with an
| extremely outdated version of GNU bash (3.2) from 2007.
| And yes, this has consequences, in that many modern bash
| scripts depend on bash 4+ (the current GNU bash is 5.2+).
| foldr wrote:
| Zsh has been the default shell on macOS for ages now. If
| you need a more modern bash, you can easily install it
| using brew or another common package manager. This is a
| total non-issue.
| wubrr wrote:
| Of course shipping a very outdated tool with your OS is
| an issue. And the fact that you need to use 3rd-party
| package managers to update/replace it is also an issue.
|
| But if you want to contend that anything solvable via
| customization or 3rd-party packages is a "total non-
| issue", then I fail to see how you could argue that Linux
| isn't superior to macOS in every single way.
| karolist wrote:
| It's not an issue even if you say it is. It takes a minute
| to set up brew on a fresh Mac. Third party or not, who
| cares when it's open source? apt is also open source and,
| in the same sense, a third party someone develops; you
| just get it bundled with Debian-like systems. If we start
| counting the minute wasted setting up brew, then you waste
| more time installing Debian in the first place; Macs come
| pre-installed.
| wubrr wrote:
| It literally is an issue, and the fact that you're
| pointing out a potential solution should make that
| obvious to you. Many software companies (including
| FAANGs) have basically given up trying to support
| building/running most of their software on mac for these
| exact reasons, and they have really tried - dedicating
| hundreds of experienced engineers to the problem.
| karolist wrote:
| I call BS on the FAANG statement. I work at a FAANG and
| 80% of SWEs use MBPs. Actually, if we go by the literal
| FAANG company list, they don't even focus on desktop
| software; they're mostly web companies, the tooling is
| all backend, and SWE machines don't do any heavy lifting.
| wubrr wrote:
| You work at FAANG and you can run your entire stack on
| MacOS? Which FAANG?
|
| Or you mean, you work at FAANG and you can run your web
| application which is 0.01% of the stack on your mac?
|
| > they do not even focus on desktop software and are
| mostly web companies and the tooling is all backend where
| SWE machines do not do any heavy lifting.
|
| The web frontend/UI is a relatively small portion of the
| development that goes on at a typical FAANG. And the
| reason the Macs "do not do any heavy lifting" is because
| they literally can't: most of the complex backend systems,
| low-level/high-performance code, etc. can't be run on
| macOS.
| scarface_74 wrote:
| This is total bullshit. Amazon and Microsoft both support
| the Mac thoroughly. From what I saw when I was there,
| most developers at Amazon use Macs.
| wubrr wrote:
| You're completely wrong. I worked at Amazon (AWS) for
| years; yes, developers generally use Macs, but they
| generally cannot run their entire stacks on a Mac. Most of
| the actual software people work on runs on remote Linux
| (AL2) EC2 instances after syncing the code from the Mac.
|
| Running the entire AWS stack on macos is literally
| impossible at this point because major pieces aren't even
| built for macos (and can't be without major rewrites).
| wredue wrote:
| Does brew still require mangling OS permissions?
| foldr wrote:
| No. This is answered on the Homebrew installation page:
| https://docs.brew.sh/Installation
| wredue wrote:
| That page appears to very much describe that it does, in
| fact, still fuck your systems permissions...
|
| I'll never install homebrew as long as the dev continues
| this amateur hour bullshit.
| toast0 wrote:
| Using tools from the base OS is really holding it wrong.
| Just like ftp.exe/Internet Explorer/Edge and Safari
| should only be used to download a usable browser, the
| Apple provided CLI tools should only be used to download
| the tools and versions of tools that you actually want.
|
| Otherwise, you have no way to control versioning anyway.
| When the OS version is tied to the versions of so many
| other tools, it's a nightmare. Apple is trying to push
| you in the right direction by basically never updating
| the CLI tools, even when upstream didn't change the
| license to something unacceptable for them to distribute,
| as in the case of bash.
| wubrr wrote:
| > Apple provided CLI tools should only be used to
| download the tools and versions of tools that you
| actually want.
|
| Why? Why do I need to use 3rd party package managers, or
| manual installations from 3rd party sources to install
| common tools that aren't a decade+ outdated?
|
| > When the OS version is tied to the version of so many
| other tools, it's a nightmare.
|
| It doesn't need to be tied to anything. If the OS needs
| specific libraries/tools then they can be installed in a
| separate location. If there are new major versions of
| user-level tools (like bash) those should come by
| default, not some 15 year old version of the same tool,
| with the same license. Multiple linux distros solved these
| problems in different ways 10+ years ago.
|
| > Apple is trying to push you in the right direction by
| basically not updating the CLI tools ever
|
| No, they are just neglecting the CLI ecosystem, the
| package management, and the many many outstanding bugs
| that have existed and been ignored for years.
| spacedcowboy wrote:
| Because the newer tools changed the licensing terms, and
| the corporate lawyers won't let anything under GPL3
| anywhere near anything if they can help it.
|
| GPL2 was viral, but the terms were easier to stomach.
| Apple is allergic to GPLv3 code because there is a clause
| in the license requiring you provide a way to run
| modified versions of the software, which would require
| Apple to let users self sign executables.
|
| This is a simple result of the GPL going where Apple will
| not, so OSX is stuck with whatever is MIT, BSD, Apache,
| or GPL2 licensed.
|
| I like the GPL, I license my open source stuff under it,
| but it doesn't work for all cases. Apple is fine with not
| using software licensed under the GPL, when it conflicts
| with other company principles.
|
| Simple as.
| rfoo wrote:
| > Apple is allergic to GPLv3 code because there is a
| clause in the license requiring you provide a way to run
| modified version of the software which would require
| Apple to let users self sign executables.
|
| Wow, I didn't know this before. Good job FSF! This,
| should, be, a, basic, right.
| spacedcowboy wrote:
| You can of course sign your own binaries. You can't alter
| and then sign a system binary. I'm good with that.
| wubrr wrote:
| > Apple is allergic to GPLv3 code because there is a
| clause in the license requiring you provide a way to run
| modified version of the software which would require
| Apple to let users self sign executables.
|
| Does that really add up though? You can install and run
| 3rd party software on macos without any signing needed.
| toast0 wrote:
| > Why? Why do I need to use 3rd party package managers,
| or manual installations from 3rd party sources to install
| common tools that aren't a decade+ outdated?
|
| Because you want to control the version of 3rd party
| software.
|
| > It doesn't need to be tied to anything. If the OS needs
| specific libraries/tools then they can be installed in a
| separate location
|
| The OS installs tools in /usr/bin and you should install
| tools in a separate location. Apple provides a commercial
| UNIX, not a Linux distribution. /usr/local or /opt are
| traditional locations for you to place the 3rd party
| software you want to use on your commercial UNIX.
|
| If you want the OS to ship with updated tools, of course
| it's tied to the OS version. Then if you want bash 70,
| you'll need to run macOS Fresno or later, or install bash
| 70 in /usr/local. You should just install your version
| anyway, and then you won't have to worry about OS
| versions (unless bash 70 requires kernel APIs unavailable
| before macOS Fresno, in which case you're stuck; but most
| software isn't intimately tied to kernel versions).
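| To make that setup concrete, a minimal sketch (paths are
| illustrative; Homebrew uses /usr/local on Intel Macs and
| /opt/homebrew on Apple Silicon):

```shell
# Put your own tool directory ahead of the OS copies in /bin and
# /usr/bin, so `command -v bash` resolves to the version you
# installed rather than Apple's.
PATH="/usr/local/bin:$PATH"

# The first PATH entry is now the user-controlled directory:
first_dir=$(printf '%s\n' "$PATH" | cut -d: -f1)
echo "$first_dir"   # /usr/local/bin
```

| With that ordering, scripts pick up your chosen versions and
| the OS release stops mattering for everyday tooling.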
| jamespo wrote:
| We're not talking about keeping up to date with the
| latest revision of python / java here, it's a shell
| that's woefully out of date. Apple is not RHEL
| backporting fixes so binaries continue to run for 10
| years - in fact they seem quite cavalier about backwards
| compatibility.
| toast0 wrote:
| Upstream changed the license. Apple doesn't distribute
| GPLv3 software. Apple's not going to distribute a newer
| version. That's how license changes work. The old version
| continues to work as well as it always did; scripts
| written for the new version don't work, but why would you
| write a bash4 script for macOS?
|
| Not that they were going to update bash regularly anyway.
| So if you needed a new version in the base, you'd need a
| new version of the OS. And then you're back to the same
| problem you have now. Apple doesn't distribute the
| version of the 3rd party tool you want for the OS you're
| on.
| wubrr wrote:
| > The old version continues to work as well as it always
| did; scripts written for the new version don't work,
|
| Right, so scripts written for bash4+ don't work - which
| includes most modern bash scripts.
|
| > but why would you write a bash4 script for macOs?
|
| The problem is that you have to write special scripts
| (among other things) just for macOS. This is a very real
| problem because most modern software companies run their
| stuff on linux, while the dev laptops/workstations are
| often macos.
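| For illustration, two bash-4+ features such scripts commonly
| rely on, neither of which works on the bash 3.2 macOS ships:

```shell
#!/bin/bash
# Associative arrays and ${var,,} lowercasing both arrived in
# bash 4.0. Under macOS's /bin/bash 3.2 the declare fails with
# "declare: -A: invalid option" and the expansion with
# "bad substitution"; under bash 4+ this runs fine.
declare -A shipped_bash
shipped_bash[macos]="3.2.57"
shipped_bash[debian]="5.x"

name="MacOS"
lower=${name,,}
echo "$lower ships bash ${shipped_bash[macos]}"
```

| A script written on a current Linux box can hit either of
| these the moment a teammate runs it with /bin/bash on a Mac.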
| foldr wrote:
| Bash is only there for backwards compatibility with old
| scripts. The default MacOS shell is an up-to-date zsh.
| BigJ1211 wrote:
| Maybe I'm an oddball, but I download third party package
| managers (or wrappers) on literally every OS I use.
|
| Scoop, Chocolatey on Windows. Brew on MacOS. Amethyst on
| Arch.
|
| Linux package managers beat whatever is available on MacOS
| and Windows by a long shot.
|
| Frankly if there's one thing I want my OSes (excluding
| Linux) to keep their greasy paws off of, it would be my
| CLI and build environments.
|
| I want to install and manage what I need, not be force
| fed whatever sludge comes out of Microsoft's or Apple's
| tainted teats. Manage and update the OS, don't touch
| anything else.
| wubrr wrote:
| Linux and MacOS package management is very different,
| because linux distros generally have first-class
| supported package management that comes with the OS.
|
| > I want to install and manage what I need
|
| Good luck installing what you need if someone hasn't made
| a port explicitly for macos.
| skuhn wrote:
| Unfortunately this will never be fixed. It's not a
| technical problem. They refuse to ship software licensed
| under GPL v3, and bash 3.2 is the final GPL v2 version.
|
| I hate it.
| tcmart14 wrote:
| Didn't they stop 'shipping' bash by default awhile ago?
| Now the default shell is zsh?
| skuhn wrote:
| Bash is still shipped on macOS 14; the default shell has
| been zsh since 10.15 in 2019.
| Someone wrote:
| I wonder why they haven't fixed it yet by removing bash
| and the other GPL-licensed tools. Even if they currently
| need one to boot the system, they have the resources to
| change that.
| iAMkenough wrote:
| I refuse to drop Samba and go back to AFP
| jlokier wrote:
| _> Why? Why do I need to use 3rd party package managers,
| or manual installations from 3rd party sources to install
| common tools that aren't a decade+ outdated?_
|
| There actually is a reason. It's not lack of maintenance.
|
| It's because Bash 3.2 is the last version licensed as
| GPLv2. Bash 4.0 and later changed license to GPLv3.
|
| Apple decided it's not safe for them to ship _any_
| software with GPLv3 with macOS, because of stronger legal
| conditions in GPLv3. This is also the reason they stopped
| updating Samba.
|
| They changed the default shell to Zsh long ago. The old
| Bash is kept around, so that users who want to stick with
| Bash can still use it as their shell, and so that
| existing scripts for macOS (for example in installers)
| continue to work. If nobody cared about Bash, I expect
| they would have dropped it from macOS when switching to
| Zsh, rather than keeping the old version.
|
| So from a certain point of view, Bash 3.2 is the most
| recent version they can ship.
|
| As for other tools like "cp", "ls", "rm", "touch", etc. I
| agree they are annoying on macOS, when you are used to
| the versatile GNU/Linux command line options. I sometimes
| type options after filename arguments due to habit on
| Linux. macOS commands very annoyingly treat those
| options as filenames instead. And I miss options like
| "touch --date".
|
| However, this is not due to old tools. Those differences
| are just how current (up-to-date) BSD commands work.
| They are just a different unix lineage than Linux.
|
| The GNU tools were intentionally written to be more user-
| friendly than traditional UNIX(tm) tools, which is why
| the command line options are generally nicer. GNU/Linux
| systems come with GNU tools of course. macOS never did
| because it is derived from BSD and comes with BSD tools
| (with Apple enhancements, like "cp -c").
|
| You get the same on most other unix environments that are
| not GNU/Linux. People install the GNU tools on top, if
| they want the GNU command line experience instead of the
| default. Sometimes with the "g" prefix (like "gls",
| "gtouch" etc.).
|
| These days, even if Apple decided it's worth the
| technical fallout of switching from BSD command line
| tools to GNU, they wouldn't do it for the same legal
| reason as what keeps Bash at 3.2: The current GNU tools
| are licensed as GPLv3.
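| As a concrete example of that BSD/GNU split, here is the
| POSIX spelling of "set a file's timestamp", which works on
| both lineages, next to the GNU long option that macOS's BSD
| touch lacks:

```shell
# Portable (BSD and GNU) spelling, [[CC]YY]MMDDhhmm format:
touch -t 202311030759 demo.txt
ls demo.txt

# GNU-only spelling, absent from macOS's /usr/bin/touch:
#   touch --date='2023-11-03 07:59' demo.txt
# After `brew install coreutils` the GNU versions are available
# with the g prefix mentioned above, e.g. `gtouch --date=...`.
```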
| Nullabillity wrote:
| > So from a certain point of view, Bash 3.2 is the most
| recent version they can ship.
|
| Apple could _choose_ to be GPLv3 compliant tomorrow, if
| they wanted to.
| AnthonyMouse wrote:
| They seem to be about the only ones who think this is a
| problem. Even Microsoft has the GPLv3 bash in WSL.
| wubrr wrote:
| > However, this is not due to old tools. Those
| differences are just how up current (up to date) BSD
| commands work. They are just a different unix lineage
| than Linux.
|
| What up-to-date BSD are you referring to? The commands on
| different forks of BSD like FreeBSD vs OpenBSD are not
| the same. And quickly looking at the man pages of latest
| FreeBSD vs MacOS versions of ls, for example - they are
| not the same.
|
| > These days, even if Apple decided it's worth the
| technical fallout of switching from BSD command line
| tools to GNU, they wouldn't do it for the same legal
| reason as what keeps Bash at 3.2: The current GNU tools
| are licensed as GPLv3.
|
| What actually prevents them from including GPLv3
| software?
| shepherdjerred wrote:
| > CLI tools/shell is attempting to follow more-or-less
| standardish unix/linux setup but fails pretty hard, with
| many of the same commands existing but behaving just
| differently enough to be annoying.
|
| macOS is UNIX certified and POSIX compliant. You're
| probably expecting GNU commands, but macOS is based off
| of FreeBSD (and is not related to Linux).
| wubrr wrote:
| First of all unix/freebsd/linux are all closely related.
| Second, macos does not come with current freebsd set of
| CLI tools, and in fact comes with several very outdated
| GNU CLI tools.
| shepherdjerred wrote:
| Yes, macOS does include some GNU programs, and both the
| FreeBSD and GNU programs that it includes are rather
| outdated.
| zaphar wrote:
| CLI tools/shell is attempting to follow more-or-less
| standardish unix/linux setup but fails pretty
| hard, with many of the same commands existing but
| behaving just differently enough to be annoying.
|
| They weren't following unix/Linux standards; they were
| following unix/bsd standards. Standards that predate
| Linux and for some of us are very familiar indeed.
| wubrr wrote:
| What standards are those exactly?
|
| Everyone knows that macos (and windows) took a lot from
| unix/bsd. But that does not mean they are actively
| following any kind of reasonable modern standard. And
| saying they might be following some 30 year old supposed
| bsd standard is pretty hilarious for an OS that purports
| to be cutting-edge, intuitive and well-integrated.
| masswerk wrote:
| Hum, BSD is still very much around.
| _gabe_ wrote:
| I think the buggy apps and UI aren't emphasized enough on
| Macs. I honestly can't remember the last time I was on
| Windows and performing an action failed to give me any
| visual indicator whatsoever that something happened, but
| that's common on my M2. I'll click something, have no
| feedback, and then a few seconds later the thing happens.
|
| Just a couple concrete examples, if you open an app in a
| workspace, then switch screens to a different workspace,
| then click the app in the dock, nothing happens. I would
| expect to be taken to the screen and have the app made
| visible, but instead, nothing. I kept thinking that the
| app must be frozen or something until I switched screens
| and found it.
|
| Another example, I was carrying my laptop to and from
| work. I put it in my backpack with padding that protects
| the laptop. I didn't jostle it around, I literally just
| carried it to and from work. I get home and the screen is
| in some sort of weird flickering bugged out state. I had
| to forcefully restart it just to get it working again.
|
| With all that said, the trackpads and gestures on Macs
| are amazing. The displays are also very visually
| appealing. The performance is good.
| wtallis wrote:
| > I honestly can't remember the last time I was on
| Windows and performing an action failed to give me any
| visual indicator whatsoever that something happened
|
| I find that more often than not I can't make it through
| the Windows setup without worse janky stuff happening.
| Pretty often when toggling off all the bullshit privacy
| settings that shouldn't be opt-out to begin with, I'll
| get a visual indication of the switch _starting_ to move
| after my click, then turning around and going back to the
| default--so my click was definitely received, but
| rejected somehow. That seems worse to me than a correct
| response delayed.
|
| > if you open an app in a workspace, then switch screens
| to a different workspace, then click the app in the dock,
| nothing happens. I would expect to be taken to the screen
| and have the app made visible, but instead, nothing.
|
| There is a visual indication in that the contents of the
| menu bar change to reflect the newly active app; unlike
| on Windows a Mac app can be active without having an
| active or foreground window. There's a system setting to
| control whether to switch spaces in this scenario, but I
| don't recall whether the behavior you describe is the
| default or something you accidentally configured to annoy
| you.
| BigJ1211 wrote:
| By default it jumps to the workspace that application is
| opened up on.
| nvarsj wrote:
| I was always kind of puzzled that big tech companies seem
| to exclusively use Macs for software dev. Then when I
| joined one, I discovered that everyone uses remote Linux
| VMs for actual development. Which makes a lot more sense
| - Macbook as a glorified thin client with great battery
| life suits it pretty well. Although, I still sorely miss
| Linux/Windows window management.
| BigJ1211 wrote:
| As someone who uses Windows, Arch with Hyperland and
| MacOS almost daily. I am immensely curious about what you
| find inconsistent and buggy about macOS. In my experience
| that's been the least buggy and inconsistent of the
| three.
|
| My Mac and Linux shortcuts align more than my Windows and
| Linux/Mac ones. For the most used ones it's CMD +
| whatever the standard key is for that shortcut, instead
| of Control. Overall I prefer that the Super key is more
| useful than what has been the default for many years with
| Windows.
|
| I also use the CLI for virtually everything on Mac and
| Linux; MacOS isn't all that different and feels more like
| another Linux flavour than its own beast.
|
| The only UI gripes I can think of immediately are the lack
| of window snapping and that closing the window doesn't mean
| you've exited the application. The first requires a
| third-party tool (I recommend Rectangle), the latter is a
| change in behaviour.
|
| Frankly I'm not really all that interested in defending
| MacOS, but I hope you realise that saying "very buggy and
| inconsistent" without naming anything specific isn't any
| less vague and ambiguous than "usability and comfort".
|
| Your reaction comes off as "I'm used to this, therefore
| the other thing is bad and unintuitive." I'm sure this
| isn't your intention, so specifics would be illuminating.
| deergomoo wrote:
| > Uses it's own set of shortcuts, different from what's
| standard on windows/linux
|
| Mac OS predates both.
|
| Also, this is personal preference, but I find engaging
| Command with my thumb _far_ more comfortable than Control
| with my pinky.
| Der_Einzige wrote:
| Ah yes, the 8gb ram models in almost 2024 with crippled
| SSDs are beating equivalent PC laptops for their price...
| NOT!
|
| 8gb wasn't even enough in 2015!
|
| Apple competes and demolishes anything from PC in battery
| and build quality. Certainly not in usability and
| subjectively comfort.
|
| Also, low tier PCs support multiple monitors. Try getting
| that on the macbook Air.
| musha68k wrote:
| Agreed, especially the increasing greediness with regards to
| RAM and corresponding upselling is unfortunately real at this
| point. I just upgraded to the iPhone 15 Pro Max because of
| the 8GB RAM and I will probably need to do so for my two Macs
| as well. I "fear" 16GB/32GB RAM is not going to cut it in
| 2024 for my kind of snappy productivity work anymore,
| especially due to the unified memory architecture.
|
| QED seemingly works out well for Apple though.. :P
|
| I'm still hoping for some ex Apple folks to create some new
| version of NeXT computers. I would switch immediately to
| someone's alternative offering trying to actually play at
| Apple's level of quality.
| astrange wrote:
| > I just upgraded to the iPhone 15 Pro Max because of the
| 8GB RAM
|
| This is a bad reason to upgrade. If you were supposed to
| know about it, it'd be in the specs.
|
| It's not there to improve performance or be more forward
| compatible, it's because the camera upgrades need it.
| musha68k wrote:
| OOM kills are basically the sole reason for me to upgrade
| iPhones. Nothing more annoying than force-reloaded/lost
| state in/between apps while on-the-go and under tight
| time constraints. Also somewhat true: new lenses are
| amazing but "computational photography" defaults
| are sometimes less compelling than my old iPhone X's more
| "honest" output. I'm going to play around with RAW some
| more if I get the time for it though.
| marmaduke wrote:
| Just curious, I had a 2020 SE daily and never had an OOM,
| how do you do it?
| musha68k wrote:
| How does software do this usually? :) I guess it's a
| combination of heavy multitasking / using it for work +
| private plus - again - constant-drum of increasingly
| bloaty websites and applications following whatever the
| current ceiling is. "Simple" apps like podcatchers come
| to mind, don't know how often Overcast bailed on me with
| maybe 100 podcasts subscribed / updating? But again there
| are many more cases and sometimes not only third party.
| user_7832 wrote:
| I still use a 2020 SE and experience apps reloading all
| the time. I cannot keep more than 2 apps reliably in
| memory at any point. This includes Safari, youtube,
| reddit (honestly a poorly made app), spotify etc - fairly
| "common" ones.
| euazOn wrote:
| It sucks but at least DisplayLink works well enough. I'm
| driving 4 monitors this way with my M1 MacBook Air with no
| issues.
| Ographer wrote:
| Can you name or link any specific hardware or software you're
| using? Last time I looked into how to run 2 monitors on my M1
| Air I gave up after seeing $400 docks that seemed to have
| performance issues for some people.
| bloopernova wrote:
| I have this for my m1 pro:
|
| https://a.co/d/gD5q8G3
|
| It uses the 6950 DisplayLink chip, which can do 4K@60Hz.
|
| It's 90 dollars and drives 2 screens.
|
| Software is less good. I had to enable Rosetta2 to get the
| pkg to install. Then you have to boot to recovery mode to
| allow signed 3rd party drivers. And you get a creepy notice
| that someone is watching your screen, which is how the
| display driver works.
|
| Performance is pretty good, just a bit laggy on mouse
| movement.
| heyoni wrote:
| Is that because the screen generation is software based?
| I noticed that when using DisplayLink and couldn't bear
| the input lag or worse the compression from moving a
| window too rapidly. Switched to a monitor with
| thunderbolt output and my coworkers thought I was being a
| diva.
| Ographer wrote:
| My understanding is that yes, the external displays are both
| rendered by the CPU and so there are lower frame rates
| and frames dropped which is a concern of mine since I am
| pushing this computer to its limits with some
| applications I run. I've heard that for casual use it
| isn't too noticeable.
|
| Unfortunately I don't think the M1 Air supports TB daisy
| chaining unless you're ok with mirrored displays. I still
| can't decide if I want to ask my job for a new computer,
| a new dock, or new displays lol.
| freeAgent wrote:
| Yeah, it's definitely not as good as native, but it's not
| bad. One other significant restriction for DisplayLink is
| that it's unable to display DRMed video content.
| firecall wrote:
| Sadly it doesn't work well enough with 4K displays for me.
|
| At least, not the last time I tried.
| wubrr wrote:
| I mean, the support for even one monitor is by far worse than
| windows or linux IMO.
|
| - Monitor randomly resets/readjusts for no apparent reason.
| Like multiple times every hour.
|
| - Windows disappear, become inaccessible after moving between
| monitors, even though they're still open and active
| according to the dock/task list.
|
| - Why can't I move a window to a monitor/workspace that has a
| maximized window? Like, you can do it by un-maximizing the
| window in question, moving the other window over and then re-
| maximizing the window again. But why is this nonsense needed?
| What problem could this restriction possibly be solving?
|
| - Lots of other monitor/workspace related problems (a quick
| google will show dozens of related problems without any
| obvious solutions or explanations for the completely
| nonsensical behaviour).
|
| I really don't understand how people can say macos has a
| good and consistent overall UI. I mean, yeah - it's
| consistently buggy and un-intuitive:
|
| - Dock constantly breaks/becomes inaccessible. This is a known
| problem for at least 5+ years - the solution is to manually
| kill and restart the process???
|
| - Text cursor/caret randomly disappears when editing text, so
| you can't see where the cursor is, and you can't fix this
| unless you restart the app (happens to pretty much all apps).
|
| - Was working with unicode characters recently, now whenever I
| press command+s to save something, it gives me a visual unicode
| character selector popup, with no apparent way to stop/cancel
| this behaviour from the popup itself.
|
| - Bad defaults in terms of keypress repeat times and rate. No
| apparent way to change this from settings - need to run
| commands and re-login to test new behaviour.
|
| Just an overall crap OS and UI imo.
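| For what it's worth, the commonly-cited workarounds for two
| of those complaints (macOS-only commands; the key-repeat
| values are examples, and take effect after logging out and
| back in):

```shell
# Restart a stuck Dock; launchd relaunches it automatically.
killall Dock

# Key-repeat timing beyond what the Settings sliders allow.
# Both values are in ticks of roughly 15 ms.
defaults write -g InitialKeyRepeat -int 15   # delay before repeating starts
defaults write -g KeyRepeat -int 2           # interval between repeats
```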
| jwells89 wrote:
| I've been using macOS in a multimonitor setup for years
| without much issue. A good bit of it boils down to macOS
| expecting monitors to be well-behaved, e.g. each having
| unique EDIDs (many don't, instead sharing one across all
| units of a particular model) and initializing in a timely
| fashion.
|
| > Why can't I move a window to a monitor/workspace that has a
| maximized window?
|
| Because it's fullscreened, not maximized. macOS doesn't
| really have window maximization in the traditional sense out
| of the box, you need a utility like Magnet or Moom for that.
|
| That fullscreen mode was introduced in 10.7 Lion and a lot of
| long time mac users have found it silly from day one.
| Personally I never use it.
|
| > Text cursor/caret randomly disappears when editing text, so
| you can't see where the cursor is, and you can't fix this
| unless you restart the app (happens to pretty much all apps).
|
| I've seen this, but only in Chromium browsers and Electron
| apps. Seems like it might be a Blink bug.
|
| Regarding other OSes, multimonitor on Linux is mostly fine
| (so long as you're using Wayland; X11 is another matter
| especially if you're doing something slightly uncommon like
| using two GPUs, in which case xorg.conf mucking will likely
| be necessary).
|
| By far the most frustration I've had with multimonitor is in
| Windows, which is generally weak there. IIRC it only recently
| gained the ability to set per-display wallpaper; before you
| had to glue wallpapers together into a single image that
| spanned across them, which is silly.
| Terretta wrote:
| > _That fullscreen mode was introduced in 10.7 Lion and a
| lot of long time mac users have found it silly from day
| one. Personally I never use it._
|
| Agreed, however it's arguably for a different mental model.
| If people think of it as "focus mode" or "workspace mode"
| it makes more sense. Four finger swipe to slide between the
| workspaces or focuses.
|
| More importantly, you don't need Moom.
|
| - - -
|
| To maximize a window to the dimension of the screen without
| entering full screen mode, either:
|
| 1) hold Shift-Option and click the green maximize
| button on the top left of the window
|
| - or -
|
| 2) hold the Option key and double click a corner of a
| window to maximize without full screen focus, and doing
| that again will size it back to where it was
| jwells89 wrote:
| > To maximize a window to the dimension of the screen
| without entering full screen mode, either:
|
| This is a handy trick to know, but has caveats. Option-
| green-functionality is actually defined by apps, not the
| OS, and so in some cases it will for example act as a
| "fit window to content" button.
|
| Option-double-clicking a corner appears to be consistent
| across windows however.
| mhio wrote:
| Double click anywhere on the title bar works for me.
| claytongulick wrote:
| Interesting - it's one of the killer features of macos for
| me.
|
| I use it to compartmentalize stuff I'm working on, and 4
| finger swipe between them.
|
| I have a couple instances of VSCode, some app/debugging
| browser windows, some chrome profile windows etc... all
| running full screen and I spend my day brush swiping
| between them.
|
| It may be different for me because I work on a 14" macbook
| pro, mostly at coffee shops or unusual work locations, so I
| don't have a large monitor setup. I don't think I'd use it
| much if I had more of a traditional desk/multimonitor
| config.
| jwells89 wrote:
| I could definitely see it being more useful on a small
| screen, especially something like the 12" Macbook.
|
| Generally I'm working at my desk with at least 2x 27"
| displays. If I'm out somewhere it's instead 16" MBP +
| 12.9" iPad with Sidecar.
| wubrr wrote:
| But you can have workspaces/maximized windows without
| having the restriction of preventing moving windows to
| another monitor/workspace that has a maximized window.
|
| I use workspaces on my linux laptops all the time - this
| 'feature' has existed for ~20 years, and is not hard to
| implement. The difference being that macos seems to force
| the restriction of having only 1 window per workspace and
| not being able to drag a window from one workspace to
| another. If that's the behaviour you want (one window per
| workspace), you can easily operate in this way without
| having the forced restriction.
| wtallis wrote:
| You _can_ have multiple monitors and multiple workspaces
| with multiple windows that can be freely moved between
| monitors and spaces. The only restriction is that when
| you make a window full screen (not the same as
| maximizing), it becomes its own new single-occupancy
| space. You seem to think that making a window fullscreen
| is the only way to make a new space, but it's just a
| special case of a larger system that already has the
| functionality you're asking for.
| wubrr wrote:
| > A good bit of it boils down to macOS expecting monitors to
| be well-behaved, e.g. each having unique EDIDs (many don't,
| instead sharing one across all units of a particular model)
| and initializing in a timely fashion.
|
| I don't really buy this explanation. I'm talking about
| using a single external monitor here, a monitor which I
| regularly also use with linux (intel/amd) laptops without a
| single issue.
|
| > Because it's fullscreened, not maximized. macOS doesn't
| really have window maximization in the traditional sense
| out of the box, you need a utility like Magnet or Moom for
| that.
|
| That's not really a good reason though. It still doesn't
| explain why the restriction exists. Why can't a non-
| maximized/fullscreen window be displayed on top of a
| fullscreen/maximized window? What problem does that solve?
|
| > That fullscreen mode was introduced in 10.7 Lion and a
| lot of long time mac users have found it silly from day
| one. Personally I never use it.
|
| It's the default though.
| jwells89 wrote:
| > I don't really buy this explanation. I'm talking about
| using a single external monitor here, a monitor which I
| regularly also use with linux (intel/amd) laptops without
| a single issue.
|
| Might be model-specific then. It's not something I've
| seen with displays from Asus, Alienware, and Apple. I
| briefly owned a Dell monitor that would periodically
| flicker but it got returned.
|
| > That's not really a good reason though. It still
| doesn't explain why the restriction exists. Why can't a
| non-maximized/fullscreen window be displayed on top of a
| fullscreen/maximized window? What problem does that
| solve?
|
| Probably because it'd be easy for windows to get "lost"
| if the fullscreen window were focused and non-
| fullscreened windows fell behind it, with there being no
| obvious indicator that those windows exist.
| arcatech wrote:
| Do you honestly believe all of those things happen to
| everyone? You think everyone who likes macOS is just ignoring
| those kinds of severe issues?
|
| Obviously you have some kind of problem with your system.
| hashhar wrote:
| Well, most of what he is saying describes easily
| reproducible macOS quirks.
|
| - Windows disappear, become inaccessible after moving
| between monitors, even though they're still open and active
| according to the dock/task list.
|
| This indeed happens relatively often if you have multiple
| monitors and switch between them for any reason. e.g. in my
| case I have two machines and two monitors, sometimes I
| switch the primary monitor to a specific machine and this
| almost always fucks up macOS. Solution is to disconnect and
| reconnect the monitor.
|
| - Why can't I move a window to a monitor/workspace that has
| a maximized window? Like, you can do it by un-maximizing
| the window in question, moving the other window over and
| then re-maximizing the window again. But why is this
| nonsense needed? What problem could this restriction
| possibly be solving?
|
| A lot of people who use macOS agree that the fullscreen
| window thing is needless and makes for quirky behaviour.
|
| - Dock constantly breaks/becomes inaccessible. This is a
| known problem for at least 5+ years - the solution is to
| manually kill and restart the process???
|
| - Text cursor/caret randomly disappears when editing text,
| so you can't see where the cursor is, and you can't fix
| this unless you restart the app (happens to pretty much all
| apps).
|
| Yep, happens quite frequently to multiple people I know and
| in multiple apps.
|
| - Bad defaults in terms of keypress repeat times and rate.
| No apparent way to change this from settings - need to run
| commands and re-login to test new behaviour.
|
| Indeed this is very painful for people who are non-
| developers and are used to being able to set higher
| repeat rates.
| Exoristos wrote:
| I've both used Macs for decades and at one time worked in
| enterprise Mac technical support, and I've never seen this
| stuff happen.
| AtlasBarfed wrote:
| I've used multi-monitor dev setups and large screen setups
| in mac laptops for, christ, 15 years.
|
| I've seen all of those things listed.
|
| It would be one thing if OSX/Linux/Windows UIs would just
| stay at their basic usability from around 2010. They
| haven't. They just get steadily worse.
|
| OSX has been the most stable of them, quirks and all, even
| with a goddamn architecture change, but it has closed
| hardware, a forced upgrade cycle, and bad backwards
| compatibility.
|
| Linux once would just rewrite window managers every 5 years
| as soon as they got stable, now Linux is rewriting with
| Wayland and Vulkan, and ... about 5 years in they'll
| probably start rewriting those once they get a little
| stable.
|
| Windows? Committed UI suicide with the tiles and two
| desktops thing in Windows 8. Now there's ARM in the future
| and a break with the only thing it has going for it:
| backwards compatibility.
|
| But yes, everything the author complains about happens in
| OSX with laptop + monitor / dual monitors, and even without
| with disappearing mouse cursors.
| d3w4s9 wrote:
| I haven't used a Mac for a while, but I definitely remember
| the part about windows freezing (which requires killing the
| process) when connecting/disconnecting an external monitor.
| It's very frustrating.
|
| (And I for one haven't bought a Macbook since 2020 because I
| am not going to spend at least $1,999 just to get dual
| monitor support. A $500 asus laptop can do dual monitor
| without any problem, and it turns out that machine is good
| enough for my productivity needs. That money is better
| spent elsewhere)
| wkat4242 wrote:
| It's not hard. The pro can do it. It's simply market
| positioning to make you spend more.
| alberth wrote:
| Only 1-monitor support seems like a market segmentation
| decision, not a lack of capability.
|
| (Much like how the iPhone Pro has faster USB-C data transfer
| speeds vs base iPhone)
| jbverschoor wrote:
| Give me a "pro" which is thinner and without a fan.
|
| or
|
| Give me an "air" with two external monitors and 64GB ram.
|
| The pros are clunky and heavy. I'm on an air, and will stay
| there for a long time because of this.
| abakker wrote:
| Aren't the pros and the airs very close to the same weight?
| The 15" air is 3.3 lbs and the MacBook 14" with the M3 is
| 3.4 lbs. The heaviest 16" with the M3 Max is 4.4.
| nwienert wrote:
| The 15" air is significantly heavier than the 13".
| joshstrange wrote:
| I agree but the iPhone is a bad example. They used the
| previous year chip in the non-pro phone (or a binned version
| IIRC) which didn't have the USB3 speed support. I guess we
| will see if next year's base iPhone has the faster data speed
| support or not.
| josu wrote:
| Yeah, you can actually use these dongles to connect more
| monitors.
|
| https://m1displays.com/
| jonah wrote:
| The regular iPhone 15 has last-year's Pro chip, the A16, and
| the 15 Pro has the new A17, so it's not simply a matter of
| binning or disabling features.
| delfinom wrote:
| It's intentional. Apple is going down the path of SKUs to
| segment their price points.
|
| It may also be they have chips that have bad display
| controllers and they are using this as one way of offloading
| those chips with the bad controllers lasered off.
| hartator wrote:
| I have the Air, it does support its own monitor + a 6k monitor.
| quitit wrote:
| The answer is pretty straight forward: it has no fan and
| limited ram.
|
| It supports a 6K screen in addition to the built-in screen.
| It's reasonable to say that if you need 3 screens to get your
| job done, then that is not representative of the market for
| apple's lowest-end laptop.
| exabrial wrote:
| Dude, my old Galaxy phone from yesteryear could drive a 4k
| display
sfmike wrote:
| Are efficiency cores just lower-clocked, or how could 6
| efficiency and 6 performance cores be worse? Is it possible
| it could be better, being more efficient while also faster,
| and they just call them efficiency cores? After all, isn't
| that an arbitrary semantic term?
| reliablereason wrote:
| They take up less area on the chip.
|
| Which could be seen as an indication that those cores have
| less caching and fewer parts for prefetching and stuff like
| that.
|
| An efficiency core does less per cycle but also uses less
| power per cycle. A performance core might do 3 instructions
| in a cycle while an efficiency core might only do half an
| instruction.
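The parent's point can be made concrete with a toy perf-per-watt calculation. The IPC and power numbers below are invented for illustration, not measurements of any real core:

```python
# Hypothetical cores: all numbers are illustrative only.
P_CORE = {"ipc": 3.0, "watts": 5.0}  # wide core: 3 instructions/cycle
E_CORE = {"ipc": 1.0, "watts": 1.0}  # narrow core: 1 instruction/cycle

def perf_per_watt(core: dict) -> float:
    """Instructions per cycle delivered per watt consumed."""
    return core["ipc"] / core["watts"]

# The P-core is 3x faster, but the E-core does more work per joule,
# which is why background work gets scheduled onto it.
print(perf_per_watt(P_CORE))  # 0.6
print(perf_per_watt(E_CORE))  # 1.0
```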
| bloopernova wrote:
| I want a better system load metric now that we've got
| heterogeneous cores in CPUs. The Pixel 8 Pro has 3 types:
| efficiency, performance, and one single ultra performance core.
|
| If your efficiency cores are always close to max usage, but your
| performance cores are idle, is your system being heavily or
| barely used?
|
| I understand that system load isn't really a useful metric
| between systems, but it's useful to compare on a single system I
| think. I just want a better at-a-glance thing to communicate to
| me if my computer is under or overloaded.
|
| (Additionally, do you have to specify that a particular app can
| run on efficiency cores, or does the process scheduler do it all
| without input from a human?)
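One possible shape for such a metric, sketched here with invented capacity numbers: weight each core's utilization by its relative compute capacity, so pegged efficiency cores on an otherwise idle machine read as a lightly loaded system. (Linux exposes a similar per-core `cpu_capacity` value in sysfs on heterogeneous ARM systems.)

```python
def weighted_load(cores):
    """cores: list of (utilization, relative_capacity) pairs,
    with utilization in [0, 1]. Returns the fraction of the
    system's total compute capacity currently in use."""
    used = sum(u * cap for u, cap in cores)
    total = sum(cap for _, cap in cores)
    return used / total

# 4 efficiency cores (capacity 0.3) pegged, 4 performance cores idle:
cores = [(1.0, 0.3)] * 4 + [(0.0, 1.0)] * 4
print(round(weighted_load(cores), 2))  # 0.23: "busy" E-cores, mostly idle system
```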
| tootyskooty wrote:
| Sounds like power usage might fit the bill.
| zamadatix wrote:
| Raw power usage is very skewed. Getting that last 20% of
| performance the chip is capable of might take as much power
| as the first 80% did, and each type of workload may not have
| the same skew depending on which components are being
| stressed and how. Say you did have an adjusted map of
| power<->utilization though, it still assumes "system
| utilization" means "how much of the total possible capacity
| of this system is being utilized". This is a valid take, if
| you can get such a mapping, but it's not necessarily the only
| valid take of what system load is.
| cduzz wrote:
| Isn't this just a subset of the existing capacity monitoring
| problem where I want to know how loaded a multi-core system is
| under the following scenario:
|
| Some critical path, single-threaded task is at 100% of capacity
| of a core; the system has 10 cores -- is my system at 10%
| utilization or 100% utilization?
| ahoka wrote:
| Easy, measure the TDP usage instead of CPU usage, which is
| impossible to correctly measure at a given time anyway.
| cduzz wrote:
| Well, I certainly track capacity by power used by servers
| under management, but a system that's pinned on one thread
| is at 100% utilization, for very real definitions of
| "capacity", but the power consumed will _also_ be some small
| fraction of how much power would be drawn if you were
| lighting up all the cores at 100%.
| the8472 wrote:
| linux has pressure stall information[0] which tracks the time
| where some processes couldn't be run because they were waiting
| for contended resources (cpu/io/memory). An N-parallel compute-
| bound job on an M-thread CPU will stay at nearly 0 CPU pressure
| if N=M (assuming no background tasks) because they're not
| stepping on each other's toes. At N>M pressure will start to
| rise.
|
| [0] https://docs.kernel.org/accounting/psi.html
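The PSI files are plain text and easy to consume directly. A minimal parser, fed a sample line that mimics the documented format of /proc/pressure/cpu:

```python
def parse_psi(line: str):
    """Parse one line of a /proc/pressure/* file into (kind, stats)."""
    kind, *fields = line.split()  # kind is "some" or "full"
    stats = {k: float(v) for k, v in (f.split("=") for f in fields)}
    return kind, stats

# On Linux you would iterate over open("/proc/pressure/cpu") instead.
kind, stats = parse_psi("some avg10=1.50 avg60=0.40 avg300=0.10 total=123456")
print(kind, stats["avg10"])  # some 1.5
```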
| twoodfin wrote:
| Note that for memory, as I understand the documentation, it's
| less about "couldn't be run" than "couldn't have the real
| memory allocation necessary to avoid paging".
|
| This is in contrast to another contended resource, memory
| bandwidth, "waiting" for which manifests at the process level
| as CPU cycles like any other code. It's possible to use
| profiling tools to distinguish memory waits from other CPU
| activity, but as far as I know nobody has built the
| infrastructure to bubble that data up in some form so it
| could be tracked systemically by the OS.
|
| I can guess why: The memory hierarchy is complicated, and
| what you can derive about it from CPU performance counters is
| indirect and limited. Still, even some basic estimation for
| how many cache lines a process was responsible for driving
| over the memory bus would be helpful for those building high-
| performance systems and applications.
| Hello71 wrote:
| PSI is exported by the kernel to track context switches:
| switch to idle process = io wait, switch to kswapd = mem
| wait, switch to other runnable process = cpu wait. these
| are already visible to the kernel, it just needs to
| increment some counters and expose those to userspace. you
| can get the perf counters with something like `perf stat
| -ae cache-misses sleep 60`, it doesn't need to be in /proc.
|
| additionally, context switches are the same on everything
| that can run Linux, whereas PMU counters are highly CPU-
| specific (potentially even different in each CPU stepping),
| so given the current state of affairs, a generic interface
| would be very limited.
| paulddraper wrote:
| Huh? # of running processes is the usage of the system.
|
| ---
|
| If your system is running it on CPU, GPUs, high-efficiency,
| low-efficiency, etc....that's the _performance level_ of your
| system, not the _load_ on your system.
|
| This is nothing new. We've had power-saving CPU throttling for
| ages. Carry on.
| flashback2199 wrote:
| I don't see Apple Silicon in Macs having been the right
| decision a decade from now, because competing toe to toe
| with the entire semis industry on architecture has never
| worked before, except in the iPhone, where they had a huge
| first mover advantage.
| larkost wrote:
| I think the argument that this might not be great in the long
| run is worth considering, but in the short run I think that
| their results over the last few years are nothing short of
| astonishing. Specifically their laptops, especially when not
| plugged into a power source, have both stunning performance and
| battery life at the same time.
|
| Yes there are competitor chips from AMD and the like that are
| faster when plugged in, but at a power budget that mean they
| are severely throttled when unplugged. This is what Apple cares
| about, and at this point they are unmatched. Apple likes to
| have fast desktop machines (and they do), but they don't care
| about that nearly as much as they care about laptops-in-laptop-
| mode. We are a couple of years into this, and Intel is making
| noises about trying to match this, but has not yet, and Apple
| seems to be pulling further ahead in specifically this area.
|
| And they have a huge overlap in this focus in the development
| of the related processors for their iPhones/iPads/AppleTV
| products, and the upcoming headset. A lot of the development
| work gets largely reused in that other space, making Apples ROI
| even better.
|
| So long as there is a big enough market that cares about that
| (and from the look of sales, there is), Apple's strategy seems
| to be a winning one.
| flashback2199 wrote:
| They are making themselves more different from the rest of
| the market instead of "the same but better", which is what
| brought people back to the Mac in the first place. It used
| to be that as a normal person you bought the latest Mac and
| it came with the latest i7 and you didn't even think about
| it, and none of your friends on PC could say anything bad
| about it because it was the same chip they had. Now it's
| "woo, the M chip is so different and fast, look at my new
| Mac guys", and then your friends who have PCs say "actually
| it's not faster, it's just more power efficient", and now
| the seed of wondering whether Apple is really better has
| been planted, whether or not it's totally true. I've
| already started seeing this happen on forums. The average
| person buying a Mac doesn't want to have to think about the
| nuances of specs, and once they start digging into specs
| they might find themselves buying a PC instead. But we'll
| see, maybe the past won't repeat itself this time.
| foldr wrote:
| >It used to be as a normal person you bought the latest Mac
| and it came with the latest i7 and you didn't even think
| about it and none of your friends on PC could say anything
| bad about it because it was the same chip they had.
|
| Only a tiny, tiny fraction of people who buy computers have
| conversations like this with their friends. I am a software
| engineer and could not care less which chips my friends
| have in their laptops.
| flashback2199 wrote:
| Then why is their marketing right now all about the chip,
| not the product?
| astrange wrote:
| Why are you arguing about PC performance with your friends?
| Don't you have something better to do?
|
| The biggest advantage of ARM over x86 is security,
| efficiency is second. Performance is nice to have but is
| kind of coincidental (depending on workload) and I'd expect
| everyone else to catch up.
| aseo wrote:
| I always understood the move to Apple Silicon was also for
| supply chain concerns, and not just technical. I remember
| reading that Apple would often express frustration at having to
| align their products' schedules based on Intel's schedules, and
| there would be a lack of support from Intel in their
| collaboration -- I could see this as a valid bottleneck for
| Apple, and this isn't their first rodeo in processor
| transitions. Of course, now, their bottleneck moves down the
| supply chain and will be TSMC, but TSMC seems to be happy to
| provide whatever Apple asks for, as seen with their large 3nm
| orders.
| flashback2199 wrote:
| So why work only with Intel and not also with AMD? I never
| understood that. They chose vendor lock-in so of course the
| vendor is going to sit on it same as Motorola did and later
| IBM. Now they still have vendor lock in except the vendor is
| internal. From a distance, it looks super dumb, betting that
| you can do better with an internal project than playing off
| the established duopoly. I'm sure I'm too out of the know to
| understand.
| mdasen wrote:
| I certainly thought this back in 2012 when Apple started
| designing their own mobile CPU cores. How could Apple compete
| with ARM's designs which would have much greater economies of
| scale or with Qualcomm who would be shipping way more units
| than Apple? More than a decade later and it's proven to be a
| durable competitive advantage.
|
| I would point out that Apple had no first mover advantage with
| the iPhone. Apple was using standard ARM cores for the first 5
| years. It's not like ARM didn't have a multi-decade head start
| on Apple designing ARM cores.
|
| There are certainly risks, but so far it looks like Apple is
| making the right decision. Intel and AMD are mostly still
| targeting a higher power/thermal level than Apple is looking
| for. Apple likes being able to run things without relying on an
| external GPU. Having the ability to design things themselves
| means they're able to make what is most important to their
| users.
|
| With Intel, Apple was always at the whims of what Intel wanted
| to produce. It took forever to get the 1038NG7 28W part from
| Intel which meant that Apple couldn't offer a 13" laptop with
| the speed that a lot of users wanted. It also meant that they
| were beholden to crappy Intel graphics - even as they paid up
| for the Iris graphics.
|
| With their own CPUs, their destiny is in their hands. It's
| worked extremely well for the iPhone and it's working well for
| their Macs too.
|
| One of the things to remember: Apple already needs to design
| these cores for their iPhones and iPads. Once you're doing
| that, it makes a certain amount of sense to re-use those core
| designs for the Macs. It isn't zero effort, but they aren't
| starting from scratch.
|
| I'd also note that Apple tends to do less than a lot of
| companies and the same applies to their processors. Intel, AMD,
| and Qualcomm are trying to make parts for a huge range of
| devices. Apple has a much narrower field of products. While
| Qualcomm is trying to create processors for $100 cheap phones,
| Apple just goes with a single CPU for its mobile devices
| (sometimes using last year's CPU for some devices). With the
| Mac, Apple has a little more variety, but it's still pretty
| constrained. An M Ultra is basically just two M Maxes put
| together. Most of the time, the design is basically the same
| and Apple has just changed the core count or something like
| that.
|
| Will Apple compete with Nvidia's GPUs? Probably not, but an M2
| Ultra is competing with an Nvidia RTX 3070 desktop GPU. Sure,
| that's not the current generation or Nvidia's highest spec, but
| it's still good - and OpenCL isn't great on Mac so it isn't
| even a fair test. The new M3s have much upgraded graphics and
| it'll be interesting to see how well they do.
|
| You can say it has never worked before, but I think that might
| ignore a few things. First, one of the reasons that Intel came
| to dominate things is because everyone who tried to vertically
| integrate their processors tried to charge too much. Regardless
| of what you think of Apple's pricing, they aren't charging more
| for the privilege of their M processors than they were for
| their Intel ones. I think it also ignores the fact that Apple
| has so much money. Finally, in terms of cores shipped in the
| segments that Apple is shipping them, they're huge. ARM isn't
| shipping many X-series cores (their top performance cores).
| They're mostly shipping lower spec'd cores. There aren't a ton
| of flagship Android phones being sold. Intel is mostly shipping
| lower-end laptop CPUs destined for machines in the $400-800
| range, server CPUs, etc. Apple has a lot of scale for the cores
| it is designing.
|
| There is risk, but Apple took that risk a decade ago with their
| iPhone CPUs and they've had a great advantage there. While
| Intel and others are looking to revitalize their CPU game, it
| seems like they're doing it at a higher thermal/power level
| than Apple is looking for - and trying to benchmark themselves
| against Apple parts at a fraction of the power. Apple is
| getting to design parts that do what they need and they've
| proven that they can beat the industry for more than a decade.
| I'd say it was the right decision to move to Apple Silicon for
| Macs.
| jader201 wrote:
| It's ridiculous that the base and Pro only support 1 and 2
| external displays, respectively.
|
| Want 3 monitors? Have to pay for the Max.
|
| https://www.macrumors.com/2023/11/02/m3-chip-still-supports-...
| shepherdjerred wrote:
| From what I know, this isn't some artificial limitation that
| Apple imposes. Adding support has a cost that most customers
| don't need to pay.
|
| Regardless, most people aren't going to be plugging three
| monitors into their laptops, so most people aren't going to
| care about this.
| runjake wrote:
| > Adding support has a cost that most customers don't need to
| pay.
|
| It also adds heat, battery consumption, and space, for a
| chip that will also be used in iPads. Still, I wish the
| base M3 chip supported at least two external displays, or
| at least let you use two by disabling the internal display
| (e.g. clamshell mode or whatever).
|
| > most people aren't going to be plugging in three monitors
| into their laptops, so most people aren't going to care
| about this.
|
| Last summer (2022), or thereabouts, I got a survey request
| from Apple where they asked how much I cared about external
| monitor support, followed by subsequent questions about the
| number of external displays that was important to me. So,
| Apple's looking at it.
| ChrisLTD wrote:
| It's entirely Apple's choice to use the M chips in iPads.
| And if that means they need to compromise on laptop
| performance or features, they should change course.
| wtallis wrote:
| Do you want Apple to drop some of their existing
| products, or do you want them to make more chip designs
| each generation with fewer economies of scale for each?
| freeAgent wrote:
| Apple still uses iPhone chips in the iPad and iPad Mini,
| and at this point I'd say they're more than fast enough.
| The iPad Air and Pro with M chips seem to me to be
| unnecessarily powerful for a device that runs iPadOS and
| therefore can only run apps from the App Store, etc. IMO,
| people don't buy iPad Airs and Pros over the standard
| iPad or Mini because they need the performance of the M
| chip. It's the rest of what's in the package.
| giancarlostoro wrote:
| If I ever have to try to plug more than two monitors into a
| laptop at that point I'm going to start asking myself why I'm
| not just buying a tower instead. The whole point of laptops
| is portability.
| thrwy_918 wrote:
| > The whole point of laptops is portability.
|
| Many people want portability, but also want to use external
| displays at home or at the office.
| baz00 wrote:
| Intel supports 4 displays on a bottom-end i5-1235U.
|
| Even my crappy little Lenovo M600 with an N-series Celeron
| supports 3 displays. Two 4k ones fine. I didn't have a third
| to test it with.
|
| It's either an architectural fuck up or intentional
| segmentation.
| shepherdjerred wrote:
| > It's either an architectural fuck up or intentional
| segmentation.
|
| Do you think there is any chance that Apple would have had
| to make some engineering compromise (cost, performance,
| efficiency) for a feature that very few people would use?
| mickeyfrac wrote:
| Apple may be setting up for this to be a killer feature with
| Apple Vision headsets.
|
| They are saying a single 4K feed, but if you could break that
| into segments it would change the game. That would be 4 x 1080p
| screens + the laptop.
|
| If you could slice the rectangles any way you wanted and rotate
| them individually then you have the perfect environment. Which
| you would be able to change instantly, presumably, with Mission
| Control.
|
| Personally I would pay lots of dollars for that.
| TIPSIO wrote:
| Speaking of...
|
| Can I sidecar to two or more iPads with this yet?
|
| --
|
| "Use an iPad as a second display for a Mac"
|
| https://support.apple.com/en-us/HT210380
| punkybr3wster wrote:
| Why on earth would you want this when you can get much nicer
| screens for fractions of the price?
|
| One I can understand as I used to travel with my iPad Pro and
| used it as a second screen. But two? I've switched to a much
| lighter and much nicer UHD HDR usb-c monitor that is amazing
| for the price and the weight difference.
|
| You also don't get the random disconnections when sidecar
| weirds out for no reason.
| corbezzoli wrote:
| Really, you're complaining you can't plug a 4th display into
| your computer? The fraction of the population who does can
| afford the Max, together with the 3 extra monitors.
|
| I'm not trying to justify Apple here, but this usage seems
| quite niche and it's understandable to me to need a non-base
| setup. It doesn't sound _ridiculous_ at all. As a matter of
| fact, the vast majority of notebook users never even plug a
| single monitor in.
| unlikelytomato wrote:
| I might argue that what you are describing is exactly why
| it's a bit absurd. It would be surprising to me as a user to
| discover such a limitation. If I didn't read about it here, I
| would probably find out after the return window and then be
| pissed that they took a stand on this particular artificial
| market segmentation.
| swader999 wrote:
| My metrics are simple: Does the fan turn on? Can it go faster
| than I think and type? M2 with lots of ram has been a dream.
| wkat4242 wrote:
| This guy is becoming too much of an apple apologist for me. I've
| seen him defending pretty bad decisions like John Gruber (who
| always was) so I've stopped following him like I have Gruber.
|
| Also, _because_ I've disagreed so much with Apple's decisions in
| the past years I've abandoned their ecosystem altogether so I'm
| also much less invested in the topic. Though I still use a Mac
| for work as a "least bad" option.
|
| Too bad because he did have good technical insights.
| wkat4242 wrote:
| Ps I know this is a kinda hot take but everything Apple has
| rubbed me the wrong way the past 10 years and I'm less and less
| aligned with Apple fans. I know this doesn't apply to everyone.
|
| In 2004 I moved to Mac because it was a powerful and
| configurable Unix OS with the benefit of a consistent UI
| (nothing on Linux was there then, it was a mess) and major
| commercial apps like Office and Photoshop.
|
| The latter are still true but the platform is so locked-in that
| most of its features are useless to me as a multi-OS person.
| And the hardware is quite locked down as well which simply
| doesn't work for me. A lot of the reasons for this are not
| user-centric but commercial.
| exabrial wrote:
| I think I'll hold out for the M4 and make sure memory bandwidth
| and latency are restored to their previous values.
| GeekyBear wrote:
| Given that Intel and AMD tend to choke the memory bandwidth on
| anything below their server chips, this seems short sighted.
|
| The M3 Pro's "reduced memory bandwidth" is still double the
| memory bandwidth you see on Intel and AMD's HEDT chips.
|
| Step up to the M3 Max and you're looking at five times the
| memory bandwidth of Intel and AMD's chips.
| throwaway49594 wrote:
| You're comparing strictly CPU memory bandwidth to CPU + GPU.
| If you add CPU + GPU bandwidth for a PC you'll get similar
| numbers.
| buildbot wrote:
| Your PC can't use the GPU memory bandwidth for the CPU
| whatsoever. So why would you add the bandwidth?
| fredgrott wrote:
| For context, a desktop Intel 13900 or 14900 combined with at
| least an NVIDIA 4060 beats it... System76 Thelio Mira, for
| example.
|
| Both in single- and multi-core, and in GPU memory bandwidth.
| Synaesthesia wrote:
| Yes, but only just, and it uses 5-10x more power.
| pram wrote:
| This is my favorite genre of Apple Silicon posts. Someone who
| can't tell the difference between a laptop and a 30lb desktop.
| fh9302 wrote:
| They are basically equal while the Intel CPU uses significantly
| more power.
|
| https://browser.geekbench.com/processors/intel-core-i9-13900...
| https://browser.geekbench.com/v6/cpu/3364975
| gtvwill wrote:
| Devices get a fail for repairability. So yes, there is more
| to performance than cores. For instance, the entire Mac
| lineup is mostly unrepairable e-waste in the making, with
| excessively wasteful design choices behind it.
|
| The company has about as much ethics as a wet tea towel. A
| facade of good intentions masking greed. I couldn't care less
| if their
| laptops get an hour extra battery or are a few seconds faster.
| The company is scum from an ethics and waste standpoint.
| Surprises me so many can turn a blind eye to it for 3 months of
| chart topping numbers.
| musha68k wrote:
| As a user of many Apple Silicon generations in different /
| mostly top configurations by now, I'm obviously a big fan.
| One thing that I have "observed" though (with no data to back
| it up): it seems to me that under heavy load / resource
| over-provisioning, the earlier Intel systems recovered more
| gracefully. I wonder if it's just me, the particular OS
| version I had been using at the time, or some other thing I'm
| missing?
|
| Again, no idea if this is actually the case for workflows
| other than mine. I would be curious to know if anyone else
| has made the same observation, potentially with actual "high
| load performance" data to back it up.
| lilyball wrote:
| Right now I have an intel laptop for work and an apple silicon
| laptop for personal use. The workloads I do on these machines
| are different so comparisons are a bit hard to do, but I've seen
| the intel laptop exhibit poor behavior under load that I've
| never seen from the apple silicon machine.
| scarface_74 wrote:
| By "load" you could easily mean "running Microsoft Teams".
___________________________________________________________________
(page generated 2023-11-03 23:00 UTC)