[HN Gopher] Apple's follow-up to M1 chip goes into mass producti...
___________________________________________________________________
Apple's follow-up to M1 chip goes into mass production for Mac
Author : lhoff
Score : 426 points
Date : 2021-04-27 12:54 UTC (10 hours ago)
(HTM) web link (asia.nikkei.com)
(TXT) w3m dump (asia.nikkei.com)
| hu3 wrote:
| I might consider buying one when it is able to run proper Linux.
| And even then it's probably going to be limited to ARM Linux
| only.
| rswail wrote:
| "Limited to the CPU type that is installed"... how is that a
| "limit"?
| hu3 wrote:
| For my workloads that is a limitation that must be considered
| when purchasing hardware. It's the same reason why I don't
| buy ARM Chromebooks.
|
| It's going to take years to run proper Linux on M1 and even
| more for the ecosystem to catch up to x86_64.
|
| For me it's reasonable to keep using AMD Ryzen 5000, which is
| faster than the M1 on my multithreaded workloads anyway,
| despite using 7nm. Plus it has a better GPU, more memory, more
| storage, more ports, and supports multiple monitors.
|
| Sure it is more expensive, but that's just because my segment
| is geared to pros with higher requirements. Apple currently
| has no laptop offering on this tier.
| vmception wrote:
| Think the new chip will have a better RAM solution, ultimately
| allowing for more RAM?
| ojosilva wrote:
| Just a side rant here... I'm really frustrated I can't monitor
| the Neural Engine's usage in the M1 in my MacBook Air. Apparently
| Apple did not build an API for extracting data points from
| these 16 cores, so I can only watch what the CPU and GPU are
| doing when running and optimizing Tensorflow models while the NE
| remains a black box.
| Someone wrote:
| FTA: _" the latest semiconductor production technology, known as
| 5-nanometer plus, or N5P. Producing such advanced chipsets takes
| at least three months"_
|
| I know next to nothing about this process, but I can't imagine
| the latency from silicon wafer to finished product is 3 months.
| I also can't imagine some inherent start-up delay for producing
| the first chip (but as I said: I know next to nothing about
| this process), so where do those 3 months go? Is it a matter of
| "you have to be extremely lucky to hit so few minor issues that
| it only takes 3 months to have a first working product"?
| olliej wrote:
| Latency for a chip is 4 weeks at the low end of complexity, and
| 12 weeks (3 months) at the high end of complexity.
|
| My mind was blown when I first found that out
| Someone wrote:
| Thanks. Also good to hear that I'm not the only one who finds
| that surprising.
| my_usernam3 wrote:
| Fellow mind blown friend here!
| kumarvvr wrote:
| Somewhere, the collective who's who of the silicon chip world
| is shitting their pants.
|
| Apple just showed the world how powerful and efficient
| processors can be. All that with good design.
|
| Customers are going to demand more from Intel and the likes.
|
| Just imagine Apple releasing the Mx chips for server
| infrastructure. Intel ought to be sweating bullets now.
|
| edit: a word.
| zucker42 wrote:
| What makes you think that M1 would be better in servers than
| Ampere's chips or Graviton for instance? Desktop and server
| chips have different design goals.
| jcadam wrote:
| After the Xserve, I wouldn't touch Apple servers with a
| 10-foot pole.
| partiallypro wrote:
| If we're honest, most customers aren't going to notice the
| performance differences; it will mostly be people with heavy
| workloads.
| coldtea wrote:
| That includes anybody who uses Electron or Chrome... :-)
| thysultan wrote:
| They will notice the battery. Which is performance.
| lostgame wrote:
| Anecdotally, I'll say this is unequivocally not the case. I've
| had a half dozen of my friends upgrade and some of the
| aspects are absolutely surreal.
|
| The instant turn-on. No fans. Safari alone feels like a
| completely different beast; a friend and I spent about two
| hours running through some of the most demanding websites,
| side by side against my 2018 15" MBP.
|
| Holy crap. Just...all the subtleties.
|
| I'd actually say it's easier to notice the immense general
| speed-up than to notice the difference between a 4min and 2min
| render time in Final Cut.
|
| Dragging 4K clips around in FCP was infinitely smooth, and it
| made you immediately able to tell that the UI/UX itself had
| been dropping frames and needing to catch up to itself
| sometimes.
|
| These are things you don't notice until you've tried the M1
| for the first time.
|
| It truly is an undeniably insane piece of engineering. They
| killed it.
| sgt wrote:
| Not true at all. M1 and follow-ups are about single-thread
| performance, the only actual performance most consumers are
| likely to notice.
| merdadicapra wrote:
| > the only actual performance most consumers are likely to
| notice
|
| Remember when Apple made the Power Mac _QUAD_, to indicate
| that the system had _FOUR_ processors, because "men should
| not live on single-thread performance alone"?
|
| > _" With quad-core processing, a new PCI Express
| architecture and the fastest workstation card from Nvidia,
| the new Power Mac G5 Quad is the most powerful system we've
| ever made," Philip Schiller, Apple's senior vice president
| of worldwide product marketing, said in a statement._
|
| Of course they can't advertise the multi-core performance,
| because it is lower compared to similar systems, just like
| with the Power Mac they could not advertise the system as
| greener, because it wasn't.
| astrange wrote:
| The multicore performance is good for a portable, because
| background threads can run on the efficiency cores
| without making the whole system thermal throttle. Intel
| can't do that.
| baggy_trough wrote:
| You can definitely notice the difference between a formerly
| top of the line Intel MacBook Pro and an M1 Air. The Air is
| way faster for opening apps, photo library, and other common
| daily activities.
| selimnairb wrote:
| I work on a 25-year-old research hydrology model written in
| C. My M1 Mac mini runs the model about 68% faster than my
| 2019 i9 MacBook Pro. Definitely thinking of trading that in
| once a 64GB M1-based system is available.
| TheTrotters wrote:
| And M1 Macs won't burn a hole through your desk.
| holman wrote:
| I run a decent chunk of heavy workloads, and tbh the main
| things I notice on the new Air are all of the _other_
| niceties: instantly awake, ludicrous battery life, no fans. I
| think those aspects are going to be much more noticeable than
| many expect.
| fossuser wrote:
| Agreed - most people use their computers as web browsers to
| access social media and shopping.
|
| The performance boost there is extremely noticeable on new
| M1 chips.
| sjs382 wrote:
| But they _will_ notice that they left their charging cable at
| work/home and were still able to work all day without it.
| samgranieri wrote:
| I don't think Apple will go back to creating server
| infrastructure. In the interim, I know Amazon created an
| ARM server chip called Graviton, and I'd like to run a
| Kubernetes cluster on that.
| nonameiguess wrote:
| A lot of other fairly standard cloud-native technologies
| don't yet work on ARM, notably Istio, even though most
| implementations of Kubernetes itself will work. This is the
| major thing preventing me from being able to use a MacBook
| even for development (at least an M1, but that's all I have).
| tambourine_man wrote:
| https://pbs.twimg.com/media/Ez-8qRYUUAIoukD?format=jpg
| kumarvvr wrote:
| I hope they do, because, ultimately, it will be good for the
| environment.
| lozaning wrote:
| I would do truly terrible things for an updated Xserve RAID
| (the most beautiful computing device ever made) that
| supported SATA or NVMe drives.
|
| Right now I've got IDE-to-SATA adapters in mine, which leaves
| just enough space to also squeeze a 2.5" SATA SSD into the
| 3.5" drive caddies.
| etaioinshrdlu wrote:
| The latest Mac Pro does come in a rack mount option, but I'm
| not sure what market it's targeting.
| mlindner wrote:
| Linus (of Linus Tech Tips) talks about the target market
| some in his videos about it
| https://www.youtube.com/watch?v=Bw7Zu3LoeG8
|
| In short, it's for media production, not computing. You put
| it in a media rack.
| jeffy90 wrote:
| Maybe iOS CI/CD build infrastructure?
| evanmoran wrote:
| I think it's mainly for Mac / iOS development CI systems
| like CircleCI or MacStadium. The hardware just seems
| too expensive for it to replace the more generic ARM
| services that AWS is offering.
| selectodude wrote:
| On-site music and video production. You can rack it in a
| portable rack with your other racks of music equipment
| and it fits in the back of a van.
| [deleted]
| qaq wrote:
| They should do a cloud. They have some very unique tech, like
| a FoundationDB-based SQL RDBMS and the M chips, and probably
| a lot more cool thingies.
| Jonnax wrote:
| Yes, you can run a Kubernetes cluster using it.
|
| It's been available for a year: https://aws.amazon.com/about-
| aws/whats-new/2020/08/amazon-ek...
| qbasic_forever wrote:
| Amazon's Graviton 2 is the M1 of the server world in my
| experience. It's faster, cheaper, and overall a better
| experience than any other instance type for most workloads. You
| have to get through a bit of a learning curve with multiarch or
| cross-compiling to ARM64 depending on what your codebase is
| like, but after that it's smooth sailing.
|
| Azure and Google need to really step up and get some
| competitive (or even any) ARM instances--fast.
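|
| For a Node/TypeScript codebase, that learning curve is mostly
| about native modules; pure JS runs unchanged. A minimal,
| hedged sketch of the kind of check multiarch-aware loaders do
| (names and error text are made up):
|
|   // Pure JS/TS runs unchanged across architectures; native
|   // addons need a prebuilt binary (or a rebuild) per
|   // platform-arch pair.
|   const target = `${process.platform}-${process.arch}`;
|   // e.g. "linux-arm64" on Graviton 2, "darwin-arm64" on M1
|   if (process.arch !== "arm64" && process.arch !== "x64") {
|     throw new Error(`no prebuilt native binding for ${target}`);
|   }
|   console.log(`loading bindings for ${target}`);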
| matwood wrote:
| Agreed. I'm busy transitioning everything I can to Graviton2
| for the $/perf savings.
|
| For my Java services so far, no change. Aurora was a button
| click. ECS is the next thing to tackle.
| sprite wrote:
| I hope they add support for Graviton on Elastic Beanstalk
| soon.
| sneak wrote:
| It's pretty crazy how good the Graviton processors are.
|
| What's even more crazy is that you _cannot buy them_. Such
| price/perf is literally only available to rent from AWS.
|
| I hope Oxide does something about this soon; it is a sort of
| dangerous situation to have such a competitive advantage only
| be available via rental from a single (large and
| anticompetitive) company.
| wmf wrote:
| Ampere Altra is actually faster than Graviton 2 and you can
| buy it. (But apparently the motherboard is >$5,000!?!)
| https://store.avantek.co.uk/arm-servers.html
| qbasic_forever wrote:
| Yeah, I kinda wish Amazon would make a Graviton 2 powered
| laptop or dev machine. It would be really nice to develop
| directly on the ISA for the instance instead of cross-
| compiling. An Apple M1 of course works, but more ARM64
| laptops need to be available. It's sad that it never got
| traction in the Windows hardware world.
| sneak wrote:
| My concern is data centers, not development systems. One
| can always just ssh/x11/rdp/whatever into the Graviton
| machines.
|
| There's also nothing wrong with the $999 M1 Air. Isn't
| the Surface also ARM? And most Chromebooks?
|
| After you've already got the code, though, it's Amazon's
| way or the highway.
| wmf wrote:
| Future ARM Surface laptops will probably be faster than
| Graviton per-core but obviously with fewer cores.
| AtlasBarfed wrote:
| AMD will probably "arm" up quickly, Nvidia is acquiring ARM
| proper, and Qualcomm won't sit still.
|
| Intel? Well, maybe 12 months after they catch up to AMD in
| x86.
|
| AMD's chips I've seen are decently competitive with M1. If
| ARM is a not-too-bad microcode adaptation and ditching a
| whole lot of x86 silicon real estate, AMD might be able to
| come to market quickly.
|
| Intel isn't just getting pantsed in x86 by AMD and in
| process by TSMC; they are seeing their entire ISA getting
| challenged. Wow, are they in trouble.
| Hamuko wrote:
| Isn't the ARM acquisition kinda stuck at the moment?
| uKGgZfqqNZtf7Za wrote:
| It has to go through regulatory approval in the UK but
| this was to be expected. Jensen (Nvidia CEO) said he
| expected the acquisition to be complete by 2023.
| dagmx wrote:
| It is, but nvidia don't need it necessarily to compete in
| the ARM space, since they already make Arm SOCs
| AtlasBarfed wrote:
| Arguably they would proceed on an ARM design with or
| without the formal company acquisition. It's too good of
| an opportunity.
| supernova87a wrote:
| My layman's understanding was that M1 chips had very
| efficient low-power/idle usage, and a better memory +
| instruction-pipelining architecture for jobs of diverse
| (inconsistent) size and "shape". And that's why they blow
| away consumer-workload Intel chips.
|
| In a server high-load environment, is this still the case?
| Continuous load, consistent jobs -- is the M1 still that much
| better, or do its advantages decrease?
| merdadicapra wrote:
| 1 - Apple is not going to enter the server market ever again,
| especially not the generalist cloud infrastructure
|
| 2 - 90% of the laptops sold worldwide aren't Apple laptops;
| most of them are sub-$500 laptops. Amazon's best seller is the
| ASUS Laptop L210 Ultra Thin Laptop (11.6" HD display, Intel
| Celeron N4020 processor, 4GB RAM, 64GB storage), priced at
| $199.
|
| Basically only Mac users who upgraded to M1 are going to
| notice the difference.
| Logon90 wrote:
| And there are still Pentium processors running out there
| somewhere in Africa; how is that relevant?
| [deleted]
| jhayward wrote:
| > 2 - 90% of the laptops sold Worldwide aren't Apple laptops,
|
| This is perhaps the wrong metric to look at. What % of total
| profit is Apple capturing? Several years ago the accepted
| wisdom in phones was that Apple was getting ~30% of the unit
| volume and 90% of the profit.
| ogre_codes wrote:
| > 90% of the laptops sold Worldwide aren't Apple laptops,
| most of them are sub $500 laptops.
|
| Apple's share of the total PC shipments is closer to 15% now
| and their share of the laptop market is even higher.
| dgellow wrote:
| ... in the US. It's a different story worldwide where Apple
| is around 8% (source: https://9to5mac.com/2021/01/11/mac-
| huge-q4-growth-49-percent...).
| neogodless wrote:
| I tried a variety of Google searches, but I can't find
| information specifically narrowed down to Apple's share of
| the notebook/laptop market in the U.S. Can you help me?
| amelius wrote:
| > Just imagine Apple releasing the Mx chips for server
| infrastructure. Intel ought to be sweating bullets now.
|
| As a developer, I'm sweating bullets. Imagine a future where
| Apple is the only serious platform left on mobile, server and
| desktop, then developers can forget about starting that next
| disruptive startup and instead they will become just second-
| class Apple "employees" with contracts similar to Uber drivers.
| pfranz wrote:
| I guess maybe if Apple makes more changes after all of that
| happens. 10 years ago Instagram was iOS-only for years. Uber
| launched on iPhone and later added Android. Clubhouse is iOS-
| only right now. These companies are choosing to focus on iOS
| for their startups.
|
| The other day there was an article here about Swift on the
| server and it was full of replies from people who would love
| to use it more but lack of investment from Apple makes it
| often a poor choice. It's doubtful even Apple is using it
| much server-side in house.
| Tepix wrote:
| I think companies such as Tenstorrent have a good shot at
| disrupting the semiconductor industry once again with their
| AI chips
| TameAntelope wrote:
| With all the respect in the world, I think the complete lack
| of that happening at all this century should make you pause
| and reflect on why you think it's even remotely possible.
|
| I'm happy to be wrong, but I think you're buying too much
| into the SV/media fearmongering on a handful of companies
| whose main business is actually just attention economics.
| snypher wrote:
| Not happening this century is a big statement. I don't know
| a lot about this industry, but I don't see why Apple
| couldn't do what IBM did to arrive at "everyone on x86".
| TameAntelope wrote:
| I don't know a ton about the history of x86, but I kind
| of assumed all of that shook out, one way or another,
| pre-2000.
|
| I was using admittedly vague language for rhetorical
| effect. I only technically meant the past 21 years.
| hugi wrote:
| Indeed. It's interesting to see how people like to label
| Apple a monopoly - a company with a 19% market share in
| smartphones and an 8% share on the desktop.
| elcomet wrote:
| In the world, sure, but it's a different story in the US.
| neogodless wrote:
| About 15% of personal computers in the U.S. now.
|
| https://www.statista.com/statistics/576473/united-states-
| qua...
| nexuist wrote:
| Apple isn't harnessing magic here. The opportunity for
| competition exists, but everyone else is just sitting on
| their asses while Apple's chip team puts in the work.
|
| If you want to beat Mx, you have to think outside the box,
| just like how Apple thought outside of x86.
|
| That being said, I do enjoy the shift to ARM as the dominant
| arch. x86 has far too many decades of cruft behind it. I
| don't think Apple will take over the ARM space; in fact I
| think we will see the emergence of new players as ARM isn't
| as finicky about licensing their tech as Intel is. The only
| reason Intel even had a competitor (AMD) in the past 20 years
| is because they licensed x86 to them before it started
| becoming the de facto choice.
| edrxty wrote:
| I'm really hoping one of the other chipmakers jumps on the
| RISC-V bandwagon. There's substantial potential to do to
| ARM what ARM did to x86 here. If
| Intel/AMD/Qualcomm/Broadcom/whoever started talking about a
| significant RISC-V offering I'd be buying as much of their
| stock as I could.
| acegopher wrote:
| RISC-V doesn't offer the same advantage over ARM that ARM
| has over x86, so that's unlikely to happen.
| edrxty wrote:
| I work with RISC-V; it has a lot of features that are not
| yet well explored. In particular, the instruction set
| extension mechanism is extremely robust and capable,
| allowing much more FPGA fabric integration than you
| currently see on ARM devices. As we move towards more
| programmable logic, it'll be a massive advantage going
| forward.
| AgentOrange1234 wrote:
| That sounds really interesting -- what makes RISC-V
| better than ARM for this?
| foobiekr wrote:
| ARM didn't do anything to x86. One specific team at one
| company did something. Maybe two if you're super generous
| and include Graviton.
| edrxty wrote:
| ARM is destroying x86 across the board, not just at
| Apple. They've long won the mobile space and are now
| making significant inroads into the server market, long
| thought to be the last bastion of Intel/x86.
| foobiekr wrote:
| What's the revenue of ARM server CPUs?
| breakfastduck wrote:
| It doesn't matter. The revenue for x86 server CPUs goes
| down every time Amazon installs a CPU they made themselves
| instead of buying an Intel chip.
| Elora wrote:
| > The opportunity for competition exists
|
| This is so deep and powerful if you really think about it.
| This is what we were promised. While not perfect, I applaud
| Apple for at least trying.
| deaddodo wrote:
| > Apple isn't harnessing magic here. The opportunity for
| competition exists, but everyone else is just sitting on
| their asses while Apple's chip team puts in the work.
|
| > If you want to beat Mx, you have to think outside the box,
| just like how Apple thought outside of x86.
|
| They didn't even do anything special. People have been
| telling multiple ARM IP holders to build a wide, powerful
| chip for ages. Especially for the server space.
|
| Apple is just the first one to actually do it. AMD and
| Intel have been doing it all along, which is why it's so
| impressive that Apple's chip is finally within spitting
| distance of them despite being a fairly normal design with
| some specific optimizations for use cases.
| enos_feedler wrote:
| "The opportunity for competition exists" is the most true
| statement and response to everyone grumbling about Apple
| dominance. Somewhere, somehow, the ecosystem of competitors
| has failed to execute, got lazy, etc., and is now looking to
| regulators to bail them out. It makes me a bit sick. Apple
| is nothing more than a computing stack wrapped in a
| portfolio of consumer products. They happened to see this
| future early, invested aggressively in it, from silicon to
| security, and everyone got caught with their pants down.
| dmix wrote:
| There's already multiple billion-dollar companies trying
| to compete and get to where Apple's at. Apple has the
| lead now, but they won't forever. They never do. I don't
| see what the big deal is here.
|
| Competition has brought us such great processors. We
| should be thankful for it.
| enos_feedler wrote:
| In the consumer space there are only 2 paths for a viable
| competitor to emerge. 1) a coordinated ecosystem attack
| or 2) an individual company doubles down.
|
| There are things that lead me to believe 1) can't happen:
| Google ultimately acquiring Fitbit instead of pursuing
| ecosystem compatibility and integration. It seems like the
| giants need to acquire the pieces to build their own
| Apple, rather than partner with them. Also, Google and
| Microsoft have complementary businesses but they barely
| coordinate on this. The closest thing I have seen is
| Android helping Microsoft with dual-screen Android
| devices. In most other areas they are set up to compete
| (ChromeOS vs. Windows, Azure vs. GCP, Stadia vs. Xbox
| cloud, G Suite vs. Office, Bing vs. Google Search, etc.).
|
| 2) Samsung is the most equipped to lead this.
| josephg wrote:
| How far behind Apple are AMD at the moment? If Ryzen
| chips were optimized for low power and manufactured on
| TSMC's 5nm process like the M1, what sort of performance
| difference would we be seeing?
| 1123581321 wrote:
| There's also 3) which is another sea change in computing,
| something that would lead to current device categories
| becoming irrelevant, and a currently overlooked company
| positioned to capitalize on it.
| sim_card_map wrote:
| > they won't forever. They never do.
|
| Retina iMacs and MacBook Pros beg to differ.
|
| 8-9 years later, still no good alternative.
| spockz wrote:
| You would need to think quite far out of the box and hit it
| to be able to beat a vertically integrated, extremely well-
| funded, and successful company.
| firebaze wrote:
| I guess it's not black magic Apple is doing right here.
| From my experience with big companies, Intel just got
| buried in its processes, approvals, middle management,
| etc.; they still have the talent, and in the past years
| there wasn't any serious competitor to them.
|
| The dual wake-up call from AMD and from Apple (ARM),
| combined with the money Intel has in its pocket, will have
| a serious influence on the CPU market. Unsure if they'll
| come out ahead, but it will get interesting and probably
| cheaper, not only for consumers.
| reddog wrote:
| Like IBM circa 1968?
| cromulent wrote:
| Well, the world was changing and new opportunities arose.
| I think it's different now compared to then.
|
| In 1968 there weren't that many computing devices around.
| When video games and home computing came around, there
| were massive opportunities and the entrenched players
| missed some of them.
|
| Same with mobile phones, same with the internet, same
| with smartphones.
| zepto wrote:
| There are lots of very rich companies that claim to want
| to compete but choose not to take the risks Apple did, of
| investing in their own vertical integration.
| bombcar wrote:
| Each individual risk Apple takes is small - they didn't
| develop the M1 from scratch; they have years of iPhone
| and iPad processor development under their belt.
| jimbokun wrote:
| The risk of Apple deciding to design their own chips was
| massive.
| cma wrote:
| Apple started working with ARM in the 80s, eventually
| used in the Newton. I don't know that they did design
| directly, but they influenced the changes that became
| ARM6.
| zepto wrote:
| The individual risks are small now, but they weren't when
| Apple was recovering from near death.
| google234123 wrote:
| ARM also has some decades of cruft attached to it.
| danlugo92 wrote:
| No, the architecture used in the M1 is from 2012.
| r00fus wrote:
| ARMv8, i.e. AArch64, is a new redesign, entirely 64-bit
| with a legacy (32-bit) support mode tacked on. The design
| was done so that the legacy mode can be removed without
| another redesign.
|
| To my understanding, this is quite unlike AMD64 (i.e.
| Intel's current 64-bit ISA, licensed from AMD), which
| extended x86.
| astrange wrote:
| The 32-bit mode is optional; M1 doesn't even support it.
| amelius wrote:
| > The opportunity for competition exists
|
| Perhaps, but I guess it would take on the order of a decade
| for a competitor to build their technology including an
| ecosystem with customers.
|
| And who says they won't copy Apple's business practices?
|
| > but everyone else is just sitting on their asses while
| Apple's chip team puts in the work.
|
| That's because the complexity of the semiconductor industry
| is daunting. You need far more than a team of IC designers,
| especially if you want your product to become successful in
| a world where there is one main player.
| mistersys wrote:
| Sure, if someone is starting from scratch. We have
| Microsoft, Android and Linux, which are alternative
| computing platforms, and Android + Linux are already
| fully on ARM. Apple's chips are still much faster than most
| Android ARM chips, but Android ARM manufacturers are not
| starting from zero.
| Bud wrote:
| You mean the business practices wherein Apple is 10x
| better on privacy than everyone else to the point that
| Facebook and Google are now being forced to be honest
| with users, disclose all the user data they are stealing
| and selling, and disclose all the privacy and security
| risks inherent in their business models?
|
| Sign me up for more of those kinds of "business
| practices", thank you very much.
| [deleted]
| skinnymuch wrote:
| Apple is reportedly beefing up their mobile ads business.
| Hard to say privacy is the reason when Apple immediately
| attempts to benefit directly financially and likely with
| some privacy hits.
| Bud wrote:
| Would be helpful if you could support this with some
| links and evidence. I monitor this issue a lot, and
| haven't seen anything that presents added privacy risks,
| at least not yet.
| dep_b wrote:
| Might be referring to App Store ads
| dr-detroit wrote:
| Intel is one really _FUN_ Justin Long commercial away from
| disrupting Apple's market share, but slow and steady wins
| this race for domination.
| sunstone wrote:
| I'm sure Linux will run on these chips as well. If I'm not
| mistaken, Linux has already booted on the M1.
| amelius wrote:
| For now. What happens when Linux or software that runs on
| Linux starts competing more broadly with Apple?
| IOT_Apprentice wrote:
| I'm unclear what that would mean exactly. Some particular
| market space where a Linux-only based software package is
| the dominant solution in a business or consumer space? Do
| you mean a Chromebook?
| amelius wrote:
| Let's say some game producer starts making games that run
| on their Linux platform that in turn runs on Apple
| hardware.
| flohofwoe wrote:
| Give Mac software a few years to "catch up" and everything
| will feel just as slow as usual (and I'm not even joking; so
| far, software has always been excellent at undoing any
| advancement in hardware very quickly).
| crazygringo wrote:
| But since so much software these days is cross-platform,
| apps/sites will still have to work performantly on Intel
| chips. E.g. Google can't slow down Chrome to take advantage
| of Mac speed increases, because it would become unusable on
| Windows.
|
| So I actually think that Mac software will hold the
| performance edge for a long, long time.
| nexuist wrote:
| It logically makes sense for Windows to move to ARM as well
| at this point. They already ported it with Windows RT, and
| now all they have to do is provide a translation layer like
| Rosetta and release the full Windows experience (instead of
| the locked down version RT was).
| crazygringo wrote:
| But there's no M1 chip equivalent for PC's.
|
| It's fine to have an ARM version of Windows but there's
| no equivalently high-performance ARM chip to run it on.
| Unless you're talking about Bootcamp on Macs.
| coliveira wrote:
| We don't even need to wait for that. Right now web developers
| are creating the new generation of web-based apps that will
| consume everything the M1 can give and more. In fact, these
| apps already exist; they were just largely ignored because
| they're so power-hungry (just look at Electron and similar
| stuff).
| mrweasel wrote:
| Even worse: developers will get an M-series Mac and write
| Electron apps that perform okay. They will then ship them
| as cross-platform apps that will absolutely suck on any
| other platform.
| imtringued wrote:
| Where is that inefficiency supposed to come from though? I
| mean there are four big inefficiencies that came with JS.
|
| 1. Garbage collection and minimum runtime
|
| Garbage collection decreases memory efficiency, and browsers
| need a certain minimum amount of resources to render a
| page, no matter how complicated, leading to 500MB-RAM text
| editors like Atom, and worse once you open files. Similar
| problems plague Eclipse and IntelliJ, which both often
| consume 1GB of RAM. The JVM often needs 150MB for an empty
| or perfectly optimized app.
|
| 2. Everything is an object with pointers
|
| This is especially bad in JavaScript, where every object is
| basically a hashmap. This causes performance slowdowns
| because even something as simple as looking up a field is a
| pointer chase through several layers now. Raw numerical
| data may consume a lot of memory if you are not using typed
| arrays. Especially bad with ArrayList<Integer> in Java.
|
| 3. JIT
|
| JIT compilers can only spend a certain amount of time on
| optimizations, which means JIT languages tend to either
| suffer from slow start up times or faster start up but less
| optimizations.
|
| 4. GUI complexity
|
| Things like having animations and constantly recomputing
| layouts.
|
| If you designed your processors for these things and made
| them fast at this, the only further source of slowdown is a
| lack of caring because you have already exhausted the
| unavoidable technical reasons. E.g. your processor is so
| fast, you write a naive algorithm that takes 5 seconds on
| the fastest computer available but then takes 20 seconds on
| an old computer.
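|
| To make point 2 concrete, here's a rough sketch in
| TypeScript (a hedged illustration, not a benchmark; the
| sizes and names are made up):
|
|   // An array of objects is an array of pointers to
|   // hashmap-like heap objects; reading p.x is a pointer
|   // chase plus a property lookup.
|   const n = 1_000_000;
|   const points: { x: number }[] =
|     Array.from({ length: n }, (_, i) => ({ x: i }));
|
|   // A typed array is one contiguous buffer of raw 8-byte
|   // doubles; reading xs[i] is just base + i * 8, which
|   // caches and prefetches well.
|   const xs = new Float64Array(n);
|   for (let i = 0; i < n; i++) xs[i] = i;
|
|   function sumObjects(): number {
|     let s = 0;
|     for (const p of points) s += p.x;
|     return s; // chases a million pointers
|   }
|
|   function sumTyped(): number {
|     let s = 0;
|     for (let i = 0; i < xs.length; i++) s += xs[i];
|     return s; // streams through one flat buffer
|   }
|
|   console.log(sumObjects() === sumTyped()); // same result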
| josho wrote:
| I think the next layer is cross platform bridges.
|
| Look at multi-platform products like MS Office and how
| slow they run on a Mac. I suspect it's because there is a
| translation layer to bridge Win32 calls to Mac
| equivalents. That seems like it would be point 5 on
| your list.
| zepto wrote:
| > If you designed your processors for these things and
| made them fast at this
|
| What does that look like?
|
| There is a reason Apple doesn't do #1 and #3 and is
| moving away from #2 in their own code.
|
| They are just inefficient ways of doing things. Designing
| a processor to support an inefficient mechanism will
| still lose out to a processor which doesn't have to.
| Klonoar wrote:
| I don't think "animations" fits here. Well-implemented
| CSS animations are generally fine (though this is in and
| of itself a high skill to do well, I think). If you're
| still driving animations with pure JS, you probably need
| a refresher in modern web dev.
|
| Diffing a VDOM might fit here, but that's not really GUI-
| specific - just a combination of your earlier points.
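|
| To illustrate the split (a hedged sketch; the element and
| distances are made up): a transform animation via the Web
| Animations API can be composited off the main thread, while
| a JS loop mutating layout properties forces style/layout
| work every frame.
|
|   const el = document.getElementById("panel")!;
|
|   // Compositor-friendly: only transform changes.
|   el.animate(
|     [{ transform: "translateX(0)" },
|      { transform: "translateX(200px)" }],
|     { duration: 300, easing: "ease-out" }
|   );
|
|   // Anti-pattern: animating a layout property from JS
|   // triggers relayout on every frame.
|   let x = 0;
|   function step() {
|     x += 4;
|     el.style.left = `${x}px`;
|     if (x < 200) requestAnimationFrame(step);
|   }
|   requestAnimationFrame(step);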
| ogre_codes wrote:
| Maybe for some low hangers, but compiling code will still be
| faster, Xcode and VSCode will be a bit snappier, Safari and
| Messages will be a bit better. These base things need to work
| on a broad range of products.
|
| The fact that Apple writes much of their software for iOS and
| MacOS at the same time means much of it is designed to run on
| fairly light hardware.
|
| I know we're all stuck with bloated stuff like Slack, and some
| of us with MS Office, but just go native where you can and
| reap the benefits of the platform.
| flohofwoe wrote:
| > but compiling code will still be faster
|
| I don't know about that; LLVM is getting slower with each
| new release too [1]. Same for Xcode: startup is very
| noticeably slower than a few years ago on the same machine.
| Starting into a debugging session for the first time is
| _much_ slower (it used to be instant; now it's multiple
| seconds). The new build system in Xcode is most definitely
| slower than the old one, at least for projects with more
| than a handful of compile targets. Etc etc etc... new
| software is only optimized to the point where performance
| doesn't hurt too much on the developer's machine.
|
| [1] https://www.npopov.com/2020/05/10/Make-LLVM-fast-
| again.html
| nicoburns wrote:
| The work that came off the back of that article has mostly
| halted that trend, I think.
| ogre_codes wrote:
| I suppose that's a fair comment. The whole build system
| and IDE slow down when working with Swift. It's possible
| Swift wouldn't exist if it weren't for faster processors.
|
| One nice thing is Swift is slow to build but quick at
| runtime.
| arthur_sav wrote:
| No need to wait, just use Chrome to get a taste of the
| "future".
| jrockway wrote:
| I feel like things have felt pretty good in the PC world
| since NVMe SSDs and approximately Sandy Bridge.
|
| But, I do agree that there is a lot of jank in software that
| we write these days. Over the weekend I started writing an
| application that uses glfx + dear imgui, running on a 360Hz
| monitor. The lack of latency was incredible. I think it's
| something that people don't even know is possible, so they
| don't even attempt to get it. But once you know, everything
| else feels kind of shitty. This comment box in a web browser
| just feels ... off.
| disgruntledphd2 wrote:
| Emacs has given me this feeling for a while. Even over a
| remote connection that is super slow, I am supremely
| confident in every keystroke getting to the server and
| doing what I want.
|
| (Except for lsp/company which occasionally just messes up
| everything).
| jkestner wrote:
| Speed is a feature, said well:
| https://craigmod.com/essays/fast_software/
| legulere wrote:
| There are some hard usability limits where performance feels
| like it has stagnated or gotten worse: when resizing,
| relayouting needs to be faster than the time for one frame
| (usually 16ms). On Windows, seemingly no software manages to
| do that.
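|
| You can see this for yourself with a few lines of
| TypeScript (a minimal sketch; the 60Hz budget is an
| assumption about your display):
|
|   // Log frames that blow the ~16.7ms budget of a 60Hz
|   // display; rAF hands us a high-resolution timestamp.
|   let last = performance.now();
|   function tick(now: number) {
|     const delta = now - last;
|     last = now;
|     if (delta > 1000 / 60) {
|       console.log(`dropped frame: ${delta.toFixed(1)}ms`);
|     }
|     requestAnimationFrame(tick);
|   }
|   requestAnimationFrame(tick);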
| mojo982 wrote:
| You are right. And I hate it.
| mullingitover wrote:
| Software is a gas that expands to fill its container. It's
| just science.
| nemo1618 wrote:
| Even today the M1 still feels slow occasionally -- hanging
| for a second or two, browser taking multiple seconds to
| launch, etc. Granted, that could be due to I/O or power
| management, but in any case it's clear that software is not
| making proper use of hardware.
| pentae wrote:
| What you just described only happens for me with Chrome on
| my M1 MBA. Try loading Safari; it's instant.
| granshaw wrote:
| Could be even worse - web devs writing shitty webapps that
| are fast on their Macs but dog slow on non-Macs
| happymellon wrote:
| Nah, don't worry they are using electron apps to build them
| as well.
| Klonoar wrote:
| If you don't run Electron apps, even an older Mac will fly.
| My 2015 is still astoundingly fast, it just doesn't have
| Chrome bogging it down.
| inDigiNeous wrote:
| This idea that Electron apps are inherently somehow slow is
| starting to bug me. While writing the Electron version of
| our graphics-heavy web application, I noticed that the
| memory usage and CPU consumption are not a lot higher than
| some other native applications.
|
| We have taken care to write fast and memory-friendly
| JavaScript where possible, avoiding slow features of the
| language (the list is long, but things like forEach loops,
| inefficient data structures, etc.) and taking care to
| profile and optimize.
|
| The result is an application that feels almost as fast as
| native and doesn't really consume much memory, even though
| we are doing canvas-based graphics in realtime.
|
| My suspicion is that many web developers (and thus people
| qualified to develop for Electron) just don't have the
| tenacity or background to write efficient code.
|
| You can write slow-ass molasses JavaScript code very
| easily. Just take somebody who has done webdev for maybe
| 2-3 years and doesn't have any deeper CS background. Watch
| what kind of memory-inefficient and slow code structures
| they produce, especially with JavaScript, where you don't
| really understand how heavy an operation like map() can be
| in the worst cases, or where you are creating copies of
| your data multiple times, and voila, you have a slow-ass
| memory-hogging Electron application.
|
| Maybe I should do a blog post about how our Electron
| application is performing, just to show people that you can
| write fast code using JavaScript too. But it takes skill
| and time, and maybe these days what matters is just
| cranking out builds that work somehow.
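|
| As a concrete (hedged) example of the map() point, in
| TypeScript -- the Order type and sizes are made up:
|
|   interface Order { paid: boolean; total: number; }
|   const orders: Order[] = Array.from(
|     { length: 100_000 },
|     (_, i) => ({ paid: i % 2 === 0, total: i })
|   );
|
|   // Chained style: allocates two intermediate arrays
|   // (the filter result and the map result) before reducing.
|   const chained = orders
|     .filter((o) => o.paid)
|     .map((o) => o.total)
|     .reduce((a, b) => a + b, 0);
|
|   // Single pass: same result, no intermediate allocations.
|   let single = 0;
|   for (const o of orders) if (o.paid) single += o.total;
|
|   console.log(chained === single); // true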
| mike_d wrote:
| It isn't just Electron; Node is also plagued by its own
| low barrier to entry.
|
| One of my former employers had the great idea of hiring
| hundreds of "senior" JS devs and redoing the entire
| frontend of the website. When it launched to production
| the whole system scaled at approximately 1 enterprise-
| grade server to 1 concurrent user.
|
| While I applaud your efforts to teach people how to write
| code faster, the majority of JS devs I have found just
| want to hit feature-complete and go home.
| heipei wrote:
| I'm a self-taught JavaScript developer (front- and back-
| end) and I would love to read such a blog post. I mostly
| use language features for their expressiveness (map,
| filter, forEach) and rarely think about their performance
| implications unless there is reason to believe some piece
| of code is in a performance-critical path. However, with a
| guide I might reconsider some scenarios where I'd be
| willing to give up expressiveness for performance.
| N1H1L wrote:
| Please write the blog post and share it to HN. I would
| really look forward to reading it and learning from it.
| coliveira wrote:
| Exactly. I have a 2015 MacBook that works perfectly, unless
| I try to use Electron-based apps.
| Hamuko wrote:
| My early 2011 MacBook Pro definitely isn't fast but it's
| usable for checking out a couple tabs worth of stuff on the
| sofa.
|
| My work 2019 16-inch MacBook Pro actually doesn't feel that
| fast considering how fucking expensive it is.
| tandr wrote:
| My fully-loaded 2019 16-inch MacBook feels VERY
| underpowered actually - once you try to use all the cores
| (multiple compilation threads, parallel tests with
| multiple docker images running), it goes into what I
| would call a "heat shock". I suspect throttling kicks in
| and CPU charts in Activity Monitor become close-to-death
| flatlines. Oh, and the fan noise becomes non-ignorable
| too.
| Hamuko wrote:
| Yeah, full utilization leads to throttling, guaranteed.
| Although apparently it can get much worse than what I've
| had, down to like 1.0 GHz.
|
| I've also had issues with the laptop waking from sleep,
| where it'd either hang for up to minutes, or just
| outright kernel panic. Big Sur seems to have fixed the
| kernel panics but waking from sleep still feels really
| slow.
| knolan wrote:
| My 16" had two speeds.
|
| Quiet and cool with about 11 hours of battery life when
| writing, reading or some simple productivity stuff.
|
| Fast but loud and hot when doing anything else with about
| 3 hours of battery life. Even the Touch Bar (which I
| don't hate, but I also don't love) is uncomfortable to
| the touch.
|
| It seems there is no in between. I'm generally happy with
| the machine, but I'm very interested in what's next for
| the big MBP.
| Klonoar wrote:
| You know, I believe you on the 2019 - I've heard this
| from more people than I care to admit, and it makes me glad
| I skipped that year.
|
| I think 2016-2019 was a rough era for the MacBook
| Pro when you factor in price for performance; still great
| machines, but they didn't feel as polished as the 2015
| MBP or the 2020 M1s.
|
| Edit: year banding.
| selectodude wrote:
| My 2016 MacBook Pro has been an absolute pleasure to use.
| It's small, light, great battery life, plenty of
| performance, I consider it the second incarnation of my
| favorite laptop ever, the Titanium PowerBook G4.
|
| Except for the keyboard. Been replaced three times. It's
| defective by design. Such a letdown.
| taysondev wrote:
| I actually used a MacBook Air 2011 up until summer 2020 as
| my main home computer. The only reason I even bought a new
| MacBook was because I couldn't get the newest version of
| macOS which I needed for the latest version of Xcode.
|
| Finally sold the 2011 MacBook Air a few months ago for
| $300. I got so much value out of that laptop.
| Toutouxc wrote:
| My old 2013 is still a charming little machine. I made my
| GF "upgrade" to it from a random ~2017 HP plasticbook and
| she loves the thing. 4 GB of RAM can still fly if you only
| run native apps.
| mrweasel wrote:
| You can even go back to 2013 and not notice that you're
| using a 7-8 year old machine. A MacBook Pro from late 2013
| runs Big Sur and performs fine for most everyday use cases
| and any development work that doesn't require an IDE.
| You just can't run Chrome all that well, which should tell
| you more about Chrome than about Macs.
| inDigiNeous wrote:
| Still using a mid-2014 MacBook Pro with 16 gigs of RAM
| and a 2.5 GHz quad-core i7 as my daily driver. Everything
| performs just fine for me, and macOS has even gotten faster
| during the time I've used the laptop, which is something
| to give Apple credit for.
|
| I use mainly VSCode these days though and Firefox, but
| even Unity game engine works just fine on this machine.
|
| Any IDE I also throw at this machine works just fine.
| Only things that feel slow, are things that need a beefy
| GPU.
| vladvasiliu wrote:
| > and any development work that doesn't require an IDE
|
| Probably depends on the IDE and the language. My
| impression is that newer computers (especially in the
| "consumer" / "general public" market) have especially
| improved efficiency. They're not much faster, but they
| last longer on smaller batteries.
|
| My late 2013 15" (2.3 GHz / 16 GB RAM) works great with
| Pycharm for moderately sized projects. It's even usable
| for personal Rust projects (with the intellij-rust
| plugin).
|
| For Rust it's somewhat slower than my i5-8500 desktop but
| not by much. For incremental compiles, I don't feel like
| I'm waiting around more. The i5 mainly wins when it can
| use all six cores.
|
| It's however quite a bit faster than my work HP ProBook
| with an i5-8250U which is a much newer machine (2018 I'd
| say).
|
| Aside from battery life, which is somewhat lesser, all in
| all the mac is a much, much better machine than the HP,
| especially for "creature comforts": gorgeous screen, no
| misaligned panels rubbing against my wrists, great
| trackpad, no random coil whine, inaudible fans unless it
| starts compiling for a while, no background noise on the
| headphone out, integrated optical output when I want to
| use my home stereo.
| hugi wrote:
| I do all my work (Java/Postgres, etc.) on a 2012 i7 Mac
| Mini. It used to be my home media server but I started
| using it as a "temporary machine" last year after killing
| my 2018 15" MBP by spilling water over it. I was planning
| to replace it with an Apple Silicon machine once they
| became available, but it's performing so well I'm more
| than happy to keep using it while waiting for a 15" or
| 16" M1x/M2. Amazing little machines.
| AceJohnny2 wrote:
| Is there an equivalent to Parkinson's Law ("Work expands to
| fill the available schedule"), where software expands to fit
| the available performance?
| flohofwoe wrote:
| "Software is a gas; it expands to fill its container." -
| Nathan Myhrvold
|
| PS: apparently the quote is from 1997, go figure.
| jml7c5 wrote:
| https://en.wikipedia.org/wiki/Wirth's_law
| SSLy wrote:
| This is called "What Andy giveth, Bill taketh away."
| homarp wrote:
| and this is now called Electron:
|
| trading 'hardware optimization' for speed of development.
|
| The Mac being a small market, some users are still
| opinionated/loud enough to reward nicer/native UI vs. cross-
| ish platform lowest-common-denominator UI. Maybe there is
| hope the ARM Mac stays fast.
| mixmastamyk wrote:
| + macOS checking every binary you run over the internet.
| walterbell wrote:
| Pihole and others can block those Apple queries.
| bonestamp2 wrote:
| What happens if you block them.. do they eventually stop
| working if they haven't been validated in a long time?
| Klonoar wrote:
| It's slightly more nuanced than that - you can of course
| just block it from doing so, and there's certainly an
| argument for it being updated to not need a network call
| each time, but phrasing it like this makes it sound worse
| than it actually is.
|
| I'll just quote Jeff Johnson, who's looked into this and
| written about it - his comment[1] on this post is
| quite useful:
|
| https://eclecticlight.co/2020/11/25/macos-has-checked-
| app-si...
|
| >The request to http://crl.apple.com/root.crl is simply
| checking the revocation status of Apple's own Developer
| ID Certification Authority intermediate signing
| certificate. If you examine the cert in the System Roots
| keychain, you can see that URL under CRL Distribution
| Points. This request contains no information specific to
| third-party developers or apps. In contrast, there's no
| CRL for third-party Developer ID leaf certs signed by
| Apple's intermediate cert. Their status is only available
| via OCSP.
|
| Notably, this:
|
| >This request contains no information specific to third-
| party developers or apps.
|
| https://eclecticlight.co/2020/11/25/macos-has-checked-
| app-si...
| whatever1 wrote:
| I would say that 90% of my PC resources are consumed
| executing JavaScript code. If we need dedicated hardware to
| execute this more efficiently, so be it. We do it anyway
| for so many other domains: video, graphics, encryption,
| etc.
| rmorey wrote:
| There basically already is in the M1:
| https://www.anandtech.com/show/16226/apple-
| silicon-m1-a14-de...
|
| tl;dr it has really sick double-precision floating-point
| performance, which directly translates to JS performance
| [deleted]
| Jetrel wrote:
| One of the big deals with Electron et al. is that we're
| not married to _JavaScript_, really; we're married to
| "the DOM and CSS" as our GUI, and to things like
| FF/Chrome/Safari devtools to develop that GUI.
|
| Most JavaScript developers are already using Babel et al.
| pipelines to build Electron apps, which are already
| transpiling between major variants of JavaScript, and I
| wouldn't be at all surprised to see a thing where it gets
| compiled into WebAssembly rather than interpreting
| JavaScript. I also think there's a thing, right now,
| where it's possible to build Electron apps with
| Rust + WebAssembly; I'm not sure, but I think the main
| thrust here is that it definitely would eliminate a huge
| chunk of the slowdown.
|
| I guess the main takeaway is just that the development
| revolution that's happened recently is mainly about how
| insanely good browser dev tools have become, and not
| about javascript - javascript was just along for the
| ride. As an aside - I recently saw a video of someone
| demoing one of the old Symbolics LISP workstations, and I
| was shocked to realize how much they had in common with a
| modern browser console - specifically of being able to
| inspect all of your gui components, live, and look at all
| the properties set on them. It's provided a hell of a way
| for me to explain what the deal was with all the old
| gurus in the 80s "AI Winter" diaspora who were pretty
| butthurt about having to move from programming in that
| environment, to having to write C on DOS (or whatever
| else paid the bills at the time).
| imwillofficial wrote:
| Yep, see the success of the Nova editor.
| homarp wrote:
| discussion of Nova Editor:
| https://news.ycombinator.com/item?id=24495330
| Hamuko wrote:
| It's a success?
| imwillofficial wrote:
| Yes
| vondur wrote:
| From some of the demos I've seen on YouTube, the Nvidia
| Jetson AGX would make for a really nice desktop. If Nvidia
| could release a desktop-oriented ARM machine with its
| graphics chips integrated, it could make for a really nice
| workstation.
| sschueller wrote:
| But it's not really the chip; we have powerful ARM CPUs
| everywhere except on desktops. It's the combination of
| software and the chip that makes this a huge computing jump.
| No one wanted to invest in rewriting software to switch away
| from x86.
| wayneftw wrote:
| Exactly this. Nobody cares about the M1 except for tech
| enthusiasts and Apple fans.
|
| The problem is that you effectively have to run macOS to take
| advantage of it and that's a no-go for a wide variety of
| people. I don't even care if I can ever run Windows or Linux
| on an M1 because Apple will make it a pain in the ass to do
| so. They don't even have a fucking BIOS UI... Imagine
| enterprise IT shops rolling Windows out on that garbage?
| It'll never happen.
|
| And I don't want ARM at all unless the systems built around
| it are as modular and open as the x86 ecosystem.
| huitzitziltzin wrote:
| I think you are exactly right, but we should celebrate that!
|
| This kind of competitive pressure will inspire a response from
| Intel and other firms.
|
| The result I would predict and hope for in a few years would be
| better chips from everyone in the market.
| neogodless wrote:
| I'm not saying what you think is wrong, but your view seems
| like it may be narrow, and missing the bigger ecosystem.
|
| Intel has certainly dominated consumer computer sales over
| the past decade, and until 4 years ago they were largely
| still selling the best chips for most consumer use cases
| (outside of mobile.) Intel had several missteps, but I don't
| think their dominant position was the only source of their
| problems, or simply that they thought they didn't have to
| try. They legitimately had some bad decisions and engineering
| problems. While the management is now changing, and that
| might get their engineering ducks in a row, the replacement
| of Intel with Apple Silicon in Apple's products is not likely
| to be some kind of "waking up" moment for Intel, in my
| opinion. Either they'll figure out their problems and somehow
| get back on an even keel with other chip designers _and_
| fabrication, or they won't.
|
| Meanwhile, other competitors in x86 and ARM also have a
| short-term history of success and failure, again regardless
| of what Apple is doing. And the timelines for these plans of
| execution are often measured in the scale of two to three
| years, and I'm not seeing how Apple successfully designing
| CPUs would change these roadmaps for competitors.
|
| For everyone involved, particularly those utilizing TSMC,
| there are benefits over time as processes improve and enable
| increases in performance and efficiency due to process rather
| than design, and the increased density will benefit any chip
| designers that can afford to get on newer processes.
|
| I guess if I'd attempt to summarize, it's not clear who is
| motivated and able to compete against Apple in ARM design. In
| other words, is there a clear ARM market outside of iOS/macOS
| and outside of Android (where chip designers already
| compete)? And in the Linux/Windows consumer computing space,
| there's going to be a divide. Those that can accept a
| transition to macOS and value the incredible efficiency of
| Apple Silicon will do so. Those that continue buying within
| their previous ecosystems will continue comparing the
| options they have (Intel/AMD), where currently chips are
| getting better. AMD has been executing very well over four
| years now, and Intel's latest chips are bringing solid gains
| in IPC and integrated GPU performance, though they still have
| process issues to overcome if they wish to catch back up in
| efficiency, and they may also need to resolve process issues
| to regain a foothold in HEDT. But even there, where AMD seems
| pretty clearly superior on most metrics, the shift in market
| share is slow, and momentum plus capacity give Intel a lot of
| runway.
|
| The only other consideration is for Windows to transition to
| ARM, but there's still a bit of a chicken and egg problem
| there. Will an ARM chip come out with Apple Silicon-like
| performance, despite poor x86 emulation software in Windows
| when run on ARM? Or will Microsoft create a Rosetta-like
| translation software that eases the transition? I'm not clear
| on what will drive either of those to happen.
| cainxinth wrote:
| All very impressive, but here's my question: what are they going
| to do about graphics cards? Will they find a way to connect
| existing graphics cards to their CPU? Will they make their own
| ARM-based graphics cards? Will AMD or Nvidia?
| reasonabl_human wrote:
| They are R&D'ing their own GPUs to vertically integrate
| according to some rumors from my Apple friends.
| NelsonMinar wrote:
| Why would Apple build fancy graphics cards? They have no
| meaningful gaming market and haven't cared about it for years.
| For machine learning?
| alkonaut wrote:
| They don't need to build them, but they need their machines
| to be able to use them (for the same reasons their current
| pro machines use them).
| Synaesthesia wrote:
| They already have. Their integrated graphics now rival that
| of discrete gaming laptops.
| hu3 wrote:
| Rival how?
|
| A Surface Book 3 with an Intel processor and an outdated
| Nvidia GTX 1650 Ti runs laps around the M1 in games; almost
| 2x the performance. I'm not even going to compare it to
| laptops with modern GPUs.
|
| https://www.anandtech.com/show/16252/mac-mini-
| apple-m1-teste...
| jeroenhd wrote:
| Nvidia? Ha, never in a million years.
|
| Support for one of the recent Radeons was recently added to
| macOS, so it's a possibility. No reason the M1 can't do PCIe;
| as far as I know, the only thing keeping eGPUs from working
| on the M1 right now is software support. It could also be
| that the driver was added because of the extensibility of
| the Pro, though.
|
| My expectation is that they'll keep the GPU on the same level,
| which is "good enough for most users", and focus on hardware
| acceleration for tasks like video and audio encoding and
| decoding instead. With an ML chip and fast audiovisual
| processing, most consumers don't need a beefy GPU at all, as
| long as you stick to Apple's proprietary standards. Seems like
| a win-win for Apple if they don't add in an external GPU.
| robenkleene wrote:
| And are you thinking the solution for people who do need a
| powerful GPU is eGPUs and Mac Pros?
| jeroenhd wrote:
| I don't think Apple cares much for those people, they can
| buy the Mac Pro or a PC if they really need the GPU power.
|
| eGPUs can be a nice addition, but I doubt Apple will
| release an official eGPU system. You're already limited to
| AMD GPUs after the clusterfuck of a fight Apple and Nvidia
| had, and I doubt Intel's Xe lineup will receive much love
| for Apple right after the Intel CPUs have been cut from
| Apple's products.
|
| Honestly, for the kind of work that does need an arbitrary
| amount of GPU horsepower, you're barking up the wrong tree
| if you buy Apple. Get yourself a MacBook and a console or
| game streaming service if you want to play video games, and
| get yourself a workstation if you want to do CAD work.
|
| I don't think the work Apple would need to put into a GPU
| solution would be worth it, financially speaking.
| culturestate wrote:
| _> I doubt Apple will release an official eGPU system_
|
| They already have one[1], and you can even buy eGPUs from
| the Apple Store[2].
|
| 1. https://support.apple.com/en-us/HT208544
|
| 2. https://www.apple.com/sg/shop/product/HM8Y2B/A/blackma
| gic-eg...
| jakeva wrote:
| That's a Radeon Pro 580; AFAIK this eGPU offering hasn't
| been updated in several years.
| robenkleene wrote:
| How would you fit Apple's AR/VR ambitions into this
| perspective? (I.e., given AR/VR has steeper GPU
| requirements, both on the consumption and creation side.)
| Pulcinella wrote:
| Well, unless Apple can pull an M1 and do with their GPUs
| what they did with their CPUs: start to embarrass Nvidia
| and AMD with lower-power, higher-performance GPUs.
| Pulcinella wrote:
| Yeah I imagine the Radeon support was for the Pro and the
| existing Intel Macs (though I don't know if those Radeon GPUs
| are really supported via eGPU. Are there enclosures where
| they fit?)
|
| Still, I can't see Apple only developing one integrated
| GPU per year unless they somehow figure out how to
| magically make them approach Nvidia's and AMD's modern
| chips. What would the ARM Mac Pro use?
|
| It seems that Apple has put a lot of development resources
| into getting Octane (and maybe Redshift and other GPU-
| accelerated 3D renderers) to support Metal (to the point
| where it sounds like there may have been Apple Metal
| engineers basically working at Otoy to help develop Octane
| for Metal), and I can't imagine that happening just to
| support the Apple Silicon GPUs. I wouldn't be surprised if
| we see eGPU support announced for ARM Macs at WWDC (and
| maybe even for the iPad Pros that support Thunderbolt.
| Yeah, the idea of plugging your iPad into an eGPU
| enclosure is funny, but if it's not too hard to implement,
| why not?)
| volta83 wrote:
| > Still, I can't see Apple only developing one integrated
| GPU per year unless they somehow figure out how to
| magically make them approach Nvidia's and AMD's modern
| chips. What would the ARM Mac Pro use?
|
| What do Mac users need a beefy GPU for?
|
| AFAICT Apple just needs a GPU that's good enough for most
| users not to complain, integrated Intel-GPU style.
| Pulcinella wrote:
| What I said before: 3D rendering (and video processing and
| anything else you might want a powerful GPU for).
| bredren wrote:
| > It seems that Apple has put a lot of development
| resources into getting Octane to support Metal...and I
| can't imagine that happening just to support the Apple
| Silicon GPUs.
|
| At the start there will still be a lot more Mac Pros
| running AMD hardware that must be supported.
|
| It may not be obvious, but Apple has repair work to do in
| the pro community. Four years ago this month, Apple
| unusually disclosed that it was "completely rethinking the
| Mac Pro." [1]
|
| This new Mac Pro design wasn't announced until June of 2019
| and didn't hit the market until December 10th of 2019.
| That's just _six months_ prior to the Apple Silicon
| announcement.
|
| So, unless Apple simultaneously was trying to honor pro
| users while also laying plans to abandon them, it is hard
| to imagine that Apple spent 2017-2019 designing a Mac Pro
| that they would not carry forward with Apple Silicon
| hardware. Keep in mind, the company had just gotten through
| a major failure with the Gen 2 cylindrical Mac Pro design.
|
| The current, Gen 3 2019 Mac Pro design has the Mac Pro
| Expansion Module (MPX). This is intended to be a plug-and-
| play system for graphics and storage upgrades. [2]
|
| While the Apple Silicon SoC can handle some GPU tasks, it
| does not seem suited to the type of work that big discrete
| cards have generally been deployed for.
|
| There is already a living example of a custom Apple-
| designed add-in card: Apple designed and released
| Afterburner, a custom "accelerator" card targeted at video
| editing, with the Gen 3 Mac Pro in 2019.
|
| Afterburner has attributes of the new Apple Silicon design
| in that it is proprietary to Apple and fanless. [3]
|
| It seems implausible Apple created the Afterburner product
| for a single release without plans to continue to upgrade
| and extend the product concept using Apple Silicon.
|
| So, I think the question isn't if discrete Apple Silicon
| GPUs will be supported, but how many types and in what
| configurations.
|
| I think the Mac Mini will retain its shape and size, and
| that alongside internal discrete GPUs for the Pro, Apple
| may release something akin to the Blackmagic eGPU products
| they collaborated on for the RX 580 and Vega 56.
|
| While possibly not big sellers, Apple Silicon eGPUs would
| serve generations of new AS notebooks and minis. This
| creates a whole additional use case. The biggest problem I
| see with this being a cohesive ecosystem is the lack of a
| mid-market Apple display. [4]
|
| [1] https://daringfireball.net/2017/04/the_mac_pro_lives
|
| [2] https://www.apple.com/newsroom/2019/06/apple-unveils-
| powerfu...
|
| [3] https://www.youtube.com/watch?v=33ywFqY5o1E
|
| [4] https://forums.macrumors.com/threads/wishful-thinking-
| wwdc-d...
| raghavtoshniwal wrote:
| Nit: Afterburner is built on FPGAs, which are
| architecturally different from the M-series chips and
| GPUs.
| Hamuko wrote:
| Kinda feels like Apple's choice at the moment is just their own
| integrated GPUs. eGPU is also a possibility.
| asaddhamani wrote:
| Will probably not be great for battery life
| akmarinov wrote:
| Option 1 - yes, they will
|
| Option 2 - no, but does it matter? It's not like the previous
| gen Macs had great GPUs and no one is gaming on a Mac anyway.
|
| Option 2.5 - bring back eGPU
| bogwog wrote:
| > no, but does it matter? It's not like the previous gen Macs
| had great GPUs and no one is gaming on a Mac anyway.
|
| True, but previous Macs were never really competitive with
| PC alternatives on the hardware side, since they all used
| the same chips, just with a higher price tag. With the M1,
| that's starting to change, and Apple has the opportunity to
| attract a much larger customer base for the Mac than it
| ever has.
|
| And of course, they're much more interested in gaming
| nowadays thanks to iOS. Maybe not interested enough to
| swallow their pride and apologize to Nvidia for giving them
| the finger, but probably enough to at least stick a beefier
| GPU into Macs.
| needle0 wrote:
| Even putting aside the performance issue, Apple and gaming
| have never quite worked well together.
|
| Apple's modus operandi of quickly and frequently
| deprecating old architectures and requiring app developers
| to constantly keep up goes down very badly with the
| traditional video game development model - of spending time
| and budget finishing up one game, releasing a few patches,
| then moving on to the next game with little further upkeep
| or maintenance of the now-done game. (Yes, games as a
| service is more common nowadays, but a huge number of games
| still go by this old model.) This model relies on long-term
| compatibility of old binaries on the platform being pretty
| stable, which is fairly true for consoles and Windows, but
| Apple platforms are anything but.
|
| There are massive piles upon piles of only slightly old
| games that are not just unsupported but simply refuse to
| run on both the iOS App Store and Steam for Mac (including
| Valve's own back catalog!), due to the abandonment of
| 32-bit binary support a few versions back. And even if the
| developer is willing to do bare minimum upkeep work to
| recompile an old game and make it run on current hardware,
| chances are that between the time of release and now, lots
| of new mandatory hoops (e.g. natively supporting a certain
| screen size) have been added to the app store checklist, so
| passing store certification requires tons more work than a
| simple recompile, further deterring the dev.
|
| Perhaps you could chalk it up to the dev being lazy for not
| doing regular maintenance of the game, but the rest of the
| game industry doesn't force you to do that; only Apple
| does.
| cainxinth wrote:
| You also need GPUs for rendering video, which people do use
| Macs for.
| agloeregrets wrote:
| To make a Mac Pro-scale system with real gains, they would
| roughly need the equivalent of 9x the number of performance
| cores of an M1 (~36 to 48 cores). If they were to scale the
| GPU in the same way, you are looking at a 72-core GPU with
| over 23 TFLOPS (FP32). They could also find room in clock
| speeds and 5nm+ to get an additional 10% out of it, I
| imagine. In general that would be enough for many, but I
| wouldn't be too surprised to see them do something more
| exotic with their own GPU.
| AtlasBarfed wrote:
| So, M(n)+ or M(n+1) ?
|
| "tentatively known as the M2"
|
| Blasphemy! Plus then N+1!
| alberth wrote:
| Besides the radio modem (Qualcomm) that Apple is quickly
| replacing with their own, are there any other tech/chips
| inside Apple SoCs that they don't design themselves?
| poyu wrote:
| Oh I think _they are_ getting into the radio chip business.
|
| https://www.apple.com/newsroom/2021/04/apple-commits-430-bil...
| dan1234 wrote:
| Here's hoping this chipset will support 32GB+ RAM and more than 2
| displays!
| canuckintime wrote:
| Edit: double posted https://news.ycombinator.com/item?id=26957269
| indigo945 wrote:
| This is not an oxymoron. You can both feel that the M1 chip is
| superior to previous designs in most aspects, and admit that it
| is lacking in others.
| canuckintime wrote:
| I consider the ability to drive more than one external
| display to be directly related to the power and design of the
| chip.
| herrkanin wrote:
| People can be legitimately impressed by the power and
| efficiency of Apple's first desktop-class processor while
| also understanding that certain more niche features were
| out of scope for a first version. I'm certainly expecting
| this to be fixed by the second generation, and if it's
| still missing I won't be quite as understanding.
| canuckintime wrote:
| > People can be legitimately impressed by the power and
| efficiency of Apple's first desktop-class processor while
| also understanding that certain more niche features were
| out of scope for a first version. I'm certainly expecting
| this to be fixed by the second generation, and if it's
| still missing I won't be quite as understanding.
|
| I'm responding to an HN commenter who was not just
| impressed by the power of the M1 but hyperbolically
| asserted that it is better than everything else, yet the
| next top-voted HN comment demonstrates otherwise with a
| demand for a downgraded feature. The tenor of those
| reactions is opposed, and my aim is to reflect that nuance.
| mortenjorck wrote:
| This would be quite an accelerated timeline if Apple ships its
| second-generation M-series chip only eight months after the
| first. Typically, they've followed a sort of six-month tick-tock
| pattern for the A-series, launching a new major revision in the
| fall with the new iPhone, and launching an "X" revision in the
| spring with new iPads.
|
| I think most observers have been expecting an "M1X" for Apple's
| first pro-oriented ARM Macs, so an M2 already would be a
| surprise.
| _the_inflator wrote:
| Imagine they put MacBooks on a yearly upgrade cycle like
| the iPhone - OMG, that would be impressive.
| gtirloni wrote:
| I don't know if you're being serious but, given the lack of
| improvements in chip design lately, that would indeed be
| impressive.
|
| I don't mind upgrading every other year. I just want the
| upgrades to be meaningful.
| Tagbert wrote:
| It's not that you, as a user, need to upgrade, but Apple
| could upgrade the SoC in their machines each year. It's
| like the phone: small incremental updates each year. If you
| wait a few years to buy a new one, the change feels
| dramatic.
| ogre_codes wrote:
| Apple forecast a 2-year transition a year ago at WWDC. That
| means they need a processor for their base laptops and
| consumer desktops, one for their high-end laptops and many
| of their pro desktops, and arguably one to replace the Xeon
| in the iMac Pro and Mac Pro.
|
| Unless they are going to use this same CPU for the Mac Pro,
| this is right on schedule.
| clajiness wrote:
| > "This would be quite an accelerated timeline if Apple ships
| its second-generation M-series chip only eight months after the
| first."
|
| The M1 was available on Nov. 17th, 2020. The article states
| that the chip is entering mass production, and due for release
| sometime in 2H. This could easily be released a year after the
| M1, if not 13 months later.
| cmsj wrote:
| I'm not sure you'll see a true second-generation chip; I
| would expect it to be mostly the same thing, but with more
| cores and some solution for providing more RAM.
|
| Having said that, Apple does have something of a history of
| pushing out v1 of a product that sets a high bar for everyone
| else to try and catch up with, then immediately pushing out a
| v2 that raises the bar well above where everyone else was
| aiming.
|
| Overall though, it's awesome that Macs now get to benefit
| from the vast investment that goes each year into making
| faster/better CPUs for hundreds of millions of new
| iPhones.
| rbanffy wrote:
| > I think most observers have been expecting an "M1X"
|
| An M2 name implies some architectural differences like extra
| cores or more external bandwidth. I'd be totally happy with an
| M1x with some tweaks like more external connectivity and more
| memory.
|
| Which, for me, would be quite perfect. The only reason I'm
| holding back this purchase is the 16 GB memory limit.
| larkost wrote:
| Same here. I want a replacement for my 27in iMac and would
| have held my nose at the slightly smaller screen, but I
| really want more memory than 16GiB (Docker, etc...).
|
| So Apple will just have to wait to get my money until fall
| (or whenever they announce the successor to the 27in iMac).
| thehnguy wrote:
| I'm excited for the 27" (or maybe it will be 29") variant.
| Reason077 wrote:
| > _" I think most observers have been expecting an "M1X" for
| Apple's first pro-oriented ARM Macs"_
|
| I'm pretty sure that's what this is, rather than a next-
| generation ("M2") chip. It will likely have the same CPU
| core designs as the M1, just more of them. And possibly be
| paired with the new, long-rumored "desktop class" Apple
| GPU.
| rsynnott wrote:
| I'm not sure about that, because the M1 cores are, in
| essence, quite old at this point; they're pretty similar to
| the A12 cores. Apple usually does a major microarchitecture
| refresh every few years; it's probably about time.
| simondotau wrote:
| Given the timing, I doubt it. Apple has established a fairly
| stable cadence of core improvements every 12 months. Enough
| time has elapsed between the M1 and this new chip that I'd
| expect it to have more in common with the "Apple A15"
| generation SoC than the A14/M1.
|
| As for Apple's choice of marketing name, that's entirely
| arbitrary. (For what it's worth, my guess is they're
| ditching "X" suffixes and will designate higher-spec
| variants with other prefix letters, e.g. a "P1" chip.)
| ksec wrote:
| Considering the M1 doesn't support LPDDR5, and the A15 will
| arrive in a similar time frame, I would not be surprised if
| it's an M2 (based on the A15), or more likely an M2X.
| [deleted]
| pfranz wrote:
| When they switched to Intel they released the first MacBook
| Pros in January 2006 (32-bit Core Duo) and in October 2006
| shipped 64-bit Core 2 Duos.
| gregoriol wrote:
| Some people (a lot?) are now waiting for the new Pro
| devices and won't buy until then.
| EricE wrote:
| I'd buy a MacBook Air in a heartbeat if I could get 32GB of
| RAM in it. RAM is the only thing causing me to turn up my
| nose at the M2.
|
| If they had released the new iMac with a bigger panel, so
| there were options for the 27" as well as the former 21",
| then my mother would be rocking a new iMac next month.
|
| I know they said up to two years for the transition but I
| want to transition now :)
| thehnguy wrote:
| We're in a brave new world. The early prognosticators
| thought the M1 was more of a proof of concept (shove it
| into existing designs to get it out there).
|
| But now we know that it was always intended to be a real player
| (it's in the iMac and iPad Pro).
|
| So this news is interesting to me because now it seems to
| cut back the other way: maybe the M1 was designed for a
| short shelf life.
|
| In a world where Apple controls so much of its own stack,
| "normal" design and build timelines will be upended.
| dijit wrote:
| > The early prognosticators thought the M1 was more of a
| proof of concept (shove it into existing designs to get it
| out there).
|
| Which is a weird take when you consider the thermal issues
| that Intel Macs were plagued with. It's almost like the
| chassis was designed with 10W of dissipation in mind, which
| Intel couldn't operate within but the M1 easily could.
|
| I had assumed that Apple designed for the M1 and then fit
| Intel chips into those designs.
| apetrovic wrote:
| My private conspiracy theory (supported by nothing) is that
| Intel promised Apple good 10W processors back in 2014-ish,
| and Apple designed the 2016 MBP based on that promise. And
| when Intel didn't deliver, they shipped Macs anyway, and
| either started working on the M1 or cleared away any doubt
| about whether they should continue working on it.
| clairity wrote:
| that's not just your (conspiracy) theory, it's exactly
| what happened (something i've also noted before). intel
| screwed apple years ago and apple decided to move on. it
| just took many years of chip development to get to this
| point.
| zimpenfish wrote:
| Honestly wouldn't be surprised if it came out that they
| started working (at least theoretically) on Apple Silicon
| when they transitioned to Intel in the first place and it
| just took this many years for all the pieces to be ready
| and lined up.
| EricE wrote:
| I think Apple would have been perfectly happy buying CPUs
| from Intel as long as Intel kept their end of the bargain
| up.
|
| After the PowerPC fiasco and IBM leaving Apple high and
| dry, I have zero doubt that there was always a contingency
| plan under way before the ink even dried on the PA Semi
| acquisition, but it probably wasn't a concrete strategy
| until about the third time in a row Intel left Apple high
| and dry on a bed of empty promises.
|
| Apple has so much experience with processor transitions
| they don't have to stay on ARM either. And they have the
| capital to move somewhere else if it makes enough sense
| to them. I find it highly unlikely - but if it made sense
| it would be highly probable :)
| mortenjorck wrote:
| Not only plausible, I'd say this is the most likely way
| it played out.
|
| At the time of the Intel transition, Apple had already
| gone through the process once before with 68k to PPC. It
| had to be clear to the long-game thinkers at Apple that
| this cycle would keep repeating itself until Apple found
| a way to bring that critical part of its platform under
| its own control. Intel was riding high in 2006, but so
| had IBM in 1994.
|
| Within two years of the Intel transition, Apple acquired
| P.A. Semi. The iPhone had barely been out for a year at
| that point, and still represented a fraction of the
| company's Mac revenue - and while it looked to us
| outsiders like the acquisition was all about the iPhone
| and iPad, in retrospect, a long-term replacement for
| Intel was almost certainly the endgame all along.
| clairity wrote:
| possible, but as outsiders, it's hard to be sure of that
| sequence of events with those sets of facts, to draw that
| conclusion definitively. perhaps that was a backup plan
| that quickly became the primary plan.
|
| but with the 2016 line of macs, it was obvious that apple
| was expecting faster, smaller, cooler, more power
| efficient 10nm chips from intel, and intel fell flat on
| their face delivering. it's not clear how long before that
| apple knew intel was flubbing, but 2014 seems a
| reasonable assumption given product development
| timelines. as intel's downward trajectory became clearer
| over the following months, along with the robustly upward
| trajectory of apple silicon, the transition became
| realizable, and eventually inevitable.
|
| as an aside, i'm using a beat up 2015 macbook pro and
| eagerly awaiting the m2 version as its replacement,
| seeking to skip this whole intel misstep entirely.
| thehnguy wrote:
| Fascinating. It's amazing the 3D chess these companies
| have to play effectively.
| mtgx wrote:
| Not really, because the M1 was probably meant as a stopgap,
| and it's mostly a rehash of the A12.
|
| The M2 is probably based on the Armv9 ISA and has been in
| design for years.
| EricE wrote:
| The M1 is no stopgap. When you have people criticizing it
| because it _only_ bests 90% of the current PC market but
| not all of it...
|
| Well, if that is indeed a stop gap then I can't wait to see
| their first "real" chip :)
| paulpan wrote:
| I think getting out the M2 before the fabled "M1X" actually
| makes sense. This could explain the decision to put the M1 into
| the new iPad Pros, to re-use the M1 chips elsewhere once the
| new M2 becomes available.
|
| The main reason being the M1 was more of a proof of concept
| and rushed out (despite being as good as it turned out to
| be). The M2 will be a more refined M1 with notable
| improvements such as LPDDR5 support - akin to AMD's Zen 1
| and Zen+ releases.
|
| On the other hand, there could be an M1X being readied for
| release at the upcoming June WWDC. It may be
| architecturally older than the M2 but still deliver
| superior performance on the strength of a big-core
| differential: the M1 only has 4 big cores and 4 small
| cores, so the M1X just needs more big cores to be notably
| more performant.
|
| All highly speculative of course, will have to find out in
| about a month.
| bloqs wrote:
| Soldered RAM and SSD, coupled with SSD wear issues leading
| to a less-than-3-year lifespan for a laptop, make all of
| this a hard pass for me, and should for any sensible person
| too.
| rvz wrote:
| Not sure why you are downvoted, but this is true, is it
| not? It's even worse with Apple Silicon machines: if the
| SSD dies, the whole thing is bricked, unlike the Intel
| Macs.
|
| It seems the Mac aficionados (especially the M1 fanatics)
| are in denial about the degree of lock-in with the Mac as
| it gradually descends into becoming nearly as locked in as
| an iPhone.
|
| I'd give it 0.1 out of 10 for repairability. At least with the
| latest Surface line-up the SSD can be upgraded.
| wmf wrote:
| You can wear out any SSD. There's no evidence that Apple SSDs
| are any worse than others. You need to have backups. You need
| to understand that Apple products are sealed and disposable
| and only buy them if your use case can accommodate that.
| argvargc wrote:
| Even if true (it isn't - the SSD issues appear to be mostly
| related to as-yet non-native software), a 3-year lifespan
| for the price of 6 years' worth of half-speed laptop still
| makes sense.
| sneak wrote:
| I know a lot of people waiting for this one, myself included.
| Here's hoping Asahi Linux is ready around then!
|
| I'm guessing the 2021 MacBook Pro is going to be the fastest
| laptop ever made.
| gregoriol wrote:
| Every generation is "the fastest ever made". The question is
| more: will this one be the Pro version?
| guywhocodes wrote:
| Surely not every MBP has been the fastest laptop ever made.
| Has this in fact at any given point in time actually been
| true?
|
| This could be it, though, but there are probably some
| current-gen laptops out there with desktop-TDP chips.
| sneak wrote:
| I didn't mean just Apple. I just meant the fastest laptop
| money can buy.
| postalrat wrote:
| The fastest laptop money can buy has always been available.
| I guess this is the first time you are considering buying
| one?
| sneak wrote:
| Sometimes I fantasize about printing a t-shirt that
| simply says "shut up, you know what I meant".
| tailspin2019 wrote:
| I'll buy two
| mmmmmbop wrote:
| Maybe their two requirements for a laptop were it being
| (1) an Apple product and (2) the fastest laptop money can
| buy.
| BurningFrog wrote:
| I expect every mac user at my job to get one.
| 5scale wrote:
| Full article:
| https://web.archive.org/web/20210427125928/https://asia.nikk...
| mikece wrote:
| While this news is about "a rumor according to sources
| familiar with the matter", it's obvious that Apple will be
| doing this at some point. Whether it's the M2 or a new
| letter designator (X-series silicon for eXtreme
| performance? Apple X1?), I am very interested to see what
| the performance numbers will be for an ARM-powered
| workstation rocking 16 to 32 high-power cores. Aside from
| the Ampere eMAG, is a 16+ core ARM-powered workstation even
| a thing yet? (I don't count Amazon's Graviton chips in this
| discussion because I cannot own a machine on my desktop
| powered by one.)
| hajile wrote:
| M2 seems _very_ unlikely to me because it would easily
| create product confusion. Imagine the following:
|
| M1 -- 4 A14 cores
|
| M2 -- 8 A14 cores
|
| M3 -- 4 A15 cores
|
| That "third generation" sounds better, but is actually inferior
| to the second generation. X1 or M1x seem much more likely. It's
| the same reason why they made an A12x instead of calling it
| A13.
|
| They probably need 3 designs, but economies of scale begin to
| be a problem. A14 and M1 are now selling in the millions, so
| they get a huge benefit. More importantly, they are in devices
| that get replaced a bit more frequently.
|
| M1x (8-16 cores) will be in bigger iMac and laptops. They don't
| sell nearly as many of these. In addition, yields on the larger
| chip will be lower. Finally, people with these devices tend to
| keep their machines longer than phones, tablets, or cheaper
| laptops.
|
| The 16-64-core chips are a major potential headache. If they go
| chiplet, then no big problem (actually, the M1x as two M1
| chiplets seems like a desirable direction to head). If it is
| monolithic, the very limited sales and production will drive
| prices much higher. A logical way to offset this would be
| bigger orders, with the excess being sold to Amazon (or
| others) to replace their Mac mini cloud, but that hasn't
| been mentioned publicly.
| mikece wrote:
| I always assumed the "M" in the M1 designation meant
| "mobile" and that higher-powered chips (in terms of
| processing, electricity, and heat dissipation) would be
| coming later with a different letter designator. Either
| that or we'll get Air/Pro suffixes to the chip numbers
| (e.g. M1 Pro, M2 Air...)
| mcintyre1994 wrote:
| M for mobile could make sense given they've just put an M1
| in an iPad Pro. I assumed it was Mac before that and we'd
| get an M1X or something for higher end models but that
| seems wrong now.
| mft_ wrote:
| ...but they also just put it in the iMac...
| wffurr wrote:
| The iMac has long used laptop grade parts.
| mcintyre1994 wrote:
| The variant of it that uses laptop parts, but fair point.
| Mobile does seem a stretch for iMac. Let's just call it
| Apple's unique naming convention :)
| danaris wrote:
| ...I always thought it was for "Mac".
|
| Though the fact that they've just labeled the chip in the
| latest iPad Pro as such does add a bit of confusion to
| that.
| Eric_WVGG wrote:
| Another possibility is that they skip the M1x altogether;
| if the Pro machines are coming in the autumn and not at
| WWDC, then the time frame will be closer to the iPhones'
| and it would make sense for them to use that later tech.
|
| M1 (winter 2020) -- 4x A14 cores
|
| M2x (autumn 2021) -- 8x A15 cores
|
| M2 (winter/spring 2022) -- 4x A15 cores
|
| etc.
|
| There's really no reason for the naming and timing of Pro
| machines to lag behind consumer machines just because of
| arcane numbering. And there's precedent: Intel Core Duo
| Macs were released before Intel Core "Solo" Macs.
|
| But if they're actually ready for WWDC, then no, it'll just
| be called M1x.
|
| As for the Mac Pro... well, it's definitely going to be
| weird. I think the existence of the Afterburner card proves
| that Apple sees a future in highly specialized modular add-
| ons providing most of the differentiation, but buyers would
| still want a bigger chip than in the 16" laptops, so who
| knows... of course nobody even knows how an M1 would perform
| with proper cooling and a higher clock!
|
| [edit] also making M2x ahead of M2 will give them the
| benefits of "binning" some chips
| mciancia wrote:
| There is something available for building workstations. Not
| sure about performance though. https://www.solid-
| run.com/arm-servers-networking-platforms/h...
| wmf wrote:
| Here's an ARM workstation: https://store.avantek.co.uk/ampere-
| altra-64bit-arm-workstati... "Unfortunately" the minimum CPU is
| 64-core.
| seumars wrote:
| My wallet is ready for the next line of MacBook Pros.
| imwillofficial wrote:
| I have an M1 MacBook Air, and I'm blown away. I cannot wait for a
| 16 inch MacBook Pro with whatever madness they have planned.
|
| I love the direction Apple is headed with their hardware.
| alkonaut wrote:
| Why a new SoC? Isn't the M1 basically maxing out what can
| be done on an SoC, with what's missing being the version
| with external memory and GPU?
|
| They can refresh the cores in the M1 of course, and I
| expect they will do that yearly like the AXX cores, but it
| would be weird to go even 2 generations of the SoC without
| addressing the pro CPU.
| wmf wrote:
| Apple could easily fit 2x-4x the performance on an SoC so
| that's what people expect the M1X and M1Z to be. Note that it's
| still an SoC if it has "external" memory (Apple's "unified
| memory" isn't what people think).
| samgranieri wrote:
| I really want the next MacBook Pro to support driving two
| 4K monitors over Thunderbolt and have an option to buy 32
| gigs of RAM.
| atonse wrote:
| Compared to M1? Because my Intel MBP has done that for years.
|
| I now use an M1 Mac mini with two 4K monitors (not from one
| cable though).
| dan1234 wrote:
| The M1 MacBooks can only drive 1 external monitor because
| they're already driving the internal display.
| atomlib wrote:
| You can achieve that and even more if you do the right
| thing and stay with x86-64.
| ojbyrne wrote:
| My current MBP drives three 4K monitors and has 64 gigs of
| RAM. Perhaps I'm not clear on the point you're trying to
| make.
| dmitriid wrote:
| I really want Apple to actually support external monitors
| properly.
|
| Here's an incomplete and continuously updated list of monitors
| (not) working with Apple hardware:
| https://tonsky.me/blog/monitors-mac/
| lowbloodsugar wrote:
| I finally bought an external graphics card bay for my i9 15",
| and now it's obsolete. =)
| christkv wrote:
| Or 16GB of high-speed on-package memory and another pool of
| DDR4 outside.
| bhouston wrote:
| I would prefer just one 6K or 8K monitor personally at a size
| of say 55" with a wide ratio. Simpler setup. No post in the
| middle. Something like this but bigger with higher resolution,
| and maybe a bit more height vertically (this one is a bit too
| wide for its height in my opinion):
| https://www.samsung.com/ca/monitors/gaming/super-ultra-wide-...
|
| I think that dual 4K is the old way; single 6K/8K
| ultrawides are the future.
|
| That said I am rocking the dual 4Ks on the Mac Mini M1 and it
| works great:
| https://twitter.com/BenHouston3D/status/1384693982249340935
| samgranieri wrote:
| Here's an old picture of how I use 2 4k monitors at home
| https://i.imgur.com/Wwr6G42.jpg (I've upgraded my desk and
| keyboard since)
|
| I'd strongly consider getting a 6K monitor, or maybe a 5K2K
| ultrawide, if I didn't already have my two Dell 27-inch
| 4Ks.
| redm wrote:
| 8K is 4x the resolution of 4K at a 16:9 aspect ratio. It
| would require supporting the equivalent of 4x 4K displays.
|
| Example: https://i.pcmag.com/imagery/articles/07toBDd6lpyucCy
| M0xWrcQv...
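|
| The pixel math behind that, as a quick sketch (UHD 16:9
| resolutions):
|
|     px_4k = 3840 * 2160   # 8,294,400 pixels
|     px_8k = 7680 * 4320   # 33,177,600 pixels
|     print(px_8k / px_4k)  # 4.0 -- four 4K displays' worth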
| chevill wrote:
| >I think that dual 4K is the old way, single 6K/8K ultra
| wides are the future.
|
| In the near future 16:9 8Ks are going to be standard (too
| bad it won't be 16:10), but right now you can get a decent
| triple-4K monitor setup for half of what an 8K monitor
| costs.
|
| IMO ultra-wide is great for productivity in a situational
| sense but it sucks for gaming and media consumption. Also
| getting proper support for ultra-wide to be standard is
| probably going to take another decade.
|
| I think there's something to be said for each monitor being a
| separate entity. I think some people will still prefer
| multiple monitors over ultra-wide even when support for
| ultra-wide is a solved problem.
|
| My personal setup now is a 28" 4K@60Hz, a 32" 1440p@240Hz
| as a primary, and a 24" 1440p@144Hz. I have my work
| computer, my desktop, a Mac mini, a Switch, and a PS4
| running off of them, so having separate monitors is ideal
| for me.
|
| Ultra-wides and super-ultrawides are cool, but IMO they
| aren't as practical yet.
| Foxhuls wrote:
| I've been playing games on an ultrawide since 2015 so I'm
| going to have to disagree hard with you. In the beginning
| ultrawide support was hit or miss but at this point I'd
| safely say 90-95% of games support ultrawide. There are
| also plenty of gaming-specific ultrawide monitor options. I
| still have a second 16:9 monitor on the side because I do
| agree with your point about monitors being separate
| entities. There are programs for dividing the space of
| monitors, but if you're used to multiple monitors I don't
| think switching to a single monitor of any size will be a
| good replacement.
| I think it's also worth throwing in that watching movies on
| my ultrawide is my favorite place to do it as the 21:9
| aspect ratio means that movies stretch to the entire
| screen. It's definitely an amazing experience.
| chevill wrote:
| >In the beginning ultrawide support was hit or miss but
| at this point I'd safely say 90-95% of games support
| ultrawide.
|
| There's support, and then there's that support being so
| flawless that it's the same quality of experience you
| would get with a standard aspect ratio. Each of us has
| our own standard of what's good enough. It's working out
| for you, but the people I game with who were using
| ultrawide monitors switched back due to the number of
| issues they were having, as recently as last year. I did
| some research myself when I was upgrading my monitor and
| some of the games I played would have had issues, so for
| me personally it wasn't good enough.
|
| Another thing to consider is that a lot of game content is
| designed with standard aspect ratios in mind, so whether
| expanding the viewport makes it better is going to be a
| personal standard. It will be interesting to see, if UW
| monitors do become standard in a couple of decades, whether
| game developers start specifically making content that
| utilizes the extra space to offer something that isn't
| possible with existing games.
| billylindeman wrote:
| I rock a 38" dell ultrawide. It's not HiDPI but it works
| fantastic for me. It's just about big enough to make me not
| miss dual screens, and it's awesome for games.
| masklinn wrote:
| > I would prefer just one 6K or 8K monitor personally
|
| Sure but the Air and MBP already support 6k external
| displays.
|
| They don't support 2x4K.
| bhouston wrote:
| I do agree. But what I am trying to say is that we should
| be pushing for 6K and 8K ultra wide monitors rather than
| the old hack of 2 monitors to one computer.
| dylan604 wrote:
| There are times I actually prefer 2 separate monitors.
| Allowing one monitor to quickly go full screen while
| continuing to use the other monitor is quite useful in my
| workflows.
| macintux wrote:
| Virtual desktops are also more useful with two separate
| displays, or at least lend themselves to different use
| cases.
| Foxhuls wrote:
| Out of curiosity, when did you last use a single monitor
| instead of your current dual monitor setup?
|
| I've been using an ultrawide since 2015 but have almost
| always had a second side monitor with it. The period
| where I didn't have a second monitor was short as having
| a separate monitor has always come in handy. All of the
| extra space on an ultrawide is great but when you're
| doing something that takes the whole screen, it still
| leaves you in the same position you would be with a
| single 16:9 display.
| masklinn wrote:
| Not the GP, and I really don't agree with their position,
| but I did switch from two monitors to a single UW last
| year.
|
| TBF my old setup was a decade old so it was merely a 24"
| (1920x1200) and a 19" (1280x1024 in portrait).
|
| Though my new display has a much larger logical size, the
| inability to put something fullscreen on one of the
| monitors (whether the small one or the large one,
| depending on the content) is definitely one of the
| drawbacks I had a hard time getting used to. Not only can
| it be inconvenient, it actually wastes space, as I can't
| get rid of window decorations, which putting e.g. a video
| player or browser window in fullscreen allowed.
| jiveturkey wrote:
| it's not a hack though. i prefer 2x4k. i hate the curve
| (no straight lines) and UW is too wide to be comfortable
| (eye and neck angle). 2x4k, one straight on and one
| angled is ideal. i spent 2 weeks and $8000 on
| monitors to test this out. 34UW at normal desk distance
| is perfect, however 2x4k is better overall. I actually
| use 3, as I use the macbook screen as the 3rd.
|
| that said, i have no issue with your choice. for some
| applications (eg video editing, the default subject in ads
| for UW monitors) it is really better.
|
| i do have issue with you denigrating 2 monitors. it is
| not at all "an old hack".
| meepmorp wrote:
| Do you have periodic weirdness on the Thunderbolt monitor?
| Every so often, my m1 mini wakes or boots so that there's a
| vertical misalignment by a pixel or two about halfway across.
| It's like the desktop was cut in half and taped back together
| slightly crooked.
| bhouston wrote:
| I have zero weirdness at all. It just works perfectly all
| the time.
|
| Neither monitor supports Thunderbolt. One runs off a USB-C
| to DisplayPort cable and the other uses the HDMI port. Both
| run at 60Hz, as it is HDMI 2.0.
| macNchz wrote:
| I don't have an M1 machine, but I have had periodic
| weirdness on external monitors (of every flavor) when
| waking from sleep on all the Mac laptops I've owned over
| the last 10 years.
| jclardy wrote:
| I have an Acer Predator that has this issue (Whether using
| my Mac or PC attached.) Power cycling the monitor makes it
| go away. Basically the left side of the screen is shifted
| one pixel higher than the right side of the screen, making
| a vertical line down the middle where the shift changes.
| dfinninger wrote:
| I have this happen on my Windows PC every now and again.
| It's on the monitor that's hooked up via DisplayPort.
| bhouston wrote:
| I have run into poor quality DisplayPort cables in the
| past. I now only buy ones I know are brand name.
| alanwreath wrote:
| YES -- it's not just me! At first I thought my monitors
| were somehow broken (which is unfortunate as I paid a bit
| extra for name brand LG). I suspected something, as each
| monitor is plugged into an M1 computer (one's a Mac mini M1
| and the other a MacBook Air m1). Both exhibit the visual
| problem you describe on a random basis.
| BoardsOfCanada wrote:
| Just as a data point, I have an MBP M1 and an LG external
| monitor and this has never happened to me in almost 6
| months usage.
| alanwreath wrote:
| pairing this with another comment by myself (about multi-
| monitor support beyond only one external monitor). I
| wonder if this is the main reason that Apple didn't
| support more than one external. Does the issue exhibit
| itself even more when additional monitors are connected?
| To be clear it never occurs with my main MacBook Air
| screen (only the external).
| gwking wrote:
| I have this problem! For me it's a strip about 2 inches
| wide, down the middle, one pixel higher than the rest of
| the image. I am using an LG 5k monitor, and I have been
| unsure if it's a problem with the monitor or the machine.
| I'd love to know what kind of monitor you are using; this
| gives me hope that it might be an apple bug that will get
| fixed eventually.
| meepmorp wrote:
| I have an LG 5k, too! And it only happens on that monitor
| - my LG 4k over HDMI is fine.
| astrange wrote:
| It's because the 5K monitor is actually two monitors
| taped together ("multi-stream transport").
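|
| A rough bandwidth sketch of why 5K@60Hz didn't fit a
| single DisplayPort 1.2 stream, hence the two taped-
| together halves (assuming 24-bit color and ignoring
| blanking overhead):
|
|     px = 5120 * 2880           # 5K panel
|     gbps = px * 60 * 24 / 1e9  # 60Hz, 24 bits per pixel
|     print(gbps)                # ~21.2 Gbit/s of pixel data
|     print(17.28)               # DP 1.2 effective max, Gbit/s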
| lofi_lory wrote:
| Something like this... but a lot cheaper, please. Like
| <500EUR.
| solarkraft wrote:
| I use a 55" 4K Samsung TV as a monitor. It cold boots quickly
| and has no latency or subsampling issues in the right mode. I
| can recommend it as a more cost effective alternative to
| multiple monitors.
|
| The amount of available screen space at 1:1 is refreshing
| (not as crazy as imagined though), but it makes me realize
| how grainy "normal" monitors are. In a few years 8K TVs
| might be cheap enough for an upgrade, but on that time
| scale I can see VR being a serious contender as well (it's
| really good already).
| asciimov wrote:
| How far back do you sit from this monitor? Do you use
| resolution scaling?
|
| The last time I tried a large format display, I couldn't
| move back far enough to make it work but I really loved
| working in 4k without scaling.
| yoz-y wrote:
| I'd be curious about that too, once I sat in front of a
| 30" monitor and really hated it due to proximity
| (60-80cm) and size. A single 27" @4k is the sweet spot
| for me.
| solarkraft wrote:
| Currently around 70cm due to the size of my desk, but I'd
| like to eventually move back to something around 120cm, as
| that's what provides maximal comfort.
| sixothree wrote:
| I have the exact same setup as you (samsung 55) and my
| monitor is 76 cm from the front of my desk.
| shiftpgdn wrote:
| I use a 4k 43" Dell monitor without scaling. I just
| measured and my "eyeball to glass" distance is 30 inches.
| Works great and gives me the real estate of about 4
| smaller screens.
| kingsuper20 wrote:
| I've always wondered if a far-sighted person wouldn't be
| better off with a huge monitor at a distance.
| ayewo wrote:
| Mind linking to the exact model you use?
| sgt wrote:
| One thing about TVs is that they are grainy up close.
| Being a 55" you probably don't sit very close to it, but
| how does it overall feel e.g. in font quality? And I also
| wonder if it's good for your eyes to focus on something
| that's constantly far away. I would think a mixture of the
| two (normal distance to a PC monitor with frequent breaks)
| would be preferred.
| solarkraft wrote:
| > how does it overall feel e.g. in font quality?
|
| > they are grainy up close
|
| Hmm, maybe it's its "TV-ness", but the individual pixels
| look pretty sharp to me on a close look (no quantum dots
| or weird PenTile arrangement; that should definitely be
| looked out for), though I have no other low-DPI display
| to compare it to. My reasoning was always "it's just an
| LCD panel, why would it be different" and so far I feel
| proven right.
|
| > I also wonder if it's good for your eyes to focus on
| something that's constantly far away
|
| My appeal-to-nature-argument would be that the natural
| human mix would probably have been much heavier towards
| looking into the distance than it is now (see the massive
| rise of near-sightedness in cities), so it can only be
| better than what we're currently doing.
| dsr_ wrote:
| It's best for your eyes to change focal distance. Staring
| off into infinity, staring at 75cm, and staring at 30cm
| are all bad for you. Find something else to look at every
| so often.
|
| There are few differences between a 4K 42" screen and 4
| 1080P 21" screens: the smaller screens can be angled,
| they have bezels, and are probably more expensive in
| aggregate.
| phkahler wrote:
| I use a 55" Curved Samsung 4K TV. The DPI is similar to a
| 24" 1080p in a 2x2 configuration. The curve provides a
| similar effect to turning 2 monitors toward the viewer. I
| don't use the height very often, as I have progressive
| lenses and have to really look up to see the top. But I
| can lean back for watching a full screen video quite
| comfortably. IMHO it's fantastic. For those who can still
| see a difference with Hi-DPI, an 8K 55" would be for them.
| I really don't need it though, and this thing costs about
| $600 at Walmart.
| rbanffy wrote:
| > One thing about TVs is that they are grainy up close.
|
| Most TVs have a "gaming" setting that doesn't try to
| "improve" the video feed and just passes it through.
| sixothree wrote:
| I do the same thing in my "living room" setup. I have the
| entertainment center with the television. Then a small desk
| in front of that.
|
| I can slide the desk out of the way when I want things back
| to normal in my living room.
| jedberg wrote:
| I much prefer two monitors. I run each one as separate
| desktops, which means I can have "left screen apps" and
| "right screen apps" and cycle them independently. It also
| means I can switch desktops on my left screen while my apps
| on my right screen stay where they are.
|
| Also, with two screens, I can adjust the angle between them
| to be more or less severe.
| GordonS wrote:
| Yeah, I actually kind of like the "post" between monitors
| that the GP mentioned (as long as the bezels are thin :) -
| it helps as a kind of mental division between types of
| work.
|
| Also, 2 monitors means more choice in layout; for example,
| my 2nd monitor is in portrait orientation.
| bobmaxup wrote:
| How do you deal with full screen video, games, etc on a
| single monitor?
| throw0101a wrote:
| > _No post in the middle._
|
| I personally prefer the visual break as I find it useful for
| creating fixed 'work areas': terminals/xterms, browser, mail,
| etc.
| xnx wrote:
| With a single screen, you can add any virtual (stripe of
| pixels) or physical divider (piece of tape) that you like.
| With two screens, there's no way to remove the gap.
| powvans wrote:
| 1000x this. For several years I've been eyeing the 49"
| ultrawides as a replacement for my current 34" ultrawide. It
| would definitely be a big upgrade, but I keep thinking that
| something high DPI must be around the corner in that format.
| The years just keep rolling by and it seems like display tech
| barely advances.
| mleo wrote:
| I purchased the Dell 49" ultra wide to replace a
| Thunderbolt Display and 27" 4K. It is effectively the same
| resolution as I was using. It is nice not having the break
| and being able to better center windows. Split window views
| are much easier to work with when being able to see the
| entire line of code.
| ticviking wrote:
| Only if I can have a nice tiling WM.
|
| I work on a Mac all day, and on my 1440p widescreen I am
| constantly rearranging windows so I can use everything I
| want at the same time. `[Code|App|Docs]` is so common we
| ought to be able to split a wide screen 3 ways the way we
| split a normal one in 2.
| spockz wrote:
| Try out Rectangle.app. It saved my bacon mentally and can
| tile windows how you want. Just not automatically like a
| full tiling window manager.
|
| https://rectangleapp.com/
| Cd00d wrote:
| +1 on Rectangle recommendation. Learn a few hot-keys (or
| rely on the menubar dropdown) and you can very easily get
| any layout of windows you want.
|
| I was using Spectacle before, but I think the project
| died, and Rectangle is what I use now.
| Chazprime wrote:
| I suspect that the MacBook Pro 16 will probably be one of
| the first M2 offerings, and it already offers up to 64GB of
| RAM, so hopefully you'll get your wish.
| esotericsean wrote:
| This is exactly what I've been waiting for!
| 2OEH8eoCRo0 wrote:
| I'm no silicon architect but isn't this why performance is so
| good? They can cut out all those extra PCIe lanes and
| extraneous stuff and focus on performance. If you start
| adding back big I/O, won't you just be back down to Earth
| with everyone else?
| adgjlsfhk1 wrote:
| Not really. The main reason the M1 doesn't have good I/O is
| that it's a beefed-up iPad CPU. This was the first
| generation where they were putting the chips in systems
| that can use that I/O, so they are probably just behind on
| the design for those parts of the chip.
| mcphage wrote:
| > The main reason the M1 doesn't have good I/O is that
| it's a beefed-up iPad CPU. This was the first generation
| where they were putting the chips in systems that can use
| that I/O
|
| Could you explain what you mean by this? I don't think I
| understand.
| hans-moleman wrote:
| All of Apple's prior ARM experience comes from iOS
| devices. They had no need to develop I/O capabilities for
| iPhones and iPads with a single port. Now they put the
| chips in computers but most likely haven't yet fully
| developed extended I/O capabilities.
| mcphage wrote:
| Hmm, but wouldn't features like Wi-Fi, all the cameras,
| the touchscreen, cellular data, NFC, Touch ID, speakers,
| Bluetooth, and so on all come in via I/O as well? You're
| right that they only have one port, but they still connect
| to a lot of different components.
| dijit wrote:
| Those usually connect over serial-esque buses, and they
| connect via a security chip (akin to the T2 chip but
| much lighter).
|
| A far cry from the kind of tolerances and sustained high
| throughput of PCIe.
| adgjlsfhk1 wrote:
| The problem with this analogy is that NFC, Touch ID,
| speakers, and Bluetooth are all in the MB/s-or-less
| range. On the desktop side, you have to deal with 10Gb/s
| Ethernet and 50-100GB/s for RAM. It's just a whole
| different ballgame.
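|
| Putting those on one scale (a rough sketch; the peripheral
| figures are ballpark assumptions, not measurements):
|
|     # Approximate peak bandwidth in GB/s (ballpark figures)
|     bandwidth = {
|         "Bluetooth":   0.003,  # ~3 MB/s
|         "10GbE":       1.25,   # 10 Gb/s divided by 8
|         "desktop RAM": 50.0,   # low end of 50-100 GB/s
|     }
|     for name, gbs in bandwidth.items():
|         print(name, gbs, "GB/s")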
| Tuna-Fish wrote:
| > have an option to buy 32 gigs of RAM.
|
| I think it's quite likely that the M2 will have a 128-bit
| LPDDR5 interface. Using widely available modules, this
| would allow them up to 64GB of RAM.
| EricE wrote:
| I would love that! I got an M1 MacBook Air - unfortunately
| it was _so_ good I found it could run the games I like
| perfectly well, which I hadn't planned on doing. It ran
| everything just fine except for Cities: Skylines. I have
| way too many mods and assets for 16GB of RAM to be
| reasonable, so it was with much reluctance that I took it
| back on the last day of my return window. Being able to
| drop my Windows gaming PC and consolidate everything onto
| one machine will be very nice though! And I may spring for
| a larger screen if I am going to have to abandon the MBA
| class of machine.
|
| Returning it still hurts. I was really hoping for something
| that would take more than 32GB of RAM but am not surprised
| that it's still going to be later this year. The guts in the
| new iMac would be perfect for my Mom if they just offered a
| 27" replacement too.
|
| Oh well. A few more months won't kill me.
| jbluepolarbear wrote:
| I use a Wavlink 4K dock to drive 2 4K monitors on my Mac.
| samgranieri wrote:
| I've got a Caldigit TS3
| erikpukinskis wrote:
| Sansibit MM4 for me
| lizknope wrote:
| Does it support two 4K monitors now?
|
| I have an old MacBook from 2013 that now runs Linux with an
| Intel integrated GPU, and it supports dual 4K monitors at
| 30Hz. I use the Thunderbolt ports as Mini DisplayPort
| outputs and connect to two 4K monitors with DisplayPort,
| and it works fine.
| clashmoore wrote:
| M1 laptops can only support one external display, at up to
| 6K and 60Hz. The other video output goes to the laptop's
| own display. Even if you use the laptop closed in clamshell
| mode, you can't connect or daisy-chain more than one
| external display.
|
| The mini can support two external displays: one via
| Thunderbolt at that same 6K and 60Hz, and the other a 4K at
| 60Hz through the HDMI port.
| staticfloat wrote:
| Note that the restriction on external monitors can be
| worked around by using a DisplayLink-compatible
| dongle/dock, since it uses a custom driver (I assume it
| does something in software that would otherwise be limited
| in hardware).
|
| I use the Dell D6000, and I run three 1080p external
| monitors in addition to the built-in monitor.
| clashmoore wrote:
| I've been trying to find a DisplayLink dock that can
| output video via USB-C or Thunderbolt. Everybody's shared
| solutions always involve HDMI connections, never USB-C.
|
| I have two Thunderbolt/USB-C monitors that I was hoping
| to daisy chain with one wire from my Mac. Alas it's not
| possible.
|
| My hope is power into a dock. Thunderbolt from dock to
| laptop to power laptop. Thunderbolt/USB-C from dock into
| first monitor. Second Thunderbolt/USB-C from dock using
| the DisplayLink tech to second monitor.
| jsjohnst wrote:
| You won't find a DisplayLink adapter that supports
| Thunderbolt monitors. It just won't work from a technical
| standpoint.
| jacobolus wrote:
| Three 1080p displays add up to 3/4 the bandwidth of one
| 4k display, at the same framerate.
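|
| The arithmetic, as a quick check (bandwidth scales with
| pixel count at a fixed framerate and color depth):
|
|     px_1080p = 1920 * 1080
|     px_4k = 3840 * 2160
|     print(3 * px_1080p / px_4k)  # 0.75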
| aldanor wrote:
| Really hoping for a 128-256GB RAM limit, perhaps on
| separate DDR. E.g., any sort of serious data science work
| is now impossible on M1s simply because of RAM limits.
| Tagbert wrote:
| You would probably need to wait for the 3rd tier of M-
| series SoCs predicted for the Mac Pro. The M1 is for the
| lowest performance tier machines like the Air, the low-end
| iMac, and the low-end MacBook Pro. This next chip (M2/M1X)
| is for the middle tier like the 16" MacBook Pro and the
| 27"/30" iMac. It will probably take a third tier to handle
| the large RAM and GPU needs of the Mac Pro.
| aldanor wrote:
| Yep, that's exactly what I've heard as well, and it makes
| perfect sense. I'm actually silently hoping 128GB will
| fall into the 'mid tier' - like it currently does with the
| old Intel 27" iMac. You don't always need a $15k cheese
| grater when all you're looking for is just a bit more
| memory on a Mac platform...
| dman wrote:
| In a laptop?
| forgetfulness wrote:
| Maybe the commenter has a very interesting use for it, but
| why would you buy a 256 GB RAM machine (which isn't a lot
| of memory for this arena either) to develop the model,
| instead of using something smaller to work on it and
| leasing a big cluster for the minutes to hours of the day
| that you'll need to train it on the actual dataset?
| nonameiguess wrote:
| I don't do this kind of thing any more, but back when I
| did, the one thing that consistently bit me was
| exploratory analysis requiring one-hot encoding of
| categorical data where you might have thousands upon
| thousands of categories. Take something like the Walmart
| shopper segmentation challenge on Kaggle that a hobbyist
| might want to take a shot at. That's just exploratory
| analysis, not model training. Having to do that in the
| cloud would be quite annoying when your feedback loop
| involves updating plots that you would really like to
| have in-memory on a machine connected to your monitor.
| Granted, you can forward a Jupyter server from an EC2,
| but also the high-memory EC2s are extremely expensive for
| hobbyists, way more than just buying your own RAM if
| you're going to do it often.
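|
| To make the blowup concrete, a minimal sketch (the row and
| category counts are made up for illustration):
|
|     # Dense one-hot encoding cost: rows x categories cells
|     rows = 1_000_000
|     categories = 10_000
|     gb = rows * categories * 1 / 1e9  # 1 byte per cell
|     print(gb)  # ~10 GB for the dense matrix alone
|     # A sparse representation stores only the single
|     # nonzero per row, which is the usual escape hatch.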
| rubatuga wrote:
| I think there are studies showing that one-hot encodings
| are not as efficient as embeddings, so maybe you would
| want to reduce the dimensionality before attempting the
| analysis.
| aldanor wrote:
| Kicking off training runs on a cluster is surely a thing
| as well. And in some fields, as you correctly mentioned,
| memory requirements may be measured in terabytes. It's
| more of a 'production use case' though - what I meant is
| the 'dev use case'. For instance, playing with mid/high-
| frequency market data, plotting various things, just
| quickly looking at the data and various metrics, and
| trying and testing things out often requires up to 100GB
| of memory at disposal at any time. It's definitely not
| only about model training. And the 'something smaller to
| work on' principle doesn't always work in these cases. If
| the whole thing fits in my local RAM, I would of course
| prefer to work on it locally until I actually need
| cluster resources.
|
| (But seriously though... what is 16GB these days? Looking
| at the process monitor, my Firefox takes over 5GB now
| since I have over a thousand tabs in TreeStyleTab, CLion
| + PyCharm take another 5GB, a Parallels VM for some work
| stuff is another 10GB; any local data science work is
| usually at least a dozen GB, or a few dozen or more.)
| JonathanFly wrote:
| >Maybe the commenter has a very interesting use for it,
| but why would you buy a 256 GB RAM machine (which isn't a
| lot of memory for this arena either) to develop the
| model, instead of using something smaller to work on it
| and leasing a big cluster for the minutes to hours of the
| day that you'll need to train it on the actual dataset?
|
| 128GB of RAM in a consumer PC is about $750 (desktop
| anyway; laptops may be more?). That's less than a single
| high-end consumer gaming GPU, or a fraction of a Quadro
| GPU.
|
| So to the extent that developers _ever_ run things on
| their local hardware (CPU, GPU, whatever), 128GB of RAM
| is not much of a leap. Or 256GB for a Threadripper. It's
| in the ballpark of having a high-end consumer GPU.
| llampx wrote:
| The same M chip is used in 4 product lines so I'm going to
| assume a Pro version of the iMac or Mac Mini is what the
| parent means, but if you need that much memory, setting up
| a VM should be worth it. Same if you need a GPU.
| octopoc wrote:
| The latest MacBook Pro 16" supports 32GB or 64GB RAM [1]. It's
| been that way for a few generations. I have the 2018 model and
| it has 32GB. I think that might have been the first generation
| where that much RAM was supported.
|
| [1] https://www.apple.com/macbook-pro-16/specs/
|
| edit: oh right, that's not M1 _brainfart_
| jonwachob91 wrote:
| Those are Intel-based Macs. This being an article about
| the M1 going into mass production, it's probably fair to
| say the parent comment was referring to those specs in an
| M1 Mac, not an Intel Mac.
| [deleted]
| wombat-man wrote:
| Yeah, but they want that with Apple silicon.
| martimarkov wrote:
| Yeah, but that one uses the Intel CPU. The problem with
| the M1 is that it only supports up to 16GB right now. And
| no eGPU. :/
| lathiat wrote:
| Me too
| DCKing wrote:
| The current RAM restrictions in the M1 are dumb restrictions
| resulting from the fact that you can't get LPDDR4X chips larger
| than 8GB in size (the M1 has two on-SoC LPDDR4X chips).
|
| This year we should see a lot of CPU manufacturers change to
| (LP)DDR5 for memory - high end Samsung and Qualcomm SoCs are
| already using LPDDR5. It's a safe bet Apple is also switching
| to LPDDR5 which would put the upper limit on 64GB for the M2.
| This is notably still lower than the 128GB you can put in the
| current 27" iMac (for an eye watering $2600 extra), but is an
| amount far more pro users can live with.
| anaerobicover wrote:
| Just to side note, the 27" iMac RAM is the rare component
| that is upgradable by the user. You should be able to get
| 128GB for under $700 if you do not buy the RAM from Apple.
| hajile wrote:
| Samsung makes a very large 12GB module. I'm sure that
| offering would be taken advantage of by a lot of people
| if they'd just put it on the table.
|
| The real issue is not shipping the M1 with LPDDR5
| support. We already have tons of Qualcomm phones and even
| Intel processors shipping with it since last year. If
| Apple had done the same, we wouldn't be talking about
| this today.
| DCKing wrote:
| > The real issue is not shipping the M1 with LPDDR5
| support.
|
| I brought this up before but it was pointed out to me that
| the only manufacturer of LPDDR5 last year was Samsung,
| which was producing not-completely-standardized LPDDR5 at
| the time and probably didn't have enough spare volume for
| Apple anyway. Having 12GB LPDDR4X modules from one vendor
| (24GB total) probably is not enough reason for Apple to
| switch up their supply chain either, not for the M1 at
| least.
|
| And, to be fair, I think Apple did get away with shipping
| low memory computers by targeting only their cheapest
| computers.
| jolux wrote:
| 16GB is not "low memory" for the vast majority of users.
| monocasa wrote:
| You'd be surprised how quickly a regular user can eat
| that these days with electron apps and a browser. My MIL
| would pretty routinely get out of memory errors at 16GB
| and she's hardly a power user. Somehow her Facebook tab
| would clock in at over a GB alone.
| astrange wrote:
| Unless you turned off swap, out of memory actually means
| out of swap, so probably the machine was out of disk
| space.
|
| Safari shows a memory warning when a tab uses ~2GB
| memory, and can be too aggressive about it.
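| A quick way to see the distinction (a minimal sketch
| using the psutil library; works the same on macOS, Linux
| and Windows):
|
|     import psutil
|
|     vm = psutil.virtual_memory()
|     sw = psutil.swap_memory()
|     # RAM sitting near 100% used is normal; the trouble
|     # starts when swap is also exhausted, or the disk
|     # backing it is full.
|     print(f"RAM used: {vm.percent}%, swap used: {sw.percent}%")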
| monocasa wrote:
| Activity Monitor showed that RAM was full. She used
| Firefox, not Safari.
| astrange wrote:
| Well that's fine, RAM is supposed to be full. You're
| wasting it if it's not full. If you're really using too
| much, the system will either get horribly slow or you'll
| get a dialog asking you to force quit apps. (which tends
| to come up way too late, but whatever)
| monocasa wrote:
| > If you're really using too much, the system will either
| get horribly slow or you'll get a dialog asking you to
| force quit apps.
|
| Yes, she was getting both. To the point that I asked her
| to keep activity monitor running in the background so we
| could see who the problem applications were.
|
| I'm not sure why this is in question.
| jolux wrote:
| I pretty regularly have Slack running while compiling
| Erlang in the background, running a Docker container,
| several dozen Firefox tabs, and a Microsoft Teams call
| going, and I do not have problems with 16GB on x86.
| Perhaps ARM Macs use more memory or something. My next
| MacBook will definitely have more than 16GB if it's
| available, but that's more for future need rather than
| present.
| monocasa wrote:
| She has problems with 16GB on x86. A single Docker
| container and compiling Erlang really don't use that
| much. It's the many tabs of Office 365, Google Docs, and
| social media, and the fact that every app is an Electron
| app now, that eat the memory.
|
| I think we as programmers have lost sight of the fact
| that a single GIF in memory is about the same size as a
| whole Docker container.
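| Back-of-envelope (the GIF dimensions are assumed for
| illustration):
|
|     # A decoded GIF lives in RAM as raw RGBA frames.
|     gif_bytes = 800 * 600 * 4 * 30  # 800x600, 30 frames
|     container_overhead = 50 * 1024 * 1024  # "dozens of MB"
|     print(gif_bytes / 1e6)           # ~57.6 MB
|     print(container_overhead / 1e6)  # ~52.4 MB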
| jolux wrote:
| I don't think that's accurate, because Docker on macOS
| involves virtualization. And Teams is the worst resource
| hog I've used. Though I don't use Office or social media
| much (Twitter occasionally) so maybe that's it.
| monocasa wrote:
| > because Docker on macOS involves virtualization
|
| Go check out the resource usage. Yes, it involves
| virtualization, but the stripped down virtualized kernel
| uses very few resources. Getting that information is a
| bit complicated by the fact that it's difficult to see
| the difference between the virtualized kernel and its
| applications in a way that's consistent with the host OS,
| but it's really in the dozens-of-megabytes range of
| overhead for something like Docker on Mac.
| hajile wrote:
| It's a deal with the devil (especially on the pro
| machines) where you trade 50% faster CPU for 50% less
| RAM.
|
| Everyone raves about how the 8GB machines feel, but the
| massive SSD usage and pitiful lifespan show that there's
| no free lunch here. 16GB (or a 24GB option) goes a long
| way toward preventing an early death. I actually suspect
| they'll be launching a free SSD replacement program in
| the next couple of years.
|
| I've also heard claims from a couple devs that side-by-
| side comparisons with their x86 Macs show a massive
| increase in RAM usage for the same app for whatever
| reason. I'd guess that's solvable, but could contribute
| even more to the SSD problem. On the bright side, all
| this seems to indicate a great pre-fetch and caching
| algorithm.
| foldr wrote:
| > I actually suspect they'll be launching a free SSD
| replacement program in the next couple years.
|
| Not even the most alarmist estimates suggest that the
| SSDs on the 8GB machines will start dying after two
| years!
|
| At the moment the SSD lifetime concerns are little more
| than FUD. Everything we know is perfectly consistent with
| these machines having SSD lifetimes of a decade, even
| with heavy usage.
| 542458 wrote:
| I was going to say! People seem to get all paranoid
| whenever swap is brought up on SSD machines, but modern
| SSD lifespans are pretty awesome.
|
| https://techreport.com/review/27909/the-ssd-endurance-
| experi...
|
| 700TB was the minimum that any drive in the above link
| managed. If you wrote 100GB of swap per day it would take
| you two decades to hit that level.
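| The arithmetic (700TB is the figure from the linked test;
| the 100GB/day swap-write rate is an assumption):
|
|     endurance_tb = 700
|     swap_writes_gb_per_day = 100
|     days = endurance_tb * 1000 / swap_writes_gb_per_day
|     print(days / 365)  # ~19.2 years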
| astrange wrote:
| > I've also heard claims from a couple devs that side-by-
| side comparisons with their x86 macs show a massive
| increase in RAM usage for the same app for whatever
| reason.
|
| Partly because the GPU works differently, partly because
| some inaccurate methods of counting memory (Activity
| Monitor) have cosmetic issues displaying Rosetta
| processes.
| CalChris wrote:
| That could make an M2 Mac Mini into a scary _on prem_
| machine.
| dillondoyle wrote:
| My current, almost 2 year old 16" already has 64GB of
| RAM. Is this new version faster despite having less
| memory?
| dijit wrote:
| It's faster RAM, and a faster CPU. Memory only really
| gives you "number of active things that the CPU can
| switch to" (meaning, open idle programs) and cache.
|
| If the NVMe drive is fast enough, the filesystem cache is
| not as noticeably effective, and if you have aggressive
| "suspend" of processes (like iOS) then lack of RAM
| capacity does not really impact performance at all. But
| memory latency, speed and the CPU do.
| amelius wrote:
| How is this possible at a time of such a shortage in IC
| production capacity?
| Invictus0 wrote:
| Apple booked the production capacity at TSMC years in advance.
| mpweiher wrote:
| Apple wasn't cheap and pre-bought production capacity.
| dcchambers wrote:
| I haven't yet used an M1 Mac, but based on what I've read
| about it I have fully bought into the hype train.
|
| Hoping my next laptop will be an M2-powered MBP, assuming
| they can increase the maximum RAM to at least 32GB.
| reasonabl_human wrote:
| I wouldn't get too hung up on the RAM. Went from an XPS with
| 64GB that used about 16-20GB in my day to day, still able to
| use the same workflows and memory is handled fine behind the
| scenes on my M1 Air with 16GB. Maybe get your hands on one from
| a friend and play around with it. I would imagine
| RAM-heavy tasks like rendering etc. would choke, but I
| just run many workflows / builds simultaneously, which
| works fine.
| msoad wrote:
| I replaced my fully spec'ed out Core i9 MacBook Pro with
| 64GB of RAM with an M1 MacBook Pro with 16GB of RAM, and
| I can tell you my computing experience is better!
| Especially the fact that my computer works on battery!
| lostgame wrote:
| There are rumours of a supposed M1X that may hit before the M2,
| so you may be waiting a little longer than you'd think. :)
|
| Of course, the Apple rumour mill; grain of salt, etc. -
| but I wouldn't be surprised if we saw an M1X that could,
| for instance, support 32GB of RAM by the end of the year
| (which is the only blocker to me buying an M1), and then
| saw the M2 pop out next year, maybe with an
| Apple-CPU-powered Mac Pro?
|
| Food for thought. :)
| dcchambers wrote:
| Yeah I guess 'M1X' vs 'M2' doesn't matter so much. As long as
| they've had time to work out any kinks from the first gen and
| increase the RAM, I'm all in.
| johnwalkr wrote:
| There are missing features (like the number of USB ports)
| but no real kinks that I've come across. Although I don't
| need it, I tried Parallels + ARM Windows + x86 Windows
| apps on a lark and it worked without any noticeable
| performance issues.
| lostgame wrote:
| Whoa. That's excellent to hear.
|
| Since they are non-upgradeable, I will certainly be
| waiting for a unit with at least 32GB of RAM (ideally
| 64GB) before I upgrade at all and consider it
| future-proofed, but this is great to know!
| sesteel wrote:
| If Apple is having this kind of success, it seems they should
| look to compete in the data center with this or the following
| generation of chips. I wonder if it is a good time to invest in
| Apple.
| jfb wrote:
| What's in it for Apple? I'm not trying to be glib, here, but
| unless there were some Mac only server functionality, nobody
| would buy an Apple ARM powered datacentre machine.
| _ph_ wrote:
| First of all, Apple could save a huge amount of money by
| replacing Intel based servers with their own chips. Both
| on CPU price - the Xeons especially are _really_
| expensive - and on electricity consumption, probably the
| largest running cost of data centers.
|
| Then there are the gains of scale: making a CPU just for
| the Mac Pro would mean production numbers that are too
| low, but data center usage would drive those up -
| especially if Apple also sold it to other customers, e.g.
| by bringing the Xserve back. For the OS they could run
| Linux virtualized, or they could give the native Linux on
| Mac developers a hand.
| epistasis wrote:
| If Linux were supported, it would be an interesting
| competitor to AWS's Graviton instances.
|
| As for what's in it for Apple, it would be the profit
| from selling a hopefully large number of chips, but
| adding official Linux support and committing to an
| entirely new market for a minimum of three years would
| probably cost far more in focus than any potential
| profits.
| mcintyre1994 wrote:
| What if Rosetta 2 was that Mac only server functionality? I
| don't know that they'd do it, but from M1 Mac reviews it
| sounds like M1 + Rosetta 2 will run at least some x86 code
| faster and more power efficiently than any competitor.
|
| I don't know how feasible it is to scale that up to a
| datacenter though, and I expect macOS licensing costs would
| wipe away any power efficiency gains. But I do wonder if they
| could hypothetically scale up and beat the best Intel/AMD
| have to offer just using Rosetta 2 to translate x86 code.
| gpm wrote:
| Eh, the Asahi Linux people already have Linux running on
| this chip.
|
| What's in it for Apple is money and better economies of scale
| for chips. But I don't really think it fits Apple's MO so I
| doubt they'll do it.
| AceJohnny2 wrote:
| > _Eh, the Asahi linux people already have linux running on
| this chip_
|
| More specifically, people are running Linux _on the CPU
| cores_.
|
| The M1 is a system-on-chip, and according to the floorplan
| [1], the CPUs are maybe 1/5th of the chip. There are many
| other features that aren't unlocked, such as GPU (which is
| a brand new architecture) or power management. The latter
| is key to exploiting the chip to its full performance
| envelope.
|
| I don't expect Asahi to get anywhere further than
| proof-of-concept before it is made obsolete by the march
| of the silicon industry.
|
| [1] https://images.anandtech.com/doci/16226/M1.png
| gpm wrote:
| I think it depends on how much changes between
| generations. So far it seems like most of this isn't
| really new, but almost exactly what's been on the
| iDevices for a long time. If they don't re-architect
| substantially between generations I can see the Asahi
| project keeping up.
|
| The GPU stuff is new, but it seems like good progress is
| being made: https://rosenzweig.io/blog/asahi-gpu-
| part-3.html
|
| For data centers, it helps that GPU and so on is just
| less important. It's wasted silicon, but the CPU is
| competitive even before considering the added features so
| that's not the end of the world. There's a good chance
| that Apple can use that to their advantage too, by using
| chips with broken GPUs/NNPUs in the DC... or designing a
| smaller chip for data centers... or one with more
| cores... or so on.
| mumblemumble wrote:
| MO is my thought, too. Getting back into server hardware
| would require supporting a vastly different kind of client,
| and possibly require them to start doing some things that
| they, as a company, might find distasteful. Supporting
| equipment they stopped making a long time ago, for example.
| Including maintaining stockpiles of replacement parts.
| easton wrote:
| They get a gigantic boost in internal datacenter performance
| if they can jam a bunch of Apple Silicon chips into a rack
| mounted server and boot Linux on it. If they can get a
| similar boost in performance at similarly low power from
| the chip that is going in the Mac Pro, taking that chip
| and putting it on a board with Ethernet and power
| wouldn't be a ton of engineering cost, and then they
| could massively reduce the power consumption and cooling
| costs of their datacenters.
|
| And then they could resell the box they designed as a server,
| either with Linux support out of the box (unlikely, but since
| in this mystical scenario they'd have to write kernel patches
| for Linux to get it to boot...) or with a build of macOS that
| could be set up headlessly, in order to recoup the
| development costs. Apple shops that want machines to run
| Compressor or Xcode's build server would eat that up.
| usrusr wrote:
| With a sufficiently wide lead in energy efficiency, just
| selling hardware without any follow-up lock-in harvest can be
| attractive even for a company as spoiled as Apple. They'd
| likely want to make the modules offered sufficiently big to
| make them unattractive for desktop use or else they'd risk
| cannibalizing their lock-in market.
| varispeed wrote:
| I think if they ever released servers, they would want
| total control over what you run on them, so you couldn't
| just upload your service; Apple would have to approve it
| first.
| Toutouxc wrote:
| How does that even make sense? You can run anything on Macs,
| and these are one level less enterprisey.
| oblio wrote:
| Do you mean that they should start making and selling servers?
| It's unlikely.
|
| Or do you mean that they'll start selling parts (SoCs)? Not in
| a million years :-)
| sesteel wrote:
| I am just trying to forecast how bright Apple's future is.
| Seems like they have options. So, is there going to be a
| shift towards ARM/RISC in general or not? If so, where do
| I put my money?
| oblio wrote:
| Well, speaking of Apple: 99% of what Apple sells is
| consumer products. So look for big consumer product
| markets which they haven't entered. Cars would be one of
| those markets, for example.
|
| AR/VR/more wearables would be another.
|
| Home appliances/electronics would be another.
| Thaxll wrote:
| How would they be successful in the DC? Designing a
| product for consumers is very different from designing
| one for servers. On top of that you add terrible server
| support for macOS.
| jbverschoor wrote:
| They only need to support a Linux kernel. They've used
| Google Cloud, Azure, and now AWS. The contract is worth
| billions, and will end in '23 or '24. It's very likely
| they'll at least run their own cloud completely. And
| maybe they'll compete as a public cloud later.
| johnwalkr wrote:
| I think they will soon, but only for a specific use-case: MacOS
| and iOS development. Devops in cloud is expected these days for
| development and the only offerings available for it are
| basically Mac minis jammed into a rack, often with grey-area
| virtual machines running. A new Xserve model and Xserve cloud
| service would be great!
| jagger27 wrote:
| If they were to (re)enter this market they'd have to support
| Linux, which I just don't see happening.
|
| What's interesting to me is to see if they'll use M-series
| chips in their own datacenters. They already run Linux there
| apparently.
| lumost wrote:
| I'm really curious how this will play out. Data centers
| haven't been Apple's market for a long time, and the
| requirements of datacenter customers are kinda anti-Apple
| these days.
|
| More likely I would see Apple making an exclusive server chip
| licensing arrangement with ARM or something similar.
| sesteel wrote:
| High throughput at relatively low power. To me, it seems like
| a match made in heaven. There is the practicality of building
| rock solid containerization on these CPUs. I don't know where
| that stands, but it seems like an obvious fit.
| pjmlp wrote:
| I doubt that they would bother with the 2nd coming of macOS
| Server for anything other than Apple shops.
| canuckintime wrote:
| I'm reading somewhat incompatible reactions in the top
| level comments, e.g. [1]:
|
| > Somewhere, the collective whos who of the silicon chip
| world is shitting their pants. Apple just showed to the
| world how powerful and efficient processors can be. All
| that with good design. Customers are going to demand
| more from Intel and the likes.
|
| Another [2]:
|
| > I really want the next MacBook Pro to support driving
| two 4K monitors over thunderbolt and have an option to
| buy 32 gigs of ram.
|
| Meanwhile the last Intel MacBook Pro supports driving
| four (4!) 4K displays [4]. Apple silicon is far ahead in
| benchmarks, but how do speeds and feeds translate into
| what customers actually want?
|
| Battery life is impressive but unfortunately not the
| usual differentiator during a worldwide pandemic. The M1
| Macs are quite quiet (the first MacBook Air without a
| fan - in 2020!), but the Intel Surface Book was fanless
| back in 2017. We shot the messenger of the recent Intel
| ads attacking Apple [5], but the message is still worth
| reading. I bought an M1 MBA and realized the speed didn't
| make a difference in my use as a consumer computer. For
| the first time in decades I'm not sure Apple provides the
| most pleasurable experience.
|
| [1] https://news.ycombinator.com/item?id=26956336
|
| [2] https://news.ycombinator.com/item?id=26955682
|
| [4] https://support.apple.com/en-ca/HT210754
|
| [5] https://www.macrumors.com/2021/03/17/justin-long-get-a-
| mac-i...
| Toutouxc wrote:
| How are the reactions incompatible? People like me, who don't
| need more than 16 GB of RAM and one monitor, are happy with the
| M1. Other people are waiting on the M1X/M2 chip to bring what
| they need.
|
| > meanwhile the Intel Surface Book was fanless in 2017
|
| The MacBook was fanless in 2015 and, like many other fanless
| designs using Intel chips, it was slow.
| rvz wrote:
| > Other people are waiting on the M1X/M2 chip to bring what
| they need.
|
| Well those people must have been bullish on Apple Silicon
| and 'not just' the M1. They think it's worth skipping the
| M1 rather than going all in on a 1st gen product which,
| at the time, had primitive support for most mainstream
| software, especially for developers.
|
| Maybe Apple knew that the M1 could not drive more than 1
| monitor on the MacBook Air and in fact left that
| limitation in with a small disclaimer.
|
| Perhaps they will announce this capability in the M2 Macs.
| canuckintime wrote:
| > How are the reactions incompatible? People like me, who
| don't need more than 16 GB of RAM and one monitor, are happy
| with the M1. Other people are waiting on the M1X/M2 chip to
| bring what they need.
|
| I agree with your nuance.
|
| > The MacBook was fanless in 2015 and, like many other
| fanless designs using Intel chips, it was slow.
|
| The Surface Book 2/3 and M1 MacBook Air are not slow (hence
| the point of my comparison)
| rvz wrote:
| Even earlier than expected. Perhaps my decision to skip
| the M1 and go for the M2 was completely right.
|
| It doesn't hurt to wait a little or perhaps skip the first gen
| versions due to the extremely early software support at the time,
| rather than getting last year's model (Nov 2020) and suffering
| from the workarounds and hacks to get your tools or software
| working on Apple Silicon.
|
| Unlike most M1 early adopters, I won't have a wobbly
| software transition. Afterwards, I'll most certainly be
| skipping the M1 for something higher (likely the M2).
|
| Like I said before, wait for WWDC and don't end up like
| this guy. [0] Those on Intel (especially those on 2020
| models): no need to run for the M1. Just skip it and go
| for the M2 or higher.
|
| Downvoters: Here's another foot-gun Apple hid behind the
| hype squad. [1] For the iPad Pro, if you bought the 2020
| version, your keyboard is incompatible with the M1
| version, meaning you have to fork out another $350 for a
| new one that works.
|
| At the time of the MacBook Air M1 launch (Nov 2020) there
| were tons of software issues; even the recovery system
| fell apart for many people. Upgrading to this on launch
| day with those issues was an instant no deal.
|
| Once again Intel Mac users, plenty of time to migrate to
| something even better. (M2 or higher)
|
| [0] https://www.zdnet.com/article/i-sold-my-old-ipad-pro-to-
| back...
|
| [1] https://arstechnica.com/gadgets/2021/04/new-12-9-inch-
| ipad-p...
| barbazoo wrote:
| That link is just an ad for a service called Backflip. What
| does that have to do with this?
| LysPJ wrote:
| I've been extremely happy with my M1 Air. I waited until Golang
| and Docker were available (around Jan 2021), but I haven't
| suffered any workarounds or hacks.
|
| To be honest, it has all been much smoother than I expected,
| but YMMV.
| gkilmain wrote:
| I have an M1 and I love it. Sure, there were some early issues
| like chrome crashing all the time and some packages not working
| but I haven't run into any issues as of late.
| st_goliath wrote:
| Cool! So now (or at least soon-ish) I can get my hands on some
| dirt cheap, practically never used M1 hardware on eBay to play
| around with?
|
| I wonder if Apple is familiar with the Osborne effect[1].
|
| [1] https://en.wikipedia.org/wiki/Osborne_effect
| o_m wrote:
| I don't think Apple will put the new processor in the existing
| M1 products, except for the 13" MacBook Pro.
| a_c_s wrote:
| This is a rumor, Apple didn't make this announcement. It is not
| an example of the Osborne Effect.
| enraged_camel wrote:
| Is Apple going to be affected by the chip shortages we have been
| hearing about?
| spamizbad wrote:
| Probably not since they likely didn't scale back their orders
| in 2020.
| kristofferR wrote:
| I had to wait a month to get my MacBook Air M1 here in
| Norway.
|
| It was quite a painful wait (as my 12" MacBook got water
| damage, with the SMC limiting it to 1GHz).
| ajmurmann wrote:
| Taiwan also has a severe water shortage. I'd assume that's
| still a threat to production for Apple. https://www.taipeitim
| es.com/News/feat/archives/2021/04/22/20...
| totalZero wrote:
| Nikkei says they have already postponed some Mac and iPad
| production. Not sure how reliable that story is, but so far it
| doesn't look like customer orders are at all delayed. I bought
| a nonstandard one when this article came out, and they
| delivered it a week ahead of schedule.
|
| https://asia.nikkei.com/Business/Tech/Semiconductors/MacBook...
| AmVess wrote:
| Apple tends to buy fab capacity in very large chunks. IIRC,
| they bought all of TSMC's 5nm capacity for a year.
| [deleted]
| usefulcat wrote:
| I would expect that Apple's chips are probably some of the more
| profitable chips made by TSMC. Apple has historically had
| relatively large margins, so they can probably afford to pay a
| bit more.
| matsemann wrote:
| Is anything known about the chip? The deal breaker of M1 (for me)
| as it currently stands is the amount of RAM it can handle (16
| GB).
|
| Edit: Mistyped 16 as 1, sorry about the confusion
| bla3 wrote:
| Also the relatively low number of cores.
| JoshTko wrote:
| I've only read of very rare workloads actually being
| constrained by 16GB on the M1. What use case do you have
| that you know will be hampered on an M1?
| squeaky-clean wrote:
| My Ableton starter template project takes about 22GB of RAM
| to open. Music production is a pretty common use case that
| can be very heavy on RAM.
| matsemann wrote:
| My current laptop has 64 GB. Could probably be fine with 32,
| but I'm seldom under 20 in usage. I run certain services
| locally in Docker, and then have 5+ instances of IntelliJ for
| various stuff open, some of them running a java server or
| node building the frontend, and others on my team may be
| running Android+iOS simulator at the same time as well.
|
| I _could_ alter my usage patterns and work with less. But
| people not used to having loads of RAM don't know what
| they're missing out on.
| [deleted]
| reasonabl_human wrote:
| I went from a 64GB XPS 15 as a developer utilizing
| ~10-20GB during general workloads, and I can get the same
| work done on my new M1 MacBook Air 16GB without a hitch.
| Unless you are reading and writing to RAM incredibly
| quickly, or need a massive amount of data in memory as
| for rendering-type tasks, running general applications
| beyond the 16GB point is totally fine, and the OS will
| figure out the rest.
|
| I'm curious to know if it'd work for you - do you have
| access to an M1 to try out your workflow on? The max RAM
| 'issues' seem to be way overblown.
| [deleted]
| asadlionpk wrote:
| Side question: isn't the iOS simulator running _natively_
| on M1? That would mean it consumes less RAM than on x86.
| If that's true, it should be possible to fit the
| Android+iOS workflow.
|
| As a data point: I am running node + iOS Simulator (along
| with XCode + VSCode + Chrome with 50+ tabs) setup on M1
| 16GB and it works fine, I also keep them running while I
| take a break to play a LoL match. Works great for me.
| danlugo92 wrote:
| Simulators run natively on x86 too; they simulate the
| API/ABI as opposed to emulating the processor.
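| You can check this yourself: on macOS the
| sysctl.proc_translated flag reports whether the calling
| process is running under Rosetta 2 (a minimal sketch;
| note the flag doesn't exist at all on Intel Macs):
|
|     import subprocess
|
|     # "0" = native arm64, "1" = translated under Rosetta 2.
|     out = subprocess.run(
|         ["sysctl", "-n", "sysctl.proc_translated"],
|         capture_output=True, text=True,
|     )
|     print(out.stdout.strip() or "flag not present")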
| dieortin wrote:
| I think you mean 8GB?
| kasperni wrote:
| I think he meant 16GB.
| brianwawok wrote:
| I think you mean 16GB?
| [deleted]
| jdauriemma wrote:
| I own an M1 Macbook Air with 16GB of RAM
| dwaltrip wrote:
| Yep. Unfortunately the M1 MacBook Pro currently only goes
| up to 8GB.
| ericlewis wrote:
| I am typing from an M1 MacBook Pro with 16GB.
|
| edit: you have to select the SSD size then you can choose
| the RAM.
| varispeed wrote:
| I hate to say it, but I am likely going to buy an M2 Mac.
| I don't like Apple and their anti-competitive tactics,
| but I admit they've earned their spot for now. However,
| as soon as a good PC competitor comes along, I'll drop
| Apple like a hot potato.
| oblio wrote:
| Well, here's hoping that both Linux and Windows work flawlessly
| on their hardware at some point.
| moooo99 wrote:
| Not even macOS works flawlessly on the hardware, so why
| should Windows and Linux? As far as Windows goes, not
| having every other update completely break my system
| would be a welcome change.
| lofi_lory wrote:
| Because with Linux, excitement can change things. What
| are you going to do if you miss something in iOS/macOS?
| The right people can in principle make anything work in
| Linux, but with macOS you are left praying that Apple
| decides your use case is their business case.
|
| Imagine what would happen if the compute/$ M1 laptops
| performed better in some respect with Linux than with
| macOS. Things may get out of hand when huge chunks of the
| Linux community get involved.
| oblio wrote:
| > Not even macOS works flawlessly on the hardware, why
| should Windows and Linux do.
|
| This is just pedantry and needless nitpicking. Replace
| "work flawlessly" with "work well" in my previous comment.
| vbezhenar wrote:
| Of all the operating systems, I'm finding macOS to be
| less annoying than the rest. So far Apple has not made a
| single computer suitable for me, but if they were to
| release something like a Mac Pro Cheap Edition (or
| whatever, I just want workstation-level specs for $2k),
| I'd switch to it. I don't really care about M1 or Intel;
| I think any modern CPU is fast enough for any task.
| tmccrary55 wrote:
| I'm really hoping Alder Lake or a similar AMD product
| gets PCs closer to M1-level performance and battery
| consumption.
|
| The M1 chip is amazing but I'm a tiling window manager man.
| kbd wrote:
| It's not exactly a tiling window manager, but if you can
| program some simple Lua then Hammerspoon is a godsend. You
| can program anything the other window managers for Mac
| (like Rectangle, Spectacle, etc.) can do, and you have
| complete freedom to set up your own keyboard shortcuts
| for anything.
|
| I have some predefined layouts[1] for my most common usage.
| So, one keyboard shortcut arranges the screen how I want, and
| I have other keyboard shortcuts[2] (along with using
| Karabiner Elements for a 'hyper' key) to open or switch to
| common apps.
|
| [1] https://github.com/kbd/setup/blob/1a05e5df545db0133cf7b6f
| 1bc...
|
| [2] https://github.com/kbd/setup/blob/1a05e5df545db0133cf7b6f
| 1bc...
| hajile wrote:
| https://ianyh.com/amethyst/
| _joel wrote:
| Not sure why you're being downvoted; I use Amethyst and
| love it.
| ripply wrote:
| I also am a tiling window manager man and tried that a
| few years back (as well as everything else on the market)
| when I had a Mac from work. Unfortunately, without real
| support from the OS, these are all just poor man's window
| managers and can't compare to the real thing. I gave up
| trying to use any of them and ended up installing a
| virtual machine where I could actually use one.
| galangalalgol wrote:
| I'm a GNOME addict. If you ever asked yourself "who are
| they building this monstrosity for?" - that would be me;
| it fits my exact preferences in a workflow out of the
| box. Unity irritates me to no end, and my wife's Mac is
| almost unusable for me, which she greatly appreciates.
| aryik wrote:
| You've gotten plenty of recommendations, but I'll add one for
| Magnet [1]. I've been using it for years, and I love it. One
| of the best software purchases I've ever made - small,
| lightweight, does exactly one thing and does it very well,
| and made by a small developer to boot.
|
| [1] https://magnet.crowdcafe.com/index.html
| reader_mode wrote:
| It seems like Apple is TSMC's priority customer, so I
| suspect they will always be a node ahead of AMD.
|
| That doesn't matter if Apple does mobile and low power
| stuff exclusively, but if they can scale this design to a
| higher TDP/core count, it's going to get interesting.
| anuragsoni wrote:
| This will obviously not be comparable to a tiling window
| manager, but I've been pretty happy with Rectangle [1] on
| my Mac. The keyboard mappings are pretty easy to
| configure and I've found it to work well even in
| multi-monitor setups.
|
| [1] https://github.com/rxhanson/Rectangle
| mwint wrote:
| +1 for Rectangle; I've been using it ever since it was
| Spectacle. There's nothing I really miss about a proper
| tiling window manager (though I'm sure hardcore users would
| disagree)
| anuragsoni wrote:
| > There's nothing I really miss about a proper tiling
| window manager (though I'm sure hardcore users would
| disagree)
|
| Agreed. I mostly just needed the keyboard-driven window
| snapping I was used to on GNOME, and Rectangle has filled
| that need 100%.
| volta83 wrote:
| This is macOS:
|
| - https://www.reddit.com/r/unixporn/comments/jupmda/aquayabai
| _...
|
| - https://www.reddit.com/r/unixporn/comments/mvuplf/yabaimaco
| s...
|
| It's called yabai (+ skhd):
| https://github.com/koekeishiya/yabai
|
| That is, you can have a tiling WM today with all the
| advantages of running macOS.
| tl wrote:
| From your third link:
|
|   System Integrity Protection needs to be (partially)
|   disabled for yabai to inject a scripting addition into
|   Dock.app for controlling windows with functions that
|   require elevated privileges. This enables control of
|   the window server, which is the sole owner of all
|   window connections, and enables additional features of
|   yabai.
|
| The risk Apple kills yabai after you're adjusted to it is
| real.
| Lownin wrote:
| I use yabai without disabling SIP. You get most of the
| features. It's the first tiling WM I've used, so it's
| possible I'm missing something critical without disabling
| SIP, but so far I'm quite happy with it despite whatever
| features are missing. YMMV, of course.
| volta83 wrote:
| > The risk Apple kills yabai after you're adjusted to it
| is real.
|
| This holds for anything in the Apple ecosystem, up to
| Fortnite.
|
| yabai has been going for a long time, and every issue
| could be worked around relatively painlessly.
| matwood wrote:
| > This holds for anything in the Apple ecosystem.
|
| Holds for anything in almost any ecosystem. With that
| said, Apple has stated over and over the Mac will remain
| developer friendly. Being able to disable SIP is, IMO,
| part of that.
| athorax wrote:
| FWIW, I run yabai without having disabled SIP and it
| works great. There is probably some subset of
| functionality I am missing out on, but it does what I
| need it to.
| shepherdjerred wrote:
| https://github.com/koekeishiya/yabai
| zucker42 wrote:
| Zen 4 will probably at least match the M1, but it will be a
| while before those chips come out and Apple will soon improve
| even more.
| ravi-delia wrote:
| I'm curious to see if Zen 4 actually manages it. It'll be
| an interesting datapoint as to what actually makes the M1
| better.
| intrasight wrote:
| I have a feeling that Apple has pulled ahead and will
| stay ahead for a LONG time. They are extending their
| moat. And by applying Mx chips to VR they will create a
| new moat.
| Synaesthesia wrote:
| They started pulling ahead in the A6-A7 days and never
| looked back. Amazing progression.
| totalZero wrote:
| Alder Lake will get closer because of the big.LITTLE
| structure, but I don't know if we will really see a contender
| from Intel until Meteor Lake. Lithography size gets too much
| attention for its nomenclature, but it actually matters for
| battery consumption and thermal management. Intel must
| execute flawlessly on 7nm and spend generously on capex to
| keep the ball rolling.
| [deleted]
| 2OEH8eoCRo0 wrote:
| "Antitrust is okay if you have a fast processor"
| twobitshifter wrote:
| For CPUs, Apple is actually upending the AMD-Intel
| duopoly; isn't that good for competition? Furthermore,
| AMD only recently broke back into the market, which Intel
| had a stranglehold on. This is the most competitive the
| CPU market has been since the early 00s.
| gpm wrote:
| "It's not my personal responsibility to attempt to enforce
| antitrust against companies by boycotting them at significant
| personal expense".
| sneak wrote:
| Additionally, it is still (for the moment) possible to
| engage with Apple computer hardware without using any of
| their bullshit services (like the App Store) into which all
| their anticompetitive behavior has so far been constrained.
|
| This is of course not so on Apple mobile devices: it's dox
| yourself to the App Store or GTFO over there.
| varispeed wrote:
| Exactly. There are bodies that are supposed to protect
| consumers from that behaviour; unfortunately, they have
| failed everyone massively. That in itself begs for an
| inquiry into how those institutions actually work and
| whether it is worth spending taxpayer money on them if
| they consistently fail to deliver.
| jjoonathan wrote:
| ...and what if that exact kind of navel-gazing is _why_
| they don't work? What then?
| varispeed wrote:
| Charge the people who failed to deliver, dispense huge
| fines and jail time. Then rebuild the institutions in a
| way that avoids the mistakes that made the previous ones
| fail.
| katbyte wrote:
| It would seem to me there is fairly healthy competition
| between Apple and Windows laptops?
| varispeed wrote:
| The thing is, every competitor is going to upgrade to
| these, and if you stay with inferior Intel products you
| put yourself at a competitive disadvantage. Unfortunately
| this is the current state of play. If I can't achieve
| something at the same speed a competitor can, I put
| myself in a bad spot. I try not to mix emotions and
| business; however, for a personal machine, I am rocking a
| 5950X and it is awesome.
| totalZero wrote:
| > I am rocking 5950X
|
| Irrelevant from a mass market perspective. That chip is
| still out of stock at Best Buy and the Amazon third party
| version is marked up 54% versus MSRP.
|
| Incidentally the 11900k is also out of stock at many
| retailers, but it's so much cheaper. You can still buy a
| pre-binned version that clocks 5.1GHz; even with markup
| that costs 30% less than the aforementioned third party
| 5950x.
|
| Availability and price matter. My take on AMD's heavy focus
| on the enterprise segment right now is that they have no
| choice. If your enterprise partners get faced with scarcity
| issues, they will lose faith in your supply chain. You can
| tell a retail customer to wait patiently for 6 months, and
| even an OEM that makes retail devices (eg Lenovo) may
| forgive some shortfalls as long as there's a substitute
| available, but Microsoft and Google aren't going to wait
| around in line like that.
| squeaky-clean wrote:
| Mass market isn't buying individual PC parts and
| assembling the PC, they're buying prebuilts or using
| whatever their office mass-purchased. Go on dell.com
| right now and you can order a PC with a 5950x and RTX
| 3080. Good luck buying either of those individually
| without writing a web bot.
| totalZero wrote:
| I just did. Fastest delivery (express) for ALIENWARE
| AURORA RYZEN(tm) EDITION R10 GAMING DESKTOP with base
| specs aside from 5950x and the cheapest liquid cooler
| (required by Dell for that CPU) is May 26. Some of the
| higher end default/featured configurations would deliver
| in late June. Not sure what's up with that.
|
| Honestly the price/performance ratio there is pretty nice
| in the eyes of a Mac user like me, but I don't know what
| office is buying Alienware, and a bulk order would no
| doubt take longer to deliver. Those are the only machines
| popping up when you filter for AMD 5000 series on
| dell.com.
|
| Considering that 5950x is AM4 compatible, folks who had
| bought a pre-built machine and want to upgrade are also
| part of the mass market. And I think you can't discredit
| the homebuilt PC crowd for a high-end desktop chip. The
| people who care enough to want this chip can probably
| figure out how to clip it into a motherboard and tighten
| a few screws and connectors here and there.
| philliphaydon wrote:
| What antitrust are we talking about?
| [deleted]
| jonplackett wrote:
| Apple's second version of everything is always the one to
| get. The first version is always exciting, but usually
| comes at some large expense.
|
| - iPhone 2 had 3G and was so much better than 1
|
| - iPad 2 was about 3x slimmer/lighter than 1
|
| Lots of other examples if I thought about it.
| niels_bom wrote:
| What, in your opinion, is the drawback of the M1 product
| series?
|
| I haven't read many negative reviews.
| mjhagen wrote:
| In the case of CPU architecture switches, Apple's gone with
| the same strategy every time so far: switch out the guts,
| keep the outside design. So maybe not negative reviews
| regarding the CPU, just a bit boring design.
|
| I disagree with OP though: not the second but the third
| generation has been the one to get for me: iPod 3rd gen,
| iPhone 4, Apple Watch Series 3. OK, the iPad 3 was a bit
| of a failure, but still, first Retina.
| greedo wrote:
| The new iMac diverges dramatically from this model. It's
| a completely new design, one that could only have been
| done after the switch off Intel.
| JohnBooty wrote:
| The most commonly cited limitation I've heard is the max
| of 16GB of RAM on the current M1 MacBooks. The limitation
| of a single external monitor is probably the second most
| common.
|
| A lot of users would like to run multiple external monitors
| and have > 16GB of RAM. I know I'm in that group.
| alanwreath wrote:
| Multi-monitor support. You can have two monitors max.
| outside1234 wrote:
| Memory, ports. That might be more of a product-level
| issue, but it's a blocker for me.
| jonplackett wrote:
| Don't get me wrong, it looks great. But so did iPad 1 and
| iPhone 1, until v2 came out.
|
| The main things I would hope for though are more RAM, and
| probably a much beefier GPU
| Synaesthesia wrote:
| It's got a better GPU than anything which uses less than
| 35W, probably even higher than that.
| jonplackett wrote:
| Yeah, it's great. I've recommended these laptops to 2
| people, and I probably would have got one too if there
| were a 15 inch one.
|
| I'm just hoping there are some more surprises in store
| with that slightly bigger battery.
| Dennip wrote:
| The RAM is limited to 16GB. IIRC the max I/O throughput
| is also somewhat limited, so you have to compromise on
| the number of USB4/10Gb LAN ports, etc.
| GloriousKoji wrote:
| It would be fine for 98% of the things I do, but I still
| need to do that 2%. With x86 CPUs I always had
| virtualization with USB passthrough as a final
| workaround, but with the M1 there are things I absolutely
| can't do.
| wulfklaue wrote:
| * Limited I/O
|
| * Max 16GB memory (at unjustified cost)
|
| * Limited multi-monitor support
|
| * No eGPU support (as of now)
|
| * Only about 50 to 55% of all the software is M1 ready
| (looking at some tracking sites). Technically this is not
| an M1 flaw, but you need something to push the market and
| iron out the bugs. By the time the M2 gets introduced,
| you may be at 60 or 70% - as in, less likely to run into
| issues, as the M1 users are the real beta testers. Even
| Parallels only recently got good M1 support (with massive
| speed increases).
|
| As of this moment, buying an M1 laptop is less of a
| beta-tester experience than it was several months ago. If
| you ended up buying in Nov/Dec, you spent a lot of time
| under Rosetta 2 or dealing with issues.
|
| > I haven't read many negative reviews.
|
| The above was always kind of glossed over by a lot of the
| reviews, as the YouTubers mostly looked at their own use
| for video editing etc., and there most software was on
| point very early in the release cycle.
|
| You saw a lot of reviews in Jan/Feb from people going
| back to Windows laptops after a month, not because the
| CPU was bad but because they ran into software issues.
|
| In the meantime the software situation has evolved a lot
| more, but the progress on making software M1 ready has
| also slowed down a lot.
|
| As a PC user I want an M1-like laptop that has long
| battery life, is silent and is still powerful (unlike a
| lot of Windows laptops, where it's always the old saying:
| pick 2, you can never have all 3).
|
| But I'd prefer one with 8 performance cores, double the
| iGPU cores (preferably with DDR5) for light gaming, and
| 16GB standard. So some MacBook Pro 16 or whatever, if the
| price is not insane. We shall see what Apple
| introduces...
|
| So far the new offerings from AMD and Intel are fast but
| still power hungry and heat generating (aka fan noise!).
| AMD is only going big.LITTLE at 3nm.
|
| Intel's Alder Lake may be an M1 competitor (for battery
| life under light loads), but it's again a
| first-generation product, so expect to be a beta tester
| for a long time until Windows gets fine-tuned to properly
| use the little cores! For heavy loads... well, 10nm is
| 10nm, no matter how many +++ you add.
| ajuc wrote:
| Dedicated graphics cards don't work. No Linux support.
| Reason077 wrote:
| > _"No Linux support"_
|
| It's coming:
|
| https://arstechnica.com/gadgets/2021/04/apple-m1-hardware
| -su...
|
| (Virtualized Linux is already well-supported on M1 Macs)
| philjones_ca wrote:
| The original Intel MacBook comes to mind as well:
|
| - released with the (32-bit) Core Duo processor in May
| 2006
|
| - replaced with the (64-bit) Core 2 Duo model in November
| 2006
|
| It was only supported through 10.6 Snow Leopard, as 10.7
| Lion went 64-bit only.
| jonplackett wrote:
| Yes! Good example. I was thinking about that too but
| couldn't remember my history enough to explain.
|
| AirPods vs AirPods Pro is another one I just remembered.
|
| I think watch v2 was a big improvement too.
| r00fus wrote:
| My parents owned (still have) an OG Intel Mac Mini - the
| box said "Core Solo". Seems like that was one of the few
| devices sold with that chip.
| varispeed wrote:
| I still have an iPad 2. I use it in the kitchen to browse
| recipes.
| mobilio wrote:
| I still have another one near the bed for reading eBooks.
| intergalplan wrote:
| I have a first-gen iPad Mini that still sees light use.
| Approximately the same internals as the iPad 2, AFAIK.
| danaris wrote:
| Obviously, we won't know if the M1X/M2 has a similar big
| advantage over the M1 until it ships, but...
|
| You can also look at it this way: The M1 is _not_ the first
| version. It is (IIRC) an enhanced version of the A14 SoC
| family, and a clear evolution of the work Apple has been
| doing on its silicon for the past decade.
| geodel wrote:
| No need to hate. Nowadays people in general do make necessity
| out of convenience and virtue out of necessity.
| sigzero wrote:
| I almost went for the M1 but "1.0" kept me sane. I will
| definitely go for the M2.
| soapdog wrote:
| people wanting more cores, more memory, eGPU support, and I'm
| here just wanting them to have multiple colours...
___________________________________________________________________
(page generated 2021-04-27 23:01 UTC)