[HN Gopher] Apple's new M1 Pro and M1 Max processors
       ___________________________________________________________________
        
       Apple's new M1 Pro and M1 Max processors
        
       Author : emdashcomma
       Score  : 684 points
       Date   : 2021-10-18 17:24 UTC (5 hours ago)
        
 (HTM) web link (www.apple.com)
 (TXT) w3m dump (www.apple.com)
        
       | dewiz wrote:
       | ...and the Apple store is down lol :-)
        
       | klelatti wrote:
       | I think the M1 Max has more transistors than any commercially
       | available CPU or GPU? In a $3499 laptop!
        
         | nazgulnarsil wrote:
          | 57 billion transistors? A 5950X has under 20; a 3060 Ti has
          | under 20.
         | 
         | are they counting the RAM?
        
           | klelatti wrote:
           | Does either have a 10 core CPU or large cache memory?
        
             | wmf wrote:
             | Yes, the 5950X has 16 big CPU cores and 64 MB of L3 cache.
        
               | klelatti wrote:
               | My mistake. But it doesn't have a 32 core GPU!
               | 
               | Agreed these counts look high - but the fact that they
               | can slot this into a $3,499 laptop is remarkable and must
                | say something about the cost-effectiveness of the TSMC
                | 5nm process.
        
           | modulusshift wrote:
           | Nope, RAM is off die but on package.
        
       | hbn wrote:
       | These things came at a great time for me. My late-2014 MBP just
       | received its last major OS upgrade (Big Sur), so I'm officially
       | in unsupported waters now. I was getting concerned in that era
       | from 2015-2019 with all the bad decisions (butterfly keyboard, no
       | I/O, touchbar, graphics issues, etc.) but this new generation of
       | MacBooks seems to have resolved all my points of concern.
       | 
       | On the other hand, my late-2014 model is still performing...
       | fine? It gets a bit bogged down running something moderately
       | intensive like a JetBrains IDE (which is my editor of choice), or
        | when I recently used it to play a Jackbox Party Pack with
        | friends, but for most things it's pretty serviceable. I got it
        | before starting university, it carried me all the way to getting
        | my bachelor's degree last year, and it's still trucking along
        | just fine. Definitely one of the better purchases I've made.
        
         | breiti wrote:
         | You could easily upgrade macOS using
         | https://dortania.github.io/OpenCore-Legacy-
         | Patcher/MODELS.ht....
         | 
         | On the hardware side, you could open it up, clean the fan, re-
         | apply thermal paste (after so many years, this will make a big
         | difference) and maybe even upgrade the SSD if you feel like it.
         | 
         | That way, this laptop can easily survive another 1-3 years
         | depending on your use-cases.
        
       | dang wrote:
       | All: let's keep this thread about the processors, and talk about
       | the new MBPs in the other thread:
       | https://news.ycombinator.com/item?id=28908383.
       | 
       | Edit: to read all the 600+ comments in this thread, click More at
       | the bottom of the page, or like this:
       | 
       | https://news.ycombinator.com/item?id=28908031&p=2
       | 
       | https://news.ycombinator.com/item?id=28908031&p=3
        
       | danielovichdk wrote:
       | I feel happy for the Apple community with these processors.
       | 
        | But I can't stop thinking about my Intel machine. It feels like
        | I've been left in the dust, and nothing that remotely looks like
        | the M1 seems to be coming.
        
         | enraged_camel wrote:
         | Intel is years behind Apple, with no real strategy for catching
         | up. Apple most likely already has the M2 ready, and the M3
         | under heavy development.
        
           | spacedcowboy wrote:
           | M2 was sent for manufacture earlier in the year, if you
           | believe the rumor mill (and to be fair, they were spot on
           | this time around)
        
           | rpmisms wrote:
           | The good news is that AMD is working their butt off on this,
           | and seems to be much closer to Apple in terms of Watts/unit
           | of performance. Intel needs to get in gear, now.
        
           | terafo wrote:
            | Intel was stuck on the same node for 6 years. Alder Lake
            | looks very promising, and the Alchemist GPUs do too. They
            | will have the laptop CPU performance crown in less than 6
            | months. Power usage will be much better than it is now.
            | Their 3D stacking strategy is very promising; it will allow
            | for many very interesting SKUs. I wouldn't count them out.
        
       | Asmod4n wrote:
        | Prediction for Mac Pros and iMac Pros: several SoCs on the
        | mainboard, interconnected with a new bus, 16 CPU cores for each
        | SoC, 4 SoCs max. The on-SoC RAM will act as an L4 cache, and
        | they will share normal, user-replaceable DDR5 RAM for "unified"
        | access.
        
       | d3nj4l wrote:
       | Did they really just claim 30-series tier performance on the max
       | GPU? If that's true that would be insane!
        
         | [deleted]
        
         | KarlKemp wrote:
         | Equivalent to "the notebook with the fastest GPU we could find
         | at half the power" is how I remember it...
         | 
         | I'm just not entirely certain what GPU performance does for
         | me...? I don't work with video, there aren't any games, and I'm
         | not playing them, anyway. Does iTerm2 scrolling get better?
         | 
         | I used to be quite happy with the GeForce 7xx(?) and
         | tensorflow, and this seems like it would have quite a bit of
         | power for ML. Unfortunately, the software just isn't there
         | (yet?).
        
           | pjmlp wrote:
           | Something like this?
           | 
           | https://developer.apple.com/metal/tensorflow-plugin/
        
         | adtac wrote:
         | the comparison was only on laptops, but still impressive if
         | they did that with 70% (?) less power
        
           | neogodless wrote:
           | Well, no - they immediately followed the discrete laptop
           | graphic comparison with a desktop graphic comparison,
           | highlighting how much more power they draw for "the same"
           | performance.
        
             | smileybarry wrote:
             | That wasn't for desktop graphics, it was for a top-end
             | laptop graphics SKU (I think RTX 3080 Mobile on a
             | ~135W-150W TDP configuration?). Otherwise the graph would
             | extend all the way to 360W for a RTX 3090.
        
               | ls612 wrote:
               | And I think based on these numbers that a desktop 3090
               | would have well over double the performance of the M1
                | Max. It's apples to oranges, but let's not go crazy just
               | yet.
               | 
               | Now, I am extremely excited to see what they will come up
               | with for the Mac Pro with a desktop thermal budget. That
               | might just blow everything else by any manufacturer
               | completely out of the water.
        
               | neogodless wrote:
               | Yup - I misheard while watching.
               | 
               | The chart only shows a line up to about 105W, so it's not
               | clear what they're trying to represent there. (Not that
               | there's any question this seems to be way more
               | efficient!)
        
             | magicalist wrote:
             | > _Well, no - they immediately followed the discrete laptop
             | graphic comparison with a desktop graphic comparison_
             | 
             | pretty sure the comparison was with "the most powerful PC
             | laptop we could find", which makes sense because they then
             | talked about how much it was throttled when running only on
             | battery while the new M1 is not.
        
         | Symmetry wrote:
         | GPU workloads are very parallel. By throwing more transistors
         | at the problem while lowering clock rates you can get pretty
         | good performance even in a constrained power budget.
        
         | baybal2 wrote:
         | > claim 30-series tier performance on the max GPU? If that's
         | true that would be insane!
         | 
         | TDP, TDP, TDP!
         | 
          | With a big enough heatsink, the performance can be
          | correspondingly high (roughly perf = sqrt(TDP)).
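          | 
          | Taken at face value, perf = sqrt(TDP) means doubling the power
          | budget only buys ~1.41x the performance, and halving it only
          | costs ~29%.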
        
           | KarlKemp wrote:
            | ...but they are claiming 50-70% lower power usage, and
            | therefore 50-70% lower TDP as well.
        
         | [deleted]
        
       | spaceisballer wrote:
        | Lots of junk comments, but I guess that happens with Apple
        | announcements. The laptops seem impressive to me; I want to see
        | real-world usage metrics. They're pushing hard on the
        | performance-per-watt metric, and no doubt these have a lot of
        | power while drawing less of it. Seems like they listened to the
        | outcry regarding the Touch Bar and more ports. Seems like this
        | should sell well.
        
         | Woden501 wrote:
         | Seems they may have finally noticed the hit from a decent
         | number of the pro's using their products migrating to different
         | platforms, and realized they needed to take a few steps back on
         | the more radical innovations to put out a solid working
         | machine. Hell I haven't wanted an Apple machine since the early
         | days of the unibody when other manufacturers started releasing
         | the same form-factor. This has me considering one for my next
         | development machine depending on the price premium over the
         | competition.
        
         | arvinsim wrote:
         | Yup, waiting for the performance review embargo to lift.
        
       | lvl100 wrote:
       | I wonder if 64GB in unified memory will jumpstart ML developments
       | for Macs. I cannot wait to run a Mac Mini farm.
        
       | notquitehuman wrote:
       | I'm waiting on Apple's final decision on the CSAM scanner before
       | I buy any more hardware from them. These processors look cool,
       | but I don't think they're worth the Apple premium if they're also
       | spying for law enforcement.
        
         | ostenning wrote:
         | valid point
        
       | Brakenshire wrote:
       | Can the M1 MacBooks be used with Linux?
        
         | [deleted]
        
         | pantalaimon wrote:
         | It's a work in progress. See https://asahilinux.org/blog/ for
         | the latest updates.
        
           | duskwuff wrote:
           | And marcan42 has specifically promised to start bringup work
           | on the M1 Pro next week:
           | 
           | https://twitter.com/marcan42/status/1450163929993269249
        
           | Brakenshire wrote:
           | This is interesting:
           | 
           | > However, Apple is unique in putting emphasis in keeping
           | hardware interfaces compatible across SoC generations - the
           | UART hardware in the M1 dates back to the original iPhone!
           | This means we are in a unique position to be able to try
           | writing drivers that will not only work for the M1, but may
           | work -unchanged- on future chips as well. This is a very
           | exciting opportunity in the ARM64 world. We won't know until
           | Apple releases the M1X/M2, but if we succeed in making enough
           | drivers forwards-compatible to boot Linux on newer chips,
           | that will make things like booting older distro installers
           | possible on newer hardware. That is something people take for
           | granted on x86, but it's usually impossible in the embedded
           | world - and we hope we can change that on these machines.
        
       | haberman wrote:
        | I always thought it was strange that "integrated graphics" was,
        | for years, synonymous with "cheap, underperforming" compared to
        | the power of a discrete GPU.
       | 
       | I never could see any fundamental reason why "integrated" should
       | mean "underpowered." Apple is turning things around, and is
       | touting the benefits of high-performance integrated graphics.
        
         | SkeuomorphicBee wrote:
         | > I never could see any fundamental reason why "integrated"
         | should mean "underpowered."
         | 
          | There was always one reason: limited memory bandwidth. You
          | simply couldn't cram in enough pins and traces for all the
          | processor I/O plus a memory bus wide enough to feed a powerful
          | GPU (at least not at a reasonable price).
        
           | hajile wrote:
           | We solved that almost a decade ago now with HBM. Sure, the
           | latencies aren't amazing, but the power consumption numbers
           | are and large caches can hide the higher access latencies
           | pretty well in almost all cases.
        
             | dragontamer wrote:
             | PS4 / PS5 / XBox One / XBox Series X are all iGPU but with
             | good memory bandwidths.
        
             | terafo wrote:
              | The only time I can remember HBM being used with some kind
              | of integrated graphics was the strange Intel NUC with a
              | Vega GPU, and IIRC they were on the same die.
        
         | fulafel wrote:
         | The software side hasn't been there on x86 GP platforms, even
         | though AMD tried. It's worked out better on consoles.
        
           | boardwaalk wrote:
           | What software is missing? I figured the AMD G-series CPUs
           | used the same graphics drivers and same codepaths in those
           | drivers for the same (Vega) architecture.
           | 
            | My impression was that it was still the hardware holding
            | things back: everything but the latest desktop CPUs is still
            | using the older Vega architecture. And even those latest
            | desktop CPUs are essentially PS5 chips that got binned out.
        
         | cududa wrote:
          | Perhaps with Vista? "Integrated" graphics meant something like
          | the Intel 915, which couldn't run "Aero". Even if you had the
          | Intel 945, if you had low-bandwidth RAM, graphics performance
          | still stuttered. Good article:
         | https://arstechnica.com/gadgets/2008/03/the-vista-capable-de...
        
         | throwawaywindev wrote:
         | Video game consoles have been using integrated graphics for at
         | least 15 years now, since Playstation 3 and Xbox 360.
        
           | monocasa wrote:
            | Longer, since integrated graphics used to mean integrated
            | onto the north bridge and its main memory controller. nForce
            | integrated chipsets with GPUs in fact started from the
            | machinations of the original Xbox switching to Intel from AMD
            | at the last second.
        
           | ip26 wrote:
           | In that case, it's more like discrete graphics with
           | integrated CPU :)
        
           | gerardvivancos wrote:
            | If you mean including the PS3 and X360, those two consoles
            | had discrete GPUs. The move to AMD APUs came with the Xbox
            | One and PS4 generation.
        
           | terafo wrote:
            | You are mistaken. On both the PS3 and Xbox 360, the CPU and
            | GPU are on different chips and made by different vendors
            | (CPU by IBM and GPU by Nvidia in the case of the PS3; CPU by
            | IBM and GPU by ATI for the Xbox 360). In the PS4/XOne
            | generation, though, both use a single die with unified
            | memory for everything, and their GPUs could be called
            | integrated.
        
             | gsnedders wrote:
              | For the 360, from 2010 production (when they introduced the
              | 45nm shrink), the CPU and GPU were merged into a single
              | chip.
        
               | cududa wrote:
                | Yup. Prior to that they were absolutely different dies,
                | just on the same package.
        
           | mod wrote:
           | Yeah, and vendors like Bungie are forced to cap their
           | framerates at 30fps (Destiny 2).
        
             | hunterb123 wrote:
             | They capped PC as well.
        
           | [deleted]
        
         | satya71 wrote:
          | Very simple: thermal budget. Chip performance is limited by
          | thermal budget. You just can't spend more than roughly 100W in
          | a single package without going to very expensive and esoteric
          | cooling mechanisms.
        
           | gpt5 wrote:
            | As the charts Apple shared in the event showed, you hit
            | diminishing returns in performance/watt pretty quickly.
            | 
            | Sure, it'd be tough to be the top-performing chip on the
            | market this way, but you can get pretty close.
        
       | bla3 wrote:
       | This is about the processors, not the laptops, so commenting on
       | the chips instead. They look great, but they look like they're
       | the M1 design, just more of it. Which is plenty for a laptop! But
       | it'll be interesting to see what they'll do for their desktops.
       | 
        | Most of the additional chip area went into more GPU cores and
        | special-purpose video codec hardware. It's "just" two more CPU
        | cores than the vanilla M1, and some of the M1's efficiency cores
        | became performance cores. So CPU-bound things like compiling
        | code will be "only" 20-50% faster than on the M1 MacBook. The
        | big wins are for GPU-heavy and codec-heavy workloads.
        | 
        | That makes sense, since that's where most users will need their
        | performance. I'm still a bit sad that the era of "general
        | purpose computing", where the CPU can do all workloads, is
        | coming to an end.
       | 
       | Nevertheless, impressive chips, I'm very curious where they'll
       | take it for the Mac Pro, and (hopefully) the iMac Pro.
        
         | klelatti wrote:
          | In this context the absence of the 27-inch iMac was
          | interesting. If these SoCs were not deemed to be 'right' for
          | the bigger iMac, then possibly a more CPU-focused / developer-
          | focused SoC is in the works for it?
        
           | [deleted]
        
           | wmf wrote:
           | Nah, it'll be the same M1 Max. Just wait a few months like
           | with the 24" iMac.
        
             | klelatti wrote:
             | You're probably right. Maybe they have inventory to work
             | through!
        
               | sroussey wrote:
               | Yeah, M1 Pro/Max on iMac 30" likely in 2022 H1. Mini
               | also, I imagine.
        
               | matthew_kuiash wrote:
               | Yup. Once I saw the 24" iMac I knew the 27" had had it's
               | chips. 30" won't actually be much bigger than the 27" if
               | the bezels shrink to almost nothing - which seems to be
               | the trend.
        
               | nicoburns wrote:
               | They might also have supply constraints on these new
               | chips. I suspect they are going to sell a lot of these
               | new MacBook Pros
        
               | iSnow wrote:
               | They also have a limited headcount and resources so they
               | wouldn't want to announce M1x/pro/max for all machines
               | now and have employees be idle for the next 3 months.
               | 
               | Notebooks also have a higher profit margin, so they sell
               | them to those who need to upgrade now. The lower-margin
               | systems like Mini will come later. And the Mac Pro will
               | either die or come with the next iteration of the chips.
        
               | ellisv wrote:
                | The Mac Pro might be blocked on, or still progressing
                | through, design changes more so than chip changes.
        
           | julienb_sea wrote:
           | I doubt they are going to make different chips for prosumer
           | devices. They are going to spread out the M1 pro/max upgrade
           | to the rest of the lineup at some point during the next year,
           | so they can claim "full transition" through their quoted 2
           | years.
           | 
            | The wildcard is the actual Mac Pro. I suspect we aren't
            | going to hear about the Mac Pro until the next Sept/Oct
            | events, and it's super unclear what direction they are going
            | to go. Maybe allowing configs of multiple M1 Max SoCs
            | somehow working together. Seems complicated.
        
             | klelatti wrote:
             | On reflection I think they've decided that their pro users
             | want 'more GPU not more CPU' - they could easily have added
             | a couple more CPU cores but it obviously wasn't a priority.
             | 
              | Agreed that it's hard to see how designing a CPU just for
              | the Mac Pro would make any kind of economic sense, but I'm
              | equally struggling to see what else they can do!
        
         | spicybright wrote:
         | I'm going to sound dumb for this, but how difficult do any of
         | you think it would be to make a computer with 2 M1 chips? Or
         | even 4?
        
           | davrosthedalek wrote:
           | I think the problem would be how one chip can access the
           | memory of the other one. The big advantage in the M1xxxx is
           | the unified memory. I don't think the chips have any hardware
           | to support cache coherency and so on spanning more than one
           | chip.
        
           | fulafel wrote:
            | If you wanted more than a networked cluster of M1s in a box,
            | you would have to implement a single-system-image
            | abstraction in the OS, using just software plus virtual
            | memory. You'd use PCIe as the interconnect. Similar has been
            | done by other vendors for server systems, but it has
            | tradeoffs that would probably not make sense for Apple now.
           | 
           | A more realistic question would be what good hw multisocket
           | SMP support would look like in M1 Max or later chips, as that
           | would be a more logical thing to build if Apple wanted this.
        
           | biggieshellz wrote:
           | They're not meant to go together like that -- there's not
           | really an interconnect for it, or any pins on the package to
           | enable something like that. Apple would have to design a new
           | Mx SoC with something like that as an explicit design goal.
        
           | lemoncucumber wrote:
           | The M1 is a system-on-a-chip so AIUI there's a bunch of stuff
           | on there that you wouldn't want two of.
        
           | sys_64738 wrote:
          | I think the issue is OS management of tasks to prevent CPU 1
          | from having to access the memory of CPU 2.
        
         | baybal2 wrote:
         | Actually no "extra" chip area in comparison to x86 based
         | solution.
         | 
         | They just throw away so much of cruft from the die like PCIE
         | PHYs, and x86 legacy I/O with large area analog circuitry.
         | 
         | Redundant complex DMA, and memory controller IPs are also
         | thrown away.
         | 
         | Clock, and power rails on the SoC are also probably taking less
         | space because of more shared circuitry.
         | 
         | Same with self-test, debug, fusing blocks, and other small
         | tidbits.
        
           | azinman2 wrote:
            | This is very interesting, and the first time I've heard /
            | thought about this. I wonder how much of the power
            | efficiency comes from exactly these things?
        
             | baybal2 wrote:
              | PCIe is quite power hungry when it works at full throttle.
              | 
              | The seeming power efficiency as PCIe went 1.0, 2.0, 3.0
              | ... was due to dynamic power control and link sleep.
              | 
              | On top of that, they simply don't haul memory nonstop over
              | PCIe anymore, since data going to/from the GPU simply
              | isn't moving anywhere.
        
         | sys_64738 wrote:
          | The way I interpreted it is that it's like Lego, so they can
          | add more fast cores or more efficiency cores depending on the
          | platform's needs. The successor generations will be new Lego
          | building blocks.
        
         | cududa wrote:
         | Edit: I was wrong! Thanks for pointing it out
         | 
          | Not exactly. The M1 CPU, GPU, and RAM were all capped in the
          | same package. The new ones appear to be more like a single
          | board soldered onto the mainboard, with discrete CPU, GPU, and
          | RAM packages each capped individually, if their "internals"
          | promo video is to be believed (and it usually is an exact
          | representation of the shipping product).
         | https://twitter.com/cullend/status/1450203779148783616?s=20
         | 
          | Suspect this is a great way for them to manage demand and
          | various yields by having 2 CPUs (or one, if the difference
          | between Pro/Max is yield on memory bandwidth) and discrete
          | RAM/GPU components.
        
           | modulusshift wrote:
            | The CPU and GPU are one die; you're looking at RAM chips on
            | each side of the package. The M1 also had the RAM separate
            | but on package.
           | 
           | M1:
           | https://d3nevzfk7ii3be.cloudfront.net/igi/ZRQGFteQwoIVFbNn
        
         | lowbloodsugar wrote:
         | Really curious if the memory bandwidth is entirely available to
         | the CPU if the GPU is idle. An nvidia RTX3090 has nearly 1TB/s
         | bandwidth, so the GPU is clearly going to use as much of the
         | 400GB/s as possible. Other unified architectures have multiple
         | channels or synchronization to memory, such that no one part of
         | the system can access the full bandwidth. But if the CPU can
         | access all 400GB/s, that is an absolute game changer for
         | anything memory bound. Like 10x faster than an i9 I think?
        
           | londons_explore wrote:
           | I suspect the GPU is never really idle.
           | 
            | Even a simple screen refresh, blending say 5 layers and
            | outputting to a 4K screen, is ~190 Gbit/s at 144 Hz.
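            | 
            | Rough math, assuming 4 bytes per pixel: 3840 x 2160 pixels x
            | 4 bytes x 5 layers x 144 Hz ~= 24 GB/s, i.e. ~190 Gbit/s.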
        
             | lowbloodsugar wrote:
             | Don't know much about the graphics on an M1. Does it not
             | render to a framebuffer? Is that framebuffer spread over
             | all 4 memory banks? Can't wait to read all about it.
        
             | modulusshift wrote:
              | Apple put ProMotion in the built-in display, so while it
              | can ramp up to 120Hz, it'll idle at more like 24Hz when
              | showing static content. (The iPad Pro goes all the way down
             | to 10Hz, but some early sources seem to say 24Hz for these
             | MacBook Pros.) There may also be panel self refresh
             | involved, in which case a static image won't even need that
             | much. I bet the display coprocessors will expose the
             | adaptive refresh functionality over the external display
             | connectors as well.
        
             | snek_case wrote:
             | Don't Apple iPhones use an adaptive refresh rate nowadays?
        
               | andy_ppp wrote:
               | Indeed ProMotion is coming to these new MacBooks too.
        
           | hajile wrote:
           | AMD showed with their Infinity Cache that you can get away
           | with much less bandwidth if you have large caches. It has the
           | side effect of radically reducing power consumption.
           | 
           | Apple put 32MB of cache in their latest iPhone. 128 or even
           | 256MB of L3 cache wouldn't surprise me at all given the power
           | benefits.
        
           | znwu wrote:
            | Not sure if it will be available, but 400GB/s is way too much
            | for 8 cores to take up. You would need some sort of AVX-512
            | to hog that much bandwidth.
            | 
            | Moreover, it's not clear how much bandwidth/width the M1 Max
            | CPU interconnect/bus provides.
           | 
           | --------
           | 
           | Edit: Add common sense about HPC workloads.
           | 
            | There is a fundamental idea called the _memory-access-to-
            | computation_ ratio. We can't assume a 1:0 ratio, since that
            | would mean doing literally nothing except copying.
            | 
            | Typically your program needs serious fixing if it can't
            | achieve 1:4. (This figure comes from a CUDA course, but I
            | think it should be similar for SIMD.)
            | 
            | Edit: also, a lot of that bandwidth is fed through cache.
            | _Locality_ will eliminate some orders of magnitude of memory
            | accesses, depending on the code.
        
             | lowbloodsugar wrote:
              | Don't know the clock speed, but 8 cores at 3GHz working on
              | 128-bit SIMD is 8 x 3 x 16 bytes = 384GB/s, so we are in
              | the right ballpark. Not that I personally have a use for
              | that =) Oh, wait, bloody Java GC might be a use for that.
              | (LOL, FML or both).
        
               | dragontamer wrote:
               | But the classic SIMD problem is matrix-multiplication,
               | which doesn't need full memory bandwidth (because a lot
               | of the calculations are happening inside of cache).
               | 
                | The question is: what kinds of problems do people have
                | that want 400GB/s bandwidth on a CPU? Well, probably
                | none, frankly. The bandwidth is really for the iGPU.
                | 
                | The CPU just "might as well" have it, since it's a
                | system-on-a-chip. CPUs usually don't care too much about
                | main-memory bandwidth, because it's 50ns+ of latency
                | away (or ~200 clock ticks). So to get a CPU going in any
                | typical capacity, you'll basically want to operate out
                | of L1 / L2 cache.
               | 
               | > Oh, wait, bloody Java GC might be a use for that. (LOL,
               | FML or both).
               | 
                | For example, I know you meant the GC as a joke. But if
                | you think about it, a GC is mostly following
                | pointer->next kinds of operations, which means it's
                | mostly latency bound, not bandwidth bound. It doesn't
                | matter that you can read 400GB/s; your CPU is going to
                | read an 8-byte pointer, wait 50 nanoseconds for the RAM
                | to respond, get the new value, and then read a new
                | 8-byte pointer.
                | 
                | Unless you can fix memory latency (and hint, no one seems
                | to be able to do so), you'll only be able to hit 160MB/s
                | or so. No matter how high your theoretical bandwidth is,
                | you get latency-locked at a much lower value.
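                | 
                | Back-of-envelope: one 8-byte pointer per 50 ns round
                | trip is 8 B / 50 ns = 160 MB/s, no matter how wide the
                | bus is.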
        
               | carlhjerpe wrote:
               | Doesn't prefetching data into the cache more quickly
               | assist in execution speed here?
        
               | dragontamer wrote:
               | How do you prefetch "node->next" where "node" is in a
               | linked list?
               | 
               | Answer: you literally can't. And that's why this kind of
               | coding style will forever be latency bound.
               | 
               | EDIT: Prefetching works when the address can be predicted
               | ahead of time. For example, when your CPU-core is reading
               | "array", then "array+8", then "array+16", you can be
               | pretty damn sure the next thing it wants to read is
               | "array+24", so you prefetch that. There's no need to wait
               | for the CPU to actually issue the command for "array+24",
               | you fetch it even before the code executes.
               | 
               | Now if you have "0x8009230", which points to
               | "0x81105534", which points to "0x92FB220", good luck
               | prefetching that sequence.
               | 
               | --------
               | 
                | Which is why servers use SMT / hyperthreading, so that
                | the core can "switch" to another thread while waiting
                | out those 50 nanoseconds / 200 cycles or so.
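                | 
                | A minimal C sketch of the difference (hypothetical code,
                | not from any real GC or benchmark): the array loop's
                | next address is computable before the current load
                | finishes; the list loop's next address only exists after
                | it.
                | 
                |     #include <stddef.h>
                | 
                |     /* streaming: addresses are predictable, so the
                |        hardware prefetcher can run ahead of the loop */
                |     long sum_array(const long *a, size_t n) {
                |         long s = 0;
                |         for (size_t i = 0; i < n; i++)
                |             s += a[i];      /* next load: &a[i+1] */
                |         return s;
                |     }
                | 
                |     struct node { struct node *next; long val; };
                | 
                |     /* pointer chasing: each iteration must wait a full
                |        memory round trip before the next can begin */
                |     long sum_list(const struct node *p) {
                |         long s = 0;
                |         for (; p; p = p->next)
                |             s += p->val;
                |         return s;
                |     }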
        
               | carlhjerpe wrote:
               | I don't really know how the implementation of a tracing
               | GC works but I was thinking they could do some smart
               | memory ordering to land in the same cache-line as often
               | as possible.
               | 
               | Thanks for the clarifications :)
        
               | monocasa wrote:
               | Interestingly earlyish smalltalk VMs used to keep the
               | object headers in a separate contiguous table.
               | 
                | Part of the problem, though, is that the object graph
                | walk pretty quickly becomes non-contiguous, regardless of
                | how it's laid out in memory.
        
               | znwu wrote:
               | Yeah the marking phase cannot be efficiently vectorized.
               | But I wonder if it can help with compacting/copying
               | phase.
               | 
                | Also, to me the process sounds oddly similar to vmem
                | table walking. There is currently a RISC-V J extension
                | drafting group. I wonder what they can come up with.
        
               | [deleted]
        
             | [deleted]
        
             | GeekyBear wrote:
             | A single big core in the M1 could pretty much saturate the
             | memory bandwidth available.
             | 
             | https://www.anandtech.com/show/16252/mac-mini-
             | apple-m1-teste...
        
             | terafo wrote:
             | > _Not sure if it will be available, but 400GB /s is way
             | too much for 8 cores to take up. You would need some sort
             | of avx512 to hog up that much bandwidth._
             | 
              | If we assume a frequency of 3.2GHz and an IPC of 3 with
              | well-optimized code (which is conservative for the
              | performance cores, since they are extremely wide) and count
              | only performance cores, we get 5 bytes per instruction. M1
              | supports 128-bit Arm Neon, so peak bandwidth usage per
              | instruction (if I didn't miss anything) is 32 bytes.
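              | 
              | Worked out (my assumptions, not Apple's figures): 8 P-cores
              | x 3.2GHz x 3 IPC = 76.8G instructions/s, and 400GB/s /
              | 76.8G ~= 5.2 bytes per instruction. A 128-bit Neon load
              | moves 16 bytes (32 with a load pair), so a load-heavy loop
              | could plausibly soak up all of it.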
        
           | tkrsh wrote:
            | More memory bandwidth = 10x faster than an i9? This makes no
            | sense to me. Don't clock speed and core count determine the
            | major part of the performance of a CPU?
        
         | fotta wrote:
          | I think the higher memory ceiling is also a huge win, with
          | support for up to 64GB.
        
           | christkv wrote:
           | And the much higher memory bandwidth
        
           | PaulKeeble wrote:
            | 400GB/s available to the CPU cores in a unified memory is
            | going to really help certain workloads that are very memory
            | dominant on modern architectures. Both Intel and AMD are
            | solving this with ever-increasing L3 cache sizes, but just
            | using attached memory in an SoC has vastly higher memory
            | bandwidth potential, and probably better latency too,
            | especially on work that doesn't fit in ~32MB of L3 cache.
        
             | amelius wrote:
             | > 400GB/s available to the CPU cores in a unified memory
             | 
             | It's not just throughput that counts, but latency. Any
             | numbers to compare there?
        
             | emsy wrote:
             | Good point. Especially since a lot of software these days
             | is not all that cache friendly. Realistically this means we
             | have 2 years or so till further abstractions eat up the
             | performance gains.
        
             | Unklejoe wrote:
             | The M1 still uses DDR memory at the end of the day, it's
             | just physically closer to the core. This is in contrast to
             | L3 which is actual SRAM on the core.
             | 
             | The DDR being closer to the core may or may not allow the
             | memory to run at higher speeds due to better signal
             | integrity, but you can purchase DDR4-5333 today whereas the
             | M1 uses 4266.
             | 
              | The real advantage is that the M1 Max uses 8 channels,
              | which is impressive considering that's as many as an AMD
              | EPYC, while operating at about twice the speed.
        
               | dragontamer wrote:
               | > The M1 still uses DDR memory at the end of the day,
               | it's just physically closer to the core. This is in
               | contrast to L3 which is actual SRAM on the core.
               | 
                | But they're probably using 8 channels of LPDDR5, if this
                | 400GB/s number is to be believed. That is far more
                | memory channels / bandwidth than any normal chip released
                | so far, EPYC and Skylake-server included.
        
               | duskwuff wrote:
               | It's more comparable to the sort of memory bus you'd
               | typically see on a GPU... which is exactly what you'd
               | hope for on a system with high-end integrated graphics.
               | :)
        
               | dragontamer wrote:
               | You'd expect HBM or GDDR6 to be used. But this is
               | seemingly LPDDR5 that's being used.
               | 
                | So it's still quite unusual. It's like Apple decided to
                | take commodity phone RAM and just make many parallel
                | channels of it... rather than using high-speed RAM to
                | begin with.
                | 
                | HBM is specifically designed to be soldered near a
                | CPU/GPU as well. For them to be soldering commodity
                | LPDDR5 is kinda weird to me.
               | 
               | ---------
               | 
               | We know it isn't HBM because HBM is 1024-bits at lower
               | clock speeds. Apple is saying they have 512-bits across 8
               | channels (64-bits per channel), which is near LPDDR5 /
               | DDR kind of numbers.
               | 
               | 200GBps is within the realm of 1x HBM channel (1024-bit
               | at low clock speeds), and 400GBps is 2x HBM channels
               | (2048-bit bus at low clock speeds).
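                | 
                | The arithmetic checks out if it's LPDDR5-6400 (an
                | assumption on my part): 512 bits / 8 x 6400 MT/s = 409.6
                | GB/s, and a half-width 256-bit bus would give the M1
                | Pro's 200GB/s.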
        
               | sounds wrote:
               | Just to underscore this, memory physically closer to the
               | cores has improved tRAS times measured in nanoseconds.
               | This has the secondary effect of boosting the performance
               | of the last-level cache since it can fill lines on a
               | cache miss much faster.
               | 
               | The step up from DDR4 to DDR5 will help fill cache misses
               | that are predictable, but everybody uses a prefetcher
               | already, the net effect of DDR5 is mostly just better
               | efficiency.
               | 
               | The change Apple is making, moving the memory closer to
               | the cores, improves unpredicted cache misses. That's
               | significant.
        
               | dragontamer wrote:
               | > Just to underscore this, memory physically closer to
               | the cores has improved tRAS times measured in
               | nanoseconds.
               | 
                | I doubt that tRAS timing is affected by how close / far a
                | DRAM chip is from the core. It's just a RAS command after
                | all: transfer data from DRAM to the sense amplifiers.
                | 
                | If tRAS has improved, I'd be curious how it was done. It's
                | one of those values that's basically been constant (on a
                | nanosecond basis) for 20 years.
               | 
               | Most DDR3 / DDR4 improvements have been about breaking up
               | the chip into more-and-more groups, so that Group#1 can
               | be issued a RAS command, then Group#2 can be issued a
               | separate RAS command. This doesn't lower latency, it just
               | allows the memory subsystem to parallelize the requests
               | (increasing bandwidth but not improving the actual
               | command latency specifically).
        
               | kllrnohj wrote:
               | The physically shorter wiring is doing basically nothing.
               | That's not where any of the latency bottlenecks are for
               | RAM. If it was physically on-die, like HBM, that'd be
               | maybe different. But we're still talking regular LPDDR5
               | using off the shelf dram modules. The shorter wiring
               | would potentially improve signal quality, but ground
               | shields do that, too. And Apple isn't exceeding any specs
               | on this (ie, it's not overclocked), so above average
               | signal integrity isn't translating into any performance
               | gains anyway.
        
               | wmf wrote:
               | _improved tRAS times_
               | 
               | Has this been documented anywhere? What timings are Apple
               | using?
        
               | morei wrote:
               | L3 is almost never SRAM, it's usually eDRAM and clocked
               | significantly lower than L1 or L2.
               | 
               | (SRAM is prohibitively expensive to do at scale due to
               | die area required).
        
               | Unklejoe wrote:
               | L3 is SRAM on all AMD Ryzen chips that I'm aware of.
               | 
               | I think it's the same with Intel too except for that one
               | 5th gen chip.
        
               | dragontamer wrote:
               | As far as I'm aware, IBM is one of the few chip-designers
               | who have eDRAM capabilities.
               | 
                | IBM has eDRAM on a number of chips in varying capacities,
                | but... it's difficult for me to think of Intel, AMD,
                | Apple, ARM, or other chips that have eDRAM of any kind.
               | 
                | Intel had one: the eDRAM "Crystalwell" chip, but that was
                | seemingly a one-off, never attempted again. Even then,
                | this was a 2nd die that was "glued" onto the main chip,
                | and not truly eDRAM like IBM's (embedded into the same
                | process).
        
               | kergonath wrote:
               | > The DDR being closer to the core may or may not allow
               | the memory to run at higher speeds due to better signal
               | integrity, but you can purchase DDR4-5333 today whereas
               | the M1 uses 4266.
               | 
               | My understanding is that bringing the RAM closer
               | increases the bandwidth (better latency and larger
               | buses), not necessarily the speed of the RAM dies. Also,
               | if I am not mistaken, the RAM in the new M1s is LP-DDR5
               | (I read that, but it did not stay long on screen so I
               | could be mistaken). Not sure how it is comparable with
               | DDR4 DIMMs.
        
               | Unklejoe wrote:
                | The overall bandwidth isn't affected much by the distance
                | alone. Latency, yes, in the sense that the signal
                | literally has to travel further, but that difference is
                | minuscule (like 1/10th of a nanosecond) compared to
                | overall DDR access latencies.
               | 
               | Better signal integrity could allow for larger busses,
               | but I don't think this is actually a single 512 bit bus.
               | I think it's multiple channels of smaller busses (32 or
               | 64 bit). There's a big difference from an electrical
               | design perspective (byte lane skew requirements are
               | harder to meet when you have 64 of them). That said, I
               | think multiple channels is better anyway.
               | 
               | The original M1 used LPDDR4 but I think the new ones use
               | some form of DDR5.
        
               | GeekyBear wrote:
               | > The overall bandwidth isn't affected much by the
               | distance alone.
               | 
               | Testing showed that the M1's performance cores had a
               | surprising amount of memory bandwidth.
               | 
               | >One aspect we've never really had the opportunity to
               | test is exactly how good Apple's cores are in terms of
               | memory bandwidth. Inside of the M1, the results are
               | ground-breaking: A single Firestorm achieves memory reads
               | up to around 58GB/s, with memory writes coming in at
               | 33-36GB/s. Most importantly, memory copies land in at 60
               | to 62GB/s depending if you're using scalar or vector
               | instructions. The fact that a single Firestorm core can
               | almost saturate the memory controllers is astounding and
               | something we've never seen in a design before.
               | 
               | https://www.anandtech.com/show/16252/mac-mini-
               | apple-m1-teste...
        
               | rdw wrote:
               | Your comment got me thinking, and I checked the math. It
               | turns out that light takes ~0.2 ns to travel 2 inches.
               | But the speed of signal propagation in copper is ~0.6 c,
               | so that takes it up to 0.3 ns. So, still pretty small
               | compared to the overall latencies (~13-18 ns for DDR5)
               | but it's not negligible.
               | 
               | I do wonder if there are nonlinearities that come in to
               | play when it comes to these bottlenecks. Yes, by moving
               | the RAM closer it's only reducing the latency by 0.2 ns.
               | But, it's also taking 1/3rd of the time that it used to,
               | and maybe they can use that extra time to do 2 or 3
               | transactions instead. Latency and bandwidth are inversely
               | related, after all!
        
               | jjoonathan wrote:
               | Well, you can have high bandwidth and poor latency at the
               | same time -- think ultra wide band radio burst from Earth
               | to Mars -- but yeah, on a CPU with all the crazy co-
               | optimized cache hierarchies and latency hiding it's
               | difficult to see how changing one part of the system
               | changes the whole. For instance, if you switched 16GB of
               | DRAM for 4GB of SRAM, you could probably cut down the
                | cache-miss latency a lot -- but do you care? If your
                | cache hit rate is high enough, probably not. Then again,
               | chopping the worst case lets you move allocation away
               | from L3 and L2 and into L1, which gets you a win again.
               | 
               | I suspect the only people who really know are the CPU
               | manufacturer teams that run PIN/dynamorio traces against
               | models -- and I also suspect that they are NDA'd through
               | this life and the next and the only way we will ever know
               | about the tradeoffs are when we see them pop up in actual
               | designs years down the road.
        
               | jjoonathan wrote:
               | DRAM latencies are pretty heinous. It makes me wonder if
               | the memory industry will go through a similar transition
               | to the storage industry's HDD->SSD sometime in the not
               | too distant future.
               | 
               | I wonder about the practicalities of going to SRAM for
               | main memory. I doubt silicon real estate would be the
               | limiting factor (1T1C to 6T, isn't it?) and Apple charges
               | a king's ransom for RAM anyway. Power might be a problem
               | though. Does anyone have figures for SRAM power
               | consumption on modern processes?
        
               | phkahler wrote:
               | >> I wonder about the practicalities of going to SRAM for
               | main memory. I doubt silicon real estate would be the
               | limiting factor (1T1C to 6T, isn't it?) and Apple charges
               | a king's ransom for RAM anyway. Power might be a problem
               | though. Does anyone have figures for SRAM power
               | consumption on modern processes?
               | 
               | I've been wondering about this for years. Assuming the
               | difference is similar to the old days, I'd take 2-4GB of
               | SRAM over 32GB of DRAM any day. Last time this came up
               | people claimed SRAM power consumption would be
               | prohibitive, but I have a hard time seeing that given
               | these 50B transistor chips running at several GHz. Most
               | of the transistors in an SRAM are not switching, so they
               | should be optimized for leakage and they'd still be way
               | faster than DRAM.
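                | 
                | Rough transistor math: 6T per SRAM bit vs 1T1C per DRAM
                | bit means roughly 6x fewer bits for the same transistor
                | budget, so 32GB of DRAM maps to ~5GB of SRAM - the same
                | ballpark as the 2-4GB trade above.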
        
               | GeekyBear wrote:
               | Apple also uses massive cache sizes, compared to the
               | industry.
               | 
               | They put a 32 megabyte system level cache in their latest
               | phone chip.
               | 
               | >at 32MB, the new A15 dwarfs the competition's
               | implementations, such as the 3MB SLC on the Snapdragon
               | 888 or the estimated 6-8MB SLC on the Exynos 2100
               | 
               | https://www.anandtech.com/show/16983/the-apple-a15-soc-
               | perfo...
               | 
               | It will be interesting to see how big they go on these
               | chips.
        
               | rektide wrote:
               | > Apple also uses massive cache sizes, compared to the
               | industry.
               | 
               | AMD's upcoming Ryzen are supposed to have 192MB L3
               | "v-cache" SRAM stacked above each chiplet. Current
               | chiplets are 8-core. I'm not sure if this is a single
               | chiplet but supposedly good for 2Tbps[1].
               | 
                | Slightly bigger chip than an iPhone chip, yes. :) But
                | also wow, a lot of cache. Having it stacked above rather
                | than built into the core is another game-changing move,
                | since
               | a) your core has more space b) you can 3D stack many
               | layers of cache atop.
               | 
               | This has already been used on their GPUs, where the 6800
               | & 6900 have 128MB of L3 "Infinity cache" providing
               | 1.66TBps. It's also largely how these cards get by with
               | "only" 512GBps worth of GDDR6 feeding them (256bit/quad-
                | channel... at 16GT). AMD's R9 Fury from spring 2015 had
                | 512GBps of first-gen HBM, for compare, albeit via that
                | wide-but-slow 4096-bit interface.
               | 
               | Anyhow, I'm also in awe of the speed wins Apple got here
               | from bringing RAM in close. Cache is a huge huge help.
               | Plus 400GBps main memory is truly awesome, and it's neat
               | that either the CPU or GPU can make use of it.
               | 
               | [1] https://www.anandtech.com/show/16725/amd-
               | demonstrates-stacke...
        
             | znwu wrote:
             | I'm thinking with that much bandwidth, maybe they will roll
             | out SVE2 with vlen=512/1024 for future M series.
             | 
              | AVX-512 suffers from bandwidth starvation on desktop. But
              | here the bandwidth is just huge, and SVE2 is naturally
              | scalable. Sounds like a free lunch?
        
           | lkbm wrote:
           | I'm guessing that's new for the 13" or for the M1, but my
           | 16-inch MacBook Pro purchased last year had 64GB of memory.
           | (Looks like it's considered a 2019 model, despite being
           | purchased in September 2020).
        
             | fotta wrote:
              | Right, the Intel models supported 64GB, but the 16GB
              | limitation on the M1 was literally the only thing holding
              | me back from upgrading.
        
             | jack_riminton wrote:
             | I don't think this is an apples to apples comparison
             | because of how the new unified memory works
        
           | bla3 wrote:
           | I thought the memory was one of the more interesting bits
           | here.
           | 
           | My 2-year-old Intel MBP has 64 GB, and 8 GB of additional
           | memory on the GPU. True, on the M1 Max you don't have to copy
           | back and forth between CPU and GPU thanks to integrated
           | memory, but the new MBP still has less total memory than my
           | 2-year-old Intel MBP.
           | 
           | And it seems they just barely managed to get to 64 GiB. The
           | whole processor chip is surrounded by memory chips. That's in
           | part why I'm curious to see how they'll scale this. One idea
           | would be to just have several M1 Max SoCs on a board, but
           | that's going to be interesting to program. And getting to 1
           | TB of memory seems infeasible too.
        
             | mlindner wrote:
             | How much of that 64 GB is in use at the same time though?
             | Caching not recently used stuff from DRAM out to an SSD
             | isn't actually that slow, especially with the high speed
             | SSD that Apple uses.
        
             | sulam wrote:
             | So, M1 has been out for a while now, with HN doom and gloom
             | about not being able to put enough memory into them. Real
             | world usage has demonstrated far less memory usage than
             | people expected (I don't know why, maybe someone paid
             | attention and can say). The result is that 32G is a LOT of
             | memory for an M1-based laptop, and 64G is only needed for
             | very specific workloads I would expect.
        
             | derefr wrote:
             | > but the new MBP still has less total memory
             | 
             | From the perspective of your GPU, that 64GB of main memory
             | attached to your CPU is almost as slow to fetch from as if
             | it were memory on a separate NUMA node, or even pages
             | swapped to an NVMe disk. It may as well not be considered
             | "memory" at all. It's effectively a secondary storage tier.
             | 
             | Which means that you can't really do "GPU things" (e.g.
             | working with hugely detailed models where it's the model
              | itself, not the textures, that takes up the space) as if you
             | had 64GB of memory. You can maybe break apart the problem,
             | but maybe not; it all depends on the workload. (For
             | example, you can't really run a Tensorflow model on a GPU
             | with less memory than the model size. Making it work would
             | be like trying to distribute a graph-database routing query
             | across nodes -- constant back-and-forth that multiplies the
             | runtime exponentially. Even though each step is
             | parallelizable, on the whole it's the opposite of an
             | embarrassingly-parallel problem.)
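              | 
              | A quick sketch of the sizing arithmetic in Python (the
              | 10B-parameter model and fp32 weights are made-up
              | examples, not a claim about any particular model):
              | 
              |   params = 10e9            # hypothetical model size
              |   bytes_per_param = 4      # fp32 weights
              |   print(params * bytes_per_param / 1e9)  # 40 GB of
              |   # VRAM for weights alone, before activations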
        
               | Vomzor wrote:
               | That's not how M1's unified memory works.
               | 
               | >The SoC has access to 16GB of unified memory. This uses
               | 4266 MT/s LPDDR4X SDRAM (synchronous DRAM) and is mounted
               | with the SoC using a system-in-package (SiP) design. A
               | SoC is built from a single semiconductor die whereas a
               | SiP connects two or more semiconductor dies. SDRAM
               | operations are synchronised to the SoC processing clock
               | speed. Apple describes the SDRAM as a single pool of
               | high-bandwidth, low-latency memory, allowing apps to
               | share data between the CPU, GPU, and Neural Engine
               | efficiently. In other words, this memory is shared
               | between the three different compute engines and their
               | cores. The three don't have their own individual memory
               | resources, which would need data moved into them. This
               | would happen when, for example, an app executing in the
               | CPU needs graphics processing - meaning the GPU swings
               | into action, using data in its memory. https://www.thereg
               | ister.com/2020/11/19/apple_m1_high_bandwid...
               | 
               | These Macs are gonna be machine learning beasts.
        
             | gamacodre wrote:
             | Why 1TB? 640GB ought to be enough for anything...
        
               | gamacodre wrote:
               | Huh, I guess that was as bad an idea as the 640K one.
        
               | saijanai wrote:
                | How much memory per 8K, 10-bit-color video frame?
               | 
               | Roughly 190GB per minute without sound.
               | 
               | Trying to do special effects on more than a few seconds
               | of 8K video would overwhelm a 64GB system, I suspect.
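                | 
                | A rough sanity check in Python (assuming uncompressed
                | 10-bit RGB at 24 fps; exact numbers depend on frame
                | rate and pixel packing):
                | 
                |   pixels = 7680 * 4320       # 8K UHD frame
                |   bpp = 10 * 3               # 10-bit RGB
                |   frame = pixels * bpp / 8   # ~124 MB per frame
                |   print(frame * 24 * 60 / 1e9)  # ~179 GB/min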
        
               | jrk wrote:
               | Video and VFX generally don't need to keep whole
               | sequences in RAM persistently these days because:
               | 
                | 1. The high-end SSDs in all Macs can keep up with that
                | data rate (3GB/sec).
                | 
                | 2. Real-time video work is virtually always performed
                | on compressed (even losslessly compressed) streams, so
                | the data rate to stream is less than that.
        
               | gamacodre wrote:
               | You were (unintentionally) trolled. My first post up
               | there was alluding to the legend that Bill Gates once
               | said, speaking of the original IBM PC, "640K of memory
               | should be enough for anybody." (N.B. He didn't[0])
               | 
               | [0] https://www.wired.com/1997/01/did-gates-really-
               | say-640k-is-e...
        
             | fotta wrote:
              | I'm interested to see how the GPU on these performs; I
              | pretty much disable the dGPU on my i9 MBP because it bogs
             | my machine down. So for me it's essentially the same amount
             | of memory.
        
             | Gene_Parmesan wrote:
              | Just some genuine honest curiosity here: how many
              | workloads actually require 64GB of RAM? For instance,
              | I'm an amateur in the music production scene, and I know
              | that sampling-heavy workflows benefit from being able to
              | load more audio clips fully into RAM rather than
              | streaming them from disk. But 64GB seems a tad overkill
              | even for that.
             | 
             | I guess for me I would prefer an emphasis on
             | speed/bandwidth rather than size, but I'm also aware there
             | are workloads that I'm completely ignorant of.
        
               | hatsubai wrote:
               | Another anecdote from someone who is also in the music
               | production scene - 32GB tended to be the "sweet spot" in
               | my personal case for the longest time, but I'm finding
               | myself hitting the limits more and more as I continue to
               | add more orchestral tracks which span well over 100
               | tracks total in my workflows.
               | 
               | I'm finding I need to commit and print a lot of these.
               | Logic's little checker in the upper right showing RAM,
               | Disk IO, CPU, etc also show that it is getting close to
               | memory limits on certain instruments with many layers.
               | 
               | So as someone who would be willing to dump $4k into a
               | laptop where its main workload is only audio production,
               | I would feel much safer going with 64GB knowing there's
               | no real upgrade if I were to go with the 32GB model
               | outside of buying a totally new machine.
               | 
                | Edit: And yes, this does show the typical "fear of
                | committing" issue that plagues all of us people making
                | music. It's more of a "nice to have" than a necessity,
               | but I would still consider it a wise investment. At least
               | in my eyes. Everyone's workflow varies and others have
               | different opinions on the matter.
        
               | [deleted]
        
               | kmeisthax wrote:
               | I know the main reason why the Mac Pro has options for
               | LRDIMMs for terabytes of RAM is specifically for audio
               | production, where people are basically using their system
               | memory as cache for their entire instrument library.
               | 
               | I have to wonder how Apple plans to replace the Mac Pro -
               | the whole benefit of M1 is that gluing the memory to the
               | chip (in a user-hostile way) provides significant
               | performance benefits; but I don't see Apple actually
               | engineering a 1TB+ RAM SKU or an Apple Silicon machine
               | with socketed DRAM channels anytime soon.
        
               | ellisv wrote:
               | > how many workloads actually require 64gb of ram?
               | 
               | Don't worry, Chrome will eat that up in no time!
               | 
               | More seriously, I look forward to more RAM for some of
               | the datasets I work with. At least so I don't have to
               | close everything else while running those workloads.
        
               | dylan604 wrote:
                | On a desktop Hackintosh, I started with 32GB, which
                | would die with out-of-memory errors when I was
                | processing 16-bit RAW images at full resolution.
                | Because it was a Hackintosh, I was able to upgrade to
                | 64GB so the processing could complete. That was the
                | only thing running.
        
               | jackjeff wrote:
               | Can't answer for music, but as a developer a sure way to
               | waste a lot of RAM is to run a bunch of virtual machines,
               | containers or device simulators.
               | 
                | I have 32GB, so unless I'm careless everything usually
                | fits in memory without swapping. If you go over, things
                | get slow and you notice.
        
               | 00deadbeef wrote:
               | Same, I tend to get everything in 32GB but more and more
               | often I'm going over that and having things slow down.
               | I've also nuked an SSD in a 16GB MBP due to incredibly
               | high swap activity. It would make no sense for me to buy
               | another 32GB machine if I want it to last five years.
        
               | miohtama wrote:
               | Don't run Chrome and Slack at the same time :)
        
               | xcskier56 wrote:
               | How do you track the swap activity? What would you call
               | "high" swap activity?
        
               | AdrianB1 wrote:
               | Not many, but there are a few that need even more. My
               | team is running SQL servers on their laptops (development
               | and support) and when that is not enough, we go to
               | Threadrippers with 128-256GB of RAM. Other people run
               | Virtual Machines on their computers (I work most of the
               | time in a VM) and you can run several VMs at the same
               | time, eating up RAM really fast.
        
               | FpUser wrote:
                | I ran 512GB on my home server, 256GB on my desktop and
                | 128GB on a small-form-factor desktop that I take with
                | me to my summer cottage.
               | 
               | Some of my projects work with big in memory databases.
               | Add regular tasks and video processing on top and there
               | you go.
        
             | londons_explore wrote:
             | Memory is very stackable if needed, since the power per
             | unit area is very low.
        
           | ssijak wrote:
            | And the NVMe storage at 7.5GB/s is treated as almost not
            | even noteworthy, haha. Impressive all around.
        
             | jeswin wrote:
             | It's not that noteworthy, given that affordable Samsung 980
             | Pro SSDs have been doing those speeds for well over a year
             | now.
        
               | nojito wrote:
                | The 980 Pro maxes out at 7GB/s.
        
               | jeswin wrote:
               | But it's also been around for at least a year. And
               | upcoming pcie 5 SSDs will up that to 10-14GBps.
               | 
               | I'm saying Apple might have wanted to emphasise their
               | more standout achievements. Such as on the CPU front,
               | where they're likely to be well ahead for a year -
               | competition won't catch up until AMD starts shipping 5nm
               | Zen4 CPUs in Q3/Q4 2022.
        
               | nojito wrote:
                | Apple has well over a 5-year advantage compared to
                | their competition.
        
         | KMnO4 wrote:
         | > I'm still a bit sad that the era of "general purpose
         | computing" where CPU can do all workloads is coming to an end.
         | 
         | They'll still do all workloads, but are optimized for certain
          | workloads. How is that any different from, say, a Xeon or
          | EPYC CPU designed for highly threaded (server/scientific
          | computing) applications?
        
         | ed_elliott_asc wrote:
          | Surely the shared RAM between CPU and GPU is the killer
          | feature: zero copy and up to 64GB of RAM available for the
          | GPU!
        
         | the_arun wrote:
          | Could we use M1 chips on non-Apple boards? If so, I wish
          | Apple would release these for non-macOS consumption, e.g.
          | running Linux servers in the cloud.
        
           | priyanmuthu wrote:
            | Wasn't there a rumor that AMD was creating ARM chips? It
            | would be great to have ARM versions of EPYC chips.
        
             | monocasa wrote:
              | They were, but they haven't talked about it in years.
              | The project is probably canceled; I've heard Jim Keller
              | talk about how that work was happening simultaneously
              | with Zen 1.
        
             | zik wrote:
             | I'd love to see someone do a serious desktop RISC-V
             | processor.
        
           | socialdemocrat wrote:
            | Not a great fit. Something like Ampere Altra is better, as
            | it gives you 80 cores and much more memory, which better
            | fits a server. A server benefits more from lots of weaker
            | cores than a few strong cores. The M1 is an awesome
            | desktop/laptop chip and possibly great for HPC, but not
            | for servers.
            | 
            | What might be more interesting is to see powerful gaming
            | rigs built around these chips. They could have built a
            | kickass game console with these chips.
        
           | culpable_pickle wrote:
           | Why? There are plenty of server oriented ARM platforms
           | available for use (See AWS Graviton). What benefit do you
           | feel Apple's platform gives over existing ones?
        
             | modulusshift wrote:
             | Well, tons, there isn't another ARM core that can match a
             | single M1 Firestorm, core to core. Heck, only the highest
              | performance x86 cores can match a Firestorm core. And
             | that's just raw performance, not even considering power
             | efficiency. But of course, Apple's not sharing.
        
             | dragontamer wrote:
             | The Apple cores are full custom, Apple-only designs.
             | 
             | The AWS Graviton are Neoverse cores, which are pretty good,
             | but clearly these Apple-only M1 cores are above-and-beyond.
             | 
             | ---------
             | 
             | That being said: these M1 cores (and Neoverse cores) are
             | missing SMT / Hyperthreading, and a few other features I'd
             | expect in a server product. Servers are fine with the
             | bandwidth/latency tradeoff: more (better) bandwidth but at
              | worse (higher) latencies.
        
               | carlhjerpe wrote:
               | My understanding is that you don't really need
               | hyperthreading on a RISC CPU because decoding
               | instructions is easier and doesn't have to be
               | parallelised as with hyperthreading.
        
               | tedunangst wrote:
                | Hyperthreading has nothing to do with instruction
                | decode. It's for hiding memory latency. The SPARC T
                | line is 8-way threaded.
        
               | monocasa wrote:
               | It's slightly more general than that, hiding inefficient
               | use of functional units. A lot of times that's totally
               | memory latency causing the inability to keep FUs fed like
               | you say, but i've seen other reasons, like a wide but
               | diverse set of FUs that have trouble applying to every
               | workload.
        
               | dragontamer wrote:
               | Okay, the whole RISC thing is stupid. But ignoring that
               | aspect of the discussion... POWER9, one of those RISC
               | CPUs, has 8-way SMT. Neoverse E1 also has SMT-2 (aka:
               | 2-way hyperthreading).
               | 
               | SMT / Hyperthreading has nothing to do with RISC / CISC
                | or whatever. It's just a feature some people like or don't
               | like.
               | 
               | RISC CPUs (Neoverse E1 / POWER9) can perfectly do SMT if
               | the designers wanted.
        
               | socialdemocrat wrote:
                | Don't think that is entirely true. Lots of features
                | which exist on both RISC and CISC CPUs have a
                | different natural fit. Using micro-ops, for example,
                | is more important on a CISC than on a RISC CPU, even
                | if both benefit. Likewise pipelining is a more natural
                | fit on RISC than CISC, while a micro-op cache is more
                | important on CISC than RISC.
        
               | dragontamer wrote:
               | I don't even know what RISC or CISC means anymore.
               | They're bad, non-descriptive terms. 30 years ago, RISC or
               | CISC meant something, but not anymore.
               | 
               | Today's CPUs are pipelined, out-of-order, speculative,
               | (sometimes) SMT, SIMD, multi-core with MESI-based
               | snooping for cohesive caches. These words actually have
               | meaning.
        
               | chasil wrote:
               | The DEC Alpha had SMT on their processor roadmap, but it
               | was never implemented as their own engineers told the
               | Compaq overlords that they could never compete with
               | Intel.
               | 
               | "The 21464's origins began in the mid-1990s when computer
               | scientist Joel Emer was inspired by Dean Tullsen's
               | research into simultaneous multithreading (SMT) at the
               | University of Washington."
               | 
               | https://en.wikipedia.org/wiki/Alpha_21464
        
           | MaysonL wrote:
            | Linux on M1 Macs is under development, and running, last I
            | heard.
           | 
           | https://9to5mac.com/2021/10/07/linux-is-now-usable-as-a-
           | basi...
        
         | neogodless wrote:
         | > "just" two more cores than the vanilla M1
         | 
          | Total cores, but going from 4 "high performance" and 4
          | "efficiency" to 8 "high performance" and 2 "efficiency". So
          | it should be a more dramatic increase in performance than
          | "20% more cores" would suggest.
        
           | skohan wrote:
           | Is there a tradeoff in terms of power consumption?
        
             | hajile wrote:
              | For the A15, AnandTech claims the efficiency cores have
              | 1/3 the performance but 1/10 the power. They should be
              | looking at (effectively) doubling the power consumption
              | over the M1 from the CPUs alone, assuming they don't
              | increase clock speeds.
             | 
             | Going from 8 to 16 or 32 GPU cores is another massive power
             | increase.
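              | 
              | A minimal sketch of that CPU arithmetic in Python,
              | normalizing one P-core to 1 unit of power and assuming
              | the 1/10 E-core figure carries over from the A15:
              | 
              |   p, e = 1.0, 0.1          # P-core vs E-core power
              |   m1 = 4 * p + 4 * e       # M1: 4P + 4E = 4.4 units
              |   m1_max = 8 * p + 2 * e   # Pro/Max: 8P + 2E = 8.2
              |   print(m1_max / m1)       # ~1.86x, roughly double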
        
               | barelysapient wrote:
               | I wonder if Apple will give us a 'long-haul' mode where
                | the system is locked to only the energy-efficient cores
                | and settings. I think us developer types would love a
                | computer that survives 24 hours on battery.
        
               | modulusshift wrote:
               | macOS Monterey coming out on the 25th has a new Low Power
               | Mode feature that may do just that. That said, these Macs
               | are incredibly efficient for light use, you may already
               | get 24 hrs of battery life with your workload. Not
               | counting screen off time.
        
               | sulam wrote:
               | Yes, it depends on what you're doing, but if you can
               | watch 21 hours of video, many people will be able to do
               | more than 24 hours of development.
        
             | ksec wrote:
             | Yes. But the 14" and 16" has larger battery than 13"
             | MacBook Pro or Air. And they were designed for performance,
             | so two less EE core doesn't matter as much.
             | 
             | It is also important to note, despite the name with M1, we
             | dont know if the CPU core are the same as the one used in
             | M1 / A14. Or did they used A15 design where the energy
             | efficient core had significant improvement. Since the Video
             | Decoder used in M1 Pro and Max seems to be from A15, the
             | LPDDR5 is also a new memory controller.
        
             | cbarrick wrote:
             | In the power/performance curves provided by Apple, they
             | imply that the Pro/Max provides the same level of
             | performance at a slightly _lower_ power consumption than
             | the original M1.
             | 
             | But at the same time, Apple isn't providing any hard data
             | or explaining their methodology. I dunno how much we should
             | be reading into the graphs. /shrug
        
               | mlindner wrote:
               | I think you misread the graph. https://www.apple.com/news
               | room/images/product/mac/standard/A...
               | 
               | The graph there shows that the new chip is higher power
               | usage at all performance levels.
        
               | derefr wrote:
               | Not all; it looks like the M1 running full-tilt is
               | slightly less efficient for the same perf than the M1
               | Pro/Max. (I.e., the curves intersect.)
        
               | jrk wrote:
               | Yes, but only at the very extreme. It's normal that a
               | high core count part at low clocks has higher efficiency
               | (perf/power) at a given performance level than a low core
               | count part at high clocks, since power grows super-
               | linearly with clock speed (decreasing efficiency). But
               | notably they've tuned the clock/power regime of the M1
                | Pro/Max CPUs such that the crossover region here is
                | very small.
        
             | gzer0 wrote:
             | > M1 Pro delivers up to 1.7x more CPU performance at the
             | same power level and achieves the PC chip's peak
             | performance using up to 70 percent less power
             | 
             | Uses less power
        
               | mlindner wrote:
               | That's compared to the PC chips, not M1. M1 uses less
               | power at same performance levels.
               | 
               | https://www.apple.com/newsroom/images/product/mac/standar
               | d/A...
        
             | SpelingBeeChamp wrote:
             | Huge, apparently. I just spent a bit over $7,000 for a top-
             | spec model and was surprised to read that it comes with a
             | 140 watt power adapter.
             | 
             | Prior to my current M1 MBP, my daily driver was a maxed-out
             | 16" MBP. It's a solid computer, but it functions just as
             | well as a space heater.
             | 
             | And its power adapter is only 100 watts.
        
               | jerrysievert wrote:
                | The power supply is for charging the battery faster.
                | The new MagSafe 3 system can charge with more wattage
                | than USB-C, as per the announcement. USB-C's max
                | wattage is 100 watts, which was the previous limiting
                | factor for battery charging.
        
               | robert_foss wrote:
                | USB PD 3.1 delivers up to 240 watts.
        
               | gumby wrote:
               | USB Power Delivery 3.1 goes up to 240 W (or, I should
               | say, "will go up" as I don't think anybody is shipping it
               | yet)
        
               | tantony wrote:
               | They support fast-charging the battery to 50% in 30
               | minutes. That's probably the reason for the beefy
               | charger.
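                | 
                | Rough numbers in Python (assuming the 16-inch's ~100Wh
                | battery and ignoring conversion losses):
                | 
                |   battery_wh = 100               # 16" MBP, roughly
                |   watts = (battery_wh / 2) / 0.5 # 50% in 30 min
                |   print(watts)  # 100 W into the battery alone, so a
                |   # 140 W adapter leaves headroom to run the machine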
        
       | [deleted]
        
       | pier25 wrote:
       | Can anyone comment on what Intel and AMD are going to do now?
       | 
       | Will they be able to catch up or will Qualcomm become the
       | alternative for ARM laptop chips? (and maybe desktop chips too)
        
         | ksec wrote:
          | >Can anyone comment on what Intel and AMD are going to do
          | now?
         | 
          | In the short term, nothing. But it isn't like Apple will
          | magically make all PC users switch to Mac.
          | 
          | Right now Intel needs to catch up on the foundry side first.
          | AMD needs to work their way into partnerships with many
          | people around their GPU IP, which is their unique advantage.
          | Both are currently well under way on sensible paths forward.
          | Both CEOs know what they are doing. I rarely praise any
          | CEOs, but Pat and Dr. Lisa are good.
        
         | neogodless wrote:
         | This exact question was asked a year ago when the M1 was
         | announced.
         | 
         | In the year since, their laptop market share increased about 2%
         | from 14 to 16%[0].
         | 
         | The reasons for this are:
         | 
         | 1. When deciding on a computer, you often have to decide based
         | on use case, software/games used, and what operating system
         | will work best for those use cases. For Windows users, it
         | doesn't matter if you can get similar performance from a
         | Macbook Pro, because you're already shopping Windows PCs.
         | 
          | 2. Performance for _most_ use cases has been _enough_ for
          | practically a decade (depending on the use case). For some
          | things, no amount of performance is "enough", but your workload
         | may still be very OS-dependent. So you probably start with OS X
         | or Windows in mind before you begin.
         | 
          | 3. The efficiency that the M1/Pro/Max are especially good at
          | is not the only consideration in hardware purchase decisions.
          | And they are only available in a MacBook / MacBook Pro / Mini.
         | If you want anything else - desktop, dedicated gaming laptop,
         | or any other configuration that isn't covered here, you're
         | still looking at a PC instead of a Mac. If you want to run
         | Linux, you're probably still better off with a PC. If you want
         | OS X, then there is only M1, and Intel/AMD are wholly
         | irrelevant.
         | 
         | 4. Many buyers simply do not want to be a part of Apple's
         | closed system.
         | 
         | So for Intel/AMD to suddenly be "behind" still means that years
         | will have to go by while consumers (and especially corporate
         | buyers) shift their purchase decisions and Apple market share
         | grows beyond the 16% they're at now. But performance is not the
         | only thing to consider, and Intel/AMD are not sitting still
         | either. They release improved silicon over time. If you'd asked
         | me a year ago, I'd say "do not buy anything Intel" but their
         | 2021 releases are perfectly fine, even if not class-leading.
         | AMD's product line has improved drastically over the past 4
         | years, and are easy to recommend for many use cases. Their Zen
         | 4 announcement may also be on the 5nm TSMC node, and could be
         | within the ballpark of M1 Pro/Max for performance/efficiency,
         | but available to the larger PC marketplace.
         | 
         | [0] https://www.statista.com/statistics/576473/united-states-
         | qua...
        
           | klelatti wrote:
           | These are unit market share numbers, so will include large
           | numbers of PCs - both consumer and corporate - at price
           | points where Apple isn't interested in competing because the
           | margins are probably too low.
           | 
           | I suspect by value their share is far higher and their % of
           | profits is even bigger.
           | 
           | The strategy is very clear and it's the same as the iPhone.
           | Dominate the high end and capture all the profits. Of course
           | gaming is an exception to this.
           | 
           | The bad news for Intel is that they make their margins on
           | high end CPUs too.
           | 
           | For Intel and AMD there are two different questions: will
           | Intel fix their process technology, and will AMD get access
           | to TSMC's leading nodes in the volumes needed to make a
           | difference to their market share?
        
           | pier25 wrote:
           | All good points but:
           | 
           | 1) In the pro market (audio, video, 3d, etc) performance is
           | very relevant.
           | 
            | 2) Battery life is important to all types of laptop users.
           | 
           | 3) Apple is certainly working on more desktop alternatives.
           | 
            | 4) You don't need to move all your devices into the closed
            | ecosystem just because you use a Mac. Also, some people just
           | don't want to use macOS on principle, but I'm guessing this
           | is a minority.
           | 
           | > _AMD 's product line has improved drastically over the past
           | 4 years_
           | 
           | My desktop Windows PC has a 3700X which was very impressive
           | at the time, but it is roughly similar in perf to the "low
           | end" M1 aimed at casual users.
           | 
           | > _Their Zen 4 announcement may also be on the 5nm TSMC node,
           | and could be within the ballpark of M1 Pro /Max for
           | performance/efficiency, but available to the larger PC
           | marketplace._
           | 
           | That would be great.
        
             | asdff wrote:
              | In the pro market especially, you have people who are
              | stuck using some enterprise software that is only
              | developed for PC, like a few Autodesk programs. If you
              | are into gaming, many first-party developers don't even
              | bother making a Mac port. The new Call of Duty and
              | Battlefield games are on every platform but the Switch
              | and macOS, and that's increasingly par for the course
              | for this industry, since Mac laptops have been junk to
              | game on for so long.
        
             | neogodless wrote:
             | Agreed.
             | 
             | I think the big thing to remember is that "performance
             | crown" at any moment in time does not have a massive
             | instantaneous effect on the purchasing habits across the
             | market.
             | 
             | I have no doubt that Apple will continue to grow their
             | market share here. But the people that continue to buy PC
             | will not expect ARM-based chips unless someone (whether
             | Intel, AMD, Qualcomm or someone else) builds those chips
             | _and_ they are competitive with x86. And x86 chips are not
             | suddenly  "so bad" (read: obsolete) that no one will
             | consider buying them.
        
       | elromulous wrote:
       | As is typical for Apple, the phrasing is somewhat intentionally
       | misleading (like my favorite Apple announcement - "introducing
       | the best iPhone yet" - as if other companies are going
       | backwards?). The wording is of course carefully chosen to be
       | technically true, but to the average consumer, this might imply
       | that these are more powerful than any CPU Apple has ever offered
       | (which of course is not true).
        
         | eqtn wrote:
          | This time, they showed which laptop was used for the
          | performance comparison in the bottom-left corner during the
          | presentation.
        
           | Trex_Egg wrote:
            | Yeah, that is good of course.
        
         | ukd1 wrote:
         | Which are more powerful?
        
         | BitAstronaut wrote:
         | >this might imply that these are more powerful than any CPU
         | apple has ever offered (which of course is not true).
         | 
         | Excuse my ignorance, what is?
        
       | FearlessNebula wrote:
       | Can somebody check up on intel? Are they okay?
        
       | nrjames wrote:
       | I wish that these systems could somehow use/access the CUDA and
       | DLSS pipelines from NVIDIA.
        
       | sva_ wrote:
       | I suppose the future of personal computing may be ARM then?
       | For now, anyway.
        
         | kfprt wrote:
         | If Nvidia buys ARM they'll flee like rats on a sinking ship.
         | I'd bet on RISC-V.
        
       | m15i wrote:
       | Will the 64GB RAM max chip be practical for training deep
       | learning models? Any benchmarks vs an RTX 3090?
        
         | andrewl-hn wrote:
         | Their comparison charts showed the performance of mobile GPUs,
         | not the desktop ones. So, I wouldn't call this "practical".
          | It most likely depends on what kind of models you are
          | building, what software you use, and how optimized it is
          | for the M1.
        
       | WORMS_EAT_WORMS wrote:
       | I'm so ridiculously happy with my first generation M1 I have zero
       | desire to upgrade.
       | 
       | Kind of wild to consider given how long it has taken to get here
       | with the graveyard of Apple laptops in my closet.
        
         | eugeniub wrote:
         | Same. I am impressed with M1 Pro and M1 Max performance
         | numbers. I ordered the new MBP to replace my 2020 M1 MBP, but I
         | bought it with the base M1 Pro and I'm personally way more
          | excited about 32GB, 1,000-nit brightness, function-row keys,
         | etc.
        
         | lowbloodsugar wrote:
         | See, this is why you have kids. Now my kid gets my M1 Air, and
         | I get a M1 Max!
        
         | Unbeliever69 wrote:
         | For sure. Mine has been the perfect dev machine. My Docker
         | build times are the envy of the office.
        
         | maxekman wrote:
         | I was thinking the exact same thing! The fanless M1 Air is a
         | dev monster in a mouse package. Couldn't be happier with that
         | combo.
        
           | dan1234 wrote:
            | If the M1 supported 3 displays I would've bought an Air
            | last year.
           | 
           | Feels like I'll have to pay a lot for that 3rd monitor!
        
       | rcheu wrote:
       | 400 GB/s is insane memory bandwidth. I think an m5.24xlarge,
       | for instance, has something around 250 GB/s (hard to find the
       | exact number). Curious if anyone knows more about how this
       | compares.
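       | 
       | Back-of-the-envelope in Python, assuming m5 instances are
       | 2-socket Skylake-SP with 6 channels of DDR4-2666 per socket
       | (I'm not sure of the exact SKU):
       | 
       |   channels = 6 * 2                  # per socket x 2 sockets
       |   per_channel = 2666e6 * 8 / 1e9    # 64-bit bus, ~21.3 GB/s
       |   print(channels * per_channel)     # ~256 GB/s theoretical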
        
         | Andys wrote:
         | It's still a bit unclear how much of the bandwidth the CPUs
         | can use (as opposed to the GPUs).
        
       | ryanjodonnell wrote:
       | Is it possible to dual-boot Windows with the new M1 chips?
        
         | aldanor wrote:
         | You can most definitely use the latest Parallels, IIRC. Why
         | dual boot?
        
           | ryanjodonnell wrote:
           | Mostly for gaming. I use Boot Camp now to play MTG Arena
           | and StarCraft 2 on Windows, which seems to have much better
           | perf.
           | 
           | I imagine gaming doesn't work well in Parallels?
        
             | cweagans wrote:
             | I don't think I'd make that assumption. https://www.applega
             | mingwiki.com/wiki/M1_compatible_games_mas... , for
             | instance, has a number of games that do pretty well in
             | Parallels.
        
             | Synaesthesia wrote:
              | You can play SC2 via Rosetta. Unfortunately it's not
              | optimised for the M1.
        
         | pantalaimon wrote:
         | No.
        
           | [deleted]
        
       | turbinerneiter wrote:
       | It makes me sad that no one will ever be able to build anything
       | with those chips.
       | 
       | I imagine there could be many, many innovative products built
       | with these chips if Apple sold them and supported Linux (or
       | even Windows).
        
         | zionic wrote:
         | Imagine if apple was forced (via anti trust laws) to spin off
         | their CPU division...
        
         | userbinator wrote:
         | ...or just released the full documentation for them. Apple
         | being Apple and wanting full control over its users, I don't
         | see that happening. I really don't care how fast or efficient
         | these are, if they're not publicly documented all I think is
         | "oh no, more proprietary crap". Apple might even make more $$$
         | if it wanted to open up, but it doesn't.
        
           | uerobert wrote:
           | What do you mean? All the APIs (Xcode SDK, Metal, ML, etc)
           | required to build on their devices are very well documented.
        
             | turbinerneiter wrote:
             | I'm not talking about apps on their devices, I'm talking
             | about new types of devices based on their processors.
             | 
             | If these were available for others to buy, I think we would
             | be very surprised by the innovative new devices people
             | would invent.
        
       | soheil wrote:
       | It's a bit concerning that the new chips have special-purpose
       | video codec hardware. I hope this trend doesn't continue; we
       | could end up with laptops from different manufacturers playing
       | different video formats, or playing them with degraded quality.
        
         | Synaesthesia wrote:
         | Video encode and decode have been GPU and integrated GFX
         | features for quite a long time now.
        
       | MisterTea wrote:
       | Apple headline: New chip is faster than old chip.
       | 
       | HN: 54 points
       | 
       | Slow news day.
        
         | octos4murai wrote:
         | MacBook announcements are the opposite of slow news days.
         | Consumer tech outlets are literally scrambling to cover every
         | aspect of these things because there is so much interest.
        
         | fartcannon wrote:
          | Not suggesting this is what's happening, but you can pay for
          | this kind of attention.
          | 
          | On HN you probably don't have to, though. Lots of fans of
          | Apple things here.
        
       | sk0g wrote:
       | Now, if only Unreal Engine builds were available M1 native, I
       | could get rid of my huge and heavy desktop entirely!
       | 
       | Interestingly, some improvements to Rosetta were mentioned,
       | extremely briefly.
        
         | tehnub wrote:
          | Same wish here. When I last tried a few months ago, I was
          | unable to compile UE4 either. These machines would be great
          | for UE4 dev if only the compatibility was there. I wonder if
          | the politics between Epic and Apple have made efforts in
          | this area a lower priority.
        
         | out_of_protocol wrote:
          | Apple is very much against games; they've even broken OpenGL
          | just because. Don't expect any kind of gaming ecosystem
          | around Apple anytime soon.
        
           | spacedcowboy wrote:
           | So against games that they just became a Patron supporter of
           | Blender [1]...
           | 
           | [1] https://www.blender.org/press/apple-joins-blender-
           | developmen...
        
             | manquer wrote:
              | Blender has a strong use case in the animation and movie
              | ecosystems. Pixar (RenderMan) has some strong
              | connections with Jobs, and in turn with Blender; games
              | may not really be on their radar for the Blender
              | sponsorship.
              | 
              | Besides, supporting creator workflows (Final Cut Pro,
              | best-in-class laptop graphics, Blender, etc.) doesn't
              | mean they want to directly support gamers as buyers,
              | just that they believe creators who produce games (or
              | other media) are a strong market for them to go after.
              | 
              | The marketing is aimed strongly at the post-pandemic WFH
              | designer market. They either had to ship their expensive
              | Mac desktops home or come in and work at the office last
              | year. This laptop's graphics performance pitch is for
              | that market to buy/upgrade now.
        
           | dewey wrote:
           | > Apple very much against games
           | 
           | They are not against games, they just don't care about
           | supporting anything else that's not coming through their
            | frameworks and the App Store. This can easily be verified
            | by the way-too-long segments of game developer demos at
            | the annual WWDC.
        
             | out_of_protocol wrote:
              | That's not how the industry works; that's not how any of
              | this works. The iPhone ecosystem is big enough to move
              | itself forward, but the desktop market plays by
              | different rules. If you don't follow what the majority
              | of the market does, it's much cheaper for developers to
              | just ignore that tiny customer segment which requires a
              | totally alien set of technologies.
        
               | GeekyBear wrote:
               | >If you don't follow what majority of the market do, it's
               | much cheaper to just ignore that tiny customer segment
               | which requires totally alien set of technologies
               | 
               | iOS and Macs both use Metal.
               | 
               | You can't refuse to support Metal without missing out on
               | a very big slice of total gaming revenue.
        
               | out_of_protocol wrote:
               | On mobile devices - definitely
               | 
               | On desktop - missing what, 2% or less? Checked Steam
               | stats - yep, about 2%
        
               | boardwaalk wrote:
                | That Steam stat is probably a chicken-and-egg
                | situation. I know I don't run Steam on my MacBook because
               | there's nothing I want to play -- but I would if there
               | was.
               | 
               | Still the Mac marketshare is not that high (~15%?) but
               | might start looking attractive to developers looking to
               | "get in first" when hardware that can actually run games
               | becomes available (cough).
        
               | GeekyBear wrote:
               | Metal is Metal. Once you support Metal for iOS games they
               | also work under MacOS when running on Apple's chips.
               | 
               | You can support traditional MacOS application chrome with
               | little additional effort.
        
               | terafo wrote:
               | Desktop games and mobile games are not the same. On
               | mobile pretty much every heavy game uses either UE or
               | Unity. High end PC games use custom engines that are
               | heavily tuned for x86 and use different APIs. Metal port
               | would be expensive and not worth it.
        
               | dewey wrote:
               | Which is precisely what I said. They don't care that the
               | larger gaming market ignores their platform. Apple Arcade
               | and other gaming related endeavours all aim at the casual
               | mobile gamer market.
        
           | aroman wrote:
           | They literally had a Unity developer in the showcase during
           | the keynote.
        
       | smoldesu wrote:
       | Great update, I think Apple did the right thing by ignoring
       | developers this time. 70% of their customers are either creatives
       | who rely on proprietary apps, or people who just want a bigger
       | iPhone. Those people will be really happy with this upgrade, but
       | I have to wonder what the other 30% is thinking. It'll be
       | interesting to see how Apple continues to slowly shut out
       | portions of their prosumer market in the interest of making A
       | Better Laptop.
        
         | nharada wrote:
          | I agree this is very targeted towards the creative market
          | (e.g. the SD card slot), but I'm curious as a developer what you
         | would have liked to see included that isn't in this release.
         | 
         | I guess personally having better ML training support would be
         | nice, since I suspect these M1 Max chips could be absolute
         | monsters for some model training/fine-tuning workloads. But I
         | can't think of anything design-wise really.
        
           | dmitriid wrote:
           | As a developer and a power user, I'd love for them to stop
           | iOS'ifying Mac OS.
           | 
           | There's such a huge disconnect between what they do with
           | hardware and what they do to MacOS.
        
             | rpmisms wrote:
             | Honestly a MacOS "Pro Mode" would be great. Let me run the
             | apps I want, not use my mouse, and be able to break things,
             | but keep the non-power user experience smooth and easy.
        
               | smoldesu wrote:
               | Seconding this. If both iOS and MacOS had a "Pro Mode"
               | that didn't treat me like a toddler, I'd be jumping into
               | the Apple ecosystem head-first.
        
           | cormacrelf wrote:
           | The big ticket items: Hardware-accelerated VSCode cursor
           | animation. Dedicated button for exiting vim. Additional
           | twelve function keys, bringing the total to 24; further,
           | vectorised function key operations, so you can execute up to
           | 12 "nmap <f22> :set background=dark" statements in parallel.
           | Dedicated encode and decode engines that support YAML, TOML
           | and JWT <-> JSON. A neural engine that can approximate
           | polynomial time register allocation almost instantly. The CPU
           | recognises when you are running Autoconf `./configure` checks
           | and fast-forwards to the end.
           | 
           | I would also like a decent LSP implementation for Siri, but
           | the question was about hardware.
        
             | smoldesu wrote:
             | You could just say "I want Linux" and get the same point
             | across.
        
               | cormacrelf wrote:
               | Linux has offered the Virtual Function Key System since
               | at least 1998, but there isn't a driver that uses the
               | native 8-lane AVF instruction set on my 230% mechanical
               | keyboard yet.
        
             | pram wrote:
             | Solid gold
        
       | alberth wrote:
       | Server Chips: if you removed the GPU, and added ECC - these would
       | be dang nice server chips.
        
         | ttul wrote:
         | There is technically no reason Apple could not introduce a
         | cloud computing service based on their silicon at some point.
         | But would it generate the kind of profit margins they need? An
         | interesting space to watch.
        
           | jbverschoor wrote:
            | In 2 years, when their contracts expire.
        
         | Weryj wrote:
          | I was having a vision last night of a server rack filled
          | with iPad Pros w/ Ethernet through USB-C. I still wonder what
         | the performance per mm^3 would be in comparison to a
         | traditional server.
        
       | [deleted]
        
       | throwawaysea wrote:
        | I know Apple has a translation layer called Rosetta. But what
       | about virtual machines? Is it possible to run Windows 10 (not the
       | ARM edition but the regular, full Windows 10) as a virtual
       | machine on top of an Apple M1 chip? It looks like UTM
       | (https://mac.getutm.app/) enables this, although at a performance
       | hit, but I don't know how well it works in practice. What about
       | Parallels - their website suggests you can run Windows 10 Arm
       | edition but doesn't make it clear whether you can run x86
       | versions of operating systems on top of an M1 Mac (see old blog
       | post at https://www.parallels.com/blogs/parallels-desktop-m1/). I
       | would expect that they can run any architecture on top of an ARM
       | processor but with some performance penalty.
       | 
       | I'm trying to figure out if these new MacBook Pros would be an
       | appropriate gift for a CS student entering the workforce. I am
       | worried that common developer tools might not work well or that
       | differences in processors relative to other coworkers may cause
       | issues.
        
         | thoughtsimple wrote:
         | Neither Apple, Microsoft, nor Parallels is planning to support
         | x86-64 Windows on Apple silicon. You can run emulation software
         | like QEMU and it works but it is very slow. UTM uses QEMU.
        
       | amansidhant wrote:
       | So if I want a new MacBook purely for software development and
       | building mobile apps, what should I pick between the $2499 14"
       | and the $3499 16"? There doesn't look to be any difference in
       | Xcode build times from their website.
        
         | rpmisms wrote:
         | Probably depends on your eyesight. I like small laptops with
         | very high resolution, but I have good eyes.
        
         | methyl wrote:
         | It's only a matter of your preference, whether you like more
         | real estate on the screen or better portability.
        
         | seviu wrote:
         | 14" + M1 Max (24 GPU cores) with 32 Gb Ram is the sweet spot
         | imho. It costs a bit more but you get twice the memory
         | bandwidth and double the ram, which will always prove handy.
         | 
         | I develop iOS apps and I think this is the sweet spot. I am not
         | sure what impact the extra bandwidth of the M1 Max will have
         | though. We will have to wait to see. For video editing is
         | clear. For Xcode not so sure.
         | 
         | 14 or 16 inches is up to personal preference. I just value more
         | the smaller package and the reduced weight. Performance is
         | about the same.
        
         | symlinkk wrote:
         | I'd get the 16", 14" is pretty small for something you'd be
         | using every day
        
       | titzer wrote:
       | These chips are impressive, but TBH I have always been annoyed by
        | these vague cartoon-line graphs. Like, is this measured data?
        | No? Just some marketing doodle? Please don't make graphs meaningless
       | marketing gags. I mean, please don't make graphs _even more_
       | meaningless marketing gags.
        
       | aqme28 wrote:
       | I hate the naming.
       | 
       | By name alone and without looking at specs, can you tell me which
       | is the faster chip-- the M1 Max or the M1 Pro?
        
         | billyhoffman wrote:
         | In isolation, maybe. But it follows the naming convention of
          | their iPhone models:
         | 
         | base model < pro model < pro max model
        
         | rbilgil wrote:
         | Well I'm not necessarily a fan of the naming but assuming Max
         | stands for maximum, it's pretty clearly the best one. The one
         | you get if you want to max it out. But they should've called it
         | Pro Max for consistency with the iPhones...
        
         | twic wrote:
         | The M1 Pro is faster. The M1 MAX is an M1, but with new control
         | software which makes it keep crashing.
        
           | btzo wrote:
           | wrong
        
             | 19h wrote:
              | I assume it was a reference to the 737 MAX.
        
           | filoleg wrote:
           | M1 Pro is on the cheapest model, and M1 Max is on the most
           | expensive model. So I think you are just flat out wrong here.
        
             | aroman wrote:
             | You missed the joke they were making about the Boeing 737
             | MAX.
        
         | tyingq wrote:
         | It would be odd for me if "Max" were not the "maximum".
        
           | aqme28 wrote:
           | That's still a bad naming convention. It won't be the maximum
           | forever.
        
             | singhkays wrote:
              | But it will be the max M1 forever. When the M2 comes
              | around, that's a different class.
        
               | eugeniub wrote:
               | It appears that M1 Max itself comes in 24-core and
               | 32-core GPU variants. So I guess some M1 Max chips are
               | more maximum than other M1 Max chips.
        
       | agluszak wrote:
       | Ah yes, the naming. Instead of M2 we got M1 Pro & M1 Max. I'm
       | waiting for M1 Ultra+ Turbo 5G Mimetic-Resolution-Cartridge-View-
       | Motherboard-Easy-To-Install-Upgrade for Infernatron/InterLace TP
       | Systems for Home, Office or Mobile [sic]
        
       | euroderf wrote:
       | Any guesses how long it will take for Apple to update the pre-
       | existing M1 Macs? (Price drop, performance boost.)
        
         | slayerjain wrote:
         | maybe around next fall
        
       | citilife wrote:
        | Those power comparisons aren't really fair IMO. They're testing
        | power consumption...
        | 
        | They're using an "MSI Prestige 14 Evo (Intel CPU)" vs. an
        | optimized laptop using an M1.
        | 
        | Further, where's AMD? They have a better power-vs-performance
        | ratio.
        | 
        | I'm not sure whether it's as good or not, but that's a lot of
        | cherry-picking.
        
         | marricks wrote:
         | Can you be more specific in how the M1 is optimized while MSI's
         | isn't? Also why was MSI a bad comparison?
         | 
          | It seems reasonable to me but I don't follow PCs much these
          | days.
        
       | hydroreadsstuff wrote:
       | How do they get 200/400GB per second RAM bandwidth? Isn't that
        | like 4/8-channel DDR5, i.e. 4/8 times as fast as current Intel/AMD
       | CPUs/APUs? (E.g.
       | https://www.intel.com/content/www/us/en/products/sku/201837/...
       | with 45.8GB/s)
       | 
       | Laptop/desktop have 2 channels. High-end desktop can have 4
       | channels. Servers have 8 channels.
       | 
       | How does Apple do that? I was always assuming that having that
       | many channels is prohibitive in terms of either power consumption
       | and/or chip size. But I guess I was wrong.
       | 
       | It can't be GDDR because chips with the required density don't
       | exist, right?
        
         | G4E wrote:
          | That sounds like HBM2, maybe HBM3, but that would be the
          | first consumer product to include it afaik.
          | 
          | Basically the bus is really wide, and the memory dies must
          | be really close to the main processing die. That kind of
          | memory was notably used on the RX Vega from AMD, and before
          | that on the R9 Fury.
         | 
         | https://en.m.wikipedia.org/wiki/High_Bandwidth_Memory
        
           | hydroreadsstuff wrote:
           | If that were the case you could probably see an interposer.
           | And I think the B/W would be even higher.
        
             | kingosticks wrote:
             | And the price would be even higher.
        
         | Tuna-Fish wrote:
         | It's LPDDR5, which maxes out at 6.4Gbit/s/pin, on a
         | 256bit/512bit interface.
         | 
         | It's much easier to make a wider bus with LPDDR5 and chips
         | soldered on the board than with DIMMs.
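         | 
         | The arithmetic lines up with Apple's numbers (assuming the
         | full 6.4Gbit/s/pin across the whole bus):
         | 
         |   for width in (256, 512):            # M1 Pro / M1 Max
         |       print(width * 6.4e9 / 8 / 1e9)  # GB/s
         |   # -> 204.8 and 409.6, i.e. the quoted 200/400 GB/s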
        
           | hydroreadsstuff wrote:
            | I hope we will see this on more devices. This is a huge
            | boon to performance.
            | 
            | It might even forebode soldering RAM onto packages from
            | here on out, forever.
            | 
            | The Steam Deck will probably have a crazy 100GB/s RAM
            | bandwidth, twice that of current laptops and desktops.
        
             | zamadatix wrote:
              | The Steam Deck is 88 GB/s using quad-channel LPDDR5.
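              | 
              | Same math as above, assuming LPDDR5-5500 on a 128-bit
              | bus:
              | 
              |   print(128 * 5.5e9 / 8 / 1e9)  # 88.0 GB/s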
        
         | ksec wrote:
         | They are using LPDDR5.
         | 
         | Not the usual DDR5 used in Desktop / Laptop.
        
           | dragontamer wrote:
           | DDR5 isn't common yet.
           | 
            | DDR4 is the common desktop/laptop memory. LPDDR5 is a
            | cell-phone part, so it's kinda funny to see a low-power
            | RAM being used in a very wide configuration like this.
        
       | yboris wrote:
       | Is there any VR for the Mac? Seems like the machine is more-than-
       | ready for VR!
        
         | smoldesu wrote:
          | Apple had SteamVR support for a while, and even Valve Proton
          | support for a time (meaning that Windows-only games were
          | quite playable on Mac hardware). Unfortunately, Apple pulled
          | 32-bit library support without offering a suitable
          | alternative, so Valve was forced to scrap its plans for Mac
          | gaming entirely.
        
           | thrwyoilarticle wrote:
            | I maintain that this was to avoid the lack of support for
            | 32-bit games being blamed on Apple Silicon.
        
       | thuccess129 wrote:
       | They still have not solved the tiny inverted-T arrow key
       | arrangement on the keyboard. They need to improve on IBM's
       | former 6-key cluster below the right shift key, or arrange
       | full-sized arrow keys in a cross/plus pattern breaking out of
       | the rectangle at the lower right corner.
        
       | yodsanklai wrote:
       | > This also means that every chip Apple creates, from design to
       | manufacturing, will be 100 percent carbon neutral.
       | 
       | How is that even possible?
        
         | bee_rider wrote:
         | I guess it must be net, right? So maybe carbon offsets or
         | providing more green energy than they consume, to the grid?
        
         | strobe wrote:
          | Likely they're not, but it's possible to say that because
          | they're buying carbon offsets or similar products to make it
          | "100 percent neutral":
          | https://en.wikipedia.org/wiki/Carbon_offset
          | 
          | (Obviously Apple can afford that.)
        
         | trenchgun wrote:
         | With offsets.
        
           | robocat wrote:
           | There are two kinds of offsets: 1. existing offsets where you
           | buy them but there is no net offset creation, 2. newly
           | created offsets where your purchase makes a net difference.
           | An example of (1) could be buying an existing forest that
           | wasn't going to be felled. An example of (2) could be
           | converting a coal electricity plant to capture CO2 instead of
           | shutting the plant down.
           | 
           | A quick skim of the Apple marketing blurb at least implies
           | they are trying to create new offsets e.g. "Over 80 percent
           | of the renewable energy that Apple sources comes from
           | projects that Apple created", and "Apple is supporting the
           | development of the first-ever direct carbon-free aluminium
           | smelting process through investments and collaboration with
           | two of its aluminium suppliers." --
           | https://www.apple.com/nz/newsroom/2020/07/apple-commits-
           | to-b...
        
         | supertrope wrote:
         | Collecting juicy tax credits for installing solar power. Carbon
         | credits.
        
       | sharkjacobs wrote:
       | This is roughly in line with what I expected, given the
       | characteristics of the M1. It's still very power efficient and
       | cool, has more CPU cores, a lot more GPU cores, wider memory
       | controller, and presumably it has unchanged single core
       | performance.
       | 
       | Apple clearly doesn't mean these to be a high-performance
       | desktop offering though, because they didn't even offer a Mac
       | Mini SKU with the new M1s.
       | 
       | But what I'm really curious about is how Apple is going to push
       | this architecture for their pro desktop machines. Is there a
       | version of the M1 which can take advantage of a permanent power
       | supply and decent air flow?
        
         | julienb_sea wrote:
         | I don't think they are going to make desktop versions;
         | they'll probably put the Pro and Max versions in a new iMac
         | body during Q2 2022 and might add config options to the Mac
         | Mini. It might be for supply chain reasons, focusing 100% on
         | MacBook Pro production to meet the absolutely massive
         | incoming demand.
        
       | gradys wrote:
       | The M1 Max is only $200 more. I'm tempted, but do we think it
       | will be more power-hungry than the Pro under the same workload?
        
         | alex504 wrote:
         | On Apple's website, see the notes below the battery consumption
         | claims.
         | 
         | https://www.apple.com/macbook-pro-14-and-16/#footnote-23
         | 
         | They are using the M1 Pro to get their battery claim numbers.
         | 
         | I ordered an M1 Pro based on the slightly lower price and my
         | assumption that it will be less power-hungry. If the Pro is
         | only 200 dollars cheaper, why else would they even offer it?
         | The extra performance of the Max seems like overkill for my
         | needs, so if it has worse power consumption I don't want it.
         | I could probably get away with an M1, but I need the 16"
         | screen.
         | 
         | We will find out in a few weeks when there are benchmarks
         | posted by 3rd party reviewers, but by that time who knows how
         | long it will take to order one.
        
         | awill wrote:
         | The full-blown Max they talked about is an $800 upgrade.
         | https://www.apple.com/shop/buy-mac/macbook-pro/16-inch It's
         | combined with double RAM (32GB), double GPU (32-core).
         | 
         | The $200 upgrade is called 'Max', but is still 16GB RAM, and
         | 'only' 24 core GPU.
        
           | HunterWare wrote:
           | Nah, it's $200-400 more, and memory size is an independent
           | option.
        
       | sharken wrote:
       | So within the M1 Max's 57B transistors you could fit the AMD
       | 5800H (10B) and the RTX 3080 Ti (28B) and still have 19B
       | transistors left over.
       | 
       | The performance should be top notch, but cooling and power
       | requirements will be quite high, so a battery life of 21 hours
       | is quite the achievement.
       | 
       | Still, I prefer the open architecture of the PC any day.
        
         | Matthias247 wrote:
         | I think memory is part of that, whereas it would be excluded
         | for the other chips you mentioned.
         | 
         | But OTOH, 57B transistors for 64GB of memory would mean less
         | than one transistor per byte of memory - so I'm not sure how
         | this works, but I'm not too knowledgeable in chip design.
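         | 
         | A rough check shows the DRAM can't be in that count: a DRAM
         | cell needs about one transistor per bit (1T1C), so 64GB alone
         | would dwarf the whole 57B figure.
         | 
         |   # transistors for 64GB of DRAM at ~1 transistor per bit
         |   bits = 64 * 2**30 * 8
         |   print(bits / 1e9)  # ~550 billion, vs 57B for the SoC
         |   # So the 57B must be logic plus SRAM caches only; the
         |   # DRAM dies sit on the package but aren't counted.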
        
         | up6w6 wrote:
         | https://en.wikipedia.org/wiki/Transistor_count
         | 
         | It seems to be way beyond any CPU for end users, and even
         | some server chips like the AWS Graviton 2.
        
         | Synaesthesia wrote:
         | I wish we had real open hardware with everything documented.
         | Sadly that is very rare.
        
       | throwaway879080 wrote:
       | How do the new GPU and "Neural Engine" perform compared to
       | Nvidia GPUs, and do they support TensorFlow or something
       | similar to the CUDA SDK?
        
         | vimy wrote:
         | The M1 GPU was comparable to a 1080.
         | https://blog.tensorflow.org/2020/11/accelerating-tensorflow-...
         | I believe they are working on PyTorch support.
        
       | baybal2 wrote:
       | I wonder what the benchmark results will be.
        
       | gjsman-1000 wrote:
       | For me, it's the memory bandwidth. No other CPU comes even
       | close: a Ryzen 5950X can only transfer about 43GB/s, while this
       | thing promises 400GB/s on the highest-end model.
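       | 
       | For reference, that 5950X figure is roughly what dual-channel
       | DDR4-3200 tops out at in theory (a back-of-the-envelope, not a
       | measured number):
       | 
       |   # 2 channels * 64 bits * 3200 MT/s / 8 bits per byte
       |   print(2 * 64 * 3.2 / 8)  # -> 51.2 GB/s peak theoretical
       |   # ~43 GB/s measured is typical once overheads are included.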
        
         | Woden501 wrote:
         | No consumer CPU comes close. Just saw an article about the
         | next-gen Xeons with HBM, though, that blows even this away
         | (1.8TB/s theoretically), but what else would one expect from
         | enterprise systems. Getting pretty damn excited about all the
         | CPU manufacturers finally getting their asses into gear
         | innovation-wise after what feels like a ridiculously long
         | period of piss-warm "innovation".
        
           | kzrdude wrote:
           | Thanks to Apple in this case for taking a holistic approach
           | to making a better computer.
        
         | Thaxll wrote:
         | And it's only 10 cores, so a 5950X completely wrecks an M1.
        
         | defaultname wrote:
         | As always, though, the integrated graphics thing is a mixed
         | blessing. 0-copy and shared memory and all of that, but now the
         | GPU cores are fighting for the same memory. If you are really
         | using the many displays that they featured, just servicing and
         | reading the framebuffers must be...notable.
         | 
         | A high end graphics card from nvidia these days has 1000GB/s
         | all to itself, not in competition with the CPUs. If these GPUs
         | are really as high of performance as claimed, there may be
         | situations where one subsystem or the other is starved.
        
       | pantalaimon wrote:
       | Let's see how fast we'll see support for those in Asahi Linux
        
       | humantorso wrote:
       | I'll wait for the Linus benchmarks/comparisons.
        
         | Synaesthesia wrote:
         | You mean Anandtech
        
       | mem0r1 wrote:
       | I really wonder if the CPU cores are able to access the memory
       | at the specified high bandwidth or if it's just for the GPU
       | cores.
        
       | WithinReason wrote:
       | Isn't almost every new Apple chip the most powerful chip Apple
       | has ever built?
        
         | epistasis wrote:
         | They could have been optimizing for lower power consumption
         | rather than more compute power. For example, the next iPhone
         | chip will likely _not_ be the most powerful when it comes to
         | compute, even if it beats the other iPhone chips.
        
           | WithinReason wrote:
           | And that's exactly why I put the word "almost" in the
           | sentence!
        
             | bee_rider wrote:
             | Maybe currently, but they are only on their second
             | generation of laptop chips.
             | 
             | I guess going forward the current A-series chip will be
             | lower power/performance than any reasonably recent M-series
             | chip (given the power envelope difference).
        
       | pastelsky wrote:
       | Does this leap sound big enough to eat into the traditional
       | Windows pro laptop market?
       | 
       | IT is going to have a tough time justifying purchases.
        
       | octos4murai wrote:
       | Do these new processors mean anything for those of us who need to
       | run x86 VMs or is Apple Silicon still a no-go?
        
         | sharikous wrote:
         | You can always run an emulator such as QEMU if you only need
         | x86 once in a while.
         | 
         | Working with it would be a pain, however; if you absolutely
         | need x86, better to get an Intel Mac (possibly used) or a PC.
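         | 
         | A minimal sketch of what that looks like, driving QEMU from
         | Python (the disk and ISO paths are placeholders; on an Arm
         | host, qemu-system-x86_64 falls back to pure software
         | emulation, which is why it's slow):
         | 
         |   import subprocess
         |   # Boot an x86-64 guest under TCG (software) emulation
         |   subprocess.run([
         |       "qemu-system-x86_64",
         |       "-m", "4096",          # 4GB of guest RAM
         |       "-smp", "4",           # 4 virtual CPUs
         |       "-drive", "file=disk.qcow2,format=qcow2",
         |       "-cdrom", "installer.iso",
         |   ])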
        
       | [deleted]
        
       | mohanmcgeek wrote:
       | Disingenuous for Apple to compare these against 2017 Intel chips
       | and call them 2x and 3.7x faster.
       | 
       | I would love to see how they fare against 2021 Intel and AMD
       | chips.
        
         | lowbloodsugar wrote:
         | The slide where they say it's faster than an 8-core PC laptop
         | CPU is comparing it against the 11th-gen i7-11800H [1]. So
         | it's not as fast as the fastest laptop chip, and it's
         | certainly not as fast as the monster laptops that people put
         | desktop CPUs in. But it uses 40% of the power of a not-awful
         | 11th-gen 8-core i7 laptop. The M1 is nowhere near as fast as
         | a full-blown 16-core desktop CPU.
         | 
         | I am sure we will see reviews against high-end Intel and AMD
         | laptops very soon, and I won't be surprised if real-world
         | performance blows people away, as the M1 Air did.
         | 
         | [1] https://live.arstechnica.com/apple-october-18-unleashed-
         | even...
        
           | spacedcowboy wrote:
           | ... and neither is the M1 (in any configuration) a "full
           | blown 16 core desktop CPU".
           | 
           | Those will be called M2 and come later next year, according
           | to the rumor mill anyway.
        
             | lowbloodsugar wrote:
             | Sorry, that is what I meant. I'll edit.
        
         | Rapzid wrote:
         | When the M1 first released they pulled some marketing voodoo:
         | you always saw the actively cooled performance numbers listed
         | alongside the passively cooled TDP :D Nearly every tech
         | article/review reported those two numbers together.
        
         | klelatti wrote:
         | How is it disingenuous - defined in my dictionary as not candid
         | - when we know precisely which chips they are comparing
         | against?
         | 
         | They are giving Mac laptop users information to try to persuade
         | them to upgrade from their 2017 MacBook Pros and this is
         | probably the most relevant comparison.
        
           | eugeniub wrote:
           | I'm pretty sure they are comparing them with 2019/2020
           | MacBook Pros, which apparently have chips originally launched
           | 2017.
        
             | klelatti wrote:
             | Looks to me like they are comparing against 2020 MBPs (at
             | least for 13 inch) which use 10nm Ice Lake so nothing to do
             | with 2017 at all!!
        
         | james33 wrote:
         | They did that to compare against the last comparable Intel
         | chips in a Mac, which seems rather useful for people looking to
         | upgrade from that line of Mac.
        
           | peterkos wrote:
           | Reminds me of AMD comparing their insane IPC increase when
           | Ryzen first came out.
        
         | 00deadbeef wrote:
         | I thought they compared it with an i9-9980HK which is the top-
         | end 2019 chip in the outgoing 16" MBP?
        
         | ac29 wrote:
         | Intel's 2021 laptop chips (Alder Lake) are rumoured to be
         | released later this month (usually actual availability is a few
         | months after "release"). I expect them to be pretty compelling
         | compared to the previous generation Intel parts, and maybe even
         | vs AMD's latest. But the new "Intel 7" node (formerly 10++ or
         | something) is almost certainly going to be behind TSMC N5 in
         | power and performance, so Apple will most likely still have the
         | upper hand.
        
       | JumpCrisscross wrote:
       | Can these drive more than one external monitor?
        
         | masklinn wrote:
         | 2x6K on the Pro, 3x6K + 1x4K on the Max. The 4K seems to be
         | because that's the limit of the HDMI port.
         | 
         | No mention of MST though.
        
         | johnwheeler wrote:
         | up to 4
        
         | bredren wrote:
         | The Pro Display XDR was the only external monitor mentioned,
         | and was shown connected a few times. Screenshots and further
         | details
         | here: https://forums.macrumors.com/threads/pro-display-xdr-
         | owners-...
         | 
         | Also specific statements:
         | 
         | - M1 Pro SoC: supports two Pro Display XDRs
         | 
         | - M1 Max SoC (Connectivity: Display Support -> 33:50):
         | 
         |   - Supports three Pro Display XDRs and a 4K television
         |     simultaneously
         |   - "75 Million pixels of real estate."
         |   - Highlighted still having free ports with this setup.
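         | 
         | The pixel count roughly checks out if the built-in panel is
         | included (a back-of-the-envelope using published resolutions,
         | not numbers from the presentation):
         | 
         |   xdr = 6016 * 3384       # Pro Display XDR
         |   tv = 3840 * 2160        # 4K television
         |   panel = 3456 * 2234     # 16-inch MacBook Pro screen
         |   print((3 * xdr + tv + panel) / 1e6)  # ~77M pixels
         |   # The externals alone come to ~69M, so the "75 million"
         |   # figure presumably counts the laptop's own screen too.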
        
         | rstupek wrote:
         | It was shown driving 4 external monitors
        
         | can16358p wrote:
         | Yes. Up to three on Max.
        
         | salamandersauce wrote:
         | Yes. The Pro and Max can drive 2 XDR displays.
        
           | Kiro wrote:
           | > XDR
           | 
           | What does that mean?
        
             | salamandersauce wrote:
             | The 6K pro monitor Apple sells.
        
               | Kiro wrote:
               | Just so I don't misunderstand, does that mean I need XDR
               | or will it work with any monitor? I was very surprised to
               | see that the original M1 only supported one external
               | monitor so just want to confirm before I press buy.
        
               | salamandersauce wrote:
                | No, it will; the XDR is just a very demanding monitor,
                | so if the machine can run multiple of those it will
                | have no problem with random 1080p or 4K monitors.
        
               | yarcob wrote:
               | It will of course work with other monitors, but there
               | aren't that many high res monitors out there. The Apple
               | display is the only 6k monitor I know of. There are a few
               | 5k monitors and lots of 4k monitors.
               | 
               | There's one 8k monitor from Dell, but I don't think it's
               | supported by macOS yet.
        
             | [deleted]
        
             | NovaS1X wrote:
             | It's their marketing term for their HDR tech. The XDR
             | displays are their flagship media productivity monitors.
             | The new screens in the 14/16 inch MacBooks have "XDR" tech
             | as well.
        
         | billyhoffman wrote:
         | Indeed. The M1 Max can drive three 6K monitors and a 4K TV
         | all at once. Why? Professional film editors and color
         | graders: you can be working on the three 6K monitors and then
         | output your render to the 4K TV simultaneously.
        
         | cpascal wrote:
         | Yes, the live-stream showed a MacBook Pro connected to many
         | monitors
        
         | LeSaucy wrote:
         | The real question is can you plug in a display and not get
         | kernel panics.
        
           | CallMeMarc wrote:
           | I see myself in this one and I don't like it.
        
         | More-nitors wrote:
         | yay more-nitors!
        
       | DiabloD3 wrote:
       | Since these won't ship in non-Apple products, I don't really
       | see the point. They're only slightly ahead of AMD products when
       | it comes to performance/watt, slightly behind in
       | performance/dollar (in an Apples-to-apples comparison on
       | similarly configured laptops), and that's only because Apple is
       | ahead of AMD at TSMC for new nodes, not because Apple has any
       | inherent advantage.
       | 
       | I have huge respect for the PA Semi team, but they're basically
       | wasting that talent if Apple only intends to silo their products
       | into an increasingly smaller market. The government really needs
       | to look into splitting Apple up to benefit shareholders and the
       | general public.
        
         | kergonath wrote:
         | > I have huge respect for the PA Semi team, but they're
         | basically wasting that talent if Apple only intends to silo
         | their products into an increasingly smaller market.
         | 
         | They design the SoCs in all iPhones and soon all Macs. They
         | have the backing of a huge company with an unhealthy amount of
         | money, and are free from a lot of constraints that come with
         | having to sell general-purpose CPUs to OEMs. They can work
         | directly with the OS developers so that whatever fancy thing
         | they put in their chips is used and has a real impact on
         | release or shortly thereafter, and will be used by millions of
         | users. Sounds much more exciting than working on the n-th Core
         | generation at Intel. Look at how long it is taking for
         | mainstream software to take advantage of vector extensions. I
         | can't see how that is wasting talent.
         | 
         | > The government really needs to look into splitting Apple up
         | to benefit shareholders and the general public.
         | 
         | So that their chip guys become just another boring SoC
         | designer? Talk about killing the golden goose. Also, fuck the
         | shareholders. The people who should matter are the users, and
         | they seem quite happy with the products. Apple certainly has
         | some unfair practices, but it's difficult to argue that their
         | CPUs are problematic.
        
         | pjmlp wrote:
         | Our local office only uses macOS and Windows for desktop
         | computers, GNU/Linux nowadays only for servers in some Amazon
         | and Azure cloud instances.
         | 
         | Apple will have plenty of customers.
        
         | Jtsummers wrote:
         | "They're only slightly ahead..." and "The government really
         | needs to look into splitting Apple up to benefit shareholders
         | and the general public." doesn't really seem to jive for me.
         | 
         | If they're only slightly ahead, what's the point of splitting
         | them up when everyone else, in your analysis, is nearly on par
         | or will soon be on par with them?
        
         | rpmisms wrote:
         | This is a poorly considered take, no offense to you. I think
         | you're failing to consider that Apple traditionally drives
         | innovation in the computing market, and this will push a lot of
         | other manufacturers to compete with them. AMD is already on the
         | warpath, and Intel just got a massive kick in the pants.
         | 
         | There are other arguments against Apple being as big as it
         | is, but this isn't a good one. Tesla being huge and powerful
         | has driven amazing EV innovation, for example, and Apple is
         | in the same position in the computing market.
        
         | kzrdude wrote:
         | ARM going mainstream in powerful personal computers was
         | exciting enough as it was, with the release of the Apple
         | Silicon M1. With time hopefully these will be good to use with
         | Linux.
        
       | fwip wrote:
       | I wish we could compare Intel/AMD on the 5nm process to these
       | chips, to see how much of the speedup is the architecture vs
       | the process node.
       | 
       | Also, all of the benchmarks based on compiling code for the
       | native platform are misleading, as x86 targets often take longer
       | to compile for (as they have more optimization passes
       | implemented).
        
       | baybal2 wrote:
       | Bad day for Intel(r)
        
       | [deleted]
        
       | sam0x17 wrote:
       | Any indication on the gaming performance of these vs. a typical
       | Nvidia or AMD card? My husband is thinking of purchasing a Mac,
       | but I've cautioned him that he won't be able to use his eGPU
       | like usual until someone hacks support to work for it again,
       | and even then he'd be stuck with pretty old-gen AMD cards at
       | best.
        
         | deltron3030 wrote:
         | I wouldn't get one of those for games; better to get a
         | Windows PC and an M1 MacBook Air, as the cost should be about
         | the same for both. Game support just won't be there if you
         | care about gaming.
        
         | neogodless wrote:
         | The video moved a bit too fast for me to catch the exact
         | laptops they were comparing. They did state that the M1 Pro is
         | 2.5x the Radeon Pro 5600M, and the M1 Max is 4x the same GPU.
         | 
         | The performance to power charts were comparing against roughly
         | RTX 3070 level laptop cards.
        
           | [deleted]
        
         | rudedogg wrote:
         | Wow, I had no idea M1 doesn't support eGPUs. I was planning on
         | buying an external enclosure for Apple Silicon when I upgraded;
         | thanks for pointing that out.
        
           | sam0x17 wrote:
           | Not only that, but you're stuck with cards from 5 years ago
           | with current MacBooks. It's a shame too, because the plug-
           | and-play support is better than the very best eGPU plug-
           | and-play on Linux or Windows.
           | 
           | Personally, I don't see Apple adding support any time soon,
           | either. Clearly their play now is to make all hardware in-
           | house. The last thing they want is people connecting 3090s
           | so they can have an M1 Max gaming rig. They only serve
           | creators, and this has always been true. Damn waste if you
           | ask me.
        
             | rudedogg wrote:
             | You could at least use the new AMD cards, though, right?
             | I don't think Nvidia support will ever happen again (I
             | got burned by that; I bought a 1060 right before they
             | dropped Nvidia).
             | 
             | I'm on a RX 5700XT with my Hackintosh, and it works well.
             | 
             | Edit: Thinking about this more.. I bet third party GPUs are
             | a dead end for users and Apple is planning to phase them
             | out.
        
               | jdminhbg wrote:
               | I think I'd wait to see what the Mac Pro supports before
               | coming to that conclusion. It could be something
               | they're still working on for that product, and once
               | it's working on the Apple Silicon build of macOS it
               | will be made available on laptops as well.
        
               | sam0x17 wrote:
               | I guarantee you this is them finally ripping off the
               | Band-Aid of having to support _any_ outside hardware.
        
               | sam0x17 wrote:
               | Oh that's cool, last I checked the best AMD card you can
               | use is a Vega 64.
        
             | lowbloodsugar wrote:
             | The Radeon 6900XT works with eGPUs. But yes, Intel Macs
             | only. And of course, you're not getting all 16 lanes!
             | 
             | https://support.apple.com/en-us/HT208544
        
         | ksec wrote:
         | In terms of hardware the M1 Max is great. On paper you won't
         | find anything that matches its performance under load, as
         | even gaming / content-creation laptops throttle after a
         | short while.
         | 
         | The problem is that gaming isn't exactly a Mac thing, from
         | game selection to general support on the platform. So really,
         | performance should be the least of your concerns if you are
         | buying a Mac for games.
        
           | sam0x17 wrote:
           | He typically just dual-boots windows anyway so selection
           | isn't much of an issue, though I also don't know if that is
           | working yet on the M1 platform
        
             | ksec wrote:
             | >though I also don't know if that is working yet on the M1
             | platform
             | 
             | Dual booting isn't working and likely won't any time
             | soon, as Microsoft does not intend to support the Apple
             | M1 [1]. And I doubt Apple has any intention of porting
             | its Metal GPU drivers to Windows. (Compared to using AMD
             | drivers on Windows in the old days.)
             | 
             | He will likely need to use some sort of VM solution like
             | Parallels. [2]
             | 
             | [1] https://appleinsider.com/articles/21/09/13/microsoft-
             | says-wi...
             | 
             | [2] https://www.engadget.com/parallels-desktop-17-m-1-mac-
             | perfor...
        
           | mlindner wrote:
           | I'm not aware of a single even semi-major game (including
           | popular indie titles) that runs natively on M1 yet.
           | Everything is running under Rosetta, and game companies so
           | far seem completely uninterested in native support.
        
             | zamadatix wrote:
             | If you're looking for some to test try EVE, WOW, or
             | Baldur's Gate 3. That's about it as far as major games
             | AFAIK.
        
             | modulusshift wrote:
             | Disco Elysium, EVE Online, Minecraft*, Timberborn, Total
             | War: Rome Remastered, World of Warcraft.
             | 
             | Minecraft is kinda cheating, because Java, and even
             | considering that it takes a bit of hacking. Alternatively
             | you can sideload the iOS version.
        
             | boardwaalk wrote:
             | World of Warcraft & Eve Online are both native.
        
         | BitAstronaut wrote:
         | The fine print:
         | 
         | >Testing conducted by Apple in September 2021 using
         | preproduction 16-inch MacBook Pro systems with Apple M1 Max,
         | 10-core CPU, 32-core GPU, 64GB of RAM, and 8TB SSD, as well as
         | production Intel Core i9-based PC systems with NVIDIA Quadro
         | RTX 6000 graphics with 24GB GDDR6, production Intel Core
         | i9-based PC systems with NVIDIA GeForce RTX 3080 graphics with
         | 16GB GDDR6, and the latest version of Windows 10 available at
         | the time of testing.
         | 
         | https://www.businesswire.com/news/home/20211018005775/en/Gam...
        
       | eecc wrote:
       | Sigh... back to eating ramen (joking... I'm Italian, I'd never
       | cut on my food)
        
       | yellowapple wrote:
       | If Apple ever gets around to putting an M1 Max in a Mac Mini,
       | that'd probably push me over the edge toward buying one.
        
         | Andys wrote:
         | Yeah, especially if it could run Linux. This would be a
         | powerful little server.
         | 
         | I decked out my workstation with a 16 core Ryzen & 96GB RAM and
         | it didn't cost anywhere near the price of this new 64GB M1 Max
         | combo. (But it may very well be less powerful, which is
         | astonishing. It would be great to at least have the choice.)
        
         | soheil wrote:
         | Can't you buy a MBP 16" and connect it to whatever display you
         | were going to connect your Mac Mini to?
        
           | jb1991 wrote:
           | But there's a huge difference in the price if you don't need
           | all the other stuff that comes with the MacBook Pro.
        
             | cmckn wrote:
             | Agreed, and the mini has different I/O that you might
             | prefer (USB-A, 10 gig eth). Also, it's smaller (surprise,
             | "mini"). Plus, clamshell mode on a MacBook just isn't the
             | same as a desktop for "always-on" use cases.
        
           | [deleted]
        
       | nodesocket wrote:
       | > 140W USB-C Power Adapter
       | 
       | Wait huh? My current 16" Intel Core i9 is only 95watts. Does this
       | mean all my existing USB-C power infrastructure won't work?
        
         | google234123 wrote:
         | No, it would just be slower to charge.
        
         | barelysapient wrote:
         | Maybe it's a typo and they mean the MagSafe 3 connector? I
         | thought the USB-C standard was limited to 100 watts.
        
           | jhawk28 wrote:
           | They may have a USB-C -> MagSafe 3 cable. It lets you just
           | replace the cable when it inevitably breaks instead of the
           | whole brick.
        
             | yarcob wrote:
             | Instead of having to replace the MagSafe power brick for
             | 85 EUR you can now just replace the cable for 55 EUR.
             | 
             | In my personal experience, however, I've never had the
             | cable fail, but I've had 2 MagSafe power supplies fail
             | (they started getting very hot while charging and at
             | some point stopped working altogether).
        
             | barelysapient wrote:
             | That's exactly what it is.
        
           | yarcob wrote:
           | They recently introduced a new standard (USB PD 3.1) that
           | allows up to 240W.
        
           | nly wrote:
           | Dell USB-C power adapters have exceeded 100 watts in the past
           | on e.g. the XPS 17
        
         | can16358p wrote:
         | It will work, but it will charge slower and probably won't
         | charge at all when you're maxing out the SoC.
        
           | robertrbairdii wrote:
           | I think the 140W power adapter is there to support fast
           | charging; they mentioned charging to 50% capacity in 30
           | minutes. I'd imagine sustained power draw is much less
           | than 140W.
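           | 
           | The arithmetic supports that, assuming the 16-inch's
           | published ~100Wh battery:
           | 
           |   # 50% of a ~100Wh pack in 0.5h, ignoring losses
           |   print((0.5 * 100) / 0.5)  # ~100W average into the pack
           |   # The 140W brick's extra ~40W leaves headroom to run
           |   # the machine while fast-charging.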
        
         | miohtama wrote:
         | It likely charges slower, though there is a minimum required
         | power - you cannot charge with a mobile charger.
        
         | salamandersauce wrote:
         | My guess is it will work like using a 65W adapter on a Mac that
         | prefers 95W. It will charge if you're not doing much but it
         | will drain the battery if you're going full tilt.
        
           | yellow_postit wrote:
           | Just like USB-C cables can differ in capacity I'm finding a
           | need to scrutinize my wall chargers more now. A bunch of
           | white boxes but some can keep my MacBook Pro charged even
           | when on all day calls and some can't. With PD the size isn't
           | a great indicator anymore either.
        
       | gpt5 wrote:
       | It's interesting that the M1 Max is similar in GPU performance to
       | an RTX 3080. A sub-$1000 Mac Mini would end up being the best
       | gaming PC you could buy, at less than half the price of an
       | equivalent Windows machine.
        
         | cloogshicer wrote:
         | If only the software compatibility were there. I'd love to
         | be able to skip buying expensive desktop machines exclusively
         | for gaming.
        
         | burmanm wrote:
         | But without games to play on it. Instead, you could just get an
         | Xbox Series X for half that price.
        
           | lowbloodsugar wrote:
           | PS5! No true Apple owner buys a Microsoft product!
        
         | smileybarry wrote:
         | Similar to an RTX 3080 _Mobile_, which IIRC is equivalent to
         | somewhere around an RTX 3060 Desktop.
        
         | neogodless wrote:
         | The M1 Max starts at $2700 for the 16" 10-core CPU, 24-core
         | GPU.
         | 
         | The comparison to higher end graphics uses the M1 Max 32-core
         | GPU which starts at $3500.
         | 
         | I'm not seeing a way for a Mac Mini to have the M1 Max, and
         | still be priced below $1000.
        
       | [deleted]
        
       | SalimoS wrote:
       | Why is no one talking about the base 14" with the 8-core CPU
       | and 14-core GPU? There wasn't a single mention of it in the
       | presentation.
       | 
       | How does the new M1 Pro 8-core compare to the M1 8-core?
        
         | masklinn wrote:
         | The M1 has 4 small cores and 4 big, while the base Pro is 2
         | small/6 big. I didn't really see whether they claimed any
         | difference in the performance of the cores themselves.
        
         | bnastic wrote:
         | I don't think the 'old' M1 13" Pro is long for this world - for
         | PS100 more (16GB+1TB spec) you get a much better machine in the
         | 14" model. But independent comparisons will follow.
         | 
         | I'd love to see a Pro/Max Mac Mini, but that's not likely to
         | happen.
        
       | slayerjain wrote:
       | Based on the numbers it looks like the M1 Max is in the RTX
       | 3070-3080 performance territory. Sounds like mobile AAA gaming
       | has potential to reach new heights :D
        
         | smileybarry wrote:
         | If Proton is made to work on Mac with Metal, there's some real
         | future here for proper gaming on Mac. Either that or Parallels
         | successfully running x64 games via Windows on ARM
         | virtualization.
        
           | MetricExpansion wrote:
           | I've been looking into trying CrossOver Mac on my M1 Max for
           | exactly this reason. After seeing the kinds of GPUs Apple is
           | comparing themselves to, I'm very hopeful.
        
         | notSupplied wrote:
         | The elephant in the room is that an A15/M1 with a beefed-up
         | GPU is exactly the right chip for a Nintendo Switch/Steam
         | Deck form-factor device.
        
           | threeseed wrote:
           | Or an AR/VR headset.
        
         | slayerjain wrote:
         | And in the case of the M1 Pro, Apple is showing it to be
         | faster than the Lenovo 82JW0012US, which has an RTX 3050 Ti.
         | So the performance could be between an RTX 3050 Ti and an
         | RTX 3060. All of this with insanely low power draw.
        
           | ant6n wrote:
           | But still not fanless, right? Maybe they'll update the
           | MacBook Air with some better graphics as well, so that one
           | could do some decent gaming without a fan.
        
             | zamadatix wrote:
             | The Air is already at its thermal limit with the 8-core
             | GPU; that will probably have to wait for the next real
             | iteration of the chip (M2 or whatever), which increases
             | efficiency instead of just being larger.
        
         | TheRealDunkirk wrote:
         | It's not a function of capability. I spent $4,000 on a 2019
         | MBP, including $750 for a Vega 20. It plays Elder Scrolls
         | Online WORSE than a friend's 2020 with INTEGRATED graphics. (I
         | guess Bethesda gave some love to the integrated chipset, and
         | didn't optimize for the Vega. It hitches badly every couple of
         | seconds like it's texture thrashing.)
         | 
         | Whatever AAA games that might have gotten some love on the Mac
         | (and there are some), it's going to be even harder to get game
         | companies to commit to proper support to the M1 models.
         | Bethesda has said they won't even compile ESO for M1. So I will
         | continue to run it on a 12-year-old computer running an
         | Athlon 64 and an Nvidia 9xx-series video card. (This works
         | surprisingly well, making the fact that my Mac effectively
         | can't play it all the more galling.)
         | 
         | I'm never going to try tricking out a Mac for gaming again. I
         | should have learned my lesson with eGPU's, but no. I thought,
         | if I want a proper GPU, it's going to be built in. Well, that
         | doesn't work either. I've wasted a lot of money in this arena.
        
           | modulusshift wrote:
           | Well, Apple is selling M1 Macs like hotcakes, so it won't
           | be too long until it'll be stupid _not_ to support them.
           | Also, texture thrashing isn't really an issue when you've
           | got shared CPU/GPU memory with 64 GB of space. Just cache
           | like half the game in it lol
        
             | smoldesu wrote:
             | Apple had a shot at making Mac gaming a reality around
             | 2019. They decided to axe 32-bit library support though,
             | which instantly disqualifies the lion's share of PC games.
             | You could still theoretically update these titles to run on
             | macOS, but I have little hope that any of these developers
             | would care enough to completely rewrite their codebase to
             | be compatible with 15% of the desktop market share.
        
               | TheRealDunkirk wrote:
               | Yeah, I was living with gaming on a Mac when that
               | happened, and watched 2/3rds of my Steam library get
               | greyed out.
        
               | modulusshift wrote:
               | Yeah, and they also deprecated OpenGL, which would have
               | wiped out most of those games even if the 32-bit support
               | didn't. I'm not expecting to see much backwards
               | compatibility, I'm expecting forwards compatibility, and
               | we're starting to see new titles come out with native
               | Apple Silicon support, slowly, but surely.
        
               | smoldesu wrote:
               | I wouldn't hold your breath. Metal is still a second-
               | class citizen in the GPU world, and there's not much
               | Apple can do to change that. Having powerful hardware
               | alone isn't enough to justify porting software, otherwise
               | I'd be playing Star Citizen on a decommissioned POWER9
               | mainframe right now.
        
               | wolrah wrote:
               | The major factor Apple has working in their favor with
               | regards to the future of gaming on Macs is iOS. Any game
               | or game engine that wants to support iPhone or iPad
               | devices is going to be most of the way to supporting ARM
               | Macs for "free".
               | 
               | My older Intel Macs I'm sure are more or less SOL but
               | they were never intended to be gaming machines.
        
             | Thaxll wrote:
             | There is a 0% chance that game devs support the Mac;
             | it's a dead platform for gaming.
             | 
             | The downvote police are here. Am I missing something?
             | Are there any modern games on the Mac?
             | 
             | https://applesilicongames.com/
        
           | gremloni wrote:
           | I love that game. Wonder why it gets so much hate online.
        
         | PragmaticPulp wrote:
         | > Based on the numbers it looks like the M1 Max is in the RTX
         | 3070-3080 performance territory.
         | 
         | The slides are comparing to a laptop with a 3080 Mobile, which
         | is not the same as a normal RTX 3080. A desktop 3080 is a power
         | hungry beast and will not work in a laptop form factor.
         | 
         | The 3080 Mobile is still very fast as a benchmark, but the
         | full-size 3080 Desktop is in another league of performance:
         | https://www.notebookcheck.net/The-mobile-GeForce-RTX-3080-is...
         | 
         | Still very impressive GPU from Apple!
        
         | jjcon wrote:
         | Mobile 3070-80, so around a desktop 1080, for those who are
         | interested. I'm curious what their benchmark criteria were,
         | though.
        
         | hajile wrote:
         | The 3080M has something around 20 TFLOPS of theoretical FP32
         | performance. That's double that of Apple's biggest offering.
         | 
         | In theoretical compute, it's closer to the 3060M, which is
         | nothing to sneeze at.
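         | 
         | Rough numbers behind that comparison (shader counts are
         | public; the clocks here are estimates, so treat it as a
         | ballpark):
         | 
         |   # FP32 TFLOPS ~= shader ALUs * 2 (FMA) * clock
         |   m1_max = 32 * 128 * 2 * 1.27e9 / 1e12  # ~10.4 TFLOPS
         |   rtx_3080m = 6144 * 2 * 1.7e9 / 1e12    # ~20.9 TFLOPS
         |   print(m1_max, rtx_3080m)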
        
         | Ronson wrote:
         | Not to disagree, but what gave you that impression?
        
           | vesrah wrote:
           | The slides with performance numbers list the comparison
           | machines
        
           | slayerjain wrote:
           | Based on the relative differences between the M1 and the
           | M1 Pro/Max, and also the comparisons Apple showed against
           | other laptops from MSI and Razer, both featuring the RTX
           | 3080.
        
         | culopatin wrote:
         | But do you need to play those games through Rosetta? Does
         | that make a difference?
        
       | [deleted]
        
       | mattfrommars wrote:
       | I am in the market for a new laptop and a bit skeptical of the
       | M1 chips. Could anyone please tell me how this is not a
       | "premium high-performance Chromebook"?
       | 
       | Why should I buy this and not a Dell XPS machine if I will be
       | using it for web development/Android development/C#/DevOps? I
       | might soon mess with machine learning too.
        
         | raydev wrote:
         | Better battery life while performing faster than any Intel/AMD
         | chip at equivalent power usage. Portability is the reason for
         | the existence of laptops.
        
         | azeirah wrote:
         | For web dev it works very well; for Android development, no
         | clue. C# is primarily a Windows-oriented development
         | environment, so probably not so great. For DevOps, well...
         | macOS is a Linux-esque environment, so it'll be fine?
         | 
         | I have the m1 air and the big advantages are the absurd battery
         | life and the extremely consistent very high performance.
         | 
         | It's always cold, it's always charged, it's always fast.
         | 
         | I believe you can get both faster cpu and gpu performance on a
         | laptop, but it costs a lot in battery life and heat which has a
         | bigger impact on usability than I believed before getting this
         | one.
         | 
         | Might want to add, this is my first ever apple laptop. I've
         | always used Lenovo laptops, ran mostly windows/linux dual boots
         | on my laptops and desktops over the years.
        
           | avisser wrote:
           | > c# is primarily a windows-oriented development environment
           | so probably not so great.
           | 
           | I'm not confident that is still true today. .net core is
           | multi-platform all-the-way and is the future.
        
         | Weryj wrote:
         | That's a debate between macOS and the alternatives; the M1
         | has little to do with it, except maybe Rosetta for
         | unsupported x86 software, but I doubt that'll cause you any
         | issues.
         | 
         | Edit: C# support is there on OSX, Rider has Apple Silicon
         | builds and .net core is cross platform now.
         | 
         | ML is probably a letdown and you'll be stuck with CPU-sized
         | workloads; that being said, the M1 does pretty well compared
         | to x86 CPUs.
        
         | sjroot wrote:
         | I am a very satisfied Apple customer, but will gladly tell you
         | that a Windows machine would make more sense for the use cases
         | you've described.
        
           | raydev wrote:
           | C# is the only odd item there. Entire companies are doing
           | everything else in that list using exclusively MacBook Pros.
        
             | skohan wrote:
             | Isn't C# quite available on mac?
        
               | other_herbert wrote:
               | Yes and dotnet 6 (release date scheduled for 11/9, but is
               | available now as rc2) works natively on the m1... REALLY
               | quickly I might add :)...
               | 
               | I go between vs.code and JetBrains Rider (rider now has
               | an EAP that runs natively on the m1)...
               | 
               | I am going to upgrade just because I didn't get enough
               | ram the first time around :)
        
           | bengale wrote:
           | Other than the ML work maybe. The Apple chips claim to be
           | very performant with ML workloads.
        
       | [deleted]
        
       | yumraj wrote:
       | $400 to upgrade RAM from 16GB to 32GB -- Ouch!!
        
         | tyingq wrote:
         | Also $400 to go from 32GB to 64GB if you start with the
         | 16-inch MBP with the M1 Max. So $400 in that case buys 32GB
         | extra instead of just 16GB extra.
         | 
         | Interesting.
        
           | Andys wrote:
           | Except you already paid more to get the M1 Max; the
           | upgrade to 64GB is not available on any other models.
        
         | asdff wrote:
         | Apple has always been so stupid about RAM pricing. I miss the
         | days where you could triple your base RAM by spending maybe $75
         | on sticks from crucial and thirty seconds of effort.
        
       | Decabytes wrote:
       | What is it about the M1 architecture that makes it so speedy
       | compared to x86 chips? Is it the RISC instruction set? The
       | newer process node? Something else?
        
         | [deleted]
        
       | shartacct wrote:
       | The M1 Pro and Max are very likely the fastest consumer CPUs
       | in the world, despite being stuck in a 30-40W power envelope.
       | Apple is embarrassing AMD and Intel engineers right now.
        
       | citilife wrote:
       | I thought this link was far better:
       | https://www.apple.com/newsroom/2021/10/introducing-m1-pro-an...
        
         | dang wrote:
         | Ok, we've changed to that from
         | https://www.theverge.com/2021/10/18/22726444/apple-m1-pro-
         | ma.... Thanks!
        
       | taurath wrote:
       | It's kind of surprising to me that they're not making more of
       | an effort towards game support. Maybe someone can explain what
       | the barriers are to Mac support - is it a lack of shared
       | libraries, x64-only binaries, the sheer number of compute
       | cores?
       | 
       | When I see the spec sheet and "16x graphics improvement" I go
       | okay what could it handle in terms of game rendering? Is it
       | really only for video production and GPU compute tasks?
        
         | modulusshift wrote:
         | The gaming capability is there, but Apple only officially
         | supports its own Metal on macOS as far as graphics APIs go,
         | meaning the only devs with experience with Metal are the
         | World of Warcraft team at Blizzard and mobile gaming studios.
         | MoltenVK exists to help port Vulkan based games, but generally
         | it's rough going at the moment. I'm personally hoping that the
         | volume of M1 Macs Apple has been selling will cause devs to
         | support Macs.
        
         | thcleaner999 wrote:
         | Price. You can't win the gaming market with a $2000+ product.
        
           | terafo wrote:
           | Have you seen GPU prices lately? Desktop-3070-level
           | performance in a portable laptop for $3500 is not that bad
           | of a deal. If they make a Mac Mini around $2500 it would
           | be pretty competitive in the PC space.
        
         | terafo wrote:
         | They never had good graphics in mainstream products before,
         | and there's no official support for any of the industry-
         | standard APIs (to be fair there is MoltenVK, but it hasn't
         | gained much traction yet). Yes, there is support for Metal
         | in UE4/5 and Unity, but AAA games use custom engines, and
         | the cost/benefit analysis didn't make much sense. Maybe now
         | that will change.
        
       | WoodenChair wrote:
       | Is there any reason to believe these will have improved single-
       | core performance? Or are these just more M1 cores in the same
       | package?
        
         | eugeniub wrote:
         | FWIW initial Geekbench scores that surfaced today do not show a
         | significant improvement in the single-core performance for M1
         | Max compared to M1.
         | https://www.macrumors.com/2021/10/18/first-m1-max-geekbench-...
        
       | 41209 wrote:
       | Has anyone tried extremely graphically intense gaming on these
       | yet? I would actually love to consolidate all of my computer
       | usage onto a single machine, but it would need to handle
       | everything I need it to do. $2000 for a laptop that can
       | replace my desktop is not a bad deal. That said, I'm in no
       | rush here.
        
         | smoldesu wrote:
         | Gaming has been a non-starter on macOS since 2018. You _can_
         | get certain, specific titles working if you pour your heart
         | and soul into it, but it's nothing like the current
         | experience on Linux/Windows, unfortunately.
        
           | 41209 wrote:
           | Darn.
           | 
           | Hard pass then. I already have a pretty fast Windows
           | laptop. The only real issue is that it sounds like a
           | helicopter under load. (It also thermal-throttles hard.)
        
       | thecleaner wrote:
       | Naive question - is this chip purely homemade at Apple, or
       | does it use Arm-licensed IP?
        
         | modulusshift wrote:
         | Apple has an architecture license from ARM, so they're allowed
         | to create their own ARM-compatible cores. They do not license
         | any core designs from ARM (like the Cortex A57), they design
         | those in house instead.
        
         | thoughtsimple wrote:
         | Apple silicon uses the Arm AArch64 instruction set, which is
         | Arm IP. I don't believe Apple silicon uses any other Arm IP,
         | but who really knows, since Apple would likely have written
         | its contracts to prevent disclosure.
        
       | lowbloodsugar wrote:
       | There appears to be a sea change in RAM (on a MacBook) and its
       | effect on the price. I remember buying a MacBook Pro back in
       | 2009: while the upgrade to 4GB was $200, the upgrade to 8GB
       | was $1000 IIRC! Whereas the upgrade from 32GB to 64GB is only
       | $400 here.
       | 
       | Back then, more memory required higher-density chips, and
       | these were just vastly more expensive. It looks like the M1
       | Max simply adds more memory controllers, so that the 64GB
       | doesn't need rarer, higher-priced, higher-density chips; it
       | just has twice as many of the normal ones.
       | 
       | This is something that very high-end laptops do: have four
       | slots for memory rather than two. It's great that Apple is
       | doing this too. And while the chips aren't user-replaceable
       | (no 128GB upgrade for me), they are not just more memory on
       | the same channel either: the Max has 400GB/s compared to the
       | Pro's 200GB/s.
        
       | throwawaywindev wrote:
       | The M1 Max is drool worthy but Mac gaming still sucks. Can't
       | really justify it given that I don't do video editing or machine
       | learning work.
        
       | lisper wrote:
       | "Apple's Commitment to the Environment"
       | 
       | > Today, Apple is carbon neutral for global corporate operations,
       | and by 2030, plans to have net-zero climate impact across the
       | entire business, which includes manufacturing supply chains and
       | all product life cycles. This also means that every chip Apple
       | creates, from design to manufacturing, will be 100 percent carbon
       | neutral.
       | 
       | But what they won't do is put the chip in an expandable and
       | repairable system so that you don't have to discard and replace
       | it every few years. This renders the carbon-neutrality of the
       | chips meaningless. It's not the chip, it's the _packaging_ that
       | is massively unfriendly to the environment, stupid.
        
         | [deleted]
        
         | itsangaris wrote:
         | As others have said, the option to easily configure the
         | computer post-purchase would make a massive difference in
         | terms of its footprint.
        
         | threeseed wrote:
         | I just used Apple's trade-in service for my 2014 MacBook Pro
         | and received $430.
         | 
         | So there is another, quite lucrative, option besides discarding
         | it.
        
           | grvdrm wrote:
           | Good to know and thanks! Looking forward to doing the same as
           | I have a 2014 MBP that works and is in good condition.
        
         | remir wrote:
       | Aren't most of the components in MacBooks recyclable? If I
       | remember correctly, Apple has a recycling program for old
       | Macs, so it's not like these machines go to landfill when
       | they're past their time or broken.
        
           | iSnow wrote:
           | AFAIK their recycling is shredding everything, no matter
           | if it still works, and separating out gold and some other
           | metals.
        
             | threeseed wrote:
             | When you trade-in your Mac you are asked if the enclosure
             | is free of dents, turns-on, battery holds charge etc.
             | 
             | Strange that they would ask these questions if they were
             | simply going to shred the device.
        
           | peatmoss wrote:
           | I believe Apple tries to use mostly recyclable components.
           | And they do have a fairly comprehensive set of recycling /
           | trade-in programs around the globe:
           | https://www.apple.com/recycling/nationalservices/
           | 
           | That being said, I haven't read any third-party audits to
           | know if this is more than Apple marketing. Would be curious
           | if they live up to their own marketing.
        
         | simonh wrote:
         | >so that you don't have to discard and replace it every few
         | years
         | 
         | Except you and I surely must know that's not true, that their
         | machines have industry-leading service lifetimes, and
         | correspondingly high resale values as a result. Yes some pro
         | users replace their machines regularly but those machines
         | generally go on to have long productive lifetimes. Many of
         | these models are also designed to be highly recyclable when the
         | end comes. It's just not as simple as you're making out.
        
           | cududa wrote:
           | Right. I'm not speaking to iPhones or iPads here, but the non-
           | serviceability creates a durability pretty much unmatched by
           | Windows laptops.
           | 
           | Was resting my 2010 MBP on the railing of a second story
           | balcony during a film shoot and it dropped onto the marble
           | floor below. Got pretty dented, but all that didn't work was
           | the ethernet port. Got the 2015 one and it was my favorite
           | machine ever - until it got stolen.
           | 
           | The 2017 one (typing on it now) is the worst thing I've ever
           | owned and I'm looking forward to getting one of the new ones.
           | The 2017 one:
           | 
           | - Fries any low-voltage USB device I plug in (according to
           | some internal Facebook forums they returned 2-5k of this
           | batch for that reason)
           | 
           | - When it fried an external drive plugged in on the right, it
           | also blew out the right speaker.
           | 
           | - Every time I try to charge it I get to guess which USB-C
           | port is going to work for charging. If I pick wrong I have to
           | power cycle the power brick (this is super fun when the
           | laptop's dead and there's no power indicator, as there is on
           | the revived MagSafe)
           | 
           | - A half-dime-shaped bit of glass popped out of the bottom of
           | the screen when it was under load - this has happened to
           | others in the same spot, but "user error"..
           | 
           | Pissed Apple wouldn't replace it given how many other users
           | have had the same issues, but this thing has taken a beating
           | as have my past laptops. I'll still give them money if the
           | new one proves to be as good as it seems.
        
         | oblio wrote:
         | Yeah, but one of them doesn't cost a ton to implement (what
         | they're doing) and the other one would cost them a ton through
         | lost sales (what you're asking for).
         | 
         | Always follow the money :-)
        
           | grenoire wrote:
           | Erhh... I think OP gets it, he's just calling out the
           | greenwashing.
        
         | ink404 wrote:
         | Agree with your point, but one could also look at the
         | performance/power savings and use that in an argument for
         | environmental friendliness.
        
         | coryfklein wrote:
         | > it's the packaging that is massively unfriendly to the
         | environment, stupid.
         | 
         | Of all the garbage my family produces over the course of time,
         | my Apple products probably take less than 0.1% of my family's
         | share of the landfill. Do you find this to be different for
         | you? Or am I speaking past the point you're trying to make
         | here?
        
         | sudhirj wrote:
         | All of my Apple laptops (maybe even all their products) see
         | about 5 to 8 years of service. Sometimes with me, sometimes as
         | hand-me-downs. So they've been pretty excellent at not winding
         | up in the trash.
         | 
         | Even software updates often stretch as far back as 5-year-old
         | models, so they're pretty good with this.
        
           | dntrkv wrote:
           | Big Sur is officially supported by 2013 Mac models (8 years).
           | 
           | iOS 15 is supported by the 6s, which was 2015. So 6 years.
           | 
           | And I still know people using devices from these eras. Apple
           | may not be repair friendly, but at the end of the day, their
           | devices are the least likely to end up in the trash.
        
           | patall wrote:
           | And here I am sitting at my 2011 Dell Latitude wondering what
           | is so special about that. My sis had my 2013 Sony Duo but
           | that's now become unusable with its broken, built-in battery.
           | Yes, 5 to 8 years of service is nice, but not great or out of
           | the norm for a $1000+ laptop.
        
             | jbverschoor wrote:
             | Because they run Windows.
             | 
             | If you look at Android phones, you're looking at a few
             | years only.
             | 
             | Because of software.
        
               | patall wrote:
               | Parent is talking about laptops, I am talking about
               | laptops, why are you talking about smartphones? Though I
               | also had my Samsung S2plus from 2013 to 2019 in use, and
               | that was fairly cheap. I do not know any iPhone users
               | that had theirs for longer.
        
         | [deleted]
        
         | moralestapia wrote:
         | >This renders the carbon-neutrality of the chips meaningless.
         | 
         | You have a point, but if they were truly carbon neutral it
         | wouldn't matter if you made 100,000 of them and threw them
         | away.
        
         | ravi-delia wrote:
         | It absolutely doesn't render the carbon-neutrality of the chip
         | useless. Concern about waste and concern about climate change
         | are bound by a political movement and not a whole lot else.
         | It's not wrong to care about waste more, but honestly it's
         | emissions that I care about more.
        
           | Mikeb85 wrote:
           | > It's not wrong to care about waste more, but honestly it's
           | emissions that I care about more.
           | 
           | Waste creates more emissions. Instead of producing something
           | once, you produce it twice. That's why waste is bad, it's not
           | just about disposing of the wasted product.
        
         | savanaly wrote:
         | Is there an estimate of what the externality cost is for the
         | packaging per unit? It would be useful to compare it to other
         | things that harm the environment, like eating meat or taking
         | car rides, to know how much I should think about this. E.g. if
         | my iPhone
         | packaging is equivalent to one car ride I probably won't
         | concern myself that much, but if it's equivalent to 1000 then
         | yeah maybe I should. Right now I really couldn't tell you which
         | of those two the true number is closer to. I don't expect we
         | would be able to know a precise value but just knowing which
         | order of magnitude it is estimated to be would help.
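         | 
         | For a very rough sense of scale, using public figures from
         | memory (treat every number here as an assumption, order of
         | magnitude only):
         | 
         |     # Apple publishes whole-lifecycle CO2e estimates per
         |     # device; recent iPhones are ballpark ~65 kg CO2e, and
         |     # packaging is only a small slice of that
         |     phone_lifecycle_kg = 65
         |     car_kg_per_km = 0.25   # typical gasoline car, ~250 g/km
         |     print(phone_lifecycle_kg / car_kg_per_km)  # ~260 km driven
         | 
         | So even the whole phone, never mind just its packaging, looks
         | closer to "one long car ride" than to a thousand of them.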
        
         | nojito wrote:
         | >But what they won't do is put the chip in an expandable and
         | repairable system so that you don't have to discard and replace
         | it every few years. This renders the carbon-neutrality of the
         | chips meaningless. It's not the chip, it's the packaging that
         | is massively unfriendly to the environment, stupid.
         | 
         | Mac computers last way longer than their PC counterparts.
        
           | gruez wrote:
           | Is Apple's halo effect affecting your perception of the Mac
           | vs PC market? iPhones last longer because they have much
           | longer software updates, and are more powerful to start with.
           | None of these factors apply to macs vs PCs.
        
         | schleck8 wrote:
         | Apple, the company that requires the entire panel to be
         | replaced by design when a 6 dollar display cable malfunctions,
         | is proud to announce its latest marketing slogan for a better
         | environment.
        
           | rubyist5eva wrote:
           | Obsession with emissions has really made people start to miss
           | the forest for the trees.
        
             | jedberg wrote:
             | Emissions affect everyone on the planet, no matter where
             | they happen. But polluting the ground or water only happens
             | in China, so a lot of Americans that care about emissions
             | don't care about the other types of pollution, because it
             | doesn't affect them.
        
               | someperson wrote:
               | You would be surprised just how much food you eat has
               | been grown in China using polluted land and water.
               | 
               | It's not so much fresh vegetables, but ingredients in
               | other types of food -- especially the frozen fruit,
               | vegetables and farmed seafood that finds its way into
               | grocery store and restaurant supply chains.
        
             | castis wrote:
             | Could you elaborate on what you mean by this?
        
               | supertrope wrote:
               | I'm guessing they mean that greenwashing statements about
               | lower CO2 emissions gloss over more "traditional"
               | pollution such as heavy metals, organic solvents, SO2,
               | and NOx. Taming overconsumption is greener than finding
               | ways to marginally reduce per-unit emissions on ever more
               | industrial production.
        
               | 8ytecoder wrote:
               | Not OP. Personally, I've had Dell, HP and Sony laptops.
               | But the Macs have been the longest lasting of them all.
               | My personal Pro is from 2015.
               | 
               | It has also come to a point where none of the upgrades
               | make sense _for me_. 512GB is plenty. RAM might be an
               | issue - but I honestly don't have enough data on that.
               | The last time I had more than 16GB RAM was in 2008 on my
               | hand-built desktop.
               | 
               | As long as the battery can be replaced/fixed - even if
               | it's not user serviceable, _I'm_ okay with that. I'd
               | guess I'm not in the minority here. Most people buy a
               | computer and then take it to the store even if there's a
               | minor issue. And Apple actually shines here. I have
               | gotten my other laptop serviced - but only in
               | unauthorized locations with questionable spare parts.
               | With Apple, every non-tech savvy person I know has been
               | able to take to an Apple store at some point and thereby
               | extend the life.
               | 
               | That's why I believe having easily accessible service
               | locations does more for device longevity than being user-
               | serviceable.
               | 
               | (In comparison, HTC wanted 4 weeks to fix my phone plus
               | 1wk either way in shipping time _and_ me paying shipping
               | costs in addition to the cost of repair. Of course, I
               | abandoned the phone entirely rather than pay to fix it.)
               | 
               | We could actually test this hypothesis - if we could ask
               | an electronics recycler about the average age of the
               | devices they get by brand, we should get a clear idea of
               | which brands _actually_ last longer.
        
               | fomine3 wrote:
               | Apple doesn't offer cheaper laptops. No one doubts they
               | last longer on average.
        
           | partiallypro wrote:
           | Not to mention all the eWaste that comes with the AirPods.
        
             | CharlesW wrote:
             | Who's doing better to mitigate e-waste?
             | 
             | > _" AirPods are designed with numerous materials and
             | features to reduce their environmental impact, including
             | the 100 percent recycled rare earth elements used in all
             | magnets. The case also uses 100 percent recycled tin in the
             | solder of the main logic board, and 100 percent recycled
             | aluminum in the hinge. AirPods are also free of potentially
             | harmful substances such as mercury, BFRs, PVC, and
             | beryllium. For energy efficiency, AirPods meet US
             | Department of Energy requirements for battery charger
             | systems. Apple's Zero Waste program helps suppliers
             | eliminate waste sent to landfills, and all final assembly
             | supplier sites are transitioning to 100 percent renewable
             | energy for Apple production. In the packaging, 100 percent
             | of the virgin wood fiber comes from responsibly managed
             | forests."_
        
               | partiallypro wrote:
               | Weird that they leave out the parts about the battery and
               | casing waste; and that they're designed to last only 18
               | months on average, to force you to buy new ones.
               | 
               | https://www.vice.com/en/article/neaz3d/airpods-are-a-
               | tragedy
        
           | rdw wrote:
           | Just because you're not getting that panel back doesn't mean
           | it's destroyed and wasted. I figure that these policies
           | simplify their front-line technician jobs, getting faster
           | turnaround times and higher success rates. Then they have a
           | different department that sorts through all the
           | removed/broken parts, repairing and using parts from them. No
           | idea if this is what they actually do, but it would be the
           | smart way to handle it.
        
             | zelon88 wrote:
             | Apple does nothing to improve front line technician
             | procedures. They aren't even an engineering factor. If you
             | happen to be able to replace something on an Apple product,
             | it's only because the cost-benefit ratio wasn't in favor of
             | making that part hostile to work with.
             | 
             | Apple puts 56 screws in the Unibody MBP keyboards. They
             | were practically the pioneer of gluing components in
             | permanently. They don't care about technicians. Not even
             | their own. They have been one of the leaders of the anti-
             | right-to-repair movement from day one.
        
               | stirlo wrote:
               | > Apple does nothing to improve front line technician
               | procedures.
               | 
               | I'm not a fan of planned obsolescence and waste, but this
               | is clearly wrong. They've spent loads of engineering
               | effort designing a machine for their stores that can
               | replace, reseal, and test iPhone screens out back.
        
               | modulusshift wrote:
               | Oh hey, they went back to screws on the keyboards? That's
               | nice, they used to be single-use plastic rivets, so at
               | least you _can_ redo that.
               | 
               | Also Apple's glue isn't usually that bad to work with.
               | Doesn't leave much residue, so as long as you know where
               | to apply the heat you can do a clean repair and glue the
               | new component back in.
        
             | schleck8 wrote:
             | The reasoning was to make the device as thin as possible,
             | according to The Verge IIRC. The cable degrades because
             | it's too short and can't be replaced.
             | 
             | Says it all pretty much.
        
             | m4rtink wrote:
             | It seems like they would rather shred perfectly good parts
             | than let third-party repair shops use them to help people
             | at sane prices:
             | 
             | https://www.washingtonpost.com/technology/2020/10/07/apple-
             | g...
             | 
             | https://www.vice.com/en/article/yp73jw/apple-recycling-
             | iphon...
        
               | nojito wrote:
               | So a company stealing devices in order to sell them makes
               | Apple the bad guy?
        
           | kkjjkgjjgg wrote:
           | As long as they plant a tree every time they replace a panel,
           | it should be fine?
        
         | soheil wrote:
         | They would make less money if they made the chip repairable.
         | This doesn't have to make them evil. Apple being more
         | profitable also means they can lower the cost and push the
         | technological advancement envelope forward faster. Every year
         | we will get that much faster chips. This is good for everyone.
         | 
         | This doesn't mean Apple's carbon footprint has to suffer. If
         | Apple does a better job recycling old MacBooks than your
         | average repair guy who swaps an old CPU for a new one in a
         | repairable laptop, then Apple's carbon footprint could be
         | reduced. I remember the days when I would replace every
         | component in my desktop once a year, I barely thought about
         | recycling the old chips or even selling them to someone else.
         | They were simply too low value to an average person to bother
         | with recycling them properly or reselling them.
        
         | mlindner wrote:
         | This isn't relevant to the chips. Take this to the other
         | thread.
        
         | merpnderp wrote:
         | My family has 3 MBPs. 2 of them are 10 years old, 1 of them is
         | 8. When your laptops last that long, they're good for the
         | environment.
        
         | jjcm wrote:
         | > what they won't do is put the chip in an expandable and
         | repairable system
         | 
         | Because that degrades the performance overall. SoC has proven
         | itself to simply be more performant than a fully hotswappable
         | architecture. Look at the GPU improvements they're mentioning -
         | PCIe 5.0 (yet unreleased) maxes out at 128GB/s, whereas the SoC
         | Apple has announced today is transferring between the CPU/GPU
         | at 400GB/s.
         | 
         | In the end, performance will always trump interchangeability
         | for mobile devices.
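         | 
         | For a sense of where that 128GB/s figure comes from (my
         | arithmetic, assuming an x16 link counted in both directions):
         | 
         |     # PCIe 5.0: 32 GT/s per lane with 128b/130b encoding
         |     per_lane = 32e9 * (128 / 130) / 8  # ~3.9 GB/s, one way
         |     x16_one_way = per_lane * 16        # ~63 GB/s
         |     print(2 * x16_one_way / 1e9)       # ~126 GB/s bidirectional
         | 
         | versus the 400GB/s Apple quotes for on-package memory.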
        
         | sktrdie wrote:
         | Won't that make the chip bigger and/or slower? I think it's the
         | compactness, where the main components are so close together
         | and finely tuned, that makes the difference. Making it
         | composable probably means making it bigger (hence it won't fit
         | in as small spaces) and probably slower than it is. Just my two
         | cents though, I am not a chip designer.
        
           | Tuna-Fish wrote:
           | Just making the SSD (the only part that wears) replaceable
           | would greatly increase the lifespan of these systems, and
           | while supporting M.2 would take up more space in the chassis,
           | it would not meaningfully change performance or power.
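           | 
           | To put numbers on "wears" (all figures assumed, purely
           | illustrative endurance math, not any real drive's spec):
           | 
           |     # TBW ~= capacity * P/E cycles / write amplification
           |     capacity_tb = 1.0
           |     pe_cycles = 600      # typical TLC rating, assumed
           |     waf = 2.0            # write amplification, assumed
           |     tbw = capacity_tb * pe_cycles / waf   # ~300 TB of writes
           |     gb_per_day = 50                       # assumed workload
           |     print(tbw * 1000 / gb_per_day / 365)  # ~16 years
           | 
           | Plenty for most users, but a heavy writer can burn through a
           | soldered drive well before the rest of the machine is done.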
        
         | Grustaf wrote:
         | This only makes sense if you presume people throw away their
         | laptops when they replace them after "a few years". Given the
         | incredibly high second hand value of macbooks, I think most
         | people sell them or hand them down.
        
           | ChuckNorris89 wrote:
           | You're talking about selling working devices, but parent was
           | also talking about repairing them.
           | 
           | Seems like a huge waste to throw away a $2000+ machine out of
           | warranty because some $5 part on it dies, when Apple not only
           | doesn't provide a spare but actively fights anyone trying to
           | repair them, while the options they realistically give you
           | out of warranty are having your motherboard replaced for some
           | insane sum like $1,299 or buying a new laptop.
           | 
           | Or what if you're a klutz and spill your grape juice glass
           | over your keyboard? Congrats, now you're $2,000 lighter since
           | there's no way to take it apart and clean the sticky mess
           | inside.
        
             | jrockway wrote:
             | > Or what if you're a klutz and spill your grape juice
             | glass over your keyboard? Congrats, now you're $2,000
             | lighter since there's no way to take it apart and clean the
             | sticky mess inside.
             | 
             | Thanks to the Right To Repair, you can take the laptop to
             | pretty much any repair shop and they can replace anything
             | you damaged with OEM or third-party parts. They even have
             | schematics, so they can just desolder and resolder failed
             | chips. In the past, this sort of thing would be a logic
             | board swap for $1000 at the very least, but now it's just
             | $30 + labor.
             | 
             | Oh, there is no right to repair. So I guess give Apple
             | $2000 again and don't drink liquids at work.
        
               | amelius wrote:
               | > and don't drink liquids at work.
               | 
               | Which is ironic given that Apple laptops are often
               | depicted next to freshly brewed cafe lattes.
        
               | ChuckNorris89 wrote:
               | Not gonna lie, you had me in the first half.
        
             | threeseed wrote:
             | _Seems like a huge waste to throw away a $2000+ machine_
             | 
             | There are other options besides throwing it away.
             | 
             | You can (a) trade it in for a new Mac (I just received $430
             | for my 2014 MBP) or (b) sell it for parts on eBay.
             | 
             |  _Or what if you 're a klutz and spill your grape juice
             | glass over your keyboard? Congrats, now you're -$2000
             | lighter since there's no way to take it apart and clean the
             | sticky mess inside._
             | 
             | You can unscrew a Mac and clean it out. You can also take
             | it into Apple for repair.
        
           | the_jeremy wrote:
           | My F500 company has a 5 year tech refresh policy, and old
           | laptops are trashed, not donated or resold.
        
             | azinman2 wrote:
             | I cannot imagine that represents the majority of the
             | market.
        
             | [deleted]
        
             | saiya-jin wrote:
             | Exactly, corporate never risks them getting into the wrong
             | hands, no matter how theoretical that risk might be. Same
             | for phones.
        
               | lizknope wrote:
               | We used to remove the hard drives which takes about 20
               | seconds on a desktop that has a "tool-less" case. Then
               | donate the computer to anybody including the employees if
               | they want it.
               | 
               | It takes a few minutes to do that on a laptop but it's
               | not that long.
        
             | oneplane wrote:
             | Doesn't that mean that the problem is the policy of the
             | F500 company and not whatever the supplier had in mind?
        
       | tiffanyh wrote:
       | Are these M1 Pro/Max based on the A14 or A15?
       | 
       | Does "M1" == "A14" or does it mean "M1" == "5nm TSMC node"?
        
         | supermatt wrote:
         | The M1 is already very different from the A14/A15. Why do you
         | think they are "based" on the mobile SoCs?
        
           | tiffanyh wrote:
           | Because CPU teardowns show that "Apple's M1 system-on-chip
           | is an evolution of the A14 Bionic" [0]
           | 
           | [0] https://www.tomshardware.com/amp/news/apple-m1-vs-
           | apple-m14-...
        
         | Joeri wrote:
         | They share the core designs, Icestorm for the efficiency cores
         | and Firestorm for the performance cores, but these are
         | recombined into entirely different systems on chip. To say the
         | M1 Max is the same as the A14 is like saying a Xeon is the same
         | as an i3 because they both have Skylake-derived cores.
        
         | wmf wrote:
         | The differences between the A14 and A15 are so small it doesn't
         | matter. I suspect the CPU/GPU cores come from the A14 but the
         | ProRes accelerator comes from the A15.
        
           | GeekyBear wrote:
           | >The differences between the A14 and A15 are so small it
           | doesn't matter.
           | 
           | The testing shows increases in performance and power
           | efficiency.
           | 
           | >Apple A15 performance cores are extremely impressive here -
           | usually increases in performance always come with some sort
           | of deficit in efficiency, or at least flat efficiency. Apple
           | here instead has managed to reduce power whilst increasing
           | performance, meaning energy efficiency is improved by 17% on
           | the peak performance states versus the A14. If we had been
           | able to measure both SoCs at the same performance level, this
           | efficiency advantage of the A15 would grow even larger. In
           | our initial coverage of Apple's announcement, we theorised
           | that the company might possibly have invested in energy
           | efficiency rather than performance increases this year, and
           | I'm glad to see that seemingly this is exactly what has
           | happened, explaining some of the more conservative (at least
           | for Apple) performance improvements.
           | 
           | On an adjacent note, with a score of 7.28 in the integer
           | suite, Apple's A15 P-core is on equal footing with AMD's
           | Zen3-based Ryzen 5950X with a score of 7.29, and ahead of M1
           | with a score of 6.66.
           | 
           | https://www.anandtech.com/show/16983/the-apple-a15-soc-
           | perfo...
           | 
           | Having your phone chip be on par with single core Zen 3
           | performance is pretty impressive.
        
       | sunaurus wrote:
       | I would like to upgrade to Apple silicon, but I have no need for
       | a laptop. I hope they put the M1 Max in a Mac Mini soon.
        
         | billyhoffman wrote:
         | Seems reasonable. They still sell the Intel Mac mini, despite
         | having an M1-powered Mac mini already. The Intel one uses the
         | old "black aluminum means pro" design language of the iMac Pro.
         | Feels like they are keeping it as a placeholder in their lineup
         | and will end up with two Apple silicon-powered Mac minis, one
         | consumer and one pro.
        
           | deltron3030 wrote:
           | I doubt that we'll see a really powerful Mac Mini anytime
           | soon. Why? Because it would cannibalize the MacBook Pro when
           | combined with an iPad (sidecar).
           | 
           | Most professionals needing pro laptops use the portability to
           | move between studio environments or sets (e.g home and work).
           | The Mini is still portable enough to be carried in a
           | backpack, and the iPad can do enough on its own to be viable
           | for lighter coffee-shop work.
           | 
           | Not many would do high-end production work outside a studio
           | or set without additional peripherals, meaning that the high-
           | end performance of the new MBP isn't needed for very mobile
           | situations.
           | 
           | A powerful Mini and an iPad would therefore be the much
           | better logical choice vs. a high-end MacBook Pro. Wherever
           | you need the power, there's most likely a power outlet and
           | room for a Mini.
        
         | willis936 wrote:
         | There were rumors that it was supposed to be today. Given that
         | it wasn't, I now expect it to be quite a while before they do
         | so. I was really looking forward to it.
        
       | franciscop wrote:
       | > M1 Pro and M1 Max also feature enhanced media engines with
       | dedicated ProRes accelerators specifically for pro video
       | processing
       | 
       | Do we know if this includes hardware encoding/decoding of AV1?
       | I've found it to be quite lackluster on my M1, and would love to
       | jump formats.
        
       | ngngngng wrote:
       | The benchmark to power consumption comparisons were very
       | interesting. It seemed very un-Apple to be making such direct
       | comparisons to competitors, especially when the Razer Blade
       | Advanced had slightly better performance with far higher power
       | consumption. I feel like typically Apple just says "Fastest we've
       | ever made, it's so thin, so many nits, you'll love it" and leaves
       | it at that.
       | 
       | I'll be very curious to see those comparisons picked apart when
       | people get their hands on these, and I think it's time for me to
       | give Macbooks another chance after switching exclusively to linux
       | for the past couple years.
        
         | zepto wrote:
         | They are selling these to people who know what the competition
         | is, and care.
        
         | kergonath wrote:
         | FWIW, they are in general quite accurate with their ballpark
         | performance figures. I expect the actual power/performance
         | curves to be similar to what they showed. Which is interesting,
         | because IIRC the plots Nuvia published before they were bought
         | showed a similar profile for their cores. It would be exciting
         | if Qualcomm could have something good for a change.
        
           | zaptrem wrote:
           | If we can get an actual Windows on ARM ecosystem started
           | things will get really exciting really quickly.
        
         | snowwrestler wrote:
         | Apple used to do these performance comparisons a lot when they
         | were on the PowerPC architecture. Essentially they tried to
         | show that PowerPC-based Macs were faster (or as fast as) Intel-
         | based PCs for the stuff that users wanted to do, like web
         | browsing, Photoshop, movie editing, etc.
         | 
         | This kind of fell by the wayside after switching to Intel, for
         | obvious reasons: the chips weren't differentiators anymore.
        
         | _the_inflator wrote:
         | I think that Apple took a subtle, or not so subtle, stand:
         | power consumption has to do with environmental impact.
        
           | jeswin wrote:
           | Apple almost single-handedly made computing devices non-
           | repairable and non-upgradable, across their own product line
           | and the industry in general, due to their outsized influence.
        
             | rubyist5eva wrote:
             | It's almost like it's just about marketing and not much
             | else...
        
             | GeekyBear wrote:
             | Single handedly?
             | 
             | >According to iFixit, the Surface Laptop isn't repairable
             | at all. In fact, it got a 0 out of 10 for repairability and
             | was labeled a "glue-filled monstrosity."
             | 
             | The lowest scores previously were a 1 out of 10 for all
             | previous iterations of the Surface Pro
             | 
             | https://www.extremetech.com/computing/251046-ifixit-
             | labels-s...
        
               | davrosthedalek wrote:
               | And they get away with it because Apple normalized it.
        
               | hh3k0 wrote:
               | One might argue that Surface laptops were Microsoft's
               | answer to MacBooks.
        
               | pb7 wrote:
               | If repairability was important to consumers, it would be
               | a selling point for competitors. But it's not.
        
               | paavohtl wrote:
               | If Apple actually cared about sustainability, they would
               | make their devices repairable.
        
               | pb7 wrote:
               | They are repairable, but not by consumers in most cases.
        
               | watermelon0 wrote:
               | They are mostly not repairable even by authorized repair
               | providers.
               | 
               | Basically, they can only change a few components
               | (keyboard, display (with assembly), motherboard, and
               | probably the aluminium case), but that's it.
        
               | GeekyBear wrote:
               | You literally cannot replace the battery in that Surface
               | Laptop without destroying the whole thing.
               | 
               | It's made to be thrown away, instead of repaired.
        
               | pb7 wrote:
               | Conversation is about Apple.
        
             | mrtksn wrote:
             | Just today I got one 6s and one iPhone 7 screen repaired
             | (the 6s got the glass replaced, the 7 got the full assembly
             | replaced) and the battery of the 6s replaced at a shop that
             | is not authorized by Apple. It cost me $110 in total.
             | 
             | Previously I got 2017 Macbook Air SSD upgraded using an SSD
             | and an adapter that I ordered from Amazon.
             | 
             | What's that narrative that Apple devices are not upgradable
             | or repairable?
             | 
             | It's simply not true. If anything, Apple devices are the
             | easiest to get serviced since there are not many models and
             | pretty much all repair shops can deal with all devices that
             | are still usable. Because of this, even broken Apple
             | devices are sold and bought all the time.
        
               | deltaci wrote:
               | Newer MacBooks have both the SSD and RAM soldered on the
               | board; they're no longer user upgradable unless you have
               | a BGA rework station and know how to operate it.
        
               | JohnTHaller wrote:
               | In modern Apple laptops (2018 and later), the storage is
               | soldered as the memory has been since 2015. Contrast this
               | with a Dell XPS 15 you can buy today within which you can
               | upgrade/replace both the memory and the storage. This is
               | the case with most Windows laptops. The exception is
               | usually the super thin ones that solder in RAM Apple-
               | style, but there are some others that do as well.
               | 
               | There's also the fact that Apple does things like
               | integrate the display connector into the panel part. So,
               | if it fails - like when Apple made it too short with the
               | 2016 and 2017 Macbook Pros causing the flexgate
               | controversy - it requires replacing a $600 part instead
               | of a $6 one.
        
               | ChuckNorris89 wrote:
               | _> Just today I got one 6s and one iPhone 7 screen
               | repaired_
               | 
               | Nice, except doing a screen replacement on a modern
               | iPhone like the 13 series will disable your Face ID,
               | making your iPhone pretty much worthless.
               | 
               |  _> Previously I got 2017 Macbook Air SSD upgraded using
               | an SSD and an adapter that I ordered from Amazon_
               | 
               | Nice, but on the modern Macbooks, the SSD is soldered and
               | not replaceable. There is no way to upgrade them or
               | replace them if they break, so you just have to throw
               | away the whole laptop.
               | 
               | So yea, parent was right: Apple devices are the worst for
               | repairability, period. The ones you're talking about are
               | not manufactured anymore and therefore don't represent
               | the current state of affairs, and the ones that are
               | manufactured today are built to not be repaired.
        
               | jolux wrote:
               | > Nice, but on the modern Macbooks, the SSD is soldered
               | and not replaceable. There is no way to upgrade them or
               | replace them if they break, so you just have to throw
               | away the whole laptop.
               | 
               | I mean, you can replace the logic board. Wasteful, sure,
               | but there's no need to throw out the whole thing.
        
               | mrtksn wrote:
               | People also replace ICs all the time. Heat it, remove the
               | broken SSD chip, put the new one in, re-heat.
        
               | mrtksn wrote:
               | Hardware people are crafty, they find ways to transfer
               | and combine working parts. The glass replacement (keeping
               | the original LCD) I got for the 6s is not a procedure
               | provided by Apple. Guess who doesn't care? The repair
               | shop that bought a machine from China for separating and
               | re-assembling the glass and LCD.
               | 
               | Screen replacement is $50, glass replacement is $30.
               | 
               | The iPhone 13 is very new; give it a few years and the
               | hardware people will leverage the desire of not spending
               | $1,000 on a new phone when the current one works fine
               | except for that one broken part.
        
               | ChuckNorris89 wrote:
               | And how will the crafty HW people replace the SSD storage
               | on my 2020 MacBook if it bites the dust?
        
               | mrtksn wrote:
               | By changing chips. There are already procedures for fun
               | stuff like upgrading the RAM on the non-retina MacBook
               | Airs to 16GB. Apple never offered a 16GB version of that
               | laptop, but you can have it[0].
               | 
               | If there's demand, there will be a response.
               | 
               | [0] https://www.youtube.com/watch?v=RgEfMzMxX5E
        
               | ChuckNorris89 wrote:
               | You clearly don't have a clue how modern Apple HW is
               | built and why the stuff you're describing on old Apple
               | HW just won't work anymore on the machines built today.
               | 
               | I'm talking about 2020 devices where you can't just
               | "change the chips" and hope it works like in the 2015
               | model from the video you posted.
               | 
               | Modern Apple devices aren't repairable anymore.
        
               | mrtksn wrote:
               | I would love to be enlightened about the new physics that
               | Apple is using which is out of reach of the other
               | engineers.
               | 
               | /s
               | 
               | Anyway, people are crafty and engineering is not an
               | Apple-exclusive trade. Believe it or not, Apple can't do
               | anything about the laws of physics.
        
               | ChuckNorris89 wrote:
               | _> I would love to be enlightened about the new physics
               | that Apple is using but is out of reach for the other
               | engineers._
               | 
               | Watch Louis Rossmann on YouTube.
        
               | mrtksn wrote:
               | I know Louis, he made a career of complaining that it's
               | impossible to repair Apple devices while repairing Apple
               | devices.
               | 
               | Instead of watching videos and getting angry about Apple
               | devices being impossible to repair, I get my Apple
               | devices repaired when something breaks. Significantly
               | more productive approach, you should try it.
        
               | [deleted]
        
               | ChuckNorris89 wrote:
               | _> I get my Apple devices repaired when something breaks_
               | 
               | Your _old_ Apple devices, which are known to be very easy
               | to repair. You wouldn't be so confident with the latest
               | gear.
               | 
               | But why spoil it for you? Let's talk in a few years when
               | you find it out the hard way.
        
               | mrtksn wrote:
               | Louis has been making "Apple impossible to repair" videos
               | forever. It's not an iPhone 13 thing; give it a few years
               | and you can claim the iPhone 17 is impossible to repair,
               | unlike the prehistoric iPhone 13.
               | 
               | Here is a video from 2013 of him complaining that Apple
               | doesn't let people repair their products:
               | https://www.youtube.com/watch?v=UdlZ1HgFvxI
               | 
               | He recently moved to a new, larger shop in an attempt to
               | grow his Apple repair operations. Then he had to move
               | back to a smaller shop because, as it turns out, it
               | wasn't Apple who was ruining his repair business.
        
               | kuschku wrote:
               | > I would love to be enlightened about the new physics
               | that Apple is using which is out of reach to the other
               | engineers.
               | 
               | That's known as public-private key crypto with keys burnt
               | into efuses on-die on the SoC.
               | 
               | You can't get around that (except for that one dude in
               | Shenzhen who just drills into the SoC and solders wires
               | by hand which happen to hit the right spots). But
               | generally, no regular third party repair shop will find a
               | way around this.
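               | 
               | A toy sketch of the pairing idea (illustrative only: a
               | shared secret stands in for the real asymmetric scheme,
               | and none of these names are Apple's):
               | 
               |     import hashlib, hmac, os
               | 
               |     # imagine this burnt into efuses at the factory
               |     FUSED_KEY = os.urandom(32)
               | 
               |     def part_respond(challenge, part_key):
               |         return hmac.new(part_key, challenge,
               |                         hashlib.sha256).digest()
               | 
               |     def board_accepts(part_key):
               |         challenge = os.urandom(16)
               |         expected = hmac.new(FUSED_KEY, challenge,
               |                             hashlib.sha256).digest()
               |         return hmac.compare_digest(
               |             expected, part_respond(challenge, part_key))
               | 
               |     print(board_accepts(FUSED_KEY))       # True: paired
               |     print(board_accepts(os.urandom(32)))  # False: swapped
               | 
               | The key never leaves the silicon, which is why a repair
               | shop can't simply copy it over to a replacement part.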
        
               | mrtksn wrote:
               | I know about it; it simply means that someone will build
               | a device that automates the thing the dude in Shenzhen
               | does, or they will mix and match devices that have
               | different kinds of damage. I.e., a phone that has a
               | destroyed screen (irreparable) will donate its parts to
               | phones that have a broken Face ID lens.
               | 
               | You know, these encryption authentications work between
               | ICs and not between lenses and motors. Keep the coded IC,
               | change the coil. Things also have different failure
               | modes; for example, a screen might break down due to
               | glass failure (which cannot be coded) and the repair shop
               | can replace the broken assembly part while keeping the IC
               | that ensures the communication with the mainboard. Too
               | complicated for a street shop? Someone will build a
               | service that does it B2B: shops will ship it to them,
               | they will ship it back, leaving only the installation to
               | the street shop.
               | 
               | Possibilities are endless. Some easier, some harder, but
               | we are talking about talent that makes all kinds of
               | replicas of all kinds of devices. With billions of
               | iPhones out there, it's actually a very lucrative market
               | to be able to salvage a $1,000 device; their margins
               | could be even better than Apple's margins when it charges
               | $100 to change the glass of the LCD assembly.
        
               | iSnow wrote:
               | Apple is using SoCs now where the CPU and RAM are one
               | chip package. How are you going to upgrade the RAM here,
               | even with the mother of all reflowing stations?
        
               | mrtksn wrote:
               | You don't. It's technological progress, similar to how we
               | lost the ability to repair individual transistors with
               | the introduction of integrated chips. If this doesn't
               | work for you, you should stick with the old tech; I think
               | the Russians did something like that in their Soviet-era
               | avionics. There are also audiophiles who never switched
               | to transistors and still use vacuum tubes, and the Amish,
               | who stick to horses and candles and choose to preserve
               | their way of doing things and avoid the problems of
               | electricity and powered machinery.
               | 
               | You need to make a choice sometimes. Often you can't have
               | small, efficient, and repairable all at once.
        
               | time0ut wrote:
               | Only if Apple wants to let them as far as I have seen.
               | The software won't even let you swap screens between
               | iPhone 13s. Maybe people will find a work around, but it
               | seems like Apple is trying its hardest to prevent it.
        
               | threeseed wrote:
               | _Nice, except doing a screen replacement on a modern
               | iPhone like the 13 series will disable your FaceID making
               | your iPhone pretty much worthless._
               | 
               | Only if you go to someone who isn't an authorised Apple
               | repairer.
        
               | billyhoffman wrote:
               | True, but you are talking about devices that are 4-6
               | years old. Storage is now soldered. Ram has been soldered
               | for a while now, and with Apple Silicon its part of the
               | SoC.
        
               | lotsofpulp wrote:
               | Perhaps leading to fewer failures and longer device
               | lifespans.
               | 
               | As far as I understand, the fewer the components and the
               | less the heat, the longer the electronics keep working.
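               | 
               | A crude way to see it (failure rates entirely made up,
               | just to show the shape of the argument): if a machine
               | fails when any one part fails, reliability multiplies.
               | 
               |     # assumed annual failure rates per part/connector
               |     def survival(rates):
               |         r = 1.0
               |         for f in rates:
               |             r *= (1 - f)
               |         return r
               | 
               |     sockets_and_modules = [0.02] * 8 + [0.01] * 2
               |     soldered_equivalent = [0.01] * 4
               |     print(survival(sockets_and_modules))  # ~0.83/year
               |     print(survival(soldered_equivalent))  # ~0.96/year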
        
               | JohnTHaller wrote:
               | For context, Apple started soldering RAM in 2015 and
               | soldering storage in 2018.
        
             | masklinn wrote:
             | Weirdly these machines have a "6.1 repairability rating"
             | when you go in their store. I wonder what ifixit will think
             | of them.
        
             | [deleted]
        
         | ssijak wrote:
         | Yeah, this is the first time they actually compared themselves
         | to something with possibly better performance.
        
         | mdasen wrote:
         | I think that for the first time, Apple has a real performance
         | differentiator in its laptops. They want to highlight that.
         | 
         | If Apple is buying Intel CPUs, there's no reason to make direct
         | performance comparisons to competitors. They're all building
         | out of the same parts bin. They would want to talk about the
         | form factor and the display - areas where they could often out-
         | do competitors. Now there's actually something to talk about
         | with the CPU/GPU/hardware-performance.
         | 
         | I think Apple is also making the comparison to push something
         | else: performance + lifestyle. For me, the implication is that
         | I can buy an Intel laptop that's nicely portable, but a lot
         | slower; I could also buy an Intel laptop that's just as fast,
         | but requires two power adapters to satisfy its massive power
         | drain and really doesn't work as a laptop at all. Or I can buy
         | a MacBook Pro which has the power of the heavy, non-portable
         | Intel laptops while sipping less power than the nicely portable
         | ones. I don't have to make a trade-off between performance and
         | portability.
         | 
         | I think people picked apart the comparisons on the M1 and were
         | pretty satisfied. 6-8 M1 performance cores will offer a nice
         | performance boost over 4 M1 performance cores and we basically
         | know how those cores benchmark already.
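         | 
         | A quick Amdahl's-law sketch of what 8 vs. 4 performance cores
         | might buy (the 90% parallel fraction is my assumption):
         | 
         |     def speedup(p, n):
         |         return 1 / ((1 - p) + p / n)
         | 
         |     p = 0.9   # assumed parallel fraction of the workload
         |     print(speedup(p, 8) / speedup(p, 4))   # ~1.5x, not 2x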
         | 
         | I'd also note that there are efforts to get Linux on Apple
         | Silicon.
        
           | ngngngng wrote:
           | I was casually aware of Asahi before this announcement. Now
           | I'm paying close attention to its development.
        
         | xur17 wrote:
         | > I'll be very curious to see those comparisons picked apart
         | when people get their hands on these, and I think it's time for
         | me to give Macbooks another chance after switching exclusively
         | to linux for the past couple years.
         | 
         | I really enjoy linux as a development environment, but this is
         | going to be VERY difficult to compete with..
        
           | modulusshift wrote:
           | Asahi Linux is making great strides in supporting M1 Macs,
           | and they're upstreaming everything so your preferred distro
           | could even support them.
           | 
           | https://asahilinux.org/
        
           | frogblast wrote:
           | You can always run Linux in a VM too
        
             | xur17 wrote:
             | That's just not the same.
        
         | vmception wrote:
         | I'm not going to wait for the comparisons this time. Maxing
         | this baby out right now.
        
           | BitAstronaut wrote:
           | Honest question, what do you do where a $6,099 laptop is
           | justifiable?
        
             | IggleSniggle wrote:
             | I mean, the Audi R8 has an MSRP > $140k and I've never been
             | able to figure out how that is justifiable. So I guess
             | dropping $6k on a laptop could be "justified" by _not_
             | spending an extra $100k on a traveling machine?
             | 
             | To be clear, I'm not getting one of these, but there are
             | clearly people who will drop extra thousands into a
             | "performance machine" just because they like performance
             | machines and they can do it. It doesn't really need to be
             | justified.
             | 
             | Truthfully, I'm struggling to imagine the scenario where a
             | "performance laptop" is justifiable to produce, in the
             | sense you mean it. Surely, in most cases, a clunky desktop
             | is sufficient and reasonably shipped when traveling, and
             | can provide the required performance in 99% of actual high-
             | performance-needed scenarios.
             | 
             | If I had money to burn, though, I'd definitely be buying a
             | luxury performance laptop before I'd be buying an update to
             | my jalopy. I use my car as little as I possibly can. I use
             | my computer almost all the time.
        
             | vmception wrote:
             | I skip getting a Starbucks latte, and avoid adding extra
             | guac at Chipotle.
             | 
             | I'm kidding, that stuff has no effect on anything.
             | 
             | Justifiable, as in "does this make practical sense", is not
             | the word, because it doesn't. Justifiable, as in, "does it
             | fit within my budget?" yes that's accurate. I don't have a
             | short answer to why my personal budget is that flexible,
             | but I do remember there was a point in my life where I
             | would ask the same thing as you about other people. The
             | reality is that you either have it or you don't. That being
             | said, nothing I _had been_ doing for money is really going
             | to max this kind of machine out or improve my craft. But
             | things that used to be computationally expensive won't be
             | anymore. Large catalogues of 24 megapixel RAWs used to be
             | computationally expensive. Now I won't even notice, even
             | with larger files and larger videos, and can expand what I
             | do there along with video processing, which is all just
             | entertainment. But I can also do that while running a bunch
             | of docker containers and VMs... within VMs, and not think
             | about it.
             | 
             | This machine, for me, is the catalyst for greater
             | consumptive spending though. I've held off on new cameras,
             | new NASs, new local area networking, because my current
             | laptop and devices would chug under larger files.
             | 
             | Hope there was something to glean from that context. But
             | all I can really offer is "make, or simply have, more
             | money", not really profound.
        
               | BitAstronaut wrote:
               | Thank you for a very honest and thorough answer.
        
               | ghaff wrote:
               | There's also future-proofing to some degree. I'll
               | probably get a somewhat more loaded laptop than I "need"
               | (though nowhere near $6K) because I'll end up kicking
               | myself if 4 years from now I'm running up against some
               | limit I underspecced.
        
             | mixmastamyk wrote:
             | Not your father's currency. If you think of them as pesos,
             | the price is easier to comprehend.
             | 
             | Not to mention if it makes a $200k-salary worker 5% more
             | productive, it's a win. (Give or take for taxes.)
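             | 
             | The arithmetic behind that claim, with illustrative
             | numbers:
             | 
             |     salary = 200_000
             |     gain = 0.05        # assumed productivity boost
             |     laptop = 6_099
             |     print(salary * gain)             # $10,000/yr of output
             |     print(laptop / (salary * gain))  # pays back in ~0.6 yr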
        
               | dymk wrote:
               | It's a win for a worker who's compensated based on their
               | work output, which is pretty much the opposite of what a
               | salaried worker is.
        
               | mixmastamyk wrote:
               | Productivity is productivity, doesn't matter how one is
               | paid.
        
               | dymk wrote:
               | ...then why mention a salary at all?
        
             | eli wrote:
             | A few thousand dollars per year (presumably it will last
             | more than one year) is really not much for the most
             | important piece of equipment a knowledge worker will be
             | using.
        
               | speedgoose wrote:
               | It's still a waste if you don't need it though. This
               | money could be spent on much more useful things.
        
               | pb7 wrote:
               | Like what?
        
               | ngngngng wrote:
               | Guac at Chipotle
        
               | threeseed wrote:
               | If it improves compilation speeds by 1% then it's not a
               | waste.
               | 
               | My time is worth so much more to me than money.
        
               | postalrat wrote:
               | Then why are you using a laptop?
        
               | threeseed wrote:
               | Why even bother with such an inane answer?
               | 
               | It's because I need to use my computer whilst not
               | physically attached to the same spot i.e. between
               | work/home, travel.
               | 
               | You know the same reason as almost everyone else.
        
             | syspec wrote:
             | If you don't max out the SSD capacity, but max out all the
             | options which affect performance, it's only about half
             | that.
        
               | stouset wrote:
               | IIRC bigger SSDs in previous generations had higher
               | performance.
        
               | wtallis wrote:
               | That's fundamental to how NAND flash memory works. For
               | high-end PCIe Gen4 SSD product lines, the 1TB models are
               | usually not quite as fast as the 2TB models, and 512GB
               | models can barely use the extra bandwidth over PCIe Gen3.
               | But 2TB is usually enough to saturate the SSD controller
               | or host interface when using PCIe Gen4 and TLC NAND.
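               | 
               | Roughly, sequential throughput scales with the number of
               | NAND dies until the controller or host link caps it (the
               | per-die figure below is illustrative, not a real part's
               | spec):
               | 
               |     per_die_mb = 450    # assumed per-die write rate
               |     host_cap_mb = 7000  # ~PCIe Gen4 x4 ceiling
               |     for gb, dies in [(512, 4), (1024, 8), (2048, 16)]:
               |         print(gb, "GB:",
               |               min(dies * per_die_mb, host_cap_mb), "MB/s")
               | 
               | which is why around the 2TB tier the interface, not the
               | flash, becomes the bottleneck.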
        
             | eppsilon wrote:
             | I assume "vmception" requires a lot of power...
        
             | artfulhippo wrote:
             | Dude this is Hacker News. I'm surprised when I meet an
             | engineer who doesn't have a maxed out laptop.
        
       | dcdevito wrote:
       | No M1 Pro/Max Mini was announced, and my conspiracy theory on
       | that is: they couldn't justify a high price tag for it and
       | instead are pushing pro users to the MacBook Pro and eventually
       | the Mac Pro and maybe an iMac Pro. If they did one, an M1 Pro
       | Mini would cannibalize these MacBook Pros.
       | 
       | I'm glad I pulled the trigger two weeks ago; I received my M1
       | Mini last week. I am relieved and will continue to enjoy it.
        
       | [deleted]
        
       | systemvoltage wrote:
       | Does anyone want to guess their packaging arch? Is it CoWoS? How
       | are they connecting memory with 200GB/s bandwidth? Interposer?
        
       ___________________________________________________________________
       (page generated 2021-10-18 23:00 UTC)