[HN Gopher] Dissecting the Apple M1 GPU, Part II
___________________________________________________________________
Dissecting the Apple M1 GPU, Part II
Author : dddddaviddddd
Score : 360 points
Date : 2021-01-22 17:34 UTC (5 hours ago)
(HTM) web link (rosenzweig.io)
(TXT) w3m dump (rosenzweig.io)
| captain_price7 wrote:
| I know enough about GPUs to know how interesting this is, but not
| enough to fully get it.
|
| I wish there were some "Dissecting the Apple M1 GPU _for Dummies_".
| war1025 wrote:
| I imagine the "for graphics engineers" version has to get
| figured out before they can do the "for Dummies" version.
| supernova87a wrote:
| [*Sure I get it, it's not interesting to talk meta about whether
| a post is interesting. I'm already downvoted off the planet.
| Fine.]
|
| I'm sure it's a good blog post for the right reader, but for
| something to land on HN front page, I generally look for either:
|
| 1) a story that, while specialized, reveals something accessibly
| technical and interesting to a semi-expert crowd, or
|
| 2) a story that explains something at a sufficient level of
| expertise/detail that people not in the middle of it can grasp it
| and apply it to their own knowledge.
|
| This story seems inaccessible to the general reader. Or it
| reads like someone's private notes. I think it needs a level of
| abstraction or summary to be accessible to the readership here.
| faitswulff wrote:
| ...well, this is _Hacker_ News, after all.
| perardi wrote:
| You're complaining this is too technical...on a site called
| "Hacker News"...which is run by a firm named after an abstract
| mathematical concept.
|
| OK, kiddo. Upworthy is over that-away.
| epistasis wrote:
| I've never worked on GPUs, will likely never work on drivers
| even, yet was able to follow along and find it interesting.
|
| I'm going to upvote pretty much any reverse engineering
| monologue, especially one as accessible as this.
| NikolaeVarius wrote:
| Then hide it and don't read it
| jandrese wrote:
| Are you saying this post is too nerdy for _Hacker News_?
| Stories like this come along all the time; if you don't like
| it, don't read it.
|
| I didn't understand everything the author said, but I was able
| to get the gist of it. The GPU has a list of valid addresses,
| and it's stored in a C-style array with a terminator instead of
| a 1-indexed array with an object count in the Apple API style.
| The mismatch caused the author some confusion even after he
| discovered the existence of the buffer due to his code failing.
|
| It's one of the many details that has to be understood in order
| to write an open source driver for the hardware. The open
| source driver will be necessary for people who want to install
| Linux on this Apple hardware.
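|
| To make the two conventions concrete, here is a minimal C
| sketch (hypothetical handle values, not the actual driver
| structures):
|
|     #include <stdio.h>
|     #include <stddef.h>
|     #include <stdint.h>
|
|     int main(void) {
|         /* C-style list: walk entries until the zero terminator */
|         uint64_t terminated[] = { 0xAB01, 0xAB02, 0xAB03, 0 };
|         for (size_t i = 0; terminated[i] != 0; i++)
|             printf("handle 0x%llx\n",
|                    (unsigned long long)terminated[i]);
|
|         /* Apple-API style: slot 0 holds the count and the
|            entries are 1-indexed */
|         uint64_t counted[] = { 3, 0xAB01, 0xAB02, 0xAB03 };
|         for (size_t i = 1; i <= counted[0]; i++)
|             printf("handle 0x%llx\n",
|                    (unsigned long long)counted[i]);
|         return 0;
|     }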
| asddubs wrote:
| small correction: the author is a woman
| bzbarsky wrote:
| > even after he discovered
|
| She, fwiw.
| jandrese wrote:
| Sorry. I completely forgot to check the byline.
| cbozeman wrote:
| Yeah, well, you know, that's just like... your _opinion_, man.
| __mp wrote:
| The published code on GitHub looks very nice! It is easy to read
| and understand.
|
| Kudos to Alyssa!
| rowanG077 wrote:
| Amazing!! This is progressing literally 10 times faster than I
| expected.
| Someone wrote:
| Indeed. That single triangle likely is way closer to a million
| triangles than to zero triangles.
| https://rampantgames.com/blog/?p=7745:
|
| _"Afterwards, we came to refer to certain types of
| accomplishments as "black triangles." These are important
| accomplishments that take a lot of effort to achieve, but upon
| completion you don't have much to show for it, only that more
| work can now proceed. It takes someone who really knows the
| guts of what you are doing to appreciate a black triangle."_
| jdashg wrote:
| Roughly equidistant milestones for ports:
| 1. It crashes
| 2. Black screen
| 3. Any drawing at all
| 4. The rest of the friggin' engine
| gjsman-1000 wrote:
| Everyone, remember that in 2019 Alyssa was in _High School_.
| Where do people get this knowledge? It's astounding.
|
| She was literally reverse-engineering Mali for Panfrost starting
| in her Sophomore year.
| localhost wrote:
| As one of the older folks here on HN, I fondly remember the
| time when almost everything had to be done by reverse-
| engineering. Some friends of mine and I were the first people
| to reverse engineer the Commodore 4040/8050 disk drive
| "computers". We had to write all of the software to do this,
| including custom disassemblers that eventually spat out
| printouts that we annotated first by hand and later transcribed
| into source code files. There were no memory maps, and our
| knowledge of electronics was too rudimentary to figure out much
| by looking at the traces on the circuit boards. There were no
| places for folks to discuss these ideas - bulletin boards of
| the day were one-user-at-a-time things which meant that topics
| like this were way too niche to garner any discussion at all. I
| was fortunate enough to grow up in Toronto, which was a hotbed
| of Commodore hackers and we had a vibrant community that could
| gather to meet face-to-face to discuss ideas. But in between
| meetups we had to sit and think harder until we figured it out.
|
| It was a wonderful time to learn about computing from first
| principles without the distractions that exist today.
| floatingatoll wrote:
| When I was 5, my grandmother, the edusoft programmer, let me
| learn machine-code programming from one of her books.
| PragmaticPulp wrote:
| Truly impressive work.
|
| > Where do people get this knowledge?
|
| Time, hands-on experimentation, and focus.
|
| Anecdotally, reverse engineering and low-level hacking felt
| more popular back in the 90s and early 2000s. Back then, there
| were fewer distractions to soak up the free time of young tech
| enthusiasts. Old IRC chatrooms feel like a trickle of
| distraction relative to the firehose of Twitter, Reddit, 24/7
| news cycles, and other modern distractions.
|
| A common complaint among the younger engineers I mentor is that
| they feel like they never have enough time. It's strange to
| watch a young, single person without kids and with a cushy 9-5
| (or often 11am to 5pm at modern flex/remote companies) job
| complain about lack of free time. Digging deeper, it's usually
| because they're so tapped into social media, news, video
| games, and other digital distractions that their free time
| evaporates before they even think about planning their days
| out.
|
| It's amazing what one can accomplish by simply sitting down and
| focusing on a problem. You may not reach the levels of someone
| like Alyssa, but you'll get much farther than you might expect.
| And most importantly, you probably won't miss the media
| firehose.
| conradev wrote:
| There is a world of difference in free time between "having a
| 9-5 job" and "being in high school", though.
|
| In high school, I was lucky enough to not have to do laundry,
| feed myself, or a variety of other tasks and chores that come
| along with adulthood. I also could sleep like crap and make
| it up during school the next day. Not to mention that the
| time I spent at school wasn't spent programming, so when I
| got home it wasn't more of the same, it was exciting. I could
| sit at the computer programming from 3pm to 2am.
| salawat wrote:
| The issue you describe, while intuitive and obvious to those
| of us who were around for the transition period, is not so
| simple for others to arrive at. Keep in mind that youngsters
| nowadays may not have much if any experience with the before
| world, or know someone reflective enough and willing to point
| it out to them. Many just assume we all mastered it somehow.
|
| There is also the possibility that _they might be right._
| Before, communication and the long lag inherent in accessing
| data on remote systems meant change propagated by trickle.
| Now, change can happen and propagate across the world at
| breakneck speed, which means that if one is to have an effect,
| one must be there and be aware.
|
| This leaves precious little time for following the white
| rabbit. I think everybody could do with a little somber
| reflection on just the impact that rapid information
| propagation has had on the world.
| PurpleFoxy wrote:
| For me, I have more than enough time, especially when working
| from home. It's just that after work I feel like a brain-dead
| zombie and programming is the last thing I want to do.
|
| In the time before I started work and after finishing school
| I used to do so much open source stuff in the time that I now
| spend working.
| aardvark179 wrote:
| It's easy to think this sort of thing requires an incredible
| level of knowledge, but it's often more about dedication and a
| little lateral thinking. You need a broad understanding of the
| process of putting something on screen, the ability to
| construct a program that does something, some way to observe
| the result, and the willingness to experiment.
|
| Back in the 80s lots of kids that age wrote computer games and
| often found novel ways to use the hardware because even though
| video controllers might not be fully documented, you still knew
| where their control registers were in memory and could just try
| altering the values in them.
|
| Now, modern graphics hardware is more complex and controlled by
| things like command buffers and shaders, but you can apply
| similar ideas, and you've probably got easier tools for
| examining the state of your process. Write a program that does
| a simple thing and dump its state. Do a slightly different
| thing and compare the states. When you think you understand
| things either change your program and see if you're correct or
| use a debugger to alter the state at runtime and see if you can
| change the visible result.
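|
| As a minimal sketch of the compare-the-dumps step (hypothetical
| file names; any method of capturing the two states will do):
|
|     #include <stdio.h>
|
|     int main(int argc, char **argv) {
|         if (argc != 3) {
|             fprintf(stderr, "usage: %s dump1 dump2\n", argv[0]);
|             return 1;
|         }
|         FILE *a = fopen(argv[1], "rb");
|         FILE *b = fopen(argv[2], "rb");
|         if (!a || !b) { perror("fopen"); return 1; }
|
|         /* Print every byte offset where the two dumps differ */
|         long off = 0;
|         int ca, cb;
|         while ((ca = fgetc(a)) != EOF && (cb = fgetc(b)) != EOF) {
|             if (ca != cb)
|                 printf("0x%08lx: %02x -> %02x\n", off, ca, cb);
|             off++;
|         }
|         fclose(a);
|         fclose(b);
|         return 0;
|     }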
| chucky_z wrote:
| Where did you get your knowledge? With places like Hacker News,
| along with folks like yourself and myself writing posts about
| this kind of stuff, it becomes trivial for anyone of any age to
| walk down a rabbit hole. Think back to how much you
| accomplished when you were younger because you were walking in
| the footsteps of others. :)
| noch wrote:
| > With [...] folks like yourself and myself writing posts
| about this kind of stuff
|
| Please could you link to some of the reverse engineering
| writing by you guys to which you're referring?
| war1025 wrote:
| A magical thing about high school / college that people fail
| to appreciate is how few actual responsibilities you have at
| that age.
|
| It's why kids are able to devote so many hours to video games
| and other time waster stuff.
|
| Only difference is this person managed due to luck / skill /
| timing to end up on a more productive path.
|
| Time commitment is the big impediment for most open-source
| projects. Late adolescence / young adulthood is one of the
| few periods of life where you both are competent enough to do
| useful things and have enough free time to actually commit to
| that work.
| noizejoy wrote:
| > anyone of any age
|
| and intelligence and education and intellectual curiosity and
| non-working time and energy/stamina ...
|
| That Venn diagram ends up with a pretty small number of
| humans ...
| viraptor wrote:
| High school can be a time when you have lots of spare time for
| learning what you want. I remember spending one year, and
| almost all of the summer break, reverse engineering and writing
| binary patches for the sl45i phone. There was a huge community
| around it at the time so you just joined a forum and started
| asking questions.
| bitwize wrote:
| When I was 14 I discovered that my computer's CPU (a 186) had
| two on-chip programmable timers. One was used for DRAM refresh,
| so I used the other to get fine-grained timing so I could bit-
| bang the speaker output to get crude PWM sound.
|
| Of course back then, people used to document the components
| they put in computers...
| pjmlp wrote:
| At the age of 10 I was coding for the Timex 2068, using books
| like these ones,
|
| https://www.atariarchives.org/
|
| Yes, they are for the Atari, just trying to make a point here.
|
| So any smart kid in the age of the Internet, with far more than
| what was available at the local library, is quite capable of
| building up the skills to achieve this kind of work at the high
| school level.
| shawnz wrote:
| Strongly disagree. There is a big difference between doing
| some interesting hacking at a young age and getting your GPU
| driver included in the mainline Linux kernel. The author
| appears to be a true CS prodigy.
| pjmlp wrote:
| On 8 and 16 bit home computers you wrote your own graphics
| driver, in Assembly, if you wanted to draw anything beyond
| the toy BASIC graphics commands.
|
| I guess anyone doing games was a prodigy back then.
| shawnz wrote:
| Yes, perhaps, and at that time it was also a much simpler
| proposition to do that.
|
| I would imagine at that time there were also many fewer
| kids at a high-school level doing that kind of hacking,
| so it might not be totally wrong to say they were all
| prodigies.
| pjmlp wrote:
| I guess the C64, Amiga, and Atari didn't have programmable
| graphics hardware after all.
| cambalache wrote:
| OK I love this, but I am pretty sure my chances of getting a Mac
| in the future are zero. From which side should I expect the PC
| response? Intel? AMD? Microsoft? ARM? Samsung? I think Apple,
| willingly or not, has just obliterated the whole consumer PC
| market.
| yepthatsreality wrote:
| And Apple will court (allow) developers until competition shows
| up. Then they will shut it down and obfuscate. The fact that
| Apple refuses to provide FOSS drivers themselves indicates
| this.
| unix_fan wrote:
| Apple has never provided drivers for other operating systems,
| including Windows. The drivers used for Boot Camp aren't even
| developed by them.
| floatboth wrote:
| Well, they clearly at least commissioned them from someone
| else, since they are the ones _distributing_ them.
| wtallis wrote:
| Intel Macs are literally PCs. The Windows drivers that
| Apple bundled up for Boot Camp were the same drivers that
| the respective component manufacturers provide for other
| PC OEMs to redistribute. In the entire history of Intel
| Macs, there have been very few components used by Apple
| for which there wasn't already an off-the-shelf Windows
| driver written by someone other than Apple for the sake
| of their non-Apple customers.
| floatboth wrote:
| They do include these too, but I'm mostly talking about
| stuff like the trackpads (before SPI, the protocol they
| ran over USB was also custom, not HID)
| altcognito wrote:
| Apple willingly or not increased their market share? Seriously,
| the AstroTurf in this thread is off the charts.
| shmerl wrote:
| Hardly. Apple has nothing to offer for high end gaming and I
| don't think they care about it.
|
| For me it's Linux with AMD both for CPU and GPU.
| devwastaken wrote:
| I'd be excited to see ARM competition in the desktop space, but
| I don't believe there are any ARM chips that compete in
| performance with high end x86. ARM can do lots of cores, which
| is very good for servers, but single-threaded performance is
| still a significant necessity on user-end hardware.
|
| The M1 is great because it's low power with optimized
| performance, but on a desktop you can have well over 500W and
| that's normal.
|
| I don't see anyone else making mobile ARM chips for laptops
| except attempts at something like a Windows Chromebook. The
| software compatibility will be a nightmare.
| tyingq wrote:
| I do think that's a relevant point for desktops. The M1 is
| incredible, but if you don't care about TDP, desktops can
| catch up fairly quickly for either Intel or AMD.
|
| I don't see an obvious player currently working on a "broad
| market" high performance ARM chip for the commodity desktop.
| wlesieutre wrote:
| _> I don't see anyone else making mobile ARM chips for
| laptops except attempts at something like a Windows
| Chromebook. The software compatibility will be a nightmare._
|
| I'd expect Microsoft to make a run at this with their Surface
| line.
|
| They've been trying to make ARM tablets for years (see
| Windows RT), and they just recently added the x86-64
| compatibility layer so that it could be actually useful
| instead of "it's an iPad but worse".
|
| https://blogs.windows.com/windows-
| insider/2020/12/10/introdu...
|
| Will it see any success with 3rd party developers probably
| not bothering to support ARM? Maybe for some people who spend
| most of their time in Edge, Mail, or Office. I have a hard
| time seeing it being as successful as Apple's change, since
| the messaging from Apple is "This is happening in 2 years,
| get on board or you'll be left behind" and the messaging from
| Microsoft is "Look, we made an ARM tablet, and will probably
| sell at least 10 of them."
| zaptrem wrote:
| M1 cores outperform Comet Lake cores and are basically tied
| with AMD Vermeer despite using a fraction of the power.
| stefan_ wrote:
| They are also on a different process _shrug_
| unix_fan wrote:
| Process node improvements don't bring that much
| performance.
| stefan_ wrote:
| Given the massive drop in power consumption and therefore
| heat, they seem to bring inordinate amounts of
| performance in a mobile chip.
| floatboth wrote:
| Even evolution within the same process node can bring
| noticeable performance improvements. Launch day AMD Zen 2
| (TSMC 7FF) chips could barely clock to 4.2GHz; ones
| manufactured months later can often do 4.4.
| happymellon wrote:
| So they _are_ competitive, because they are a node ahead?
| monkmartinez wrote:
| Who cares if you have the darn thing plugged in anyway?
| Does the M1 outperform the Threadripper 3990x?
| astrange wrote:
| Personally I like the noise level in my room being 32 dB
| and don't like PCs having to run the fan at full speed to
| show the desktop picture.
| monkmartinez wrote:
| That is fine and I get it. Just realize there are lots of
| people that need/want more power, variety, and compatibility
| at a lower cost. Water cooling is plug and play these
| days for desktop rigs. The gaming laptop market has
| become shockingly good for professional level
| CAD/Engineering apps with the industry focused on
| drastically reducing noise levels (albeit still quite
| loud in comparison to fanless). Trade offs... trade offs
| as far as you can see...
| floatboth wrote:
| Any desktop PC, even with the beefiest Threadripper,
| won't run the fan at full speed when you're just browsing
| the web or watching videos.
|
| Heck, if you're not squeezing the absolute maximum
| performance from a chip by overclocking (I am :P) you can
| run a 5950X on a regular tower cooler with the fan curve
| never coming close to 100% and it will still be
| incredibly fast.
| singhrac wrote:
| I think it's a moot question, since they're not
| comparable here. Laptops have inherent thermal
| limitations (not just power) that don't allow something
| like the Threadripper to be workable.
|
| If you wanted a fair comparison you'd wait to see what
| processor Apple puts in their Mac Pro in 2-3 years, and
| compare that to whatever is the Threadripper equivalent
| then.
| jrockway wrote:
| I think to understand the M1, you have to be a Mac laptop
| user. For years, Mac laptop performance lagged years
| behind high-end desktop performance -- they have been
| stuck on 14nm+ process chips with a mobile power budget,
| while desktop users have had 7nm chips that can draw 500W
| with no trouble. As a result, what M1 users tell you is
| fast is what PC desktop users have had for ages. A
| Threadripper and 3090 will blow the M1 out of the water
| in raw performance (but use a kilowatt while doing it,
| which a laptop obviously can't do).
|
| At my last job, they issued us 2012-era Macbooks. I
| eventually got so frustrated with the performance that I
| went out and bought everyone on my team an 8th generation
| NUC. It was night and day. I couldn't believe how much
| faster everything was. The M1 is a similar revelation for
| people that have stayed inside the Mac ecosystem all
| these years.
| macNchz wrote:
| Yeah after years of company-issued Macbook Pros I built
| myself a Ryzen 3900x dev machine last year and it was
| like waking up from one of those dreams where you need to
| do something urgently but your legs aren't cooperating.
|
| Given the benchmarks I've seen I imagine the M1 would be
| a somewhat comparable experience, but using a desktop
| machine for software development for the first time
| since...2003(!) has really turned me off the laptop-as-
| default model that I'd been used to, and the slow but
| steady iOSification of MacOS has turned me off Macs
| generally. Once people are back to working in offices I'd
| just pair it with an iPad or Surface or something for
| meetings.
| [deleted]
| MetaWhirledPeas wrote:
| They have bitten off an even bigger chunk of the meal they had
| nearly finished already:
|
| - Developers
|
| - Creatives
|
| - Dutiful citizens of The Ecosystem
|
| What they are no closer to biting off is gamers and hardware
| enthusiasts. Anyone who actually needs to open their PC for any
| purpose whatsoever.
|
| I know Apple wants to be the glossy Eve to all the PC market's
| Wall-E's, but I will continue to shun them forcefully as long
| as they remain hell-bent on stifling the nerdy half of all of
| their users.
|
| I assume the developer popularity is a result of the iPhone
| gold rush, which Apple exploited using exclusivity tactics.
| Therefore I consider it an abomination to see developers
| embrace the platform so thoroughly. iPhone development should
| feel shameful, and when there's no way to avoid it, it should
| be done on a dusty Mac Mini pulled from a cardboard box that
| was sitting in the closet ;)
| bonestamp2 wrote:
| > I assume the developer popularity is a result of the iPhone
| gold rush, which Apple exploited using exclusivity tactics.
| Therefore I consider it an abomination to see developers
| embrace the platform so thoroughly.
|
| That's not why most developers I know use Macs. We use them
| because we got tired of spending a couple days a year (on
| average) unfucking our Windows machines after something goes
| wrong. When you're a developer, you're often installing,
| uninstalling and changing things... much more than the
| average personal or business user. That means there are a lot
| of opportunities for things to go wrong, and they do. Even
| most Google employees were using Macs until Google itself
| started making high end Chromebooks; tens of thousands of
| Google employees still use Macs. Some users have moved on to
| Linux, but most stay on Mac because they want to spend time
| in the OS and not on the OS. I can appreciate both
| perspectives, there's no right answer for everyone.
|
| I share your wish that the hardware was more serviceable but
| everything is a compromise at the end of the day and that's
| the compromise I'm willing to take in exchange for the other
| benefits.
|
| Some complain about the price, but the high end MacBook Pros
| aren't even more expensive than Windows workstation laptops.
| Our company actually saved several hundred dollars per
| machine when we switched from ThinkPads with the same specs.
| Not to mention, our IT support costs were cut almost in half
| with macs.
|
| So, aside from gaming or some specific edge-case requirements,
| it's hard for me to justify owning a PC. That said, I have
| one of those edge-case requirements with one of my clients, so
| I have a ThinkPad just for them. But it stays in a drawer
| when I'm not working on that specific thing.
| evilduck wrote:
| On the gaming front, I've been trying GeForce Now recently
| and while it may not be great for competitive FPS games, it
| has otherwise destroyed any reason why I'd ever purchase
| another gaming PC unless they start jacking up the monthly
| price. It works on basically all platforms (including my
| iOS devices), it doesn't spin up my MBP's fans and doesn't
| eat through battery life. I don't have to worry about ARM
| vs x86, I don't even get locked in to the platform like
| Stadia, it connects to Steam, Epic and GOG.
| evilduck wrote:
| Wow, obvious bias much? Great way to engage in reasonable,
| level headed conversation is to lead with telling people they
| should be ashamed for not holding your values and opinions. I
| honestly can't think of a better way to demonstrate to most
| people why they _should_ get a Mac than to just show off
| comments like yours.
| ActorNightly wrote:
| > I think Apple, willingly or not, has just obliterated the
| whole consumer PC market.
|
| They definitely won the "general public use laptop" market (if
| you conveniently ignore the rest of the laptop and OSX, which
| are utter crap IMO), but it's important to understand that they
| really didn't invent anything, they just optimized. And
| optimizing something like this means you make the stuff that is
| most commonly used better, while reducing the functionality of
| the rest.
|
| Compare the Asus Zephyrus RoG 14 laptop with the MacBook Pro.
| The G14 has the older Ryzen 9 4900HS chip, and while the
| single-core performance of the MBP is better, the multi-core
| performance is the same despite the 4900HS being a last-gen
| chip. The G14 gets about 11 hours of battery time for regular
| use, the MBP gets about 16, but the G14 also has a discrete GPU
| that is better for gaming than the integrated GPU of the Mac.
| Different configurations for different things.
|
| Then, even ignoring the companies' decisions to mix and match
| components, the reason you can buy any Windows-based laptop
| and install Linux on it and have 99% of the functionality
| working right out of the box is the standardization of the
| CPU architecture. With the Apple M1, even though Rosetta is
| very well built, its support is not universal across all
| applications; some run poorly or not at all.
|
| And while modern compilers are smart, they have a long way to
| go before true cross-platform operability with all the
| performance enhancements intact. Just look at any graphical
| Linux distro, and the fact that all Android devices out there
| run Linux, yet there isn't a way to natively run a distro on
| them except the chroot method, which doesn't take advantage of
| the hardware.
|
| So in the end, if you are someone who just wants a laptop with
| good performance and great battery life, and don't care about
| any particular software since most of your time will be spent
| on the web or in the Apple ecosystem software, the M1 machines
| are definitely the right choice. However, if you want something
| like a dev machine with Linux where you just want to be able to
| git clone software and have it work without any issues, you are
| most likely going to be better off with the "Windows" laptops
| with traditional AMD/Intel chips for quite some time.
| tambourine_man wrote:
| > but I am pretty sure my chances of getting a Mac in the
| future are zero
|
| Why?
| monkmartinez wrote:
| I cannot speak for OP, but software is the reason for me. I
| cannot run SolidWorks on macOS without Boot Camp or other
| tricks, for example. Photogrammetry apps? GIS apps? CNC CAM
| apps? I mean, compare the Mac-compatible apps [1] to the
| catalog of apps Autodesk has.
|
| The fact of the matter is GPU support on macOS is just not
| there in the same way it is on Windows. It is really hard
| to justify the price of Apple when you compare directly to
| Windows offerings spec for spec. And when you compare the
| sheer volume of software available for Windows versus macOS,
| especially if you need a discrete GPU, there is really no
| comparison.
|
| [1] https://www.autodesk.com/solutions/mac-compatible-software
| criddell wrote:
| I'm always surprised when I see discussions about somebody
| dumping one operating system for another. Isn't your
| operating system choice dictated by the applications you
| want to run?
| cambalache wrote:
| Price for one; also I still consider PC and Linux (and good
| heavens, even Windows) more open than Mac. I also work with a
| lot of Windows-only automation software.
| chrisweekly wrote:
| sorry but what's "PC" above, given it's not windows?
| cambalache wrote:
| A generic term for hardware able to run Linux and/or
| Windows. Or any personal computer not built by Apple. You
| can have any combination of open/closed hardware and OS.
| ChuckNorris89 wrote:
| For me:
|
| 1. A huge catalog of apps and games, old and new, that are
| mostly Windows and Linux, but all exclusively x86.
|
| 2. I don't want to get locked into an ecosystem with no
| variety. For example, if Dell's laptop offering has a
| keyboard I hate or a poor selection of ports, then I can
| switch to Lenovo or HP, or go for a gaming laptop with an
| Nvidia GPU and use it to train NNs, etc.; the variety is
| amazing. Whereas if Apple's next machine has a flaw I dislike,
| then too frikin bad, I'll be stuck with whatever Apple bestows
| upon me.
| fossuser wrote:
| You might have to bite the bullet and get a Mac - I don't see
| much promise from others in this space.
|
| Intel's CEO change may save them, but they're definitely facing
| an existential threat they've failed to adapt to for years.
|
| Amazon will move to ARM on servers and will probably do some
| decent design there, but that won't really reach the consumer
| hardware market (probably - though I suppose they could do
| something interesting in this space if they wanted to).
|
| Windows faces issues with third party integration, OEMs, chip
| manufacturers and coordinating all of that. Nadella is smart
| and is mostly moving to Azure services and O365 on strategy - I
| think Windows and the consumer market matter less.
|
| Apple owns their entire stack and is well positioned to
| continue to expand the delta between their design/integration
| and everyone else continuing to flounder.
|
| AMD isn't that much better positioned than Intel and doesn't
| have a solution for the coordination problem either. Nvidia may
| buy ARM, but that's only one piece of getting things to work
| well.
|
| I'm long on Apple here, short on Intel and AMD.
|
| We'll see what happens.
| spideymans wrote:
| I just got my M1 Air. This thing is unbelievably fluid and
| responsive. It doesn't matter what I do in the background. I
| can simultaneously run VMs, multiple emulators, compile code,
| and the UI is always a fluid 60 fps. Apps always open
| instantly. Webpages always render in a literal blink of an
| eye. This thing feels like magic. Nothing I do can make this
| computer skip a beat. Dropped frames are a thing of the past.
| The user interface of every Intel Mac I've used (yes, even
| the Mac Pro) feels slow and clunky in comparison.
|
| Oh, and the chassis of this fanless system literally remains
| _cool to the touch_ while doing all this.
|
| The improvements in raw compute power alone do not account
| for the incredible fluidity of this thing. macOS on the M1
| now feels every bit as snappy and responsive as iPadOS on the
| iPad. I've never used a PC (or Mac) that has ever felt
| anywhere near this responsive. I can only chalk that up to
| software and hardware integration.
|
| Unless Apple's competitors can integrate the software and
| hardware to the same degree, I don't know how they'll get the
| same fluidity we see out of the M1. Microsoft really oughta
| take a look at developing their own PC CPUs, because they're
| probably the only player in the Windows space suited to
| integrate software and hardware to such a degree. Indeed,
| Microsoft is rumoured to be developing their own ARM-based
| CPUs for the Surface, so it just might happen [0]
|
| [0] https://www.theverge.com/2020/12/18/22189450/microsoft-
| arm-p...
| uncledave wrote:
| So much this. M1 mini here. I am absolutely chuffed with
| it. It's insanely good.
|
| I'm going to be the first person in the queue to grab their
| iMac offering.
| monkmartinez wrote:
| Are you saying you don't see much promise for AMD, Intel and
| Nvidia in the GPU space or with computers in general? I had a
| hard time following your logic.
|
| Apple may own their stack, but there are a TON of use cases
| where that stack doesn't even form a blip on the radar of the
| people who purchase computer gear.
| fossuser wrote:
| My prediction is x86 is dead.
|
| External GPUs will remain and I think Nvidia has an
| advantage in that niche currently.
|
| The reason stack ownership matters is because it allows
| tight integration which leads to better chip design (and
| better performance/efficiency).
|
| Windows has run on ARM for a while for example, but it
| sucks. The reason it sucks is complicated but largely has
| to do with bad incentives and coordination problems between
| multiple groups. Apple doesn't have this problem.
|
| As Apple's RISC design performance improvements (paired
| with extremely low power requirements) become more and more
| obvious, x86 manufacturers will be left unable to compete.
| Cloud providers will move to ARM chipsets of their own
| design (see: https://aws.amazon.com/ec2/graviton/) and
| AMD/Intel will be on the path to extinction.
|
| I'd argue Apple's M1 machines are already at this level and
| they're version 0 (if you haven't played with one you
| should).
|
| This is an e-risk for Intel and AMD; they should have been
| preparing for this for the last decade. Instead, Intel
| doubled down on their old designs to maximize profit in the
| short term at the cost of extinction in the long term.
|
| It's not an argument about individual consumer choice
| (though that will shift too), the entire market will move.
| The_Colonel wrote:
| > My prediction is x86 is dead.
|
| I don't see that. At least in corporate environments with a
| bazillion legacy apps, x86 will be king for the
| foreseeable future.
|
| And frankly I don't really see the pull of ARM/M1 anyway.
| I mean, I can get a laptop with extremely competitive
| Ryzen for way cheaper than MacBook with M1. The only big
| advantage I see is the battery, but that's not very
| relevant for many use cases - most people buying
| laptops don't actually spend that much time on the go
| needing battery power. It's also questionable how
| transferable this is to the rest of the market without
| Apple's tight vertical integration.
|
| > I'd argue Apple's M1 machines are already at this level
| and they're version 0
|
| Where is this myth coming from? Apple's chips are now on
| version 15 or so.
| fossuser wrote:
| This is the first release targeting macOS, I'm not
| pretending their chips for phones don't exist - but the
| M1 is still version 0 for macs.
|
| > "And frankly I don't really see the pull of ARM/M1
| anyway. I mean, I can get a laptop with extremely
| competitive Ryzen for way cheaper than MacBook with
| M1..."
|
| Respectfully, I strongly disagree with this - to me it's
| equivalent to someone defending the keyboards on a Palm
| Treo. This is a major shift in capability and we're just
| seeing the start of that curve where x86 is nearing the
| end.
|
| "No wireless. Less space than a Nomad. Lame."
| The_Colonel wrote:
| > but the M1 is still version 0 for macs.
|
| Fair enough, it's just important to keep in mind that M1
| is a result of decade(s) long progressive enhancement. M2
| is going to be another incremental step in the series.
|
| > to me it's equivalent to someone defending the
| keyboards on a Palm Treo. This is a major shift in
| capability ...
|
| That's a completely unjustified comparison. iPhone
| brought a new way to interact with your phone. M1 brings
| ... better performance per watt? (something which is
| happening every year anyway)
|
| What new capabilities does M1 bring? I'm trying to see
| them, but don't ...
| fossuser wrote:
| > "That's a completely unjustified comparison. iPhone
| brought a new way to interact with your phone."
|
| People don't really remember, but a lot of people were
| really dismissive of the iPhone (and iPod) on launch. For
| the iPhone, the complaints were about cost, about lack of
| hardware keyboard, about fingerprints on the screen.
| People complained that it was less usable than existing
| phones for email, etc.
|
| The M1 brings much better performance at much less power.
|
| I think that's a big deal and is a massive lift for what
| applications can do. I also think x86 cannot compete now
| and things will only get a lot worse as Apple's chips get
| even better.
| The_Colonel wrote:
| > People don't really remember, but a lot of people were
| really dismissive of the iPhone (and iPod) on launch.
|
| I do remember that. iPhone had its growing pains in the
| first year, and there was fair criticism back then. But
| it was also clear that iPhone brings a completely new
| vision to the concept of a mobile phone.
|
| The M1 brings nice performance at fairly low power, but
| that's just a quantitative difference. No new vision.
| Perf/watt improvements have been happening every single
| year since the first chips were manufactured.
|
| > I also think x86 cannot compete now and things will
| only get a lot worse as Apple's chips get even better.
|
| Why? Somehow Apple's chips will get better, but
| competition will stand still? AMD is currently making
| great progress, and it finally looks like Intel is
| waking up from its lethargy as well.
| fossuser wrote:
| > "Why? Somehow Apple's chips will get better, but
| competition will stand still?"
|
| Arguably this has been the case for the last ten years
| (comparing chips on iPhones to others).
|
| I think x86 can't compete: CISC can't compete with RISC
| because of problems inherent to CISC
| (https://debugger.medium.com/why-is-apples-m1-chip-so-
| fast-32...)
|
| It won't be for lack of trying - x86 will hold them back.
|
| I suppose in theory they could recognize this e-risk, and
| throw themselves at coming up with a competitive RISC
| chip design while also somehow overcoming the integration
| disadvantages they face.
|
| If they were smart enough to do this, they would have
| done it already.
|
| I'd bet against them (and I am).
| spideymans wrote:
| > The M1 brings nice performance at fairly low power, but
| that's just a quantitative difference. No new vision.
| Perf/watt improvements have been happening every single
| year since the first chips were manufactured.
|
| I'd say the M1's improvements are a lot more than
| performance per watt. It has enabled a level of UI
| fluidity and general "snappiness" that I just haven't
| seen out of any Mac or PC before. The Mac Pro is clearly
| faster than any M1 Mac, but the browsing the UI on the
| Mac Pro just feels slow and clunky in comparison to the
| M1.
|
| I can only chalk that up to optimization between the
| silicon and the software, and I'm not sure that Apple's
| competitors will be able to replicate that.
| phkahler wrote:
| Remember, M1 is on _the_ leading edge 5nm fab process.
| Ryzen APUs are coming and may be competitive in terms of
| power consumption when they arrive on 5nm.
|
| Apple software is also important here. They do some
| things very much right. It will be interesting to run
| real benchmarks with x64 on the same node.
|
| Having said all that, I love fanless quiet computers. In
| that segment Apple has been winning all along.
| monkmartinez wrote:
| Ok, I still have questions.
|
| To start... How would a city with tens of thousands of
| computers transition to ARM in the near future?
|
| The apps that run 911 dispatch systems and critical
| infrastructure all over the world are all on x86
| hardware. Millions if not billions of dollars in
| investment, training, and configuration. These are
| bespoke systems. The military-industrial complex
| basically runs on custom chips and x86. The federal
| government runs on x86. You think they are just going to
| say, "Whelp, looks like Apple won, let's quadruple the
| cost to integrate Apple silicon for our water system and
| missile systems! They own the stack!"
|
| Professional-grade engineering and manufacturing apps are
| just going to suddenly be rewritten for Apple hardware,
| because the M2 or M3 is sooooo fast? Price matters!!!!
| Choice matters!!!
|
| This is solely about consumer choice right now. The cost
| is prohibitive for most consumers as well, as evidenced by
| the low market penetration of Apple computers to this
| day.
| nzmsv wrote:
| Notice how the only counterexamples you came up with are
| legacy applications. This is the first sign of a
| declining market. No, Intel will not go out of business
| tomorrow. But they are still dead.
|
| The growth markets will drive the price of ARM parts down
| and performance up. Meanwhile x86 will stagnate and
| become more and more expensive due to declining volumes.
| Eventually, yes, this will apply enough pressure even on
| niche applications like engineering apps to port to ARM.
| The military will likely be the last holdout.
| fossuser wrote:
| You make bets on where the puck is going, not on where it
| currently is.
|
| "How would a city with tens of thousands of HDDs
| transition to SSDs in the near future?"
|
| It happens over time as products move to compete.
|
| Client side machines matter less, the server will
| transition to ARM because performance and power are better
| on RISC. The military industrial complex relies on
| government cloud contracts with providers that will
| probably move to ARM on the server side.
|
| It's not necessarily rewriting for Apple hardware, but
| people that care about future performance will have to
| move to similar RISC hardware to remain competitive.
| nzmsv wrote:
| I think we'll see a lot of ARM use cases outside of the
| Apple stack and x86 is dead (but it will of course take its
| sweet time getting there). For the longest time everyone
| believed at a subconscious level that x86 was a
| prerequisite due to compatibility. Apple provided an
| existence proof that this is false. There is no longer a
| real need to hold onto the obsolete x86 design.
|
| The only way for Intel and AMD to thrive in this new world
| is to throw away their key asset: expertise in x86 arcana.
| They will not do this (see Innovator's Dilemma for reasons
| why). As a result they will face a slow decline and
| eventual death.
| vbezhenar wrote:
| What exactly do you expect? x86 is quite competitive. M1 might
| be slightly better, but it's not like it's miles ahead.
| cambalache wrote:
| I want a return to the status quo when, for 80% (it could be
| even less, but let's not dwell on the number) of the price of
| an Apple laptop, I could get a Windows/Linux machine matching
| or surpassing its specs (including stuff like battery life,
| energy consumption, screen DPI, noise, etc.). This is not
| true now and I am not seeing an option in the short term.
| shawnz wrote:
| Perhaps it is still true, but Apple is somehow becoming the
| budget option in that equation?
| cwhiz wrote:
| Fiddling around on an M1 MBA, it felt faster than my 2020 16"
| MBP. It's half the weight, seems to get double the battery
| life, and costs less than 1/3 as much.
|
| I just can't even imagine what the gap is going to look like
| when Apple really refines this down.
| pimeys wrote:
| You should compare it to something other than old Apple
| laptops. The Ryzen models, such as the ThinkPad T14, are
| very fast, and if you want to go tenfold from there, the
| modern Ryzen desktop CPUs are in a league of their own.
| Apple always struggled with Intel chips and their thermals,
| which is why those machines feel so slow compared to the M1.
| esturk wrote:
| A 2020 16" Macbook Pro isn't old by any measure. This
| argument seems disingenuous.
| jayd16 wrote:
| >It's half the weight
|
| It's 70% of the weight at 70% of the volume of the 16".
| What is the point of comparing the weight of a 13" and a
| 16" laptop?
| junipertea wrote:
| It is faster despite the lower battery capacity and
| thermals. In fact, the MacBook Air has no fan.
| banana_giraffe wrote:
| The response from Intel seems to be betting on the Evo platform
| [1], with third parties announcing laptops like the XPG Xenia
| XE.
|
| [1]
| https://www.intel.com/content/www/us/en/products/docs/evo.ht...
| andromeduck wrote:
| That looks more like ultrabooks but not watered down again -
| I'm not impressed.
| AtlasBarfed wrote:
| So they're announcing a "platform". Smacks of managerial
| bottom-covering. Almost like forming a committee to
| investigate the problem.
|
| - where was this 1/2/10 years ago?
|
| - how would this address the fundamental CPU performance gap?
|
| - Intel has no competitive GPU offering, yet another glaring
| failure on their part
|
| - why would OEMs go along with this when Ryzen is a better
| CPU and GPU, aside from getting Intel bribes and the usual
| marketplace branding momentum?
|
| - will this actually get ports migrated to laptops faster? It
| was criminal how long it took for HDMI 2.0 to hit laptops.
|
| I get Intel doesn't own the stack/vertical integration, but
| Intel could have devoted 1% of its revenue to a kickass Linux
| OS to keep Microsoft honest a long time ago and demonstrate
| its full hardware.
|
| Even if only coders/techies used it, the way MacBooks are
| standard issue in the Bay, it would have been good insurance,
| leverage, or demoware.
| banana_giraffe wrote:
| Yeah, on top of it all, given all of the shots of the
| reference models look vaguely like a MacBook, it really
| feels to me like Intel dug around in their couch cushions
| to come up with a response.
| wg0 wrote:
| "I get Intel doesn't own the stack/vertical integration,
| but Intel could have devoted 1% of its revenue to a kickass
| Linux OS to keep Microsoft honest a long time ago and
| demonstrate its full hardware." Interesting point... makes
| one wonder why they didn't do it while sitting on a mountain
| of cash.
| nine_k wrote:
| They had Intel Clear Linux, a server-oriented distro.
| Quite good at what it targeted.
| tambre wrote:
| They do have something - Clear Linux [0]. Definitely not
| too much investment, but they do differentiate by
| compiling packages for much newer instruction sets
| compared to other distros.
|
| [0]: https://clearlinux.org/
| AtlasBarfed wrote:
| The real differentiator would have been an army of good
| driver coders and contributors to KDE/GNOME.
| AtlasBarfed wrote:
| They were so in bed with Microsoft.
|
| Microsoft being a true monopoly might have struck fear into
| the timid souls of Intel executives that Microsoft would go
| headlong for AMD.
|
| Or Google had this opportunity for years, and half-assed
| ChromeOS. Or AMD. Or Dell/HP/IBM who sold enough x86 to
| have money on the side.
|
| I don't buy that it would have been hard. Look at what
| Apple did with OSX with such a paltry market share and
| way before the iPhone money train came. First consumer
| OSX release was 2001.
|
| Sure, Apple had a massive advantage by buying NeXT's
| remnants and Jobs's familiarity with it and the people
| behind it, but remember that Apple's first choice was
| BeOS.
|
| So anyone looking to push things could have got BeOS, or
| an army of Sun people as Sun killed off Solaris. The
| talent was out there.
|
| Instead here we sit with Windows in a perpetual state of
| the two-desktop tiled/old Frankenstein, Linux DE
| balkanization and perpetual reinvention/rewrites from
| scratch, and OSX locked to Apple hardware.
| Fnoord wrote:
| > I get Intel doesn't own the stack/vertical integration,
| but Intel could have devoted 1% of its revenue to a kickass
| Linux OS to keep Microsoft honest a long time ago and
| demonstrate its full hardware.
|
| Moblin, MeeGo.
| selectodude wrote:
| >- where was this 1/2/10 years ago?
|
| https://en.wikipedia.org/wiki/Centrino#Notebook_implementat
| i...
| neogodless wrote:
| You have to read between the lines here. They make no claims
| about CPU performance - just integrated GPU performance from
| Xe (which is a big improvement from previous Iris GPUs.) Then
| they claim battery life (9 hours, FHD, 250-nits, etc.)
|
| What that means is laptop OEMs will have to limit TDP on CPUs
| - probably 15W or less. Given current Intel chips being very
| power hungry, these are likely NOT going to be great CPU
| performers.
|
| The only competition in CPU space to M1 will be Ryzen 5000U
| chips in the 15-25W thermal envelope. They should be ~19%
| more powerful/efficient than Ryzen 4000U chips, but I would
| not expect M1 levels of cool or battery life yet.
| flatiron wrote:
| The new Ryzen mobile processors should be interesting. Their
| GPU drivers (while not of the best code quality) are in the
| mainline Linux kernel. So it all should just "work".
| gnarbarian wrote:
| Currently using a Renoir laptop. It's smoking fast but I had
| to install a bleeding edge kernel to get the display driver
| to work at all. That should be fixed in the next Ubuntu
| release though.
| imhoguy wrote:
| 5.10.x kernels are very stable and feature-complete with
| AMD Ryzen Renoir - I update almost weekly once a new
| patch version is out on the Ubuntu Kernel PPA. Here is a
| nice script which makes the update trivial:
| https://github.com/pimlie/ubuntu-mainline-kernel.sh
| threentaway wrote:
| Weird, I had no issues with the built in display on Ubuntu
| 20.04, but I had to update the kernel to 5.8 to get display
| out over USB-C to work. Now that Ubuntu 20.10 is out and
| uses 5.8, I'm just using that so I don't have to mess with
| custom, unsigned kernels.
| pedrocr wrote:
| I just installed 20.04.1 on a 4750U Lenovo T14s and
| everything just works as far as I can tell.
| unix_fan wrote:
| The new mobile chips are basically a mix of new and old
| stuff, with them rebranding Zen 2 parts. Kind of
| disappointing.
| neogodless wrote:
| But the slightly tweaked "old stuff" is _relatively_ low-
| end - up to and including the 5700U. You'll find it in
| thin 'n light, budget and business laptops that will have
| more than enough power from a Zen 2 core. If you really
| absolutely need more power, you'll know it and you'll be
| shopping for a 5800H (or above).
|
| If you don't know enough about CPUs to even read reviews
| that compare the CPU to other CPUs, then you either don't
| need the extra IPC of Zen 3 (and you won't notice when you
| use your laptop day to day) or you just... don't care.
|
| If you care, get a 5600U/5800U or H line and it will never
| affect you. The laptops these come in should be priced
| accordingly.
| littlestymaar wrote:
| > Apple willingly or not has just obliterated the whole
| consumer pc market.
|
| Apple probably has the best laptop out there as of today, but I
| don't think Apple's sales performance is impacted that much by
| their hardware performance, actually: around 2012-2015 or so
| they had several years with a subpar mobile phone, both on the
| hardware and the software side, and it still sold very well.
| A few years later, they have the best phone on the market
| and... it didn't change their dynamic much: it still sells very
| well, as before. On the laptop market, they have been selling
| subpar laptops for a few years without much issue, and I guess
| it won't change much that they now have the best one.
|
| Apple customers will get a much better deal for their bucks,
| which is good for their long term business, but I don't think
| it will drive that many people out of the Windows world[1] just
| for that reason (especially with the migration/compatibility
| issues, which are even worse now than when they were running
| on Intel).
|
| Also, many people outside of HN just don't listen to Apple's
| "revolutionary" announcements; they have used that card too
| much, for no good reason most of the time, so people just
| stopped listening (even my friends who are actually Apple
| customers).
|
| [1]: which is where most people are tbh, and I don't think that
| many Linux people would switch either.
| noizejoy wrote:
| Agreed - since macOS is even more of an entire ecosystem,
| moving in and out of that is much more of a long-term
| commitment for most regular users.
|
| People who are multi-platform in daily life are much more
| likely to switch - and that's a rather small percentage of
| computer users (and of course very much over-represented here
| at HN).
|
| > they have used that card too much
|
| you can never have enough "magic" :-)
| chrisbrandow wrote:
| question from ignorance - Why not use Metal? Is that too embedded
| in the macOS system to be useful for something like Linux? Or is
| this for the sake of understanding the bare metal?
| lunixbochs wrote:
| If you used Metal as the graphics API on Linux, literally no
| existing Linux software would work with it unless you also used
| a layer like MoltenGL or MoltenVK (which have been written for
| a Mac system and would likely need modification). Linux
| graphics drivers also tend to have extra APIs for buffer
| management for X11/Wayland, which a Molten compat layer
| probably doesn't provide, as Molten is meant to run in-process
| with each app, I believe.
|
| Some of the Metal APIs are also a little intertwined with
| Swift and ObjC.
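|
| For a flavor of what that buffer-management side looks like,
| here is a rough sketch using GBM (the generic buffer API Linux
| drivers plug into; the device path and sizes are assumptions):
|
|     #include <fcntl.h>
|     #include <stdio.h>
|     #include <gbm.h>
|
|     int main(void) {
|         /* Open the render device and wrap it in a GBM device */
|         int fd = open("/dev/dri/card0", O_RDWR);
|         if (fd < 0) { perror("open"); return 1; }
|         struct gbm_device *gbm = gbm_create_device(fd);
|         if (!gbm) { fprintf(stderr, "no gbm device\n"); return 1; }
|
|         /* Allocate a scanout-capable buffer the compositor
|            could display */
|         struct gbm_bo *bo = gbm_bo_create(gbm, 1920, 1080,
|                 GBM_FORMAT_XRGB8888,
|                 GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);
|         if (bo) {
|             printf("stride: %u\n", gbm_bo_get_stride(bo));
|             gbm_bo_destroy(bo);
|         }
|         gbm_device_destroy(gbm);
|         return 0;
|     }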
| pantalaimon wrote:
| The goal is to write a driver for Linux, so this is being done
| from scratch.
| wg0 wrote:
| This might be a stupid question, but if someone wants to start
| with some basic GPU to get an understanding of how these
| things work, what would that GPU be that's widespread, not too
| archaic?
| astrange wrote:
| Anything that supports shaders is fine, but M1 is a mobile-
| style GPU so it doesn't behave exactly like a desktop GPU.
| remexre wrote:
| Intel's GPUs are fairly well-documented at
| https://01.org/linuxgraphics/documentation/hardware-specific...
|
| IMO, writing some Vulkan (it being a thin layer over what a
| modern GPU is "actually" doing) would be good for getting the
| fundamentals down first, not least because you'll have the
| validation layers to yell at you for incorrect usage instead
| of getting mysterious GPU behavior.
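|
| As a minimal sketch (assuming the Vulkan SDK is installed, so
| the VK_LAYER_KHRONOS_validation layer is available), enabling
| validation is just part of instance creation:
|
|     #include <stdio.h>
|     #include <vulkan/vulkan.h>
|
|     int main(void) {
|         /* Ask the loader to wrap every call in the validation
|            layer so incorrect usage is reported */
|         const char *layers[] = { "VK_LAYER_KHRONOS_validation" };
|
|         VkApplicationInfo app = {
|             .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
|             .pApplicationName = "hello-vulkan",
|             .apiVersion = VK_API_VERSION_1_1,
|         };
|         VkInstanceCreateInfo info = {
|             .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
|             .pApplicationInfo = &app,
|             .enabledLayerCount = 1,
|             .ppEnabledLayerNames = layers,
|         };
|
|         VkInstance instance;
|         if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
|             fprintf(stderr, "vkCreateInstance failed\n");
|             return 1;
|         }
|         puts("instance created with validation enabled");
|         vkDestroyInstance(instance, NULL);
|         return 0;
|     }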
| ogre_codes wrote:
| While this is a long way from being a proper graphics driver, it
| is a good indication that we're going to see a functional driver
| for Linux before too long. I was concerned this bit would take
| years to work out; this suggests it'll be months to get a
| working driver.
|
| Chuffed.
| [deleted]
| varispeed wrote:
| So Apple's strategy seems to be to commit as few resources as
| possible to their products, knowing that "open source" developers
| will do their work for free, and thus Apple will not have to pay
| extra salaries and taxes? I don't understand why people bother
| working for free to e.g. run Linux or even write a GPU driver. I
| get that this is a nice developer challenge, and being involved
| with Apple stuff is still seen as "cool", but why don't those
| developers actually support some truly open source projects
| instead of helping a filthy rich company for nothing?
| ogre_codes wrote:
| > So Apple's strategy seems to be to commit as few resources
| as possible to their products, knowing that "open source"
| developers will do their work for free, and thus Apple will not
| have to pay extra salaries and taxes?
|
| Apple has no Linux strategy. Nobody is working for Apple for
| free. People are working on their own time (or supporting the
| project with their own money) because they want to see this
| happen.
|
| There is no skin in this for Apple either way.
|
| What I don't understand is why people insist on criticizing a
| project which won't fundamentally affect them at all. Having
| Linux on Mac isn't going to hurt you and stands to benefit the
| community as a whole.
|
| While the GPU port is unlikely to benefit others, it's very
| likely some of the other work will. Any improvements to the
| Broadcom drivers will be useful for the entire community.
| Improvements and optimizations to ARM64 support will likewise
| benefit the whole community.
|
| Really tired of the misdirected zealots who think they have
| the right to tell other people where to direct their time and
| energy.
| tenebrisalietum wrote:
| Maybe because ... those developers want to use Linux on
| hardware they _bought and own_ instead of the built-in
| operating system? The developers are helping themselves, not
| Apple.
|
| Apple could save some money by selling bare metal M1's without
| an OS installed, but then it might get into the hands of people
| who want a "cheaper Mac but your hacker friend can get it
| working" and it would damage their brand, so I see why they
| don't do it.
| varispeed wrote:
| Well, is this M1 really _that_ good that it's worth committing
| a considerable amount of time to it instead of working on
| something more meaningful or something benefitting one's
| personal life more? Are they able to make a commercial product
| out of it? Unless it gives the person the same kick as fishing
| or building Lego - but these hobbies don't have the side effect
| of filling the pockets of a big co.
|
| The idea about "cheaper Mac" is weak, because you can already
| buy a cheap Mac - it's not going to be the latest gen, but
| let's take into account that Intel has not made big progress
| in the last couple of years, and the M1 is actually on the
| affordable side.
|
| Isn't it actually more damaging to their brand that they don't
| support uses of their products that would benefit professional
| users, and that they rely on people doing work for free, thus
| avoiding paying their fair share?
| tenebrisalietum wrote:
| > Well, is this M1 really _that_ good that it's worth
| committing a considerable amount of time to it instead of
| working on something more meaningful or something
| benefitting one's personal life more?
|
| M1 is an ARM chip that's up there with Intel desktop PCs.
| That's awesome. It's possibly the real beginning of the end
| of the effective Wintel monopoly on personal computing and
| if we are going to continue to have Linux on hardware
| that's not locked-down phones it needs to happen. I'd
| certainly put my effort there if I had the skill.
|
| > Isn't it actually more damaging to their brand that they
| don't support uses of their products that would benefit
| professional users, and that they rely on people doing work
| for free, thus avoiding paying their fair share?
|
| Apple has $100 billion in cash. Whatever they are doing
| now is working.
| varispeed wrote:
| This is like building a house on a swamp. Without
| official Linux support, Apple can pull the plug anytime.
| What is likely going to happen is that eventually a
| viable open source project emerges that Apple didn't pay
| anyone to build, and then they will announce how they
| embrace open source and tell their own developers to
| contribute a few lines for PR.
|
| > Whatever they are doing now is working.
|
| If you use child labour, avoid taxes, use anti-competitive
| measures, make stuff deliberately difficult to repair and
| easy to break, and then have the money to shut up any
| politicians willing to look into your shady business, then
| yes, it is definitely working.
| orangecat wrote:
| It's not about helping Apple. The M1 beats every x86 CPU in
| absolute single-threaded performance, as well as multicore
| performance per watt. Hopefully AMD will close the gap (I don't
| have much hope for Intel), but for now it's an extremely
| attractive target for Linux.
___________________________________________________________________
(page generated 2021-01-22 23:00 UTC)