[HN Gopher] Apple's game porting toolkit is fantastic. Cyberpunk...
___________________________________________________________________
Apple's game porting toolkit is fantastic. Cyberpunk 2077 at Ultra
on an M1 MBP
Author : ghuntley
Score : 311 points
Date : 2023-06-07 07:27 UTC (15 hours ago)
(HTM) web link (twitter.com)
(TXT) w3m dump (twitter.com)
| CostcoFanboy wrote:
| 95% lift by CodeWeavers, but Apple is the one to thank. Lmao.
| sBqQu3U0wH wrote:
| Oh, cool! You can use a tool to make a computer game barely run
| on an overpriced computer? I will never, ever understand the
| appeal of Apple products.
| nazka wrote:
| I truly wonder if Microsoft didn't help with this port. Looking
| at the speed and quality of this port. It seems that Apple and
| Microsoft are working closer than ever. Maybe some execs of
| Microsoft asked for to be in this demo.
| gigel82 wrote:
| Insert _Bike Fall Meme_ here :)
| getcrunk wrote:
| This guy got 30-40fps, idk what settings.
|
| https://www.reddit.com/r/macgaming/comments/1435ukq/cyberpun...
| rsynnott wrote:
| That's on an M2 Max, which is a significantly heftier chip than
| the M1.
| tantalor wrote:
| I guess the video looks okay, but the Twitter embed video quality
| is dog shit. Maybe post the video somewhere that supports higher
| quality, like YouTube.
| gilgoomesh wrote:
| Keep in mind, it's getting 15fps at 1440x900. It's not saying
| that Cyberpunk 2077 at Ultra on an M1 is a great experience. It's
| merely pointing out that it's technically possible (which is a
| massive achievement).
| demarq wrote:
| all while being emulated on rosetta
| Hamuko wrote:
| Yeah, but that's the first attempt. The game porting toolkit is
| designed to shorten the time it takes to get the game running on
| macOS for the first time, by letting you take the existing
| Windows version and run it directly on macOS through
| translation. A finished macOS port would involve additional work
| after this step.
|
| https://developer.apple.com/videos/play/wwdc2023/10123/
| ActorNightly wrote:
| "Technically possible" is a non statement. Of course its
| technically possible. The question is how much money is Apple
| willing to throw at making games run well on their gpu.
| throwaway2990 wrote:
| Ultra with 16 GB of memory shared between CPU and GPU, unlike a
| traditional laptop or desktop with separate memory for each.
| captainbland wrote:
| I think it's a network effect thing as much as anything else.
| If it's good enough to get people playing games on their macs
| at all (even if at relatively sub-par settings), that builds
| the market, shows there are people willing to spend money on
| games to play them on their macs.
|
| Then at that point games developers might be more inclined to
| give the platform explicit support.
|
| Otherwise it's a bit of a chicken and egg situation: people
| aren't playing games on their macs because the library isn't
| there, the library isn't there because no developers will
| support a platform where there aren't gamers and so on.
| trevyn wrote:
| Not as massive an achievement as Proton getting nearly every
| Windows game in existence running _well_ from unmodified
| binaries on Linux.
| yurishimo wrote:
| Don't let perfect be the enemy of the good. Think about how
| many Steam Decks have been sold. Game devs are already
| actively targeting the Deck. If porting games to Mac can be
| made easier, we all need to actively encourage it.
| Unfortunately, the only way we get Apple to give us more game
| dev tools is by porting the games.
|
| I have a Steam Deck and a Mac. I would love to play half the
| games from my Deck on my laptop.
| icapybara wrote:
| Do you find this is a good solution for playing Steam
| games?
|
| I have an older PC, and I'm thinking about replacing it
| with a macbook air and a steam deck. Does the steam deck
| feel more limiting than just having a windows PC with
| steam?
| LegitShady wrote:
| Only get a Steam Deck if you're really interested in
| portable play. You can build a better desktop from old
| parts. The Steam Deck is an interesting device that's
| doing a lot for gaming via Proton, but it makes
| tradeoffs for battery life and portability. If you're not
| interested in the portability just get a desktop. If
| you're interested in the portability but need a large
| screen, get a laptop. If you want a Sega Game Gear form
| factor that can run Steam games, get a Steam Deck.
| akmarinov wrote:
| Proton isn't as massive an achievement as the sun! That thing
| pumps out around 2.3012×10^27 joules a second!
| radicalbyte wrote:
| I'm not sure - Apple's marketing claimed that the M1 was
| beating top end PC parts. This isn't even close.
| skavi wrote:
| To be fair, the M1 is now approaching 3 years of age. And the
| game is being emulated.
| naillo wrote:
| Many much cheaper 3 year old PCs would handle this fine
| rowanG077 wrote:
| 13 inch ultralights? I very much doubt that. You are
| basically relegated to iGPUs. I could see the Ryzen 7840U
| beating the M1. But 3 years ago the best there was, was
| the Ryzen 5800U.
| dannyw wrote:
| No, you won't find many cheaper 3 year old laptops running
| Cyberpunk 2077 on Ultra this well... name one.
| sudosysgen wrote:
| 1440x900? Any laptop with a 2060 mobile would do it. At
| $1500 it's well within budget.
| dahauns wrote:
| Honestly, I wouldn't use the phrase "this well" non-
| sarcastically for ~15fps at 1440x900 like shown in the
| video.
| WhereIsTheTruth wrote:
| this is on Ultra graphics settings.. not a lot of laptops
| could even run it on Low
|
| Cyberpunk is known to be a VERY demanding open world game
|
| on the lowest settings, this should probably be fine
| smoldesu wrote:
| It's an impressive showing, but I wouldn't ignore the
| power of modern low-end APUs. Here's a 3-generation-old,
| entry level Ryzen laptop playing the game for comparison:
| https://youtu.be/Aqgm0zcV7Kw
|
| "this well" is more or less equivalent to a older Ryzen
| 3's native performance. Apple is really banking on
| developers recompiling for ARM to reduce overhead here.
| Jnr wrote:
| It says Ultra, but ray tracing is not supported so it is
| not really "ultra".
| dahauns wrote:
| Without RT it's actually not that demanding.
|
| I mean, here's the game on Ultra on a Steam Deck:
| https://www.youtube.com/watch?v=gHeso2jc_L0
| WhereIsTheTruth wrote:
| The comparison is not fair; the game porting toolkit also
| does X86 -> ARM, so it's missing perf from a lot of HW
| intrinsics
|
| Also, this is the 1st gen M1, which was released in 2020.
| I wonder what the performance is like on the newest
| models?
| smoldesu wrote:
| > name one.
|
| At 900p? The Steam Deck does just fine (and its SoC is even
| older).
| flohofwoe wrote:
| CPU maybe, GPU definitely not. In the end, M1's GPU is still
| a mobile GPU with a few desktop features bolted on (like BCx
| compressed texture formats).
| izacus wrote:
| They put up a comparison of power against the nVidia 3090,
| which made every Apple fan think it's comparable in
| performance too :D
| djsavvy wrote:
| That was for the M1 ultra, while TFA is on a standard M1.
| willcipriano wrote:
| Wasn't it the "fastest laptop ever"?
| reaperman wrote:
| I mean, for the work I do that's probably true. My work
| is 99% CPU/RAM/disk dependent.
|
| GPU obviously not, but maybe that claim would hold water
| at some arbitrary wattage limit.
|
| My takeaway is that the GPU "doesn't completely suck" and
| that Apple are dedicating continuing resources to making
| their platform actually usable, which I was worried
| about. I mean, it seems difficult just to intentionally
| use the Apple Neural Engine, and impossible to explicitly
| use it, which makes testing aggravating. Any continued
| focus on improving developer experience for the
| coprocessors (GPU, ANE, R1, etc) is a good signal.
| barbariangrunge wrote:
| If it's disk dependent, then the M2 SSDs are half as fast
| as the M1 SSDs because the base model uses a single NAND
| chip, unless you upgrade
| reaperman wrote:
| Yeah I don't think laptops should be sold at all with
| 512GB -- I think that's as absurd of a product as a
| laptop with 128GB. Personally I spec out 2TB and judge
| price value based on that.
|
| So the speed issue doesn't affect me personally. I just
| wish they wouldn't sell that model and then it wouldn't
| affect anyone.
| throwaway290 wrote:
| Isn't that claim anchored to energy consumption? Sure, you
| can put a powerful GPU in a laptop and make it
| effectively tethered to a wall plug...
| sudosysgen wrote:
| That's already true of the M1 Max. At maximum power from
| CPU+GPU it will barely last an hour.
|
| Edit: I wrote Ultra, but I meant Max.
| throwaway290 wrote:
| > M1 Ultra
|
| Do they put M1 Ultra in laptops tho?
| sudosysgen wrote:
| Yes. The MBP with an M1 Max will, at max performance, use
| enough power that it would discharge its battery in less
| than an hour. I think Apple throttles it on battery,
| though.
| pram wrote:
| The MBP with an M1 Ultra isn't a thing that exists.
| sudosysgen wrote:
| You're right, I miswrote, it's the M1 Max. See
| https://www.anandtech.com/show/17024/apple-m1-max-
| performanc... - the M1 Max can draw over 100W.
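|
| A back-of-the-envelope sketch of why that works out to roughly
| an hour (the ~100 Wh battery capacity of the 16" MacBook Pro is
| my assumption; the ~100 W package draw is the AnandTech figure
| above):
|
|     // battery_runtime.swift -- run with `swift battery_runtime.swift`
|     let batteryCapacityWh = 99.8   // 16" MacBook Pro battery (assumed)
|     let sustainedDrawW = 100.0     // CPU+GPU package power at full load
|     let hours = batteryCapacityWh / sustainedDrawW
|     print("Estimated runtime at full load: \(hours) hours")  // ~1 hour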
| Hamuko wrote:
| No. There does not exist an M1 Ultra laptop of any kind.
| Nor an M2 Ultra laptop for that matter. The only machines
| with Mn Ultra are the Mac Studio and Mac Pro.
|
| https://en.wikipedia.org/wiki/Apple_M1#Products_that_use_
| the...
| mrtranscendence wrote:
| That may be true, but in practice I get _faaar_ better
| battery life out of my M1 Max than I would out of a
| laptop with a mobile 4090.
| sudosysgen wrote:
| Do you do AAA gaming on your M1 Max? If not, then the GPU
| is irrelevant, because a laptop with a mobile 4090 is going
| to power that GPU down completely when it isn't in use.
|
| If you are indeed doing AAA gaming, then you wouldn't
| have sufficient battery life without plugging in, or you
| wouldn't have sufficient performance.
| mrtranscendence wrote:
| I'm not talking about AAA gaming here. I'm talking about
| day-to-day work-related tasks, which is primarily what I
| use my MacBook for.
| throwaway290 wrote:
| People report playing Baldur's Gate 3 for an hour on M1
| Max with 40% battery life left, is it AAA enough? https:/
| /www.reddit.com/r/macbookpro/comments/qogsov/battery_...
| (M1 Pro lasts longer)
| WhereIsTheTruth wrote:
| it translates X86 to ARM, and that's not free
|
| High-end games make heavy use of SIMD instructions to gain
| massive boosts; I wonder if those are translated properly
| heliophobicdude wrote:
| Do keep in mind this script was marketed as a way for game
| developers to judge the viability of porting to the native
| APIs and native ISA.
|
| I would imagine that a build not running through this
| CodeWeavers patch and through Rosetta would have better
| performance.
| senttoschool wrote:
| > _I'm not sure - Apple's marketing claimed that the M1 was
| beating top end PC parts. This isn't even close._
|
| Your statement is quite misinformed.
|
| First, this is the M1. Not M1 Pro. Not M1 Max. The M1 is
| almost 3 years old.
|
| Second, this is being translated from DX12 to Metal and also
| x86 to ARM64. Yes, both the CPU and GPU layers are being
| translated.
|
| Third, Apple claimed that the M1 Max was the most powerful
| GPU in a laptop. It was probably true, depending on the
| benchmark.
|
| Finally, this is Cyberpunk 2077 running at Ultra settings.
| It's one of the most demanding PC games ever.
| whazor wrote:
| M1 has 8 GPU cores, M1 Pro has 16, M1 Max has 32 cores.
| Apple says the GPU of M1 Max is four times as fast as M1.
| So 30 FPS Ultra at 1080p should be possible?
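|
| A quick sketch of that scaling math (the 4x figure is Apple's
| marketing claim; real games won't scale linearly with GPU
| cores):
|
|     // gpu_scaling.swift -- rough extrapolation, not a benchmark
|     let m1Fps = 15.0       // observed: Ultra at 1440x900 on the base M1
|     let gpuScale = 4.0     // claimed M1 Max vs M1 GPU ratio
|     let pixelScale = (1920.0 * 1080.0) / (1440.0 * 900.0)  // ~1.6x the pixels
|     let estimatedFps = m1Fps * gpuScale / pixelScale
|     print("Naive M1 Max estimate at 1080p Ultra: \(estimatedFps) fps")  // ~37 fps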
| _mitchie wrote:
| This is not the M1. You can't configure a MacBook Pro 16
| inch (stated in the tweet) with anything other than an M1
| Pro / M1 Max, or M2 Pro / M2 Max on the latest models.
| 58028641 wrote:
| This is an M1 with 16 GB of RAM, not a 16 inch.
| _mitchie wrote:
| You're right, my apologies I misread the 16 GB as 16
| inch!
| [deleted]
| mlindner wrote:
| It's being run through several layers of emulation. Of course
| it's going to be slow.
| doodlesdev wrote:
| Wine + DXVK disagrees. Graphics API emulation doesn't
| necessarily provide worse performance, in fact DXVK often
| wins against raw DirectX 9/10 and sometimes even DirectX
| 11. VKD3D performance is pretty awesome.
|
| Not sure how much of the bottleneck here is because of
| Rosetta (i.e. CPU-bound), though I suspect not much really.
| dwaite wrote:
| Unless it is running a JIT internally as part of the game
| engine, Rosetta should take the whole executable and
| rebuild it ahead-of-time.
| astrange wrote:
| Rosetta can only do that for an x86 macOS binary. Once it
| goes through WINE it's all JIT. Though I think it should
| get cached after a while.
| Hamuko wrote:
| What, in playing Windows games with translated DirectX and
| amd64 calls?
| whywhywhywhy wrote:
| They did, although the graph cuts off just before the 3090
| takes the lead and goes beyond.
|
| https://cdn.videocardz.com/1/2022/03/M1-vs-3090.jpg
|
| Frustrating that they're being this misleading when the M1 is
| outstanding for its own reasons, but the 3090 eats it alive in
| the workflows it excels in too.
|
| Perfect machine would be both those chips in the same box
| tbh.
| disnaturally wrote:
| [dead]
| mrguyorama wrote:
| They've been straight up lying about the M series chips'
| performance from day one. They show insane graphs of
| the M chips beating top-end desktop parts, with an asterisk
| that explains the very specific BS benchmark they used that
| clearly favors their chip but won't generalize, and then
| public benchmarks never even come close.
|
| People still parrot it.
| [deleted]
| eliasmacpherson wrote:
| Fantastic would be 25fps, this isn't there yet.
| manuelabeledo wrote:
| The tool is not meant to be used for that. It is _fantastic_ in
| the sense that it was achieved with no changes to the game
| whatsoever.
|
| I'm wondering if this is also translating the binary with
| Rosetta.
| eliasmacpherson wrote:
| Well if there are no changes made to the game whatsoever,
| then it has to be using Rosetta. Fantastic would be DXVK
| approximate levels of performance hit. This is far short of
| that.
| manuelabeledo wrote:
| DXVK does not support DirectX 12, and does only half of the
| job, missing the CPU instruction translations.
| eliasmacpherson wrote:
| I was referring to the performance of DXVK not its
| feature support. Expecting DXVK to do Rosetta itself is
| beyond ridiculous.
| manuelabeledo wrote:
| I agree: comparing Rosetta plus the graphics API
| translation layer, with DXVK, which only accomplishes the
| latter, is ridiculous.
|
| Here is what you said, though:
|
| > Fantastic would be DXVK approximate levels of
| performance hit. This is far short of that.
| eliasmacpherson wrote:
| Rosetta is a known quantity: approximately a 20% drop. DXVK
| can cost about a 20% performance drop in certain
| situations, and perform better than that in others.
|
| This is about a 50% performance drop from translating DX12 to
| Metal, on top of the drop from Rosetta.
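|
| Expressed as compounded multipliers (a rough sketch using the
| estimates above, not measured numbers):
|
|     // translation_overhead.swift
|     let rosettaRetained = 0.8     // ~20% drop from x86 -> ARM translation
|     let d3dToMetalRetained = 0.5  // ~50% drop from the DX12 -> Metal layer
|     let overallRetained = rosettaRetained * d3dToMetalRetained
|     print("Retained vs. native: \(overallRetained * 100)%")  // ~40%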
| manuelabeledo wrote:
| 50% drop of _what_? Where are you getting your baseline
| from? Because, as far as I know, there is no native macOS
| port of Cyberpunk 2077.
|
| At 1440p, and if we take this [0] at face value, it would
| be 50% of the performance of a RX 6700 XT paired with a
| 5950X, both desktop parts, which I think is pretty good.
|
| [0] https://www.digitaltrends.com/computing/cyberpunk-207
| 7-pc-pe...
| eliasmacpherson wrote:
| Please don't use amp links.
|
| The m1 gpu is broadly equivalent to a gtx 1650 in a host
| of benchmarks. This is getting less than half the fps a
| gtx 1650 does at these settings, and I am being
| charitable.
|
| I don't know why you are looking at 1440p (2560 x 1440)
| as the m1 here is running at 1440x900. While I'm there,
| that 6700 XT posts 50fps, half of which is 25fps, which
| would indeed be alright. However this is putting out less
| than 15fps most of the time.
|
| Have a nice day.
| manuelabeledo wrote:
| > I don't know why you are looking at 1440p (2560 x 1440)
| as the m1 here is running at 1440x900.
|
| Ah, you are right, I messed up with the resolution.
|
| > The m1 gpu is broadly equivalent to a gtx 1650 in a
| host of benchmarks.
|
| Which ones?
|
| > This is getting less than half the fps a gtx 1650 does
| at these settings, and I am being charitable.
|
| ... without having to emulate both CPU architecture and
| graphics layer.
|
| I mean, this is not really a debate. DXVK is not
| comparable, Wine does not do the same either. We are
| talking about translating both CPU instructions and
| graphic API calls in real time, good enough that a triple
| A game runs without any modification on a laptop with
| 16GB of shared memory.
| eliasmacpherson wrote:
| Have a fantastic day.
| GeekyBear wrote:
| > I'm wondering if this is also translating the binary with
| Rosetta.
|
| It is.
|
| >Game Porting Toolkit can translate controller inputs, audio
| and graphics APIs, CPU instructions, and other APIs
| automatically.
|
| https://www.digitaltrends.com/computing/apple-enabled-
| thousa...
| AndroTux wrote:
| That is on an M1 laptop. No dedicated GPU, not even an M1 Max.
| Just plain M1. Of course, it's not running super fast on Ultra
| settings, but imagine how slow it would be on a comparable
| Intel laptop with onboard graphics eating through your battery.
| Especially considering it only costs $1,299, which is not a lot
| for this kind of performance. And then it's not even an x86
| CPU, for which Cyberpunk was developed. So yes, it's fantastic.
| madeofpalk wrote:
| No Apple hardware has a "dedicated GPU".
| eliasmacpherson wrote:
| If you want to set the bar that low for fantastic, be my
| guest. FPS per dollar, or any metric I can think of, is lousy.
| You've not convinced me. I was impressed by Rosetta, but not
| this game porting toolkit, for the record.
| WhereIsTheTruth wrote:
| This not only does DX12 -> Metal, it also does X86 -> ARM,
| and still manages to give you decent performance. If you
| lower the graphics settings, you can easily manage 30fps,
| which is enough on a laptop, considering it runs on
| battery.
|
| So for a 2020 laptop chip, it's a pretty great achievement
| I'd say!
|
| I don't know of any project that does X86 -> ARM this well
| eliasmacpherson wrote:
| don't confuse Rosetta with this game porting toolkit.
| cultureswitch wrote:
| 25 FPS wouldn't be any more playable
| m_eiman wrote:
| Bah, when I was young we used to DREAM about 25 FPS!
|
| I have fond(?) memories of playing Doom on a 386 with a
| monochrome passive TFT screen.
| bogwog wrote:
| Look at all the time and effort spent to port Wine when all they
| had to do to win over developers was officially support OpenGL
| and Vulkan.
| veave wrote:
| This is about DirectX 12. What's the support for D3D9 through
| DXVK like for end users?
| atgctg wrote:
| Running at ~40 fps on M2 Max:
|
| https://www.reddit.com/r/macgaming/comments/1435ukq/cyberpun...
| ece wrote:
| So you need to spend $3k to get playable framerates, which are
| possible with a 3060 laptop, maybe even a 3050 Ti, as they can
| do 1080p; this just seems to be 900p.
| o1y32 wrote:
| Actually you can play these games with Xbox Series S at 1440p
| easily. I wonder if people would just simply buy a game
| console (if they don't already have one) for the more
| demanding titles. Most people don't need a processor that is
| nearly as powerful as M1 Max, and I doubt anyone is going to
| spend extra money on a computer just for its GPU that doesn't
| even play games as well as a $300 console.
| ece wrote:
| There are options if you just want to play the game, yes,
| but Apple did the work here, and met developers the wrong
| half way IMO. If you just want to play with higher settings
| and fps on the computer you have, less emulation is better,
| as impressive as it might be. A Vulkan driver would be less
| emulation and more performance all around I think. Also,
| $300 can buy a lot of games if games can be made to run
| well with minimal work.
| Hamuko wrote:
| > _Actually you can play these games with Xbox Series S at
| 1440p easily._
|
| The Xbox Series S version of Cyberpunk runs at 30 FPS with
| a dynamic resolution between 2304x1296 and 2560x1440 on
| quality mode and at 60 FPS with a dynamic resolution
| between 1410x800 and 1920x1080 on performance mode. If you
| were to run it with a fixed resolution of 1440p, then you'd
| definitely not be averaging 30 FPS.
| afavour wrote:
| The point is that you'll be able to play games on the machine
| _you already have_. Yes, a PC with a GPU is going to be better
| and people that are really into gaming will probably always
| opt for that. But there's a big casual market out there too.
| shepherdjerred wrote:
| I'm already spending $3k for my laptop so that I can develop.
|
| This means I won't _also_ have to spend several thousand
| dollars on a gaming PC in addition.
| joeman1000 wrote:
| Yes, but then you have a gaming laptop...
| lizardking wrote:
| The last thing I still use my windows machine for is the
| occasional gaming session. I'd love to be able to be rid of
| it forever. This seems like a positive step in that
| direction.
| GordonS wrote:
| I play Cyberpunk with an aging AMD RX 570, and get a consistent
| 45 FPS running at 3K with high quality settings.
|
| I'm not an Apple fan, but I have to admit that the Apple chip is
| getting incredible frame rates, considering it has no
| discrete GPU.
| akmarinov wrote:
| Nice, imagine all these games on an iPad...
| Cthulhu_ wrote:
| On iPad they would need their controls redone; it's possible,
| but it's an extra step.
|
| I'm also getting the feeling the iPad is quickly falling out
| of favor; I haven't seen one in ages except on my parents'
| dinner table.
| GeekyBear wrote:
| You can pair your XBox or Playstation controller with an
| iPad.
| bouk wrote:
| Also keyboard and mouse!
| reportgunner wrote:
| Those are terrible controls for most games.
| ninkendo wrote:
| You can pair a mouse and keyboard with an iPad too. You
| can even plug them in...
| akmarinov wrote:
| Only for RTS; most other games actually favor the
| controller due to the included aim assist in FPS games,
| for example.
|
| Having to ship on consoles makes pretty much any game
| controller compliant.
| reportgunner wrote:
| If they were a good kind of controls they would not
| require an "assist".
| xrisk wrote:
| Imagine all these games on an Apple TV... I'm assuming at
| some point those will also start using M chips. Excited to
| see if apple can enter the console market.
| willio58 wrote:
| Could you just airplay from the mac to the Apple TV? Not
| sure about input lag in that situation but it'd be
| interesting to see
| fillskills wrote:
| That would be wonderful!
| izacus wrote:
| GeForce Now should allow you to do that right now :)
| akmarinov wrote:
| Yeah, but you need to subscribe to them, you need a stable
| low latency connection, etc.
|
| Flights will be way more fun when you can just pop open your
| iPad (or Vision Pro) and not have to also bring along your
| Switch or Steam Deck.
| christoph wrote:
| This is total speculation on my part, but I do wonder why Apple
| have suddenly got this out the door to developers right at the
| same moment their headset is announced. They've never seemed very
| interested in this section of the games market before.
|
| I wonder if this could be the first building block of allowing
| existing modern 3D games to play in some kind of new semi
| immersive way inside. I'm imagining playing an FPS on a huge wrap
| around screen with some adjustable depth perception. That could
| potentially open up a huge market.
| akmarinov wrote:
| VR's biggest use case is gaming and Apple is so far away from
| gaming, they're not in the same universe.
|
| If they want people to game on their headset - they need devs
| to port their games over.
| fnordpiglet wrote:
| I think that, more than gaming, porn is the current killer use
| of 360 VR tech. More on that later.
|
| I've always believed VR's biggest use case in the end is work,
| but resolution and integration with the surrounding environment
| have been lacking. Having a 360 3D environment to work in,
| assuming resolution is high enough to read text, opens a lot
| of possibilities up. I've done a fair amount of POC work with
| various devices over the years and I stand by that. The
| gesture recognition of the new Apple devices leads more in
| that direction IMO. For instance, it should be able to key
| (harhar) off a virtual keyboard being typed on. (Haptics will
| be an issue!)
|
| I'd note also that VR has been successful in high end
| manufacturing design as well for these reasons.
|
| I think gaming is what proves a tech and motivates people to
| engage. On the shadier side, porn even more so. But these
| technologies wend their ways into all aspects of our lives
| after being proven out in the game / porn use cases.
| boringg wrote:
| This is an actually interesting use of the headset. The price
| tag at this stage is too high for most users -- but imagine it
| dropped down to something reasonable. Throw in your favorite
| game and the immersive experience would be pretty slick.
|
| You are definitely correct -- they are opening the door for
| gaming on that system, though they are doing it quietly so as
| not to get the competition up in arms.
| fnordpiglet wrote:
| I will wager the headset will drop slightly in price, but new
| SKUs will be offered that cover a price range. Apple products
| have always been priced too high for most users - yet they seem
| fairly successful as a company by chasing the top half of the
| market only. I don't see them ever making burner VR headsets; I
| think they'll stick to the ultra high end, with some high- to
| mid-market lower ends that eschew things like showing your
| eyes etc.
| [deleted]
| dehrmann wrote:
| There are rumors floating around that macs are dropping support
| for discrete GPUs.
| mywittyname wrote:
| Maybe Apple has set their sights on Nvidia's AI market.
| DCKing wrote:
| From what I gather, Game Porting Toolkit is two things - a fork
| of Wine, and their own adaptation/extension of open source
| projects for DirectX compatibility. Some observations:
|
| Windows compatibility:
|
| * Windows platform compatibility is achieved with a slightly
| patched version of Codeweavers' Wine fork - virtually all of the
| hard work of Windows compatibility has been achieved by the Wine
| developers and Codeweavers and was already available open source
| before today.
|
| * Interestingly, Apple is distributing this thing as a Homebrew
| formula. Has Apple ever done this before? [1]
|
| DirectX compatibility:
|
| * In addition, outside of the core Wine based stuff, there's a
| framework called D3DMetal.framework. This is a DirectX 9-12
| compatibility layer akin to DXVK used in Proton as a
| compatibility layer from DirectX 9-11 to _Vulkan_. This is what
| seems to be the game changer here compared to before. Before
| today, running DirectX on macOS was possible but lost a lot of
| performance and compatibility, needing to go through Apple's old
| OpenGL support or a third party Vulkan intermediary layer in
| MoltenVK. This is direct (heh) first party Direct3D to Metal
| translation.
|
| * Actually, it's more than "akin to DXVK". The D3DMetal.framework
| contains copyright attribution to DXVK as required under their
| MIT license. It's quite likely Apple ported a lot of DXVK to
| Metal. It's worth noting that DXVK itself doesn't support
| Direct3D 12 though, Proton uses another LGPL2.1 licensed library
| called VKD3D for that.
|
| * However, D3DMetal.framework is very much not open source
| itself. Its license is actually very restrictive, seemingly only
| permitting use for game development/QA use cases. [2]
|
| * The restrictive license seems to make it harder for someone
| like Valve to use this akin to how they use Proton on Linux in a
| sanctioned way. Apple seems intent on preventing developers from
| dumping their games on macOS with just their compatibility layer.
| It definitely won't stop hobbyists making better tools to
| continue running Windows games on Macs, though.
|
| * The fact that D3DMetal.framework appears to support DirectX 9
| and 10 is interesting. No new commercial efforts use those
| anymore, so that's just there for what? Allowing homebrewers to
| run their 00s era Windows games?
|
| [1]: https://github.com/apple/homebrew-apple/tree/main/Formula
|
| [2]: It contains a license with the following language - "you are
| granted a limited, non-exclusive, non-transferable, personal
| copyright license to (i) install, internally use, and test the
| Apple Software for the sole purpose of developing, testing, or
| evaluating video games for use on Apple-branded products".
| brucethemoose2 wrote:
| > The fact that D3DMetal.framework appears to support DirectX 9
| and 10 is interesting.
|
| Lots of popular games are DX9 or 10. Rimworld, for instance, is
| high in the Steam charts and is DX9.
| mort96 wrote:
| It's sad that D3DMetal isn't open source. Worth noting that
| this stuff is exactly why people should strongly consider using
| a copyleft license. It's pretty annoying that nobody can build
| on top of Apple's work here or incorporate any improvements
| into DXVK, but that's the license the DXVK project chose.
| scns wrote:
| Well, I'm pro copyleft, but in this case I have to disagree.
| This way Apple could develop D3DMetal much faster, and
| attribution is given. If DXVK were GPLv3 it might have
| happened years later, if ever.
| PinkiesBrain wrote:
| So now they are propping up an ecosystem in which open
| computing will always be a second-class citizen at best. I
| wonder if they are all happy about it in retrospect: Wine
| got patches, while DXVK gets to be a brick in the wall of
| Apple's garden (if Valve can't distribute it, it's useless
| to them in the grand scheme of things; normal people want a
| one-click install).
|
| Monetarily good for the devs who ended up on Apple payroll,
| another nail in the coffin for competition and open
| computing at the same time.
| mort96 wrote:
| Huh, why are you saying Apple could develop D3DMetal much
| faster because DXVK was MIT-licensed? Literally nothing
| would've changed up until now if DXVK had been GPL-
| licensed, the only difference is that now that they've made
| it publicly available, they'd have had to license D3DMetal
| under a GPL-compatible license. Where does the extra delay
| come in?
| PinkiesBrain wrote:
| They can weaponize it against Valve. Helping displace
| steam for relatively little effort bumped it up in
| priority.
| mort96 wrote:
| Is it faster to build it to weaponize it against Valve
| than it is to build it without the weaponization?
| PinkiesBrain wrote:
| There's only so much developer time and management focus.
| "This will help us cement our monopoly on software
| distribution without being too obvious to regulators" is
| a good pitch.
|
| PS. I'm sure the people who pitched it and their managers
| are all sufficiently skilled at lying and lying to
| themselves to not put it or even think in those terms.
| xign wrote:
| You can ship macOS-native games on Steam as well. People
| need to stop throwing around conspiracy theories. Their main
| motivation is to make sure games are native to macOS so
| they can take advantage of system-native features, which
| Win32-translated games won't. Otherwise games running on
| Macs will always be kind of janky and run slower than on
| Windows.
| DCKing wrote:
| I don't care much for it being open source or not, it's just
| a shame Apple is making attempts to limit its use cases
| through licensing [1]. It's on brand for Apple and totally
| expected, but still sad. It's not going to stop a bunch of
| homebrew efforts springing up around it, but it will likely
| be enough to stop third party app stores like Steam from
| opening up a huge game library on the Mac in a
| straightforward way like what happened with Proton.
|
| [1] The license is probably actually meaningless in a lot of
| jurisdictions, but it still has a chilling effect for
| commercial parties using it. That's probably exactly what
| Apple intends.
| GeekyBear wrote:
| > Windows platform compatibility is achieved with a slightly
| patched version of Codeweavers' Wine fork
|
| If by "slightly patched" you mean "DirectX 12 support on Apple
| Silicon was added, then sure.
|
| Codeweavers recently announced their own effort to support
| DirectX 12, but so far only one game, Diablo II Resurrected,
| works with it.
| DCKing wrote:
| No, I mean slightly patched. The list of patches is included
| in the Homebrew formula [1].
|
| The D3DMetal framework and its DirectX 12 support are
| independent of that Wine work.
|
| [1]: https://raw.githubusercontent.com/apple/homebrew-
| apple/main/...
| GeekyBear wrote:
| If you ignore the part where they added DirectX 12 support,
| it's only slightly patched?
|
| You do you.
| circuit10 wrote:
| They said that's independent of the Wine fork
| GravityLabs wrote:
| This is my daily driver device and Cyberpunk 2077 is one of the
| best video games I've ever played, so this has me very excited.
| Would love to take a break from code to play a few minutes of
| Cyberpunk 2077 on the same device I daily drive...
| whizzter wrote:
| This actually makes me wonder if I should sell my Apple stock:
| that they even put out something like this (even if only as an
| "evaluation tool" and based on CrossOver's GPL code) screams
| validation that there are enough developers ignoring Apple/Metal
| that they're actually starting to hurt from a lack of titles.
|
| Sure, everyone doing Unity or Unreal will probably have the
| middleware take care of the biggest differences, and Vulkan being
| too verbose has held the field back, but given a choice Metal
| will still be an afterthought for those making custom engines.
|
| One could hope for updated OpenGL/Vulkan support, but nobody is
| holding their breath anymore.
| [deleted]
| metmac wrote:
| I think this response lacks the context of the current state of
| the video game industry. In my mind this is no different than
| nVidia providing white-glove engineering support and
| optimizations for titles to run on their GPUs.
|
| There is a reason nVidia has generally won the consumer GPU
| arms race against AMD over the years. They go out of their way
| to support studios and titles, compared to what AMD does imho.
|
| It shows that Apple recognizes it needs to lower the barrier to
| entry for studios to consider targeting Macs. Compared to the
| larger gaming market, most studios have very little incentive
| to build and maintain this, because Macs could account for only
| 1-3% of your player base. Is that worth 1-5 ICs' worth of time
| if you are a small studio? Probably not.
| whizzter wrote:
| As for AMD vs Nvidia, imho it's more often a matter of AMD
| drivers being buggy even if their hardware seems to be almost
| on par in terms of performance; also, right now Nvidia has
| ridden high on both the blockchain and now AI trends thanks to
| CUDA being proprietary.
|
| And the third paragraph is my point: porting would've been
| trivial instead of requiring another engineer if they had
| supported more standard APIs, but they refused, and now
| they're hedging their bets on another manufacturer's API. It's
| just such a roundabout way of saying that they feel the pain
| without going back and fixing the basics.
| goosedragons wrote:
| They also need to focus more on backwards compatibility. If
| Apple continually makes most of my library unplayable, as they
| have done repeatedly (Classic, Rosetta 1, 32-bit), then I
| wouldn't trust them with gaming. It seems that, moving forward,
| PlayStation and Xbox are making it a priority; it already was
| on PC, and I don't think Apple will ever have anything with
| the draw of Nintendo that could surmount that.
| sylens wrote:
| This is a huge draw in PC gaming. People love going back to
| old games and modding them or updating them to run better.
| Heck, didn't Portal (a 2007 game) just get an RTX update?
| yurishimo wrote:
| You're gonna sell your stocks in the most successful tech
| company of all time because they don't have enough video games?
| I think Apple has proven they can grow a company's value
| without games. But what we all want is a larger investment in
| games so the Mac becomes a competitive platform again.
|
| Not to mention the headset... we'll see, but there is no reason
| to expect it won't be an insane commercial success by gen 3.
| AmericanChopper wrote:
| > But what we all want is a larger investment in games so the
| Mac becomes a competitive platform again.
|
| As a Mac-only user, I really don't care about games. I guess
| it's cool if they work on it, but I wouldn't want Apple
| diverting resources away from other areas to try and make it
| a gaming device.
| yurishimo wrote:
| I don't think it's about diverting resources. Apple has the
| funds to buy more developers to focus on gaming.
| AmericanChopper wrote:
| Yeah probably. My main point was mostly that it's a bit
| of an overstatement to describe investment in gaming on
| Mac as something we all want.
| whizzter wrote:
| That's why I have stock now (I bought it just before the
| first post-M1 Christmas report because all I could hear about
| was them hitting it out of the park yet the stock had hardly
| moved, so I knew it to be undervalued), but past performance
| never guarantees future performance.
|
| Games are a canary. Apple likes to lock in developers with
| their own APIs, and that works as long as you are the clear
| leader, but once you're not, it'll bite you. Apple's mobile
| leadership since the early iPhones attracted developers, but
| once the mobile market stagnated, developers have looked
| elsewhere.
|
| And the headset: I have a hard time seeing people who buy
| Apple laptops etc. for their aesthetics putting on these
| ski goggles on a daily basis. And that's not even mentioning
| the price point.
|
| Apple had a runway after Jobs left the first time before
| things got bad. Sure, they know their history now and will do
| their darnedest not to repeat it, but while Ive wasn't the
| right person to build professional machines, I can't help but
| wonder if he was still the type of person Apple needed for
| "personal devices" like the headset.
| the_gipsy wrote:
| Right now, the headset exclusively appeals to shareholders
| and apple cultists.
| robenkleene wrote:
| Just like the first iPhone...
| valzam wrote:
| Apple has video games; there are probably more people playing
| games on iPhone than on PS5 and Xbox combined. Many very popular
| MOBA/esports titles also work on Mac (LoL, all things Blizzard,
| CS:GO). The fact that there aren't many AAA games (that you can
| play on consoles anyway, not your laptop) is not a problem for
| Apple.
| maleldil wrote:
| > all things blizzard
|
| Recent Blizzard games (Diablo 2 Resurrected, Diablo 4)
| require DirectX 12 and aren't natively available for macOS.
| shoo_pl wrote:
| Overwatch before that too.
|
| It's surprising how, on one hand, a huge effort is put into
| keeping World of Warcraft running smoothly on macOS (they
| always implement all the new Metal features with every
| expansion, and it was the first native Apple Silicon game),
| but on the other they've completely abandoned macOS for new
| titles.
| oefrha wrote:
| Blizzard since mid to late 2010s is basically a different
| company from the one that pumped out Warcraft, StarCraft
| and Diablo. The old company's dead.
| maleldil wrote:
| I think you misunderstood your parent's comment.
|
| Blizzard _today_ continues to maintain World of Warcraft's
| macOS build. The same Blizzard is releasing all new
| titles as Windows-only.
|
| This difference between WoW and the rest is what
| puzzles your parent.
| whizzter wrote:
| Right, the point is that it isn't the same kind of games that
| you play on mobile that interest people on a "computer".
|
| Continuing from that, the mere _existence_ of this _porting
| toolkit_ is an indication that they do feel the lack of AAA
| is a problem at Apple right now; otherwise they would've
| continued their "Metal is the future and all other APIs are
| deprecated" stance.
|
| They want to tell the world that the M1/M2/etc family is the
| best chipset in the world (or at least in contention).
| Showing games like Cyberpunk running well enough on a
| basically cold laptop would've been a coup, but games didn't
| support it because even larger developers didn't feel like
| putting in the effort.
| reportgunner wrote:
| > _Moba /eSports titles also work on Mac (LoL, all things
| blizzard, cs:go)_
|
| That's like... 10 games?
| Animats wrote:
| DirectX 12 support, but not Vulkan?
| [deleted]
| AltruisticGapHN wrote:
| Man, if Apple somehow could have their own Windows emulation layer
| similar to Valve's Steam Deck developments - I could see myself
| returning to the Mac. It would be so amazing. The Mac would
| truly become a gaming platform, regardless of whether it costs
| twice the price of an equivalent PC - since people who appreciate
| Apple's hardware and software will see its value.
|
| Having said that, there's still the issue of upgrades. But then
| again, I have to replace pretty much my entire PC every 5 years
| (new CPU needs a new mobo, needs new RAM etc). So hmm.
|
| edit: as an aside, an interesting question... I wonder if Valve's
| efforts, having incentivized developers to make their games more
| Steam Deck compatible - and hence more "predictable" in terms of
| how they access Windows APIs - would also make it easier for
| Apple to translate the games - even if the target is different.
| Hamuko wrote:
| I mean, this is a Windows emulation layer. It's just designed
| for game developers to be able to quickly see what a rough
| version of their macOS port would look like. Like, does
| translating your game automatically yield you 40 FPS, or 4 FPS?
| Valve's emulation layer on the other hand is designed for end
| users to be able to run games without a native port.
| scotty79 wrote:
| Yeah, but how can you stick an RTX 3090 in a Mac?
| ece wrote:
| In the new Mac pro..
| o1y32 wrote:
| I mean, nobody is going to get a Mac Pro just for games
| even if it works...
| belthesar wrote:
| The new Mac Pro that, despite having PCI Express slots,
| can't be used to add discrete GPUs to the system for
| graphics tasks.
| Hamuko wrote:
| There haven't been Nvidia drivers for macOS for years.
| Maybe since like the GTX 900 series?
| ece wrote:
| Yeah, Apple could build their own NVK-like driver for Metal
| at this point if they cared about any GPU other than their
| own. NVK/Mesa itself might enable Linux gaming on Mac Pros
| with 3rd party cards... one day.
| golergka wrote:
| When people discuss Apple's efforts in gaming, they tend to
| forget that this company owns the second largest game platform
| in the world, after Android. And with Vision, they could easily
| become the largest VR gaming platform in the world, even though
| this might not be the immediate focus upon release -- but the
| iPhone didn't have any games on release either.
|
| Mentioning the partnership with Unity in the Vision announcement
| presentation is very significant. It is the most used game engine
| in the world (especially if you count by game installations),
| which dominates the mobile market, and it has been developing VR/AR
| capabilities for many years now, even though it arguably doesn't
| have the same AAA graphics as Unreal.
| nhggfu wrote:
| Why does it look laggy and verging on unplayable, I wonder?
| dagmx wrote:
| Because they're running it at Ultra on the lowest-end hardware
| configuration, with an x86_64 to arm64 translation, a shader
| translation, and a Windows API wrapper.
|
| For comparison, the M1 was routinely benchmarked in NVIDIA 1050
| Ti to 1070 territory depending on the benchmark. This is only a
| few fps behind native.
| jacooper wrote:
| Have people here not tried Proton and DXVK? This is not
| impressive at all
| izacus wrote:
| So... why is this not an end-user product like Proton on SteamOS?
|
| Is this actually going to do anything for Mac gaming considering
| it's only meant for game developers for... testing? What kind of
| workflow is Apple envisioning here?
| Hamuko wrote:
| As far as I can tell (as not a game developer) from the
| introduction videos, the game porting toolkit / Wine is
| designed to shorten the time it takes to get a first macOS
| version of your game running and to evaluate how well it
| runs. A (near) zero-effort rough draft of your potential macOS
| port.
|
| They show that previously you'd have to take the Windows
| version, port the source code, port the HLSL shaders, implement
| graphics/audio/input/HDR and then you'd be able to get a first
| running version of your game that you could evaluate. And now
| with the Wine tool, you'd take the Windows version, run it
| through the Wine tool, and then you'd have the first running
| version of the game to evaluate. And you can then evaluate
| which parts of the game run well and which don't, whether it's
| utilising the GPU properly, and so on.
|
| And if you decide that there's potential in the macOS version,
| you'd then start doing all of the stuff that was previously
| required to get the first running version.
| Version467 wrote:
| This is the entry level M1. 15fps at 1400x900 isn't great (or
| even playable), but it's very impressive that it runs at all
| without any changes, let alone on ultra settings.
| _mitchie wrote:
| This is not the M1. You can't configure a MacBook Pro 16 inch
| (stated in the tweet) with anything other than an M1 Pro / M1
| Max, or M2 Pro / M2 Max on the latest models.
| dagmx wrote:
| It's the 13" MBP. If you look at the metal overlay in the
| corner it shows the product name of the processor, and in
| this case, it's the base M1 with 16GB of memory.
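|
| For anyone curious, that overlay is the Metal performance HUD.
| One way to get it in a Metal app you build yourself is to set
| the MTL_HUD_ENABLED environment variable before the Metal
| device is created (a minimal sketch; it can also be toggled
| from an Xcode scheme, or exported in the shell when launching
| someone else's binary):
|
|     import Foundation
|     import Metal
|
|     // Enable the Metal performance HUD for this process. This has to
|     // happen before the Metal device is created.
|     setenv("MTL_HUD_ENABLED", "1", 1)
|
|     guard let device = MTLCreateSystemDefaultDevice() else {
|         fatalError("No Metal device available")
|     }
|     print("Running on \(device.name)")  // e.g. "Apple M1"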
| Version467 wrote:
| Huh where does it say that? It states 16GB of RAM, not screen
| size.
| _mitchie wrote:
| You're right, my apologies I misread the 16 GB as 16 inch!
| jlokier wrote:
| The tweet says "M1 MBP". That is short for Macbook Pro and
| means the CPU is an M1 Pro or M1 Max. Not the entry level
| M1.
| hoorible wrote:
| The 13" M1 MacBook Pro does not have the pro or Mac chip.
| Not sure that's what is being used here, but your
| assumption that this combination isn't possible is false.
| Version467 wrote:
| The 13" M1 Macbook Pro is a thing. You cannot infer M1
| Pro/Max just from that.
| heliophobicdude wrote:
| > This is the entry level M1. 15fps at 1400x900 isn't great (or
| even playable), but it's very impressive that it runs at all
| without any changes, let alone on ultra settings.
|
| And running non-natively through Codeweavers patch and through
| Rosetta.
| quitit wrote:
| The people shaming this couldn't have any idea what they're
| looking at.
|
| It runs bad! I could spend less on a PC and get a better
| frame rate!
|
| Of course it runs bad, it's miracle-like that it runs at all
| or has a frame rate, let alone a rate that is based on high
| performance settings.
|
| The entire thing is hacked together through tenuous layer
| upon layer of emulation - no part of this is designed for the
| hardware it's running on. It should not be this fast, it
| should be spitting out a low-res, low-detail frame once every
| 30 seconds, if at all.
| Applejinx wrote:
| Yes, thank you. I mean, let's have a bit of context.
|
| I'm Mac based (well, my computing is, I'm still biological ;) )
| and I took an interest in what's happening in the AI space.
| Knowing that RAM and CPU would be bottlenecks but that there
| would be open source ways to run stuff, I got a Mac Studio,
| which is an M1 Ultra with 128G of RAM, which also serves as GPU
| RAM because of the M1's unified memory architecture.
|
| So I'm able to do AI stuff up to the heavy-lifting stuff (still
| steadily downloading LLaMA data on my old DSL, but I've got
| everything but 65B and have run 7B, 13B and 30B without issue,
| and of course I'm having no trouble running Stable Diffusion
| stuff)
|
| If a basic laptop with just an M1 and laptop amounts of RAM can
| do 15 fps at 1400x900, my 20-core M1 Ultra with 128G of RAM
| ought to do at least that at 3840x1600 on my big ol' curved
| monitor. And at that point it starts acting like the PC gamer
| experience, except at no point did I set out to make a gamer-
| specific system. The whole thing was put together to serve very
| much other purposes, and retains usefulness for those purposes.
|
| I'm sure I can tolerate pleb tier 30 fps or not being on Ultra
| settings or something, to get performance that's very similar
| to decent gaming rigs on games designed for Windows PCs, on a
| machine I got to serve entirely other purposes.
| lamontcg wrote:
| A lot of us with the financial means to buy a Mac Studio are
| also old and are "PatientGamers" and fire up games like
| Skyrim from time-to-time and don't chase after the latest AAA
| titles. Even 10-15 years ago I used to have a $200 limit on
| graphics cards and roughly $100 limit on CPUs and would wait
| to play games on highest resolution until it was a few years
| later and I went through a hardware refresh cycle, always
| staying about 2 years behind. I've never bought a $1000 GPU
| for a gaming rig.
|
| At the same time I find this interesting that Apple is
| clearly starting to notice the PC gaming market. Makes me
| wonder if they're going to start a more serious push soon,
| and if we might see some changes and repair the relationship
| that Apple has with AAA studios.
|
| [Edit: And another thought is that with virtual iPhone
| saturation in the US market, Apple may be facing the economic
| reality that in order to grow they need to enter into other
| markets like gaming, and this may as well be a necessary part
| of the long-term Vision Pro strategy]
| yieldcrv wrote:
| nice proof of concept at the highest settings; gamers only
| care about 60fps minimum, so someone should probably find/post
| a tweet showing those settings
|
| then gamers will move the goalposts to the cost of the machine,
| but that's okay
| yreg wrote:
| As John Siracusa discussed on this week's ATP[0], it's incredible
| how much effort Apple puts into this considering the result.
| Apple built its own little parallel gaming stack world that
| works really well on their hardware, and the hardware is also
| amazing for the power envelopes it is wrapped in.
|
| But then Apple doesn't ship devices with actually powerful GPUs,
| so it can never compete with the gaming PCs, which are far less
| expensive and far more powerful graphics-wise. And Apple also
| doesn't know how to keep relationships with the AAA developers,
| unlike Microsoft and other platform owners.
|
| Like how does all this Metal, compile-your-shaders, port-your-
| games stuff even get budgeted, when it's eventually dead on
| arrival?
|
| I think the on stage demo of the 4 year old Death Stranding
| running poorly on the newest Macs says it all.
|
| [0] https://atp.fm/538 @1:42:20
| clint wrote:
| The high-end gaming market is infinitesimal when compared to
| the casual gaming market. Apple knows this, and they are
| perfectly positioned for high-quality casual games.
| senttoschool wrote:
| > _Like how does all this Metal, compile-your-shaders, port-
| your-games stuff even get budgeted, when it's eventually dead
| on arrival?_
|
| iOS gaming is the biggest and most profitable in the world. In
| fact, iOS is bigger than PS5, Xbox, and PC gaming.
|
| iOS uses Metal. Apple Silicon is mostly just a scaled up iPhone
| SoC. Macs basically get most things funded by iOS.
|
| The Game Porting Toolkit is the first Mac-only gaming tool
| Apple made in a long time. It shows that Apple wants AAA games
| on Macs.
|
| > _But then Apple doesn 't ship devices with actually powerful
| GPUs, so it can never compete with the gaming PCs which are far
| less expensive and far more powerfull graphics-wise._
|
| But Apple does ship powerful GPUs. In fact, the M2 Max is
| probably the most or second most powerful GPU on laptops. But
| games aren't optimized for Metal nor ARM, so they run slower
| than Nvidia laptop GPUs.
| throwaway2990 wrote:
| > iOS gaming is the biggest and most profitable in the world.
| In fact, iOS is bigger than PS5, Xbox, and PC gaming.
|
| Cos of microtransactions. Apple doesn't stand to profit from
| other people's games like Diablo 4, as it's not the gatekeeper
| for those transactions.
| pongo1231 wrote:
| > But Apple does ship powerful GPUs. In fact, the M2 Max is
| probably the most or second most powerful GPU on laptops. But
| games aren't optimized for Metal nor ARM, so they run slower
| than Nvidia laptop GPUs.
|
| The most powerful iGPU likely. It still won't come anywhere
| close to high-end dedicated laptop GPUs in the vast majority
| of benchmarks regardless of how much optimization you throw
| at it - that's just falling for marketing / hype.
| aurareturn wrote:
| > _The most powerful iGPU likely. It still won't come
| anywhere close to high-end dedicated laptop GPUs in the
| vast majority of benchmarks regardless of how much
| optimization you throw at it - that's just falling for
| marketing / hype._
|
| In applications that actually use Metal natively, Apple
| Silicon GPUs do compare favorably to Nvidia laptop GPUs
| while using drastically less power.
|
| So no. It isn't just hype/marketing.
|
| Even if you look at the raw technical specs of the M2 Max
| GPU, it's comparable to Nvidia laptop GPUs - with the
| exception of ray tracing.
| dahauns wrote:
| >In applications that actually use Metal natively, Apple
| Silicon GPUs do compare favorably to Nvidia laptop GPUs
| while using drastically less power.
|
| Is that really the case? I'm not being facetious here,
| I'd really like to see more useful datapoints. There
| aren't many benchmark comparisons out there that strive
| for actual useful comparison, especially outside
| synthetic stuff with questionable applicability like
| 3dMark.
|
| And for the "drastically less power" claim...it really
| doesn't help that most benchmarks are with decked-out
| "Gamer" machines using the highest available TDP
| configuration, despite most GPUs having their sweet spot
| significantly below - especially Ada Lovelace seems to
| scale down really well (from what I've gathered, still
| 60-70% performance at 60W compared to 150W with 4080
| Mobile, for example).
|
| >Even if you look at the raw technical specs of the M2
| Max GPU, it's comparable to Nvidia laptop GPUs - with the
| exception of ray tracing.
|
| The specs put it roughly between GA106-GA104/AD107-AD106
| respectively, and I'd expect it to land there in the
| general, adequately optimized case.
| mrtranscendence wrote:
| > Is that really the case? I'm not being facetious here,
| I'd really like to see more useful datapoints. There
| aren't many benchmark comparisons out there that strive
| for actual useful comparison, especially outside
| synthetic stuff with questionable applicability like
| 3dMark.
|
| Yeah, on synthetic benchmarks a higher-end MacBook Pro
| GPU compares favorably with recent Nvidia laptop cards
| (say, 4070 or so). But in games that drops off
| dramatically ... more like a 3050 or 3060 at best.
| yreg wrote:
| M2 Max is very impressive given its power consumption, but
| it's not powerful as in RTX 30 or 40 powerful.
| SleepyMyroslav wrote:
| > It shows that Apple wants AAA games on Macs.
|
| Why do you think that? I am asking as someone who is working
| in gamedev.
|
| The game industry got burned by some big company recently when
| multi-year efforts were spent porting 3D stacks and the
| target platform got axed.
| diegof79 wrote:
| Some hypothetical reasons:
|
| 1. Playing games is one of the reasons to prefer a Windows
| laptop instead of a Mac.
|
| 2. The AR/VR headsets will need games to be successful, so
| they need to be more attractive to the gaming industry.
|
| 3. iPads are more powerful now. They support game controllers
| very well, and you can plug them into a big screen to play.
| However, most of the iPad games (except Divinity Original Sin
| 2) are scaled versions of iPhone games... game studios are not
| interested in porting games to the iPad.
|
| While the porting kit announcement is about Macs, I think
| that the strategy is to make the whole Apple Silicon
| platform attractive for gaming.
| aurareturn wrote:
| > _Why do you think that? I am asking as someone who is
| working in gamedev._
|
| They built Game Porting Toolkit.
| SleepyMyroslav wrote:
| The toolkit is not part of the Apple platform, apparently. It
| is a development tool that can be used to evaluate porting,
| no more, no less. I see no changes in the platform itself.
| dwaite wrote:
| Including things in the platform is a double-edged sword;
| the platform has a different compatibility policy from
| Windows, and old games typically do not get updated with
| mandated platform changes (such as a requirement for
| 64-bit).
|
| Swift developers were upset by concurrency because the
| language's deep integration with the platform meant the
| best features were (for a while) only accessible for apps
| targeting the latest platform versions exclusively.
| ohgodplsno wrote:
| Microtransaction-ridden mobile gacha games where Apple takes
| 30% aren't exactly the target when you translate D3D12 to
| Metal for your desktop platform where Apple takes nothing. In
| practice, it is doomed to be a solution that plays the games
| you bought on Steam 5 years ago, or runs games like Cyberpunk
| at medium settings at 900p on your multi-thousand-dollar
| machine.
|
| Apple gaming will not take off until game devs target Apple's
| devices, and Apple burned those bridges a long time ago.
| musicale wrote:
| > Like how does all this Metal, compile-your-shaders, port-
| your-games stuff even get budgeted, when it's eventually dead
| on arrival?
|
| I imagine Apple may be testing the waters. They're a trillion-
| dollar company, with many customers who play games, and they
| want to see if they can expand Mac gaming beyond iOS ports.
|
| Consider that there are a large number of games which could run
| fine on this sort of technology (including much of my Steam
| game library that currently only runs on Windows) as well as
| existing macOS ports whose performance can be improved.
|
| For example, Final Fantasy XIV already runs on a CrossOver/Wine-
| type middleware layer, but at about half the frame rate of the
| Windows version on comparable hardware. Metal conversion would
| likely boost the frame rate considerably, improving the
| experience for FFXIV players on macOS. If it helps Square Enix
| deliver a better FFXIV experience on Mac, perhaps they will be
| more likely to consider Mac ports of some newer games that are
| currently slated for Windows or consoles.
|
| Moreover it's worth noting that Apple did claim that the M1 had
| comparable raw GPU performance to the PS5 (and Macs also use
| fast flash storage like the PS5.) So as the M-series evolves
| (and the PS5 ages) it may become more feasible to port PS5
| games to Mac with decent performance. Solid ports of console
| games could greatly improve the Mac gaming landscape.
|
| Also we don't know Apple's product plans. It's likely that they
| have some GPU improvements in the works. Unified memory
| architecture may also pay off as more games adopt ray-tracing
| and procedural textures and geometry.
| andsoitis wrote:
| > Like how does all this Metal, compile-your-shaders, port-
| your-games stuff even get budgeted, when it's eventually dead
| on arrival?
|
| Survivorship bias from their "control the whole stack"
| philosophy.
| dmix wrote:
| > But then Apple doesn't ship devices with actually powerful
| GPUs, so it can never compete with the gaming PCs which are far
| less expensive and far more powerful graphics-wise.
|
| It is still expensive to have to use Windows just so you can
| game. Or put all the effort into dual booting Linux.
|
| Most people just use a Macbook and then get an
| Xbox/Ps5/Switch/Quest2.
|
| For games I can't play on those, there's Shadow PC, which
| lets you play any Windows game using better GPUs than I
| could afford (or, more accurately, would care to spend on)
| otherwise.
|
| https://shadow.tech/
| DCKing wrote:
| > Apple built it's own little parallel gaming stack world that
| works really well on their hardware
|
| I think saying "its own parallel gaming stack" probably gives
| them too much credit [1]. Yes - they put in significant effort,
| but their gaming stack is neither "parallel" to the rest of the
| world, nor "their own" in any real sense. They seem to have
| adapted the open source efforts that Codeweavers, Valve, Wine
| and the broader Linux community put into Proton, achieving
| Windows game compatibility on Linux with Vulkan. Adapting that
| for Metal is no small feat [2], but it's an investment a giant
| like Apple can easily make without much risk. Don't forget that
| Wine has always been developed for Linux and macOS (and the
| BSDs) in parallel too - it was right there for the taking.
|
| [1]: I'm assuming this is a quote sourced from someone who just
| isn't aware of all the effort being made on the Linux side of
| things in the last five years.
|
| [2]: At a technical level Metal and Vulkan are actually similar
| enough, but there's just a lot of surface to cover and edge
| cases to get right.
| spookie wrote:
| Wine/Proton is great. I still hit the "oh I need to set this
| env var" from time to time (mostly due to my Nvidia card).
| But, it's fantastic.
|
| When you use Bottles to wrap it for ease of use/setup, you
| catch a glimpse of a future where no program is tied to
| Windows anymore.
|
| Hope the same can happen with MacOS! And vice-versa.
| yreg wrote:
| Metal itself is the parallel stack, no?
|
| Granted, Metal is to a degree useful for the casual iOS games,
| from which alone Apple probably makes more money than anyone
| else in the gaming industry.
|
| >I'm assuming this is a quote sourced from someone who just
| isn't aware of all the effort being made on the Linux side of
| things
|
| The podcaster does acknowledge that as well, I just didn't
| quote that part. The whole "rant" is about 4 minutes long. But
| thanks for providing context, it is important.
| DCKing wrote:
| > Metal itself is the parallel stack, no?
|
| Ah perhaps I interpreted the quote in the wrong way then.
| At first it read to me as if Apple did their own bespoke
| compatibility work, but it's a comment about the mostly
| artificial Apple Silicon GPU / Metal stack that is only
| available on Apple devices.
| cormacrelf wrote:
| Worth noting that Apple poured a lot of resources into
| making WebGPU happen. WebGPU is, in a great many ways,
| Metal but cross-platform. The way this pays off is if
| game developers start targeting WebGPU instead of Vulkan
| or DX12. That could happen since WebGPU is meant to be
| a lot easier to code against than Vulkan. This effort to
| port DX12 can probably be seen as more of a hedge than
| anything else. They know that some publishers will stick
| to what they know for some time, but they want to make it
| easier for publishers to see the upside of a cross-platform
| investment by delivering an easier win. If
| it doesn't work perfectly but gets close, that still
| helps them a lot. Because Metal is no longer some
| parallel stack they're promoting and wanting people to
| build Apple-exclusive games for, it's a means to an end,
| and the end is WebGPU and cross-platform.
| nightski wrote:
| Game developers are not going to start mass adopting a
| JavaScript API.
| vore wrote:
| WebGPU is not JavaScript only:
| https://eliemichel.github.io/LearnWebGPU/
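|
| For illustration, a rough native (non-browser) sketch using
| the Rust `wgpu` crate - the crate names and call signatures
| here follow recent wgpu releases and are assumptions that may
| shift between versions, so treat it as a sketch rather than
| canonical API usage:
|
|     // Ask WebGPU's native API for a GPU adapter; no browser
|     // or JavaScript involved. pollster::block_on just waits
|     // on the async request.
|     fn main() {
|         let instance = wgpu::Instance::default();
|         let opts = wgpu::RequestAdapterOptions::default();
|         let adapter = pollster::block_on(instance.request_adapter(&opts))
|             .expect("no suitable GPU adapter found");
|         // On macOS the reported backend is Metal; on Windows
|         // or Linux it is typically DX12 or Vulkan - same
|         // application code either way.
|         println!("{:?}", adapter.get_info());
|     }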
| astrange wrote:
| Does it really work well for other use cases? Obviously
| you can call it, but typically a web API has many more
| security issues to handle than a native API, so I'd
| expect there to be a lot of compromises a game developer
| wouldn't want to deal with.
| chrisco255 wrote:
| Personally, I appreciate this effort, not for AAA games,
| because I never bought a MacBook expecting it to be able
| to...but for much more casual or lower poly games that are
| available on Steam for Windows users only, because it's too
| hard to port. There's tons of sub-AAA games that could easily
| be ported and enjoyed on a modern M1/M2 Mac.
| ChrisMarshallNY wrote:
| Speaking only for myself, I use Macs to get work done. The
| lack of games is actually a plus, as I can get compulsive
| with games.
|
| It's funny, when people denigrate the worst gaming platform
| in the world as a "toy computer."
|
| People that want to get work done get Macs. As noted
| previously, there's really nothing that can beat a well-built
| PC gaming rig.
|
| I don't really think Apple has ever cared about the Mac as a
| gaming platform, and the low-key hype around these
| technologies shows that. These almost seem like "developer
| 20% projects," compared to the big stuff, like visionOS.
|
| That said, I think that Apple wants iOS/iPadOS/visionOS to be
| gaming platforms, so they do dedicate a lot of resources to
| that.
| musicale wrote:
| > It's funny, when people denigrate the worst gaming
| platform in the world as a "toy computer."
|
| When did toys become a bad thing?
|
| "Worst gaming platform in the world" may be something of an
| exaggeration - Apple Arcade isn't bad, and Macs can also
| run many iPad games.
| ChrisMarshallNY wrote:
| Good point, but many Mac users are not aware of just How.
| Damn. Good. games play on gaming rigs. It's like being in
| a movie.
| musicale wrote:
| Many console gamers aren't aware of how [well] games play
| on PC gaming rigs, or they don't care, and are happy to
| play games on Switch/PS4/PS5/Xbox.
|
| Apple may very well be targeting mobile and console
| gamers rather than PC gamers with gaming rigs.
|
| Personally I would definitely enjoy playing on a high-end
| gaming PC (especially as GPUs become easier to find at
| MSRP), but I already have a Mac and a PS5.
|
| I wouldn't mind more of my Steam library working on macOS
| though.
| chrisco255 wrote:
| > I use Macs to get work done
|
| Ok, uninstall Steam then, I guess? Games do currently exist
| on Mac, both in the App store and on Steam. It's just that
| there's a big swath of games that haven't been ported to
| Apple Silicon in particular, due to difficulty. I grew up
| playing games on the Mac, so I'm not sure what your point
| is.
| ChrisMarshallNY wrote:
| So why the challenging and abrasive approach? Did what I
| write offend you? I certainly didn't mean to, and
| apologize for my tone.
|
| I have no interest in picking fights online, especially
| in a professional venue, where folks that could have a
| significant impact on my career are watching how I
| interact with others.
| chrisco255 wrote:
| Well, HN lets you create arbitrary screen names for
| pseudo-anonymity. So there's that.
|
| I guess what appears to be abrasive is just lacking tone.
| It's just me being mildly exasperated for effect. My
| point is to stress that the Mac has always been a general
| purpose computing device. Jobs may have chosen to
| optimize the Mac for productivity in the 90s, and that
| probably stemmed from the niches that were available to
| Apple in the Windows-dominated 90s and 00s (that he
| learned from running NeXT).
|
| The lack of upgradeable components also held the Mac back
| in graphics technology, which precluded it from the premium
| games market. The M1/M2 chips are game changers. But
| since they're newish, the Windows market is so
| established, and ARM chips are different from traditional
| x86-based chips, it's tough to get devs to port for Mac.
| If they can provide tooling that automates it, it's a
| great win for Apple.
| ChrisMarshallNY wrote:
| _> pseudo-anonymity_
|
| Unfortunately, that "pseudo" is rapidly dwindling. I know
| that there are already AI "decloakers," that look at
| writing style, and do a damn good job of finding folks,
| based on that.
|
| I was a UseNet troll, back in my day. It was not my
| proudest moment.
|
| One of the reasons that I deliberately make myself known,
| is that it forces me to watch my words, just like IRL.
|
| My career is done. I am retired (not by choice), and
| continue to work, but at what I want, and the way that I
| want to do it. Basically, a dream come true. I am not
| particularly worried about upsetting folks for my career,
| but I also feel I have a great deal of atonement due,
| because of my past behavior.
|
| I like the Apple ecosystem. I've been using it to be
| highly productive since 1986, and have been playing games
| on it forever. I remember _Pathways Into Darkness_ [0],
| thirty years ago, which I thought was awesome.
|
| [0] https://en.wikipedia.org/wiki/Pathways_into_Darkness
| heyoni wrote:
| I don't know, you say that but there are plenty of triple A
| games in the App Store and Apple did put effort into
| getting their game subscription to work on their desktops.
|
| Let's be real, we might all be getting more work done as a
| result but that's because Apple dropped the ball, hard.
| They came out with the Metal API but refused to support any
| others, even letting OpenGL fester. The fact that they
| couldn't keep Blizzard of all companies developing games
| for the mac is all you need to know.
| flashback2199 wrote:
| I'm curious if Apple dropped the ball or if there is
| still something of an anti-gaming culture inside Apple
| since I know that the company was de facto anti-gaming in
| the late 90s going forward. In that era they actively
| killed off relationships with gaming companies, e.g. the
| game that eventually became Halo was originally a
| Macintosh exclusive. More recent things like the legal
| theatrics with Epic Games give one a picture of a
| company that is still not necessarily super keen on fully
| embracing the gaming scene.
| kitsunesoba wrote:
| > In that era they actively killed off relationships with
| gaming companies, e.g. the game that eventually became
| Halo was originally a Macintosh exclusive.
|
| That was Microsoft's doing, not Apple's. Microsoft
| scooped up Bungie to make Halo an Xbox exclusive (at
| least initially). This purportedly angered Steve Jobs,
| who probably wasn't too into gaming himself but
| understood the appeal it might have to consumers.
| flashback2199 wrote:
| That isn't the version of history I remember reading
| about Bungie, but I suppose there are probably differing
| viewpoints about it. In any case, I do know that John
| Carmack has said that Steve Jobs told him back then that
| he should stop working on games and work on operating
| systems instead, and that he generally did not like them.
| mrtranscendence wrote:
| > there are plenty of triple A games in the App Store
|
| Mobile AAA games in the iOS App Store, or desktop AAA
| games in the Mac App Store? There are some AAAs that work
| on a desktop Mac (e.g. Resident Evil Village), but it's
| not the norm at all. As for mobile ... modern mobile
| games can die in a fire for all I care (as they're 95%
| gacha nonsense).
| heyoni wrote:
| I misspoke. A handful of triple A games. Most of which
| don't perform so well even on high end intel machines due
| to thermal throttling.
| bombcar wrote:
| For a large number of customers, Macs play games just fine
| (you can get SimCity 4 for Silicon, heh).
|
| It's not the top of the line, but it covers a decent swath,
| even without doing emulation or translation.
| artificial wrote:
| SimCity 4 was released in 2003.
| bombcar wrote:
| Yep. The surprising thing is that Aspyr bothered updating
| a 20 year old game for Apple Silicon.
| https://support.aspyr.com/hc/en-
| us/articles/12168615035405-H...
| dunham wrote:
| That is interesting - I bought KOTOR 2 a couple years ago
| for my kid and ended up having to get a refund because
| it didn't run at all (on Intel). It just wedged when I
| clicked play. No response from Aspyr, so I'd assumed they
| abandoned their older games.
| bombcar wrote:
| I _think_ they did it because they still sell SC4 via the
| App Store, but I'm not sure. Maybe someone there is just
| a huge SC4 fan.
| hedora wrote:
| > _It's funny, when people denigrate the worst gaming
| platform in the world as a "toy computer."_
|
| Unless you're developing iOS apps or MacOS apps, it's
| pretty terrible as a work platform too. I guess iWork is
| OK, but no one uses it, so it's not helpful for
| collaboration. Maybe people use it for media production?
| Doesn't the big rendering still happen on Linux though, and
| hasn't Windows caught up?
|
| Having said that, Mac laptops made a passable web browser +
| video phone + dumb terminal in the Intel days, and now
| they're excellent at those things and added "virtualization
| host" to that list.
|
| I guess I've always thought of them more like glorified
| vt100s than like computers. They're certainly market
| leaders for that use case, though WSL is helping Windows
| catch up.
| ChrisMarshallNY wrote:
| Well, I write MacOS/iOS/WatchOS/iPadOS/visionOS stuff, so
| I'm dependent on it.
|
| Like most tools, we get used to our main one, and can
| sometimes get a bit "sneery" about alternate ones.
|
| The Mac is a particularly rich target, because the
| "snootiness" is actually a deliberate brand ploy by
| Apple, and pretty much "leads with the chin," so we have
| that.
|
| I've never been "snooty" about Apple, but it's been my
| platform for over 30 years, so I'm used to it, and I get
| a _lot_ done with my Mac.
| scarface_74 wrote:
| > I guess iWork is OK, but no one uses it, so it's not
| helpful for collaboration
|
| You realize Microsoft Office has been on the Mac since
| the mid-80s, right?
|
| > Maybe people use it for media production?
|
| Uhh yes?
| hedora wrote:
| Microsoft Office has always been sub-par on Mac, and I
| haven't worked at a company that uses it for
| collaboration for over a decade. (Sure, there's the
| occasional person that actually needs VB macros under
| Excel, or prefers PowerPoint, but there's no reason for
| _me_ to run it.)
|
| I know you _can_ use them for media production. Looking
| around, it sounds like they still own the low-end with
| iMovie, but it gets questionable as hardware requirements
| increase. I guess it's a viable platform for that.
|
| I know a lot of CAD software is missing MacOS ports, so
| they seem to have lost that market.
|
| Anyway, I'll continue to think of my laptop as a dumb
| terminal with a good hypervisor bolted on the side.
| scarface_74 wrote:
| I know of quite a large company - the second largest
| employer in the US - that has thousands of Macs that come
| with Office. Do you really think that no one has been
| using Office for Mac even though Microsoft has been
| selling it for 35 years?
| hedora wrote:
| It is hard to switch away from mainframes too, but I
| haven't heard of a company that was founded in the last
| ten years that is a Microsoft Office shop.
| scarface_74 wrote:
| Well, maybe your anecdata is in conflict with
| reported revenue numbers? Google Office is not taking the
| world by storm.
|
| https://www.computerworld.com/article/3637079/as-google-
| move...
|
| No, it's not the year of the Linux desktop either.
| qumpis wrote:
| How do you explain the adoption of Macs among programmers
| in general? Especially in universities, Macs are all I can
| see.
| hedora wrote:
| Everyone I know that uses one (myself included) uses it
| as a toy (== not production server grade) Unix machine
| and/or dumb terminal.
| mrguyorama wrote:
| In my university, only non-computer-science students used
| Macs. At my work, I wasn't given a choice and was given a
| MacBook because "we all use MacBooks", which really just
| meant "we all work in the terminal" - something the Mac is
| not especially good at, not anymore now that we have WSL to
| compare it against.
| andelink wrote:
| [dead]
| mrtranscendence wrote:
| I mean, you're kind of dead wrong? I've been using Macs
| for almost two decades at this point as a developer, and
| it's never kept me from getting work done. These days
| it's great having a powerful Mac, with the ability to use
| local machine learning models with relatively large
| amounts of VRAM (more than any consumer GPU, though
| inference won't be as fast as recent Nvidia cards). It's
| great at video editing, too.
| EricE wrote:
| Don't tell IBM they are doing it wrong:
| https://www.extremetech.com/computing/301863-ibm-our-mac-
| usi...
| giantrobot wrote:
| Wow, I thought this type of comment died on Slashdot 20
| years ago.
| johnklos wrote:
| I think this is an example of how ideas about what people
| really consider important are skewed by marketing. The gaming
| market is HUGE. How big is the premium, chasing-that-last-
| quarter-percent-by-spending-twice-the-money market
| comparatively?
| throwaway6734 wrote:
| Could it all be research for the new VR headset?
| sylens wrote:
| I thought this was very insightful on his part - yeah this all
| sounds great, but so what? The Mac Pro is an empty box aside
| from Apple Silicon and a few fans. What are you going to put in
| there exactly?
| EricE wrote:
| Audio, Video and networking I/O cards - exactly as they
| outlined in the keynote. Massive bandwidth - that's what the
| Pro tower unlocks.
| afavour wrote:
| > it can never compete with the gaming PCs which are far less
| expensive and far more powerful graphics-wise
|
| I don't think Apple is chasing the "dedicated gaming machine"
| crowd here. They want casual gamers to be able to load up a
| couple of games on the machine they're buying for non-gaming
| reasons. I'm exactly one of those people: I rarely play video
| games these days but when the pandemic hit I ended up
| installing Boot Camp to play COD:Warzone with friends. It was
| great (it performed... okay). I've since upgraded to a Silicon-
| based Mac so the door has closed on that. This toolkit is the
| means to reopen it. I'm not, and likely won't ever be, in the
| market for a gaming PC. I can't justify the purchase.
|
| > And Apple also doesn't know how to keep relationships with
| the AAA developers
|
| The App Store would beg to differ. I agree that historically
| they haven't been great at relationships with _game_ developers
| but they're clearly able to maintain relationships with third
| party developers when they have the incentive.
| dunham wrote:
| Although, as a casual gamer, I'd like to be able to run the
| games that I bought in the Apple App Store on my Mac. (e.g.
| "DeathSpank", a fun spoof of action RPGs, is 32-bit, so it's
| now unplayable.)
|
| I've used CrossOver to play Skyrim on my M1 Mac (and they
| just sent me an email saying Apple leveraged CrossOver code
| for their porting toolkit), so there might have been an
| option prior to this - if the performance is good enough for
| your game.
|
| If you do want CrossOver, get a free trial, but wait for a
| discount. I think they discount around 30-40% near the end of
| the trial and during special sales.
| BryantD wrote:
| Mobile gaming and PC gaming are somewhat different markets.
| In particular, mobile game development companies don't have
| any choice but to work with Apple and Google, whereas AAA PC
| gaming companies have another outlet already.
|
| I like Apple and I like their products but I think if you
| talk to any sizable mobile developer they'd be able to tell
| you stories about the difficulties of working with Apple.
| musicale wrote:
| > if you talk to any sizable mobile developer they'd be
| able to tell you stories about the difficulties of working
| with Apple
|
| If you're a sizable game developer used to working with
| Sony or Nintendo, how much harder is it to work with Apple?
|
| For pure mobile devs, how much harder is it to work with
| Apple vs. Google, and why? If it isn't worth it, why
| bother? Android seems to have larger market share. How much
| higher are Apple's platform fees vs. Google's?
| tcmart14 wrote:
| I don't think fees are a viable argument anymore. At
| least not at face value. Last I looked, Google, Apple and
| Valve all take 30% cuts. I believe Nintendo does too.
| Unless there are some backroom sweetheart deals, at least
| with the major studios.
| WorldMaker wrote:
| Most of the "sizable" mobile developers are entangled with
| the PC/console game developers, if in no other place than on
| the balance sheet, with one generating revenue to pay for the
| increasingly expensive other. King is a part of Activision
| Blizzard (and a part of the pending sale to Microsoft).
| Zynga is a part of Take-Two (Rockstar/2K). Riot is
| obviously Riot. A half-dozen others are arms of Tencent in
| one way or another, who in turn is heavily invested in
| Funcom and Epic and Riot and From Software and less
| invested but still invested in plenty more like Ubisoft.
| NetEase is a mobile developer and publisher that also
| develops (but so far generally doesn't publish) PC and
| console games and has been buying studios looking to deepen
| that.
|
| The list goes on; everything videogames is deeply entangled
| financially. Therefore, the markets must be deeply
| entangled, too.
| Apocryphon wrote:
| I have to wonder if that's because a lot of the hot
| mobile gaming companies flew too close to the sun, could
| not maintain their explosive growth even as they pursued
| F2P microtransactions hell policies, and ended up getting
| bought by said PC/console game developers.
| WorldMaker wrote:
| My theory reverses your theory's cause and effect: King
| was a reverse merger that was very nearly a takeover of
| Activision Blizzard at the time. Zynga was thought to be
| the same for Take-Two (which at the time was particularly
| bloodied by bankruptcy-related issues and in a position
| to be eaten). The EA and Popcap merger is another one
| that was questionably a reverse merger/near takeover,
| especially in the way it shook up the executive board at
| the time. (I forgot about Popcap in the above summary
| because _as a brand to themselves_ they've quietly sort
| of disappeared from modern mobile trends, but their logo
| still often shows up in EA presentations.)
|
| In general, "lowly" mobile gaming still has _more_ active
| players spending more real-world money at any given time.
| It's very hard to look at the bottom lines of some
| of these companies, especially today's weird Activision
| Blizzard, and not see "the tail wagging the dog" and
| mobile games effectively sponsoring and/or subsidizing
| development costs on every other form factor of
| videogame. The biggest exceptions seem to be Sony and
| Microsoft themselves, and Microsoft dabbled in mobile
| gaming over the years, has a big mobile gaming contractor
| in Arkadium (using the Microsoft brand for Solitaire and
| Minesweeper, among others, and generating some revenue),
| and does own one of the largest mobile games of all time
| (Minecraft) though people often don't think of it as
| such.
|
| I think it also shows up in executive leadership and how
| F2P microtransactions hell has been infesting "AAA" and
| "AA" PC/console development for years now.
|
| From my outside perspective of the industry: "Mobile
| games" _won_. PC /console games are the weird, "too
| expensive" afterthought for most of the videogame
| industry, subsidized by and beholden to the mobile games.
| The "gamer culture" that doesn't see most of the mobile
| games space as interesting or important and doesn't see
| mobile game players as "gamers" (or worse sees them only
| as "filthy casuals") is the minority out of touch with
| market realities.
|
| Admittedly, that's a somewhat extreme perspective and
| there are plenty of exceptions and gray area and further
| complications. But whether or not you agree with that
| perspective, my earlier point remains that overall mobile
| games and PC/console games are inextricably linked by
| market forces and treating them as separate markets, and
| especially treating the mobile games market as somehow
| inferior, misses a lot of the forest.
|
| (That [currently] Cold War between Apple and Epic has
| very _real_ stakes, including for PC/consoles, and isn't
| just a silly "mobile gaming" problem.)
| ericmay wrote:
| Don't PC gaming companies have only one real outlet
| (Microsoft) to work with? Not to be confused with
| distribution channels such as Steam.
| newaccount74 wrote:
| Are there still games that only come out on PC? I thought
| most big titles are released on Xbox and Playstation as
| well?
| anta40 wrote:
| Not many. For example, Jagged Alliance 3 and the Total
| War series.
|
| I'm a big fan of Zachtronics games (TIS-100, Shenzhen I/O,
| etc) and they are also PC-only.
| southwesterly wrote:
| Positech Games (Democracy Series) are PC only.
| i_am_jl wrote:
| I think you nailed it.
|
| >This toolkit is the means to reopen it. I'm not, and likely
| won't ever be, in the market for a gaming PC. I can't justify
| the purchase.
|
| I think there's a segment of Mac users who own Windows
| machines exclusively for gaming. I think the value in these
| capabilities isn't that people will buy Apple Silicon machines
| primarily as gaming machines; the value is in
| enabling someone in the Apple ecosystem who plays games
| occasionally to opt out of owning a Windows machine.
| lockhouse wrote:
| Yeah, I would be thrilled if this eventually enabled even
| 50% of my existing Steam library to work on my M1 Mac. If
| we get a few new releases to be Mac native, that's just
| icing on the cake.
| wslh wrote:
| Don't you think that Apple could tackle the performance issues
| in GPUs in the following years?
|
| The core issue here, business-wise, is the price tag. People of
| all social classes use gaming consoles and it is difficult to
| think Apple can be relevant in this market even if they tackle
| all other issues.
| yreg wrote:
| Nevermind the low end (it will always be an uphill battle to
| compete with the consoles on price), the issue is that Apple
| doesn't even cater to the high end.
| doctorpangloss wrote:
| Here's another POV:
|
| > Like how does all this _Vulkan_, compile-your-shaders, port-
| your-games stuff even get budgeted, when it's eventually dead
| on arrival?
|
| Vulkan today sucks. Nobody writes Vulkan native engines. It's
| not optimized anywhere.
|
| Google has twice tried and failed to make Vulkan a thing -
| first on Android, where nobody cares to target it, even on the
| Quest, and the GPUs suck anyway; and second on Stadia, which
| besides the up-front product development cost, delivered 3-20x
| worse performance compared to DirectX for the games that were
| ports anyway.
|
| But I'd rather have Vulkan around and succeed, even if it sucks
| today. Because having only DirectX, or only DirectX and
| middlewares for the Switch, iOS and PlayStation, is worse.
| mzs wrote:
| And every 2-3 years they throw out whatever that huge
| investment was for a new incompatible project.
| Const-me wrote:
| These GPUs aren't too bad. Theoretically, M2 Max peaks at 14.4
| teraflops, and 400 GB/s memory bandwidth. M2 Ultra is too new,
| but Apple says its GPU is 30% faster (probably 18.7 teraflops
| then?), and that it has 800 GB/s memory bandwidth.
|
| The numbers for M2 Ultra are comparable to some powerful GPUs.
| The theoretical TFlops number is close to Radeon 6900 XT and
| GeForce 3070, theoretical memory bandwidth is close to Radeon
| 7900 XT and GeForce 4080.
|
| However, good point on the pricing. Apparently, Mac Pro starts
| at $7k, which is way too expensive for most gamers.
| madeofpalk wrote:
| Apple's best GPU is close to a mid-range previous gen nvidia
| GPU?
|
| I'm sure it'll run a 4 year old game on medium settings
| great! But that's kind of the point, that it doesn't stand a
| chance against the rest of the industry.
|
| Who's supposed to be impressed by getting middling
| performance on Death Stranding 4 years after it came out?
| Literal definition of "also ran".
| caycep wrote:
| Apple's best [integrated] GPU is close to a mid-range
| previous gen [discrete] nvidia GPU [consuming 10x the
| power]?
|
| -fixed that for you
| paulmd wrote:
| > Who's supposed to be impressed by getting middling
| performance on Death Stranding 4 years after it came out?
| Literal definition of "also ran".
|
| Plenty of people are impressed at 3060/3070 performance _in
| a 25W system-power envelope_.
|
| You literally can't even run the memory chips for a 3070 in
| that power budget let alone the whole APU.
|
| Like I'd love to see the AMD equivalent APU to that "also-
| ran 3070 performance" macbook, please link a laptop with
| what you think would be comparable.
| olyjohn wrote:
| Great, you can keep playing your old ass game at low
| frame rates. Enjoy your power savings... the game still
| runs like crap and the GPU still isn't that great.
| ryandrake wrote:
| Game enthusiasts can be so weird. Do y'all even have fun
| playing games, or do you just keep buying hardware and
| optimizing settings until they run at 120fps, declare
| victory, and move on to the next AAA game?
|
| I don't think I have worried about game frame rate since
| the days of Quake 1. I set my graphics settings to
| "Medium" and then spend the rest of my time _actually
| enjoying_ my old ass games.
| npunt wrote:
| It's exactly why PC/console gamers (generally, not talking
| about GP) make terrible customers and Apple is right to
| not play with fire by courting them too closely - they're
| loud, cheap, immature, mercurial, and demanding, and
| being associated with them is probably a net negative for
| brand. Let them stew in forums arguing over red vs green,
| tinkering with and breaking the warranty of their PC
| parts, being disloyal to the brand they loved 5 minutes
| ago, etc.
|
| Better strategy is to make sure the door isn't closed on
| gaming for those that want to use their expensive Macs to
| occasionally play (protect the downside), rather than
| swing the door open enthusiastically for gamers to rush
| in.
|
| Re: worrying about fun vs tinkering, I'm reminded of the
| 4 quadrants of hobbies. What we see on forums are
| generally gamers interested in gear & discussing, not
| 'doing the hobby'.
| https://brooker.co.za/blog/2023/04/20/hobbies.html
| incrudible wrote:
| Yes, gamers have standards and that's why GPU performance
| on the PC does not suck and why it is relatively
| affordable. A tough market to be in for sure, everyone
| wishes they could just be Apple.
| emn13 wrote:
| The M2 Max is closer to the 3060 than it is to the 3060
| Ti, let alone 3070. And those numbers are quite possibly
| overly optimistic; workloads essentially never reach peak
| tflop, and I would not be surprised if practical
| workloads are better matched to nvidia's architecture
| than apple's, if only through sheer industry momentum
| (But that could go either way).
|
| While the perf/watt is impressive, Apple is also using 4
| times the number of transistors on TSMC's latest process
| - and the comparison here is Samsung's 8nm, I believe.
| It's not really all that impressive that that huge
| silicon investment has some results...
|
| It's a tantalizing hint at what might be possible, but as
| it stands, I'm not really all that impressed, personally.
| paulmd wrote:
| I didn't say 3070, that was from the parent "who is
| impressed by 3070 performance [in a 25W envelope]"? And
| the answer is a lot of people.
|
| I actually added the 3060 bit myself lol, because yeah,
| that seems to be more like where it actually lands, more
| like desktop 3060.
|
| edit: also _desktop_ vs _mobile_ is a factor here too...
| _mobile_ 3070 is not the same thing as _desktop_ 3070,
| and coming in at _desktop_ 3070 would actually be fairly
| impressive. Mobile 3060, much less so.
| emn13 wrote:
| Sure, no quibbles on that front. Comparisons like this
| are always best taken with a lot of salt anyhow; they're
| so different. And it's not like tflops are the great
| predictor of gaming performance.
|
| Positively: as a device, having such a solid iGPU is
| pretty much exactly what I've always wanted in this kind
| of device. Having performance that's PS5 ballpark clearly
| is enough for a hell of a lot of things. Who really wants
| something much faster at the cost of much worse battery
| life?
|
| But the air of incredibly ground-breaking technical
| greatness that apple manages to weave around its silicon
| seems a tad overdone. Given the amount of silicon, the
| process node, and the target tuning - this kind of result
| seems competitive with rather than outclassing their
| rivals.
| madeofpalk wrote:
| That's neat and impressive, but that's all they have.
| They don't have a high end. For $3000 you can get a top
| end gaming PC. There's no amount of money you can spend
| on a Mac to equal that.
|
| I think you're missing the point that parent (and
| Siracusa) made - Apple invests a significant amount into the
| software and graphics stack, only to fumble it at the
| last minute by not having high-end graphics hardware and not
| caring enough to court "triple A" game developers to
| their platforms, despite creating and maintaining
| Metal and this 20k-line WINE patch.
|
| There's this weird mismatch of Apple dedicating a non-
| trivial amount of time in their keynote to "Mac Gaming" as
| if it's supposed to be impressive to finally play a 4
| year old game on a Mac because they don't ship high-end
| graphics devices.
| [deleted]
| abujazar wrote:
| For $3000 you can get a top end GPU, not a whole gaming
| PC. Most gamers have mid-range GPUs like those in the M2
| Max.
| samspenc wrote:
| Actually you can get NVidia's top consumer GPU today, the
| RTX 4090, for $1500-1600. Go back one generation and you
| can get a RTX 3090 for $750 which still packs a punch.
|
| So it's quite possible to build a well-performing gaming
| PC for sub-$2000 with RTX 3090 which is still
| significantly more performant than Apple's latest Mac, in
| terms of GPU throughput.
|
| I snapped up a gaming PC for $1300 at last year's
| Thanksgiving sales; it came with an AMD Ryzen, RTX 3080 (10
| GB VRAM model) and 32 GB DDR4 RAM - no way I could have
| gotten a Mac with that performance for anything close in
| terms of price.
| wlesieutre wrote:
| Two days ago we could speculate that maybe the $6000+ Mac
| Pro would bring better graphics performance, but now we
| know it's a $7000 Mac Studio with PCIe slots. And as far
| as we know you can't put a GPU in those slots.
|
| Not that it would've been in my price range anyway, but
| it could've indicated that thunderbolt eGPU support would
| make a return.
|
| Lack of that is a weird omission if Apple is trying to
| act like they have a gaming platform.
| musicale wrote:
| > Lack of that is a weird omission if Apple is trying to
| act like they have a gaming platform.
|
| Apple has a huge gaming platform, and it isn't the Mac.
|
| https://www.ign.com/articles/apple-made-more-than-
| nintendo-s...
|
| However, I imagine that they'd still like to sell more
| games in the Mac App Store (in addition to iOS ports,
| iPad games that can run on Apple Silicon, and Apple
| Arcade subscriptions) and this might help.
|
| It might also make it easier to port games to Apple
| Arcade.
| 2OEH8eoCRo0 wrote:
| Apple silicon wins on performance per watt but not in
| performance outright and suddenly everyone cares about
| power consumption. Whichever spec everyone's favorite
| fruit company excels at gets put on a pedestal.
| paulmd wrote:
| shockingly, I think there might be more than one person
| on the internet and these people might have varying
| opinions
|
| but yea you can say the same thing about tons of brands.
| Last summer all the AMD fans were talking about 1EUR/kWh
| electricity and saying they were going to buy whatever
| dGPU was most efficient... when that turned out to be Ada
| by a country mile, everybody pivoted to whining about
| price and bought RDNA2 GPUs with half of the perf/w.
|
| During RDNA2 everyone insisted that a 10% perf/w
| advantage for AMD was a buying point; back during the
| Vega years they insisted that a 2x perf/w disadvantage
| didn't matter. Rinse and repeat.
|
| I generally think power matters when it rises to the
| level of a tangible difference... 200W difference between
| 4070 and 6950XT means the latter is really a non-starter
| even if it's 10% faster (at a 5% higher price),
| especially considering the big-picture featureset (DLSS
| improves both perf and perf/w). And really it matters
| more in laptops. You're right that Mac Studio/Mac Pro are
| not really a place where it hugely matters, but, in a
| laptop, the next-best thing would be a Ryzen 6800U which
| is about GTX 1630 performance, so 3060 performance in the
| same envelope is a big step upwards!
|
| And really this "big differences matter, small ones
| don't" applies to most stuff in general. 5% this way or
| the other, who cares. That kind of thing is often less
| important than general UX/quality/features, I'll take a
| laptop that's 5% slower but way longer battery life or
| better screen/trackpad/whatever. When things start rising
| to the level of 25% or 30% difference in some spec, or in
| price... yeah that's immediately noticeable.
|
| But yea I generally agree that desktops like Studio or
| outright workstations like Mac Pro are dGPU territory and
| people are generally not looking for a super efficient
| iGPU with 3060 performance. On the other hand, being able
| to talk to 192GB of VRAM is definitely novel, especially
| with large AI models being the talk of the town this year
| (and accessible to even the most casual of
| artists/developers), and the unified APU approach with
| uniform memory/zero-paging has other advantages for
| development too. AMD had a lot of this stuff hammered out
| 10 years ago, supposedly, and then... just never did
| anything with it, other than sell it to consoles. It's
| great for PS5 and Xbox, why can't I buy a PC laptop with
| 96GB of unified/uniform memory with 3060-level
| performance in a 25W envelope?
|
| Really I think a lot of the people who have bought
| Macbooks recently are not "traditional" apple customers.
| The MBP and even MBA are legitimately really nice laptops
| with a good screen, good keyboard, good trackpad, good
| sound, etc. I have said before that I really think a lot
| of MBP customers would be interested in a "Macbook Tough"
| toughbook if they ever did that, although of course
| that's the most un-Jony Ives product possible.
|
| There is a clear demand for a high-quality AMD-based non-
| GPU ultrabook using a 6800U or 7040U or whatever.
| Framework is the first company to even try, and they're
| using crappy 13" hardware on the upcoming AMD model while
| the market clearly wants more like a 15" or 16" (and
| their 16" will not have AMD boards). Why didn't anybody
| else do it first? Apple is catching on because _they're
| filling a market niche that everyone else is ignoring_,
| and they're not even really exactly filling it squarely,
| they just happen to be vaguely closer than the rest of
| the market.
|
| And now that the nerd crowd has the hardware... the
| software is following. It's the same reason that CUDA has
| taken off while AMD's GPGPU programme has spun its wheels
| for 15 years, and the same reason AMD has good Linux
| drivers now. Give the nerds the hardware and innovation
| will follow - when they tinker they'll be tinkering with
| _your platform_.
|
| Big missed opportunity for AMD, yet again. Or Intel, but,
| they're so far behind on APUs/integration that I think
| disappointment is basically the baseline expectation at
| this point. AMD had all the pieces, and yet again just
| chose not to do anything with them.
| philistine wrote:
| The PC industry is no longer driven by desktops; laptops
| have taken over long ago. There is a gaming PC crowd, but
| that is a small captured audience who wants performance,
| wattage be damned.
|
| Apple is selling around 80% laptops versus desktops, and
| the rest of the industry is something like 77%. The fact
| Apple is winning the laptop GPU race doesn't mean it
| should automatically be entered into the desktop GPU
| race, where it is not winning.
| smoldesu wrote:
| > The fact Apple is winning the laptop GPU race
|
| Fact? Which Apple chips are outperforming the laptop
| 3070, much less the current-gen mobile 4090?
| npunt wrote:
| I would take a guess that Apple is shipping (far) more
| TFlops of GPU power than Nvidia or anyone else in the
| mobile GPU market. Few people are buying laptops with
| 80-150w TDP GPUs, as those start to stretch the
| definition of both 'laptop' and 'battery powered'. Big
| gaming laptops with an hour of battery life are more akin
| to the luggables of yore.
| smoldesu wrote:
| That's fair. Nvidia has issues scaling their full systems
| down to laptop spec, and Apple almost has the opposite
| problem. They're both impressive in their own right, but
| right now Nvidia has both the performance _and_
| performance-per-watt crown in this space. The disparity
| in 3D applications (like gaming and Blender[0]) is so ugly
| it's not even close.
|
| And in all fairness - Apple's products might not need
| more GPU power. Cyberpunk and Elden Ring appear to be
| CPU-bottlenecked, if people are comfortable upscaling
| they could get a pretty comfortable Retina experience.
| The 2D optimization and media accelerators are a good
| focus for mobile hardware. For more demanding
| applications though, it looks like Apple's current
| approach is not scaling well.
|
| [0] https://opendata.blender.org/benchmarks/query
| npunt wrote:
| Yeah I'm really curious what Apple's next-gen GPU (with
| raytracing and a bunch of other stuff) brings to fix some
| of these shortcomings. It was supposed to show up on last
| year's iPhone 14 followed presumably by inclusion in the
| M-series, and the 3nm process was supposed to be shipping
| this year, but everything got set back a year. In Mac-
| land the M2 wound up just being an overclocked M1, so
| we're left waiting for M3 to bring us a more competitive
| GPU.
|
| The other half of the story is a lot of software (inc
| Blender, looking at these crazy results) just isn't well
| optimized and Apple is still struggling to win over
| developers in certain sectors of the market. Nvidia's
| decade+ investment in the software side has paid off so
| incredibly well for them, it's basically made the
| company.
| [deleted]
| fnordpiglet wrote:
| I see, so in the world you propose we live in, the Nintendo
| Switch must be a tremendous flop?
| KingMachiavelli wrote:
| The latest consoles are also running mid range cards from a
| few years ago and are doing just fine. They are running
| games at medium at lower FPS/resolution than PC so games
| will mostly continue to target and work well on medium
| hardware. High end PC gaming is the exception, not the
| norm.
|
| The Mac audience is not trivial and has deep pockets, so as
| long as porting games is fairly easy, it's an obvious
| choice.
| llm_nerd wrote:
| >it doesn't stand a chance against the rest of the industry
|
| Apple seems to be doing okay. I mean, an order of magnitude
| more people game on iPhones than game on PCs.
|
| Apple is trying to support ancillary gaming for users who
| chose their platform for other reasons. That's it. They
| aren't targeting the 1200 watt, 12-fan 4090 PCMR sorts. And
| that's okay.
| Yujf wrote:
| Do they need the most powerful gaming hardware? Maybe just
| the fact that it becomes reasonable to play most games on
| a MacBook is enough to get people who also want to game to
| buy a MacBook instead of a Windows laptop. And maybe this
| is enough to get developers to consider the Mac.
|
| They do not need to compete with Nvidia for the top of the
| line.
| kitsunesoba wrote:
| Yeah, high end enthusiast hardware is in fact pretty
| niche, and I say this as someone with a 5950X/3080Ti
| tower. The vast majority of people playing games are
| doing so on pretty old/average hardware, a bar which is
| met and exceeded by several M-series Macs.
| mrtranscendence wrote:
| High-end hardware is niche, but the state of AAA game
| performance these days ... ugh. I can't even get a stable
| 60fps on Jedi Survivor with my 5800x/3080 Ti rig, even
| with lower settings. How bad is it on something like a
| 1060?
| brundolf wrote:
| I feel like you've got a chip on your shoulder about this
| for some reason
|
| I've got a gaming desktop and also a MacBook Pro. If the
| next time I go on a trip, I'm able to play some games in my
| hotel room on (gasp) medium settings, with a device I
| already own, that I probably was already going to bring
| with me, that's a positive thing!
| Apocryphon wrote:
| They should've showcased a game that's actually new and
| not just a port.
| musicale wrote:
| ... to demonstrate their porting toolkit?
| Apocryphon wrote:
| Sure, why not? Unreleased games need to be ported to
| macOS too.
| madeofpalk wrote:
| I also have a gaming desktop and a Macbook Pro, but I
| wish I didn't have to have a gaming desktop because I
| much prefer Macs and MacOS and I wish Apple was
| interested in competing. Then, they dedicate a segment to
| 'gaming on mac' to brag about porting a 4 year old game
| to the Mac.
| hajile wrote:
| The most used GPU according to Steam survey is the 1650 --
| a low-midrange card from 2019.
| nightski wrote:
| Yeah and those people aren't going to pay significantly
| more money for a Mac M2 Max.
| hedora wrote:
| I think you'd be surprised. I just checked, and my
| desktop gaming GPU is only 31% faster than that nvidia,
| and I'm typing this on a MacBook Pro M2 Max.
|
| I bought the laptop and the video card because they are
| quiet and their price/performance is better than the high
| end stuff anyway.
|
| I just tried running steam on the macbook, and was very
| disappointed. My Linux gaming desktop will live on for
| another few years, I guess.
|
| edit: I think I got the GPU in ~ 2019, though it was
| released in 2015.
| ohgodplsno wrote:
| Rosetta translation + D3D12OnMetal (which developers
| aren't allowed to use to publish their games, so you'll
| have to do it on your own and work with a subpar version)
| will happily eat that 30% difference. Not to mention the
| massive changes that drivers bring, where Apple will
| never either want or be able to do as much work as Nvidia
| does.
| hedora wrote:
| The 30% faster hardware is running Linux. The main reason
| I'm disappointed with steam on MacOS is that only a third
| of my library works at all, and that the stuff that does
| run is hit or miss, performance wise (especially the
| indie / casual games, which this hardware should laugh
| at).
|
| Also, another 25% of my library actually was ported to
| MacOS, but it is 32 bit only, so it won't run on an M2.
| (Also, typing that sentence was painful.)
|
| I guess if I want to run the vast majority of the MacOS
| software that I have ever purchased on an M2, my best bet
| is to install Asahi, and use the Windows ports under
| proton. Lame.
| wishfish wrote:
| Steam has mislabeled many of the older Mac games as being
| incompatible. Several of them will work just fine. It's
| worth double checking on one of the Mac gaming wikis if
| you want a particular game.
|
| Have no idea why the mislabeling happened. Maybe Steam is
| working solely off dates despite many games being 64 bit
| before the 32 bit cutoff.
| musicale wrote:
| > I just tried running steam on the macbook, and was very
| disappointed. My Linux gaming desktop will live on for
| another few years, I guess.
|
| I'd like to see more Steam games that work on the Mac.
| Perhaps this could help.
| fnordpiglet wrote:
| I paid a ton of money for my top of the line M2 Max
| because I build large rust systems that are high
| bandwidth and low latency and it's as good as it gets for
| that. I also like to play games, but I don't need the
| cutting edge. I would rather eat glass than buy a windows
| bing advertisement device and find another footprint in
| my home to install it just so I can get a higher frame
| rate than my eye can see. In fact, my strategy for gaming
| over the last 20 years has been to buy games 3 years old
| and devices that run them at their top end. No bugs, tons
| of reviews to guide my purchases, tons of mods, full DLC
| sets, always on sale. As long as I don't sit around
| feeling envy looking at what's cutting edge, following
| the 3 year wave front gives me _precisely_ the experience
| folks had 3 years ago - but better. I'll have their
| current experience in 3 years, so long as I don't die,
| without all the bleeding edge problems.
|
| So, great. Apple lets me stay away from Bard-directed
| Bing advertising and OS-level spyware on my desktop,
| simplify my computing footprint in my household, and
| provides me games from a few years ago. Seems like a win
| win.
|
| Source: I am someone who paid significantly more money
| for a Max M2 Max
| madeofpalk wrote:
| > Apple lets me stay away from Bard-directed Bing
| advertising and OS-level spyware on my desktop
|
| Apple doesn't get to take the moral high ground here
| either when they push credit card and other services ads
| in their OS.
| fnordpiglet wrote:
| No corporation gets to take a moral high ground being
| amoral entities. But windows is pervasively spammy now -
| the start menu hosting ads was bad enough, but now it's
| the task bar too. I'm trying to remember when I saw a
| cross-sell in Apple's stuff - my memory is only when I'm
| in something like the TV app or the wallet, or some place
| where the cross-sell is contextually relevant.
|
| The more Microsoft and Google tilt towards becoming
| persistent privacy threats and advertising companies, the
| more Apple will see being the opposite - a hardware
| company with software services - as a differentiator.
| good with that dynamic, but I think it's useful to
| acknowledge that's the case. Pretending windows isn't a
| persistent adware spyware bundle doesn't help the
| situation.
| madeofpalk wrote:
| A Nvidia 1650 is significantly cheaper also!
|
| Upgrading from full-spec M2 Max to M2 Ultra costs $1200.
| Nvidia 1650 launched at $190 (inflation adjusted, $159
| 2019 USD).
| mekpro wrote:
| This video sample uses the base M1 chip.
| o1y32 wrote:
| Practically speaking a $300 (often sold at $250) Xbox Series
| S would provide a much better gaming experience than this.
| Hamuko wrote:
| Sure? It's definitely never going to deliver you anything
| other than a gaming/video experience though. It also
| doesn't come with a battery or a display in case we're
| doing an apples to apples comparison of the Xbox Series S
| with an M1 MacBook Pro.
| ranger_danger wrote:
| > how much effort Apple puts into this
|
| how much did they actually put into it? as far as I know this
| is mostly just WINE with an Apple logo on it.
| renewiltord wrote:
| How are Wine and friends actually legal considering the Oracle v
| Google thing on Java?
|
| I've always loved it but really do wonder. That was a blasphemous
| verdict.
| thesuperbigfrog wrote:
| >> How are Wine and friends actually legal considering the
| Oracle v Google thing on Java?
|
| >> I've always loved it but really do wonder. That was a
| blasphemous verdict.
|
| So you think APIs and interfaces should be copyrightable
| despite decades of precedent?
|
| If so, Oracle owes IBM a huge amount of damages for using SQL
| in Oracle database software without a license.
| nrclark wrote:
| Google won that in the end. It went all the way to the Supreme
| Court.
| renewiltord wrote:
| Oh they did? Well, that's good news. Something is good in
| this world.
| nilptr wrote:
| Won for now. Supreme Court decisions are reversible in time.
| thih9 wrote:
| What is the game porting toolkit?
|
| Is this something that game devs would use or is this some
| wrapper or emulation layer for the end user?
|
| Is there a list of games that use it?
| cromka wrote:
| https://news.ycombinator.com/item?id=36222266
| Aaargh20318 wrote:
| Apparently it's an emulator like Proton, as used by Steam to run
| Windows games on Linux. It even uses Rosetta to run x86_64 code
| on ARM. But it's not intended for end users. Instead it's
| intended for developers to evaluate how their game would run on
| macOS so they can decide whether or not to port it.
| gilgoomesh wrote:
| According to CodeWeavers, it is based on their CrossOver code
| (LGPL 2.1) and can be committed back into Wine:
|
| https://www.codeweavers.com/blog/mjohnson/2023/6/6/wine-
| come...
| danieldk wrote:
| It uses a proprietary framework that implements Direct3D on
| top of Metal:
|
| https://news.ycombinator.com/item?id=36224057
| nailer wrote:
| Your link confirmed that it uses Wine. And your
| description sounds exactly like how CodeWeavers' Wine
| implements Direct3D on top of OpenGL.
|
| I imagine it's simply a proprietary product based on
| liberally licensed (LGPL) open source code, much in the
| same way Safari was built upon KHTML.
| danieldk wrote:
| Sure it uses Wine, but the Direct3D API is handled by the
| proprietary D3DMetal framework, which implements Direct3D
| on top of Metal. This is different from CodeWeavers'
| CrossOver approach, which uses Wine's DirectX-on-Vulkan
| implementation and then MoltenVK to run Vulkan on Metal.
|
| So, it is not just Wine. It is Wine plus a large
| proprietary Apple Framework that has most of the Direct3D
| magic sauce.
| nailer wrote:
| Thanks for the info! I think a couple of us in this
| thread may be arguing at cross purposes - I want to ensure
| that codeweavers gets credit for their open source
| contributions and you want to ensure technical accuracy.
| These are both good goals and don't conflict with each
| other.
| danieldk wrote:
| Yeah, definitely! All of this wouldn't be possible
| without the Wine project and CodeWeavers.
| f1refly wrote:
| Wine (and Proton) is not an emulator
___________________________________________________________________
(page generated 2023-06-07 23:03 UTC)