[HN Gopher] Apple testing new external display with A13 chip
       ___________________________________________________________________
        
       Apple testing new external display with A13 chip
        
       Author : tambourine_man
       Score  : 45 points
       Date   : 2021-07-23 16:49 UTC (6 hours ago)
        
 (HTM) web link (9to5mac.com)
 (TXT) w3m dump (9to5mac.com)
        
       | wodenokoto wrote:
       | Why would you want your external GPU built into the display?
       | 
       | When I saw the headline, I thought maybe it had a webcam and the
       | A13 would do things like portrait mode, before the image reached
       | the computer.
        
         | FinalBriefing wrote:
          | Possibly to support wireless connections? You'll be able to
         | move your mouse across devices in the next version of
         | iOS/macOS, so maybe this will allow you to use a screen without
         | a device connected to it?
        
           | sliken wrote:
           | Why not do 3d inside the monitor? You send it textures and
           | triangles and you get great 3d perf when connected, and more
           | power efficiency when you are mobile (without the monitor
           | connected).
        
             | yazaddaruvala wrote:
              | The monitor could also have retained state (based on the
              | pre-fetching of textures, etc. as you said), so most of
              | the time the display and iDevice are exchanging small,
              | low-latency packets for mouse movement or clicks, plus
              | periodic bulk transfers of data to sync files, etc.
        
         | tarikjn wrote:
          | I think it actually makes a lot of sense from a computer
          | architecture point of view: (1) total resolution largely
          | determines the graphics processing power needed (for
          | straightforward graphics rendering), so scaling GPU power
          | with displays/resolution makes sense; (2) placing a GPU in
          | the display lowers the demands on the interface, whose
          | limits Apple has repeatedly hit.
         | 
          | In addition, and partly as a result, you get some usability
          | and user-experience advantages: (a) connecting an external
          | display no longer degrades laptop performance, without the
          | user having to know about eGPUs; (b) no additional
          | box/cables/power mess from using an eGPU
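          | 
          | For a sense of scale on (1), a back-of-the-envelope sketch
          | in Swift (the 5K/6K panel figures are standard resolutions;
          | the pairing is purely illustrative, not from the article):
          | 
          |     // Per-frame pixel work grows with every attached panel.
          |     let builtIn5K = 5120 * 2880     // 14,745,600 px
          |     let external6K = 6016 * 3384    // 20,358,144 px
          |     let total = builtIn5K + external6K
          |     // Driving both is ~2.4x the pixels shaded per frame:
          |     print(Double(total) / Double(builtIn5K))  // ~2.38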
        
           | amelius wrote:
           | And cooling can be easier through a large flat surface too.
           | In fact that could be one of the main reasons.
        
         | wlesieutre wrote:
         | It could be a great docking setup for a lightweight laptop. On
          | the go you have the power-efficient chip with an 8-core GPU;
         | then you plug into the screen and you have a much more powerful
         | one.
        
         | yazaddaruvala wrote:
         | I don't know, but my guess:
         | 
         | Apple Exec: "Build an external display for the iPhone and/or
         | iWatch."
         | 
         | Apple Engineers: "Ok, but we will need a GPU built into the
         | display or it'll look like shit. It'll also need a lot of
         | RAM/SSD to store textures, etc. Additionally, a high bandwidth,
         | low latency wireless connection between the iDevice and the
         | display (e.g. UWB)."
         | 
          | Previously, to support an enormous external display, the
          | iDevice would need a large GPU, battery, etc. that sit
          | mostly unused except when the display is "active". With a
          | GPU in the display, the iDevice just needs a good enough
          | data transfer rate to get textures and such over to the
          | display, while events like moving the mouse, opening an app,
          | or switching apps can be very low-latency instructions sent
          | from the iDevice to the display and vice versa.
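          | 
          | A minimal sketch of that split in Swift (every name here is
          | invented for illustration; this is not any real Apple
          | protocol):
          | 
          |     import Foundation
          | 
          |     // Bulk asset pushes are big and periodic; input events
          |     // are tiny and latency-sensitive.
          |     enum DisplayMessage {
          |         case assetSync(id: UInt64, payload: Data) // textures
          |         case inputEvent(dx: Int16, dy: Int16)     // mouse delta
          |         case appCommand(name: String)             // open/switch app
          |     }
          | 
          |     func estimatedBytes(_ msg: DisplayMessage) -> Int {
          |         switch msg {
          |         case .assetSync(_, let payload): return 8 + payload.count
          |         case .inputEvent:                return 4
          |         case .appCommand(let name):      return name.utf8.count
          |         }
          |     }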
        
           | [deleted]
        
           | tinus_hn wrote:
            | Perhaps ironically, the original Apple Watch wasn't much more
           | than an external display for the iPhone.
        
           | floatingatoll wrote:
           | macOS uses a PDF-like canvas as the core display engine^,
           | upon which everything writes object layers that have refresh
           | loops independent from the main canvas 'flatten layers into
           | the display' loop. This is all inside WindowServer, and is
           | why <Shift-Command-4> <Spacebar> lets you select an entire
           | window _including_ the shadowed border, _without_ picking up
           | anything from above or beneath it, because Spacebar activates
           | object selection mode.
           | 
           | And we know that Apple has figured out secure pairing of T2
           | chips for external accessories (the touchID keyboard), so
           | they can run secure-trusted components outside of the main
           | hardware.
           | 
            | So they could _in theory_ run a WindowServer process on
            | the display's internal Cortex chip, paired via T2 to
            | provide verifiable security, hosting the WindowServer
            | canvas for that display and letting the host device
            | offload the 'flatten layers to display' problem, handle
            | the EDR mapping, and so on. That would be a boon to
            | laptops and also to thermally-constrained desktops, and
            | here's why:
           | 
           | On my Mac, WindowServer is the highest user of CPU time in
           | total, exceeding kernel_task, coreaudiod, everything _except_
            | Zoom (which does non-accelerated H.264 encoding/decoding on
           | x64). So if I were to plug in a second 5K display alongside
           | my built-in, my WindowServer CPU usage would double.
           | Fortunately, I can afford that -- but maybe a laptop cannot.
           | Offloading that WindowServer burden for managing traffic
           | to/from the LCD, and for dealing with layer flattening and
           | security property management (you can't screenshot certain OS
           | dialogs with a specific security flag set!), would be a
           | tremendous win not only for battery life of the host, but
           | also a theoretical performance win for latency, as now macOS
           | would only need to shuttle specific layer changes over the
           | wire.
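            | 
            | A toy model of that offload in Swift (all types here are
            | invented to show the shape of the idea; this is not the
            | actual WindowServer interface):
            | 
            |     // The host ships only changed layers; the display-side
            |     // process keeps the full layer set and flattens locally.
            |     struct Layer {
            |         let id: Int
            |         var pixels: [UInt8]
            |         var dirty: Bool
            |     }
            | 
            |     final class DisplaySideCompositor {
            |         private var layers: [Int: Layer] = [:] // retained state
            | 
            |         func apply(update: Layer) { // small delta over the wire
            |             layers[update.id] = update
            |         }
            | 
            |         func flattenIfNeeded() {
            |             guard layers.values.contains(where: { $0.dirty })
            |             else { return }
            |             // Composite back-to-front into the framebuffer here;
            |             // the host never sends a fully flattened frame.
            |             for id in Array(layers.keys) {
            |                 layers[id]?.dirty = false
            |             }
            |         }
            |     }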
           | 
           | Of course, it could still act as a dumb display for any host
           | that isn't ready to negotiate a T2-protected WindowServer
           | connection, but my idea above would definitely go a long way
           | to explaining why Apple dropped support for x64-style eGPU
           | _while still_ investigating hardware-accelerated monitors.
           | 
           | ^ it used to be implemented as PDF internally, not sure if it
           | still is or not
           | 
           | ^^ you used to be able to kernel panic macOS by making a
           | window so tall that, when screenshot using the Spacebar layer
           | selection method, an out-of-bounds write would occur from
           | within WindowServer
        
             | fstrthnscnd wrote:
             | In other words, they would reinvent X display servers, but
             | with X replaced with PDF?
        
             | Traubenfuchs wrote:
             | I have rarely ever learned this much from a single comment
             | here.
        
           | ttul wrote:
            | .. or iPad. Many creative types use iPads in their
            | professional work. Being able to hook up to a high-end
            | display wirelessly would be a great help to this segment
            | of professional customers.
        
         | pengaru wrote:
          | Arguably, the appropriate GPU bandwidth is largely defined
          | by the display attached to it: primarily its resolution,
          | color depth, and refresh rate.
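          | 
          | For scale, a rough sketch in Swift (assuming a 5K panel at
          | 60Hz with 10-bit color; the numbers are illustrative):
          | 
          |     // Uncompressed link bandwidth to drive one panel, before
          |     // compression, blanking, or partial updates.
          |     let (width, height) = (5120.0, 2880.0) // resolution
          |     let refreshHz = 60.0                   // refresh rate
          |     let bitsPerPixel = 30.0                // 10 bits/channel RGB
          |     let gbps = width * height * refreshHz * bitsPerPixel / 1e9
          |     print("\(gbps) Gbit/s")                // ~26.5 Gbit/s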
         | 
         | But I still don't think it makes sense to physically merge the
         | two.
         | 
         | Displays wear out far faster than the GPUs driving them, and
         | just have much higher failure rates in general, being physical
         | targets for thrown handheld controllers and what not. Doesn't
         | seem wise to make the display so much more costly to replace by
         | shoving the GPU within it.
         | 
         | But this is Apple... making this expensive to replace is kind
         | of their M.O.
        
           | closewith wrote:
           | > Displays wear out far faster than the GPUs driving them,
           | and just have much higher failure rates in general, being
           | physical targets for thrown handheld controllers and what
           | not. Doesn't seem wise to make the display so much more
           | costly to replace by shoving the GPU within it.
           | 
            | Is this true? This seems nearly diametrically opposed to
            | my experience.
        
             | kiawe_fire wrote:
             | Mine too.
             | 
              | In fact I dare say the GPU has been the single most
              | failure-prone component of every computer I've owned for
              | the past 15 years.
             | 
             | Which only further illustrates the point that putting
             | something expensive and unreliable into an already
             | expensive device seems risky.
        
           | crazygringo wrote:
           | > _Displays wear out far faster than the GPUs driving them_
           | 
           | Do they?
           | 
           | In my experience they both last forever until you decide to
           | upgrade to better specs. And in that case it's quite likely
           | you upgrade both together.
           | 
           | Displays don't really "wear out" these days in normal usage.
           | 
           | Although I guess I'm not really sure why you say they're a
           | target for thrown handheld controllers ha... are you speaking
           | about yourself or about kids...?
        
             | pengaru wrote:
             | > Do they?
             | 
             | Yes.
             | 
              | Burn-in is a real problem [0], especially for OLEDs,
              | last I checked.
              | 
              | LED backlights wear out; I've personally helped multiple
              | people replace them in external monitors and laptops over
              | the years.
             | 
             | And they're obviously more fragile and exposed to the
             | elements than a GPU within a case.
             | 
             | > are you speaking about yourself or about kids...?
             | 
              | What difference does it make? Kids are a real factor;
              | they exist, and are rather common, actually. And these
              | days gaming has become just as popular among adults too.
             | 
             | It's as if everyone forgot the rash of broken displays when
              | the Wii came out... surely those people appreciated not
             | having to pay for a GPU too when replacing those displays.
             | 
             | [0] https://en.wikipedia.org/wiki/Screen_burn-
             | in#Plasma,_LCD,_an...
        
         | snowwrestler wrote:
         | Most people probably don't want an external GPU built into the
         | display.
         | 
         | What they maybe do want is to plug a mobile computer into a big
         | sharp hi-res monitor and get great performance. Since many
         | graphic loads scale with resolution, building some processing
         | horsepower directly into the monitor could be a way to deliver
         | the experience that people want, even if their current computer
         | doesn't have very powerful graphics.
        
       | athenot wrote:
       | Interesting. Does this mean the display itself would be running
        | the WindowServer process? On my Mac with a 5K external
        | display, it's consistently using about half a core all
        | throughout the day.
        
       | rektide wrote:
        | Sounds like a possible convergence of the iMac and the XDR monitor!
       | 
        | If you have a couple-thousand-dollar display, throwing $60 of
        | CPU, RAM, & storage on top of it seems like a pretty
        | dead-obvious win to me. Most companies would just have nothing
        | particularly good to offer: Android? Windows? webOS? OK
        | options, but not great, and they'd be paying a lot more for
        | less well-integrated CPUs. Apple, by contrast, has great chips
        | made not bought, in vast volume, and can take its pick of
        | macOS, tvOS, or iOS software.
       | 
        | I believe the new iMac 27 might not have Target Display Mode,
        | the ability to act as an external display, which old iMacs
        | had[1]. I'm not sure that kind of capability is really needed
        | for long, though. One of the things I'm looking forward to
        | with USB4 is that there is really good 40Gbps host-to-host
        | connectivity built right into
       | the spec. This is a generic connection, not specific to video,
       | but I can certainly imagine doing a really good low-latency
       | remote-desktop over a high quality connection like that. So, my
       | hope is, medium-term, that we don't need HDMI &c input on our
        | computers for them to act as auxiliary displays. There's
        | already a world of software out there for using tablets as
        | auxiliary displays, and my Linux desktop runs Sway, which
        | makes adding a virtual/remote display extremely simple; all
        | we're missing is a way for two computers to plug into one
        | another & connect! An amazingly, comedically weird situation.
       | 
       | [1] https://www.lifewire.com/use-imac-as-monitor-with-target-
       | dis...
        
         | salamandersauce wrote:
         | TDM is good because you can use it as a monitor long after
         | you've stopped using it as a computer.
         | 
         | My worry with embedding an A13 or whatever is that it will
         | require an Apple device to do anything or a bunch of hackery to
         | make it work on a non-Apple device. Previous Apple monitors
         | were already sometimes a PITA to get running with non-Apple
          | stuff. It just ends up with more ecosystem lock-in, and
          | that's not needed in a probably-$1000+ monitor.
        
       | Theophrastos wrote:
        | I would rather reconcile this rumour with this patent filed last
       | year about a "Peer-to-peer distributed computing system for
       | heterogeneous device types"
       | 
       | https://appleinsider.com/articles/20/09/08/apple-researching...
        
       | rsynnott wrote:
       | Why are we assuming this is a display, rather than some sort of
       | kiosk-iPad thing?
        
         | tmalsburg2 wrote:
         | Why are we not assuming it is both?
        
       | rtutz wrote:
        | Maybe this also aims to bring some Apple TV features into the
        | display, since some people use the same screen for work and
        | entertainment in their living rooms. It could also provide a
        | new node for Siri access.
        
       | gswdh wrote:
       | "...but there are still no rumours of an updated version..."
       | 
        | Is it me, or does this statement imply the need for an
        | updated version? To me, it implies an updated one is needed
        | because that's what is meant to happen: we get a new one
        | every x years just because, not because it's actually needed.
        
       | woleium wrote:
       | I'm guessing it's to go along with a DaaS (desktop as a service),
        | like the recently announced Windows 365.
        
       | mikece wrote:
       | Sounds like iOS or tvOS will be on board the monitor. I would
       | find it more interesting to have a monitor with 16 to 32 cores of
       | M1 (or M2?) that expand the compute and render capacity of a
       | MacBook Pro when plugged in. As nice as mobile video editing is
       | currently, having your monitor host an elastic compute capability
        | and help cut down 4K or 8K video render times would be very,
        | very interesting.
        
         | chaostheory wrote:
         | It could be the fabled Apple TV that Steve Jobs was hinting at
         | before his death.
         | 
          | To be honest, both Amazon and Google have gotten this right
          | already with no-touch voice controls.
        
       ___________________________________________________________________
       (page generated 2021-07-23 23:02 UTC)