[HN Gopher] The paper that came up with Apple Vision Pro's outwa...
       ___________________________________________________________________
        
       The paper that came up with Apple Vision Pro's outward-facing
       display [pdf]
        
       Author : ramboldio
       Score  : 87 points
       Date   : 2023-06-08 15:39 UTC (7 hours ago)
        
 (HTM) web link (www.medien.ifi.lmu.de)
 (TXT) w3m dump (www.medien.ifi.lmu.de)
        
       | CharlesW wrote:
        | Hey _@ramboldio_, as one of the authors of the paper, do you
       | have insider knowledge that Apple got the idea from your paper
       | vs. Facebook's "bizarre 'reverse passthrough'"1 prototype from
       | 2021? Is there a licensing arrangement? (Just curious, it's a
       | really interesting idea in any case!)
       | 
       | 1 https://www.laptopmag.com/news/facebooks-bizarre-reverse-pas...
        
         | ramboldio wrote:
          | No insider knowledge, I just know that the work from Facebook
          | cited our work:
          | https://research.facebook.com/blog/2021/08/display-systems-r...
          | 
          | They also added a display that works from different viewing
          | angles. So it looks like Apple may have implemented Meta's
          | research. The timeline could work.
         | 
         | For completeness, there is also another paper "FrontFace"
         | proposing a similar idea that was published around the same
         | time: https://dl.acm.org/doi/10.1145/3098279.3098548
        
           | CharlesW wrote:
           | Thanks for the info! It must feel great to see this becoming
           | a reality, and I hope it benefits you and your partners
           | professionally.
        
             | ramboldio wrote:
             | Yes, feels awesome, thanks!
        
           | ladberg wrote:
           | If you're suggesting that Apple implemented Meta's research
            | starting when it was published in 8/2021, then that timeline
           | absolutely does _not_ work.
        
             | [deleted]
        
             | jessriedel wrote:
             | Could you say more? I agree an Apple VR headset has been in
             | the works for longer than 2 years, but is it that crazy
             | that they were working on multiple approaches and didn't
             | settle on a final design until after 8/2021, which included
             | using some non-trivial ideas from that paper?
        
               | ladberg wrote:
               | Can't really give out any non-public info unfortunately.
               | 
               | I'm not saying that everything was fixed in stone by
               | 8/2021, but any big hardware features like the front-
               | facing display would take longer than that to develop
               | start-to-finish, so I'm just refuting the possibility
               | that Apple could have started development of a front-
               | facing display on the headset and had it ready on the
               | final product in <2 years.
               | 
               | It's not necessarily that a display itself (or any other
               | individual component really) takes >2 years to develop,
               | but that a tightly integrated cutting-edge system can't
               | have significant hardware features added on <2 years
               | before the final product is demoed to the public.
        
               | refulgentis wrote:
                | I'm gonna go ahead and play: 'I work near hardware in
                | FAANG, this is totally possible to pull off in 2 years'
               | 
               | ...but if you're asserting 'I work at Apple, impossible',
               | I'll give it to you.
               | 
                | Generally, people believe way too strongly that phones,
                | other hardware, etc. are set 3 years in advance. Note
                | it's well-reported that the Vision Pro just got to DVT in
                | the last 4-6 weeks.
        
       | Demmme wrote:
       | Is this verified?
       | 
        | I do like LMU (after all, I'm in Munich), but this idea is more
        | obvious than magic.
        
         | ramboldio wrote:
         | What I can verify:
         | 
         | To the best of my knowledge, this is the first work that
          | proposes putting a photorealistic, perspective-corrected face
         | on a VR headset.
         | 
         | "FrontFace" (https://dl.acm.org/doi/10.1145/3098279.3098548) is
          | the first work that proposes putting eyes on a display on a VR
          | headset to "lower the communication barrier".
        
       | DonHopkins wrote:
       | It should be touch sensitive so it can detect when somebody pokes
       | you in the eye.
        
         | ramboldio wrote:
         | Like this? https://gugenheimer.com/?portfolio=facetouch-
         | enabling-touch-...
        
       | gfodor wrote:
        | Anyone who thought reprojection was the solution for AR will
        | have considered this, since you'd have to find a way to simulate
        | glass. The first consumer-grade passthrough VR headset was the
        | GearVR in 2015 or so, so I don't think the idea was originally
        | conceived this late.
        
       | billconan wrote:
        | I'd prefer no outward-facing display if that would make the
        | Vision Pro cheaper.
        
         | haswell wrote:
         | I think that more than anything, the inclusion of this feature
         | is a hint about where Apple intends to take this product, and
         | that they want to send a crystal clear message that this device
         | is meant for interacting with other people.
         | 
         | And it seems this is such an important aspect of the product
         | that they're willing to reduce the addressable market from a
         | cost perspective.
         | 
         | This, to me, is what makes this product intriguing. And it
         | makes me think that Apple's real goal is something closer to a
         | pair of glasses, and they just know they can't get there
         | without a long series of iterations.
        
           | JohnFen wrote:
           | > that they want to send a crystal clear message that this
           | device is meant for interacting with other people.
           | 
           | But that display increases the "creepy factor" by orders of
           | magnitude.
        
             | haswell wrote:
             | This product isn't in the public's hands yet, so I think
             | it's a bit early to conclude that it's categorically
             | creepy.
             | 
             | But even if it ends up being a bit creepy, a) I think Apple
             | is well aware of that and b) again I think this highlights
             | how important they think it is to send the message from day
             | 1 that solving for isolation is a top priority and
             | intrinsic to their end-goal.
        
               | acomjean wrote:
                | When Google Glass came out, there was the question of
                | whether you'd want to talk to someone who has a camera
                | running at all times (recording?). It's weird how
                | perspective shifts.
                | 
                | Of course, in some places, if it's recording all the time
                | it might run into some weird wiretapping situation.
        
               | JohnFen wrote:
                | > It's weird how perspective shifts.
               | 
               | It's not clear that it has shifted yet. It wasn't clear
               | that Google Glass would be rejected until people started
               | wearing them publicly.
        
               | omegaworks wrote:
                | Google is also in the business of using every bit of
                | data it can possibly record about you to show you ads.
        
               | smoldesu wrote:
               | It's a shame I don't trust Apple not to do the same
                | anymore. If the advertisement creep into macOS is any
               | indication, the Vision Pro might have more pop-ups than
               | the Quest.
        
               | JohnFen wrote:
               | > This product isn't in the public's hands yet, so I
               | think it's a bit early to conclude that it's
               | categorically creepy
               | 
               | I was just going by the materials that Apple has
               | released. It's true that I haven't personally seen the
               | device in action in real life. But what Apple has shown
               | certainly triggers a "creepy factor" in me.
               | 
               | Whether or not this is an issue for many people, and
               | whether or not Apple will address it, isn't really
               | relevant. What I've seen right now creeps me out a bit.
        
               | haswell wrote:
               | > _Whether or not this is an issue for many people, and
                | whether or not Apple will address it, isn't really
               | relevant_
               | 
               | Why/how is this not relevant in a broader sense?
               | 
               | Or are you just saying this isn't relevant to you
               | personally?
        
               | JohnFen wrote:
               | Yes, to me, personally. I should have made that more
               | clear. Sorry.
        
               | fullspectrumdev wrote:
               | Pretty much everyone I have spoken to about the thing has
               | been like "that's fucking creepy".
               | 
               | A few have hoped you can display something else on it -
               | like cat eyes or something to make it less fucking weird.
        
         | woah wrote:
         | Apple isn't generally in the habit of cutting corners
        
           | jackmott42 wrote:
           | They cut headphone jacks to save space. No reason you can't
           | cut a pointless outer display to save space too!
        
             | wahahah wrote:
             | The 7 had space for a jack (see StrangeParts retrofitting
             | one); they removed it purely for sleekness (and money).
        
           | ec109685 wrote:
           | The first iPhone only had 2G, the first iPad was super slow,
            | and the first Apple Watch had unusable apps.
           | 
           | They're always evaluating if they can cut a corner.
        
         | dangus wrote:
         | I think it's one of the most important features of the design.
         | Apple is trying to get VR/AR users out in public so that it can
         | be a mainstream device.
         | 
         | In the long run, adding a second screen isn't that expensive,
         | and the cameras that capture the video of your eyes already
         | have to be inside the system to perform eye tracking. If
         | smartphone manufacturers can make folding phones with second
         | screens for under $1000 I think that the outward-facing display
         | is not the lowest hanging fruit for cost reduction.
        
         | drcode wrote:
          | In Apple's eyes, that would make you a VR zombie.
          | 
          | Apple has decided VR zombies hurt their brand & they won't
          | allow it.
        
           | layer8 wrote:
           | This is funny, because to my eyes their main marketing image
           | for the Vision Pro (dark-skinned woman) makes a face like a
           | dazed zombie.
        
           | billconan wrote:
            | The world is biased against us introverts by forcing us to
            | socialize.
            | 
            | You see, we wear a noise-cancelling headset to pretend to be
            | working, in order to avoid unwanted socialization.
        
             | oh_sigh wrote:
             | Look on the bright side, maybe you can hack the vision pro
             | to emulate direct eye contact in a conversation, when in
             | reality you're scrolling hn comments.
        
               | guhidalg wrote:
               | Honestly not a bad idea. You could even fall asleep and
               | continue to display blinking eyes with no one catching
               | on.
        
         | ramboldio wrote:
         | I have to try it out to see whether it's worth it. But since
          | the display can be fairly low-resolution, I don't expect that
          | it adds a lot of cost. Weight would be a bigger concern to me.
        
           | dangus wrote:
           | How heavy is an OLED panel? Isn't it like a piece of flexible
           | plastic?
           | 
           | Early reviewers seem to say that the metal construction of
           | the Vision Pro seems to be contributing a lot to its weight.
           | Most other headsets are all plastic.
        
           | mickdarling wrote:
           | My concern is primarily power draw. For a device with only a
            | 2-hour battery, every erg matters.
        
             | dangus wrote:
             | While the displays are significant areas of power draw, for
             | this product I don't think they're the low-hanging fruit.
             | This is a device that's performing a whole bunch of
             | computation in real time: video, lidar, infrared, eye
             | tracking, and then it has to drive two 4K+ displays, one
             | for each eye, with 3D-accelerated content being rendered at
             | all times.
             | 
             | Go to an AR web page on your phone and start playing around
             | with 3D objects in your space and you'll notice your phone
             | getting significantly warmer and drawing more power. The
             | Vision Pro is doing this literally all the time.
             | 
             | Also, the camera and sensor work that is tracking your eyes
             | has to happen whether or not there is an outward-facing
             | display.
             | 
             | Apple makes a watch with a display that is always on, and
             | their phones have high resolution OLEDs that can stay on
             | for over 13 hours (iPhone 14 Pro Max 150 nits brightness
             | doing continuous 5G web browsing).
        
             | wahnfrieden wrote:
              | It uses a slow refresh rate like new iPhones.
        
             | Geee wrote:
             | It's not on all the time. Only when you interact with
             | people.
        
               | gnicholas wrote:
               | Hm, doesn't it use the outward-facing display at all
               | times, to display your status (whether you're fully
               | immersed, are able to see others, or are recording a
                | video, etc.)? I'm sure there's more processing required
               | to show the eyes, but it's still illuminating the display
               | when it's showing these other states. These could be done
               | via simple LED indicators, saving some energy.
        
       | TastyLamps wrote:
        | Google did something similar in 2017 (although it overlays your
        | WHOLE FACE on the headset, and only when viewed through a
        | camera): https://blog.google/products/google-ar-vr/google-
       | research-an...
        
       | ladberg wrote:
       | This is not the same as the Vision Pro's display. This paper
       | describes tracking a single other person and displaying a
       | perspective-correct rendering for them, but the Vision Pro
       | displays a perspective-correct rendering for many viewpoints at
       | once using a lenticular screen.
       | 
        | Apple's solution works for >1 person at the same time and doesn't
       | require any external tracking (though it's already doing the
       | external tracking regardless), at the cost of lower resolution
       | and only being correct in one dimension vs two.
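        | 
        | As a rough sketch of what "perspective-correct for one tracked
        | viewer" means (this is the standard generalized/off-axis
        | projection trick, not code from the paper; names and numbers
        | are made up):
        | 
        |   import numpy as np
        | 
        |   def off_axis_projection(pa, pb, pc, pe, near, far):
        |       # pa, pb, pc: screen corners (lower-left, lower-right,
        |       # upper-left); pe: tracked eye position, all in meters.
        |       vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right
        |       vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up
        |       vn = np.cross(vr, vu)
        |       vn /= np.linalg.norm(vn)                   # screen normal
        |       va, vb, vc = pa - pe, pb - pe, pc - pe
        |       d = -np.dot(va, vn)                        # eye-to-screen
        |       l = np.dot(vr, va) * near / d
        |       r = np.dot(vr, vb) * near / d
        |       b = np.dot(vu, va) * near / d
        |       t = np.dot(vu, vc) * near / d
        |       # asymmetric (glFrustum-style) projection
        |       P = np.array([
        |           [2*near/(r-l), 0, (r+l)/(r-l), 0],
        |           [0, 2*near/(t-b), (t+b)/(t-b), 0],
        |           [0, 0, -(far+near)/(far-near),
        |            -2*far*near/(far-near)],
        |           [0, 0, -1, 0]])
        |       # rotate into screen axes, translate eye to origin
        |       M = np.eye(4); M[:3, :3] = np.vstack([vr, vu, vn])
        |       T = np.eye(4); T[:3, 3] = -pe
        |       return P @ M @ T
        | 
        | Re-render the face with that matrix every frame as the tracked
        | observer moves and you get the paper's single-viewer effect; the
        | lenticular screen is what lets Apple skip the per-viewer part.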
        
         | ramboldio wrote:
          | Adding multiple viewpoints is actually something Meta first
          | proposed, based on the paper above:
         | 
         | https://research.facebook.com/blog/2021/08/display-systems-r...
        
           | jessriedel wrote:
           | More detail from that post:
           | 
           | > There are several established ways to display 3D images.
           | For this research, we used a microlens-array light field
           | display because it's thin, simple to construct, and based on
           | existing consumer LCD technology. These displays use a tiny
           | grid of lenses that send light from different LCD pixels out
           | in different directions, with the effect that an observer
           | sees a different image when looking at the display from
           | different directions. The perspective of the images shift
           | naturally so that any number of people in the room can look
           | at the light field display and see the correct perspective
           | for their location.
           | 
           | > As with any early stage research prototype, this hardware
           | still carries significant limitations: First, the viewing
           | angle can't be too severe, and second, the prototype can only
           | show objects in sharp focus that are within a few centimeters
           | of the physical screen surface. Conversations take place
           | face-to-face, which naturally limits reverse passthrough
           | viewing angles. And the wearer's face is only a few
           | centimeters from the physical screen surface, so the
           | technology works well for this case -- and will work even
           | better if VR headsets continue to shrink in size, using
           | methods such as holographic optics.
        
         | jessriedel wrote:
         | Do you know where one could read more about Apple's technique?
         | I don't know much about lenticular displays or why the trick
         | only works in one direction (presumably the horizontal one).
        
           | ladberg wrote:
           | Think of it like those movie posters or bookmarks that change
           | as you move from side to side, but with a screen behind it.
           | 
           | The Wikipedia article might explain it better:
           | https://en.wikipedia.org/wiki/Lenticular_lens
           | 
           | It could work in both dimensions but you're sacrificing even
           | more resolution by doing it that way. For example imagine you
           | have a 1000x1000 pixel display (I just made this resolution
           | up) and you stick a 1D lenticular screen on top with a pitch
           | of 10 pixels. You've effectively split the display into 10
            | separate 100x1000 displays that are each viewed from a
           | different angle. You could instead use a 2D lenticular screen
           | and split it up into 100 100x100 displays viewable from a
           | different angle in a 10x10 grid at virtually no extra $ cost.
           | However, you're displaying at 1/10th the resolution just to
           | be able to support perspective-correct views from above or
           | below, which are way less common than from the side.
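            | 
            | A quick back-of-the-envelope version of that tradeoff (same
            | made-up numbers as above, just to make the bookkeeping
            | explicit):
            | 
            |   # Per-view resolution under a lenticular screen.
            |   # Illustrative numbers only (1000x1000 panel, pitch 10).
            |   panel_w, panel_h = 1000, 1000
            |   pitch = 10
            | 
            |   # 1D lenticular: horizontal resolution pays for views
            |   views_1d = pitch                           # 10 views
            |   per_view_1d = (panel_w // pitch, panel_h)  # (100, 1000)
            | 
            |   # 2D lenticular: both axes pay for a 10x10 grid of views
            |   views_2d = pitch * pitch                   # 100 views
            |   per_view_2d = (panel_w // pitch,
            |                  panel_h // pitch)           # (100, 100)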
        
           | ramboldio wrote:
            | Facebook's paper on their technology is quite amazing:
           | https://dl.acm.org/doi/10.1145/3450550.3465338
           | 
           | I'm guessing (?) Apple's approach is similar.
        
         | cubefox wrote:
         | Please provide a source which says that Apple's solution works
         | for more than one person. I'm pretty sure they didn't say
         | anything about that.
        
           | ladberg wrote:
           | https://www.youtube.com/live/GYkq9Rgoj8E?t=6729
        
       | AndrewKemendo wrote:
        | Having patented technology for a see-through AR display in 2016
        | that is cited by Apple [1], and knowing how crazy hard it is,
        | it's a little bit refreshing to know that Apple recognized see-
        | through HMD AR as too hard and decided to invest in compensatory
        | technology instead of trying to solve the hard see-through AR
        | problems.
       | 
        | [1] https://patents.google.com/patent/US10757400B2/en
        
         | nickelbob wrote:
         | Just curious - why is it so hard?
        
           | Animats wrote:
           | Nobody has a good way to draw dark while keeping it in focus.
        
           | slg wrote:
           | I'm not aware of all the details around the technical
           | complications, but from a physics perspective, you can't make
           | something darker and more opaque by adding more light.
           | Therefore, an AR headset needs to be able to at least
           | partially block light in order to make convincing images. It
           | seems a lot easier just to go with Apple's approach of
           | blocking all light rather than try to develop tech that will
           | only selectively block the light behind the AR objects while
           | allowing other light through.
           | 
           | The blocking all light approach also allows you to hide other
           | potential weaknesses of a device. For example, a lower field
           | of view is much more distracting in a pass-through AR device
           | as you still have your full peripheral vision. VR devices
           | will generally black out the light outside of the FOV making
           | it easier to ignore.
        
           | mshockwave wrote:
            | Another follow-up, slightly tangential question: how does the
            | F-35's helmet do that without blocking all the light?
        
           | AndrewKemendo wrote:
           | Off the top of my head:
           | 
            | 1. Field of view is limited by existing optics
            | miniaturization
           | 
           | 2. Subtractive shading (rendering black) might not be
           | solvable
           | 
            | 3. Variable-focus objects in the same scene require
            | projecting n>2 significantly different wavefronts - it's not
            | solved how to do this with a single vibrating element
        
             | pornel wrote:
             | Do you know why an LCD panel can't be used to block light?
        
               | moron4hire wrote:
               | The Magic Leap 2 does. It works ok. It's not as
                | completely useless as armchair quarterbacks online would
                | have you believe. But it's also not the greatest thing.
                | Objects are a little fuzzy around the edges and depend on
               | the accuracy of the object surface detection. So moving
               | objects can lag.
        
               | Tarragon wrote:
               | The LCD panel is too close and out of focus.
               | 
               | For example, close one eye and hold the tip of a pen
                | about an inch in front of the other, and you'll see that
                | it doesn't actually block any of the world.
        
               | Filligree wrote:
               | Which pixels need to be darkened to shade off an object
               | depends on the distance to that object, and will also
               | block the light coming from other objects at other
               | distances. It's very inconvenient.
        
           | jacobn wrote:
           | It's one of those "small, low energy, bright, pick 1-2,
           | definitely not 3" type situations.
           | 
           | Moore's law works great for semiconductors, but Maxwell
           | doesn't negotiate ;)
        
         | CharlesW wrote:
          | (Caveat: I don't know a lot about see-through HMD AR, and I
         | assume you're incredibly smart and that the patent is
         | innovative.)
         | 
          | > _...it's a little bit refreshing to know that Apple
          | recognized see-through HMD AR as too hard and decided to
         | invest in compensatory technology..._
         | 
         | I understand the framing as "compensatory technology", but is
         | it possible that what Apple's doing is the simpler _and_ better
          | way to solve the problem? See-through AR strikes me as an old-
         | school analog approach, like optical printing for special
         | effects. But a 100% digital vision pipeline seems like it could
          | unlock interesting capabilities like "night vision", new ways
         | of highlighting interesting objects, etc.
        
           | gjsman-1000 wrote:
           | I think the most prominent attempt at see-through AR was the
           | Microsoft HoloLens. But if you've actually tried the
           | HoloLens, the Field of View is atrociously, tragically small.
           | The first HoloLens had a field of view of 30deg*17.5deg. The
           | second HoloLens improved to 43deg*29deg, but it's still best
           | described as "cramped." Couple that with almost all of the
           | compute budget for the device going into vision processing
           | and having very little compute left for actually running apps
            | (the first HoloLens having a 1 GHz Intel Atom from 2015, the
           | second a superior... Snapdragon 850).
           | 
           | The other problem, of course, is that nothing can be truly
           | solidly-colored. Everything has some opacity - which,
           | combined with the FOV issue, is why HoloLens was never
           | marketed as having anything to do with VR.
        
             | dfsl wrote:
             | Apple's Vision Pro launch shows what Microsoft could have
             | truly achieved with HoloLens
             | 
             | https://tecl.ink/2023/06/apples-vision-pro-wearable-
             | headset-...
        
             | moron4hire wrote:
             | The vision processing was done with a separate SoC. Your
             | applications were not contending for compute time with the
             | vision processing. That's just how anemic the CPU was.
        
           | AndrewKemendo wrote:
           | In theory the wavefront emulator is the simplest display -
           | all of the "pixels" are rendered in your brain so you don't
           | have to build an actual "display"
           | 
            | HOWEVER, if someone can get the input -> photon production
            | pipeline to be less than ~10ms, then that does solve a lot
            | of the rendering issues. It doesn't solve all of the other
            | long-term problems that come with that amount of hardware,
            | though - including weight and complexity.
            | 
            | That said, there are a lot of known unknowns that need to be
            | solved; for example, I don't have a solution for micropiezo
            | resonance issues that I'm sure will crop up.
        
           | alfalfasprout wrote:
           | Agreed. In fact, this approach is also likely better for
            | military applications, which I hope they explore.
            | 
            | Currently, NVGs that are fielded by soldiers are already
            | displaying images in a way that's not pass-through (using
            | classic image intensification tubes). Something like the
            | Apple Vision headset in a lighter and more durable form
            | factor would allow for e.g. fusion imagery (fusing visible,
           | thermal, and night vision).
        
             | Despegar wrote:
             | It's not in Apple's interests to get into military
             | contracting because of the potential to damage their brand,
             | as well as avoiding the geopolitical tensions that would
             | invite.
        
           | jolmg wrote:
           | > is it possible that what Apple's doing is the simpler and
            | better way to solve the problem? See-through AR strikes me
           | as an old-school analog approach, like optical printing for
           | special effects.
           | 
            | I think they're different, with neither one better than the
            | other. If I wanted to drive with an HMD on, or otherwise be
            | in a situation where it could be deadly to have my sight
            | turned off for even a second, or to have some lag or stutter
            | or other glitch in my eyesight, I'd much rather have a see-
            | through AR HMD. One's sense of sight seems much more reliable
            | with it by its very nature. You simply don't have those modes
            | of failure with transparent plastic, no matter what's going
            | on in the hardware/software.
        
             | flangola7 wrote:
             | Your steering wheel and pedals are digital too. Airliner
             | controls have been digital for decades. Factories have life
             | or death safety systems that count on silicon and a
             | software stack to respond quickly and without error.
             | 
              | Apple could make it possible if they wanted to.
        
               | [deleted]
        
               | sdeframond wrote:
                | I believe cars' steering is analog because it would be
                | too expensive to achieve the required level of
                | reliability.
                | 
                | Planes are crazy expensive anyway, so fly-by-wire can be
                | afforded.
        
               | faitswulff wrote:
               | These are all very different from the "HUD goes dark and
               | my meat brain panics, causing me to get into an accident"
               | scenario that GP is talking about.
        
               | numpad0 wrote:
                | Steering is analog. One of the reasons why by-wire
                | steering control did not succeed is user feedback; it was
                | tried and failed. The brake pedal is analog too, just
                | assisted and override-able. The gas pedal has been all
                | digital for some time.
                | 
                | Also, it's true that the Airbus sidestick is fully
                | electronic, but there's a nuance in engine control being
                | in the center on airplanes. Until it was automated,
                | engine control on airliners was a task of the flight
                | engineer, like it still is on ocean-going ships. So fully
                | digitalized engine control is a replacement for the FE,
                | not necessarily an automation of what originally was a
                | pilot's task. Which means, I think, pushing a vehicle
                | forward was never necessarily the responsibility of a
                | pilot or a driver, though controlling where not to go is.
        
               | fooker wrote:
               | >Steering is analog.
               | 
               | This is changing rapidly. eg:
               | 
               | https://www.infinitiusa.com/infiniti-
               | news/technology/direct-...
        
               | Despegar wrote:
               | Which is probably the point of the R1 and the real-time
               | subsystem.
        
         | warning26 wrote:
         | I'm quite certain they're _still_ trying to solve the hard see-
         | through AR problems, and are hoping to release a future Vision
         | headset with a true see-through display.
         | 
         | But otherwise I agree, it makes sense for them to focus on what
         | they can do best with current tech as a stopgap. With current
         | see-through HMD tech, AR ends up incredibly disappointing. (See
         | also: Hololens & Magic Leap's limited FOV)
        
           | mensetmanusman wrote:
           | They are still asking suppliers to solve the technology, yes.
        
           | birdyrooster wrote:
           | Hopefully we are not forgetting about Bosch's Retinal
            | Projection, which solves many problems with lensing HMD AR.
            | If I were a betting man, I would say Bosch came up with this
            | tech almost specifically for Apple to integrate it into
            | interactive glasses.
        
           | dfsl wrote:
           | Yes...
           | 
           | https://tecl.ink/2023/06/apples-vision-pro-wearable-
           | headset-...
        
           | AndrewKemendo wrote:
           | Right, I think it's going to come eventually but it's really
           | really really hard
        
           | samwillis wrote:
            | My understanding is that occlusion in "see-through" AR is an
            | unsolved problem; everything looks ghostly and somewhat
            | semitransparent. Until someone solves that, I suspect the re-
           | projection method is the only viable option.
        
             | AndrewKemendo wrote:
             | Yes "projecting black" is effectively impossible - every
             | once in a while someone from Magic Leap would claim they
             | had solved it but everyone knew it was bullshit
        
               | jessriedel wrote:
               | I'm sorta surprised by this. Don't liquid crystal
                | displays in laptops do exactly that? LCDs are a
                | controllably transparent array that is overlaid on a
                | uniform backlight. So why can't one create a set of
                | glasses where the lenses are an LCD array without the
               | backlight?
        
               | AndrewKemendo wrote:
               | An LCD simply turns "off" to display "black" - so the
               | emissions stop (relative to surrounding pixel emissions)
               | 
               | With a projector, you can't "throw" nothing (aka black).
               | As a result "projected black" is simply lack of
               | projection.
               | 
               | In the case of a translucent or transparent reflection or
               | waveguide surface - which is what the projection reflects
               | off of - "black" is whatever the darkest part of the
               | surface is. In effect whatever else is emitting from the
               | surface that you're looking at will change the depth of
               | "black" you get.
               | 
                | This is why the HoloLens and other see-through AR devices
               | are always tinted, to set a higher threshold for "black"
               | than the surrounding unaided view.
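                | 
                | As a toy model of why (illustrative numbers, not any
                | real device's pipeline): with see-through optics the
                | display can only add light to what the world already
                | delivers, while video pass-through replaces it.
                | 
                |   import numpy as np
                | 
                |   # World luminance reaching the eye through the
                |   # combiner (arbitrary units): bright, mid, dark.
                |   bg = np.array([0.8, 0.5, 0.1])
                |   img = np.zeros(3)        # we "project black"
                | 
                |   # Optical see-through: light only adds, so the
                |   # result can never be darker than the background.
                |   see_through = bg + img   # == bg
                | 
                |   # Video pass-through: the world is just pixels,
                |   # so black really is black.
                |   pass_through = img       # == 0 everywhere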
        
               | moron4hire wrote:
               | LCDs do not emit light. They have a light emitter (or
               | reflector, for passive monochrome displays ala an LCD
               | watch face) behind them and the liquid crystal part
               | selectively allows that light to be blocked or pass
               | through.
               | 
                | There are three layers of polarizing material. The two
               | outer layers are at right-angle polarizations to each
               | other and normally would be completely opaque on their
               | own. When power is applied to the liquid crystal, it
               | twists the crystal's polarization to be at a 45deg angle
               | to the other two layers, which then permits some of the
               | incident light to pass through.
               | 
               | An optically transparent waveguide display can use an LCD
               | layer to block light coming through the front and then
               | not render graphics on that area of the display. It will
               | be opaque black at that point (though rather fuzzy around
               | the edges, as the LCD won't be in focus).
               | 
               | Magic Leap 2 actually employs this technique. It's... a
               | lot like the rest of the device: a good idea on paper.
        
               | cbm-vic-20 wrote:
                | Magic Leap... How are they still around? And how do they
                | have a valuation in the billions?
        
               | moron4hire wrote:
               | I don't know. I mean, their newest device _is_ better
                | than the HoloLens 2. But like, that's just a relative
               | statement. Waveguide displays are still objectively
               | dogshit.
        
               | dfsl wrote:
               | HoloLens stopped being a focus of Microsoft and it shows
               | in the product
               | 
               | https://tecl.ink/2023/06/apples-vision-pro-wearable-
               | headset-...
        
               | AndrewKemendo wrote:
                | That's a fair response, though I hope you'd agree that in
                | the context of discussing pass-through vs. see-through
                | "black", the majority of use cases are indeed fully
                | occluding/lit LCDs near the eye and not ML-style lenses.
        
               | moron4hire wrote:
               | But you were talking about the Magic Leap, replying to
               | someone talking about waveguides...
        
             | daniel_reetz wrote:
             | I work in the space. Occlusion only matters if it matters.
             | "Ghostly" information display is suitable for a shockingly
             | wide range of tasks.
        
               | samwillis wrote:
               | While that's true, I believe these (mass market) devices
               | need to be general purpose, and so need to cover the use
               | cases where occlusion is necessary.
        
               | K0balt wrote:
               | This is also what I intuited. Can you give some examples
                | of things that require "projected black", especially
               | those that couldn't be solved by using a darkened room?
        
               | [deleted]
        
       ___________________________________________________________________
       (page generated 2023-06-08 23:01 UTC)