Posts by keithahern@mastodon.social
 (DIR) Post #AX4ev3myQNZKvGpbZg by keithahern@mastodon.social
       2023-06-25T09:22:59Z
       
       0 likes, 0 repeats
       
       @jjtech @ShinyQuagsire @zhuowei I’m hoping this is better than my hacky mouse based approach, I couldn’t get the HID based approach working, I want to stream from an AR headset.
       
 (DIR) Post #AXARBeGGM7FoGOJ3jc by keithahern@mastodon.social
       2023-06-27T17:07:35Z
       
       0 likes, 1 repeat
       
       @ShinyQuagsire @jjtech @zhuowei Yeah, progress. Here’s the Magic Leap One publishing pose data to a UDP receiver I added to your XRGyroControls. BTW the rotations are quaternions, so you do need that 4th ‘w’ parameter in your IndigoHIDMessage camera function.
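       A minimal sketch of what a pose packet like this could look like, assuming a simple fixed-layout UDP payload of seven floats (the actual XRGyroControls wire format may differ); the point is that the quaternion carries all four components, including ‘w’:

       ```python
       import socket
       import struct

       def send_pose(sock, addr, px, py, pz, qx, qy, qz, qw):
           # Pack position + full quaternion as 7 little-endian floats.
           # The 'w' component must be included: a quaternion needs all
           # four values (x, y, z, w) to describe a rotation unambiguously.
           payload = struct.pack("<7f", px, py, pz, qx, qy, qz, qw)
           sock.sendto(payload, addr)
       ```

       The receiver side would `struct.unpack("<7f", data)` each datagram and feed position and rotation into the simulator's camera.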
       
 (DIR) Post #AXARBf3XOufYjDMPXU by keithahern@mastodon.social
       2023-06-27T17:08:25Z
       
       0 likes, 0 repeats
       
       @ShinyQuagsire @jjtech @zhuowei My ultimate plan is to send simulator stereo rendered screens with a blank environment back to the Magic Leap One and render it in the headset, effectively remote running visionOS on a Magic Leap One (< $200 from your local eBay seller!). I hope to replace the shipped environments with a blank one, stream it back to the AR headset, and chroma key out the background or something less hacky. How is the stereo rendering coming along? 🙂
       
 (DIR) Post #AXARBfvQAZlrQKZRWi by keithahern@mastodon.social
       2023-06-27T17:58:37Z
       
       0 likes, 0 repeats
       
       @ShinyQuagsire @jjtech @zhuowei Here's my fork https://github.com/keithahern/XRGyroControlsUDP
       
 (DIR) Post #AXBu66qkF9S91J2do8 by keithahern@mastodon.social
       2023-06-29T17:48:46Z
       
       0 likes, 0 repeats
       
       @zhuowei Yes, please continue working on it. I’m not sure what the lowest latency stack will be. For AR I only want to stream the visionOS overlay over black.
       
 (DIR) Post #AXBu67a7WRkVI2GsXA by keithahern@mastodon.social
       2023-06-29T17:50:00Z
       
       0 likes, 0 repeats
       
       @zhuowei So if the simulator 3D environment is keyed out, it should compress pretty well.
       
 (DIR) Post #AXC0D22P8EuPqeBB7w by keithahern@mastodon.social
       2023-06-29T18:04:27Z
       
       0 likes, 0 repeats
       
       @zhuowei Actually it might need some surfaces for plane detection.
       
 (DIR) Post #AXC0D2qk75AuMljNaa by keithahern@mastodon.social
       2023-06-29T18:33:43Z
       
       1 like, 0 repeats
       
       @zhuowei Presumably there’s a low-poly environment that’s generated from the real world (or from the high-detail RealityKit environment in the simulator), i.e. the room scan. An external AR headset could supply this low-res environment from its own room scan, and that could be inserted into the simulator so then you get accurate planes and surfaces etc.
       
 (DIR) Post #AXSJXKCjuL3Ivz6N0a by keithahern@mastodon.social
       2023-07-07T13:30:15Z
       
       0 likes, 0 repeats
       
       @zhuowei I played around with this using OBS and chroma keying out colours while looking at blocks of colour in the sim.  Due to the inherent alpha blending of the app icon grid and frosted glass panels it didn't look great. Are you able to isolate the layers pre-composition?
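       A toy illustration of why hard chroma keying breaks on blended UI, assuming a naive per-pixel colour-distance key (not what OBS actually implements internally): pixels where the key colour has been alpha-blended into the foreground land outside the tolerance, so edges and frosted-glass regions keep a colour fringe.

       ```python
       def chroma_key(pixels, key=(0, 255, 0), tol=30):
           # Naive keyer: drop pixels whose Manhattan colour distance to the
           # key colour is within tolerance; keep everything else.
           # Semi-transparent UI blends the key colour into its edges, so
           # those blended pixels exceed the tolerance and survive as a
           # tinted fringe instead of turning transparent.
           out = []
           for (r, g, b) in pixels:
               dist = abs(r - key[0]) + abs(g - key[1]) + abs(b - key[2])
               out.append(None if dist <= tol else (r, g, b))
           return out
       ```

       For example, a pure key pixel (0, 255, 0) is removed, but a 50% frosted-glass blend of white over the key, (128, 255, 128), is kept with a green cast. Keying per-layer before composition avoids this, which is why isolating the layers matters.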
       
 (DIR) Post #AXSJXNpUQipcB2FrKC by keithahern@mastodon.social
       2023-07-07T13:34:49Z
       
       0 likes, 0 repeats
       
       @zhuowei OBS made the chroma keyed stream available as a virtual camera device; I then streamed that over WebRTC to the Magic Leap, which was also sending gyro events. It was laggy and I haven't gone back to optimise.