[HN Gopher] Show HN: I built an interactive cloth solver for App...
       ___________________________________________________________________
        
       Show HN: I built an interactive cloth solver for Apple Vision Pro
        
       A bit more context - the cloth sim is part of my app, Lungy
       (https://www.lungy.app). It's designed to be an active meditation /
       relaxation app, so you can play relaxing instruments in space and
       do immersive breathing exercises. The original Lungy is a
       breathing app available for iOS that pairs real-time breathing
       with interactive visuals.

       The cloth sim uses Verlet integration, running on a regular grid.
       So far, I have tried a couple of different cloth scenes - a sort
       of touch-reactive 'pad', where different parts of the cloth are
       mapped to different sounds, and a cloth that blows in sync with
       breathing. The collision detection is a little tricky with the
       deforming mesh, but seems to work ok. Overall, it seems like a
       cool interaction to explore.

       The cloth sim is live on the App Store now (and free) - I would
       love to hear feedback from anyone with a Vision Pro.

       App Store (Vision Pro): https://apps.apple.com/app/id6470201263
       Lungy, original for iOS: https://apps.apple.com/app/id1545223887
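
       For anyone curious about the core loop: a bare-bones Verlet step
       looks roughly like the Swift sketch below. This is an illustration
       of the technique, not the exact code from the app; the constraint
       pass and collision handling are elided.

           // Position-based Verlet integration: velocity is implicit
           // in the difference between current and previous positions.
           struct ClothSim {
               var positions: [SIMD3<Float>]  // current vertex positions
               var previous: [SIMD3<Float>]   // positions from last step
               let gravity = SIMD3<Float>(0, -9.8, 0)

               mutating func step(dt: Float, damping: Float = 0.99) {
                   for i in positions.indices {
                       let current = positions[i]
                       // x' = x + (x - xPrev) * damping + a * dt^2
                       positions[i] = current
                           + (current - previous[i]) * damping
                           + gravity * (dt * dt)
                       previous[i] = current
                   }
                   // ...then iterate distance constraints between
                   // neighbouring grid vertices to keep the cloth's
                   // shape, before handling collisions.
               }
           }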
        
       Author : lukko
       Score  : 34 points
       Date   : 2024-05-31 20:23 UTC (1 day ago)
        
 (HTM) web link (www.youtube.com)
 (TXT) w3m dump (www.youtube.com)
        
       | jncfhnb wrote:
       | It seems weird that the interface encourages interacting with it
       | from afar and has no indicators as to where your actions would
       | affect it (like a highlight on a pickupable node). Is there some
       | intuitive reason for that that's not obvious from the video? Not
       | a complaint, just curious.
        
         | lukko wrote:
         | Good point - I probably should have shown the direct
         | interaction too (touching the actual fabric in space) - there's
         | a gif here: https://jmp.sh/s/lHqJm6NEvqMqkXMPyUpZ. It was a
         | little bit laggy when also screen recording on device.
         | 
         | In the video, I am looking at where I want to interact and
         | then using a pinch gesture; the sounds are mapped to different
         | cells of the cloth. By either looking and tapping, or playing
         | directly, you can hopefully play your intended sound.
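         |
         | To picture the mapping, it's roughly the sketch below
         | (illustrative only - the names are made up, it's not the
         | app's actual code):
         |
         |     // Map a touch point in the cloth's local XY plane to a
         |     // grid cell, which then indexes into a table of sounds.
         |     struct ClothGrid {
         |         let rows: Int, cols: Int
         |         let size: SIMD2<Float>  // cloth width/height in metres
         |
         |         func cell(at p: SIMD2<Float>) -> (row: Int, col: Int) {
         |             // Normalise to [0, 1) across the cloth, then bucket.
         |             let u = min(max(p.x / size.x + 0.5, 0), 0.999)
         |             let v = min(max(p.y / size.y + 0.5, 0), 0.999)
         |             return (Int(v * Float(rows)), Int(u * Float(cols)))
         |         }
         |     }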
        
           | jncfhnb wrote:
           | So it looks for gesture events and maps them to locations
           | where your eyes are looking rather than where the gesture was
           | done?
           | 
           | Is that how most Vision Pro interfaces work?
        
             | lukko wrote:
             | Yes, that's basically it. If the gesture is targeted at
             | something, either by looking at it or touching it directly
             | in space, then it responds. Direct touch takes priority,
             | so I think you could look away but still touch in space.
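             |
             | In SwiftUI terms the pattern is roughly the sketch below
             | (a minimal illustration, not Lungy's code) - the same
             | targeted gesture fires whether the input is gaze plus an
             | indirect pinch or a direct touch in space:
             |
             |     import SwiftUI
             |     import RealityKit
             |
             |     struct ClothView: View {
             |         var body: some View {
             |             RealityView { content in
             |                 // Build the cloth entity here; it needs a
             |                 // CollisionComponent and an
             |                 // InputTargetComponent to receive input.
             |             }
             |             .gesture(
             |                 SpatialTapGesture()
             |                     .targetedToAnyEntity()
             |                     .onEnded { value in
             |                         // value.entity is whatever entity
             |                         // the gesture resolved to
             |                         print("hit \(value.entity.name)")
             |                     }
             |             )
             |         }
             |     }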
        
       | kromokromo wrote:
       | From the title I thought you'd made an optical recognition
       | system for sorting out the laundry, finding sock pairs, etc.
       |
       | Very cool sim, but I'm still kind of disappointed. Will build
       | this someday.
        
         | lukko wrote:
         | Haha, I wish - that would be great!
        
           | xattt wrote:
           | You'll have to track all your clothing with UWB tags!
        
       | throwaway115 wrote:
       | Congrats! What has it been like developing on the AVP?
        
         | lukko wrote:
         | Thanks!
         | 
         | It's been fun. It's not too dissimilar to iOS - a lot of the
         | spatial capabilities are linked very closely with RealityKit,
         | so it's worth looking at that API if you're interested. I was
         | thinking we'd use Metal for rendering, but I think because of
         | privacy issues with accessing the raw camera data, Metal is
         | only supported in 'full' immersive spaces - not the mixed
         | (camera feed + overlay) mode.
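         |
         | The split looks roughly like this in SwiftUI (a sketch of the
         | scene setup, not the app's code):
         |
         |     import SwiftUI
         |     import RealityKit
         |
         |     @main
         |     struct ClothApp: App {
         |         // .mixed keeps the camera passthrough with content
         |         // overlaid; .full hands the whole view to the app,
         |         // which is the mode where Metal rendering (via
         |         // CompositorServices) is allowed.
         |         @State private var style: ImmersionStyle = .mixed
         |
         |         var body: some Scene {
         |             ImmersiveSpace(id: "cloth") {
         |                 RealityView { _ in
         |                     // cloth entities would be added here
         |                 }
         |             }
         |             .immersionStyle(selection: $style, in: .mixed, .full)
         |         }
         |     }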
        
           | goeiedaggoeie wrote:
           | The APIs are a lot more limited than on iOS, though, in my
           | experience.
        
       | xyst wrote:
       | Is the choppy movement of the cloth because of the limitations of
       | the device or something else?
       | 
       | Can't believe this is what a $4-5K piece of tech looks like.
       | Wild.
        
         | lukko wrote:
         | It's because of the simultaneous recording / screen capture -
         | it's smoother on device with less lag.
        
       | wouldbecouldbe wrote:
       | I've been trying to figure out if the Vision Pro suffers the
       | same fate as other AR glasses, in that it can't be used in the
       | sun. Anyone know?
        
         | wtallis wrote:
         | It's not AR glasses. It's VR with lots of cameras.
        
       | pavlov wrote:
       | I literally haven't thought about the Vision Pro even a single
       | time in months.
       | 
       | Wild for such a hyped-up technology product. It just shipped and
       | vanished.
        
         | sneak wrote:
         | I use mine almost every day; it's awesome for traveling. It
         | might be the best "watching videos in bed" device ever made.
        
           | consumer451 wrote:
           | This frightens me. I have been working on reducing how much
           | time I spend watching videos in bed, and all I have to tempt
           | me is a phone and a laptop. It's such an easy trap to fall
           | into.
           | 
           | Spending time horizontal while not asleep is very unhealthy
           | for your heart.
        
         | BoorishBears wrote:
         | Their stance on developer access made it DOA on day 1.
         | 
         | All apps are flying blind when you're multitasking: apps can't
         | even use iOS-style marker tracking unless you run them in
         | "immersive mode", which makes them the only app running. That,
         | combined with no camera access at all, extremely laggy hand
         | tracking, and an inability to do room scale without constant
         | passthrough, makes it somehow less capable than a Quest 3.
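         |
         | For example, hand tracking goes through ARKitSession, and the
         | provider only delivers anchors inside an immersive space
         | (sketch of the public API, not any particular app):
         |
         |     import ARKit
         |
         |     let session = ARKitSession()
         |     let hands = HandTrackingProvider()
         |
         |     func track() async throws {
         |         // Only works in an immersive space; in the shared
         |         // space this provider receives no anchor updates.
         |         try await session.run([hands])
         |         for await update in hands.anchorUpdates {
         |             let tip = update.anchor.handSkeleton?
         |                 .joint(.indexFingerTip)
         |             _ = tip  // use the joint transform here
         |         }
         |     }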
         | 
         | I bought a Quest 3 after my Vision Pro and it's a legitimately
         | better piece of hardware except for the displays and
         | passthrough (which is gimped in usage on the AVP). Even the
         | lenses are better on the Q3. Meta has a commanding lead in VR,
         | after all.
        
           | jwells89 wrote:
           | On the other hand, the weakness of the onboard compute
           | significantly restricts the Quests' potential, as does their
           | inability to take a DisplayPort input from a PC (the tether
           | can only carry crappy compressed video).
           |
           | I own a Quest 2 and an AVP, and while the Quest is alright
           | for what it does and regularly gets used (mainly for
           | PC-tethered Beat Saber), I'm on the lookout for a quality
           | dedicated PCVR-oriented replacement that doesn't break the
           | bank. I don't see myself buying another Quest unless they
           | add back DisplayPort input, or the onboard compute both
           | becomes more powerful and gains the ability to run Steam, so
           | I don't have to buy games from Facebook to play untethered.
        
         | Hamuko wrote:
         | Or, if you live outside the US, it just vanished, since it
         | never shipped.
        
       | hi-v-rocknroll wrote:
       | Willing to trade a barely-used Quest Pro for an AVP. ;@D
        
       | superamit wrote:
       | This is really cool! I save AVP for work but have been eager to
       | find new meditation interfaces because it feels like AR has so
       | much potential for this. Will try it out!
        
       ___________________________________________________________________
       (page generated 2024-06-01 23:00 UTC)