___________________________________________________________________
Launch HN: Lifecast (YC W22) - 3D video for VR
Hi HN, I'm Forrest of Lifecast (https://www.lifecastvr.com), with
my co-founder Mateusz. We make software to create 3D video for VR,
robotics simulation, and virtual production. We convert any VR180
video or photo into our 6DOF VR video format, or into meshes
compatible with Unreal Engine. Our 3D reconstruction is based on
computer vision for dual fisheye lenses and deep learning.

VR video can be categorized as 3DOF (three degrees of freedom) or
6DOF (six degrees of freedom). 3DOF responds only to rotation,
while 6DOF responds to both rotation and translation, meaning you
get to move your head. VR games are 6DOF, but most VR videos are
3DOF.
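To make the 3DOF vs 6DOF distinction concrete, here is a toy sketch
(not Lifecast's code; it assumes a yaw-only head rotation and skips
projection entirely) of how a player would transform a scene point
in each mode:

```python
import math

def rotate_y(p, angle):
    """Rotate a 3D point around the Y (up) axis by angle radians."""
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def view_3dof(point, head_pos, yaw):
    # A 3DOF player applies only the head's rotation; translation
    # (head_pos) is ignored, so the scene never shows parallax.
    return rotate_y(point, -yaw)

def view_6dof(point, head_pos, yaw):
    # A 6DOF player first moves the world into the head's frame
    # (translation), then applies the rotation.
    x, y, z = point
    hx, hy, hz = head_pos
    return rotate_y((x - hx, y - hy, z - hz), -yaw)

# A point 1 m in front of the origin; lean the head 10 cm right.
p = (0.0, 0.0, -1.0)
lean = (0.1, 0.0, 0.0)
print(view_3dof(p, lean, 0.0))  # unchanged: (0.0, 0.0, -1.0)
print(view_6dof(p, lean, 0.0))  # shifted: (-0.1, 0.0, -1.0)
```

Leaning changes what a 6DOF player draws but leaves a 3DOF player's
output untouched; that mismatch between what your eyes see and what
your body feels is the source of the discomfort described below.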
3DOF can cause motion sickness and eye strain due to incorrect 3D
rendering. 6DOF VR video fixes these problems for a more
comfortable and immersive experience, but it is harder to make
because it requires a 3D model of each frame of video. There are
some prototypes of 6DOF VR video systems at big tech companies,
but they typically involve arrays of many cameras, so they are
expensive, not very portable, and generate an impractical amount
of data.
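A VR180 camera sidesteps the array problem with just two fisheye
lenses. Lifecast's reconstruction details aren't public; purely as
an illustration of the dual-fisheye geometry mentioned above, the
standard equidistant fisheye model maps a pixel's distance from the
image center to an angle from the optical axis (r = f * theta), and
unprojecting pixels to rays is the first step before matching rays
across the two lenses to triangulate depth. The focal length `f`
and center `(cx, cy)` below are hypothetical calibration values:

```python
import math

def fisheye_ray(u, v, cx, cy, f):
    """Unproject pixel (u, v) from an equidistant fisheye image to
    a unit-length 3D ray. (cx, cy) is the image center and f is the
    focal length in pixels; model: radial distance r = f * theta."""
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)
    if r == 0.0:
        return (0.0, 0.0, 1.0)     # center pixel: down the axis
    theta = r / f                  # angle from the optical axis
    s = math.sin(theta) / r
    return (dx * s, dy * s, math.cos(theta))

# The image-center pixel unprojects straight down the optical axis:
print(fisheye_ray(320.0, 320.0, 320.0, 320.0, 200.0))  # (0.0, 0.0, 1.0)
```

A real pipeline would also model lens distortion and the baseline
between the two lenses, but the principle is the same.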
Because of these challenges, 6DOF hasn't been widely adopted by VR
video creators.

In 2015 I was working on ads at Facebook, but I
was more excited about VR. I built 3D cameras out of Legos and
GoPros, showed some of this at a hackathon, and eventually they let
me do that as my day job. I was the first engineer on Facebook's 3D
VR camera team, which made Surround 360 (an open-source
hardware/software 3D VR camera) and Manifold (a ball of 20+
cameras for 6DOF). After Facebook, I was a tech lead on Lyft's
self-driving car project and on Google X's Everyday Robot project.

I started Lifecast because I wasn't satisfied with the progress on
6DOF VR video since I left Facebook. I learned new ideas from
robotics which can improve VR video. The Oculus Quest 2 has just
enough power to do something interesting with 6DOF. There have also
been advances in computer vision and deep learning in the last few
years that make it possible to do 6DOF better.

Our software makes
it simple to create 6DOF VR video using any VR180 camera. It's a
GUI for Mac or Windows which takes VR180 video or photos as input
and produces Lifecast's 6DOF VR video format (more info:
https://fbriggs.medium.com/6dof-vr-video-from-vr180-cameras-...).
VR180 video can be created with any VR180 camera; the Canon R5 is
one of the best on the market right now. We make a video player for
WebVR which runs on desktop, mobile, or VR. Playing the videos on
the Quest 2 doesn't require installing any software, just visiting
a web page in the Oculus Browser.

In addition to our 6DOF format, the software can also output point
clouds (.pcd) or triangle meshes (.obj) compatible with Unreal
Engine. We are seeing interest in using this for virtual production
(2D film-making in a game engine) and for creating environments for
robotics simulation.

This recent video review/tutorial does a nice job of explaining our
tech: https://www.youtube.com/watch?v=_4a-RnTLu-I (video by Hugh
Hou, not us). For something more interactive, the thumbnails on
https://lifecastvr.com are links to demos that run in the browser
or in VR.
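The point-cloud and mesh outputs mentioned above boil down to
back-projecting a per-pixel depth map into 3D. A minimal sketch,
assuming an idealized pinhole camera rather than Lifecast's actual
projection, with `depth` holding metric distance along each ray:

```python
import math

def depth_to_points(depth, fov_deg=90.0):
    """Back-project a (rows x cols) depth map into 3D points."""
    rows, cols = len(depth), len(depth[0])
    f = (cols / 2) / math.tan(math.radians(fov_deg) / 2)  # focal, px
    cx, cy = cols / 2, rows / 2
    points = []
    for v in range(rows):
        for u in range(cols):
            d = depth[v][u]
            # Unit ray through pixel (u, v), scaled by depth d.
            x, y = (u + 0.5 - cx) / f, (v + 0.5 - cy) / f
            n = math.sqrt(x * x + y * y + 1.0)
            points.append((d * x / n, d * y / n, d / n))
    return points

def write_obj(points, path):
    # .obj stores one "v x y z" line per vertex.
    with open(path, "w") as fh:
        for x, y, z in points:
            fh.write(f"v {x:.4f} {y:.4f} {z:.4f}\n")

# e.g. write_obj(depth_to_points(depth_map), "scene.obj")
```

The ASCII .pcd format is similar in spirit: a short header followed
by one point per line.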
6DOF VR video is one piece of a larger puzzle. We envision a future
where people wear AR glasses with 3D cameras and use them to record
and live-stream their experience. 3DOF is not sufficient for this
because it causes motion sickness if the camera moves. We have
prototypes which fix motion sickness in 3D POV VR video from
wearable cameras. Watching the videos in VR feels like reliving a
memory. Here's a demo: https://lifecastvr.com/trickshot.html

You can download a free trial from https://lifecastvr.com after
entering your email address; you do not need to create a full
account. The free trial is not limited in any way other than
putting a watermark on the output. We'd love to hear your thoughts
and experiences with VR video, virtual production, and robotics!
Author : fbriggs
Score : 57 points
Date : 2022-03-14 18:11 UTC (4 hours ago)
| Findeton wrote:
| Ok, so I've tried doing some version of this that is a bit more
| advanced [0], but I gave up because I'm not an ML expert. Have you
| thought about creating/projecting video versions of lightfields?
| Like Google's DeepView [1]. I'd love for DeepView Video-style
| tech to be commoditized.
|
| [0] https://roblesnotes.com/blog/lightfields-deepview/
|
| [1] https://augmentedperception.github.io/deepviewvideo/
| fbriggs wrote:
| I worked on a lightfield(ish) camera at Facebook, but Lifecast
| is more focussed on what is practical with current camera
| hardware. We prefer to make the best possible 6DOF using
| existing VR180 cameras which people already have. A second
| challenge is to render the results on a Quest 2 (the most
| popular VR headset today), with its limited GPU power. Our
| format is optimized for the rendering capabilities of Quest 2,
| which means we have to make some tradeoffs on visual quality. I
| don't think Quest 2 has enough power to render multi-layer
| images (MPIs, the format in DeepView). This is the difference
| between making a product and doing academic research. I'm
| looking forward to Quest 3 or whatever comes next; I hope it
| has enough power to do stuff like MPI.
| anish_m wrote:
| This is an awesome project! I looked into starting a "real world
| travel" app for oculus with recorded videos, but not having an
| easy way to record 6DOF videos is a big problem for a true VR
| experience with videos. If you can pull this off, you have the
| potential to actually make VR more mainstream outside gameverse.
| Good luck and congrats!
| 0x20cowboy wrote:
| This is very cool. I love this kind of stuff. I built a web
| player to view 360 streaming videos using VR:
| https://meshvue.com/, but it hasn't caught on.
|
| If you think it might help, I'd be keen to chat.
|
| (It works best on desktop, but it does work on the Quest too.
| Because the texture sizes the Quest supports are quite small,
| its resolution is currently poor.)
| charcircuit wrote:
| Is there any chance that you will release a Linux version
| considering Firefox and Chrome don't support WebXR?
| fbriggs wrote:
| I have tested the player on Chrome/Firefox/Safari on Mac and
| Chrome on Ubuntu 21. It should be working. LMK if you are
| encountering an issue.
|
| The tool to create the videos right now is only available on
| Mac and Windows. We have an internal Linux build, but we aren't
| releasing it yet.
| acgourley wrote:
| We're working on something similar - several scene layers packed
| and transmitted over h265 streams and unpacked into a 3D client
| for 6DoF playback. Captures from something as simple as a GoPro
| and then our CV compares perspectives of the scene across time to
| reconstruct it in 3D for the encoding/transmission steps.
|
| Targeting exercise market (where we got our start) but it could
| go beyond it in time.
|
| Short demo: https://www.youtube.com/watch?v=DST9jz9Rrcc
|
| Happy to chat, email in profile.
| Findeton wrote:
| That's very cool, as I mentioned in another comment I also
| tried doing something similar!
| samtimalsina wrote:
| I don't comment here on HN often, but this is impressive! It
| could be a game changer for someone like me who finds
| exercising indoors boring but could stroll outside for hours on
| end. I could see myself using this.
| fbriggs wrote:
| Very impressive results, nice work!
| chaostheory wrote:
| Why don't you guys have a Quest 2 app instead of just a mobile
| one?
| endisneigh wrote:
| This looks amazing. I would pay 10 bucks a month for a variety
| of scenes + new scenes regularly.
| blensor wrote:
| How does platform support look? We are building a VR exercise
| game/app [1] and those environments look awesome, but we are
| using GodotEngine, not the usual Unity/Unreal ecosystem (same
| question goes for the OP)
|
| [1] xrworkout.io
| acgourley wrote:
| We're doing everything in Unity. The hard part is really in
| the encoding, so there is no reason we couldn't have the
| client be in Godot or even plain WebGL, but that isn't our
| focus just yet.
| pedalpete wrote:
| This is where we were looking to go with our "metaverse for
| sports" app https://ayvri.com: blending 3D world geometries
| with video and photos captured from ground view, built into 3D
| models. I believe this is the future of video; I was calling it
| "spatial media" at the time.
|
| We still operate Ayvri, but have mostly moved on to other
| projects.
___________________________________________________________________
(page generated 2022-03-14 23:00 UTC)