[HN Gopher] Show HN: Open-source digital stylus with six degrees...
___________________________________________________________________
Show HN: Open-source digital stylus with six degrees of freedom
Author : jcparkyn
Score : 102 points
Date : 2023-11-12 20:48 UTC (2 hours ago)
(HTM) web link (github.com)
(TXT) w3m dump (github.com)
| extraduder_ire wrote:
| Outside tracking with a camera is not something I would have
| thought of. Seems cool.
|
| Reminds me of how sad I am that nobody's done a good job of
| cheaply cloning the Lighthouse tech that Valve/HTC use.
| jcparkyn wrote:
| I should point out that I'm not the first person to use camera
| tracking for this [1], but to my knowledge there hasn't
| previously been a serious attempt to combine it with inertial
| or pressure sensors (which are both necessary for competing
| with graphics tablets) or make it open-source.
|
| [1] http://media.ee.ntu.edu.tw/research/DodecaPen/
| mtsr wrote:
| AFAIK you can actually get the sensors used for tracking at
| somewhat OK prices. But IIRC the sensors do part of the position
| calculations themselves, so they aren't simple enough to be
| really cheap.
| charcircuit wrote:
| Too bad this wasn't made as an OpenXR API layer so that it could
| be used with existing software.
| noddingham wrote:
| It's open source so...
| ipsum2 wrote:
| The great thing is that it's open source, so you can add
| whatever niche APIs you want to it!
| slaucon wrote:
| The rolling shutter compensation is pretty cool and isn't
| something I would have thought of. Did you know that would be an
| issue from the start or notice it only after you built the rest
| of the system?
| jcparkyn wrote:
| I knew it would have an effect (most of the literature for
| similar projects just uses global shutter cameras for this
| reason), but wasn't sure how significant it would be. It turned
| out to be small enough that it usually wasn't super noticeable,
| but in certain cases it really showed up (e.g., rotating the
| pen while keeping the tip in one place).
|
| The thing I was most surprised by was how effective my solution
| was, given that it's a pretty gross approximation of reality.
| There are lots of much more sophisticated techniques for
| dealing with it, which I didn't end up needing.
|
| One thing I would've liked to try out is using rolling-shutter-
| aware PnP [1], which can theoretically estimate pose and
| velocity simultaneously from one image, by exploiting the
| rolling shutter distortions.
|
| [1] https://www-sop.inria.fr/members/Philippe.Martinet/publis/20...
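|
| To give a flavour of what that kind of first-order correction can
| look like (a minimal sketch, assuming a constant image-space
| velocity and a known sensor readout time; this isn't the exact
| code from the repo):
|
|     import numpy as np
|
|     def compensate_rolling_shutter(corners_px, velocity_px_per_s,
|                                    image_height, readout_time_s):
|         # corners_px: (N, 2) detected marker corners in pixels.
|         # velocity_px_per_s: (2,) image-space velocity of the pen,
|         #   e.g. estimated from the previous few frames.
|         # readout_time_s: time to scan the frame top to bottom.
|         row_delay = (corners_px[:, 1] / image_height) * readout_time_s
|         # Shift each corner back to where it would have been when
|         # the top row was captured, then run plain PnP (e.g.
|         # cv2.solvePnP) on the corrected 2D points.
|         return corners_px - row_delay[:, None] * velocity_px_per_s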
| ipsum2 wrote:
| Very cool! It has the added benefit of being able to manipulate
| objects in 3d. How does it compare to graphic tablets in terms of
| accuracy?
| jcparkyn wrote:
| Currently it's not quite at the level of graphics tablets, but
| it's not too far off, and I think there's quite a bit of
| potential to improve it using similar techniques [1].
|
| In terms of absolute accuracy, I measured an average error of
| 0.89mm (for the position of the tip) across the entire area of
| an A4 page with the camera in one place. In practice you have
| more precision than that though, because most of the errors are
| constant biases (not random noise).
|
| For example, here's one of the tests [2] I did for the thesis,
| which compares the recorded stroke to what I actually wrote
| (scanned from carbon paper). After aligning the two captures
| (as a global 2D position offset, everything else is retained),
| the average distance from the recorded stroke to the scan was
| 0.158mm.
|
| [1] This paper (which I linked in another comment) uses some
| more advanced techniques for pose estimation which could
| definitely be applied here (but it's closed-source, and I
| didn't have time to re-implement it from scratch):
| http://media.ee.ntu.edu.tw/research/DodecaPen/
|
| [2] https://github.com/Jcparkyn/dpoint/files/13329235/main-sketc...
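|
| For reference, the comparison boils down to something like this
| (a minimal sketch; it assumes both strokes are already 2D point
| arrays in mm, and the mean-offset alignment here is a
| simplification of what the thesis actually does):
|
|     import numpy as np
|     from scipy.spatial import cKDTree
|
|     def stroke_error_mm(recorded, scanned):
|         # recorded: (N, 2), scanned: (M, 2), both in mm.
|         # Align with a single global 2D offset; rotation, scale
|         # and everything else stay as captured.
|         offset = np.mean(scanned, axis=0) - np.mean(recorded, axis=0)
|         aligned = recorded + offset
|         # Average distance from each recorded point to its
|         # nearest point on the scanned stroke.
|         dists, _ = cKDTree(scanned).query(aligned)
|         return float(np.mean(dists))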
| tomcam wrote:
| > This project was part of my undergraduate thesis for electrical
| engineering. I
|
| Undergrad! If you didn't get top marks on this, there is no
| justice.
| jcparkyn wrote:
| Thanks! I'll get results in about two weeks (fingers crossed)
| bloopernova wrote:
| Very cool. They've done what I've daydreamed about, and actually
| got it to work!
|
| When I played _Elite: Dangerous_ I used a "hands on throttle and
| stick" (HOTAS) setup, along with foot pedals. I couldn't help but
| think that there must be a better way to control a spaceship:
| your ship can pitch, yaw, and roll in addition to being able to
| fire thrusters in 6 directions.
|
| I wanted a handheld ship model that I could move such that the
| ship in _Elite_ would move in the same way. The linked project
| looks like it could do just that. Thrust would be controlled in a
| similar way, but with my other hand.
|
| Strange or new input models like that are so amazing to me. Our
| imagination can really fly high with these sorts of capabilities.
| squigz wrote:
| This is why 2 sticks is a fairly common setup for space games
| Arelius wrote:
| I still hold onto my SpaceOrb 360 for this sort of game. There
| is a guy that makes a converter to make it work over USB...
|
| https://en.m.wikipedia.org/wiki/SpaceOrb_360
|
| https://www.tindie.com/products/vputz/orbotron-9001-version-...
| myself248 wrote:
| I have the slightly newer version, branded by HP as the
| SpacePilot, and use it daily in Fusion 360 CAD. It has a
| native USB connection, and 6 user-definable hotkeys below an
| unimpressive LCD.
|
| It requires some old drivers that aren't officially supported
| (why would they remove support for perfectly good hardware
| from the newer drivers? To send good stuff to the landfill,
| of course!), but when Autodesk tried to move Fusion to the
| new driver model exclusively, user outcry persuaded them to
| leave the old drivers in as an option. Apparently there are
| quite a few of us using those SpacePilots, and the phrase
| "from my cold, dead fingers" comes up not infrequently.
| tadfisher wrote:
| In the sci-fi series _The Expanse_ , the _Rocinante_ is
| controlled in part by a 3DConnexion SpaceMouse, commonly used
| in CAD and 3D modeling.
| pests wrote:
| I always imagined a sphere suspended / held by a minimum number
| of strings (or rods?) / attachments. By physically pushing,
| pulling, and twisting the sphere you could detect these
| movements via compression and tension in the attachments. You
| could motorize those strings/rods to give resistance and
| feedback to the piloting.
| bloopernova wrote:
| Yeah that's almost exactly what I was daydreaming of! Weaker
| gyroscopes might mean a ship can rotate in one direction
| slower than others, and feedback to a controller could model
| that for the user.
| navane wrote:
| This 6DOF "joystick" has been around for decades:
| https://3dconnexion.com/nl/spacemouse/
| lagrange77 wrote:
| 1. Very cool project
|
| 2. Helpful documentation
|
| 3. Nice real-world example of the use of a Kalman Filter!
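|
| For anyone who hasn't played with one, the predict/update loop
| used to fuse the camera poses with the IMU data looks roughly
| like this toy 1-D version (made-up model and noise values; not
| the filter from the repo):
|
|     import numpy as np
|
|     dt = 1 / 60                       # camera frame interval
|     F = np.array([[1, dt], [0, 1]])   # constant-velocity model
|     H = np.array([[1.0, 0.0]])        # camera measures position only
|     Q = np.diag([1e-4, 1e-2])         # process noise
|     R = np.array([[1e-3]])            # camera measurement noise
|
|     def kalman_step(x, P, z):
|         # Predict forward one frame with the motion model.
|         x = F @ x
|         P = F @ P @ F.T + Q
|         # Correct with the camera position measurement z.
|         y = z - H @ x
|         S = H @ P @ H.T + R
|         K = P @ H.T @ np.linalg.inv(S)
|         x = x + K @ y
|         P = (np.eye(2) - K @ H) @ P
|         return x, P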
| jimmySixDOF wrote:
| I'm a big fan of all things 6DOF ! Nice work on the hardware and
| computer vision pose work but I am almost more impressed by the
| software surface you are drawing into and able to rotate. Thats
| interesting and could be used with any tangible user interface
| control like a finger slider for the same effect. Good project
| for problem solving skills looks like you nailed it bravo!
|
| Btw the first 6DOF controller I had other (than a hacked WiiMote
| controller as a ir led Bluetooth receiver [1]) was the logitec mx
| air which was ahead of its day [2].
|
| [1]
| https://web.cs.ucdavis.edu/~okreylos/ResDev/Wiimote/MainPage...
|
| [2] https://www.cnet.com/reviews/logitech-mx-air-review/
| tomp wrote:
| Very cool!
|
| Could be useful for robotics / VR as well. One-camera hand
| tracking anyone?
|
| Question: could you use gyro+accel to track pressure as well? Or
| at least "taps"?
|
| Another question: how much does it cost? In particular, the
| pressure sensor...
| crazygringo wrote:
| Very cool. The use of a webcam really makes me wonder if there's
| a future where our regular single ~78deg FOV webcams are going to
| be replaced by dual (stereo) fisheye webcams that can:
|
| - Enable all sorts of new UX interactions (gestures with eye
| tracking)
|
| - Enable all sorts of new peripheral interactions (stylus like
| this, but also things like a steering wheel for racing games)
|
| - Enable 3D 180deg filming for far more flexible webcam meetings,
| including VR presence, etc.
|
| The idea of being able to use the entire 3D space in front of
| your computer display as an input method feels like it's coming,
| and using a webcam the way OP describes feels like it's a little
| step in that direction.
___________________________________________________________________
(page generated 2023-11-12 23:00 UTC)