[HN Gopher] Show HN: I made a new sensor out of 3D printer filam...
___________________________________________________________________
Show HN: I made a new sensor out of 3D printer filament for my PhD
Here's a "behind-the-scenes" look at my development of a cool
sensor during my PhD (electrical engineering). This sensor is only
about 1/3 of my total research for my degree and took about a year.
I've been on HN for a while now and I've seen my fair share of
posts about the woes of pursuing a PhD. Now that I'm done with mine
I wanna share some anecdotal evidence that doing a PhD can actually
be enjoyable (not necessarily easy) and also be doable in 3 years.
When I started I knew I didn't want to work on something that would
never leave the lab or languish in a dissertation PDF no one will
ever read. Thanks to an awesome advisor I think I managed to thread
the needle between simplicity and functionality. Looking back, the
ideas and methods behind it are pretty straightforward, but getting
there took some doing. It's funny how things seem obvious once
you've figured them out! Oh, I love creating GUIs for sensor data
and visualizations as you'll see -- it's such a game changer!
pyqtgraph is my go-to at the moment - such a great library.
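To give a flavor, a minimal pyqtgraph live plot is just a timer
pushing samples into a curve. Here's an illustrative sketch with a
simulated feed (not my actual sensor code):

    # Scrolling live plot of a (simulated) sensor stream.
    import numpy as np
    import pyqtgraph as pg
    from pyqtgraph.Qt import QtCore

    app = pg.mkQApp("Sensor viewer")
    plot = pg.plot(title="Live sensor data")
    curve = plot.plot(pen="y")
    buf = np.zeros(500)

    def update():
        global buf
        buf = np.roll(buf, -1)
        buf[-1] = np.random.normal()  # swap in a real reading here
        curve.setData(buf)

    timer = QtCore.QTimer()
    timer.timeout.connect(update)
    timer.start(20)  # ~50 Hz refresh
    pg.exec()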
Author : 00702
Score : 818 points
Date   : 2024-04-11 16:06 UTC (1 day ago)
(HTM) web link (paulbupejr.com)
(TXT) w3m dump (paulbupejr.com)
| Balvarez wrote:
| Wow that's super cool! Very nice work
| 00702 wrote:
| Thanks!
| JackFr wrote:
| Made me think of Navin R. Johnson's invention of the Opti-Grab.
| klysm wrote:
| > Thanks to an awesome advisor
|
| And there you have it! The difference between a miserable
| experience and a good one
| 00702 wrote:
| Very much so!
| porphyra wrote:
| That's super cool and I hope you don't mind a little bit of
| unsolicited feedback, but the first question everyone's asking is
| "what does it do?" At present the blog post starts out with two
| paragraphs talking about the format of the blog post and the
| applications but not what the sensor actually measures.
| 00702 wrote:
| Good point -- I need to better explain what "bend localization"
| means on a more practical level pretty early on.
| porphyra wrote:
| Yeah. I guess in layman's terms it's a long squishy cable
| that knows where it's being bent.
| oersted wrote:
| > This means the sensor can tell you where you bent it, with
| a predefined (and coarse) resolution.
|
| > OptiGap's application is mainly within the realm of soft
| robotics, which typically involves compliant (or 'squishy')
| systems, where the use of traditional sensors is often not
| practical.
|
| This explanation is already quite clear. If I understood
| correctly, by "predefined resolution" you mean that it
| detects which silicone sleeve was bent on a tube with a
| series of them, correct?
|
| Can you provide more concrete examples for how you envision
| it being used? The first application that comes to mind is
| sensing how fingers bend in a glove controller.
| 00702 wrote:
| The finger bending example is certainly a classic for
| something like this but I think it truly shines in soft
| robot examples like flapping wing robots or swimming finned
| robots, where it's critical for sensors to be mechanically
| transparent so as to not impact the usually delicate
| dynamics. The "soft" robotic arm in my earlier paper is
| another good example
| https://ieeexplore.ieee.org/document/9763962
| IMTDb wrote:
| Also, if you can, please include examples of practical use
| cases. I am sure there are tons, but I always love
| discovering complexity in hidden places
| devwastaken wrote:
| Have a quick recording that shows an outcome or effect that
| is a core part of the research.
| yoz wrote:
| Having just read the piece for the first time _after_ you
| added the "bent rope" explanation: YOU NAILED IT. I
| literally had the reaction of thinking, "Wow, there's a super
| simple explanation early on! I trust this writer much more
| now."
| Netcob wrote:
| That's basically every other HN article for me. Zero context.
| "Blorglorp 2024.4.99 released" "With the new version, Blorglorp
| finally sheds its libgnipgnop dependency and increases
| efficiency by 1.25%". Bam, top post for the day, lots of multi-
| paragraph comments, and I'll still never know what it's even
| for.
| sandebert wrote:
| The context tends to come some time later, when they close up
| shop. "We at Derplabs are proud that we dared to make an
| opinionated jpeg viewer, Blorglorp, that only interpreted the
| four first bits of every byte and ignored the rest. Commonly
| referred to as 'the naughty bits' by image-viewing
| connoisseurs. Unfortunately the market was not ready, but we
| are sure our ideas will gain traction in the future. Our
| deepest gratitude to our customers and investors that were
| excited to join our journey."
| stavros wrote:
| You think? All the closing announcements I've seen were
| "we've reached the end of our incredible journey, we're
| proud to have served our users but your data is gone
| tomorrow. Good luck!".
|
| Meanwhile, I never find out what the thing even does.
| cush wrote:
| I can die happy if I never have to work with libgnipgnop
| again
| Smoosh wrote:
| But have you used the new 3.6 release which uses pfnaphell
| integration to stochastically pre-convolute the tertiary
| nodes? It's a game-changer!
| tomcam wrote:
| lol v3.6.7845 doesn't even self calibrate. Most people
| are on v3.6.8002a except Mac users and people using a
| venv. Works perfectly under newer plan9 emulators. I just
| use Albus mode in Emacs (trunk) and avoid a lot of those
| problems. If you forget to use trunk the mouse is
| disabled for some reason...
| Cerium wrote:
| One application for shape sensing technology is surgical
| robotics. An example: https://www.intuitive.com/en-us/products-
| and-services/ion/sh...
|
| Relevant patent:
| https://patents.google.com/patent/US20240044638A1/
|
| The first time I saw one of these in person I was in awe. You
| could take a normal looking cable (think bicycle cable sleeve)
| and bend it and see in real time the same shape on the display.
| nuancebydefault wrote:
| This question surprises me. Bend location == locate where
| something is bent. Then some videos of the researcher bending
| a tube. Is there any confusion possible?
| porphyra wrote:
| OP updated his blog post after I posted my comment. The
| opening paragraphs are now very clear and awesome.
| nuancebydefault wrote:
| Oh - thanks for pointing this out!
| szundi wrote:
| This guy knows how to find shit out, I like this!
| blkhawk wrote:
| at first blush that sounds like a better version of the way the
| Nintendo power-glove detected finger bending.
| moralestapia wrote:
| Paul, what an amazing project, this is what hacking is all about.
| Congratulations!
|
| Definitely try to explore the commercial side of your invention.
|
| It wouldn't hurt to talk to an IP lawyer, if you're still in Uni
| they usually have people there doing this and you can just go
| talk to them, free of charge (for you!).
|
| I'm generally against the idea of patents, mainly because of
| people who have come to game the system and exploit it (patent
| trolls etc...), but your project is a real thing with real
| applications; you definitely deserve a share of whatever
| commercial benefit this could bring to the world :D.
| 00702 wrote:
| My (former) school is actually already in the process of doing
| that! My dissertation committee thought it was novel enough
| that it needed some IP protection and encouraged me to pursue
| that.
| moralestapia wrote:
| Hey, that's great to hear.
|
| Best of luck with everything!
| kobalsky wrote:
| Maybe I'm not understanding the blog post so bear with me.
| Isn't what he described what a time domain reflectometer
| does? [1] I mean, that's what's used to detect breaks or
| kinks in fiber optic cables. The same tech is used to detect
| problems in civil infrastructure with embedded fiber optics.
|
| [1]: https://en.wikipedia.org/wiki/Optical_time-
| domain_reflectome...
| 00702 wrote:
| Oh yeah cable companies have long been able to do that - I
| wasn't trying to compete with or replace that technology. My
| work was soft robotics-focused with simplicity in mind.
| georgeburdell wrote:
| Yeah my thoughts exactly. OTDR is what the optical networking
| industry uses (for much larger runs of fiber) to find
| kinks/breaks.
| moralestapia wrote:
| It could still be novel if it uses a slightly different
| method or even materials. Hence the advice to get this
| into the hands of IP experts.
| georgeburdell wrote:
| I wasn't commenting on the novelty aspect, I was more
| wondering in which situations the author's device might
| be better. Also, the author noted they started with a
| time of flight sensor, which would have made it extremely
| similar to OTDR
| nimish wrote:
| hell yeah, this is super cool!
| lttlrck wrote:
| This is really really cool, but why does it depend on TPU?
| Wouldn't it work with regular fiber optic? Or does it come down
| to economics?
| 00702 wrote:
| It doesn't! I heavily used TPU to drive home the point that it
| can work with almost any light-transmitting fiber. I used PMMA
| optical fiber for the finer demos.
| lttlrck wrote:
| That's great! Well done.
| Projectiboga wrote:
| Hi, what kind of sensor? When posting to the public you should
| mention the subject of what can be sensed or scanned early on.
| bobberkarl wrote:
| Have you been to Gabon recently?
| Projectiboga wrote:
| Solid job, and congratulations on doing something useful,
| entering the program with the goal of making something
| tangible, and getting out promptly.
| 00702 wrote:
| Thanks -- I definitely relied heavily on my wife in order to
| maintain that pace.
| ericdfoley wrote:
| Not quite the same thing, but this reminds me of DAS using fiber
| optic cable for various acoustic sensing tasks--basically as an
| alternative to geophones/hydrophones. There have been a number of
| papers using transoceanic fibers for various monitoring tasks.
|
| That is also used for various industrial applications, e.g. for
| strain sensing by Luna Innovations. I know that Schlumberger has
| various patents on fiber-optic sensing relating to towed
| streamers (e.g. for marine seismic acquisition). But I haven't
| seen it used for soft robotics before.
| 00702 wrote:
| Yeah initially doing literature review was a bit daunting
| because of all this existing work, especially FBG-type sensors,
| but this idea is so fundamentally simple that it's been mostly
| bypassed by smarter minds.
| ericdfoley wrote:
| Yeah, it makes a lot of sense to just create those gaps when
| you're specifically installing the fiber for sensing.
| bythreads wrote:
| Congrats, and if I may: your blog post should be how PhDs are
| done - readable, understandable, and devoid of techno-mumbling.
| 00702 wrote:
| Thank you!
| lifeisstillgood wrote:
| "I stuck a piece of plastic to my desk, it bent as I did it, I
| had time to investigate and skills to get a whole PhD out of
| that"
|
| And this dear worker drones, is why schedule destroys quality :-)
|
| Loving the write up. Clear and simple. Good luck with squishy
| robots. :-)
| 00702 wrote:
| Haha this made me laugh -- thanks!
| raphman wrote:
| Very beautiful research and thorough documentation. I initially
| wanted to comment that this looks a lot like time-domain
| reflectometry on a conceptual level - but as Cindy Harnett seems
| to be your advisor, you probably know that already :)
| 00702 wrote:
| Indeed! How do you know her?
| raphman wrote:
| Oh, just from reading her papers. I played with TDR a long
| time ago [1] and try to follow current developments.
|
| [1] https://dl.acm.org/doi/10.1145/2047196.2047264
| smoyer wrote:
| I did some TDR analysis work for CATV fraud detection, but it
| turns out that we generally know where on the fiber plant each
| CPE is, and collecting the DOCSIS timing data from those
| devices is essentially free.
| teucris wrote:
| I wonder if by using a large nozzle, you could print out the
| entire sensor by laying out lengths of TPU with flexible joints
| at each air gap. It would depend on how well light traveled
| through the printed part though.
| 00702 wrote:
| 3D printing does affect the light passing through
| significantly. There are a number of options for fabricating
| these but most of the successful ones involve cutting (can even
| use a laser cutter).
| adversaryIdiot wrote:
| incredible work. good job!
| 00702 wrote:
| Thanks!
| uoaei wrote:
| Similar concept but for temperature (linked on HN a while back):
|
| https://en.m.wikipedia.org/wiki/Distributed_temperature_sens...
| poyu wrote:
| Great job, this is so cool!! Can't wait to see where this will be
| used.
| wholinator2 wrote:
| As someone about to start a PhD in theoretical physics, how did
| you do it in 3 years? I've been told a doctorate is 2 years of
| classes followed by 3 to 5 years of research. Did you already
| have a master's that your doctoral program accepted to waive
| the class requirements?
| 00702 wrote:
| The short answer? Weekly meetings with my advisor! Long answer:
| I also had 2 years of classes but I started working on my
| research immediately, while taking classes. By the time I
| finished all my classes and became a candidate I had one paper
| already published and another one accepted, so I was able to
| get a 3rd paper out and defend by the end of the 3rd year.
| atlas_hugged wrote:
| Wow
| wantsanagent wrote:
| Isn't total internal reflection affected by surface defects?
| I.e., could you scratch the filament rather than air-gap it?
| 00702 wrote:
| I was aiming for significant attenuation when bending, so
| scratching wouldn't be enough.
| crote wrote:
| What an absolutely amazing idea, great work!
|
| Your sensor data seems to have quite large "dead zones" - those
| should be trivially fixable by reducing the inter-sensor
| distance, right?
|
| Would it be useful to sense the _direction_ of the bend? I reckon
| this might be possible by dividing the tube like a Mercedes logo,
| and having three sets of the sensor in one outer tube.
|
| Is there a way to sense multiple bends? With the current setup
| that'd result in invalid readings as you're essentially OR-ing
| the value. Are there any good solutions for this?
| 00702 wrote:
| Great ideas! Even though I haven't implemented it fully, it is
| possible to sense multiple bends because each bend will always
| have the same relative attenuation (across the strands) so it
| would just be a matter of matching on the relative deltas from
| one reading to another. The catch, though, is that at some
| point no light will reach the end if every joint bends a lot.
| There are ways to mitigate that, but my comment is too long
| already!
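|
| Roughly, in Python, the matching could look like this (an
| illustrative sketch -- the signatures here are made up):
|
|   import numpy as np
|
|   # One row per gap: which strands that gap attenuates.
|   signatures = np.array([[1., 0., 1.],
|                          [0., 1., 1.],
|                          [1., 1., 0.]])
|
|   def locate_bend(baseline, reading):
|       delta = np.asarray(baseline) - np.asarray(reading)
|       delta = delta / np.linalg.norm(delta)  # keep the pattern
|       norms = np.linalg.norm(signatures, axis=1)
|       return int(np.argmax(signatures @ delta / norms))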
| gaudystead wrote:
| No need to cut your comments short! You clearly have a
| passion, and passion is the best way to get other people
| interested! :D
|
| (also, nice work!)
| liamwire wrote:
| Could you increase power output through the fibre in response
| to a loss of signal, or one that falls below a given
| threshold? Using reasonable bounds to account for errors
| where attenuation isn't the cause of no signal being
| received.
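|
| Something like this, say (all numbers hypothetical):
|
|   def adjust_tx(rx_level, tx_power, floor=0.1, step=0.05,
|                 lo=0.2, hi=1.0):
|       # Nudge source power up when the received level drops
|       # below the floor; clamp to safe bounds.
|       if rx_level < floor:
|           tx_power = min(hi, tx_power + step)
|       return max(lo, tx_power)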
| datadrivenangel wrote:
| Love the use of lower-cost materials and nice, janky DIY
| SCIENCE.
|
| 00702, 4th photo caption appears to be missing the word gap. :)
| 00702 wrote:
| Fixed -- thanks! And yes, exactly! I had access to basically any
| piece of equipment I could want (including a cleanroom that can
| fabricate ICs), but then almost no one would be able to recreate
| what I would make.
| npace12 wrote:
| This is such a clever project, and a great write up! Thanks for
| sharing!
| maxnoe wrote:
| Very Cool!
|
| Since you mentioned the visualization part, let me comment on my
| pet peeve:
|
| The rainbow / jet color map should not be used for anything. It
| distorts your data and is not accessible for people with color
| vision deficiencies.
|
| If you want to know more, have a look here:
| https://matplotlib.org/stable/users/explain/colors/colormaps...
| and this talk here: https://youtu.be/xAoljeRJ3lU
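|
| For example, in matplotlib a perceptually uniform map is a
| drop-in replacement for jet:
|
|   import matplotlib.pyplot as plt
|   import numpy as np
|
|   data = np.random.rand(32, 32)
|   plt.imshow(data, cmap="viridis")  # uniform, CVD-friendly
|   plt.colorbar()
|   plt.show()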
| itissid wrote:
| I recall first hearing and reading about soft robot applications
| from Veritasium: https://www.youtube.com/watch?v=058hRtaCWC0 I
| think you must have heard of it?
|
| In the realm of rescuing or assisting people unreachable
| otherwise, could this bend and flex as much as those highly
| compliant robots?
| bobberkarl wrote:
| This makes me think of so many possible industrial applications.
| I believe you are onto something great!
| skoocda wrote:
| This is awesome! Presumably you can make this work with any
| interface that doesn't enforce the total internal reflection of
| a fiber optic cable, and therefore allows light to leak out.
| Instead of an air gap, have you tried experimenting with removing
| the cladding of the fiber optic cable, but keeping the core
| intact?
|
| Alternatively, could you use a short segment of colored cladding
| that allows certain wavelengths to leak out more than others? I
| think that would allow you to encode each bend point as a
| different color -- which might require a different (more
| expensive) rx sensor, but could be useful for certain
| applications.
| 00702 wrote:
| I did experiment with various ways of allowing light to escape
| but nothing came close to the properties of a total air gap.
| You can actually measure (relative) bend angle with it like a
| protractor since the attenuation is very linear!
|
| There is already existing work that uses colored segments for
| something similar but those techniques are hard to do outside a
| well equipped lab.
| jjk166 wrote:
| This is fantastic! One can easily imagine some minor refinements
| that would allow this to be mass producible with very high
| accuracy. And the applications are abundant. I imagine you could
| use space filling curves to make 2D or 3D sensors that could
| cost-efficiently give robots a sense of touch. Wrapped around
| something like a flexible tube you could make it directionally
| sensitive for proprioception. It's easily possible that other
| things that affect the air gaps, such as temperature
| differences, could be detected and localized as well.
| 00702 wrote:
| There are a lot of cool applications indeed! I was able to use
| it for gait sensing on a soft robot "leg", but I have to wait for
| the paper to be published later this year before going into too
| much detail.
| mhb wrote:
| Thanks for sharing your work. Would you be able to compare your
| device to the fiber optic one used in the data glove (e.g.,
| https://iopscience.iop.org/article/10.1088/1742-6596/2464/1/...).
| hypercube33 wrote:
| Isn't this sensor a lot like, or exactly like, how the Power
| Glove works?
| mhb wrote:
| It's how Jaron Lanier's DataGlove worked. The Power Glove
| used cheaper flex sensors instead of fiber optics.
|
| https://en.wikipedia.org/wiki/Power_Glove
| 00702 wrote:
| There are a lot of similarities in the approach to the linked
| paper (which is a very cool concept) and I saw a lot of similar
| concepts in my lit review. At a high level, my sensor targets
| bend localization with simple fabrication techniques while the
| linked paper is doing more general camera-based gesture
| recognition. I have a more thorough comparison to existing work
| in my actual dissertation.
|
| Our lab has done a good bit of work around elastomers similar
| to the linked paper, such as multitouch pressure sensing
| (https://ieeexplore.ieee.org/abstract/document/9674750). The
| authors of your linked paper can actually achieve what they've
| done with a single light source by using one of these! The
| zones are key (https://www.st.com/en/imaging-and-photonics-
| solutions/time-o...)
| yogurtboy wrote:
| This is cool as hell! I hope your PhD continues well, and that
| this invention serves you in the future.
|
| Do you put any lube on the interface between the silicone sleeve
| and optical cable? I imagine the bending action will cause
| displacement, and friction there could rub and/or cause the
| nominal position to shift around.
| 00702 wrote:
| Since the sleeve is a stretchy rubber, as long as its inner
| diameter is a bit smaller than the outer diameter of the
| fiber, it holds just fine. For more dynamic applications,
| though, a silicone adhesive, or even super glue for more
| permanent strands, helps!
| yogurtboy wrote:
| Oh cool, that kind of robustness is not what I was expecting.
| Very cool project!
| riedel wrote:
| Have you explored fiber Bragg gratings? 10 years ago a student
| of mine semi-successfully explored sensing the shape of a
| firefighter hose using that technique. It seems to be gaining
| traction again lately for optical shape sensing.
| 00702 wrote:
| I explored FBG sensors early on and they are very cool -- I was
| aiming for a less expensive and more robotics-oriented
| application. Something that can be seamlessly integrated into a
| design at a lower level without the complexity of FBG
| technology.
| Prcmaker wrote:
| I think you made the right choice. You've come up with a
| great sensor topology. FBGs are good, but there's so much to
| them that is just infuriating. Late in my optics work I ended
| up spending most of my time finding ways to avoid them.
| Prcmaker wrote:
| The perpetual issue with FBGs is cost, in my experience. For
| quality sensors, you can use cheap fibre with expensive
| detectors, or cheap detectors with expensive fibre. There's
| been a perpetual promise of the tech getting cheaper, but the
| price never seems to really drop. We always struggled to find
| buyers once they saw the price tag.
| knodi123 wrote:
| This is interesting! Although needing log2(N) fibers, and a
| different encoding at each junction, presents quite a
| manufacturing challenge. Oh well, it's not a problem at the
| research/POC phase!
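|
| To make the encoding concrete, here is my reading of the
| post's Gray-code scheme as a sketch (the fiber count is
| arbitrary):
|
|   def gray(i):
|       return i ^ (i >> 1)
|
|   n_fibers = 3
|   # Each gap position gets a distinct cut pattern across the
|   # fibers; Gray ordering flips one fiber between neighbors.
|   for pos in range(1, 2 ** n_fibers):
|       code = gray(pos)
|       cuts = [(code >> f) & 1 for f in range(n_fibers)]
|       print(pos, cuts)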
| rcarmo wrote:
| This, sir, is sheer genius. I genuinely loved your approach and
| the way you wrote it down.
| hervature wrote:
| > also be doable in 3 years
|
| I would like to add more color to this. A 3-year PhD is very
| possible for a motivated individual at, what I'll call, the low
| end of R1 universities. That doesn't mean you cannot do good
| research (the OP is a counterexample) but that there is a
| fundamental difference between the program that the OP went to
| and top-tier universities. Think Harvard, Berkeley, Stanford,
| etc.
|
| It is normally pretty easy to distinguish these programs because
| they focus a lot on course requirements and the thesis
| counts as a course. From the OP's institution, you can see that
| the course load is at least 15 courses and I would not be
| surprised if some students do 20 [1]. These programs are more or
| less an advanced undergraduate with a real independent research
| project that spans multiple years. Conversely, top-tier
| universities typically operate under the "publish 3 things and
| you have satisfied the thesis requirements". This cannot be
| explicitly written and this is normally difficult to ascertain
| online. For example, Harvard has similar requirements [2] but you
| can still find it for some departments [3]. The Catch-22 with
| this is that someone who can publish 3 things in 3 years can
| publish 6 things (or more) in 5 years which will greatly increase
| their academic job prospects. Thus, at top-tier universities,
| even the best students stay for 5 years at a minimum to start
| working the job market. You need to be at Dantzig's level to
| finish at a top-tier in 3 years [4]. To summarize, if you want to
| finish a PhD in 3 years, look for course-heavy programs and don't
| expect to get hired into academia.
|
| Edit: I see the OP commented somewhere else that they published 3
| papers. The OP is obviously a standout but I think most people
| have to be realistic that very few areas of research allow for
| publications in your first year PhD. For example, if you are
| doing research in LLMs, you are looking at a couple of years just
| to be brought up to speed.
|
| [1] - https://catalog.louisville.edu/graduate/programs-
| study/docto...
|
| [2] - https://seas.harvard.edu/office-academic-
| programs/graduate-p...
|
| [3] - https://www.math.harvard.edu/graduate/graduate-program-
| timel...
|
| [4] - https://en.wikipedia.org/wiki/George_Dantzig
| kiwih wrote:
| In my experience in computer engineering in the academic
| systems of the USA, New Zealand, and Australia, a very large
| proportion of students will write their first paper in their
| first year. It is field dependent, but when I was a postdoc at
| a top US R1, 100% of the students I interacted with had their
| first paper in their first year. These even included students
| working on LLMs :-)
|
| Also, top non-US universities often graduate their engineering
| students within 3-4 years of commencing, with 3-4 papers being
| a very common international expectation as well.
|
| If you want to finish a PhD in 3 years and _are_ interested in
| academia, in addition to following the path the OP has laid out
| you may also look to good international universities and then
| get your next 3-4 papers as a postdoc.
| hervature wrote:
| I appreciate you adding data for the crowd, but "a very large
| proportion of students will write their first paper in their
| first year" is simply not true when talking about the whole
| population. In my department, albeit not CS but stats/ML, the
| first year is dedicated to doing courses and preparing for
| the qualifying exams. Some students would publish a paper. A
| few more might be coauthors with an upper year student (read,
| very little involvement). Pretty close to half would not even
| have an advisor until the summer. I studied at, according to
| US News, a top 10 in the US for CS.
|
| Typically, non-US universities have 3-year undergrads and
| 2-year masters prior to the PhD. End-to-end, you are looking
| at the same time. There are, of course, exceptions. The UK, I
| think, shaves off a year by integrating undergrad and masters.
|
| Hiding the years of a PhD by doing an extended postdoc misses
| the point of the exercise. The median time for a CS
| PhD in the US is 7 years [1]. Subtract 2 years for good
| students but add a year of postdoc and I think you have a
| realistic 5-6 years from start of PhD to first academic
| position for the top decile.
|
| [1] - https://ncses.nsf.gov/pubs/nsf22300/report/path-to-the-
| docto...
| doug_life wrote:
| Very cool to see how it all came together. Can you elaborate on
| why you chose to use a Kalman filter in your signal chain?
| 00702 wrote:
| I used it more for future-proofing in case I wanted to do
| sensor fusion or something like that later on -- currently it's
| just 1D filtering so I could have used anything. Also I'm just
| way more familiar with using Kalman filters so it was also a
| comfort thing!
| sabujp wrote:
| is the idea to eventually make muscle fibers with lots of these
| stranded together?
| 00702 wrote:
| No I actually didn't have muscle fibers in mind -- there's
| quite a bit of ongoing research specifically in that area.
| abraae wrote:
| I've been looking for a sensor that can accurately detect a golf
| club sweeping over the top of it (how close, how quickly, arc of
| swing).
|
| The idea being to create a golf launch monitor that doesn't
| require hitting a ball, so you can play sim golf inside. Think
| playing alongside the Masters as you watch on TV in the lounge -
| without smashing a golf ball through your TV.
|
| I am wondering if this could be suitable (or a number of them
| ganged together).
| cglace wrote:
| My dad had something like this that hooked up to his computer
| in the '90s.
| risenshinetech wrote:
| This already exists, it's called the Optishot, and it doesn't
| work very well when compared to any modern radar or camera
| based sensors.
| abraae wrote:
| The opportunity is for something that can accurately measure
| your golf swing without a ball.
|
| Optishot is not that:
|
| > Since no ball data is recorded, the OptiShot 2 cannot
| accurately read how your club impacts the ball, so if you top
| the ball or even miss entirely, the OptiShot 2 will show that
| you made solid contact.
| speps wrote:
| The risk of hitting the sensor and sending it flying across
| the room into the TV seems too high.
|
| As mentioned, a high FPS camera along with the Kinect tech to
| extract a skeleton would work so much better. You could make
| that in your garage using a PlayStation Eye and existing open
| source tech.
| abraae wrote:
| Yep, the sensor would need to be buried/encased in a mat. The
| club needs to make contact with the mat realistically (i.e.
| like taking a divot on a real grass surface).
|
| Most launch monitors today use high-speed cameras or Doppler radar.
| Neither of them really works without a ball. The difference
| between a good shot and a crap shot is a few millimetres. The
| technology can measure a bit of data from the club alone
| (club speed, angle of attack) but that's not enough to
| accurately extrapolate actual ball flight, since it's all
| about the quality of the contact of club on ball before
| ground.
| schrectacular wrote:
| Seems like a Hall sensor or two plus a magnetic club head is
| what you need
| abraae wrote:
| Some kind of electromagnetic sensor does seem like a way to
| go. Not sure of the practicalities of magnetising (and keeping
| magnetised) a golf club head though.
|
| And ideally it would work with all of your clubs or at least
| all your metal clubs, not just one special club.
| Prcmaker wrote:
| Very cool! I used to build sensors similar in construction, but
| in all glass. Was a great challenge and a lot of fun. I like your
| spin on things.
|
| Well done completing the Phd!
| 00702 wrote:
| Thanks!
| Glyptodon wrote:
| Could you get a similar effect by cutting dimples or notches in
| patterns (like a helical line of small conical holes) along the
| whole length in a sleeve instead of using separate pieces and
| infer overall curve shape? Or do the segments need to be cut
| completely through?
| 00702 wrote:
| Yes!
|
| The catch is any notch you make will weaken the material
| significantly and you'll have fatigue failures. That's the
| sneaky part of using a flexible sleeve: you don't introduce
| any undesired weaknesses.
| robryk wrote:
| I don't understand what happens when the sensor is bent in more
| than one location.
|
| At the beginning you mention a ToF sensor, which made me think
| that you're looking at reflections from the bends and measuring
| distance to them, but this seems not to be the case. ISTM that if
| you bend the sensor in two places, you'll simply get the sum of
| the log-attenuations from both. If we assume that the "strength"
| of the bend continuously changes attenuation, ISTM that you need
| as many strands as there are gap locations to be able to
| disambiguate between any two sets of bends.
|
| Am I misreading something or is this intended to operate in cases
| where we know only one bend is present?
| mariusor wrote:
| In the paragraph "Visualization of the OptiGap Sensor System"
| it looks like the gap pattern from multiple fibers provides a
| unique signature that can be translated into the exact location
| along the length of the sensor. The mechanism for translating
| the waveforms to an actual location seems to be based on a
| Bayesian model, according to the "Realtime Machine Learning on
| a Microcontroller" paragraph.
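|
| A toy version of that idea (the Gaussian noise model and the
| templates are my assumptions, not from the post):
|
|   import numpy as np
|
|   def posterior(reading, templates, sigma=0.05):
|       # templates: expected per-fiber levels per location
|       ll = -((templates - reading) ** 2).sum(axis=1)
|       ll = ll / (2 * sigma ** 2)
|       p = np.exp(ll - ll.max())  # stable, unnormalized
|       return p / p.sum()         # posterior, uniform prior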
| downrightmike wrote:
| Isn't this just the same tech as the first Nintendo Power Glove
| had? https://www.youtube.com/watch?v=3g8JiGjRQNE
|
| They put light down a tube and then measured the light to trigger
| a key press. That's why bending your fingers/hand did anything.
| They revised the mechanism in later generations.
| Doxin wrote:
| I think at least one novelty here is that there are multiple
| "bend points" it can detect without needing a fiber for each
| bend point.
| OisinMoran wrote:
| This is really neat -- thanks for sharing! I love the general idea
| of making materials more "self-aware" or inspectable. It's very
| sci-fi!
|
| The research I did before my current job touched ever so slightly
| on this too, so even cooler to see it on the front page. What we
| were doing was using complex valued neural nets to learn the
| transmission matrix of an optical fibre. It was previously done
| in the optics community by propagating Maxwell's equations, but
| we were able to beat the state of the art by a few orders of
| magnitude with a very simple architecture (the actual physics
| just boils down to a single complex matrix multiplication!). The
| connection to your work here is that if the fibre is bent you
| have to relearn a new matrix. It could even be possible to learn
| some parameterized characterisation of the fibre, so you could
| say do some input/output measurements and use that to model a
| spline of the fibre. We did not get that far though!
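|
| As a toy illustration of that "single matrix multiplication"
| view (sizes and values made up):
|
|   import numpy as np
|
|   rng = np.random.default_rng(0)
|   n = 4  # number of modes, tiny for illustration
|   T = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
|   x = rng.normal(size=n) + 1j * rng.normal(size=n)
|   y = T @ x  # the whole forward model; fitting T from
|              # (x, y) pairs is the learning problem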
|
| Here are the papers if you're interested:
|
| CS-focussed one:
| https://papers.nips.cc/paper_files/paper/2018/hash/148510031...
|
| Physics-focussed one:
| https://www.nature.com/articles/s41467-019-10057-8
| spiderxxxx wrote:
| could it detect a twist instead of a bend?
| OisinMoran wrote:
| Interesting! I'm not sure to be honest. I imagine in practice
| it's hard to get a pure twist without any bending, and if the
| system can't detect any difference with a twist then the need
| to detect it is also nullified as it is effectively the same
| system.
| wcrossbow wrote:
| Maybe with polarization-maintaining fiber? If at the gap they
| are aligned, a twist will make them more perpendicular,
| reducing mode coupling.
|
| Disclaimer: I know nothing about this field but have spent a
| lot of time in a dark lab surrounded by various types of
| optical fiber.
| karambahh wrote:
| It seems to me that the refractive index plays a role: couldn't
| you increase resolution by replacing air with another medium every
| other cut? Say air, water, air, etc.
|
| My reasoning is that you'd increase the resolution without adding
| too much technical complexity.
|
| My maths is too rusty to evaluate how it would mess with the Gray
| code though.
|
| Very nice idea
| 00702 wrote:
| I didn't include this in my article but I did some experiments
| early on (for a different idea) with air bubbles in oil inside
| a Teflon-coated tube, but that presented a lot of challenges
| (mainly the bubble breaking up) that made it not ideal for
| something like this.
|
| This can certainly be miniaturized with the right manufacturing
| techniques but I left that for the future.
| Lucas-YANG wrote:
| Great! Really enjoyed that, thanks
| qtwhat wrote:
| very clever idea!
|
| if you know how OTDR works, location is found via high-speed
| modulation and high-speed sampling components, which means high
| costs if you want to achieve higher resolution. Usually the
| laser pulse will be on the order of nanoseconds.
|
| what's being introduced in the article is using multiple fibers
| with coded location info, therefore no need for sophisticated
| OTDR-like equipment to get the location info.
|
| not sure if my understanding is right, but good job: from a
| simple idea to sophisticated design, optimization and even
| commercialization. congrats.
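|
| back-of-the-envelope, with assumed numbers: a pulse of width
| tau limits spatial resolution to roughly c*tau/(2n).
|
|   c = 3.0e8    # m/s, speed of light in vacuum
|   n = 1.5      # typical fiber core refractive index
|   tau = 1e-9   # 1 ns pulse
|   print(c * tau / (2 * n))  # ~0.1 m of fiber per ns of pulse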
| markisus wrote:
| How does it work if there is more than one bend? Is it able to
| localize both bends? And what about the bend directionality?
| Anyway nice project. I could see it being used for a mesh to give
| robots a tactile skin covering.
| eigenvalue wrote:
| Awesome work. Would be cool to make a really long one and tape it
| all along the links of an industrial robot arm and see if you can
| train a neural net to predict the location of the end effector
| (which you would already know precisely from the angles of the
| linkages) from the readings of your sensor.
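|
| A toy version of that experiment (synthetic data; in practice
| X would be sensor readings and Y the known end-effector pose):
|
|   import numpy as np
|   from sklearn.neural_network import MLPRegressor
|
|   rng = np.random.default_rng(0)
|   X = rng.random((1000, 6))    # 6 strand readings per sample
|   Y = X @ rng.random((6, 3))   # stand-in forward kinematics
|   model = MLPRegressor((32, 32), max_iter=2000).fit(X, Y)
|   print(model.predict(X[:1]))  # predicted end-effector XYZ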
| jallmann wrote:
| What an excellent write-up. Very clearly explained, and the use
| of gifs and visuals to illustrate concepts was spot-on. I'm only
| a software engineer with a limited understanding of this field,
| but really enjoyed reading this and learned a lot. Well done and
| congrats on the PhD.
| 00702 wrote:
| Thank you!
___________________________________________________________________
(page generated 2024-04-12 23:01 UTC)