[HN Gopher] Show HN: I made a new sensor out of 3D printer filament for my PhD
___________________________________________________________________
Show HN: I made a new sensor out of 3D printer filament for my PhD
Here's a "behind-the-scenes" look at my development of a cool
sensor during my PhD (electrical engineering). This sensor is only
about 1/3 of my total research for my degree and took about a year.
I've been on HN for a while now and I've seen my fair share of
posts about the woes of pursuing a PhD. Now that I'm done with mine
I wanna share some anecdotal evidence that doing a PhD can actually
be enjoyable (not necessarily easy) and also be doable in 3 years.
When I started I knew I didn't want to work on something that would
never leave the lab or languish in a dissertation PDF no one will
ever read. Thanks to an awesome advisor I think I managed to thread
the needle between simplicity and functionality. Looking back, the
ideas and methods behind it are pretty straightforward, but getting
there took some doing. It's funny how things seem obvious once
you've figured them out! Oh, I love creating GUIs for sensor data
and visualizations as you'll see -- it's such a game changer!
pyqtgraph is my go-to at the moment - such a great library.
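For anyone curious, here's a minimal sketch of the kind of live
plot I mean -- read_sensor() is a hypothetical stand-in for the
real hardware read, not my actual code:

    import numpy as np
    import pyqtgraph as pg
    from pyqtgraph.Qt import QtCore

    def read_sensor():
        # Placeholder: swap in your serial/ADC read here.
        return np.random.normal()

    app = pg.mkQApp("Live sensor data")
    win = pg.GraphicsLayoutWidget(show=True)
    curve = win.addPlot(title="Sensor output").plot(pen="y")
    data = []

    def update():
        # Append the newest sample and redraw the last 500 points.
        data.append(read_sensor())
        curve.setData(data[-500:])

    timer = QtCore.QTimer()
    timer.timeout.connect(update)
    timer.start(20)  # ~50 Hz refresh

    pg.exec()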
Author : 00702
Score : 486 points
Date : 2024-04-11 16:06 UTC (6 hours ago)
(HTM) web link (paulbupejr.com)
(TXT) w3m dump (paulbupejr.com)
| Balvarez wrote:
| Wow that's super cool! Very nice work
| 00702 wrote:
| Thanks!
| JackFr wrote:
| Made me think of Navin R. Johnson's invention of the Opti-Grab.
| klysm wrote:
| > Thanks to an awesome advisor
|
| And there you have it! The difference between a miserable
| experience and a good one
| 00702 wrote:
| Very much so!
| porphyra wrote:
| That's super cool, and I hope you don't mind a little bit of
| unsolicited feedback, but the first question everyone's asking is
| "what does it do?" At present the blog post starts out with two
| paragraphs talking about the format of the blog post and the
| applications but not what the sensor actually measures.
| 00702 wrote:
| Good point -- I need to better explain what "bend localization"
| means on a more practical level pretty early on.
| porphyra wrote:
| Yeah. I guess in layman's terms it's a long squishy cable
| that knows where it's being bent.
| oersted wrote:
| > This means the sensor can tell you where you bent it, with
| a predefined (and coarse) resolution.
|
| > OptiGap's application is mainly within the realm of soft
| robotics, which typically involves compliant (or 'squishy')
| systems, where the use of traditional sensors is often not
| practical.
|
| This explanation is already quite clear. If I understood
| correctly, by "predefined resolution" you mean that it
| detects which silicone sleeve was bent on a tube with a
| series of them, correct?
|
| Can you provide more concrete examples for how you envision
| it being used? The first application that comes to mind is
| sensing how fingers bend in a glove controller.
| 00702 wrote:
| The finger bending example is certainly a classic for
| something like this but I think it truly shines in soft
| robot examples like flapping-wing robots or swimming finned
| robots, where it's critical for sensors to be mechanically
| transparent so as to not impact the usually delicate
| dynamics. The "soft" robotic arm in my earlier paper is
| another good example
| https://ieeexplore.ieee.org/document/9763962
| IMTDb wrote:
| Also, if you can, please include examples of practical use
| cases. I am sure there are tons, but I always love
| discovering complexity in hidden places
| devwastaken wrote:
| Have a quick recording that shows an outcome or effect that
| is a core part of the research.
| yoz wrote:
| Having just read the piece for the first time _after_ you
| added the "bent rope" explanation: YOU NAILED IT. I
| literally had the reaction of thinking, "Wow, there's a super
| simple explanation early on! I trust this writer much more
| now."
| Netcob wrote:
| That's basically every other HN article for me. Zero context.
| "Blorglorp 2024.4.99 released" "With the new version, Blorglorp
| finally sheds its libgnipgnop dependency and increases
| efficiency by 1.25%". Bam, top post for the day, lots of multi-
| paragraph comments, and I'll still never know what it's even
| for.
| sandebert wrote:
| The context tends to come some time later, when they close up
| shop. "We at Derplabs are proud that we dared to make an
| opinionated jpeg viewer, Blorglorp, that only interpreted the
| four first bits of every byte and ignored the rest. Commonly
| referred to as 'the naughty bits' by image-viewing
| connoisseurs. Unfortunately the market was not ready, but we
| are sure our ideas will gain traction in the future. Our
| deepest gratitude to our customers and investors that were
| excited to join our journey."
| stavros wrote:
| You think? All the closing announcements I've seen were
| "we've reached the end of our incredible journey, we're
| proud to have served our users but your data is gone
| tomorrow. Good luck!".
|
| Meanwhile, I never find out what the thing even does.
| cush wrote:
| I can die happy if I never have to work with libgnipgnop
| again
| Smoosh wrote:
| But have you used the new 3.6 release which uses pfnaphell
| integration to stochastically pre-convolute the tertiary
| nodes? It's a game-changer!
| Cerium wrote:
| One application for shape sensing technology is for surgical
| robotics. An example: https://www.intuitive.com/en-us/products-
| and-services/ion/sh...
|
| Relevant patent:
| https://patents.google.com/patent/US20240044638A1/
|
| The first time I saw one of these in person I was in awe. You
| could take a normal looking cable (think bicycle cable sleeve)
| and bend it and see in real time the same shape on the display.
| nuancebydefault wrote:
| This question surprises me. Bend location == locate where
| something is bent. Then some videos of the researcher bending
| a tube. Is there any confusion possible?
| porphyra wrote:
| OP updated his blog post after I posted my comment. The
| opening paragraphs are now very clear and awesome.
| nuancebydefault wrote:
| Oh - thanks for pointing this out!
| szundi wrote:
| This guy knows how to find shit out, I like this!
| blkhawk wrote:
| at first blush that sounds like a better version of the way the
| Nintendo Power Glove detected finger bending.
| moralestapia wrote:
| Paul, what an amazing project, this is what hacking is all about.
| Congratulations!
|
| Definitely try to explore the commercial side of your invention.
|
| It wouldn't hurt to talk to an IP lawyer, if you're still in Uni
| they usually have people there doing this and you can just go
| talk to them, free of charge (for you!).
|
| I'm generally against the idea of patents, mainly because of
| people who have learned to game the system and exploit it
| (patent trolls, etc.). But your project is a real thing with
| real applications, and you definitely deserve a share of
| whatever commercial benefit it could bring to the world, :D.
| 00702 wrote:
| My (former) school is actually already in the process of doing
| that! My dissertation committee thought it was novel enough
| that it needed some IP protection and encouraged me to pursue
| that.
| moralestapia wrote:
| Hey, that's great to hear.
|
| Best of luck with everything!
| kobalsky wrote:
| Maybe I'm not understanding the blog post so bear with me.
| Isn't what he's describing what a time domain reflectometer
| does? [1] I mean, that's what's used to detect breaks or
| kinks in fiber optic cables. The same tech is used to detect
| problems in civil infrastructure with embedded fiber optics.
|
| [1]: https://en.wikipedia.org/wiki/Optical_time-
| domain_reflectome...
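|
| For reference, an OTDR locates a fault from the round-trip time
| of a reflected pulse. A back-of-the-envelope sketch (the values
| are illustrative, not from the article):
|
|     C = 299_792_458.0  # speed of light in vacuum, m/s
|     N_GROUP = 1.468    # typical group index of silica fiber
|
|     def fault_distance_m(round_trip_s):
|         # The pulse travels out and back, so halve the path.
|         return (C / N_GROUP) * round_trip_s / 2
|
|     print(fault_distance_m(10e-6))  # 10 us round trip -> ~1021 m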
| 00702 wrote:
| Oh yeah cable companies have long been able to do that - I
| wasn't trying to compete with or replace that technology. My
| work was soft robotics-focused with simplicity in mind.
| georgeburdell wrote:
| Yeah my thoughts exactly. OTDR is what the optical networking
| industry uses (for much larger runs of fiber) to find
| kinks/breaks.
| moralestapia wrote:
| It could still be novel if it uses a slightly different
| method or even different materials. Hence the advice to get
| this into the hands of IP experts.
| georgeburdell wrote:
| I wasn't commenting on the novelty aspect; I was more
| wondering in which situations the author's device might
| be better. Also, the author noted they started with a
| time-of-flight sensor, which would have made it extremely
| similar to OTDR
| nimish wrote:
| hell yeah, this is super cool!
| lttlrck wrote:
| This is really really cool, but why does it depend on TPU?
| Wouldn't it work with regular optical fiber? Or does it come down
| to economics?
| 00702 wrote:
| It doesn't! I heavily used TPU to drive home the point that it
| can work with almost any light-transmitting fiber. I used PMMA
| optical fiber for the finer demos.
| lttlrck wrote:
| That's great! Well done.
| Projectiboga wrote:
| Hi, what kind of sensor? When posting to the public you should
| mention early on what can be sensed or scanned.
| bobberkarl wrote:
| Have you been to Gabon recently?
| Projectiboga wrote:
| Solid job, and congratulations on doing something useful,
| entering the program with the goal of making something tangible,
| and getting out promptly.
| 00702 wrote:
| Thanks -- I definitely relied heavily on my wife in order to
| maintain that pace.
| ericdfoley wrote:
| Not quite the same thing, but this reminds me of DAS
| (distributed acoustic sensing), which uses fiber optic cable for
| various acoustic sensing tasks--basically as an
| alternative to geophones/hydrophones. There have been a number of
| papers using transoceanic fibers for various monitoring tasks.
|
| That is also used for various industrial applications, e.g. for
| strain sensing by Luna Innovations. I know that Schlumberger has
| various patents on fiber-optic sensing relating to towed
| streamers (e.g. for marine seismic acquisition.) But I haven't
| seen it used for soft robotics before.
| 00702 wrote:
| Yeah, initially doing the literature review was a bit daunting
| because of all this existing work, especially FBG-type sensors,
| but this idea is so fundamentally simple that it's been mostly
| bypassed by the smarter minds
| ericdfoley wrote:
| Yeah, it makes a lot of sense to just create those gaps when
| you're specifically installing the fiber for sensing.
| bythreads wrote:
| Congrats, and if I may: your blog post should be how PhDs are
| done - readable, understandable, and devoid of techno-mumbling.
| 00702 wrote:
| Thank you!
| lifeisstillgood wrote:
| "I stuck a piece of plastic to my desk, it bent as I did it, I
| had time to investigate and skills to get a whole PhD out of
| that"
|
| And this, dear worker drones, is why schedule destroys quality :-)
|
| Loving the write up. Clear and simple. Good luck with squishy
| robots. :-)
| 00702 wrote:
| Haha this made me laugh -- thanks!
| raphman wrote:
| Very beautiful research and thorough documentation. I initially
| wanted to comment that this looks a lot like time-domain
| reflectometry on a conceptual level - but as Cindy Harnett seems
| to be your advisor, you probably know that already :)
| 00702 wrote:
| Indeed! How do you know her?
| raphman wrote:
| Oh, just from reading her papers. I played with TDR a long
| time ago [1] and try to follow current developments.
|
| [1] https://dl.acm.org/doi/10.1145/2047196.2047264
| teucris wrote:
| I wonder if by using a large nozzle, you could print out the
| entire sensor by laying out lengths of TPU with flexible joints
| at each air gap. It would depend on how well light traveled
| through the printed part though.
| 00702 wrote:
| 3D printing does affect the light passing through
| significantly. There are a number of options for fabricating
| these but most of the successful ones involve cutting (can even
| use a laser cutter).
| adversaryIdiot wrote:
| incredible work. good job!
| 00702 wrote:
| Thanks!
| uoaei wrote:
| Similar concept but for temperature (linked on HN a while back):
|
| https://en.m.wikipedia.org/wiki/Distributed_temperature_sens...
| poyu wrote:
| Great job, this is so cool!! Can't wait to see where this will be
| used.
| wholinator2 wrote:
| As someone about to start a PhD in theoretical physics, how did
| you do it in 3 years? I've been told a doctorate is 2 years of
| classes followed by 3 to 5 years of research. Did you already
| have a master's that your doctoral program accepted to override
| the class requirements?
| 00702 wrote:
| The short answer? Weekly meetings with my advisor! Long answer:
| I also had 2 years of classes but I started working on my
| research immediately, while taking classes. By the time I
| finished all my classes and became a candidate I had one paper
| already published and another one accepted, so I was able to
| get a 3rd paper out and defend by the end of the 3rd year.
| wantsanagent wrote:
| Isn't total internal reflection affected by surface defects?
| I.e., couldn't you scratch the filament rather than air-gap it?
| 00702 wrote:
| I was aiming for significant attenuation when bending, so
| scratching wouldn't be enough.
| crote wrote:
| What an absolutely amazing idea, great work!
|
| Your sensor data seems to have quite large "dead zones" - those
| should be trivially fixable by reducing the inter-sensor
| distance, right?
|
| Would it be useful to sense the _direction_ of the bend? I reckon
| this might be possible by dividing the tube like a Mercedes logo,
| and having three sets of the sensor in one outer tube.
|
| Is there a way to sense multiple bends? With the current setup
| that'd result in invalid readings as you're essentially OR-ing
| the value. Are there any good solutions for this?
| 00702 wrote:
| Great ideas! Even though I haven't implemented it fully, it is
| possible to sense multiple bends because each bend will always
| have the same relative attenuation (across the strands), so it
| would just be a matter of matching on the relative deltas from
| one reading to another. The catch, though, is that at some
| point no light will reach the end if every joint bends a lot.
| There are ways to mitigate that, but my comment is too long
| already!
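|
| Roughly, though, the matching I have in mind looks like this
| (toy numbers, not the dissertation code):
|
|     import numpy as np
|
|     # Hypothetical per-strand attenuation signatures for each
|     # gap, measured once at calibration (rows: gaps, columns:
|     # strands).
|     signatures = np.array([
|         [0.8, 0.1, 0.1],  # gap 0 mostly darkens strand 0
|         [0.1, 0.8, 0.1],  # gap 1 mostly darkens strand 1
|         [0.1, 0.1, 0.8],  # gap 2 mostly darkens strand 2
|     ])
|
|     def match_bend(delta):
|         # Normalize away bend depth, then pick the gap whose
|         # signature has the highest cosine similarity.
|         d = delta / np.linalg.norm(delta)
|         sims = signatures @ d / np.linalg.norm(signatures, axis=1)
|         return int(np.argmax(sims))
|
|     print(match_bend(np.array([0.05, 0.02, 0.41])))  # -> 2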
| gaudystead wrote:
| No need to cut your comments short! You clearly have a
| passion, and passion is the best way to get other people
| interested! :D
|
| (also, nice work!)
| datadrivenangel wrote:
| Love the use of lower-cost materials and nice, janky DIY
| SCIENCE.
|
| 00702, 4th photo caption appears to be missing the word gap. :)
| 00702 wrote:
| Fixed -- thanks! And yes exactly! I had access to basically any
| piece of equipment I could want (including a cleanroom that can
| create ICs), but then basically no one would have been able to
| recreate what I made.
| npace12 wrote:
| This is such a clever project, and a great write up! Thanks for
| sharing!
| maxnoe wrote:
| Very Cool!
|
| Since you mentioned the visualization part, let me comment on my
| pet peeve:
|
| The rainbow / jet color map should not be used for anything. It
| distorts your data and is not accessible for people with color
| vision deficiencies.
|
| If you want to know more, have a look here:
| https://matplotlib.org/stable/users/explain/colors/colormaps...
| and this talk here: https://youtu.be/xAoljeRJ3lU
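|
| A concrete fix, for anyone reading along -- matplotlib ships
| perceptually uniform maps, so the change is one keyword:
|
|     import numpy as np
|     import matplotlib.pyplot as plt
|
|     data = np.random.rand(32, 32)
|     fig, (ax1, ax2) = plt.subplots(1, 2)
|     ax1.imshow(data, cmap="jet")      # distorts; bad for CVD
|     ax2.imshow(data, cmap="viridis")  # perceptually uniform
|     plt.show()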
| itissid wrote:
| I recall first hearing and reading about soft robot applications
| from Veritasium: https://www.youtube.com/watch?v=058hRtaCWC0 I
| think you must have heard of it?
|
| In the realm of rescuing or assisting people unreachable
| otherwise, could this bend and flex as much as those highly
| compliant robots?
| bobberkarl wrote:
| This makes me think of so many possible industrial applications.
| I believe you are onto something great!
| skoocda wrote:
| This is awesome! Presumably you can make this work with any
| interface that doesn't enforce the total internal reflection of
| a fiber optic cable, and therefore allows light to leak out.
| Instead of an air gap, have you tried experimenting with removing
| the cladding of the fiber optic cable, but keeping the core
| intact?
|
| Alternatively, could you use a short segment of colored cladding
| that allows certain wavelengths to leak out more than others? I
| think that would allow you to encode each bend point as a
| different color-- which might require a different (more
| expensive) rx sensor, but could be useful for certain
| applications.
| jjk166 wrote:
| This is fantastic! One can easily imagine some minor refinements
| that would allow this to be mass-producible with very high
| accuracy. And the applications are abundant. I imagine you could
| use space filling curves to make 2D or 3D sensors that could
| cost-efficiently give robots a sense of touch. Wrapped around
| something like a flexible tube you could make it directionally
| sensitive for proprioception. It's quite possible that other
| things that affect the air gaps, like temperature differences,
| could be detected and localized as well.
| 00702 wrote:
| There are a lot of cool applications indeed! I was able to use
| it for gait sensing on a soft robot "leg", but I have to wait for
| the paper to be published later this year before going into too
| much detail.
| mhb wrote:
| Thanks for sharing your work. Would you be able to compare your
| device to the fiber optic one used in the data glove (e.g.,
| https://iopscience.iop.org/article/10.1088/1742-6596/2464/1/...)?
| hypercube33 wrote:
| Isn't this sensor a lot like, or exactly like, how the Power
| Glove works?
| mhb wrote:
| It's how Jaron Lanier's DataGlove worked. The Power Glove
| used cheaper flex sensors instead of fiber optics.
|
| https://en.wikipedia.org/wiki/Power_Glove
| 00702 wrote:
| There are a lot of similarities in the approach to the linked
| paper (which is a very cool concept) and I saw a lot of similar
| concepts in my lit review. At a high level, my sensor targets
| bend localization with simple fabrication techniques while the
| linked paper is doing more general camera-based gesture
| recognition. I have a more thorough comparison to existing work
| in my actual dissertation.
|
| Our lab has done a good bit of work around elastomers similar
| to the linked paper, such as multitouch pressure sensing
| (https://ieeexplore.ieee.org/abstract/document/9674750). The
| authors of your linked paper can actually achieve what they've
| done with a single light source by using one of these! The
| zones are key (https://www.st.com/en/imaging-and-photonics-
| solutions/time-o...)
| yogurtboy wrote:
| This is cool as hell! I hope your PhD continues well, and that
| this invention serves you in the future.
|
| Do you put any lube on the interface between the silicone sleeve
| and optical cable? I imagine the bending action will cause
| displacement, and friction there could rub and/or cause the
| nominal position to shift around.
| 00702 wrote:
| Since the sleeve is a stretchy rubber, as long as its inner
| diameter is a bit smaller than the outer diameter of the
| fiber, it holds just fine. For more dynamic applications,
| though, a silicone adhesive (or even super glue for more
| permanent strands) helps!
| riedel wrote:
| Have you explored fiber Bragg gratings? 10 years ago a student of
| mine semi-successfully explored sensing the shape of a firefighter
| hose using that technique. It seems to be gaining traction again
| lately for optical shape sensing.
| 00702 wrote:
| I explored FBG sensors early on and they are very cool -- I was
| aiming for a less expensive and more robotics-oriented
| application. Something that can be seamlessly integrated into a
| design at a lower level without the complexity of FBG
| technology.
| Prcmaker wrote:
| I think you made the right choice. You've come up with a
| great sensor topology. FBGs are good, but there's so much to
| them that is just infuriating. Late in my optics work, I ended
| up spending most of my time finding ways to avoid them.
| Prcmaker wrote:
| The perpetual issue with FBGs is cost, in my experience. For
| quality sensors, you can use cheap fibre with expensive
| detectors, or cheap detectors with expensive fibre. There's
| been a perpetual promise of the tech getting cheaper, but the
| price never seems to really drop. We always struggled to find
| buyers once they saw the price tag.
| knodi123 wrote:
| This is interesting! Although needing log2(n) fibers, with a
| different encoding at each junction, presents quite a
| manufacturing challenge. Oh well, it's not a problem at the
| research/POC phase!
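|
| If I've understood the scheme, the encoding is something like
| this (my own reconstruction, not the author's code):
|
|     from math import ceil, log2
|
|     def gap_codes(n_gaps):
|         # A '1' bit means that fiber has an air gap at that
|         # junction, so bending there attenuates exactly that
|         # subset of fibers. Code 0 is reserved for "no bend".
|         n_fibers = ceil(log2(n_gaps + 1))
|         return [format(i, f"0{n_fibers}b")
|                 for i in range(1, n_gaps + 1)]
|
|     print(gap_codes(6))  # 3 fibers cover 6 gaps:
|     # ['001', '010', '011', '100', '101', '110']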
| rcarmo wrote:
| This, sir, is sheer genius. I genuinely loved your approach and
| the way you wrote it down.
| hervature wrote:
| > also be doable in 3 years
|
| I would like to add more color to this. A 3-year PhD is very
| possible for a motivated individual at what I'll call the low
| end of R1 universities. That doesn't mean you cannot do good
| research (the OP is a counterexample) but that there is a
| fundamental difference between the program that the OP went to
| and top-tier universities. Think Harvard, Berkeley, Stanford,
| etc.
|
| It is normally pretty easy to distinguish these programs because
| they focus a lot on course requirements, and the thesis itself
| counts as a course. From the OP's institution, you can see that
| the course load is at least 15 courses and I would not be
| surprised if some students do 20 [1]. These programs are more or
| less an advanced undergraduate degree with a real independent
| research project that spans multiple years. Conversely, top-tier
| universities typically operate under the "publish 3 things and
| you have satisfied the thesis requirements". This cannot be
| written explicitly, so it is normally difficult to ascertain
| online. For example, Harvard has similar requirements [2] but you
| can still find it for some departments [3]. The Catch-22 with
| this is that someone who can publish 3 things in 3 years can
| publish 6 things (or more) in 5 years which will greatly increase
| their academic job prospects. Thus, at top-tier universities,
| even the best students stay for 5 years at a minimum to start
| working the job market. You need to be at Dantzig's level to
| finish at a top-tier in 3 years [4]. To summarize, if you want to
| finish a PhD in 3 years, look for course-heavy programs and don't
| expect to get hired into academia.
|
| Edit: I see the OP commented somewhere else that they published 3
| papers. The OP is obviously a standout but I think most people
| have to be realistic that very few areas of research allow for
| publications in your first year of a PhD. For example, if you are
| doing research in LLMs, you are looking at a couple of years just
| to be brought up to speed.
|
| [1] - https://catalog.louisville.edu/graduate/programs-
| study/docto...
|
| [2] - https://seas.harvard.edu/office-academic-
| programs/graduate-p...
|
| [3] - https://www.math.harvard.edu/graduate/graduate-program-
| timel...
|
| [4] - https://en.wikipedia.org/wiki/George_Dantzig
| doug_life wrote:
| Very cool to see how it all came together. Can you elaborate on
| why you chose to use a Kalman filter in your signal chain?
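|
| For anyone unfamiliar, a scalar Kalman filter over noisy samples
| looks roughly like this -- a generic sketch, not OP's
| implementation:
|
|     def kalman_1d(samples, q=1e-4, r=0.1):
|         # q: process noise, r: measurement noise (tuning knobs).
|         x, p = samples[0], 1.0  # state estimate and its variance
|         out = []
|         for z in samples:
|             p += q            # predict: variance grows
|             k = p / (p + r)   # Kalman gain
|             x += k * (z - x)  # update toward the measurement
|             p *= (1 - k)      # shrink variance
|             out.append(x)
|         return out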
| sabujp wrote:
| is the idea to eventually make muscle fibers with lots of these
| stranded together?
| abraae wrote:
| I've been looking for a sensor that can accurately detect a golf
| club sweeping over the top of it (how close, how quickly, arc of
| swing).
|
| The idea being to create a golf launch monitor that doesn't
| require hitting a ball, so you can play sim golf inside. Think
| playing alongside the Masters as you watch on TV in the lounge -
| without smashing a golf ball through your TV.
|
| I am wondering if this could be suitable (or a number of them
| ganged together).
| cglace wrote:
| My dad had something like this that hooked up to his computer
| in the '90s.
| risenshinetech wrote:
| This already exists; it's called the OptiShot, and it doesn't
| work very well compared to any modern radar- or camera-based
| sensors.
| abraae wrote:
| The opportunity is for something that can accurately measure
| your golf swing without a ball.
|
| Optishot is not that:
|
| > Since no ball data is recorded, the OptiShot 2 cannot
| accurately read how your club impacts the ball, so if you top
| the ball or even miss entirely, the OptiShot 2 will show that
| you made solid contact.
| speps wrote:
| The risk of hitting the sensor and seeing it fly across the
| room into the TV seems too high.
|
| As mentioned, a high FPS camera along with the Kinect tech to
| extract a skeleton would work so much better. You could make
| that in your garage using a PlayStation Eye and existing open
| source tech.
| abraae wrote:
| Yep, the sensor would need to be buried/encased in a mat. The
| club needs to make contact with the mat realistically (i.e.
| like taking a divot on a real grass surface).
|
| Most launch monitors today use high-speed cameras or Doppler radar.
| Neither of them really works without a ball. The difference
| between a good shot and a crap shot is a few millimetres. The
| technology can measure a bit of data from the club alone
| (club speed, angle of attack) but that's not enough to
| accurately extrapolate actual ball flight, since it's all
| about the quality of the contact of club on ball before
| ground.
| schrectacular wrote:
| Seems like a Hall sensor or two plus a magnetic club head is
| what you need
| Prcmaker wrote:
| Very cool! I used to build sensors similar in construction, but
| in all glass. It was a great challenge and a lot of fun. I like
| your spin on things.
|
| Well done completing the PhD!
| Glyptodon wrote:
| Could you get a similar effect by cutting dimples or notches in
| patterns (like a helical line of small conical holes) along the
| whole length in a sleeve instead of using separate pieces and
| infer overall curve shape? Or do the segments need to be cut
| completely through?
___________________________________________________________________
(page generated 2024-04-11 23:00 UTC)