[HN Gopher] Show HN: Automated smooth Nth order derivatives of noisy data
       ___________________________________________________________________
        
       Show HN: Automated smooth Nth order derivatives of noisy data
        
       This little project came about because I kept running into the same
       problem: cleanly differentiating sensor data before doing analysis.
       There are a ton of ways to solve this problem; I've always
       personally been a fan of using Kalman filters for the job, as it's
       easy to get the double whammy of resampling/upsampling to a fixed,
       consistent rate along with smoothing/outlier rejection. I wrote a
       little numpy-only Bayesian filtering/smoothing library recently
       (https://github.com/hugohadfield/bayesfilter/), so this felt like a
       fun and very useful first thing to try it out on! If people find
       kalmangrad useful I would be more than happy to add a few more
       features, and I would be very grateful if people sent in any bugs
       they spot. Thanks!
        
       Author : hugohadfield
       Score  : 33 points
       Date   : 2024-10-16 20:17 UTC (2 hours ago)
        
        
       | pm wrote:
       | Congratulations! Pardon my ignorance, as my understanding of
       | mathematics at this level is beyond rusty, but what are the
       | applications of this kind of functionality?
        
         | hugohadfield wrote:
         | No problem! Let's dream up a little use case:
         | 
         | Imagine you have a speed sensor, e.g. on your car, and you would
         | like to calculate the jerk (2nd derivative of speed) of your
         | motion (useful in a range of driving comfort metrics etc.). The
         | speed sensor on your car is probably not all that accurate: it
         | will give slightly noisy output, and it may not deliver that
         | output at exactly 10 times per second; you will have some jitter
         | in the rate at which you receive data. If you naively attempt to
         | calculate jerk by applying central differences to the signal
         | twice (using np.gradient twice), you will amplify the noise in
         | the signal and end up with something that looks totally wrong,
         | which you will then have to post-process and maybe resample to
         | get at the rate you want. If instead of np.gradient you use
         | kalmangrad.grad, you will get a nice smooth jerk signal (and a
         | fixed-up speed signal too). There are many ways to do this kind
         | of thing, but I personally like this one as it's fast, can be
         | run online, and if you want you can get uncertainties in your
         | derivatives too :)
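A minimal numpy-only sketch of the noise amplification described above. The signal shape, noise level, and sample count are made up for illustration, and kalmangrad itself is not called here; this only shows why naive double differentiation fails:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)

# Hypothetical "speed sensor": v(t) = sin(t), so the true jerk
# (2nd derivative of speed) is -sin(t), with amplitude 1.
true_speed = np.sin(t)
true_jerk = -np.sin(t)

# Add 1% measurement noise to the speed signal.
noisy_speed = true_speed + rng.normal(0.0, 0.01, size=t.shape)

# Naive approach: central differences applied twice.
naive_jerk = np.gradient(np.gradient(noisy_speed, t), t)

# Even tiny speed noise swamps the true jerk after two
# differentiations: the RMS error dwarfs the signal itself.
rms_error = np.sqrt(np.mean((naive_jerk - true_jerk) ** 2))
print(rms_error)  # far larger than the true jerk's amplitude of 1
```

Each differencing divides the noise by the (small) time step, so the noise floor grows roughly as 1/dt per derivative order; a smoothing estimator such as a Kalman filter/smoother avoids this by modelling the derivatives as states rather than differencing the raw samples.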
        
         | uoaei wrote:
         | Basically, approximating calculus operations on noisy,
         | discrete-in-time data streams.
        
       | theaussiestew wrote:
       | I'm looking to calculate jerk from accelerometer data, I'm
       | assuming this would be the perfect use case?
        
         | hugohadfield wrote:
         | this is a perfect use case, let me know how it goes!
        
       ___________________________________________________________________
       (page generated 2024-10-16 23:00 UTC)