[HN Gopher] AutoBNN: Probabilistic Time Series Forecasting
       ___________________________________________________________________
        
       AutoBNN: Probabilistic Time Series Forecasting
        
       Author : simonpure
       Score  : 26 points
       Date   : 2024-03-29 11:37 UTC (11 hours ago)
        
 (HTM) web link (blog.research.google)
 (TXT) w3m dump (blog.research.google)
        
       | HuShifang wrote:
       | I'm not an expert in this, but...
       | 
       | > BNNs bring the following advantages over GPs: First, training
       | large GPs is computationally expensive, and traditional training
       | algorithms scale as the cube of the number of data points in the
       | time series. In contrast, for a fixed width, training a BNN will
       | often be approximately linear in the number of data points.
       | Second, BNNs lend themselves better to GPU and TPU hardware
       | acceleration than GP training operations.
       | 
       | If I'm not mistaken Hilbert Space Gaussian Processes (HSGPs) are
       | O(mn+m) (where m is the number of basis functions, often
       | something like m=30, m=60, or m=100), which is also a huge
       | improvement over conventional GPs' O(n^3). I know that there are
       | some constraints on HSGPs (e.g. they work best with stationary
       | time series, and they're not quite as accurate, flexible, or
       | readily interpretable or tunable as conventional GPs), but what
        | would be the argument for AutoBNN over an HSGP? Is it mainly
        | that AutoBNN doesn't require domain-expert input?
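The cubic-versus-linear contrast in the quoted blog passage can be sketched with a back-of-envelope flop count. This is illustrative only: the hidden width, layer count, and the Cholesky-dominated GP cost model are assumptions for the sketch, not AutoBNN's actual settings.

```python
# Rough flop-count sketch (not the AutoBNN implementation) contrasting
# exact GP training with one epoch of a fixed-width BNN. Assumptions:
# GP cost is dominated by the Cholesky factorization of the n x n kernel
# matrix (~n^3/3 flops); one BNN pass over n points through a small MLP
# costs roughly n * width flops per layer.

def gp_cholesky_flops(n: int) -> float:
    """Approximate flops to factor an n x n kernel matrix."""
    return n ** 3 / 3

def bnn_epoch_flops(n: int, width: int = 128, layers: int = 2) -> float:
    """Approximate flops for one pass over n points through a small MLP."""
    return n * width * layers

for n in (1_000, 10_000, 100_000):
    ratio = gp_cholesky_flops(n) / bnn_epoch_flops(n)
    print(f"n={n:>7}: GP/BNN per-step cost ratio ~ {ratio:,.0f}x")
```

Because the GP cost grows as n^3 while the fixed-width BNN cost grows as n, the ratio itself grows quadratically in n, which is the scaling argument the blog post is making.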
        
       ___________________________________________________________________
       (page generated 2024-03-29 23:01 UTC)