[HN Gopher] Cross-Validation FAQ (2022)
       ___________________________________________________________________
        
       Cross-Validation FAQ (2022)
        
       Author : Tomte
       Score  : 34 points
       Date   : 2023-07-30 14:03 UTC (8 hours ago)
        
 (HTM) web link (avehtari.github.io)
 (TXT) w3m dump (avehtari.github.io)
        
       | jwilber wrote:
       | And here's a shorter, visual explainer on cross-validation:
       | 
       | https://mlu-explain.github.io/cross-validation/
       | 
        | What's really interesting about k-fold CV is that most textbooks
        | claim that the larger the value of k, the more variance the
        | error estimate tends to have, with Leave-One-Out Cross-
        | Validation (LOOCV) as the extreme case. The usual justification
        | is that a larger k means the models are trained on nearly
        | identical data, so their fold-wise errors are highly correlated
        | and averaging them reduces variance very little. That rationale
        | does not always hold in practice, though; it depends heavily on
        | the specific model and dataset, as the sketch below illustrates.
        
       ___________________________________________________________________
       (page generated 2023-07-30 23:01 UTC)