[HN Gopher] Logistic Regression for Image Classification Using O...
       ___________________________________________________________________
        
       Logistic Regression for Image Classification Using OpenCV
        
       Author : andyjohnson0
       Score  : 49 points
       Date   : 2023-12-31 16:26 UTC (6 hours ago)
        
 (HTM) web link (machinelearningmastery.com)
 (TXT) w3m dump (machinelearningmastery.com)
        
       | minimaxir wrote:
        | Digits/MNIST is a very, very bad dataset for CV demos because
        | it's too easy: as done here, you can run a logistic regression
        | on raw pixel values and get reasonably good results.
       | That's also the reason why Fashion MNIST was created, to give ML
       | demos _some_ difficulty:
       | https://github.com/zalandoresearch/fashion-mnist
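The "too easy" claim above is simple to check; here is a minimal sketch using scikit-learn's bundled 8x8 digits dataset as a small MNIST-like stand-in (the dataset choice is an assumption for self-containedness, not from the comment):

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Raw pixel values as features -- no feature engineering at all.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=5000).fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.3f}")  # typically well above 0.9
```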
       | 
        | For more typical image classification problems, you can get >90%
        | of the way on any arbitrary image dataset _and_ with much less
        | code by using CLIPVision image embeddings as input to your
        | classification algorithm of choice.
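A sketch of the embedding approach described above, assuming the Hugging Face `transformers` CLIP implementation; the function name and model checkpoint are illustrative choices, not from the comment:

```python
def clip_features(image_paths):
    """Embed images with CLIP's vision tower; the returned array can be
    fed to any scikit-learn classifier (e.g. LogisticRegression).
    Imports are deferred so the sketch only needs its dependencies when run.
    """
    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    # Assumed checkpoint for illustration.
    name = "openai/clip-vit-base-patch32"
    model = CLIPModel.from_pretrained(name)
    processor = CLIPProcessor.from_pretrained(name)

    images = [Image.open(p).convert("RGB") for p in image_paths]
    inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats.numpy()
```

The classifier then trains on `clip_features(paths)` instead of raw pixels, which is where the "much less code" claim comes from.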
        
         | melenaboija wrote:
          | I doubt this is meant as an example for real applications;
          | it's meant to give some foundations in machine learning and
          | computer vision.
        
           | minimaxir wrote:
           | The title touts OpenCV, but OpenCV isn't doing much here
           | other than unnecessarily complicating things (which is bad
           | for newbies) and doesn't show off OpenCV's unique
           | capabilities.
           | 
           | In the final code sample, OpenCV is a) loading the image,
           | which could be done with PIL and b) training the model, when
           | the demo imports sklearn which has its own battle-tested
           | logistic regression implementation.
           | 
            | There are a lot of useful things that can be done with
            | machine learning and computer vision, but this article is a
            | bad demo of them: it won't work on any other real-world
            | dataset and is out of date with more modern CV approaches.
            | Their previous article is a good explanation of the math
            | behind logistic regression, though:
           | https://machinelearningmastery.com/logistic-regression-in-
           | op...
        
           | markisus wrote:
           | In fact, a neural net is "just" a stack of logistic
           | regressions if you only use sigmoid activations.
        
           | aydyn wrote:
            | There are lots of datasets where applying multiclass LR
            | makes sense. Image classification isn't one of them.
        
         | MOARDONGZPLZ wrote:
         | Seconded; I've used MNIST with more classic algos and it's very
         | easy to reach 99% without any modern techniques.
        
       | dankle wrote:
       | But why
        
         | aydyn wrote:
         | clicks and clout chasing
        
         | MOARDONGZPLZ wrote:
         | To achieve Machine Learning Mastery!
        
       | agilob wrote:
        | Did you know that OpenCV is collecting money right now?
       | https://www.indiegogo.com/projects/opencv-5-support-non-prof...
        
         | ta988 wrote:
          | It always makes me sad to see that projects used by so many
          | companies have to fight for what amounts to the cost of two
          | average devs at those same companies.
        
       | justinl33 wrote:
        | I love when basic statistical models are used for tasks usually
        | dominated by deep learning: image classification, stock price
        | analysis. Makes me happy for some reason.
        
         | minimaxir wrote:
          | In this case there's no _advantage_ to using logistic
          | regression on an image other than the novelty. Logistic
          | regression is excellent for feature explainability, but you
          | can't explain anything from an image.
          | 
          | Traditional (non-deep-learning) classification algorithms such
          | as Support Vector Machines and Random Forests perform a lot
          | better on MNIST, up to 97% test-set accuracy compared to the
          | 88% from logistic regression in this post. Check the original
          | MNIST benchmarks here:
          | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/#
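For reference, a minimal SVM baseline in the same spirit, again using scikit-learn's small 8x8 digits set as a stand-in for MNIST (so the exact accuracy will differ from the benchmark figures quoted above):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# RBF-kernel SVM on raw pixels -- still a "traditional" algorithm.
svm = SVC(gamma="scale").fit(X_train, y_train)
svm_acc = svm.score(X_test, y_test)
print(f"SVM test accuracy: {svm_acc:.3f}")
```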
        
           | valec wrote:
            | even kNN after dimensionality reduction does pretty well
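A quick sketch of that idea, assuming PCA for the dimensionality reduction and scikit-learn's small digits set as the data (both choices are illustrative):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Project 64 raw pixels down to 30 components, then classify by
# nearest neighbors in the reduced space.
knn = make_pipeline(PCA(n_components=30), KNeighborsClassifier(n_neighbors=3))
knn.fit(X_train, y_train)
knn_acc = knn.score(X_test, y_test)
print(f"kNN test accuracy: {knn_acc:.3f}")
```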
        
       | aodin wrote:
       | Pytorch includes a simple neural network example for the MNIST
       | data: https://github.com/pytorch/examples/blob/main/mnist/main.py
       | 
       | It only takes a few minutes to train with default parameters and
       | will have >99% accuracy on the MNIST test set.
        
       ___________________________________________________________________
       (page generated 2023-12-31 23:00 UTC)