[HN Gopher] Machine Learning Cohorts: A Synthesis
___________________________________________________________________
Machine Learning Cohorts: A Synthesis
Author : polm23
Score : 13 points
Date   : 2021-06-20 13:23 UTC (1 day ago)
(HTM) web link (github.com)
(TXT) w3m dump (github.com)
| seasily wrote:
| This largely misses the mark. Kaggle is a machine learning
| competition platform for a small set of exceptional machine
| learning talent, and a lot of students or hangers-on.
|
| "Machine learning projects - if ML is being attempted at all -
| are in early stages, using traditional methods that are best-
| suited for high-RAM CPU rather than GPU SKUs (ex: scikit-learn
| and clustering approaches)."
|
| The idea that the machine learning being done there is in "early
| stages" is laughable, given that the prize pools and sheer
| competitiveness usually push results well past existing SOTA--
| regularly advancing benchmarks on Google's image classification
| and labeling (!) tasks and other areas where enormous teams
| can't match the top few.
|
| Part of what you're seeing is that the mass of survey
| participants, who show up to fork notebooks and fake ML skills,
| are people who haven't exactly established themselves in the
| field (https://commons.m.wikimedia.org/wiki/File:Survivorship-
| bias....), while the top end is too small a group to fully
| characterize with low-powered clustering.
|
| Even five years ago, most real-world MLEs knew how to use AWS
| and were deploying actual machine learning models (granted, more
| primitive than today's methods), and I wouldn't be so glib
| about calling anything "early stage" even then, given the raw
| business value it provides.
___________________________________________________________________
(page generated 2021-06-21 23:01 UTC)