[HN Gopher] NLP Course - For You
       ___________________________________________________________________
        
       NLP Course - For You
        
       Author : mjakl
       Score  : 32 points
       Date   : 2023-12-23 19:15 UTC (3 hours ago)
        
 (HTM) web link (lena-voita.github.io)
 (TXT) w3m dump (lena-voita.github.io)
        
       | victorbjorklund wrote:
        | Definitely not responsive design, but the syllabus looks
        | promising.
        
         | hackernewds wrote:
          | Definitely not the best critique
        
       | light_hue_1 wrote:
        | As an ML researcher, I'm sad to say that this is painfully out
        | of date. It's definitely a course from 5 years ago. Totally
        | irrelevant as an intro to NLP today.
        
         | linooma_ wrote:
          | Has much changed as far as part-of-speech tagging goes in the
          | last 5-10 years?
        
           | screye wrote:
            | The sad truth is that all of classical NLP is Dead, with a
            | capital D.
           | 
           | The bottleneck for accuracy was always data quality and human
           | effort, not model architecture.
           | 
            | LLMs make the data and human-effort problems so much easier
            | that maintaining separate bespoke architectures no longer
            | makes sense. With quantization, I'm not even sure classical
            | models still win on cost, and they had already lost on
            | (real-world) accuracy.
           | 
            | LLMs are the O365 subscription that you just can't fight
            | with a bespoke mini solution; an all-in-one offering is
            | simply too appealing.
           | 
            | Also, if you have to learn pre-2020 NLP, I would just learn
            | to use spaCy. It covers pretty much all of pre-2020 NLP out
            | of the box in a well-documented package with strong GPU and
            | CPU support.
        
             | behnamoh wrote:
              | So if anyone wants to learn language modeling, do you
              | recommend starting with transformers and just learning
              | how to deploy and fine-tune LLMs (given that ordinary
              | people can't train them from scratch)?
        
           | nvtop wrote:
            | A lot. POS taggers used to be linear classifiers plus
            | hand-engineered features. In 2018 they switched to BERT and
            | similar encoder-only models. In 2023, POS tagging is
            | largely irrelevant: it used to be one stage of a larger
            | pipeline, but now you can do everything end-to-end with
            | better accuracy by fine-tuning a sufficiently large
            | pretrained model (an LLM, or an encoder-decoder like T5).
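            | 
            | For instance, a rough sketch with the transformers token-
            | classification pipeline (the model name below is just one
            | community POS model, picked as an example):
            | 
            |     from transformers import pipeline
            | 
            |     # a BERT encoder fine-tuned for English POS tagging
            |     tagger = pipeline(
            |         "token-classification",
            |         model="vblagoje/bert-english-uncased-finetuned-pos")
            |     for tok in tagger("Time flies like an arrow."):
            |         print(tok["word"], tok["entity"])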
        
         | fantispug wrote:
          | It covers a lot of the fundamentals in some detail (attention
          | and transformers, decoding, transfer learning) that underpin
          | current cutting-edge NLP; this is still a very good
          | foundation, likely to hold up for several more years.
         | 
          | What might be missing is in-context learning, prompt
          | engineering, novel forms of attention, RLHF, and LoRA (though
          | it does cover adapters), but all of this is still changing
          | rapidly, and the details may be irrelevant in another year.
          | If you look at a recent course like Stanford's CS224N (2023),
          | there's a lot of overlap.
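          | 
          | For a taste of the LoRA side, a rough sketch with Hugging
          | Face's peft library (the base model and hyperparameters here
          | are placeholders, not anything from the course):
          | 
          |     from transformers import AutoModelForCausalLM
          |     from peft import LoraConfig, get_peft_model
          | 
          |     base = AutoModelForCausalLM.from_pretrained("gpt2")
          |     config = LoraConfig(
          |         r=8, lora_alpha=16,
          |         target_modules=["c_attn"],  # GPT-2's fused QKV proj.
          |         lora_dropout=0.05, task_type="CAUSAL_LM")
          |     model = get_peft_model(base, config)
          |     # only the small low-rank adapter weights are trainable
          |     model.print_trainable_parameters()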
        
       ___________________________________________________________________
       (page generated 2023-12-23 23:02 UTC)