[HN Gopher] Fine-Tuning Transformers for NLP
       ___________________________________________________________________
        
       Fine-Tuning Transformers for NLP
        
       Author : dylanbfox
       Score  : 54 points
       Date   : 2021-06-21 14:49 UTC (8 hours ago)
        
 (HTM) web link (www.assemblyai.com)
 (TXT) w3m dump (www.assemblyai.com)
        
       | visarga wrote:
        | The same transformer diagram from the original paper, replicated
        | everywhere. Nobody has time to redraw it.
       | 
        | BTW, take a look at the "sentence transformers" library, a nice
        | interface on top of Hugging Face for these kinds of operations
        | (reusing, fine-tuning).
       | 
       | https://www.sbert.net/
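A minimal sketch of what reusing a sentence-transformers model looks like. The checkpoint name below is just a commonly cited example, not something from the thread, and encoding requires downloading weights, so that part is guarded; the cosine helper is plain Python.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

if __name__ == "__main__":
    # pip install sentence-transformers; this downloads model weights.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # example checkpoint
    emb = model.encode(["great movie", "terrible movie"])
    print(cosine(emb[0].tolist(), emb[1].tolist()))
```

The typical reuse pattern is exactly this: embed sentences once, then compare or cluster them with a cheap similarity measure.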
        
       | whimsicalism wrote:
        | Hm. I read this expecting a more in-depth discussion of best
        | practices for fine-tuning massive transformers while avoiding
        | catastrophic forgetting, e.g.
       | 
       | * How should you select the learning rate?
       | 
       | * What tasks are best for fine-tuning on small amounts of data?
       | etc.
       | 
        | Instead, this mostly just runs through ML/DL 101 implementation
        | details: the loss function for binary classification, helper
        | functions to load data, etc.
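On the learning-rate question raised above, one widely used heuristic (not something the article covers) is layer-wise, a.k.a. discriminative, learning-rate decay: layers nearer the input get smaller rates so pre-trained features are disturbed less. The sketch below only computes the per-layer rates; the function name and decay factor are illustrative, and wiring the rates into optimizer parameter groups is framework-specific.

```python
def layerwise_lrs(base_lr, n_layers, decay=0.9):
    """Return one learning rate per layer, largest at the top (head).

    Layer 0 is the embedding/input end; layer n_layers - 1 is the
    classification head, which keeps the full base rate.
    """
    return [base_lr * decay ** (n_layers - 1 - i) for i in range(n_layers)]

# e.g. base rate 2e-5 over 4 layers, halving at each step down the stack
lrs = layerwise_lrs(2e-5, 4, decay=0.5)
```

In PyTorch these rates would go into per-group `lr` entries passed to the optimizer, one parameter group per layer.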
        
         | jpulliam wrote:
          | Author here! This tutorial was mainly set up for people new to
          | ML/DL, to walk them through using a pre-trained model, creating
          | the dataloader, writing the training loop, etc.
         | 
         | In the future we definitely plan to dive further into the
         | details and touch on some of the things you mentioned!
         | 
          | This was one of the primary reasons we chose the "Sentiment
          | Analysis" task: it's fairly simple to get a model trained
          | quickly with good performance.
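The pieces the author lists (a binary-classification loss, data handling, a training loop) can be shown end to end with a toy stand-in. This is not the tutorial's code: real fine-tuning would use PyTorch and a pre-trained encoder, and the tiny dataset below is invented. It only illustrates the binary cross-entropy loss and the shape of a training loop.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce(p, y):
    """Binary cross-entropy for a single prediction p against label y."""
    eps = 1e-12
    return -(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps))

# tiny made-up 2-feature dataset: label 1 when the first feature dominates
data = [([2.0, 0.1], 1), ([1.5, 0.3], 1), ([0.2, 1.8], 0), ([0.1, 2.2], 0)]

w, b, lr = [0.0, 0.0], 0.0, 0.5
for epoch in range(200):
    for x, y in data:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        g = p - y  # gradient of BCE w.r.t. the logit
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

avg_loss = sum(
    bce(sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b), y)
    for x, y in data
) / len(data)
```

A real fine-tuning loop has the same skeleton, with the linear model replaced by a pre-trained transformer plus a classification head.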
        
       | uniqueuid wrote:
        | For anyone looking to fine-tune transformers with less work,
       | there is the FARM project (https://github.com/deepset-ai/FARM)
       | which has some more or less ready-to-go configurations
       | (classification, question answering, NER, and a couple of
       | others). It's really almost "plug in a csv and run".
       | 
       | By the way, a pet peeve is sentiment detection. It's a useful
       | method, but please be aware that it does not measure "sentiment"
       | in a way that one would normally think, and that what it measures
        | varies strongly across methods
        | (https://www.tandfonline.com/doi/abs/10.1080/19312458.2020.18...).
        
         | Der_Einzige wrote:
          | There's another good library for fine-tuning transformers
          | called "simple transformers". It's basically an sklearn-style
          | interface on top of the base Hugging Face code, and it, too, is
          | essentially "plug a csv in and run".
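A sketch of that sklearn-style flow, based on my reading of the simpletransformers docs: build a two-column (text, label) dataset, call `train_model`, then `predict`. The helper and sample texts are invented for illustration, and the training/prediction calls are guarded because they download model weights.

```python
def to_rows(texts, labels):
    """simpletransformers expects training rows of [text, label]."""
    return [[t, l] for t, l in zip(texts, labels)]

train_rows = to_rows(["loved it", "hated it"], [1, 0])

if __name__ == "__main__":
    # pip install simpletransformers pandas; downloads roberta-base weights.
    import pandas as pd
    from simpletransformers.classification import ClassificationModel

    train_df = pd.DataFrame(train_rows, columns=["text", "labels"])
    model = ClassificationModel("roberta", "roberta-base", use_cuda=False)
    model.train_model(train_df)
    predictions, raw_outputs = model.predict(["a wonderful film"])
```

The whole fine-tuning job really is those three calls, which is what makes the "plug a csv in and run" description apt.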
        
       ___________________________________________________________________
       (page generated 2021-06-21 23:01 UTC)