Setup the BERT Pretrained Model
The pre-trained BERT model is a good starting point for many NLP tasks, and we can use its performance as a benchmark. In this section, we simply load the model.
Load Package
The model class lives in the transformers library.
Here we use the variant for "Sequence Classification", because the Twitter data is sequential: each example is a paragraph of text with a label.
Loading the model
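A minimal sketch of loading the model, assuming the `bert-base-uncased` checkpoint and a binary classification task (the checkpoint name and `num_labels` are assumptions; adjust them to your data):

```python
from transformers import BertForSequenceClassification

# Load pre-trained BERT with a sequence-classification head on top.
# "bert-base-uncased" and num_labels=2 are assumptions for this sketch.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",  # pre-trained checkpoint (assumed)
    num_labels=2,         # number of output classes (assumed binary)
)

# The BERT encoder weights come from pre-training; the classification
# head is freshly initialized and will be trained in the next sections.
print(model.config.num_labels)
```

`from_pretrained` downloads and caches the checkpoint on first use, so the call may take a while the first time it runs.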