Set Up the BERT Pretrained Model
The pre-trained BERT model is a good starting point for many NLP tasks, and we can use its performance as a benchmark. In this section, we simply load the model.
Load Package
from transformers import BertForSequenceClassification

Loading the model
model = BertForSequenceClassification.from_pretrained(
    'bert-base-uncased',        # the specific model we want; it must match the tokenizer we chose
    num_labels=len(label_dict), # how many classes / labels we need to predict
    output_attentions=False,    # whether the model also returns its attention weights
    output_hidden_states=False  # whether the model also returns all intermediate hidden states
)
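To double-check what we just loaded, we can move the model to the GPU and inspect the new classification head. Below is a minimal sketch, assuming label_dict is the label-name-to-id mapping built in the previous section; the three labels shown are placeholders, not the tutorial's actual labels:

import torch
from transformers import BertForSequenceClassification

# Placeholder label mapping -- in this tutorial, label_dict comes from the
# dataset prepared in the previous section.
label_dict = {'negative': 0, 'neutral': 1, 'positive': 2}

model = BertForSequenceClassification.from_pretrained(
    'bert-base-uncased',
    num_labels=len(label_dict),
    output_attentions=False,
    output_hidden_states=False
)

# Move the model to the GPU if one is available.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = model.to(device)

# The classification head is a freshly initialized linear layer with one
# output per label; it is what we will fine-tune in the next sections.
print(model.classifier)  # Linear(in_features=768, out_features=3, bias=True)
print(f'{sum(p.numel() for p in model.parameters()):,} parameters')

Note that from_pretrained will warn that the classifier weights are newly initialized; that is expected, since the classification head has not been trained yet.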