Question : how to add special tokens to the BERT tokenizer
Answered by : clever-cardinal-2nfqaxgpqjf4
# Assumes the Hugging Face transformers library; bert-base-uncased is used
# here as an example checkpoint — any BERT checkpoint works the same way.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')

# Register the new tokens so the tokenizer keeps them as single units.
special_tokens_dict = {'additional_special_tokens': ['[C1]', '[C2]', '[C3]', '[C4]']}
num_added_toks = tokenizer.add_special_tokens(special_tokens_dict)

# Grow the model's embedding matrix so the new token ids have embeddings.
model.resize_token_embeddings(len(tokenizer))
Last Update : Thu, 28 Jan 21