Question : gpt-3 tokenizer python3
Answered by : cipher
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

tokenizer("Hello world")['input_ids']
# [15496, 995]

tokenizer(" Hello world")['input_ids']
# [18435, 995]
Source : https://huggingface.co/docs/transformers/model_doc/gpt2#transformers.GPT2TokenizerFast | Last Update : Mon, 23 May 22