
@veer66
Created July 13, 2021 13:09
# Load the Thai GPT-Neo checkpoint and its GPT-2-style tokenizer from the Hugging Face Hub.
from transformers import GPT2Tokenizer, GPTNeoForCausalLM

tokenizer = GPT2Tokenizer.from_pretrained("wannaphong/thaigpt-next-125m")
model = GPTNeoForCausalLM.from_pretrained("wannaphong/thaigpt-next-125m")

# Tokenize a Thai prompt, prefixed with the model's start-of-text marker.
inputs = tokenizer("<|startoftext|>พระทรงสารศรีเศวตเกศกุญชร", return_tensors="pt")

# Passing the input IDs as labels makes the model return the language-modeling loss.
outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss)
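
The same checkpoint can also be used to generate a continuation of the prompt. The sketch below continues from the snippet above (reusing model, tokenizer, and inputs); the sampling settings (max_new_tokens, top_p, temperature) are illustrative assumptions, not values from the original gist.

# Minimal generation sketch (assumed sampling settings):
# sample a continuation of the same Thai prompt and decode it back to text.
gen_ids = model.generate(
    **inputs,
    max_new_tokens=50,          # assumed length cap
    do_sample=True,             # nucleus sampling; greedy decoding also works
    top_p=0.95,                 # assumed nucleus threshold
    temperature=0.8,            # assumed sampling temperature
    pad_token_id=tokenizer.eos_token_id,  # avoids the missing-pad-token warning
)
print(tokenizer.decode(gen_ids[0], skip_special_tokens=True))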