@jadsongmatos
Created July 20, 2023 16:53
summary
from transformers import LEDForConditionalGeneration, LEDTokenizer

# Load the Longformer Encoder-Decoder (LED) model and tokenizer;
# this checkpoint accepts inputs of up to 16,384 tokens.
tokenizer = LEDTokenizer.from_pretrained("allenai/led-large-16384")
model = LEDForConditionalGeneration.from_pretrained("allenai/led-large-16384")

# Tokenize the text to summarize (placeholder text shown here).
inputs = tokenizer.encode("Lorem Ipsum is simply dummy text of the printing and typesetting industry.", return_tensors="pt")

# Generate the summary with beam search and a higher repetition penalty
# to discourage the model from repeating itself.
summary_ids = model.generate(inputs, num_beams=4, max_length=1024, repetition_penalty=2.0)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
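
For longer documents, the LED model card recommends placing global attention on the first token so the summary conditions on the whole input. A minimal sketch of that variant, assuming the tokenizer, model, and inputs defined above:

import torch

# Global attention on the first (<s>) token only; local attention elsewhere.
global_attention_mask = torch.zeros_like(inputs)
global_attention_mask[:, 0] = 1
summary_ids = model.generate(inputs, global_attention_mask=global_attention_mask,
                             num_beams=4, max_length=1024, repetition_penalty=2.0)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)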