Example code for using the CodeLlama 7B Instruct model (Hugging Face Transformers version)

Based on the discussion at https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf/discussions/10

# Use a pipeline as a high-level helper
from transformers import pipeline
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")
model = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")


# Create a text-generation pipeline from the loaded model and tokenizer
code_generator = pipeline('text-generation', model=model, tokenizer=tokenizer)

# Generate code for an input string; max_new_tokens bounds the length of
# the completion itself, whereas max_length also counts the prompt tokens
input_string = "Write a python function to calculate the factorial of a number"
generated_code = code_generator(input_string, max_new_tokens=100)[0]['generated_text']
print(generated_code)
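The snippet above passes the raw prompt straight to the model. CodeLlama-Instruct was fine-tuned on an `[INST] ... [/INST]` chat format (with an optional `<<SYS>>` block for a system prompt), so completions tend to follow instructions better when the prompt is wrapped accordingly. A minimal sketch of that formatting, assuming the tag layout described on the CodeLlama model card (`build_instruct_prompt` is a hypothetical helper name, not part of transformers):

```python
def build_instruct_prompt(user_message, system_prompt=None):
    # Hypothetical helper: wrap a user message in the CodeLlama-Instruct
    # chat template. The optional system prompt goes inside a <<SYS>> block
    # at the start of the user turn.
    if system_prompt:
        user_message = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message}"
    return f"[INST] {user_message} [/INST]"

prompt = build_instruct_prompt(
    "Write a python function to calculate the factorial of a number"
)
```

The resulting `prompt` string can then be passed to `code_generator` in place of the bare `input_string`. Newer transformers releases also expose `tokenizer.apply_chat_template`, which applies the model's own template for you when the tokenizer ships one.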