@alonsosilvaallende
Created August 12, 2024 16:15
apply_chat_template_new.ipynb
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": [],
"authorship_tag": "ABX9TyO4sIUHyOWCVf9zieKhWDiD",
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/alonsosilvaallende/1d130bbba88e6f7d109685d9e065125b/apply_chat_template_new.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"id": "0KNAocOmS6nc"
},
"outputs": [],
"source": [
"%pip install --upgrade --quiet transformers"
]
},
{
"cell_type": "code",
"source": [
"import warnings\n",
"\n",
"warnings.filterwarnings(\"ignore\", category=UserWarning)"
],
"metadata": {
"id": "oF2KvGu6W-I3"
},
"execution_count": 2,
"outputs": []
},
{
"cell_type": "code",
"source": [
"from transformers import AutoTokenizer\n",
"\n",
"model_id = \"NousResearch/Hermes-2-Pro-Llama-3-8B\"\n",
"tokenizer = AutoTokenizer.from_pretrained(model_id)"
],
"metadata": {
"id": "k_iiO4-vTdBo"
},
"execution_count": 3,
"outputs": []
},
{
"cell_type": "code",
"source": [
"def get_current_temperature(location: str):\n",
"    \"\"\"Get the temperature at a given location.\n",
"    Args:\n",
"        location: the location to get the temperature for\n",
"    \"\"\"\n",
"    return 22"
],
"metadata": {
"id": "IxdylcocWa8J"
},
"execution_count": 4,
"outputs": []
},
{
"cell_type": "code",
"source": [
"messages = [\n",
"    {\"role\": \"user\", \"content\": \"What's the temperature in Paris?\"}\n",
"]"
],
"metadata": {
"id": "SEG8bJEET3kZ"
},
"execution_count": 5,
"outputs": []
},
{
"cell_type": "code",
"source": [
"prompt = tokenizer.apply_chat_template(\n",
"    messages,\n",
"    tools=[get_current_temperature],\n",
"    tokenize=False\n",
")\n",
"print(prompt)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "jZCv-oajT8IX",
"outputId": "3516200c-249f-4364-c546-25dc6ed1d179"
},
"execution_count": 6,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"<|begin_of_text|>You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools: <tools> {\"type\": \"function\", \"function\": {\"name\": \"get_current_temperature\", \"description\": \"get_current_temperature(location: str) - Get the temperature at a given location.\n",
"\n",
" Args:\n",
" location(str): the location to get the temperature for\", \"parameters\": {\"type\": \"object\", \"properties\": {\"location\": {\"type\": \"string\", \"description\": \"the location to get the temperature for\"}}, \"required\": [\"location\"]}} </tools>Use the following pydantic model json schema for each tool call you will make: {\"properties\": {\"arguments\": {\"title\": \"Arguments\", \"type\": \"object\"}, \"name\": {\"title\": \"Name\", \"type\": \"string\"}}, \"required\": [\"arguments\", \"name\"], \"title\": \"FunctionCall\", \"type\": \"object\"}\n",
"For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:\n",
"<tool_call>\n",
"{\"arguments\": <args-dict>, \"name\": <function-name>}\n",
"</tool_call><|im_end|><|im_start|>user\n",
"What's the temperature in Paris?<|im_end|>\n",
"\n"
]
}
]
},
{
"cell_type": "code",
"source": [],
"metadata": {
"id": "IVjmdCH_UAdZ"
},
"execution_count": 6,
"outputs": []
}
]
}