@dhruvilp
Forked from zainhas/thinking_tokens.py
Created February 18, 2025 16:01
Extract only the thinking tokens from DeepSeek-R1 (thinking_tokens.py)
import os

from together import Together

# Read the API key from the environment rather than hard-coding it.
client = Together(api_key=os.environ["TOGETHER_API_KEY"])

question = "Which is larger: 9.9 or 9.11?"

# Ask DeepSeek-R1, but stop generation at </think> so the response
# contains only the model's chain-of-thought ("thinking tokens").
thought = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",
    messages=[{"role": "user", "content": question}],
    stop=["</think>"],
)

# Hand the thinking tokens to a smaller, cheaper model, which produces
# the final answer conditioned on R1's reasoning.
PROMPT_TEMPLATE = """
Thought process: {thinking_tokens} </think>
Question: {question}
Answer:
"""

answer = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    messages=[{
        "role": "user",
        "content": PROMPT_TEMPLATE.format(
            thinking_tokens=thought.choices[0].message.content,
            question=question,
        ),
    }],
)

print(answer.choices[0].message.content)
# Expected answer: 9.9 is larger than 9.11.
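If you let R1 run to completion instead of stopping at `</think>`, the response contains both the chain-of-thought and the final answer. A minimal sketch of separating the two after the fact, assuming the model wraps its reasoning in `<think>...</think>` tags (the `split_thinking` helper and the demo string are illustrative, not part of the Together SDK):

```python
import re

def split_thinking(response_text: str) -> tuple[str, str]:
    """Split an R1-style response into (thinking, answer).

    Assumes the chain-of-thought is wrapped in <think>...</think>;
    if no tags are found, the whole text is treated as the answer.
    """
    match = re.search(r"<think>(.*?)</think>", response_text, re.DOTALL)
    if match is None:
        return "", response_text.strip()
    thinking = match.group(1).strip()
    answer = response_text[match.end():].strip()
    return thinking, answer

# Hypothetical full response for illustration:
demo = "<think>9.9 = 9.90, and 9.90 > 9.11</think>\n9.9 is larger."
thinking, answer = split_thinking(demo)
print(thinking)  # 9.9 = 9.90, and 9.90 > 9.11
print(answer)    # 9.9 is larger.
```

`re.DOTALL` lets `.` match newlines, since the thinking span usually runs across many lines; the non-greedy `(.*?)` stops at the first closing tag.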