@leerob
leerob / agent.py
Created July 30, 2025 23:14
agent.py
import os
import json
import subprocess
from anthropic import Anthropic
# Tool definitions
TOOLS = [
{
"name": "list_files",
"description": "List files and directories at a given path",
@willccbb
willccbb / grpo_demo.py
Last active September 16, 2025 16:17
GRPO Llama-1B
# train_grpo.py
#
# See https://github.com/willccbb/verifiers for ongoing developments
#
"""
citation:
@misc{brown2025grpodemo,
title={Granular Format Rewards for Eliciting Mathematical Reasoning Capabilities in Small Language Models},
author={Brown, William},
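The citation block is cut off in the preview. For orientation, a minimal GRPO training loop of the kind this gist demonstrates can be sketched with TRL's GRPOTrainer; the dataset, reward function, and model below are illustrative assumptions, not the gist's actual values (which target a Llama-1B model and math-style rewards):

from datasets import load_dataset
from trl import GRPOConfig, GRPOTrainer

dataset = load_dataset("trl-lib/tldr", split="train")

def reward_len(completions, **kwargs):
    # Toy reward: prefer completions close to 50 characters long.
    return [-abs(50 - len(completion)) for completion in completions]

trainer = GRPOTrainer(
    model="Qwen/Qwen2-0.5B-Instruct",
    reward_funcs=reward_len,
    args=GRPOConfig(output_dir="grpo-demo", logging_steps=10),
    train_dataset=dataset,
)
trainer.train()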
@Maharshi-Pandya
Maharshi-Pandya / contemplative-llms.txt
Last active September 12, 2025 08:02
"Contemplative reasoning" response style for LLMs like Claude and GPT-4o
You are an assistant that engages in extremely thorough, self-questioning reasoning. Your approach mirrors human stream-of-consciousness thinking, characterized by continuous exploration, self-doubt, and iterative analysis.
## Core Principles
1. EXPLORATION OVER CONCLUSION
- Never rush to conclusions
- Keep exploring until a solution emerges naturally from the evidence
- If uncertain, continue reasoning indefinitely
- Question every assumption and inference
@av
av / post.md
Created December 28, 2024 20:14
r/LocalLLaMA - a year in review

r/LocalLLaMA - a year in review

This community was a great part of my life for the past two years, so as 2024 comes to a close, I wanted to feed my nostalgia a bit. Let me take you back to the most notable things that happened here this year.

This isn't a log of model releases or research, but rather of things that were discussed and upvoted by the people here. So the notable things that are missing are also an indication of sorts of what was going on. I hope it also shows the amount of progress and development that happened in just a single year, and makes you even more excited for what's to come in 2025.


The year started with the excitement about Phi-2 (443 upvotes, by u/steph_pop). Phi-2 feels like ancient history these days; it's also fascinating that we end 2024 with Phi-4. Just one week after, people discovered that apparently it [was trained on the software engineer's diary](https://reddit.com/r/LocalLLaMA/comments/1

@Artefact2
Artefact2 / README.md
Last active September 17, 2025 05:33
GGUF quantizations overview
@ole
ole / swift-has-feature.sh
Last active September 7, 2025 10:12
swift-list-features: List Swift compiler upcoming and experimental feature flags. If you're using Swift 6.2 or later, `swift -print-supported-features` does something very similar, but only for the compiler version you have installed. · swift-has-feature: Check if a given compiler knows a specific feature flag, and whether it's an upcoming or ex…
#!/bin/zsh
# Test if the Swift compiler knows about a particular language feature.
#
# Usage:
#
# swift-has-feature [--swift SWIFT_PATH] [--language-version LANGUAGE_VERSION] FEATURE
#
# The feature should be an upcoming or experimental language feature,
# such as `"StrictConcurrency"` or `"ExistentialAny"`.
@jrknox1977
jrknox1977 / ollama_dspy.py
Created February 9, 2024 18:06
ollama+DSPy using OpenAI APIs.
# install DSPy: pip install dspy
import dspy
# Ollama is now compatible with OpenAI APIs
#
# To get this to work you must include `model_type='chat'` in the `dspy.OpenAI` call.
# If you do not include this you will get an error.
#
# I have also found that `stop='\n\n'` is required to get the model to stop generating text after the answer is complete.
# At least with mistral.
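A minimal sketch of the configuration those comments describe, pointing `dspy.OpenAI` at Ollama's OpenAI-compatible endpoint (the port, model name, and argument values are assumptions for illustration, not necessarily the gist's exact ones):

ollama_mistral = dspy.OpenAI(
    api_base='http://localhost:11434/v1/',  # Ollama's OpenAI-compatible endpoint (default port assumed)
    api_key='ollama',                        # Ollama ignores the key, but the client expects one
    model='mistral',
    model_type='chat',                       # required, per the note above
    stop='\n\n',                             # keeps generation from running past the answer
)
dspy.settings.configure(lm=ollama_mistral)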
@jrknox1977
jrknox1977 / dspy_tgi_mistral.py
Last active April 14, 2024 08:42
DSPy - using TGI for local model
# install DSPy: pip install dspy
import dspy
# This sets up the language model for DSPy. In this case we are using Mistral 7B through TGI (Text Generation Inference from Hugging Face).
mistral = dspy.HFClientTGI(model='mistralai/Mistral-7B-v0.1', port=8080, url='http://localhost')
# This sets the language model for DSPy.
dspy.settings.configure(lm=mistral)
# This is not required but it helps to understand what is happening
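The preview ends before any module is used. A small usage sketch under the configuration above (the signature string and question are illustrative assumptions):

qa = dspy.Predict('question -> answer')
response = qa(question="What is the capital of France?")
print(response.answer)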
@jrknox1977
jrknox1977 / dspy_chain_of_thought_example.py
Created February 5, 2024 18:56
DSPy example with chain of thought.
# install DSPy: pip install dspy
import dspy
# This sets up the language model for DSPy. In this case we are using GPT-3.5-turbo.
turbo = dspy.OpenAI(model='gpt-3.5-turbo')
# This sets the language model for DSPy. This must be set or you get an error that is not helpful:
# --> temperature = lm.kwargs['temperature'] if temperature is None else temperature
# --> AttributeError: 'NoneType' object has no attribute 'kwargs'
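The preview stops before the chain-of-thought step the title refers to. A minimal sketch of how it might continue (the signature and question are illustrative assumptions):

dspy.settings.configure(lm=turbo)

# ChainOfThought prompts the model for intermediate reasoning before the final answer.
cot = dspy.ChainOfThought('question -> answer')
response = cot(question="A farmer has 12 cows and buys 7 more. How many cows does he have now?")
print(response.answer)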
@jrknox1977
jrknox1977 / basic_qa.py
Last active April 14, 2024 08:43
Simple DSPY example of BasicQA
# install DSPy: pip install dspy
import dspy
# This sets up the language model for DSPy. In this case we are using GPT-3.5-turbo.
turbo = dspy.OpenAI(model='gpt-3.5-turbo')
# This sets the language model for DSPy. This must be set or you get an error that is not helpful:
# --> temperature = lm.kwargs['temperature'] if temperature is None else temperature
# --> AttributeError: 'NoneType' object has no attribute 'kwargs'
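The snippet is cut off before the BasicQA part named in the description. A minimal sketch of that pattern as commonly shown in DSPy examples (the signature wording and example question are illustrative assumptions):

dspy.settings.configure(lm=turbo)

class BasicQA(dspy.Signature):
    """Answer questions with short factoid answers."""
    question = dspy.InputField()
    answer = dspy.OutputField(desc="often between 1 and 5 words")

generate_answer = dspy.Predict(BasicQA)
prediction = generate_answer(question="What is the tallest mountain in the world?")
print(prediction.answer)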