Gary Blankenship garyblankenship

@garyblankenship
garyblankenship / awesome-go.md
Last active June 23, 2025 15:40
Awesome Go Packages #go

The Actually Useful Go Package List: An Opinionated Guide

Stop googling "best Go web framework" at 2am. I've done it for you.

This isn't another list of GitHub stars. This is what you actually need to know: which packages to use, when to use them, and what choosing them says about your project. Every opinion here comes from production scars.

Web Frameworks: The Big Three (And Why There's Only Three)

Gin - The Default Choice

When to use: You're building an API and want to ship this week
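To ground the "ship this week" claim: the canonical Gin hello-world really is about ten lines. This is a sketch of the standard quickstart shape (route path and port are arbitrary choices, not from this list):

```go
package main

import "github.com/gin-gonic/gin"

func main() {
	r := gin.Default() // router preloaded with logging and panic-recovery middleware
	r.GET("/ping", func(c *gin.Context) {
		// gin.H is a shorthand map type; c.JSON serializes it and sets the content type
		c.JSON(200, gin.H{"message": "pong"})
	})
	r.Run(":8080") // listen and serve on :8080; error return ignored for brevity
}
```

That `gin.Default()` call is most of the value proposition: sensible middleware wired up before you write a single handler.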

@LukeMauldin
LukeMauldin / litellm_to_aider.py
Created April 12, 2025 00:15
litellm_to_aider.py
#!/usr/bin/env python3.12
"""
LiteLLM to Aider Configuration Generator

This tool generates configuration files that allow Aider to work seamlessly with LiteLLM models.
It can generate both model settings (YAML) and model metadata (JSON) files.

The generated configurations correctly handle model names in the format "litellm/provider/model"
by using Aider's extra_params feature to pass the proper model name format to LiteLLM.
"""
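The docstring above describes the two outputs but the preview cuts off before the code. Purely as an illustration (keys assumed from Aider's model-settings file format, not taken from this gist), one generated YAML entry might look like:

```yaml
# hypothetical entry in a generated .aider.model.settings.yml
- name: litellm/openai/gpt-4o     # the name Aider is configured with
  extra_params:
    model: openai/gpt-4o          # the name actually passed through to LiteLLM
```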
@0xdevalias
0xdevalias / ai-voice-cloning.md
Created March 24, 2025 09:23 — forked from d00m4ace/ai-voice-cloning.md
AI Voice Cloning
@zackangelo
zackangelo / context.txt
Created December 13, 2024 19:39
Llama 3.3 Multi-tool Use Context Window
<|begin_of_text|><|start_header_id|>system<|end_header_id|>Environment: ipython
Cutting Knowledge Date: December 2023
Today Date: 13 Dec 2024
# Tool Instructions
You may optionally call functions that you have been given access to. You DO NOT have
to call a function if you do not require it. ONLY call functions if you need them. Do NOT call
functions that you have not been given access to.
@jdavidrcamacho
jdavidrcamacho / timelines_fertile_crescent.py
Last active June 22, 2025 16:51
Timeline of ancient civilizations in the fertile crescent
import matplotlib.pyplot as plt
# Updated timeline data with BCE year representation, including all events
timelines_simple = {
# Ubaid Period
"Ubaid Period": {"start": -6500, "end": -3800, "color": "salmon"},
# Sumerian periods
"Early Sumerian Settlement": {"start": -4500, "end": -4000, "color": "skyblue"},
"Uruk Period (Sumerians)": {"start": -4000, "end": -3100, "color": "skyblue"},
"Early Dynastic Period (Sumerians)": {
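The preview cuts off mid-dictionary, before any plotting code. A hedged sketch of how data in this shape can be rendered, using matplotlib's `barh` (an assumption; the gist's actual rendering code isn't shown here):

```python
import matplotlib.pyplot as plt

# Same structure as the gist's dict: negative years represent BCE
timelines_simple = {
    "Ubaid Period": {"start": -6500, "end": -3800, "color": "salmon"},
    "Uruk Period (Sumerians)": {"start": -4000, "end": -3100, "color": "skyblue"},
}

fig, ax = plt.subplots(figsize=(10, 3))
for i, (name, t) in enumerate(timelines_simple.items()):
    # one horizontal bar per civilization, spanning start..end
    ax.barh(i, t["end"] - t["start"], left=t["start"], color=t["color"])
ax.set_yticks(range(len(timelines_simple)))
ax.set_yticklabels(timelines_simple.keys())
ax.set_xlabel("Year (negative = BCE)")
plt.tight_layout()
plt.savefig("timeline.png")
```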
@michabbb
michabbb / Gemini.php
Created October 22, 2024 17:13
Upload Files to the Gemini API
<?php
namespace App\Services\google;
use Exception;
use Illuminate\Http\Client\ConnectionException;
use Illuminate\Support\Facades\Http;
class Gemini
{
@cugu
cugu / README.md
Last active June 1, 2025 11:12
Webhooks for PocketBase

Webhooks for PocketBase

A simple webhook plugin for PocketBase.

Adds a new "webhooks" collection to the admin interface for managing webhooks.

Example

The webhook record in the following example sends create, update, and delete events in the tickets collection to http://localhost:8080/webhook.
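The example record itself did not survive the scrape. A sketch of what it plausibly contains, given the description above (field names are assumptions, not verified against the plugin):

```json
{
  "name": "tickets-hook",
  "collection": "tickets",
  "destination": "http://localhost:8080/webhook"
}
```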

@oplanre
oplanre / Pipeline.php
Created June 13, 2024 22:17
Simple pipeline implementation in php as a class or function
<?php
class Pipeline {
    public function __construct(
        private mixed $data
    ) {}

    // Apply each callback in order, feeding the result of one into the next
    public function pipe(callable ...$callbacks): static {
        foreach ($callbacks as $callback) {
            $this->data = $callback($this->data);
        }
        return $this;
    }
}
@roychri
roychri / README.md
Created May 2, 2024 17:50
Stream Ollama (openai) chat completion API on CLI with HTTPie and jq

Stream Ollama (openai) chat completion API on CLI with HTTPie and jq

Explanation

This command sends a request to the Chat Completion API to generate high-level documentation for the file @src/arch.js. The API is configured to use the llama3-gradient model and to respond in Markdown format.

The messages array contains two elements:

  • The first element is a system message that provides the prompt for the API.
  • The second element is a user message that specifies the file for which to generate documentation.
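The command itself was not captured in this scrape. A hypothetical reconstruction of what the README describes, assuming Ollama's OpenAI-compatible endpoint at localhost:11434 (HTTPie's `:=` passes raw JSON; `jq -rj` prints each streamed token without quotes or newlines):

```shell
http --stream POST localhost:11434/v1/chat/completions \
  model=llama3-gradient \
  stream:=true \
  messages:='[
    {"role": "system", "content": "You write high-level documentation. Respond in Markdown."},
    {"role": "user", "content": "Generate high-level documentation for @src/arch.js"}
  ]' \
| grep --line-buffered -o '{.*}' \
| jq --unbuffered -rj '.choices[0].delta.content // empty'
```

The `grep` strips the SSE `data: ` framing (and drops the final `[DONE]` sentinel) so `jq` sees one JSON chunk per line.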
# Machine Intelligence Made to Impersonate Characteristics: MIMIC
# NOTE: run `conda install -c conda-forge mpi4py mpich` first to get MPI working
# accelerate launch --use_deepspeed -m axolotl.cli.train ./config_name_here
base_model: alpindale/Mistral-7B-v0.2-hf
base_model_config: alpindale/Mistral-7B-v0.2-hf
model_type: MistralForCausalLM
tokenizer_type: LlamaTokenizer
is_mistral_derived_model: true