@simbo1905
simbo1905 / newest
Last active December 22, 2025 22:43
Zsh function to list newest files with proper escaping for copy/paste
#!/usr/bin/env zsh
# newest - List newest files with realpaths
#
# USAGE:
# newest [path] [count] [-r] [-q]
#
# OPTIONS:
# path Directory path or hash (default: current directory)
# count Number of files to show (default: 10)
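The preview stops partway through the header. A minimal sketch of the core idea (my own illustration, not the gist's actual body, covering only the path and count arguments):

# sketch only: newest files by modification time, printed as quoted absolute paths
newest() {
  local dir=${1:-.} count=${2:-10} f p
  for f in $dir/*(Nom[1,$count]); do   # om: newest first by mtime; N: no error when empty
    p=${f:A}                           # :A resolves to an absolute real path
    print -r -- ${(q)p}                # (q) quotes so the output is safe to copy/paste
  done
}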
@simbo1905
simbo1905 / OPENCODE-MCP.5
Last active December 18, 2025 07:41
OpenCode Configuration Man Page
OPENCODE-MCP(5) OpenCode Configuration OPENCODE-MCP(5)
NAME
opencode-mcp - configure Model Context Protocol servers in OpenCode
SYNOPSIS
~/.config/opencode/opencode.json
./opencode.json (project-specific)
DESCRIPTION
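The DESCRIPTION body is cut off in this preview. For orientation, an MCP block in opencode.json generally looks like the following; the field names are from memory of the OpenCode docs rather than from this man page, and the server names, command, and URL are placeholders:

{
  "mcp": {
    "my-local-server": {
      "type": "local",
      "command": ["npx", "-y", "some-mcp-server"],
      "enabled": true
    },
    "my-remote-server": {
      "type": "remote",
      "url": "https://example.com/mcp",
      "enabled": true
    }
  }
}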
@simbo1905
simbo1905 / mistral-vibe-macos-setup.txt
Created December 17, 2025 18:42
Mistral Vibe Setup on macOS with Isolated Python 3.12
MISTRAL VIBE SETUP ON MACOS WITH ISOLATED PYTHON 3.12
=====================================================
NAME
mistral-vibe-macos-setup - Install Mistral Vibe with isolated Python 3.12 on macOS
SYNOPSIS
This guide shows how to install Mistral Vibe using an isolated Python 3.12 environment on macOS.
PREREQUISITES
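The preview stops at PREREQUISITES. As a rough sketch of the isolated-environment approach the title describes (pyenv + venv shown here, though the full guide may use a different tool; the package name on the last line is a placeholder, not taken from the guide):

# sketch only: an isolated Python 3.12 environment on macOS
pyenv install 3.12
pyenv shell 3.12
python -m venv ~/.venvs/mistral-vibe
source ~/.venvs/mistral-vibe/bin/activate
pip install <mistral-vibe-package>   # placeholder: use the package name from the full guide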

CLAUDE_DESKTOP(1) User Commands CLAUDE_DESKTOP(1)

NAME
claude-desktop-fix - resolve Claude Desktop startup hang on Windows

SYNOPSIS
cmd /c "rd /s /q %APPDATA%\Claude"

DESCRIPTION
Claude Desktop for Windows may fail to render content after a reboot or

@simbo1905
simbo1905 / README.md
Created December 11, 2025 21:27
Setup git worktree script

Setup Git Worktree Script

This repository contains a helper script setup-git-worktree.sh that automates the creation of a bare Git repository with adjacent worktree directories, following the flat worktree architecture described in Tom Ups' article:

Overview

The layout created by the script looks like this:
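The listing itself is not shown in this preview, but a flat worktree layout of the kind the article describes generally looks like this (directory names are placeholders, not the script's exact output):

my-repo/
  .bare/        # bare clone holding the actual Git database
  .git          # a one-line file: gitdir: ./.bare
  main/         # worktree checked out on main
  feature-x/    # additional worktree for a feature branch (placeholder name)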

@simbo1905
simbo1905 / download.sh
Last active December 3, 2025 14:52
how to zip up your /Users/Shared directories to iCloud
# this will only download what is there already
find ~/Library/Mobile\ Documents/com~apple~CloudDocs/ -maxdepth 1 -name "*.zip" -type f |
  while read -r zipfile; do
    if [[ ! -f "/Users/Shared/$(basename "$zipfile")" ]]; then
      echo "Copying $(basename "$zipfile")..."
      cp "$zipfile" /Users/Shared/
    fi
  done
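The snippet above is only the download/restore direction. The upload direction implied by the gist title would be along these lines (a sketch, not the gist's own code):

# sketch only: zip each directory under /Users/Shared into iCloud Drive
for dir in /Users/Shared/*/; do
  name=$(basename "$dir")
  (cd /Users/Shared && zip -r "$HOME/Library/Mobile Documents/com~apple~CloudDocs/$name.zip" "$name")
done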
@simbo1905
simbo1905 / colors.sh
Created October 13, 2025 12:34
iTerm2 terminal color management script
#!/usr/bin/env zsh
# shellcheck shell=bash
# colors - Manage iTerm2 background and foreground colors with state persistence
#
# USAGE:
# colors Load and apply colors from .colors file in current directory.
# If no file exists, show interactive menu to choose from
# predefined colors in ~/.colors_list
#
# colors random Randomly select a background from ~/.colors_list,
@simbo1905
simbo1905 / openhands-custom-howto.txt
Last active August 27, 2025 08:00
How to customise the main OpenHands 0.54.0 system prompt
# OpenHands Custom Build Setup (No Docker)
## Prerequisites
- Python 3.12 installed via pyenv
- Poetry installed
- Git repository cloned
## Setup Steps
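The steps themselves are cut off in this preview. The usual no-Docker flow from source is roughly the following; the make targets are my recollection of the OpenHands repo rather than taken from the gist, and the prompt template path is an assumption for 0.54.0:

# sketch only; check the repo's Development.md / Makefile for the current targets
git clone https://github.com/All-Hands-AI/OpenHands.git
cd OpenHands
make build      # installs backend (Poetry) and frontend dependencies
make run        # starts backend and frontend without Docker
# the system prompt is a Jinja template under openhands/agenthub/codeact_agent/prompts/
# (path is an assumption for 0.54.0) - edit it, then restart make run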
@simbo1905
simbo1905 / gist:1bbc1e355b00f4130d5ccded5a370d86
Last active April 24, 2025 05:58
so Claude, you got this, right?
in our implementation of `static <T> Pickler<?> createPicklerForSealedTrait(Class<T> sealedClass)`, whenever we see a
record we write out the class name of the record so that we can resolve the correct pickler to deserialize it. we have
tests of complex trees where the same type of node appears many times, and that bloats the final format. we want to be
future proof against new records being added to the sealed trait between serialization and deserialization, so we cannot
use a fixed map of permitted types to bytes to make the format very compact. yet there is no reason to write the same
class name to the buffer twice. instead, when we write out a class name we can record the offset in the
bytebuffer where we are about to write the size of the class name followed by its bytes. this can be kept in a map of
class-to-offset called classNameToOffset. then, when we are about to write out a new record, we can check the keys in
the map. if we have not yet written the class name to the current buffer we can write out
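The gist preview cuts off mid-sentence there. As a sketch of the memoisation idea described above (my own illustration, not code from the project; everything except the classNameToOffset name and the quoted signature is invented, and the back-reference encoding in the else branch is a guess, since the text is cut off before it specifies one):

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// illustration only of the class-name memoisation described above
final class ClassNameWriter {
  private final Map<Class<?>, Integer> classNameToOffset = new HashMap<>();

  void writeClassName(ByteBuffer buffer, Class<?> recordClass) {
    final Integer firstOffset = classNameToOffset.get(recordClass);
    if (firstOffset == null) {
      // first occurrence in this buffer: remember where the length-prefixed
      // class name starts, then write the length followed by the UTF-8 bytes
      classNameToOffset.put(recordClass, buffer.position());
      final byte[] name = recordClass.getName().getBytes(StandardCharsets.UTF_8);
      buffer.putShort((short) name.length);
      buffer.put(name);
    } else {
      // seen before: write a sentinel length plus the offset of the first copy
      // instead of repeating the name (this encoding is my own choice)
      buffer.putShort((short) -1);
      buffer.putInt(firstOffset);
    }
  }
}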
@simbo1905
simbo1905 / together_ai-deepseek-ai-DeepSeek-R1-Distill-Llama-70B.md
Last active April 10, 2025 20:27
How to use the US hosted Together.AI Pro model "deepseek-ai/DeepSeek-R1-Distill-Llama-70B" with Aider

As of 7th April 2025, Together.AI offers a US-hosted version of "DeepSeek R1 Distill Llama 70B". DeepSeek-R1-Distill-Llama-70B is part of the DeepSeek-R1 family: a dense Llama-architecture model distilled from the larger Mixture-of-Experts (MoE) DeepSeek-R1 model.

In order to use it with aider you need some settings changes:

Create ~/.aider.model.settings.yml:

- name: together_ai/deepseek-ai/DeepSeek-R1-Distill-Llama-70B
  edit_format: diff-fenced
  reasoning_tag: think
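With that settings file in place, usage is roughly as follows; the TOGETHERAI_API_KEY variable name is what litellm (and therefore aider) expects for Together.AI as far as I recall, so double-check it against the aider docs:

export TOGETHERAI_API_KEY=your-key-here
aider --model together_ai/deepseek-ai/DeepSeek-R1-Distill-Llama-70B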