Handoff: brownie-viz-graph-interactivity

What was accomplished

Improved the Transfer Graph demo in the brownie-viz webapp with node selection/pinning, edge highlighting, visibility filtering, and various rendering fixes.

Features added

Handoff: commit-handoff-docs

What was accomplished

  • Committed and pushed the handoff infrastructure from the previous session:
    • .claude/commands/handoff.md — the /handoff slash command
    • docs/agents/handoff/2026-03-23-001-profile-solana.md
    • docs/agents/handoff/2026-03-23-002-filter-sol-wsol-command.md
    • docs/agents/handoff/2026-03-23-003-docs-agents-handoff-setup.md
    • CLAUDE.md — updated to reference docs/agents/**/*.md

Write a handoff document summarising this session for future agents/sessions.

Steps

  1. Determine today's date (use the currentDate context if available).

  2. List the existing files under docs/agents/handoff/ to find the highest sequence number already used today, then increment by one. Format: NNN (zero-padded to 3 digits, starting at 001).

  3. Use the current session name, otherwise derive a short kebab-case topic slug from the main subject of this session (e.g. eth-brownie-optimization, filter-sol-wsol-command).
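Steps 1–3 above can be sketched as a small helper. This is a hypothetical illustration, not the actual /handoff command: the docs/agents/handoff/ directory and the YYYY-MM-DD-NNN-slug.md naming are taken from the files listed earlier in this document, while the function name and all other details are assumptions.

```python
# Hypothetical sketch of the handoff-naming steps; only the directory layout
# and filename pattern come from the document, the rest is illustrative.
import datetime
import pathlib
import re

def next_handoff_name(slug: str, handoff_dir: str = "docs/agents/handoff") -> str:
    today = datetime.date.today().isoformat()  # step 1: today's date
    pat = re.compile(rf"{today}-(\d{{3}})-.+\.md$")
    # Step 2: find the highest sequence number already used today.
    seqs = [int(m.group(1))
            for p in pathlib.Path(handoff_dir).glob(f"{today}-*.md")
            if (m := pat.match(p.name))]
    seq = max(seqs, default=0) + 1  # increment; sequence starts at 001
    # Step 3: caller supplies the kebab-case topic slug.
    return f"{today}-{seq:03d}-{slug}.md"
```

For example, if two handoffs already exist for today, the helper would produce a `-003-` filename next.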

shawwn / glob.cpp
Created June 15, 2023 06:16
A simple glob function that returns a vector of strings for POSIX
#include <glob.h>
#include <vector>
#include <string>
namespace util {
std::vector<std::string> glob(const std::string& pattern) { // e.g. util::glob("*.txt")
    glob_t glob_result = {0}; // zero initialize
    std::vector<std::string> filenames;
    if (::glob(pattern.c_str(), 0, nullptr, &glob_result) == 0)
        for (size_t i = 0; i < glob_result.gl_pathc; ++i)
            filenames.emplace_back(glob_result.gl_pathv[i]);
    globfree(&glob_result); // safe even when glob() found no matches
    return filenames;
}
} // namespace util
shawwn / cmake_test.cmake
Created May 12, 2023 04:46
I was playing around with writing a lisp-to-cmake compiler. https://github.com/shawwn/pymen/tree/cmake
cmake_policy(VERSION "3.25.0")
set(reserved
ALL
"=" ON
"==" ON
"+" ON
"_" ON
"%" ON
"*" ON
"/" ON
You have fallen into Event Horizon with John Michael Godier.
In today's episode, John is joined by Shawn Presser.
Shawn Presser is an AI researcher and machine learning engineer.
He has contributed to projects such as The Pile, an open-source
training dataset for large language models.
He currently works on research and development for AGI.
shawwn / llama-dl-dmca.md
Last active April 5, 2023 02:35
I prompted GPT-4 to draft a DMCA counterclaim to Meta's DMCA against llama-dl: https://github.com/github/dmca/blob/master/2023/03/2023-03-21-meta.md

Prompt

Meta has issued a DMCA copyright claim against llama-dl, a GitHub repository, for distributing LLaMA, a 65-billion parameter language model. Here's the full text of the DMCA claim. Based on this, draft a DMCA counterclaim on the basis that neural networks trained on public data are not copyrightable.

--

VIA EMAIL: Notice of Claimed Infringement via Email
URL: http://www.github.com
DATE: 03/20/2023

shawwn / hon_timestamps.txt
Last active March 30, 2023 04:44
Heroes of Newerth file timestamps. Generated with `tree -Dhf`
[ 544 Mar 29 20:10] .
├── [ 160 Jan 18 2011] ./Abaddon Share
│   ├── [ 288 Jan 18 2011] ./Abaddon Share/CVS
│   │   ├── [ 3 Jan 18 2011] ./Abaddon Share/CVS/Entries
│   │   ├── [ 0 Jan 18 2011] ./Abaddon Share/CVS/Entries.Extra
│   │   ├── [ 0 Jan 18 2011] ./Abaddon Share/CVS/Entries.Extra.Old
│   │   ├── [ 31 Jan 18 2011] ./Abaddon Share/CVS/Entries.Log
│   │   ├── [ 0 Jan 18 2011] ./Abaddon Share/CVS/Entries.Old
│   │   ├── [ 15 Jan 18 2011] ./Abaddon Share/CVS/Repository
│   │   └── [ 45 Jan 18 2011] ./Abaddon Share/CVS/Root
shawwn / llama.md
Last active June 15, 2024 10:13
A transcript of an interview I did for The Verge on March 6, 2023 about LLaMA, Facebook's new 65 billion parameter language model that was recently leaked to the internet: https://news.ycombinator.com/item?id=35007978

The Verge: "Meta’s powerful AI language model has leaked online — what happens now?"


Could you confirm that you downloaded the LLaMA series from 4chan? Were you able to get it running yourself or did you just repackage the download? (I was a bit confused reading your tweets about what exactly you'd done there, so if you're able to explain that, it'd be great)

I downloaded it from Facebook, actually. You can find some details here.

Basically, the sequence of events was:

> initializing model parallel with size 8
> initializing ddp with size 1
> initializing pipeline with size 1
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
{"seed": 187073, "temp": 0.7, "top_p": 0.0, "top_k": 40, "repetition_penalty": 1.1764705882352942, "max_seq_len": 2048, "max_gen_len": 2048}
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Loading
> initializing model parallel with size 8
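The parameter line in the log above fixes the decoding settings (and its repetition_penalty of 1.1764705882352942 is exactly 1/0.85). As a rough illustration only, not the code that produced this log, top-k sampling with temperature and a repetition penalty can be sketched as follows; the function name and all details are assumptions:

```python
# Illustrative sketch of applying temp / top_k / repetition_penalty
# to next-token logits; not taken from the actual inference code.
import math
import random

def sample_next(logits, recent_tokens, temp=0.7, top_k=40,
                repetition_penalty=1 / 0.85):
    logits = list(logits)
    # Penalize tokens that already appeared, discouraging repetition.
    for t in set(recent_tokens):
        if logits[t] > 0:
            logits[t] /= repetition_penalty
        else:
            logits[t] *= repetition_penalty
    # Keep only the top_k highest-scoring candidate tokens.
    order = sorted(range(len(logits)), key=lambda i: logits[i],
                   reverse=True)[:top_k]
    # Temperature-scaled softmax over the surviving candidates.
    scaled = [logits[i] / temp for i in order]
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    return random.choices(order, weights=weights, k=1)[0]
```

With top_k=1 this degenerates to greedy argmax decoding; larger top_k plus temperature trades determinism for diversity.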