
Wil Chung iamwilhelm

  • Mountain View, CA
Meet Scout. The open-source company brain
YC's Summer 2026 Requests for Startups named two ideas that point at the same thing from different angles.
Company Brain: pull knowledge out of fragmented sources (Slack, email, tickets), structure it, keep it current, and turn it into something AI can act on.
"Structure it, keep it current" is the wrong approach.
The trick is "navigation over search". More on this later.
The Harness Is the Backend
The most important architectural question in AI infrastructure right now isn’t which model to use. It’s how much infrastructure is required to build something useful with it.
Anthropic, OpenAI, CrewAI, and LangChain all have a name for that wrapping: the agent harness. The harness includes the orchestration loop, tools (MCP, A2A), memory, context management, and error handling that make a model useful. They all agree the model isn’t the product; the infrastructure is. They disagree deeply on how much of it should exist.
Anthropic keeps its harness thin. It’s an elegant loop: assemble the prompt, call the model, execute tool calls, repeat. The model decides everything. OpenAI adds more structure: instruction stacks, orchestration modes, and explicit handoff patterns. CrewAI takes a multi-pronged approach: deterministic Flows for routing and validation, autonomous agents for the rest. LangGraph has the biggest harness: every decision is a node, every transition a defined edge, the entire workflow…
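The thin-harness loop can be sketched in a few lines. This is an illustrative reduction, not any vendor's actual API: the `Model` and `Tool` types here are stand-ins for a real model call and real tool implementations.

```typescript
// A minimal sketch of the "thin harness" agent loop:
// assemble the prompt, call the model, execute tool calls, repeat.
type ToolCall = { name: string; args: string };
type ModelReply = { text: string; toolCalls: ToolCall[] };
type Tool = (args: string) => string;
type Model = (prompt: string) => ModelReply;

function runAgent(
  model: Model,
  tools: Record<string, Tool>,
  task: string,
  maxTurns = 5
): string {
  let prompt = task;
  for (let turn = 0; turn < maxTurns; turn++) {
    const reply = model(prompt); // 1. call the model
    if (reply.toolCalls.length === 0) {
      return reply.text; // model decided it's done
    }
    for (const call of reply.toolCalls) { // 2. execute tool calls
      const tool = tools[call.name];
      const result = tool ? tool(call.args) : `unknown tool ${call.name}`;
      prompt += `\n[tool ${call.name}] ${result}`; // 3. feed results back
    }
  }
  return "max turns reached";
}
```

Everything the heavier harnesses add (instruction stacks, deterministic flows, graph edges) is structure layered on top of, or in place of, this loop.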
@iamwilhelm
iamwilhelm / finite_support_zset_abelian.lean
Last active December 13, 2025 18:11
a proof that finitely supported zsets form an abelian group
-- Lean v4.23.0-rc2
import Std
open Std
variable {α : Type} [DecidableEq α]
/- ------------------------------------------------------------
0) Your minimal abelian group structure (unchanged)
-/
@iamwilhelm
iamwilhelm / zset_abelian.lean
Created October 17, 2025 23:49
Proof that Z-set and additive merge form an abelian group
variable {α : Type} [DecidableEq α]
-- Define our own minimal abelian group structure (no mathlib)
structure AbelianGroup (G : Type) where
add : G → G → G
zero : G
neg : G → G
add_assoc : ∀ a b c : G, add (add a b) c = add a (add b c)
add_zero : ∀ a : G, add a zero = a
zero_add : ∀ a : G, add zero a = a
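For intuition, here is a sketch (not from the gist) of instantiating the structure with the integers, using only core Lean lemmas since the gist avoids mathlib. It assumes the preview shows all the fields; the full gist presumably also requires commutativity and inverses, which `Int.add_comm` and `Int.add_left_neg` would supply.

```lean
-- Sketch: Int satisfies the minimal AbelianGroup structure above.
def intAbelianGroup : AbelianGroup Int where
  add := (· + ·)
  zero := 0
  neg := fun a => -a
  add_assoc := Int.add_assoc
  add_zero := Int.add_zero
  zero_add := Int.zero_add
```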
@iamwilhelm
iamwilhelm / sydney.py
Created August 3, 2024 18:59 — forked from xlr8harder/sydney.py
Talk to Sydney with Llama 3.1 405B base model.
import gradio as gr
from openai import OpenAI
import jinja2
from transformers import AutoTokenizer
# get an api key from hyperbolic.
api_key = "..."
# Initialize the OpenAI client
client = OpenAI(
@iamwilhelm
iamwilhelm / balancing_node_sizes.thy
Created January 22, 2024 02:33
Incomplete proof of balancing node sizes after insert
theory balancing
imports Main
begin
type_synonym nodesizes = "nat list"
fun insert :: "nodesizes ⇒ nat ⇒ nodesizes" where
"insert [] _ = []" |
"insert (x#xs) 0 = (x+1)#xs" |
"insert (x#xs) n = x # insert xs (n-1)"
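The Isabelle `insert` above bumps the node size at index `n` and leaves the other entries alone (silently dropping the update when the list runs out, per the first clause). A TypeScript rendering of the same three clauses, purely for intuition (`insertAt` is a hypothetical name, not from the gist):

```typescript
// Mirror of the Isabelle insert on node sizes:
// bump the entry at index n, leave everything else unchanged.
function insertAt(sizes: number[], n: number): number[] {
  if (sizes.length === 0) return []; // "insert [] _ = []"
  if (n === 0) return [sizes[0] + 1, ...sizes.slice(1)]; // "(x+1)#xs"
  return [sizes[0], ...insertAt(sizes.slice(1), n - 1)]; // "x # insert xs (n-1)"
}
```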
@iamwilhelm
iamwilhelm / lattice_merkle_tree.thy
Created January 22, 2024 02:29
Incomplete and in-progress proof of a merge function for a unicit balanced Merkle Tree using join-semilattices
theory lattice_merkle_tree
imports Main
begin
fun opt_last :: "'a list ⇒ 'a option" where
"opt_last [] = None" |
"opt_last xs = Some (last xs)"
(****** Lattice Merkle Tree ******)
// tic tac toe
import * as readline from "readline";
enum Turn {
Player = "player",
Computer = "computer",
}
type Piece = string;
import * as fs from "fs/promises";
import Knex from "knex";
const Web3 = require("web3");
import { Refract, Component, Fiber } from "./refract";
/******* custom hooks ******/
export const useFetchJson = filePath => {
let [json, setJson] = Refract.useState("useFetchJson", null);
import Web3 from "web3";
import Knex from "knex";
import {
loadTokenList,
marketContract,
borrowAndSupplyRate,
rateToApy
} from "./compound.js";
import { subscribeNewBlocks } from "../libs/eth.js";