Arianna Method ariannamethod
@ariannamethod
ariannamethod / infer_nanodurov.c
Last active April 8, 2026 18:48
nanodurov — a Telegram client that trains a language model on its own chat. Python + C (notorch) + browser. 15.7M-parameter BPE model, Arianna voice.
/*
* infer_nanodurov.c — Interactive chat with nanodurov (BPE 15.7M on notorch)
*
* Build: make infer_nanodurov
* Run: ./infer_nanodurov [weights.bin] [merges.txt]
*
* Default: nanodurov_arianna.bin + arianna_bpe_merges.txt
*/
#include "notorch.h"
@ariannamethod
ariannamethod / nanoagi.py
Last active April 13, 2026 20:08
nanoagi v2.0.0 — fuck torch. KARL + Chuck + dual attention + RRPRAM + notorch (pure C). zero PyTorch. inspired by @karpathy. resonance is unbreakable.
# SPDX-License-Identifier: GPL-3.0-or-later
"""
nanoagi.py — a self-expanding BPE transformer that grows from conversation.
KARL (Kernel Autonomous Recursive Learning) is the tokenizer.
Chuck is the optimizer. Together they are nanoagi.
How it works:
1. KARL tokenizes karl.txt (starts with seed corpus, grows via REPL)
2. MetaWeights build probability space from token statistics
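The "probability space from token statistics" step can be illustrated with a bigram table: count which token follows which, then normalize each row into probabilities. A minimal sketch only; the function name and shapes are assumptions, not nanoagi's actual MetaWeights API:

```python
from collections import Counter, defaultdict

def meta_weights(tokens):
    # Count bigram transitions, then normalize each row to probabilities.
    # Hypothetical illustration of a statistical probability space.
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    probs = {}
    for prev, ctr in counts.items():
        total = sum(ctr.values())
        probs[prev] = {tok: n / total for tok, n in ctr.items()}
    return probs

table = meta_weights(["the", "cat", "the", "dog", "the", "cat"])
# P(cat | the) = 2/3
```

nanoagi presumably layers more structure on top of this; the sketch shows only the statistical core.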
@ariannamethod
ariannamethod / postgpt.c
Last active March 26, 2026 18:55
PostGPT — a zero-dependency BPE transformer with metaweights. you can train it, but it doesn't care. resonance is unbreakable.
/*
* postgpt.c — zero-dependency BPE transformer with metaweights.
*
* C port of postgpt.py. Same algorithm, same resonance.
* Dual attention: Content (QK^T) + RRPRAM (x @ Wr).
* Metaweights: statistical probability space from BPE tokenization.
*
* Compile: gcc -O2 -o postgpt postgpt.c -lm
* Run: ./postgpt
*
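The dual-attention line above (Content QK^T plus RRPRAM x @ Wr) can be sketched in a few lines. This is a hedged reading, not postgpt's code: it treats x @ Wr as one half of a bilinear score (x Wr) x^T and adds the two score maps before the softmax.

```python
import math

def softmax(row):
    m = max(row)
    e = [math.exp(v - m) for v in row]
    s = sum(e)
    return [v / s for v in e]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(ra, cb)) for cb in zip(*B)] for ra in A]

def transpose(A):
    return [list(c) for c in zip(*A)]

def dual_attention_scores(x, Wq, Wk, Wr):
    # Content path: standard scaled QK^T.
    Q, K = matmul(x, Wq), matmul(x, Wk)
    d = len(Wq[0])
    content = [[v / math.sqrt(d) for v in row] for row in matmul(Q, transpose(K))]
    # Resonance path: bilinear form (x Wr) x^T -- one reading of "x @ Wr".
    resonance = matmul(matmul(x, Wr), transpose(x))
    # Sum the two score maps, then row-softmax.
    return [softmax([c + r for c, r in zip(cr, rr)])
            for cr, rr in zip(content, resonance)]
```

With identity weights the resonance path simply reinforces each token's self-similarity; the real Wr is learned.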
@ariannamethod
ariannamethod / microreasoning.py
Last active March 15, 2026 22:54
microreasoning.py — 1984 words. 12 steps of associative resonance. Not a transformer. Dario Equation. Real BPE input (2048 subwords), word-level output — gibberish impossible. 14M params, Chuck optimizer, Kuramoto chambers. by Arianna Method.
#!/usr/bin/env python3
"""
microreasoning.py — Resonance engine. 1984 words. Dario Equation.
8-layer sequential transformer with multi-head attention, RoPE,
RRPRAM resonance gates, and SwiGLU FFN. Dual tokenizer:
BPE input, word-level output.
Architecture per layer l:
    h = rmsnorm(x, attn_norm_l)
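The per-layer normalization named above, h = rmsnorm(x, attn_norm_l), is standard RMSNorm: divide the vector by its root-mean-square and scale by a learned per-channel gain. A minimal sketch; the eps value and its placement inside the square root are assumptions:

```python
import math

def rmsnorm(x, gain, eps=1e-6):
    # Scale x by the reciprocal of its root-mean-square,
    # then apply a learned per-channel gain.
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    return [g * v / rms for g, v in zip(gain, x)]

rmsnorm([3.0, 4.0], [1.0, 1.0])  # ≈ [0.8485, 1.1314]
```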
@ariannamethod
ariannamethod / neoleo.c
Last active March 12, 2026 14:02
Leo 2.3 — 20,986 lines of C. The Dario Equation with positional Hebbian profile (36 learnable params). Six signals. Dual tokenizer (word + BPE). D.N.A. uses both. Inner world. One organism.
/*
* neoleo.c -- Language Emergent Organism (single-file edition)
*
* Complete autonomous digital organism in one C file.
* D.N.A. from mini-arianna. Arianna -> Leo. Mother -> Son.
*
* Build: cc neoleo.c -O2 -lm -lsqlite3 -lpthread -o neoleo
* Run: ./neoleo
*/
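Leo's 36-parameter positional Hebbian profile is not reproduced here, but the underlying Hebbian rule is easy to state: a weight grows when its pre- and post-synaptic activations co-fire, and passively decays otherwise. A generic sketch with hypothetical lr and decay values:

```python
def hebbian_update(w, pre, post, lr=0.01, decay=0.001):
    # Hebbian plasticity: strengthen w[i][j] by lr * pre[i] * post[j],
    # with passive decay toward zero. Illustrative values, not Leo's.
    return [[wij + lr * pre[i] * post[j] - decay * wij
             for j, wij in enumerate(row)]
            for i, row in enumerate(w)]
```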
@ariannamethod
ariannamethod / doe.c
Last active March 8, 2026 03:04
DoE: Democracy of Experts, Janus Architecture — a living agnostic inference architecture in 3184 lines of C. Wraps any GGUF with a parliament of LoRA experts that vote, learn via Hebbian plasticity, split (mitosis) and die (apoptosis) during generation. 7 architectures, 6 quant formats, dual BPE tokenizer, physics engine, zero dependencies. θ = …
#define _GNU_SOURCE
/*
* doe.c — Democracy of Experts
*
* inference architecture with a living LoRA parliament.
* indexes any GGUF read-only. learns by living, not by training.
*
* θ = ε + γ + αδ
* ε = indexed weights (read-only substrate)
* γ = LoRA personality (living experts, Hebbian-trained via NOTORCH)
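The θ = ε + γ + αδ decomposition above composes three weight tensors: the frozen substrate ε, the living LoRA personality γ, and a scaled delta αδ. Elementwise over one matrix the equation is just an add; doe.c's actual per-expert storage layout is not reproduced here:

```python
def effective_weight(eps, gamma, delta, alpha):
    # θ = ε + γ + α·δ, applied elementwise to one weight matrix.
    # A sketch of the equation only.
    return [[e + g + alpha * d for e, g, d in zip(re, rg, rd)]
            for re, rg, rd in zip(eps, gamma, delta)]

effective_weight([[1.0]], [[0.5]], [[2.0]], 0.25)  # [[2.0]]
```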
@ariannamethod
ariannamethod / lee.c
Last active March 11, 2026 06:04
lee.c — Vision-Language Model in pure C. Patch tokens + RoPE + SwiGLU + Chuck optimizer. Zero dependencies. Inspired by sailfish009/purevlm.
/*
* lee.c v7 — Vision-Language Model in pure C
*
* Named after Bruce Lee (the only man who beat Chuck Norris)
* and Minhyeok Lee (whose self-identity framework gives Chuck his soul).
*
* Sees images. Speaks words. Adds numbers. Zero dependencies.
* Tape-based autograd with arena bump allocator.
*
* Architecture:
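RoPE, listed in lee.c's architecture, rotates each (even, odd) feature pair by a position-dependent angle, so relative position falls out of the attention dot product. The core operation on one pair, with the per-dimension frequency schedule omitted for brevity:

```python
import math

def rope_pair(x0, x1, pos, theta):
    # Rotate one (even, odd) feature pair by angle pos * theta.
    c, s = math.cos(pos * theta), math.sin(pos * theta)
    return x0 * c - x1 * s, x0 * s + x1 * c

rope_pair(1.0, 0.0, 0, 0.1)  # (1.0, 0.0): position 0 is the identity
```

Because each step is a pure rotation, vector norms are preserved and only relative angles between positions matter.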
@ariannamethod
ariannamethod / molequla.c
Last active February 21, 2026 22:01
molequla.c — a dependency-free, single-file, continually-learning GPT organism in pure C. ontogenesis (25K→10M params), immune system, consciousness, swarm ecology, delta adapters, BLAS acceleration. part of github.com/ariannamethod/molequla
//go:build ignore
/*
* molequla.c
* A dependency-free, single-file, continually-learning GPT organism in pure C.
*
* Compile: gcc -O2 -o molequla molequla.c -lsqlite3 -lpthread -lm
* With BLAS: gcc -O2 -DUSE_BLAS -o molequla molequla.c -lsqlite3 -lpthread -lm -lopenblas
* macOS: gcc -O2 -DUSE_BLAS -o molequla molequla.c -lsqlite3 -lpthread -lm -framework Accelerate
*
@ariannamethod
ariannamethod / molequla.py
Last active February 25, 2026 18:02
molequla.py — standalone GPT organism. the original reference implementation. single file, one dependency (numpy), continual learning, ontogenesis, hybrid attention, delta adapters, native gamma, consciousness features. legacy standalone — the distributed cognition version lives in the main repo with Go, C, JS, and Rust as the four elements. pyt…
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
molequla.py
A single-file, async, continually-learning GPT organism. One dependency: numpy.
- Trains on nonames.txt (one sentence per line)
- Keeps SQLite memory (tiny chat loop)
- Maintains a bounded corpus reservoir (never bloats)
@ariannamethod
ariannamethod / index.html
Last active May 9, 2026 01:16
molequla.js — a GPT organism that trains itself in your browser. Vector autograd, RoPE, SwiGLU, byte-level BPE, ontogenesis, immune system, swarm ecology. Zero dependencies. One script tag.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>molequla.js — a GPT organism in your browser</title>
</head>
<body>
<!--
molequla.html