

Daily Journaling

// Assumes moment.js is loaded; entriesSum and daysInMonth are helpers
// defined elsewhere in the gist.
const dayInMonth = Number(moment().format("D"));
const currentMonth = Number(moment().format("M"));
var i = 1;
while (i < currentMonth) {
    var entries = entriesSum(i);    // entries logged in month i
    var max = daysInMonth(i);       // number of days in month i
    var perc = entries + "/" + max; // e.g. "28/31"
    i++;
}
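The loop above can be sketched without moment.js using the built-in `Date` API. Note that `entriesSum` here is a hypothetical stub standing in for the gist's real entry-counting helper:

```javascript
// Last day of a 1-based month: day 0 of the following month.
function daysInMonth(month, year) {
  return new Date(year, month, 0).getDate();
}

// Hypothetical stub; the original gist presumably counts journal
// entries for the given month from some data source.
function entriesSum(month) {
  return 0;
}

// Build an "entries/days" progress line for each completed month.
function monthlyProgress(year, currentMonth) {
  const lines = [];
  for (let month = 1; month < currentMonth; month++) {
    lines.push(month + ": " + entriesSum(month) + "/" + daysInMonth(month, year));
  }
  return lines;
}
```

For example, `monthlyProgress(2024, 3)` yields one line each for January and February, and `daysInMonth(2, 2024)` correctly handles the leap year.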
@leizaf
leizaf / tokenizer.zig
Created August 30, 2024 06:20
Zig MLIR Lexer
const std = @import("std");
const cc = std.ascii.control_code;
const startsWith = std.mem.startsWith;

fn isDigitNotZero(c: u8) bool {
    return switch (c) {
        '1'...'9' => true,
        else => false,
    };
}

Learning LLMs in 2025

So you know how the transformer works, you know basic ML/DL, and you want to learn more about LLMs. One way to go is to look into the various "algorithmic" topics (optimization algorithms, RL, DPO, etc.). There is lots of material on that. But the interesting stuff is (in my opinion at least) not there.

This is an attempt to collect a list of academic (or academic-like) materials that explore LLMs from other directions, and focus on the non-ML-algorithmic aspects.

Courses

  • David Chiang's Theory of Neural Networks course.
      • Not primarily about LLMs, but it has a substantial section on Transformers. Formal/Theory. More of a book than a course.