const dayInMonth = Number(moment().format("D"));   // day of month, 1-31
const currentMonth = Number(moment().format("M")); // month, 1-12
let i = 1;
while (i < currentMonth) {
  const entries = entriesSum(i); // entriesSum and daysInMonth are helpers defined elsewhere
  const max = daysInMonth(i);
  const perc = entries + "/" + max;
  i++; // advance to the next month so the loop terminates
}
const std = @import("std");
const cc = std.ascii.control_code;
const startsWith = std.mem.startsWith;

fn isDigitNotZero(c: u8) bool {
    return switch (c) {
        '1'...'9' => true,
        else => false,
    };
}
So you know how the transformer works, you know basic ML/DL, and you want to learn more about LLMs. One way to go is to look into the various "algorithmic" topics (optimization algorithms, RL, DPO, etc.). There are lots of materials on that. But the interesting stuff is (in my opinion, at least) not there.
This is an attempt to collect a list of academic (or academic-like) materials that explore LLMs from other directions, and focus on the non-ML-algorithmic aspects.
- David Chiang's Theory of Neural Networks course.
- This is not primarily about LLMs, but it does have a substantial section on Transformers. Formal/Theory. More of a book than a course.