@jiahao
jiahao / llama2.jl
Last active August 2, 2023 08:35
NOTE 2023-07-30: This gist is deprecated in favor of https://github.com/rai-llc/LanguageModels.jl . llama2.jl is a port of @karpathy's llama2.c to Julia.
# A port of https://github.com/karpathy/llama2.c/blob/master/run.c
# to Julia.
# Jiahao Chen <[email protected]> 2023-07-29
#
# MIT License: see full text at https://opensource.org/license/mit/
#
using LinearAlgebra
using LogExpFunctions
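The preview above shows only the imports. In run.c-style samplers, the model's output logits become a token via temperature-scaled softmax sampling; a minimal Python sketch of that general step (an illustration, not code from the Julia port):

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Numerically stable softmax with temperature (log-sum-exp trick)."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                       # subtract the max before exponentiating
    exps = [math.exp(x - m) for x in scaled]
    z = sum(exps)
    return [e / z for e in exps]

def sample(logits, temperature=1.0, rng=random.random):
    """Draw an index from the softmax distribution over the logits."""
    probs = softmax(logits, temperature)
    r, acc = rng(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1                 # guard against floating-point round-off
```

Low temperatures concentrate the distribution on the largest logit; high temperatures flatten it.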
jiahao / doubledescent.jl
Last active July 12, 2023 18:29
A small example of double descent in Julia
using Plots
using StatsPlots
using LinearAlgebra
using ClassicalOrthogonalPolynomials
using ProgressMeter
using Statistics
k = 15 # Size of training data
l = 15 # Size of test data
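The gist fits models of growing capacity to k = 15 noisy points and tracks train/test error through the interpolation threshold. A rough Python sketch of the same experiment using minimum-norm least squares (the cosine target and plain monomial features here are stand-ins; the gist uses ClassicalOrthogonalPolynomials):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 15                                     # training points, matching the gist
x_train = rng.uniform(-1, 1, k)
x_test = rng.uniform(-1, 1, 100)

def target(x):
    return np.cos(2 * np.pi * x)           # stand-in target function

y_train = target(x_train) + 0.1 * rng.standard_normal(k)
y_test = target(x_test)

def design(x, p):
    # p monomial features; the gist uses classical orthogonal polynomials
    return np.vander(x, p, increasing=True)

def errors(p):
    A = design(x_train, p)
    w = np.linalg.pinv(A) @ y_train        # minimum-norm least-squares fit
    train = np.mean((A @ w - y_train) ** 2)
    test = np.mean((design(x_test, p) @ w - y_test) ** 2)
    return train, test

# Sweep model capacity through the interpolation threshold at p = k
results = {p: errors(p) for p in (5, 10, 15, 20, 40)}
```

Once p reaches k the fit interpolates the training data (train error near zero); plotting test error against p shows the characteristic peak near p = k before it falls again, i.e. the double descent curve.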
jiahao / nf4.jl
Last active July 10, 2023 14:55
Minimal Julia implementation of NF4 floating point for QLoRA
using Statistics
using BFloat16s
using StaticArrays
import Base: getindex, setindex!, length, iterate
###########################################
# Implementation of the NormedFloat4 type
# and its container type, QLoRAArray
#
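The idea behind NF4 is blockwise codebook quantization: scale each block by its absolute maximum, then store the index of the nearest entry in a fixed 16-level codebook. A Python sketch of the mechanics (the evenly spaced codebook below is a placeholder for illustration; real NF4 uses quantiles of a standard normal distribution, per the QLoRA paper):

```python
import numpy as np

# Placeholder codebook: NF4 proper uses 16 normal-distribution quantiles.
CODEBOOK = np.linspace(-1.0, 1.0, 16)

def quantize_block(block):
    """Absmax-scale a block into [-1, 1], then map each value to the
    nearest codebook index (4 bits per value plus one scale per block)."""
    scale = np.max(np.abs(block))
    if scale == 0:
        return np.zeros(len(block), dtype=np.uint8), 0.0
    normalized = block / scale
    idx = np.abs(normalized[:, None] - CODEBOOK[None, :]).argmin(axis=1)
    return idx.astype(np.uint8), scale

def dequantize_block(idx, scale):
    """Reconstruct approximate values from indices and the block scale."""
    return CODEBOOK[idx] * scale
```

The round-trip error is bounded by half a codebook step times the block scale, which is why small block sizes (and a well-chosen codebook) matter.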
jiahao / listpdfs.py
Created August 31, 2021 15:37
Scrape my arXiv profile to list one PDF per line. Useful for updating conference submission profiles
from bs4 import BeautifulSoup
import urllib.request
url = "https://arxiv.org/a/chen_j_2.html"
with urllib.request.urlopen(url) as response:
    html = response.read()
soup = BeautifulSoup(html, 'html.parser')
for link in soup.find_all('a'):
    print(link.get('href'))  # preview truncated here; presumably one href per line
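For reference, the same link extraction can be done with only the standard library. A sketch using `html.parser` in place of BeautifulSoup (the `.pdf` suffix filter is an assumption about the page's link format):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href attributes of <a> tags, mirroring soup.find_all('a')."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value is not None:
                    self.links.append(value)

def pdf_links(html):
    """Return the hrefs that look like PDF links, one list entry per line of output."""
    parser = LinkCollector()
    parser.feed(html)
    return [h for h in parser.links if h.endswith(".pdf")]
```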
jiahao / aaism.jl
Last active September 24, 2020 04:43
Implementation of the stabilized Type-I Anderson acceleration (AA-I-S-m) algorithm of Zhang, O'Donoghue and Boyd (2018). This implementation solves g(x) = 0 rather than the fixed-point form f(x) = x; to accelerate a fixed-point iteration, set g(x) = f(x) - x.
using Dates: now
using DataFrames
import Base: *, push!
mutable struct AAUpdate{Tu,Tv} # Matrix-free representation of the H matrix
    m::Int        # Size of the AA subspace
    u::Vector{Tu} # The quantities s-Hỹ (note typo in paper)
    v::Vector{Tv} # The quantities (H'ŝ)/(ŝ'Hŷ)
end
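For context, Anderson acceleration replaces the plain fixed-point update with a linear combination of recent iterates whose coefficients minimize the combined residual. A Python sketch of the textbook Type-II scheme for x = f(x) (not the stabilized Type-I variant the gist implements):

```python
import numpy as np

def anderson(f, x0, m=5, iters=50, tol=1e-10):
    """Textbook (Type-II) Anderson acceleration for the fixed point x = f(x).
    Keeps up to m previous residuals and picks the affine combination of
    recent f-evaluations that minimizes the combined residual norm."""
    xs = [np.asarray(x0, dtype=float)]
    fs = [np.asarray(f(xs[0]), dtype=float)]
    for k in range(iters):
        mk = min(m, k)
        r = [fs[i] - xs[i] for i in range(k - mk, k + 1)]   # recent residuals
        if np.linalg.norm(r[-1]) < tol:
            break
        if mk == 0:
            alpha = np.array([1.0])
        else:
            # Minimize ||r_k - dR @ gamma||, then recover the affine weights alpha
            dR = np.column_stack([r[j + 1] - r[j] for j in range(mk)])
            gamma, *_ = np.linalg.lstsq(dR, r[-1], rcond=None)
            alpha = np.empty(mk + 1)
            alpha[0] = gamma[0]
            alpha[1:-1] = gamma[1:] - gamma[:-1]
            alpha[-1] = 1.0 - gamma[-1]
        xs.append(sum(a * fi for a, fi in zip(alpha, fs[k - mk:])))
        fs.append(np.asarray(f(xs[-1]), dtype=float))
    return xs[-1]
```

The weights alpha sum to one by construction; with m = 1 the scalar case reduces to the secant method on g(x) = f(x) - x.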
jiahao / sparsematrixiterator.jl
Last active December 4, 2019 05:52
Iterate over SparseMatrixCSC stored entries. Implements Julia's new iterator protocol (new as of v0.7) https://julialang.org/blog/2018/07/iterators-in-julia-0.7
# Iterate over SparseMatrixCSC stored entries, ignoring stored zeros and
# missing values.
#
# Implements Julia's new iterator protocol (new as of v0.7)
# Ref: https://julialang.org/blog/2018/07/iterators-in-julia-0.7
#
# Jiahao Chen 2019-12-03
#
# MIT License available upon request
#
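The CSC layout stores a column-pointer array plus parallel row-index and value arrays, so iterating the stored entries means walking columns and slicing the pointer range for each. A Python sketch of the same traversal (0-based indices here; Julia's SparseMatrixCSC is 1-based):

```python
def csc_entries(indptr, rowval, nzval):
    """Yield (row, col, value) for each stored entry of a CSC matrix,
    column by column, skipping stored zeros and missing (None) values
    as the gist does."""
    ncols = len(indptr) - 1
    for col in range(ncols):
        for k in range(indptr[col], indptr[col + 1]):
            v = nzval[k]
            if v is None or v == 0:
                continue
            yield rowval[k], col, v
```

Because the traversal follows storage order, it touches memory sequentially, which is why column-major iteration is the natural order for CSC.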
jiahao / sparselogisticpca.jl
Created November 22, 2019 10:38
Sparse logistic PCA in Julia - translated from @andland 's implementation https://github.com/andland/SparseLogisticPCA
using LinearAlgebra
using StatsBase
using StatsFuns
using NaNMath
"x->2x-1 in place"
function twoxm1!(dat; val=0.0)
@inbounds for (i,x) in enumerate(dat)
dat[i] = ifelse(isnan(x), val, 2x-1)
end
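The helper above recodes 0/1 data to -1/+1 and substitutes `val` for missing (NaN) entries before the logistic PCA iterations. An out-of-place Python equivalent, for illustration:

```python
import math

def twoxm1(data, val=0.0):
    """Map x -> 2x - 1 (0/1 data to -1/+1), replacing NaN (missing)
    entries with `val`; mirrors the in-place twoxm1! in the gist."""
    return [val if math.isnan(x) else 2 * x - 1 for x in data]
```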
### Keybase proof
I hereby claim:
* I am jiahao on github.
* I am jiahao (https://keybase.io/jiahao) on keybase.
* I have a public key ASD8uXwFCTxC_HNYH0M6m_5niip3vql6gQ9nqYuUWnkiiQo
To claim this, I am signing this object:
jiahao / naivebayes.jl
Last active April 28, 2019 23:24
Multinomial naive Bayes in Julia, allowing for generic numeric types for the conditional probabilities. When using rational numbers, you can calculate exact probabilities without roundoff error.
struct MultinomialNaiveBayes{T, V<:AbstractVector}
    feature_ratios::V
    prior_ratio::T
end
"""
fit(MultinomialNaiveBayes, [T,] features, labels, α = 1) -> MNB
fits a `MultinomialNaiveBayes` classifier `MNB` using the
`features` matrix and `labels` vector of `Bool`s.
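Working in exact rationals means the classifier can be expressed as posterior odds with no logarithms or floating point at all. A Python sketch of that idea using `fractions.Fraction` (the names `feature_ratios` and `prior_ratio` follow the struct above; the fit/predict signatures are assumptions):

```python
from fractions import Fraction

def fit(features, labels, alpha=1):
    """Fit multinomial naive Bayes with Laplace smoothing `alpha`,
    storing per-feature likelihood ratios and the prior odds as
    exact Fractions. features: count vectors; labels: bools."""
    nfeat = len(features[0])
    pos = [f for f, l in zip(features, labels) if l]
    neg = [f for f, l in zip(features, labels) if not l]

    def conditional(group):
        counts = [sum(f[j] for f in group) + alpha for j in range(nfeat)]
        total = sum(counts)
        return [Fraction(c, total) for c in counts]

    theta_pos, theta_neg = conditional(pos), conditional(neg)
    feature_ratios = [p / q for p, q in zip(theta_pos, theta_neg)]
    prior_ratio = Fraction(len(pos), len(neg))
    return feature_ratios, prior_ratio

def predict(feature_ratios, prior_ratio, x):
    """Exact posterior odds P(true|x) / P(false|x); > 1 means predict True."""
    odds = prior_ratio
    for r, c in zip(feature_ratios, x):
        odds *= r ** c
    return odds
```

Because every quantity is a Fraction, the returned odds are exact; converting to float only at the end avoids any roundoff during the computation.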
jiahao / naivebayes.jl
Created April 27, 2019 01:26
Multinomial naive Bayes in Julia, allowing generic numeric types (including rational numbers) for the conditional probabilities, so you can calculate exact probabilities without roundoff error.
struct NaiveBayes{T, V<:AbstractVector, M<:AbstractMatrix}
    probabilities::M
    priors::V
end
train(::Type{NaiveBayes}, T::Type{R}, features, labels, α = 1) where R<:Real =
train(NaiveBayes{T, Vector{T}, Matrix{T}}, features, labels, α)
for (typ, op) in ((Rational, ://), (Real, :/)) @eval begin
function train(::Type{NaiveBayes{T, S, R}},