Vlad Niculae (vene)

@vene
vene / lr_admm.py
Last active May 4, 2024 18:29
LR ADMM
""" Fit logistic regression by ADMM. """
# Author: Vlad Niculae <[email protected]>
# License: MIT
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from scipy.linalg import cho_factor, cho_solve
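The preview stops at the imports. Below is a minimal sketch of how an ADMM fit for L1-regularized logistic regression could go, consistent with the Cholesky imports above; the w/z splitting, the Newton inner solve, and every name in it are assumptions, not the gist's actual solver.

def fit_lr_admm(X, y, lam=1.0, rho=1.0, n_iter=100):
    """Hypothetical sketch: min_w logloss(Xw; y) + lam * ||z||_1  s.t.  w = z."""
    n, d = X.shape
    w, z, u = np.zeros(d), np.zeros(d), np.zeros(d)
    for _ in range(n_iter):
        # w-step: a few Newton iterations on the smooth subproblem
        for _ in range(5):
            p = 1.0 / (1.0 + np.exp(-(X @ w)))
            grad = X.T @ (p - y) + rho * (w - z + u)
            H = X.T @ (X * (p * (1 - p))[:, None]) + rho * np.eye(d)
            w -= cho_solve(cho_factor(H), grad)
        # z-step: soft thresholding, the prox of the l1 norm
        z = np.sign(w + u) * np.maximum(np.abs(w + u) - lam / rho, 0.0)
        # dual ascent on the consensus constraint
        u += w - z
    return z

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
print(np.count_nonzero(fit_lr_admm(X, y.astype(float))))  # sparse coefficients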
@vene
vene / weighted_empirical_cov.py
Last active November 22, 2022 09:43
Fitting a Gaussian to an empirical weighted measure
"""Fitting a Gaussian to an empirical weighted measure"""
# author: vlad niculae <[email protected]>
# license: bsd
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import logsumexp, softmax
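The preview ends at the imports, but the core computation is short. A sketch under the assumption that the measure is given by samples X of shape (n, d) and log-weights logw (both names hypothetical):

def fit_weighted_gaussian(X, logw):
    # normalize the log-weights to a probability vector
    p = softmax(logw)
    # weighted MLE for the mean and covariance of sum_i p_i * delta(x_i)
    mu = p @ X
    Xc = X - mu
    cov = Xc.T @ (p[:, None] * Xc)
    return mu, cov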
@vene
vene / hyperbolic.py
Last active November 9, 2021 14:06
"""wrapped hyperbolic distributions
following https://arxiv.org/abs/1902.02992
"""
# author: vlad niculae <[email protected]>
# license: bsd 3-clause
import torch
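Only the import survives in the preview. The wrapped construction in the cited paper samples a tangent vector at the origin of the hyperboloid, parallel-transports it to mu, and pushes it through the exponential map; here is a sketch of that map on the Lorentz model, with all names assumed:

def lorentz_inner(x, y):
    # Minkowski inner product <x, y>_L = -x_0 y_0 + sum_{i>0} x_i y_i
    return -x[..., 0] * y[..., 0] + (x[..., 1:] * y[..., 1:]).sum(-1)

def expmap(mu, u):
    # exponential map at mu, for a tangent vector u satisfying <mu, u>_L = 0
    nrm = torch.sqrt(torch.clamp(lorentz_inner(u, u), min=1e-12)).unsqueeze(-1)
    return torch.cosh(nrm) * mu + torch.sinh(nrm) * (u / nrm)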
""" trying to understand the determinant of sphere-to-cyl """
import numpy as np
import jax.numpy as jnp
from jax import jacfwd
# map from cylinder to sphere
def phi(zr):
    d = zr.shape[0]
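phi is cut off mid-body; a hedged completion of the cylinder-to-sphere map (Archimedes' projection) together with the Jacobian probe the docstring hints at. The test point is made up:

def phi_full(zr):
    # zr = (z, r): height z in [-1, 1] and a unit vector r on the circle
    z, r = zr[0], zr[1:]
    return jnp.concatenate([zr[:1], jnp.sqrt(1.0 - z ** 2) * r])

zr = jnp.array([0.3, 0.6, 0.8])   # z = 0.3, r = (0.6, 0.8), |r| = 1
J = jacfwd(phi_full)(zr)          # ambient 3x3 Jacobian
print(jnp.linalg.det(J))          # 1 - z^2 = 0.91 for this parametrization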
@vene
vene / kl.py
Created November 3, 2021 14:43
"""
Approximating the cross-entropy between two Power Sphericals.
Uses a second-order Taylor expansion to approximate E[log(1+z)].
"""
# author: vlad n <[email protected]>
# license: mit
# documentation: https://hackmd.io/@vladn/SJ93wMevK
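The preview shows only the header, but the named approximation is compact: expand log(1+z) to second order around the mean of z (the delta method), which needs only the first two moments. A sketch; obtaining mean_z and var_z from the Power Spherical parameters is left out:

import torch

def approx_E_log1p(mean_z, var_z):
    # E[log(1+z)] ~= f(m) + f''(m) * v / 2 with f = log1p, f''(m) = -1/(1+m)^2
    return torch.log1p(mean_z) - var_z / (2 * (1 + mean_z) ** 2)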
"""
Dual p-norms illustrated.
For any norm |.|, the dual norm is defined as |y|_* = max{ <x, y> for |x| <= 1 }.
The figure shows the unit balls of the p-norm, for p = 1.5, 2, and 3.
We compute the dual norm at a dual vector y (short black arrow), rotating
uniformly around the origin over time.
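As a numeric companion to the definition: the dual of the p-norm is the q-norm with 1/p + 1/q = 1, and a brute-force maximization of <x, y> over the unit ball reproduces it. A sketch, with all values illustrative:

import numpy as np

def dual_pnorm(y, p):
    # dual of the p-norm is the q-norm, with 1/p + 1/q = 1
    q = p / (p - 1.0)
    return np.linalg.norm(y, ord=q)

rng = np.random.default_rng(0)
y = np.array([0.3, -0.7])
X = rng.standard_normal((100_000, 2))
X /= np.linalg.norm(X, ord=1.5, axis=1, keepdims=True)  # points on the 1.5-sphere
print(dual_pnorm(y, 1.5), (X @ y).max())                # the two should roughly agree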
@vene
vene / multi_mixins.py
Last active August 17, 2021 07:39
python multiple inheritance / mixin MRO
class Base:
    def say(self, val):
        print("base says", val)

class A(Base):
    def say(self, val):
        print("say A")
@vene
vene / README.txt
Created July 19, 2021 08:25
Experimental config with files & CLI using only OmegaConf
Example input and output.
$ python conf.py seed=42 lr=.1
project: ???
seed: 42
lr: 0.1
epochs: ???
p_drop: 0.5
baseconf: null
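A sketch of a conf.py consistent with the output above (the exact schema is an assumption; ??? is OmegaConf's marker for a mandatory value not yet provided):

from omegaconf import OmegaConf

defaults = OmegaConf.create("""
project: ???
seed: ???
lr: ???
epochs: ???
p_drop: 0.5
baseconf: null
""")

cli = OmegaConf.from_cli()                 # picks up seed=42 lr=.1 from argv
conf = OmegaConf.merge(defaults, cli)
print(OmegaConf.to_yaml(conf), end="")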
# author: vlad niculae <[email protected]>
# license: mit
import torch
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.colors as colors
from entmax import sparsemax, entmax15
from entmax.losses import sparsemax_loss, entmax15_loss
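Only the imports survive here; a quick usage sketch of the API they pull in (shapes and targets made up):

scores = torch.randn(4, 5, requires_grad=True)
probs = sparsemax(scores, dim=-1)          # unlike softmax, exact zeros are possible
target = torch.tensor([0, 2, 1, 4])
loss = sparsemax_loss(scores, target).mean()
loss.backward()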
# author: vn
import numpy as np
from scipy.optimize import root_scalar
import torch
import matplotlib.pyplot as plt
def entropy(y, a, b):