from dataclasses import dataclass, replace
from joblib import Parallel, delayed
from typing import Any, Callable, Iterable, List, Optional

@dataclass(frozen=True)
class Either:
    # holds either a successful result or the exception that occurred
    value: Any = None
    exception: Optional[BaseException] = None

def forward_exceptions(fun: Callable) -> Callable:
    # decorator: call fun, returning an Either instead of raising
    def wrapper(*args, **kwargs) -> Either:
        try:
            return Either(value=fun(*args, **kwargs))
        except Exception as e:
            return Either(exception=e)
    return wrapper
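A usage sketch of the pattern with `joblib`, which the imports suggest is the intended use: exceptions in worker calls are captured per item instead of aborting the whole batch. The `reciprocal` helper is a hypothetical example, and the definitions are repeated so the snippet runs standalone.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

from joblib import Parallel, delayed

@dataclass(frozen=True)
class Either:
    value: Any = None
    exception: Optional[BaseException] = None

def forward_exceptions(fun: Callable) -> Callable:
    # wrap fun so failures come back as data rather than raised exceptions
    def wrapper(*args, **kwargs) -> Either:
        try:
            return Either(value=fun(*args, **kwargs))
        except Exception as e:
            return Either(exception=e)
    return wrapper

@forward_exceptions
def reciprocal(x: float) -> float:  # hypothetical worker function
    return 1.0 / x

# the failing input (0.0) yields an Either with the exception, not a crash
results = Parallel(n_jobs=2, backend="threading")(
    delayed(reciprocal)(x) for x in [2.0, 0.0, 4.0]
)
```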
# leverage scores l_i = [X @ (X^T X)^-1 @ X^T]_ii, the diagonal of the hat matrix;
# an indication of the self-sensitivity or self-influence of the i-th sample.
import numpy as np

n = 2048  # samples
d = 256   # dimensions
X = np.random.randn(n, d)  # design matrix

# naive computation: materializes the full n x n hat matrix (memory quadratic in n)
l_naive = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)
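The n x n hat matrix can be avoided entirely: with the thin QR decomposition X = QR, the hat matrix equals Q Q^T, so the leverage scores are just the squared row norms of Q, at O(nd) memory. A sketch with a cross-check against the naive version:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 512, 32
X = rng.standard_normal((n, d))

# thin QR: X = Q @ R with Q of shape (n, d); no n x n intermediate is formed
Q, _ = np.linalg.qr(X)
l_qr = np.sum(Q**2, axis=1)  # leverage scores = squared row norms of Q

# cross-check against the naive hat-matrix diagonal
l_naive = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)
assert np.allclose(l_qr, l_naive)
```

As a sanity check, leverage scores lie in [0, 1] and sum to the rank of X (here d).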
import torch
import torch.nn as nn
from sklearn.datasets import make_moons
from tqdm import tqdm
import matplotlib.pyplot as plt

class Flow(nn.Module):
    def __init__(self, n_dim=2, n_pos_dim=2, n_hidden=64):
        super().__init__()
        self.n_dim = n_dim
        # assumed architecture (original body truncated): an MLP over the sample
        # concatenated with a time/position embedding of size n_pos_dim
        self.net = nn.Sequential(
            nn.Linear(n_dim + n_pos_dim, n_hidden), nn.ELU(),
            nn.Linear(n_hidden, n_hidden), nn.ELU(),
            nn.Linear(n_hidden, n_dim),
        )
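Given the `make_moons` import, one plausible reading is a flow trained to map Gaussian noise onto the two-moons data. A minimal conditional flow-matching sketch under that assumption (the network predicts a velocity field, time enters as a raw scalar, i.e. `n_pos_dim=1`; none of this is confirmed by the truncated original, and the class is redefined so the snippet runs standalone):

```python
import torch
import torch.nn as nn
from sklearn.datasets import make_moons

class Flow(nn.Module):
    # minimal velocity-field MLP; raw scalar time (n_pos_dim=1) is an assumption
    def __init__(self, n_dim=2, n_pos_dim=1, n_hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_dim + n_pos_dim, n_hidden), nn.ELU(),
            nn.Linear(n_hidden, n_hidden), nn.ELU(),
            nn.Linear(n_hidden, n_dim),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

x1 = torch.tensor(make_moons(1024, noise=0.05)[0], dtype=torch.float32)
model = Flow()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    x0 = torch.randn_like(x1)       # base noise sample
    t = torch.rand(len(x1), 1)      # random time in [0, 1]
    xt = (1 - t) * x0 + t * x1      # point on the straight-line path
    loss = ((model(xt, t) - (x1 - x0)) ** 2).mean()  # regress path velocity
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Sampling would then integrate dx/dt = model(x, t) from t=0 to t=1 starting at Gaussian noise, e.g. with a simple Euler loop.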
import torch.nn as nn

class MINE(nn.Module):
    """
    Stub to implement Mutual Information Neural Estimation.
    See https://arxiv.org/pdf/1801.04062
    Quote:
        We argue that the estimation of mutual information
        between high dimensional continuous random variables can be
        achieved by gradient descent over neural networks. We present a
        Mutual Information Neural Estimator (MINE) that is linearly scalable
        in dimensionality as well as in sample size, trainable through
        back-prop, and strongly consistent.
    """
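One way the stub could be filled in is with the Donsker-Varadhan lower bound the paper optimizes, I(X;Z) >= E_joint[T] - log E_marginal[exp(T)], where T is a small statistics network. A hedged sketch (the architecture, layer sizes, and shuffle-based marginal sampling are illustrative assumptions, not from the original):

```python
import math
import torch
import torch.nn as nn

class MINE(nn.Module):
    # statistics network T_theta(x, z); sizes are illustrative assumptions
    def __init__(self, n_x, n_z, n_hidden=64):
        super().__init__()
        self.T = nn.Sequential(
            nn.Linear(n_x + n_z, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, 1),
        )

    def lower_bound(self, x, z):
        # Donsker-Varadhan: I(X;Z) >= E_joint[T] - log E_marginal[exp(T)]
        t_joint = self.T(torch.cat([x, z], dim=-1))
        z_shuf = z[torch.randperm(len(z))]  # shuffling z approximates p(x)p(z)
        t_marg = self.T(torch.cat([x, z_shuf], dim=-1))
        # logsumexp - log(n) is a numerically stable log of the empirical mean
        return (t_joint.mean()
                - (torch.logsumexp(t_marg, dim=0) - math.log(len(z)))).squeeze()

est = MINE(n_x=1, n_z=1)
x = torch.randn(256, 1)
z = x + 0.1 * torch.randn(256, 1)  # strongly dependent toy pair
bound = est.lower_bound(x, z)
```

Training would maximize this bound by gradient ascent on T's parameters; the paper additionally corrects the biased gradient of the log term with an exponential moving average, which this sketch omits.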