
Nikolaj Kuntner Nikolaj-K

I'm combinating Why's.
  • DLR Germany, IST Austria, Infineon, ...
  • Vienna
Nikolaj-K / somnia_yapper_data_250407.json
Created April 7, 2025 09:07
Yapper data from API calls to Kaito from 330 Somnia yappers from 250407
{
    "aixbt@aixbt_agent": {
        "user_id": "1852674305517342720",
        "username": "aixbt_agent",
        "yaps_all": 24515.98,
        "yaps_l24h": 64.35,
        "yaps_l48h": 107.54,
        "yaps_l7d": 363.15,
        "yaps_l30d": 1701.74,
        "yaps_l3m": 8689.35,
Nikolaj-K / somnia_7day_yapper_ranking_snapshots.py
Last active April 7, 2025 13:29
Kaito's Somnia 7-day yapper leaderboard snapshots since March 22
"""
Below you find the Somnia 7-day top-100 yapper rankings for several dozen days.
Ask me for an update at @ErnstKummer on X:
https://x.com/ErnstKummer/status/1909233792771936471
The last entry is currently from the 7th of April.
I started grabbing them on the 22nd of March, one or two snapshots per day.
The data here is written down as a Python dict which one can simply import.
The format is almost the same as JSON,
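The "simply import" workflow can be sketched like this (the dict name `RANKINGS` and its exact shape are my assumptions for illustration, not taken from the gist):

```python
# Hedged sketch: if the gist file defines a module-level dict, it can be
# imported like any other Python object, e.g.
#
#   from somnia_7day_yapper_ranking_snapshots import RANKINGS
#
# A self-contained stand-in with a plausible shape: date string -> ranked list.
RANKINGS = {
    "2025-03-22": ["aixbt_agent", "another_yapper"],
    "2025-04-07": ["another_yapper", "aixbt_agent"],
}
print(len(RANKINGS))  # 2 snapshot days
```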
Nikolaj-K / sample_via_inverse_cdf_method.py
Created March 20, 2025 22:00
Inverse Transform Sampling
"""
Code explained in
https://youtu.be/tSyMnVd6DsY
* Theorem:
V := CDF_X^{-1}(U), with U from the uniform distribution on [0,1], has the same distribution as X.
Proof:
* Note: for positive r, {r \in Q | sqrt{3} < r} = {r \in Q | 3 < r^2}, i.e. applying a monotone map to both sides of an inequality preserves the solution set.
* Pr(V <= x) = Pr(CDF_X^{-1}(U) <= x) = Pr(U <= CDF_X(x)) = CDF_X(x)
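A minimal sketch of the theorem in code (the exponential target Exp(lam) is my own choice of example distribution, not from the gist):

```python
import math
import random

def sample_exponential(lam, rng=random):
    """Draw one sample from Exp(lam) via inverse transform sampling.

    CDF_X(x) = 1 - exp(-lam*x), so CDF_X^{-1}(u) = -ln(1 - u)/lam.
    With U uniform on [0, 1], V := CDF_X^{-1}(U) ~ X by the theorem above.
    """
    u = rng.random()
    return -math.log(1.0 - u) / lam

random.seed(0)
samples = [sample_exponential(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to the true mean 1/lam = 0.5
```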
Nikolaj-K / bessels_correction.md
Last active March 16, 2025 00:36
The expectation of the sample variance equals the variance

Denote the expectation by ${\mathbb E}[f(X)]:=\sum_x f(x)\cdot p(x)$, where $\sum_x$ is the sum (or integral) over all $x$ and $p(x)$ is the probability distribution in question.

Denote the variance by $\mathrm{Var}[X]:={\mathbb E}[(X-{\mathbb E}[X])^2]$

Further, define the standard deviation $\sigma[X]:=\mathrm{Var}[X]^\frac{1}{2}$ (which, if we use units, has the same units as ${\mathbb E}[X]$ or $X$ itself).
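A quick Monte Carlo sketch of the claim in the title (my own illustration; drawing from Gauss(0, 1), whose variance is known to be 1, is an assumption):

```python
import random

def sample_variance(xs, ddof=1):
    # Divisor n - ddof; ddof=1 is Bessel's correction.
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - ddof)

# Average the sample variance of many small samples from a distribution
# with known variance 1; only the ddof=1 estimator is unbiased.
random.seed(0)
n, trials = 5, 50_000
draws = [[random.gauss(0, 1) for _ in range(n)] for _ in range(trials)]
biased = sum(sample_variance(xs, ddof=0) for xs in draws) / trials
unbiased = sum(sample_variance(xs, ddof=1) for xs in draws) / trials
print(biased, unbiased)  # ~0.8 = (n-1)/n and ~1.0 = Var[X], respectively
```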

import re
import time

from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager


class Config:
    URL = "https://testnet.somnia.network/memecoins"
    TMP_FILE_PATH = "/path/to/write/somnia_memecoin_page_tmp.html"
Nikolaj-K / bayes_rule_vs_propositional_logic.md
Last active November 28, 2024 18:35
Bayes' Rule vs Propositional Logic

Script discussed in the video:

https://youtu.be/tqmtZ82WHJc

==== Bayesian calculus vs Propositional logic ====

Idea: Logical implications are akin to conditional probabilities. Reading:

$P(B \vert A)=1 \iff A\to B$

$P(B \vert A)=0 \iff \neg(A\to B)$

"$P(B \vert A)$ ... $P(A\to B)$"
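A hedged sanity check of the first reading, $P(B \vert A)=1 \iff A\to B$, in Python (the finite sample space and the weight numbers are my own toy choices):

```python
def cond_prob(weights, event, given):
    # P(event | given) over a finite sample space; weights maps outcome -> probability.
    num = sum(p for w, p in weights.items() if event(w) and given(w))
    den = sum(p for w, p in weights.items() if given(w))
    return num / den

A = lambda w: w[0]  # an outcome w is a (truth of A, truth of B) pair
B = lambda w: w[1]

# A -> B holds on every outcome of positive probability, so P(B | A) = 1:
implies = {(True, True): 0.3, (False, True): 0.2, (False, False): 0.5}
print(cond_prob(implies, B, A))  # 1.0

# A counterexample to A -> B gets positive weight, so P(B | A) < 1:
broken = {(True, True): 0.3, (True, False): 0.1, (False, False): 0.6}
print(cond_prob(broken, B, A))  # strictly less than 1
```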

Nikolaj-K / bayes_rule_examples.py
Last active November 24, 2024 22:03
Bayes rule examples
import random

import matplotlib.pyplot as plt
import numpy as np
from scipy import special


def l1_normalize(finite_stream):
    lst = list(finite_stream)
    return np.array(lst) / sum(lst)
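A hypothetical usage sketch of `l1_normalize` for one discrete Bayes update, posterior ∝ prior · likelihood (the hypothesis space and the likelihood numbers are made up for illustration):

```python
import numpy as np

def l1_normalize(finite_stream):
    # Scale a finite stream of non-negative weights so they sum to 1.
    lst = list(finite_stream)
    return np.array(lst) / sum(lst)

prior = l1_normalize([1, 1, 1])          # uniform prior over 3 hypotheses
likelihood = np.array([0.9, 0.5, 0.1])   # P(observed data | hypothesis), made up
posterior = l1_normalize(prior * likelihood)
print(posterior)  # ordered like the likelihood, summing to 1
```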
Nikolaj-K / expected_gradient_and_score.md
Created November 11, 2024 21:38
Expected gradient and score

==== Integration by parts ====

$\int_a^b (p\cdot f)'\, dx = (p\cdot f)\big\vert_a^b = (p\cdot f)(b) - (p\cdot f)(a)$

$\implies$

$\int_a^b p\cdot f'\, dx = -\int_a^b p'\cdot f\, dx + (p\cdot f)(b) - (p\cdot f)(a)$

=== Score ===
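The preview cuts off here; a standard way this section continues (my hedged reconstruction, not necessarily the author's) defines the score as the derivative of the log-density and combines it with the integration-by-parts identity above:

$s(x) := (\log p(x))' = \frac{p'(x)}{p(x)}$

$\implies$

${\mathbb E}[f'(X)] = \int_a^b p\cdot f'\, dx = -\int_a^b p'\cdot f\, dx + (p\cdot f)(b) - (p\cdot f)(a) = -{\mathbb E}[s(X)\, f(X)]$, whenever the boundary term $(p\cdot f)(b) - (p\cdot f)(a)$ vanishes.

In particular, taking $f \equiv 1$ gives ${\mathbb E}[s(X)] = 0$: the expected score is zero.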

Nikolaj-K / nidlatam.txt
Created July 14, 2024 12:05
What is this?
Mappings out of V_6
input cardinality = 0
(#1) [] ↦ []
input cardinality = 1
(#2) [[]] ↦ [[[]]]
(#3) [[[]]] ↦ [[]]
Nikolaj-K / smol_diffusion.md
Created July 6, 2024 18:49
Pitching the smalldiffusion library and paper

Links to the pages shown in the video:

https://youtu.be/Q_c0n1d5x3I

########## ########## ########## ##########

  • Paper:

Interpreting and Improving Diffusion Models from an Optimization Perspective >