Nikolaj Kuntner (Nikolaj-K)
💭 I'm combinating Why's.
  • DLR Germany, IST Austria, Infineon, ...
  • Vienna
Nikolaj-K / bayes_rule_vs_propositional_logic.md
Last active November 28, 2024 18:35
Bayes' Rule vs Propositional Logic

Script discussed in the video:

https://youtu.be/tqmtZ82WHJc

==== Bayesian calculus vs Propositional logic ====

Idea: Logical implications are akin to conditional probabilities. Reading:

$P(B \vert A)=1 \iff A\to B$
$P(B \vert A)=0 \iff \neg(A\to B)$
"$P(B \vert A)$ ... $P(A\to B)$"
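
A minimal numerical sketch of this reading (not part of the gist; the worlds, events and weights below are made up): in a toy possible-worlds model, $P(B \vert A)=1$ exactly when every A-world of positive weight is a B-world, i.e. when the material implication $A\to B$ holds throughout.

# Sketch: toy "possible worlds" model for the analogy between P(B|A) and A -> B.
worlds = [  # (A holds, B holds, probability weight) -- invented for illustration
    (True,  True,  0.3),
    (True,  True,  0.2),
    (False, True,  0.1),
    (False, False, 0.4),
]

def prob(pred):
    """Total weight of the worlds satisfying the predicate."""
    return sum(w for a, b, w in worlds if pred(a, b))

p_a = prob(lambda a, b: a)
p_a_and_b = prob(lambda a, b: a and b)
p_b_given_a = p_a_and_b / p_a

# The material implication A -> B holds in every world of positive weight,
# and correspondingly P(B|A) = 1.
implication_holds_everywhere = all((not a) or b for a, b, w in worlds if w > 0)

print(p_b_given_a)                   # 1.0
print(implication_holds_everywhere)  # True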

Nikolaj-K / bayes_rule_examples.py
Last active November 24, 2024 22:03
Bayes rule examples
import random
from scipy import special
import matplotlib.pyplot as plt
import numpy as np


def l1_normalize(finite_stream):
    """Normalize a finite stream of non-negative weights so they sum to 1."""
    lst = list(finite_stream)
    return np.array(lst) / sum(lst)
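
A hypothetical usage sketch continuing the snippet above (the prior and likelihood numbers are invented): l1_normalize turns an unnormalized product likelihood × prior into a posterior that sums to 1, in the spirit of the gist's Bayes-rule examples.

# Unnormalized Bayes update: posterior ∝ likelihood * prior.
prior = np.array([0.5, 0.3, 0.2])        # P(H_i), made-up hypotheses
likelihood = np.array([0.9, 0.5, 0.1])   # P(D | H_i), made-up data model
posterior = l1_normalize(likelihood * prior)
print(posterior, posterior.sum())        # posterior sums to 1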
Nikolaj-K / expected_gradient_and_score.md
Created November 11, 2024 21:38
Expected gradient and score

==== Integration by parts ====

$\int_a^b (p\cdot f)'\, dx = \big[p\cdot f\big]_a^b = (p\cdot f)(b) - (p\cdot f)(a)$

$\implies$

$\int_a^b p\cdot f'\, dx = -\int_a^b p'\cdot f\, dx + (p\cdot f)(b) - (p\cdot f)(a)$
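
A quick numerical sanity check of this identity (a sketch, not part of the gist; the choices of p, f and the interval are arbitrary), using scipy's quadrature:

# Verify: ∫ p f' dx = -∫ p' f dx + p(b)f(b) - p(a)f(a) for concrete p, f on [a, b].
import numpy as np
from scipy.integrate import quad

a, b = 0.0, 2.0
p  = lambda x: np.exp(-x)   # example "density-like" factor
dp = lambda x: -np.exp(-x)  # p'
f  = lambda x: x**2         # example test function
df = lambda x: 2 * x        # f'

lhs, _ = quad(lambda x: p(x) * df(x), a, b)
rhs_int, _ = quad(lambda x: dp(x) * f(x), a, b)
rhs = -rhs_int + p(b) * f(b) - p(a) * f(a)

print(lhs, rhs)  # the two numbers agree up to quadrature error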

=== Score ===

Nikolaj-K / nidlatam.txt
Created July 14, 2024 12:05
What is this?
Mappings out of V_6
input cardinality = 0
(#1) [] ↦ []
input cardinality = 1
(#2) [[]] ↦ [[[]]]
(#3) [[[]]] ↦ [[]]
Nikolaj-K / smol_diffusion.md
Created July 6, 2024 18:49
Pitching the smalldiffusion library and paper

Links to the pages shown in the video:

https://youtu.be/Q_c0n1d5x3I

########## ########## ########## ##########

  • Paper:

Interpreting and Improving Diffusion Models from an Optimization Perspective >

Nikolaj-K / requirements.txt
Last active June 29, 2024 11:57
Fairly minimal requirements.txt for the smalldiffusion package
## Install hints for
## https://github.com/yuanchenyang/smalldiffusion/
## as of July 2024. Produced via `pip freeze > requirements.txt`
## You may `pip install -r requirements.txt` this file to get package versions (including numpy) that worked for me.
## Small discussion at
## https://github.com/yuanchenyang/smalldiffusion/issues/1
# accelerate==0.31.0
# appnope==0.1.4
# asttokens==2.4.1
Nikolaj-K / softmax_derivative.tex
Last active June 2, 2024 17:04
SoftMax: On derivations of its derivatives, ∂σ/∂x
Script used in the video:
https://youtu.be/yx2xc9oHvkY
This video was a reaction to derivations such as:
re: https://community.deeplearning.ai/t/calculating-gradient-of-softmax-function/1897/3
----
For general $s\colon{\mathbb R}\to{\mathbb R}$, define the scaled vector ${\vec x}^s$: $i\mapsto \dfrac{s(x_i)}{\sum_{k=1}^n s(x_k)}$
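
For the special case $s=\exp$ this scaled vector is the softmax $\sigma(x)$, whose Jacobian has the well-known closed form $\partial\sigma_i/\partial x_j = \sigma_i(\delta_{ij}-\sigma_j)$. A small sketch (not from the gist) comparing that formula against central finite differences:

# Compare the closed-form softmax Jacobian with a numerical Jacobian.
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))  # shift for numerical stability
    return e / e.sum()

def softmax_jacobian(x):
    # J[i, j] = sigma_i * (delta_ij - sigma_j)
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)

x = np.array([0.3, -1.2, 2.0, 0.5])
eps = 1e-6
num_jac = np.array([
    (softmax(x + eps * np.eye(len(x))[j]) - softmax(x - eps * np.eye(len(x))[j])) / (2 * eps)
    for j in range(len(x))
]).T  # column j holds d sigma / d x_j

print(np.allclose(softmax_jacobian(x), num_jac, atol=1e-6))  # True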
Nikolaj-K / diffusion_geometry.py
Last active May 29, 2024 17:22
Diffusion Geometry
Video discussing and pointing to the text 'Diffusion Geometry' (May 2024, 49 pages)
by Iolo Jones (Durham University)
https://www.youtube.com/watch?v=f2GJG7vMSZI
#### Links
* Paper:
https://arxiv.org/abs/2405.10858
https://arxiv.org/pdf/2405.10858
Nikolaj-K / Consequentia_Mirabilis.txt
Last active April 13, 2024 11:24
Peirce's law, Consequentia Mirabilis, ¬¬-Elimination and LEM without Explosion
Proofs discussed in the video:
https://youtu.be/h1Ikhh3J1vY
Legend and conventions:
$(P \to \bot) \lor P$ ... $PEM$ ... principle of excluded middle (a.k.a. $LEM$)
$((P \to \bot) \land P) \to Q$ ... $EXPL$ ... principle of explosion, a.k.a. ex falso
$((P \to Q) \to P) \to P$ ... $PP$ ... Peirce's principle
$((P \to \bot) \to P) \to P$ ... $CM$ ... consequentia mirabilis, a.k.a. Clavius's principle
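
As a side note (not part of the gist): all four principles are classical tautologies, which a brute-force truth-table check confirms; the interesting content of the video is their intuitionistic status, which such a check does not capture. A small sketch:

# Classical truth-table sanity check of the four principles listed above.
from itertools import product

def implies(a, b):
    return (not a) or b

BOT = False  # falsum

def pem(p):       return implies(p, BOT) or p                    # (P -> bot) v P
def expl(p, q):   return implies(implies(p, BOT) and p, q)       # ((P -> bot) & P) -> Q
def peirce(p, q): return implies(implies(implies(p, q), p), p)   # ((P -> Q) -> P) -> P
def cm(p):        return implies(implies(implies(p, BOT), p), p) # ((P -> bot) -> P) -> P

print(all(pem(p) for p in (False, True)))
print(all(expl(p, q) for p, q in product((False, True), repeat=2)))
print(all(peirce(p, q) for p, q in product((False, True), repeat=2)))
print(all(cm(p) for p in (False, True)))
# All four print True: each principle is classically valid.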
Nikolaj-K / subcountability.tex
Last active April 8, 2024 00:04
ℕ surjects onto ℝ. Subsets of ℕ surject onto ℕ^ℕ. ℕ^ℕ injects into ℕ. All that.
Shownotes to the video:
https://youtu.be/q-mjO9Uxvy0
A related and relevant video on constructive logic basics and upshots is at
https://youtu.be/-lPrjPHElik
For a related and relevant discussion on computably enumerable sets and their complements, see this four-year-old video
https://youtu.be/Ox0tD58DTG0
For some of the non-theorems in the list it helps to understand $\Pi^0_2$-complete sets.
The referenced video on how the Axiom of Choice and Regularity each imply LEM is at
https://youtu.be/2EOW23uVcRA