Script discussed in the video:
==== Bayesian calculus vs Propositional logic ====
Idea: Logical implications are akin to conditional probabilities.
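As a toy illustration (my own sketch with a made-up joint table, not necessarily how the script proceeds): read the implication $P \to Q$ as the conditional probability $\Pr(Q \mid P)$, so that the implication holding outright corresponds to $\Pr(Q \mid P) = 1$.

# Toy sketch: tabulate a joint distribution over the four truth-value
# pairs (P, Q) and check that Pr(Q | P) = 1 exactly when no
# positive-probability world makes P true and Q false.
def pr_q_given_p(joint):
    """Pr(Q = True | P = True) for a dict {(P, Q): probability}."""
    num = sum(pr for (p, q), pr in joint.items() if p and q)
    den = sum(pr for (p, q), pr in joint.items() if p)
    return num / den

# No mass on (P=True, Q=False), so "P -> Q" holds with certainty here.
joint = {(False, False): 0.2, (False, True): 0.3,
         (True, False): 0.0, (True, True): 0.5}
print(pr_q_given_p(joint))  # 1.0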
Reading:
import random
from scipy import special
import matplotlib.pyplot as plt
import numpy as np

def l1_normalize(finite_stream):
    """Scale a finite stream of nonnegative numbers so they sum to 1."""
    lst = list(finite_stream)
    return np.array(lst) / sum(lst)
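A usage sketch of my own (not necessarily from the script): `l1_normalize` turns any finite stream of nonnegative weights into a probability vector.

weights = [3, 1, 4, 1, 5]
probs = l1_normalize(weights)
print(probs)        # [0.21428571 0.07142857 0.28571429 0.07142857 0.35714286]
print(probs.sum())  # 1.0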
==== Integration by parts ====
=== Score ===
Mappings out of V_6
input cardinality = 0
(#1) [] ↦ []
input cardinality = 1
(#2) [[]] ↦ [[[]]]
(#3) [[[]]] ↦ [[]]
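One hedged way to store the listing above in code (my own encoding with a hypothetical `encode` helper; the video may represent this differently): key the mapping by a hashable nested-tuple version of each nested list.

def encode(lst):
    """Turn a nested list such as [[[]]] into a hashable nested tuple."""
    return tuple(encode(x) for x in lst)

# The tabulated mapping, stored as a dict.
mapping = {
    encode([]): [],         # (#1)  [] ↦ []
    encode([[]]): [[[]]],   # (#2)  [[]] ↦ [[[]]]
    encode([[[]]]): [[]],   # (#3)  [[[]]] ↦ [[]]
}

print(mapping[encode([[]])])  # [[[]]]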
Links of the pages in the video:
########## ########## ########## ##########
Interpreting and Improving Diffusion Models from an Optimization Perspective >
## Install hints for
## https://github.com/yuanchenyang/smalldiffusion/
## as of July 2024. Produced via `pip freeze > requirements.txt`
## You may run `pip install -r requirements.txt` on this file to reproduce the pinned versions (including the numpy) that worked for me.
## Small discussion at
## https://github.com/yuanchenyang/smalldiffusion/issues/1
# accelerate==0.31.0
# appnope==0.1.4
# asttokens==2.4.1
Script used in the video:
https://youtu.be/yx2xc9oHvkY
This video was a reaction to derivations such as:
re: https://community.deeplearning.ai/t/calculating-gradient-of-softmax-function/1897/3
----
For general $s\colon{\mathbb R}\to{\mathbb R}$, define the scaled vector ${\vec x}^s$: $i\mapsto \dfrac{s(x_i)}{\sum_{k=1}^n s(x_k)}$
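A minimal numerical sketch of this (my own code, not the video's script): for $s = \exp$ the scaled vector is the usual softmax, and its Jacobian is the familiar $\mathrm{diag}(p) - p\,p^{\top}$ that the linked derivation computes.

import numpy as np

def scaled_vector(x, s):
    """The vector i |-> s(x_i) / sum_k s(x_k) for a general s: R -> R."""
    sx = s(np.asarray(x, dtype=float))
    return sx / sx.sum()

x = np.array([1.0, 2.0, 3.0])
p = scaled_vector(x, np.exp)             # s = exp gives the usual softmax
print(p)
print(scaled_vector(x, lambda t: t**2))  # a non-exponential choice of s

# For s = exp, the softmax Jacobian is diag(p) - p p^T.
jacobian = np.diag(p) - np.outer(p, p)
print(jacobian)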
Video discussing and pointing to the text 'Diffusion Geometry' (May 2024, 49 pages)
by Iolo Jones (Durham University)
https://www.youtube.com/watch?v=f2GJG7vMSZI
#### Links
* Paper:
https://arxiv.org/abs/2405.10858
https://arxiv.org/pdf/2405.10858
Proofs discussed in the video:
https://youtu.be/h1Ikhh3J1vY
Legend and conventions:
$(P \to \bot) \lor P$ ... $PEM$ ... principle of excluded middle (a.k.a. $LEM$)
$((P \to \bot) \land P) \to Q$ ... $EXPL$ ... principle of explosion, a.k.a. ex falso
$((P \to Q) \to P) \to P$ ... $PP$ ... Peirce's principle (Peirce's law)
$((P \to \bot) \to P) \to P$ ... $CM$ ... consequentia mirabilis, a.k.a. Clavius's principle
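As a small supplement (my own Lean 4 sketch, not from the video): one standard derivation relating these principles, namely $PP$ from $PEM$ via `Classical.em`.

-- Sketch (Lean 4): Peirce's principle follows from the excluded middle.
theorem peirce_from_lem (P Q : Prop) : ((P → Q) → P) → P := by
  intro h
  cases Classical.em P with
  | inl hP  => exact hP                           -- P already holds
  | inr hnP => exact h (fun hP => absurd hP hnP)  -- ¬P makes P → Q hold vacuously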
Shownotes to the video:
https://youtu.be/q-mjO9Uxvy0
For a related and relevant video on constructive logic basics and upshots, see
https://youtu.be/-lPrjPHElik
For a related and relevant discussion of computably enumerable sets and their complements, see this 4-year-old video:
https://youtu.be/Ox0tD58DTG0
For some of the non-theorems in the list, it helps to understand $\Pi^0_2$-complete sets (e.g., the set of indices of total computable functions is $\Pi^0_2$-complete).
The referenced video on how the Axiom of Choice and Regularity each imply LEM is at
https://youtu.be/2EOW23uVcRA