
Ion is the systems language we designed and built for the Bitwise project. It's designed from the ground up to have a close correspondence to C's type system and semantics. As a result it implements essentially the entire C type system and specification as a baseline.

However, I've recently been getting annoyed with const. Const correctness is something I've stubbornly committed to for a long time as a programmer, both professionally and in my personal work. But at the same time it's always been clear to me that the benefit is more cosmetic than semantic. That being the case, the downsides of const correctness start to look pretty hefty. People often make a big fuss about how const correctness is an all-or-nothing deal: once it infiltrates your codebase, it naturally spreads everywhere because of const checking failures at the boundaries between const-correct and non-const-correct code. That's all true, but if anything it understates the issue. If that were all, const correctness would be to

I was told by @mmozeiko that Address Sanitizer (ASAN) works on Windows now. I'd tried it a few years ago with no luck, so this was exciting news to hear.

It was a pretty smooth experience, but with a few gotchas I wanted to document.

First, download and run the LLVM installer for Windows: https://llvm.org/builds/

Then download and install the VS extension if you're a Visual Studio 2017 user like I am.

It's now very easy to use Clang to build your existing MSVC projects since there's a cl-compatible frontend, clang-cl:
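As a sketch (file and output names are placeholders, and exact flags depend on your project), building a translation unit with ASAN enabled through clang-cl looks something like this:

```shell
# clang-cl accepts MSVC-style flags: /Zi emits debug info so ASAN reports
# get symbolized stack traces; -fsanitize=address enables the sanitizer.
clang-cl /Zi /MT -fsanitize=address main.c /Femain.exe
main.exe
```

The `-fsanitize=address` flag also takes care of linking the ASAN runtime when clang-cl drives the link step.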

import libc {...}

struct Indexer {
    get: func(data: void*, a: void const*, x: void const*, len: usize, stride: usize, size: usize): usize;
    put: func(data: void*, a: void const*, x: void const*, len: usize, stride: usize, size: usize): usize;
    del: func(data: void*, a: void const*, x: void const*, len: usize, stride: usize, size: usize): usize;
    set: func(data: void*, a: void const*, x: void const*, xi: usize, len: usize, stride: usize, size: usize);
    free: func(data: void*);
}
func test_dynamic_arrays() {
    a: int*;
    apush(a, 42);
    apush(a, 36);
    len := alen(a);
    cap := acap(a);
    for (i := 0; i < alen(a); i++) {
        printf("a[%d] = %d\n", i, a[i]);
    }
    asetcap(a, 1);
}

A quadratic space is a real vector space V with a quadratic form Q(x), e.g. V = R^n with Q as the squared length. The Clifford algebra Cl(V) of a quadratic space is the associative algebra that contains V and satisfies x^2 = Q(x) for all x in V. We're imposing by fiat that the square of a vector should be the quadratic form's value and seeing where it takes us. Treat x^2 = Q(x) as a symbolic rewriting rule that lets you replace x^2 or x x with Q(x) and vice versa whenever x is a vector. Beyond that Cl(V) satisfies the standard axioms of an algebra: it lets you multiply by scalars, it's associative and distributive, but not necessarily commutative.

Remarkably, this is all you need to derive everything about Clifford algebras.

Let me show you how easy it is to bootstrap the theory from nothing.

We know Cl(V) contains a copy of V. Since x^2 = Q(x) for all x, it must also contain a copy of some nonnegative reals.
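For instance (a standard first step, spelled out here for concreteness), apply the rule to the vector x + y:

(x + y)^2 = x^2 + x y + y x + y^2 = Q(x + y)

so x y + y x = Q(x + y) - Q(x) - Q(y), which is twice the symmetric bilinear form obtained from Q by polarization. In particular, orthogonal vectors anticommute: x y = -y x.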

def rotate(image, cos_angle, sin_angle):
    if abs(cos_angle) < abs(sin_angle):
        image, cos_angle, sin_angle = image.T, -sin_angle, cos_angle
    inv_cos_angle = 1/cos_angle
    yshear(image, -sin_angle)
    xyscale(image, cos_angle, inv_cos_angle)
    xshear(image, sin_angle * inv_cos_angle)
pervognsen / exp.md (last active January 13, 2024)

I came across Fabian's nice old blog post on quaternion differentiation:

https://fgiesen.wordpress.com/2012/08/24/quaternion-differentiation/

I wanted to write a quick note on some of the broader context, which hopefully makes the quaternion case look less special.

Given any associative algebra where you can define what exp means, it's always true that d/dt exp(at) = a exp(at), which means the unique solution of x' = ax is x(t) = exp(at) x(0) = exp(a)^t x(0). In fact, it's true even if you work with formal power series, where you treat t as a formal symbol and interpret differentiation as the operator that shifts t^n down to n t^(n-1).
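As a concrete sketch (my own illustration, with a truncated Taylor series standing in for a library matrix exponential), here's a numerical check of d/dt exp(at) = a exp(at) for a 2x2 matrix a:

```python
import numpy as np

def mat_exp(a, terms=40):
    # exp(a) via its Taylor series; adequate for a small, well-scaled matrix.
    result = np.eye(a.shape[0])
    term = np.eye(a.shape[0])
    for k in range(1, terms):
        term = term @ a / k
        result = result + term
    return result

a = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # generator of 2D rotations: exp(a t) rotates by angle t

t, h = 0.7, 1e-5
deriv = (mat_exp(a * (t + h)) - mat_exp(a * (t - h))) / (2 * h)  # central difference
print(np.allclose(deriv, a @ mat_exp(a * t), atol=1e-8))
```

For this particular a, exp(a t) is the rotation matrix by angle t, so the check doubles as a sanity test of mat_exp itself.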

from math import isinf

# product() is used when precomputing the barycentric weights,
# w_i = 1/product(x_i - x_j for j != i).
def product(xs):
    p = 1
    for x in xs:
        p *= x
    return p

# O(n) time per interpolation once the weights are computed for the sample points.
def interp(xs, ws, fs, x):
    # Assumed: the intended scheme is the standard second barycentric form of
    # Lagrange interpolation. If x coincides with a sample point, w/(x - xi) is
    # infinite and the interpolant reduces to that sample's value.
    ts = [float('inf') if x == xi else w / (x - xi) for w, xi in zip(ws, xs)]
    for t, f in zip(ts, fs):
        if isinf(t):
            return f
    return sum(t * f for t, f in zip(ts, fs)) / sum(ts)

I've been reading this much-publicized paper on neural ordinary differential equations:

https://arxiv.org/abs/1806.07366

I found their presentation of the costate/adjoint method to be lacking in intuition and notational clarity, so I decided to write up my own tutorial treatment. I'm familiar with this material from its original setting in optimal control theory.

You have a dynamical system described by an autonomous first-order ODE, x' = f(x), where the state x belongs to an n-dimensional vector space. There is a value function V(x) defined over the state space. Given a particular path t -> x(t) satisfying the ODE, we may evaluate it at the terminal time T to get the terminal state x(T) and the terminal value V(x(T)).
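As an illustration of this setup (my own sketch, not the paper's code: f, V, and the forward-Euler discretization are placeholder choices), the gradient of the terminal value with respect to the initial state can be computed with one forward pass and one backward costate pass:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, -0.1]])  # a simple linear vector field, chosen for the demo

def f(x):
    return A @ x

def jac_f(x):
    return A  # Jacobian of f; constant because f is linear here

def terminal_value_and_gradient(x0, T=1.0, n=2000):
    # Forward pass: Euler steps x_{k+1} = x_k + h f(x_k), storing the trajectory.
    h = T / n
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] + h * f(xs[-1]))
    V = 0.5 * np.dot(xs[-1], xs[-1])  # V(x) = |x|^2 / 2, so grad V(x) = x
    # Backward pass: the discrete costate recursion a_k = (I + h J(x_k))^T a_{k+1},
    # started from a_n = grad V(x(T)); a_0 is exactly dV/dx0 for this discretization.
    a = xs[-1]
    for k in range(n, 0, -1):
        a = a + h * jac_f(xs[k - 1]).T @ a
    return V, a
```

Because the backward recursion is the exact adjoint of the forward Euler steps, the returned gradient matches a finite-difference estimate of the discretized terminal value to high accuracy.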

import numpy as np

def random_sphere_point(n):
    # Any isotropic distribution would do, but isotropic multivariate normals
    # are just a product of univariate normals.
    x = np.random.randn(n)
    return x / np.linalg.norm(x)

def random_ball_point(n):
    # The volume cdf scales as r^n, hence the inverse cdf scales as r^(1/n).
    return random_sphere_point(n) * np.random.random()**(1/n)
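A quick sanity check of the radius law (my own usage sketch): if r is the norm of a ball sample, r^n should be uniform on [0, 1], so its empirical mean should be close to 1/2.

```python
import numpy as np

def random_sphere_point(n):
    x = np.random.randn(n)
    return x / np.linalg.norm(x)

def random_ball_point(n):
    return random_sphere_point(n) * np.random.random()**(1/n)

n, trials = 5, 20000
rs = np.array([np.linalg.norm(random_ball_point(n)) for _ in range(trials)])
print(np.all(rs <= 1.0))                 # every sample lies inside the unit ball
print(abs(np.mean(rs**n) - 0.5) < 0.02)  # r^n is uniform on [0, 1], mean ~ 1/2
```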