An overview of algebraic laws from the perspective of a functional programmer. I've used a Haskell-esque syntax for the definitions of the laws and examples.
| name | definition | example |
|---|---|---|
| Identity ¹ | `f x = x` | `add 0 42 = 42` |
| Involution | `f (f x) = x` | `not (not False) = False` |
| Idempotence | `f (f x) = f x` | `toUpperCase (toUpperCase "hi") = toUpperCase "hi"` |
| Commutativity ² | `f a b = f b a` | `add 13 37 = add 37 13` |
| Commutativity ² | `f (g x) = g (f x)` | `add 13 (add 37 42) = add 37 (add 13 42)` |
| Associativity | `f a (f b c) = f (f a b) c` | `add 13 (add 37 42) = add (add 13 37) 42` |
| De Morgan's | `f (g a b) = h (f a) (f b)` | `not (and True False) = or (not True) (not False)` |
| Distributivity ¹ | `f (g a b) = g (f a) (f b)` | `mul 42 (add 13 37) = add (mul 42 13) (mul 42 37)` |
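As a rough sanity check, the following Haskell sketch spot-checks a few of these laws on sample inputs. The helpers `add`, `mul`, and `toUpperCase` are hypothetical local definitions introduced only for this example.

```haskell
import Data.Char (toUpper)

-- Hypothetical helpers matching the table's vocabulary.
add, mul :: Int -> Int -> Int
add = (+)
mul = (*)

toUpperCase :: String -> String
toUpperCase = map toUpper

-- Each equality from the table becomes an (==) comparison on sample inputs.
checks :: [Bool]
checks =
  [ add 0 42 == 42                                     -- Identity
  , not (not False) == False                           -- Involution
  , toUpperCase (toUpperCase "hi") == toUpperCase "hi" -- Idempotence
  , add 13 37 == add 37 13                             -- Commutativity of operands
  , add 13 (add 37 42) == add (add 13 37) 42           -- Associativity
  , mul 42 (add 13 37) == add (mul 42 13) (mul 42 37)  -- Distributivity
  ]

main :: IO ()
main = print (and checks)  -- prints True
```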
I'm using unusual, generalized definitions for the Identity and Distributivity laws that allow for their application to unary operations. However, these laws are usually specialized to binary operations:

- Identity typically requires a binary operation bound to an "identity element" to produce the identity function within its domain. For example, `0` is the identity element for `add` because `add 0 = id` for all number inputs (see the sketch after this list).
- Distributivity is typically defined with a binary operation for `f`, like in the multiplication example above. However, since one of the operands is constant, and we're defining the law using unary functions anyway, I simplified its notation and expanded it to cover unary operations too.
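A minimal sketch of the identity-element point, again assuming a hypothetical local `add`; it only spot-checks `add 0` against `id` on a few sample inputs rather than proving the equality.

```haskell
-- Spot check (not a proof) that the partially applied `add 0` behaves like `id`.
add :: Int -> Int -> Int
add = (+)

identityElementCheck :: Bool
identityElementCheck = all (\x -> add 0 x == id x) [-3, 0, 7, 42]

main :: IO ()
main = print identityElementCheck  -- prints True
```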
There are two definitions of commutativity. Two operands commute when they can be supplied in either order (also known as Symmetry), whereas two operations commute when they can be executed in either order. Another way to think of commutativity of operations is that the composition operator (`B`) is commutative for operations `f` and `g`: `B f g = B g f`.
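For instance, the sketch below (with `B` lowercased to `b`, since Haskell function names must start with a lower-case letter) shows one pair of operations that commutes and one that does not.

```haskell
-- b plays the role of the B combinator (function composition).
b :: (y -> z) -> (x -> y) -> x -> z
b f g x = f (g x)

-- (+ 1) and (+ 2) commute: composing them in either order gives the same result.
commuting :: Bool
commuting = all (\x -> b (+ 1) (+ 2) x == b (+ 2) (+ 1) x) [0 .. 10 :: Int]

-- abs and negate do not commute.
nonCommuting :: Bool
nonCommuting = b negate abs (-5 :: Int) /= b abs negate (-5)

main :: IO ()
main = print (commuting && nonCommuting)  -- prints True
```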
In the law definitions below, I'm using the following function combinators to improve terseness:
```
I x = x
T x f = f x
B f g x = f (g x)
```
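In plain Haskell these correspond to `id`, `flip ($)`, and `(.)`; a small sketch for reference, with the names lowercased to satisfy Haskell's naming rules:

```haskell
i :: a -> a
i = id          -- I x = x

t :: a -> (a -> b) -> b
t = flip ($)    -- T x f = f x

b :: (y -> z) -> (x -> y) -> x -> z
b = (.)         -- B f g x = f (g x)

main :: IO ()
main = print (t 41 (b i (+ 1)))  -- prints 42
```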
| structure | law | definition |
|---|---|---|
| Setoid (`eq`) | Reflexivity | `eq s s` |
| | Symmetry ¹ | `eq a b = eq b a` |
| | Transitivity | `if and (eq a b) (eq b c) then eq a c` |
| Ord (Setoid + `lte`) | Totality | `or (lte a b) (lte b a)` |
| | Antisymmetry | `if and (lte a b) (lte b a) then eq a b` |
| | Transitivity | `if and (lte a b) (lte b c) then lte a c` |
| Semigroupoid (`compose`) | Associativity | `compose f (compose g h) = compose (compose f g) h` |
| Category (Semigroupoid + `id`) | Left Identity | `compose id f = f` |
| | Right Identity | `compose f id = f` |
| Semigroup (`cat`) | Associativity | `cat a (cat b c) = cat (cat a b) c` |
| Monoid (Semigroup + `empty`) | Left Identity | `cat empty s = s` |
| | Right Identity | `cat s empty = s` |
| Group (Monoid + `invert`) | Left Inverse | `cat (invert s) s = empty` |
| | Right Inverse | `cat s (invert s) = empty` |
| Contravariant (`cmap`) | Identity | `cmap id s = s` |
| | Composition | `cmap (B g h) = B (cmap h) (cmap g)` |
| Functor (`fmap`) | Identity | `fmap id s = s` |
| | Composition ² | `fmap (B g h) = B (fmap g) (fmap h)` |
| Extend (Functor + `extend`) | Associativity ³ | `extend f (extend g w) = extend (B f (extend g)) w` |
| Comonad (Extend + `extract`) | Left Identity | `extend extract s = s` |
| | Right Identity | `extract (extend f s) = f s` |
| Apply (Functor + `ap`) | Composition | `ap a (ap b (fmap B c)) = ap (ap a b) c` |
| Applicative (Apply + `of`) | Identity | `ap s (of I) = s` |
| | Homomorphism | `ap (of x) (of f) = of (f x)` |
| | Interchange | `ap (of x) s = ap s (of (T x))` |
| | Functor Derivability | `ap s (of f) = fmap f s` |
| Alt (Functor + `alt`) | Associativity | `alt a (alt b c) = alt (alt a b) c` |
| | Distributivity | `fmap f (alt a b) = alt (fmap f a) (fmap f b)` |
| Plus (Alt + `zero`) | Left Identity | `alt zero s = s` |
| | Right Identity | `alt s zero = s` |
| | Annihilation | `fmap f zero = zero` |
| Alternative (Applicative + Plus) | Distributivity | `ap s (alt f g) = alt (ap s f) (ap s g)` |
| | Annihilation | `ap s zero = zero` |
| Chain (Apply + `chain`) | Associativity ³ | `chain g (chain f s) = chain (B (chain g) f) s` |
| Monad (Applicative + Chain) | Left Identity ³ | `chain f (of x) = f x` |
| | Right Identity | `chain of s = s` |
| | Functor Derivability | `chain (B of f) s = fmap f s` |
| | Applicative Derivability | `chain (B (T a) fmap) b = ap a b` |
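As an illustration, here is a sketch that spot-checks a handful of these laws for concrete instances (lists as a Monoid, `Maybe` as a Functor and Monad). The translation of the table's vocabulary into Prelude names (`cat` as `(<>)`, `empty` as `mempty`, `of` as `pure`, `chain` as `(=<<)`) is my own mapping, not something defined above.

```haskell
monoidLaws :: Bool
monoidLaws =
  let s = [1, 2, 3] :: [Int]
  in  (mempty <> s) == s && (s <> mempty) == s  -- Left/Right Identity

functorLaws :: Bool
functorLaws =
  let s = Just 5 :: Maybe Int
  in  fmap id s == s                                              -- Identity
        && fmap ((+ 1) . (* 2)) s == (fmap (+ 1) . fmap (* 2)) s  -- Composition

monadLaws :: Bool
monadLaws =
  let f x = Just (x + 1)
      s = Just 5 :: Maybe Int
  in  (f =<< pure 5) == f 5   -- Left Identity
        && (pure =<< s) == s  -- Right Identity

main :: IO ()
main = print (monoidLaws && functorLaws && monadLaws)  -- prints True
```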
I think Symmetry and Commutativity are different terms for the same thing, just evolved in different contexts: Symmetry is typically used when talking about "binary relations", whereas Commutativity is used when talking about "binary operations". From Commutative Property, Wikipedia:

> A binary operation is commutative if changing the order of the operands does not change the result. A binary relation is said to be symmetric if the relation applies regardless of the order of its operands. (emphasis mine)

Both binary operations and binary relations can be expressed as functions.
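A tiny sketch of that last point: an operation and a relation written as ordinary Haskell functions, differing only in result type, so Commutativity and Symmetry share the shape `f a b = f b a`.

```haskell
addOp :: Int -> Int -> Int   -- a binary operation as a function
addOp = (+)

eqRel :: Int -> Int -> Bool  -- a binary relation as a function
eqRel = (==)

main :: IO ()
main = print (addOp 13 37 == addOp 37 13 && eqRel 1 2 == eqRel 2 1)  -- prints True
```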
It looks to me like it's just Distributivity, e.g. `fmap` distributes across `B`.
I asked about this on StackExchange Mathematics, and at the time of writing, the question is yet to be answered.
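A quick spot check of that reading, using lists as the Functor:

```haskell
-- fmap "distributes" across composition: fmap (g . h) = fmap g . fmap h.
main :: IO ()
main =
  print (fmap ((+ 1) . (* 2)) [1, 2, 3 :: Int]
           == (fmap (+ 1) . fmap (* 2)) [1, 2, 3])  -- prints True
```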
The laws marked with a ³ don't exactly generalize to their corresponding General Laws. I think their names are chosen because they are "akin to" these laws when thinking of Monads as Monoids, as per the following quote from the Formal Definition of Monad on Wikipedia:

> The first axiom is akin to the associativity in monoids if we think of $\mu$ as the monoid's binary operation, and the second axiom is akin to the existence of an identity element (which we think of as given by $\eta$).
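Reading $\mu$ as `join` and $\eta$ as `return`, the two axioms can be spot-checked for a sample nested `Maybe` value (a sketch, not a proof):

```haskell
import Control.Monad (join)

sample :: Maybe (Maybe (Maybe Int))
sample = Just (Just (Just 5))

-- "Associativity": flattening the inner layer first or the outer layer first agrees.
associativityAxiom :: Bool
associativityAxiom = join (fmap join sample) == join (join sample)

-- "Identity": wrapping with return on either side and flattening is a no-op.
identityAxiom :: Bool
identityAxiom =
  let m = Just 5 :: Maybe Int
  in  join (fmap return m) == m && join (return m) == m

main :: IO ()
main = print (associativityAxiom && identityAxiom)  -- prints True
```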