@rtindru
Created May 21, 2021 00:07
Gaussian Floats - imagining a programming language that represents the uncertainties of the real world.
from scipy import stats

"""
Imagining a programming language where the basic numeric types are represented as
Gaussian distributions. It would make for a pretty interesting (read: hair-pulling)
programming experience. Gaussians are the default assumption in deep learning these
days, but in a way this may be how we should model the world around us: there are
no statics or constants, everything is a probability distribution.
"""
class GaussianFloat(object):
    """A float modeled as a Gaussian: a mean value plus a standard deviation (scale)."""

    def __init__(self, val, scale=None):
        self._val = float(val)
        self._scale = scale or 1.0
        self._dist = stats.norm(self._val, self._scale)

    def get(self):
        # Every read draws a fresh sample from the underlying distribution.
        return self._dist.rvs(size=1)[0]

    def __eq__(self, obj):
        return type(obj) == type(self) and obj._val == self._val and obj._scale == self._scale

    def __add__(self, obj):
        assert type(self) == type(obj)
        new_val = self._val + obj._val
        # Sum of independent Gaussians: variances add, so the new scale is
        # sqrt(scale_a**2 + scale_b**2).
        new_scale = (self._scale ** 2 + obj._scale ** 2) ** 0.5
        return self.__class__(new_val, new_scale)

    def __mul__(self, obj):
        assert type(self) == type(obj)
        new_val = self._val * obj._val
        # The product of two Gaussians is not itself Gaussian; approximate it with a
        # Gaussian whose variance matches the exact product variance for independent
        # variables: Var(XY) = mu_x**2 * var_y + mu_y**2 * var_x + var_x * var_y.
        new_var = (
            self._val ** 2 * obj._scale ** 2
            + obj._val ** 2 * self._scale ** 2
            + self._scale ** 2 * obj._scale ** 2
        )
        return self.__class__(new_val, new_var ** 0.5)