```python
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "click",
#     "mistralai",
#     "markdown",
#     "requests",
#     "beautifulsoup4",
# ]
# ///
```
```haskell
import Control.Monad (when)
import Data.Char (chr, ord)
import Data.IORef
import Data.Word
import Debug.Trace
import System.IO.Unsafe (unsafePerformIO)
import Text.Parsec hiding (State)
import qualified Data.IntMap.Strict as IntMap
import qualified Data.Map as Map
import qualified Text.Parsec as Parsec
```
Theorem proving is the ability to solve a mathematical problem. A computer program capable of competently doing that would immediately unlock the automation of every intellectual task a human can perform, because all such problems can be reduced to solving abstract equations. From the discovery of new physics to recursive self-improvement, the consequences would be unfathomable.
Company:
- HOC: Towards an Optimal Computer: why interaction paradigm can improve computers.
- HOC: Can we build an optimal processor?: like above, but as a tweet.
- HOC: Complete Historical Overview: the lore behind HOC's creation.
Theory:
- Interaction Combinators - Y.Lafont 1997: a concurrent model of computation
- Symmetric Interaction Combinators - Damiano Mazza 2009: a symmetric variant (used by HVM)
- The Optimal Implementation of Functional Programming Languages - Asperti & Guerrini: the best book to learn the theory
- [BOHM](https://www.cambridge.org/core/services/aop-cambridge-core/content/)
I am investigating how to use Bend (a parallel language) to accelerate Symbolic AI; in particular, Discrete Program Search. Basically, think of it as an alternative to LLMs, GPTs, and NNs that is also capable of generating code, but by entirely different means. This kind of approach was never scaled with mass compute before - it wasn't possible! - but Bend changes that. So, my idea was to do it, and see where it goes.
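To make "Discrete Program Search" concrete, here is a toy sketch in Python (not Bend): enumerate every program in a tiny invented DSL, shortest first, and return the first one consistent with a set of input/output examples. The three operations are assumptions made up for this illustration.

```python
from itertools import product

# A tiny hypothetical DSL: a program is a sequence of named unary ops.
OPS = {"inc": lambda x: x + 1, "dbl": lambda x: x * 2, "neg": lambda x: -x}

def run(prog, x):
    """Execute a program (a tuple of op names) on an input value."""
    for op in prog:
        x = OPS[op](x)
    return x

def search(examples, max_len=3):
    """Brute-force search: try every program up to max_len ops, shortest first,
    and return the first one that fits all (input, output) examples."""
    for n in range(1, max_len + 1):
        for prog in product(OPS, repeat=n):
            if all(run(prog, i) == o for i, o in examples):
                return prog
    return None

print(search([(1, 4), (3, 8)]))  # finds ("inc", "dbl"): (x + 1) * 2
```

Real program search scales this same loop to vastly larger program spaces, which is exactly where mass compute (or sharing, below) matters.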
Now, while I was implementing some candidate algorithms in Bend, I realized that, rather than mass parallelism, I could use an entirely different mechanism to speed things up: SUP nodes. It is a feature that Bend inherited from its underlying model ("Interaction Combinators") that allows us to combine multiple functions into a single superposed one and apply them all to an argument "at the same time". In short, it allows us to call N functions at a fraction of the expected cost. Or, put differently: why parallelize when we can share?
Develop an AI prompt that solves random 12-token instances of the A::B problem (defined below), with a 90%+ success rate.
We'll use your prompt as the SYSTEM PROMPT, and a specific instance of the problem as the PROMPT, inside XML tags. Example:
```
A::B is a system with 4 tokens: `A#`, `#A`, `B#` and `#B`.

An A::B program is a sequence of tokens. Example:

    B# A# #B #A B#

To *compute* a program, we must rewrite neighbor tokens, using the rules:

    A# #A ... becomes ... nothing
    A# #B ... becomes ... #B A#
```
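The rewrite process above can be sketched as a small Python evaluator. Note that only the two rules quoted in the excerpt are encoded here; the full challenge defines additional rules for the `B#`/`#B` pairs, which are not reproduced below.

```python
# The two rewrite rules quoted above, as (left pair) -> replacement.
RULES = {
    ("A#", "#A"): [],            # A# #A  becomes  nothing
    ("A#", "#B"): ["#B", "A#"],  # A# #B  becomes  #B A#
}

def step(tokens):
    """Apply the first matching rule to a neighboring pair, left to right."""
    for i in range(len(tokens) - 1):
        pair = (tokens[i], tokens[i + 1])
        if pair in RULES:
            return tokens[:i] + RULES[pair] + tokens[i + 2:], True
    return tokens, False

def compute(tokens):
    """Rewrite until no rule applies (the program's normal form)."""
    changed = True
    while changed:
        tokens, changed = step(tokens)
    return tokens

print(compute(["A#", "#B", "#A"]))  # ["#B"]: A# commutes past #B, then A# #A cancels
```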
```bash
#!/bin/bash
# Toggle GNOME's Night Light on or off.
setting=$(gsettings get org.gnome.settings-daemon.plugins.color night-light-enabled)
if [[ $setting == "true" ]]; then
    gsettings set org.gnome.settings-daemon.plugins.color night-light-enabled false
else
    gsettings set org.gnome.settings-daemon.plugins.color night-light-enabled true
fi
```
| """ | |
| A bare bones examples of optimizing a black-box function (f) using | |
| Natural Evolution Strategies (NES), where the parameter distribution is a | |
| gaussian of fixed standard deviation. | |
| """ | |
| import numpy as np | |
| np.random.seed(0) | |
| # the function we want to optimize |
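The snippet cuts off right before defining `f` and the optimization loop. A minimal self-contained sketch of how such an example typically continues is below; the quadratic objective, the `solution` target, and the hyperparameters are all assumptions for illustration, not part of the original.

```python
import numpy as np

np.random.seed(0)
solution = np.array([0.5, 0.1, -0.3])  # assumed target of the toy objective

def f(w):
    # toy black-box objective: higher is better, maximized at `solution`
    return -np.sum((w - solution) ** 2)

npop, sigma, alpha = 50, 0.1, 0.001  # population size, noise std, learning rate
w = np.random.randn(3)               # random initial parameter vector
f0 = f(w)

for _ in range(300):
    N = np.random.randn(npop, 3)                  # gaussian perturbations
    R = np.array([f(w + sigma * n) for n in N])   # score each perturbed candidate
    A = (R - R.mean()) / R.std()                  # standardize rewards
    w = w + alpha / (npop * sigma) * (N.T @ A)    # estimated-gradient ascent step
```

Each step estimates the gradient of the expected reward from the population's standardized scores, so `f(w)` improves without ever differentiating `f` itself.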