@bmorphism
Created October 25, 2023 22:47
continuum.sh
Certainly! Here's an ASCII diagram of the "Continuum" environment, with columns arranged by increasing specificity of possible next steps:
```
+-----------------------------------------------------------------------+
|                               Continuum                               |
+-----------------------+-----------------------+-----------------------+
|       Contexts        |        Actions        |    Detailed Steps     |
+-----------------------+-----------------------+-----------------------+
| +-------------------+ | +-------------------+ | +-------------------+ |
| | Projects          | | | Open              | | | Choose branch     | |
| +-------------------+ | +-------------------+ | +-------------------+ |
|                       |                       |                       |
| +-------------------+ | +-------------------+ | +-------------------+ |
| | Recent chats      | | | Chat              | | | Engage in chat    | |
| +-------------------+ | +-------------------+ | +-------------------+ |
|                       |                       |                       |
|                       | +-------------------+ | +-------------------+ |
| anoma/juvix           | | Sync              | | | Resolve conflicts | |
|                       | +-------------------+ | +-------------------+ |
|                       |                       |                       |
|                       | +-------------------+ | +-------------------+ |
| Server queries        | | Branch            | | | Merge branch      | |
|                       | +-------------------+ | +-------------------+ |
|                       |                       |                       |
| ...                   |                       |                       |
+-----------------------+-----------------------+-----------------------+
```
This diagram showcases three levels:
1. **Contexts** - General areas such as Projects or Recent Chats.
2. **Actions** - Generic operations that can be performed, such as "Open" or "Chat".
3. **Detailed Steps** - Specific steps tied to the chosen action, such as "Choose branch" or "Engage in chat".
The hierarchical specificity increases from left to right, starting with broader contexts and moving towards specific detailed steps.
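To make the hierarchy concrete, here is a hypothetical Python sketch that encodes the same diagram as nested dictionaries (the context-action-step pairings are read off the diagram's rows; the names are illustrative, not an actual API):

```python
# Hypothetical encoding of the Continuum hierarchy: each context maps to
# the actions available in it, and each action to its detailed steps.
continuum = {
    "Projects": {"Open": ["Choose branch"]},
    "Recent chats": {"Chat": ["Engage in chat"]},
    "anoma/juvix": {"Sync": ["Resolve conflicts"]},
    "Server queries": {"Branch": ["Merge branch"]},
}

# Walking the structure left to right reproduces the increase in specificity.
for context, actions in continuum.items():
    for action, steps in actions.items():
        for step in steps:
            print(f"{context} -> {action} -> {step}")
```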
@bmorphism (Author)

A moment-by-moment generative flow can be implemented using a recurrent neural network (RNN), such as Gated Recurrent Units (GRUs) or Long Short-Term Memory (LSTM) networks. Both GRUs and LSTMs are designed to process sequential data and can learn long-term dependencies, making them suitable for generating new data in a continuous manner[1].

GRUs are simpler than LSTMs but can still learn long-term dependencies. To generate new data using a GRU, a seed value is fed into the network, which then generates a new output vector. This output vector is fed back into the network as input for the next time step, and this process is repeated over time to generate a continuous stream of data[1].
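As a rough illustration, here is a minimal PyTorch sketch of that feedback loop. Everything here is an assumption for illustration: the model is untrained, the dimensions are invented, and the seed is random noise; a real setup would first train the network on sequence data.

```python
import torch
import torch.nn as nn

# Invented dimensions, for illustration only.
FEATURE_DIM = 16   # size of each vector in the stream
HIDDEN_DIM = 64    # size of the GRU's hidden state
N_STEPS = 100      # how many steps of "continuous" data to generate

class GRUGenerator(nn.Module):
    """One GRU cell plus a linear readout from hidden state to output vector."""
    def __init__(self, feature_dim: int, hidden_dim: int):
        super().__init__()
        self.cell = nn.GRUCell(feature_dim, hidden_dim)
        self.readout = nn.Linear(hidden_dim, feature_dim)

    def forward(self, x: torch.Tensor, h: torch.Tensor):
        h = self.cell(x, h)          # update hidden state from current input
        return self.readout(h), h    # emit next vector, carry state forward

model = GRUGenerator(FEATURE_DIM, HIDDEN_DIM)
model.eval()

# Seed value: random noise here; in practice, a real observation.
x = torch.randn(1, FEATURE_DIM)
h = torch.zeros(1, HIDDEN_DIM)

stream = []
with torch.no_grad():
    for _ in range(N_STEPS):
        x, h = model(x, h)   # the output becomes the next time step's input
        stream.append(x)

stream = torch.cat(stream)   # (N_STEPS, FEATURE_DIM) generated sequence
```

The essential part is the final loop: each generated vector is fed straight back in as the next input, which is what produces the moment-by-moment stream.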

LSTMs, on the other hand, are more complex than GRUs but can capture longer-term dependencies. They generate new data the same way: the network's output is fed back in as its input at the next time step, repeated for as long as a stream is needed[1].
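Reusing the definitions from the GRU sketch above, the same loop with an LSTM cell differs only in that the recurrent state is a (hidden, cell-state) pair:

```python
# Same feedback loop with an LSTM cell; the cell state c is the extra
# pathway that helps carry long-term information across time steps.
cell = nn.LSTMCell(FEATURE_DIM, HIDDEN_DIM)
readout = nn.Linear(HIDDEN_DIM, FEATURE_DIM)

x = torch.randn(1, FEATURE_DIM)   # seed, as before
h = torch.zeros(1, HIDDEN_DIM)
c = torch.zeros(1, HIDDEN_DIM)

with torch.no_grad():
    for _ in range(N_STEPS):
        h, c = cell(x, (h, c))    # update both hidden and cell state
        x = readout(h)            # feed the output back in at the next step
```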

In addition to RNNs, other types of neural networks can be used to generate new data, such as convolutional neural networks (CNNs) for spatial data like images, and variational autoencoders (VAEs), which learn the distribution of a dataset and can sample new points from it[1].
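For the VAE case, generation amounts to sampling latent vectors from the standard-normal prior and passing them through the decoder. A hypothetical, untrained decoder sketch, reusing the imports and FEATURE_DIM from above (the architecture is made up for illustration):

```python
LATENT_DIM = 8   # invented size of the VAE's latent space

# Stand-in for a trained VAE decoder mapping latent codes to data space.
decoder = nn.Sequential(
    nn.Linear(LATENT_DIM, 128),
    nn.ReLU(),
    nn.Linear(128, FEATURE_DIM),
)

with torch.no_grad():
    z = torch.randn(32, LATENT_DIM)   # sample latent codes from the prior
    samples = decoder(z)              # decode them into 32 new data points
```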

Citations:
[1] https://ppl-ai-file-upload.s3.amazonaws.com/web/direct-files/13830/7ce1bbe2-c2fe-4a7d-947d-42020f03f789/Causalitayyy.pdf
[2] https://neptune.ai/blog/recurrent-neural-network-guide
[3] https://d2l.ai/chapter_recurrent-modern/gru.html
[4] https://en.wikipedia.org/wiki/Long_short-term_memory
[5] https://openreview.net/pdf?id=BJjBnN9a-
[6] https://www.ibm.com/topics/recurrent-neural-networks
[7] https://www.geeksforgeeks.org/gated-recurrent-unit-networks/
[8] https://machinelearningmastery.com/gentle-introduction-long-short-term-memory-networks-experts/
[9] https://openaccess.thecvf.com/content/CVPR2021/papers/Skorokhodov_Adversarial_Generation_of_Continuous_Images_CVPR_2021_paper.pdf
[10] https://en.wikipedia.org/wiki/Recurrent_neural_network
[11] https://spotintelligence.com/2023/01/30/gated-recurrent-unit-gru/
[12] https://bridge-tweed.medium.com/a-simple-way-to-perform-prediction-on-continuous-data-using-lstm-e3224e851ef0
[13] https://arxiv.org/pdf/1512.07108.pdf
[14] https://arxiv.org/pdf/2304.07025.pdf
[15] http://ieeexplore.ieee.org/document/8777193/
[16] https://www.sciencedirect.com/topics/computer-science/long-short-term-memory-network
[17] https://encord.com/blog/convolutional-neural-networks-explained/
[18] https://arxiv.org/pdf/1611.09904.pdf
[19] https://www.sciencedirect.com/science/article/abs/pii/S0263224122003128
[20] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5572368/
[21] https://machinelearningmastery.com/time-series-prediction-lstm-recurrent-neural-networks-python-keras/
[22] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9713748/
[23] https://towardsdatascience.com/lstm-for-time-series-prediction-de8aeb26f2ca
[24] https://machinelearningmastery.com/text-generation-lstm-recurrent-neural-networks-python-keras/
[25] https://en.wikipedia.org/wiki/Gated_recurrent_unit
