Bootstrap knowledge of LLMs ASAP, with a bias/focus toward GPT.
Avoid being a link dump; try to provide only valuable, well-tuned information.
Neural network links come before starting with transformers.
```sh
# Delete all forks that haven't been updated since 2020
gh auth refresh -h github.com -s delete_repo
gh search repos \
    --owner tonybaloney \
    --updated="<2020-01-01" \
    --include-forks=only \
    --limit 100 \
    --json url \
    --jq ".[] .url" \
    | xargs -I {} gh repo delete {} --confirm
```
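The first command re-authenticates gh with the `delete_repo` scope, which isn't granted by default; the search then selects only forks last updated before 2020 and pipes each URL through xargs to `gh repo delete`.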
This episode of Recsperts was transcribed with Whisper from OpenAI, an open-source neural net trained on almost 700,000 hours of audio. The model uses an encoder-decoder architecture: audio is split into 30-second chunks, converted to log-Mel spectrograms, and passed to the encoder, while the decoder is trained to predict the matching caption text. The model supports plain transcription, timestamp-aligned transcription, and multilingual translation.
The transcription process outputs a single block of text, so it's up to the end user to parse out individual speakers or to run the output through a secondary speaker-diarization step.
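For a sense of what that looks like in practice, here is a minimal sketch using the openai-whisper Python package; the model size and file name are placeholders, not the settings actually used for this episode.

```python
# Minimal sketch of transcribing an episode with the openai-whisper package.
# "base" and "episode.mp3" are placeholders, not the settings actually used
# for the Recsperts transcript.
import whisper

model = whisper.load_model("base")        # downloads weights on first use
result = model.transcribe("episode.mp3")  # chunks audio and runs the encoder-decoder

# The full transcript comes back as one string...
print(result["text"])

# ...with timestamp-aligned segments also available:
for segment in result["segments"]:
    print(f"{segment['start']:.1f}-{segment['end']:.1f}: {segment['text']}")
```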
```
-i https://pypi.org/simple
anyio==3.6.2; python_full_version >= '3.6.2'
certifi==2022.12.7; python_version >= '3.6'
click==8.1.3; python_version >= '3.7'
colorama==0.4.6
commonmark==0.9.1
h11==0.14.0; python_version >= '3.7'
httpcore==0.16.3; python_version >= '3.7'
httpx==0.23.1
idna==3.4
```
ChatGPT appeared like an explosion on all my social media timelines in early December 2022. While I keep up with machine learning as an industry, I wasn't focused so much on this particular corner, and all the screenshots seemed like they came out of nowhere. What was this model? How did the chat prompting work? What was the context of OpenAI doing this work and collecting my prompts for training data?
I decided to do a quick investigation. Here's all the information I've found so far. I'm aggregating and synthesizing it as I go, so it's currently changing pretty frequently.
PYTHON_VERSION=3.9.7, ENVIRONMENT=live, and SECRET_KEY.

```
twitter.com##[aria-label$="Trending now" i]
twitter.com##article [aria-label^="Recommended Topic" i]:upward(article)
```
```python
#!/usr/bin/env python
import argparse
import os
import subprocess
import sys
import time

# Required so we don't generate tons of logs during restore
disable_logging_sql = "ALTER USER postgres RESET pgaudit.log;"
```
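The rest of the script is cut off here. As a rough sketch of how that statement might be applied before a restore (the helper name and connection handling below are assumptions, not the original script's code), one option is to shell out to psql:

```python
import subprocess


# Hypothetical helper, not part of the original script: run a SQL statement
# against the target server with psql before kicking off the restore.
def run_sql(statement: str, dbname: str = "postgres") -> None:
    subprocess.run(
        ["psql", "--dbname", dbname, "--command", statement],
        check=True,
    )


# e.g. run_sql(disable_logging_sql) before starting the restore
```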
I got sick of writing the same Serializer and ModelViewSet classes over and over, so I found and wrote some code to do it for me, and somehow it works! Please note that there are a lot of caveats to running an API like this; while it may work, there's a lot of room for improvement, so feel free to fork and help!
Import the router module into your main site's urls.py file as the injection point, like so...
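A minimal sketch of that wiring, assuming the package exposes a ready-made router object; the module name `drf_auto_router` below is a placeholder, not this project's actual import path:

```python
# urls.py -- minimal sketch; "drf_auto_router" and "router" are placeholder
# names, so check the project's README for the real import path.
from django.urls import include, path

from drf_auto_router import router  # hypothetical module name

urlpatterns = [
    path("api/", include(router.urls)),
]
```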
Make sure to remove imports from any other viewsets you don't need, as they may conflict!
```js
// service worker
self.addEventListener("install", (event) => {
  event.waitUntil(
    caches.open("{{ cache_key }}").then((cache) =>
      cache.addAll([
        // {% for asset in assets %}
        "{{ asset }}",
        // {% endfor %}
      ])
    )
  );
});
```
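Because `{{ cache_key }}` and the `{% for %}` loop are template tags, the worker has to be rendered server-side and served with a JavaScript content type. A minimal sketch of a Django view that could do that follows; the template name, cache key, and asset list are placeholders, not this project's actual code.

```python
# views.py -- hypothetical sketch of serving the templated service worker.
# The template name, cache key, and asset list are placeholders.
from django.shortcuts import render


def service_worker(request):
    return render(
        request,
        "serviceworker.js",
        context={
            "cache_key": "static-v1",
            "assets": ["/static/app.css", "/static/app.js"],
        },
        content_type="application/javascript",
    )
```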