@christophemarois
christophemarois / push-remote.sh
Last active March 1, 2020 09:31
Push to a remote non-bare git directory
# on the remote: create a normal (non-bare) repo that can accept pushes
mkdir my-repo
cd my-repo
git init
# allow pushes to the currently checked-out branch to update the working tree
git config receive.denyCurrentBranch updateInstead

# on the local
git remote add [remote-name] ssh://[ssh profile name]/[path to repo on the remote]
git push [remote-name] [branch-name]
@jimsrc
jimsrc / gpt4_text_compression.md
Last active June 18, 2024 15:03
Minimizing token usage when interacting with GPT-4.

Overview

I just read about this trick for text compression. It saves tokens in subsequent interactions during a long conversation, or when passing a long text to summarize later.

SHORT VERSION:

It's useful to build a mapping from common words (or phrases) in a long text that you intend to pass later to shorter codes. Then pass that long text to GPT-4, but encoded with that mapping (together with the mapping itself, so the model can decode it). The idea is that the encoded version contains fewer tokens than the original text. There are several algorithms to identify frequent words or phrases inside a given text, such as NER, TF-IDF, part-of-speech (POS) tagging, etc.; a minimal sketch follows below.
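
Below is a minimal sketch of the idea (my own illustration, not code from the gist). It uses a plain word-frequency count as a stand-in for the fancier NER/TF-IDF selection, maps each frequent word to a short code such as ~0, and uses the tiktoken library to check that the encoded prompt really contains fewer tokens. The helper names (build_mapping, compress) and the top_n parameter are hypothetical.

# Sketch: frequency-based word mapping to cut token count (illustrative only)
from collections import Counter
import re

import tiktoken  # pip install tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # the encoding GPT-4 uses

def build_mapping(text, top_n=10):
    # Hypothetical helper: map the most frequent longer words to codes ~0, ~1, ...
    words = re.findall(r"\b\w{6,}\b", text.lower())  # short words rarely pay off
    common = [w for w, _ in Counter(words).most_common(top_n)]
    return {w: f"~{i}" for i, w in enumerate(common)}

def compress(text, mapping):
    # Replace each mapped word with its short code, case-insensitively
    for word, code in mapping.items():
        text = re.sub(rf"\b{re.escape(word)}\b", code, text, flags=re.IGNORECASE)
    return text

long_text = "..."  # the long text you intend to pass later
mapping = build_mapping(long_text)
encoded = compress(long_text, mapping)

# Include the mapping in the prompt so GPT-4 can decode the text
legend = "; ".join(f"{code} = {word}" for word, code in mapping.items())
prompt = f"In the text below, {legend}.\n\n{encoded}"

print(len(enc.encode(long_text)), "->", len(enc.encode(prompt)))  # token counts

Whether this pays off depends on the text: the legend itself costs tokens, so the mapping only helps when the replaced words are long and occur often.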