Ondřej Sojka (ondrejsojka) · Brno, Czech Republic
t3dotgg / model-prices.csv
Last active July 13, 2025 15:49
Rough list of popular AI models and the cost to use them (cost is per 1M tokens)
Name,Input,Output
Gemini 2.0 Flash-Lite,$0.075,$0.30
Mistral 3.1 Small,$0.10,$0.30
Gemini 2.0 Flash,$0.10,$0.40
ChatGPT 4.1-nano,$0.10,$0.40
DeepSeek v3 (old),$0.14,$0.28
ChatGPT 4o-mini,$0.15,$0.60
Gemini 2.5 Flash,$0.15,$0.60
DeepSeek v3,$0.27,$1.10
Grok 3-mini,$0.30,$0.50
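
Since every price above is quoted per one million tokens, the cost of a single request is just (input_tokens × input_price + output_tokens × output_price) / 1,000,000. A minimal Python sketch of that arithmetic (the helper name and the sample token counts are illustrative, not part of the gist):

# Estimate the dollar cost of one request from per-1M-token prices.
def request_cost(input_tokens, output_tokens, input_price, output_price):
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example: 12,000 input tokens and 800 output tokens on Gemini 2.0 Flash ($0.10 / $0.40)
print(request_cost(12_000, 800, 0.10, 0.40))  # 0.00152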
adrienbrault / llama2-mac-gpu.sh
Last active April 8, 2025 13:49
Run Llama-2-13B-chat locally on your M1/M2 Mac with GPU inference. Uses 10GB RAM. UPDATE: see https://twitter.com/simonw/status/1691495807319674880?s=20
# Clone llama.cpp
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
# Build it
make clean
LLAMA_METAL=1 make
# Download model
export MODEL=llama-2-13b-chat.ggmlv3.q4_0.bin
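# --- Sketch of the remaining steps: the preview above is truncated, so this is
# --- an assumed continuation, not the gist's verbatim code. ---
# Download the quantized weights; the TheBloke/Llama-2-13B-chat-GGML Hugging Face
# repo is an assumption about where this file lives.
curl -L -o "models/${MODEL}" \
  "https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML/resolve/main/${MODEL}"
# Run inference with Metal GPU offload (-ngl moves layers to the GPU in
# llama.cpp builds of this era); the prompt and token count are illustrative.
./main -m "models/${MODEL}" -ngl 1 -n 256 -p "Hello! How are you?"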
t0mm4rx / impermax.py
Created January 20, 2022 11:02
Python script to fetch Impermax.finance position total collateral and debt
"""Impermax related functions.
imxc = Impermax collateral token, given in exchange for the actual LP pair.
slp = staked LP token.
"""
from providers import polygon
from chain_data import get_lp_pair_holdings, get_token_decimals
import json
imxc_abi = json.load(open("./abis/imxc.json", "rb"))
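
The preview cuts off right after the ABI is loaded. Purely as an illustration of where it is heading (not the gist's actual code), here is a hedged web3.py sketch that values a wallet's collateral, assuming the imxc contract exposes the ERC-20 balanceOf plus an exchangeRate method returning an 18-decimal fixed-point rate to the underlying staked LP token; the RPC URL, addresses, and ABI path are supplied by the caller.

import json
from web3 import Web3

def collateral_in_staked_lp(rpc_url, imxc_address, wallet, abi_path="./abis/imxc.json"):
    """Return a wallet's collateral expressed in raw staked-LP token units.

    Hedged sketch: assumes the imxc contract has balanceOf() and exchangeRate()
    (18-decimal fixed point), which is typical of Impermax collateral tokens but
    is not confirmed by the truncated snippet above.
    """
    w3 = Web3(Web3.HTTPProvider(rpc_url))
    imxc = w3.eth.contract(address=imxc_address, abi=json.load(open(abi_path, "rb")))
    balance = imxc.functions.balanceOf(wallet).call()   # collateral tokens held
    rate = imxc.functions.exchangeRate().call()         # collateral token -> staked LP
    return balance * rate // 10**18                     # still scaled by the LP token's decimals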