Victoria X Lin todpole3

πŸ„β€β™€οΈ
Focusing
View GitHub Profile
@todpole3
todpole3 / latency.txt
Created August 27, 2023 05:13 — forked from jboner/latency.txt
Latency Numbers Every Programmer Should Know
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                           0.5 ns
Branch mispredict                            5   ns
L2 cache reference                           7   ns                      14x L1 cache
Mutex lock/unlock                           25   ns
Main memory reference                      100   ns                      20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy             3,000   ns        3 us
Send 1K bytes over 1 Gbps network       10,000   ns       10 us
Read 4K randomly from SSD*             150,000   ns      150 us          ~1GB/sec SSD
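The multipliers in the right-hand column follow directly from the nanosecond figures; a quick sanity check (figures copied from the table above):

```python
# Nanosecond latencies from the table above.
latency_ns = {
    "L1 cache reference": 0.5,
    "L2 cache reference": 7,
    "Main memory reference": 100,
    "Send 1K bytes over 1 Gbps network": 10_000,
}

l1 = latency_ns["L1 cache reference"]
print(latency_ns["L2 cache reference"] / l1)     # 14.0  -> "14x L1 cache"
print(latency_ns["Main memory reference"] / l1)  # 200.0 -> "200x L1 cache"
# The middle column is just the ns figure divided by 1,000 (ns -> us):
print(latency_ns["Send 1K bytes over 1 Gbps network"] / 1_000)  # 10.0 us
```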
todpole3 / transformer_decoder_parameters.py
Last active March 2, 2023 00:28
Calculate the total number of parameters in a Transformer Decoder
"""
Calculate the total number of parameters in a Transformer Decoder.
Usage:
# OPT 125M
python3 transformer_decoder_parameters.py --num-decoder-layers 12 --hidden-dim 768 --num-heads 12 --vocab-size 50272 --sequence-len 2048 --use-learned-pos-emb True
# OPT 350M
python3 transformer_decoder_parameters.py --num-decoder-layers 24 --hidden-dim 1024 --num-heads 16 --vocab-size 50272 --sequence-len 2048 --use-learned-pos-emb True
# OPT 1.3B
python3 transformer_decoder_parameters.py --num-decoder-layers 24 --hidden-dim 2048 --num-heads 32 --vocab-size 50272 --sequence-len 2048 --use-learned-pos-emb True
"""
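The script body isn't shown in this listing, but the count it computes can be sketched from the standard decoder architecture. A minimal version, assuming a 4x FFN expansion, pre/post LayerNorms with scale and shift, biases on all projections, and an output projection tied to the token embedding (all true of OPT; the function name is mine):

```python
def decoder_params(num_layers, d, vocab_size, seq_len, use_learned_pos_emb=True):
    """Count parameters in a decoder-only Transformer (OPT-style assumptions)."""
    # Token embedding (assumed tied with the output projection, as in OPT)
    emb = vocab_size * d
    # Learned absolute positional embedding, one vector per position
    pos = seq_len * d if use_learned_pos_emb else 0
    # Self-attention: Q, K, V, and output projections, each d*d weights + d biases
    attn = 4 * (d * d + d)
    # Feed-forward with 4x expansion: d->4d and 4d->d linears, weights + biases
    ffn = (d * 4 * d + 4 * d) + (4 * d * d + d)
    # Two LayerNorms per block, each with scale and shift vectors
    ln = 2 * 2 * d
    per_layer = attn + ffn + ln
    # Final LayerNorm after the last block
    final_ln = 2 * d
    return emb + pos + num_layers * per_layer + final_ln

total = decoder_params(num_layers=12, d=768, vocab_size=50272, seq_len=2048)
print(f"{total:,}")  # 125,237,760 -- about 125M for the OPT-125M configuration
```

Note that the head count does not appear in the formula: splitting `d` across more heads reshapes the Q/K/V projections but does not change their parameter count.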
todpole3 / reproduce_boto3_character_encoding_bug.py
Last active January 27, 2020 01:13
Reproducing the botocore.exceptions.ClientError raised for HIT pages that contain emoji characters
"""
Code adapted from
https://github.com/aws-samples/mturk-code-samples/blob/master/Python/CreateHitSample.py
Package version:
boto3==1.10.45
Usage:
python3 reproduce_boto3_character_encoding_bug.py --aws_credentials /tmp/accessKeys.csv
"""
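The repro script itself isn't shown in this listing. As a rough sketch of what it would send (the helper names and credentials-CSV layout are my assumptions; the XML envelope follows MTurk's published HTMLQuestion schema), it would load the console-exported keys and build a HIT question whose HTML contains an emoji:

```python
import csv

def load_aws_credentials(path):
    # The AWS console exports accessKeys.csv with the header
    # "Access key ID,Secret access key" (assumed layout)
    with open(path, newline="") as f:
        row = next(csv.DictReader(f))
    return row["Access key ID"], row["Secret access key"]

def html_question(body_html):
    # Wrap an HTML page in MTurk's HTMLQuestion XML envelope.
    # Emoji in body_html are 4-byte UTF-8 sequences outside the Basic
    # Multilingual Plane -- the characters this repro targets.
    return (
        '<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">'
        "<HTMLContent><![CDATA[" + body_html + "]]></HTMLContent>"
        "<FrameHeight>450</FrameHeight>"
        "</HTMLQuestion>"
    )

question = html_question("<p>Rate this caption: \N{GRINNING FACE}</p>")
```

The resulting `question` string would then be passed as the `Question` argument of the boto3 MTurk client's `create_hit` call, which is where the ClientError surfaced under boto3==1.10.45.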