Andrew Drozdov mrdrozdov

@mrdrozdov
mrdrozdov / gpu-snapshot.txt
Created February 10, 2019 03:03
gpu-snapshot.txt
$ nvidia-smi
Sat Feb 9 22:03:30 2019
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 396.26                 Driver Version: 396.26                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 108...    On | 00000000:1A:00.0 Off |                  N/A |
| 49%   82C    P2   105W / 250W | 10467MiB / 11178MiB  |     59%      Default |
@mrdrozdov
mrdrozdov / kalpesh-intersect.py
Created February 6, 2019 23:09
kalpesh-intersect.py
"""
I have two tensors, a "reference" tensor of size (8192,) and a "query"
tensor of size (10000, 500). I want a binary mask of size (10000, 8192)
with 1s in all indices (i, j) where reference[ j ] exists in query[ i , : ].
Also, all values are integers and upper bounded by another integer.
"""
import torch
seed = 10
vocabsize = 300 # value isn't important, but it's more than refsize
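The preview cuts off before the solution. One way to build the mask under the stated constraints is a scatter into a vocabulary-sized presence table (a sketch, not the original gist's answer; sizes are shrunk here for illustration):

```python
import torch

torch.manual_seed(10)

refsize, querycount, querysize = 32, 100, 50
vocabsize = 300  # every value is an integer below this bound

reference = torch.randint(vocabsize, (refsize,))
query = torch.randint(vocabsize, (querycount, querysize))

# present[i, v] = 1 iff value v occurs anywhere in query[i, :]
present = torch.zeros(querycount, vocabsize, dtype=torch.long)
present.scatter_(1, query, 1)

# Index the presence table at the reference ids:
# mask[i, j] = 1 iff reference[j] occurs in query[i, :]
mask = present[:, reference]
```

The scatter costs O(querycount * querysize) and the gather O(querycount * refsize), so the vocabulary bound is what makes this cheap compared to a pairwise comparison.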
@mrdrozdov
mrdrozdov / kalpesh-interview-question.py
Created February 6, 2019 21:12
kalpesh-interview-question.py
"""
How to find intersection of big tensor and all small tensors?
"""
import torch
bigsize = 30
smallsize = 8
smallcount = 12
vocabsize = 100
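The gist stops at the setup. A hedged sketch of one broadcast-comparison answer (the names past the preview are assumptions, not the original code):

```python
import torch

torch.manual_seed(0)
bigsize, smallsize, smallcount, vocabsize = 30, 8, 12, 100

big = torch.randint(vocabsize, (bigsize,))
small = torch.randint(vocabsize, (smallcount, smallsize))

# match[k, i, j] = True iff small[k, i] == big[j]
match = small.unsqueeze(2) == big.view(1, 1, -1)

# hit[k, j] = True iff big[j] appears anywhere in small[k]
hit = match.any(dim=1)

# The intersection with each small tensor, as a list of 1-D tensors.
intersections = [big[hit[k]] for k in range(smallcount)]
```

This materializes a (smallcount, smallsize, bigsize) boolean tensor, fine at these sizes; for much larger inputs the presence-table trick from the sibling gist scales better.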
@mrdrozdov
mrdrozdov / ps.py
Created February 6, 2019 19:52
ps.py
import os

try:
    import psutil
except ImportError:
    psutil = None

def get_memory_used():
    if psutil is None:
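The preview ends mid-function. A plausible completion (the return value below is an assumption, not the original body) reports resident set size via psutil:

```python
import os

try:
    import psutil
except ImportError:
    psutil = None

def get_memory_used():
    """Resident set size of this process in bytes, or None without psutil."""
    if psutil is None:
        return None
    return psutil.Process(os.getpid()).memory_info().rss
```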
@mrdrozdov
mrdrozdov / debug-allennlp.py
Created February 6, 2019 18:00
debug-allennlp.py
import os

candidates = [
    'from allennlp.common.file_utils import cached_path',
    'from allennlp.common.checks import ConfigurationError',
    'from allennlp.common import Params',
    'from allennlp.common.util import lazy_groups_of',
    'from allennlp.modules.elmo_lstm import ElmoLstm',
    'from allennlp.modules.highway import Highway',
    'from allennlp.modules.scalar_mix import ScalarMix',
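The candidate list is cut off. Probe scripts like this usually finish by timing each import in turn, roughly as follows (the loop is an assumption, not the original; it degrades gracefully when allennlp is not installed):

```python
import time

candidates = [
    'from allennlp.common.file_utils import cached_path',
    'from allennlp.common import Params',
]

results = []
for stmt in candidates:
    start = time.time()
    try:
        exec(stmt)
        status = 'ok'
    except Exception as err:
        status = 'failed (%s)' % err
    results.append((stmt, time.time() - start, status))
    print('%6.3fs  %-50s  %s' % (results[-1][1], stmt, status))
```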
@mrdrozdov
mrdrozdov / ontontes-example.txt
Created February 1, 2019 17:33
ontontes-example.txt
nw/wsj/16/wsj_1657 0 0 Recovering VBG VBG 4 csubj _ recover 02 3 - O B-V B-ARG1 O O -
nw/wsj/16/wsj_1657 0 1 radiophonic JJ JJ 3 amod _ - - - - O B-ARG1 I-ARG1 O O -
nw/wsj/16/wsj_1657 0 2 sovereignty NN NN 1 dobj _ - - - - O I-ARG1 I-ARG1 O O -
nw/wsj/16/wsj_1657 0 3 was VBD VBD 0 root _ be 01 - - O O B-V O O -
nw/wsj/16/wsj_1657 0 4 the DT DT 6 det _ - - - - O O B-ARG2 O O -
nw/wsj/16/wsj_1657 0 5 purpose NN NN 4 xcomp _ - - - - O O I-ARG2 O O -
nw/wsj/16/wsj_1657 0
@mrdrozdov
mrdrozdov / logs.txt
Last active January 23, 2019 14:38
gpu summary
less baseline-l_3-lr_002-d_800-dist-1gpu-b4/experiment.log.0
27s for 100 batches of 4 (1x speedup)
2019-01-22 22:22:09,913 [INFO] Epoch/Step/Batch=0/1300/1300 reconstruction_mse_loss=0.382 total_loss=0.382
2019-01-22 22:22:11,924 [INFO] Average-Length=10.29
2019-01-22 22:22:11,924 [INFO] Words-Per-Second=1
2019-01-22 22:22:36,047 [INFO] Epoch/Step/Batch=0/1400/1400 reconstruction_mse_loss=0.380 total_loss=0.380
2019-01-22 22:22:38,032 [INFO] Average-Length=10.410000000000007
2019-01-22 22:22:38,033 [INFO] Words-Per-Second=1
@mrdrozdov
mrdrozdov / pt.py
Last active December 23, 2018 22:39
pt.py
class Trainer(object):
    def __init__(self, net, optimizer, ngpus=1):
        self.net = net
        self.optimizer = optimizer
        self.ngpus = ngpus

    def step(self, batch, train, ...):
        """
        Alternatively, you can avoid replicating your model each batch.
        This is particularly useful if your model has any type of state.
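The docstring is cut short. A minimal runnable sketch of the wrap-once alternative it describes (the loss and training details here are assumptions):

```python
import torch
import torch.nn as nn

class Trainer(object):
    def __init__(self, net, optimizer, ngpus=1):
        # Wrapping once at construction (e.g. in nn.DataParallel when
        # ngpus > 1) avoids re-replicating the net on every step, which
        # matters when the model carries state such as buffers.
        self.net = nn.DataParallel(net) if ngpus > 1 else net
        self.optimizer = optimizer
        self.ngpus = ngpus

    def step(self, batch, train=True):
        loss = self.net(batch).pow(2).mean()  # placeholder loss
        if train:
            self.optimizer.zero_grad()
            loss.backward()
            self.optimizer.step()
        return loss.item()
```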
@mrdrozdov
mrdrozdov / torch-inplace.txt
Created December 16, 2018 23:40
torch-inplace.txt
In [1]: print('hello world')
hello world
In [2]: print('hello world!')
hello world!
In [3]: import torch
In [4]: print(torch.__version__)
0.4.1
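The transcript is truncated before it reaches the in-place examples the filename refers to. A brief sketch of PyTorch's in-place semantics (these exact lines are an assumption, not the missing transcript):

```python
import torch

x = torch.ones(3)
y = x               # y aliases x's storage
x.add_(1)           # trailing underscore: in-place, mutates x (and thus y)

z = x.add(1)        # out-of-place: returns a new tensor, x is unchanged
```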
@mrdrozdov
mrdrozdov / pv.txt
Created November 27, 2018 20:31
pv.txt
verbs = [
    ['agree', 'with'],
    ['bog', 'down'],
    ['break', 'away'],
    ['bring', 'about'],
    ['bring', 'along'],
    ['bring', 'back'],
    ['bring', 'forward'],
    ['bring', 'in'],
    ['bring', 'off'],