This gist contains all the grading scripts for the COMP2710 2019 Spring session.
import logging
import logging.config
from datetime import datetime
from functools import partial
import os
import sys

# Log to a timestamped file under /tmp.
LOGFILE = '/tmp/{0}.{1}.log'.format(
    os.path.basename(__file__),
    datetime.now().strftime('%Y%m%dT%H%M%S'))

log = logging.getLogger(__name__)

def foo(self, inp, out, name=''):
    # Forward-hook style callback: report the layer name and tensor sizes.
    log.debug('layer name {}'.format(name))
    log.debug('input size {}'.format(inp[0].size()))
    log.debug('output size {}'.format(out[0].size()))

# Bind the layer name so the same callback can be reused per layer.
bar = partial(foo, name='Conv2d_1a_3x3')
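The snippet imports logging.config but the configuration itself is not shown. A minimal sketch of wiring a file handler to a path like LOGFILE (the log path, format string, and handler names below are assumptions, not taken from the gist):

```python
import logging
import logging.config

# Hypothetical fixed path; the gist derives its path from __file__ and a timestamp.
LOGFILE = '/tmp/example.log'

LOGGING_CONFIG = {
    'version': 1,
    'formatters': {
        'default': {'format': '%(asctime)s %(levelname)s %(message)s'},
    },
    'handlers': {
        'file': {
            'class': 'logging.FileHandler',
            'filename': LOGFILE,
            'formatter': 'default',
        },
    },
    # Send everything at DEBUG and above to the file handler.
    'root': {'level': 'DEBUG', 'handlers': ['file']},
}

logging.config.dictConfig(LOGGING_CONFIG)
log = logging.getLogger(__name__)
log.debug('logging configured')
```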
A simple decorator-based timer for Python 3.
@tick
def foo():
    for i in range(100):
        pass
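The tick decorator itself lives in the gist; a minimal sketch of what such a timer decorator typically looks like (the printed format is an assumption, not from the original):

```python
import functools
import time

def tick(func):
    """Report the wall-clock time each call to func takes."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        # Print the decorated function's name and elapsed seconds.
        print('{} took {:.6f}s'.format(func.__name__, elapsed))
        return result
    return wrapper
```

functools.wraps preserves the wrapped function's name and docstring, so the decorated function still introspects normally.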
This repo collects convenient scripts.
- graypdf. Convert PDF to grayscale.
- mergepdf. Merge a sequence of PDF files.
This gist contains my homework grading scripts for Auburn COMP2017 in the Spring semester: grade1.py for homework 1, grade2.py for homework 2, and so on. Shoot me an email if you have questions.
The script demonstrates a weird bug when using Keras with TensorFlow. When you use the tf.while_loop function:
- if you include a Dropout layer or set the total number of epochs to more than 10, the program hangs and never returns;
- otherwise, it works fine.
I've updated to the latest versions of Keras and TensorFlow, and the bug persists. It seems to be a bug in Keras, since I could not reproduce it with pure TensorFlow code.
This Makefile syncs the current local working folder with a remote folder.
When working on a server, I always want to edit files locally and then
push them to the server to run. However, I do not want to keep track of
which files I just changed and push them one by one over SFTP; rsync
solves this problem elegantly.
Concretely, when I finish editing files locally, I just type
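Presumably that command invokes a target of such a Makefile; the Makefile itself is not shown here, so the following is only a minimal sketch (the target name, remote host, and paths are placeholders, not taken from the original):

```make
# Placeholder remote location; replace with your own host and path.
REMOTE = user@server:~/project/

# Push the current directory to the remote, deleting stale remote files.
.PHONY: sync
sync:
	rsync -avz --delete --exclude '.git' ./ $(REMOTE)
```

The -a flag preserves permissions and timestamps, -z compresses in transit, and --delete keeps the remote an exact mirror of the local tree.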