Phil Pope (ppope)

tuelwer / pytorch-lbfgs-example.py
Last active June 17, 2024 21:55
pytorch-L-BFGS-example
import torch
import torch.optim as optim
import matplotlib.pyplot as plt

# 2D Rosenbrock function (global minimum at x = (1, 1))
def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
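The preview stops at the objective. As a minimal sketch of how f might be minimized with torch.optim.LBFGS (the starting point, learning rate, and iteration count here are my own choices, not the gist's):

# Hedged usage sketch, not the gist's exact code.
x = torch.tensor([-1.5, 2.0], requires_grad=True)
optimizer = optim.LBFGS([x], lr=0.1)

def closure():
    # L-BFGS re-evaluates the objective internally, so it takes a closure.
    optimizer.zero_grad()
    loss = f(x)
    loss.backward()
    return loss

for _ in range(25):
    optimizer.step(closure)

print(x)  # should approach the minimum at (1, 1)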
wassname / running_stats.py
Last active November 14, 2023 15:09
Running stats (mean, standard deviation) for python, pytorch, etc
import numpy as np

# Handle PyTorch tensors etc. by reusing tensorboardX's conversion helper
try:
    from tensorboardX.x2num import make_np
except ImportError:
    def make_np(x):
        return np.array(x).copy().astype('float16')

class RunningStats(object):
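The class body is cut off in the preview. As a hedged sketch, a running mean and standard deviation is usually maintained with Welford's online algorithm, along these lines (the class and method names here are illustrative, not necessarily the gist's):

class RunningStatsSketch(object):
    # Welford's online algorithm: single pass, numerically stable.
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations

    def push(self, x):
        x = make_np(x)  # accept tensors/arrays, as the gist does
        self.n += 1
        delta = x - self.mean
        self.mean = self.mean + delta / self.n
        self.m2 = self.m2 + delta * (x - self.mean)

    @property
    def std(self):
        return np.sqrt(self.m2 / self.n) if self.n > 1 else 0.0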
johnhw / umap_sparse.py
Last active November 15, 2024 23:31
1 million prime UMAP layout
### JHW 2018
import numpy as np
import umap
# This code is from the excellent module at:
# https://stackoverflow.com/questions/4643647/fast-prime-factorization-module
import random
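The rest of the gist is truncated here. Its idea, per the title, is to factorize each integer and embed the resulting sparse prime-incidence matrix with UMAP. A hedged sketch of that overall shape (the factorizer, matrix layout, and metric below are my assumptions, not the gist's code; n is kept small):

from scipy.sparse import lil_matrix
from sympy import primefactors  # stand-in factorizer, not the gist's

n = 10000  # the gist's title suggests ~1 million
prime_cols = {}
mat = lil_matrix((n, n), dtype=np.float32)
for i in range(2, n):
    for p in primefactors(i):
        j = prime_cols.setdefault(p, len(prime_cols))
        mat[i, j] = 1.0

embedding = umap.UMAP(metric='cosine').fit_transform(mat.tocsr())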
import sys
from collections import OrderedDict
PY2 = sys.version_info[0] == 2
_internal_attrs = {'_backend', '_parameters', '_buffers', '_backward_hooks', '_forward_hooks', '_forward_pre_hooks', '_modules'}
class Scope(object):
    def __init__(self):
        self._modules = OrderedDict()
import torch

def jacobian(y, x, create_graph=False):
    # Build the Jacobian dy/dx one row at a time with repeated backward passes.
    jac = []
    flat_y = y.reshape(-1)
    grad_y = torch.zeros_like(flat_y)
    for i in range(len(flat_y)):
        grad_y[i] = 1.
        grad_x, = torch.autograd.grad(flat_y, x, grad_y, retain_graph=True, create_graph=create_graph)
        jac.append(grad_x.reshape(x.shape))
        grad_y[i] = 0.  # reset the one-hot seed for the next row (the preview truncates here)
    return torch.stack(jac).reshape(y.shape + x.shape)
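A small usage example of that helper (my own, not from the gist):

x = torch.randn(3, requires_grad=True)
y = x ** 2
J = jacobian(y, x)  # shape (3, 3); the diagonal is 2 * x
print(J)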
ricksladkey / log.py
Last active April 25, 2024 12:27
Example Python script for GDB that reads and displays the text in a ring buffer every time the program stops
from __future__ import print_function
import struct
import gdb

def log():
    # Get the inferior.
    try:
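The preview ends inside log(). As a hedged sketch of the kind of thing such a script does with GDB's Python API (the symbol names ring_buffer and ring_head are invented here for illustration; the real gist reads its own buffer layout):

def log_ring_buffer():
    # The inferior is the process being debugged.
    inferior = gdb.selected_inferior()
    # Hypothetical symbols; substitute the program's actual ring-buffer layout.
    buf_addr = int(gdb.parse_and_eval("&ring_buffer"))
    length = int(gdb.parse_and_eval("ring_head"))
    # read_memory returns raw bytes from the inferior's address space.
    data = bytes(inferior.read_memory(buf_addr, length))
    print(data.decode("ascii", errors="replace"))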
wiseodd / natural_grad.py
Created March 13, 2018 19:36
Natural Gradient Descent for Logistic Regression
import numpy as np
from sklearn.utils import shuffle
# Data comes from y = f(x) = [2, 3].x + [5, 7]
X0 = np.random.randn(100, 2) - 1
X1 = np.random.randn(100, 2) + 1
X = np.vstack([X0, X1])
t = np.vstack([np.zeros([100, 1]), np.ones([100, 1])])
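The preview stops at the data setup. As a hedged sketch of the natural-gradient step itself (variable names, learning rate, and damping are mine): the Fisher matrix is estimated as the average outer product of per-example gradients, and the plain gradient is preconditioned with its inverse.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X_b = np.hstack([X, np.ones([X.shape[0], 1])])  # append a bias column
w = np.random.randn(3, 1)

for _ in range(100):
    y = sigmoid(X_b @ w)
    grads = X_b * (y - t)               # per-example gradients, shape (N, 3)
    g = grads.mean(0, keepdims=True).T  # average gradient, shape (3, 1)
    F = grads.T @ grads / len(grads)    # empirical Fisher, shape (3, 3)
    w -= np.linalg.inv(F + 1e-8 * np.eye(3)) @ g  # natural-gradient update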
I have run an nginx container...
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
6d67de07731d nginx "nginx -g 'daemon ..." 40 minutes ago Up 40 minutes 80/tcp, 443/tcp epic_goldberg
I want to use a Debian container to debug it:
docker run -it --pid=container:6d67de07731d --net=container:6d67de07731d --cap-add sys_admin debian
I can see the nginx process:
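The ps output itself is cut off in this preview. For context (my gloss, not part of the gist): because the debug container joins the nginx container's PID and network namespaces via --pid=container:... and --net=container:..., running ps aux inside it lists the nginx master and worker processes, and curl localhost reaches nginx's sockets directly. On a slim Debian image you may first need apt-get update && apt-get install -y procps to get ps.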
'''
A logistic regression example using the meta-graph checkpointing
features of Tensorflow.
Author: João Felipe Santos, based on code by Aymeric Damien
(https://github.com/aymericdamien/TensorFlow-Examples/)
'''
from __future__ import print_function
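The preview ends at the imports. As a hedged sketch of the meta-graph checkpointing pattern this gist demonstrates, in the TF1 API (the variable and file names below are illustrative):

import tensorflow as tf

w = tf.Variable(tf.zeros([2, 1]), name='w')  # stand-in model weights
saver = tf.train.Saver()

# Training session: saving also writes the graph structure to model.ckpt.meta.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... training steps ...
    saver.save(sess, './model.ckpt')

# Later: rebuild the graph from the .meta file, then restore the weights,
# without re-declaring the model in code.
tf.reset_default_graph()
with tf.Session() as sess:
    new_saver = tf.train.import_meta_graph('./model.ckpt.meta')
    new_saver.restore(sess, './model.ckpt')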
akiross / Convolutional Arithmetic.ipynb
Last active October 24, 2024 07:04
A few experiments on how convolution and transposed convolution (deconvolution) should work in TensorFlow.
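The notebook preview is unavailable, but the shape arithmetic it explores can be stated: for input size i, kernel k, stride s, and padding p, a convolution outputs floor((i + 2p - k) / s) + 1, and a transposed convolution inverts that size mapping. A hedged sketch of both directions (my own code, since the notebook cannot be displayed):

import numpy as np
import tensorflow as tf

x = tf.constant(np.random.randn(1, 5, 5, 1), tf.float32)  # i = 5
k = tf.constant(np.random.randn(3, 3, 1, 1), tf.float32)  # k = 3

# floor((5 - 3) / 2) + 1 = 2, so y has shape (1, 2, 2, 1)
y = tf.nn.conv2d(x, k, strides=[1, 2, 2, 1], padding='VALID')

# The transposed convolution maps the 2x2 feature map back to 5x5:
# (2 - 1) * 2 + 3 = 5.
z = tf.nn.conv2d_transpose(y, k, output_shape=[1, 5, 5, 1],
                           strides=[1, 2, 2, 1], padding='VALID')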