Zygmunt Zając (zygmuntz)

@kastnerkyle
kastnerkyle / conv_deconv_vae.py
Last active October 19, 2024 08:20
Convolutional Variational Autoencoder, modified from Alec Radford's code at https://gist.github.com/Newmu/a56d5446416f5ad2bbac
# Alec Radford, Indico, Kyle Kastner
# License: MIT
"""
Convolutional VAE in a single file.
Bringing in code from IndicoDataSolutions and Alec Radford (NewMu).
Additionally converted to use the default conv2d interface instead of explicit cuDNN.
"""
import theano
import theano.tensor as T
from theano.compat.python2x import OrderedDict
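The heart of any VAE, convolutional or otherwise, is the reparameterization trick. Below is a minimal numpy sketch of that trick and the matching closed-form KL term, not the gist's Theano code; all shapes are made up for illustration.
import numpy as np

rng = np.random.RandomState(0)

def reparameterize(mu, log_sigma):
    # sample z = mu + sigma * eps with eps ~ N(0, I); randomness is
    # isolated in eps, so mu and log_sigma stay differentiable in a
    # real framework
    eps = rng.randn(*mu.shape)
    return mu + np.exp(log_sigma) * eps

def kl_divergence(mu, log_sigma):
    # closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian
    return -0.5 * np.sum(1.0 + 2.0 * log_sigma - mu ** 2
                         - np.exp(2.0 * log_sigma), axis=1)

mu, log_sigma = rng.randn(4, 8), 0.1 * rng.randn(4, 8)
print(reparameterize(mu, log_sigma).shape)  # (4, 8)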
@kastnerkyle
kastnerkyle / optimizers.py
Last active January 12, 2021 13:46
Theano optimizers
# Authors: Kyle Kastner
# License: BSD 3-clause
import theano.tensor as T
import numpy as np
import theano
class rmsprop(object):
"""
RMSProp with Nesterov momentum and gradient rescaling
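For reference, a rough numpy sketch of the update rule the docstring names; this is an illustrative restatement, not the gist's Theano class, which builds the same idea as symbolic updates.
import numpy as np

def clip_norm(grad, max_norm=5.0):
    # "gradient rescaling": shrink the gradient if its norm is too large
    norm = np.sqrt(np.sum(grad ** 2))
    return grad * (max_norm / norm) if norm > max_norm else grad

def rmsprop_nesterov_step(param, grad, acc, mom,
                          lr=1e-3, rho=0.9, momentum=0.9, eps=1e-6):
    grad = clip_norm(grad)
    # running average of squared gradients scales the step per parameter
    acc = rho * acc + (1.0 - rho) * grad ** 2
    scaled = grad / np.sqrt(acc + eps)
    # Nesterov-style momentum applied to the rescaled gradient
    mom = momentum * mom - lr * scaled
    param = param + momentum * mom - lr * scaled
    return param, acc, mom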
@macks22
macks22 / pmf-and-modified-bpmf-pymc.py
Last active May 13, 2021 13:37
Probabilistic Matrix Factorization (PMF) + Modified Bayesian PMF
"""
Implementations of:
Probabilistic Matrix Factorization (PMF) [1],
Bayesian PMF (BPMF) [2],
Modified BPMF (mBPMF)
using `pymc3`. mBPMF is, to my knowledge, my own creation. It is an attempt
to circumvent the limitations of `pymc3` with regard to the Wishart distribution:
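As context, a minimal PMF model in `pymc3` might look like the sketch below (toy data and hyperparameters are assumptions, not the gist's code); BPMF then puts hyperpriors on the factor distributions, which is where the Wishart trouble comes in.
import numpy as np
import pymc3 as pm
import theano.tensor as tt

rng = np.random.RandomState(0)
R = rng.randn(30, 20)                 # toy "ratings" matrix
n, m, d = R.shape[0], R.shape[1], 5   # d latent dimensions

with pm.Model() as pmf:
    # fixed Gaussian priors on user and item factor matrices
    U = pm.Normal('U', mu=0.0, sd=1.0, shape=(n, d))
    V = pm.Normal('V', mu=0.0, sd=1.0, shape=(m, d))
    # ratings modeled as noisy inner products of the factors
    pm.Normal('R', mu=tt.dot(U, V.T), sd=0.5, observed=R)
    map_estimate = pm.find_MAP()      # PMF is classically fit via MAP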
@Andy-P
Andy-P / StreamAnalytics.jl
Last active July 25, 2024 20:59
High Performance Streaming Analytics in Julia
# This code is part of a presentation on streaming analytics in Julia
# It was inspired by a number of individuals and makes use of some of their ideas
# 1. FastML.com, whose great Vowpal Wabbit posts got me thinking
# about inline processing
# 2. John Langford and his fantastic Vowpal Wabbit library.
# Check out his NYU video course to learn more (see below)
# 3. John Myles White's presentation on online SGD and his StreamStats.jl library
# Thank you all!
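The gist itself is Julia; as a language-neutral illustration of the streaming idea (update statistics one observation at a time, never materializing the full dataset), here is Welford's online mean/variance in Python.
class RunningStats(object):
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # running sum of squared deviations from the mean

    def push(self, x):
        # one-pass update: constant memory regardless of stream length
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.push(x)
print(stats.mean, stats.variance())  # 5.0 and ~4.57 (sample variance)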
@karpathy
karpathy / min-char-rnn.py
Last active October 19, 2025 09:02
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
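A sketch of the natural next step in this setup, reusing the names from the excerpt above: index the vocabulary and one-hot encode characters so the RNN can consume them (illustrative, not necessarily the gist's exact continuation).
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

def one_hot(ix):
    # column vector with a single 1 at the character's index
    x = np.zeros((vocab_size, 1))
    x[ix] = 1
    return x

inputs = [char_to_ix[ch] for ch in data[:25]]  # a 25-step training window
xs = [one_hot(ix) for ix in inputs]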
@smly
smly / keras_interval_evalution.py
Last active February 28, 2021 10:46
An example to check the AUC score on a validation set every 10 epochs.
"""
An example to check the AUC score on a validation set every 10 epochs.
I hope it will be helpful for optimizing the number of epochs.
"""
import logging
from sklearn.metrics import roc_auc_score
from keras.callbacks import Callback
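A hedged sketch of such a callback, reusing the imports above; the class name, validation arguments, and interval handling are assumptions, not the gist's exact code.
class IntervalAUC(Callback):
    def __init__(self, x_val, y_val, interval=10):
        super(IntervalAUC, self).__init__()
        self.x_val, self.y_val, self.interval = x_val, y_val, interval

    def on_epoch_end(self, epoch, logs=None):
        # only score every `interval` epochs; predict on held-out data
        if (epoch + 1) % self.interval == 0:
            y_pred = self.model.predict(self.x_val, verbose=0)
            score = roc_auc_score(self.y_val, y_pred)
            logging.info("epoch %d - validation AUC: %.6f"
                         % (epoch + 1, score))

# usage: model.fit(X, y, callbacks=[IntervalAUC(X_val, y_val)], ...)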
""" Poisson-loss Factorization Machines with Numba
Follows the vanilla FM model from:
Steffen Rendle (2012): Factorization Machines with libFM.
In: ACM Trans. Intell. Syst. Technol., 3(3), May.
http://doi.acm.org/10.1145/2168752.2168771
See also: https://github.com/coreylynch/pyFM
"""
@shamatar
shamatar / rwa.py
Last active January 14, 2022 20:17
Keras (keras.io) implementation of Recurrent Weighted Average, as described in https://arxiv.org/abs/1703.01253. Follows the original TensorFlow implementation from https://github.com/jostmey/rwa. Works with fixed batch sizes and requires the "batch_shape" parameter in the input layer. Outputs a proper config, so it should save and restore properly. You are welcome…
from keras.layers import Recurrent
import keras.backend as K
from keras import activations
from keras import initializers
from keras import regularizers
from keras import constraints
from keras.engine import Layer
from keras.engine import InputSpec
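The core recurrence from the RWA paper is compact enough to state directly. A numpy sketch follows; the running-max stabilization used in real implementations is omitted for clarity, and z_t, a_t would come from learned projections of the input and hidden state.
import numpy as np

def rwa_step(n_prev, d_prev, z_t, a_t):
    # running weighted average: exp(a_t) attention weights accumulate in
    # a numerator/denominator pair, and tanh squashes the ratio
    w = np.exp(a_t)
    n_t = n_prev + z_t * w
    d_t = d_prev + w
    h_t = np.tanh(n_t / d_t)
    return h_t, n_t, d_t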
@eamartin
eamartin / notebook.ipynb
Last active April 22, 2025 08:11
Understanding & Visualizing Self-Normalizing Neural Networks
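For background, the SELU activation that defines self-normalizing networks (Klambauer et al., 2017) is a one-liner; the fixed constants are chosen so that activations drift toward zero mean and unit variance.
import numpy as np

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scaled exponential linear unit with the paper's fixed constants
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))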
@neubig
neubig / dynet-tagger.py
Last active May 21, 2018 06:01
A small sequence labeler in DyNet
"""
DyNet implementation of a sequence labeler (POS tagger).
This is a translation of this tagger in PyTorch: https://gist.github.com/hal3/8c170c4400576eb8d0a8bd94ab231232
Basic architecture:
- take words
- run though bidirectional GRU
- predict labels one word at a time (left to right), using a recurrent neural network "decoder"
The decoder updates its hidden state based on:
- most recent word
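A loose numpy sketch of the greedy left-to-right decoding loop this describes; all names and shapes are illustrative, not the gist's DyNet code.
import numpy as np

def decode(bigru_feats, W_dec, W_out, h_dec):
    # bigru_feats: (T, f) encoder outputs, one row per word;
    # h_dec: (h,) decoder state; W_dec: (h, h+f); W_out: (n_tags, h)
    labels = []
    for t in range(bigru_feats.shape[0]):
        # fold the current word's features into the decoder state
        h_dec = np.tanh(W_dec.dot(np.concatenate([h_dec, bigru_feats[t]])))
        labels.append(int(np.argmax(W_out.dot(h_dec))))  # one tag at a time
    return labels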