Programming with Deep Neural Networks (dneuraln)

import pyeliza

class Eliza:
    aliases = 'eliza'
    description = 'Virtual therapist'

    _therapist = pyeliza.eliza()

    def execute(self, expression, context):
        '''
        >>> from mock import Mock
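The preview cuts off inside the doctest. For orientation, a minimal sketch of how the wrapped library might be driven, assuming pyeliza's eliza class exposes a respond(text) method (only the constructor call appears above, so the method name is an assumption):

import pyeliza

therapist = pyeliza.eliza()           # the rule-based chatbot wrapped above
reply = therapist.respond('Hello')    # respond() is assumed; returns Eliza's reply
print(reply)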
@tboggs
tboggs / dirichlet_plots.png
Last active May 26, 2025 18:01
A script to generate contour plots of Dirichlet distributions
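The image itself is not reproduced here. A minimal sketch of one way to generate such contours (not tboggs's script itself; the alpha values and output name are assumptions): evaluate the Dirichlet density over the interior of the 2-simplex and contour it in barycentric coordinates.

import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import dirichlet

alpha = [2.0, 2.0, 2.0]                            # example concentration parameters

# Triangle corners of the 2-simplex in the plane.
corners = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])

# Sweep a grid of interior barycentric coordinates (components sum to 1).
res = 200
pts, dens = [], []
for i in range(1, res):
    for j in range(1, res - i):
        lam = np.array([i, j, res - i - j]) / res  # interior point of the simplex
        pts.append(lam @ corners)                  # map to Cartesian coordinates
        dens.append(dirichlet.pdf(lam, alpha))     # Dirichlet density at that point

pts = np.array(pts)
plt.tricontourf(pts[:, 0], pts[:, 1], dens, levels=50)
plt.axis('equal'); plt.axis('off')
plt.savefig('dirichlet_plots.png')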
@keithshep
keithshep / csv_to_hdf5.py
Created November 5, 2014 17:33
Convert a CSV file to HDF5 using h5py
#!/usr/bin/env python -O
import argparse
import sys
import numpy
import h5py
import csv
class ColType:
    UNKNOWN = 1
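The preview ends at the column-type constants. As a rough sketch of the core conversion (a Python 3 reconstruction, not keithshep's script, which infers per-column types; the file names are assumptions):

import csv
import numpy
import h5py

with open('data.csv', newline='') as f:
    reader = csv.reader(f)
    header = next(reader)                       # first row holds column names
    # Assumes every remaining column is numeric; the real script detects types.
    rows = numpy.array([row for row in reader], dtype=numpy.float64)

with h5py.File('data.h5', 'w') as h5:
    dset = h5.create_dataset('data', data=rows)
    dset.attrs['columns'] = numpy.array(header, dtype='S')  # column names as bytes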
@danijar
danijar / blog_tensorflow_variable_sequence_labelling.py
Last active May 15, 2022 14:28
TensorFlow Variable-Length Sequence Labelling
# Working example for my blog post at:
# http://danijar.com/variable-sequence-lengths-in-tensorflow/
import functools
import sets
import tensorflow as tf
from tensorflow.models.rnn import rnn_cell
from tensorflow.models.rnn import rnn
def lazy_property(function):
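The preview stops at the decorator's signature. The usual definition from danijar's posts caches a property's result so each graph-building method runs only once; the body below is reconstructed from that pattern:

def lazy_property(function):
    attribute = '_cache_' + function.__name__

    @property
    @functools.wraps(function)
    def wrapper(self):
        if not hasattr(self, attribute):
            setattr(self, attribute, function(self))  # build the graph node once
        return getattr(self, attribute)
    return wrapper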
@karpathy
karpathy / pg-pong.py
Created May 30, 2016 22:50
Training a Neural Network ATARI Pong agent with Policy Gradients from raw pixels
""" Trains an agent with (stochastic) Policy Gradients on Pong. Uses OpenAI Gym. """
import numpy as np
import cPickle as pickle
import gym
# hyperparameters
H = 200 # number of hidden layer neurons
batch_size = 10 # number of episodes per parameter update
learning_rate = 1e-4
gamma = 0.99 # discount factor for reward
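These hyperparameters feed the return computation. A sketch of the discounting step at the heart of policy gradients (the gist defines a closely similar discount_rewards helper; this version is paraphrased):

import numpy as np

def discount_rewards(r, gamma=0.99):
    """Compute discounted returns, resetting at game boundaries."""
    discounted = np.zeros_like(r, dtype=np.float64)
    running = 0.0
    for t in reversed(range(len(r))):
        if r[t] != 0:
            running = 0.0  # Pong gives +/-1 only when a game ends; reset there
        running = running * gamma + r[t]
        discounted[t] = running
    return discounted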
@tomokishii
tomokishii / mnist_cnn_bn.py
Last active December 14, 2023 03:55
MNIST using Batch Normalization - TensorFlow tutorial
#
# mnist_cnn_bn.py   created 5/21/2016
# updated 6/2/2017: checked TF 1.1 compatibility
#
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
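As context for what the script adds over the plain MNIST tutorial, a minimal sketch of a batch-normalization layer in the TF 1.x API (an illustration, not tomokishii's exact code; a full implementation also tracks moving averages for use at test time):

import tensorflow as tf

def batch_norm(x, eps=1e-5):
    channels = x.get_shape()[-1]
    gamma = tf.Variable(tf.ones([channels]))   # learned scale
    beta = tf.Variable(tf.zeros([channels]))   # learned shift
    # Per-channel statistics over batch, height, and width of a conv output.
    mean, var = tf.nn.moments(x, axes=[0, 1, 2])
    # At test time, moving averages would replace the batch statistics here.
    return tf.nn.batch_normalization(x, mean, var, beta, gamma, eps)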
@siemanko
siemanko / tf_lstm.py
Last active July 26, 2023 06:57
Simple implementation of LSTM in Tensorflow in 50 lines (+ 130 lines of data generation and comments)
"""Short and sweet LSTM implementation in Tensorflow.
Motivation:
When Tensorflow was released, adding RNNs was a bit of a hack: it required
building a separate graph for every number of timesteps and was obscure to
use. Since then the TF devs have added things like `dynamic_rnn`, `scan` and
`map_fn`. The APIs are decent now, but none of the tutorials I am aware of
make full use of them.
Advantages of this implementation:
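The advantages list is cut off in this preview. For reference, a minimal sketch of the pattern the docstring advocates (a hypothetical reconstruction, not siemanko's 50 lines): one graph handles any sequence length via `dynamic_rnn`.

import tensorflow as tf

# Inputs of shape (batch, time, features); both batch and time may vary.
inputs = tf.placeholder(tf.float32, (None, None, 2))
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=64)
# dynamic_rnn unrolls the LSTM at run time, so no per-length graphs are needed.
outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
logits = tf.layers.dense(outputs, 1)   # a per-timestep prediction head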