Chaitanya Joshi (chaitjo): GitHub Gist profile
@aditya-malte
aditya-malte / smallberta_pretraining.ipynb
Created February 22, 2020 13:41
smallBERTa_Pretraining.ipynb
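The notebook itself cannot be rendered here, but the title points at pretraining a small RoBERTa-style masked language model from scratch. A minimal sketch of that workflow with the Hugging Face tokenizers and transformers libraries follows; the corpus path, vocabulary size, and model dimensions are illustrative assumptions, not the notebook's actual settings.

import os
from tokenizers import ByteLevelBPETokenizer
from transformers import (DataCollatorForLanguageModeling, LineByLineTextDataset,
                          RobertaConfig, RobertaForMaskedLM, RobertaTokenizerFast,
                          Trainer, TrainingArguments)

# 1. Train a byte-level BPE tokenizer on a plain-text corpus (hypothetical path).
os.makedirs("small_roberta", exist_ok=True)
bpe = ByteLevelBPETokenizer()
bpe.train(files=["corpus.txt"], vocab_size=30000, min_frequency=2,
          special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"])
bpe.save_model("small_roberta")

# 2. Define a small RoBERTa configuration and an untrained masked-LM model.
config = RobertaConfig(vocab_size=30000, max_position_embeddings=514,
                       num_hidden_layers=6, num_attention_heads=12,
                       hidden_size=768, type_vocab_size=1)
model = RobertaForMaskedLM(config=config)

# 3. Train with masked-language-modeling batches drawn from the same corpus.
tokenizer = RobertaTokenizerFast.from_pretrained("small_roberta", model_max_length=512)
dataset = LineByLineTextDataset(tokenizer=tokenizer, file_path="corpus.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)
args = TrainingArguments(output_dir="small_roberta", num_train_epochs=1,
                         per_device_train_batch_size=16)
Trainer(model=model, args=args, data_collator=collator, train_dataset=dataset).train()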
import tensorflow as tf
import numpy as np

class ConvolutionalAttentionNLI(object):
    def __init__(self, embeddings_shape, target_classes=2, conv_filter_size=3,
                 conv_projection_size=300, attention_output_size=200,
                 comparison_output_size=100, learning_rate=0.05):
        self._embeddings_shape = embeddings_shape
        self._target_classes = target_classes
        self._conv_filter_size = conv_filter_size
        self._conv_projection_size = conv_projection_size
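Only part of the constructor survives the preview, but its signature already exposes the model's main hyperparameters. A hypothetical instantiation, assuming embeddings_shape is (vocab_size, embedding_dim), could look like:

# Hypothetical usage; the keyword values mirror the constructor signature above.
nli = ConvolutionalAttentionNLI(
    embeddings_shape=(50000, 300),   # assumed shape of a pre-trained embedding matrix
    target_classes=3,                # e.g. entailment / neutral / contradiction
    conv_filter_size=3,
    conv_projection_size=300,
    attention_output_size=200,
    comparison_output_size=100,
    learning_rate=0.05,
)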
@joelthchao
joelthchao / demo.py
Last active August 31, 2021 18:02
Using the Keras TensorBoard callback with train_on_batch
import numpy as np
import tensorflow as tf
from keras.callbacks import TensorBoard
from keras.layers import Input, Dense
from keras.models import Model

def write_log(callback, names, logs, batch_no):
    for name, value in zip(names, logs):
        summary = tf.Summary()
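The preview stops inside write_log. With the TF 1.x tf.Summary protobuf and the writer that the Keras TensorBoard callback exposes, such a helper typically continues as follows; treat this as a sketch rather than the gist's exact code:

def write_log(callback, names, logs, batch_no):
    # Build one scalar Summary per (name, value) pair and push it to the
    # TensorBoard callback's underlying FileWriter (TF 1.x API).
    for name, value in zip(names, logs):
        summary = tf.Summary()
        summary_value = summary.value.add()
        summary_value.simple_value = value
        summary_value.tag = name
        callback.writer.add_summary(summary, batch_no)
    callback.writer.flush()

# Hypothetical usage with train_on_batch:
# model = Model(inputs=net_in, outputs=net_out)
# tensorboard = TensorBoard(log_dir='./logs')
# tensorboard.set_model(model)
# loss = model.train_on_batch(X_batch, y_batch)
# write_log(tensorboard, ['train_loss'], [loss], batch_no)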
@fasiha
fasiha / posdef.py
Last active August 2, 2025 11:17
Python/Numpy port of John D’Errico’s implementation (https://www.mathworks.com/matlabcentral/fileexchange/42885-nearestspd) of Higham’s 1988 paper (https://doi.org/10.1016/0024-3795(88)90223-6), including a built-in unit test. License: whatever D’Errico’s license is, since this is a port of that.
from numpy import linalg as la
import numpy as np

def nearestPD(A):
    """Find the nearest positive-definite matrix to input

    A Python/Numpy port of John D'Errico's `nearestSPD` MATLAB code [1], which
    credits [2].
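The function body is truncated here. A natural companion to a nearestPD routine is a Cholesky-based positive-definiteness test; a minimal sketch of such a check (assuming "positive definite" means "Cholesky succeeds") is:

def isPD(B):
    # Returns True if B is positive definite, using the fact that Cholesky
    # factorization succeeds exactly for symmetric positive-definite matrices.
    try:
        _ = la.cholesky(B)
        return True
    except la.LinAlgError:
        return False

# Hypothetical usage:
# A = np.random.randn(5, 5)
# assert isPD(nearestPD(A))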
@gyglim
gyglim / tensorboard_logging.py
Last active August 23, 2023 21:29
Logging to TensorBoard without TensorFlow operations. Uses manually generated summaries instead of summary ops.
"""Simple example on how to log scalars and images to tensorboard without tensor ops.
License: BSD License 2.0
"""
__author__ = "Michael Gygli"
import tensorflow as tf
from StringIO import StringIO
import matplotlib.pyplot as plt
import numpy as np
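The preview shows only the module docstring and imports. With the TF 1.x writer API this gist targets, manually built summaries are typically emitted along these lines; this is a sketch, and the class and method names here are assumptions rather than the gist's exact interface:

class Logger(object):
    # Write scalars to a TensorBoard log directory without creating any summary ops.
    def __init__(self, log_dir):
        self.writer = tf.summary.FileWriter(log_dir)

    def log_scalar(self, tag, value, step):
        # Build the Summary protobuf by hand and pass it straight to the FileWriter.
        summary = tf.Summary(value=[tf.Summary.Value(tag=tag, simple_value=value)])
        self.writer.add_summary(summary, step)
        self.writer.flush()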
@j314erre
j314erre / text_cnn.py
Created July 13, 2016 00:00
Load pre-trained word2vec vectors into cnn-text-classification-tf
import tensorflow as tf
import numpy as np

class TextCNN(object):
    """
    A CNN for text classification.
    Uses an embedding layer, followed by a convolutional, max-pooling and softmax layer.
    """
    def __init__(
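Per the description, the point of this fork is initializing the embedding matrix from pre-trained word2vec vectors. One common pattern for that in TF 1.x, sketched here with gensim's KeyedVectors, a toy vocabulary, and an assumed `cnn.W` embedding variable (none of which are necessarily what the gist does), is to build the matrix in NumPy and assign it after the graph is created:

import numpy as np
from gensim.models import KeyedVectors

# Hypothetical vocabulary (word -> row index) and an assumed path to a binary word2vec file.
vocab = {"<pad>": 0, "movie": 1, "great": 2}
w2v = KeyedVectors.load_word2vec_format("GoogleNews-vectors-negative300.bin", binary=True)

embedding_dim = 300
init_W = np.random.uniform(-0.25, 0.25, (len(vocab), embedding_dim)).astype(np.float32)
for word, idx in vocab.items():
    if word in w2v:
        init_W[idx] = w2v[word]   # rows for known words come from word2vec, the rest stay random

# Inside a tf.Session, overwrite the embedding variable created in TextCNN.__init__:
# sess.run(cnn.W.assign(init_W))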
@pranv
pranv / lstm.py
Last active July 6, 2017 11:48
An Efficient, Batched, Stateful LSTM layer in Numpy
import numpy as np
from utils import orthogonal, tanh, sigmoid, dtanh, dsigmoid

class LSTM(object):
    """Long Short Term Memory Unit

    Parameters
    ----------
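The preview cuts off inside the docstring. For orientation, a single batched LSTM step in plain NumPy looks roughly like the sketch below; it is self-contained and does not use the gist's `utils` helpers or its internal weight layout, which are not visible here.

import numpy as np

def lstm_step(x, h_prev, c_prev, W, b):
    # x: (batch, n_in); h_prev, c_prev: (batch, n_hidden); W: (n_in + n_hidden, 4 * n_hidden).
    z = np.concatenate([x, h_prev], axis=1) @ W + b   # all four gate pre-activations at once
    i, f, o, g = np.split(z, 4, axis=1)               # input, forget, output gates and candidate
    i, f, o = (1.0 / (1.0 + np.exp(-a)) for a in (i, f, o))   # sigmoid on the three gates
    g = np.tanh(g)
    c = f * c_prev + i * g                            # new cell state
    h = o * np.tanh(c)                                # new hidden state
    return h, c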
@danijar
danijar / blog_tensorflow_sequence_classification.py
Last active December 24, 2021 03:53
TensorFlow Sequence Classification
# Example for my blog post at:
# https://danijar.com/introduction-to-recurrent-networks-in-tensorflow/
import functools
import sets
import tensorflow as tf
def lazy_property(function):
    attribute = '_' + function.__name__
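The snippet stops right after the cached-attribute name is built. A standard completion of this memoizing decorator (a sketch; the blog post linked above has the authoritative version) is:

import functools

def lazy_property(function):
    # Cache the wrapped function's result on the instance the first time it is
    # accessed, so each TensorFlow graph node is only constructed once.
    attribute = '_' + function.__name__

    @property
    @functools.wraps(function)
    def wrapper(self):
        if not hasattr(self, attribute):
            setattr(self, attribute, function(self))
        return getattr(self, attribute)
    return wrapper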
@RadhikaG
RadhikaG / heapClass.py
Last active December 13, 2017 04:04
A class to demonstrate the different operations on heaps.
def parent(i):
    # 1-indexed heap: node i's parent is at floor(i / 2)
    return i // 2

def left(i):
    return 2 * i

def right(i):
    return 2 * i + 1

class Heap:
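    # Sketch: the preview ends at the class header, so the methods below are an
    # assumption about what "different operations on heaps" covers, reusing the
    # 1-indexed parent/left/right helpers above; they are not the gist's actual code.
    def __init__(self, items=None):
        self.arr = [None] + list(items or [])   # index 0 unused so parent/left/right work
        for i in range((len(self.arr) - 1) // 2, 0, -1):
            self.max_heapify(i)

    def max_heapify(self, i):
        # Sift the element at index i down until the max-heap property holds.
        n = len(self.arr) - 1
        largest = i
        if left(i) <= n and self.arr[left(i)] > self.arr[largest]:
            largest = left(i)
        if right(i) <= n and self.arr[right(i)] > self.arr[largest]:
            largest = right(i)
        if largest != i:
            self.arr[i], self.arr[largest] = self.arr[largest], self.arr[i]
            self.max_heapify(largest)

    def extract_max(self):
        # Pop and return the largest element (the root of the max-heap).
        top = self.arr[1]
        self.arr[1] = self.arr[-1]
        self.arr.pop()
        if len(self.arr) > 1:
            self.max_heapify(1)
        return top

# Hypothetical usage:
# h = Heap([4, 1, 3, 2, 16, 9, 10])
# h.extract_max()  # -> 16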