Abien Fred Agarap (AFAgarap)

@AFAgarap
AFAgarap / capsLayer.py
Created November 26, 2017 16:10 — forked from debarko/capsLayer.py
CapsNet Capsule Definition
# It has only two dependencies: numpy and tensorflow
import numpy as np
import tensorflow as tf
from config import cfg

# Class defining a convolutional capsule,
# consisting of multiple neuron layers.
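The capsule class definition itself is not shown in the snippet above, so here is a minimal stand-in sketch of a convolutional (primary) capsule layer in the sense of the comment: a convolution whose outputs are grouped into capsule vectors and passed through the squash nonlinearity. It assumes TensorFlow 2.x Keras; the squash helper, the PrimaryCapsule class, and its default hyperparameters are illustrative and are not the gist's actual code (the original targets TF 1.x and pulls its settings from a local config module).

# Illustrative sketch only; not the gist's code. Assumes TensorFlow 2.x.
import tensorflow as tf

def squash(vectors, axis=-1):
    # Squash nonlinearity: scales each capsule vector's norm into [0, 1)
    # while preserving its direction.
    squared_norm = tf.reduce_sum(tf.square(vectors), axis=axis, keepdims=True)
    scale = squared_norm / (1.0 + squared_norm) / tf.sqrt(squared_norm + 1e-8)
    return scale * vectors

class PrimaryCapsule(tf.keras.layers.Layer):
    # A convolutional capsule layer: a Conv2D whose feature maps are grouped
    # into capsule vectors of dimension capsule_dim and then squashed.
    def __init__(self, num_capsules=32, capsule_dim=8, kernel_size=9, strides=2):
        super(PrimaryCapsule, self).__init__()
        self.capsule_dim = capsule_dim
        self.conv = tf.keras.layers.Conv2D(
            filters=num_capsules * capsule_dim,
            kernel_size=kernel_size,
            strides=strides
        )

    def call(self, inputs):
        features = self.conv(inputs)
        # flatten the spatial positions into a list of capsule vectors
        capsules = tf.reshape(features, [tf.shape(features)[0], -1, self.capsule_dim])
        return squash(capsules)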
@AFAgarap
AFAgarap / gaussian-naive-bayes.ipynb
Last active March 31, 2018 06:00
A notebook for step-by-step explanation of the Gaussian Naive Bayes classification model.
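The notebook's contents are not shown above, so below is a minimal NumPy sketch of the Gaussian Naive Bayes model it describes: fit per-class means, variances, and log priors, then predict by maximizing the Gaussian log-likelihood plus the log prior under the naive independence assumption. The class and variable names are illustrative, not the notebook's.

# Illustrative sketch of Gaussian Naive Bayes; not the notebook's code.
import numpy as np

class GaussianNaiveBayes:
    def fit(self, features, labels):
        # per-class mean, variance (with a small floor for stability), and log prior
        self.classes = np.unique(labels)
        self.means = np.array([features[labels == c].mean(axis=0) for c in self.classes])
        self.variances = np.array([features[labels == c].var(axis=0) + 1e-9 for c in self.classes])
        self.log_priors = np.log(np.array([np.mean(labels == c) for c in self.classes]))
        return self

    def predict(self, features):
        # Gaussian log-likelihood per class, summed over features
        # (the "naive" conditional-independence assumption)
        log_likelihood = -0.5 * (
            np.log(2.0 * np.pi * self.variances)[None, :, :]
            + (features[:, None, :] - self.means[None, :, :]) ** 2 / self.variances[None, :, :]
        ).sum(axis=2)
        return self.classes[np.argmax(log_likelihood + self.log_priors, axis=1)]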
@AFAgarap
AFAgarap / autoencoder-full.py
Last active August 17, 2024 08:17
TensorFlow 2.0 implementation for a vanilla autoencoder. Link to tutorial: https://medium.com/@abien.agarap/implementing-an-autoencoder-in-tensorflow-2-0-5e86126e9f7
"""TensorFlow 2.0 implementation of vanilla Autoencoder."""
import numpy as np
import tensorflow as tf
__author__ = "Abien Fred Agarap"
np.random.seed(1)
tf.random.set_seed(1)
batch_size = 128
epochs = 10
@AFAgarap
AFAgarap / encoder.py
Last active November 16, 2019 18:49
TensorFlow 2.0 implementation of an encoder layer for a vanilla autoencoder.
class Encoder(tf.keras.layers.Layer):
    # Maps input features to a lower-dimensional latent code.
    def __init__(self, intermediate_dim):
        super(Encoder, self).__init__()
        self.hidden_layer = tf.keras.layers.Dense(
            units=intermediate_dim,
            activation=tf.nn.relu,
            kernel_initializer='he_uniform'
        )
        self.output_layer = tf.keras.layers.Dense(
            units=intermediate_dim,
            activation=tf.nn.sigmoid
        )

    def call(self, input_features):
        activation = self.hidden_layer(input_features)
        return self.output_layer(activation)
@AFAgarap
AFAgarap / decoder.py
Last active November 16, 2019 18:49
TensorFlow 2.0 implementation of a decoder for a vanilla autoencoder.
class Decoder(tf.keras.layers.Layer):
    # Reconstructs the original features from the latent code.
    def __init__(self, intermediate_dim, original_dim):
        super(Decoder, self).__init__()
        self.hidden_layer = tf.keras.layers.Dense(
            units=intermediate_dim,
            activation=tf.nn.relu,
            kernel_initializer='he_uniform'
        )
        self.output_layer = tf.keras.layers.Dense(
            units=original_dim,
            activation=tf.nn.sigmoid
        )

    def call(self, code):
        activation = self.hidden_layer(code)
        return self.output_layer(activation)
@AFAgarap
AFAgarap / autoencoder.py
Created March 16, 2019 07:09
TensorFlow 2.0 implementation of a vanilla autoencoder model.
class Autoencoder(tf.keras.Model):
    def __init__(self, intermediate_dim, original_dim):
        super(Autoencoder, self).__init__()
        self.encoder = Encoder(intermediate_dim=intermediate_dim)
        self.decoder = Decoder(intermediate_dim=intermediate_dim, original_dim=original_dim)

    def call(self, input_features):
        code = self.encoder(input_features)
        reconstructed = self.decoder(code)
        return reconstructed
@AFAgarap
AFAgarap / loss.py
Created March 16, 2019 07:11
Reconstruction error function for a vanilla autoencoder.
def loss(model, original):
    # mean squared error between the reconstruction model(original) and the original input
    reconstruction_error = tf.reduce_mean(tf.square(tf.subtract(model(original), original)))
    return reconstruction_error
@AFAgarap
AFAgarap / train.py
Created March 16, 2019 07:13
Optimization function for a vanilla autoencoder.
def train(loss, model, opt, original):
    with tf.GradientTape() as tape:
        # compute gradients of the reconstruction error w.r.t. the model parameters
        gradients = tape.gradient(loss(model, original), model.trainable_variables)
        gradient_variables = zip(gradients, model.trainable_variables)
        opt.apply_gradients(gradient_variables)
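To show how the pieces above fit together, here is a hedged usage sketch that trains the Autoencoder on MNIST, roughly following the linked tutorial. The data pipeline, intermediate_dim=64, original_dim=784, and the Adam learning rate are assumptions for illustration, not necessarily the values used in autoencoder-full.py.

# Usage sketch; assumes Encoder, Decoder, Autoencoder, loss, and train from the
# snippets above are in scope. The MNIST pipeline, intermediate_dim=64, and the
# Adam learning rate are illustrative choices.
import numpy as np
import tensorflow as tf

(training_features, _), _ = tf.keras.datasets.mnist.load_data()
training_features = training_features.reshape(-1, 784).astype(np.float32) / 255.0
training_dataset = (
    tf.data.Dataset.from_tensor_slices(training_features)
    .shuffle(buffer_size=1024)
    .batch(128)  # batch_size from autoencoder-full.py
)

autoencoder = Autoencoder(intermediate_dim=64, original_dim=784)
opt = tf.keras.optimizers.Adam(learning_rate=1e-2)

for epoch in range(10):  # epochs from autoencoder-full.py
    for batch_features in training_dataset:
        train(loss, autoencoder, opt, batch_features)
    print('epoch {}: reconstruction error = {:.6f}'.format(
        epoch + 1, loss(autoencoder, batch_features).numpy()))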