Kieran McCarthy (SmiffyKMc) - GitHub Gists
SmiffyKMc / model_v3.py
Created June 17, 2022 14:31
Using a convolutional base for the model
conv_base = keras.applications.vgg16.VGG16(
    weights="imagenet",
    include_top=False,
    input_shape=(256, 256, 3)
)
conv_base.summary()
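The summary confirms the VGG16 stack; the usual next step is to freeze this base and put a small classifier on top of it. A minimal sketch of that wiring, assuming the gist's earlier data_augmentation pipeline and a binary output (the Dense sizes here are illustrative, not taken from the gist):

conv_base.trainable = False  # keep the pretrained ImageNet weights fixed (feature extraction)
inputs = keras.Input(shape=(256, 256, 3))
x = data_augmentation(inputs)
x = keras.applications.vgg16.preprocess_input(x)  # VGG16 expects its own preprocessing, not 1/255 rescaling
x = conv_base(x)
x = layers.Flatten()(x)
x = layers.Dense(256, activation="relu")(x)
outputs = layers.Dense(1, activation="sigmoid")(x)  # hotdog / not-hotdog
model = keras.Model(inputs, outputs)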
SmiffyKMc / model_v2.py
Created June 17, 2022 14:24
V2 of the CNN
inputs = keras.Input(shape=(256, 256, 3))
x = data_augmentation(inputs)
x = layers.Rescaling(1./255)(x)
x = layers.Conv2D(filters=32, kernel_size=3, activation=keras.activations.relu)(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=64, kernel_size=3, activation=keras.activations.relu)(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=128, kernel_size=3, activation=keras.activations.relu)(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=256, kernel_size=3, activation=keras.activations.relu)(x)
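# Preview truncated; the remaining layers presumably mirror v1's pooling and classifier head
# (the dropout rate and optimizer below are assumptions, not the gist's exact tail):
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Flatten()(x)
x = layers.Dropout(0.5)(x)  # dropout commonly accompanies data augmentation
outputs = layers.Dense(1, activation="sigmoid")(x)
model = keras.Model(inputs=inputs, outputs=outputs)
model.compile(loss="binary_crossentropy", optimizer="rmsprop", metrics=["accuracy"])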
SmiffyKMc / data_augmentation.py
Created June 17, 2022 13:33
Example of the data augmentation pipeline and a plot of augmented samples
data_augmentation = keras.Sequential(
    [
        layers.RandomRotation(0.3),
        layers.RandomZoom(0.2),
        layers.RandomFlip("horizontal"),
    ]
)
plt.figure(figsize=(10, 10))
for images, _ in test_dataset.take(1):
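    # Preview truncated here; the loop body presumably applies the augmentation to the
    # first batch and plots nine augmented variants (an assumption, not the gist's exact code):
    for i in range(9):
        augmented_images = data_augmentation(images)              # random rotation, zoom, flip
        plt.subplot(3, 3, i + 1)
        plt.imshow(augmented_images[0].numpy().astype("uint8"))   # cast back to uint8 for display
        plt.axis("off")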
SmiffyKMc / model_evaluation.py
Created June 16, 2022 18:23
Evaluating the model
test_model = keras.models.load_model(f'{hotDogDir}hotdog_classifier_v1.keras')
test_loss, test_acc = test_model.evaluate(test_dataset)
print(f"Test Acc: {test_acc:.3f}")
SmiffyKMc / fit_model.py
Created June 16, 2022 18:15
Fitting the model and saving the checkpoint with the best validation loss
callbacks = [
    keras.callbacks.ModelCheckpoint(
        filepath=f"{hotDogDir}hotdog_classifier_v1.keras",
        monitor="val_loss",
        save_best_only=True
    )
]
history = model.fit(
    train_dataset,
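    # Preview truncated; the call presumably continues along these lines
    # (the epoch count is a guess, and validation_data is needed so the checkpoint can monitor val_loss):
    epochs=30,
    validation_data=validation_dataset,
    callbacks=callbacks
)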
SmiffyKMc / data_preprocess.py
Created June 16, 2022 18:03
Steps to clean the data and preprocess it into datasets
from tensorflow.keras.utils import image_dataset_from_directory
test_dataset = image_dataset_from_directory(
    new_base_dir / "test",
    image_size=(256, 256),
    batch_size=32
)
train_dataset = image_dataset_from_directory(
    new_base_dir / "train",
SmiffyKMc / model_v1.py
Last active June 16, 2022 17:56
First version of the CNN
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
inputs = keras.Input(shape=(256, 256, 3))
x = layers.Rescaling(1./255)(inputs)
x = layers.Conv2D(filters=32, kernel_size=3, activation=keras.activations.relu)(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Conv2D(filters=64, kernel_size=3, activation=keras.activations.relu)(x)
x = layers.MaxPooling2D(pool_size=2)(x)
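# Preview truncated; the gist presumably continues with deeper conv blocks and a dense
# classifier head roughly like this (an assumption, not the exact tail of the file):
x = layers.Conv2D(filters=128, kernel_size=3, activation=keras.activations.relu)(x)
x = layers.MaxPooling2D(pool_size=2)(x)
x = layers.Flatten()(x)
outputs = layers.Dense(1, activation="sigmoid")(x)  # binary hotdog / not-hotdog output
model = keras.Model(inputs=inputs, outputs=outputs)
model.compile(loss="binary_crossentropy", optimizer="rmsprop", metrics=["accuracy"])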
SmiffyKMc / Segmenting-And-Clustering.ipynb
Created August 30, 2019 20:28
Created on Cognitive Class Labs