# -*- coding: utf-8 -*-
"""ResNet50 model for Keras with fused intermediate layers.

# Reference:
    https://arxiv.org/pdf/1604.00133.pdf

Adapted from the original ResNet implementation.
"""
from __future__ import print_function
The following mining setup and findings were performed on EVGA GeForce GTX 1070 SC GAMING Black Edition graphics cards.
First run nvidia-xconfig --enable-all-gpus, then edit the xorg.conf file to set the Coolbits option correctly.
# /etc/X11/xorg.conf
Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    # Coolbits value is an example: 28 enables manual fan control, clock offsets and overvoltage
    Option         "Coolbits" "28"
EndSection
# -*- coding: utf-8 -*-
import cv2
import numpy as np
from tensorflow.keras.layers import Input, Dense, Conv2D, MaxPool2D, AvgPool2D, Activation
from tensorflow.keras.layers import Layer, BatchNormalization, ZeroPadding2D, Flatten, add
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.models import Model
from tensorflow.keras import initializers
This is a Keras implementation of ResNet-101 with ImageNet pre-trained weights. I converted the weights from Caffe as provided by the authors of the paper. The implementation supports both Theano and TensorFlow backends. If you are curious about how the conversion is done, you can visit my blog post for more details.
ResNet Paper:
Deep Residual Learning for Image Recognition.
Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
arXiv:1512.03385
A Keras implementation of ResNet-152 with ImageNet pre-trained weights is available as well; those weights were converted from Caffe in the same way, and the implementation likewise supports both Theano and TensorFlow backends.
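Both variants are built from the same bottleneck residual block, which is what the tensorflow.keras layers imported above are used for. The sketch below is a rough illustration of one identity block, not the exact layer naming of the converted weights; the function name and filter counts are illustrative, and it assumes the block input already has f3 channels so the shortcut addition is valid.

from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation, add  # same layers as imported above

def identity_block(x, filters):
    # Bottleneck: 1x1 reduce -> 3x3 -> 1x1 restore, then add the unchanged input (shortcut)
    f1, f2, f3 = filters
    y = Conv2D(f1, (1, 1))(x)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv2D(f2, (3, 3), padding='same')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv2D(f3, (1, 1))(y)
    y = BatchNormalization()(y)
    y = add([y, x])                # element-wise sum with the block input (requires matching channels)
    return Activation('relu')(y)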
'''This script goes along the blog post
"Building powerful image classification models using very little data"
from blog.keras.io.
It uses data that can be downloaded at:
https://www.kaggle.com/c/dogs-vs-cats/data
In our setup, we:
- created a data/ folder
- created train/ and validation/ subfolders inside data/
- created cats/ and dogs/ subfolders inside train/ and validation/
- put the cat pictures index 0-999 in data/train/cats
'''
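As a rough sketch of how a directory layout like this is typically consumed, the snippet below feeds data/train/ to Keras through ImageDataGenerator.flow_from_directory; the rescaling, target size and batch size are illustrative values, not necessarily the ones used in the blog post.

from keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale=1. / 255)   # scale pixel values to [0, 1]

# Each subfolder of data/train/ (cats/, dogs/) becomes one class label.
train_generator = train_datagen.flow_from_directory(
    'data/train',
    target_size=(150, 150),   # resize every image
    batch_size=32,
    class_mode='binary')      # two classes -> binary labels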
from keras.models import Sequential
from keras.layers import Dense
from keras.utils.io_utils import HDF5Matrix
import numpy as np

def create_dataset():
    # Write a small random dataset to test.h5; the dataset names are illustrative
    import h5py
    X = np.random.randn(200, 10).astype('float32')
    y = np.random.randint(0, 2, size=(200, 1))
    f = h5py.File('test.h5', 'w')
    f.create_dataset('my_data', data=X)
    f.create_dataset('my_labels', data=y)
    f.close()
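For completeness, a small hedged sketch of reading the file back with HDF5Matrix and fitting a model on it; the dataset names, the 150-row training split, and the one-layer model are assumptions for illustration only.

def load_and_train():
    # HDF5Matrix slices rows lazily from disk instead of loading the whole array
    X_train = HDF5Matrix('test.h5', 'my_data', start=0, end=150)
    y_train = HDF5Matrix('test.h5', 'my_labels', start=0, end=150)

    model = Sequential([Dense(1, input_shape=(10,), activation='sigmoid')])
    model.compile(loss='binary_crossentropy', optimizer='sgd')

    # shuffle='batch' shuffles within batch-sized chunks, which HDF5-backed data requires
    model.fit(X_train, y_train, batch_size=32, shuffle='batch')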
Your Flask app object implements the __call__ method, which means it can be called like a regular function.
When your WSGI container receives an HTTP request, it calls your app with the environ dict and the start_response callable.
WSGI is specified in PEP 333.
The two relevant environ variables are:
SCRIPT_NAME
The initial portion of the request URL's "path" that corresponds to the application object, so that the application knows its virtual "location". This may be an empty string, if the application corresponds to the "root" of the server.
PATH_INFO
The remainder of the request URL's "path", designating the virtual "location" of the request's target within the application. This may be an empty string, if the request URL targets the application root and does not have a trailing slash.
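As a minimal illustration of that calling convention (a sketch, not Flask's actual dispatch code), a bare WSGI application is just a callable that accepts environ and start_response; the function name and the response body below are made up.

def application(environ, start_response):
    # environ is a plain dict of request data, including SCRIPT_NAME and PATH_INFO
    script_name = environ.get('SCRIPT_NAME', '')
    path_info = environ.get('PATH_INFO', '')

    # start_response receives the status line and a list of (header, value) pairs
    start_response('200 OK', [('Content-Type', 'text/plain')])

    body = 'mounted at %r, request path %r' % (script_name, path_info)
    return [body.encode('utf-8')]     # the body must be an iterable of bytes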
[alias]
    a = help
    ai = init
    aichas = revert
    alaks = mv
    apan = rebase
    diks = show
    feri = clone
    flaks = stash save
    graps = commit
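With this block in a Git config file such as ~/.gitconfig, each alias expands to the built-in command on the right-hand side: for example, git ai runs git init, git feri runs git clone, and git graps runs git commit.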