fun_dl (rhythm92) · Japan
@thomwolf
thomwolf / parallel.py
Last active August 8, 2023 15:50
Data Parallelism in PyTorch for modules and losses
##+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
## Created by: Hang Zhang, Rutgers University, Email: [email protected]
## Modified by Thomas Wolf, HuggingFace Inc., Email: [email protected]
## Copyright (c) 2017-2018
##
## This source code is licensed under the MIT-style license found in the
## LICENSE file in the root directory of this source tree
##+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
"""Encoding Data Parallel"""
@symisc
symisc / snapchat_dog_filter.py
Last active July 26, 2020 11:51
Mimic two famous Snapchat filters, the flower crown and the dog facial parts, with the help of the PixLab (https://pixlab.io) API.
import requests
import json
# Mimic the two famous Snapchat filters: The flower crown & the dog facial parts filter.
# The target image must contain at least one human face. The more faces there are, the funnier the result should be!
#
# Only three commands are actually needed in order to mimic the Snapchat filters:
# face landmarks: https://pixlab.io/cmd?id=facelandmarks
# smart resize: https://pixlab.io/cmd?id=smartresize
# merge: https://pixlab.io/cmd?id=merge
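The listing stops at the imports and the command overview, so the request code itself is missing. Below is a hedged sketch of the first step, calling the facelandmarks command with requests; it assumes PixLab's usual REST convention of https://api.pixlab.io/<command> with img and key query parameters, and the image URL, API key, and reply field names are placeholders rather than details verified against the gist.

import requests

# Hedged sketch: request face landmarks for one image. The endpoint form
# and the reply fields ('faces', 'rectangle', 'landmarks') are assumptions
# based on PixLab's documented conventions, not code taken from the gist.
resp = requests.get(
    'https://api.pixlab.io/facelandmarks',
    params={
        'img': 'https://example.com/group_photo.jpg',  # hypothetical target image
        'key': 'PIXLAB_API_KEY',                       # your PixLab API key
    },
)
reply = resp.json()
if reply.get('error'):
    print('PixLab error:', reply['error'])
else:
    for face in reply.get('faces', []):
        print(face.get('rectangle'), face.get('landmarks'))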
import tensorflow as tf
import numpy as np
import tensorflow.contrib.slim as slim
total_layers = 25  # Specify how deep we want our network to be
units_between_stride = total_layers // 5  # integer division: residual units per stride block

def resUnit(input_layer, i):
    with tf.variable_scope("res_unit" + str(i)):
        part1 = slim.batch_norm(input_layer, activation_fn=None)
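        # (Hedged continuation: the listing is cut off here. The lines below
        # are one plausible completion of a pre-activation residual unit;
        # the 64-filter width is an assumption, not taken from the gist.)
        part2 = tf.nn.relu(part1)
        part3 = slim.conv2d(part2, 64, [3, 3], activation_fn=None)
        part4 = slim.batch_norm(part3, activation_fn=None)
        part5 = tf.nn.relu(part4)
        part6 = slim.conv2d(part5, 64, [3, 3], activation_fn=None)
        output = input_layer + part6  # identity shortcut around the two convs
        return output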
@eerwitt
eerwitt / load_jpeg_with_tensorflow.py
Created January 31, 2016 05:52
Example of loading multiple JPEG files with TensorFlow and making them available as tensors with the shape [[R, G, B], ...].
# Typical setup to include TensorFlow.
import tensorflow as tf
# Make a queue of file names including all the JPEG images files in the relative
# image directory.
filename_queue = tf.train.string_input_producer(
    tf.train.match_filenames_once("./images/*.jpg"))
# Read each entire image file, which is required since they're JPEGs; if the images
# are too large they could be split in advance into smaller files or use the Fixed
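# (Hedged continuation: the comment above is cut off in this listing. A
# typical next step in the TF 1.x queue-based pipeline is sketched below;
# it follows the standard API but is not necessarily the gist's exact code.)
image_reader = tf.WholeFileReader()

# Read a whole file from the queue; the first element of the returned tuple
# is the filename, which we ignore here.
_, image_file = image_reader.read(filename_queue)

# Decode the JPEG bytes into an image tensor of shape [height, width, channels].
image = tf.image.decode_jpeg(image_file)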
@baraldilorenzo
baraldilorenzo / readme.md
Created January 16, 2016 12:57
VGG-19 pre-trained model for Keras

## VGG19 model for Keras

This is the Keras model of the 19-layer network used by the VGG team in the ILSVRC-2014 competition.

It has been obtained by directly converting the Caffe model provided by the authors.

Details about the network architecture can be found in the following arXiv paper:

Very Deep Convolutional Networks for Large-Scale Image Recognition

K. Simonyan, A. Zisserman
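
The gist itself distributes the converted weight file together with a hand-written Keras model definition, neither of which is reproduced in this listing. As a point of reference, a minimal sketch using the VGG19 architecture that later shipped with Keras (not the gist's converted Caffe weights) looks like this:

from tensorflow.keras.applications import VGG19

# Hedged sketch: the built-in VGG19 with ImageNet weights, a later
# equivalent of the hand-converted model this gist distributes.
model = VGG19(weights='imagenet')
model.summary()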

@mitmul
mitmul / ResNet_A.py
Last active May 17, 2017 13:39
Deep Residual Network definition by Chainer
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import math
import chainer
import chainer.links as L
import chainer.functions as F
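Only the imports survive in this listing. As an illustration of how a residual unit is typically written in Chainer, here is a hedged sketch that reuses the imports above; the class name, channel arguments, and layer layout are assumptions for illustration, not the gist's actual ResNet_A definition.

class ResBlock(chainer.Chain):
    """Hypothetical minimal residual block (identity shortcut only)."""

    def __init__(self, n_channels):
        super(ResBlock, self).__init__()
        with self.init_scope():
            self.conv1 = L.Convolution2D(n_channels, n_channels, 3, 1, 1, nobias=True)
            self.bn1 = L.BatchNormalization(n_channels)
            self.conv2 = L.Convolution2D(n_channels, n_channels, 3, 1, 1, nobias=True)
            self.bn2 = L.BatchNormalization(n_channels)

    def __call__(self, x):
        h = F.relu(self.bn1(self.conv1(x)))
        h = self.bn2(self.conv2(h))
        return F.relu(h + x)  # shortcut is valid because shapes are unchanged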
@baraldilorenzo
baraldilorenzo / readme.md
Last active September 13, 2025 12:17
VGG-16 pre-trained model for Keras

## VGG16 model for Keras

This is the Keras model of the 16-layer network used by the VGG team in the ILSVRC-2014 competition.

It has been obtained by directly converting the Caffe model provided by the authors.

Details about the network architecture can be found in the following arXiv paper:

Very Deep Convolutional Networks for Large-Scale Image Recognition

K. Simonyan, A. Zisserman
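
As with the VGG-19 entry, the weight file and the model definition are not reproduced here. A hedged usage sketch with the VGG16 that later shipped with Keras (not this gist's converted weights) is below; 'elephant.jpg' is a placeholder path.

import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, decode_predictions, preprocess_input
from tensorflow.keras.preprocessing import image

# Hedged sketch: classify one image with the built-in VGG16.
model = VGG16(weights='imagenet')
img = image.load_img('elephant.jpg', target_size=(224, 224))   # placeholder file
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])                      # top-3 ImageNet labels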

@karpathy
karpathy / min-char-rnn.py
Last active November 19, 2025 15:16
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
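The listing cuts off after the data bookkeeping. A hedged continuation sketch follows: the character/index maps the model trains over, plus a one-hot encoder for a single character (the one_hot helper is illustrative, not a function from the gist).

char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

def one_hot(ix):
    # Encode one character index as a (vocab_size, 1) column vector, the
    # input format a character-level RNN consumes at each time step.
    x = np.zeros((vocab_size, 1))
    x[ix] = 1
    return x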
@w1ndy
w1ndy / brick.jpg
Last active May 26, 2017 03:24
3D Maze in Python