@viksit
viksit / async_flask.py
Created March 28, 2016 20:01 — forked from sergray/async_flask.py
Asynchronous requests in Flask with gevent
"""Asynchronous requests in Flask with gevent"""
from time import time
from flask import Flask, Response
from gevent.pywsgi import WSGIServer
from gevent import monkey
import requests
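The preview stops at the imports. A minimal sketch of how the rest of such an app could look, assuming the usual gevent pattern of monkey-patching and spawning one greenlet per outbound request; the URL list is a hypothetical placeholder:

from time import time
from flask import Flask, Response
from gevent.pywsgi import WSGIServer
from gevent import monkey
import gevent
import requests

monkey.patch_all()  # patch sockets so requests yields cooperatively to other greenlets

app = Flask(__name__)
URLS = ["https://www.python.org/", "https://www.djangoproject.com/"]  # hypothetical

@app.route("/")
def index():
    start = time()
    jobs = [gevent.spawn(requests.get, url) for url in URLS]  # fetch concurrently
    gevent.joinall(jobs)
    codes = [job.value.status_code for job in jobs]
    return Response("fetched %s in %.2fs" % (codes, time() - start))

if __name__ == "__main__":
    WSGIServer(("", 5000), app).serve_forever()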
@viksit
viksit / gist:0ecd2deffb1d2ea5174b34525f55bae3
Created April 5, 2016 01:00 — forked from methane/gist:2185380
Tornado Example: Delegating a blocking task to a worker thread pool from an asynchronous request handler
from time import sleep
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application, asynchronous, RequestHandler
from multiprocessing.pool import ThreadPool
_workers = ThreadPool(10)
def run_background(func, callback, args=(), kwds={}):
    def _callback(result):
        # marshal the result back onto the IOLoop thread before invoking the callback
        IOLoop.instance().add_callback(lambda: callback(result))
    _workers.apply_async(func, args, kwds, _callback)
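A usage sketch for the pattern above: the handler stays asynchronous while the blocking work runs on the pool; blocking_task is a hypothetical stand-in for real blocking work.

def blocking_task(seconds):
    sleep(seconds)  # stands in for any blocking call
    return seconds

class MainHandler(RequestHandler):
    @asynchronous
    def get(self):
        run_background(blocking_task, self.on_complete, (10,))

    def on_complete(self, result):
        self.write("slept %s seconds" % result)
        self.finish()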
@viksit
viksit / Workflow
Created April 5, 2016 04:49 — forked from sarguido/Workflow
A Beginner's Guide to Machine Learning with Scikit-Learn: Workflow
{
  "metadata": {
    "name": ""
  },
  "nbformat": 3,
  "nbformat_minor": 0,
  "worksheets": [
    {
      "cells": [
        {
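Only the notebook's JSON header survives in the preview. A minimal sketch of the load/split/fit/score workflow the title describes, with the dataset and model chosen purely for illustration, not taken from the notebook:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.25, random_state=0)

model = KNeighborsClassifier(n_neighbors=3)  # illustrative choice of estimator
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # accuracy on held-out data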
import numpy as np
__author__ = 'Fariz Rahman'
def eq(x, y):
    return x.lower().replace(" ", "") == y.lower().replace(" ", "")
def get_words(x):
    x = x.replace("  ", " ")  # collapse double spaces
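eq compares two strings ignoring case and whitespace, for example:

>>> eq("Hello World", "helloworld")
True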
@viksit
viksit / checkin.py
Created May 24, 2016 21:49
grpc and gevent thread pool executor
import concurrent.futures

class ThreadPoolExecutor(concurrent.futures.ThreadPoolExecutor):
    """
    A version of :class:`concurrent.futures.ThreadPoolExecutor` that
    always uses native threads, even when threading is monkey-patched.

    .. versionadded:: 1.2a1
    """
    def __init__(self, max_workers):
        super(ThreadPoolExecutor, self).__init__(max_workers)
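A sketch of wiring this executor into a gRPC server under gevent: gRPC's C core needs real OS threads, which is why the server is handed the native-thread pool above rather than the monkey-patched concurrent.futures one. The servicer registration line is a hypothetical placeholder.

import time
import grpc
from gevent import monkey
monkey.patch_all()

server = grpc.server(ThreadPoolExecutor(max_workers=10))
# my_service_pb2_grpc.add_MyServiceServicer_to_server(MyService(), server)  # hypothetical
server.add_insecure_port("[::]:50051")
server.start()
try:
    while True:
        time.sleep(3600)  # keep the main greenlet alive
except KeyboardInterrupt:
    server.stop(0)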
@viksit
viksit / gist:b576fca3690327d519082167ef9f1f12
Created June 25, 2016 22:17 — forked from stuart11n/gist:9628955
rename git branch locally and remotely
git branch -m old_branch new_branch # Rename branch locally
git push origin :old_branch # Delete the old branch
git push --set-upstream origin new_branch # Push the new branch, set local branch to track the new remote
@viksit
viksit / json_required_demo.py
Created September 14, 2016 01:21 — forked from corbinbs/json_required_demo.py
Flask JSON required decorator
"""
Demo of json_required decorator for API input validation/error handling
"""
import inspect
import functools
import json
from traceback import format_exception
from flask import jsonify, request
import sys
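The decorator itself is below the fold. A minimal sketch of what a json_required decorator typically does with the imports shown above, rejecting non-JSON bodies before the view runs; the error payload shape is an assumption:

def json_required(func):
    """Return a 400 JSON error if the request body is not valid JSON."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        payload = request.get_json(silent=True)  # None when missing or invalid
        if payload is None:
            return jsonify(error="request body must be valid JSON"), 400
        return func(*args, **kwargs)
    return wrapper

A view would then stack @json_required directly under its @app.route decorator.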
@viksit
viksit / transferring-props.md
Created March 15, 2017 21:53 — forked from sebmarkbage/transferring-props.md
Deprecating transferPropsTo

It's a common pattern in React to wrap a component in an abstraction. The outer component exposes a simple property to do something that might have more complex implementation details.

We used to have a helper function called transferPropsTo. We no longer support this method. Instead you're expected to use a generic object helper to merge props.

render() {
  return Component(Object.assign({}, this.props, { more: 'values' }));
}
@viksit
viksit / AttentionWithContext.py
Created June 9, 2017 07:41 — forked from rmdort/AttentionWithContext.py
Keras Layer that implements an Attention mechanism, with a context/query vector, for temporal data. Supports Masking. Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf] "Hierarchical Attention Networks for Document Classification"
class AttentionWithContext(Layer):
    """
    Attention operation, with a context/query vector, for temporal data.
    Supports Masking.
    Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf]
    "Hierarchical Attention Networks for Document Classification"
    by using a context vector to assist the attention
    # Input shape
        3D tensor with shape: `(samples, steps, features)`.
    # Output shape
        2D tensor with shape: `(samples, features)`.
    """
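The layer's weights and call logic are below the fold. A plain numpy sketch of the computation the docstring describes: project each timestep vector, score it against a learned context vector u, softmax the scores, and take the weighted sum. The learned parameters are random placeholders here.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

samples, steps, features = 2, 5, 8
h = np.random.randn(samples, steps, features)  # e.g. RNN outputs
W = np.random.randn(features, features)        # learned projection (placeholder)
b = np.zeros(features)                         # learned bias (placeholder)
u = np.random.randn(features)                  # context/query vector (placeholder)

uit = np.tanh(h @ W + b)                # (samples, steps, features)
ait = softmax(uit @ u)                  # attention weights, (samples, steps)
out = (h * ait[..., None]).sum(axis=1)  # weighted sum, (samples, features)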
import tensorflow as tf
import numpy as np

class ConvolutionalAttentionNLI(object):
    def __init__(self, embeddings_shape, target_classes=2, conv_filter_size=3,
                 conv_projection_size=300, attention_output_size=200,
                 comparison_output_size=100, learning_rate=0.05):
        self._embeddings_shape = embeddings_shape
        self._target_classes = target_classes
        self._conv_filter_size = conv_filter_size
        self._conv_projection_size = conv_projection_size