```jsx
import { Component } from "react";

// Higher-order component: wraps ComposedComponent and passes the fetched
// data down to it as a prop.
export var Enhance = ComposedComponent => class extends Component {
  constructor(props) {
    super(props); // required before touching `this` in a subclass constructor
    this.state = { data: null };
  }
  componentDidMount() {
    this.setState({ data: 'Hello' });
  }
  render() {
    return <ComposedComponent {...this.props} data={this.state.data} />;
  }
};
```
```python
from __future__ import absolute_import
from __future__ import print_function
import numpy as np
np.random.seed(1337)  # for reproducibility
import random
from keras.datasets import mnist
# Note: Sequential/Graph and keras.layers.core are the legacy pre-1.0 Keras API;
# the Graph container was later replaced by the functional API.
from keras.models import Sequential, Graph
from keras.layers.core import Dense, Dropout, Lambda
from keras.optimizers import SGD, RMSprop
```
```python
class AttentionLSTM(LSTM):
    """LSTM with attention mechanism

    This is an LSTM incorporating an attention mechanism into its hidden states.
    Currently, the context vector calculated from the attended vector is fed
    into the model's internal states, closely following the model by Xu et al.
    (2016, Sec. 3.1.2), using a soft attention model following
    Bahdanau et al. (2014).

    The layer expects two inputs instead of the usual one:
```
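In a soft attention model of this kind, each position of the attended sequence is scored against the previous hidden state by a small additive network, the scores are softmax-normalized, and the weighted sum (the context vector) is what gets fed into the gates. Here is a minimal NumPy sketch of that scoring step; all dimensions and weight names are illustrative assumptions, not this layer's actual parameters:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Assumed sizes: 10 attended positions of 16 dims, 32-dim LSTM hidden state.
attended = np.random.randn(10, 16)
h_prev = np.random.randn(32)
Wa = 0.1 * np.random.randn(16, 24)  # projects each attended position
Wh = 0.1 * np.random.randn(32, 24)  # projects the previous hidden state
v = 0.1 * np.random.randn(24)       # additive-attention scoring vector

scores = np.tanh(attended @ Wa + h_prev @ Wh) @ v  # (10,) one score per position
alpha = softmax(scores)                            # attention weights, sum to 1
context = alpha @ attended                         # (16,) context vector for the gates
```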
```python
from keras import backend as K, initializers, regularizers, constraints
from keras.engine.topology import Layer

def dot_product(x, kernel):
    """Dot-product wrapper compatible with both Theano and TensorFlow.
    Args:
        x: tensor of shape (batch, timesteps, features)
        kernel: weight vector of shape (features,)
    """
    if K.backend() == 'tensorflow':
        # TF's K.dot needs a trailing axis on the vector; squeeze it back out
        return K.squeeze(K.dot(x, K.expand_dims(kernel)), axis=-1)
    return K.dot(x, kernel)
```
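To make the shapes concrete, here is a plain NumPy rendering (not part of the original gist) of what `dot_product` computes: the feature axis of a 3-D input is contracted against a weight vector, leaving one scalar score per timestep.

```python
import numpy as np

x = np.random.rand(2, 5, 8)   # (batch, timesteps, features)
kernel = np.random.rand(8)    # (features,)

scores = x @ kernel           # (2, 5): one score per timestep, as K.dot would give
assert scores.shape == (2, 5)
```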
```python
# -*- coding: utf-8 -*-
'''Trains an LSTM on the IMDB sentiment classification task with soft attention.

Experiments with max_features=10000, max_len=80:
1) MLP-dropout-tanh attention: 83.59 at epoch 4
2) MLP-dropout-relu attention: 83.26 at epoch 3
3) MLP-tanh attention: 82.91 at epoch 4
4) GlobalMaxPooling1D attention: 82.44 at epoch 7
'''
from __future__ import print_function
```
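All four variants share the same pooling template: score each LSTM output with a small network (or skip the learned scoring entirely, as GlobalMaxPooling1D does), softmax the scores over the 80 timesteps, and take the weighted sum as the sentence vector. Here is a NumPy sketch of the MLP-tanh variant, with sizes chosen for illustration rather than taken from the script:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

h = np.random.randn(80, 128)        # LSTM outputs: max_len=80 steps, 128 dims (assumed)
W = 0.1 * np.random.randn(128, 64)  # MLP projection (assumed width)
v = 0.1 * np.random.randn(64)       # scoring vector

scores = np.tanh(h @ W) @ v         # (80,) one score per timestep
alpha = softmax(scores)             # attention weights over time
sentence = alpha @ h                # (128,) pooled vector fed to the classifier
```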
```python
from keras import backend as K
from keras.layers import Input, Dense, merge, Dropout, Lambda, LSTM, Masking
from keras.models import Model, Sequential
from keras.optimizers import SGD, RMSprop, Adam, Nadam
from sys import argv
import argparse
import csv
import json
import numpy as np
import pickle
```
```python
from keras.layers import *
from keras.activations import softmax
from keras.models import Model

"""
References
----------
[1] Parikh, Ankur P., et al. "A decomposable attention model for natural
    language inference." arXiv preprint arXiv:1606.01933 (2016).
"""
```
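For orientation, Parikh et al.'s model reduces to three steps: attend (score every premise/hypothesis token pair), compare (run an MLP over each token paired with its soft-aligned counterpart), and aggregate. Below is a NumPy sketch of the attend step, using raw embeddings where the paper first applies a feed-forward network F; all shapes are illustrative assumptions:

```python
import numpy as np

def softmax(z, axis):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

a = np.random.randn(3, 4)        # premise: 3 tokens, 4-dim embeddings (assumed)
b = np.random.randn(5, 4)        # hypothesis: 5 tokens

e = a @ b.T                      # (3, 5) pairwise alignment scores
beta = softmax(e, axis=1) @ b    # (3, 4) hypothesis soft-aligned to each premise token
alpha = softmax(e, axis=0).T @ a # (5, 4) premise soft-aligned to each hypothesis token
# The compare step would then run an MLP rowwise on [a, beta] and [b, alpha].
```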
If you're developing a React application, it helps not to have to build every basic UI component yourself. Here you can find a list of components, component libraries, and complete design systems developed with and for React.
As the list grew longer and longer, I decided it would be better to give it a "real" site of its own.