Vinay M (rmdort)
@sebmarkbage
sebmarkbage / Enhance.js
Last active March 29, 2026 17:42
Higher-order Components
import { Component } from "React";
export var Enhance = ComposedComponent => class extends Component {
  constructor(props) {
    super(props); // required before using `this` in an ES6 class
    this.state = { data: null };
  }
  componentDidMount() {
    this.setState({ data: 'Hello' });
  }
  render() {
    return <ComposedComponent {...this.props} data={this.state.data} />;
  }
};
@jesstelford
jesstelford / event-loop.md
Last active October 16, 2025 15:48
What is the JS Event Loop and Call Stack?

Regular Event Loop

This shows the execution order given JavaScript's Call Stack, Event Loop, and any asynchronous APIs provided by the JS execution environment (in this example, the Web APIs of a browser environment).


Given the code
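The gist then traces a JavaScript snippet through the call stack and task queue. As an aside (not from the gist), the same ordering can be observed in Python's own event loop, where `loop.call_soon` queues a callback behind the currently running code, much as `setTimeout(fn, 0)` does in a browser; the `order` list here is purely illustrative:

```python
import asyncio

order = []

async def main():
    loop = asyncio.get_running_loop()
    order.append("sync work starts")
    # schedule a callback; it cannot run until the current code yields
    loop.call_soon(order.append, "queued callback")
    order.append("sync work ends")
    await asyncio.sleep(0)  # yield to the event loop once
    # by the time we resume here, the queued callback has run

asyncio.run(main())
print(order)  # ['sync work starts', 'sync work ends', 'queued callback']
```

The synchronous code runs to completion before any queued callback fires, which is the core point the gist makes about the JS event loop.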

@mbollmann
mbollmann / attention_lstm.py
Last active August 22, 2024 07:06
My attempt at creating an LSTM with attention in Keras
class AttentionLSTM(LSTM):
"""LSTM with attention mechanism
This is an LSTM incorporating an attention mechanism into its hidden states.
Currently, the context vector calculated from the attended vector is fed
into the model's internal states, closely following the model by Xu et al.
(2016, Sec. 3.1.2), using a soft attention model following
Bahdanau et al. (2014).
The layer expects two inputs instead of the usual one:
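The soft (additive) attention the gist follows, in the style of Bahdanau et al. (2014), can be sketched outside Keras with plain NumPy. All names and the random weights below are illustrative stand-ins for learned parameters, not code from the gist:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_attention(H, s, W_h, W_s, v):
    """Compute a context vector from hidden states H and a query state s.

    H: (T, d) encoder hidden states; s: (d,) query state.
    W_h, W_s: (d, d) projections; v: (d,) scoring vector -- random here,
    standing in for learned weights.
    """
    scores = np.tanh(H @ W_h + s @ W_s) @ v   # (T,) one score per step
    alpha = softmax(scores)                   # attention weights, sum to 1
    context = alpha @ H                       # (d,) weighted sum of states
    return context, alpha

rng = np.random.default_rng(0)
T, d = 5, 4
H, s = rng.normal(size=(T, d)), rng.normal(size=(d,))
context, alpha = additive_attention(
    H, s, rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=(d,))
)
```

The context vector is then fed back into the recurrent state, which is what the gist's layer does inside the LSTM step.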
@cbaziotis
cbaziotis / Attention.py
Last active October 22, 2024 08:31
Keras Layer that implements an Attention mechanism for temporal data. Supports Masking. Follows the work of Raffel et al. [https://arxiv.org/abs/1512.08756]
from keras import backend as K, initializers, regularizers, constraints
from keras.engine.topology import Layer
def dot_product(x, kernel):
    """
    Wrapper for the dot product operation, in order to be compatible with
    both Theano and TensorFlow backends.
    Args:
        x (): input
        kernel (): weights
    Returns:
    """
    if K.backend() == 'tensorflow':
        return K.squeeze(K.dot(x, K.expand_dims(kernel)), axis=-1)
    else:
        return K.dot(x, kernel)
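The feed-forward attention of Raffel et al. that this layer follows scores each time step, masks out padding, and pools the sequence with the resulting weights. A minimal NumPy sketch of that idea (function and variable names are mine, and the random vector stands in for a learned weight):

```python
import numpy as np

def masked_attention_pool(H, mask, w, b=0.0):
    """Feed-forward attention over time with masking (after Raffel et al.).

    H: (T, d) step activations; mask: (T,) bool, False marks padded steps;
    w: (d,) scoring vector (random here, standing in for a learned weight).
    """
    scores = np.tanh(H @ w + b)               # one scalar score per step
    scores = np.where(mask, scores, -np.inf)  # padded steps get weight 0
    e = np.exp(scores - scores[mask].max())   # stable softmax numerator
    alpha = e / e.sum()
    return alpha @ H, alpha                   # context vector + weights

rng = np.random.default_rng(1)
H = rng.normal(size=(6, 3))
mask = np.array([True, True, True, True, False, False])
context, alpha = masked_attention_pool(H, mask, rng.normal(size=(3,)))
```

Setting masked scores to negative infinity before the softmax is how masking support is commonly achieved: those steps contribute exactly zero weight.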
@ameasure
ameasure / imdb_soft_attention_lstm
Created February 11, 2017 19:47
imdb_soft_attention_lstm.py
# -*- coding: utf-8 -*-
'''Trains an LSTM on the IMDB sentiment classification task with soft attention.
Experiments with max_features=10000, max_len=80
1) MLP-dropout-tanh attention: 83.59 at epoch 4
2) MLP-dropout-relu attention: 83.26 at epoch 3
3) MLP-tanh attention: 82.91 at epoch 4
4) GlobalMaxPooling1D attention: 82.44 at epoch 7
'''
from __future__ import print_function
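Variant (4) in the experiments above replaces learned attention with simple max pooling over time. A minimal sketch of that baseline (names are illustrative):

```python
import numpy as np

def global_max_pool_1d(H):
    """(T, d) -> (d,): keep each feature's maximum over time.

    The attention-free pooling baseline: no learned weights, just the
    strongest activation of every feature across the sequence.
    """
    return H.max(axis=0)

H = np.array([[0.1, 2.0],
              [1.5, -1.0],
              [0.3, 0.5]])
pooled = global_max_pool_1d(H)  # -> [1.5, 2.0]
```

Its 82.44 score above shows how much of the benefit comes from pooling at all, before any learned attention weights are added.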
@slashvar
slashvar / siamese_lstm.py
Created March 20, 2017 11:37
LSTM siamese network (masking issues)
from keras import backend as K
from keras.layers import Input, Dense, merge, Dropout, Lambda, LSTM, Masking
from keras.models import Model, Sequential
from keras.optimizers import SGD, RMSprop, Adam, Nadam
from sys import argv
import argparse
import csv
import json
import numpy as np
import pickle
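A siamese network like this one typically compares the two LSTM embeddings with a Euclidean distance and trains with a contrastive loss. A NumPy sketch of that pairing (the gist's own loss is not shown in the preview, so this is the standard formulation, not necessarily the author's):

```python
import numpy as np

def euclidean_distance(a, b):
    # distance between paired embeddings, clipped for numeric safety
    return np.sqrt(np.maximum(((a - b) ** 2).sum(axis=-1), 1e-12))

def contrastive_loss(y, d, margin=1.0):
    """Contrastive loss (Hadsell et al., 2006): y=1 for matching pairs.

    Matching pairs are pulled together (d^2); non-matching pairs are
    pushed to be at least `margin` apart.
    """
    return np.mean(y * d ** 2 + (1 - y) * np.maximum(margin - d, 0.0) ** 2)

a = np.array([[0.0, 0.0], [0.0, 0.0]])
b = np.array([[3.0, 4.0], [0.1, 0.0]])
d = euclidean_distance(a, b)   # -> [5.0, 0.1]
y = np.array([0.0, 1.0])       # first pair differs, second matches
loss = contrastive_loss(y, d)
```

The masking issues the title mentions arise because padded time steps must be excluded before the final states are compared, much as in the attention layers above.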
@namakemono
namakemono / decomposable_attention.py
Created August 17, 2017 23:53
Decomposable Attention with Keras.
from keras.layers import *
from keras.activations import softmax
from keras.models import Model
"""
References
----------
[1]. Parikh, Ankur P., et al. "A decomposable attention model for natural language inference." arXiv preprint arXiv:1606.01933 (2016).
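The core "Attend" step of Parikh et al. [1] soft-aligns every token of one sentence to a weighted phrase of the other. A simplified NumPy sketch, comparing raw embeddings directly rather than through the paper's learned feed-forward network F (an assumption for brevity):

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(A, B):
    """Soft-alignment ("Attend") step, simplified.

    A: (la, d) premise token vectors; B: (lb, d) hypothesis token vectors.
    """
    E = A @ B.T                        # (la, lb) alignment scores
    beta = softmax(E, axis=1) @ B      # (la, d) B-phrase aligned to each a_i
    alpha = softmax(E, axis=0).T @ A   # (lb, d) A-phrase aligned to each b_j
    return beta, alpha

rng = np.random.default_rng(2)
A, B = rng.normal(size=(4, 3)), rng.normal(size=(5, 3))
beta, alpha = attend(A, B)
```

The paper then compares each token with its aligned phrase and aggregates, which is what makes the model decomposable: no recurrence is needed.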

Libraries and Tools for React

If you're building an application with React, it helps not to have to develop every basic UI component yourself. Here you'll find a list of components, component libraries, and complete design systems developed with and for React.

As the list grew longer and longer, I decided it would be better to give it a "real" site of its own.

👉 Please find the (new) list here: https://react-libs.nilshartmann.net/