
Marcel Bollmann (mbollmann)

mbollmann / bibtex_collect_stats.py
Created November 17, 2016 11:10
Collecting stats about paper titles per year in a .bib file
#!/usr/bin/python3
# -*- coding: utf-8 -*-
import argparse
import bibtexparser
from collections import Counter
import matplotlib.pyplot as plt
import seaborn as sns
import sys
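The imports above outline the gist's flow: parse the .bib file, tally entries per year with Counter, and plot the tallies with seaborn/matplotlib. The counting step can be sketched without bibtexparser; this stand-in uses a simple regex, and the brace- or quote-delimited `year` field format is an assumption about the input file:

```python
import re
from collections import Counter

def count_entries_per_year(bibtex: str) -> Counter:
    """Tally how many BibTeX entries carry each year value (regex stand-in
    for bibtexparser; assumes fields like `year = {2016}` or `year = "2016"`)."""
    years = re.findall(r'year\s*=\s*[{"](\d{4})[}"]', bibtex, flags=re.IGNORECASE)
    return Counter(years)

sample = """
@inproceedings{a, title = {Paper One},   year = {2015}}
@article{b,       title = {Paper Two},   year = {2016}}
@article{c,       title = {Paper Three}, year = {2016}}
"""
counts = count_entries_per_year(sample)  # Counter({'2016': 2, '2015': 1})
```

The resulting Counter maps year strings to entry counts, ready to be sorted by key and handed to a bar plot.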
mbollmann / attention_lstm.py
Last active August 22, 2024 07:06
My attempt at creating an LSTM with attention in Keras
class AttentionLSTM(LSTM):
"""LSTM with attention mechanism
This is an LSTM incorporating an attention mechanism into its hidden states.
Currently, the context vector calculated from the attended vector is fed
into the model's internal states, closely following the model by Xu et al.
(2016, Sec. 3.1.2), using a soft attention model following
Bahdanau et al. (2014).
The layer expects two inputs instead of the usual one:
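The docstring describes soft attention in the style of Bahdanau et al. (2014): an alignment score for each attended timestep, softmax-normalized into weights, and a context vector formed as the weighted sum. A self-contained NumPy sketch of that computation (the weight shapes and names here are illustrative, not the gist's actual Keras parameters):

```python
import numpy as np

def soft_attention(h_t, annotations, W_h, W_a, v):
    """Additive (Bahdanau-style) soft attention.

    h_t:         (d_h,)    current hidden state
    annotations: (T, d_a)  attended sequence, e.g. encoder outputs
    Returns the context vector (d_a,) and attention weights (T,).
    """
    # Alignment scores: e_i = v^T tanh(W_h h_t + W_a a_i)
    scores = np.tanh(h_t @ W_h + annotations @ W_a) @ v  # (T,)
    # Softmax over timesteps -> weights summing to 1
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted sum of the annotations
    context = weights @ annotations                      # (d_a,)
    return context, weights

rng = np.random.default_rng(0)
T, d_h, d_a, d_e = 5, 4, 3, 6
context, weights = soft_attention(
    rng.standard_normal(d_h),
    rng.standard_normal((T, d_a)),
    W_h=rng.standard_normal((d_h, d_e)),
    W_a=rng.standard_normal((d_a, d_e)),
    v=rng.standard_normal(d_e),
)
```

In the gist, a context vector like this is what gets fed into the LSTM's internal state update at each step.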
mbollmann / hidden_state_lstm.py
Created August 17, 2016 10:02
Keras LSTM that inputs/outputs its internal states, e.g. for hidden state transfer
from keras import backend as K
from keras.layers.recurrent import LSTM
class HiddenStateLSTM(LSTM):
"""LSTM with input/output capabilities for its hidden state.
This layer behaves just like an LSTM, except that it accepts further inputs
to be used as its initial states, and returns additional outputs
representing the layer's final states.
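The idea behind HiddenStateLSTM, consuming initial states as extra inputs and emitting final states as extra outputs, can be illustrated with a single plain-NumPy LSTM step: the final (h, c) of one run seeds the next. The gate ordering and stacked weight shapes below are a generic textbook LSTM, not the gist's Keras internals:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step that both consumes and returns its (h, c) states.

    x: (d_in,) input; h, c: (d,) previous states.
    W: (d_in, 4d), U: (d, 4d), b: (4d,) gate parameters stacked
    in the order [input, forget, cell, output].
    """
    d = h.shape[0]
    z = x @ W + h @ U + b            # (4d,) pre-activations for all gates
    i = sigmoid(z[:d])               # input gate
    f = sigmoid(z[d:2 * d])          # forget gate
    g = np.tanh(z[2 * d:3 * d])      # candidate cell state
    o = sigmoid(z[3 * d:])           # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(1)
d_in, d = 3, 4
W = rng.standard_normal((d_in, 4 * d))
U = rng.standard_normal((d, 4 * d))
b = np.zeros(4 * d)

# Run over a short sequence, threading the states through...
h, c = np.zeros(d), np.zeros(d)
for x in rng.standard_normal((5, d_in)):
    h, c = lstm_step(x, h, c, W, U, b)

# ...then hand the final states to another step as its *initial* states,
# which is the state transfer the gist adds to the Keras layer.
h2, c2 = lstm_step(rng.standard_normal(d_in), h, c, W, U, b)
```

Threading (h, c) explicitly like this is what decoder initialization in a seq2seq model amounts to: the encoder's final states become the decoder's initial states.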
mbollmann / state_transfer_lstm.py
Created June 18, 2016 08:59
StateTransferLSTM for Keras 1.x
# Source:
# https://github.com/farizrahman4u/seq2seq/blob/master/seq2seq/layers/state_transfer_lstm.py
from keras import backend as K
from keras.layers.recurrent import LSTM
class StateTransferLSTM(LSTM):
"""LSTM with the ability to transfer its hidden state.
This layer behaves just like an LSTM, except that it can transfer (or