#!/usr/bin/env fish
# Check that the external tools this script depends on are installed.
if not command -q kitty
    set error_msg "kitty not found."
else if not command -q xdotool
    set error_msg "xdotool not found."
else if not command -q wmctrl
    set error_msg "wmctrl not found."
end
#!/usr/bin/env python3
# Unicode characters are neatly categorized into different "scripts", as seen on
# the character code chart <http://www.unicode.org/charts/#scripts> and defined
# in Annex #24 <https://www.unicode.org/reports/tr24/>.
#
# Unfortunately, Python's unicodedata module doesn't provide access to this
# information. However, the fontTools library does include it.
# <https://github.com/fonttools/fonttools>
#
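# A minimal sketch of the lookup described above, assuming the fontTools
# package is installed: fontTools.unicodedata.script() returns the four-letter
# script code defined in Annex #24 (e.g. "Latn", "Cyrl"), and script_name()
# maps that code back to a human-readable name.
from fontTools import unicodedata as ftud

for char in "Aж漢":
    code = ftud.script(char)
    print(char, code, ftud.script_name(code))
# Expected output, roughly:
#   A Latn Latin
#   ж Cyrl Cyrillic
#   漢 Hani Han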
#!/usr/bin/env python3
#
# MIT License
#
# Copyright (c) 2021 Marcel Bollmann
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
#!/usr/bin/env python3
"""Usage: conv_checkpoints_to_model.py MODFILE

Takes a trained model file with multiple saved checkpoints and converts these
checkpoints into standalone models. This allows the different checkpoints to be
used, e.g., as parts of a model ensemble.

This script will:
- Analyze MODFILE to find all saved model components
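# Not the script itself, just a sketch of the core idea under an assumed
# format: MODFILE is a NumPy .npz archive in which each checkpoint's
# parameters share a key suffix such as "_ckpt3". The real naming scheme and
# file format used by the script may well differ.
import re
import numpy as np

def split_checkpoints(modfile):
    data = np.load(modfile)
    # Collect the distinct checkpoint suffixes present in the archive.
    suffixes = set()
    for key in data.files:
        match = re.search(r"_ckpt\d+$", key)
        if match:
            suffixes.add(match.group(0))
    # Write each checkpoint out as a standalone model file.
    for suffix in sorted(suffixes):
        params = {key[: -len(suffix)]: data[key]
                  for key in data.files if key.endswith(suffix)}
        np.savez(modfile.replace(".npz", f"{suffix}.npz"), **params)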
--- theano/sandbox/cuda/opt.py	2017-05-31 23:26:09.972668647 +0200
+++ theano/sandbox/cuda/opt_patched.py	2017-06-01 00:49:43.818626738 +0200
@@ -38,10 +38,12 @@
     GpuElemwise, GpuDimShuffle, GpuReshape, GpuCAReduce,
     gpu_flatten,
     GpuSubtensor, GpuAdvancedSubtensor1,
-    GpuAdvancedIncSubtensor1, GpuAdvancedIncSubtensor1_dev20,
+    GpuAdvancedIncSubtensor1,
     GpuIncSubtensor, gpu_alloc, GpuAlloc, gpu_shape, GpuSplit, GpuAllocEmpty)
 from theano.sandbox.cuda.opt_util import pad_dims, unpad_dims
#!/usr/bin/python3
# -*- coding: utf-8 -*-

import argparse
import sys
from collections import Counter

import bibtexparser
import matplotlib.pyplot as plt
import seaborn as sns
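# Hypothetical continuation, not the gist's actual code: one way these imports
# could fit together is to count BibTeX entries per year and plot the
# distribution (bibtexparser 1.x API; the function name and the presence of a
# "year" field in every entry are assumptions).
def plot_entries_per_year(bibfile):
    with open(bibfile) as f:
        db = bibtexparser.load(f)
    counts = Counter(entry.get("year", "unknown") for entry in db.entries)
    years = sorted(counts)
    sns.barplot(x=years, y=[counts[y] for y in years])
    plt.xlabel("Year")
    plt.ylabel("Number of entries")
    plt.show()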
class AttentionLSTM(LSTM):
    """LSTM with attention mechanism.

    This is an LSTM incorporating an attention mechanism into its hidden states.
    Currently, the context vector calculated from the attended vector is fed
    into the model's internal states, closely following the model by Xu et al.
    (2016, Sec. 3.1.2), using a soft attention model following
    Bahdanau et al. (2014).

    The layer expects two inputs instead of the usual one:
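# A minimal NumPy sketch of the soft-attention step the docstring above refers
# to (Bahdanau et al., 2014): additive scores over the attended vectors,
# softmax weights, and a context vector that can then be mixed into the
# recurrent state. Shapes and weight names here are illustrative only and are
# not taken from the AttentionLSTM implementation.
import numpy as np

def soft_attention(h_prev, attended, W_a, U_a, v_a):
    # h_prev:   (hidden_dim,)         previous hidden state
    # attended: (timesteps, att_dim)  vectors to attend over
    # W_a: (score_dim, hidden_dim), U_a: (score_dim, att_dim), v_a: (score_dim,)
    scores = np.tanh(attended @ U_a.T + W_a @ h_prev) @ v_a   # (timesteps,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                  # softmax
    return weights @ attended                                 # context: (att_dim,)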
# Source:
# https://github.com/farizrahman4u/seq2seq/blob/master/seq2seq/layers/state_transfer_lstm.py
from keras import backend as K
from keras.layers.recurrent import LSTM


class StateTransferLSTM(LSTM):
    """LSTM with the ability to transfer its hidden state.

    This layer behaves just like an LSTM, except that it can transfer (or
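# Not the StateTransferLSTM API itself: a sketch of the same idea, an encoder
# handing its final hidden state to a decoder, written against the current
# Keras functional API (return_state / initial_state), which did not exist in
# the old keras.layers.recurrent interface this gist targets. Dimensions are
# placeholders.
from tensorflow.keras.layers import Input, LSTM

enc_in = Input(shape=(None, 64))
dec_in = Input(shape=(None, 64))
_, state_h, state_c = LSTM(128, return_state=True)(enc_in)
# Transfer the encoder's final state into the decoder as its initial state.
dec_out = LSTM(128, return_sequences=True)(dec_in, initial_state=[state_h, state_c])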