# Produced by running https://gist.github.com/mbollmann/827a079023ebdd18b4d06c28566fac0d
# with flags -o -e -c -w on commit 5a875471
1993.tmi.yaml['1993.tmi-1.17']: Value of root['author_string'] changed from "Pierre Isabelle, Marc Dymetman, George Foster, Jean-Marc Jutras, Elliott" to "Pierre Isabelle, Marc Dymetman, George Foster, Jean-Marc Jutras, Elliott".
1993.tmi.yaml['1993.tmi-1.22']: Value of root['author_string'] changed from "Masaru Tomita, Masako Shirai, Junya Tsutsumi, Miki Matsumura, Yuki" to "Masaru Tomita, Masako Shirai, Junya Tsutsumi, Miki Matsumura, Yuki".
2005.iwslt.yaml['2005.iwslt-1.6']: Value of root['author_string'] changed from "Sanjika Hewavitharana, Bing Zhao, Hildebrand, Almut Silja, Matthias Eck, Chiori Hori, Stephan Vogel, Alex Waibel" to "Sanjika Hewavitharana, Bing Zhao, Hildebrand, Almut Silja, Matthias Eck, Chiori Hori, Stephan Vogel, Alex Waibel".
2006.amta.yaml['2006.amta-panel1.0']: Value of root['url'] changed from "https://aclanthology.org/2006.amta-panels.0/" to "h
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Copyright 2024 Marcel Bollmann <[email protected]>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#!/usr/bin/env fish
if not command -q kitty
    set error_msg "kitty not found."
else if not command -q xdotool
    set error_msg "xdotool not found."
else if not command -q wmctrl
    set error_msg "wmctrl not found."
end
#!/usr/bin/env python3
# Unicode characters are neatly categorized into different "scripts", as seen on
# the character code chart <http://www.unicode.org/charts/#scripts> and defined
# in Annex #24 <https://www.unicode.org/reports/tr24/>.
#
# Unfortunately, Python's unicodedata module doesn't provide access to this
# information. However, the fontTools library does include this.
# <https://github.com/fonttools/fonttools>
#
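As a concrete illustration of the workaround the comment describes, `fontTools.unicodedata` exposes `script()` and `script_name()` lookups built from Unicode's Scripts data. A minimal sketch (assumes `fonttools` is installed via pip; not part of the original snippet):

```python
from fontTools.unicodedata import script, script_name

# Map a character to its four-letter script code and the script's
# human-readable name, per Unicode's Scripts.txt (UAX #24).
print(script("a"), script_name(script("a")))    # Latn Latin
print(script("Я"), script_name(script("Я")))    # Cyrl Cyrillic
print(script("あ"), script_name(script("あ")))  # Hira Hiragana
```

This gives per-character script information that the standard-library `unicodedata` module does not offer.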
#!/usr/bin/env python3
#
# MIT License
#
# Copyright (c) 2021 Marcel Bollmann
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
#!/usr/bin/env python3
"""Usage: conv_checkpoints_to_model.py MODFILE

Takes a trained model file with multiple saved checkpoints and converts these
checkpoints into standalone models. This allows the different checkpoints to be
used, e.g., as parts of a model ensemble.

This script will:
- Analyze MODFILE to find all saved model components
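The splitting step the docstring describes could be sketched as follows. This is a hypothetical illustration only: it assumes the model file stores checkpointed parameters under names carrying a suffix like `_chk<N>` (the actual on-disk format of MODFILE is not shown in the snippet), and `split_checkpoints` is an invented helper, not the script's real code:

```python
import re

# Hypothetical naming convention: checkpointed parameters end in "_chk<N>".
CHK = re.compile(r"^(?P<name>.+)_chk(?P<n>\d+)$")

def split_checkpoints(arrays):
    """Group parameter arrays into one standalone parameter set per checkpoint.

    `arrays` maps names like 'W_chk3' to parameter values; returns a dict
    mapping each checkpoint number to {base_name: value}.
    """
    models = {}
    for key, value in arrays.items():
        m = CHK.match(key)
        if m:
            models.setdefault(int(m.group("n")), {})[m.group("name")] = value
    return models
```

Each resulting parameter set could then be written out as its own model file (e.g. with `numpy.savez`) and loaded independently for ensembling.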
--- theano/sandbox/cuda/opt.py	2017-05-31 23:26:09.972668647 +0200
+++ theano/sandbox/cuda/opt_patched.py	2017-06-01 00:49:43.818626738 +0200
@@ -38,10 +38,12 @@
     GpuElemwise, GpuDimShuffle, GpuReshape, GpuCAReduce,
     gpu_flatten,
     GpuSubtensor, GpuAdvancedSubtensor1,
-    GpuAdvancedIncSubtensor1, GpuAdvancedIncSubtensor1_dev20,
+    GpuAdvancedIncSubtensor1,
     GpuIncSubtensor, gpu_alloc, GpuAlloc, gpu_shape, GpuSplit, GpuAllocEmpty)
 from theano.sandbox.cuda.opt_util import pad_dims, unpad_dims
#!/usr/bin/python3
# -*- coding: utf-8 -*-
import argparse
import sys
from collections import Counter

import bibtexparser
import matplotlib.pyplot as plt
import seaborn as sns
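The usual pattern with these imports is to load a `.bib` file with `bibtexparser.load()` and tally a field with `Counter` before plotting. A minimal sketch of the counting step, using hand-written dicts in place of a parsed file (the entries mimic the shape of `bibtexparser`'s `BibDatabase.entries`, but the data is made up):

```python
from collections import Counter

# Entries shaped like bibtexparser's `BibDatabase.entries`: a list of dicts
# keyed by lowercase BibTeX field names.
entries = [
    {"ENTRYTYPE": "inproceedings", "year": "2019"},
    {"ENTRYTYPE": "article", "year": "2020"},
    {"ENTRYTYPE": "inproceedings", "year": "2020"},
]

# Count publications per year; entries without a year fall into "unknown".
years = Counter(e.get("year", "unknown") for e in entries)
print(years.most_common())  # [('2020', 2), ('2019', 1)]
```

The resulting counts can be passed straight to `plt.bar` or `sns.barplot` for the plotting step.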
class AttentionLSTM(LSTM):
    """LSTM with attention mechanism

    This is an LSTM incorporating an attention mechanism into its hidden states.
    Currently, the context vector calculated from the attended vector is fed
    into the model's internal states, closely following the model by Xu et al.
    (2016, Sec. 3.1.2), using a soft attention model following
    Bahdanau et al. (2014).

    The layer expects two inputs instead of the usual one:
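The soft attention the docstring cites (Bahdanau et al., 2014) can be sketched in plain NumPy. This is a generic illustration of additive attention over a sequence of attended vectors, not the layer's actual implementation; the weight names and shapes (`Wa`, `Ua`, `va`) are assumptions for the sketch:

```python
import numpy as np

def soft_attention(states, query, Wa, Ua, va):
    """Additive (Bahdanau-style) soft attention.

    states: (T, d) attended vectors; query: (d,) current hidden state.
    Wa, Ua: (k, d) projections; va: (k,) scoring vector.
    Returns (weights of shape (T,), context of shape (d,)).
    """
    # One unnormalized score per timestep: e_t = va . tanh(Wa h_t + Ua s)
    e = np.tanh(states @ Wa.T + query @ Ua.T) @ va
    a = np.exp(e - e.max())  # numerically stable softmax
    a /= a.sum()
    context = a @ states     # weighted sum of the attended vectors
    return a, context
```

The resulting context vector is what, per the docstring, gets fed into the LSTM's internal state update at each step.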