🍀
Happy. Thinking. Understanding.

Francois Vanderseypen Orbifold

@Orbifold
Orbifold / LSTM_translation.py
Created January 27, 2018 12:37
Keras translation network.
# The [Anki repository](http://www.manythings.org/anki/) offers many sentence pairs for learning a language, and they are ideal for training a translation network.
# Judging the quality of a translation is easier when you know a bit of both languages, so in my case
# the [Dutch-English](http://www.manythings.org/anki/nld-eng.zip),
# [French-English](http://www.manythings.org/anki/fra-eng.zip)
# and [German-English](http://www.manythings.org/anki/deu-eng.zip) pairs were perfect.
import string
import re
from pickle import dump
from unicodedata import normalize
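The imports above suggest the usual cleaning pipeline for Anki's tab-separated pairs. A minimal stdlib sketch of that step (the exact cleaning rules are an assumption, not taken from the full gist):

```python
import re
import string
from unicodedata import normalize

def clean_pair(line):
    """Split one tab-separated Anki line into a cleaned pair of sentences.

    NFD-normalize to strip accents, lowercase, drop punctuation,
    and collapse whitespace.
    """
    pairs = line.strip().split("\t")[:2]
    cleaned = []
    for text in pairs:
        # Decompose accented characters and keep only the ASCII part.
        text = normalize("NFD", text).encode("ascii", "ignore").decode("ascii")
        # Lowercase and remove punctuation.
        text = text.lower().translate(str.maketrans("", "", string.punctuation))
        # Collapse repeated whitespace.
        cleaned.append(re.sub(r"\s+", " ", text).strip())
    return tuple(cleaned)
```

For example, `clean_pair("Café!\tKoffie?")` yields `("cafe", "koffie")`, ready to be pickled with `dump` for the training script.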
@Orbifold
Orbifold / epochs.py
Created January 27, 2018 11:33
Running experiments to improve accuracy without grid search or the like.
import argparse
import matplotlib.pyplot as plt
import numpy as np
from keras.layers.core import Dense
from keras.models import Sequential
from numpy import array
from scipy import signal
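The `scipy.signal` import hints that per-epoch accuracy curves are smoothed before comparing experiments. A pure-Python stand-in for that smoothing (the window size is an illustrative choice, not taken from the gist):

```python
def moving_average(values, window=3):
    """Smooth a noisy per-epoch accuracy curve with a simple moving average.

    Returns len(values) - window + 1 smoothed points.
    """
    if window < 1 or window > len(values):
        raise ValueError("window must be between 1 and len(values)")
    return [
        sum(values[i:i + window]) / window
        for i in range(len(values) - window + 1)
    ]
```

Plotting the smoothed curve alongside the raw one makes it easier to see whether extra epochs still help.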
@Orbifold
Orbifold / Intends.ipynb
Created January 27, 2018 06:43
Mapping language to intents.
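To illustrate the idea of mapping language to intents, here is a hypothetical keyword-overlap sketch — not the notebook's actual model. `intents` maps an intent name to a set of trigger words:

```python
def map_intent(utterance, intents):
    """Map an utterance to the intent whose keywords overlap it most.

    Returns None when no keyword matches.
    """
    words = set(utterance.lower().split())
    best, best_score = None, 0
    for name, keywords in intents.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = name, score
    return best
```

A trained classifier replaces the keyword overlap in practice, but the input/output contract stays the same: free text in, intent label out.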
@Orbifold
Orbifold / Seq2seq.py
Created January 16, 2018 06:00
Sequence to sequence translation in Keras.
'''Sequence to sequence example in Keras (character-level).
This script demonstrates how to implement a basic character-level
sequence-to-sequence model. We apply it to translating
short English sentences into short French sentences,
character-by-character. Note that it is fairly unusual to
do character-level machine translation, as word-level
models are more common in this domain.
# Summary of the algorithm
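Character-level models consume one-hot vectors, one per character. A minimal sketch of that encoding step, assuming `alphabet` is the sorted set of characters seen in the corpus:

```python
def one_hot_encode(text, alphabet):
    """One-hot encode a string character by character.

    Returns one row of len(alphabet) zeros-and-a-one per character.
    """
    index = {ch: i for i, ch in enumerate(alphabet)}
    rows = []
    for ch in text:
        row = [0] * len(alphabet)
        row[index[ch]] = 1
        rows.append(row)
    return rows
```

The encoder and decoder each get a 3D tensor of such rows, shaped (samples, max sequence length, alphabet size).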
@Orbifold
Orbifold / train_ner.py
Created June 20, 2017 15:08
Out-of-core NER trainer for Dutch.
# http://nlpforhackers.io/training-ner-large-dataset/
# http://gmb.let.rug.nl/data.php
import os
from nltk import conlltags2tree
def to_conll_iob(annotated_sentence):
"""
`annotated_sentence` = list of triplets [(w1, t1, iob1), ...]
Transform a pseudo-IOB notation: O, PERSON, PERSON, O, O, LOCATION, O
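The docstring above describes turning pseudo-IOB tags into proper B-/I- prefixed ones. A pure-Python sketch of just that transformation (a simplification of the gist's `to_conll_iob`, which also carries the word and POS tag along):

```python
def to_iob(tags):
    """Convert pseudo-IOB tags (O, PERSON, PERSON, O) into proper IOB
    tags (O, B-PERSON, I-PERSON, O).

    A tag opens a new entity (B-) unless the previous tag is identical,
    in which case it continues one (I-).
    """
    result = []
    for i, tag in enumerate(tags):
        if tag == "O":
            result.append("O")
        elif i > 0 and tags[i - 1] == tag:
            result.append("I-" + tag)
        else:
            result.append("B-" + tag)
    return result
```

With the tags in proper IOB form, `nltk.conlltags2tree` can build the chunk tree the trainer consumes.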
@Orbifold
Orbifold / node-d3.js
Last active July 5, 2018 05:50
Creating a pie chart with Node.js and d3.
var fs = require('fs');
var path = require('path');
var d3 = require('d3');
const jsdom = require("jsdom");
const JSDOM = jsdom.JSDOM;
var chartWidth = 500, chartHeight = 500;
var arc = d3.svg.arc()
.outerRadius(chartWidth / 2 - 10)
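Before the arc generator above can draw anything, d3's pie layout converts values into cumulative start/end angles. A minimal Python sketch of that computation (d3 itself also handles sorting and padding, which this omits):

```python
def pie_angles(values):
    """Compute (start_angle, end_angle) in radians for each pie slice.

    Each slice's share of the full circle (2*pi) is proportional to its
    value; slices are laid out consecutively, starting at angle 0.
    """
    total = sum(values)
    angles, start = [], 0.0
    for v in values:
        end = start + 2 * 3.141592653589793 * v / total
        angles.append((start, end))
        start = end
    return angles
```

Each (start, end) pair is what an arc generator like the one above turns into an SVG path.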
@Orbifold
Orbifold / LSTM MxNet.R
Created December 12, 2016 18:43
An LSTM neural network reproducing mini Shakespeare.
require(mxnet)
batch.size = 32
seq.len = 32
num.hidden = 16
num.embed = 16
num.lstm.layer = 1
num.round = 1
learning.rate = 0.1
wd = 0.00001
clip_gradient = 1
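The `clip_gradient=1` setting asks mxnet to bound gradients during backpropagation, which keeps an LSTM from diverging. A pure-Python sketch of element-wise clipping, one common interpretation of that setting (mxnet's own implementation may differ, e.g. clipping by global norm):

```python
def clip_gradient(grads, threshold):
    """Clip each gradient component to the range [-threshold, threshold].

    Components already inside the range pass through unchanged.
    """
    return [max(-threshold, min(threshold, g)) for g in grads]
```

With a threshold of 1, an exploding gradient of 25.0 is capped to 1.0 before the weight update.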
@Orbifold
Orbifold / Gensim.py
Created December 12, 2016 06:52
A Word2Vec experiment on the Bible.
from gensim.utils import simple_preprocess
tokenize = lambda x: simple_preprocess(x)
# tokenize("We can load the vocabulary from the JSON file, and generate a reverse mapping (from index to word, so that we can decode an encoded string if we want)?!")
import os
import json
import numpy as np
from gensim.models import Word2Vec
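The commented-out line above mentions building a vocabulary and a reverse mapping (index to word) so encoded strings can be decoded again. A stdlib sketch of that step, assuming indices are assigned in order of first appearance:

```python
def build_vocab(tokenized_sentences):
    """Build word-to-index and index-to-word mappings from tokenized text.

    Returns (word_to_index, index_to_word); the second dict inverts the
    first, so any encoded sequence of indices can be decoded back to words.
    """
    word_to_index = {}
    for sentence in tokenized_sentences:
        for word in sentence:
            if word not in word_to_index:
                word_to_index[word] = len(word_to_index)
    index_to_word = {i: w for w, i in word_to_index.items()}
    return word_to_index, index_to_word
```

Feeding the same tokenized sentences to gensim's `Word2Vec` then attaches a vector to each word in this vocabulary.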
@Orbifold
Orbifold / Feedforward.ipynb
Created November 25, 2016 09:47
Feedforward examples using Keras.
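As a taste of what the notebook covers, here is the building block a Keras feedforward network stacks: a dense unit computing sigmoid(w · x + b). A stdlib sketch with illustrative weights, not values from the notebook:

```python
import math

def dense_forward(inputs, weights, bias):
    """Forward pass of a single sigmoid unit: sigmoid(w . x + b).

    A Keras Dense layer with sigmoid activation computes this for every
    unit in the layer at once.
    """
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))
```

With zero weights and bias the unit outputs 0.5, the sigmoid's midpoint; training shifts the weights so the output separates the classes.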