@gvx
gvx / mastermind.py
Created May 16, 2011 20:20
An implementation of Knuth's five-guess algorithm to solve a mastermind code [CC0]
from itertools import product
def score(self, other):
    # pegs of the right colour in the right position
    first = len([speg for speg, opeg in zip(self, other) if speg == opeg])
    # plus, separately, pegs of the right colour in the wrong position
    return first, sum([min(self.count(j), other.count(j)) for j in 'ABCDEF']) - first

# all 1296 candidate codes and every feasible (right, wrong) score pair
possible = [''.join(p) for p in product('ABCDEF', repeat=4)]
results = [(right, wrong) for right in range(5) for wrong in range(5 - right) if not (right == 3 and wrong == 1)]

def solve(scorefun):
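The preview cuts off at the top of solve(). As a rough sketch of the minimax step Knuth's algorithm relies on (an assumed continuation reusing the names above, not the gist's own code), the next guess is the one whose worst-case score bucket leaves the fewest candidates:

def best_guess(candidates, possible, results):
    # hypothetical helper, not part of the gist: Knuth's minimax choice
    def worst_case(guess):
        # size of the largest group of candidates that would share one score outcome
        return max(sum(1 for c in candidates if score(guess, c) == r) for r in results)
    return min(possible, key=worst_case)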
@timcowlishaw
timcowlishaw / Trie.scala
Created November 14, 2011 10:06
A Trie (Prefix-tree) implementation in Scala
package uk.ac.ucl.cs.GI15.timNancyKawal {

  import scala.collection.Seq
  import scala.collection.immutable.TreeMap
  import scala.collection.immutable.WrappedString

  class Trie[V](key: Option[Char]) {

    def this() {
      this(None);
    }
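The Scala preview stops mid-class. As a language-agnostic illustration of the same idea (kept in Python like the other sketches on this page, with illustrative names rather than the gist's API), a trie is just nested maps from characters to child nodes:

class PyTrie:
    # illustrative dict-based trie, not the gist's Scala API
    def __init__(self):
        self.children = {}   # char -> PyTrie
        self.value = None    # payload stored at a complete key

    def insert(self, key, value):
        node = self
        for ch in key:
            node = node.children.setdefault(ch, PyTrie())
        node.value = value

    def get(self, key):
        node = self
        for ch in key:
            node = node.children.get(ch)
            if node is None:
                return None
        return node.value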
@crtr0
crtr0 / client.js
Created June 8, 2012 17:02
A simple example of setting up dynamic "rooms" for socket.io clients to join
// set up a connection between the client and the server
var socket = io.connect();
// let's assume that the client page, once rendered, knows what room it wants to join
var room = "abc123";
socket.on('connect', function() {
  // connected; sign up to receive messages for this room
  socket.emit('room', room);
});
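For context, a hypothetical server-side counterpart is sketched below with the python-socketio library (the gist's client actually talks to a Node.js Socket.IO server; the event and message names here are illustrative):

import socketio

sio = socketio.Server()
app = socketio.WSGIApp(sio)   # serve with any WSGI server

@sio.on('room')
def on_room(sid, room):
    # put this client into the requested room...
    sio.enter_room(sid, room)
    # ...so later emits can target just that room's members
    sio.emit('message', 'joined ' + room, room=room)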
@ttezel
ttezel / gist:4138642
Last active July 27, 2024 14:46
Natural Language Processing Notes

# A Collection of NLP notes

## N-grams

### Calculating unigram probabilities:

P(wi) = count(wi) / count(total number of words)

In English...
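For example, the estimate above on a toy corpus (a quick sketch using collections.Counter; the corpus is made up for illustration):

from collections import Counter

words = "the cat sat on the mat".split()
counts = Counter(words)
total = len(words)

def p_unigram(w):
    # count(wi) / count(total number of words)
    return counts[w] / total

print(p_unigram('the'))   # 2 / 6 ≈ 0.33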

@neomatrix369
neomatrix369 / PerformanceRelated.md
Last active November 3, 2023 20:27
Interesting links in the areas of HPC, low latency, mechanical harmony/sympathy, garbage collection
@karpathy
karpathy / min-char-rnn.py
Last active February 26, 2025 02:03
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be a simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
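The preview stops right after the data load; a likely next step (sketched here as an assumption, not necessarily the gist's exact lines) is to build the char-to-index maps and the one-hot inputs the model trains on:

# assumed continuation: integer ids for each character, and a one-hot encoding
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

x = np.zeros((vocab_size, 1))      # one-hot column vector for a single character
x[char_to_ix[data[0]]] = 1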
@fchollet
fchollet / classifier_from_little_data_script_3.py
Last active February 26, 2025 01:37
Fine-tuning a Keras model. Updated to the Keras 2.0 API.
'''This script goes along the blog post
"Building powerful image classification models using very little data"
from blog.keras.io.
It uses data that can be downloaded at:
https://www.kaggle.com/c/dogs-vs-cats/data
In our setup, we:
- created a data/ folder
- created train/ and validation/ subfolders inside data/
- created cats/ and dogs/ subfolders inside train/ and validation/
- put the cat pictures index 0-999 in data/train/cats
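The rest of the script is cut off here. As a rough sketch of the fine-tuning idea it implements (using keras.applications and placeholder shapes, not the gist's exact code): load a pretrained VGG16 base, add a small classifier head, freeze everything below the last conv block, and train with a low learning rate.

from keras.applications.vgg16 import VGG16
from keras.models import Model
from keras.layers import Flatten, Dense
from keras import optimizers

base = VGG16(weights='imagenet', include_top=False, input_shape=(150, 150, 3))
x = Flatten()(base.output)
x = Dense(256, activation='relu')(x)
out = Dense(1, activation='sigmoid')(x)   # cats vs dogs
model = Model(inputs=base.input, outputs=out)

for layer in base.layers[:15]:            # keep the earlier conv blocks frozen
    layer.trainable = False

# a low learning rate so the pretrained weights are only nudged
model.compile(optimizer=optimizers.SGD(lr=1e-4, momentum=0.9),
              loss='binary_crossentropy', metrics=['accuracy'])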
@monikkinom
monikkinom / rnn-lstm.py
Last active September 3, 2019 04:44
TensorFlow RNN-LSTM implementation to count the number of set bits in a binary string
# Source code accompanying the blog post at http://monik.in/a-noobs-guide-to-implementing-rnn-lstm-using-tensorflow/
import numpy as np
import random
from random import shuffle
import tensorflow as tf
# from tensorflow.models.rnn import rnn_cell
# from tensorflow.models.rnn import rnn
NUM_EXAMPLES = 10000
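The preview ends at NUM_EXAMPLES; below is a sketch of the kind of training data the model needs (an assumption about the setup, following the linked blog post's idea of labelling 20-bit strings by their count of ones):

# assumed data generation: random 20-bit strings, each labelled with a
# one-hot vector over the 21 possible counts of set bits (0..20)
train_input = ['{0:020b}'.format(i) for i in random.sample(range(2 ** 20), NUM_EXAMPLES)]
train_input = [[[int(bit)] for bit in s] for s in train_input]   # shape (N, 20, 1)

train_output = []
for seq in train_input:
    ones = sum(step[0] for step in seq)
    label = [0] * 21
    label[ones] = 1
    train_output.append(label)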
@codekansas
codekansas / keras_gensim_embeddings.py
Last active July 23, 2018 09:17
Using Word2Vec embeddings in Keras models
from __future__ import print_function
import json
import os
import numpy as np
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess
from keras.engine import Input
from keras.layers import Embedding, merge
from __future__ import print_function
import numpy as np
from keras.callbacks import Callback
from keras.layers import Dense
from keras.layers import LSTM
from keras.models import Sequential
from numpy.random import choice
from utils import prepare_sequences
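A rough sketch of the bridge this gist's title describes (gensim 3.x-era names and a placeholder corpus path; not necessarily the gist's exact approach): train or load a Word2Vec model, then copy its vectors into a frozen Keras Embedding layer.

import numpy as np
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess
from keras.layers import Embedding

sentences = [simple_preprocess(line) for line in open('corpus.txt')]   # placeholder path
w2v = Word2Vec(sentences, size=100, min_count=1)                       # gensim <4 uses `size`

vocab = list(w2v.wv.index2word)
weights = np.array([w2v.wv[word] for word in vocab])

embedding = Embedding(input_dim=weights.shape[0],    # vocabulary size
                      output_dim=weights.shape[1],   # vector dimension
                      weights=[weights],
                      trainable=False)               # keep the pretrained vectors fixed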