import numpy as np


def fleiss_kappa(ratings):
    """
    Args:
        ratings: An N x R numpy array. N is the number of
            samples and R is the number of reviewers. Each
            entry (n, r) is the category assigned to example
            n by reviewer r.

    Returns:
        The Fleiss' kappa statistic for the R reviewers, as a float.
    """
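The preview stops at the docstring, so the body of fleiss_kappa is not shown. As a hedged sketch of how such a body could be written (my own code, reusing the numpy import above and assuming the categories are integer-coded or otherwise comparable values, not the author's implementation):

# Sketch only, not the original gist body: assumes every sample has all
# R ratings filled in.
def fleiss_kappa_sketch(ratings):
    n, r = ratings.shape
    categories = np.unique(ratings)
    # counts[i, k]: number of reviewers assigning category k to sample i.
    counts = np.stack([(ratings == c).sum(axis=1) for c in categories], axis=1)
    # Observed agreement: average fraction of agreeing reviewer pairs per sample.
    p_bar = (np.sum(counts * (counts - 1), axis=1) / (r * (r - 1))).mean()
    # Chance agreement from the marginal category proportions.
    p_j = counts.sum(axis=0) / (n * r)
    p_e = np.sum(p_j ** 2)
    return (p_bar - p_e) / (1 - p_e)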
""" | |
Author: Awni Hannun | |
This is an example CTC decoder written in Python. The code is | |
intended to be a simple example and is not designed to be | |
especially efficient. | |
The algorithm is a prefix beam search for a model trained | |
with the CTC loss function. |
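The decoder itself is cut off in this preview. As a stand-in, here is a compact, hedged sketch of CTC prefix beam search over a T x V matrix of per-frame label probabilities; the function and variable names are mine, not the author's, and it omits the language-model scoring a full decoder might add.

import collections
import math

import numpy as np

NEG_INF = -float("inf")


def logsumexp(*args):
    # Stable log(sum(exp(a) for a in args)).
    if all(a == NEG_INF for a in args):
        return NEG_INF
    a_max = max(args)
    return a_max + math.log(sum(math.exp(a - a_max) for a in args))


def prefix_beam_search(probs, beam_size=10, blank=0):
    # probs: T x V numpy array of per-frame probabilities (rows sum to 1).
    T, V = probs.shape
    log_probs = np.log(probs + 1e-30)

    # Each beam entry maps a prefix (tuple of labels) to a pair of scores:
    # log prob of the prefix ending in blank, and ending in a non-blank.
    beam = [((), (0.0, NEG_INF))]

    for t in range(T):
        next_beam = collections.defaultdict(lambda: (NEG_INF, NEG_INF))
        for prefix, (p_b, p_nb) in beam:
            for s in range(V):
                p = log_probs[t, s]
                if s == blank:
                    # A blank never changes the prefix.
                    n_p_b, n_p_nb = next_beam[prefix]
                    next_beam[prefix] = (logsumexp(n_p_b, p_b + p, p_nb + p), n_p_nb)
                    continue
                last = prefix[-1] if prefix else None
                new_prefix = prefix + (s,)
                n_p_b, n_p_nb = next_beam[new_prefix]
                if s == last:
                    # Repeated label: only the blank-ending mass extends the prefix;
                    # the non-blank mass collapses back onto the same prefix.
                    next_beam[new_prefix] = (n_p_b, logsumexp(n_p_nb, p_b + p))
                    u_p_b, u_p_nb = next_beam[prefix]
                    next_beam[prefix] = (u_p_b, logsumexp(u_p_nb, p_nb + p))
                else:
                    next_beam[new_prefix] = (n_p_b, logsumexp(n_p_nb, p_b + p, p_nb + p))
        # Prune to the most probable prefixes.
        beam = sorted(next_beam.items(),
                      key=lambda kv: logsumexp(*kv[1]),
                      reverse=True)[:beam_size]

    best, (p_b, p_nb) = beam[0]
    return list(best), -logsumexp(p_b, p_nb)


# Tiny usage example on random, row-normalized scores.
if __name__ == "__main__":
    np.random.seed(0)
    scores = np.random.rand(20, 5)
    scores /= scores.sum(axis=1, keepdims=True)
    labels, neg_log_prob = prefix_beam_search(scores, beam_size=5)
    print(labels, neg_log_prob)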
import time

import torch
import torch.nn as nn
from torch.autograd import Variable


def attend_bmm(eh, dhx):
    # eh: (batch, time, dim) encoder states; dhx: (batch, dim) decoder state.
    dhx = dhx.unsqueeze(1)
    # Dot-product scores of the decoder state against every encoder step.
    pax = torch.bmm(eh, dhx.transpose(1, 2)).squeeze(dim=2)
    # Normalize over the time dimension to get attention weights.
    ax = nn.functional.softmax(pax, dim=1)
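The preview cuts attend_bmm off before it finishes. A hedged guess at the remaining step, reusing the torch / nn imports above, is to take the attention-weighted sum of the encoder states with a second bmm; the function name and the returned pair below are my own choices, not the author's code.

# Assumed completion, not the original gist: form the context vector by
# weighting the encoder states with ax and summing over time.
def attend_bmm_sketch(eh, dhx):
    dhx = dhx.unsqueeze(1)                                    # (batch, 1, dim)
    pax = torch.bmm(eh, dhx.transpose(1, 2)).squeeze(dim=2)   # (batch, time)
    ax = nn.functional.softmax(pax, dim=1)                    # attention weights
    sx = torch.bmm(ax.unsqueeze(1), eh).squeeze(dim=1)        # (batch, dim) context
    return sx, ax

# Example call with random tensors:
#   eh = torch.randn(4, 12, 256); dhx = torch.randn(4, 256)
#   context, weights = attend_bmm_sketch(eh, dhx)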
import torch
import torch.autograd as autograd
import numpy as np

np.random.seed(11)

for size in range(1, 2000, 1):
    a = np.random.randint(0, 2, size).astype(np.uint8)
    av = autograd.Variable(torch.ByteTensor(a))
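    # The preview is truncated here; the check below is an assumed
    # continuation (mine, not the gist's), e.g. verifying that the torch
    # sum matches numpy once the values are cast up from uint8 so the
    # result cannot overflow at 255.
    assert int(av.long().sum()) == int(a.sum()), size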