Albert Buchard albertbuchard

@albertbuchard
albertbuchard / sparsemax_torch.py
Last active April 18, 2023 18:31
This code defines a PyTorch implementation of the Sparsemax activation function. Sparsemax is an alternative to softmax that produces sparse probability distributions (it is the Euclidean projection onto the probability simplex). The implementation is provided as a PyTorch nn.Module, making it easy to integrate into any architecture.
import torch
import torch.nn as nn

class Sparsemax(nn.Module):
    def __init__(self, dim=-1):
        super(Sparsemax, self).__init__()
        self.dim = dim

    def forward(self, x):
        # Sort in descending order along the dimension to apply Sparsemax to
        z, _ = torch.sort(x, dim=self.dim, descending=True)
        cssv = z.cumsum(dim=self.dim) - 1
        k = torch.arange(1, x.size(self.dim) + 1, dtype=x.dtype, device=x.device)
        shape = [1] * x.dim()
        shape[self.dim] = -1
        # Support size: number of entries that stay nonzero after projection
        support = (k.view(shape) * z > cssv).sum(dim=self.dim, keepdim=True)
        # Threshold tau; clamping yields the Euclidean projection onto the simplex
        tau = cssv.gather(self.dim, support - 1) / support.to(x.dtype)
        return torch.clamp(x - tau, min=0)
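For a quick sanity check, here is a self-contained functional sketch of the same simplex projection (the function name and the sample logits are illustrative, not from the gist), contrasted with softmax:

```python
import torch

def sparsemax(z, dim=-1):
    # Euclidean projection of z onto the probability simplex
    zs, _ = torch.sort(z, dim=dim, descending=True)
    cssv = zs.cumsum(dim=dim) - 1
    k = torch.arange(1, z.size(dim) + 1, dtype=z.dtype, device=z.device)
    shape = [1] * z.dim()
    shape[dim] = -1
    # Size of the support: entries kept at nonzero probability
    support = (k.view(shape) * zs > cssv).sum(dim=dim, keepdim=True)
    tau = cssv.gather(dim, support - 1) / support.to(z.dtype)
    return torch.clamp(z - tau, min=0)

logits = torch.tensor([[2.0, 1.0, -1.0]])
print(sparsemax(logits))              # exact zeros for low-scoring entries
print(torch.softmax(logits, dim=-1))  # strictly positive everywhere
```

Unlike softmax, sparsemax can return exact zeros, which is one reason it is used for interpretable attention weights.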
@albertbuchard
albertbuchard / timeit_fleuret.py
Last active July 23, 2022 21:39
Comparison of tensor fill methods
setup = '''
import numpy as np
import torch

V_nat = [[1, 2], [3, 4]]
U_nat = [[2, -1, 0, 0, 0, 0],
         [5, 2, 8, -1, 0, 0]]

def compute_using_fleuret_1():
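The gist's setup string is cut off above. As a hypothetical sketch only (the function names, shapes, and iteration count below are assumptions, not the gist's actual benchmark), a timeit comparison of tensor-fill approaches could look like this:

```python
import timeit
import torch

# Padded rows, as in U_nat above
rows = [[2, -1, 0, 0, 0, 0],
        [5, 2, 8, -1, 0, 0]]

def fill_with_loop():
    # Element-by-element assignment into a preallocated zero tensor
    out = torch.zeros(len(rows), 6)
    for i, r in enumerate(rows):
        for j, v in enumerate(r):
            out[i, j] = v
    return out

def fill_with_tensor():
    # Single bulk conversion from the nested Python lists
    return torch.tensor(rows, dtype=torch.float)

t_loop = timeit.timeit(fill_with_loop, number=1000)
t_bulk = timeit.timeit(fill_with_tensor, number=1000)
print(f"loop: {t_loop:.4f}s, bulk: {t_bulk:.4f}s")
```

Both functions produce the same tensor; the bulk conversion avoids the per-element Python overhead of the nested loop.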
# Your init script
#
# Atom will evaluate this file each time a new window is opened. It is run
# after packages are loaded/activated and after the previous editor state
# has been restored.
#
# An example hack to log to the console when each text editor is saved.
#
# atom.workspace.observeTextEditors (editor) ->
#   editor.onDidSave ->
#     console.log "Saved! #{editor.getPath()}"