Chen Yu cwhy

@cwhy
cwhy / nips2018.md
Last active January 11, 2024 13:17
NIPS 2018 Abstract

Unsupervisedly Learned Latent Graphs as Transferable Representations

Modern deep transfer learning approaches have mainly focused on learning \emph{generic} feature vectors from one task that are transferable to other tasks, such as word embeddings in language and pretrained convolutional features in vision. However, these approaches usually transfer unary features and largely ignore more structured graphical representations. This work explores the possibility of learning generic latent graphs that capture dependencies between pairs of data units (e.g., words or pixels) from large-scale unlabeled data and transferring the graphs to downstream tasks. Our
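
As a loose sketch of the general idea described here (not the paper's actual architecture), pairwise dependencies between unit embeddings could be scored with a learned bilinear form and row-normalized into a soft adjacency matrix; the embedding dimension, scoring function, and class name below are illustrative assumptions.

import torch
import torch.nn as nn

class LatentGraph(nn.Module):
    # Toy pairwise-dependency scorer: turns unit embeddings into a soft adjacency matrix.
    # The bilinear parameterization is an assumption for illustration, not the paper's method.
    def __init__(self, dim: int):
        super().__init__()
        self.bilinear = nn.Parameter(torch.randn(dim, dim) / dim ** 0.5)

    def forward(self, units: torch.Tensor) -> torch.Tensor:
        # units: (batch, n_units, dim), e.g. word or pixel embeddings
        scores = units @ self.bilinear @ units.transpose(-1, -2)  # (batch, n_units, n_units)
        return scores.softmax(dim=-1)  # row-normalized latent graph

A downstream model could then consume such a matrix alongside the unit features, for example as fixed attention weights.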

@cwhy
cwhy / icml2018.md
Created September 19, 2018 05:31
ICML 2018 Abstracts

Spline Filters For End-to-End Deep Learning

We propose to tackle the problem of end-to-end learning for raw waveform signals by introducing learnable continuous time-frequency atoms. The derivation of these filters is achieved by first defining a functional space with a given smoothness order and boundary conditions. From this space, we derive the parametric analytical filters. Their differentiability property allows gradient-based optimization. As such, one can equip any Deep Neural Network (DNN) with these filters. This enables us to tackle, in a front-end fashion, a large-scale bird detection task based on the freefield1010 dataset
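
The spline-based derivation itself cannot be reconstructed from this preview; as a rough sketch of the general pattern it describes (differentiable parametric filters applied directly to raw waveforms), one could parameterize cosine-modulated Gaussian atoms by center frequency and bandwidth and apply them with conv1d. All names, shapes, and initial values below are assumptions, not the paper's spline filters.

import math
import torch
import torch.nn as nn

class ParametricFilterBank(nn.Module):
    # Illustrative learnable audio front-end; stands in for the paper's analytical filters.
    def __init__(self, n_filters: int = 32, kernel_size: int = 401, sample_rate: int = 16000):
        super().__init__()
        self.kernel_size = kernel_size
        self.sample_rate = sample_rate
        # Learnable center frequencies and bandwidths, both in Hz.
        self.center_hz = nn.Parameter(torch.linspace(50.0, sample_rate / 2 - 50.0, n_filters))
        self.band_hz = nn.Parameter(torch.full((n_filters,), 100.0))

    def forward(self, wav: torch.Tensor) -> torch.Tensor:
        # wav: (batch, 1, time) raw waveform
        t = (torch.arange(self.kernel_size, device=wav.device) - self.kernel_size // 2) / self.sample_rate
        carrier = torch.cos(2 * math.pi * self.center_hz[:, None] * t)   # (n_filters, kernel_size)
        envelope = torch.exp(-(self.band_hz[:, None] * t) ** 2)          # Gaussian window
        kernels = (carrier * envelope).unsqueeze(1)                      # (n_filters, 1, kernel_size)
        return nn.functional.conv1d(wav, kernels, padding=self.kernel_size // 2)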

@cwhy
cwhy / Style.elm
Last active December 8, 2018 07:39
An abandoned render engine with poor man's flexbox

module Style exposing (..)

type CustomColor
    = CItem
    | CItemHidden
    | CUser
    | CUserHidden
    | Grey
    | White
@cwhy
cwhy / pytorch_twisted.py
Last active July 23, 2020 09:32
Pytorch tensor plays

import torch

def cum_softmax(t: torch.Tensor) -> torch.Tensor:
    # t shape: (..., sm_d); sm_d is the dim to reduce
    tmax = t.cummax(dim=-1)[0].unsqueeze(-2)
    denominator = (t.unsqueeze(-1) - tmax).exp()
    # shape: (..., sm_d, csm_d)
    numerator = denominator.sum(dim=-2, keepdim=True)
    return numerator / denominator
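
For reference, this is how the snippet broadcasts under assumed shapes (the batch of 2 and sm_d = 5 below are arbitrary):

x = torch.randn(2, 5)    # (..., sm_d)
out = cum_softmax(x)     # broadcasts to shape (2, 5, 5)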
@cwhy
cwhy / add.css
Last active September 21, 2020 06:18
Print Hacker News Comment

html {
  font-size: 16px;
  width: 100%;
}
table td, table th { overflow-wrap: anywhere; }
.title > a {
  color: black;
  font-size: 1.5rem;
  font-family: "charter", "Palatino Linotype", "Source Serif Pro", "Georgia", serif;
}

import csv
from collections import Counter

def get_table(m):
    # Invert the character-frequency counter over all strings in m,
    # mapping each total count back to its character (assumes counts are unique).
    invert_counter = {v: i for i, v in Counter(''.join(m)).items()}
    digit_table = {
        'e': invert_counter[4],
        'b': invert_counter[6],
        'f': invert_counter[9],
    }