tags:: [[Computer Science - Artificial Intelligence]], [[Computer Science - Computation and Language]], [[Computer Science - Machine Learning]], [[Electrical Engineering and Systems Science - Audio and Speech Processing]]
date:: [[Jun 14th, 2021]]
extra:: arXiv: 2106.07447
title:: HuBERT Self-Supervised Speech Representation Learning by Masked Prediction of Hidden Units
item-type:: [[journalArticle]]
access-date:: 2021-07-27T06:40:41Z
original-title:: HuBERT: Self-Supervised Speech Representation Learning by Masked Prediction of Hidden Units
url:: http://arxiv.org/abs/2106.07447
short-title:: HuBERT
publication-title:: "arXiv:2106.07447 [cs, eess]"
This code snippet shows how to change a layer in a pretrained model. In the following code, we replace all the ReLU activation functions with SELU in a resnet18 model.
import torch
import torch.nn as nn
from torchvision import models

resnet18 = models.resnet18(pretrained=True)

def funct(module):
    # recursively replace every nn.ReLU submodule with nn.SELU
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, nn.SELU(inplace=True))
        else:
            funct(child)

funct(resnet18)
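As a quick sanity check (a sketch that assumes the repaired `funct` above has been run), we can confirm that no ReLU modules remain:

```python
import torch.nn as nn

# after funct(resnet18), every former ReLU should now be a SELU
assert not any(isinstance(m, nn.ReLU) for m in resnet18.modules())
print(resnet18.relu)  # prints SELU(inplace=True) instead of ReLU(inplace=True)
```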
To change the remote URL of a repository, see this link. The basic commands are as follows:
git remote -v                                                            # show the current remote URLs
git remote set-url origin https://github.com/USERNAME/REPOSITORY.git      # point origin at the new URL
git remote -v                                                            # verify the change
How do we modify gradients in the backward pass? Use register_hook to save (or rewrite) the gradients of a Variable: link. Sometimes we want to get fancier with the gradients and have only one gradient computation performed for two heads, as in the diagram below (see the sketch after it):
  zz1
 /
yy
 \
  zz2
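A minimal sketch of both ideas: register_hook saves and rewrites the gradient of `yy`, and a single `backward()` call over the sum of the two heads `zz1` and `zz2` produces one combined gradient. The toy tensors and the scaling inside the hook are illustrative assumptions, not code from the linked post.

```python
import torch

yy = torch.randn(3, requires_grad=True)

saved = {}
def save_and_scale(grad):
    # called when the gradient w.r.t. yy is computed;
    # returning a tensor replaces that gradient (here it is scaled by 2)
    saved['yy'] = grad.detach().clone()
    return grad * 2

yy.register_hook(save_and_scale)

# two "heads" computed from the same variable
zz1 = (2 * yy).sum()
zz2 = (yy ** 2).sum()

# summing the heads means a single backward pass computes one gradient
# for yy that already combines both heads
(zz1 + zz2).backward()

print(saved['yy'])  # d(zz1 + zz2)/d(yy), as seen before the hook's scaling
print(yy.grad)      # the scaled gradient actually stored in yy.grad
```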
- `git fetch`: get remote commits to the local repo
- `git pull <repo> <branch-name>` (e.g. `git pull origin master`): get the commits from a remote branch into the current branch
- `git branch -a`: list all branches, including remote branches
- `git checkout <branch-name>`: switch to the branch
- `git branch -m <new-name>`: rename the current local branch. Check this for more information.
- `git checkout -b <new-branch>`: create a new branch based on the current branch
- `git checkout -b <new-branch> <existing-branch>`: create a new branch from an existing branch
- `git status`: list files tracked and not tracked
- `git add .` or `git add`: add files to the staging area