Devin Hua (tk1363704)

@danijar
danijar / blog_tensorflow_scope_decorator.py
Last active January 17, 2023 01:58
TensorFlow Scope Decorator
# Working example for my blog post at:
# https://danijar.github.io/structuring-your-tensorflow-models
import functools
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data
def doublewrap(function):
    """A decorator decorator, allowing the decorator to be used
    without parentheses if no arguments are provided."""
    @functools.wraps(function)
    def decorator(*args, **kwargs):
        # Called bare (@decorator): the single argument is the wrapped function.
        if len(args) == 1 and len(kwargs) == 0 and callable(args[0]):
            return function(args[0])
        return lambda wrapee: function(wrapee, *args, **kwargs)
    return decorator
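The blog post builds a caching scope decorator on top of doublewrap. A minimal sketch of that pattern, assuming TF1-style tf.variable_scope (the attribute name and caching scheme below are illustrative, not copied from the gist):

@doublewrap
def define_scope(function, scope=None, *args, **kwargs):
    attribute = '_cache_' + function.__name__
    name = scope or function.__name__
    @property
    @functools.wraps(function)
    def decorator(self):
        # Build the ops once, inside a named variable scope, and cache them.
        if not hasattr(self, attribute):
            with tf.variable_scope(name, *args, **kwargs):
                setattr(self, attribute, function(self))
        return getattr(self, attribute)
    return decorator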
@mommi84
mommi84 / awesome-kge.md
Last active April 14, 2025 11:27
Awesome Knowledge Graph Embedding Approaches
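For orientation, a minimal sketch of what one classic approach from such lists, TransE, computes (not from the gist; names and shapes are assumptions):

import torch

def transe_score(head, rel, tail, p=1):
    # TransE models a valid triple (h, r, t) as h + r ≈ t, so a smaller
    # translation distance means a more plausible triple.
    return torch.norm(head + rel - tail, p=p, dim=-1)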
@L0SG
L0SG / freeze_example.py
Last active October 12, 2023 05:02
PyTorch example: freezing a part of the net (including fine-tuning)
import torch
from torch import nn
from torch.autograd import Variable
import torch.nn.functional as F
import torch.optim as optim
# toy feed-forward net
# (the layers below are a minimal completion of the truncated preview)
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(10, 5)
        self.fc2 = nn.Linear(5, 1)

    def forward(self, x):
        return self.fc2(F.relu(self.fc1(x)))
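The preview stops before the actual freezing. A minimal sketch of the usual recipe (the fc1/fc2 split above is an assumption): disable gradients on the frozen part and hand the optimizer only the parameters that remain trainable.

net = Net()
# Freeze fc1: its parameters get no gradients and are never updated.
for param in net.fc1.parameters():
    param.requires_grad = False
# Fine-tune only what still requires gradients (here, fc2).
optimizer = optim.SGD(
    filter(lambda p: p.requires_grad, net.parameters()), lr=0.1)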
@HarshTrivedi
HarshTrivedi / pad_packed_demo.py
Last active April 17, 2025 01:26 — forked from Tushar-N/pad_packed_demo.py
Minimal tutorial on packing (pack_padded_sequence) and unpacking (pad_packed_sequence) sequences in PyTorch.
import torch
from torch import LongTensor
from torch.nn import Embedding, LSTM
from torch.autograd import Variable
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence
## We want to run LSTM on a batch of 3 character sequences ['long_str', 'tiny', 'medium']
#
# Step 1: Construct Vocabulary
# Step 2: Load indexed data (list of instances, where each instance is list of character indices)
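The preview cuts off mid-step. A condensed sketch of where those steps lead in current PyTorch (Variable is no longer needed, enforce_sorted=False replaces manual sorting, and the dimensions are arbitrary choices):

import torch
from torch.nn import Embedding, LSTM
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

seqs = ['long_str', 'tiny', 'medium']
vocab = ['<pad>'] + sorted(set(''.join(seqs)))           # Step 1: vocabulary, pad = index 0
indexed = [[vocab.index(ch) for ch in s] for s in seqs]  # Step 2: character indices

lengths = torch.tensor([len(s) for s in indexed])
padded = torch.zeros(len(seqs), int(lengths.max()), dtype=torch.long)
for i, seq in enumerate(indexed):                        # pad every instance to max length
    padded[i, :len(seq)] = torch.tensor(seq)

embed = Embedding(len(vocab), 4)                         # embedding_dim=4 is arbitrary
lstm = LSTM(input_size=4, hidden_size=5, batch_first=True)

packed = pack_padded_sequence(embed(padded), lengths,
                              batch_first=True, enforce_sorted=False)
packed_out, (h_n, c_n) = lstm(packed)
unpacked, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
# unpacked has shape (3, max_len, 5); positions past each length are zeros again.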