Vinay Kumar (imflash217)

महाजनो येन गतः स पन्थाः ॥ ("The path trodden by the great is the path to follow.")
@imflash217
imflash217 / meenet3.py
Last active April 6, 2021 19:21
[AutoEncoder NN] #encoder #decoder #meenet
import torch

# Example input: a random tensor of shape (batch=1, channels=2, samples=1025)
input = torch.randn(1, 2, 1025)

##### ENCODER
# layer-1: four parallel Conv1d branches with increasing kernel sizes
downsample_1a = torch.nn.Conv1d(2, 20, kernel_size=5, stride=1, padding=0)
downsample_1b = torch.nn.Conv1d(2, 20, kernel_size=50, stride=1, padding=0)
downsample_1c = torch.nn.Conv1d(2, 20, kernel_size=256, stride=1, padding=0)
downsample_1d = torch.nn.Conv1d(2, 20, kernel_size=512, stride=1, padding=0)
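A quick shape check for the four branches (a minimal sketch assuming the `input` tensor and layers above; with stride 1 and no padding, each branch outputs length 1025 - kernel_size + 1):

# Apply each encoder branch and inspect output shapes (illustrative check, not part of the gist)
for name, layer in [("1a", downsample_1a), ("1b", downsample_1b),
                    ("1c", downsample_1c), ("1d", downsample_1d)]:
    out = layer(input)
    print(name, tuple(out.shape))
# Expected: 1a (1, 20, 1021), 1b (1, 20, 976), 1c (1, 20, 770), 1d (1, 20, 514)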
@imflash217
imflash217 / Install NVIDIA Driver and CUDA.md
Created October 12, 2018 06:42 — forked from wangruohui/Install NVIDIA Driver and CUDA.md
Install NVIDIA Driver and CUDA on Ubuntu / CentOS / Fedora Linux OS
@imflash217
imflash217 / GitHub-Forking.md
Created January 3, 2021 23:49 — forked from Chaser324/GitHub-Forking.md
GitHub Standard Fork & Pull Request Workflow

Whether you're trying to give back to the open source community or collaborating on your own projects, knowing how to properly fork and generate pull requests is essential. Unfortunately, it's quite easy to make mistakes or not know what you should do when you're initially learning the process. I know that I certainly had considerable initial trouble with it, and I found a lot of the information on GitHub and around the internet to be rather piecemeal and incomplete - part of the process described here, another there, common hangups in a different place, and so on.

In an attempt to collate this information for myself and others, this short tutorial covers what I've found to be fairly standard procedure for creating a fork, doing your work, issuing a pull request, and merging that pull request back into the original project.

Creating a Fork

Just head over to the GitHub page and click the "Fork" button. It's just that simple. Once you've done that, you can use your favorite git client to clone your repo, or just head straight to the command line.

@imflash217
imflash217 / FlashModel.py
Last active January 13, 2021 20:51
PyTorch Lightning Model
import torch as pt
import pytorch_lightning as pl
#######################################################################
class FlashModel(pl.LightningModule):
    """This defines a MODEL"""
    def __init__(self, num_layers: int = 3):
        super().__init__()
        # nn.Linear requires (in_features, out_features); the sizes below are placeholders
        self.layer1 = pt.nn.Linear(128, 64)
        self.layer2 = pt.nn.Linear(64, 10)
### DATALOADERS ##################################################################
# When building DataLoaders, set `num_workers > 0` and `pin_memory=True` (see the sketch below)
DataLoader(dataset, num_workers=8, pin_memory=True)
### num_workers ##################################################################
# num_workers depends on the batch size and the machine
# A general place to start is to set num_workers = number of CPUs in the machine.
# Increasing num_workers also increases CPU usage.
# BEST TIP: Increase num_workers slowly and stop when there is no performance increase.
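# --- Sketch (assumption, not from the original gist): a self-contained version of the
# --- DataLoader tip above, with a stand-in TensorDataset so there is something to load.
import torch
from torch.utils.data import DataLoader, TensorDataset
toy_dataset = TensorDataset(torch.randn(1000, 32), torch.randint(0, 10, (1000,)))
# num_workers > 0 loads batches in worker processes; pin_memory=True speeds up host-to-GPU copies
toy_loader = DataLoader(toy_dataset, batch_size=64, num_workers=8, pin_memory=True)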
# A LightningModule ORGANIZES the PyTorch code into the following modules:
# 1. Computations (init)
# 2. Training loop (training_step)
# 3. Validation loop (validation_step)
# 4. Test loop (test_step)
# 5. Optimizers (configure_optimizers)
##############################################################################
model = FlashModel()
trainer = pl.Trainer()   # see the sketch below for a complete minimal example
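# --- Sketch (assumption, not from the original gist): a minimal LightningModule that
# --- fills in the five sections listed above. The toy Linear classifier, TensorDataset,
# --- and hyperparameters are placeholders.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class FlashModelSketch(pl.LightningModule):
    """Toy classifier illustrating the five LightningModule sections."""
    def __init__(self):                                      # 1. computations
        super().__init__()
        self.net = torch.nn.Linear(32, 10)

    def training_step(self, batch, batch_idx):               # 2. training loop
        x, y = batch
        return F.cross_entropy(self.net(x), y)

    def validation_step(self, batch, batch_idx):             # 3. validation loop
        x, y = batch
        self.log("val_loss", F.cross_entropy(self.net(x), y))

    def test_step(self, batch, batch_idx):                   # 4. test loop
        x, y = batch
        self.log("test_loss", F.cross_entropy(self.net(x), y))

    def configure_optimizers(self):                          # 5. optimizers
        return torch.optim.Adam(self.parameters(), lr=1e-3)

toy_data = TensorDataset(torch.randn(1000, 32), torch.randint(0, 10, (1000,)))
toy_loader = DataLoader(toy_data, batch_size=64, num_workers=8, pin_memory=True)
sketch_model = FlashModelSketch()
pl.Trainer(max_epochs=1).fit(sketch_model, toy_loader)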
import os
import torch
import torch.nn.functional as F
from torchvision import datasets, transforms
from torch.utils.data import DataLoader
import pytorch_lightning as pl
###########################################################################################
## PyTorch Lightning version: wrap an existing torch model in a LightningModule
##
import torch
import torch.nn.functional as F
import pytorch_lightning as pl
###########################################################################################
class FlashModel(pl.LightningModule):
    """DOCSTRING"""
    def __init__(self, model):
        super().__init__()
        self.model = model   # keep a reference to the wrapped torch model
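A possible usage of this wrapper pattern (a sketch under assumptions; the Sequential backbone below is a placeholder, and training_step/configure_optimizers would still need to be added to FlashModel before calling pl.Trainer().fit):

# Wrap a plain torch model in the LightningModule above
backbone = torch.nn.Sequential(
    torch.nn.Linear(32, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
)
lit_model = FlashModel(backbone)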