vuiseng9 / nncf_wrap_bert.py (last active April 13, 2023)
tranformer_block_tracing via NNCF
import functools
from typing import Dict, Callable, Any, Union, List, Tuple
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torch.utils.data import Dataset
from nncf.torch.nncf_network import NNCFNetwork
from nncf.torch.dynamic_graph.graph_tracer import create_input_infos, create_dummy_forward_fn
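The imports above are only the preamble of the gist. Below is a minimal sketch of how a BERT model can be traced and wrapped into an NNCFNetwork; it uses the higher-level create_compressed_model entry point rather than constructing NNCFNetwork directly, and the checkpoint name, sequence length and input_info values are illustrative assumptions, not taken from the gist.

from transformers import AutoModelForSequenceClassification
from nncf import NNCFConfig
from nncf.torch import create_compressed_model

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")  # assumed checkpoint

# input_info tells NNCF how to synthesize dummy inputs for graph tracing
nncf_config = NNCFConfig.from_dict({
    "input_info": [
        {"sample_size": [1, 128], "type": "long", "keyword": "input_ids", "filler": "ones"},
        {"sample_size": [1, 128], "type": "long", "keyword": "attention_mask", "filler": "ones"},
    ],
    "compression": {"algorithm": "quantization"},
})

# Traces the model with dummy inputs and returns the wrapped NNCFNetwork
compression_ctrl, nncf_model = create_compressed_model(model, nncf_config)
print(type(nncf_model))  # nncf.torch.nncf_network.NNCFNetwork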
vuiseng9 / timm.ipynb (created March 4, 2022, forked from Chris-hughes10/timm.ipynb)

Goal:

Get examples of Intel Neural Compressor (INC) up and running with an existing trained model. We will use HuggingFace's Optimum as the frontend, with INC as its backend. We aim to reproduce the static quantization example provided by Optimum out of the box.

  1. Create a conda environment
conda create -n optimum-inc python=3.8
  2. Set up Intel Neural Compressor per its landing page, but slightly differently for a dev install (see the sketch below).
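One plausible dev-style setup for the step above; the source (editable) install of INC and the optimum[neural-compressor] extra are assumptions about the intended workflow, not commands from the original notes.

conda activate optimum-inc

# editable (dev) install of Intel Neural Compressor from source
git clone https://github.com/intel/neural-compressor.git
cd neural-compressor
pip install -r requirements.txt
pip install -e .

# Optimum frontend with the INC backend
pip install "optimum[neural-compressor]"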

Remove files in the wandb cloud programmatically via the Python API

import wandb

api = wandb.Api()
runs = api.runs("<entity>/<project>")  # e.g. vchua/huggingface

for run in runs:
    print(run.entity, run.project, run.id, run.name)
    for fff in run.files():
        # delete only the targeted file(s), e.g. each run's output.log
        if fff.name == 'output.log':
            fff.delete()
docker pull openvino/ubuntu18_dev:2021.4.2

all_onnx=$(ls /data1/vchua/tld-poc/repo/*/*.onnx)

for onnx in $all_onnx;
do
    base=$(basename $onnx .onnx)
    # (loop body truncated in the original snippet; see the sketch below)
done
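A sketch of what the truncated loop likely does: convert each ONNX model to OpenVINO IR with the Model Optimizer. The mo.py path assumes the usual install location inside the openvino/ubuntu18_dev container, and the output directory is a placeholder; neither is from the original snippet.

for onnx in $all_onnx;
do
    base=$(basename $onnx .onnx)
    python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
        --input_model $onnx \
        --model_name $base \
        --output_dir ./ir/$base   # placeholder output location
done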
export CUDA_VISIBLE_DEVICES=0
cd transformers/examples/pytorch/translation/
python run_translation.py  \
    --model_name_or_path Helsinki-NLP/opus-mt-de-en  \
    --dataset_name wmt16  \
    --dataset_config_name de-en  \
    --source_lang de  \
    --target_lang en  \
    --run_name opus-mt-de-en-test \
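The command above is cut off in the original snippet. Below is a hedged reconstruction of a full evaluation-style invocation; everything after --run_name (the eval/predict flags, batch size, and output_dir) is an assumption, not from the gist.

export CUDA_VISIBLE_DEVICES=0
cd transformers/examples/pytorch/translation/
python run_translation.py \
    --model_name_or_path Helsinki-NLP/opus-mt-de-en \
    --dataset_name wmt16 \
    --dataset_config_name de-en \
    --source_lang de \
    --target_lang en \
    --run_name opus-mt-de-en-test \
    --do_eval \
    --predict_with_generate \
    --per_device_eval_batch_size 16 \
    --output_dir /tmp/opus-mt-de-en-eval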
# Ensure git-lfs is installed.

REPO_ID=<label>

# create a repo under your account on the model hub
huggingface-cli repo create $REPO_ID

# clone the newly-created repo (the exact git clone command is printed by the step above)
git clone https://huggingface.co/<your-username>/$REPO_ID
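A sketch of the usual next step after cloning: copy the trained model files in and push. The local path is a placeholder; the .gitattributes created with the repo should already track large binaries via git-lfs.

cd $REPO_ID
cp /path/to/trained-model/* .   # placeholder: pytorch_model.bin, config.json, tokenizer files
git add .
git commit -m "add model"
git push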

Pytest in vscode

settings.json

{
    "python.pythonPath": "/home/vchua/miniconda3/envs/AutoQPrecInit/bin/python",
    "python.testing.unittestEnabled": false,
    "python.testing.nosetestsEnabled": false,
    "python.testing.pytestEnabled": true,
    "python.testing.autoTestDiscoverOnSaveEnabled": false,
    "python.testing.pytestArgs": [
        // placeholder value; the original snippet is truncated here (VS Code settings.json accepts comments)
        "tests"
    ]
}