ravi9 / build_install_TF_MKL_instructions.md
Last active December 30, 2023 23:48
Build and install TensorFlow with MKL from sources on CentOS 7.x

Build and install instructions for TensorFlow with MKL-DNN on a clean CentOS 7.x machine.

Docker setup if needed

Docker install instructions: https://github.com/ravi9/misc-readmes/blob/master/install-docker-centos.md

docker pull centos/devtoolset-6-toolchain-centos7  

# Log in to the container as USER 0 (with sudo permissions).
docker run -it --user 0 centos/devtoolset-6-toolchain-centos7:latest /bin/bash
ravi9 / intel_tf_cnn_benchmarks.sh
Last active July 8, 2019 14:22 — forked from MattsonThieme/intel_tf_cnn_benchmarks.sh
Intel TensorFlow CNN Benchmarking Script
#!/bin/bash
# Run TensorFlow's tf_cnn benchmarks in tensorflow_p36 virtualenv
# Activate the TensorFlow virtual environment
source activate tensorflow_p36
# Clone benchmark scripts
git clone https://github.com/tensorflow/benchmarks.git
cd benchmarks/scripts/tf_cnn_benchmarks/
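The forked script ultimately invokes `tf_cnn_benchmarks.py` with `--num_intra_threads` and `--num_inter_threads` values tuned for MKL; a minimal sketch of how those values are usually derived (the helper name and the fixed inter-op value of 2 are my assumptions, not part of the benchmarks repo):

```python
import multiprocessing

def mkl_thread_settings(physical_cores=None):
    """Suggest intra/inter-op thread counts for an MKL TensorFlow build."""
    if physical_cores is None:
        # cpu_count() reports logical CPUs; halve it assuming hyper-threading.
        physical_cores = max(1, multiprocessing.cpu_count() // 2)
    # Common guidance: intra-op = physical core count, inter-op = socket count
    # (assumed 2 here, typical for the dual-socket Xeon hosts these runs target).
    return {"num_intra_threads": physical_cores, "num_inter_threads": 2}

print(mkl_thread_settings(physical_cores=16))
# {'num_intra_threads': 16, 'num_inter_threads': 2}
```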
ravi9 / tfcnn-inception3-benchmark-bs1.sh
Last active June 1, 2018 05:42
Sample Inference benchmark on Amazon DL AMI with inception3.
# Example inference benchmark on an Amazon DL AMI.
# Launch a DL AMI.
# Create a virtual environment.
conda create -n tf1.8_mkl-master python=3.6
source activate tf1.8_mkl-master
#Download the tf1.8 wheel built with MKLDNN master branch on 05-31-2018
wget https://www.dropbox.com/s/y812q7zpdy4pjed/tensorflow-1.8.0-cp36-cp36m-linux_x86_64.whl
# pip install the wheel
pip install tensorflow-1.8.0-cp36-cp36m-linux_x86_64.whl
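After installing the wheel, a common sanity check (specific to TF 1.x MKL builds; run inside the activated environment, so not reproducible outside it) is to ask TensorFlow whether its MKL kernels are compiled in:

```shell
# Prints True if the wheel was built with MKL-DNN support (TF 1.x only).
python -c "from tensorflow.python import pywrap_tensorflow; print(pywrap_tensorflow.IsMklEnabled())"
```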
ravi9 / build_tf1.8_mkldnn_master.sh
Created June 1, 2018 06:18
Build TF 1.8 wheel with MKL-DNN master branch on an AMZ DL AMI (Ubuntu)
# Launch an AMZ DL AMI (Ubuntu).
# Install Bazel with --user; it will be installed under /home/ubuntu/.bazel
wget https://github.com/bazelbuild/bazel/releases/download/0.13.1/bazel-0.13.1-installer-linux-x86_64.sh
chmod +x bazel-0.13.1-installer-linux-x86_64.sh
./bazel-0.13.1-installer-linux-x86_64.sh --user
source /home/ubuntu/.bazel/bin/bazel-complete.bash
# Create and activate a virtual environment.
conda create -n tf1.8_mkl-master_build python=3.6
source activate tf1.8_mkl-master_build
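The preview cuts off before the build itself; on TF 1.8 the remaining steps typically look like the sketch below (the branch, `--config=mkl`, and target path are the standard TF 1.8 ones, but treat the exact flags as assumptions to verify against that era's build docs):

```shell
# Clone TensorFlow at the r1.8 branch and build a pip package with MKL support.
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
git checkout r1.8
./configure   # interactive; defaults are fine for a CPU-only build
bazel build --config=mkl --config=opt //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
pip install /tmp/tensorflow_pkg/tensorflow-*.whl
```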
ravi9 / install_haloagent_aws.sh
Last active July 30, 2018 21:42
Install Halo agent on AWS (Linux)
#https://github.intel.com/CloudPassage/AWS-Halo-Deployment-Script
wget https://www.dropbox.com/s/imh6zsh8nrpubzd/AWS-Halo-Deployment-Script-master.zip
unzip AWS-Halo-Deployment-Script-master.zip
cd AWS-Halo-Deployment-Script-master/
sudo python HaloAgent.py
ravi9 / segyio-play.py
Last active September 16, 2020 16:34
segyio-playground
import segyio

# Load the full post-stack volume into a numpy cube (inlines, crosslines, samples)
filename = 'f3.segy'
src_data = segyio.tools.cube(filename)

# Centre indices along each axis and the half-width of the extraction window
k1, k2, k3 = 300, 500, 400
factor = 125
il_s = k1 - factor  # first inline of the window
il_e = k1 + factor  # last inline of the window
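The preview stops before the window is used; with a small synthetic cube standing in for `f3.segy` (the random data and its shape are my stand-ins; `segyio.tools.cube` returns axes ordered (inlines, crosslines, samples)), the inline window those variables set up slices out like this:

```python
import numpy as np

# Stand-in for segyio.tools.cube('f3.segy'): axes are
# (inlines, crosslines, samples), matching segyio's cube layout.
src_data = np.random.rand(600, 64, 32).astype(np.float32)

k1 = 300      # centre inline index
factor = 125  # window half-width in inlines
il_s = k1 - factor
il_e = k1 + factor

# 250-inline-wide sub-volume centred on inline k1.
window = src_data[il_s:il_e, :, :]
print(window.shape)  # (250, 64, 32)
```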
"""
Copyright (c) 2022 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
"""
Copyright (c) 2022 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
"""
Copyright (c) 2022 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
ravi9 / tf_freeze_graph.py
Last active September 27, 2022 19:10
See Medium blog: Accelerate Big Transfer (BiT) model inference with Intel® OpenVINO™
"""
Copyright (c) 2022 Intel Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.