Gaurav Kaila (gauravkaila)
| parameter | data_type | explanation |
| --- | --- | --- |
| docker_image_version | int | Version of the Docker image created in the previous step (can be found via the VSC IDE) |
| cpu_cores | int | Number of CPU cores for the Docker container |
| memory | int | Memory for the Docker container |
| service_name | string | Name of the deployment service |
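Put together, a deployment configuration built from these parameters might look like the sketch below. The dictionary layout, variable name, and concrete values are illustrative assumptions, not the script's actual API:

```python
# Hypothetical deployment configuration mirroring the table above;
# names follow the table, values are placeholders.
deploy_config = {
    "docker_image_version": 3,          # int: image version from the previous step
    "cpu_cores": 2,                     # int: CPU cores for the container
    "memory": 4,                        # int: memory for the container
    "service_name": "obj-det-service",  # string: deployment service name
}

# Quick type check against the table's data_type column.
expected_types = {"docker_image_version": int, "cpu_cores": int,
                  "memory": int, "service_name": str}
assert all(isinstance(deploy_config[k], t) for k, t in expected_types.items())
```
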
| parameter | data_type | explanation |
| --- | --- | --- |
| model_version | int | Version of the trained model (can be located via the VSC IDE) |
| pip_packages | list | List of pip packages to be installed |
| conda_packages | list | List of conda packages to be installed |
| conda_env_file | string | Name of the conda environment file |
| path_scoring_script | string | Path to the prediction script [score.py] |
| docker_image_name | string | Name of the Docker image |
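For the image-creation step, the same idea applies; here is a hedged sketch of what a configuration could look like, with all package lists and names chosen purely for illustration:

```python
# Hypothetical image-creation configuration for the table above;
# package names and values are assumptions, not a fixed API.
image_config = {
    "model_version": 2,                       # int
    "pip_packages": ["tensorflow", "pillow"], # list of pip packages
    "conda_packages": ["numpy"],              # list of conda packages
    "conda_env_file": "env.yml",              # string
    "path_scoring_script": "score.py",        # string
    "docker_image_name": "obj-det-image",     # string
}

# The two package fields must be lists, per the data_type column.
assert isinstance(image_config["pip_packages"], list)
assert isinstance(image_config["conda_packages"], list)
```
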
| parameter | data_type | explanation |
| --- | --- | --- |
| experiment_name | string | Name of the experiment to place the current training run under (a new experiment is created if none with the same name is found) |
| compute_target_name | string | Name of the VM where training will be done (a new compute target is created if none with the same name is found) |
| vm_size | string | Choice among the available VM configurations (e.g. Standard_D5_v2) |
| data_folder | string | Name of the folder in Azure storage where your training data is stored |
| local_directory | string | Path to the training data's local directory |
| conda_packages | list | List of conda packages required to train your model |
| script | string | Name of the training script (placed in the current working directory) |
| model_name | string | Name the trained model is registered under in the workspace |
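A training-run configuration assembled from these parameters might look like the following sketch; every value here is a placeholder assumption:

```python
# Illustrative training configuration; parameter names follow the table,
# concrete values are assumptions.
train_config = dict(
    experiment_name="obj-det-experiment",
    compute_target_name="gpu-cluster",
    vm_size="Standard_D5_v2",
    data_folder="training-data",
    local_directory="./data",
    conda_packages=["tensorflow", "numpy"],
    script="train.py",
    model_name="obj_det_model",
)

# Every parameter from the table should be present.
required = {"experiment_name", "compute_target_name", "vm_size", "data_folder",
            "local_directory", "conda_packages", "script", "model_name"}
assert required <= set(train_config)
```
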
| parameter | data_type | Azure name |
| --- | --- | --- |
| subscription_id | string | Azure subscription id |
| resource_group | string | Azure resource group |
| workspace_name | string | Azure machine learning workspace |
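These three identifiers together pin down a single Azure ML workspace. A minimal sketch of collecting and sanity-checking them (the values below are placeholders, not real Azure identifiers):

```python
# Placeholder workspace identifiers; replace with values from the Azure portal.
workspace_config = {
    "subscription_id": "00000000-0000-0000-0000-000000000000",
    "resource_group": "my-resource-group",
    "workspace_name": "my-aml-workspace",
}

# All three are required to resolve a workspace; flag any that are missing.
missing = [k for k in ("subscription_id", "resource_group", "workspace_name")
           if not workspace_config.get(k)]
assert not missing
```
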
from grpc.beta import implementations
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2

# Create stub (FLAGS.server is expected to hold 'host:port')
host, port = FLAGS.server.split(':')
channel = implementations.insecure_channel(host, int(port))
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)
# Create prediction request object
request = predict_pb2.PredictRequest()
# Specify model name (must match the name used when TensorFlow Serving was started)
request.model_spec.name = 'obj_det'
gauravkaila / install_docker_ubuntu_16.04.sh
Last active May 20, 2023 05:47
Install Docker and nvidia-docker on Ubuntu-16.04
#!/bin/bash
# add the GPG key for the official Docker repository to the system
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
# add the Docker repository to APT sources
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
# update the package database with the Docker packages from the newly added repo
sudo apt-get update
# install Docker CE
sudo apt-get install -y docker-ce
def make_tensor_proto(values, dtype=None, shape=None, verify_shape=False):
"""Create a TensorProto.
Args:
values: Values to put in the TensorProto.
dtype: Optional tensor_pb2 DataType value.
shape: List of integers representing the dimensions of tensor.
verify_shape: Boolean that enables verification of a shape of values.
Returns:
A TensorProto. Depending on the type, it may contain data in the
"tensor_content" attribute, which is not directly useful to Python programs.
"""
# def _write_frozen_graph(frozen_graph_path, frozen_graph_def):
# """Writes frozen graph to disk.
#
# Args:
# frozen_graph_path: Path to write inference graph.
# frozen_graph_def: tf.GraphDef holding frozen graph.
# """
# with gfile.GFile(frozen_graph_path, 'wb') as f:
# f.write(frozen_graph_def.SerializeToString())
# logging.info('%d ops in the final graph.', len(frozen_graph_def.node))
gauravkaila / exporter.py
Created December 5, 2017 18:46
Code to comment out in the exporter.py script
# Copyright 2017 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,