@csiebler
csiebler / bing_search_example.py
Created October 20, 2021 08:40
Bing Search v7 example
import requests, json
key = "xxxxx" # Paste your API key here
url = "https://api.bing.microsoft.com/v7.0/search"
search_term = "Azure Cognitive Services"
headers = {"Ocp-Apim-Subscription-Key" : key}
params = {"q": search_term, "textDecorations": True, "textFormat": "HTML"}
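# The preview stops before the actual call; a minimal completion, assuming the
# standard Bing Web Search v7 request/response handling:
response = requests.get(url, headers=headers, params=params)
response.raise_for_status()
results = response.json()
print(json.dumps(results, indent=2))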
@csiebler
csiebler / enforce_init_script.json
Created July 19, 2021 10:41
Enforce an init script for Azure Machine Learning Compute Instance via Azure Policy
{
"mode": "All",
"policyRule": {
"if": {
"allOf": [
{
"field": "type",
"equals": "Microsoft.MachineLearningServices/workspaces/computes"
},
{
@csiebler
csiebler / package_model.py
Created May 3, 2021 13:55
Package existing model as container in Azure Machine Learning
from azureml.core import Workspace, Model
from azureml.core.model import InferenceConfig
from azureml.core.environment import Environment
from azureml.core.conda_dependencies import CondaDependencies
ws = Workspace.from_config()
env = Environment("inference-env")
env.docker.enabled = True
# Replace with your conda environment file
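# The preview ends here; a minimal sketch of the remaining steps. The file names
# "environment.yml" and "score.py" and the model name "my-model" are placeholders.
env.python.conda_dependencies = CondaDependencies(conda_dependencies_file_path="environment.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)
model = Model(ws, name="my-model")
package = Model.package(ws, [model], inference_config)
package.wait_for_creation(show_output=True)
print(package.location)  # location of the packaged image in the workspace ACR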
@csiebler
csiebler / conda.yml
Created February 11, 2021 11:54
Conda Environment examples
name: conda-env
dependencies:
- pip
- pip:
- --index-url https://xxxxx
- --extra-index-url https://xxxxxx
- xxxxxx==x.x.x
- -e git+https://github.com/xxxxxxxxxxx
- -e ./xxxxxxx
@csiebler
csiebler / private_repo.md
Last active February 9, 2022 13:48
Adding private Azure DevOps Artifact feeds to Azure Machine Learning

Steps:

  • Create private Feed in Azure DevOps
  • Create Personal Access Token (PAT) in Azure DevOps with Feed Read permission (details)
  • Navigate to the Azure DevOps Artifacts feed page, where you can see the details needed for the next steps (you'll need the feed name, project name, organization name, and later also the package name): feed terminology
  • Create Build pipeline in Azure DevOps to create package and push to private feed:
trigger:
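# The preview stops at the trigger; a hedged sketch of such a pipeline, assuming the
# package is built with build/twine and pushed to a feed "my-feed" in project
# "my-project" (both placeholders). TwineAuthenticate writes credentials to $(PYPIRC_PATH).
- main

pool:
  vmImage: ubuntu-latest

steps:
- task: TwineAuthenticate@1
  inputs:
    artifactFeed: my-project/my-feed
- script: |
    pip install build twine
    python -m build
    twine upload -r my-feed --config-file $(PYPIRC_PATH) dist/*
  displayName: Build and publish package to the private feed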
@csiebler
csiebler / hyperparameters.py
Created October 14, 2020 15:08
Get metrics and hyperparameters for each run in HyperDriveStepRun in Azure Machine Learning
from azureml.pipeline.steps import HyperDriveStepRun
# Get the HyperDriveStep of the pipeline by name (make sure only 1 exists); pipeline_run is an existing PipelineRun
hd_step_run = HyperDriveStepRun(step_run=pipeline_run.find_step_run('hd_step01')[0])
# Get RunID for best run (we're lazy)
best_run_id = hd_step_run.get_best_run_by_primary_metric().id
# Get all hyperparameters that were tried
hyperparameters = hd_step_run.get_hyperparameters()
# Get all metrics for the runs
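metrics = hd_step_run.get_metrics()
# A hedged usage sketch: both dicts are keyed by child run id, so hyperparameters
# and metrics can be combined per trial.
for run_id, params in hyperparameters.items():
    print(run_id, params, metrics.get(run_id))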
@csiebler
csiebler / mount_dataset.py
Last active January 23, 2024 11:03
Mount Dataset to Azure Machine Learning Compute Instance
import os
import pandas as pd
from azureml.core import Workspace, Dataset
# Connect to Workspace and reference Dataset
ws = Workspace.from_config()
dataset = ws.datasets["german-credit-train-tutorial"]
# Create mountcontext and mount the dataset
mount_ctx = dataset.mount()
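# The preview ends here; a short continuation showing how the mount is typically
# used (the CSV file name is a placeholder).
mount_ctx.start()
print(os.listdir(mount_ctx.mount_point))
df = pd.read_csv(os.path.join(mount_ctx.mount_point, "german_credit.csv"))
mount_ctx.stop()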
@csiebler
csiebler / example.md
Created May 15, 2020 11:39
Get run_id of training run in Azure Machine Learning Pipeline Step

In train.py:

run.tag('run_type', value='training')

In later step:

from azureml.core import Run

# Retrieve associated run, workspace and experiment
run = Run.get_context()
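# A hedged sketch of looking up the tagged training run from the current step
# (navigating via the parent pipeline run is an assumption):
pipeline_run = run.parent
training_run = next(r for r in pipeline_run.get_children()
                    if r.get_tags().get('run_type') == 'training')
print(training_run.id)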
@csiebler
csiebler / predict.py
Created May 5, 2020 09:39
Prediction script example
import json
import os
import numpy as np
import pandas as pd
import joblib
# Your imports go here
# Update to your model's filename
model_filename = "model.pkl"
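# The preview ends here; a minimal sketch of the usual scoring entry points, assuming
# the model was saved with joblib and the request body looks like {"data": [...]}
# (both assumptions):
def init():
    global model
    # AZUREML_MODEL_DIR points to the folder the registered model is copied into
    model_path = os.path.join(os.getenv("AZUREML_MODEL_DIR"), model_filename)
    model = joblib.load(model_path)

def run(raw_data):
    try:
        data = pd.DataFrame(json.loads(raw_data)["data"])
        predictions = model.predict(data)
        return predictions.tolist()
    except Exception as e:
        return {"error": str(e)}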