
R&D Azure Script rndazurescript

@rndazurescript
rndazurescript / DeployADFv2ArmTemplate.ps1
Last active January 22, 2022 19:35
Deploy ADFv2 exported ARM templates
# Based on
# https://docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
# https://azure.microsoft.com/mediahandler/files/resourcefiles/whitepaper-adf-on-azuredevops/Azure%20data%20Factory-Whitepaper-DevOps.pdf
#
# WARNING: Deleting the pipelines also deletes their execution logs. If you need the logs, consider exporting them to Azure Monitor
# as described in https://docs.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor. Also keep in mind that
# ADF stores pipeline-run data for only 45 days, so exporting to Azure Monitor lets you preserve the data for longer.
#
# Prerequisites
# This code requires PowerShell 7+ in order to use ForEach-Object -Parallel.
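The parallel fan-out that ForEach-Object -Parallel gives the script can be sketched in Python with a thread pool; here delete_pipeline is a hypothetical stand-in for the real ARM/ADF REST call, shown only to illustrate the pattern:

```python
# Sketch: run independent delete operations concurrently, analogous to
# PowerShell 7's ForEach-Object -Parallel. delete_pipeline is a
# placeholder for the actual Data Factory call.
from concurrent.futures import ThreadPoolExecutor

def delete_pipeline(name: str) -> str:
    # Stand-in for a REST call that removes one ADF pipeline.
    return f"deleted {name}"

def delete_all(names, max_workers=8):
    # map() preserves input order even though work runs in parallel.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(delete_pipeline, names))

results = delete_all(["copy-blobs", "transform", "load-dw"])
```

As with the PowerShell version, keep the worker count modest so the ARM API is not throttled.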
rndazurescript / DataBricksGetIpCell.png
Last active March 7, 2020 21:03
Get DataBricks IP
DataBricksGetIpCell.png
rndazurescript / Enable-LongPathSupport.ps1
Created March 13, 2020 10:18
Enable long paths on windows
$registryPath = "HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem"
$Name = "LongPathsEnabled"
$expectedValue = "1"
$currentValue = (Get-ItemProperty -Path $registryPath -Name $Name).$Name
if ($currentValue -ne $expectedValue){
    Write-Host "Enabling long path support"
    New-ItemProperty -Path $registryPath -Name $Name -Value $expectedValue -PropertyType DWORD -Force | Out-Null
}
rndazurescript / VersionDataBricks.bat
Created April 2, 2020 19:34
Versioning databricks workspace files
@echo off
REM Create a conda environment
conda create -n databricks pip
REM Activate environment
conda activate databricks
REM Ensure pip is located inside that environment
pip --version
REM Should have envs\databricks in the output like ...conda\envs\databricks\lib\site-packages\pip
rndazurescript / storage.py
Last active May 2, 2020 17:02
Play with Azure Storage
from azure.storage.common.cloudstorageaccount import CloudStorageAccount
from azure.storage.common.models import AccessPolicy
from azure.storage.blob import BlockBlobService, PageBlobService, AppendBlobService
from azure.storage.models import CorsRule, Logging, Metrics, RetentionPolicy, ResourceTypes, AccountPermissions
from azure.storage.blob.models import BlobBlock, ContainerPermissions, ContentSettings
from datetime import datetime, timedelta
import time
import json
settings = {}
rndazurescript / ResetContainerLeaseState.py
Created May 2, 2020 12:48
Reset Lease state for an azure blob storage container that is in broken state
from azure.storage.common.cloudstorageaccount import CloudStorageAccount
# Retrieve the storage account and the storage key
import json
settings = {}
with open('./settings.json') as f:
    settings = json.load(f)
account_name = settings["STORAGE_ACCOUNT_NAME"]
account_key = settings["STORAGE_ACCOUNT_KEY"]
rndazurescript / CreateIso.ps1
Created June 4, 2020 01:06
Create ISO from powershell
function New-IsoFile
{
<#
.Synopsis
 Creates a new .iso file
.Description
 The New-IsoFile cmdlet creates a new .iso file containing content from chosen folders.
.Example
 New-IsoFile "c:\tools","c:\Downloads\utils"
 This command creates a .iso file in the $env:temp folder (default location) that contains the c:\tools and c:\downloads\utils folders. The folders themselves are included at the root of the .iso image.
.Example
 New-IsoFile -FromClipboard -Verbose
 Before running this command, select and copy (Ctrl-C) files/folders in Explorer first.
.Example
 dir c:\WinPE | New-IsoFile -Path c:\temp\WinPE.iso -BootFile "${env:ProgramFiles(x86)}\Windows Kits\10\Assessment and Deployment Kit\Deployment Tools\amd64\Oscdimg\efisys.bin" -Media DVDPLUSR -Title "WinPE"
 This command creates a bootable .iso file containing the content from the c:\WinPE folder, but the folder itself isn't included. The boot file etfsboot.com can be found in the Windows ADK. Refer to the IMAPI_MEDIA_PHYSICAL_TYPE enumeration for possible media types: http://msdn.micr
rndazurescript / OldAzureBobService.ts
Created July 14, 2020 12:13
Old blob uploader in typescript
module UpZure.Blob {
// An uploader client
// based on https://msdn.microsoft.com/en-us/library/azure/mt427365.aspx
// Limitations taken from https://msdn.microsoft.com/en-us/library/azure/dd135726.aspx
export class BlobUploader {
private MAX_BLOB_SIZE = 4 * 1024 * 1024; // each file is split into 4 MB blocks (used to be 256 KB)
private BLOCK_NAME_PREFIX = "blk-";
private MAX_BLOCKS = 50000; // a block blob allows a maximum of 50,000 blocks
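The splitting bookkeeping this uploader performs (fixed-size 4 MB blocks, "blk-" prefixed ids of equal length, capped at 50,000 blocks per block blob) can be sketched as follows; plan_blocks is an illustrative helper, not part of the original gist:

```python
# Sketch: plan the blocks for a block-blob upload. Ids are zero-padded
# so they all have the same length, then base64-encoded as the Blob
# service expects.
import base64
import math

MAX_BLOCK_SIZE = 4 * 1024 * 1024   # 4 MB per block
MAX_BLOCKS = 50_000                # service limit per block blob
BLOCK_NAME_PREFIX = "blk-"

def plan_blocks(total_size: int):
    """Return (block_id, offset, length) tuples covering total_size bytes."""
    count = math.ceil(total_size / MAX_BLOCK_SIZE)
    if count > MAX_BLOCKS:
        raise ValueError("file too large for a single block blob")
    blocks = []
    for i in range(count):
        raw = f"{BLOCK_NAME_PREFIX}{i:06d}"
        block_id = base64.b64encode(raw.encode()).decode()
        offset = i * MAX_BLOCK_SIZE
        length = min(MAX_BLOCK_SIZE, total_size - offset)
        blocks.append((block_id, offset, length))
    return blocks
```

With 4 MB blocks the 50,000-block cap works out to roughly 190 GB per blob, which is why the uploader raised the block size from the earlier 256 KB.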
rndazurescript / reboot.py
Created July 31, 2020 16:10
TPLink_Archer_C7_Router remote reboot script
import requests
import base64
import sys
ROUTER_IP = "192.168.0.1"
USERNAME = "admin"
PASSWORD = "YOUR_ROUTER_PASSWORD_HERE"
class TPLink_Archer_C7_Router_Web_Interface:
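The script logs in with an HTTP Basic credential built from the username and password. Exactly where the Archer C7 firmware expects that credential (an Authorization header versus a cookie) varies by firmware version, so treat this as an illustrative sketch of the standard Basic encoding only:

```python
# Sketch: build an HTTP Basic credential string, base64("user:password").
# How the router consumes it (header or cookie) depends on the firmware.
import base64

def basic_credential(username: str, password: str) -> str:
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"
```

The resulting value would then be attached to the requests session before calling the reboot endpoint.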
rndazurescript / async_storage.py
Last active September 1, 2020 15:10
Accessing storage account in parallel using managed identity
# The following sample shows how to authenticate asynchronously and access a storage
# account. This sample is subject to the IMDS limit of 5 concurrent requests, but
# the Python SDK has baked-in retry policies.
import asyncio
import logging
import os
from azure.identity.aio import DefaultAzureCredential
from azure.storage.blob.aio import BlobServiceClient
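Besides relying on the SDK's retry policies, the IMDS 5-concurrent-request limit mentioned above can be handled client-side by capping the number of in-flight requests with an asyncio.Semaphore. A minimal sketch, where fetch_blob is a stand-in for a real download through BlobServiceClient:

```python
# Sketch: cap concurrency with a semaphore so no more than 5 requests
# are in flight at once. fetch_blob is a placeholder coroutine.
import asyncio

MAX_CONCURRENCY = 5

async def fetch_blob(name: str) -> str:
    await asyncio.sleep(0)          # placeholder for real async I/O
    return f"downloaded {name}"

async def fetch_all(names):
    sem = asyncio.Semaphore(MAX_CONCURRENCY)

    async def guarded(name):
        async with sem:             # at most MAX_CONCURRENCY in flight
            return await fetch_blob(name)

    # gather preserves input order regardless of completion order.
    return await asyncio.gather(*(guarded(n) for n in names))

results = asyncio.run(fetch_all([f"blob-{i}" for i in range(8)]))
```

Caps like this complement the SDK retries: retries recover from throttling after the fact, while the semaphore avoids triggering it in the first place.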