Eric E. Intelrunner

@Intelrunner
Intelrunner / findFunctionDockerRegistry.sh
Created February 28, 2024 18:24
Lists the Docker registry currently used to deploy each Google Cloud Function in a project
#!/bin/bash
# Set your GCP project ID
PROJECT_ID=$(gcloud config get-value project)
# List all functions in the project and iterate over them
for FUNCTION_NAME in $(gcloud functions list --project=$PROJECT_ID --format="value(name)")
do
echo "Function: $FUNCTION_NAME"
# Describe each function to get the dockerRegistry value
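The preview cuts off before the describe call. A minimal sketch of how the loop body might be completed, assuming the functions are 1st gen (where the dockerRegistry field is exposed) and a default functions/region is configured so describe does not prompt:

  REGISTRY=$(gcloud functions describe "$FUNCTION_NAME" --project="$PROJECT_ID" --format="value(dockerRegistry)")
  echo "  dockerRegistry: $REGISTRY"
done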
@Intelrunner
Intelrunner / updateDatasetByProject.py
Last active February 28, 2024 14:36
A Google Cloud Function that takes a list of projects, lists all BigQuery datasets in each, and updates every dataset to the "PHYSICAL" storage billing model.
"""
This (Google Cloud) Function will scan a list of provided projects and get all BigQuery datasets for each of those projects.
It will then update the storage billing model for each dataset to 'PHYSICAL' instead of 'LOGICAL', which can reduce storage costs for large datasets whose data compresses well.
Required Permissions:
- The service account running this function must have the following roles:
to be updated---
Returns:
string: A message indicating the function has completed.
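The preview shows only the docstring. A hedged sketch of what the body might look like with the google-cloud-bigquery client, assuming a client version that exposes Dataset.storage_billing_model; the PROJECTS list and the entry-point name are placeholders, not taken from the gist:

from google.cloud import bigquery

PROJECTS = ["project-a", "project-b"]  # placeholder project IDs

def update_billing_models(request):
    client = bigquery.Client()
    for project_id in PROJECTS:
        # List every dataset in the project and flip its storage billing model
        for item in client.list_datasets(project=project_id):
            dataset = client.get_dataset(item.reference)
            dataset.storage_billing_model = "PHYSICAL"
            client.update_dataset(dataset, ["storage_billing_model"])
    return "Storage billing model update complete."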
@Intelrunner
Intelrunner / bq_dataset_billing_model_change.py
Last active February 27, 2024 20:46
GCP Cloud Function that takes an Eventarc event of "InsertDataset" for BigQuery and immediately changes the storage billing model to PHYSICAL for that dataset. Returns a simple log output.
import functions_framework
from google.cloud import bigquery
def extract_dataset_name(full_path):
"""
Extracts the dataset name from a full BigQuery dataset resource path.
Args:
- full_path (str): The full resource path of the dataset,
formatted as 'projects/{project_id}/datasets/{dataset_id}'.
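The preview stops mid-docstring. A hedged sketch of how the helper and the Eventarc entry point might continue; the payload path protoPayload.resourceName and the entry-point name are assumptions, not the gist's actual code:

    Returns:
    - str: The dataset reference as '{project_id}.{dataset_id}'.
    """
    parts = full_path.split("/")
    return f"{parts[1]}.{parts[3]}"

@functions_framework.cloud_event
def handle_insert_dataset(cloud_event):
    # Pull the dataset's resource path out of the audit-log event payload
    dataset_id = extract_dataset_name(cloud_event.data["protoPayload"]["resourceName"])
    client = bigquery.Client()
    dataset = client.get_dataset(dataset_id)
    dataset.storage_billing_model = "PHYSICAL"
    client.update_dataset(dataset, ["storage_billing_model"])
    print(f"Set PHYSICAL storage billing model on {dataset_id}")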
@Intelrunner
Intelrunner / cloudbuild.yaml
Created December 18, 2023 16:13
cloudbuild.yaml for a Go build that pulls a module from Artifact Registry and deploys to Cloud Functions.
steps:
  - name: golang
    entrypoint: go
    args: [ 'run', 'github.com/GoogleCloudPlatform/artifact-registry-go-tools/cmd/[email protected]', 'add-locations', '--locations=${_LOCATION}' ]
    env:
      # Set GOPROXY to the public proxy to pull the credential helper tool
      - 'GOPROXY=proxy.golang.org'
  - name: golang
    entrypoint: go
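The preview stops before the second step's args. A hedged sketch of how the remaining steps might look given the stated goal; the build args, the _FUNCTION_NAME substitution, the runtime, and the deploy flags are illustrative, not taken from the gist:

    # Verify the private module now resolves through Artifact Registry
    args: [ 'mod', 'download' ]
  # Deploy the function from source
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args: [ 'functions', 'deploy', '${_FUNCTION_NAME}', '--region=${_LOCATION}', '--runtime=go121', '--trigger-http', '--source=.' ]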
@Intelrunner
Intelrunner / project_structure_build.sh
Last active April 20, 2023 20:38
This gist builds out a simple directory structure for a user-facing app and grabs a general .gitignore from a gist
#!/bin/bash
# prompt user for monorepo name
read -p "Enter monorepo name: " monorepo_name
# check if monorepo directory already exists
if [ -d "$monorepo_name" ]; then
echo "Monorepo directory already exists. Skipping creation process."
else
# create the main directory
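The preview ends at this comment. A sketch of how the creation branch might continue; the subdirectory names are placeholders, and GITIGNORE_URL stands in for the raw gist URL that is not shown in the preview:

  mkdir "$monorepo_name"
  mkdir -p "$monorepo_name"/{frontend,backend,docs}
  # GITIGNORE_URL is a placeholder for the raw gist URL of the general .gitignore
  curl -fsSL "$GITIGNORE_URL" -o "$monorepo_name/.gitignore"
fi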
@Intelrunner
Intelrunner / ce_to_spanner.py
Last active April 19, 2023 17:28
Calculating Spanner Costs from GCE
class GCPCostCalculator:
    def __init__(self):
        self.sql_cores = 0
        self.sql_ram = 0
        self.sql_disk_size = 0
        self.sql_disk_type = ""
        self.sql_num_machines = 0
        self.spanner_nodes = 0
        self.spanner_region = ""
gcp_machine_family_pricing = {
    'N1': {
        'us-central1': {
            'core_cost': 0.031611,  # Cost per core-hour
            'ram_cost': 0.004237,   # Cost per GB-hour of RAM
            'cud_discount': {
                '1-year': 0.3,   # 30% discount
                '3-year': 0.57   # 57% discount
            }
        }
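The preview cuts the pricing table off here. As an illustration of how such a table could feed the comparison, a helper along these lines could estimate the GCE side; estimate_gce_monthly_cost is not from the gist, and the 730-hour month is an assumption, while the field names mirror the preview above:

def estimate_gce_monthly_cost(calc, family="N1", region="us-central1", hours=730):
    # Sum per-machine core and RAM cost, then scale by fleet size
    pricing = gcp_machine_family_pricing[family][region]
    hourly = calc.sql_cores * pricing["core_cost"] + calc.sql_ram * pricing["ram_cost"]
    return hourly * hours * calc.sql_num_machines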
@Intelrunner
Intelrunner / reccy.sh
Last active June 15, 2022 20:05
Starting point for processing all output from the Google Cloud Compute Engine machine type recommender
#!/bin/bash
for project in $(gcloud projects list --format="value(projectId)")
do
  # Iterate over recommendation resource names, one per line
  for recommendation in $(gcloud recommender recommendations list --project=$project --location=global --recommender=google.compute.instance.MachineTypeRecommender --format="value(name)")
  do
    # Here is where the logic for your 'what to do with each recommendation' goes
    echo "$recommendation"
  done
done
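A hedged way to flesh out the per-project logic inside the outer loop is to dump each recommendation's key fields for review instead of echoing bare names; the projection fields chosen here are illustrative:

  gcloud recommender recommendations list --project="$project" --location=global --recommender=google.compute.instance.MachineTypeRecommender --format="csv(name,primaryImpact.category,stateInfo.state,description)"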
@Intelrunner
Intelrunner / churn_rate.sql
Created May 27, 2022 15:02
Calculate Churn Rate
WITH
[CHURN STAT] AS
(SELECT *
FROM [TABLE]
WHERE [STAT START] < '[DATE]'
AND (
([STAT END] >= '2018-01-01')
OR ([STAT END] IS NULL))
),
status AS
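The preview ends as the status CTE opens. A hedged sketch of one common way such a query finishes; the churned/retained rule and the bracketed column names are placeholders, consistent with the placeholders above:

  (SELECT *,
          CASE WHEN [STAT END] IS NULL THEN 'retained' ELSE 'churned' END AS churn_status
   FROM [CHURN STAT])
SELECT SUM(CASE WHEN churn_status = 'churned' THEN 1 ELSE 0 END) * 1.0 / COUNT(*) AS churn_rate
FROM status;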
@Intelrunner
Intelrunner / gcp_scan_all_tables_dlp.py
Created May 4, 2022 17:06
Starts a DLP inspect job for ALL TABLES in all datasets available to the user in BigQuery.
""" Warning - this script automatically submits a request for GCP DLP job creation
to scan for every table, in every dataset available to a user. This will fail if the following APIs / Permissions
are not enabled.
This script can, and may, cost you actual $$. Outcomes can be seen in the DLP console.
DLP - jobs.create, jobs.get, jobs.list
BQ - bigquery.user (role)
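The preview shows only part of the warning docstring. A hedged sketch of how such a scan might be submitted with the BigQuery and DLP clients; the info types and the function name are assumptions, not the gist's actual code:

from google.cloud import bigquery, dlp_v2

def scan_all_tables(project_id):
    bq = bigquery.Client(project=project_id)
    dlp = dlp_v2.DlpServiceClient()
    parent = f"projects/{project_id}"
    for ds in bq.list_datasets():
        for table in bq.list_tables(ds.dataset_id):
            inspect_job = {
                "storage_config": {
                    "big_query_options": {
                        "table_reference": {
                            "project_id": project_id,
                            "dataset_id": table.dataset_id,
                            "table_id": table.table_id,
                        }
                    }
                },
                # Placeholder info types; tune to what you actually need to find
                "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
            }
            dlp.create_dlp_job(request={"parent": parent, "inspect_job": inspect_job})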