
@BasPH
BasPH / gist:d842786bd2f990e53203de2348da9efd
Last active February 6, 2017 13:41
Display line numbers with bash -x

Display line numbers with bash -x:

PS4='Line ${LINENO}: ' bash -x yourscript.sh

docker-ssh() {
  docker exec -i -t "$1" sh
}
docker-update-time() {
  docker run --rm --privileged --pid=host walkerlee/nsenter -t 1 -m -u -i -n ntpd -d -q -n -p "$(awk '{ print $2 }' /etc/ntp.conf)"
}
docker-cleanup-volumes() {
  docker volume rm $(docker volume ls -q -f dangling=true)
}
BasPH / black_selection.sh
Created December 11, 2018 19:53
Black on selection
#!/usr/bin/env bash
set -x
black=$1
input_file=$2
start_line=$3
end_line=$4
# Read selected lines and write to tmpfile
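The preview ends at the tmpfile comment. The line-range extraction that the next step presumably performs (with sed, awk, or similar; this Python version is only an illustration, not the gist's code):

```python
def extract_lines(text: str, start_line: int, end_line: int) -> str:
    """Return lines start_line..end_line (1-indexed, inclusive) of text."""
    lines = text.splitlines(keepends=True)
    return "".join(lines[start_line - 1:end_line])
```

The tmpfile then holds only the selection, so black reformats just those lines before they are spliced back into the input file.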
BasPH / README.md
Last active April 13, 2019 09:00
Airflow Docker development environment with different Python versions

To set up the development environment:

  1. `docker-compose up -d postgres`
  2. `docker-compose up -d airflow37` (or another version)
  3. `docker logs -f airflow_airflow37_1`
  4. Wait until you see "Installation ready. Start doing nothing..."
  5. `docker exec -it airflow_airflow37_1 bash`
  6. You're good to go, e.g.:
  • `airflow webserver`
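Step 4 above is a manual log watch; the same "block until a marker line appears" idea can be expressed generically (the helper name and marker handling are my own, not part of the gist):

```python
from typing import Iterable


def wait_for_line(lines: Iterable[str], marker: str) -> str:
    """Consume lines until one contains `marker`; return that line."""
    for line in lines:
        if marker in line:
            return line
    raise RuntimeError(f"stream ended without seeing {marker!r}")
```

Fed with the streamed output of `docker logs -f`, this returns as soon as the container prints the readiness message.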
"""Test the validity of all DAGs."""
import glob
from os import path
import pytest
from airflow import models as airflow_models
DAG_PATHS = glob.glob(path.join(path.dirname(__file__), "..", "..", "dags", "*.py"))
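The preview cuts off before the test itself. One common way such a test proceeds is to import every file matched by `DAG_PATHS` and fail on any error; a hedged sketch (the helper name and test body are assumptions, not from the gist):

```python
import importlib.util
from os import path


def import_dag_file(dag_path):
    """Import a Python file by path, raising on syntax or import errors."""
    mod_name = path.splitext(path.basename(dag_path))[0]
    spec = importlib.util.spec_from_file_location(mod_name, dag_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


# Hypothetical test body (parametrization assumed, not shown in the preview):
# @pytest.mark.parametrize("dag_path", DAG_PATHS)
# def test_dag_integrity(dag_path):
#     module = import_dag_file(dag_path)
#     dags = [v for v in vars(module).values()
#             if isinstance(v, airflow_models.DAG)]
#     assert dags, f"{dag_path} defines no DAG object"
```

Importing the file is enough to catch syntax errors, missing imports, and exceptions raised at DAG-definition time.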
{"name":"[DAG]","children":[{"name":"runme_0","children":[{"name":"run_after_loop","children":[{"name":"run_this_last"}]}]},{"name":"runme_1","children":[{"name":"run_after_loop"}]},{"name":"runme_2","children":[{"name":"run_after_loop"}]},{"name":"also_run_this","children":[{"name":"run_this_last"}]}]}
BasPH / test_backports.py
Created May 23, 2020 12:09
Very inefficient test of Airflow providers backport packages
import docker
# fmt: off
backport_package_class_mapping = {
"apache-airflow-backport-providers-amazon==2020.5.20rc2": ["airflow.providers.amazon.aws.hooks.athena.AWSAthenaHook", "airflow.providers.amazon.aws.hooks.aws_dynamodb.AwsDynamoDBHook", "airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook", "airflow.providers.amazon.aws.hooks.cloud_formation.AWSCloudFormationHook", "airflow.providers.amazon.aws.hooks.datasync.AWSDataSyncHook", "airflow.providers.amazon.aws.hooks.ec2.EC2Hook", "airflow.providers.amazon.aws.hooks.emr.EmrHook", "airflow.providers.amazon.aws.hooks.glue.AwsGlueJobHook", "airflow.providers.amazon.aws.hooks.glue_catalog.AwsGlueCatalogHook", "airflow.providers.amazon.aws.hooks.kinesis.AwsFirehoseHook", "airflow.providers.amazon.aws.hooks.lambda_function.AwsLambdaHook", "airflow.providers.amazon.aws.hooks.logs.AwsLogsHook", "airflow.providers.amazon.aws.hooks.redshift.RedshiftHook", "airflow.providers.amazon.aws.hooks.s3.S3Hook", "airflow.providers.amazon.aws.hooks.sagemaker.SageMaker
BasPH / sla_manager.py
Last active July 20, 2022 12:27
Proposed Airflow SLA checker
import logging
import random
import string
import threading
import time
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, Set

logging.basicConfig(level=logging.INFO, format="%(threadName)s %(message)s")
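The preview stops after the imports and logging setup. A minimal sketch of the kind of background checker those imports suggest (all names and the data model here are assumptions, not the proposed implementation):

```python
import logging
import threading
import time
from dataclasses import dataclass
from typing import Dict, List, Optional

logging.basicConfig(level=logging.INFO, format="%(threadName)s %(message)s")
log = logging.getLogger("sla")


@dataclass
class SlaEntry:
    deadline: float  # absolute timestamp by which the task must have finished
    finished: bool = False


class SlaChecker:
    """Scan registered tasks on an interval and log any missed SLA."""

    def __init__(self, interval: float = 1.0) -> None:
        self.interval = interval
        self.entries: Dict[str, SlaEntry] = {}
        self._lock = threading.Lock()

    def register(self, task_id: str, deadline: float) -> None:
        with self._lock:
            self.entries[task_id] = SlaEntry(deadline)

    def mark_finished(self, task_id: str) -> None:
        with self._lock:
            self.entries[task_id].finished = True

    def check_once(self, now: Optional[float] = None) -> List[str]:
        now = time.time() if now is None else now
        with self._lock:
            missed = [task_id for task_id, entry in self.entries.items()
                      if not entry.finished and now > entry.deadline]
        for task_id in missed:
            log.info("SLA missed for task %s", task_id)
        return missed

    def run_forever(self, stop: threading.Event) -> None:
        # Intended for a daemon thread, e.g. threading.Thread(target=...).
        while not stop.is_set():
            self.check_once()
            stop.wait(self.interval)
```

The lock keeps registration (from worker threads) and scanning (from the checker thread) consistent, which is presumably why `threading` appears in the gist's imports.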
BasPH / Dockerfile
Last active November 18, 2022 10:09
Astro Dockerfile with watchdog
FROM quay.io/astronomer/astro-runtime:6.0.4
USER root
RUN apt-get update && apt-get install -y patch patchutils && pip install watchdog~=2.1.9
# Resources:
# - https://man7.org/linux/man-pages/man1/patch.1.html
# - https://stackoverflow.com/a/543045/3066428
# - https://github.com/apache/airflow/blob/2.4.3/airflow/utils/file.py
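The patch step (not shown in the preview) presumably swaps Airflow's stat-based file scanning for watchdog's event-driven observers. For context, the polling idea being replaced can be sketched with the stdlib alone (an illustration, not Airflow's code):

```python
import os
import time
from typing import Callable, Dict, Iterable


def poll_for_changes(paths: Iterable[str], on_change: Callable[[str], None],
                     interval: float = 0.5, rounds: int = 1) -> None:
    """Naive mtime polling; watchdog replaces this with OS-level file events."""
    paths = list(paths)
    mtimes: Dict[str, float] = {p: os.stat(p).st_mtime for p in paths}
    for _ in range(rounds):
        time.sleep(interval)
        for p in paths:
            mtime = os.stat(p).st_mtime
            if mtime != mtimes[p]:
                mtimes[p] = mtime
                on_change(p)
```

Polling burns CPU proportional to the number of files and reacts only once per interval; watchdog's observers get notified by the kernel (inotify on Linux), which is the point of installing it in the image.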