Deployment configuration for the Airflow scheduler
kind: Deployment
apiVersion: apps/v1
metadata:
  name: airflow-scheduler
  namespace: airflow-on-k8s
spec:
  replicas: 1
  selector:
    matchLabels:
      tier: airflow
      component: scheduler
      release: airflow-k8s
  template:
    metadata:
      labels:
        tier: airflow
        component: scheduler
        release: airflow-k8s
      annotations:
        cluster-autoscaler.kubernetes.io/safe-to-evict: "true"
    spec:
      restartPolicy: Always
      terminationGracePeriodSeconds: 10
      serviceAccountName: airflow-scheduler-serviceaccount
      initContainers:
        # Runs database migrations before the scheduler starts; the fallback
        # covers the CLI rename from `airflow upgradedb` to `airflow db upgrade`.
        - name: run-airflow-migrations
          image: apache/airflow:1.10.10.1-alpha2-python3.6
          imagePullPolicy: IfNotPresent
          args: ["bash", "-c", "airflow upgradedb || airflow db upgrade"]
          env:
            - name: AIRFLOW__CORE__FERNET_KEY
              valueFrom:
                secretKeyRef:
                  name: airflow-fernet-key
                  key: fernet-key
            - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
              valueFrom:
                secretKeyRef:
                  name: airflow-airflow-metadata
                  key: connection
            - name: AIRFLOW_CONN_AIRFLOW_DB
              valueFrom:
                secretKeyRef:
                  name: airflow-airflow-metadata
                  key: connection
          volumeMounts:
            - name: config
              mountPath: "/opt/airflow/airflow.cfg"
              subPath: airflow.cfg
              readOnly: true
      containers:
        - name: scheduler
          image: apache/airflow:1.10.10.1-alpha2-python3.6
          imagePullPolicy: IfNotPresent
          args: ["scheduler"]
          env:
            - name: AIRFLOW__CORE__FERNET_KEY
              valueFrom:
                secretKeyRef:
                  name: airflow-fernet-key
                  key: fernet-key
            # All three database settings read the same metadata-connection secret.
            - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
              valueFrom:
                secretKeyRef:
                  name: airflow-airflow-metadata
                  key: connection
            - name: AIRFLOW_CONN_AIRFLOW_DB
              valueFrom:
                secretKeyRef:
                  name: airflow-airflow-metadata
                  key: connection
          # Fails the probe when the most recent SchedulerJob heartbeat is stale
          # or belongs to a scheduler running on a different host.
          livenessProbe:
            failureThreshold: 10
            periodSeconds: 30
            exec:
              command:
                - python
                - -Wignore
                - -c
                - |
                  import os
                  import sys
                  os.environ['AIRFLOW__CORE__LOGGING_LEVEL'] = 'ERROR'
                  os.environ['AIRFLOW__LOGGING__LOGGING_LEVEL'] = 'ERROR'
                  from airflow.jobs.scheduler_job import SchedulerJob
                  from airflow.utils.net import get_hostname
                  job = SchedulerJob.most_recent_job()
                  sys.exit(0 if job.is_alive() and job.hostname == get_hostname() else 1)
          volumeMounts:
            - name: logs-pv
              mountPath: "/opt/airflow/logs"
            - name: dags-pv
              mountPath: "/opt/airflow/dags"
            - name: config
              mountPath: "/opt/airflow/airflow.cfg"
              subPath: airflow.cfg
              readOnly: true
      volumes:
        - name: config
          configMap:
            name: airflow-config
        - name: dags-pv
          persistentVolumeClaim:
            claimName: dags-pvc
        - name: logs-pv
          persistentVolumeClaim:
            claimName: logs-pvc
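
The Deployment references two Secrets that must already exist in the airflow-on-k8s namespace: airflow-fernet-key (key fernet-key) and airflow-airflow-metadata (key connection). A minimal sketch of what they could look like; the Fernet key and the connection string below are placeholder assumptions, not values from this gist, and should be replaced with your own.

kind: Secret
apiVersion: v1
metadata:
  name: airflow-fernet-key
  namespace: airflow-on-k8s
type: Opaque
stringData:
  # Generate with: python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
  fernet-key: "PLACEHOLDER_FERNET_KEY"
---
kind: Secret
apiVersion: v1
metadata:
  name: airflow-airflow-metadata
  namespace: airflow-on-k8s
type: Opaque
stringData:
  # Assumed Postgres endpoint; adjust host, credentials, and database to your setup.
  connection: "postgresql+psycopg2://airflow:airflow@airflow-postgresql:5432/airflow"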
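The volumes section likewise assumes an airflow-config ConfigMap carrying airflow.cfg, an airflow-scheduler-serviceaccount ServiceAccount, and two PersistentVolumeClaims for DAGs and logs. A hedged sketch of the claims follows; the access mode and storage sizes are assumptions (ReadWriteMany is chosen so DAGs and logs can be shared with webserver/worker pods, which requires a storage class that supports it).

kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: dags-pvc
  namespace: airflow-on-k8s
spec:
  accessModes: ["ReadWriteMany"]  # assumed; needs an RWX-capable storage class
  resources:
    requests:
      storage: 1Gi
---
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: logs-pvc
  namespace: airflow-on-k8s
spec:
  accessModes: ["ReadWriteMany"]
  resources:
    requests:
      storage: 5Gi

With those prerequisites in place, the scheduler can be deployed with kubectl apply -f <manifest>.yaml; the namespace is already set in each manifest's metadata.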