Sairam Krish (sairamkrish) — GitHub gists
@sairamkrish
sairamkrish / install-docker.md
Created April 5, 2021 13:45 — forked from npearce/install-docker.md
Amazon Linux 2 - install docker & docker-compose using 'sudo amazon-linux-extras' command

UPDATE (March 2020, thanks @ic): I don't know the exact AMI version, but 'yum install docker' now works on the latest Amazon Linux 2. The instructions below may still be relevant depending on the vintage of the AMI you are using.

Amazon changed the install process in Amazon Linux 2; it no longer uses plain 'yum'. See: https://aws.amazon.com/amazon-linux-2/release-notes/

Docker CE Install

sudo amazon-linux-extras install docker
sudo service docker start
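The gist title also mentions docker-compose, but the preview stops after starting the Docker service. A hedged sketch of the usual follow-up steps on Amazon Linux 2 — the group name, user name, and compose release number below are assumptions; pin whichever release you actually need:

```shell
# Allow ec2-user to run docker without sudo (log out and back in afterwards)
sudo usermod -a -G docker ec2-user

# Install docker-compose as a standalone binary; 1.29.2 is an assumed
# version -- substitute the release you actually want.
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" \
  -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
docker-compose --version
```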
@sairamkrish
sairamkrish / docker-compose.yml
Last active October 30, 2024 18:00
Airflow - Docker swarm setup
version: "3.8"
# This should give a high-level idea of the approach.
# The complete solution is too complex and involves multiple internal microservices.
# I have tried to capture the core things to consider when someone else needs to achieve Docker-swarm-based auto scale-out of workers.
services:
  webserver:
    image: customized/airflow:prod
    environment:
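The preview cuts off at the webserver's `environment:` key. A hedged sketch of how a swarm-mode Airflow webserver service might be fleshed out — the image name matches the preview, but every other value below is illustrative, not the author's actual config:

```yaml
# Hypothetical completion; only the image name comes from the preview.
version: "3.8"
services:
  webserver:
    image: customized/airflow:prod
    environment:
      - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
      - AIRFLOW__CELERY__BROKER_URL=redis://redis:6379/0
    ports:
      - "8080:8080"
    deploy:
      replicas: 1
      restart_policy:
        condition: on-failure
```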
const evtSource = new EventSource("http://localhost:8080/status/stream?param1=test");
evtSource.addEventListener("update", function (event) {
  // Logic to handle status updates
  console.log(event);
});
evtSource.addEventListener("end", function (event) {
  console.log("Handling end....");
  evtSource.close();
});
from sse_starlette.sse import EventSourceResponse
from fastapi import APIRouter, Request
from app.utils import status_event_generator
...
router = APIRouter()

@router.get('/status/stream')
async def runStatus(request: Request, param1: str = None):
    # Body reconstructed from the imports above; the preview cuts off here.
    event_generator = status_event_generator(request, param1)
    return EventSourceResponse(event_generator)
import asyncio

'''
Get status as an event generator
'''
status_stream_delay = 5  # seconds
status_stream_retry_timeout = 30000  # milliseconds

async def status_event_generator(request, param1):
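The preview cuts off before the generator's body. A minimal sketch of how such a generator typically works — the `compute_status` helper and the polling logic are assumptions, not the gist's actual code; only the constants and the signature come from the preview:

```python
import asyncio

status_stream_delay = 5  # seconds
status_stream_retry_timeout = 30000  # milliseconds

def compute_status(param1):
    # Placeholder helper: a real implementation would query a job/task store.
    return "running"

async def status_event_generator(request, param1):
    # Poll for status changes and emit an SSE "update" event on each change,
    # until the client disconnects.
    previous_status = None
    while True:
        if await request.is_disconnected():
            break
        current_status = compute_status(param1)
        if current_status != previous_status:
            yield {
                "event": "update",
                "retry": status_stream_retry_timeout,
                "data": current_status,
            }
            previous_status = current_status
        await asyncio.sleep(status_stream_delay)
```

The generator yields plain dicts, which sse_starlette's EventSourceResponse serializes into `event:`/`retry:`/`data:` SSE fields; the disconnect check keeps the server from streaming to dead clients.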
@sairamkrish
sairamkrish / __init__.py
Created June 30, 2020 21:00
alembic model autogeneration
'''
Automatically add all models to __all__
This is used in alembic while autogenerating database migration script.
'''
from os.path import dirname, basename, isfile, join
import glob
modules = glob.glob(join(dirname(__file__), "*.py"))
__all__ = [basename(f)[:-3] for f in modules if isfile(f) and not f.endswith('__init__.py')]
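A quick sanity check of the glob pattern above, using a throwaway directory laid out like a hypothetical models package (`__init__.py`, `user.py`, `order.py`):

```python
import glob
import tempfile
from os.path import basename, isfile, join

# Simulate a models/ package with two model modules plus __init__.py
with tempfile.TemporaryDirectory() as pkg:
    for name in ("__init__.py", "user.py", "order.py"):
        open(join(pkg, name), "w").close()
    modules = glob.glob(join(pkg, "*.py"))
    names = [basename(f)[:-3] for f in modules
             if isfile(f) and not f.endswith('__init__.py')]

print(sorted(names))  # → ['order', 'user']
```

Only the module files land in `__all__`; `__init__.py` itself is filtered out, so `from models import *` pulls in every model module and alembic's autogenerate can see all table definitions.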
@sairamkrish
sairamkrish / env.py
Last active September 25, 2024 19:18
Alembic env.py sample
import os
import sys
from logging.config import fileConfig
from sqlalchemy import create_engine
from alembic import context
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
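The preview stops right after `config = context.config`. A hedged continuation following the stock alembic template — the model import path, `Base`, and the `DATABASE_URL` variable are all assumptions, not the author's actual code:

```python
# Assumed continuation of env.py, not the gist's actual contents.
fileConfig(config.config_file_name)  # set up loggers from alembic.ini

# Import the app's models so their tables register on the shared metadata;
# 'app.models.base' and 'Base' are hypothetical names.
from app.models.base import Base
target_metadata = Base.metadata

def run_migrations_online():
    connectable = create_engine(os.environ["DATABASE_URL"])
    with connectable.connect() as connection:
        context.configure(connection=connection,
                          target_metadata=target_metadata)
        with context.begin_transaction():
            context.run_migrations()

run_migrations_online()
```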
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.utils.dates import days_ago
default_args = {
'schedule_interval': '#SCHEDULE_INTERVAL',
'start_date': days_ago(2),
'catchup': False,
'owner': 'admin',
}
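The '#SCHEDULE_INTERVAL' value above looks like a placeholder token to be substituted when a concrete DAG file is generated. A minimal sketch of that generation step — `DAG_TEMPLATE` and `render_dag_file` are hypothetical names, not part of the gist:

```python
# Hypothetical generation step: replace the placeholder token in the DAG
# template with a concrete schedule before writing the DAG file out.
DAG_TEMPLATE = """\
default_args = {
    'schedule_interval': '#SCHEDULE_INTERVAL',
    'start_date': days_ago(2),
    'catchup': False,
    'owner': 'admin',
}
"""

def render_dag_file(template, schedule_interval):
    # Plain string substitution of the placeholder token
    return template.replace('#SCHEDULE_INTERVAL', schedule_interval)

rendered = render_dag_file(DAG_TEMPLATE, '0 * * * *')
print('#SCHEDULE_INTERVAL' in rendered)  # → False
```

The rendered text would then be written into the DAGs folder, where the scheduler picks it up as a new DAG.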
from operators.template_operator import TemplateOperator
from airflow.utils.decorators import apply_defaults
from airflow.utils.dates import days_ago
import os
from pathlib import Path
class RestToTemplateWrapperOperator(TemplateOperator):
@apply_defaults
'''
### DAG Creator
This workflow listens for triggers. Based on the config parameters passed, it creates a DAG.
'''
from datetime import timedelta, datetime
from airflow import DAG
from airflow.utils.dates import days_ago
from operators.rest_to_template_wrapper_operator import RestToTemplateWrapperOperator
# These args will get passed on to each operator