Saketha Ramanujam (sakethramanujam)

@sakethramanujam
sakethramanujam / Rpi-InfluxDB-Install.md
Last active May 8, 2018 09:23 — forked from boseji/Rpi-InfluxDB-Install.md
Raspberry Pi InfluxDB installation

Raspberry Pi InfluxDB: The solution for IoT Data storage

The Raspberry Pi is a cost-effective Linux computer, very commonly used for IoT and home-automation projects.

Here are three problems with conventional databases for IoT data-store applications:

  • Too much, or overly complex, configuration
  • No way to expire data or set retention policies
  • Not tailor-made for time-series data
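InfluxDB addresses these points: it accepts a simple line protocol over HTTP and supports retention policies out of the box. As a minimal sketch (the database name `iot`, the measurement, and the local server address are assumptions, not part of the original gist), a reading can be written with nothing but the standard library:

```python
import urllib.request

def to_line_protocol(measurement, tags, fields):
    """Build an InfluxDB line-protocol string, e.g. 'temp,room=lab value=21.5'."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str}"

def write_point(line, db="iot", host="http://localhost:8086"):
    # POST the line to InfluxDB's /write endpoint (1.x HTTP API)
    req = urllib.request.Request(f"{host}/write?db={db}", data=line.encode())
    return urllib.request.urlopen(req)

line = to_line_protocol("temperature", {"room": "lab"}, {"value": 21.5})
```

Old points can then be expired automatically with a retention policy, e.g. `CREATE RETENTION POLICY one_week ON iot DURATION 7d REPLICATION 1` in the influx shell.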
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from summary import read_file, daily, monthly
import datetime as dt

default_args = {
    'owner': 'airflow',
    'start_date': dt.datetime(2019, 5, 31, 10, 0, 0),
    'concurrency': 1,
}
@sakethramanujam
sakethramanujam / Connecting a Local Database in Apache Airflow.md
Last active June 7, 2019 12:08
A solution to a common problem that Airflow users may have faced.

⚠️ This post assumes that you have a basic understanding of the Airflow Web UI! ⚠️

Connecting a Local Database on Apache Airflow

What's the problem? 🤔

Once the Airflow environment is set up, the DAGs are written, and the webserver is launched, the UI looks something like this: alt-ui. Here's a sample workflow DAG that I've written; I've backfilled it for testing purposes, so it displays task status and other metrics.
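Before wiring a database into the Airflow UI, it helps to confirm the local database itself exists and is reachable. A sketch with the standard library (the file name and the `readings` table are hypothetical, not from the original post): a SQLite file like this can then be registered in Airflow as a connection with a `sqlite:///` URI.

```python
import sqlite3

# Create (or open) a local SQLite database that Airflow can later
# point at via a connection URI such as sqlite:///<path-to-file>
conn = sqlite3.connect("airflow_local_demo.db")
conn.execute("CREATE TABLE IF NOT EXISTS readings (ts TEXT, value REAL)")
conn.execute("INSERT INTO readings VALUES ('2019-06-07', 42.0)")
conn.commit()

rows = conn.execute("SELECT value FROM readings").fetchall()
conn.close()
```

In the Airflow web UI this roughly corresponds to Admin → Connections → Create, choosing the SQLite connection type and pointing it at the file.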

@sakethramanujam
sakethramanujam / simple_bash_dag.py
Last active January 6, 2022 16:09
Sample DAG with BashOperator.
#!/usr/bin/python3
from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta
# let's set up arguments for our DAG
my_dag_id = "my_first_dag"
default_args = {
@sakethramanujam
sakethramanujam / add_launches.py
Last active September 12, 2019 10:40
Adding Launch Notifications to Collection
import asyncio
import json
import urllib.request
from pymongo import MongoClient
def connect(database_name: str, collection_name: str) -> object:
    try:
        conn = MongoClient()
        db = conn.snapi
@sakethramanujam
sakethramanujam / netflix-log.py
Created October 22, 2019 09:14
Python script to scrape details from the Netflix usage log.
import pandas as pd
import urllib
from bs4 import BeautifulSoup
import argparse
def _parse():
    parser = argparse.ArgumentParser()
    parser.add_argument('-html', type=str, help='path to html file')
    parser.add_argument('-f', '--filename', type=str, help='name to save the output')
@sakethramanujam
sakethramanujam / download-driver.py
Created November 20, 2019 06:13
A Python script to automatically download the ChromeDriver build specific to your operating system
#! /usr/bin/env python3
import platform
import os
import subprocess
import requests
import zipfile
import io
import re
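The core of such a script is mapping `platform.system()` to the right ChromeDriver archive. A sketch of that step (the archive names follow the legacy `chromedriver.storage.googleapis.com` layout used at the time; treat them as assumptions, not part of the original gist):

```python
import platform

# Map the OS reported by platform.system() to a ChromeDriver zip name
ARCHIVES = {
    "Linux": "chromedriver_linux64.zip",
    "Darwin": "chromedriver_mac64.zip",
    "Windows": "chromedriver_win32.zip",
}

def archive_for(system=None):
    """Return the ChromeDriver archive name for this (or a given) OS."""
    system = system or platform.system()
    try:
        return ARCHIVES[system]
    except KeyError:
        raise RuntimeError(f"unsupported platform: {system}")
```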
#!/usr/bin/env python3
from bs4 import BeautifulSoup
import sys
import pandas as pd
import argparse
def args():
    parser = argparse.ArgumentParser()
    parser.add_argument('-f', '--filename', help='path to input', type=str)
@sakethramanujam
sakethramanujam / pickle.py
Created December 9, 2019 10:32
Functions to save machine-learning models as .pickle files and to load them back
import pickle
def save_pickle(model):
    name = input('Name of the model: ')
    # use a separate file handle so it doesn't shadow the model argument
    with open(f'{name}.pickle', 'wb') as f:
        pickle.dump(model, f)
    print('Model Saved.')

def load_pickle(filename):
    with open(filename, 'rb') as f:
        model = pickle.load(f)
    return model
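A quick round-trip check of the same pattern, with the file name passed as a parameter instead of read via `input()` so it can run non-interactively (the dict standing in for a trained model is just a placeholder):

```python
import pickle

def save_model(model, name):
    # same idea as save_pickle, but the name arrives as an argument
    with open(f'{name}.pickle', 'wb') as f:
        pickle.dump(model, f)

def load_model(filename):
    with open(filename, 'rb') as f:
        return pickle.load(f)

fake_model = {'weights': [0.1, 0.2]}  # placeholder for a trained model
save_model(fake_model, 'demo')
restored = load_model('demo.pickle')
```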
@sakethramanujam
sakethramanujam / Usage.md
Last active December 10, 2019 05:31
Saving Feature Importances

The above code can be used as follows:

path_to_save_fi = 'your/path/'
save_importances(model, y_test.columns)
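The gist preview omits the `save_importances` definition itself. A minimal sketch of what such a helper might look like, assuming the model exposes a scikit-learn-style `feature_importances_` attribute and the function writes one CSV row per feature (all names here are hypothetical, not the original implementation):

```python
import csv

def save_importances(model, columns, path='feature_importances.csv'):
    # pair each column name with the model's importance score and write to CSV
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['feature', 'importance'])
        for name, score in zip(columns, model.feature_importances_):
            writer.writerow([name, score])

class FakeModel:  # stand-in for a fitted estimator
    feature_importances_ = [0.7, 0.3]

save_importances(FakeModel(), ['age', 'income'], 'fi_demo.csv')
```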