Pragith / pastes_readme_v1_9.md
Created November 9, 2012 20:53
Readme for Pastes

Version: 1.9

Description

sudo apt-get remove scala-library scala
wget http://www.scala-lang.org/files/archive/scala-2.11.6.deb
sudo dpkg -i scala-2.11.6.deb
sudo apt-get update
sudo apt-get install scala
wget http://dl.bintray.com/sbt/debian/sbt-0.13.8.deb
sudo dpkg -i sbt-0.13.8.deb
sudo apt-get update
# Download Spark 1.4 from http://spark.apache.org/downloads.html
#
# Download the nyc flights dataset as a CSV from https://s3-us-west-2.amazonaws.com/sparkr-data/nycflights13.csv
# Launch SparkR using
# ./bin/sparkR --packages com.databricks:spark-csv_2.10:1.0.3
# The SparkSQL context should already be created for you as sqlContext
sqlContext
# Java ref type org.apache.spark.sql.SQLContext id 1
Sys.setenv(SPARK_HOME="/home/pragith/spark")
Sys.setenv('SPARKR_SUBMIT_ARGS'='"--packages" "com.databricks:spark-csv_2.10:1.0.3" "sparkr-shell"')
Pragith / PlunkerInstallationGuide.md
Created March 21, 2016 05:24 — forked from WilHall/PlunkerInstallationGuide.md
Plunker Installation Guide
Pragith / README.md
Created May 3, 2018 05:24 — forked from leonardofed/README.md
A curated list of AWS resources to prepare for the AWS Certifications



A curated list of awesome AWS resources you need to prepare for all 5 AWS Certifications. This gist includes: open-source repos, blogs and blog posts, ebooks, PDFs, whitepapers, video courses, free lectures, slides, sample tests, and many other resources.

OP: @leonardofed, founder @ plainflow.


Pragith / gist:868d822b5b0fe83f0d836cffbe2f0f13
Created July 12, 2019 18:42
Make it rain like Matrix on Mac
alias matrix='echo -e "\033[0;32m1"; while $t; do for i in `seq 1 30`;do r="$[($RANDOM % 2)]";h="$[($RANDOM % 4)]";if [ $h -eq 1 ]; then v="\033[0;32m0 $r";else v="1 $r";fi;v2="$v2 $v";done;echo -e $v2;v2="";done;'
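The alias builds each screen row from 30 random bit pairs, occasionally re-emitting the green ANSI escape so columns flicker between bright and dim. A rough Python rendering of that per-row loop (a sketch for illustration, not part of the gist; `matrix_row` is a name introduced here) looks like:

```python
import random

def matrix_row(width=30):
    # One row of "digital rain": pairs of random bits, with the green
    # ANSI escape (\033[0;32m) re-emitted at random, as in the alias.
    cells = []
    for _ in range(width):
        r = random.randint(0, 1)
        if random.randint(0, 3) == 1:
            cells.append(f"\033[0;32m0 {r}")
        else:
            cells.append(f"1 {r}")
    return " ".join(cells)

# Print a handful of rows; the shell alias loops forever instead.
for _ in range(5):
    print(matrix_row())
```

The shell version wraps the same logic in `while $t; do ...; done`, which never terminates because `$t` expands to nothing and the empty command succeeds.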
Pragith / download.py
Created September 25, 2019 12:53
Download file in Python with progress bar
import requests
from tqdm import tqdm

def download_file(url, output_location):
    print('URL:', url)
    # Streaming, so we can iterate over the response.
    r = requests.get(url, stream=True)
    # Total size in bytes.
    total_size = int(r.headers.get('content-length', 0))
    block_size = 1024
    wrote = 0
    with open(output_location, 'wb') as f:
        # Body past this point is truncated in the gist; a tqdm-based
        # chunk loop is assumed to match the "progress bar" title.
        for chunk in tqdm(r.iter_content(block_size),
                          total=total_size // block_size, unit='KB'):
            wrote += len(chunk)
            f.write(chunk)
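The gist is cut off after the `with open(...)` line, but the bookkeeping it sets up is clear: whatever progress bar follows iterates over `total_size // block_size` chunks. A quick sketch with illustrative numbers (a hypothetical 10 MiB file, not from the gist):

```python
# Illustrative numbers only: a hypothetical 10 MiB download split into
# the function's 1 KiB blocks.
total_size = 10 * 1024 * 1024   # from the content-length header
block_size = 1024
n_chunks = total_size // block_size
print(n_chunks)  # 10240 chunks, i.e. 10240 progress-bar steps
```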
Pragith / lynda_to_csv.py
Last active September 28, 2019 00:16
Generates a CSV of list of Lynda videos
from bs4 import BeautifulSoup as bs
import argparse, requests, pandas as pd
# Get report related arguments from the command line
parser = argparse.ArgumentParser()
parser.add_argument("-url","--url", help="Enter Lynda course URL", type=str)
parser.add_argument("-o","--output_file", help="Enter the output filename", type=str)
args = vars(parser.parse_args())
def duration_to_seconds(T):
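The gist is truncated at the `duration_to_seconds` signature. A plausible body for Lynda-style duration strings such as "3m 45s" or "1h 2m 3s" (a guess at the original's intent, not its actual code) would be:

```python
import re

def duration_to_seconds(T):
    # Hypothetical parser for durations like "3m 45s" or "1h 2m 3s";
    # the original gist body is truncated at this function.
    m = re.match(r'(?:(\d+)h)?\s*(?:(\d+)m)?\s*(?:(\d+)s)?', T.strip())
    h, mins, secs = (int(g) if g else 0 for g in m.groups())
    return h * 3600 + mins * 60 + secs

print(duration_to_seconds("3m 45s"))  # 225
```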