arianagiorgi / README.md
Created December 21, 2017 21:47

Uploading Large CSVs in Postgres

When importing a large CSV file into Postgres, importing with csvkit alone will sometimes stall. In that case it's useful to use csvkit only to build the table, and the Postgres COPY command to get the rest of the data in there.

You will need to pip install csvkit as well as psycopg2, preferably in a virtualenv.

  1. First we'll make the CREATE TABLE statement and pipe it into a SQL file. This example uses the first 100 lines of the file to determine the data type of each field, but that number can be adjusted as needed (see the sketch after this list).
  • csvfile.csv = the large data file
  • yourtable = the table you want to create
  • outputfile.sql = the file that will contain the CREATE TABLE statement
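The commands below are a minimal sketch of this approach, assuming the placeholder names above and a database named yourdb; the exact flags are an illustration, not copied from the original gist. csvsql (from csvkit) infers the schema from the first 100 lines, and psql's \copy then bulk-loads the full file:

  # Generate a CREATE TABLE statement from the first 100 lines of the CSV (schema inference only)
  head -n 100 csvfile.csv | csvsql -i postgresql --tables yourtable > outputfile.sql

  # Create the empty table, then bulk-load the entire CSV with Postgres's COPY
  # ("yourdb" is an assumed database name)
  psql -d yourdb -f outputfile.sql
  psql -d yourdb -c "\copy yourtable FROM 'csvfile.csv' WITH CSV HEADER"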
LayneSmith / README.md
Last active July 6, 2018 01:53, forked from arianagiorgi/README.md

Automation with AWS Lambda

Set up python-lambda

We'll be using the python-lambda library.

  1. Create a new directory and call it whatever you want.

  2. Enter the new directory and run virtualenv venv from Terminal. If you don't have virtualenv, you can install it with pip install virtualenv. (A sketch combining these steps follows below.)
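As a minimal sketch of the steps so far, assuming an example directory name my-lambda-project and that python-lambda is installed into the new environment (an assumption based on the setup heading above, not a step spelled out here):

  # Create a project directory and an isolated environment for python-lambda
  mkdir my-lambda-project        # any name works; this one is a placeholder
  cd my-lambda-project
  virtualenv venv                # note: "virtualenv", not "virtualvenv"
  source venv/bin/activate       # activate the environment (macOS/Linux)
  pip install python-lambda      # assumption: the library referenced above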

// Includes functions for exporting active sheet or all sheets as JSON object (also Python object syntax compatible).
// Tweak the makePrettyJSON_ function to customize what kind of JSON to export.
var FORMAT_ONELINE = 'One-line';
var FORMAT_MULTILINE = 'Multi-line';
var FORMAT_PRETTY = 'Pretty';
var LANGUAGE_JS = 'JavaScript';
var LANGUAGE_PYTHON = 'Python';