Radovan Kavicky (radovankavicky), 🐍 Pythonista
@radovankavicky
radovankavicky / tweet_listener.py
Created October 19, 2017 12:56 — forked from hugobowne/tweet_listener.py
Here I define a tweet listener that creates a file called 'tweets.txt', collects streaming tweets as JSON, and writes them to that file; once 100 tweets have been streamed, the listener closes the file and stops listening.
```python
import json

import tweepy


class MyStreamListener(tweepy.StreamListener):
    def __init__(self, api=None):
        super(MyStreamListener, self).__init__()
        self.num_tweets = 0
        self.file = open("tweets.txt", "w")

    def on_status(self, status):
        tweet = status._json
        self.file.write(json.dumps(tweet) + '\n')
        self.num_tweets += 1
        if self.num_tweets < 100:
            return True  # keep streaming
        self.file.close()
        return False  # stop listening after 100 tweets
```
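The listener writes each tweet as one JSON object per line (the JSON Lines convention). A minimal sketch of reading such a file back, independent of tweepy; the helper name `read_tweets` is mine, not part of the gist:

```python
import json


def read_tweets(path):
    """Read a JSON Lines file (one JSON object per line) into a list of dicts."""
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]
```

Each line parses on its own, so a partially written stream file can still be loaded up to the last complete line.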
```r
library(idbr)  # devtools::install_github('walkerke/idbr')
library(ggplot2)
library(animation)
library(dplyr)
library(ggthemes)

idb_api_key("Your Census API key goes here")

male <- idb1('JA', 2010:2050, sex = 'male') %>%
  mutate(POP = POP * -1,
```
@radovankavicky
radovankavicky / PyData Berlin 2017.md
Last active June 8, 2017 14:49
Gist for PyData Berlin 2017
@radovankavicky
radovankavicky / post-save-hook.py
Last active June 8, 2017 14:51 — forked from jbwhit/post-save-hook.py
Saves Jupyter Notebooks as .py and .html files automatically. Add to the ipython_notebook_config.py file of your associated profile.
```python
import os
from subprocess import check_call


def post_save(model, os_path, contents_manager):
    """post-save hook for converting notebooks to .py and .html files."""
    if model['type'] != 'notebook':
        return  # only do this for notebooks
    d, fname = os.path.split(os_path)
    check_call(['jupyter', 'nbconvert', '--to', 'script', fname], cwd=d)
    check_call(['jupyter', 'nbconvert', '--to', 'html', fname], cwd=d)
```
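For the hook to fire, it also has to be registered on the contents manager in the notebook configuration file; a sketch, assuming `post_save` is defined (or imported) in that same config file:

```python
# In ipython_notebook_config.py (or jupyter_notebook_config.py on newer
# Jupyter versions): register the hook so it runs after every save.
c.FileContentsManager.post_save_hook = post_save
```

The `c` object is injected by Jupyter's config machinery, so this fragment only works inside the config file itself.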
@radovankavicky
radovankavicky / references.txt
Created March 11, 2017 09:47 — forked from d0ugal/references.txt
Effective Code Review References

  • Code Complete by Steve McConnell
  • Jeff Atwood (Coding Horror): https://blog.codinghorror.com/code-reviews-just-do-it/
  • Measuring Defect Potentials and Defect Removal Efficiency: http://rbcs-us.com/site/assets/files/1337/measuring-defect-potentials-and-defect-removal-efficiency.pdf
  • Expectations, Outcomes, and Challenges Of Modern Code Review: https://www.microsoft.com/en-us/research/publication/expectations-outcomes-and-challenges-of-modern-code-review/
@radovankavicky
radovankavicky / bobp-python.md
Created March 6, 2017 21:17 — forked from sloria/bobp-python.md
A "Best of the Best Practices" (BOBP) guide to developing in Python.

The Best of the Best Practices (BOBP) Guide for Python

A "Best of the Best Practices" (BOBP) guide to developing in Python.

In General

Values

  • "Build tools for others that you want to be built for you." - Kenneth Reitz
  • "Simplicity is always better than functionality." - Pieter Hintjens
```r
library(rvest)
library(magrittr)
library(dplyr)
library(purrr)
library(lubridate)
library(tidyr)
library(ggplot2)
library(scales)

setwd("...working directory...")
```
SCRIPT_REAL is a Tableau function that returns a result from an external service script; the Python code below is passed to it as a string.

```
SCRIPT_REAL("
from nltk.sentiment import SentimentIntensityAnalyzer

text = _arg1  # _arg1 references the data column you're analyzing, here [Word]
scores = []   # Python list where the scores will be stored
sid = SentimentIntensityAnalyzer()  # NLTK analyzer; each word is passed through it to get a score

for word in text:  # loop over each row of the column passed via _arg1
    ss = sid.polarity_scores(word)  # score the word with the sentiment analyzer
```
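Outside Tableau, the same loop is plain Python: map a scoring function over a sequence of words and collect the results. A minimal sketch with a stand-in scorer; the `LEXICON` dictionary and its values here are made up for illustration, and NLTK's `SentimentIntensityAnalyzer` would take its place in the real calc:

```python
# Hypothetical mini-lexicon standing in for NLTK's VADER scorer.
LEXICON = {'good': 0.44, 'bad': -0.54, 'great': 0.62}


def polarity(word):
    """Return a compound-style score for a single word (0.0 if unknown)."""
    return LEXICON.get(word.lower(), 0.0)


def score_words(words):
    """The TabPy loop: score each row of the column, collect the results."""
    scores = []
    for word in words:
        scores.append(polarity(word))
    return scores
```

The list returned by `score_words` corresponds to the `scores` list that SCRIPT_REAL hands back to Tableau, one value per input row.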
@radovankavicky
radovankavicky / go-python-go.sh
Created December 15, 2016 19:18 — forked from dannguyen/go-python-go.sh
Batch installation script for installing fun Python libraries for compciv2017 class (on top of Anaconda/Python 3.5+)
```sh
###############################################################################
# Batch script for setting up fun Python libraries for
# Computational Methods in the Civic Sphere 2017
#
# Should be run after installing Anaconda 4.2+/Python 3.5+
# via https://www.continuum.io/downloads
#
# Doesn't include libraries that are installed as dependencies
# (e.g. numpy via pandas)
###############################################################################
```
```python
import time

import pandas as pd
from nltk.sentiment import SentimentIntensityAnalyzer

t0 = time.time()

top_100 = pd.read_csv('/Users/brit.cava/Desktop/TabPy/top100.csv')
text = top_100['Word']

sid = SentimentIntensityAnalyzer()
```
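The pattern above (load a CSV, pull out one column as the input series) can be exercised without the original top100.csv; a sketch with inline data, where the 'Word' column name matches the snippet but the rows are made up:

```python
import io

import pandas as pd

# Inline stand-in for top100.csv (made-up rows, same 'Word' column).
csv_data = io.StringIO("Word,Count\ngood,120\nbad,85\ngreat,60\n")

df = pd.read_csv(csv_data)
text = df['Word']   # the Series that would be fed to the analyzer

words = list(text)  # one entry per row, in file order
```

`df['Word']` yields a pandas Series, which iterates row by row exactly like the `for word in text` loop in the Tableau calc above.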