- AWS account
- AWS API user access key and secret
- Bitbucket account
- Create or update an existing Bitbucket repo
- Open https://bitbucket.org/USERNAME/REPO/addon/pipelines/deployments (see the deploy-step sketch below)
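For context, here is a minimal sketch of the kind of deploy step such a pipeline might run, assuming the AWS key and secret are stored as the repository variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (Bitbucket Pipelines exposes repository variables as environment variables) and that a hypothetical build directory is synced to a hypothetical S3 bucket:

#!/usr/bin/env bash
# Hedged deploy-step sketch; region, build directory, and bucket name are placeholders.
set -euo pipefail

# The AWS CLI reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment.
export AWS_DEFAULT_REGION="${AWS_DEFAULT_REGION:-us-east-1}"

# Sync the built artifacts to S3.
aws s3 sync ./build "s3://my-example-bucket" --delete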
# United States Postal Service (USPS) abbreviations.
abbreviations = [
    # https://en.wikipedia.org/wiki/List_of_states_and_territories_of_the_United_States#States.
    "AK", "AL", "AR", "AZ", "CA", "CO", "CT", "DE", "FL", "GA", "HI", "IA",
    "ID", "IL", "IN", "KS", "KY", "LA", "MA", "MD", "ME", "MI", "MN", "MO",
    "MS", "MT", "NC", "ND", "NE", "NH", "NJ", "NM", "NV", "NY", "OH", "OK",
    "OR", "PA", "RI", "SC", "SD", "TN", "TX", "UT", "VA", "VT", "WA", "WI",
    "WV", "WY",
    # https://en.wikipedia.org/wiki/List_of_states_and_territories_of_the_United_States#Federal_district.
    "DC",
]
sudo apt-get update
sudo apt-get upgrade
# Added by me
sudo apt-get install freeglut3 freeglut3-dev libtbb-dev libqt4-dev
# Copied from pyimagesearch.com
sudo apt-get install build-essential cmake git pkg-config
sudo apt-get install libjpeg8-dev libtiff4-dev libjasper-dev libpng12-dev
sudo apt-get install libgtk2.0-dev
sudo apt-get install libavcodec-dev libavformat-dev libswscale-dev libv4l-dev
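With the dependencies in place, the usual next steps are to fetch the OpenCV sources and build them. A hedged sketch (OpenCV 3.2.0 assumed; adjust paths and the -j flag to your machine):

cd ~
wget -O opencv.zip https://github.com/opencv/opencv/archive/3.2.0.zip
unzip opencv.zip
cd opencv-3.2.0
mkdir build && cd build
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local ..
make -j4
sudo make install
sudo ldconfig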
from pyspark.mllib.stat import Statistics
import pandas as pd

# The result can be used with seaborn's heatmap.
def compute_correlation_matrix(df, method='pearson'):
    # Wrapper around
    # https://forums.databricks.com/questions/3092/how-to-calculate-correlation-matrix-with-all-colum.html
    df_rdd = df.rdd.map(lambda row: row[0:])
    corr_mat = Statistics.corr(df_rdd, method=method)
    corr_mat_df = pd.DataFrame(corr_mat,
                               columns=df.columns,
                               index=df.columns)
    return corr_mat_df
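A brief usage sketch, assuming a running SparkSession named `spark`, a DataFrame whose columns are all numeric, and seaborn/matplotlib installed; the sample data is made up for illustration:

import seaborn as sns
import matplotlib.pyplot as plt

df = spark.createDataFrame(
    [(1.0, 2.0, 3.0), (2.0, 4.1, 5.9), (3.0, 6.2, 9.1)],
    ["a", "b", "c"],
)

corr = compute_correlation_matrix(df, method="pearson")
sns.heatmap(corr, annot=True, cmap="coolwarm")
plt.show()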
'''
Using OpenCV, this script takes an mp4 video and produces a number of images.

Requirements
----
OpenCV 3.2 must be installed.

Run
----
Open main.py and edit the path to the video. Then run:

$ python main.py
'''
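What such a main.py presumably looks like, as a minimal hedged sketch (the video path and output filename pattern are placeholders, not the original code):

import cv2

VIDEO_PATH = "path/to/video.mp4"  # edit this to point at your video

cap = cv2.VideoCapture(VIDEO_PATH)
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Write each decoded frame out as a numbered image.
    cv2.imwrite("frame_%05d.png" % frame_idx, frame)
    frame_idx += 1
cap.release()
print("Wrote %d images" % frame_idx)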
import sys
import logging
from typing import Optional, Dict
from colorama import Fore, Back, Style


class ColoredFormatter(logging.Formatter):
    """Colored log formatter."""
#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
Over the last few months, I have tried many different formulations for calculating the Standardized Precipitation Index (SPI) from rainfall data in netCDF format; see the files below for background:
The reason I use rainfall in netCDF format in the files above is that the software I use to compute SPI, the climate-indices Python package, only accepts a single netCDF file as input, and the SPI script reads that input along its time dimension.
Converting raster files into netCDF is easy using GDAL or other GIS software, but producing a time-enabled netCDF (one with a proper time dimension) is the harder part.
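One way to build such a file, as a hedged sketch using xarray and rioxarray to stack monthly GeoTIFF rasters into a single netCDF with a time coordinate (the file pattern, dates, and variable name are assumptions, and climate-indices may additionally expect particular dimension ordering and units):

import glob
import pandas as pd
import rioxarray
import xarray as xr

# Hypothetical monthly rainfall rasters, one file per month, sorted by date.
paths = sorted(glob.glob("rainfall_*.tif"))
dates = pd.date_range("2000-01-01", periods=len(paths), freq="MS")

# Stack the 2-D rasters along a new "time" dimension.
da = xr.concat(
    [rioxarray.open_rasterio(p, masked=True).squeeze("band", drop=True) for p in paths],
    dim=pd.Index(dates, name="time"),
)
da.name = "prcp"  # variable name is a placeholder
da.to_netcdf("rainfall_time_enabled.nc")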
# Copyright 2024 Gordon D. Thompson, [email protected]
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,