Robbi Bishop-Taylor (robbibt)

robbibt / mostcommon_utm.py
Created March 12, 2019 05:50
Function to obtain most common UTM zone for datacube query
from collections import Counter
import warnings


def mostcommon_utm(dc, product, query):
    '''
    Takes a given query and returns the most common UTM zone for
    observations returned for that spatial extent.

    Parameters
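The preview above cuts off inside the docstring; the core of the approach (tally the CRS of each matching dataset and take the mode) can be sketched with `collections.Counter`. The `most_common_crs` function and its `crs_list` input below are illustrative stand-ins, not the gist's actual code:

```python
from collections import Counter

def most_common_crs(crs_list):
    """Return the most frequently occurring CRS string in a list.

    `crs_list` stands in for the CRS of each dataset returned by a
    datacube query, e.g. [str(ds.crs) for ds in dc.find_datasets(...)].
    """
    crs, count = Counter(crs_list).most_common(1)[0]
    return crs

# Three observations fall in UTM zone 55, one in zone 56
print(most_common_crs(['EPSG:32755', 'EPSG:32755', 'EPSG:32756', 'EPSG:32755']))
# → EPSG:32755
```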
robbibt / gdal_cont_mosaic.py
Last active March 19, 2019 01:25
Optimised GDAL parameters for continental mosaicing
# Build a virtual mosaic (VRT) from the individual GeoTIFF tiles
!gdalbuildvrt /g/data/r78/rt1527/nidem/output_data/mosaics/NIDEM_uncertainty.vrt ../output_data/geotiff/nidem_uncertainty/NIDEM_uncertainty*.tif

# Convert the VRT to a single compressed, internally tiled GeoTIFF
!gdal_translate \
    -co COMPRESS=DEFLATE \
    -co ZLEVEL=9 \
    -co PREDICTOR=1 \
    -co TILED=YES \
    -co BLOCKXSIZE=1024 \
    -co BLOCKYSIZE=1024 \
    /g/data/r78/rt1527/nidem/output_data/mosaics/NIDEM.vrt /g/data/r78/rt1527/nidem/output_data/mosaics/NIDEM_mosaic.tif
robbibt / get_native_metadata.py
Last active March 26, 2019 03:17
Get native resolution and CRS from a OpenDataCube dataset measurement
import datacube
from datacube.testutils.io import rio_slurp
from datacube.storage import measurement_paths

# Connect to the datacube and grab the first matching dataset
dc = datacube.Datacube(app='native_metadata')
ds = dc.find_datasets(product='ls8_nbar_scene')[0]

# rio_slurp returns (array, metadata); the metadata carries the native
# transform and CRS of band 3's file on disk
band_path = measurement_paths(ds)['3']
raster_meta = rio_slurp(band_path)
raster_meta[1].transform
raster_meta[1].crs
robbibt / appending_geopandas.py
Last active April 16, 2022 09:16
Appending GeoPandas data to file
# Author: Claire Krause
import os
import fiona
from fiona.crs import from_epsg
from shapely.geometry import mapping, shape

# Save the polygons to a shapefile, appending if the file already exists
schema = {'geometry': 'Polygon', 'properties': {'area': 'str'}}
if os.path.isfile('test.shp'):
    with fiona.open('test.shp', 'a', crs=from_epsg(3577), driver='ESRI Shapefile', schema=schema) as output:
        for ix, poly in MergedPolygonsGPD.iterrows():
            output.write({'properties': {'area': poly['area']}, 'geometry': mapping(shape(poly['geometry']))})
else:
    with fiona.open('test.shp', 'w', crs=from_epsg(3577), driver='ESRI Shapefile', schema=schema) as output:
        for ix, poly in MergedPolygonsGPD.iterrows():
            output.write({'properties': {'area': poly['area']}, 'geometry': mapping(shape(poly['geometry']))})
robbibt / extract_tides.py
Last active March 26, 2019 03:16
Extract tide time series for location and time range
import pandas as pd
from otps import TimePoint
from otps import predict_tide

# Alternative tidepost locations (only the uncommented one takes effect):
# tidepost_lat, tidepost_lon = -12.213764, 131.824570  # Point Stuart
# tidepost_lat, tidepost_lon = -13.315045, 130.234212  # Daly River
tidepost_lat, tidepost_lon = -14.957638244600385, 129.5448589323929  # Joseph Bonaparte Gulf

# Use the OTPS tidal model to compute tide heights for each observation:
date_range = pd.date_range("2013-01-01", "2019-05-05", freq="1h")
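The preview ends before the prediction step. As a hedged sketch of the input preparation only (the otps `TimePoint`/`predict_tide` calls are assumed from the gist's imports and not run here), the hourly range converts to plain `datetime` objects of the kind point-based tide models expect:

```python
import pandas as pd

# A shortened hourly range (the gist itself spans 2013-2019)
date_range = pd.date_range("2013-01-01", "2013-01-02", freq="1h")

# Convert pandas Timestamps to plain datetime objects; with otps available,
# each would be wrapped as TimePoint(tidepost_lon, tidepost_lat, dt) before
# being passed to predict_tide (assumed usage, not verified here)
datetimes = date_range.to_pydatetime()

print(len(datetimes))  # 25: both endpoints are included
```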
robbibt / simple_dcload.py
Last active October 27, 2020 03:31
Ready-to-run datacube.load example
import datacube
from datacube.utils import geometry
from datacube.utils.geometry import CRS
# Connect to datacube
dc = datacube.Datacube(app='Simple example')
# Set up analysis data query
query = {
'x': (1754576.964742866, 1762576.964742866),
robbibt / mask_3D_array_with_2D.py
Created March 29, 2019 00:07
Mask a 3D xarray dataset with a 2D elevation surface
# Author: Imam Alam
# imports
import numpy as np
import pandas as pd
import xarray as xr
# define an example volume, roughly MGA xy, + elevation
x = range(50000, 52000, 100)
y = range(800000, 802000, 100)
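The preview stops at the grid definition; the masking step itself (keeping volume cells at or above a 2D elevation surface) can be sketched with NumPy broadcasting. The shapes and variable names below are illustrative assumptions, not the gist's own:

```python
import numpy as np

# Example volume: 4 elevation levels (z) over a 3 x 3 grid (y, x)
z_levels = np.array([0.0, 10.0, 20.0, 30.0])
volume = np.arange(4 * 3 * 3, dtype=float).reshape(4, 3, 3)

# A 2D elevation surface on the same (y, x) grid
surface = np.full((3, 3), 15.0)

# Broadcast (z, 1, 1) against (y, x) to get a (z, y, x) boolean mask,
# then blank out every cell that sits below the surface
mask = z_levels[:, None, None] >= surface
masked = np.where(mask, volume, np.nan)
```

The same broadcasting works against an xarray DataArray's elevation coordinate, which is how a datacube-style (time/z, y, x) stack would be masked.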
robbibt / animated_fade.py
Created April 2, 2019 02:36
Take two images and export frames fading between both images with a smooth transition
import imageio
import numpy as np

# Load the 'before' and 'after' images
from_array = imageio.imread('Visualisation/pilbara/pilbara_before.png')
to_array = imageio.imread('Visualisation/pilbara/pilbara_after.png')

# Stack both images along a new final axis so they can be averaged
stacked_from_to = np.stack([to_array, from_array], axis=3)

# Blend weight for each frame, and how long each frame is displayed (ms)
weights = [0.2, 0.4, 0.6, 0.8, 1.0, 0.8, 0.6, 0.4, 0.2, 0]
milliseconds = [50, 50, 50, 50, 3500, 50, 50, 50, 50, 3500]
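The preview ends before the frames are written out; the per-frame blend implied by `weights` is a weighted average of the two images. This is a minimal sketch using tiny synthetic arrays in place of the loaded PNGs:

```python
import numpy as np

def fade_frame(from_img, to_img, weight):
    """Blend from_img toward to_img: weight 0 gives from_img, 1 gives to_img."""
    return (1 - weight) * from_img + weight * to_img

# Tiny stand-ins for the 'before' and 'after' images
from_array = np.zeros((2, 2, 3))
to_array = np.ones((2, 2, 3))

# One frame, 40% of the way through the fade
frame = fade_frame(from_array, to_array, 0.4)
print(frame[0, 0, 0])  # 0.4
```

Each weight in the gist's `weights` list would produce one such frame, held for the matching entry in `milliseconds` when assembled into a GIF.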
robbibt / load_netCDFs_into_xarray.py
Last active May 28, 2022 12:33
Load multiple netCDF files into a single xarray dataset, using data from global file attributes to populate a new dimension (e.g. time)
import glob
import xarray as xr
from datetime import datetime
# List all matching files
files = glob.glob('/g/data/r78/mc9153/tide_otps/L3_2008_nc3/*.L3m')
# Create a list to hold the dataset loaded from each file
individual_files = []
robbibt / GA_landsat_collection_3.py
Last active September 24, 2019 00:25
GA Landsat Collection 3 details
# For anyone putting together notebooks that use Collection Upgrade/Collection 3
# Landsat, the product names are:
# • ga_ls5t_ard_3 (Landsat 5 ARD)
# • ga_ls7e_ard_3 (Landsat 7 ARD)
# • ga_ls8c_ard_3 (Landsat 8 ARD)
# The Collection 3 samples on the NCI are stored in a different database, so for
# now we can use the try/except below to connect to the datacube differently on
# both the NCI and the Sandbox (we'll hopefully just be able to connect using