Michael Sumner (mdsumner)

  • Integrated Digital East Antarctica, Australian Antarctic Division
  • Hobart, Australia
sf::gdal_utils("vectortranslate", "OAPIF:https://ogc-api.nrw.de/inspire-us-feuerwehr",
               tf <- tempfile(tmpdir = "/vsimem", fileext = ".fgb"),
               options = c("-f", "FlatGeobuf", "-where", "name = 'Schwelm'"))
terra::vect(tf)
 class       : SpatVector
 geometry    : points
 dimensions  : 1, 22  (geometries, attributes)
 extent      : 381347.8, 381347.8, 5682950, 5682950  (xmin, xmax, ymin, ymax)
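
For comparison, the same /vsimem FlatGeobuf can be read back with sf in the same session (a minimal sketch, assuming the gdal_utils() call above has already run so that tf still exists):

library(sf)
read_sf(tf)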
import json
import s3fs

bucket = 'podaac-ops-cumulus-protected'

# paste the literal JSON available (after Earthdata login) at
# https://archive.podaac.earthdata.nasa.gov/s3credentials
creds = json.loads("""<the literal json available at https://archive.podaac.earthdata.nasa.gov/s3credentials>""")

s3 = s3fs.S3FileSystem(key=creds['accessKeyId'],
                       secret=creds['secretAccessKey'],
                       token=creds['sessionToken'],
                       client_kwargs={'region_name': 'us-west-2'})

Crop out NZ from COP30 or GEBCO:

export dsn=/vsicurl/https://opentopography.s3.sdsc.edu/raster/COP30/COP30_hh.vrt
gdalwarp $dsn -te 165 -51 180 -33 -ts 1024 0 nz.tif

If you want bathymetry as well, use GEBCO (it will also be a lot quicker); maybe do that and then mask, see gdal_calc.py on Linux.
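
The same crop can be driven from R with sf::gdal_utils(), which wraps gdalwarp and behaves the same on any OS (a minimal sketch using the COP30 DSN above; a GEBCO VRT DSN would be substituted in exactly the same way):

library(sf)
dsn <- "/vsicurl/https://opentopography.s3.sdsc.edu/raster/COP30/COP30_hh.vrt"
## warp the global VRT to a 1024-wide GeoTIFF over the NZ extent
sf::gdal_utils("warp", dsn, "nz.tif",
               options = c("-te", "165", "-51", "180", "-33", "-ts", "1024", "0"))
terra::rast("nz.tif")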

Rely on the geolocation arrays and GDAL warp:

gdalwarp -te -5500000 -5500000 5500000 5500000 -t_srs "+proj=geos +lon_0=140.7 +h=35785863 +x_0=0 +y_0=0 +a=6378137 +b=6356752.3 +units=m +no_defs" vrt://AHI-CHGT_v1r1_h09_s202403040030204_e202403040039398_c202403040047497.nc?sd_name=CldTopTemp test.vrt

Or, assign the grid and ignore the geolocation arrays (in this case we also need GDAL_NETCDF_BOTTOMUP=NO):
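
A minimal sketch of that second approach from R, via gdal_translate in sf::gdal_utils(); the full-disk bounds and the geos projection string are reused from the warp example above and are an assumption, so check them against your file:

library(sf)
Sys.setenv(GDAL_NETCDF_BOTTOMUP = "NO")   ## per the note above, don't flip the rows
src <- "vrt://AHI-CHGT_v1r1_h09_s202403040030204_e202403040039398_c202403040047497.nc?sd_name=CldTopTemp"
## assign the satellite grid directly: srs plus upper-left / lower-right corners
sf::gdal_utils("translate", src, "ahi_geos.tif",
               options = c("-a_srs", "+proj=geos +lon_0=140.7 +h=35785863 +x_0=0 +y_0=0 +a=6378137 +b=6356752.3 +units=m +no_defs",
                           "-a_ullr", "-5500000", "5500000", "5500000", "-5500000"))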

library(nert)
r <- read_smips(day = "2024-01-01")

## sources() finds the actual DSN (data source name / description) of the source data, but it includes your TERN API key so we can't show it here
library(terra)
par(mfrow = c(1, 5))
## get different zoom levels
dsn <- "WMTS:https://services.arcgisonline.com/arcgis/rest/services/World_Imagery/MapServer/WMTS/1.0.0/WMTSCapabilities.xml"

ex <- c(-1, 1, -1, 1) * 5000  ## 5km buffer either side of the point
pt <- cbind(147, -42)         ## lon,lat, anywhere you want
res <- c(10, 10)              ## 10m pixels; obviously update this in line with the buffer above to get the nx,ny dimension you actually want
## use a local projection, because this works anywhere; you only need to modify ex and res (or dim: ncols/nrows)
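
One way to finish this off (a sketch, not the original gist's code): build a local equal-area projection around pt, then warp the WMTS into that grid at a series of coarsening resolutions, so GDAL picks a different zoom level of the imagery each time:

crs_local <- sprintf("+proj=laea +lon_0=%f +lat_0=%f +datum=WGS84", pt[1, 1], pt[1, 2])
for (f in c(1, 2, 4, 8, 16)) {
  tf <- tempfile(fileext = ".tif")
  ## -te wants xmin ymin xmax ymax; ex is stored as xmin, xmax, ymin, ymax
  sf::gdal_utils("warp", dsn, tf,
                 options = c("-t_srs", crs_local,
                             "-te", as.character(ex[c(1, 3, 2, 4)]),
                             "-tr", as.character(res * f)))
  terra::plotRGB(terra::rast(tf))
}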

A small list of global datasets on Pawsey object storage.

'NSIDC_SEAICE_PS_S25km' and 'NSIDC_SEAICE_PS_N25km' are the 25km sea ice concentrations from 1978 to now.

'SEALEVEL_GLO_PHY_L4' is Copernicus altimetry, global longlat since 1993 (ssh, u/v currents).

'oisst-avhrr-v02r01' is OISST 25km global longlat SST since 1982.

Virtual Zarr referencing NetCDF objects on Pawsey storage, stored as "kerchunk Parquet".

Following on from https://bsky.app/profile/jerry-shannon.bsky.social/post/3lcisatdppk2h:

gdalinfo /vsigs/gcp-public-data-arco-era5/ar/full_37-1h-0p25deg-chunk-1.zarr-v3
ERROR 15: No valid GCS credentials found. For authenticated requests, you need to set GS_SECRET_ACCESS_KEY, GS_ACCESS_KEY_ID, 
GS_OAUTH2_REFRESH_TOKEN, GOOGLE_APPLICATION_CREDENTIALS, or other configuration options, or create a /home/ubuntu/.boto file. Consult
https://gdal.org/en/stable/user/virtual_file_systems.html#vsigs-google-cloud-storage-files for more details. For unauthenticated requests
on public resources, set the GS_NO_SIGN_REQUEST configuration option to YES.
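
As the error text says this is a public bucket, so the fix is to turn off request signing (shell equivalent: gdalinfo --config GS_NO_SIGN_REQUEST YES ...); a minimal sketch from R:

library(sf)
Sys.setenv(GS_NO_SIGN_REQUEST = "YES")  ## unauthenticated access to the public bucket
sf::gdal_utils("info", "/vsigs/gcp-public-data-arco-era5/ar/full_37-1h-0p25deg-chunk-1.zarr-v3")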
rast("/vsis3/idea-10.7289-v5sq8xb5/www.ncei.noaa.gov/data/sea-surface-temperature-optimum-interpolation/v2.1/access/avhrr/198109/oisst-avhrr-v02r01.19810901.nc")
## function to split "label: text" lines into data.frame(label = text)
v1 <- function(x) {
  x <- gsub(": ", ":", x)  ## first clear the whitespace after the separator
  ## column 1 is the label, column 2 the text value
  dd <- read.delim(text = x, sep = ":", header = FALSE)
  ## one-column data.frame of the values, named by the (first) label
  setNames(data.frame(dd[[2]]), dd[[1]][1])
}
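
A quick usage sketch with made-up input lines:

v1(c("name: Schwelm", "name: Hobart"))
## a one-column data.frame with column "name" and values "Schwelm", "Hobart"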

library(sf)
sf::sf_use_s2(FALSE)