Doing a Sentinel query over the last 120 days shows that the availability of imagery is very location dependent, for example at these two sites (a sketch of such a query follows below).
Scullins Monument
Hurd Point
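The notes don't show the query itself; a minimal sketch of that kind of search with rstac against the Earth Search STAC API (the endpoint, the collection name, and the bounding box values are my assumptions, not from the original):

library(rstac)
## last 120 days of Sentinel-2 L2A around a point of interest (bbox values are placeholders)
bbox <- c(66.5, -67.9, 66.9, -67.7)   ## xmin, ymin, xmax, ymax in lon/lat
dt <- sprintf("%sT00:00:00Z/%sT23:59:59Z", Sys.Date() - 120, Sys.Date())
items <- stac("https://earth-search.aws.element84.com/v1") |>
  stac_search(collections = "sentinel-2-l2a", bbox = bbox, datetime = dt, limit = 100) |>
  post_request()
items_length(items)   ## how many scenes are available at this location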
Code to do a small extract in R:
https://rstats.me/@mdsumner/114729203537529604
Global Ensemble Digital Terrain Model 30m (GEDTM30)
https://zenodo.org/records/15490367
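The linked post has the actual code; as a rough sketch of the pattern only (the COG URL below is a placeholder, substitute the real GEDTM30 file from the Zenodo record, and the point coordinates are arbitrary example values):

library(terra)
dtm <- rast("/vsicurl/https://example.org/GEDTM30.tif")   ## placeholder URL, not the real file
extract(dtm, cbind(66.7, -67.8))                          ## lon,lat of a point of interest (example only)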
library(terra)
library(sf)
## a random local LAEA projection string, and the longlat source CRS
rproj <- function() sprintf("+proj=laea +lon_0=%f +lat_0=%f", runif(1, -180, 180), runif(1, -90, 90))
ll <- function() "EPSG:4326"
## a largish set of random lon,lat points to reproject
pts <- geosphere::randomCoordinates(1e5)
terra_ <- function(x) {
  terra::project(x, to = rproj(), from = ll())
}
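library(sf) is loaded but only the terra path is shown above; presumably the counterpart was an sf projection for comparison. A minimal sketch of that (the sf_ helper and the timing calls are my additions, not from the original):

sf_ <- function(x) {
  sf::sf_project(from = ll(), to = rproj(), pts = x)
}
system.time(terra_(pts))
system.time(sf_(pts))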
First, clone the repo (later we'll read remotely without this):
git clone https://opensource.unicc.org/open-source-united-initiative/un-tech-over/challenge-1/ahead-of-the-storm-challenge1-datasets
cd ahead-of-the-storm-challenge1-datasets
Now, from inside the repo, read the data in R.
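A minimal sketch of doing that (the repo's file layout is an assumption, so the extension patterns below are placeholders):

library(terra)
files <- list.files(".", recursive = TRUE, full.names = TRUE)
## pick out raster and vector sources (assumptions about what the repo contains)
tifs <- grep("\\.tif$", files, value = TRUE)
gpkgs <- grep("\\.gpkg$", files, value = TRUE)
if (length(tifs) > 0) r <- rast(tifs[1])
if (length(gpkgs) > 0) v <- vect(gpkgs[1])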
## we can get that WKT by using Oblique Mercator params
prj <- "+proj=omerc +lonc=-122.6 +lat_0=40 +gamma=21 +x_0=1000000"
shp <- "/vsicurl/https://github.com/noaa-nwfsc/VMS-pipeline/raw/refs/heads/main/spatial_data/map_rotation/fivekm_grid_extent_rect_21.shp"
library(terra)
#> terra 1.8.52
v <- vect(shp)
wkt0 <- crs(prj)
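As a quick sanity check (my addition, not in the original), compare the generated WKT with whatever CRS the shapefile itself carries:

cat(wkt0, "\n")
crs(v, proj = TRUE)   ## the CRS stored on the shapefile, in PROJ string form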
## `vsidir` (a /vsicurl/ directory) and `vars` (variable-name prefixes) are defined earlier
files <- tibble::tibble(filename = gdalraster::vsi_read_dir(vsidir)) |>
  dplyr::mutate(source = glue::glue("{vsidir}/{filename}")) |>
  dplyr::mutate(longitude = glue::glue("vrt://{source}?sd_name=longitude"),
                latitude = glue::glue("vrt://{source}?sd_name=latitude"))
## keep only the files matching our variable prefixes, in `vars` order
files <- do.call(rbind, lapply(vars, \(.x) dplyr::filter(files, stringr::str_detect(filename, glue::glue("^{.x}_")))))
i <- 1
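With that table in hand, opening the coordinate subdatasets for one row looks something like this (a sketch; it assumes the sources really expose longitude/latitude subdatasets, as the vrt:// strings above imply):

library(terra)
lon <- rast(files$longitude[i])
lat <- rast(files$latitude[i])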
## we can probably save this for subsequent runs
## read a GeoTIFF straight out of a remote CRAN source tarball, no download needed
pkgurl <- "https://cran.r-project.org/src/contrib/terra_1.8-50.tar.gz"
dsn0 <- sprintf("/vsitar//vsicurl/%s", pkgurl)
## list the tarball contents and find the first .tif
files <- gdalraster::vsi_read_dir(dsn0, recursive = TRUE)
tif0 <- grep("\\.tif$", files, value = TRUE)[1]
tif <- sprintf("%s/%s", dsn0, tif0)
new(gdalraster::GDALRaster, tif)
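The same chained VSI path works directly in terra too (assuming network access is available and the tarball is still at that URL):

library(terra)
r <- rast(tif)
r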
library(bowerbird)
my_source <- bb_source(
  name = "Chlorophyll-a concentration in seawater (not log-transformed), generated by as a blended combination of OCI, OCI2, OC2 and OCx algorithms, depending on water class memberships",
  id = "ESACCI-OC-L3S-CHLOR_A-MERGED",
  description = "European Space Agency Climate Change Initiative composites of merged sensor (MERIS, MODIS Aqua, SeaWiFS LAC & GAC, VIIRS, OLCI) products.",
  doc_url = "http://esa-oceancolour-cci.org",
  source_url =
    c("https://www.oceancolour.org/thredds/catalog/cci/v6.0-release/geographic/monthly/chlor_a/catalog.html",
      "https://www.oceancolour.org/thredds/catalog/cci/v6.0-release/geographic/daily/chlor_a/catalog.html",
Just a bit of response to that list, and to the claim that no library aims to port more of GDAL in a lazy way: there is osgeo.gdal/ogr, which is very nearly GDAL-complete and includes multidimensional mode for rasters, including the ability to cast from "classic" (2D) raster to mdim and vice versa.
It's a problem that sits occluded by the user-facing package 'rasterio'. We've had a very similar problem in R for a long time (rgdal, and now sf/terra, standing in for the underlying library), but now we have gdalraster, with much better API fidelity (it includes vector, despite the name, and good progress has been made on multidim too).
I'm surprised this is still a languishing topic, but I'd recommend Python/xarray folks look hard at the bindings that GDAL itself ships, and especially at the new CLI (released in GDAL 3.11), which provides a lot more flexibility and consistency for crafting abstracted pipelines without even rendering intermediate outputs.
https://proj.org/en/stable/operations/projections/omerc.html
prj <- "+proj=omerc +lonc=-119 +lat_0=37 +gamma=30"
m <- do.call(cbind, maps::map("state", "california", plot = F)[1:2])
library(terra)
#> terra 1.8.50
plot(terra::project(m, to = prj, from = "EPSG:4326"), pch = ".", asp = 1/cos(37 * pi/180), type = "b")
#> Warning: [project] 3 failed transformations
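My guess (not stated above) is that the failed transformations are the NA rows maps::map() uses to separate polygon pieces; a quick check:

sum(is.na(m[, 1]))   ## compare with the number of failed transformations in the warning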