In case anyone is interested, I've been trying to use turf.js on both Mapbox and Mapzen vector tiles, and I've learned a few things I wish I'd known going in.
This is something I've been working on for a month or so, here and there, to make maps like this:
#!/bin/bash
set -ex
GDALBUILD="$(realpath `dirname $BASH_SOURCE`)/build"
GDALINST="/usr/local/gdal"
CPUS=4
GDALOPTS=" --with-webp=yes \
    --with-geos=/usr/local/bin/geos-config \
    --with-static-proj4=/usr/local \
These are my notes for taking the Microsoft US Building Footprints and splitting them into more manageable chunks based on US Census Tracts.
All of this happened on an m5.xlarge in AWS and used about 300 GB of EBS over the course of a few hours.
Make a filesystem on the EBS volume and mount it:
sudo mkfs.xfs /dev/nvme1n1
sudo mount /dev/nvme1n1 /mnt
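One way to do the per-tract split is to clip the national footprints file against each census tract with ogr2ogr. The sketch below builds such a command in Python; the file names, the `GEOID` field, and the example tract ID are assumptions for illustration, not from the original notes.

```python
# Hypothetical sketch: clip footprints to a single census tract with ogr2ogr.
# "footprints.geojson", "tracts.shp", and the GEOID value are placeholder names.
import subprocess


def clip_tract(geoid: str) -> list[str]:
    """Build an ogr2ogr command that clips the footprints to one tract."""
    return [
        "ogr2ogr",
        "-f", "GeoJSON",
        f"tracts/{geoid}.geojson",   # output: one file per tract
        "footprints.geojson",        # input: the national footprints
        "-clipsrc", "tracts.shp",    # clip against the tract polygons...
        "-clipsrcwhere", f"GEOID = '{geoid}'",  # ...restricted to one GEOID
    ]


cmd = clip_tract("36061000100")
# subprocess.run(cmd, check=True)  # run this per tract on the instance
```

Looping this over every GEOID is embarrassingly parallel, which is why a few hours on an m5.xlarge is plausible.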
Before diving too deeply into the various friction points when working with archives of earth observation data in xarray, let's look at a more optimal case from the earth systems world. The notebook here demonstrates how zarr's consolidated metadata option, which gathers the dimensional and chunk reference information into a single place, lets a massive dataset's dimensions and variables be loaded extremely quickly. With this consolidated metadata available to reference chunks on disk, we can leverage xarray's dask integration to use normal xarray operations to lazily load chunks in parallel and perform our calculations using dask's blocked algorithm implementations. Gravy.
But the earth observation story is more complicated: not everything lives in standardized file containers, and, more importantly, our grid coordinate systems are "all over the map" :] Here are some of the current challenges.
from fastapi import FastAPI
from pydantic import BaseModel
import torch
import torchvision
import numpy as np
from PIL import Image
import random
import requests
import base64
import io