Simeon H.K. Fitch (metasim)

🎹
View GitHub Profile
@metasim
metasim / overload-on-context-bound.scala
Created September 11, 2017 15:00
Figure out how to do overloading via more specific context bounds.
import geotrellis.spark.{SpatialComponent, TemporalComponent}

object Test {
  class Foo {
    // ...
  }

  object Foo {
    implicit val spatialComponent: SpatialComponent[Foo] = ???
  }
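
The fragment above only sets up the typeclass evidence. Below is a minimal, self-contained sketch of the general idea the gist title describes, using hypothetical typeclasses and key types rather than the geotrellis ones: two overloads of the same method, where the second carries more specific context bounds and is therefore usable only for key types that also have temporal evidence.

// Illustrative only: stand-in typeclasses and keys, not the geotrellis API.
object ContextBoundOverload {
  trait Spatial[K]
  trait Temporal[K]

  final case class SpatialKey(col: Int, row: Int)
  final case class SpaceTimeKey(col: Int, row: Int, instant: Long)

  implicit val spatialKeyIsSpatial: Spatial[SpatialKey] = new Spatial[SpatialKey] {}
  implicit val stKeyIsSpatial: Spatial[SpaceTimeKey] = new Spatial[SpaceTimeKey] {}
  implicit val stKeyIsTemporal: Temporal[SpaceTimeKey] = new Temporal[SpaceTimeKey] {}

  // Available for any key type with spatial evidence.
  def describe[K: Spatial](key: K): String = s"spatial key: $key"

  // More specific context bounds: only key types with both spatial and
  // temporal evidence can call this overload. The extra parameter keeps
  // the two signatures distinct.
  def describe[K: Spatial: Temporal](key: K, includeTime: Boolean): String =
    s"spatio-temporal key: $key (includeTime=$includeTime)"

  def main(args: Array[String]): Unit = {
    println(describe(SpatialKey(3, 4)))
    println(describe(SpaceTimeKey(3, 4, 0L), includeTime = true))
    // describe(SpatialKey(3, 4), includeTime = true) would not compile:
    // there is no Temporal[SpatialKey] instance in scope.
  }
}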

Creating RasterFrames

Initialization

There are a couple of setup steps necessary anytime you want to work with RasterFrames. The first is to import the API symbols into scope:
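
A sketch of what that setup typically looks like follows. The package name has changed across RasterFrames releases (earlier versions used astraea.spark.rasterframes, later ones org.locationtech.rasterframes), so treat this as illustrative rather than the exact published snippet:

// Illustrative setup; package and extension-method names may differ by RasterFrames version.
import org.apache.spark.sql.SparkSession
import org.locationtech.rasterframes._

// The second step is to create (or reuse) a SparkSession and register the
// RasterFrames types and functions on it.
implicit val spark: SparkSession = SparkSession.builder()
  .master("local[*]")
  .appName("RasterFrames")
  .getOrCreate()
  .withRasterFrames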

@metasim
metasim / jmh-result.json
Created April 15, 2018 20:24
WKB getNumPoints Benchmark Result
[
  {
    "jmhVersion" : "1.20",
    "benchmark" : "org.locationtech.geomesa.spark.jts.WKBNumPointsBench.deserializeNumPoints",
    "mode" : "thrpt",
    "threads" : 1,
    "forks" : 1,
    "jvm" : "/Library/Java/JavaVirtualMachines/jdk1.8.0_161.jdk/Contents/Home/jre/bin/java",
    "jvmArgs" : [
    ],
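
For context, a JMH benchmark capable of producing a result like this might look roughly as follows. This is a hypothetical sketch, not the geomesa benchmark source: the class name, geometry, and setup are invented, and only the measured operation (deserialize WKB, then call getNumPoints) is taken from the benchmark name above.

import java.util.concurrent.TimeUnit

import org.locationtech.jts.geom.{Coordinate, GeometryFactory}
import org.locationtech.jts.io.{WKBReader, WKBWriter}
import org.openjdk.jmh.annotations._

@State(Scope.Thread)
@BenchmarkMode(Array(Mode.Throughput))
@OutputTimeUnit(TimeUnit.SECONDS)
class WKBNumPointsSketch {

  var wkb: Array[Byte] = _

  @Setup
  def prepare(): Unit = {
    // Serialize a small LineString once, so the benchmark measures only
    // WKB deserialization plus the getNumPoints call.
    val gf = new GeometryFactory()
    val line = gf.createLineString(Array(
      new Coordinate(0, 0), new Coordinate(1, 1), new Coordinate(2, 3)))
    wkb = new WKBWriter().write(line)
  }

  @Benchmark
  def deserializeNumPoints(): Int =
    new WKBReader().read(wkb).getNumPoints
}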
PANCHROMATIC_LINES,NADIR_OFFNADIR,sunAzimuth,REFLECTIVE_SAMPLES,upperLeftCornerLongitude,cloudCover,MAP_PROJECTION_L1,cartURL,sunElevation,path,BPF_NAME_TIRS,THERMAL_LINES,GROUND_CONTROL_POINTS_MODEL,row,imageQuality1,REFLECTIVE_LINES,ELLIPSOID,GEOMETRIC_RMSE_MODEL,browseURL,browseAvailable,dayOrNight,CPF_NAME,DATA_TYPE_L1,THERMAL_SAMPLES,upperRightCornerLatitude,lowerLeftCornerLatitude,sceneStartTime,dateUpdated,sensor,PANCHROMATIC_SAMPLES,GROUND_CONTROL_POINTS_VERSION,LANDSAT_PRODUCT_ID,acquisitionDate,upperRightCornerLongitude,PROCESSING_SOFTWARE_VERSION,GRID_CELL_SIZE_REFLECTIVE,lowerRightCornerLongitude,lowerRightCornerLatitude,sceneCenterLongitude,COLLECTION_CATEGORY,GRID_CELL_SIZE_PANCHROMATIC,BPF_NAME_OLI,sceneCenterLatitude,CLOUD_COVER_LAND,lowerLeftCornerLongitude,GEOMETRIC_RMSE_MODEL_X,GEOMETRIC_RMSE_MODEL_Y,sceneStopTime,upperLeftCornerLatitude,UTM_ZONE,DATE_L1_GENERATED,GRID_CELL_SIZE_THERMAL,DATUM,COLLECTION_NUMBER,sceneID,RLUT_FILE_NAME,TIRS_SSM_MODEL,ROLL_ANGLE,receivingStation
16421,NADIR,162
18/09/07 16:00:12 DEBUG RasterRef$: Fetching Extent(369450.0, 3353100.0, 371763.3, 3355458.3) from HttpGeoTiffRasterSource(https://s3-us-west-2.amazonaws.com/landsat-pds/c1/L8/149/039/LC08_L1TP_149039_20170411_20170415_01_T1/LC08_L1TP_149039_20170411_20170415_01_T1_B4.TIF)
ReadMonitor(reads=0, total=45876)
+-----------+------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|spatial_key|bounds |src
swagger: '2.0'
info:
  title: Asset Catalog
  version: 0.5.1
  description: >-
    This is a derivation of the <a href="https://raw.githubusercontent.com/radiantearth/stac-spec/master/api-spec/STAC-standalone-swagger2.yaml">STAC API Specification</a>.
    This is an OpenAPI definition of the core SpatioTemporal Asset Catalog API
    specification. Any service that implements this endpoint to allow search of
    spatiotemporal assets can be considered a STAC API. The endpoint is also
    available as an OpenAPI fragment that can be integrated with other OpenAPI
@metasim
metasim / read-model.py
Created November 11, 2018 19:59
Test files for loading a saved model using RasterFrames
# Tested with Spark 2.2.1, Python 3.6.4
# Even though we've used `--py-files` on the command line,
# unintuitively this still seems necessary to import pyrasterframes
SparkContext.addPyFile(spark.sparkContext, 'pyrasterframes.zip')
from pyrasterframes import *
spark.withRasterFrames()
def set_dims(parts):
    # Validate the dimensions specification before applying it to the options.
    assert len(parts) == 2, "Expected dimensions specification to have exactly two components"
    assert all(isinstance(p, int) and p > 0 for p in parts), \
        "Expected all components in dimensions to be positive integers"
    options.update({
        "imageWidth": parts[0],
        "imageHeight": parts[1]
    })
if raster_dimensions is not None: