Comparison of GeoMesa and GeoWave (single node configuration)
- PostGIS is always faster in a single-node configuration
- GeoWave uses less storage
- GeoMesa answers queries faster than GeoWave
// taken from: http://www.scalatest.org/user_guide/selecting_a_style
class TVSetSpec extends FeatureSpec with GivenWhenThen {
  info("As a TV set owner")
  info("I want to be able to turn the TV on and off")
  info("So I can watch TV when I want")
  info("And save energy when I'm not watching TV")
  feature("TV power button") {
    scenario("User presses power button when TV is off") {
      Given("a TV set that is switched off")
      val tv = new TVSet
      When("the power button is pressed")
      tv.pressPowerButton()
      Then("the TV should switch on")
      assert(tv.isOn)
    }
  }
}
The following code runs GraphHopper + web interface locally:
- using raster tiles from an external service
- computing routes locally
- from: https://github.com/graphhopper/graphhopper/blob/master/docs/core/quickstart-from-source.md
- using the PBF file "baden-wuerttemberg" from http://download.geofabrik.de/europe/germany/
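A minimal sketch of those steps, assuming the `graphhopper.sh` wrapper from the linked quickstart; the exact flags and the Geofabrik file name may differ between versions, so treat this as an outline rather than exact commands:

```shell
# download the OSM extract (file name is an assumption; check geofabrik.de)
wget http://download.geofabrik.de/europe/germany/baden-wuerttemberg-latest.osm.pbf

# build GraphHopper from source and start it with the bundled web interface
git clone https://github.com/graphhopper/graphhopper.git
cd graphhopper
./graphhopper.sh -a web -i ../baden-wuerttemberg-latest.osm.pbf
```

Routing then happens locally against the imported PBF, while the map background is fetched as raster tiles from an external service by the web UI (by default at http://localhost:8989/).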
library(raster)
# global config
INPUT_FNAME <- "/home/vlx/Work/biggis/data-julian/Auto150_georef.tif"
BAND <- 1
AGGREG_FACT <- 150
AGGREG_FUNC <- mean
# here comes the code: load one band and aggregate it with the config above
r <- raster(INPUT_FNAME, band = BAND)
r_agg <- aggregate(r, fact = AGGREG_FACT, fun = AGGREG_FUNC)
plot(r_agg)
- Aftershot does not load EXIF LensId from DNG files properly
- Tested on:
- Pentax
- Pentax and Sigma lenses
- Linux and Windows
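One way to check whether the tag is actually present in the file itself (exiftool and the sample file name are assumptions, not part of the original report):

```shell
# print the lens-related tags from a DNG file; if exiftool reports a
# LensID here, the metadata exists and the failure is in AfterShot's
# reader, not in the file
exiftool -LensID -LensType -LensInfo sample.dng
```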
## Load the library after installing GMP
library(gmp)
library(raster)
plot_tupper <- function(k) {
  ## Tupper's formula: pixel (x, y) is set iff
  ## 1/2 < floor(mod(floor((y+k)/17) / 2^(17x + mod(y+k, 17)), 2))
  tupper <- function(x, y, k) {
    z1 <- as.bigz(y + k)
    ## integer (floor) division instead of as.bigq(z1/17)
    bit <- ((z1 %/% 17) %/% as.bigz(2)^(17 * x + as.numeric(z1 %% 17))) %% 2
    as.numeric(bit)
  }
  m <- outer(0:105, 16:0, Vectorize(function(x, y) tupper(x, y, k)))
  plot(raster(t(m)))
}
# if (!require('devtools')) install.packages('devtools')
# devtools::install_github('apache/[email protected]', subdir='R/pkg')
library(SparkR)
library(sparklyr)
library(dplyr)
# use specific version of spark/hadoop
sc <- spark_connect("local", version = "2.0.2", hadoop_version = "2.7")
\usepackage[outerbars,xcolor]{changebar} % due to \cbcolor, \cbstart ... \cbend
\setlength{\changebarwidth}{3pt}
\setlength{\changebarsep}{20pt}
\definecolor{ChangebarColor}{rgb}{0.75,0.85,0.95}
\cbcolor{ChangebarColor}
% ==============================================================================
% Marking the changebar using a margin note
% ==============================================================================
# convert ";" to "," | |
tr ";" "," < input.csv > output.csv | |
# list of columns with associated numbers | |
head -1 commadata.csv | tr ',' '\n' | nl | |
# project columns 1,2,163-174 into a new file | |
cut -d',' -f 1,2,163-174 commadata.csv > SOC.csv | |
# random subsample of 100 rows (not tested) |
library(SparkR)
library(sparklyr)
library(dplyr)
# use specific version of spark/hadoop
sc <- spark_connect("local", version = "2.0.2", hadoop_version = "2.7")
# assuming comma-separated input with a header row
spark_read_csv(
  sc, "bmw", "~/SOC.csv",
  header = TRUE, delimiter = ","
)