import random

# Print 5000 XML <item> elements, each with a random color.
# "{:06X}" zero-pads the color so it is always a valid six-digit hex code.
for value in range(5000):
    print('<item alpha="255" value="{}" label="{}" color="#{:06X}"/>'.format(value, value, random.randint(0, 16777215)))
#!/usr/bin/env python
__author__ = 'kersten.clauss'
__date__ = '28.03.2015'
"""Script to create a vector/polygon dataset from Hansen's 30m global tree cover dataset
(http://earthenginepartners.appspot.com/science-2013-global-forest/download_v1.1.html).

Each tile will be downloaded, filtered to >75% tree cover for 2013 and merged into a global
vector dataset. The format of the vector dataset is SpatiaLite/SQLite. It is advised to use
the resampling to 250 m option, otherwise the resulting dataset becomes very large.
"""
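
A minimal sketch of the per-tile processing this header describes, assuming the GDAL command-line tools (gdalwarp, gdal_calc.py, gdal_polygonize.py, ogr2ogr) are installed and on the PATH. The file names, the layer name 'treecover' and the helper process_tile() are illustrative, not the author's actual implementation, and the download step is omitted (the tile GeoTiff is assumed to be present locally):

import os
import subprocess

def process_tile(tile_tif, out_db='global_treecover.sqlite'):
    """Resample, threshold and polygonize one Hansen tile, then append it to a SpatiaLite DB."""
    resampled = tile_tif.replace('.tif', '_250m.tif')
    mask = tile_tif.replace('.tif', '_mask.tif')
    polygons = tile_tif.replace('.tif', '.shp')

    # Resample to roughly 250 m; the tiles are in geographic coordinates,
    # and 0.00225 degrees corresponds to about 250 m at the equator.
    subprocess.check_call(['gdalwarp', '-tr', '0.00225', '0.00225', '-r', 'average',
                           tile_tif, resampled])

    # Keep only cells with more than 75% tree cover (1 = forest, 0 = NoData).
    subprocess.check_call(['gdal_calc.py', '-A', resampled, '--outfile=' + mask,
                           '--calc=A>75', '--type=Byte', '--NoDataValue=0'])

    # Vectorize the forest mask; NoData cells are skipped, so only forest polygons remain.
    subprocess.check_call(['gdal_polygonize.py', mask, '-f', 'ESRI Shapefile',
                           polygons, 'treecover'])

    # Create the SpatiaLite database on the first tile, append on all later ones.
    merge = ['ogr2ogr', '-f', 'SQLite', '-dsco', 'SPATIALITE=YES', '-nln', 'treecover']
    if os.path.exists(out_db):
        merge += ['-update', '-append']
    subprocess.check_call(merge + [out_db, polygons])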
__author__ = 'kersten.clauss'
"""Benchmark different GeoTiff compression algorithms.

Usage: GTiff_compression_benchmark.py some_geo.tif

Requires the GDAL tools to be present and executable from the command line of your system.
This script will take a GeoTiff as input and create copies of it with different compression
algorithms. It measures the file size, compression and decompression times and returns them
as a table.
"""
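
A minimal sketch of such a benchmark, assuming gdal_translate is on the PATH; the chosen set of algorithms (NONE, PACKBITS, LZW, DEFLATE), the output file names and the table layout are illustrative and need not match the author's script. Decompression time is approximated by reading each compressed copy back into an uncompressed file:

import os
import subprocess
import sys
import time

# GTiff creation options to benchmark.
ALGORITHMS = ['NONE', 'PACKBITS', 'LZW', 'DEFLATE']

def benchmark(src):
    results = []
    for algo in ALGORITHMS:
        compressed = 'compressed_{}.tif'.format(algo)
        roundtrip = 'decompressed_{}.tif'.format(algo)

        # Compression: copy the input GeoTiff with the given algorithm.
        start = time.time()
        subprocess.check_call(['gdal_translate', '-q', '-co', 'COMPRESS=' + algo,
                               src, compressed])
        compress_time = time.time() - start

        # Decompression (proxy): read the compressed copy back into an uncompressed file.
        start = time.time()
        subprocess.check_call(['gdal_translate', '-q', '-co', 'COMPRESS=NONE',
                               compressed, roundtrip])
        decompress_time = time.time() - start

        size_mb = os.path.getsize(compressed) / 1024.0 / 1024.0
        results.append((algo, size_mb, compress_time, decompress_time))
    return results

if __name__ == '__main__':
    print('{:<10}{:>12}{:>15}{:>17}'.format('algorithm', 'size [MB]', 'compress [s]', 'decompress [s]'))
    for algo, size_mb, c_time, d_time in benchmark(sys.argv[1]):
        print('{:<10}{:>12.1f}{:>15.2f}{:>17.2f}'.format(algo, size_mb, c_time, d_time))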