#!/bin/bash
# rename TMS tiles to the XYZ schema | |
# no quoting, since all files have simple numeric names | |
# do not run this anywhere else than INSIDE your tiles directory | |
# run it like this: find . -name "*.png" -exec ./tms2xyz.sh {} \; | |
filename=$1 | |
tmp=${filename#*/} # remove to first / | |
z=${tmp%%/*} # remove from first / | |
tmp=${filename%/*} # remove from last / | |
x=${tmp##*/} # remove to last / | |
tmp=${filename##*/} # remove to last / | |
y=${tmp%.*} # remove from last . (strip the extension)
extension=${filename##*.} | |
let newy="2**$z-$y-1" # flip the row: XYZ counts y from the top, TMS from the bottom
#echo $z $x $y $newy $extension | |
# remove the echo if you are sure you want to do this | |
echo mv ${filename} ./$z/$x/$newy.$extension |
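As a quick sanity check, here is a worked example with a hypothetical tile path (the numbers are made up for illustration), showing what the script prints while the echo is still in place:
# hypothetical input: ./12/345/678.png  (TMS: z=12, x=345, y=678)
#   newy = 2**12 - 678 - 1 = 3417      (XYZ counts rows from the top)
# with the echo still in place the script prints:
#   mv ./12/345/678.png ./12/345/3417.png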
Thanks a lot!
But take care: if your tile set crosses the equator, the script may overwrite its own files.
I've added a check whether the target file already exists; in that case the file is moved to *.NEW instead, and I clean up manually afterwards.
One could extend the script to run a second cleanup round, though.
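For reference, a minimal sketch of what such a check might look like at the end of the script, reusing the variables defined above; the *.NEW suffix and the plain mv (instead of the echo) are assumptions based on the description:
target=./$z/$x/$newy.$extension
if [ -e "$target" ]; then
    # the target tile already exists (e.g. a tile set crossing the equator):
    # park the file as *.NEW and clean it up manually afterwards
    mv "$filename" "$target.NEW"
else
    mv "$filename" "$target"
fi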
This program / method takes a LONG TIME if you are dealing with 9000 + .png files. It does an
-exec ./tms2xyz.sh
for each of the 9000 files.
I probably won't post it (unless someone asks for it), but I wrote a .php script (I am an old php guy) that does 9000 .png files much faster.
For the find -exec ./tms2xyz.sh method with 8961 files (each file launches its own .sh process) it took:
real 10m15.164s
user 5m2.127s
sys 5m28.776s
For my php script, which runs once and does something like:
$cmd = "find 1 2 3 4 5 6 7 8 9 -type f -name '*.png'";
$rc = exec($cmd, $results);
foreach ($results as $thisone) {
    // code to convert to the correct xyz coordinates
    ...
}
it took:
real 0m38.015s
user 0m14.388s
sys 0m24.345s
So 10+ minutes vs. 40 seconds.
Maybe 9000-ish files is not normal... it is for my stuff.
Anyway, if you do a lot of files this will work, but the find -exec script.sh approach is not optimal time-wise.
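For comparison, a rough bash-only sketch of the same single-process idea as the php script, untested and assuming the ./z/x/y.png layout from the gist; one shell reads every path that find prints instead of spawning a script per file:
#!/bin/bash
find . -name "*.png" | while IFS= read -r filename; do
    tmp=${filename#*/};  z=${tmp%%/*}      # zoom level
    tmp=${filename%/*};  x=${tmp##*/}      # column
    tmp=${filename##*/}; y=${tmp%.*}       # TMS row
    extension=${filename##*.}
    newy=$(( 2**z - y - 1 ))               # flip to the XYZ row
    echo mv "$filename" "./$z/$x/$newy.$extension"   # drop the echo to actually move
done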
Everyone likes Python, so maybe someone can do a Python script for this... or a bash guy could figure out a way to run each numbered directory (there are 10 of them) in a separate process, so you can do 10 dirs at one time instead of 1, then 2, then 3, then 4, etc. I've got a 40/80 core/process server that could probably handle 10 parallel processes OK.
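And a hedged sketch of that parallel idea, assuming the numeric zoom directories sit directly under the tiles directory and that tms2xyz.sh is the per-file script from the gist; each zoom level gets its own background process, which is safe here because every move stays inside its own ./z/x/ directory:
#!/bin/bash
for d in [0-9]*; do
    [ -d "$d" ] || continue                                # skip non-directories
    find ./"$d" -name "*.png" -exec ./tms2xyz.sh {} \; &   # one worker per zoom dir
done
wait   # block until all background jobs are done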
- jack
Haha, yeah, this was a really dumb but working approach. :D
Asking for it if you still have it (:
Asking for what? ;)
I needed to run it like this:
find . -name "*.png" -exec ./tms2xyz.sh '{}' \;