I know this question is rather vague, but please bear with me. I'm trying to get an idea of what sort of performance, specifically timing, people have seen for the various methodologies they use to create Google/Bing map tiles. There are a slew of methods for doing this (e.g. gdal2tiles, FME, maptiler, etc). An initial attempt at simply taking a large PNG and creating tiles with imagemagick, on a pretty decent Linux server, yielded some pretty long processing times, so I wanted to see what other people are using in production. New tiles would need to be generated at least daily, so turnaround time is pretty critical.
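For reference, the naive imagemagick approach I mentioned was essentially a one-pass crop like this (input filename is hypothetical):

```shell
# Slice one large PNG into 256x256 tiles in a single pass.
# +repage resets each tile's virtual canvas offset; %d numbers the
# tiles in row-major order. Note the output names are not in the
# z/x/y layout Google/Bing expect, so a rename step is still needed.
convert large_map.png -crop 256x256 +repage tile_%d.png
```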

The only real requirement is that it runs on a Linux server. Free is obviously better, but I don't want to restrict myself to that. The input can be raw gridded/raster data or a large image. The output needs to be image tiles usable as-is in Google or Bing Maps.

Just for the sake of comparison, I'll say that the timings should be for Google Maps zoom level 7.
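For a sense of scale (my own arithmetic, not a figure from anyone's benchmark): the Google/Bing tile pyramid has 4^z tiles at zoom level z, so a full-world zoom 7 is 16,384 tiles, and zooms 0-7 together are 21,845:

```shell
# 4^z 256x256 tiles at zoom z; the cumulative count through zoom z
# is a geometric series: (4^(z+1) - 1) / 3. (bash arithmetic)
z=7
echo "$((4**z)) tiles at zoom $z"                      # 16384
echo "$(( (4**(z+1) - 1) / 3 )) tiles for zooms 0-$z"  # 21845
```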

I appreciate everyone's help and again I want to apologize for how vague this question probably seems.

UPDATE: As far as the inputs, I currently have multiple (raw) data sources in various formats: netCDF, GRIB, GRIB2. In addition to the raw data itself, I also have the ability to generate really large images of that data which could then be sliced/tiled.

Ideally, I would just be chopping the image up but I am willing to try whatever will get me the fastest results.
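Since GDAL reads netCDF and GRIB directly, one option for the raw-data route is to skip the intermediate large image entirely and feed GDAL's output straight to a tiler. A rough, untested sketch (the file and subdataset/variable names are hypothetical, and GDAL must be built with the NetCDF and GRIB drivers):

```shell
# Pull one variable out of the netCDF file as a GeoTIFF
# (the subdataset name "temperature" is made up for illustration).
gdal_translate -of GTiff NETCDF:"model_output.nc":temperature band.tif

# Reproject to spherical Mercator (EPSG:3857), the projection
# Google and Bing map tiles use.
gdalwarp -t_srs EPSG:3857 -r bilinear band.tif band_3857.tif
```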

I recommend Adobe Fireworks (adobe.com/products/fireworks) for optimizing the final images. Even images exported from Photoshop and then optimized in Fireworks saw file-size reductions of up to 75% (PNG).
–
Mapperz♦ Mar 25 '11 at 18:27

I think you need to expand on your input(s), and on whether more processing is needed or you are just chopping them up.
–
iant♦ Mar 25 '11 at 20:16

@Mapperz: The free equivalent is pngcrush, plus pngnq for quantization. I currently work on a similar task and have an automated chain (gdal2tiles > pngnq > pngcrush > pregenerating thumbnails with imagemagick) for every file that is fed into the system. I can't claim it's fast, but the automation takes away a lot of the burden. And in my case there are no updates; it's fire and forget.
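relet's post-processing chain can be scripted roughly like this (untested sketch; it assumes the tiles land under ./tiles and relies on pngnq's default -nq8.png output suffix):

```shell
# Quantize every tile to a 256-color 8-bit palette with pngnq, then
# recompress the quantized copy over the original with pngcrush.
find tiles -name '*.png' | while read -r f; do
    pngnq -n 256 "$f"                       # writes ${f%.png}-nq8.png
    pngcrush -q "${f%.png}-nq8.png" "$f"    # quietly overwrite original
    rm -f "${f%.png}-nq8.png"               # drop the intermediate copy
done
```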
–
relet Mar 28 '11 at 14:46

@relet: Any timings you can pass along? What is your hardware setup for this? Thanks.
–
malonso Mar 29 '11 at 11:37

I was having issues with gdal2tiles taking quite a while to process a fairly large (380 MB, 39K x 10K pixels) TIFF into Google tiles for zoom ranges 0-12. On Ubuntu 12.04 64-bit, without multiprocessing, it took just about all day (8 hours) to process the TIFF into 1.99 million tiles @ 3.3 GB. As @Stephan Talpalaru mentions above, making gdal2tiles run in parallel is the key. Make a backup of your original gdal2tiles.py, then install the patch from within the directory that houses gdal2tiles.py (mine was /usr/local/bin):
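For anyone reading this on a newer GDAL: recent releases ship gdal2tiles.py with a built-in --processes option, which gives you the same parallelism without patching. A sketch (the input filename is hypothetical):

```shell
# Render zooms 0-12 across 8 worker processes; -w none skips
# generating the HTML viewer pages alongside the tiles.
gdal2tiles.py --processes=8 -z 0-12 -w none input.tif tiles/
```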