[Geowanking] downsampling large rasters from HD
rich.gibson at gmail.com
Mon Feb 18 10:54:40 PST 2008
I think you want to use the GDAL tools, http://gdal.org. You can download
them as part of the precompiled FWTools binaries.
GDAL is a library you can link into your own code; it also comes with
command-line tools for manipulating raster and vector data.
I just did a little test:
On a 1.83 GHz Intel Core Duo Mac Mini with 1 GB RAM, gdal_translate grabbed a
700 x 700 window from a 63434 x 11679 TIFF and wrote it out to a TIFF in
about 13 seconds.
Grabbing the same window and writing to a jpg took a bit over 5 seconds.
Example: to grab a 700x700 pixel window from near the middle and write it to
a jpg:
gdal_translate -srcwin 31000 5000 700 700 -of jpeg p2.tif out.jpg
To shrink it to 5% of its original size (this took about 13 seconds):
gdal_translate -outsize 5% 5% -of jpeg p2.tif out.jpg
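If you would rather roll your own for the raw 32-bit integer case in the
quoted question below, the basic idea is to seek directly to each wanted row
and decimate it, so the full file never sits in RAM. A minimal Python sketch
(all names here are hypothetical; it assumes a headerless, row-major,
native-byte-order int32 raster and nearest-neighbour picking rather than
averaging):

```python
# Sketch: strided subsample of a window from a raw 32-bit integer raster,
# reading one window-row at a time instead of loading the whole file.
import array

def subsample_window(path, raster_width, x0, y0, win_w, win_h, out_w, out_h):
    """Read a win_w x win_h window at (x0, y0) from a raw int32 raster with
    raster_width columns, decimated to roughly out_w x out_h by taking every
    n-th sample. Returns a list of rows (lists of ints)."""
    bpp = 4                            # bytes per sample; assumes 4-byte C int
    xstep = max(1, win_w // out_w)     # column stride
    ystep = max(1, win_h // out_h)     # row stride
    rows = []
    with open(path, 'rb') as f:
        for j in range(0, win_h, ystep):
            # seek straight to the start of the window on this raster row
            f.seek(((y0 + j) * raster_width + x0) * bpp)
            buf = array.array('i')
            buf.frombytes(f.read(win_w * bpp))
            rows.append(buf[::xstep].tolist())
    return rows
```

Nearest-neighbour decimation like this can alias badly; averaging each
xstep x ystep block would look better but costs more reads, which is one
reason to lean on GDAL's tested resampling instead.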
On Feb 18, 2008 6:36 AM, viktoras <viktoras.didziulis at sci.fi> wrote:
> hi friends ;-),
> just wanted to ask for any practical hints (algorithms) on downsampling a
> very large raw binary (32-bit integer) raster file from disk. I am looking
> for an algorithm or library that could be implemented or used in Ada,
> C/C++, Fortran, Assembly, Perl, Python or whatever else. The simpler the
> better :-). My aim is to subsample a 2-4 gigabyte raster data file, without
> loading it into RAM, within 10 seconds on an "ordinary" PC with an Intel
> processor (Windows, Linux, MacOSX on Intel). I wish to explore as many
> options as possible... If you know of any paper or website dealing with
> this kind of algorithm, I would appreciate the url.
> For example, I have a 50000x50000 raster and I need a portion of it resized
> to 700x700...
> Thanks in advance
> All the best!
> Geowanking mailing list
> Geowanking at lists.burri.to
Chief Scientist (and bottle washer), Locative Technologies