What C library allows scaling of ginormous images?
Consider the following file:

-rw-r--r-- 1 user user 470886479 2009-12-15 08:26 the_known_universe.png

How would you scale the image down to a reasonable resolution, using no more than 4GB of RAM?

For example:

$ convert -scale 7666x3833 the_known_universe.png the_known_universe_small.png

What C library would handle it?

Thank you!

Herringbone answered 20/7, 2010 at 0:43 Comment(3)
Redefine the known universe to fit within a spheroid region seven hundred and five meters in diameter. The map will be much smaller and easier to manipulate.Cleromancy
Just wondering, did you try doing this with ImageMagick? If I'm not mistaken, you can type pretty much that exact command if you have it installed, though I'm guessing ImageMagick won't be able to handle it.Dov
@Dave Jarvis: Perhaps your time would be better spent looking for things. Things you need. Things that make you go.Unman

I believe libpng has a stream interface, which can be used to read the image a few rows at a time; for a non-interlaced PNG, the rows arrive in order, top to bottom. You could then shrink each row (e.g. for a 50% shrink, shrink the row horizontally and discard every second row) and write the result to an output file.

Using libpng in C can take a fair amount of code, but the documentation guides you through it pretty well.

http://www.libpng.org/pub/png/libpng-1.2.5-manual.html#section-3.8
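
For a rough idea of the shape of that code, here is a minimal sketch. It assumes a non-interlaced 8-bit RGB PNG, and error handling and the matching write side (png_write_row) are omitted for brevity:

/* Stream a non-interlaced 8-bit RGB PNG through libpng one row at a
 * time, halving it by keeping every second pixel of every second row. */
#include <stdio.h>
#include <stdlib.h>
#include <png.h>

int main(int argc, char **argv)
{
    FILE *in = fopen(argv[1], "rb");
    png_structp png = png_create_read_struct(PNG_LIBPNG_VER_STRING,
                                             NULL, NULL, NULL);
    png_infop info = png_create_info_struct(png);

    if (setjmp(png_jmpbuf(png)))   /* libpng reports errors via longjmp */
        return 1;

    png_init_io(png, in);
    png_read_info(png, info);

    png_uint_32 width  = png_get_image_width(png, info);
    png_uint_32 height = png_get_image_height(png, info);
    size_t rowbytes = png_get_rowbytes(png, info);

    png_bytep row = malloc(rowbytes);       /* one input row */
    png_bytep out = malloc(rowbytes / 2);   /* one shrunken output row */

    for (png_uint_32 y = 0; y < height; y++) {
        png_read_row(png, row, NULL);  /* only this row is in memory */
        if (y % 2)
            continue;                  /* discard every second row */
        for (png_uint_32 x = 0; x < width / 2; x++) {
            out[3 * x + 0] = row[6 * x + 0];  /* keep every second pixel */
            out[3 * x + 1] = row[6 * x + 1];
            out[3 * x + 2] = row[6 * x + 2];
        }
        /* hand `out` to the streaming writer here */
    }

    png_destroy_read_struct(&png, &info, NULL);
    free(row);
    free(out);
    fclose(in);
    return 0;
}

Peak memory is a couple of row buffers rather than the whole decoded image, which is the point of the streaming interface.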

Mnemonic answered 20/7, 2010 at 1:15 Comment(1)
+1 - this is the way to go. I had to do something similar with the NASA image and used the streaming API.Saccharify

You could try making a 64-bit build of ImageMagick, or seeing if one already exists. My colleague wrote a blog post with a super-simple PNG decoder (it assumes you have zlib or equivalent), so you can get a sense of the code you'd need to roll your own.

http://www.atalasoft.com/cs/blogs/stevehawley/archive/2010/02/23/libpng-you-re-doing-it-wrong.aspx

You would need to do the resampling as you read the image in.

Zachariahzacharias answered 20/7, 2010 at 1:21 Comment(1)
You also have to make sure you compile with 8 bits per channel.Blayze

I used cximage a few years ago. I think the latest version is at http://www.xdp.it/cximage.htm after moving off of CodeProject.

Edit: sorry, it's C++, not C.

Slapdash answered 20/7, 2010 at 1:56 Comment(0)

You could use an image processing library that is intended for complex operations on large (and small) images. One example is the IM imaging toolkit. It links well with C (though it is implemented at least partly in C++) and has a good Lua binding, which makes it easy to experiment.

Froe answered 20/7, 2010 at 2:34 Comment(0)

libvips is comfortable with huge images. It's a streaming image processing library, so it can read from the source, process, and write to the destination simultaneously and in parallel. It's typically 3x to 5x faster than ImageMagick and needs very little memory.

For example, with the largest PNG I have on my laptop (1.8GB), I can downsize 10x with:

$ vipsheader huge.png
huge.png: 72000x72000 uchar, 3 bands, srgb, pngload
$ ls -l huge.png 
-rw-r--r-- 1 john john 1785845477 Feb 19 09:39 huge.png
$ time vips resize huge.png x.png 0.1
real    1m35.279s
user    1m49.178s
sys 0m1.208s
peak RES 230mb

Not fast, but not too shabby either. PNG is rather a slow format; it would be much quicker with TIFF.

libvips is installable through most package managers (e.g. Homebrew on macOS, apt on Debian), there's a Windows binary, and it's free (LGPL). As well as the command line, there are bindings for C, C++, Python, Ruby, Lua, Node.js, PHP, and others.
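
A minimal sketch of the same 10x downsize through the libvips C API; the file names are placeholders, and the "sequential" access hint is what lets vips stream the image instead of loading it whole:

#include <vips/vips.h>

int main(int argc, char **argv)
{
    VipsImage *in, *out;

    if (VIPS_INIT(argv[0]))
        vips_error_exit(NULL);

    /* open for sequential (top-to-bottom, one pass) access */
    if (!(in = vips_image_new_from_file("huge.png",
            "access", VIPS_ACCESS_SEQUENTIAL, NULL)))
        vips_error_exit(NULL);

    if (vips_resize(in, &out, 0.1, NULL))  /* scale factor 0.1 == 10x down */
        vips_error_exit(NULL);

    if (vips_image_write_to_file(out, "x.png", NULL))
        vips_error_exit(NULL);

    g_object_unref(out);
    g_object_unref(in);
    vips_shutdown();
    return 0;
}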

Haupt answered 19/2, 2018 at 10:18 Comment(0)

Have you considered pyramid-based images? Imagine a pyramid where the image is stored at multiple layers, each layer at a different resolution, and each layer split into tiles. This lets you display a zoomed-out version of the image, as well as a zoomed-in partial view, without ever rescaling the full image.

See the Wikipedia entry.
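
To make the layer geometry concrete, here is a small sketch; the image dimensions and tile size are illustrative only, not taken from any particular format:

/* Pyramid geometry: each level halves the resolution of the one
 * below, and every level is cut into fixed-size tiles. */
#include <stdio.h>

int main(void)
{
    int w = 306954, h = 153477;  /* hypothetical full-resolution size */
    const int tile = 256;        /* a common tile edge length */

    for (int level = 0; ; level++) {
        int tx = (w + tile - 1) / tile;  /* tiles across */
        int ty = (h + tile - 1) / tile;  /* tiles down */
        printf("level %d: %dx%d pixels, %dx%d tiles\n", level, w, h, tx, ty);
        if (w == 1 && h == 1)
            break;
        w = w > 1 ? (w + 1) / 2 : 1;     /* next level: half resolution */
        h = h > 1 ? (h + 1) / 2 : 1;
    }
    return 0;
}

A viewer then fetches only the tiles that intersect the viewport at the nearest level, which is why zooming stays cheap no matter how large the source is.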

One of the original formats was FlashPix, which I wrote a renderer for. I've also created a new pyramid format, with a converter and renderer, which was used for a medical application: a scanner would produce 90GB+ scans of a slice of an organ for cancer research. Getting the converter's algorithm to produce the pyramid images efficiently was actually pretty tricky. Believe it or not, it was Java based, and it performed much better than you'd think; it used multithreading, and benchmarking suggested a C version was unlikely to do a whole lot better. This was 6ish years ago; the original renderer I did over 10 years ago. You don't hear much about pyramid-based images these days, but it's really the only efficient way to produce scaled images on demand without generating cached scaled versions.

JPEG 2000 may or may not have an optional pyramid feature as well.

I recall that ImageMagick's supported formats and conversions may include FlashPix. Googling for "image pyramid" turns up some interesting results. Brings back some memories ;-)

Aldrich answered 20/7, 2010 at 3:19 Comment(0)

If you can move it to a 64-bit OS, you can open it as a memory-mapped file (or the platform equivalent) and use pretty much any library you want. It won't be fast, and you may need to increase the page/swap file (depending on the OS and what else you want to do with it), but in return you won't be limited to streaming libraries, so you'll be able to do more operations before reducing resolution or slicing.
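
A minimal sketch of the mapping step on a 64-bit POSIX system; the file name is the one from the question, and what you do with the mapped bytes (handing them to an in-memory decoder) is up to you:

#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    int fd = open("the_known_universe.png", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

    /* The kernel pages data in on demand; nothing is read up front. */
    unsigned char *data = mmap(NULL, st.st_size, PROT_READ,
                               MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED) { perror("mmap"); return 1; }

    /* hand (data, st.st_size) to any in-memory decoder here */
    printf("mapped %lld bytes, first byte 0x%02x\n",
           (long long)st.st_size, data[0]);

    munmap(data, st.st_size);
    close(fd);
    return 0;
}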

Gallimaufry answered 8/8, 2010 at 15:3 Comment(0)
