What algorithms to use for image downsizing?

What is faster?

What algorithm is used for image resizing (especially downsizing, from a big 600x600 down to a super small 6x6, for example) by such giants as Flash, Silverlight, and HTML5?

Cobber answered 21/6, 2010 at 17:8 Comment(3)
That's a lot of questions for one question. – Darwin
No matter what choice you end up going with, image downsizing should be done with care and compassion. Remember: images have families too. – Stinker
#385491 – Whimsicality

Bilinear is the most widely used method and can be made to run about as fast as the nearest neighbor down-sampling algorithm, which is the fastest but least accurate.
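
To make the comparison concrete, here is a minimal sketch of how the two methods compute one destination pixel (plain C, grayscale, one byte per pixel, row-major; the function names and image layout are my own assumptions, not any particular library's API):

#include <stdint.h>

/* Nearest neighbor: just pick the closest source pixel. Fastest, but aliases badly. */
static uint8_t sample_nearest(const uint8_t *src, int w, int h, double fx, double fy)
{
    int x = (int)(fx + 0.5);
    int y = (int)(fy + 0.5);
    if (x > w - 1) x = w - 1;
    if (y > h - 1) y = h - 1;
    return src[y * w + x];
}

/* Bilinear: blend the four surrounding pixels, weighted by how close (fx, fy) is to each. */
static uint8_t sample_bilinear(const uint8_t *src, int w, int h, double fx, double fy)
{
    int x0 = (int)fx, y0 = (int)fy;
    int x1 = x0 + 1 < w ? x0 + 1 : x0;
    int y1 = y0 + 1 < h ? y0 + 1 : y0;
    double tx = fx - x0, ty = fy - y0;
    double top    = src[y0 * w + x0] * (1.0 - tx) + src[y0 * w + x1] * tx;
    double bottom = src[y1 * w + x0] * (1.0 - tx) + src[y1 * w + x1] * tx;
    return (uint8_t)(top * (1.0 - ty) + bottom * ty + 0.5);
}

The destination loop maps each output pixel back to fractional source coordinates and calls one of these; bilinear costs four reads and a few multiplies per pixel instead of one read, which is why it can be made to run nearly as fast.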

The trouble with a naive implementation of bilinear sampling is that if you use it to reduce an image by more than half, you can run into aliasing artifacts similar to what you would encounter with nearest neighbor. The solution is to use a pyramid-based approach. Basically, if you want to reduce 600x600 to 30x30, you first reduce to 300x300, then 150x150, then 75x75, then 38x38, and only then use bilinear to reduce to 30x30.

When reducing an image by half, the bilinear sampling algorithm becomes much simpler. Basically, for every second row and column of pixels (even i and j):

y[i/2][j/2] = (x[i][j] + x[i+1][j] + x[i][j+1] + x[i+1][j+1]) / 4;
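
Putting the two ideas together, here is a rough sketch of the pyramid (again plain C and grayscale; halve_image and pyramid_downscale are my own names, not an existing API, and the buffers are assumed to be heap-allocated):

#include <stdint.h>
#include <stdlib.h>

/* One pyramid level: average each 2x2 block of src into one pixel of dst. */
static void halve_image(const uint8_t *src, int w, int h, uint8_t *dst)
{
    int dw = w / 2, dh = h / 2;
    for (int j = 0; j < dh; j++)
        for (int i = 0; i < dw; i++)
            dst[j * dw + i] = (src[(2 * j) * w + (2 * i)] +
                               src[(2 * j) * w + (2 * i) + 1] +
                               src[(2 * j + 1) * w + (2 * i)] +
                               src[(2 * j + 1) * w + (2 * i) + 1] + 2) / 4;
}

/* Keep halving while another halving still stays at or above the target size;
   the caller then does one ordinary bilinear resize down to target_w x target_h. */
static uint8_t *pyramid_downscale(uint8_t *img, int *w, int *h, int target_w, int target_h)
{
    while (*w / 2 >= target_w && *h / 2 >= target_h) {
        uint8_t *next = malloc((size_t)(*w / 2) * (size_t)(*h / 2));
        halve_image(img, *w, *h, next);
        free(img);                 /* img must have come from malloc */
        img = next;
        *w /= 2;
        *h /= 2;
    }
    return img;
}

For 600x600 down to 30x30 this stops around 37x37, and the final bilinear pass handles the remaining sub-2x step.
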
Dost answered 21/6, 2010 at 17:8 Comment(2)
Do you mean that bilinear should be used only on the last step? – Connally
Yes, but my understanding is that the last step must be more than 2X to look good, so 38 to 30 would not be accurate. You would go from 75 to 30. – Permalloy

There's one special case: downsizing JPGs by more than a factor of 8. A direct factor-of-8 rescale can be done on the raw JPG data, without fully decompressing it. JPGs are stored as compressed blocks of 8x8 pixels, with the average pixel value (the DC coefficient) stored first. As a result, it typically takes more time to read the file from disk or the network than it takes to downscale it.
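
With libjpeg (or libjpeg-turbo) this is exposed through the scale_num/scale_denom fields of the decompress object; a rough, untested sketch of a 1/8-scale decode, with error handling and actual pixel output omitted:

#include <stdio.h>
#include <jpeglib.h>

/* Decode a JPEG at 1/8 of its native size; the scaling happens in the DCT
   domain, so the full-resolution image is never reconstructed. */
int decode_eighth(const char *path)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *fp = fopen(path, "rb");
    if (!fp) return -1;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_stdio_src(&cinfo, fp);
    jpeg_read_header(&cinfo, TRUE);

    cinfo.scale_num = 1;      /* ask for 1/8 scaling before decompression starts */
    cinfo.scale_denom = 8;

    jpeg_start_decompress(&cinfo);
    /* cinfo.output_width and cinfo.output_height now hold the reduced size */
    JSAMPARRAY row = (*cinfo.mem->alloc_sarray)((j_common_ptr)&cinfo, JPOOL_IMAGE,
                          cinfo.output_width * cinfo.output_components, 1);
    while (cinfo.output_scanline < cinfo.output_height)
        jpeg_read_scanlines(&cinfo, row, 1);   /* store or process each row here */

    jpeg_finish_decompress(&cinfo);
    jpeg_destroy_decompress(&cinfo);
    fclose(fp);
    return 0;
}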

Khabarovsk answered 21/6, 2010 at 17:8 Comment(0)

There is an excellent article at The Code Project showing the effects of various image filters.

For shrinking an image I suggest the bicubic algorithm; this has a natural sharpening effect, so detail in the image is retained at smaller sizes.
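
The sharpening comes from the kernel's negative lobes. Here is the standard cubic convolution weight (Keys' kernel with a = -0.5, i.e. Catmull-Rom) as a small C helper; this is the generic textbook kernel, not necessarily the exact filter used in that article:

#include <math.h>

/* Cubic convolution kernel (a = -0.5). The negative values for 1 < |t| < 2
   are what give bicubic its mild sharpening effect. */
static double bicubic_weight(double t)
{
    const double a = -0.5;
    t = fabs(t);
    if (t <= 1.0)
        return (a + 2.0) * t * t * t - (a + 3.0) * t * t + 1.0;
    if (t < 2.0)
        return a * t * t * t - 5.0 * a * t * t + 8.0 * a * t - 4.0 * a;
    return 0.0;
}

Each output pixel is then a weighted sum over the surrounding 4x4 source pixels, with bicubic_weight applied to the horizontal and vertical distances.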

Final answered 21/6, 2010 at 17:8 Comment(2)
I get a 404 error on that page. I'd suggest bicubic for scaling up, but I stick with my bilinear for scaling down. All in all it's not going to make a major difference. – Coacher
@Ben, link fixed. I disagree; bilinear has a smoothing effect that removes jaggies when enlarging, but tends to wash details out when downsizing. Anyway, take a look at the article. – Final

Normally I would stick to a bilinear filter for scaling down. For resizing images to tiny sizes, though, you may be out of luck. Most icons are pixel-edited by hand to make them look their best.

Here is a good resource which explains the concepts quite well.

Coacher answered 21/6, 2010 at 17:8 Comment(0)
