This is not a straightforward problem; please read it through!
I want to manipulate a JPEG file and save it again as JPEG. The problem is that even without any manipulation there is significant (visible) quality loss. Question: what option or API am I missing that would let me re-compress a JPEG without quality loss? (I know that's not strictly possible, but what I describe below is not an acceptable level of artifacts, especially at quality=100.)
Control
I load it as a Bitmap from the file:
BitmapFactory.Options options = new BitmapFactory.Options();
// explicitly state everything so the configuration is clear
options.inPreferredConfig = Config.ARGB_8888;
options.inDither = false; // shouldn't be used anyway since 8888 can store HQ pixels
options.inScaled = false;
options.inPremultiplied = false; // no alpha, but disable explicitly
options.inSampleSize = 1; // make sure pixels are 1:1
options.inPreferQualityOverSpeed = true; // doesn't make a difference
// I'm loading the highest possible quality without any scaling/sizing/manipulation
Bitmap bitmap = BitmapFactory.decodeFile("/sdcard/image.jpg", options);
Now, to have a control image to compare to, let's save the plain Bitmap bytes as PNG:
bitmap.compress(Bitmap.CompressFormat.PNG, 100/*ignored*/, new FileOutputStream("/sdcard/image.png"));
I compared this to the original JPEG image on my computer and there's no visual difference.
I also saved the raw int[] from getPixels and loaded it as a raw ARGB file on my computer: there's no visual difference to the original JPEG, nor to the PNG saved from the Bitmap.
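A minimal sketch of that raw dump, assuming the same bitmap variable as above; the output path and the per-pixel byte order are my choices here (DataOutputStream.writeInt is big-endian, so each ARGB_8888 pixel comes out as A, R, G, B), and exception handling is omitted like in the other snippets:
int[] pixels = new int[bitmap.getWidth() * bitmap.getHeight()];
// copy the whole bitmap into an int[] of packed ARGB_8888 pixels
bitmap.getPixels(pixels, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
DataOutputStream raw = new DataOutputStream(
        new BufferedOutputStream(new FileOutputStream("/sdcard/image.argb")));
for (int pixel : pixels) {
    raw.writeInt(pixel); // writes 4 bytes per pixel: A, R, G, B
}
raw.close();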
I checked the Bitmap's dimensions and config: they match the source image and the input options, and it's decoded as ARGB_8888 as expected.
The above two control checks prove that the pixels in the in-memory Bitmap are correct.
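For completeness, that check is just a log line (the tag is arbitrary; the expected values are the source JPEG's dimensions and ARGB_8888):
// verify that the decode preserved size and config
Log.d("jpegtest", bitmap.getWidth() + "x" + bitmap.getHeight() + " " + bitmap.getConfig());
// should print the source width x height and ARGB_8888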
Problem
I want JPEG files as a result, so the PNG and raw approaches above won't work. Let's try saving as JPEG at 100% first:
// 100% still expected lossy, but not this amount of artifacts
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, new FileOutputStream("/sdcard/image.jpg"));
I'm not sure the quality parameter is actually measured in percent, but it's easier to read and discuss that way, so I'll use it.
I'm aware that JPEG at 100% quality is still lossy, but it shouldn't be so visually lossy that it's noticeable from afar. Here's a comparison of two 100% compressions of the same source.
Open them in separate tabs and click back and forth between them to see what I mean. The difference images were made in Gimp: the original as the bottom layer, the re-compressed image as the middle layer in "Grain extract" mode, and a fully white top layer in "Value" mode to make the artifacts stand out.
The images below are uploaded to Imgur, which also compresses the files, but since all of the images are compressed the same way, the unwanted artifacts remain visible just as I see them when opening my original files.
Original [560k]
Imgur's difference to the original (not relevant to the problem, just showing that uploading doesn't add extra artifacts)
IrfanView 100% [728k] (visually identical to the original)
IrfanView 100%'s difference to the original (barely anything)
Android 100% [942k]
Android 100%'s difference to the original (tinting, banding, smearing)
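If someone wants to reproduce a difference image on the device itself, here's a rough sketch; it's not the exact Gimp grain-extract recipe, just an amplified absolute per-channel difference, and the file names are placeholders for the original and the Android 100% re-compression:
Bitmap a = BitmapFactory.decodeFile("/sdcard/original.jpg");    // source JPEG
Bitmap b = BitmapFactory.decodeFile("/sdcard/android100.jpg");  // Android 100% re-compression
Bitmap diff = Bitmap.createBitmap(a.getWidth(), a.getHeight(), Bitmap.Config.ARGB_8888);
for (int y = 0; y < a.getHeight(); y++) {
    for (int x = 0; x < a.getWidth(); x++) {
        int pa = a.getPixel(x, y), pb = b.getPixel(x, y);
        // amplify the per-channel difference so small artifacts become visible
        int r  = Math.min(255, Math.abs(Color.red(pa)   - Color.red(pb))   * 8);
        int g  = Math.min(255, Math.abs(Color.green(pa) - Color.green(pb)) * 8);
        int bl = Math.min(255, Math.abs(Color.blue(pa)  - Color.blue(pb))  * 8);
        diff.setPixel(x, y, Color.rgb(r, g, bl));
    }
}
diff.compress(Bitmap.CompressFormat.PNG, 100, new FileOutputStream("/sdcard/diff.png"));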
In IrfanView I have to go below 50% [50k] to see remotely similar effects. At 70% [100k] in IrfanView there's no noticeable difference, yet the file size is about a ninth of Android's.
Background
I created an app that takes a picture via the Camera API; the image arrives as a byte[] that is an encoded JPEG blob. I saved that blob via the OutputStream.write(byte[]) method; that was my original source file. decodeByteArray(data, 0, data.length, options) decodes the same pixels as reading from a File (tested with Bitmap.sameAs), so that's irrelevant to the issue.
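A minimal sketch of that pipeline, assuming the old android.hardware.Camera API, the options object from the Control section, and placeholder file names:
camera.takePicture(null, null, new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        try {
            // data is already an encoded JPEG blob; write it out untouched
            FileOutputStream out = new FileOutputStream("/sdcard/image.jpg");
            out.write(data);
            out.close();
        } catch (IOException e) {
            Log.e("jpegtest", "saving the JPEG failed", e);
        }
        // decoding from memory and from the file yields identical pixels
        Bitmap fromBytes = BitmapFactory.decodeByteArray(data, 0, data.length, options);
        Bitmap fromFile = BitmapFactory.decodeFile("/sdcard/image.jpg", options);
        Log.d("jpegtest", "same pixels: " + fromBytes.sameAs(fromFile));
    }
});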
I was using my Samsung Galaxy S4 with Android 4.4.2 to test things out. Edit: while investigating further I also tried Android 6.0 and N preview emulators and they reproduce the same issue.
Comments
[…] Bitmap.compress() will be moot. Epic analysis, though! :-) – Humboldt
Bitmap.compress() would have been originally created for single-core, ~500MHz CPUs and 192MB of device RAM. That's roughly on par with 15-year-old PCs. Hence, it was optimized for low CPU and low RAM, not max quality. I have no idea if they improved it over time or not, given that device requirements have been climbing. "I have a lot of images where this is really visible" -- bear in mind that you may be more sensitive to the issue than others. I'm notoriously lousy at it, and if I stare at the photos, I'll detect minute differences, but that's it. – Humboldt
would have been originally created for single-core, ~500MHz CPUs and 192MB device RAM. That's roughly on par with 15-year-old PCs. Hence, it was optimized for low CPU and low RAM, not max quality. I have no idea if they improved it over time or not, given that device requirements have been climbing. "I have a lot of images where this is really visible" -- bear in mind that you may be more sensitive to the issue than others. I'm notoriously lousy at it, and if I stare at the photos, I'll detect minute differences, but that's it. – Humboldt