Is it possible to get the new ImageDecoder class to return Bitmaps, one frame after another, manually?
Background

I'm trying to iterate over the bitmaps of animated GIF & WEBP files manually (frame by frame), so that it will work not just for Views but in other cases too (such as a live wallpaper).

The problem

Animated GIF/WEBP files are supported only from Android P onward, via the ImageDecoder API (example here).

For GIF, I wanted to use Glide for the task, but I failed, so I tried to overcome this by using a library that allows loading them (here, solution here). I think it works fine.

For WebP, I thought I'd found another library that could work on older Android versions (here, fork made here), but it seems that it can't handle WebP files well in some cases (reported here). I tried to figure out what the issue is and how to solve it, but I didn't succeed.

So, assuming that some day Google will support GIF & WEBP animations on older Android versions via the support library (they wrote about it here), I've decided to try ImageDecoder for the task.

Thing is, looking at the entire API of ImageDecoder, it's quite restrictive in how we're supposed to use it. I don't see how I can overcome its limitations.

What I've found

This is how ImageDecoder can be used to show an animated WebP in an ImageView (just a sample, of course, available here):

class MainActivity : AppCompatActivity() {
    @SuppressLint("StaticFieldLeak")
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        val source = ImageDecoder.createSource(resources, R.raw.test)
        object : AsyncTask<Void, Void, Drawable?>() {
            override fun doInBackground(vararg params: Void?): Drawable? {
                return try {
                    ImageDecoder.decodeDrawable(source)
                } catch (e: Exception) {
                    null
                }
            }

            override fun onPostExecute(result: Drawable?) {
                super.onPostExecute(result)
                imageView.setImageDrawable(result)
                if (result is AnimatedImageDrawable) {
                    result.start()
                }
            }

        }.execute()

    }
}

I've tried to read all of the documentation of ImageDecoder and AnimatedImageDrawable, and also to look at their code, but I don't see how it's possible to go over each frame manually and get the time that needs to be waited between frames.

The questions

  1. Is there a way to use the ImageDecoder API to go over each frame manually, getting a Bitmap to draw and knowing how long to wait between frames? Is there any workaround available? Maybe even using AnimatedImageDrawable?

  2. I'd like to do the same on older Android versions. Is it possible? If so, how? Maybe with a different API/library? Google wrote that it's working on a way to use ImageDecoder on older Android versions, but I don't see it mentioned anywhere (except for the link I've provided). It's probably not ready yet; Android P hasn't even reached 0.1% of users. Maybe Fresco can do it? I've tried checking there too, but I don't see that it's capable of such a thing either, and it's a huge library to use just for this task, so I'd prefer a different library instead. I also know that libwebp is available, but it's in C/C++, and I'm not sure whether it's suited for Android or whether there is a Java/Kotlin port of it for Android.


EDIT:

Since I think I got what I wanted (both via a third-party library and via ImageDecoder), namely being able to get bitmaps out of an animated WebP file, I'd still like to know how to get the frame count and the current frame using ImageDecoder, if that's possible. I tried using ImageDecoder.decodeDrawable(source, object : ImageDecoder.OnHeaderDecodedListener..., but it doesn't provide frame-count information, and there is no way in the API, as far as I can see, to jump to a specific frame index and start from there, or to find out, for a specific frame, how long to wait before moving to the next one. So I made a request about those here.
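For completeness, the header-listener attempt looked roughly like this; the ImageInfo passed to the callback only exposes things like the size, MIME type, and whether the image is animated, and nothing frame-related (just a sketch, the resource name is a placeholder):

    // must run off the UI thread, like in the AsyncTask sample above
    val source = ImageDecoder.createSource(resources, R.raw.test)
    val drawable = ImageDecoder.decodeDrawable(source) { decoder, info, _ ->
        // ImageInfo has size, MIME type and an isAnimated flag,
        // but no frame count and no per-frame durations:
        Log.d("AppLog", "size:${info.size} mime:${info.mimeType} animated:${info.isAnimated}")
        decoder.allocator = ImageDecoder.ALLOCATOR_SOFTWARE
    }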

Sadly, I also could not find any sign that Google has made ImageDecoder available for older Android versions.

It would also be interesting to know whether there is some way to do the same for the relatively new HEIC animation format. Currently it's supported only on Android P.

Chiropteran answered 27/10, 2018 at 17:2 Comment(0)

OK, I got a possible solution, using the Glide library together with the GlideWebpDecoder library.

I'm not sure if that's the best way to do it, but I think it should work fine. The following code shows how to make the drawable draw into a Bitmap instance that I create, for each frame the animation needs to show. It's not exactly what I asked for, but it might help others.

Here's the code (project available here):

CallbackEx.kt

abstract class CallbackEx : Drawable.Callback {
    override fun unscheduleDrawable(who: Drawable, what: Runnable) {}
    override fun invalidateDrawable(who: Drawable) {}
    override fun scheduleDrawable(who: Drawable, what: Runnable, `when`: Long) {}
}

MyAppGlideModule.kt

@GlideModule
class MyAppGlideModule : AppGlideModule()

MainActivity.kt

class MainActivity : AppCompatActivity() {
    var webpDrawable: WebpDrawable? = null
    var gifDrawable: GifDrawable? = null
    var callback: Drawable.Callback? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        useFrameByFrameDecoding()
//        useNormalDecoding()
    }

    fun useNormalDecoding() {
        //webp url : https://res.cloudinary.com/demo/image/upload/fl_awebp/bored_animation.webp
        Glide.with(this)
                //                .load(R.raw.test)
                //                .load(R.raw.fast)
                .load(R.raw.example2)

                //                .load("https://res.cloudinary.com/demo/image/upload/fl_awebp/bored_animation.webp")
                .into(object : SimpleTarget<Drawable>() {
                    override fun onResourceReady(drawable: Drawable, transition: Transition<in Drawable>?) {
                        imageView.setImageDrawable(drawable)
                        when (drawable) {
                            is GifDrawable -> {
                                drawable.start()
                            }
                            is WebpDrawable -> {
                                drawable.start()
                            }
                        }
                    }
                })
    }

    fun useFrameByFrameDecoding() {
        //webp url : https://res.cloudinary.com/demo/image/upload/fl_awebp/bored_animation.webp
        Glide.with(this)
                .load(R.raw.test)
                //                .load(R.raw.fast)
                //                .load(R.raw.example2)
                //                .load("https://res.cloudinary.com/demo/image/upload/fl_awebp/bored_animation.webp")
                .into(object : SimpleTarget<Drawable>() {
                    override fun onResourceReady(drawable: Drawable, transition: Transition<in Drawable>?) {
                        //                        val callback
                        when (drawable) {
                            is GifDrawable -> {
                                gifDrawable = drawable
                                val bitmap = Bitmap.createBitmap(drawable.intrinsicWidth, drawable.intrinsicHeight, Bitmap.Config.ARGB_8888)
                                val canvas = Canvas(bitmap)
                                drawable.setBounds(0, 0, bitmap.width, bitmap.height)
                                drawable.setLoopCount(GifDrawable.LOOP_FOREVER)
                                callback = object : CallbackEx() {
                                    override fun invalidateDrawable(who: Drawable) {
                                        who.draw(canvas)
                                        imageView.setImageBitmap(bitmap)
                                        Log.d("AppLog", "invalidateDrawable ${drawable.toString().substringAfter('@')} ${drawable.frameIndex}/${drawable.frameCount}")
                                    }
                                }
                                drawable.callback = callback
                                drawable.start()
                            }
                            is WebpDrawable -> {
                                webpDrawable = drawable
                                val bitmap = Bitmap.createBitmap(drawable.intrinsicWidth, drawable.intrinsicHeight, Bitmap.Config.ARGB_8888)
                                val canvas = Canvas(bitmap)
                                drawable.setBounds(0, 0, bitmap.width, bitmap.height)
                                drawable.setLoopCount(WebpDrawable.LOOP_FOREVER)
                                callback = object : CallbackEx() {
                                    override fun invalidateDrawable(who: Drawable) {
                                        who.draw(canvas)
                                        imageView.setImageBitmap(bitmap)
                                        Log.d("AppLog", "invalidateDrawable ${drawable.toString().substringAfter('@')} ${drawable.frameIndex}/${drawable.frameCount}")
                                    }
                                }
                                drawable.callback = callback
                                drawable.start()
                            }
                        }
                    }
                })
    }

    override fun onStart() {
        super.onStart()
        webpDrawable?.start()
        gifDrawable?.start()
    }

    override fun onStop() {
        super.onStop()
        Log.d("AppLog", "onStop")
        webpDrawable?.stop()
        gifDrawable?.stop()
    }

}

I'm not sure why SimpleTarget is marked as deprecated, though, or what I should use instead.
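If I understand the deprecation correctly, CustomTarget (added in Glide 4.9.0) seems to be the intended replacement; it forces you to implement onLoadCleared so the resource isn't touched after Glide reclaims it. A rough, untested sketch:

    Glide.with(this)
            .load(R.raw.test)
            .into(object : CustomTarget<Drawable>() {
                override fun onResourceReady(resource: Drawable, transition: Transition<in Drawable>?) {
                    imageView.setImageDrawable(resource)
                    (resource as? Animatable)?.start()
                }

                override fun onLoadCleared(placeholder: Drawable?) {
                    // stop using the resource here; Glide may recycle it
                    imageView.setImageDrawable(placeholder)
                }
            })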

Using a similar technique, I've also found out how to do it using ImageDecoder, though for some reason not with quite the same behavior. A sample project is available here.

Here's the code:

MainActivity.kt

class MainActivity : AppCompatActivity() {
    var webpDrawable: AnimatedImageDrawable? = null

    @SuppressLint("StaticFieldLeak")
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        val source = ImageDecoder.createSource(resources, R.raw.test)
        object : AsyncTask<Void, Void, Drawable?>() {
            override fun doInBackground(vararg params: Void?): Drawable? {
                return try {
                    ImageDecoder.decodeDrawable(source)
                } catch (e: Exception) {
                    null
                }
            }

            override fun onPostExecute(drawable: Drawable?) {
                super.onPostExecute(drawable)
//                imageView.setImageDrawable(result)
                if (drawable is AnimatedImageDrawable) {
                    webpDrawable = drawable
                    val bitmap =
                        Bitmap.createBitmap(drawable.intrinsicWidth, drawable.intrinsicHeight, Bitmap.Config.ARGB_8888)
                    val canvas = Canvas(bitmap)
                    drawable.setBounds(0, 0, bitmap.width, bitmap.height)
                    drawable.repeatCount = AnimatedImageDrawable.REPEAT_INFINITE
                    drawable.callback = object : Drawable.Callback {
                        val handler = Handler()
                        override fun unscheduleDrawable(who: Drawable, what: Runnable) {
                            Log.d("AppLog", "unscheduleDrawable")
                        }

                        override fun invalidateDrawable(who: Drawable) {
                            who.draw(canvas)
                            imageView.setImageBitmap(bitmap)
                            Log.d("AppLog", "invalidateDrawable")
                        }

                        override fun scheduleDrawable(who: Drawable, what: Runnable, `when`: Long) {
                            Log.d("AppLog", "scheduleDrawable next frame in ${`when` - SystemClock.uptimeMillis()} ms")
                            handler.postAtTime(what, `when`)
                        }
                    }
                    drawable.start()
                }
            }
        }.execute()
    }

    override fun onStart() {
        super.onStart()
        webpDrawable?.start()
    }

    override fun onStop() {
        super.onStop()
        webpDrawable?.stop()
    }

}
Chiropteran answered 9/11, 2018 at 22:42 Comment(4)
Very nice solutionCraver
I don't know if it's the best one though. Do you know of a better one?Chiropteran
Hi, what is the status? There are some experimental classes of animated WebP encoder/decoder. Please look into those and try to use that class directly instead of Glide.with(). I mean, get the files and decode/encode by iteratingBollinger
@Bollinger What do you mean? Do you have a nicer alternative? Can you please share your sample?Chiropteran

See ImageDecoder.Source...

One first needs to create a source, using one of:

// source from file
val source = ImageDecoder.createSource(file)

// source from byte buffer
val source = ImageDecoder.createSource(byteBuffer)

// source from resource
val source = ImageDecoder.createSource(resources, resId)

// source from URI
val source = ImageDecoder.createSource(contentResolver, uri)

// source from asset file
val source = ImageDecoder.createSource(assetManager, assetFileName)

and then decode it, using one of:

// create bitmap
val bitmap = ImageDecoder.decodeBitmap(source)

// create drawable
val drawable = ImageDecoder.decodeDrawable(source)
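Note that for an animated file the two calls behave differently: decodeBitmap decodes just the first frame into a still Bitmap, while decodeDrawable yields an AnimatedImageDrawable on Android P+. A small sketch (resource name is a placeholder, and both calls must run off the UI thread):

    val source = ImageDecoder.createSource(resources, R.raw.animation)
    // a still Bitmap of the first frame only:
    val firstFrame: Bitmap = ImageDecoder.decodeBitmap(source)
    // the full animation:
    val drawable = ImageDecoder.decodeDrawable(source)
    if (drawable is AnimatedImageDrawable) {
        drawable.start()
    }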

Update: the problem is that the resulting AnimatedImageDrawable doesn't have the two methods getNumberOfFrames() and getFrame(int) that AnimationDrawable has. As @androiddeveloper pointed out, I had mixed up two different classes. I've double-checked the documentation and there seems to be no way. With GIFImageReader the frames can still be extracted (source):

ArrayList<BufferedImage> getFrames(File gif) throws IOException {
    ArrayList<BufferedImage> frames = new ArrayList<BufferedImage>();
    ImageReader ir = new GIFImageReader(new GIFImageReaderSpi());
    ir.setInput(ImageIO.createImageInputStream(gif));
    for(int i = 0; i < ir.getNumImages(true); i++) {
        frames.add(ir.read(i));
    }
    return frames;
}

I just tried to convert it to Kotlin, but javax.imageio.ImageIO is not available on Android.

Rossie answered 10/11, 2018 at 13:1 Comment(12)
There is no getNumberOfFrames and no getFrame in AnimatedImageDrawable and in any of the classes it extends from. The class AnimatedImageDrawable doesn't extend from AnimationDrawableChiropteran
@androiddeveloper sorry, messed up the classes; provided an alternate approach. the ir.read(i) would be a single frame. AnimatedImageDrawable just replaces Movie.Rossie
Where do those classes of GIFImageReader and GIFImageReaderSpi come from? Anyway, you could help me still with ImageDecoder . I've noticed that my example of using it frame-by-frame doesn't really work on all cases. If the animation is long enough, there are no more callbacks, due to this: github.com/zjupure/GlideWebpDecoder/issues/… , so the question about ImageDecoder alone still isn't answered well (including my own solution).Chiropteran
@androiddeveloper these are coming from Oracle javax.imageio.ImageIO. docs.oracle.com/javase/7/docs/api/javax/imageio/… some of the javax classes are even available on Android: developer.android.com/reference/packages but not all. with a common (not Android) .jar, one could possibly make these classes available; eg. some GIF splitter library.Rossie
So they have a decoder in Java for animated GIF? Also for WEBP ?Chiropteran
@androiddeveloper there are libraries: bitbucket.org/luciad/webp-imageio while I could also imagine, that one could use libwebp through JNI - had to build that library once for CentOS, to enable support for GIMP; possibly it's even natively available in later versions of Android - because somehow the format needs to be decoded, to display it - and it's the preferred format meanwhile. it seems to be available: android.googlesource.com/platform/external/webp/+/masterRossie
it reads: The Android.bp file creates WebP decoder and encoder static libraries which can be added to any application by adding libwebp-decode and libwebp-encode to static_libs.Rossie
statically linking these libraries seems to be the Android way of doing it - and that's possibly why there is no support in Android Java: android.googlesource.com/platform/external/giflib/+/masterRossie
I meant animated WEBP (like animated GIF). Sorry for that. Animated WEBP is only supported from Android P. Do you know how to use these libraries on Android? Can you make a sample for both animated GIF and animated WebP?Chiropteran
@androiddeveloper I'm afraid this would be a few days of work... there's no ready-made solution to it. in best case I could provide an example, which links these libraries, to have them even accessible in C. for backwards compatibility, one would need to bake them into the *.so and not use the platform native libraries. pie-release-2 might be the branch to use.Rossie
@androiddeveloper found something: github.com/waynejo/android-ndk-gif - it still uses the old tool-chain, but it basically does what I was talking about. webp would need to be handled equally. asking for the example pushed me into the right direction.Rossie
Too bad. Thank you anyway.Chiropteran

I played with GIF images a few years ago. My idea was to decode the GIF image into frames, convert the frames to bitmaps, then create an AnimationDrawable from the bitmaps and the delays between frames. This is the decode class:
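To show where this is heading before the class itself: once the decoder below has parsed the file, assembling the frames into an AnimationDrawable might look roughly like this. This is a hedged sketch, not code from the original answer; GifAction is assumed to be a single-method interface matching the action.parseOk(boolean, int) calls inside the decoder, with -1 assumed to mean "all frames done":

    // hypothetical usage of the GifDecoder class below
    fun buildAnimation(res: Resources, gifBytes: ByteArray, onReady: (AnimationDrawable) -> Unit) {
        val decoder = GifDecoder(gifBytes, GifAction { parseOk, frameIndex ->
            if (parseOk && frameIndex == -1) {
                val animation = AnimationDrawable()
                for (i in 0 until decoder.frameCount) {
                    val bitmap = decoder.getFrameImage(i) ?: continue
                    // addFrame takes the per-frame duration in milliseconds
                    animation.addFrame(BitmapDrawable(res, bitmap), decoder.getDelay(i))
                }
                animation.isOneShot = decoder.loopCount == 1
                onReady(animation)
            }
        })
        decoder.start() // GifDecoder extends Thread, so parsing happens off the UI thread
    }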

public class GifDecoder extends Thread {

public static final int STATUS_PARSING = 0;
public static final int STATUS_FORMAT_ERROR = 1;
public static final int STATUS_OPEN_ERROR = 2;
public static final int STATUS_FINISH = -1;
private static final int MaxStackSize = 4096;
public int width; // full image width
public int height; // full image height
int[] lastPixels;
int[] dest;
private InputStream in;
private int status;
private boolean gctFlag; // global color table used
private int gctSize; // size of global color table
private int loopCount = 1; // iterations; 0 = repeat forever
private int[] gct; // global color table
private int[] lct; // local color table
private int[] act; // active color table
private int bgIndex; // background color index
private int bgColor; // background color
private int lastBgColor; // previous bg color
private int pixelAspect; // pixel aspect ratio
private boolean lctFlag; // local color table flag
private boolean interlace; // interlace flag
private int lctSize; // local color table size
private int ix, iy, iw, ih; // current image rectangle
private int lrx, lry, lrw, lrh;
private GifFrame currentFrame = null;
private boolean isShow = false;
private byte[] block = new byte[256]; // current data block
private int blockSize = 0; // block size
private int dispose = 0;
private int lastDispose = 0;
private boolean transparency = false; // use transparent color
// max decoder pixel stack size
private int delay = 0; // delay in milliseconds
private int transIndex; // transparent color index
// LZW decoder working arrays
private short[] prefix;
private byte[] suffix;
private byte[] pixelStack;
private byte[] pixels;
private GifFrame gifFrame; // frames read from current file
private int frameCount;
private GifAction action = null;
private byte[] gifData = null;
private int gifDataOffset;
private int gifDataLength;

private GifDecoder() {

}

public GifDecoder(byte[] data, GifAction act) {
    this(data, 0, data.length, act);
}

public GifDecoder(byte[] data, int offset, int length, GifAction act) {
    gifData = data;
    action = act;
    gifDataOffset = offset;
    gifDataLength = length;
}

public GifDecoder(InputStream is, GifAction act) {
    in = is;
    action = act;
}

public void run() {
    if (in != null) {
        readStream();
    } else if (gifData != null) {
        readByte();
    }
}

public void free() {
    GifFrame fg = gifFrame;
    while (fg != null) {
        if (fg.image != null) {
            fg.image.recycle();
        }
        fg.image = null;
        fg = null;
        gifFrame = gifFrame.nextFrame;
        fg = gifFrame;
    }
    if (in != null) {
        try {
            in.close();
        } catch (Exception ex) {
        }
        in = null;
    }
    gifData = null;
}

public int getStatus() {
    return status;
}

public boolean parseOk() {
    return status == STATUS_FINISH;
}

public int getDelay(int n) {
    delay = -1;
    if ((n >= 0) && (n < frameCount)) {
        GifFrame f = getFrame(n);
        if (f != null) delay = f.delay;
    }
    return delay;
}

public GifFrame getFrame(int n) {
    GifFrame frame = gifFrame;
    int i = 0;
    while (frame != null) {
        if (i == n) {
            return frame;
        } else {
            frame = frame.nextFrame;
        }
        i++;
    }
    return null;
}

public int[] getDelays() {
    GifFrame f = gifFrame;
    int[] d = new int[frameCount];
    int i = 0;
    while (f != null && i < frameCount) {
        d[i] = f.delay;
        f = f.nextFrame;
        i++;
    }
    return d;
}

public int getFrameCount() {
    return frameCount;
}

public Bitmap getImage() {
    return getFrameImage(0);
}

public Bitmap getFrameImage(int n) {
    GifFrame frame = getFrame(n);
    if (frame == null) {
        return null;
    } else {
        return frame.image;
    }
}

public int getLoopCount() {
    return loopCount;
}

public GifFrame getCurrentFrame() {
    return currentFrame;
}

public void reset() {
    currentFrame = gifFrame;
}

public GifFrame next() {
    if (isShow == false) {
        isShow = true;
        return gifFrame;
    } else {
        if (status == STATUS_PARSING) {
            if (currentFrame.nextFrame != null) currentFrame = currentFrame.nextFrame;
            //currentFrame = gifFrame;
        } else {
            currentFrame = currentFrame.nextFrame;
            if (currentFrame == null) {
                currentFrame = gifFrame;
            }
        }
        return currentFrame;
    }
}

private Bitmap setPixels() {
    if (dest == null) dest = new int[width * height];
    // fill in starting image contents based on last image's dispose code
    if (lastDispose > 0) {
        if (lastDispose == 3) {
            // use image before last
            int n = frameCount - 2;
            if (n > 0) {
                Bitmap lastImage = getFrameImage(n - 1);
                if (lastPixels == null) lastPixels = new int[width * height];
                lastImage.getPixels(lastPixels, 0, width, 0, 0, width, height);
            } else {
                lastPixels = null;
            }
        }
        if (lastPixels != null) {
            dest = Arrays.copyOf(lastPixels, lastPixels.length);
            // copy pixels
            if (lastDispose == 2) {
                // fill last image rect area with background color
                int c = 0;
                if (!transparency) {
                    c = lastBgColor;
                }
                for (int i = 0; i < lrh; i++) {
                    int n1 = (lry + i) * width + lrx;
                    int n2 = n1 + lrw;
                    for (int k = n1; k < n2; k++) {
                        dest[k] = c;
                    }
                }
            }
        }
    }

    // copy each source line to the appropriate place in the destination
    int pass = 1;
    int inc = 8;
    int iline = 0;
    for (int i = 0; i < ih; i++) {
        int line = i;
        if (interlace) {
            if (iline >= ih) {
                pass++;
                switch (pass) {
                    case 2:
                        iline = 4;
                        break;
                    case 3:
                        iline = 2;
                        inc = 4;
                        break;
                    case 4:
                        iline = 1;
                        inc = 2;
                }
            }
            line = iline;
            iline += inc;
        }
        line += iy;
        if (line < height) {
            int k = line * width;
            int dx = k + ix; // start of line in dest
            int dlim = dx + iw; // end of dest line
            if ((k + width) < dlim) {
                dlim = k + width; // past dest edge
            }
            int sx = i * iw; // start of line in source
            while (dx < dlim) {
                // map color and insert in destination
                int index = ((int) pixels[sx++]) & 0xff;
                int c = act[index];
                if (c != 0) {
                    dest[dx] = c;
                }
                dx++;
            }
        }
    }
    return Bitmap.createBitmap(dest, width, height, Config.ARGB_4444);
}

private int readByte() {
    in = new ByteArrayInputStream(gifData, gifDataOffset, gifDataLength);
    gifData = null;
    return readStream();
}

private int readStream() {
    init();
    if (in != null) {
        readHeader();
        if (!err()) {
            readContents();
            if (frameCount < 0) {
                status = STATUS_FORMAT_ERROR;
                action.parseOk(false, -1);
            } else {
                status = STATUS_FINISH;
                action.parseOk(true, -1);
            }
        }
        try {
            in.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    } else {
        status = STATUS_OPEN_ERROR;
        action.parseOk(false, -1);
    }
    return status;
}

private void decodeImageData() {
    int NullCode = -1;
    int npix = iw * ih;
    int available, clear, code_mask, code_size, end_of_information, in_code, old_code, bits,
            code,
            count, i, datum, data_size, first, top, bi, pi;

    if ((pixels == null) || (pixels.length < npix)) {
        pixels = new byte[npix]; // allocate new pixel array
    }
    if (prefix == null) {
        prefix = new short[MaxStackSize];
    }
    if (suffix == null) {
        suffix = new byte[MaxStackSize];
    }
    if (pixelStack == null) {
        pixelStack = new byte[MaxStackSize + 1];
    }
    // Initialize GIF data stream decoder.
    data_size = read();
    clear = 1 << data_size;
    end_of_information = clear + 1;
    available = clear + 2;
    old_code = NullCode;
    code_size = data_size + 1;
    code_mask = (1 << code_size) - 1;
    for (code = 0; code < clear; code++) {
        prefix[code] = 0;
        suffix[code] = (byte) code;
    }

    // Decode GIF pixel stream.
    datum = bits = count = first = top = pi = bi = 0;
    for (i = 0; i < npix; ) {
        if (top == 0) {
            if (bits < code_size) {
                // Load bytes until there are enough bits for a code.
                if (count == 0) {
                    // Read a new data block.
                    count = readBlock();
                    if (count <= 0) {
                        break;
                    }
                    bi = 0;
                }
                datum += (((int) block[bi]) & 0xff) << bits;
                bits += 8;
                bi++;
                count--;
                continue;
            }
            // Get the next code.
            code = datum & code_mask;
            datum >>= code_size;
            bits -= code_size;

            // Interpret the code
            if ((code > available) || (code == end_of_information)) {
                break;
            }
            if (code == clear) {
                // Reset decoder.
                code_size = data_size + 1;
                code_mask = (1 << code_size) - 1;
                available = clear + 2;
                old_code = NullCode;
                continue;
            }
            if (old_code == NullCode) {
                pixelStack[top++] = suffix[code];
                old_code = code;
                first = code;
                continue;
            }
            in_code = code;
            if (code == available) {
                pixelStack[top++] = (byte) first;
                code = old_code;
            }
            while (code > clear) {
                pixelStack[top++] = suffix[code];
                code = prefix[code];
            }
            first = ((int) suffix[code]) & 0xff;
            // Add a new string to the string table,
            if (available >= MaxStackSize) {
                break;
            }
            pixelStack[top++] = (byte) first;
            prefix[available] = (short) old_code;
            suffix[available] = (byte) first;
            available++;
            if (((available & code_mask) == 0) && (available < MaxStackSize)) {
                code_size++;
                code_mask += available;
            }
            old_code = in_code;
        }

        // Pop a pixel off the pixel stack.
        top--;
        pixels[pi++] = pixelStack[top];
        i++;
    }
    for (i = pi; i < npix; i++) {
        pixels[i] = 0; // clear missing pixels
    }
}

private boolean err() {
    return status != STATUS_PARSING;
}

private void init() {
    status = STATUS_PARSING;
    frameCount = 0;
    gifFrame = null;
    gct = null;
    lct = null;
}

private int read() {
    int curByte = 0;
    try {

        curByte = in.read();
    } catch (Exception e) {
        status = STATUS_FORMAT_ERROR;
    }
    return curByte;
}

private int readBlock() {
    blockSize = read();
    int n = 0;
    if (blockSize > 0) {
        try {
            int count = 0;
            while (n < blockSize) {
                count = in.read(block, n, blockSize - n);
                if (count == -1) {
                    break;
                }
                n += count;
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        if (n < blockSize) {
            status = STATUS_FORMAT_ERROR;
        }
    }
    return n;
}

private int[] readColorTable(int ncolors) {
    int nbytes = 3 * ncolors;
    int[] tab = null;
    byte[] c = new byte[nbytes];
    int n = 0;
    try {
        n = in.read(c);
    } catch (Exception e) {
        e.printStackTrace();
    }
    if (n < nbytes) {
        status = STATUS_FORMAT_ERROR;
    } else {
        tab = new int[256]; // max size to avoid bounds checks
        int i = 0;
        int j = 0;
        while (i < ncolors) {
            int r = ((int) c[j++]) & 0xff;
            int g = ((int) c[j++]) & 0xff;
            int b = ((int) c[j++]) & 0xff;
            tab[i++] = 0xff000000 | (r << 16) | (g << 8) | b;
        }
    }
    return tab;
}

private void readContents() {
    // read GIF file content blocks
    boolean done = false;
    while (!(done || err())) {
        int code = read();
        switch (code) {
            case 0x2C: // image separator
                readImage();
                break;
            case 0x21: // extension
                code = read();
                switch (code) {
                    case 0xf9: // graphics control extension
                        readGraphicControlExt();
                        break;
                    case 0xff: // application extension
                        readBlock();
                        String app = "";
                        for (int i = 0; i < 11; i++) {
                            app += (char) block[i];
                        }
                        if (app.equals("NETSCAPE2.0")) {
                            readNetscapeExt();
                        } else {
                            skip(); // don't care
                        }
                        break;
                    default: // uninteresting extension
                        skip();
                }
                break;
            case 0x3b: // terminator
                done = true;
                break;
            case 0x00: // bad byte, but keep going and see what happens
                break;
            default:
                status = STATUS_FORMAT_ERROR;
        }
    }
}

private void readGraphicControlExt() {
    read(); // block size
    int packed = read(); // packed fields
    dispose = (packed & 0x1c) >> 2; // disposal method
    if (dispose == 0) {
        dispose = 1; // elect to keep old image if discretionary
    }
    transparency = (packed & 1) != 0;
    delay = readShort() * 10; // delay is stored in 1/100 s; convert to milliseconds
    transIndex = read(); // transparent color index
    read(); // block terminator
}

private void readHeader() {
    String id = "";
    for (int i = 0; i < 6; i++) {
        id += (char) read();
    }
    if (!id.startsWith("GIF")) {
        status = STATUS_FORMAT_ERROR;
        return;
    }
    readLSD();
    if (gctFlag && !err()) {
        gct = readColorTable(gctSize);
        bgColor = gct[bgIndex];
    }
}

private void readImage() {
    ix = readShort(); // (sub)image position & size
    iy = readShort();
    iw = readShort();
    ih = readShort();
    int packed = read();
    lctFlag = (packed & 0x80) != 0; // 1 - local color table flag
    interlace = (packed & 0x40) != 0; // 2 - interlace flag
    // 3 - sort flag
    // 4-5 - reserved
    lctSize = 2 << (packed & 7); // 6-8 - local color table size
    if (lctFlag) {
        lct = readColorTable(lctSize); // read table
        act = lct; // make local table active
    } else {
        act = gct; // make global table active
        if (bgIndex == transIndex) {
            bgColor = 0;
        }
    }
    // the null check must run before act[transIndex] is dereferenced below
    if (act == null) {
        status = STATUS_FORMAT_ERROR; // no color table defined
    }
    if (err()) {
        return;
    }
    int save = 0;
    if (transparency) {
        save = act[transIndex];
        act[transIndex] = 0; // set transparent color if specified
    }
    try {
        decodeImageData(); // decode pixel data
        skip();
        if (err()) {
            return;
        }
        frameCount++;
        // create new image to receive frame data
        // createImage(width, height);
        Bitmap image = setPixels(); // transfer pixel data to image
        if (gifFrame == null) {
            gifFrame = new GifFrame(image, delay);
            currentFrame = gifFrame;
        } else {
            GifFrame f = gifFrame;
            while (f.nextFrame != null) {
                f = f.nextFrame;
            }
            f.nextFrame = new GifFrame(image, delay);
        }
        // frames.addElement(new GifFrame(image, delay)); // add image to frame
        // list
        if (transparency) {
            act[transIndex] = save;
        }
        resetFrame();
        if (!action.parseOk(true, frameCount)) {
            status = STATUS_FINISH;
            return;
        }
    } catch (OutOfMemoryError e) {
        Log.e("GifDecoder", ">>> log  : " + e.toString());
        e.printStackTrace();
    }
}

private void readLSD() {
    // logical screen size
    width = readShort();
    height = readShort();
    // packed fields
    int packed = read();
    gctFlag = (packed & 0x80) != 0; // 1 : global color table flag
    // 2-4 : color resolution
    // 5 : gct sort flag
    gctSize = 2 << (packed & 7); // 6-8 : gct size
    bgIndex = read(); // background color index
    pixelAspect = read(); // pixel aspect ratio
}

private void readNetscapeExt() {
    do {
        readBlock();
        if (block[0] == 1) {
            // loop count sub-block
            int b1 = ((int) block[1]) & 0xff;
            int b2 = ((int) block[2]) & 0xff;
            loopCount = (b2 << 8) | b1;
        }
    } while ((blockSize > 0) && !err());
}

private int readShort() {
    // read 16-bit value, LSB first
    return read() | (read() << 8);
}

private void resetFrame() {
    lastDispose = dispose;
    lrx = ix;
    lry = iy;
    lrw = iw;
    lrh = ih;
    lastPixels = dest;
    lastBgColor = bgColor;
    dispose = 0;
    transparency = false;
    delay = 0;
    lct = null;
}

/**
 * Skips variable length blocks up to and including next zero length block.
 */
private void skip() {
    do {
        readBlock();
    } while ((blockSize > 0) && !err());
}

}

I uploaded the full demo source here. I hope it can help you.
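To show how the decoded frames above would be consumed, here is a minimal standalone sketch of walking the singly linked frame list that readImage() builds (GifFrame with nextFrame and delay). It is plain Java, with a String stand-in for the Bitmap payload so it runs off-device; Frame and totalDurationMs are hypothetical names, not part of the decoder:

```java
// Simplified stand-in for the decoder's GifFrame node (the real one holds a Bitmap).
final class Frame {
    final String name;   // placeholder for the decoded Bitmap
    final int delayMs;   // per-frame delay, as stored by readGraphicControlExt()
    Frame next;          // equivalent of GifFrame.nextFrame

    Frame(String name, int delayMs) {
        this.name = name;
        this.delayMs = delayMs;
    }
}

public final class FrameWalk {
    // Walk the linked frame list, as a renderer loop would,
    // and return the total duration of one animation cycle.
    static int totalDurationMs(Frame head) {
        int total = 0;
        for (Frame f = head; f != null; f = f.next) {
            total += f.delayMs;
        }
        return total;
    }

    public static void main(String[] args) {
        Frame a = new Frame("frame0", 100);
        a.next = new Frame("frame1", 40);
        System.out.println(totalDurationMs(a)); // 140
    }
}
```

In a real renderer you would post each frame's bitmap to the display, wait delayMs, then follow nextFrame (wrapping to the head for looping animations).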

Zima answered 12/11, 2018 at 11:12 Comment(16)
Impressive! You've implemented it all in Java! Did you do it for WebP too? Also, you said you decode all frames. Do you mean you also save all of them? If so, this could lead to OOM... I think you shouldn't use an infinite-sized cache for the bitmaps; let the developer choose how many bitmaps may be cached, based on the number of frames and the resolution of each bitmap. Also, why didn't you put the project on GitHub instead? It could be great there...Chiropteran
Looking at the code and trying to measure the total amount of memory the bitmaps use, it reaches 38,400,000 bytes for the GIF animation. While this is very efficient in terms of CPU (about 1% usage compared to 3% in my case, where I didn't cache at all, as it might be done in JNI anyway), it's unbounded, so it could lead to memory issues and crashes depending on the input GIF file. I've also noticed your solution doesn't work well in the GifDrawableImageView class, as it doesn't scale to the various scale types available, but that's a different issue.Chiropteran
I know. This is the big issue I faced when playing with GIFs. It takes a lot of memory. I have to release it whenever the reference count reaches 0 and manage the memory manually by myself. If you use Glide or Ionic, they will help you control memory usage. Another option is using a WebView, but I don't like it :)Zima
To reduce bitmap size, I have to decrease the image quality (you can find it in the GifDecoder class). And I calculate the screen size, making it smaller than the screen width and height...Zima
Yes, I get it. I wrote this code 4-5 years ago and no longer work on it. Just hoping it helps you somewhat :)Zima
That's very cool. Can you do it for WebP though? The very basic functionality, of course...Chiropteran
I have just a little experience with WebP. Did you take a look at Fresco: github.com/facebook/fresco? It supports both GIF and WebP on Android and is customizable.Zima
I have, but they don't offer what I've written. They gave me a very hacky way to get the bitmaps: github.com/facebook/fresco/issues/2229Chiropteran
I've tried to use your code while avoiding an infinite-sized cache (now you can set the cache size, and it's optional), but it seems the GifDecoder class alone uses unbounded memory. Here's a sample project: s000.tinyupload.com/?file_id=80187180417393852887 . Can you please have a look?Chiropteran
I read your code; it's quite simple, so I can't imagine the problem. Every time GifDecoder makes a bitmap from a frame, it must take memory, of course. So I think the question is: when do you release the memory holding the bitmap, using bitmap.recycle()? I release the memory when the image view is removed, in onDetachedFromWindow.Zima
The problem is that the more frames the GIF has, the more memory it will use (especially at high resolution). It has an infinite cache size, so it can crash if there is too much to hold. If you try my sample and look at the profiler, you can see the app uses 600MB to play the GIF animation, compared to around 70MB for other solutions... The good thing is that the CPU usage is lower, but memory is quite important on smartphones. There should be a balance, and a fallback for when there isn't enough memory. Memory is never infinite. The app can crash from this behavior.Chiropteran
And as a bonus, because it first caches all frames, it also takes time until it actually starts playing. The way it should work is that while it decodes (or fetches from cache), it sends the bitmap to be shown, and puts it in an optional cache if possible (meaning there is enough free space in the cache that was defined).Chiropteran
Correction: it's not 70MB, but it's much lower than 600MB.Chiropteran
Do you know perhaps how to avoid this?Chiropteran
So sorry, I was busy last week. I think the only option we have is to decrease the bitmap size when decoding frames. Did you compare this approach with Glide and Fresco? They might have better results.Zima
I compared your solution (even after I modified it to try to avoid caching) with Glide, and also with the new Android 9 API (ImageDecoder). Your solution has lower CPU usage, which I think is thanks to the caching. So what I think should be done is having an optional cache with a maximum limit (the developer chooses the cache size).Chiropteran
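The bounded, developer-sized cache discussed in these comments could be sketched like this. It is a plain-Java stand-in using LinkedHashMap in access order rather than android.util.LruCache (so it runs off-device); FrameCache and its members are hypothetical names, and on a device the values would be Bitmaps:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a bounded frame cache: evicts the least-recently-used frame
// once maxFrames is exceeded, so memory stays capped regardless of GIF length.
public final class FrameCache<V> {
    private final int maxFrames;
    private final LinkedHashMap<Integer, V> map;

    public FrameCache(int maxFrames) {
        this.maxFrames = maxFrames;
        // accessOrder=true keeps iteration order least-recently-used first
        this.map = new LinkedHashMap<Integer, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, V> eldest) {
                return size() > FrameCache.this.maxFrames;
            }
        };
    }

    public void put(int frameIndex, V frame) { map.put(frameIndex, frame); }

    // Returns null on a miss; the caller then re-decodes that frame.
    public V get(int frameIndex) { return map.get(frameIndex); }

    public int size() { return map.size(); }

    public static void main(String[] args) {
        FrameCache<String> cache = new FrameCache<>(2);
        cache.put(0, "frame0");
        cache.put(1, "frame1");
        cache.get(0);            // touch frame 0: frame 1 is now the eldest
        cache.put(2, "frame2");  // evicts frame 1
        System.out.println(cache.get(1)); // null (miss: re-decode)
    }
}
```

On a device one would more likely use android.util.LruCache and size it in bytes via sizeOf() rather than by frame count, but the eviction idea is the same.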
F
0

EDIT: In actually implementing this, I encountered a couple of unexpected problems, but nothing insurmountable:

  1. AnimatedImageDrawable seems to ignore its configured bounds. I scaled the canvas instead.
  2. For reasons I don't understand, AnimatedImageDrawable.draw() occasionally neglects to schedule the next frame. I decided to call the function twice. The second time, I translate the canvas so that all drawing is out of bounds, which should allow most of the work to be optimized away.

Here's the sample code.

import android.annotation.*;
import android.graphics.*;
import android.graphics.drawable.*;
import android.os.*;
import android.service.wallpaper.*;
import android.support.annotation.*;
import android.view.*;

@TargetApi(28)
public class TestWallpaper extends WallpaperService
{
    @Override public Engine onCreateEngine()
    {
        return new Engine();
    }

    private class Engine extends WallpaperService.Engine implements Drawable.Callback
    {
        private final Drawable d;
        private final Handler h = new Handler();

        private float scaleX, scaleY;

        private Engine()
        {
            this.setOffsetNotificationsEnabled(false);
            Drawable d = null;
            try
            {
                d = ImageDecoder
                    .decodeDrawable(ImageDecoder.createSource(getResources(), R.drawable.test));
                d.setCallback(this);
                // AnimatedImageDrawable seems to ignore its configured bounds and use its
                // intrinsic bounds instead.
                // In case they fix this bug, we'll go ahead and request the current
                // behavior, and then before drawing we'll transform the canvas to compensate
                d.setBounds(0, 0, d.getIntrinsicWidth(), d.getIntrinsicHeight());
                if (d instanceof AnimatedImageDrawable)
                {
                    final AnimatedImageDrawable anim = (AnimatedImageDrawable) d;
                    anim.setRepeatCount(AnimatedImageDrawable.REPEAT_INFINITE);
                    anim.start();
                }
            }
            catch (Throwable t) // should never happen
            {
                t.printStackTrace();
            }
            this.d = d;
        }

        @Override public void invalidateDrawable(@NonNull Drawable _d)
        {
            if(isVisible())
                draw(getSurfaceHolder().getSurface());
        }

        @Override public void scheduleDrawable(@NonNull Drawable _d, @NonNull Runnable _r, long _at)
        {
            if(isVisible())
                h.postAtTime(_r, _d, _at);
        }

        @Override public void unscheduleDrawable(@NonNull Drawable _d, @NonNull Runnable _r)
        {
            h.removeCallbacks(_r, _d);
        }

        @Override public void onSurfaceChanged(SurfaceHolder _sh, int _format, int _w, int _h)
        {
            scaleX = (float) _w / d.getIntrinsicWidth();
            scaleY = (float) _h / d.getIntrinsicHeight();
            draw(_sh.getSurface());
        }

        @Override public void onSurfaceRedrawNeeded(SurfaceHolder _sh)
        {
            draw(_sh.getSurface());
        }

        private void draw(Surface _s)
        {
            try
            {
                final Canvas c = _s.lockCanvas(null);
                c.scale(scaleX, scaleY);
                d.draw(c);
                // Sometimes AnimatedImageDrawable neglects to schedule the next frame
                // after only one draw() of the current frame, so we'll draw() it again,
                // but outside the canvas this time
                c.translate(Float.MAX_VALUE, Float.MAX_VALUE);
                d.draw(c);
                //
                _s.unlockCanvasAndPost(c);
            }
            catch (Throwable t)
            {
                t.printStackTrace();
                // Most likely, the surface was destroyed while we were using it
                // The new one will be delivered to onSurfaceChanged and we'll be fine
            }
        }

        @Override public void onVisibilityChanged(boolean _visible)
        {
            super.onVisibilityChanged(_visible);
            if(_visible)
                draw(getSurfaceHolder().getSurface());
            else
                h.removeCallbacksAndMessages(null);
        }
    }
}
Faradmeter answered 17/12, 2018 at 13:8 Comment(4)
Even though I've already solved it, this could be interesting. Could you please share a full sample to try it out?Chiropteran
You didn't have to demonstrate it directly on a live wallpaper (though it is more efficient this way, I think). Using a Bitmap in an ImageView was fine... Have you tested it on long, high-quality animations? I remember that when I tried the new API, the callbacks worked fine in the beginning, but after some time they stopped, probably because of some kind of caching.Chiropteran
I grabbed an interesting GIF from the Internet. Was it "long" and "high-quality"? Those aren't very objective qualifications. Feel free to try it on whatever GIF you want to use it on. And as I mentioned above, I noticed and worked around the problem where callbacks stop.Faradmeter
I'm having some difficulties trying out this code. Would you mind sharing the project? Maybe on GitHub?Chiropteran
