Picture entropy calculation

I've run into a nasty problem with my recorder. Some people are still using it with analog tuners, and analog tuners have a tendency to spit out 'snow' if there is no signal present.

The problem is that when noise is fed into the encoder, it goes completely crazy: it first consumes all the CPU and then ultimately freezes. Since the main point of the recorder is to stay up and running no matter what, I have to figure out how to deal with this, so the encoder won't be exposed to data it can't handle.

So, the idea is to create an 'entropy detector': a small and simple routine that will go through the frame buffer data and calculate an entropy index, i.e. a measure of how random the data in the picture actually is.

The result of the routine would be a number that is 0 for a completely black picture and 1 for a completely random picture - snow, that is.

The routine itself should be forward-scanning only, with a few local variables that would fit nicely into registers.

I could use zlib or the 7z API for such a task, but I would really like to cook up something of my own.

Any ideas?

Demented answered 5/12, 2010 at 10:1 Comment(2)
I think that the first question should be "why your encoder can't handle any valid input".Anderegg
I am using Windows Media Encoder and I can't really do anything there except try not to feed it stuff that it can't process, or can only process with great hardship. And we all know that noise is trouble for any kind of compression algorithm.Spinks

PNG works this way (approximately): for each pixel, replace its value with that value minus the value of the pixel to its left. Do this from right to left, so that each subtraction still sees the original value of its left neighbour.

Then you can calculate the entropy (bits per character) by building a table of how often each value now appears, turning those absolute counts into relative frequencies p, and summing -p*log2(p) over all elements (the Shannon entropy).

Oh, and you have to do this for each color channel (r, g, b) separately.

For the result, take the average of the bits per character over the channels and divide it by 8 (the maximum entropy, assuming that you have 8 bits per color), which gives you a number between 0 and 1.
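
As a rough illustration (not part of the original answer), here is a minimal C sketch of the scheme, with one simplification: instead of a per-row left prediction it uses a single forward-scanning running difference over the buffer, which matches the "forward scanning only" requirement from the question. It assumes an 8-bit single-channel buffer; for interleaved RGB you would either run it once per plane and average the three results, or feed it a luminance plane as suggested in the comments below.

```c
#include <math.h>
#include <stddef.h>
#include <stdint.h>

/*
 * Minimal sketch of the idea above, assuming an 8-bit single-channel
 * buffer (one colour plane, or a luminance plane).  Returns a value in
 * [0, 1]: close to 0 for a flat/black picture, close to 1 for snow.
 */
double picture_entropy(const uint8_t *buf, size_t len)
{
    uint32_t hist[256] = { 0 };
    uint8_t prev = 0;

    if (len == 0)
        return 0.0;

    /* Forward-scanning variant of the left-prediction step: histogram
     * the (wrapping) difference between each byte and the previous one. */
    for (size_t i = 0; i < len; i++) {
        uint8_t diff = (uint8_t)(buf[i] - prev);
        prev = buf[i];
        hist[diff]++;
    }

    /* Shannon entropy of the differences, H = -sum(p * log2(p)), in bits. */
    double h = 0.0;
    for (int v = 0; v < 256; v++) {
        if (hist[v] == 0)
            continue;
        double p = (double)hist[v] / (double)len;
        h -= p * log2(p);
    }

    /* 8 bits is the maximum possible entropy for 8-bit samples,
     * so dividing by 8 maps the result into [0, 1]. */
    return h / 8.0;
}
```

A frame scoring close to 1 could then be treated as noise and dropped before it ever reaches the encoder.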

Mclaughlin answered 5/12, 2010 at 10:6 Comment(3)
What if we drop the 'meaning' from the data and assume that we operate on a pure byte* - we won't get accurate entropy, but we'll still have a number to measure against?Spinks
Calculating a difference between different channels ('red' and 'green') may create the impression that the entropy is high when it is actually small.Dola
I could crush RGB down to luminance only and work my way from there. . .Spinks
