What is the OpenCV template matching min/max value range? It needs to be used as a threshold (C++/Java)
I am creating a simple OpenCV application using template matching, where I need to find a small image in a big image and return the result as true (a match was found) or false (no match was found).

    Imgproc.matchTemplate(largeImage, smallImage, result, matchMethod);
    Core.normalize(result, result, 0, 1, Core.NORM_MINMAX, -1, new Mat());

    MinMaxLocResult mmr = Core.minMaxLoc(result);

    double minMaxValue = 1;
    if (matchMethod== Imgproc.TM_SQDIFF || matchMethod== Imgproc.TM_SQDIFF_NORMED)
    {
        minMaxValue = mmr.minVal;
        useMinThreshold = true;
    }
    else
    {
        minMaxValue = mmr.maxVal;
    }

Now the problem is making the decision (true/false) using this minMaxValue. I know that the two methods TM_SQDIFF and TM_SQDIFF_NORMED return low values for good matches while the other methods return high values, so I can keep two different thresholds and compare against one of them depending on the matching method.

So it would be great if someone could explain the range of the minVal and maxVal values that MinMaxLocResult returns.

Is it the 0 to 1 range?

If yes, for the max-type matching methods, is a value of 1 a perfect match?

Sonde answered 22/7, 2013 at 10:6 Comment(0)
MinMaxLocResult does not return a range; minVal and maxVal are simply the minimum and maximum matching scores found in the result matrix.

The MinMaxLocResult structure also has minLoc and maxLoc properties, which are of type Point and give the matching locations. Given that you use TM_SQDIFF or TM_SQDIFF_NORMED as the matching criterion, the best matching location will be mmr.minLoc.

In order to set a threshold for the detection, you can declare a variable double thresholdMatch and set its value experimentally. If minVal < thresholdMatch, then it can be said that the target object is detected.
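
A minimal sketch of that check, assuming result is the output Mat of Imgproc.matchTemplate called with TM_SQDIFF or TM_SQDIFF_NORMED; the 0.1 starting value is only illustrative and really only makes sense for the normalized variant:

    double thresholdMatch = 0.1; // set experimentally for your images
    MinMaxLocResult mmr = Core.minMaxLoc(result);
    boolean targetDetected = mmr.minVal < thresholdMatch; // lower score = better match
    Point bestMatch = mmr.minLoc; // top-left corner of the best candidate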

Lamed answered 22/7, 2013 at 10:20 Comment(10)
Thanks Thomas, so you mean that the decision (true/false) can be taken using the minLoc or maxLoc instead of minVal/maxVal? – Sonde
No, template matching does not give any info about whether you found the exact object or not. Rather, it says "the best matching (potential) object is in this location", and that is minLoc in this particular case. – Lamed
OK, in this particular example I can disregard scale/rotation/skew issues. Still, is template matching not the best way to come to a conclusion for this purpose (return a true/false)? I understand the point you emphasized, i.e. it says "the best matching (potential) object is in this location", but I still need to know whether there is a threshold-style approach to conclude whether the object was found. Maybe not with MinMaxLocResult, but are there any alternatives? It doesn't matter even if it's a feature descriptor extraction technique. – Sonde
@Emily Webb, you can set a threshold as can be seen in the edited answer. – Lamed
Thanks, and this is exactly what I did, as I explained in the original post, but I was not sure about the possible value range of minVal/maxVal. I guess it's 0 to 1. So depending on the template matching method I go for either minVal < thresholdMatchForMin or maxVal > thresholdMatchForMax, right? – Sonde
@Emily Webb, of course; if the method is TM_SQDIFF_NORMED, the minVal/maxVal range will be 0 to 1. – Lamed
Finally solved this issue as per the last comments, but removed the normalizing to get better results. – Sonde
@Tom_Crusoe I'm having the same problem; I also want a true/false for whether the image was found in the end. But my minVal values for 3 different tests are 4.54..., 0, and -1.86. They are not between 0 and 1, yet all 3 tests were successfully found in the source image. I really can't decide what to set the threshold to... – Achitophel
@Anarkie, did you normalize the matching result with Core.normalize(result, result, 0, 1, Core.NORM_MINMAX, -1, new Mat());? Here 0 and 1 are the lower and upper boundaries for normalization. – Lamed
@Tom_Crusoe yes, here is my code: pastebin.com/PpY8zhHk, using Imgproc.TM_SQDIFF. – Achitophel
Don't normalize the result; then it will give the proper value. I mean, remove this line:

   Core.normalize(result, result, 0, 1, Core.NORM_MINMAX, -1, new Mat());
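
A minimal sketch of the idea, assuming TM_SQDIFF_NORMED and reusing the question's largeImage/smallImage Mats; without the normalize call, minVal keeps its real matching score (0 for a perfect match, larger for worse ones), so you can log it on known-positive and known-negative images to pick a threshold:

    Mat result = new Mat();
    Imgproc.matchTemplate(largeImage, smallImage, result, Imgproc.TM_SQDIFF_NORMED);
    // Note: no Core.normalize(...) here, so the score is comparable across images.
    MinMaxLocResult mmr = Core.minMaxLoc(result);
    System.out.println("raw TM_SQDIFF_NORMED score: " + mmr.minVal);
    boolean found = mmr.minVal < 0.1; // placeholder threshold chosen from such logs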
Kwarteng answered 17/12, 2013 at 7:27 Comment(1)
How do I understand whether it's a proper value? @Mehul Thakkar – Horselaugh
faithk's answer is excellent, but here is some actual code implementing it in essence. I got good results using 0.1 as the threshold:

import lombok.val;
import org.opencv.core.*;
import org.springframework.core.io.ClassPathResource;

import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;

import static javax.imageio.ImageIO.read;
import static javax.imageio.ImageIO.write;
import static javax.swing.SwingUtilities.invokeAndWait;
import static org.opencv.core.CvType.CV_32FC1;
import static org.opencv.highgui.HighGui.imshow;
import static org.opencv.highgui.HighGui.waitKey;
import static org.opencv.imgcodecs.Imgcodecs.CV_LOAD_IMAGE_UNCHANGED;
import static org.opencv.imgcodecs.Imgcodecs.imdecode;
import static org.opencv.imgproc.Imgproc.*;

public class TemplateMatcher
{
    static
    {
        // The OpenCV native library must be loaded before any OpenCV call,
        // e.g. System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        // loadNativeOpenCVLibrary();
    }

    private static final int MATCH_METHOD = TM_SQDIFF_NORMED;

    private static Mat BufferedImage2Mat(BufferedImage image) throws IOException
    {
        try (val byteArrayOutputStream = new ByteArrayOutputStream())
        {
            write(image, "jpg", byteArrayOutputStream);
            byteArrayOutputStream.flush();
            val matOfByte = new MatOfByte(byteArrayOutputStream.toByteArray());
            return imdecode(matOfByte, CV_LOAD_IMAGE_UNCHANGED);
        }
    }

    public static Point performTemplateMatching(BufferedImage bigImage, BufferedImage templateImage,
                                                double detectionThreshold, boolean showMatch) throws IOException
    {
        val image = BufferedImage2Mat(bigImage);
        val template = BufferedImage2Mat(templateImage);

        // Create the result matrix
        val result_cols = image.cols() - template.cols() + 1;
        val result_rows = image.rows() - template.rows() + 1;
        val result = new Mat(result_rows, result_cols, CV_32FC1);

        // Do the matching
        matchTemplate(image, template, result, MATCH_METHOD);

        // Localize the best match
        val minMaxLocResult = Core.minMaxLoc(result);

        // Draw a rectangle around the best match (minLoc, since TM_SQDIFF_NORMED is used)
        val matchedLocation = minMaxLocResult.minLoc;
        rectangle(image, matchedLocation, new Point(matchedLocation.x + template.cols(),
                matchedLocation.y + template.rows()), new Scalar(0, 255, 0));

        if (showMatch)
        {
            try
            {
                invokeAndWait(() -> imshow("Image Search", image));
            } catch (InterruptedException | InvocationTargetException exception)
            {
                exception.printStackTrace();
            }
            waitKey();
        }

        // Determine whether this sub-image has been found (TM_SQDIFF_NORMED: lower is better)
        val minVal = minMaxLocResult.minVal;
        if (minVal < detectionThreshold)
        {
            return minMaxLocResult.minLoc;
        }

        return null;
    }

    public static BufferedImage getBufferedImage(String classpathFile) throws IOException
    {
        val classPathResource = new ClassPathResource(classpathFile);
        val filePath = classPathResource.getFile();
        return read(filePath);
    }
}
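
A quick usage sketch; the classpath file names here are made up for illustration, and the call should sit in a method that declares throws IOException:

    // Hypothetical example: look for "button.png" inside "screenshot.png".
    BufferedImage bigImage = TemplateMatcher.getBufferedImage("screenshot.png");
    BufferedImage template = TemplateMatcher.getBufferedImage("button.png");
    Point match = TemplateMatcher.performTemplateMatching(bigImage, template, 0.1, false);
    System.out.println(match != null ? "Found at " + match : "Not found");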
Trammel answered 11/8, 2018 at 14:47 Comment(0)
TERMS

  • template = image we’re trying to find
  • haystack = image we’re searching in
  • region = area in haystack that we’re currently matching against the template

The min and max values will have different possible ranges depending on the template matching type. The result of a comparison for each location in the haystack is determined by these formulas (taken from the OpenCV docs; T() is the template, I() is the haystack):

(image: the template matching formulas from the OpenCV documentation)
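
For reference, a LaTeX rendition of the two squared-difference formulas as given in the OpenCV documentation (the correlation-based methods follow the same summation pattern, with products instead of squared differences):

    % TM_SQDIFF: sum of squared differences between template and region
    R(x,y) = \sum_{x',y'} \left( T(x',y') - I(x+x',\, y+y') \right)^2

    % TM_SQDIFF_NORMED: the same sum, normalized by the template/region energies
    R(x,y) = \frac{\sum_{x',y'} \left( T(x',y') - I(x+x',\, y+y') \right)^2}
                  {\sqrt{\sum_{x',y'} T(x',y')^2 \cdot \sum_{x',y'} I(x+x',\, y+y')^2}}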

As you noted, the squared difference (SQDIFF) scores get larger as the difference between the template and the region gets larger, so the best match will have the lowest value. For the other methods (cross correlation, correlation coefficient), the best match will have the highest value.

The range itself is tough to determine if you don't understand the math (like me), but looking at squared difference I think the range would be (assuming the images are 1-byte grayscale):

0 ... (255 * 255 * template.width * template.height)
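
For example, for a hypothetical 100 x 50 pixel grayscale template, that upper bound would be 255 * 255 * 100 * 50 = 325,125,000 (the worst case, where every template pixel is 255 and every region pixel is 0, or vice versa).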

And for the normalized versions, the range should be:

0 ... 1

Chateau answered 7/4, 2020 at 23:2 Comment(0)
