Curve Fitting in 2D Images With Missing Data

I have split contours in an image and I want to connect them, assuming they are parametrized curves. I was considering some curve fitting or curve tracing, but I do not know how to implement it.

The split contours are exactly 1 px wide:

import itertools
import cv2

# Get all pixel directions
directions = list(itertools.product([-1, 0, 1], repeat=2))
directions.remove((0, 0))

# Load image
img = cv2.imread("image.png", cv2.IMREAD_COLOR)
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # imread returns BGR, not RGB

cv2.imshow("Input", img)

# Find contours
_, thresh = cv2.threshold(gray, 25, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(thresh, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)


I know where the ends of all the contours are:

for cnt in contours:   
    ends = [] 

    # Get ends of contour
    # - the first point returned by findContours is not always an end of the open contour
    # - the last point is usually not the other end either

    for pix in cnt:
        pixel = tuple(pix[0])
        pixel_x, pixel_y = pixel

        total_neighbours = 0

        # Count pixel neighbours
        for plus_x, plus_y in directions:
            current_x = pixel_x + plus_x
            current_y = pixel_y + plus_y

            if current_y < 0 or current_y >= gray.shape[0]:
                continue

            if current_x < 0 or current_x >= gray.shape[1]:
                continue

            if gray[current_y][current_x]:
                total_neighbours += 1

        # If it is end pixel
        if total_neighbours == 1:
            ends.append(pixel)
            cv2.circle(img, pixel, 3, (0, 255, 255), 1)

(output image: detected contour end points circled on the input)

As a human, I can see where the curves are and which contours are part of each unique curve:

ends of contours

To help picture the problem, here are the raw source images:

Source image 1 Source image 2

But I do not know how to connect these split contours programmatically. I tried using the first and second derivatives to predict the continuation, but it was not good enough:

    # Make contour "predictions" using the first derivative
    # - walk from one end of the contour to the other and collect first-derivative estimates
    # - "predict" the continuation past the second end

    for end in ends:
        pixel_x, pixel_y = end

        def predict(pixel_x, pixel_y, was):
            for plus_x, plus_y in directions:
                current_x = pixel_x + plus_x
                current_y = pixel_y + plus_y

                if current_y < 0 or current_y >= gray.shape[0]:
                    continue

                if current_x < 0 or current_x >= gray.shape[1]:
                    continue

                if (current_x, current_y) not in ends:
                    if (current_x, current_y) not in was:
                        if gray[current_y][current_x]:
                            was.append((current_x, current_y))
                            predict(current_x, current_y, was)

                else:
                    derivatives_x = []
                    derivatives_y = []

                    # Calculate derivative
                    for pix in range(len(was) - 3):
                        x1, y1 = was[pix]
                        x2, y2 = was[pix + 3]

                        derivatives_x.append(x1 - x2)
                        derivatives_y.append(y1 - y2)

                    if not derivatives_x:
                        continue

                    # Get last N derivatives and average them
                    avg_x = derivatives_x[-20:]
                    avg_y = derivatives_y[-20:]

                    avg_x = sum(avg_x) / len(avg_x)
                    avg_y = sum(avg_y) / len(avg_y)

                    # Predict N pixels
                    for i in range(7):
                        pos = round(x2 - avg_x * i), round(y2 - avg_y * i)
                        cv2.circle(img, pos, 1, (0, 0, 255), cv2.FILLED)

        predict(pixel_x, pixel_y, [])

cv2.imshow("First derivative", img)
cv2.waitKey(0)

first derivative second derivative

Does anybody know how to fit some curve/spline/Bézier/... to this and recognize the curves the way a human would?
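
To clarify what I mean by spline fitting, here is roughly the kind of thing I imagine (only a sketch, not working code for my data - it assumes the points of one contour are already ordered from one end to the other, and the smoothing value is arbitrary):

import numpy as np
from scipy import interpolate

def fit_and_extend(points, extend_px=7, smoothing=5.0):
    # points: ordered (x, y) samples along one open contour
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]

    # Parametric fit, so the curve may wrap back on itself (no y = f(x) assumption)
    tck, u = interpolate.splprep([x, y], s=smoothing)

    # Densely sample the fitted spline
    fx, fy = interpolate.splev(np.linspace(0, 1, 10 * len(pts)), tck)

    # Extrapolate a few pixels past the end by evaluating slightly beyond u = 1
    # (only trustworthy for a short distance)
    arc_len = np.hypot(np.diff(fx), np.diff(fy)).sum()
    du = extend_px / max(arc_len, 1e-6)
    ex, ey = interpolate.splev(np.linspace(1.0, 1.0 + du, extend_px), tck)

    return np.column_stack([fx, fy]), np.column_stack([ex, ey])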

Here are more binarized source images:

another binarized image 1 another binarized image 2

another binarized image 5 another binarized image 4 another binarized image 3

Thanks.

Hexane asked 24/9, 2021 at 22:38 Comment(16)
If you have those curves, and the first few derivatives, I'd think that should be enough! What were your results? - Kidskin
Sorry, I forgot to upload those results. I updated the question with the first- and second-derivative results. It is better than nothing, but some curves were not recognized correctly. - Hexane
Some smoothing, or a wider radius for assessing curvature, might help. Otherwise you're at the mercy of single pixels. Just guessing. - Kidskin
Thanks! And how can I implement pixel smoothing? - Hexane
Some of your derivative extrapolations seem "off"; you may want to play a bit with how many points back you sample your segment to produce the derivatives. - Markson
Yes, you are right. I tried a lot of variations - basically I am averaging the last 5/10/20 derivatives and then predicting another few pixels. Every option is good for some contours; when I change the averaging window, some predictions get better and some unfortunately get worse. Maybe I can try a median instead of averaging. - Hexane
OK, so the median is not a good choice at all. - Hexane
Ya, I was running into the same thing (changing how far back I ran my derivative would help some and hurt others); I think it may need to be a numerically varied parameter. I would suggest switching to trig-based derivatives though, since they are better for non-functional curves. - Markson
I also think that iterative rounding is rearing its head (I tried to combat it by keeping my points spread out a bit in the splines), but a better solution may be to just keep the splines in double precision and only drop them onto the integer image grid for drawing. - Markson
Exactly! I am doing it the same way. But I still have spline fitting in my head as the cleanest way. - Hexane
You should leave the question open then (unaccept my answer)! Someone may have a better idea/strategy than me. It is a neat problem with a good example dataset, so I expect you may get some more answers if you leave it open for a few days. - Markson
If you would post your current code such that it gets people from your source image up to the point of extrapolation (in Python), and unaccept my answer, I will put a bounty on this question to try to get some more ideas in the mix. (It took me about half the time just catching up to you, so adding that code will make it way easier for others to jump in and try stuff.) - Markson
I think I agree with you that spline fitting may be superior to a purely numerical extrapolation strategy. I am going to rework my function a bit. - Markson
When I see a problem like this, my first question is: how were these lines obtained, and can we do better given the original input images? - Nightshirt
Of course we can! I just wanted to illustrate the concept of fitting curves to contours, because I could not find that concept of fitting splines to non-functions anywhere. I added the raw images (sorry that I did not upload them at the beginning). But if you have any ideas on how to solve the problem another way, I would like to hear them! As a second approach I was considering some optimization task, but that is another approach I can't implement. - Hexane
Are you aware of this paper? - Calton

(result image: the split contours connected into continuous curves)

Extrapolation is notoriously vulnerable to initial conditions (in this case, the state of the spline where the cut was made). That being said, I do believe that the first and second derivatives are sufficient for reasonable solutions to these data sets.

I think there are two primary mistakes in your extrapolation code: (1) first- and second-derivative calculations on integer pixel data pick up extra noise from the discrete nature of integer space (splines are continuous). This can be solved by down-sampling your integer-precision curves (using a voxel grid filter or something similar) into double-precision curves. (2) You are calculating the second derivative as the cartesian difference between two subsequent direction vectors along the spline. A superior strategy is to take the angular change between subsequent direction vectors instead. In addition to being more accurate, this lets splines wrap back on themselves and propagates the curvature continuously and without loss.
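
To make point (2) concrete, here is a stripped-down sketch of the idea (written in Python for brevity - it is not the code I ran, the full C++ below is): the "second derivative" is an average angle change per unit of arc length, and each extrapolation step rotates the current heading by that amount instead of adding a cartesian difference:

import math

def curvature(points):
    # Average signed angle change per unit of arc length over ordered (x, y) points
    total_angle, total_dist = 0.0, 0.0
    for (x1, y1), (x2, y2), (x3, y3) in zip(points, points[1:], points[2:]):
        a1 = math.atan2(y2 - y1, x2 - x1)
        a2 = math.atan2(y3 - y2, x3 - x2)
        turn = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
        total_angle += turn
        total_dist += (math.hypot(x2 - x1, y2 - y1) + math.hypot(x3 - x2, y3 - y2)) / 2
    return total_angle / total_dist if total_dist else 0.0

def extrapolate(points, steps=10, step_px=1.0, lookback=15):
    # Extend the curve by rotating the heading by curvature * step length at each step
    pts = list(points)
    heading = math.atan2(pts[-1][1] - pts[-2][1], pts[-1][0] - pts[-2][0])
    for _ in range(steps):
        heading += curvature(pts[-lookback:]) * step_px
        x, y = pts[-1]
        pts.append((x + step_px * math.cos(heading), y + step_px * math.sin(heading)))
    return pts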

The way I determined collisions was by extrapolating each curve against each other curve. After each extrapolation step, 3 estimates of fitness are assessed to find a local minimum (and subsequently the best fitness for combining that pair of curves): (1) the cartesian distance between the endpoints of the two curves, (2) the angular distance between the endpoint directions (a 180 degree offset is optimal), and (3) the cumulative extrapolation distance. The total fitness is a weighted sum of these 3 values. Extrapolation of each test candidate continues until the total fitness stops decreasing. After testing every combination of every curve, if the best fitness is below a configurable threshold, those two curves are combined and the process is repeated.
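
In Python-style pseudocode the fitness term looks roughly like this (the weights correspond to distWeighting, angleWeighting and travelWeighting in the code below; nothing here is part of the working implementation):

import math

def pair_fitness(p1, dir1, p2, dir2, travelled,
                 w_dist=1.0, w_angle=10.0, w_travel=0.05):
    # p1/p2: endpoint coordinates, dir1/dir2: unit heading vectors at those endpoints,
    # travelled: cumulative extrapolation distance spent so far; lower result is better
    dist = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    angle = abs(math.atan2(dir1[1], dir1[0]) - math.atan2(dir2[1], dir2[0]))
    angle = min(angle, 2 * math.pi - angle)  # unsigned angle between the two headings
    angle_err = abs(math.pi - angle)         # headings pointing at each other are ideal
    return w_dist * dist + w_angle * angle_err + w_travel * travelled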

Each time ideal candidates for spline merging are found, both are extrapolated until they meet in the middle. Those extrapolated points are then averaged using sinusoidal weighting to improve the curvature and continuity of the merged spline. The three point sets (spline 1, bridge, spline 2) are then concatenated and the spline is reformed.
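
The bridge itself is a simple sinusoidal blend; roughly (again a Python sketch, mergeSplines below is the real implementation):

import math

def blend_bridge(run1, run2):
    # run1, run2: the two overlapping extrapolated point runs, equal length, both
    # ordered from spline 1 towards spline 2; the blend weight ramps from 0 to 1
    n = len(run1)
    bridge = []
    for i, ((x1, y1), (x2, y2)) in enumerate(zip(run1, run2)):
        alpha = math.sin((math.pi / 2.0) * i / max(n - 1, 1))
        bridge.append((alpha * x2 + (1 - alpha) * x1, alpha * y2 + (1 - alpha) * y1))
    return bridge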

Code (C++11, OpenCV):

#include <stdio.h>
#include <opencv2/opencv.hpp>
#include <Windows.h>
#include <string>

#define M_PI 3.14159

using namespace std;
using namespace cv;

double sqDist(Point p1, Point p2)
{
    return std::pow((double)p1.x - p2.x,2) + std::pow((double)p1.y - p2.y,2);
}

vector<Point2d> resampleContour(vector<Point> originalContour, int voxelWidth_px = 3)
{
    //quick and dirty voxel filter: downsample the contour to voxelWidth_px spacing
    vector<Point2d> outputCurve = vector<Point2d>();
    while(originalContour.size()>0)
    {
        int binX = std::floor(originalContour[0].x / voxelWidth_px);
        int binY = std::floor(originalContour[0].y / voxelWidth_px);
        int sumX = originalContour[0].x;
        int sumY = originalContour[0].y;
        int count = 1;
        vector<Point> remainingPoints = vector<Point>();
        for (int i = 1; i < originalContour.size(); i++)
        {
            int tempBinX = std::floor(originalContour[i].x / voxelWidth_px);
            int tempBinY = std::floor(originalContour[i].y / voxelWidth_px);
            if (tempBinX == binX && tempBinY == binY)
            {
                sumX += originalContour[i].x;
                sumY += originalContour[i].y;
                count++;
            }
            else
            {
                remainingPoints.push_back(originalContour[i]);
            }
        }
        originalContour = remainingPoints;
        double aveX = (double)sumX / (double)count;
        double aveY = (double)sumY / (double)count;
        outputCurve.push_back(Point2d(aveX, aveY));
    }

    return outputCurve;
}

Point2d getNN(vector<Point2d> cloud, int targetIndex, int &nnIndex, double &dist)
{
    nnIndex = -1;
    double shortestDist = 0;
    for (int i = 0; i < cloud.size(); i++)
    {
        if (i == targetIndex) { continue; }
        double tempDist = sqDist(cloud[targetIndex], cloud[i]);
        if (nnIndex == -1 || tempDist < shortestDist)
        {
            nnIndex = i;
            shortestDist = tempDist;
        }
    }
    dist = shortestDist;
    return cloud[nnIndex];
}

int getNN(vector<Point2d> cloud, Point2d pt)
{
    int nnIndex = -1;
    double shortestDist = 0;
    for (int i = 0; i < cloud.size(); i++)
    {
        double tempDist = sqDist(pt, cloud[i]);
        if (nnIndex == -1 || tempDist < shortestDist)
        {
            nnIndex = i;
            shortestDist = tempDist;
        }
    }
    return nnIndex;
}

vector<Point2d> getNNs(vector<Point2d> cloud, int targetIndex, int count)
{
    Point2d tPt = cloud[targetIndex];
    sort(cloud.begin(), cloud.end(), [tPt](const Point2d& p1, const Point2d& p2) {
        return sqDist(tPt,p1) < sqDist(tPt, p2);
        });

    vector<Point2d> outCloud = vector<Point2d>();
    for (int i = 1; i <= count && i < cloud.size(); i++) //first point will be same point
    {
        outCloud.push_back(cloud[i]);
    }
    return outCloud;
}

Vec2d normalize(Vec2d inVec)
{
    double length = sqrt(pow(inVec(0), 2) + pow(inVec(1), 2));
    if (length == 0)
    {
        throw;
    }
    inVec = inVec / length;
    return inVec;
}

double angleBetween(Vec2d vec1, Vec2d vec2, bool directionalFlag = false)
{
    vec1 = normalize(vec1);
    vec2 = normalize(vec2);

    double angle = (atan2(vec1(1), vec1(0)) - atan2(vec2(1), vec2(0)));
    if (angle > M_PI) { angle -= 2 * M_PI; }
    else if (angle <= -M_PI) { angle += 2 * M_PI; }

    if (!directionalFlag)
    { angle = abs(angle); }
    return angle;
}

vector<Point> roundPoints(vector<Point2d> cloud)
{
    vector<Point> outCloud = vector<Point>();
    for (int i = 0; i < cloud.size(); i++)
    {
        outCloud.push_back(Point(round(cloud[i].x), round(cloud[i].y)));
    }
    return outCloud;
}

class PointNorm
{
public:
    PointNorm() {}

    //point at p1 with direction p1-p2
    PointNorm(Point2d p1, Point2d p2)
    {
        X = p1.x;
        Y = p1.y;
        dir = Vec2d(p1.x - p2.x, p1.y - p2.y);
        dir = normalize(dir);
    }

    PointNorm(double x,double y,double dx,double dy)
    {
        X = x;
        Y = y;
        dir = Vec2d(dx, dy);
        dir = normalize(dir);
    }
    double X = 0;
    double Y = 0;
    Vec2d dir = Vec2d();

    static void dif(PointNorm pn1, PointNorm pn2, double& distance, double& angle)
    {
        distance = sqrt(pow(pn1.X - pn2.X, 2) + pow(pn1.Y - pn2.Y, 2));
        
        angle = angleBetween(pn1.dir, pn2.dir, false);
    }
};

vector<Point2d> orderCurve(vector<Point2d> cloud)
{
    if (cloud.size() < 3) { return cloud; }

    vector<Point2d> outCloud = vector<Point2d>();

    int currentIndex = cloud.size() / 2;
    double lastDist = -1;
    vector<Point2d> tempCloud = cloud;
    for (int i = 0; i < cloud.size(); i++)
    {
        vector<Point2d> twoNeighbors = getNNs(cloud, i, 5); //technically u can increase this count to increase endpoint confidence
        bool endFlag = true;
        for (int ii = 0; ii < twoNeighbors.size()-1 && endFlag; ii++)
        {
            for (int iii = 0; iii < twoNeighbors.size() - 1 && endFlag; iii++)
            {
                if (ii == iii) { continue; }
                PointNorm pn1 = PointNorm(cloud[i], twoNeighbors[ii]);
                PointNorm pn2 = PointNorm(cloud[i], twoNeighbors[iii]);
                double tempAngle = angleBetween(pn1.dir, pn2.dir);

                if (tempAngle > M_PI / 2)
                { endFlag = false; }
            }
        }

        if (endFlag)
        {
            outCloud.push_back(cloud[i]);
            break;
        }
    }

    tempCloud = cloud;
    currentIndex = getNN(cloud, outCloud[0]);

    while (tempCloud.size() > 1)
    {
        int testIndex = 0;
        double testDist;
        Point2d tempPoint = getNN(tempCloud, currentIndex, testIndex, testDist);
        outCloud.push_back(tempPoint);
        tempCloud.erase(tempCloud.begin() + currentIndex);
        if (testIndex > currentIndex) { testIndex--; }
        currentIndex = testIndex;
    }
    return outCloud;
}

class ModSpline
{
public:

    ModSpline(vector<Point2d> orderedCurve, int dirEstimationOffset, bool _comboCurve = false)
    {
        //totalCurve = orderedCurve;
        totalCurve = vector<Point2d>();
        copy(orderedCurve.begin(), orderedCurve.end(), back_inserter(totalCurve));
        int end = orderedCurve.size() - 1;
        head = PointNorm(orderedCurve[0], orderedCurve[dirEstimationOffset]);//slight offset gives better estimation of direction if last point is noisy
        tail = PointNorm(orderedCurve[end], orderedCurve[end- dirEstimationOffset]);
        comboCurve = _comboCurve;
    }

    void stepExtrapolate_Head(int stepSize_px, int derivativeLookback)
    {
        int tempLookback = derivativeLookback;
        if (tempLookback > totalCurve.size() - 1) { tempLookback = totalCurve.size() - 1; }

        head.X += head.dir(0) * (double)stepSize_px;
        head.Y += head.dir(1) * (double)stepSize_px;
        totalCurve.insert(totalCurve.begin(), 1, Point2d(head.X, head.Y));
        {
            double dirChangeAngle = computeSecondDerivative(0, tempLookback);
            head.dir = normalize(rotateByAngle(head.dir, dirChangeAngle * (double)stepSize_px));
        }
    }
    void stepExtrapolate_Tail(int stepSize_px, int derivativeLookback)
    {
        int tempLookback = derivativeLookback;
        if (tempLookback > totalCurve.size() - 1) { tempLookback = totalCurve.size() - 1; }
        
        tail.X += tail.dir(0) * stepSize_px;
        tail.Y += tail.dir(1) * stepSize_px;
        totalCurve.push_back(Point2d(tail.X, tail.Y));
        {
            double dirChangeAngle = computeSecondDerivative(totalCurve.size() - 1, totalCurve.size() - (1 + tempLookback));
            tail.dir = normalize(rotateByAngle(tail.dir, dirChangeAngle * (double)stepSize_px));
        }
    }

    void stepExtrapolate_Bidirectional(int stepSize_px, int derivativeLookback)
    {
        stepExtrapolate_Head(stepSize_px, derivativeLookback);
        stepExtrapolate_Tail(stepSize_px, derivativeLookback);
    }

    void stepExtrapolate_SingleDirection(int stepSize_px, int derivativeLookback, bool headFlag)
    {
        if (headFlag) { stepExtrapolate_Head(stepSize_px, derivativeLookback); }
        else { stepExtrapolate_Tail(stepSize_px, derivativeLookback); }
    }

    static double estimateSpineDistance(ModSpline spl1, ModSpline spl2,int stepSize_px, int derivativeLookback, double distWeight, double angleWeight, double travelWeight)
    {
        bool dir_1_head, dir_2_head;
        getNearestEndpoints(spl1, spl2, dir_1_head, dir_2_head);

        //todo: run multiple times with adjusted dir and derivatives

        double lastAngle, lastDist, lastComboDist;
        PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head), lastDist, lastAngle);
        lastAngle = abs(M_PI - lastAngle);//looking for opposing vectors;
        lastComboDist = lastAngle * angleWeight + lastDist * distWeight;

        double totalExtrapolationDistance = 0;
        while (true)
        {
            spl1.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_1_head);
            spl2.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_2_head);
            totalExtrapolationDistance += (stepSize_px + stepSize_px);
            double tempAngle, tempDist, tempComboDist;
            PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head), tempDist, tempAngle);
            tempAngle = abs(M_PI - tempAngle);//looking for opposing vectors;

            tempComboDist = tempAngle * angleWeight + tempDist * distWeight +totalExtrapolationDistance* travelWeight;
            if (tempComboDist > lastComboDist) { break; }
            else
            {
                lastAngle = tempAngle;
                lastDist = tempDist;
                lastComboDist = tempComboDist;
            }
        }
        return lastComboDist;
    }

    static ModSpline mergeSplines(ModSpline spl1, ModSpline spl2, int stepSize_px, int derivativeLookback, int dirEstimationOffset, vector<vector<Point2d>> &debugClouds)
    {
        bool dir_1_head, dir_2_head;
        getNearestEndpoints(spl1, spl2, dir_1_head, dir_2_head);

        double lastAngle, lastDist, lastComboDist;
        int extrapolationCount = 0;
        PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head),lastDist, lastAngle);
        while (true)
        {
            extrapolationCount += 1;
            spl1.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_1_head);
            spl2.stepExtrapolate_SingleDirection(stepSize_px, derivativeLookback, dir_2_head);
            double tempAngle, tempDist, tempComboDist;
            PointNorm::dif(spl1.getEndpointnorm(dir_1_head), spl2.getEndpointnorm(dir_2_head), tempDist, tempAngle);

            if (tempDist > lastDist) { break; }
            else { lastDist = tempDist; }
        }

        vector<Point2d> outCloud = vector<Point2d>();

        vector<Point2d> spline1Cloud = spl1.getTotalCurve();
        if (dir_1_head) { reverse(spline1Cloud.begin(), spline1Cloud.end()); }
        vector<Point2d> spline2Cloud = spl2.getTotalCurve();
        if (!dir_2_head) { reverse(spline2Cloud.begin(), spline2Cloud.end()); }

        //spline 1
        {
            vector<Point2d> debug = vector<Point2d>();
            for (int i = 0; i < spline1Cloud.size() - extrapolationCount; i++)
            {
                outCloud.push_back(spline1Cloud[i]);
                debug.push_back(spline1Cloud[i]);
            }
            debugClouds.push_back(debug);
        }
        //fused region
        {
            vector<Point2d> debug = vector<Point2d>();
            double theta = 0;
            double theta_Step = (M_PI / 2.0) / (double)extrapolationCount;
            for (int i = 1; i < extrapolationCount-1; i++)
            {
                int invI = extrapolationCount - i;
                Point2d p1 = spline1Cloud[spline1Cloud.size() - (1 + invI)];
                Point2d p2 = spline2Cloud[i];

                double alpha = sin(theta);
                theta += theta_Step;

                //sinusoidal interpolation
                Point2d fusedPt = Point2d(alpha*p2.x+(1.0-alpha)*p1.x, alpha * p2.y + (1.0 - alpha) * p1.y);

                outCloud.push_back(fusedPt);
                debug.push_back(fusedPt);
            }
            debugClouds.push_back(debug);
        }
        //spline 2
        {
            vector<Point2d> debug = vector<Point2d>();
            for (int i = extrapolationCount; i < spline2Cloud.size(); i++)
            {
                outCloud.push_back(spline2Cloud[i]);
                debug.push_back(spline2Cloud[i]);
            }
            debugClouds.push_back(debug);
        }
        return ModSpline(outCloud,dirEstimationOffset,true);
    }

    vector<Point2d> getTotalCurve()
    {
        return totalCurve;
    }

    int getPtCount() { return totalCurve.size(); }

    vector<PointNorm> getEndpoints()
    {
        vector<PointNorm> endPoints = vector<PointNorm>();
        endPoints.push_back(head);
        endPoints.push_back(tail);
        return endPoints;
    }

    bool isComboCurve() { return comboCurve; }

    Point2d getEndpoint(bool headFlag)
    {
        if (headFlag) { return Point2d(head.X, head.Y); }
        else { return Point2d(tail.X, tail.Y); }
    }

    PointNorm getEndpointnorm(bool headFlag)
    {
        if (headFlag) { return head; }
        else { return tail; }
    }

    static void getNearestEndpoints(ModSpline spl1, ModSpline spl2, bool& headFlag1, bool& headFlag2)
    {
        double dist1 = sqDist(Point2d(spl1.head.X, spl1.head.Y), Point2d(spl2.head.X, spl2.head.Y));
        double dist2 = sqDist(Point2d(spl1.head.X, spl1.head.Y), Point2d(spl2.tail.X, spl2.tail.Y));
        double dist3 = sqDist(Point2d(spl1.tail.X, spl1.tail.Y), Point2d(spl2.head.X, spl2.head.Y));
        double dist4 = sqDist(Point2d(spl1.tail.X, spl1.tail.Y), Point2d(spl2.tail.X, spl2.tail.Y));

        double minDist = min(dist1, min(dist2, min(dist3, dist4)));
        if (minDist == dist1) { headFlag1 = true; headFlag2 = true; }
        else if (minDist == dist2) { headFlag1 = true; headFlag2 = false; }
        else if (minDist == dist3) { headFlag1 = false; headFlag2 = true; }
        else if (minDist == dist4) { headFlag1 = false; headFlag2 = false; }
    }

private:
    vector<Point2d> totalCurve;
    PointNorm head;
    PointNorm tail;
    bool comboCurve = false;

    double computeSecondDerivative(int startIndex, int endIndex)
    {
        int increment = 1;
        if (endIndex < startIndex) { increment = -1; }

        double totAngle = 0;
        double totalDistance = 0;
        int count = 0;
        for (int i = startIndex; i + 2 * increment != endIndex; i += increment)
        {
            count++;

            Point2d p1 = totalCurve[i];
            Point2d p2 = totalCurve[i + increment];
            Point2d p3 = totalCurve[i + 2 * increment];

            Vec2d v1 = Vec2d(p1.x - p2.x, p1.y - p2.y);
            Vec2d v2 = Vec2d(p2.x - p3.x, p2.y - p3.y);

            double tempAngle = angleBetween(v1, v2, true);
            double aveDist = (sqrt(sqDist(p1, p2)) + sqrt(sqDist(p2, p3))) / 2.0;
            totalDistance += aveDist;
            totAngle += tempAngle;
        }

        totAngle = totAngle / totalDistance;
        return totAngle;
    }

    static Vec2d rotateByAngle(Vec2d inVec, double angle)
    {
        Vec2d outVec = Vec2d();
        outVec(0) = inVec(0)*cos(angle) - inVec(1)*sin(angle);
        outVec(1) = inVec(0)*sin(angle) + inVec(1)*cos(angle);
        return outVec;
    }
};


vector<Scalar> colorWheel = { 
    Scalar(255,0,0),
    Scalar(0,255,0),
    Scalar(0,0,255),
    Scalar(255,255,0),
    Scalar(255,0,255),
    Scalar(0,255,255) };
int colorWheelIndex = 0;
void rotateColorWheel()
{
    colorWheelIndex++;
    if (colorWheelIndex >= colorWheel.size()) { colorWheelIndex = 0; }
}


int main(int argc, char** argv)
{

    //control downsampling and extrapolation (several of these could be static members of ModSpline instead)
    const int stepSize_px = 1; //how far each extrapolation step travels
    const int derivativeLookback = 15; //how far back along the curve is considered when estimating the 2nd derivative
    const int dirEstimationOffset = 2; //min 1; determines how much deviations at the ends of curves affect extrapolation (lower = more impact)
    const int voxelWidth = 5; //voxel dimension for averaging initial integer pixels into double-precision points

    //control spline similarity calculation (each of these contributes to a weighted sum of "spline distance")
    const double distWeighting = 1.0; //weight of the distance between the two splines after optimal extrapolation
    const double angleWeighting = 10.0; //weight of the angle between the two splines after optimal extrapolation
    const double travelWeighting = 0.05; //weight of how far the splines have to travel to connect
    const double maxTotalSplineError = 15; //unitless threshold on the weighted "spline distance"

    bool debugFlag = true;// flag to enable or disable debug visualizers


    std::string path = "C:/Local Software/voyDICOM/resources/images/SparseCurves6.png";

    Mat originalImg = cv::imread(path, cv::IMREAD_GRAYSCALE);
    if (debugFlag) { cv::imshow("original", originalImg); }

    Mat debugImg = cv::imread(path, cv::IMREAD_COLOR);

    Mat bwImg;
    threshold(originalImg, bwImg, 150, 255, THRESH_BINARY);

    vector<vector<Point> > contours;
    findContours(bwImg, contours, RETR_LIST, CHAIN_APPROX_NONE);

    vector<vector<Point2d>> dCurves = vector<vector<Point2d>>();

    Mat initialCurves = debugImg.clone();
    for (int i = 0; i < contours.size(); i++)
    {
        vector<Point2d> tempDCurve = resampleContour(contours[i], voxelWidth);
        if (tempDCurve.size() < 7) { continue; }

        tempDCurve = orderCurve(tempDCurve);
        dCurves.push_back(tempDCurve);

        vector<Point> tempCloud = roundPoints(tempDCurve);
        cv::polylines(initialCurves, { tempCloud }, false, colorWheel[colorWheelIndex], 2);
        circle(initialCurves, tempCloud[0], 5, Scalar(0, 255, 0), 1);
        circle(initialCurves, tempCloud[tempCloud.size() - 1], 5, Scalar(0, 0, 255), 1);
        rotateColorWheel();
    }
    if (debugFlag) { cv::imshow("initialCurves", initialCurves); }
        
    

    vector<ModSpline> travCurves = vector<ModSpline>();
    vector<ModSpline> tempCurves = vector<ModSpline>();
    for (int i = 0; i < dCurves.size(); i++)
    {
        travCurves.push_back(ModSpline(dCurves[i], dirEstimationOffset));
        tempCurves.push_back(ModSpline(dCurves[i], dirEstimationOffset));
    }


    if (debugFlag) 
    {
        for (int i = 0; i < 25; i++)
        {
            colorWheelIndex = 0;
            Mat extrapViewer = debugImg.clone();
            for (int ii = 0; ii < tempCurves.size(); ii++)
            {
                //if (!(ii == 7 || ii == 13)) { continue; }
                tempCurves[ii].stepExtrapolate_Bidirectional(stepSize_px,derivativeLookback);
                vector<Point2d> tempCloud = tempCurves[ii].getTotalCurve();
                for (int iii = 0; iii < tempCloud.size(); iii++)
                {
                    cv::circle(extrapViewer, tempCloud[iii], 1, colorWheel[colorWheelIndex], 1);
                }
                rotateColorWheel();
            }
            cv::imshow("extrapolation debug", extrapViewer);
            cv::waitKey(100);
        }
    }
    
    
    int fusionCounter = 1;
    while (true)
    {
        vector<tuple<int, int>> validMerges = vector<tuple<int, int>>();
        for (int i = 0; i < travCurves.size(); i++)
        {
            for (int ii = 0; ii < travCurves.size(); ii++)
            {
                if (i == ii) { continue; }
                double tempComboDist = ModSpline::estimateSpineDistance(
                    travCurves[i], 
                    travCurves[ii], 
                    stepSize_px,
                    derivativeLookback,
                    distWeighting,
                    angleWeighting,
                    travelWeighting);
                if (tempComboDist < maxTotalSplineError)
                {
                    validMerges.push_back(tuple<int,int>(i,ii));
                }
            }
        }

        if (validMerges.size()>0)
        {
            vector<int> splineSizes = vector<int>();
            for (int i = 0; i < travCurves.size(); i++)
            {
                splineSizes.push_back(travCurves[i].getPtCount());
            }

            //sort valid merges by combined spline size (favor larger spline merges before smaller ones)
            sort(validMerges.begin(), validMerges.end(), [splineSizes](const tuple<int, int>& spl1, const tuple<int, int>& spl2) {
                return  splineSizes[get<0>(spl1)] + splineSizes[get<1>(spl1)] > splineSizes[get<0>(spl2)] + splineSizes[get<1>(spl2)];
                });

            int splineIndex1 = get<0>(validMerges[0]);
            int splineIndex2 = get<1>(validMerges[0]);

            vector<vector<Point2d>> debugClouds = vector<vector<Point2d>>();
            ModSpline newCurve = ModSpline::mergeSplines(
                travCurves[splineIndex1], 
                travCurves[splineIndex2], 
                stepSize_px, 
                derivativeLookback,
                dirEstimationOffset,
                debugClouds);

            travCurves.erase(travCurves.begin() + max(splineIndex1, splineIndex2));
            travCurves.erase(travCurves.begin() + min(splineIndex1, splineIndex2));
            travCurves.push_back(newCurve);

            if (debugFlag)
            {
                Mat debugFusions = debugImg.clone();
                cv::polylines(debugFusions, { roundPoints(debugClouds[0]) }, false, Scalar(255, 0, 0), 2);
                cv::polylines(debugFusions, { roundPoints(debugClouds[1]) }, false, Scalar(0, 255, 0), 1);
                cv::polylines(debugFusions, { roundPoints(debugClouds[2]) }, false, Scalar(0, 0, 255), 2);
                cv::imshow("debugFusion"+std::to_string(fusionCounter++), debugFusions);
                cv::waitKey(500);
            }
        }
        else
        {
            break;
        }
    }

    Mat finalCurves = debugImg.clone();
    colorWheelIndex = 0;
    for (int i = 0; i < travCurves.size(); i++)
    {
        if (!travCurves[i].isComboCurve()) { continue; }
        cv::polylines(finalCurves, { roundPoints(travCurves[i].getTotalCurve()) }, false, colorWheel[colorWheelIndex], 2);
        rotateColorWheel();
    }
    cv::imshow("final curves", finalCurves);
    cv::imshow("initialCurves", initialCurves);
    cv::imshow("original", originalImg);
    
    cv::waitKey(0);
}

Additional data sets (same parameters across all 3 sets): (result images)

Final notes: playing with the variables at the top of the main function allows full control over the specificity of the spline collision detection. The next step in making this more robust would be to build up a large dataset and modify the parameters as necessary to give an optimal solution for each data set. Then, by analyzing all those individual setting configurations, you should be able to establish center points and confidence ranges for the settings such that a single configuration works across the largest number of test images.

Markson answered 25/9, 2021 at 3:05 Comment(12)
Thank you a lot! Any code will be appreciated. And do you think some spline fitting would not be better? For example, looking at how many control points are needed to fit a certain combination of two/three/... contours - the fewer control points, the greater the probability of a connection. But I do not know how to implement spline fitting with non-function contours. - Hexane
Ya, I was going to throw some polynomial regression at it... but as you said, that doesn't work super well with contours that can wrap back on themselves. Tbh the code I am about to post is COMPLETE AND UTTER GARBAGE... but I'm tired... and will give it another pass tomorrow. Hopefully there are some morsels and concepts that inspire or help you. - Markson
I will also try to break out critical control variables tomorrow (right now they are just sprinkled about). - Markson
Updated the code to expose critical variables (at the top of main). Also improved spline merging (used sinusoidal averaging to improve curvature continuity). Collision detection was also improved. - Markson
Thanks a lot! Your code works amazingly! I tried it on my own on another dataset (I added a few more images to the question), but unfortunately I found some weird bug. Please, can you try it? I think the bug is in orderCurve, but I am not sure. - Hexane
TIL: Stack Overflow answers have a character limit, lol... Anyhow, I found the bug with endpoint detection and fixed it, also made some improvements to extrapolation and tuned the parameters a bit based on the bugfixes and new images. - Markson
*I also added priority for larger splines over smaller splines (not directly in the weighting, but priority in merging). - Markson
Lol :D. It is so good! I really appreciate your help. Certainly beyond this question and topic, but what do you think is best for the overall solution of this problem from the original raw images? (Just out of curiosity and for education; your solution is already fully sufficient.) Do you think that some optimization approach (for example an adapted active contour model) could be better? - Hexane
Tbh those images look pretty complex with the overlap/occlusions and such. I assume that you are trying to count and having trouble with double counting? It would be hard for me to say without seeing a handful of the images (I highly recommend you create another question with a couple of the source images, as it seems there wasn't much appetite on SO for curve fitting). But ya, without a couple of pictures and knowing what your goal is, it's hard to say. - Markson
Off the cuff, the first thing I would suggest is algorithmically handling the width of the worms (i.e. making it such that the centers of the worms create valid curves, but the outside edges are rejected). This is because, on visual inspection, the two things that help my eyes pick up the worms are their sharp internal edges (you could make a more selective edge filter) and their width, which seems pretty constant (fancy matrix morphology?). - Markson
Thanks. Yes, it is quite complex. I am trying to extract the worms from the image and then get their shape, which is why I wanted to fit some curve. I will make another question. - Hexane
Kk, sorry that the bounty didn't get another answer. You can probably close this question if my code is working for you. - Markson
