I recently had this problem on a test: given a set m of points (all on the x-axis) and a set n of lines, each with endpoints [l, r] (again on the x-axis), find a minimum-size subset of n such that every point in m is covered by at least one line. Prove that your solution always finds a minimum subset.
The algorithm I wrote for it was something to this effect (say lines are stored as two-element arrays, with the left endpoint in position 0 and the right in position 1):
algorithm coverPoints(set[] m, set[][] n):
    chosenLines = []
    while m is not empty:
        minX = min(m)    // leftmost point still uncovered
        // among lines starting at or before minX, keep the one reaching furthest right
        bestLine = n[0]
        for i = 1 to length of n:
            if n[i][0] <= minX and n[i][1] > bestLine[1] then
                bestLine = n[i]
        add bestLine to chosenLines
        // discard every point the chosen line covers
        for i = 0 to length of m:
            if m[i] <= bestLine[1] then delete m[i] from m
    return chosenLines
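In case the pseudocode is hard to read, here is roughly the same thing as runnable Python (my own transcription, so treat it as a sketch; note I had to add a guard for the case where no line actually reaches minX, which the pseudocode above silently assumes can't happen):

    def cover_points(m, n):
        """Greedy: repeatedly cover the leftmost uncovered point with the
        line that starts at or before it and reaches furthest right."""
        points = sorted(m)                # uncovered points, ascending
        chosen = []
        while points:
            min_x = points[0]             # leftmost uncovered point
            # among lines with left endpoint <= min_x, take the one with
            # the largest right endpoint (same rule as the pseudocode)
            best = None
            for l, r in n:
                if l <= min_x and (best is None or r > best[1]):
                    best = (l, r)
            if best is None or best[1] < min_x:
                return None               # no line covers min_x: no cover exists
            chosen.append(best)
            # drop every point the chosen line covers
            points = [p for p in points if p > best[1]]
        return chosen

For example, cover_points([1, 5, 9], [(0, 3), (4, 6), (0, 10)]) returns [(0, 10)].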
I'm just not sure if this always finds the minimum solution. It's a simple greedy algorithm, so my gut tells me it won't, but a friend of mine who is much better at this than I am says that for this problem a greedy algorithm like this always finds a minimal solution. To prove mine always finds the minimal solution, I did a very hand-wavy proof by contradiction in which I made an assumption that probably isn't true at all; I forget exactly what I did.
If this doesn't always find a minimal solution, is there a way to solve the problem in less than something like O(n!) time?
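For reference, the exhaustive search I'd like to beat looks something like this (trying subsets smallest-first, so closer to O(2^n) than O(n!), but still hopeless for large n):

    from itertools import combinations

    def cover_points_brute(m, n):
        # try subsets in order of increasing size; the first subset that
        # covers every point is a minimum cover by construction
        for k in range(1, len(n) + 1):
            for subset in combinations(n, k):
                if all(any(l <= p <= r for l, r in subset) for p in m):
                    return list(subset)
        return None                   # no subset covers all the points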
Thanks
What is n[i][0] <= m supposed to mean if n[i][0] is a point on the x axis and m is a set? Can you clarify a bit and maybe explain naturally what your idea is? By set do you mean a sorted collection or any collection? On what criteria do you order n? – Santoro

Consider the points 1, 100, 101, 102, 103, 104 and the lines [1, 100], [10, 101], [20, 102], [30, 103], [40, 104], [101, 104]. If I understand your algorithm correctly, it'll choose [1, 100] to cover the first two points, [10, 101] to cover the third, and so on up to [40, 104] to cover the last. We can do much better by choosing [1, 100] and [101, 104], or [1, 100] and [40, 104]. If I understood your algorithm correctly, then it won't work on all cases. – Santoro