Upper bounds and Lower bounds in Algorithms
I saw several articles describing the upper bound as the Best Case and the lower bound as the Worst Case. Meanwhile, some articles give explanations for the Upper/Lower bound of the Worst Case itself.

So basically this got me asking three questions:

  1. What exactly are Upper/Lower bounds?
  2. How can they be defined separately in a Worst Case scenario?
  3. Can bounds be defined for other cases (Best, Average) as well?
Anguiano answered 25/5, 2017 at 11:3 Comment(2)
One thing to note - asymptotic notations such as O(n) are part of mathematics and used to describe/classify/characterise functions. IIRC it was Knuth that first encouraged their use to describe the time and space complexity of algorithms. "Upper bound" and "lower bound" always make sense, but when describing general functions (not for algorithms) there's no general rule for what's best or worst. I once asked a relevant (not duplicate) question hereJaborandi
I went through the answers given for your question. Thank You! :)Anguiano
What exactly are Upper/Lower bounds?

We are interested in bounds of functions, as you can read in Wikipedia.

Moreover, a part of this answer mentions:

For a function f(n), g(n) is an upper bound (big O) if for "big enough n", f(n)<=c*g(n), for a constant c. [g dominates f]
g(n) is lower bound (big Omega) if for "big enough n", f(n) >= c*g(n), for a constant c. [f dominates g]
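The definitions above can be checked numerically for a concrete pair of functions. This is only an illustrative sketch: the functions f and g and the witness constants c and n0 are made up for the example.

```python
# Hypothetical example: f(n) = 3n + 10 is O(n), witnessed by c = 4 once n >= 10.

def f(n):
    return 3 * n + 10

def g(n):
    return n

c, n0 = 4, 10  # chosen witnesses; any constants that work are fine

# Upper bound: f(n) <= c * g(n) for every "big enough" n (here, n >= n0).
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))

# Lower bound: f(n) >= 3 * g(n) for all n >= 1, so g is also a big-Omega bound.
# Since g bounds f from both sides, f is in fact Θ(n).
assert all(f(n) >= 3 * g(n) for n in range(1, 10_000))
```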

How can they be defined separately in a Worst Case scenario?

They are either going to be different, or the same; when the upper and lower bounds coincide we write Θ, e.g. Θ(n), where n is usually the size of the problem. As Dukeling said:

Worst, best and average cases can* be represented as a function (for terminating algorithms). Each of those functions has upper and lower bounds (of which there are infinitely many). Doing a constant number of operations per element (e.g. the best case for insertion sort and the average/worst case for linear search) would have a tight bound (lower and upper bound) of Θ(n), but also an upper bound of O(n^2) or a lower bound of Ω(1).
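The linear-search example from the quote can be made concrete by counting comparisons. A small sketch (not part of the original answer):

```python
# Linear search with a comparison counter, to show best vs. worst case.

def linear_search(items, target):
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = list(range(100))

# Best case: target is the first element -> 1 comparison (the Ω(1) bound).
assert linear_search(data, 0) == (0, 1)

# Worst case: target is absent -> n comparisons. That function is Θ(n):
# it is O(n) (and, loosely, O(n^2)); it is Ω(n) (and, loosely, Ω(1)).
assert linear_search(data, -5) == (-1, 100)
```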

Can bounds be defined for other cases(Best,Average) as well?

Yes. It is possible that all the cases have their upper and lower bounds.

Murrumbidgee answered 25/5, 2017 at 11:17 Comment(8)
"Lower bound is the best case" no.Siderite
@n.m. can you please explain? =)Murrumbidgee
Bounds. When talking about complexities, bounds are our estimates of the true number of operations an algorithm may perform, normally in the worst or average case. The best case is simply not interesting. For example, O(n log n) is the lower bound on the complexity of comparison-based sorting algorithms, which means no such algorithm is more efficient in the worst case (and in the average case too). A particular algorithm may perform better on particular inputs, but this is not interesting in the slightest.Siderite
Worst, best and average cases can* be represented as a function (for terminating algorithms). Each of those functions has upper and lower bounds (of which there are infinitely many). Doing a constant number of operations per element (e.g. the best case for insertion sort and the average/worst case for linear search) would have a tight bound (lower and upper bound) of Θ(n), but also an upper bound of O(n^2) or a lower bound of Ω(1).Extensile
Note the Stack Overflow question you yourself linked to in your answer indicates that the worst case can have both an upper and a lower bound.Extensile
Thanks n.m., you were right! @Dukeling oh I see, yes you are right! Answer updated now, I injected your comment, but if you have an objection to that, let me know please. =)Murrumbidgee
The first part of your answer still has the same problem - those are the definitions of best and worst case.Extensile
Updated @Dukeling, what do you think?Murrumbidgee
One almost never discusses the best case. It is simply not that interesting. An algorithm can always be modified to have the smallest theoretically possible best case, which is O(max(size-of-input, size-of-output)), simply by recognising one particular input and producing output precomputed for that input. In the benchmarking business this is known as cheating.
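The "cheating" trick described above can be sketched in a few lines. The magic input and the function names here are made up purely for illustration:

```python
# Recognise one hard-coded input and return its precomputed answer in linear
# time; fall back to a real sort for everything else. This makes the best
# case O(max(size-of-input, size-of-output)) without improving anything real.

MAGIC_INPUT = [3, 1, 2]
PRECOMPUTED = [1, 2, 3]

def cheating_sort(items):
    if items == MAGIC_INPUT:       # O(size-of-input) recognition
        return list(PRECOMPUTED)   # O(size-of-output) copy
    return sorted(items)           # honest O(n log n) everywhere else

assert cheating_sort([3, 1, 2]) == [1, 2, 3]
assert cheating_sort([9, 8, 7]) == [7, 8, 9]
```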

The term bound here has the same meaning as in the rest of mathematics, namely, an arbitrary value that is no larger (no smaller) than any element of a given set.

For example, when discussing the set of sorting algorithms, we can prove that no comparison-based sorting algorithm has better asymptotic efficiency than O(n log n) in the worst case (and also in the average case). Thus O(n log n) is a lower bound on the efficiency of all possible comparison-based sorting algorithms in the worst case (and also in the average case). O(n) is another lower bound. O(n log n) is a better lower bound than O(n). And it just happens that O(n log n) is the tight lower bound, because there are in fact sorting algorithms with this complexity.
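That the Ω(n log n) lower bound is tight can be demonstrated with any O(n log n) comparison sort. A sketch using a plain merge sort with a comparison counter (the counter-passing style is just one way to instrument it):

```python
# Merge sort, instrumented to count element comparisons; the count stays
# below n * log2(n), witnessing an O(n log n) comparison sort.

import math
import random

def merge_sort(items, counter):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid], counter)
    right = merge_sort(items[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1                 # one element comparison
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

random.seed(0)
data = [random.randrange(10**6) for _ in range(1024)]
counter = [0]
assert merge_sort(data, counter) == sorted(data)
assert counter[0] <= 1024 * math.log2(1024)   # at most n * log2(n) comparisons
```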

There is no finite upper bound on the complexity of the set of sorting algorithms because an arbitrarily bad sorting algorithm can be created.

On the other hand, we can discuss a particular sorting algorithm, and prove that it never exceeds a certain number of operations, which would be the upper bound on its complexity. For example, the quicksort algorithm has an upper bound of O(n^2). It also has an upper bound of O(n^3). It does not have an upper bound of O(n log n) because there are inputs that make it exceed this number of operations. The bound of O(n^2) is tight, because it is attained for some inputs.
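The input that attains quicksort's O(n^2) bound can be shown directly. This sketch uses a deliberately naive first-element-pivot quicksort (real implementations choose pivots more carefully); on already-sorted input it performs exactly n(n-1)/2 pivot comparisons:

```python
# Naive quicksort with a first-element pivot; sorted input is its worst case.

def quicksort(items, counter):
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    counter[0] += len(rest)              # one logical comparison per element
    lo = [x for x in rest if x < pivot]
    hi = [x for x in rest if x >= pivot]
    return quicksort(lo, counter) + [pivot] + quicksort(hi, counter)

n = 200
counter = [0]
assert quicksort(list(range(n)), counter) == list(range(n))
assert counter[0] == n * (n - 1) // 2    # the quadratic worst case, attained
```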

One could theoretically discuss the lower bound in the same sense as above, but this is almost never done (it would amount to discussing the complexity of the best case, which we are not normally interested in).

We can also discuss the difficulty of a particular problem, and place both upper and lower bounds on it. How efficient is the most efficient algorithm (in the worst or average case) that solves it? (We don't discuss the best case because the answer is not interesting, see above.) For the comparison-based sorting problem, we know that both the tight upper bound and the tight lower bound are O(n log n), because there are in fact O(n log n) sorting algorithms and it can be shown that no better algorithm exists. This is not a very interesting case because we can find the most efficient algorithm possible. For the knapsack problem, for example, the situation is more interesting. We only know that an upper bound of O(2^n) exists because an algorithm with this complexity trivially exists (the brute-force one). We suspect but cannot prove that this bound is tight. We also cannot provide any good lower bound (we suspect that there are no algorithms that solve it with polynomial complexity, but cannot prove it).
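The trivial brute-force algorithm behind the O(2^n) upper bound can be sketched as follows; the item weights and values are made up for the example:

```python
# Brute-force 0/1 knapsack: try all 2^n subsets of items and keep the best
# feasible one. Trivially correct, exponentially slow.

from itertools import combinations

def knapsack_bruteforce(weights, values, capacity):
    n = len(weights)
    best = 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):   # enumerates all 2^n subsets
            if sum(weights[i] for i in subset) <= capacity:
                best = max(best, sum(values[i] for i in subset))
    return best

weights = [2, 3, 4, 5]
values = [3, 4, 5, 6]
assert knapsack_bruteforce(weights, values, capacity=5) == 7  # items 0 and 1
```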

Siderite answered 25/5, 2017 at 12:17 Comment(1)
Thank You for the clarification!Anguiano