To respond to the individual points you've raised:
I think the time complexity of the code below should be O(1), as the worst case can be log 1000 base 2 or some other definite bound.
Yep, that's exactly right!
But I am not sure, as its runtime does vary with the input.
You are correct that the runtime varies with the input size. However, that does not necessarily mean that the runtime is not O(1). If an algorithm's runtime is always bounded from above by some constant, regardless of what the input size is, then its runtime is O(1). Stated differently, an O(1) runtime means "without even looking at your input, I can bound how long the algorithm is going to take to run." (Technically that isn't 100% accurate, since big-O notation talks about what happens for sufficiently large inputs, but it's true in this case.)
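For reference, the fun2 from the question isn't reproduced in this answer. Here is a minimal sketch consistent with the discussion; the cutoff of 1000, the >= comparison, and the doubling step are assumptions on my part, not necessarily the question's exact code:

void fun2(int n) {
    if (n <= 0) return;     // guard: non-positive n would never reach the cutoff
    if (n >= 1000) return;  // assumed cutoff (the "LIMIT" mentioned in the comments below)
    fun2(2 * n);            // doubling step: at most about log2(1000) recursive calls
}

Under this sketch, fun2(1) makes roughly log2(1000) ≈ 10 calls, while fun2(1000) and anything larger return after a single call, matching the behavior described below.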
Here's another example of this:
#include <iostream>
using namespace std;

void sillyFunction(int n) {
    // At most 137 iterations regardless of n; fewer when n < 137.
    for (int i = 0; i < 137 && i < n; i++) {
        cout << '*' << endl;
    }
}
We can guarantee that the loop will run at most 137 times regardless of what n is. However, for small values of n, we may do less work than this. But the runtime here is still O(1), since we have that bound of "at most 137 iterations."
Here's another example:
void amusingFunction(int n) {
    // Counts down from 137 and stops once i drops below n (or below 0):
    // larger n means fewer iterations, and none at all once n > 137.
    for (int i = 137; i >= 0 && i >= n; i--) {
        cout << '*' << endl;
    }
}
Again, this loop is guaranteed to run at most 138 times (i counts down from 137 to 0). Here, though, the work decreases as n increases, to the point where the loop never runs once n > 137. But since we can bound the total number of loop iterations at 138 without even looking at n, the runtime is O(1).
Here's a trickier example:
void deviousFunction(int n) {
    if (n <= 137) {
        while (true) { // infinite loop!
            cout << '*';
        }
    }
    cout << "Yup." << endl;
}
This function will go into an infinite loop for any n ≤ 137. However, for sufficiently large values of n (namely, when n > 137), the algorithm always terminates immediately. This algorithm therefore has a runtime of O(1): there's a constant amount of work where, for any sufficiently large n, the algorithm does at most that much work. (This is highly contrived and I've never seen anything like this before, but you get the picture.)
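To make that precise, here's a worked check against the standard textbook definition of big-O (the definition itself is general background, not something from the question):

f(n) = O(g(n))  ⟺  there exist constants c > 0 and n₀ ≥ 0 such that f(n) ≤ c · g(n) for all n ≥ n₀

Take f(n) to be the runtime of deviousFunction, g(n) = 1, c = the constant cost of the comparison plus the final print, and n₀ = 138. Then f(n) ≤ c · 1 holds for every n ≥ n₀, so f(n) = O(1); the definition says nothing about what happens below n₀, so the infinite loop on small inputs doesn't affect the bound.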
and the given answer is O(n); I am very confused about how they got that.
The runtime bound here of O(n) seems incorrect to me. It's technically not wrong to say the runtime is O(n), because that does provide a correct upper bound on the runtime, but the bound isn't tight. (By the same token, the runtime is also O(n²) and O(2ⁿ); big-O only promises an upper bound.) You should ask whoever gave you this bound to explain their reasoning; perhaps there's a typo in the code or in the explanation?
If we increase n, the function gets called fewer times, so is it O(1/n)? Is that even possible?
As n increases, the number of recursive calls is nonincreasing, but it doesn't necessarily decrease. For example, fun2(1000) and fun2(10000000) each result in a total of one call being made.
It's not possible for an algorithm to have a runtime of O(1 / n) because all algorithms do at least a constant amount of work, even if that work is "set up the stack frame." A runtime bound of O(1 / n) means that, for sufficiently large n, you would be doing less than one unit of work. So in that sense, there's a difference between "the runtime drops as n gets bigger, to the point where it flattens out at a constant" and "the runtime is O(1 / n)."
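To make that distinction concrete, here's a small illustration; the function and its name are made up for this example, not taken from the question:

#include <iostream>
using namespace std;

void shrinkingFunction(int n) {
    // For nonnegative n, the loop runs max(0, 137 - n) times,
    // so the work shrinks as n grows.
    for (int i = n; i < 137; i++) {
        cout << '*' << endl;
    }
    // Even when the loop runs zero times, the call itself still costs a
    // constant amount of work, so the runtime flattens out at a constant:
    // it's O(1), not O(1/n).
}

The loop body runs less and less as n grows, but the total runtime can never dip below the fixed cost of entering and leaving the function, which is exactly why O(1/n) is unattainable.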
Comments from the original question:

O(log N), not O(1/N). O(1/N) would mean a bigger N makes the code faster, and that doesn't happen here. It takes you more "iterations" to get to an N of 1000 than it does for an N of 100. – Selfanalysis

O(1). Or O(log2(1000)). Or O(log2(LIMIT)). – Sextuplet

void fun(int n) { if (n <= 0) return; if (n > 10) return; fun(n+1); } – Markowitz

O(log LIMIT)? – Elsey