I am just a bit confused. If the time complexity of an algorithm is given by O(n^2 * log(n)), what is that in big O notation? Just O(n^2), or do we keep the log?
If that's the time complexity of the algorithm, then it is in big-O notation already, so, yes, keep the log. Asymptotically, there is a difference between O(n^2) and O(n^2 * log(n)).
A formal mathematical proof would be nice here.
Let's define the following variables and functions:

N - input length of the algorithm,
f(N) = N^2 * ln(N) - the function that gives the algorithm's execution time.

Let's determine whether the growth of this function is asymptotically bounded by O(N^2).
According to the definition of asymptotic notation [1], g(x) is an asymptotic bound for f(x) if and only if, for all sufficiently large values of x, the absolute value of f(x) is at most a positive constant multiple of g(x). That is, f(x) = O(g(x)) if and only if there exist a positive real number M and a real number x0 such that

|f(x)| <= M * g(x) for all x >= x0    (1)
In our case, there must exist a positive real number M and a real number N0 such that:

|N^2 * ln(N)| <= M * N^2 for all N >= N0    (2)

Obviously, such M and N0 do not exist, because for any arbitrarily large M there is an N0 such that

ln(N) > M for all N >= N0    (3)

Thus, we have proved that N^2 * ln(N) is not asymptotically bounded by O(N^2).
References:
[1] https://en.wikipedia.org/wiki/Big_O_notation
A simple way to understand big O notation is to divide the actual number of atomic steps by the term within the big O and check that you get a constant (or a value that is smaller than some constant).
For example, if your algorithm does 10n²·log n steps:

10n²·log n / n² = 10 log n -> not constant in n -> 10n²·log n is not O(n²)
10n²·log n / (n²·log n) = 10 -> constant in n -> 10n²·log n is O(n²·log n)
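The ratio test above can be sketched numerically (the step count 10·n²·log n is the hypothetical example from the answer):

```python
import math

def steps(n):
    # Hypothetical step count from the example: 10 * n^2 * log(n)
    return 10 * n * n * math.log(n)

# Dividing by n^2 leaves 10 * log(n), which keeps growing -> not O(n^2)
ratios_n2 = [steps(n) / (n * n) for n in (10, 100, 1000)]
assert ratios_n2[0] < ratios_n2[1] < ratios_n2[2]

# Dividing by n^2 * log(n) leaves the constant 10 -> O(n^2 * log n)
ratios_n2logn = [steps(n) / (n * n * math.log(n)) for n in (10, 100, 1000)]
assert all(abs(r - 10) < 1e-9 for r in ratios_n2logn)
```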
You do keep the log, because log(n) increases as n increases, and since it is a multiplied factor, it increases your overall complexity.
As a general rule, you only remove constants. So for example, if you had O(2 * n^2), you would just say the complexity is O(n^2), because running it on a machine that is twice as powerful shouldn't influence the complexity.
In the same way, if you had complexity O(n^2 + n^2), you would reduce to the above case and just say it's O(n^2). Since log(n) grows more slowly than n^2, if you had O(n^2 + log(n)), you would say the complexity is O(n^2), because the total is even less than O(2 * n^2).
O(n^2 * log(n)) does not fall into either of the above situations, so you should not simplify it.
If the complexity of some algorithm is O(n^2), it can be written as O(n*n). Is it O(n)? Absolutely not. So O(n^2 * log n) is not O(n^2). What you may want to know is that O(n^2 + log n) = O(n^2).
A simple explanation:

O(n^2 + n) can be written as O(n^2), because as we increase n, the relative difference between n^2 + n and n^2 becomes negligible. Thus it can be written O(n^2).

Meanwhile, in O(n^2 * log n), as n increases, the difference between n^2 and n^2 * log n keeps growing, unlike the above case. Therefore, log n stays.
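This intuition can be sketched numerically in Python: the relative gap between n^2 + n and n^2 shrinks toward zero, while the relative gap between n^2 * log(n) and n^2 keeps growing.

```python
import math

def rel_gap(big, small):
    # Relative difference between two running-time expressions.
    return (big - small) / small

# (n^2 + n) vs n^2: the relative gap is 1/n, which vanishes as n grows.
gaps_add = [rel_gap(n * n + n, n * n) for n in (10, 100, 1000)]
assert gaps_add[0] > gaps_add[1] > gaps_add[2]

# (n^2 * log n) vs n^2: the relative gap is log(n) - 1, which keeps growing.
gaps_mul = [rel_gap(n * n * math.log(n), n * n) for n in (10, 100, 1000)]
assert gaps_mul[0] < gaps_mul[1] < gaps_mul[2]
```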