Suppose we want to add a large number of integers. How long will this operation take? The first thing to consider is how many integers there are; call this number n. Can we say exactly how long the addition will take? Only for a specific computer. No universal answer exists, because computation speed depends heavily on the processor and the (primary) memory: a Celeron processor has far less processing power than a Core i7.
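To make this concrete, here is a minimal Python sketch (the function sum_of and the timing code are illustrations of my own, not part of the original discussion). The loop performs exactly one addition per element, so the amount of work grows linearly with n; only the constant factor, the time per addition, depends on the machine.

    import time

    def sum_of(numbers):
        # One addition per element: n additions in total for n integers.
        total = 0
        for x in numbers:
            total += x
        return total

    n = 1_000_000
    data = list(range(n))
    start = time.perf_counter()
    sum_of(data)
    print(f"n = {n}: {time.perf_counter() - start:.4f} s")  # constant factor is machine-dependent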

Counting the number of statements in a program, or the number of statements executed, is not a meaningful measure either, because the statement count depends on the programming language: a program that takes 100 statements in C might be written in just 10 lines of Python.

To solve this problem, the complexity of an algorithm is expressed in asymptotic notation, which describes how the running time grows as the input size grows. Typically, we measure the complexity of an algorithm with three notations:

Big-O

Big-O is a measure of the maximum time it takes to run an algorithm (the worst case); that is, the algorithm's upper bound is expressed through this notation.

f(n) = O(g(n)) (read as “f of n is big oh of g of n”) iff (if and only if) there exist positive constants c and n_0 such that f(n) \le c\,g(n) for all n \ge n_0

Example: 3n+2=O(n), since 3n+2 \le 4n for all n \ge 2.

10n^2+4n+2=O(n^2), since 10n^2+4n+2 \le 11n^2 for all n \ge 5.

g(n) is an upper bound on the value of f(n) for all n \ge n_0.
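As a quick numeric sanity check of the constants above (a snippet of my own; testing a finite range does not prove the bound, it only spot-checks it), we can confirm the inequalities in Python:

    # Spot-check: 3n + 2 <= 4n with c = 4, n0 = 2,
    # and 10n^2 + 4n + 2 <= 11n^2 with c = 11, n0 = 5.
    assert all(3*n + 2 <= 4*n for n in range(2, 10_000))
    assert all(10*n**2 + 4*n + 2 <= 11*n**2 for n in range(5, 10_000))
    print("Big-O bounds hold on the tested range")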

Theory:

If f(n)=a_m n^m+\cdots+a_1 n+a_0, then f(n)=O(n^m).

Big-Ω

Big-Ω is a measure of the minimum time required to run an algorithm (the best case); that is, the algorithm's lower bound is expressed through this notation.

f(n) = Ω(g(n)) (read as “f of n is omega of g of n”) iff there exist positive constants c and n_0 such that f(n) \ge c\,g(n) for all n \ge n_0

Example: 3n+2=\Omega(n), since 3n+2 \ge 3n for all n \ge 1.

10n^2+4n+2=\Omega(n^2), since 10n^2+4n+2 \ge n^2 for all n \ge 1.

g(n) is only a lower bound on f(n)
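The same kind of spot-check (again my own illustration, not a proof) works for the lower bounds:

    # Spot-check: 3n + 2 >= 3n with c = 3, n0 = 1,
    # and 10n^2 + 4n + 2 >= n^2 with c = 1, n0 = 1.
    assert all(3*n + 2 >= 3*n for n in range(1, 10_000))
    assert all(10*n**2 + 4*n + 2 >= n**2 for n in range(1, 10_000))
    print("Big-Omega bounds hold on the tested range")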

Theory:

If f(n)=a_m n^m+\cdots+a_1 n+a_0 with a_m>0, then f(n)=\Omega(n^m).

Big-Θ

Big-Θ is a tight bound on the time an algorithm takes to run: the running time is bounded both above and below by the same function g(n), up to constant factors.

f(n) = Θ(g(n)) (read as “f of n is theta of g of n”) iff there exist positive constants c_1, c_2, and n_0 such that c_1 g(n) \le f(n) \le c_2 g(n) for all n \ge n_0

Example: 3n+2=\Theta(n), since 3n+2 \ge 3n and 3n+2 \le 4n for all n \ge 2; so c_1=3, c_2=4, and n_0=2.

10n^2+4n+2=\Theta(n^2), with, for example, c_1=1, c_2=11, and n_0=5 (combining the two bounds shown earlier).

g(n) is both an upper and lower bound on f(n)
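Both inequalities of the Θ definition can be spot-checked together (an illustration of my own, using the constants derived above):

    # Spot-check: c1*g(n) <= f(n) <= c2*g(n) for f(n) = 3n + 2, g(n) = n,
    # with c1 = 3, c2 = 4, n0 = 2.
    c1, c2, n0 = 3, 4, 2
    assert all(c1*n <= 3*n + 2 <= c2*n for n in range(n0, 10_000))
    print("Big-Theta bounds hold on the tested range")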

Theory:

If f(n)=a_m n^m+\cdots+a_1 n+a_0 with a_m>0, then f(n)=\Theta(n^m).
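One way to see why this theorem is plausible (a numeric illustration of my own, not a proof): the ratio f(n)/n^m approaches the leading coefficient a_m as n grows, so constants just below and just above a_m eventually sandwich f(n).

    # For f(n) = 10n^2 + 4n + 2, the ratio f(n) / n^2 approaches 10,
    # so f(n) is eventually sandwiched between, say, 9n^2 and 11n^2.
    f = lambda n: 10*n**2 + 4*n + 2
    for n in (10, 100, 1000, 10_000):
        print(n, f(n) / n**2)   # 10.42, 10.0402, 10.004002, 10.00040002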

[Figure: Asymptotic notation and algorithm complexity — a comparison of the three notations.]
