What is Asymptotic Analysis?

The time required by an algorithm falls under three cases:

1. Worst case : It defines the input for which the algorithm takes the longest time.
2. Average case : It is the average time taken over all possible inputs.
3. Best case : It defines the input for which the algorithm takes the least time.
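To make the three cases concrete, here is a minimal sketch using linear search (a hypothetical helper chosen for illustration, not part of the original notes): the best case is a hit on the first element, the worst case is a miss after scanning all n elements, and the average case is roughly n/2 comparisons for a uniformly random hit.

```python
def linear_search(arr, target):
    """Scan arr left to right; return the index of target, or -1 if absent."""
    for i, value in enumerate(arr):
        if value == target:
            return i          # found: stop early
    return -1                 # not found: every element was compared

data = [7, 3, 9, 1, 5]

# Best case: target is the first element -- a single comparison.
assert linear_search(data, 7) == 0
# Worst case: target is absent -- all n elements are compared.
assert linear_search(data, 4) == -1
```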

Asymptotic Notations:

The asymptotic notations commonly used for expressing the running-time
complexity of an algorithm are given below:

1. Big Oh Notation (O):

-> It measures the performance of an algorithm by providing the order
of growth of the function.
-> This notation provides an upper bound on a function, which ensures that the function
never grows faster than that upper bound.

Example: If f(n) and g(n) are two functions defined for positive integers,
then f(n) = O(g(n)), read "f(n) is big oh of g(n)" or "f(n) is of the order of g(n)", if there exist constants c and n0 such that:
f(n) <= c.g(n) for all n >= n0
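The definition above can be checked numerically. This sketch takes f(n) = 3n + 2 with g(n) = n and hand-picked witnesses c = 4, n0 = 2 (values chosen here for illustration, not from the original notes), and verifies the inequality over a range of n:

```python
# Big-O check: f(n) <= c * g(n) must hold for every n >= n0.
f = lambda n: 3 * n + 2   # the function being bounded
g = lambda n: n           # the claimed order of growth
c, n0 = 4, 2              # witnesses: 3n + 2 <= 4n whenever n >= 2

assert all(f(n) <= c * g(n) for n in range(n0, 1000))
# The bound fails below n0: f(1) = 5 > 4 = c * g(1).
assert not f(1) <= c * g(1)
```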

2. Theta Notation (θ)

-> The theta notation mainly describes average-case scenarios.
-> It represents the realistic time complexity of an algorithm.
-> Big theta is mainly used when the worst-case and best-case times are the same.

Example : Let f(n) and g(n) be functions of n, where n is the number of steps required to execute the program.
f(n) = θ(g(n))
The above condition is satisfied only when there exist constants c1, c2 and n0 such that:
c1.g(n) <= f(n) <= c2.g(n) for all n >= n0
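The two-sided bound can be tested the same way. As a sketch (with f, c1, c2, n0 chosen here for illustration), take f(n) = 2n + 3 and g(n) = n, which are sandwiched by c1 = 2 and c2 = 3 once n >= 3:

```python
# Big-Theta check: c1 * g(n) <= f(n) <= c2 * g(n) for every n >= n0.
f = lambda n: 2 * n + 3
g = lambda n: n
c1, c2, n0 = 2, 3, 3      # 2n <= 2n + 3 always; 2n + 3 <= 3n once n >= 3

assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
```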

3. Omega Notation ( Ω )

-> It basically describes the best-case scenario, which is the opposite of big oh notation.
-> It is a formal way to represent a lower bound on an algorithm's running time.
-> It measures the least amount of time an algorithm can possibly take to complete, i.e. the best-case time complexity.

Example : If f(n) and g(n) are two functions defined for positive integers,
then f(n) = Ω(g(n)), read "f(n) is omega of g(n)" or "f(n) is of the order of g(n)", if there exist constants c and n0 such that:
f(n) >= c.g(n) for all n >= n0 and c > 0
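The lower-bound definition can be sketched in the same style (f, c, and n0 below are illustrative choices, not from the original notes): f(n) = n² grows at least as fast as g(n) = n, witnessed by c = 1 and n0 = 1.

```python
# Big-Omega check: f(n) >= c * g(n) must hold for every n >= n0, with c > 0.
f = lambda n: n * n
g = lambda n: n
c, n0 = 1, 1              # n^2 >= n whenever n >= 1

assert c > 0
assert all(f(n) >= c * g(n) for n in range(n0, 1000))
```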