As we discussed in our previous post on analysis of algorithms, the running time of an algorithm depends on two factors:

- The machine it runs on
- The speed of the algorithm itself

Of these two, we can control only the speed of the algorithm; the machine and its speed may vary.

So, in this post, we are going to learn about time complexity, an important tool for understanding how fast an algorithm runs.

## What is Time Complexity

The time complexity of an algorithm or a program is the amount of time it needs to run to completion.

The exact time will depend on the implementation of the algorithm, the programming language, the optimizing capabilities of the compiler, the CPU speed, and other hardware characteristics. Therefore, to measure time complexity in a machine-independent way, we count the operations performed by the algorithm rather than timing it.

The time complexity also depends on the amount of data given to the algorithm as input.
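To make this concrete, here is a small sketch (the function name and setup are my own, not from the post) that counts the comparisons performed by a linear search instead of measuring wall-clock time. The count depends only on the input size, not on the machine:

```python
def linear_search_ops(arr, target):
    """Return (index, number of comparisons performed)."""
    ops = 0
    for i, value in enumerate(arr):
        ops += 1                      # one comparison per element visited
        if value == target:
            return i, ops
    return -1, ops

# The operation count grows with the input size, regardless of CPU speed.
_, ops_small = linear_search_ops(list(range(10)), -1)    # 10 comparisons
_, ops_large = linear_search_ops(list(range(1000)), -1)  # 1000 comparisons
print(ops_small, ops_large)
```

Running the search on a list 100 times larger performs 100 times more comparisons, which is exactly the kind of growth that time complexity describes.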

We will learn each topic in the easiest way possible.

## Units of measuring Time Complexity

### Asymptotic Notation

Asymptotic notations are mathematical tools for describing the time complexity of algorithms in asymptotic analysis. The following three notations are most commonly used to represent the running time of algorithms, and we will look at each of them in turn.

##### 1. **O Notation** (Big-O)

O notation is the simplest and most widely used asymptotic notation. It gives an upper bound on the algorithm. What is an upper bound? It means the running time of the program cannot grow beyond this limit.

Let's take an example. If we say that the Big-O of linear search is O(n), where n is the input size, this simply means that n is an upper bound: if we run our program, the running time grows no faster than linearly in n.

```
O(g(n)) = { f(n): there exist positive constants c and n0
            such that 0 <= f(n) <= c*g(n) for all n >= n0 }
```

The notable points are:

- Big-O notation is usually used for the worst-case scenario
- Maximum Time
- Upper Bound
- Most Used

##### 2. **Ω Notation** (Big-Omega)

As we know, Big-O notation provides an asymptotic upper bound on a function. Ω notation, in contrast, provides an asymptotic lower bound.

Big-Omega notation is useful when we want to state a lower bound on the time complexity of an algorithm. It is the least used of the three notations.

```
Ω(g(n)) = { f(n): there exist positive constants c and n0
            such that 0 <= c*g(n) <= f(n) for all n >= n0 }
```

The notable points are:

- Best-case scenario
- Minimum Time
- Lower Bound
- Least Used

##### 3. **Θ Notation** (Big-Theta)

So far, we have learned the upper bound and the lower bound. In addition, there is Big-Theta, which combines the two: it bounds a function from above and below at the same time, so it defines the exact asymptotic behavior.

The notable points are:

- Often associated with the average-case scenario
- Tight (exact) bound
- Bounds from both above and below
- Less used than Big-O

```
Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0
            such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }
```

### Important Points in Time Complexity

- Most of the time, we perform worst-case analysis. It guarantees an upper bound on the running time of an algorithm, which is a valuable piece of information.
- Average-case analysis is not easy to do in most practical cases and is rarely done, because it requires knowing (or predicting) the mathematical distribution of all possible inputs.
- Best-case analysis is of little use: guaranteeing a lower bound on an algorithm provides no real information, since in the worst case the algorithm may still take years to run.

In conclusion, Big-O is the most used and most important of the three notations.