For Loop Analysis | The Complete Guide

For Loop

Whenever we want to repeat a set of instructions, we use loops. The three common loop forms are while, do-while, and for, and the for loop is the most used of the three.

Whenever a set of instructions repeats, it has a large impact on the algorithm's running time.

In the previous post, we noted that the worst case is used most often, so we analyze our for loops in the worst-case scenario.

Similarly, in this post, we are going to learn how to analyze a for loop to calculate the running time of an algorithm.

After all, we have already discussed asymptotic notation, and we know the best case, average case, and worst case.

In this post, we discuss the basic time complexity of for loops with the help of simple examples.

For Loop Analysis

1. O(1) in for loop

The time complexity of a function (or set of statements) is considered O(1) if it contains no loops, no recursion, no calls to other non-constant-time functions, and no other repetitive statements.

For example, this add function:

void add(int a, int b)
{
    int sum = a + b;
    cout << sum;
}

Another notable point: whenever a loop runs a constant number of times, it is treated as O(1). For example,

for (int i = 0; i < C; i++)   // C is a constant
{
    // O(1) expressions
}

O(n)

The time complexity of a loop is considered O(n) if its loop variable is incremented or decremented by a constant amount. For example, the following loops have O(n) time complexity.

// Here c is a positive integer constant

for (int i = 1; i <= n; i = i + c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i = i - c) {
    // some O(1) expressions
}
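One way to convince yourself of this is to count iterations directly. The sketch below (the helper name `iterations` is ours, for illustration) counts how many times the first loop above runs: roughly n/c times, which is O(n) for any constant c.

```cpp
#include <cassert>

// Counts how many times "for (int i = 1; i <= n; i = i + c)" iterates.
// For n = 10, c = 2 the loop visits i = 1, 3, 5, 7, 9: five iterations.
int iterations(int n, int c) {
    int count = 0;
    for (int i = 1; i <= n; i = i + c)
        count++;
    return count;  // roughly n / c iterations, i.e. O(n) for constant c
}
```

Halving the step size doubles the count, but for any fixed c the count still grows linearly in n.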

O(n^c)

The time complexity of a nested loop equals the number of times the innermost statement is executed.

For example, the first sample loop below has O(n^2) time complexity and the second has O(n^3).

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        // some O(1) expressions
    }
}

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        for (int k = 0; k < n; k++)
        {
            // some O(1) expressions
        }
    }
}
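To check the O(n^2) claim, one can count how many times the innermost statement runs. The helper below (ours, for illustration) does exactly that for the doubly nested loop: one execution per (i, j) pair.

```cpp
#include <cassert>

// Counts executions of the innermost O(1) statement in the
// doubly nested loop above: one execution per (i, j) pair.
long long inner_count(int n) {
    long long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            count++;
    return count;  // n * n, hence O(n^2)
}
```

Adding a third level of nesting multiplies the count by n again, giving the O(n^3) case.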

O(log(n))

The time complexity of a loop is considered O(log n) if its loop variable is divided or multiplied by a constant amount.


for (int i = 1; i <= n; i = i * c) {
    // some O(1) expressions
}

for (int i = n; i > 0; i = i / c) {
    // some O(1) expressions
}
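To see why this is logarithmic, count the iterations: i takes the values 1, c, c^2, ..., so the loop runs about log base c of n times. A small sketch (the helper name is ours) with c = 2:

```cpp
#include <cassert>

// Counts iterations of "for (int i = 1; i <= n; i = i * 2)".
// i takes the values 1, 2, 4, 8, ..., so for n = 16 the loop
// runs 5 times (floor(log2(16)) + 1).
int log_iterations(int n) {
    int count = 0;
    for (int i = 1; i <= n; i = i * 2)
        count++;
    return count;  // floor(log2(n)) + 1, i.e. O(log n)
}
```

Even for n = 1000 the loop runs only 10 times, which is why logarithmic loops are so fast.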

O(log(log(n)))

The time complexity of a loop is considered O(log log n) if its loop variable is increased or decreased exponentially by a constant power.

// Here c is a constant greater than 1

for (int i = 2; i <= n; i = pow(i, c)) {
    // some O(1) expressions
}
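With c = 2 this loop squares i each time, so the exponent of 2 doubles every step: i goes 2, 4, 16, 256, 65536, ... Reaching n therefore takes about log2(log2(n)) steps. A sketch (ours, using integer squaring in place of pow to avoid floating point):

```cpp
#include <cassert>

// Counts iterations of the squaring loop (c = 2): i = i * i is the
// integer form of i = pow(i, 2). For n = 65536 the loop visits
// i = 2, 4, 16, 256, 65536: five iterations.
int loglog_iterations(long long n) {
    int count = 0;
    for (long long i = 2; i <= n; i = i * i)
        count++;
    return count;  // O(log log n)
}
```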

Consecutive For Loops

When there are consecutive loops, we calculate the time complexity as the sum of the time complexities of the individual loops.

for (int i = 1; i <= m; i += c) {
    // some O(1) expressions
}

for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
The time complexity of the above code is O(m) + O(n), which is O(m + n).

If m == n, the time complexity becomes O(2n), which is O(n).
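Counting iterations confirms the sum rule. The sketch below (the helper name is ours) runs the two loops above with c = 1 and returns the combined count, m + n.

```cpp
#include <cassert>

// Total iterations of two consecutive loops (step c = 1):
// the first contributes m, the second n, so O(m) + O(n) = O(m + n).
long long total_iterations(int m, int n) {
    long long count = 0;
    for (int i = 1; i <= m; i += 1)
        count++;
    for (int i = 1; i <= n; i += 1)
        count++;
    return count;  // m + n
}
```

Contrast this with nesting: consecutive loops add their costs, nested loops multiply them.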

What is the time complexity when there are if-else statements inside for loops?

As discussed earlier, the worst-case time complexity is the most useful among best, average, and worst. Therefore, we need to consider the worst case.

When there are multiple if-else branches, we consider only the most expensive branch, since the worst case follows the costliest path through the code.

For example, consider the linear search function, where we considered the case when the element is present at the end or not present at all.

When the code is too complex to consider all if-else cases, we can get an upper bound by ignoring if-else and other complex control statements.
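For concreteness, here is a minimal linear search sketch (our own version, since the post does not list the code). The early return inside the loop is exactly the if condition we reason about: in the best case it fires on the first element (O(1)); in the worst case the element is at the end or absent, and all n elements are compared (O(n)).

```cpp
#include <cassert>
#include <vector>

// Returns the index of target in a, or -1 if it is absent.
int linear_search(const std::vector<int>& a, int target) {
    for (int i = 0; i < (int)a.size(); i++)
        if (a[i] == target)
            return i;   // best case: found immediately, O(1)
    return -1;          // worst case: n comparisons, O(n)
}
```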


What is Time Complexity || Complete Guide

As we discussed in our previous post on analysis of algorithms, we know the factors on which the time complexity of an algorithm depends:

  1. Machine
  2. Algorithm Speed

Of these, we can control only the algorithm's speed; the machine and its speed may vary.

So, in this post, we are going to learn about time complexity, which is an important tool for knowing an algorithm's running speed.

What is Time Complexity

The time complexity of an algorithm or a program is the amount of time it needs to run to completion.

The exact time will depend on the implementation of the algorithm, the programming language, the optimizing capabilities of the compiler used, the CPU speed, and other hardware characteristics. Therefore, to measure time complexity in a machine-independent way, we count the operations performed by the algorithm.

The time complexity also depends on the amount of data inputted to an algorithm.

We will learn each topic in the easiest way possible.

Units of measuring Time Complexity

Asymptotic Notation

Asymptotic notations are mathematical tools to represent the time complexity of algorithms for asymptotic analysis. The following three asymptotic notations are most often used to represent the running time of algorithms.

Now, we are going to learn these three asymptotic notations one by one to analyze the running time of a program.

1. O Notation (Big-O)

Big-O is the simplest and most used asymptotic notation. It gives an upper bound on the algorithm's running time. What is an upper bound? It means the running time of the program cannot grow beyond this rate.

Let's take an example. Suppose we say that the Big-O of linear search is O(n), where n is the input size. This simply means that n is the upper bound: if we run our program, the running time grows at most proportionally to n.

O(g(n)) = { f(n) : there exist positive constants c and n0
            such that 0 <= f(n) <= c*g(n) for all n >= n0 }
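As a concrete illustration of the definition (the function f and the constants below are our own example, not part of the definition): take f(n) = 3n + 2 and g(n) = n. With c = 4 and n0 = 2 we have 0 <= 3n + 2 <= 4n for all n >= 2, so f(n) = O(n). The check below verifies the inequality over a sample range.

```cpp
#include <cassert>

// Verifies 0 <= f(n) <= c * g(n) for all n0 <= n <= upto,
// with f(n) = 3n + 2, g(n) = n, c = 4, n0 = 2.
bool big_o_witness_holds(int upto) {
    for (int n = 2; n <= upto; n++) {
        int f = 3 * n + 2;
        if (!(0 <= f && f <= 4 * n))
            return false;
    }
    return true;
}
```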

From the above, the notable points are:

  1. Big-O notation is most often used for the worst-case scenario
  2. Maximum time
  3. Upper bound
  4. Most used

2. Ω Notation (Big-Omega)

As we know, Big-O notation provides an asymptotic upper bound on a function, whereas Ω notation provides an asymptotic lower bound.

Big-Omega notation is useful when we have a lower bound on the time complexity of an algorithm. It is the least used notation.

Ω(g(n)) = { f(n) : there exist positive constants c and n0
            such that 0 <= c*g(n) <= f(n) for all n >= n0 }
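An illustrative example (ours, not from the definition): with f(n) = 3n + 2 and g(n) = n, the constants c = 3 and n0 = 1 satisfy 0 <= 3n <= 3n + 2 for all n >= 1, so f(n) = Ω(n). The check below verifies the inequality over a sample range.

```cpp
#include <cassert>

// Verifies 0 <= c * g(n) <= f(n) for all n0 <= n <= upto,
// with f(n) = 3n + 2, g(n) = n, c = 3, n0 = 1.
bool big_omega_witness_holds(int upto) {
    for (int n = 1; n <= upto; n++) {
        int f = 3 * n + 2;
        if (!(0 <= 3 * n && 3 * n <= f))
            return false;
    }
    return true;
}
```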

From the above, the notable points are:

  1. Best-case scenario
  2. Minimum time
  3. Lower bound
  4. Least used

3. Θ Notation (Big-Theta)

Above, we learned the upper bound and the lower bound. In addition, there is Big-Theta, which combines the two.

The Theta notation bounds a function from above and below, so it defines exact (tight) asymptotic behavior. It is often applied to the average case, although a tight bound is not the same thing as an average.

From the above, the notable points are:

  1. Tight bound (from above and below)
  2. Exact growth rate
  3. Often used for average-case analysis
  4. Less used than Big-O
Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0
            such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }
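An illustrative example (ours, not from the definition): f(n) = 2n + 3 is Θ(n), since c1 = 2, c2 = 3, and n0 = 3 give 0 <= 2n <= 2n + 3 <= 3n for all n >= 3. The check below verifies both inequalities over a sample range.

```cpp
#include <cassert>

// Verifies 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= upto,
// with f(n) = 2n + 3, g(n) = n, c1 = 2, c2 = 3, n0 = 3.
bool theta_witness_holds(int upto) {
    for (int n = 3; n <= upto; n++) {
        int f = 2 * n + 3;
        if (!(0 <= 2 * n && 2 * n <= f && f <= 3 * n))
            return false;
    }
    return true;
}
```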

Important Points in Time Complexity

  • Most of the time, we do worst-case analysis. In worst-case analysis, we guarantee an upper bound on the running time of an algorithm, which is a good piece of information.
  • Average-case analysis is not easy to do in most practical cases and is rarely done. For average-case analysis, we must know (or predict) the mathematical distribution of all possible inputs.
  • Best-case analysis is of little use. Guaranteeing a lower bound on an algorithm doesn't provide much information, as in the worst case the algorithm may still take years to run.

In conclusion, Big-O is the most used and most important notation.
