
Recursion Time Complexity || Analysis of Recursion

Recursion and Its Time Complexity

As we know, "Recursion is a technique of repeating a set of instructions to solve a specific problem." But how do we analyze a recursive function and find its time complexity?

So, in this post we are going to discuss a method by which you can easily calculate the time complexity of recursion.

Recursive Function

void fun(int n)
{
    if (n <= 0)
        return;
    fun(n / 2);
    fun(n / 2);
}

The above function is recursive: it returns whenever n becomes less than or equal to zero, but before reaching that base case it calls itself twice, fun(n/2) and fun(n/2).

Suppose the time complexity of fun(n) is T(n).

Then the time complexity of fun(n/2) is T(n/2). [Simple substitution]

So we can say T(n) = T(n/2) + T(n/2) + C = 2T(n/2) + C ---(1)   [From the above recursive function]

where C is a constant and represents the time taken by the non-recursive part of the function, i.e. the check

if (n <= 0)
    return;

Now we know how to write the recurrence, and the only part remaining is to calculate the time complexity. Here are two simple steps:

  1. Write the non-recursive part as the root of a tree and the recursive part as its children.
  2. Keep expanding until you see a pattern.
From equation (1): T(n) = 2T(n/2) + C

Recursive part     = 2T(n/2)
Non-recursive part = C

Follow Step 1:

                          C            [Non-recursive part as the root of the tree]
                       /     \
                  T(n/2)     T(n/2)    [Recursive part as children]

Now that we know how to write the root and its children, let us expand the tree up to 4 levels so the pattern becomes easy to see.

                    Level 1   C
                          /    \
             Level 2   T(n/2)  T(n/2)    

But we know T(n/2) = 2T(n/4) + C ---(2)   [Replace n by n/2 in equation (1)]
Again, we follow Step 1:
      Level 1            C                      from eq (1) [Non-recursive part as root]
                      /     \
      Level 2        C       C                  from eq (2) [Non-recursive part as root]
                    /  \    /  \
      Level 3  T(n/4) T(n/4) T(n/4) T(n/4)      [Recursive part as children]

But again we know T(n/4) = 2T(n/8) + C ---(3)   [Replace n by n/4 in equation (1)]

      Level 1                  C                            from eq (1) [Non-recursive part as root]
                           /       \
      Level 2             C         C                       from eq (2) [Non-recursive part as root]
                        /   \     /   \
      Level 3          C     C   C     C                    from eq (3) [Non-recursive part as root]
                      / \   / \ / \   / \
      Level 4  T(n/8) T(n/8) T(n/8) T(n/8) T(n/8) T(n/8) T(n/8) T(n/8)  [Recursive part as children]
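If you prefer to see this expansion produced by the machine, here is a minimal sketch (the original function rewritten in Go, with a depth parameter and the name funTrace added purely for illustration) that prints every call indented by its depth; the output has the same branching shape as the tree above, it just goes all the way down to the base case:

package main

import (
	"fmt"
	"strings"
)

// funTrace mirrors fun from above but takes an extra depth parameter
// and prints each call indented by its depth, so the output matches
// the shape of the recursion tree drawn above.
func funTrace(n, depth int) {
	fmt.Printf("%sT(%d)\n", strings.Repeat("  ", depth), n)
	if n <= 0 {
		return
	}
	funTrace(n/2, depth+1)
	funTrace(n/2, depth+1)
}

func main() {
	funTrace(8, 0) // print the call tree for n = 8
}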


Now add up the constant time at every level and you will see a pattern:

Level 1: C
Level 2: 2C
Level 3: 4C

And you may guess that at Level 4 the sum must be 8C.

Hence, adding the time at each level: T(n) = C + 2C + 4C + 8C + ...

Now we have the series, but can you guess how many terms it has?

Yes: every call halves n, so the total number of terms (i.e. the number of levels) is about log(n), with the logarithm taken base 2.

So sum the above series as a geometric progression with first term a = C, common ratio r = 2, and log(n) terms.

T(n) = C * (2^log(n) – 1) / (2 – 1)   [Sum of a GP: a(r^k – 1)/(r – 1)]

T(n) = C * (2^log(n) – 1) = C * (n – 1)   [Since 2^log(n) = n when the log is base 2]

T(n) = O(n)

Hence, the time complexity of the above recursive function is O(n).
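You can also sanity-check the O(n) result empirically. The sketch below (in Go; the calls helper is not part of the original code and is added only for illustration) counts the total number of calls fun(n) makes; doubling n roughly doubles the count, which is exactly linear growth:

package main

import "fmt"

// calls counts how many times fun(n) would be invoked in total,
// following the same recurrence: one call plus two calls on n/2.
func calls(n int) int {
	if n <= 0 {
		return 1 // the base-case call itself
	}
	return 1 + calls(n/2) + calls(n/2)
}

func main() {
	for _, n := range []int{8, 16, 32, 64} {
		// the total roughly doubles whenever n doubles, i.e. it grows as O(n)
		fmt.Printf("n = %2d  total calls = %d\n", n, calls(n))
	}
}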

Now try calculating the time complexity of the recursive functions described by the recurrences below:

T(n) = 2T(n/2) + Cn

T(n) = T(n-1) + Cn
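If you want to check your answer for the first recurrence, expand it with the same tree method: the non-recursive work at Level 1 is Cn, at Level 2 it is 2 * C(n/2) = Cn, at Level 3 it is 4 * C(n/4) = Cn, and so on. Every level contributes Cn and there are log(n) levels, so

T(n) = Cn + Cn + ... + Cn   [log(n) terms]
     = Cn log(n)
     = O(n log n)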

Time Complexity of Binary Search (Recursive Way)

Now let's compute the time complexity of binary search. The code, written in Go, goes as follows:

// Find returns the smallest index i at which x <= a[i].
// If there is no such index, it returns len(a).
// The slice must be sorted in ascending order.
func Find(a []int, x int) int {
    switch len(a) {
    case 0:
        return 0
    case 1:
        if x <= a[0] {
            return 0
        }
        return 1
    }
    mid := 1 + (len(a)-1)/2
    if x <= a[mid-1] {
        return Find(a[:mid], x)
    }
    return mid + Find(a[mid:], x)
}
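Here is a small usage sketch (assuming Find from the snippet above is in the same package; the sample slice is made up for illustration):

package main

import "fmt"

func main() {
	a := []int{1, 3, 5, 7, 9, 11} // must be sorted in ascending order

	fmt.Println(Find(a, 7))  // 3: a[3] = 7 is the smallest element with 7 <= a[i]
	fmt.Println(Find(a, 8))  // 4: a[4] = 9 is the smallest element with 8 <= a[i]
	fmt.Println(Find(a, 20)) // 6: no element satisfies 20 <= a[i], so len(a) is returned
}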

Calculation

Use the notation T(n) to mean the number of elementary operations performed by this algorithm in the worst case, when given a sorted slice of n elements.

Now the recurrence is,

  • T(1) = 1,  (*)
  • T(n) = 1 + T(n/2), when n > 1.  (**)

Equation (**) captures the fact that the function performs a constant amount of work (that's the 1) and then makes a single recursive call on a slice of size n/2.

(In fact, the slice may also end up having n/2 + 1 elements. We don’t worry about that, since we’re only looking for an asymptotic estimate.)

T(n) =                                       (**)
1 + T(n/2) =                                 (**)
1 + (1 + T(n/4)) = 2 + T(n/4) =              (**)
2 + (1 + T(n/8)) = 3 + T(n/8) = ...
k + T(n/2^k) = ...
log n + T(n/2^(log n)) = log n + T(1) =      (*)
log n + 1 = Θ(log n).

Conclusion

In this article, we discussed how to find the time complexity of recursive functions, working through examples with detailed explanations. We hope this article helped you learn the topic more easily. Keep practising and happy coding! Leave your comments below if you have any queries.

Sudhanshu is a technology geek and a pro PUBG player. He likes to create content in his free time that helps others. He is currently pursuing a BCA in Noida and operates Geekstocode.
