Complexity Analysis — an overview

Hridhi Sethi
4 min read · Oct 18, 2020
Complexity analysis for choosing the optimal solution

Suppose your mom asked you to bring some groceries for a special dinner tonight. You got all the items except the main ingredient. Now you need to visit another shop, for which there are 3 different possibilities, supposing that there are ‘n’ stores.
Possibility 1: You can go to a store, ask the shopkeeper about the item, and also ask him whether it is available in each of the other ‘n-1’ stores. This could take a lot of time enquiring, right?
Possibility 2: You can visit all the stores one by one till you find the required item. This is the common method, but it is also time consuming.
Possibility 3: You can divide the shops into 2 groups, find out which group has the item, discard the other, and repeat the process on the remaining group until you reach the right store and get the item in the least time possible.

Let’s dig deeper:

This is the reason why we require complexity analysis: it lets us compare different algorithms and pick the most optimal one for the given problem. Always remember that you need to break a given algorithm into smaller parts, find the time complexity of each part, and combine them to get the overall time complexity. A simple statement like an assignment takes O(1) time. Keeping that in mind, let’s find the time complexity of the following algorithms.

Example 1: First, we will try to understand how to calculate the time complexity of for loops.
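The original code for this example is not shown, so here is a minimal sketch (with assumed function names) of how loop structure maps to time complexity: a single loop over n items is O(n), and two nested loops give O(n²).

```python
def sum_of_list(arr):
    total = 0           # a single assignment: O(1)
    for x in arr:       # the loop body runs n times: O(n)
        total += x
    return total

def count_pairs(arr):
    count = 0
    n = len(arr)
    for i in range(n):              # outer loop: n iterations
        for j in range(i + 1, n):   # inner loop: up to n-1 iterations each time
            count += 1              # total work ~ n(n-1)/2, which is O(n^2)
    return count
```

Dropping constants and lower-order terms, n(n-1)/2 simplifies to O(n²).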

Example 2: Secondly, we will try to estimate the time complexity of recursive algorithms (a technique where a function calls itself repeatedly until a required condition, the base case, is met).

Approach: Suppose that the function fun(int n) takes time T(n). Then we know that the call fun(n/2) takes time T(n/2). Since fun makes two such calls plus a constant amount of work c, the overall time complexity is T(n) = 2T(n/2) + c.
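The article’s original code is not shown, so the following is an assumed sketch of a function matching the recurrence T(n) = 2T(n/2) + c: constant work at each call, plus two recursive calls on half the input.

```python
def fun(n):
    if n <= 1:
        return 1                       # base case: constant work, T(1) = c
    # two recursive calls on n/2 plus constant work: T(n) = 2T(n/2) + c
    return fun(n // 2) + fun(n // 2)   # returns the number of base-case calls made
```

For n a power of two, the recursion tree has log n levels and the number of calls doubles at each level.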

Analysis: Summing the work level by level of the recursion tree, the overall complexity is
T(n) = c + 2c + 4c + … + 2^(logn)·c
After applying the geometric progression formula we get
T(n) = c(2^(logn+1) − 1) = O(2^(logn))

Finally, since 2^(log₂n) = n, the complexity of the given recursive function is O(2^(logn)) = O(n).

Different solutions to a problem will have different time complexities. Among them we select the optimal solution on the basis of the time and space consumed, so complexity analysis plays a very vital role. Let’s see different functions and their growth rates, from which you can see why linear algorithms are preferred over exponential-time algorithms.

O(1)<O(logn)<O(n)<O(nlogn)<O(n²)<O(n³)<O((3/2)^n)<O(2^n)
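To make this ordering concrete, here is a quick illustration (the choice of n = 16 is mine, not the article’s) evaluating each growth function at a single input size:

```python
import math

n = 16
growth = {
    "O(1)":       1,
    "O(log n)":   math.log2(n),        # grows very slowly
    "O(n)":       n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)":     n ** 2,
    "O(2^n)":     2 ** n,              # already 65536 at n = 16
}
for name, value in growth.items():
    print(f"{name:>10}: {value:g}")
```

Even at this small n, the exponential term dwarfs the others, and the gap only widens as n grows.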

For a given problem there are 2 cases of time complexity commonly analysed: the worst case, which gives the upper limit on the running time of the algorithm, and the average case, which determines the cost of the algorithm on average. Let’s look at the quick sort algorithm to understand these cases well.

Quick sort problem: This uses a divide and conquer approach for sorting, wherein we choose a pivot and partition the array such that elements to the left are less than the pivot and elements to the right are greater than the pivot.
On average the algorithm runs with a time complexity of O(nlogn). But in the worst case, when the pivot chosen is the maximum/minimum element in the array, the time complexity becomes O(n²).
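A minimal quick sort sketch (an assumed implementation, not the article’s code) that follows the partitioning rule above; picking the middle element as the pivot is just one common choice:

```python
def quick_sort(arr):
    if len(arr) <= 1:
        return arr                              # base case: already sorted
    pivot = arr[len(arr) // 2]                  # middle element as pivot
    left  = [x for x in arr if x < pivot]       # elements less than the pivot
    mid   = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]       # elements greater than the pivot
    return quick_sort(left) + mid + quick_sort(right)
```

When the pivot splits the array evenly, the recursion depth is log n and each level does O(n) work, giving O(nlogn); when the pivot is always the minimum or maximum, the depth becomes n, giving O(n²).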

An Example:

In this algorithm the outer loop runs ‘n’ times, while the inner loop executes ‘n’ times in the first iteration, ‘n/2’ times in the second, ‘n/3’ times in the third, and so on. The total work is the harmonic series n(1 + 1/2 + 1/3 + … + 1/n), so the time complexity is O(nlogn).
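The original code for this example is not shown, so the following is a plausible reconstruction of a nested loop with that behaviour: the inner loop’s step grows with the outer index, so it runs about n/i times on iteration i.

```python
def harmonic_loop(n):
    count = 0
    for i in range(1, n + 1):     # outer loop: n iterations
        for j in range(0, n, i):  # inner loop: about n / i iterations
            count += 1
    return count                  # count ~ n * (1 + 1/2 + ... + 1/n) = O(n log n)
```

The sum 1 + 1/2 + … + 1/n is roughly ln n, which is why the total comes out to O(nlogn) rather than O(n²).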

The main purpose of learning complexity analysis is to find better, more optimal algorithms for a given problem, which in turn requires practice. You can practice more questions related to time complexity online.


Hridhi Sethi

I am a student pursuing my 3rd year of BTech in computer science at Amrita Coimbatore. I like building stuff and writing. AI and ML interest me.