What does ‘Space Complexity’ mean?

The term Space Complexity is often misused to mean Auxiliary Space. The correct definitions of Auxiliary Space and Space Complexity are as follows.

Auxiliary Space is the extra or temporary space used by an algorithm. Space Complexity is the total space taken by an algorithm with respect to the input size, and it includes both auxiliary space and the space used by the input.

For example, if we want to compare standard sorting algorithms on the basis of space, Auxiliary Space is a better criterion than Space Complexity. Merge Sort uses O(n) auxiliary space, while Insertion Sort and Heap Sort use O(1) auxiliary space. The space complexity of all three, however, is O(n).

Space complexity is a parallel concept to time complexity. If we need to create an array of size n, this will require O(n) space. If we create a two-dimensional array of size n*n, this will require O(n^2) space.
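As a minimal sketch of the two allocations just described (the function names are illustrative, not from the original, and C++ vectors stand in for raw arrays):

```cpp
#include <cassert>
#include <vector>

// Builds a one-dimensional array of n elements: O(n) auxiliary space.
std::vector<int> makeArray(int n) {
    return std::vector<int>(n, 0);
}

// Builds an n*n two-dimensional array: O(n^2) auxiliary space.
std::vector<std::vector<int>> makeGrid(int n) {
    return std::vector<std::vector<int>>(n, std::vector<int>(n, 0));
}
```

Each element occupies real memory, so the total allocation grows linearly with n in the first case and quadratically in the second.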

In recursive calls, stack space also counts.

Example : 

int add(int n) {
    if (n <= 0) {
        return 0;
    }
    return n + add(n - 1);
}

Here each call adds a level to the stack:

1. add(4)
2. -> add(3)
3. -> add(2)
4. -> add(1)
5. -> add(0)
Each of these calls is added to the call stack and takes up actual memory.
So it takes O(n) space.

However, just because you have n calls total doesn’t mean it takes O(n) space.
Look at the function below:

int pairSum(int x, int y) {
    return x + y;
}

int addSequence(int n) {
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += pairSum(i, i + 1);
    }
    return sum;
}

There will be roughly O(n) calls to pairSum. However, those
calls do not exist simultaneously on the call stack,
so you only need O(1) space.

Asymptotic Analysis of Algorithms Notes for GATE Exam [2024]

Asymptotic Analysis of Algorithms is a critical topic for the GATE (Graduate Aptitude Test in Engineering) exam, especially for candidates in computer science and related fields. This set of notes provides an in-depth understanding of how algorithms behave as input sizes grow, which is fundamental for assessing their efficiency.

Table of Content

  • Introduction of Algorithms
  • Asymptotic Analysis
  • Worst, Best and Average Case
  • How to Analyse Loops for Complexity Analysis of Algorithms?
  • How to combine the time complexities of consecutive loops? 
  • Algorithms Cheat Sheet for Complexity Analysis:
  • Runtime Analysis of Algorithms:
  • Little o and Little omega notations
  • What does ‘Space Complexity’ mean?
  • Previous Year GATE Questions

Introduction of Algorithms

The word Algorithm means “A set of rules to be followed in calculations or other problem-solving operations” or “A procedure for solving a mathematical problem in a finite number of steps that frequently involves recursive operations”....

Asymptotic Analysis

Given two algorithms for a task, how do we find out which one is better?...

Measurement of Complexity of an Algorithm (Worst, Best and Average Case)

Based on the above three notations of Time Complexity, there are three cases to analyze an algorithm:...

How to Analyse Loops for Complexity Analysis of Algorithms?

Constant Time Complexity O(1):...

How to combine the time complexities of consecutive loops?

When there are consecutive loops, we calculate time complexity as a sum of the time complexities of individual loops....

Algorithms Cheat Sheet for Complexity Analysis:

Algorithm       Best Case   Average Case  Worst Case
Selection Sort  O(n^2)      O(n^2)        O(n^2)
Bubble Sort     O(n)        O(n^2)        O(n^2)
Insertion Sort  O(n)        O(n^2)        O(n^2)
Tree Sort       O(nlogn)    O(nlogn)      O(n^2)
Radix Sort      O(dn)       O(dn)         O(dn)
Merge Sort      O(nlogn)    O(nlogn)      O(nlogn)
Heap Sort       O(nlogn)    O(nlogn)      O(nlogn)
Quick Sort      O(nlogn)    O(nlogn)      O(n^2)
Bucket Sort     O(n+k)      O(n+k)        O(n^2)
Counting Sort   O(n+k)      O(n+k)        O(n+k)

Runtime Analysis of Algorithms:

In general, we measure and compare the worst-case theoretical running-time complexities of algorithms for performance analysis. The fastest possible running time for any algorithm is O(1), commonly referred to as Constant Running Time. In this case, the algorithm always takes the same amount of time to execute, regardless of the input size. This is the ideal runtime for an algorithm, but it is rarely achievable. In practice, the performance (runtime) of an algorithm depends on n, the size of the input, and on the number of operations required for each input item. Algorithms can be classified as follows from best-to-worst performance (Running Time Complexity):...

Little o and Little omega notations:

Little-o: Big-O is used as a tight upper bound on the growth of an algorithm’s effort (this effort is described by the function f(n)), even though, as written, it can also be a loose upper bound. “Little-o” (o()) notation is used to describe an upper bound that cannot be tight....
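The standard limit definitions (a worked note added here for reference; these are the textbook formulations, not quoted from the original text) make the contrast precise:

```latex
% Little-o: f grows strictly slower than g
f(n) = o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
% Example: n = o(n^2), since \lim_{n \to \infty} n / n^2 = \lim_{n \to \infty} 1/n = 0,
% but n \neq o(n), since the ratio tends to 1, not 0.

% Little-omega is the dual: f grows strictly faster than g
f(n) = \omega(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty
```

So while O(g(n)) permits f and g to grow at the same rate, o(g(n)) forbids it.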

What does ‘Space Complexity’ mean?

The term Space Complexity is often misused to mean Auxiliary Space. The correct definitions of Auxiliary Space and Space Complexity are as follows....

Previous Year GATE Questions:

1. What is the worst-case time complexity of inserting n elements into an empty linked list, if the linked list needs to be maintained in sorted order? More than one answer may be correct. [GATE CSE 2020]
(A) Θ(n)
(B) Θ(n log n)
(C) Θ(n^2)
(D) Θ(1)
Solution: Correct answer is (C)...
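A sketch of why Θ(n^2) holds (illustrative code, not part of the original solution; `std::list` stands in for a hand-rolled linked list):

```cpp
#include <cassert>
#include <list>

// Inserting into a sorted linked list: find the position by linear scan,
// then splice the node in. One insertion into a list of k elements costs
// O(k) comparisons in the worst case, so inserting n elements one by one
// costs 1 + 2 + ... + (n - 1) = Θ(n^2) overall (e.g., ascending input).
void sortedInsert(std::list<int>& lst, int value) {
    auto it = lst.begin();
    while (it != lst.end() && *it < value) ++it;  // linear scan for position
    lst.insert(it, value);                        // O(1) splice once found
}
```

Binary search cannot reduce the scan, because a linked list has no O(1) random access; that is why Θ(n log n) is not achievable here.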
