Before jumping into solving Data Structure and Algorithm (DSA) problems, one of the most essential skills every coder must develop is the ability to analyze the performance of their code.
When you write code to solve a problem, have you ever wondered:
- Is this the fastest way to solve it?
- Can I make it use less memory?
- Will it still work if the input size becomes huge?
To answer these questions, you need to understand two key tools in Data Structures and Algorithms (DSA):
✅ Time Complexity
✅ Space Complexity
And to measure them effectively, we use Asymptotic Notations like Big O, Omega, and Theta.
🕒 What is Time Complexity?
Time Complexity refers to the amount of time an algorithm takes to complete, depending on the size of the input (commonly denoted as n).
💡 Real-Life Analogy:
Imagine searching for a book in:

- An unorganized pile (you check every book) → takes more time.
- A sorted shelf (you use labels) → faster!
Time complexity tells you how many steps your algorithm takes as input grows.
Example 1: Linear Search
int search(int[] arr, int target) {
    for (int i = 0; i < arr.length; i++) {
        if (arr[i] == target) return i;
    }
    return -1;
}
In this code:

- If n = 5, at worst you check 5 items.
- If n = 1000, you check 1000 items.

➡️ Number of steps = n

✅ Time Complexity: O(n) (linear)
Example 2: Constant Time Access (Array Element Access)
int getFirst(int[] arr) {
    return arr[0];
}
In this code:

- No matter how big the array is, you're always accessing just one element.
- There's no loop and no recursion; nothing grows with n.
🧠 This is the fastest possible time complexity!
➡️ Time Complexity: O(1) (Constant Time)
Example 3: Time Complexity of Recursive Function (Fibonacci)
Let's look at the classic recursive Fibonacci function:
int fibonacci(int n) {
    if (n <= 1)
        return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}
What This Code Does:

- It calculates the n-th Fibonacci number using recursion.
- It calls itself twice for each n > 1.
🔍 Let's Break Down the Time Complexity:
For n = 5, the function creates a tree of calls:

fibonacci(5)
├─ fibonacci(4)
│  ├─ fibonacci(3)
│  │  ├─ fibonacci(2)
│  │  │  ├─ fibonacci(1)
│  │  │  └─ fibonacci(0)
│  │  └─ fibonacci(1)
│  └─ fibonacci(2)
│     ├─ fibonacci(1)
│     └─ fibonacci(0)
└─ fibonacci(3)
   ├─ fibonacci(2)
   └─ fibonacci(1)
✅ The number of calls grows exponentially!

Time Complexity: O(2^n)

Because each call branches into 2 more calls (like a binary tree).

For n = 10, it already makes 177 calls. For n = 50, the call count runs into the tens of billions, which makes this approach extremely inefficient!
🧮 Common Time Complexities:
| Notation | Name | Example Scenario |
|---|---|---|
| O(1) | Constant Time | Accessing an array index |
| O(n) | Linear Time | Linear Search |
| O(log n) | Logarithmic Time | Binary Search |
| O(n²) | Quadratic Time | Bubble Sort |
💾 What is Space Complexity?
Space Complexity refers to how much extra memory your algorithm needs to run.
It includes:

- Input size
- Temporary variables
- Function call stack (in recursion)
🔁 Example 1: Constant Space
int sum(int[] arr) {
    int total = 0;      // O(1) extra memory
    for (int num : arr) {
        total += num;   // the loop uses no extra space
    }
    return total;
}
(The input size may grow, but memory usage remains constant)
🔁 Example 2: Storing elements in an array

If an algorithm allocates a new array big enough to hold all n input elements, the extra memory it needs grows linearly with n.

✅ Space Complexity: O(n)
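As a sketch of such a case (the copy method and class name are my own example, not from the article), copying the input into a new array allocates n extra slots:

```java
public class CopyArray {
    // Allocates a new array the same length as the input, so the
    // extra memory grows linearly with n -> O(n) space complexity.
    static int[] copy(int[] arr) {
        int[] result = new int[arr.length]; // O(n) extra memory
        for (int i = 0; i < arr.length; i++) {
            result[i] = arr[i];
        }
        return result;
    }

    public static void main(String[] args) {
        int[] copied = copy(new int[]{1, 2, 3});
        System.out.println(java.util.Arrays.toString(copied)); // [1, 2, 3]
    }
}
```

Contrast this with the sum example above: sum reads every element but never allocates memory proportional to n, which is why it stays O(1).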
What are Asymptotic Notations?
Asymptotic notations are symbols that help you describe time/space complexity formally:
| Notation | Meaning | Use Case |
|---|---|---|
| O(f(n)) | Big O: Worst-case scenario | Always used in interviews |
| Ω(f(n)) | Omega: Best-case scenario | For optimization discussions |
| Θ(f(n)) | Theta: Tight bound | When best = worst = average |
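To make best case versus worst case concrete, here is an instrumented linear search (a sketch; the SearchSteps class and steps counter are my own names). In the best case the target is the first element, so the search takes 1 step (Ω(1)); in the worst case the target is absent and all n elements are checked (O(n)):

```java
public class SearchSteps {
    static int steps;

    // Linear search that records how many elements it examined.
    static int search(int[] arr, int target) {
        steps = 0;
        for (int i = 0; i < arr.length; i++) {
            steps++;
            if (arr[i] == target) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = {7, 3, 9, 1, 5};
        search(arr, 7);   // best case: target is first -> 1 step (Ω(1))
        System.out.println("best case steps: " + steps);
        search(arr, 42);  // worst case: not found -> n steps (O(n))
        System.out.println("worst case steps: " + steps);
    }
}
```

Big O describes the second run; Omega describes the first. Since the two differ, linear search has no single Theta bound over all inputs.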
| Input size n | O(1) | O(log n) | O(n) | O(n log n) | O(n²) |
|---|---|---|---|---|---|
| 1 | 1 | 0 | 1 | 0 | 1 |
| 10 | 1 | 3.3 | 10 | 33 | 100 |
| 100 | 1 | 6.6 | 100 | 660 | 10,000 |
🧠 O(n²) grows much faster than O(n)
✅ That's why choosing a better algorithm matters!
How to Calculate Time Complexity
🔸 Rule 1: Count Loops. A single loop that runs n times contributes O(n).

🔸 Rule 2: Nested Loops. Nested loops multiply: a loop of n iterations inside another loop of n iterations gives O(n × n) = O(n²).
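These two rules can be sketched as follows (the LoopCounting class and method names are my own illustration):

```java
public class LoopCounting {
    // Rule 1: a single loop over n elements -> O(n) steps.
    static long singleLoop(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++) {
            steps++;
        }
        return steps;   // exactly n steps
    }

    // Rule 2: nested loops multiply -> O(n * n) = O(n²) steps.
    static long nestedLoops(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                steps++;
            }
        }
        return steps;   // exactly n * n steps
    }

    public static void main(String[] args) {
        System.out.println(singleLoop(10));   // 10
        System.out.println(nestedLoops(10));  // 100
    }
}
```

Doubling n doubles the work for the single loop but quadruples it for the nested loops, which is exactly the gap between O(n) and O(n²) in the table above.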
Why Does Complexity Matter?
When working with large data or solving coding interview problems, you should aim for:
- Less time to run = faster algorithms
- Less memory usage = optimized space
💡 Example:
- Linear Search: O(n)
- Binary Search: O(log n)
Binary Search is faster, but only works on sorted data.
🔎 Sample Code Comparisons
🔹 Linear Search: O(n) time, O(1) space

🔹 Binary Search: O(log n) time, O(1) space
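Linear search was shown in Example 1 above. For the binary search side of the comparison, here is a standard iterative implementation in Java (a sketch; the class name is mine). It halves the search range on every iteration, giving O(log n) time, and uses only a few variables, giving O(1) space:

```java
public class BinarySearchDemo {
    // Iterative binary search on a SORTED array.
    static int binarySearch(int[] arr, int target) {
        int lo = 0, hi = arr.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;  // avoids overflow vs (lo + hi) / 2
            if (arr[mid] == target) return mid;
            if (arr[mid] < target) lo = mid + 1;  // discard left half
            else hi = mid - 1;                    // discard right half
        }
        return -1;  // not found
    }

    public static void main(String[] args) {
        int[] sorted = {1, 3, 5, 7, 9, 11};
        System.out.println(binarySearch(sorted, 7));  // 3
        System.out.println(binarySearch(sorted, 4));  // -1
    }
}
```

Note the precondition: the array must already be sorted, which is exactly the trade-off mentioned above.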
📊 Summary Table
| Algorithm | Time Complexity | Space Complexity |
|---|---|---|
| Access Array | O(1) | O(1) |
| Linear Search | O(n) | O(1) |
| Binary Search | O(log n) | O(1) |
| Bubble Sort | O(n²) | O(1) |
| Merge Sort | O(n log n) | O(n) |