Algorithm Analysis

Easy · 15 min read

Why Algorithm Analysis Matters

Understanding how to analyze algorithms is crucial for:

  • 📊 Predicting Performance: Know how your code will scale
  • ⚖️ Making Trade-offs: Choose the right algorithm for your needs
  • 🎯 Interview Success: Essential for technical interviews
  • 🚀 Writing Efficient Code: Build scalable applications
💡 Real-World Example: Instagram needs to display your feed instantly whether you have 100 or 100,000 posts. Algorithm analysis helps engineers choose the right data structures and algorithms to make this possible!

Big O Notation Basics

What is Big O?

Big O notation describes an upper bound on how an algorithm's running time (or space) grows as the input size grows. In practice it is most often quoted for the worst case.
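As a quick illustration, here is a minimal linear search. Its best case finds the target immediately, but Big O describes the worst case, where every element is examined:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1. Worst case: O(n) comparisons."""
    for i, value in enumerate(items):
        if value == target:
            return i  # best case: target is first, O(1)
    return -1  # worst case: scanned all n elements

print(linear_search([3, 1, 4, 1, 5], 5))  # → 4
print(linear_search([3, 1, 4, 1, 5], 9))  # → -1
```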

Common Time Complexities

  • O(1): constant
  • O(log n): logarithmic
  • O(n): linear
  • O(n log n): linearithmic
  • O(n²): quadratic
  • O(2ⁿ): exponential

Complexity Comparison

Complexity   Name           n=10   n=100    n=1000      Example
O(1)         Constant       1      1        1           Array access
O(log n)     Logarithmic    3      7        10          Binary search
O(n)         Linear         10     100      1000        Linear search
O(n log n)   Linearithmic   30     700      10,000      Merge sort
O(n²)        Quadratic      100    10,000   1,000,000   Bubble sort

(Operation counts are rounded approximations.)
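You can reproduce these counts yourself. The sketch below (the helper name `operation_counts` is mine, and base-2 logarithms are assumed) computes each row; note the table's O(log n) and O(n log n) entries are rounded:

```python
import math

def operation_counts(n):
    """Approximate operation counts for each complexity class at input size n."""
    return {
        "O(1)": 1,
        "O(log n)": round(math.log2(n)),
        "O(n)": n,
        "O(n log n)": round(n * math.log2(n)),
        "O(n^2)": n ** 2,
    }

for n in (10, 100, 1000):
    print(n, operation_counts(n))
```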

Big O Rules

1. Drop Constants: O(2n) → O(n), O(n/2) → O(n)
2. Drop Lower-Order Terms: O(n² + n) → O(n²), O(n log n + n) → O(n log n)
3. Use Different Variables for Different Inputs: two separate arrays give O(a + b), not O(2n); nested loops over two arrays give O(a × b), not O(n²)
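Rule 3 is the one people trip over most, so here is a small sketch (the function names are mine) contrasting the two shapes:

```python
def process_both(a, b):
    """O(a + b): each list is traversed once, independently."""
    total = 0
    for x in a:      # O(a)
        total += x
    for y in b:      # O(b)
        total += y
    return total

def all_pairs(a, b):
    """O(a × b): every element of a is paired with every element of b."""
    return [(x, y) for x in a for y in b]

print(process_both([1, 2], [3, 4, 5]))     # → 15
print(len(all_pairs([1, 2], [3, 4, 5])))   # → 6
```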

Common Algorithm Patterns

1. Nested Loops

# O(n²) - Nested loops
for i in range(n):
    for j in range(n):
        print(i, j)

# O(n³) - Triple nested
for i in range(n):
    for j in range(n):
        for k in range(n):
            print(i, j, k)

2. Divide and Conquer

# O(log n) - Binary Search
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1
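In production Python you rarely hand-roll this: the standard-library `bisect` module gives the same O(log n) behavior. A minimal sketch (the wrapper name `binary_search_bisect` is mine):

```python
import bisect

def binary_search_bisect(arr, target):
    """O(log n) search over a sorted list using the stdlib bisect module."""
    i = bisect.bisect_left(arr, target)  # leftmost insertion point for target
    if i < len(arr) and arr[i] == target:
        return i
    return -1

print(binary_search_bisect([1, 3, 5, 7, 9], 7))  # → 3
print(binary_search_bisect([1, 3, 5, 7, 9], 4))  # → -1
```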

3. Recursive Patterns

# O(2ⁿ) - Fibonacci (bad)
def fib_slow(n):
    if n <= 1:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

# O(n) - Fibonacci (optimized)
def fib_fast(n):
    if n <= 1:
        return n
    a, b = 0, 1
    for _ in range(2, n + 1):
        a, b = b, a + b
    return b
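A middle ground worth knowing: keep the recursive shape but cache results, so each value of n is computed once. A sketch using the standard-library `functools.lru_cache` (the function name `fib_memo` is mine):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """O(n) time: each distinct n is computed once, then served from the cache."""
    if n <= 1:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(10))  # → 55
```

This trades O(n) cache space for the recursive clarity of the slow version.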

Practice Problems

Problem 1: Analyze the Loop

Easy
for i in range(n):
    for j in range(i):
        print(i, j)

# What's the time complexity?

Answer: O(n²)
The inner loop runs: 0 + 1 + 2 + ... + (n-1) = n(n-1)/2 times
Which simplifies to O(n²)
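You can confirm the count empirically. The sketch below (the helper name `count_inner_iterations` is mine) counts inner-loop executions and matches n(n−1)/2 exactly:

```python
def count_inner_iterations(n):
    """Count how many times the inner loop body runs for a given n."""
    count = 0
    for i in range(n):
        for j in range(i):
            count += 1
    return count

print(count_inner_iterations(10))  # → 45
print(10 * 9 // 2)                 # → 45
```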

Problem 2: Recursive Complexity

Medium
def mystery(n):
    if n <= 1:
        return 1
    return mystery(n // 2) + mystery(n // 2)

# What's the time complexity?

Answer: O(n)
T(n) = 2T(n/2) + O(1)
By the Master Theorem (Case 1: a = 2, b = 2, so n^(log₂ 2) = n dominates f(n) = O(1)): Θ(n)

Problem 3: Space Complexity

Hard
def create_matrix(n):
    matrix = []
    for i in range(n):
        row = [0] * n
        matrix.append(row)
    return matrix

# What's the space complexity?

Answer: O(n²)
We create an n×n matrix, which requires n² space

Advanced Topics

Master Theorem

For recurrences of the form: T(n) = aT(n/b) + f(n)

Case 1: If f(n) = O(n^(log_b(a) − ε)) for some ε > 0, then T(n) = Θ(n^(log_b(a)))
Case 2: If f(n) = Θ(n^(log_b(a))), then T(n) = Θ(n^(log_b(a)) × log n)
Case 3: If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0 (and f satisfies the regularity condition), then T(n) = Θ(f(n))
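As a worked check: merge sort has a = 2, b = 2, f(n) = Θ(n) = Θ(n^(log₂ 2)), so Case 2 gives Θ(n log n). The sketch below (assuming merge cost exactly n and powers of two, counting only merge work) evaluates the recurrence numerically and compares it to n log₂ n:

```python
import math

def T(n):
    """Merge-sort recurrence T(n) = 2·T(n/2) + n, with T(1) = 0 (merge cost only)."""
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n

# For powers of two, T(n) equals n·log2(n) exactly:
for n in (8, 64, 1024):
    print(n, T(n), int(n * math.log2(n)))
```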

Amortized Analysis

Average performance over a sequence of operations.

Dynamic Array Resizing:
  • Most appends: O(1)
  • Occasional resize: O(n)
  • Amortized cost: O(1)

Example sequence for n=8 (assuming capacity starts at 4 and doubles when full):
  1. Append → O(1)
  2. Append → O(1)
  3. Append → O(1)
  4. Append → O(1)
  5. Append + resize → O(4) (copies the 4 existing elements)
  6. Append → O(1)
  7. Append → O(1)
  8. Append → O(1)

Total: O(n) for n operations → amortized O(1) per operation
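A toy doubling array makes this concrete. The sketch below (the class `DynamicArray` and its counters are mine, not how CPython's list is actually implemented) counts total elements copied during resizes; over n appends the copy work stays below 2n, which is the amortized O(1) argument in code:

```python
class DynamicArray:
    """Toy append-only array that doubles its capacity when full."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0          # total elements moved during all resizes
        self._data = [None]

    def append(self, value):
        if self.size == self.capacity:          # full: resize, O(current size)
            self.capacity *= 2
            self._data = self._data[:] + [None] * (self.capacity - self.size)
            self.copies += self.size            # account for the copied elements
        self._data[self.size] = value           # the append itself is O(1)
        self.size += 1

arr = DynamicArray()
for i in range(1024):
    arr.append(i)
# Total copy work (1 + 2 + 4 + ... + 512 = 1023) is under 2n:
print(arr.copies, "<", 2 * arr.size)
```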

Space-Time Trade-offs

Algorithm     Time         Space   Trade-off
Bubble Sort   O(n²)        O(1)    Slow but in-place
Merge Sort    O(n log n)   O(n)    Fast but needs extra space
Hash Table    O(1) avg     O(n)    Fast access, more memory

Quick Reference

Common Data Structure Operations

Data Structure       Access      Search      Insert      Delete
Array                O(1)        O(n)        O(n)        O(n)
Linked List          O(n)        O(n)        O(1)        O(1)
Hash Table           N/A         O(1)*       O(1)*       O(1)*
Binary Search Tree   O(log n)*   O(log n)*   O(log n)*   O(log n)*

* Average case. Worst case may differ.

Sorting Algorithm Complexities

Algorithm     Best         Average      Worst        Space
Bubble Sort   O(n)         O(n²)        O(n²)        O(1)
Quick Sort    O(n log n)   O(n log n)   O(n²)        O(log n)
Merge Sort    O(n log n)   O(n log n)   O(n log n)   O(n)
Heap Sort     O(n log n)   O(n log n)   O(n log n)   O(1)
💡 Interview Tip: Always discuss both time AND space complexity. Sometimes a slightly slower algorithm with better space complexity is the right choice!