🚀 Data Structures & Algorithms

Complete Revision Guide with Interactive Tables


📊 Data Structures Comparison

| Data Structure | Access | Search | Insert | Delete | Space | Key Properties | Use Cases |
|----------------|--------|--------|--------|--------|-------|----------------|-----------|
| Array | O(1) | O(n) | O(n) | O(n) | O(n) | Contiguous memory, indexed | Random access, mathematical ops |
| Dynamic Array | O(1) | O(n) | O(1)\* | O(n) | O(n) | Resizable, amortized insert | General-purpose collections |
| Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | Sequential access only | Frequent insertions/deletions |
| Doubly Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | Bidirectional traversal | Undo operations, LRU cache |
| Stack | O(n) | O(n) | O(1) | O(1) | O(n) | LIFO principle | Function calls, DFS, expression evaluation |
| Queue | O(n) | O(n) | O(1) | O(1) | O(n) | FIFO principle | BFS, process scheduling |
| Binary Tree | O(n) | O(n) | O(n) | O(n) | O(n) | Hierarchical structure | Expression trees, decision trees |
| BST | O(log n) | O(log n) | O(log n) | O(log n) | O(n) | Left < root < right; degrades to O(n) if unbalanced | Sorted data, range queries |
| AVL Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(n) | Self-balancing BST | Guaranteed O(log n) operations |
| Red-Black Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(n) | Self-balancing via node colors | Standard library implementations |
| Heap | O(n) | O(n) | O(log n) | O(log n) | O(n) | Complete binary tree; peek min/max in O(1) | Priority queues, heap sort |
| Hash Table | - | O(1)\* | O(1)\* | O(1)\* | O(n) | Key-value mapping | Fast lookups, caching |
| Trie | O(m) | O(m) | O(m) | O(m) | O(n×m) | Prefix tree for strings (m = key length) | Autocomplete, spell check |

\*Amortized or average case; worst case is O(n).
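The O(m) bounds in the Trie row follow from walking one node per character. A minimal sketch (class and method names here are illustrative):

```python
class TrieNode:
    def __init__(self):
        self.children = {}    # maps a character to the child TrieNode
        self.is_word = False  # True if a complete word ends at this node

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        """O(m): walk or create one node per character."""
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def search(self, word):
        """O(m): exact-word lookup."""
        node = self._walk(word)
        return node is not None and node.is_word

    def starts_with(self, prefix):
        """O(m): prefix lookup -- the basis of autocomplete."""
        return self._walk(prefix) is not None

    def _walk(self, s):
        node = self.root
        for ch in s:
            node = node.children.get(ch)
            if node is None:
                return None
        return node
```

Note that every operation costs O(m) in the key length m, independent of how many words the trie holds.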

🌐 Graph Algorithms

| Algorithm | Time Complexity | Space | Purpose | Key Points |
|-----------|-----------------|-------|---------|------------|
| BFS | O(V + E) | O(V) | Shortest path (unweighted) | Level by level, uses a queue |
| DFS | O(V + E) | O(V) | Graph traversal, cycle detection | Uses a stack or recursion |
| Dijkstra | O((V + E) log V) | O(V) | Shortest path (weighted, no negative edges) | Greedy, uses a priority queue |
| Bellman-Ford | O(VE) | O(V) | Shortest path (handles negative weights) | Detects negative cycles |
| Floyd-Warshall | O(V³) | O(V²) | All-pairs shortest paths | Dynamic programming |
| Kruskal's MST | O(E log E) | O(V) | Minimum spanning tree | Sort edges, union-find |
| Prim's MST | O((V + E) log V) | O(V) | Minimum spanning tree | Greedy, priority queue |
| Topological Sort | O(V + E) | O(V) | Ordering in a DAG | Kahn's algorithm or DFS |
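The BFS row above (level-by-level traversal with a queue, O(V + E)) can be sketched as a shortest-path finder on an unweighted graph; the function name and adjacency-dict representation are illustrative:

```python
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """Shortest path in an unweighted graph, O(V + E): BFS visits
    nodes level by level, so the first time we reach `goal` the
    path through the predecessor map is shortest."""
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            path = []             # reconstruct by walking predecessors
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nbr in graph.get(node, []):
            if nbr not in prev:
                prev[nbr] = node
                queue.append(nbr)
    return None                   # goal unreachable from start
```

Swapping the queue for a stack (or recursion) turns this into DFS, which visits every node but no longer guarantees shortest paths.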

🔄 Sorting Algorithms

| Algorithm | Best | Average | Worst | Space | Stable | In-Place | Notes |
|-----------|------|---------|-------|-------|--------|----------|-------|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes | Simple, good for small datasets |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No | Yes | Simple, not stable |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes | Good for small or nearly sorted data |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | No | Divide and conquer |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | Yes | Pivot-based, cache-friendly |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No | Yes | Uses a heap data structure |
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(k) | Yes | No | For integers in range k |
| Radix Sort | O(d(n + k)) | O(d(n + k)) | O(d(n + k)) | O(n + k) | Yes | No | For integers; d = number of digits |
| Bucket Sort | O(n + k) | O(n + k) | O(n²) | O(n + k) | Yes | No | For uniformly distributed data |
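Merge sort is the cleanest illustration of the divide-and-conquer row: split, sort halves recursively, merge. A minimal sketch (returns a new list, using the table's O(n) auxiliary space):

```python
def merge_sort(items):
    """Stable O(n log n) sort: log n levels of splitting,
    O(n) merging work per level."""
    if len(items) <= 1:
        return items[:]
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in original order (stability)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The `<=` in the merge step is what makes the sort stable; changing it to `<` would still sort correctly but could reorder equal elements.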

🔍 Searching Algorithms

| Algorithm | Time Complexity | Space | Prerequisites | Use Case |
|-----------|-----------------|-------|---------------|----------|
| Linear Search | O(n) | O(1) | None | Unsorted data |
| Binary Search | O(log n) | O(1) | Sorted array | Sorted data |
| Exponential Search | O(log n) | O(1) | Sorted array | Unbounded/infinite arrays |
| Interpolation Search | O(log log n) average, O(n) worst | O(1) | Sorted, uniformly distributed | Uniformly distributed data |
| Ternary Search | O(log₃ n) | O(1) | Sorted array or unimodal function | Finding extrema of unimodal functions; more comparisons than binary search on sorted data |
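Binary search is the workhorse of this table: each comparison discards half of the remaining range, giving the O(log n) bound. A minimal iterative sketch:

```python
def binary_search(arr, target):
    """Return the index of `target` in sorted `arr`, or -1.
    Each iteration halves the search space: O(log n) time, O(1) space."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1
```

Exponential search uses the same routine after first doubling an upper bound (1, 2, 4, ...) until it passes the target, which is what makes it suitable for unbounded arrays.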

💡 Dynamic Programming Patterns

| Pattern | Example Problems | Key Insight | Time Complexity | Space Complexity |
|---------|------------------|-------------|-----------------|------------------|
| 0/1 Knapsack | Subset sum, partition | Include/exclude decisions | O(n×W) | O(n×W) |
| Unbounded Knapsack | Coin change, rod cutting | Unlimited use of items | O(n×W) | O(W) |
| LIS | Longest increasing subsequence | Build optimal subsequences | O(n log n) | O(n) |
| LCS | Longest common subsequence | String/sequence matching | O(m×n) | O(m×n) |
| Edit Distance | Levenshtein distance | String transformation | O(m×n) | O(m×n) |
| Matrix Chain | Optimal parenthesization | Optimal splitting points | O(n³) | O(n²) |
| Palindrome | Palindromic substrings | Expand around centers | O(n²) | O(n²) |
| Tree DP | Max path sum, diameter | Bottom-up on trees | O(n) | O(h) |
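The include/exclude insight of the 0/1 knapsack row can be sketched with a rolling 1-D array, which also cuts the table's O(n×W) space down to O(W); the function name is illustrative:

```python
def knapsack_01(weights, values, capacity):
    """0/1 knapsack: max value under the weight limit,
    each item used at most once. O(n*W) time, O(W) space."""
    best = [0] * (capacity + 1)   # best[c] = max value with capacity c
    for w, v in zip(weights, values):
        # Iterate capacities downward so each item is counted at most
        # once; iterating upward would give the *unbounded* knapsack.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c],            # exclude the item
                          best[c - w] + v)    # include the item
    return best[capacity]
```

The direction of the inner loop is the only difference between the 0/1 and unbounded variants, which is why the two rows share the same O(n×W) time bound.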

🛠️ Algorithm Design Techniques

| Technique | Description | Example Problems | Time Approach | When to Use |
|-----------|-------------|------------------|---------------|-------------|
| Divide & Conquer | Split problem, solve recursively | Merge sort, quick sort | T(n) = aT(n/b) + f(n) | Problem can be split into subproblems |
| Greedy | Make locally optimal choices | MST, Huffman coding | Prove greedy choice property | Local optimum leads to global optimum |
| Dynamic Programming | Solve subproblems, store results | Knapsack, LCS | Bottom-up or memoization | Overlapping subproblems + optimal substructure |
| Backtracking | Try all possibilities, backtrack | N-Queens, Sudoku | Exponential, with pruning | Need to explore all valid solutions |
| Two Pointers | Use two pointers to scan | Two sum, palindrome check | O(n) instead of O(n²) | Sorted array or string problems |
| Sliding Window | Maintain window of elements | Max subarray, substring problems | O(n) for subarray problems | Contiguous subarray/substring problems |
| Binary Search | Eliminate half the search space | Search in sorted array | O(log n) | Search space can be halved systematically |
| BFS/DFS | Graph/tree traversal techniques | Shortest path, connected components | O(V + E) for graphs | Graph/tree problems, pathfinding |
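The two-pointers row ("O(n) instead of O(n²)") is easiest to see on the classic two-sum problem over a sorted array; the function name is illustrative:

```python
def two_sum_sorted(arr, target):
    """Find indices of two elements in sorted `arr` summing to `target`.
    Two pointers give O(n) instead of the O(n^2) brute force."""
    lo, hi = 0, len(arr) - 1
    while lo < hi:
        s = arr[lo] + arr[hi]
        if s == target:
            return lo, hi
        if s < target:
            lo += 1   # sum too small: only moving the left pointer can increase it
        else:
            hi -= 1   # sum too large: only moving the right pointer can decrease it
    return None       # no pair sums to target
```

Each step discards one element from consideration, so the pointers meet after at most n steps; sortedness is what justifies discarding without looking back.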

⏱️ Time Complexity Quick Reference

| Complexity | Name | Example | Performance (n = 1M) | Growth Rate |
|------------|------|---------|----------------------|-------------|
| O(1) | Constant | Array access | 1 operation | 🟢 Excellent |
| O(log n) | Logarithmic | Binary search | ~20 operations | 🟢 Excellent |
| O(n) | Linear | Linear search | 1M operations | 🟡 Good |
| O(n log n) | Linearithmic | Merge sort | ~20M operations | 🟡 Fair |
| O(n²) | Quadratic | Bubble sort | 1T operations | 🔴 Poor |
| O(2ⁿ) | Exponential | Fibonacci (naive) | Intractable | 🔴 Very Poor |
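The "Performance (n = 1M)" column can be reproduced directly; this quick check is just the arithmetic behind the table:

```python
import math

n = 1_000_000

print(math.ceil(math.log2(n)))   # O(log n): ~20 operations (log2(1M) ≈ 19.93)
print(n)                         # O(n): 1M operations
print(round(n * math.log2(n)))   # O(n log n): ~20M operations
print(n ** 2)                    # O(n²): 10^12, i.e. 1T operations
```

At even a billion operations per second, the O(n²) row already costs on the order of minutes, which is why quadratic sorts are only recommended for small inputs.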

💾 Space Complexity Types

| Type | Description | Example | Memory Usage |
|------|-------------|---------|--------------|
| Constant O(1) | Fixed space regardless of input | Bubble sort | A few variables |
| Logarithmic O(log n) | Space for the recursion stack | Binary search | Call stack depth |
| Linear O(n) | Space grows with input size | Merge sort | Additional arrays |
| Quadratic O(n²) | 2D arrays/matrices | Floyd-Warshall | Matrix storage |