Data Structures and Algorithms

Note: This book is a work in progress and is not yet complete.

PART I: FOUNDATIONS

Chapter 1: Introduction to Algorithms

  • 1.1 What is an Algorithm?
    • 1.1.1 Characteristics of algorithms
    • 1.1.2 Algorithm vs Program
    • 1.1.3 Algorithm specification methods
  • 1.2 Why Study Data Structures and Algorithms?
    • 1.2.1 Problem-solving skills
    • 1.2.2 Efficiency and optimization
    • 1.2.3 Real-world applications
  • 1.3 Problem-Solving Techniques
    • 1.3.1 Understanding the problem
    • 1.3.2 Devising a plan
    • 1.3.3 Implementation strategies
    • 1.3.4 Testing and verification

Chapter 2: Algorithm Analysis

  • 2.1 Time Complexity
    • 2.1.1 Running time analysis
    • 2.1.2 Best, average, and worst case
    • 2.1.3 Amortized analysis
  • 2.2 Space Complexity
    • 2.2.1 Memory usage analysis
    • 2.2.2 Auxiliary space vs total space
    • 2.2.3 Space-time tradeoffs
  • 2.3 Asymptotic Notation
    • 2.3.1 Big-O notation (O)
    • 2.3.2 Omega notation (Ω)
    • 2.3.3 Theta notation (Θ)
    • 2.3.4 Little-o and little-omega
    • 2.3.5 Comparing growth rates
  • 2.4 Recurrence Relations
    • 2.4.1 Substitution method
    • 2.4.2 Recursion tree method
    • 2.4.3 Master theorem
    • 2.4.4 Akra-Bazzi method
  • 2.5 Empirical Analysis
    • 2.5.1 Benchmarking techniques
    • 2.5.2 Profiling tools
    • 2.5.3 Experimental analysis

Chapter 3: Mathematical Foundations

  • 3.1 Summations and Series
    • 3.1.1 Arithmetic series
    • 3.1.2 Geometric series
    • 3.1.3 Harmonic series
  • 3.2 Logarithms and Exponents
    • 3.2.1 Properties and rules
    • 3.2.2 Applications in algorithms
  • 3.3 Combinatorics
    • 3.3.1 Permutations
    • 3.3.2 Combinations
    • 3.3.3 Binomial coefficients
  • 3.4 Probability Theory Basics
    • 3.4.1 Random variables
    • 3.4.2 Expected value
    • 3.4.3 Probability distributions
  • 3.5 Proof Techniques
    • 3.5.1 Direct proof
    • 3.5.2 Proof by contradiction
    • 3.5.3 Mathematical induction
    • 3.5.4 Loop invariants

PART II: BASIC DATA STRUCTURES

Chapter 4: Arrays and Strings

  • 4.1 Arrays
    • 4.1.1 One-dimensional arrays
    • 4.1.2 Multi-dimensional arrays
    • 4.1.3 Dynamic arrays
    • 4.1.4 Array operations (insertion, deletion, search)
    • 4.1.5 Circular arrays
  • 4.2 Strings
    • 4.2.1 String representation
    • 4.2.2 String operations
    • 4.2.3 String matching basics
    • 4.2.4 StringBuilder and string buffers
  • 4.3 Matrix Operations
    • 4.3.1 Matrix representation
    • 4.3.2 Sparse matrices
    • 4.3.3 Common matrix operations

Chapter 5: Linked Lists

  • 5.1 Singly Linked Lists
    • 5.1.1 Structure and implementation
    • 5.1.2 Insertion operations
    • 5.1.3 Deletion operations
    • 5.1.4 Searching and traversal
  • 5.2 Doubly Linked Lists
    • 5.2.1 Structure and advantages
    • 5.2.2 Implementation details
    • 5.2.3 Operations
  • 5.3 Circular Linked Lists
    • 5.3.1 Singly circular
    • 5.3.2 Doubly circular
    • 5.3.3 Applications
  • 5.4 Advanced Linked List Techniques
    • 5.4.1 Fast and slow pointers
    • 5.4.2 Cycle detection (Floyd’s algorithm)
    • 5.4.3 Finding middle element
    • 5.4.4 Reversing linked lists
    • 5.4.5 Merging linked lists
  • 5.5 Skip Lists
    • 5.5.1 Structure and concept
    • 5.5.2 Operations
    • 5.5.3 Probabilistic analysis

Chapter 6: Stacks

  • 6.1 Stack Fundamentals
    • 6.1.1 LIFO principle
    • 6.1.2 Stack ADT
    • 6.1.3 Array-based implementation
    • 6.1.4 Linked-list-based implementation
  • 6.2 Stack Operations
    • 6.2.1 Push, pop, peek
    • 6.2.2 isEmpty, isFull
  • 6.3 Applications of Stacks
    • 6.3.1 Expression evaluation
    • 6.3.2 Infix to postfix conversion
    • 6.3.3 Parentheses matching
    • 6.3.4 Function call stack
    • 6.3.5 Backtracking algorithms
    • 6.3.6 Undo mechanisms

Chapter 7: Queues

  • 7.1 Queue Fundamentals
    • 7.1.1 FIFO principle
    • 7.1.2 Queue ADT
    • 7.1.3 Array-based implementation
    • 7.1.4 Linked-list-based implementation
  • 7.2 Circular Queue
    • 7.2.1 Implementation
    • 7.2.2 Advantages
  • 7.3 Deque (Double-Ended Queue)
    • 7.3.1 Operations
    • 7.3.2 Implementation
    • 7.3.3 Applications
  • 7.4 Priority Queue
    • 7.4.1 Concept and ADT
    • 7.4.2 Implementation approaches
    • 7.4.3 Applications
  • 7.5 Applications of Queues
    • 7.5.1 CPU scheduling
    • 7.5.2 BFS traversal
    • 7.5.3 Buffer management
    • 7.5.4 Asynchronous data transfer

PART III: TREES

Chapter 8: Tree Fundamentals

  • 8.1 Tree Terminology
    • 8.1.1 Nodes, edges, root
    • 8.1.2 Parent, child, sibling
    • 8.1.3 Leaf, internal nodes
    • 8.1.4 Height, depth, level
    • 8.1.5 Degree of a node
  • 8.2 Types of Trees
    • 8.2.1 Binary trees
    • 8.2.2 N-ary trees
    • 8.2.3 Ordered vs unordered trees
  • 8.3 Tree Representations
    • 8.3.1 Array representation
    • 8.3.2 Linked representation
    • 8.3.3 Left-child right-sibling

Chapter 9: Binary Trees

  • 9.1 Binary Tree Basics
    • 9.1.1 Definition and properties
    • 9.1.2 Types: full, complete, perfect, balanced
    • 9.1.3 Implementation
  • 9.2 Tree Traversals
    • 9.2.1 Inorder traversal
    • 9.2.2 Preorder traversal
    • 9.2.3 Postorder traversal
    • 9.2.4 Level-order traversal
    • 9.2.5 Morris traversal
    • 9.2.6 Iterative traversals
  • 9.3 Binary Tree Operations
    • 9.3.1 Insertion and deletion
    • 9.3.2 Searching
    • 9.3.3 Height calculation
    • 9.3.4 Counting nodes
  • 9.4 Binary Tree Problems
    • 9.4.1 Diameter of tree
    • 9.4.2 Lowest common ancestor
    • 9.4.3 Tree isomorphism
    • 9.4.4 Path sum problems
    • 9.4.5 Tree serialization/deserialization

Chapter 10: Binary Search Trees (BST)

  • 10.1 BST Fundamentals
    • 10.1.1 BST property
    • 10.1.2 Implementation
    • 10.1.3 Time complexity analysis
  • 10.2 BST Operations
    • 10.2.1 Search
    • 10.2.2 Insertion
    • 10.2.3 Deletion (three cases)
    • 10.2.4 Finding minimum/maximum
    • 10.2.5 Successor and predecessor
  • 10.3 BST Traversals and Applications
    • 10.3.1 Inorder gives sorted order
    • 10.3.2 Range queries
    • 10.3.3 Floor and ceiling

Chapter 11: Balanced Trees

  • 11.1 AVL Trees
    • 11.1.1 Balance factor
    • 11.1.2 Rotations (LL, RR, LR, RL)
    • 11.1.3 Insertion with balancing
    • 11.1.4 Deletion with balancing
    • 11.1.5 Height analysis
  • 11.2 Red-Black Trees
    • 11.2.1 Properties and rules
    • 11.2.2 Color flips
    • 11.2.3 Rotations
    • 11.2.4 Insertion
    • 11.2.5 Deletion
    • 11.2.6 Comparison with AVL trees
  • 11.3 Splay Trees
    • 11.3.1 Splaying operation
    • 11.3.2 Zig, zig-zig, zig-zag cases
    • 11.3.3 Amortized analysis
    • 11.3.4 Applications
  • 11.4 B-Trees
    • 11.4.1 Structure and properties
    • 11.4.2 Order of B-tree
    • 11.4.3 Insertion
    • 11.4.4 Deletion
    • 11.4.5 B+ trees
    • 11.4.6 B* trees
    • 11.4.7 Database applications
  • 11.5 2-3 Trees and 2-3-4 Trees
    • 11.5.1 Node types
    • 11.5.2 Operations
    • 11.5.3 Relationship to red-black trees

Chapter 12: Heaps

  • 12.1 Binary Heap
    • 12.1.1 Min-heap and max-heap properties
    • 12.1.2 Array representation
    • 12.1.3 Heapify operation
    • 12.1.4 Insert operation
    • 12.1.5 Extract-min/max operation
    • 12.1.6 Decrease/increase key
    • 12.1.7 Delete operation
  • 12.2 Building a Heap
    • 12.2.1 Bottom-up approach
    • 12.2.2 Time complexity analysis
  • 12.3 Heap Sort
    • 12.3.1 Algorithm
    • 12.3.2 In-place sorting
    • 12.3.3 Analysis
  • 12.4 Priority Queue Implementation
    • 12.4.1 Using heaps
    • 12.4.2 Operations and complexity
  • 12.5 Advanced Heap Structures
    • 12.5.1 D-ary heaps
    • 12.5.2 Binomial heaps
    • 12.5.3 Fibonacci heaps
    • 12.5.4 Pairing heaps
    • 12.5.5 Leftist heaps
    • 12.5.6 Skew heaps

Chapter 13: Advanced Tree Structures

  • 13.1 Trie (Prefix Tree)
    • 13.1.1 Structure and concept
    • 13.1.2 Insertion
    • 13.1.3 Search
    • 13.1.4 Deletion
    • 13.1.5 Autocomplete implementation
    • 13.1.6 Compressed tries (Radix trees)
    • 13.1.7 Suffix tries
  • 13.2 Segment Tree
    • 13.2.1 Range query problem
    • 13.2.2 Construction
    • 13.2.3 Query operation
    • 13.2.4 Update operation
    • 13.2.5 Lazy propagation
    • 13.2.6 Applications
  • 13.3 Fenwick Tree (Binary Indexed Tree)
    • 13.3.1 Structure
    • 13.3.2 Range sum queries
    • 13.3.3 Point updates
    • 13.3.4 2D BIT
  • 13.4 Suffix Tree
    • 13.4.1 Construction (Ukkonen’s algorithm)
    • 13.4.2 Pattern matching
    • 13.4.3 Longest repeated substring
    • 13.4.4 Applications in bioinformatics
  • 13.5 K-D Trees
    • 13.5.1 Multi-dimensional data
    • 13.5.2 Construction
    • 13.5.3 Nearest neighbor search
    • 13.5.4 Range search
  • 13.6 Interval Trees
    • 13.6.1 Overlapping interval queries
    • 13.6.2 Implementation
    • 13.6.3 Applications

PART IV: HASHING

Chapter 14: Hash Tables

  • 14.1 Hashing Fundamentals
    • 14.1.1 Hash function concept
    • 14.1.2 Direct addressing
    • 14.1.3 Hash table structure
  • 14.2 Hash Functions
    • 14.2.1 Division method
    • 14.2.2 Multiplication method
    • 14.2.3 Universal hashing
    • 14.2.4 Perfect hashing
    • 14.2.5 Cryptographic hash functions (overview)
    • 14.2.6 Rolling hash (Rabin-Karp)
  • 14.3 Collision Resolution
    • 14.3.1 Chaining (separate chaining)
    • 14.3.2 Open addressing
      • Linear probing
      • Quadratic probing
      • Double hashing
    • 14.3.3 Robin Hood hashing
    • 14.3.4 Cuckoo hashing
    • 14.3.5 Hopscotch hashing
  • 14.4 Load Factor and Resizing
    • 14.4.1 Dynamic resizing
    • 14.4.2 Rehashing strategies
  • 14.5 Applications
    • 14.5.1 Dictionaries and maps
    • 14.5.2 Caching
    • 14.5.3 Symbol tables
    • 14.5.4 Database indexing
  • 14.6 Advanced Hashing
    • 14.6.1 Consistent hashing
    • 14.6.2 Bloom filters
    • 14.6.3 Count-min sketch
    • 14.6.4 HyperLogLog

PART V: GRAPHS

Chapter 15: Graph Fundamentals

  • 15.1 Graph Terminology
    • 15.1.1 Vertices and edges
    • 15.1.2 Directed vs undirected
    • 15.1.3 Weighted vs unweighted
    • 15.1.4 Degree, in-degree, out-degree
    • 15.1.5 Path, cycle, connectivity
    • 15.1.6 Complete, bipartite, planar graphs
  • 15.2 Graph Representations
    • 15.2.1 Adjacency matrix
    • 15.2.2 Adjacency list
    • 15.2.3 Edge list
    • 15.2.4 Incidence matrix
    • 15.2.5 Space-time tradeoffs
  • 15.3 Types of Graphs
    • 15.3.1 DAG (Directed Acyclic Graph)
    • 15.3.2 Trees as graphs
    • 15.3.3 Multigraphs
    • 15.3.4 Hypergraphs

Chapter 16: Graph Traversal

  • 16.1 Breadth-First Search (BFS)
    • 16.1.1 Algorithm and implementation
    • 16.1.2 Level-order exploration
    • 16.1.3 Shortest path in unweighted graphs
    • 16.1.4 Connected components
    • 16.1.5 Bipartite graph checking
    • 16.1.6 Applications
  • 16.2 Depth-First Search (DFS)
    • 16.2.1 Algorithm and implementation
    • 16.2.2 Recursive vs iterative
    • 16.2.3 DFS tree
    • 16.2.4 Discovery and finish times
    • 16.2.5 Edge classification (tree, back, forward, cross)
    • 16.2.6 Applications
  • 16.3 Bidirectional Search
  • 16.4 Iterative Deepening DFS

Chapter 17: Graph Algorithms - Connectivity and Components

  • 17.1 Connected Components
    • 17.1.1 Finding connected components
    • 17.1.2 Union-Find (Disjoint Set Union)
      • Path compression
      • Union by rank/size
      • Time complexity analysis
  • 17.2 Strongly Connected Components
    • 17.2.1 Kosaraju’s algorithm
    • 17.2.2 Tarjan’s algorithm
    • 17.2.3 Applications
  • 17.3 Articulation Points and Bridges
    • 17.3.1 Definitions
    • 17.3.2 Finding articulation points
    • 17.3.3 Finding bridges
    • 17.3.4 Biconnected components
  • 17.4 Euler Path and Circuit
    • 17.4.1 Necessary conditions
    • 17.4.2 Hierholzer’s algorithm
    • 17.4.3 Applications
  • 17.5 Hamiltonian Path and Circuit
    • 17.5.1 Problem definition
    • 17.5.2 NP-completeness
    • 17.5.3 Backtracking approach

Chapter 18: Shortest Path Algorithms

  • 18.1 Single-Source Shortest Path
    • 18.1.1 Dijkstra’s algorithm
      • Min-heap implementation
      • Fibonacci heap optimization
      • Correctness proof
    • 18.1.2 Bellman-Ford algorithm
      • Handling negative weights
      • Negative cycle detection
    • 18.1.3 SPFA (Shortest Path Faster Algorithm)
    • 18.1.4 DAG shortest paths
  • 18.2 All-Pairs Shortest Path
    • 18.2.1 Floyd-Warshall algorithm
    • 18.2.2 Johnson’s algorithm
    • 18.2.3 Matrix multiplication approach
  • 18.3 A* Search Algorithm
    • 18.3.1 Heuristic function
    • 18.3.2 Admissibility and consistency
    • 18.3.3 Implementation
    • 18.3.4 Applications in pathfinding

Chapter 19: Minimum Spanning Tree

  • 19.1 MST Problem Definition
  • 19.2 Kruskal’s Algorithm
    • 19.2.1 Greedy approach
    • 19.2.2 Using Union-Find
    • 19.2.3 Time complexity
    • 19.2.4 Correctness proof
  • 19.3 Prim’s Algorithm
    • 19.3.1 Greedy approach
    • 19.3.2 Min-heap implementation
    • 19.3.3 Time complexity
    • 19.3.4 Comparison with Kruskal’s
  • 19.4 Borůvka’s Algorithm
  • 19.5 Applications
    • 19.5.1 Network design
    • 19.5.2 Clustering

Chapter 20: Advanced Graph Algorithms

  • 20.1 Topological Sorting
    • 20.1.1 DFS-based approach
    • 20.1.2 Kahn’s algorithm (BFS-based)
    • 20.1.3 Applications
  • 20.2 Maximum Flow
    • 20.2.1 Flow networks
    • 20.2.2 Ford-Fulkerson method
    • 20.2.3 Edmonds-Karp algorithm
    • 20.2.4 Dinic’s algorithm
    • 20.2.5 Push-relabel algorithm
    • 20.2.6 Max-flow min-cut theorem
  • 20.3 Bipartite Matching
    • 20.3.1 Maximum bipartite matching
    • 20.3.2 Hungarian algorithm
    • 20.3.3 Hopcroft-Karp algorithm
  • 20.4 Graph Coloring
    • 20.4.1 Chromatic number
    • 20.4.2 Greedy coloring
    • 20.4.3 Backtracking approach
    • 20.4.4 Applications (scheduling, register allocation)
  • 20.5 Traveling Salesman Problem
    • 20.5.1 Problem definition
    • 20.5.2 Brute force
    • 20.5.3 Dynamic programming approach
    • 20.5.4 Approximation algorithms
    • 20.5.5 Branch and bound
  • 20.6 Network Flow Applications
    • 20.6.1 Assignment problem
    • 20.6.2 Circulation problems
    • 20.6.3 Image segmentation

PART VI: SORTING ALGORITHMS

Chapter 21: Elementary Sorting Algorithms

  • 21.1 Bubble Sort
    • 21.1.1 Algorithm
    • 21.1.2 Optimization (early termination)
    • 21.1.3 Analysis
  • 21.2 Selection Sort
    • 21.2.1 Algorithm
    • 21.2.2 Analysis
    • 21.2.3 Stability considerations
  • 21.3 Insertion Sort
    • 21.3.1 Algorithm
    • 21.3.2 Binary insertion sort
    • 21.3.3 Analysis
    • 21.3.4 Best case for nearly sorted data
  • 21.4 Shell Sort
    • 21.4.1 Gap sequences
    • 21.4.2 Algorithm
    • 21.4.3 Analysis

Chapter 22: Efficient Sorting Algorithms

  • 22.1 Merge Sort
    • 22.1.1 Divide and conquer approach
    • 22.1.2 Top-down merge sort
    • 22.1.3 Bottom-up merge sort
    • 22.1.4 Analysis and recurrence
    • 22.1.5 Space complexity
    • 22.1.6 External sorting
  • 22.2 Quick Sort
    • 22.2.1 Partition algorithm
    • 22.2.2 Lomuto vs Hoare partition
    • 22.2.3 Pivot selection strategies
    • 22.2.4 Randomized quick sort
    • 22.2.5 Three-way partitioning
    • 22.2.6 Analysis (best, average, worst case)
    • 22.2.7 Tail recursion optimization
  • 22.3 Heap Sort
    • 22.3.1 Review from heaps chapter
    • 22.3.2 In-place sorting
    • 22.3.3 Analysis
    • 22.3.4 Comparison with other O(n log n) sorts

Chapter 23: Linear Time Sorting

  • 23.1 Counting Sort
    • 23.1.1 Algorithm
    • 23.1.2 Stable version
    • 23.1.3 Time and space complexity
    • 23.1.4 When to use
  • 23.2 Radix Sort
    • 23.2.1 LSD (Least Significant Digit)
    • 23.2.2 MSD (Most Significant Digit)
    • 23.2.3 Analysis
    • 23.2.4 Applications
  • 23.3 Bucket Sort
    • 23.3.1 Algorithm
    • 23.3.2 Uniform distribution assumption
    • 23.3.3 Analysis
    • 23.3.4 Comparison with other sorts

Chapter 24: Advanced Sorting Topics

  • 24.1 Comparison-Based Sorting Lower Bound
    • 24.1.1 Decision tree model
    • 24.1.2 Ω(n log n) proof
  • 24.2 Stability in Sorting
    • 24.2.1 Definition and importance
    • 24.2.2 Making unstable sorts stable
  • 24.3 External Sorting
    • 24.3.1 Multi-way merge
    • 24.3.2 Replacement selection
    • 24.3.3 Polyphase merge
  • 24.4 Parallel Sorting
    • 24.4.1 Bitonic sort
    • 24.4.2 Sample sort
    • 24.4.3 Parallel merge sort
  • 24.5 Specialized Sorting
    • 24.5.1 Timsort
    • 24.5.2 Introsort
    • 24.5.3 Smoothsort
    • 24.5.4 Cycle sort

PART VII: SEARCHING ALGORITHMS

Chapter 25: Searching Fundamentals

  • 25.1 Linear Search
    • 25.1.1 Algorithm
    • 25.1.2 Sentinel linear search
    • 25.1.3 Analysis
  • 25.2 Binary Search
    • 25.2.1 Iterative implementation
    • 25.2.2 Recursive implementation
    • 25.2.3 Analysis
    • 25.2.4 Variants
      • Finding first/last occurrence
      • Lower bound and upper bound
      • Searching in rotated array
  • 25.3 Interpolation Search
    • 25.3.1 Algorithm
    • 25.3.2 When to use
    • 25.3.3 Analysis
  • 25.4 Exponential Search
    • 25.4.1 Unbounded search
    • 25.4.2 Algorithm
    • 25.4.3 Applications

Chapter 26: Advanced Searching

  • 26.1 Ternary Search
    • 26.1.1 Algorithm
    • 26.1.2 Finding maximum/minimum
    • 26.1.3 Unimodal functions
  • 26.2 Jump Search
    • 26.2.1 Algorithm
    • 26.2.2 Optimal jump size
  • 26.3 Fibonacci Search
  • 26.4 Search in Special Data Structures
    • 26.4.1 Search in 2D matrix
    • 26.4.2 Search in rotated sorted array
    • 26.4.3 Search in bitonic array

PART VIII: STRING ALGORITHMS

Chapter 27: String Matching

  • 27.1 Naive String Matching
    • 27.1.1 Algorithm
    • 27.1.2 Analysis
  • 27.2 Rabin-Karp Algorithm
    • 27.2.1 Rolling hash
    • 27.2.2 Spurious hits
    • 27.2.3 Analysis
    • 27.2.4 Multiple pattern matching
  • 27.3 Knuth-Morris-Pratt (KMP) Algorithm
    • 27.3.1 Failure function
    • 27.3.2 Prefix function computation
    • 27.3.3 Pattern matching
    • 27.3.4 Analysis
  • 27.4 Boyer-Moore Algorithm
    • 27.4.1 Bad character rule
    • 27.4.2 Good suffix rule
    • 27.4.3 Analysis
    • 27.4.4 Variants
  • 27.5 Aho-Corasick Algorithm
    • 27.5.1 Multiple pattern matching
    • 27.5.2 Trie + failure links
    • 27.5.3 Applications

Chapter 28: Advanced String Algorithms

  • 28.1 Z-Algorithm
    • 28.1.1 Z-array computation
    • 28.1.2 Applications
  • 28.2 Manacher’s Algorithm
    • 28.2.1 Longest palindromic substring
    • 28.2.2 Linear time solution
  • 28.3 Suffix Array
    • 28.3.1 Construction
    • 28.3.2 LCP array
    • 28.3.3 Applications
    • 28.3.4 Pattern matching using suffix array
  • 28.4 String Compression
    • 28.4.1 Run-length encoding
    • 28.4.2 Huffman coding
    • 28.4.3 LZ77/LZ78
    • 28.4.4 Burrows-Wheeler transform
  • 28.5 Edit Distance
    • 28.5.1 Levenshtein distance
    • 28.5.2 Dynamic programming solution
    • 28.5.3 Variants and applications
  • 28.6 Longest Common Subsequence/Substring
    • 28.6.1 LCS problem
    • 28.6.2 Dynamic programming approach
    • 28.6.3 Applications in diff algorithms

PART IX: ALGORITHM DESIGN PARADIGMS

Chapter 29: Divide and Conquer

  • 29.1 Paradigm Overview
    • 29.1.1 Divide, conquer, combine
    • 29.1.2 Recurrence relations
    • 29.1.3 When to use
  • 29.2 Classic Examples
    • 29.2.1 Binary search
    • 29.2.2 Merge sort and quick sort
    • 29.2.3 Maximum subarray (divide-and-conquer vs Kadane’s algorithm)
    • 29.2.4 Closest pair of points
    • 29.2.5 Strassen’s matrix multiplication
  • 29.3 Advanced Applications
    • 29.3.1 Fast Fourier Transform (FFT)
    • 29.3.2 Karatsuba multiplication
    • 29.3.3 Median of medians

Chapter 30: Greedy Algorithms

  • 30.1 Greedy Paradigm
    • 30.1.1 Greedy choice property
    • 30.1.2 Optimal substructure
    • 30.1.3 When greedy works
    • 30.1.4 Proving correctness
  • 30.2 Classic Greedy Algorithms
    • 30.2.1 Activity selection
    • 30.2.2 Fractional knapsack
    • 30.2.3 Huffman coding
    • 30.2.4 Minimum spanning tree (Kruskal, Prim)
    • 30.2.5 Dijkstra’s shortest path
  • 30.3 Scheduling Problems
    • 30.3.1 Job scheduling with deadlines
    • 30.3.2 Interval scheduling
    • 30.3.3 Minimizing lateness
  • 30.4 Other Greedy Problems
    • 30.4.1 Coin change (canonical coin systems)
    • 30.4.2 Gas station problem
    • 30.4.3 Jump game
    • 30.4.4 Candy distribution

Chapter 31: Dynamic Programming

  • 31.1 Dynamic Programming Fundamentals
    • 31.1.1 Optimal substructure
    • 31.1.2 Overlapping subproblems
    • 31.1.3 Memoization (top-down)
    • 31.1.4 Tabulation (bottom-up)
    • 31.1.5 Space optimization techniques
  • 31.2 Classic DP Problems
    • 31.2.1 Fibonacci numbers
    • 31.2.2 Climbing stairs
    • 31.2.3 Longest increasing subsequence
    • 31.2.4 Longest common subsequence
    • 31.2.5 Edit distance
    • 31.2.6 0/1 Knapsack
    • 31.2.7 Unbounded knapsack
    • 31.2.8 Coin change
    • 31.2.9 Subset sum
    • 31.2.10 Partition problem
  • 31.3 String DP
    • 31.3.1 Palindrome problems
    • 31.3.2 Wildcard/regex matching
    • 31.3.3 String interleaving
  • 31.4 Grid/Matrix DP
    • 31.4.1 Unique paths
    • 31.4.2 Minimum path sum
    • 31.4.3 Maximal square
    • 31.4.4 Dungeon game
  • 31.5 Interval DP
    • 31.5.1 Matrix chain multiplication
    • 31.5.2 Burst balloons
    • 31.5.3 Palindrome partitioning
  • 31.6 Tree DP
    • 31.6.1 Diameter of tree
    • 31.6.2 House robber III
    • 31.6.3 Binary tree cameras
  • 31.7 Digit DP
    • 31.7.1 Counting numbers with property
    • 31.7.2 Sum of digits
  • 31.8 State Machine DP
    • 31.8.1 Stock buying/selling problems
    • 31.8.2 State transitions
  • 31.9 Advanced DP
    • 31.9.1 Bitmask DP
    • 31.9.2 DP on trees
    • 31.9.3 DP with data structures
    • 31.9.4 Convex hull optimization
    • 31.9.5 Divide and conquer optimization

Chapter 32: Backtracking

  • 32.1 Backtracking Paradigm
    • 32.1.1 Solution space tree
    • 32.1.2 Pruning
    • 32.1.3 When to use backtracking
  • 32.2 Classic Backtracking Problems
    • 32.2.1 N-Queens problem
    • 32.2.2 Sudoku solver
    • 32.2.3 Knight’s tour
    • 32.2.4 Rat in a maze
    • 32.2.5 Hamiltonian cycle
  • 32.3 Combinatorial Generation
    • 32.3.1 Generating permutations
    • 32.3.2 Generating combinations
    • 32.3.3 Generating subsets
    • 32.3.4 Generating parentheses
  • 32.4 Constraint Satisfaction
    • 32.4.1 Graph coloring
    • 32.4.2 Cryptarithmetic puzzles
    • 32.4.3 Word search problems
  • 32.5 Optimization with Backtracking
    • 32.5.1 Branch and bound
    • 32.5.2 Pruning strategies

Chapter 33: Branch and Bound

  • 33.1 Paradigm Overview
    • 33.1.1 Bounding functions
    • 33.1.2 Pruning strategy
    • 33.1.3 Live and dead nodes
  • 33.2 Applications
    • 33.2.1 Traveling salesman problem
    • 33.2.2 0/1 Knapsack
    • 33.2.3 Job assignment problem
  • 33.3 Search Strategies
    • 33.3.1 FIFO branch and bound
    • 33.3.2 LIFO branch and bound
    • 33.3.3 Least cost branch and bound

PART X: ADVANCED TOPICS

Chapter 34: Computational Geometry

  • 34.1 Basic Geometric Primitives
    • 34.1.1 Points, lines, segments
    • 34.1.2 Cross product and orientation
    • 34.1.3 Line intersection
  • 34.2 Convex Hull
    • 34.2.1 Graham scan
    • 34.2.2 Jarvis march
    • 34.2.3 QuickHull
    • 34.2.4 Chan’s algorithm
  • 34.3 Line Sweep Algorithms
    • 34.3.1 Intersection of line segments
    • 34.3.2 Closest pair of points
    • 34.3.3 Voronoi diagram
  • 34.4 Polygon Algorithms
    • 34.4.1 Point in polygon
    • 34.4.2 Polygon area
    • 34.4.3 Triangulation

Chapter 35: Randomized Algorithms

  • 35.1 Probability Review
    • 35.1.1 Random variables
    • 35.1.2 Expected value
    • 35.1.3 Probability bounds
  • 35.2 Las Vegas vs Monte Carlo
    • 35.2.1 Definitions
    • 35.2.2 Tradeoffs
  • 35.3 Randomized Algorithms
    • 35.3.1 Randomized quick sort
    • 35.3.2 Randomized selection
    • 35.3.3 Skip lists
    • 35.3.4 Treaps
    • 35.3.5 Hash functions
  • 35.4 Probabilistic Analysis
    • 35.4.1 Birthday paradox
    • 35.4.2 Balls and bins
    • 35.4.3 Coupon collector

Chapter 36: Number Theory Algorithms

  • 36.1 Basic Number Theory
    • 36.1.1 GCD and LCM
    • 36.1.2 Euclidean algorithm
    • 36.1.3 Extended Euclidean algorithm
    • 36.1.4 Modular arithmetic
  • 36.2 Prime Numbers
    • 36.2.1 Sieve of Eratosthenes
    • 36.2.2 Segmented sieve
    • 36.2.3 Primality testing
    • 36.2.4 Miller-Rabin test
    • 36.2.5 Prime factorization
  • 36.3 Modular Exponentiation
    • 36.3.1 Fast exponentiation
    • 36.3.2 Modular inverse
    • 36.3.3 Fermat’s little theorem
    • 36.3.4 Chinese remainder theorem
  • 36.4 Applications
    • 36.4.1 RSA cryptography basics
    • 36.4.2 Discrete logarithm

Chapter 37: Approximation Algorithms

  • 37.1 Introduction to Approximation
    • 37.1.1 NP-hard problems
    • 37.1.2 Approximation ratio
    • 37.1.3 PTAS, FPTAS
  • 37.2 Classic Approximation Algorithms
    • 37.2.1 Vertex cover
    • 37.2.2 Set cover
    • 37.2.3 TSP (metric)
    • 37.2.4 Bin packing
    • 37.2.5 Knapsack FPTAS
  • 37.3 Greedy Approximations
  • 37.4 Linear Programming Relaxation

Chapter 38: Online Algorithms

  • 38.1 Competitive Analysis
    • 38.1.1 Competitive ratio
    • 38.1.2 Online vs offline
  • 38.2 Classic Online Problems
    • 38.2.1 Paging and caching (LRU, FIFO, LFU)
    • 38.2.2 Ski rental problem
    • 38.2.3 Online bipartite matching
    • 38.2.4 K-server problem

Chapter 39: Parallel and Distributed Algorithms

  • 39.1 Models of Parallel Computation
    • 39.1.1 PRAM model
    • 39.1.2 Work and span
    • 39.1.3 Amdahl’s law
  • 39.2 Parallel Algorithms
    • 39.2.1 Parallel prefix sum
    • 39.2.2 Parallel sorting networks
    • 39.2.3 Parallel matrix operations
    • 39.2.4 MapReduce paradigm
  • 39.3 Distributed Algorithms
    • 39.3.1 Leader election
    • 39.3.2 Consensus algorithms
    • 39.3.3 Distributed spanning tree

Chapter 40: Advanced Data Structures

  • 40.1 Persistent Data Structures
    • 40.1.1 Path copying
    • 40.1.2 Persistent arrays
    • 40.1.3 Persistent trees
  • 40.2 Self-Adjusting Data Structures
    • 40.2.1 Splay trees (review)
    • 40.2.2 Self-organizing lists
  • 40.3 Succinct Data Structures
    • 40.3.1 Bit vectors
    • 40.3.2 Rank and select
    • 40.3.3 Wavelet trees
  • 40.4 Cache-Oblivious Algorithms
    • 40.4.1 Model
    • 40.4.2 Van Emde Boas layout
    • 40.4.3 Cache-oblivious sorting
  • 40.5 Streaming Algorithms
    • 40.5.1 Count-distinct (Flajolet-Martin)
    • 40.5.2 Frequency moments
    • 40.5.3 Reservoir sampling

Chapter 41: Complexity Theory Basics

  • 41.1 Time Complexity Classes
    • 41.1.1 P, NP, NP-Complete, NP-Hard
    • 41.1.2 Polynomial time reductions
    • 41.1.3 Cook-Levin theorem
  • 41.2 Classic NP-Complete Problems
    • 41.2.1 SAT and 3-SAT
    • 41.2.2 Clique, independent set, vertex cover
    • 41.2.3 Hamiltonian cycle
    • 41.2.4 Subset sum, knapsack
    • 41.2.5 Partition problem
  • 41.3 Dealing with NP-Hard Problems
    • 41.3.1 Exact algorithms (backtracking, branch & bound)
    • 41.3.2 Approximation algorithms
    • 41.3.3 Heuristics and metaheuristics
    • 41.3.4 Parameterized algorithms

PART XI: SPECIAL TOPICS AND APPLICATIONS

Chapter 42: Machine Learning Algorithms (Foundational)

  • 42.1 Basic Concepts
    • 42.1.1 Training vs testing
    • 42.1.2 Overfitting and underfitting
  • 42.2 Classic Algorithms
    • 42.2.1 K-nearest neighbors (data structure perspective)
    • 42.2.2 Decision trees
    • 42.2.3 Clustering (K-means, hierarchical)
  • 42.3 Dimensionality Reduction
    • 42.3.1 PCA basics
    • 42.3.2 Random projection

Chapter 43: Cryptographic Algorithms

  • 43.1 Symmetric Encryption
    • 43.1.1 AES overview
    • 43.1.2 Stream vs block ciphers
  • 43.2 Asymmetric Encryption
    • 43.2.1 RSA (algorithmic perspective)
    • 43.2.2 Diffie-Hellman key exchange
  • 43.3 Hash Functions
    • 43.3.1 Cryptographic properties
    • 43.3.2 SHA family
    • 43.3.3 Applications (digital signatures, MACs)

Chapter 44: Optimization Algorithms

  • 44.1 Linear Programming
    • 44.1.1 Simplex algorithm
    • 44.1.2 Interior point methods
  • 44.2 Metaheuristics
    • 44.2.1 Simulated annealing
    • 44.2.2 Genetic algorithms
    • 44.2.3 Particle swarm optimization
    • 44.2.4 Ant colony optimization
  • 44.3 Gradient Descent Variants
    • 44.3.1 Stochastic gradient descent
    • 44.3.2 Mini-batch gradient descent
    • 44.3.3 Momentum-based methods

Chapter 45: Specialized Application Areas

  • 45.1 Bioinformatics Algorithms
    • 45.1.1 Sequence alignment (Needleman-Wunsch, Smith-Waterman)
    • 45.1.2 Pattern matching in DNA
    • 45.1.3 Phylogenetic trees
  • 45.2 Computer Graphics Algorithms
    • 45.2.1 Line drawing (Bresenham)
    • 45.2.2 Polygon filling
    • 45.2.3 Clipping algorithms
  • 45.3 Game Theory Algorithms
    • 45.3.1 Minimax algorithm
    • 45.3.2 Alpha-beta pruning
    • 45.3.3 Monte Carlo tree search
  • 45.4 Compiler Algorithms
    • 45.4.1 Lexical analysis
    • 45.4.2 Parsing algorithms
    • 45.4.3 Register allocation

PART XII: PRACTICAL CONSIDERATIONS

Chapter 46: Implementation Techniques

  • 46.1 Coding Best Practices
    • 46.1.1 Clean code principles
    • 46.1.2 Modularity
    • 46.1.3 Error handling
  • 46.2 Testing Algorithms
    • 46.2.1 Unit testing
    • 46.2.2 Edge cases
    • 46.2.3 Stress testing
    • 46.2.4 Correctness verification
  • 46.3 Debugging Techniques
    • 46.3.1 Print debugging
    • 46.3.2 Debugger usage
    • 46.3.3 Invariant checking
  • 46.4 Performance Optimization
    • 46.4.1 Profiling
    • 46.4.2 Bottleneck identification
    • 46.4.3 Micro-optimizations
    • 46.4.4 Memory optimization

Chapter 47: Interview and Competitive Programming

  • 47.1 Problem-Solving Strategies
    • 47.1.1 Understanding the problem
    • 47.1.2 Pattern recognition
    • 47.1.3 Breaking down complex problems
  • 47.2 Common Interview Patterns
    • 47.2.1 Two pointers
    • 47.2.2 Sliding window
    • 47.2.3 Fast and slow pointers
    • 47.2.4 Merge intervals
    • 47.2.5 Cyclic sort
    • 47.2.6 Top K elements
    • 47.2.7 Modified binary search
    • 47.2.8 Monotonic stack/queue
  • 47.3 Time and Space Tradeoffs
  • 47.4 Communication and Explanation
  • 47.5 Competitive Programming Tips
    • 47.5.1 Template code
    • 47.5.2 Time management
    • 47.5.3 Common pitfalls

Chapter 48: Real-World Systems

  • 48.1 Database Indexing
    • 48.1.1 B-trees in databases
    • 48.1.2 Hash indexes
    • 48.1.3 Full-text search
  • 48.2 Caching Systems
    • 48.2.1 LRU cache implementation
    • 48.2.2 Distributed caching
    • 48.2.3 Cache invalidation
  • 48.3 Load Balancing
    • 48.3.1 Round robin
    • 48.3.2 Consistent hashing
    • 48.3.3 Least connections
  • 48.4 Rate Limiting
    • 48.4.1 Token bucket
    • 48.4.2 Leaky bucket
    • 48.4.3 Fixed window
    • 48.4.4 Sliding window

APPENDICES

Appendix A: Mathematical Notation and Proofs

  • A.1 Set theory notation
  • A.2 Summation notation
  • A.3 Proof techniques summary
  • A.4 Common recurrence solutions

Appendix B: Complexity Cheat Sheet

  • B.1 Common time complexities
  • B.2 Data structure operation complexities
  • B.3 Sorting algorithm comparison
  • B.4 Graph algorithm complexities

Appendix C: Implementation Language Considerations

  • C.1 Language-specific features
  • C.2 Standard library data structures
  • C.3 Memory management considerations

Appendix D: Further Reading and Resources

  • D.1 Classic textbooks
  • D.2 Online resources
  • D.3 Practice platforms
  • D.4 Research papers

Appendix E: Problem Set

  • E.1 Beginner problems
  • E.2 Intermediate problems
  • E.3 Advanced problems
  • E.4 Solutions and hints