Algorithm Patterns
Two Pointer Patterns
Converging Two Pointers
Uses two pointers starting from opposite ends of an array or list, moving towards each other. Achieves O(n) time complexity with O(1) space, making it ideal for in-place array manipulation and searching in sorted sequences.
- Pointers move from opposite ends
- O(n) time complexity
- O(1) space complexity
- Works best on sorted arrays
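A minimal sketch of the converging-pointer idea, applied to the classic pair-sum problem on a sorted array (the function name `pair_with_sum` is illustrative):

```python
def pair_with_sum(nums, target):
    """Return indices of two values in sorted nums summing to target, else None."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return lo, hi
        if s < target:
            lo += 1   # need a larger sum: advance the left pointer
        else:
            hi -= 1   # need a smaller sum: retreat the right pointer
    return None
```

Each step eliminates one candidate, so the scan is O(n) with no extra storage.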
Sliding Window
Maintains a window of elements that slides through the array, either fixed or variable size. Efficiently solves substring and subarray problems by avoiding redundant computations, achieving O(n) time complexity.
- Fixed or variable window size
- O(n) time complexity
- Avoids redundant computations
- Ideal for substring problems
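As a sketch, a variable-size window solving the standard "longest substring without repeating characters" problem:

```python
def longest_unique_substring(s):
    """Length of the longest substring of s with all distinct characters."""
    last_seen = {}        # char -> most recent index
    start = best = 0
    for i, ch in enumerate(s):
        if ch in last_seen and last_seen[ch] >= start:
            start = last_seen[ch] + 1   # shrink window to just past the duplicate
        last_seen[ch] = i
        best = max(best, i - start + 1)
    return best
```

The window never moves backwards, so each character is processed a constant number of times: O(n) overall.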
Fast & Slow Pointers
Uses two pointers moving at different speeds through a data structure. The fast pointer typically moves twice as fast as the slow pointer, enabling cycle detection and finding middle elements with O(n) time and O(1) space.
- Different pointer speeds
- O(n) time complexity
- O(1) space complexity
- Cycle detection in linked lists
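Both uses can be sketched with the same two-speed loop (Floyd's tortoise-and-hare); the `Node` class here is a minimal stand-in for a singly linked list:

```python
class Node:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def has_cycle(head):
    """True if the list contains a cycle: the fast pointer laps the slow one."""
    slow = fast = head
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
        if slow is fast:
            return True
    return False

def middle(head):
    """When fast reaches the end, slow sits at the middle node."""
    slow = fast = head
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
    return slow
```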
Tree Traversal & Search Patterns
Depth-First Search (DFS)
Explores as far as possible along each branch before backtracking. Supports pre-order, in-order, and post-order traversals. O(n) time complexity with O(h) space for the recursion stack, where h is tree height.
- Pre/in/post-order traversals
- O(n) time complexity
- O(h) space for call stack
- Recursive or iterative with stack
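A recursive in-order DFS as a minimal sketch (swap the `append` position for pre-order or post-order):

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def inorder(node, out=None):
    """In-order DFS: visit left subtree, then node, then right subtree."""
    if out is None:
        out = []
    if node:
        inorder(node.left, out)
        out.append(node.val)
        inorder(node.right, out)
    return out
```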
Breadth-First Search (BFS)
Explores nodes level by level using a queue. Guarantees shortest path in unweighted graphs. O(n) time complexity with O(w) space, where w is the maximum width of the tree or graph.
- Level-order traversal
- O(n) time complexity
- O(w) space for queue
- Shortest path in unweighted graphs
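A level-order traversal sketch using a FIFO queue; draining the queue one level per pass is what groups nodes by depth:

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def level_order(root):
    """Return node values grouped by depth."""
    if not root:
        return []
    levels, queue = [], deque([root])
    while queue:
        level = []
        for _ in range(len(queue)):   # process exactly one level per pass
            node = queue.popleft()
            level.append(node.val)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        levels.append(level)
    return levels
```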
Binary Search
Divide-and-conquer approach that repeatedly halves the search space. Works on sorted data or search spaces with monotonic properties. O(log n) time complexity with O(1) space for the iterative implementation.
- Divide and conquer strategy
- O(log n) time complexity
- O(1) space iteratively
- Requires sorted or monotonic data
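The standard iterative form, as a sketch:

```python
def binary_search(nums, target):
    """Index of target in sorted nums, or -1 if absent."""
    lo, hi = 0, len(nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if nums[mid] == target:
            return mid
        if nums[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1
```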
Dynamic Programming Patterns
Top-Down (Memoization)
Recursive approach that stores computed results to avoid redundant calculations. Easier to conceptualize and implement, since it follows the natural problem decomposition. The trade-off is additional space for the recursion stack and memoization cache.
- Recursive with memoization
- Easier to conceptualize
- Natural problem decomposition
- Cache stores computed results
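A minimal sketch using the classic climbing-stairs recurrence, with Python's built-in `functools.lru_cache` as the memoization cache:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def climb_stairs(n):
    """Ways to climb n steps taking 1 or 2 at a time; results are cached."""
    if n <= 2:
        return n
    return climb_stairs(n - 1) + climb_stairs(n - 2)
```

Without the cache this recursion is exponential; with it, each subproblem is solved once, giving O(n) time.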
Bottom-Up (Tabulation)
Iterative approach that builds solutions from the base cases upward. Generally more space-efficient and faster than top-down due to no recursion overhead. Requires careful consideration of iteration order and dependencies.
- Iterative with table
- Better space efficiency
- No recursion overhead
- Builds from base cases up
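The same climbing-stairs recurrence built iteratively; since each value depends only on the previous two, the full table collapses to two variables (O(1) space):

```python
def climb_stairs(n):
    """Ways to climb n steps, built from the base cases upward."""
    if n <= 2:
        return n
    prev, curr = 1, 2          # ways to climb 1 step and 2 steps
    for _ in range(3, n + 1):
        prev, curr = curr, prev + curr
    return curr
```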
State Machine DP
Models problems as state transitions where each state depends on previous states and specific conditions. Excellent for problems with multiple states or phases, like stock trading with buy/sell/cooldown states.
- Models state transitions
- Multiple states per position
- Condition-based transitions
- Common in string/sequence problems
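The stock-trading example mentioned above can be sketched with three states per day, each updated from the previous day's states:

```python
def max_profit_with_cooldown(prices):
    """Max profit with unlimited trades and a one-day cooldown after selling.

    Three states per day:
      hold -- currently own the stock
      sold -- sold the stock today (must cool down tomorrow)
      rest -- idle / cooling down
    """
    if not prices:
        return 0
    hold, sold, rest = -prices[0], 0, 0
    for p in prices[1:]:
        hold, sold, rest = max(hold, rest - p), hold + p, max(rest, sold)
    return max(sold, rest)
```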
Greedy Patterns
Interval Scheduling
Solves problems involving intervals by sorting and making greedy choices. Typically sorts intervals by start or end time, then processes them sequentially to find optimal non-overlapping selections or merges.
- Sort intervals by start/end
- Greedy sequential processing
- Optimal local choices
- Merge or select intervals
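A sketch of the merge variant: sort by start time, then either extend the last merged interval or start a new one:

```python
def merge_intervals(intervals):
    """Merge overlapping [start, end] intervals after sorting by start."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)   # overlap: extend
        else:
            merged.append([start, end])
    return merged
```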
Greedy Choice
Makes the locally optimal choice at each step, hoping to find the global optimum. Works when the problem has the greedy-choice property and optimal substructure. Simpler than DP but only works for specific problem types.
- Local optimal choices
- Simpler than dynamic programming
- Requires greedy-choice property
- Fast execution
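A sketch of both the power and the limitation: greedy coin counting is optimal for canonical coin systems (like US denominations, assumed here) but can fail for arbitrary ones:

```python
def min_coins(amount, coins=(25, 10, 5, 1)):
    """Greedy coin count; optimal only for canonical coin systems."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c   # take as many of the largest coin as fit
        amount %= c
    return count if amount == 0 else -1
```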
Greedy Graph Algorithms
Graph algorithms that build solutions by always choosing the next best edge or vertex. Includes shortest-path (Dijkstra) and minimum-spanning-tree (Prim, Kruskal) algorithms. Efficient for finding optimal solutions in weighted graphs.
- Edge/vertex selection
- Builds optimal tree/path
- Priority queue based
- Weighted graph problems
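As a sketch, Dijkstra's algorithm with a priority queue: at each step, greedily settle the unvisited vertex with the smallest tentative distance (assumes non-negative edge weights and an adjacency-list dict):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source; graph: {node: [(neighbor, weight), ...]}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                       # stale queue entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```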
Divide & Conquer Patterns
Halving
Splits the problem into two roughly equal halves, solves each recursively, then combines the results. Creates O(log n) levels of recursion. Foundation for many efficient algorithms including merge sort and binary search.
- Split problem in half
- O(log n) recursion levels
- Combine sub-solutions
- Efficient for sorted operations
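Merge sort is the canonical instance: split in half, sort each half recursively, then combine; O(log n) levels of O(n) merging gives O(n log n) overall:

```python
def merge_sort(nums):
    """Sort by splitting in half, sorting each half, then merging."""
    if len(nums) <= 1:
        return nums
    mid = len(nums) // 2
    left = merge_sort(nums[:mid])
    right = merge_sort(nums[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # combine step
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```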
Partitioning
Divides the problem into multiple parts (not just two equal halves). Quick sort and quickselect are prime examples, partitioning around a pivot. Provides flexibility in how to split the problem space.
- Split into multiple parts
- Partition-based approach
- Flexible division strategy
- Average case efficiency
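Quickselect as a sketch: partition around a pivot, then recurse into only the part that can contain the answer, giving O(n) average time:

```python
def quickselect(nums, k):
    """k-th smallest element (0-indexed) by partitioning around a pivot."""
    pivot = nums[len(nums) // 2]
    lows = [x for x in nums if x < pivot]
    pivots = [x for x in nums if x == pivot]
    highs = [x for x in nums if x > pivot]
    if k < len(lows):
        return quickselect(lows, k)           # answer is below the pivot
    if k < len(lows) + len(pivots):
        return pivot                          # answer is the pivot itself
    return quickselect(highs, k - len(lows) - len(pivots))
```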
Combining Solutions
Focuses on merging solutions from the divided subproblems. This is the critical step in divide-and-conquer where partial solutions are combined into a complete solution, and it is often the most complex part of the algorithm.
- Merge subproblem solutions
- Complex combination logic
- Efficient merging strategies
- Maintains overall complexity
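The combine step can be sketched in isolation as merging two already-sorted lists; doing this in linear time is what keeps merge sort at O(n log n) overall:

```python
def merge_sorted(a, b):
    """Combine two sorted lists into one sorted list in O(len(a) + len(b))."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])   # at most one of these tails is non-empty
    out.extend(b[j:])
    return out
```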
