- Two Pointers:
Use when working with sorted arrays or for problems like pair-sum and partitioning (e.g., "left and right pointers meet in the middle").
- Sliding Window:
Ideal for subarray problems: expand or contract the window until the required sum or length condition is met.
- Prefix Sums:
Precompute cumulative sums to answer range-sum queries in O(1) time.
- In-place Reversal:
Swap characters or elements from both ends to reverse arrays/strings efficiently.
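The two-pointer pattern above can be sketched in a few lines of Python; a minimal pair-sum example on a sorted array (the function name is illustrative):

```python
def pair_sum(sorted_nums, target):
    """Two-pointer pair sum on a sorted array: O(n) time, O(1) space."""
    left, right = 0, len(sorted_nums) - 1
    while left < right:
        s = sorted_nums[left] + sorted_nums[right]
        if s == target:
            return (left, right)
        if s < target:
            left += 1   # sum too small: advance the left pointer
        else:
            right -= 1  # sum too large: retreat the right pointer
    return None
```

Because the array is sorted, moving exactly one pointer per step never skips a valid pair.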
- Dummy Node Usage:
Simplify edge cases (especially head insertions or deletions) by using a dummy starter node.
- Fast & Slow Pointers:
Detect cycles, find the middle element, or locate the nth node from the end.
- Reversal Techniques:
Iteratively rewire pointers (or use recursion) to reverse a linked list.
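The iterative pointer-rewiring mentioned above can be sketched as follows (the `Node` class here is a minimal stand-in, not a library type):

```python
class Node:
    def __init__(self, val, nxt=None):
        self.val = val
        self.next = nxt

def reverse(head):
    """Iteratively rewire .next pointers; O(n) time, O(1) space."""
    prev = None
    while head:
        # Tuple assignment evaluates the right side first,
        # so all three pointers move in one step.
        head.next, prev, head = prev, head, head.next
    return prev
```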
- Stack (LIFO):
Great for DFS, backtracking, and expression evaluation; remember "last in, first out."
- Queue (FIFO):
Use for level-order traversals in trees/graphs and for BFS.
- Monotonic Stack:
Useful for solving next greater/smaller element problems by maintaining order.
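A minimal sketch of the monotonic-stack idea, applied to the classic next-greater-element problem (names are illustrative):

```python
def next_greater(nums):
    """For each element, the next greater value to its right, else -1. O(n)."""
    result = [-1] * len(nums)
    stack = []  # indices whose next-greater element is not yet known
    for i, x in enumerate(nums):
        # Pop every index whose value is smaller than x: x is its answer.
        while stack and nums[stack[-1]] < x:
            result[stack.pop()] = x
        stack.append(i)
    return result
```

Each index is pushed and popped at most once, which is why the whole pass is linear despite the nested loop.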
- Depth-First Search (DFS):
Recursively (or with an explicit stack) dive deep into branches; useful for tree traversals and backtracking.
- Breadth-First Search (BFS):
Use a queue to explore level by level; ideal for shortest paths in unweighted graphs.
- Binary Search Tree (BST):
Inorder traversal yields sorted order; exploit BST properties for efficient searches.
- Heaps/Priority Queues:
Maintain order (min/max) with O(log n) insertions/deletions for scheduling and selection problems.
- Union-Find (Disjoint Set):
Quickly merge and find connected components; key for cycle detection and connectivity queries in graphs.
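As a quick illustration of the BFS pattern above for unweighted shortest paths (the adjacency-list dict format and function name are just one convention):

```python
from collections import deque

def bfs_dist(graph, start):
    """Shortest hop count from start to every reachable node."""
    dist = {start: 0}
    q = deque([start])
    while q:
        node = q.popleft()
        for nb in graph[node]:
            if nb not in dist:           # first visit is the shortest path
                dist[nb] = dist[node] + 1
                q.append(nb)
    return dist
```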
- Base & Recursive Cases:
Define a clear stopping condition and ensure each recursive call moves toward it.
- Backtracking Template:
Choose → Explore → Unchoose; systematically try possibilities and revert changes as needed.
- Memoization:
Cache results to avoid redundant computations in overlapping subproblems (top-down DP).
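Memoization in Python can be as small as a decorator; a sketch using the standard-library `functools.lru_cache` on Fibonacci:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Top-down DP: each subproblem is computed once, O(n) total."""
    if n < 2:               # base case stops the recursion
        return n
    return fib(n - 1) + fib(n - 2)
```

Without the cache this recursion is exponential; with it, each `fib(k)` is evaluated exactly once.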
- Quicksort:
Divide and conquer via pivot partitioning; average O(n log n), but beware the worst-case O(n²) without randomized pivots.
- Mergesort:
Stable divide-and-conquer approach with predictable O(n log n) time at the cost of extra space.
- Heapsort:
Build a heap and repeatedly extract the extremum for O(n log n) sorting with O(1) extra space.
- Counting/Radix Sort:
Linear-time sorting when elements have a fixed range or digit-based structure.
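A minimal counting-sort sketch for the fixed-range case described above (assumes non-negative integers no larger than `max_val`):

```python
def counting_sort(nums, max_val):
    """O(n + k) sort for integers in [0, max_val], where k = max_val + 1."""
    counts = [0] * (max_val + 1)
    for x in nums:
        counts[x] += 1          # tally each value
    out = []
    for v, c in enumerate(counts):
        out.extend([v] * c)     # emit each value in order, c times
    return out
```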
- Binary Search:
Exploit sorted order to halve the search space at each step for O(log n) lookups.
- Graph Search (DFS/BFS):
Pick DFS for deep exploration and BFS for the shortest path in unweighted graphs.
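The binary-search halving described above, sketched as a standard iterative loop:

```python
def binary_search(nums, target):
    """Return the index of target in a sorted list, or -1. O(log n)."""
    lo, hi = 0, len(nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if nums[mid] == target:
            return mid
        if nums[mid] < target:
            lo = mid + 1    # target is in the right half
        else:
            hi = mid - 1    # target is in the left half
    return -1
```

The loop invariant (if the target exists, its index lies in `[lo, hi]`) is what makes the off-by-one details easy to verify.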
- Optimal Substructure:
Break the problem into smaller subproblems whose solutions combine into the overall solution.
- Overlapping Subproblems:
Recognize repeated calculations and cache them using memoization or tabulation.
- Bottom-Up vs. Top-Down:
Choose iterative tabulation for space/time predictability or recursion + memoization for clearer subproblem structure.
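A bottom-up tabulation sketch, using the classic coin-change problem as an example (function and variable names are illustrative):

```python
def min_coins(coins, amount):
    """dp[a] = fewest coins summing to a; returns -1 if unreachable."""
    INF = float('inf')
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1   # extend a smaller solution by one coin
    return dp[amount] if dp[amount] != INF else -1
```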
- Local Optimality:
At each step, pick the best immediate option and verify that local choices lead to a global optimum (i.e., the problem exhibits the greedy-choice property).
- Problem Structure:
Ensure the problem has a “greedy structure” (e.g., matroids) before applying a greedy solution.
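A standard example of the greedy-choice property is interval scheduling: always keep the interval that finishes earliest. A minimal sketch:

```python
def max_non_overlapping(intervals):
    """Greedy activity selection: count of mutually compatible intervals."""
    count, last_end = 0, float('-inf')
    for start, end in sorted(intervals, key=lambda iv: iv[1]):  # sort by end
        if start >= last_end:   # compatible with everything kept so far
            count += 1
            last_end = end
    return count
```

The greedy choice is safe here because the earliest-finishing interval leaves the most room for the rest.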
- XOR Tricks:
Use XOR to swap values, find the unique element, or detect a duplicate efficiently.
- Bit Masks:
Represent subsets or state (e.g., in DP) compactly using binary representations.
- Counting Bits:
Shift and mask operations can quickly count set bits or perform bit-level operations.
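Two of the tricks above in miniature: XOR cancellation for the unique element, and Brian Kernighan's trick for counting set bits:

```python
def single_number(nums):
    """Every value appears twice except one; x ^ x == 0 cancels the pairs."""
    acc = 0
    for x in nums:
        acc ^= x
    return acc

def count_bits(x):
    """x & (x - 1) clears the lowest set bit, so we loop once per set bit."""
    count = 0
    while x:
        x &= x - 1
        count += 1
    return count
```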
- Hash Maps & Sets:
For frequency counts, caching, and constant-time lookups; always consider them for "find" or "duplicate" problems.
- Interval Merging:
Sort intervals by start (or end), then iterate and merge overlapping segments.
- Matrix Traversal:
Use direction vectors (up, down, left, right) while being mindful of boundaries to avoid out-of-bounds errors.
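The interval-merging recipe above (sort by start, then extend or start a new segment) fits in a few lines:

```python
def merge_intervals(intervals):
    """Merge overlapping [start, end] intervals; returns sorted segments."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:       # overlaps the last segment
            merged[-1][1] = max(merged[-1][1], end)  # extend it
        else:
            merged.append([start, end])              # start a new segment
    return merged
```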
- Always check edge cases:
Empty inputs, single-element cases, and null pointers are common pitfalls.
- Time & Space Complexity:
Be ready to discuss the efficiency trade-offs of your approach.
- Test Your Logic:
Walk through a small example to ensure your algorithm handles all conditions.
Keep this cheat sheet handy while you practice problems, and let these one-liner reminders guide you during your Meta interview prep. Good luck!