TL;DR: Prefer clarity first, then optimize. Know your constraints, pick the right structure, and control complexity: aim for O(n) over O(n²), O(log n) over O(n), and constant‑factor wins only after profiling.
Explore my DSA solutions on GitHub
Browse solved LeetCode problems, approaches, and notes. Contributions and feedback are welcome.
📦 Arrays
Contiguous memory with O(1) indexing; the workhorse for iteration patterns.
- Use for: prefix sums, sliding windows, two pointers, difference arrays
- Complexity anchors: random access O(1), insert/delete middle O(n)
- Gotchas: off‑by‑one, in‑place mutation, boundary conditions
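A minimal sketch of the prefix-sum pattern above (function names are illustrative, not from any library): O(n) preprocessing buys O(1) range-sum queries.

```python
def build_prefix(nums):
    # prefix[i] holds the sum of nums[0..i-1]; prefix[0] is 0
    prefix = [0]
    for x in nums:
        prefix.append(prefix[-1] + x)
    return prefix

def range_sum(prefix, l, r):
    # Sum of nums[l..r] inclusive, in O(1) per query
    return prefix[r + 1] - prefix[l]
```

The off-by-one gotcha lives exactly here: the extra leading 0 lets `range_sum(prefix, 0, r)` work without a special case.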
🔠 Strings
Arrays of characters with domain‑specific tricks for search and comparison.
- Core: palindrome checks, anagram maps, frequency tables, rolling hash
- Patterns: substring search (KMP/Rabin‑Karp), sliding window for distinct/at‑most‑k
- Tip: prefer O(n) single‑pass windows before heavier DP
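The "at-most-k distinct" window mentioned above, sketched as a single O(n) pass (the function name is mine, not a standard API):

```python
from collections import Counter

def longest_at_most_k_distinct(s, k):
    # Longest substring of s with at most k distinct characters
    counts = Counter()
    left = best = 0
    for right, ch in enumerate(s):
        counts[ch] += 1
        while len(counts) > k:       # shrink from the left until valid again
            counts[s[left]] -= 1
            if counts[s[left]] == 0:
                del counts[s[left]]
            left += 1
        best = max(best, right - left + 1)
    return best
```

Each index enters and leaves the window at most once, which is what makes the pass O(n) despite the inner loop.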
🔁 Recursion & Recurrence
Decompose problems and formalize state transitions; the basis for divide‑and‑conquer and DP.
- Write the recurrence first; define base cases; prove termination
- Top‑down vs bottom‑up: memoization for clarity, tabulation for speed
- Watch: stack depth (tail recursion or iterative conversion when needed)
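The top-down vs bottom-up split above, sketched on the classic stair-climbing recurrence ways(n) = ways(n-1) + ways(n-2):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def climb(n):
    # Top-down: write the recurrence directly, memoize for O(n) total work
    if n <= 1:          # base cases guarantee termination
        return 1
    return climb(n - 1) + climb(n - 2)

def climb_tab(n):
    # Bottom-up tabulation: same recurrence, no recursion depth to worry about
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return b
```

For large n the tabulated version sidesteps Python's recursion limit, illustrating the iterative-conversion tip above.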
🔦 Bit Manipulation
Compact, fast operations for set logic and parity tricks.
- Use: x & 1 (odd/even), x ^ x = 0 (unique element), x & (x‑1) (drop lowest set bit)
- Patterns: subset iteration over bitmasks, bit DP for TSP‑like problems
- Payoff: e.g., XOR finds the lone unpaired element in O(n) time and O(1) space, beating sort‑ or hash‑based approaches
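Two of the tricks above in a minimal sketch: XOR cancellation for the unique element, and Kernighan's x & (x-1) loop for popcount.

```python
def single_number(nums):
    # Every element appears twice except one; x ^ x == 0, so pairs cancel
    acc = 0
    for x in nums:
        acc ^= x
    return acc

def popcount(x):
    # Kernighan's trick: x & (x - 1) clears the lowest set bit each pass,
    # so the loop runs once per set bit, not once per bit position
    count = 0
    while x:
        x &= x - 1
        count += 1
    return count
```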
🗂️ Linked Lists
Pointer‑based sequences with O(1) head/tail ops when tracked; great for dynamic changes.
- Variants: singly, doubly, circular
- Classics: reverse in O(n), detect cycle (Floyd), merge K lists (heap)
- Use: LRU caches (with hash map), editor undo history, playlists
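The O(n) reversal classic above, sketched iteratively (node class and names are illustrative):

```python
class ListNode:
    def __init__(self, val, nxt=None):
        self.val = val
        self.next = nxt

def reverse_list(head):
    # Iterative reversal: O(n) time, O(1) space, no recursion depth issues
    prev = None
    while head:
        nxt = head.next      # remember the rest of the list
        head.next = prev     # flip this node's pointer
        prev = head
        head = nxt
    return prev              # prev is the new head
```

The three-pointer dance (`prev`, `head`, `nxt`) is the template for most in-place list surgery, including the sublist and k-group variants.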
🔀 Sorting
Organizing data enables simpler algorithms and faster queries.
- Stable: merge sort O(n log n) — predictable for equal keys
- In‑place: quicksort average O(n log n), worst O(n²); randomize/pivot wisely
- Use sorted arrays for binary search, two‑pointer unions/intersections
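A sketch of the two-pointer intersection mentioned above: once both inputs are sorted, one linear pass replaces hashing.

```python
def sorted_intersection(a, b):
    # Two pointers over sorted inputs: O(len(a) + len(b)), no extra hashing
    i = j = 0
    out = []
    while i < len(a) and j < len(b):
        if a[i] == b[j]:
            out.append(a[i])
            i += 1
            j += 1
        elif a[i] < b[j]:
            i += 1               # advance the pointer at the smaller value
        else:
            j += 1
    return out
```

The same skeleton with different branch actions yields unions, merges, and difference sets.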
🔍 Binary Search
Halve the search space with a monotonic predicate; it applies to far more than index lookups.
- On arrays: lower/upper bounds, first/last occurrence
- On answer: minimize/maximize a value where feasibility is monotonic
- Gotchas: overflow (use mid = l + (r‑l)/2), infinite loops on l/r updates
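A sketch of the lower-bound pattern above, using the overflow-safe mid form (moot in Python's arbitrary-precision ints, but the portable habit the tip describes):

```python
def lower_bound(nums, target):
    # First index i with nums[i] >= target; returns len(nums) if none exists
    l, r = 0, len(nums)
    while l < r:
        mid = l + (r - l) // 2   # overflow-safe form of (l + r) // 2
        if nums[mid] < target:
            l = mid + 1          # answer is strictly right of mid
        else:
            r = mid              # mid might be the answer; keep it in range
    return l
```

The half-open invariant (answer always in [l, r)) is what prevents the infinite-loop gotcha: every branch strictly shrinks the range.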
📚 Stack & Queue
Abstract types with distinct traversal and scheduling semantics.
- Stack (LIFO): expression evaluation, parentheses, DFS, monotonic stack
- Queue (FIFO): scheduling, BFS (shortest path in unweighted graphs)
- Variants: deque (sliding window max), priority queue/heap (k best)
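The monotonic-stack pattern above, sketched on next-greater-element (a common LeetCode shape; the function name is mine):

```python
def next_greater(nums):
    # For each element, the first strictly greater element to its right, or -1
    res = [-1] * len(nums)
    stack = []                        # indices whose answer is still unknown
    for i, x in enumerate(nums):
        while stack and nums[stack[-1]] < x:
            res[stack.pop()] = x      # x is the answer for those indices
        stack.append(i)
    return res
```

Each index is pushed and popped at most once, so the whole pass is O(n) despite the nested loop.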
🌟 Why DSA Matters
- Problem solving: structure thinking; pick the right tool fast
- Efficiency: ship features that scale; save time and memory
- Interviews: communicate approach, justify trade‑offs, reason about complexity
- Foundation: primes you for trees, graphs, tries, DP, and beyond
What’s Next
- Trees & Graphs: traversals, shortest paths, MSTs, tree DP
- Heaps & Hashing: top‑k, frequency maps, custom hashing
- Dynamic Programming: knapsack, LIS/LCS, intervals, digit DP
- Greedy & Backtracking: exchange arguments, pruning
- Advanced: segment trees/Fenwick, tries, DSU, suffix arrays