Top 10 Must-Know Algorithms for Beginners
Top algorithms to master programming fundamentals quickly
Every senior engineer you look up to started with the same foundation: a small set of algorithms that taught them how to think. Not just how to write code, but how to design efficient solutions, predict performance, and reason through problems before touching a keyboard.
These 10 algorithms are that foundation.
Binary Search teaches you to think logarithmically. Bubble Sort teaches you nested loops and swapping. Merge Sort teaches divide and conquer. Quick Sort teaches partitioning. BFS and DFS teach you graph traversal. Dijkstra’s teaches you greedy strategy on weighted graphs.
Every coding interview at Google, Amazon, and Microsoft tests your knowledge of these exact algorithms. Every competitive programming problem builds on these building blocks. Every system you will ever design at scale draws on these core ideas.
This blog covers all 10: what each algorithm does, how it works step by step, clean implementations in Python, and the complexity you need to know for interviews.
What Is an Algorithm?
An algorithm is a step-by-step set of instructions for solving a problem or performing a computation. It takes an input, processes it, and produces an output.
Input → [Algorithm: defined steps] → Output
Example:
Input: Unsorted list [5, 2, 8, 1, 9]
Algorithm: Compare adjacent elements, swap if out of order, repeat
Output: Sorted list [1, 2, 5, 8, 9]

A good algorithm is correct (produces the right answer), efficient (uses minimal time and memory), and clear (understandable and maintainable).
Every algorithm is evaluated on two dimensions:
TIME COMPLEXITY: How does execution time grow as input size grows?
O(1) Constant → Same speed for any input size (best)
O(log n) Logarithmic → Doubles input, adds 1 step
O(n) Linear → Doubles input, doubles steps
O(n log n) Linearithmic→ Efficient sorting algorithms
O(n²) Quadratic → Doubles input, quadruples steps
O(2ⁿ) Exponential → Add 1 element, doubles steps (worst)
SPACE COMPLEXITY: How much extra memory is needed?
O(1) Constant → No extra memory proportional to input
O(n) Linear → Memory grows with input size
O(log n) Log → Recursion depth for divide-and-conquer

When an interviewer says "can you do better?", they are asking you to reduce time or space complexity.
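To make the growth rates concrete, here is a quick sketch that instruments both a linear scan and a binary search to count comparisons on the same sorted data (the counting helpers are introduced here for illustration):

```python
def count_linear(arr, target):
    """Return (found_index, comparisons) for a plain left-to-right scan."""
    steps = 0
    for i, v in enumerate(arr):
        steps += 1
        if v == target:
            return i, steps
    return -1, steps

def count_binary(arr, target):
    """Return (found_index, comparisons) for binary search on a sorted list."""
    low, high, steps = 0, len(arr) - 1, 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1, steps

data = list(range(1_000_000))           # sorted input, worst case at the end
print(count_linear(data, 999_999)[1])   # 1000000 comparisons
print(count_binary(data, 999_999)[1])   # 20 comparisons
```

One million comparisons versus twenty on the same input: that is the O(n) vs O(log n) gap in practice.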
1. Binary Search
Binary Search finds a target value in a sorted array by repeatedly halving the search space. Instead of checking every element (O(n)), it eliminates half the remaining elements each step (O(log n)).
Array: [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
Target: 23
Step 1: low=0, high=9, mid=4 → arr[4]=16 < 23 → search RIGHT half
Step 2: low=5, high=9, mid=7 → arr[7]=56 > 23 → search LEFT half
Step 3: low=5, high=6, mid=5 → arr[5]=23 == 23 → FOUND at index 5
Without Binary Search: could check up to 10 elements
With Binary Search: only 3 comparisons needed

def binary_search(arr, target):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1   # Target in right half
        else:
            high = mid - 1  # Target in left half
    return -1  # Not found

arr = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binary_search(arr, 23))  # 5
print(binary_search(arr, 10))  # -1
Complexity: Time O(log n) | Space O(1) Requirement: Array must be sorted.
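In day-to-day Python you rarely need to hand-roll the loop: the standard library's bisect module implements the same halving strategy. A minimal sketch (binary_search_bisect is a name introduced here):

```python
from bisect import bisect_left

def binary_search_bisect(arr, target):
    """Same O(log n) search, delegating the halving to the stdlib."""
    i = bisect_left(arr, target)          # leftmost insertion point for target
    if i < len(arr) and arr[i] == target:
        return i
    return -1

arr = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binary_search_bisect(arr, 23))  # 5
print(binary_search_bisect(arr, 10))  # -1
```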
2. Bubble Sort
Bubble Sort repeatedly compares adjacent elements and swaps them if they are in the wrong order. After each pass, the largest unsorted element “bubbles up” to its correct position at the end.
Array: [64, 34, 25, 12, 22]
Pass 1: [34, 25, 12, 22, 64] ← 64 bubbles to end
Pass 2: [25, 12, 22, 34, 64] ← 34 bubbles to position
Pass 3: [12, 22, 25, 34, 64] ← 25 bubbles to position
Pass 4: [12, 22, 25, 34, 64] ← already sorted

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # Optimization: stop if already sorted
            break
    return arr

print(bubble_sort([64, 34, 25, 12, 22]))  # [12, 22, 25, 34, 64]
Complexity: Time O(n²) worst/average, O(n) best (already sorted) | Space O(1) Best for: Teaching sorting concepts. Not used in production for large data.
3. Selection Sort
Selection Sort divides the array into a sorted part and an unsorted part. In each pass, it finds the minimum element from the unsorted part and places it at the beginning.
Array: [64, 25, 12, 22, 11]
Pass 1: min=11 → swap with index 0 → [11, 25, 12, 22, 64]
Pass 2: min=12 → swap with index 1 → [11, 12, 25, 22, 64]
Pass 3: min=22 → swap with index 2 → [11, 12, 22, 25, 64]
Pass 4: min=25 → already in place → [11, 12, 22, 25, 64]

def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
Complexity: Time O(n²) always | Space O(1) Advantage over Bubble Sort: Fewer swaps (at most n-1 swaps total).
4. Insertion Sort
Insertion Sort builds a sorted array one element at a time. It picks each element and inserts it into its correct position among the already-sorted elements, shifting larger elements right to make room.
Array: [12, 11, 13, 5, 6]
Step 1: [11, 12, 13, 5, 6] ← 11 inserted before 12
Step 2: [11, 12, 13, 5, 6] ← 13 already in place
Step 3: [5, 11, 12, 13, 6] ← 5 inserted at beginning
Step 4: [5, 6, 11, 12, 13] ← 6 inserted after 5

def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]  # Shift right
            j -= 1
        arr[j + 1] = key  # Insert in correct position
    return arr

print(insertion_sort([12, 11, 13, 5, 6]))  # [5, 6, 11, 12, 13]
Complexity: Time O(n²) worst, O(n) best (already sorted) | Space O(1) Best for: Small datasets and nearly-sorted arrays. Used in practice for small n.
5. Merge Sort
Merge Sort uses divide and conquer: it splits the array in half recursively until each subarray has one element, then merges the sorted halves back together. It guarantees O(n log n) in all cases.
Array: [38, 27, 43, 3]
Split: [38, 27] | [43, 3]
Split: [38] [27] | [43] [3]
Merge: [27, 38] [3, 43]
Merge: [3, 27, 38, 43]

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

print(merge_sort([38, 27, 43, 3, 9, 82]))  # [3, 9, 27, 38, 43, 82]
Complexity: Time O(n log n) always | Space O(n) for the merge step Best for: Large datasets where guaranteed performance matters.
6. Quick Sort
Quick Sort picks a pivot element and partitions the array so all elements smaller than the pivot go left and all larger elements go right. It then recursively sorts both sides. The average case is O(n log n), and it can be implemented in place.
Array: [10, 7, 8, 9, 1, 5], pivot = last element (5)
Partition step:
Elements < 5: [1]
Pivot: [5]
Elements > 5: [10, 7, 8, 9]
After partition: [1, 5, 10, 7, 8, 9]
↑ pivot in final position
Recursively sort [1] and [10, 7, 8, 9]
Final: [1, 5, 7, 8, 9, 10]

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]  # Middle element as pivot (the walkthrough above used the last)
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

print(quick_sort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
Complexity: Time O(n log n) average, O(n²) worst (e.g. an already-sorted array with a poor pivot choice) | Space O(log n) recursion stack for the in-place version
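The list-comprehension version above is the easiest to read, but it copies the array, so it uses O(n) extra space; the O(log n) figure applies to an in-place variant. A sketch using the Lomuto partition scheme (quick_sort_inplace and partition are names introduced here):

```python
def quick_sort_inplace(arr, low=0, high=None):
    """In-place Quick Sort using the Lomuto partition scheme."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort_inplace(arr, low, p - 1)   # sort left of pivot
        quick_sort_inplace(arr, p + 1, high)  # sort right of pivot
    return arr

def partition(arr, low, high):
    pivot = arr[high]                 # last element as pivot
    i = low - 1                       # boundary of the "< pivot" region
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]  # pivot to final spot
    return i + 1

print(quick_sort_inplace([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
```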
7. Linear Search
Linear Search checks every element one by one until the target is found or the array ends. No sorting required. Simple but slow for large arrays.
Array: [5, 3, 8, 1, 9], target = 8
Check index 0: 5 ≠ 8
Check index 1: 3 ≠ 8
Check index 2: 8 = 8 → FOUND at index 2

def linear_search(arr, target):
    for i, val in enumerate(arr):
        if val == target:
            return i
    return -1

print(linear_search([5, 3, 8, 1, 9], 8))   # 2
print(linear_search([5, 3, 8, 1, 9], 10))  # -1
Complexity: Time O(n) | Space O(1) Use when: Array is unsorted or too small to benefit from Binary Search.
8. Depth-First Search (DFS)
DFS explores a graph by going as deep as possible along each branch before backtracking. It uses a stack (or recursion) and a visited set to avoid cycles.
Graph: 0 - 1 - 3
       |
       2 - 4
DFS from 0:
Visit 0 → go deep to 1 → go deep to 3 (dead end)
Backtrack to 1 → no more neighbours
Backtrack to 0 → visit 2 → visit 4 (dead end)
Output: [0, 1, 3, 2, 4]

def dfs(graph, start, visited=None):
    if visited is None:
        visited = set()
    visited.add(start)
    result = [start]
    for neighbour in graph[start]:
        if neighbour not in visited:
            result += dfs(graph, neighbour, visited)
    return result

graph = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}
print(dfs(graph, 0))  # [0, 1, 3, 2, 4]
Complexity: Time O(V + E) | Space O(V) Best for: Cycle detection, topological sort, connected components.
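The text above mentions a stack as the alternative to recursion; a sketch of that variant, which sidesteps Python's recursion limit on very deep graphs (dfs_iterative is a name introduced here; neighbours are pushed in reverse so the visit order matches the recursive version):

```python
def dfs_iterative(graph, start):
    """DFS with an explicit stack instead of recursion."""
    visited, stack, result = set(), [start], []
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        result.append(node)
        # Push neighbours in reverse so they pop in the original order
        for neighbour in reversed(graph[node]):
            if neighbour not in visited:
                stack.append(neighbour)
    return result

graph = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}
print(dfs_iterative(graph, 0))  # [0, 1, 3, 2, 4]
```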
9. Breadth-First Search (BFS)
BFS explores a graph level by level using a queue, visiting all nodes at distance 1 before distance 2. It guarantees the shortest path in unweighted graphs.
Graph: 0 - 1 - 3
       |
       2 - 4
BFS from 0:
Level 0: [0] Visit 0
Level 1: [1, 2] Visit all neighbours of 0
Level 2: [3, 4] Visit all neighbours of 1 and 2
Output: [0, 1, 2, 3, 4]

from collections import deque

def bfs(graph, start):
    visited = {start}
    queue = deque([start])
    result = []
    while queue:
        node = queue.popleft()
        result.append(node)
        for neighbour in graph[node]:
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(neighbour)
    return result

graph = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}
print(bfs(graph, 0))  # [0, 1, 2, 3, 4]
Complexity: Time O(V + E) | Space O(V) Best for: Shortest path (unweighted), level-order traversal, connected components.
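Because BFS reaches every node by a fewest-edges route first, a small change to the same loop returns shortest-path distances instead of just the visit order. A sketch (bfs_distances is a name introduced here):

```python
from collections import deque

def bfs_distances(graph, start):
    """Shortest edge-count distance from start to every reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph[node]:
            if neighbour not in dist:   # first visit is the shortest route
                dist[neighbour] = dist[node] + 1
                queue.append(neighbour)
    return dist

graph = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}
print(bfs_distances(graph, 0))  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 2}
```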
10. Dijkstra's Algorithm
Dijkstra’s algorithm finds the shortest path from a source to all other nodes in a weighted graph with non-negative edge weights. It uses a priority queue (min heap) to always process the closest unvisited node first.
Graph (weighted):
    0 --4-- 1 --1-- 3
    |       |
    2       5
    |       |
    2 ------+
(edges: 0-1 weight 4, 0-2 weight 2, 1-3 weight 1, 2-1 weight 5)
Dijkstra from node 0:
Start: dist = [0, ∞, ∞, ∞]
Process 0: update neighbours 1 (dist=4), 2 (dist=2)
Process 2: update 1 (dist=2+5=7→no, keep 4)
Process 1: update 3 (dist=4+1=5)
Final distances from 0: {0: 0, 1: 4, 2: 2, 3: 5}

import heapq

def dijkstra(graph, start):
    """
    graph: {node: [(neighbour, weight), ...]}
    Returns shortest distances from start to all nodes.
    """
    dist = {node: float('inf') for node in graph}
    dist[start] = 0
    heap = [(0, start)]  # (distance, node)
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist[node]:
            continue  # Skip outdated entries
        for neighbour, weight in graph[node]:
            new_dist = dist[node] + weight
            if new_dist < dist[neighbour]:
                dist[neighbour] = new_dist
                heapq.heappush(heap, (new_dist, neighbour))
    return dist

graph = {
    0: [(1, 4), (2, 2)],
    1: [(3, 1)],
    2: [(1, 5)],
    3: []
}
print(dijkstra(graph, 0))  # {0: 0, 1: 4, 2: 2, 3: 5}
Complexity: Time O((V + E) log V) with min heap | Space O(V) Requirement: Non-negative edge weights only.
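The implementation above returns only distances. If you also need the route itself, recording each node's predecessor lets you walk the path back from the destination. A sketch built on the same heap loop (dijkstra_path is a name introduced here):

```python
import heapq

def dijkstra_path(graph, start, end):
    """Shortest path from start to end as a list of nodes (None if unreachable)."""
    dist = {node: float('inf') for node in graph}
    prev = {node: None for node in graph}
    dist[start] = 0
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist[node]:
            continue                          # skip outdated heap entries
        for neighbour, weight in graph[node]:
            new_dist = d + weight
            if new_dist < dist[neighbour]:
                dist[neighbour] = new_dist
                prev[neighbour] = node        # remember how we got here
                heapq.heappush(heap, (new_dist, neighbour))
    if dist[end] == float('inf'):
        return None
    path, node = [], end
    while node is not None:                   # walk predecessors back to start
        path.append(node)
        node = prev[node]
    return path[::-1]

graph = {0: [(1, 4), (2, 2)], 1: [(3, 1)], 2: [(1, 5)], 3: []}
print(dijkstra_path(graph, 0, 3))  # [0, 1, 3]
```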
| Algorithm | Best Time | Average Time | Worst Time | Space | Stable? |
|---|---|---|---|---|---|
| Binary Search | O(1) | O(log n) | O(log n) | O(1) | N/A |
| Linear Search | O(1) | O(n) | O(n) | O(1) | N/A |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| DFS | O(V+E) | O(V+E) | O(V+E) | O(V) | N/A |
| BFS | O(V+E) | O(V+E) | O(V+E) | O(V) | N/A |
| Dijkstra | O((V+E) log V) | O((V+E) log V) | O((V+E) log V) | O(V) | N/A |
Week 1: Binary Search + Linear Search
Why: Teaches you O(log n) vs O(n) thinking immediately.
Builds: Foundation for every search problem.
Week 2: Bubble Sort + Insertion Sort
Why: Teaches nested loops, swapping, and O(n²) patterns.
Builds: Core understanding of sorting mechanics.
Week 3: Merge Sort + Quick Sort
Why: Teaches divide and conquer, recursion, O(n log n).
Builds: Most important sorting knowledge for interviews.
Week 4: DFS + BFS
Why: Teaches graph traversal, the foundation of all graph problems.
Builds: Skills needed for trees, graphs, paths, and components.
Week 5: Dijkstra's Algorithm
Why: Teaches weighted shortest path, greedy strategy, priority queues.
Builds: Ability to solve GPS, routing, and network problems.
After these 10, learn: Dynamic Programming, Binary Trees, Hash Tables, and Greedy Algorithms.

These 10 algorithms are the foundation of everything you will build as a programmer. Here is the full recap:
Searching: Binary Search (O(log n), sorted arrays) and Linear Search (O(n), any array). Always prefer Binary Search for sorted data.
Sorting (simple): Bubble Sort, Selection Sort, and Insertion Sort. All O(n²) worst case. Learn them to understand sorting mechanics. Use Insertion Sort for small or nearly-sorted arrays.
Sorting (efficient): Merge Sort (O(n log n) always, stable, extra space) and Quick Sort (O(n log n) average, in-place, fastest in practice).
Graph Traversal: DFS (stack, deep-first, cycle detection, topological sort) and BFS (queue, level-first, shortest path in unweighted graphs). Both O(V + E).
Weighted Shortest Path: Dijkstra’s Algorithm (priority queue, O((V+E) log V), non-negative weights only).
The single most important skill these algorithms build is not memorising the code. It is learning to think about time and space complexity before writing a single line.