Greedy Algorithms
Make the locally optimal choice at each step to find a global optimum. Greedy works when the problem has optimal substructure and the greedy choice property.
Problem
Find an optimal solution (maximize or minimize some objective) where making a sequence of locally best choices leads to a globally best result — without backtracking or reconsidering.
Intuition
A greedy algorithm works when the problem has two properties:
- Greedy choice property: a locally optimal choice at each step is part of some globally optimal solution. You never need to undo a decision.
- Optimal substructure: after making a greedy choice, the remaining problem is a smaller instance of the same problem.
If both hold, greedily choosing the best option at each step and solving the residual subproblem gives an optimal answer. If either fails, greedy may produce a wrong answer — and you likely need DP or exhaustive search.
Interval scheduling — the classic greedy
Given n intervals [s₁, e₁], …, [sₙ, eₙ], select the maximum number of non-overlapping intervals.
Key insight: always pick the interval that ends earliest. This leaves the most room for future intervals.
interval_scheduling(intervals):
    sort intervals by end time
    selected ← []
    last_end ← -∞
    for each interval [s, e] in sorted order:
        if s >= last_end:
            selected.append([s, e])
            last_end ← e
    return selected
Sorting is O(n log n), the scan is O(n). Total: O(n log n).
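The pseudocode above translates almost line for line into Python; this is a minimal sketch (representing intervals as (start, end) tuples is an assumption, not part of the original):

```python
def interval_scheduling(intervals):
    """Select a maximum set of non-overlapping (start, end) intervals.

    Greedy: sort by end time, keep an interval whenever it starts
    at or after the end of the last selected interval.
    """
    selected = []
    last_end = float("-inf")
    for s, e in sorted(intervals, key=lambda iv: iv[1]):
        if s >= last_end:
            selected.append((s, e))
            last_end = e
    return selected

# (1,3) is kept, (2,5) overlaps it, (4,7) is kept, (6,8) overlaps that.
print(interval_scheduling([(1, 3), (2, 5), (4, 7), (6, 8)]))  # [(1, 3), (4, 7)]
```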
Activity selection — weighted vs unweighted
The unweighted case above is pure greedy. When each interval has a weight (profit), the greedy choice property breaks: picking the earliest-ending interval might skip a high-value one. Weighted interval scheduling requires DP:
OPT(j) = max(OPT(j − 1), wⱼ + OPT(p(j))), where p(j) is the latest interval that doesn’t overlap interval j. The unweighted version is the special case where all weights equal 1.
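A sketch of the weighted DP, assuming intervals arrive as (start, end, weight) tuples and that an interval ending exactly when another starts does not overlap it (the same convention as the pseudocode above):

```python
import bisect

def weighted_interval_scheduling(intervals):
    """Maximum total weight of non-overlapping (start, end, weight) intervals.

    Recurrence: OPT(j) = max(OPT(j-1), w_j + OPT(p(j))),
    where p(j) is the number of intervals ending at or before
    interval j's start (found by binary search on sorted end times).
    """
    ivs = sorted(intervals, key=lambda iv: iv[1])  # sort by end time
    ends = [e for _, e, _ in ivs]
    opt = [0] * (len(ivs) + 1)                     # opt[j] = best using first j intervals
    for j, (s, e, w) in enumerate(ivs, start=1):
        p = bisect.bisect_right(ends, s, 0, j - 1)  # p(j) via binary search
        opt[j] = max(opt[j - 1], w + opt[p])
    return opt[-1]
```

With all weights equal to 1 this returns the same count as the greedy algorithm, matching the “special case” remark above.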
Other greedy patterns
Fractional knapsack: sort items by value-to-weight ratio vᵢ/wᵢ. Take as much as possible of the best ratio first. Unlike 0/1 knapsack, fractions are allowed, so greedy is optimal. O(n log n).
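As a sketch (representing items as (value, weight) pairs is an assumption):

```python
def fractional_knapsack(items, capacity):
    """Maximum value from (value, weight) items, fractions allowed.

    Greedy on value/weight ratio: take the best-ratio item whole,
    or the fraction of it that still fits.
    """
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item, or the fitting fraction
        total += value * (take / weight)
        capacity -= take
    return total

# Classic instance: take items worth 60 and 100 whole, then 2/3 of the 120 item.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```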
Huffman coding: repeatedly merge the two lowest-frequency symbols into one node. Builds an optimal prefix-free code. O(n log n) with a min-heap.
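A compact sketch that computes only the code lengths, tracking each symbol’s depth instead of building an explicit tree (each merge pushes every symbol in the merged subtrees one level deeper); the tie-breaking counter is an implementation detail, not part of the algorithm:

```python
import heapq

def huffman_code_lengths(freqs):
    """freqs: dict symbol -> frequency. Returns dict symbol -> code length."""
    if len(freqs) == 1:
        return {s: 1 for s in freqs}
    # Heap entries: (frequency, tiebreak, symbols in this subtree).
    heap = [(f, i, [s]) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    depth = {s: 0 for s in freqs}
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)   # two lowest-frequency nodes
        f2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            depth[s] += 1                    # merged symbols sink one level
        heapq.heappush(heap, (f1 + f2, tiebreak, syms1 + syms2))
        tiebreak += 1
    return depth
```

On the textbook frequencies {a:45, b:13, c:12, d:16, e:9, f:5} this yields lengths {a:1, b:3, c:3, d:3, e:4, f:4}, an optimal prefix-free code.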
Jump game: at each position, jump as far as you can reach. Track the farthest reachable index — if it reaches the end, the answer is yes. O(n).
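A sketch of the farthest-reach scan:

```python
def can_jump(nums):
    """nums[i] = maximum jump length from index i.
    Returns True if the last index is reachable from index 0."""
    farthest = 0
    for i, step in enumerate(nums):
        if i > farthest:                 # index i itself is unreachable
            return False
        farthest = max(farthest, i + step)
    return True

print(can_jump([2, 3, 1, 1, 4]))  # True
print(can_jump([3, 2, 1, 0, 4]))  # False: every path strands you at index 3
```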
Minimum spanning tree: both Kruskal’s (sort edges, union-find) and Prim’s (min-heap of crossing edges) are greedy on edge weights.
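A minimal Kruskal’s sketch with a path-halving union-find (the (weight, u, v) edge format and vertex labels 0..n−1 are assumptions):

```python
def kruskal(n, edges):
    """Total weight of an MST of a connected graph on vertices 0..n-1.

    Greedy: consider edges in increasing weight order, keeping each
    edge unless it would close a cycle (detected via union-find).
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    total, used = 0, 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two components: keep it
            parent[ru] = rv
            total += w
            used += 1
            if used == n - 1:               # spanning tree complete
                break
    return total
```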
When greedy fails
0/1 knapsack: you can’t take fractions, so the ratio-based greedy picks wrong items. Need DP.
Coin change (arbitrary denominations): greedy picks the largest coin first, but for denominations like {1, 3, 4} with target 6, greedy gives 4 + 1 + 1 (3 coins) instead of optimal 3 + 3 (2 coins). With standard US denominations {1, 5, 10, 25}, greedy happens to work.
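The coin-change failure is easy to demonstrate by running largest-coin-first greedy against a small DP; the denominations {1, 3, 4} with target 6 are a standard illustrative case:

```python
def greedy_coin_count(coins, target):
    """Largest-coin-first greedy; count may be suboptimal."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += target // c
        target %= c
    return count

def optimal_coin_count(coins, target):
    """DP: minimum number of coins summing to target."""
    INF = float("inf")
    best = [0] + [INF] * target
    for t in range(1, target + 1):
        best[t] = min((best[t - c] for c in coins if c <= t), default=INF) + 1
    return best[target]

print(greedy_coin_count([1, 3, 4], 6))   # 3  (4 + 1 + 1)
print(optimal_coin_count([1, 3, 4], 6))  # 2  (3 + 3)
```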
Graph coloring: greedily assigning the smallest available color doesn’t guarantee the chromatic number.
Proving greedy correctness
Two standard techniques:
Exchange argument: assume an optimal solution O differs from the greedy solution G. Show you can swap a choice in O for the greedy choice without worsening the objective. Repeat until O = G.
Greedy stays ahead: after each step k, show the greedy solution is at least as good as any other solution’s first k choices. By induction, greedy is optimal at the end.
Both boil down to: “we never regret the greedy choice.”
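A worked instance of the second technique, sketched for interval scheduling:

```latex
\textbf{Greedy stays ahead, for interval scheduling.}
Let $G = g_1, \dots, g_k$ be greedy's picks and $O = o_1, \dots, o_m$
any feasible solution, each ordered by end time.

\emph{Claim:} $\operatorname{end}(g_i) \le \operatorname{end}(o_i)$
for all $i \le \min(k, m)$.

\emph{Induction.} $g_1$ has the earliest end time overall, so
$\operatorname{end}(g_1) \le \operatorname{end}(o_1)$. If
$\operatorname{end}(g_i) \le \operatorname{end}(o_i)$, then $o_{i+1}$
starts at or after $\operatorname{end}(o_i) \ge \operatorname{end}(g_i)$,
so $o_{i+1}$ was still available at greedy's $(i{+}1)$-th choice; greedy
took the earliest-ending available interval, hence
$\operatorname{end}(g_{i+1}) \le \operatorname{end}(o_{i+1})$.

Finally, if $m > k$, interval $o_{k+1}$ starts at or after
$\operatorname{end}(o_k) \ge \operatorname{end}(g_k)$, so greedy would
not have stopped after $k$ picks, a contradiction. Hence $k \ge m$.
```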
Complexity
| Algorithm | Time | Space |
|---|---|---|
| Interval scheduling | O(n log n) | O(n) |
| Fractional knapsack | O(n log n) | O(1) extra |
| Huffman coding | O(n log n) | O(n) |
| Jump game | O(n) | O(1) |
| Kruskal's MST | O(E log E) | O(V) |
Key takeaways
- Greedy choice property + optimal substructure are the two conditions that must hold; if either fails, greedy produces wrong answers and you need DP or exhaustive search.
- Sort first — most greedy problems start by sorting on the right criterion (earliest end time, best ratio, farthest reach). Choosing the wrong sort key is the most common mistake.
- Exchange argument is the go-to proof technique: show you can swap any non-greedy choice in an optimal solution without worsening the result.
- Greedy is a special case of DP where only one subproblem matters at each step — always try greedy first, then fall back to DP if the greedy choice property doesn’t hold.
- Interval scheduling (sort by end time) is the canonical example; master it and the pattern transfers to task scheduling, meeting rooms, and merge intervals.
Practice problems
| Problem | Difficulty | Key idea |
|---|---|---|
| Jump Game | 🟡 Medium | Track the farthest reachable index greedily from left to right |
| Jump Game II | 🟡 Medium | BFS-like greedy using current and next reachable boundary to minimize jumps |
| Gas Station | 🟡 Medium | Track running fuel surplus to identify the unique valid starting index |
| Task Scheduler | 🟡 Medium | Greedily schedule the most frequent task first and calculate idle slots |
| Non-overlapping Intervals | 🟡 Medium | Sort by end time and count overlapping intervals to remove (interval scheduling) |
Relation to other topics
| | Greedy | DP | Brute force |
|---|---|---|---|
| Explores | 1 choice per step | All subproblems | All combinations |
| Backtracks | Never | Implicitly (via table) | Explicitly |
| Time | Usually O(n log n) | Polynomial (often O(n²) or O(n³)) | Exponential |
| Correct when | Greedy choice property holds | Optimal substructure + overlapping subproblems | Always |
Greedy is a special case of DP where only one subproblem needs to be solved at each step. When it works, it’s the simplest and fastest approach.
What this enables
Once the current idea feels natural, these are the most useful next jumps.
Dijkstra's Algorithm
Find the shortest path in weighted graphs with non-negative edges. Dijkstra is BFS with a priority queue — it processes nodes in order of cumulative distance.
Minimum Spanning Tree
Find the subset of edges that connects all vertices with minimum total weight. Kruskal's and Prim's are the two classic approaches.
Related directions
These topics live nearby conceptually, even if they are not direct prerequisites.
Heap & Priority Queue
A complete binary tree where every parent is smaller (or larger) than its children. The backbone of Dijkstra, event scheduling, and top-k problems.
Dynamic Programming
Break a problem into overlapping subproblems, solve each once, reuse the results. DP turns exponential brute force into polynomial time.
More from Problem-solving strategies
Stay in the same family when you want parallel variations of the same mental model.
Backtracking
Systematically explore all possible solutions by building candidates incrementally and abandoning paths that cannot lead to valid solutions.
Convex Hull Trick
Maintain a set of lines so DP transitions of the form m*x + b can be optimized by querying only the best candidate line.
Paths that include this topic
Follow one of these sequences if you want a guided next step instead of open-ended browsing.
Strategy playbook
A reusable set of problem-solving moves for brute-force search, greedy choices, memoized state, and advanced DP optimization.