Heap & Priority Queue
A complete binary tree where every parent is smaller (or larger) than its children. The backbone of Dijkstra, event scheduling, and top-k problems.
Trees are nested under Graphs, so techniques from the broader family usually still apply here.
Problem
Maintain a collection that supports insert and extract-min (or max) in O(log n) time. This is a priority queue — elements come out in priority order, not insertion order.
Intuition
A binary heap is a complete binary tree stored in an array. The heap property says: every parent is smaller than both children (min-heap) or larger (max-heap).
Since the tree is complete, it maps perfectly to an array:
- Node at index i
- Left child: 2i + 1
- Right child: 2i + 2
- Parent: ⌊(i - 1) / 2⌋
No pointers needed. The array IS the tree.
Operations
Insert (bubble up)
Add the new element at the end of the array (next available leaf position). Then bubble up: compare with parent, swap if smaller, repeat until the heap property is restored.
insert(heap, val):
    heap.append(val)
    i ← heap.length - 1
    while i > 0:
        parent ← (i - 1) / 2        // integer division
        if heap[i] < heap[parent]:
            swap(heap[i], heap[parent])
            i ← parent
        else:
            break
Worst case: the new element is the smallest and bubbles all the way to the root → O(log n).
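The insert pseudocode translates almost line for line to Python. A minimal sketch (the heap is a plain list; the function name is mine):

```python
def heap_insert(heap, val):
    """Append val, then bubble it up until the min-heap property holds."""
    heap.append(val)
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2                    # integer division
        if heap[i] < heap[parent]:
            heap[i], heap[parent] = heap[parent], heap[i]
            i = parent
        else:
            break

h = []
for v in [5, 3, 8, 1]:
    heap_insert(h, v)
print(h[0])  # → 1 (minimum is at the root)
```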
Extract-min (bubble down)
Remove the root (minimum element). Replace it with the last element in the array. Then bubble down: compare with both children, swap with the smaller child, repeat.
extract_min(heap):
    min_val ← heap[0]
    last ← heap.pop()               // remove last element
    if heap is not empty:
        heap[0] ← last              // move last to root
        i ← 0
        while true:
            left ← 2 * i + 1
            right ← 2 * i + 2
            smallest ← i
            if left < heap.length and heap[left] < heap[smallest]:
                smallest ← left
            if right < heap.length and heap[right] < heap[smallest]:
                smallest ← right
            if smallest == i:
                break
            swap(heap[i], heap[smallest])
            i ← smallest
    return min_val
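The same logic in Python, as a sketch (function name mine; note the guard so popping the last remaining element doesn't write back into an empty list):

```python
def extract_min(heap):
    """Remove and return the smallest element, restoring the heap by bubbling down."""
    min_val = heap[0]
    last = heap.pop()                            # remove the last element
    if heap:                                     # anything left? move last to root
        heap[0] = last
        i, n = 0, len(heap)
        while True:
            left, right = 2 * i + 1, 2 * i + 2
            smallest = i
            if left < n and heap[left] < heap[smallest]:
                smallest = left
            if right < n and heap[right] < heap[smallest]:
                smallest = right
            if smallest == i:
                break
            heap[i], heap[smallest] = heap[smallest], heap[i]
            i = smallest
    return min_val

h = [1, 3, 8, 5]                                 # a valid min-heap
print(extract_min(h))  # → 1
print(extract_min(h))  # → 3
```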
Peek
Just return heap[0]. The minimum is always at the root. O(1).
Heapify: building a heap in O(n)
You might think building a heap from n elements takes O(n log n) — insert each element one by one. But there’s a faster way: bottom-up heapify.
Start from the last non-leaf node and bubble down each node:
heapify(arr):
    for i in (arr.length / 2 - 1) down to 0:
        bubble_down(arr, i)
Why is this O(n) and not O(n log n)? Most nodes are near the bottom of the tree and bubble down only a few levels. Formally: summing (number of nodes at height h) × O(h) gives Σ (n / 2^(h+1)) · O(h) = O(n), because the series Σ h / 2^h converges to a constant.
This is important because it means heap sort’s initial heap construction is O(n), not O(n log n).
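A Python sketch of bottom-up heapify (function names mine; `heapq.heapify` in the standard library does the same thing in C):

```python
def bubble_down(arr, i, n):
    """Sift arr[i] down within arr[:n] until the min-heap property holds."""
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        smallest = i
        if left < n and arr[left] < arr[smallest]:
            smallest = left
        if right < n and arr[right] < arr[smallest]:
            smallest = right
        if smallest == i:
            return
        arr[i], arr[smallest] = arr[smallest], arr[i]
        i = smallest

def heapify(arr):
    """Bottom-up O(n) heap construction: sift down each non-leaf node."""
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):          # last non-leaf down to the root
        bubble_down(arr, i, n)

a = [9, 4, 7, 1, 2, 6]
heapify(a)
print(a[0])  # → 1 (every parent is now <= its children)
```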
Heap sort
Build a max-heap, then repeatedly extract the maximum and place it at the end:
heap_sort(arr):
    heapify(arr)                      // build max-heap: O(n)
    for i in (arr.length - 1) down to 1:
        swap(arr[0], arr[i])          // move max to sorted portion
        bubble_down(arr, 0, size=i)   // restore heap on reduced array
Heap sort is always O(n log n) (no bad inputs), in-place (O(1) extra space), but not stable and has poor cache locality (the bubble-down step jumps around the array).
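A self-contained Python sketch of heap sort (function names mine; the sift uses a max-heap comparison, so the maximum lands at the root and then moves to the end):

```python
def sift_down(arr, i, n):
    """Sift arr[i] down within arr[:n] for a max-heap."""
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        largest = i
        if left < n and arr[left] > arr[largest]:
            largest = left
        if right < n and arr[right] > arr[largest]:
            largest = right
        if largest == i:
            return
        arr[i], arr[largest] = arr[largest], arr[i]
        i = largest

def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):          # build max-heap: O(n)
        sift_down(arr, i, n)
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]      # move max to sorted portion
        sift_down(arr, 0, end)                   # restore heap on arr[:end]

a = [5, 2, 9, 1, 7]
heap_sort(a)
print(a)  # → [1, 2, 5, 7, 9]
```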
Where heaps shine
Dijkstra’s algorithm: the priority queue is a min-heap of (distance, node) pairs. Extract-min gives you the nearest unvisited node.
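A minimal Dijkstra sketch with `heapq` (graph shape and function name are my choices; stale heap entries are skipped instead of decreased in place, the usual Python idiom):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source; graph maps node -> list of (neighbor, weight)."""
    dist = {source: 0}
    pq = [(0, source)]                           # min-heap of (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                             # stale entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

g = {"a": [("b", 4), ("c", 1)], "c": [("b", 2)], "b": []}
print(dijkstra(g, "a")["b"])  # → 3 (a → c → b beats the direct a → b edge)
```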
Top-k elements: maintain a min-heap of size k. For each element, if it’s larger than the heap’s minimum, replace the minimum. At the end, the heap contains the k largest elements. O(n log k).
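The top-k pattern with `heapq` (function name mine; `heapreplace` pops the minimum and pushes the new value in one O(log k) step):

```python
import heapq

def top_k(items, k):
    """Keep a min-heap of size k; its root is the smallest of the current top k."""
    heap = []
    for x in items:
        if len(heap) < k:
            heapq.heappush(heap, x)
        elif x > heap[0]:
            heapq.heapreplace(heap, x)           # evict current minimum, insert x
    return sorted(heap, reverse=True)

print(top_k([5, 1, 9, 3, 7, 8], 3))  # → [9, 8, 7]
```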
Merge k sorted lists: maintain a min-heap of size k containing the current head of each list. Extract-min, advance that list, insert the next element. O(n log k), where n is the total number of elements.
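A k-way merge sketch (function name mine; the list index in each tuple breaks ties between equal values so tuples never compare lists):

```python
import heapq

def merge_k(lists):
    """Min-heap of size k holds (value, list index, position) for each list head."""
    heap = [(lst[0], i, 0) for i, lst in enumerate(lists) if lst]
    heapq.heapify(heap)
    out = []
    while heap:
        val, i, j = heapq.heappop(heap)
        out.append(val)
        if j + 1 < len(lists[i]):                # advance the list we took from
            heapq.heappush(heap, (lists[i][j + 1], i, j + 1))
    return out

print(merge_k([[1, 4, 7], [2, 5], [3, 6]]))  # → [1, 2, 3, 4, 5, 6, 7]
```

The standard library also ships `heapq.merge`, which does this lazily over iterables.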
Event-driven simulation: events are processed in chronological order. New events are inserted as they’re generated. A min-heap on timestamp gives you the next event in O(log n).
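A tiny event-loop sketch (timestamps and labels are made up for illustration): events go in out of order, and the heap hands them back chronologically.

```python
import heapq

events = []                                      # min-heap keyed on timestamp
heapq.heappush(events, (3.0, "arrive"))
heapq.heappush(events, (1.5, "depart"))
heapq.heappush(events, (2.0, "arrive"))

order = []
while events:
    t, kind = heapq.heappop(events)              # next event chronologically, O(log n)
    order.append(t)
print(order)  # → [1.5, 2.0, 3.0]
```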
Median maintenance: use two heaps — a max-heap for the lower half and a min-heap for the upper half. The median is at one of the roots. Insert and rebalance in O(log n).
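A two-heap running-median sketch (class and attribute names are mine; `heapq` only provides min-heaps, so the lower half stores negated values):

```python
import heapq

class RunningMedian:
    """Max-heap (negated values) for the lower half, min-heap for the upper half."""
    def __init__(self):
        self.lo = []   # negated values: root -lo[0] is the largest of the lower half
        self.hi = []   # min-heap: root hi[0] is the smallest of the upper half

    def add(self, x):
        heapq.heappush(self.lo, -x)
        # route the largest lower element up so every lo value <= every hi value
        heapq.heappush(self.hi, -heapq.heappop(self.lo))
        if len(self.hi) > len(self.lo):          # keep lo equal in size or one larger
            heapq.heappush(self.lo, -heapq.heappop(self.hi))

    def median(self):
        if len(self.lo) > len(self.hi):
            return -self.lo[0]
        return (-self.lo[0] + self.hi[0]) / 2

m = RunningMedian()
for x in [5, 2, 8, 1]:
    m.add(x)
print(m.median())  # → 3.5 (median of {1, 2, 5, 8})
```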
Min-heap vs max-heap
Most languages provide a min-heap by default. To get a max-heap, negate the values on insert and negate again on extract. Or use a comparator.
| Language | Default | Max-heap trick |
|---|---|---|
| Python heapq | Min-heap | Insert -val |
| Java PriorityQueue | Min-heap | Collections.reverseOrder() |
| C++ priority_queue | Max-heap | greater<> for min-heap |
| Go container/heap | Interface-based | Implement Less accordingly |
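The negation trick in Python, as a quick sketch:

```python
import heapq

nums = [3, 1, 4, 1, 5]
max_heap = [-x for x in nums]        # negate on insert
heapq.heapify(max_heap)
largest = -heapq.heappop(max_heap)   # negate again on extract
print(largest)  # → 5
```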
Complexity
| Operation | Time |
|---|---|
| Insert | O(log n) |
| Extract-min/max | O(log n) |
| Peek | O(1) |
| Heapify (build) | O(n) |
| Heap sort | O(n log n) |
Space is O(n) for the array. No auxiliary space is needed beyond the array itself.
Key takeaways
- A heap is a complete binary tree stored in an array — the node at index i has children at 2i + 1 and 2i + 2 and its parent at ⌊(i - 1) / 2⌋, no pointers needed.
- Bottom-up heapify is O(n), not O(n log n) — most nodes are near the bottom and bubble down only a few levels.
- Use a min-heap of size k for top-k problems — this gives O(n log k) time, much better than sorting when k ≪ n.
- Two heaps solve the running median — a max-heap for the lower half and a min-heap for the upper half, with the median at one of the roots.
- Most languages default to min-heap — negate values for a max-heap, except C++ which defaults to max-heap.
Practice problems
| Problem | Difficulty | Key idea |
|---|---|---|
| Kth Largest Element in a Stream | 🟢 Easy | Maintain a min-heap of size k; the root is always the answer |
| Top K Frequent Elements | 🟡 Medium | Count frequencies, then use a heap to extract the top k |
| Find Median from Data Stream | 🔴 Hard | Two-heap approach: max-heap for lower half, min-heap for upper half |
| Merge k Sorted Lists | 🔴 Hard | Min-heap of size k holding the current head of each list |
| Task Scheduler | 🟡 Medium | Max-heap to always schedule the most frequent remaining task |
Relation to other topics
- Graphs: heaps are the backbone of Dijkstra’s shortest path and Prim’s MST algorithms, providing efficient extract-min for greedy vertex selection.
- Sorting: heap sort uses the heap structure for in-place sorting, though it has worse cache locality than quicksort.
- Binary search trees: BSTs support the same operations plus ordered traversal and predecessor/successor queries, but with higher constant factors and pointer overhead.
What this enables
Once the current idea feels natural, these are the most useful next jumps.
Dijkstra's Algorithm
Find the shortest path in weighted graphs with non-negative edges. Dijkstra is BFS with a priority queue — it processes nodes in order of cumulative distance.
Minimum Spanning Tree
Find the subset of edges that connects all vertices with minimum total weight. Kruskal's and Prim's are the two classic approaches.
Treap
Combine binary-search-tree ordering with heap priorities to get a balanced randomized tree with elegant split and merge operations.
External Merge Sort
Sort data larger than RAM by generating sorted runs and merging them with mostly sequential disk I/O.
More from Trees
Stay in the same family when you want parallel variations of the same mental model.
Tree Fundamentals
Understand roots, parents, children, depth, height, and traversal orders. A tree is a graph with enough structure to make many problems much simpler.
Binary Search Tree (BST)
A sorted tree structure enabling O(log n) search, insert, and delete. Foundation for balanced trees, sets, and maps.
Binary Lifting
Jump upward in a rooted tree by powers of two to answer k-th ancestor and many LCA-style queries quickly.
Euler Tour Technique
Flatten a rooted tree into an array so subtree problems become range problems. This is the bridge between trees and range-query structures.
Paths that include this topic
Follow one of these sequences if you want a guided next step instead of open-ended browsing.
Graph toolkit
Start with graph shape, then layer traversal, connectivity structure, and both single-source and all-pairs path algorithms.
Weighted shortest paths
See the spectrum from plain BFS to deque-based 0-1 BFS to priority-queue shortest paths and dense all-pairs DP.
Ordered structures
See how order can come from sorted arrays, search trees, skip lists, disk-friendly indexes, heaps, and prefix trees.
Database operators
See how external sorting and join strategies turn in-memory algorithms into database execution plans.