
The Two Heaps of C++: A Complete, Correct, and Expert-Level Guide

CC0

Disclaimer: ChatGPT generated document.

Everything you wanted to know about heaps — the data-structure heap and the dynamic memory heap — in one place.


1. Introduction: Why “heap” means two completely different things

In C++, heap is an overloaded term referring to two unrelated concepts:

  1. Binary heap — the tree-based priority queue structure behind std::make_heap, std::push_heap, std::pop_heap, std::priority_queue.
  2. Memory heap (a.k.a. free store) — the region from which new, delete, malloc, and free allocate dynamically sized memory blocks.

These two concepts:

  • share no common implementation,
  • share no overlapping invariants,
  • exist for entirely different purposes,
  • and share a name only by historical accident, not for any technical reason.

This guide explains both deeply and shows how to never confuse them again.


2. The Binary Heap (data structure)

2.1 What is a binary heap?

A binary heap is:

  1. A complete binary tree: all levels are filled except possibly the last, which is filled left to right.
  2. A tree that satisfies the heap property:
    • Max-heap: each parent ≥ each child
    • Min-heap: each parent ≤ each child

Because it’s complete, it can be stored contiguously in std::vector without pointers or node objects.


2.2 Array representation

Given a 0-indexed array a:

  • parent(i) = (i - 1) / 2
  • left(i) = 2*i + 1
  • right(i) = 2*i + 2

Example max-heap represented as [9, 5, 8, 1, 3, 4]:

       9
     /   \
    5     8
   / \   /
  1  3  4
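
A minimal sketch, assuming a 0-indexed std::vector and illustrative helper names (parent, left, right, and is_max_heap are not standard functions), that applies these index formulas to the array above:

#include <cstddef>
#include <iostream>
#include <vector>

// Index helpers for a 0-indexed, array-backed heap.
std::size_t parent(std::size_t i) { return (i - 1) / 2; }
std::size_t left(std::size_t i)   { return 2 * i + 1; }
std::size_t right(std::size_t i)  { return 2 * i + 2; }

// True if every element is <= its parent (max-heap property).
bool is_max_heap(const std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i)
        if (a[i] > a[parent(i)]) return false;
    return true;
}

int main() {
    std::vector<int> a {9, 5, 8, 1, 3, 4};
    std::cout << std::boolalpha << is_max_heap(a) << '\n';  // true
    std::cout << a[left(0)] << ' ' << a[right(0)] << '\n';  // 5 8 (children of the root)
}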

2.3 Time complexity

| Operation | Complexity | Notes |
| --- | --- | --- |
| Access root (front()) | O(1) | Always the extremum |
| Insert (push) | O(log n) | Bubble-up |
| Extract extremum (pop) | O(log n) | Bubble-down |
| Build from array | O(n) | Floyd's heapify algorithm |

Note: Many beginners incorrectly assume heap build is O(n log n). It is actually O(n): most nodes sit near the bottom of the tree, where each sift-down does only a little work, so the total work is bounded by a constant multiple of n.
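
For the curious, here is a minimal sketch of Floyd's bottom-up build; sift_down and build_max_heap are illustrative names, not standard library functions (std::make_heap already does this for you):

#include <cstddef>
#include <utility>
#include <vector>

// Restore the max-heap property for the subtree rooted at i,
// assuming both child subtrees already satisfy it.
void sift_down(std::vector<int>& a, std::size_t i, std::size_t n) {
    while (true) {
        std::size_t largest = i;
        std::size_t l = 2 * i + 1, r = 2 * i + 2;
        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest == i) break;       // heap property holds here
        std::swap(a[i], a[largest]);   // otherwise push the violation down
        i = largest;
    }
}

// Floyd's build: sift down every internal node, from the last one up to the root.
// Nodes near the bottom dominate in count but barely move, giving O(n) total work.
void build_max_heap(std::vector<int>& a) {
    for (std::size_t i = a.size() / 2; i-- > 0; )
        sift_down(a, i, a.size());
}

int main() {
    std::vector<int> v {3, 1, 4, 1, 5, 9, 2, 6};
    build_max_heap(v);   // v.front() is now 9, the same effect as std::make_heap
}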


2.4 C++ heap algorithms

All heap algorithms live in <algorithm>, not <queue>.

They operate on any random-access range — most commonly std::vector.

| Function | Description | Complexity |
| --- | --- | --- |
| std::make_heap(first, last) | Creates a max-heap | O(n) |
| std::make_heap(..., comp) | Custom comparator → min-heap via std::greater<> | O(n) |
| std::push_heap(first, last) | Inserts the element at last-1 into the heap | O(log n) |
| std::pop_heap(first, last) | Moves the max to last-1, restores the heap in [first, last-1) | O(log n) |
| std::sort_heap(first, last) | Converts a heap into a sorted range | O(n log n) |
| std::is_heap / std::is_heap_until | Validation helpers | O(n) |

Important subtlety

std::push_heap does not append elements. You must push into the container manually:

v.push_back(x);
std::push_heap(v.begin(), v.end());
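
The same caveat applies on the way out: std::pop_heap only moves the maximum to the back and re-heapifies the rest; removing it from the container is still your job:

std::pop_heap(v.begin(), v.end()); // max now sits at v.back()
int top = v.back();                // read it...
v.pop_back();                      // ...then actually remove it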

2.5 std::priority_queue — a high-level wrapper

std::priority_queue is a thin container adaptor:

  • a std::vector by default, plus
  • the heap algorithms above driving it

Default is max-heap:

std::priority_queue<int> pq;

Min-heap form:

std::priority_queue<
    int,
    std::vector<int>,
    std::greater<int>
> min_pq;
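
Any strict weak ordering works as the comparator, not just std::greater. Here is a sketch with a hypothetical Task type (Task, cmp, and q are illustrative names); note that the lambda comparator has to be passed to the constructor:

#include <queue>
#include <string>
#include <vector>

struct Task {               // hypothetical type, for illustration only
    int priority;
    std::string name;
};

int main() {
    // Largest priority value on top, i.e., max-heap behavior over Task::priority.
    auto cmp = [](const Task& a, const Task& b) { return a.priority < b.priority; };

    std::priority_queue<Task, std::vector<Task>, decltype(cmp)> q(cmp);
    q.push({2, "compile"});
    q.push({5, "deploy"});
    q.push({1, "lint"});

    // q.top().name is "deploy", the task with the largest priority value.
}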

A common misconception, corrected: there is no decrease-key

Binary heaps (including std::priority_queue) cannot efficiently support decrease_key, remove(element), or update-in-place operations. For those you need a Fibonacci heap, a pairing heap, an indexed priority queue, or the lazy-deletion workaround sketched below.
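
A common workaround, sketched below under the assumption that every element carries a small integer id, is lazy deletion: instead of decreasing a key in place, push an updated copy and skip stale entries when they surface, as many Dijkstra implementations do:

#include <functional>
#include <queue>
#include <utility>
#include <vector>

int main() {
    using Entry = std::pair<int, int>;              // (key, id)
    std::priority_queue<Entry, std::vector<Entry>,
                        std::greater<Entry>> pq;    // min-heap on key
    std::vector<int> best {10, 7, 3};               // current best key per id

    pq.push({10, 0});
    pq.push({7, 1});
    pq.push({3, 2});

    // "Decrease key" of id 0 from 10 to 1: push a duplicate instead of updating in place.
    best[0] = 1;
    pq.push({1, 0});

    while (!pq.empty()) {
        auto [key, id] = pq.top();
        pq.pop();
        if (key != best[id]) continue;   // stale duplicate, skip it
        // process (key, id): ids surface as 0 (key 1), 2 (key 3), then 1 (key 7)
    }
}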


2.6 When to use heap algorithms directly

Prefer raw *_heap algorithms when you need:

  • In-place heapsort
  • Two-heap median maintenance (min-heap + max-heap; see the sketch below)
  • Access to all elements (e.g., custom decrease-key workaround)
  • A priority queue built on a std::vector with specific memory constraints
  • A priority queue that must be serialize-friendly (raw vector)

std::priority_queue is intentionally restrictive.
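
As an example of the two-heap median technique mentioned above, here is a minimal sketch (class and member names are illustrative): the lower half lives in a max-heap, the upper half in a min-heap, and the two sizes are kept within one of each other.

#include <functional>
#include <iostream>
#include <queue>
#include <vector>

// Running median: lower half in a max-heap, upper half in a min-heap.
class RunningMedian {
    std::priority_queue<int> lower_;                                        // max-heap
    std::priority_queue<int, std::vector<int>, std::greater<int>> upper_;   // min-heap
public:
    void add(int x) {
        if (lower_.empty() || x <= lower_.top()) lower_.push(x);
        else upper_.push(x);
        // Rebalance so the size difference is at most one, with lower_ holding any extra.
        if (lower_.size() > upper_.size() + 1) { upper_.push(lower_.top()); lower_.pop(); }
        else if (upper_.size() > lower_.size()) { lower_.push(upper_.top()); upper_.pop(); }
    }
    double median() const {   // precondition: at least one element has been added
        if (lower_.size() == upper_.size())
            return (lower_.top() + upper_.top()) / 2.0;
        return lower_.top();
    }
};

int main() {
    RunningMedian rm;
    for (int x : {5, 2, 8, 1}) rm.add(x);
    std::cout << rm.median() << '\n';   // 3.5, the median of {1, 2, 5, 8}
}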


2.7 Complete example

#include <vector>
#include <algorithm>
#include <functional>
#include <iostream>

int main() {
    std::vector<int> v {3,1,4,1,5,9,2,6,5,3,5};

    // 1. Build max heap
    std::make_heap(v.begin(), v.end());
    std::cout << "Max-heap root: " << v.front() << '\n'; // 9

    // 2. Push new element
    v.push_back(99);
    std::push_heap(v.begin(), v.end());
    std::cout << "After push: " << v.front() << '\n';     // 99

    // 3. Pop max element
    std::pop_heap(v.begin(), v.end()); // moves max to v.back()
    std::cout << "Popped: " << v.back() << '\n';
    v.pop_back();

    // 4. Min-heap
    std::make_heap(v.begin(), v.end(), std::greater<>{});
    std::cout << "Min-heap root: " << v.front() << '\n';

    // 5. Heapsort: with std::greater, sort_heap produces a descending range
    std::sort_heap(v.begin(), v.end(), std::greater<>{});
}

3. The Memory Heap (dynamic allocation)

3.1 What it is — and what it is not

The memory heap (free store):

  • is a runtime-managed area used for variable-size allocations via new, delete, malloc, free
  • has no relation whatsoever to binary heaps
  • does not maintain ordering, parent/child relationships, or tree invariants

It’s essentially a sophisticated bookkeeping system on top of virtual memory.


3.2 Why memory allocation is fast: size classes, bins, and caches

Modern allocators avoid scanning long free lists on the hot path. They rely on:

1. Size classes (bins)

Allocation requests are rounded up to predefined size buckets (e.g., 32 B, 48 B, 64 B).

2. Free lists

Each size class maintains a linked list of freed blocks.

3. Thread-local caches

Fast per-thread allocation with no locking.

4. Page-level backing

When a bin runs empty, the allocator grabs a new slab/page from the OS, subdivides it, and populates a free list.

Typical pipeline for malloc(42):

  1. Round up to the nearest size class → 48 bytes
  2. Check thread-local 48-byte free list
  3. If available → pop one → O(1)
  4. If empty → refill from a page → amortized fast
  5. If page empty → request new pages from OS

free(ptr)

Push block back into its size class free list → O(1).
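
To make the pipeline concrete, here is a toy, single-threaded sketch of one size-class bin; the block and slab sizes are invented for illustration, and real allocators add thread caches, block headers, alignment, error handling, and a way to return memory to the OS:

#include <cstddef>
#include <cstdlib>
#include <vector>

// Toy free list for one size class (here, 48-byte blocks).
class SizeClassBin {
    static constexpr std::size_t kBlockSize = 48;
    static constexpr std::size_t kBlocksPerSlab = 64;
    std::vector<void*> free_list_;   // freed blocks, ready for reuse
    std::vector<void*> slabs_;       // backing slabs, released in the destructor

    void refill() {
        // Slow path: grab one slab and carve it into kBlocksPerSlab blocks.
        char* slab = static_cast<char*>(std::malloc(kBlockSize * kBlocksPerSlab));
        slabs_.push_back(slab);
        for (std::size_t i = 0; i < kBlocksPerSlab; ++i)
            free_list_.push_back(slab + i * kBlockSize);
    }

public:
    void* allocate() {
        if (free_list_.empty()) refill();   // rare: refill from a fresh slab
        void* p = free_list_.back();        // common: pop a free block, O(1)
        free_list_.pop_back();
        return p;
    }
    void deallocate(void* p) { free_list_.push_back(p); }   // O(1) push-back

    ~SizeClassBin() { for (void* s : slabs_) std::free(s); }
};

int main() {
    SizeClassBin bin;
    void* a = bin.allocate();
    void* b = bin.allocate();
    bin.deallocate(a);
    void* c = bin.allocate();   // typically reuses the block just freed
    bin.deallocate(b);
    bin.deallocate(c);
}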


3.3 Well-known allocators

| Allocator | Used by | Highlights |
| --- | --- | --- |
| ptmalloc2 | glibc on Linux | Arenas, fast bins |
| jemalloc | FreeBSD, Firefox, Redis | Low fragmentation, scalable |
| tcmalloc | Google/Chromium | Thread-local caches, high throughput |
| mimalloc | Microsoft; opt-in global allocator in Rust projects | Very predictable performance |
| rpmalloc / Hoard | Servers, HPC | Highly scalable multi-threading |

Important note

Swapping the allocator can dramatically change the performance of C++ programs that use new/delete heavily.
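
Before swapping allocators, it helps to confirm that the program is actually allocation-heavy. A minimal, single-threaded sketch that counts calls to the global operator new (not production code, just a rough gauge):

#include <cstddef>
#include <cstdio>
#include <cstdlib>
#include <new>

// Count every operator new call. Not thread-safe and deliberately simplistic.
static std::size_t g_news = 0;

void* operator new(std::size_t size) {
    ++g_news;
    if (void* p = std::malloc(size)) return p;
    throw std::bad_alloc{};
}
void operator delete(void* p) noexcept { std::free(p); }
void operator delete(void* p, std::size_t) noexcept { std::free(p); }

int main() {
    for (int i = 0; i < 1000; ++i) delete new int(i);
    std::printf("operator new calls: %zu\n", g_news);   // at least 1000
}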


3.4 Fragmentation explained correctly

Internal fragmentation

Wasted space inside allocated blocks caused by rounding the request up to a size class. Typically small.
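
As a quick worked example with a hypothetical size-class table: malloc(42) served from a 48-byte class wastes 6 bytes, i.e., 12.5% of the block.

#include <cstddef>
#include <cstdio>

int main() {
    // Hypothetical size classes, purely for illustration.
    const std::size_t classes[] = {16, 32, 48, 64, 96, 128};
    for (std::size_t request : {std::size_t{42}, std::size_t{17}, std::size_t{64}}) {
        for (std::size_t c : classes) {
            if (c >= request) {   // first class large enough to hold the request
                std::printf("request %3zu -> class %3zu, waste %2zu bytes (%.1f%%)\n",
                            request, c, c - request, 100.0 * (c - request) / c);
                break;
            }
        }
    }
}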

External fragmentation

Pieces of free memory too small to satisfy large allocations. Good allocators greatly reduce this via:

  • slab allocation
  • segregated fits
  • page recycling
  • coalescing (merging adjacent free blocks)

4. Side-by-side comparison

| Feature | Binary heap (data structure) | Memory heap (dynamic allocation) |
| --- | --- | --- |
| Purpose | Fast extract-min/max priority queue | Runtime memory allocation |
| Based on | Complete binary tree in an array | Virtual memory + allocator metadata |
| Ordering | Required (min/max) | None |
| Typical ops | push_heap, pop_heap, make_heap | new, delete, malloc, free |
| Insert complexity | O(log n) | O(1) amortized |
| Delete/extract | O(log n) | O(1) amortized |
| Root access | O(1) | Not applicable |
| Lives in | C++ STL (<algorithm> / <queue>) | Runtime library / libc |
| Shared concepts | None | None |

5. Quick cheat-sheet to never confuse them

| Snippet or concern | Which heap? |
| --- | --- |
| std::priority_queue<int> pq; | Binary heap |
| std::make_heap(v.begin(), v.end()) | Binary heap |
| int* p = new int[1000]; | Memory heap |
| malloc(64) / free(ptr) | Memory heap |
| Concern: “largest element fast” | Binary heap |
| Concern: “memory leak” | Memory heap |

6. Conclusion

  • The binary heap is an elegant O(log n) priority queue stored in a contiguous container. C++ provides both low-level heap algorithms and std::priority_queue.
  • The memory heap is a sophisticated allocator using bins, free lists, arenas, and thread caches to provide near-constant-time allocation.
  • The two share nothing except a misleading name dating back to early Lisp/ALGOL memory terminology.

You now have a precise, corrected, and expert-level understanding of both meanings of “heap” in C++.


