What is Big-O
Big-O notation describes how an algorithm's resource usage (time or space) grows as the input size grows. It gives an upper bound on that growth, conventionally quoted for the worst case, so you can compare algorithms without benchmarking them on specific hardware.
How it works
Big-O focuses on the dominant term and drops constants. If an algorithm takes 3n^2 + 5n + 100 operations for an input of size n, its Big-O complexity is O(n^2). The n^2 term dominates as n gets large, so the constants and smaller terms are irrelevant for understanding scaling behavior.
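A quick numerical sketch of why the dominant term wins, using the 3n^2 + 5n + 100 example from above:

```python
# Exact operation count from the example: 3n^2 + 5n + 100.
def exact_ops(n):
    return 3 * n**2 + 5 * n + 100

# Compare the exact count against the bare n^2 term as n grows.
for n in [10, 100, 10_000]:
    ratio = exact_ops(n) / n**2
    print(n, round(ratio, 4))

# The ratio approaches the constant 3 as n grows: the 5n and 100 terms
# vanish relative to n^2, so the function scales like n^2, i.e. O(n^2).
```

Running this shows the ratio settling toward 3, which is exactly the constant factor that Big-O discards.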
Common complexity classes, from fastest to slowest:
| Big-O | Name | Example | ~ops (n = 10) | ~ops (n = 1M) |
|---|---|---|---|---|
| O(1) | Constant | Hash map lookup | 1 | 1 |
| O(log n) | Logarithmic | Binary search | 3 | 20 |
| O(n) | Linear | Scanning an array | 10 | 1,000,000 |
| O(n log n) | Linearithmic | Merge sort, heap sort | 33 | 20,000,000 |
| O(n^2) | Quadratic | Nested loop comparison | 100 | 1,000,000,000,000 |
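The O(log n) row is easy to verify directly. Here is a minimal binary search that counts how many probes it makes on a million sorted items:

```python
# Minimal binary search over a sorted list, counting probes to
# illustrate the O(log n) row of the table.
def binary_search(sorted_items, target):
    lo, hi, probes = 0, len(sorted_items) - 1, 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid, probes
        if sorted_items[mid] < target:
            lo = mid + 1        # discard the lower half
        else:
            hi = mid - 1        # discard the upper half
    return -1, probes

items = list(range(1_000_000))
index, probes = binary_search(items, 999_999)
print(probes)  # about 20 probes, matching log2(1,000,000)
```

Each probe halves the remaining search range, so a million items collapse to one in roughly log2(1,000,000) ≈ 20 steps.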
The jump between O(n) and O(n^2) is dramatic. An O(n) algorithm on a million items does a million operations. An O(n^2) algorithm does a trillion. This is why Big-O matters — it tells you which algorithms will survive real-world data sizes and which will collapse.
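To make the jump concrete, here is a hypothetical task (checking a list for duplicates) solved two ways, with comparison counters. The function names are illustrative, not from any library:

```python
# O(n^2): compare every pair of elements with nested loops.
def has_dup_quadratic(items):
    checks = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            checks += 1
            if items[i] == items[j]:
                return True, checks
    return False, checks

# O(n): one pass with a hash set (O(1) average-case membership test).
def has_dup_linear(items):
    seen, checks = set(), 0
    for x in items:
        checks += 1
        if x in seen:
            return True, checks
        seen.add(x)
    return False, checks

data = list(range(1000))          # no duplicates, so both scan everything
_, quad = has_dup_quadratic(data)
_, lin = has_dup_linear(data)
print(quad, lin)  # 499500 vs 1000
```

At a thousand items the gap is already three orders of magnitude; at a million it would be six.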
Big-O also applies to space. An algorithm might run in O(n log n) time but require O(n) extra memory. Both dimensions matter when choosing an approach.
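Merge sort is the classic example of this trade-off: O(n log n) time, but the merge step builds an O(n) auxiliary list. A sketch of a simple (not in-place) version:

```python
# Simple top-down merge sort: O(n log n) time, O(n) extra space,
# because each merge builds a new list rather than sorting in place.
def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    merged = []                       # the O(n) auxiliary memory
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])           # append whichever half remains
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```

An in-place algorithm like heap sort avoids the extra memory but is typically slower in practice, which is the kind of trade-off the time/space distinction captures.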
Important caveats: Big-O ignores constants, so an O(n) algorithm with a huge constant factor can be slower than an O(n log n) algorithm for practical input sizes. It also describes worst-case behavior by convention — average-case and best-case can differ significantly (quicksort is O(n^2) worst case but O(n log n) average case).
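The quicksort caveat can be observed directly. A sketch using a deliberately naive first-element pivot, counting pivot comparisons (the pivot strategy and counter are illustrative choices, not how production sorts are written):

```python
import random

# Quicksort with a naive first-element pivot, counting comparisons.
# Already-sorted input degenerates to O(n^2); shuffled input stays
# near the O(n log n) average.
def quicksort(items, counter):
    if len(items) <= 1:
        return items
    pivot, smaller, larger = items[0], [], []
    for x in items[1:]:
        counter[0] += 1               # one pivot comparison per element
        (smaller if x < pivot else larger).append(x)
    return quicksort(smaller, counter) + [pivot] + quicksort(larger, counter)

n = 500
worst, avg = [0], [0]
quicksort(list(range(n)), worst)              # sorted input: worst case
quicksort(random.sample(range(n), n), avg)    # shuffled input: average case
print(worst[0], avg[0])  # ~n^2/2 comparisons vs roughly n log n
```

On sorted input every pivot is the minimum, so each recursive call shrinks the problem by only one element; that is where the quadratic behavior comes from.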
Why it matters
Big-O is the universal language for discussing algorithm performance. When documentation says a hash map has O(1) lookup, or that a particular sort is O(n log n), you immediately know how it scales. It's the first question to ask about any algorithm: "What's the Big-O?" Without it, you're guessing whether your code will handle tomorrow's data.
See How Complexity Works for a deeper treatment of time and space analysis.