How does an algorithm scale?
Algorithmic complexity analysis describes the growth rate of an algorithm's cost as the input size approaches infinity.
Before we can analyze an algorithm, we need to decide which operations cost time:
primitive operations like comparisons, assignments, arithmetic operations, and array accesses, but also more abstract operations like function calls, database queries, etc.
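As a sketch (the per-line operation counts are illustrative, not a formal cost model), consider tallying the primitive operations in a simple summation:

```python
def sum_list(xs):
    total = 0          # 1 assignment
    for x in xs:       # n loop iterations (1 advance/check each)
        total += x     # 1 addition + 1 assignment per iteration
    return total       # 1 return

# Roughly 2n + a small constant primitive operations in total,
# so the cost function grows linearly with the input size n.
```

The exact constants depend on the machine and language, which is precisely why asymptotic notation ignores them.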
Asymptotic notation
We compare two functions f(n) and g(n):

- f(n) … the time complexity / cost function of an algorithm with input size n
- g(n) … a reference function for comparison

Asymptotic notation describes how f(n) relates to g(n) asymptotically (as n → ∞).
Big-O — Upper bound / Worst Case
f(n) = O(g(n)): f(n) grows at most as fast as g(n) up to a constant factor c > 0 beyond some threshold n₀.
f(n) ≤ c·g(n) for all n ≥ n₀, i.e. f(n) is bounded above by g(n) for large enough n.
E.g.: Binary search is O(log n) – even in the worst case, you don't need more than roughly log₂ n comparisons.
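To see the logarithmic bound concretely, here is a minimal binary search with a comparison counter bolted on for illustration:

```python
def binary_search(xs, target):
    """Return (index, comparisons) for target in sorted xs, or (-1, comparisons)."""
    lo, hi, comparisons = 0, len(xs) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if xs[mid] == target:
            return mid, comparisons
        elif xs[mid] < target:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1, comparisons

# Each iteration halves the remaining range, so on n elements the loop
# runs at most about log2(n) + 1 times.
```

On a sorted array of 1024 elements, the counter should never exceed about log₂(1024) + 1 = 11, regardless of the target.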
Big-Omega — Lower bound / Best Case
f(n) = Ω(g(n)): f(n) grows at least as fast as g(n) up to a constant factor c > 0 beyond some threshold n₀.
f(n) ≥ c·g(n) for all n ≥ n₀, i.e. f(n) is bounded below by g(n) for large enough n.
E.g.: Any comparison-based sorting algorithm is Ω(n log n). You need at least on the order of n log n comparisons.¹
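The information-theoretic argument behind this bound can be checked numerically: log₂(n!) is the number of bits needed to single out one of the n! possible orderings, and it grows on the order of n log n (a quick sketch):

```python
import math

def bits_to_identify_permutation(n):
    # log2(n!) = number of yes/no answers needed to pin down
    # one ordering among n! possibilities; each comparison
    # yields at most one such answer.
    return math.log2(math.factorial(n))

for n in (4, 16, 64, 256):
    print(n, round(bits_to_identify_permutation(n)), round(n * math.log2(n)))
```

By Stirling's approximation, log₂(n!) ≈ n log₂ n − n/ln 2, so both columns grow at the same rate up to lower-order terms.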
Big-Theta — Tight bound
f(n) = Θ(g(n)): f(n) grows at the same rate as g(n) up to constant factors beyond some threshold n₀.
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀, i.e. f(n) is bounded both above and below by g(n) for large enough n.
E.g.: Merge sort is Θ(n log n) – in both the best and worst case, you need on the order of n log n comparisons.
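A merge sort instrumented with a comparison counter (the counter is added purely for illustration) makes the n log n behavior visible:

```python
def merge_sort(xs):
    """Sort xs, returning (sorted list, number of comparisons)."""
    if len(xs) <= 1:
        return list(xs), 0
    mid = len(xs) // 2
    left, cl = merge_sort(xs[:mid])
    right, cr = merge_sort(xs[mid:])
    merged, comparisons = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1              # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comparisons

# There are ~log2(n) levels of merging, and each level does at most n
# comparisons, so the total stays within about n * log2(n).
```

For n = 128 the count stays below 128 · 7 = 896, and no input can push it past that order of growth.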
Little-o — Strict upper bound
f(n) = o(g(n)): f(n) grows strictly slower than g(n), i.e. for every constant factor c > 0, beyond some threshold n₀, f(n) is less than c·g(n).
f(n)/g(n) → 0 as n → ∞, i.e. f(n) becomes insignificant compared to g(n) as n approaches infinity.
E.g.: log n is o(n) – logarithmic growth is strictly slower than linear growth.
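The limit definition can be observed directly; as a sketch, take f(n) = log₂ n and g(n) = n and watch the ratio shrink:

```python
import math

# f(n)/g(n) = log2(n)/n tends toward 0 as n grows,
# which is exactly what log n = o(n) asserts.
for n in (10, 1_000, 100_000, 10_000_000):
    print(n, math.log2(n) / n)
```

The same computation read upside down (n / log₂ n growing without bound) illustrates the little-omega case below.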
Little omega — Strict lower bound
f(n) = ω(g(n)): f(n) grows strictly faster than g(n), i.e. for every constant factor c > 0, beyond some threshold n₀, f(n) is greater than c·g(n).
f(n)/g(n) → ∞ as n → ∞, i.e. f(n) dominates g(n) as n approaches infinity.
E.g.: n is ω(log n) – linear growth is strictly faster than logarithmic growth.
Footnotes
1. You need log₂(n!) ≈ n log₂ n bits of information to specify which of the n! permutations you have, and each comparison gives you at most 1 bit. Radix sort, for example, can beat this by exploiting structure in the data itself.