# Asymptotics

• When deciding how fast or slow algorithms are, we do not want to rely solely on measurements in seconds, since those are machine-dependent (you can just buy a new computer next year and your algorithm will execute twice as fast!) and variable (the running time will likely differ every time you execute the algorithm). Instead, we use Big O, Big Theta and Big Omega notations ($O$, $\Theta$ and $\Omega$) to describe the order of growth of algorithms.

| Name | Notation | Informal meaning | Example members of the set |
| --- | --- | --- | --- |
| Big O | $f(n) \in O(g(n))$ | Order of growth of $f(n)$ is less than or equal to $g(n)$ | $O(N^2)$: $\frac{N^2}{2}$, $2N^2 + 1$, $\log(N)$ |
| Big Omega | $f(n) \in \Omega(g(n))$ | Order of growth of $f(n)$ is greater than or equal to $g(n)$ | $\Omega(N^2)$: $\frac{N^2}{2}$, $2N^2 + 1$, $e^N$ |
| Big Theta | $f(n) \in \Theta(g(n))$ | Order of growth of $f(n)$ is $g(n)$ | $\Theta(N^2)$: $\frac{N^2}{2}$, $2N^2 + 1$, $N^2 + 183N + 5$ |
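One way to build intuition for these definitions is to look at the ratio $f(n)/g(n)$ as $n$ grows: for $\Theta$ it settles near a positive constant, for $O$ (but not $\Theta$) it tends to $0$, and for $\Omega$ (but not $\Theta$) it grows without bound. Below is a small illustrative sketch (the helper `ratio` and the sample values of `n` are our own choices, not from the notes):

```python
import math

def ratio(f, g, ns):
    """Return f(n) / g(n) for each n, to observe the limiting behavior."""
    return [f(n) / g(n) for n in ns]

g = lambda n: n ** 2          # reference order of growth: N^2
ns = [2, 5, 10, 20]           # modest values, so e^N stays representable

# 2N^2 + 1 is Theta(N^2): the ratio settles near the positive constant 2.
print(ratio(lambda n: 2 * n ** 2 + 1, g, ns))

# log(N) is O(N^2) but not Theta(N^2): the ratio tends to 0.
print(ratio(lambda n: math.log(n), g, ns))

# e^N is Omega(N^2) but not Theta(N^2): the ratio grows without bound.
print(ratio(lambda n: math.exp(n), g, ns))
```

This is why constant factors and lower-order terms ($\frac{1}{2}$, $+1$, $+183N$) never change which set a function belongs to: they only shift the constant the ratio converges to.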