Asymptotics

  • When deciding how fast or slow algorithms are, we do not want to rely solely on measurements in seconds, since those are machine-dependent (you can just buy a new computer next year and your algorithm will execute twice as fast!) and variable (the running time will most likely differ every time you execute the algorithm). Instead, we use Big O, Big Theta, and Big Omega notations ($O$, $\Theta$, and $\Omega$) to describe the order of growth of algorithms.

    | Name | Big O | Big Omega | Big Theta |
    | --- | --- | --- | --- |
    | Notation | $f(n) \in O(g(n))$ | $f(n) \in \Omega(g(n))$ | $f(n) \in \Theta(g(n))$ |
    | Informal meaning | Order of growth of $f(n)$ is less than or equal to $g(n)$ | Order of growth of $f(n)$ is greater than or equal to $g(n)$ | Order of growth of $f(n)$ is $g(n)$ |
    | Example family | $O(N^2)$ | $\Omega(N^2)$ | $\Theta(N^2)$ |
    | Family members | $\frac{N^2}{2}$, $2N^2 + 1$, $\log(N)$ | $\frac{N^2}{2}$, $2N^2 + 1$, $e^N$ | $\frac{N^2}{2}$, $2N^2 + 1$, $N^2 + 183N + 5$ |
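As a sketch of why we count growth rather than seconds, the following Python snippet (an illustrative example, not from the original text) counts the exact number of inner-loop iterations of a doubly nested loop. The count is $\frac{N(N-1)}{2}$, which belongs to $\Theta(N^2)$: doubling $N$ roughly quadruples the work, regardless of what machine runs the code.

```python
def count_pairs(n):
    """Count iterations of a doubly nested loop over n items.

    The body runs n*(n-1)/2 times -- a Theta(N^2) quantity,
    bounded above and below by constant multiples of N^2.
    """
    ops = 0
    for i in range(n):
        for j in range(i + 1, n):
            ops += 1
    return ops

# Doubling n roughly quadruples the count: the signature of N^2 growth.
for n in (100, 200, 400):
    print(n, count_pairs(n))
```

Running this prints counts whose successive ratios approach 4, which is what the $\Theta(N^2)$ classification predicts, no stopwatch required.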
