A quick reference on Big-O notation

2022 September 1

Big-O notation can help you understand how an algorithm's running time grows with the size of its input, and also its memory requirements.

Explaining the notation

  • O(1) - CONSTANT TIME: it always takes the same time to run, no matter how many elements there are. E.g.:
def do_something():
    print("Hey there!")
  • O(n) - LINEAR TIME: the running time grows proportionally to the number of elements (n). E.g.:
from typing import List

def do_something(array: List):
    for element in array:
        print(element)
  • O(log n) - LOGARITHMIC TIME: at each iteration, you cut the remaining elements down by a constant factor (e.g., in half), so the total work is far less than “n”. E.g.: binary search on an ordered array of numbers.
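A minimal binary search sketch (assuming a sorted list of numbers), to show how the search range halves at every step:

```python
def binary_search(sorted_array, target):
    low, high = 0, len(sorted_array) - 1
    while low <= high:
        mid = (low + high) // 2  # each step halves the remaining range
        if sorted_array[mid] == target:
            return mid
        elif sorted_array[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # not found
```

Halving the range at every comparison is why at most ~log₂(n) steps are needed.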

  • O(n log n): combines linear time with logarithmic time. It happens when you perform a logarithmic-time amount of work for each of the n elements - efficient sorting algorithms, such as merge sort, are the classic case.
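Merge sort illustrates this: the list is split in half about log n times, and each level of splitting does a linear-time merge. A rough sketch:

```python
def merge_sort(array):
    if len(array) <= 1:
        return array
    mid = len(array) // 2
    left = merge_sort(array[:mid])   # splitting happens ~log n levels deep
    right = merge_sort(array[mid:])
    merged = []                      # merging is linear work per level
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```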

  • O(n²) - QUADRATIC TIME: it happens when, for every element of one array, you must check every element of another (or the same) array - i.e., an operation on each combination. E.g.:

def find_intersection(fruits_i_like: List, fruits_i_have: List):
    can_eat = []
    for liked in fruits_i_like:
        for have in fruits_i_have:
            if have == liked:
                can_eat.append(liked)
    return can_eat

Notice that you have a for inside a for - that is why it is called quadratic time (n²). If you have more than 2 nested loops, it may also be called “POLYNOMIAL TIME”.
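For instance, a hypothetical sketch with three nested loops over the same array runs in O(n³):

```python
def count_triples_summing_to_zero(numbers):
    count = 0
    for a in numbers:          # O(n)
        for b in numbers:      # O(n) for each a -> O(n²)
            for c in numbers:  # O(n) for each pair -> O(n³)
                if a + b + c == 0:
                    count += 1
    return count
```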

  • O(2ⁿ) - EXPONENTIAL TIME: the execution time doubles for each item added to the input. These are typically “brute force” algorithms. E.g. getting all subsets of a set; similarly, trying every possible “n digit PIN” on a smartphone lock screen takes 10ⁿ attempts - 10⁴ for a 4-digit PIN.
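A sketch of generating all subsets (the power set), which produces 2ⁿ results for n elements:

```python
def all_subsets(elements):
    subsets = [[]]
    for element in elements:
        # each new element doubles the number of subsets:
        # every existing subset, with and without the element
        subsets += [subset + [element] for subset in subsets]
    return subsets
```

Since the output itself has 2ⁿ entries, no algorithm for this problem can do better than exponential time.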

  • O(n!) - FACTORIAL TIME: the time grows as the factorial of the input size. E.g. generating every possible ordering (permutation) of the elements:

from itertools import permutations

def factorial_algorithm(elements):
    return list(permutations(elements))  # n! orderings of n elements

Ranking the execution times

           O(1)    - \o/ winner!

         O(log n)

           O(n)

        O(n log n) - border line of the acceptable

           O(n²)   - here things start to get ugly o.O

           O(2ⁿ)

           O(n!)   - :'( noooooo

How to calculate the cost of an algorithm

IMPORTANT: below we always take into account the worst-case scenario.

  • n: the number of elements in the array

  • definitions, assignments, returns: O(1)

  • for loops: O(n)

  • The cost formula is:

T = a·n + b

    where:
      a -> time spent on one iteration (so small it can be dropped from the total time)
      n -> total elements in the array
      b -> constant coefficient: pretty fast to run (insignificant time)
  • E.g. for a for loop, we can simplify that to T = n, i.e., O(n)
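Putting it together, here is a hypothetical line-by-line costing of a simple function using the rules above:

```python
def sum_array(array):         # n = len(array)
    total = 0                 # O(1): assignment
    for element in array:     # loop body runs n times
        total += element      # O(1) per iteration -> a·n total
    return total              # O(1): return
# T = a·n + b; dropping the constants a and b gives O(n)
```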