# Vagueness around Big O Notation

Let's look at the language we use to talk about how long an algorithm takes to run.

It's how we compare the efficiency of different approaches to a problem. It's something I never managed to motivate myself to learn, despite knowing it comes up in every single interview. 🙆 The world's top tech firms test candidates' knowledge of algorithms and **how fast those algorithms run**.

## What the hell is Big O notation, and why do we need it?

It's a way of measuring the efficiency of an algorithm and how well it scales as the size of the dataset grows. Imagine you have a list of 10 objects and you want to sort them in order. There's a whole bunch of algorithms you can use to make that happen, **but not all algorithms are built equal**. Some are quicker than others, but more importantly, the speed of an algorithm can vary depending on how many items it's dealing with. Big O is a way of measuring how an algorithm scales.
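To make that concrete, here's a minimal sketch in Python (the function names `has_duplicates_quadratic` and `has_duplicates_linear` are my own, not from any library): two algorithms that answer the same question but scale very differently as the list grows.

```python
def has_duplicates_quadratic(items):
    # Compares every pair of items: roughly n * n comparisons, so O(n^2).
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicates_linear(items):
    # Remembers items already seen in a set: one pass over the list,
    # with O(1) average-time membership checks, so O(n) overall.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both functions give the same answer; the second just does far less work as the list gets longer.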

With Big O notation we express the runtime in terms of *how quickly it grows relative to the input, as the input gets arbitrarily large*. 🙇
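One way to see "growth relative to the input" is to time a deliberately O(n²) routine while doubling the input size. This is just an illustrative sketch (the `count_pairs` helper is hypothetical, and exact timings will vary by machine):

```python
import timeit


def count_pairs(items):
    # A deliberately O(n^2) routine: it touches every pair of items.
    total = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            total += 1
    return total


# Doubling n roughly quadruples the runtime of an O(n^2) routine;
# for an O(n) routine, it would only roughly double.
for n in (500, 1000, 2000):
    data = list(range(n))
    seconds = timeit.timeit(lambda: count_pairs(data), number=1)
    print(f"n={n:>4}: {seconds:.4f}s")
```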

Below is a list of the Big O complexities:

**O(1) / Constant Complexity:** This means that regardless of the size of the dataset, the algorithm will always take the same amount of time to run.
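Here's a rough sketch of constant-time operations in Python (the helper names are mine): indexing a list and looking up a key in a dict both take the same time whether the collection holds ten items or ten million.

```python
def get_first(items):
    # List indexing is O(1): one step, regardless of list length.
    return items[0]


def lookup(prices, name):
    # Dict lookups are O(1) on average, whether the dict holds
    # 10 entries or 10 million.
    return prices[name]


prices = {"apple": 1.50, "banana": 0.75}
print(get_first([3, 1, 2]))      # 3
print(lookup(prices, "banana"))  # 0.75, same cost at any dict size
```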