# So what is O(log n) Time Complexity Anyway?

---

For all the self-taught and boot camp developers like me who aren’t as math savvy.

# What is Time Complexity?

Time complexity is a fairly simple concept to grasp, but it becomes more complex along with the algorithms it exists to categorize. At its most basic, time complexity describes how the time it takes for an algorithm to execute (*O*) grows with the size of its input (*n*).

Other time complexities are therefore pretty straightforward. *O(1)*, for example, means that no matter the input, the algorithm always takes the same amount of time to execute. *O(n)* means that as the input gets bigger, the time it takes gets bigger at a constant rate (linear growth). *O(n²)* means that as the input gets bigger, the time taken grows quadratically: doubling the input quadruples the time.
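To make those categories concrete, here's a minimal sketch with one toy function per growth rate (these function names are just illustrative, not standard terms):

```python
def constant_time(items):
    # O(1): one operation, no matter how long the list is.
    return items[0]

def linear_time(items):
    # O(n): touches every element exactly once.
    total = 0
    for x in items:
        total += x
    return total

def quadratic_time(items):
    # O(n^2): looks at every pair of elements.
    pairs = 0
    for a in items:
        for b in items:
            pairs += 1
    return pairs
```

Doubling the list leaves `constant_time` unchanged, doubles the work in `linear_time`, and quadruples the work in `quadratic_time`.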

Those are the main time complexities you'll run into, but some algorithms are a bit more complicated.

# What is a logarithm?

So now that we know that *n* is the input of the algorithm, what is a logarithm in the first place? Well, the definition of a logarithm is:

> A quantity representing the power to which a fixed number (the base) must be raised to produce a given number.

So what does that mean? Well let’s take this example:

*2^x = n*

First, let’s define terms. *2* is the fixed number, or base. *x* is the power that the base is being raised to, and *n* is the given number that the base and the power produce.

So in order to find the logarithm of the given number (*n*), we need the base and the power it’s raised to. In this case, the base is *2*, and the given number is *n*. Therefore we’re trying to find the logarithm of *n* at base *2*, which we write as:

`log_2 n`

And then, because the logarithm of *n* represents the power that the base needs to be raised to:

`x = log_2 n`

Which leaves us with:

`2^x = n ⟺ x = log_2 n`

To summarize, *2* to the power of *x* is equal to *n*; therefore, the logarithm of *n* with the base of *2* is *x*. It’s a little hard to wrap your mind around at first, but once you’ve got it, it’ll stick.
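Python’s `math` module can confirm the relationship, using 2 to the power of 4 as a concrete example:

```python
import math

# 2 must be raised to the power 4 to produce 16...
print(2 ** 4)         # → 16
# ...so the logarithm of 16 at base 2 is 4.
print(math.log2(16))  # → 4.0
```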

As a side note, when we write `log_2 n`, it means a logarithm at the base of *2*. However, the base isn’t necessary when describing time complexity, since it doesn’t matter as much when comparing one algorithm to another. Because of this, we’ll just write `log n` instead.

# What about in Code?

So, armed with the knowledge of what a logarithm actually is, how do we make use of it when writing code? Well, one of the most common *O(log n)* algorithms is binary search: a method of finding a given value in a sorted array by repeatedly dividing the array in half until the value is found. This is what a binary search algorithm might look like in Python:


```python
def binary_search(list1, low, high, n):
    if low <= high:
        mid = (low + high) // 2
        if list1[mid] == n:
            return mid
        elif list1[mid] > n:
            return binary_search(list1, low, mid - 1, n)
        else:
            return binary_search(list1, mid + 1, high, n)
    else:
        return -1
```

Now let’s imagine that we give this algorithm a list of the values *[1, 3, 5, 8, 12, 13, 15, 16, 18, 20, 22, 30, 40, 50, 55, 67]*. We’re going to try to pick 12 out of this list.

[**1, 3, 5, 8, 12, 13, 15, 16, 18, 20, 22, 30, 40, 50, 55, 67**]

So first we’ll start at the center of the array, in this case index 7, which has the value 16. 16 is greater than 12, so we’ll forget about 16 and everything after it and focus on indexes 0–6.

[**1, 3, 5, 8, 12, 13, 15**, 16, 18, 20, 22, 30, 40, 50, 55, 67]

The pivot of the array is index 3, which has the value 8. Since 8 is less than 12, all of the numbers from index 0–3 should be ignored, and we’re left with indexes 4–6.

[1, 3, 5, 8, **12, 13, 15**, 16, 18, 20, 22, 30, 40, 50, 55, 67]

The new pivot of the array is index 5, with the value 13. 13 is greater than 12; therefore we’ll search index 4.

[1, 3, 5, 8, **12**, 13, 15, 16, 18, 20, 22, 30, 40, 50, 55, 67]

Index 4 contains the value we were looking for, so we’ve completed the algorithm.
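The trace above can be confirmed by running the search on our list (the function is repeated here so the snippet runs on its own); it finds 12 at index 4:

```python
def binary_search(list1, low, high, n):
    # Same recursive binary search as in the article.
    if low <= high:
        mid = (low + high) // 2
        if list1[mid] == n:
            return mid
        elif list1[mid] > n:
            return binary_search(list1, low, mid - 1, n)
        else:
            return binary_search(list1, mid + 1, high, n)
    else:
        return -1

values = [1, 3, 5, 8, 12, 13, 15, 16, 18, 20, 22, 30, 40, 50, 55, 67]
print(binary_search(values, 0, len(values) - 1, 12))  # → 4
```

A value that isn’t in the list, such as 99, falls through every halving and returns -1.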

This whole process took 4 iterations for an array of 16 elements. If you’ve been paying attention, you might see where I’m going with this. Here’s how this algorithm can be shown algebraically:

`16 * (1/2)^4 = 1`

Because we have *16* elements, each iteration divides that number in half, or multiplies it by *1/2*. This happens a total of *4* times in order to be left with *1* element at the end. So how do we get time complexity out of this?
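That arithmetic can be sketched as a loop, halving 16 until a single element remains and counting the steps:

```python
remaining = 16
steps = 0
while remaining > 1:
    remaining //= 2  # each iteration discards half the elements
    steps += 1
print(steps)  # → 4
```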

Well, *n* is the input given to the algorithm, which in this case was an array of 16 elements. So let’s replace 16 with *n*.

`n * (1/2)^4 = 1`

Then, *x* represented the power that the base was raised to. The exponent here is 4, so let’s replace 4 with *x*.

`n * (1/2)^x = 1`

We can rearrange this a bit by removing the parentheses, since *(1/2)^x* is the same as *1/2^x*.

`n * 1/2^x = 1`

And *n* multiplied by *1* is just *n*.

`n/2^x = 1`

Multiply both sides by *2^x*.

`2^x * n/2^x = 2^x`

This results in:

`n = 2^x`

If you remember from before, the definition of a logarithm is:

> A quantity representing the power to which a fixed number (the base) must be raised to produce a given number.

So using our previous logic, we can conclude that:

`2^x = n ⟺ x = log_2 n`

Therefore, we can show that this binary search algorithm runs in *O(log_2 n)* time. However, because the base of the logarithm doesn’t matter as much when comparing algorithms, we write this as *O(log n)* instead.

# Is There an Easier Way to Tell?

That’s all fine and dandy, but the average developer doesn’t want to meticulously run through that derivation every time. Fortunately, *O(log n)* algorithms share a common behavior that’s easier to pick up on.

Binary search, like many other *O(log n)* algorithms, successively divides its input. This means the input shrinks as the algorithm chugs along, as we saw with binary search dividing the list in half on each iteration. Successive division of the input is a good indicator that you’re working with logarithmic code.
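As a rough empirical check on that behavior, counting how many halvings it takes to shrink an input to one element tracks `log_2 n` (within rounding) even as *n* gets huge:

```python
import math

def halving_steps(n):
    # Count how many times n can be halved before one element remains.
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

for size in (16, 1024, 1_000_000):
    print(size, halving_steps(size), math.floor(math.log2(size)))
```

A million elements take only about 20 steps, which is why halving-based algorithms scale so well.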

# Conclusion

Time complexity is confusing in its own right, and *O(log n)* time adds another layer. However, understanding it is very important, because it can help you write faster code by taking advantage of *O(log n)*’s ability to shrink the input on every iteration. I hope this post helps someone understand *O(log n)* time a lot faster than it took me.

*Happy coding everyone!*