The algorithms we've discussed in this stage are very well-known, and some job interviewers are going to expect you to know their Big O runtimes. So let's look at them!
Quicksort Run Time (Worst Case): O(n²)
Quicksort Run Time (Average Case): O(n log n)
Merge Sort Run Time: O(n log n)

0:00
We had to learn a lot of details for each algorithm we've covered in this course.

0:04
Developers who need to implement their own algorithms often need to choose

0:08
an algorithm for each and every problem they need to solve.

0:11
And they often need to discuss their decisions with other developers.

0:14
Can you imagine needing to describe all the algorithms in this same level of

0:19
detail all the time?

0:20
You'd spend all your time in meetings rather than programming.

0:24
That's why Big O Notation was created, as a way of quickly describing how

0:28
an algorithm performs as the data set it's working on increases in size.

0:33
Big O Notation lets you quickly compare several algorithms to choose the best one

0:37
for your problem.

0:39
The algorithms we've discussed in this course are very well-known.

0:42
Some job interviewers are going to expect you to know their Big O runtimes.

0:46
So let's look at them.

0:48
Remember that the n in Big O Notation refers to the number of elements you're

0:52
operating on.

0:53
With selection sort, you need to check each item on the list to see if it's

0:57
lowest so you can move it over to the sorted list.

0:59
So that's n operations.

1:02
Suppose you're doing selection sort on a list of 5 items, and

1:06
n, in this case, would be 5.

1:08
So that's 5 operations before you can move an item to the sorted list.

1:11
But with selection sort, you have to loop over the entire list for

1:15
each item you want to move.

1:17
There are 5 items in the list, and you have to do 5 comparisons to move each one.

1:21
So it's more like 5 times 5 operations, or

1:25
if we replace 5 with n, it's n times n, or n squared.

1:30
But wait, you might say, half of that 5 by 5 grid of operations is missing,

1:34
because we're testing one fewer item in the unsorted list with each pass.

1:38
So isn't it more like, 1/2 times n times n?

1:42
And this is true, we are not doing a full n squared of operations.

1:46
But remember, in Big O Notation,

1:48
as the value of n gets really big, constants like 1/2 become insignificant.

1:53
And so we discard them.

1:55
The Big O runtime of selection sort is widely recognized as being O(n squared).
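The "5 times 5 grid" idea above can be sketched in Python. This is a hypothetical `selection_sort` helper (not the course's exact code) that counts comparisons, so you can see the roughly n times n growth, minus the missing half of the grid:

```python
# A minimal selection sort sketch (hypothetical, for illustration) that
# counts comparisons to show the roughly n * n growth described above.

def selection_sort(items):
    """Return a sorted copy of items plus the number of comparisons made."""
    unsorted = list(items)
    sorted_items = []
    comparisons = 0
    while unsorted:
        lowest_index = 0
        # Check every remaining item to see if it's the lowest.
        for i in range(1, len(unsorted)):
            comparisons += 1
            if unsorted[i] < unsorted[lowest_index]:
                lowest_index = i
        sorted_items.append(unsorted.pop(lowest_index))
    return sorted_items, comparisons

result, count = selection_sort([5, 3, 1, 4, 2])
print(result, count)  # 5 items -> 4+3+2+1 = 10 comparisons, about (1/2) * n * n
```

For 5 items that's 10 comparisons, which is exactly the 1/2 times n times n figure the transcript mentions; Big O discards the 1/2 and calls it O(n²).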

2:02
Quicksort requires one operation for each element of the list it's sorting.

2:06
It needs to select a pivot first, and then it needs to sort elements into lists that

2:10
are less than or greater than the pivot.

2:12
So that's n operations.

2:14
To put that another way, if you have a list of 8 items, then n is 8.

2:18
So it will take 8 operations to split the list around the pivot.

2:22
But of course, the list isn't sorted after splitting it around the pivot just once.

2:26
You have to repeat those n operations several times.

2:29
In the best case you'll pick a pivot that's right in the middle of the list,

2:33
so that you're dividing the list exactly in half.

2:36
Then you keep dividing the list in half until you

2:38
have a list with a length of one.

2:40
The number of times you need to divide n in half until

2:44
you reach one is expressed as log n.
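That halving count is easy to verify directly. Here's a small hypothetical helper that divides n in half until it reaches 1, which is exactly what log n (base 2) measures:

```python
# Count how many times n can be halved before reaching 1.
# This count is what "log n" (base 2) measures.

def halvings(n):
    count = 0
    while n > 1:
        n //= 2  # integer division: divide the list size in half
        count += 1
    return count

print(halvings(8))     # 3 halvings: 8 -> 4 -> 2 -> 1
print(halvings(1024))  # 10 halvings, even though the list is 128x bigger
```

Notice how slowly the count grows: a list 128 times larger only needs about 3 times as many halvings. That's why n log n is so much better than n squared.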

2:47
So you need to repeat n sorting operations log n times.

2:51
That leaves us with the best-case runtime for Quicksort of O(n log n).

2:57
But that's the best case.

2:59
What about the worst case?

3:00
Well, if you pick the wrong pivot,

3:02
you won't be dividing the list exactly in half.

3:04
If you pick a really bad pivot,

3:06
the next recursive call to Quicksort will only reduce the list length by 1.

3:10
Since our Quicksort function simply picks the first item to use as a pivot,

3:14
we can make it pick the worst possible pivot repeatedly,

3:18
simply by giving it a list that's sorted in reverse order.

3:21
If we pick the worst possible pivot every time, we'll have to split the list once

3:26
for every item it contains, and then, do n sorting operations on it.

3:30
You already know another sorting algorithm that only manages to reduce the list by

3:35
one element with each pass, selection sort.

3:37
Selection sort has a runtime of O(n squared).

3:40
And in the worst case, that's the runtime for Quicksort as well.
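The worst case described above can be demonstrated with a sketch. This is a hypothetical first-item-pivot Quicksort (modeled on the behavior the transcript describes, not necessarily the course's exact code) that counts partitioning passes on reverse-sorted data:

```python
# A first-item-pivot Quicksort sketch (hypothetical) that counts
# partitioning passes. On reverse-sorted input, each pass only shrinks
# the list by one, giving roughly n passes of n work each: O(n squared).

def quicksort(values, stats):
    if len(values) <= 1:
        return values
    stats["passes"] += 1
    pivot = values[0]  # the worst possible choice for reverse-sorted data
    less = [v for v in values[1:] if v < pivot]
    greater = [v for v in values[1:] if v >= pivot]
    return quicksort(less, stats) + [pivot] + quicksort(greater, stats)

stats = {"passes": 0}
print(quicksort([8, 7, 6, 5, 4, 3, 2, 1], stats))
print(stats["passes"])  # 7 passes: every split peels off just one item
```

With a good pivot the same 8-item list would need only about log n (3) levels of splitting instead of 7.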

3:44
So which do we consider when trying to decide whether to use Quicksort,

3:48
the best case or the worst case?

3:50
Well, as long as your implementation doesn't just pick the first item as

3:54
a pivot, which we did so we could demonstrate this issue,

3:57
it turns out that on average, Quicksort performs closer to the best case.

4:01
Many Quicksort implementations accomplish this simply by picking a pivot at random

4:06
on each recursive loop.

4:08
Here we are sorting our reverse sorted data again, but this time we picked pivots

4:12
at random, which reduces the number of recursive operations needed.

4:17
Sure, random pivots sometimes give you the best case and

4:19
sometimes you'll randomly get the worst case.

4:21
But it all averages out over multiple calls to the Quicksort function.
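A random-pivot version can be sketched like this (hypothetical code illustrating the strategy described above, not a specific library's implementation):

```python
import random

# A Quicksort sketch with a randomly chosen pivot (hypothetical, for
# illustration). Random pivots make repeated worst-case splits unlikely,
# even on reverse-sorted input.

def quicksort(values):
    if len(values) <= 1:
        return values
    pivot_index = random.randrange(len(values))  # pick a pivot at random
    pivot = values[pivot_index]
    rest = values[:pivot_index] + values[pivot_index + 1:]
    less = [v for v in rest if v < pivot]
    greater = [v for v in rest if v >= pivot]
    return quicksort(less) + [pivot] + quicksort(greater)

print(quicksort(list(range(8, 0, -1))))  # [1, 2, 3, 4, 5, 6, 7, 8]
```

The only change from the worst-case version is the pivot choice, yet on average it's enough to keep the runtime near O(n log n).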

4:26
Now, with Merge Sort, there is no pivot to pick.

4:28
Your list of n items always gets divided in half log n times.

4:33
That means, Merge Sort always has a big O runtime of O(n log n).
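A minimal Merge Sort sketch (hypothetical, not the course's exact code) shows why there's no pivot to get wrong: the split point is always the middle of the list.

```python
# Merge Sort always splits the list exactly in half (log n levels),
# and each level merges all n items, so the runtime is O(n log n)
# no matter what order the input is in.

def merge_sort(values):
    if len(values) <= 1:
        return values
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    # Merge the two sorted halves by repeatedly taking the smaller front item.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([6, 2, 8, 4, 1, 9, 5, 3]))  # [1, 2, 3, 4, 5, 6, 8, 9]
```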

4:38
Contrast that with Quicksort,

4:40
which only has a runtime of O(n log n) in the best case.

4:43
In the worst case, Quicksort's runtime is O(n squared).

4:47
And yet, out in the real world, Quicksort is more commonly used than Merge Sort.

4:51
Now, why is that, if Quicksort's Big O runtime can sometimes be worse than

4:55
Merge Sort?

4:56
This is one of those situations where Big O Notation doesn't tell you

5:00
the whole story.

5:01
All Big O can tell you is the number of times an operation is performed.

5:05
It doesn't describe how long that operation takes.

5:08
And the operation Merge Sort performs repeatedly

5:11
takes longer than the operation Quicksort performs repeatedly.

5:15
Big O is a useful tool for quickly describing how the runtime

5:18
of an algorithm increases as the data set it's operating on gets really, really big.

5:23
But you can't always choose between two algorithms based just on their Big O

5:27
runtimes.

5:28
Sometimes there is additional info you need to know about an algorithm

5:31
to make a good decision.

5:33
Now that we can sort a list of items,

5:35
we're well on our way to being able to search a list efficiently as well.

5:39
We'll look at how to do that in the next stage.