
Let's evaluate the runtime and space complexity of our algorithm now that we've implemented it.



If we go back to the top level, the merge
sort function, what does the runtime
0:00

of this function look like, and
what about space complexity?
0:05

How does memory usage grow
as the algorithm runs?
0:08

To answer those questions,
0:12

let's look at the individual steps,
starting with the split function.
0:13

In the split function, all we're doing
is finding the midpoint of the list, and
0:17

splitting the list at the midpoint.
0:21

This seems like a constant time operation.
0:23

But remember that the split
function isn't called once.
0:25

It's called as many times as we need to,
0:29

to go from the initial list
down to a single element list.
0:31

Now, this is a pattern we've
seen a couple of times, and
0:35

we know that overall,
this runs in logarithmic time.
0:38

So let's add that as a comment.
0:42

So here I'll say,
takes overall O of log n time.
0:44

Now there's a caveat here but
we'll come back to that.
0:52
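
The split step being described here can be sketched in Python as follows. This is a hypothetical version, since the course's actual code isn't shown in the transcript; it assumes the list is sliced at the midpoint, which is exactly where the caveat mentioned above comes in.

```python
# A sketch of the split step described above -- hypothetical code,
# assuming the implementation slices the list at its midpoint.
def split(values):
    """Divide the unsorted list at the midpoint into two sublists."""
    mid = len(values) // 2
    left = values[:mid]    # each slice copies its elements: O(k) work,
    right = values[mid:]   # so split itself is not truly constant time
    return left, right
```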

So next step is the merge step.
0:55

In the merge step,
0:57

we've broken the original list
down into single element lists and
0:59

now we need to make comparison operations
and merge them back in the reverse order.
1:03

For a list of size n, we'll always
need to make an n number of merge
1:08

operations to get back from single
element lists to a merged list.
1:13

This makes our overall runtime big O
of n times log n because that's an n
1:17

number of merge steps multiplied by a log
n number of splits of the original list.
1:23

So to our merge step here,
let's add a comment,
1:29

we'll say it runs in overall,
1:33

linear time, right?
1:37

It takes an n number of merge steps.
1:40
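
A sketch of the merge step the video describes, written here as an assumed implementation (names and details may differ from the course's code):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list.

    Each element is visited exactly once, so merging all the sublists
    back together at a given level takes O(n) time overall.
    """
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One list may still have elements left over; they are already sorted.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```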

But now that we have these two,
so linear here and
1:44

logarithmic here we can multiply these and
1:48

say that for the merge sort function,
the top level function,
1:51

the runtime of the overall
sorting process is O(n log n).
1:56
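
Putting the two steps together, the top-level function the video refers to might look like this. This is hypothetical code matching the description, not the course's exact listing; note the slicing in the split, which is the caveat discussed next.

```python
def merge_sort(values):
    """Sort a list with merge sort: O(log n) levels of O(n) merge work."""
    if len(values) <= 1:  # a single-element list is already sorted
        return values
    # Split step: divide at the midpoint (slicing copies elements).
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    # Merge step: combine the two sorted halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```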

Now what about the caveat
I mentioned earlier?
2:02

So if we go back to our split
function here,
2:05

let's take a look at the way
we're actually splitting the list.
2:11

So we're using Python's list
slicing operation here, and
2:16

passing in two indexes
where the split occurs.
2:20

Now if you go and poke around the Python
documentation, which I've done,
2:23

it says that a slicing operation is
not a constant time operation, and
2:28

in fact has a run time of O(k),
where k is the size of the slice.
2:33

This means that in reality,
our implementation of split,
2:38

does not run in logarithmic time, but
2:42

in O(k log n) time.
2:46

Because there's a slice operation for
each split.
2:49

This means that our implementation
is much more expensive.
2:53

So overall that makes our overall
top level merge_sort function,
2:57

not O(n log n) but O(kn log n),
which is much more expensive.
3:02

Now let's get rid of all that.
3:07

To fix this we would need to
remove this slicing operation.
3:11

Now we can do that by using a technique
we learned in a previous course.
3:15

In the introduction to algorithms course
we looked at two versions of binary search
3:20

in Python: a recursive and
an iterative version.
3:24

In the recursive one we use list slicing
with every recursive call, but we achieve
3:28

the same end result using an iterative
approach without using list slicing.
3:33

Over there we declared two variables
to keep track of the starting, and
3:37

ending positions in the list.
3:42

We could rewrite merge
sort to do the same, but
3:44

I'll leave that as an exercise for you.
3:47

If you want some hints, or
3:49

if you want any direction I've included a
link in the notes with an implementation.
3:51
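
As one possible direction (a sketch only, not necessarily the implementation linked in the notes): keep track of start and end indices instead of slicing, and merge through a reusable auxiliary buffer. It is still recursive here, but with no per-call slice copies, each level does O(n) work and the total stays O(n log n).

```python
# Hypothetical index-based rewrite of merge sort that avoids the
# per-split slice copies. Sorts the list in place.
def merge_sort_no_slice(values):
    aux = values[:]  # a single O(n) buffer, reused by every merge

    def sort(lo, hi):
        """Sort values[lo:hi] by tracking index bounds, not slices."""
        if hi - lo <= 1:
            return
        mid = (lo + hi) // 2
        sort(lo, mid)
        sort(mid, hi)
        merge(lo, mid, hi)

    def merge(lo, mid, hi):
        """Merge the sorted ranges values[lo:mid] and values[mid:hi]."""
        aux[lo:hi] = values[lo:hi]  # one copy of the range: O(k), same
        i, j = lo, mid              # order as the merge work itself
        for k in range(lo, hi):
            if i < mid and (j >= hi or aux[i] <= aux[j]):
                values[k] = aux[i]
                i += 1
            else:
                values[k] = aux[j]
                j += 1

    sort(0, len(values))
    return values
```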

So that is time complexity.
3:55

Now just so we know, before moving on, for
3:58

Python here our overall run time
is not what I've listed here.
4:01

But this is what the actual runtime of
the merge sort algorithm looks like.
4:05

So the merge step runs in linear time.
4:09

And the split step takes logarithmic
time for an overall n times log n.
4:12

And that is how merge sort actually works.
4:17

Okay, so what about space complexity?
4:19

The merge sort algorithm
takes linear space.
4:22

And this is weird to think about at first,
but as always, a visualization helps.
4:25

[SOUND] So if we start at the top
again with our full list and
4:30

carry out the split method until
we have single element lists,
4:34

each of these new lists take
up a certain amount of space.
4:38

At the second level here we have two lists
that each take up n/2 amount of space.
4:42

Now this makes it seem that
the sum of all this space
4:47

is the additional space needed for merge
sort, but that's not actually the case.
4:50

In reality, there are two
factors that make a difference.
4:55

First, not every single one of these
sublists is created simultaneously.
4:58

At step two we create
two n/2-sized sublists.
5:03

When we move to the next step, however,
we don't hold on to the n/2 sublists and
5:07

then create four n/4-sized sublists for
the next split.
5:13

Instead, after the four n/4-sized
sublists are created,
5:17

the n/2 ones are deleted from memory.
5:21

There's no reason to hold
on to them any longer.
5:24

The second point is that our code
doesn't execute every path simultaneously.
5:27

Think of it this way.
5:32

When we pass our list to the top
level merge sort function,
5:34

our implementation calls split, which
returns a left half and a right half.
5:38

The next line of code then calls
merge sort on the left half again.
5:45

This runs the function, the merge
sort function again with a new list.
5:50

In that second run of the function split
is called again, we get a second left and
5:55

right half, and
5:59

then again, like before, we call
merge sort on this left half as well.
6:00

What this means is that the code walks
down the left path all the way down
6:05

until that initial left half is sorted and
merged back into one array.
6:09

Then, it's going to walk all the way down
the right path and sort that until we're
6:15

back to that first split
with two n/2-sized sublists.
6:19

Essentially, we don't run all of
these paths of code at once, so
6:24

the algorithm doesn't need
additional space for every sublist.
6:28

In fact,
it is the very last step that matters.
6:32

In the last step,
6:35

the two sublists are merged back into
the new sorted list and returned.
6:36

That sorted list has an equal number
of items as the original unsorted list.
6:42

And since this is a new list
it means that, at most,
6:48

the additional space the algorithm
will require at a given time is n.
6:52

Yes, at different points in the algorithm
we require log n amount of space but
6:57

log n is smaller than n and so we consider
the space complexity of merge sort to
7:02

be linear because that
is the overall factor.
7:07

Okay, that was a lot.
7:10

So let's stop here.
7:12

Don't worry if you've got questions about
merge sort because we're not done yet.
7:13

Over the next few videos let's wrap up
this course by implementing merge sort on
7:16

a linked list.
7:21
