1
00:00:00,000 --> 00:00:04,994
The next worst case runtime we're going to
look at is one that's called quasilinear.
2
00:00:04,994 --> 00:00:08,954
And it's sort of easier to understand,
for lack of a better word,
3
00:00:08,954 --> 00:00:11,160
by starting with the big O notation.
4
00:00:11,160 --> 00:00:16,421
Quasilinear runtimes are written
out as big O(n log n).
5
00:00:16,421 --> 00:00:18,629
We learned what log n was, right?
6
00:00:18,629 --> 00:00:21,853
A logarithmic runtime where
as n grew the number of
7
00:00:21,853 --> 00:00:24,865
operations only increased
by a small factor.
8
00:00:24,865 --> 00:00:28,710
With a quasilinear runtime
what we're saying is that for
9
00:00:28,710 --> 00:00:33,369
every value of n we're going to
execute a log n number of operations.
10
00:00:33,369 --> 00:00:36,871
Hence the runtime of n times log n.
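To make that concrete, here is a small sketch (not from the video) that prints rough operation counts for a few input sizes, showing how n log n sits between linear and quadratic growth:

```python
import math

# For each input size n, a quasilinear algorithm performs roughly
# n * log2(n) operations -- more than linear (n), but far fewer
# than quadratic (n * n).
for n in [16, 256, 4096]:
    linear = n
    quasilinear = int(n * math.log2(n))
    quadratic = n * n
    print(f"n={n}: linear={linear}, quasilinear={quasilinear}, quadratic={quadratic}")
```

The base of the logarithm doesn't matter for big O; base 2 is used here because it matches the repeated halving you'll see in Merge Sort below.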
11
00:00:36,871 --> 00:00:40,956
So you saw earlier with a quadratic
runtime, that for each value of n,
12
00:00:40,956 --> 00:00:42,630
we conducted n operations.
13
00:00:42,630 --> 00:00:46,759
It's sort of the same in that as we
go through the range of values in n,
14
00:00:46,759 --> 00:00:48,982
we're executing log n operations.
15
00:00:48,982 --> 00:00:53,965
In comparison to other runtimes,
a quasilinear algorithm has a runtime that
16
00:00:53,965 --> 00:00:58,272
lies somewhere between a linear
runtime and a quadratic runtime.
17
00:00:58,272 --> 00:01:02,875
So where would we expect to see this
kind of runtime in practical use?
18
00:01:02,875 --> 00:01:06,709
Well, sorting algorithms is one
place you will definitely see it.
19
00:01:06,709 --> 00:01:08,436
Merge Sort, for example,
20
00:01:08,436 --> 00:01:13,252
is a sorting algorithm that has a worst
case runtime of big O(n log n).
21
00:01:13,252 --> 00:01:15,824
Let's take a look at a quick example.
22
00:01:15,824 --> 00:01:19,722
Let's say we start off with a list
of numbers that looks like this, and
23
00:01:19,722 --> 00:01:20,790
we need to sort it.
24
00:01:20,790 --> 00:01:25,360
Merge Sort starts by splitting this
list into two lists down the middle.
25
00:01:25,360 --> 00:01:29,944
It then takes each sublist and
splits that in half down the middle again.
26
00:01:29,944 --> 00:01:35,185
It keeps doing this until we end up
with sublists of just a single number.
27
00:01:35,185 --> 00:01:39,162
When we're down to single numbers
we can do one sort operation and
28
00:01:39,162 --> 00:01:42,505
merge these sublists back
in the opposite direction.
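The splitting and merging steps just described can be sketched in Python like this (a minimal illustration, not the course's own implementation):

```python
def merge_sort(values):
    # Base case: a list of one (or zero) numbers is already sorted.
    if len(values) <= 1:
        return values

    # Split the list in half down the middle. This repeated halving
    # is the log n part of the runtime.
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])

    # Merge the sorted halves back together, comparing values as we
    # go. Across each level of splits this takes n comparisons total,
    # giving the n part of the runtime.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

For example, `merge_sort([8, 3, 5, 1])` returns `[1, 3, 5, 8]`.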
29
00:01:42,505 --> 00:01:48,372
The first part of Merge Sort cuts those
lists into sublists with half the numbers.
30
00:01:48,372 --> 00:01:50,808
This is similar to binary search,
31
00:01:50,808 --> 00:01:55,937
where each comparison operation cuts
down the range to half the values.
32
00:01:55,937 --> 00:01:59,530
You know the worst case runtime
of binary search is log n.
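As a reminder of that halving pattern, here is a quick binary search sketch (an illustration under the same assumptions, not code from the video):

```python
def binary_search(sorted_values, target):
    # Each comparison cuts the search range in half, so the number of
    # comparisons grows as log n -- the same halving pattern that
    # Merge Sort's splitting phase follows.
    low, high = 0, len(sorted_values) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_values[mid] == target:
            return mid  # found the target at index mid
        if sorted_values[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1  # target is not in the list
```
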
33
00:01:59,530 --> 00:02:05,311
So the splitting operations have the same
runtime, big O(log n), or logarithmic.
34
00:02:05,311 --> 00:02:09,327
But splitting into half isn't the only
thing we need to do with Merge Sort.
35
00:02:09,327 --> 00:02:13,277
We also need to carry out comparison
operations so we can sort those values.
36
00:02:13,277 --> 00:02:16,319
And if you look at each
step of this algorithm,
37
00:02:16,319 --> 00:02:19,777
we carry out n number
of comparison operations.
38
00:02:19,777 --> 00:02:24,657
And that brings the worst case run time
of this algorithm to n times log n,
39
00:02:24,657 --> 00:02:26,584
also known as quasilinear.
40
00:02:26,584 --> 00:02:29,892
Don't worry if you didn't
understand how Merge Sort works.
41
00:02:29,892 --> 00:02:31,926
That wasn't the point
of this demonstration.
42
00:02:31,926 --> 00:02:35,135
We will be covering Merge Sort soon,
in a future course.