You have completed Introduction to Algorithms!
  Space complexity is a measure of how much working storage, or extra storage, is needed as a particular algorithm grows. In this video, let's take a look at the space complexity of our algorithms.
Glossary
- Space Complexity - a measure of how much working storage, or extra storage, is needed as a particular algorithm grows
- Tail call - a call, inside a function, to the same function itself as the last operation. Also called tail recursion
                      At the beginning of this series,
                      0:00
                    
                    
                      I mentioned that there were two ways of
measuring the efficiency of an algorithm.
                      0:01
                    
                    
                      The first was time complexity, or
                      0:05
                    
                    
                      how the runtime of an algorithm
grows as N grows larger.
                      0:08
                    
                    
                      The second is space complexity.
                      0:11
                    
                    
                      We took a pretty long route
to build up this example, but
                      0:14
                    
                    
                      now we're in a good place to
discuss space complexity.
                      0:17
                    
                    
                      Space complexity is a measure
of how much working storage or
                      0:20
                    
                    
                      extra storage is needed as
a particular algorithm grows.
                      0:25
                    
                    
                      We don't think about it much these days,
but
                      0:29
                    
                    
                      every single thing we do on
a computer takes up space in memory.
                      0:32
                    
                    
                      In the early days of computing,
considering memory usage was of paramount
                      0:36
                    
                    
                      importance because memory was limited and
really expensive.
                      0:41
                    
                    
                      These days we're spoiled,
our devices are rich with memory.
                      0:45
                    
                    
                      This is okay when we write everyday
code because most of us aren't dealing
                      0:48
                    
                    
                      with enormously large data sets.
                      0:53
                    
                    
                      When we write algorithms, however, we
need to think about this because we want
                      0:55
                    
                    
                      to design our algorithms to
perform as efficiently as they can,
                      1:00
                    
                    
                      as the size of the data set,
N, grows really large.
                      1:04
                    
                    
                      Like time complexity,
                      1:07
                    
                    
                      space complexity is measured in the
worst-case scenario using big O notation.
                      1:09
                    
                    
                      Since you are familiar with
the different kinds of complexities,
                      1:14
                    
                    
                      let's dive right into an example.
                      1:19
                    
                    
                      In our iterative implementation of binary
search, the first one we wrote that uses
                      1:21
                    
                    
                      a while loop, let's look at what happens
to our memory usage as N gets large.
                      1:26
                    
                    
                      Let's bring up that function.
                      1:31
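As a reference while following along, here's a sketch of the iterative implementation being discussed. The variable names first and last follow the video; the exact code in the course may differ slightly:

```python
def binary_search(input_list, target):
    """Iterative binary search over a sorted list.

    Returns the index of target, or None if it isn't present.
    """
    # first and last mark the portion of the list still under
    # consideration; no sublists are created, so the extra storage
    # used stays constant no matter how large the list is.
    first = 0
    last = len(input_list) - 1

    while first <= last:
        midpoint = (first + last) // 2
        if input_list[midpoint] == target:
            return midpoint
        elif input_list[midpoint] < target:
            first = midpoint + 1
        else:
            last = midpoint - 1
    return None
```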
                    
                    
                      Let's say we start off with
a list of ten elements.
                      1:34
                    
                    
                      Now inspecting the code, we see that our
solution relies heavily on these two
                      1:38
                    
                    
                      variables first and last.
                      1:43
                    
                    
                      First points to the start of the list and
last, to the end.
                      1:44
                    
                    
                      When we eliminate a set of values,
we don't actually create a sublist.
                      1:48
                    
                    
                      Instead we just redefine first and
last as you see here,
                      1:52
                    
                    
                      to point to a different
section of the list.
                      1:57
                    
                    
                      Since the algorithm only considers
the values between first and
                      2:01
                    
                    
                      last when determining the midpoint,
by redefining first and
                      2:05
                    
                    
                      last as the algorithm proceeds we can find
the solution using just the original list.
                      2:09
                    
                    
                      This means that for
any value of n the space complexity of
                      2:14
                    
                    
                      the iterative version of
binary search is constant.
                      2:18
                    
                    
                      Or that the iterative version of
binary search takes constant space.
                      2:22
                    
                    
                      Remember that we would
write this as big O of one.
                      2:27
                    
                    
                      This might seem confusing,
because as n grows,
                      2:31
                    
                    
                      we need more storage to account for
that larger list size.
                      2:34
                    
                    
                      Now this is true, but
                      2:37
                    
                    
                      that storage is not what space
complexity cares about measuring.
                      2:39
                    
                    
                      We care about what additional storage
is needed as the algorithm runs and
                      2:43
                    
                    
                      tries to find a solution.
                      2:48
                    
                    
                      If we assume something simple, say that
for a given size of a list, represented
                      2:50
                    
                    
                      by a value N, it takes N amount of
space to store it, whatever that means.
                      2:55
                    
                    
                      Then for the iterative version of
binary search, regardless of how large
                      3:00
                    
                    
                      the list is, at the start, middle,
and end of the algorithm process,
                      3:05
                    
                    
                      the amount of storage required
does not get larger than N.
                      3:10
                    
                    
                      And this is why we consider
it to run in constant space.
                      3:15
                    
                    
                      Now, this is an entirely different story
with the recursive version, however.
                      3:18
                    
                    
                      In the recursive version of binary search
we don't make use of variables to keep
                      3:23
                    
                    
                      track of which portion of
the list we're working with.
                      3:27
                    
                    
                      Instead, we create new lists every
time with a subset of values, or
                      3:30
                    
                    
                      sublists, with every
recursive function call.
                      3:35
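A sketch of the recursive version under discussion, where each call slices out a new sublist (the shape follows the video's description; details may differ from the course code):

```python
def recursive_binary_search(input_list, target):
    """Recursive binary search over a sorted list.

    Returns True if target is present, False otherwise.
    """
    # Stopping condition: an empty sublist means the target isn't here.
    if len(input_list) == 0:
        return False

    midpoint = len(input_list) // 2
    if input_list[midpoint] == target:
        return True

    # Slicing creates a brand-new list containing half the values;
    # this copying is the extra storage the video is talking about.
    if input_list[midpoint] < target:
        return recursive_binary_search(input_list[midpoint + 1:], target)
    return recursive_binary_search(input_list[:midpoint], target)
```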
                    
                    
                      Let's assume we have a list of size n and
in the worst case scenario,
                      3:39
                    
                    
                      the target element is
the last in the list.
                      3:43
                    
                    
                      Calling the recursive implementation
of binary search on this list and
                      3:46
                    
                    
                      target would lead to a scenario like this.
                      3:50
                    
                    
                      The function would call itself and
                      3:53
                    
                    
                      create a new list that goes from
the midpoint to the end of the list.
                      3:55
                    
                    
                      Since we're discarding half the values,
the size of the sublist is N/2.
                      3:59
                    
                    
                      This function will keep calling itself,
                      4:05
                    
                    
                      creating a new sublist that's
half the size of the current one,
                      4:07
                    
                    
                      until it arrives at a single-element list,
which is a stopping condition.
                      4:11
                    
                    
                      This pattern that you see here,
where the size of the sublist is reduced
                      4:15
                    
                    
                      by a factor on each execution
of the algorithmic logic,
                      4:19
                    
                    
                      well we've seen that pattern before.
                      4:23
                    
                    
                      Do you remember where?
                      4:25
                    
                    
                      This is exactly how binary search works.
                      4:26
                    
                    
                      [SOUND] It discards half the values
every time until it finds a solution.
                      4:29
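A quick way to see the pattern in numbers (this helper is just an illustration, not from the video): count how many times n can be halved before only one element is left, and compare it with log2 of n.

```python
import math

def halvings(n):
    # Count how many times n can be halved before reaching 1,
    # mirroring how binary search discards half the values each step.
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

# For n = 16, halving four times reaches 1, matching log2(16) = 4.
```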
                    
                    
                      Now we know that because of this pattern,
                      4:33
                    
                    
                      the running time of binary
search is logarithmic.
                      4:36
                    
                    
                      In fact,
                      4:38
                    
                    
                      the space complexity of the recursive
version of binary search is the same.
                      4:39
                    
                    
                      If we start out with the memory allocation
of size N that matches the list,
                      4:44
                    
                    
                      on each function call of
recursive binary search,
                      4:49
                    
                    
                      we need to allocate additional
memory of size N/2, N/4, and so
                      4:53
                    
                    
                      on until we have a sublist that is
either empty or contains a single value.
                      4:57
                    
                    
                      Because of this, we say that
the recursive version of the binary
                      5:02
                    
                    
search algorithm runs in logarithmic
space, with a big O of log N.
                      5:07
                    
                    
                      Now, there's an important caveat here.
                      5:12
                    
                    
                      This totally depends on the language.
                      5:14
                    
                    
                      Remember how I said that a programming
language like Swift can do some tricks so that
                      5:16
                    
                    
                      where recursion depth doesn't matter?
                      5:21
                    
                    
                      The same concept applies here.
                      5:23
                    
                    
                      If you care to read more about this
concept, it's called tail call optimization.
                      5:25
                    
                    
                      It's called tail call optimization because
if you think of a function as
                      5:30
                    
                    
                      having a head and a tail, if the recursive
function call is the last line
                      5:35
                    
                    
                      of code in the function, as it is in
our case, we call this tail recursion,
                      5:40
                    
                    
                      since it's the last part of
the function that calls itself.
                      5:45
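To make the idea concrete, here is one way a tail-recursive binary search can look. This variant is not from the video: instead of slicing, it passes first and last indices along, and the recursive call is the very last operation. Note that Python will still grow the call stack, since it does not perform tail call optimization:

```python
def binary_search_tail(input_list, target, first=0, last=None):
    """Tail-recursive binary search; returns the index of target or None."""
    if last is None:
        last = len(input_list) - 1

    # Stopping condition: the range is empty.
    if first > last:
        return None

    midpoint = (first + last) // 2
    if input_list[midpoint] == target:
        return midpoint

    # Each recursive call is the final operation in the function --
    # a tail call. A language with tail call optimization could reuse
    # the current stack frame here; Python does not.
    if input_list[midpoint] < target:
        return binary_search_tail(input_list, target, midpoint + 1, last)
    return binary_search_tail(input_list, target, first, midpoint - 1)
```

Because no sublists are copied, this version's extra storage comes only from the stack frames themselves.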
                    
                    
                      Now the trick that Swift does to
reduce the amount of space, and
                      5:49
                    
                    
                      therefore computing overhead,
to keep track of these recursive calls,
                      5:53
                    
                    
                      is called tail call optimization,
or tail call elimination.
                      5:58
                    
                    
                      It's one of those things that
you'll see thrown around a lot
                      6:02
                    
                    
                      in algorithm discussions, but
may not always be relevant to you.
                      6:06
                    
                    
                      Now, why is any of this relevant to us?
                      6:09
                    
                    
                      Well, Python does not implement
tail call optimization.
                      6:12
                    
                    
                      So the recursive version of binary
search takes logarithmic space.
                      6:16
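One rough way to see this growth for yourself (a hypothetical measurement helper, not part of the course code): count the recursive calls a slicing binary search makes in a worst case, and watch the count track log2 of the list size.

```python
def search_depth(n):
    """Count recursive calls of a slicing binary search in a worst case
    (target absent and larger than every element), as a proxy for the
    number of stack frames and sublists created."""
    calls = 0

    def search(sublist, target):
        nonlocal calls
        calls += 1
        if not sublist:
            return False
        mid = len(sublist) // 2
        if sublist[mid] == target:
            return True
        if sublist[mid] < target:
            return search(sublist[mid + 1:], target)
        return search(sublist[:mid], target)

    # Target n is larger than every element of range(n), so the search
    # keeps halving until the sublist is empty.
    search(list(range(n)), n)
    return calls
```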
                    
                    
                      If we had to choose between
the two implementations,
                      6:20
                    
                    
                      given that the time complexity, or run-time,
of both versions, the iterative and
                      6:24
                    
                    
                      the recursive version are the same,
we should definitely go with the iterative
                      6:28
                    
                    
                      implementation in Python,
since it runs in constant space.
                      6:33
                    
                    
                      Okay, that was a lot.
                      6:37
                    
                    
                      But with all of this, we've
now established two important ways
                      6:38
                    
                    
                      to distinguish between algorithms
that handle the same task,
                      6:43
                    
                    
                      and determine which one we should use.
                      6:47
                    
              