While the iPhone CPU is good at a great many things, it isn't best suited for image processing and rendering. Here's where the GPU shines. For our second attempt at solving this problem, let's write some code to get the GPU involved.
-
0:00
[SOUND] Okay, so we solved that problem.
-
0:06
It took a few long videos but we did it,
-
0:08
our implementation involved creating two image contexts:
-
0:12
a temporary one to scale the image down and another one to apply the filter,
-
0:16
along with an operations-based approach to schedule all this work in the background.
-
0:21
This is a fine approach and we could stop here, but this is an opportunity
-
0:25
to talk and learn about the various ways we can tackle different problems.
-
0:30
So let's switch tracks here and talk about the GPU.
-
0:34
When the Core Image framework renders an image,
-
0:37
it can do so using either the CPU or the GPU.
-
0:40
Before we talk about why this matters,
-
0:42
let's make sure we understand the difference between these two components.
-
0:46
In a previous course, we touched upon the CPU, the Central Processing Unit or
-
0:51
what is commonly referred to as the brains of the computer.
-
0:55
There's another important component in modern computers known as the GPU or
-
0:59
the Graphics Processing Unit.
-
1:01
We learned when talking about concurrency, that the CPU is architected with a few
-
1:06
cores that can handle different software threads at the same time.
-
1:10
This processing structure means that the CPU can render images on one of its
-
1:15
background threads, but it can only do this well enough for basic graphics.
-
1:19
The GPU, on the other hand, is a very specialized device
-
1:23
that is extremely efficient at concurrent processing.
-
1:26
Where the CPU has a few cores, modern GPUs can have several hundred cores
-
1:31
that process thousands of threads simultaneously.
-
1:35
As its name states, this makes it quite suited for graphics processing and
-
1:39
in fact, nearly every pixel that ends up on your screen was put there by the GPU.
-
1:44
Even tasks like scrolling, view transitions, and zooming
-
1:48
are all handled by the GPU on your mobile device.
-
1:52
Without using multiple contexts and operations, our filter processes ran
-
1:57
really slowly, and that's because we were rendering images on the CPU.
-
2:01
So let's try switching over to using the GPU and controlling the process.
-
2:06
Now, a few videos ago we created a separate git branch to work on the first solution.
-
2:12
Let's switch back to master,
-
2:13
so that we can create a new branch for the GPU-based approach.
-
2:17
So if you don't have your navigator panel up, bring it up and
-
2:21
head over to the second tab, which you can do easily by hitting Command-2;
-
2:25
this is the Source Control navigator.
-
2:27
Now, in here we can see the different branches;
-
2:30
we're going to right-click on the master branch and select Checkout.
-
2:34
Now when we do this, it says here that all files in the working copy will switch
-
2:38
from the current branch to master.
-
2:40
Hit Checkout, and you'll notice that all your code should be back to the place
-
2:44
where we took the snapshot earlier, right before we switched to the branch.
-
2:49
Now, right-click on master again, and select Branch from "master".
-
2:53
This time, name the branch GPU-based-approach.
-
2:58
Again, if you inspect the code, you can see we're more or less back to square one.
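For anyone following along with the git command line instead of Xcode's Source Control navigator, the same branch setup looks roughly like this; the first few lines are just scaffolding that stands in for the course project:

```shell
# Scaffolding: a throwaway repo standing in for the course project
cd "$(mktemp -d)"
git init -q --initial-branch=master
git config user.name "demo"
git config user.email "demo@example.com"
git commit -q --allow-empty -m "snapshot before the first solution"

git checkout -q master                  # Xcode: right-click master, Checkout
git checkout -q -b GPU-based-approach   # Xcode: Branch from "master"
git branch --show-current               # prints: GPU-based-approach
```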
-
3:04
Let's navigate to the FilteredImageBuilder class, and
-
3:09
then inspect the code we wrote to apply the filter.
-
3:13
You can see that our apply method here is back to returning an instance of CIImage,
-
3:18
which we then convert to an instance of UIImage and return at some point.
-
3:22
The reason our filters take a while to load here is because
-
3:26
we're using the CPU to draw from the CGImage back to the UIImage.
-
3:30
And then we hand the object right back to UIKit, which uses the GPU to render it.
-
3:36
CGImage, as the class name suggests, is from the Core Graphics framework, and
-
3:41
we can avoid this round trip from the CPU to the GPU by using the GPU directly to
-
3:45
do all the work.
-
3:46
To do this we're going to utilize the Open Graphics Library, commonly called OpenGL.
-
3:52
OpenGL is a multipurpose open standard graphics library that supports
-
3:57
applications for 2D and 3D digital content creation, mechanical and
-
4:01
architectural design, virtual prototyping, flight simulation, video games, and more.
-
4:07
Now, OpenGL for Embedded Systems, or OpenGL ES, is what we're going to use, and
-
4:12
this is a simplified version of OpenGL that is easier to implement in
-
4:16
mobile graphics hardware.
-
4:18
To do this, we're going to redefine the context that we use.
-
4:22
So instead of using a CIContext like we did earlier, we're
-
4:26
going to use an underlying EAGLContext.
-
4:29
An EAGLContext object manages an OpenGL ES rendering context:
-
4:35
the state information, the commands, and
-
4:38
the resources needed to draw directly using OpenGL ES.
-
4:41
So navigate to PhotoFilterController.swift, and
-
4:45
in here, as a stored property, we're going to define our EAGLContext.
-
4:51
So, let eaglContext = EAGLContext,
-
4:55
now the initializer we're going to use takes an EAGLRenderingAPI.
-
5:03
Here, we're going to use OpenGL ES 3, so start typing openGLES3 right here.
-
5:10
This is only available in iOS 7 and higher versions.
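As a rough sketch, the stored property described above might look like the following; the class name comes from the course, but force-unwrapping the failable initializer and the GPU-backed CIContext property are assumptions for illustration:

```swift
import UIKit
import CoreImage

class PhotoFilterController: UIViewController {
    // An OpenGL ES 3 rendering context. The initializer is failable and
    // returns nil if the device does not support OpenGL ES 3, so a real
    // app would handle that case instead of force-unwrapping.
    let eaglContext = EAGLContext(api: .openGLES3)!

    // Assumption: a Core Image context backed by the OpenGL ES context,
    // so that rendering happens on the GPU rather than the CPU.
    lazy var ciContext = CIContext(eaglContext: self.eaglContext)
}
```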
-
5:14
Now, the mechanics by which OpenGL ES does its magic are a bit outside the scope of
-
5:19
this course, and honestly quite complicated; a bit over my head, at least.
-
5:23
And that's not all there is to this: instead of rendering the image via
-
5:28
UIImage and UIKit, we are now going to use an instance of GLKView,
-
5:32
which is a subclass of UIView that communicates with this
-
5:37
OpenGL context directly and lets the GPU draw on it.
-
5:41
Since we're doing that, drawing to a GLKView directly, we don't need
-
5:45
this image view that we added earlier, or this instance of UIImage.
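As a sketch of that idea (the frame and variable names here are assumptions, not the course's exact code), a GLKView can be created directly on top of the OpenGL ES context so the GPU draws into it:

```swift
import GLKit

// Assumes a device that supports OpenGL ES 3.
let eaglContext = EAGLContext(api: .openGLES3)!

// A UIView subclass whose contents are drawn by the GPU through the
// OpenGL ES context, instead of by UIKit on the CPU.
let glView = GLKView(frame: CGRect(x: 0, y: 0, width: 320, height: 320),
                     context: eaglContext)

// A CIContext backed by the same EAGLContext can later render
// CIImage instructions straight into this view.
let ciContext = CIContext(eaglContext: eaglContext)
```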
-
5:49
So we're going to go back to FilteredImageBuilder, and
-
5:53
again we're going to modify our method to return a different image type.
-
5:57
Now, earlier we switched from returning an array of UIImage to CGImage;
-
6:02
we're going to change that again and this time return CIImages instead.
-
6:08
One thing we didn't talk about earlier
-
6:11
is that CIImage isn't actually an image format. Think of it more like a recipe:
-
6:15
it contains instructions about how the image should be rendered.
-
6:19
So what is the point of this?
-
6:21
Well, holding the data this way makes it easy for
-
6:24
us to chain operations like filters or any other image processing requirements.
-
6:29
Imagine you had a large image that you wanted to apply a black and
-
6:32
white filter to and then crop by 50%.
-
6:34
If you first apply the filter and render the image with the filter applied,
-
6:38
given that it's a large image, this is going to be expensive.
-
6:42
Now after this operation, after the filter application we're going to crop the image
-
6:46
and reduce its size by half.
-
6:48
So had you combined all those operations together, had you applied the filter after
-
6:52
the crop, where we have a smaller image, it would have decreased processing time.
-
6:56
Now, you know this because we ran into that very constraint and
-
6:59
had to take that exact approach.
-
7:01
So what CIImage does is take all the instructions we provide (crops, filters,
-
7:06
drawing directions, and so on) and combine them so that it optimizes the process.
-
7:11
That way, when we eventually render it,
-
7:13
it does so in the right order to minimize how expensive the operation is.
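As an illustration of that lazy, recipe-like behavior (the filter name and sizes here are assumptions, not the course's exact settings), chaining a filter and a crop on a CIImage just builds up instructions; no pixels are drawn until a context renders the result:

```swift
import CoreImage

// Nothing is rendered here; each step only extends the recipe.
let input = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
    .cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))

// Assumption: a black-and-white effect standing in for the course's filter.
let filter = CIFilter(name: "CIPhotoEffectNoir")!
filter.setValue(input, forKey: kCIInputImageKey)
let filtered = filter.outputImage!

// Crop to half the original height; still no drawing has happened.
let cropped = filtered.cropped(to: CGRect(x: 0, y: 0, width: 100, height: 50))

// Only now does the work happen, when a context renders the recipe,
// and Core Image can reorder the steps to minimize the cost.
let context = CIContext()
let rendered = context.createCGImage(cropped, from: cropped.extent)
```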
-
7:16
Inside the main method, so inside our applyFilter method over here,
-
7:21
we'll use the output image that we get from this line to do something
-
7:25
similar to what we did earlier.
-
7:27
We're going to return the output image cropped to the bounds of the input image.
-
7:32
So here we'll just get rid of the return statement and
-
7:36
say outputImage.cropped(to: inputImage.extent).
-
7:42
Now, let's change this return type to CIImage and
-
7:46
the last one as well. We're almost there.
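Putting those pieces together, the modified method might look roughly like this; the class name comes from the course, but the method signature, the force-unwrapping, and the filter handling are assumptions based on the description rather than the course's exact code:

```swift
import CoreImage

class FilteredImageBuilder {
    // Returns a CIImage "recipe" rather than a rendered image,
    // leaving the actual drawing to the GPU-backed context later on.
    func apply(_ filter: CIFilter, to inputImage: CIImage) -> CIImage {
        filter.setValue(inputImage, forKey: kCIInputImageKey)
        let outputImage = filter.outputImage!
        // Crop the output to the bounds of the input image, as described above.
        return outputImage.cropped(to: inputImage.extent)
    }
}
```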
-
7:50
We now have an OpenGL ES context, and we have an object that returns a set of
-
7:55
instructions on how to draw a filtered image.
-
7:58
The next place we need to make changes is the collection view cell,
-
8:01
which we'll do in the next video.