In this video, we'll demonstrate how to document the findings of your study. How you report your findings will affect how your teammates will respond to the studies you've facilitated.
You've completed your study. Now comes another critical step in the usability study journey: the writeup. How you report your findings will affect what your teammates hear from the results and whether they are motivated to improve the product based on what you found.

Writing a full research report usually takes a few weeks, but don't wait that long to let your team know how the study went. Immediately after the study has wrapped up, create a high-level summary listing the most obvious and pressing issues so the design team can start working on solutions. Let them know that a full report is on the way.
Looking at all of your handwritten notes, video footage, and yellow stickies from the observer discussions is just a lot. Start by turning all of your analog notes into digital ones. I've provided an example in the teacher's notes. Spreadsheets will make things easy to read and easy to search for keywords as you discover patterns in your observations. This initial spreadsheet will be a handy reference for you, but plan on putting together a write-up for your coworkers. We'll go over that one later.

You can use your favorite spreadsheet software to digitize and organize the information. Let's take a look at some helpful techniques using a short example I've put together in a Google spreadsheet. It is also included in the teacher's notes. You may notice I used a combination of realistic findings, such as "was concerned about a star rating," and dummy text (lorem ipsum dolor) to fill out this fictional example. Now, on with the demo.
First, digitize your notes by observation. Add a line for each observation that stood out to you. Most of them will likely be problems, but you can add positive highlights as well. Associate each observation with a task, the relevant screen, the participant number, and any details that may be relevant.
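If you'd rather script this step than use a spreadsheet, the same per-observation structure can be sketched in code. This is a hypothetical example: the field names mirror the columns described above (task, screen, participant, details), and the sample rows are invented.

```python
# Hypothetical sketch of the observation log described above:
# one record per observation, tagged with the task, relevant screen,
# participant number, and any extra details. Sample rows are invented.
observations = [
    {"task": 1, "screen": "Product page", "participant": "P2",
     "kind": "problem", "note": "Was concerned about a star rating"},
    {"task": 1, "screen": "Search results", "participant": "P3",
     "kind": "highlight", "note": "Found filtering by brand quickly"},
]

# Each row is searchable by keyword, just like the spreadsheet.
hits = [o for o in observations if "star rating" in o["note"].lower()]
print(len(hits))  # prints 1
```

Whatever tool you use, the point is the same: one observation per row, with enough context attached to group and count later.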
Second, group related responses together. Did one participant talk about not knowing what a BTU is? Maybe another asked how they would pick the right A/C unit for the room size, while another wanted to know whether the type of filter used is important. Sounds like these are all questions about A/C features that were not answered on the site for them. So let's group those together now. Put that there, and this here.

Now we'll need to count them up. How many participants struggled with not having enough information on the product page? Let's find that number. We can do this by inserting a total column along with a total row. For each observation, we add one for each time it's referenced, and at the end, we create a summation row, like that. So we now know that three participants struggled with not enough information about A/C features.
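The total column and summation row reduce to a simple tally. A minimal sketch, with invented participant IDs and issue labels arranged so that three participants hit the A/C-features issue, matching the count above:

```python
from collections import Counter

# Invented rows: (participant, issue group), one row per observation,
# as produced by the grouping step above.
rows = [
    ("P1", "Not enough info about A/C features"),
    ("P2", "Not enough info about A/C features"),
    ("P4", "Not enough info about A/C features"),
    ("P2", "Unclear checkout button"),
]

# Tally observations per issue group -- the spreadsheet's total column.
totals = Counter(issue for _, issue in rows)
print(totals["Not enough info about A/C features"])  # prints 3
```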
Next, we'll log task completion rates and time on task. Let's click over to that tab right now. If your study had tasks with a clear completion goal, count how many participants got there. Here it is. If the time it takes to complete a task is important for your product, make sure to capture those numbers too. Here's where those would go. However, since this wasn't an important factor for the Amazon study, I simply put an N/A (not applicable), but you still have it in your template for reference.
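A completion rate is just a ratio of completions to attempts. A hedged sketch with a made-up task log; time on task stays `None` to mirror the N/A in this study:

```python
# Invented task log for one task. "seconds" stays None because time
# on task was N/A in this study, but the field is kept for reference.
results = [
    {"participant": "P1", "completed": True,  "seconds": None},
    {"participant": "P2", "completed": False, "seconds": None},
    {"participant": "P3", "completed": True,  "seconds": None},
]

completed = sum(r["completed"] for r in results)
rate = completed / len(results)
print(f"{completed}/{len(results)} completed ({rate:.0%})")  # 2/3 completed (67%)
```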
Next, we'll look at summarizing rating questions. In our Amazon study, we asked participants to rate the ease of use of the task on a scale of 1 to 10. We listed it right here. Averages, which work out to 7.5 in this case, are only a little helpful with the small sample sizes typical of usability testing. However, trends should be spotted and highlighted. Did most participants choose 8 to 10, or was it more like 1 to 3? It seems here that the response range was from 5 to 10, which is pretty much neutral to very, very easy. Those don't look like great scores to me. Too many people aren't sure.
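Summarizing a rating question comes down to the average plus the spread. The individual ratings below are invented, but they're chosen so the average (7.5) and the range (5 to 10) match the example above:

```python
# Invented ratings on the 1-10 ease-of-use scale; only the average
# (7.5) and range (5-10) come from the example in the video.
ratings = [5, 6, 7, 8, 9, 10]

avg = sum(ratings) / len(ratings)
lo, hi = min(ratings), max(ratings)
print(f"average={avg}, range={lo}-{hi}")  # average=7.5, range=5-10
```

Reporting the range alongside the average is what lets a reader spot the "too many people aren't sure" trend that the average alone hides.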
You will often find that some members of your team have a tendency to spot a usability finding that may actually be just an edge case, but they will want to prioritize the team's resources on it, possibly because it came up in the one usability session they attended. Get ahead of this tendency and rate your findings according to criticality, which is a combination of severity (how bad would things be if this happened?) and probability (what are the chances that this will actually happen?). Assign a criticality, or importance, rating to each of your findings. High, medium, or low will suffice.
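One way to make the severity-times-probability combination concrete is a tiny scoring function. The numeric scheme below (multiplying two 1-3 scores and bucketing the result) is an assumption for illustration, not something prescribed in the video:

```python
# Assumed scheme: severity and probability each rated 1 (low) to 3
# (high); their product is bucketed into the high/medium/low labels
# suggested above.
def criticality(severity: int, probability: int) -> str:
    score = severity * probability  # 1 (low/low) .. 9 (high/high)
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(criticality(3, 3))  # "high": severe and likely
print(criticality(1, 2))  # "low": minor and unlikely
```

However you score it, writing the rule down keeps the edge-case debate objective instead of depending on who attended which session.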
Now that you've isolated your findings, it's time to start writing up your report.