Usability Testing (63:33) with Dan Gorgone
In this live workshop, Dan Gorgone walks through the steps of usability testing for your website or app. He covers the benefits of testing; what to do before, during, and after tests; and strategies to improve your testing process over time.
Check out Dan's course, Usability Foundations, for more details on making websites and apps more usable.
For recording screens during tests, Dan recommends trying out an app like Camtasia.
Dan also refers to a pair of usability-related articles from Smashing Magazine: How Usability Testing Drastically Improved My Client’s App and Comprehensive Review Of Usability And User Experience Testing Tools.
For further reading on the subject, Dan also recommends checking out Steve Krug's official website and his books, Don't Make Me Think and Rocket Surgery Made Easy.
[Usability Testing with Dan Gorgone] 0:00 [Dan Gorgone @dangorgone] Hey everyone, I'm Dan Gorgone. 0:04 Thanks for joining me here at this workshop. 0:06 I'm going to present about usability testing 0:08 and do a little Q&A at the end. 0:10 I want to talk about the benefits and the purpose 0:12 of doing usability testing and the process, 0:16 what happens before you test, 0:18 what happens during a test, what will happen after a test 0:20 and then some ways you can take your testing 0:23 to the next level. 0:26 Now, a recent course I just published 0:28 is Usability Foundations. 0:32 It's something I believe in very strongly as a designer 0:34 of really anything, whether you're talking about websites or apps 0:39 or you're talking about light switches on the wall 0:43 or your car or anything. 0:46 If you want someone to use something that you're designing, 0:49 it's got to be usable, and if you want them to use it a lot, 0:55 it has to really work, and they have to be satisfied with the experience. 0:59 Part of what I talk about in Usability Foundations, 1:04 which is this new course you can check out, 1:07 is defining what usability is, how it factors into user experience, 1:09 but then getting into some of the details about how you can improve 1:15 the usability of your websites, your apps, 1:19 and use different things like testing and frameworks 1:22 and other stuff, especially the mobile angle, 1:25 because the mobile angle has certainly changed design as well as testing. 1:29 Many of you may be wondering 1:36 is usability testing really worth it? 1:39 And what is the purpose? 1:43 What are the benefits behind testing? 1:46 The first thing you might think is that we want to make sure 1:48 that the site or the app is working. 1:52 That's true. 1:55 That's a good thing that you want to know. 1:57 But a lot of people have this misconception 2:00 that testing is something you do right before you roll something out. 
2:03 Well, it is a good time to test certainly, 2:07 but the idea is you want to have done testing 2:12 throughout the entire process. 2:15 The way I think about it, I use this analogy. 2:17 If you're a runner and you're getting ready to run a race 2:19 the last thing you want to do 2:23 as the starter is counting down the seconds for you to start 2:26 is to check and see if your shoelaces are tied or not. 2:31 You should have done that. 2:33 You should have checked your equipment, checked everything you needed to do 2:35 before you run this race, and you need to be ready 2:38 when the launch is going to happen. 2:40 Right before you launch is not the only time 2:42 you should be doing testing, and in fact, 2:46 I think you shouldn't be doing testing at all then. 2:48 You should be concentrating on the launch. 2:50 That's why you should have it spread out throughout the design and development process, 2:52 from the very beginning. 2:55 By doing testing throughout that process 2:57 you will improve the design, and you will improve your development process 3:00 throughout the entire thing, and make things much more efficient. 3:04 By doing testing, you shouldn't have this fear 3:10 that you have to make sure that everything works 3:14 every time you do testing. 3:18 Testing is a great opportunity 3:20 to pinpoint specific features on a site or app, 3:22 and I'm going to use site and app pretty interchangeably here, 3:27 so whatever I say about sites you can apply to apps as well. 3:31 You can pinpoint these features 3:35 and make sure that the most important features 3:37 or parts of your site are working. 3:40 Think about it this way, and we'll get into it in more detail. 3:43 But think about it this way. 3:46 If specific parts of your site were not working, 3:48 what are the ones you could get away with not working, 3:52 and what are the ones that would absolutely cripple you? 
3:55 Making sure that the essential features 3:58 are working is part of what you should be doing with testing. 4:02 Another incredible benefit 4:08 of testing is gaining an outside perspective. 4:10 Design and development can be a very solitary sort of process. 4:15 Even when you're working with a team you're very insulated 4:23 from the outside world, even insulated from other departments 4:26 if you're in a company, especially if you're by yourself, 4:29 you're freelancing, things like that. 4:33 You're stuck in a cube where you're really buried in your laptop 4:35 and you're building stuff. 4:38 It happens—and it's only natural— 4:40 that you use your instincts as a designer, 4:44 and if you've been hired to do things, you've done lots of research, 4:47 you've had lots of training, you should know how to design things 4:50 because you can anticipate what other people are thinking 4:53 or will need when they use the thing that you're designing. 4:57 But gaining that outside perspective, once you've actually put something together, 5:02 is incredibly powerful. 5:07 And it's really something that you have to see to believe, 5:10 and you can trust me on this. 5:15 I've had the opportunity to test different sites before, 5:17 and when you see someone use something differently for the first time, 5:21 it's a real eye opener. 5:26 But it's an incredible opportunity 5:29 to learn something really valuable. 5:31 It can be a positive interaction 5:34 or a negative interaction. 5:36 Maybe they used something, and they were really happy. 5:38 They got a big smile on their face, and that's great. 5:41 It's really validating as a designer. 5:43 Or they could use something, and you see there's a real stumbling block there. 5:46 There's a mistake. 5:51 Well, this is also a positive for you 5:53 because you know what's broken, and you can fix it. 5:55 An outside perspective is very, very valuable. 
5:59 And another purpose behind testing is to ensure, 6:04 like I said at the beginning, what a lot of people think of first, 6:08 to ensure that your site or app is working correctly. 6:10 Now, this is a benefit that I see 6:15 as valuable after a launch. 6:19 You want to make sure that your site continues to work, 6:21 and you don't want to leave things to chance 6:24 or think that no news is good news. 6:27 If you haven't heard anything about a new section of the site 6:30 or a new feature that you've rolled out, you can't assume 6:34 that it's okay or that it's working correctly. 6:37 You do have to test these things. 6:40 You have to ensure that everything is working, 6:42 and then you have that peace of mind going forward. 6:45 Let's talk about the testing process. 6:50 Now, if you have a site or an app 6:53 that you are responsible for, 6:57 that you are building, you do want to make sure that you test, 6:59 and you want this to be part of the design and development process. 7:03 You need access to something that works. 7:09 Now, you can theoretically test a mockup 7:13 or a screenshot or something you rough up in Photoshop. 7:17 Even something on a cocktail napkin. 7:22 You can ask someone's opinion 7:24 about the things you design or a potential new design, 7:26 a fix for something. 7:30 Getting information from someone else, that outside perspective I talked about, 7:32 is incredibly valuable. 7:35 But to get the most information you can 7:38 and to get the most relevant and actionable information you can, 7:40 you have to have something that's working. 7:43 You have to have a site that is live 7:45 or in development, something on a staging server 7:47 or an app on a mobile device that you can give to someone 7:51 and they can start using it or manipulating it 7:55 or creating content on it, whatever it's used for. 7:59 You have to have something that works. 
8:02 You can deal in the generalities of giving people screenshots 8:05 and ask questions like "Where would you click?" 8:09 You would get an answer, and you can try to get some more information that way. 8:13 But to lead them through an entire process, 8:17 take them from screen to screen to screen to screen, 8:20 from beginning to finish, from the home page to the end of a shopping cart, 8:24 that's the spot where you get the most detail, 8:28 and you get the most information that will help you. 8:32 Determining the tasks to test. 8:36 This is vital for your testing process, 8:38 and it's something that I did talk about a little earlier. 8:41 What are those things that really drive your site? 8:45 What are the things that really tie into the metrics of success, if you will? 8:48 Revenue, leads that you generate from a contact form, 8:54 clients that you get from your portfolio site or your personal site, 9:01 sales or clicks or downloads, 9:06 whatever the metric is that is important to you, 9:09 there is a process, there is a feature, 9:12 there is a section of the site that is tied to it 9:14 so that when people click something or download something 9:17 or submit a form that's a success to you. 9:20 That is the process that you want to test. 9:25 Chances are, you have 1, 2, perhaps 3 of those things that are very important to you 9:28 and your company on the site, and that's fine. 9:35 If you had, say, 3 tasks that you wanted to test 9:39 at a particular time, that's a great number. 9:42 You don't want to test every single page on the site. 9:45 Maybe you have a small site, and it's possible. 9:49 But you have to think about how you can scale these tests out 9:51 and about how much capacity you have, 9:55 how many resources you have. 9:58 I'll talk more about administering the tests 10:00 and some of those numbers a little later. 10:03 But you want to stay focused here. 10:06 What are the most important things for your site? 
10:08 Focus on those so you can test. 10:11 Identifying personas. 10:15 I don't want to scare anyone off with the persona buzzword. 10:17 It really all comes down to this: who is going to use your site? 10:20 If it's a site that's online, 10:25 who uses your site right now? 10:27 Who are they? Who are these people? 10:30 If your site is about a particular type of content 10:32 it's clearly going to be people that like that content. 10:36 If you have a site about greatest movie villains of all time, 10:39 these are people that love movies. 10:43 They love entertainment. 10:46 If this is a site about learning how to code, these are people that are learning how to code 10:48 or refreshing their skills, kind of like teamtreehouse.com. 10:51 But figure out who they are based on their needs 10:56 and their interests, as well as demographic data if you have it. 11:01 So male or female, a particular age group. 11:06 Are they educated? 11:09 Do they live in a certain part of the world? 11:11 Do they live in a certain part of your town? 11:14 You can use all this information to your advantage, 11:19 and you want to target those people 11:21 who would generally use your site or that you want to use your site. 11:25 If you get some of those types of people 11:28 to test your site, you're getting the most insightful data that you can. 11:32 If you have people that wouldn't normally use your site testing it, 11:37 you can get some information, but it's not quite as valuable 11:43 as picking the right people. 11:47 But I will say this: testing with 1 person 11:50 is always better than testing with nobody. 11:52 Before you test, you want to locate a place in which to test. 11:56 This is really important. 12:00 You want to have a controlled environment 12:02 in which to test. 12:05 And what I mean by that is if you want to test in a public place, 12:07 you're going to have distractions. 
12:12 If you're at the local coffee shop 12:14 or on a bus or at a bus stop or out and about somewhere, 12:18 again, you can get information from people and helpful feedback. 12:22 But if you have a conference room at your office 12:27 or you have a closed off area or even your own house, 12:31 asking a friend or family or girlfriend or spouse or whatever 12:37 to take a look at a site and asking them questions 12:41 in the privacy of your own home or asking a colleague 12:45 in a conference room questions about things, 12:50 those are controlled spaces because you know you're not going to have 12:52 the distractions of the outside world. 12:56 You can control that space. 12:58 You can set up equipment to record. 13:00 You can set up your computer or mobile device, whatever you're testing on. 13:02 It is so much easier 13:05 and less stressful than trying to test in public 13:10 and having who knows what will happen 13:14 as you're trying to ask questions. 13:17 You want to get as much information as you can, 13:19 so you need to be focused, and the test subjects need to be focused as well. 13:21 So once you have an idea 13:25 about what you want to test, you have your site 13:27 or your app, you have some tasks in mind, 13:31 you know what type of people you want to test with, 13:34 the first thing you want to do is set up that test environment. 13:37 Now, that includes the physical space that I just talked about, 13:41 but it also includes the site or the app itself. 13:44 Make sure that you have access to it 13:49 when you're going to be ready to test. 13:53 And if we're talking about the day of the test, 13:55 well, make sure it works, 13:57 but be ready for it. 14:00 Think about these things the day before or the week before. 14:02 Make sure you have access, so if you're working in a company, for example, 14:05 and you have to reserve a conference room for an entire day, do that in advance. 14:09 Don't spring that upon people. 
14:14 If you are testing a development version of a website 14:17 and you want to have access to it, and you need to talk to some developers 14:22 or server guys or something to make sure you have access, 14:26 talk to those people. 14:29 Let them know about it. 14:31 It's important to also do this in advance. 14:33 You have to draft test users. 14:37 You have to identify those people 14:39 based on those personas we discussed, 14:42 the demographics, their interests, their needs, 14:45 things like that, and start targeting people. 14:48 See who you can find, 14:51 and I'll tell you about an example. 14:53 I'll tell you about a scenario that I went through. 14:56 I was testing the old version of the Treehouse site last year, 14:58 and I needed some test users, and we're here in Orlando. 15:02 But I didn't want to use anyone who was already here in the office 15:06 and knows a lot about our website. 15:11 I wanted to get people who were relatively new to it 15:14 or completely new to the site, 15:17 and I wanted to ask them questions about how easy it is to go through 15:20 and learn different things, were there any stumbling blocks, things like that. 15:26 Well, the way that I reached out to find people 15:30 was I went through our Meetup.com group. 15:33 Meetup.com, if you haven't been there, 15:36 is a site where you can set up events, and you can set up a schedule of different things 15:40 for organizations or local groups, things like that, 15:46 and invite them, and you can hold events, 15:50 and it has a nice schedule for you, RSVPs online, things like that. 15:52 I put out a call for test users for usability testing, 15:56 and I gave the instructions, and I said, 16:00 "I'm looking for people who are interested in web design 16:03 or development or making apps," which is what we do at teamtreehouse.com. 
16:08 "I'm looking for people who are not members of our site," 16:14 so I tried to get people who had not been there before or maybe visited once or twice. 16:21 "And can you be available on this date, 16:26 this particular time, or can you give me an hour of your time on this day?" 16:29 That's actually what I did, because I ended up getting 16:35 maybe 10 or 12 different people who responded positively. 16:38 They said, "Oh, yeah, I'm available, and I can be there, 16:44 and these are the hours that I can be available," 16:47 whether they're on a lunch break or it's the morning or the afternoon. 16:50 Based on that, I was able to get some information from them 16:54 by setting up a survey. 16:59 I went into surveymonkey.com, 17:01 a great online survey tool, which I hope many of you are familiar with. 17:03 It's surveymonkey.com. I set up a survey and asked them questions. 17:07 I asked them have they been to the site before. 17:11 I asked them what technologies are they interested in, 17:13 and I listed a number of things that we offer on the site 17:17 and a few things that we don't. 17:20 And then I asked what their availability was, 17:23 and then based on all of that information I was able to see 17:25 who could look at specific types of content 17:28 we had on the site, who would be available, 17:32 who would fill in the slots on my schedule here, 17:35 and I was able to schedule 6 people, 17:37 an hour each, 3 before lunch and 3 after, 17:40 and I was able to bring them all in and get some great information. 17:44 That's how I was able to find 17:48 people that met the criteria that I had set 17:51 based on who we would expect to use the site 17:56 and some other demographics as well. 18:01 That's one example of how you can draft people 18:06 who happen to be in the area and have the same interests 18:09 or same needs that you're offering with your site or app. 
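As an illustrative aside (not part of the talk): the scheduling step Dan describes, matching survey respondents to six one-hour slots based on their stated availability, can be sketched with a simple greedy assignment. All names, slots, and availabilities below are invented examples.

```python
# Sketch: assign each survey respondent the first open test slot they can make.
# Names, times, and availabilities are hypothetical illustration data.
slots = ["10:00", "11:00", "12:00", "14:00", "15:00", "16:00"]

respondents = [
    {"name": "Alice", "available": {"10:00", "14:00"}},
    {"name": "Bob",   "available": {"11:00", "15:00"}},
    {"name": "Carol", "available": {"12:00"}},
    {"name": "Dave",  "available": {"14:00", "16:00"}},
    {"name": "Erin",  "available": {"15:00", "16:00"}},
    {"name": "Frank", "available": {"16:00"}},
]

schedule = {}
for person in respondents:
    # Give this respondent the earliest slot that is both open and workable for them.
    for slot in slots:
        if slot not in schedule and slot in person["available"]:
            schedule[slot] = person["name"]
            break

for slot in slots:
    print(slot, "->", schedule.get(slot, "unfilled"))
```

Note that a greedy pass like this can leave slots unfilled when availabilities overlap awkwardly, which is one more reason to recruit more respondents than you have slots, as Dan did.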
18:12 A lot of the people who sign up for sites like that 18:15 are already interested. 18:21 If you give them an opportunity like that, 18:24 you'll find that there are plenty of people who would take you up on it. 18:26 Before you test, again, it's important not to insulate yourself 18:30 from the rest of the company. 18:35 Testing should be something that you share 18:37 with other people, especially if, say, you're a designer 18:40 and you have a development team over there, 18:44 or you could be on the marketing side, 18:46 online marketing, and you have the people that build the site over there. 18:50 You want to be able to talk with them, communicate to them 18:54 about the testing process, 18:57 because after you do testing, 18:59 you're going to have results for them. 19:02 You're going to have some great information to share, 19:04 and it's a little disconcerting when you're part of the team that works on the site 19:06 and makes this thing happen and you're passionate about it 19:13 and then someone comes to you from out of nowhere and says, 19:15 "All right, I've got a bunch of changes for the site, 19:18 and you're going to make them," or something like that. 19:20 Without being part of that process it can be difficult, 19:24 and this is more of an interpersonal, interdepartmental sort of diplomacy type deal, 19:28 but you can get some great insight from them as well 19:35 because there may be things they really want to test, 19:39 but they haven't had an opportunity, 19:42 or they may be very curious about what people think about a new section 19:44 or a potential redesign of something. 19:49 You may have some great stuff to test, 19:51 and giving that opportunity, since you're setting something up already, 19:54 they can be a great resource for you. 19:58 One of the last things before setting up your test 20:01 is to write up a testing script. 
20:04 Now, this is one of those things that I think some people might overlook, 20:08 because they have these ideas, they know what they want to test, 20:13 and maybe it's like a little checklist or something like that. 20:17 They have 3 questions they want to ask or something like that. 20:20 But if you have a script, a script can put you at ease. 20:23 It can put the test subject at ease. 20:28 It's very, very helpful. 20:31 A place where you can get a script to use is right here, 20:33 Steve Krug's "Don't Make Me Think." 20:37 This is a fantastic book. 20:40 It came out in 2000. 20:42 It is one of the industry-changing books, I'll say, 20:44 about usability, its importance to your site, 20:48 and how you can improve through testing. 20:51 Included in that book and also on Steve Krug's website 20:55 sensible.com, he has a number of resources you can download 21:01 including a test script, which is really a template you can fill in with 21:05 all the information you have. 21:09 Also a recording consent form. 21:12 We'll talk about recording in a minute. 21:14 He's got some other stuff on there too. 21:17 But the importance of the script is that as you're administering these tests 21:19 you've got people coming in from the outside, 21:26 you've got a testing environment, and you're trying to keep everything together, 21:29 and you've got a lot of things happening, and this is probably generally not what you'd do 21:35 from day to day, so you've already got lots to think about. 21:39 Relieve some of the stress 21:42 and have a script all written out for yourself 21:45 once you start going through the process. 21:49 And really, it's as simple as looking at it and reading it 21:52 and actually telling the person who came in, 21:56 "I'm going to read from the script, 21:58 "because I don't want to forget anything. 22:00 I'm going to go through point by point, and if you have any questions, let me know." 
22:03 But the script is the easiest way for you to organize your thoughts 22:06 before you even get to this craziness of doing tests, 22:09 and it can be very stressful for people. 22:12 But the script is a great crutch. 22:16 It's very helpful, and you can put everything you want in there, 22:19 and then you repeat the process with each new test user that comes in. 22:22 During your test, or I should say right before your test, 22:29 you're going to prep that controlled space. 22:33 You're going to get everything ready 22:35 from your laptop, your mobile device, 22:39 get a schedule together, 22:41 print out forms like recording consent forms, 22:44 print out your script that you've prepared, 22:47 and one other thing, which I know was on a slide earlier 22:49 when we talked about drafting users: 22:53 there's also the issue of compensating them. 22:56 A lot of people are more than happy to help, 22:58 they're just busy, and their time is worth something. 23:01 It really is worth something to them. 23:04 But their opinion is also worth something to you as well, 23:06 so compensate them for that. 23:09 A standard thing could be $50 or $100. 23:13 It could be a gift card. It could be cash. 23:16 If you happen to be right near a coffee shop, 23:18 maybe pick up a Starbucks gift card and say, "Here you go. 23:22 Thanks for your help," but some kind of compensation. 23:25 Maybe a free membership to the site that you're testing, 23:28 maybe a discount code or something like that. 23:31 Make sure you compensate them for their time, 23:35 because they'll appreciate it, and they'll also be more willing 23:37 to give up an hour of their life or however long it is 23:40 and spend it with you testing a site. 23:44 Now, a part of prepping that space 23:47 is getting all that stuff, whether you have to talk to a financial person 23:50 in your office or go to the bank and get some cash. 
23:55 Get all those things ready to go, 23:59 because then you can even have a folder 24:01 set up for each test user that comes in. 24:04 You just open the next folder. 24:06 Here's the script. Here's the thing you have to sign. 24:08 Here's that compensation. 24:10 Once we get all this stuff in there, 24:12 folders closed, move onto the next one. 24:14 It makes it nice and easy. 24:16 Start recording. 24:19 Oh, my goodness, 24:21 if you're doing a test and you don't record the test, 24:23 you're absolutely missing out on something. 24:27 Recording what happens on the screen, 24:30 recording the person's face and all that stuff 24:35 is very helpful after the fact. 24:39 Being able to go back and watch a replay 24:42 of what the test subject was doing, 24:45 what they were talking about, 24:49 where their mouse was going, their body language, 24:52 their face, all these different things, 24:55 and especially the things they say, 24:57 any mistakes they make, 24:59 all these things that happen during the test, 25:02 they can easily be missed because you're trying to administer this thing. 25:05 And again, lots of stuff is happening. 25:09 You've got paperwork. 25:11 You're trying to listen. You're trying to engage them. 25:13 Record. Record, and again, take some of that stress away from this whole process. 25:16 You can go back and watch a recording of the test as many times as you want, 25:23 and you can get some great insight that you missed the first time. 25:27 I'm going to give you an example of a testing setup that we had. 25:34 When we did the test of the teamtreehouse.com site last year, 25:37 I went into a conference room here in Orlando, 25:41 set up my laptop, 25:44 and I had my notes, and I had my folder, 25:47 my glass of water right next to me, 25:49 and sitting with me there is one of our friends that came in to test the site. 
25:53 He has a computer in front of him 25:58 where he's looking at the Treehouse site, 26:01 and you can see a screenshot of it there, 26:05 and I apologize if that's pixelated. 26:07 It's not what actually happened with the recording. 26:09 That's a screenshot of a screenshot of a video recording 26:11 of a pixelated something. 26:15 But what you can see there is that the website is there. 26:18 That's an old version of the Treehouse site. 26:22 And sort of picture in picture you have a web cam 26:24 recording of our test user. 26:29 Going back 1 slide, you might be able to see on my laptop 26:32 I actually have Camtasia loaded up on my laptop, 26:36 and that is the software I used to record that session. 26:41 Camtasia 2 is out now. 26:45 That's a piece of software that I recommend. 26:47 It is a great multi-purpose tool actually. 26:50 I think they probably developed it for user testing and screen recording, things like that, 26:53 but you can use it for recording yourself off of web cams 27:00 to do presentations. 27:04 You can use it for video editing and audio editing. 27:06 It's certainly no Final Cut Pro or anything like that, 27:10 but as an affordable piece of software 27:13 that can allow you to record the screen 27:15 during tests like this and have that web cam shot of the user so you can see them too, 27:19 for me it's a bargain, and like I said, it's a great multipurpose tool. 27:24 But that is what I used to record the test, 27:29 and I was able to go back and watch those recordings any time that I wanted. 27:34 And of course, make sure to leave yourself a note 27:42 to start recording. 27:44 It's a nice thing to build into your script, 27:48 and that's why I say use your script and take notes 27:50 throughout the entire process. 27:53 The script is there to help you. 27:55 Make whatever notes you want 27:57 inside the script. 27:59 You don't have to read every single thing there. 
28:01 If there's a big, giant note in there 28:03 in all caps and it's this big and it says, "Start recording," 28:05 that's a great reminder. 28:09 And then at the end you can say, "Stop recording" when it's done, 28:11 and you'll get everything that you wanted to get. 28:14 Try to take notes as much as you can 28:18 during the tests. 28:21 That's a great time to take notes. 28:23 It's also a great time after the fact 28:27 when you can go back and review those things 28:30 and watch replays of the sessions and be able to match up those notes 28:32 that you took live with maybe some new notes that you take 28:38 when you're watching a recording. 28:43 Taking notes is great. 28:45 However, you have to make this experience 28:47 as pleasing, as effortless, 28:50 as easy as you can for whoever comes in to visit you. 28:54 Make sure that you listen 28:58 to your test users. 29:01 Make sure that you ask them some questions about themselves, 29:03 and don't look at them merely as a cog in the machine or something. 29:06 They are real human beings. 29:13 This is one of the huge points 29:16 of testing: to get that other person's perspective. 29:18 They are there to help you, 29:21 and they're giving up their time to look at whatever site or app you've built. 29:24 Listen to them and engage with them. 29:29 Have a conversation with them. 29:32 Don't take up too much time. You're there for a purpose. 29:34 But make them feel at ease. 29:36 I know when we had a couple people come in for our test 29:39 1 or 2 of them were nervous 29:42 because they almost felt like they were the ones being tested, 29:44 and that's something you can tell them. 29:48 You can explicitly tell them, "Don't worry. 29:50 "You can't make any mistakes, because you are not the one that's being tested. 29:53 "The site is being tested, and you are here to give us the great benefit 29:57 of your opinion and feedback, and we really appreciate you being here." 
30:02 Tell them that, and what's more, 30:07 believe that, because they are a huge benefit to you. 30:11 Lastly, during your test this is something to prepare, of course, before. 30:16 But during your test, should anything come up 30:20 make sure you have a backup plan. 30:25 A classic example of that is when a test user doesn't show up, 30:27 and I said earlier that I had 6 people signed up, 30:30 reserved for times during our all-day test of teamtreehouse.com. 30:36 One of them did not show up. 30:42 One of them I think emailed me 30:44 and said they couldn't make it for whatever reason. 30:46 And that's fine, because I overcompensated beforehand. 30:50 I would be more than happy to have 3 to 5 test users come in, 30:56 and that number actually is based on research by Jakob Nielsen 31:01 from way, way back. 31:06 Having 3 to 5 people come in and test features on your website 31:08 statistically is about the perfect range 31:12 of people that you can have, 31:18 because less than that number, so 1 or 2 people, 31:20 you're going to find fewer things than 3, 4, or 5 people will, 31:26 and based on the data and research Jakob Nielsen did 31:30 once you start going past 5 people 31:33 the amount of effort that it takes 31:36 to run these tests no longer returns 31:40 the benefit of doing the testing. 31:44 It's the law of diminishing returns right there. 31:47 Once you get to 5 test users, 31:50 you will probably find as many things 31:54 as you are going to find. 31:57 If you have 10 test users, 20 test users, 31:59 you may find 1 or 2 more things, 32:02 but the big ticket items that you have to fix 32:05 you probably already found in the first 3, 4, 5 test users. 32:07 Don't make so much work for yourself. 32:11 Limit yourself to that, and you're going to have more than enough information 32:15 from those few people. 32:18 Trust me, I did. 32:20 Have a backup plan 32:23 for different things that may come up. 
32:26 One of the things that did come up during our test 32:29 was for some reason the wi-fi in our office 32:32 was slow that day, which really doesn't help 32:34 when you have a website that serves up these big videos, 32:38 instructional videos to teach you how to do HTML. 32:42 The video was kind of jumpy, and one of the things I had ready 32:46 was I had the videos I wanted them to watch downloaded on my laptop, 32:51 and so I tried to give them the perfect experience, 32:55 the classic experience that people would have on the site. 33:00 But when you see that it just isn't working, 33:03 and you can tell that people are getting a little antsy 33:05 and they're thinking, "Am I wasting my time? 33:10 Should I leave?", give them something that they can do. 33:14 Have that as part of your backup plan, 33:19 and that way the opportunity isn't wasted, 33:21 especially for you. 33:24 You brought them in there for a reason. 33:26 Get something from them, and if you don't use your backup plan, 33:28 save it for the next time, but that backup plan could be you have a version of the site on your laptop 33:33 or the computer where you're testing. 33:38 Potentially you could have screenshots of the different things, 33:40 mockups, things like that. 33:43 That could be part of your backup plan right there. 33:46 After you test, certainly thank these people for their time. 33:51 That's one thing. 33:56 Show that you value them. 33:58 But after your test, you will really see the true value 34:00 of all this effort that you've put into the testing. 34:04 Reviewing the recordings and your notes 34:08 will be your top priority, and it should be something that you do very quickly. 34:11 It doesn't have to be that day, especially if you have a full day of testing that you've done. 
34:17 But it should be at least that week or the very next day 34:21 where you start looking at everything, 34:25 because if you let it go for too long, 34:28 it's going to be difficult to get back into it, 34:31 and also things may change on the site 34:33 that are no longer applicable. 34:36 Get right to it. 34:38 Show some value for what you've produced there. 34:40 Produce a list of results. 34:44 By going through the recordings and the notes, 34:46 you will begin to see patterns. 34:49 You'll see positive patterns like I asked them to do this, 34:52 and they did it with no problem, 34:56 and that test user did it, 34:58 #2, #3, #4, they all did it successfully. 35:00 Maybe they did it slightly different ways, 35:03 but it all worked out, and it was perfect. 35:05 That's great. 35:08 Share those results. 35:10 Don't just share problems that you find 35:12 like there was this typo or people had problems 35:14 performing whatever task. 35:18 Don't just report the bad news. 35:20 People want to hear the good news as well, and besides, 35:22 good news makes bad news a little easier to take sometimes. 35:24 By producing a list of results based on the different tasks 35:29 you can easily organize that information, 35:33 and whoever is going to be looking at this information 35:38 will be able to see some patterns. 35:41 They'll be able to see where certain things broke down. 35:43 You can also break down the results by the test user. 35:46 You can look for patterns there as well 35:51 where one type of user who maybe had no experience with your site or content or features 35:55 or things like that, maybe they had consistent problems with each task. 36:00 Maybe someone who was more experienced had no problems with it. 36:05 Maybe that says something about your site or app. 36:08 These patterns could be based on anything. 36:11 It could be demographics. It could be age. 36:15 It could be experience. It could be interest in the content or the features. 
36:17 You just don't know until you look at the data. 36:21 Now, you can have very, very granular results. 36:24 I know I had a few pages of bullets and different notes 36:30 for the test that we ran with 5 users, 36:36 and I asked them to perform 36:39 2 or 3 different tasks. 36:43 I had plenty of data to share 36:45 with the design team here at Treehouse. 36:47 But I was also able to take those results 36:49 and translate them into the main ideas, 36:54 the main takeaways, and I remember I had 3 specific things that I knew had to be fixed. 36:59 I knew there was 1 particular page 37:06 that was confusing for them where they started. 37:08 There was 1 particular task where there was confusion about where to start, 37:11 but once they got started, it was fantastic. 37:18 I was able to look at all the different evidence that I had 37:20 and collate that data and present something 37:23 a little easier to digest 37:26 for whoever was involved, and then whoever was going to get directly involved, 37:29 get their hands dirty in fixing things, 37:35 could look at all that granular data 37:37 and see specifics about where things broke, 37:39 what link, what page, 37:42 what video, whatever. 37:44 Generating recommendations based on your results 37:49 is something that could be the difference 37:51 between your efforts actually turning into something or not. 37:55 Being able to take what you've done 38:06 and translate it into something actionable 38:08 really goes a huge, long way 38:12 toward fixing your site or fixing your app. 38:16 Too many times when different people have done tests 38:21 they've reported the results, 38:28 and then they've stopped there, 38:31 because maybe they think they're not qualified to come up with the fixes, 38:33 or they think that's all they have to generate, 38:37 and their work is done. 38:41 If you find things that are wrong with a site or an app, 38:43 being able to report those is good.
38:47 That's a good first step. 38:51 But suggesting a fix can be even better. 38:53 The fix may not be the right thing, and that's fine, 38:56 and this comes up in any professional or project environment. 39:00 Finding a problem, reporting it 39:05 and suggesting a fix can lead 39:08 whoever is fixing the problem to the correct fix. 39:12 But stopping at the results is a bad idea. 39:16 Make some recommendations, and don't be afraid to say, 39:21 "I don't know how to fix this, but this is clearly a problem." 39:25 Maybe that means you go online and you do some research 39:28 about other sites that have similar features 39:30 or content, and you see how they treat those things that are a problem on your site. 39:35 Maybe you do some research and you find some services, 39:41 some paid services that do those things for you. 39:45 Maybe you don't have to buy those services 39:48 and spend a lot of money. 39:50 Maybe you're able to build something in-house 39:52 and fix it for yourself, and that's great. 39:54 At least it's led to some kind of solution, 39:56 and that's what usability testing is about: taking us from our current state 39:59 to something that is better for everyone involved, 40:06 for the users and for yourself. 40:11 Another thing about recommendations, 40:14 if I can add this as well, 40:16 is that it helps to break down recommendations 40:18 sometimes into different categories. 40:21 You could have the absolute needs, 40:24 things that need to be fixed or changed 40:27 or edited somehow, and then you could have a wish list. 40:30 So the needs and the wants. 40:33 You can have a wish list. 40:35 You can say this particular part of the site works.
40:37 It's okay, but the test user mentioned that they saw something like this 40:42 on something something dot com, 40:48 and if you know they have a system 40:51 or a content management system or a service 40:54 or something like that, maybe you can make the recommendation 40:56 of maybe that's something we should consider, 40:59 investing in some type of technology 41:02 that makes our site better, that makes the experience better for the users. 41:06 You can have needs and wants. 41:10 You can also have short-term recommendations 41:12 and long-term recommendations. 41:14 This might actually be even better 41:16 if you're going to be sharing these things 41:18 with the people that work on the site, because you can give them the list 41:21 and say, "Here are those easy things, easy fixes, 41:24 "that I know you can knock out maybe in a day. 41:27 "And then here's a list of other things that are going to take a lot more time and effort, 41:31 maybe some more capital, some more resources, things like that." 41:35 That can be helpful rather than a big, long checklist of oh, my goodness, 41:39 your eyes start glazing over. 41:47 It's a lot to do. 41:49 Sharing results with the team. 41:51 That's what I was just talking about. 41:53 Don't sit on these results, 41:55 unless you're the one doing all the work, and that's fine. 41:57 Then you can take that and action that stuff, start fixing things. 42:00 That's great if you're a 1-man team or whatever. 42:04 But if you have other teams that do all the work, 42:08 hopefully you took my advice. 42:10 You talked to them beforehand. 42:12 Share those results. 42:14 You've already created the expectation. 42:16 You're saying, "I'm doing testing. I'm going to let you know how it goes," 42:18 or "You can be involved. 42:20 You can sit in," whatever. 
42:22 But share the results with the teams that would be interested in it, 42:24 and then the hope is they can start to see the value in this testing, 42:26 because one of the things that can be a struggle, 42:32 especially if you're working in a company environment, 42:36 is just getting the sign off, getting the approval 42:40 from your boss or your manager or your team or whoever 42:44 to do this testing in the first place, 42:48 because there's an expense involved. 42:50 There's time, especially compensation 42:52 if you're going to be compensating the test users. 42:55 You do have to sell this. 42:57 But there's no better way to sell this 43:01 than showing them the data from a successful test. 43:05 I know that's kind of backwards. 43:09 You have to do the test to get the data 43:13 to show that it's valuable. 43:15 You can do something under the radar. 43:17 You can do something cheap, for free, 43:19 a small scale example of testing 43:22 to show that results can be found. 43:24 If you had a little more money or you had access to other things, 43:28 other tools, other resources, you can get even more information, 43:32 and it could mean even more success for your site or app. 43:36 Lastly, don't lose steam. 43:42 If you have a successful test, you have people come in 43:45 that give you great insight, schedule that next set of tests. 43:48 Get it done. 43:52 If you test once a month 43:55 or even once every other month, 43:58 that can lead to a really great process internally 44:00 during design and development or even after the launch 44:08 leading up to further redesigns or rollouts of other features, things like that. 44:11 It can be such a great process 44:18 to have the regular expectation 44:22 of testing happen, because other people will get involved, 44:25 other people can participate. 44:29 Maybe other people can administer it. 
44:31 But any channels that you build 44:33 with, say, groups of people who might be interested 44:40 in being test users, you can have a whole group of people online 44:43 or locally in the area to help you out. 44:48 You can even reach out to some of those test users again 44:52 if they're especially helpful and bring them in 44:56 and ask them questions again. 44:58 But don't lose steam. 45:01 Schedule another set of tests, 45:05 and when you do tests a second time, a third time and so on, 45:07 you don't have to test the exact same thing every single time. 45:13 In fact, I wouldn't recommend that. 45:17 You might want to test the most important task again, 45:19 because that's always great to get insight on. 45:23 But there may be other parts of the site that could use some attention. 45:25 There may be some new features, 45:29 and there may be other things that maybe the design team 45:31 or development comes to you and says, "We'd really like to get this in front of some other people 45:33 and get their perspective on it," and so being able to get their insight on it 45:38 would be fantastic. 45:43 That's testing. 45:48 If you want to take testing to the next level, so to speak, 45:50 here's something that if people haven't thought about it, it definitely blows their minds. 45:52 Test competitor sites or apps. 45:57 It's okay to test other websites. 46:00 You can test your own site, 46:03 and that's great, and you can improve your site and everything. 46:06 But go out and look at your competitors, 46:08 see what they're doing, and run a usability test on them. 46:11 Get some other people's insight on them, 46:15 because you may be too close to the situation. 46:18 Your site does this, 46:22 and their site does almost the same thing. 46:24 You already know a lot about how certain things work. 46:26 Get another person's perspective on it. 
46:31 Get an outsider's perspective 46:33 and see what they think of it, 46:35 because again, just like your own site, 46:37 they may see things that you don't see 46:40 because their experience, what they bring from past experience using other sites, 46:42 sites that you may have never used before, 46:49 really the experience of trying to fulfill their own needs 46:52 as an outsider is going to be different than yours. 46:55 Test competitor sites. 46:58 It's okay. 47:02 You're not breaking any laws. 47:04 You're trying to get more information 47:06 and gain insight so you can build an even better site than them, 47:08 because you can guarantee that they will look at your site 47:13 and maybe test yours, and they will try to outdo you as well. 47:16 Some other things that you can do, and these are actually a couple things. 47:22 You can add some automated tools to your site, 47:26 or you can use some paid services, and sometimes they're both. 47:28 Automated tools are things like Google Analytics that are going to give you analytics, 47:32 but there are other tools you can add to your site 47:38 that can track user sessions. 47:41 They can give you a little bit more data 47:44 about what goes on in general 47:47 but also for specific user sessions. 47:51 There are apps and services out there 47:53 that can do that, and there are paid services that allow you to set up 47:56 usability tests that can give you even more detail, 48:00 and they can actually give you access 48:04 to data quicker than you setting up a whole session 48:06 and spending a whole day and spending a week generating results, things like that, 48:12 and that's fine. 48:15 And paid services are great, 48:17 especially ones like UserTesting, 48:19 IntuitionHQ, Verify, which is the service from Zurb. 48:22 And these are just a few of them. 48:27 These are the ones that I know of, and I know UserTesting. 48:29 I've spoken with the people from IntuitionHQ before.
48:32 They all offer different ways that you can test your site. 48:37 And they really vary, so it's important to check those out, 48:41 and if you want more information on paid services, 48:47 there's a great article from Smashing Magazine 48:50 that I actually shared on my Twitter feed, 48:54 @dangorgone on Twitter. 48:56 I just shared it before we went on the live stream here, 48:58 so there are a couple of articles there. 49:03 One about a lot of these different services, 49:05 a lot of comparison of what they offer, 49:07 so if you're looking for something specific, 49:10 definitely check that out, 49:13 and it's a couple years old, so people have been updating the comments 49:16 with newer stuff as well, so check that out as well. 49:21 There's another article about the benefits of usability testing, 49:24 and I believe it's an app called HelloSign 49:28 that recently did some usability testing, 49:31 and by doing the testing they were able to change the login screen, 49:35 the welcome screen of the app, and they made a huge difference 49:42 in the usability of the app. 49:45 You would think the most important stuff is inside the app, right? 49:48 But if you have something that's behind the login wall, 49:52 that is something you have to keep in mind when you're testing 49:55 is that people visiting your site for the first time 49:59 are not already going to be logged in. 50:02 They're going to come to the logged-out version of your site, 50:04 and they're going to see what that looks like 50:07 and try to deal with that and make their own idea 50:09 about whether they want to enter the site 50:14 and do some more stuff. 50:17 I'm going to wrap it up there. 50:22 But I know we've got some questions 50:24 that we're going to look at, but I do want to add that getting another person's perspective 50:28 by doing testing is something that is incredibly valuable. 50:34 If you haven't gotten that impression from me yet, 50:40 I'm going to say it again.
50:42 Asking other people to check out the work that you've done 50:44 is always an eye opener. 50:46 And whether you are validating the work that you do through testing, 50:51 like you're getting positive reinforcement and it's working exactly as you thought, 50:56 or you are discovering problems and mistakes 51:00 and things that you can fix, 51:05 either way it is not a waste of time. 51:07 It's not a waste of resources at all 51:10 because when your manager or your CEO 51:12 walks in and says, "How's the site doing?" 51:16 you know exactly what to say, 51:19 although you'd probably say, "Hey, it's doing fine" either way. 51:21 But if there are problems on the site, you know exactly what to fix. 51:24 If everything is working great, you can prove it with data. 51:28 I know a couple of the questions are about third-party software for doing testing. 51:35 I know I've recommended a couple of those things, but definitely check out 51:40 the Smashing Magazine article that I shared on Twitter. 51:43 Again, that's @dangorgone on Twitter. 51:47 You'll see the last couple things I posted. 51:49 Check out those articles, because those can be helpful. 51:51 Let's see. 51:55 If you're doing prototyping on paper, 51:58 what's the best way to test interaction? 52:01 This is a question from Fernando. 52:03 You can absolutely test when you have paper prototypes 52:06 or even something on a whiteboard, 52:10 but paper is even better, 52:12 whether it's something you mock up using a tool like Mockingbird 52:15 or whether you're actually drawing it out. 52:19 What I would try to do with those 52:22 is anticipate what you need to test 52:25 as you're going through the whole step-by-step process. 52:30 Try not to leave it at 1 screen and ask a question. 52:34 If it's possible, give people more of a real-life experience 52:38 by having the other pages or other designs 52:44 there ready to go. 
52:49 Whether it's a map that you have or whether you've got those things in a folder 52:51 and you say, "What would you click?" and then they say, "I would click on that," 52:56 and then you pull out another piece of paper and say, "Good," or you have it in a notebook 53:00 or something like that. 53:03 You can do it. You can get insight that way. 53:05 Certainly that's something that happens obviously before a launch. 53:08 But that can be a helpful way 53:14 I've actually found to get some insight about weighing designs against each other. 53:17 You can have 2 versions of a design 53:24 on a piece of paper printed out, and you put them side by side. 53:29 And you can say, "Check this out, look at this." 53:32 Is there a navigation menu style 53:35 that you think is better? 53:39 And then when they point to one, ask them why 53:41 and get that information from them. 53:44 Mohammed asked the best way to collect feedback from a large number of users. 53:49 What is the best way? 53:53 I would say one of those add-on tools 53:55 or one of the paid services that I mentioned. 54:00 They will allow you to get feedback from many more users 54:05 than you would in person. 54:10 Now, understand that those paid services are paid, 54:12 so they are going to cost money. 54:16 Some of them have steps to the cost, 54:18 so it could be free up to a certain amount of users 54:23 or a certain amount of tests or page views or something. 54:26 You have to look at each one and see 54:29 what you will actually pay for 54:32 and how much feedback you can actually get from each one. 54:34 Mouth asks are there any benefits to testing on a live server 54:42 versus a local setup? 54:46 Well, the live server will give you the true, real-life experience of testing. 54:49 You'll know that if something is on the live site 54:57 that it's guaranteed to be there, 55:01 that it's guaranteed that if a link doesn't work, 55:04 all right, that link definitely doesn't work. 55:07 We have to fix that. 
55:09 If it's a local setup, the thing that I would do, 55:11 and I would recommend this for any of the tests, 55:15 is I would go through every test you're going to do personally 55:17 to see if the pages that you're going to, the pages you would expect the user to go to, 55:21 are there, are working, all the images work. 55:27 You don't have to test every single link and everything, 55:30 but generally eyeball it and say I think they're going to go to the products page, 55:34 and you click to the products page, and you see that. 55:40 And I think they're going to go to this next. 55:42 Try to make sure those things are all working in advance, 55:44 because a local setup could be a great backup plan for sure. 55:48 And if it happens that all of a sudden your wi-fi goes down 55:52 or the site reboots for some reason 55:57 and you have to rely on a local setup, that's fine. 55:59 Be honest with the test user as well 56:03 and say, "Oh, looks like our Internet is down. 56:05 I'm going to use this local setup." 56:07 But it shouldn't be any different from what you expect, 56:09 and then watch them and observe them and see if there really is any difference, 56:12 because there shouldn't be if it's a local setup. 56:17 Shakara asks do you have any templates 56:23 you recommend for sharing results of user testing 56:26 with the rest of your team? 56:29 There are a number of templates online 56:32 from different places that you can find. 56:35 But really for me it varies. 56:37 I think you have to do testing at least 1 time 56:42 with whatever site or app you have 56:45 to see what you're going to get 56:48 and to see how that information is really going to shake out. 56:50 If you have 3 different tasks that you're going to administer 56:55 to 3 different people, then you know you're going to get basically 9 sets of results, 57:01 3 tasks, 3 people, 3 times 3 is 9. 57:06 Whether you want to set things up in a grid, you could do that.
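[Editor's note: the 3-tasks-by-3-users grid mentioned above could be sketched in code. The task names, user labels, and outcomes below are hypothetical, purely for illustration:]

```python
# Hypothetical results grid: 3 tasks x 3 test users = 9 results.
# Each cell records pass/fail for that task plus an observer note.
tasks = ["Find a course", "Start a video", "Update profile"]
users = ["User 1", "User 2", "User 3"]

results = {
    ("Find a course", "User 1"): ("pass", "no hesitation"),
    ("Find a course", "User 2"): ("pass", "used search instead of nav"),
    ("Find a course", "User 3"): ("fail", "confused by homepage layout"),
    ("Start a video", "User 1"): ("pass", ""),
    ("Start a video", "User 2"): ("pass", "video buffered on slow wi-fi"),
    ("Start a video", "User 3"): ("pass", ""),
    ("Update profile", "User 1"): ("fail", "couldn't find settings link"),
    ("Update profile", "User 2"): ("fail", "couldn't find settings link"),
    ("Update profile", "User 3"): ("pass", "took several minutes"),
}

# Summarize per task to surface the patterns across users.
for task in tasks:
    passes = sum(1 for u in users if results[(task, u)][0] == "pass")
    print(f"{task}: {passes}/{len(users)} users succeeded")
```

Organizing the raw notes this way makes the patterns jump out: a task that fails for most users (like the made-up "Update profile" row here) goes straight to the top of the priority list, while a clean row validates that part of the design.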
57:09 You could set it up bullet style. 57:13 You can do what I did where I had totally granular notes, 57:15 like everything they did and everything they said all in a Google doc. 57:20 But then at the top of that Google doc 57:23 was the summary, and I recommend whatever template you have 57:26 or are going to use, however you share these results, 57:31 at least have an executive type summary at the top, 57:35 because you want to be able to give people a brief, at a glance idea 57:39 of what happened in the tests, 57:44 how successful were people, is there some general sentiment 57:46 that you're getting from things, and what are the things that need to be addressed? 57:51 That's the other thing as well. 57:55 Mohammed asks once we have the results of our tests, 57:59 how do we prioritize which changes to make 58:02 based on the suggestions of our users? 58:06 This is a great question 58:08 and something that you have to gauge. 58:10 You want to get all the information you can from them 58:14 and all the feedback, and as you're going through the tests, 58:18 some of the test users might get a little self-confidence, 58:21 and they may start suggesting things that you should change. 58:26 It's not really based on anything in particular that's happening. 58:28 Like, "Oh, I see your logo is blue. 58:31 If it was green, I think it would be cool." 58:35 That's not really relevant here. 58:38 It doesn't help. 58:41 Make a note of it. You never know. 58:43 Something could come up where the design team sees that, 58:45 and they're like, "Oh, yeah. 58:47 We've been trying to change it to green forever." 58:49 You never know where some of this information will come up. 58:51 But keep all that stuff definitely in a granular version 58:53 of changes, but to prioritize, to get back to your question, 58:56 you want to look at those primary tasks that you tested first. 59:04 You have to know if those things are working or not, 59:07 and I'll give you a classic example. 
59:10 When I used to teach at Full Sail University 59:14 and I taught usability and testing 59:17 in my very first month where I had my students do testing on their sites 59:20 because they had their own sites and businesses and things, 59:25 this guy was part of a design agency, 59:29 and his most recent client was a sporting goods apparel company, 59:32 and so they had their site all set up and everything. 59:37 He goes through and of course identifies the primary task we're going to test here 59:40 is can people go through and find, say, a pair of sneakers, 59:45 some running shoes that they want, and can they go through the whole shopping cart process and buy it? 59:52 He even came up with a dummy credit card or something 59:56 that the system would accept. 1:00:02 That's not something everyone has the ability to do, 1:00:04 but he was able to do that, so he gave them the information, 1:00:06 so they were able to go through the whole process. 1:00:08 Well, as luck would have it, 1:00:11 he goes through, and the shopping cart was broken. 1:00:14 The shopping cart would not work at all. 1:00:18 You get through, you enter the information, 1:00:22 you hit submit, and then some kind of error came up. 1:00:24 And he was absolutely shocked. 1:00:28 He couldn't believe it. 1:00:31 Now, it was some minor thing, of course, 1:00:33 but when you're talking priorities, 1:00:37 this was the first thing that he identified, 1:00:39 the top thing that's important to the success of this site, 1:00:41 the success of the business, tested it first, 1:00:45 tested it for each user, found out that it failed. 1:00:48 That became priority #1. 1:00:51 Not just in the report that he submitted to me 1:00:53 but when he submitted that Sunday night, 1:00:56 the first thing he did the very next morning, Monday morning, 1:00:58 was he got together with his development team, 1:01:00 and he fixed that problem. 1:01:02 You better believe it that he did.
1:01:04 Some of these things are very obvious, 1:01:06 and they really mean the difference between success and failure for a site. 1:01:08 But the other way to prioritize is think about is it a short-term thing, 1:01:12 is it a long-term thing? 1:01:17 Figuring out whether it's needs or wants, 1:01:19 that can be a little more fuzzy. 1:01:23 But if you are ever in question 1:01:25 about whether something works or not 1:01:28 or whether something is a priority or not, 1:01:31 test it again. 1:01:33 Test it again and get another person's perspective on it, 1:01:35 and you may start to see a pattern emerge. 1:01:38 Sometimes 3 to 5 users is not enough 1:01:42 to determine whether something needs to go a certain way or not. 1:01:45 The design or the feature or the content, whatever it is. 1:01:50 Sometimes it does take more testing, 1:01:54 and that is just the way it is with testing. 1:01:56 It's something that you can AB test 1:01:59 if things are too close to one another. 1:02:04 If you have the possibility of AB testing 1:02:07 in design or a feature or something like that, 1:02:11 that can be another way, and whether you can do it yourself 1:02:13 or through one of the paid services, that can be a way to get some resolution 1:02:17 on that question. 1:02:21 And the last question here, and again, I thank everyone for sticking with me here. 1:02:23 Any plans for more UX content on the Treehouse site? 1:02:28 Absolutely, absolutely. 1:02:32 UX and UI I believe are a couple of the subjects 1:02:34 that I know we started at one point, 1:02:38 and we switched some gears, and like I said, 1:02:40 I've got Usability Foundations that I just put out, 1:02:43 and I've heard from a whole bunch of students about it, 1:02:48 and I'm very happy that they dig it. 
1:02:50 But UX and UI will definitely have that, 1:02:54 and one of the spots where we regularly announce new content for teamtreehouse.com 1:02:57 is on our Roadmap, so if you go to teamtreehouse.com/roadmap, 1:03:05 you can see exactly what's coming up, 1:03:09 and if there are any subjects that you're looking for, 1:03:12 definitely drop us a line. 1:03:18 I want to thank you for watching this workshop on usability testing. 1:03:20 If you have any questions, anything further, 1:03:23 be sure to find me in the Treehouse forum, 1:03:26 or you can find me on Twitter, @dangorgone. 1:03:29 Thanks for watching. 1:03:31