Psychology For Designers (37:01) with Joe Leech
How can an understanding of psychology make your designs better? In the spirit of FOWD, @mrjoe will make three predictions for the future of web design based on psychology. We'll also cover why Siri doesn't work very well and won't for a while, why, right now, we are designing like Sheldon from The Big Bang Theory, and how we'll be designing in five years' time.
So, thanks very much for your time today, everybody. Hello. 0:02 Today I'm going to talk about psychology for designers. 0:05 You gave me a great intro there. 0:08 So, I'm @mrjoe on Twitter. 0:10 Little show of hands: how many of you have bought, or booked, a hotel online before? 0:13 Hands up. 0:19 Okay. 0:20 Bought a train ticket online? 0:20 Hands up. 0:22 Yeah. 0:23 Bought anything on the leading online auction site before? Hands up. 0:24 Yeah. 0:27 All right, so you've used something I've designed over the years. 0:27 Every year, just under $10 billion goes through the stuff I've designed. 0:31 Which is a thoroughly scary proposition for me. 0:36 But it means I've got plenty of experience at designing stuff that's mission critical. 0:41 Now, a lot of what I do is use psychology in my design. 0:45 I work for a little design practice called CX Partners, in Bristol in the United Kingdom, 0:49 and I spend most of my time researching and designing. 0:53 And I wrote a book. 0:57 Anybody bought the book? 0:59 Yeah? 1:01 Anybody read the book? 1:02 Great. 1:04 It's only $3, so some of you have bought it, but nobody's read the damn thing. 1:07 So, I encourage you to do that. 1:10 And before you start thinking the talk is just the book: the book is different. 1:12 The book tells you how to go out there and find the psychology paper that solves the design problem you've got right here and now. 1:19 That's what the book does. 1:22 That's not what this talk is gonna be about. 1:23 This talk's gonna be about three things. 1:25 I'm gonna make three predictions today, cuz today is about the future of web design. 1:29 So I thought, right, well, why not? 1:30 Let's make three predictions for the future of web design. 1:32 And I love making predictions about the future because, you know, you're bound to get it absolutely, fundamentally wrong. 1:37 So, I'm gonna try today, and you'll see. 1:40 Maybe I'll get my predictions right. 1:43 Maybe I won't. 1:44 But, you know, time will tell. 1:45

Okay, so, let's look at the future. 1:46 Say we're in a movie theater. 1:49 One of the very first movies I remember seeing at the movie theater was this one, The Voyage Home, from Star Trek. 1:53 Anybody see [INAUDIBLE] Star Trek IV? 1:56 Anybody seen this? 1:57 Wonderful film, isn't it? 1:58 It's fantastic, lots of fun. 1:59 It's where they travel back from the 23rd century to an alien world we call New York, in 1986. Fantastic. 2:04 And there's this one scene that stuck with me, and it's stuck with me to this day, and it's this little scene here, hopefully. 2:12
>> Perhaps the professor could use your computer. 2:21
>> Please. 2:23 [NOISE]
>> Computer? 2:25 Computer? 2:30 Hello, computer. 2:36
>> Just use the keyboard. 2:40
>> The keyboard. How quaint. 2:41 [NOISE]
>> Do you get the idea? 2:49 If we're making future predictions, we should be talking to our computers right now, shouldn't we? 2:55 We should be. 2:58 Scotty did that when he thought he was back in 1986. 2:58 It's 2013; we should be doing this stuff. 3:01 And you know what, we do. 3:03 So, I now ride a bicycle, and my main interaction with my iPhone and my Android is through a headset. 3:09 Which means I have to talk to my damn computer. 3:13 Great. 3:16 Future's right here, yeah? 3:16 Here's the problem I have. 3:18 [MUSIC] 3:19 What's this playing, Siri? 3:21
>> [SOUND] Searching the web for "what's this playing, Siri."
3:25
>> [LAUGH]
>> You see the problem right there? 3:31 I'd like to be riding my bicycle, and I'd like to know what song's playing. 3:34 I'm like, what is this? 3:36 [INAUDIBLE] What's going on? 3:37 Why is Siri getting it wrong? 3:41 What Siri ends up doing there is searching the web for "what is playing, Siri." 3:44 Why does she not understand, or in this case he, because it's a British guy, why does he not understand a simple question like that? 3:49 I could ask anybody this and they'd completely understand what I was talking about. 3:56 What's wrong with this picture? 3:58 So, I thought, well, 3:59 I'm a psychologist at heart. 4:00 How can I use psychology to figure out what on Earth is going on with our friend Siri here? 4:04 But, you know, it does get things right if you persevere. You know, search for movie tickets: if you persevere, it'll do some good things for you. 4:11 What song is this playing, Siri? 4:17 [NOISE]
>> You are listening to "Things You Can Do" by Del tha Funky Homosapien and Dan the Automator. 4:23
>> [LAUGH] Yeah, computers are a bit quirky when it comes to voice recognition, but you get the idea. 4:31 If I phrase the sentence exactly right, Siri will get it. 4:34 It repeats it back to me, understanding exactly what's going on. 4:38 So why do I have to phrase things exactly right for something like Siri, when in a little conversation with my wife I can say, now, what's this playing on the stereo, and she gets it? What's going on here? 4:47

So I thought, well, okay, let's go back and have a look at psychology, and see what psychology we can find to figure out what's going on. 4:54 I went back and found a study by Mr. H. P. Grice, on logic in conversation. 4:59 He studied thousands of conversations and came up with some rules to understand how humans converse with each other. 5:03 He came up with these four rules. 5:06 He called them maxims. 5:08 So, if you know anything about psychology: psychologists and scientists tend to invent, or use, alternative names for things that nobody really understands. 5:12 Not rules; he's created maxims. 5:16 He created four of these things: 5:17 the maxim of quantity, the maxim of quality, of relevance, and of manner. 5:19 And these are four ways of assessing when a conversation's going well. 5:23 These things are all happening. 5:28 When a conversation breaks, somebody is breaking one of the rules of the conversation. 5:28 They're crazy things like: avoid obscurity of expression; be relevant. 5:32 Do not say what you believe to be false. 5:39 I mean, clearly, if you're lying, the conversation isn't going to go very well. 5:41 And he spent a lot of time putting these rules together. 5:44 So, I thought, well, great, fantastic: we have some rules for understanding human conversation. 5:47 Can we use this stuff in web design? 5:50 Turns out, actually, Mr. H. P. Grice had a bit of a stick up his ass. 5:54 He doesn't quite get how conversation really happens in the real world. 5:58 Here's a little comedy sketch; I apologize in advance. 6:03 This is very English. 6:06 Have a look at this. 6:09 You'll see that actually those rules Grice came up with just then don't hold up in real life. 6:12
>> Yes, well, I thought it might be a good idea to have a bit of a chat, because I remember from my own experience that it was when I was just, you know, coming up to 18 [CROSSTALK]. 6:27
>> This is a father having a conversation with his teenage son.
6:31
>> That I first began to take a serious interest in the, 6:33 [NOISE] in the opposite sex. 6:39 [LAUGH] Now I don't know, Roger, if you know anything about 6:44 the method whereby you came to be brought about. 6:51 [LAUGH] 6:55
>> Well, sir, some of the boys at school say very filthy things about it, sir. 6:56
>> [LAUGH] This is what I was worried about, and this is why I wanted to have a bit of a chat and explain, absolutely frankly 7:04 and openly, the method whereby you and everybody in this world came to be. 7:08
>> [LAUGH] 7:17
>> Roger, in order... 7:21
>> [LAUGH] 7:23
>> You get the idea. 7:25 We all know what they were talking about, don't we? 7:26 Yeah. 7:28 Those developers in the back: 7:29 sorry, guys, they're talking about sex. 7:31 Come and see me afterwards, I'll explain it all to you later on. Sorry. 7:34 Developer joke there. 7:37 The big problem with Grice's rules about conversation is that we break them all the time in our general conversations. 7:42 Things like humor come from breaking these rules. 7:45 Warmth, not being like a robot, stuff like that comes from breaking these rules. 7:48 If we stick to the rules when it comes to designing, we end up designing robots, or stuff that doesn't feel human, as you can see here. 7:57
>> [NOISE] Hey, Penny, how was work? 8:02
>> Great. 8:05 I hope I'm a waitress at the Cheesecake Factory my entire life. 8:06
>> [NOISE] Was that sarcasm? 8:09
>> No. 8:12
>> Was that sarcasm? 8:13 [LAUGH]
>> Yes. 8:14
>> Was that sarcasm? 8:16
>> Stop it. 8:19 [LAUGH]
>> And that's the difference. 8:20 If you start sticking to Grice's rules and designing computer systems that fit them, you end up being a bit like Sheldon from The Big Bang Theory; 8:28 you start to be a bit like a weird robot. 8:29 And it's funny in certain situations, but this is the big problem we have with computer technology right now. 8:34 You try to be human, and you can get rules to show you how to be human, but actually, in following those rules, you end up being more like a robot than you were in the first place. 8:43 So, this stuff doesn't work. 8:46 It's not quite there yet. 8:48 So, what do we do? 8:50 What do we do as designers? How do we get this stuff right? 8:51 Grice's rules, they're great. 8:55 They're great for one thing and one thing only, and that's designing forms. 8:57 You can see forms would be great for this sort of stuff here. 9:01 You know: avoid obscurity of expression; avoid ambiguity. 9:03 [INAUDIBLE] believe to be false. 9:08 This stuff works really well as a technique for analyzing, and developing, and designing forms. 9:12 So, a tip for you: if you want to design forms that look really impressive in front of your team and your clients, 9:17 use Grice's maxims. They work a treat for forms. 9:19

What else is going on in this question? 9:25 So I asked this question: what's this playing? 9:26 Was that sarcasm? was the question Sheldon asked. 9:31 I love you. Take it or leave it. These little words, "this", "that", "you", "it": do we know what these words are called in English? 9:38 Yeah, that's right, these are called pronouns, 9:42 and it turns out these are really easy for us to understand. 9:46 When I say "I love you," the other person understands what's going on. 9:51 Was that sarcasm? What's this playing? 9:56 That's quite a difficult thing for a computer to get right, because it can't reference the pronoun back to the thing we were originally talking about.
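An aside for the developers in the room: the mechanic that's missing is a short memory of recent referents that a pronoun can be resolved against, instead of the sentence being searched verbatim. Here's a toy sketch in TypeScript, every name hypothetical, of that one idea:

```typescript
// A toy model (all names hypothetical) of resolving "What's this playing?"
// The system keeps a short memory of recent referents, so a pronoun can be
// matched against what was just talked about rather than searched verbatim.

type Referent = { kind: "person" | "thing"; value: string };

class DialogueContext {
  private recent: Referent[] = [];

  // Remember whatever just came up in the conversation (a song, a person...).
  remember(referent: Referent): void {
    this.recent.unshift(referent);
    this.recent = this.recent.slice(0, 5); // short memory: the last few turns
  }

  // "he"/"she" want the most recent person; "this"/"it" take the last thing.
  resolve(pronoun: string): string | undefined {
    const wantsPerson = pronoun === "he" || pronoun === "she";
    return this.recent.find(r => !wantsPerson || r.kind === "person")?.value;
  }
}

// The phone knows what it is playing, so it can register the referent itself:
const ctx = new DialogueContext();
ctx.remember({ kind: "thing", value: "Things You Can Do by Deltron 3030" });

// "What's this playing?" now resolves "this" instead of web-searching the words.
console.log(ctx.resolve("this")); // "Things You Can Do by Deltron 3030"
```

A real assistant needs far more than a five-item list, of course; the point is only that pronouns demand some shared, remembered context, which is exactly what the talk keeps running into.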
9:59 It turns out computers really struggle to do this stuff. 10:04 Us, though, as humans, we have our brains. 10:09 Here's a good example of a brain right here. 10:11 Our brains developed to be as big as this. 10:13 Well, not quite this big, but they're big because we're constantly doing this stuff. 10:17 We're constantly referencing the past, the present, and the context of everything else around us. 10:24 We're analyzing everything that's going on through our big brains. 10:26 Our brains are designed to analyze this stuff. 10:30 We're really good at knowing, when we talk about this and that, what it refers to. 10:35 So, a typical human conversation, same thing. 10:38 What's this playing? 10:40 If we don't understand what "this" means, what do we do? 10:42 We ask the other person, because we don't know what it means. 10:44 Computers don't do that very well. 10:48 On the iPod? It's Del tha Funky Homosapien. 10:50 Well, actually, it's Deltron 3030. 10:53 That's the exact answer you're gonna get back from a human. 10:54 Humans are great at this stuff. 10:57 We know this stuff without even thinking about it, because we're built to do it. 10:58 Anybody have the new Deltron album? 11:03 It's great; they're playing Brooklyn tomorrow night. 11:04 They're amazing. 11:08 Well worth it. 11:09 Anyway, us humans are good at this. 11:10 If we don't know the answer to a question, 11:12 we ask somebody what's going on. 11:14 Another developer tip: 11:17 if you don't know the answer to a question, don't Google it. 11:18 Just ask somebody else what's going on; you get a much better response doing it that way. 11:22 But yeah, we're good at this stuff as humans. 11:24 And what happens is, when we understand [INAUDIBLE], 11:27 we have a shared understanding of what "this" means; we both know what "this" is referring to. 11:31 And this shared understanding is incredibly important when it comes to designing stuff that communicates. 11:38 We need to have a shared understanding about what's going on. 11:40 I need to know what "this" is, and my wife needs to know what "this" is, so we can have a shared understanding. 11:45 If we don't know what "this" means, that's when confusion arises. 11:46

We're starting to do this. Here's a good example that Jeremy did. 11:51 This is a sign-up form, another form, and it's designed like a conversation. 11:54 And it's great; it's starting to play on the idea that us humans like a bit of context and conversation. 11:59 Stuff needs to flow, rather than fill-in-the-field-under-the-label. 12:03 This is a bit more natural as to what's going on. 12:06 But it still falls down in the same ways. 12:08 So, when we have a problem, we still get this "computer says no" response. 12:10 I want my username to be "Please fill out this field"? 12:16 It's not a particularly human response to what's going on. 12:20 So, for our computers to work for us and to understand us, we can't be designing rubbish like this. Sorry, Jeremy. 12:25 The form is really good, but the error messages need some work. 12:29 And that's because we don't spend any time writing error messages. 12:31 We just don't do it in our job. 12:34 We just rely on the defaults, in this case the browser's, so this is not quite Jeremy's fault: 12:38 the browser's built-in message isn't right for this sort of stuff. (There's a sketch of a fix just below.) 12:40 But things are changing! 12:44 Any Googlers in the house? 12:46 Yeah, feeling the Googlers' love today! 12:48 You guys are great!
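As promised, the sketch: a minimal example, with a hypothetical #username field, of writing that error copy yourself instead of shipping the browser default. setCustomValidity and the validity object are standard HTML5 form APIs; the conversational wording is the part we never budget time for.

```typescript
// A sketch of replacing "Please fill out this field" with copy that reads
// like a person wrote it. The #username selector is hypothetical; the
// setCustomValidity / validity APIs are standard HTML5 form validation.

const username = document.querySelector<HTMLInputElement>("#username");

if (username) {
  username.addEventListener("invalid", () => {
    if (username.validity.valueMissing) {
      // Clear, relevant, and specific: Grice's maxims applied to an error.
      username.setCustomValidity(
        "We still need a username for you. Anything you like, we're not fussy."
      );
    }
  });

  // Clear the custom message as soon as the user types; otherwise the field
  // stays permanently invalid even once it has a value in it.
  username.addEventListener("input", () => username.setCustomValidity(""));
}
```

And Grice's maxims make a decent checklist for the copy itself: is it truthful, relevant, unambiguous, and no longer than it needs to be?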
12:50 You've done it, you Googlers. 12:52 So: who is the manager of Manchester United? 12:52 This is Google Now. 12:55 There's this thing where you can ask it in natural speech and it answers the question. 12:56 Manchester United are a football team, and I mean football, the ball you kick with your foot, not the one you throw around that's shaped like an egg. 13:03 Soccer. 13:06 You can ask these questions of Google Now. 13:07 But what's brilliant with Google Now is you can then ask the question: 13:09 how old is he? 13:13 Great, you get a great response. 13:15 Now, I did this slide in Manchester a few weeks ago. 13:16 Alex Ferguson isn't the manager of Manchester United any more; it's somebody else. 13:21 Wikipedia hadn't been updated, 13:23 so it wasn't perfect. 13:25 But it did the really difficult thing, which is telling me how old Alex Ferguson was, 13:29 rather than telling me that, in fact, it's not Alex Ferguson who's the manager of Manchester United. 13:32 But you get the idea. 13:35 The whole "how old is he" thing is difficult. 13:36 You've got to remember; you've got to have a memory of who "he" is, and "he" relates back to the last search we've done. 13:41 So well done, Google, for doing this stuff. 13:44 Brilliant. 13:46 This is hard. 13:46 But again, it's a bit more natural. 13:48 It's back again to how we communicate when we communicate with other people. 13:49

All right, so: 13:54 future prediction number one. 13:55 You'll wanna write this down, this is really important stuff, guys, yeah? 13:56 Designing like conversation. 14:01 Okay? 14:03 Good, you are writing it down, that's wonderful. 14:03 And that means three or more interactions going back and forth: 14:06 send an interaction out, get one back, and send one out again. 14:10 Forms are the closest we're getting to this. 14:13 When you write something in a form, it tells you whether it's right or not, but that's not like a conversation: 14:17 three things going backwards and forwards. 14:20 We need to have conversations with our computers a lot more. 14:22 Maybe it feels a bit awkward, because having a conversation with a computer makes you feel a little bit stupid. 14:28 But actually, this is going to be the future of what we do. 14:30 Computers are going to be asking us questions, checking they understand, researching what we want. 14:34 The future's going to be based around conversation. 14:36

Okay, prediction number two. Another story. 14:40 This wasn't a good day. 14:43 So, I had my Christmas party at CX Partners last year, on a Thursday night; had a wonderful time, got quite drunk. 14:47 Ordered a taxi. 14:52 This is before Christmas; it's 11 o'clock on a Thursday night. 14:53 Phoned up the taxi company and said, hey, can we have a taxi, please, for five people? 14:58 Come and pick us up at this restaurant, take us home. 15:00 They're like, no problem, we'll be with you in ten minutes. 15:01 We'll call you when the taxi's outside. 15:06 I'm like, fantastic, great. 15:07 So, I put my phone down 15:08 on the table there in front of me. What happens? 15:10 I order another drink, expecting this call. The call doesn't come. 15:16 The phone's there, and I'm thinking, well, it hasn't rung, it hasn't vibrated, it hasn't lit up, nothing's happened. 15:21 Ten minutes later, nothing's happened; ten minutes later, nothing's happened. 15:23 I eventually think, I'll have a look at my phone. I've got five missed calls
15:26 from the taxi outside. The taxi has now left, it's now midnight, it's snowing, it's cold, and I'm gonna have to walk home. What went wrong? 15:35 It turns out it's this thing: Do Not Disturb. Anybody use this? 15:40 Yeah. It's designed for when you go to sleep at night, because all those interactions, 15:49 all the notifications that happen while you sleep, 15:50 won't wake you up; your phone effectively goes to sleep. 15:52 So, I had this on that Thursday night. 15:55 It happened to be Christmas time, so I wasn't tucked up in bed at 11 o'clock. 16:00 I was actually out, needing this call to come through, but it didn't, because Do Not Disturb was on, and so I missed the call. 16:07 So, what did I do immediately after this? 16:12 I turned Do Not Disturb off, and I haven't turned it on since. 16:13 Now, there are settings where, if you do something to the phone, it works, but it's fiddly. 16:18 And this is the big problem; Joe talked about it earlier on. 16:20 Context is really difficult, okay? 16:24 How was my device to know I was at a Christmas party? 16:29 I explicitly told it: don't interrupt me after 11 o'clock at night. 16:35 And it didn't interrupt me, just like I told it to, but 16:37 I needed that interruption to come through from that taxi company. 16:39 And it didn't, and it was frustrating. 16:42 There's nothing worse than when technology fails because of this. (I'll sketch what it should have done in a moment.) 16:44

So, what's going on? 16:47 Us as humans are good at this stuff. 16:49 I mean, yeah, this stuff's been around for years; Joe talked a lot about that [INAUDIBLE] Johnson. 16:53 This is auto profiles on a Nokia Series 60. Do you remember that, 16:56 when things looked like this? 16:59 Looks a little bit like flat design to me. 17:02 Anyway, 17:04 this was around; this came out in 2000, I think, and I've still got my phone. What it does is read your calendar 17:09 and adjust the sounds and vibrations on your phone, turning the sounds off. 17:11 So, if you're in a meeting, it turns your phone off for you. 17:16 So it's on silent. Great. You know, if you're asleep, it turns off. 17:18 Except, of course, for that problem where you miss the damn call you're expecting, and you delete the app off your phone. 17:23 But its intention was great: it reads your calendar, spots you're in a meeting, does something clever with your phone. 17:31 You can see a use for it, until it goes wrong. 17:33 And it went wrong in a situation that was very important in my life. 17:35 This damn thing, okay? 17:41

There are other things as well. 17:43 This is called geofencing: 17:44 knowing where you are, in terms of context, 17:47 and updating phone features based on where you are. 17:50 Great, you know, fantastic. 17:53 For my location in here: I need to pick up my inflatable brain. 17:54 And as you can see, I have my inflatable brain right here. 17:57 So context, it turns out, is great. 18:01 There's loads of information out there [INAUDIBLE]. 18:04 There are loads of models of this stuff out there right now, and these models have been around for a long time. 18:12 This is Andrew Heaton, Google, who we talked about earlier on. 18:13 There's SoLoMo. There are loads of models of context: 18:16 all of the things you've got to understand to model your context, such as location, mobile device, social situation. All of this stuff. Great.
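To make the taxi failure concrete: here is a sketch, with hypothetical rules and a made-up number, of what Do Not Disturb would have needed to get that night right. The phone had the raw context (it was after 11 o'clock); what it lacked was relevance (I had just booked a taxi and was expecting exactly this call).

```typescript
// A hypothetical Do Not Disturb that lets relevance override raw context:
// a call we're expecting rings through even inside quiet hours.

type Call = { from: string; at: Date };

interface DisturbContext {
  quietStart: number;           // hour quiet time begins, e.g. 23 (11pm)
  quietEnd: number;             // hour it ends, e.g. 7 (7am)
  expectedCallers: Set<string>; // numbers we're currently waiting on
}

function shouldRing(call: Call, ctx: DisturbContext): boolean {
  const h = call.at.getHours();
  // Quiet hours usually wrap past midnight (23:00 through 07:00).
  const inQuietHours =
    ctx.quietStart > ctx.quietEnd
      ? h >= ctx.quietStart || h < ctx.quietEnd
      : h >= ctx.quietStart && h < ctx.quietEnd;

  // Relevance beats context: an expected call always gets through.
  if (ctx.expectedCallers.has(call.from)) return true;

  return !inQuietHours;
}

// If booking the taxi had registered the dispatcher's (made-up) number as
// expected, the midnight call would have rung and I'd have got my ride home:
const tonight: DisturbContext = {
  quietStart: 23,
  quietEnd: 7,
  expectedCallers: new Set(["+44 117 496 0000"]), // hypothetical number
};
console.log(shouldRing({ from: "+44 117 496 0000", at: new Date() }, tonight)); // true
```

That expectedCallers set is the hard part, of course: it is exactly the relevance model the talk comes back to in prediction number two.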
18:25 And, you know, devices can gather this stuff. 18:27 But there's a big problem. Us as humans can do this as well: 18:29 we can collect loads of information about our environment. We're looking, we're listening, we're feeling the vibrations from the movie that's on down below. 18:35 There's loads of stuff going into our heads 18:38 that tells us about the context we're in right now. 18:40 You know, computers can do that as well. 18:43 There have been models, way back in 1999, about how to measure context, but it's not here yet. 18:48 This stuff still breaks all the time. 18:50 Why is that? 18:55 I ask myself, why is this? 18:56 What can psychology tell us about why models of context keep breaking, and stuff keeps going wrong in my little life, and I miss taxis? 19:02

So, this is the control for our oven at home. 19:07 And my flatmate, she's great, my flatmate. 19:11 So, this is in Celsius; before you start thinking we cook things at really cool temperatures in the UK, no, this is Celsius. 19:18 [LAUGH] 19:22 When she comes home, my flatmate [INAUDIBLE], 19:23 she wants to heat the oven up. 19:26 She turns it right up to 250 degrees, even though, you know, maybe she's making a pizza and it needs to be at 180 degrees. 19:32 She turns it right up to 250 degrees 19:35 and leaves it. And I sit there thinking, and I say to her, you know: look, it's a thermostat, it doesn't work like that. 19:40 All a thermostat does is stop the heat when it hits a particular temperature. 19:45 And she's like, no, Joe, you're wrong, it definitely heats up quicker if you turn it to 250 degrees first and then turn it down later. 19:52 And I'm like, no, that's not how it works, and then we have this conversation about it. 19:55 Yeah, and you know, it makes sense. 19:59 It does make sense. 20:01 You can see her logic right there and then. 20:02 You've got another control on an oven, and this one is for a gas oven in the UK. 20:06 You turn it up to a higher setting, it's a higher temperature. 20:11 You turn it lower, it's a lower temperature. 20:14 These controls are both on the same device; they're almost identical. 20:17 They're often next to each other. 20:20 It's not her fault she made the mistake, it's not. 20:22 I just happen to know how thermostats work, which shows I'm a geek. 20:25 She doesn't need to know about these things. 20:28

And this stuff is called mental models. 20:31 All of our users, on everything that we do, 20:36 on all of our websites, build up a mental model of how the stuff that we build works. 20:37 Okay, sometimes they have the correct one, like the one on the right, and sometimes they have the incorrect one, like the one on the left. 20:43 But it's up to us as designers to make sure that our users have the correct mental model for how our stuff works, or it's going to fail. 20:49 This is the number one biggest reason why designs fail: users have an incorrect mental model [INAUDIBLE] of how your device or service works. 21:00 And in the case of the oven, it's not mission critical if something goes wrong, but it could be. 21:04 I see this all the time. 21:09 Here's another good example: 21:10 form fields. 21:11 I love form fields. 21:13 I write a lot about form fields; I'm a bit geeky about this stuff. 21:14 So, if you're interested in form fields, go and check out some of my stuff online.
21:17 Anyway, I see this in user testing maybe once a month, once every six weeks, let's say. 21:19 "I didn't see them": the asterisks. There's nothing that explains what they mean. 21:24 So, this lady in the test didn't know what the asterisk meant. 21:28 She said to me, what does that little red thing mean? And I said, oh, that means you have to fill in that form field. 21:33 She's like, okay, I get it, I see; nobody explained that to me before. 21:36 When you think about it, this is a really rubbish design pattern. 21:40 It's terrible. 21:43 You can see where it came from: 21:44 laziness in the early days of the internet, 21:45 when we were doing this stuff and thought, let's just put an asterisk next to each one to tell people what to do. 21:49 It doesn't wear very well, and as Joe showed earlier on, 21:53 this stuff isn't accessible either. All right? It's just a bad idea, so just kind of stop if you're doing it. 21:56 But where did this come from? 21:59 It came from this stuff, which is a printed form, okay? 22:01 Now, this lady here is probably in her 40s, so she grew up 22:06 with a primary mental model of how forms work built from the bits of paper that the IRS mails us, or in this case the DVLA, which is the driver and vehicle licensing agency in the UK. 22:17 She grew up with this model in her head of how forms work. 22:20 And what happens with paper forms? If you don't have to fill something in, it says optional. 22:25 At the bottom it says: if any [INAUDIBLE] have changed since your last one, please give the previous details below. 22:30 So it's telling you, if something's changed, do something about it. 22:33 Not this weird red asterisk that we all understand as code, but that ordinary people don't. 22:36 So her mental model of this form was broken, and that wasn't her fault; it was our fault, 22:45 us designers, for using these asterisks. Now, this is changing, you know. 22:47 Generation Y, as we saw earlier on from Charles' talk, are growing up. 22:51 They're used to this stuff, they grew up with this stuff, so they get it. But it doesn't mean all users are the same. 22:56 So, we have to predict and understand what our users' mental models of the world are, and design to them. 23:01 If their mental model is this one, we need to mark optional fields, 23:04 maybe be a bit more friendly. 23:08 If it's this one, it's fine. 23:09 Okay? 23:10

So, let's have a look at this. Back to context again. 23:12 How does context work in terms of mental models? 23:15 So, this is me. 23:17 I need to speak to my wife; our littlest is sick and at home. 23:19 I've got a problem here, okay? And this is back to the problem I had with the context thing earlier, 23:24 my phone not alerting me in that meeting. 23:26 So, I needed to speak to my wife; it was quite an urgent thing. 23:29 Okay: it is 10:48 on a Monday. Great, useful piece of information; that's during the weekly ops meeting that runs from 10 a.m. to 11 a.m., so I know my wife has a meeting between 10 and 11 a.m. every Monday morning. 23:40 I know that's the case, so I've got some calendar information. 23:43 I know from Find My iPhone that she's in the office. Anybody use this to understand where their partner is? 23:49 Not creepy; don't worry, my wife's here, she's comfortable with it. 23:53 We do this to understand where the other person is; it's quite useful too.
23:56 You know, on the way home, she's like, you're near the supermarket, can you pick me up some milk? 24:00 Useful stuff. So I'm not being creepy, honestly, don't worry. 24:01 Okay. 24:07 So, I know where she is, I know what time it is, and I know what she's doing at work. 24:07 So, what do I do? 24:14 When this happened last time and I didn't call, she was worried, 24:16 because I hadn't called her to tell her about this important thing. 24:18 Okay, understandable. 24:22 So, I've got a bit of history, a bit of context from the past again. Context, people. 24:24 Computers ain't very good at past mistakes informing future ways of doing things, 24:27 exactly where they ought to be better at that stuff. 24:31 But the past is always a context. 24:33 You hear about geo, you hear about calendar; 24:35 the past is a much bigger context than many other things. 24:38 It's urgent. 24:43 All right, I know it's urgent, I know what to do. 24:43 The best thing I can do: it's best that I call. 24:46 So again, this thing here is designed to understand context. 24:48 And it's designed to understand what to do in a particular piece of context. 24:56 We're very, very good at doing this stuff. 24:58 Oh God, there's a lion in the bushes over there, there's a lake there. 25:00 Where do I run? I don't want to get caught between here and the lake. 25:04 We're very good at predicting situations, building up mental models of what to do in a given situation. 25:09 We're built for this stuff. 25:11

So, context: any fool, and any computer system, can gather context. 25:15 The biggest problem is understanding relevancy within context; that's the thing that breaks. 25:23 So, if you do anything to do with context dependence, 25:26 take this one thing away: it's not a context problem, it's a relevancy problem that we have right now. 25:30 All right? There is loads of data out there, huge amounts of it. 25:34 We need to know which data is relevant. 25:36 So, back to the [INAUDIBLE] again. Can you see that? 25:38 Can you see the face? 25:42 It doesn't really exist. 25:44 Us humans are good at pattern recognition; this big pink thing here is good at spotting 25:49 the important and the relevant within a sea of almost seemingly random noise. We're good at this stuff; we're built for it. 25:56 So, prediction number two, let me get there in a second, is: we're gonna be designing with mental models of relevance, not mental models of context. 26:06 Mental models of relevance. 26:12 We've got to understand what the relevant information is here, 26:13 not the 50 other sources of location or calendar, or this, that, and the other. 26:15 It's the relevant piece of information that's important at this point. 26:20 'Kay? 26:22 Not context; it's relevance that's important. That's prediction number two. 26:24

[SOUND] So, prediction number three. You're almost there. 26:27 How am I doing for time? 26:31
>> Ten minutes. 26:32
>> Ten minutes. 26:34 We've got loads of time to talk about this stuff. 26:34 Great, okay. 26:36 So, prediction number three. 26:36 This here is the human brain, 26:40 my specimen we've got here. 26:43 I'm gonna do a little bit of neuroscience. 26:44 You mentioned I'd studied neuroscience. 26:46 I'm gonna do a brief primer for you in neuroscience, 26:47 the very most basic thing you can know: 26:50 the three bits of the brain that we can design for. Okay? First, the instinct bit, which is about keeping you alive.
This is the stuff that tells you you've got to eat, you've got to drink, and you've got to have sex. 27:01 Right? 27:05 Really simple stuff. 27:06 This stuff is the stuff that drives you. 27:07 Okay. 27:09 It's also got stuff in it around keeping you alive, but on the whole it's eating, drinking, and reproducing. 27:13 Those are the three things that you can design towards, and we've seen that designers of the male persuasion are very good at putting a large, attractive pair of women's you-know-whats on the page, and your eye gets drawn towards that. 27:28 The 800 user tests that you mentioned I do every year: we do a lot of that with eye tracking. 27:32 One of the biggest things I see in eye tracking, all the time, is that if you've got a photo of an attractive woman on the screen, men's eyes go directly to it, and it's no surprise, I know, yeah. 27:42 Clearly, the same is true of women: 27:45 if you've got a man on there, people are drawn to that particular image. 27:46 And the same is true of a man's chest as well. 27:50 So, women will be drawn to the various points of the chest. 27:51 I can show you some very funny examples of some eye-tracking data I've got, of full-length pictures of men and women, and which bits people look at first. 28:00
>> [LAUGH] 28:03
>> Turn the cameras off, [INAUDIBLE]. 28:04 But that's how it works. 28:09 That's the stuff that we're programmed to do as humans. 28:10 We are programmed to look for mates, we're programmed to look for food, and we're programmed to look for water. 28:15 And we use all of those things. 28:17 You know, real estate agents, when they show you around a house, you know, they'll have some coffee brewing, have some bread baking; it's back to this really strong, basic desire we've got, instinct, in the bottom of our brain. 28:26 This stuff's easy to design for, okay? 28:29 We do take advantage of it, and we do kind of go a bit far, but it's easy stuff to get right. 28:33

The next bit, then, is feeling and emotion. 28:36 This is the next easiest bit for us to design for; it's a little harder than instinct. 28:41 This is emotion, and we can evoke emotions like fear, with, you know, scary things like snakes and lions and that kind of stuff. 28:49 We can also evoke love and other feelings. 28:52 We do this with photography. 28:55 This is the next easiest thing to get right in our brains. 28:56 Okay, so aim at this stuff sooner if you can: 28:59 instinct and the base feelings work a little bit better. 29:01 You know, and you can use this stuff to push people's buttons quite effectively. 29:04 You know that as designers. 29:07 And it's backed up by the neuroscience. 29:08

And then at the top we've got this big bit here, across the top, which is huge. 29:11 And this is for thinking, 29:16 and planning, and understanding. 29:18 And this is a huge part of the brain, but it's slow. 29:19 It takes a lot of effort, and people are lazy. 29:23 So, when you have lots of information on the screen and people have got to read lots of it, it's the cognitive bit of the brain that's doing it. 29:28 It uses a lot of energy and a lot of effort. 29:31 If you have too much information on the screen, it's hard work, and people are lazy. 29:34 Pictures of naked men or naked women? People love that stuff. 29:38 Okay?
29:41 No effort involved there. Making people think, though, is hard. 29:44 No surprise; you've probably read Don't Make Me Think by Steve Krug, and it makes the same point. 29:46 Don't make people think, it's hard. So: three bits of the brain to design to. 29:48 Instinct is easy, good results; feeling is a bit harder; thinking is hard. 29:54 And on the whole, most of the websites we design are based around cognition. 29:58 We layer on the copy, loads of stuff's going on, and we make people think about what's going on. 30:04

So, anyway, now my third prediction. 30:07 I promised you a revelation tonight, and here it is. 30:09 Okay, so, a little bit more neuroscience. 30:12 This little piece here of your brain is called the olfactory bulb. 30:16 Olfaction is our friend smell. 30:23 And look where it is in the brain: 30:27 it sits, actually, technically quite close to instinct and feeling. 30:28 It's a bit further away from cognition. So, what does that tell us about olfaction and smell? 30:35 Revelation: 30:40 this, ladies and gentlemen, is the future of interaction design. 30:41 [LAUGH] Don't laugh at me, people, this is the future. 30:45 Nah, I'm joking. 30:49 The idea is that, because of where these bits of the brain are, this bit here is very close to the important bit that's good for pressing our buttons. 30:57 So, in the future, we're all gonna be wearing these things, and we're gonna be designing with smell. 31:04 Okay? 31:09 Big deal, I know. 31:10 I'm sort of half joking, I have to admit, but it's very effective. 31:11 And think about it, again, in terms of smell. 31:15 It's evocative stuff. Remember your childhood. 31:16 Remember things: the smells of baking bread, of coffee. 31:19 What do they evoke? 31:22 They evoke feelings of happiness, of warmth. 31:23 Smell, more than anything else, can take you back to a period in time, based on a smell you've known. 31:29 Maybe it's a perfume or cologne from somebody you love, but smell is so effective and so strong that we should be taking advantage of it in our web design stuff. 31:38 Now, all right, we're not gonna be wearing these things on our heads, and admittedly this is the April Fools from Google last year: 31:44 Google Nose, "smelling is believing". And actually, when I saw it, I was like, that's a really good idea, and then I realized it was an April Fools' joke. 31:54 But actually, I think it's a product that would certainly have been more successful than the Google product Buzz, by the sounds of things. [LAUGH] Sorry, Googlers. Google Nose, it's great. 32:03 This stuff can be really effective, and we can do it now. 32:07 So, we can do this stuff now, in our design. 32:11 We can use evocative words that suggest things like smell. 32:13 So, you can talk about the smell of bread. 32:19 You know? 32:20 Think about it now. 32:21 Think about the words "the smell of bread baking"; you can imagine it in your head, and it's almost like you're experiencing it. 32:23 Because what happens in your brain when you imagine something? The same bits of your brain 32:28 that respond to the real thing light up. 32:32 It's the olfactory bulb: when you invoke it, or suggest smells, that bit of your brain is lighting up. 32:37 We can invoke that in terms of copy, in terms of imagery. 32:40 So we can design with smell now. 32:43 We don't have to have, you know, phones that produce smell.
32:44 So, my prediction number three is: we will design smell-based interactions in the future. 32:49 Yes. 32:54 Thank you, forward-looking person at the back there. 32:55 The rest of you, you'll see. And actually, I am doing a project involving smell for a big hotel chain coming up. 33:01 I can't talk about it now, but I would love to talk about it in more detail; I'm certainly going to use this stuff in interaction design. 33:06

All right, so: three predictions for the future of web design. 33:09 One, two, and three. 33:14 Designing like conversation; mental models of relevance; and olfaction, which is a posh way of saying smell. 33:17 Those three, two of them slightly more likely than the third, I have to say. But they share a model. 33:25 And this is an important thing. 33:28 We've got to get it right. 33:29 Okay. 33:30 When two people are together, we have a shared understanding. 33:31 We have this socially shared model of what's going on. 33:36 We all have this stuff. 33:38 Okay? 33:40 So, when I talk about CSS, you all know what I mean. 33:40 When I talk about CSS with my mother, she's got no idea what I'm talking about. 33:44 We have a shared model of CSS, us guys here. 33:47 Me and my mother, we've got no shared model of CSS. 33:52 All right, this is great, and we rely on this stuff as human beings throughout our lives to function in the world. 33:56 We know what we can expect, 33:59 what knowledge other people have, and we can predict what they're going to do. 34:01 So, for our computers to get this stuff right, they've got to have a shared model of what we know and what we understand. 34:09 But the biggest problem with this stuff, the biggest situation we're facing right now, is that we've got to trust our computers with this stuff. 34:21 We've got to trust our computers enough to tell them where we are, what, you know, what our calendar is, what we like, what we don't like. 34:27 For our computers to act like personal assistants, which we want them to, like Google Now, we've got to trust them like personal assistants. 34:36 Which means we've gotta open up to our computers and give them information [INAUDIBLE] that's important. 34:42 Now, the big issue with that is we've gotta trust our computers, and we've gotta trust the people behind our computers. 34:48 So, the Facebooks, the Apples, the Googles of this world, they've gotta take care of our data. 34:52 Again, the Google Buzz story that Charles talked about earlier on is where they got that so badly wrong, and it broke our trust. 34:59 And so, people are very, very mistrustful of products like Google Now, 35:03 because the information we've gotta give them to make them successful is the same stuff they can use against us, in terms of advertising, or sharing it the wrong way. 35:11 So, for this to work, 35:15 we've gotta damn well trust our computers, and trust the Googlers, the Facebookers, the Apple people out there behind them, not to, excuse my French, fuck around with that data of ours. 35:25 Okay? We trust people to do it. 35:27 We trust other humans to do this, so we've got to trust these companies to do it as well. 35:31

So, finally, final part of the day. 35:34 I'll leave you with this very profound thought here, and yes, it does fit into 140 characters, with @mrjoe on it.
35:44 So, if you want to tweet that, you can tweet that now. 35:48 [LAUGH] But it makes sense, okay? 35:50 We as designers need to understand psychology, to understand how our users work, and therefore to design for them. 35:59 If we don't understand psychology, we are going to end up designing buildings like this: 36:03 it looks pretty, but we couldn't live in it, we couldn't use it. It looks beautiful, 36:07 but there's no understanding of the humans who are in there. 36:10 So, we as designers need to understand psychology. 36:12 Otherwise, it's gonna end up like an architect who doesn't understand physics; stuff's gonna fall down [INAUDIBLE], and that's it at the end of the day. 36:18

All right, so, ladies and gentlemen, thank you very much for your time today. 36:23 [SOUND] That's very kind. 36:25 I'm sorry about the boob references, the swearing, and the developer jokes. 36:28 [LAUGH] But please do, please do buy my book. 36:35 It's gone. 36:39 PsychologyForDesigners.com. 36:39 It's only $3. 36:41 It's an eBook. 36:42 Download it now. 36:43 Nobody say anything; 36:45 it'll give people time to download it. 36:46 $3, download my book now. 36:47 Psychology for Designers. 36:49 It's worth it. 36:50 It's a great book. 36:51 It's a lot of fun. 36:51 I enjoyed writing it. 36:52 Thank you, you guys have been great. 36:53 I've really enjoyed my time. 36:54 Thank you. 36:55 Bye bye. 36:55 [SOUND] 36:56 [BLANK_AUDIO] 37:01