Contextual Design: Creating the Anytime, Everywhere Experience (55:41) with Joe Johnston
Interaction with applications and websites can happen anywhere and anytime. Understanding who your user is is only half the battle: you need to know how they'll be using your site, where they are, and what they want at that moment in time. Screen size isn't the only thing that's different about the mobile experience. Are they walking through the aisle of your store, comparing the price with an online retailer? Learn how contextual design can help enhance your experience, and find out how other companies are beginning to explore these kinds of interactions.
So, this is the Designing the Contextual Interface session. My name is Joe Johnston. I come from a digital agency called Universal Mind, where I'm one of the directors of user experience. I also have a fun side job there: I do a lot of the R&D development with a colleague of mine who sits on the technology side. We have a lot of fun building some unique experiences, which I'll talk to you about today, along with how that combines with user experience. There are some ways to follow me if you want to hit me up on Twitter, or read some of my articles on Medium. I'm a big fan of Medium; a lot of great content there.

And I don't know if you can recognize this device at the bottom, but I think that's a BlackBerry device from back in the day. Actually, it's way in the corner; I tried to crop it out as much as I could, because I didn't want to show it too much. We do a lot of device playing, so it's a lot of fun. Same with the thing on my head, which is another story in general. There was a session here earlier that talked a little bit about the whole cyborg nature of wearing Google Glass, and so I had them on. We've had them in R&D for a while, so being able to have a device like this and use it in the context of what we do on a daily basis is kind of a unique situation. I wear Google Glass in two ways. One is for the R&D factor, to test some things out, because we're trying to build connections from some apps we have into the card metaphor of the design.
Which, in and of itself, is a pretty unique operating system. Obviously there's a lot of talk right now about, you know, the terminology of "glassholes" and all the people using them, the privacy factors, as well as the weirdness of talking to you through this thing right now; knowing it's not on and it's not recording you is always a weird question. The other way is that personal nature of having a conversation with someone and seeing how they react to it. Being a user experience person, I want to understand how people react to devices and technology. I wear it out and about as much as I can: at places like this, a technology conference or a related design and technology conference, where it's interesting to get your feedback as well; there was a great session earlier in here talking about that. And also around normal people on the street, which is actually more fascinating to me than anything else.

So here are some pictures that I took. This is actually from just this morning, with Josh Clark talking, and you can see a lot of similarities; he's a great speaker, and I love listening to him. I'm sure most of you hit his talk this morning; a lot of great content there, and you'll see some similar content as well as some connection points here. You can see all the cards: you take some pictures up there, and they show a little bit of the great weather here; a great 99 degrees we'll get up to today. I'm from Grand Rapids, Michigan, by the way, so this kind of warm weather is nice, because we went through a very hard winter. I think a couple of weeks ago the ice finally melted on Lake Michigan, so at least we have that going for us. So this, again, is the wall here.
I tried to get some pictures prior to the conference just to show you how it works from a picture standpoint, and I'll get into a little bit more. If there are any questions, feel free to jump in; this is more of a conversational talk as I get into it. If there are any Google Glass related questions, take them now, or we can take them at the end as well. And you're free to put on the glasses after the talk and play with them a little, or throughout the conference: if you see me, stop me, play with them. It's a pretty unique device in the sense that, obviously, they're expensive and they're more of a beta, so people don't get a whole lot of chances to actually play with them. And that's really the key: trying to get people to interact with this device, more than just seeing the shows on Saturday Night Live making fun of them, or reading all the articles, but actually getting a chance to look at it.

One story I wanted to tell: I was in Raleigh, North Carolina at another conference, more of a marketing-related conference. I was standing in line getting some barbecue; there was actually a really great barbecue truck there. The funny thing is, they had their market going on. It was opening day for their city farmers' market, so I went down there. Hundreds of thousands of people were down there; it was crazy. I was a little overwhelmed, but I was walking around, and I said, oh, I'll take some pictures before the conference, like I did earlier, showing you some of the Vegas stuff and the conference.
So the funny thing was, I'm standing there eating my barbecue sandwich, and these three wonderful women sitting next to me turn around and ask: is that the Google Glass thing? I'm like, yeah, it is, and they actually started talking about it. They weren't part of any tech conference; they weren't part of anything related to technology in general, but they knew what it was. They had some questions about it, so I talked to them a little bit, and I said, do you want to try them on? They said yeah. So they played with it a little bit. I said, oh, here's how it works: it's voice activated; I can say, "OK Glass, take a picture." So when I did, obviously these two were kind of ready, and the one in the back was a little bit, "hey, don't take a picture of me."

And that's the one key thing you'll notice, which I'll hit on a little bit: the whole privacy factor of these wearable devices, slash Google Glass, slash anything that's attached to myself and my personal data. Even down to your phone: I mean, how many people would actually hand your phone over for someone to use? Let's say someone's jogging, they forgot their phone, and they're sweating profusely in the 99-degree weather; would you be willing to give your phone to them so they can slap their greasy face up to it and then give it back to you? So there's always this privacy factor, and with these wearable devices the privacy of your own data goes even deeper in that sense. So I'm just going to take a picture of you guys really quickly, just to get the bearings here. I'm going to say, "OK Glass," and you can all jump and holler and hoot. And I'll say... and it was trying to take a note.
So here's the problem with Google Glass: it's all voice activated. Sometimes it picks you up, sometimes it doesn't. So let me go back here and say, "OK Glass, take a picture." All right, we got you guys. I'll put it up later when I'm done talking; you'll see it on my Twitter handle there. So that's about how easy it was to take pictures. Obviously, we could get into a whole deeper conversation about how it works in a more enterprise-related scenario, because the consumer market is not really ready for it, as we well know. But I'm going to take these off, because it is obviously weird talking to a bunch of people at a conference through them. And I can't see very well because I don't have my contacts in, so I'm going to put on my other glasses, which are not Google Glass; they're regular glasses.

So, let's jump into contextual design. You're probably wondering what contextual design is. There have been a few conversations about it at the conference so far, which has been fantastic. Josh Clark hit on some things about how these digital and physical elements are coming together, and that's part of this as well. But as you can see, this is the dictionary definition of context, and it really doesn't make much sense. So what does it really mean? Or, as I like to say, what the hell does that really mean? There are actually three main things that make up context when you boil it down and want to design something with a contextual design approach. These are statements that seem like they may be easy to do, but I'll tell you right now, they're pretty much the hardest things you could possibly figure out.
The first is: what is the user trying to accomplish at that moment? And "that moment" is a very key term I use there, because a lot of the time we have to figure out exactly what they're trying to accomplish in the context of what they're doing. That's very important, and very difficult to do. Then: where are they trying to accomplish it? This is more or less location or facility, wherever they might be doing it. And then: what are they trying to accomplish it on? That gets down to the actual physical device: are they using Google Glass, a tablet, or a phone? So, those three things. Boil that down, and that's really how you start defining and understanding what you need to build for a contextual interface or contextual design approach.

I want to take a step back and look at how we actually got here with this contextual design approach, and why these things are becoming more and more important. Mobile first is probably an older topic here at the conference, but I wanted to bring it up a little bit: the approach of taking an application, a website, or anything in general and thinking about it from a mobile perspective first. And that's not only designing for a particular screen or multiple screens or building a prototype; it goes down even to the content. We had a great conversation last night with one of these great content strategists about understanding that, and that's one thing that's very important in all of this: understand the content, and whether it fits in that context; not only the device, but also the context.
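Those three questions (what the user is trying to accomplish, where, and on what) can be sketched as a simple record an app might populate at runtime. This is purely an illustration of the framing from the talk; the class and field names are my own invention.

```python
from dataclasses import dataclass

@dataclass
class Context:
    """The three questions that define context, per the talk's framing."""
    goal: str      # what is the user trying to accomplish at this moment?
    place: str     # where are they trying to accomplish it?
    device: str    # what are they trying to accomplish it on?

    def describe(self) -> str:
        # A compact summary a designer could reason about
        return f"{self.goal} @ {self.place} on {self.device}"

# Example: the in-store price-comparison scenario from the session description
ctx = Context(goal="compare prices", place="store aisle", device="phone")
print(ctx.describe())  # compare prices @ store aisle on phone
```

The point of modeling it this way is that each field comes from a different source: the goal from research and analytics, the place from location signals, and the device from the platform itself.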
Then we get into responsive web design, which is one of these approaches to building mobile first. Obviously, there are plenty of ways to approach responsive web design, adaptive design, all those different things. And then there's contextual design, the third tier: taking all of the mobile-first methodology and the responsive design approaches and boiling them down into a contextual design approach. And that takes a step back again. There was a great keynote yesterday morning from Sarah, talking about user research, and that's really what it boils down to: a lot of user research and user understanding.

I want to take a little sidebar here. Does anybody know the term SoLoMo at all? No? Not even you guys? SoLoMo stands for social, location, and mobility, and those three things combined help start to formulate contextual design. They're pieces of information you can gather. I added two more things, which I think are probably more important than the other three: sensors and data. If you think about all these different types of devices, and the beacons you're hearing all about in the contextual retail-story experiences, you can take that information and start leveraging the data from it to actually build very contextual designs for individuals at the point in time they need something to happen. You have to think about understanding the full scope, the full context, of that user. Now you can gather that in the form of data from all these different sensors we're starting to see.
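The SoLoMo-plus-sensors-and-data idea could be sketched as a bundle of signals that a contextual rule inspects. Everything here is invented for illustration (the signal names, the beacon id, the rule) and is not from the talk; it just shows how a beacon sighting plus calendar data might combine into one suggestion.

```python
# Hypothetical signal bundle: social, location, mobility, sensors, data
signals = {
    "social":   {"recent_share": "flowers board"},
    "location": {"beacon": "store-entrance"},      # micro-geolocation via beacon
    "mobility": {"moving": True},
    "sensors":  {"ambient_noise_db": 62},
    "data":     {"calendar": ["anniversary tomorrow"]},
}

def suggest(signals):
    """Combine location and calendar signals into one contextual suggestion."""
    in_store = signals["location"].get("beacon", "").startswith("store")
    anniversary = any("anniversary" in e for e in signals["data"]["calendar"])
    if in_store and anniversary:
        return "Your anniversary is tomorrow -- flowers are in aisle 3"
    return None

print(suggest(signals))  # Your anniversary is tomorrow -- flowers are in aisle 3
```

A real system would weigh many more signals, but the shape is the same: each source contributes one piece, and the design decides what is worth surfacing at that moment.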
Josh Clark talked about some of those this morning, and we're getting access to them, which is just phenomenal. There's also another great article; it's a couple of years old, but Google actually wrote a pretty lengthy document talking about the multi-screen world, and how it's basically anytime and everywhere. Context is always about where we are, and taking that story with us along the way. Josh talked about some of the stuff Apple is coming out with, which I'll touch on a little bit as well: taking the continuity of your information as you go through those different contexts.

But even down to what's pointed out here at the end: your attitude and your emotion are also part of context. What you feel, and how you feel in that particular moment, is going to determine how you interact with a piece of content or a device or a thing. You could be frustrated, or you could need something immediately; maybe the speed isn't fast enough for you. All those things play into the contextual design you really have to think about. That's the really hard part: understanding your user in that context is very difficult to do. And there are ways to do that, with ethnographic research: understanding users and watching them use it, actually in the context of what they're trying to do. I have a couple of links here at the bottom, and I should have this presentation up too, so you'll be able to jump in and won't need to type all these in. But it's a great article with tons of great content in it. It actually tells a story of how we use different devices at different times of the day, based on our context and what we're doing.
Like going from riding a train to work, to being at work, to coming home and sitting in a chair using your iPad, or watching a TV that's interactive. It's all based on your story and attitude, and where you are and what you're doing. And then, of course, I had to add a slide here with iOS 8. As you all know, iOS 8 will be launching later this year, in the fall. Their whole OS, in iOS 8 and the new Mac OS, is all going to be tied together contextually: down to the way messaging works, notifications, all the way down to how you use apps inside the application. And app developers can leverage this as well. This example is pretty simple: taking a spreadsheet, which is in Numbers here, and showcasing how you can leave off in one spot and jump to the next.

Josh Clark showed a video of it from the Apple keynote earlier this morning. It's called Continuity. Their whole approach is that you'll be able to take anything; say you're writing an email and you left off mid-sentence. You pick up your phone, you can immediately come back to that sentence you left off, and continue your experience. That continuity, that flow, is the kind of design thinking to apply to your application. Not only on iOS; I know this is a very platform-specific type of experience, but you can work across all those different experiences: web, Android, iPhone, TV, Glass, any type of device you'll start seeing coming out. Think about how that continuity of flow is going to happen, because people are going to jump from one device to another in context, based on where they are and what they're working on.
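The continuity idea above can be reduced to a toy sketch: one device publishes the user's in-progress state, and another picks it up and resumes. This is not Apple's actual API (on iOS this is done with NSUserActivity and Handoff); the function names and the shared store are assumptions purely for illustration.

```python
# A shared backend the user's devices can all reach (here: a plain dict)
activity_store = {}  # user_id -> latest in-progress activity

def publish_activity(user_id, doc_id, cursor):
    """Called by the device the user is currently working on."""
    activity_store[user_id] = {"doc": doc_id, "cursor": cursor}

def resume_activity(user_id):
    """Called by whichever device the user picks up next."""
    return activity_store.get(user_id)

# Laptop: user stops typing mid-sentence in an email draft
publish_activity("joe", doc_id="email-draft-42", cursor=118)

# Phone: reopens the same draft at the same cursor position
state = resume_activity("joe")
print(state)  # {'doc': 'email-draft-42', 'cursor': 118}
```

The design point is that the state being handed off is tiny (a document id and a position), which is why the experience can feel seamless across web, Android, iPhone, TV, or Glass.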
So, there's some great data; this actually leverages LukeW's graph of data here. This is data from an app that used to be called Read It Later but is now called Pocket. What happens is, you can save information on the web and actually read it later; in a sense, you put it in your pocket and read it later. This graph represents iPhone users' usage over the course of 24 hours; these are average usage times. And you can see, if you start laying these different types of context across it, what starts to happen. You can take this data (it's just blanket user data, obviously not observational data), start laying in times of day, and see that the morning is a pretty high spike, because people are probably on their phones while drinking their morning coffee, or on their commute: taking a train or whatever type of travel they need. Then at work it should spike down; hopefully they're actually doing their work. Then on the commute home it spikes back up again, so maybe they're hopping back on their mobile phone or their tablet and reading some information; actually, in this case it's all iPhone. Then in the evening it spikes up and down a little bit.

Take that same data for that same app on the iPad and overlay the same information, and you can start to see (it's becoming well known) that the iPad is more of a lean-back type of experience: I sit on my couch reading information. As you can see from the spike here, evening usage is drastically higher across the board on the iPad than it is on the iPhone.
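The analysis described here (bucket opens by hour per device, then compare devices) is simple to sketch. The numbers below are invented, not Pocket's real data; only the shape of the analysis is taken from the talk.

```python
from collections import defaultdict

# (device, hour_of_day) open events -- hypothetical sample, not real data
events = [
    ("iphone", 8), ("iphone", 8), ("iphone", 17), ("iphone", 18),
    ("ipad", 21), ("ipad", 21), ("ipad", 22), ("iphone", 21),
]

# Bucket opens per device per hour
usage = defaultdict(lambda: defaultdict(int))
for device, hour in events:
    usage[device][hour] += 1

def peak_hour(device):
    """Hour of day with the most opens for a device."""
    hours = usage[device]
    return max(hours, key=hours.get)

print(peak_hour("iphone"))  # 8  -> morning-commute context
print(peak_hour("ipad"))    # 21 -> evening lean-back context
```

Even blanket analytics like this, with no observational research, is enough to suggest which context each device's design should favor.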
So now you can start building that context: you know, from this limited data point, that for the Pocket app the iPad is leaned on deeply in the evening. So you can start designing contextually for the iPad for that later-evening type of use. Taking just the research data you see here, you can start figuring things out context-wise: iPhone versus iPad, and the types of work or travel through the day a user may have.

There are a couple of things I want to group together that we tend to forget about. Josh Clark hit on some great things about audio, and how you can transfer data across multiple devices using audio, even tones you can hear and tones you can't. The great thing about context is that we have these devices that are all audio-capable. The only limitation is having it on all the time; with iOS 8, the mic at least is always going to be listening. That lends itself to some privacy concerns, which I'll come back to in a couple of slides. So audio is a piece of context. You can take the ambient sound and start using that for your context and your design, to see what's going on, if that's applicable inside your application. That's one thing, audio, that we tend to forget. Sometimes we think audio is only for games or video, or for when users want to listen to something, but we can actually take audio as input and create an experience around it. The next thing has been out for a little while: the ability to tap into the calendar. And this is, again, all about the privacy concern.
There's always this trade-off: the privacy you're asking for has to be relevant to the value the user gets. That's where the trade-off comes in. There was a great article on Medium (I don't think I have it in here) with a great explanation, in iOS cases, of how to present how the privacy is going to work. It had these great messages that come up, notifications, stepping through and telling the person why I'm using your contacts, why I'm using your location: because it serves you a better experience. When we come to contextual design, we always run into that fine line with privacy. As long as the person is willing to give up the privacy you're asking for, they want to see the value you're going to return; as long as you tell them what that is, you'll be able to get it. That's where that trade-off comes in, big time.

The calendar part of things is great in the sense that, once people open up that calendar to you, you can start contextually serving up some stuff. Even to the point of looking to see whose birthdays are happening, or when a wedding anniversary is; if you're in a store with the contextual beacon nature, you can actually serve up: hey, you may want to get flowers for your wife, because it's your anniversary tomorrow, in case you forgot. So there are ways you can serve things up that bring value, and the calendar is a big part of context. And then, location. We always think about location as GPS, or now, in certain cases with iBeacons, micro-geolocation, proximity-based location. But this one was always great.
I grabbed this from Twitter when I was talking a while back: someone made a reference to wishing the iPhone dictionary was location-aware, so it could serve up the names of streets. Which I thought was fantastic, if you think about it. If you're out and about somewhere you're not normally, like Las Vegas, let's say, and I'm on a corner somewhere and I want to tell my friends I'm here, but I don't happen to know the name of the street (I kind of know it), wouldn't it be great if it knew you were trying to say the name of that street and actually came up and told you: hey, this may be what you're trying to say? iOS is getting there; iOS 8, I think, with its notifications and messaging system, is starting to serve up some things like this. It'd be great to see. So there are location things you can think about that aren't just "I am in this spot; here I am; show me things around me." It's actually taking location, incorporating it into your design, and serving up a better experience that gives value. That's the real big part.

So here's a great thing, also touched on a little bit by Josh this morning: how people are interacting with content while they're watching TV. We always talk about the living room as contextual, the second screen. But some people say the second screen is actually the TV now, and these devices are the first screen. That may be the case; we may be getting there soon. But you can see from the data here, which is from 2012 and obviously ever-changing.
Obviously, email is still huge among the things people do, both during shows and during commercials, but as you can see from the percentages, it's also more widely used on the tablet. That ties into the earlier data I showed you: the tablet is more of a lean-back experience. How many people have actually used a device while watching a show, whether for social, email, sports, or whatever? Yeah, it's a pretty high percentage. This crowd is probably a little higher than normal, but my wife does it all the time, and she has nothing to do with technology; she's in the healthcare field. She's always checking her email or her Facebook, especially when Game of Thrones is on. You've got to make sure you see all that fun stuff, or at least not look at your social media if you haven't watched Game of Thrones yet, so nothing gets given away.

But it's definitely an interesting space, especially now with more and more smart TVs: with Google getting into the play, and Amazon just releasing their new phone, tying into all their Prime data and taking advantage of the TV, leveraging how you can share content on the TV as well as learn about information on TV. Now, some of this data shows that not a lot of people are shopping, as you can see by some of the stats here. Only 29% of 25-to-35-year-olds shop on smartphones while they're watching TV. So it's not as high as you would hope if you're a marketer trying to sell some products. It's actually the other, social side of things.
Obviously, seeking information is up there too, but with a little bit older crowd. And sports is always going to be a giant, huge thing: to see what's going on, to see sports scores, or at least to have those conversations. So always remember the context of the TV and the living room, in that space or in general, and how to leverage those devices. Obviously, we live in a multitasking world, maybe some good, some bad in that sense.

And then, obviously, the device, right? The device is always in context, and that's huge, because there are new devices every single day. Just like yesterday: the new Amazon Fire Phone. If you haven't seen it yet, go look at it. There are some great new contextual things, even down to facial recognition, head movement, and some 3D stuff, so it's a great new device that will hopefully push all the other devices a little further. So: new devices every day. That even gets down to wearables. There's obviously the Misfit I was talking about this morning, but there are tons of other ones; Mio is another one you'll see in my deck. We have to start thinking about those, and about gestural design, or gesture-based content manipulation; I think Amazon has a layer of that as well in their new phone. So, how do you start looking at some of these things? I relate it back to responsive design and how we think about it from a mobile-first perspective. When you start designing an internet application, a website, or wherever it might live, it's always about progressively enhancing that experience, knowing that your application or experience is going to be applicable across multiple screens.
Obviously, you start in that mobile world first, right? You think: OK, how does it work on this particular smaller screen, on the mobile device they would use? How do I progressively enhance that experience? Not necessarily starting at the large screen and then working your way back. If you start from a mobile-first perspective, on that mobile screen first, it's much easier to progressively enhance the experience and gain more functionality and features. It's a lot easier to think in this fashion than to go the other way. Now, obviously, you have the luxury of doing this if you're starting from scratch. A lot of times you're starting on the other side, because you have an application that's ten years old and you want to take advantage of new things. That's very difficult to do, and it's a lot of money to start over again. So there are a lot of intricacies in trying to get there, in a middle space, without wasting everything you've already done, while still taking advantage of all the new things. It's a very touchy, difficult thing to do, but if you have the chance, try to get to that point.

I want to talk about this multi-screen ability. Again, it was talked about this morning, but there's more than just multi-screen sharing, or even the coherence that's kind of the responsive design approach. A design studio out of Germany put together a nice infographic on several different ways multi-screen sharing can happen. And I think this idea of the different abilities to do this is one where context will be able to play a big role, when we start thinking about how my phone interacts with my computer.
That's where the Continuity of iOS comes into play; they're taking advantage of many different versions of these in different ways. I just wanted to bring this slide up because I think it's a great illustration of the different ways to think about utilizing multiple screens. To leverage that, I want to show a little video. This is something we put together in an R&D group at Universal Mind. I have a lot of videos here, but I'll give a little preamble. This is an iPad table. There are 15 iPads on this table, all connected over a basic Wi-Fi network, and we have these contact cards you'll see floating around. And I have an iPhone app, which you'll see there, that can change the content on the table. What happens is, again with a little bit of magic in the story, you can toss these cards across multiple iPads, and they continually go down the line. All it is is messaging: the communication of position, speed, and acceleration. But when you put it all together, it feels a little more magical. This is more of a multi-device-communication way of utilizing multiple screens.

[MUSIC]

>> We wanted to showcase some of the ideas of taking content from one device and seamlessly passing it across to another device, and we thought the iPad was a great use case for that. Some of the original ideas for the iPad table came from an application we created called iBrainstorm, which did some multi-device communication between an iPhone and an iPad. We also extended that with a retail demo, where we took gesture-based content and shared it across multiple devices. So we wanted to be able to showcase that at a little bit higher scale.
24:41 So we decided to take 15 iPads and actually do that across all of them. 24:45 Now, as you toss content over from one iPad to another, it travels successively down the line. 24:58 We have a very unique way to control the iPad table: 25:00 an iPhone app that lets us change the background colors 25:03 as well as the different types of array configurations. 25:06 So we have a master-control scenario for the iPad table. 25:10 One of the things we're excited about is how this could be used in real-world scenarios. 25:17 Take businesses, for instance; they're buying iPads for all of their employees. 25:22 Wouldn't it be great if they could walk into a conference room, lay their iPads down on the conference table, 25:26 turn on this platform, and start sharing content throughout a collaborative, creative meeting, 25:33 then actually take that content back to their desks and share it with colleagues down the road? 25:39 In a digital-classroom scenario, a teacher could send out a math problem, 25:42 receive answers back, and then be able to assess who needs extra help and who doesn't. 25:48 Another scenario where this would be fantastic is event-based settings — trade shows or parties — 25:54 where the iPad table could cycle through content, 25:58 and people could take a piece of content away and have a personal, one-on-one conversation about it. 26:02 It gives people a space they can gather around, share content, pull it off, and have fun with it, 26:08 but also explore the content itself. 26:10 It's a little more immersive than your traditional stand-back-and-watch presentation, let's say.
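The mechanics described above — each toss is just a message carrying position, speed, and acceleration, and the platform decides which iPad the card enters next — can be sketched roughly like this. The message fields and the one-row grid layout are hypothetical illustrations, not the actual demo's protocol:

```python
import json

# Hypothetical sketch: a single row of devices, each SCREEN_W points wide.
SCREEN_W = 1024

def make_toss_message(device_index, x, y, vx, vy):
    """Serialize a card toss as the kind of message the talk describes:
    just position and velocity, plus which device the card left."""
    return json.dumps({"device": device_index, "x": x, "y": y, "vx": vx, "vy": vy})

def route_toss(message, num_devices):
    """Decide which neighboring device the card enters, based on the
    direction it left the screen. Returns (device_index, entry_x), or
    None if the card flew off the end of the table."""
    m = json.loads(message)
    if m["vx"] > 0:            # card left through the right edge
        target = m["device"] + 1
        entry_x = 0            # so it enters on the next screen's left edge
    else:                      # card left through the left edge
        target = m["device"] - 1
        entry_x = SCREEN_W
    if 0 <= target < num_devices:
        return target, entry_x
    return None                # no iPad there; the card falls off the table

# A card tossed rightward off device 3 of 15 lands on device 4:
msg = make_toss_message(3, SCREEN_W, 400, vx=900, vy=0)
```

The "magic" in the demo is exactly this kind of hand-off: each device only ever renders its own slice, so passing a tiny velocity message is enough to make one card appear to glide across fifteen screens.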
26:12 The main reason we started this research and development group at Universal Mind was to 26:19 explore these ideas without restraints. 26:22 We wanted to take ideas like the iPad table, or gesture-based content sharing, and explore them to the point where we can start utilizing bits and pieces of them in real working scenarios, 26:33 and really look forward to working on ideas that people may not even know exist, 26:39 or technologies that are just on the forefront of coming to fruition. 26:44 Because a lot of times we explain the iPad table as something a little futuristic that people may see being used down the road, 26:51 but we're actually dealing with those kinds of technology challenges now, 26:57 and with how it might be used in whatever scenario in the future. 27:01 [MUSIC] 27:05 >> So that was just a little example of utilizing something that may not translate directly into something usable right now, 27:10 but we've seen a lot of interest in it. 27:15 We've taken it to quite a few shows — we actually just took it to TEDx Denver 27:22 not too long ago, so it was a great way to showcase some content there. 27:24 We're taking it to MobileBeat and VentureBeat, and we have some clients that are using these in their innovation rooms to showcase some of their technologies, so it's a great, fun project. 27:33 Multi-device communication is definitely going to be a thing; 27:35 some of this was talked about in Josh's talk this morning. 27:36 When we talk about contextual design, there are also some things I wanted to bring up before we jump into other examples. 27:45 This one here: just because someone's on a mobile device doesn't necessarily mean they're on a slow connection.
27:48 Now, that could be the case, but I don't know about you guys — 27:54 we just talked about sitting in a living room, and I'll be on my own Wi-Fi on my mobile phone 27:58 because I'm too lazy to go get my computer, or whatever it might be. 28:01 So it's also the case that a lot of the time people are using Wi-Fi networks on mobile devices. 28:06 We do have to think about the case where there are going to be slower connections, and serve up content correctly, 28:12 but don't always assume that that's the case. 28:15 And that gets me to the next one: just because we have context doesn't mean we automatically know intent. 28:20 That's what I was just explaining with the mobile example — just because someone's on a mobile phone, 28:24 don't assume they're going to be doing a particular thing. 28:26 This rolls back to understanding, and doing that ethnographic, observational research with your users and how they interact with your system. 28:34 Get in their shoes; watch what they do and how they interact with it. That's going to serve you better in understanding what the context is going to be, 28:42 rather than just assuming that because they're doing this, they want to do that. 28:46 And then the privacy talk. 28:48 This is why I explained earlier about 28:51 notifications, and explaining what's going to happen. 28:54 I took note of this at a Forrester conference: 28:57 we have to think about it from more of a Big Mother perspective rather than a Big Brother perspective. 29:01 So instead of an all-watching eye spying on us, taking our content 29:07 and distributing it to other people to make money, think about it from a nurturing aspect.
29:08 So: I want to learn more about you, so give up some of your privacy, and I'll be able to make your life better. 29:15 As long as you think about it that way, and you tell users the truth about what you're really trying to do, 29:20 that's where you're really going to get that value, from the user's perspective, in exchange for that privacy. 29:24 Again, the more privacy users are willing to give up, the more context you can have. 29:28 So if you think about it from that angle, it'll help. 29:30 Now, this is going to take a little time, because obviously there are a lot of scary things happening with data, and taking data, so it's always going to be a risk. 29:37 When we talked about figuring out exactly what the user is trying to do — back on that first slide, when I asked what contextual design was — 29:46 you've probably heard of the term Goldilocks principle. 29:48 That's kind of why Earth is here: we're at the right spot, at the right time, and everything happened just right. 29:50 That's exactly what you want to do when you create these experiences: 29:53 focus on creating the just-right experience. 29:58 Like I said, it's extremely difficult to do. 30:00 There are a lot of factors that go into it. 30:03 And obviously you're not going to create the perfect experience for everybody, so you really 30:04 need to focus on your core users 30:06 and understand what they're really trying to accomplish, 30:11 and really focus on that. 30:12 This is probably one of the most difficult things we at Universal Mind try to do: understand users, do a lot of user research, and build the design around that. 30:20 And there are a lot of pieces to this puzzle, both from the data side, like I showed earlier, 30:22 but also from actually watching and observing people.
30:24 There are two key tools for doing any observational research, and those are your eyes and your ears. 30:32 If you can use those two things at all times, that's where you're really going to get a lot of your information. 30:34 Then leverage all the data to back that up. 30:36 We talked a little bit about understanding the user and walking in their shoes — has anybody done a journey map, or a customer journey map, before? 30:50 Great, yeah. 30:51 This is one thing we've done a lot; it's a way to really understand the deltas, what our users are having issues with. 30:59 A lot of times, what we create is something called a ghost map. 31:03 That is our intention of what the experience should be, with assumptions like: oh, we know this is probably where some of the pain points are going to be. 31:10 Then we'll actually go out and observe those users interacting in that situation. 31:15 We'll record that, and then we'll see the deltas between the two. 31:18 Are our assumptions correct, or are they way off? 31:20 Or are there spots we didn't even know they had problems with? 31:22 Those are the areas where you really want to focus your design approaches, and understand why users are having those problems. 31:28 So that's the observational and ethnographic research part. 31:31 We actually created a little iPad app called Journeys to help document that. 31:35 If anybody's interested, feel free to hit me up, come grab my card, or shoot me a message. 31:40 We're looking for feedback, because we do a lot of analog 31:43 journey mapping, but we wanted a way to capture that information. 31:45 We capture a lot of data — video, audio, and even location data — to support the journey maps.
31:48 It's a great way to collect information, but we're trying to get some feedback to see if it's a direction we want to go. 31:55 But journey maps in general — even just writing them down, analog — are a great way to figure out 31:59 how to really formulate that design around context. 32:02 And then dogfooding. 32:06 I heard the story of Facebook building Facebook Home a while back. 32:11 It wasn't quite what they wanted it to be, because they had a lot of iOS developers building an application on Android, and it really didn't work well. 32:17 It didn't conform to Android's patterns. 32:20 So what they tried to do, and what they encouraged, was to have more of their people use Android and design for it as well. 32:26 If you're building anything for any platform, I would heavily recommend using your application in the context in which users use it — eat your own dog food. 32:37 It's the best way to actually live and breathe your application. 32:40 At Universal Mind, we have several different types of developers. We have developers who live and breathe Android, so that when we do Android design and development, 32:47 we understand how users live and breathe in that environment. Same thing with iOS, same thing with 32:51 Windows, same thing with all the different platforms. 32:53 Now, that can be very difficult with smaller teams, but think about being in their shoes: get the device, try it out, and really see how it works, 32:59 because you can build a very consistent experience across multiple platforms 33:05 while still understanding the user paradigms — the navigational aspects, how an Android user is going to use it versus an iOS user. 33:11 The point is to actually use your application. 33:14 Too many times, we've seen people who don't really use their own applications.
33:17 Sometimes it doesn't fit, but use it as much as you can. 33:22 So, how is context used today? 33:24 I'm going to go through some examples, okay? 33:26 There's this great app — when I talk about audio, this is something you probably wouldn't think of. 33:33 Does anybody know those little bags of baby carrots? The little carrots — you eat those things, they're great. 33:38 They actually created an app, and I met the guy who built it a while back, and he told me a fantastic story. 33:43 It's an app on the iPhone and iPad where you're this little guy in a shopping cart, and the whole idea is to drive down a road through the city 33:54 and dodge all these different types of obstacles. 33:57 The way you get a boost is to actually bite a baby carrot next to your device; it records that audio and gives you the boost. 34:05 They did so much work on the audio processing and algorithms 34:11 that they devised a way to detect only the baby-carrot snap, versus biting anything else — 34:18 a chip or whatever. So they found a way to showcase baby carrots, 34:23 put it into a game, and offer real value through it. 34:26 Now, this is a game, but it's another example of audio and how it can be incorporated as context, from a gaming perspective. 34:32 It's a very fascinating thing; the whole scenario is really interesting. 34:38 And then, obviously, Pay with Square. 34:39 They've tried this before, with Starbucks as well as with local merchants. 34:46 Pay with Square is basically a way to pay with visuals.
34:49 You offer up your information, and then the merchant you're paying 34:58 can validate your purchase visually. 35:01 You set up Pay with Square on your phone, and you can walk into the store and pay without even taking 35:06 your phone out, because it knows, based on geolocation, where you are. 35:08 The merchant sees your image on their screen, validates you visually, and now you can make the purchase. 35:14 So they're trying to make things a little more seamless in that sense. 35:17 There are going to be a lot of different variations on this. 35:19 There are a lot of interesting things happening in the retail space, and this could obviously be leveraged further with contextual design via beacons as well. 35:27 And this is another one — a really, really cool example. 35:31 I don't know if I would personally use it, but it was really interesting when I saw it. 35:36 There's a fashion retailer in Europe 35:40 that devised a way to build these digital hangers, and the numbers on them represent the likes the item gets on Facebook. 35:49 So you can go and like particular items on Facebook, and then when you walk into the store — let's say you're not too 35:55 fashion-conscious — you can walk in and see, oh, this one is actually a lot more favorable than all the other ones. 36:03 So they're starting to bridge this whole digital-physical realm with physical clothing. 36:07 It was really interesting how they tied that together into the rack system. 36:11 I don't think I've seen anything quite like this in the U.S. yet, 36:13 but it'll be interesting to see if it catches on at all.
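At its core, each digital hanger just has to render a like count in very little space, and the "not too fashion-conscious" shopper just wants the most-liked item on the rack. A back-of-the-napkin sketch — the retailer's actual integration isn't public, so the display format and function names here are made up:

```python
def format_likes(count):
    """Abbreviate a like count for a small hanger display,
    e.g. 950 -> '950', 12500 -> '12.5K'. (Format is hypothetical.)"""
    if count < 1000:
        return str(count)
    if count < 1_000_000:
        return f"{count / 1000:.1f}".rstrip("0").rstrip(".") + "K"
    return f"{count / 1_000_000:.1f}".rstrip("0").rstrip(".") + "M"

def most_favorable(items):
    """Return the item with the highest like count -- what a shopper
    would scan the rack for at a glance."""
    return max(items, key=items.get)

# Hypothetical rack of items with their Facebook like counts:
rack = {"denim jacket": 12500, "striped tee": 950, "parka": 4300}
```

The interesting design decision is that the social signal lives on the physical object itself, so no app is required at the moment of decision; the phone only matters earlier, when the likes are collected.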
36:14 And then Google Now cards. This is starting to be implemented across more and more of Android's platform; I think it's now in the normal Google search. 36:25 It started off in the actual Google app, which is on multiple platforms. 36:29 The whole card-based system is kind of the design paradigm for Google Glass and many others that are coming. 36:35 It's an easier way to showcase quick content in context. 36:40 Here, they bring up your information right when you need it, and you can swipe the cards away once you've read them. 36:45 They can surface anything from your flights to your contacts to sporting tickets, as in the example here, 36:50 or any other information you need to get to your next appointment. They're starting to build these things in, and obviously unlocking your privacy 36:57 is going to encourage more and more of these cards to come up. 36:59 This card metaphor is a design you'll start seeing more and more of. 37:03 You'll see a lot of iOS apps starting to take advantage of card metaphors, and context for that matter. 37:11 And then — does anybody use Allrecipes at all? The app does a lot of interesting things. 37:15 This is probably one of my favorite examples, because it actually ties together multiple contexts 37:21 in its design. 37:22 They created an iPhone app, a tablet app, and a desktop browser application. 37:32 You start by doing your research online. 37:35 Say I'm going to bake a cake. 37:37 I'll go on my laptop, find the recipe I want to make, 37:40 and add it to a to-do list. 37:41 Now I go to the store, 37:42 with my to-do list on my phone.
37:42 I go through the store and get my items, and you can think about how this could contextually tie into beacons inside 37:46 the store as well, which is another great story. 37:50 Then I get back home with my ingredients, and I can set 37:53 my tablet up. They actually designed the tablet app around cooking: 37:59 they built the touch areas big enough that you can use your knuckle or your elbow to go to the next step. So they actually thought about it: 38:05 your hands are dirty, 38:07 I'm using all these devices, 38:08 I'm making the cake, or whatever it might be, 38:10 and I need to go and see things. 38:12 The app also has a lot of different timers and such. 38:14 So they designed it both visually and functionally, 38:18 and they tied it all together into a seamless experience. 38:20 It's a nice, well-rounded, contextually designed application. 38:24 So this is a great one. 38:27 Now I want to jump forward to the more futuristic side of things and ask how context will be used. 38:32 There are some things here that are just on the forefront of being launched. 38:36 There's a thing called Chameleon Launcher, which is a way to customize the Android operating system. 38:43 This whole launcher is tied to context. 38:45 It knows your Wi-Fi network, your geolocation, 38:53 as well as several other types of inputs, and you can start designing the interface to conform to who you are and where and when you are. 39:02 So, this particular screen — if I back up a second — 39:04 this particular screen is for the morning, so you'll see you have your weather and your other information. 39:08 Then as you go into work, it becomes more work-related: it has your documentation, 39:10 your books, your email, your calendar.
39:12 Then when you get home, it becomes more of an entertainment type of environment. 39:13 So the OS is starting to conform directly to 39:19 the context of where you're going throughout your life. 39:21 This screen is the movie-and-music scenario. 39:24 These operating systems are starting to conform to context, and I think you'll see some pieces of 39:27 this in iOS 8, coming out in the fall. 39:32 The more OSes take advantage of context, the more that opens the door for your apps to leverage the data on those devices. 39:38 And then, obviously, Estimote beacons. 39:40 I'm sure you've heard a lot about beacons. 39:42 There's a video here I can run through really quickly. 39:44 [MUSIC] 39:44 >> The phones we carry around are pretty smart, but they could be a lot smarter. 39:48 For example, they can connect to a server in another part of the world, 39:51 but they have no idea you're in a kitchen, in a conference room, or shopping at your favorite retail store. 39:56 They lack micro-location context. 39:58 But now that's changed, with Estimote Beacons. 40:01 They use new Bluetooth Smart technology, supported by all 40:05 major mobile platforms, including the recently announced iOS 7 with iBeacon. 40:10 Placed anywhere in the physical world, they broadcast context and 40:13 location to all compatible phones and smart devices in range. 40:17 Phones can now automatically pick up the signal 40:19 and trigger contextual actions designed by business owners. 40:22 Customers can enjoy a seamless experience, with 40:25 more information about the products that interest them: 40:27 photos, videos, reviews, personalized pricing, and even social updates. 40:32 As they browse through the store, their phones 40:35 transition from one item to the next, 40:37 based on their proximity to the displays, enhancing 40:40 the shopping experience every step of the way.
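The "trigger contextual actions by proximity" idea the video describes boils down to ranging: estimating distance from received signal strength and bucketing it into zones, similar in spirit to iBeacon's immediate/near/far classification. A rough sketch — the path-loss constants and the action table are illustrative, not taken from any vendor SDK:

```python
def estimate_distance(rssi, tx_power=-59, n=2.0):
    """Rough log-distance path-loss estimate, in meters.
    tx_power is the calibrated RSSI at 1 m; n is an environment factor.
    Both values are illustrative, not vendor-calibrated."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def proximity_zone(distance_m):
    """Bucket a distance into zones similar to iBeacon's
    immediate / near / far classification."""
    if distance_m < 0.5:
        return "immediate"
    if distance_m < 4.0:
        return "near"
    return "far"

def contextual_action(zone, product):
    """The kind of merchant-designed trigger the video describes:
    richer content as the shopper gets closer to the display."""
    actions = {
        "immediate": f"show reviews and pricing for {product}",
        "near": f"show a photo card for {product}",
        "far": None,  # too far away; do nothing
    }
    return actions[zone]
```

Because signal strength is noisy, real apps smooth RSSI over several readings before switching zones; this sketch skips that for clarity.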
40:40 Also, business owners can now benefit from quantitative location data on visits, and customer feedback. 40:45 Better for business, and a better experience for shoppers. 40:48 Smart retail solutions by Estimote. 40:52 Pre-order now at estimote.com. 40:54 >> So, Estimote is just one brand that's coming out with iBeacons; there are actually quite a few. 41:02 There's Know Me, and Roximity is another one that does a lot of iBeacons. 41:06 You can just Google iBeacons, or beacons in general — there are tons of different types of beacons available. 41:09 Basically, what it is is proximity-based; it's not geolocation. 41:14 It's not like you're trying to find someone — it's letting you find something. 41:18 That's where the context comes in: if you have a beacon next to a product, you can have your app conform to that 41:23 information when the user is in that particular micro-location range. 41:27 There are quite a few interesting things you can do with iBeacons based on the number of meters, or feet, you are away from them. 41:34 It's a very interesting technology, and it's another way to start building the retail contextual world into your applications. 41:41 And it's built into every single iPhone: 41:43 every single iPhone can actually act as an iBeacon. 41:45 So it's just a matter of taking advantage of it, as well as of those little devices you see inside the retail space. 41:51 Hopefully, the retail world can get past push notifications that are just becoming spam — 41:56 hey, here's a coupon for this particular item. 41:58 That's going to be the first wave of things that happen from a marketing perspective. 42:01 But I think it'll eventually translate into more of a story, where you walk into the grocery store, you walk through the aisle, and you may get a coupon for something.
42:06 But then you'll say, oh, you know what, I wore my Fitbit, I ran four miles. 42:13 Guess what? 42:14 My app just told me, hey, you can go get that 42:16 Ben & Jerry's Chunky Monkey, because you worked it off earlier today. 42:18 So you can start to correlate all these wearable devices with your context in the store. 42:22 And then, of course, your wife says, oh, don't forget this, 42:25 and that can bounce a notification up to you as 42:28 you walk past that product in the store, so you don't forget it. 42:30 I'm sure none of us ever forget the items our wives tell us to pick up — 42:34 I definitely do that all the time. 42:37 So, the next one: has anybody been to Disney recently and used the MagicBands? 42:43 >> [INAUDIBLE] 42:43 >> Yeah, they're great. 42:44 Definitely try them if you ever get the chance. 42:46 Obviously, Disney being Disney, they're the masters of building great experiences, 42:52 and the MagicBands are no exception. 42:55 They're basically a wrist-worn wearable device. 42:57 It's actually RFID; they don't use anything else except 43:02 for that, but it holds all your information: 43:04 all your payment information, the keys to 43:07 your room, and it also does all the FastPass stuff. 43:09 If you've been to Disney, the FastPass is always a nice thing to have, because you 43:12 can go get one, they tell you when to come back, and then you can get right in. 43:14 It utilizes the app on the phone 43:17 as well as the wristbands themselves. 43:20 They correlated it all together into a nice package: 43:23 you get a nice little pack with everybody's name on it, you get to pick the colors, all that kind of fun stuff. 43:26 So they made it a whole experience in itself.
43:28 The great thing is, there's an article that just came out that asked: how do you collect 43:33 people's data without them really knowing it? 43:35 Well, Disney just created an excellent way to track all this great information. 43:40 Everyone loves it, and they're not even worried about it. 43:42 Now, Disney has humongous amounts of data — not that they didn't have a ton already, but now even more, based on FastPass information 43:49 as well as individuals — and they can create an even better experience. 43:54 So this is a great example of a wearable device that actually leverages data for good. 43:58 And people are so excited to use it, because Disney built a great experience around it. 44:02 If you ever get the chance, definitely give it a try. 44:06 So, there's a new app — 44:08 I don't know if you've heard of it or not. 44:09 It's called Humin, and this is another twist on contextual design. 44:13 It's actually a replacement for your phone app. 44:15 Of all the apps on your iPhone — I mean, how many weather 44:19 apps can you have, and how many alarm clock apps can you have? 44:22 Well, nobody has really tried replacing the phone app — 44:24 what we actually buy phones for. 44:26 And there's no real contextual nature to the phone app itself. 44:30 Someone calls you, voicemail, recents — that's about it, right? 44:33 They devised a way to inject location, as well as 44:37 the individuals you have meetings with in your calendar. 44:40 You can see in a couple of screens — this is mine. 44:43 Obviously, I said I was from Grand Rapids, and it's kind of interesting: 44:45 it actually brings up, in the middle screen here, that I had a meeting coming up in 35 minutes with these individuals.
44:49 And it kind of gives me that preamble of, hey, I'm going to meet with — 44:52 that's actually one of our colleagues, Jay Maze, there in the black and white, 44:58 and David Tucker, another R&D colleague. 45:00 I was going to have a meeting with them in 35 minutes. 45:03 But also, since I was in Grand Rapids, it 45:05 starts showing all the relevant individuals who are from there, 45:07 in case I need to contact them or bring up that context. So the phone itself is now starting to surface those things and do more. 45:15 It's actually in beta right now — 45:16 I think it's humin.com, with an "i". 45:19 They're letting people into the beta, so you can start using it now. 45:24 This is where the privacy piece comes in, because it wants to use 45:27 your contacts, your calendar, and a lot of other things, even your email. 45:30 It gets a little bit scary, but they do a great 45:33 job of explaining why they're doing it — another great process. 45:36 So I'd recommend giving it a try, or at least a look. 45:38 They have some cool UI metaphors too: in that last screen, you'll see the avatar. 45:43 I'm in the middle of holding on the avatar and swiping left or right. 45:47 If I swipe right, you'll see the phone icon come up, 45:49 and if I swipe all the way over, it will actually start calling that person. 45:53 If I swipe the other way, I can text them. 45:54 So they're building faster ways to contact people right into the Humin avatar. 46:00 Whenever you see an avatar's face, you can slide 46:03 either left or right to text or call that person, 46:05 or, if you don't have that information, you can request more info from that person. 46:08 It's really, really neat. 46:09 The guys are great — I've actually been chatting with them a little bit.
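The model behind features like this — contacts tagged with where and when you met them, then surfaced by your current context — can be sketched as follows. The data model and field names are my own invention for illustration, not Humin's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    # Hypothetical model: a contact remembers where it was added.
    name: str
    met_at: str      # city or event where you added them
    hometown: str

def relevant_people(contacts, current_city):
    """Surface contacts relevant to where you are right now --
    the way Humin shows Grand Rapids people when you're in Grand Rapids."""
    return [c.name for c in contacts if c.hometown == current_city]

def people_met_in(contacts, place):
    """Answer a 'who did I meet in Las Vegas?' style query by
    filtering on where each contact was added."""
    return [c.name for c in contacts if c.met_at == place]

# Illustrative address book (names and places are made up):
book = [
    Contact("Jay", met_at="Las Vegas", hometown="Grand Rapids"),
    Contact("David", met_at="Denver", hometown="Grand Rapids"),
    Contact("Sam", met_at="Las Vegas", hometown="Austin"),
]
```

The design insight is that a timestamp and a location captured silently at save time turn a flat contact list into something queryable by memory ("I met them at that conference") rather than by name.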
46:10 The way their search works is really interesting too, because 46:15 you can actually ask: hey, who did I meet in Las Vegas? 46:19 And it will bring up everybody I met in 46:21 Las Vegas, if I chose to add them as contacts there. 46:23 So it remembers when and where you added each contact. 46:25 It's really interesting — I may not remember that person's 46:28 name, but I know I met them at, you know, the Future Insights conference, 46:31 so it would bring up everybody I met there. 46:34 So that's another great app. 46:35 They're great. 46:36 Then Myo. Again, I'm not going to go too deep into it, because it was covered 46:39 this morning, but it's another great device, just coming out of its final round of beta. 46:44 It uses electrodes and does a lot 46:48 of great things with hand gestures. 46:50 Instead of you waving your hands around in front of a camera, it reads the muscle 46:53 movements in your arm and processes them so your hand can control things. 46:58 They have some great examples from this morning: 47:01 Josh Clark showed flying that Parrot drone with it earlier. 47:06 We're starting to see these devices being leveraged — like the delete gesture he talked about a little bit. 47:12 That's a little futuristic, in the sense that I'm not sure we're 47:16 all ready for gorilla arms yet, waving our hands around and interacting that way. 47:19 But having the ability to use your hands, with different hand 47:22 motions to execute things, is going to be very interesting. 47:28 Ultimately, in creating all these experiences, the one person you 47:33 want to make the best possible experience for is your end consumer. 47:36 Right? 47:36 You want them to be as happy as possible. 47:36 So that's basically the wrap of my talk.
47:39 But I wanted to leave it open for some 47:43 questions — and if anybody wants to go deeper on the Google Glass, or anything else 47:45 I talked about, I'm open to conversation. 47:48 Any questions at all? 47:55 I'm sure there are questions. 47:55 There have to be. 47:58 No questions? 48:01 [BLANK_AUDIO] 48:05 I wish I had food and beer to give away, but I don't. 48:08 >> [LAUGH] >> So, I have the Google Glass here. 48:13 If anybody wants to try the Glass on, I'm more than happy. 48:16 Come on up. 48:16 >> [INAUDIBLE] >> Everybody wants to watch you do it. 48:23 >> Practice. 48:24 I haven't actually tried it. 48:26 >> So we'll see if it comes up on here. 48:27 I'll give you a little demonstration of how 48:30 it all works. Okay — 48:33 so it just hopped on. 48:34 It probably turned off, because there's a little timer on it. 48:36 When you want it, you just tap it on. 48:37 You have to bring it back up by your eye a little bit. 48:40 There you go. 48:40 And if you swipe left or right, you can go through some cards there. 48:44 And you were in the session earlier, about how it's a little bit — 48:48 >> Yeah. [CROSSTALK] 48:49 >> It's a little bit distracting at first, 48:51 but once your eyes acclimate to the lens — 48:54 >> [INAUDIBLE] >> Yeah, it's not too bad. 48:55 You can adjust it a little bit to make sure. 48:58 You can read a bit, yeah. Like they talked about earlier — the New York Magazine piece — it's not really a 49:02 device you're going to read a lot of content on. 49:04 It's more of a notification, alerting type of 49:08 device, and the point-of-view video is great. 49:11 But the consumerization of the device isn't quite there. 49:14 It's going to be used more in the enterprise world; we're actually looking at 49:16 a lot of different ways to take advantage of it in the manufacturing space.
49:16 They do this thing called pick and pack, or assembling, inside of manufacturing spaces. 49:22 Right now, they use armbands with little ring scanners to make sure you're validating products. 49:26 Instead, you could wear these as your safety glasses inside a manufacturing space: grab the red widget, put it in the blue bin. 49:32 You could validate all of this with augmented reality. 49:34 But, obviously, as we talked about in that session too, the battery life isn't as good as it should be, like most mobile devices. 49:41 So if anybody has a solution for that, you would be an instant millionaire, billionaire. 49:45 >> You can actually tap through these cards [INAUDIBLE]. 49:48 >> Yes, you can actually tap through the cards. 49:49 Feel free to tap through my email. Read those. That's great. 49:51 No, I'm just joking. [LAUGH] 49:53 >> [INAUDIBLE] >> So you can actually hook it up to a lot of things. 49:55 There are a couple of apps, and we talked a little bit about it earlier too; there are some great apps, and people are building more for it. 50:02 We talked about, you know, kids with autism and trying to help them learn using facial recognition, and I think there was another talk about a blind person using them to identify people they meet, because it can do facial recognition. 50:12 That's still another privacy question, but the great thing is, there's an app on here called Word Lens, if you've seen it on iOS. 50:18 You can hold it up to any sign that you see, 50:25 and the cards that come up actually translate the words from, say, English to Spanish or Spanish to English, right in line with the sign. 50:32 So it does a great job of that.
50:34 So if you're ever traveling you can wear this, pop that open, do augmented reality, and start reading signs without knowing the other language. 50:42 So there's some real value in some of these things. But it's definitely a beta device. 50:44 It's obviously expensive, but I think they're doing a great job of trying to find really great uses for it in the enterprise space, 50:54 before we'll see anything like how they marketed it, jumping out of airplanes and hot air balloons and stuff. 51:00 I think that was a bit of an exaggeration, showing off a product that probably wasn't really meant to be consumer. 51:04 It's more for an enterprise, business-driven type of environment, but it's a very interesting device. 51:10 The point of view in some of the video is actually really great. 51:14 That's probably the one thing that, from a consumer perspective, I think would really take off. 51:19 Playing catch with my son, and seeing that perspective of playing baseball with your son when you watch the video back, is pretty interesting. 51:26 It's much different than holding a phone or taking a picture. 51:29 You can actually use your hands, take that video, and it's all voice activated, so you can do some great things. 51:33 And Android, obviously, having some of the better voice dictation and recording, is definitely really great for the device. 51:39 So, anybody else wanna give it a try, give it a whirl, take some pictures? 51:44 No? Yeah, you wanna give it a try? Did you raise your hand? 51:48 No, no? 51:50 Any other questions, anything else? 51:51 Yeah, go ahead. 51:53 >> Are your slides available? 51:55 >> Yeah, they should be. I uploaded them, so actually I will upload them as well.
51:56 But I think they have them; I'm not sure where they're gonna put them up, but they'll be up with all the other Future Insights speaker sessions. 52:05 Yeah. 52:06 >> What would you like to see in the next version? Do you think they've got all their basics right and it's just a matter of making it better? What do you think? 52:13 >> For the Glass? Well, I think they've got a little ways to go. 52:16 Obviously, battery life is still a huge concern. 52:20 And the apps themselves are still a little clunky. 52:24 There are some hardware things that they need to work out. 52:27 One of the weirdest things is that it heats up really, really drastically. 52:31 So when you have it on your head, it feels like it's almost gonna burn your ear off. 52:35 It does get really hot for some reason, and multiple people have said that. 52:39 When you become a Glass Explorer, they have a whole chat thing, and that's one thing that people complained about. 52:45 One of the things I loved about it, though, they just recently updated: they call it the glance feature. 52:50 The way it works is, there's an audible tone that comes through; they call it bone conduction, but it's basically just a vibration and sound, so really only you can hear it. 53:00 The way you used to get a notification, say if I got an email: normally I would hear the audible tone, but I'd have to physically touch the device. 53:06 So, obviously, I'd have to stop what I was doing and touch it so the notification would come up where I could see it on the screen. 53:14 With the glance feature, though, if I get an audio notification, all I have to do is look up at the Glass and it'll automatically turn on to show me the notification.
53:18 You have to calibrate it, which is really kinda cool. 53:20 And that's where you get the heads-up display. 53:23 If anybody was in the session earlier, we were talking about HUDs. 53:26 There are motorcycle helmets doing it, and cars are now bouncing the speedometer up onto the windshield, 53:31 so that way, you're not looking around. 53:33 So that's one of the things. 53:34 I think they can add more of that type of glancing feature, and then almost, you know, Iron Man style, where you've got all this stuff going on. 53:42 Obviously, not quite that drastic. 53:44 But I think there are ways they can start to utilize that and, obviously, not make it look so weird as I talk to you. 53:51 If it's integrated, smaller, into a natural pair of glasses, which I think they're still working on, that'll be the next stage. 53:59 I think then it will be something used more. 54:01 I still don't know if it's consumer-ready, because there's still that privacy concern: 54:06 how do I know you're not recording me? 54:07 It's still weird to talk to; even my colleagues here are like, oh, it's still weird talking to you. 54:11 And I think Luke W did a good job of explaining how we all have these wearable devices. 54:17 You may have a Fitbit or a Jawbone or whatever it might be. 54:20 How many times do you stop using it? 54:22 Well, you always gotta charge that damn thing, right? 54:24 So he says the way he can tell whether he's gonna keep using a device is whether he keeps charging it or not. 54:30 The way I look at it is, I'll use a device if I can wear it out with my wife. 54:34 I can't wear these out with my wife, because she won't let me, so [LAUGH]. 54:38 So that's how I justify whether or not it's ready for prime time: whether I can go out to dinner with my wife and have them on or not.
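The glance feature described above, where an audible cue marks a pending notification and the display wakes only when the wearer tilts their head up past a calibrated angle, can be modeled with a small state machine. This is a hypothetical sketch of the interaction logic, not Glass's actual firmware; the `GlanceDetector` class and its method names are invented for the example.

```python
class GlanceDetector:
    """Wakes the display when the wearer tilts their head up past a
    calibrated angle, but only while a notification is pending."""

    def __init__(self):
        self.wake_angle = None   # set during calibration
        self.pending = False     # an unread notification has arrived
        self.display_on = False

    def calibrate(self, comfortable_look_up_angle: float) -> None:
        # The user looks up once to a comfortable angle during setup;
        # that angle becomes the threshold for waking the screen.
        self.wake_angle = comfortable_look_up_angle

    def notify(self) -> None:
        # An audible cue plays (bone conduction); mark the alert pending.
        self.pending = True

    def on_head_pitch(self, pitch_degrees: float) -> bool:
        # Called from the orientation-sensor loop: turn the display on
        # only when a pending alert meets an upward glance past the
        # calibrated threshold.
        if (self.pending and self.wake_angle is not None
                and pitch_degrees >= self.wake_angle):
            self.display_on = True
            self.pending = False
        return self.display_on


glass = GlanceDetector()
glass.calibrate(20.0)      # user glances up ~20 degrees during setup
glass.notify()             # email arrives: chime plays, screen stays dark
glass.on_head_pitch(5.0)   # normal posture: display stays off
glass.on_head_pitch(25.0)  # glance up past threshold: display turns on
print(glass.display_on)    # True
```

The design point is that the head tilt alone does nothing; it is the combination of a pending alert and the calibrated upward glance that wakes the screen, which is what keeps the display dark during normal head movement.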
54:41 But no, I think it definitely has some great potential. 54:44 And in that business enterprise space, obviously, the medical industry is using it more and more; businesses are building for it specifically for medical. 54:55 For doctor visits, for example, they said they improved their time by, I think, 18%, because they're not spending all their time recording information. 55:03 They actually use the Glass to see the patient's information and do audio recordings, as well as see what medical history the patient might have, 55:11 so they don't have to go back and look things up. 55:14 So they actually shrunk the time spent per patient and increased the number of patients they could see. 55:17 So doctors would actually wear them. 55:19 Virgin Atlantic is using it for service as well, helping people who maybe have had their flight cancelled. 55:23 They can bring up all their information and try to guide them to a new flight by scanning the ticket, things like that. 55:30 So I think you'll see some progression with it. 55:32 It will take a little while before it really hits prime time, but yeah. 55:38 I thank you all for coming.