Keynote - Beyond Mobile: Where No Geek Has Gone Before (49:25) with Josh Clark
Everyday technology is hurtling into the realm of science fiction, even magic, with new devices that are as surprising and delightful as they are useful. Developers and designers are running hard to keep up with this warp-speed pace of tech innovation, and for now, mobile devices are at the forefront. But what's next? Trends are emerging at the hazy edges of the tech universe that hint at the future of computer interfaces, including computers without interfaces at all. Designer Josh Clark, author of "Tapworthy," takes you on an expedition of this final frontier. Learn how smartphones and other sensor-rich devices have changed how we approach computing, and explore how we can better design for sensors. Learn how dumber machines will make us smarter, and how our current work lays the groundwork for a future of social devices. Along the way, you'll see how games lead the fleet, how robots can help us build our software, and why post-PC computing is about far more than phones and tablets. Finally, understand how a smart approach to technology choices now will better prepare you for the future, to boldly go where no geek has gone before.
So I heard a rumor that this thing is called the Future of Web Design, and so I thought that it would be a good idea to take us just a little bit into the future and look at what's coming just around the corner. And I'm not talking about far in the future; I'm talking about the kind of stuff that we're gonna be dealing with designing in the next few months to one to two years, and the things we need to prepare ourselves for. We've been hearing a lot of words today like magic and wonder. And I think that it's really true that, in a very real way, we're hurtling into this era of science fiction right now, where these ideas, these tropes that we've seen in movies, are now actually becoming part of our regular, everyday lives. And we started the day hearing about [INAUDIBLE] talking about magic, and talking about things like Shazam. I think the first time we saw Shazam work, it was like, whoa, man, alright, hello. And it's true that these things do start to become sort of mundane. [COUGH] Excuse me, or quotidian, as [INAUDIBLE] said in very fancy language.

But for me, this idea of things moving into science fiction really started seven years ago, when Steve Jobs introduced this thing, the iPhone, and of course all the phones and tablets that followed since. Because for the first time in my adult life, my childhood notion of what the future might look like had come true: this little personal computer in my pocket that knew everything about me and knew everything about the environment around me. And so this thing's unlocked a lot of things for us. And I think it's something that's clear, certainly from the programming of this conference, and in our workaday lives.
This is something that's occupying us: how do we use these things? And that's what I've been working on for the last several years, really focusing on mobile and tablet interfaces. But that's branching out into other areas, because frankly what's keeping me up at night is not so much what to do with mobile, but what comes next, you know? What's the post-post-PC future? So I'd like to take a little glimpse at that. You guys ready? Cuz I propose, friends, an expedition into the hazy edges of the technology universe, to look for hints at emerging technologies and the ways that we're gonna be working with information. So that we can take that information, those emerging technologies, and prepare now. [SOUND] Change our work now. You know, thank you. You ready? You ready? [SOUND] Alright.

Well, before we go too far on this journey, I wanna begin with where we are right now. [NOISE] What is it about phones and about tablets that has fundamentally changed the way that we think about information, and the way that we deal with computer interfaces? We've heard some things about this. Cory started today to talk about the sense of magic that sensors can give. Joe Johnson, this morning, talking about context, and Joe, just now, talking about the challenge of making these almost-human interactions actually seem real, and not creepy and weird and androidey. We've been talking about the personal computer since the days of disco, right? But in a very real way, these things, this new era of personal computers, these are the real thing. These are the truly personal computers. Not just because they're always with us, although that's part of it.
I think more important is this issue of immediate context and, hopefully, as Joe was saying, of relevance that they bring with us. Cuz it's these things, it's the sensors, that really have changed the game. Not just the ubiquity of these things; that's a piece of it. But the fact that they know so much about us and our immediate environment unlocks all kinds of new interactions. These things are loaded with superpowers, and so they give us superpowers. And think about all the things that these things can do, right? I mean, they've got touch and GPS and camera and microphone and proximity detector and light detector and compass and gyroscope, for crying out loud.

You know, a lot of times people assume that mobile is the companion experience to the real computing experience. And I think, as Jason [UNKNOWN] mentioned earlier, there's lots of data to show that actually mobile is the computing experience. And I would say the question is not just, how can we do less on these devices? The real question is, how can we do more? Because these things can do more than the traditional computer. So let's take a look at what these things have unlocked. A lot of times, when we think about sensors and context, where we started was with sort of GPS stuff, right? Where can I get a pizza nearby? What time's the next train leaving from the station nearest me? And that's been cool, and incredibly helpful. And it explains why I can actually no longer read a paper map, because a whole part of my brain has apparently shrunk, because now my phone can do that stuff for me. But the interesting challenge is not to use these things to find out just what's around me, but what's in front of me.
And how can I transform that information? Not just interpret it, but actually transform my experience of what's around me into something more meaningful. That's, of course, the supposed promise of augmented reality, right? How can I actually reveal information that my puny human eyes and ears can't see on their own, using the sensors on a device? Augmented reality, though, has sort of been a solution looking for a problem. It's kind of a fun little gimmick to play with, but when you really get down to it, in many cases it's just not super useful. There are some cases, though, and I think we've seen that augmented reality, using this new class of device, is something that we can put to good and, most important, entertaining use. And it's in games, really, where we're seeing that augmented reality is actually a truly useful, if I may say, thing.

Skinvaders is one example of this. This is a game that uses the front-facing camera on your phone or tablet to turn your face into a game board where aliens invade your skin. You ready? Alright, let's take a look at how this works. So, this girl's gonna take the camera and it's gonna scan her face, right? Great, got the lay of the land, and now aliens are, you know, planting eggs, as they do, in her face. She wins, but look. Oh, she's got an alien embedded in her brain. Right? So we've got this whole scene; it's her living room, and you sorta see all this stuff imposed on these kids' faces. Super fun. It's this whimsical overlay of fantasy onto reality.
Now obviously, this is fun, and this is sort of a cute game, but you start to think: what other kinds of layers, of whimsy or, hopefully, of information, can actually be more practical when we overlay this stuff? Word Lens is an app for iOS; I think maybe they've got one for Android now too. You take the camera and you aim it at text in one language, and on the screen it just translates it naturally into text in another language, here, Spanish to English. It means that you'll never order the tongue or the tripe by accident again. And hopefully it will save you from some embarrassing situations as well, potentially. This is not especially new technology, right? This is optical character recognition; we've had this for decades. It's just that our phones are now actually powerful enough to do it on the fly. The fancy trick is that it's using the same font and color, and doing it there in real time, without any internet connection.

But the gut of this, the thing that's really at work here, is that beyond being a fun, neat thing that transforms reality, this augmented reality is really saving me input, right? This is something that Cory was telling us about; that's the root of magic for these things: minimizing input and maximizing output. How can I make it so the thing can just discern what I intend, given the context and relevance issues that we were just hearing about, and give me the information without even requiring much action? So that's interesting: we've got information here, we've got entertainment, but maybe we can actually combine the two in a way. So the idea of games is to introduce these whimsical what-if objects into the environment.
Ikea kind of doubles down and says, let's introduce some whimsical what-if furniture into my environment. So it works like this. You use the app and you scan the catalog page that you're interested in. [MUSIC] Use the camera as sort of the anchor in the room where you want the furniture to appear. [NOISE] And now you've got your instant interior-designer tester. But you know, even though this is obviously meant to show how furniture will look in place, they're having fun with this thing too, actually. The next best thing right then would be to 3D-print your furniture right into your house. You know, so, it's like we're getting there.

We tend to think about virtual reality, though, or augmented reality, as being this visual thing. Right, but we heard from Joe Johnson earlier today about the importance of audio as well. And I think that using all of the sensors to start to push the interface off of the screen and into the environment around us is the thing we're seeing right now that is pushing the boundaries of sensors on these devices. And it's something that we need to continue to push. Table Drum is an app from a company in Sweden. And it does this by using what they call augmented audio, something that's pretty interesting. This thing is basically a drum pad, a drum machine app. There are tons of these things in the app stores, where you can choose some instruments and tap out a rhythm on the screen. The difference here is that, in addition to that sort of mundane stuff, you can also actually use the microphone to listen in on sounds that you make tapping out a rhythm on the table or desk, and it translates those into drum sounds. Let's take a look.
Just tapping on the table, and it's translating those taps into drum sounds. All the logic is happening on the phone, but the phone is essentially reduced to being a speaker in this case. It doesn't know the glass sound, so let's teach it, you know, assign a cymbal [SOUND] to that glass. [SOUND] We're gonna put it into training mode to tell it [SOUND] what the glass sound is, [SOUND] to listen to that glass sound [SOUND] and associate it with that cymbal. Ready? [MUSIC] Yeah.

And so what's happening here is that all the interaction is going off the screen, right? It's no longer on the screen. I don't have to tap it; I can actually move my interaction into the environment around me. And there's this very natural thing of, how do I play the drums without knowing how to play the drums? By just banging on my desk or on my table, right? How can we use sensors to push the interaction off the screen, so we aren't always heads-down on our screens instead of interacting with the people who we love, and the environment that's important to us? This is the opportunity of sensors: to actually, finally, divorce ourselves from the machine, and let the machine follow us into the world. And this, again, is the idea that Cory was telling us about earlier: minimize input, design for sensors instead of the screen, in order to maximize output. So like I said, I've been working with touch interfaces almost exclusively, and thinking heavily about them, for the last several years. And as I've been doing this, and I think it will be no surprise to you guys as both designers and consumers, I've found that the best touch interface is sometimes no touch at all. Touch screens are clumsy, inaccurate, and slow for a lot of things.
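The training mode described here can be imagined as a tiny classifier: reduce each tap to a few audio features, remember which drum sample the user assigned to those features, and at playback time pick the nearest trained sound. Here is a minimal toy sketch of that idea; the feature set, class names, and matching rule are all assumptions for illustration, since Table Drum's actual audio pipeline isn't public.

```python
# Toy sketch of an "augmented audio" training mode: each tap is reduced
# to a crude feature vector, training associates a vector with a drum
# sample, and classification picks the nearest trained sound.
import math

class AugmentedAudioPad:
    def __init__(self):
        self.trained = {}  # sample name -> feature vector

    def train(self, name, features):
        """Training mode: associate a tap's features with a drum sample."""
        self.trained[name] = features

    def classify(self, features):
        """Return the trained sample whose features are closest to this tap."""
        def distance(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(self.trained, key=lambda n: distance(self.trained[n], features))

pad = AugmentedAudioPad()
# Hypothetical features: (peak loudness, brightness, decay time)
pad.train("snare", (0.8, 0.6, 0.1))
pad.train("cymbal", (0.5, 0.9, 0.7))   # the glass, taught in training mode

print(pad.classify((0.55, 0.85, 0.65)))  # a tap on the glass -> cymbal
```

A real implementation would extract features like spectral centroid and onset energy from the microphone in real time, but the train-then-nearest-match shape stays the same.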
And so that's the point of things like barcodes, of computer vision, of facial recognition software, of this audio stuff. How can I use that stuff to speed the input, and just interpret, and even transform, in the case of augmented reality, the environment around me, right? So speech, for example, is something that's coming on strong. Joe did a great job of talking about a lot of its weaknesses. You can see that it's nearly mature, but you still don't want to run, you know, a nuclear power plant with this stuff, right? Same with natural gesture, that Kinect-style stuff, or Leap Motion, which is another natural-gesture technology that came out this summer. Touch is just the first of these new sensor-based inputs to become mature; it became mature a few years ago. But we've got all these other things, right? Like I said: speech, natural gesture, facial recognition, all the ways we communicate with one another as human beings. Touch, gesture, facial expression, speech, all of these things that make us human, the machines are beginning to understand at a technical level. And as Joe pointed out, it's now up to us as designers to figure out, wow, how do we actually interpret that data, so that we can finally communicate with these devices on that level.

Things get even more interesting, though, as we start to add custom sensors into the mix, right? We've already got this really rich set of sensors in the devices that most of us have in our pockets and handbags every day. What's really interesting, though, is that you can actually then supplement them with custom sensors. The medical device industry is especially moving forward in this.
They've always been great at creating really subtle sensors that can detect things in the environment and in our bodies. But then they make these really crummy computer systems and UX to go with them. And they've suddenly realized: hey, most of our patients actually have these things, great computers with great UIs. Maybe we can just plug our sensors into that. So we're starting to get these really interesting interactions with the environment, within our bodies, even. So, for example, there's a company, Proteus, that's developed this little, tiny, itty-bitty sensor, right? Small enough to put into a pill. So that means you can make these pills that can tell when they've been taken. This pill is made of copper and magnesium, the same stuff you'd find in a vitamin. Totally harmless. When you swallow it, though, it hits your stomach acid and creates a battery that sends a signal. And the patient's wearing a patch, here, right? The signal is just strong enough to reach this patch, which has a Bluetooth transmitter that then relays the signal, carrying the serial code of the pill, to your phone or tablet, which adds GPS and time stamps and sends it up to your doctor. Boom: Josh has taken his pill, good boy.

Whoa, this is getting to be some pretty trippy, groovy Star Trek stuff, right? It's like tricorder things. My phone knows what's going on inside my body, and so does my doctor, and so does, I don't know, McDonald's, if they put one of these things in a burger. [INAUDIBLE] >> [LAUGH] >> I mean, right. There are privacy implications to all of this stuff. But the point is that it's so trivially inexpensive to put a sensor and an internet connection on anything, even a little pill, that anything that can be connected, and everything can, will be.
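The relay chain just described, pill to patch to phone to doctor, is really a small data pipeline, with each hop adding what it knows. This toy sketch shows that shape; the field names, the serial code, and the functions are invented for illustration, since Proteus's actual protocol isn't public.

```python
# Toy sketch of the ingestible-sensor relay chain: the pill emits its
# serial code, the patch relays it, and the phone enriches the event
# with GPS and a timestamp before it goes up to the doctor.
from datetime import datetime, timezone

def pill_signal(serial_code):
    """The swallowed pill: stomach acid closes the circuit, emitting its serial."""
    return {"serial": serial_code}

def patch_relay(signal):
    """The worn patch: picks up the weak signal and relays it over Bluetooth."""
    return {**signal, "relayed_by": "patch"}

def phone_enrich(event, gps):
    """The phone: adds GPS and a timestamp, ready to upload to the doctor."""
    return {**event, "gps": gps, "taken_at": datetime.now(timezone.utc).isoformat()}

report = phone_enrich(patch_relay(pill_signal("PILL-0042")), gps=(47.37, 8.54))
print(report["serial"])  # -> PILL-0042
```

The interesting design point is that the dumbest component (the pill) carries only an identifier, and each smarter device along the way contributes the context it alone knows.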
So then the question is, how are we going to have that interaction with all those devices that have sensors on them? Cuz this stuff can get kind of wacky. Some Mickey Mouse ideas here, literally. Here's an idea that came from Walt Disney Research, an idea they called Botanicus Interacticus. >> Botanicus Interacticus is a new interactive plant technology. It requires no plant instrumentation. A simple electrode placed in the soil turns any plant into an expressive, multi-touch, gesture-sensitive controller. [MUSIC] >> Oh yeah. Plant UI, you guys. You know you've been waiting for it. Oh yeah. Yeah, right there. That feels good. That's good right there. I have no idea why they made this thing. I really don't know. The parks, I don't know. But the interesting thing is, right, a plant can now basically be a controller for your computer. And presumably, vice versa, your computer can somehow have some influence on the plant. Data and control flowing back and forth between the simplest, most mundane objects, or animals.

The Swiss, of course, figured out this little sensor that you can put inside a cow. [COW MOO] And it tells if the cow is in heat. >> [LAUGHTER] And then the cow texts the farmer. [LAUGH] Sending texts when you're in heat. I have a lot of friends who do that too, so here you go. And look, because it's Switzerland, the text can go out in one of three languages, French, Italian, or German, because cow love knows no language boundary, friends. The internet of cows, here we are. So, you know, I don't bring this up to say, oh great, at long last we have finally enabled the long human dream of communicating with plants and cows.
My point in it, really, is again the idea that we can put sensors on anything. We can get data from, and even give control to, our environment. We have this new sort of exchange, this interplay with our environment, that is unprecedented. And what does that allow us to do in terms of interaction? Cuz we've been focusing, for the last several years, on making the digital physical, on etching our interfaces onto these little slabs that we take out into the world, and that can somehow interact with and detect things in the world around us. And that's now being compounded by this interesting intersection where physical objects are beginning to have a digital presence, and can submit their data, and even their control, into the mix.

So, wow, you guys, I'm sorry, I promised this whole expedition and I've just spent all this time talking about where we are right now. Sensors are a big part of this. So let's take at least a short [BEEP] little hop to our next stop in this expedition, where we get to mirroring. Mirroring is basically broadcast, right? I mean, that's the whole idea behind Apple's AirPlay, which is this idea where you can mirror your screen via your Apple TV to your TV set, to share photos or share videos. And that begins to make the thing social. Not just social with other people, which is certainly an important piece of this, but social with other devices, right? This is a smart device, your phone or tablet, sending content to a dumb device, the TV. So it's not just a sensor, right? Phones and tablets can be broadcasters to dumb devices too. The phone becomes a sensor and receiver for those dumb devices. Now, that's great for one sort of ecosystem. But you have companies like Samsung that are like, whoa, wait a second.
I don't know if we want that, sort of, you know, what do we do to answer this? So they make things like Smart TVs. Let's make everything smart. Right, so you get the Samsung Smart TV, for example. >> Hi TV. Smart Hub. See that? This TV actually recognizes my voice and pulls up Samsung Smart Hub, giving me instant access to my favorite apps like ESPN and SportsCenter. >> Yeah, I kinda hate that guy. >> [LAUGH] >> He doesn't know what I'm talking about. I'm not sure that I like his TV either, frankly. You know, because the thing is, I'm not sure that I need everything to be smart, or to have its own operating system. And that's sort of been a staple of this imagined future of connected objects and connected appliances. Futurists love to talk about the kitchen. Everything in the kitchen is going to be connected. The refrigerator, for some reason, is a real object of fascination. We're going to have this web-browsing refrigerator; in fact, Samsung again, at CES earlier this year, unveiled their Evernote refrigerator. Oh yeah. Your refrigerator syncs with Evernote. Yeah, I don't know, maybe, I guess.

The point, though, is that when we connect objects and can start to work with them through our phones and tablets and web browsers, it's not that everything is gonna suddenly become a web browser or a Twitter client when it's connected. You know, there's this idea that's been around for a few decades of asking, not so much, what problem are you trying to solve, but what job are you hiring an object or a service to perform? It's not always immediately obvious. What job does the microwave do? How can we make it do a better job?
I don't know about you guys, but I use my microwave more than anything as a clock. It is a clock with an oven attached to it, right? How can I make it a better clock by being connected? How about making sure the clock is always right? If there's a power outage, when the time changes, it's just always right. Mundane computing is the word for this: how can we make things just a little bit better at the job we hired them to do, by making them connected? So it's not necessarily that I want a screen and browser window plastered onto my microwave. Help me make it a better microwave oven; that's the idea of mundane computing. So that's part of why I don't necessarily want everything to be smart; maybe just a little smarter. But I think it's still okay to have pretty dumb computers. Because, again, you know what? I've already got too many operating systems in my life. And every time I have to learn a new one, I have to understand how it's different from the others, and know that this doesn't work very well, and there's always these edge cases that I don't understand. It's hard, right? We already have too many operating systems.

How do I get from here to there? How can I have more peace with the objects in my life, right? Well, let's go a little bit further out on our mission. If you go out from this place of sensors and mirroring, part of maintaining sanity in this era of connected devices is to start to limit the number of smart devices that we have to deal with. And so we get into this notion of remote control, where we can drive everything, for example, with my phone or tablet.
I mentioned AirPlay earlier, and there are similar technologies, but just to continue in Apple's playground here: AirPlay does let you do that, have this remote control, although you see it far more rarely. Here's one example. It's a game called Metalstorm: Wingman that uses the iPad to fly your plane on the TV. So this is beyond mirroring, obviously. This is a generic device that's acting as if it's a purpose-built controller to work a dumb device, in this case your TV, right? So, again, that's Apple's technology, AirPlay. We're seeing some other things come out. Google, about a month ago, came out with Chromecast, which is this little dongle that you put into your TV, and then, boom, you can just say, oh, I'm looking at this video on my phone, and tell my TV to grab it from the web and play it. Remote control, again, from the smart device to a dumb one.

So that's sort of a summary of our current solar system, where we're seeing things push at the edges, currently, with the technologies that we already have as consumers, most of us. The majority of Americans have the technology available to do all of this stuff. So: sensors, mirroring, remote control. In all those cases, though, this is a single smart device controlling the display, either its own display or the display of another dumb device, by broadcast, right? Mirroring. So let's push a little bit further. And here's where things get interesting: migrating interfaces. An important element of the very near future will be more ambiguous control, shared control among sort of peer devices, right?
Where the primary control shifts from one device to another based on, frankly, our immediate context, or, even better, as Joe was telling us, relevance. So we know this concept already, from the good old car phone. Say you're driving down the highway. Your phone is linked to your car, and your phone rings. But it's not your phone that rings; your car rings, right? And then look, you're going along and you're talking to your car. You're talking to a dashboard. The car has control. The phone might be running the logic, but the interface is with another smart computer that's with you: your car. Right, but then you park your car, and you take your phone and just walk away, still talking, and the conversation follows you. The interface follows you from your car to your phone, and that's kind of amazing. We don't even think about this stuff anymore, but these are two smart computers, two smart devices, the car and the phone, sharing control and doing this seamless handoff just to follow us where we need it to be.

Again, I call this a migrating interface, and we see surprisingly few experiments with this, with the smart devices in our lives. We did see this sort of thing shortly after the iPad came out. A simple Scrabble game, right? Where you have two phones interacting with an iPad. This is just three devices sharing control over a single experience. [MUSIC] [INAUDIBLE] And when it works, hm, it's pretty seamless, you know? Where you're flipping tiles off of one phone onto the screen, and manipulating them there on the main screen. There you have it, folks. The world's most expensive board game.
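The car-phone handoff above has a simple underlying shape: the logic stays on one device while the active interface migrates to whichever peer is currently most relevant. Here is a minimal toy sketch of that idea, using proximity as a stand-in for relevance; the class, device names, and nearest-device rule are all assumptions for illustration, not how any real handoff protocol works.

```python
# Toy sketch of a "migrating interface": one logical session whose UI
# hops between peer devices, like a call moving from phone to car and back.
class MigratingInterface:
    def __init__(self, logic_device):
        self.logic_device = logic_device  # runs the app logic throughout
        self.peers = {}                   # device name -> distance to user

    def update_proximity(self, device, distance):
        self.peers[device] = distance

    @property
    def active_interface(self):
        # Hand the interface to the closest device; the logic never moves.
        return min(self.peers, key=self.peers.get)

call = MigratingInterface(logic_device="phone")
call.update_proximity("phone", 0.5)
call.update_proximity("car dashboard", 0.2)   # driving: dashboard is closest
print(call.active_interface)                  # -> car dashboard
call.update_proximity("car dashboard", 30.0)  # parked and walked away
print(call.active_interface)                  # -> phone
```

Real systems would weigh richer relevance signals than distance, but the key design choice is the same: control is a property of the session, not of any single device.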
[SOUND] Way to go, Milton Bradley, or whoever you are. Alright. So, frankly, the most interesting stuff, I think, is probably yet to come, and hopefully it won't be too long. Corning is a glass company. You probably knew them in a previous life as the people who made, you know, Pyrex and casseroles and measuring cups. But then they came up with this thing called Gorilla Glass, the hard glass that's on most of our phones and tablets. And all of a sudden they were like, hey, we really like this new era of technology. Screens everywhere, right? Let's push this. So they made this concept video called A Day Made of Glass. I'm not usually a huge fan of concept videos, cuz they tend to conveniently paper over things like cost and whether it's even possible, stuff like that. This one was sort of interesting, though, cuz it really made a good-faith effort to show not only what's possible, but to actually say where things are in the process: whether it's expensive, whether it's pie in the sky or ready to go now.

So let's look a little bit at this. It starts with this girl, who's using her tablet in her bedroom. [MUSIC] >> Now, I'm not sure if you noticed, but this closet door is actually a display driven by Amy's tablet. All the intelligence that you see on this display, all these apps, they're all residing and running on Amy's tablet. This display spans the entire door. It has its own small-footprint operating system, and is smart enough to be aware of Amy's device. [MUSIC] And based upon proximity and other rules, it knows what to display, and in what format. To make this part of Amy's day a reality, Corning is helping to deliver large-scale edge-to-edge displays.
Corning looks to partners for operating systems and apps that seamlessly scale and transfer between tablets and larger displays. [MUSIC]

>> Oh, who is that creepy British guy who hangs out in little girls' bedrooms? [SOUND] The only thing we need are the operating systems and the apps that seamlessly scale from the little tiny thing to the gigantic thing. I don't know, I think all of us here are struggling with that problem of creating interfaces that move to different sizes, at least thinking about different-sized rectangles on these screens that we carry. This is not a trivial problem, right, creating the software to run this? But the hardware is already here. We can make displays like that with just enough intelligence to provide a sort of custom interface of their own, in this case a touch interface, that can talk to another operating system to bring over that experience. So we can have this smart device that we carry with us, our phone for example, or a tablet in that example. It's already got all of our apps and all of our information, and we can just bring it over and, boom, this surface takes over the experience. It sort of follows us wherever we go. This is something that Corning would like to say can be any surface we go to.

Let's follow that thread.

>> This is Sarah, Amy's sister. Her tablet is just like Amy's. [MUSIC] It's a primary computing device too.

>> Oh, this looks like mischief.

>> This dashboard is made from formed, thin, durable glass. It feels better than plastic, and it looks better too.
It's also a display, which means it can take on the appearance of pretty much anything. Of course, in driving mode, its function is to display critical and ancillary driving information.

>> Man, I mean, we were talking about the tricorder before, but now we're getting into holodeck territory, right? Where we can actually make any surface take on the appearance of whatever we want, as long as our smart device is able to imagine it. This is hard to do. This is really expensive. It's possible; it's in labs now. But it's something that is really pricey to do, and the manufacturing process is difficult. So how long? How long until we have this? Well, this guy would tell you 20 years. Not necessarily 20 years from now, but 20 years from the moment it was invented in the lab.

This is Bill Buxton. For those of you who don't know him, he's one of these guys where you're like, computers? Oh yeah, that guy. He and his team invented the capacitive touchscreen, which most of us have on these devices now, back in 1984, I think. He applied Fitts's law to GUI interfaces. A big, big name in computing. And his idea, the thing that he says, is that it takes 20 years from the moment an idea is formed and begins in a lab to when it's ready for the mass market. And that might seem like a really long time, but you know what's exciting about that? The next big, huge technology thing to hit the mass market is already there for us. If something is going to be really big in five years, it's been in labs for 15. This stuff doesn't spring out of nowhere. There's this long period of time where we can look around and say, wow, what are the building blocks that are coming?
But thankfully, we don't even need to do that, because we already have such amazing building blocks whose surface we've not even scratched. Technologies we already have that can enable some pretty magical experiences, some of which we've already seen, and I'm gonna show you some more. They're in our pockets and handbags, and in our living rooms in the form of Xboxes with Kinect and so on: really amazing technologies that let us finally interact with our environment instead of with screens. And yet we aren't there yet as an industry; we haven't gotten our heads around anything beyond designing for screens. We have to pull back and look at some of those possibilities, because the exciting thing is that we have this opportunity to design the spaces between all the gadgets in our lives by leaning on these sensors.

Friends, there is magic in the gap between devices. And that's the area I think we should all be thinking about designing for. How do we move content and information, and behavior, and intent between devices? We looked at some examples of that with a dumb TV and smartphones. What else can we do?

We tend to think about designing interfaces for a single input and output. The screen for output; until very recently, the mouse and cursor for input. But wow, we've got tons of inputs now; I mentioned this earlier. Touch is the mature one, the one we've been dealing with for a while. But you see this sort of parade of things that are just about ready, speech being probably the most notable, as well as natural gesture. It's gonna be the opportunity of designing not just the touch interface, the speech interface, the natural gesture interface.
But it's the combination of all of them that's gonna be really powerful. And I'm especially intrigued by the combination of speech and gesture, because I think those are actually gonna go together. You have to wave at the machine to tell it you're talking to it, or talk to it to let it know that you're waving at it. You know what you get when you combine speech and gesture? Yeah, friends, you get magic. We're doing spells here, right? I mean, in the design community, we've heard it a lot today: we've sort of appropriated the language of magic and wonder in all the stuff we talk about, of building web pages. And, I guess, man, we've got a real opportunity to make things that really make us seem like wizards.

Here's an example. This is my friend [UNKNOWN]. He looks a little tired here; he's at the end of an overnight hackathon. You guys know the drill: you stay up all night working on something. He's doing it right, though. I don't know if you see these wine bottles here. He's actually on a yacht off of Cannes, in the south of France. If you're gonna do a hackathon, follow this guy, right? He found the right one. Alright. So he brought to this hackathon his phone, a projector, a Kinect, and a Mac Mini. And this is the hack he put together.

>> So you're sitting at home on your sofa watching television, and something interesting comes on, and you wanna share it, say, tweet it. So I walk up to my TV, and I just kinda wave at it, so it knows I'm there. And then, when something interesting comes up, I can just grab it, and boom.

>> What?

>> I grab, and boom, I put it over there. So I'm just holding it in my hand, and I'm putting it on my phone, and that's my hack.
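The flow of that grab-and-drop hack can be sketched as a tiny event model: one gesture event captures what the TV is showing, and a touch on the phone releases it there. Everything here is hypothetical names and in-memory stand-ins; it models the interaction, not any real Kinect SDK.

```javascript
// Minimal sketch of the demo above, assuming three cooperating devices.
// "grab" is the closed-fist gesture taught to the Kinect; the phone
// touch is the release. All identifiers are illustrative.
function makeHub() {
  return {
    held: null, // content currently "in the user's fist"
    tv: { nowShowing: "frame-0042" },
    phone: { inbox: [] },

    // Kinect reports a gesture in front of the TV: interpret "grab"
    // as a screen capture and hold onto that content.
    onGesture(gesture) {
      if (gesture === "grab" && this.held === null) {
        this.held = this.tv.nowShowing;
      }
    },

    // Touching the phone releases whatever is being held onto it.
    onPhoneTouch() {
      if (this.held !== null) {
        this.phone.inbox.push(this.held);
        this.held = null;
      }
    },
  };
}
```

The point of the sketch is that neither device needs to be smart about the other; a small piece of glue (the Mac Mini in the hack) routes events between them.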
>> And I want to be clear: he did this overnight, drinking wine, on a boat. [LAUGH]

The technology is lying around, waiting for us to use it. It's a matter of putting it together in imaginative ways, because this isn't necessarily that complicated. All the SDKs are already built. You just have to teach the Kinect that this gesture means screen capture. And you can walk around with it like it's in your fist the whole time, and probably, by then, it's already downloaded to the phone, but I touch the phone to release it. It's a combination of inputs, across a combination of devices, and suddenly it seems, wow, this is really magical. But it's also effortless, you know? It cuts out the hassle. It's an expression of intent just by being able to work with my environment: to point at the thing I want, grab it, and throw it in here.

Alright, how can we start using sensors, again, to move those interactions off the screen and into the world around us? Design for sensors, not for screens. Screens are still gonna be with us, though, and we're still gonna have this interesting challenge, whether it's this kind of solution or others, of figuring out how we share content and experiences across those devices.

A few years ago there were some California design students who were thinking about this in terms of screen gestures, touch gestures across multiple devices. [MUSIC] And they came up with this sort of paper prototype, this thought piece of how we might make these devices interact and share information. So they came up with some of these things. And some of the gestures, as you're seeing here, are actually pretty whimsical, really drawing on the way that we mix and match content in the physical world.
Sort of like, wow, a scanning behavior. Stretching things out. Or even, in this case, almost like you're mixing powder; some sort of rich-media pharmacist here. [MUSIC]

Now, that's cute, right? But obviously we can't do stuff like that yet, right? Well, yeah, we can. I don't know if you guys have seen these toys, Sifteo Cubes. They're these little cubes that have accelerometers and can detect each other's presence. They can interact with each other just by being put next to each other. They've got accelerometers, again, so you can just, like, shake gameplay from one to another. These are kids' toys. We've already kind of got this really social interaction. But again, they aren't super smart, but they're not dumb either. And it's all about these interactions among smart devices; it's the migrating interface notion that I've been talking about. They love it when you buy more cubes, by the way, since you have these word games, and of course those work better with more cubes as well. I forget how many it supports; nine or ten cubes or something like that. So you can get some pretty sophisticated combinations, though.

[MUSIC]

You know, I've got this really expensive phone and this really expensive computer, and you know what, sometimes I just put one down next to the other, and nothing. They just ignore each other. It's, like, rude, right? I got this little kids' toy, and wow, they're super social. Who had a Palm Pilot back in the day? Yeah, right? Alright, remember how easy it was to exchange contacts or events? Wow, it was super easy: beam the thing, and boom, it was done. We've lost that somewhere.
It's pretty easy to beam information to somebody on the other side of the planet, but when it's in the same room? Tricky. You see Apple struggling with that now with AirDrop; they're trying to make that work. It's a little clunky; it's not as easy as it was to just beam somebody something with a Palm Pilot. How do we get that back? How do we get that exchange of information between devices?

And browsers are interesting; they're working on it too. I'm not gonna go into it here, but WebRTC is something to look into: browsers that can exchange information and control, browser to browser, without even reaching out to the cloud, just directly.

So you do start to see some things happening with these kinds of interactions, but again, it's not necessarily just the smart devices; it's smart devices plus dumb devices. There's a device called the Shine, from an outfit called Misfit Wearables, and it's basically like a Fitbit tracker. It's a pedometer: counts steps, gives you points throughout the day. And the way that you sync it is that you put the little thing on the screen, like this, right? And it's almost like it's soaking the data up through the screen; it's a sort of cute interaction. The lights on the hardware light up, and then, boom, it's synced. But you know what's interesting about it? This actually has nothing to do with the screen. It's a Bluetooth wireless transmission. But because of the aluminum housing on the thing, it has to be super close to the device in order to work. So this is a little case of misdirection, as a lot of magic often is. They were like, man, how do we get people to put it close enough to work?
Let's make them think that it actually has to go through the screen to do it. And whatever it takes, I mean, it looks great. The experience of it is even better than if it were merely wireless, because somehow it's like pairing: it requires intent, and it puts me in control of the thing, right? Me in control is a big part of the future; it's this question of, how do I enable this wizard-like moving of information back and forth?

But another piece of it, as we move further out on our little journey, is that we're getting into passive interfaces, which are interfaces that do things for me, on my behalf. Corey started off the day talking about the Moves app, right? Where you don't even have to do anything; it's doing it all for you; there's no interaction required. So we're starting to see more and more of these things: devices that just do their work and talk to one another without even needing us to intervene. We're just the legs that bring them into proximity, right? We're incidental to their behavior.

So we're starting to see things like this. Many of you are probably familiar with the Nest thermostat. The thermostat is fully loaded with information so it can do a smarter job of managing your house. It has proximity sensors to know when you're home; it has a humidity sensor; temperature, of course; wi-fi and an internet connection so it can find out the temperature outside, but also so it can be controlled through the web, or through apps, from anywhere on the internet. It's something that learns your behavior. It's like, oh, you like to turn up the heat around 6:30; I'll just start doing that for you. You guys, it's a thermostat. And yet it's fully loaded.
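That "learns your behavior" idea can be sketched very simply: record the manual adjustments, and once the same one recurs, promote it to a schedule and take over. This is purely illustrative (a made-up three-repeat threshold, made-up function names), not Nest's actual algorithm.

```javascript
// Sketch of a thermostat that learns a habit from repeated manual
// adjustments. Threshold and names are assumptions for illustration.
function makeThermostat() {
  return {
    adjustments: {}, // hour -> how many times you've set the heat then
    schedule: {},    // hour -> temperature it now sets automatically

    // The user manually sets the temperature at a given hour.
    userSetsHeat(hour, temp) {
      this.adjustments[hour] = (this.adjustments[hour] || 0) + 1;
      // Seen the same adjustment three times: treat it as a habit.
      if (this.adjustments[hour] >= 3) {
        this.schedule[hour] = temp;
      }
    },

    // What the thermostat would do on its own at this hour.
    targetFor(hour) {
      return hour in this.schedule ? this.schedule[hour] : null;
    },
  };
}
```

The interesting design point is how little "intelligence" this takes: a counter and a lookup table already give the feeling of a device acting on your behalf.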
I mean, it's still an expensive thermostat, but it sorta shows that you can throw in all this stuff and an internet connection, and it doesn't become so expensive that it breaks the bank. A relatively dumb device, but again, fully loaded and, importantly, in constant communication.

So the Fitbit Flex, you know, one of the big players in the wearables industry. It counts your steps, your activity through the day. It tracks your steps, your sleep, whatever, and then it talks to your phone whenever it comes into contact, whenever it's nearby. It syncs when it can; it communicates when it can. It leans on Bluetooth to do that, and from there the information's available on any device. If you can [INAUDIBLE]. So again, a relatively dumb computer with a dumb sensor and a dumb display, just LEDs on it, but importantly, something that's passively doing its work and talking on its own to the machines around it. How do we then reflect that information back through the web experiences we create, through the apps we create? Or even have it talk to other dumb devices? And how do we create the APIs to make this kind of glue happen? I'll talk about that in a minute.

We tend to think about technology as being ever more sophisticated, right? Faster CPUs, brighter screens, possibly bigger screens. We're seeing that happen, of course. It's certainly a big part of our technology future that computers will become more sophisticated. At the same time, they're also going to become dumber. We're gonna have two forks here, right? Cuz we don't want everything to be smart, trust me. We want to have a lot of dumb devices that are just doing work for us, talking to the handful of smart devices.
That we actually interact with. Now, the interfaces of these devices can be, as we've seen, very simple, possibly without screens at all, or maybe with very tiny screens. You've seen this? Oh yeah, credit cards with a keyboard and screen on them. These things are in the field in France and Singapore, in prototypes from both MasterCard and Visa. We're starting to have to deal with interfaces that are sharply different from even the challenging interfaces we've been dealing with on phones, for example.

Of course you guys have been paying attention to this nonsense: smartwatches. This is the Samsung Gear, which I'm not sure is exactly the right model for a wearable. But still, how are websites gonna look on these one-and-a-half by one-and-a-half inch screens? And, you know, forget about rectangles. That's what's been our focus lately: how do we squeeze and stretch our content into different-sized rectangles? Oh, I think we're finally, oh, don't move, I think we've finally figured out how to do that. And then, bam, something like this happens. Right? And it's like, oh man, Google Glass. Didn't see that coming. Right? So how do we deal with that? Or how do we put our software on a credit card? Or on a Nike FuelBand that has a ten-character display? How do we deal with speech? How does your thing sound? How do we allow people to experience our sites and apps on such incredibly different devices?

Now, what's interesting is that something like Google Glass, the Pebble watch, or the Samsung Gear: these are all little windows into our phone. Alternate displays talking to a smart device, for contexts that are appropriate to that wearable, right? Which is interesting.
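The passive, opportunistic sync pattern behind these trackers (buffer readings locally, flush them whenever a smart device happens to be in range) can be sketched in a few lines. The names are illustrative, not any real Fitbit or Bluetooth API.

```javascript
// Sketch of sync-when-you-can: a dumb tracker keeps counting while
// offline and hands everything over when the phone comes near.
function makeTracker() {
  return {
    buffer: [],

    // Record a step count; no connection needed.
    record(steps) {
      this.buffer.push(steps);
    },

    // Called whenever the phone comes into range: transfer all
    // buffered readings, then start fresh.
    syncTo(phone) {
      phone.steps += this.buffer.reduce((a, b) => a + b, 0);
      this.buffer = [];
    },
  };
}
```

The user never triggers this; proximity does. That is what makes the device feel passive while still keeping every screen up to date.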
This also fits what I've been talking about, where we have one single smart device, and we can experience its software through different displays. But man, our content is not ready yet to be moved across all these different displays. That is gonna take a little bit of work.

Your API is the application, you guys. That's the final idea I want to leave you with before I release you to be around other portables. Until we can get our content organized on the server side, on the back end, well described, well structured, well chunked up, so that we can reach out and grab the parts that we need, the parts that fit, that are appropriate to the specific interface, you know? That's the application. We focus so much on the presentation. I mean, that's our job as designers, right? It's like, oh, alright, great, we have to make this iPhone app. We need to make this responsive website. But folks, presentation deprecates. The way we built websites two years ago has changed; we're throwin' 'em out and buildin' 'em again. Apps that were good a year or two ago, we're throwing them out. It's the content and services underneath that count. And all the time, you talk about APIs and structured content with designers, and their eyes roll back in their heads. You guys, this is our job: to push our design skills down the stack, into the content itself, because these are our ingredients. This is the way that we can have some sort of creative control in this chaotic multi-platform environment we're dealing with now. That's creative control in an uncertain world.

So what you're seeing, [NOISE] when you combine all of these themes of sensors and mirroring and remote control and migrating control and passive interfaces.
Is this cloud of social devices. I'm not talking about the cloud you see in all the airport ads. I'm talking about the set of personal devices and gadgets that are constantly flooding into our homes and into our lives. So that means we're doing a lot more than building applications or websites. Those, again, are just containers, but they're what we tend to focus on in our day-to-day lives: the individual container or application. We need to start thinking about the service and content that is more lasting, so that we can move that stuff into all these different and super awesome containers and make it portable across devices, by grabbing 'em here and throwing 'em there, so that they look appropriate.

So what I've tried to do here is pull together these themes that we see now. And this is imperfect, you know? It's hard to peek into the future, and connecting the dots is hard. Steve Jobs, in fact, said that you can't connect the dots looking forward; you can only connect them looking backward. Still, this stuff is already in labs, and I think we can see enough of these coming changes to prepare now.

So what do we do now? This is where we started. These are just some principles that I will leave you with, and then we'll call it a day.

Push sensors. You have the opportunity to move interactions off the screen and into the environment around us, to save us time, to make important inferences, and to do things on our behalf.

[INAUDIBLE] Think social. And for once, I'm not talking about Twitter and Facebook. I'm talking about: how can we make devices social? How do we move content seamlessly from device to device, screen to screen?
Whether those are smart devices, or a smart-and-dumb combination.

Focus on your ecosystem. What is your collection of apps and services, and how can you create a back-end API that can actually talk to all of these crazy devices and interactions? That means we're all cloud developers, right? Every interaction has to go through our servers, so that we can actually process this stuff. There's a lot of conversation about native apps versus web apps, whatever. Our customers don't care. I mean, it's an important implementation question for us, for people who build these things. The important thing is that content and behavior sync in the cloud, however it is that we present it. The presentation deprecates; after all, we're gonna change our minds in a couple of years anyway. That means your API is the application.

Now we need to focus on new input methods to access those APIs. Think about: how do we use speech? How do we use gesture? How do we use touch? How do we use them in combination? Cuz we're seeing devices that combine them all. HP just came out with a laptop that has keyboard, mouse, touchscreen, and Leap Motion natural gesture. How do you create interfaces that can work with all of those?

Cause you guys, I've been talking about the future, but you know what? It's already here. Bill Buxton's rule says this stuff is already in labs, waiting for us to use. But more importantly, we already have this basic technology in our handbags, our living rooms, and our pockets. Use it. Put it to use. Think about moving off of the screen: design for more than rectangles, and design for the environment around us.
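The "your API is the application" principle can be sketched as one well-structured content record on the back end, with each container asking only for the chunks that fit its display. The field names and device types here are hypothetical; the idea is that the content never changes, only the slice each presentation pulls.

```javascript
// One structured content record, living server-side.
const article = {
  headline: "Beyond Mobile",
  summary: "Where no geek has gone before.",
  body: "Everyday technology is hurtling into the realm of science fiction...",
  heroImage: "hero.jpg",
};

// Each container declares which chunks it can present.
const views = {
  watch: ["headline"],                               // 1.5-inch screen
  phone: ["headline", "summary"],
  desktop: ["headline", "summary", "body", "heroImage"],
};

// Serve only the fields appropriate to the requesting device;
// unknown devices fall back to the full desktop view.
function render(device) {
  const fields = views[device] || views.desktop;
  return Object.fromEntries(fields.map((f) => [f, article[f]]));
}
```

When the presentation deprecates, only the `views` table changes; the content underneath, the actual application, survives every redesign.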
You guys, we live in one of the most exciting times in the history of technology, possibly in the history of culture, and we're in the center of it. So don't be afraid to look up from the job at hand and step back and think, man, how can I make something amazing? Go make something amazing, you guys. Thank you. Thanks.