Interaction Interfaces Survey (20:08) with Alan Davis
Alan Davis is a developer at Leap Motion. In this talk, Alan surveys common robot designs and then details various options for controlling and interfacing with robots.
[MUSIC] Alright, so I'm here to talk about interaction interfaces. I happen to work for Leap Motion, but I wanna talk about a whole variety of things, and some motivations for how to design around them, but more importantly how to design for them: thinking about physical things we're going to make. I know there are a lot of things to make this weekend; I'm focused a little more closely on robotics, something we can move around and manipulate. So let me get started.

So I hear that you're building a robot. Is that right? Is he going to look like this? Probably not. Maybe a little more humble, like a mech from MechWarrior, which I remember playing in 1994. What a sweet game. It's probably not gonna look quite like that, but a little more like Wall-E or Johnny 5, right? Simpler, smaller, more manipulable. Maybe even with some treads on it. But since we only have maybe 24 hours here, it's probably gonna look a little more like this: it has the classic Johnny 5 treads on it, but there's also a breadboard and an Arduino sticking out the top. Instead of treads, you might have wheels, something like this. So this is one of the models you could deal with: something to roll around on the ground, if you're a bot.

You could be a little more ambitious and try to build a hexapod, a spider-like or insect-like thing. But again, it's not going to look quite that sexy; it will probably look a little more like this. [LAUGH] Which is still pretty damn good, I would say. You have six legs there, and it's carrying around its laptop, so it has its brain with it. It's a centralized nervous system on this spider, which is better than what the octopus has, right?

You might go even further than the legs and try to attach arms to it and build a Terminator robot. But again, it won't look like this, this weekend. It'll look a lot more like this. Or we may even try to put multiple things together, treads and arms, and have some horrible, disgustingly scary thing like this, which small children in the room will probably have nightmares about attacking them.

I'll talk a little bit about these, and very briefly just mention that drones are another option, but since you just heard about drones, I won't focus too much on them. And I hear that autonomous drones are all the rage. I believe we have some [UNKNOWN] on site, or if you built your own, it might look more like this, with, again, the breadboard and wires bursting out the top.

I'd also like to mention the robot that I've dealt with in the past few weeks. We did a relaunch of our redesigned store, and to celebrate, one of our web engineers got this little helium-filled balloon fish. It's like a little blimp. On the back it has a motorized fin, and it has very simple controls. We blew it up and actually flew it around the office and had fun celebrating.

So that's a little of the possible array of robots you might have. The question is, how do you control them, right?
There are all these different options: wheels, maybe flying, maybe arms, things like this. So now we get into the interaction of it. If you're going to control this in real time, most likely you're not going to be sitting there at a terminal controlling it with text, although that is an option. You're gonna want something more interesting, more natural. And as we heard from Whurley earlier, we're all lazy, right? Or at least the smartest engineers are the laziest, he claims.

So the simplest possible solution is, for every servo on the robot, maybe we'll have a button. So if you just have a motor and one axle, and your rear axle is servo controlled, maybe you just have one button on the robot: press it to go, and let go, and that's "don't go." Right? That could be as simple as possible. Maybe you don't even use the joystick on our favorite Atari 2600 controller. But again, this button is discrete.

Maybe it's continuous, like on an RC car, where typically there are only two things to control: faster/slower, and left/right. This is a very classic-looking remote control car controller that we've all seen. We have the trigger to control continuously how fast or slow we're going, and oftentimes if you push it forward you can actually apply some brakes to the RC car. And then we have a wheel on the side for turning left and right, again in a continuous fashion: in the center there's a neutral zone, and then you can turn a little more to the left or to the right. So these are all very good options that have been explored; you've probably all seen controllers like these.

But what if your robot has more functions than that? What do you do? Do you add more buttons to your controller and come up with something like this? This is actually a musical instrument, by the way. It's called a monome. There are 512 keys here, and you can do beat sequencing, and behind every button there's an LED, so you can see the feedback of what you're controlling on the computer. It's a beautiful musical instrument. However, I don't think this would be very effective or useful for controlling a robot. You probably want something that fits your hands better: maybe not 512 buttons, but something fewer. And you're gonna need some continuous controls.

So again for music, this is Moldover, a DJ who makes his own instruments. Following turntablism, he has pushed forward an art called controllerism, where he makes his own musical controllers, custom fit to his hand, and he can do amazing, beautiful things with them. This could be an appropriate controller if your robot has many functions: you've got buttons for discrete or binary effects, as well as sliders and knobs for continuous controls. But this isn't gonna be very intuitive for controlling something with wheels, or something flying around. You're gonna want something more like this, which is what a professional would use to control a flying plane.
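To make those continuous controls concrete, here is a minimal sketch (not from the talk) of how two axes, each with a neutral zone like the RC car's wheel, might be mixed down to a pair of tread servos. The axis ranges, dead-zone width, and clamping are all assumptions.

```python
def apply_dead_zone(value, dead_zone=0.15):
    """Map a stick axis in [-1, 1] to [-1, 1], with a neutral zone
    around the center so a slightly off-center stick reads as zero."""
    if abs(value) < dead_zone:
        return 0.0
    # Rescale the remaining travel so output still spans the full range.
    sign = 1.0 if value > 0 else -1.0
    return sign * (abs(value) - dead_zone) / (1.0 - dead_zone)

def mix_tank_drive(throttle, steering):
    """Combine a throttle axis and a steering axis into left/right
    tread speeds: the classic differential-drive ("tank") mix."""
    left = throttle + steering
    right = throttle - steering
    # Clamp so full throttle plus full turn doesn't overdrive a servo.
    return max(-1.0, min(1.0, left)), max(-1.0, min(1.0, right))

# Example: stick pushed mostly forward with a slight right turn.
throttle = apply_dead_zone(0.8)
steering = apply_dead_zone(0.3)
left, right = mix_tank_drive(throttle, steering)
print(left, right)  # left tread runs faster, so the robot veers right
```

The same dead-zone idea shows up again later in the Leap Motion demo, where a region around the resting hand position means "do nothing."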
And in fact, there's one in the room over there; our Hiroki friends have one to control their tiny little drone, I think it's called a Crazyflie. There it is. In their case, they only use four of the controls, for pitch, yaw, roll, and height, and there are many other switches that a fancier drone or airplane might use. But this at least affords you some more intuitive control. It only takes a little while to get used to, and many of us played with RC cars as kids; you'll be able to pick this up and control that drone or that plane very easily.

You could even use this to control something rolling on the ground; you could use it for an RC car or a robot with treads. You could use the left and right sticks to control it like tank treads, much like Katamari Damacy on the PlayStation, which is why they use the DualShock controller, there on the right. These, again, are more ergonomic, meant to sit in your hand for eight or ten hours, or however long Microsoft wants you to be playing their games. And they're not only more comfortable but portable: these are wireless, so you can walk around outside with it and follow your robot. That also helps avoid the Resident Evil effect: with tank controls or an RC car, if your orientation is the opposite of the robot's, it becomes very difficult to visualize left and right; they get mixed up. Whereas if you're facing the same direction, it's easier to visualize.

And if it's more complex, instead of just adding more buttons you get a massive controller like this, a specialized setup. The game is called Steel Battalion, and this controller was sold, I believe, for Dreamcast and Xbox, just to really get the full experience of playing the game. It's somewhat like MechWarrior, where you're controlling a very large tank-like thing. It's overwhelming to me to see all of these controls laid out, even though they are arranged in a specialized way that may become more natural. There is just so much in your face; what to do with it? So there's a balance: how many things are there to control on your robot, as well as how much information do you need to get back from the robot?

And then there are completely different styles, beyond buttons and levers and joysticks and that sort of thing. You could actually use your body, in free form: a very different type of control. This game, Steel Battalion, they later ported to the Kinect. Sadly, it was a disaster. From what I've heard, the experience of playing Steel Battalion with the Kinect was worse than what you might imagine playing Assassin's Creed that way would be; there was a great spoof video people put online where they made Assassin's Creed for the Kinect. Down here on the slide, this is an Intel/Creative camera. They're both depth-sensing cameras, so the information they provide is how far away things in the world are.
Much like a video camera provides a color map of a square view of the world, a depth camera provides you a depth map, which looks something like this. This happens to be libfreenect, the open source, hacked version of the original Kinect libraries. It's heat-map colored to show you depth: the hand is closest to you, then the arm and the person, then this other square thing on the side is the next farthest, and in the very back it's dark blue and black.

But how do you interpret this information and actually make it useful? That is a challenge. One of the things the libfreenect community has done, in the OpenNI project, is build skeletal modeling on top of this, which Microsoft did internally, but that of course was not available to the people who hacked the Kinect originally. So now there are two different libraries you can use with the Kinect, the sanctioned Microsoft one as well as the open source, quote-unquote "hacked" one, and what they do is provide you articulation points for different joints on the body. Sometimes it does require an initialization step: you'll need to step in front of the camera in a very clear T shape like this, and then it goes, oh yeah, that's certainly a body, and it can track you from then on. But if you're just hunched over in the field of view, it may not be able to recognize that you're a human.

Once you've done that initialization step, it has a basic skeleton model of your joints. As an example, you have the shoulder, the elbow, and the hand: three joints provided for the arm. And then you can do interesting things with this. For example, there are physical therapy programs that have been written where maybe a kid is playing a game, and the exercises require them to move their leg in a certain way to build up muscle mass they've lost due to an injury or something like that. So you can use this to create a game for that, and this is what's often used when the Kinect controls games, like the dancing games. And if you had a robotic arm like I showed you in the video earlier this morning, you can use your whole arm, the arm information coming out of the Kinect skeletal tracking API, to control the arm in space.

But in that case you wouldn't have any articulation points on the hand at all, because you'd just have one point for the whole hand entirely. So that's where a different type of sensor may be useful: in this case, the Leap Motion controller. It also uses infrared light, but with a very different technique: instead of providing a depth map, we directly provide hand and finger information. It will provide, for example, the palm position, and fingertip positions and directions of the fingers. We even track tools: you can use pencils and pens and chopsticks and that sort of thing. And in the upper right is an example of gestures: circle gestures and taps and swipes and that sort of thing. And you can build your own more customized gesture recognition on top of that. This may be useful for a couple of different things.
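As a small illustration of what those articulation points give you: with any three joints, say the shoulder, elbow, and hand mentioned above, you can compute the bend at the middle joint, which is the kind of quantity a therapy game or a one-to-one servo mapping would use. This sketch is generic, not tied to either Kinect library, and the coordinates are invented.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees), given three 3D joint positions,
    e.g. shoulder, elbow, and hand from a skeletal tracking API."""
    ba = [a[i] - b[i] for i in range(3)]  # vector elbow -> shoulder
    bc = [c[i] - b[i] for i in range(3)]  # vector elbow -> hand
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = (math.sqrt(sum(x * x for x in ba)) *
            math.sqrt(sum(x * x for x in bc)))
    return math.degrees(math.acos(dot / norm))

# Hypothetical joint positions (millimeters) from one tracking frame.
shoulder, elbow, hand = (0, 0, 0), (250, -50, 0), (400, 100, 0)
print(joint_angle(shoulder, elbow, hand))  # elbow bend, ~124 degrees
```

When the robot's joints map one-to-one onto yours, an angle like this can be rescaled straight into a servo command; when they don't, you need inverse kinematics, which comes up in a moment.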
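And on the Leap Motion side, reading that hand, finger, and gesture data looks roughly like this with the classic Leap Motion V2 Python SDK. This is a sketch from the published API, not code from the talk.

```python
import Leap  # classic Leap Motion V2 SDK

controller = Leap.Controller()
# Built-in gestures must be enabled before they show up in frames.
controller.enable_gesture(Leap.Gesture.TYPE_CIRCLE)
controller.enable_gesture(Leap.Gesture.TYPE_SWIPE)

# In a real app you'd poll in a loop or attach a Leap.Listener;
# right after construction the controller may not be connected yet.
frame = controller.frame()          # most recent tracking frame
for hand in frame.hands:
    pos = hand.palm_position        # millimeters relative to the device
    pitch = hand.direction.pitch    # radians; tilt up and down
    yaw = hand.direction.yaw        # radians; twist left and right
    roll = hand.palm_normal.roll    # radians
    print(pos.x, pos.y, pos.z, pitch, yaw, roll)
    for finger in hand.fingers:
        print(finger.tip_position, finger.direction)

for gesture in frame.gestures():    # built-in circles, swipes, taps
    if gesture.type == Leap.Gesture.TYPE_CIRCLE:
        print("circle gesture")
```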
I'll show you a demo in just a moment of a type of interaction I think is very natural, and how we can actually apply it to a robot situation. But I would like to mention: unless the joints on the robotic arm map perfectly to the joints on your arm, you'll have to do some additional work, which is an inverse kinematics problem. This is what Eugene did in the video I showed earlier of the robotic arm. We take the position of the palm coming out of the Leap Motion API and try to move the robot's arm in such a way that the tip of it is in that position. That's very different from how the servos work. So you'll essentially need to solve a math optimization problem, with physical modeling and constraints about the lengths of things and how far servos can move, in order to find the angles to adjust the servos toward so the end of that arm lands in the right place. In this case we have three flexible joints and a total of seven servos, so you'll need to solve a math problem to do that. A lot of this is available for free online, and you can use it as a starting point, but you'll need to tweak; there's a little extra work to do.
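To make that optimization problem concrete, here is the textbook closed-form case: a two-link planar arm, a toy version of the three-joint, seven-servo arm in the video. The link lengths and target point are made up; a real arm adds joint limits and extra degrees of freedom, which is exactly where the free solvers and the tweaking come in.

```python
import math

def two_link_ik(x, y, l1=100.0, l2=80.0):
    """Closed-form inverse kinematics for a 2-link planar arm:
    given a target (x, y) for the arm's tip, return the shoulder
    and elbow angles in radians. Link lengths are in millimeters."""
    d2 = x * x + y * y
    # Law of cosines for the elbow; out-of-range cosine = unreachable.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Put the tip 150 mm out and 60 mm up; these angles would then be
# rescaled into the servos' pulse ranges, subject to travel limits.
print(two_link_ik(150, 60))
```

With more servos than strictly needed for the pose, there is no single closed form, so it becomes a constrained optimization over all the joint angles.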
Before we get into exactly how to connect these two, I want to show you a demo of what you can do with a Leap Motion controller, which I think is a very compelling interaction style. So here we are on Amelia Island, and my hands — let me raise the Leap up a bit more, so you can see exactly what I'm doing. If I put my hand in the zone, there, it should track where my hand is. In the center of this space there's essentially a dead zone where nothing interesting happens. You'll see this little point in the center of the screen; that represents the position of my palm, and using just the height, I can go away from the ground, and I can pan left, pan right, and even fly forward a bit, like that. So there are three different axes I can move along with just the palm point. I can also twist my hand: I can use yaw to rotate around like this. I can also tilt up to look more at the sky, tilt down to look more at the ground. And of course roll does not work, because they do not support rolling the camera in Google Earth. But if I combine these actions together, I can get a fairly natural movement, kind of like a helicopter ride over the city. Back up like this, and here we are on Amelia Island. I'm gonna zoom out to all of Florida. I'm so glad the wifi works in this room. Here we have the state of Florida. And then I'll quickly fly to Manhattan to show you what a fly-through over a city with buildings looks like. Let's go down Wall Street. You can fly through the buildings. And let me zoom around this new tower that we have, and then get a little wider field of view of the whole city. Hopefully there are some New Yorkers in the room.

And a little subtlety I did at the end: I closed my fist and then removed it from the field of view. If I hadn't done that, I would've flown backward before then. It's a null gesture, in some sense. It means: don't interact at all. I'm not interested in interacting; my hand's a fist.

This, I think, is a compelling experience, because it may take just a minute or two to get used to the sensitivity of it — it really is a very fine-grained control — but in a matter of minutes you can fly around like a helicopter, and you're not holding a controller with any joysticks on it. You're using your hand as you would if you were playing with a kid, doing an airplane game like this. And I think it's a really fluid and natural way to move around a 3D scene like this, and you could very easily connect it to a quadcopter. I think I showed a video this morning of someone who did it in a lab: in 24 hours, they did something like this style of control with a quadcopter. You could do something similar with an airplane as well, but then you have different constraints, because you can't pan left and right with an airplane, right?

So now that I've shown you that, I wanna talk about how you actually connect this to manipulating the servos and different motors on a robot, to get what you want. As I mentioned, we have this fish in the office, which is remote controlled. If you can see it, there are exactly four buttons beyond the power switch, because there are four ways to control this fish. The fin on the back, much like a real fish's, can go left and right. And then there's a counterbalance: it pitches the head down, or if it slides back on the belly of the fish, the fish goes higher up, because it's offsetting the helium in there. And this is exactly how you control it: if you want the fin to go to the right, you press right. But that doesn't turn the fish to the right; it just moves the tail. So you literally have to move this control left, right, left, right, left, right in order to swim this fish, much like how the spinal cord of the fish would be controlling its fin. But at the cognitive level, the fish's brain is not really doing that.

And that's what I would like to argue is the way to design a robot control: think about the cognitive level versus the spinal cord level. When I make a movement like this, forward, I'm thinking, I wanna move forward. There's a goal in mind. And my cerebellum and spinal cord together are the orchestra of the musculature, sending acetylcholine down my spine to make the muscles actually work in symphony. So in this case, instead of left-right, left-right, think: I want the fish to swim forward.

And I made the unfortunate mistake of changing the code, the cardinal sin of live demos. I changed the code just moments ago and it's slightly broken, but I will fix it and have it available soon, in about an hour after this. I'll bring the fish in, and you'll see that with the Leap Motion controller and the same control scheme, I can slide my hand forward, and the fish begins swimming forward: the tail moves. If I go farther forward, it'll go faster and faster and faster. And if I pull back, the tail just stops. I can tilt my hand up and down to pitch the fish, so I simply tilt up and move forward, and it begins swimming up.
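Whether it's flying Google Earth or swimming the fish, the input side boils down to the same mapping: hand data in, goal out, with a dead zone in the center and the closed fist as a null gesture. Here is a minimal sketch of that interpretation layer, assuming the classic Leap Motion V2 Python SDK; the center point, dead-zone radius, and scaling constants are all made-up values.

```python
import time
import Leap

DEAD_ZONE_MM = 40.0              # hover region: do nothing in here
CENTER = Leap.Vector(0, 200, 0)  # assumed resting palm position

def hand_to_goal(frame):
    """Interpret one Leap frame as a movement goal, or None for
    'don't interact' (no hand in view, or a closed fist)."""
    if frame.hands.is_empty:
        return None
    hand = frame.hands[0]
    if hand.grab_strength > 0.9:      # closed fist = null gesture
        return None
    offset = hand.palm_position - CENTER
    goal = {"forward": 0.0, "pan": 0.0, "climb": 0.0, "turn": 0.0}
    if offset.magnitude > DEAD_ZONE_MM:
        goal["forward"] = -offset.z / 100.0  # palm forward -> go forward
        goal["pan"] = offset.x / 100.0       # palm left/right -> pan
        goal["climb"] = offset.y / 100.0     # palm up/down -> altitude
    goal["turn"] = hand.direction.yaw        # twist the hand to rotate
    return goal

controller = Leap.Controller()
while True:
    goal = hand_to_goal(controller.frame())
    if goal is not None:
        print(goal)  # hand these goals to whatever does the moving
    time.sleep(0.05)
```

The point is that nothing in here knows about fins or rotors; it only produces goals, which is the split described next.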
This is our hardware engineer on the right. Since he deals with infrared light all day long, he decided to just read what's coming out of the remote control, and he built us a circuit, connected over USB as a serial port, that sends the infrared light, so we can beam commands straight to the fish. And then Peter, on the left, one of our web engineers, built an app — which unfortunately I don't have enough time to show you much of — to actually control the fish in a much more natural way: swimming forward, tilting up. And I think the major difference that worked really well for him is that he had a goal-directed approach.

I'll very briefly show you the architecture; hopefully you can read enough of it. The basic architecture is: you have an app, and it synthesizes two parts of data. The fish service, as he called it, is the spinal cord. It asks, okay, what's the goal? Now let me turn that into moving the muscles: move the fin, tilt the pitch up and down. It converts something like a "move forward" command into moving the tail back and forth at a certain speed, so depending on how fast or how far forward your hand is, the fin flaps at a faster or slower pace. And over here, the Leap service he wrote is the other side of it: given the hand data coming out of the Leap Motion API — the pitch and the tilt and the position and all that — translate it into a goal. So if your hand is moving forward, the goal is: let's swim forward. If you come back: let's stop swimming. That's how the architecture is split: interpret the input into a goal, then send that off to the app. The app listens to the goals from the Leap service and sends them to the fish service, and the fish service then turns those goals into actually moving the fin and tilting up and down.

And again, I'll have this robot here, hopefully in a matter of an hour or so, and I can explain more about the code in this little CoffeeScript app, which I would also like to make a call-out to: it used the node-webkit serial port library, thanks to our organizer.
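Here is a rough sketch of that other half, the fish service: the "spinal cord" that turns a goal into muscle movements. The original app was CoffeeScript over node-webkit; this is a Python stand-in using pyserial, and the one-byte serial protocol is invented for illustration, since the real IR circuit defines its own commands.

```python
import math
import time
import serial  # pyserial, talking to the IR-blaster circuit over USB

class FishService:
    """The "spinal cord": turns a high-level goal (swim forward at
    some speed, climb or dive) into the low-level left/right fin
    flaps the fish's remote control actually understands."""

    # Command bytes are invented; the real circuit has its own protocol.
    FIN_LEFT, FIN_RIGHT, PITCH_UP, PITCH_DOWN = b"L", b"R", b"U", b"D"

    def __init__(self, port="/dev/ttyUSB0"):
        self.ser = serial.Serial(port, 9600)

    def update(self, goal, t):
        """Call repeatedly; `goal` comes from the Leap service."""
        if goal is None or goal["forward"] <= 0:
            return  # null gesture, or hand pulled back: the tail stops
        # Faster hand -> faster flapping: oscillate the fin left/right
        # at a frequency proportional to the forward goal.
        freq_hz = 0.5 + 2.0 * goal["forward"]
        flap_left = math.sin(2 * math.pi * freq_hz * t) > 0
        self.ser.write(self.FIN_LEFT if flap_left else self.FIN_RIGHT)
        if goal["climb"] > 0.2:
            self.ser.write(self.PITCH_UP)    # counterweight back: nose up
        elif goal["climb"] < -0.2:
            self.ser.write(self.PITCH_DOWN)  # counterweight forward: dive

# The app glues the two services together: goals in, muscle commands out.
# fish = FishService()
# while True:
#     fish.update(hand_to_goal(controller.frame()), time.time())
#     time.sleep(0.02)
```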
If there is any time for questions, maybe I will take questions. I guess we are going right to the next one. Thank you very much, and I will see you later with the fish. [NOISE]