(En)Lightening Talks! (35:52) with Alastair Somerville, Laura Cortes, and Yoav Weiss
Alastair Somerville: Start Making Sense | Laura Cortes: UX in VR | Yoav Weiss: Responsive Images, or Fixing Web Standards
[MUSIC] 0:00 Okey dokey, let's start. 0:05 So we're talking, start making sense. 0:07 I'm Alastair Somerville, and if you ever need to get hold of me, 0:10 I am much easier to find on Twitter; that's my Twitter address there. 0:14 So what I do, I work in sensory design, 0:19 which is a relatively obscure area, in the sense that 0:22 it's an edge onto accessibility, but in a rather strange area. 0:27 I work in trying to make information meaningful. 0:31 In that, I work in an area where what I want to do is 0:34 discover how users sense information, 0:39 and then translate or adjust the information that I'm given to their needs. 0:43 So a lot of my work is taking visual information and 0:49 then translating it to tactile, 0:53 or taking incredibly complex textual information and 0:54 converting it to graphics or to audio. 0:59 And so it creates a lot of problems. One of my biggest problems is, 1:03 I spend a lot of time working and making stuff which is then cast into steel. 1:08 So if you've ever designed a UI, I have UIs out there which will still be in 1:14 existence when I'm retired and dead, because they'll last 25 years. 1:18 And so, for example, last year I was working with the Imperial War Museum, 1:24 where they have a new World War I gallery. 1:29 And they needed a map for people with vision impairments, 1:31 to be able to show where things were, 1:35 what the journey was through the building, which is fine. 1:37 So I designed a map, and you can go and visit it; it's there. 1:40 And it works for people with visual impairment. 1:46 But simultaneously the map is actually doing another thing which 1:50 is not really noticed by a lot of people. 1:54 The map is also a map of where the quiet places are in the gallery. 1:56 Because the problem with the exhibition is that it's a very tight space. 2:01 It's a very, very hard space for noise.
2:06 So we were aware that for people with sensory processing difficulties 2:09 there would be a lot of problems, a lot of panic. 2:14 So we actually designed it to show where the seats were, 2:17 where to escape the exhibition. 2:22 So this is designing for different sensory needs. 2:23 Where it crosses over with digital is, of course, that what I view as sensory design 2:28 is also now what a lot of people view as being post-screen interaction: 2:33 the possibility of designing multi-modal interfaces which are spread across 2:38 the body or spread across the place. 2:43 So wearables, the Internet of Things. 2:45 Where there are senses, 2:48 there are screens, there are things which are beyond screens, everywhere. 2:49 And what I'm going to try and talk to you about today is some of the possible issues 2:54 that you're going to discover when trying to design for this. 2:59 And I'm going to talk about it in these terms. 3:04 I'm going to talk to you about senses, which is sort of a fundamental area for 3:05 me; the importance of adaptation, 3:10 how we as humans adapt to context, to content, all the time; 3:13 and, to bring it back to digital, 3:21 how important personalization is in the future. 3:24 I'm also going to be conning you. 3:28 There will be a trick in this, about two thirds of the way through. 3:31 I'm going to tell you something which is a trick, which is a con. 3:34 I apologize. 3:39 Fundamentally, what I'm going to talk about now is embodied cognition. 3:43 How many people know embodied cognition? 3:49 Ish? 3:52 Good. That means I won't be corrected. 3:54 >> [LAUGH] >> Neuroscientists. 3:56 You really don't want to get on the wrong side of them. 4:00 Embodied cognition matters nowadays because you're not talking about just 4:02 a person and a screen. 4:07 You're talking about the whole person moving through space, moving through time. 4:08 And so the ideas of embodied cognition become really, really important.
4:14 And embodied cognition's got fundamentally two things going on with it. 4:18 One, that your brain is not separate from your body. 4:22 The body and the brain actually completely match each other. 4:26 The brain considers itself to be the body, and it holds a model of itself. 4:29 So it is reflecting permanently what's going on. 4:34 And secondly, the other thing which is really important is to understand that 4:38 you don't just think with your brain. 4:43 You think with your whole body. 4:46 You have neurons, you have brain material, spread through the whole of your body. 4:48 So the actual way in which you move, the actual way in which you sit, affects 4:52 your ability to think, your ability to remember, your ability to do stuff. 4:59 Which creates some slightly odd effects, 5:04 which are really interesting, but they're really odd. 5:08 So, deep down, we sense, we act, at a cellular level. 5:10 We sense. 5:18 We take meaning. 5:19 We act. 5:19 Again, a fairly easy thing. 5:21 But we have five senses, and that makes things slightly more complex. 5:25 Which of the senses is important? 5:29 And that's why we have emotions. 5:31 Emotions exist to coat certain experiences 5:33 so that you actually act upon them faster. 5:38 They exist to exaggerate sensory input so you act faster. 5:41 But even that's not enough. 5:47 Sorry. 5:50 You have many senses. 5:52 I mean, we generally work with nine senses as being the idea. 5:54 You have many senses. 5:57 There are too many senses. 5:58 There are too many emotions. 5:59 And what that means is you end up having to have consciousness. 6:01 Consciousness is a trick on you. 6:05 It is a way in which emotions surface. 6:07 It's a way in which you can actually make decisions. 6:15 It's an interesting issue. 6:18 So we sense context and content. 6:20 We adapt. 6:23 We adapt through emotions. 6:26 And we adapt through consciousness.
6:27 And this is how we're adapting to the complexity of content and 6:30 all the stuff around us. 6:32 So what do you do? 6:37 What does it matter to you? 6:39 You support the whole of this embodiment through personalization. 6:42 Through the ways in which you enable personalization through the web, 6:46 you actually support the whole thing. 6:50 And I'm going to give you a couple of ways of how. 6:53 One, you need to understand sensory mapping. 6:57 One of the fundamental issues is a lot of people think that senses are a spectrum. 7:01 They're not. 7:07 They're axes. 7:08 You're balanced between your conscious and 7:10 your unconscious desire to sense or to avoid sensory experiences. 7:14 This is one of the things. 7:21 You want to pay attention, but parts of you don't want to pay attention. 7:22 And when you're playing tricks with attention, 7:27 the attention economy and all that, 7:28 you have to understand that people are balancing in this way, 7:30 between things they think they're doing and things they don't know they're doing. 7:34 And this creates problems. 7:38 But actually there are ways of discovering this. 7:40 We do have questionnaires. 7:42 There are ways, coming out of occupational therapy, of discovering 7:44 what the user's biases are. 7:47 So there are ways of thinking about this. 7:52 Sensory framing. 7:55 Again, you need to give a framework for 7:56 people to be able to adjust the experience to their sensory capacity. 7:58 So you're supporting embodiment through the personalization. 8:05 Again, a framework, a personalization. 8:08 And this is where the con is. 8:12 It's when I talk about personalization. 8:13 When I talk about these things, it all sounds terribly difficult, 8:16 as though you're going to have to do something new. 8:18 And yet you don't, because personalization already exists. 8:21 The whole framework is out there. 8:26 It's accessibility.
8:28 The accessibility framework which already exists in web design 8:30 is personalization. 8:33 It is sensory design. 8:35 It's just... 8:37 it's really difficult to spot. 8:38 So I mean, this is IBM. 8:41 IBM talk about hyper-personalization. 8:42 They talk about this in those terms. 8:45 That's the W3C, headings for accessibility. 8:49 And I have to say it is very clear that 8:54 nobody would spot that that's about sensory design or personalization. 8:56 However, if you look at the iOS accessibility features (this is iOS 8; 9:06 I think, actually, iOS 9 does change the headings slightly), 9:09 you can see that this is about adjusting the device, 9:13 adjusting the experience to the personal sensory needs of the individual. 9:18 This is personalization. 9:20 Accessibility is personalization. 9:25 So, to come back to the hows. 9:28 Understand that people are all living, 9:33 swaying slightly, between a conscious and unconscious 9:37 desire to experience sensory things. 9:40 Understand that you can enable user personalization, a user framework, a user map. 9:48 And finally you can do something else. 9:52 You can use this to do this. 9:53 You can enable diversity. 9:55 You see, diversity is normal. 9:57 Neurodiversity, racial diversity, sexual diversity. 9:59 All of these things. 10:00 That's normal. 10:01 And actually, you have the ability to do it. 10:04 You can broaden experiences, 10:07 you can deepen experiences, and you can enrich lives. 10:11 Thank you. 10:13 >> [APPLAUSE] 10:29 >> Hello, thank you for having me. 10:33 I am Laura Cortes. 10:34 I am a creative strategist at UNIT9, and an ex-user experience and 10:38 user interface designer. 10:40 And for the past year I've been working on projects in pitch and 10:45 new business proposals for virtual reality, and 10:49 I've been involved in both brand-new projects and internal research projects.
10:49 And today I'm here to talk to you about some of the learnings that we came across 10:56 during this past year when designing for this new medium, virtual reality. 11:01 So I have a question. 11:06 Who here has tried a VR headset? 11:08 Cool, yeah, so hopefully you guys will resonate with what I will be talking about, 11:12 and for the ones who haven't, 11:17 I highly recommend you go and try the Oculus Rift or Samsung Gear VR. 11:19 So what is VR? 11:24 That is the first question. 11:26 Before I introduce the learnings, 11:27 we need to give some context into what this new medium is. 11:29 It is a world that wraps around your eyes and your field of view. 11:33 And it's a 360 environment. 11:37 So it is something that goes all around you, and 11:39 it hopefully immerses you so much that it tricks your brain into 11:42 believing that what you're experiencing is actually real. 11:46 It takes you to places and it puts you in situations that you wouldn't expect to be in. 11:51 And if the experience is actually done in a good way, 11:56 it creates environments and feelings that you couldn't live in your real world. 12:00 So what you see behind me is an experiment that was done where they played 12:05 a pornographic movie on a VR headset. 12:10 And then they gave it to people to try, and 12:13 some of the reactions you see are quite scary; 12:15 people are quite disgusted by what they're seeing in front of them. 12:18 And they even take the headset off. 12:22 So things like these prove that it is quite powerful; it does create emotions 12:24 and sensations that you couldn't live otherwise, without this new medium. 12:29 And it also takes you to places; they can be fictional or real. 12:34 It removes you from where you are, the moment where you are, and 12:39 what you are living right now. 12:43 And it places you somewhere else, in another dimension. 12:45 So now we understand what VR is. 12:50 But what kind of experience can we actually build?
12:52 Because this is all very brand new and people don't know yet 12:55 what type of content we can produce. 12:59 So at UNIT9, we were able to identify three main types of experiences. 13:02 One being hyper-immersive emotional, like the pornographic movie you saw. 13:07 It's an experience where, for example, at 5GUM, 13:12 we did a branded experience where we placed a user with a headset 13:16 in a massive container, hanging in the air like this. 13:22 And we blew scented air into their faces. 13:27 And the whole idea was to put them in four different 13:30 fictional worlds that related to the four flavors of the gum. 13:39 So 5GUM is the gum. 13:41 And the outcomes of these were quite amazing. 13:43 People were actually engaged, and they felt like they were living in that world, 13:48 that abstract world; this was completely abstract. 13:51 There was no relation to the real world. 13:53 And they were quite immersed in interacting with it. 13:59 The second type of experience is point-of-view documentary live action. 14:03 This is quite self-explanatory. 14:06 So if you'd like to go and visit British Columbia but you've never been there, 14:10 you can watch movies. 14:11 You can take your friends' word for it. 14:14 But what if you could actually put the headset on and 14:17 be in the middle of a lake, listening to the birds, the water, the boat, 14:21 and hopefully get a sense of what it is to be in the middle of British Columbia? 14:29 And finally we identified a third category, which is games. 14:34 At UNIT9, we haven't really done a lot of games. 14:36 Brands are still very scared of investing money in VR, especially for 14:41 games, but BlazeRush is a good example of another company doing this; 14:46 they did a version for VR. 14:49 And one thing that is quite interesting is that they did what 14:53 is not a very common approach, which is a third-person point of view perspective. 14:57 So you're not actually driving, riding in the car.
14:57 You're sitting away from it, like a god's point of view, 15:00 and looking at the track and racing the cars, like that. 15:05 So once we understand what kind of experiences we can do, 15:09 we go on and identify what technology is available. 15:14 So far we have these three main VR headsets available on the market: 15:18 Samsung Gear VR, Oculus Rift, and Google Cardboard. 15:23 And since Oculus was bought by Facebook, we've noticed that there are a lot 15:27 more people and brands investing money in creating VR headsets, like HTC. 15:32 They created a new one a few months ago. 15:38 However, we identified, and I think a lot of people who have been working with 15:42 virtual reality can say the exact same thing, that haptic feedback is lacking. 15:47 It's impossible to actually bring someone into the virtual world if all your body, 15:53 except your head, is still in this world. 15:58 So none of the things that touch you can you actually feel. 16:01 If it's cold, if it's warm, if you're wet, if you're dry. 16:04 So all the experiences we can do are quite static. 16:11 They're quite laid-back types of experiences, where you sit back and 16:14 you watch whatever is happening in front of you. 16:18 We tested the British Columbia experience the last time 16:21 I gave this talk at a conference. 16:25 And most people were just standing like this. 16:27 And I had to go there and tell them, walk around, move around, 16:30 go like this, because people didn't really understand what they had to do. 16:35 So taking this into consideration and 16:40 understanding all the constraints that we face, 16:43 we were able to outline a series of rules, or I would say guidelines. 16:47 They were also based on the Oculus Rift guidelines, and 16:52 I am going to run through them a little bit. 16:57 So the first one comes directly from game design: it's binaural sound. 16:59 So, binaural sound is what we can describe as 3D sound.
17:04 It is very important that sound replicates what happens in the real world, so if 17:08 we're trying to put someone in the middle of a forest, sound bounces off things. 17:13 Sound bounces off the floor, sound bounces off trees, leaves. 17:17 If it's rain, it falls all over you. 17:21 So 3D sound design is very, very important to consider. 17:24 This one is different from game design. 17:31 So for whoever has worked with game design, 17:33 the HUD approach doesn't work for virtual reality. 17:36 We cannot ask a user to consider a 2D graphic environment 17:40 as well as a 3D world around them. 17:45 So every type of UI, text, buttons, calls to action, 17:48 little indications of where to go, they need to be placed in the 3D environment. 17:52 And they need to be scalable. 17:58 And they need to have perspective, depending on where they are in the 3D space. 18:00 Another thing different from game design is avatars. 18:06 So avatars are not very important when it comes to 18:09 first-person point-of-view types of games. 18:13 You don't need to have an avatar to be immersed in the game. 18:16 But in virtual reality it's quite different. 18:19 So if you look down and you don't see a body, 18:21 you see just emptiness, you feel like, okay, I'm actually not in this world. 18:24 I don't belong here. 18:29 Where's my body? 18:30 But if you do have a body, then the problem becomes, 18:31 does that body look like your own body? 18:34 Am I a robot? 18:37 Am I a male? 18:38 Am I a female? 18:39 Is the body actually the same size as me? 18:40 Do I feel like my legs are too short, too long? 18:43 So if we do include avatars in the experience, 18:46 they need to be considered very carefully, and 18:49 they need to resonate with the type of world you are building. 18:53 Another learning is using controllers. 18:57 So, again, this is different from game design outside of virtual reality. 19:01 When you're playing a game, you can see what you're controlling.
19:06 You can see what you're holding, if it's a keyboard, a mouse, a joystick, whatever. 19:09 But if you're wearing a headset, you can't really see whatever you're holding. 19:13 So controllers need to be built and used for blind use. 19:17 So you can't have people interacting with keyboards or a mouse. 19:21 Another thing different from game design is content. 19:27 So in VR, like I said before, content is 360; it's all around you. 19:30 So you need to consider every single little pixel that is displayed. 19:35 If you don't have a sky, and the user looks up and there's just a black shade, 19:40 it breaks immersion and removes the person from 19:47 the experience you're trying to immerse them in. 19:51 Animation. So this learning comes directly from real life, 19:56 what we call IRL, in-real-life interaction. 20:01 And it is what makes people believe that they are in the real world. 20:04 The world around us has motion. 20:11 It moves. 20:12 It creates blurriness if you turn your head very fast. 20:14 So all the animation and visual effects that are applied, 20:18 if you're trying to replicate what goes on in this world, 20:22 need to be as close as possible to that movement. 20:26 Again, from real life: when you grab your phone, you control it. 20:30 So I grab my phone, and I'm the one controlling the movement that I'm making, 20:34 and the speed that the phone is moving. 20:38 So every type of movement and 20:40 interaction needs to be controlled directly by the user. 20:42 You cannot have zoom-ins and zoom-outs automatically done for the user, 20:45 because it will create motion sickness. 20:49 It will create people getting dizzy and 20:51 actually wanting to stop experiencing what you're doing. 20:54 And finally, latency. 20:58 This is a big, big issue. 21:00 What you see here is an experiment done by a Swedish team, where they 21:02 placed two VR headsets on a guy and a girl; they were playing ping pong.
21:06 What they were seeing was what was actually in front of them. 21:10 However, the latency made them completely miss 21:12 the ball when they were trying to hit it. 21:17 And then finally, testing. 21:20 This goes across all disciplines in our industries, digital: we need to test. 21:23 You can't just expect that the experience you're building actually works. 21:28 >> [LAUGH] >> So in this case, 21:33 these guys actually decided to test it, 21:35 asked the security guard to try it on, and he was so, so into it. 21:39 He felt it was so real that, 21:45 whatever was going on, it was probably like a roller coaster or something, 21:47 but he fell out of the chair because he couldn't hold on. 21:50 >> [LAUGH] >> So as you can see, 21:54 there are a lot more dos than don'ts. 21:57 We don't have answers for everything; it's a lot of trying and 22:00 failing, but we do understand something. 22:05 Do not place someone in a virtual world without telling them what to do or 22:10 where to go. 22:14 People won't understand it. 22:14 If it is an experimental, emotional, artistic project, 22:17 yes, 22:21 it can work. 22:22 It's more for the experiential part of it. 22:23 But if it actually tries to communicate a message, 22:26 if it tries to sell a product, if it's for a brand like 5GUM, 22:29 you really need to have good indicators for users. 22:33 So those are slow and progressive familiarization. 22:36 Don't drop users in and suddenly tell them, run there. 22:40 They want to know how to do it; slowly walk them through. 22:43 Start the animation, and everything needs to be very, very progressive. 22:47 Visual cues are very important. 22:52 So things like arrows pointing to go here, or sounds, or 22:54 indicators on the floor to move next, and guidance and help. 22:59 The good thing is that, with all the technology available today, 23:05 things are moving quite fast.
23:10 At UNIT9, we're experimenting with a lot of new gadgets, 23:12 like sensorial gloves, or perspiration trackers, heartbeat trackers. 23:18 So things like heartbeat, gaze, geolocation, head motion, 23:24 perspiration, and even brain activity. 23:29 We really believe they will be plugged into VR headsets in the future. 23:31 And whatever experience we're building will be completely 23:36 personalized to the user. 23:38 So if your heartbeat is racing, 23:39 we will understand that you might want something that slows things down. 23:41 Or if you're bored, we can speed up the experience and 23:46 make the adrenaline pump. 23:51 And to finish, it's a great time to be creative in user experience. 23:54 We now have the chance to create a new grammar. 23:59 A lot of words and meanings that weren't part of the digital 24:02 world before, like gaze, and run, and things that come from games of course, 24:07 but are new to the new VR world. 24:13 We have the chance to build this new grammar. 24:17 Thank you. 24:20 [APPLAUSE] >> Is this thing on? 24:23 Yeah, we're good. 24:29 So, hi, yeah, I'm Yoav Weiss. 24:30 I work for Akamai, and I'm here to talk about responsive images and 24:32 web standards in general. 24:36 So once upon a time, 24:39 there was a mobile web. 24:42 We had mobile websites served to mobile-specific browsers on mobile phones. 24:43 It was as sad as it looks; not that many people used it. 24:51 Should have gone with the other clicker. 24:55 Okay. 25:04 And eventually 25:05 the iPhone came out and broke this barrier. 25:10 All of a sudden, desktop websites were working on mobile browsers. 25:14 And the result of that was that people moved on and 25:20 started making iPhone-specific websites. 25:24 [LAUGH] I'll stay here. 25:29 But that wasn't scalable; with the myriad 25:36 of devices that followed, it just wasn't working.
25:41 And eventually we got responsive web design figured out, 25:46 which was based on media queries, fluid grids, and flexible images. 25:53 The flexible images part was optimistically defined. 25:59 The spirit was that that's an easy problem: 26:07 we just need to send the largest possible image and 26:10 let the browser resize it. 26:16 That's, yes please. 26:24 And we considered it a job well done. 26:27 We were pretty happy with ourselves. 26:30 But pretty soon we realized that, thanks, 26:32 that it's not working as well as we thought it was. 26:39 Responsive websites were bloated websites. 26:43 Most of the resources that we were serving 26:50 to various devices were the same resources across all form factors. 26:53 Most of these resources were images. 26:58 And a lot of data could have been saved if we adapted the images to the device. 27:01 So the result was data plan abuse. 27:07 We were abusing our users' data plans, 27:10 abusing their pockets, and wasting their time. 27:14 Developers gathered from all over the Internet demanding a solution, 27:17 turned to the various standards mailing lists, and 27:25 started making various proposals. 27:29 And eventually the Responsive Images Community Group 27:32 was formed to resolve this issue. 27:38 After a couple of months of discussions, 27:42 the community group came out with the picture proposal to resolve that problem. 27:48 At the same time, on the WHATWG mailing list, 27:52 an Apple engineer came up with a seemingly competing proposal 27:56 of extending the img element to include the srcset attribute. 28:00 That proposal was added to the HTML spec over the weekend, 28:05 without much public discussion. 28:08 The developer community was not pleased by that. 28:11 And on a lot of mailing lists, flame wars ensued, and 28:16 for a short while we had the picture-versus-srcset situation.
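The "flexible images" technique he describes, scale one large image down in the browser, is typically just a couple of lines of CSS. A minimal sketch (the file name and class name are illustrative, not from the talk):

```html
<!-- A fluid image: one large source file, scaled down by the browser.
     Simple, but it ships the full-size image to every device,
     which is exactly the bloat problem described next. -->
<style>
  .fluid img {
    max-width: 100%; /* never overflow the containing column */
    height: auto;    /* preserve the aspect ratio when scaling */
  }
</style>
<div class="fluid">
  <img src="hero-2000px.jpg" alt="Hero image">
</div>
```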
28:16 We had two competing proposals that were fighting against 28:23 each other on the same ground. 28:28 But shortly after that, we realized that picture and 28:30 srcset are two complementary proposals that can co-exist. 28:34 And each one of them resolves different use cases, 28:38 resolves different needs of developers. 28:43 Browsers weren't convinced by that, and for a long while nothing happened 28:48 on that front in terms of browser implementation. 28:54 Until we were able to get some browser engineers involved, 28:58 who came up with proposals of their own. 29:02 There was the src-N proposal, which was quickly rejected 29:04 but triggered a lot of proposals from browser folks, 29:10 from people working on the browsers, 29:16 basically rehashing the discussion that we had in the community 29:21 a year and a half earlier. 29:24 And eventually we got back to a simplified form of picture 29:28 and an extended form of srcset as the eventual solution. 29:33 Mozilla were positive about that approach. 29:38 Blink, the rendering engine behind Chrome, was less enthusiastic about 29:42 it, due to implementation concerns and missing infrastructure. 29:48 I started working on that infrastructure in order to 29:54 convince the project that it could be done, 29:59 and eventually crowdfunded part of that effort. 30:03 Before I was [INAUDIBLE], that was when I was still independent. 30:07 And eventually landed the patches. 30:13 And the whole feature shipped in Chrome 38 about a year ago. 30:18 It also shipped in Firefox, and 30:27 it's now being worked on in IE, and parts of it have shipped in Safari. 30:28 And this was a group effort of various individuals 30:34 from the web developer community. 30:39 Web developers, web designers, browser folks, 30:41 standards folks all gathered together to resolve this issue. 30:44 And it was heralded as the new way of doing web standards. 30:48 So, great success. 30:54 But what do we do next?
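The two complementary features he describes landed in the HTML spec roughly as follows. A sketch, with illustrative file names and breakpoints: `srcset`/`sizes` lets the browser choose a resolution, while `<picture>` handles art direction, where the author decides which crop to show.

```html
<!-- Resolution switching: the browser picks the best candidate
     from srcset, guided by the layout hint in sizes. -->
<img src="photo-800.jpg"
     srcset="photo-400.jpg 400w,
             photo-800.jpg 800w,
             photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="A photo">

<!-- Art direction: explicit media conditions choose the crop;
     the <img> inside is the required fallback. -->
<picture>
  <source media="(max-width: 600px)" srcset="crop-square.jpg">
  <source media="(min-width: 601px)" srcset="crop-wide.jpg">
  <img src="crop-wide.jpg" alt="A photo">
</picture>
```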
30:57 How do we replicate that model to continue to evolve the way web standards are made? 31:00 First, the question: why change? 31:11 In my view, there are two major problems with the way web 31:13 standards have been done up until now. 31:17 The theoretical model behind that way is that web developers encounter a problem. 31:22 They go out to the mailing list of the relevant working group, 31:30 specify that problem, specify the use cases that they can't resolve. 31:34 And then the working group members specify a solution. 31:41 Browser vendors implement that solution. 31:49 And eventually we have a feature that is working and resolves the users' problems. 31:52 The problem is that, more often than not, the result is that 31:58 users are using the feature in ways that the implementors hadn't predicted, 32:05 that the solution that we have in place doesn't resolve what 32:13 developers actually need. 32:18 That's problem number one. 32:22 Problem number two is that, in my view, the web is in trouble. 32:25 The web is a wonderful platform, full of human knowledge, 32:32 supporting the world's economy. 32:40 It serves billions of users, 32:41 millions of developers, 32:46 but only hundreds of browser implementors. 32:49 Which means that browser people are extremely busy, and they're resolving 32:53 the problems that they see before them, the problems that they consider important. 32:58 Which are not necessarily the problems that developers consider important. 33:02 And in my view, 33:08 we're heading towards a tragedy of the commons. 33:09 We're heading towards a situation where everyone is using this platform and 33:14 no one is chipping in, or very few people are actually working on it. 33:20 And the solution to both of these problems is basically to get more developers involved, 33:26 to move people from the millions group to the hundreds group.
33:34 Get the developers involved in the feature creation process, rather than just 33:41 the use cases at the beginning, and get feedback all along the way. 33:48 The problem with that approach is that 33:56 getting involved in web standards can be overwhelming. 33:59 There are a lot of groups. 34:02 A lot of the mailing lists themselves are very hostile to navigate through. 34:05 And the web standards community wasn't always known to 34:11 be very welcoming towards new people. 34:17 So in order to resolve that, we created 34:23 the WICG, the Web Platform Incubator Community Group. 34:28 It's a new community group whose purpose is to 34:33 help developers get involved in web standards. 34:38 To provide a friendly forum to incubate new proposals, 34:43 to discuss, to come up with new ideas, to bring their needs and 34:50 help these ideas get developed, 34:57 and then move them over to the relevant working group, 35:02 to the relevant standards bodies that will standardize that effort. 35:06 So, I'm out of time. 35:11 But in short, the web is ours. 35:14 It's on us to maintain the web. 35:19 It's on us to make sure that 35:21 it gets evolved in the way that we need it to in our day-to-day work. 35:25 So join the community group, follow us on Twitter, and 35:31 join or start discussions on our Discourse instance. 35:37 And just get involved. 35:42 Thank you. 35:46 >> [APPLAUSE] 35:48