Redesign Must Die (43:36) with Louis Rosenfeld
As a term almost devoid of meaning, "redesign" is a crutch that leads to disastrous results. Redesigns originate in the hazy concerns and political machinations of senior leaders, coalesce around poorly defined problem statements, gain steam around hopes of "fixing everything once and for all," and conclude with Pyrrhic celebrations of what are ultimately a sad bag of cosmetic changes. Along the way, agencies get fat and rich, in-house teams get crushed, and users suffer. And because redesign efforts yield so little institutional learning, the whole sad mess gets repeated every few years. In his session, Lou Rosenfeld will argue instead for the rational and cost-effective alternative of incremental change: tuning sites over time, rather than "fixing" them all at once. Over and over again.
Well, thank you very much for coming to hear me talk. This is probably the least practical, maybe even the least relevant, talk of the conference.
>> Chris: I talked earlier.
>> Louis: Oh [LAUGH]. So I agreed to do this talk. It's a talk I used to give a lot, and when they asked me to do it I kind of said yeah, and then I realized that the slides were, I don't know, seven years old or something? So I redid them for the Keynote era. Moving from PowerPoint to Keynote is an amazing thing, and if you haven't done it, I recommend you buy an Apple product just for that reason. Although I hear moving to the new, new Keynote really sucks, so stop about two years ago.
So I'm going to talk about why redesign must die. Before I do that, I'll just say you need to know my bias. Besides being the guy, ish, behind Rosenfeld Media, which publishes books and provides consulting on user experience, my prior career is as an information architect. I'm the co-author of the O'Reilly polar bear book. Anybody familiar with that? A few of you, thank you. It's essentially the big book on IA. So my perspective is that of a librarian turned information architect who's worked with a lot of really large organizations with really big information headaches. So really, my appropriate job title would be information therapist: I try to get organizations on a couch and see what hurts, and by the time they're talking to me it's usually got something to do with finding information. And that's what I'm talking about today: redesigns that try to change things for the better but end up going off the tracks, which is why I certainly used to do a lot of information therapy.
So let me jump right in with a story. I lived in Ann Arbor, Michigan for many years. Ann Arbor was the home of my former company, Argus Associates, which did information architecture consulting; we had 40 people in Michigan in the late 90s and early 2000s. Doing that kind of work there was remarkable. It's also the home of the University of Michigan, of which I'm a two-and-a-half-time alumnus. Any other Wolverines here? All right, a couple of you. So here I am, living in Ann Arbor in, oh, 1998, where the U of M is, and I pick up the Ann Arbor News one day and read the front-page top headline: "University of Michigan to redesign website." And the subtitle: hires local firm, Marcus Associates, to do the work. That was kind of interesting to me, because they'd never talked to us about it, and as president of the company I probably would have been in the loop. But no, I hadn't heard about it. I found this pretty quizzical, but the general idea of redesigning the site wasn't so surprising, because it kind of looked like this. If you're old enough to remember sites that looked like this; a few of you. The University of Michigan main page is kind of cute to look at now, but really, this was pretty horrible for a lot of people who were trying to use the site to do things and to find things, and that was what most of the complaints were about. Not only was this page not very helpful in that regard, as a gateway to this huge, huge information space, but it led to things that looked like this, and this, and this. The School of Art and Design, for example, had a Flash version and a text version at the time.
The Athletic Department had a completely different brand. And then my alma mater, the School of Information, had pioneered this unbelievable new thing called vertical tabs. Have you seen them? There's a reason: I think they were the first and the last. So we had these vertical tabs. We had, basically, a mishmash of crap. These guys had their crap, these guys had their crap, et cetera, et cetera. It was just a big mess. And so the University of Michigan was, damn it, going to fix it once and for all. They were going to get it right. Okay. So they got about a quarter million dollars dedicated to the project, and they had, literally, senior-level support from the president of the university, a guy who's so smart that he actually left U of M and became the head of Princeton. I was wondering where my company fit in all of this, and ultimately we didn't, so granted, I probably have a little sour grapes here. But they said: you know, we've got a bunch of work-study students, and we've got some technology called WebObjects; I think we've got it covered after all. Okay. So about six months later, they have an actual ribbon-cutting ceremony. They literally had a ribbon cutting for the new site, and even brought in Abe Lincoln from way back. And what did they ribbon-cut? Well, it kind of looked like this now, which I guess is an improvement, but primarily a cosmetic one, because pretty much everything is structured, even the way the search works, exactly the same. Maybe a better lipstick on a pig, but a lipstick-on-a-pig situation nonetheless. And guess what: it still linked to this, and this, and this. So it wasn't working. It wasn't working. And here's the thing: as far as they were concerned, they got it right.
They had a ribbon-cutting ceremony. They celebrated victory, spent their money. Some of them looked good somewhere along the line, and then they just kept doing it again, and again, and again, and again. I even got brought in as a consultant to help on one of these, and I was ignored, so basically more sour grapes. I worked with a team for a while; I got them recharged, got them recentered. And then I told the senior associate vice president provost, or whatever they were: you know, we can save you guys a lot of money by not redesigning but by improving things, actually fixing some problems that we identify. And I was told: no, I'd rather not do that; all my peer institutions are doing big redesigns. And the behind-the-curtain issue was that that person was looking for their next promotion, and by the time the site fell off the tracks again, they'd be on to the next job, which is precisely what happened. If you look at the University of Michigan today, it looks like this, which is basically the same information architecture, and it still leads to crap like this, like this, and like this. Very little has changed. What's going on here? Have you seen this before, any of you? Some of you, right? Any of you work in a large organization, or even a medium-sized organization, where one hand doesn't know what the other hand is doing? One hand doesn't care what the other hand is doing. One hand hates the other hand! [LAUGH] Wants to stamp it out. That's a problem not just in large institutions but in medium-sized and even small ones. If your organization is too big to fit in one room and has a web presence of some sort or another, then you've probably had problems like this. Okay. So I want this to die.
I want this to die. And really, what we're looking at is the same old saw about insanity: doing the same thing over and over again but expecting different results. And I want to add a few things. Part of the issue here is vanity. It's not just insanity; in the near term it's that leader who says: well, I need to make it look good. I need to make it look like we've solved the problem, and then we're going to move on. Or: I'm going to move on. And because of that, over time, you see another thing, which is organizational stupidity, because it's different individuals operating in predictable ways: not just leaders, but people on teams, agencies, and so forth. And then, as an organization, how does this look to the rest of the world? Not so good. So who's to blame? Is it the boss? Is it the individual? I think it's all of us. Let me dig in a little bit. I think the problem, if you scratch the surface and get beyond the cosmetics, is what I would call misdiagnosis. In other words, there's a vacuum: the diagnosing of problems that should be happening is not happening. And when there's a vacuum, something fills it, of course. In this case, I think there are three typical rationales, or causes, that fill that vacuum. Sometimes one of them is the primary one; sometimes they're all operating concurrently. The first is that we're trying to do the impossible. We're trying to change entire web environments at once. We're trying to boil the ocean, if you will. And that's pretty hard.
How do you hold down that website, put your knee on its throat, and make it stop moving long enough to make wholesale changes? That's tough. The second is believing the unbelievable. When there's an absence of diagnostics, what we do is go for the things that sound good. So that vendor of that search engine, or of that CMS, or that agency who's worked with institutions like yours, sounds like they've got a good solution, because they've talked to people like you, people in decision-making positions, before, and wow, what they've come up with sounds really good. So let's just go with that, because we haven't done our homework. Unfortunately, those third parties are going to give you the solution that sounds best to them, and not necessarily the one that's most appropriate for you. And then, finally, there's just being irresponsible. This is really the vanity issue, where senior leaders say: all right, I just need to make it look good enough until I'm out of here. If it goes bad again, that's somebody else's problem; kick the can down the road a bit. So now I want to go back in time a little further, another five years or so, to when I was a lowly doctoral student at the University of Michigan. I took a class with a guy named John Holland. John Holland was the first person to get a PhD in computer science in the U.S. He's the guy who invented things like genetic algorithms, which didn't really catch on for about 30 years; it took people that long to figure out what the hell they were for and how they could help. John is one of these people associated with the Santa Fe Institute who talk about complexity, and specifically complex adaptive systems. I took his class, and I had no clue what he was talking about.
It took me a good ten years to figure out what it was about, and to see that it actually had some applicability to what we all do. So now what I want to do is reframe some of these problems in ways that give us a new perspective. Complex adaptive systems are what people like John Holland talk about, and here's a definition; I'll read it to you while you look at the pretty video. An entity consisting of many diverse and autonomous components, a bunch of things we call agents. They're interrelated, they're interdependent, and they're linked densely. And even though they're all separate, they behave as a whole, as a single system. And it's not just a system; it's a system that's adapting, that's learning, that's changing in response to changes in its environment. Complex adaptive systems. I would hazard that that's what we've got on our hands with our websites. Before we dive into that, let me give you a couple of other examples of complex adaptive systems that might be a little more familiar. Like a natural gas pipeline; I know you're all real familiar with those. How gas moves through a network like this changes on an hourly basis, based on routing, and traffic, and supply, and where the supply originates, and a number of other factors that are too many for our small brains to really take in all at once. Or the trading floor of a commodities exchange: what happened in a farming area in Brazil last week has a huge impact on what's going on on the floor in Chicago at the moment. It's hard to say these systems are unpredictable. What's predictable is that there's going to be a swing back and forth between chaos and order.
And that's what we have, really, in effect, with our websites. There are predictable changes: they swing back and forth from being very centrally managed to being very distributed and autonomous, and back again. It seems chaotic at the time, but there is some predictability there. Another thing that those of you who are parents are certainly familiar with is the immune system. When you have a little kid, you can see how different their immune system is from your own, and how their immune system affects yours. Think of all the variables that go into how your body repairs and protects itself. How different it is for the people in this room, maybe, than for people a few miles up the road, in a school in Tampa. It's completely different, based on so many variables that it's really hard to take it all in at once. So now we've got our web environments, and our web environments are just variables built on variables built on variables. We've got users, who vary in tons of ways; I'm just scratching the surface here. We've got content that's constantly changing as well. We've got an organizational context that's constantly changing in response to the marketplace, in response to mergers and acquisitions and splits and new management and stock price changes and God knows what else. And yet, with these moving targets built on moving targets, are we really going to control this stuff? The whole theory of redesign is that you can. That you can get something right. That you can hold it down long enough to perfect it, make it good enough, and then hope and pray that entropy doesn't pull it back down the toilet. Nope. That's where we typically are with so many web environments. And so, who's to blame?
Well, really, everyone and no one. It was certainly bad leadership, irresponsible leadership. But ultimately, I think most of us who are practitioners are kind of to blame as well. We get sucked into the things that are really interesting today. We tend to be trend followers, to a large degree. I mean, it's kind of interesting to think about how many sites now are probably fully responsive and yet are still impossible to find information in, or to do basic tasks within. They're responsive, but they're not very useful. So what I want to do is give you a few ways of thinking, maybe not the most practical in the world, but at least a framework for thinking about the problem differently, one that to some degree draws on the idea of complex adaptive systems. Let's jump right into that. I'd like us, whether we're senior people, or people running teams, or whatever position we're in, to stop pretending we can play God with our web environments, to stop thinking we can have a huge impact, and instead look to have small impacts in meaningful places: things we can do that will actually make a difference. So I've got five ideas, five themes, for you: swapping control for prioritization; embracing evolution over revolution; emphasizing process over projects; using anchors to counterbalance reaction; and sneaking in governance at the grassroots level. I want to go into some of these now. Actually, all of them. So: swapping control for prioritization. This is something called the Zipf curve, and I will give a free Rosenfeld Media book to the first person who can tell me, and Deb, you're excluded here, wherever you are. Anyone but Deb: who knows what the Zipf distribution, or the Zipf curve, is?
Anyone familiar with it? Deb, you have enough of our books already, so you don't want another one, okay? So, the Zipf distribution. Zipf was a linguist, George Kingsley Zipf, to be specific. What linguists often do are things like counting the words in texts, which sounds kind of boring, but actually, when you have a hunch like Zipf did, it can lead to really interesting things. What Zipf found was that when he took pretty much any text's words and stacked them from most frequent to least frequent, they had an interesting distribution that started really high and then dropped like a hockey stick, right on down. So a few words were really common: "the," "with," et cetera. Then there was what you might call a middle torso. So we have the short head, the middle torso, and then the long tail. And what we find is that this phenomenon occurs again and again and again, in so many places, many of them relevant to the people in this room. My favorite toy, by the way, is site search analytics, where we look at what people search for on a site and stack those individual search queries from most common to least common. It's the same thing Zipf found with the words in a text. What's interesting is that it doesn't play out evenly. It's not an even distribution. Your site is not a democracy. Your site is not a democracy. There's not an even distribution of search queries; they don't all occur at pretty much the same frequency. It's not even a gradual drop-off. No, it's a really steep one, and the long tail is really, really long; it goes on and on and on. So there's lots of esoterica in the way people interact with our information and our websites.
I want to make this point really clear, so I'm going to do it now, in text. Again, in this example we're using search queries. This comes from Michigan State University, where they looked at, I don't know, a particular couple-week period, I think it was. The most common search was for "campus map." That was the number one search, 7,000-plus times in this time period. That was 1.4% of all the search activity, which doesn't sound like very much, but when you realize it comes out of tens and tens of thousands of distinct search queries, one of them being 1.4% is remarkable. How many queries does it take us to get to 10% of all search traffic? Only 14. How many to get to 50%? Only about 500. That's amazing. So if we want things to work better, like search in this case, and we don't have a lot of resources, certainly less than the average redesign costs, we can still improve performance dramatically by biting off the first 10% or 20% or whatever, and these are very manageable numbers. Let's say 50% of all your users search, and you can improve the search experience for 30% of them by fixing the performance of your top 98 most common search queries. Thirty percent times 50% is 15%, so you can argue that you've improved the user experience 15%. There are lots of ways to shoot holes in those numbers, but they're helpful numbers nonetheless. Use them wisely. All right, so search queries, but other things too. Your documents. Your site's documents, you know, are going to have the same distribution. I can promise you that a few documents are doing all the heavy lifting. Conversely, there are many documents that will never be accessed even once.
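The head-of-the-curve arithmetic above is easy to reproduce against your own search logs. Here's a minimal sketch, not from the talk itself: the function name and the toy query counts are mine. It ranks queries by frequency and reports how many of the top queries you'd have to fix to cover a given share of all search traffic.

```python
from collections import Counter

def queries_to_cover(query_counts, targets=(0.10, 0.50)):
    """For each target share of total search traffic, return how many
    top-ranked queries it takes to reach that share."""
    total = sum(query_counts.values())
    pending = sorted(targets)          # check targets from smallest up
    running, needed = 0, {}
    ranked = sorted(query_counts.items(), key=lambda kv: -kv[1])
    for rank, (query, count) in enumerate(ranked, start=1):
        running += count
        while pending and running / total >= pending[0]:
            needed[pending.pop(0)] = rank
    return needed

# Toy log shaped like a short head and a long tail.
log = Counter({"campus map": 40, "football": 25, "library hours": 15,
               "parking": 10, "tuition": 10})
print(queries_to_cover(log))  # {0.1: 1, 0.5: 2}
```

On a real log with tens of thousands of distinct queries, the same cumulative walk is what produces numbers like "14 queries cover 10%."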
In fact, I like to spread this rumor, because I like to pick on Microsoft like everyone else: I heard that something like 90% of all the documents on Microsoft.com have never been accessed a single time. They're spending a lot of money to create those documents. They're spending a lot of money to serve those documents. They're spending a lot of money to dress them up in all kinds of ways, and they get really focused on those documents when they want to boil the ocean and do a full redesign. But ultimately, it's kind of pointless. In fact, those documents' presence probably diminishes the user experience for many users by getting in the way of good information, and it certainly allocates resources away from the documents that really do, or should do, most of the heavy lifting. Speaking of Microsoft, the same thing is true of features. Bloatware is typically what you get when a product doesn't observe the Zipf distribution, as its designers should, because only a few features are really necessary. And on and on: some of your audiences are more important than others, and so forth. So if we start thinking about Zipf as a tool we can use to figure out what our high priorities are, then we find that investing just a small amount of effort and resources into fixing the problems identified by the Zipf distribution gives us a real benefit: an inexpensive way to make wide-scale improvements that last for quite a while. Now I'm going to take this concept a little bit further.
When you think of your content as having certain tiers of value (some stuff is accessed a lot, some stuff is not accessed at all, and there are tiers in between), you can start thinking about it as a layered model, an onion model. At different layers, whether you're an information architect or a usability engineer or a content strategist or some other person dealing with content, you can start saying: well, at a bare minimum, at my layer zero, I'm going to do things like make sure everything is indexed by the search engine. At the other end of the spectrum, you might invest a lot of hand-crafted effort in things like applying really expensive metadata by hand. And then there are all kinds of gradations in between, for each of these different tracks and others as well. I'm just trying to give you an idea: once you get a sense of what the critical content is, the center of the onion, you can start applying your resources more effectively, making the really important stuff accessible in multiple ways, and doing cheap, inexpensive things for the stuff on the outer layers, like pointing your search engine at it and having it crawl, and all that kind of stuff. Okay. We can come back to this during the Q&A if it'll help. When you put all this together, what we're really talking about, again, is prioritization. I find that prioritization, as part of a broader process, has a real benefit in helping us invest our resources more wisely. I like to think of this as a report card. Let's come up with a prioritization of audiences.
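One way to make the onion concrete, sticking with the Zipf idea, is to assign each document a layer from its share of traffic. The cut-offs, document names, and function below are my own illustration, not from the talk: the short head carrying the first half of all accesses goes in layer 0 (worth expensive hand-applied metadata), the next chunk in layer 1, and the long tail in layer 2 (just make sure the crawler indexes it).

```python
def onion_layers(access_counts, cuts=(0.50, 0.90)):
    """Assign each document an onion layer based on how much of the
    cumulative access traffic precedes it: 0 = core, higher = outer."""
    total = sum(access_counts.values())
    layers, running = {}, 0
    for doc, count in sorted(access_counts.items(), key=lambda kv: -kv[1]):
        share_before = running / total
        layers[doc] = next((i for i, c in enumerate(cuts) if share_before < c),
                           len(cuts))
        running += count
    return layers

print(onion_layers({"/apply": 60, "/map": 25, "/news": 10,
                    "/minutes-1997": 3, "/old-brochure": 2}))
```

The point isn't the exact thresholds; it's that the layer assignment, and therefore the resource allocation, falls out of access data you already have.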
So if I'm at a university, for example, I might say my most important audiences are applicants, people who we hope will give us money; students, people who are giving us money; and alumni, people we hope will give us money forever. Once I've said that, of all the audiences out there, these are the three I'm really going to focus on, then I might ask: all right, from the stakeholder perspective, what are the most important things for each of these audiences? For applicants, when I do stakeholder interviews, I might find that the top five things, according to the stakeholders, are that applicants should be able to get access to brochures and application forms, find out how to visit campus, learn about our mentoring programs, and hear wonderful stories about our successful alumni. When I do user research, talking to actual applicants, I might find some overlap, but also some really different things. I might use search analytics, task analysis, and other basic methods to find out. I might find that, yes, they want to be able to access an application form. They want to be able to visit campus. They want the campus map; the people on campus don't think about that, because they know their way around, but the people off campus do. And they also want to know where the keggers are; they want to go out and party. Now the art and science of negotiation comes into play. We take these two sets of short-head needs, one from each perspective, and we combine them somehow. And then we test them out. Where are we succeeding, and, more importantly, where are we failing? Oh, it's really hard to learn about the mentoring programs.
If you test it out, do some really basic testing, five users maybe, you might find: wow, four of them couldn't find the mentoring information. That's something to fix, and it's a small thing to fix with a big impact. With this approach you're covering it from both the stakeholders' perspective and the users' perspective. Now rinse and repeat, rinse and repeat. A simple process like this you can repeat every month, and you'll keep finding the F's and the D's, whatever the failures are. They're typically going to be very narrow-gauge, and often fixable with limited resources, certainly less than you're going to spend on a redesign, with much greater benefit. Okay, so that's the one about prioritization. Let's go to the second one, about evolution rather than revolution. This, going back to my old friends at Michigan State University, is kind of a crazy spreadsheet they put together. What it does is look, over time, at all their common searches as an expression of what users wanted from their site, and then color-code them. Yellow are things like grading systems and core systems. Whitish are courses. Grey are sports, like football. Black are things like maps, campus maps and so forth. Let's take a closer look. Here, over a few months, you can start seeing some interesting trends, what we call seasonality. For example, at the beginning of the football season people care about football, and then, as usual, the Michigan State University football team goes to the toilet by, like, November, and those searches drop off.
>> [LAUGH]
>> And at the beginning of the semester, both in September and in January, people are looking for maps. Hey, look at December: all of a sudden people care about the library. Where is it?
Yeah, there it is. It's a lot of: oh, do we have one? Is it open at 3:00 AM? How do I get there? All of a sudden that comes up. So if you look at the data, and this is not an expensive thing I'm showing you, you start seeing trends. This is how the trends look, panning back a little and then a little closer; the colors are actually kind of helpful. And this is pretty powerful. I mean, even an organization that you wouldn't exactly place at the pinnacle of design of any flavor, the IRS, gets this. Hey, it's the night before taxes are due; what do they have on the main page? "As the April 17th deadline nears, check out the 1040 Central area. Last-minute tips for those who file paper returns and need more time to file." And then, on April 18th: "Never too early to get organized." So seasonality is something that, in many cases, people are telling you about. And it may not be monthly or seasonal. You might find that there are different needs from your site at different times of day. If you're running an intranet, maybe people care a lot about the lunchroom menu in the morning and about traffic conditions in the afternoon. Why aren't we giving them that in a way that makes sense? And actually, this is kind of a really cool way to squash a lot of main-page debates that people have. So I want you to look for change that is longitudinal, and then look for patterns within that change. You'll often find something of value. You may not know initially whether it's going to be a monthly thing or a daily thing or something else; probably there's going to be a bit of all of that at the same time, but you won't know for quite some time. Another theme: process over projects.
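A rough version of that color-coded spreadsheet can be built from any site-search log. In this sketch the categories, the query-to-category mapping, and the sample numbers are all hypothetical; in practice you'd derive the mapping from the short head of your own log. It buckets query counts by month and category so the seasonal swings stand out.

```python
from collections import defaultdict

# Hypothetical query-to-category mapping; build your own from your top queries.
CATEGORY = {"campus map": "maps", "football tickets": "sports",
            "library hours": "library", "course catalog": "courses"}

def seasonality(rows):
    """rows: iterable of (month, query, count) tuples from a search log.
    Returns {category: {month: total_count}} for trend-spotting."""
    table = defaultdict(lambda: defaultdict(int))
    for month, query, count in rows:
        table[CATEGORY.get(query.lower(), "other")][month] += count
    return {cat: dict(months) for cat, months in table.items()}

log = [("Sep", "campus map", 7061), ("Sep", "football tickets", 2400),
       ("Nov", "football tickets", 300), ("Dec", "library hours", 1800),
       ("Jan", "campus map", 5200)]
print(seasonality(log))
```

Dump the result into a spreadsheet with one color per category and the September map spike and December library spike become obvious at a glance.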
Now, I couldn't figure out a good picture to use, so: if in doubt, the Flying Spaghetti Monster will usually serve you well. If you have any better ideas, I'm all ears. So let's think not only about how people express their needs as something that's evolutionary, but about how we respond to those needs, whether we're designers or researchers, in a regular, ongoing way. This is the idea of emphasizing process over project, where the work you do isn't thought of as a time-boxed project but as an iterative responsibility, a process that takes up x percent of your regular week. Take, for example, the content inventory. A lot of people do one once in a while, and it's one of those things where it's like: well, I've got to take a snapshot of a two-million-page website and somehow make sense of it while it's changing under my feet. That doesn't really make a lot of sense. In fact, I'm one of the people at fault here, because in the polar bear book we told people to do this, without realizing how much bigger websites would get after 1998. It was one thing in 1998 to tell people to do this, and another thing a few years later. So I'm going back on my advice and saying that what you should do instead is more of a rolling, ongoing process. A rolling content inventory would be more about coming up with ways of sampling content, both core content and the more dynamic, changing stuff. There are ways you can be made aware, through things like spidering, of new content in a web environment. Oh, I hadn't seen this before; let me take a look. Somebody else created it, not me, but I probably want to know about it.
31:29 So the idea of sampling: taking, hey, a random hundred docs every once 31:33 in a while, maybe every month, and looking and seeing what you got. 31:39 Looking for changes, new pockets of content, maybe massive 31:42 departures of content that was there and suddenly has disappeared. 31:47 These are all ways of constantly understanding and making sense of a 31:51 content environment without the fallacy of 31:55 trying to take one snapshot of everything. 31:58 So I think there's a lot of our research and design processes that can follow 32:01 this approach, where instead of saying we're going to spend one week doing this 32:06 type of work and then it's done, 32:12 we're constantly looking at content or constantly 32:14 doing whatever it is we do. Maybe it's 5% of our week, 32:16 10%, or whatever it might be. 32:20 So really what I'm trying to get you to think about 32:23 is to move not just from project to process, 32:25 but from process to cadence. 32:29 The idea of cadence, where you are actually doing lots of different work. 32:31 Maybe it's not just you, it's a team, many teams. 32:37 But whether it's user research or different types of design work, it's stuff that just 32:40 gets done on a kind of schedule, where you're looking to balance that schedule. 32:44 In other words, I've got a few ideas here of user research 32:50 methods, some of which are kind of cheap and easy to do. 32:53 Some of which are not, so you don't do them as often. 32:56 Can you come up with a list of things you do and map them along a cadence? 33:01 You might find that there's a lot of little things and not many of these 33:07 sort of bigger, more involved things like a 33:11 field study, which is expensive and takes time. 33:13 Or maybe vice versa is the case: you're looking 33:17 at things once a year and not in an ongoing way. 33:21 So I'm encouraging you to think about this concept of cadence. 33:24 Here's another way to express it.
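The rolling inventory described above, sampling a random hundred docs periodically and comparing against the last look, can be sketched roughly as follows. This is a minimal sketch under stated assumptions: the function names are hypothetical, and it presumes your spider or CMS can hand you the set of current URLs.

```python
import random

def sample_inventory(all_urls, size=100, seed=None):
    """Draw a random sample of pages to review this period,
    instead of inventorying the whole site at once."""
    rng = random.Random(seed)
    urls = sorted(all_urls)
    return set(rng.sample(urls, min(size, len(urls))))

def diff_inventory(previous, current):
    """Compare two crawls: new pockets of content, and content
    that was there and has suddenly disappeared."""
    return {"new": current - previous, "gone": previous - current}

# Hypothetical crawls a month apart.
march = {"/about", "/products", "/faq"}
april = {"/about", "/products", "/blog"}
print(diff_inventory(march, april))  # {'new': {'/blog'}, 'gone': {'/faq'}}
```

Run monthly, this gives you a constant, affordable read on a changing content environment rather than one heroic snapshot.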
33:28 There may be things you do weekly. 33:29 There may be things you do quarterly. 33:33 There may be things you do annually. But look 33:34 at it from that perspective and look to have balance 33:35 across the board, so that not everything is a weekly 33:39 thing and not everything's an annual thing. That's all. 33:42 Okay, number four: use anchors to counterbalance reaction. 33:44 So one of the things that I find pretty 33:51 valuable is everything that Stewart Brand has ever written. 33:54 The guy who's behind the Long Now 34:00 Foundation, and long before that the Whole Earth Catalog. 34:02 He came and spoke at the IA Summit, gosh, I think we had him there about 11 years ago, 34:07 about this concept of pace layering. 34:13 Anybody familiar with pace layering? 34:15 So Brand was talking about the fact that pretty much any 34:18 universe can be looked at from a perspective of different layers. 34:22 And actually, his model's right here. 34:29 This is him at the IA Summit, about 11 34:31 years ago, talking about sort of a core layer, 34:33 where things change the least. 34:36 Things like nature, like laws of physics. 34:38 And then at outer layers there are things that change 34:41 all the time, see that line kind of squiggling around. 34:45 Like fashion. 34:48 Now, why I think this is a valuable thing 34:50 for us to understand is, again, in the interest 34:52 of taking a more balanced approach to the work we do. We are all up here, pretty much, 34:54 at the upper layers. 34:59 We're very much in a reactive mode. 35:00 We're kinda swimming around all the time and 35:02 bouncing from one side of the track to another. 35:04 And it's hard to even know we're going in a direction whatsoever. 35:07 In situations like that you need a countervailing force, some kind 35:11 of anchor, some kind of stabilizer to help you stay on course. 35:15 Not just because we tend to get 35:18 distracted, but because others tend to distract us.
35:20 From what it is we're about and what it is we're supposed to do. 35:25 In that interest, I think boring things like this, things I never ever wanted to 35:28 have anything to do with personally, I'm 35:33 starting to find are just, like, miraculous. 35:35 They're so helpful. 35:37 Things like charter, mission, vision, 35:39 and value statements are just unbelievably useful. 35:41 Not just because you have, like, this little thing 35:44 that might go on the back of your business card. 35:46 That's not even really that important. 35:48 It tells you what you do, but it's going through 35:50 the exercise, the journey of figuring out what you do, 35:51 and then having a good sense of how to stay on course. 35:55 Having a good sense of what the course is. 35:59 If you're not familiar with it, I really like Gamestorming, 36:02 by Dave Gray, Sunni Brown, and James Macanufo. 36:04 It's a kind of fun, practical way of doing 36:07 lots of things that have to do with meetings. 36:10 And they have exercises around things like developing an elevator pitch. 36:11 I've used it with clients many times, including the University of Michigan. 36:16 It's a Mad Libs exercise for developing 36:20 an elevator pitch for whatever: an organization, a product, a team. 36:23 And I highly recommend checking that out and trying that for yourself. 36:28 So start asking yourself, you know, if you do agree with me that 36:33 we're kinda getting pulled in different directions 36:37 all the time, what's gonna stabilize you? 36:39 Is it one of these types of things, is it something else? 36:41 A personal mission, maybe? 36:45 Okay. 36:46 Last thing. 36:47 Sneaking in governance. 36:51 So we're kind of pulling back from very 36:54 localized interactions to now thinking about organizations and how they work.
36:58 And as we do that, it becomes probably less and less relevant for those of 37:02 us who are practitioners and don't feel like we have a lot that we can do. 37:06 That we don't have our fingers on very many of those variables, those levers 37:10 that I showed you earlier, those moving targets of moving targets. 37:14 However, I don't know if I buy that. 37:18 I don't know if I buy that. 37:20 Even if you can't get your team or your organization to have 37:21 a clear anchor, a clear mission, I think you can as an individual. 37:24 And I think the same is true in terms of governance. 37:28 I think when we all talk about governance, we often have a 37:30 very top-down model of what governance 37:34 means in web environments, really any environment. 37:36 But that always assumes that, you know, governance means to govern 37:39 and not to be governed, and not to participate in that process. 37:44 It's very top down. 37:47 So I think that we should be thinking about 37:49 ways that we, as maybe not senior leaders, but other 37:51 people involved, who have a stake, who are participants, who 37:54 aren't powerless, can exercise that power in a more meaningful way. 37:58 So, by the way, 38:03 the problem is, you will feel like 38:05 you're fixing the airplane while you're piloting it. 38:07 I love this ad. 38:10 It's from, like, the 30s, from some guy from Miami who 38:12 invented a way to fix your airplane while flying it, literally. 38:14 [LAUGH] So, I can give you the reference, but there 38:18 he is, there he is. I wonder if he made it. 38:20 Alright, so I'll tell you a quick story about a woman named Samantha Starmer, who 38:26 was at REI, at the time helping run a group of user researchers. 38:32 And she felt like they didn't really have enough of an impact in the organization.
38:40 You know, she kind of felt like she was in this big place, and she was 38:48 part of this little team, and it was 38:52 hard to really feel like she had an impact. And she also caught 38:54 wind of the fact that decisions were 38:57 being made in other parts of the organization that 39:00 really were relevant to what she was doing, and 39:02 relevant certainly to the user experience as a whole. 39:04 What did she do? 39:08 She did this crazy, crazy thing. 39:09 She left the building on the REI campus where she worked. 39:11 She walked across campus to where the marketing 39:16 people were, and she brought candy with her. 39:18 She sat in the cafeteria until she met some people. 39:23 She asked around: you know, any of you doing market research? 39:27 Any of you interested in user research? 39:31 She just kind of put herself out there and bribed people with candy 39:32 to have conversations with her. 39:36 It's crazy, it's crazy stuff. 39:38 And then, after a while, she met people who weren't like her, but 39:40 had relevance to the same kinds of projects that she was working on. 39:45 Wow. 39:49 That really is crazy. 39:51 And then they started finding common ground that went beyond Snickers 39:52 and Milky Ways, to things like: are we converting transactions? 39:55 Are we doing good customer service? 40:01 Do we have good information to base good design decisions on? 40:03 And what ended up happening is, starting small there, they built up 40:08 to doing things like a brown bag series around user research 40:12 that was co-owned by different groups. 40:17 And, long story short, after about, I think 40:19 it was three years, they ended up having 40:21 a single team that did user research and 40:23 involved all those original divisions and silos around REI. 40:26 They combined everyone. 40:30 So they had all these different blind men 40:31 looking at the same elephant at the same time.
40:33 You can actually come up with some good insights 40:36 based on putting all these different mindsets together at once 40:39 and having some kind of synthesis. 40:43 So starting small, and having conversations, 40:45 and taking risks, and making 40:48 yourself a little vulnerable, is in a way grassroots governance, cause it's 40:49 helping decision-making improve not only at the top-down level, but 40:54 at a local level, in a way that triangulates with the top-down stuff. 40:59 And you could go even further. 41:02 Dave Gray, and I really love that guy, 41:05 he told me about a concept called boundary objects. 41:09 [INAUDIBLE] Anybody familiar with boundary objects? 41:12 Boundary objects are things that sit between 41:14 disciplines, that are common to those disciplines. 41:17 So when you are from one discipline, you have a certain perspective, you have a 41:21 certain set of concepts, and most important, you 41:25 have a language that makes sense to you, 41:28 that you speak with people like you. 41:30 And like Samantha, you talk to another discipline, 41:32 like marketing people, and they have a language 41:35 that doesn't sound like something familiar to you. 41:38 What you need to do is establish some kind 41:41 of common vocabulary, almost a pidgin, some common 41:43 language. And so you might start looking 41:47 for things where there are kind of maybe 41:50 not exact matches, but close enough. Like, 41:53 you say tomato, I say to-mah-to. 41:56 You say KPI, and I say goals. 41:59 You say segments, and I say personas. 42:02 Something like that, where you can establish a handful of common objects. 42:04 Even if you acknowledge that you're not using them exactly 42:10 the same way, you're not using exactly the same terms, 42:12 that will actually help you have conversations 42:15 with people who are different than you. 42:18 And that, I think, again, is kind of the core of governance.
42:20 It actually greatly reduces the need for top-down governance 42:24 by kind of lubricating conversations at a grassroots level. 42:27 Dave Gray actually took this existing 42:32 concept of boundary objects and kind of blew it up into 42:35 something called a boundary matrix that I want to encourage 42:38 you to take a look at, and there's a URL there. 42:41 Alright, so those are my five themes. 42:44 So I told you this wouldn't be a very practical talk. 42:48 I didn't show you any software, [LAUGH] or tell you to do anything in particular. 42:52 But I wanted to give you a framework for doing stuff that's not easy, but that is 42:56 something that, you know, not only will be good 43:00 for your organization by maybe helping it get off the 43:04 redesign cycle, if it's on that track. But most 43:08 importantly, besides saving money, you're gonna deliver a much 43:11 better experience, and you may even feel more 43:15 fulfilled in your work if you're working as part 43:19 of not just your own team, but part 43:21 of a bigger organization of different teams that 43:23 are aligned, that can have conversations with each 43:26 other, and are headed in the same direction. 43:29 So I hope that's helpful. 43:32