MICHELLE: Hey there, I'm Michelle Zohlman. My pronouns are she/her. I work at Treehouse as the training program manager. Today, I'll be co-presenting this course with my colleague, Hope Armstrong.

This course will introduce ethical design, which considers the moral implications of one's work. With all of the power that technology wields comes tremendous responsibility. We'll reflect on how tricky interfaces and dirty data practices have negative consequences for society. You'll use ethical frameworks and tools to evaluate and align your actions with your values. And to wrap things up, we'll look at advocacy techniques to nurture human-centered decisions in your organization.

This course is for everyone who works in tech, regardless of their role. I'll use the term "design" in the general sense to refer to those who design, develop, deploy, and manage technology.

Let's get started by defining ethical design. Design ethics concerns moral behavior and responsible choices in the practice of design. It guides how designers work with clients, colleagues, and the end users of products; how they conduct the design process; how they determine the features of products; and how they assess the ethical significance or moral worth of the products that result from the activity of designing.

The book Tragic Design summarizes it like this: "Badly designed products serve their creator or sponsor first and the users second."

In 2015, Volkswagen was issued a violation of the Clean Air Act in the United States. The Environmental Protection Agency discovered that Volkswagen had intentionally programmed diesel engines to activate their emissions controls only during inspections. The cars appeared to pass air quality standards during testing, although they produced up to 40 times the emissions in real-world driving. The software was installed in 11 million cars worldwide, leading to a massive increase in greenhouse gas emissions that contribute to climate change. This scandal has cost Volkswagen at least an estimated $33.3 billion, and a software engineer was sentenced to federal prison for implementing the software. This is just one example of how software has real-world consequences.
HOPE: Hi, I'm Hope Armstrong. I'm a Product Designer and Teacher at Treehouse, and my pronouns are she/her.

Let's look at an ethical framework. Here's the ethical hierarchy of needs as defined by Aral Balkan and Laura Kalbag. The most foundational section is human rights: this is when technology respects and protects civil liberties, reduces inequalities, and benefits democracy. Building off of that, human effort is when technology respects people's effort by being functional, convenient, and reliable. At the top of the pyramid is human experience, which is when technology is intuitive and joyful to use.

MICHELLE: We can use this framework when evaluating the morality of our work. You may have heard the terms inclusive design, usability, accessibility, universal design, and human-centered design. These are related terms that evoke the humanity involved in technology. After all, as much as we focus on the technology being built, we need to consider the people it affects.

So what does it mean to act ethically? Well, it's complicated. Each of us has different boundaries for safety and privacy, and our identity and personal experiences shape our needs. If you have a history of being racially discriminated against, you may hesitate to disclose your identity on a forum. And there are cultural differences too: what is acceptable in one country may be negatively perceived in another. Further, it changes over time. The term Overton window describes this phenomenon. An Overton window is the range of policies deemed acceptable to the mainstream population at a given time. A few decades ago, putting a listening device in your living room would have felt extreme, as if you were wiretapping yourself. But now that voice-activated devices are so common, it's normalized, although it's still a bit creepy if you really think about the implications.