1 00:00:00,000 --> 00:00:06,846 [MUSIC] 2 00:00:06,846 --> 00:00:10,140 Hey, y'all, what's up, y'all, are you ready? 3 00:00:10,140 --> 00:00:11,275 What's up, what's up? 4 00:00:11,275 --> 00:00:13,630 >> [LAUGH] We're even early. 5 00:00:13,630 --> 00:00:15,782 >> I know, a little bit. 6 00:00:15,782 --> 00:00:18,824 I've got some folks joining us. 7 00:00:18,824 --> 00:00:19,903 >> I was right on time. 8 00:00:19,903 --> 00:00:23,866 >> [CROSSTALK] Well, while they're hopping on, 9 00:00:23,866 --> 00:00:27,940 I am going to introduce you, beautiful folks. 10 00:00:27,940 --> 00:00:32,115 It is my absolute pleasure to introduce Amy Lima and 11 00:00:32,115 --> 00:00:35,520 Jarvis Moore to the stage, y'all. 12 00:00:35,520 --> 00:00:40,394 Amy is a product designer based in New York City, one of my favorites, and a 13 00:00:40,394 --> 00:00:45,700 first-generation American college graduate and tech professional. 14 00:00:45,700 --> 00:00:51,312 Her design work is driven by inclusive, human-centered practices and 15 00:00:51,312 --> 00:00:56,335 champions marginalized voices with the aim of dismantling, 16 00:00:56,335 --> 00:00:59,792 I should say, exclusionary [INAUDIBLE]. 17 00:00:59,792 --> 00:01:02,685 Jarvis Moore is a UX designer, writer, and 18 00:01:02,685 --> 00:01:06,370 UX mentor based in Oklahoma City, Oklahoma. 19 00:01:06,370 --> 00:01:10,243 His story started when he dropped out of college to get a job to support his 20 00:01:10,243 --> 00:01:11,290 family. 21 00:01:11,290 --> 00:01:13,668 Being the only source of income in his household, 22 00:01:13,668 --> 00:01:16,080 he couldn't afford to attend a boot camp. 23 00:01:16,080 --> 00:01:18,679 But he had skills as a graphic designer, and 24 00:01:18,679 --> 00:01:22,110 he leveraged those to build a career in UX. 25 00:01:22,110 --> 00:01:25,511 He specializes in branding and design strategy, and 26 00:01:25,511 --> 00:01:31,125 he tries to share what he's learned on his unconventional journey in design with as 27 00:01:31,125 --> 00:01:37,240 many people as possible to show them that anybody can chase after their passion. 28 00:01:37,240 --> 00:01:39,730 Please welcome Jarvis and Amy. 29 00:01:40,850 --> 00:01:46,320 >> Thank you so much for such a lovely introduction, and hey, everyone. 30 00:01:46,320 --> 00:01:52,380 Welcome to our talk on the very important topic of underrepresentation in tech. 31 00:01:52,380 --> 00:01:54,700 We're so excited to be here today. 32 00:01:54,700 --> 00:01:58,968 And firstly, we'd like to thank Treehouse again for inviting us to speak at such 33 00:01:58,968 --> 00:02:02,570 a great festival alongside so many other talented folks. 34 00:02:02,570 --> 00:02:05,860 And we especially wanna thank all of you for tuning in today. 35 00:02:05,860 --> 00:02:10,490 It's been a long and tiresome year of virtual events and screen time, and 36 00:02:10,490 --> 00:02:13,639 you could be doing many other things right now. 37 00:02:13,639 --> 00:02:18,547 But you chose to join this space with us, and for that, we are so grateful and 38 00:02:18,547 --> 00:02:21,631 we promise to make your time today worthwhile. 39 00:02:28,362 --> 00:02:33,920 >> Excuse me while I get our presentation up and we can get started. 40 00:02:35,130 --> 00:02:41,330 >> Cool, so before diving in, we'd like to briefly introduce ourselves. 41 00:02:41,330 --> 00:02:43,399 Again, right, so I'm Amy.
42 00:02:43,399 --> 00:02:48,004 My pronouns are she/her, and I'm the first-generation American college grad and tech 43 00:02:48,004 --> 00:02:51,890 professional in my family, currently working as a product designer. 44 00:02:53,910 --> 00:02:57,967 My experience as a first-generation immigrant in tech has come with barriers, 45 00:02:57,967 --> 00:02:59,995 a mighty sense of responsibility, and 46 00:02:59,995 --> 00:03:04,250 the never-ending pursuit of feeling truly seen in this industry. 47 00:03:04,250 --> 00:03:06,545 I've always been interested in tech and design, but 48 00:03:06,545 --> 00:03:10,290 growing up, I didn't see many people in the field who looked like me. 49 00:03:10,290 --> 00:03:14,406 I thought a career in tech was an elite, members-only club that only 50 00:03:14,406 --> 00:03:16,988 the most privileged were invited to join, and 51 00:03:16,988 --> 00:03:21,080 it was audacious of me to even dream of playing a part. 52 00:03:21,080 --> 00:03:24,777 But once I learned that the field actually did include people who looked 53 00:03:24,777 --> 00:03:28,293 like me, and, even more importantly, needed people who looked like me, 54 00:03:28,293 --> 00:03:31,650 I followed my dreams of becoming a designer and never looked back. 55 00:03:32,870 --> 00:03:36,325 I'm a tech optimist at heart and believe that we have the power to make 56 00:03:36,325 --> 00:03:40,250 the world a much better place through responsible use of technology. 57 00:03:40,250 --> 00:03:44,270 And I hope to help shape that better world through my design work. 58 00:03:44,270 --> 00:03:48,138 My daily inspiration comes from any person of color forging their way in 59 00:03:48,138 --> 00:03:50,072 even the most oppressive spaces and 60 00:03:50,072 --> 00:03:53,635 empowering even the most marginalized through design and tech. 61 00:03:57,050 --> 00:04:02,740 >> And I'm Jarvis, self-taught UX designer, mentor, and occasional writer. 62 00:04:03,800 --> 00:04:07,160 I'm also a part of an organization called Black UX Labs, 63 00:04:07,160 --> 00:04:10,100 where our goal is to put more Black people in the C-suite. 64 00:04:11,780 --> 00:04:13,082 So as we kind of touched on, 65 00:04:13,082 --> 00:04:16,530 I'm on the opposite side from Amy, where I didn't finish college. 66 00:04:16,530 --> 00:04:20,127 And so, not having a college degree made it exponentially more difficult 67 00:04:20,127 --> 00:04:21,690 to break into tech. 68 00:04:21,690 --> 00:04:24,570 And when you add on top of that being a Black man, 69 00:04:24,570 --> 00:04:28,274 it meant that I had to be better and faster, and there was no room for 70 00:04:28,274 --> 00:04:30,620 error, in order to prove that I belonged. 71 00:04:31,820 --> 00:04:34,490 But despite the odds, I was still able to find my way. 72 00:04:35,750 --> 00:04:40,340 Because of that, because of the journey really that it took me to get into tech, 73 00:04:40,340 --> 00:04:43,763 I felt like I could be a shining light or a beacon of hope for 74 00:04:43,763 --> 00:04:48,284 people who don't have degrees as well, to help them realize that even without one you can 75 00:04:48,284 --> 00:04:50,660 still accomplish your dreams. 76 00:04:50,660 --> 00:04:52,709 So, through my unconventional path, 77 00:04:52,709 --> 00:04:56,280 I take it as my responsibility to uplift people around me,
78 00:04:56,280 --> 00:05:01,082 extend a helping hand to anyone that I can, and to try and make the tech industry 79 00:05:01,082 --> 00:05:05,750 a more diverse and inclusive field for the next generation of designers. 80 00:05:09,607 --> 00:05:13,914 So, we wanna start off by sharing some statements and 81 00:05:13,914 --> 00:05:18,029 asking that you just nod along if you're aware of or 82 00:05:18,029 --> 00:05:22,160 not surprised by anything that we mention here. 83 00:05:24,290 --> 00:05:29,319 So, first, did you know that algorithms built with insufficient data sets 84 00:05:29,319 --> 00:05:34,587 can lead to everything from police disproportionately targeting communities 85 00:05:34,587 --> 00:05:39,869 of color to misdiagnosis of certain skin cancers for darker-skinned patients? 86 00:05:43,622 --> 00:05:48,114 Did you know that AI chatbots trained to learn human behavior by 87 00:05:48,114 --> 00:05:53,287 interacting with Internet users often produce streams of sexist, 88 00:05:53,287 --> 00:05:56,186 racist, and even pro-Hitler messages? 89 00:06:00,387 --> 00:06:05,892 And did you know that companies that have higher degrees of racially and ethnically 90 00:06:05,892 --> 00:06:12,690 diverse employees are 35% more profitable than companies with homogenous workforces? 91 00:06:12,690 --> 00:06:16,988 And yet, people of color make up a tiny percentage of the tech workforce. 92 00:06:20,565 --> 00:06:23,350 >> If you nodded along to any of the things we said, 93 00:06:23,350 --> 00:06:26,916 this session is gonna serve as an examination of why that is: 94 00:06:26,916 --> 00:06:30,579 the systems at play that perpetuate underrepresentation in tech, 95 00:06:30,579 --> 00:06:35,070 the implications of a uniform industry, and how to break the cycle. 96 00:06:35,070 --> 00:06:37,738 If you didn't identify with anything we said, 97 00:06:37,738 --> 00:06:40,347 this session may serve as a reckoning for you. 98 00:06:40,347 --> 00:06:44,654 Bringing to light the difficult but undeniable issues that people of color, 99 00:06:44,654 --> 00:06:47,569 your colleagues, your friends, face every day, and 100 00:06:47,569 --> 00:06:51,363 hopefully providing inspiration on how to be a more actionable ally. 101 00:06:54,352 --> 00:07:00,200 >> So, we hear a lot about underrepresentation in design and technology. 102 00:07:00,200 --> 00:07:02,200 But what does that actually mean? 103 00:07:03,310 --> 00:07:07,884 And what are the implications of a homogenous field, particularly one with so 104 00:07:07,884 --> 00:07:10,920 much power and influence in our day-to-day lives? 105 00:07:12,130 --> 00:07:15,265 By the end of this talk, we hope you can answer these questions and 106 00:07:15,265 --> 00:07:17,146 feel empowered to tackle them head on. 107 00:07:20,703 --> 00:07:24,610 >> We'll soon be dissecting the current state of underrepresentation in tech. 108 00:07:24,610 --> 00:07:26,970 But first, we wanna go back to the beginning and 109 00:07:26,970 --> 00:07:29,810 highlight some early pioneers of color in the industry, 110 00:07:29,810 --> 00:07:33,637 who designed culture-shifting products we use in our daily lives, acknowledge 111 00:07:33,637 --> 00:07:37,708 the obstacles they overcame, and consider what we can still learn from them today. 112 00:07:41,243 --> 00:07:42,355 >> In this day and age, 113 00:07:42,355 --> 00:07:46,360 we're used to having an infinite amount of data at our fingertips.
114 00:07:46,360 --> 00:07:50,552 And we're not just talking about the Internet, we're talking about hard data. 115 00:07:52,553 --> 00:07:56,990 Have you ever tracked your screen time, or monitored your spending habits? 116 00:07:58,320 --> 00:08:02,410 These often use charts and graphs to help us make sense of the data we're seeing. 117 00:08:02,410 --> 00:08:06,506 These data visualizations are vital in making cold statistics 118 00:08:06,506 --> 00:08:08,523 meaningful and even personal. 119 00:08:10,616 --> 00:08:15,410 One of the earliest pioneers in the field of modern data visualization was 120 00:08:15,410 --> 00:08:17,240 W. E. B. Du Bois. 121 00:08:17,240 --> 00:08:21,702 He used his background in sociology, anthropology, and civil activism 122 00:08:21,702 --> 00:08:25,688 to highlight impressive statistics surrounding Black Americans. 123 00:08:28,287 --> 00:08:33,700 These data visualizations were premiered at the Paris Exposition in 1900, where 124 00:08:33,700 --> 00:08:38,830 Du Bois made African American culture more visible to a wider spectrum of people. 125 00:08:39,920 --> 00:08:43,360 His work showed African American advances in education, 126 00:08:43,360 --> 00:08:46,660 lingering effects of slavery, and most importantly, 127 00:08:46,660 --> 00:08:51,028 that despite centuries of oppression, people of color were excelling. 128 00:08:54,479 --> 00:08:58,981 This data was in direct conflict with the widely accepted white supremacist 129 00:08:58,981 --> 00:09:02,910 paradigm that was dominating mainstream science at this time. 130 00:09:03,920 --> 00:09:08,746 Using these visualizations made Du Bois one of the first great American minds 131 00:09:08,746 --> 00:09:12,980 whose reach extended beyond academics to the masses. 132 00:09:12,980 --> 00:09:17,686 He leveraged design as a tool to educate and democratize information, and 133 00:09:17,686 --> 00:09:20,243 made us rethink how we interpret data. 134 00:09:20,243 --> 00:09:24,683 His innovative visualization techniques are still widely regarded and 135 00:09:24,683 --> 00:09:25,781 used to this day. 136 00:09:29,577 --> 00:09:35,250 >> In the 1960s, technology was beginning to develop at an unprecedented pace. 137 00:09:35,250 --> 00:09:39,342 These technologies laid the foundation for an entire half century of 138 00:09:39,342 --> 00:09:44,311 scientific innovations, many of which resulted in the products we enjoy today. 139 00:09:48,006 --> 00:09:51,887 After being among the first African Americans to attend and 140 00:09:51,887 --> 00:09:56,853 graduate from St. Louis University, Roy Clay was recruited by HP in 1965 to be 141 00:09:56,853 --> 00:10:01,740 instrumental in the development of an ambitious project: taking 142 00:10:01,740 --> 00:10:06,701 a room-sized computer of the day and making it available for personal use. 143 00:10:06,701 --> 00:10:10,531 Clay ultimately created and led HP's computer division, 144 00:10:10,531 --> 00:10:14,452 making HP the first computer company in Silicon Valley. 145 00:10:14,452 --> 00:10:19,502 One year later, Clay and his team developed the HP 2116A, 146 00:10:19,502 --> 00:10:25,900 one of the world's first minicomputers and the first computer to be sold by HP. 147 00:10:25,900 --> 00:10:29,932 Not only did Clay develop the software for this computer, he also went on to 148 00:10:29,932 --> 00:10:34,163 become the director of the first HP research and development computer group.
149 00:10:36,924 --> 00:10:40,719 Clay is often called the godfather of Silicon Valley because of 150 00:10:40,719 --> 00:10:44,229 the opportunities he created for others in the industry, 151 00:10:44,229 --> 00:10:46,390 specifically African Americans. 152 00:10:46,390 --> 00:10:51,027 While at HP, he expanded their recruitment to include historically 153 00:10:51,027 --> 00:10:55,230 Black colleges and universities, bringing their graduates into HP's computer division. 154 00:10:55,230 --> 00:10:58,957 And when Clay went on to start his own technology company, Rod-L, 155 00:10:58,957 --> 00:11:03,092 he professed it was at one point the largest employer of African American 156 00:11:03,092 --> 00:11:05,155 professionals in Silicon Valley. 157 00:11:07,821 --> 00:11:10,693 >> Outside of his tech accomplishments, 158 00:11:10,693 --> 00:11:16,617 Roy became the first minority to serve on the Palo Alto City Council in 1973. 159 00:11:16,617 --> 00:11:23,216 He also became the first African American vice mayor of Palo Alto in 1976. 160 00:11:23,216 --> 00:11:28,198 Today, Roy is one of the most celebrated figures in technology. 161 00:11:28,198 --> 00:11:29,862 Despite discrimination and 162 00:11:29,862 --> 00:11:33,119 limited opportunities available to Black Americans, 163 00:11:33,119 --> 00:11:37,636 Roy has enjoyed great success in an industry that is known to lack diversity. 164 00:11:37,636 --> 00:11:42,746 And in 2003, he was inducted into the Silicon Valley Engineering Council Hall 165 00:11:42,746 --> 00:11:47,722 of Fame, where he was honored for his pioneering professional accomplishments. 166 00:11:47,722 --> 00:11:51,297 Not only did his contributions shape HP and technology, but 167 00:11:51,297 --> 00:11:55,448 he also helped to pave the way for minorities to follow in his footsteps. 168 00:12:00,118 --> 00:12:02,197 >> When we think of modern-day computing, 169 00:12:02,197 --> 00:12:05,158 a few names usually come to mind and get all the credit for 170 00:12:05,158 --> 00:12:06,925 the interfaces we know and love. 171 00:12:06,925 --> 00:12:11,342 Typically, all white men. 172 00:12:11,342 --> 00:12:15,415 But we don't hear the names of the people behind the scenes who shaped 173 00:12:15,415 --> 00:12:18,081 the look and feel of the screens we use every day. 174 00:12:18,081 --> 00:12:21,924 People who created the windows, dialog boxes, and icons we've largely taken for 175 00:12:21,924 --> 00:12:22,936 granted these days. 176 00:12:25,999 --> 00:12:28,030 Among them is Loretta Staples, 177 00:12:28,030 --> 00:12:31,800 one of the earliest interface designers in San Francisco. 178 00:12:31,800 --> 00:12:36,241 For years, she dreamed of interactive experiences meant to delight and 179 00:12:36,241 --> 00:12:37,639 satisfy the end user. 180 00:12:37,639 --> 00:12:42,869 And that was long before the term design thinking became a buzzword, 181 00:12:42,869 --> 00:12:45,858 and the field came to be known as UI. 182 00:12:45,858 --> 00:12:49,208 When Loretta first started designing, the field was so 183 00:12:49,208 --> 00:12:53,572 new that most of the software, from Photoshop to Figma, didn't even exist yet. 184 00:12:53,572 --> 00:12:56,774 She used a combination of the tools available at the time to come up with 185 00:12:56,774 --> 00:13:00,050 creative ways to bring technology to life through delightful design. 186 00:13:06,253 --> 00:13:11,788 >> Loretta went on to become a full-time interface designer at Apple in 1989,
187 00:13:11,788 --> 00:13:15,355 before opening up her own studio, U.I., in 1992, 188 00:13:15,355 --> 00:13:20,923 where she helped create a design for an interactive television prototype, 189 00:13:20,923 --> 00:13:24,753 a predecessor in many ways to the streaming TV of today. 190 00:13:27,182 --> 00:13:32,070 Loretta carved a space for herself and her passions before they were formally 191 00:13:32,070 --> 00:13:36,142 defined as a field, and paved the way for those who came after her. 192 00:13:36,142 --> 00:13:39,476 In this way, she can be known as one of the earliest pioneers in 193 00:13:39,476 --> 00:13:41,349 the field of interaction design. 194 00:13:45,287 --> 00:13:50,097 But despite pioneering landmark achievements by technologists of color, 195 00:13:50,097 --> 00:13:53,252 we are still largely facing the same obstacles and 196 00:13:53,252 --> 00:13:56,648 barriers to access and entry as our predecessors did. 197 00:13:56,648 --> 00:14:00,580 We can look to them for inspiration for making an impact in the field, but 198 00:14:00,580 --> 00:14:03,943 we still need to recognize the present battles left to fight. 199 00:14:03,943 --> 00:14:08,084 When we say that tech has a diversity and inclusion problem, this is what we mean. 200 00:14:11,292 --> 00:14:15,539 >> We hear a lot about underrepresentation in tech today, and for good reason. 201 00:14:15,539 --> 00:14:19,446 It's no secret that the industry skews overwhelmingly white. 202 00:14:19,446 --> 00:14:23,875 Specifically, Hispanic, Latinx, and Black people are the most 203 00:14:23,875 --> 00:14:28,805 underrepresented in tech relative to their representation in the US. 204 00:14:28,805 --> 00:14:33,155 Technology is the one industry that shapes all other industries, so 205 00:14:33,155 --> 00:14:36,305 representation in this field truly matters, and 206 00:14:36,305 --> 00:14:39,835 it's a pivotal step towards a more equitable society. 207 00:14:43,086 --> 00:14:46,622 In 2016, big tech companies acknowledged this gap and 208 00:14:46,622 --> 00:14:50,723 made it a public goal to increase diversity in their workforces, and 209 00:14:50,723 --> 00:14:54,346 even made hefty donations to civil justice organizations. 210 00:14:54,346 --> 00:14:58,642 Unfortunately, five years later, these prominent companies, and 211 00:14:58,642 --> 00:15:03,538 the industry as a whole, have barely moved the needle in increasing minority 212 00:15:03,538 --> 00:15:05,965 representation in their workforces. 213 00:15:05,965 --> 00:15:08,914 Beyond that, a recent study of diversity in 214 00:15:08,914 --> 00:15:13,650 technology found that companies that made statements of solidarity had 215 00:15:13,650 --> 00:15:17,937 20% fewer Black employees on average than those who didn't. 216 00:15:17,937 --> 00:15:21,696 Highlighting a gap between what companies say about social issues and 217 00:15:21,696 --> 00:15:24,106 what they do about it in their own workplaces. 218 00:15:27,335 --> 00:15:30,520 Tech leaders have often pointed to a pipeline problem 219 00:15:30,520 --> 00:15:33,918 to explain away the lack of minority hiring and promotion. 220 00:15:36,825 --> 00:15:43,147 But even in 2017, 8.9% of graduates with bachelor's degrees in computer and 221 00:15:43,147 --> 00:15:48,444 information science were Black, and a little over 10% were Latino.
222 00:15:48,444 --> 00:15:52,816 However low these numbers are, they're much higher than the percentage of 223 00:15:52,816 --> 00:15:57,544 minorities represented in workforces across tech, which hovers at about 3%. 224 00:15:57,544 --> 00:16:01,392 So the problem of underrepresentation in tech cannot be 225 00:16:01,392 --> 00:16:03,447 explained by a pipeline problem alone. 226 00:16:05,935 --> 00:16:09,765 >> So what else explains the disparity? 227 00:16:09,765 --> 00:16:12,542 What else explains this disparity? 228 00:16:12,542 --> 00:16:15,061 Unsurprisingly, the reasons build on each other. 229 00:16:17,294 --> 00:16:22,222 The first barrier to entry that minorities encounter is their early education and 230 00:16:22,222 --> 00:16:23,370 socialization. 231 00:16:23,370 --> 00:16:28,455 Early on, societal stereotypes and unconscious bias reinforce the perception 232 00:16:28,455 --> 00:16:32,593 that minority kids are not as good as white kids in STEM disciplines. 233 00:16:32,593 --> 00:16:37,724 Due to this often unconscious bias, parents and teachers are likely to 234 00:16:37,724 --> 00:16:43,141 discourage minorities from pursuing computer-related activities. 235 00:16:43,141 --> 00:16:47,988 For women of color, the intersectionality of gender and race puts them at 236 00:16:47,988 --> 00:16:52,633 even more of a disadvantage when it comes to computer science and engineering, 237 00:16:52,633 --> 00:16:55,624 widening the gap for minority women in tech further. 238 00:16:58,805 --> 00:17:03,519 The next line of offense in hiring practices is the perpetuation of 239 00:17:03,519 --> 00:17:04,621 gatekeeping. 240 00:17:07,458 --> 00:17:11,509 Companies are often reluctant to broaden the schools they recruit from 241 00:17:11,509 --> 00:17:15,770 to include historically Black colleges and universities, for example, 242 00:17:15,770 --> 00:17:20,397 as widening recruitment can be seen as a threat to institutions that pride 243 00:17:20,397 --> 00:17:23,198 themselves on being elite and for the select few. 244 00:17:26,755 --> 00:17:31,743 Many tech companies also rely heavily on referrals from current employees, 245 00:17:31,743 --> 00:17:35,360 which is a system that reinforces network effects. 246 00:17:35,360 --> 00:17:38,707 People in power typically refer those who look and act like they do, 247 00:17:38,707 --> 00:17:41,124 which further perpetuates this vicious cycle. 248 00:17:45,393 --> 00:17:47,958 >> Once you do get your foot through the door, 249 00:17:47,958 --> 00:17:51,885 people of color often face another hurdle: lack of mentorship. 250 00:17:54,462 --> 00:17:58,407 Racial minorities are underrepresented in tech leadership roles, 251 00:17:58,407 --> 00:17:59,835 even when you control for 252 00:17:59,835 --> 00:18:03,991 the fact that they're underrepresented at these companies as a whole. 253 00:18:03,991 --> 00:18:07,100 This not only maintains a power imbalance, but 254 00:18:07,100 --> 00:18:10,839 can also hinder the growth of employees at that company. 255 00:18:10,839 --> 00:18:14,045 Similar to exclusionary hiring practices, 256 00:18:14,045 --> 00:18:18,262 people in senior roles, who skew overwhelmingly white and male, 257 00:18:18,262 --> 00:18:23,429 often seek protégés who look like them and remind them of themselves.
258 00:18:23,429 --> 00:18:27,797 Because of this, people of color in tech often lack someone who will advocate for 259 00:18:27,797 --> 00:18:32,051 them and who they can turn to in the face of microaggressions in the workplace. 260 00:18:32,051 --> 00:18:35,048 This results in a high turnover rate of diverse talent, 261 00:18:35,048 --> 00:18:36,783 bringing us back to square one. 262 00:18:40,012 --> 00:18:44,599 So we see that simply acknowledging the problem of underrepresentation in tech 263 00:18:44,599 --> 00:18:45,561 isn't enough. 264 00:18:45,561 --> 00:18:49,766 While setting ambitious goals to increase representation in your 265 00:18:49,766 --> 00:18:54,310 workforce, pledging significant money to tech pipeline diversity programs, and 266 00:18:54,310 --> 00:18:59,153 even slightly moving the needle for people of color in the industry are meaningful, 267 00:18:59,153 --> 00:19:03,323 these acts of altruism ring performative at best until they're 268 00:19:03,323 --> 00:19:07,419 quantitatively reflected in the company's diversity data. 269 00:19:07,419 --> 00:19:11,799 Increasing opportunities for people of color in one of the fastest-growing and 270 00:19:11,799 --> 00:19:14,984 highest-paid sectors of the economy is gonna require more 271 00:19:14,984 --> 00:19:17,656 persistent efforts to bring about real change. 272 00:19:22,184 --> 00:19:26,927 The tech industry still has a lot of work to do, and understanding the hurdles and 273 00:19:26,927 --> 00:19:29,857 nuances that people of color face at every stage in 274 00:19:29,857 --> 00:19:34,602 the hiring chain helps contextualize the problem and know how to ask the right and 275 00:19:34,602 --> 00:19:36,948 difficult questions at your company, 276 00:19:36,948 --> 00:19:39,380 which are the first steps in helping reverse this trend. 277 00:19:42,085 --> 00:19:47,474 >> So, how does this underrepresentation affect us in our everyday lives? 278 00:19:47,474 --> 00:19:49,685 The root causes of biases and 279 00:19:49,685 --> 00:19:55,045 racism infiltrating our technologies come from a few places. 280 00:19:55,045 --> 00:19:59,767 For starters, it's extremely difficult to root out unconscious bias. 281 00:19:59,767 --> 00:20:03,869 In contrast to explicit bias, where someone is deliberately and 282 00:20:03,869 --> 00:20:08,632 willfully discriminating against you, unconscious bias refers to the deep- 283 00:20:08,632 --> 00:20:12,895 seated prejudices we all absorb due to living in an unequal society. 284 00:20:14,893 --> 00:20:19,407 These biases can be present even in people who genuinely believe they're 285 00:20:19,407 --> 00:20:20,944 committed to equality. 286 00:20:20,944 --> 00:20:25,190 It's harder to spot and root out than obvious discrimination. 287 00:20:27,271 --> 00:20:32,419 These biases affect society in many harmful ways in our day-to-day lives. 288 00:20:32,419 --> 00:20:38,343 Such as medical professionals believing that Black patients are less susceptible 289 00:20:38,343 --> 00:20:43,856 to pain and less likely to comply with medical advice than white patients. 290 00:20:45,875 --> 00:20:50,895 Police instinctively seeing darker faces as being more criminal. 291 00:20:53,045 --> 00:20:57,500 And hiring managers associating ethnic-sounding names with aggression.
292 00:21:01,128 --> 00:21:05,078 >> While it's unreasonable to expect people to completely abandon their 293 00:21:05,078 --> 00:21:09,420 implicit biases, it's easy to see how they can infiltrate the technology and 294 00:21:09,420 --> 00:21:11,150 the products we use every day. 295 00:21:11,150 --> 00:21:15,451 Especially if the workforces behind building these products lack 296 00:21:15,451 --> 00:21:18,531 a diversity of perspectives and backgrounds. 297 00:21:18,531 --> 00:21:21,382 With tech tools so ingrained in modern life, 298 00:21:21,382 --> 00:21:24,919 racism in tech can exacerbate prejudicial attitudes. 299 00:21:24,919 --> 00:21:27,329 So how does this harmful cycle begin? 300 00:21:30,253 --> 00:21:34,244 The first culprit in biased technology is biased data. 301 00:21:34,244 --> 00:21:35,246 Algorithms and 302 00:21:35,246 --> 00:21:40,265 artificial intelligence are trained based on datasets humans feed to them. 303 00:21:40,265 --> 00:21:45,641 Biased data is a result of prejudiced assumptions made during the algorithm 304 00:21:45,641 --> 00:21:50,437 development process, or prejudices in the training data itself. 305 00:21:53,665 --> 00:21:58,471 An example of the harmful effects of biased data is predictive policing tools, 306 00:21:58,471 --> 00:22:02,540 whose goal is to send officers to the scene of a crime before one occurs. 307 00:22:03,750 --> 00:22:06,791 The assumption is that locations where individuals have 308 00:22:06,791 --> 00:22:11,860 been previously arrested correlate with a likelihood of future illegal activity. 309 00:22:11,860 --> 00:22:16,100 However, if those initial arrests were racially motivated or 310 00:22:16,100 --> 00:22:21,060 even illegal, this approach can provide algorithmic justification for 311 00:22:21,060 --> 00:22:25,867 further police harassment of minority and low-income neighborhoods. 312 00:22:25,867 --> 00:22:30,856 Using such flawed data to train new systems embeds the police department's 313 00:22:30,856 --> 00:22:33,761 documented misconduct in the algorithm, 314 00:22:33,761 --> 00:22:36,348 and perpetuates practices already known to be 315 00:22:36,348 --> 00:22:39,209 terrorizing those most vulnerable to that abuse. 316 00:22:42,008 --> 00:22:46,535 >> Another precursor to biased tech is a lack of complete data. 317 00:22:46,535 --> 00:22:50,447 If data is not complete before it's fed to a machine learning model, 318 00:22:50,447 --> 00:22:54,372 it may not be representative, and therefore, it may include bias. 319 00:22:57,171 --> 00:23:01,866 For example, if an AI tool that's trained to identify people is given 320 00:23:01,866 --> 00:23:04,422 100 images of faces to learn from, 321 00:23:04,422 --> 00:23:08,034 and only 10% of those images include people of color, 322 00:23:08,034 --> 00:23:13,166 the AI will learn more about identifying white faces than it does any other race. 323 00:23:15,475 --> 00:23:19,811 This has presented problems such as faulty facial recognition software, 324 00:23:19,811 --> 00:23:22,841 misidentification in surveillance software, 325 00:23:22,841 --> 00:23:25,468 which has also led to wrongful convictions, 326 00:23:25,468 --> 00:23:31,825 and even photo search engines classifying photos of people of color as gorillas. 327 00:23:34,435 --> 00:23:38,552 So, to alleviate these instances of racism in technology, 328 00:23:38,552 --> 00:23:43,085 training data should be as diverse and free from bias as possible.
329 00:23:43,085 --> 00:23:47,622 And having people of color in the professional spaces where this tech is 330 00:23:47,622 --> 00:23:52,928 built helps identify these biased datasets before they can become more harmful. 331 00:23:55,378 --> 00:23:59,931 >> In other words, it's dangerously easy for machine learning algorithms to 332 00:23:59,931 --> 00:24:04,850 perpetuate society's existing race, class, and gender-based inequalities. 333 00:24:04,850 --> 00:24:07,919 But remember, we're talking about machines here. 334 00:24:07,919 --> 00:24:12,148 Powerful as they are, we still call the shots in dictating how and 335 00:24:12,148 --> 00:24:14,273 why they behave the way they do. 336 00:24:16,669 --> 00:24:22,727 Much of the racism in technology doesn't actually come from malice, but ignorance. 337 00:24:22,727 --> 00:24:27,091 It's born of the way tech tools like AI are trained and coded. 338 00:24:27,091 --> 00:24:31,575 It involves unconscious bias, the limitations of technology, and 339 00:24:31,575 --> 00:24:32,897 racial oversight. 340 00:24:35,064 --> 00:24:40,117 With that, a logical and actionable way to combat racist technology is to ensure 341 00:24:40,117 --> 00:24:45,493 there's representation among the people who build this technology in the first place. 342 00:24:45,493 --> 00:24:50,938 However, it's important to acknowledge that AI systems may never be 343 00:24:50,938 --> 00:24:56,676 completely free from bias, as is the case with the humans who build them. 344 00:24:56,676 --> 00:25:01,337 But with more people of color involved in all stages of the development process, 345 00:25:01,337 --> 00:25:05,282 and with awareness of the dangerous implications of these biases, 346 00:25:05,282 --> 00:25:09,165 we can greatly reduce the potential for harm these tools hold 347 00:25:09,165 --> 00:25:13,302 and work towards producing a safer and more just future. 348 00:25:14,895 --> 00:25:17,539 >> So where do we go from here? 349 00:25:17,539 --> 00:25:22,126 The purpose of this talk was not to paint a grim picture of the tech industry or 350 00:25:22,126 --> 00:25:26,653 to leave you feeling defeated and hopeless at the greater systems at play. 351 00:25:28,024 --> 00:25:32,798 Instead, we hope this discussion can serve as at least a starting point to 352 00:25:32,798 --> 00:25:37,187 increase your awareness about the systemic lack of representation 353 00:25:37,187 --> 00:25:39,584 in this hugely impactful industry, 354 00:25:39,584 --> 00:25:44,084 the repercussions this lack of diversity has on our daily lives through 355 00:25:44,084 --> 00:25:46,790 the technology we use and depend on every day, 356 00:25:46,790 --> 00:25:53,585 and how to use this knowledge to both empower yourself and those around you. 357 00:25:53,585 --> 00:25:58,816 >> Most importantly, we want to emphasize that people of color in tech have been, 358 00:25:58,816 --> 00:26:01,796 and will continue to be, pioneers in this field. 359 00:26:03,864 --> 00:26:08,631 From the early contributions of W. E. B. Du Bois, Roy Clay, and Loretta Staples, 360 00:26:08,631 --> 00:26:12,113 to the present-day work of contemporary tech activists, 361 00:26:12,113 --> 00:26:16,110 the perspectives and impact of people of color in tech are what allow us to 362 00:26:16,110 --> 00:26:18,700 continue reaching new heights of innovation.
363 00:26:18,700 --> 00:26:23,126 If you doubt that you have a place in this industry, please remember that you've 364 00:26:23,126 --> 00:26:26,984 always been here, and your continued involvement drives us forward. 365 00:26:29,956 --> 00:26:34,275 If you don't know where to turn to find people who look like you in the industry, 366 00:26:34,275 --> 00:26:37,316 here are some present-day pioneers and organizations 367 00:26:37,316 --> 00:26:40,882 who are breaking barriers in the field and paving the way for 368 00:26:40,882 --> 00:26:42,956 people of color to thrive in tech. 369 00:26:42,956 --> 00:26:46,390 Everyone on this list provides resources, community, and 370 00:26:46,390 --> 00:26:49,772 great inspiration to empower the next generation of tech. 371 00:26:49,772 --> 00:26:54,232 For those already in the industry or thinking of breaking in, 372 00:26:54,232 --> 00:26:59,206 we wanna leave you with some gentle reminders and self-care tips as you 373 00:26:59,206 --> 00:27:04,557 navigate influential, intimidating, and often homogeneous spaces. 374 00:27:05,922 --> 00:27:10,570 >> We want you to be mindful of moments when you're shrinking yourself to fit in. 375 00:27:10,570 --> 00:27:14,310 This can manifest as code-switching, not asking for 376 00:27:14,310 --> 00:27:18,739 the promotion you deserve, or just downplaying your value. 377 00:27:18,739 --> 00:27:21,921 We know that it can feel difficult, intimidating, and 378 00:27:21,921 --> 00:27:26,230 sometimes even dangerous to take up space as a person of color. 379 00:27:26,230 --> 00:27:29,227 But remind yourself of your power and worth. 380 00:27:29,227 --> 00:27:32,634 Don't be afraid to be unapologetically yourself in these environments. 381 00:27:32,634 --> 00:27:34,383 And call out microaggressions 382 00:27:34,383 --> 00:27:38,539 and potentially inadequate applications of technology when you see them. 383 00:27:40,936 --> 00:27:46,105 On the other hand, recognize when you are in implicitly oppressive environments, 384 00:27:46,105 --> 00:27:49,541 and the emotional toll of having to constantly show up. 385 00:27:49,541 --> 00:27:53,327 Give yourself the time and space to decompress. 386 00:27:53,327 --> 00:27:57,524 Practice self-care and surround yourself with a strong support system, 387 00:27:57,524 --> 00:28:00,482 either within your organization or outside of it, 388 00:28:00,482 --> 00:28:03,934 that you know you can go to in times of need. 389 00:28:03,934 --> 00:28:07,935 >> We know this conversation isn't easy. 390 00:28:07,935 --> 00:28:10,109 The most important ones seldom are. 391 00:28:10,109 --> 00:28:14,712 But while difficult, we hope this talk will help you see yourself, show up for 392 00:28:14,712 --> 00:28:17,804 yourself, and empower yourself in the tech world. 393 00:28:17,804 --> 00:28:21,790 And hopefully be part of its continuous evolution towards the greater good. 394 00:28:23,070 --> 00:28:26,472 It's easy to focus on the problematic aspects of the industry. 395 00:28:26,472 --> 00:28:30,945 But we're firm believers that we're living at the pinnacle of technology, poised to 396 00:28:30,945 --> 00:28:32,843 make the world a much better place. 397 00:28:32,843 --> 00:28:37,840 And having a part in shaping that future firsthand is both a privilege and 398 00:28:37,840 --> 00:28:38,487 a right. 399 00:28:38,487 --> 00:28:42,014 The next frontier of the tech revolution starts with us, and 400 00:28:42,014 --> 00:28:43,473 we hope to see you there.
401 00:28:46,705 --> 00:28:51,075 >> We know this is a tough topic, but we do wanna open it up for questions. 402 00:28:51,075 --> 00:28:54,565 We are by no means experts in diversity and inclusion. 403 00:28:54,565 --> 00:28:58,763 But we do wanna provide a space to begin to ask tough questions. 404 00:28:58,763 --> 00:29:02,686 And maybe together we can start coming up with some solutions. 405 00:29:08,323 --> 00:29:10,969 Please feel free to drop any questions in the Q&A. 406 00:29:14,068 --> 00:29:16,850 >> So we have one question already. 407 00:29:16,850 --> 00:29:24,310 Do you have any recommendations for affinity organizations we can join? 408 00:29:24,310 --> 00:29:27,359 Maybe, do you wanna bring the slide back up, Jarvis, 409 00:29:27,359 --> 00:29:29,906 where we list some great organizations? 410 00:29:29,906 --> 00:29:33,237 It's by no means an exhaustive list, right? 411 00:29:33,237 --> 00:29:38,755 Just some of our favorites that we're personally involved in and 412 00:29:38,755 --> 00:29:41,176 have found to be impactful. 413 00:29:41,176 --> 00:29:43,605 And hopefully that can help you. 414 00:29:48,797 --> 00:29:55,883 How can I motivate my underrepresented peers and get them excited about tech? 415 00:29:55,883 --> 00:29:59,861 I have some thoughts, but do you want to take that on, Jarvis? 416 00:29:59,861 --> 00:30:03,140 I know that you're an educator working directly with some students. 417 00:30:04,960 --> 00:30:09,699 >> Yeah, I think the biggest thing that I had to realize was that I needed to 418 00:30:09,699 --> 00:30:13,639 embrace who I am as a person and not be afraid to be myself. 419 00:30:13,639 --> 00:30:15,998 But also be aware of my surroundings and 420 00:30:15,998 --> 00:30:19,653 how I'm being perceived by the people in that environment. 421 00:30:19,653 --> 00:30:24,749 I've been in situations where things happened that I didn't expect, where 422 00:30:24,749 --> 00:30:30,680 people saw me in a way that I didn't even realize I was being seen. 423 00:30:30,680 --> 00:30:34,369 And so it's kinda made me a little bit more hyper-focused on making sure that 424 00:30:34,369 --> 00:30:38,060 how I feel like I'm being portrayed is how people are actually receiving that 425 00:30:38,060 --> 00:30:38,659 behavior. 426 00:30:38,659 --> 00:30:44,980 And so really just trying to communicate and be aware of my surroundings. 427 00:30:44,980 --> 00:30:47,540 So, that's probably the best advice I can give for now. 428 00:30:47,540 --> 00:30:48,960 Amy, can you add to that? 429 00:30:50,190 --> 00:30:53,217 >> Yeah, definitely, I would just add, and 430 00:30:53,217 --> 00:30:58,024 that's why we presented, like, a very short list, right, 431 00:30:58,024 --> 00:31:04,185 of just pioneering innovators of color in the tech space early on in this session, 432 00:31:04,185 --> 00:31:08,448 to kind of cement the fact that people of color have actually always been in 433 00:31:08,448 --> 00:31:13,265 tech and have been behind some of the most culture-shifting, impactful technologies 434 00:31:13,265 --> 00:31:16,590 and products that we still use every single day, right? 435 00:31:16,590 --> 00:31:22,510 You just, unfortunately, don't read about that as much in the history books. 436 00:31:22,510 --> 00:31:27,818 And so, just knowing that, that people of color have always been here. 437 00:31:27,818 --> 00:31:32,497 And it's not so much breaking in as it is kind of 438 00:31:32,497 --> 00:31:37,540 reclaiming that power, which I think is a powerful narrative.
439 00:31:37,540 --> 00:31:40,395 So, that's kind of why we opened up the talk with that, and 440 00:31:40,395 --> 00:31:43,264 I think just even that perspective is hugely impactful. 441 00:31:47,169 --> 00:31:51,971 >> I saw one question that said, in your interview process or in the teams you've 442 00:31:51,971 --> 00:31:57,088 worked on, have you felt any discrimination, and if so, how have you called it out? 443 00:32:02,839 --> 00:32:06,315 I know I have stories, but I wanna let you go first, Amy. 444 00:32:06,315 --> 00:32:09,550 >> [LAUGH] You go ahead, you go first. 445 00:32:11,830 --> 00:32:16,297 >> So, it's a little bit harder to spot in the interview process, 446 00:32:16,297 --> 00:32:20,851 because usually people will be very friendly to your face, and 447 00:32:20,851 --> 00:32:24,630 then they just won't continue the hiring process. 448 00:32:25,890 --> 00:32:31,159 So, I haven't noticed that a ton, but I have experienced it in the workplace. 449 00:32:31,159 --> 00:32:36,397 And it's definitely something that's really hard to deal with, 450 00:32:36,397 --> 00:32:41,445 because there's a lot of routes you can go and none of them really 451 00:32:41,445 --> 00:32:46,327 give you good solutions, just depending on your situation. 452 00:32:46,327 --> 00:32:51,306 In my particular situation, it was issues I had with not only 453 00:32:51,306 --> 00:32:55,708 the people who were on the design team that I was on, but 454 00:32:55,708 --> 00:32:59,990 also the people who I was working underneath. 455 00:32:59,990 --> 00:33:04,537 And so, I really didn't have anybody that I could go to within the company to 456 00:33:04,537 --> 00:33:09,454 address the issues, because it's the people below me, it's the people above me. 457 00:33:09,454 --> 00:33:12,039 I probably could have taken it to HR, but 458 00:33:12,039 --> 00:33:17,150 I don't think that that would have led to anybody losing their job or anything. 459 00:33:17,150 --> 00:33:19,780 So these are still people that I'm gonna have to work with. 460 00:33:19,780 --> 00:33:23,755 And then, when it kind of came up from both sides like that, 461 00:33:23,755 --> 00:33:25,859 I decided to just end up leaving. 462 00:33:25,859 --> 00:33:29,576 And I really emphasize company culture whenever I'm 463 00:33:29,576 --> 00:33:32,059 interviewing at new workplaces now. 464 00:33:32,059 --> 00:33:37,022 And making sure that they are gonna be accepting of who I am, and 465 00:33:37,022 --> 00:33:41,998 that they do have initiatives to push for more diverse workplaces. 466 00:33:41,998 --> 00:33:48,280 >> Mm-hm, mm-hm. 467 00:33:48,280 --> 00:33:55,473 And I would just add that, for me, that's kind of manifested even implicitly, in 468 00:33:55,473 --> 00:34:00,176 feeling underestimated at kind of every level. 469 00:34:00,176 --> 00:34:04,932 Which is not a universal experience, and I thought it was just, yeah, 470 00:34:04,932 --> 00:34:07,880 this is what everyone kind of goes through. 471 00:34:07,880 --> 00:34:12,986 But not necessarily, and people being kind of surprised and shocked when 472 00:34:12,986 --> 00:34:18,170 you kind of show up and take up space and produce something valuable or impressive. 473 00:34:18,170 --> 00:34:19,937 And it's like, why are we surprised, though? 474 00:34:19,937 --> 00:34:24,891 [LAUGH] Yeah, like it seems normal at first but then you're like, wait, 475 00:34:24,891 --> 00:34:28,550 no, you expect other people to be this impressive.
476 00:34:28,550 --> 00:34:29,355 Why is it shocking when I am? 477 00:34:29,355 --> 00:34:33,880 So that's something, again, that just was always kind of a constant. 478 00:34:33,880 --> 00:34:37,631 And I never realized that it isn't universal. 479 00:34:37,631 --> 00:34:43,549 And it's something, again, that people of color, women especially, in these 480 00:34:43,549 --> 00:34:49,680 spaces have to kind of go above and beyond to just cement their adequacy sometimes. 481 00:34:49,680 --> 00:34:52,650 So, that's another thing to look out for. 482 00:34:54,860 --> 00:34:57,020 >> So there's one that says, do we have a Slack community? 483 00:34:57,020 --> 00:34:59,972 We do not. 484 00:34:59,972 --> 00:35:01,502 Maybe at- >> Not yet at least, but 485 00:35:01,502 --> 00:35:03,373 there are some great organizations here. 486 00:35:03,373 --> 00:35:09,900 Quite a few of these have come up with Slack communities. Design Buddies, right? 487 00:35:09,900 --> 00:35:12,970 Yeah, so- >> The Design Buddies Discord exists. 488 00:35:12,970 --> 00:35:17,003 >> Of course, yeah, highly recommend, by the way. 489 00:35:17,003 --> 00:35:21,296 >> And then another question says, when looking for 490 00:35:21,296 --> 00:35:27,437 a company to work for, should you be researching how diverse they are? 491 00:35:27,437 --> 00:35:34,440 >> I mean, yes, that's definitely encouraged. 492 00:35:34,440 --> 00:35:38,040 But beyond just doing the hard research yourself, 493 00:35:38,040 --> 00:35:41,210 you can even ask that in the interview process. 494 00:35:41,210 --> 00:35:42,840 I actually do this. 495 00:35:43,860 --> 00:35:48,458 And honestly, even gauging people's responses is quite indicative, right, 496 00:35:48,458 --> 00:35:51,115 just how they react to a question like that. 497 00:35:51,115 --> 00:35:53,983 Like, hey, yeah, just curious if you have any diversity and 498 00:35:53,983 --> 00:35:56,202 inclusion initiatives in your organization. 499 00:35:56,202 --> 00:35:59,815 Or, I noticed I didn't speak to any people of color yet 500 00:35:59,815 --> 00:36:03,910 throughout my interview or recruitment process. 501 00:36:03,910 --> 00:36:06,760 Do you have any people of color in leadership positions? 502 00:36:06,760 --> 00:36:08,481 It's a fair question, I think. 503 00:36:08,481 --> 00:36:12,761 And even if the answer is no, maybe they have programs in place, 504 00:36:12,761 --> 00:36:15,810 maybe that's why they're talking to you. 505 00:36:15,810 --> 00:36:17,542 And it's up to you how you feel about that. 506 00:36:17,542 --> 00:36:23,740 Maybe they aren't expecting that question at all and they're caught super off guard. 507 00:36:23,740 --> 00:36:27,026 So I find that that's a very honest, 508 00:36:27,026 --> 00:36:33,078 reasonable way to approach that, like in a human-centered way. 509 00:36:33,078 --> 00:36:37,802 Beyond just looking at hard data that you might find online, 510 00:36:37,802 --> 00:36:40,260 I would recommend both for sure. 511 00:36:42,320 --> 00:36:47,345 >> Yeah, and just to add to that, it's kind of tough to really know, 512 00:36:47,345 --> 00:36:51,999 based off of the data that they might have posted publicly, 513 00:36:51,999 --> 00:36:56,760 what the initiatives are, kinda what she was talking about. 514 00:36:56,760 --> 00:37:00,094 I know where I'm working at now, I do a lot in healthcare, and 515 00:37:00,094 --> 00:37:03,513 the company that I'm working for isn't the most diverse.
516 00:37:03,513 --> 00:37:05,401 But they're aware of that and 517 00:37:05,401 --> 00:37:10,050 they're putting things in place to solve that moving forward. 518 00:37:10,050 --> 00:37:13,867 And so I feel like if you can uncover what their plan is to combat the lack of 519 00:37:13,867 --> 00:37:18,590 diversity in the workforce, that's more important than how diverse they are right now. 520 00:37:18,590 --> 00:37:22,677 Because we're already aware that there is that lack of diversity, and 521 00:37:22,677 --> 00:37:25,928 that not everybody is gonna be where they wanna be yet. 522 00:37:25,928 --> 00:37:29,980 But it's the plans you have to fix this moving forward 523 00:37:29,980 --> 00:37:31,691 that I think mean more. 524 00:37:41,504 --> 00:37:46,328 >> So, yeah, we have: what self-care tips do you recommend going 525 00:37:46,328 --> 00:37:49,430 through this field as a person of color? 526 00:37:50,650 --> 00:37:56,227 So we mentioned a few things, but top of mind, or kind of like the overarching umbrella, 527 00:37:56,227 --> 00:38:01,330 would be to just kind of practice self-awareness, first and foremost. 528 00:38:01,330 --> 00:38:07,462 And really try to be in tune with the environments and spaces that you're in. 529 00:38:07,462 --> 00:38:11,556 And how you're feeling around certain people, 530 00:38:11,556 --> 00:38:16,632 certain situations, certain organizations; that intuition and 531 00:38:16,632 --> 00:38:20,920 just the internal reaction you have is not a coincidence. 532 00:38:20,920 --> 00:38:25,252 So, most often when something feels a bit off, if it's a comment, 533 00:38:25,252 --> 00:38:30,269 if it's a slight, if it's just something that happens in your organization, 534 00:38:30,269 --> 00:38:32,770 it might not be just you, right? 535 00:38:32,770 --> 00:38:36,100 It's a very valid feeling and could be an indication of something bigger. 536 00:38:36,100 --> 00:38:41,593 So I think just even, like, practicing that self-awareness is the first step 537 00:38:41,593 --> 00:38:47,705 in kind of recognizing whether you're in an explicitly oppressive environment. 538 00:38:47,705 --> 00:38:52,465 And then kind of mediating that, deciding if that is a place for 539 00:38:52,465 --> 00:38:53,560 you long term. 540 00:38:53,560 --> 00:38:58,136 Unfortunately, that also comes with tremendous privilege, 541 00:38:58,136 --> 00:39:02,978 right, to be able to walk away from an oppressive environment or 542 00:39:02,978 --> 00:39:06,130 employer; most of us can't do that. 543 00:39:06,130 --> 00:39:10,069 So, recognizing that, and having a strong support system, 544 00:39:10,069 --> 00:39:12,683 that's when that really comes in handy. 545 00:39:12,683 --> 00:39:15,724 If you don't have the privilege to be able to just walk 546 00:39:15,724 --> 00:39:19,521 away from something that's actively causing you harm like that, 547 00:39:19,521 --> 00:39:24,593 surrounding yourself with people who you can talk to, reach out to as a confidant, 548 00:39:24,593 --> 00:39:28,082 ask for support: hey, have you been in this situation? 549 00:39:28,082 --> 00:39:32,950 Passed up for a promotion, or something with co-workers? Just having 550 00:39:32,950 --> 00:39:37,920 a community where you can share these conversations and stories. 551 00:39:37,920 --> 00:39:40,175 That's when this is hugely important, again.
552 00:39:40,175 --> 00:39:44,983 And another reason we shared this by no means exhaustive list of organizations and 553 00:39:44,983 --> 00:39:47,920 communities where you can start finding that. 554 00:39:49,920 --> 00:39:54,654 >> And I feel like this doesn't just relate to being in, like, an oppressive 555 00:39:54,654 --> 00:39:59,240 environment; just in general, practice meditation. 556 00:39:59,240 --> 00:40:03,636 Get away from your computer, go on walks, take care of your mental health, 557 00:40:03,636 --> 00:40:08,668 take care of your physical health. Do things away from your computer, just so 558 00:40:08,668 --> 00:40:12,769 that mentally and physically you have a lot of clarity on that front. 559 00:40:12,769 --> 00:40:17,368 So that way, when you do have to be clocked into work, you've done the best 560 00:40:17,368 --> 00:40:22,124 that you can to prepare yourself to deal with whatever is gonna come your way. 561 00:40:22,124 --> 00:40:25,137 So definitely practice mental health, 562 00:40:25,137 --> 00:40:30,016 mental wellness, and physical health as well, as much as you can. 563 00:40:33,880 --> 00:40:38,419 So I see a question that says, how can I hold my employer accountable? 564 00:40:38,419 --> 00:40:44,224 And is it worth calling stuff out or should I just lay low and 565 00:40:44,224 --> 00:40:47,671 survive for fear of retaliation? 566 00:40:47,671 --> 00:40:52,567 For this one, I would say, if within the company you 567 00:40:52,567 --> 00:40:57,463 can find allies who feel the same way that you do, then 568 00:40:57,463 --> 00:41:03,936 you can come to leadership as a group instead of just as a single person. 569 00:41:03,936 --> 00:41:09,474 Specifically, if you can get those allies in positions where they may have a little 570 00:41:09,474 --> 00:41:14,375 bit of pull within the company, that can help you gain a lot of traction and 571 00:41:14,375 --> 00:41:18,200 make some strides to improve the way that things are now. 572 00:41:18,200 --> 00:41:20,850 If you can't do that, then, 573 00:41:20,850 --> 00:41:26,469 it kinda depends on the amount of risk that you wanna take on. 574 00:41:26,469 --> 00:41:29,529 So you can push for change on your own, but 575 00:41:29,529 --> 00:41:34,866 you do have to understand that there could potentially be repercussions 576 00:41:34,866 --> 00:41:39,779 that result in maybe the loss of a job or something along those lines. 577 00:41:39,779 --> 00:41:43,534 If it goes against what everybody else in that company feels, or 578 00:41:43,534 --> 00:41:46,815 what the executives feel at that particular company. 579 00:41:46,815 --> 00:41:51,315 So definitely, start by trying to find some allies. 580 00:41:59,000 --> 00:42:02,500 >> Yeah, there's the question, how did you two connect? 581 00:42:02,500 --> 00:42:04,676 Jarvis is a mentor of mine. 582 00:42:04,676 --> 00:42:11,010 He's a mentor at the design school I went to, design bootcamp rather. 583 00:42:11,010 --> 00:42:17,797 And yeah, that's how we've connected and collaborated through that. 584 00:42:17,797 --> 00:42:22,665 I see some comments, questions in the chat, not in the Q&A, 585 00:42:22,665 --> 00:42:26,194 which we can take if we don't have questions in the Q&A. 586 00:42:26,194 --> 00:42:27,663 I see some. 587 00:42:30,389 --> 00:42:32,380 No, sorry, did I miss it? 588 00:42:37,999 --> 00:42:40,669 Sorry, it was right here.
589 00:42:40,669 --> 00:42:45,974 I'm worried as I start a new role that I won't have the support I need and 590 00:42:45,974 --> 00:42:49,610 will suffer like I have in previous industries. 591 00:42:49,610 --> 00:42:53,116 So yeah, I think you were just kind of speaking to this a bit, right, Jarvis? 592 00:42:53,116 --> 00:42:59,149 The importance of seeking and having allies, ideally within your organization. 593 00:42:59,149 --> 00:43:04,040 But again, even if it's outside of it, even if that's family and friends, 594 00:43:04,040 --> 00:43:07,520 just having that support system is hugely important. 595 00:43:07,520 --> 00:43:11,745 No one should go through that alone, and we seldom have to; 596 00:43:11,745 --> 00:43:16,898 at the very least, to be able to have people who you can speak about this with and 597 00:43:16,898 --> 00:43:19,366 kind of even form a plan of action. 598 00:43:19,366 --> 00:43:23,636 Just even debrief and say, okay, this is what happened, what steps do I wanna take? 599 00:43:23,636 --> 00:43:28,742 If any, just to have a safe space to be able to communicate those things, 600 00:43:28,742 --> 00:43:30,930 I think that will go a long way. 601 00:43:30,930 --> 00:43:34,461 But again, that's entirely possible. 602 00:43:34,461 --> 00:43:39,637 If you don't have that in your own life, design and tech communities, 603 00:43:39,637 --> 00:43:45,334 the inclusive design and tech communities, are so valuable and so welcoming and 604 00:43:45,334 --> 00:43:50,539 just wonderful resources to be able to have those safe conversations. 605 00:43:50,539 --> 00:43:55,370 And connect with people who have very likely gone through the same things you 606 00:43:55,370 --> 00:43:56,289 have and are. 607 00:43:56,289 --> 00:43:59,080 So I would highly recommend seeking that out, and 608 00:43:59,080 --> 00:44:02,166 you absolutely won't be going through that alone. 609 00:44:05,190 --> 00:44:09,270 >> I second that. 610 00:44:09,270 --> 00:44:13,529 So I see two questions that are kinda similar. 611 00:44:13,529 --> 00:44:16,659 Somebody said they're fairly new to the field, so any tips? 612 00:44:16,659 --> 00:44:23,416 And somebody else said, can you join the field if you're not an artist? 613 00:44:23,416 --> 00:44:26,292 >> Proceed, 614 00:44:26,292 --> 00:44:31,273 yeah, I see. >> I'll 615 00:44:31,273 --> 00:44:34,602 address the one about being an artist. 616 00:44:34,602 --> 00:44:39,350 The thing about UX is that it's not art; we study patterns and behaviors. 617 00:44:39,350 --> 00:44:44,495 And we're not attempting to build something that's super innovative and 618 00:44:44,495 --> 00:44:48,669 super out there in a way that people won't know how to use it. 619 00:44:48,669 --> 00:44:51,977 We're trying to leverage common patterns and 620 00:44:51,977 --> 00:44:57,320 recognizable behavior that people are already used to seeing every day and 621 00:44:57,320 --> 00:45:02,089 making the software that's gonna be new to their everyday lives. 622 00:45:02,089 --> 00:45:07,009 So we don't wanna be super out there and artistic in 623 00:45:07,009 --> 00:45:12,179 that fashion; we wanna be very scientific, in a way. 624 00:45:12,179 --> 00:45:13,873 >> Yeah, very well said.
625 00:45:13,873 --> 00:45:18,364 And as far as certifications or design degrees, 626 00:45:18,364 --> 00:45:22,526 college degrees are concerned, at least for 627 00:45:22,526 --> 00:45:27,672 design, the weight given to a design applicant falls far more on 628 00:45:27,672 --> 00:45:33,064 their actual work than on their educational background. 629 00:45:33,064 --> 00:45:37,903 Even if you did go to the most prestigious design school or art school, 630 00:45:37,903 --> 00:45:42,512 that won't weigh as heavily as the actual work in your portfolio. 631 00:45:42,512 --> 00:45:46,519 So I would always emphasize the actual work over anything else. 632 00:45:46,519 --> 00:45:51,674 If your work is phenomenal, it won't matter where you got your degree, or 633 00:45:51,674 --> 00:45:54,730 the lack thereof, like in the case of Jarvis. 634 00:45:54,730 --> 00:45:58,497 It's absolutely not necessary, but again, 635 00:45:58,497 --> 00:46:03,104 that doesn't mean that there aren't privileges, or 636 00:46:03,104 --> 00:46:07,300 biases against people without degrees. 637 00:46:07,300 --> 00:46:12,157 I think Elizabeth, Lizzy, even mentioned in the chat that 638 00:46:12,157 --> 00:46:17,510 applications that require college degrees or bachelor's degrees 639 00:46:17,510 --> 00:46:22,582 are inaccessible by default; I'm of that opinion as well. 640 00:46:22,582 --> 00:46:26,130 So, again, it's not every employer that will even require that. 641 00:46:26,130 --> 00:46:31,930 Some will explicitly encourage applicants from underrepresented communities, 642 00:46:31,930 --> 00:46:35,170 both ethnically and educationally, to apply. 643 00:46:35,170 --> 00:46:36,910 And I always think that that's great. 644 00:46:36,910 --> 00:46:42,531 That kind of levels the playing field a bit more and democratizes the industry, or 645 00:46:42,531 --> 00:46:45,011 at least entry to the industry. 646 00:46:45,011 --> 00:46:51,809 But again, as we've mentioned in this talk and as most people know firsthand, 647 00:46:51,809 --> 00:46:56,882 you will undoubtedly, regardless, probably end up doing more 648 00:46:56,882 --> 00:47:04,300 work than your white counterpart just by default, even without that degree, right? 649 00:47:04,300 --> 00:47:07,499 So that's kind of an unfortunate reality. 650 00:47:07,499 --> 00:47:10,164 >> Yeah, and to add to it: 651 00:47:10,164 --> 00:47:15,396 it will be significantly harder for you to get into the tech space without 652 00:47:15,396 --> 00:47:20,714 the certification or degree or some type of piece of paper to validate your 653 00:47:20,714 --> 00:47:25,729 experience, just because that's how our society is kinda set up now. 654 00:47:25,729 --> 00:47:32,047 But because we're in a space where we have work to back up what we know, 655 00:47:32,047 --> 00:47:36,545 it does mean that there is still an opportunity for 656 00:47:36,545 --> 00:47:39,130 you to break into the field. 657 00:47:39,130 --> 00:47:43,389 You may not be taken seriously by everybody; you probably won't. 658 00:47:43,389 --> 00:47:48,632 I know there are companies that I can't apply to because I don't have a degree. 659 00:47:48,632 --> 00:47:54,272 Not saying that I don't meet all the qualifications within their job posting, 660 00:47:54,272 --> 00:47:58,750 but they'll be really strict on it: no bachelor's degree, sorry. 661 00:47:58,750 --> 00:48:01,009 I've been told that by recruiters before. 662 00:48:01,009 --> 00:48:06,261 So there will be opportunities that you can't take advantage of.
663 00:48:06,261 --> 00:48:09,278 There will be times where people don't take you seriously. 664 00:48:09,278 --> 00:48:12,590 There will be times where, yeah, people will ask, 665 00:48:12,590 --> 00:48:17,348 can you send us over your transcript or your degree or something like that? 666 00:48:17,348 --> 00:48:19,470 And you say no, and then that's it. 667 00:48:19,470 --> 00:48:24,479 So just being aware of that, you can still make it without one. 668 00:48:24,479 --> 00:48:25,874 I'm clearly living proof of that. 669 00:48:32,110 --> 00:48:37,315 >> Yeah, I think we got through all of the questions right on time, too. 670 00:48:37,315 --> 00:48:38,732 >> Perfect. 671 00:48:38,732 --> 00:48:41,314 >> Perfect, we love that. 672 00:48:41,314 --> 00:48:45,666 So thanks again, everyone, for having us and engaging in such a thoughtful and 673 00:48:45,666 --> 00:48:47,640 meaningful Q&A and discussion. 674 00:48:47,640 --> 00:48:51,149 Feel free to reach out to either Jarvis or myself. 675 00:48:51,149 --> 00:48:55,401 You can find us on LinkedIn and socials. 676 00:48:55,401 --> 00:48:59,269 Not sure if you're on socials, Jarvis, but yeah, you'll be able to find all of our 677 00:48:59,269 --> 00:49:01,973 information through the Treehouse registration site. 678 00:49:01,973 --> 00:49:05,181 And we hope you enjoy the rest of the festival. 679 00:49:05,181 --> 00:49:07,139 >> Yes, thanks, everybody. 680 00:49:07,139 --> 00:49:08,931 >> We appreciate you. 681 00:49:08,931 --> 00:49:10,380 >> Thank you for coming out. 682 00:49:10,380 --> 00:49:11,020 >> Bye.