[MUSIC]

Hello, everyone. My name is Ryan Carson. I'm the co-founder and CEO of Treehouse. It is lovely to see you all at this online event. I have the pleasure of welcoming our next speaker, which I'm so excited about. Valerie Osei is a front-end software engineer based in Toronto, Ontario. She enjoys building the web, taking big ideas and bringing them to life through technology. She has a passion for solving problems and using technology to make people's lives better. She is always eager to talk about the latest trends in software development, and she is a dedicated lifelong learner. Please welcome Valerie.

>> Hi everyone, I hope you can hear me. I'm just going to share my screen with all of you.

Okay, so as Ryan was saying, my name is Valerie Osei. Before I start, I just want to thank all of you for joining me today to listen to me speak, so I can share some knowledge with you and hopefully shed some light on unconscious bias in the tech industry.

Okay, so Ryan did a really good job of introducing me, so I won't spend too much time on this. I work as a front-end software developer. I initially started coding on my own, and I also took a computer science course in high school that piqued my interest. From there, I decided to make a career transition. After I completed my undergraduate degree at U of T, I thought, I'm really into this coding stuff, let me see what I can do with it. My reason for being here today is that I love to share knowledge with people, and I also really love giving back to the tech community, because it's constantly growing, it's so huge, and it has so much for all of us to learn.

So today I'm going to be sharing some insights with you on unconscious bias in the tech industry, and I'll be shedding some light on how we can frame our questions when we're faced with situations where we feel we've either been impacted by unconscious bias or have projected our own unconscious biases onto others.

So first off, what is unconscious bias? Many of us have likely heard this term used in one way or another to describe a certain situation or microaggression of some kind.
It's often used interchangeably with cognitive bias, which basically refers to how we as human beings create shortcuts in our minds to process information. I have a quick quote here from Warren Buffett, which states that what the human being is best at doing is interpreting all new information so that their prior conclusions remain intact. I think we can all safely attest to this: we have our preconceived notions, and we use those as points of reference to consume new information.

So essentially what we're dealing with, and I'm sure most of us are aware of this at this point, is that the tech industry has seen years and years of bias. This has often been attributed to the socioeconomic and historical contexts in which certain roles were gaining popularity. The legacy of that is that we're now dealing with a lot of roles that aren't being filled with diverse candidates. As a result, we're dealing with a problem that has been swept under the rug but is having very clear and adverse effects.

So I wanted to take some time to highlight a few of the types of unconscious bias that exist. These aren't the only types, but I thought they were relevant for today's talk. The first one I have is called the halo effect. Basically, it refers to when we place someone on a pedestal after learning something impressive about them. If you can think of a time when you heard that someone graduated from Harvard Law, or that so-and-so is going to medical school, immediately we get this reaction where we think of that person as being superior in skill and intelligence, without actually diving deeply into what those skills are or whether that person can actually perform.

The next one I have here is called the availability heuristic. This is where we find ourselves favouring ideas that come most easily to mind. It's often based on the frequency of certain events in our lives, or perhaps a particular situation that happened and left an indelible mark on us, a very impactful impression. We often reference these experiences when we're faced with new information, as a way to process that information.
Then I have here as well confirmation or implicit bias. These terms are often used interchangeably with unconscious bias, and can also be used as a sort of umbrella under which the other types of biases can be classified. This is where we're looking to confirm what we already think or believe. And we can safely say that all of us have certain beliefs, certain ways that we live our lives, certain viewpoints that we hold, and these inform how we process the information around us.

If we take the time to dissect unconscious bias briefly, we can identify a couple of pros. For one, unconscious bias does allow us to process a bulk of information quickly, and there are situations where this is beneficial. Sometimes we don't need all the details; we don't need to know a situation at the granular level. Sometimes we just need the few pieces necessary to make an informed decision.

But of course, there are situations where this can have a negative impact. If we take the year 2020 as an example, we've been presented with an unprecedented time and with constant new information that is presented to us, then revised and re-presented to us. We're being given so much new information that if we rely on unconscious bias to process it, we're likely going to be filtering out a lot of essential detail.

This essentially prevents us from thoroughly evaluating the evidence that is presented to us, and it will inevitably lead to huge errors in judgement when we're making complex decisions. And this is why it's a huge problem that we need to start facing more directly.

So with unconscious bias comes this dilemma of diversity: if we pretend it doesn't exist, we're inevitably going to be facing the elephant in the room, which is the fact that we're dealing with a lot of tech teams and tech workspaces that aren't diverse enough.

You're also dealing with a lot of large tech companies that prefer to sweep the problem under the rug, or simply not disclose numbers that pertain to their diversity and inclusion efforts.

And this is where, of course, we can think of the mostly white and mostly male majority that is affecting a lot of tech workspaces.
Unfortunately, what this ends up doing is excluding other groups from having a seat at the table.

This, of course, also prevents companies from tapping into diverse markets. Think about the fact that we have teams making decisions on products and on various types of software. If those teams don't comprise diverse members, then you end up in a situation where they feel like they've covered all their bases, but they're actively excluding others who fit a different narrative.

And of course, this causes minorities to feel like they do not belong. I myself can attest to this, and I'm sure many of you listening right now can attest to it. We face being in a role where we don't see enough people who look like us, and sometimes we question: should I be here? Do I belong here? Is this a place that can be beneficial to me?

I also want to take some time to share results from a report conducted by TrustRadius. This report came out back in September of this year, so these are some pretty recent results, and it breaks down the experiences of marginalized individuals within the tech industry. It surveyed about 1,200 people globally, 6% of whom were Canadian, and it asked questions about what they may be experiencing in the tech industry. Overall, 65% of respondents of colour do see an increase in diversity in the tech industry, compared to 58% of white respondents.

Then, generally, there has been an increase in Canadian initiatives focused on diversity and inclusion in the tech space. These initiatives have been pointed towards combating racism and breaking down barriers, as a way to close the gap between certain groups and access to the tech industry overall. There has also been a general increase in funding to support Black Canadian entrepreneurs. This has actually been a huge leap, which has allowed the greater population to support Black-owned businesses, and it has also given Black Canadian entrepreneurs access where otherwise they may have been excluded. Now resources are being reallocated so the playing field can be leveled.
Of course, this is wonderful progress overall, but there is still a lot of concern over the lack of diversity in various areas of the tech industry. Even though we're seeing an acknowledgement of the need to reallocate resources so that minority groups can feel included, we're also still seeing that roles are generally being filled by certain groups and not by others. 67% of respondents also reported that less than a quarter of their leadership is made up of people of colour. So this, of course, is a problem. We've got people at higher levels of a company who have a huge impact on decision-making, but unfortunately, they're making decisions based on their own experiences, and inevitably, the experiences of people of colour get left out of that conversation.

Less than half also reported that they have a dedicated department focused on diversity and inclusion efforts. TrustRadius did take into account that this could naturally be attributed to the fact that smaller companies may not have as many resources as larger companies do when it comes to this. So of course, this raises some other questions: maybe we need to find alternatives that give smaller companies the resources to implement diversity and inclusion efforts the same way a larger company might.

So we want to start thinking about how we can avoid this dilemma, and there are a few ways. For one, being objective in how we explain occupational requirements in job ads and job descriptions, and focusing on them as genuine requirements to perform the actual job, rather than getting hung up on where someone went to school or whether or not they have a degree.

In addition to that, we want to start thinking about the use of language in job descriptions, and stop using words that skew a job ad toward only one group of applicants. This one's a bit tricky because, of course, it plays into human psychology, and that's not an area I'm versed in. But it is important for us to think about, because perhaps without knowing it, we've used language to describe a candidate that is inherently biased.
Then I also have here that we want to incorporate technology into workflows with diversity in mind. This is gaining popularity now, especially because we're in a technological age, and more so because most of us are working virtually at this point. The virtual world is gaining much more prevalence given the times we're in. So you have more companies starting to consider ways in which they can fill the gaps, and technology has stepped in to play that role.

So speaking of tech, I figured I'd cite a few examples of technology solutions that are attempting to attack the problem. We have a few here, and the focus ranges from educating employees on the experiences of diverse individuals outside of themselves, to fully integrated anti-racism tools, to collecting data from your specific team or company that can be analyzed to develop a customized action plan, so you can directly address issues of unconscious bias that may be taking place within your own four walls. And then, of course, the last one down here is Diverst, which runs on the premise that employees who feel included are more likely to deliver with greater quality. So this doesn't just benefit the affected groups; it benefits everyone, and that's why it's such an important thing to consider. Even if you've never thought about it previously, you want to start looking at ways you can make everyone feel like they have a place.

Of course, how can we start a conversation, and what types of questions can we ask when we're thinking about unconscious bias? First off, before we get into our line of questioning, it's important that we hold ourselves accountable. It's likely that we've already made assumptions about a person or a situation at some point without even realizing it. And in order for us to start asking questions, we need to know what we're asking questions about. These assumptions can often be based on a person's style of dress, their skin colour, their accent, whether or not they speak with an accent according to you, whether or not they have a family or kids, their marital status, and so on; the list goes on.
Essentially, in many ways we default to reacting to a person, or to feeling comfortable around a person, depending on how closely they resemble what we're comfortable with. So if we hear someone speaking with an accent that's unfamiliar to us, that can inherently affect how we process that person's way of speaking, the assumptions we make about their intelligence, their fluency in English. We could literally talk about this particular thing all day, but it does have a negative impact, because we're actively either including certain people arbitrarily or excluding others without really knowing who they are.

So we want to look within ourselves and start thinking: all I know about this person is what I actually know about this particular person. Be objective in how you process meeting someone for the first time.

Start to ask yourself, what do I really know about this person or situation? Look at the facts, rather than filling in your gaps of knowledge with assumptions you made based on what colour the sky was that day.

You also want to make sure that if there are assumptions you've made, you're intentional about identifying them. Sometimes that can be difficult, because of the way the human brain works and the way we've had to adapt to processing new information; we're constantly making assumptions to get through the day. That's just how it works: we have to save time.

So when framing your questions, you want to build a solid foundation, starting with facts only. Then think about what your motivation might be. Why is this important to you? What are you trying to achieve?

Then be clear and specific about who or what your question is directed to. If you've faced a situation where someone made an offensive comment at work, or perhaps you've been microaggressed against, you want to start thinking: okay, who am I going to approach about this issue? Do I want to go directly to the colleague who offended me? Do I want to talk to my reporting manager? Do I want to go to HR? That will help you form a sort of paradigm for how you go about this line of questioning.
It's not easy, and by no means am I implying that it is, but this is where you start to build a foundation for how you're going to address the issue.

So of course, the idea of interviews comes to mind. Many of us are either in the interviewing process, aspire to be in the interviewing process for a certain role in the tech industry, or have been through the interviewing process and reflected on how those interviews went, whether good or bad.

One thing that often comes up is the opportunity to ask questions at the end of an interview. So be on the lookout for questions during your interview that centre on company culture or the type of dynamic you work best in. Even if this isn't touched on during the interview, it may be in the job description for the role you're going for. You can zero in on that and take the opportunity to clarify, or to ask the interviewer questions about what their company culture actually is and what a culture fit actually means. From there, you can assess: is this a situation you're comfortable with?

This is essentially an opportunity for you to express how important it is that you work for a company that concerns itself with diversity and also actively seeks to bring up difficult conversations in order to create a safe space. Is your company, your team, or your management willing to put themselves on the line so that you can feel included?

When we think about fighting unconscious bias and participating in this concerted effort, we want to look at removing educational backgrounds from the hiring model, and some companies have already committed to doing this.

We also want to look into the use of gender-neutral language in job descriptions. Again, this plays more on human psychology, where we find certain groups are more responsive to certain types of language than other groups. Without going too deeply into that, it's essentially a situation where you want to make sure you acknowledge that in 2020, gender is not binary. It's not a binary concept; it's a spectrum, and we want to make sure that regardless of where one falls on that spectrum, they feel included and feel like they're being given access to an area they want to be a part of.
In addition to that, we have the use of blind AI technology to source candidates without bias. Now, this of course comes with the caveat that it all depends on who's building the technology. Technology is only as good as the human minds behind it. If the tech teams and companies building this technology lack diversity, at some point it's going to be discovered that certain groups are being left out of having access to that technology in a way that can benefit them.

And then of course, we want to be able to use our conscious knowledge to override our unconscious default settings. So rather than relying on past experiences 100% of the time, we can rely on our past experiences some of the time, but also make room for new information to override some of the arbitrary knowledge we've gathered based on our own personal bias.

So when we think about setting ourselves up for success, this is where we start to think about ways in which we can hold ourselves accountable. Have we participated in situations where we projected unconscious biases onto a certain group of people? Have we kept quiet in situations where we could have spoken up about unconscious biases that have affected other people? We all have a role to play in this, and it's important for us to be as self-aware as possible. We also want to understand that unconscious bias exists everywhere, and it's not going anywhere anytime soon. It's a symptom of how the human mind works and of the world we're living in right now.

This point here is a tough one, and I don't touch on it lightly. But of course, don't be afraid to bring up diversity in your next job interview or your next meeting, provided you have the opportunity to do so and provided that you're in the mental state to do so. This isn't easy for all of us; sometimes we have traumatic experiences that would get unveiled through this engagement, so it's not easy to do. But if you feel like it's something you can do, I definitely encourage you to do so.

Of course, it's going to be uncomfortable, and that's okay, because it's normal; it's by design.
And if the thought of discussing diversity is uncomfortable for you, there may be a reason why.

You want to ask yourself: does your company provide a safe space for these conversations to be had? And is there someone in particular you can go to if you have a grievance or an issue that you want addressed?

Now, here are just a few Canadian companies I wanted to cite briefly. Scotiabank has committed to removing education from the hiring model; instead, they want to start relying on the job experiences of their candidates.

We also have Hispanotech.ca, who are looking to address unconscious bias in their recruiting practices through the use of technology.

And then we have Hubba, which as of 2017 conducted its first diversity survey and planned on providing progress reports from there. They've also released an open-source framework, which will hopefully help other companies do the same. The URL I posted here is where you can find the article linked to that open-source framework.

And just as I wrap up today, there are a few takeaways I wanted to cite real quick. First, we want to make sure we understand unconscious bias and its impact. Whether negative or positive, we want to be very clear on the depths of unconscious bias.

We also want to make sure we're being intentional about uncovering our own unconscious biases, because a lot of the time our questions can be rooted in bias. Naturally, we come from experiences that have taught us and have informed our future experiences, but it's important to think critically about those as well.

You also want to take advantage of opportunities to challenge unconscious biases around you, whether within your professional working space or your social environment.

And then, concern yourself with unconscious biases even if they benefit you or don't affect you at all. This is where the shift really takes place. If we can all look at unconscious bias as affecting us all the same way, then we can look at it as more of a team effort.
Rather than leaving it to those who are being marginalized or affected by unconscious bias to handle the issue on their own.

So with that, I just want to say thank you to Liz for coordinating this, and to Ryan as well, and also to Team Treehouse and the Treehouse Festival for having me speak today.

For anyone who wants to connect with me, I'm on LinkedIn under Valerie Osei, and I'm happy to take questions if anybody has any. I'm actually going to switch screens.

Okay, so we have: how would you recommend approaching a company that only has white cis people on their board and in managerial positions? Do we stay and try to change these systems internally, or leave and try to support these changes externally?

That's a very, very good question, and that's also a very tough thing to think about. What I'll say is that sometimes, very unfortunately, it may not be worth exerting energy on trying to change a system that is deeply, deeply, deeply rooted in white supremacy and that refuses to acknowledge the experiences of minority groups. A lot of the time, you try to tell someone there's a problem and they'll tell you that you're the one with the problem. So this is where you want to make sure to assess the situation. If you feel like the energy you're exerting in this situation is too much for your mental health and too much for you overall, then I would say that it's okay to exit from that situation and change it from the outside. Speak about it more, be vocal about it, and challenge the institutions that support it, rather than feeling like you have to carry the world on your shoulders.

We have another one: what is the best way to be an ally while at the same time competing for jobs with those I want equality for? Okay, yeah, so being an ally means listening to those of us who are recounting experiences we may have had in the workplace or in our social environments. In particular, if someone is holding you accountable, listen. I think the best thing to do is just open your ears and listen. Even if you don't think you've done that, or you don't think you've engaged in offensive behaviour, just listen to what that person is saying.
Because they're citing experiences that you likely aren't privy to, or that you're not able to relate to. So being a good listener is a big one. And then, as for competing for jobs with those you want equality for, this is where you can also hold hiring managers accountable. If you're going for a job and you're in an interview, it goes back to being unafraid to speak up about possible unconscious biases taking place during your interview. Sometimes there's the assumption that you're willing to participate in beer o'clock Fridays, for example, drinking beer with the buddies on Friday. That assumption alone is rooted in bias. You can address that too, and also let hiring managers know that they need to look at candidates through a lens that's not rooted in favouring certain groups over other groups.

How do we approach companies that are performative, for example showing LGBT support to make sales and attract talent, but internally they don't support D&I? Yeah, okay, that's a big one. So, approaching companies that are performative: when it comes to companies that are performative, again, if you feel like this is something you can take on, there's nothing wrong with speaking out against it. You can call out these companies if you feel comfortable doing so. Now, if you're employed by that company, obviously it's going to look different. But if you feel like there's a level of performative activism going on within the company you're working for, you want to see if there's a channel of communication to address the issue directly. And this does come with consequences. It's hard to predict what those consequences will be, because every company is different and every manager is going to have a different ear for this issue, right? So what I'd say is, don't be afraid to speak up, and in a sense, be willing to sacrifice a part of yourself for the cause. Because unfortunately, that's what we do when we open our mouths: we accept the consequences of the backlash.

And we have here: how do we approach companies? I think that was Peter's question. Perfect, okay, so that's the end of the questions.
So I just wanted to say a huge thank you to all of you for the questions, and I hope I've been able to share some useful information today. Thanks so much, and I hope you all enjoy the rest of your day.