MICHELLE: In Bruce Schneier's book, Data and Goliath, he explains the four basic consumer surveillance streams that existed before the Internet.

First, companies keep track of their customers' purchasing behavior. This started out simple, such as hotels keeping track of their frequent guests. Then it evolved into monitoring sales from the initial browsing to the final purchase: collecting activity via loyalty cards, trading customer lists with other stores, and using customer relationship management tools.

The second stream is direct marketing, where paper mail is sent directly to people's homes. While "direct" is in the name, it was roughly based on location, demographics, and customer lists traded among like-minded businesses. These days it's much more targeted and data informed.

Thirdly, credit bureaus collect personal financial data. Credit history affects a person's approval chances when leasing an apartment or taking out a loan.

The fourth stream is government records, such as birth certificates, voter registration records, and driver's licenses.

Now these streams have been combined to form large data brokers like Acxiom. They buy your personal data from the products you use, combine it with other data streams, and sell it to companies who want to know more about you. And it's not just the retail industry; it's even healthcare and law enforcement.

There are new streams as well. A cookie is a piece of data sent from a website and stored on the user's computer by their web browser. Cookies are a common way for websites to store information, anything from a language preference to how a visitor found out about the site.
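To make that concrete, here is a minimal browser sketch of setting and reading a language-preference cookie. The cookie name `lang` and the one-year lifetime are illustrative assumptions, not any particular site's implementation.

```typescript
// Store the visitor's language preference in a cookie.
// Each assignment to document.cookie adds or updates one cookie.
function setLanguageCookie(lang: string): void {
  const oneYear = 60 * 60 * 24 * 365; // max-age is in seconds
  document.cookie = `lang=${encodeURIComponent(lang)}; max-age=${oneYear}; path=/; SameSite=Lax`;
}

// Read the preference back on a later visit, if it exists.
function getLanguageCookie(): string | undefined {
  const match = document.cookie
    .split("; ")
    .find((pair) => pair.startsWith("lang="));
  return match ? decodeURIComponent(match.substring("lang=".length)) : undefined;
}

setLanguageCookie("en-US");
console.log(getLanguageCookie()); // "en-US"
```

Servers can set cookies too, via the Set-Cookie response header; third-party tracking cookies work the same way, just set by a domain other than the site the visitor is actually looking at.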
Everyday objects are now data collection tools. Smart devices such as Amazon Alexa listen in on people in their homes, providing them services but also collecting data that can be handed over to law enforcement.

HOPE: Harvard professor and author Shoshana Zuboff defines surveillance capitalism as the unilateral claiming of private human experience as free raw material for translation into behavioral data. That data is then repackaged into predictions, informing companies about what we will do now, soon, and later.

In the Corporate Surveillance in Everyday Life report by crackedlabs.org, the primary data collectors are platforms such as Facebook, Google, and Apple, the three major credit reporting agencies, and consumer data brokers.

MICHELLE: Technology continues to grow at a much faster pace than the government can regulate it. The European Union has led the way with its General Data Protection Regulation, commonly referred to as the EU GDPR. It provides data protection and privacy for those in the EU, and its reach extends beyond its borders because it also applies to personal data transferred outside the EU. You've probably seen a cookie consent banner; those became widespread because of GDPR.

Many companies have become GDPR compliant for their entire product, even though they're not required to do so, because it is too costly and time consuming to provide a different experience for a subset of users. While companies may feel motivated to stockpile data, it's increasingly a liability.

In the United States, children are further protected by the Children's Online Privacy Protection Rule, commonly abbreviated as COPPA. It took effect in 2000 and applies to personal information collected from kids under 13 in the United States. It even covers children outside of the US if the company is US-based.

These are just a couple of examples of data protection regulation. There are several laws worldwide that apply to various jurisdictions. Check out the link in the teacher's notes.

HOPE: Sometimes users are pressured to disclose more data. In 2016, TechCrunch reported that Uber had begun tracking riders' whereabouts after they left their ride. Previously, location information was only collected while the app was open. The change in 2016 forced users to choose between allowing the app to always track their location or never track it. The latter meant that they would need to type out their current location for every ride request.

Uber specifically wanted access to a rider's location from when a ride is requested until five minutes after the driver drops the passenger off, even if the app is not in the foreground of the customer's phone. Uber said their intent was to improve drop-offs and pickups, since limiting street crossing is safer, but the extra location data could be invasive. In 2017, after public pressure, Uber re-enabled the option to share location only while using the app.
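The design choice Uber faced shows up in most location APIs. As a rough illustration only (Uber's products are native mobile apps, so this browser sketch is an assumption about the general pattern, not their implementation), the browser Geolocation API distinguishes a one-shot lookup tied to a user action from an open-ended watch that keeps reporting movement.

```typescript
// One-shot lookup: ask for the rider's position only at the moment
// they request a ride. The browser prompts the user for permission.
function locateForRideRequest(): void {
  navigator.geolocation.getCurrentPosition(
    (pos) => console.log("Pickup near", pos.coords.latitude, pos.coords.longitude),
    (err) => console.log("Rider declined or lookup failed:", err.message)
  );
}

// Continuous watch: keeps firing as the device moves, long after the
// original reason for asking has passed. This is the "always track"
// end of the spectrum.
const watchId = navigator.geolocation.watchPosition((pos) => {
  console.log("Still tracking:", pos.coords.latitude, pos.coords.longitude);
});

// A privacy-respecting app stops watching as soon as the trip ends.
navigator.geolocation.clearWatch(watchId);
```

The "only while using the app" option Uber restored corresponds to the first pattern: collect location when the user initiates an action, and stop when the task is done.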
Another ethical aspect to consider with data collection is whether it is inclusive. When required to specify one's gender, often there are only the binary options of man or woman instead of a spectrum of genders. As for race and ethnicity, sometimes there isn't a way to select multiple options, or an appropriate option that matches one's identity. When this information is required, it can put people in a stressful situation if they do not know how the data will be used or misused; a sketch of a more inclusive form model follows at the end of this section.

Until 2019, Facebook allowed discriminatory advertising targeting. Advertisers could choose to exclude people based on their gender, race, and disability when posting job descriptions, housing, and credit offers.

Some companies defend the selling of our data by saying they're providing a free service. But is it a fair and balanced deal?

MICHELLE: Some people argue in defense of data collection by saying, "I have nothing to hide." But machines make decisions on data without context. Data can be misinterpreted, and predictions of our future behavior can be wrong. Excessive data collection hinders our basic human needs for freedom, privacy, and safety. More on that later.
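As promised above, here is one way a form's data model could avoid the binary trap. The field names and shape are illustrative assumptions, not a standard; the point is that gender allows self-description, race and ethnicity allow multiple selections, and both can be declined.

```typescript
// Hypothetical demographic form model, illustrative only.
interface DemographicResponse {
  // Free-form self-description instead of a forced binary,
  // with an explicit way to decline to answer.
  gender?: { selfDescribed: string } | "prefer_not_to_say";

  // Multiple selections plus a write-in option, rather than
  // a single radio button.
  raceEthnicity?:
    | { selections: string[]; selfDescribed?: string }
    | "prefer_not_to_say";
}

const response: DemographicResponse = {
  gender: { selfDescribed: "nonbinary" },
  raceEthnicity: { selections: ["Asian", "White"] },
};

console.log(JSON.stringify(response, null, 2));
```

Making both fields optional also reflects the earlier point: if the data isn't needed, not requiring it at all is the most inclusive choice.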