1 00:00:00,000 --> 00:00:02,150 [Seminar: Surviving the Internet] 2 00:00:02,150 --> 00:00:04,300 [Esmond Kane-Harvard University] 3 00:00:04,300 --> 00:00:07,010 [This is CS50.-CS50.TV] 4 00:00:07,680 --> 00:00:09,790 Hello, and welcome to "Surviving the Internet." 5 00:00:09,790 --> 00:00:14,690 It is one of the seminars that comprise part of this CS50 curriculum. 6 00:00:15,320 --> 00:00:19,460 My name is Esmond Kane. My name and address are on that slide deck in front of you. 7 00:00:19,460 --> 00:00:21,790 It is esmond_kane@harvard.edu. 8 00:00:21,790 --> 00:00:27,360 In my day job I am one of the IT security directors for HUIT, 9 00:00:27,360 --> 00:00:31,850 but I have to acknowledge that today I am on an away mission 10 00:00:31,850 --> 00:00:33,850 which is why I am wearing a red shirt. 11 00:00:33,850 --> 00:00:37,090 This is not going to comprise anything that is attributable 12 00:00:37,090 --> 00:00:41,030 directly to my day job, so this is not about IT security at Harvard. 13 00:00:41,030 --> 00:00:44,690 This is more just personal information; this is how when you're-- 14 00:00:45,320 --> 00:00:48,220 these are the kind of skills that you'll acquire to try and help you 15 00:00:48,220 --> 00:00:51,800 harden your workstations and your environment throughout your career. 16 00:00:52,200 --> 00:00:57,320 But nothing that I mention today should be applied to any of your 17 00:00:57,320 --> 00:01:00,980 university material, your servers, or your workstations 18 00:01:01,550 --> 00:01:04,470 without contacting your local IT support. 19 00:01:05,230 --> 00:01:08,420 And indeed if I mention any applications or any incidents as part of this 20 00:01:08,420 --> 00:01:14,200 talk or discussion it is not reporting anything that I am privileged to report. 
21 00:01:14,200 --> 00:01:16,200 It is usually public. 22 00:01:16,310 --> 00:01:19,220 Nor indeed should any mention of any application imply any 23 00:01:19,220 --> 00:01:23,400 endorsement by Harvard or indeed any condemnation. 24 00:01:23,400 --> 00:01:27,440 >> So today why we are here--now that we are done with the disclaimer-- 25 00:01:28,060 --> 00:01:31,210 we are here today to talk about surviving the Internet. 26 00:01:31,210 --> 00:01:34,030 And why is it such an important topic right now? 27 00:01:34,300 --> 00:01:38,060 So to paraphrase Perry Hewitt who works in the Harvard Press and Communications office-- 28 00:01:38,060 --> 00:01:42,230 I apologize for reading this right now--she has stated, "We live in an 29 00:01:42,230 --> 00:01:47,180 atmosphere of escalating risk, but also one of unparalleled innovation. 30 00:01:47,180 --> 00:01:51,510 The rapid rise of the Internet, the Cloud, and social technologies 31 00:01:51,510 --> 00:01:56,040 has resulted in many more people having public profiles online 32 00:01:56,040 --> 00:01:59,770 with indeed access to an ever increasing array of information. 33 00:01:59,770 --> 00:02:05,580 And that means that everyone and their associations have never been more visible. 34 00:02:06,980 --> 00:02:09,979 As Harvard's digital footprint--its digital network--expands, 35 00:02:09,979 --> 00:02:12,220 we attract a broader audience. 36 00:02:12,220 --> 00:02:15,180 We hope for the betterment, but sometimes we will 37 00:02:15,180 --> 00:02:17,500 attract some negative attention. 38 00:02:18,260 --> 00:02:21,180 So as a representative of Harvard," and this includes everybody 39 00:02:21,180 --> 00:02:25,880 watching at home or indeed anybody here, "our faculty, our students, our staff, 40 00:02:25,880 --> 00:02:30,440 our researchers, the risk of compromise to you and indeed to 41 00:02:30,440 --> 00:02:34,380 your associated network has never been higher." 
42 00:02:34,780 --> 00:02:38,940 >> So often in information security when we try to balance this 43 00:02:38,940 --> 00:02:44,130 risk it is a complicated trade-off between security and the user experience. 44 00:02:45,170 --> 00:02:48,850 In the era of immediacy we have to make thoughtful decisions 45 00:02:48,850 --> 00:02:52,720 about what will enhance security without a major inconvenience. 46 00:02:54,200 --> 00:02:57,560 We are told sometimes an ounce of prevention is worth twice the cure, 47 00:02:57,560 --> 00:03:01,850 but when choosing to implement security precautions to reduce your risk 48 00:03:02,230 --> 00:03:06,330 we need to acknowledge that it will never reduce the potential risk to zero. 49 00:03:07,670 --> 00:03:11,080 So that said--we are here today to discuss some simple and not so simple 50 00:03:11,080 --> 00:03:13,710 security precautions that you can take right now. 51 00:03:15,210 --> 00:03:17,210 I should also add--if you have any questions throughout the 52 00:03:17,210 --> 00:03:20,490 presentation just raise your hand. 53 00:03:22,720 --> 00:03:25,840 So the first topic--we are often told to pick a good password. 54 00:03:25,840 --> 00:03:28,790 A password is your first and best defense. 55 00:03:28,790 --> 00:03:30,980 It is often the only one that is available to you 56 00:03:30,980 --> 00:03:33,180 when you are choosing to use an online resource. 57 00:03:34,250 --> 00:03:38,430 But as we have seen throughout this summer and indeed the preceding year 58 00:03:38,430 --> 00:03:40,990 we've seen attacks like LinkedIn, eHarmony. 59 00:03:40,990 --> 00:03:43,130 We've seen RockYou. 60 00:03:43,130 --> 00:03:48,520 We've had a sum total of 70 million passwords and accounts compromised. 61 00:03:48,670 --> 00:03:51,170 And when those passwords were released into the public domain 62 00:03:51,580 --> 00:03:54,880 they also comprised the password hashes. 
63 00:03:55,400 --> 00:04:00,860 >> So basically these days if somebody retrieves an account hive 64 00:04:01,590 --> 00:04:05,260 they do not need to crack a password anymore; they do not need to brute force a password 65 00:04:05,260 --> 00:04:09,520 because they have this massive trove of released information on what people are choosing. 66 00:04:11,020 --> 00:04:15,710 They've already got behavioral data to mine what people tend to use. 67 00:04:15,760 --> 00:04:19,600 And they have broken that down to a list of about a thousand passwords 68 00:04:19,600 --> 00:04:23,500 which comprise almost 80 to 90% of the passwords that we choose in common use. 69 00:04:24,520 --> 00:04:27,300 So a quick example--anybody want to hazard what you thought 70 00:04:27,300 --> 00:04:30,950 Bashar al-Assad used for his password when it was compromised last year? 71 00:04:32,080 --> 00:04:35,220 This is a gentleman who is subject to intense scrutiny. 72 00:04:35,830 --> 00:04:38,870 And his password was 12345. 73 00:04:39,720 --> 00:04:43,200 Okay--so these are lessons that we have learned; we need to move 74 00:04:43,200 --> 00:04:45,200 beyond just thinking of a password. 75 00:04:45,200 --> 00:04:47,380 We are told to start using a pass phrase. 76 00:04:47,380 --> 00:04:52,930 There is a great comic--or indeed a web comic--from Randall Munroe 77 00:04:52,930 --> 00:04:55,720 which goes into choosing a pass phrase; he uses--I want to say-- 78 00:04:55,720 --> 00:04:58,670 battery, staple, limit or something like that--you know--just-- 79 00:04:59,340 --> 00:05:05,060 or indeed there is the joke that somebody who picked Goofy, Nemo, 80 00:05:05,060 --> 00:05:09,280 Pluto--all these different characters and London because he was told 81 00:05:09,280 --> 00:05:12,250 to pick 8 characters and a capital. 82 00:05:12,250 --> 00:05:18,060 But--so we learn we need to go think beyond just a password. 
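[Editor's note: the "top thousand passwords" effect described above can be sketched in a few lines of Python. This is an illustrative sketch only; the tiny COMMON_PASSWORDS set below is a made-up stand-in for a real leaked-password list.]

```python
# Illustrative sketch (not from the talk): attackers try a short list of
# commonly chosen passwords first, so one cheap defense is simply refusing
# any password that appears on such a list. COMMON_PASSWORDS is a tiny
# stand-in for a real "top 1000" list mined from public breach data.
COMMON_PASSWORDS = {"123456", "12345", "password", "qwerty", "abc123", "letmein"}

def is_commonly_used(password: str) -> bool:
    """Return True if the password appears on the known-common list."""
    return password.lower() in COMMON_PASSWORDS

print(is_commonly_used("12345"))        # the password from the anecdote above
print(is_commonly_used("correct horse battery staple"))
```

A real check would load a breach-derived wordlist from disk; the lookup itself is the same constant-time set membership shown here.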
83 00:05:18,060 --> 00:05:22,710 >> There is actually an e-zine in Boston called Ars Technica. 84 00:05:23,300 --> 00:05:26,640 There is a gentleman called Dan Goodin who is doing a series on 85 00:05:26,640 --> 00:05:31,400 this changing scope--either from the attacker space where we have 86 00:05:31,400 --> 00:05:33,740 this massive trove available for us 87 00:05:33,740 --> 00:05:36,710 to mine--we no longer need to generate stuff through rainbow tables; 88 00:05:36,710 --> 00:05:39,570 we have 70 million passwords. 89 00:05:40,260 --> 00:05:42,880 But also we've had--you know--a changing landscape in the 90 00:05:42,880 --> 00:05:47,400 actual cracking space because GPU cards have made this 91 00:05:47,400 --> 00:05:49,850 virtually near real-time. 92 00:05:49,850 --> 00:05:53,380 And there is a gentleman at Def Con in August who put together 93 00:05:53,380 --> 00:05:57,240 12 of these cards into a commodity PC. 94 00:05:58,970 --> 00:06:02,260 He did it for about $2,000 or $3,000, and he was able to crack 95 00:06:02,260 --> 00:06:06,810 the LinkedIn trove in--you know--near real-time. 96 00:06:06,810 --> 00:06:08,920 It was quite scary. 97 00:06:09,280 --> 00:06:12,090 Dan Goodin's article--I highly recommend it if you want to go read it. 98 00:06:12,340 --> 00:06:16,110 A gentleman called Sean Gallagher--this morning--also published a 99 00:06:16,110 --> 00:06:19,820 quick update on it; a lot of their work is built on-- 100 00:06:19,820 --> 00:06:25,500 from material available from Bruce Schneier, but also from 101 00:06:25,500 --> 00:06:28,430 Cormac Herley from Microsoft Research. 102 00:06:28,430 --> 00:06:34,580 They kind of stated about 5-6 years ago that we need to start thinking beyond passwords. 103 00:06:34,580 --> 00:06:37,570 The suggestions at that time were things like pass phrases, 104 00:06:37,570 --> 00:06:39,770 gestural interfaces--that kind of stuff. 
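[Editor's note: to make the cracking discussion concrete, here is a minimal dictionary-attack sketch, assuming (as was true of the LinkedIn leak) that the trove consists of unsalted SHA-1 hashes. The hashes and wordlist are made up for illustration.]

```python
import hashlib

# Sketch of a dictionary attack on a trove of unsalted SHA-1 hashes.
# Each candidate word is hashed once and looked up against every leaked
# hash simultaneously--no brute force and no rainbow tables needed.
leaked = {hashlib.sha1(p.encode()).hexdigest() for p in ("letmein", "dragon")}

def crack(hashes, wordlist):
    """Return {hash: recovered_password} for every hash whose preimage
    appears in the wordlist."""
    recovered = {}
    for word in wordlist:
        digest = hashlib.sha1(word.encode()).hexdigest()
        if digest in hashes:
            recovered[digest] = word
    return recovered

print(crack(leaked, ["123456", "letmein", "monkey", "dragon"]))
```

A GPU rig like the one described above does the same lookup billions of times per second; per-user salts and a deliberately slow hash such as bcrypt are what blunt this attack.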
105 00:06:39,770 --> 00:06:42,510 You know--something you know is no longer sufficient at this point; 106 00:06:42,510 --> 00:06:44,510 that is one of the things that I want to communicate today. 107 00:06:44,510 --> 00:06:48,610 If you do have to use a password, let us not be shy in stating you should still 108 00:06:48,610 --> 00:06:52,720 pick a good one; it should be hopefully something beyond 10 characters. 109 00:06:52,720 --> 00:06:55,190 It should vary between upper and lower case. 110 00:06:55,610 --> 00:06:58,320 >> I would highly encourage you not to reuse passwords. 111 00:06:58,320 --> 00:07:02,070 I can speak to several instances where we've seen an account get 112 00:07:02,070 --> 00:07:05,130 compromised and somebody hopped and skipped--the domino effect. 113 00:07:05,130 --> 00:07:08,020 They mine each account at each stage in the process for this 114 00:07:08,020 --> 00:07:12,820 data, and then they proceed to use that data that they mined in each instance 115 00:07:12,820 --> 00:07:15,610 against another credential source. 116 00:07:16,080 --> 00:07:18,560 So--again--pick a good password. 117 00:07:19,090 --> 00:07:22,810 Make it unique. You may want to think about using a password manager service. 118 00:07:23,470 --> 00:07:26,490 There are ones out there--they are all in the app stores. 119 00:07:26,490 --> 00:07:31,560 There are ones called 1Password, KeePass, LastPass-- 120 00:07:31,560 --> 00:07:39,360 they are a nice way to help you create unique credentials, strong credentials, 121 00:07:39,360 --> 00:07:42,660 but also to facilitate the archiving and record keeping for you. 122 00:07:43,850 --> 00:07:47,480 The down side to that is you need to bring that to a password store; 123 00:07:47,480 --> 00:07:50,370 you need to make sure that that password manager that you're trusting 124 00:07:50,370 --> 00:07:52,540 is worthy of your trust as well. 
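[Editor's note: the rules of thumb just given--longer than 10 characters, mixed upper and lower case, never reused--can be sketched as a simple check. This is an editor's illustration, not a tool from the talk.]

```python
def meets_guidelines(password: str, already_used: set) -> bool:
    """Apply the talk's rules of thumb: length beyond 10 characters,
    a mix of upper and lower case, and no reuse across accounts."""
    return (
        len(password) > 10
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and password not in already_used
    )

used_elsewhere = {"Summer2012here"}
print(meets_guidelines("correctHorseBatteryStaple", used_elsewhere))  # True
print(meets_guidelines("12345", used_elsewhere))                      # False
```

A password manager enforces the "unique" rule automatically by generating and remembering a different credential per site, which is exactly the point made above.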
125 00:07:52,540 --> 00:07:57,190 >> So make sure those guys are also using some valid password mechanisms. 126 00:07:57,190 --> 00:08:00,440 In particular the one I am going to mention right now 127 00:08:00,920 --> 00:08:03,080 is multi-factor authentication. 128 00:08:03,080 --> 00:08:07,970 So multi-factor authentication--and there are several instances I will go through shortly-- 129 00:08:08,410 --> 00:08:11,020 it is the simple expedient of taking something you know like your 130 00:08:11,020 --> 00:08:15,020 user name and your password and adding to it--you are adding another factor. 131 00:08:15,020 --> 00:08:18,670 So the first factor that we will mention today are these ones on the board. 132 00:08:18,670 --> 00:08:21,730 It is something you have in your possession, so that is either an application 133 00:08:21,730 --> 00:08:25,510 that is running on your smartphone or indeed on your phone itself. 134 00:08:25,510 --> 00:08:27,750 And you might be able to receive an SMS text. 135 00:08:27,750 --> 00:08:30,980 Beware if you travel abroad that is not necessarily going to follow you. 136 00:08:30,980 --> 00:08:34,260 An application can work better in that instance. 137 00:08:34,679 --> 00:08:37,590 Or indeed the other factor you may want to think about is something you are. 138 00:08:37,590 --> 00:08:40,669 >> Now this is still kind of very much skunkworks. 139 00:08:40,669 --> 00:08:42,750 We do not see too much adoption of it. 140 00:08:42,750 --> 00:08:49,200 This is--you know--Mission Impossible style--you know--your vein print, 141 00:08:49,200 --> 00:08:52,020 your thumb print, your retina print. 142 00:08:52,020 --> 00:08:56,880 Those are kind of further out; they are not really very valid authentication factors. 
143 00:08:56,880 --> 00:09:02,450 We see--when I talk to my security colleagues--that the pressure 144 00:09:02,450 --> 00:09:05,840 you put on a keypad, your particular typing pattern, is probably 145 00:09:05,840 --> 00:09:10,160 directly on the horizon--much more so than these other biometric identifiers. 146 00:09:10,160 --> 00:09:15,990 But the ones today are applications or SMS text or even just a 147 00:09:15,990 --> 00:09:18,390 challenge response email that you are going to get 148 00:09:18,390 --> 00:09:22,820 to validate that you did in fact choose to log on at this point in time. 149 00:09:23,130 --> 00:09:26,080 So there is a link right there; I have mailed out the slide deck this morning. 150 00:09:26,080 --> 00:09:28,370 It will be on the Wiki. 151 00:09:28,370 --> 00:09:31,050 >> Both Gmail and Google do this; Yahoo will do it. 152 00:09:31,050 --> 00:09:36,010 PayPal has it; PayPal also has a little actual hardware key which does a rotating number. 153 00:09:36,010 --> 00:09:38,070 But you can also choose to use a phone number. 154 00:09:38,070 --> 00:09:40,730 Facebook also does a log-in approval, so you choose to 155 00:09:40,730 --> 00:09:46,950 approve it; they are also working towards more valid hard strength security. 156 00:09:46,950 --> 00:09:50,290 Dropbox has 2-step verification as well; you can also just 157 00:09:50,290 --> 00:09:52,290 purchase a hardware key for them. 158 00:09:52,290 --> 00:09:54,920 We also see in the Gmail one or the Google one, a lot of people are 159 00:09:54,920 --> 00:09:58,520 actually co-opting Google's authenticator, so--for instance-- 160 00:09:58,520 --> 00:10:02,780 I use LastPass--it does not imply any endorsement--but they can reuse 161 00:10:02,780 --> 00:10:05,280 Google's 2-step verification so that means I do not need to 162 00:10:05,280 --> 00:10:07,980 walk around with 2 applications on my phone. 
163 00:10:08,360 --> 00:10:12,580 But also Research Computing within Harvard is using an analog 164 00:10:12,580 --> 00:10:15,790 of Google's 2-step authentication because the one-time password 165 00:10:15,790 --> 00:10:19,140 algorithm was open-sourced there about 10 years ago. 166 00:10:19,140 --> 00:10:22,340 Any questions? Good. 167 00:10:25,150 --> 00:10:29,090 >> So another consideration beyond passwords is when you are 168 00:10:29,090 --> 00:10:32,810 using these resources be aware of what data you are committing to them. 169 00:10:32,810 --> 00:10:35,220 Just limit what you are actually putting up there. 170 00:10:35,510 --> 00:10:41,080 So we are aware that these people who are providing a service for us on the Internet-- 171 00:10:41,080 --> 00:10:44,910 these Cloud providers--they have a vested interest in you 172 00:10:44,910 --> 00:10:47,750 not being as secure as you potentially could be. 173 00:10:47,750 --> 00:10:51,750 They tend to make available a bare minimum set of security, 174 00:10:51,750 --> 00:10:56,270 and then there is a bunch of other ones that are optional that you need to choose to opt in to. 175 00:10:56,270 --> 00:11:02,690 The kind of takeaway from this talk is security is a shared responsibility. 176 00:11:02,690 --> 00:11:06,440 It is between you and the partners that you make--the alliances that you form. 177 00:11:06,440 --> 00:11:09,930 You need to take an active role. Choose to opt in to that. 178 00:11:09,930 --> 00:11:13,180 You know--take the time now; make it more secure. 179 00:11:13,180 --> 00:11:17,380 The alternative is there are already people validating and testing 180 00:11:17,380 --> 00:11:22,590 these security factors against you; the more you can choose to opt in 181 00:11:22,590 --> 00:11:25,600 to, the better prepared you are for the eventual compromise. 182 00:11:25,600 --> 00:11:27,600 And it is eventual. 
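[Editor's note: the open-sourced one-time password algorithm mentioned above is specified in RFC 6238 (TOTP, built on RFC 4226's HOTP), and it is what Google Authenticator implements. A minimal sketch using only the Python standard library:]

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, when=None, digits=6, step=30):
    """Time-based one-time password (RFC 6238): HMAC-SHA1 over a
    30-second counter, dynamically truncated to a short decimal code."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if when is None else when) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's test secret at t=59 seconds yields the 6-digit code 287082.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, when=59))  # → 287082
```

The server and the phone share the secret once (the QR-code enrollment) and thereafter derive matching codes independently, which is why this factor keeps working with no network connection.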
183 00:11:27,600 --> 00:11:29,620 >> But the other factor to think about is as I mentioned 184 00:11:29,620 --> 00:11:33,870 these Internet parties that you are trusting with your credentials--with your identity. 185 00:11:34,940 --> 00:11:38,330 I'll give you 2 examples; Larry Ellison and Mark Zuckerberg--they are both 186 00:11:38,330 --> 00:11:43,870 on record stating privacy is largely an illusion 187 00:11:43,870 --> 00:11:46,150 and that the age of privacy is over. 188 00:11:46,940 --> 00:11:50,450 That is kind of a sad indictment that we really need to wait 189 00:11:50,450 --> 00:11:55,230 for the government to step in to force these parties to be more secure, 190 00:11:55,620 --> 00:11:59,820 to introduce more legislation because when we try to work with 191 00:11:59,820 --> 00:12:06,110 these vendors--for instance some of these Dropbox-like parties-- 192 00:12:06,110 --> 00:12:08,890 they are in the business of providing services to the consumer. 193 00:12:08,890 --> 00:12:13,320 They are not directly interested in having enterprise-grade security controls. 194 00:12:13,540 --> 00:12:15,350 The consumers voted with their wallets, 195 00:12:15,350 --> 00:12:17,690 and they have already accepted a minimum grade. 196 00:12:18,440 --> 00:12:20,620 It is time to change that thinking. 197 00:12:21,540 --> 00:12:26,320 So when we provide our data to these parties, we need to co-opt our 198 00:12:26,320 --> 00:12:29,430 existing trust mechanisms; we are social creatures by default. 199 00:12:29,430 --> 00:12:32,720 >> So why all of a sudden when we start putting the data online 200 00:12:32,720 --> 00:12:36,880 do we not have access to the same protections we have in person? 201 00:12:36,880 --> 00:12:40,110 So when I can read your body language, when I can choose to 202 00:12:40,110 --> 00:12:45,030 network with a social circle and indeed to that circle divulge 203 00:12:45,030 --> 00:12:47,560 just the information that I want to. 
204 00:12:48,420 --> 00:12:52,260 So we have access to this body language, expression, to vocalize; 205 00:12:52,260 --> 00:12:55,720 we have access to these identity proximity protections 206 00:12:55,720 --> 00:12:58,410 in a physical location; they are still developing online. 207 00:12:58,410 --> 00:13:01,210 We do not have access to them, but we are starting to see them. 208 00:13:01,210 --> 00:13:05,240 So we have facets in Facebook--for instance--like groups. 209 00:13:05,240 --> 00:13:08,040 We have access to things in Google+ like circles. 210 00:13:08,460 --> 00:13:10,490 Absolutely use them. 211 00:13:10,890 --> 00:13:15,700 So the last thing you want to see--in this space in particular, 212 00:13:15,700 --> 00:13:20,170 when you go to get a job--is that you have now made a lot of your 213 00:13:20,170 --> 00:13:22,850 personality public. 214 00:13:22,850 --> 00:13:26,540 And when somebody wants to--should they choose to--it might be part 215 00:13:26,540 --> 00:13:29,330 of company policy or not--it is certainly not part of Harvard's-- 216 00:13:29,330 --> 00:13:31,850 but they may choose to do a Google search. 217 00:13:32,210 --> 00:13:35,940 And when they do so--if you provided--let us say some information 218 00:13:35,940 --> 00:13:40,090 which you would have difficulty standing behind-- 219 00:13:40,090 --> 00:13:42,830 you have done yourself a disservice. 220 00:13:43,530 --> 00:13:48,060 And indeed as I mentioned--these social companies have a vested interest 221 00:13:48,060 --> 00:13:50,460 in making it public--you know--they need to mine your data. 222 00:13:50,460 --> 00:13:55,060 They are selling your demographics and your marketing material to someone. 223 00:13:55,060 --> 00:13:58,710 The kind of analogy in this space is--if you are not paying for a product, 224 00:13:58,710 --> 00:14:00,740 are you the product? 
225 00:14:04,470 --> 00:14:08,560 So create circles for your friends, be cautious, be diligent, 226 00:14:08,560 --> 00:14:10,590 try not to make everything public. 227 00:14:10,590 --> 00:14:14,570 >> Another analogy I will make is end-user license agreements 228 00:14:14,570 --> 00:14:18,210 change; they are going to tell you what they can do with your data, 229 00:14:18,210 --> 00:14:20,800 and they are going to bury it in a 50-page click-through. 230 00:14:21,320 --> 00:14:24,200 And they can choose to change that, and they just send you a quick email. 231 00:14:24,200 --> 00:14:26,600 But you are not a lawyer; it is very much in legalese. 232 00:14:26,600 --> 00:14:28,640 You need to be cautious of what you're doing. 233 00:14:28,640 --> 00:14:31,810 They may own your pictures; they may own your intellectual property. 234 00:14:31,810 --> 00:14:33,950 You know--just exercise diligence. 235 00:14:33,950 --> 00:14:39,690 Another example: the Library of Congress is archiving every single tweet known to man. Everything. 236 00:14:39,690 --> 00:14:44,130 Every 10 years roughly, the body of material that is generated 237 00:14:44,130 --> 00:14:49,970 in that 10 years equals or greatly outpaces everything we've 238 00:14:49,970 --> 00:14:52,510 created throughout human history. 239 00:14:52,890 --> 00:14:56,070 The Library of Congress has a vested interest in preserving that information 240 00:14:56,070 --> 00:15:01,190 for posterity, for future archivists, for future researchers and historians, 241 00:15:01,190 --> 00:15:03,390 so everything you are putting out there is there. 242 00:15:03,390 --> 00:15:06,010 It will actually make an immense resource at some point 243 00:15:06,010 --> 00:15:10,420 once people start to mine social engineering or social networking sites. 244 00:15:12,050 --> 00:15:15,170 So keep apprised of the protections available within each application. 
245 00:15:15,170 --> 00:15:18,380 >> There is something I will mention as well; there is a third party tool 246 00:15:18,380 --> 00:15:22,320 called Privacyfix; it can plug right in to some of these 247 00:15:22,320 --> 00:15:24,390 social networking applications. 248 00:15:24,390 --> 00:15:27,000 And it can check to see where you are with respect to the protections 249 00:15:27,000 --> 00:15:29,930 that are available on them and whether you can choose to ratchet them up further. 250 00:15:31,110 --> 00:15:34,590 There are tools like the Data Liberation Front from Google 251 00:15:34,590 --> 00:15:39,420 where you can choose to export or extract your data. 252 00:15:39,420 --> 00:15:41,870 There are things like the Internet Suicide Machine which will log on 253 00:15:41,870 --> 00:15:45,230 to some of your profiles and actually delete every single attribute 254 00:15:45,230 --> 00:15:49,350 one at a time, untag every single association that friends in your network would have made. 255 00:15:49,350 --> 00:15:53,310 And it will proceed to iteratively purge everything about you 256 00:15:53,310 --> 00:15:55,360 that that site would know. 257 00:15:58,430 --> 00:16:01,840 If I can just exercise some caution there as well; there was an instance 258 00:16:01,840 --> 00:16:06,740 a couple of years ago in Germany where a citizen decided to 259 00:16:06,740 --> 00:16:11,590 exercise his freedom of information rights and ask Facebook to provide 260 00:16:11,590 --> 00:16:15,130 what information they had on record for him even after he deleted his account. 261 00:16:15,130 --> 00:16:20,070 They provided him with a CD with 1,250 pages of information 262 00:16:20,070 --> 00:16:22,650 even though his account theoretically no longer existed. 263 00:16:23,020 --> 00:16:26,130 There is the concept in this space a lot that some of these 264 00:16:26,130 --> 00:16:31,440 entities will maintain some data about you to do with your associations and your networks. 
265 00:16:33,090 --> 00:16:37,350 They say that they cannot have control over it; that is a bit of a stretch in my opinion. 266 00:16:38,010 --> 00:16:41,570 They create these shadow accounts--the shadow personas. 267 00:16:41,570 --> 00:16:43,880 Just be careful. 268 00:16:45,260 --> 00:16:47,290 Limit what you can. 269 00:16:47,680 --> 00:16:50,830 At an actual device level when you are just talking about-- 270 00:16:50,830 --> 00:16:56,020 you know--hardware--your smartphone, your tablets, 271 00:16:56,020 --> 00:17:00,220 your workstation, your laptop, perhaps a server that you are responsible for. 272 00:17:00,220 --> 00:17:04,740 >> You have probably heard about concepts like operating system updates, 273 00:17:04,740 --> 00:17:08,720 application updates, antivirus; you've heard of things like firewalls, 274 00:17:08,720 --> 00:17:11,770 disk encryption, and backup. 275 00:17:11,770 --> 00:17:14,190 The one thing you should be aware of is you don't hear about 276 00:17:14,190 --> 00:17:16,900 those kind of protections in the mobile phone space. 277 00:17:16,900 --> 00:17:19,730 They are just as susceptible to the same threats. 278 00:17:19,730 --> 00:17:23,280 We had--I want to say--a million smartphones are going to be 279 00:17:23,280 --> 00:17:25,380 activated by the end of this month. 280 00:17:25,380 --> 00:17:28,640 Within the short amount of time that 281 00:17:28,640 --> 00:17:30,640 they have been available, that has vastly outpaced the growth of 282 00:17:30,640 --> 00:17:32,740 the PC, the laptop, the workstation market. 283 00:17:33,260 --> 00:17:35,520 But we do not have access to the same controls, and I 284 00:17:35,520 --> 00:17:37,570 will talk about that shortly. 285 00:17:37,800 --> 00:17:41,320 So before we get to the mobile phone space let us talk about 286 00:17:41,320 --> 00:17:44,150 what is available there that I just briefly went over. 
287 00:17:44,150 --> 00:17:48,160 So antivirus software--here are some free choices. 288 00:17:49,240 --> 00:17:55,430 Microsoft gives away theirs--you know--Sophos gives away theirs for OSX as well. 289 00:17:56,800 --> 00:17:59,120 Patch your computer--just be aware of whatever your vendor's 290 00:17:59,120 --> 00:18:02,310 current patch level is, and you shouldn't be a significant delta from that. 291 00:18:02,310 --> 00:18:04,860 There is a nice tool from a company called Secunia. 292 00:18:04,860 --> 00:18:07,740 And Secunia will run in the background, and it will tell you if there's an 293 00:18:07,740 --> 00:18:09,970 update available and if you need to apply it. 294 00:18:10,470 --> 00:18:14,840 >> Enable automatic updates--both Apple and Microsoft will have some aspect of this. 295 00:18:14,840 --> 00:18:17,170 They will alert you that there is an update available. 296 00:18:18,430 --> 00:18:22,610 And Secunia--you know--is kind of a nice safety net to have as well--a fall-back mechanism. 297 00:18:23,190 --> 00:18:26,210 At the host layer--not getting to smartphones yet-- 298 00:18:26,880 --> 00:18:30,280 enable the firewall native to the operating system. 299 00:18:31,080 --> 00:18:34,130 There is some information about the Windows one and the OSX one. 300 00:18:35,450 --> 00:18:39,870 Test your firewall; do not just leave it there and think that it is a secure mechanism. 301 00:18:39,870 --> 00:18:43,670 Take an active role; there is an application there from GRC--Steve Gibson. 302 00:18:44,490 --> 00:18:49,470 Wi-Fi security in this space--this can also apply to the smartphone and the tablet-- 303 00:18:49,470 --> 00:18:52,900 when you are choosing to go on the road you need to be aware 304 00:18:52,900 --> 00:18:55,910 that there are different classes of wireless network. 305 00:18:55,910 --> 00:19:00,680 And in particular do not choose the most commonly available one. 
306 00:19:00,680 --> 00:19:02,850 It might be low cost, but there might be a reason for that. 307 00:19:02,850 --> 00:19:05,080 Perhaps they are mining your data. 308 00:19:05,080 --> 00:19:08,070 We see this more when you are traveling internationally. 309 00:19:08,070 --> 00:19:13,650 There are some really highly efficient cyber criminal syndicates 310 00:19:13,650 --> 00:19:18,140 that are able to leverage what we typically see in nation-state espionage. 311 00:19:18,930 --> 00:19:22,750 A factor where they are outright injecting themselves into a network stream. 312 00:19:22,750 --> 00:19:25,690 They are pulling stuff out of there, and they are injecting 313 00:19:25,690 --> 00:19:29,050 applications on to your workstations. 314 00:19:29,050 --> 00:19:34,030 >> It is--the other aspect that I know was mentioned in some of these 315 00:19:34,030 --> 00:19:38,430 security seminars--or not seminars--CS50 seminars--is a tool called Firesheep. 316 00:19:38,430 --> 00:19:42,470 And Firesheep was a particular attack in the mobile phone space 317 00:19:42,470 --> 00:19:47,920 where some of these social networking applications were sending credentials in plain text. 318 00:19:48,370 --> 00:19:52,380 And this was quite commonly accepted because everyone at that time 319 00:19:52,380 --> 00:19:56,090 was thinking that there was no appetite in the consumer space for it, 320 00:19:56,090 --> 00:20:01,710 that to use higher strength encryption implied a performance burden 321 00:20:01,710 --> 00:20:06,240 on the server, so if they did not have to do it--they did not want to. 322 00:20:06,820 --> 00:20:09,490 And then all of a sudden when this security researcher made 323 00:20:09,490 --> 00:20:13,690 the attack trivial very quickly--you know--we started to see that kind of 324 00:20:13,690 --> 00:20:16,100 improvement that everybody in the security space had 325 00:20:16,100 --> 00:20:19,260 been complaining about for a significant length of time. 
326 00:20:19,260 --> 00:20:22,950 So--in particular--Firesheep was able to retrieve Facebook and Twitter 327 00:20:22,950 --> 00:20:25,010 credentials from the Wi-Fi stream. 328 00:20:25,240 --> 00:20:28,830 And because it was in plain text, they were able to inject. 329 00:20:28,830 --> 00:20:31,700 >> Again, if you are going to use Wi-Fi choose to use one that 330 00:20:31,700 --> 00:20:35,030 is sufficiently protected--WPA2 if you can. 331 00:20:35,670 --> 00:20:39,390 If you have to use unencrypted Wi-Fi--and in particular I am talking 332 00:20:39,390 --> 00:20:42,420 to anybody that is using the Harvard University wireless-- 333 00:20:42,420 --> 00:20:45,520 you may want to think about using VPN. I highly encourage it. 334 00:20:46,230 --> 00:20:49,620 Other factors you may want to think about are if you do not trust the Wi-Fi 335 00:20:49,620 --> 00:20:51,840 that you are on you may want to limit use. 336 00:20:51,840 --> 00:20:54,730 Do not do any e-commerce; do not do any banking. 337 00:20:54,730 --> 00:20:57,060 Do not access your university credentials. 338 00:20:57,730 --> 00:20:59,850 There is a major win in this space if somebody 339 00:20:59,850 --> 00:21:03,540 does steal your credentials--you know--do they have your mobile phone? 340 00:21:03,540 --> 00:21:07,850 So--you know--that is another factor that they cannot necessarily hijack 341 00:21:07,850 --> 00:21:12,040 or that just makes their attack more complicated. 342 00:21:12,950 --> 00:21:14,950 Encrypt your hard disk. 343 00:21:14,950 --> 00:21:17,650 We are at an era right now--encryption used to be a big deal 10 years ago. 344 00:21:17,650 --> 00:21:19,950 It was a significant performance impact. 345 00:21:19,950 --> 00:21:24,290 It is no longer--in fact--most of the mobile phones and that kind of stuff 346 00:21:24,290 --> 00:21:26,920 are doing it in hardware, and you don't even notice-- 347 00:21:26,920 --> 00:21:28,990 the performance is so negligible. 
348 00:21:28,990 --> 00:21:31,720 >> If you are talking about a workstation, we are talking about BitLocker. 349 00:21:31,720 --> 00:21:35,500 We are talking about FileVault; enable it--take the time now. 350 00:21:35,500 --> 00:21:39,430 Obviously TrueCrypt can work across both of those platforms. 351 00:21:39,430 --> 00:21:42,400 You may want to think about--in the Linux space--there is dm-crypt, 352 00:21:42,400 --> 00:21:46,470 there is LUKS--there are a bunch of other options--also TrueCrypt. 353 00:21:46,850 --> 00:21:49,970 Another quick way to protect yourself at the workstation level: 354 00:21:49,970 --> 00:21:52,000 back up your hard disk. 355 00:21:52,000 --> 00:21:56,130 And one slight wrinkle here--it is not sufficient to use one of 356 00:21:56,130 --> 00:22:01,410 these Cloud synchronization providers, such as Dropbox or Google Drive. 357 00:22:01,410 --> 00:22:03,410 That is not a backup solution. 358 00:22:03,410 --> 00:22:05,410 If somebody deletes something on one of these devices-- 359 00:22:05,410 --> 00:22:08,280 because they inserted themselves somehow-- 360 00:22:08,280 --> 00:22:11,170 that deletion gets replicated across your entire persona. 361 00:22:11,170 --> 00:22:15,310 That is not a backup; that is just a propagation mechanism. 362 00:22:15,310 --> 00:22:17,310 So it is good to have a backup solution. 363 00:22:17,310 --> 00:22:19,890 There are some suggestions here; some of them are free-- 364 00:22:19,890 --> 00:22:23,100 capacity based--2 gigs of backup--you can do it. 365 00:22:23,100 --> 00:22:30,040 If you are using university Gmail--university Google Apps--Google Drive, 366 00:22:30,040 --> 00:22:32,490 if it is not available already, will be available soon. 367 00:22:32,490 --> 00:22:34,490 It is a good replacement. 368 00:22:34,490 --> 00:22:37,370 You might also look at things like Mozy Home. 369 00:22:37,370 --> 00:22:39,600 It is good to have 2 solutions.
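[The sync-versus-backup distinction above can be sketched in a few lines of Python--the file names and snapshot label are made up for illustration. A sync mirror makes the copy identical to the source, so a deletion propagates; an independent snapshot survives it:]

```python
import shutil, tempfile
from pathlib import Path

def sync(src: Path, mirror: Path) -> None:
    """Make the mirror identical to the source -- deletions propagate."""
    if mirror.exists():
        shutil.rmtree(mirror)
    shutil.copytree(src, mirror)

def backup(src: Path, vault: Path, label: str) -> None:
    """Keep an independent snapshot -- old snapshots are never touched."""
    shutil.copytree(src, vault / label)

root = Path(tempfile.mkdtemp())
src, mirror, vault = root / "src", root / "mirror", root / "vault"
src.mkdir(); vault.mkdir()
(src / "thesis.txt").write_text("important data")

sync(src, mirror)
backup(src, vault, "2012-11-01")

# An attacker (or a slip of the finger) deletes the file at the source...
(src / "thesis.txt").unlink()
sync(src, mirror)  # ...and the deletion propagates to the mirror.

print((mirror / "thesis.txt").exists())                # gone from the mirror
print((vault / "2012-11-01" / "thesis.txt").exists())  # snapshot survives
```

[That is the whole point of keeping a real backup alongside any Dropbox-style synchronization.]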
370 00:22:40,170 --> 00:22:42,300 Do not have all of your eggs in one basket. 371 00:22:44,230 --> 00:22:47,410 If you are disposing of something, or indeed if you are in the process 372 00:22:47,410 --> 00:22:51,480 of sending something confidential--some suggestions here to 373 00:22:51,480 --> 00:22:53,560 securely erase a device. 374 00:22:53,560 --> 00:23:00,340 Darik's Boot and Nuke--that is kind of more for the IT savvy. 375 00:23:01,110 --> 00:23:03,290 You may want to think about just giving it to some of these 376 00:23:03,290 --> 00:23:05,740 commercial providers if you can. 377 00:23:05,740 --> 00:23:10,210 >> Encrypting email--if you have to--there is a service on campus 378 00:23:10,210 --> 00:23:14,600 called Accellion; off campus, or for personal use, I will recommend Hushmail. 379 00:23:15,680 --> 00:23:19,690 We see it used a lot by whistleblowers; it is one of the main 380 00:23:19,690 --> 00:23:21,900 mechanisms for WikiLeaks, 381 00:23:22,950 --> 00:23:25,140 as well as Tor and some other equivalents. 382 00:23:26,130 --> 00:23:30,360 And--now to talk about the phone level--the problem here is 383 00:23:30,360 --> 00:23:32,440 there is not that much of an appetite yet. 384 00:23:32,440 --> 00:23:35,940 Unfortunately most of the smartphone and tablet OSs 385 00:23:35,940 --> 00:23:40,020 are still based on some of the principles that we saw in the 1990s. 386 00:23:40,020 --> 00:23:43,730 They have not really incorporated some of the improvements 387 00:23:43,730 --> 00:23:46,400 that we see at the workstation level. They are not doing heap protection. 388 00:23:46,400 --> 00:23:50,120 They are not doing--you know--address space layout randomization. 389 00:23:50,120 --> 00:23:52,360 They are not doing address protection. 390 00:23:52,360 --> 00:23:54,490 They are not doing execute protection--that kind of stuff.
391 00:23:55,210 --> 00:23:58,550 But also the device itself by default is not going to have any 392 00:23:58,550 --> 00:24:00,750 end point security built into it. 393 00:24:00,750 --> 00:24:04,460 So we are starting to see this change--again--with most of the smartphone 394 00:24:04,460 --> 00:24:09,680 platforms--Android, Apple, and Windows--the appetite just 395 00:24:09,680 --> 00:24:11,690 wasn't there; the benchmark was BlackBerry. 396 00:24:11,690 --> 00:24:15,460 But BlackBerry has kind of lost its traction in the marketplace at this point. 397 00:24:15,460 --> 00:24:17,820 And Apple has really stepped in. 398 00:24:17,820 --> 00:24:20,760 About 2 years ago there was a watershed moment where they 399 00:24:20,760 --> 00:24:24,300 started to build in a lot more enterprise-type controls. 400 00:24:24,300 --> 00:24:29,780 And--indeed--in August they did a presentation at DEF CON, which was just unheard of. 401 00:24:31,860 --> 00:24:34,420 >> So they will do the minimum controls that I described. 402 00:24:34,420 --> 00:24:38,950 They will do a strong password; they will prompt for that password on idle-- 403 00:24:38,950 --> 00:24:42,750 you forget about the device, and after 15 minutes the lock activates. 404 00:24:43,170 --> 00:24:47,240 They will do encryption, and they will also do what is called remote wiping. 405 00:24:48,200 --> 00:24:53,740 In the Android and the Windows space these are still TBD--to be determined. 406 00:24:53,740 --> 00:24:58,830 Android has access to some applications called Prey and Lookout. 407 00:24:58,830 --> 00:25:02,240 And indeed some of the end point security tools like Kaspersky I know do it. 408 00:25:02,240 --> 00:25:04,240 I know ESET does it as well. 409 00:25:04,240 --> 00:25:07,350 They will let you send an SMS text and purge the device. 410 00:25:08,370 --> 00:25:12,070 Windows Phone at this point is primarily oriented toward 411 00:25:12,070 --> 00:25:15,310 the corporate space--what is called Exchange.
412 00:25:15,310 --> 00:25:19,430 Exchange is a robust mail infrastructure, and it can mandate some of these controls. 413 00:25:19,430 --> 00:25:25,280 Windows 8 just shipped last week, so I cannot speak to that definitively. 414 00:25:25,280 --> 00:25:29,020 Windows Mobile 6.5 was great on security. 415 00:25:29,020 --> 00:25:34,650 Windows Phone 7 was a disaster; they did not make all these native controls 416 00:25:34,650 --> 00:25:36,970 mandatory across the different vendors. 417 00:25:36,970 --> 00:25:43,050 So you had to ratify each Windows Mobile 7 phone one at a time. 418 00:25:43,050 --> 00:25:47,190 >> Android--since the 3.0 release--has had a major improvement as well. 419 00:25:47,190 --> 00:25:53,450 Honeycomb, Ice Cream Sandwich, Jelly Bean--they will support these minimum controls, 420 00:25:53,450 --> 00:25:58,860 and indeed they will support some of the enterprise controls as well. 421 00:25:59,100 --> 00:26:03,560 In your personal account space there is a Google personal sync that 422 00:26:03,560 --> 00:26:06,370 you can enable if you have your own Google space as well. 423 00:26:10,690 --> 00:26:15,620 So what do you do when it all goes horribly wrong? 424 00:26:15,620 --> 00:26:19,900 And if I can--another takeaway from this is that it is really when--it is not if. 425 00:26:19,900 --> 00:26:24,380 This is going to happen to all of us at some point. What can you do? 426 00:26:24,380 --> 00:26:28,650 So what you can do--and there is a slide--the next slide will 427 00:26:28,650 --> 00:26:31,310 point you to some of the FTC resources for it, 428 00:26:31,310 --> 00:26:35,270 but at a bare minimum, place a fraud alert on your credit cards.
429 00:26:35,270 --> 00:26:38,980 If I can encourage you to think about when you are using a credit card 430 00:26:38,980 --> 00:26:43,320 in an online capacity--depending on the transaction you are making-- 431 00:26:43,740 --> 00:27:51,020 with debit cards, the window to claim or to retract a fraudulent 432 00:26:51,020 --> 00:26:54,920 charge is actually much smaller than it is on a credit card. 433 00:26:55,330 --> 00:26:57,950 So once you get your report on a debit card you only have a certain 434 00:26:57,950 --> 00:27:02,940 time frame--and it is very low--to notify the bank of a fraudulent transaction. 435 00:27:02,940 --> 00:27:07,830 With credit cards it is much larger; there tends to be a limit up to about $50,000 436 00:27:11,020 --> 00:27:13,360 before they will really be able to reimburse you. 437 00:27:14,060 --> 00:27:18,840 So that is quite a lot of money; they bumped it up from about $13,000 or $18,000 quite recently. 438 00:27:18,840 --> 00:27:21,870 So--you know--when you think about using a credit card online, 439 00:27:21,870 --> 00:27:27,980 can you think about using a top-up card or a disposable credit card, a burner card? 440 00:27:28,660 --> 00:27:32,130 >> If you do see anything--and I will show you how you can get access shortly-- 441 00:27:32,130 --> 00:27:35,500 close any fraudulent accounts if you are made aware of them. 442 00:27:35,880 --> 00:27:38,180 File a police report if you are on campus. 443 00:27:38,180 --> 00:27:41,200 Reach out to HUPD--let them know. 444 00:27:42,870 --> 00:27:45,790 Think about an identity monitoring service. 445 00:27:45,790 --> 00:27:50,580 If you do get compromised, 446 00:27:50,580 --> 00:27:53,240 the compromised party may fund an identity protection service. 447 00:27:53,240 --> 00:27:56,680 If they do not, perhaps you should do it yourself.
448 00:27:56,950 --> 00:28:00,880 Collect and keep all evidence--in particular any discussions you have had 449 00:28:00,880 --> 00:28:03,180 with any criminal authorities, 450 00:28:04,190 --> 00:28:06,840 particularly for insurance purposes. 451 00:28:06,840 --> 00:28:09,030 Change all of your passwords. 452 00:28:09,030 --> 00:28:13,050 Change the answers to any security questions that can be used to reset your password. 453 00:28:13,860 --> 00:28:16,580 Disable any linked identity services. 454 00:28:16,580 --> 00:28:20,170 So if you are reusing your Facebook account to log on to Twitter or vice versa, 455 00:28:20,170 --> 00:28:27,240 break that link; if the compromise involved your email account, 456 00:28:27,240 --> 00:28:29,590 check to see if anything is being forwarded. 457 00:28:30,690 --> 00:28:33,200 Because otherwise they still have access to your data. 458 00:28:33,600 --> 00:28:39,840 And if the theft includes your Harvard account please notify IThelp@harvard.edu. 459 00:28:39,840 --> 00:28:44,300 I cannot state that enough, but also in particular if the device gets lost or 460 00:28:44,300 --> 00:28:47,340 stolen, and it had access to your university data, and perhaps you 461 00:28:47,340 --> 00:28:50,660 did not have some of these protections in place, please let us know-- 462 00:28:50,660 --> 00:28:53,980 HUPD and IT Help at Harvard. 463 00:28:55,080 --> 00:28:58,110 >> So the link that I just mentioned goes into that with more detail: 464 00:28:58,110 --> 00:29:02,650 FTC.gov/identitytheft. 465 00:29:02,650 --> 00:29:08,260 The Postal Service also has some fraud or identity protection services-- 466 00:29:08,260 --> 00:29:12,400 you can put a hold or a stop on credit cards going through the mail, or stuff like that. 467 00:29:12,810 --> 00:29:16,950 The FBI has a link as well; it is in the notes of the slides that I sent out.
468 00:29:16,950 --> 00:29:20,450 And indeed the Massachusetts Better Business Bureau and 469 00:29:20,450 --> 00:29:25,050 Consumer Protection Bureau have some guidance as well; it is in the notes. 470 00:29:25,520 --> 00:29:31,770 Take the time now, make yourself aware of what you can do, and take the action. 471 00:29:31,770 --> 00:29:37,150 The principle--as I mentioned earlier--is that if you do not have a plan 472 00:29:37,150 --> 00:29:43,010 for your identity being stolen, you are immediately going to be 473 00:29:43,010 --> 00:29:46,970 subject to a lot of work when it does happen, and it is when. 474 00:29:48,030 --> 00:29:50,910 But even when you take these precautions--let me just add a 475 00:29:50,910 --> 00:29:56,190 slight word of caution--no plan survives first contact with the enemy. 476 00:29:56,190 --> 00:30:02,770 So even at that we still think that there can be some subversion--you know-- 477 00:30:02,770 --> 00:30:06,640 your bank, for instance, which you have built all these protections around, 478 00:30:06,640 --> 00:30:10,690 may get compromised; these trusted parties that you have given your data to. 479 00:30:11,230 --> 00:30:15,570 So you are your own best defense. 480 00:30:15,570 --> 00:30:17,960 You know--remain vigilant--remain alert. 481 00:30:17,960 --> 00:30:22,570 Take the time now to choose to opt in to these; hopefully socialize 482 00:30:22,570 --> 00:30:24,920 this; talk about this with your friends. 483 00:30:24,920 --> 00:30:28,880 Pick good passwords; use unique passwords for your accounts. 484 00:30:29,570 --> 00:30:33,260 And do not reuse passwords--in particular--around some of 485 00:30:33,260 --> 00:30:36,630 your more sensitive assets; do not use your university account elsewhere. 486 00:30:36,630 --> 00:30:39,350 Do not use your credit card account elsewhere. 487 00:30:39,350 --> 00:30:42,020 Password protect your mobile device right now.
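[The "good, unique password per account" advice above can be sketched in a few lines of Python--the length, alphabet, and account names here are illustrative choices, not a policy. The point is that one distinct random password per account stops a single compromise from cascading:]

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets (unlike the random module) uses a cryptographically
    # secure random source, which is what you want for passwords.
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One distinct password per account, never reused across them.
accounts = ["university", "bank", "email"]
passwords = {account: generate_password() for account in accounts}
for account, pw in passwords.items():
    print(account, pw)
```

[In practice a password manager does this for you; the sketch just shows why generated, per-account passwords beat one memorized password reused everywhere.]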
488 00:30:42,020 --> 00:30:48,430 And by mobile device I mean your smartphone, I mean your tablet. 489 00:30:48,430 --> 00:30:51,250 >> Think about using good security reset questions, and I will talk 490 00:30:51,250 --> 00:30:54,120 shortly about why; check your credit report. 491 00:30:54,120 --> 00:30:58,040 Another way that you can be a good citizen in this space: 492 00:30:58,040 --> 00:31:05,350 the government forced the 3 agencies--Experian, TransUnion, and Equifax-- 493 00:31:05,350 --> 00:31:07,460 to release credit reports. 494 00:31:07,460 --> 00:31:10,270 For some of the Harvard community, especially in the student space, 495 00:31:10,270 --> 00:31:13,260 this might be new to them, but you are allowed to pull from those 496 00:31:13,260 --> 00:31:16,510 agencies at least once a year. 497 00:31:17,180 --> 00:31:20,420 A good precaution--go on to that site; it is available via the FTC one. 498 00:31:20,420 --> 00:31:23,260 Pull from one agency every 4 months instead, and you are able to keep 499 00:31:23,260 --> 00:31:28,130 tabs on who is soliciting requests for your credit card information, 500 00:31:28,130 --> 00:31:31,060 or indeed if anybody opens any fraudulent accounts. 501 00:31:31,430 --> 00:31:34,450 And--in general--the guidance is to be aware. 502 00:31:34,450 --> 00:31:37,120 And I am going to give you a specific example shortly, 503 00:31:37,120 --> 00:31:40,510 but that is essentially the meat and potatoes of the discussion. 504 00:31:41,110 --> 00:31:43,810 >> So why this is important right now: during the summer there was a 505 00:31:43,810 --> 00:31:47,200 gentleman called Matt Honan--if you are out there, thank you very much 506 00:31:47,200 --> 00:31:49,920 for being so forthcoming with your information. 507 00:31:50,360 --> 00:31:55,840 But what happened with Matt is that he worked for Wired Magazine, 508 00:31:55,840 --> 00:31:59,530 and some cyber hacktivists went after his Twitter account.
509 00:32:00,070 --> 00:32:03,630 And they used some of these resources--some of this public persona 510 00:32:03,630 --> 00:32:06,740 that he made available. 511 00:32:06,740 --> 00:32:11,170 And they built a map; they knew where to attack and when. 512 00:32:11,980 --> 00:32:15,400 So from that they started to slice and dice the information that he made 513 00:32:15,400 --> 00:32:17,440 available, and they found that he had a Gmail account. 514 00:32:17,890 --> 00:32:21,580 He was using a less than wise password for his Gmail, 515 00:32:21,580 --> 00:32:24,890 and he did not have any multi-factor authentication on it. 516 00:32:24,890 --> 00:32:27,800 So they compromised his Gmail; once they had access to his Gmail 517 00:32:27,800 --> 00:32:31,390 they saw all these other accounts that he had plugged into his Gmail. 518 00:32:31,820 --> 00:32:35,760 Indeed, they had access to his entire Gmail or Google persona. 519 00:32:37,230 --> 00:32:40,850 And--in particular--they started to notice that he had an Amazon account, 520 00:32:40,850 --> 00:32:44,700 because there were some emails being sent to him. 521 00:32:44,930 --> 00:32:47,540 So then they got on to his Amazon, and they got on to his Amazon 522 00:32:47,540 --> 00:32:50,800 by just resetting his password, because the reset went to his Gmail. 523 00:32:51,940 --> 00:32:56,430 He kind of had a domino effect or credential chaining going on here, 524 00:32:56,430 --> 00:33:00,090 where once they got his Gmail they had the keys to the kingdom. 525 00:33:00,320 --> 00:33:03,950 So once they got on to his Amazon--and this was through no fault 526 00:33:03,950 --> 00:33:07,010 of these other providers--this was--you know--Matt had not chosen to 527 00:33:07,010 --> 00:33:10,640 opt in to the more secure mechanisms that these providers, 528 00:33:12,050 --> 00:33:14,230 and all of these Internet services, had made available.
529 00:33:14,230 --> 00:33:18,340 >> So once they got on to his Amazon they had access--it did not show them 530 00:33:18,340 --> 00:33:20,420 his credit card, but it showed them the last 4 digits 531 00:33:20,420 --> 00:33:24,280 just so he knew what it was; it showed them his shipping address. 532 00:33:24,280 --> 00:33:26,620 It showed them some other information about some orders he had placed. 533 00:33:26,620 --> 00:33:29,790 And then from that they decided to attack his Apple account. 534 00:33:30,860 --> 00:33:33,170 And they social engineered the Apple help desk. 535 00:33:33,640 --> 00:33:36,920 Apple should not have done it, but they were convinced based on the information 536 00:33:36,920 --> 00:33:39,990 that the attackers were able to mine from the other 2 accounts. 537 00:33:41,040 --> 00:33:43,310 You know--the guy at the help desk probably thought he was being 538 00:33:43,310 --> 00:33:46,730 a good citizen--you know--I am being helpful; there is an Apple customer 539 00:33:46,730 --> 00:33:50,370 out there that is stranded on his own, and I need to help him. 540 00:33:51,340 --> 00:33:53,680 But it was not the real Apple customer. 541 00:33:53,680 --> 00:33:56,920 So they reset his Apple account, and they sent the information to the Gmail. 542 00:33:56,920 --> 00:34:00,580 Once the attackers had access to his Apple account-- 543 00:34:00,580 --> 00:34:04,390 Matt had all of his devices tied into his iCloud, 544 00:34:04,390 --> 00:34:08,600 and they started issuing remote wipe requests and wiping everything. 545 00:34:08,989 --> 00:34:14,530 Again, his data was just propagated; he was using iCloud as the synchronization mechanism. 546 00:34:14,530 --> 00:34:17,800 So when they deleted it, everything went bang. 547 00:34:18,600 --> 00:34:21,010 They still had access at this point to his Twitter account, which is what 548 00:34:21,010 --> 00:34:23,770 they had tried to attack.
549 00:34:24,739 --> 00:34:26,980 I do not know if they used Maltego or some of these other mechanisms 550 00:34:26,980 --> 00:34:31,710 to build out his Internet persona, but--you know--in due 551 00:34:31,710 --> 00:34:34,429 course they got access to 4 different identity services before 552 00:34:34,429 --> 00:34:36,790 they got to his Twitter, and it cost Matt-- 553 00:34:36,790 --> 00:34:39,350 Matt was quite lucky he saw it happen, because his kids came to him 554 00:34:39,350 --> 00:34:41,350 when the iPad locked itself. 555 00:34:41,350 --> 00:34:43,770 And they said--you know, "Dad, there is something going on with the iPad." 556 00:34:43,770 --> 00:34:48,050 And he shut everything down because he noticed it was happening everywhere. 557 00:34:48,389 --> 00:34:51,560 And he started calling Apple to see what the hell had just happened. 558 00:34:52,199 --> 00:34:54,840 And Apple genuinely thought that there was something going on-- 559 00:34:54,840 --> 00:34:58,170 that iCloud had gone rogue--until they figured out-- 560 00:34:58,170 --> 00:35:01,380 he actually figured out that they were sending information, and 561 00:35:01,380 --> 00:35:03,380 they started calling him by the wrong name. 562 00:35:03,380 --> 00:35:09,200 Because Apple had on file information that the attacker had subverted. 563 00:35:09,990 --> 00:35:13,720 >> Okay--so that is the kind of information that we use to build this 564 00:35:13,720 --> 00:35:17,990 kind of best practice; we use this as part of a whole series of 565 00:35:17,990 --> 00:35:21,030 seminars through October--National Cybersecurity Awareness Month. 566 00:35:21,030 --> 00:35:23,530 It has been made available to you guys. 567 00:35:23,530 --> 00:35:28,160 I will make sure that I send it out on the Wiki when David makes it available to me as well.
568 00:35:28,160 --> 00:35:30,960 But there is advice and guidance in there much more granular than 569 00:35:30,960 --> 00:35:34,230 I am able to summarize in the short amount of time I have available, 570 00:35:34,230 --> 00:35:37,350 around what is called "Cloudy with a Chance of Identity Theft: 571 00:35:37,350 --> 00:35:39,400 Picking Good User Names and Passwords." 572 00:35:39,400 --> 00:35:42,700 "Is It Ever Not Social?" And the answer is no, it is always social, 573 00:35:42,700 --> 00:35:45,500 but you need to be aware of what that means. 574 00:35:47,020 --> 00:35:50,640 And there is "Taming Lions, Tigers, and Windows," which is about 575 00:35:50,640 --> 00:35:54,300 hardening operating systems with some of the information we went through today. 576 00:35:54,540 --> 00:35:57,320 And the last one was "Have Device, Will Travel," 577 00:35:57,320 --> 00:36:00,200 about going mobile with these kinds of data sources. 578 00:36:00,910 --> 00:36:03,710 So other than that, if you have any questions my email address is 579 00:36:03,710 --> 00:36:08,200 there, and if anybody in the room has any questions please raise your hand. 580 00:36:08,690 --> 00:36:10,910 Other than that, I am going to stop recording. 581 00:36:11,870 --> 00:36:16,000 All right. Done. 582 00:36:16,000 --> 00:36:19,190 [CS50.TV]