DAVID MALAN: This is CS50.

Hello, world. This is the CS50 podcast, episode 5, 0 indexed. My name is David Malan, and I'm here with CS50's own Colton Ogden.

COLTON OGDEN: Glad to be here-- interesting thing to start us off-- so, we've talked about robocalls a lot in the recent past, multiple episodes. And I think we touched briefly upon the prospect of finding a solution to this problem. You know, people are getting robocalls all the time, even though, in the last couple of weeks, I have noticed the numbers sort of dropping, at least for me, personally. I still get the occasional call from a presumed spoofed caller.

DAVID MALAN: Yeah, sorry about that.

COLTON OGDEN: But, apparently, the FCC-- Ajit Pai has proposed a ruling that would actually allow phone companies to block these unwanted calls, these spoofed calls, before they even get to potential customers.

DAVID MALAN: Yeah, no, this is a nice initiative. It's perhaps a little belated at this point, certainly, because, as we've discussed, these robocalls, these automated calls, have really been proliferating, in large part because of the software via which you can do this, and the API access with which you can do this. But I think the fundamental problem, frankly, is that the phone system that we have today really is not all that fundamentally different from what we've had for decades now, which is to say that there's no authentication of these calls in the first place. The systems generally just trust that the number being presented in caller ID is, in fact, the number from which a call came. And that's, of course, not always the case.

COLTON OGDEN: Right, and the-- I guess the proposed sort of authentication system that they're going to roll out is called SHAKEN/STIR, which is very akin to what James Bond says when he orders a martini. But the acronym is a-- basically, the SHAKEN part of it is Signature-based Handling of Asserted information using toKENs, and the STIR part is Secure Telephone Identity Revisited.
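To make the idea of "signing calls" a bit more concrete, here is a toy C sketch of the general concept: the originating side attaches a signature over the caller's number, and the receiving side verifies it before trusting the caller ID. This is not the actual SHAKEN/STIR protocol (which, in practice, involves signed tokens and carrier certificates); the sign() function below is a hypothetical, non-cryptographic stand-in, purely for illustration.

```c
// Purely illustrative sketch of call attestation: the originating carrier
// signs the caller's number, and the terminating carrier verifies that
// signature before trusting it. NOT the real SHAKEN/STIR protocol; sign()
// here is a toy stand-in for a real cryptographic signature.

#include <stdbool.h>
#include <stdio.h>
#include <string.h>

// Hypothetical attestation record attached to a call.
typedef struct
{
    char calling_number[16];
    unsigned long signature; // toy "signature" for illustration only
}
attestation;

// Toy stand-in for a real cryptographic signature (e.g., ECDSA).
unsigned long sign(const char *number, unsigned long carrier_key)
{
    unsigned long h = carrier_key;
    for (const char *p = number; *p != '\0'; p++)
    {
        h = h * 31 + (unsigned char) *p;
    }
    return h;
}

int main(void)
{
    unsigned long carrier_key = 0xC0FFEE; // shared toy key, for the sketch only

    // Originating side: attach an attestation to the call.
    attestation a;
    strcpy(a.calling_number, "+16175551234");
    a.signature = sign(a.calling_number, carrier_key);

    // Terminating side: recompute and compare before trusting caller ID.
    bool trusted = (sign(a.calling_number, carrier_key) == a.signature);
    printf("Caller ID %s: %s\n", a.calling_number,
           trusted ? "attested" : "unverified");
}
```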
DAVID MALAN: Indeed, it's a wonderful acronym if you allow yourself to use arbitrary letters from some of the words.

COLTON OGDEN: Yeah, and it's a bit of a mouthful. But this is cool, because this suggests that we'll actually get what you just alluded to, a way of actually signing calls and making sure that people who present themselves as xyz are in fact xyz and not, you know, sort of proxying themselves or presenting themselves as some other entity.

DAVID MALAN: Yeah, I mean, much like the web-- thankfully we got that right, presumably because of lessons learned from things like telephony over the years. Of course, the phone system has been around for so long now that it's certainly hard, I imagine, to shoehorn in some of these more technological features without breaking some of the intermediate points or some of the last miles, some of the folks who are on the other end of the line that might not necessarily have access, in their municipality, to the latest hardware. So, I'll be curious to see how this evolves. I mean, to be honest, this might all become moot over time if phones themselves, or phone numbers, are perhaps replaced by more data-based services. I mean, right now, we're very much in the phase of commercial services like WhatsApp, and iMessage, and so forth. I mean, but those have started to supplant already things like SMS, so, frankly, maybe the solution is ultimately just going to be too late in coming if the world moves to something else, anyway.

COLTON OGDEN: Yeah, I imagine, when folks were developing the phone system we have in place, they weren't expecting the ability for somebody to arbitrarily code and script, en masse, the sort of behavior that we're experiencing now.

DAVID MALAN: Yeah-- hey, back in the day, it used to be based-- at least pay phones-- on actual sounds, right? There are so many documented cases, and I think Steve Jobs and Steve Wozniak were among the folks involved in this back in the day, where you could have a little box that would generate the appropriate sounds that mimicked what the sound was if you put a quarter or a dime into a phone.
So, you could effectively make free long distance phone calls by spoofing those sounds. So there, too-- there was a sort of an assumption of trust that was quickly broken.

COLTON OGDEN: I think the theme is always that, if there is a system, humans will find a way to abuse and break it.

DAVID MALAN: Indeed, but there are some real-world implications of this. In fact, just the other day I saw an article online about what have been called virtual kidnappings, which, frankly, is literally ripped out of a "Law and Order" episode that I'm pretty sure I've seen, which is ironic, because usually it's "Law and Order" ripping things out of the actual headlines. But this, I think, predates that, whereby folks have started to get, terrifyingly, what appear to be actual phone calls from their child's phone number, or a relative's phone number, or a co-worker's phone number, and on the other end of the line is some adversary, some human who is pretending to have actually kidnapped the person whose phone they're purporting to be calling from when, in reality, they're just spoofing that number and tricking someone into thinking that they've actually physically hijacked their phone number and kidnapped that person.

COLTON OGDEN: Yeah, presumably, I mean, with this new ruling, hopefully, you know, this sort of horrendous situation doesn't end up becoming common at all, or at least it gets completely remediated.

DAVID MALAN: Yeah.

COLTON OGDEN: Because this is one of the more terrifying examples of how to abuse spoofing.

DAVID MALAN: No, absolutely. And it's horrifying that it's gotten to this point but, you know, what you might think is kind of a cool hack, the ability to spoof your phone number, really does have some non-trivial implications. And especially, for most folks out there, you know-- myself, before I even thought about this the other day after reading the article-- you might not even realize that this is possible and what the implications, therefore, are of these sorts of bugs at worst, or missing features at best.
COLTON OGDEN: Yeah, I mean, I think if this even happened to me, I think my initial inclination would be to believe it. I mean, certainly it would be terrifying, and you wouldn't want to take any risks by assuming that whoever's on the other end of the line is merely bluffing rather than telling the truth. Now, speaking of ransoms, unfortunately, I think these have cropped up in other contexts in the news of late and for the past couple of years, in fact.

DAVID MALAN: Yeah, no. I mean, there have been multiple cases, WannaCry being very prominent in 2017, of these sorts of worms that infect people's systems and, you know, potentially encrypt the hard drive, or do other things, and demand that, in order to have this fixed, the end user pay some amount of money, either bitcoin or actual money, to decrypt their hard drive or do whatever needs to be done to unlock their system.

COLTON OGDEN: Yeah, no, and that's the problem with worms, and viruses, and just malware, malicious software in general: if it has the same privileges that you, the user who accidentally installed it, somehow do-- or worse, it has administrative or root access to the computer-- it can do anything with your system and the data. You know, it almost makes exploits like sending spam automatically, unbeknownst to you, from your computer seem completely delightful in comparison because, now, these most recent forms of ransomware are indeed doing exactly that. They're actually running algorithms to encrypt the files on your own hard drive and then not telling you, the owner of those files, what the key is, the sort of secret with which they were encrypted. And, so, in this way can the bad guys literally say, hey, pay us some number of dollars or, in practice, some number of bitcoins in order to get access to the key via which you can unlock your data. Who knows if you're even going to get the key. I mean, frankly, an even more compelling ransomware would be to just encrypt the data and throw the key away. Then you don't even have to communicate further with the person once you get those funds.
DAVID MALAN: Yeah, and, in light of this sort of horrible new trend of ransomware that we've observed over the last few years, there are companies that do try and take advantage of this and will say, you know, we will help you decrypt your system. We will use high tech, quote unquote, solutions to reverse this ransomware. But it turns out that some companies, instead of actually having the algorithms and the technology to do this, are paying the actual people responsible for the ransomware directly and then charging you a premium.

COLTON OGDEN: Yeah, no, this is really kind of a tricky thing, and I'm reminded of most any Hollywood movie where someone is taken hostage. And, at least in these movies, the US always takes the position, officially, that the US does not negotiate with terrorists. Well, that may very well or not very well be the case, because the closer you get to home, and the closer you get to it involving people you know, or files you own, or information you need, do these decisions become a little less obvious. And it's a little harder to take that sort of moral stance, if you will. And, in fact, in one of the articles on ProPublica was this wonderful quote: "It is easy to take the position that no one should pay a ransom in a ransomware attack, because such payments encourage future ransomware attacks. It is much harder, however, to take that position when it is your data that has been encrypted and the future of your company and all of the jobs of your employees are in peril." It's a classic moral dilemma. And that really does put it into perspective, right? It's one thing to sort of argue-- no, we should not pay this ransom, because it's only going to happen to us or perhaps other people with greater frequency. But, if you really need the data on that hard drive, the financial information, the medical information, anything, the business information, your only recourse might actually be to pay the ransom and then hopefully lock your systems down much more effectively the next time around.

DAVID MALAN: Yeah, it's difficult when you're so-- when you're far removed from the problem, it's easy to say, oh, just don't negotiate.
But, when you're actually there, when it's your data, your information, your loved ones, it gets a little bit trickier. It's a little bit greyer.

COLTON OGDEN: And, if you do pay that one time to get your data back, man, you've just presented yourself to the bad guys as being someone they can clearly fleece again. So, it really boils down to-- try to avoid putting yourself in that situation at all, and have all of the defenses you can think of in place, in terms of your systems, in terms of your personnel. I mean, frankly, too often are these exploits the result of social engineering, actually tricking people into revealing their passwords by typing them into a website, or tricking them into opening a link, or clicking on some attachment, or the like. And then the whole setup-- your whole system can perhaps be compromised. So, getting ahead of that and instituting better principles, some of which we've discussed on the podcast, password length and so forth-- password managers can be just a step toward avoiding the problem altogether.

DAVID MALAN: Yeah, it's so tricky. I mean, we have-- like we've talked about before multiple times, the good guys have it the hardest. The bad guys just need to find one way in.

COLTON OGDEN: Yeah, they just need to find one employee who accidentally clicks on that link or discloses that password.

DAVID MALAN: One open window, so to speak--

[SIGH]

It's unfortunate. It's unfortunate, because there are vulnerabilities that ship, not only just-- there are vulnerabilities that don't arise just out of the negligence of individuals but out of the negligence of companies themselves.

COLTON OGDEN: Speaking of--

DAVID MALAN: And, in the news recently, some folks might know already-- WhatsApp actually had a vulnerability that was revealed. There was a company that was releasing spyware. It was actually shipping spyware through calls made through the WhatsApp application, which is an incredibly commonly used application in the United States and abroad.

COLTON OGDEN: Absolutely.
I mean, it is, ironically, an alternative to the SMS or texting that I alluded to earlier. It's data based, in which case it uses TCP/IP and network protocols to actually transmit the messages. And, as best I could tell from actually reading Facebook's own disclosure-- Facebook, of course, being the owners of WhatsApp-- it seemed to be some low level code that actually rendered the application vulnerable to a so-called buffer overflow exploit, whereby they must be allocating some amount of memory inside of the source code for WhatsApp. And, unfortunately, at some point in their code, they weren't checking to make sure that they were confining their use of memory to that footprint. So, if they allocated 100 bytes, they weren't actually checking to make sure that they didn't accidentally write more than 100 bytes to that location in memory. And, if you're using a language like Objective-C, or other lower level code that's involved with networking, you might very well not have the language to protect you from yourself. And, in this case, it seemed to allow an adversary to actually install malicious software on your own phone. And, in this case, it seems to have been spyware of some form, which is to say that you might have some software running on your phone unbeknownst to you, somehow listening to you or your data.

DAVID MALAN: It's interesting, because CS50-- in your lectures, you even talk about buffer overflow attacks and how to mitigate them.

COLTON OGDEN: Yeah, I mean, that depends on how complex your code is. It can be easy still using languages-- perhaps Objective-C, in this case. Although they weren't very forthcoming with the particular implementation details of the hack, it's certainly still possible. There are good tools out there that can help you detect these things. Whether or not those tools were in use in this context is also not clear, but it's sort of a fundamental flaw, at worst, or missing feature, at best, to borrow our terminology from earlier, that this is even possible in these languages. So, this is why there have been trends toward languages like Java, and Python, and the like that actually don't even let you do this.
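To make that 100-byte example concrete, here is a minimal, self-contained C sketch of the kind of bug being described, alongside a bounds-checked fix. It is purely illustrative and is not based on WhatsApp's actual source code, whose details were never published.

```c
// A minimal sketch of a buffer overflow (and its fix) in C.
// Illustrative only; not based on WhatsApp's actual code.

#include <stdio.h>
#include <string.h>

// Unsafe: copies the message without checking whether it fits.
void handle_message_unsafe(const char *message)
{
    char buffer[100];
    strcpy(buffer, message); // writes past buffer if message >= 100 bytes
    printf("Received: %s\n", buffer);
}

// Safer: confines the copy to the buffer's footprint.
void handle_message_safe(const char *message)
{
    char buffer[100];
    strncpy(buffer, message, sizeof(buffer) - 1);
    buffer[sizeof(buffer) - 1] = '\0'; // ensure the string is terminated
    printf("Received: %s\n", buffer);
}

int main(void)
{
    // Imagine this string arriving over the network from an adversary.
    char attack[200];
    memset(attack, 'A', sizeof(attack) - 1);
    attack[sizeof(attack) - 1] = '\0';

    handle_message_safe(attack);   // truncates; stays within 100 bytes
    handle_message_unsafe(attack); // undefined behavior: overflows the stack
}
```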
DAVID MALAN: Yeah, with great power comes great responsibility and a lot of weight on your shoulders if you're a low level developer.

COLTON OGDEN: Yeah, no. And just think, to your point earlier, all it takes is for one adversary out there with a little too much free time to find the one bug that's in WhatsApp, though surely there are many more than that. And then he or she can have access, potentially, to a whole system if the bug is bad enough.

DAVID MALAN: Yeah and, in this case, I mean, the attackers were even able to transmit the data if the target didn't answer the call. So they could get a call, not answer it, and still get infected. And it was the case that some of the calls actually could be removed from folks' logs, too. So, they wouldn't even be privy to the fact that they got a call and were potentially infected in the first place.

COLTON OGDEN: Yeah, you know, it reminds me of an incident a few years ago now when Sony had some software-- DRM software-- for digital rights management whereby, if you put, I think, a CD into your computer, it would actually install what was effectively a root exploit, somehow taking advantage of the ability to install software, run it behind the scenes, but then cover its tracks, and not even show up in the Windows Task Manager, for instance, as I recall. So these are particularly malicious, and that was done by a company, not even just by an adversary on the internet. It's scary that this is still possible in systems.

DAVID MALAN: I remember hearing about that. I'm not sure if it was us that talked about it, but I remember thinking, wow, I can't believe a company that big is doing something like that. And who else might be doing something like that, unbeknownst to the rest of us?

COLTON OGDEN: Yes, that did not end well for Sony, if you take a look at the articles online or the Wikipedia article.

DAVID MALAN: I vaguely do remember people being a little bit upset about that.

COLTON OGDEN: Yeah, but companies do make mistakes. I mean, also in the news this past week was the ZombieLoad exploit affecting some of Intel's hardware. That I find particularly scary.
And, in short, in this case, with the ZombieLoad attack, it is possible to essentially convince the CPU, the brains of your computer, to leak information in ways that were never intended. And this is problematic if one application is able to see information from another application. And, in fact, in this case here, thankfully, it seems to have been the good guys, the security researchers, who uncovered this first and reported it to Intel. It's not known if it was actually exploited, but they actually had a compelling proof of concept, for which there's a nice video online. If you Google ZombieLoad Intel, you should find any number of articles which showed them visiting various websites in a browser. And then, in a little command line interface, where they had written a program that was just running behind the scenes, they were able to log all of the host names that were being used by the browser to access those web pages, effectively leaking information across processes, which should not be possible on a system.

DAVID MALAN: Yeah, it's pretty chilling. I mean, in that same article they talked about-- this might be host names now, but this could be your security-- this could be your tokens. This could be your passwords. This could be your card numbers, what have you, any bit of information that is going to potentially lead to a massive security vulnerability for you. And it's scary when it's hardware, too. I mean, hardware is supposed to be the stuff that doesn't need to be updated, but that's just silly and naive. I mean, running on today's hardware is essentially embedded software, or firmware, as it's typically called. And most people, frankly, probably aren't really in the habit of updating their BIOS in the PC world, or that low level software. Apple, thankfully, takes care of this for users. And, so, who knows how often these things are actually discovered? But, when it's baked into hardware, that even puts it a little more out of most people's reach.
COLTON OGDEN: Yeah, no, this is pretty frightening, because, I mean, this transcends just what might be one person's physical machine. This could easily apply-- and CS50's own infrastructure is a big part of this-- to virtual machines hosted in the Cloud, because these all eventually run on physical machines. But, you know, one physical machine might be running CS50's code alongside some other company's code, and that other company might find a way to get access to all of our credentials, or whatever other company, right? Because it's all, you know, at the hardware level.

DAVID MALAN: Absolutely, it's frightening.

COLTON OGDEN: There was something interesting that I saw, which was-- and this is one of the coolest, cleverest ways I've seen of, again, abusing a system, finding a way into a system that you shouldn't have, and that's with Google Drive. So, somebody released, on GitHub, a program that actually allows folks-- because here's the thing with Google Drive. You can store, in your Google Drive, unlimited Google Docs. There's no quota cap on Google Docs. But this is only for the Google Docs format. But somebody found a way to encode arbitrary information, arbitrary binaries, as Google Docs. And, well, that essentially led to them having unlimited disk space in Google Drive.

DAVID MALAN: Yeah, and I would say this is more of a theoretical convenience than a practical one, because there's some overhead in running the software. But, yeah, it's kind of a brilliant sort of hack, if you will, or exploit, or workaround, when really it's just kind of taking advantage of the design of the system. Like, normally, you're supposed to use Google Drive, and Dropbox, and iCloud, and those other kinds of file based services by dragging and dropping your files, whether it's a text file, or binary file, or video file, or program, or whatever, into the drive or up through the browser, and it gets saved. But, of course, it takes up some number of bytes, or megabytes, or gigabytes, and that counts against your finite quota. But, for whatever reason-- maybe the staff at Google who wrote Google Docs didn't think about this, or didn't think anyone would be crazy enough to try it-- it's really kind of cool.
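As a rough illustration of the general trick being described here, turning arbitrary bytes into ordinary text that a document can hold, and back again, here is a minimal C sketch. The actual tool reportedly uses Base64; this sketch uses the simpler Base16 (hexadecimal) mentioned just below, purely to show the idea of binary in, plain text out.

```c
// Minimal sketch: encode arbitrary bytes as plain text (Base16/hex) and
// decode them back. Illustrative only; the tool discussed reportedly uses
// Base64, but the spirit is the same: binary in, ordinary text out.

#include <stdio.h>
#include <string.h>

// Encode len bytes as a hex string (out must hold 2 * len + 1 chars).
void encode_hex(const unsigned char *in, size_t len, char *out)
{
    for (size_t i = 0; i < len; i++)
    {
        sprintf(out + 2 * i, "%02x", in[i]);
    }
}

// Decode a hex string back into bytes; returns the number of bytes written.
size_t decode_hex(const char *in, unsigned char *out)
{
    size_t n = strlen(in) / 2;
    for (size_t i = 0; i < n; i++)
    {
        unsigned int byte;
        sscanf(in + 2 * i, "%2x", &byte);
        out[i] = (unsigned char) byte;
    }
    return n;
}

int main(void)
{
    // Pretend these bytes came from some binary file.
    unsigned char original[] = {0x89, 0x50, 0x4E, 0x47, 0x00, 0xFF};

    char text[2 * sizeof(original) + 1];
    encode_hex(original, sizeof(original), text);
    printf("As text: %s\n", text); // this string could live in a document

    unsigned char restored[sizeof(original)];
    size_t n = decode_hex(text, restored);
    printf("Round trip %s\n",
           n == sizeof(original) && memcmp(original, restored, n) == 0
               ? "succeeded" : "failed");
}
```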
You can take any binary file, convert it to text using something like Base64 encoding, which is similar in spirit to Base10, or Base2, or Base16, which are decimal, or binary, or hexadecimal, respectively. But just turn it into text, and then automatically paste it into one or more Google documents, and then reconstitute it later when you actually want to download the data. I mean, frankly, this is probably more annoying than anything, and Google could clamp down on this pretty quickly. They could probably say, you know, if you have a million Google Docs, you're probably not using them for Google Docs purposes. So, they could put some thresholds in there, but it would be fascinating to be privy to the chats going on at Google, whether someone was like, oh, we knew this was possible, but we just didn't worry about it, because it's not that useful, or whether minds were blown and, wow, that's such a clever sort of exploit.

COLTON OGDEN: Yeah, no, if folks are interested, they can go to GitHub.com/StuartMcGowan/UDS and see exactly what's going on. I imagine, probably very soon, it will no longer be a relevant codebase. I have to imagine Google's going to find a way around it.

DAVID MALAN: No, this is one of those "this is why we can't have nice things" situations.

COLTON OGDEN: Yeah, no, but it's a very fascinating experiment. Another company-- another big company is Microsoft. That's a little bit of a segue there. They released a series of patches recently for some vulnerabilities that apparently exist on older versions of Windows, for operating systems such as XP and Windows 2003, among many others.

DAVID MALAN: Yeah, so, for those of you still running Windows XP from like 20 years ago, this is for you.

COLTON OGDEN: Yeah, 16 updates targeting at least 79 security holes in Windows and related software, which is awesome, that they're actually being proactive about doing this, and they're not doing this on the heels of an exploit that comes out from some nefarious actor--

DAVID MALAN: Granted, but it's also terrifying that, since the last update, there have been 79 security related bugs fixed. And those are the ones that have been fixed.
Let's just imagine how many have not yet been discovered, let alone fixed.

COLTON OGDEN: Right, there was one I remember reading about that was a zero-day vulnerability that they had just fixed. And there was another fix for Remote Desktop Services, which is built into various versions of Windows, including Windows 7, Windows Server 2008 R2, and Windows Server 2008. So, pretty crazy that-- and all of these computers may have been compromised, may not have been compromised, at least to folks' knowledge. But, at the very least, now, people who are running this software can rest assured that a small chunk of potential vulnerabilities is at least taken care of.

DAVID MALAN: Yeah, well, and for those unfamiliar, worms are among the scariest of malware attacks, whereas a virus, for instance, is the kind of thing where you have to sort of accidentally or foolishly click on a link that opens some software and runs it, or you have to open an attachment that actually is infected with software. A worm is, by definition, self propagating. So, once that process or that program is running, perhaps unbeknownst to you on your computer, it can spread, via a network connection, to another computer, or another computer, or another computer, if all of those computers are themselves vulnerable. And, in this case, too, if your system's not already patched, you are in fact vulnerable. And, so, this frankly really got me thinking about a trend, which is a good thing in recent years, especially in the Apple ecosystem, which is essentially compelling people to automatically update. Auto update, dare I say, used to be more of an opt-in thing, not on by default. And, to be fair, you do in some contexts still have to opt into it on Apple's platforms. But it's getting more and more in companies' interest to sort of compel users to update, and this is helping to narrow the number of systems that are actually vulnerable. Because, if you're auto updating on a schedule, you're at least less likely to be running the older, more vulnerable stuff. So, it's a good thing, generally speaking, to have auto updates on.
COLTON OGDEN: I know Windows 10 is a particular offender in this realm, because they are hyper-aggressive about making you automatically update, and they make it really difficult for you to actually get out of that behavior.

DAVID MALAN: Yeah, no, this is very true. And it backfires in terms of UX, or user experience. I remember years ago, when the Xbox One first came out, we had one here in the office for students to use. And the first thing we tried to do was set it up around the holidays, and everyone was so excited that we had the brand new Xbox One and wanted to play some game, maybe a soccer game or something like that, on it. And, so, everyone plugged it in and, just like Christmas morning, everyone's ready to start, and then-- downloading, downloading. And then, like, no joke, an hour or more later, was the Xbox finally ready to let us play a game, by which point Christmas was over, or whatever the day was. And, so, it really kind of got in the way of a good user experience. But maybe that protected our system from being compromised. So, it really is a trade-off, which is thematic in computing.

COLTON OGDEN: Yeah, trust and trade-offs, if we had to boil down CS into two words--

DAVID MALAN: Yeah, I think that's pretty apt.

COLTON OGDEN: Well, somebody actually requested we talk about this, which is kind of a cool thing. Careers in technology would be the topic here.

DAVID MALAN: Yeah, so we got this question from one of our listeners. I like these. "Can you talk about careers in tech in a future podcast, maybe what areas have more job openings in the next few years, what skills are in demand, and what areas may decline in the future, also maybe the interview process?" So, a bit of a loaded question-- I think we can touch on this a little bit here and certainly welcome other such questions. I mean, it's hard to go wrong nowadays, certainly, in bolstering your technical comfort and your technical skills and expertise. It's so much easier these days to find access to high quality educational content for free on the internet.
You don't need to necessarily go through formal schooling or pay for these actual programs. With that said, it's tough to predict these trends. I mean, there are certainly things that are in vogue these days. Python, for instance, is a language that's very much in vogue these days for web programming, for data science applications, for interactivity. JavaScript is another one that's perhaps even more popular and trending these days, both on the client side and the server side. And then there's the whole, like, operations world, technologies like Docker, and virtual machines, and so forth, that are really transforming how systems are hosted in the Cloud and elsewhere. So, there are a lot of exciting trends. But, frankly, I think, rather than even chasing these trends, you can't really go wrong in studying, really first and foremost, the fundamentals and focusing on having a strong software background with procedural programming, with classes like CS50, functional programming, object oriented programming, as by taking other classes, and then keeping an eye out-- that really opens doors, I think, to all sorts of entry level and higher level software jobs.

COLTON OGDEN: Yeah, problem solving I think ultimately--

DAVID MALAN: Absolutely.

COLTON OGDEN: That's probably the number one skill that I would say people should focus on.

DAVID MALAN: Yeah, and then certainly, at a lot of the bigger tech companies, certainly in the software context, the interview process is really focused on problem solving. Generally, the types of questions you might get are language agnostic, or the interviewers often don't care what language it is you're using to solve a problem. Frankly, your syntax doesn't necessarily have to be 100% correct if it's more of a whiteboard kind of conversation, or even just like a Google shared document on a telephone call or video conference that you might have. The goal really is to get a sense of how people think and how they approach programming.
I mean, frankly, when we've interviewed folks even for part time or full time roles here on CS50's team, for software oriented roles, what I really want to do is get a sense of what it would be like to work with that person in a room, in front of a whiteboard, with his or her laptop off to the side, where we're just designing the solution to a problem, even independent of code. And, so, I think being able to have really robust design conversations, being able to understand, as you know, the trade-offs between doing something one way or another when it comes to designing a system-- that's, I think, one of the best ways to prepare yourself for this.

COLTON OGDEN: Yeah, I think, given our experience here at CS50, and based on just what I've read, it seems like the model that big companies have taken in recent years, or maybe even not so recently for a lot of the larger ones-- the whiteboard sort of model, the problem solving based model-- I think even smaller companies are probably adopting this a bit more than they used to now, because they're getting a lot more of an influx of software developers looking for work. And, so, I think we see this thing pretty commonly.

DAVID MALAN: Absolutely.

COLTON OGDEN: And it does ultimately boil down to, not what language you might be comfortable with, but, you know, ultimately the core problem at hand, which is what CS50 tries to teach. We advertise ourselves-- you advertise the course as not a course on programming, per se, but ultimately on problem solving.

DAVID MALAN: Yeah, absolutely. And, speaking a little more practically here, at Harvard we have a tradition, thanks to some former teaching fellows, of holding a prep and practice session for tech interviews every year. So, if you actually Google or go on YouTube and search for CS50 prep and practice for technical interviews, odds are one of the recent years' videos should pop up, where CS50's own Tommy MacWilliam, a former head teaching fellow, actually leads folks through a discussion of how to, and how not to, format your resume, how to prepare for an interview, and how to conduct an interview. So, you might want to check that out.
A very popular book here on campus, too, is one called "Cracking the Coding Interview," or "Cracking the PM Interview" for product management. Those, on Amazon or other websites, might be of interest as well, just as a nice, thick reference as to where you could begin. Frankly, it could take you weeks, or months, to go through everything in those texts, but they'll give you a sense of how you might go about preparing.

But, in short, in terms of the opportunities themselves, I would say it's hard to go wrong in the DevOps world, knowing one or more programming languages, knowing a little something about how you can run an application using Cloud services of any sort, certainly version control, and GitHub, and GitLab, and other such products. And then also security, just being someone who can help companies understand and analyze threats to their systems, who can chase those things down, who can help secure systems-- I mean, there's no lack of need in the security space as well.

COLTON OGDEN: Yeah, having technical literacy in this day and age-- I think that is incredibly useful. We're only getting more automated.

DAVID MALAN: Yeah, absolutely-- so, a lot of exciting opportunities out there. And I think, if you just get to first base with some of the fundamentals, by taking one or a few classes, or experiences, or boot camps, or the like, you can really then bootstrap yourself from there onward until you really feel like you're hitting home runs.

COLTON OGDEN: Awesome. I like how that ended, some solid advice there.

DAVID MALAN: Thanks, I don't know if that metaphor works. But it sounded kind of poetic.

COLTON OGDEN: Well, thank you for coming here to do this podcast with me.

DAVID MALAN: Oh, well, thanks so much for having me.

COLTON OGDEN: Episode 5, zero indexed, of the CS50 podcast-- what are some takeaways that you would recommend from the discussion here, since we like to end with a few takeaways?

DAVID MALAN: I know. I worry the theme too often is be afraid, be very afraid. But I think, hopefully more constructively this time, there are things you can be mindful about.
And, honestly, thinking about technologies from first principles, even in the context of virtual kidnappings, God forbid, understanding-- well, wait a minute. How is this happening to me? Don't necessarily take things that you see on a system at face value. Consider what sequence of steps might have led you to see this symptom and then decide for yourself, in an informed way, yes, this is a threat, or no, it isn't. And I think just knowing how to defend yourself as well-- don't get yourself into the situation of things like ransomware attacks or vulnerable WhatsApp applications on your phone. Make sure your auto updates are on, which is probably a net positive in general, even though updates can be rolled out that are themselves buggy. That's probably the lesser evil-- so, staying on top of your system and not just using things out of the box the way you receive them. In fact, a certain someone comes to mind as to whose iOS is not always up to date.

COLTON OGDEN: I was going to make a comment about that when we got to auto updating. Yeah, I have a bad habit of not updating my stuff as often as I should.

DAVID MALAN: Yeah, so I'm going to send you a link to episode 5 of the CS50 podcast and see what happens there.

COLTON OGDEN: All the talks that we've had in here have convinced me that maybe it's time to start taking that a little more seriously.

DAVID MALAN: All right, well, thanks so much for tuning into the CS50 podcast. Looking forward to chatting with folks further.

COLTON OGDEN: Likewise-- thanks for tuning in.