This is CS50.

DAVID MALAN: Hello, world. This is the CS50 podcast, and my name is David Malan.

COLTON OGDEN: And my name is Colton Ogden.

DAVID MALAN: And so glad to have everyone back with us today for our second ever episode of the CS50 podcast.

COLTON OGDEN: Yeah, super excited. So curious, before we start. I walked into your office just before this podcast started, and you were on the phone. Who were you on the phone with?

DAVID MALAN: You couldn't have asked that before we started rolling?

COLTON OGDEN: You seemed a little bit disgruntled.

DAVID MALAN: No, if you can believe it, it was a robocall. And in fact, ever since our discussion thereof, and since Last Week Tonight with John Oliver started focusing on this topic, I have legitimately started getting more and more of these calls. Where they're just spam calls, and then you pick up and there's a very cheery computer voice on the other end of the line.

COLTON OGDEN: You know, actually, I had to block a number, because I was actually getting called consistently. I got called--

DAVID MALAN: I'm sorry, I'll call you less.

COLTON OGDEN: I got called everywhere every five to 10 seconds. The same phone number was calling me after I called. It must've been on a loop or something.

DAVID MALAN: That's awful. Well, I mean, the nice thing about iPhone is that you can actually block calls pretty easily. And I'm guessing you can do the same on Android. But landlines from yesteryear, you're pretty much out of luck unless you punch in some code with your phone provider to do it as well.

COLTON OGDEN: Yeah, you can bet that actor got blocked.

DAVID MALAN: You know, it's really obnoxious too. And they think they're being clever. Because most of the spam calls I get nowadays, like in the past week, are 617-555-something, something, something, something. Where 617-555- matches my own phone number's prefix.
And I think the presumption is, and I think John Oliver might have pointed this out, that they're trying to trick you, sort of social-engineering-wise, into thinking, like, oh, this must be a neighbor down the road, because their phone number is so similar. And it's really frustrating now. This really has peaked.

COLTON OGDEN: I don't think I ever got a call from anybody in my life who was a legitimate actor who had the same prefix as my phone number.

DAVID MALAN: Yeah, that's a good point, actually. I don't even notice, frankly, because it comes up sometimes with my contact information. And, anyhow. Thanks for that.

Well, sort of a tie back into actually the last podcast episode, where we talked about Facebook, and their sort of storing unhashed passwords out in the clear. It looks like recently they committed another sort of offense, where they were actually asking people for their email passwords. Not a Facebook password, but their actual external email passwords, through Facebook.

COLTON OGDEN: Yeah. I read this, and I think they were trying to do this for well-intentioned reasons, at least we can perhaps give them the benefit of the doubt, in that they wanted people to be able to confirm that some email address was in fact their own. And I presume some developer thought, well, it'll be easy if we just ask them for their username, their password, pretend to log into that actual system on the user's behalf, and if they get in successfully, hopefully just disconnect without poking around, and just assume that the address is indeed theirs. But this is just so unnecessary and so wrong on multiple levels. I mean, this is why companies actually instead send you an email, usually with a special number, or word in it, or URL that you can then click. Because the presumption there is that, well, if we send you an email, and you are able to click on that email within 15 minutes, presumably you do indeed know the username and password to that email. And so therefore you are indeed who you say you are. That's sort of the right, or at least the industry-standard, way of doing this.
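[As a minimal sketch of the link-with-a-token flow just described, in Python: the verify URL is a placeholder, and send_email is a hypothetical stand-in for a real mailer; a real app would persist tokens in a database rather than in memory.]

```python
# Minimal sketch of email verification via a one-time token.
import secrets
import time

PENDING = {}  # token -> (email, expiry timestamp); a real app uses a database

def send_email(to: str, body: str) -> None:
    # Stand-in for a real mailer (SMTP, a mail API, etc.); just prints here.
    print(f"To: {to}\n{body}")

def start_verification(email: str) -> None:
    token = secrets.token_urlsafe(32)            # unguessable, one-time token
    PENDING[token] = (email, time.time() + 900)  # valid for 15 minutes
    link = f"https://example.com/verify?token={token}"
    send_email(to=email, body=f"Click to confirm your address: {link}")

def verify(token: str) -> str | None:
    # If the user clicked the link in time, they control that inbox.
    email, expiry = PENDING.pop(token, (None, 0))
    return email if email and time.time() < expiry else None
```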
Even though it does add some friction. You have to go check your mail. You might have to hit refresh a few times. So there's some UX downsides. User experience downsides. But that's secure, because you're not asking the user to divulge private information. This is just reckless, especially for a company as big as Facebook, to be conditioning people into thinking this is OK.

DAVID MALAN: I think letting big entities like this act on our behalf in the security realm-- I mean, especially Facebook, given that they've recently been caught storing plain-text passwords. Putting my email password in Facebook's hands, I don't know where that's going to end up at the end of the day.

COLTON OGDEN: No. And honestly, even if it's not malicious, and it is just foolish or accidental, the reality is that servers log input, or log transactions in their databases. And so the data may end up just sticking around, unintentionally so. So it doesn't matter even that the intentions are good. This is just bad practice. And again, to my point earlier, if you see this behavior being normalized on very popular websites like Facebook, well, what's to stop a user, especially a less technically proficient user, from thinking, oh, I guess that's OK. That's the norm. That's how this is done. If they see it on some random adversary's website that they get socially engineered into clicking on.

DAVID MALAN: It was kind of entertaining when it was sort of brought to Facebook's attention that this was a bad idea. In the Daily Beast article that I actually read about this, someone actually brought this to Facebook's attention. And Facebook came out, and to the world said, you know, this probably wasn't the best way to approach solving this problem. Because they had been caught doing [INAUDIBLE].

COLTON OGDEN: To say the least. You know, and it's interesting, because big companies, Facebook among them, presumably do have code review processes in place, involving multiple humans and design reviews. And so what's especially worrisome here, or certainly surprising, is how did this even ship?
Right? At no point, presumably, did some human object to doing this. And so that's, I think, the sort of fundamental flaw, or the fundamental concern: how did something like this even happen? Because students coming out of CS50 might certainly be inclined to implement things in this way. And frankly, if you don't really think about it adversarially, or if you haven't been taught to think about things defensively, you might make this mistake too. But that's what mentorship is there for, what more experienced personnel or older folks are there for: to actually catch these kinds of things. And so that's the sort of process flaw that's of concern too.

DAVID MALAN: I'm certainly grateful that we have so many folks online who are getting more technically literate in this domain and are bringing this to everyone's attention. Looking for these kinds of patterns and catching Facebook red-handed when they do these types of things. Not necessarily just Facebook; I'm sure this happens at scale with many companies. But it's nice to know that people are actually on the lookout for this.

COLTON OGDEN: Yeah. And you know, I should disclaim too, because I'm sure we have students out there who will remember this. Some 10 or so years ago, even CS50 actually did foolishly use this technique on one or more of our web apps. Because at the time there actually was no, I believe, sort of standard that we could have used to authenticate users in a better way. OAuth, for instance, has come onto the scene since. And maybe even if it existed then, it wasn't nearly as omnipresent as it is now. So in short, there are technical solutions to this problem. Whereby, the right way to do this is you don't ask the user for their username and password. You redirect them to Yahoo or Gmail or whatever the account owner's website is. Have them log in and then be redirected back to you, essentially with some kind of cryptographic token. Something that's mathematically significant and very hard to forge that proves, yes, Colton did in fact just log into his actual Yahoo email account; you can trust that he is who he says he is.
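[As a rough Python sketch of that redirect-and-token dance, in the style of an OAuth 2.0 authorization-code flow: the endpoint URLs, client ID, and client secret below are hypothetical placeholders, not any real provider's values.]

```python
# Sketch of an OAuth 2.0 authorization-code flow; all URLs and
# credentials here are hypothetical placeholders.
from urllib.parse import urlencode
import requests

AUTHORIZE_URL = "https://provider.example/oauth/authorize"
TOKEN_URL = "https://provider.example/oauth/token"
CLIENT_ID = "my-app"
CLIENT_SECRET = "placeholder-secret"  # kept server-side, never shown to users
REDIRECT_URI = "https://myapp.example/callback"

def login_redirect_url(state: str) -> str:
    # Step 1: send the user to the provider's own login page.
    return AUTHORIZE_URL + "?" + urlencode({
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "state": state,  # anti-CSRF value we check on the way back
    })

def exchange_code(code: str) -> dict:
    # Step 2: the provider redirects back with a short-lived code, which
    # we trade (server to server) for a token. At no point do we ever
    # see the user's password.
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }, timeout=10)
    resp.raise_for_status()
    return resp.json()  # contains an access token, not a password
```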
That mechanism either didn't exist, or wasn't familiar even to me back in the day. And so we would just have a web form on CS50's site to log in with their Harvard email address and their password. And again, we were not intending this to be malicious. We certainly didn't log anything deliberately, or, thankfully, accidentally. But we could have. And I think the fact that even we, as a course, conveyed the message that, oh, this is OK, was a very bad message to send. And so thankfully, some years ago we actually transitioned to using more industry-standard approaches like OAuth, again this mechanism where you bounce the user to their Harvard login, then back to CS50 as just a sample client, or an application. That's a much better way of doing this.

DAVID MALAN: Yeah. Because in that scenario, you're actually allowing a third party to let you perform this handshake, I think, more securely than just having one entity perform all the security for you.

COLTON OGDEN: Yeah. No, and if you look closely, there might actually be examples of this elsewhere. For instance, and it's been a few months since I looked. In Gmail, I believe under your settings you can actually add accounts to your account so that you can retrieve mail from another account via POP, Post Office Protocol, which is a way of downloading email. There too, you're doing exactly the same thing. You are trusting Google with your username and password to some other email account. The design there, though, is to enable Google to import that email for you into this account. And so, as such, there's really no other way to do that, unless there is some other process involved where, via OAuth, they can do that. But that's actually not how POP works. And so there, too, it's a technical constraint. And that's kind of an artifact of yesteryear's designs with a lot of these systems. But it's worth keeping in mind that these things still happen.
And I think even when you log into sites for the first time, you're sometimes prompted for a username and password. Maybe it's LinkedIn I'm thinking of, or Yahoo. Because they want to make it easy to import your contacts. So what better way than to just access your account outright? But there, too, you're trusting someone. You are normalizing a behavior that's probably not best. And so I think we as a society should really start to resist this and distrust this. Just don't do that.

DAVID MALAN: I feel like distrust is a very common theme in the world of higher CS. What was the article? The very famous article on trust?

COLTON OGDEN: Oh, Trusting Trust.

DAVID MALAN: Yeah. Yeah, indeed. It was actually a Turing Award acceptance speech that was then put into paper form. If you really get into the weeds here, nothing is really trustable. Right? In CS50, we talk about compilers, which are programs that, of course, convert one language into another. Usually source code into machine code, at least in our case of C. And who's to say that Clang, the compiler we happen to use in CS50, doesn't have some malicious lines of code in there? Such that if you're implementing any program that does use usernames and passwords, what if the author of the compiler is inserting his or her username automatically, always, into your code, even unbeknownst to you? So there too, unless you actually built the hardware yourself and wrote the software that's running it, at some point you either need to just curl up into a ball, terrified that you can't trust anyone, or you have to trust some of those lower-lying building blocks.

COLTON OGDEN: It's kind of a testament to just how pivotal trust is to where we are with technology. Where we are with computers today. I don't think any of this would be possible if we were on the far end of the paranoid spectrum. I think there is definitely, pragmatically, an inflection point at which we do need to actually trust people. Most definitely.
DAVID MALAN: No, I'm guessing that this is why some people, not that many, live off the grid, so to speak. Or in a cabin somewhere, disconnected from all of this, because they don't trust. And honestly, we've seen enough articles, and revelations, and news lately that they're kind of right. All of these big companies, too, that you would have thought were adhering to best practices aren't. So there's something to that.

COLTON OGDEN: There has to be a certain, I guess, baseline comfort level folks have to have with at least some of their information being publicly accessible.

DAVID MALAN: Yeah. No, and I think you have to make an individual decision as to whether the convenience you derive from some tool, or the pleasure you derive from some game, or whatever the application is, outweighs the price that you're paying. And that certainly is a theme in computer science, and in CS50 specifically: making a reasoned choice based on the pluses and minuses. But I think the concern here, as with sort of liberties more generally in a republic or in a government, is that it's very easy incrementally to say, oh, I'll give up a little bit of my privacy for this additional convenience or this feature. OK, I'll give you a little bit more. OK, I'll give you a little bit more. And then when you actually turn around and look at the trail of things you've given up, it can actually start to add up quite a bit. And then some other party or company or government has much more control, or access, than you might originally have agreed to.

COLTON OGDEN: Sure. All makes sense. I guess maybe to sort of pivot away from the trust discussion, back into maybe something a little more technical. It looks like this last week Apache actually patched a bug that granted folks root access on shared hosting environments. With the Apache web server, which is such a ubiquitous web server, there were malicious CGI scripts that were capable of actually running on a shared hosting environment. Which, I think CS50 even used Apache for shared hosting.

DAVID MALAN: Yeah, many years.
300 00:12:08,920 --> 00:12:11,225 COLTON OGDEN: At least in the V host. 301 00:12:11,225 --> 00:12:13,540 Is V host the technically shared hosting? 302 00:12:13,540 --> 00:12:16,600 DAVID MALAN: V host is a technical term saying virtual hosting. 303 00:12:16,600 --> 00:12:21,640 Which generally means hosting multiple domains on the same physical server. 304 00:12:21,640 --> 00:12:23,810 And Apache makes that very easy. 305 00:12:23,810 --> 00:12:24,310 Yeah. 306 00:12:24,310 --> 00:12:25,930 No, we used Apache for years. 307 00:12:25,930 --> 00:12:28,990 It's free open source software, it's very highly performing, 308 00:12:28,990 --> 00:12:30,610 can handle lots and lots of requests. 309 00:12:30,610 --> 00:12:33,760 It's a competitor, essentially to Engine X, which then swept onto the scene 310 00:12:33,760 --> 00:12:35,718 and took sort of a different technical approach 311 00:12:35,718 --> 00:12:38,090 to the same problem of scaling web services. 312 00:12:38,090 --> 00:12:38,590 And Yeah. 313 00:12:38,590 --> 00:12:43,270 This was an example of a bug whereby if you have an account on a server 314 00:12:43,270 --> 00:12:45,100 that's running the Apache web server. 315 00:12:45,100 --> 00:12:46,868 As you would if you were running yourself, 316 00:12:46,868 --> 00:12:49,910 or if you're paying someone a few dollars a month for shared web hosting. 317 00:12:49,910 --> 00:12:52,960 Which is still very common, especially for languages like PHP. 318 00:12:52,960 --> 00:12:55,090 You have typically a shell account. 319 00:12:55,090 --> 00:12:57,580 A username and password and therefore home directory. 320 00:12:57,580 --> 00:13:00,100 And the ability, sometimes, to run programs. 321 00:13:00,100 --> 00:13:02,260 Otherwise known in the web as CGI scripts. 322 00:13:02,260 --> 00:13:03,940 Common Gateway Interface. 323 00:13:03,940 --> 00:13:08,080 Which is a way of running languages like Python. 324 00:13:08,080 --> 00:13:11,950 You can do it with Python, but more commonly PHP, or Perl, 325 00:13:11,950 --> 00:13:13,540 or other languages as well. 326 00:13:13,540 --> 00:13:17,020 And in short, if you have the ability to install these CGI scripts on a server, 327 00:13:17,020 --> 00:13:19,420 you can write a program in such a way that it actually 328 00:13:19,420 --> 00:13:22,720 gives you, as you know, root access, or administrator access 329 00:13:22,720 --> 00:13:23,800 to the whole darn server. 330 00:13:23,800 --> 00:13:26,770 Which is horrible if you're not on your own server, 331 00:13:26,770 --> 00:13:28,855 but you are on someone else's shared host. 332 00:13:28,855 --> 00:13:31,480 Because now you have access to all the other customers or users 333 00:13:31,480 --> 00:13:32,398 accounts potentially. 334 00:13:32,398 --> 00:13:33,190 COLTON OGDEN: Yeah. 335 00:13:33,190 --> 00:13:37,690 Nick and I, on the stream, this was part of one of the CTF, Capture the Flag, 336 00:13:37,690 --> 00:13:38,500 challenges we did. 337 00:13:38,500 --> 00:13:41,650 Where we had to sort of finagle our way into getting 338 00:13:41,650 --> 00:13:45,640 privilege escalation from several user groups up until a root 339 00:13:45,640 --> 00:13:47,930 access by exploiting these kinds of vulnerabilities. 340 00:13:47,930 --> 00:13:48,680 DAVID MALAN: Yeah. 
DAVID MALAN: Yeah. No, the threat, of course, is that if somehow you have a bad actor on your staff, or in your course, or really just on your server, he or she can, of course, install something like this. And then gain, or grant, root access to someone else too. So even if it's your own server, you certainly don't want your own code to accidentally be able to slip into root mode. Because that means any commands that are executed thereafter could damage anything on the system. You can add files, remove files, send files elsewhere. Once you have root, the front door is wide open.

COLTON OGDEN: Yeah. Delete databases. Delete users.

DAVID MALAN: Yeah, everything. So this is a very serious threat. And it's a simple fix to just run the update and actually patch the software, so to speak. But these are the kinds of things that you want to be cognizant of. And frankly, I think far too many system administrators and people running web servers don't necessarily pay attention to these kinds of alerts. And so, making sure you're keeping an eye on Apache's own mailing list or Twitter account these days, or TechCrunch, or other such sites that tend to propagate announcements of security flaws. You really do want to keep an eye out. Because you're going to be regretting it, I think, otherwise, if the fix were available and you just didn't realize you needed to apply it to defend yourself.

COLTON OGDEN: Sure. So unrelated to that. An interesting thing that we saw in the last week was that Office Depot recently was accused of forging computer scan results. Folks would bring their computers in, and Office Depot would just flat out lie about the safety of those computers to the folks that brought them in. What do you have to say about that? What do you think about that?

DAVID MALAN: Well, today's podcast is brought to you by Office Depot.

[LAUGHTER]

DAVID MALAN: No. No one, actually. No, this is a horrible thing. This isn't even necessarily related to technology. This seems to be, and I presume this is true.
I'm reading the same thing you are, off the FTC website in the US here: that it was just outright deception. And the software was configured, or designed, or the humans chose, to give misleading information. Incorrect information, to people, just to trick them, presumably, into upselling them to have their computer disinfected from some virus, or some malware, when it wasn't actually there.

COLTON OGDEN: And this is horrible. But how do we fix this problem? How do we protect the folks that don't necessarily know any better whether the computer is infected?

DAVID MALAN: Yeah, I mean, you'd like to think that these are anomalous situations. Where, at least if you're going to brand-name places, you would like to think that you can trust them with higher probability than, say, some random person you find on Craigslist to disinfect your computer for you. But case in point, even a big fish company like Office Depot, which, for those unfamiliar, is a pretty big company in the US that sells office furniture, apparently will steal money from you and pretend your hard drive is infected with malware when it's not. So, as we've seen with Facebook and other companies and these mea culpas, big companies are doing this too. And maybe it's not systematically across the company. Maybe it's some bad actor, or management, or one or a few stores. But I think the nice thing about the software world, and the nice thing about the open-source world, is that there are a lot of free products and tools that you can actually download at home. And while you might need a bit more technical savvy, it's definitely more convenient to be able to do it yourself. You can perhaps trust the process a bit more. At least if you have identified a good, compelling product or open-source tool. Not some random thing that you were tricked into downloading, and that's a whole other can of worms there. But there tend to be popular programs that, frankly, I used to use when I used PCs more frequently. I would run them myself. And then when they did detect something, I would have it clean my own computer. It's definitely not something you have to pay someone else for.
But even for those least comfortable, honestly, invite someone over that you trust, whether it's a colleague, a friend, or a family member, and have him or her run such software for you. And then trust their judgment, not necessarily a random third party's.

COLTON OGDEN: Back to the theme, of course, of trust.

DAVID MALAN: Yeah. You have to trust, too, that your niece or nephew isn't coming over just trying to cheat you out of 20 bucks to scan your computer. But you can also just pay them to show you how to run the software, and then do this perhaps yourself.

COLTON OGDEN: It always, always gets into somewhat of a dark realm when we talk about trust. In the context of general trust, I think, but in the context of CS, especially. I think going down that rabbit hole can often be somewhat depressing.

DAVID MALAN: Yeah. But I think this is true, if we really want to depress ourselves, in the real world too. Like driving a car: you generally need to trust that the other humans are going to obey the traffic laws, the traffic lights, so that you can behave in a logical way without actually hitting or being hit by someone else. So I think that's kind of omnipresent. And when you go out to a restaurant, you'd like to assume that everything is sanitary. And unfortunately, I've watched far too many Gordon Ramsay shows and Kitchen Nightmares to know that I shouldn't be trusting all restaurants, actually. So I think this is not necessarily unique to technology, but I think it's all the more present lately. This concern, or these threats.

COLTON OGDEN: Yeah, certainly. Back more to the technical side of our discussion, and sort of related to the Apache thing, and the other actor that you mentioned, nginx. There was a vulnerability with some Cisco routers recently. The RV320, and I think another series. And there was the RedTeam pen-testing group that, I guess, ended up doing some tests on those routers. And found a config file in which it specified that one of the fixes for a vulnerability that they found was actually just to ban curl.
The program curl. What are your thoughts on that as an approach?

DAVID MALAN: Yeah, I think we have such a knack in this podcast already. Anything we have to say is not going to be positive when we point out that something was in the news, it seems. Technologically so--

COLTON OGDEN: I need to do some more research on positive, friendly topics.

DAVID MALAN: We do. Well, what are some websites where you can see some puppies? Wholesome--

COLTON OGDEN: Well, this is the podcast, so people can't see anything.

DAVID MALAN: That's true, so we need some--

COLTON OGDEN: We need audio clips of puppies.

DAVID MALAN: There we go. Next time, next time, in episode two. So curl, for those unfamiliar, allows you to connect to a URL, generally with a command-line client. Or there's actually a library version where you can write code that connects to a URL. And you can use curl therefore to download content or download HTTP headers. It essentially pretends to be a browser in a headless way, without a GUI. It just does everything textually. And so in the case of this Cisco router, it seems as though there was indeed a vulnerability in their code, on the routers themselves, such that you could trick these devices into executing code that they were not supposed to execute. And that, in general, is a bad thing. You don't want a piece of hardware able to execute code that you did not intend. Because, of course, it can maybe do things malicious. It can steal data, write data, read data, delete data; any number of things could be possible. And it indeed seems that this penetration-testing team noticed that, well, gee, it seems that Cisco's fix for this problem is just to blacklist a certain user agent, so to speak. And a user agent is a term of art, in HTTP, that refers to a string. A unique string that's passed from client to server that says, I am Chrome. Or, I am Safari. Or, I am Firefox. Or, in this case, I am curl.
And this is useful just statistically, so that servers can actually keep track of who's using which operating system, which browser, which piece of software, and so forth. But this is entirely the honor system, right? Every HTTP header in a packet from client to server could be forged. You can write code to do this, or you can even run curl to do this using a command-line argument. And so with dash capital A you can change your user agent string. And so what RedTeam actually did, with the proof of concept here, is, if you read the advisory, you'll see that they just changed it from curl with a C to kurl with a K, just to be cute.

[LAUGHTER]

That demonstrated that, essentially, the regular expression that Cisco had built into the server software, nginx, was just checking for C-U-R-L. So if you literally pass in anything other than C-U-R-L, for instance K-U-R-L, that request actually gets through. They didn't actually fix the underlying bug.

COLTON OGDEN: Yeah. It's such a heavy-handed, but also such a simple, naive approach too.

DAVID MALAN: It really is. And here too, yeah, I don't necessarily fault the developer, because this is a mistake I might have made. I might still make, perhaps. A CS50 student might make, certainly shortly after graduating from the course. You need to be taught these things. You need to realize these things from news articles, or discussions thereof. But someone should have caught this. This, also, not only being a technical mistake, is a procedural mistake. How did this slip through? Someone, hopefully, and yet tragically, reviewed this code and said, yes, ship it. This is OK. And that seems to be where the threat really is.

COLTON OGDEN: Yeah, it's almost operating under the assumption that user agents are baked in permanently. That they're immutable by default. Which clearly is not the case.
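[To see just how weak that check is, a short, hypothetical Python sketch: a blacklist like the one described in the advisory, and a client-side request that sails past it by renaming its user agent. example.com is a placeholder, not the actual Cisco endpoint.]

```python
# Sketch of why blacklisting the literal string "curl" is futile.
import re
import requests

# Server side: a naive filter like the one described, matching only
# the substring "curl" in the User-Agent header.
blocked = re.compile(r"curl", re.IGNORECASE)

print(bool(blocked.search("curl/7.64.1")))  # True: default curl is caught
print(bool(blocked.search("Kurl/7.64.1")))  # False: one letter changed, allowed

# Client side: forging the User-Agent header is a one-liner.
requests.get("https://example.com/", headers={"User-Agent": "Kurl/7.64.1"})
```

[The equivalent with the real curl binary is just `curl -A "Kurl/7.64.1" https://example.com/`, using the "dash capital A" flag mentioned above, which is essentially what RedTeam's proof of concept did.]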
DAVID MALAN: Well, to be honest, not to get all lofty, but I'd like to think that in CS50, this is one of the things we do try to do. Not just with this topic, but many others. We really try to introduce students to low-level primitives. Case in point, we use C, which is about as close to the hardware as you can get before you drop down into assembly code and actual machine instructions. And I think via that bottom-up exploration, you begin, with higher probability, hopefully, than otherwise, to think about what threats might be, right? Even if you don't necessarily know that much about HTTP, you just know that there are these text-based messages going back and forth from client to server. At some point, it probably starts to dawn on you: well, wait a minute. If I can write software that generates requests, maybe I can just forge these requests. And indeed, all I have to do is print out this string instead of this other one. We can't possibly, in CS50 or any course, teach everyone something about everything. But if you instead focus more on the primitives, the underlying building blocks: what is HTTP, what is a header, what is a TCP client. Then students can begin to assemble for themselves, critically, what is actually possible and what those threats actually are.

COLTON OGDEN: Yeah. Pretty amusing. Pretty depressing altogether, seeing all of these things.

DAVID MALAN: And these are the things we're seeing, right? This is thanks to companies and people, like RedTeam, who actually noticed something like this. Can you even imagine how omnipresent these mistakes are that we just haven't discovered yet?

COLTON OGDEN: Yeah. Thank goodness for folks like RedTeam, and folks that are paying attention to the validity of things like the Office Depot scans, and question why Facebook is asking for their email passwords, and bring it to the public consciousness. Because otherwise this would be a little bit trickier. A lot of bad things might be happening underneath us, and we would be none the wiser about it.

DAVID MALAN: Yeah. Well, and there's a term of art in tech: white hats, or ethical hackers, so to speak.
People whose job it is, or mission in life, is to actually think like an adversary. Or sort of pretend to be the bad guy, at least in your mind, but to use those powers for good. And to actually build a business, or a reputation, around discovering these kinds of things. And honestly, it's taken the industry some time to get comfortable with this idea. Especially with outsiders. There's another term, bounties, for instance. And some companies, not all, will actually offer you a few hundred dollars, a few thousand dollars, if you identify, in a responsible way, some security hole in their software. Report it via the appropriate channels, not Twitter, but via email or some web form, and allow them a reasonable amount of time to fix the problem. And I think a lot of companies might be scared to invite that kind of attention on their code. But it probably is a net positive, and you get a lot of smart people trying to help you help yourself. The worrisome part is that if you just leave it to the bad guys, they're not going to be telling you when they find these mistakes. They're just going to be attacking your systems and your product.

COLTON OGDEN: And it's going to be hard. And this is something that I know you've mentioned many times. But it is practically infinitely easier being an attacker than it is being a defender in the computer science realm.

DAVID MALAN: Yeah. We are on the losing end of this against the adversaries. We, if I may be so bold as to call us the good guys, have to be perfect. We have to find and fix every possible mistake in our code. Every possible exploit. Fix every possible bug. But all the adversaries need to find is just one oversight, one mistake. It's like your house: if you've got all the doors locked and deadbolted, and you've got the alarm system on, but you have got one window open already that the person can slip through, none of the other stuff matters. It all reduces to the weakest link, so to speak.

COLTON OGDEN: It's so brutally unfair.

DAVID MALAN: Yeah.
But I think that's why talking about this, and emphasizing themes in computer science classes, like that of trade-offs and that of security itself, just gets people thinking more consciously. Because at the end of the day, it's just a cost, right? You could put bars on your windows, which would partly mitigate that threat, but there's a physical cost there. There's an aesthetic cost. And so at some point you just have to draw the line. But security really is all about raising the cost to the adversary. Either financially, or time-wise, resource-wise. And just making it no longer worth their while to attack you. I think there's an expression along the lines of, security is all about getting the adversary to attack someone else. Right? Because if the price they must pay to attack you is too high, they're indeed going to turn their attention elsewhere. And so that's perhaps a bit of a perverse way of thinking about it, but that's how a logical adversary would presumably think about it.

COLTON OGDEN: Yeah. Even Nick and I were talking when we did a stream on Kali Linux recently. Kali Linux is a version of Linux that has some tools built into it to help folks get into penetration testing. And he was saying that one of the biggest, easiest ways to get adversaries to stop messing with you is to just choose extremely secure passwords. And along those lines, generally speaking, just adopt as secure practices as you can. Don't do a lot of things that are very easy to guess, basically.

DAVID MALAN: Yeah. No, absolutely. Right. Because if they're running an attack script on your server and my server, and I have the longer, more secure password, the adversary is going to get into your server and not mine, and then start focusing on you. So, phew! I escaped detection there.

COLTON OGDEN: You've deflected the burden. And then ideally, in a world where your neighbor also does the same thing, and so on and so forth. In a theoretical model, attackers at least aren't doing nearly as much damage as they are now, because they just can't find anybody to attack.
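[On that note, a minimal sketch of generating such a password with Python's standard secrets module; the length and alphabet here are arbitrary choices, not any particular standard.]

```python
# Sketch: generate a high-entropy random password with the standard
# library's secrets module (designed for security use, unlike random).
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def random_password(length: int = 24) -> str:
    # ~6.5 bits of entropy per character with this 94-symbol alphabet,
    # so 24 characters is comfortably beyond practical guessing.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(random_password())
```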
DAVID MALAN: Right. No. So I think, ideally, you want, societally, to sort of raise the cost all around, and help each other patch these holes. Because it does no one any good if attacks are being waged from other people's servers. Case in point: worse than a denial-of-service attack, or DoS, is typically a DDoS, a distributed denial-of-service attack, which is the act of an adversary somehow taking over multiple machines and using those multiple machines to attack one or some number of other parties. So it does not behoove me to allow your house to be broken into, so to speak, or your server to be compromised, because I could then be the next victim. Because your machine is now part of the threat.

COLTON OGDEN: And you did a lot of this in your PhD, right, with botnets?

DAVID MALAN: Yeah. A botnet is a collection of servers that has somehow been taken over by an adversary, by some virus or worm running on those systems. And a botnet is really just a silly term describing a whole collection of servers that have been commandeered by some adversary via some software. And it's among the scarier threats, because via software commands, that botnet can do anything that a piece of software can do, including attack other systems.

COLTON OGDEN: Yeah. It's pretty frightening. We will try and maybe segue into something slightly less frightening.

DAVID MALAN: How can we play the puppy sound now?

COLTON OGDEN: This could still have a slightly negative connotation, depending on how you look at it. But I was doing a little bit of research within the last couple of weeks. Google has launched a sort of announcement of, or preview for, a feature called AMP, Accelerated Mobile Pages, for Gmail. Whereby, within an email, you can sort of embed these cool, web-page-looking, functionality-driven mini pages, I guess.

DAVID MALAN: Yeah. Much more interactive snippets of HTML-like code in emails, especially, that make them, indeed, more interactive, clickable, more visually interesting.
734 00:29:58,460 --> 00:30:00,460 You know, I kind of have mixed feelings on this.
735 00:30:00,460 --> 00:30:04,360 Because on the one hand, you're just describing the web.
736 00:30:04,360 --> 00:30:07,490 And we can certainly just have users click on a link in an email
737 00:30:07,490 --> 00:30:09,253 and open up a full-fledged web browser.
738 00:30:09,253 --> 00:30:12,170 And with it all of the protections that are in place from the browsers
739 00:30:12,170 --> 00:30:14,600 as to what JavaScript, for instance, can and cannot do.
740 00:30:14,600 --> 00:30:17,210 What HTML and CSS can and cannot do.
741 00:30:17,210 --> 00:30:20,120 And AMP is proposing to add some additional features, essentially
742 00:30:20,120 --> 00:30:24,050 by way of additional HTML attributes, and properties, and so forth,
743 00:30:24,050 --> 00:30:28,250 and other features, into products like Gmail.
744 00:30:28,250 --> 00:30:31,640 And so, it's still kind of appealing to me.
745 00:30:31,640 --> 00:30:33,680 You even used the word cool, because it does
746 00:30:33,680 --> 00:30:36,560 make a static tool, that's been static for a very
747 00:30:36,560 --> 00:30:38,780 long time, a little more interactive.
748 00:30:38,780 --> 00:30:44,900 And I kind of am willing to accept that, because of the coolness, as you say.
749 00:30:44,900 --> 00:30:48,470 But really, it's the additional features you can get.
750 00:30:48,470 --> 00:30:51,230 You can get carousels of images within an email
751 00:30:51,230 --> 00:30:53,120 so that you can stay in the email.
752 00:30:53,120 --> 00:30:54,260 See a bunch of images.
753 00:30:54,260 --> 00:30:56,060 Maybe it's an album that someone posted.
754 00:30:56,060 --> 00:30:56,720 Maybe worse.
755 00:30:56,720 --> 00:30:59,570 It's a series of ads or products that you want to flip through,
756 00:30:59,570 --> 00:31:01,070 but it makes them more interactive.
757 00:31:01,070 --> 00:31:07,070 Which, keeping me in situ, in the same place, is probably a compelling thing.
758 00:31:07,070 --> 00:31:09,517 We use Slack, for instance, within CS50 to communicate.
759 00:31:09,517 --> 00:31:10,850 Which is a chat-based mechanism.
760 00:31:10,850 --> 00:31:12,710 And Slack has done an amazing job at adding
761 00:31:12,710 --> 00:31:15,860 integrations, or supporting integrations, via their API.
762 00:31:15,860 --> 00:31:18,935 Where you can have other products tap into Slack.
763 00:31:18,935 --> 00:31:20,810 So that you can stay in the Slack environment
764 00:31:20,810 --> 00:31:24,320 and execute commands that somehow influence those other products.
765 00:31:24,320 --> 00:31:28,250 And it's really convenient, honestly, to, for instance, get a Slack message
766 00:31:28,250 --> 00:31:30,230 and be able to respond in that chat window,
767 00:31:30,230 --> 00:31:33,413 but have it posted to some other web server or some other tool.
768 00:31:33,413 --> 00:31:34,580 And so even I appreciate it,
769 00:31:34,580 --> 00:31:36,170 even though it's such a marginal difference.
770 00:31:36,170 --> 00:31:39,920 I can absolutely just click a link, go to a web page, and do the same thing.
771 00:31:39,920 --> 00:31:41,190 But I'm doing a lot per day.
772 00:31:41,190 --> 00:31:42,440 We're not getting any younger.
773 00:31:42,440 --> 00:31:44,565 There's only a finite number of seconds in the day.
774 00:31:44,565 --> 00:31:47,480 Every few seconds I don't have to spend doing tedious work
775 00:31:47,480 --> 00:31:49,868 is probably some compelling time saved.
776 00:31:49,868 --> 00:31:50,660 COLTON OGDEN: Yeah.
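For the curious, here is a minimal sketch of what the sending side of such an integration can look like, using Slack's incoming webhooks: an external tool POSTs a small JSON payload, and the message shows up in a channel. The webhook URL below is a placeholder; Slack issues a real one per workspace when you enable the integration.

    import json
    import urllib.request

    # Placeholder URL; Slack generates the real path when you create the webhook.
    WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

    def notify_slack(text: str) -> None:
        """POST a simple text message to whatever channel the webhook is bound to."""
        request = urllib.request.Request(
            WEBHOOK_URL,
            data=json.dumps({"text": text}).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(request)  # Slack responds with "ok" on success

    # e.g., some other product tapping into the chat window:
    notify_slack("Deploy finished -- everything looks healthy.")

Because the integration is just HTTP, any tool can meet you inside the chat window instead of making you switch contexts.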
777 00:31:50,660 --> 00:31:55,370 I think altogether I do agree that it is a cool, dynamic addition to Gmail.
778 00:31:55,370 --> 00:31:58,580 The one thing that I was thinking about as I looked at it
779 00:31:58,580 --> 00:32:02,190 was, would this make phishing attacks easier?
780 00:32:02,190 --> 00:32:02,940 DAVID MALAN: Yeah.
781 00:32:02,940 --> 00:32:03,680 Probably.
782 00:32:03,680 --> 00:32:06,710 Let's just assume, yes.
783 00:32:06,710 --> 00:32:08,252 With every good thing comes some bad.
784 00:32:08,252 --> 00:32:08,752 And, yeah.
785 00:32:08,752 --> 00:32:11,450 Because one of the features, too, besides carousels of images,
786 00:32:11,450 --> 00:32:14,000 is that you can actually embed forms, for instance, in the email.
787 00:32:14,000 --> 00:32:17,900 And allow the user to submit those forms within the email itself.
788 00:32:17,900 --> 00:32:18,620 And, yeah.
789 00:32:18,620 --> 00:32:20,745 I'm guessing the first threat we
790 00:32:20,745 --> 00:32:24,260 see is going to be someone tricking users into actually typing information
791 00:32:24,260 --> 00:32:25,478 into those forms.
792 00:32:25,478 --> 00:32:26,270 COLTON OGDEN: Yeah.
793 00:32:26,270 --> 00:32:29,840 I was just visualizing in my head an email from some malicious actor.
794 00:32:29,840 --> 00:32:31,780 But it's the Twitter login page.
795 00:32:31,780 --> 00:32:35,720 Oh, log into your Twitter to verify your account or whatnot.
796 00:32:35,720 --> 00:32:38,300 And that leads to some badguy.com or whatever.
797 00:32:38,300 --> 00:32:41,480 It seems like now it's all the easier for this because of that embedded form
798 00:32:41,480 --> 00:32:42,063 functionality.
799 00:32:42,063 --> 00:32:43,100 DAVID MALAN: Absolutely.
800 00:32:43,100 --> 00:32:44,000 No, I agree.
801 00:32:44,000 --> 00:32:45,988 No, and I think they could easily pretend
802 00:32:45,988 --> 00:32:48,530 to be some bank that they aren't, and actually then trick you
803 00:32:48,530 --> 00:32:51,620 into typing in your credentials, your account number, or some information
804 00:32:51,620 --> 00:32:52,148 like that.
805 00:32:52,148 --> 00:32:52,940 COLTON OGDEN: Sure.
806 00:32:52,940 --> 00:32:56,210 One of the last things that I think we might want to talk about
807 00:32:56,210 --> 00:33:00,950 before we wrap up is, within the last few years it's been common, I think,
808 00:33:00,950 --> 00:33:04,340 to start seeing the big fish open source a lot of their technologies.
809 00:33:04,340 --> 00:33:06,080 Facebook open sourced React.
810 00:33:06,080 --> 00:33:11,000 Microsoft has open sourced VS Code, which is now, I think,
811 00:33:11,000 --> 00:33:14,960 per number of stars, actually the top text editor on GitHub.
812 00:33:14,960 --> 00:33:19,700 And now Uber recently open sourced their resource scheduler, Peloton.
813 00:33:19,700 --> 00:33:22,500 And we won't necessarily go into the specifics on Peloton.
814 00:33:22,500 --> 00:33:26,630 But I wanted to get your thoughts on, are these big companies
815 00:33:26,630 --> 00:33:28,520 necessarily obligated to do this?
816 00:33:28,520 --> 00:33:30,485 Is this a good move on their part?
817 00:33:30,485 --> 00:33:31,610 How do you feel about this?
818 00:33:31,610 --> 00:33:32,950 DAVID MALAN: Open sourcing their software?
819 00:33:32,950 --> 00:33:35,325 COLTON OGDEN: Open sourcing some of their tools at least.
820 00:33:35,325 --> 00:33:37,640 DAVID MALAN: I am a fan of open source software.
821 00:33:37,640 --> 00:33:41,510 It is a wonderful entry point for aspiring programmers
822 00:33:41,510 --> 00:33:44,380 to cut their teeth on a product to which they have access.
823 00:33:44,380 --> 00:33:46,880 They can contribute back and kind of put a toe in the water.
824 00:33:46,880 --> 00:33:49,067 And learn more about real-world software,
825 00:33:49,067 --> 00:33:51,650 especially if they're not quite ready or don't yet have access
826 00:33:51,650 --> 00:33:54,410 to like a full-fledged developer job.
827 00:33:54,410 --> 00:33:56,500 Free, I think, is a very compelling thing,
828 00:33:56,500 --> 00:33:59,510 and allows so many more people to solve problems
829 00:33:59,510 --> 00:34:01,040 using some common functionality.
830 00:34:01,040 --> 00:34:02,690 Libraries, frameworks, and so forth.
831 00:34:02,690 --> 00:34:04,400 You mentioned React, for instance.
832 00:34:04,400 --> 00:34:06,620 It's a wonderful set of shoulders you can stand on
833 00:34:06,620 --> 00:34:10,453 to do something even cooler and more impactful yourself.
834 00:34:10,453 --> 00:34:13,370 And I think too it's a shame that we have so many companies out there,
835 00:34:13,370 --> 00:34:16,280 in general, writing software that isn't necessarily
836 00:34:16,280 --> 00:34:17,880 juicy intellectual property.
837 00:34:17,880 --> 00:34:21,290 It's not the core of their business, but it's a commodity-type problem
838 00:34:21,290 --> 00:34:23,480 that others might benefit from.
839 00:34:23,480 --> 00:34:25,909 For instance, mapping tools from someone like Google.
840 00:34:25,909 --> 00:34:27,739 And some of the work, certainly, that Uber
841 00:34:27,739 --> 00:34:31,610 is now doing when it comes to their services.
842 00:34:31,610 --> 00:34:35,179 And so there's sort of a social good to open sourcing that.
843 00:34:35,179 --> 00:34:38,120 Because we humans have a finite amount of time on this earth.
844 00:34:38,120 --> 00:34:41,030 We might as well stand on each other's shoulders as best we can.
845 00:34:41,030 --> 00:34:45,710 And move ourselves forward, and hope that, via karma and sort of collaboration,
846 00:34:45,710 --> 00:34:48,440 that will benefit us in turn too.
847 00:34:48,440 --> 00:34:50,489 By our having initiated the same.
848 00:34:50,489 --> 00:34:53,449 So I think in principle it's a good thing.
849 00:34:53,449 --> 00:34:56,150 With that said, I think there are some costs.
850 00:34:56,150 --> 00:34:59,150 Even you and I, when we've written code, have been embarrassed by it
851 00:34:59,150 --> 00:35:00,238 sometimes, if I might say.
852 00:35:00,238 --> 00:35:02,030 And I've written things that I don't really
853 00:35:02,030 --> 00:35:05,480 want open sourced, because it's going to take me a non-trivial amount of time
854 00:35:05,480 --> 00:35:09,320 to go clean it up, and comment it, and really feel proud of it.
855 00:35:09,320 --> 00:35:13,100 Such that I'd be comfy saying, hello, world, I wrote this code.
856 00:35:13,100 --> 00:35:14,120 So there's that price.
857 00:35:14,120 --> 00:35:15,690 And a lot of companies might think,
858 00:35:15,690 --> 00:35:16,690 that's not our business.
859 00:35:16,690 --> 00:35:17,940 That doesn't generate revenue.
860 00:35:17,940 --> 00:35:20,000 That's not a good use of our limited human time.
861 00:35:20,000 --> 00:35:22,310 So I can appreciate the tension, but I think finding
862 00:35:22,310 --> 00:35:23,727 that balance is pretty compelling.
863 00:35:23,727 --> 00:35:25,080 COLTON OGDEN: Yeah, I agree.
864 00:35:25,080 --> 00:35:25,610 I think so.
865 00:35:25,610 --> 00:35:30,740 And given that so many large actors have been open sourcing,
866 00:35:30,740 --> 00:35:35,700 I guess companies may start to get a little bit of pressure to think,
867 00:35:35,700 --> 00:35:39,158 oh, all these other companies have all these awesome projects out there
868 00:35:39,158 --> 00:35:41,450 that people are using, and seeing, and contributing to,
869 00:35:41,450 --> 00:35:42,827 but we don't have anything.
870 00:35:42,827 --> 00:35:44,660 Do you think that's something that companies
871 00:35:44,660 --> 00:35:46,203 should, and will, worry about?
872 00:35:46,203 --> 00:35:47,870 DAVID MALAN: I don't know, to be honest.
873 00:35:47,870 --> 00:35:50,090 It's a worthy experiment to see.
874 00:35:50,090 --> 00:35:53,930 I think that it's a potential recruiting tool, right.
875 00:35:53,930 --> 00:35:57,020 If you gain exposure to some company because you
876 00:35:57,020 --> 00:35:59,970 are using, or looking at, or contributing to their software,
877 00:35:59,970 --> 00:36:01,970 it feels like a very natural next step to aspire
878 00:36:01,970 --> 00:36:04,280 to have a part-time or full-time job with them.
879 00:36:04,280 --> 00:36:08,120 And so honestly, strategically, it might help you identify amazing developers,
880 00:36:08,120 --> 00:36:11,270 because you have these volunteers essentially initially contributing
881 00:36:11,270 --> 00:36:13,470 freely to your product via open source.
882 00:36:13,470 --> 00:36:15,200 Whether it's on GitHub or somewhere else.
883 00:36:15,200 --> 00:36:17,690 Submitting pull requests, and participating in issues,
884 00:36:17,690 --> 00:36:18,980 and reporting bugs.
885 00:36:18,980 --> 00:36:22,230 And you kind of get to know someone and then can very comfortably say,
886 00:36:22,230 --> 00:36:22,967 you know what?
887 00:36:22,967 --> 00:36:25,800 Why don't you come onto our side of the fence and do this full time?
888 00:36:25,800 --> 00:36:28,280 So I don't know if that's a theoretical upside, or an actual one,
889 00:36:28,280 --> 00:36:30,350 but it certainly feels worth trying on some scale.
890 00:36:30,350 --> 00:36:32,870 COLTON OGDEN: And that's actually been achieved at CS50 too, right?
891 00:36:32,870 --> 00:36:35,480 Folks like Kareem, and Chad, and other folks that have worked with us.
892 00:36:35,480 --> 00:36:35,960 DAVID MALAN: Yeah.
893 00:36:35,960 --> 00:36:38,030 And to be honest, that was actually very organic.
894 00:36:38,030 --> 00:36:42,003 It wasn't part of some overall clever strategy, I'll admit.
895 00:36:42,003 --> 00:36:45,170 We discovered Chad Sharp because he was submitting pull requests and opening
896 00:36:45,170 --> 00:36:47,060 issues on some of our open source libraries.
897 00:36:47,060 --> 00:36:50,750 And Kareem, of course, was contributing so actively in CS50's Facebook group
898 00:36:50,750 --> 00:36:52,490 and later to software development.
899 00:36:52,490 --> 00:36:54,320 And it's a great way to get to know someone
900 00:36:54,320 --> 00:36:57,170 in a way that's not a more traditional 30-minute
901 00:36:57,170 --> 00:37:00,190 interview where everyone's trying to impress the other person.
902 00:37:00,190 --> 00:37:01,940 You don't really have a sense of what
903 00:37:01,940 --> 00:37:04,670 it's going to be like to work with this person on a project.
904 00:37:04,670 --> 00:37:08,030 Open source software makes it very easy to get to know, and become friendly
905 00:37:08,030 --> 00:37:11,690 with, people, and technically collaborate with people in a way
906 00:37:11,690 --> 00:37:14,718 that a whiteboard and a conference room don't really allow.
907 00:37:14,718 --> 00:37:15,510 COLTON OGDEN: Yeah.
908 00:37:15,510 --> 00:37:16,468 It's much more organic.
909 00:37:16,468 --> 00:37:17,560 DAVID MALAN: Yeah.
910 00:37:17,560 --> 00:37:21,590 Hey, case in point, Colton and I met primarily
911 00:37:21,590 --> 00:37:23,675 by playing Scramble with Friends, as I recall, too--
912 00:37:23,675 --> 00:37:26,300 COLTON OGDEN: That's a very professional way to get acquainted.
913 00:37:26,300 --> 00:37:29,130 DAVID MALAN: I noticed you're very good with words, and here we are talking.
914 00:37:29,130 --> 00:37:30,510 COLTON OGDEN: I don't even know if that's true.
915 00:37:30,510 --> 00:37:31,220 DAVID MALAN: OK.
916 00:37:31,220 --> 00:37:33,020 Well, I'm trying to say something nice.
917 00:37:33,020 --> 00:37:34,877 COLTON OGDEN: I appreciate it.
918 00:37:34,877 --> 00:37:37,960 I have fond memories of losing almost every single match of that, actually.
919 00:37:37,960 --> 00:37:39,260 DAVID MALAN: But you kept trying.
920 00:37:39,260 --> 00:37:40,340 And that was what we were looking for.
921 00:37:40,340 --> 00:37:41,380 COLTON OGDEN: I think it was the banter.
922 00:37:41,380 --> 00:37:42,505 It was the friendly banter.
923 00:37:42,505 --> 00:37:44,213 DAVID MALAN: Well, and to be fair too, we
924 00:37:44,213 --> 00:37:46,850 got to chatting early on with CS50 because you offered to get
925 00:37:46,850 --> 00:37:48,410 involved with the transcriptions.
926 00:37:48,410 --> 00:37:50,705 And helping us actually caption things for folks
927 00:37:50,705 --> 00:37:52,580 for whom English is a second language, or who
928 00:37:52,580 --> 00:37:54,470 would need it for accessibility's sake.
929 00:37:54,470 --> 00:37:56,943 So that was a very worthy contribution as well early on.
930 00:37:56,943 --> 00:37:58,110 COLTON OGDEN: It's been fun.
931 00:37:58,110 --> 00:37:58,735 It's been good.
932 00:37:58,735 --> 00:38:02,000 And we've only done more and more in that domain.
933 00:38:02,000 --> 00:38:02,990 Which is great.
934 00:38:02,990 --> 00:38:03,650 DAVID MALAN: Case in point.
935 00:38:03,650 --> 00:38:05,733 I have not played Scramble with Friends for years.
936 00:38:05,733 --> 00:38:07,250 COLTON OGDEN: We haven't had time.
937 00:38:07,250 --> 00:38:09,917 Well, I think that's a good amount of topics to cover, actually.
938 00:38:09,917 --> 00:38:13,000 Are there any takeaways that you'd like to share with the folks listening in?
939 00:38:13,000 --> 00:38:14,000 DAVID MALAN: Be afraid.
940 00:38:14,000 --> 00:38:14,870 Be very afraid.
941 00:38:14,870 --> 00:38:16,970 COLTON OGDEN: People are going to stop tuning in.
942 00:38:16,970 --> 00:38:17,720 They're going to get too depressed.
943 00:38:17,720 --> 00:38:18,320 DAVID MALAN: That is true.
944 00:38:18,320 --> 00:38:20,270 Go Google some puppies right now if you could.
945 00:38:20,270 --> 00:38:24,140 But no, I think the real takeaway is to just be more thoughtful and deliberate
946 00:38:24,140 --> 00:38:25,370 about choices you make.
947 00:38:25,370 --> 00:38:27,860 And yes, there is the theoretical risk that data
948 00:38:27,860 --> 00:38:31,130 you're inputting into a website like Facebook might be misused.
949 00:38:31,130 --> 00:38:35,340 But does the value you're gaining by using that tool perhaps outweigh that?
950 00:38:35,340 --> 00:38:37,670 And as we discussed earlier, you want to make
951 00:38:37,670 --> 00:38:40,910 sure that you're not making all of those locally optimal decisions
952 00:38:40,910 --> 00:38:42,170 again, and again, and again.
953 00:38:42,170 --> 00:38:45,390 Such that weeks or months later, you look back and realize, wow,
954 00:38:45,390 --> 00:38:48,200 I globally made a very poor decision, because now this website
955 00:38:48,200 --> 00:38:50,060 knows everything about me.
956 00:38:50,060 --> 00:38:52,850 So I think just making very conscious choices along the way,
957 00:38:52,850 --> 00:38:56,240 and realizing what prices you're paying, and what benefits you're getting,
958 00:38:56,240 --> 00:38:57,140 is the real takeaway.
959 00:38:57,140 --> 00:38:58,580 Because these threats have been omnipresent.
960 00:38:58,580 --> 00:39:01,108 Not just in tech, but in the physical real world as well.
961 00:39:01,108 --> 00:39:03,650 And so I think being sensitized to them is the real takeaway.
962 00:39:03,650 --> 00:39:06,483 COLTON OGDEN: Getting sensitized and getting educated too, probably.
963 00:39:06,483 --> 00:39:07,715 DAVID MALAN: Absolutely.
964 00:39:07,715 --> 00:39:10,340 COLTON OGDEN: Tune into places like Hacker News.
965 00:39:10,340 --> 00:39:12,290 Tune into the CS50 podcast.
966 00:39:12,290 --> 00:39:14,150 This was episode 1.
967 00:39:14,150 --> 00:39:15,440 It was a pleasure.
968 00:39:15,440 --> 00:39:18,110 And I look forward to doing the next episode with you.
969 00:39:18,110 --> 00:39:19,970 DAVID MALAN: Chat with you all soon.
970 00:39:19,970 --> 00:39:20,470