[MUSIC PLAYING]

DAVID MALAN: This is CS50.

Hello, world. This is the CS50 Podcast. My name is David Malan, and this is Episode Zero, our very first, and I'm joined here by CS50's own Colt--

COLTON OGDEN: Yep.

[LAUGHTER]

Colton Ogden. This is an interesting new direction that we're going in.

DAVID MALAN: Yeah, it's one that we clearly haven't rehearsed. So, but what we thought we'd do with the CS50 Podcast is really focus on the week's current events as they relate to technology, use this as an opportunity to talk about the implications of various technologies, and really explain things as they come up, but really in a non-visual way. And so, perhaps, I think the topics Colton and I'll hit on here will focus on things you, yourself, might have read in the news that maybe didn't register necessarily, or maybe you didn't really understand how it pertains to technologies that you, yourself, use.

COLTON OGDEN: Yeah, and I think that ties as well to before, when we did CS50 Live, and this was kind of the same idea.

DAVID MALAN: Yeah, absolutely. Whereas CS50 Live, when we did it on video, was much more visual-- we prepared slides, we actually looked at sample videos and such-- here, we thought we'd really try to focus on ideas. And it'll be up to you to decide if this works well or not well, but we come prepared with a look at some of the past week's news. And why don't we get right into it?

COLTON OGDEN: Yeah, absolutely. One of the things I noticed, actually, is-- I put together this list of topics, but the one thing that I didn't put in here that you actually found and put in here, today, was something about Facebook passwords.
DAVID MALAN: Yeah, so a website named Krebs on Security-- the author of this was contacted, apparently, by some employee-- presumably a current employee of Facebook-- who revealed to him that during some recent audit of their security processes, they discovered that for like seven years, since 2012, one or more processes inside of Facebook had been storing passwords-- users' passwords, like yours and mine potentially-- in the clear, so to speak, clear text, not ciphertext, which means unencrypted, in some database or some file somewhere.

COLTON OGDEN: Typically, people will use some sort of hashing algorithm to store things cryptographically and much more securely?

DAVID MALAN: Indeed. Even like ROT13, like rotate every character 13 places, would have been arguably more secure. And there's not a huge amount of technical detail out there. If you go to krebsonsecurity.com, you can actually dig up the blog post itself. And then Facebook actually did respond, and I think there's a link in Krebs on Security to the Facebook announcement. But the Facebook announcement, which is on newsroom.fb.com, is, to be honest, pretty nondescript and really doesn't-- I mean, it's kind of disingenuous. They seem to use this as an opportunity to talk about best practices when it comes to passwords and all of the various other mechanisms that they have in place to help you secure your password. And yet, they really kind of didn't address the topic at hand, which is, well, despite all of those mechanisms, you were storing our passwords in the clear-- or at least those of millions of Facebook users, particularly on Facebook Lite, a lighter-weight version of the app that's useful in low-bandwidth locations or where bandwidth is very expensive or slow.

COLTON OGDEN: So this strikes you sort of as an opportunity for them to, what, hand-wave over the issue and sort of distract people? Is that sort of how this rubs you?

DAVID MALAN: Yeah, maybe. I think they, you know, acknowledged the issue, but then used this as an opportunity to emphasize all of the things that are being done well.
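To make Colton's hashing point concrete, here is a minimal sketch in Python of salted, deliberately slow password hashing, using only the standard library. This is illustrative only, not Facebook's actual code, and the function names are our own:

    import hashlib
    import hmac
    import os

    def hash_password(password: str) -> tuple[bytes, bytes]:
        # A random, per-user salt ensures that two users with the same
        # password still end up with different digests in the database.
        salt = os.urandom(16)
        # PBKDF2 iterates SHA-256 many times, making brute force costly.
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        # Constant-time comparison avoids leaking information via timing.
        return hmac.compare_digest(candidate, digest)

    # Only (salt, digest) is ever stored; the plaintext is never written down.
    # By contrast, ROT13 is trivially reversible: codecs.encode(s, "rot_13").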
DAVID MALAN: And that's fine, but I think the world is done a disservice when companies aren't just candid with their mea culpas and what they got wrong. I think there's learning opportunities, and, as I read this, there's really little for me as a technical person or as an aspiring programmer to really learn from, other than the high-order bit, which is: encrypt your passwords. But how did this happen? What are the processes that failed? I mean, if companies like Facebook can't get this right, how can little old me, an aspiring programmer, get these kinds of details right? I wonder.

COLTON OGDEN: So an article more about how they failed and how they could address it, and how other companies could address it-- you think that would've been more productive?

DAVID MALAN: I think so. I mean, postmortems, as they're called in many contexts, including in tech-- I've always really admired companies that, when they do have some significant mistake or human error, own up to it and explain in technical terms exactly what went wrong. They can still have a more layman's explanation of the problem too, since most people might only take an interest in that level of detail. But for the technophiles and for the students and the aspiring technophiles out there, I think it's just appreciated. And these are such teachable moments and all that-- but I would respect the person, the company, all the more if they really just explained what it is they failed at so that we can all learn from it and not repeat those mistakes.

COLTON OGDEN: If a large company like Facebook is doing something like this, how prevalent do you think this practice is in the real world?

DAVID MALAN: Yeah. Oh my God. I mean, probably frighteningly common, and it's just that if you have fewer users or fewer eyes on the company, you probably just notice these things less frequently. But I do think things are changing.
I mean, with laws like GDPR in the EU, the European Union, I think there's increased pressure on companies now-- increased legal pressure-- on them to disclose when these kinds of things happen, and to impose penalties when it does, to therefore discourage this from even happening. And you know, I'm wondering why this audit detected this in 2019, and not in 2012 or 2013 or 2014 and so forth.

COLTON OGDEN: GDPR, did that happen back in 2012? Oh no, that was--

DAVID MALAN: No, this was recent.

COLTON OGDEN: That one came into force-- OK.

DAVID MALAN: It's in recent months, actually, that this has been rolled out.

COLTON OGDEN: Was this-- is this related at all to the proliferation, now, of cookie messages that you see on websites?

DAVID MALAN: That part's US-specific, where I believe it's now being enforced. Because that actually has been around for quite some time in Europe. Anytime you took your laptop abroad, for instance, you would notice that almost every darn site asks you, hey, can we store cookies? And honestly, that's a very annoying and almost silly manifestation of it, because the reality is, as you know-- I mean, the web doesn't work without cookies, or at least dynamic applications don't work. And anyone who's taken CS50, or who's done a bit of web programming, really, in any language, knows that the only way to maintain state in most HTTP-based applications is with cookies. So, I mean, we've created a culture where people just dismiss yet another message, and I don't think that's a net positive either.

COLTON OGDEN: I think I see a lot, too, of the messages that say, by continuing to use this site, you acknowledge that we have access to whatever information, using cookies, and so on. So I almost think that they do it already and sort of legally can get away with it by having this message visible.

DAVID MALAN: Yeah, I mean, it's like cigarette ads, where abroad, before the US, there was much more, I presume, law around having to have very scary warnings on packages. And companies somewhat cleverly, but somewhat tragically, kind of steered into that and really owned that and put on the scariest of messages.
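Since HTTP itself is stateless, here is a minimal sketch of how a cookie maintains state across requests, written with Flask, the third-party Python microframework CS50 itself uses; the route and cookie name are just illustrative:

    from flask import Flask, make_response, request

    app = Flask(__name__)

    @app.route("/")
    def index():
        # Without the cookie, every request would look like a new visitor.
        visits = int(request.cookies.get("visits", "0")) + 1
        response = make_response(f"You have visited {visits} time(s).")
        # The Set-Cookie header asks the browser to present this value on
        # every subsequent request; that round trip is what "state" means here.
        response.set_cookie("visits", str(visits))
        return response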
DAVID MALAN: And it-- you almost become desensitized to it, because it's just so silly and it's so over the top, you know: smoking kills. And then, here's the price tag and here's the brand name. Like, you start to look past those kinds of details too, so I'm not sure even that is all that effective. But someone who's looked at this and studied it can perhaps attest quantitatively just how effective it's been.

COLTON OGDEN: Yeah, indeed. Well, scary to know that our passwords may have been stored visibly on somebody's server, at a big website like Facebook. Related to that, another of the topics that I sort of dug into a little bit yesterday-- or not yesterday, a few days ago-- was Gmail Confidential Mode, a new feature that they're starting to roll out.

DAVID MALAN: Yeah. Yeah, I saw that. Just in March, one of the articles on Google's blog discussed this. What, so do you understand what the-- what they're offering now as a service?

COLTON OGDEN: I'd have to read back through the article. So from what I understood, though, it was encrypting not P2P emails, but encrypting the emails sort of towards a proxy, towards a center point, and then forwarding that encrypted email to the other person on the receiving end. But I remember reading in the article that P2P encryption wasn't something that they were actually going to start implementing just yet.

DAVID MALAN: Yeah, and this is-- I mean, this is kind of the illusion of security or confidentiality. In fact, I was just reading, after you sent me this link, on the EFF's website-- the Electronic Frontier Foundation, who are really very progressively minded, security-conscious, privacy-conscious individuals as a group-- they noted how this really isn't confidential. Google, of course, still has access to the plain text of your email. They don't claim to be encrypting it peer-to-peer, so they're not being disingenuous. But I think they, too, are sort of creating and implying a property, confidentiality, that isn't really there.
And what this does, for those unfamiliar, is when you send an email in Gmail, if you or your company enables this feature-- I think it might still be in beta mode or in trial mode.

COLTON OGDEN: I don't think it's officially fully deployed yet, but yeah, [INAUDIBLE].

DAVID MALAN: Yeah, so you can opt into it if you have a corporate account, for instance. It gives you an additional, like, lock icon on the Compose window for an email, where you can say that this message expires, sort of James Bond style, after some number of hours. You can add an SMS code to it, so the human who is receiving it has to type in a code that they get on their cell phone. And so it also prevents users from forwarding it, for instance, therefore accidentally or intentionally sending it to someone else. But there's the fundamental issue, because you start to condition people, potentially, into thinking, oh, this is confidential. No one can see the message that I'm sending or that I've received. And that's just baloney, right? And you or I could take out our cell phone right now and, not screenshot, but photograph anything on our screen. You could certainly highlight and copy-paste it into some other email. And so I think these kinds of features are dangerous if users don't really understand what's going on. And honestly, this is going to be a perfect topic in CS50 itself, or CS50 for MBAs or for JDs at Harvard's graduate schools. Because if you really push on this, what does confidential mean? Well, not really much. You're just kind of-- it's more of a social contract between two people. Like, OK, OK, I won't forward this. It's just raising the bar. It's not stopping anything.

COLTON OGDEN: Part of this kind of reminds me, too, of the point you like to mention in most of the courses that you teach, in that security doesn't really mean much just by virtue of seeing something. If somebody sees a padlock icon in their browser on, let's say, bankofamerica.com, that doesn't necessarily mean that anything that they see is secure.

DAVID MALAN: Yeah, well, that too.
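Google hasn't published Confidential Mode's internals, but the expiring-message idea being described can be sketched minimally in Python as follows; the token scheme and in-memory store are hypothetical stand-ins, not Gmail's actual design:

    import time

    # The provider keeps the message; the recipient only ever gets a link
    # containing a token. Note the provider can still read `body` in the
    # clear: this is expiry and access control, not end-to-end encryption.
    messages: dict[str, tuple[str, float]] = {}

    def store(token: str, body: str, ttl_seconds: int) -> None:
        messages[token] = (body, time.time() + ttl_seconds)

    def fetch(token: str) -> str | None:
        entry = messages.get(token)
        if entry is None:
            return None
        body, expires_at = entry
        if time.time() > expires_at:
            del messages[token]  # the "self-destruct," James Bond style
            return None
        return body

And, per the discussion above, nothing in such a scheme stops a recipient from simply photographing the screen.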
DAVID MALAN: I mean, there too, we humans learned, years ago, that maybe we shouldn't be putting padlocks in places where they have no technical meaning, for exactly that reason. People just assume it means something that it doesn't, so we seem doomed, as humans, to repeat these mistakes. And this isn't to say that I think this is a bad feature. Frankly, I wish that I could somehow signal to recipients of emails I send, sometimes, please don't forward this to someone else, because it's not going to reflect well, or it's going to sound overly harsh, or whatever the email is. You sometimes don't want other people to see it, even if it's not the end of the world if they actually do. But short of writing in all caps, like, do not forward this email, at the very start of your message, most people might not realize. So I think having a software mechanism that says, don't-- not forwardable-- isn't bad. But, you know, it should probably be like, please don't forward, and not imply that this is confidential and no one else is going to see it.

COLTON OGDEN: Do you think that there is some sort of risk involved in making these emails self-destructing, inasmuch as maybe it will bite people in the future when they want to look back on records that are important like this?

DAVID MALAN: Could be. I mean, there too, I suspect there are business motivations for this, for retention policies, where there might be laws or policies in place where companies do or don't want to keep information around, because it can come back to bite them. And so maybe it's a good thing if emails do expire after some amount of time, so long as that's within the letter of the law. But I presume it's motivated, in part, by that. So this is a software technique that helps with that. And so, in that sense, you know, confidential does have that kind of meaning, but it's not secure, and I worry that if you put a padlock on it, that doesn't necessarily mean to people what you think. I mean, so many people, and kids especially, might think, or once thought, that Snapchat messages are indeed ephemeral and they'll disappear. But, ah, I mean, they're still on the servers.

COLTON OGDEN: Undoubtedly.
DAVID MALAN: They can be on the servers. You can snap-- screenshot them or record them with another device. So I think we do humans a disservice if we're not really upfront as to what a feature means and how it works, and I think we should label things appropriately so as to not oversell them.

COLTON OGDEN: And it sort of takes unfortunate advantage of those who are not as technically literate as well, allowing them to sort of-- or at least capitalizing on people taking for granted these things that they assume to be true.

DAVID MALAN: Yeah. I mean, we've been doing this, honestly, as humans, for like, what, 20, 30 years, with DOS. You might recall that when you format a hard drive, which generally means to-- kind of means to erase it and prepare it to have something new installed on it, the command back then, when you used to delete it or fdisk it or whatever it was, was: are you sure you want to proceed? This will erase the entire disk. Something like that, and I think it actually was in all caps. But it was false, technically. Right? All it would do is rewrite part of the headers on disk, but it would leave all of your zeros and ones from previous files there in place. And there, too, we said it would delete or erase information, but it doesn't. And so, for years, maybe to this day, people assume that when you delete something from your Mac or PC, or empty the Recycle Bin or whatnot, that it's gone. But anyone who's taken CS50 knows that's not the case. I mean, we have students recover data in their forensics homework alone.

COLTON OGDEN: You have a background, certainly, in this too. You did this for a few years.

DAVID MALAN: Yeah, or a couple-- a year or two. Yeah, yeah, in graduate school.

COLTON OGDEN: If you were to advise our listeners on the best way to sort of format their hard drive and avoid this fallacy, what would be your suggestion?

DAVID MALAN: Drop it in a volcano.

[LAUGHTER]

COLTON OGDEN: So then, are you insinuating that there is no truly safe way to clean a hard drive?

DAVID MALAN: No, no. Well, in software, it's risky.
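The "deleted but not gone" point is exactly what CS50's forensics homework exploits. A minimal Python sketch of hunting for JPEGs in a raw disk image by scanning for their signature bytes (the filename here is hypothetical):

    # JPEG files begin with the bytes 0xFF 0xD8 0xFF; even after a file is
    # "deleted," those bytes often remain on disk until overwritten.
    SIGNATURE = bytes([0xFF, 0xD8, 0xFF])

    with open("card.raw", "rb") as f:  # a raw image of some memory card
        data = f.read()

    count = 0
    offset = data.find(SIGNATURE)
    while offset != -1:
        count += 1
        print(f"Possible JPEG header at byte offset {offset}")
        offset = data.find(SIGNATURE, offset + 1)

    print(f"{count} candidate JPEG(s) found, despite any 'deletion'")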
COLTON OGDEN: In software.

DAVID MALAN: I think if you really want peace of mind, because you have personal documents, financial documents, family documents, whatever it is that you want to destroy, physical destruction is probably the safest. And there are companies that allow you to physically destroy hard drives. They drill holes in them or they crush them or whatnot, or you can take out a hammer and try to break through the device. But it's difficult, as we've seen in class when we've disassembled things, you and I, for CS50's Introduction to Technology class. It's hard just to get the damn screws open. So that's the most robust way: physical destruction. You can wipe the disk in software. Frankly, it tends not to be terribly easy. It's easier with mechanical drives, hard disk drives that spin around. But with SSDs, the solid-state drives that are purely electronic these days, it's even harder. Because those things, in a nutshell, are designed to only have certain parts of them written to a finite number of times. And eventually, the drive, after a certain number of writes or after a certain amount of time, will stop using certain parts of the disk. And that does mean you have slightly less space available, potentially, but it ensures that your data's still intact. That means that even if you try to overwrite that data, it's never going to get written to, because the device isn't going to write to it anymore.

COLTON OGDEN: Oh, I see. It closes off certain sectors--

DAVID MALAN: Exactly.

COLTON OGDEN: --that might have data written in. That's interesting. I didn't know that.

DAVID MALAN: So you're better off just destroying that disk, at that point, too. So it's wasteful, unfortunately, financially, but if you want true peace of mind, you shouldn't just wipe it with software. You shouldn't hand it off to someone and assume that Best Buy or whatever company is doing it for you is going to do it properly, as well. You should probably just remove the device if you can, destroy it, and sell the rest of the equipment.
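For the software route, a minimal sketch of wiping a single file by overwriting its bytes before unlinking it-- illustrative rather than a guarantee, since, as just discussed, SSD wear leveling can leave the original bytes physically intact:

    import os

    def wipe_file(path: str, passes: int = 3) -> None:
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(b"\x00" * size)  # overwrite the contents with zeros
                f.flush()
                os.fsync(f.fileno())     # force the write out of OS caches
        os.remove(path)  # only now unlink the (hopefully) zeroed file

    # Caveat: an SSD's controller may have remapped worn-out blocks behind
    # the operating system's back, beyond the reach of any software wipe.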
COLTON OGDEN: I think this is reflected, too, in Mr. Robot, where he microwaves an SD card that he doesn't want anyone to get off of his--

DAVID MALAN: Did he? I don't know if I saw that episode, then.

COLTON OGDEN: This was, I think, the second episode.

DAVID MALAN: That's probably not the right way to do it. That's probably just very dangerous.

COLTON OGDEN: That's probably very-- yeah, I think it exploded in the video, but yeah.

DAVID MALAN: You don't put metal thing-- for our CS50 listeners out there, don't put metal things in microwaves.

COLTON OGDEN: Yeah, generally not advisable.

DAVID MALAN: No, I think never advisable.

[LAUGHTER]

COLTON OGDEN: Yeah, so off of the-- well, I guess sort of related to the topic of security, there was an article recently published on Gizmodo about how the FCC admitted in court that it can't track who submits fake comments to its database.

DAVID MALAN: Yeah, I was reading that. And as best I could tell, it sounded like they had a web-based form to solicit feedback on-- what was it, net neutrality or some topic like that-- and they claimed that they couldn't trace who it was, because apparently there were millions of bogus comments generated by script kiddies or just adversaries who wrote programs to just submit comments again and again and again and again. And as best I could infer, it sounds like they weren't logging, maybe, who they were coming from-- maybe the IP address. It sounded like maybe they didn't even have a CAPTCHA in place to sort of force a presumed human to answer some challenge, like a math problem, or what is this blurry text, or click all of the icons that have crosswalks in them, or something like that.

COLTON OGDEN: Right.

DAVID MALAN: And so they just don't have much metadata, it seemed, about who the users were. So short of looking at the text that was submitted alone, it sounds like they can't necessarily filter things out. It's a little strange to me, because it sounded like they do have IP addresses, at least in the article that I read, and the FCC doesn't want to release those for reasons of privacy.
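None of this is the FCC's actual stack, but a minimal Flask sketch of the kind of per-IP logging and throttling described above as apparently missing might look like this; the route name and limit are arbitrary, and a CAPTCHA would raise the bar further:

    from collections import defaultdict
    from flask import Flask, request

    app = Flask(__name__)
    submissions_by_ip: defaultdict[str, int] = defaultdict(int)  # in production, a database
    LIMIT = 5  # arbitrary cap per IP address

    @app.route("/comment", methods=["POST"])
    def comment():
        ip = request.remote_addr  # log at least this much metadata
        if submissions_by_ip[ip] >= LIMIT:
            return "Too many submissions from this address", 429
        submissions_by_ip[ip] += 1
        app.logger.info("comment from %s: %s", ip, request.form.get("text", ""))
        return "Thanks for your comment", 200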
DAVID MALAN: But you could certainly filter out a good amount of the traffic, probably, if it all seems to be coming from the same IP. I'm guessing many of the adversaries weren't so thoughtful as to use hundreds or thousands of different IPs, so that's a little curious too.

COLTON OGDEN: Is it at all related to onion routing? And this is more of my sort of lack of knowledge of Tor and onion routing, but is this sort of how onion routing works, in that you can spoof your IP from a million locations?

DAVID MALAN: Not even spoof your IP. You just really send the data through an anonymized network such that it appears to be-- that it is coming from someone else that's not you. So yeah, that's an option. I've not used that kind of software in years or looked very closely at how it's advanced, but that's the general idea. Like, you just get together with a large enough group of other people who you presumably don't know, so n is large, so to speak, and all these computers are running the same software. And even though you might originate a message, an email, or a form submission, that information gets routed through n minus 1 other people, or some subset thereof, so that you're kind of covering your tracks. It's like in the movies, right, when they show a map of the world and, like, the bad guys' data is going from here to here to here, and a red line is bouncing all over the world. That's pretty silly, but it's actually that kind of idea. You just don't have software that visualizes it.

COLTON OGDEN: And that's why it looks-- that's why they call it onion routing, because it's like layers of an onion, kind of going all around?

DAVID MALAN: Oh, is it? I never thought about it.

COLTON OGDEN: I thought that that was why it was called onion routing.

DAVID MALAN: Maybe. That sounds pretty compelling. So sure, yes.

COLTON OGDEN: Apparently, per the article, the API logs contain dozens of IP addresses that belong to groups that uploaded millions of comments combined. So to your point, it does sound like that indeed is what happened.

DAVID MALAN: Oh, so that's presumably how they know minimally that there were bogus comments?
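To make the onion metaphor concrete, here is a minimal sketch of layered encryption using the third-party cryptography package; real Tor negotiates keys per circuit and does far more, so this illustrates only the layering itself:

    from cryptography.fernet import Fernet

    # One symmetric key per relay; in real onion routing, these would be
    # negotiated per circuit rather than known to the sender up front.
    relay_keys = [Fernet.generate_key() for _ in range(3)]

    message = b"hello from an anonymous sender"

    # The sender wraps the message once per relay, so the first relay's
    # key forms the outermost layer.
    onion = message
    for key in reversed(relay_keys):
        onion = Fernet(key).encrypt(onion)

    # Each relay peels exactly one layer; only the last hop sees the
    # plaintext, and no single relay knows both sender and message.
    for key in relay_keys:
        onion = Fernet(key).decrypt(onion)

    print(onion)  # b'hello from an anonymous sender'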
COLTON OGDEN: Yeah.

DAVID MALAN: And-- but hard to distinguish, maybe, some of the signal from the noise.

COLTON OGDEN: Folks' concerns were that people were sort of creating these bogus comments that were propaganda, essentially malicious-- maliciously oriented comments.

DAVID MALAN: Yeah, well, this is pretty stupid-sounding then, because, honestly, there are so many available APIs via which you can at least raise the barrier to adversaries. Right? Using CAPTCHAs so that, theoretically, you can't just write a program to answer those kinds of challenge questions; a human actually has to do it. So you might get bogus submissions, but hopefully not thousands or millions of them.

COLTON OGDEN: Yeah. No, it sounded more like a technical-- I might not want to-- I don't want to stretch my words here-- but it sounded like there was a little bit of potential technical illiteracy involved, at least in this.

DAVID MALAN: Could be.

COLTON OGDEN: Potentially.

DAVID MALAN: It could be.

COLTON OGDEN: I want to try to sound as diplomatic as possible.

DAVID MALAN: Good thing they're making all these decisions around technology.

COLTON OGDEN: Ah, yeah, exactly. Right? And I have a picture-- OK, I'm not going to go in that direction, but--

DAVID MALAN: I don't think we can show pictures on this podcast, though.

COLTON OGDEN: Another topic sort of related to this was-- and John Oliver sort of did a skit on this related to robocalls-- is, well, robocalls. For those that-- do you want to maybe explain what robocalls are for our audience?

DAVID MALAN: Yeah. I mean, a robocall is like a call from a robot, so to speak. Really, a piece of software that's pretending to dial the phone, but is doing it all programmatically through software. And it's usually because they want to sell you something, or it's an advertisement, or it's a survey, or they want to trick you into giving up your Social Security number, or into thinking that you owe taxes. I mean, they can be used for any number of things and, sometimes, good things.
You might get a reminder from a robocall from, like, an airline saying, hey, your flight has been delayed an hour. That's useful, and you might invite that. But robocalls have a bad rap because they're often unsolicited and because I have not signed up for someone to call me. And indeed, these have been increasing in frequency for me too, on my cell phone in particular, which theoretically is unlisted. And I added the Do Not Call thing, years ago, but that's really just the honor system. You don't have to honor people who are on those lists.

COLTON OGDEN: They just supposedly have to check the list, right, but they can still call you afterwards?

DAVID MALAN: Yeah, and I mean, certainly if the problem is with bad actors, then, by definition, those people aren't respecting these lists in the first place. So it's the good people, who you might want to hear from, who you're not hearing from, because they are honoring the Do Not Call lists.

COLTON OGDEN: I've noticed that I've received a great many as well. Most of them from--

DAVID MALAN: Oh, yeah, sorry about those.

COLTON OGDEN: Most of them from 949, which is where I grew up in California, and that's where the bulk of all the messages are coming from.

DAVID MALAN: Well-- where they seem to be coming from. I've noticed this too. I get them from 617, which is Boston's area code, too. They're doing that on purpose. I just read this, and it makes perfect sense now, in retrospect, why I keep seeing the same prefix in these numbers: because they're trying to trick you and me into thinking that, oh, this is from a neighbor or someone I know in my locality. No, it's just another obnoxious technique, to be honest.

COLTON OGDEN: I live in Massachusetts now, so I know that it's not a neighbor, definitely, if they're 949.

DAVID MALAN: Yeah.

COLTON OGDEN: Likely.

DAVID MALAN: Well, the thing is, I don't know anyone's number now. So if it's not in my contacts, I know it's probably a robocall. So no, it's really awful, and, I mean, I think one of the primary reasons this is becoming even more of a problem is that making calls is so darn cheap.
I mean, you and I have experimented with Twilio, which is a nice service that has, I think, a free tier but a paid tier too, where you can automate phone calls, hopefully for good purposes. And I was just reading on their website that they actually deliberately-- though this certainly is a business advantage for them too-- charge minimally by the minute, not by the second, because they want to charge even potential adversaries at least 60 seconds for the call. Though, of course, this means that if you and I are writing an app that just needs a few seconds of airtime, we're overpaying for it. But it's a hard problem, because calls are just so cheap. And this is why spam has so proliferated, right? Because it's close to zero cents to even send a bogus email these days. And so those two are dominating the internet, too. And thankfully, companies like Google have been pretty good at filtering it out. You know, we don't really have a middleman filtering out our phone calls, and it's unclear if you'd want a middleman-- software picking up the phone, figuring out if it's legit, and then connecting them to you.

COLTON OGDEN: Right.

DAVID MALAN: It feels a little invasive, and time-consuming, too.

COLTON OGDEN: And it's kind of alarming just how easy it is for your average person, your average beginning programmer, to set up an automated robocall system. This was illustrated-- and again, back to the John Oliver segment-- where they were showing a clip of somebody who wrote a little command-line script or something like that. And even John Oliver made light of it in this skit, where he said that his tech person took only 15 minutes to sort of bomb the FCC with phone calls. But I mean, the demonstration showed writing a simple script, and 20 phones just light up on the table. And this can be scaled, you know, however large you want to go with it.

DAVID MALAN: No. And, fun fact, I actually did this in CS50 once, a few years ago, and have not done it since, because this blew up massively on me.
Long story short-- and we have video footage of this if you dig through-- several years ago, maybe 2018 at the most recent, but probably 2014, give or take. In one of the lectures, mid-semester, we were talking about web programming and APIs, and I wrote a script in advance to send a message via text-- well, technically, via email to text. It was sent through what's called an email-to-SMS gateway that would send a message to every CS50 student in the room. And at the time, I foolishly thought it would be cute to say something like, where are you? Why aren't you in class? Question mark. And the joke was supposed to be, because if anyone were, you know, cutting class that day and weren't there, they'd get this message from CS50's bot, thinking, oh my God, they know I'm not there. When, really, everyone else in the classroom was in on it, because they saw me running the program and they knew what was going to happen. And it was pretty cool in that, all of a sudden, a whole bunch of people in the room started getting text messages with this, I thought, funny message. But I had a stupid bug in my code, and essentially my loop sent one text message the first iteration; then two text messages the second iteration; then three text messages the third iteration. Whereby the previous recipients would get another and another and another, because I essentially kept appending to an array, or to a list of recipients, instead of blowing away the previous recipient list.

COLTON OGDEN: It's like a factorial operation.

DAVID MALAN: Well, an arithmetic series, technically.

COLTON OGDEN: [INAUDIBLE]

DAVID MALAN: Or if you did it-- I did, I think I did out the math. If I had not hit Control-C pretty quickly to cancel or to interrupt the process, I would have sent 20,000 text messages. And they were going out quickly. And I felt horrible, because this was enough years ago where some people were still paying for text messaging plans. It wasn't unlimited, which is pretty common, at least in the US these days, to just have unlimited texts or iMessage or whatever.
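Here is a minimal reconstruction of the kind of bug being described, using Twilio's Python helper library, which was mentioned earlier; the credentials, phone numbers, and loop are placeholders, not the original script:

    from twilio.rest import Client

    client = Client("ACCOUNT_SID", "AUTH_TOKEN")  # placeholder credentials
    students = ["+15551230001", "+15551230002", "+15551230003"]

    recipients = []
    for student in students:
        recipients.append(student)
        # BUG: `recipients` is never reset, so earlier students get texted
        # again on every iteration: 1 + 2 + 3 + ... messages in total,
        # which grows quadratically with the size of the class.
        for number in recipients:
            client.messages.create(
                to=number,
                from_="+15550009999",  # placeholder Twilio number
                body="Where are you? Why aren't you in class?",
            )

    # The fix: message each student exactly once.
    for student in students:
        client.messages.create(
            to=student,
            from_="+15550009999",
            body="Where are you? Why aren't you in class?",
        )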
DAVID MALAN: So, you know, this could have been costing students $0.10 to $0.25 or whatever. So we offered to compensate anyone for this, and I did have to shell out a $20 bill, I think, to one student whose phone I had overwhelmed. But, there too, phones were old enough that they only had finite-- well, they always had finite memory. They had terribly little memory, and so when you got a whole bunch of text messages, back in the day, it would push out older text messages, and I felt awful about that.

COLTON OGDEN: Oh.

DAVID MALAN: Kind of overwhelming people's memory. So anyhow, this is only to say that even hopefully good people with good intentions can use robocalls or robotexting accidentally for ill. And if you're trying to do that deliberately, maliciously, it's just so darn easy.

COLTON OGDEN: So, solutions to this, then?

DAVID MALAN: Don't let me in front of a keyboard.

[LAUGHTER]

COLTON OGDEN: Do we-- so there is a little bit of reading I was doing, and it might have been in this same article, but cryptographically signing phone calls-- is this something that you think is possible?

DAVID MALAN: Yeah. I mean, I don't know terribly much about the phone industry, other than it's pretty backwards or dated in terms of how it's all implemented. I mean, I'm sure this is solvable. But the catch is, how do you roll it out when you have old-school copper phone lines, when you have all of us using cell phones on different carriers? It just feels like a very hard coordination problem. And honestly, now that data plans are so omnipresent and we decreasingly need to use voice, per se-- you can use Voice over IP, so to speak-- you know, I wouldn't be surprised if we don't fix the phone industry, but we instead replace it with some equivalent of WhatsApp or Facebook Messenger or Skype or Signal or any number of tools that communicate voice, but over software. And at that point, then yes, you can authenticate.

COLTON OGDEN: OK, that makes sense. And especially if this keeps scaling, I feel like this is an eventuality.

DAVID MALAN: I imagine, yeah.
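As a sketch of what cryptographically attesting a caller ID could look like, here is a toy example with Ed25519 signatures via the third-party cryptography package; the real-world telephony effort here, STIR/SHAKEN, involves carrier certificates and much more, so this shows only the core sign-and-verify idea:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    # The originating carrier holds the private key; anyone can verify.
    carrier_key = Ed25519PrivateKey.generate()
    public_key = carrier_key.public_key()

    caller_id = b"+16175551234"  # hypothetical number
    signature = carrier_key.sign(caller_id)  # carrier attests to the number

    # A receiving network (or phone) verifies before displaying the number.
    try:
        public_key.verify(signature, caller_id)
        print("Caller ID attested by the carrier")
    except InvalidSignature:
        print("Unattested or spoofed caller ID; treat with suspicion")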
DAVID MALAN: I mean, even now, right, like, I don't get calls via any of those apps-- well, some of them, the Facebook ones, I do-- from people that aren't in my contacts. Sometimes, it just goes to your Other folder or whatnot. But I'm pretty sure you can prevent calls from people who aren't on your whitelist on those apps, like Signal and WhatsApp, that do use end-to-end encryption.

COLTON OGDEN: Sure, yeah. That makes sense. That makes sense.

DAVID MALAN: So we shall see.

COLTON OGDEN: There is a-- so away from, I guess, the security, which has been a major theme of the podcast today, towards something a little bit different, actually-- and this is pretty cool, particularly for me, because I'm into games-- but Google actually announced a brand-new streaming service that people are really talking about.

DAVID MALAN: Yeah, that's really interesting. You probably know more about this world than I do, since I am a fan of the NES, the original.

[LAUGHTER]

COLTON OGDEN: Well, it's called Stadia, and I've done a little bit of reading on it-- not terribly much, because there's actually not that much about it right now.

DAVID MALAN: OK.

COLTON OGDEN: Because it just was announced maybe two or three days ago, and, actually, Karim, one of our team, kindly showed it to me, because I wasn't aware. This was going on at a live event.

DAVID MALAN: OK.

COLTON OGDEN: But it's essentially an idea that's been done before. Companies have done this sort of, we process all the games and then we stream the video signal to you and you play, and there's this back and forth. My initial sort of qualm about this is that we're fundamentally dealing with streaming latency.

DAVID MALAN: Yeah, of course.

COLTON OGDEN: And it's-- I find it highly unlikely that we can-- especially for geographically distant locations amongst servers and amongst consumers-- deal with less than 13 milliseconds of latency in between a given frame and the input on someone's machine.

DAVID MALAN: Maybe right now, but this seems inevitable.
693 00:28:21,970 --> 00:28:25,433 So I kind of give Google credit for being a little bleeding edge here. 694 00:28:25,433 --> 00:28:27,600 Like, this probably won't work well for many people. 695 00:28:27,600 --> 00:28:29,060 But it feels inevitable, right? 696 00:28:29,060 --> 00:28:33,362 Like eventually, we'll have so much bandwidth and such low 697 00:28:33,362 --> 00:28:36,570 latency that these kinds of applications seem inevitable to me. 698 00:28:36,570 --> 00:28:39,540 So I'm kind of comfortable with it being a bit bleeding edge, 699 00:28:39,540 --> 00:28:43,500 especially if it maybe has sort of lower-quality graphics, more Nintendo 700 00:28:43,500 --> 00:28:47,670 style than Xbox style, which-- at least with the Wii, the original Wii-- 701 00:28:47,670 --> 00:28:50,220 was like a design decision. 702 00:28:50,220 --> 00:28:53,970 I think it could kind of work, and I'm very curious to see how well it works. 703 00:28:53,970 --> 00:28:56,940 But, yeah, I mean, even latency-wise, for CS50's IDE 704 00:28:56,940 --> 00:28:58,800 and for the sandbox tool and the lab tool 705 00:28:58,800 --> 00:29:00,810 that support X-based applications-- X being 706 00:29:00,810 --> 00:29:03,960 the windowing system for Linux, the graphical system-- 707 00:29:03,960 --> 00:29:05,628 it doesn't work very well for animation. 708 00:29:05,628 --> 00:29:08,420 I mean, even you, I think, implemented Breakout for us a while ago. 709 00:29:08,420 --> 00:29:08,910 COLTON OGDEN: A while back. 710 00:29:08,910 --> 00:29:11,800 DAVID MALAN: And tried it out and, eh, you know, it's OK, 711 00:29:11,800 --> 00:29:13,290 but it's not compelling. 712 00:29:13,290 --> 00:29:15,888 But I'm sure there are games that would be compelling. 713 00:29:15,888 --> 00:29:16,680 COLTON OGDEN: Yeah. 714 00:29:16,680 --> 00:29:19,805 I know that in their examples, they were doing things like Assassin's Creed 715 00:29:19,805 --> 00:29:24,158 Odyssey, you know, very recent games with very high graphical quality. 716 00:29:24,158 --> 00:29:25,200 I mean, I would like to-- 717 00:29:25,200 --> 00:29:27,195 I would definitely like to see it at work, if possible. 718 00:29:27,195 --> 00:29:27,510 DAVID MALAN: Yeah. 719 00:29:27,510 --> 00:29:29,130 No, I think that would be pretty cool. 720 00:29:29,130 --> 00:29:32,130 One less thing to buy, too, and it hopefully lowers the barrier to entry 721 00:29:32,130 --> 00:29:32,630 for people. 722 00:29:32,630 --> 00:29:34,262 You don't need the hardware. 723 00:29:34,262 --> 00:29:35,970 You don't need to connect something else. 724 00:29:35,970 --> 00:29:37,440 You don't need to draw the power for it. 725 00:29:37,440 --> 00:29:39,150 I mean, there's some upsides here, I think. 726 00:29:39,150 --> 00:29:41,440 COLTON OGDEN: I think especially if they are doing this at scale-- 727 00:29:41,440 --> 00:29:43,080 and Google already does this, surely-- you 728 00:29:43,080 --> 00:29:45,120 know, they have a CDN-- a content delivery network-- 729 00:29:45,120 --> 00:29:48,000 and maybe it's a US-centric thing at first, 730 00:29:48,000 --> 00:29:50,340 and then they can scale it out to other countries. 731 00:29:50,340 --> 00:29:53,070 Maybe the gap between any given node on their network 732 00:29:53,070 --> 00:29:56,370 and the user is small enough that the latency is minimal. 733 00:29:56,370 --> 00:29:56,700 DAVID MALAN: Yeah, hopefully. 734 00:29:56,700 --> 00:29:59,367 COLTON OGDEN: As long as it's less than 13 milliseconds, though.
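[For reference, here is the arithmetic behind the frame budget that the next exchange alludes to. At 60 frames per second, a single frame lasts about 16.7 milliseconds, so the 13-millisecond figure quoted here is actually a bit stricter than one full frame:

$$\frac{1\,\text{second}}{60\,\text{frames}} = \frac{1000\,\text{ms}}{60} \approx 16.7\,\text{ms per frame}$$

Any round-trip latency beyond that budget means input can no longer be reflected in the very next frame.]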
735 00:29:59,367 --> 00:30:02,190 That's one over 60-- one sixtieth of a second-- 736 00:30:02,190 --> 00:30:04,180 typically, games run at 60 frames per second-- 737 00:30:04,180 --> 00:30:06,990 that's the window you have to process input 738 00:30:06,990 --> 00:30:08,550 and still feel like it's a native game. 739 00:30:08,550 --> 00:30:09,420 DAVID MALAN: Well, to be honest, I mean, this 740 00:30:09,420 --> 00:30:11,250 is similar to their vision for Chromebooks 741 00:30:11,250 --> 00:30:15,090 which, if you're unfamiliar, is a relatively low-cost laptop that 742 00:30:15,090 --> 00:30:16,067 is kind of locked down. 743 00:30:16,067 --> 00:30:18,150 It pretty much gives you a browser, and that's it, 744 00:30:18,150 --> 00:30:21,030 the presumption being that you can use things like Gmail and Google 745 00:30:21,030 --> 00:30:24,810 Docs and Google Calendar, even partly offline, if you're on an airplane, 746 00:30:24,810 --> 00:30:29,173 so long as you pre-open them in advance and sort of cache some of the code. 747 00:30:29,173 --> 00:30:31,590 I mean, that works well so long as you have good internet. 748 00:30:31,590 --> 00:30:34,423 But we've chatted with some of our high school students and teachers 749 00:30:34,423 --> 00:30:36,840 whose schools use Chromebooks, and it's not great 750 00:30:36,840 --> 00:30:39,300 when the students need to or want to take the laptops home. 751 00:30:39,300 --> 00:30:42,360 Maybe they don't have or can't afford their own internet access at home, 752 00:30:42,360 --> 00:30:44,322 so there are certainly some downsides. 753 00:30:44,322 --> 00:30:45,030 But I don't know. 754 00:30:45,030 --> 00:30:47,460 I feel like within enough years, we'll be at the point 755 00:30:47,460 --> 00:30:51,240 where internet access of some sort is more of a commodity, like electricity 756 00:30:51,240 --> 00:30:54,630 in the wall, and so long as you have that kind of flowing into the house, 757 00:30:54,630 --> 00:30:56,940 it'll be even more omnipresent than it is now. 758 00:30:56,940 --> 00:30:57,732 COLTON OGDEN: Sure. 759 00:30:57,732 --> 00:30:58,932 Yeah, that makes total sense. 760 00:30:58,932 --> 00:31:00,640 I would definitely like to see it happen. 761 00:31:00,640 --> 00:31:01,300 I hope it does. 762 00:31:01,300 --> 00:31:02,640 DAVID MALAN: Yeah, so. 763 00:31:02,640 --> 00:31:04,080 COLTON OGDEN: Well, I think that's all the topics 764 00:31:04,080 --> 00:31:05,280 that we've sort of had lined up. 765 00:31:05,280 --> 00:31:07,080 We covered a nice sort of breadth of them. 766 00:31:07,080 --> 00:31:07,980 This was great. 767 00:31:07,980 --> 00:31:08,790 I like this format. 768 00:31:08,790 --> 00:31:09,540 DAVID MALAN: Yeah. 769 00:31:09,540 --> 00:31:11,778 No, hopefully you're still listening, because I 770 00:31:11,778 --> 00:31:14,070 feel like we should offer a couple bits of advice here. 771 00:31:14,070 --> 00:31:15,830 I mean, one, on the Facebook password front, 772 00:31:15,830 --> 00:31:17,413 even I did change my password. 773 00:31:17,413 --> 00:31:20,550 I don't know if mine was among the millions that were apparently 774 00:31:20,550 --> 00:31:23,520 exposed in the clear, and it's not clear that any humans noticed 775 00:31:23,520 --> 00:31:27,150 or used the password in any way, but changing your password's 776 00:31:27,150 --> 00:31:28,020 not a bad idea.
777 00:31:28,020 --> 00:31:30,930 And as you may recall from CS50 itself, if you've taken the class, 778 00:31:30,930 --> 00:31:33,660 you should probably be using a password manager anyway and not 779 00:31:33,660 --> 00:31:36,243 just picking something that's pretty easy for you to remember. 780 00:31:36,243 --> 00:31:38,480 Better to let software do it instead. 781 00:31:38,480 --> 00:31:41,550 And on the robocall front, I mean, there's a couple of defenses here. 782 00:31:41,550 --> 00:31:45,480 I mean, even on my phone, I block numbers once I realize, wait a minute, 783 00:31:45,480 --> 00:31:48,240 I don't want you calling me. 784 00:31:48,240 --> 00:31:50,460 But you can also use things like Google Voice, 785 00:31:50,460 --> 00:31:54,150 right, where they have a feature-- which seems a little socially obnoxious-- where 786 00:31:54,150 --> 00:31:56,100 Google will pick up the phone for you and 787 00:31:56,100 --> 00:31:57,992 ask the human to say who they are. 788 00:31:57,992 --> 00:32:00,450 Then you get, on your phone, a little preview of who it is, 789 00:32:00,450 --> 00:32:02,280 so it's like your own personal assistant. 790 00:32:02,280 --> 00:32:03,130 COLTON OGDEN: That's kind of interesting. 791 00:32:03,130 --> 00:32:04,350 I actually didn't realize that was a thing. 792 00:32:04,350 --> 00:32:07,050 DAVID MALAN: It's an interesting buffer, but it's kind of obnoxious. 793 00:32:07,050 --> 00:32:07,842 COLTON OGDEN: Yeah. 794 00:32:07,842 --> 00:32:09,800 DAVID MALAN: Right, to have that intermediate. 795 00:32:09,800 --> 00:32:11,560 COLTON OGDEN: You could have a whitelist, surely, though, that-- 796 00:32:11,560 --> 00:32:12,010 DAVID MALAN: For sure. 797 00:32:12,010 --> 00:32:12,390 COLTON OGDEN: Yeah. 798 00:32:12,390 --> 00:32:14,182 DAVID MALAN: No, so for unrecognized calls. 799 00:32:14,182 --> 00:32:17,445 But people tried rolling this out for email years ago, 800 00:32:17,445 --> 00:32:19,320 and I remember even being put off by it then. 801 00:32:19,320 --> 00:32:21,720 If I email you for the first time, and we've never met, 802 00:32:21,720 --> 00:32:23,790 you could have an automated service bounce back 803 00:32:23,790 --> 00:32:28,800 and say, oh, before Colton will reply to this, you need to confirm who you are 804 00:32:28,800 --> 00:32:30,752 and click a link, or something like that. 805 00:32:30,752 --> 00:32:32,460 And at least for me at the time-- maybe I 806 00:32:32,460 --> 00:32:37,083 was being a little, you know, a little presumptuous-- it just felt like, 807 00:32:37,083 --> 00:32:40,000 ugh, why is the burden being put on me to solve this problem? 808 00:32:40,000 --> 00:32:41,850 COLTON OGDEN: Also, a little high and mighty, potentially. 809 00:32:41,850 --> 00:32:42,640 DAVID MALAN: Yeah. 810 00:32:42,640 --> 00:32:45,328 But, I mean, it's an interesting software solution, 811 00:32:45,328 --> 00:32:47,370 and that's what Google Voice does, and there are probably 812 00:32:47,370 --> 00:32:48,953 other services that do the same thing. 813 00:32:48,953 --> 00:32:51,720 So you can look into things like that. 814 00:32:51,720 --> 00:32:54,990 And as for Stadia, I'll be curious to try this out 815 00:32:54,990 --> 00:32:56,150 when it's more available. 816 00:32:56,150 --> 00:32:56,640 COLTON OGDEN: Yeah, me too. 817 00:32:56,640 --> 00:32:57,390 Me too. 818 00:32:57,390 --> 00:32:58,140 DAVID MALAN: Yeah.
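[To illustrate the screening policy described above, here is a toy sketch in Python. Every name, number, and function in it is invented; this models the idea of contacts ringing through while unknown callers are challenged to identify themselves, not Google Voice's actual implementation.]

```python
# Toy model of Google Voice-style call screening.
# All names, numbers, and helper functions are invented for illustration.

CONTACTS = {
    "+16175550111": "Alice",
    "+16175550122": "Bob",
}

def screen_call(caller_number: str, ask_caller_to_identify) -> str:
    """Decide what happens to an incoming call."""
    if caller_number in CONTACTS:
        # Known caller: ring through immediately, no screening.
        return f"Ring through: {CONTACTS[caller_number]} ({caller_number})"
    # Unknown caller: the service answers first and asks who's calling,
    # then shows the callee a preview rather than ringing the phone.
    stated_name = ask_caller_to_identify()
    if not stated_name:
        return "Likely robocall: caller hung up or said nothing"
    return f"Preview for callee: {stated_name!r} calling from {caller_number}"

print(screen_call("+16175550111", lambda: ""))       # whitelisted contact
print(screen_call("+19995550000", lambda: "Carol"))  # screened stranger
print(screen_call("+18885550000", lambda: ""))       # silent robocaller
```

[Note how this shifts the burden onto the caller, which is exactly the social friction described above with the email challenge-response systems.]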
819 00:32:58,140 --> 00:33:02,730 You know, I think it's worth noting that podcasting is how CS50 itself ended up 820 00:33:02,730 --> 00:33:04,950 online, way back when. 821 00:33:04,950 --> 00:33:07,057 Long story short, before CS50, as Colton knows, 822 00:33:07,057 --> 00:33:10,140 I taught a class at Harvard's Extension School, the Continuing Ed program, 823 00:33:10,140 --> 00:33:12,822 called Computer Science E-1, Understanding Computers 824 00:33:12,822 --> 00:33:13,530 and the Internet. 825 00:33:13,530 --> 00:33:17,070 And we, in 2005, I believe, started podcasting 826 00:33:17,070 --> 00:33:20,220 that course, which initially meant just distributing MP3s, 827 00:33:20,220 --> 00:33:22,260 which are audio files, of the lectures. 828 00:33:22,260 --> 00:33:25,650 And then, I think one year later, when the video iPod, of all things, 829 00:33:25,650 --> 00:33:30,930 came out, we started distributing videos in QuickTime format and Flash format, 830 00:33:30,930 --> 00:33:31,450 probably. 831 00:33:31,450 --> 00:33:32,700 COLTON OGDEN: Definitely MOVs. 832 00:33:32,700 --> 00:33:34,660 DAVID MALAN: Yeah, of the course's lectures. 833 00:33:34,660 --> 00:33:37,080 And it was really for the convenience of our own students 834 00:33:37,080 --> 00:33:39,962 who might be commuting on a train or maybe on a treadmill, 835 00:33:39,962 --> 00:33:42,420 and it was just kind of trying to make it easier for people 836 00:33:42,420 --> 00:33:43,830 to access the course's content. 837 00:33:43,830 --> 00:33:45,997 And, long story short, a whole bunch of other people 838 00:33:45,997 --> 00:33:49,740 who weren't in the class took an interest online and found the material valuable. 839 00:33:49,740 --> 00:33:52,050 And certainly, these days, there's such a proliferation 840 00:33:52,050 --> 00:33:53,760 of educational content online. 841 00:33:53,760 --> 00:33:56,100 But it was because we started 842 00:33:56,100 --> 00:33:59,070 podcasting that course that, when I took over CS50 in 2007, 843 00:33:59,070 --> 00:34:01,500 it just felt natural, at that point, to make the course's 844 00:34:01,500 --> 00:34:03,390 videos available online as well. 845 00:34:03,390 --> 00:34:07,366 And even though we've kind of come full circle now and taken away the video 846 00:34:07,366 --> 00:34:09,449 and replaced it just with audio, I think it really 847 00:34:09,449 --> 00:34:12,030 allows us to focus on the conversation and the ideas 848 00:34:12,030 --> 00:34:17,971 without any distractions of visuals or the need to rely on video. 849 00:34:17,971 --> 00:34:19,679 So hopefully, this opens up possibilities 850 00:34:19,679 --> 00:34:22,679 for folks to listen in, as opposed to having to pay rapt 851 00:34:22,679 --> 00:34:23,953 attention to a screen. 852 00:34:23,953 --> 00:34:27,120 COLTON OGDEN: And what I like is this is a more current-events-focused talk, 853 00:34:27,120 --> 00:34:27,570 too. 854 00:34:27,570 --> 00:34:29,250 Yeah, we have so much other content; it's 855 00:34:29,250 --> 00:34:32,458 nice to sort of have a discussion on the things that are relevant in the tech 856 00:34:32,458 --> 00:34:33,956 world, or otherwise. 857 00:34:33,956 --> 00:34:35,770 You know, it fits this format very well. 858 00:34:35,770 --> 00:34:37,020 DAVID MALAN: Yeah, absolutely. 859 00:34:37,020 --> 00:34:39,719 Well, so this was Episode Zero of CS50's podcast. 860 00:34:39,719 --> 00:34:43,360 We hope you'll join us soon for Episode 1, our second podcast.
861 00:34:43,360 --> 00:34:46,378 Thanks so much to CS50's own Colton Ogden, whose idea this was, 862 00:34:46,378 --> 00:34:47,670 and thank you for spearheading it. 863 00:34:47,670 --> 00:34:50,389 COLTON OGDEN: And thanks, David, for sort of leading the way here. 864 00:34:50,389 --> 00:34:51,270 DAVID MALAN: Yeah, absolutely. 865 00:34:51,270 --> 00:34:52,145 Talk to you all soon. 866 00:34:52,145 --> 00:34:53,600 COLTON OGDEN: Bye-bye. 867 00:34:53,600 --> 00:34:56,552