[MUSIC PLAYING]

SPEAKER 1: This is the [INAUDIBLE].

[MUSIC PLAYING]

DAVID MALAN: Hello, world. This is the CS50 Podcast. My name is David Malan, and I'm here again with CS50's own Brian Yu.

BRIAN YU: Good to be back.

DAVID MALAN: So in the news of late has been this app for iOS and Android called FaceApp, as some of you might have heard. Brian, have you used this app before?

BRIAN YU: So people keep talking about this. I don't have it myself on my phone, but one of our teaching fellows for CS50 does, and was actually showing it to me earlier today. It's kind of scary what it can do. The way it seems to work is that you open up the app on your phone, and you can choose a photo from your photo library, a photo of you or a friend, and you submit it. And then you can apply any number of different image filters to it, effectively, and they can do a variety of different things. They can show you what they think you would look like when you're older, an elderly version of yourself, or what you looked like when you were younger. They can change your hairstyle. They can do all sorts of things to the background of the image, for instance. So it's a pretty powerful image tool, and it does a pretty good job of trying to create a realistic-looking photo.

DAVID MALAN: Yeah, it's striking. And I discovered this app two years after everyone else did, it seems, because someone sent me a modified photo of myself recently in which I was aged. And it was amazing. The realism of the skin, I thought, was compelling. And what was most striking to me, so much so that I forwarded it to a couple of relatives afterward, is that I looked like a couple of my older relatives do in reality. It was fascinating to see that the app was picking up on familial traits that even I don't see when I look in the mirror right now.
But apparently, if you age me and give my skin a different texture over time, oh my god, I'm actually going to look like some of my relatives, it seems.

BRIAN YU: Yeah, it's incredible what the app can do. A human trying to do this type of thing on their own might not be able to do it at all. And I think that really speaks to how powerful machine learning has gotten at this point: these models have probably been trained on huge data sets of younger and older pictures, trying to understand, fundamentally, how you translate a younger photo into an older photo. And now they've just gotten really good at doing that, in a way that humans on their own never would have been able to.

DAVID MALAN: So this is related, then, to our recent chat about machine learning more generally, where, I assume, the training data in this case is just lots and lots of photos of people, young and old, and of all sorts.

BRIAN YU: Yeah, that would be my guess. FaceApp doesn't publicly disclose exactly how its algorithm works, but I would imagine it's probably just a lot of training data, where you give the algorithm a whole bunch of younger photos and older photos, and you try to train it to figure out how to turn the younger photo into the older photo, such that you can give it a new younger photo as input and have it predict what the older photo is going to look like.
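[EDITOR'S NOTE: FaceApp has not published its method, but the paired before-and-after training Brian describes can be illustrated with a minimal sketch. The Python/PyTorch snippet below is an assumption-laden illustration, not FaceApp's actual pipeline: the tiny encoder-decoder network, the 64x64 image size, and the random stand-in tensors are all placeholders, and a production system would more likely use a large generative model trained on real photo pairs.]

```python
import torch
import torch.nn as nn

class AgingNet(nn.Module):
    """A tiny encoder-decoder mapping a 64x64 RGB face to a 64x64 RGB face."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Stand-in data: in reality these would be aligned (younger, older) photo pairs.
younger = torch.rand(16, 3, 64, 64)
older = torch.rand(16, 3, 64, 64)

model = AgingNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()  # pixel-wise difference between prediction and the "older" target

for epoch in range(10):
    optimizer.zero_grad()
    predicted_older = model(younger)        # model's guess at the aged photo
    loss = loss_fn(predicted_older, older)  # how far the guess is from the real older photo
    loss.backward()
    optimizer.step()

# After training, passing a brand-new "younger" photo through model() yields the
# network's prediction of what the older version might look like.
```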
DAVID MALAN: It's amazing. It's really quite fascinating, too, to let people imagine what they might look like in different clothes, or, I suppose, with different makeup on, and so forth. Computers can do so much of this. But it's actually quite scary, too, because a corollary of being able to mutate people's faces digitally in this way is that you can surely identify people as well. And I think that's one of the topics that's been getting a lot of attention here, certainly in the US, whereby a few cities have most recently outlawed, outright, the police's use of facial recognition to bring in suspects. For instance, Somerville, Massachusetts, which is right around the corner from Cambridge, Massachusetts, here, did this. And that's actually the flip side of the cool factor. I mean, honestly, I was pretty caught up in it when I received this photo of myself some 20, 30, 40 years down the road. I sent it along happily to some other people and didn't really stop to think until a few days later, when I started reading about FaceApp and the implications thereof, that this really does forebode a scary future, where all too easily can computers, and whatever humans own them, pick us out in a crowd or, in the extreme, track your every movement. I mean, do you think that policy is really the only solution to this?

BRIAN YU: So I think that, certainly, facial recognition is going to keep getting better, because it's already really, really good. And I know this from whenever photos get posted on Facebook: if I'm in the background corner of a very small part of the image, Facebook is pretty immediately able to tell me, oh, that's me in the photo, when I don't even know if I would have noticed myself in it.

DAVID MALAN: I know, even when it just seems to be a few pixels off to the side.

BRIAN YU: Yeah. So technology is certainly not going to be the factor that holds anyone back when it comes to facial recognition. So if a city wants to protect itself against the potential implications of this, then I think policy is probably the only way to do it. It seems like the third city that most recently banned facial recognition is Oakland, and it looks like their main concern is the misidentification of individuals and how that might lead to the misuse of force, for example. And certainly, facial recognition technology is not perfect right now, but it is getting better and better. So I can understand why more and more people might feel like they could begin to rely on it, even though it's not 100% accurate and may never be 100% accurate.

DAVID MALAN: But that too, in and of itself, seems worrisome.
Because if towns or cities are starting to ban it on the basis of the chance of misidentification, surely the technology, as you say, is only going to get better, and better, and better, and so that argument, you would think, is going to get weaker, and weaker, and weaker. Because even just a few years ago, Facebook, you noted, was claiming that they could identify humans in photos with an accuracy, correct me if I'm wrong, of 97.25%, whereas humans, when trying to identify other humans in photos, had an accuracy of around 97.5%. So almost exactly the same statistic. At that point, if the software is just as good, if not better, than humans' own identification, it seems like a weak foundation on which to ban the technology. Really, our statement should be stronger than just, oh, there's this risk of misidentification; rather, this is not something we want societally, no?

BRIAN YU: Yeah, I think that's right, especially now that facial recognition technology has gotten better. When Facebook did that study, I think it was back in 2014 or so, so I would guess that Facebook's facial recognition abilities have gotten even better over the course of the past five years or so. So facial recognition is probably better when a computer is doing it than when humans are doing it, by now, or at least close to as good. And given that, I do think that when it comes to deciding how we want to shape the policies in our society, we should not just be looking at how accurate these things are, but also at what kinds of technologies we want to be playing a role in our policing system, in the way that society runs, and in the rules there.

DAVID MALAN: And I imagine this is going to play out differently in different countries. I feel like you've already seen evidence of this if you travel internationally, because customs agencies in a lot of countries are already photographing you, even with those silly little webcams, when you swipe your passport and sign in to a country. They've been logging people's comings and goings for some time. So really, the technology is just facilitating all the more of that kind of tracking.
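[EDITOR'S NOTE: Neither host describes the mechanics of identification itself. A common approach, sketched below purely for illustration, is to map each face photo to an embedding vector with a trained network and call two photos the same person when the vectors are close enough. The names, vectors, and threshold here are invented; a real system would compute embeddings with a face-recognition model rather than random numbers.]

```python
import numpy as np

def cosine_similarity(a, b):
    """How closely two embedding vectors point in the same direction (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend gallery of known people: name -> 128-dimensional embedding.
# (Randomly generated here; a real system would store embeddings of real photos.)
rng = np.random.default_rng(0)
gallery = {name: rng.normal(size=128) for name in ["alice", "bob", "carol"]}

# A newly "captured" face: here, a noisy copy of one gallery entry, for illustration.
captured = gallery["bob"] + rng.normal(scale=0.1, size=128)

THRESHOLD = 0.8  # above this similarity we declare a match; tuning it trades
                 # false matches against missed matches

best_name, best_score = max(
    ((name, cosine_similarity(captured, emb)) for name, emb in gallery.items()),
    key=lambda pair: pair[1],
)
print(f"Best match: {best_name} (similarity {best_score:.2f})"
      if best_score >= THRESHOLD else "No confident match")
```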
DAVID MALAN: I mean, in the UK, for years, they've been known for having hundreds of thousands of CCTVs, closed-circuit televisions, which, I believe, historically were used mostly for monitoring, either in real time or after the fact, based on recordings. But now you can imagine software just scouring a city, almost like Batman. I was just watching, I think, The Dark Knight the other day, where Bruce Wayne is able to oversee everything going on in Gotham, or listen in, in that case, via people's cell phones. It just feels like we're all too close to the point where you could do a Google search for someone, essentially on Google Maps, and find where they are, because there are so many cameras watching.

BRIAN YU: Yeah. And those privacy concerns, I think, are part of what this whole recent controversy with facial recognition and FaceApp has been about. In particular, with FaceApp, the worry has been that when FaceApp is running these filters to take your face and modify it into some different face, it's not just a program running on your phone doing that. You've taken a photo, and that photo is being uploaded to FaceApp's servers. And now your photo is on the internet somewhere, and potentially it could stay there and be used for other purposes, and who knows what might happen to it.

DAVID MALAN: Yeah. I mean, you, and some other people on the internet, dug into the privacy policy that FaceApp has. And if we read just a few sentences here, one of the sections in the "Terms of Service" says that, "You grant FaceApp consent to use the user content, regardless of whether it includes an individual's name, likeness, voice, or persona sufficient to indicate the individual's identity. By using the services, you agree that the user content may be used for commercial purposes. You further acknowledge that FaceApp's use of the user content for commercial purposes will not result in any injury to you or any other person you authorized to act on your behalf."
And so forth. So you essentially are turning over your facial likeness, and any photos thereof, to other people. And in my case, it wasn't even me who opted into this; it was someone else who uploaded my photo. And at the time, I perhaps didn't take enough offense or show enough concern. But that too is an issue, ever more so when folks are using services like this, not to mention Facebook and other social media apps, and are providing not only their photos but, here is my name, here is my birthday, here are photos from what I did yesterday, and God knows how much more information about you. I mean, we've all tragically opted into this under the guise of, oh, this is great, we're being social with other people online, when really we're providing a lot of companies with treasure troves of information about us. And now governmental agencies seem to be hopping on board as well.

BRIAN YU: Yeah, Facebook especially. It's just scary how much they know about exactly who you are and what your internet browsing habits are like. It's all too often that I'll be reading about something on the internet that I might be interested in purchasing, and all of a sudden I go and check Facebook, and there's an advertisement for the very thing I was just thinking about purchasing. Because Facebook has their cookies installed on so many websites, they're just tracking every website you visit, and they can link that back to you and know exactly what you've been doing.

DAVID MALAN: Yeah, I know. And I was thinking about that the other day, because I was seeing ads for something that I had actually already gone ahead and bought from some website. I don't even remember what it was. But I was actually annoyed that the technology wasn't smart enough to opt me out of those same adverts once I had completed the transaction. But I was thinking, too, because just yesterday I was walking back to the office, and I passed someone who I was, for a moment, super sure that I knew. But I wasn't 100% confident, so I kept walking.
And then I felt guilty, so I turned around, because I didn't want to just walk past someone without saying hello. But when I saw them a second time, nope, it still wasn't the person I thought it was. But I had that hesitation. And I couldn't help but think, now, hearing these statistics that Facebook and real humans are both statistically about 97% good at recognizing faces, that that was my 3% yesterday. Out of 100 people, he was one of the three people this week that I'm going to fail to recognize. And it really put this into perspective, because while you might think that humans are perfect and it's the machines that are trying to catch up, it feels like sometimes it's the machines that are already catching up. And, case in point, there was my own mistake.

BRIAN YU: Yeah. And when machine learning algorithms, facial recognition but machine learning more generally, are trained, humans are often the baseline that computers are striving to match in terms of performance. You have a human perform some task, like labeling images or documents or the like, and then you give the same task to the computer and see how closely the computer matches the human's work, with the goal being, how human can we get the computer to be. But there are certain tasks where you could imagine the computer getting better than that, and facial recognition is one of those cases where I feel like, eventually, if not already, it could be better than humans. Self-driving cars are another example, which we've talked about before, where there's a lot of potential for cars to be better when they're being driven by computers than when they're being driven by people.

DAVID MALAN: I think that's an interesting one, because it's hard for people, I think, to rationally acknowledge that, right? Because I feel like you read all the time about a self-driving car that's been involved in an accident, and this seems to be evidence, in some minds, of why we shouldn't have self-driving cars.
Yet I'm guessing we're nearing the point, if we're not there already, where it is humans who are crashing their cars far more frequently than these computers. And so we need to appreciate that, yes, the machines are going to make mistakes, and in the worst, extreme case, God forbid, a computer, a machine, might actually hit and hurt someone, or kill someone. But that's the same reality in our human world. And it's perhaps a net positive if machines get to the point of being at least better than we humans are. Of course, in facial recognition, that could actually mean, adversarially for humans, that they're being detected and monitored far more commonly. So it almost seems these trends in machine learning are both for good and for bad. I mean, even FaceApp, a couple of years ago, apparently, and I only realized this by reading up on some of the recent press it's now gotten again, got themselves into some touchy social waters when it came to some of the filters they rolled out. Apparently, a couple of years ago, they had a "hot" filter, which was supposed to make you look prettier in your photos. The catch is that, for many people, this was apparently exhibiting patterns like lightening skin tone, thereby invoking some racial undertones as to what defines beauty. And they even had more explicit filters, I gather, a couple of years ago, where you could actually change your own ethnicity, which did not go over well either. And so those features have since been removed. But that doesn't change the fact that we are at the point, technologically, where computers can do this and are probably poised to do it even better, for better or for worse. And so again, it seems to boil down to how we humans decide, proactively or, worse, reactively, to put limits on these technologies or restrain ourselves from actually using them.

BRIAN YU: Yeah. I think that's one of the big challenges for societies and governments, especially right at this point in time: catching up with technology, where technology is moving really fast and, every year, is capable of more things than the year before.
And that's expanding the horizon of what computers can do. And I think it's really incumbent upon society to figure out, OK, what things should these computers be able to do, and to place those appropriate limits earlier rather than later.

DAVID MALAN: Yeah, absolutely. And I think it's not just photos, right? Because there's been, in the press over the past year or two, this notion of deepfake videos as well, whereby, using machine learning algorithms, you feed these algorithms lots of training data, like lots of videos of you teaching, or talking, or walking, and moving, and so forth. And out of that learning process can come a synthesized video of you saying something, moving somehow, doing something, that you never actually said or did. A couple of clips gained a decent amount of notoriety some months ago, because someone did this, for instance, for President Obama in the US. In fact, do you want to go ahead and play the clip of this deepfake? There is a video component, too, but what you're about to hear is not Obama, much as it sounds like him.

BRIAN YU: Yeah, sure.

[AUDIO PLAYBACK]

- We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things. So, for instance, they could have me say things like, I don't know, Killmonger was right, or Ben Carson is in the sunken place. Or, how about this: simply, President Trump is a total and complete dipshit. Now, you see, I would never say these things, at least not in a public address. But someone else would, someone like Jordan Peele. This is a dangerous time. Moving forward, we need to be more vigilant with what we trust from the internet. That's a time when we need to rely on trusted news sources. It may sound basic, but how we move forward in the age of information is going to be the difference between whether we survive or whether we become some kind of fucked-up dystopia. Thank you. And stay woke, bitches.
[END PLAYBACK]

DAVID MALAN: So if you're familiar with Obama's voice, this probably sounds quite like him, but maybe not exactly; it might sound a bit more like an Obama impersonator. But honestly, if we just wait a year, or two, or more, I bet these deepfake impressions of actual humans are going to become indistinguishable from the actual humans themselves. And in fact, it's perhaps all too appropriate that this just happened on Facebook, or more specifically on Instagram, recently, where Facebook's own Mark Zuckerberg was deepfaked via video. Should we go ahead and have a listen to that, too?

[AUDIO PLAYBACK]

- Imagine this for a second: one man with total control of billions of people's stolen data, all their secrets, their lives, their futures. I owe it all to Spectre. Spectre showed me that whoever controls the data controls the future.

[END PLAYBACK]

DAVID MALAN: So there, too, it doesn't sound perfectly like Mark Zuckerberg. But if you were to watch the video online, and indeed if you go ahead and google "President Obama deepfake" and "Mark Zuckerberg deepfake," odds are you'll find your way to these very same videos and actually see the mouth movements and the facial movements that are synthesized by the computer as well. That, too, is only going to get better. And I wonder: you can certainly use this technology all too obviously for evil, to literally put words in someone's mouth that they never said but seem to be saying, in a way that's far more persuasive than just misquoting someone in the world of text or synthesizing someone's voice, as seems to happen often in TV shows and movies. You could do it even more compellingly, because people are all the more inclined, I would think, to believe not only what they hear or read, but what they see as well. But you could imagine maybe even using this technology for good. You and I, for instance, spend a lot of time preparing to teach classes on video that don't necessarily have students there physically, because we do it in a studio environment.
So I wonder, to be honest, if you give us a couple of years' time and feed enough recordings of us, now in the present, to computers of the future, could they actually synthesize you teaching a class, or me teaching a class, and have the voice sound right, have the words sound right, have the facial and the physical movements look right? So much so that you and I, down the road, could just write a script for what it is that we want to say, or what it is we want to teach, and just let the computer take it the final mile?

BRIAN YU: That's a scary thought. We'd be out of a job.

DAVID MALAN: Well, someone's got to write the content. Although, surely, if we just feed the algorithms enough words that we've previously said, you could imagine, oh, just go synthesize what my thoughts would be on this topic. I don't know. I mean, there are some actually interesting applications of this, at least if you disclaim to the audience, for instance, that this is indeed synthesized and not the actual Brian or the actual David. But if you're a fan of Black Mirror, the TV show that's been popular for a few years now on Netflix, the most recent season, starring Miley Cyrus and the rest of the cast, no spoilers here, actually touches on this very subject and uses, although they don't identify it by name, this notion of deepfaking when it comes to videos.

BRIAN YU: Yeah, it's a very interesting technology, for sure. And with these videos of Mark Zuckerberg and Obama, you can certainly tell, if you're watching and paying close attention, that there are certain things that don't look or don't feel quite right. But I would be very curious to see a Turing test, of sorts, on this type of thing, where you ask someone to look at two videos and figure out which one is the actual Obama, or which one is the actual Mark Zuckerberg. I'd guess that, on these videos, most people would probably do a pretty good job, but I don't think it'd be 100%. And I would be very curious to see, year after year, how that rate would change as these technologies get better and people become less able to distinguish, to the point where it's just a 50/50 shot as to which one is the fake.
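[EDITOR'S NOTE: The informal "Turing test" Brian proposes is easy to score. The sketch below, with invented counts, computes how often viewers pick the real video and how unlikely that success rate would be if they were merely guessing; repeating it year over year, and watching accuracy drift toward 50%, would be one way to quantify how convincing deepfakes are becoming.]

```python
from math import comb

def p_value_at_least(k, n, p=0.5):
    """Probability of k or more correct picks out of n if viewers were guessing."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

correct, total = 87, 100  # hypothetical: 87 of 100 viewers picked the real video
accuracy = correct / total
print(f"Accuracy: {accuracy:.0%} (pure guessing would average 50%)")
print(f"Chance of doing at least this well by guessing: {p_value_at_least(correct, total):.2e}")
```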
DAVID MALAN: Yeah. Especially when it's not just celebrities but a person you've never met, and you're seeing them, or quote-unquote "them," for the first time on video. I bet it would be even harder for a lot of folks to distinguish someone for whom they don't have ample press clippings in their memory, of having seen them or heard them before. So what do you think, in the short term, because this problem only seems to get scarier and worse down the road, is there anything people like you, and I, and anyone else out there can actually do to protect themselves against this trend, if you will?

BRIAN YU: So I think one of the important things is just being aware of it, being mindful of it, and being on the lookout for it as it comes up. Because certainly there is nothing we can really do to stop people from generating content like this, generating fake audio recordings or video recordings. But if people look at something that's potentially a fake video and just take it at face value as accurate, that's a potentially dangerous thing. So encouraging people to take a second look at things, to look a little more deeply and try to find the primary sources, is probably a way to mitigate it. But even then, the ultimate primary source is the actual person doing the speaking. So if you can simulate that, then even that's not a perfect solution.

DAVID MALAN: So is it fair to say, maybe, that the biggest takeaway here, certainly educationally, would be just critical thinking: seeing or hearing something and deciding for yourself, evaluatively, whether this is a source you should believe?

BRIAN YU: Yeah, I'd say so.

DAVID MALAN: And you should probably stop uploading photos of yourself to Facebook, and Instagram, and Snapchat, and the like.

BRIAN YU: Well, that's a good question. Should you stop uploading photos to Facebook, and Instagram, and Snapchat?
I mean, certainly there's a lot of positive value in it. My family always loves it when they see photos of me on Facebook. And maybe that's worth the trade-off of my photos being online?

DAVID MALAN: Living in a police state.

[LAUGHTER]

I don't know. I mean, I think that, to some extent, the cat is out of the bag. There are already hundreds of photos of me, I'm guessing, out there online, whether it's on social media or in other people's accounts that I don't even know about, for instance, because I just wasn't tagged in them. But I would think that that's really the only way to stay off the grid: not to participate in this media, at least. But again, especially in the UK, and in other cities and surely other locations here in the US, you can't even go outside anymore without being picked up by one or more cameras, whether it's an ATM at a bank, or a street-view camera up above, or literally Street View. I mean, there are cars driving around taking pictures of everything they see. And at least companies like Google have tended to be in the habit of blurring out faces, but they still have those faces somewhere in their archives [INAUDIBLE].

BRIAN YU: Yeah. I was actually just grocery shopping at Trader Joe's the other day, and as I was walking outside, an Apple Maps car drove by, with all their cameras looking around and taking photos of the street.

DAVID MALAN: I saw one recently, too. But their cars are not nearly as cool as Google's.

BRIAN YU: I've never seen a Google car in person.

DAVID MALAN: Oh, yeah. No, I've seen them from time to time. They're much better painted and branded. Apple's looked like someone had just set it up on top of their own car.

[LAUGHTER]

Well, on that note, please do keep the topics of interest coming. Feel free to drop me and Brian a note at podcast@cs50.harvard.edu if you have any questions or ideas for next episodes. But in the meantime, if you haven't yourself seen or tried out FaceApp, don't necessarily go and rush to download and install this app; that was not intended to be our takeaway. But be mindful of it.
And certainly, if you just google FaceApp on Google Images or the like, you can actually see some examples of just how compelling, or how frightening, the technology is. So it's out there. This, then, was the CS50 Podcast. My name is David Malan.

BRIAN YU: I'm Brian Yu. See you all next time.