WEBVTT X-TIMESTAMP-MAP=LOCAL:00:00:00.000,MPEGTS:900000 00:00:00.500 --> 00:00:02.970 [MUSIC PLAYING] 00:00:48.432 --> 00:00:49.390 DAVID MALAN: All right. 00:00:49.390 --> 00:00:51.050 This is CS50. 00:00:51.050 --> 00:00:52.930 And this is the end. 00:00:52.930 --> 00:00:55.600 All that remains here on out is really your final projects. 00:00:55.600 --> 00:00:57.835 And we cannot wait to see what you create. 00:00:57.835 --> 00:01:00.460 Today what we thought we would do is take a bit of a look back, 00:01:00.460 --> 00:01:03.400 but also we'll look forward so that you know exactly where 00:01:03.400 --> 00:01:06.550 and what you can do beyond CS50 itself. 00:01:06.550 --> 00:01:08.650 But first a word of thanks. 00:01:08.650 --> 00:01:11.200 We are, of course, here in the Loeb Drama Center 00:01:11.200 --> 00:01:14.380 with The American Repertory Theater, who have been our amazing hosts 00:01:14.380 --> 00:01:15.460 this whole semester. 00:01:15.460 --> 00:01:20.260 And truly, they have breathed new life, new lights, new animation, new sounds 00:01:20.260 --> 00:01:21.100 into CS50. 00:01:21.100 --> 00:01:23.350 And we are so grateful to have had such a privilege 00:01:23.350 --> 00:01:27.040 to work with the amazingly talented team here to indeed bring 00:01:27.040 --> 00:01:30.460 this whole stage to life and evolve it over the course of the semester. 00:01:30.460 --> 00:01:32.290 And then, of course, there's CS50's team. 00:01:32.290 --> 00:01:34.360 And though I'm the only one here on stage 00:01:34.360 --> 00:01:37.300 with everyone else spread quite far apart this semester, 00:01:37.300 --> 00:01:42.220 it would not be without CS50's team that we have the videos that we have, 00:01:42.220 --> 00:01:44.500 the technology that we have, and all of the visuals 00:01:44.500 --> 00:01:47.170 that supplement, hopefully, everything that you yourselves 00:01:47.170 --> 00:01:48.500 have been doing hands on. 00:01:48.500 --> 00:01:53.820 So thank you truly to both teams for having made this semester all possible. 00:01:53.820 --> 00:01:57.190 These are, suffice it to say, among the more unusual and difficult times. 00:01:57.190 --> 00:02:00.520 And we hope, if you're watching this now live, or in some time from now, 00:02:00.520 --> 00:02:02.810 that this finds everyone healthy and well. 00:02:02.810 --> 00:02:05.050 And that, indeed, we have helped you find your way 00:02:05.050 --> 00:02:08.539 along this path of learning something new. 00:02:08.539 --> 00:02:10.990 Of course, there's more folks even than that 00:02:10.990 --> 00:02:13.160 behind the scenes, CS50's whole team. 00:02:13.160 --> 00:02:15.430 And when I look out on the crowd here, really, there 00:02:15.430 --> 00:02:16.650 is no crowd here in person. 00:02:16.650 --> 00:02:19.150 And if you've wondered what it looks like behind the scenes, 00:02:19.150 --> 00:02:21.790 pictured here is a photograph of exactly what 00:02:21.790 --> 00:02:24.400 it is I am seeing when we hold each of these classes. 00:02:24.400 --> 00:02:27.100 And indeed, if we Zoom in, when we're having these conversations 00:02:27.100 --> 00:02:30.280 or answering or asking questions, it really is just us 00:02:30.280 --> 00:02:31.960 and some TV screens here this year. 00:02:31.960 --> 00:02:35.320 But we do look forward to all reuniting before long. 00:02:35.320 --> 00:02:37.940 Now behind the scenes, there's indeed this whole team.
00:02:37.940 --> 00:02:40.570 In fact, pictured here are just most of, but not even 00:02:40.570 --> 00:02:44.440 all, CS50's teaching fellows, teaching assistants, and course assistants, 00:02:44.440 --> 00:02:48.520 both at Harvard and at Yale, without whom this semester would also not 00:02:48.520 --> 00:02:49.028 be possible. 00:02:49.028 --> 00:02:51.820 Because they are, indeed, the backbone of and the support structure 00:02:51.820 --> 00:02:53.890 for getting everyone ultimately to the finish 00:02:53.890 --> 00:02:56.260 line with problem sets, labs, and more. 00:02:56.260 --> 00:02:59.740 But it's worth noting that we are all fallible. 00:02:59.740 --> 00:03:02.050 And, indeed, I'm told it's fairly instructive when 00:03:02.050 --> 00:03:05.950 I do something completely wrong, or get a little befuddled here on stage 00:03:05.950 --> 00:03:09.010 and can't quite figure out why my own code isn't working, 00:03:09.010 --> 00:03:11.800 or can't quite answer a question off the top of my head. 00:03:11.800 --> 00:03:13.862 This all happens, certainly, to all of us. 00:03:13.862 --> 00:03:15.820 So even if you are feeling here, toward the end 00:03:15.820 --> 00:03:17.950 of the semester, that not everything quite clicked 00:03:17.950 --> 00:03:20.830 and you're still struggling sometimes to find that bug in your code, 00:03:20.830 --> 00:03:23.860 or you're still googling or searching for some answer to some smaller 00:03:23.860 --> 00:03:27.280 technical problem, rest assured, or take comfort in knowing, 00:03:27.280 --> 00:03:29.900 that that is never really going to go away. 00:03:29.900 --> 00:03:32.740 And in fact, to reinforce that, besides all of the mistakes 00:03:32.740 --> 00:03:36.250 I have made here on stage, we thought we would share a little bit of a clip, 00:03:36.250 --> 00:03:40.090 some bloopers if you will, from when the teaching staff some weeks ago prepared 00:03:40.090 --> 00:03:45.477 that passing of TCP/IP packets on video, which worked out wonderfully well, 00:03:45.477 --> 00:03:47.560 where folks were passing up, down, left and right. 00:03:47.560 --> 00:03:50.290 The goal of which was to get three TCP/IP packets 00:03:50.290 --> 00:03:52.210 from the bottom right-hand corner of Zoom 00:03:52.210 --> 00:03:53.948 to the top left-hand corner of Zoom. 00:03:53.948 --> 00:03:56.740 But we thought we would give you a glimpse of what actually went on 00:03:56.740 --> 00:03:58.990 behind the scenes and just how many takes it took us 00:03:58.990 --> 00:04:00.970 to get even that demonstration right. 00:04:00.970 --> 00:04:03.700 I give you some of CS50's team. 00:04:03.700 --> 00:04:04.600 [VIDEO PLAYBACK] 00:04:04.600 --> 00:04:06.400 - There we go. 00:04:06.400 --> 00:04:07.262 Buffering. 00:04:07.262 --> 00:04:08.620 OK. 00:04:08.620 --> 00:04:09.818 Josh? 00:04:09.818 --> 00:04:11.803 - Hi. 00:04:11.803 --> 00:04:14.050 - Helen, oh! 00:04:14.050 --> 00:04:17.858 [LAUGHTER] 00:04:17.858 --> 00:04:18.899 - [INAUDIBLE]. 00:04:18.899 --> 00:04:19.839 No, wait. 00:04:25.810 --> 00:04:26.860 That was amazing, Josh. 00:04:30.636 --> 00:04:34.164 Um, Sophie? 00:04:34.164 --> 00:04:39.130 [LAUGHTER] 00:04:39.130 --> 00:04:41.640 Amazing. 00:04:41.640 --> 00:04:43.490 That was perfect. 00:04:43.490 --> 00:04:44.532 Moni? 00:04:44.532 --> 00:04:47.250 [LAUGHTER] 00:04:47.250 --> 00:04:49.110 I think I-- 00:04:49.110 --> 00:04:52.220 - [INAUDIBLE]. 00:04:52.220 --> 00:04:53.338 - Amazing. 00:04:53.338 --> 00:04:53.838 - Guy? 00:04:57.970 --> 00:04:59.050 That was amazing.
00:04:59.050 --> 00:04:59.775 Thank you all. 00:04:59.775 --> 00:05:00.810 - So good! 00:05:00.810 --> 00:05:01.676 [APPLAUSE] 00:05:01.676 --> 00:05:03.410 [END PLAYBACK] 00:05:03.410 --> 00:05:06.870 DAVID MALAN: So suffice it to say, computer science is hard for all of us. 00:05:06.870 --> 00:05:08.510 And so some of these feelings, some of these frustrations 00:05:08.510 --> 00:05:09.635 are never going to go away. 00:05:09.635 --> 00:05:12.537 But hopefully you have, indeed, all the more tools in your toolkit, 00:05:12.537 --> 00:05:14.370 all the more of a foundation now to build on 00:05:14.370 --> 00:05:18.290 so that you can take comfort in being a little uncomfortable as you forge ahead 00:05:18.290 --> 00:05:20.270 and solve new problems, learn new languages, 00:05:20.270 --> 00:05:22.490 and ultimately pick up new ideas and skills. 00:05:22.490 --> 00:05:25.528 But remember, for CS50 alone, what ultimately 00:05:25.528 --> 00:05:27.320 matters in this course is not so much where 00:05:27.320 --> 00:05:29.900 you end up relative to your classmates, but where you end 00:05:29.900 --> 00:05:32.360 up relative to yourself when you began. 00:05:32.360 --> 00:05:35.390 And consider, it wasn't all that long ago that you began. 00:05:35.390 --> 00:05:38.180 In fact, just some weeks ago was this, perhaps 00:05:38.180 --> 00:05:40.640 the biggest of your problems in CS50, just trying 00:05:40.640 --> 00:05:42.950 to figure out how to get the pyramid to align right, 00:05:42.950 --> 00:05:44.780 whether you did the less comfortable version or the more 00:05:44.780 --> 00:05:47.390 comfortable version, figuring out how to print spaces, 00:05:47.390 --> 00:05:50.750 how to shift the pyramid over and the like, figuring out how to nest loops, 00:05:50.750 --> 00:05:54.140 let alone getting all of the semicolons and compilation steps right. 00:05:54.140 --> 00:05:56.690 And then, fast forward to just a week or two ago 00:05:56.690 --> 00:05:59.600 when you built your very own web application, one 00:05:59.600 --> 00:06:03.350 that used a third party API and pulled in nearly real time data 00:06:03.350 --> 00:06:06.800 and generated views for the user, had a controller governing, 00:06:06.800 --> 00:06:09.813 with the model, exactly all of the data you were reading and writing, 00:06:09.813 --> 00:06:10.355 and the like. 00:06:10.355 --> 00:06:15.432 Like, that is a huge way to have come over the course of just a few months. 00:06:15.432 --> 00:06:16.640 So take comfort in that, too. 00:06:16.640 --> 00:06:18.557 Especially as you dive into your final project 00:06:18.557 --> 00:06:21.680 and might bump up against some more walls, those, too, 00:06:21.680 --> 00:06:24.470 you will ultimately push through. 00:06:24.470 --> 00:06:27.380 So what have we focused on over the course of this semester? 00:06:27.380 --> 00:06:30.550 A lot of the time, we've been talking about and doing programming. 00:06:30.550 --> 00:06:33.050 But really, we'd like to think it's some of the higher level 00:06:33.050 --> 00:06:36.650 ideas and the takeaways that will last you 00:06:36.650 --> 00:06:39.200 far longer than the particulars of these languages. 00:06:39.200 --> 00:06:42.680 Whether it's Scratch or C or Python or JavaScript or SQL, 00:06:42.680 --> 00:06:46.340 or any of the other practical tools that we looked at, all of those 00:06:46.340 --> 00:06:49.340 are eventually in some form going to be out of date.
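To make that look back concrete, here are two hedged sketches, both in Python for consistency (neither is the actual problem set code). First, the Week 1 pyramid logic was ultimately just a pair of nested-iteration ideas: print spaces to shift each row over, then one more brick per row. A minimal sketch, with the height hardcoded rather than prompted:

# A minimal sketch of the pyramid logic, shown in Python for brevity
# (the original problem set was in C).
height = 4  # hardcoded here; the actual assignment prompted the user

for row in range(1, height + 1):
    spaces = " " * (height - row)  # shift the pyramid over (right-align)
    bricks = "#" * row             # one more brick per row
    print(spaces + bricks)

Second, that web application milestone was an instance of the model-view-controller pattern. The sketch below only shows the shape of the idea; the API URL and template name are hypothetical placeholders, not the assignment's actual endpoints: a Flask route acts as the controller, a third-party API stands in for the model's data source, and a Jinja template is the view.

# A hedged MVC sketch; api.example.com and quote.html are placeholders.
import requests
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/quote/<symbol>")
def quote(symbol):
    # Controller: pull nearly real-time data from a third-party API.
    response = requests.get(f"https://api.example.com/quote/{symbol}")
    data = response.json()
    # Hand the model's data to the view for rendering.
    return render_template("quote.html", symbol=symbol, price=data["price"])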
00:06:49.340 --> 00:06:52.160 Those languages might remain with us as older languages, 00:06:52.160 --> 00:06:54.745 but newer and better things will come along. 00:06:54.745 --> 00:06:57.120 And what we hope, then, is that over the past few months, 00:06:57.120 --> 00:07:01.430 you've walked away with the fundamentals and sort of a foundation on which you 00:07:01.430 --> 00:07:04.280 can bootstrap yourself to learn new things as they come out 00:07:04.280 --> 00:07:07.970 and really reduce new things to their basic building blocks, 00:07:07.970 --> 00:07:11.300 the puzzle pieces with which we began, first principles from which you 00:07:11.300 --> 00:07:14.880 can infer how some new system, some new piece of hardware, 00:07:14.880 --> 00:07:17.180 how some new language must surely work. 00:07:17.180 --> 00:07:20.360 Because underneath the hood, at the end of the day, it's still going to be, 00:07:20.360 --> 00:07:23.160 for some time, just 0's and 1's. 00:07:23.160 --> 00:07:26.510 And so we introduced in Week 0, recall, computational thinking, 00:07:26.510 --> 00:07:29.990 encouraging you to think more methodically, more algorithmically. 00:07:29.990 --> 00:07:33.530 But really, computational thinking is just a computer scientist's incarnation 00:07:33.530 --> 00:07:36.290 of what we might otherwise think of as just critical thinking. 00:07:36.290 --> 00:07:39.290 This is a process of taking information as input 00:07:39.290 --> 00:07:41.810 and producing as output some solution. 00:07:41.810 --> 00:07:45.050 And in between there, of course, are our algorithms, the black box that's 00:07:45.050 --> 00:07:48.360 doing something interesting and perhaps difficult. But at the end of the day, 00:07:48.360 --> 00:07:49.370 this is problem solving. 00:07:49.370 --> 00:07:52.460 And this isn't going anywhere, irrespective of the languages 00:07:52.460 --> 00:07:56.280 that you use or pick up or even forget somewhere along the way. 00:07:56.280 --> 00:08:00.200 And indeed, today, too, whether this is input and output in binary form, 00:08:00.200 --> 00:08:03.930 or it's just information and decisions or facts and conclusions, 00:08:03.930 --> 00:08:06.620 this process of taking input and producing 00:08:06.620 --> 00:08:09.500 as output correct answers, correct conclusions, 00:08:09.500 --> 00:08:12.260 correct decisions is hopefully going to be with you far 00:08:12.260 --> 00:08:15.140 longer than the particulars of C or Python 00:08:15.140 --> 00:08:19.550 or any of the more hands-on skills that we've spent time on this term. 00:08:19.550 --> 00:08:21.530 And recall, too, that at least within CS50, 00:08:21.530 --> 00:08:27.500 the tools with which we propose that you evaluate the quality of your approach 00:08:27.500 --> 00:08:30.740 to problem solving are these three axes, the first and foremost 00:08:30.740 --> 00:08:32.090 of which is surely correctness. 00:08:32.090 --> 00:08:35.640 Because if it doesn't work, what's the point of it all in the first place? 00:08:35.640 --> 00:08:39.679 So getting your code, your algorithm, your process from input to output 00:08:39.679 --> 00:08:41.900 to be correct is certainly paramount. 00:08:41.900 --> 00:08:44.270 But after that comes questions of design. 00:08:44.270 --> 00:08:46.730 If you actually want to build more complex systems, 00:08:46.730 --> 00:08:49.670 or solve more sophisticated problems, you really 00:08:49.670 --> 00:08:53.360 do want to design your solutions to those problems cleanly.
00:08:53.360 --> 00:08:55.520 You don't want them to be slow or inefficient. 00:08:55.520 --> 00:08:57.650 You don't want them to be a mess in real terms. 00:08:57.650 --> 00:09:00.170 You don't want your code to be completely undecipherable. 00:09:00.170 --> 00:09:03.830 Because that's just going to hamper you, longer term, in using those same tools, 00:09:03.830 --> 00:09:05.970 those same libraries, to solve more interesting, 00:09:05.970 --> 00:09:07.340 more sophisticated problems. 00:09:07.340 --> 00:09:09.560 And it's surely going to make it harder to interface 00:09:09.560 --> 00:09:12.660 with other people, other collaborators, and other systems. 00:09:12.660 --> 00:09:15.720 And indeed, along those lines, there's style. 00:09:15.720 --> 00:09:18.470 It's perhaps the third in this trio for us. 00:09:18.470 --> 00:09:22.160 But it's the aesthetics of your code, the indentation and the variable names 00:09:22.160 --> 00:09:26.235 and all that, much like you might, in human language, 00:09:26.235 --> 00:09:28.610 put your best foot forward with punctuation and the like. 00:09:28.610 --> 00:09:31.127 It just helps other people understand you. 00:09:31.127 --> 00:09:33.710 And indeed, even though we spent a lot of our time interacting 00:09:33.710 --> 00:09:36.505 with computers, in a course like this, at the end of the day, 00:09:36.505 --> 00:09:37.880 you're really just communicating. 00:09:37.880 --> 00:09:41.030 And whether you're communicating to a machine or to another human, 00:09:41.030 --> 00:09:44.120 doing that cleanly and in a way that helps your ideas, 00:09:44.120 --> 00:09:48.380 your solutions, become adopted is surely no less important 00:09:48.380 --> 00:09:50.710 than some of these other ideas as well. 00:09:50.710 --> 00:09:52.460 But what about other basic building blocks 00:09:52.460 --> 00:09:55.610 that transcend the particular languages and tools that we used? 00:09:55.610 --> 00:09:59.300 Well, abstraction, this idea of taking fairly complicated ideas 00:09:59.300 --> 00:10:03.100 and simplifying them so you don't have to worry about the lower level 00:10:03.100 --> 00:10:06.400 implementation details; you can focus only on the solution 00:10:06.400 --> 00:10:09.940 that that puzzle piece or that building block actually provides. 00:10:09.940 --> 00:10:12.220 And abstraction is everywhere around us. 00:10:12.220 --> 00:10:15.850 Certainly in code, we saw functions like get_string as an abstraction. 00:10:15.850 --> 00:10:16.690 I don't know. 00:10:16.690 --> 00:10:18.790 I don't really remember exactly how get_string 00:10:18.790 --> 00:10:21.220 is implemented underneath the hood, let alone a function 00:10:21.220 --> 00:10:23.170 that comes with C, like printf. 00:10:23.170 --> 00:10:24.218 But I know that it works. 00:10:24.218 --> 00:10:25.510 And I know that it takes input. 00:10:25.510 --> 00:10:26.920 I know that it produces output. 00:10:26.920 --> 00:10:29.980 And I can therefore build my own ideas, my own software, 00:10:29.980 --> 00:10:34.090 on top of that building block, abstracting away those particulars. 00:10:34.090 --> 00:10:37.570 And in the real world, too, we abstract things-- 00:10:37.570 --> 00:10:42.070 we abstract away things all of the time, taking a complex idea or a process 00:10:42.070 --> 00:10:43.870 and assuming that it will be done. 00:10:43.870 --> 00:10:45.200 Someone else might do that.
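To see that same layering in actual code, here is a minimal sketch of building atop get_string, using the CS50 Python library (rather than C's cs50.h) so these examples stay in one language. The helper get_nonempty_string is a hypothetical name, not part of any library; the point is only that we rely on get_string's contract, not its internals.

# We depend only on get_string's contract: it takes a prompt and
# returns a string (or None on end-of-file), however it is
# implemented underneath the hood.
from cs50 import get_string

def get_nonempty_string(prompt):
    # Hypothetical helper: reprompt until the input isn't blank.
    while True:
        s = get_string(prompt)
        if s is not None and s.strip():
            return s

name = get_nonempty_string("What's your name? ")
print(f"hello, {name}")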
00:10:45.200 --> 00:10:48.070 And I can therefore build on the output of that process, 00:10:48.070 --> 00:10:51.700 even if I myself am no expert in the underlying implementation 00:10:51.700 --> 00:10:53.240 details thereof. 00:10:53.240 --> 00:10:55.570 But then there's precision, this other idea 00:10:55.570 --> 00:10:58.030 where it's super important, certainly when writing code, 00:10:58.030 --> 00:11:02.140 but also when talking to another human, to be precise and make super clear 00:11:02.140 --> 00:11:03.490 what you mean. 00:11:03.490 --> 00:11:07.030 And to consider the corner cases, to consider the inputs 00:11:07.030 --> 00:11:09.760 that you might not otherwise expect, but might nonetheless 00:11:09.760 --> 00:11:13.810 happen so that you don't err and have some unexpected behavior, as we 00:11:13.810 --> 00:11:16.000 certainly did more than once in actual code. 00:11:16.000 --> 00:11:18.100 And sometimes, abstraction and precision 00:11:18.100 --> 00:11:19.665 are kind of at odds with one another. 00:11:19.665 --> 00:11:21.790 Because abstraction would sort of have you think at 00:11:21.790 --> 00:11:25.198 and talk at a fairly high level, whereas precision 00:11:25.198 --> 00:11:27.490 would suggest that you really should get into the weeds 00:11:27.490 --> 00:11:31.450 and really go step by step by step when it comes to giving someone 00:11:31.450 --> 00:11:33.490 else or a computer instructions. 00:11:33.490 --> 00:11:36.010 And we thought we would bring this to life, perhaps, 00:11:36.010 --> 00:11:37.630 with a couple of examples. 00:11:37.630 --> 00:11:40.630 And we thought we would try to involve as many people as we can in this, 00:11:40.630 --> 00:11:43.420 albeit from afar, by having everyone, if you can, 00:11:43.420 --> 00:11:45.872 take out a piece of paper and a pen or pencil. 00:11:45.872 --> 00:11:47.830 It's OK if you don't quite have that available. 00:11:47.830 --> 00:11:49.955 You can do this on a computer, too, if you'd rather 00:11:49.955 --> 00:11:53.390 draw on a notepad or a tablet, or something like that, totally fine. 00:11:53.390 --> 00:11:56.350 But ideally, taking out now something with which to draw and something 00:11:56.350 --> 00:11:57.940 to draw on. 00:11:57.940 --> 00:12:00.700 We're going to go ahead and try to apply some principles 00:12:00.700 --> 00:12:03.950 of computational thinking and see just how helpful or hurtful 00:12:03.950 --> 00:12:09.680 it is to use abstraction or precision at one level or another. 00:12:09.680 --> 00:12:12.250 So I think to do this, Brian, we're going to need 00:12:12.250 --> 00:12:14.740 a helping hand from the audience. 00:12:14.740 --> 00:12:17.650 I think we're going to need one person 00:12:17.650 --> 00:12:22.570 out there to volunteer to write instructions verbally 00:12:22.570 --> 00:12:23.593 for everyone else. 00:12:23.593 --> 00:12:25.510 We're going to treat everyone in the audience, 00:12:25.510 --> 00:12:27.640 or really, n minus 1 people in the audience 00:12:27.640 --> 00:12:30.340 as the computers today who are going to be programmed. 00:12:30.340 --> 00:12:33.850 And we need one human volunteer to be the programmer. 00:12:33.850 --> 00:12:36.550 And that programmer is going to be Daniel. 00:12:36.550 --> 00:12:38.740 So Daniel, thank you for volunteering. 00:12:38.740 --> 00:12:42.280 Brian, could we go ahead and share with Daniel, and only Daniel, 00:12:42.280 --> 00:12:46.163 a picture of something that we want everyone else to draw?
00:12:46.163 --> 00:12:48.580 So Daniel, what you should see on your screen in a moment, 00:12:48.580 --> 00:12:50.122 if you haven't already, is a picture. 00:12:50.122 --> 00:12:51.880 Don't tell anyone else what it is. 00:12:51.880 --> 00:12:55.000 You may use any words in a moment that you want. 00:12:55.000 --> 00:12:57.653 You should not use your hands or any gestures like that. 00:12:57.653 --> 00:13:00.070 But the goal is going to be to write an algorithm verbally 00:13:00.070 --> 00:13:02.200 for everyone else in the room, step by step, 00:13:02.200 --> 00:13:05.530 so that ideally they draw what it is you see. 00:13:05.530 --> 00:13:08.800 And you can say anything you want, but just no physical gestures. 00:13:08.800 --> 00:13:10.420 Does that make sense? 00:13:10.420 --> 00:13:10.960 - Got it. 00:13:10.960 --> 00:13:13.043 And do you want to say a little bit about yourself 00:13:13.043 --> 00:13:14.727 first to the group before we begin? 00:13:14.727 --> 00:13:15.310 STUDENT: Sure. 00:13:15.310 --> 00:13:16.150 My name's Daniel. 00:13:16.150 --> 00:13:20.590 I'm from Ezra Stiles College at Yale University. 00:13:20.590 --> 00:13:22.915 And I really enjoyed CS50 this semester. 00:13:22.915 --> 00:13:24.040 DAVID MALAN: Oh, wonderful. 00:13:24.040 --> 00:13:25.600 Thank you for volunteering. 00:13:25.600 --> 00:13:29.290 And let's go ahead and have everyone else with their paper 00:13:29.290 --> 00:13:30.640 and pencil or pen ready. 00:13:30.640 --> 00:13:33.650 Daniel, what should be the first thing everyone does, step one? 00:13:33.650 --> 00:13:34.150 STUDENT: OK. 00:13:34.150 --> 00:13:38.680 So the first thing we're going to draw is a hexagon. 00:13:38.680 --> 00:13:40.780 So it's a regular hexagon. 00:13:40.780 --> 00:13:50.828 And we're going to make sure that we draw it so that one of the vertices 00:13:50.828 --> 00:13:53.370 is on the very bottom of the hexagon, and one of the vertices 00:13:53.370 --> 00:13:55.990 is at the very top of the hexagon. 00:13:55.990 --> 00:13:58.210 So one side is not laying flat. 00:13:58.210 --> 00:13:58.958 You're doing-- 00:13:58.958 --> 00:14:00.250 DAVID MALAN: Wup, dut, dut dut. 00:14:00.250 --> 00:14:01.120 No hand gestures. 00:14:01.120 --> 00:14:02.287 STUDENT: No hands, no hands. 00:14:02.287 --> 00:14:03.400 Right. 00:14:03.400 --> 00:14:06.940 One vertex at the very top, one vertex at the very bottom. 00:14:06.940 --> 00:14:10.850 And you've got your other four vertices on the sides. 00:14:10.850 --> 00:14:13.870 So once you've got your hexagon, your next step 00:14:13.870 --> 00:14:18.580 is going to be to find the midpoint of the hexagon. 00:14:18.580 --> 00:14:20.740 And so once you've found your midpoint, you're 00:14:20.740 --> 00:14:26.080 going to draw three lines from the vertices to that midpoint. 00:14:26.080 --> 00:14:29.440 The vertices that you're going to choose to draw from 00:14:29.440 --> 00:14:37.510 are the very bottom to the midpoint and then from the midpoint to the vertices 00:14:37.510 --> 00:14:42.142 that are on the left and right of the top vertex. 00:14:42.142 --> 00:14:43.100 DAVID MALAN: All right. 00:14:43.100 --> 00:14:46.720 Any final instructions? 00:14:46.720 --> 00:14:48.858 STUDENT: I think, hopefully, that should be it. 00:14:48.858 --> 00:14:49.900 DAVID MALAN: [INAUDIBLE]. 00:14:49.900 --> 00:14:51.670 Those were very long steps one and two. 00:14:51.670 --> 00:14:53.230 But yes. 00:14:53.230 --> 00:14:55.690 All right, well, let's go ahead and reveal.
00:14:55.690 --> 00:14:57.310 This will be a little bit of a hack. 00:14:57.310 --> 00:14:59.890 But if everyone is comfortable picking up 00:14:59.890 --> 00:15:02.310 their piece of paper or their tablet and holding it 00:15:02.310 --> 00:15:07.980 in front of their Zoom camera steadily for five or 10 seconds, 00:15:07.980 --> 00:15:11.100 we'll see exactly what everyone has drawn. 00:15:11.100 --> 00:15:13.950 If you go into gallery view, you'll be able to see everyone else. 00:15:13.950 --> 00:15:17.220 Daniel, hopefully you're seeing some familiar pictures? 00:15:17.220 --> 00:15:19.940 I think we definitely have range. 00:15:19.940 --> 00:15:22.547 Are you seeing one or more that match what you had in mind? 00:15:22.547 --> 00:15:23.130 STUDENT: Yeah. 00:15:23.130 --> 00:15:24.210 They all look pretty good. 00:15:24.210 --> 00:15:24.900 DAVID MALAN: They all? 00:15:24.900 --> 00:15:25.900 All right, so good. 00:15:25.900 --> 00:15:28.020 Let me go ahead, then, and share on my screen 00:15:28.020 --> 00:15:31.190 in just a moment what it is Daniel was describing. 00:15:31.190 --> 00:15:34.260 So what Brian had shared with Daniel in advance was this picture 00:15:34.260 --> 00:15:38.490 here, which I dare say is a cube. 00:15:38.490 --> 00:15:42.360 But indeed, it's composed of a hexagon and then the additional lines 00:15:42.360 --> 00:15:43.410 that Daniel described. 00:15:43.410 --> 00:15:47.670 And Daniel, I did happen to see, maybe on pages two and three of the Zoom 00:15:47.670 --> 00:15:52.590 window, there were definitely some that weren't quite cubes. 00:15:52.590 --> 00:15:57.330 What was going through your mind as to how you approached 00:15:57.330 --> 00:15:59.420 the algorithm that you provided? 00:15:59.420 --> 00:16:02.250 STUDENT: I wanted to-- 00:16:02.250 --> 00:16:05.040 to me, the first thing that went through my head was a cube. 00:16:05.040 --> 00:16:08.040 But I knew that there's so many ways to draw a cube. 00:16:08.040 --> 00:16:09.980 I didn't want to describe it as a cube. 00:16:09.980 --> 00:16:11.730 Because if I said just draw a cube, I knew 00:16:11.730 --> 00:16:13.990 that we would get tons of different results. 00:16:13.990 --> 00:16:16.260 So I wanted to be as clear as possible. 00:16:16.260 --> 00:16:21.030 And I knew that if I could describe it in sort of a mathematical way, 00:16:21.030 --> 00:16:23.505 describing it with a hexagon and describing it 00:16:23.505 --> 00:16:27.690 with the vertices and the midpoint, that hopefully more people would 00:16:27.690 --> 00:16:29.790 be able to draw a precise shape. 00:16:29.790 --> 00:16:30.540 DAVID MALAN: Yeah. 00:16:30.540 --> 00:16:31.350 Really well said. 00:16:31.350 --> 00:16:33.420 Now if we had everyone's volume on, odds are, 00:16:33.420 --> 00:16:35.520 you'd hear a bit of chuckling now, perhaps, 00:16:35.520 --> 00:16:37.383 or maybe a little bit of awkwardness. 00:16:37.383 --> 00:16:40.050 And I daresay not all of the pictures quite turned out that way, 00:16:40.050 --> 00:16:43.110 but that's a perfect example of where maybe abstractions can kind of get us 00:16:43.110 --> 00:16:43.652 into trouble. 00:16:43.652 --> 00:16:47.190 Because if Daniel had just said, draw a cube, right, some of you 00:16:47.190 --> 00:16:48.833 might start drawing immediately a cube. 00:16:48.833 --> 00:16:52.000 But many of you would have a question, well, what should the orientation be? 00:16:52.000 --> 00:16:52.900 What should the size be? 00:16:52.900 --> 00:16:53.820 What should the position be?
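Those are exactly the details a program would have to pin down. For what it's worth, Daniel's instructions, made fully precise, amount to a short sketch using Python's built-in turtle module, under the assumption of an arbitrary size of 100: compute a regular hexagon with one vertex at the very top and one at the very bottom, then connect the midpoint to the bottom vertex and to the two vertices on either side of the top one.

# A hedged sketch of the cube-drawing algorithm described above.
import math
import turtle

SIZE = 100  # circumradius of the hexagon; an arbitrary choice

# Angles of 90, 150, ..., 390 degrees put one vertex at the very top
# (90 degrees) and one at the very bottom (270 degrees).
vertices = [
    (SIZE * math.cos(math.radians(90 + 60 * i)),
     SIZE * math.sin(math.radians(90 + 60 * i)))
    for i in range(6)
]

pen = turtle.Turtle()

# Draw the hexagon's outline.
pen.penup()
pen.goto(vertices[-1])
pen.pendown()
for v in vertices:
    pen.goto(v)

# Connect the midpoint (the origin) to the bottom vertex and to the
# vertices on the left and right of the top vertex, yielding the cube.
for v in (vertices[3], vertices[1], vertices[5]):
    pen.penup()
    pen.goto(0, 0)
    pen.pendown()
    pen.goto(v)

turtle.done()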
00:16:53.820 --> 00:16:56.350 And so there, precision becomes increasingly important. 00:16:56.350 --> 00:16:58.562 But the more precise it gets, odds are some of you 00:16:58.562 --> 00:17:01.770 just kind of got overwhelmed with the amount of detail and sort of lost track 00:17:01.770 --> 00:17:04.440 of where your pen or pencil was supposed to be at one point 00:17:04.440 --> 00:17:07.190 because you were operating at a much lower level. 00:17:07.190 --> 00:17:08.910 So there's this tension, then. 00:17:08.910 --> 00:17:11.410 But I think we did get some of you to that finish line. 00:17:11.410 --> 00:17:13.530 Let's see if we can't now take the pressure off of all of you, 00:17:13.530 --> 00:17:15.405 and thank you to Daniel, in particular. 00:17:15.405 --> 00:17:19.035 Let's see if we can't now have all of you collectively program me, if you will. 00:17:19.035 --> 00:17:21.160 So I'm going to go ahead and pull up my screen here 00:17:21.160 --> 00:17:25.900 where I have the ability to draw with my mouse and cursor on my screen here. 00:17:25.900 --> 00:17:29.190 And Brian, if you don't mind, could you share with everyone 00:17:29.190 --> 00:17:34.560 else a picture that I promise I have not seen in advance? 00:17:34.560 --> 00:17:35.890 So we will see how this goes. 00:17:35.890 --> 00:17:38.280 So I'm the only one right now in the Zoom room 00:17:38.280 --> 00:17:39.990 that has not seen this picture. 00:17:39.990 --> 00:17:42.930 But Brian has gone and provided only you all with the URL. 00:17:42.930 --> 00:17:44.640 So pull that up on your screen. 00:17:44.640 --> 00:17:47.760 And then, Brian, if we could perhaps iteratively call on some volunteers. 00:17:47.760 --> 00:17:52.477 Why don't I try to draw what people tell me to do, step by step? 00:17:52.477 --> 00:17:53.310 BRIAN YU: All right. 00:17:53.310 --> 00:17:54.330 David has not seen this. 00:17:54.330 --> 00:17:56.760 I just picked this out, like, five minutes ago. 00:17:56.760 --> 00:17:59.677 And you're all going to raise your hand if you want to give him, like, 00:17:59.677 --> 00:18:01.560 one instruction for what to do next. 00:18:01.560 --> 00:18:03.780 And let's start with George. 00:18:03.780 --> 00:18:06.570 STUDENT: So you're going to start by drawing 00:18:06.570 --> 00:18:09.593 a circle near the top of the screen. 00:18:09.593 --> 00:18:10.260 DAVID MALAN: OK. 00:18:10.260 --> 00:18:11.970 A circle near the top of the screen. 00:18:11.970 --> 00:18:15.100 And let me make clear, I have no delete abilities on the computer. 00:18:15.100 --> 00:18:16.650 So once I commit, we're in. 00:18:16.650 --> 00:18:20.940 So drawing a circle near the top of the screen. 00:18:20.940 --> 00:18:21.720 OK. 00:18:21.720 --> 00:18:22.710 Thank you, George. 00:18:22.710 --> 00:18:23.820 Brian, step two? 00:18:23.820 --> 00:18:25.750 BRIAN YU: All right, let's go to Sophia next. 00:18:25.750 --> 00:18:32.800 STUDENT: Then, in the very center of the screen, draw a black, filled in circle, 00:18:32.800 --> 00:18:36.233 which is approximately a tenth of the size of the circle at the top. 00:18:36.233 --> 00:18:36.900 DAVID MALAN: OK. 00:18:36.900 --> 00:18:41.710 A black, filled in circle, I heard, that's a tenth of the size. 00:18:41.710 --> 00:18:46.580 So I'm going to do something like this, and then just kind of shade it in. 00:18:46.580 --> 00:18:48.110 All right, thank you, Sophia. 00:18:48.110 --> 00:18:48.800 Step three? 00:18:48.800 --> 00:18:50.270 BRIAN YU: Let's go to Santiago.
00:18:50.270 --> 00:18:55.290 STUDENT: You're going to draw another circle. 00:18:55.290 --> 00:18:57.440 But it's not actually going to be a circle, 00:18:57.440 --> 00:19:04.190 it's more of an ellipse that's going to be bigger than the first one. 00:19:04.190 --> 00:19:05.610 So it's going to be in the middle. 00:19:05.610 --> 00:19:08.360 And it's going to enclose that filled in circle 00:19:08.360 --> 00:19:10.723 and leave some room in the bottom. 00:19:10.723 --> 00:19:11.390 DAVID MALAN: OK. 00:19:11.390 --> 00:19:12.182 So it's an ellipse. 00:19:12.182 --> 00:19:14.670 It's bigger than the first circle. 00:19:14.670 --> 00:19:18.320 But it encloses the smaller one? 00:19:18.320 --> 00:19:20.105 All right, so I heard kind of this. 00:19:23.560 --> 00:19:26.260 OK. 00:19:26.260 --> 00:19:27.910 Step four? 00:19:27.910 --> 00:19:30.280 STUDENT: Under that smaller ellipse, you're 00:19:30.280 --> 00:19:35.110 going to want to draw a bigger circle underneath it 00:19:35.110 --> 00:19:38.020 and act as if the circle is going through that ellipse, 00:19:38.020 --> 00:19:41.320 but don't actually show the lines going through the ellipse. 00:19:41.320 --> 00:19:44.590 So that is, we draw a bigger circle underneath, but without having 00:19:44.590 --> 00:19:46.240 the lines go through. 00:19:46.240 --> 00:19:50.250 It looks like it will kind of be going through the edge of it. 00:19:50.250 --> 00:19:51.270 DAVID MALAN: OK. 00:19:51.270 --> 00:20:01.850 I'm a little worried here, but what I heard was like this, maybe? 00:20:01.850 --> 00:20:03.620 Step five? 00:20:03.620 --> 00:20:07.190 BRIAN YU: All right, let's go to [INAUDIBLE] next. 00:20:07.190 --> 00:20:11.060 STUDENT: So in that kind of middle ellipse, 00:20:11.060 --> 00:20:14.002 you know, like, when kids act like they're an airplane, 00:20:14.002 --> 00:20:15.710 and then they make, like, airplane noise, 00:20:15.710 --> 00:20:17.627 then they do that weird thing with their arms? 00:20:17.627 --> 00:20:18.830 DAVID MALAN: Uh huh. 00:20:18.830 --> 00:20:21.890 STUDENT: So draw those kind of, like, arms in that middle ellipse, 00:20:21.890 --> 00:20:24.152 coming out of the big middle ellipse. 00:20:24.152 --> 00:20:25.610 DAVID MALAN: In the middle ellipse? 00:20:25.610 --> 00:20:28.010 This lower one? 00:20:28.010 --> 00:20:29.960 STUDENT: The one outside of it. 00:20:29.960 --> 00:20:32.720 DAVID MALAN: Oh, this big ellipse? 00:20:32.720 --> 00:20:34.050 STUDENT: Yeah, the outer bound. 00:20:34.050 --> 00:20:34.550 Yeah. 00:20:34.550 --> 00:20:37.883 DAVID MALAN: All right, so I should draw some hands like a kid would have when-- 00:20:42.400 --> 00:20:43.570 OK. 00:20:43.570 --> 00:20:45.527 I'm not sure this is going to end well. 00:20:45.527 --> 00:20:46.360 BRIAN YU: All right. 00:20:46.360 --> 00:20:49.180 We need some more volunteers to help David finish this. 00:20:49.180 --> 00:20:51.420 Let's go to Gabrielle. 00:20:51.420 --> 00:20:52.590 STUDENT: OK. 00:20:52.590 --> 00:20:55.147 Try to draw a-- 00:20:55.147 --> 00:20:56.772 DAVID MALAN: Try is the operative word. 00:20:56.772 --> 00:20:58.500 [LAUGHTER] 00:20:58.500 --> 00:21:00.345 STUDENT: You've got a bigger ellipse that's 00:21:00.345 --> 00:21:04.110 at the very bottom, that's bigger than both the top and middle one, 00:21:04.110 --> 00:21:07.620 but showing no overlapping lines between the middle one and the one 00:21:07.620 --> 00:21:08.580 that you're trying. 00:21:08.580 --> 00:21:10.440 DAVID MALAN: So show no overlapping lines.
00:21:10.440 --> 00:21:12.460 So I heard an even bigger ellipse. 00:21:12.460 --> 00:21:14.640 So, like, oops, sorry. 00:21:14.640 --> 00:21:15.540 This? 00:21:15.540 --> 00:21:16.710 OK? 00:21:16.710 --> 00:21:17.550 STUDENT: Good job. 00:21:17.550 --> 00:21:18.150 DAVID MALAN: Thank you. 00:21:18.150 --> 00:21:18.570 OK. 00:21:18.570 --> 00:21:18.870 Good. 00:21:18.870 --> 00:21:20.610 Keep the positive reinforcement coming. 00:21:20.610 --> 00:21:21.862 Final couple steps? 00:21:21.862 --> 00:21:22.695 BRIAN YU: All right. 00:21:22.695 --> 00:21:23.637 [? Ika? ?] 00:21:23.637 --> 00:21:25.470 STUDENT: One other step you would have to do 00:21:25.470 --> 00:21:30.840 is draw a small, filled in circle, slightly smaller than the one 00:21:30.840 --> 00:21:34.230 you already drew, right in the center of the first circle 00:21:34.230 --> 00:21:35.883 you drew right at the top. 00:21:35.883 --> 00:21:36.550 DAVID MALAN: OK. 00:21:36.550 --> 00:21:38.217 Right in the center of the first circle. 00:21:38.217 --> 00:21:38.820 OK. 00:21:38.820 --> 00:21:41.160 And I think this is starting to take shape for me. 00:21:41.160 --> 00:21:44.013 And I regret some of my earlier decisions. 00:21:44.013 --> 00:21:46.930 BRIAN YU: [INAUDIBLE], you want to provide an additional instruction? 00:21:46.930 --> 00:21:49.472 STUDENT: Another circle in between the last one you just drew 00:21:49.472 --> 00:21:53.520 and in between the edge of the circle, so 00:21:53.520 --> 00:21:56.490 to the left of that circle you're going to draw another circle. 00:21:56.490 --> 00:21:59.550 DAVID MALAN: To the left of this circle? 00:21:59.550 --> 00:22:00.390 STUDENT: Mhm. 00:22:00.390 --> 00:22:01.057 DAVID MALAN: OK. 00:22:03.380 --> 00:22:05.150 BRIAN YU: And Ryan? 00:22:05.150 --> 00:22:08.730 STUDENT: Underneath, you're going to want to repeat the same process, 00:22:08.730 --> 00:22:10.850 except draw a circle on the right side. 00:22:10.850 --> 00:22:12.280 DAVID MALAN: OK. 00:22:12.280 --> 00:22:13.258 Little loop. 00:22:13.258 --> 00:22:15.800 BRIAN YU: I think we've got maybe one or two more steps left. 00:22:15.800 --> 00:22:17.200 Let's go back to Sophia. 00:22:17.200 --> 00:22:22.130 STUDENT: Underneath the filled in circle, that's in the middle ellipse, 00:22:22.130 --> 00:22:25.360 you want to draw two replicas of that circle 00:22:25.360 --> 00:22:28.630 below the original one in the middle ellipse. 00:22:28.630 --> 00:22:33.310 DAVID MALAN: In the middle ellipse, so here, OK. 00:22:33.310 --> 00:22:35.530 I think I know what this is. 00:22:35.530 --> 00:22:37.870 BRIAN YU: And Jason, how about one last instruction? 00:22:37.870 --> 00:22:39.610 DAVID MALAN: All right. 00:22:39.610 --> 00:22:42.100 STUDENT: Underneath, in the topmost circle, 00:22:42.100 --> 00:22:46.840 under the three, so the row of three circles, draw a wide V, 00:22:46.840 --> 00:22:48.850 sort of shaped with two straight lines. 00:22:48.850 --> 00:22:52.000 DAVID MALAN: A wide V with two straight lines. 00:22:52.000 --> 00:22:52.750 OK. 00:22:52.750 --> 00:22:54.930 That part, I think, I nailed. 00:22:54.930 --> 00:22:57.640 Shall I switch over and reveal? 00:22:57.640 --> 00:23:00.130 So this is the URL I believe all of you were given. 00:23:00.130 --> 00:23:01.510 I have not visited it yet. 00:23:01.510 --> 00:23:04.780 But if I go and visit this now. 00:23:04.780 --> 00:23:06.910 Hey, that's not all that bad. 00:23:06.910 --> 00:23:09.245 All right, I definitely took a detour partway through.
00:23:09.245 --> 00:23:10.370 But here's another example. 00:23:10.370 --> 00:23:13.840 Had you just started with, draw a snowman as follows, like, 00:23:13.840 --> 00:23:16.600 that might have helped orient me, truthfully, similar in spirit 00:23:16.600 --> 00:23:20.197 to Daniel's design; that would have given 00:23:20.197 --> 00:23:23.030 me a mental model of what it is I should have been drawing. 00:23:23.030 --> 00:23:25.090 So here, too, abstraction is hard. 00:23:25.090 --> 00:23:28.840 And even precision is hard and figuring out the right level of detail 00:23:28.840 --> 00:23:32.230 to operate at is kind of part of the process of problem solving. 00:23:32.230 --> 00:23:34.780 Though, now that I look at it, that's actually not half bad. 00:23:34.780 --> 00:23:36.822 Like, I definitely did the wrong thing over here. 00:23:36.822 --> 00:23:40.390 But very well done to all of our volunteers online. 00:23:40.390 --> 00:23:43.060 So remember these kinds of details when you're 00:23:43.060 --> 00:23:46.178 trying to explain some process to someone, when you're giving someone 00:23:46.178 --> 00:23:48.970 instructions, even if it's for something mundane in the real world, 00:23:48.970 --> 00:23:53.470 like going to run errands or pick up supplies at the market, being precise 00:23:53.470 --> 00:23:56.830 is certainly important, but the more precision you provide, the easier 00:23:56.830 --> 00:23:59.870 it is for the person to get lost in those weeds. 00:23:59.870 --> 00:24:05.030 And so sometimes a higher level list of details is all that someone might need. 00:24:05.030 --> 00:24:07.947 So now that you have this ability to program and to do things 00:24:07.947 --> 00:24:10.030 that we've shown you in lectures and problem sets, 00:24:10.030 --> 00:24:11.800 and indeed, have the ability to figure out 00:24:11.800 --> 00:24:13.870 how to do things that we haven't even shown you, 00:24:13.870 --> 00:24:17.350 we wanted to take a moment to consider just whether you should do those things 00:24:17.350 --> 00:24:19.220 and, if you should, how you should do them. 00:24:19.220 --> 00:24:21.970 But beyond just answering these questions with your own instincts, 00:24:21.970 --> 00:24:24.012 we thought we would invite some of our colleagues 00:24:24.012 --> 00:24:27.280 from the Philosophy department to propose a more formal framework, 00:24:27.280 --> 00:24:29.740 a thought process by which we can approach problems 00:24:29.740 --> 00:24:32.050 when it comes to technology and the writing of code 00:24:32.050 --> 00:24:36.080 to help us decide, ultimately, just because I can code something, 00:24:36.080 --> 00:24:39.820 should I, and if so, indeed, how should I do that? 00:24:39.820 --> 00:24:43.090 So allow me to introduce our colleagues from the Philosophy department, Meica 00:24:43.090 --> 00:24:45.610 Magnani and also Susan Kennedy. 00:24:45.610 --> 00:24:46.600 Meica? 00:24:46.600 --> 00:24:47.590 MEICA MAGNANI: So hi. 00:24:47.590 --> 00:24:49.000 I'm Meica Magnani. 00:24:49.000 --> 00:24:52.690 I am a philosophy postdoc with the Embedded Ethics Program 00:24:52.690 --> 00:24:53.380 here at Harvard. 00:24:53.380 --> 00:24:54.130 SUSAN KENNEDY: Hi. 00:24:54.130 --> 00:24:54.910 I'm Susan Kennedy. 00:24:54.910 --> 00:24:59.020 And I'm also a philosophy postdoc with the Embedded Ethics Program at Harvard.
00:24:59.020 --> 00:25:01.040 MEICA MAGNANI: And before we get started, 00:25:01.040 --> 00:25:03.860 I'll just say a few things about the Embedded Ethics Program. 00:25:03.860 --> 00:25:08.380 So we are an interdisciplinary team of philosophers and computer 00:25:08.380 --> 00:25:12.250 scientists working together to integrate ethics into the computer science 00:25:12.250 --> 00:25:13.570 curriculum. 00:25:13.570 --> 00:25:17.920 The idea behind this approach is to embed tools of ethical reasoning 00:25:17.920 --> 00:25:21.040 into computer science courses themselves. 00:25:21.040 --> 00:25:25.540 The reason for this is that when making decisions about the design, deployment, 00:25:25.540 --> 00:25:30.130 or development of a piece of technology, one is, whether or not one realizes it, 00:25:30.130 --> 00:25:31.990 making ethical decisions. 00:25:31.990 --> 00:25:35.210 That is, making decisions which stand to have social, political, 00:25:35.210 --> 00:25:36.612 or human impact. 00:25:36.612 --> 00:25:39.070 At Harvard, we think it's important for computer scientists 00:25:39.070 --> 00:25:42.730 to be equipped with tools for thinking through these implications. 00:25:42.730 --> 00:25:45.730 SUSAN KENNEDY: Technology holds a lot of power and influence over us. 00:25:45.730 --> 00:25:49.938 And that means, by extension, that the people who design technology do, too. 00:25:49.938 --> 00:25:52.480 Now that you're starting to think about what responsibilities 00:25:52.480 --> 00:25:54.940 you might have as computer scientists, so you 00:25:54.940 --> 00:25:58.570 can avoid notable mishaps, like Facemash, for instance, 00:25:58.570 --> 00:26:01.960 we're going to turn your attention to the topic of social media platforms 00:26:01.960 --> 00:26:04.420 and how they affect the distribution of and engagement 00:26:04.420 --> 00:26:07.010 with news and information. 00:26:07.010 --> 00:26:09.260 It would seem that this topic is especially relevant 00:26:09.260 --> 00:26:13.370 now, given the recent US presidential election, where political content has 00:26:13.370 --> 00:26:17.120 been dominating the internet and television broadcasts and controversy 00:26:17.120 --> 00:26:19.880 has played out on social media, garnering attention 00:26:19.880 --> 00:26:21.930 from around the world. 00:26:21.930 --> 00:26:24.840 Undoubtedly, technology has completely revolutionized 00:26:24.840 --> 00:26:28.740 the way information and news is both disseminated and consumed. 00:26:28.740 --> 00:26:32.370 Instead of a paperboy shouting, get your news here, on the street corner, 00:26:32.370 --> 00:26:34.590 just about everyone uses the internet to stay up 00:26:34.590 --> 00:26:39.250 to date with what's happening, not just locally, but around the world. 00:26:39.250 --> 00:26:42.790 And in the past few years, social media platforms, in particular, 00:26:42.790 --> 00:26:46.240 have started to play a huge role in how people access, share, 00:26:46.240 --> 00:26:48.460 and engage with information. 00:26:48.460 --> 00:26:52.630 For instance, research shows that 44% of US adults 00:26:52.630 --> 00:26:55.850 report getting the news from Facebook. 00:26:55.850 --> 00:26:58.850 It's safe to say a lot has changed in recent years, 00:26:58.850 --> 00:27:01.230 owing to developments in technology.
00:27:01.230 --> 00:27:05.030 And this matters when we consider what's at stake, namely, the ability 00:27:05.030 --> 00:27:07.010 for the public to engage in discourse that 00:27:07.010 --> 00:27:09.950 supports a well-functioning democracy. 00:27:09.950 --> 00:27:13.280 So I'll first present you a brief overview of where we came from 00:27:13.280 --> 00:27:16.430 and where we are now, owing to technological developments 00:27:16.430 --> 00:27:20.220 and then consider what challenges we're faced with today. 00:27:20.220 --> 00:27:22.530 Before the internet, news and information 00:27:22.530 --> 00:27:26.670 was almost entirely in the hands of a few major broadcast stations and print 00:27:26.670 --> 00:27:31.020 media outlets, otherwise known as the mass media sphere. 00:27:31.020 --> 00:27:34.110 Since a few organizations were responsible for disseminating 00:27:34.110 --> 00:27:37.230 all the news, information was essentially 00:27:37.230 --> 00:27:40.110 filtered through a narrow lens or narrow aperture 00:27:40.110 --> 00:27:44.080 from organizations to a wide public audience. 00:27:44.080 --> 00:27:46.900 The journalists who were responsible for researching and writing 00:27:46.900 --> 00:27:51.680 the content for these organizations all shared a professional ethos. 00:27:51.680 --> 00:27:55.520 They were concerned with truth, representation of social groups, 00:27:55.520 --> 00:28:00.400 creating a forum for criticism, clarifying public values, 00:28:00.400 --> 00:28:03.190 and offering comprehensive coverage. 00:28:03.190 --> 00:28:05.380 And notably, since the aim was to produce 00:28:05.380 --> 00:28:07.720 content that appealed to a wide audience, 00:28:07.720 --> 00:28:12.810 there was less polarization and extremist commentary than we see today. 00:28:12.810 --> 00:28:15.270 But the journalists responsible for news coverage 00:28:15.270 --> 00:28:17.740 were very uniform in a lot of ways. 00:28:17.740 --> 00:28:22.380 They were relatively affluent, highly educated, mostly white, male, 00:28:22.380 --> 00:28:24.040 and so forth. 00:28:24.040 --> 00:28:28.170 And this had effects on the coverage of racial politics, economic policy, 00:28:28.170 --> 00:28:31.660 and views about the role of the US in the world. 00:28:31.660 --> 00:28:34.720 Moreover, there were seldom opportunities for the audience 00:28:34.720 --> 00:28:37.810 to respond, to develop new themes or topics, 00:28:37.810 --> 00:28:41.110 or level criticism against the mass media sphere. 00:28:41.110 --> 00:28:43.210 There weren't any likes and comment sections 00:28:43.210 --> 00:28:46.270 for the newspaper or television broadcasts. 00:28:46.270 --> 00:28:49.570 If you didn't like it, well, tough luck. 00:28:49.570 --> 00:28:51.640 This all started to change in recent years, 00:28:51.640 --> 00:28:56.290 as news coverage not only moved online, but onto social media platforms. 00:28:56.290 --> 00:28:59.830 We now live in a digitally networked public sphere. 00:28:59.830 --> 00:29:03.070 So instead of having a narrow aperture of communications, 00:29:03.070 --> 00:29:06.940 or just a few organizations to disseminate information to the public, 00:29:06.940 --> 00:29:10.600 we now have a digital sphere with a wide aperture, where lots of people 00:29:10.600 --> 00:29:13.170 can share news and information.
00:29:13.170 --> 00:29:15.450 More specifically, the sources of content 00:29:15.450 --> 00:29:19.140 are not just organizations and the professional journalists they employed, 00:29:19.140 --> 00:29:22.920 but the public and, particularly, social media users. 00:29:22.920 --> 00:29:26.010 Anyone can tweet or post on Facebook, and anyone 00:29:26.010 --> 00:29:28.590 can read those tweets and posts. 00:29:28.590 --> 00:29:30.630 It's not only resulted in greater diversity 00:29:30.630 --> 00:29:34.250 of content, but greater access to information as well. 00:29:34.250 --> 00:29:36.900 If you want to follow the news, there are a ton of options 00:29:36.900 --> 00:29:41.500 and free places online you can access with just a few mouse clicks. 00:29:41.500 --> 00:29:43.990 These prospects of increased diversity and access 00:29:43.990 --> 00:29:47.230 are what led many people to believe that the digital sphere held 00:29:47.230 --> 00:29:49.990 great promise for improving the public discourse that 00:29:49.990 --> 00:29:52.540 supports a well-functioning democracy. 00:29:52.540 --> 00:29:55.460 And in some ways, this has been true. 00:29:55.460 --> 00:29:57.930 For example, thanks to Twitter and Facebook, 00:29:57.930 --> 00:30:00.560 we saw the mobilization of social justice movements, 00:30:00.560 --> 00:30:03.440 like Me Too and Black Lives Matter. 00:30:03.440 --> 00:30:05.780 And the increased diversity of perspectives 00:30:05.780 --> 00:30:08.930 made it possible for individual researchers and scientists 00:30:08.930 --> 00:30:12.350 to weigh in on the CDC's claims about coronavirus. 00:30:12.350 --> 00:30:16.160 So while the CDC did not initially say coronavirus was characterized 00:30:16.160 --> 00:30:19.363 by airborne transmission, leading to community spread, 00:30:19.363 --> 00:30:21.530 they ended up revising their stance after scientists 00:30:21.530 --> 00:30:25.700 took to Twitter with evidence proving that this was the case. 00:30:25.700 --> 00:30:28.260 While the digital sphere has brought about some improvements, 00:30:28.260 --> 00:30:32.680 it's also exacerbated some problems and created new challenges. 00:30:32.680 --> 00:30:37.120 For example, since anyone can create content, fact checking and monitoring 00:30:37.120 --> 00:30:39.455 have become much more difficult. People are 00:30:39.455 --> 00:30:42.580 left to fend for themselves when it comes to figuring out whether something 00:30:42.580 --> 00:30:45.390 they read online is trustworthy. 00:30:45.390 --> 00:30:48.240 We've also seen increased personalization with respect 00:30:48.240 --> 00:30:51.600 to news and information, where specific content could 00:30:51.600 --> 00:30:54.870 be targeted to specific users by means of curated news 00:30:54.870 --> 00:30:58.260 feeds on social media and cable news stations cropping up 00:30:58.260 --> 00:31:02.310 that take a particular angle on the news that they cover. 00:31:02.310 --> 00:31:03.510 This is significant. 00:31:03.510 --> 00:31:06.660 Because we end up with a somewhat paradoxical effect. 00:31:06.660 --> 00:31:09.720 Despite a greater diversity in the content that's available, 00:31:09.720 --> 00:31:12.390 there's less diversity in the news and information people 00:31:12.390 --> 00:31:16.650 actually end up consuming, with the personalization of information having 00:31:16.650 --> 00:31:19.110 a tendency to reinforce a person's viewpoints, 00:31:19.110 --> 00:31:22.070 rather than challenge or broaden them.
00:31:22.070 --> 00:31:25.150 Additionally, in the absence of centralized sources of news, 00:31:25.150 --> 00:31:29.650 we've also seen different aims expressed by those creating and sharing content. 00:31:29.650 --> 00:31:33.550 Some have bypassed a concern for truth in an effort to garner more views 00:31:33.550 --> 00:31:37.540 and likes with extremist content or fake news. 00:31:37.540 --> 00:31:39.820 And fake news became a huge issue around the time 00:31:39.820 --> 00:31:43.270 of the 2016 presidential election, as there were concerns 00:31:43.270 --> 00:31:46.330 that the massive spread of misinformation on social media 00:31:46.330 --> 00:31:50.590 could influence or sway individuals' political views. 00:31:50.590 --> 00:31:53.380 While the spread of misinformation has always been an issue, 00:31:53.380 --> 00:31:56.740 it's truly been exacerbated by the digital public sphere, 00:31:56.740 --> 00:32:00.910 with social media platforms essentially pouring gasoline on the fire. 00:32:00.910 --> 00:32:04.210 The dissemination of fake news explodes on social media 00:32:04.210 --> 00:32:07.810 because the structure of digital environments, from likes to retweets, 00:32:07.810 --> 00:32:10.900 allows a single post of fake news to go viral, 00:32:10.900 --> 00:32:13.570 reaching the screens of millions around the world. 00:32:13.570 --> 00:32:16.690 And there are serious worries about how fake news has played a role 00:32:16.690 --> 00:32:20.200 in amplifying political polarization. 00:32:20.200 --> 00:32:23.350 So while technology has made possible unique advantages, 00:32:23.350 --> 00:32:26.080 it's also brought on unique challenges. 00:32:26.080 --> 00:32:28.420 One major question that we're faced with now 00:32:28.420 --> 00:32:32.800 is figuring out how content should be regulated on social media platforms, 00:32:32.800 --> 00:32:34.270 if at all. 00:32:34.270 --> 00:32:37.180 Given the scale of the problem, some might be skeptical, 00:32:37.180 --> 00:32:40.760 believing that any form of content regulation would be impossible. 00:32:40.760 --> 00:32:43.660 There's just too many people posting online to fact-check them all. 00:32:43.660 --> 00:32:46.990 And fake news spreads so quickly, it's hard to stop before it's already 00:32:46.990 --> 00:32:49.280 reached a huge audience. 00:32:49.280 --> 00:32:51.260 There's also worries that attempts to regulate 00:32:51.260 --> 00:32:54.740 content could end up becoming a form of censorship that violates 00:32:54.740 --> 00:32:57.380 the right to freedom of speech. 00:32:57.380 --> 00:33:00.020 But some people are more optimistic about the possibilities 00:33:00.020 --> 00:33:03.620 of designing social media platforms in a way that promotes and preserves 00:33:03.620 --> 00:33:05.300 democracy. 00:33:05.300 --> 00:33:08.510 In particular, there's a possibility that with responsibly designed 00:33:08.510 --> 00:33:11.300 algorithms and user interface choices, we 00:33:11.300 --> 00:33:13.310 might be able to slow the spread of fake news 00:33:13.310 --> 00:33:17.150 and more generally improve the ways information is disseminated and engaged 00:33:17.150 --> 00:33:19.410 with on social media. 00:33:19.410 --> 00:33:23.340 For example, some people believe that companies like Facebook, Twitter, 00:33:23.340 --> 00:33:26.340 and YouTube have a responsibility to regulate content 00:33:26.340 --> 00:33:29.460 because of the enormous influence they have over us.
00:33:29.460 --> 00:33:33.300 In particular, it's thought that social media platforms have a responsibility 00:33:33.300 --> 00:33:37.140 to police fake news and reduce the power of data-driven algorithms that 00:33:37.140 --> 00:33:40.590 personalize the user experience, even if doing these things 00:33:40.590 --> 00:33:42.960 would come at the cost of user engagement, 00:33:42.960 --> 00:33:48.330 resulting in less time spent on the platform and less advertising revenue. 00:33:48.330 --> 00:33:50.370 It's clear that the path going forward in terms 00:33:50.370 --> 00:33:54.630 of content regulation on social media platforms is going to be tricky. 00:33:54.630 --> 00:33:57.600 Whether we promote democratic ideals or undermine 00:33:57.600 --> 00:34:01.670 them will come down to the particular design choices we make. 00:34:01.670 --> 00:34:04.070 In order to use technology to create solutions 00:34:04.070 --> 00:34:06.320 to the problems we're facing today, we'll 00:34:06.320 --> 00:34:08.969 need to make informed decisions about design choices. 00:34:08.969 --> 00:34:12.620 And this requires some critical thinking about ethics and philosophy 00:34:12.620 --> 00:34:15.130 to figure out the best way to do this. 00:34:15.130 --> 00:34:17.815 But we're hoping that students like you, taking CS50, 00:34:17.815 --> 00:34:21.594 can harness your creativity, technical knowledge, and ethical reasoning 00:34:21.594 --> 00:34:25.703 to design technology in a responsible way. 00:34:25.703 --> 00:34:27.620 So I'm now going to pass things over to Meica, 00:34:27.620 --> 00:34:30.400 who will tell you about some philosophical concepts that'll 00:34:30.400 --> 00:34:32.679 help you think proactively about particular design 00:34:32.679 --> 00:34:36.550 choices and algorithmic tools that can be implemented to structure 00:34:36.550 --> 00:34:40.960 social media platforms in a way that promotes democratic public discourse. 00:34:40.960 --> 00:34:43.420 MEICA MAGNANI: In Democracy and The Digital Public Sphere, 00:34:43.420 --> 00:34:46.900 an article which offers a fantastic diagnosis of our situation, 00:34:46.900 --> 00:34:49.000 and upon which Susan and I are drawing heavily 00:34:49.000 --> 00:34:53.080 for this lecture, the authors Joshua Cohen and Archon Fung 00:34:53.080 --> 00:34:56.409 tell us that the bloom is off the digital rose. 00:34:56.409 --> 00:34:58.840 As Susan was describing, we had such high hopes 00:34:58.840 --> 00:35:02.200 for the democratizing potential of social media and the internet. 00:35:02.200 --> 00:35:05.740 But now we face an environment in which fake news runs rampant, 00:35:05.740 --> 00:35:08.380 citizens appear to be dramatically polarized, 00:35:08.380 --> 00:35:11.020 information swirls in its own isolated bubbles, 00:35:11.020 --> 00:35:14.140 and hate speech reaches appalling levels of vitriol. 00:35:14.140 --> 00:35:17.440 All of which stand to threaten, or so people speculate, 00:35:17.440 --> 00:35:20.290 the conditions required for an effective democracy. 00:35:20.290 --> 00:35:23.500 So the following questions arise: in what ways 00:35:23.500 --> 00:35:25.720 are the conditions of democracy threatened? 00:35:25.720 --> 00:35:27.580 What can or should be done about it? 00:35:27.580 --> 00:35:29.920 Is the structure of our technology responsible? 00:35:29.920 --> 00:35:33.610 Or is it just us, as human beings, creating these problems?
00:35:33.610 --> 00:35:35.830 In this module, we're focusing specifically 00:35:35.830 --> 00:35:38.140 on the issue of content regulation. 00:35:38.140 --> 00:35:40.960 Social media companies like Twitter, Facebook, and YouTube 00:35:40.960 --> 00:35:44.200 are now all in the game of trying to address these problems through platform 00:35:44.200 --> 00:35:45.970 design and features. 00:35:45.970 --> 00:35:49.750 From one angle then, they are acting in the service of protecting democracy 00:35:49.750 --> 00:35:52.960 by trying to get control over the spread of misinformation, 00:35:52.960 --> 00:35:56.530 the amplification of hate speech, and the deepening of polarization. 00:35:56.530 --> 00:35:58.810 However, from another angle, they're stepping in 00:35:58.810 --> 00:36:01.220 to shape the distribution of information. 00:36:01.220 --> 00:36:03.520 And depending on the particular design choices, 00:36:03.520 --> 00:36:05.560 might be said to be regulating or silencing 00:36:05.560 --> 00:36:09.310 speech, which of course, is at odds with democratic commitments 00:36:09.310 --> 00:36:11.590 to free speech and discourse. 00:36:11.590 --> 00:36:14.170 The point of this module, then, is to give you some tools 00:36:14.170 --> 00:36:18.040 to think through these issues, tools for understanding the problem, 00:36:18.040 --> 00:36:22.510 diagnosing the sources of the problem, and brainstorming solutions. 00:36:22.510 --> 00:36:24.640 In the remaining 10 or 15 minutes, I'm going 00:36:24.640 --> 00:36:27.080 to provide an overview of the main tools, which 00:36:27.080 --> 00:36:29.200 you will find detailed in the readings. 00:36:29.200 --> 00:36:33.350 They are also the tools you will be asked to analyze in this week's lab. 00:36:33.350 --> 00:36:35.620 So first then, we need to think clearly about what 00:36:35.620 --> 00:36:37.870 is required for a healthy democracy. 00:36:37.870 --> 00:36:42.040 If we're going to be making claims about how tech threatens democracy, 00:36:42.040 --> 00:36:46.750 we better understand A, what a democracy is, and B, what sort of conditions 00:36:46.750 --> 00:36:50.860 support democracy, such that those conditions could come under threat. 00:36:50.860 --> 00:36:54.610 In their article, Archon Fung, who is a professor in political science 00:36:54.610 --> 00:36:58.900 here at Harvard, and Joshua Cohen, who is a political philosopher now working 00:36:58.900 --> 00:37:03.800 with the faculty at Apple University, provide us with these tools. 00:37:03.800 --> 00:37:06.070 So behind the idea of democracy is an ideal 00:37:06.070 --> 00:37:08.170 of what political society should be. 00:37:08.170 --> 00:37:11.810 Fung and Cohen reduce this ideal to three elements. 00:37:11.810 --> 00:37:15.610 First, the idea of a democratic society, a society 00:37:15.610 --> 00:37:19.982 in which the political culture views individuals as free and equal. 00:37:19.982 --> 00:37:21.940 Even though it is likely that these people have 00:37:21.940 --> 00:37:25.900 different interests, identities, and systems of belief, as citizens, 00:37:25.900 --> 00:37:30.040 they are committed to arriving at, through reflection and discourse, principles 00:37:30.040 --> 00:37:33.040 that will enable them to work together while respecting their freedom 00:37:33.040 --> 00:37:35.040 and equality.
00:37:35.040 --> 00:37:38.400 Second is the idea of a democratic political regime, which 00:37:38.400 --> 00:37:42.060 is characterized by regular elections, rights of participation, 00:37:42.060 --> 00:37:44.160 along with associative and expressive rights 00:37:44.160 --> 00:37:48.220 that make participation both informed and effective. 00:37:48.220 --> 00:37:52.030 Third and lastly is the idea of a deliberative democracy, 00:37:52.030 --> 00:37:54.130 according to which political discussion should 00:37:54.130 --> 00:37:56.620 appeal to reasons that are suitable for cooperation 00:37:56.620 --> 00:38:00.070 amongst free and equal persons. 00:38:00.070 --> 00:38:04.330 So in justifying a policy, you cannot appeal to, say, your own religion, 00:38:04.330 --> 00:38:07.660 given that others do not necessarily hold those same beliefs. 00:38:07.660 --> 00:38:10.900 You can appeal to the notion of, say, religious freedom, but not 00:38:10.900 --> 00:38:14.990 the particular beliefs contained within the religion itself. 00:38:14.990 --> 00:38:17.440 So democracy, then, is basically an ideal 00:38:17.440 --> 00:38:20.920 that we govern ourselves by collective decision making, decision making that 00:38:20.920 --> 00:38:23.200 respects our freedom and equality. 00:38:23.200 --> 00:38:26.980 This decision making consists not only of the formal procedures of voting, 00:38:26.980 --> 00:38:31.750 elections, and legislation, it is also informed by the informal public sphere, 00:38:31.750 --> 00:38:35.140 that is, citizens identifying problems and concerns, 00:38:35.140 --> 00:38:38.890 discussing and debating problems, expressing opinions, challenging 00:38:38.890 --> 00:38:41.590 viewpoints, and organizing around causes. 00:38:41.590 --> 00:38:45.850 This is an absolutely critical part of the democratic decision-making process. 00:38:45.850 --> 00:38:51.010 It is where we, as the public, form, test, disperse, exchange, 00:38:51.010 --> 00:38:53.290 challenge, and revise our views. 00:38:53.290 --> 00:38:56.080 The flow of information, along with user engagement 00:38:56.080 --> 00:38:58.510 on Facebook, YouTube, and Twitter, are all part 00:38:58.510 --> 00:39:01.740 of this informal public sphere. 00:39:01.740 --> 00:39:03.840 In order that individuals can participate 00:39:03.840 --> 00:39:07.200 as free and equal citizens in this arena of public discourse, 00:39:07.200 --> 00:39:10.140 Cohen and Fung lay out a set of rights and opportunities 00:39:10.140 --> 00:39:12.930 that a well-functioning democracy will require. 00:39:12.930 --> 00:39:15.900 And these are the tools of analysis on offer. 00:39:15.900 --> 00:39:18.030 So first, rights. 00:39:18.030 --> 00:39:20.610 "As citizens of a democracy, we have rights 00:39:20.610 --> 00:39:24.930 to basic liberties, liberties of expression and association. 00:39:24.930 --> 00:39:27.090 The right to expressive liberty is important 00:39:27.090 --> 00:39:29.160 not only for the freedom of the individual, 00:39:29.160 --> 00:39:33.520 so that he or she will not be censored, but also for democracy itself. 00:39:33.520 --> 00:39:37.410 It enables citizens to bring their ideas into conversation with one another 00:39:37.410 --> 00:39:41.250 and to criticize and hold accountable those who exercise power." 00:39:41.250 --> 00:39:44.700 Second is the opportunity for expression. 00:39:44.700 --> 00:39:47.040 "Not only should we be free of censorship, but 00:39:47.040 --> 00:39:50.940 we should have fair opportunity to participate in public discussion. 
00:39:50.940 --> 00:39:53.190 It shouldn't be the case that because someone is, say, 00:39:53.190 --> 00:39:58.260 wealthier or more powerful, they have more opportunity to participate." 00:39:58.260 --> 00:40:00.390 Third is access. 00:40:00.390 --> 00:40:03.150 "Each person should have good and equal access 00:40:03.150 --> 00:40:06.540 to quality and reliable information on public matters. 00:40:06.540 --> 00:40:08.640 That is, if we make the effort, we should 00:40:08.640 --> 00:40:10.690 be able to acquire this information. 00:40:10.690 --> 00:40:16.200 Effective participation in decision making requires being informed." 00:40:16.200 --> 00:40:18.540 Fourth is diversity. 00:40:18.540 --> 00:40:23.070 "Each person should have good and equal chances to hear a wide range of views. 00:40:23.070 --> 00:40:25.440 We need access to competing views in order to have 00:40:25.440 --> 00:40:28.410 a more informed and reasoned position." 00:40:28.410 --> 00:40:31.620 And lastly, number five, communicative power. 00:40:31.620 --> 00:40:35.880 "Citizens should have good and equal chances to explore interests and ideas 00:40:35.880 --> 00:40:37.570 in association with others. 00:40:37.570 --> 00:40:40.710 And through these associations, to develop new concerns that 00:40:40.710 --> 00:40:44.120 might challenge the mainstream view." 00:40:44.120 --> 00:40:47.360 These rights and opportunities together provide critical conditions 00:40:47.360 --> 00:40:50.360 for enabling participation in public discussion. 00:40:50.360 --> 00:40:53.120 They might seem like a lot to keep track of initially, 00:40:53.120 --> 00:40:56.420 but if we're going to think through how social media threatens democracy, 00:40:56.420 --> 00:41:00.210 and more concretely, how platform design might promote or hinder democracy, 00:41:00.210 --> 00:41:01.910 these are valuable tools. 00:41:01.910 --> 00:41:04.700 We can use, say, the access condition, the idea 00:41:04.700 --> 00:41:07.280 that we should all have access to reliable information, 00:41:07.280 --> 00:41:09.470 as a lens of analysis. 00:41:09.470 --> 00:41:11.840 Does our platform prevent certain groups or users 00:41:11.840 --> 00:41:14.270 from accessing reliable information? 00:41:14.270 --> 00:41:17.270 Or we can use the diversity condition, the idea 00:41:17.270 --> 00:41:20.180 that we should all have access to a plurality of conflicting views, 00:41:20.180 --> 00:41:22.020 as a lens of analysis. 00:41:22.020 --> 00:41:25.640 So for example, we might ask ourselves, does our platform 00:41:25.640 --> 00:41:28.880 create a filter bubble in which individuals are no longer confronted 00:41:28.880 --> 00:41:31.610 with opposing views? 00:41:31.610 --> 00:41:34.100 In addition to understanding what conditions support 00:41:34.100 --> 00:41:38.690 a democratic society, we also need to understand the purported problems 00:41:38.690 --> 00:41:42.010 before we can propose effective interventions. 00:41:42.010 --> 00:41:43.720 Consider fake news. 00:41:43.720 --> 00:41:47.080 Why are people so gullible when it comes to fake news? 00:41:47.080 --> 00:41:51.490 Why do they often repost without proper critical assessment? 00:41:51.490 --> 00:41:54.490 Regina Rini, in the reading, proposes that in order 00:41:54.490 --> 00:41:56.620 to understand the phenomenon of fake news, 00:41:56.620 --> 00:41:59.780 we should think about it as a form of testimony.
00:41:59.780 --> 00:42:01.780 When another person shares information with you, 00:42:01.780 --> 00:42:04.310 you typically take it to be true. 00:42:04.310 --> 00:42:07.240 This is because of the norms governing our practice of testimony. 00:42:07.240 --> 00:42:09.940 When you assert something, passing it on to others, 00:42:09.940 --> 00:42:12.580 you typically take responsibility for its truth. 00:42:12.580 --> 00:42:15.160 It is assumed that you have either acquired evidence 00:42:15.160 --> 00:42:18.460 for yourself or you've received this information from a source 00:42:18.460 --> 00:42:20.440 that you deem reliable. 00:42:20.440 --> 00:42:23.710 Most of our knowledge about the world comes through this practice. 00:42:23.710 --> 00:42:27.280 We could not possibly acquire evidence for all the beliefs we hold. 00:42:27.280 --> 00:42:31.860 So we often have to rely on sources we deem and hope to be credible. 00:42:31.860 --> 00:42:36.690 But social media, Rini points out, has unsettled testimonial norms. 00:42:36.690 --> 00:42:41.670 When someone posts a piece of news, we seem to hold two conflicting views. 00:42:41.670 --> 00:42:44.730 On the one hand, we see it as an active endorsement. 00:42:44.730 --> 00:42:48.660 The person posting has taken some degree of responsibility 00:42:48.660 --> 00:42:51.210 for the accuracy of their post, the same way one 00:42:51.210 --> 00:42:54.480 would before passing on information in a conversation. 00:42:54.480 --> 00:42:57.240 On the other hand, though, it's just a share. 00:42:57.240 --> 00:43:00.480 We see this attitude coming through when Donald Trump, called out 00:43:00.480 --> 00:43:06.060 on one of his questionable tweets, retorts with, eh, it's just a tweet. 00:43:06.060 --> 00:43:09.090 To fight fake news, then, Rini argues that we 00:43:09.090 --> 00:43:13.440 need to stabilize social media's norms of testimony so that, as she says, 00:43:13.440 --> 00:43:15.870 the same norms that keep us honest over cocktails 00:43:15.870 --> 00:43:18.150 will keep us honest in our posts. 00:43:18.150 --> 00:43:20.340 We need people to be held accountable for, 00:43:20.340 --> 00:43:22.920 or to have a sense of responsibility for, the information 00:43:22.920 --> 00:43:24.840 that they share with others. 00:43:24.840 --> 00:43:29.650 Her concrete proposal: give users a credibility score. 00:43:29.650 --> 00:43:33.300 So in practice, this would be an amendment to Facebook's existing system: 00:43:33.300 --> 00:43:35.910 using independent fact-checking organizations, 00:43:35.910 --> 00:43:41.460 Facebook flags problematic news and warns users before they repost it. 00:43:41.460 --> 00:43:44.280 When a user tries to post something that has been identified 00:43:44.280 --> 00:43:47.760 as false or misleading, a pop-up appears that explains the problem 00:43:47.760 --> 00:43:49.930 and identifies the original source. 00:43:49.930 --> 00:43:51.930 It then asks the user to confirm that they would 00:43:51.930 --> 00:43:54.620 like to continue with their repost. 00:43:54.620 --> 00:43:57.690 A user's credibility score, for Rini, would 00:43:57.690 --> 00:44:00.450 depend on how often they choose to ignore these warnings 00:44:00.450 --> 00:44:02.820 and pass on misleading information. 00:44:02.820 --> 00:44:05.220 Quote, "a green dot by the user's name could 00:44:05.220 --> 00:44:08.820 indicate that the user hasn't chosen to share much disputed news. 00:44:08.820 --> 00:44:11.250 A yellow dot could indicate that they do it sometimes.
00:44:11.250 --> 00:44:13.350 And a red dot could indicate that they do it often." 00:44:13.350 --> 00:44:14.670 Unquote. 00:44:14.670 --> 00:44:16.710 The idea, then, is that a credibility score 00:44:16.710 --> 00:44:18.960 would incentivize users to take responsibility 00:44:18.960 --> 00:44:22.470 for what they share and would also give others a sense of their reliability 00:44:22.470 --> 00:44:24.690 as sources. 00:44:24.690 --> 00:44:27.540 So Rini comes up with this solution through a careful analysis 00:44:27.540 --> 00:44:30.240 of why we are so gullible to fake news. 00:44:30.240 --> 00:44:33.090 I will leave it up to you to consider this proposal 00:44:33.090 --> 00:44:35.730 in light of the various rights and opportunities required 00:44:35.730 --> 00:44:38.010 for a democratic public sphere. 00:44:38.010 --> 00:44:41.190 Does Rini's proposal violate or threaten freedom of expression? 00:44:41.190 --> 00:44:45.000 Does it promote or hinder our access to reliable information, our access 00:44:45.000 --> 00:44:48.300 to diversity of views, or does it promote or hinder 00:44:48.300 --> 00:44:50.190 our communicative power? 00:44:50.190 --> 00:44:51.990 It is these sorts of questions that we hope 00:44:51.990 --> 00:44:54.032 that you will start to ask yourself when thinking 00:44:54.032 --> 00:44:56.760 through the following sorts of issues. 00:44:56.760 --> 00:45:00.520 What problems do fake news, hate speech, polarization, et cetera, 00:45:00.520 --> 00:45:02.550 pose to democracy? 00:45:02.550 --> 00:45:05.830 How successful are various attempts by companies like Twitter, YouTube, 00:45:05.830 --> 00:45:08.370 and Facebook to address these problems? 00:45:08.370 --> 00:45:11.400 And how might particular design features of social media platforms 00:45:11.400 --> 00:45:15.180 promote or hinder these particular rights and opportunities? 00:45:15.180 --> 00:45:18.750 Whether as a future computer scientist, a tech industry leader, 00:45:18.750 --> 00:45:22.950 or just as a user of these technologies, we hope asking these sorts of questions 00:45:22.950 --> 00:45:25.783 will help you navigate these tricky issues with a more critical eye. 00:45:25.783 --> 00:45:28.867 SUSAN KENNEDY: We're really looking forward to the sorts of design choices 00:45:28.867 --> 00:45:30.850 that you'll be making in the future. 00:45:30.850 --> 00:45:31.200 MEICA MAGNANI: Great. 00:45:31.200 --> 00:45:32.867 Thanks so much for having us here today. 00:45:32.867 --> 00:45:36.052 And best of luck to everybody. 00:45:36.052 --> 00:45:39.010 DAVID MALAN: Well, thank you so much to Susan and Meica for joining us. 00:45:39.010 --> 00:45:41.510 Indeed, in this coming week's lab, we'll have an opportunity 00:45:41.510 --> 00:45:44.050 to consider some of these issues in the context of some very 00:45:44.050 --> 00:45:46.730 specific real-world scenarios. 00:45:46.730 --> 00:45:49.690 So we now thought we would take a look forward at what you can do 00:45:49.690 --> 00:45:52.300 and how you can do it after CS50 when it comes 00:45:52.300 --> 00:45:56.530 to the more practical side of things beyond computational thinking alone. 00:45:56.530 --> 00:45:58.910 So programming, of course, for many of you, 00:45:58.910 --> 00:46:02.127 this will be by design the only computer science or programming course 00:46:02.127 --> 00:46:02.710 that you take. 00:46:02.710 --> 00:46:04.030 And that's certainly OK.
00:46:04.030 --> 00:46:06.130 Indeed, we hope that you'll be able now to return 00:46:06.130 --> 00:46:09.970 to your own domains of interest in the arts and humanities, social sciences, 00:46:09.970 --> 00:46:13.840 or sciences and actually be able to have a concrete set of practical skills, 00:46:13.840 --> 00:46:16.630 be it in Python or C or any of the other technical languages 00:46:16.630 --> 00:46:20.470 we looked at, and actually solve problems in your own preferred domain. 00:46:20.470 --> 00:46:23.140 And if you're interested in learning more about computer science 00:46:23.140 --> 00:46:25.330 itself and moving on in that world, we hope 00:46:25.330 --> 00:46:27.400 that you'll walk away with a solid foundation 00:46:27.400 --> 00:46:32.800 for further theoretical and systematic explorations of this particular field. 00:46:32.800 --> 00:46:34.720 But very practically speaking, we hope now 00:46:34.720 --> 00:46:38.230 that you can not only program, but also ask questions better, 00:46:38.230 --> 00:46:41.240 whether that's in the technical world or even in just the real world. 00:46:41.240 --> 00:46:44.810 Odds are, if you've ever asked a question on CS50's discussion forums, 00:46:44.810 --> 00:46:48.040 the teaching fellows or I might very well have responded with questions 00:46:48.040 --> 00:46:49.210 asking you to clarify. 00:46:49.210 --> 00:46:52.090 Or better yet, you would have provided us, in anticipation, 00:46:52.090 --> 00:46:55.000 with answers to all of the questions that we might have. 00:46:55.000 --> 00:46:56.950 And if you've noticed on Ed, we deliberately 00:46:56.950 --> 00:46:59.230 have this sort of template via which you're 00:46:59.230 --> 00:47:02.242 coaxed to answer, well, what are the symptoms that you are seeing? 00:47:02.242 --> 00:47:04.450 What's the error message that you're struggling with? 00:47:04.450 --> 00:47:06.670 What steps have you tried to resolve the problem? 00:47:06.670 --> 00:47:09.295 Because if we imagine in the real world, even just reaching out 00:47:09.295 --> 00:47:11.860 to some random company's customer service line, 00:47:11.860 --> 00:47:14.450 those are exactly the kinds of questions that someone else 00:47:14.450 --> 00:47:16.450 is going to have to ask you to better understand 00:47:16.450 --> 00:47:18.710 a problem from your own perspective. 00:47:18.710 --> 00:47:22.270 And so we would encourage you to think about, as you emerge from CS50 itself, 00:47:22.270 --> 00:47:24.550 just how to ask better questions of people. 00:47:24.550 --> 00:47:26.420 If you've got more information than they do, 00:47:26.420 --> 00:47:29.950 how can you succinctly but correctly convey that information to them 00:47:29.950 --> 00:47:32.710 so that they can help you more efficiently? 00:47:32.710 --> 00:47:34.120 But also, finding answers. 00:47:34.120 --> 00:47:39.970 Like, we absolutely understand that many of CS50's weeks, all of CS50's weeks, 00:47:39.970 --> 00:47:42.118 maybe, have been quite the frustration. 00:47:42.118 --> 00:47:45.160 Because you quite often feel like, well, we didn't cover that in lecture. 00:47:45.160 --> 00:47:46.720 Or I didn't see that in section. 00:47:46.720 --> 00:47:48.280 And I see some noddings of the head. 00:47:48.280 --> 00:47:49.690 So this seems to be the case. 00:47:49.690 --> 00:47:52.870 And much as I would love to reassure you otherwise, like, 00:47:52.870 --> 00:47:54.580 that was very much the intent.
00:47:54.580 --> 00:47:57.790 Because the last of the training wheels of any course like this 00:47:57.790 --> 00:47:59.640 now really do officially come off. 00:47:59.640 --> 00:48:01.390 And in the coming weeks, while we'll still 00:48:01.390 --> 00:48:03.782 be with you to lend a hand with final projects 00:48:03.782 --> 00:48:05.740 and answer questions along those lines, there's 00:48:05.740 --> 00:48:09.970 of course no specification for the final project telling you exactly what to do, 00:48:09.970 --> 00:48:12.780 or in what language to do it, or what libraries to use. 00:48:12.780 --> 00:48:14.530 Undoubtedly, in the coming weeks, you will 00:48:14.530 --> 00:48:16.870 run into error messages you haven't even seen before. 00:48:16.870 --> 00:48:20.380 And frankly, maybe I, maybe Brian, maybe the teaching assistants, 00:48:20.380 --> 00:48:23.260 and the course assistants haven't even seen those errors before. 00:48:23.260 --> 00:48:26.470 But the goal, of course, is to get you over those hurdles in a way 00:48:26.470 --> 00:48:29.050 that you can figure out how to do those things on your own. 00:48:29.050 --> 00:48:32.380 And so when it comes to just using the internet, be it Google, or Stack 00:48:32.380 --> 00:48:34.360 Overflow, or interacting with other humans, 00:48:34.360 --> 00:48:37.870 just finding answers when it comes to the world of programming 00:48:37.870 --> 00:48:41.800 or really just the world of problem solving more generally, 00:48:41.800 --> 00:48:44.450 we hope that is actually a lasting skill. 00:48:44.450 --> 00:48:48.040 And we hope that you've been able to do that, admittedly with frustration, 00:48:48.040 --> 00:48:51.320 but with the safety net of the course underneath you all these months. 00:48:51.320 --> 00:48:55.180 But here on out, we hope you'll be more comfortable, again, being uncomfortable 00:48:55.180 --> 00:48:56.972 as you figure out new things. 00:48:56.972 --> 00:48:58.930 And part of that is just reading documentation. 00:48:58.930 --> 00:49:02.440 And here, too, this is a frustration that may very well never go away. 00:49:02.440 --> 00:49:06.430 Like, some documentation out there for certain languages or libraries just 00:49:06.430 --> 00:49:07.150 isn't good. 00:49:07.150 --> 00:49:10.630 It was written by people who just don't think 00:49:10.630 --> 00:49:14.780 like you or I do, who don't think with the same form of empathy as you might hope. 00:49:14.780 --> 00:49:18.100 And therefore, it's written at a very low level of technical detail, 00:49:18.100 --> 00:49:20.470 and they don't just tell you what the function does. 00:49:20.470 --> 00:49:23.140 Or conversely, it's written at such a high level that, my God, 00:49:23.140 --> 00:49:25.990 you have to start looking at the source code of the library 00:49:25.990 --> 00:49:27.670 to even figure out how to use it. 00:49:27.670 --> 00:49:29.770 And you will see both extremes. 00:49:29.770 --> 00:49:33.280 But getting comfortable with reading things like Python's documentation, 00:49:33.280 --> 00:49:36.670 like some API's documentation, is just going to empower you, we hope, 00:49:36.670 --> 00:49:39.070 all the more to just do much cooler things 00:49:39.070 --> 00:49:42.910 and solve more powerful problems on your own, ultimately. 00:49:42.910 --> 00:49:46.030 And then lastly, and this is perhaps the biggest one, teaching you 00:49:46.030 --> 00:49:47.860 how to teach yourself new languages. 00:49:47.860 --> 00:49:51.250 There is a reason we didn't spend that much time on Python.
00:49:51.250 --> 00:49:53.680 And we spent even less time on JavaScript. 00:49:53.680 --> 00:49:55.840 And about an equal amount of time on SQL. 00:49:55.840 --> 00:49:58.603 We spent a number of weeks on C, not because C 00:49:58.603 --> 00:50:00.520 is more important than any of those languages, 00:50:00.520 --> 00:50:02.740 but because along the way, many of you, most of you 00:50:02.740 --> 00:50:04.750 were just learning programming itself. 00:50:04.750 --> 00:50:08.320 And even as the language changed and evolved as the course went on, 00:50:08.320 --> 00:50:09.520 the ideas didn't go away. 00:50:09.520 --> 00:50:12.400 There were still functions, and conditions, and loops, and even 00:50:12.400 --> 00:50:13.550 events, in some form. 00:50:13.550 --> 00:50:16.690 And, again, so we hope that you walk away from a class like this 00:50:16.690 --> 00:50:19.450 not thinking that, oh, I learned how to program in C. 00:50:19.450 --> 00:50:21.490 Or oh, I learned how to program in Python. 00:50:21.490 --> 00:50:24.610 Because none of us have become experts at those things yet. 00:50:24.610 --> 00:50:28.390 But you certainly are now more expert at just being a programmer 00:50:28.390 --> 00:50:31.360 and figuring out what holes you need to fill in in your knowledge, what 00:50:31.360 --> 00:50:33.740 gaps you need to fill in order to figure out, 00:50:33.740 --> 00:50:36.880 oh, what is the syntax in this language for the same approach 00:50:36.880 --> 00:50:38.738 as I've already seen in another. 00:50:38.738 --> 00:50:41.530 And that's, indeed, why we compared so many of these languages side 00:50:41.530 --> 00:50:45.640 by side, to just reinforce that the ideas are no different, even 00:50:45.640 --> 00:50:49.797 though the syntax is going to require a bunch of Googling, a bunch of asking. 00:50:49.797 --> 00:50:52.630 And that, too, is something we hope you'll be able to do on your own 00:50:52.630 --> 00:50:56.832 as the next and best thing comes along well after these languages. 00:50:56.832 --> 00:50:58.540 Well, speaking of training wheels, you're 00:50:58.540 --> 00:51:02.410 welcome and encouraged to keep using CS50 IDE for your final project. 00:51:02.410 --> 00:51:05.510 And heck, you can use it even after that for other courses or projects. 00:51:05.510 --> 00:51:08.260 But at the end of the day, this, too, is probably a training wheel 00:51:08.260 --> 00:51:10.360 that you should take off for yourself. 00:51:10.360 --> 00:51:14.720 The IDE is designed to be representative of a real-world programming 00:51:14.720 --> 00:51:15.220 environment. 00:51:15.220 --> 00:51:17.620 But we definitely did a lot of things for you. 00:51:17.620 --> 00:51:20.392 We installed all the libraries you might need over 00:51:20.392 --> 00:51:21.850 the course of the semester for you. 00:51:21.850 --> 00:51:24.580 We've got these nice commands that end in the number 50. 00:51:24.580 --> 00:51:27.678 Those don't tend to exist in the real world. When you're at your first job, 00:51:27.678 --> 00:51:29.470 or you're going back to your own department 00:51:29.470 --> 00:51:31.210 and solving some problem in code, there's 00:51:31.210 --> 00:51:32.937 not going to be a help50 longer term.
00:51:32.937 --> 00:51:35.770 And so what we thought we would do, too, is spend just a few minutes 00:51:35.770 --> 00:51:38.890 giving you a sense of what are some of the more industry-standard 00:51:38.890 --> 00:51:42.010 tools that you should consider using, playing with, perhaps 00:51:42.010 --> 00:51:45.820 over break or in the months to come, so that you know exactly how to do 00:51:45.820 --> 00:51:48.880 the same kinds of things you did this term, but on your own Mac 00:51:48.880 --> 00:51:51.380 or PC or some other device. 00:51:51.380 --> 00:51:55.870 So for instance, if you would like to install a set of command line tools 00:51:55.870 --> 00:51:59.200 on your Mac or PC, turns out some of them are already there. 00:51:59.200 --> 00:52:01.510 Indeed, I mentioned at one point that macOS 00:52:01.510 --> 00:52:05.170 has, under its Applications folder's Utilities, a terminal window called Terminal. 00:52:05.170 --> 00:52:07.180 And Windows has an analog as well. 00:52:07.180 --> 00:52:10.810 But there are other commands that don't necessarily come with your Mac or PC, 00:52:10.810 --> 00:52:13.900 for instance, a compiler for C or some other tools. 00:52:13.900 --> 00:52:17.650 And so we would encourage you to visit URLs like these on your Mac or PC, 00:52:17.650 --> 00:52:21.490 respectively, if you'd like to just install more of the command line tools 00:52:21.490 --> 00:52:26.120 that you saw and used in CS50 in your own environment. 00:52:26.120 --> 00:52:29.380 Another tool we would recommend that you read up on, or in this case 00:52:29.380 --> 00:52:31.650 watch a video by Brian about, is Git. 00:52:31.650 --> 00:52:35.410 Git is an example of version control, a fundamental building 00:52:35.410 --> 00:52:38.650 block of any good software practice these days. 00:52:38.650 --> 00:52:42.460 We kind of use Git in CS50, but we hide this detail from you. 00:52:42.460 --> 00:52:46.465 Any time you have run check50 or submit50, we, underneath the hood, 00:52:46.465 --> 00:52:51.230 have been running an open-source command called Git, which pushes your code, 00:52:51.230 --> 00:52:54.550 in this case from CS50 IDE to GitHub.com, which is just one 00:52:54.550 --> 00:52:56.980 of several popular websites via which you 00:52:56.980 --> 00:53:00.400 can host code, share code, collaborate on code, run automated tests, 00:53:00.400 --> 00:53:01.220 and the like. 00:53:01.220 --> 00:53:05.020 But Git itself can be used to put an end to the convention 00:53:05.020 --> 00:53:08.473 that you probably have, even with things like Microsoft Word or Google Docs, 00:53:08.473 --> 00:53:11.140 where when you want to save something or another copy of a file, 00:53:11.140 --> 00:53:14.020 maybe you just change the end of the file name to 2, 00:53:14.020 --> 00:53:16.720 and then the next time to 3, or to 4. 00:53:16.720 --> 00:53:19.568 Or maybe you do dash Sunday night, dash Monday morning. 00:53:19.568 --> 00:53:21.610 I mean, I'm still guilty of this sometimes when I 00:53:21.610 --> 00:53:22.960 want to version my files. 00:53:22.960 --> 00:53:24.770 There are better ways to do that. 00:53:24.770 --> 00:53:26.770 And so if you find yourself in the future, 00:53:26.770 --> 00:53:29.650 doing something that you think there's got to be a better way, 00:53:29.650 --> 00:53:33.007 Git is an example of one of those better ways.
00:53:33.007 --> 00:53:34.840 And if you watch this particular video, read 00:53:34.840 --> 00:53:38.950 up a bit more, it will help you not only maintain multiple versions, in essence, 00:53:38.950 --> 00:53:42.010 backups of your own code, it will also empower you ultimately 00:53:42.010 --> 00:53:44.230 to collaborate with others. 00:53:44.230 --> 00:53:46.390 As for text editors, the tool that you might 00:53:46.390 --> 00:53:50.225 use to actually write code, perhaps one of the latest and greatest and most 00:53:50.225 --> 00:53:52.600 popular out there these days is something called VS Code. 00:53:52.600 --> 00:53:56.260 This is an open-source tool that you can download on your own Mac or PC. 00:53:56.260 --> 00:53:58.840 Increasingly, it's available on the web as well. 00:53:58.840 --> 00:54:02.170 But this is one of the most popular tools, certainly, out there today. 00:54:02.170 --> 00:54:03.410 But it's just a text editor. 00:54:03.410 --> 00:54:06.700 And there are absolutely alternatives to each and every one of these tools 00:54:06.700 --> 00:54:09.352 that you're certainly welcome to take a look at as well. 00:54:09.352 --> 00:54:11.560 Well, if you're interested in the web side of things, 00:54:11.560 --> 00:54:14.200 and you want to host a website, like a static website, just 00:54:14.200 --> 00:54:17.560 your own personal homepage, GitHub Pages is a thing. 00:54:17.560 --> 00:54:18.700 Netlify is a thing. 00:54:18.700 --> 00:54:21.800 And dot dot dot, there are so many other web hosts out there, 00:54:21.800 --> 00:54:24.640 many of which offer free or student-level 00:54:24.640 --> 00:54:28.240 accounts so that they don't necessarily need to even cost anything. 00:54:28.240 --> 00:54:30.010 But static is different from dynamic. 00:54:30.010 --> 00:54:33.370 And if you actually want to host a web application that actually takes 00:54:33.370 --> 00:54:35.560 user input, stores things in a database, does 00:54:35.560 --> 00:54:37.650 more interesting things than a static website, 00:54:37.650 --> 00:54:40.150 you might want to use something called Heroku, which is just 00:54:40.150 --> 00:54:43.930 a popular third-party service that also has a free entry-level account that you 00:54:43.930 --> 00:54:46.030 can use to start playing with, quite commonly used 00:54:46.030 --> 00:54:47.560 by students for final projects. 00:54:47.560 --> 00:54:50.630 And then there are other providers out there, bigger cloud providers, 00:54:50.630 --> 00:54:52.888 so to speak, like Amazon, and Microsoft, and Google, 00:54:52.888 --> 00:54:55.180 for which the learning curve's perhaps a little higher. 00:54:55.180 --> 00:54:58.540 But they, too, are really good typically about providing discounts 00:54:58.540 --> 00:55:01.690 or free accounts for student uses as well. 00:55:01.690 --> 00:55:04.180 How to stay abreast of topics in technology. 00:55:04.180 --> 00:55:07.180 We focus, of course, in a class like this really on fundamentals. 00:55:07.180 --> 00:55:10.600 But you're not going to be able to pick up the news in any form down the road 00:55:10.600 --> 00:55:12.790 and not see something that's technology related. 00:55:12.790 --> 00:55:15.850 And if you'd just like to keep your fingers on the pulse of things 00:55:15.850 --> 00:55:18.940 in the tech world more generally, here's just a few places 00:55:18.940 --> 00:55:21.010 that you might enjoy staying abreast of.
00:55:21.010 --> 00:55:24.285 So Reddit has a couple of different communities, or subreddits, specifically 00:55:24.285 --> 00:55:26.410 about programming, both for experienced programmers 00:55:26.410 --> 00:55:28.035 and those of us who are still learning. 00:55:28.035 --> 00:55:29.785 Stack Overflow, of course, you've probably 00:55:29.785 --> 00:55:32.380 used to solve small problems over the course of the past term. 00:55:32.380 --> 00:55:34.610 Server Fault is similar in spirit to that, 00:55:34.610 --> 00:55:38.770 but it's focused more on administration, Linux-type stuff as well. 00:55:38.770 --> 00:55:41.718 TechCrunch is a popular place, not just for consumer-focused news, 00:55:41.718 --> 00:55:43.760 but just really anything that's trending in tech. 00:55:43.760 --> 00:55:47.650 And then a website called Hacker News on Y Combinator's site 00:55:47.650 --> 00:55:49.982 that also is a place to just glance at once in a while, 00:55:49.982 --> 00:55:52.690 because you'll see the latest and greatest libraries or something 00:55:52.690 --> 00:55:53.770 that's quite nascent. 00:55:53.770 --> 00:55:56.230 So if in general you just want to get a sense of what's new 00:55:56.230 --> 00:55:58.150 and what's trending out there in the tech world, things 00:55:58.150 --> 00:56:00.442 that you should just be aware of even if you don't care 00:56:00.442 --> 00:56:02.860 to get into the weeds of doing those things hands on, 00:56:02.860 --> 00:56:06.740 these are all good sites, and surely there are others out there as well. 00:56:06.740 --> 00:56:10.180 And then, CS50, of course, has its own online community, of which 00:56:10.180 --> 00:56:13.420 some of you have been a part for some time, in high school or even prior. 00:56:13.420 --> 00:56:15.700 Please feel free to keep in touch with us in some way, 00:56:15.700 --> 00:56:17.950 or give back a little something to your successors who 00:56:17.950 --> 00:56:20.140 might take this or another course down the road 00:56:20.140 --> 00:56:23.290 and participate not only in asking questions in these communities here, 00:56:23.290 --> 00:56:27.470 but also in answering others' questions as well. 00:56:27.470 --> 00:56:30.100 So we thought we would do a little less of the talking 00:56:30.100 --> 00:56:34.790 now and turn things around for a sort of final community activity together. 00:56:34.790 --> 00:56:37.320 Thanks to many of you who have contributed questions 00:56:37.320 --> 00:56:38.570 over the past couple of weeks. 00:56:38.570 --> 00:56:41.950 Thanks to Brian, we thought we'd put together a CS50 quiz show 00:56:41.950 --> 00:56:43.960 on which to end this final lecture. 00:56:43.960 --> 00:56:47.200 These are questions written by you, by the staff, by Brian. 00:56:47.200 --> 00:56:49.900 And it'll be an opportunity for everyone to buzz in 00:56:49.900 --> 00:56:54.250 with their answers to some 20 questions that we have prepared in advance. 00:56:54.250 --> 00:56:55.640 Time is of the essence. 00:56:55.640 --> 00:56:59.060 So your score will be higher if you buzz in more quickly. 00:56:59.060 --> 00:57:01.150 So it's important not only to be correct, 00:57:01.150 --> 00:57:04.820 but also to be fast for this particular one as well. 00:57:04.820 --> 00:57:08.060 And in just a moment, I'm going to go ahead and share my screen. 00:57:08.060 --> 00:57:12.250 And, again, we'll have some 20 questions here, all of them drawn from, 00:57:12.250 --> 00:57:14.530 inspired by CS50 in some form.
00:57:14.530 --> 00:57:18.640 And after each question, depending on how many people get it right or wrong, 00:57:18.640 --> 00:57:21.970 we'll take a moment to at least explain where it is you went right 00:57:21.970 --> 00:57:23.920 or where it is you went wrong. 00:57:23.920 --> 00:57:24.700 All right, Brian. 00:57:24.700 --> 00:57:26.172 Ready on your end? 00:57:26.172 --> 00:57:27.380 BRIAN YU: We are ready to go. 00:57:27.380 --> 00:57:29.880 DAVID MALAN: All right, well, let's go ahead and take a look 00:57:29.880 --> 00:57:31.150 with the first question here. 00:57:31.150 --> 00:57:35.860 What are the steps for compiling source code into machine code? 00:57:35.860 --> 00:57:38.290 Preprocessing, compiling, assembling, and linking? 00:57:38.290 --> 00:57:40.810 Writing, compiling, debugging, and testing? 00:57:40.810 --> 00:57:43.270 Processing, creating, asserting, and clang? 00:57:43.270 --> 00:57:45.410 Or make? 00:57:45.410 --> 00:57:49.310 Go ahead and buzz in on your phone or laptop or desktop, 00:57:49.310 --> 00:57:51.500 using that same URL that Brian provided. 00:57:51.500 --> 00:57:55.040 You've got 20 seconds for each question, two of which now remain. 00:57:55.040 --> 00:57:55.970 That's it for time. 00:57:55.970 --> 00:57:58.095 Let's go ahead and take a look at the results here. 00:57:58.095 --> 00:58:02.000 It looks like 70% of you said preprocessing, compiling, assembling, 00:58:02.000 --> 00:58:02.750 and linking. 00:58:02.750 --> 00:58:04.987 Brian, would you like to tell us if that's right? 00:58:04.987 --> 00:58:05.570 BRIAN YU: Yes. 00:58:05.570 --> 00:58:06.800 That is the correct answer. 00:58:06.800 --> 00:58:10.010 Preprocessing first, compiling, assembling, and linking, all of that 00:58:10.010 --> 00:58:11.440 is behind the scenes. 00:58:11.440 --> 00:58:13.940 So you don't necessarily think about it every time you compile. 00:58:13.940 --> 00:58:15.590 But those are, indeed, the steps. 00:58:15.590 --> 00:58:18.245 DAVID MALAN: And to be fair, make is arguably an abstraction 00:58:18.245 --> 00:58:20.870 for all of that insofar as it just kicks off the whole process. 00:58:20.870 --> 00:58:24.170 But I think a little more precisely, an answer to steps would be, 00:58:24.170 --> 00:58:25.543 indeed, those four things there. 00:58:25.543 --> 00:58:27.710 All right, let's take a look at the scoreboard here. 00:58:27.710 --> 00:58:29.418 We have a whole number of guest accounts. 00:58:29.418 --> 00:58:33.500 Guest number 200 is in the lead, but tied with several other guests here. 00:58:33.500 --> 00:58:36.240 So those of you with 1,000 points buzzed in really quickly. 00:58:36.240 --> 00:58:37.910 So again time is of the essence. 00:58:37.910 --> 00:58:42.560 Next question, what is the runtime of binary search? 00:58:42.560 --> 00:58:47.330 Is it O of 1, O of log n, O of n, or O of n squared? 00:58:47.330 --> 00:58:52.700 15 seconds remain, the runtime of binary search. 00:58:52.700 --> 00:58:55.280 Recall, this was one of the first algorithms we looked at. 00:58:55.280 --> 00:58:57.350 It was first incarnated with a phone book, 00:58:57.350 --> 00:59:00.500 even if we didn't call it that by name early on. 00:59:00.500 --> 00:59:02.780 Brian, let's take a look at the results. 00:59:02.780 --> 00:59:05.690 Looks like 61% of you say log n. 00:59:05.690 --> 00:59:06.840 Brian? 00:59:06.840 --> 00:59:08.670 BRIAN YU: Log n is the correct answer.
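To make that O(log n) concrete before the phone book recap that follows, here is a minimal iterative binary search in C. This is a sketch for illustration, not CS50's staff solution.

```c
// A minimal iterative binary search, in the spirit of the phone book
// demo: repeatedly halve the sorted array until one "page" remains.
// Runs in O(log n) because n can only be halved about log2(n) times.
#include <stdio.h>

// Returns the index of target in sorted array a[0..n-1], or -1 if absent.
int binary_search(const int a[], int n, int target)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;   // avoids overflow of (lo + hi)
        if (a[mid] == target)
            return mid;
        else if (a[mid] < target)
            lo = mid + 1;               // search the right half
        else
            hi = mid - 1;               // search the left half
    }
    return -1;
}

int main(void)
{
    int pages[] = {2, 3, 5, 7, 11, 13, 17};
    printf("%d\n", binary_search(pages, 7, 11));  // prints 4
}
```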
00:59:08.670 --> 00:59:10.040 If you remember that phone book, the question 00:59:10.040 --> 00:59:11.900 really came down to how many times can we 00:59:11.900 --> 00:59:13.940 divide that phone book in half again and again 00:59:13.940 --> 00:59:16.590 and again, until we get down to just one page. 00:59:16.590 --> 00:59:20.120 And that turns out to be log of n if there are n pages in the phone book. 00:59:20.120 --> 00:59:23.240 DAVID MALAN: Indeed, and sort of pro tip moving forward in life, any time 00:59:23.240 --> 00:59:26.150 you see something happening in half and half and half and half, 00:59:26.150 --> 00:59:28.670 odds are there's going to be a logarithm involved 00:59:28.670 --> 00:59:31.070 somewhere in the analysis thereof. 00:59:31.070 --> 00:59:34.710 All right, next leaderboard here, guest 200 slipped down a little bit. 00:59:34.710 --> 00:59:39.200 But we have a whole bunch of people tied in first place for 2,000 points now. 00:59:39.200 --> 00:59:42.500 Next question, which of these animals was the first 00:59:42.500 --> 00:59:45.620 to be mentioned in a CS50 lecture? 00:59:45.620 --> 00:59:50.250 Llama, python, duck, cat. 00:59:50.250 --> 00:59:52.750 15 seconds remain. 00:59:52.750 --> 00:59:57.930 Which was mentioned first in a CS50 lecture? 00:59:57.930 --> 00:59:59.910 And let's see the results. 00:59:59.910 --> 01:00:04.500 Looks like cat just barely eked out duck with 51%. 01:00:04.500 --> 01:00:05.177 Brian? 01:00:05.177 --> 01:00:07.260 BRIAN YU: And cat is, in fact, the correct answer. 01:00:07.260 --> 01:00:10.520 Llamas showed up in Lab 1, but they were not mentioned in lecture. 01:00:10.520 --> 01:00:12.270 The duck didn't show up until a little bit 01:00:12.270 --> 01:00:13.890 later when we talked about debugging. 01:00:13.890 --> 01:00:17.280 And Python was briefly mentioned at the end of the lecture. 01:00:17.280 --> 01:00:19.590 But it was after we introduced ourselves to Scratch. 01:00:19.590 --> 01:00:22.350 And the main character in Scratch is, of course, the cat. 01:00:22.350 --> 01:00:25.600 DAVID MALAN: All right, we're probably going to see a bit of spread here soon. 01:00:25.600 --> 01:00:27.852 We have a whole bunch of people with 3,000, though. 01:00:27.852 --> 01:00:29.310 But the names are starting to vary. 01:00:29.310 --> 01:00:30.990 Let's move on to the next question. 01:00:30.990 --> 01:00:34.920 Every time you malloc memory, you must also be sure to-- 01:00:34.920 --> 01:00:39.690 realloc, return, free, or exit? 01:00:39.690 --> 01:00:44.310 Every time you malloc memory, you should also be sure to realloc, 01:00:44.310 --> 01:00:48.270 return, free, or exit? 01:00:48.270 --> 01:00:51.420 Recall that malloc was the source of a lot of segmentation faults 01:00:51.420 --> 01:00:52.230 mid-semester. 01:00:52.230 --> 01:00:55.950 The responses now are 78% said free. 01:00:55.950 --> 01:00:57.042 Brian, do you concur? 01:00:57.042 --> 01:00:58.500 BRIAN YU: And they are all correct. 01:00:58.500 --> 01:01:01.542 Whenever you malloc memory, you ask the computer for some memory dynamically. 01:01:01.542 --> 01:01:04.250 When you're done with it, you should give it back to the computer 01:01:04.250 --> 01:01:05.010 by calling free. 01:01:05.010 --> 01:01:06.927 DAVID MALAN: Indeed, and Brian, as a teachable 01:01:06.927 --> 01:01:10.140 moment, why is it that we never had to call free for get_string, which we now 01:01:10.140 --> 01:01:12.090 know underneath the hood is using something 01:01:12.090 --> 01:01:14.070 like malloc to allocate memory?
01:01:14.070 --> 01:01:16.500 BRIAN YU: So get_string was a function in CS50's library. 01:01:16.500 --> 01:01:19.890 And CS50's library takes care of that memory management process for you. 01:01:19.890 --> 01:01:23.070 So you didn't have to worry about freeing all of that memory yourself. 01:01:23.070 --> 01:01:26.250 DAVID MALAN: Indeed, but anytime you call malloc, you must call free. 01:01:26.250 --> 01:01:28.440 All right, the leaderboard here looks like we have 01:01:28.440 --> 01:01:30.840 guest 600 still in the lead with 4,000. 01:01:30.840 --> 01:01:31.860 Next question. 01:01:31.860 --> 01:01:34.550 What is a race condition? 01:01:34.550 --> 01:01:37.040 When conditions are nice out for racing? 01:01:37.040 --> 01:01:40.670 When two things happen at the same time and produce an unexpected result? 01:01:40.670 --> 01:01:43.170 When a line of code is executed too quickly? 01:01:43.170 --> 01:01:45.830 When a line of code is executed too slowly? 01:01:45.830 --> 01:01:49.130 What is a race condition? 01:01:49.130 --> 01:01:50.850 Ah, things just escalated quickly. 01:01:50.850 --> 01:01:54.590 But you'll recall this came up in the context of SQL 01:01:54.590 --> 01:01:57.020 and databases. Zero seconds, let's see. 01:01:57.020 --> 01:02:00.170 85% said when two things happen at the same time 01:02:00.170 --> 01:02:02.138 and produce an unexpected result. Brian? 01:02:02.138 --> 01:02:03.680 BRIAN YU: That is the correct answer. 01:02:03.680 --> 01:02:06.650 I appreciate that at least 1% of people said when conditions outside 01:02:06.650 --> 01:02:07.640 are nice for racing. 01:02:07.640 --> 01:02:10.078 But in the context of computer science, at least, 01:02:10.078 --> 01:02:13.370 when two things happen at the same time and could produce an unexpected result, 01:02:13.370 --> 01:02:15.680 that is what we would refer to as a race condition. 01:02:15.680 --> 01:02:17.330 DAVID MALAN: Indeed, recall that's how Brian and I ended up 01:02:17.330 --> 01:02:18.973 with too much milk in the refrigerator. 01:02:18.973 --> 01:02:21.140 Because we both inspected the state of that variable 01:02:21.140 --> 01:02:22.955 at essentially the same time. 01:02:22.955 --> 01:02:24.330 All right, the leaderboard here. 01:02:24.330 --> 01:02:26.630 Now we have a whole bunch of people with 5,000 points. 01:02:26.630 --> 01:02:27.590 Let's move on. 01:02:27.590 --> 01:02:32.660 Does zooming in on a photo let you enhance it to generate more detail? 01:02:32.660 --> 01:02:35.210 Yes, just like in CSI. 01:02:35.210 --> 01:02:39.370 No, a photo only has a certain amount of detail. 01:02:39.370 --> 01:02:45.120 Does zooming in on a photo let you enhance it to generate more detail? 01:02:45.120 --> 01:02:48.390 And I will admit, I was watching some show recently and thought of you 01:02:48.390 --> 01:02:50.340 all when they literally said, enhance. 01:02:50.340 --> 01:02:52.490 All right, 0 seconds. 01:02:52.490 --> 01:02:56.970 Looks like 93% of you said, no, a photo only has a certain amount of detail. 01:02:56.970 --> 01:02:59.010 7% of you said yes, just like in CSI. 01:02:59.010 --> 01:03:01.290 Brian, can you help us reconcile the two? 01:03:01.290 --> 01:03:03.480 BRIAN YU: The 93%, in this case, are correct. 01:03:03.480 --> 01:03:05.800 A photo only has a certain number of pixels. 01:03:05.800 --> 01:03:08.100 And if you keep zooming in on one pixel, you're 01:03:08.100 --> 01:03:10.470 not going to be able to generate additional detail that 01:03:10.470 --> 01:03:11.528 wasn't there before.
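Backing up to the race condition question for a moment, here is a minimal sketch of one in C using POSIX threads; it is an illustration, not code from lecture. Both threads perform an unprotected read-modify-write on a shared counter, much like two roommates each checking the fridge before buying milk.

```c
// A minimal sketch of a race condition (compile with: clang -pthread race.c).
// Both threads read, increment, and write back the shared counter, so
// their updates can interleave and some increments get lost.
#include <pthread.h>
#include <stdio.h>

int counter = 0;  // shared state, deliberately unprotected

void *increment(void *arg)
{
    for (int i = 0; i < 1000000; i++)
    {
        counter = counter + 1;  // read-modify-write: not atomic
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, increment, NULL);
    pthread_create(&t2, NULL, increment, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    // Often prints less than 2000000, because increments were lost.
    printf("%d\n", counter);
}
```

The usual fix is to make the check and the update one atomic step, for instance by guarding the increment with a mutex, just as a database would wrap the milk check in a transaction.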
01:03:11.528 --> 01:03:14.070 DAVID MALAN: And to be fair, that's kind of sort of changing. 01:03:14.070 --> 01:03:15.840 Or at least the answer is getting a little harder nowadays 01:03:15.840 --> 01:03:18.450 with machine learning or artificial intelligence, 01:03:18.450 --> 01:03:21.210 where algorithms sort of figure out what level of detail 01:03:21.210 --> 01:03:22.750 could or should be there. 01:03:22.750 --> 01:03:24.810 But that really is just statistical inference; 01:03:24.810 --> 01:03:27.810 it is not actually recovering information that was ever stored 01:03:27.810 --> 01:03:29.820 on the camera or some other device. 01:03:29.820 --> 01:03:34.290 All right, the leaderboard now is at 6,000 points with these folks tied. 01:03:34.290 --> 01:03:38.670 Which of the following is not a characteristic of a good hash function? 01:03:38.670 --> 01:03:44.280 Deterministic output, randomness, uniform distribution, efficiency. 01:03:44.280 --> 01:03:45.780 Things just got real again. 01:03:45.780 --> 01:03:50.160 Which of the following is not a characteristic of a good hash function? 01:03:50.160 --> 01:03:54.600 Recall we used hash functions in the context of hash tables 01:03:54.600 --> 01:03:57.070 when talking about data structures. 01:03:57.070 --> 01:03:57.570 All right? 01:03:57.570 --> 01:03:58.770 One second. 01:03:58.770 --> 01:04:00.930 The answers are more spread this time. 01:04:00.930 --> 01:04:03.270 62% don't like randomness. 01:04:03.270 --> 01:04:04.172 Brian, should they? 01:04:04.172 --> 01:04:05.880 BRIAN YU: And that is the correct answer. 01:04:05.880 --> 01:04:08.670 Randomness is not a characteristic of a good hash function. 01:04:08.670 --> 01:04:10.650 You want your hash function to always give you 01:04:10.650 --> 01:04:12.330 the same output given the same input. 01:04:12.330 --> 01:04:15.150 That way you can rely on whatever the output of it is. 01:04:15.150 --> 01:04:17.290 If it's random, it's going to be hard to use. 01:04:17.290 --> 01:04:19.123 DAVID MALAN: Indeed, consider a spellchecker 01:04:19.123 --> 01:04:20.550 that randomly says yes or no as to whether 01:04:20.550 --> 01:04:23.250 this is a word; probably not a property you want. 01:04:23.250 --> 01:04:26.430 All right, the leaderboard now, we're eking our way up to 7,000 points, 01:04:26.430 --> 01:04:28.590 but finally starting to see some spread. 01:04:28.590 --> 01:04:31.260 So a few of you haven't been quite quick or correct enough. 01:04:31.260 --> 01:04:34.140 Next question, what does FIFO stand for? 01:04:34.140 --> 01:04:35.550 FIFO. 01:04:35.550 --> 01:04:40.620 Is it a common dog's name, your credit score, first in, first out, 01:04:40.620 --> 01:04:43.580 function input, file output? 01:04:43.580 --> 01:04:46.700 What does FIFO stand for? 01:04:46.700 --> 01:04:49.220 I'll be curious to see the spread here. 01:04:49.220 --> 01:04:50.330 Let's see. 01:04:50.330 --> 01:04:52.190 80% of you said first in, first out. 01:04:52.190 --> 01:04:52.777 Brian? 01:04:52.777 --> 01:04:53.860 BRIAN YU: That is correct. 01:04:53.860 --> 01:04:56.090 And that was what we were using to describe 01:04:56.090 --> 01:04:58.490 what we would call a queue, where the first thing in the queue 01:04:58.490 --> 01:05:00.448 is the first thing that comes out of the queue. 01:05:00.448 --> 01:05:02.443 So it obeys that FIFO ordering. 01:05:02.443 --> 01:05:04.610 DAVID MALAN: Indeed, let's see the leaderboard here. 01:05:04.610 --> 01:05:07.880 All right, we have some 8,000s, but more in the 7,000 range.
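For reference, here is a minimal sketch of that FIFO ordering in C, using a circular array-based queue. It is an illustration only, not CS50 distribution code.

```c
// A minimal array-based queue: enqueue at the back, dequeue from the
// front, so the first item in is the first item out (FIFO).
#include <stdio.h>

#define CAPACITY 10

typedef struct
{
    int items[CAPACITY];
    int front;   // index of the first item in line
    int size;    // how many items are currently queued
} queue;

void enqueue(queue *q, int item)
{
    if (q->size < CAPACITY)
    {
        q->items[(q->front + q->size) % CAPACITY] = item;  // wrap around
        q->size++;
    }
}

// Caller must ensure the queue is non-empty before dequeuing.
int dequeue(queue *q)
{
    int item = q->items[q->front];
    q->front = (q->front + 1) % CAPACITY;
    q->size--;
    return item;
}

int main(void)
{
    queue q = {.front = 0, .size = 0};
    enqueue(&q, 1);
    enqueue(&q, 2);
    printf("%d\n", dequeue(&q));  // prints 1: first in, first out
}
```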
01:05:07.880 --> 01:05:09.860 Next up is a more colorful question. 01:05:09.860 --> 01:05:13.760 Which of the following would represent pink using RGB values? 01:05:13.760 --> 01:05:15.980 And I'll let you read these on your own. 01:05:15.980 --> 01:05:19.730 And surely, there's some Googling happening behind the scenes now. 01:05:19.730 --> 01:05:21.050 But that's OK. 01:05:21.050 --> 01:05:22.430 In fact, Google is pretty smart. 01:05:22.430 --> 01:05:24.860 If you type in a hexadecimal code, it might even 01:05:24.860 --> 01:05:28.017 show you a little color wheel or swatch. 01:05:28.017 --> 01:05:29.850 All right, let's take a look at the results. 01:05:29.850 --> 01:05:33.590 Looks like 55% of you said ffd0e0. 01:05:33.590 --> 01:05:34.370 Brian? 01:05:34.370 --> 01:05:35.670 BRIAN YU: And that is correct. 01:05:35.670 --> 01:05:38.450 So those RGB values are six hexadecimal digits, 01:05:38.450 --> 01:05:42.438 where each two correspond to one color, two for red, two for green, 01:05:42.438 --> 01:05:42.980 two for blue. 01:05:42.980 --> 01:05:44.300 This is all in hexadecimal. 01:05:44.300 --> 01:05:46.668 And pink would be a lot of each of them. 01:05:46.668 --> 01:05:48.710 Because it's very close to white, which is, like, 01:05:48.710 --> 01:05:50.360 all red, all green, and all blue. 01:05:50.360 --> 01:05:52.740 But it's more red than it is green and blue. 01:05:52.740 --> 01:05:58.130 And so that one, ffd0e0, is a lot of red, a little bit less green, 01:05:58.130 --> 01:05:59.460 and a little bit less blue. 01:05:59.460 --> 01:06:00.080 DAVID MALAN: Indeed. 01:06:00.080 --> 01:06:01.788 All right, let's see where we're at here. 01:06:01.788 --> 01:06:03.560 We're now up to-- interesting. 01:06:03.560 --> 01:06:05.300 No one has a perfect score anymore. 01:06:05.300 --> 01:06:09.800 But guest 200 is still in the lead with just shy of 9,000 points. 01:06:09.800 --> 01:06:12.330 In C, which of the following lines of code allocates 01:06:12.330 --> 01:06:17.090 enough memory for a copy of the string s? 01:06:17.090 --> 01:06:19.160 I'll let you read these. 01:06:19.160 --> 01:06:21.680 In C, which of the following lines of code allocates 01:06:21.680 --> 01:06:25.970 enough memory for a copy of the string s? 01:06:25.970 --> 01:06:28.250 Bunch of viable choices here, it would seem. 01:06:28.250 --> 01:06:31.130 And time, let's take a look at the results. 01:06:31.130 --> 01:06:34.880 Looks like 46% said malloc of size s. 01:06:34.880 --> 01:06:39.110 But Brian, 33% said malloc of strlen of s plus 1. 01:06:39.110 --> 01:06:40.490 Who is right? 01:06:40.490 --> 01:06:43.610 BRIAN YU: And in this case, the minority, the 33%, are correct here. 01:06:43.610 --> 01:06:46.940 Malloc, remember, takes as its argument the number of bytes of memory 01:06:46.940 --> 01:06:48.080 that you want to allocate. 01:06:48.080 --> 01:06:50.850 And if you have a string and you want to figure out how many bytes you need, 01:06:50.850 --> 01:06:52.600 the first thing you need to do is figure out 01:06:52.600 --> 01:06:54.680 how long that string is. strlen will tell you 01:06:54.680 --> 01:06:56.690 how many characters are in that string. 01:06:56.690 --> 01:07:00.050 But you do need one additional byte, because at the end of every string, 01:07:00.050 --> 01:07:01.880 we have that null terminating character. 01:07:01.880 --> 01:07:03.680 And we need one byte of memory for that. 01:07:03.680 --> 01:07:07.040 So strlen of s will give you the length of the string, plus 1.
01:07:07.040 --> 01:07:09.170 That's how many bytes you need for memory. 01:07:09.170 --> 01:07:09.590 DAVID MALAN: Indeed. 01:07:09.590 --> 01:07:11.007 And see, you get nothing for free. 01:07:11.007 --> 01:07:12.830 Anything you want you need to do yourself. 01:07:12.830 --> 01:07:15.740 And indeed, the plus 1 is a problem for you to solve. 01:07:15.740 --> 01:07:19.580 The distribution now, guest 200 still in the lead with just 01:07:19.580 --> 01:07:20.900 shy of 10,000 points. 01:07:20.900 --> 01:07:22.010 That was question 10. 01:07:22.010 --> 01:07:23.750 We're in the second half of the game. 01:07:23.750 --> 01:07:26.930 How should you organize your clothes to be cool? 01:07:26.930 --> 01:07:28.100 This is number 11. 01:07:28.100 --> 01:07:32.120 Stack, queue, dictionary, binary tree. 01:07:32.120 --> 01:07:35.840 How should you organize your clothes to be cool? 01:07:35.840 --> 01:07:39.830 You might recall Jack and Lou, who taught us this one. 01:07:39.830 --> 01:07:41.740 Two seconds remain. 01:07:41.740 --> 01:07:45.000 And it looks like 48% said queue, Brian. 01:07:45.000 --> 01:07:46.250 BRIAN YU: And that is correct. 01:07:46.250 --> 01:07:48.560 So from that video with Jack and Lou, there 01:07:48.560 --> 01:07:50.442 were different ways of organizing the clothes. 01:07:50.442 --> 01:07:52.150 But the conclusion of that video was, you 01:07:52.150 --> 01:07:53.830 want to put your clothes in a queue. 01:07:53.830 --> 01:07:56.290 So that after you're done with one, you put it at the end of the queue. 01:07:56.290 --> 01:07:59.470 And you use something else before you go back to the one you already wore. 01:07:59.470 --> 01:08:00.070 DAVID MALAN: Indeed. 01:08:00.070 --> 01:08:00.570 All right. 01:08:00.570 --> 01:08:03.040 And the leaderboard now, looks like guest 10 broke 10,000. 01:08:03.040 --> 01:08:04.630 But so did a bunch of other people. 01:08:04.630 --> 01:08:07.540 Next question, what is a segmentation fault? 01:08:07.540 --> 01:08:10.060 When a computer runs out of memory, when our program tries 01:08:10.060 --> 01:08:13.480 to read an empty file, when a program tries to access memory that it 01:08:13.480 --> 01:08:16.620 shouldn't, when an earthquake happens. 01:08:16.620 --> 01:08:19.609 Looks like a lot of these could be pretty close. 01:08:19.609 --> 01:08:21.260 Two seconds. 01:08:21.260 --> 01:08:22.963 And let's see. 01:08:22.963 --> 01:08:26.130 Looks like 80% said when a program tries to access memory that it shouldn't. 01:08:26.130 --> 01:08:26.810 Brian? 01:08:26.810 --> 01:08:28.160 BRIAN YU: That is the correct answer. 01:08:28.160 --> 01:08:30.077 A segmentation fault can happen if you're trying 01:08:30.077 --> 01:08:33.790 to touch memory that you're not supposed to have access to inside of a program. 01:08:33.790 --> 01:08:35.540 DAVID MALAN: And for the 13% of people who 01:08:35.540 --> 01:08:38.927 said when a computer runs out of memory, why is that not quite the answer here? 01:08:38.927 --> 01:08:41.010 BRIAN YU: So the computer could run out of memory. 01:08:41.010 --> 01:08:43.640 When you call malloc, malloc might return null, 01:08:43.640 --> 01:08:46.040 because there's no available memory to allocate. 01:08:46.040 --> 01:08:48.707 But as long as you check for that, and we try to encourage you, 01:08:48.707 --> 01:08:52.100 whenever you're mallocing memory, to check to see if the value you get back 01:08:52.100 --> 01:08:52.760 is null, 01:08:52.760 --> 01:08:54.805 that can help you to avoid those types of errors.
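Putting those last few answers together, here is a minimal C sketch of copying a string correctly: allocate strlen(s) + 1 bytes, check malloc's return value against NULL, and free the copy when done. It is an illustration, not code from a problem set.

```c
// Copying a string in C: strlen(s) counts the characters, and the +1
// makes room for the '\0' terminator that ends every C string.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    const char *s = "hello";

    char *t = malloc(strlen(s) + 1);  // +1 for the null terminator
    if (t == NULL)                    // malloc can fail: check, don't crash
    {
        return 1;
    }
    strcpy(t, s);                     // safe: t is exactly big enough

    printf("%s\n", t);
    free(t);                          // every malloc needs a matching free
    return 0;
}
```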
01:08:54.805 --> 01:08:56.930 DAVID MALAN: So let's take a look at the board now. 01:08:56.930 --> 01:08:59.569 11,000 something for guest 200. 01:08:59.569 --> 01:09:01.670 Let's now proceed with this question. 01:09:01.670 --> 01:09:05.689 Which of the following types of overflow can result from recursion 01:09:05.689 --> 01:09:07.340 without a base case? 01:09:07.340 --> 01:09:13.359 Heap overflow, integer overflow, stack overflow, buffer overflow. 01:09:13.359 --> 01:09:16.370 And all forms of overflow, indeed, came up. 01:09:16.370 --> 01:09:19.510 One of them is also, of course, the name of a popular website. 01:09:19.510 --> 01:09:21.580 But all of these are actual things. 01:09:21.580 --> 01:09:23.660 But which is correct? 01:09:23.660 --> 01:09:25.149 All right, let's see the results. 01:09:25.149 --> 01:09:29.319 Looks like 60% went with stack overflow. 01:09:29.319 --> 01:09:29.990 Brian? 01:09:29.990 --> 01:09:31.240 BRIAN YU: And that is correct. 01:09:31.240 --> 01:09:34.598 Every time you call a function, you end up getting a little bit of memory 01:09:34.598 --> 01:09:35.890 on the stack for that function. 01:09:35.890 --> 01:09:38.515 And if you keep calling that function recursively over and over 01:09:38.515 --> 01:09:40.960 and never stop, because there's no base case, 01:09:40.960 --> 01:09:42.460 then you can run out of stack space. 01:09:42.460 --> 01:09:43.997 And we call that a stack overflow. 01:09:43.997 --> 01:09:44.830 DAVID MALAN: Indeed.
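As a minimal sketch, here is a hypothetical function with no base case: each recursive call consumes another stack frame, none ever returns, and eventually the stack overflows.

    // WARNING: running this will, by design, eventually crash.
    void count_up(int n)
    {
        // No base case: every call immediately makes another call.
        count_up(n + 1);
    }

    int main(void)
    {
        // Compile with optimizations off (e.g., -O0): an optimizing
        // compiler might turn this tail call into a loop that never crashes.
        count_up(0);
    }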
01:09:44.830 --> 01:09:46.660 All right, let's see the leaderboard now. 01:09:46.660 --> 01:09:48.970 Guest 200, still the one to beat. 01:09:48.970 --> 01:09:52.149 But guest 216 is not too far behind. 01:09:52.149 --> 01:09:53.020 Next question. 01:09:53.020 --> 01:09:55.630 In the town of Fiftyville, what were the names 01:09:55.630 --> 01:09:59.680 of the three people who witnessed the rubber duck robbery? 01:09:59.680 --> 01:10:01.720 I'll let you read these. 01:10:01.720 --> 01:10:04.660 In the town of Fiftyville, what were the names of the three people 01:10:04.660 --> 01:10:06.430 who witnessed the rubber duck robbery? 01:10:06.430 --> 01:10:10.810 A new problem this year, recall that the duck disappeared altogether from the IDE 01:10:10.810 --> 01:10:11.770 for that week. 01:10:11.770 --> 01:10:13.540 All right, let's see the results. 01:10:13.540 --> 01:10:15.460 Brian, this one is close. 01:10:15.460 --> 01:10:17.875 33% said Ruth, Eugene, and Raymond. 01:10:17.875 --> 01:10:20.000 BRIAN YU: And Ruth, Eugene, and Raymond is correct. 01:10:20.000 --> 01:10:22.083 They got more responses than any of the others. 01:10:22.083 --> 01:10:22.930 It was tricky. 01:10:22.930 --> 01:10:24.430 But yeah, that's the correct answer. 01:10:24.430 --> 01:10:26.500 There wasn't a whole lot of reason behind the names. 01:10:26.500 --> 01:10:28.300 I put a lot of thought into the story itself. 01:10:28.300 --> 01:10:29.500 But not a lot of thought into the names. 01:10:29.500 --> 01:10:31.630 Those were kind of just randomly selected. 01:10:31.630 --> 01:10:33.767 But those were the names of the witnesses. 01:10:33.767 --> 01:10:34.600 DAVID MALAN: Indeed. 01:10:34.600 --> 01:10:38.380 And the leaderboard now, we still have guest 200 as the one to beat. 01:10:38.380 --> 01:10:39.530 This is question 15. 01:10:39.530 --> 01:10:42.280 So we are nearing the end, still a chance to pull ahead. 01:10:42.280 --> 01:10:46.180 Which of these command line programs checks your code for memory leaks? 01:10:46.180 --> 01:10:50.110 Valgrind, clang, mkdir, make. 01:10:50.110 --> 01:10:53.860 Notice that none of these has a 50 in it, which means these are all real-world 01:10:53.860 --> 01:10:56.950 commands that you would continue to see on your own Mac or PC 01:10:56.950 --> 01:10:59.800 or some future Linux system. 01:10:59.800 --> 01:11:02.380 And let's see the results. 01:11:02.380 --> 01:11:05.020 Here we have valgrind, the clear winner, 78%. 01:11:05.020 --> 01:11:05.560 Brian? 01:11:05.560 --> 01:11:06.760 BRIAN YU: And that's the correct answer. 01:11:06.760 --> 01:11:09.302 That's the program you can use in order to check your program 01:11:09.302 --> 01:11:11.380 to see if you have any memory leaks, to see 01:11:11.380 --> 01:11:14.890 if you're touching memory you shouldn't, if you're forgetting to free something. 01:11:14.890 --> 01:11:16.100 Valgrind is useful for all of that. 01:11:16.100 --> 01:11:19.030 DAVID MALAN: And if I may, I feel like 5% of you are just messing with us now. 01:11:19.030 --> 01:11:20.320 Hopefully, but we shall see. 01:11:20.320 --> 01:11:22.300 All right, last five questions to go. 01:11:22.300 --> 01:11:26.210 Taking a look at the leaderboard now, guest 200's still up at the top. 01:11:26.210 --> 01:11:30.130 Which of the following exists in C, but not Python? 01:11:30.130 --> 01:11:32.800 Boolean expressions, do-while loops, recursive 01:11:32.800 --> 01:11:35.740 functions, floating-point numbers. 01:11:35.740 --> 01:11:40.450 Which of the following exists in C, but not Python? 01:11:40.450 --> 01:11:44.740 An interesting comparison between two languages that goes beyond syntax. 01:11:44.740 --> 01:11:45.580 All right. 01:11:45.580 --> 01:11:46.330 Time's up. 01:11:46.330 --> 01:11:47.590 Let's take a look. 01:11:47.590 --> 01:11:50.380 Looks like 68% went with do-while loops. 01:11:50.380 --> 01:11:51.160 Brian? 01:11:51.160 --> 01:11:52.360 BRIAN YU: That is correct. 01:11:52.360 --> 01:11:53.560 Python has for loops. 01:11:53.560 --> 01:11:54.910 Python has while loops. 01:11:54.910 --> 01:11:57.730 But it doesn't have do-while loops in the same way that C does. 01:11:57.730 --> 01:12:00.190 You'd have to find some other way of trying to achieve 01:12:00.190 --> 01:12:01.810 that same kind of logical idea. 01:12:01.810 --> 01:12:02.200 DAVID MALAN: Indeed. 01:12:02.200 --> 01:12:04.408 And Brian, what was the approach that we took when we 01:12:04.408 --> 01:12:06.010 tried to recreate that some weeks ago? 01:12:06.010 --> 01:12:06.250 BRIAN YU: Yeah. 01:12:06.250 --> 01:12:09.070 So one approach to it is having an infinite loop, while true, 01:12:09.070 --> 01:12:10.780 that will just always repeat. 01:12:10.780 --> 01:12:13.720 And then when you reach a point where you can exit the loop, 01:12:13.720 --> 01:12:16.000 you can use the keyword break to get out of the loop 01:12:16.000 --> 01:12:17.627 and move on to the rest of the program. 01:12:17.627 --> 01:12:18.460 DAVID MALAN: Indeed. 01:12:18.460 --> 01:12:20.210 All right, let's take a look at the board. 01:12:20.210 --> 01:12:25.810 Guest 200 is now at 15,938, but still with a few close folks behind.
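As a minimal sketch of that difference, here is a do-while loop in C that prompts until the user cooperates, with Brian's while-True-and-break workaround sketched in a comment; the prompt itself is only illustrative.

    #include <stdio.h>

    int main(void)
    {
        int n;
        do
        {
            // The body runs once before the condition is ever checked,
            // which Python's while loop cannot express directly.
            printf("Positive number: ");
            if (scanf("%d", &n) != 1)
            {
                return 1; // give up on non-numeric input
            }
        }
        while (n < 1);

        // The Python workaround Brian describes, for comparison:
        //
        //     while True:
        //         n = int(input("Positive number: "))
        //         if n >= 1:
        //             break

        printf("You typed %d\n", n);
    }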
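And looking back at the valgrind question, here is a small, deliberately leaky program one might hand to valgrind; the file name leak.c and the exact flags in the comment are just one plausible invocation.

    #include <stdlib.h>

    int main(void)
    {
        // Allocate 40 bytes and never free them: a memory leak.
        int *x = malloc(10 * sizeof(int));
        if (x == NULL)
        {
            return 1;
        }
        x[0] = 50;
        return 0; // missing free(x), which valgrind will flag
    }

    // One plausible way to check it, assuming the file is named leak.c:
    //   clang -ggdb3 -O0 -o leak leak.c
    //   valgrind ./leak
    // valgrind should then report 40 bytes as "definitely lost."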
01:12:25.810 --> 01:12:30.310 What HTTP request method should you use when sending private information 01:12:30.310 --> 01:12:31.600 like a password? 01:12:31.600 --> 01:12:35.200 GET, POST, SELECT, or TEXT? 01:12:35.200 --> 01:12:37.900 Which HTTP request method should you use when sending 01:12:37.900 --> 01:12:41.860 private information like a password? 01:12:41.860 --> 01:12:43.510 Take a look at the results. 01:12:43.510 --> 01:12:44.290 All right. 01:12:44.290 --> 01:12:48.548 And the distribution is a lot of people said POST, Brian, 74%. 01:12:48.548 --> 01:12:49.840 BRIAN YU: And they are correct. 01:12:49.840 --> 01:12:52.007 Yeah, if it was a GET request, then you would end up 01:12:52.007 --> 01:12:54.790 with sensitive information inside the URL that might show up 01:12:54.790 --> 01:12:56.695 in your browsing history, for example. 01:12:56.695 --> 01:12:59.600 So to be secure, you want to be sure to use the POST request 01:12:59.600 --> 01:13:00.850 method for that type of stuff. 01:13:00.850 --> 01:13:02.410 DAVID MALAN: And to be clear, don't use GET for this. 01:13:02.410 --> 01:13:04.310 GET is possible, and we saw how to do that. 01:13:04.310 --> 01:13:07.420 But of course, that then ends up in your history and other exposed places. 01:13:07.420 --> 01:13:10.550 SELECT and TEXT, meanwhile, are not HTTP verbs at all. 01:13:10.550 --> 01:13:12.010 So POST is indeed spot on. 01:13:12.010 --> 01:13:14.140 All right, only three questions remain. 01:13:14.140 --> 01:13:17.830 Guest 200 is still the one to beat, followed by guest 216. 01:13:17.830 --> 01:13:22.870 What data structure allows for constant-time lookup of words in a dictionary? 01:13:22.870 --> 01:13:28.330 A linked list, a binary search tree, an array, or a trie? 01:13:28.330 --> 01:13:30.730 Recall that a dictionary was an abstract data 01:13:30.730 --> 01:13:33.880 type, insofar as you could implement it in different ways. 01:13:33.880 --> 01:13:37.000 But to get constant-time lookup, you might want 01:13:37.000 --> 01:13:40.780 to use one of these over the others. 01:13:40.780 --> 01:13:42.570 Let's see the results. 01:13:42.570 --> 01:13:43.800 Interesting. 01:13:43.800 --> 01:13:45.870 Brian, 32% said trie. 01:13:45.870 --> 01:13:47.010 Can you help us out here? 01:13:47.010 --> 01:13:47.310 BRIAN YU: Yeah. 01:13:47.310 --> 01:13:48.602 The trie is the correct answer. 01:13:48.602 --> 01:13:52.020 For all of the others, the linked list, the binary search tree, and the array, 01:13:52.020 --> 01:13:54.220 as you have more and more words in the dictionary, 01:13:54.220 --> 01:13:56.580 it's going to take longer and longer to find a word, 01:13:56.580 --> 01:13:58.650 as you have to either linearly search through it 01:13:58.650 --> 01:14:02.340 or go down through various nodes in the binary search tree. 01:14:02.340 --> 01:14:05.580 The trie, on the other hand, only depends upon the length of the word 01:14:05.580 --> 01:14:06.540 that you're looking up. 01:14:06.540 --> 01:14:08.970 It doesn't matter how many words are in the dictionary. 01:14:08.970 --> 01:14:12.090 You just follow one node for each letter in the word you're looking up. 01:14:12.090 --> 01:14:14.490 And you'll find that word in constant time. 01:14:14.490 --> 01:14:17.440 DAVID MALAN: And Brian, if constant time, Big O of 1, is so good, 01:14:17.440 --> 01:14:19.107 why not use tries, then, for everything? 01:14:19.107 --> 01:14:21.273 BRIAN YU: Well, there are trade-offs for everything. 01:14:21.273 --> 01:14:23.640 The trie gives you theoretically constant time. 01:14:23.640 --> 01:14:25.440 But one of the big trade-offs is memory. 01:14:25.440 --> 01:14:27.570 Tries end up using much more memory to be 01:14:27.570 --> 01:14:31.297 able to store a dictionary than many of those other data structures would. 01:14:31.297 --> 01:14:32.130 DAVID MALAN: Indeed.
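A minimal sketch in C of such a trie, assuming lowercase words only; the array of 26 child pointers per node is exactly where that memory trade-off comes from.

    #include <stdbool.h>
    #include <stddef.h>

    #define ALPHABET 26

    // One child pointer per letter: even mostly-empty nodes carry all
    // 26 pointers, which is why tries use so much memory.
    typedef struct node
    {
        bool is_word;                    // does a word end at this node?
        struct node *children[ALPHABET]; // one pointer per letter, a-z
    }
    node;

    // Lookup time depends only on the word's length, never on how many
    // words the trie holds: one step per letter.
    bool check(const node *root, const char *word)
    {
        const node *cursor = root;
        for (int i = 0; cursor != NULL && word[i] != '\0'; i++)
        {
            cursor = cursor->children[word[i] - 'a'];
        }
        return cursor != NULL && cursor->is_word;
    }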
01:14:32.130 --> 01:14:33.450 Let's look at the results. 01:14:33.450 --> 01:14:35.400 And guest 200 is still in the lead. 01:14:35.400 --> 01:14:38.310 But guest 752 is now nipping at their heels. 01:14:38.310 --> 01:14:40.500 We have two final questions. 01:14:40.500 --> 01:14:42.900 And speed, again, does matter. 01:14:42.900 --> 01:14:44.980 What is a cookie? 01:14:44.980 --> 01:14:48.730 Data used to identify your computer to websites, a delicious snack, 01:14:48.730 --> 01:14:53.140 both of the above, or none of the above. 01:14:53.140 --> 01:14:54.310 This is a tough one, Brian. 01:14:54.310 --> 01:14:56.950 Especially if there's only one right answer. 01:14:56.950 --> 01:14:58.660 We might see a bit more of a split. 01:14:58.660 --> 01:15:00.230 Which of these is a cookie? 01:15:00.230 --> 01:15:02.590 All right, let's see the results. 01:15:02.590 --> 01:15:05.530 Data used to identify your computer to websites, with 60%. 01:15:05.530 --> 01:15:07.300 Both of the above was 35%. 01:15:07.300 --> 01:15:09.340 Only 2% of you like cookies alone. 01:15:09.340 --> 01:15:10.215 Brian? 01:15:10.215 --> 01:15:12.340 BRIAN YU: Both of the above was the correct answer. 01:15:12.340 --> 01:15:15.465 I'll remind you that all of these questions were written originally by students. 01:15:15.465 --> 01:15:18.490 And the answer choice the students selected as the correct one was 01:15:18.490 --> 01:15:19.452 both of the above. 01:15:19.452 --> 01:15:20.410 DAVID MALAN: All right. 01:15:20.410 --> 01:15:24.400 And now the second to last leaderboard, guest 200 is still in the lead. 01:15:24.400 --> 01:15:26.770 But there's been some variance toward the bottom there. 01:15:26.770 --> 01:15:30.580 Very last question of CS50 itself. 01:15:30.580 --> 01:15:34.330 What's your comfort level now? 01:15:34.330 --> 01:15:39.560 And we'll let you decide among these answers, too. 01:15:39.560 --> 01:15:40.760 All right. 01:15:40.760 --> 01:15:41.720 Answers are all in. 01:15:41.720 --> 01:15:43.520 Let's take a look at the distribution. 01:15:43.520 --> 01:15:47.360 Looks like 43% of you said you're among those more comfortable. 01:15:47.360 --> 01:15:49.610 24% of you went with the second. 01:15:49.610 --> 01:15:52.480 19%, very fascinating distribution from top to bottom. 01:15:52.480 --> 01:15:55.640 But the point is that you are all indeed now officially inducted 01:15:55.640 --> 01:15:57.410 into those more comfortable. 01:15:57.410 --> 01:15:59.300 Thank you so much for joining us in CS50. 01:15:59.300 --> 01:16:01.610 We cannot wait to see your final projects. 01:16:01.610 --> 01:16:03.110 This, then, is the end. 01:16:03.110 --> 01:16:04.490 And this was CS50. 01:16:14.090 --> 01:16:17.440 [MUSIC PLAYING]