1 00:00:00,000 --> 00:00:01,260 DAVID J. MALAN: Hello, world. 2 00:00:01,260 --> 00:00:02,270 This is CS50. 3 00:00:02,270 --> 00:00:04,760 And this is CS50's office hours, an opportunity 4 00:00:04,760 --> 00:00:09,770 to have some questions and answers or Q&A with me, David, and CS50's own Carter 5 00:00:09,770 --> 00:00:10,320 Zenke. 6 00:00:10,320 --> 00:00:13,150 So we are here today on a bit of a special day. 7 00:00:13,150 --> 00:00:14,645 What is today and behind us? 8 00:00:14,645 --> 00:00:16,520 CARTER ZENKE: Today, I think, is commencement. 9 00:00:16,520 --> 00:00:19,423 And we have a lot of schools undergoing activities. 10 00:00:19,423 --> 00:00:21,340 We also have a lot of rain going on right now. 11 00:00:21,340 --> 00:00:21,840 DAVID J. MALAN: Yeah, indeed. 12 00:00:21,840 --> 00:00:24,173 And for those unfamiliar, commencement is Harvard's word 13 00:00:24,173 --> 00:00:26,733 for graduation, when all of the seniors and graduate students 14 00:00:26,733 --> 00:00:28,650 actually finish their studies here at Harvard. 15 00:00:28,650 --> 00:00:32,159 And indeed, it's just started raining and even thundering and lightning, 16 00:00:32,159 --> 00:00:34,460 so they've sent rather a lot of people inside. 17 00:00:34,460 --> 00:00:38,270 For the curious, the way Harvard's graduation has been structured for years 18 00:00:38,270 --> 00:00:39,620 is that there was a morning ceremony. 19 00:00:39,620 --> 00:00:42,230 And the rain held during that, so all was well. 20 00:00:42,230 --> 00:00:45,390 And thousands of people gather in Harvard Yard. 21 00:00:45,390 --> 00:00:47,610 They put up literally thousands of folding chairs. 22 00:00:47,610 --> 00:00:51,170 And all of the dignitaries, and deans, and presidents are up on the stage 23 00:00:51,170 --> 00:00:55,170 presenting students sort of verbally with their degrees. 24 00:00:55,170 --> 00:00:57,980 But then in the afternoon, all of the undergraduates at least 25 00:00:57,980 --> 00:01:01,000 go back to their dormitories or their houses, as they're called here. 26 00:01:01,000 --> 00:01:04,470 And then on a much smaller scale, a few hundred students at a time 27 00:01:04,470 --> 00:01:06,120 actually receive their diplomas. 28 00:01:06,120 --> 00:01:10,140 And each of the graduate schools goes back to its own campus as well. 29 00:01:10,140 --> 00:01:12,637 So where is this that we are right now, Carter? 30 00:01:12,637 --> 00:01:15,720 CARTER ZENKE: Well, we're at the SEC, the Science and Engineering Complex, 31 00:01:15,720 --> 00:01:16,960 which is over in Allston. 32 00:01:16,960 --> 00:01:19,180 Apart from Cambridge, a lot of things are happening right now. 33 00:01:19,180 --> 00:01:21,070 DAVID J. MALAN: Yeah, so this is actually a pretty new building. 34 00:01:21,070 --> 00:01:22,170 It opened during COVID times. 35 00:01:22,170 --> 00:01:25,300 And it's where computer science and a bunch of other folks recently moved. 36 00:01:25,300 --> 00:01:28,735 So through these windows and over by that stadium, 37 00:01:28,735 --> 00:01:31,110 if you can kind of see it in the background and even past 38 00:01:31,110 --> 00:01:32,683 that is Cambridge, Massachusetts. 39 00:01:32,683 --> 00:01:34,600 And that's where Harvard's original campus is. 40 00:01:34,600 --> 00:01:36,350 But it turns out Harvard has a lot of land 41 00:01:36,350 --> 00:01:39,490 here in Allston, which is a neighborhood of Boston, Massachusetts.
42 00:01:39,490 --> 00:01:41,292 And so over the next few decades, Harvard's 43 00:01:41,292 --> 00:01:43,000 probably going to grow all the more here. 44 00:01:43,000 --> 00:01:44,590 And indeed, engineering is growing. 45 00:01:44,590 --> 00:01:46,270 We're actually in a pretty empty space. 46 00:01:46,270 --> 00:01:47,350 We're in this massive building. 47 00:01:47,350 --> 00:01:49,142 But this building, this room that we're in, 48 00:01:49,142 --> 00:01:51,760 is actually all concrete and columns still. 49 00:01:51,760 --> 00:01:54,588 They haven't even finished building out the entire campus here. 50 00:01:54,588 --> 00:01:55,380 CARTER ZENKE: Yeah. 51 00:01:55,380 --> 00:01:57,180 Can I ask you what your commencement was like? 52 00:01:57,180 --> 00:01:58,680 DAVID J. MALAN: My own commencement? 53 00:01:58,680 --> 00:02:02,560 Yeah, so I graduated from the college in 1999. 54 00:02:02,560 --> 00:02:07,200 I don't remember too much of it, but I remember that you wake up very early 55 00:02:07,200 --> 00:02:10,090 at, like, 5:00 AM, 6:00 AM. 56 00:02:10,090 --> 00:02:10,889 And I think we-- 57 00:02:10,889 --> 00:02:13,380 I mean, this is very silly, but I think we had, like, 58 00:02:13,380 --> 00:02:16,620 champagne and strawberries as a special treat, 59 00:02:16,620 --> 00:02:19,650 the downside being we had to wake up at, like, 5:00 or 6:00 AM for this. 60 00:02:19,650 --> 00:02:22,450 And everyone put on their black gowns at the time. 61 00:02:22,450 --> 00:02:24,660 And then I think we probably-- 62 00:02:24,660 --> 00:02:28,560 I think we processed to the church in Harvard Yard, which 63 00:02:28,560 --> 00:02:30,070 is called Memorial Church. 64 00:02:30,070 --> 00:02:33,810 And in there, there was some kind of benediction or speeches 65 00:02:33,810 --> 00:02:36,670 just to get folks thinking about the morning and the ceremony. 66 00:02:36,670 --> 00:02:37,920 And then we went outside. 67 00:02:37,920 --> 00:02:42,180 And on a per-house or a per-dormitory basis, we all sat together in clusters. 68 00:02:42,180 --> 00:02:44,910 And then all of the other students gathered. 69 00:02:44,910 --> 00:02:47,760 I don't really remember anything after that other 70 00:02:47,760 --> 00:02:51,617 than being grumpy most of the day because everyone wanted to take photos. 71 00:02:51,617 --> 00:02:53,200 And this wasn't even with cell phones. 72 00:02:53,200 --> 00:02:54,600 These were old-school cameras. 73 00:02:54,600 --> 00:02:57,310 And it was film, and so you had to get the picture just right, 74 00:02:57,310 --> 00:02:58,980 so everything took forever. 75 00:02:58,980 --> 00:03:00,363 So I'm not really-- 76 00:03:00,363 --> 00:03:02,530 I have not really been back since my own graduation. 77 00:03:02,530 --> 00:03:04,320 It was pretty tiring, honestly. 78 00:03:04,320 --> 00:03:07,528 And you, at least, kind of lucked out in that you had a virtual commencement. 79 00:03:07,528 --> 00:03:08,320 CARTER ZENKE: Yeah. 80 00:03:08,320 --> 00:03:10,170 Mine was unique, for better or for worse, 81 00:03:10,170 --> 00:03:12,030 where it was a virtual commencement. 82 00:03:12,030 --> 00:03:15,370 And I remember we had pre-recorded these hello videos that 83 00:03:15,370 --> 00:03:17,730 were going to play on the slideshow. 84 00:03:17,730 --> 00:03:20,220 And I had filmed mine on this DSLR. 85 00:03:20,220 --> 00:03:21,850 And those [INAUDIBLE] on their phones.
86 00:03:21,850 --> 00:03:23,500 I think they cropped my face in a weird way, 87 00:03:23,500 --> 00:03:25,510 so it's like my face was like this [INAUDIBLE]. 88 00:03:25,510 --> 00:03:27,700 DAVID J. MALAN: And yours should look the best with that kind of camera. 89 00:03:27,700 --> 00:03:28,890 CARTER ZENKE: Yeah, but it was really funny. 90 00:03:28,890 --> 00:03:30,040 I got a big laugh out of it. 91 00:03:30,040 --> 00:03:31,080 It was really kind of fun. 92 00:03:31,080 --> 00:03:33,190 DAVID J. MALAN: Did you say, hello, world in your video? 93 00:03:33,190 --> 00:03:34,540 CARTER ZENKE: I don't know if I did, but I should have. 94 00:03:34,540 --> 00:03:35,610 That would have been a good one. 95 00:03:35,610 --> 00:03:38,500 DAVID J. MALAN: Well, this is actually Carter's last office hours with us, 96 00:03:38,500 --> 00:03:39,000 sadly. 97 00:03:39,000 --> 00:03:41,080 He's moving on to the real world. 98 00:03:41,080 --> 00:03:45,537 And Carter, it's been so wonderful having you involved in CS50 full time. 99 00:03:45,537 --> 00:03:47,370 And rest assured, Carter's still going to be 100 00:03:47,370 --> 00:03:50,940 heading up our SQL class and our new and 101 00:03:50,940 --> 00:03:54,790 forthcoming course on R programming as well, so he's not really going away. 102 00:03:54,790 --> 00:03:56,340 He's just going elsewhere. 103 00:03:56,340 --> 00:03:58,208 What do you have in mind for the real world? 104 00:03:58,208 --> 00:04:01,500 CARTER ZENKE: So I'm joining a startup that focuses on computer science curriculum. 105 00:04:01,500 --> 00:04:03,370 And they're focused more on K-12, so I'm excited 106 00:04:03,370 --> 00:04:05,130 to see what it looks [INAUDIBLE] excited about computer science 107 00:04:05,130 --> 00:04:06,420 even earlier in their lives too. 108 00:04:06,420 --> 00:04:07,350 DAVID J. MALAN: Well, wonderful. 109 00:04:07,350 --> 00:04:08,933 Well, we definitely wish you the best. 110 00:04:08,933 --> 00:04:13,360 And much like Brian and Colton before him, he still will be in CS50's ecosystem, 111 00:04:13,360 --> 00:04:15,700 so you'll continue, I'm sure, to see Carter online. 112 00:04:15,700 --> 00:04:16,450 CARTER ZENKE: Most definitely. 113 00:04:16,450 --> 00:04:18,970 DAVID J. MALAN: In the meantime, I think we have a whole palette of questions. 114 00:04:18,970 --> 00:04:21,750 And in fact, if you're tuning in live to the office hours 115 00:04:21,750 --> 00:04:24,690 and you would like to, one, say hello to the world yourself, 116 00:04:24,690 --> 00:04:27,210 and perhaps say what country or city you're from in the world, 117 00:04:27,210 --> 00:04:29,190 please feel free to say hi to classmates. 118 00:04:29,190 --> 00:04:31,066 And two, if you have additional questions, 119 00:04:31,066 --> 00:04:33,160 I'd love to see what's on folks' minds. 120 00:04:33,160 --> 00:04:35,335 Office hours here on campus, more traditionally, 121 00:04:35,335 --> 00:04:37,710 are meant to be an opportunity for students and professors 122 00:04:37,710 --> 00:04:39,900 or students and teaching assistants, or TAs, 123 00:04:39,900 --> 00:04:41,792 just to talk about course material. 124 00:04:41,792 --> 00:04:44,250 And while it's probably not the best use of all of our time 125 00:04:44,250 --> 00:04:47,040 to talk about bugs you're specifically having in problem sets, 126 00:04:47,040 --> 00:04:51,120 maybe more broadly about CS50, computer science, the real world. 127 00:04:51,120 --> 00:04:53,650 Happy to take any of those questions as well.
128 00:04:53,650 --> 00:04:56,980 And you also kindly submitted a bunch of questions in advance via comments, 129 00:04:56,980 --> 00:04:58,548 so we'll pluck off some of those too. 130 00:04:58,548 --> 00:04:59,340 CARTER ZENKE: Yeah. 131 00:04:59,340 --> 00:05:00,818 [INAUDIBLE] kind of apt for today. 132 00:05:00,818 --> 00:05:03,360 People are graduating with their degrees in computer science, 133 00:05:03,360 --> 00:05:06,360 for instance, so it's a reflection on undergraduate education in computer 134 00:05:06,360 --> 00:05:07,043 science. 135 00:05:07,043 --> 00:05:09,960 Somebody was asking, there are a lot of disciplines in computer science, 136 00:05:09,960 --> 00:05:12,640 like machine learning, AI systems, and so on. 137 00:05:12,640 --> 00:05:14,610 To what extent do you think the degree kind 138 00:05:14,610 --> 00:05:17,380 of has too much in it right now and should be split apart? 139 00:05:17,380 --> 00:05:20,120 Or [INAUDIBLE] it's actually pretty good as it is right now? 140 00:05:20,120 --> 00:05:20,790 DAVID J. MALAN: That's a good question. 141 00:05:20,790 --> 00:05:22,140 And this has actually come up a lot lately 142 00:05:22,140 --> 00:05:24,848 in the context of artificial intelligence, which actually came up 143 00:05:24,848 --> 00:05:25,900 in that other question. 144 00:05:25,900 --> 00:05:30,863 And should AI be its own major or concentration, as Harvard calls it here? 145 00:05:30,863 --> 00:05:32,030 And I don't really think so. 146 00:05:32,030 --> 00:05:33,630 I mean, this has already happened to some extent. 147 00:05:33,630 --> 00:05:35,400 Depending on the university that you're at 148 00:05:35,400 --> 00:05:38,920 or the community college that you're at, it might be called computer science. 149 00:05:38,920 --> 00:05:41,650 It might be computer engineering, electrical engineering. 150 00:05:41,650 --> 00:05:43,380 There's physics, more proper. 151 00:05:43,380 --> 00:05:46,718 And there are certainly other areas that are probably offshoots 152 00:05:46,718 --> 00:05:48,010 of computer science. 153 00:05:48,010 --> 00:05:49,240 I'm not sure it really matters, though. 154 00:05:49,240 --> 00:05:51,060 Like, at the end of the day, what matters is what you study 155 00:05:51,060 --> 00:05:52,720 and not what you call it. 156 00:05:52,720 --> 00:05:55,200 And I think, frankly, as a catch-all, calling 157 00:05:55,200 --> 00:05:57,360 the field and the subfields computer science 158 00:05:57,360 --> 00:05:59,690 is a pretty strong signal, anyway. 159 00:05:59,690 --> 00:06:02,860 I think if we start to make things too niche by calling it 160 00:06:02,860 --> 00:06:04,960 a specialization in artificial intelligence, 161 00:06:04,960 --> 00:06:07,840 then it kind of does a disservice to all of the fundamentals 162 00:06:07,840 --> 00:06:11,235 that you would acquire on your way toward learning more about AI 163 00:06:11,235 --> 00:06:14,110 because underlying AI, of course, are algorithms, and data structures, 164 00:06:14,110 --> 00:06:18,280 and so many of the computational ideas that would be in CS itself. 165 00:06:18,280 --> 00:06:19,697 So I'm not sure it really matters. 166 00:06:19,697 --> 00:06:20,488 CARTER ZENKE: Yeah. 167 00:06:20,488 --> 00:06:21,290 I agree with that. 168 00:06:21,290 --> 00:06:24,852 And I think some of that work is done by the undergraduate degree, the master's 169 00:06:24,852 --> 00:06:28,060 degree, the PhD degree because they give you more specialization as you go on 170 00:06:28,060 --> 00:06:28,940 in school, right?
171 00:06:28,940 --> 00:06:30,100 DAVID J. MALAN: I think that's a good answer. 172 00:06:30,100 --> 00:06:30,892 CARTER ZENKE: Yeah. 173 00:06:30,892 --> 00:06:31,510 Makes sense. 174 00:06:31,510 --> 00:06:32,650 All right. 175 00:06:32,650 --> 00:06:36,100 One we had here too is, what's the biggest mistake you see 176 00:06:36,100 --> 00:06:38,278 people make when they start out in CS? 177 00:06:38,278 --> 00:06:40,070 DAVID J. MALAN: What's the biggest mistake? 178 00:06:40,070 --> 00:06:43,090 Honestly, it's not allocating enough time 179 00:06:43,090 --> 00:06:48,590 or thinking that it will go more quickly or more easily than it actually will. 180 00:06:48,590 --> 00:06:51,400 And sadly, among our undergraduate population 181 00:06:51,400 --> 00:06:53,350 here, no matter how many times I feel we say 182 00:06:53,350 --> 00:06:56,530 this at the start of the semester, no matter how many times we remind them 183 00:06:56,530 --> 00:06:58,930 of this during the semester, there's still frequently, 184 00:06:58,930 --> 00:07:02,650 I think, this frustration, this disappointment with oneself 185 00:07:02,650 --> 00:07:06,740 that they just don't feel good at it, or it's taking too long, or it's too hard. 186 00:07:06,740 --> 00:07:10,840 And at least for me, personally, when it comes to programming in particular, 187 00:07:10,840 --> 00:07:14,540 if I can't clear out a big block of time, I just don't enjoy it. 188 00:07:14,540 --> 00:07:17,770 I mean, I would hate the idea of trying to do one of our own problem sets 189 00:07:17,770 --> 00:07:21,740 under pressure, under a deadline, because it takes the fun, 190 00:07:21,740 --> 00:07:23,830 the delight out of actually doing it. 191 00:07:23,830 --> 00:07:25,490 And it turns it into more of a slog. 192 00:07:25,490 --> 00:07:29,380 So I think the best thing people can do in general and the best thing 193 00:07:29,380 --> 00:07:32,590 to do to avoid that mistake is just clear your schedule, 194 00:07:32,590 --> 00:07:35,710 or start early, or do whatever it is so that you're not bumping up 195 00:07:35,710 --> 00:07:36,890 against a deadline. 196 00:07:36,890 --> 00:07:37,060 CARTER ZENKE: Yeah. 197 00:07:37,060 --> 00:07:37,722 I agree. 198 00:07:37,722 --> 00:07:40,180 And to your point about being difficult and time-consuming, 199 00:07:40,180 --> 00:07:44,290 I think one mistake I see people making frequently is going at it alone, saying, 200 00:07:44,290 --> 00:07:46,390 this is a thing that I need to learn by myself. 201 00:07:46,390 --> 00:07:48,098 And one thing I value is CS50's community 202 00:07:48,098 --> 00:07:49,932 that we've built up over the past few years, 203 00:07:49,932 --> 00:07:52,360 like many years of people who help each other out online 204 00:07:52,360 --> 00:07:54,320 and ask questions and respond to them. 205 00:07:54,320 --> 00:07:55,300 So that community is really, I think, an important part 206 00:07:55,300 --> 00:07:57,050 of learning computer science. 207 00:07:57,050 --> 00:07:57,925 DAVID J. MALAN: Yeah. 208 00:07:57,925 --> 00:07:58,570 No, I do agree. 209 00:07:58,570 --> 00:07:58,870 It's funny. 210 00:07:58,870 --> 00:08:01,680 What also comes to mind here too, because not everyone is like me, 211 00:08:01,680 --> 00:08:02,180 certainly. 212 00:08:02,180 --> 00:08:05,140 And I'm reminded of CS50's own Tommy MacWilliam, one of our former head 213 00:08:05,140 --> 00:08:06,110 teaching fellows. 214 00:08:06,110 --> 00:08:10,150 Some of you might remember him from some of the videos we've filmed online.
215 00:08:10,150 --> 00:08:12,790 And Tommy had this incredible knack for being 216 00:08:12,790 --> 00:08:15,373 able to make optimal use of his time, so much 217 00:08:15,373 --> 00:08:17,290 so that when we were traveling to a conference 218 00:08:17,290 --> 00:08:22,570 once, sitting in the boarding area of an airport waiting for an airplane, 219 00:08:22,570 --> 00:08:26,020 he was able to sit on top of his suitcase, take out his laptop, 220 00:08:26,020 --> 00:08:28,310 and write code for 15 minutes. 221 00:08:28,310 --> 00:08:31,070 And he sort of got in and got out of that mental headspace. 222 00:08:31,070 --> 00:08:33,760 And honestly, I just am not capable of doing 223 00:08:33,760 --> 00:08:36,850 that, unless it's something silly like fixing a typographical error 224 00:08:36,850 --> 00:08:38,919 or changing some aesthetics on a web page. 225 00:08:38,919 --> 00:08:42,309 I need to clear my morning and not have some deadline I'm 226 00:08:42,309 --> 00:08:45,988 bumping up against because otherwise, I just can't get into that zone. 227 00:08:45,988 --> 00:08:46,780 CARTER ZENKE: Yeah. 228 00:08:46,780 --> 00:08:49,180 When I was first starting in undergrad, this idea 229 00:08:49,180 --> 00:08:51,830 of deep work kind of resonated with me, which is that you 230 00:08:51,830 --> 00:08:55,510 have to set aside at least an hour to do something deeply and actually 231 00:08:55,510 --> 00:08:56,995 get into the right mind space. 232 00:08:56,995 --> 00:08:59,120 DAVID J. MALAN: There's a word for this, deep work. 233 00:08:59,120 --> 00:08:59,290 See? 234 00:08:59,290 --> 00:09:01,240 Carter actually went to Education School. 235 00:09:01,240 --> 00:09:02,040 CARTER ZENKE: Yeah. 236 00:09:02,040 --> 00:09:04,500 So I've tried to structure my days to have 237 00:09:04,500 --> 00:09:07,635 at least a little bit of time for deep work, but it doesn't always happen, 238 00:09:07,635 --> 00:09:08,135 right? 239 00:09:08,135 --> 00:09:09,000 [INAUDIBLE] 240 00:09:09,000 --> 00:09:10,920 DAVID J. MALAN: Well, the flip side is, as you probably know, I mean, 241 00:09:10,920 --> 00:09:12,900 there are certain software projects that I, myself, 242 00:09:12,900 --> 00:09:14,550 have been working on because I want to. 243 00:09:14,550 --> 00:09:17,220 And they just kind of linger, and linger, and linger 244 00:09:17,220 --> 00:09:18,970 because something always comes up. 245 00:09:18,970 --> 00:09:23,580 And if it interferes with my ideal work scenario, I think to a fault, 246 00:09:23,580 --> 00:09:24,640 I postpone it. 247 00:09:24,640 --> 00:09:24,810 CARTER ZENKE: Yeah. 248 00:09:24,810 --> 00:09:25,440 That's fair. 249 00:09:25,440 --> 00:09:27,090 DAVID J. MALAN: Well, I think we have a bunch of other questions 250 00:09:27,090 --> 00:09:28,110 that have come in. 251 00:09:28,110 --> 00:09:32,280 For instance, next up-- let's see. 252 00:09:32,280 --> 00:09:34,390 Why don't we go ahead and pluck off data science? 253 00:09:34,390 --> 00:09:36,890 So one of our students online has asked, any tips for people 254 00:09:36,890 --> 00:09:38,740 who want to pursue data science? 255 00:09:38,740 --> 00:09:41,280 And I ask you because, of course, of your interest in SQL 256 00:09:41,280 --> 00:09:43,300 and in R, which are very germane to that world. 257 00:09:43,300 --> 00:09:43,740 CARTER ZENKE: Yeah.
258 00:09:43,740 --> 00:09:45,780 So I see data science as being a combination 259 00:09:45,780 --> 00:09:49,080 of different disciplines, one being math and statistics 260 00:09:49,080 --> 00:09:50,830 and the other being computer science. 261 00:09:50,830 --> 00:09:54,120 And so I think you could really think about entering both those disciplines 262 00:09:54,120 --> 00:09:57,642 at once and picking out which pathways you want to take within both of those. 263 00:09:57,642 --> 00:10:00,100 So for instance, if you want to learn computer science, how 264 00:10:00,100 --> 00:10:03,040 it relates to data science, you could take a course like our own Introduction 265 00:10:03,040 --> 00:10:06,302 to Programming with R, which gives you this tool set in this language called R 266 00:10:06,302 --> 00:10:08,260 to learn how to do the kinds of programming you 267 00:10:08,260 --> 00:10:10,125 might do in data science. 268 00:10:10,125 --> 00:10:13,000 But you'll probably still want to learn about the math and statistics 269 00:10:13,000 --> 00:10:15,700 behind that field, so take courses in math and statistics. 270 00:10:15,700 --> 00:10:17,950 And learn what you can to then implement it 271 00:10:17,950 --> 00:10:20,110 in code, which you might learn in R, for instance. 272 00:10:20,110 --> 00:10:23,068 DAVID J. MALAN: And this CS50 course on R I keep hearing so much about, 273 00:10:23,068 --> 00:10:24,195 when does it launch? 274 00:10:24,195 --> 00:10:25,070 CARTER ZENKE: July 1. 275 00:10:25,070 --> 00:10:25,945 DAVID J. MALAN: Nice. 276 00:10:25,945 --> 00:10:30,490 So you can go to actually cs50.edx.org/r in order to tune in to that. 277 00:10:30,490 --> 00:10:34,330 Well, we also had a question about AI and ChatGPT, 278 00:10:34,330 --> 00:10:35,560 which is germane to startups. 279 00:10:35,560 --> 00:10:38,200 And someone asks, in the near future, like, two to three years, 280 00:10:38,200 --> 00:10:43,807 can we use AI, like ChatGPT, as a co-founder of an actual startup? 281 00:10:43,807 --> 00:10:44,890 CARTER ZENKE: Interesting. 282 00:10:44,890 --> 00:10:48,820 I think I've actually seen maybe a LinkedIn post or two about somebody 283 00:10:48,820 --> 00:10:54,130 who's tried to use ChatGPT to generate a startup idea and then implement it. 284 00:10:54,130 --> 00:10:56,590 And they'll go back to ChatGPT and say, here are the results. 285 00:10:56,590 --> 00:10:57,860 What do you think we should do? 286 00:10:57,860 --> 00:10:59,590 And it actually gives pretty good advice about what 287 00:10:59,590 --> 00:11:01,132 to do when you're building a startup. 288 00:11:01,132 --> 00:11:03,832 At least, that's what this one person found in one instance. 289 00:11:03,832 --> 00:11:06,040 I don't know if that's a reliable way to go about it, 290 00:11:06,040 --> 00:11:07,745 but an interesting example, I think. 291 00:11:07,745 --> 00:11:08,620 DAVID J. MALAN: Yeah. 292 00:11:08,620 --> 00:11:12,100 I do think if you have a friend, a colleague, an acquaintance who 293 00:11:12,100 --> 00:11:14,170 has some skill set that you, yourself, don't have 294 00:11:14,170 --> 00:11:17,253 and there's an opportunity to actually collaborate with a human, at least, 295 00:11:17,253 --> 00:11:19,220 for now, that's probably the better bet. 296 00:11:19,220 --> 00:11:21,290 But who knows down the road? 297 00:11:21,290 --> 00:11:22,810 CARTER ZENKE: Yeah. 298 00:11:22,810 --> 00:11:26,440 And I think to add on to that, a new model like GPT-- 299 00:11:26,440 --> 00:11:28,450 what is it, 4o, for instance?
300 00:11:28,450 --> 00:11:30,910 Yeah, that can do a whole lot of-- you can just talk to it, 301 00:11:30,910 --> 00:11:32,938 and you can get back a verbal response. 302 00:11:32,938 --> 00:11:33,980 I think it's really cool. 303 00:11:33,980 --> 00:11:34,855 DAVID J. MALAN: Yeah. 304 00:11:34,855 --> 00:11:35,890 Makes sense. 305 00:11:35,890 --> 00:11:38,715 Well, the pace at which students should be taking CS50, 306 00:11:38,715 --> 00:11:40,840 this is a question that's come up before because we 307 00:11:40,840 --> 00:11:42,530 described the class as self-paced. 308 00:11:42,530 --> 00:11:43,930 But what does that mean? 309 00:11:43,930 --> 00:11:47,800 And what would your advice be when it comes to actually successfully starting 310 00:11:47,800 --> 00:11:49,070 and finishing the class? 311 00:11:49,070 --> 00:11:49,480 CARTER ZENKE: Yeah. 312 00:11:49,480 --> 00:11:51,370 Well, I think to go back to what you were saying earlier about this, 313 00:11:51,370 --> 00:11:52,750 just taking time to learn. 314 00:11:52,750 --> 00:11:55,660 When we say self-paced, I would say move on when 315 00:11:55,660 --> 00:11:58,570 you think you've mastered the concepts for that particular week. 316 00:11:58,570 --> 00:12:02,350 So when you feel like you have solved the problem set in the best design 317 00:12:02,350 --> 00:12:05,950 you can, maybe that's your cue to go on and try something new. 318 00:12:05,950 --> 00:12:09,837 Here on campus, of course, it's, like, one week of content per one actual week 319 00:12:09,837 --> 00:12:12,670 in person here, but that doesn't have to be the case when you go off 320 00:12:12,670 --> 00:12:13,930 and do it on your own either. 321 00:12:13,930 --> 00:12:14,805 DAVID J. MALAN: Yeah. 322 00:12:14,805 --> 00:12:15,640 That makes sense. 323 00:12:15,640 --> 00:12:19,930 And when it comes to building one's final project-- 324 00:12:19,930 --> 00:12:23,050 this is a very open-ended question because it's a very open-ended project. 325 00:12:23,050 --> 00:12:26,080 Do you have any tips for folks on how to do that well? 326 00:12:26,080 --> 00:12:28,120 CARTER ZENKE: I'm curious what you think too. 327 00:12:28,120 --> 00:12:30,520 But one thing I think I've-- 328 00:12:30,520 --> 00:12:31,550 maybe two things. 329 00:12:31,550 --> 00:12:35,030 One, again, carving out time to work on it can be really helpful, 330 00:12:35,030 --> 00:12:38,020 like, just take a full hour to say, this is my final project time. 331 00:12:38,020 --> 00:12:41,800 And maybe two, I think, just picking out things that you find 332 00:12:41,800 --> 00:12:43,513 are problems in your own life. 333 00:12:43,513 --> 00:12:46,180 Things you could automate, things you could use computer science 334 00:12:46,180 --> 00:12:48,370 to help solve, that can give you some motivation 335 00:12:48,370 --> 00:12:51,620 to actually spend the time on it because it's solving a real problem that you, 336 00:12:51,620 --> 00:12:52,657 yourself, had, so yeah. 337 00:12:52,657 --> 00:12:54,490 DAVID J. MALAN: And if we circle back to AI, 338 00:12:54,490 --> 00:12:56,470 definitely a frequently asked question nowadays 339 00:12:56,470 --> 00:12:58,720 is one that was explicitly asked here, which was, 340 00:12:58,720 --> 00:13:01,240 what should we be focusing on to avoid getting 341 00:13:01,240 --> 00:13:03,410 replaced by artificial intelligence? 342 00:13:03,410 --> 00:13:06,010 I heard that cybersecurity, says this student, 343 00:13:06,010 --> 00:13:08,647 is a good sector that's relatively safe from AI.
344 00:13:08,647 --> 00:13:09,980 CARTER ZENKE: Yeah, potentially. 345 00:13:09,980 --> 00:13:14,030 I got to listen in to Sam Altman's talk here at Harvard not too long ago. 346 00:13:14,030 --> 00:13:17,050 And one thing that Sam was talking about is 347 00:13:17,050 --> 00:13:20,710 we tend to think early-- underestimate how much better AI is 348 00:13:20,710 --> 00:13:22,370 going to get, at least, in his mind. 349 00:13:22,370 --> 00:13:25,490 And I think that is kind of a good point to take with you. 350 00:13:25,490 --> 00:13:28,810 And I wonder if maybe the solution or one step in the right direction 351 00:13:28,810 --> 00:13:34,750 is to try to learn all you can about AI, how it works, how you can use it to be 352 00:13:34,750 --> 00:13:38,007 the person who's using the AI and not being replaced by it, 353 00:13:38,007 --> 00:13:38,840 if that makes sense. 354 00:13:38,840 --> 00:13:39,715 DAVID J. MALAN: Nice. 355 00:13:39,715 --> 00:13:41,780 And let me ask our friend Max here. 356 00:13:41,780 --> 00:13:44,480 Max, we're seeing reports of folks having trouble hearing audio. 357 00:13:44,480 --> 00:13:45,600 Is that, in fact, fixed? 358 00:13:45,600 --> 00:13:46,100 All right. 359 00:13:46,100 --> 00:13:47,440 So hopefully, you can now hear us again. 360 00:13:47,440 --> 00:13:49,898 And we'll fix the audio you might have missed later on when 361 00:13:49,898 --> 00:13:51,310 we post the video after the fact. 362 00:13:51,310 --> 00:13:51,810 All right. 363 00:13:51,810 --> 00:13:54,460 So Carter, what questions do you have, as you like to say? 364 00:13:54,460 --> 00:13:55,460 CARTER ZENKE: Let's see. 365 00:13:55,460 --> 00:14:00,220 So ones-- I'm trying to figure [INAUDIBLE] our chat here. 366 00:14:00,220 --> 00:14:02,418 Any-- actually, let's go based on what we had from-- 367 00:14:02,418 --> 00:14:03,210 DAVID J. MALAN: Oh. 368 00:14:03,210 --> 00:14:06,192 Carter, actually, as someone notes in the chat, R is fairly easy. 369 00:14:06,192 --> 00:14:07,650 CARTER ZENKE: Oh, R is fairly easy. 370 00:14:07,650 --> 00:14:08,640 DAVID J. MALAN: If anyone is wondering, so why would 371 00:14:08,640 --> 00:14:10,530 you teach R if it's so easy, Carter? 372 00:14:10,530 --> 00:14:13,738 CARTER ZENKE: Well, there are parts of R that are certainly a little bit less 373 00:14:13,738 --> 00:14:15,720 easy and I think are helpful to do in a more 374 00:14:15,720 --> 00:14:17,880 structured way with a community, where the group is 375 00:14:17,880 --> 00:14:19,680 helping you learn those things. 376 00:14:19,680 --> 00:14:23,190 But actually, R is easy if you want to do certain analyses. 377 00:14:23,190 --> 00:14:27,642 Like, it is easy to do basic statistics with it, easy to do-- 378 00:14:27,642 --> 00:14:30,600 representing data with it, which is kind of by design for this language 379 00:14:30,600 --> 00:14:33,910 called R. People made it to be able to work easily with data. 380 00:14:33,910 --> 00:14:37,020 So if you want to learn how to do that easily, you should learn R. 381 00:14:37,020 --> 00:14:37,260 DAVID J. MALAN: Yeah.
382 00:14:37,260 --> 00:14:38,460 But I think there's something to be said, 383 00:14:38,460 --> 00:14:40,627 as folks will see when they actually take the class, 384 00:14:40,627 --> 00:14:44,100 if they do, online, that understanding some of the primitives 385 00:14:44,100 --> 00:14:46,762 and the design decisions underlying R is probably 386 00:14:46,762 --> 00:14:48,720 helpful for just wrapping your mind around what 387 00:14:48,720 --> 00:14:52,270 it should be able to do, how it does it, what the data structures are in memory 388 00:14:52,270 --> 00:14:55,013 so that it's not a complete abstraction. 389 00:14:55,013 --> 00:14:55,930 CARTER ZENKE: I agree. 390 00:14:55,930 --> 00:14:57,580 DAVID J. MALAN: Yeah. 391 00:14:57,580 --> 00:15:02,170 So in the chat here, someone notes that recently the founder of NVIDIA 392 00:15:02,170 --> 00:15:04,630 said that AI will make learning to code irrelevant 393 00:15:04,630 --> 00:15:06,910 and kids should focus elsewhere. 394 00:15:06,910 --> 00:15:08,383 Your thoughts? 395 00:15:08,383 --> 00:15:09,800 CARTER ZENKE: An interesting take. 396 00:15:09,800 --> 00:15:13,310 I've seen it repeated around various platforms. 397 00:15:13,310 --> 00:15:15,190 I think it remains to be seen. 398 00:15:15,190 --> 00:15:18,370 I do like the idea of maybe eventually, we 399 00:15:18,370 --> 00:15:21,255 move away from writing a lower-level language like C 400 00:15:21,255 --> 00:15:24,130 and more toward thinking in terms of a higher-level, English-like language 401 00:15:24,130 --> 00:15:26,410 to do the work of programming. 402 00:15:26,410 --> 00:15:29,420 But I also think it's still useful to learn these underlying primitives, 403 00:15:29,420 --> 00:15:32,140 underlying ideas because it helps you understand why 404 00:15:32,140 --> 00:15:34,490 things are working the way they work. 405 00:15:34,490 --> 00:15:34,990 Yeah. 406 00:15:34,990 --> 00:15:36,230 It's kind of where I would stand on that. 407 00:15:36,230 --> 00:15:37,160 I don't know what you think, too. 408 00:15:37,160 --> 00:15:38,380 DAVID J. MALAN: I mean, I would note that NVIDIA 409 00:15:38,380 --> 00:15:41,157 is in the business of selling hardware to facilitate AI, 410 00:15:41,157 --> 00:15:42,740 so I'm not sure how objective that is. 411 00:15:42,740 --> 00:15:45,670 And history is littered with examples of famous people 412 00:15:45,670 --> 00:15:48,580 making bold claims, all the way back to Bill Gates and technology 413 00:15:48,580 --> 00:15:52,000 and the like, that are then refuted or contradicted just some years later. 414 00:15:52,000 --> 00:15:53,920 I do think the reality is that we hopefully 415 00:15:53,920 --> 00:15:56,837 won't be writing code in exactly the same way because frankly, there's 416 00:15:56,837 --> 00:15:57,710 a lot of tedium to it. 417 00:15:57,710 --> 00:16:00,260 It's annoying to have to look things up or to have 418 00:16:00,260 --> 00:16:02,180 to try to remember, oh, what is that function, 419 00:16:02,180 --> 00:16:03,930 or what's the signature of that function? 420 00:16:03,930 --> 00:16:05,450 And then there's a lot of plumbing that needs 421 00:16:05,450 --> 00:16:07,770 to be done, unit tests that need to be written, and so forth. 422 00:16:07,770 --> 00:16:09,950 So as much as even I like programming, there's 423 00:16:09,950 --> 00:16:13,760 just so much baggage that comes with it, that takes away from the germ of an idea 424 00:16:13,760 --> 00:16:15,020 that you're trying to solve.
425 00:16:15,020 --> 00:16:19,050 So I certainly hope that AI changes what programming is. 426 00:16:19,050 --> 00:16:21,590 And frankly, in an ideal world, I, and you, 427 00:16:21,590 --> 00:16:24,860 and hopefully, others would often be programming, in some sense, 428 00:16:24,860 --> 00:16:28,490 by just using our voice and getting natural language processing, 429 00:16:28,490 --> 00:16:32,930 or NLP, to actually be where you would have long hoped it would be, 430 00:16:32,930 --> 00:16:36,270 even though the voice assistants of today are still only just so-so. 431 00:16:36,270 --> 00:16:40,310 But the reality is computer science, certainly as we present it in CS50, 432 00:16:40,310 --> 00:16:43,130 has never been about programming in C or in Python, 433 00:16:43,130 --> 00:16:46,560 even though as of 2024, those are two of the languages we use. 434 00:16:46,560 --> 00:16:48,082 It really is about problem solving. 435 00:16:48,082 --> 00:16:51,290 And we say that in, like, week zero of the class because what you're learning 436 00:16:51,290 --> 00:16:53,623 is how to think more methodically, more algorithmically, 437 00:16:53,623 --> 00:16:55,980 how to be and act as a smarter person. 438 00:16:55,980 --> 00:16:59,030 And so sure, if all you're doing is picking up 439 00:16:59,030 --> 00:17:02,270 a book on programming in Python, maybe that's not the best use of time. 440 00:17:02,270 --> 00:17:06,180 But that's just an implementation detail when it comes to studying a field. 441 00:17:06,180 --> 00:17:10,940 And so I don't think AI is going to replace the value of being an educated, 442 00:17:10,940 --> 00:17:12,402 smart person anytime soon. 443 00:17:12,402 --> 00:17:13,319 CARTER ZENKE: I agree. 444 00:17:13,319 --> 00:17:15,500 And a related question here, I think, is from somebody 445 00:17:15,500 --> 00:17:17,390 who might be a junior in the industry; they 446 00:17:17,390 --> 00:17:19,020 feel like they're being replaced by AI. 447 00:17:19,020 --> 00:17:20,603 Their skills are being replaced by AI. 448 00:17:20,603 --> 00:17:24,440 What would you recommend for them getting into an industry like software 449 00:17:24,440 --> 00:17:25,875 development or working in AI? 450 00:17:25,875 --> 00:17:26,750 DAVID J. MALAN: Yeah. 451 00:17:26,750 --> 00:17:28,190 I mean, I would first disclaim that I think 452 00:17:28,190 --> 00:17:30,290 no one knows exactly what's going to happen 453 00:17:30,290 --> 00:17:32,090 or how quickly it's going to happen. 454 00:17:32,090 --> 00:17:36,950 But I think it's only to one's benefit to move and learn quickly, 455 00:17:36,950 --> 00:17:40,290 to be willing to and determined to adapt. 456 00:17:40,290 --> 00:17:43,460 I think if you find yourself doing a lot of the same thing 457 00:17:43,460 --> 00:17:45,440 again and again, even if you really like it, 458 00:17:45,440 --> 00:17:48,830 like making lots of HTML, lots of CSS, or just 459 00:17:48,830 --> 00:17:51,860 writing tests for a company, anything wherein you're 460 00:17:51,860 --> 00:17:55,280 doing a lot of repetition, that's kind of ripe for disruption, 461 00:17:55,280 --> 00:17:59,570 so to speak, by AI because it can probably be pretty readily automated. 462 00:17:59,570 --> 00:18:01,680 We've seen examples of this before.
463 00:18:01,680 --> 00:18:06,170 I've mentioned before that I think about this question in the context of, indeed, 464 00:18:06,170 --> 00:18:10,580 HTML and how, back in the day, I used to write HTML by hand, line by line, 465 00:18:10,580 --> 00:18:11,567 character by character. 466 00:18:11,567 --> 00:18:13,400 And then tools like Dreamweaver came around, 467 00:18:13,400 --> 00:18:16,530 downloadable software where you click, click, click, and it generates the HTML. 468 00:18:16,530 --> 00:18:18,800 Nowadays, we have things like Wix and Squarespace, 469 00:18:18,800 --> 00:18:20,310 where you don't even need to download anything. 470 00:18:20,310 --> 00:18:21,143 It's in the browser. 471 00:18:21,143 --> 00:18:23,610 And soon enough, it'll be even more abstracted than that. 472 00:18:23,610 --> 00:18:27,920 But once you've done that once, twice, 10 times, 20 times, 473 00:18:27,920 --> 00:18:30,770 like for a lot of us, the fun starts to dissipate, 474 00:18:30,770 --> 00:18:32,490 and it just gets a little monotonous. 475 00:18:32,490 --> 00:18:34,910 So I think if you find yourself doing anything monotonous, 476 00:18:34,910 --> 00:18:37,970 even admittedly, if you like it quite a bit, 477 00:18:37,970 --> 00:18:40,588 just be willing to continue to adapt and evolve 478 00:18:40,588 --> 00:18:42,380 because I think if you don't, then you will 479 00:18:42,380 --> 00:18:44,760 be left behind as technology advances. 480 00:18:44,760 --> 00:18:46,820 But otherwise, if you're keeping up as a human, 481 00:18:46,820 --> 00:18:49,650 there's only going to be more exciting stuff on the horizon. 482 00:18:49,650 --> 00:18:51,800 CARTER ZENKE: Yeah, I think learning the right skills, like you've mentioned, 483 00:18:51,800 --> 00:18:52,920 is really a good point. 484 00:18:52,920 --> 00:18:56,900 And I think just getting into the industry itself, like finding a job, 485 00:18:56,900 --> 00:19:00,413 just trusting the process of applying, trying again, trying again. 486 00:19:00,413 --> 00:19:03,330 I think if you're doing the right things, it will eventually work out. 487 00:19:03,330 --> 00:19:06,122 And so you should just keep giving it that try. 488 00:19:06,122 --> 00:19:08,910 If it takes six months, a year, longer, it is what it is. 489 00:19:08,910 --> 00:19:09,930 It's going to work out at some point. 490 00:19:09,930 --> 00:19:10,805 DAVID J. MALAN: Yeah. 491 00:19:10,805 --> 00:19:14,360 I think a term in English is upskilling, which sort of refers to just upgrading, 492 00:19:14,360 --> 00:19:16,020 enhancing your own human skills. 493 00:19:16,020 --> 00:19:18,560 And I mean, case in point, I only ever formally studied 494 00:19:18,560 --> 00:19:20,810 things like C, a little bit of C++, when 495 00:19:20,810 --> 00:19:22,440 it comes to programming in college. 496 00:19:22,440 --> 00:19:24,740 And then after that, I taught myself SQL. 497 00:19:24,740 --> 00:19:26,540 And I taught myself PHP. 498 00:19:26,540 --> 00:19:29,120 I taught myself Python with a lot of help from other people, 499 00:19:29,120 --> 00:19:30,720 but I didn't take classes on this. 500 00:19:30,720 --> 00:19:34,860 And so in the sense of upskilling, I just continued to learn new things. 501 00:19:34,860 --> 00:19:39,540 And so that's kind of kept me abreast of the latest trends in industry. 502 00:19:39,540 --> 00:19:43,880 It's opened up interesting problem-solving opportunities. 503 00:19:43,880 --> 00:19:47,752 So just don't stay comfortable with where you're at.
504 00:19:47,752 --> 00:19:49,460 CARTER ZENKE: And speaking of upskilling, 505 00:19:49,460 --> 00:19:52,460 do you think there are any new features to be added to the duck debugger 506 00:19:52,460 --> 00:19:53,850 to upskill, let's say, the duck? 507 00:19:53,850 --> 00:19:54,080 DAVID J. MALAN: No. 508 00:19:54,080 --> 00:19:54,600 Yes. 509 00:19:54,600 --> 00:19:56,270 Well, hopefully, the duck will only get better and better. 510 00:19:56,270 --> 00:19:58,187 Honestly, even if we don't do anything, thanks 511 00:19:58,187 --> 00:20:01,400 to standing on the shoulders of folks like Microsoft Azure and OpenAI, 512 00:20:01,400 --> 00:20:06,770 on whose APIs the duck is built, we've long talked for years now within CS50 513 00:20:06,770 --> 00:20:11,510 about complementing check50 and style50 with a third tool, 514 00:20:11,510 --> 00:20:14,750 design50, so that not only the assessment of the course, 515 00:20:14,750 --> 00:20:19,490 but also the pedagogical feedback is entirely automatable in addition 516 00:20:19,490 --> 00:20:21,860 to any humans we might also have on staff. 517 00:20:21,860 --> 00:20:26,000 And so a design50 tool in my mind would be sort of already 518 00:20:26,000 --> 00:20:28,280 implemented by the duck now, whereby if you just 519 00:20:28,280 --> 00:20:30,950 ask the duck for some advice about specific code you've written, 520 00:20:30,950 --> 00:20:33,450 it might very well be able to give you that feedback. 521 00:20:33,450 --> 00:20:34,410 But it's not automated. 522 00:20:34,410 --> 00:20:35,690 We don't use it for assessment. 523 00:20:35,690 --> 00:20:38,070 And you have to do a bunch of the legwork, literally the copying 524 00:20:38,070 --> 00:20:39,200 and pasting yourself. 525 00:20:39,200 --> 00:20:40,950 So I think that will start to help. 526 00:20:40,950 --> 00:20:44,355 I'd like to see tighter integration with VS Code. 527 00:20:44,355 --> 00:20:46,730 And these are features we've actually deliberately turned 528 00:20:46,730 --> 00:20:48,800 off, where you get little squiggles under things 529 00:20:48,800 --> 00:20:51,560 that could be better or little notifications 530 00:20:51,560 --> 00:20:53,190 in the gutter of the editor. 531 00:20:53,190 --> 00:20:56,720 I think we could just have the duck be more omnipresent. 532 00:20:56,720 --> 00:20:59,220 And honestly, as we've seen from OpenAI recently, 533 00:20:59,220 --> 00:21:01,020 I think just being able to talk to the duck 534 00:21:01,020 --> 00:21:03,430 is going to get pretty compelling soon. 535 00:21:03,430 --> 00:21:06,840 Browsers make it a little hard to do audio in two directions. 536 00:21:06,840 --> 00:21:09,310 It's not impossible, but it just adds some complexity. 537 00:21:09,310 --> 00:21:11,550 And so we don't have that plumbing in place just yet. 538 00:21:11,550 --> 00:21:14,550 But I think if you could talk to or have the duck talk back 539 00:21:14,550 --> 00:21:17,970 at you, that's just going to add another vector and aid with accessibility, 540 00:21:17,970 --> 00:21:20,345 because right now everything has to be typed at a keyboard as well. 541 00:21:20,345 --> 00:21:22,887 CARTER ZENKE: I think that opens the interesting question of, 542 00:21:22,887 --> 00:21:24,900 what would the duck's voice sound like? 543 00:21:24,900 --> 00:21:26,690 [INTERPOSING VOICES] 544 00:21:26,690 --> 00:21:27,333 545 00:21:27,333 --> 00:21:28,500 DAVID J. MALAN: [INAUDIBLE]. 546 00:21:28,500 --> 00:21:29,320 Yeah, I don't know.
547 00:21:29,320 --> 00:21:32,320 I mean, at that point, you could probably choose your own virtual animal 548 00:21:32,320 --> 00:21:36,210 to moo at you or bark at you. 549 00:21:36,210 --> 00:21:39,540 CARTER ZENKE: That's really interesting. 550 00:21:39,540 --> 00:21:41,880 Maybe a question to switch gears a little bit, 551 00:21:41,880 --> 00:21:45,360 for folks who really like this field, want to perhaps teach it, 552 00:21:45,360 --> 00:21:48,390 how can they become a teacher, like you, like us, 553 00:21:48,390 --> 00:21:49,915 who are teaching computer science? 554 00:21:49,915 --> 00:21:51,540 DAVID J. MALAN: That's a good question. 555 00:21:51,540 --> 00:21:57,140 I mean, for me, it kind of happened accidentally or organically. 556 00:21:57,140 --> 00:21:59,570 The way I tell the story in CS50 is-- 557 00:21:59,570 --> 00:22:02,360 or to people who ask is when I was in college, 558 00:22:02,360 --> 00:22:04,863 I ran for student government, or the undergraduate council, 559 00:22:04,863 --> 00:22:05,780 as it was called here. 560 00:22:05,780 --> 00:22:06,920 And I lost. 561 00:22:06,920 --> 00:22:09,140 And among my shortcomings, I felt at the time, 562 00:22:09,140 --> 00:22:12,060 was my inability to speak well publicly. 563 00:22:12,060 --> 00:22:14,160 And so I wanted to fix that. 564 00:22:14,160 --> 00:22:16,580 And I signed up for the Harvard Computer Society, 565 00:22:16,580 --> 00:22:19,910 which is the group of really geeky kids on campus, myself included, 566 00:22:19,910 --> 00:22:22,470 who were just involved in computer things. 567 00:22:22,470 --> 00:22:25,470 And the thing I chose to get involved with was their seminar series. 568 00:22:25,470 --> 00:22:29,600 And so I had an opportunity to teach my classmates about HTML. 569 00:22:29,600 --> 00:22:31,940 And I don't know if CSS even existed at the time, 570 00:22:31,940 --> 00:22:34,530 but HTML, at least, how to make web pages. 571 00:22:34,530 --> 00:22:37,490 And then that evolved into an opportunity my senior year of college 572 00:22:37,490 --> 00:22:42,680 to be a teaching assistant, or TA, for a class called Computer Science E-1, 573 00:22:42,680 --> 00:22:45,510 Introduction to Computers and the Internet. 574 00:22:45,510 --> 00:22:48,650 So this was 1998, so we were still introducing people 575 00:22:48,650 --> 00:22:50,180 to computers and the internet. 576 00:22:50,180 --> 00:22:53,347 And then right place, right time, I had the fortunate opportunity, 577 00:22:53,347 --> 00:22:55,430 thanks to Professor Henry Leitner here at Harvard, 578 00:22:55,430 --> 00:22:57,660 to teach that same class as the instructor. 579 00:22:57,660 --> 00:23:00,282 So it was me and a hundred or so adults. 580 00:23:00,282 --> 00:23:02,990 I think I was definitely the youngest one in the room, so much so 581 00:23:02,990 --> 00:23:07,520 that I was so determined not to look like the youngest person in the room 582 00:23:07,520 --> 00:23:08,220 that I wore 583 00:23:08,220 --> 00:23:11,900 not only a suit, but also suspenders. 584 00:23:11,900 --> 00:23:14,360 I have not worn suspenders since 1999. 585 00:23:14,360 --> 00:23:18,200 But I wore suspenders because I remember thinking my childhood physician always 586 00:23:18,200 --> 00:23:22,460 wore suspenders, and he looked old, so I should therefore wear suspenders. 587 00:23:22,460 --> 00:23:23,720 And I don't know if it worked. 588 00:23:23,720 --> 00:23:26,580 But anyhow, I really got a taste for teaching.
589 00:23:26,580 --> 00:23:29,390 And on the scale of being a course lecturer, 590 00:23:29,390 --> 00:23:33,050 I'd been tutoring classmates in some form for many years off and on. 591 00:23:33,050 --> 00:23:35,760 But this really kind of took the opportunity to the next level. 592 00:23:35,760 --> 00:23:39,800 And so after that, at least here in the US, in higher education, a lot of doors 593 00:23:39,800 --> 00:23:43,160 tend to be closed to you as an instructor, 594 00:23:43,160 --> 00:23:45,110 unless you have an advanced degree, 595 00:23:45,110 --> 00:23:49,530 like a master's degree in something or a PhD, as in my case, in computer science. 596 00:23:49,530 --> 00:23:53,802 So I pursued that so as to keep those doors open to me. 597 00:23:53,802 --> 00:23:56,760 CARTER ZENKE: And maybe similar to what you were doing as an undergrad, 598 00:23:56,760 --> 00:23:59,880 thinking you don't have to know everything to teach somebody something. 599 00:23:59,880 --> 00:24:01,440 You can know what you know. 600 00:24:01,440 --> 00:24:04,450 And people will be grateful to know what you know, if you can teach it to them. 601 00:24:04,450 --> 00:24:05,448 So take what you know. 602 00:24:05,448 --> 00:24:06,990 Try to teach it to help somebody out. 603 00:24:06,990 --> 00:24:08,290 See what you can teach them. 604 00:24:08,290 --> 00:24:10,207 DAVID J. MALAN: Yeah, I think just practicing. 605 00:24:10,207 --> 00:24:12,930 And I think, frankly, a wonderful side effect of teaching 606 00:24:12,930 --> 00:24:15,790 is ideally just being able to communicate effectively. 607 00:24:15,790 --> 00:24:18,090 So even if you go off into the business world and you need to give a pitch, 608 00:24:18,090 --> 00:24:20,070 or you're trying to convince someone to invest in your company, 609 00:24:20,070 --> 00:24:21,540 or you're just trying to convince colleagues 610 00:24:21,540 --> 00:24:23,362 to come on board with some idea you have, 611 00:24:23,362 --> 00:24:25,570 it's all about public speaking at the end of the day. 612 00:24:25,570 --> 00:24:27,990 So just getting rid of those nerves, I think, 613 00:24:27,990 --> 00:24:30,880 is a wonderful feature of going that route. 614 00:24:30,880 --> 00:24:32,130 CARTER ZENKE: Most definitely. 615 00:24:32,130 --> 00:24:36,708 And just scrolling through here to see if we can find other questions for us. 616 00:24:36,708 --> 00:24:39,375 DAVID J. MALAN: Do you look up syntax when working on a project? 617 00:24:39,375 --> 00:24:40,740 CARTER ZENKE: Oh, absolutely. 618 00:24:40,740 --> 00:24:41,590 All the time. 619 00:24:41,590 --> 00:24:44,790 I look up a lot of R syntax to teach this course. 620 00:24:44,790 --> 00:24:49,993 But I think that is a habit you can reduce with time. 621 00:24:49,993 --> 00:24:52,410 As you learn a new language, you can look those things up. 622 00:24:52,410 --> 00:24:55,470 Maybe intentionally try to learn them, so the next time 623 00:24:55,470 --> 00:24:59,070 you encounter that same question of what was that syntax, just 624 00:24:59,070 --> 00:25:00,387 try to remember it. 625 00:25:00,387 --> 00:25:01,720 And if you can't, go look it up. 626 00:25:01,720 --> 00:25:03,900 But I think if you just go through the process of trying it, and trying 627 00:25:03,900 --> 00:25:05,490 it, and trying it, you'll eventually-- 628 00:25:05,490 --> 00:25:06,700 it comes to you eventually. 629 00:25:06,700 --> 00:25:08,700 DAVID J. MALAN: Yeah, because suffice it to say,
630 00:25:08,700 --> 00:25:10,650 the syntax aspects of programming languages 631 00:25:10,650 --> 00:25:13,720 are generally not the intellectually interesting part. 632 00:25:13,720 --> 00:25:17,380 If you don't know the syntax, you're just slowing yourself down. 633 00:25:17,380 --> 00:25:19,410 And so there's this intrinsic motivation, 634 00:25:19,410 --> 00:25:22,050 I think, to just get good at the syntax because you'll save yourself time. 635 00:25:22,050 --> 00:25:25,230 And programming is no fun if you're literally tabbing back and forth, back 636 00:25:25,230 --> 00:25:27,090 and forth, looking every darn thing up. 637 00:25:27,090 --> 00:25:28,470 But yeah, I do this all the time. 638 00:25:28,470 --> 00:25:30,720 And honestly, I have found ChatGPT, for instance, 639 00:25:30,720 --> 00:25:34,550 to be especially helpful because frankly, it's hard to Google syntax, 640 00:25:34,550 --> 00:25:36,300 because you're typing in weird characters. 641 00:25:36,300 --> 00:25:39,220 And sometimes they come up with results on Stack Overflow or the like. 642 00:25:39,220 --> 00:25:42,760 But really, it's hard to express the question sometimes. 643 00:25:42,760 --> 00:25:44,430 And there are certain languages. 644 00:25:44,430 --> 00:25:46,030 Bash comes to mind. 645 00:25:46,030 --> 00:25:51,540 Perl comes to mind, a little bit of Ruby, that are just so arcane in their syntax. 646 00:25:51,540 --> 00:25:52,540 I don't remember it. 647 00:25:52,540 --> 00:25:54,790 And even code I've written-- literally the other day, 648 00:25:54,790 --> 00:25:57,900 I had a copy of a line of Bash code, some kind of regular expression 649 00:25:57,900 --> 00:26:01,030 plus some other stuff, and asked ChatGPT, what does this line of code do? 650 00:26:01,030 --> 00:26:02,572 Because I don't even remember myself. 651 00:26:02,572 --> 00:26:06,930 So those kinds of things, I think, are especially-- 652 00:26:06,930 --> 00:26:09,960 it's especially useful to be able to ask some smart human 653 00:26:09,960 --> 00:26:11,658 or, barring that, a computer. 654 00:26:11,658 --> 00:26:12,450 CARTER ZENKE: Yeah. 655 00:26:12,450 --> 00:26:17,130 And speaking of ways of getting help online by googling or through some AI, 656 00:26:17,130 --> 00:26:19,650 a question here is whether it was a conscious decision 657 00:26:19,650 --> 00:26:23,880 to limit the amount of text you can put into a chatbot like CS50.ai. 658 00:26:23,880 --> 00:26:26,850 DAVID J. MALAN: Yeah, short answer, yes. 659 00:26:26,850 --> 00:26:28,710 It's partly cost, honestly. 660 00:26:28,710 --> 00:26:32,430 So the APIs, or Application Programming Interfaces, that we, ourselves, use, 661 00:26:32,430 --> 00:26:38,340 which borrow features from OpenAI, and Microsoft Azure, and similar such tools, 662 00:26:38,340 --> 00:26:39,150 they cost money. 663 00:26:39,150 --> 00:26:42,150 And we're fortunate to be supported by OpenAI, and Microsoft, and others 664 00:26:42,150 --> 00:26:45,120 who make it possible for CS50x students and teachers around the world 665 00:26:45,120 --> 00:26:46,870 to use these services for free. 666 00:26:46,870 --> 00:26:48,790 But there is a computational cost, right? 667 00:26:48,790 --> 00:26:50,290 There are servers somewhere running. 668 00:26:50,290 --> 00:26:52,000 There is electricity somewhere being used. 669 00:26:52,000 --> 00:26:53,500 There's internet bandwidth somewhere being used. 670 00:26:53,500 --> 00:26:54,580 And all of that adds up.
671 00:26:54,580 --> 00:26:57,870 And so we do try to minimize our utilization thereof, 672 00:26:57,870 --> 00:27:00,810 so that we can maximize just how many students and teachers can use it 673 00:27:00,810 --> 00:27:02,392 per day, for instance. 674 00:27:02,392 --> 00:27:04,350 And the reality is there are probably diminishing 675 00:27:04,350 --> 00:27:08,460 returns to being able to paste a bigger, and a bigger, and bigger, and bigger 676 00:27:08,460 --> 00:27:11,620 chunk of text, though that's certainly relevant sometimes for code. 677 00:27:11,620 --> 00:27:14,160 But honestly, if you think about how you would interact 678 00:27:14,160 --> 00:27:16,710 with a human, like on Stack Overflow, are you 679 00:27:16,710 --> 00:27:19,410 really going to post pages, and pages, and pages of code? 680 00:27:19,410 --> 00:27:21,930 No human is going to bother giving you the time of day 681 00:27:21,930 --> 00:27:23,890 if you overwhelm them with information. 682 00:27:23,890 --> 00:27:27,000 Now, ChatGPT might be a lot friendlier when it comes to that. 683 00:27:27,000 --> 00:27:29,940 But presumably, the quality of results might 684 00:27:29,940 --> 00:27:33,390 be better if you help focus the AI on your actual problem 685 00:27:33,390 --> 00:27:35,503 and not just paste a big blob of text. 686 00:27:35,503 --> 00:27:36,670 CARTER ZENKE: Yeah, I agree. 687 00:27:36,670 --> 00:27:39,270 And speaking not just of the quantity of text, but, let's say, 688 00:27:39,270 --> 00:27:41,020 the number of messages you might send over 689 00:27:41,020 --> 00:27:43,260 the course of an hour, what do you think is maybe 690 00:27:43,260 --> 00:27:46,020 best for students who are beginners versus those 691 00:27:46,020 --> 00:27:49,120 who are more advanced as they talk to these AI tools? 692 00:27:49,120 --> 00:27:49,930 DAVID J. MALAN: Yeah, it's a good question 693 00:27:49,930 --> 00:27:51,722 because this is something we've limited, too. 694 00:27:51,722 --> 00:27:54,700 As you might know, there's a heart system, like HP in Zelda 695 00:27:54,700 --> 00:27:58,025 or other games, whereby you can only ask so many questions per unit of time. 696 00:27:58,025 --> 00:28:00,650 And then the hearts regenerate, and you can ask more questions. 697 00:28:00,650 --> 00:28:03,483 That, too, is partly for cost, but it's also for pedagogical reasons, 698 00:28:03,483 --> 00:28:07,900 whereby when we looked this past summer at the distribution of questions, 699 00:28:07,900 --> 00:28:11,447 most people were asking, let's call it, a healthy number of questions. 700 00:28:11,447 --> 00:28:14,030 And I don't know what that number is, but it feels reasonable. 701 00:28:14,030 --> 00:28:15,940 But then there was this long tail where there 702 00:28:15,940 --> 00:28:18,170 were hundreds of questions being asked. 703 00:28:18,170 --> 00:28:21,340 And we made this judgment call that if we think about the real world 704 00:28:21,340 --> 00:28:24,910 and, like, a real-world class, a student might go up 705 00:28:24,910 --> 00:28:26,480 to ask a teacher a few questions. 706 00:28:26,480 --> 00:28:28,280 And admittedly, that might not be enough. 707 00:28:28,280 --> 00:28:31,760 So in an ideal world, a student would ask a teacher even more questions. 708 00:28:31,760 --> 00:28:37,240 But if you're occupying the teacher for 200-plus questions, 709 00:28:37,240 --> 00:28:41,290 odds are you're not really synthesizing or reflecting on the answers that are 710 00:28:41,290 --> 00:28:42,140 being given.
711 00:28:42,140 --> 00:28:45,590 And at some point, you should go back to your desk and think about the problem. 712 00:28:45,590 --> 00:28:47,530 And I'm specifically thinking of one of my high school teachers 713 00:28:47,530 --> 00:28:50,900 who used to send me back to my desk when I was asking too many questions. 714 00:28:50,900 --> 00:28:53,440 So I think the upside of AI in software is 715 00:28:53,440 --> 00:28:58,540 that we can now empower humans to ask more questions than social conventions 716 00:28:58,540 --> 00:29:00,830 or real-world time constraints allow. 717 00:29:00,830 --> 00:29:05,500 But I don't think that should be to the detriment of actually helping a student 718 00:29:05,500 --> 00:29:09,100 who's learning find the sweet spot between asking, answering, 719 00:29:09,100 --> 00:29:11,030 and thinking about the same. 720 00:29:11,030 --> 00:29:13,630 CARTER ZENKE: I agree. 721 00:29:13,630 --> 00:29:14,887 Other ones here. 722 00:29:14,887 --> 00:29:17,720 DAVID J. MALAN: Well, along those lines, let's push a little harder. 723 00:29:17,720 --> 00:29:20,290 As Isaac asks here, is there a limit to how much 724 00:29:20,290 --> 00:29:24,370 it is appropriate to use the duck AI, independent of our own hearts? 725 00:29:24,370 --> 00:29:26,675 CARTER ZENKE: Yeah, that's interesting. 726 00:29:26,675 --> 00:29:28,550 I think it really depends on the context, such 727 00:29:28,550 --> 00:29:31,670 that I have a hard time giving a concrete answer, like saying 728 00:29:31,670 --> 00:29:34,760 you should ask no more than 10 questions per hour, for instance. 729 00:29:34,760 --> 00:29:39,560 I think for myself, I can kind of get a good conversation 730 00:29:39,560 --> 00:29:41,300 going with a tool like ChatGPT. 731 00:29:41,300 --> 00:29:45,350 And I find that I do want to ask more questions than sometimes I perhaps 732 00:29:45,350 --> 00:29:49,480 should, so maybe having a little bit of self-restraint is helpful sometimes. 733 00:29:49,480 --> 00:29:51,630 Say, maybe I could figure this out on my own. 734 00:29:51,630 --> 00:29:54,463 I don't have to just ask a question, ask a question, ask a question. 735 00:29:54,463 --> 00:29:56,547 DAVID J. MALAN: Yeah, I feel like there's probably 736 00:29:56,547 --> 00:29:59,280 some self-policing there because you want to solve the problem. 737 00:29:59,280 --> 00:30:02,420 You don't want to just have an endless conversation. 738 00:30:02,420 --> 00:30:04,670 And presumably, you enjoy the programming part, not 739 00:30:04,670 --> 00:30:05,940 the asking questions part. 740 00:30:05,940 --> 00:30:09,710 So I feel like there should be some intrinsic pressure to tab away 741 00:30:09,710 --> 00:30:11,893 from the AI at some point. 742 00:30:11,893 --> 00:30:14,810 CARTER ZENKE: I think there is a natural inclination, at least for me, 743 00:30:14,810 --> 00:30:18,432 where I don't want to bother a human with 20 questions all at once. 744 00:30:18,432 --> 00:30:19,640 But I can do that with an AI. 745 00:30:19,640 --> 00:30:23,450 That's kind of a benefit to using a tool that is not a human. 746 00:30:23,450 --> 00:30:26,580 But at the same time, there is maybe a limit I set for myself. 747 00:30:26,580 --> 00:30:27,437 I can self-police. 748 00:30:27,437 --> 00:30:28,770 DAVID J. MALAN: It occurs to me, 749 00:30:28,770 --> 00:30:32,700 I should caution that once you move on from CS50's full-time team, 750 00:30:32,700 --> 00:30:36,360 you become sort of like the ghost version of the Jedi in Star Wars.
751 00:30:36,360 --> 00:30:39,060 So we might still call on you once in a while via Slack 752 00:30:39,060 --> 00:30:42,270 or text message with questions, as Brian, and Colton, and Tommy, 753 00:30:42,270 --> 00:30:44,400 and others know all too well. 754 00:30:44,400 --> 00:30:45,520 So I hope that's OK. 755 00:30:45,520 --> 00:30:45,900 CARTER ZENKE: Absolutely. 756 00:30:45,900 --> 00:30:47,733 DAVID J. MALAN: But we will turn to the duck 757 00:30:47,733 --> 00:30:49,760 when we don't want to bother Carter, for sure. 758 00:30:49,760 --> 00:30:51,900 So there was an interesting question here. 759 00:30:51,900 --> 00:30:54,600 And the chat asked about why Python is less 760 00:30:54,600 --> 00:30:56,953 used in back-end development than JavaScript. 761 00:30:56,953 --> 00:30:58,120 That may or may not be true. 762 00:30:58,120 --> 00:31:01,203 I don't know the statistics, but that sounds like a reasonable conjecture. 763 00:31:01,203 --> 00:31:04,518 But whether or not it's factually true, why might that be? 764 00:31:04,518 --> 00:31:05,310 CARTER ZENKE: Yeah. 765 00:31:05,310 --> 00:31:08,280 I have to say I don't know exactly why. 766 00:31:08,280 --> 00:31:09,580 I could give a hypothesis. 767 00:31:09,580 --> 00:31:13,590 I might say we use a lot of JavaScript for front-end development for making 768 00:31:13,590 --> 00:31:15,610 things interactive on a web page. 769 00:31:15,610 --> 00:31:19,680 I wonder if, for folks who are designing web pages, it just made sense. 770 00:31:19,680 --> 00:31:21,000 They already knew JavaScript. 771 00:31:21,000 --> 00:31:23,880 Just take that language, and use it on the back end as well. 772 00:31:23,880 --> 00:31:25,020 Do you have other ideas? 773 00:31:25,020 --> 00:31:26,910 DAVID J. MALAN: I think that's a big one because then 774 00:31:26,910 --> 00:31:29,618 if you think about it strategically from a company's perspective, 775 00:31:29,618 --> 00:31:31,788 your team only needs to know the one language. 776 00:31:31,788 --> 00:31:33,580 And there's just an economy of scale there. 777 00:31:33,580 --> 00:31:36,510 It's easier to onboard and recruit people, probably. 778 00:31:36,510 --> 00:31:39,100 Pedagogically, though-- I'll take the CS50 perspective. 779 00:31:39,100 --> 00:31:42,013 I just think Python is easier and more accessible for web 780 00:31:42,013 --> 00:31:44,430 development, at least in the sense that we want to do it. 781 00:31:44,430 --> 00:31:45,555 I actually love JavaScript. 782 00:31:45,555 --> 00:31:48,690 And of the languages out there, I kind of actually 783 00:31:48,690 --> 00:31:53,100 prefer programming in it, only because there's just some intellectually interesting 784 00:31:53,100 --> 00:31:53,850 stuff there. 785 00:31:53,850 --> 00:31:55,980 If you're using a so-called single-threaded model, 786 00:31:55,980 --> 00:31:58,390 and you therefore need to write asynchronous code, 787 00:31:58,390 --> 00:32:02,250 you either need to use things like callback functions or promises. 788 00:32:02,250 --> 00:32:04,690 Or there's a slightly newer syntax, like async/await. 789 00:32:04,690 --> 00:32:07,110 But you have to understand the sophistication of what's 790 00:32:07,110 --> 00:32:09,900 going on inside of the computer, whereas Python just kind of does 791 00:32:09,900 --> 00:32:11,070 what you intend. 792 00:32:11,070 --> 00:32:12,970 And there's a value to that. 793 00:32:12,970 --> 00:32:15,460 But JavaScript is just kind of interesting.
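[For illustration, here is a minimal TypeScript sketch of the three styles just described: callbacks, promises, and the newer async/await syntax. The function names and the simulated one-second delay are invented for this example; this is not code from CS50.]

    // 1. Callback style: hand the "what to do next" step to the function as an argument.
    function loadWithCallback(onDone: (data: string) => void): void {
      setTimeout(() => onDone("result"), 1000); // pretend this work took one second
    }

    // 2. Promise style: return an object representing the eventual value, then chain .then().
    function loadWithPromise(): Promise<string> {
      return new Promise((resolve) => setTimeout(() => resolve("result"), 1000));
    }

    // 3. async/await: the same promises underneath, but the code reads sequentially.
    async function main(): Promise<void> {
      loadWithCallback((data) => console.log("callback:", data));
      loadWithPromise().then((data) => console.log("promise:", data));
      const data = await loadWithPromise(); // pauses here without blocking the single thread
      console.log("await:", data);
    }

    main();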
794 00:32:15,460 --> 00:32:17,280 But when we thought about whether or not, 795 00:32:17,280 --> 00:32:21,620 for instance, we should introduce even more JavaScript server side into CS50, 796 00:32:21,620 --> 00:32:24,120 I just don't think we can do it justice in the time we have. 797 00:32:24,120 --> 00:32:26,650 And honestly, it escalates too quickly. 798 00:32:26,650 --> 00:32:30,840 I mean, so many of the students who've taken CS50, CS50x literally started 799 00:32:30,840 --> 00:32:34,380 programming, what, three months, six months before in Scratch. 800 00:32:34,380 --> 00:32:38,313 And that's just-- at some point, it's too much and too many ideas. 801 00:32:38,313 --> 00:32:40,230 And I'd much rather students get their footing 802 00:32:40,230 --> 00:32:43,140 with a more procedurally oriented language like Python, 803 00:32:43,140 --> 00:32:47,508 even though it has functional aspects, than context switch too much. 804 00:32:47,508 --> 00:32:48,300 CARTER ZENKE: Yeah. 805 00:32:48,300 --> 00:32:50,842 And it's partially why we have courses like Brian's web course, 806 00:32:50,842 --> 00:32:54,450 which dives even more deeply into Python as a back-end language, but also 807 00:32:54,450 --> 00:32:55,510 JavaScript too. 808 00:32:55,510 --> 00:32:56,740 DAVID J. MALAN: Yeah, indeed. 809 00:32:56,740 --> 00:33:00,630 I think we have time for a few more questions here. 810 00:33:00,630 --> 00:33:01,810 Let's see. 811 00:33:01,810 --> 00:33:05,190 Feel free to pluck anything off what you see as well. 812 00:33:05,190 --> 00:33:07,510 CARTER ZENKE: One-- let's see. 813 00:33:07,510 --> 00:33:08,280 Oh. 814 00:33:08,280 --> 00:33:12,720 Maybe related to the changing of the field and all these new things 815 00:33:12,720 --> 00:33:15,360 happening, what keeps your interest in the field 816 00:33:15,360 --> 00:33:17,485 after you've been doing it for as long as you have? 817 00:33:17,485 --> 00:33:18,277 DAVID J. MALAN: Oh. 818 00:33:18,277 --> 00:33:19,450 That's a good question. 819 00:33:19,450 --> 00:33:20,860 Honestly, it continues to evolve. 820 00:33:20,860 --> 00:33:24,703 Certainly, CS50, and CS50x in turn, are continually evolving. 821 00:33:24,703 --> 00:33:27,370 I think we focus much more on the human aspect of things, right? 822 00:33:27,370 --> 00:33:30,540 It's not just the teaching and the execution of the courses' curriculum. 823 00:33:30,540 --> 00:33:31,450 It's the communities. 824 00:33:31,450 --> 00:33:35,160 We, just a few weeks back, spent a week in Indonesia, in Jakarta, 825 00:33:35,160 --> 00:33:37,260 working with almost 300 teachers who will now 826 00:33:37,260 --> 00:33:40,838 go back to their own classrooms, be it at middle school or high school levels, 827 00:33:40,838 --> 00:33:42,880 and teach computer science to their own students. 828 00:33:42,880 --> 00:33:44,460 And so there's that social aspect to it. 829 00:33:44,460 --> 00:33:46,990 We've done a lot of that within the US and elsewhere in the world. 830 00:33:46,990 --> 00:33:49,450 And so for me, it really has been that community aspect. 831 00:33:49,450 --> 00:33:53,160 And I think if we were just doing more of the same for all of these years, 832 00:33:53,160 --> 00:33:54,643 then it wouldn't be nearly as fun. 833 00:33:54,643 --> 00:33:55,810 CARTER ZENKE: Yeah, I agree. 834 00:33:55,810 --> 00:33:58,180 I think it's the social aspect that keeps me going for this kind of-- 835 00:33:58,180 --> 00:33:59,055 DAVID J. MALAN: Yeah.
836 00:33:59,055 --> 00:34:02,370 And even technologically, I mean, being a geek, the course has evolved. 837 00:34:02,370 --> 00:34:05,160 Fundamentally, if you peel back the packaging of it 838 00:34:05,160 --> 00:34:07,590 and the implementation details of the languages and the problem sets 839 00:34:07,590 --> 00:34:10,480 and so forth, the syllabus, the backbone, is still really the same. 840 00:34:10,480 --> 00:34:14,235 But the libraries are changing, and so are the techniques that you can use. 841 00:34:14,235 --> 00:34:17,610 And therefore, things have been getting easier for students and teachers 842 00:34:17,610 --> 00:34:23,580 alike, thanks to advancements in coding platforms and frameworks. 843 00:34:23,580 --> 00:34:25,988 So that keeps the geek in me interested too. 844 00:34:25,988 --> 00:34:27,280 CARTER ZENKE: Yeah, absolutely. 845 00:34:27,280 --> 00:34:28,464 DAVID J. MALAN: Do you have a favorite language? 846 00:34:28,464 --> 00:34:30,320 CARTER ZENKE: Hmm, favorite language. 847 00:34:30,320 --> 00:34:34,929 I've really liked learning R and teaching it recently. 848 00:34:34,929 --> 00:34:37,090 If I had to have a go-to language, I would 849 00:34:37,090 --> 00:34:40,449 say that's probably still Python just because it's so high level, very easy 850 00:34:40,449 --> 00:34:43,210 to use. 851 00:34:43,210 --> 00:34:46,760 I do like C for its ability to get a little more low level. 852 00:34:46,760 --> 00:34:49,532 I feel like you can just kind of learn the entire language. 853 00:34:49,532 --> 00:34:51,199 You can't often do that with other languages. 854 00:34:51,199 --> 00:34:55,040 So I guess I would appreciate each one for its own merits, if I had to say. 855 00:34:55,040 --> 00:34:57,010 DAVID J. MALAN: OK. And when you-- 856 00:34:57,010 --> 00:35:02,020 for those unfamiliar, Carter did his graduate education in Education School. 857 00:35:02,020 --> 00:35:05,920 And for those, particularly those who asked earlier about teaching itself, 858 00:35:05,920 --> 00:35:09,928 what does one learn in Education School that you don't in school, school? 859 00:35:09,928 --> 00:35:10,720 CARTER ZENKE: Sure. 860 00:35:10,720 --> 00:35:15,070 So it takes a more reflective approach to your own educational experiences, 861 00:35:15,070 --> 00:35:17,300 thinking through what works and what doesn't. 862 00:35:17,300 --> 00:35:19,758 And to that end, there are a lot of directions you can go in. 863 00:35:19,758 --> 00:35:23,650 One that I really like thinking about is just human development and how people 864 00:35:23,650 --> 00:35:24,710 learn things. 865 00:35:24,710 --> 00:35:26,660 And once we know that, we can know a lot more 866 00:35:26,660 --> 00:35:30,900 about how to deliver an effective lesson or create an effective experience.
867 00:35:30,900 --> 00:35:33,200 I think one thing that I took away that I 868 00:35:33,200 --> 00:35:35,960 think is maybe useful for those who are just beginning 869 00:35:35,960 --> 00:35:41,210 is thinking maybe less of teaching as, I told somebody something 870 00:35:41,210 --> 00:35:45,020 and they learned it, and more about trying to create, I don't know, 871 00:35:45,020 --> 00:35:48,960 conditions and structures to help support learning in a long-term way. 872 00:35:48,960 --> 00:35:53,120 So checking for understanding, asking questions, and getting students 873 00:35:53,120 --> 00:35:55,310 to give you feedback on your own teaching 874 00:35:55,310 --> 00:35:58,380 is really valuable as you're going off and doing things like education. 875 00:35:58,380 --> 00:36:01,588 DAVID J. MALAN: And one other question that's come up in the chat a few times 876 00:36:01,588 --> 00:36:05,240 is about a different sort of engineering, prompt engineering. 877 00:36:05,240 --> 00:36:06,290 What is that? 878 00:36:06,290 --> 00:36:08,392 And is this something students now need to learn? 879 00:36:08,392 --> 00:36:10,100 CARTER ZENKE: Yeah, so prompt engineering 880 00:36:10,100 --> 00:36:15,830 refers to writing a prompt for some AI, like ChatGPT, like the duck, 881 00:36:15,830 --> 00:36:19,107 for instance, that helps it behave in the way you want it to behave. 882 00:36:19,107 --> 00:36:21,440 And so we actually did some prompt engineering ourselves 883 00:36:21,440 --> 00:36:26,728 to create the CS50 duck, to have it be built on top of a model like GPT-4, 884 00:36:26,728 --> 00:36:30,020 but then say, for instance, you should behave like a duck would, like a teacher 885 00:36:30,020 --> 00:36:30,600 would. 886 00:36:30,600 --> 00:36:32,433 And so I think it's a good skill, if you want 887 00:36:32,433 --> 00:36:34,670 to be able to use AI appropriately, thinking 888 00:36:34,670 --> 00:36:37,130 of what prompts you can use to get the AI to do 889 00:36:37,130 --> 00:36:38,505 what you want it to do successfully. 890 00:36:38,505 --> 00:36:39,380 DAVID J. MALAN: Yeah. 891 00:36:39,380 --> 00:36:42,290 I think our friend, Rongxin, just the other day, put it in a way 892 00:36:42,290 --> 00:36:44,903 that I really liked, which is that prompt engineering, 893 00:36:44,903 --> 00:36:46,070 it's not really engineering. 894 00:36:46,070 --> 00:36:48,290 I mean, it really is using English or whatever human language 895 00:36:48,290 --> 00:36:50,240 to just, with higher probability, get the AI 896 00:36:50,240 --> 00:36:52,050 to do what it is you want it to do. 897 00:36:52,050 --> 00:36:55,310 But in that sense, it's really just about asking good questions 898 00:36:55,310 --> 00:36:56,880 or giving good instructions. 899 00:36:56,880 --> 00:36:59,270 And ironically, that's arguably what computer science 900 00:36:59,270 --> 00:37:01,812 is all about, or at least what the algorithms in computer science 901 00:37:01,812 --> 00:37:02,460 are all about. 902 00:37:02,460 --> 00:37:05,885 And so it's not really a skill one should be putting on LinkedIn. 903 00:37:05,885 --> 00:37:08,510 It's not really something that one should be taking courses in. 904 00:37:08,510 --> 00:37:12,350 Frankly, I do think AI, as it advances in the coming months and years, 905 00:37:12,350 --> 00:37:17,000 is only going to get more tolerant of us humans being bad at prompt engineering. 906 00:37:17,000 --> 00:37:19,410 And it's just going to tolerate free-form English. 907 00:37:19,410 --> 00:37:21,143 And we see this already in CS50.
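[For illustration, here is a hypothetical sketch of the kind of prompt engineering Carter describes, written in TypeScript against OpenAI's chat completions API. The system prompt and the askDuck function are invented for this example; this is not CS50's actual prompt or implementation. The design point is that the system message, not the model itself, is what makes the assistant behave like a teacherly duck.]

    import OpenAI from "openai";

    // The client reads the OPENAI_API_KEY environment variable by default.
    const client = new OpenAI();

    // A made-up system prompt in the spirit of the duck: guide the student
    // toward an answer rather than just handing one over.
    const SYSTEM_PROMPT =
      "You are a friendly rubber duck and teaching assistant. " +
      "Behave like a teacher would: give hints and ask guiding questions, " +
      "but do not write the student's code for them.";

    async function askDuck(question: string): Promise<string> {
      const completion = await client.chat.completions.create({
        model: "gpt-4",
        messages: [
          { role: "system", content: SYSTEM_PROMPT },
          { role: "user", content: question },
        ],
      });
      return completion.choices[0].message.content ?? "";
    }

    // Example usage: the same user question, now answered in character.
    askDuck("What does 'segmentation fault' mean?").then(console.log);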
908 00:37:21,143 --> 00:37:23,060 I mean, you would be surprised to see just how 909 00:37:23,060 --> 00:37:26,270 terse some of the questions that are asked of the-- 910 00:37:26,270 --> 00:37:29,060 case in point, someone copies and pastes a bunch of text. 911 00:37:29,060 --> 00:37:31,115 And then the question is, any thoughts? 912 00:37:31,115 --> 00:37:32,240 CARTER ZENKE: Any thoughts. 913 00:37:32,240 --> 00:37:34,073 DAVID J. MALAN: But this is a real question. 914 00:37:34,073 --> 00:37:37,670 And amazingly, the duck can sometimes handle questions like that and infer 915 00:37:37,670 --> 00:37:41,210 from context what it is the student is probably asking about. 916 00:37:41,210 --> 00:37:44,000 That is not a good example of prompt engineering. 917 00:37:44,000 --> 00:37:46,730 But I do think this is sort of a short-lived term of art, 918 00:37:46,730 --> 00:37:50,143 and I'd be surprised if we're still living with it before long. 919 00:37:50,143 --> 00:37:51,060 CARTER ZENKE: I agree. 920 00:37:51,060 --> 00:37:54,230 And maybe one to close us out on this same theme, 921 00:37:54,230 --> 00:37:58,170 do you get tired of hearing about AI all the time? 922 00:37:58,170 --> 00:37:59,840 DAVID J. MALAN: That's a good question. 923 00:37:59,840 --> 00:38:00,870 No. 924 00:38:00,870 --> 00:38:05,055 I got more tired of hearing about blockchain and Bitcoin for a while. 925 00:38:05,055 --> 00:38:07,650 I got more tired of hearing about cloud computing 926 00:38:07,650 --> 00:38:10,800 as though it was suddenly invented when really it 927 00:38:10,800 --> 00:38:14,460 was just very clever branding and abstraction on top of outsourcing, 928 00:38:14,460 --> 00:38:17,100 and servers, and renting things, and so forth. 929 00:38:17,100 --> 00:38:19,860 But AI does feel a little different, at least to me. 930 00:38:19,860 --> 00:38:22,290 Like, this has happened quicker and sooner 931 00:38:22,290 --> 00:38:25,440 than I would have expected, having something like ChatGPT in the wild. 932 00:38:25,440 --> 00:38:30,570 It is better in its first, well, version 3.5 and now 4.0 933 00:38:30,570 --> 00:38:32,320 than I would have expected. 934 00:38:32,320 --> 00:38:35,580 And it's been impactful, really, overnight. 935 00:38:35,580 --> 00:38:38,040 I'm reminded of other flashes in the pan, so to speak, 936 00:38:38,040 --> 00:38:40,200 like Google Glass, which the whole world was talking about. 937 00:38:40,200 --> 00:38:41,220 And then it completely went away. 938 00:38:41,220 --> 00:38:44,190 But it didn't really move the needle or do anything fundamentally 939 00:38:44,190 --> 00:38:46,180 game-changing, at least at the time. 940 00:38:46,180 --> 00:38:50,250 But AI in this form, and large language models in particular, 941 00:38:50,250 --> 00:38:52,290 have kind of done that already, so much so 942 00:38:52,290 --> 00:38:57,490 that within the span of months, from November 2022 943 00:38:57,490 --> 00:39:02,080 to the late spring of 2023, we steered the ship that 944 00:39:02,080 --> 00:39:05,920 is CS50 in the direction of AI and the CS50 duck. 945 00:39:05,920 --> 00:39:09,850 And that's pretty unprecedented for us curricularly. 946 00:39:09,850 --> 00:39:12,280 I mean, even a change from PHP to Python, 947 00:39:12,280 --> 00:39:13,840 which is a different kind of change, 948 00:39:13,840 --> 00:39:18,040 was something we talked about for years before finally deciding, OK, now 949 00:39:18,040 --> 00:39:19,060 the time is right. 950 00:39:19,060 --> 00:39:20,290 But we pivoted fast.
951 00:39:20,290 --> 00:39:24,708 And to me, that just reinforces what is already my instinct that yes, there's 952 00:39:24,708 --> 00:39:25,750 something different here. 953 00:39:25,750 --> 00:39:30,040 So I'm not tired of hearing about it, but I am 954 00:39:30,040 --> 00:39:34,520 very excited to see where it's going. 955 00:39:34,520 --> 00:39:38,375 CARTER ZENKE: I feel like we might actually live like the Jetsons in our own lifetime. 956 00:39:38,375 --> 00:39:39,250 DAVID J. MALAN: Yeah. 957 00:39:39,250 --> 00:39:40,935 Check back in a year. 958 00:39:40,935 --> 00:39:41,810 CARTER ZENKE: Indeed. 959 00:39:41,810 --> 00:39:45,190 DAVID J. MALAN: Well, Carter, thank you so much, not only for today, but also 960 00:39:45,190 --> 00:39:47,150 for the past several years in CS50. 961 00:39:47,150 --> 00:39:49,570 So glad you'll remain in CS50's family online 962 00:39:49,570 --> 00:39:52,780 through CS50's SQL class and this new R class, which everyone 963 00:39:52,780 --> 00:39:54,980 should register for if of interest. 964 00:39:54,980 --> 00:39:56,690 But thank you so much from us all. 965 00:39:56,690 --> 00:39:57,170 CARTER ZENKE: Thank you. 966 00:39:57,170 --> 00:39:57,980 And thank you to the team. 967 00:39:57,980 --> 00:40:00,480 And thank you to all of you who've supported me in this role 968 00:40:00,480 --> 00:40:03,400 and helped me contribute something, I hope, to your own education. 969 00:40:03,400 --> 00:40:04,670 DAVID J. MALAN: Should we end it on a fun note? 970 00:40:04,670 --> 00:40:05,712 CARTER ZENKE: Yeah, sure. 971 00:40:05,712 --> 00:40:07,960 DAVID J. MALAN: Why is the CS50 profile pic a cat? 972 00:40:07,960 --> 00:40:08,860 CARTER ZENKE: This? 973 00:40:08,860 --> 00:40:11,410 In my three years here, I haven't figured it out. 974 00:40:11,410 --> 00:40:13,210 DAVID J. MALAN: Well, I can answer that. 975 00:40:13,210 --> 00:40:17,687 So back, like, 15-plus years ago, when we signed up for YouTube and other 976 00:40:17,687 --> 00:40:20,770 accounts, we needed a profile picture, and I certainly wasn't going to use 977 00:40:20,770 --> 00:40:21,760 my face. 978 00:40:21,760 --> 00:40:26,910 And so Happy Cat was a very popular meme at the time, who sadly doesn't really 979 00:40:26,910 --> 00:40:28,160 get talked about much anymore. 980 00:40:28,160 --> 00:40:31,410 And he's even a little hard to google, even though knowyourmeme.net or whatnot 981 00:40:31,410 --> 00:40:33,170 has a bio on him. 982 00:40:33,170 --> 00:40:35,240 He's just a very happy, adorable cat. 983 00:40:35,240 --> 00:40:37,210 And so we used his face for some time. 984 00:40:37,210 --> 00:40:41,240 And now that we've done it for so long, it feels like, well, that is the mascot. 985 00:40:41,240 --> 00:40:44,350 And it's only because of rubber duck debugging 986 00:40:44,350 --> 00:40:49,600 that the AI took on the persona of a duck. 987 00:40:49,600 --> 00:40:52,930 But otherwise, a cat is probably the de facto mascot, 988 00:40:52,930 --> 00:40:55,262 even though the duck seems to be eclipsing it now. 989 00:40:55,262 --> 00:40:56,470 CARTER ZENKE: Cats are great. 990 00:40:56,470 --> 00:40:56,680 DAVID J. MALAN: Yeah. 991 00:40:56,680 --> 00:40:57,500 It was close.
992 00:40:57,500 --> 00:41:00,460 The duck was almost a cow, I will say, because we 993 00:41:00,460 --> 00:41:03,850 know-- you might recall that in CS50's Python class and now CS50, 994 00:41:03,850 --> 00:41:07,120 we introduce cowsay, a sort of older-school program that, with ASCII art, 995 00:41:07,120 --> 00:41:09,297 lets a cow say something out of its mouth virtually. 996 00:41:09,297 --> 00:41:11,380 And I thought that would be kind of a cute version 997 00:41:11,380 --> 00:41:15,460 because the Android and iOS emoji for a cow is actually super cute too, 998 00:41:15,460 --> 00:41:16,938 the big face one. 999 00:41:16,938 --> 00:41:18,230 So that would have worked well. 1000 00:41:18,230 --> 00:41:21,190 And in fact, ironically, the duck emoji isn't quite apt 1001 00:41:21,190 --> 00:41:23,560 because it tends to be depicted as a mallard. 1002 00:41:23,560 --> 00:41:26,400 I wish they would change that to a rubber duck. 1003 00:41:26,400 --> 00:41:28,400 CARTER ZENKE: You could petition the consortium. 1004 00:41:28,400 --> 00:41:28,670 DAVID J. MALAN: We could. 1005 00:41:28,670 --> 00:41:31,287 We do know someone within the Unicode group that we could ask, 1006 00:41:31,287 --> 00:41:33,620 but I don't know if that's going to happen anytime soon. 1007 00:41:33,620 --> 00:41:38,840 But thus was born the CS50 cat, and thus began Carter's years with us. 1008 00:41:38,840 --> 00:41:40,718 I'm so happy again to have had you with us. 1009 00:41:40,718 --> 00:41:42,010 CARTER ZENKE: Yeah, me as well. 1010 00:41:42,010 --> 00:41:42,560 DAVID J. MALAN: All right. 1011 00:41:42,560 --> 00:41:43,820 Well, this was CS50. 1012 00:41:43,820 --> 00:41:47,020 If you're new to the community, go to cs50.edx.org. 1013 00:41:47,020 --> 00:41:49,630 To register for the R class in particular, 1014 00:41:49,630 --> 00:41:52,510 add a /R to the end of that URL. 1015 00:41:52,510 --> 00:41:55,230 This then was CS50. 1016 00:41:55,230 --> 00:41:57,000