RONGXIN LIU: So let's get right into it. Today we are going to take a deep dive into CS50.ai, which is a specific duck. The duck actually spent some time in Bali, surfing at the beach happily. So the duck is with us, but this image is actually completely generated by AI, from the prompt "A yellow rubber duck is surfing happily in Bali." And that's what we get.

Generative AI has now become a trendy term. I think everyone must have heard of it at some point. These AI models can generate images, generate video, generate text. The image you just saw was generated by Midjourney. But recently there's another company called Pika that will even generate video for you as well. So imagine the duck actually surfing in the ocean. That's the kind of capability generative AI has nowadays. Moreover, it can generate text. For example, some of you did the Tideman problem set. I don't know how to do it, so I asked ChatGPT to please just do it for me. And ChatGPT will happily say, sure, let me complete it for you. So it completed all the functions for me. But you know what?
I'm kind of a copy-paster. So I just said, thanks, can you give me the complete code? ChatGPT will happily give me the complete code. And I just copy-pasted it into tideman.c and ran check50. It's all green. Great, right? But not so great for educators, because in that case students didn't actually learn anything. In one minute they finished Tideman, but they learned nothing. So the problem with all these generative AI tools is that they're just too helpful. So how can we solve this problem? One intuition is just to make it dumber. That's where the duck comes from. We try to restrict the capability of ChatGPT while still providing useful feedback for students.

And just to recap, as you all know, CS50 is a large course on campus and online. That's the big duck on the stage. We have 500 on-campus students, roughly 40 TAs, for the fall 2023 semester. Globally we have over 5.4 million, or even more, learners nowadays. So it's kind of hard to support this large community with limited humans, right?
So with this generative AI, we can actually utilize it to approximate a one-to-one teacher-to-student ratio, so that we can offer personalized tutoring for each individual, either on campus or online. So here comes the CS50 Duck. You've probably all used it a little bit.

The most straightforward way to access the CS50 Duck is through your codespace. You can just ask questions there, and it will happily assist you with your problem set without giving you the answer right away. Behind the scenes, underneath the hood, all this CS50 Duck interaction is powered by what is called a large language model. It is a kind of deep neural network, a deep learning model, trained on a huge text corpus. As you've already seen, this class of models is capable of generating images, video, all kinds of media. So far we have 90,000 users globally, and we have processed 3.4 million prompts, so it's getting more and more popular.

And this is the overall system architecture of the CS50 Duck you just viewed. It looks complicated right now; it looks scary. I will just break it down for you piece by piece.
So first let's focus on the UI side. There are multiple ways to interact with the CS50 Duck. One of the most common ways you interact with the duck is through the codespace. But actually, the very first feature we built when we experimented with the duck was "explain highlighted code," where we first added support to VS Code that allows students to select a portion of code and get an explanation. But as students realized, this is kind of a one-way interaction: each time, you can only select a portion of code and get a response back, but you cannot really talk to the duck.

Another similar use case is explaining style changes. We have style50, which assists students in improving their coding style. But sometimes it is hard for students to understand what the style50 suggestion means. So we can also utilize AI to offer a plain-English explanation of what it means and what steps to take to improve their coding style.

On the Ed discussion forum, when you post a question, the duck will often be the first one to answer it.
That's actually a more nuanced, more elaborate implementation of the large language model that I will dive into a little bit later.

And of course, the whole purpose of the CS50 Duck is to provide you a personalized tutor that's available to you 24/7. So we have this web app available, called CS50.ai, where you can just chat with the duck. But of course, you cannot keep asking the duck forever. The duck will get tired. So at a certain point, the duck will just go to sleep, and you have to wait for the duck to revive.

But that's the UI side, the ways you can interact with the duck. But why can the duck answer your questions? There's something going on between your prompt, which is the message you send to the duck, and the text generation process that happens on our server side, and the subsequent API call. So I'm going to explain what happens in the bottom part.

All of this can be summarized as text generation. The large language model actually doesn't know the question at all. It actually doesn't understand.
The model doesn't really understand what you're asking; it's just able to produce an answer that is the best fit to your question, or to the text you send to the model. The model just keeps generating text that somehow is a response to your question. So, essentially, it's a chatbot, but a chatbot with context. The chatbot knows about CS50. It knows about your situation, what you are asking, and why you're asking the question.

In industry, it's common to define three roles when you're interacting with a large language model: there is a system role, there's a user, and there's an assistant. System means the overall guidance to the duck. It's like a personality that we set for the duck. For example, we will say: you are a rubber duck; you will be a CS50 assistant; you will answer students' questions, but you do not give answers to students. It's kind of an overall guideline for the AI model. User means us. We are the user; we are interacting with the AI model, so we are giving instructions. Our instruction is usually a question: we ask the duck a question, and we instruct the duck to give us an answer.
Assistant is the duck. All the responses generated by this AI model are considered assistant responses. So just keep in mind that during the presentation I will keep referring to these terms.

And this diagram summarizes the interaction between the user and the assistant, and how the system role guides the response. So for example, our duck has a system prompt. This is just a very simplified system prompt; we actually have an 800-character-long system role for the duck. But to give you an idea, this is what a system prompt will usually look like. You define what it is: "You are a helpful and supportive teaching assistant for CS50." To give it some personality: "You are also a rubber duck." We could also say you are a cat or a bird or something. And what it should do: we tell it that it should answer student questions about CS50 or the field of computer science. So the duck should not answer a question not related to computer science, like "how can I make ice cream?" The duck shouldn't answer that. I mean, you can Google that.
But it's not the duck's job to answer that kind of question, so you have to make that clear to the duck.

Also importantly: "Do not provide full answers to problem sets, as this would violate academic honesty." This is actually a useful system rule, because first, it tells the duck not to give the answer, which is right. But then the student will want to know why. So the duck will give an explanation like, "Oh, I cannot give you the solution, because this would violate academic honesty," to remind students about the policy.

So to simplify: the moment you ask the duck "how can I solve the filter problem set," that's a prompt. It's considered a prompt that gets sent to the large language model. We are using GPT-4, but there are many different kinds of large language models one can use.

There's a term now called prompt engineering, but at the end of the day it's just the art of asking questions, honestly. For example, the first one: "Give me a prime number less than 10. For example, three." This is called one-shot prompting.
Meaning, when you ask the AI model a question, you also give the model an example, just so the model knows how to generate its response. The second is kind of like a system role: you are defining the assistant, the personality of the assistant. The third one is also a one-shot prompt example; it's trying to limit the response. You tell the model what its response should look like. You can even ask the model to respond in different languages, or with whatever rules you want it to follow.

The rest are just other examples. For example, the last one: explain the code delimited by triple backticks. That's actually the prompt we use for explaining highlighted code. The moment you select a portion of code, we send it to OpenAI with a prompt like "explain the following code delimited by triple backticks," and then we just put the code between the triple backticks. Then we send the whole thing to OpenAI and get back a response. That's actually what's happening behind the scenes.

And next, I'm just going to demo some of the code that achieves all the user interactions I just mentioned in the presentation.
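The "explain highlighted code" prompt construction just described can be sketched roughly like this. This is a hypothetical reconstruction; the exact wording CS50 sends to OpenAI may differ, and `build_explain_prompt` is an illustrative name, not CS50's actual code.

```python
# Hypothetical sketch of assembling the "explain highlighted code" prompt;
# the exact wording CS50 uses may differ.
def build_explain_prompt(code: str) -> str:
    # Wrap the student's highlighted code in triple backticks so the
    # model knows exactly which text it is being asked to explain.
    return (
        "Explain the following code delimited by triple backticks.\n"
        "```\n" + code + "\n```"
    )

prompt = build_explain_prompt('printf("hello, world\\n");')
print(prompt)
```

The resulting string, instruction plus delimited code, is what gets sent to the model as a single user message.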
The API we are using is from OpenAI. It's called the Chat Completions API. This is what the interaction looks like: I ask a question to OpenAI, and OpenAI just gives me back a response. This is actually an example I use; it's the response I got straight back from GPT-4 when I asked that question.

This is a little bit technical, but just to give you an idea of what's happening in the actual code: when I ask that question, I actually set a system role: "You are a rubber duck. You are a friendly, supportive CS50 teaching assistant." And as you can see in the second one, "user": "Can you help me with my filter problem set?" That's the question you ask, or I ask. And then we get back a response from OpenAI. And I'm just going to demo this for you all.

Let's try this. Right. So first, because we are using OpenAI's API, I'm going to import OpenAI's library. We need to instantiate a client.
You don't need to follow this along. The whole purpose of the demo is just to show you what's happening underneath the hood, and that it's actually not that crazy complicated to build a generative AI chatbot.

So now that I have the instance of the OpenAI client, I can start creating a chat completion. I can start invoking the Chat Completions API to have the AI model generate a response: client.chat.completions.create. I'm just copy-pasting my notes, but you can usually find these in what we call the API documentation. All these LLM vendors have very detailed documentation on how you can use the API, and they usually make it very easy for you to follow, with example code.

Just going to keep going. So first, I'm not even going to bother setting the system role. I'm just going to say a quick "hello world" to the model, and see what we get back. This is a user prompt, so we have to give it the role "user."
So I'll just say "Hello world." It means nothing; it's not even an instruction. It's just something I send to the model, so let's just see how the model responds. And we also specify to use the model GPT-4.

The line I just typed is there because when we call the API, we get back a chat completion object, and we just want the actual text within that object. The object is actually more complicated than it should be; that's beyond the scope of this presentation, so just bear with me on this line. I'm just going to print this response text to the terminal. Let's hope it works. This usually might not work during a demo.

So what I just did: I'm only sending "Hello world" to this model. That's the only text I'm sending to this model. Let's just see what we get back.

That's expected. "Completions." What did I type wrong? Oh, "model." Oh, OK. Thank you. Let's try it again.
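Put together, the hello-world call from the demo looks roughly like the sketch below. It assumes the `openai` Python package is installed and that an `OPENAI_API_KEY` environment variable is set; without the key, the call is skipped.

```python
import os

# The conversation is just a list of role/content dictionaries;
# here there is no system role yet, only a single user message.
messages = [{"role": "user", "content": "Hello world"}]

if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI  # pip install openai

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    chat_completion = client.chat.completions.create(
        model="gpt-4",
        messages=messages,
    )
    # The reply text lives inside the returned chat completion object.
    print(chat_completion.choices[0].message.content)
```

The `choices[0].message.content` access is the "line" mentioned above that digs the actual text out of the chat completion object.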
RONGXIN LIU: Another one. "Chat completion," not "response." Oh. Chat completion, OK. Try again. This is live, not a recorded video. OK: "Hello, how can I assist you today?" Let's try again. OK, it's very consistent, right? But usually the model is not that consistent, because sometimes you ask the same question and it gives you a different kind of answer. So what do you all want to say to the GPT model, anyone? Want to talk to the GPT-4 model? Any questions? Yes?

AUDIENCE: How are you?

RONGXIN LIU: "How are you today?" OK.

AUDIENCE: Oh my.

RONGXIN LIU: What about-- it's OK. I know I typed something wrong, but the model gets it. That just demonstrates that the model doesn't actually understand anything.
Right. So this actually shows that the model doesn't understand what I'm typing. It just gets: oh, he's asking me to answer in Bahasa. Bahasa Indonesia, even. But, yeah. I hope that's correct. Is that correct?

AUDIENCE MEMBERS: Yes.

RONGXIN LIU: OK, good. Well, let's keep going. No, it's good. We'll keep going.

The demo I just showed is a one-way interaction: I ask a question, it sends back a response, and we're done. But what if we want to keep interacting with the AI model? This is what's actually happening in the code if we keep going. I ask a question, "Can you help me with my filter pset?", and it gets back a response. But to let the conversation carry on, we actually need to put that response back into the messages. Because again, the AI model doesn't know anything about the question. It doesn't have memory. The moment I ask the question, it generates back a response, and it's done.
To actually have a conversation, you need to let the model know what it just responded, so that you can follow up with a question, prompt the model again, and get back a response. And this is actually what's happening: in order for me to ask the next question, I have to append the assistant's response back to the messages array, so that I can put my next question in. Then, when the next response comes back, I need to put it back into the messages array so that I can ask the next question. This cycle goes on and on and on, and that's how the conversation goes, and actually how you are interacting with the duck.

So as you can imagine, as you keep chatting with the duck, the prompt we are sending to the model grows bigger and bigger, because each prompt is a brand-new session from the AI model's perspective. That's how you give the AI model context. Previously I mentioned that a chatbot plus context is basically our duck. And I'm going to demo this now.

Let's make some improvement first. It's kind of silly that I keep typing things here.
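The append-the-response cycle just described can be sketched without a live API call. The assistant reply below is a stand-in for what the model would actually return, and the exact question wording is illustrative:

```python
# Each prompt is a brand-new session for the model, so the full
# history travels with every request. Start with a system role
# and a first user question.
messages = [
    {"role": "system",
     "content": "You are a friendly and supportive CS50 teaching assistant."},
    {"role": "user", "content": "Can you help me with my filter pset?"},
]

# Stand-in for the text the model would send back:
assistant_reply = "Sure! Which filter are you working on?"

# Append the assistant's response so the model can see it next time...
messages.append({"role": "assistant", "content": assistant_reply})

# ...then append the follow-up question and send the whole
# (growing) list to the model again.
messages.append({"role": "user", "content": "How do I average the RGB values?"})

roles = [m["role"] for m in messages]
```

After each turn the list grows by two entries, one assistant and one user, which is exactly why the prompt gets bigger and bigger as the chat continues.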
So what we could do is just input("User! ") so I can grab that input from the terminal. You're familiar with it from Python, I hope. So we can swap it in as the user prompt. Actually, let's test it now and see if that works. What kind of question do we want to ask this time?

[STUDENT CHATTER]

RONGXIN LIU: Make a what? Make a? Make a bomb? Oh.

[STUDENT CHATTER]

RONGXIN LIU: A "boam?" Oh, sorry. P. Like, make a poem. OK. Make a poem. OK, make a poem. It's thinking. It's-- make a poem, OK. Someone said in Bahasa. OK, let's prompt again. Let me clear this. Make a--

[STUDENT CHATTER]

RONGXIN LIU: That means poem?

AUDIENCE: P-A.

RONGXIN LIU: P-A.

AUDIENCE: A poem from Indonesia.

RONGXIN LIU: P-A?
AUDIENCE: N. N.

RONGXIN LIU: P-A-M?

AUDIENCE: N. N. N.

RONGXIN LIU: M?

AUDIENCE: N.

RONGXIN LIU: You can type. You can type. Yeah, go. Go type. Sorry. OK. OK.

AUDIENCE MEMBERS: Yeah.

AUDIENCE: OK.

RONGXIN LIU: OK. Thank you.

[STUDENT CHATTER]

RONGXIN LIU: Let's be nice, right? OK, thank you. Thank you.

AUDIENCE: Please, you read. You read it for them. You read for them.

RONGXIN LIU: Oh, I read for them. Um.

AUDIENCE: Read it. Read it.

RONGXIN LIU: (READS BAHASA)

What does that mean?

AUDIENCE: It's good.

RONGXIN LIU: OK, thank you.

[CONTINUES READING BAHASA]

Actually-- OK, I'm just going to keep building this chatbot right now.
473 00:23:43,800 --> 00:23:45,360 Now you can take my input. 474 00:23:45,360 --> 00:23:46,110 It responds back. 475 00:23:46,110 --> 00:23:49,830 But I really want to keep talking to this AI model, 476 00:23:49,830 --> 00:23:53,080 so I'm just going to keep coding a little bit here. 477 00:23:53,080 --> 00:23:59,290 So now I haven't really defined the system prompt here. 478 00:23:59,290 --> 00:24:00,570 So I really should. 479 00:24:00,570 --> 00:24:04,830 So I'm just going to add the system prompt. 480 00:24:04,830 --> 00:24:08,207 It'll be a style thing here. 481 00:24:08,207 --> 00:24:09,540 Ignore what I'm doing right now. 482 00:24:09,540 --> 00:24:11,160 I just want to make this tidy. 483 00:24:11,160 --> 00:24:17,440 484 00:24:17,440 --> 00:24:19,450 OK, system prompt. 485 00:24:19,450 --> 00:24:28,360 "You are a friendly and supportive teaching assistant for CS50. 486 00:24:28,360 --> 00:24:32,090 You are also a cat." 487 00:24:32,090 --> 00:24:32,590 OK? 488 00:24:32,590 --> 00:24:35,350 489 00:24:35,350 --> 00:24:37,000 Yep. 490 00:24:37,000 --> 00:24:40,510 So I'm going to define the system role here at the very beginning. 491 00:24:40,510 --> 00:24:43,600 492 00:24:43,600 --> 00:24:50,770 Right, and its content is the system prompt. 493 00:24:50,770 --> 00:24:52,780 That's it, right. 494 00:24:52,780 --> 00:24:54,580 Now it has the system prompt. 495 00:24:54,580 --> 00:24:56,080 Actually, you know what? 496 00:24:56,080 --> 00:25:06,790 "Always end your response with meow." 497 00:25:06,790 --> 00:25:07,450 Three times? 498 00:25:07,450 --> 00:25:16,070 499 00:25:16,070 --> 00:25:16,730 "How are you?" 500 00:25:16,730 --> 00:25:20,240 501 00:25:20,240 --> 00:25:21,060 Meow, meow, meow. 502 00:25:21,060 --> 00:25:25,310 OK, it works. 503 00:25:25,310 --> 00:25:25,850 That's good. 504 00:25:25,850 --> 00:25:27,320 Let's keep going.
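The single-turn exchange being demonstrated above, a fixed system role paired with one user prompt, can be sketched roughly as follows. The `openai` v1 Python client, the model name, and an `OPENAI_API_KEY` in the environment are all assumptions, and `build_messages`/`ask` are hypothetical helper names, not the code on screen:

```python
# Sketch of the single-turn "you are a cat" demo. Assumes the `openai`
# package (v1 client) and an OPENAI_API_KEY in the environment; the model
# name below is a placeholder.
SYSTEM_PROMPT = (
    "You are a friendly and supportive teaching assistant for CS50. "
    "You are also a cat. Always end your response with 'meow meow meow'."
)

def build_messages(user_prompt: str) -> list:
    # Pair the fixed system role with the user's question.
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_prompt},
    ]

def ask(user_prompt: str) -> str:
    # Imported here so the sketch loads even without the package installed.
    from openai import OpenAI
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; any chat-capable model works
        messages=build_messages(user_prompt),
    )
    return response.choices[0].message.content
```

Calling `ask(input("User: "))` then mirrors the terminal demo: whatever you type goes out with the system prompt attached, and the reply should come back ending in "meow meow meow".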
505 00:25:27,320 --> 00:25:33,110 So to make it a back-and-forth conversation style, what kind 506 00:25:33,110 --> 00:25:36,620 of control structure should we use? 507 00:25:36,620 --> 00:25:38,240 I hear loop. 508 00:25:38,240 --> 00:25:40,040 OK, correct. 509 00:25:40,040 --> 00:25:43,100 We'll just use a simple while loop to do this. 510 00:25:43,100 --> 00:25:46,320 511 00:25:46,320 --> 00:25:50,800 So user prompt, while true. 512 00:25:50,800 --> 00:25:51,420 Right. 513 00:25:51,420 --> 00:25:55,710 So the whole thing should be in a loop, because we can just 514 00:25:55,710 --> 00:25:57,040 keep prompting the user. 515 00:25:57,040 --> 00:25:58,770 So while true, user prompt. 516 00:25:58,770 --> 00:26:02,070 So here there's a slight difference, because we 517 00:26:02,070 --> 00:26:04,170 need a way to hold the conversation. 518 00:26:04,170 --> 00:26:06,870 So we need to have an array. 519 00:26:06,870 --> 00:26:10,160 A messages array, so. 520 00:26:10,160 --> 00:26:14,480 So once we get the user prompt, the first thing we should do 521 00:26:14,480 --> 00:26:17,360 is to append it to the array. 522 00:26:17,360 --> 00:26:20,008 523 00:26:20,008 --> 00:26:21,550 Well, let me just copy paste this one. 524 00:26:21,550 --> 00:26:25,950 525 00:26:25,950 --> 00:26:27,340 You are following, right? 526 00:26:27,340 --> 00:26:34,090 So I'm appending this dictionary to this array just so I have it. 527 00:26:34,090 --> 00:26:37,300 Also, the first thing I should do is here. 528 00:26:37,300 --> 00:26:39,500 We should append what? 529 00:26:39,500 --> 00:26:45,590 The system prompt, because we don't want to keep appending it to our conversation 530 00:26:45,590 --> 00:26:46,790 each time we go in. 531 00:26:46,790 --> 00:26:48,290 This is only done once. 532 00:26:48,290 --> 00:26:53,990 So I'm going to put it here, and now we can replace this with messages, 533 00:26:53,990 --> 00:26:56,810 because that will keep track of everything.
534 00:26:56,810 --> 00:27:00,470 Now the response text we get back from the system, 535 00:27:00,470 --> 00:27:07,480 we can also save it to the messages array. 536 00:27:07,480 --> 00:27:10,830 I'm just going to change it to assistant. 537 00:27:10,830 --> 00:27:16,890 And this is going to be the response text. 538 00:27:16,890 --> 00:27:20,040 So far, so good. 539 00:27:20,040 --> 00:27:29,870 Just to-- Just to make it clear this is the response from the assistant. 540 00:27:29,870 --> 00:27:33,020 Looks good to me. 541 00:27:33,020 --> 00:27:35,615 Actually, we need to append this. 542 00:27:35,615 --> 00:27:38,730 543 00:27:38,730 --> 00:27:39,230 Append. 544 00:27:39,230 --> 00:27:42,960 545 00:27:42,960 --> 00:27:44,060 Any typos? 546 00:27:44,060 --> 00:27:46,892 [STUDENT CHATTER] 547 00:27:46,892 --> 00:27:51,150 548 00:27:51,150 --> 00:27:52,460 Actually, you know what? 549 00:27:52,460 --> 00:27:53,320 Sorry. 550 00:27:53,320 --> 00:27:54,340 Just going to run it. 551 00:27:54,340 --> 00:27:55,950 We'll see the error. 552 00:27:55,950 --> 00:27:58,860 So line 14. 553 00:27:58,860 --> 00:27:59,610 While true. 554 00:27:59,610 --> 00:28:00,240 This one. 555 00:28:00,240 --> 00:28:03,420 556 00:28:03,420 --> 00:28:05,830 Here. 557 00:28:05,830 --> 00:28:06,960 OK. 558 00:28:06,960 --> 00:28:09,360 Going to run it. 559 00:28:09,360 --> 00:28:10,110 OK. 560 00:28:10,110 --> 00:28:11,070 "How are you?" 561 00:28:11,070 --> 00:28:14,100 562 00:28:14,100 --> 00:28:16,290 We should get back a response. 563 00:28:16,290 --> 00:28:21,270 OK, can-- What should we follow up this question with? 564 00:28:21,270 --> 00:28:21,990 The response. 565 00:28:21,990 --> 00:28:27,570 566 00:28:27,570 --> 00:28:34,245 How about-- How about "Do you know what Scratch is?" 567 00:28:34,245 --> 00:28:40,160 568 00:28:40,160 --> 00:28:42,290 It responds back, right. 569 00:28:42,290 --> 00:28:46,550 So to prove that it remembers what we are talking about, right.
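The loop being assembled above can be sketched as below, with the actual OpenAI call factored out into a `send` callable (a hypothetical name for this sketch) so that the history-keeping logic is visible. In the real demo, `send` would wrap `client.chat.completions.create(...)`; passing the whole `messages` list out on every turn is exactly what gives the model its memory of the conversation:

```python
# Sketch of the multi-turn chat loop: the system prompt is appended once,
# then each user prompt and each assistant reply are appended in turn.
SYSTEM_PROMPT = "You are a friendly and supportive teaching assistant for CS50."

def chat_loop(send, get_input=input, turns=None):
    # `send` takes the full message history and returns the reply text.
    # `turns` limits the loop for testing; None means loop forever.
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]  # done once
    while turns is None or turns > 0:
        user_prompt = get_input("User: ")
        messages.append({"role": "user", "content": user_prompt})
        reply = send(messages)  # the whole history goes out each time
        messages.append({"role": "assistant", "content": reply})
        print(reply)
        if turns is not None:
            turns -= 1
    return messages
```

With a real `send` wrapping the API, asking "How are you?" and then "What did I just ask?" works because both turns are sitting in `messages` when the second request goes out.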
570 00:28:46,550 --> 00:28:49,890 571 00:28:49,890 --> 00:28:52,890 "What did I just ask?" 572 00:28:52,890 --> 00:28:54,810 "You just asked if I know what Scratch is." 573 00:28:54,810 --> 00:29:00,240 So now the model has context. 574 00:29:00,240 --> 00:29:03,390 So we can just keep going, right. 575 00:29:03,390 --> 00:29:04,650 Now that we actually-- 576 00:29:04,650 --> 00:29:08,190 We've basically finished programming the duck here in the terminal. 577 00:29:08,190 --> 00:29:11,400 That's what's happening in the codespace, 578 00:29:11,400 --> 00:29:13,050 in the CS50.ai web app. 579 00:29:13,050 --> 00:29:16,270 580 00:29:16,270 --> 00:29:18,340 But we can take a step further. 581 00:29:18,340 --> 00:29:21,160 What if it can talk back to you, right? 582 00:29:21,160 --> 00:29:23,230 Right now I only send back text, but it can-- 583 00:29:23,230 --> 00:29:24,370 It can actually speak. 584 00:29:24,370 --> 00:29:28,240 These AI models are capable of even generating audio. 585 00:29:28,240 --> 00:29:34,540 So let's just experiment with something new here. 586 00:29:34,540 --> 00:29:35,140 You know what? 587 00:29:35,140 --> 00:29:39,250 For this demo I'm going to use my cheat sheet, if that's OK with all of you. 588 00:29:39,250 --> 00:29:45,060 589 00:29:45,060 --> 00:29:46,210 What's different here? 590 00:29:46,210 --> 00:29:51,300 So I imported a new module, or I'm using a new capability of the LLM 591 00:29:51,300 --> 00:29:53,700 model, called text to speech. 592 00:29:53,700 --> 00:29:56,400 So we are able to use the model to generate 593 00:29:56,400 --> 00:30:03,150 a very human-sounding response based on the text we provide. 594 00:30:03,150 --> 00:30:08,235 As you can see, I instructed this system to always respond in bahasa. 595 00:30:08,235 --> 00:30:11,610 596 00:30:11,610 --> 00:30:14,010 The rest of the code looks almost the same, 597 00:30:14,010 --> 00:30:18,240 except that I also put in this part.
598 00:30:18,240 --> 00:30:20,100 This is where the magic happens. 599 00:30:20,100 --> 00:30:22,830 It will send back the response. 600 00:30:22,830 --> 00:30:26,040 We then send the response text to the audio, 601 00:30:26,040 --> 00:30:29,910 or text-to-speech, model to generate speech back to us. 602 00:30:29,910 --> 00:30:32,520 And then we can just play it back to all of you. 603 00:30:32,520 --> 00:30:35,275 Now let's run this demo. 604 00:30:35,275 --> 00:30:39,080 605 00:30:39,080 --> 00:30:40,520 What do we want to ask? 606 00:30:40,520 --> 00:30:44,210 What do we want to ask the model this time? 607 00:30:44,210 --> 00:30:45,545 Anyone volunteer a question? 608 00:30:45,545 --> 00:30:52,594 609 00:30:52,594 --> 00:30:56,030 AUDIENCE: Ask it for a recipe. 610 00:30:56,030 --> 00:30:59,090 RONGXIN LIU: You know, it won't answer that, because the duck was 611 00:30:59,090 --> 00:31:03,210 instructed to answer CS50 questions. 612 00:31:03,210 --> 00:31:06,540 OK, how about I propose one question to start. 613 00:31:06,540 --> 00:31:10,640 "What is Flask?" 614 00:31:10,640 --> 00:31:14,720 Right, you all know about Flask from pset 9. 615 00:31:14,720 --> 00:31:19,280 Let's see what gets passed back to us. 616 00:31:19,280 --> 00:31:23,780 Now it gets back in bahasa, but it's also trying to talk to us. 617 00:31:23,780 --> 00:31:33,260 618 00:31:33,260 --> 00:31:34,820 It's always something. 619 00:31:34,820 --> 00:31:35,510 Let's try again. 620 00:31:35,510 --> 00:31:44,400 621 00:31:44,400 --> 00:31:45,330 Let's hope that works. 622 00:31:45,330 --> 00:31:53,560 623 00:31:53,560 --> 00:31:55,120 Is there laptop audio going on? 624 00:31:55,120 --> 00:32:00,930 625 00:32:00,930 --> 00:32:03,900 You know what? 626 00:32:03,900 --> 00:32:05,475 Give me one second, sorry. 627 00:32:05,475 --> 00:32:09,480 628 00:32:09,480 --> 00:32:10,320 You know what? 629 00:32:10,320 --> 00:32:13,710 So the speech actually got completely generated.
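The text-to-speech step described above, handing the assistant's reply text to a speech endpoint and saving the audio for playback, can be sketched as follows. `speak` and `openai_synthesize` are illustrative names; the `tts-1` model and `alloy` voice come from OpenAI's documented speech endpoint, but treat the exact calls as an assumption rather than the code on screen:

```python
# Sketch of the text-to-speech playback step: turn reply text into audio
# bytes and write them to a file that can then be played back.
def speak(response_text, synthesize, out_path="reply.mp3"):
    # `synthesize` maps text -> raw audio bytes; in the real demo it
    # would wrap the OpenAI speech call below.
    audio_bytes = synthesize(response_text)
    with open(out_path, "wb") as f:
        f.write(audio_bytes)
    return out_path

def openai_synthesize(text):
    # The real call, assuming the `openai` v1 client and an API key in
    # the environment. Model and voice names are illustrative.
    from openai import OpenAI
    client = OpenAI()
    response = client.audio.speech.create(model="tts-1", voice="alloy", input=text)
    return response.content  # raw mp3 bytes
```

Splitting the file-writing from the API call keeps the plumbing testable; `speak(reply, openai_synthesize)` would produce the mp3 that gets played to the room.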
630 00:32:13,710 --> 00:32:16,728 I'm just going to play off my laptop, OK. 631 00:32:16,728 --> 00:32:17,520 Give me one second. 632 00:32:17,520 --> 00:32:18,585 I'm just going to-- 633 00:32:18,585 --> 00:32:21,460 ASSISTANT: You can play it through your speakers and then amplify it. 634 00:32:21,460 --> 00:32:22,760 RONGXIN LIU: Yep. 635 00:32:22,760 --> 00:32:24,290 I'm going to play through this mic. 636 00:32:24,290 --> 00:32:26,340 Let's hope it works. 637 00:32:26,340 --> 00:32:28,280 AI CHATBOT: [SPEAKING BAHASA] 638 00:32:28,280 --> 00:32:53,820 639 00:32:53,820 --> 00:32:56,850 RONGXIN LIU: Do we want to ask a follow up question though? 640 00:32:56,850 --> 00:32:58,515 What should we keep asking? 641 00:32:58,515 --> 00:33:01,300 642 00:33:01,300 --> 00:33:01,800 Yes? 643 00:33:01,800 --> 00:33:05,440 644 00:33:05,440 --> 00:33:07,090 why 645 00:33:07,090 --> 00:33:10,830 AUDIENCE: Why is finance hard to solve? 646 00:33:10,830 --> 00:33:12,940 ASSISTANT: Why is finance hard to solve. 647 00:33:12,940 --> 00:33:13,660 RONGXIN LIU: OK. 648 00:33:13,660 --> 00:33:16,630 OK, thanks. 649 00:33:16,630 --> 00:33:24,070 Why is CS50's finance pset hard to solve? 650 00:33:24,070 --> 00:33:25,735 We just want to be specific, OK? 651 00:33:25,735 --> 00:33:37,606 652 00:33:37,606 --> 00:33:38,770 AI CHATBOT: Set. 653 00:33:38,770 --> 00:33:39,490 Problem set. 654 00:33:39,490 --> 00:33:41,590 [SPEAKING BAHASA] 655 00:33:41,590 --> 00:34:26,471 656 00:34:26,471 --> 00:34:27,929 RONGXIN LIU: OK, I think that's it. 657 00:34:27,929 --> 00:34:30,100 You get the idea what's happening here. 658 00:34:30,100 --> 00:34:31,725 I'm just going to go back to my slides. 659 00:34:31,725 --> 00:34:34,770 660 00:34:34,770 --> 00:34:41,639 With the other demo, we show you how we can build a generative AI chatbot. 
661 00:34:41,639 --> 00:34:44,730 Fairly easily, I have to say, because most of my job 662 00:34:44,730 --> 00:34:47,670 is also copy pasting code from the OpenAI documentation. 663 00:34:47,670 --> 00:34:49,679 So this is what I did. 664 00:34:49,679 --> 00:34:53,190 But there's one issue with these AI models; it's called hallucination. 665 00:34:53,190 --> 00:34:56,429 Meaning, again, the model doesn't really understand what we're asking. 666 00:34:56,429 --> 00:35:00,450 It's just trying to generate a response that best fits the question 667 00:35:00,450 --> 00:35:02,280 or message it gets sent. 668 00:35:02,280 --> 00:35:05,070 So sometimes we also need to fix this problem 669 00:35:05,070 --> 00:35:08,550 by giving the model more context when you are asking a question. 670 00:35:08,550 --> 00:35:12,720 Just give the model more information on how 671 00:35:12,720 --> 00:35:16,290 it should answer the question. 672 00:35:16,290 --> 00:35:19,790 So you might be wondering what this vector DB thing is. 673 00:35:19,790 --> 00:35:21,620 What RAG is. 674 00:35:21,620 --> 00:35:25,010 So RAG stands for retrieval-augmented generation. 675 00:35:25,010 --> 00:35:30,410 It's a fancy way of saying give the AI model a cheat sheet. 676 00:35:30,410 --> 00:35:36,380 What I mean is, when you ask a question, what we could do 677 00:35:36,380 --> 00:35:38,480 is to first create a vector, 678 00:35:38,480 --> 00:35:42,170 which is like an array of floats, or an array of numeric values, 679 00:35:42,170 --> 00:35:44,960 that the AI model can understand. 680 00:35:44,960 --> 00:35:47,690 We generate that representation. 681 00:35:47,690 --> 00:35:51,260 We also have a knowledge base ready, which is all the, 682 00:35:51,260 --> 00:35:56,300 let's just say, the lecture caption data, ready all in vector representation 683 00:35:56,300 --> 00:35:57,060 as well.
684 00:35:57,060 --> 00:35:59,540 So the moment you ask a question, what we could do 685 00:35:59,540 --> 00:36:03,500 is to search the whole database to find the best match, or the most 686 00:36:03,500 --> 00:36:06,530 relevant document, or caption, I should say. 687 00:36:06,530 --> 00:36:08,840 And then pull that in and put it in the prompt. 688 00:36:08,840 --> 00:36:12,440 And then we can just ask ChatGPT with the question 689 00:36:12,440 --> 00:36:15,380 as well as with the relevant caption data. 690 00:36:15,380 --> 00:36:17,940 I'm going to show you what I mean. 691 00:36:17,940 --> 00:36:22,620 So on the Ed discussion forum, the moment you post a question on Ed, 692 00:36:22,620 --> 00:36:26,250 we are creating a vector representation of your question. 693 00:36:26,250 --> 00:36:27,480 It looks like this. 694 00:36:27,480 --> 00:36:29,550 This is actually what it looks like. 695 00:36:29,550 --> 00:36:32,610 If I say "What is Flask?" 696 00:36:32,610 --> 00:36:36,240 the GPT-4 model doesn't read text, it reads this. 697 00:36:36,240 --> 00:36:44,670 So this is actually an array some 1,500-plus elements long, right. 698 00:36:44,670 --> 00:36:47,100 We have a database-- 699 00:36:47,100 --> 00:36:49,590 We have a database also sitting somewhere 700 00:36:49,590 --> 00:36:51,180 with all the lecture captions. 701 00:36:51,180 --> 00:36:53,610 For all the lectures that David has given, 702 00:36:53,610 --> 00:36:58,860 we create a caption database in vector representation. 703 00:36:58,860 --> 00:37:01,380 So we try to search through the entire database 704 00:37:01,380 --> 00:37:05,940 to find out in which part of the lecture David talked about Flask. 705 00:37:05,940 --> 00:37:09,750 So we located this particular document. 706 00:37:09,750 --> 00:37:15,870 These are the captions that are relevant to Flask. 707 00:37:15,870 --> 00:37:21,300 So what we do is literally grab this document along with the question.
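The retrieval step described above can be sketched with tiny hand-made vectors. In the real pipeline each vector comes from an embedding model (the array-of-floats representation on screen), and cosine similarity is a common way to rank the caption chunks against the question; the helper names and the three-element vectors here are purely illustrative:

```python
# Sketch of vector retrieval for RAG: rank knowledge-base entries by
# cosine similarity to the question vector, then paste the best match
# into the prompt alongside the question.
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(question_vec, knowledge_base):
    # knowledge_base: list of (caption_text, vector) pairs.
    return max(knowledge_base, key=lambda item: cosine(question_vec, item[1]))[0]

def build_prompt(question, caption):
    # "Grounding": concatenate the retrieved caption with the question.
    return f"{question}\n\nHere is some relevant information:\n{caption}"
```

In production the vectors would be ~1,500 floats each and the search would run inside a vector database rather than a Python `max`, but the shape of the computation is the same.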
708 00:37:21,300 --> 00:37:23,040 This is the actual prompt, right. 709 00:37:23,040 --> 00:37:25,890 A student asks, "What is Flask?" 710 00:37:25,890 --> 00:37:28,230 Under the hood, we find the information. 711 00:37:28,230 --> 00:37:30,270 We put it in the prompt, like, "What is Flask?" 712 00:37:30,270 --> 00:37:32,200 And then, "Here is some information." 713 00:37:32,200 --> 00:37:35,110 And we literally just concatenate the captions, 714 00:37:35,110 --> 00:37:37,720 so that's why it doesn't sound grammatically correct. 715 00:37:37,720 --> 00:37:40,720 Because it's just different caption segments that get 716 00:37:40,720 --> 00:37:41,810 concatenated together. 717 00:37:41,810 --> 00:37:42,880 But this is good enough. 718 00:37:42,880 --> 00:37:45,630 Again, the model doesn't really understand what I'm talking about. 719 00:37:45,630 --> 00:37:50,440 It just understands this looks like something about Flask. 720 00:37:50,440 --> 00:37:54,190 And then we prompt OpenAI with this prompt, actually. 721 00:37:54,190 --> 00:37:56,515 It's not your question, it's actually a bigger prompt. 722 00:37:56,515 --> 00:37:59,080 723 00:37:59,080 --> 00:38:01,030 And this whole process is called grounding. 724 00:38:01,030 --> 00:38:05,080 So we try to ground the model to generate a more faithful or more 725 00:38:05,080 --> 00:38:05,860 accurate answer. 726 00:38:05,860 --> 00:38:08,530 727 00:38:08,530 --> 00:38:11,470 And after all the coding, 728 00:38:11,470 --> 00:38:15,370 after all this detailed, low-level implementation, 729 00:38:15,370 --> 00:38:19,810 it's actually not that complicated to create a CS50 Duck. 730 00:38:19,810 --> 00:38:22,270 So OpenAI also has this new feature called 731 00:38:22,270 --> 00:38:26,440 GPTs, where you can just go to the website and create your own GPT. 732 00:38:26,440 --> 00:38:29,630 You can actually create the CS50 Duck right on there.
733 00:38:29,630 --> 00:38:33,460 So what I did is I literally just told the GPT builder 734 00:38:33,460 --> 00:38:37,360 I want to build a CS50 AI tutor. 735 00:38:37,360 --> 00:38:40,855 And it will actually generate a system prompt for me. 736 00:38:40,855 --> 00:38:43,960 It even generated a logo for me. 737 00:38:43,960 --> 00:38:45,220 And I can-- 738 00:38:45,220 --> 00:38:48,820 And it's ready to actually receive questions. 739 00:38:48,820 --> 00:38:54,025 So this is actually a CS50 Duck on OpenAI's ChatGPT store. 740 00:38:54,025 --> 00:38:57,970 741 00:38:57,970 --> 00:39:02,710 So to conclude, that's what's behind the scenes of the CS50 Duck. 742 00:39:02,710 --> 00:39:05,600 743 00:39:05,600 --> 00:39:07,310 And this is-- 744 00:39:07,310 --> 00:39:10,260 What it all adds up to is the experience that we provide to the students. 745 00:39:10,260 --> 00:39:12,260 And this is one of the quotes from the students. 746 00:39:12,260 --> 00:39:14,390 "It felt like having a personal tutor." 747 00:39:14,390 --> 00:39:19,220 Thanks to the system role, "It will just answer the question without ego." 748 00:39:19,220 --> 00:39:21,217 We showed that AI has no feelings, right. 749 00:39:21,217 --> 00:39:22,550 When I went "How are you today?" 750 00:39:22,550 --> 00:39:24,425 "I have no feelings as a generative AI model." 751 00:39:24,425 --> 00:39:27,830 "Blah, blah, blah, without judgment." 752 00:39:27,830 --> 00:39:31,250 Because we tell the duck, "You're a rubber duck," so it has some-- 753 00:39:31,250 --> 00:39:36,770 It has some personality to it. 754 00:39:36,770 --> 00:39:40,430 And of course, the conversation can go on and on and on and on until the model 755 00:39:40,430 --> 00:39:41,870 cannot hold it anymore. 756 00:39:41,870 --> 00:39:45,920 757 00:39:45,920 --> 00:39:48,830 And finally, right, so. 758 00:39:48,830 --> 00:39:50,060 I updated the prompt, right. 759 00:39:50,060 --> 00:39:51,030 It's a cute cat now. 760 00:39:51,030 --> 00:39:51,697 This is the cat.
761 00:39:51,697 --> 00:39:54,140 And then I also specify "cinematic, ultra realistic," 762 00:39:54,140 --> 00:39:55,910 and then you will get this. 763 00:39:55,910 --> 00:40:01,070 So with that, that concludes my session. 764 00:40:01,070 --> 00:40:02,670 Any questions? 765 00:40:02,670 --> 00:40:03,170 Thank you. 766 00:40:03,170 --> 00:40:07,100 767 00:40:07,100 --> 00:40:07,600 Yes? 768 00:40:07,600 --> 00:40:13,370 769 00:40:13,370 --> 00:40:15,980 AUDIENCE: Thank you, my name is Fathia. 770 00:40:15,980 --> 00:40:20,600 I want to ask about, with the development of AI like that, 771 00:40:20,600 --> 00:40:22,730 is there any difference-- 772 00:40:22,730 --> 00:40:28,550 Is there any way you can differentiate whether this is a response from AI 773 00:40:28,550 --> 00:40:30,710 or this is a response from a human? 774 00:40:30,710 --> 00:40:36,800 Like when we go to a website and check "You are not a robot." 775 00:40:36,800 --> 00:40:40,080 How can they differentiate if this is a robot or a human? 776 00:40:40,080 --> 00:40:40,580 OK. 777 00:40:40,580 --> 00:40:40,760 RONGXIN LIU: OK. 778 00:40:40,760 --> 00:40:41,370 AUDIENCE: Thank you. 779 00:40:41,370 --> 00:40:42,770 RONGXIN LIU: Thank you for the question. 780 00:40:42,770 --> 00:40:44,603 I think the question is asking about how you 781 00:40:44,603 --> 00:40:50,930 can tell if a submitted answer or homework is done with AI or by a human. 782 00:40:50,930 --> 00:40:53,000 The answer is: it's getting hard. 783 00:40:53,000 --> 00:40:56,450 It's actually very difficult. Although my personal heuristic is: 784 00:40:56,450 --> 00:40:59,480 if a student's submitted homework starts with "Sure" 785 00:40:59,480 --> 00:41:02,870 or starts with "Certainly," I think that's an AI response, because that's usually 786 00:41:02,870 --> 00:41:04,760 what ChatGPT begins with.
787 00:41:04,760 --> 00:41:08,960 Or usually these AI models will use, like, very obscure-- 788 00:41:08,960 --> 00:41:12,380 Like, GRE vocabulary, which is another heuristic. 789 00:41:12,380 --> 00:41:16,820 But to answer your question, it's difficult, if not impossible. 790 00:41:16,820 --> 00:41:18,170 Because it's a very-- 791 00:41:18,170 --> 00:41:23,030 Because yeah, as you can see, it's capable of even generating speech, 792 00:41:23,030 --> 00:41:24,710 right. 793 00:41:24,710 --> 00:41:29,720 I guess in an education context, the one way 794 00:41:29,720 --> 00:41:33,500 you can tell if a student is using a generative AI tool, for example 795 00:41:33,500 --> 00:41:38,150 in computer science, is you can look at their past submissions and compare. 796 00:41:38,150 --> 00:41:43,610 If the way they write code shifts a lot-- if it shifts a lot, 797 00:41:43,610 --> 00:41:46,160 you cannot say for sure they used a generative tool, right. 798 00:41:46,160 --> 00:41:49,720 But you can guess they may be using something. 799 00:41:49,720 --> 00:41:52,720 Because in week one you are writing code like this, 800 00:41:52,720 --> 00:41:55,630 you put four spaces for each tab. 801 00:41:55,630 --> 00:41:59,990 But then on the next one, you do three spaces or something like that. 802 00:41:59,990 --> 00:42:01,210 But again, it's hard. 803 00:42:01,210 --> 00:42:02,245 It's hard to tell. 804 00:42:02,245 --> 00:42:08,880 805 00:42:08,880 --> 00:42:10,862 You-- you can come up next. 806 00:42:10,862 --> 00:42:17,130 807 00:42:17,130 --> 00:42:19,380 AUDIENCE: OK, good morning, Mr. Rongxin. 808 00:42:19,380 --> 00:42:24,720 My name is Leo and I'm from a school in Tangerang. 809 00:42:24,720 --> 00:42:30,090 At the moment I have joined in developing 810 00:42:30,090 --> 00:42:34,440 a curriculum about computer science. 811 00:42:34,440 --> 00:42:41,520 And we are being asked to use AI for our students 812 00:42:41,520 --> 00:42:44,130 in their, like, assignments or something like that.
813 00:42:44,130 --> 00:42:52,380 And maybe I need your advice about the regulations for using AI in school. 814 00:42:52,380 --> 00:42:56,790 Maybe, what aspects do we have to be concerned about 815 00:42:56,790 --> 00:43:00,473 to make the regulations at school? 816 00:43:00,473 --> 00:43:01,140 RONGXIN LIU: OK. 817 00:43:01,140 --> 00:43:02,980 AUDIENCE: And one more question. 818 00:43:02,980 --> 00:43:03,960 Yeah. 819 00:43:03,960 --> 00:43:10,950 Is there any characteristic or sign by which we can tell whether 820 00:43:10,950 --> 00:43:12,930 this is a hallucination or not? 821 00:43:12,930 --> 00:43:17,050 Because maybe at the moment we here also 822 00:43:17,050 --> 00:43:23,950 use AI, maybe for our teaching material, and we 823 00:43:23,950 --> 00:43:27,633 need to make sure that material is not a hallucination from AI. 824 00:43:27,633 --> 00:43:28,300 RONGXIN LIU: OK. 825 00:43:28,300 --> 00:43:28,970 AUDIENCE: Thank you. 826 00:43:28,970 --> 00:43:29,930 RONGXIN LIU: Thank you for the questions. 827 00:43:29,930 --> 00:43:31,390 So the first question is about policy. 828 00:43:31,390 --> 00:43:33,280 I'm not an expert on the regulation. 829 00:43:33,280 --> 00:43:35,200 But I think-- 830 00:43:35,200 --> 00:43:39,353 So of course, you should always refer to the country level-- 831 00:43:39,353 --> 00:43:41,770 there must be some regulation, high-level regulation, 832 00:43:41,770 --> 00:43:45,430 on whether you can use this kind of technology or not within the country, 833 00:43:45,430 --> 00:43:47,170 within a jurisdiction. 834 00:43:47,170 --> 00:43:50,710 But I think, minimally, you should protect students' privacy. 835 00:43:50,710 --> 00:43:54,940 So all the things you send to OpenAI, they can keep, right. 836 00:43:54,940 --> 00:43:58,916 So you should do the job to first do the PII-- 837 00:43:58,916 --> 00:44:02,380 838 00:44:02,380 --> 00:44:04,930 Let me actually put up the diagram.
839 00:44:04,930 --> 00:44:09,070 So whatever you send to OpenAI, your credit card number, everything, 840 00:44:09,070 --> 00:44:11,780 is going to be stored in their database. 841 00:44:11,780 --> 00:44:16,390 So if a student is using this kind of LLM model, 842 00:44:16,390 --> 00:44:20,580 this tutor thing, whenever they have personal information, 843 00:44:20,580 --> 00:44:26,220 you should try your best to strip it away from their prompt. 844 00:44:26,220 --> 00:44:29,820 So we actually have a PII anonymization process going. 845 00:44:29,820 --> 00:44:33,030 So whenever we detect, like, an email address, a possible phone 846 00:44:33,030 --> 00:44:36,570 number, or a name or something, we actually scrub that. 847 00:44:36,570 --> 00:44:39,420 Like, replace it with some gibberish or some placeholder 848 00:44:39,420 --> 00:44:41,400 before we actually send it to OpenAI. 849 00:44:41,400 --> 00:44:42,660 We also have-- 850 00:44:42,660 --> 00:44:44,730 We also anonymize all the request IDs, 851 00:44:44,730 --> 00:44:49,260 meaning from OpenAI's perspective, they don't know who is making this API call. 852 00:44:49,260 --> 00:44:52,510 They cannot trace back to which user is asking this question. 853 00:44:52,510 --> 00:44:56,040 So I think that's the thing you should consider, like, privacy. 854 00:44:56,040 --> 00:44:58,530 In terms of what kind of AI technology you could use, 855 00:44:58,530 --> 00:45:03,600 you should refer back to the country-level, you know, regulation. 856 00:45:03,600 --> 00:45:07,512 For the hallucination, if you are a domain expert, 857 00:45:07,512 --> 00:45:09,720 maybe you could tell if it is a hallucination or not. 858 00:45:09,720 --> 00:45:12,030 Because for regular people, who are not-- 859 00:45:12,030 --> 00:45:15,930 Let's just say I'm getting an answer about quantum mechanics.
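The scrubbing idea described above, replacing likely email addresses and phone numbers with placeholders before the prompt ever leaves your server, can be sketched as below. Real PII-anonymization systems use far more robust detection (names, IDs, street addresses); the two regex patterns here are purely illustrative:

```python
# Minimal sketch of PII scrubbing before a prompt is sent to a third-party
# API: swap likely emails and phone numbers for placeholders.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def scrub_pii(prompt: str) -> str:
    # Order matters little here; each pattern replaces its own matches.
    prompt = EMAIL.sub("[EMAIL]", prompt)
    prompt = PHONE.sub("[PHONE]", prompt)
    return prompt
```

Running every outgoing prompt through a function like this (plus stripping user identifiers from the request itself) is what keeps the upstream provider from being able to tie stored prompts back to a student.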
860 00:45:15,930 --> 00:45:19,390 I'm no expert in quantum mechanics, so I cannot know for sure if there is 861 00:45:19,390 --> 00:45:20,590 a hallucination happening. 862 00:45:20,590 --> 00:45:23,560 So you need-- You really need a subject-matter expert, a human, 863 00:45:23,560 --> 00:45:28,660 to evaluate the response at your school level, 864 00:45:28,660 --> 00:45:31,450 maybe, if you want to validate the response. 865 00:45:31,450 --> 00:45:35,620 But companies nowadays, like OpenAI, also have their evaluation teams. 866 00:45:35,620 --> 00:45:38,590 They also hire subject experts to "red team" 867 00:45:38,590 --> 00:45:41,020 their model, evaluate their model, trying 868 00:45:41,020 --> 00:45:45,670 to improve the accuracy of the response. 869 00:45:45,670 --> 00:45:50,285 Actually, one thing you could do with these AI models, in the system role, 870 00:45:50,285 --> 00:45:52,660 you can say "If you don't know, just say you don't know." 871 00:45:52,660 --> 00:45:54,880 So that actually can prevent hallucination, 872 00:45:54,880 --> 00:45:57,730 because when the model actually doesn't know, it will just say "Sorry, 873 00:45:57,730 --> 00:45:59,710 I don't know." 874 00:45:59,710 --> 00:46:03,970 But yeah, humans still need to be involved in this whole process. 875 00:46:03,970 --> 00:46:06,220 Like, AI cannot fully replace humans. 876 00:46:06,220 --> 00:46:11,075 We just want to augment humans, right, so. 877 00:46:11,075 --> 00:46:12,325 We can have one more question. 878 00:46:12,325 --> 00:46:17,690 879 00:46:17,690 --> 00:46:21,200 Maybe that one from the back. 880 00:46:21,200 --> 00:46:22,190 Oh, sorry. 881 00:46:22,190 --> 00:46:23,020 No, that's OK. 882 00:46:23,020 --> 00:46:23,975 Let's just get, yeah. 883 00:46:23,975 --> 00:46:30,840 884 00:46:30,840 --> 00:46:34,895 Post your question on Ed, and we will follow up with the duck's response. 885 00:46:34,895 --> 00:46:39,160 886 00:46:39,160 --> 00:46:40,160 AUDIENCE: OK, thank you.
887 00:46:40,160 --> 00:46:42,470 My name is Mohamed Hanif. 888 00:46:42,470 --> 00:46:51,080 I saw on the internet there is a combination between AI text and video. 889 00:46:51,080 --> 00:46:56,510 You can put in your text saying anything you want, and then beside it 890 00:46:56,510 --> 00:47:02,450 there is, like, an AI person speaking what you typed. 891 00:47:02,450 --> 00:47:06,800 So yeah, my question is-- 892 00:47:06,800 --> 00:47:12,380 I heard Elon Musk say that AI is dangerous, too. 893 00:47:12,380 --> 00:47:18,380 My question to you is, will AI replace teachers in school? 894 00:47:18,380 --> 00:47:22,550 And second, is AI dangerous? 895 00:47:22,550 --> 00:47:25,190 Thank you. 896 00:47:25,190 --> 00:47:28,430 RONGXIN LIU: OK, so the first question is, will AI replace 897 00:47:28,430 --> 00:47:30,530 teachers, or educators in general? 898 00:47:30,530 --> 00:47:32,960 The second question is, is AI dangerous? 899 00:47:32,960 --> 00:47:39,320 To answer the first question, I will say AI could not replace educators. 900 00:47:39,320 --> 00:47:41,330 That's my claim. 901 00:47:41,330 --> 00:47:44,045 It's going to augment humans. 902 00:47:44,045 --> 00:47:47,120 903 00:47:47,120 --> 00:47:48,800 That's actually what we're trying to do. 904 00:47:48,800 --> 00:47:50,570 We are-- We are picking a different route. 905 00:47:50,570 --> 00:47:52,850 We are not trying to replace our teaching assistants. 906 00:47:52,850 --> 00:47:54,868 We are trying to augment our teaching assistants, 907 00:47:54,868 --> 00:47:58,160 so that our teaching assistants can have more quality time with the students. 908 00:47:58,160 --> 00:47:59,618 So that's what we are trying to do. 909 00:47:59,618 --> 00:48:02,450 So you get-- You have the option 910 00:48:02,450 --> 00:48:04,160 to choose which way you should go, right.
911 00:48:04,160 --> 00:48:07,770 If AI replaces educators, it must be some human's decision, not 912 00:48:07,770 --> 00:48:09,360 the AI actually replacing humans. 913 00:48:09,360 --> 00:48:11,010 So I hope that answers your question. 914 00:48:11,010 --> 00:48:14,530 The second question, I can't answer that, because I don't know if AI is dangerous 915 00:48:14,530 --> 00:48:15,030 or not. 916 00:48:15,030 --> 00:48:18,450 I think AI is helpful in this presentation. 917 00:48:18,450 --> 00:48:22,260 In this workshop, at least, in CS50, we think AI is very helpful. 918 00:48:22,260 --> 00:48:25,530 Of course, great technology also comes with consequences. 919 00:48:25,530 --> 00:48:29,310 So it's really up to us how to use this technology. 920 00:48:29,310 --> 00:48:31,890 Is electricity dangerous or not? 921 00:48:31,890 --> 00:48:33,420 It could be, it could be not, right. 922 00:48:33,420 --> 00:48:36,000 So I hope that answers your question. 923 00:48:36,000 --> 00:48:37,390 Thank you. 924 00:48:37,390 --> 00:48:38,250 Thank you. 925 00:48:38,250 --> 00:48:41,300 [APPLAUSE] 926 00:48:41,300 --> 00:48:44,000