SPEAKER 1: Thank you, Joey, for that lovely introduction. I'm so honored and thrilled to be here and to be part of welcoming you to the Harvard community. I want to echo President Garber's remarks earlier about how fun it is when a student walks into my office and says, hey, I'm concentrating in classics and computer science, and I need to write a thesis at the intersection of the two. And I'm like, you've come to the right place.

But today I want to talk to you about something else, because I specialize in human-computer interaction. It's kind of like applied psychology for computers and the people who interact with them. I want to talk to you about something you're probably already doing every day, which is interacting with AI. And I'm particularly interested in what those interactions are like: whether they are accelerating you toward what you would ultimately do if you had infinite time and resources, or whether they are, in subtle and not-so-subtle ways, affecting the goals that you choose and the quality of the decisions and the output that you ultimately get with them.

If you look at the research today, which is exploding, there are a lot of different terms you could search should you want to look up some papers. I'm very proud that my field of human-computer interaction is very accessible, so in my classes at the undergraduate level I will often ask you to engage with the latest scholarship. And you can do that even before you get here. You can search for human-AI teaming, human-AI co-creation, AI-assisted creativity, or AI-assisted decision making, among others.

So I thought, well, maybe I can ask the AI to help me generate some images to go along with this slide. I usually don't. But, along with the theme, I asked it to generate an image of a programmer with a robot looking over their shoulder at their code, because one of the things that I study is AI-assisted programming, which is quite relevant to the students in my department of computer science. And you can see it generated this.

Anyone want to shout out some things you notice about this AI-generated image?

SPEAKER 2: Not over his shoulder.

SPEAKER 1: No.
It does say it's looking over his shoulder, but it's clearly not. It's kind of creepily looking at the viewer. What else do you notice?

SPEAKER 3: The robot is a white male.

SPEAKER 1: Yes, it is. Yes, he is. Two in one right there. Any others? There's one more you might notice.

SPEAKER 4: It's in the dark.

SPEAKER 1: It's in the dark. Programming only happens in the dark. I'm sorry?

SPEAKER 4: The code is nonsense.

SPEAKER 1: The code is also probably nonsense. And if you can read it, I'm impressed.

So I thought, well, I know in my mind this is not quite what I wanted. I didn't know exactly what I wanted when I put this in, but this is not it. So I asked, and you can see it up at the top: I said, the robot is looking at the programmer, not over the programmer's shoulder at the code. Can you fix that?

A little better. It's still a white man. It's still in the dark. The robot's kind of looking at his code. All right.

AUDIENCE MEMBER: And the robot looks the same, too.

SPEAKER 1: Yes, it's a very consistent robot. This particular model, given the training data it is reflecting back to you, is making some choices very consistently that don't necessarily reflect real life.

So I said, please change the programmer to be a college-aged woman. I'm a woman. Many women in my class were college-aged. And it generated this. OK. Well.

AUDIENCE MEMBER: The robot is actually looking at the code now.

SPEAKER 1: The robot's now looking at the code. It's still dark. She's still white. I'm kind of going through and noting these things. It's not to say that there aren't programmers who come from all different identities; there are. And programming happens at, truly, all hours of the day. But the model is reflecting, perhaps, default assumptions that are embedded in its training data.
And what I want to point out: I kept going back and forth with it, and eventually I landed on this one. And I'm like, all right, at this point I'm going to stop. Because one of the first and most important points I'm trying to make here is that I put in a very generic specification, and it made a whole bunch of arbitrary choices on my behalf. And the choices that are consistent with my conscious or unconscious expectations are the ones I'm least likely to notice. We cannot think critically about the choices that are made on our behalf, especially algorithmic ones, if we don't notice them first. So I just want to flag that as something for us to keep in mind.

But at this point I thought, this is taking more time than it's supposed to, so I'm not going to use AI-generated images. I'll just draw. OK.

[LAUGHTER]

All right. So another thing that matters is cognitive engagement. Even if you notice something, if you don't really, truly cognitively engage with it, that criticality of thought is not going to come through. My student [? Priyan, ?] here on the right, did a really nice study of some programmers. The first group didn't get any assistance. They just did a programming problem, and, of course, there was some sort of approximately normal curve of how long it took them to finish. Then we said, OK, here's some AI. Do another one. And some people got shifted to the left: a shorter time. They were like, ah, I'm just done. Thank you. AI is great. And the others never finished.

What do you think happened? Does anyone want to guess?

AUDIENCE MEMBER: They had a very specific thing they wanted, and the AI couldn't create exactly that.

SPEAKER 1: That is not what we observed, but I definitely think that's happening in other contexts. Any other guesses?

AUDIENCE MEMBER: It got distracted.

SPEAKER 1: Another excellent guess. OK, all right.
I'll tell you what we think happened. But, literally, this is from 2022, as you can see. If you come to Harvard, we can do more empirical studies about this and figure out exactly what's going on. One of the things that happened in this context was that people were not really cognitively engaging with the suggestions the AI was making. If they got lucky, they were just like, yep, looks good to me, looks good to me, accept, accept, accept, and they got code that passed the test cases, and they were done early. And if they were unlucky: accept, accept, looks good to me, looks good to me... oh no. I have a lot of code that I didn't really write, I don't really know exactly what it does, and it's not passing the test cases. And now I will use up all the remainder of my time trying to debug code I didn't write that's subtly wrong. But it looks right.

Anyway, this has been, surprisingly, one of our most influential papers. One other thing: follow-up work by my colleague Krzysztof Gajos, with a colleague of his, found that cognitive engagement is also necessary for you to even learn from the interaction. So if you're trying to get better at something and you're not cognitively engaging while you're being assisted by AI, you may never gain additional competency. And I don't know about you, but I like getting better at stuff, at least for many things. It's OK if, for some things, I don't get better. I don't get better at navigating Cambridge, and I'm OK depending on my GPS. It's a little confusing.

All right. So another domain you may already have experienced is AI-assisted writing, and really, in simpler contexts, like writing emails. Some of those emails can be pretty boilerplate. It's not hard to guess that "are you" comes after "how" at the beginning of an email. But another colleague of mine found that predictable text encourages predictable writing.
When text is being predicted for them, people are strategically lazy. That's OK. But they will write more efficiently and less thoughtfully. So if we're thinking about how the outcome of our work changes when we have AI assistants, in this case the outcomes were less verbose but also less colorful, a little bit more stereotypical.

One thing that I found in a follow-up study was that it doesn't have to be this way. Part of being in my field, human-computer interaction, is that we can critique the systems and how they're currently working. But we're also engineers, so we can build better systems. We don't have to just point out the problems; we can try to solve them.

So here we had a study where we had people do creative writing, writing a story. And we had AI-generated text that might be relevant to the evolution of their story, AI-generated images, and AI-generated sounds. It was kind of a multimedia experience inspired by whatever they had been writing. The suggestions, because this was two years ago, were not as good as they are today. But the point is not how good they were; it's what people did with them. And because they were part of the environment, and not additional gray text directly in the text buffer, people used them differently. One person was writing a story and wasn't sure how to resolve a plot point. They saw that the system had generated an image of an elephant, which felt pretty random. But it was their human creativity that made what we called an "integrative leap": they said, ah, I will solve the problem that I've created in this plot. I realize that one of the characters has stolen an elephant, and this is why the detective has come to their door. So this was not limiting. It was helping them get past stuck points, but it was really them in the driver's seat, and it was not impacting the predictability of what they were writing.
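As a minimal, hypothetical sketch of that design choice (this is not the study's actual system), the idea is that the draft only ever contains the writer's own words, while AI-generated material is refreshed into a separate side panel beside it. The Suggestion and WritingSession names and the fake generator functions below are invented for illustration.

    # Sketch: ambient suggestions live beside the draft, never inside it.
    # All names here are hypothetical stand-ins, not the real system's API.
    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class Suggestion:
        kind: str      # "text", "image", or "sound"
        content: str   # e.g., a snippet, an image path, an audio path

    @dataclass
    class WritingSession:
        draft: List[str] = field(default_factory=list)        # only the writer's own words
        side_panel: List[Suggestion] = field(default_factory=list)

        def add_paragraph(self, paragraph: str,
                          generators: List[Callable[[str], Suggestion]]) -> None:
            # The draft records only what the human wrote.
            self.draft.append(paragraph)
            # Ambient suggestions are refreshed from recent context, but they sit
            # beside the draft; "accepting" one happens only by the writer writing.
            context = " ".join(self.draft[-3:])
            self.side_panel = [generate(context) for generate in generators]

    # Placeholder generators; real ones would call generative models.
    def fake_image_generator(context: str) -> Suggestion:
        return Suggestion("image", f"an image loosely inspired by: {context[:40]}...")

    def fake_text_generator(context: str) -> Suggestion:
        return Suggestion("text", f"a phrase riffing on: {context[:40]}...")

    session = WritingSession()
    session.add_paragraph("The detective knocked, and nobody could explain why.",
                          [fake_image_generator, fake_text_generator])
    for s in session.side_panel:
        print(s.kind, "->", s.content)

In the study the suggestions were multimedia (text, images, sound); the only point mirrored here is that the system never edits the writer's text, so any integrative leap remains the writer's own.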
Here are some other places where you may not think of yourself as being AI-assisted in your decision making, just to call them out. We receive driving recommendations, even with metadata about the impact on the environment or on our pocketbooks, when we're driving. The media we consume may be recommended to us by algorithms that surface content they think will be more appealing to us than other content. They may be wrong. Even when you come to college, you may engage with our Datamatch and other algorithms for matching people. We should continue to study the effects of these systems and how they affect us.

Sometimes it's easier to recognize when a system is not quite serving our needs correctly. Does anyone notice what happened here when I was googling, not using AI-generated images anymore, just seeing if I could find some clip art?

AUDIENCE MEMBER: Humans, and it's not humans--

SPEAKER 1: It's not humans. There--

AUDIENCE MEMBER: The first one's human.

SPEAKER 1: There are two images on here that have humans interacting with humans, working together, which was my query. The rest are humans interacting with robots. So I hope, now that I'm pointing it out, you will continue to notice these things. You have to keep your brain on when you're interacting with these systems.

Another really critical situation: how many people have asked ChatGPT to summarize a document for you? Yeah, OK. Be careful.

[LAUGHTER]

AUDIENCE MEMBER: And then read the document and scrutinize--

SPEAKER 1: Exactly, you have to read the entire document. And that's tedious. You should come to my class, OK? I'm serious.

AUDIENCE MEMBER: I probably will.

SPEAKER 1: AI-generated summaries can omit something that the model decides is not important enough, but that would actually fundamentally change how you interpret the text. They may hallucinate things that are plausible but not found in the original document. And they may even just subtly misrepresent something that's critical for you to understand correctly.
And summaries are just onerous and memory-intensive to assess; many people don't read the entire original document. Again, Google gives me so much material. I was googling... a lot of my friends are also in their late 30s, and many of them are starting families. And I remembered hearing at some point that twins were more likely as you got older. So I googled twin probability with maternal age. And it said, "According to BabyCenter, the chances of having fraternal twins are 6.9% for women 35 to 37." And I was like, wow, that's quite high. I should be meeting more twins. But I accepted it.

It wasn't until much later that I actually went and looked at the source. It turns out these numbers, which for some reason left off women under 35, are numbers for people receiving assisted reproductive technology interventions, where multiple embryos are intentionally transferred to maximize the chances of a live birth. So this completely changes the interpretation. These twins are intentional, as opposed to what I was asking about, which was some natural feature of reproductive age. There was no information scent to signal to me the missing context that completely changed the meaning of the numbers it gave me. And I know it says generative AI is experimental, but we have to be really, really careful. How do we design interfaces that give people the information scent, or the context, to notice when the AI is making a choice that doesn't make sense for their query or their situation?

So who's driving? Well, if you're not cognitively engaged, or supported in noticing AI choices, or given enough context to judge an AI choice well, the AI might be driving more than you think. But again, we're engineers. I'll tell you about one solution we've come up with recently, called grammar-preserving text saliency modulation, which is a totally different approach. What we do is recursively cut the least semantics-modifying words from a sentence, and then we reify that cutting order in the saliency of the sentence.
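To make that recursive cutting concrete, here is a minimal sketch in Python of the loop as just described. It is not the actual grammar-preserving text saliency modulation implementation: the helpers removal_cost and still_grammatical are invented placeholders for the meaning-preservation and grammaticality checks a real system would run with language models. The sketch only shows the shape of the idea: cut the cheapest word each round, remember when each word was cut, and render words that survive longer in darker text.

    # Sketch only: placeholder heuristics stand in for real grammar/meaning models.
    def removal_cost(word):
        # Placeholder: pretend short function words carry the least meaning.
        return len(word)

    def still_grammatical(words):
        # Placeholder: keep at least a three-word skeleton.
        return len(words) >= 3

    def gp_tsm_sketch(sentence):
        words = sentence.split()
        alive = list(range(len(words)))
        cut_round = [None] * len(words)   # None = never cut = most salient
        rnd = 0
        while True:
            rnd += 1
            # Consider only cuts that keep the remaining sentence "grammatical."
            candidates = [i for i in alive
                          if still_grammatical([words[j] for j in alive if j != i])]
            if not candidates:
                break
            victim = min(candidates, key=lambda i: removal_cost(words[i]))
            alive.remove(victim)
            cut_round[victim] = rnd
        # Map cut round to darkness in (0, 1]: cut earlier -> lighter, survivors -> darkest.
        total = rnd
        return [(w, 1.0 if cut_round[i] is None else cut_round[i] / total)
                for i, w in enumerate(words)]

    if __name__ == "__main__":
        for word, darkness in gp_tsm_sketch(
                "The world is at present accumulating carbon dioxide in the atmosphere "
                "from two well-known sources, the combustion of fossil fuels and deforestation."):
            print(f"{darkness:.2f} {word}")

Because every word keeps a nonzero darkness and its original position, nothing is hidden or reordered; the words cut earliest are simply the lightest.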
So we start by cutting just a little. The example sentence here reads, "The world is at present accumulating carbon dioxide in the atmosphere from two well-known sources, the combustion of fossil fuels and deforestation." You can see that the later a word was cut, while still preserving the grammaticality and the overall meaning of the sentence, the darker it remains in the original sentence. If you do this for an entire paragraph from the GRE, it looks like this. What's beautiful about this is that there are extractive summaries at multiple levels of granularity all within the same text. You can change the level of detail as you read. And if it mistakenly makes less salient a piece of text that you realize, for your context, is exactly what you need to read, you see it. It's still there. It's still legible. It's not hidden. So we call this an AI-resilient alternative to AI-generated summaries. We found that people were able to read faster and answer questions more accurately when reading this rather than normal text, as well as compared with one other control condition that modulated text salience in a different way.

So, in summary, I think we should be inviting, facilitating, or even forcing cognitive engagement, even though that is sometimes unpleasant for users. We need to design interfaces that help people better notice the AI choices made on their behalf and, at the same time as making those choices visible, provide the context necessary for you to decide whether each choice is right for you.

On that note, I will pass it on to our final speaker, who I'm very excited about.

[APPLAUSE]