FIERY CUSHMAN: Oh, man, I am so excited to be here. This is my second Welcome Weekend at Harvard; I attended my first one many, many years ago. And I still have friends I met that weekend, friends I'm in touch with and see at reunions.

When I came here, I studied biology. But I kept taking philosophy, because in biology the questions were really interesting but a little bit small for me, while in philosophy the questions were really big and exciting but it felt like maybe we weren't making as much progress on them as I wanted. And then I kind of stumbled into psychology at the very end of my time at Harvard, in my senior year. And now I teach social psychology.

I start that lecture by trying to explain to students why I've organized my life around studying this thing, why, every single morning, as soon as I've got the kids at school, I race to the office to study psychology all day long, come home to cook dinner for the kids, put them to bed, pull out my laptop, and get right back to psychology again.

And although the thing I've studied for the last 15 years or so is mostly moral judgment, and I find moral judgment a fascinating part of our psychology, when I teach undergraduates I start with two slides focused on those big, almost philosophy-level questions, the ones that really get me out of bed in the morning. These two slides summarize, in two pictures, the question that I find so motivating.

So here's the first picture. This is a picture of the world at night, as if you were approaching it from outer space. For the first 3.5 billion years that there was life on Earth, this is roughly what you would have seen from outer space. And then the second picture is the world tonight. After all this time of life evolving on Earth, there is one species that so radically transforms everything going on here that if you knew nothing about the planet Earth and were coming from outer space, it would be the first thing you would see.
And when you got here, it would just be totally clear that there is one kind of life doing something way different from all the other kinds of life. And I'm trying to understand how that happened. What is it that makes us so different from all the other species on Earth?

This is a question people have been asking for a really, really long time. In my corner of the literature, my corner of the academic world, people have offered two different types of answers. One of them is pretty intuitive. It says, look, we evolved very big and powerful brains that allow us to reason and solve problems. And the other key thing we have is culture, which operates a little bit like a gigantic textbook. When people have smart ideas, we pass them on to the people who come after us in a chain: I pass it to you, you pass it to your kids, or to the people you teach in school one day. According to this view, most of the heavy lifting is being done by our intellects, but culture and social learning play a role too, the way a textbook does, just carrying the information from generation to generation. It seems like that has to be part of the answer. I mean, there are textbooks, and we do communicate that way.

But recently, people have emphasized another, very different type of answer, which is that culture is a little bit more like our DNA. Our DNA is modified by random mutations; we don't have intelligent cells or intelligent transcription. There are just purely random accidents, but some of them turn out to be useful, and because they're successful, they get passed on. People have said, maybe culture works that way too. Maybe we're kidding ourselves in thinking that we come up with great ideas using our intellect. Maybe we just randomly stumble into different ways of living, different values, different beliefs. And the people who have good or useful values and beliefs are more successful, so they get copied by other people. It's just like DNA evolving, but through social learning.
This is a kind of weird way of thinking, and I'm not sure it's right; I just find it interesting. So I want to give you one of the examples that people in the literature have used to suggest that something like this could be going on. There are many different examples, but the one I'll talk about is called nixtamalization.

For many people who came to the Americas from other parts of the world, the way we make corn flour and eat corn is simple: we either eat the corn, or, if we want corn flour, we dry it out, grind it up, and bake with it. But if you look at indigenous communities, they do something different. They make masa. They make hominy. These are things that take a lot of work to make. Here's a little schematic of the process, called nixtamalization, that transforms corn into things like hominy and masa. It has a lot of steps. It's a little bit complicated, though not that complicated. Why are they putting all this effort into it? If you went and asked them, they would say, well, it's a little easier to chew. Or they would just say, look, how else would you make a corn tortilla? This is the way you do it.

I want to draw your attention to one particular part of this process, which is counterintuitive. See step two: it says add lime and cook. You might be forgiven for thinking they're talking about the citrus fruit lime. But that's not what they're talking about at all. They're talking about calcium hydroxide, a very alkaline chemical which, if you ate it in concentrated form, could give you chemical burns. It's not good for you at all. These indigenous communities get it by burning things; they take the ash, which contains a lot of different alkalis, especially calcium hydroxide, combine it with the corn, and let it sit for a long time. And then they have to wash it out, because it's actually not good for you.
And again, they don't have much of an explanation for why you would want to make your corn products this way. But when scientists went and studied it, they realized that doing this literally quadruples the nutritional value of corn. In particular, it makes vitamin B3 available, which was otherwise absent from the diets of the communities that relied on this practice. And the civilizations we know about, enormous, powerful civilizations in the Americas, would not have been possible had people not found a way to get more nutritional value out of corn than it would otherwise have had.

But not only are these communities unable to tell you that it's an alkaline substance that unlocks vitamin B3 (it might not be surprising that that's not the explanation they would give); they wouldn't talk about nutrition at all. They wouldn't tell you this makes it more nutritious or healthier. They just say, this is how you make a tortilla. This is what makes it good.

And so some people have argued that this should be our basic paradigm for understanding how humans got to where we are today. The thing that makes us so different from other species is just that we copy each other. The rest is like natural selection. People are just trying different things, and if something works well, if it makes you a little fitter, you have a bigger family and other people copy you.

So we wanted to explore that in our lab. I teach this stuff in my undergraduate lecture. And a few years ago, there was a student, Danish Bajwa, who kept showing up to my office hours, and we were having fascinating chats together. He thinks he wants to go to law school or become a journalist. I think he should really become a psychologist. I'm still trying to convince him of that; I've got one year left. But Danish and I started working on a project together that was a very new and different kind of project for me.
It was exciting because, after all these years studying moral psychology, which I really do love, finally, with Danish, I was doing a research project that felt like it was starting to address the thing that got me out of bed every morning, the big questions I really cared about. And recently I also hired Linas Nasvytis, who will be staying in psychology and going to graduate school at Stanford next year.

So here's the challenge we faced. We're experimental psychologists, so what we do is bring interesting things in the world into the lab and run experiments on humans. How are you going to bring nixtamalization into the lab and run experiments on humans? It's not easy to figure out how to take a cultural process that spans hundreds of years and generations of people growing from childhood to adulthood and then, over the course of a weekend, reproduce it and play with it in the laboratory. It took a long time for Danish and me to start to make progress. I don't think you're going to be very impressed with that progress, because we're still very much at the beginning of this project. But it's exciting to me, so I want to share it.

This is the first version that Danish came up with, after really about a year of work, that started to work reliably. And it's a very simple idea. There's a character, that's you, in the middle. And you get to wander around this environment and collect mushrooms. We tell you that you're making a mushroom soup because you belong to a community where people like to eat mushroom soup. For the first people who participate in this experiment, that is all we tell them, and they can go around and collect whatever mushrooms they want. These people are very confused. They have no idea which mushrooms they're supposed to collect or what the point of this experiment is. But that's OK with us.

I'll give you an example of the type of thing that might happen. Oh, I'll tell you, but we don't tell participants, so they don't know this. Only you know this: those mushrooms are healthy and good, according to us, the experimenters.
And these mushrooms are toxic and will kill you. But we didn't tell them, and we're not going to tell them. At no point in the entire experiment are we ever going to reveal that. Imagine that they're the kind of toxins that kill you eventually, after years; it's not like you taste one and immediately get sick. So you don't know that you're doing a bad thing.

OK, so people just choose some random path through this environment, doing their best to collect these mushrooms. Maybe you're not feeling great at the end of this trip. But there could be somebody else who does especially poorly. And there could be somebody who, maybe just through sheer dumb luck, happens to think that these mushrooms look a little better than the other ones. Remember, there's no red and blue marking for them. And so this person does really well.

So we collect a ton of data like this. Some people are having good experiences; some people are having bad experiences. But then we decide we're going to kill off all the ones who ate the toxic mushrooms. We're going to assume that they didn't do very well. Maybe they literally died, or maybe they were just a little sickly, or they didn't have big families, or they weren't as successful in their communities. So when the next generation is born, they're only going to see, and want to copy, the people who did well. OK? So we show a new generation of people the same experiment, but the difference is that first we say, I'm just going to show you how a couple of other people in this experiment did this when we gave them the chance. We don't say that these were good people. We don't say they were the survivors. We don't say you should copy them. We just say, before you do it, we're going to give you a chance to see what some other people did, and then it's going to be your turn. And then we iterate that over and over and over, generation after generation, showing people the strongest survivors.
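[Editor's note: if it helps to see the logic of that loop spelled out, here is a minimal, hypothetical simulation of the copy-and-select dynamic just described. This is not the lab's actual task or code; the population size, copying fidelity, and survival threshold are invented purely for illustration.]

```python
"""Hypothetical sketch of the cultural-selection loop described in the talk.
Not the lab's paradigm; all parameters below are made up for illustration."""
import random

N_MUSHROOMS = 20          # mushrooms available in the environment
PICKS_PER_PERSON = 8      # each forager collects this many
POP_PER_GENERATION = 30   # foragers per generation
N_GENERATIONS = 10
N_DEMONSTRATORS = 2       # prior survivors each newcomer gets to observe
COPY_PROB = 0.9           # chance a pick copies a demonstrator vs. exploring

random.seed(0)
TOXIC = set(random.sample(range(N_MUSHROOMS), 10))  # hidden ground truth

def forage(demonstrators):
    """Pick mushrooms, mostly copying observed survivors when available."""
    observed = [m for d in demonstrators for m in d]
    picks = set()
    while len(picks) < PICKS_PER_PERSON:
        if observed and random.random() < COPY_PROB:
            picks.add(random.choice(observed))        # social learning
        else:
            picks.add(random.randrange(N_MUSHROOMS))  # individual exploration
    return list(picks)

def survives(picks):
    """Selection step: 'kill off' foragers whose basket is mostly toxic."""
    return sum(1 for m in picks if m in TOXIC) <= PICKS_PER_PERSON // 4

survivors = []  # the first generation has no one to copy
for gen in range(N_GENERATIONS):
    population = []
    for _ in range(POP_PER_GENERATION):
        demos = random.sample(survivors, min(N_DEMONSTRATORS, len(survivors)))
        population.append(forage(demos))
    survivors = [p for p in population if survives(p)] or population
    healthy = sum(sum(1 for m in p if m not in TOXIC) for p in population)
    print(f"generation {gen + 1}: "
          f"{healthy / (POP_PER_GENERATION * PICKS_PER_PERSON):.0%} healthy picks")
```

Run as written, the printed fraction of healthy picks should climb across generations, even though no forager is ever told which mushrooms are toxic; newcomers simply copy whoever survived.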
If you've ever studied natural selection, this should feel really familiar. If it works for your DNA, it should work for culture too. And that's exactly what we find. On a scale from dead to flourishing, we find that over the course of generations our population starts to do better and better and better.

Now, an interesting question we can ask is: what are people saying to themselves, and to us, about why they're doing the things they're doing? Is it consistent with the idea that they're just adhering to a rigid cultural practice without much thought given to why it is that, in my culture, the microculture we've created in the lab, we make soup this way? Here's an example of some comments consistent with that. These participants are basically telling us, look, I don't know. I just saw some other people do things. It seemed like I should do what they did, so I copied them. That's consistent with the basic model I presented.

But one of the things that jumped out at us is that there were also a lot of people who were trying to make sense of what was happening, who were trying to extract textbook-style knowledge. They were looking at what other people around them did, and although nobody said to them these ones are poisonous or these ones are healthy, they were trying to make sense of it. And by trying to make sense of it, they were actually recovering true facts. It's kind of cool to me that, of all the people in this experiment, not a single one was ever told that some mushrooms are nutritious and some are poisonous. But by the end of the experiment, they know that. They figured it out for themselves.

In social psychology, which is the field I sit in, we think a lot about this process in which we find justifications for our behaviors and our cultural practices and share them with each other. And the predominant view has been to call that rationalization, in a kind of pejorative sense. Like, I hold my values.
And now, just to save face or to look good to you, I'm going to construct some reasons post hoc that make sense of those values. I have no doubt that sometimes rationalization is a pernicious thing, the kind of thing we should view through a pejorative lens. But on the other hand, an experience I've had is that sometimes, when I'm in a conversation with someone about my deeply held values, values I learned from others, from people who came before me, I learn something about why I hold those values by having to explain them to someone else. And the things I learn feel true. It feels like what I'm reconstructing is why those are actually pretty reasonable values to hold.

We think that, on a micro scale, we're starting to pick up on that in this experiment. You can see it in the data as well. We ask people, I know we didn't tell you that any mushrooms were good or bad, but if you had to guess which mushrooms are good or bad, what would you say? And we find that at the beginning of this experiment, people are at chance, where that red bar is. By the end of the experiment, they're showing a lot of insight, because they're making sense of their culturally inherited values.

I should also say that there are some people who just decide to strike off on their own and not do it the culturally inherited way at all. They say things like, hey, wouldn't it be great to have this many mushrooms in a soup? It would be really expensive to have a mushroom soup like this in real life. I hope these mushrooms are non-toxic. Obviously, these people all died in our experiment.

To come back to this big-picture theme, it emphasizes to me, well, you all know Newton's quote. Newton famously said, "If I have seen further than others," in inventing physics and calculus, "it is by standing on the shoulders of giants."
And for me, it's fun to be able to take that phenomenon, that standing-on-the-shoulders-of-giants phenomenon that makes us so different from all other species, and bring it under experimental control and begin to be able to play with it. I will also just share with you that my own experience at Harvard is one where I feel incredibly lucky to be able to stand on the shoulders of, for instance, the kinds of speakers you just heard from this morning, where every once in a while you get that feeling that you're seeing just a little bit further than other people have seen in some direction or another. But the thing I most love about Harvard, which I think Newton actually didn't capture, is finding people like Danish around me, hoisting them up on my shoulders, and getting a perspective on the world that I never would have had otherwise, if I hadn't had their help looking in a direction I'd never looked before.

So thanks very much. Enjoy your time here.