[MUSIC PLAYING]

JAMES WHITTAKER: What's next? We are going to spend the next hour asking, and answering, that question: what's next? Because as students, you all spend a lot of time studying the past. You spend a lot of time mastering the present.

But asking, and answering, the question "what's next" is what's going to make you successful. Because if you can't answer this question, you end up being left behind.

Companies that can't answer this question get left behind. Individuals get left behind. Asking, and answering, the question "what's next" is probably the most important skill that you will develop in your career.

You only have to get it right once, and you're done. So we are going to ask this question. But in order to get there, we're going to study the past, where people have gotten that answer right and wrong.

In the 1980s, the big thing was hardware. In the 1980s, companies bought room-sized computers and hired rooms full of people to program them. And in the late '80s, things began to change.

The companies that weren't asking what's next -- IBM, DEC, Wang, Sperry -- are gone.
The companies that did ask, and answer, what's next were the companies that survived and thrived into the '90s. And of course, what's next was software.

IBM got that answer wrong, and they declined. Microsoft got that answer right, and they owned the next decade.

Now, this word "decade" is a hint. This industry runs in 10-year cycles that are as reliable as Moore's law. Ten years is something being born, growing into maturity, being the big thing, and then going away rapidly, only to be replaced by the next big thing.

And of course, the next big thing in the '90s was software, mostly the operating system. And the killer app was productivity, both office productivity and programmer productivity.

The company that owns the killer app for what's next ends up being the company that dominates. And of course, that company in the 1990s was Microsoft.

Now, 10 years later, Microsoft failed to ask, and answer, the question what's next, because the world changes reliably every 10 years. And of course, what's next was the web, and the killer app was information retrieval. And Google had the best answer for what was next.

Now, in 2007, everybody thought the cloud was next.
Amazon comes out with Amazon Web Services, and everybody was betting on the cloud. That's what's next. But see, there's a problem with that. It was too new. No invention really becomes important until it's 10 years old.

So in 2017, the cloud's going to become really important, but it's still not what's next. I was a Google employee in 2009 when we discovered, the hard way, what was next. Do you know?

So you've got to get used to asking and answering this question, or you're never going to get good at it. And if you get good at it, you're going to be relevant.

Can I hold the questions for the end? Thank you. You're going to be good at it. Or were you going to guess? What?

SPEAKER 1: Mobile.

JAMES WHITTAKER: Of course. Mobile. Now, the way we discovered this is Larry Page sent out an email that said, hey, we have a big problem. And he called a few of us together. And he showed us the big problem.

He showed us the data. And the data was frightening. The data caused us to have a near-death experience.
The data showed that people who use the iPhone -- the iPhone had been out for two years in 2009 -- don't use browsers. They don't search the web. They use apps that search the web for them.

Imagine our fright as Google. At the time, we made 97% of our income on sponsored links and ads viewed through a web browser. If this iPhone thing was going to be big, if this smartphone thing, if mobile was going to be big, we had a huge problem. 97% of our income, gone. Crazy.

What did they do about it? A couple of interesting moves that Google made in 2009 that you all should be aware of, because if you don't start studying this stuff, you're not going to be able to get good at asking and answering this question.

Do you remember, back before 2009, if you Googled what time is it in, say, Colombia -- what's your city?

SPEAKER 2: [INAUDIBLE].

JAMES WHITTAKER: I can't even say that.

SPEAKER 2: Bogota.

JAMES WHITTAKER: Bogota, Colombia. What did you get? You got 10 blue links about Bogota, Colombia, and time. All of a sudden, in 2009, you got the time in Bogota, Colombia.
Make search more app-like was Larry Page's marching order, and we did. Now, that's a big deal for Google, because every time Google delivers an answer, they don't get paid for it. They have to deliver a link that might be bought and paid for, or an ad, in order to make money. This was devastating to the business.

The second thing they did is they moved the A-team from Search to Android. And that's probably why Marissa Mayer left the company eventually, because that was a backup plan, just in case this was important. But when we looked at this at Google, not only were people using apps, they were right to use apps.

You understand, apps are a better way to search the web. Your app searches for you.

You're from Brazil. You're into soccer. What's a better way to get soccer scores? Install a soccer app that specializes in soccer -- oh, football, sorry -- that specializes in football? Or search the web, which tries to generalize everything? It's the app, right?

If you're a foodie, what's the best thing to do? Get an app that specializes in all the information about food? Or search the web through a browser that is a generalist?
If you're a science nerd, the science app is going to outperform the web every single time. Our life flashed before our eyes.

But then, why, why did the iPhone win? Why? What was it about the iPhone that made it special? The BlackBerry did everything an iPhone would do. Windows CE did everything iOS would do.

SPEAKER 3: Apple is a company you can trust, because they value the privacy of their customers.

JAMES WHITTAKER: Nope. Apple has a lot of privacy data. Why? All those other phones -- the BlackBerry, Windows CE, PalmPilot -- they all did everything that those other ones did. What was it?

SPEAKER 4: Performance.

JAMES WHITTAKER: No.

SPEAKER 5: Good design?

JAMES WHITTAKER: No. The App Store. If you wanted to get an app on any other machine, any other handheld device, before the iPhone, how did you get that app? You had to go to the web. You had to know where it was. www.whereinthehellismyapp.com. And every single one of those websites had a completely different download experience, completely different.
Users couldn't navigate it. No one ever installed apps on those devices. You couldn't update them.

And Steve Jobs came along and said, hey, developers, I'm going to put your app where customers can actually find it. How does that sound? Really? You mean I can write code and it'll be used? Wow. That's awesome.

And it was third-party developers that made that platform what it is. The App Store was the killer app.

Now see, this is a hint. Start watching for these gaps. Start watching for these holes. From hardware to software, there was a hole that was filled. We can't program all these machines individually. One operating system to rule them all worked. The web.

We no longer have to distribute media. We no longer have to update software and ship it out on disk. We can just push it through the web. This is important.

And the App Store solved the biggest problem in mobile. And if it wasn't for the App Store, it wouldn't have worked, right, because it drew developers. And developers made the platform what it is.
I'm going to take questions at the end, is that OK? Or I'm never going to get through this.

So what's next? What's the 2020s?

SPEAKER 6: Nanocomputers?

JAMES WHITTAKER: We're going to get to that. Close.

Now, here's what we're going to do. And I want to teach you this, because I think this is the way to discover the future. There were a bunch of gaps, a whole bunch of gaps. And you've got to get used to finding them.

Every time your machines let you down. Every time you think, why did I have to click five times to get this? Why is this functionality not there? That's a missing link.

That's a gap that you can step in and fill, but only if you're watching for them. Only if you look at the holes in the existing technology as opportunity, instead of something that pisses you off.

So what's next? Here's what we're going to do. We're going to take a scenario, and we're going to chuck it into 2020. All right? We're just going to take a user scenario, and we're going to throw it into the future and see what happens. Have you ever seen this movie?
The best movie ever, right? World War Z.

It's the best movie ever for two reasons. First, it's got zombies. And zombies are real, right? Not like that bullshit vampire stuff we had to put up with a few years ago. Vampires are made up. They're not real; they never will be. But they were kind of sexy, so we watched them anyhow.

But zombies, people, zombies are real. We can mathematically define this virus. It's just a matter of time. It's like your civic duty to watch every single one of these movies, so that I don't have to kill you during the apocalypse.

The second reason it's the best movie ever is because it's got Brad Pitt in it. It's like zombie movie meets date night, and there we were. I'm going, oh, zombies. And she's sitting next to me and she says, oh, Brad Pitt. Somebody told her about the shirtless scene, and she couldn't wait.

But then something strange happened. My date pulls out her phone and begins to share screen time. She's like, oh, Brad Pitt -- oh, Brad Pitt. And I'm thinking, what's going on here? What part of shirtless Brad Pitt does she not understand?

And then, it got worse.
She taps me on the knee and says, I'll be right back. And she walks out of the theater during this most perfect movie. I'm stunned. I'm flabbergasted. I'm coming up with breakup lines, man. This is crazy.

Finally, she comes back, she sits down. She's gone two, three minutes. And she's in a huff, right, she's clearly hurrying. And I'm waiting for it, right. Waiting for it. Come on. Say it. You missed part of the movie. Say it.

And she didn't say it. She didn't say, what happened? What did I miss? So I thought, all right, I'll be proactive. So I said, hey, you didn't miss anything. And she said, I know.

Because she has an app that tells her when to pee at the movies. There is an app for that. Someone has watched every single minute of every single movie, and they curated it all. Right?

Oh, there's an action sequence you don't want to miss. Don't pee during that one. New character, don't pee during that one.
But here's four minutes, right in the middle, where nothing really happens. Go take a leak.

Now I ask you all, how many of you go to the movies? Just raise your hand if you're a moviegoer. I think it's safe to say we're all moviegoers. How many of you have this app? One person. I love it. I do too. Small-bladder people.

Look at this. This is a gap. Here it is -- $1.98, right? Your $0.99, my $0.99, and yet all of you are potential customers.

This is called the application discoverability problem. And it is the biggest technical problem. It's taking down the App Store. The App Store solved a huge problem. You couldn't find apps on the web; put them all in the store, now you can find them. Aha. You can't find them again.

That is a big, world-changing solution. So how should this work? Let's cast this problem into the future. How should this technology solve this problem? It should be that I'm sitting in the movies, and I think, oh, I've got to go.
And I say, oh, I don't know, Cortana, I have to pee. Now, what does Cortana have to do to help me?

First, Cortana has to figure out what I'm doing. Where is my user? OK. No problem. Lat/long pair. Easy to do. Built-in functionality on the phone. So now, Cortana knows exactly where I am.

Where is that? Looks up an address. Maps that lat/long pair to an address. Easy. Built-in functionality on the map application that is already on my phone. No apps yet.

And then, next, she's got to think, OK, what is he doing here at this location? Microsoft Research has a patent for geolocating you off the face of the earth. We know you're on the second floor of that building. Bing has every single floor plan in most major cities across the world as data. OK.

You're in theater number four. Let's see, now I can go look up on the web what's playing in theater number four right now. Because I know he's watching a movie; he hasn't moved in 30 minutes. Cortana knows exactly what I'm doing.
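The chain of inferences described above (lat/long to address, address to room, room to what's playing) can be sketched as a pipeline. This is a minimal illustration only: every lookup is a stub standing in for a real service (GPS, reverse geocoding, indoor positioning with floor-plan data, showtime listings), and none of the function names are real APIs.

```python
# Sketch of the intent-inference pipeline: each stage narrows down what
# the user is doing. All data sources are stubbed with made-up values.

def locate_user() -> tuple[float, float]:
    # Built-in phone functionality: return a lat/long pair.
    return (47.6062, -122.3321)  # made-up coordinates

def reverse_geocode(latlong: tuple[float, float]) -> str:
    # Map the lat/long pair to a street address (stubbed).
    return "123 Example Ave, Seattle"

def venue_at(address: str, floor: int) -> str:
    # Indoor positioning plus floor-plan data pins down the room (stubbed).
    return "Theater 4"

def now_playing(venue: str) -> str:
    # Web lookup of what's showing in that room right now (stubbed).
    return "World War Z"

def infer_activity() -> str:
    latlong = locate_user()
    address = reverse_geocode(latlong)
    venue = venue_at(address, floor=2)
    movie = now_playing(venue)
    return f"watching {movie} in {venue}"

print(infer_activity())  # -> watching World War Z in Theater 4
```

The point of the sketch is that no app is involved anywhere: each stage is a data lookup, and the answer is assembled from the results.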
And in fact, when I say, Cortana, I have to pee, she's probably going to know, right? Because the last time she geolocated me in a bathroom was two hours ago. She's been living in my pocket for two years. She has my mean time to bladder evacuation data down, right?

She's going to say, yes, James, I know. I saw you buy that beer on the way in. Just wait a few minutes, I'll vibrate, and then you can go do your thing and not miss any part of this movie.

Now, what happened there? There was no app involved in that. We actually have all of the pieces to solve this problem, right now. So what can we infer about the future based on this? There are several things we can infer about the future.

First is, our technology doesn't have to go to the web anymore. We don't have to go to the web anymore. Our technology takes us there. The web is no longer a destination; it's simply a data source.

The movie times are up there. The pee times are in the cloud. They're actually sitting in Azure. All that data is already there. The world is beginning to turn into data. Our devices are beginning to process and calculate intent almost better than we can.
Second, search. My machine searched the web. Do you know, in 2015, something really special happened? Now, I'm not talking about bots, I'm talking about machines.

Machines on the web originated and consumed more searches than humans for the first time ever. In 2014, it was still a human-dominated web. In 2015, machines are equal. 2016, 2017 -- by 2020, the amount of human-generated and human-consumed traffic is going to be minuscule. Our machines are going to be consuming the web for us.

And then, finally, apps. Where's the app in this? I don't need the app. The app is a noun. I only need the pee times; I don't need all the trappings of it. Give me the answer and I'm happy.

Why are apps nouns? So you pay for them. We're going to talk about monetization at the end, because making money gets kind of scary in the future. But we're going to talk about it, because it's important.

Apps have turned into verbs. Right? My technology discerns my intent, realizes I need something. It's in the cloud. It's on the web.
Crack the app open, bring out the answer, put it on my device just in case I need it.

All of a sudden, a lot of the things that we do are no longer human-generated. The machines are beginning to take over. So you said nanotechnology; I'm just going to generalize it to machines, but yes, a lot of them are going to be very, very small.

So, Microsoft. My boss came to me in the summer of 2014, and he's like, dude, HR tells me you haven't taken a vacation in four years. Go on vacation.

And so I Binged what this vacation thing was. And apparently, vacation is something that people who don't like their jobs do to get away from their jobs. And that kind of pissed me off, right? I don't want a vacation. So I decided, I want to write code.

And so I thought this whole internet of things thing sounded stupid, right? Nest, a thermostat. You can control it. And it's better, some way, because there's a machine doing it.

No, it's not. Not in my house. I am a Seattle tree hugger. That thing stays off. Put on a sweater and kiss my ass. Or the lights.
I've got this light program on my phone. I can do my burglar alarm. It's harder to use the app than it is to just go over and turn the light on or turn the light off.

So I had this whole I-call-bullshit-on-the-internet-of-things kind of feeling. So I thought, that's what I'll do, I'll investigate this internet of things.

I'll put a machine on the internet of things that deserves to be there. A machine that's got something to say. A machine that's difficult for me to manage on my own, and I could use some robots to help me. And so I put my hot tub on the internet of things.

So step 1 was my vision, because I always like to vision how these things are going to work. My vision was that an Amazon drone would fly over full of hot tub chemicals. And my hot tub would see it coming and open its lid automatically. And we'd just shoot the chemicals right in. So I thought, OK, the first thing I'm going to do is open and close the lid automatically.

And that was actually quite easy. A radio frequency controller on the motor for my hot tub -- easy, easy. I mean, seriously, two hours. Most of it was just hooking stuff up. Four or five lines of code and I'm done.
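The "four or five lines of code" driving an RF-controlled lid motor might look something like the sketch below. This is an illustration under assumptions: the `RFTransmitter` class is a made-up stand-in for whatever radio hardware interface is actually wired to the motor, and the command strings are invented.

```python
# Hedged sketch of a lid controller: a little state tracking on top of
# a radio link to the lid motor. The RF interface is stubbed out.

class RFTransmitter:
    """Stub for the radio link to the lid motor controller."""
    def __init__(self):
        self.sent = []

    def send(self, command: str):
        # A real implementation would key the radio here.
        self.sent.append(command)

class LidController:
    def __init__(self, radio: RFTransmitter):
        self.radio = radio
        self.is_open = False

    def open(self):
        if not self.is_open:  # ignore redundant commands
            self.radio.send("LID_OPEN")
            self.is_open = True

    def close(self):
        if self.is_open:
            self.radio.send("LID_CLOSE")
            self.is_open = False

lid = LidController(RFTransmitter())
lid.open()
lid.close()
print(lid.radio.sent)  # -> ['LID_OPEN', 'LID_CLOSE']
```

A phone app then only needs to call `open()` or `close()` over the network; the state check keeps repeated taps from re-sending the same motor command.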
And now I've got a little Windows Phone app, and I can open and close the lid.

Second was -- actually, second was not any of that. Second was, I thought, I don't want that Amazon drone squirting chemicals in if I'm in, so I put in a level detector. I can detect when a human gets in: the water level rises.

And in fact, it turns out, I was looking at the data, I can weigh you. If you're sitting in my hot tub, I know how much water you've displaced, I've done the math, and I can weigh you more accurately than a doctor can. And I know if your head's gone under, because I have to estimate your head weight, because it's not under the water.

So if you go under, I know if you're drowning. This is totally cool. I had a bug in that; my hot tub kept thinking somebody was drowning, and they weren't. I'm like, just go ahead and drown, I'm tired of this error message.

The next was checking and maintaining water quality. That was the hard part. That took me a good day and a half to solve. Because the way you check water is you actually dip the water out, and you put in these little chemicals, and you compare the color. And I don't have a machine that can do that.
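The displacement math behind the weighing is just Archimedes' principle: a freely floating body displaces its own weight in water, so a rise in water level times the tub's surface area gives the displaced volume, and hence the bather's mass. A minimal sketch, assuming a floating bather and made-up tub dimensions (and ignoring the head-above-water correction mentioned above):

```python
# Sketch of displacement-based weighing. By Archimedes' principle, a
# floating bather's mass equals the mass of water displaced. Tub
# geometry here is invented for illustration.

WATER_DENSITY_KG_PER_M3 = 1000.0  # fresh water, approximately

def mass_from_level_rise(surface_area_m2: float, level_rise_m: float) -> float:
    """Estimate bather mass (kg) from the measured rise in water level."""
    displaced_volume_m3 = surface_area_m2 * level_rise_m
    return displaced_volume_m3 * WATER_DENSITY_KG_PER_M3

# Example: a 2.0 m^2 water surface and a 4 cm level rise
print(mass_from_level_rise(2.0, 0.04))  # -> 80.0 (kg)
```

The drowning check follows the same idea in reverse: if the level suddenly rises by roughly the estimated head volume, the head has gone under.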
436 00:19:43,330 --> 00:19:46,420 >> So I had to get a laser, and I actually worked with a nanotechnologist 437 00:19:46,420 --> 00:19:47,000 to do this. 438 00:19:47,000 --> 00:19:47,890 Get a laser. 439 00:19:47,890 --> 00:19:48,800 Shoot it through. 440 00:19:48,800 --> 00:19:55,297 And you get the color spread on this little backing piece of nanotechnology. 441 00:19:55,297 --> 00:19:56,380 I don't know how it works. 442 00:19:56,380 --> 00:19:59,240 It gives me data and it allows me to determine the color. 443 00:19:59,240 --> 00:20:01,760 And then I can maintain the water quality. 444 00:20:01,760 --> 00:20:03,050 Very cool. 445 00:20:03,050 --> 00:20:04,570 >> Next was reorder chemicals. 446 00:20:04,570 --> 00:20:05,180 Easy. 447 00:20:05,180 --> 00:20:09,680 Amazon makes buying things from the web so easy. 448 00:20:09,680 --> 00:20:10,640 Bless them. 449 00:20:10,640 --> 00:20:11,940 So it can reorder chemicals. 450 00:20:11,940 --> 00:20:13,560 It can monitor use. 451 00:20:13,560 --> 00:20:15,540 It sends me a signal when somebody's in. 452 00:20:15,540 --> 00:20:18,730 It monitors how often people are in, and I've 453 00:20:18,730 --> 00:20:20,680 gotten a bunch of data about that right. 454 00:20:20,680 --> 00:20:23,520 >> I know how dirty you are when you get in my hot tub. 455 00:20:23,520 --> 00:20:25,850 Because I can check the water before you get in. 456 00:20:25,850 --> 00:20:28,100 I can check the water after you get in. 457 00:20:28,100 --> 00:20:32,350 I know what you've done in my hot tub. 458 00:20:32,350 --> 00:20:35,430 >> The amount of data you can get from these machines 459 00:20:35,430 --> 00:20:38,770 is pretty, pretty amazing. 460 00:20:38,770 --> 00:20:39,960 It will troubleshoot itself. 461 00:20:39,960 --> 00:20:41,610 >> So that's what I'm doing now. 462 00:20:41,610 --> 00:20:45,460 I'm monitoring the voltage and trying to figure out all the noise from the grid. 
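That voltage-monitoring step reduces to spotting a reading that rises above its baseline and stays there, which he goes on to describe as the seal-wear signature. A toy check, with the baseline, threshold factor, and run length all invented for illustration:

```python
def seal_wearing(voltages, baseline_v, threshold=1.10, min_run=5):
    """Flag wear when voltage exceeds baseline by the threshold factor
    and stays there for min_run consecutive samples, so a single grid
    spike does not trigger a false alarm."""
    run = 0
    for v in voltages:
        run = run + 1 if v > baseline_v * threshold else 0
        if run >= min_run:
            return True
    return False
```

A real version would filter out the grid noise first; this only shows the "rises and stays up" idea.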
463 00:20:45,460 --> 00:20:47,600 The Seattle grid is really spiky. 464 00:20:47,600 --> 00:20:50,000 Your grid is probably really spiky too. 465 00:20:50,000 --> 00:20:53,600 >> But I'm beginning to find the patterns for when my seals are wearing out, 466 00:20:53,600 --> 00:20:56,160 because when the seals wear out, voltage spikes. 467 00:20:56,160 --> 00:20:57,190 And it stays up. 468 00:20:57,190 --> 00:21:01,524 So it's one of these-- you can detect it over time. 469 00:21:01,524 --> 00:21:04,190 I'm getting to the point where it's going to be able to do that. 470 00:21:04,190 --> 00:21:06,106 >> And then next summer, I'm thinking 3D printing 471 00:21:06,106 --> 00:21:08,940 is going to get good enough that I can just print my own seals. 472 00:21:08,940 --> 00:21:11,900 By the way, that's something else that's happened in 2015. 473 00:21:11,900 --> 00:21:15,520 Before 2015, we were only 3D printing in plastic. 474 00:21:15,520 --> 00:21:17,310 This year we've added metal. 475 00:21:17,310 --> 00:21:19,020 We've added carbon fiber. 476 00:21:19,020 --> 00:21:21,060 We've added sugars. 477 00:21:21,060 --> 00:21:22,870 We can print carbohydrates. 478 00:21:22,870 --> 00:21:23,600 This is crazy. 479 00:21:23,600 --> 00:21:25,225 >> SPEAKER 6: There are scientists [INAUDIBLE] 480 00:21:25,225 --> 00:21:27,894 481 00:21:27,894 --> 00:21:29,060 JAMES WHITTAKER: Absolutely. 482 00:21:29,060 --> 00:21:32,890 And it's going to match your DNA, too, so it's going to know. 483 00:21:32,890 --> 00:21:36,240 Because, you know, medicine is made for a six-foot tall white guy. 484 00:21:36,240 --> 00:21:41,270 And so we're going to be able to do some amazing things with this. 485 00:21:41,270 --> 00:21:43,180 >> Proteins are coming next. 486 00:21:43,180 --> 00:21:48,030 We're almost to the point where we can print cotton and fibers for clothing. 487 00:21:48,030 --> 00:21:49,620 They're printing houses in China.
488 00:21:49,620 --> 00:21:52,580 They're printing cars in North America. 489 00:21:52,580 --> 00:21:56,770 And so I'm thinking I can print a seal for my hot tub. 490 00:21:56,770 --> 00:22:00,390 >> So now, the cool thing about this is, when these hot tubs 491 00:22:00,390 --> 00:22:05,330 begin to talk to each other, because let's say you have a hot tub 492 00:22:05,330 --> 00:22:06,510 and I have a hot tub. 493 00:22:06,510 --> 00:22:11,434 And my hot tub might say, dude-- because I get to design the hot tub protocol, 494 00:22:11,434 --> 00:22:13,350 and they're all going to call each other dude. 495 00:22:13,350 --> 00:22:14,830 That's just going to happen. 496 00:22:14,830 --> 00:22:16,360 I guarantee that's going to happen. 497 00:22:16,360 --> 00:22:18,240 >> It can say, dude, here's my data. 498 00:22:18,240 --> 00:22:20,150 Here's my usage data. 499 00:22:20,150 --> 00:22:21,420 Here's my chemical data. 500 00:22:21,420 --> 00:22:22,270 Let's look at yours. 501 00:22:22,270 --> 00:22:24,050 And they're going to compare notes. 502 00:22:24,050 --> 00:22:28,710 And they're going to figure out what the best data-- what the best chemical 503 00:22:28,710 --> 00:22:29,960 concoction is. 504 00:22:29,960 --> 00:22:32,380 >> This company makes the best chemical. 505 00:22:32,380 --> 00:22:34,900 This company makes the best chemical for the Northwest. 506 00:22:34,900 --> 00:22:39,490 This company makes the best chemical for a desert climate. 507 00:22:39,490 --> 00:22:42,570 They are going to figure all of this out. 508 00:22:42,570 --> 00:22:44,010 And it's going to be amazing. 509 00:22:44,010 --> 00:22:48,310 >> By the way, this is where the advertising and marketing 510 00:22:48,310 --> 00:22:50,610 economy goes away. 511 00:22:50,610 --> 00:22:53,030 You can't advertise to machines. 512 00:22:53,030 --> 00:22:55,350 And if we're right, and if the machines really 513 00:22:55,350 --> 00:23:00,330 are the next thing, what are you going to advertise? 
514 00:23:00,330 --> 00:23:03,530 How are you going to say, hey, I got some hot tub chemicals for you. 515 00:23:03,530 --> 00:23:08,020 Look at these hot tub chemicals, man, they've got dancing cats and stuff. 516 00:23:08,020 --> 00:23:09,210 >> They don't care. 517 00:23:09,210 --> 00:23:13,660 Machines are going to say, we know the data, man, don't come advertising 518 00:23:13,660 --> 00:23:15,580 to me. 519 00:23:15,580 --> 00:23:19,120 And all the machines are going to be in on this. 520 00:23:19,120 --> 00:23:20,120 Yes. 521 00:23:20,120 --> 00:23:23,120 >> SPEAKER 6: One of the next things is building personal apps for ads 522 00:23:23,120 --> 00:23:26,130 so that they don't have to feel like they're being marketed. 523 00:23:26,130 --> 00:23:29,120 >> JAMES WHITTAKER: A couple slides later, I'll talk about that. 524 00:23:29,120 --> 00:23:31,079 Your refrigerator. 525 00:23:31,079 --> 00:23:32,120 I'm building a new house. 526 00:23:32,120 --> 00:23:33,520 I'm shopping for refrigerators. 527 00:23:33,520 --> 00:23:34,280 Refrigerators. 528 00:23:34,280 --> 00:23:36,320 You don't have to just scan things in anymore, 529 00:23:36,320 --> 00:23:39,180 like you used to have to keep the barcode, so it knows you have chicken. 530 00:23:39,180 --> 00:23:41,638 >> Soon as you close your refrigerator door, a bunch of lasers 531 00:23:41,638 --> 00:23:45,480 start shining on your food, figuring out what you have in there, 532 00:23:45,480 --> 00:23:49,000 how long it's been in there, and its chemical composition. 533 00:23:49,000 --> 00:23:52,990 We're going to be able to detect food that's gone bad. 534 00:23:52,990 --> 00:23:55,750 And it's going to be able to know what you have, it's going 535 00:23:55,750 --> 00:23:58,491 to be able to cook, suggest meals. 536 00:23:58,491 --> 00:24:00,740 Your toaster is going to be on the internet of things. 537 00:24:00,740 --> 00:24:03,670 What the hell's a toaster have to say on the internet of things? 
538 00:24:03,670 --> 00:24:06,340 Actually, I think the toasters have a lot to say on the internet of things. 539 00:24:06,340 --> 00:24:08,600 But I think one is going to be really interesting. 540 00:24:08,600 --> 00:24:10,462 And that is end of life decisions. 541 00:24:10,462 --> 00:24:13,420 Because one day, your toaster's going to wake up and it's going to say, 542 00:24:13,420 --> 00:24:14,003 wait a minute. 543 00:24:14,003 --> 00:24:15,810 Something's wrong with me. 544 00:24:15,810 --> 00:24:17,840 Ah, my data's off-- something's wrong. 545 00:24:17,840 --> 00:24:20,120 It's going to go out on the internet of toasters. 546 00:24:20,120 --> 00:24:23,800 And he's going to say, dudes, something's wrong with me. 547 00:24:23,800 --> 00:24:25,390 What's wrong with me? 548 00:24:25,390 --> 00:24:28,350 And these other toasters are going to look at the data. 549 00:24:28,350 --> 00:24:31,160 >> And they're going to say, oh dude, you're dying. 550 00:24:31,160 --> 00:24:33,370 Filament number four is wearing thin. 551 00:24:33,370 --> 00:24:35,090 You're about to go. 552 00:24:35,090 --> 00:24:39,420 And it's going to have to order its own replacement toaster. 553 00:24:39,420 --> 00:24:40,800 Is that sad? 554 00:24:40,800 --> 00:24:43,780 That brave little toaster is going to have to do that. 555 00:24:43,780 --> 00:24:46,360 >> And the thing that I think is going to happen 556 00:24:46,360 --> 00:24:50,360 is, you're not going to get a choice of toasters. 557 00:24:50,360 --> 00:24:53,160 Your toaster is going to replace itself. 558 00:24:53,160 --> 00:24:57,360 >> You're just going to get the stock, gray toaster, because you don't care. 559 00:24:57,360 --> 00:25:00,950 >> You have a pink Mac and cool glasses. 560 00:25:00,950 --> 00:25:03,220 You're going to get a stylish toaster. 561 00:25:03,220 --> 00:25:05,197 >> Your machines are going to know what you want. 
562 00:25:05,197 --> 00:25:07,780 They're going to know your income level, and the neighborhood, 563 00:25:07,780 --> 00:25:10,470 and what other type of toasters those people are buying. 564 00:25:10,470 --> 00:25:12,710 And they're going to make decisions for you. 565 00:25:12,710 --> 00:25:15,216 >> Now, people in my generation are old enough to say, 566 00:25:15,216 --> 00:25:16,840 I don't want the machines to take over. 567 00:25:16,840 --> 00:25:18,173 I want to make my own decisions. 568 00:25:18,173 --> 00:25:21,209 And we don't matter, because we're going to die sooner than you all. 569 00:25:21,209 --> 00:25:23,750 And you all are going to die sooner than the next generation. 570 00:25:23,750 --> 00:25:25,791 >> And the next generation's going to say, you mean, 571 00:25:25,791 --> 00:25:27,480 you picked out your own toaster? 572 00:25:27,480 --> 00:25:29,850 That's stupid. 573 00:25:29,850 --> 00:25:31,010 You drove your own car? 574 00:25:31,010 --> 00:25:32,820 That's stupid. 575 00:25:32,820 --> 00:25:33,830 Are you kidding me? 576 00:25:33,830 --> 00:25:36,038 >> Your alls kids aren't going to get driver's licenses. 577 00:25:36,038 --> 00:25:37,880 Isn't that cool? 578 00:25:37,880 --> 00:25:39,260 >> Clothing. 579 00:25:39,260 --> 00:25:42,180 Clothing is going to self-market. 580 00:25:42,180 --> 00:25:43,920 We're all going to have to dress better. 581 00:25:43,920 --> 00:25:48,270 Because if I say, man, look at that Microsoft shirt, that is awesome. 582 00:25:48,270 --> 00:25:49,527 I want that shirt in my size. 583 00:25:49,527 --> 00:25:52,360 Now, I could come up to you, and I can have this creepy conversation 584 00:25:52,360 --> 00:25:55,295 about hey, man where did you get that cool shirt? 585 00:25:55,295 --> 00:26:00,635 >> And all of this, or that thing is going to be on the internet of shirts. 586 00:26:00,635 --> 00:26:03,010 And it's going to be able to communicate with my devices. 
587 00:26:03,010 --> 00:26:05,350 I don't know how. 588 00:26:05,350 --> 00:26:06,520 Maybe I do this. 589 00:26:06,520 --> 00:26:09,100 Maybe there's a button. 590 00:26:09,100 --> 00:26:10,360 They communicate. 591 00:26:10,360 --> 00:26:16,040 >> And all of a sudden, I have one of those 3D printed at home in my size. 592 00:26:16,040 --> 00:26:17,640 And you get paid from that. 593 00:26:17,640 --> 00:26:19,973 We're going to talk about how to get paid from all this. 594 00:26:19,973 --> 00:26:23,260 You just referred a shirt manufacturer to a new customer. 595 00:26:23,260 --> 00:26:25,509 You're going to get a micropayment for that. 596 00:26:25,509 --> 00:26:26,425 This is my prediction. 597 00:26:26,425 --> 00:26:28,990 Now, it doesn't matter if I'm wrong. 598 00:26:28,990 --> 00:26:33,280 I'm trying to teach you how to ask, and answer, the question, what's next? 599 00:26:33,280 --> 00:26:35,510 I'm giving you example answers. 600 00:26:35,510 --> 00:26:37,520 It may, or may not, work this way. 601 00:26:37,520 --> 00:26:39,070 It doesn't matter. 602 00:26:39,070 --> 00:26:43,070 >> What really matters is that you get practice asking, and answering, 603 00:26:43,070 --> 00:26:45,690 this question, what's next, so you're ahead 604 00:26:45,690 --> 00:26:48,590 of the curve, instead of left behind. 605 00:26:48,590 --> 00:26:52,200 >> So these machines are going to talk to each other, too. 606 00:26:52,200 --> 00:26:56,305 Every time I go, my Microsoft band-- whatever it is-- 607 00:26:56,305 --> 00:26:59,500 is going to know I'm traveling in Europe. 608 00:26:59,500 --> 00:27:00,800 And I'm over-eating. 609 00:27:00,800 --> 00:27:03,320 >> And it's going to tell my refrigerator when I get 610 00:27:03,320 --> 00:27:06,160 home, hey man, this guy's unhealthy. 611 00:27:06,160 --> 00:27:09,760 You need to make sure you just order healthy food for him for a few weeks, 612 00:27:09,760 --> 00:27:13,630 and let's get his statistics back together.
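A toy version of that band-to-refrigerator hand-off might be a small structured status message. Every field name here is invented for illustration, including the "dude" greeting he jokes about building into his hot-tub protocol:

```python
import json

def status_message(device, owner_state, advice=()):
    """Serialize one appliance-to-appliance status report."""
    return json.dumps({
        "greeting": "dude",        # his proposed protocol salutation
        "device": device,
        "owner_state": owner_state,
        "advice": list(advice),
    })

# The fitness band reporting to the refrigerator, per the scenario above.
band_report = status_message(
    "fitness-band",
    {"traveling": True, "overeating": True},
    advice=["order only healthy food for a few weeks"],
)
```

The point is only that these messages are machine-to-machine data, with no human (and no ad) in the loop.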
613 00:27:13,630 --> 00:27:15,130 They're going to talk to each other. 614 00:27:15,130 --> 00:27:16,964 >> All of this traffic is going to be going on, 615 00:27:16,964 --> 00:27:19,713 and it's going to wash out everything humans are doing on the web, 616 00:27:19,713 --> 00:27:21,560 everything humans are doing with the cloud. 617 00:27:21,560 --> 00:27:24,490 Our devices are going to discern intent. 618 00:27:24,490 --> 00:27:27,266 And they're going to be correct most of the time. 619 00:27:27,266 --> 00:27:29,640 When they're not correct, they're going to learn from it, 620 00:27:29,640 --> 00:27:31,740 and they're going to stop being not correct. 621 00:27:31,740 --> 00:27:34,300 And within five years, our machines are going 622 00:27:34,300 --> 00:27:40,860 to know more about what we need to do during the day than we do. 623 00:27:40,860 --> 00:27:42,730 That's my prediction. 624 00:27:42,730 --> 00:27:44,240 >> Screens. 625 00:27:44,240 --> 00:27:46,990 What if we didn't need screens? 626 00:27:46,990 --> 00:27:49,470 What if we didn't have to carry these things around? 627 00:27:49,470 --> 00:27:50,540 Because screens go away. 628 00:27:50,540 --> 00:27:53,770 If our machines are automatically discerning intent, 629 00:27:53,770 --> 00:27:58,710 why do I need an input device to tell them what I want to do? 630 00:27:58,710 --> 00:28:02,370 Why do I need an output device to look at their suggestions? 631 00:28:02,370 --> 00:28:04,870 We're going to need it for a while, because the machines are 632 00:28:04,870 --> 00:28:05,720 going to be wrong. 633 00:28:05,720 --> 00:28:10,440 The machines are going to say, hey, I think this is your intent, am I right? 634 00:28:10,440 --> 00:28:11,530 Then we'll need it. 635 00:28:11,530 --> 00:28:13,984 But then, we're not going to need that for long. 636 00:28:13,984 --> 00:28:16,650 Or, they're going to say, hey, there's a couple of choices here.
637 00:28:16,650 --> 00:28:19,300 And then we're going to choose and then they're going to learn from that. 638 00:28:19,300 --> 00:28:21,230 And we're not going to need that anymore. 639 00:28:21,230 --> 00:28:24,470 >> If you take away this screen, what happens to this machine? 640 00:28:24,470 --> 00:28:27,310 The electronics in this machine, without the screen, 641 00:28:27,310 --> 00:28:30,960 are about the size of my- glad I chose that finger-- 642 00:28:30,960 --> 00:28:33,452 about the size of my index finger. 643 00:28:33,452 --> 00:28:35,660 Because if you take away the screen, you take away 70 644 00:28:35,660 --> 00:28:38,240 some odd percent of the battery, because that's all it does 645 00:28:38,240 --> 00:28:40,250 is service the screen. 646 00:28:40,250 --> 00:28:41,520 >> Moore's law. 647 00:28:41,520 --> 00:28:44,250 Let's push Moore's law a few years into the future. 648 00:28:44,250 --> 00:28:47,550 Instead of this machine being this big, the machine 649 00:28:47,550 --> 00:28:50,170 is going to be, in about four years, the size of just 650 00:28:50,170 --> 00:28:52,340 from the knuckle to the tip. 651 00:28:52,340 --> 00:28:55,850 And in another four years, just the size of my fingernail. 652 00:28:55,850 --> 00:28:59,160 >> We can sew these machines into anything, right? 653 00:28:59,160 --> 00:29:00,150 You can have computers. 654 00:29:00,150 --> 00:29:02,650 Do you realize that this Windows phone I carry in my pocket, 655 00:29:02,650 --> 00:29:04,720 and the smartphones you all carry your pocket, 656 00:29:04,720 --> 00:29:07,930 is more powerful than any computer that existed in 1994? 657 00:29:07,930 --> 00:29:11,080 In 1995, there were two that gave it a run for its money. 658 00:29:11,080 --> 00:29:13,340 1994, 21 years ago. 659 00:29:13,340 --> 00:29:18,810 >> And so now, all this computing power into the size of my fingernail. 660 00:29:18,810 --> 00:29:21,050 You all understand Moore's law is under threat. 
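The shrink he walks through, from phone-sized to fingertip-sized to fingernail-sized, is just the Moore's-law doubling cadence run in reverse: hold the transistor count fixed and the silicon area halves once per doubling period. A back-of-the-envelope sketch, where the two-year period and starting area are assumptions:

```python
def die_area_cm2(years_from_now, area_now_cm2, doubling_years=2.0):
    """Same transistor budget on ever-denser silicon: the area
    halves once per Moore's-law doubling period."""
    return area_now_cm2 / 2 ** (years_from_now / doubling_years)

# Four years = two doublings = a quarter of the area;
# eight years = four doublings = a sixteenth.
```

That quartering per four-year step is what takes a phone-sized board down toward a fingernail in roughly eight years.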
661 00:29:21,050 --> 00:29:23,680 Moore's law is going away by 2020. 662 00:29:23,680 --> 00:29:26,760 We will no longer be able to get silicon so thin that we 663 00:29:26,760 --> 00:29:30,840 can continue to double the number of transistors on it, and microprocessors. 664 00:29:30,840 --> 00:29:32,770 How, or what, are we going to do? 665 00:29:32,770 --> 00:29:34,800 >> We're already solving this problem. 666 00:29:34,800 --> 00:29:38,710 IBM has produced-- IBM remember them from the '80s? 667 00:29:38,710 --> 00:29:40,240 They're coming back. 668 00:29:40,240 --> 00:29:43,070 See, it's much easier to come back from a long time 669 00:29:43,070 --> 00:29:46,690 ago than it is to come back from owning the last thing. 670 00:29:46,690 --> 00:29:51,940 That's why I think Apple's in trouble, because they're on top now. 671 00:29:51,940 --> 00:29:58,410 >> A germanium silicon mix, which is going to give a few more years of life, 672 00:29:58,410 --> 00:30:04,410 maybe as much as a decade of life, to Moore's law. 673 00:30:04,410 --> 00:30:09,280 We're also looking at graphene and carbon nanotubes. 674 00:30:09,280 --> 00:30:15,660 Both of those are going to continue Moore's law into the far future. 675 00:30:15,660 --> 00:30:18,850 At least a future that is far away enough to be unimaginable. 676 00:30:18,850 --> 00:30:22,301 >> Machines are going to get a lot more powerful. 677 00:30:22,301 --> 00:30:23,800 And we're not going to need screens. 678 00:30:23,800 --> 00:30:25,550 The screens are going to pop up anywhere. 679 00:30:25,550 --> 00:30:29,176 And you all had HoloLens on yet? 680 00:30:29,176 --> 00:30:34,930 HoloLens uses your brain, tricks your brain into putting objects 681 00:30:34,930 --> 00:30:36,490 that you think are real. 682 00:30:36,490 --> 00:30:39,420 Because that's what your brain already does. 683 00:30:39,420 --> 00:30:41,140 >> I don't look like this. 
684 00:30:41,140 --> 00:30:45,780 I am your brain-- the way I look is your brain translating 685 00:30:45,780 --> 00:30:51,125 what the light is doing bouncing off of my skin, and hair, well skin. 686 00:30:51,125 --> 00:30:54,310 687 00:30:54,310 --> 00:30:56,400 That's what HoloLens uses to trick your mind 688 00:30:56,400 --> 00:30:58,660 into seeing things that aren't there. 689 00:30:58,660 --> 00:31:00,650 >> So we put a screen on the wall. 690 00:31:00,650 --> 00:31:03,330 Although this is a total bullshit exercise, 691 00:31:03,330 --> 00:31:07,450 no daughter needs her dad to figure out how to fix this. 692 00:31:07,450 --> 00:31:10,480 That daughter's going to do what everybody else does, and go to YouTube 693 00:31:10,480 --> 00:31:11,910 to figure out how to fix this. 694 00:31:11,910 --> 00:31:14,390 So I don't think I believe that scenario. 695 00:31:14,390 --> 00:31:16,080 >> But this is what it looks like. 696 00:31:16,080 --> 00:31:18,510 It's not virtual reality, it's augmented reality. 697 00:31:18,510 --> 00:31:19,660 Screens can be anywhere. 698 00:31:19,660 --> 00:31:21,130 You can work with this. 699 00:31:21,130 --> 00:31:24,860 You can see a Word document eight feet away from you, giant. 700 00:31:24,860 --> 00:31:28,935 You can have a holographic keyboard materialize and hover right 701 00:31:28,935 --> 00:31:30,810 in front of you, and with the special gloves, 702 00:31:30,810 --> 00:31:33,740 it actually feels like a real keyboard. 703 00:31:33,740 --> 00:31:36,830 >> The screens are going to come when we need them. 704 00:31:36,830 --> 00:31:41,070 And we're not going to need them for most things, just 705 00:31:41,070 --> 00:31:43,810 creative things, design things. 706 00:31:43,810 --> 00:31:47,000 For the times when we are intensely human, again. 707 00:31:47,000 --> 00:31:50,650 Because that's my hope is that the machines will get us back 708 00:31:50,650 --> 00:31:54,670 to that intensely human place. 
709 00:31:54,670 --> 00:31:56,640 >> Now let's talk about money. 710 00:31:56,640 --> 00:32:01,720 And again I'm going to talk-- I'm going to go-- I might not be right. 711 00:32:01,720 --> 00:32:03,600 I want you all to ask this question. 712 00:32:03,600 --> 00:32:05,960 I want you all to answer this question. 713 00:32:05,960 --> 00:32:08,970 But I'm showing you how to do it. 714 00:32:08,970 --> 00:32:09,685 >> What about money? 715 00:32:09,685 --> 00:32:11,560 How are we going to make money in this future 716 00:32:11,560 --> 00:32:14,300 when our machines are doing everything for us? 717 00:32:14,300 --> 00:32:16,200 Well, let's take a look at the past. 718 00:32:16,200 --> 00:32:20,592 How did we make money-- sorry, it's not my fault Harvard doesn't have HDMI. 719 00:32:20,592 --> 00:32:23,320 720 00:32:23,320 --> 00:32:25,690 >> The web, how did we make money on the web? 721 00:32:25,690 --> 00:32:29,680 There are two words you're missing: web and ads. 722 00:32:29,680 --> 00:32:34,690 >> Why are browsers still the same as they were in 1994? 723 00:32:34,690 --> 00:32:36,030 Do you all realize that? 724 00:32:36,030 --> 00:32:39,130 Netscape Navigator in 1994, here's how it worked. 725 00:32:39,130 --> 00:32:40,360 You invoked it. 726 00:32:40,360 --> 00:32:42,570 And you got a rectangle on a screen. 727 00:32:42,570 --> 00:32:43,910 It had a text box. 728 00:32:43,910 --> 00:32:46,270 You clicked in a text box, typed the search term, 729 00:32:46,270 --> 00:32:48,160 hit Enter, and got 10 blue links. 730 00:32:48,160 --> 00:32:51,460 >> That's the same way the Edge browser, and the Chrome browser, 731 00:32:51,460 --> 00:32:53,390 work 21 years later. 732 00:32:53,390 --> 00:32:56,100 Exactly the same way. 733 00:32:56,100 --> 00:32:56,910 Why? 734 00:32:56,910 --> 00:33:00,670 No other piece of software works the same way as it did 21 years ago. 735 00:33:00,670 --> 00:33:02,515 And it's exactly the same way.
736 00:33:02,515 --> 00:33:03,390 It's a little faster. 737 00:33:03,390 --> 00:33:05,890 And it takes a few more file formats, but that's it. 738 00:33:05,890 --> 00:33:06,690 >> Why? 739 00:33:06,690 --> 00:33:08,420 Because of ads. 740 00:33:08,420 --> 00:33:10,960 Browsers are money-making little machines. 741 00:33:10,960 --> 00:33:14,140 Because it's start, stop, start, stop, right? 742 00:33:14,140 --> 00:33:18,690 You have to type in search terms, get a search, stop, opportunity for ads. 743 00:33:18,690 --> 00:33:21,910 >> And then you go to a website, stop, opportunity for ads. 744 00:33:21,910 --> 00:33:22,760 Over and over. 745 00:33:22,760 --> 00:33:24,210 >> It's like the NFL. 746 00:33:24,210 --> 00:33:28,070 They only play for 10 seconds before they stop for a commercial break. 747 00:33:28,070 --> 00:33:29,160 It's crazy. 748 00:33:29,160 --> 00:33:30,500 Advertisers love it. 749 00:33:30,500 --> 00:33:34,400 But it doesn't work so well in the apps world, does it? 750 00:33:34,400 --> 00:33:35,800 It's a lot harder. 751 00:33:35,800 --> 00:33:37,730 There's not enough room. 752 00:33:37,730 --> 00:33:40,290 >> Apps are more like soccer. 753 00:33:40,290 --> 00:33:41,980 Where are the ads in soccer? 754 00:33:41,980 --> 00:33:43,590 You can't stop a soccer game. 755 00:33:43,590 --> 00:33:45,960 Go on, go on-- hey, time out, guys. 756 00:33:45,960 --> 00:33:47,770 Aren't you all buggered? 757 00:33:47,770 --> 00:33:49,650 Take a break. 758 00:33:49,650 --> 00:33:50,870 That doesn't work that way. 759 00:33:50,870 --> 00:33:53,160 >> And apps don't work that way, either. 760 00:33:53,160 --> 00:33:57,830 Because I got that food app, I got that science app, for a reason. 761 00:33:57,830 --> 00:34:00,100 I want to get work done using my app. 762 00:34:00,100 --> 00:34:01,490 Ads get in the way.
763 00:34:01,490 --> 00:34:05,140 So when apps came out, all of a sudden, the ads economy 764 00:34:05,140 --> 00:34:07,350 had to make way for the purchase economy. 765 00:34:07,350 --> 00:34:13,389 >> And for $0.99, forever, not per year, but forever, you 766 00:34:13,389 --> 00:34:16,270 can make those ads go away, and people do. 767 00:34:16,270 --> 00:34:21,031 The purchase economy is beginning to really take hold. 768 00:34:21,031 --> 00:34:24,239 And I think you're going to see it take hold in some really interesting ways, 769 00:34:24,239 --> 00:34:25,100 too. 770 00:34:25,100 --> 00:34:28,010 >> If you look at my blog on medium.com, I wrote 771 00:34:28,010 --> 00:34:31,330 a blog post called Twitter is stupid, which is kind of cool, 772 00:34:31,330 --> 00:34:34,210 because it trended on Twitter. 773 00:34:34,210 --> 00:34:37,790 Twitter is stupid, because this is why it's stupid. 774 00:34:37,790 --> 00:34:41,429 I tweeted this last year, or something, traveling to Boston and New York City 775 00:34:41,429 --> 00:34:41,929 next week. 776 00:34:41,929 --> 00:34:45,110 Any locals who can hook me up with some good live music recommendations? 777 00:34:45,110 --> 00:34:46,929 >> I'm blessed with a lot of followers. 778 00:34:46,929 --> 00:34:49,770 I've got a bunch of people from Boston and New York City. 779 00:34:49,770 --> 00:34:51,460 Nobody replied. 780 00:34:51,460 --> 00:34:52,219 Nobody replied. 781 00:34:52,219 --> 00:34:53,219 I got two favorites. 782 00:34:53,219 --> 00:34:56,540 Why in the hell would you favorite this tweet? 783 00:34:56,540 --> 00:35:01,450 James can't find no music, asshole. 784 00:35:01,450 --> 00:35:03,930 >> Are you kidding me? 785 00:35:03,930 --> 00:35:06,140 So I thought this is broken. 786 00:35:06,140 --> 00:35:10,090 Surely all of these places, all of these live music venues in these two cities 787 00:35:10,090 --> 00:35:12,160 have a Twitter account, right? 
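Looking up every venue account and tweeting at all of them really is only a few lines with a client library. A sketch with tweepy, where the handle list, credentials, and message text are all placeholders and the actual send is left commented out:

```python
ASK = ("Traveling to Boston and New York City next week. "
       "Any locals who can hook me up with good live music recommendations?")

def mention_tweets(venue_handles, ask=ASK):
    """One @-mention tweet per venue account found in the lookup."""
    return ["@{} {}".format(handle, ask) for handle in venue_handles]

# import tweepy                                      # assumed installed
# api = tweepy.API(tweepy.OAuth1UserHandler(
#     "KEY", "SECRET", "TOKEN", "TOKEN_SECRET"))     # placeholder credentials
# for text in mention_tweets(handles_from_lookup):   # ~1,500 venue handles
#     api.update_status(status=text)                 # the whole "blast"
```

The interesting work is the lookup that produces the handle list; the blast itself is just the loop.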
788 00:35:12,160 --> 00:35:15,180 And so I looked it up, because Bing's ingested all the Twitter data. 789 00:35:15,180 --> 00:35:18,270 >> So I could use a few of the API calls-- it's not even code, 790 00:35:18,270 --> 00:35:20,680 it's just making an API call. 791 00:35:20,680 --> 00:35:22,140 1,500 of them. 792 00:35:22,140 --> 00:35:25,910 1,500 Twitter accounts that claim to be live music venues. 793 00:35:25,910 --> 00:35:30,020 So I just wrote a little loop, and sent this tweet to them. 794 00:35:30,020 --> 00:35:31,470 Blast. 795 00:35:31,470 --> 00:35:33,210 Four lines of code. 796 00:35:33,210 --> 00:35:34,900 >> And then I waited. 797 00:35:34,900 --> 00:35:36,320 I didn't have to wait long. 798 00:35:36,320 --> 00:35:40,810 One minute, because every single one of them-- all of them who care-- 799 00:35:40,810 --> 00:35:45,100 has a social media coordinator, with three screens-- Instagram, Twitter, 800 00:35:45,100 --> 00:35:47,460 Facebook-- waiting for customers. 801 00:35:47,460 --> 00:35:50,100 >> And I'm thinking, what's wrong here? 802 00:35:50,100 --> 00:35:53,710 Here I am in Twitter saying, I want to spend money. 803 00:35:53,710 --> 00:35:57,500 And here are other Twitter accounts saying, I would like to take your money. 804 00:35:57,500 --> 00:36:03,790 And Twitter's like, would you like to see an ad for-- you're stupid, 805 00:36:03,790 --> 00:36:05,760 Twitter, stupid. 806 00:36:05,760 --> 00:36:06,760 >> How should it work? 807 00:36:06,760 --> 00:36:08,450 It should work like this. 808 00:36:08,450 --> 00:36:09,890 Left swipe, right swipe. 809 00:36:09,890 --> 00:36:11,390 Data to tell me what I like. 810 00:36:11,390 --> 00:36:13,910 Or am I connecting you to the right people? 811 00:36:13,910 --> 00:36:15,190 >> Let's see. 812 00:36:15,190 --> 00:36:19,450 Hyatt Regency Boston says they've got jazz music. 813 00:36:19,450 --> 00:36:20,840 Jazz sucks. 814 00:36:20,840 --> 00:36:21,820 Left swipe. 815 00:36:21,820 --> 00:36:22,460 Holy.
816 00:36:22,460 --> 00:36:26,540 The only people that like jazz are people that play it, for goodness sake. 817 00:36:26,540 --> 00:36:30,340 >> The Boston Symphony, they got classical music now. 818 00:36:30,340 --> 00:36:31,550 Now, now, now. 819 00:36:31,550 --> 00:36:34,490 We legalized marijuana in my state a year or so ago, 820 00:36:34,490 --> 00:36:38,260 and all of a sudden, classical music is sounding a lot better. 821 00:36:38,260 --> 00:36:40,120 But I'm not yet stoned enough. 822 00:36:40,120 --> 00:36:42,680 Left swipe. 823 00:36:42,680 --> 00:36:46,840 >> Orpheum Theater in Boston's got Cage the Elephant and the Foals. 824 00:36:46,840 --> 00:36:51,130 Cage the Elephant is an awesome rock band from Kentucky, my homeland. 825 00:36:51,130 --> 00:36:53,210 And I went to see this concert. 826 00:36:53,210 --> 00:36:54,374 >> This is commerce. 827 00:36:54,374 --> 00:36:57,040 This is the purchase economy that's going to begin to take over, 828 00:36:57,040 --> 00:36:58,360 and Twitter is beginning to do this. 829 00:36:58,360 --> 00:37:01,026 I don't know if my blog post had anything to do with it, or not. 830 00:37:01,026 --> 00:37:06,180 But you can now donate money to a political candidate through Twitter. 831 00:37:06,180 --> 00:37:09,010 >> They are finally beginning to connect buyers and sellers 832 00:37:09,010 --> 00:37:09,980 within their ecosystem. 833 00:37:09,980 --> 00:37:11,470 And you are going to see this. 834 00:37:11,470 --> 00:37:13,120 It's a massive threat to Google. 835 00:37:13,120 --> 00:37:16,140 This is a massive threat to anything that puts 836 00:37:16,140 --> 00:37:20,120 ads in the path of buyers and sellers. 837 00:37:20,120 --> 00:37:23,390 >> If Facebook and Twitter did this, they would consume much 838 00:37:23,390 --> 00:37:26,030 of the purchase economy on the web. 839 00:37:26,030 --> 00:37:29,790 >> Now what about when devices start coming into play? 840 00:37:29,790 --> 00:37:31,720 What about this scenario?
841 00:37:31,720 --> 00:37:33,660 Cortana, I have to pee. 842 00:37:33,660 --> 00:37:35,080 And I go. 843 00:37:35,080 --> 00:37:37,140 And I come back. 844 00:37:37,140 --> 00:37:38,300 How about that? 845 00:37:38,300 --> 00:37:41,820 How does this person make money in this new economy? 846 00:37:41,820 --> 00:37:44,100 >> I think it's subscriptions and micropayments. 847 00:37:44,100 --> 00:37:45,800 That data is in Azure. 848 00:37:45,800 --> 00:37:50,570 That developer pays a subscription to be an Azure customer. 849 00:37:50,570 --> 00:37:53,910 >> So do a million others-- I think Microsoft announced last week 850 00:37:53,910 --> 00:37:57,350 a million paid accounts in Azure, or something like that. 851 00:37:57,350 --> 00:38:00,050 So there's a lot of money in the ecosystem. 852 00:38:00,050 --> 00:38:07,040 >> So when Cortana sees me go and consume third-party content, 853 00:38:07,040 --> 00:38:08,899 a micropayment will be made. 854 00:38:08,899 --> 00:38:09,940 This makes perfect sense. 855 00:38:09,940 --> 00:38:11,900 The subscription money is already out there. 856 00:38:11,900 --> 00:38:15,470 >> You all have subscript-- you're paying subscriptions to Spotify, and Netflix, 857 00:38:15,470 --> 00:38:20,010 and Comcast, or Time Warner, or whatever lousy cable company you all 858 00:38:20,010 --> 00:38:21,980 are stuck with. 859 00:38:21,980 --> 00:38:24,830 I know they're lousy, because they're all lousy. 860 00:38:24,830 --> 00:38:28,120 >> In order to get from one Game of Thrones episode 861 00:38:28,120 --> 00:38:32,930 to the next Game of Thrones episode on my Comcast box, it's 21 clicks. 862 00:38:32,930 --> 00:38:34,480 Are you kidding me? 863 00:38:34,480 --> 00:38:36,440 That's an opportunity. 864 00:38:36,440 --> 00:38:40,910 >> So for a cent, maybe you watch World War Z. That's a $0.04 pee. 865 00:38:40,910 --> 00:38:44,880 A three-hour Hobbit movie, that's a $0.28 pee.
866 00:38:44,880 --> 00:38:50,890 The premiere of Star Wars, that pee's $1, man. 867 00:38:50,890 --> 00:38:56,260 >> So now, instead of you and I, the two lone people who got this app, 868 00:38:56,260 --> 00:39:00,340 making him a couple of dollars, he makes money on every single customer. 869 00:39:00,340 --> 00:39:02,930 This is the economy that's coming. 870 00:39:02,930 --> 00:39:11,890 Every single possible customer, when they consume your value, you get paid. 871 00:39:11,890 --> 00:39:16,190 >> Now this, you need to think about very carefully. 872 00:39:16,190 --> 00:39:18,660 Because I think this is the future of making money. 873 00:39:18,660 --> 00:39:24,070 It's your ability to inject value into the ecosystem. 874 00:39:24,070 --> 00:39:28,675 The more the value gets consumed, the more money you're going to make. 875 00:39:28,675 --> 00:39:33,410 >> SPEAKER 6: What about RPGs and buying virtual products? 876 00:39:33,410 --> 00:39:35,410 JAMES WHITTAKER: That's the purchase economy, right? 877 00:39:35,410 --> 00:39:40,510 And that's already taking place, so it's now. 878 00:39:40,510 --> 00:39:42,470 >> Now, what about things? 879 00:39:42,470 --> 00:39:46,590 When these machines really begin to start doing real things for us, 880 00:39:46,590 --> 00:39:49,210 it adds one additional way of making money. 881 00:39:49,210 --> 00:39:50,720 And that's sharing. 882 00:39:50,720 --> 00:39:52,870 Airbnb, we share our houses. 883 00:39:52,870 --> 00:39:54,470 Uber, we share our cars. 884 00:39:54,470 --> 00:39:56,430 >> Do you know there's an Airpnp? 885 00:39:56,430 --> 00:39:59,010 Have you ever been driving, and go, god, I got to go? 886 00:39:59,010 --> 00:40:02,090 Why do all my examples have to do with urination? 887 00:40:02,090 --> 00:40:03,260 God, I got to go, right. 888 00:40:03,260 --> 00:40:07,340 And you're driving by all these houses, with toilets. 889 00:40:07,340 --> 00:40:10,320 It's a buck and a half to go pee at somebody's house.
890 00:40:10,320 --> 00:40:14,650 And then they can rate you, so you can't-- it's got to be clean. 891 00:40:14,650 --> 00:40:19,170 892 00:40:19,170 --> 00:40:20,129 >> So sharing comes about. 893 00:40:20,129 --> 00:40:22,003 In fact, I'm going to do that for my hot tub. 894 00:40:22,003 --> 00:40:25,300 It's going to rent itself out, because it's completely self-maintaining now. 895 00:40:25,300 --> 00:40:27,480 $100 an hour, it's going to rent itself out. 896 00:40:27,480 --> 00:40:31,630 So anyhow, if you're in Woodinville, Washington, next year, 897 00:40:31,630 --> 00:40:33,260 and you need a soak-- 898 00:40:33,260 --> 00:40:35,020 >> [THKK] 899 00:40:35,020 --> 00:40:38,340 >> Now, what about when the machines take over? 900 00:40:38,340 --> 00:40:43,100 Because if you listen to people like Ray Kurzweil, and Bill Gates, and Stephen 901 00:40:43,100 --> 00:40:46,440 Hawking, and Elon Musk, everybody's worried about this thing 902 00:40:46,440 --> 00:40:50,370 they call the singularity, the point at which the machines get so 903 00:40:50,370 --> 00:40:54,420 smart that they don't need us anymore. 904 00:40:54,420 --> 00:40:58,130 >> Now we've got a long way to get there, but on the way, 905 00:40:58,130 --> 00:41:00,800 we are going to become less and less relevant. 906 00:41:00,800 --> 00:41:04,310 You understand that our industry has destroyed a lot of other industries. 907 00:41:04,310 --> 00:41:07,080 The video rental industry, gone. 908 00:41:07,080 --> 00:41:09,180 >> The photography industry, gone. 909 00:41:09,180 --> 00:41:12,080 Eastman Kodak used to employ hundreds of thousands 910 00:41:12,080 --> 00:41:15,020 of people, and millions of ancillary jobs, 911 00:41:15,020 --> 00:41:21,970 taking pictures and developing film, and wedding photographers, all of those 912 00:41:21,970 --> 00:41:26,860 billions of dollars concentrated into the 13 employees of Instagram.
913 00:41:26,860 --> 00:41:34,510 >> We are really good at destroying jobs and making it harder and harder 914 00:41:34,510 --> 00:41:36,610 and harder for people to make money. 915 00:41:36,610 --> 00:41:39,690 So what happens when the machines start doing everything? 916 00:41:39,690 --> 00:41:42,030 What then? 917 00:41:42,030 --> 00:41:43,810 >> Because you know what? 918 00:41:43,810 --> 00:41:46,160 They're going to be better at it than we are. 919 00:41:46,160 --> 00:41:49,570 There is going to come a day, in the very near future, when you will not 920 00:41:49,570 --> 00:41:52,120 get on an airplane if there's a human in the cockpit. 921 00:41:52,120 --> 00:41:53,760 Too dangerous. 922 00:41:53,760 --> 00:41:55,880 I don't know if that person is taking their meds. 923 00:41:55,880 --> 00:41:56,814 No way. 924 00:41:56,814 --> 00:41:59,230 There's going to come a day, not too distant future, where 925 00:41:59,230 --> 00:42:02,570 it will be illegal for a human to drive a car 926 00:42:02,570 --> 00:42:06,232 because the machines are a lot better at it. 927 00:42:06,232 --> 00:42:07,940 They're going to be better at everything. 928 00:42:07,940 --> 00:42:09,120 They're going to be better at building. 929 00:42:09,120 --> 00:42:10,703 They're going to be better at driving. 930 00:42:10,703 --> 00:42:12,720 They're going to be better at paving our roads. 931 00:42:12,720 --> 00:42:15,140 They're going to be better at designing traffic flows. 932 00:42:15,140 --> 00:42:17,431 They're going to be better at designing traffic lights. 933 00:42:17,431 --> 00:42:21,650 They're going to be better than us at all of this. 934 00:42:21,650 --> 00:42:23,360 >> So what do we do? 935 00:42:23,360 --> 00:42:27,280 When the number of jobs that we can do better than machines are, 936 00:42:27,280 --> 00:42:29,806 OK, I'm going to have to take this question, aren't I? 
937 00:42:29,806 --> 00:42:31,847 SPEAKER 6: Machines don't have feelings, and they 938 00:42:31,847 --> 00:42:35,807 have no desire to do certain things, nor can they handle things. 939 00:42:35,807 --> 00:42:38,432 JAMES WHITTAKER: They can't desire if they don't have feelings. 940 00:42:38,432 --> 00:42:39,140 SPEAKER 6: Right. 941 00:42:39,140 --> 00:42:42,768 So they don't handle things like interpersonal interactions, 942 00:42:42,768 --> 00:42:45,310 there's going to be more jobs for people like psychologists-- 943 00:42:45,310 --> 00:42:46,143 JAMES WHITTAKER: OK. 944 00:42:46,143 --> 00:42:47,770 You're a little bit ahead of me, again. 945 00:42:47,770 --> 00:42:49,560 So bear with me. 946 00:42:49,560 --> 00:42:51,680 I'm going to get there, I promise. 947 00:42:51,680 --> 00:42:54,790 >> So what do we do? 948 00:42:54,790 --> 00:42:56,640 Is that what we're left with? 949 00:42:56,640 --> 00:43:00,540 Basically all just the human part? 950 00:43:00,540 --> 00:43:04,700 Where we're philosophers, and we're poets? 951 00:43:04,700 --> 00:43:08,290 Is this why we're legalizing marijuana? 952 00:43:08,290 --> 00:43:12,430 Are we preparing for this day where basically it just, let's 953 00:43:12,430 --> 00:43:15,260 think deep thoughts and get high. 954 00:43:15,260 --> 00:43:17,580 >> Or it could be what you suggest. 955 00:43:17,580 --> 00:43:20,526 It could be that the machines give us back our humanity. 956 00:43:20,526 --> 00:43:23,650 It could be that they take care of all the things that we don't want to do. 957 00:43:23,650 --> 00:43:28,570 So we can get back to being human. 958 00:43:28,570 --> 00:43:33,340 >> But is this singularity going to come? 959 00:43:33,340 --> 00:43:34,670 It might. 960 00:43:34,670 --> 00:43:36,470 Our machines are getting really powerful. 961 00:43:36,470 --> 00:43:37,380 Why are they? 962 00:43:37,380 --> 00:43:40,490 Why is it all of a sudden that we're talking about AI? 
963 00:43:40,490 --> 00:43:43,676 Do you understand what's going into this AI? 964 00:43:43,676 --> 00:43:46,150 I claim it's not AI, at all. 965 00:43:46,150 --> 00:43:48,740 I claim it's just code. 966 00:43:48,740 --> 00:43:49,280 >> Look. 967 00:43:49,280 --> 00:43:53,130 Here's why our computers, our machines, are looking so smart now. 968 00:43:53,130 --> 00:43:55,480 Number one is they've got data they didn't have access to. 969 00:43:55,480 --> 00:43:58,840 Even a few years ago, they didn't have access to this data. 970 00:43:58,840 --> 00:44:02,050 Now, everything is born digitally. 971 00:44:02,050 --> 00:44:05,600 Of course, they have the-- there's just that much more data. 972 00:44:05,600 --> 00:44:08,136 >> Secondly, that data's stored together now. 973 00:44:08,136 --> 00:44:09,260 We don't really have a web. 974 00:44:09,260 --> 00:44:11,100 You all understand the web is already dead. 975 00:44:11,100 --> 00:44:15,910 >> How many of you all have built a web server over the last year? 976 00:44:15,910 --> 00:44:19,580 So this is the highest concentration of web server developers 977 00:44:19,580 --> 00:44:21,420 that will ever exist in a university. 978 00:44:21,420 --> 00:44:24,900 If I had asked this five years ago, every single one of you 979 00:44:24,900 --> 00:44:28,590 all would be building web servers every week. 980 00:44:28,590 --> 00:44:30,274 Because that's what you did. 981 00:44:30,274 --> 00:44:31,940 You configured web servers all the time. 982 00:44:31,940 --> 00:44:34,731 I had six of them under my desk serving different kinds of traffic. 983 00:44:34,731 --> 00:44:36,490 That's not the web we have anymore. 984 00:44:36,490 --> 00:44:38,680 The web we have now is data centers. 985 00:44:38,680 --> 00:44:42,980 All those web servers have migrated into the cloud, into data centers.
986 00:44:42,980 --> 00:44:47,130 And so they're all together, which means we can store contiguous data, 987 00:44:47,130 --> 00:44:52,990 related data in the same place, and offline process the crap out of it. 988 00:44:52,990 --> 00:44:56,030 >> Right now, no one's searching for World Cup data. 989 00:44:56,030 --> 00:44:59,940 And so all-- soccer World Cup, maybe rugby, but rugby is over now, too. 990 00:44:59,940 --> 00:45:03,830 So all that's migrated to back end machines, and now all this stuff-- 991 00:45:03,830 --> 00:45:07,590 >> We can push data that's more popular to different places. 992 00:45:07,590 --> 00:45:09,490 And then offline, we can say, OK, hey, let's 993 00:45:09,490 --> 00:45:11,490 take a look at this World Cup data, and see what 994 00:45:11,490 --> 00:45:14,511 we can figure out, completely offline. 995 00:45:14,511 --> 00:45:17,760 That's why it's looking so much smarter: because it's all sitting together. 996 00:45:17,760 --> 00:45:20,380 And our machines are a lot faster at processing it. 997 00:45:20,380 --> 00:45:25,320 Vast amounts of data, well-organized, and processed at speed. 998 00:45:25,320 --> 00:45:28,570 >> And yet still, at some point, we are going to figure something 999 00:45:28,570 --> 00:45:29,610 out really important. 1000 00:45:29,610 --> 00:45:31,900 And I think it's the Human Brain Project. 1001 00:45:31,900 --> 00:45:33,560 That's what we need to watch. 1002 00:45:33,560 --> 00:45:35,970 Do you all know about the Human Brain Project? 1003 00:45:35,970 --> 00:45:40,150 >> Look, we mapped the human genome with computers no more powerful than this. 1004 00:45:40,150 --> 00:45:42,660 In 1990, the Human Genome Project started. 1005 00:45:42,660 --> 00:45:48,380 We mapped the genome of an arbitrary human being in 13 years using computers 1006 00:45:48,380 --> 00:45:50,900 no more powerful than this. 1007 00:45:50,900 --> 00:45:54,120 >> Now we have computers way more powerful than this.
1008 00:45:54,120 --> 00:45:57,160 And we started mapping the brain two years ago. 1009 00:45:57,160 --> 00:45:59,990 This is where our machines are going to help. 1010 00:45:59,990 --> 00:46:03,370 Our machines allow us to do this. 1011 00:46:03,370 --> 00:46:07,020 We don't map the brain without our machines. 1012 00:46:07,020 --> 00:46:09,360 And what are we going to discover? 1013 00:46:09,360 --> 00:46:13,260 >> What have we already discovered in just two short years? 1014 00:46:13,260 --> 00:46:15,610 Two years into the Human Brain Project, we're already 1015 00:46:15,610 --> 00:46:17,860 hitting milestones set for year eight. 1016 00:46:17,860 --> 00:46:19,860 We know what depression looks like. 1017 00:46:19,860 --> 00:46:21,940 We know what anxiety looks like. 1018 00:46:21,940 --> 00:46:23,930 We know what bipolar disorder looks like. 1019 00:46:23,930 --> 00:46:25,890 We know what autism looks like. 1020 00:46:25,890 --> 00:46:28,190 >> All of these things are already mapped out. 1021 00:46:28,190 --> 00:46:31,320 How much longer until our machines look at this and say, 1022 00:46:31,320 --> 00:46:33,840 I know how to cure anxiety? 1023 00:46:33,840 --> 00:46:35,680 I know how to cure depression. 1024 00:46:35,680 --> 00:46:38,590 Our machines are going to do this. 1025 00:46:38,590 --> 00:46:41,230 And it's going to be stunning. 1026 00:46:41,230 --> 00:46:45,000 >> So I think our machines are going to allow 1027 00:46:45,000 --> 00:46:48,430 us to do what we were meant to do. 1028 00:46:48,430 --> 00:46:51,280 This, I think, is the fundamental purpose of human beings: 1029 00:46:51,280 --> 00:46:56,980 to explore, to terraform other worlds, to reach other solar systems, 1030 00:46:56,980 --> 00:47:03,520 to find other life, to figure out whether Ancient Aliens really is true. 1031 00:47:03,520 --> 00:47:07,800 >> And we are going to slowly solve every single mystery of mankind. 1032 00:47:07,800 --> 00:47:09,930 This is what you all are going to do.
1033 00:47:09,930 --> 00:47:12,170 This is what your children are going to do. 1034 00:47:12,170 --> 00:47:19,460 And over the decades, we will exhaust every single mystery on the planet, 1035 00:47:19,460 --> 00:47:21,110 and on other planets. 1036 00:47:21,110 --> 00:47:23,580 And then what? 1037 00:47:23,580 --> 00:47:25,580 This is the thought I'm going to leave you with. 1038 00:47:25,580 --> 00:47:27,710 Then we'll have questions. 1039 00:47:27,710 --> 00:47:33,990 Perhaps, just perhaps, using these magic machines, 1040 00:47:33,990 --> 00:47:38,410 the power of our minds amplified by these magic machines, 1041 00:47:38,410 --> 00:47:42,860 we'll discover that we weren't meant to go to heaven at all. 1042 00:47:42,860 --> 00:47:47,010 >> But through technology, to create heaven for ourselves. 1043 00:47:47,010 --> 00:47:49,720 Perhaps, just perhaps, the meaning of life 1044 00:47:49,720 --> 00:47:52,860 isn't given to us by a higher power. 1045 00:47:52,860 --> 00:48:01,060 Perhaps, we use our technology to evolve into that higher power. 1046 00:48:01,060 --> 00:48:08,690 >> God, it is said, created us in his own image. 1047 00:48:08,690 --> 00:48:16,460 Maybe, through these magic machines, we create God in ours. 1048 00:48:16,460 --> 00:48:17,890 >> My name is James Whittaker. 1049 00:48:17,890 --> 00:48:19,450 I work for Microsoft. 1050 00:48:19,450 --> 00:48:20,097 Thank you. 1051 00:48:20,097 --> 00:48:27,520 1052 00:48:27,520 --> 00:48:29,130 Follow me on Twitter, if you'd like. 1053 00:48:29,130 --> 00:48:32,780 The transcript of this is on medium.com/@docjamesw. 1054 00:48:32,780 --> 00:48:34,232 >> And I'll take questions. 1055 00:48:34,232 --> 00:48:35,940 SPEAKER 7: So two questions, first thing. 1056 00:48:35,940 --> 00:48:36,780 When we're talking about-- 1057 00:48:36,780 --> 00:48:39,363 >> JAMES WHITTAKER: Oh, you're getting a free question here, huh? 1058 00:48:39,363 --> 00:48:42,710 SPEAKER 7: --the App Store transforming everything.
1059 00:48:42,710 --> 00:48:47,730 But Ubuntu had an app store that looked exactly like the one from Apple. 1060 00:48:47,730 --> 00:48:49,160 It was just for desktop. 1061 00:48:49,160 --> 00:48:51,577 Why is it that the App Store for mobile is that life-changing? 1062 00:48:51,577 --> 00:48:54,701 JAMES WHITTAKER: Because mobile, that's where all the users were. 1063 00:48:54,701 --> 00:48:56,220 Now there's a second piece to this. 1064 00:48:56,220 --> 00:49:00,920 The second piece to this is Steve Jobs, and his amazing storytelling ability. 1065 00:49:00,920 --> 00:49:04,180 >> SPEAKER 7: Because technically, the Ubuntu app store 1066 00:49:04,180 --> 00:49:09,010 already solved the problem of I have an app for-- 1067 00:49:09,010 --> 00:49:11,830 >> JAMES WHITTAKER: But there were eight people with Ubuntu machines. 1068 00:49:11,830 --> 00:49:13,890 And there were 200 million people with-- 1069 00:49:13,890 --> 00:49:14,530 >> SPEAKER 7: And that's the Steve Jobs. 1070 00:49:14,530 --> 00:49:16,940 >> JAMES WHITTAKER: --iPhones, and that's Steve Jobs, right. 1071 00:49:16,940 --> 00:49:21,050 But it was the developers, the developers building functionality. 1072 00:49:21,050 --> 00:49:23,960 That's why you all have iPhones, and not the others, 1073 00:49:23,960 --> 00:49:26,480 because they've got the best app store. 1074 00:49:26,480 --> 00:49:29,007 What was your second one? 1075 00:49:29,007 --> 00:49:30,500 >> SPEAKER 7: About the future. 1076 00:49:30,500 --> 00:49:33,350 I always wonder why people talk so much about the internet of things 1077 00:49:33,350 --> 00:49:36,520 right now, because we basically have the technology, 1078 00:49:36,520 --> 00:49:40,020 we had the technology to do most of this stuff like 10 years, 1079 00:49:40,020 --> 00:49:41,420 or even 20 years ago. 1080 00:49:41,420 --> 00:49:43,360 >> JAMES WHITTAKER: Ah, I don't know. 1081 00:49:43,360 --> 00:49:46,500 We had the sensors, but the data-- we didn't have the data.
1082 00:49:46,500 --> 00:49:48,490 There wasn't anything useful for them to do. 1083 00:49:48,490 --> 00:49:50,040 And we didn't have the connectivity. 1084 00:49:50,040 --> 00:49:52,530 Bandwidth is almost free now. 1085 00:49:52,530 --> 00:49:53,030 So. 1086 00:49:53,030 --> 00:49:55,170 >> SPEAKER 7: Does the data bring that much into it? 1087 00:49:55,170 --> 00:49:56,878 >> JAMES WHITTAKER: But it's 10 year cycles. 1088 00:49:56,878 --> 00:50:00,350 1089 00:50:00,350 --> 00:50:03,410 Everything that you all are using right now is 10 years old. 1090 00:50:03,410 --> 00:50:07,840 And what's going to be really big in 10 years has already been invented. 1091 00:50:07,840 --> 00:50:10,720 That's why the cloud is underpinning all of this. 1092 00:50:10,720 --> 00:50:13,350 And the cloud was invented in 2007. 1093 00:50:13,350 --> 00:50:19,101 So it takes a while for the world to catch up with the technology. 1094 00:50:19,101 --> 00:50:19,600 Yes, ma'am. 1095 00:50:19,600 --> 00:50:24,173 >> SPEAKER 6: So, the Brain Project, there are those people 1096 00:50:24,173 --> 00:50:28,542 in the field of psychology, who feel that neuropsychology may turn out 1097 00:50:28,542 --> 00:50:31,850 to be nothing more than phrenology. 1098 00:50:31,850 --> 00:50:37,240 How do you think-- is it possible to quantify and create 1099 00:50:37,240 --> 00:50:42,180 algorithms to understand a state of consciousness and intent, 1100 00:50:42,180 --> 00:50:44,530 when we don't understand what those things are? 1101 00:50:44,530 --> 00:50:48,310 >> JAMES WHITTAKER: So I actually think that the machines are never 1102 00:50:48,310 --> 00:50:49,520 going to catch up to us. 1103 00:50:49,520 --> 00:50:53,140 My opinion is that gray matter will end up 1104 00:50:53,140 --> 00:50:58,300 being triumphant over germanium, silicon, 1105 00:50:58,300 --> 00:51:00,252 carbon nanotubes, and graphene. 1106 00:51:00,252 --> 00:51:02,960 I think there's something going on up here that's really special.
1107 00:51:02,960 --> 00:51:04,660 I do think we're going to figure it out. 1108 00:51:04,660 --> 00:51:08,690 I'm not sure we're going to figure out how to build it. 1109 00:51:08,690 --> 00:51:10,465 Yes, sir. 1110 00:51:10,465 --> 00:51:11,882 >> SPEAKER 8: Who governs this? 1111 00:51:11,882 --> 00:51:14,840 If we get to a point where the machines and our software are making all 1112 00:51:14,840 --> 00:51:21,100 of these momentous decisions, does that mean that at some point we-- 1113 00:51:21,100 --> 00:51:23,850 in our world where governance isn't important anymore, and Google, 1114 00:51:23,850 --> 00:51:25,830 Microsoft and international corporations-- 1115 00:51:25,830 --> 00:51:28,580 >> JAMES WHITTAKER: That's why people are signing all these petitions 1116 00:51:28,580 --> 00:51:34,210 to create rules about never-- we've already programmed machines to kill. 1117 00:51:34,210 --> 00:51:36,620 We've done that, and the machines have independently 1118 00:51:36,620 --> 00:51:40,290 killed people, because they are obeying their programming. 1119 00:51:40,290 --> 00:51:43,380 >> And so, we can program them to do anything we want. 1120 00:51:43,380 --> 00:51:45,640 Where are those laws going to come from? 1121 00:51:45,640 --> 00:51:48,480 Do people trust-- the companies at the forefront of this 1122 00:51:48,480 --> 00:51:58,560 are Microsoft, Google, IBM, even Cisco, Tesla. 1123 00:51:58,560 --> 00:52:00,330 Who are we going to trust? 1124 00:52:00,330 --> 00:52:02,190 >> I mean, this is a societal-level discussion 1125 00:52:02,190 --> 00:52:04,140 that really needs to take place. 1126 00:52:04,140 --> 00:52:08,880 I don't have an answer for it, because I don't think there is one yet. 1127 00:52:08,880 --> 00:52:12,620 Donald Trump, Vladimir Putin, they'll figure it out. 1128 00:52:12,620 --> 00:52:17,470 >> [LAUGHING] 1129 00:52:17,470 --> 00:52:18,930 >> I love blue states. 1130 00:52:18,930 --> 00:52:20,740 We can laugh at things like that.
1131 00:52:20,740 --> 00:52:25,190 Wow, I have to be careful when I talk in red states. 1132 00:52:25,190 --> 00:52:26,980 Any more questions? 1133 00:52:26,980 --> 00:52:28,320 Yes, sir. 1134 00:52:28,320 --> 00:52:31,390 >> SPEAKER 7: So when you were talking about things, 1135 00:52:31,390 --> 00:52:34,540 you know, if you went to the movie theater, 1136 00:52:34,540 --> 00:52:37,200 and you used the bathroom, that a payment would already happen. 1137 00:52:37,200 --> 00:52:38,230 >> JAMES WHITTAKER: Yeah. 1138 00:52:38,230 --> 00:52:40,760 Money is going to be moving around, so that micropayment, 1139 00:52:40,760 --> 00:52:43,440 I am 100% sure that that's going to happen. 1140 00:52:43,440 --> 00:52:44,690 It has to. 1141 00:52:44,690 --> 00:52:46,010 The App Store is breaking down. 1142 00:52:46,010 --> 00:52:48,050 No one's making money in the App Store. 1143 00:52:48,050 --> 00:52:52,780 And whenever no one makes money, that's when change happens. 1144 00:52:52,780 --> 00:52:55,727 >> It was harder to make-- easier to make money on web than Windows. 1145 00:52:55,727 --> 00:52:57,060 That's when the transition went. 1146 00:52:57,060 --> 00:52:59,393 When it was easier to make money on mobile than the web, 1147 00:52:59,393 --> 00:53:00,902 that's when the transition went. 1148 00:53:00,902 --> 00:53:03,360 When it's easier to make money in the cloud than on mobile, 1149 00:53:03,360 --> 00:53:05,310 that's going to be the transition. 1150 00:53:05,310 --> 00:53:07,140 It's capitalism. 1151 00:53:07,140 --> 00:53:08,830 Did I get all your question? 1152 00:53:08,830 --> 00:53:10,800 >> SPEAKER 7: So actually it was more about-- does 1153 00:53:10,800 --> 00:53:13,216 that mean you see the industry becoming more consolidated? 1154 00:53:13,216 --> 00:53:16,685 1155 00:53:16,685 --> 00:53:17,957 >> JAMES WHITTAKER: At first. 1156 00:53:17,957 --> 00:53:19,290 But it's going to be a reaction. 
1157 00:53:19,290 --> 00:53:22,299 My prediction is, actually, that it's going to be individuals. 1158 00:53:22,299 --> 00:53:24,340 I don't know how long the big companies are going 1159 00:53:24,340 --> 00:53:26,730 to last because we won't need them. 1160 00:53:26,730 --> 00:53:29,880 >> When infrastructure is free, storage is free. 1161 00:53:29,880 --> 00:53:30,380 Yeah. 1162 00:53:30,380 --> 00:53:33,840 If you all need to go, you won't hurt my feelings by filing out. 1163 00:53:33,840 --> 00:53:36,090 But when storage is free, when infrastructure is free, 1164 00:53:36,090 --> 00:53:38,520 when networks are free, when communication is free, 1165 00:53:38,520 --> 00:53:42,050 there's no advantage to the big conglomerates anymore. 1166 00:53:42,050 --> 00:53:43,320 So it's an individual thing. 1167 00:53:43,320 --> 00:53:45,153 >> It's your ability to code, it's your ability 1168 00:53:45,153 --> 00:53:48,240 to code, it's her ability to code, and inject value into the ecosystem. 1169 00:53:48,240 --> 00:53:52,126 Because the infrastructure is going to be free. 1170 00:53:52,126 --> 00:53:53,875 It could be a great age of the individual. 1171 00:53:53,875 --> 00:53:56,450 1172 00:53:56,450 --> 00:53:57,056 Yes, sir. 1173 00:53:57,056 --> 00:54:01,190 So did you have the grand scale of everything 1174 00:54:01,190 --> 00:54:04,094 you covered, but I just had a question about what 1175 00:54:04,094 --> 00:54:08,378 your vision for [INAUDIBLE], how to interact with [INAUDIBLE] directly 1176 00:54:08,378 --> 00:54:11,300 everything is totally taken care of. 1177 00:54:11,300 --> 00:54:13,460 So you talked about [INAUDIBLE] you'd have 1178 00:54:13,460 --> 00:54:15,530 Word doc and [INAUDIBLE] over here. 
1179 00:54:15,530 --> 00:54:20,530 And so one thing I found about physical [INAUDIBLE] systems 1180 00:54:20,530 --> 00:54:22,911 is that some things are evolving, like touchscreens 1181 00:54:22,911 --> 00:54:25,202 are definitely getting way better than they used to be. 1182 00:54:25,202 --> 00:54:27,243 Mice are getting way better than they used to be. 1183 00:54:27,243 --> 00:54:30,986 But the physical keyboard is actually pretty amazing. 1184 00:54:30,986 --> 00:54:33,967 And it's [INAUDIBLE] 1185 00:54:33,967 --> 00:54:36,550 JAMES WHITTAKER: Is it, or is it-- so I don't know the answer, 1186 00:54:36,550 --> 00:54:39,380 but I want you to not get lost in your opinions. 1187 00:54:39,380 --> 00:54:41,640 Question that. 1188 00:54:41,640 --> 00:54:44,300 Is it just your familiarity with a physical keyboard 1189 00:54:44,300 --> 00:54:46,090 that's creating this affinity to it? 1190 00:54:46,090 --> 00:54:50,300 Or is it really something that is-- because we're very limited by it. 1191 00:54:50,300 --> 00:54:53,580 This QWERTY keyboard is actually the wrong way to type fast. 1192 00:54:53,580 --> 00:54:56,210 It artificially slows us down, because the old mechanical typewriters 1193 00:54:56,210 --> 00:54:57,980 needed to slow typists down. 1194 00:54:57,980 --> 00:54:58,480 Right? 1195 00:54:58,480 --> 00:54:59,680 I don't know. 1196 00:54:59,680 --> 00:55:01,950 >> I think, I believe that the machines are going 1197 00:55:01,950 --> 00:55:05,200 to be able to figure out intent really easily. 1198 00:55:05,200 --> 00:55:07,080 If you're a writer, you're going to need it. 1199 00:55:07,080 --> 00:55:08,070 Right? 1200 00:55:08,070 --> 00:55:11,752 The machines, those input devices, are going to be for the creatives.
1201 00:55:11,752 --> 00:55:14,960 That's how we're going to know who are our artists, who are our creators, who 1202 00:55:14,960 --> 00:55:19,260 are our designers, and who are just the muggles, who go through the world not 1203 00:55:19,260 --> 00:55:21,480 needing their machines. 1204 00:55:21,480 --> 00:55:24,700 1205 00:55:24,700 --> 00:55:27,910 >> Keep thinking about this, and be careful of your biases. 1206 00:55:27,910 --> 00:55:29,560 We all inject biases into this. 1207 00:55:29,560 --> 00:55:30,936 I've got biases injected into this. 1208 00:55:30,936 --> 00:55:33,435 That's why I warned you, this is not what's going to happen, 1209 00:55:33,435 --> 00:55:35,190 it's my opinion of what's going to happen. 1210 00:55:35,190 --> 00:55:39,480 I want your all's takeaway to be that you walk out of here thinking, what's next? 1211 00:55:39,480 --> 00:55:40,177 What's next? 1212 00:55:40,177 --> 00:55:43,260 I want you cussing me in the middle of the night when you wake up and say, 1213 00:55:43,260 --> 00:55:43,810 what's next? 1214 00:55:43,810 --> 00:55:45,270 Damn James. 1215 00:55:45,270 --> 00:55:47,210 Asshole. 1216 00:55:47,210 --> 00:55:48,030 What's next? 1217 00:55:48,030 --> 00:55:49,980 Careful of your biases. 1218 00:55:49,980 --> 00:55:51,600 >> Are there more questions? 1219 00:55:51,600 --> 00:55:52,250 Yes, sir. 1220 00:55:52,250 --> 00:55:55,798 >> SPEAKER 7: Earlier you talked about machines being able to predict intent 1221 00:55:55,798 --> 00:55:56,760 and what people want. 1222 00:55:56,760 --> 00:56:01,120 What happens-- will they be able to know when people want to change their minds 1223 00:56:01,120 --> 00:56:02,910 or try something new? 1224 00:56:02,910 --> 00:56:05,540 Let's say your fridge is going to order your groceries for you. 1225 00:56:05,540 --> 00:56:07,884 What if one day I want to try coconut milk ice cream?
1226 00:56:07,884 --> 00:56:09,800 JAMES WHITTAKER: I remember my mother-in-law-- 1227 00:56:09,800 --> 00:56:12,290 I remember tasting flavored coffee one time 1228 00:56:12,290 --> 00:56:13,960 and thought, um, that's really good. 1229 00:56:13,960 --> 00:56:16,930 And every time we went to visit her, she had flavored coffee, 1230 00:56:16,930 --> 00:56:19,620 and I'm like, how do I tell her? 1231 00:56:19,620 --> 00:56:23,290 So I mean, we don't have to worry about hurting a machine's feelings. 1232 00:56:23,290 --> 00:56:25,250 And so you just-- 1233 00:56:25,250 --> 00:56:26,190 >> SPEAKER 7: Tell it. 1234 00:56:26,190 --> 00:56:27,106 >> JAMES WHITTAKER: Yeah. 1235 00:56:27,106 --> 00:56:27,690 Tell it. 1236 00:56:27,690 --> 00:56:28,540 Speak to it. 1237 00:56:28,540 --> 00:56:30,301 Or whatever. 1238 00:56:30,301 --> 00:56:33,050 The interfaces-- I think there's going to be a time, a phase, where 1239 00:56:33,050 --> 00:56:34,662 we are going to be discovering this. 1240 00:56:34,662 --> 00:56:37,870 It seems to me that, you know, you said touch screens are really getting good. 1241 00:56:37,870 --> 00:56:39,960 To me that means they're doomed. 1242 00:56:39,960 --> 00:56:43,790 Because as soon as we perfect technology, we don't use it anymore. 1243 00:56:43,790 --> 00:56:46,550 >> PCs got really good, and we're like, oh, sorry. 1244 00:56:46,550 --> 00:56:48,584 We're cheating on our PCs now with these. 1245 00:56:48,584 --> 00:56:51,750 These are going to get really good, and it's going to go into something else. 1246 00:56:51,750 --> 00:56:52,400 >> 10 years. 1247 00:56:52,400 --> 00:56:53,830 10 years, seriously. 1248 00:56:53,830 --> 00:56:54,910 10 year cycles. 1249 00:56:54,910 --> 00:56:55,720 Think about that. 1250 00:56:55,720 --> 00:56:58,600 It is as reliable and as accurate as Moore's law. 1251 00:56:58,600 --> 00:57:01,960 1252 00:57:01,960 --> 00:57:03,875 >> Anybody who hasn't asked a question yet?
1253 00:57:03,875 --> 00:57:07,340 1254 00:57:07,340 --> 00:57:12,970 >> So I also teach a course on storytelling, and on creativity, 1255 00:57:12,970 --> 00:57:15,800 the brain science of how to be creative. 1256 00:57:15,800 --> 00:57:19,440 Neuropsychology is a really, really cool topic. 1257 00:57:19,440 --> 00:57:22,970 And I've decoded creativity, and I know what it takes to be creative. 1258 00:57:22,970 --> 00:57:25,920 >> So if you all enjoyed this, I can come back sometime 1259 00:57:25,920 --> 00:57:28,810 and teach one of the other seminars. 1260 00:57:28,810 --> 00:57:32,820 And the storytelling class actually tells six stories and deconstructs them. 1261 00:57:32,820 --> 00:57:36,440 If you go to docs.com, you can see me doing this with a high school. 1262 00:57:36,440 --> 00:57:40,790 I give high school commencement speeches throughout the state of Washington. 1263 00:57:40,790 --> 00:57:42,970 >> And I give this speech, and then deconstruct it, 1264 00:57:42,970 --> 00:57:47,960 and talk about why you all have been sitting here paying attention. 1265 00:57:47,960 --> 00:57:51,020 When most of the time you sit there and stare at a PowerPoint, 1266 00:57:51,020 --> 00:57:54,140 you're bored to fucking tears, right? 1267 00:57:54,140 --> 00:57:57,360 >> There is actually a method to the madness. 1268 00:57:57,360 --> 00:57:59,111 And I tell a great story about Larry Page. 1269 00:57:59,111 --> 00:58:00,610 SPEAKER 6: Can we come to workshops? 1270 00:58:00,610 --> 00:58:01,736 Or do you just do seminars? 1271 00:58:01,736 --> 00:58:04,151 JAMES WHITTAKER: Well, I teach these monthly at Microsoft. 1272 00:58:04,151 --> 00:58:04,820 All of them-- 1273 00:58:04,820 --> 00:58:06,320 >> SPEAKER 6: But for creative writing, 1274 00:58:06,320 --> 00:58:07,444 JAMES WHITTAKER: Oh, right. 1275 00:58:07,444 --> 00:58:08,220 Got it, no. 1276 00:58:08,220 --> 00:58:10,270 I give homework.
1277 00:58:10,270 --> 00:58:14,130 And so, at Microsoft, the way I do it is I take classes and chop them 1278 00:58:14,130 --> 00:58:17,310 into four bits, and give homework, and then we talk about the homework 1279 00:58:17,310 --> 00:58:20,050 afterwards. 1280 00:58:20,050 --> 00:58:21,830 It would be great. 1281 00:58:21,830 --> 00:58:23,700 So maybe we could do one together. 1282 00:58:23,700 --> 00:58:25,490 >> I just hate running workshops. 1283 00:58:25,490 --> 00:58:27,270 So I don't do it, because I don't like it. 1284 00:58:27,270 --> 00:58:29,311 Not that it's not important, it's very important. 1285 00:58:29,311 --> 00:58:31,369 I just don't like it. 1286 00:58:31,369 --> 00:58:32,160 Any more questions? 1287 00:58:32,160 --> 00:58:36,060 1288 00:58:36,060 --> 00:58:37,610 Peace. 1289 00:58:37,610 --> 00:58:39,052