Year in Review 2023 Behind the Tech with Kevin Scott


KEVIN SCOTT: Hi, everyone. Welcome to Behind the Tech. I'm your host, Kevin Scott, Chief Technology Officer for Microsoft. In this podcast, we're going to get Behind the Tech. We'll talk with some of the people who have made our modern tech world possible and understand what motivated them to create what they did. So join me to maybe learn a little bit about the history of computing and get a few behind the scenes insights into what's happening today. Stick around.

CHRISTINA WARREN: Hello, and welcome to Behind the Tech. I'm Senior Developer Advocate at GitHub, Christina Warren.

KEVIN SCOTT: And I'm Kevin Scott.

CHRISTINA WARREN: Here we are at the end of 2023 already, and what a huge year it's been in our world.

It's hard to believe how much has changed this past year, what with, like, the widespread adoption and explosion of AI into the world.

KEVIN SCOTT: It has been a crazy and big year. I feel like we're living in AI dog years or something; I just can't even believe how much has happened in the past six months, much less the whole year. In the past 12 months, we've had ChatGPT release, we've had GPT-4 release.

Then we've had this crazy explosion of generative AI activity across the board from Microsoft and OpenAI and a whole bunch of other companies. It feels very much like a couple of earlier moments in my career, with the PC revolution or the Internet revolution or the smartphone revolution, where a big platform thing is happening and everybody's excited about it. The smartest and most ambitious people are putting all of their energy into figuring out what this all means for them, like what useful things they're going to try to go make for other people on top of things that are newly possible. It's a super exciting moment.

CHRISTINA WARREN: It's an extremely exciting moment.

I've been describing this to people for basically a year now as, like, this is another iPhone moment, and that only becomes more and more true, I think, all the time. On the podcast this year, it's been really exciting to hear folks talk about, as you say, what's been happening now and the stories behind the years and years of work and research that have gotten us to this place. Because of course, it feels like it is all happening right now, but this is the buildup of a lot of hard work.

Back in the spring, you got to talk to Bill Gates, who obviously is a huge force. I think it's hard to express just how huge a force in technology he has been for decades.

KEVIN SCOTT: Yeah, for sure. Obviously, Bill was one of the accelerators for the personal computing revolution.

He played maybe the most pivotal role there in founding Microsoft in the first place and pushing the personal computing revolution forward, and played a tremendously important role in the Internet revolution, honestly. It was really great to spend some time chatting with him about AI and his perspective on what this moment means, like how it is similar to some of the things that he's seen in the past. Obviously, Bill has also been thinking about AI for his entire career.

For all of us who have an interest in technology, seeing a thing that has gone from promising, but not moving as fast as all of us ambitious people would like, to actually making some pretty big breakthroughs has been, I think, just as exciting for him as it has been for all of us.

CHRISTINA WARREN: Let's take a listen to that conversation.

KEVIN SCOTT: I wonder what your advice might be to people who are thinking about like, oh, I have this new technology that's amazing that I can now use. How should they be thinking about how to use it? How should they be thinking about the urgency with which they are pursuing these new ideas? How does that relate to how you thought about things in the PC era and the Internet era?

BILL GATES: The industry starts really small, where computers aren't personal. Then through the microprocessor and a bunch of companies, we get the personal computer, IBM, Apple, and Microsoft got to be very involved in the software. Even the BASIC interpreter on the Apple II, very obscure fact, was something that I did for Apple.

That idea that, wow, this is a tool, at least for editing documents where you have to do all the writing, that was pretty amazing. Then connecting those up over the Internet was amazing. Then moving the computation into the mobile phone was absolutely amazing. Once you get the PC, the Internet, the software industry, and the mobile phone, the digital world is changing huge parts of our activities.

I was just in India seeing how they do payments digitally even for government programs. It's an amazing application of that world to help people who never would have bank accounts because the fees are just too high, it's too complicated. We continue to benefit from that foundation.

I do view this, the beginning of computers that read and write, as every bit as profound as any one of those steps, and a little bit surprising, because robotics has gone a little slower than I would have expected. I don't mean autonomous driving; I think that's a special case that's particularly hard because of the open-ended environment and the difficulty of safety and what safety bar people will bring to that. But even factories, where you actually have huge control over the environment of what's going on and you can make sure that no kids are running around anywhere near that factory. A little bit, people have been saying, okay, these guys can overpredict, which is certainly correct. But here's a case where we underpredicted natural language and the computer's ability to deal with that, and how that affects white-collar jobs including sales, service, helping a doctor think through what to put in your health record. That, I thought, was many years off.

All the AI books, even when they talk about things that might get a lot more productive, will turn out to be wrong. Because we're just at the start of this, you could almost call it a mania, like the Internet mania. But the Internet mania, although it had its insanities and things that, I don't know, sock puppets or things where you look back and say, 'What were we thinking?' It was a very profound tool that now we take for granted. Even just for scientific discovery during the pandemic, the utility of the immediate sharing that took place there was just phenomenal. This is as big a breakthrough, a milestone, as I've seen in the whole digital computer realm, which really starts when I'm quite young.

KEVIN SCOTT: I'm going to say this to you and I'm just interested in your reaction, because you will always tell me when an idea is dumb. But one of the things that I've been thinking for the last handful of years is that one of the big changes that's happening because of this technology is that for 180 years, from the point that Ada Lovelace wrote the first program to harness the power of a digital machine up until today, the way that you get a digital machine to do work for you is you either have to be a skilled programmer, which is a barrier to entry, that's not easy, or you have to have a skilled programmer anticipate the needs of the user and build a piece of software that you can then use to get the machine to do something for you. This may be the point where we get that paradigm to change a little bit, where, because you have this natural language interface and these AIs can write code and will be able to actuate a whole bunch of services and systems, we give ordinary people the ability to get very complicated things done with machines without having to have all of this expertise that you and I spent many years building.

BILL GATES: No, absolutely. Every advance hopefully lowers the bar in terms of who can easily take advantage of it.

The spreadsheet was an example of that, because even though you still have to understand these formulas, you really didn't have to understand logic or symbols much at all. It had the input and the output so closely connected in this grid structure that you didn't think about the separation of those two. That's kind of limiting in a way to a super abstract thinker, but it was so powerful in terms of the directness, so that didn't come out right, let me change it. Here, there's a whole class of programs of taking corporate data and presenting it or doing complex queries against it: "Have there been any sales offices where we've had 20 percent of the headcount missing? Are our sales results affected by that?" Now, you don't have to go to the IT department and wait in line and have them tell you, "Oh, that's too hard," or something. Most of these corporate learning things, whether it's a query or report, or even a simple workflow where if something happens, you want to trigger an activity, the description in English will be the program.

When you want it to do something extra, you'll just pull up that English, or whatever your language is, and type that in. There's a whole layer of query assistance and programming that will be accessible to any employee. The same thing is true of, "OK, I'm somewhere in the college application process and I want to know: what's my next step and what's the threshold for these things?" It's so opaque today, so empowering people to go directly and interact.

That is the theme that this is trying to enable.

CHRISTINA WARREN: It was incredible to get to talk with Bill not only about the specifics of how all of this stuff works, but also to get a sense of the big picture for this historic moment that we're in, as you alluded to, Kevin.

KEVIN SCOTT: It has been a very long and winding road to get to where we are right now.

You said it earlier: this may feel like all of a sudden it broke through and it was very abrupt, but it's because of decades and decades of work that just an untold number of people have been doing. It has been really interesting working with the folks at OpenAI. We had this incredibly interesting conversation with Mira Murati, who is OpenAI's CTO, about her perspective, sitting effectively in the hot seat of developing one of the most interesting pieces of technology that's resulted in this breakthrough moment that we've had this past year.

CHRISTINA WARREN: For sure, let's take a listen.

MIRA MURATI: The first time that we thought about deploying these models that were just in research territory, it was kind of this insane idea.

It wasn't normal back then to go deploy a large language model in the real world. What is the business case? What is it actually going to do for people? What problems is it going to solve? We didn't really have those answers. But we thought, if we make it accessible in such a way that it's easy to use and it is cheap to use, it is highly optimized, you don't need to know all the bells and whistles of machine learning, just accessible, then maybe people's creativity would just bring to life new products and solutions and we'd see how this technology could help us in the real world. Of course, we had a hypothesis, but really it was putting GPT-3 in the API that was the first time that we saw people interact with these large language models and the technology that we were building, which for so many years we'd just been building in the lab without this real-world contact and feedback from people out there.

That was the first time. It was this leap of faith that it was going to teach us something. We were going to learn something from it and hopefully, we could feed it back into the technology. We could bring back that knowledge, that feedback, and figure out how to use it to make the technology better, more reliable, more aligned, safer, more robust when it eventually gets deployed in the real world. I always believed that you can't just build this powerful technology in the lab with no contact with reality, and hope that somehow it's going to go well, and that it's going to be safe and beneficial for all.

Somehow you do need to figure out how to bring society along, both in gathering that feedback and insight, but also in adjusting society to this change. The best way to do that is for people to actually interact with the technology and see for themselves, instead of telling them or just sharing scientific papers. That was very important, and it took us then a couple of years to get to the point where we were not just releasing improvements to the model through the API. In fact, the first interface that was more consumer facing that we played around with was DALL-E Labs, where people could just input a prompt in natural language and then you'd see these beautiful, original, amazing images come up. Really for research reasons, we were experimenting with this interface of dialogue where you go back and forth with the model in ChatGPT.

Dialogue is such a powerful tool. The idea of Socratic dialogue and how people learn: you can correct one another or ask questions, and really get to a deeper truth. We thought if we put this out there, even with the existing models, we will learn a lot. We will get a lot of feedback and we can use this feedback to actually make our upcoming model, which at the time was GPT-4, safer and more aligned. That was the motivation, and of course, as we saw in just a few days, it became super popular and people just loved interacting with this AI system.

CHRISTINA WARREN: When we talk about AI and we think about this stuff, it has been so powerful and so public, the way I think that the whole world has been getting to learn about and experience the challenges of this new wave of AI. I feel like we are in a Before ChatGPT, After ChatGPT world, and that's really been evident this year.

KEVIN SCOTT: We had a really great opportunity on the podcast and, you know, me personally, in my professional life, to have a bunch of really important conversations with Kevin Roose from The New York Times. I think journalism has this really, really important role to play in helping frame what's happening right now. We're doing part of the job, which is developing the technology and trying to responsibly get it deployed out to the public.

But that's a tiny little part of the job of figuring out how to get AI into the hands of the public in a reasonable way. I've been super excited to have those conversations with journalists this year.

CHRISTINA WARREN: This conversation that you had with Kevin is actually one of my favorites this year. A, because he's a former colleague, and B, because I really do think, as you say, journalists play a really important role in helping bridge the understanding gap between what it is that is being created, and all the things that we're working so hard on, and how the general public is going to be able to understand and synthesize what all this means for them.

KEVIN SCOTT: For me personally, it also serves this really important role of making sure that you're not getting trapped in your own little bubble.

If you sit around and all you do is talk to your fellow technologists all day long, you can get convinced of some pretty wonky things. It's really useful to have journalists and academics and policymakers and, like, my mom in rural central Virginia able to knock you out of your bubble-think.

CHRISTINA WARREN: No, I think you're exactly right, and I think that that's how we make sure that this is stuff that is not just useful in theory and, as you say, in our own bubbles, but is actually something that could be world changing. Great conversation here.

KEVIN ROOSE: We are building some of the... I say we, I mean you essentially and your peers in the tech industry are building some of the most powerful technology ever created. I think without the media there just wouldn't be a countervailing, I don't know if it's a force on the minds of the people building that technology or just a caution around the technology, but I'll give you an example of what I mean. You and I had this now infamous encounter back in February, where you guys had just released Bing with what we now know was GPT-4 running inside of it. I had this insane conversation with Bing Chat, AKA Sydney, please don't hurt me, Sydney. It went on the front page, went totally viral, blew up.

I'm sure it was in your inbox, my inbox, everyone's inbox for a month.

KEVIN SCOTT: Everybody's inbox.

KEVIN ROOSE: Just in a nutshell, if people aren't familiar, it was a conversation that lasted two hours, in which Bing/Sydney confessed all these dark secrets and tried to break up my marriage, and it was insane. Subsequent to that story running, I got notes from a lot of other people at tech companies saying, how do we prevent our technology from doing that? I even got a leaked document from another big tech company which had a roadmap for their AI product, and listed on the roadmap was, "do not break up Kevin Roose's marriage." (Laughter) I really think that that, and not to toot my own horn, this could have been anyone.

But it did really serve as a cautionary tale for other companies that are building similar technology. I think that is what the media can and should do in moments of societal transformation and change: really hold up a sign that basically tells people, you want to do this right, or there may be consequences for that.

KEVIN SCOTT: Yeah, I think that is one of the very good things that came out of that experience. I think it's another important reason why you actually want to launch these things, even if it results in something that floods your inbox for a while. You just get the societal-level conversation about what's possible, what's going on, where's the line, what's good, what's bad. We haven't chatted since that story published.

One thing that I will say is I deeply appreciate the part of the writing that you did where you published the full transcript of the conversation. With that level of transparency, it just wasn't confusing to anyone what inputs into the system led to the things that it gave back. That was super good. I don't think many of the people who were coming into my inbox had read the full transcript, but I think that you're just 100 percent spot on. The existence of this thing has helped a lot of people just make sure that they're paying real attention to some very important things.

Also, some of this stuff is fuzzy, right? Like where the line is, and part of what you all are helping to do is making sure that the public is paying enough attention to it so they can weigh in and have an opinion about where the line should be.

KEVIN ROOSE: Absolutely. I mean, it is an area where I think more public opinion is good. Right now the number of people who are actually building this technology is quite small relative to the number of people who are using it or who will soon be using it. I just think the more people know about what's going on, the better. I think it'll ultimately be good for the tech industry to have that kind of feedback, even if it is annoying and blows up your inbox in the moment.

KEVIN SCOTT: Yeah. For what it's worth, I was never annoyed about that.

KEVIN ROOSE: You were remarkably chill. I was pleasantly surprised by the response. You and I talked after I had had this conversation, but before I published my story. You didn't freak out, you didn't accuse me of lying. It was a remarkably civil conversation, and I have appreciated that because, and I'm not just blowing smoke here, that is not the reaction that I expected given how these things can go with other tech leaders.

I'm hopeful that the lesson from that has not been that we should build in secret and should never let anyone try or stop.

KEVIN SCOTT: No.

KEVIN ROOSE: I hope that it has been salutary for the whole project of building good and safe and responsible AI to have some feedback along the way.

KEVIN SCOTT: 100 percent. I'm genuinely trying to think if anyone was irritated. It's like everybody is like, okay, this is good, lesson learned, and we have a whole bunch of mechanisms for preventing things like that from happening again. It's all good. Somebody said to me at some point that all feedback is a gift.

The fact that you spent two hours trying very hard to get this thing to do unusual things, and then published the whole transcript, and then wrote this article that helps people pay more attention to the importance of responsible AI, all of that's a gift, and that's the way you should just look at it.

KEVIN ROOSE: Yeah, I'll remind tech executives of that the next time I get an angry call from a comms department. Kevin Scott thinks feedback is a gift, so maybe you guys should get on board with that.

I did also hear that you guys had Sydney swag made, and I'm a little offended that none of that has shown up at my house. Did you hear about the beer?

KEVIN SCOTT: I did.

KEVIN ROOSE: This was my favorite thing that came out of that article: there was a brewery in Minneapolis that came out with a beer called Sydney Loves Kevin. I have not tried it yet. I heard it was good, but maybe you and I can get a pint of it sometime.

KEVIN SCOTT: We should absolutely make a road trip to Minnesota to get a pint.

That would be hilarious.

KEVIN SCOTT: It's just so exciting to me how much technology and AI are becoming part of the mainstream conversation. It really does make me think about the episode we did with Neil deGrasse Tyson, who is one of my heroes. In addition to being a brilliant astrophysicist, Neil is, in my opinion, the best science communicator since Carl Sagan.

I think Neil tried to model a lot of what he's doing after Carl. He just does a terrific job of helping explain some incredibly complicated scientific concepts to the public, but not in a condescending way. I just love Neil to death.

CHRISTINA WARREN: I do too. He's also one of my heroes, and I think you're exactly right. He is definitely one of our best, if not the best, living science communicators.

I think it's so important that we have people like him in these moments. As you say, I think it helps get us out of our bubbles a little bit. Also, I think it helps put this, again, into terms that people who might not follow along with every little thing can understand and get excited about, and also start to think about the important questions. So here's a great conversation with Neil.

NEIL DEGRASSE TYSON: Right. Well, actually, so I think much less about kids than I do about adults because kids are born curious and they retain that curiosity at least into middle school. I'm not so worried about them as I am about adults who think they're thinking rationally and are not.

And adults are in charge of the world. They wield resources, they vote, they lead. They're captains of industry. They're all the things that shape this world. Yeah, we can target children, but I'm too impatient.

I don't want to wait 30 years until they're old enough to then make a difference. Whereas adults can make a difference with a pen at the bottom of a document that puts new legislation into effect, or new educational directives, or new mission statements. That's been my goal.

Now, in terms of role model, I think the term is a little overplayed. Overvalued, I should say, for the following reason. I grew up in The Bronx, New York. And if I needed another Black person who grew up in The Bronx before me, who then became an astrophysicist to have preceded me, I would have never become an astrophysicist. Role models, as they are generally thought of, inherently require that you are not the first to do something.

But suppose you have an interest where you would be the first to do something, either first out of your town (like you said, your role model can't have been people in your town because none of them went to college) or first in your family. At some point, you have to say "my role model is someone I have never met, and I may never meet. But I know enough about them that I want to emulate what happened in their life." And so I knew this from very early on, so I had an athletic role model in baseball, of course; I grew up in the Bronx, where the Yankees play.

There's a Yankee that was a role model. I didn't want to be him, but if I played baseball, I wanted to play baseball as well as he did. That's all. I visited my local planetarium, the Hayden Planetarium, took extra classes there. They had educators and scientists, and they had educators who had such a way with words and storytelling. I said, "if I'm ever an educator, that's the educator I want to be," and the scientists had such command and they just knew so much. The weird thing is I remember thinking, "I will never know as much about the universe as they do." Meanwhile, they're twice my age at least, and I'm 15 when I'm having these thoughts. Not realizing that when you get a PhD, you spend a whole five, six years studying that one subject.

Six years before that I was nine, so I had to get the time scales understood within me. For me, a role model at its best is assembled a la carte, from different people, and that's why, if you're visible, you should always, in the back of your head, ask yourself, am I being something that someone else might want to emulate? Because that chance is very real whether or not you ever meet the person. When I see little children coming through - the Hayden Planetarium is part of the American Museum of Natural History - there are school groups that come through, many international tourists, but also camp groups, and I see little kids looking bright-eyed.

All I've committed to is that whatever I help create there has an impact on the next generation, the way the educators and scientists had an impact on me. Then I've given back to this world.

CHRISTINA WARREN: Then when we talk about how important it is to bring people into science and technology, a lot of that, I think, comes from being able to inspire them and their creative minds.

KEVIN SCOTT: Yeah, there are so many structural barriers for a lot of people, and it's really easy, I think, for folks to get confused about how much luck and good fortune is involved for all of us who have the privilege of having interesting careers, where we get to have a little slice of impact on an industry, like how much luck plays a role in that for us. William Adams, who worked for me for a while in OCTO and is one of the first people I hired when I came to work for Microsoft, is just an incredible human being, a source of inspiration for me personally, and the extent to which he has dedicated his life, not just to being a better engineer, but as a Black engineer trying to figure out how he goes and addresses some of those structural impediments to having more folks who are historically underrepresented in this field be able to participate.

It's just really, really inspiring, seeing what Will has chosen to do with his life.

WILLIAM A. ADAMS: Yeah. Well, let me speed mouth then.

The general thing is, and this started back when George Floyd was killed, I really sat and thought, OK, what am I going to do? This is not the first Black man who has ever been killed by police. He's not going to be the last. But this is a good moment because it was caught on 4K video.

People are paying attention, so what am I going to do? One thought was, well, I've got money. I can just throw money out to people and be that philanthropist. But once your money's gone, that effect is gone. Well, I have 40-plus years in tech.

I know that tech billionaires are at the top of the world right now, so that means there's more money in tech. Why don't I help get more people into tech, right? That led to, well, there's lots of different models for that as well. I could invest, I could be an LP in someone else's thing, or whatever. There's lots of different ways of doing it. I just looked at the models, like, well, what matches my skills and experience and network best? I came up with the studio model. A venture studio.

The difference between a venture studio and a plain old VC or an accelerator is, I tell people, think of Motown. I'm the Berry Gordy. I'm going to find the talent. I'm going to train the talent.

I'm going to find Aretha Franklin, I'm going to find Michael Jackson. I'm going to help show them how to dance, stage presence, give them the record contract, put them out in the network, and help them build their thing. Do that with software. It's about creating software, creating a network, having a finance network, having resources as simple as: as soon as you start your business, you need a tax accountant, or else you're just going to fall behind on your taxes and you're going to be out of business in a year.

You need an admin, and you need three or four programmers. You're not going to write the code yourself. And when you fail, you need someone to say, "that's all right, let's do it again."

This is something that typically Black- and women-owned businesses don't have. They struggle to do it once, they get burned, they're working at McDonald's or they're back to general population. They don't try again because they've got obligations. The studio is intended to put in a little cushion that they don't normally have in society, such that they can have enough longevity to succeed. I've been doing a lot of network building.

I've been writing tons of code. Because the other thing that I'm telling people is, OK, AI is the thing. Two years ago I would have told you, go out and learn some C# and web development. Now it's like, no, no, no. First you need to go play with ChatGPT, and if you're going to write any code at all, you're going to use Copilot. I'm not just saying that because I'm shilling for Microsoft.

It's like I do it myself and it makes me 30 percent more productive at least. The software I'm writing is about saying, how can I write code such that it's going to be more composable when I use AI to do it? For example, there's the ZIP file format, compressed files. There's a library for that and it has its quirks. I wrote one myself.

I have my own ZIP decoder, and the interface is so simple that when I want to say, now compress that and stick it in the ZIP, it's one function, three parameters, done. Clear licensing, I don't have to worry about all the licensing all over the place, and that's a difficulty when you start stitching with AI. It's like, well, wait a minute, where did that data come from? Where did that code come from? Do you have the right license for that chunk of code right there? I'm creating a substrate of code that's like, this is clearly licensed and here's how you program leveraging AI. This is how we have to do it, this is how we're going to 10X our abilities, by leveraging AI.

You can't just go and lock yourself in a room and think you're going to hammer out some code. By the time you're done, someone else who's got AI has already done it 10 times over. You better learn how to do it with AI. This is what I'm currently engaged in.

CHRISTINA WARREN: Yeah, honestly, William's work, what he's doing, is so inspiring, as you say. What he's accomplished in his time at Microsoft, and now after Microsoft, is so impressive. I loved that conversation between the two of you, and it's great that we have people like him who are doing this important work and really working to inspire the next generation.

KEVIN SCOTT: Yeah, for sure. Maybe the theme of this podcast is that the people we talk to are folks who are just deeply passionate about something and who are fully committed to pursuing that passion, and it's just a great pleasure to be able to chat with so many people who have that mindset. One of those conversations that we had this year was with my favorite science fiction author right now, Adrian Tchaikovsky, who has written just really incredible hard science fiction books. He's got a super interesting background, as many science fiction authors do. He is a zoologist and lawyer by training.

I think he wrote, maybe inadvertently, one of the most important books on contemporary artificial intelligence, which was the third installment of a trilogy he wrote, Children of Memory, where, of all things, pairs of corvids, intelligent birds, are maybe the best metaphor that I have seen for large language models. It's just an incredible thing that came from his imagination before large language models were really doing some of the things that they can do. It was just tremendous to be able to spend some time talking with Adrian, not just about his craft, but about what inspires him to write the things that he does, and what some of his takes are as a creative person, a writer, about what this AI moment means for him and people practicing his craft. It's just a great conversation, and we wanted to give people a chance to listen to a little snippet of it in this year in review we're doing here.

ADRIAN TCHAIKOVSKY: When I'm working from the point of view of a non-human entity, whether it's a sort of an uplifted animal or an alien or something like that, generally the start point is the input.

You look at what senses it has, how it experiences the world around it, and that then gives you a very good filter through which to look at the world and interpret the things that are going on around this particular character. It gives it its own set of priorities that can be quite different to a human's, because we're very limited by our senses. I mean, our entire picture of what goes on around us is fed to us through the various ways that we sense our environment. When those ways go wrong, or when those ways are giving us uncertain information, you can get some profoundly weird and dysfunctional worldviews generating from that, which can be extremely hard to dissuade someone from, even if you knew that you had a problem which meant that you were seeing things, you were hallucinating. That doesn't necessarily mean that the hallucinations aren't themselves still incredibly powerful.

It's very hard not to react to a thing that your eyes are telling you is there. Sleep paralysis is a perfect example of that, because most people who have it are well aware of what's going on when it's happening, but that doesn't make it any less scary.

KEVIN SCOTT: Yeah, I think a really good example of this is the Portiids from Children of Time, which are the spiders who become the protagonists of the story. I don't want to give too much away about the book for folks who haven't read it, but you basically develop this very complicated society of intelligent spiders, which is a really interesting premise.

Part of what makes them so compelling is their sensorium is very different from a human's. You want to talk about that a little bit?

ADRIAN TCHAIKOVSKY: Yes. I mean, weirdly enough, they're also considerably more human than either of the two major human points of view presented in the next book, because they're still very visual creatures. They're land creatures, and there is a certain factor in their evolution which is going to make them more human than they might otherwise be.

But at the same time, spiders have a whole suite of senses that we can't really easily imagine. They have the ability to sense scents and chemicals in a way that we can't. They have the ability, especially, to sense vibration in a way that we can't.

They live in a world that is constantly informing them of things that we would be completely oblivious to. The way I go about this is I work in organic stages. Well, if they're like this, then what would their early development be, and then just building on each other at each stage to make this more and more complex society as the book takes them through time. You get a society which has a lot of, for example, technology that builds on their own strengths. They can do a lot of things that we can't quite early on, purely because they have a lot of tools, even down to just being able to spin web, which gives you the ability to make, say, watertight containers very early on in your society, which is a major step. You know, making things like clay pots and so forth is a major step forward for human society. It's much harder for us because we have to use fire and we have to make them, where the spiders can just literally produce them from their own bodies.

Little things like that then go on to have enormous implications for how their society develops. It also affects the way that they conceptualize less physical things. You have one point where there's an effort to try and communicate a picture from one culture to another. They run into the basic problem that when a human is coding a picture in a mathematical form, we start at the top left or one of the corners, possibly culturally dependent, and we work through the rows. The spiders start in the middle and spiral outward. That makes perfect sense to them, because how can you necessarily know how big the picture is going to be when you start it? And so you just start and keep working until you've got all the picture.

But it means that a lot of basic ideas, even when they have a means of communication, become very hard to communicate, because the way you're thinking about them is very different.

CHRISTINA WARREN: Yeah, I loved hearing Adrian talk about how he's thinking about creating these species and civilizations. I still fundamentally consider myself a writer first and foremost, and so I love hearing about the creative process of other writers, writers more talented than me, like him. Also thinking in broader terms about how sensory input can shape everything in how we operate, and the influence art can have on science and vice versa.

KEVIN SCOTT: Yeah, another person who's thinking about some of this stuff in a really significant way was on the podcast this year, in one of our most engaging conversations: the pioneer of emotion AI, Rana el Kaliouby. She believes, and I agree with her, that we're missing out on a huge amount of data that we could be using to train our AIs to make these systems not more human-like, but more relatable to humans. We had a super interesting conversation, which we're going to listen to right now.

RANA EL KALIOUBY: It's exactly what your friend is saying. Empathy and emotions are at the center of how we connect and communicate as humans. Our emotional experiences drive our decision making, whether it's a big decision or a small decision. It drives our learning, our memory, it drives this human connection, it's how we build trust. It's everything. But if you look at how we think about technology, it's often what I call the IQ of the device, take ChatGPT, very smart, very intelligent, but it's all very focused on the IQ, the cognitive skills. I felt like it was out of balance with our emotional and social intelligence skills and I wanted to marry the IQ and the EQ in our machines, basically. I had to go back.

I'm a computer scientist by background, but I had to study the science of emotions, and how do we express emotions as human beings? It turns out 93 percent of how we communicate is non-verbal. It's a combination of our facial expressions, our hand gestures, our vocal intonations, how much energy is in our voice, all of these things, and only 7 percent is in the actual choice of words we use. I feel there's been a lot of emphasis on the 7 percent, but the 93 percent has been lost in cyberspace. I'm reclaiming that 93 percent using computer vision and machine learning, technologies that weren't available 20 years ago but are now ubiquitous.

KEVIN SCOTT: I think you're absolutely right. In 2016, I dialed back almost all of my social media consumption, because you effectively have these machine learning systems, particularly for businesses where their business model is engagement. The more you engage with the platform, the more ads run and the more money you make. It is very easy to get systems that get the engagement by triggering your amygdala and keeping you in that state, and it's very easy to be triggered by e-mail. I all the time have e-mail conversations with colleagues where I get super agitated by the e-mail conversation. If I just jump into a video call with them, not even face to face, but what we're doing right now, in seconds, all of the stuff that I was agitated about goes away, so I'm just super intrigued by what you're doing.

How do we actually get this rolled out more broadly? Because I think you're absolutely right. We focus so much on the text, and text is so rife with opportunity to get us emotionally activated in the wrong way.

RANA EL KALIOUBY: The wrong way. Because there's a lot of confusion and ambiguity that you can clarify when you see the face or hear the voice.

I think what's really cool about this, and what ended up being both the opportunity but also the biggest challenge for Affectiva when we spun out of MIT, was that there are so many applications of this technology. We tried to focus, but it was always this challenge: oh my God, there are so many cool applications. Some, I think, are really powerful. One is the automotive industry, where we ended up selling to Smart Eye, and they're very focused on bringing driver monitoring solutions to the world. This idea of understanding driver distraction, and if you're texting while driving, well, we can tell that using computer vision.

Look at your eye, your head movement, and if you have a phone in your hand; drowsiness, intoxication, even. We've started doing a lot of research to detect that using cameras, optical sensors. So automotive is one area where we can do advanced safety solutions. We can look at, is there a child seat in the car with an infant in it, and, not often, but about 50 kids or so get forgotten in the car every year and they die of heat. That's very fixable, we can fix that. Mental health is another space that I'm really fascinated by.

We know that there are facial and vocal biomarkers of mental health disease: depression, anxiety, stress, even Parkinson's. Imagine if every time we hop in front of our computer, with people's opt-in, of course, we can articulate your baseline using machine learning. We know your baseline, and then if you start deviating from it, we can flag that to you or a loved one or a psychiatrist. I think there's a lot of opportunity, but we're missing the scale.

CHRISTINA WARREN: That actually makes me think about how connected to creativity so many of our guests have been, as we've been going through this year in review.

Let's share a clip now from will.i.am. He talked about technology and how it changed the music industry in a super interesting way.

WILL.I.AM: Imagine this is 1970 and you're a drummer from a band that's pretty popular, and you're a drummer that's coming up and you have aspirations as a drummer. Then there was a drum machine, but you weren't really threatened by the drum machine because it sounded like doot-doot-doot - it didn't really sound like drums. Then the '80s come around and the Linn 9000 started sounding a little bit more realistic, but it was stiff and robotic, and then Prince really made some pretty awesome songs with the Linn 9000. Then the Akai MPC60 drum machine came in, and you could sample live drums and put the swing on it and it sounds realistic.

Then 2000 comes around, you're like **** these drum machines, all the songs on the radio are ******* drum machines, in the '70s it was all human beings. Tight, you had to be super precise. The drum machine ate up a lot of the freaking drum time for drummers on the radio. Since then, live drums on the radio and on streaming, you don't hear no live drums anywhere on the radio. But what happened was, the drummer, a lot of them became the producer because they were like, you know what, if this is the case, I'm going to learn this machine and I'm going to produce.

Then the role of the drummer, they made more money because the drummer never got publishing anyways because what's the publishing on drums? The drummer in the band always got the ****** end of the stick when it came to ownership of a song because what part did you write? I wrote this [drum sounds] everybody says [drum sounds]. No I said [drum sounds]. Everybody says [drum sounds]. When it came to how everyone participated, the drummer was always last and that sucks. But the drum machine and producing on computers really empowered the drummer. The same is for music, the whole entire package now, not just drums, it's guitars, it's bass lines, it's freaking chord progressions, ensembles like orchestral.

Everything of music is now going through what the drummer went through in the '70s to the 2000s. Now the guy with the idea, the girl with the idea, the person with the idea. Now, they don't need the whole entire studio, they don't need a whole entire band. They just need their idea and the machine will supercharge them the way the drum machine and the DAW supercharged the drummer. That's the optimistic point of it all. Then there's a bunch of negative stuff that I don't even want to entertain because I think what we should be doing with AI is not just to augment yesterday.

Yes, it's going to render how we used to do things kinda obsolete, or create new ways of doing things that undermine how we used to get paid. There's going to be new ways that we get paid. A song is going to be a lot deeper. MTV said, "hey, don't just write the song, do a video." That video used to be just promotion. If you wanted to go and sing, you're from LA, but you want to tour all of America.

Other locations were like, send me a demo of what your show is going to be and I could put you on TV. They would do that promo clip, which MTV hijacked and said, these are videos; those are promo clips. It was never a part of what a record contract was to be. A record contract was based on the limitation of lacquer. Even though we had technology like MiniDisc and CDs, where there were no more limitations, music is still composed as if there were limited space on a disc.

A song is three minutes and 33 seconds, an album has 12 songs on it, just because of the 33 RPM the record went around at, and the tempo of every pop song was limited. Even with this advanced technology that we have, it's still based on those limits. That means you have to re-imagine what a song is, just like they re-imagined what a song was from the 1800s, when it was just theater and opera.

When it came to the recording industry, they're like, "We got to change a different song format here. Muddy Waters, that song is too long. I need you to shorten it." Because when they would sing the blues, there was no time limit. They would just go to a bar, you hear somebody sing, and they sung for ******* hours.

A song is like a reduction of the song sequence because of a limitation of the technology. That means with AI and AI music, somebody has to re-imagine what the **** a song is. It's a discussion; a place for you to put your memories, a place for the song to alter based on your mood. What's the version of I Gotta Feeling when you're ******? I just wrote the one for when you're optimistic. Where Is the Love is for when something goes wrong; what about when something is right? Then songs like Boom Boom Pow, they're about nothing. Hey, what's Boom Boom Pow? I don't know, nothing. It's describing the song.

This song is promoting the song that I wrote. What song? This one right here that I'm singing. Which is a crazy concept of songwriting. What are you going to write about? I don't know, I'm going to write about this song I'm writing.

But you have, with AI, the opportunity to re-imagine what a song is, to re-imagine what it means in your life, to re-imagine what it means to the world, where it's discussion based. We need to inspire the creative community to use AI not just for entertainment, but to re-imagine the world. Tomorrow's industries are what? Tomorrow's jobs are what? If AI is going to replace jobs, it's going to create new jobs.

Shouldn't the creative community also be tasked to imagine what tomorrow is with it, or are we just going to freakin' pretend that we're going to make songs forever? No, that ain't the case. You ain't going to be making songs forever for business like we used to. You're going to be making experiences with it. You're going to be scoring moments, in different ways of scoring moments, but it isn't going to look anything like yesterday. No.

CHRISTINA WARREN: I really am seeing a theme here.

You see the intersection of creativity and innovation and technology, and how all of these things influence one another.

KEVIN SCOTT: Even Tobi Lütke, who is one of my absolute favorite engineers, in addition to being the incredible entrepreneur and founder of Shopify, and I were talking about some of these ideas about the change that's coming, and how change influences the nature of craft and creativity. We had a really fascinating conversation where we reflected on this, which you all are going to get to hear a little bit of right now.

TOBI LÜTKE: The core of a craft is converting something into something else. Every craft that exists is like that particular process. People who convert wood into furniture are carpenters. One thing which is really inspiring about these things is that very rarely is the identity of a carpenter to be a carpenter.

The identity is actually a craftsperson. The particular craft that they've pursued, that they've become a master in, is usually just aptitude. But it's a very common type of person, which is also really useful, because some crafts exist for a moment and then they don't exist anymore when they're no longer needed. Blacksmiths kind of figured out what to do next. I think all of this is actually really worth understanding, because I do sometimes talk to, you know, junior engineers who describe themselves as, "I'm a React Frontend Developer." I'm like, "Wow, you just put two little bits into your identity, but you really don't need to, because I think you are actually much more than that, because you can solve problems extremely well."

The rallying cry is, people who are able to make the things that they really want from what they've got. That's at the core of it, and I love that.

KEVIN SCOTT: Our last guest of the year is really looking towards the future. David Kirtley and his company, Helion, are working to build the world's first nuclear fusion power plant, which, as the words come out of my mouth, sounds incredibly audacious. But if you really think about the biggest challenges facing us as a species, having access to not just a clean energy supply, but an abundance of energy, where we can have way more power than we have now available to us to help us fulfill and achieve our biggest ambitions, is super important.

CHRISTINA WARREN: I completely agree. I have to say, audacious or not, and I think audacious is a great word for it, listening to you and David talk got me so excited about what the potential in the future could be if these things can come to fruition. Let's close out now with your conversation with David.

KEVIN SCOTT: But, like, I think the thing that people really miss, and you probably have a better perspective on this than I do, is that, starting somewhere in the 1970s, we just stopped using energy at the rate that we had been using it before. It's actually a staggering thing to think about, not just what happens if we could take the energy that we're consuming right now and make it more sustainable.

But what happens if energy becomes sustainable and cheap and abundant enough that you could use 100 times more, 1,000 times more, or 10,000 times more of it than you're using right now? What's then possible? Talk a little bit about why energy matters.

DAVID KIRTLEY: I think about this a lot, actually, from two perspectives. One is, just recently, we're starting to use more electricity. We're upticking for the first time in a long time, where in the 1970s and '80s we plateaued in a lot of ways in terms of our energy use. I think that's totally right.

But recently, electric transportation, probably computation, and AI kick into that too. Looking at ways to solve climate change by spending electricity also goes in, and it's starting to increase our demand for electricity. The other thing I tie into this is that standard of living directly ties with access to low-cost electricity. You look at different parts of the world and standard of living, and you can say, "Great. They have more electricity access, they have a higher standard of living," however you want to define that.

But I ask the same question: what happens if we had 10 times that? Does that mean our standard of living would be 10 times better? What does that mean? Would we have access to clean water? Desalination is the classic. There's a trigger at 1-2 cents per kilowatt-hour, where if you can have electricity at that cost, then you can desalinate water through electrolysis and other methods directly. Clean water is now cheaper than it was to actually pull it out of a river and purify it. Suddenly, you enable some of those things, which are clear.

But think it through: if you had the computational access where you can have large-scale servers at everybody's house, you've got to get maybe the server price down, but now you can actually do really interesting things on the computation, and the cooling around that.

KEVIN SCOTT: Look, I think one of the things that maybe people don't appreciate or think about clearly enough is, maybe everything good that has ever happened in the history of humankind is humans discovering new sources of energy and being able to put that energy to work, solving problems that benefit humans. In a sense, you actually want to be able to consume more energy. You don't want the consumption of energy to be a bad thing, because nominally, consuming more energy means you're doing more of those useful things for humanity, like electrolysis. I mean, one of the things here in the state of California is, we have parts of the state where you have abundant water, and you have parts of the state where you have no water.

One of the reasons that California is habitable is, we spend an enormous amount of energy pumping water from places where it's abundant to places where it's scarce. I think you're going to have to do more of that in the future with climate change. You really do want a world where you have cheap, abundant energy, which is why I think the problem you're working on is so important. I think artificial intelligence is a pretty important problem. That's the thing I spend most of my time working on, but I think your problem is more important than my problem, and my problem is dependent on your problem.

DAVID KIRTLEY: At some scale, our two problems work together.

I don't know that we know, frankly, what happens if you have more low-cost electricity. It has to be low cost. That's really important. If it just costs a lot more and you have more of it, it doesn't actually help. The effective cost has essentially come down from burning wood, to burning coal, to fission power, and then to renewables; with some of the renewables, when you have access to good sunlight, solar power can be really low cost. You have these stage gates for humanity that you unlock.

I don't know that we know the answer to that, but I'm excited to find out. That's for sure.

CHRISTINA WARREN: That was a whirlwind tour of our incredible 2023 guests on Behind the Tech this year. You can check out those full episodes on your favorite podcast platform and on YouTube. Be sure to check those out. If you have anything that you would like to share with us, you can e-mail us anytime at behindthetech@microsoft.com.

Thank you for tuning in, and we will see you in 2024.

KEVIN SCOTT: See you next time.

[MUSIC]

2023-12-14
