Andrew Yao - 2021 Kyoto Prize Laureate in Advanced Technology


[MUSIC] Welcome to the Kyoto Prize Symposium. I'm Pradeep Khosla, Chancellor of UC San Diego. Today, we're coming together to celebrate the Kyoto Prize honorees.

The Kyoto Prize is Japan's highest private award for lifetime achievement in the fields of advanced technology, basic sciences, and arts and philosophy. Awarded by the Inamori Foundation, the prize is presented to individuals and groups worldwide who have significantly contributed to the betterment of mankind. The Inamori Foundation and UC San Diego share similar missions. Our world-class faculty collaborate to pursue revolutionary discoveries across multiple disciplines.

Together we drive innovations that advance society and drive economic impact. During this symposium, we have the distinct pleasure of hearing from three Kyoto Prize laureates. First, we will hear from Dr. Andrew Yao,

this year's recipient of the Kyoto Prize in Advanced Technology, whose life's work has revolutionized the field of information science worldwide. I'm delighted to hear him speak and learn more about his groundbreaking work. Honored guests, it is my sincere pleasure to welcome Dean Al Pisano, the Dean of the Jacobs School of Engineering, who will tell us more about what's ahead. Al, over to you. Thank you, Chancellor Khosla. I'm Al Pisano, the Dean of the Jacobs School of Engineering here at UC San Diego.

I am pleased to present UC San Diego Computer Science Professor Russell Impagliazzo. Russell is the perfect person to introduce Professor Andrew Yao, who is the 2021 awardee of the Kyoto Prize in Advanced Technology. I'll give Russell the honor of making the full introduction of Professor Yao. But first, I'd like to share a little bit of information about Professor Impagliazzo himself. Professor Impagliazzo is a world leader in computational complexity theory, and like many high-impact computer scientists, he engages not only in the deep math and the deep theory, but also in real-world use cases.

Professor Impagliazzo has an admirable understanding of the scope and impact of Professor Yao's career, which, I'm proud to say, includes very fruitful time spent right here at the University of California. Without further ado, I present to you Professor Russell Impagliazzo. Thank you, Dean Pisano.

I'm Russell Impagliazzo, a professor in the Computer Science and Engineering Department of the Jacobs School of Engineering here at UC San Diego. It is my privilege to be the host for Professor Andrew Chi-Chih Yao, the 2021 Kyoto Prize Laureate in Advanced Technology. Professor Yao's work is both wide-ranging and foundational, and it is still inspiring cutting-edge technological developments. His contributions range from cryptography and computer security to quantum computing. From the Yao principle, to the Dolev-Yao model of cryptographic protocol security, to Yao's next-bit test, to Yao's Millionaires' Problem, to the Yao Class at Tsinghua University, his name is literally all over computer science. Much of what we do every time we use the Internet was enabled by the concepts Andrew Yao introduced, starting in the 1970s and continuing today.

Professor Yao has also had a wide-ranging career, starting with doctorates in physics from Harvard and in computer science from the University of Illinois at Urbana-Champaign. He then taught at MIT, Berkeley, Stanford, and Princeton, and is currently Dean of the Institute for Interdisciplinary Information Sciences at Tsinghua University. Among many other honors, he received the 2000 ACM Turing Award. Professor Yao was unable to attend in person, but it's my honor to now introduce Andrew Yao's Kyoto Prize laureate address and to discuss his work and career with him in the interview that follows the address.

Thank you and enjoy. [MUSIC] In my computer science career, a lot of my work has really been building theories. I am really interested in fundamental questions because they are more philosophical. I think in science, one of the great pleasures is that we often encounter patterns.

There is a sense of beauty in those patterns. I think it's the same thing in astronomy: several hundred years ago, people already realized that the planets orbit the sun in elliptical paths. Those patterns are beautiful. This is quite similar to art, and I remember a famous saying that there is no excellent beauty without some strangeness in the proportion.

[MUSIC] I like to work on hard problems. When you encounter a really hard problem, it's useless to have many people together just going around in circles. So it's often weeks and months or even years of thinking. [MUSIC] I love hearing my wife play the piano, and I actually can think very well.

I actually do a lot of thinking when she plays. When I do deep thinking, it's either in my study by myself, or else I'll be walking outside, just trying to relax and let my mind flow freely.

I have been back in China for, more precisely, 17 years, and I've built an Institute for Interdisciplinary Information Sciences, which also includes a Quantum Information Center. I think quantum computing is the future. Nowadays many people are yearning to go into outer space. But actually there is an inner space that is every bit as exciting as outer space. Just imagine: these are such tiny objects, how is it possible for us to convey a signal to an individual atom or molecule? But physics allows us to do it.

We are now communicating with our brothers in the micro world, and we can give them a signal saying, could you please do this for me? The idea that we are now capable of talking to them, to me, is very, very uplifting. Science is my religion, and I love it. Essentially I have been a scientist all my life, and I don't intend to change that. [MUSIC] Ladies and gentlemen, it's good to be here.

First of all, let me say it is a tremendous honor to receive the Kyoto Prize. Knowing all the previous recipients and their brilliant achievements, I feel deeply humbled to be considered worthy to join their ranks. It is a great pleasure and privilege for me to speak here today. I would like to tell you a little bit about where I came from, how I found computer science, and my journey through it all. In more detail, I will begin with a little bit about my background, then my early enchantment with physics, which led to my first career, and then how I switched fields by accident to become a computer scientist. Then I would like to give a synopsis of some of my work, what problems I considered and why they are interesting.

Finally, I would like to pay tribute to several people who have greatly influenced my life and work, and then I'll wrap up. I was born in Shanghai, China in 1946. My family soon after moved to Hong Kong and then Taiwan. I grew up in a happy middle-class family. I had loving parents and two very close siblings. I was brought up with an emphasis on traditional Chinese values, which in particular place high regard on culture and learning.

Happily, and to my parents' delight, I was an excellent student, straight A's, throughout my schooling. I remember I loved math, science, and history as a kid. For example, I was fascinated by historical figures who showed unusual bravery and wisdom, and scientists like Galileo and Newton were my heroes in just the same way.

They gave me a sense of awe and magnificence because of their brilliance and their courage to stand up for what they believed in, and I dreamed that this would also be my destiny someday. In my third year of high school, I came upon a copy of notes by Eddington on the theory of relativity. It gives a most vivid and simple derivation of relativity, and the argument goes roughly as follows. Experimentally, we have known that light has constant speed, and from this fact one can derive, with a clever argument, that our familiar concept of time cannot be an absolute universal concept. That was something that everybody had taken for granted for a long time, and this argument impressed me greatly.

Physics can read like a detective story, more imaginative than any of the clever episodes of Sherlock Holmes. I was really impressed and encouraged. In 1963, I went to college as a physics major. Not long after that, Feynman's lecture notes on physics were published, and legend has it that Caltech wanted to radically reorganize their freshman course in physics, and Feynman agreed to do it on the condition that he would only teach it once.

This led to the legendary three-volume series, The Feynman Lectures on Physics. Reading the book was an eye-opener for me. Advanced concepts that are difficult to explain turned out to be possible to explain and derive using only elementary mathematics; that was really impressive. That led me to see the depth of physics and its beauty, and actually, this was the first time that I felt that I really understood the principles of quantum mechanics. Thirty years later, when I went to work in the field of quantum computing, Feynman's explanation of quantum phenomena still looked to me like the most illuminating and useful explanation. That really settled it: I decided to study physics after college.

In 1967, I graduated from college, and after one year of military service, I went to Harvard University to do graduate study in physics, and in 1972, I obtained my PhD degree in physics under Professor Sheldon Glashow. Finally, I was a certified real physicist. That didn't last long though: in 1973, my wife Frances, who was a PhD student at MIT at that time, introduced me to algorithms. The word "algorithm" was quite unfamiliar to most people at that time, unlike today, when it is part of everyday vocabulary.

An early draft of The Art of Computer Programming, Volume 3, by Professor Donald Knuth, a book on algorithms, had limited circulation at the time. What an amazing masterpiece. It introduces a fascinating new science, and after reading the material, I could not stop thinking about the research questions raised in the book. It became such an obsession that I soon quit my post-doc job in physics in order to pursue graduate study in computer science full-time. I remember that my mother was really concerned because it seemed that I had given up all those years of work in physics. But my wife was very supportive, and so I became a computer science graduate student at the University of Illinois.

Thanks very much to Professor C. L. Liu for being willing to take me on. Now I would like to tell you a little bit about my work. Initially, I focused on solving existing open problems in algorithms, such as minimum spanning trees, B-trees, and so on. But soon after I graduated, I started getting interested in developing new frameworks and new theories for computer science. Over several decades, I had the opportunity to work at several top-rated universities.

I spent 10 years at Berkeley and Stanford, followed by 18 years at Princeton. In 2004, I moved to Tsinghua University, where I now work. In each period, I was doing somewhat different things, and it is quite interesting that the topics I focused on during these different periods had a lot to do both with the changing times and the development of computer science as a discipline, and also with the environment at the universities where I worked. What I would like to do is pick three topics: min-max complexity, communication complexity, and cryptography and MPC. Now, the way that I find best to do research is to find insightful, bold questions that are important. If you find good questions, then you will invariably do good research, finding nice results that are applicable and important to the research world.

What I would like to do for each topic is describe the motivating question that led me to the discovery and why it is important. The first one is about min-max complexity, from 1977, and it holds a special place in my heart because this really was the first significant time that I formulated my own question and found good ways of dealing with it. Now, we know that an algorithm is essentially equivalent to some sort of recipe.

For example, in cooking, the recipe would say, step by step, that I put in three ounces of salt and some grams of meat. In the mid-1970s, a new style of algorithm came to people's attention, namely randomized algorithms. This novel type of algorithm incorporates stochastic moves. Metaphorically, it is just like this: instead of having a definite procedure of putting in two spoons of salt, you say that I will toss a coin to decide whether to put two spoons of salt or a cup of red wine into the cooking process.
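To make the "stochastic moves" idea concrete, here is a minimal Python sketch (an illustration added here, not an example from the lecture): quicksort with a randomly chosen pivot, where the random choice plays the role of the coin toss in the cooking metaphor.

```python
# A minimal sketch (not from the lecture) of an algorithm with stochastic moves:
# quicksort that picks its pivot at random. The random choice plays the role of
# the coin toss in the cooking metaphor above.
import random

def randomized_quicksort(items):
    """Sort a list; the pivot is chosen by a coin-toss-like random move."""
    if len(items) <= 1:
        return items
    pivot = random.choice(items)          # stochastic move
    smaller = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    larger  = [x for x in items if x > pivot]
    return randomized_quicksort(smaller) + equal + randomized_quicksort(larger)

print(randomized_quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```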

To the traditional way of thinking this looks like a crazy way to do things, but in the 1970s people demonstrated that there actually are advantages in carrying out algorithms in this fashion, and they lead to some spectacular results in some cases. But the question that people were not able to understand, to analyze, was: what are the limitations of these algorithms? This led me to the question I asked myself: which is better? Is it the randomized approach just proposed, or would it actually be better just to take the traditional approach of looking at average-case behavior, looking at the data distribution and tailoring your algorithm to it? Once you phrase the question in this fashion, a most pleasing connection leaps out. That gives a lot of insight into randomized algorithms.

When you look at the randomized approach versus the traditional distribution approach, you can regard it as a game played between the algorithm and the data. The algorithm chooses how to make stochastic moves, while the data, the adversary, tries to pick the distribution that makes the algorithm's life most difficult. Now, these two approaches meet exactly at their limit by game theory's minimax principle, due to von Neumann.

This connection gives us the theorem that we would like to prove, namely that these two approaches are the same, and it provides a handle for understanding the randomized approach. The important thing is that this type of algorithm has since become the default model for many cryptographic and AI algorithms, so there is all the more reason for people to want to understand the limitations of randomized algorithms.
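In symbols, the connection described above is usually stated as Yao's minimax principle (a standard formulation, with the cost notation chosen here for illustration): for a finite problem, with R ranging over randomized algorithms, A over deterministic algorithms, x over inputs, and D over input distributions,

```latex
% Yao's minimax principle (standard formulation; notation chosen for illustration).
% The worst-case expected cost of the best randomized algorithm equals the
% expected cost of the best deterministic algorithm under the hardest distribution.
\[
  \min_{R}\ \max_{x}\ \mathbb{E}\!\left[\mathrm{cost}(R, x)\right]
  \;=\;
  \max_{\mathcal{D}}\ \min_{A}\ \mathbb{E}_{x \sim \mathcal{D}}\!\left[\mathrm{cost}(A, x)\right]
\]
```

In practice, the useful direction is that a lower bound against deterministic algorithms under any fixed input distribution yields a lower bound for every randomized algorithm.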

Over 40 years later, the principle I found is still regularly used by many researchers in solving their problems. The second topic is communication complexity, which I raised in 1979. Let me explain the mathematical problem first.

Alice and Bob are two people in separate locations, and they each hold a piece of data, say x and y, each an n-bit integer. The question we would like to answer is: suppose they would like to jointly compute some quantity F; how many bits need to be communicated between them? That is now called the communication complexity of this particular function. This, of course, depends on what function you are computing. For example, to compute whether the sum of these two integers is even or odd, clearly you only need two bits of communication. Each one just tells the other whether their number is even or odd, and then they can decide on the answer. On the other hand, if you want to compute whether x is greater than y, then it takes n bits: in the natural algorithm, you really have to send the entire string from one person to the other in order to solve the problem.
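As an illustration of the two examples above (a sketch added here, not part of the lecture), the following Python snippet counts the bits exchanged: the parity protocol uses 2 bits regardless of n, while the naive greater-than protocol sends all n bits of x.

```python
# A minimal sketch (illustration only) of the two communication examples:
# parity of x + y needs only 2 bits, while the naive protocol for "is x > y?"
# sends all n bits of x.

def parity_protocol(x: int, y: int) -> tuple[bool, int]:
    """Alice sends 1 bit (parity of x), Bob sends 1 bit (parity of y)."""
    alice_bit = x & 1            # 1 bit from Alice
    bob_bit = y & 1              # 1 bit from Bob
    is_sum_even = (alice_bit ^ bob_bit) == 0
    return is_sum_even, 2        # answer, bits communicated

def naive_greater_than(x: int, y: int, n: int) -> tuple[bool, int]:
    """Alice sends all n bits of x; Bob compares locally."""
    message = format(x, f"0{n}b")   # n bits from Alice to Bob
    x_received = int(message, 2)
    return x_received > y, n        # answer, bits communicated

print(parity_protocol(13, 7))        # (True, 2)   13 + 7 = 20 is even
print(naive_greater_than(13, 7, 8))  # (True, 8)
```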

The deeper part is that you have to realize, and prove, that there is no better way to solve this problem than the naive one. In general, this is quite a difficult problem: if I give you a particular computation F, it usually requires quite deep mathematical analysis. The reason for considering this problem is that, during the late 1970s, it became clear that there was a change in the paradigm of computing, from the mainframe computing everybody was familiar with before that time, gradually moving into the network computing we are familiar with today, in which people are interested in solving problems in a distributed way and many parties would like to collaborate to solve a problem.

This means that we have to adjust the model we used to have about computing into the network model of computing. In this new world, communication is often the expensive part, because you have to move the data around, and that is the most costly step. The concept of communication complexity, as I just described, is meant to model and reflect this change in paradigm.

Since this model was proposed and analyzed, communication complexity has found wide application in areas ranging from chip design to data streaming. The last topic I will discuss in some detail is cryptography and MPC. In 1982, I wrote three papers that became significant contributions to modern cryptography. [NOISE] The three papers deal with the Dolev-Yao security model, pseudorandom number generation, and multi-party secure computation, or MPC.

I will just address this last one. [NOISE] MPC is a cryptographic concept that enables computation to be done on encrypted data. If you use MPC, it's possible to have multiple databases do any joint computation without any of them leaking its own data.

Essentially, it says that we can share data without seeing it. Let me explain it in a special case, and I think it will become clear. I will use the well-known example of the millionaires' problem, which was also raised in that paper. Two millionaires, Alice and Bob, wish to figure out who is the richer person without revealing any quantitative information. Alice has x million, Bob has y million, so the mathematical question is: they want to talk to each other to decide whether x is [NOISE] less than y. The question is, is it possible to carry out a conversation so that, in the end, both Alice and Bob know the answer of who is richer, but without knowing anything else about the data on the other side? Naively, intuitively, you would think it's impossible.

How can I find out something about who is richer without revealing any information from either party? If you think about it for a few minutes, you will realize that it is impossible if you adopt the standard security definition of 1982, namely Shannon's information theory, and you can prove it is not possible under that model. But necessity is the mother of invention, and if you really have to do it, you will think of a way.

If you think outside the box, then it actually turns out that it is possible. By "outside the box," what I mean is that you should discard the very rigid and tight box that Shannon constructed in this case and bring Alan Turing into the picture. I will not say too much about it, but basically, it turns out that if you relax the security definition, to one that is still pragmatically a very good definition, then it can be accomplished. In particular, I showed that this can be accomplished using what is now called the garbled circuit.
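To make the garbled-circuit idea concrete, here is a toy Python sketch of garbling a single AND gate. It assumes SHA-256 hashing as a stand-in for real encryption and omits the oblivious-transfer step that delivers the evaluator's input label; it is only an illustration of the table-garbling idea, not Yao's full construction or a secure implementation.

```python
# A toy, illustrative sketch of the garbled table behind a single AND gate.
# Assumptions: SHA-256 as a stand-in for encryption; oblivious transfer omitted.
import hashlib
import secrets

LABEL_LEN = 16          # bytes per wire label
TAG = bytes(LABEL_LEN)  # all-zero tag used to recognize a correct decryption

def H(a: bytes, b: bytes) -> bytes:
    """Derive a 32-byte one-time pad from the two input-wire labels."""
    return hashlib.sha256(a + b).digest()

def xor(x: bytes, y: bytes) -> bytes:
    return bytes(p ^ q for p, q in zip(x, y))

def garble_and_gate():
    """Garbler's side: pick random labels and build the shuffled garbled table."""
    A = [secrets.token_bytes(LABEL_LEN) for _ in range(2)]  # garbler's wire: labels for 0, 1
    B = [secrets.token_bytes(LABEL_LEN) for _ in range(2)]  # evaluator's wire
    C = [secrets.token_bytes(LABEL_LEN) for _ in range(2)]  # output wire
    table = [xor(H(A[a], B[b]), C[a & b] + TAG) for a in (0, 1) for b in (0, 1)]
    secrets.SystemRandom().shuffle(table)
    return A, B, C, table

def evaluate(label_a: bytes, label_b: bytes, table):
    """Evaluator's side: holding one label per input wire, decrypt the one valid row."""
    pad = H(label_a, label_b)
    for row in table:
        plain = xor(row, pad)
        if plain[LABEL_LEN:] == TAG:     # the zero tag marks the correct row
            return plain[:LABEL_LEN]     # output-wire label; reveals nothing by itself
    raise ValueError("no row decrypted")

# Example: garbler's bit is 1, evaluator's bit is 0; the gate outputs the label for 0.
A, B, C, table = garble_and_gate()
out_label = evaluate(A[1], B[0], table)  # in a real protocol, B[0] arrives via oblivious transfer
print(out_label == C[1 & 0])             # True: evaluator learned the label for AND(1, 0) = 0
```

The evaluator learns exactly one output-wire label, which by itself reveals nothing about the inputs; turning that label back into a 0/1 answer is done with a separate decoding step agreed with the garbler.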

During the past nearly four decades of development, advances have been made in hardware and algorithms, so that this is now becoming nearly feasible, and there is much research work in this regard ready for use in fintech, data training, and joint drug discovery. There are other subjects that I have worked on. I'm not going to go into details; I will just list them here. Quantum computing is a revolutionary approach with promising exponential speedups.

Auction theory casts amorphous economic problems in a neat game-theoretic framework. And artificial intelligence: we are witnessing the incredible feats achieved by machine-learning systems like AlphaGo, but the reasons for their success are still mysterious. All these are very interesting new areas, still under development. As you can see, I have worked on quite a number of different subjects, and these diverse and colorful subjects actually reflect not just my own personal taste but the blossoming of information science over half a century and the growing interdisciplinary connections that we see today. Finally, I would like to say a few words about the people that I've met. During these many years as a computer scientist, I have had the good fortune of encountering many extraordinarily talented people.

I was especially fortunate to have had two inspirational mentors, Professor Sheldon Glashow and Professor Knuth. Professor Glashow was my physics Ph.D. advisor at Harvard University. He was one of the group of people who first predicted the existence of a new particle called the charm quark, and he was its most enthusiastic advocate.

I learned from Professor Glashow that in science you have to be bold and you have to be persistent in what you believe in. Another thing that I learned from Professor Glashow is that mathematics is different from physics. For physicists, it is most important to be able to find out the truth about physical reality rather than to insist on precise mathematical arguments.

I think that pragmatic spirit helped me a lot in my research afterwards. There's one other thing that's important that I learned from Professor Glashow: life should be fun. As a young student in the spring of 1971, I tagged along on his sabbatical to the CNRS at Marseille, France. What a marvelous and charming city. That was also my first trip to Europe. Later that summer, he took me to a summer school in Sicily, Italy.

Again, a most wonderful experience. It was a very important lesson that Professor Glashow showed me: a joyful life and intellectual pursuit can go hand in hand. Now I would like to mention Professor Donald Knuth. As I said before, reading his book, The Art of Computer Programming, virtually changed my life. In this masterful book, he literally created a new field of study that has inspired generations of new computer scientists. For example, I began my career in computer science by reading his book and solving some of the problems that he masterfully formulated.

Later, I was lucky to become his colleague at Stanford. It is well known that Professor Knuth is good at many things besides mathematics and computer science. He is an accomplished pipe organ player. He is also a composer and a fiction writer, among many other things.

He is very multi-talented, and yet he is sincere and generous, and he always sees the good in other people. To conclude, I have had a wonderful journey in computer science, with many twists and turns. I have found that starting on the wrong foot may actually not be a disadvantage. The early training in physics turned out to be most helpful to me in at least two regards. First, I learned what good theories look like.

In physics, there are many good paradigms of theories, such as relativity and quantum mechanics, and that helped me a lot later on when I needed to build theories in computer science. The second thing I benefited from in physics is its pragmatic spirit. It says that the important thing is that you want to solve the particular problem at hand. It doesn't matter what method you use.

You should use, learn, or invent one, as the case may be, with the goal that in the end the problem is solved. In science, the paradigm is the search for truth. In this process, we sometimes discover patterns and beauty which can lift the human spirit. It also leads to innovations that can improve the human condition and prepare us for future human challenges.

I agree completely with the Inamori Foundation's vision that science and humanity shall work together for the betterment of mankind. Again, it is a great honor for me to receive the Kyoto Prize and a privilege to give this lecture, sharing my experiences with the audience. Thank you very much. Hello, everyone. I'm Andrew Yao, joining you from Tsinghua University in Beijing. I wish I could be in San Diego today.

San Diego is one of my favorite cities, with very beautiful scenery and friendly people, and I have many good friends at UCSD. It's a real delight for me to attend this Kyoto Prize Symposium held in San Diego, and I hope we'll have some interesting dialogue and discussion.

But first, let me express my thanks to the Inamori Foundation for awarding me the Kyoto Prize this year. Mr. Kazuo Inamori dedicated himself to the betterment of mankind, stressing the essential roles of both science and the humanities in moving towards that goal, and his vision touches me profoundly. I feel both thrilled and humbled to join the list of distinguished laureates who have received this honor previously.

I would also like to thank the Kyoto Symposium Organization and its executive director, Dick Davis, for inviting me to appear in this wonderful symposium. Also, I would like to thank the two universities, UC San Diego and Point Loma Nazarene University, for co-hosting this symposium. Hi, I'm Professor Russell Impagliazzo from the CSE Department at UCSD, and I'm honored to be chatting with Dr. Andrew Yao of

Tsinghua University, who is this year's Kyoto Prize Laureate in Advanced Technology. Andy, how are you doing? Very good, Russell. It's great seeing you here. In your career, you've moved around a lot.

You were born in Shanghai, and your undergraduate education was in Taiwan.

You did your graduate work in the United States and taught at a number of prestigious universities such as Stanford, Berkeley, Princeton, and MIT, and then decided to go back to China to lead the program at Tsinghua University. Have these transitions been difficult, or have they been inspirational? I think it has been an interesting adventure for me to move around, both in terms of the subjects that I study and in terms of physical location. Actually, every move had a reason, and I'm pretty happy to move around, getting new experiences and doing new things. I don't think there's any difficulty.

It's really a pure joy to be able to do that as an academic in the modern world. That's great. You mentioned that you've moved around in terms of topics. You actually got your first PhD in physics and then went back and got another PhD in computer science.

Was that a cultural shift that was difficult to navigate? What do you think is the difference between the culture of physics, a very established field, and computer science, a relatively young field? I think that's exactly why I moved from physics to computer science. Over the years, I realized that I do my best work when I'm entering a subject that is still in its initial stage.

Of course, I didn't know that when I entered physics. When I got the chance to see computer science as an open province, I got the feeling that this field was really for me. There was no difficulty in moving, because I was pretty sure I could do better there than on the research problems I was working on in physics at that time.

Your work didn't only solve other people's open problems; I think you shaped the area in many ways. I'm reminded of your work on characterizing probabilistic computation, and also of your work on pseudorandom generators, which was very influential to me as a grad student. It's actually a paper that I consider a classic, called "Theory and Applications of Trapdoor Functions."

But I'm afraid that what I would define as a classic is a work that everyone knows about but no one has read. I knew about this work. It came out just a few years before I was doing my own graduate work, and still, everyone knew about this paper.

But I didn't read it until I was about to write my thesis and needed to look up some references. When I read it, I realized how revolutionary this paper really is. There's what everyone knows about it and what it contains, and what it contains is so much more than what everyone knows. In my mind, it's a re-envisioning of the notion of information in light of computational difficulty: there's information that you know, information that you don't have access to, and information that you could have access to if you're willing to do a prohibitive amount of work to get it. Is that a fair characterization, and what were you thinking when you wrote this paper? Yes. Well, Russell, thank you very much for your appreciation of this work.

I think that you are one of the people who really can appreciate everything in that work. It is true that it came at an interesting time, when classical information theory and the theory of computation interacted in a very natural way, because the age of network computation was just starting in the late '70s and early '80s. There were people with great foresight who saw that the future perhaps belonged to the network computing world, and in that world one would have to design lots of new protocols so that electronic commerce could prosper, and people found out that you actually have to marry the theory of computing with classical information theory to be able to do anything interesting cryptographically. I think that was a big event, and it's lucky that I happened to be there together with many of our friends at that time, so we were able to do work in that direction. My paper was finished within that framework, and there were a lot of exciting things that one could do when two great disciplines interact.

I was just putting in the things I could think of at that time. I think that's the reason why it contains a lot of different threads of thought. I'm so glad to hear your understanding and appreciation of that work, because I think those are the most insightful, satisfying comments that I have heard, not only about that piece of work but about anything that I've ever done. Russell, thank you so much, because, really, you are one of the top people in the area of pseudorandom number generation and cryptography. Coming from you, I'm moved and happy to hear that.

I should have talked to you more often in the previous 20-30 years. I have to say, for a lot of those 20-30 years I've been working, a lot of my work has been commentary on that one paper. [LAUGHTER] Even if we haven't been talking, we've had an intellectual dialogue. Yes. You mentioned that you were thinking of a world where people were interacting on a network, where information is distributed throughout the network, where people's identity is maybe not tied so much to their geographical location as to what they know and what information they have access to, and that's a really good description of the world today. But I think it took a lot of foresight to look at the world of 1982, or even 1976 when you wrote your paper on communication complexity, and see that coming, given the state of the world at the time.

Did you anticipate that the Internet would be such an important part of today's world? Did you foresee a future more like today's, where people are communicating like we are over long distances fairly routinely? I am ashamed to say that I didn't expect anything this big at that time. It just seemed to be an interesting new technological development in the computer science world; it was a novelty to have this Internet development and the ARPANET. People were already pretty amazed at the ability to do email.

I think there were perhaps a few people who could see that far in terms of the Internet development of today, but I'm not one of them. On the other hand, I did appreciate the fact that the development of the PC and the network was a new computing paradigm. It would change a lot of things. Therefore, for a theoretician at that time, any time there is some significant technological advance, there is a chance to do fundamental work, and it's intellectually interesting to consider the deep problems in connection with it.

In those days, I was young and I really couldn't see that far. I was just like everyone else, trying to find some interesting, challenging problems to work on. Only much later did I realize that it's a good habit to follow the technology, because that's what distinguishes theoretical computer scientists from mathematicians. I think it is a very fruitful and good idea for computer scientists to follow the technology and work on theories that are relevant to the new developments. In other words, you're not clairvoyant, you just have amazingly good taste.

I think my physicist background probably has something to do with it, because in physics people are very pragmatic. Even theorists in physics are at heart pragmatic people. It's not just about how beautiful the theory is.

I think the important point is that it should be relevant. It should have something to do with the deep laws of nature, and you have to pay attention to experiments. That spirit is a culture that's very deeply embedded in me. I'm attracted not just by how elegant and pretty the mathematics involved is; I also pay a lot of attention to whether it reflects a pragmatic ability for us to utilize it. But there is a very big difference between physics and technology, basic science and technology.

In physics, you're explaining what is, while technology is something we envision, create, and can change. Do you think of that as a dual role that we play? Yes, I think it is true that in computer science, even the most abstract theoretical computer science can have a real impact in the world within a few years. Of course, it depends very much on the particular subject; for example, in cryptography.

I think it took 30 years before it reached the level where it has really become essential in today's world. But in physics, from the conception of quantum mechanics it really took many years, perhaps 100 years, to reach the point where we could think about quantum computing. There are similarities and there are differences.

However, I'm interested in what you said, your comment about physics being a science that is trying to describe nature. I think that is also starting to happen in computer science, because man-made phenomena have actually created things that are very similar to physical phenomena. For example, the Internet is such a big infrastructure that I don't think we can characterize everything mathematically and try to control it. Basically, a lot of computer scientists are doing measurements and observations about the behavior of big networks and also about the mass psychology of the people who make use of the Internet.

All this has moved the study of computer science closer to natural science, in that sometimes it's fruitful to look at it from the point of view of a natural scientist. Even more prominently, Russell, you have worked on learning and on big software programs for satisfiability, that type of really important practical work. We all know that everyone is puzzled by the excellent performance of deep neural networks, but there really is not a very good explanation for it.

I think this is another case where many computer scientists now are trying to look at things with the eye of a natural scientist: here is a phenomenon, even though it's sort of man-made, but it's a really important phenomenon that we should try to understand. That perspective, to me, is compulsory, because it's essentially a big nonlinear system, and we know that it's almost impossible to find an extremely precise theory to describe it.

We have to be like physicists, finding mathematical laws that explain things at the right level, because the ultimate goal of science, including computer science, is to do the best we can. I remember reading a long time ago the saying that politics is the art of the possible. I think that computer science and machine learning are also the art of the possible.

We should go beyond our traditional work on algorithms, because now we are taking algorithms out of the very narrow domain where deep mathematics dominates, into a world of practical problems that have to be solved and are perhaps, in principle, not solvable within our traditional thinking. I think, Russell, we all have to become physicists, and perhaps we already have. Well, that reminds me of the old joke: I know it works in practice, but does it work in theory? [LAUGHTER] There are lots of things where we have algorithmic techniques that have been successful, like satisfiability, like machine learning, like simulated annealing and other heuristic methods.

We have algorithms that work for typical problems, but we don't really understand why they work all the time; we don't have a great theoretical explanation. This, of course, is a challenge to a theorist, but it's also a pragmatic matter, because if you don't have a real understanding of why your tools work, it's hard to optimize them. Yes, exactly. I think that's why both the experimentalists and the theorists, both types of researchers, are necessary.

I have to think of the many big discoveries in the world. When the Wright brothers first successfully launched their airplanes, they were just doing experiments. They didn't know about the deep aerodynamics equations and the theories behind them.

But as things developed, you had to establish a scientific basis to be able to design modern airplanes that could really radically change the world. I think it's the same thing here. We are at a stage where the Wright brothers have discovered the wonderful devices; the scientists have discovered the wonderful machine learning techniques. Now the theorists have to catch up.

Fortunately, I'm not really unhappy about it, because we know that every human endeavor has its limits and its regularities. For example, in medicine today, even though we have many wonderful drugs, for most, or at least for many of them, we're really not sure about the precise mechanisms at the molecular level. But nonetheless, even this level of understanding is immensely important. I think we can take our time in trying to understand neural computation, or even other types of artificial intelligence algorithms yet to be developed.

We don't need to solve everything within one generation. In that regard, we have to be like mathematicians: we have to have patience. Your prize is for advanced technology.

It's good that you mentioned patience, because in some sense your thinking was well in advance of technology. You're saying that sometimes we have to catch up with technology that we don't understand; other times, technology takes a while to catch up with theoretical ideas.

I'm thinking of things like multiparty computation, a great intellectual idea that you introduced in the '80s, where you can compute with information without actually knowing that information in some formal sense. So two or more people can jointly compute functions of what they know individually without leaking their secrets to each other. That's technology that we really need, because we all want to take advantage of algorithms that optimize for our individual private tastes and private information, like with personalized medicine, but we don't want to necessarily leak that information to the whole world. That's something that I think was a brilliant idea, one that we really need, but where the technology is still catching up; the first implementations, like Fairplay, were around 2008.

Still, many of the implementations are more prototype proofs of concept than full working tools. Has it frustrated you that the technology has taken so long to catch up? Well, it was not frustrating, because I didn't expect that multiparty computation could become practical for a long, long time. In those days, I was at heart almost a pure theorist. I worked on problems at that time mostly out of intellectual curiosity. In that regard, I was a pure theorist at that time. I worked on that problem mainly because it surprised me that it could be done.

I have often said that to do good research, we sometimes have to be dreamers and think about things that are seemingly impossible to do. Public-key cryptography was one such thing: when it was invented, it seemed to just pop out of somebody's mind. With multiparty computation, I was following that good tradition for theorists.

When I thought about the possibility for two millionaires to figure out who the richer person is without revealing their assets, I thought that if it could be generalized, it would have theoretically wonderful applications. It was a very well-motivated conjecture, and I worked on it to solve that problem. At that time, I immediately understood that it would be many, many years, if ever, before computing capability would be able to reach the level where this could become useful.

Amazingly, after 40 years, it really is now starting to enter the realm of possibility. This type of thing is a good illustration of two things: first, that curiosity-driven research is essential and useful; second, that one has to be patient about basic science research, because it might take many years for it to be realized. I think perhaps the most interesting thing about science is really being able to have that kind of application.

In my mind, the most wonderful thing about science is the ability to realize some impossible dreams. That's something that I very much enjoy doing. Another project of yours started, I think, as theoretical speculation but has just very recently started to be implemented: quantum computing. What got you interested in quantum computing, and what do you think is its future? I think that quantum computing was started by a physicist: Richard Feynman had a very influential paper in 1981 that motivated the idea that one probably should consider using quantum mechanics to build computers.

I started getting interested in the early 1990s. The thing that excited me, and I think excited most people, is that this was really the first time in 60 or 70 years that people realized there could be an alternative to the classical way of designing computers. Certainly, nowadays computers work much faster than a Turing machine, but the principle is the same as in Turing's 1936 paper. However, a quantum computer really is a different kind of animal. It's based on the laws of quantum mechanics, which are very abstract, and almost impossible to characterize precisely except through mathematical equations.

But in principle, it does have amazing abilities, like being able to factor large integers, whose hardness is the basis for some really important cryptographic standards today. But even more importantly, as Feynman described in his paper, perhaps the most important intellectual foundation for the importance of the quantum computer is that with a practical quantum computer, it would be possible to simulate quantum mechanical laws, for example, the simulation of the protein folding process.

I'm going to bring you back to your statement about understanding why drugs work. Yeah, that's very exciting. Suddenly a new horizon is open to us. Presumably at some future time, we will be able to have a personal quantum computer sitting on our desktop.

I think it's still a distant dream right now, but usable quantum computers look good. Within the next 10 to 15 years, I think we'll see really exciting developments in the construction of quantum computers and the associated quantum computing industries. It will be an exciting time, because we'll have to do all the compilers and the operating systems over again. In addition to mathematics and the physical sciences, some of your recent work touches on the social sciences, in auction theory and computational economics. What interested you in that topic? I heard about this interdisciplinary area between economics and computation many years ago. Even at that time, the concept of what is technically called incentive-compatible mechanism design was really interesting to me from a purely computer science point of view.

But I didn't follow that up until more recently, partly because auctions had become important in advertising on the Internet and that type of application. I decided to look into the subject, and it happened that I had a chance to supervise an undergraduate project. I decided to ask the undergraduate students to work with me on an open problem in that area. That got me into auction theory, and I worked on it for the next 4 or 5 years.
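As one textbook illustration of an incentive-compatible mechanism of the kind mentioned above (added here for concreteness, not a construction from the interview), here is a minimal sketch of the sealed-bid second-price (Vickrey) auction, where bidding one's true value is a dominant strategy.

```python
# A minimal sketch of a classic incentive-compatible mechanism: the sealed-bid
# second-price (Vickrey) auction. Included only to illustrate what
# "incentive-compatible mechanism design" means; not discussed in the interview.

def second_price_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, price): highest bidder wins, pays the second-highest bid."""
    if len(bids) < 2:
        raise ValueError("need at least two bidders")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]          # winner pays the runner-up's bid
    return winner, price

# Bidding one's true value is a dominant strategy: the price paid does not
# depend on the winner's own bid, only on whether they win.
print(second_price_auction({"alice": 10.0, "bob": 7.0, "carol": 9.0}))  # ('alice', 9.0)
```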

The big impression on me is that, for the first time, I really got to appreciate the beauty of economics, because previously I only knew economics as a science for which it is hard to predict even the short-term development of a large system. But after I got to understand the principle of [inaudible] and the mathematical theory, essentially the game theory, behind it, I was struck by how amazingly beautiful it is. For the same reason I mentioned before about things that seem impossible: in such a complicated world, with such a complicated economic system, how can you have any theory about it? Amazingly, game theory gives a possible approximation to the real world. One can have a rational basis for designing and guiding economic systems. Also, coincidentally, economics now includes things like Bitcoin.

I think these are going to interact with computers in moving the world forward. One has to integrate, to some degree, economics and game theory with computer algorithm design and also cryptography. All of these have to work together in order to construct an efficient economic system for the future. Again, another new horizon is opening before us, in which we computer scientists can play a major role.

When you went back to China to work at Tsinghua, you took on another challenge, I think, which is institution building in addition to doing your own research. You're really helping create an environment where a new generation of researchers can thrive. What was your experience with this kind of challenge, the human challenge? When I first came back to China, I had very low ambitions. My main interest at that time was just to build up a small group in my specialty and to be able to produce good Ph.D. students. But soon after, I realized that it's not possible to do that without a really big supporting system. That got me into building an undergraduate program, so that I could have good graduate students.

Now, in order to have a good undergraduate program, I had to build up a faculty. So I managed to get Tsinghua University to support me in building an institute from scratch, around 2010. The theme was really to build an institute based upon the tenure-track system that is very much the standard in the modern world. I defined the focus of my faculty to be the interdisciplinary area, meaning anything that computer science may interact with.

I had physicists to build quantum computers, people whose training is in economics, and people who do computational biology, and I built an environment where they could do all these interesting things. It is now ten years from the time when I started the institute, and it is 2020. Actually, I find that I'm pretty happy to have built an institute from scratch that now has around 30 faculty members.

Especially in recent years, we have been able to attract really top-rated young faculty members. I'm very excited about that. Building an institute has challenges that I never had to face before because, perhaps like you, I was at heart really just a researcher. But somehow, in a changed environment, you are naturally led into doing things and, to put it in a positive light, into discovering abilities that I didn't know I had.

So you've had this amazingly successful career as a researcher and as an institution builder, and you've mentored so many students. Do you have advice for students who are just starting their careers as computer scientists or researchers? I have just two pieces of advice. The first one is that you should follow your heart.

That's the old adage: you have to do the things that interest you and excite you most. The second is to pay attention to all the developments, not only in computer science, but in all the sciences.

Basically, the best advice I can give to a student of computer science is that you should first think of yourself as a scientist, and then you can be a specialist in computer science. We're almost out of time, so do you have any closing thoughts that we should end on? Yeah. The closing thought is that I have had a wonderful life, and being in the academic world all my life has been the greatest privilege and the greatest luck of my life. In this environment, I could have good friends and colleagues like you, whom I can really understand and talk to, and enjoy the most interesting intellectual exchanges. I wouldn't give that away for anything.

Well, it's been really a privilege and a pleasure to talk to you. Yeah, Russell, it's wonderful. I think that in the future we should do this even informally, without the need for a prize. But perhaps next time I can return the favor and have a conversation with you on the occasion of your own prize.

Well, it would be a prize if you just came to San Diego and [LAUGHTER] we could talk in person. Thank you so much. Thank you. [MUSIC] [FOREIGN]
