Artificial Intelligence for Military Use and National Security
Good evening, everybody, and welcome back. So, if you're like me, you've probably heard quite a bit about artificial intelligence over the last few years. It's covered a lot of ground: on any given day it might be taking our jobs, beating us at Jeopardy!, powering driverless cars, inspiring medical breakthroughs, or maybe even, as Elon Musk says, posing the biggest existential threat to humanity. As AI "gets probably much smarter than humans," he says, "the relative intelligence ratio is probably similar to that between a person and a cat, maybe even bigger. We need to be very careful about the advancement of AI."

Tonight we're not going to discuss every potential application of AI, but instead focus on a specific application that demands our attention: the use of artificial intelligence for military purposes and national security.

To begin, let's rewind just a little bit. For decades there have been long-standing ties between Silicon Valley and the US military. As described by Leslie Berlin, a historian and archivist here at Stanford, "All of modern high tech has the US Department of Defense to thank at its core, because this is where the money came from to be able to develop a lot of what is driving the technology that we're using today."

Even the internet itself was initially seeded with defense funding. In the late 1960s, the department's Advanced Research Projects Agency, DARPA, stimulated the creation of ARPANET, which became the basis for the modern web. And many of the technologies that are ubiquitous today have similar origin stories. DARPA funding of SRI International throughout the late '60s and '70s culminated in Shakey the Robot, the first AI-powered mobile robot, and over the course of decades that effort would grow into the world's most popular virtual assistant, Siri, which Apple famously acquired from SRI in 2010. Another example: in 1995, two Stanford graduate students received a DARPA-NSF grant of their own to conduct research into digital libraries. That pair turned out to be Sergey Brin and Larry Page, and their research, supported in the early days by this grant, was at the core of Google's original search engine, the flagship offering of the company they would co-found a few years later.

In short, the military's fingerprints are all over the history of the technology industry, and by some accounts the connection between the government and tech firms is as strong as ever. Amazon and Microsoft are actively competing for a ten-billion-dollar contract with the Pentagon today, the CIA spun out an organization to strategically invest in cutting-edge startups, and the Department of Defense has set up an innovation unit just down the road.

But of course, the relationship between the Pentagon and Silicon Valley hasn't always been cooperative. Right alongside military funding of research has always been fierce opposition to the use of technology for violent ends. The biggest focal point of such opposition today is the development of so-called killer robots: lethal autonomous weapons that can select and engage targets without human intervention. Human rights activists and international policymakers alike are sounding the alarm about a future in which machines, rather than humans, may determine who lives and who dies.

Now, sometimes opposition to the development of autonomous weapons, or the technology that underlies them, emerges
from within tech firms themselves. Nowhere was this more apparent than in the high-profile debate within Google about the company's work with the Pentagon on an initiative called Project Maven. So, as a refresher: in 2017, Google began working on a project using AI to analyze the massive amount of data generated by military drones every day, to classify objects, track movements, and detect images in video footage far more accurately and quickly than humans ever could. Upon learning of this contract, 4,000 Google employees signed an internal petition in opposition, and a dozen employees resigned in protest. As they saw it, the military was trying to appropriate the technology they were building in a non-military context and repurpose it for possibly objectionable ends. Specifically, they were troubled that their work might be used to improve the targeting of drone strikes, a practice around which there are considerable human rights concerns, and they were firm in their belief that, quote, "Google should not be in the business of war." Now, the DoD said they had initiated this project to make drone warfare less harmful to innocent civilians, but as Google employees wrote, "This contract puts Google's reputation at risk and stands in direct opposition to our core values. Building this technology to assist the US government in military surveillance, and potentially
lethal outcomes, is not acceptable." Weeks later, Google decided not to renew its contract, citing this employee backlash as a primary reason for the decision.

Now, how should we think about this? On the one hand, activists and organized tech employees claimed the decision as a victory, seeing it as an opening for broader reform. A group of them joined with the Tech Workers Coalition to urge not just Google but also Amazon, Microsoft, and IBM to say no to future military contracts. In their words, "DoD contracts break user trust and signal a dangerous alliance. Tech companies shouldn't build offensive technology for one country's military."

But on the other hand, critics of Google's decision have at times described the company's handling of it as, at minimum, ill-informed; perhaps worse, unpatriotic; and perhaps worst of all, to some, amounting to a unilateral disarmament in a new global AI arms race. In the words of Christopher Kirchhoff, who helped lead the Pentagon's recent push to collaborate with Silicon Valley, the only way the military can continue protecting our nation and preserving the relative peace the world has enjoyed since World War II is by integrating the newest technology into its systems. Denying the military access to this technology, he says, would over time cripple it, which would be calamitous for the nation and for the world.

So are these newly empowered employees right to protest their company's partnerships with the Pentagon? Or are they short-sighted in their demands, clamoring for change that might threaten the very global order upon which their lives and companies depend? Google is, after all, an American tech company, protected by the rule of law here in the United States.

Now, this debate gets to the heart of why our topic tonight is so important. It's not merely about individual companies' decisions to pursue or forgo specific contracts, but about the broader geopolitical story in which these decisions are unfolding. The military isn't seeking out and investing in new technology just because it's exciting, but rather because AI represents a new frontier for global competition. These events are taking place in a hyper-competitive environment, one in which countries are vying not just for technological superiority but for the economic and military advantages that will accompany it. Around the world, global leaders are taking this seriously. Russian President Vladimir Putin has said, "Artificial intelligence is the future, not only for Russia but for all humankind. Whoever becomes the leader in this sphere will become the ruler of the world." Emmanuel Macron announced a significant AI strategy for France just last spring, and 15 other countries have released national plans as well.

But when it comes to the global AI race, the US is primarily up against one rival: China. Over the last few years, Chinese leaders have forcefully championed AI development as a national priority. They've laid out an ambitious strategy that promises major breakthroughs by 2025 and pledges to make China the world leader in AI by 2030. Experts estimate that they will commit something on the order of $150 billion to the goal over the next decade, and some argue that, absent significant action to the contrary, China is poised to surpass the US in the years ahead. As investor
Kai-Fu Lee sees it, China is a more hospitable climate for AI development at this stage and is acting more aggressively. In a competition that he says essentially requires four key inputs (abundant data, hungry entrepreneurs, AI scientists, and an AI-friendly policy environment), China has some distinct advantages. As an authoritarian regime, the government permits few boundaries between itself and its tech sector, which includes some of the world's most powerful companies: Tencent, Baidu, and Alibaba. They have centralized decision-making and access to as much data as can be collected.

Now, noting the profound risk this could pose the United States, former Deputy Secretary of Defense Robert Work has said, "We are now in a big technological competition with great powers. We cannot assume that our technological superiority is a given; we are going to have to fight for it." Taking a step in that direction just last week, President Trump signed a new executive order to establish the American AI Initiative, a high-level strategy for AI development here in the US. Many
have criticized the plan for lacking specifics and funding, but it is yet another reminder of the global competition for AI dominance and the challenge that will play out in the years ahead.

Examining that challenge together with a set of formidable guests will be our task here tonight. We have with us three extraordinary experts. First is Courtney Bowman, the director of privacy and civil liberties engineering at Palantir. His work addresses issues at the intersection of policy, law, technology, ethics, and social norms. Working extensively with government and commercial partners, his team focuses on enabling Palantir to build and deploy data integration, sharing, and analysis software that respects and reinforces privacy, security, and data protection principles and community expectations.

We also have with us Avril Haines, who is a senior research scholar at Columbia University, the deputy director of Columbia World Projects, and a lecturer in law at Columbia Law School. Prior to joining Columbia, Avril served as assistant to the president and principal deputy national security adviser to President Obama. In that role she chaired the Deputies Committee, the administration's principal forum for formulating national security and foreign policy. And before that, she served as deputy director of the Central Intelligence Agency.

And finally, we have Mike Kaul, who heads the artificial intelligence and machine learning portfolio at the Defense Innovation Unit Experimental, DIUx. He served as a pilot and air staff officer in the US Air Force and has extensive experience in the software industry, in product development and product management, and as a senior executive at multiple technology companies and startups. Please join me in welcoming everybody to the stage.

So, thanks to our esteemed panelists for being with us, and thank you all for coming back for yet another, I hope, energizing conversation. I want to start with you, Avril. We had the privilege of serving together in the Obama administration, when the tables were reversed: you held the gavel and you were directing the questions at the rest of us around the table. Now I hold the gavel. (Oh my God.) So, welcome back.

I want to start here: you know, the job of principal deputy national security adviser put you at the front lines of managing a wide range of national security threats, whether it was the civil war in Syria, nuclear proliferation in Iran or North Korea, or the Russian invasion of Ukraine, many things that involved reacting in really rapid succession to events that were unfolding around the globe. But you also made it a priority to try and look over the horizon and think about the long term. And it's hard not to think about the challenge that we have before us tonight, about the role of AI in national security, and think that what we are confronting today in terms of our national security threats is likely to change pretty dramatically over the next decade or two as a function of developments in AI. So what I'd like you to start us off with is your perspective, having been in that role, about what the real challenges are that AI developments present for US national security, and also what some of the opportunities are that are enabled by this kind of technological change.

Sure. So, I think as part of this, maybe it's useful to
talk about the Third Offset and, in a sense, how we think about that in the context of defense strategy. The Third Offset is a very unsexy name for what was effectively a strategy really intended to be another
way to think about how to promote conventional deterrence and preserve peace. In essence, it was looking at ways in which you could offset potential adversaries' advantages, with the idea that the United States can't have the military be everywhere and at the same time needs to project power into areas in order to create that kind of deterrence. There were really three elements to it, and technology was one element; there was also the concept of operational concepts changing, and organizational structures being a piece of it. In each of these areas, the view of the Department of Defense was: these are spaces in which we need to be thinking about how to position ourselves in order to offset, effectively, potential adversaries' advantages. And I think if Bob Work were here today (the former deputy secretary of defense, who really spent an enormous amount of time thinking about this and working it through the Pentagon, and through the interagency in many respects), he would look at AI as being really at the hub of the technology aspect of this, and as both a critical opportunity and a challenge in many respects, looking across the world. A challenge in the sense that we're not the only one interested in this technology, and we have to constantly think about how it is that adversaries may use it against us, in effect; but in many respects also an opportunity to improve how it is that we function from a defense perspective, and also across a range of issues in the United States that we have to think about in that context.

So, in so many spaces, and I think I'll just end on this, because Hilary pointed out one aspect of it: you can think about the opportunities that a certain technology presents in the defense and national security space. Maybe it makes you capable of being more targeted in your operations, so that you can actually limit any kind of collateral damage that's occurring in terms of civilians or civilian infrastructure and so on; that can be one aspect of it. It can be a tremendous advantage in terms of trying to analyze the enormous amounts of information or data coming at you, really homing in on what actually matters, and analyzing it more effectively by recognizing patterns and so on that are useful to you. There's a whole series of things in the defense area and in the intelligence area that we would think of as useful uses of AI. But there's also just a tremendous area of science that can benefit, whether you're combining it with biology in the context of health, or thinking about it in the context of education, or in so many different spaces. And part of what you really have to do from a process perspective (and this is one of the challenges, but also one of the opportunities) is really think about how we ensure, as we're creating a strategy for a potentially disruptive but also advantageous technology for the United States government as a whole, that we have all the right people in the room thinking about that strategy, thinking about the different aspects of it, so that we can actually take the greatest advantage of it in terms of science, in terms of our commercial and private sector advantages and so on, but also in
terms of our defense and our foreign policy and all of these other pieces. How can we actually create a comprehensive strategy in this space?

That's great. I want to use this as a jumping-off point, Mike, to bring you into the conversation, because while Avril was grappling with these issues and thinking about strategy, you know, from the table in the West Wing, you're out here as the ambassador, in some sense, for the Defense Department to Silicon Valley. Your job is to think about how, operationally, we take advantage of advances in AI to achieve our national security and national defense objectives. And part of the challenge that you confront is that whereas 30 or 40 years ago, as Hilary described, Department of Defense investment was absolutely central to technological change, now so much of what's happening is in the commercial sector, and not in Defense Department research projects, either at universities or in the national labs. What does that challenge look like? How can the US have a national AI strategy when much of the technological innovation is happening in the private sector?

It is a challenge. And, kind of taking off on the notion of the Third Offset: today's third offset is AI, from our perspective. The evidence is in what Hilary said: all the other nations are imposing a national strategy. Countries are saying this is an imperative, we must do this. And the US needs
to do the same thing, in the context of having the technological advantage that we need for our own defense and for our own good.

It is a challenge. You know, we've come from the Defense Department funding a lot of the initiatives that drove technology early on. The venture community took over and has done an excellent job of funding technology, but that funding now outpaces the R&D funds that the DoD provides, and so we've got to close that gap a little bit if we're going to get somewhere and actually advance the state of the art. It's a challenge, but the opportunity is to cast the scenarios and the capabilities that the Defense Department needs in the context of the pull on the science. The science needs to be pulled along various dimensions: certainly in terms of healthcare and the societal benefits, it's all there. But we talked about recognizing objects. Recognizing objects in a defense context is a hugely difficult task, and it pulls on the science in a very powerful and creative way. If we can combine that, we end up with the ability to find objects for the DoD using the same kinds of capabilities we need to help humans when we have big fires in California, to rescue them, to do things such as figuring out where the trouble spots are after a flood and where you deploy forces. So there's a synergy between the requirement from the DoD, the capability that's needed by the DoD, and the science that's being provided in the Valley, and the trick is to try to bring it together in a meaningful way, and at the same time have a debate about: is this an object I'm recognizing because I'm going to target it, which I don't want to have, or is it something that will benefit everybody? So there is that fine line. But there's an opportunity, and if we're going to do it as a nation, we do need to do it as a nation.
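Since object recognition in aerial imagery keeps coming up as the canonical dual-use capability, here is a minimal illustrative sketch of what that baseline capability looks like with today's off-the-shelf tools. To be clear, this is not anything DIUx or the panel describes: the choice of torchvision's pretrained detector, the file name frame.jpg, and the 0.8 confidence threshold are all assumptions made for the example.

```python
# Minimal sketch: generic object detection on a single aerial/drone-style
# frame, using an off-the-shelf pretrained torchvision model. Illustrative
# only; "frame.jpg" and the 0.8 threshold are assumptions, not panel details.
import torch
import torchvision
from torchvision import transforms
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()  # inference mode; no training happens here

image = Image.open("frame.jpg").convert("RGB")
tensor = transforms.ToTensor()(image)  # HWC uint8 -> CHW float in [0, 1]

with torch.no_grad():
    (detections,) = model([tensor])  # the model returns one dict per image

for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score >= 0.8:  # keep only confident detections
        print(f"class={label.item()} score={score:.2f} box={box.tolist()}")
```

The point the panelists are circling is that nothing in code like this is inherently military: the same detector can be fine-tuned on wildfire, flood, or battlefield imagery, which is exactly what makes the dual-use debate hard.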
That There are probably some synergies, out there but let's talk about the cases when there aren't synergies. Part. Of what is different, about this moment is that we. Are outsourcing. To a set of venture capitalists, in Silicon Valley the. Financing. To, develop. The military capabilities, that we need in some fundamental sense, to, achieve. The third offset isn't, that a problem from your perspective. No. Because I don't see it I don't quite see to that extent, I don't think we're outsourcing, the. Financing, of the capability, the the capability, will get financed, in any case by. The venture community. What. Our task is I think is to provide. The provide, the pull on the technology, with the things that we really need. To. Help bring it forward faster, and to, do it in a coherent way with, use cases that are of importance, from a national perspective and, that. Pulls on the technology, so, I don't think it's all about the money although that helps but. I think it's it's about what. Capabilities we're trying to drive, into the marketplace the same capabilities, that, we want as compute consumers, that play into the, national sector as well let. Me bring Courtney, into the conversation, and then you know invite, my colleagues, to join in in the questioning. So. Hilary started us off with this discussion, about project. Maven and the, decision, of Google, employees to challenge. You. Know the contract. That the company had to be involved, in thinking about the use of AI. For. Drone strikes and the like, volunteer, represents, a very different model of a company for Silicon Valley everyone. Knows that Palantir, collaborates, with government, in fact, that's part of your mo as. A company, why. I mean you know what Google's, got going on here is an internal, revolt among, incredibly. Talented people about. Their collaboration, with government yet. Palantir, claims it as a badge of honor is, that a good business strategy, for Palantir, don't, you at risk of losing the really talented and, bright-eyed idealistic. Engineers. That, are gonna be needed to advance her company I think. I think one thing that Palantir, is has done a good job of since. The early days of the company is very explicitly, acknowledging. That there. Are certain types of work that we wanted to enable when, the company was founded, the. Initial. Set, of. Programs. That we, had built this, this infrastructure, around were. Allowing. Government, institutions, to address some of the challenges that had been identified in, the wake of 9/11, Commission. Reports. Identifying. That institutions. Within the intelligence community had. Failed to pull, the pieces together amongst, information, that was readily available so, we constituted. A company around the idea of data. Integration, and analysis with. This initial, problem set in mind of helping. Our. Government. Institutions, and agencies to. Defend. Their. Old democracies. And and the institutions, that, inform. And preserve the, type of society that we want to live in so we made we made that very explicit, and I. Think that was something that was, reflected in decisions, that employees. That came to the company. As. They as they thought about different, opportunities and by. The way we draw from the same talent, pool as as, as. Google, and Facebook and other companies that have have dealt with these. These issues in public discourse, and. In, by and large we're also composed, of a similar set of community members so I think this is to some degree there are students. 
that come from Stanford and other elite institutions who make a choice to be involved in this type of work. There is some self-selection bias, but I think there's a world of opportunity amongst people who recognize that there's maybe more nuance to these sorts of questions, and that there's an opportunity to engage in a meaningful way in the development of powerful technology, but to do so in a way that is also respectful
of some of the considerations that I think we're here to talk about in terms of the ethics of AI applications and powerful information systems.

Let's stick with this for a second. So, Courtney, you've just identified the, you know, founding mission of Palantir as working on behalf of liberal democratic values in the wake of 9/11. We've heard about Project Maven and Google. Now I want to ask Mike and Avril: is Google being unpatriotic, as a company that's made this decision not to partner with the US government for these particular purposes in Project Maven, or any other AI deployments on behalf of the military? Or, if you don't prefer the language of unpatriotic, how about insufficiently loyal to the values of liberal democracy, which doesn't make it just about the United States but about the set of values that the country stands for? It's a loaded question. Well, Palantir is here to show up on behalf of liberal democratic values, and I would imagine the folks representing the US government would not feel a whole lot of anxiety about saying the same.

I think that it's not about it being unpatriotic. We are finding companies here in the Valley that are coming to us. We put out the problem sets and say, here are the problems we're trying to solve, do you have that capability? And we are surprised by the number of companies that are coming forward and saying, we'd like to work with you. In some cases it's not so much about helping the country, necessarily, being patriotic or not being patriotic; in some cases it's a business decision. In some cases it's an "I have this technology, I want to advance it by working on your problem" kind of decision. So, to some degree, I characterize the Google situation as not so much unpatriotic, but maybe uninformed. With some information, companies are choosing or not choosing to work with the Department of Defense, but it's after some information: why are we doing this, what's this all about? There's some conversation, some debate, some discourse there. I think the Google situation was completely void of all that.

But so what's the information? I'm at Google, you show up from the Department of Defense unit for investing, and you say, you've made this decision, I've got some information I want to share with you. What's the information I haven't been considering at Google?

I think what wasn't considered at Google was what was really the problem set and what is the science we're trying to pull on. I think the DoD, frankly, made a mistake by not being open about what they were really trying to do with Maven. It came across in the press as: we're taking movies from drones, we're going to analyze those pictures, and we're going to use them for bad things. And it just sort of flowed in the papers and in the media. And that may sound naive, but I don't think that's the case. I think if there had been more information and more communication about it, with people from Washington coming here and talking about it, the
outcome may have been different. To a degree; maybe not completely different, but I think the outcome would have been different.

One more pass at this. So let's say I'm a Google employee, I'm part of the group that was protesting the company about the project, and I say, as a Google employee, something like: I came to work here because I bought into the mission of "do no evil," and I want to deploy my talents on behalf of the development of technology so long as I'm convinced it's not being used to help kill people. And when I think about making contracts with the Department of Defense, that's oftentimes what's involved, even if it means killing people that folks at Palantir or elsewhere in the government, you know, identify as terrorists. That might be something that as a citizen I might wish for, but as a personal endeavor of my own technological talent, it sounds scary. What do you say to that person?

You do have to make a choice, and it's not necessarily binary. A lot of the ability to know precisely whether that's a bad thing or not a bad thing on the ground helps defend the cause, in the sense that if you do decide to do something about it, you are preventing collateral damage, for example. So that's the extreme: if you are going to use it for lethal reasons, you're doing, you know, a job that has to be done, but you're also precluding all of the collateral damage that may come to bear. That's one way to look at it. The other way to look at it is, and this is indirect but it's very, very true: if we have the capability of knowing who's a bad person or not a bad person on the ground, from a drone in the air halfway around the world, it precludes the other guys from doing what they're going to do, because they know that we're watching. So that's not a bad outcome.

I mean, if I could maybe push this a little bit further, to think about government capability, maybe start with Avril on this and then move to Courtney. And I'll start
with kind of a strange question, which is: why does Palantir exist as a company? And the reason why I ask is so you can give us a little bit of insight into the dynamics. You were saying at the beginning that Palantir was founded as a company because there was this need the government had, and you could develop information systems, which seems very worthy. But if that need was identified early on, why didn't the government itself actually go develop this technology and build the capabilities to be able to do this? Why was there a need to rely on a private company to do it? So could you help us understand a little bit of the dynamics of how government identified that need, and then how that turned into a private company?

Sure. So, would you mind if I just started with a little bit of what's in my head on that earlier question too, because I think it's relevant. From my perspective, I would not have come to the same conclusion that Google did on this issue. Maybe that's obvious, but just to state it boldly. And actually, Courtney, to credit you on this, I think you phrased it in the following way, which I quite agreed with: I think it was a missed opportunity for Google to engage in a conversation with the government about, essentially, what it was that they were doing in that context. But I do think that the conversation that was had at Google, with its employees, and more publicly in some respects, was an appropriate conversation. In other words, I think it is critical for individuals who are working for a company, individuals who are owning and managing a company, citizens, to think about the ethics and the appropriateness of activities that the government is taking, and whether or not you want to participate in that and, effectively, facilitate it in certain ways. And I think that's something that we should be talking about, and that is relevant
to, you know, a decision that they will then have the opportunity to make. I also think it's important to recognize when you have the skills and the capabilities, in your company or otherwise, to do something that may be useful for society, for your community, for the country in which you live; that should be a factor that plays into your decision-making on these issues. You know, there is talent, for example, in different parts of the country that can be brought to bear on issues that we're dealing with, and I think one of the challenges that I at least saw in this conversation, and that I think is worth thinking about, is this question of whether Google sees itself as an American company, as part of America, or as a global company, or as both. You know, what does that mean? How do we think about our responsibilities in that context? What are sort of the factors that we should be bringing to bear in that conversation? On the other side of things, I think the reality is, whether Google decides to contribute to Maven or pull out of Maven, those are both decisions that have an impact on the United States and its defense and strategies for the future. So there is no sideline that you sit on in this conversation; you're doing either the one thing or the other, but you're having an impact either way.

So, to build on that, and to your question: the reality is, there is talent and there is work done outside of the United States government that the United States government itself does not have the capacity to do within it. And, you know, one of the ways in which the intelligence community and others sought to essentially promote and fuel work that would be of use, in effect, to the national security of the United States was through an entity called In-Q-Tel. In-Q-Tel essentially provides seed money for companies that do work in certain areas that are of interest. And that is in fact a part of the mechanism that led to Palantir, and all that. Courtney, take it from here.

So, Palantir, I should clarify by way of level-setting, is not primarily an AI company, or at least we don't think of ourselves as primarily an AI or machine learning company. We think of ourselves as a data integration and analytics company, which is to say we work with institutions that have complex data assets that they have access to as part of their normal course of action. Those data assets come in all different sorts of forms and shapes, and they're siloed systems, and they don't interact. So you can imagine, when you have one institution that has 20 or 30 different data systems, they're trying to stitch that together to carry out their mission or mandate, and
those systems don't talk to each other, they don't come in a common format, and you're trying, in exigent circumstances, to deliver on a task and to address a significant need. It's a very complex order to carry out. So imagine scaling that to the size of multiple government institutions, where different governments and institutions are holding different pieces of information, and while they may have the lawful means to be able to share that information, because they themselves are individually dealing with the complexity of their legacy systems, they can't do it. So what we set out to do, and one of the reasons that In-Q-Tel made a small initial investment that helped found Palantir, was to deal with this very discrete problem, which in a lot of ways is a very unsexy problem: how do you get data to come together in a way that analysts, typically human analysts, can make sense out of that information, enrich it, and take action according to the institutional mandates? That was sort of the focus and the drive behind what we set out to do.

Why Palantir, as opposed to government entities, as an institution that could carry out this type of work? I think if you look at some of the complexities of government institutions, you see that there are, for better or worse, bureaucracies that come into play and make this type of information sharing particularly complex, and the technology to be able to do it may not be an easy thing for certain institutions. And so there is an opportunity for private entities to be able to plug into this space. There may also be opportunities to expand that technology, that sort of integration of information technology, into broader domains, because the reality is that this is not just an issue that exists within government institutions; virtually every large institution that builds up data assets over time is grappling with the same sort of issue.
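To make the "unsexy problem" Bowman describes concrete, here is a toy sketch of the normalization step at the heart of data integration: records from two hypothetical siloed systems, each with its own field names and date formats, are mapped into one shared record shape so an analyst can query them together. Every field name and record below is invented for illustration; this is not Palantir's actual data model.

```python
# Toy sketch of data integration: map records from two hypothetical siloed
# systems, each with its own field names and date formats, into one common
# schema. Every name and value below is invented for illustration.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Record:               # the shared schema both sources are mapped into
    source: str
    subject: str
    seen_at: datetime

def from_system_a(row: dict) -> Record:
    # System A uses {"name": ..., "date": "2019-02-14"}
    return Record("system_a", row["name"],
                  datetime.strptime(row["date"], "%Y-%m-%d"))

def from_system_b(row: dict) -> Record:
    # System B uses {"subject_id": ..., "timestamp": "14/02/2019 09:30"}
    return Record("system_b", row["subject_id"],
                  datetime.strptime(row["timestamp"], "%d/%m/%Y %H:%M"))

a_rows = [{"name": "cargo ship Alpha", "date": "2019-02-14"}]
b_rows = [{"subject_id": "cargo ship Alpha", "timestamp": "14/02/2019 09:30"}]

# Once normalized, records from both silos can be sorted and queried together.
merged = sorted([from_system_a(r) for r in a_rows] +
                [from_system_b(r) for r in b_rows],
                key=lambda r: r.seen_at)
for rec in merged:
    print(rec)
```

The hard part at scale is of course not the parsing but the schema mapping, entity resolution, and access control across dozens of such legacy systems, which is the gap Bowman describes private vendors stepping in to fill.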
So I want to follow up a little bit on this point, which is to ask the following question. It's a badge of honor for Palantir that you partner with government; it's something that you celebrate, something that's core to your identity, and in the language that's been used by some of our panelists in the last couple of months, it's maybe part of your North Star, your guiding mission in terms of what Palantir is about. The question is, what are the limits on that? How do you draw lines, if there are any? You know, Google has faced this challenge from its own employees about not participating, with its technology, in the killing of human beings. What are Palantir's lines? Is there anything that the government could come to you with, saying, this is what we want to do, either this government or another government in another country, where you would say, you know what, that's not consistent with our mission?

We are proud to work with government agencies, in the intelligence community, in defense, with special forces, with state and local institutions. But this commitment to working with the public sector, as well as our work in the commercial sector, is not without limits, and we as a company have to make decisions and trade-offs about what we're comfortable supporting. The reality of how this decision-making framework plays out is that it's not easy, because the problems that you deal with in these spaces are inherently complex. We mentioned this in a discussion earlier: in the earlier days at Palantir, we had set out with a task of defining a set of red lines that we could apply universally to all customers, or all potential customers, to define what we would or wouldn't do and the scope of engagements. What we thought would be a very brief set of maybe five to ten very clear red lines turned into a sprawling forty-page exercise that, when we applied it to both the universe of existing customers and all prospective customers, not just in the government sector but also in the commercial sector, and also with
respect to potential philanthropic customers, we found would leave you with virtually none, maybe a completely null set of people that you could work with, because every situation that you work with in the world is going to be fraught with some set of issues. That's the trade-off of engaging with thorny, knotty, real-world problems. So the approach that we took over time, that we built up through a lot of pain and experience, was grappling with the hard questions on the ground and gradually realizing that there are sort of families of resemblance that create a heuristic you can apply in any given environment, such that you may struggle with the first three engagements that kind of feel similar, but with the next few you start to really see what the similarities are, and you're much more effective at addressing those questions. But the short answer is, there is no short answer, and the reality is you have to really struggle and toil with the questions, inclusive of the moral dimensions of these types of engagements.

And can you tell us anything that's on the other side of the red line?

So, one example: in our commercial work, we made a decision that we would not work with tobacco companies, and that was just a principled decision that Karp, our CEO, made after some discussion within the company. But there are other instances as well.

Okay. So I actually wanted to pull us back away from this, though I hope we do come back to this question of bright lines and red lines and where we don't cross them, but to something that Avril brought up, which was this conception of these companies, and it's something that we've heard here in previous sessions, of these companies as American companies versus global platforms or global companies, and what the implications of that self-identification are for the work that they go on to do. So you said a couple of different things. You said, first of all, it's totally good, and we should celebrate the fact that these companies and their engineers and employees are asking hard questions. You also pointed out to us that, regardless of which way they make their decision, either one of those is actually a decision: deciding not to act is as much of a clear action as deciding to go through with the contract. But then you said, and we sort of kept moving past that, you said that you would have made a different decision. So, for companies who are based in America but are struggling with this question of whether or not they're an American company, what do you think are their responsibilities to the US government versus other entities that might have demand for their services?

Yeah. So, I mean, I think part of the challenge is that I don't know that my answer is appropriate or acceptable to be applied across the board, but I will give you my answer, right? From my perspective, it's informed by my own life experience, not surprisingly. One of the things that was sort of fascinating for me: I started off in science, in physics, doing work in that area, and then I opened up a bookstore café in Baltimore, and it was my first experience with any kind of business ownership; my parents, one was an artist, the other a scientist and a teacher. Having a business, in my experience... you know, I had been living in an apartment
building in New York City when I was a kid, and it was the first time that I started to think about what it meant to be a part of a community in that kind of way: to have a business and be part of the community. And you recognize, you know, oddly enough, that, like, politicians would stop by and talk to you (this is very strange, you know), council
people and so on. And part of what it started to mean to me was, essentially, that I had a kind of heightened responsibility within the community to make it a better community, in effect; that I needed to think about things like: what am I selling in my business? Am I doing things with the community that promote, you know, getting people who are out of work into work? Am I thinking about, you know, zoning issues? All kinds of things that I needed to start to think about in the context of being a community member. And what I came to realize from other business owners that I respected and was learning from was really: if I'm not doing it, nobody's doing it, right? If I'm not taking responsibility for the governance and the community that I'm living in, then it's not going to be the community that I want to be in in ten years, and it's not going to move in the right direction. And I see companies like Google, which are enormous companies, that have incredible power within our communities, right, and within our country, taking advantage of many of the opportunities offered by the United States, taking advantage of the infrastructure, of the political environment, all these things. That both provides them with a responsibility, in the sense that if they are taking advantage of these things they should be giving something back to the communities in which they live, but also, in really thinking about the future of where they're living, they should be contributing to that as well. And I think that's something that should be a part of the conversation, and the way in which they interact in this country.

I mean, so, taking actually that same community-oriented approach to what a business sees as its mandate or its community, and to bring it back to Courtney: is there a community with which Palantir would identify as its primary audience, that it sees itself as providing service to, beyond its customers? And if so, how would you define that community? Who are you in service of?

So I think that the first layer of community is the community of employees. Those are people who come from institutions like Stanford, people with diverse viewpoints and perspectives and political views, probably not that far removed from the sorts of political persuasions that are represented in this audience. And so one thing we find is that when we engage with that community on these hard questions about who we should be working with, or who we should not be working with, in the scope of those types of deployments, some of the hardest discussions we have are internal discussions, and we have to pass muster with that community before we can even go beyond that. I would say the other layer of community is this: as we set out from the earliest stages, we were a company directing
our efforts towards enabling certain institutions to preserve the society and the values that we consider to be core to our identities as employees of the company, but also as citizens. So that's probably the next layer of community, and I think there's a lot that falls out from that. As the company has grown and expanded into other sectors, and has moved into government engagements internationally as well as work with commercial institutions, we've had to broaden that vision to think about how, for example, private organizations play into this view of what the institutions are that are critical to preserving the society we want to live in, and that implicates a certain set of decisions. But I would go back to what I said before: this idea of being an active part, and having agency, in preserving liberal democracies is kind of central to our identity as a company, and it is a North Star when we make decisions that oftentimes have complicated externalities.

Let me try here to get at, I mean, the question I had in mind just before, when Jeremy was asking you about red lines that you might, you know, definitely not want to cross. You alluded there again, in invoking the values of the society that we belong to, to the institutions of democracy that you want to defend. Again, those are phrases that, understandably, would trip off the tongue of someone like Avril representing the US government; they sound stranger, perhaps, coming from a private company. But so, again, let's just see. I imagine that means something like: you wouldn't do business with the North Korean government; that's a bright line you don't go past. You could then make a more complex decision; maybe you do business with certain non-democratic regimes and their intelligence services or government agencies, because you have a view that by doing business with them, over some long-run horizon, you're acting on behalf of liberal democratic values. So take Saudi Arabia, say: does
Palantir do business with Saudi Arabia? And if you do, how do you think about it as democracy-preserving?

No, we don't work with Saudi Arabia. But I think it's an interesting hypothetical. It does raise the question of whether there are strategic alliances that make sense for us to engage in, and by nature of the work that we do, in the sense of the institutions that we serve domestically, we would have formal responsibilities to discuss the prospects of working with countries like Saudi Arabia with our counterparts in the US, effectively ensuring that they're comfortable with that type of work. But the details of whether we would engage, and under what circumstances, really come down to the complexity of what's being asked and an understanding of the actual context. Treating a country as a monolithic entity that is only represented by the depictions we see in a few brief newsreels, I think, may not do justice to the fact that governments aren't always monoliths; we know this to be the case in the US. So you have to really engage on the specific institutions that you might be considering contracting with or working with, and even more specifically, the type of work that would be involved, and then, beyond that, understand the trajectory of that work, whether it aligns with the broader set of values according to the institutions that we serve primarily, and then make a holistic decision based on all those considerations.

So one more thing that, you know, comes to mind for me here is to ask: I'm getting the impression that Palantir has a foreign policy, and it's making me anxious. Not because I necessarily disagree with the objectives, but because I don't know what business a private company has having a foreign policy. And the idea would go something like this: on behalf of democratic values, democracies
typically have civilian control of the military. So if the work of these folks here is seen, over some time horizon, to be distasteful to a sufficient number of citizens, the various leaders get voted out of office and the direction of the military changes over time. But Palantir has no such accountability structure to citizens. You might internally feel like you work with various agencies and, in some intermediary way, there's an accountability to citizens, but from my perspective on the outside, since I'm neither in the government agencies that you're consulting with nor in any way connected to Palantir, I wonder how it is that you feel accountable to external members of the very societies whose values you aim to defend. So why should I feel good about Palantir having a foreign policy?

I think it's a great question. I wouldn't frame it as Palantir, as a private company, holding a foreign policy; maybe foreign opinions. But I think your point about accountability is a very fair one, and I think it draws back on the point that we, the people who work for the company, have a sense of responsibility. So accountability foremost is reflected in the way that people within the company think about their comfort levels in working with certain institutions. But the reality is, we acknowledge that our view has to be much more sophisticated and cannot just be sort of a go-it-alone, technocratic approach to the world. We operate within a broader context where political discourse needs to be factored into the decisions that we make, and so what that means is that if we're going to go into potentially fraught environments, we need to have conversations with advisers and officials that we know and trust within government institutions, to make sure that some of the approaches we're taking are in alignment with the considerations that they could bring to bear.

Mm-hmm. Okay, Mehran.

Well, I wanted to bring Mike into this conversation, actually. So it sounds like you have a point to follow up on this, and maybe you can also expand on it. With respect to foreign policy: the US government has a pretty clear foreign policy, and so we see, you know, this arms race developing in AI, where one could think of China as an autocratic nation having very centralized control over the investment they put into things like education and science, and how they want to develop particular technologies for particular ends that they want to achieve. The United States, by contrast, seems to have a very decentralized view: are there particular companies that will work with us, are there particular projects that we could potentially fund, where the companies may or may not decide they're going to take the money to do something? In terms of how that plays out long term, how do you actually see the competitive situation between the United States and China with respect to their policies on AI development going forward?

Well, I think we talked about this earlier; we hinted at it. China has an advantage because they can centralize the way they decide things. They pick and choose the areas they want to fund. They make data available to anybody
that wants to use the data. They force the use cases that are going to advance the science. They come at it top-down, and there aren't sort of these lines of responsibility; they can start a company from the top down. We, as a democracy, don't get to do that. So to some degree we're at a disadvantage, and so you have to pick and choose where you can exploit what you're good at, and where you need to get better at things.

The DoD is a very bureaucratic organization, okay, and that's just the way it is. What we've got to start to realize is that the AI talent and the AI growth, the science, is in commercial companies, and the question is how we make decisions about policy and funding, all the way down from what comes out of Congress in a five-year funding cycle, and about what use cases we're going to advocate, to get the commercial companies to work for us.

I want to comment a little bit about Palantir also, and kind of step back for a second. Years ago we didn't have debates about why Northrop was making airplanes for us. Years ago we didn't have debates about McDonnell Douglas building the F-4s that the country needed. What's changed a little bit is that the technology advantage is not in carriers and ships and airplanes, which are clearly defense things, but in the softer science and art of information technology. It's in how data flows across disparate databases, in and around integration, and these things are being developed for the commercial sector. So I daresay what's causing this debate is the fact that we're trying to buy software rather than buy jet airplanes, and that makes it harder to draw the line between, you know, what is good and what is evil, which I think is way too binary anyway.

So really, I think we have to create the advantage by exploiting the things we're good at. One of the things I think we need to get better at is what I'm hinting at, and that is: the DoD needs to understand that they need to buy commercial technology, and buy software at the speed of software. We fund things in a five-year cycle, and even now we're finding technologies in the Valley that users in the field would love to have, and we can fund it and get it to them, but to scale it, the funding is four years out, and in those four years we lose the technology advantage. So we've got to work on some of these things. It's not a slam dunk. China has the advantage, I think; I think we're playing catch-up. But we need to be clear about what we can fix, and do it very quickly.

I want to move to a very specific example and get all three of you to react to this. When I was a kid, I might have been afraid of monsters, but today's kids probably should be afraid of killer robots, not just because they'll be on the television set or in the movie theater, but because it's a very real possibility, whether through artificial general
intelligence or the kind of technological advances that we have in the military, that you can imagine either the American military or other foreign militaries really putting into the hands of autonomous systems the decision of whether lethal action should be taken against a human being. So I want to ask, and we'll start with Avril, but I want to hear from both Courtney and Mike on this as well: can you imagine a situation in which you would be supportive of delegating to machines the decision about whether to take lethal action against a human being? Now, in answering that question, I want you to think about an environment in which not doing so might put us at a military disadvantage vis-à-vis our adversaries. So if your answer is no, do you really think that principle of preserving a human in the loop is so worth it as to bear costs with respect to our own military superiority?

So, a challenge in answering these kinds of questions, I find, is in actually drilling down on what it means to have a human in the loop, right? In other words, am I delegating authority to an autonomous machine to make a determination to kill? If I'm flying an aircraft and I have a target in front of me, and it has human beings in it, and that target is an identified target that we have decided we need to hit, but I give the airplane, essentially, or the machine or the computer, you know, a certain amount of space within which to decide when to take the strike, because it's more capable of figuring that out than I am, right? I've identified the target; it knows what the target is, in a sense, but it can decide when it's going to do it, and it can do it within a certain range of space, etc. And the reality is, we've sort of moved into that space already, right? There are places where we delegate to machines, within certain guidelines, things that they are supposed to be doing according to our instructions. So, to that extent, yes, I can imagine it. But at the same time, I can't imagine a scenario in which I essentially abdicate all responsibility for what target is going to be hit, who will be hit, etc., and say to a computer that is somehow capable, supposedly, of determining this: you're now responsible for the defense of the United States, and you decide who it is and when, you know, you should be taking strikes, etc. And the in-between is really important, and much more likely to be what we'll really have to deal with.

So, the way I think about this is as follows; and, you know, you could spend an entire hour on this issue, right, or more, a whole class maybe. On the one hand, I believe that we are going to see more and more AI, other technology, etc., be brought to bear on our defense through machines, in a way that is similar to what I described but even more advanced. And that is clearly because, more and more, we are seeing how quickly it is possible for an adversary to present a challenge to us, in effect, by striking us or doing other things like that. And the question is, when that loop becomes so short, right, how does a human being actually... are they capable of responding quickly enough to defend against the attack that is coming in? The question then becomes: how do we manage that in a way that retains both our humanity,
in a sense our principles, the sort of broader guidelines under which we are willing to use force; how do we do it in a way that is consistent with what we expect to be lawful under the circumstances, which is connected to our principles and our humanity and the way we've designed the law; but also, how do we do it in a way that retains accountability
when you do things that are outside of that box? Because that's one of the key issues, I think, that lethal autonomous weapons systems raise, and it's one of the key issues that's being discussed both internal to the government, where I know it was discussed when I was in government, and within the international community. There's a group of governmental experts under the Convention on Certain Conventional Weapons that is looking at this issue and trying to devise principles upon which to deal with it. And I think as we move through this, the key is really to, you know, very thoughtfully, and sometimes more slowly than people wish, but nevertheless, go through these cases and think through: okay, do we have a sort of rubric here? Do we think this is acceptable? Do we think this is out of bounds? That sort of thing, while at the same time, I think, keeping an eye on the policy development so that you don't actually create an arms race in this area, which is counterproductive to what you're trying to achieve, which ultimately is really effectively defending and preserving peace.

So let me ask you one follow-up question before I invite the others in. Because Avril is one of the most talented lawyers in the US government, you played an important role in thinking about the rules that should govern the use of drones, and in setting in place an architecture for making decisions and exercising the kind of careful judgment that's required about when force should be used in that way. Of course, one of the challenges is that that kind of system depends on the faith that people have in policymakers to operate in a non-transparent space and make those judgments in accordance with our ethics and values. We now find ourselves in a situation where a lot of the architecture that was built during the Obama administration with respect to human-enabled drone strikes has been rolled back by a subsequent administration. So I'm interested in your reflecting on that experience. What lessons would you draw for this careful, you know, calibration and experimentation with greater autonomy in decision-making, in a space that is not visible to the democratic process, that depends on a lot of trust and faith that people have in policymakers and experts to make reasoned decisions, and at a time when that trust is evaporating, given the events that we see in Washington? Is this something that really can be left to the executive branch to carefully navigate in the way that you described, or do we need some democratic deliberation about this and some external oversight of the use of these capabilities?

There's a lot there, all right. So first of all, I am not one of the most talented lawyers
in government; I'm not even in government anymore. But there are so many extraordinarily talented lawyers in government anyway, and it's true I participated in this effort. I'll try to boil what I'd say on this down to a few things. One is, I went into law in large part because I thought lawyers really understood, in a sense, how to effect change in society, and I was inspired through civil rights and a variety of other spaces where I saw law have an enormous impact. I have come now to a point where I still think law is critically important, and important to change and so on, but I think at least as important, and possibly more important in this moment in history, is culture, in a way. It's sort of the norms that we have that are not necessarily legally binding but nevertheless are the ways in which we determine what is acceptable behavior and what's unacceptable behavior, how we think about things and approach challenges. And it's largely because I think these things are so important in their interaction together to create an environment in which you can actually promote what you think is, for example, better decision-making, or values that you believe reflect the society. And I think in the national security space you do naturally have an area where, first of all, if the executive...