Refiguring the Future, A NetGain Event: Leaders from the NetGain Partnership

It's now my pleasure to welcome on stage the leaders of the NetGain challenge, along with Dr. Alondra Nelson. Dr. Nelson is president of the Social Science Research Council and professor of sociology at Columbia University. Her work operates at the intersection of technology, science, inequality, and race. So with that, I would like to welcome them now on stage.

Good evening, everyone. Eric, thank you for that introduction. I'm delighted to be here this evening with this distinguished group of philanthropic leaders, who are also pioneering in their commitment to helping us think as communities about emergent technology. With us this evening are Darren Walker, president of the Ford Foundation; Patrick Gaspard, president of the Open Society Foundations; Julia Stasch, president of the MacArthur Foundation; and Mitchell Baker, chairwoman and founder of Mozilla.

As you will have heard from comments earlier this evening, several of us in the room spent the better part of today trying to think together about the social implications of artificial intelligence through the lens of the arts and of creative expression. One of my takeaways, and I wanted to start by hearing what you made of the day's conversations and, of course, of the incredible, provocative words, ideas, and images we heard this evening, was the toggle between being caught in a gilded cage and simply facing prisons. If algorithms are rules and systems that give us directions about how to do things in the world, on the one hand there's a kind of gilded cage, maybe with a bit of beauty to it, and on the other hand we might simply be facing prisons. Either way, there's a kind of constraining force. On the one hand, art offers us incredible hope and new narratives as a way to think about emergent technologies. But on the other hand, as Trevor Paglen reminded us this evening, some of these systems are, in his words, irredeemably undemocratic. So there's this very Janus-faced, bittersweet moment that we're living in with regard to AI. And so I wanted to begin with you, Darren, and get your thoughts on our conversations today and this evening.

Thank you. I'm really delighted to be here. Algorithms and AI are not new; they are an updated form of control. We've always had control in society. We have now, in this digital age, simply created a new way of control, a new way of defining power. What I worry about is that through AI we are potentially going to replicate in the digital world all of the discrimination, bias, prejudice, and power imbalance that we see in the analog world. As the leader of a social justice foundation, what I am concerned about is power and justice, and how they will be advanced or constrained by technology. So for me, I think we have a responsibility to invest, for the public, in what I call the three I's: ideas, institutions, and individuals. That is what we basically do. We foundations are essentially financiers of many of the organizations and people whom we've heard from today. That's our job.

Our responsibility now is to ensure that we are resourcing the people who need to be engaging the public on these questions of justice; that we are investing in a new generation of institutions that haven't yet existed, and fortifying them to exist into the future; and that we are investing in ideas. Many of those ideas may or may not have long-term currency, but they will help to frame how the public understands this question of technology and justice. So that's how I see this.

Great, thanks. Patrick?

Wait, I don't get a different question? It's impossible to follow Darren, but it's even more impossible to follow Lauren. That was a remarkable presentation, and it really reminds me of something the science fiction writer William Gibson said, Lauren, when he said that Earth is the alien planet now. I really think that your presentation, and the work of artists like you who are examining and interrogating what AI means in our lives, allows us to pause, to hit the freeze-frame button, in order to begin to question things that have gone unquestioned for far too long. When Kate was making her presentation, she talked about phrenology, and I have to tell you that as a Black man in America, I feel as if I'm phrenologized in every single room I've ever walked into, all of my life. But to pick up the thread of where Darren started, I think that now I'm phrenologized in a way that's completely unacceptable, because it's not done in a way that we have any transparent read into, which is why this conference is so critically important. This is an urgent matter. This is something that's real and happening in our lives now. The future isn't stupid; the present is what's actually, really, insanely stupid.

I'll lift up a quick anecdote and example from the realm of public policy. The state of Indiana made a decision around 2008-2009 to figure out ways to streamline its welfare delivery system. The governor of Indiana was harping on one instance in which two co-workers in the system found a way to game things and steal eight thousand dollars, and so of course this meant that everything was broken. So over time the Indiana legislature replaced 1,500 live human beings who were administering welfare benefits with online tools, applications, and a call center. Over the next three years, there were one million more denials of benefits, a 54% increase over the three years prior. So people who have been on the margins, desperately poor, who looked like my family, were being denied life-saving, essential benefits because of the way these algorithms have become profoundly unaccountable. The EU Commissioner for Competition, Margrethe Vestager, said not too long ago that algorithms really need to be taken to law school.

This is really not machine learning; it's machine teaching. We are basically teaching these machines to make the same screwed-up decisions that human beings have been making for some time now. So this confluence of data science and provocative art is exciting to be able to invest in right now.

That's great. We'll go to Julia, but I wanted to ask you about that first. You provide a really powerful example of the sort of damage that algorithms can do to people's lives, and so I would ask you: what can the arts or philanthropy do in that space? What's to be done about this Indiana example?

Well, I would say this about philanthropy, and I think the artists who've already been on stage have spoken with their example about the power of their intervention: philanthropy is not as nimble and as inventive as artists are, but there are some things we can do. The companies that are producing these algorithms, that are benefiting from these algorithms, have a tremendous amount of power in the resources they hold and the access they have to policymakers. The work of philanthropy, of foundations like ours with our modest investments, can tip the scales a bit against some of those resources and that access, and we can help lift the voices that have been marginalized in these kinds of debates. We know that it's mostly people who are well off, who are wealthy, who have the most access to technology, but it's the poorest people in society who are most implicated by the outcomes. Being able to lift up those voices in public policy debates is critical in our advocacy. At the end of the day, as Darren said, and as Kate said earlier, this is ultimately and truly about power, and foundations can lift up the power of networks. There are examples in the U.S., in the EU, and in sub-Saharan Africa that we can pull together, cobble together, to push back against some of the worst abuses from these industries.

Thank you. So, Julia, in our conversation earlier today you brought up Mimi Onuoha's phrase "algorithmic violence" as a way of thinking about some of the more pernicious implications of artificial intelligence. You also talked a bit about how some of the investments you're making at MacArthur in the arts might be a counterpoint to that. I wondered if you wanted to share some of that with this larger group.

I'll do that in just a moment, but let me pick up a little on the notion of urgency. I have a hard time reconciling the tension between the increasing, uncritical acceptance of these systems and practices and what feels like warp-speed adoption of technologies and algorithms. So I'm wondering, and this is not what you can say when you're a foundation president, because you have to be optimistic at all times, but I'm wondering if we have enough time. If we actually have enough time to have the voices at the table that need to be there; if we have enough time to spark the interest of, and plumb the insights from, ethicists; if we actually have enough time to have an impact on systems that are becoming so pervasive and embedded in our lives. I think foundations need to do what you're saying, which is to make sure the only voices aren't the normative tech voices, the white male voices. But what I'm saying is that, plus the fact that none of us reads the terms of service; we just click "I accept." So even we who are savvy and understanding are complicit through our own uncritical acceptance of the changes. I'm worried about running out of time.

But I did raise this issue, and I guess it's part of the same thing: I don't think we're thinking urgently enough about the embedded flaws in these systems. We talk almost flatly about it: oh yes, there's bias; oh yes, there are errors being built in; oh yes, people aren't getting mortgages, and all the other kinds of consequences that play out in people's daily lives. But this doesn't feel like just a bad thing. It feels like a violent thing, and that's why I was so taken by that phrase, algorithmic violence. And when there's violence, it seems to me that it requires a different response than the measured research response, or the "let's see if we can negotiate with the good guys on the inside of the corporate giants" response.

Do we have strong enough advocacy demanding accountability? The funny thing was, I was on a panel with Darren maybe two or three years ago, and I think he said something about how we have to hold these algorithms accountable, and I thought, well, I don't even understand that; didn't people design them? That was before I understood that people have a role, but that then it's out of our hands in many ways. So I'm worried that the genie is a little bit too much out of the bottle.

Please. Yes. On the question of whether we're out of time, I think we have to look at a long time frame, and I think the answer is that we are out of time for some generation of people right now. That violence is occurring now, and more is likely to occur. And on the question of whether we're out of time to stop it before it happens, because we were prescient enough to know what's coming, I think that's true too. So there is not only the work of understanding; there's the work of building more representative development systems, understanding what's happening, building institutions, and getting rid of the systems we have today that are so biased, one by one. That's a path. I don't mean to be even more critical, but this stuff is happening now, so we won't stop it before it's in our lives, and some portion of us, many more than others, are going to experience it. So I think the urgency is real, but that shouldn't stop us. There's the person born today, and the person born tomorrow, and the person who's going to be denied next week, and six months from now, and a year from now, and so we have to make changes for the future. So yes, it's on us right now, and that should add to the urgency of building. There will be incremental fixes for a while: here's a system, and someone has actually figured out exactly how biased it is, and then we figure out the right pressure point in the political and power system to somehow try to get it fixed. It's very incremental right now. That needs to build into something more sustainable and more ubiquitous.

Can I just say, on this point: when I hear the question "are we out of time," as a society we have been out of time for a lot of people when it comes to justice. We were out of time twenty years ago in the war on drugs, when we were saying what the implications were going to be for Black and brown men in this country. We were out of time then. So today we're asking, once again, are we out of time? And the answer is no, we're not out of time; we lack the will. And so, Julia, one of the many reasons I admire you and what you're doing at MacArthur is that you didn't even know what I was talking about two years ago. You had no clue. But unlike most of us in foundation land, particularly big-foundation land, you got on it, because you understood on some level that for MacArthur's mission these issues were important. You got smart people on your staff, like Eric and others, you got your board on board, and you made a big commitment.

Unfortunately, we in philanthropy often aren't able to respond with urgency, because we are, like every other privileged institution in our society, afforded the privilege of moving at our own pace. If we are to internalize these questions around power and privilege, which implicate us, then we will work with more urgency on these issues. Because the reality is that for the privileged and the powerful, the implications of this conversation are never as dire in our society as they are for those who are vulnerable and those who have historically, in our analog world, been left behind. So the question is: how urgently are we working to ensure that in the digital world they are not left behind? Fortunately, artists, who always are the ones who can save our society, have always been the ones, whenever there has been a moment of social inflection, to call America out and to call the powerful and privileged out. Artists hold the mirror up and demand that we interrogate who we say we are against who we actually are, and they're doing that today through this work at the intersection of art and technology. That's why, for those of us who are investing, and there are not enough of us in philanthropy, we've got to figure out a way to broaden. Because if, five years from now, it's the same five foundations in that game that it was five years ago, that urgency is not going to be met.

So I want to push on this point a little bit, for all of you, about how you connect the dots. Lauren, Heather, and Dorothy showed us incredible works; for a moment, I think, all of us were enrapt, and we were literally taken somewhere else, taken to different worlds. So how do you take the power of that transformation and turn it into social change? What is the role of philanthropy in connecting the dots in that transformation?

I don't want to be the only one to speak from our perspective, because that perspective is trying to bridge philanthropy, technology, product, and the consumer market, which is very complex at the moment. But part of what connects the dots is the imagination and the ability to see something else. We see this in technology too, in building technology. Right now, everything is convenient. This is the problem with mass surveillance: Alexa is convenient, our phones are convenient. We surveil ourselves, even in our own homes now. So convenience is just expected, and people are resigned and have given up, as if there is really no choice. Even if I opt out of Facebook, the rest of the world is still surveilling me. So part of it is that you have the reflection and you have the idea, like the ones we experienced earlier tonight, and you can see a possibility. Humans are much better when they have seen, and internalized, that something different and better is possible. We see that even in technologists. I do love technology, so there is something about calling out, in technologists as well, the possibility of something different. Take something even as basic as a browser: before we built ours, everybody knew we were stupid. It was an IE world, and it was a Windows world, and we were just stupid. But if you can take a possibility and make it real, so people can see it, then behavior changes. So I connect the dots by thinking that the prototyping, the different worlds we were shown, demonstrates a possibility of what is and what could be different. If you can even get to a prototype of a different relationship to technology, then the possibilities multiply.

I was just thinking about the role of artists and the translational dynamic that an artist can create. We can fund research, and we can mount an information campaign to raise awareness about something, but there's something about the creative moment of interaction between an artist's work or performance and the people who experience it: they then see the issue in a way that is meaningful to them. So it seems to me that we could acknowledge, to a greater degree, the role of different translators than we typically think of. If we were to say that we want every single young person to experience the perils and the concerns related to technology, along with its opportunities, are we not acknowledging that there are different ways for those people to learn, and that one of the ways we want to pay a lot of attention to is the one that gives them an immediate sense of understanding? Because then, I think, if there's a sort of popular uprising of understanding, or a demand for accountability, that's an essential complement to a future that looks different.

I really have to cut off my Big Brother worry there. But to go back to the premise of your question, I just want to be really clear in noting that philanthropic foundations don't create social movements. What we can do, though, is help socialize the right set of questions that can form movements, and lift up a set of actors who can then make transformational change. The people who are pushing this technology into our lives are really driven by an efficiency paradigm. They're trying to figure out ways to make as many decisions as quickly as possible. But an efficiency paradigm isn't necessarily the best thing for things like social welfare and justice, because due process is a complicated thing. So in order to push back against that efficiency paradigm, we can help ask the right kinds of questions about how this technology is applied. For instance, in 2014, which now seems like a million years ago, there was an attorney general named Eric Holder, who actually had read the Constitution, who released a study demonstrating that the algorithmic risk-assessment tools already being used in courts were profoundly flawed. That set off a number of questions, which those of us in foundations picked up and disseminated, and connected to the work of grassroots activists on the ground, to artists, and to other data scientists; we helped convene conversations that spurred action and got the right questions asked in the halls of Congress. Fast forward now to 2018, brand new world, brave new universe, and we see the rapid adoption of that technology in ways that are truly distorting outcomes in bail decisions. And we're doing this with technology about which a recent study demonstrated that a random person picked off the street, paid a dollar, could make decisions that are much more accurate at predicting who is likely to be a recidivist.

So this is a dangerous time, with a dangerous set of tools being controlled by unaccountable, poorly informed, and deeply biased actors. In philanthropy, appreciating this, knowing something of the history, knowing that everything old is new again, can drive a set of strategic decisions and investments that create linkages and networks that help motivate a pushback. But we can't be the primary progenitors of that, so I'd be careful about how I'd take that framing.

Darren, you were going to add something?

I was just saying that when I think about the things we lack as a society today, the thing we lack most is empathy. Basic, fundamental empathy as a society. Technology does nothing to contribute to producing a more empathetic society, or to our being a more empathetic people, and at the end of the day, in order to have justice, a society must have empathy. So we in philanthropy should be doing all we can to support the inputs that make us more empathetic, which is so much about artists, art-making, and arts institutions. So why aren't we investing more in artists, arts institutions, and art-making? I think we need to think about that more.

And then, within our own institutions: at the end of the day, our priorities, no matter what any big foundation says on its website... get the document that's not on our website, which is our annual budget, and look at what we prioritize in terms of grant-making. That will tell you what our priorities are. The reality is that we aren't allocating enough resources within our own budgets, because doing so means trade-offs; it means saying that this is more important. I know this from my own experience in my own institution, which had a very modest investment in this space, and thanks go to the good leadership of Jenny Toomey for educating me, because I was like you: I had no idea what net neutrality was until I met Jenny. Even so, I understood on some level the implications for justice, once I really came to know the implications of all of this technology. So you have to say, within your own organization: okay, we may have to spend less on blank, because this is more important, and we have to have a conviction about that. I'm asked all the time why we spend more money at the Ford Foundation on the arts and humanities than we do on jobs. That's a legitimate question, because jobs are really, really critical. One of the reasons is that more foundations already think that way. I'm actually more interested in helping people experience beauty. Beauty, yes, because everybody deserves it. So, as a matter of grant-making, it means being comfortable with a conviction that the arts matter in a democracy; that without beauty, people cannot be sustained; and that without creativity and free expression, a democracy will die. As a foundation with an objective of promoting democracy and advancing democratic principles, the arts matter a whole lot. For me, it's not that what we do at Ford is the right way, and I say this with all humility. But coming back to where Julia, I think, rightly positioned the conversation, around urgency: the urgency demands of us that we get out of our traditional boxes and that we upend the normative behavior that simply reinforces the status quo. The status quo is not going to get us out of this hole.

It's such a profound point around empathy, because the examples of predictive policing and of the routinization of social services are also examples of an abstraction away from human connection, again and again. The algorithmic society that's building up around us makes this more and more acute, in ways that are worth thinking about. I've been told we've got to wrap up, so one last musing for the four of you, around the future. We've been refiguring the future this evening, we've been refreshing the future this evening, we've been talking about the future this afternoon. What is the future role for philanthropy? We know it has to be around these issues, it has to be more urgent, it has to think outside the box, it has to be more nimble. What would you offer us as the future for a philanthropic world that is going to help us engage this important issue?

I've got a couple of things to offer; one is smallish and the other maybe a little bit bigger. The first is that if we're going to say that others need to be working, with our support or not, on a better world, we need to ask how we are modeling that; we need to be authentic in our own practices. In this arena of technology, at this moment, there is a need to actually demonstrate that to others, even in a small universe like philanthropy. One of the things the Ford Foundation just did was publish on its website all of the data that the Ford Foundation collects, in plain English, in bullet-point lists. So I know that if I sign up for a newsletter, you've got who I am and where I live, and if I filled out a little something else, you know I'm a woman and whatever else I am. The point is that we now know how you collect data and also what you do with it. That's the sort of thing where, if we could each say, "okay, I'm going to do that, and you should do that," we could then start to demand it of others. A little bit of modeling, I think, could make our engagement in this whole arena more authentic. Let me stop there and let others envision the future.

That's a tiny little piece, but a wonderful one. It's hard to engage with the future here, because Lauren has terrified me of the future. It feels like Shelley's monster has come to life, and I'm not quite sure how we regain control. And I'm someone who is already a bit of a Luddite, who fears these things. But I'll say this: I think it's important for philanthropy, for all of us, to consistently lift up examples of things that actually do work, in spaces where there's an instinct toward resistance, and to lift up a little bit of prescription. So, since we're in Chicago, and despite the fact that I am terrified of the explosive use of algorithms, I'll point out an example of algorithmic public policy that I think is useful and hopeful. Here in Chicago there is a system called mRelief, where you can go on a government website, put in information about yourself, and it tells you, by running a set of algorithms, what benefits you might be eligible for; it then connects you to a flesh-and-blood human being who is able to walk you through a set of protocols to actually access those benefits. Being able to marry technology with some old-school social service delivery is, I think, a positive prescription here in Chicago that could be modeled and replicated elsewhere. So being able to point up things that are working, even in spaces that cause us some trepidation, is, I think, an important role philanthropy has to play. I'm trying to believe that truth crushed to earth will rise again through Wi-Fi, so I'm going to hold on to that optimism.

I can't help but ask you about the Mozilla manifesto, which is a document, a message, that talks about what the future should be like, and which really responds to the question Kate posed to us about what is the world we want to create. Would you want to say something about it?

Yeah, that'd be great. First, though, I was also going to add something on what philanthropy should do: familiarity with technology is really important, and the same goes for artists. The ability we saw today, artists who have facility with technology, who aren't afraid of it and can do tech, is new in the last few years and amazingly powerful. So if and as philanthropy becomes comfortable with using tech, I think that will also be important.

We, as builders of technology at Mozilla, are also trying to represent the public asset, the public benefit, of the network: how we can build it, and what we can imagine, think of, demand of ourselves, and commit to ourselves as we build technology. We've always had a manifesto, for a decade or so, and recently, in the last month or two, we have added to it to express more clearly what the human experience might be. This may seem obvious coming out of philanthropy, or social justice, or the world of art, but for technologists it's actually a big deal to say that we think the internet and the technology we build should have as a goal a human experience of decency, rational thought, and civil discourse, as well as inclusion. So this is new. I spent a long time on it, to make sure that the Mozillians, the global movement that is Mozilla, feel that this is us. And now I hope to start connecting with other organizations and seeing what tools could draw us together as a louder and more effective voice, saying that technology, when it's built, should be both inclusive and aimed at the human experience, not just the efficiency argument.

Shall we give the last word to the one in the middle? Darren Walker, is there anything more you want to add?

I think, when you talk about philanthropy: Dr. King said something in 1968 to a group of philanthropists. He said that philanthropy is commendable, but it should not allow the philanthropist to ignore the very injustice that makes philanthropy necessary. What he was saying to us is that at the core of philanthropy ought to be not charity but justice. And if justice is at the core of your philanthropy, it will demand of you to do some things that will make you uncomfortable, because you are implicated in your philanthropy, and injustice is implicated as well.

So as we think about the future: while I too, like my brother Patrick, am terrified by some of the things that I see, I actually don't see a dystopian future. I think we have the capacity to have a better outcome this time. The question is just whether we have the will. And I'm encouraged. I'm encouraged in philanthropy, because I see more foundations coming to the table, more of us recognizing that even in our own shops we are woefully under-capacitated to take this on, and therefore looking for means, as we've done at Ford by hiring tech fellows and bringing on a new complement of staff. That has required us to get out of our normative patterns of hiring and talent recruitment and all of that. So if we're willing to get a little uncomfortable and step outside of our traditional behaviors, I actually don't see a dystopian future at all. I'm very optimistic.

So Patrick wants to get a word in; the acceleration we've been talking about is happening to us on this stage. But please.

Darren's bringing Dr. King into this, and Dr. King always said that dissent and dissatisfaction have to be creative dissent and creative dissatisfaction. So I just want to note that, in order to bring the artists back in as the exclamation point in this conversation, I hope that their creative dissent and their creative dissatisfaction will be disruptive for those of us who are on stage as well, and that you will question the sorts of commitments and investments that we make in this space. It has just been absolutely incredible. Thank you.

So, just a final word, and I think it is this: we ought to challenge ourselves to ask whether there is anything we're working on that could not benefit from greater engagement with artists and with this kind of reflection. On this topic, yes, clearly, but I meant beyond that: if we're talking about climate, or nuclear risk, or journalism, or anything else, do we have sufficient intellectual challenge and provocation? Do we have enough of that to make sure our choices are better informed?

It's a wonderful way to end. Thank you for the vision that is the NetGain partnership.

2018-06-14 14:30
