Refiguring the Future, A NetGain Event (Full Program)


Hello, everyone. Welcome to Refiguring the Future, a NetGain event. My name is Chance Williams; I'm a program officer at the Open Society Foundations. And good evening, my name is Eric Sears; I'm a senior program officer at the MacArthur Foundation. So it's wonderful to have all of you here today. Today's event is being sponsored by the NetGain partnership, which is a collaboration involving the MacArthur Foundation, the Ford Foundation, the Knight Foundation, Mozilla, and, this year, Omidyar Network. Collectively, we have a common interest in seeking to strengthen digital society and advance the public interest. Currently the NetGain partnership is focused on a topic that we broadly frame as the Quantified Society, and this is really trying to capture the idea that machines, and artificial intelligence in particular, are increasingly making consequential decisions that affect individuals and society across a range of domains. So with this event we're going to highlight the important role of artists in critically examining the social implications of artificial intelligence and related technologies, and explore how the arts can help us shape a future that is more just, equitable, and inclusive. In fact, the images that you've just seen are a manifestation of that: Trevor Paglen's work Behold These Glorious Times, which helps to show us how AI systems see the world. Earlier today the NetGain partnership hosted a workshop that brought together around 70 artists, curators, designers, researchers, advocates, and funders to spark a new, or renewed, sense of agency, creativity, and imagination in refiguring the future, and we want to continue that conversation here. This evening you're going to hear from leading researchers, artists,
designers, and NetGain leaders, and we invite you all to join the conversation on Twitter using the hashtag #NetGain. Be sure to stay for the reception afterwards and join us for beverages and casual conversation. We are very excited to be hosting this event here at the Museum of Contemporary Art Chicago, an institution that is so aligned with our values and what we're trying to bring forward with this event, and we're very excited to get started with the museum's Director of Convergent Strategy, Claire Ruud, who's going to come to the stage to say a few words. Claire.

Hi, welcome to the Museum of Contemporary Art. We are so pleased to be hosting this event with the partners of NetGain, and I think Chance is right: we feel incredibly aligned with the values that you have brought today and are continuing to bring through this evening's public sessions. There aren't a lot of things that our entire curatorial and programs department agrees on; fortunately, we all bring very different perspectives to the work that we do here at the museum. But one of them is this: art that addresses technological and social change is the most important art of our time. This program correctly puts artists at the center of conversations about technological and social change. There are three reasons that we need artists at the center of these conversations: artists are futurists, artists are self-critical, and artists are innovators. First, artists are futurists. They are always finding ways to peek around the next corner. Upstairs right now we are showing one of those futurists, Howardena Pindell. In 1975 she peeked around the corner: with her glittery, pastel, perfumed canvases she peeked around the corner of a very macho minimalism. And in 1979, with her video Free, White and 21,

she helped viewers to peek around the corner and understand the experience of living every day with racism. Artists are futurists. Second, artists are self-reflective. Some futurists might be satisfied with just predicting the future; artists are only satisfied after they have questioned the very assumptions on which those predictions stand. Artists are taught from day one to question. We can all learn from artists how to pick apart the tools and mechanisms through which we think, whether it's public discussion or our algorithms. Third, artists are the best innovators in the interest of people and society. This has to do with the one-two punch of being both futurists and self-critical. Artists have always been experimenting, pushing, and testing, and as we innovate, they are always interrogating the ethics of our innovation: toward what end, with what side effects, for whom, how, why. On June 23 the Museum of Contemporary Art Chicago will open I Was Raised on the Internet, a show about how the Internet has changed the way we see the world and the way that we see ourselves. This is a direct outpouring of our shared belief that the most significant art of our time critically examines technological and social change. As the first generation of digital natives comes of age, the questions we will talk about today are urgent. I'm looking forward to continuing to hear the discussion. Thanks.

Thanks, Claire. So it's now my privilege to welcome on stage Dr. Kate Crawford. She is a leading professor on the topics we're discussing here tonight. She has spent the last decade studying the social impacts of different kinds of technical systems. She is a distinguished research professor at New York University and a principal researcher at Microsoft Research. Most recently she co-founded the AI Now Institute at NYU, a university-based institute, the first of its kind, actually, that's dedicated to understanding the social and political impacts of artificial intelligence.

Thank you, everyone. Thank you, Eric, for that amazing welcome. I'd like to say it's such an honor to be here, and I want to particularly thank everybody who's part of the NetGain coalition of partners. I think what you're doing is incredibly important, but a particular thank you to Eric, to Chance, and to Mimi for bringing us together and really pushing this program forward. I have some images to show you tonight, so hopefully we'll have those in just a minute for you.

Tonight I want to talk about images: particularly, what is at stake in an image, and how AI systems are currently using images to be a force in our contemporary culture. Right now autonomous systems are classifying, labeling, and giving identities to people. Images are one of the ways that AI systems have learned to distinguish objects, like a license plate or a cat. But this can produce some significant unintended social effects that you might see up here, when AI systems try to make sense of the social world. Just ask 6% black Michael Jackson up here.

But first I want to take you back, back to November 1972. This is Lenna. She is the Playboy centerfold of that month, but she's no ordinary pinup. Lenna became what is known as a standard test image; she's been used to optimize everything from color processing to image recognition algorithms. I think of her as the mother of machine vision, or, if you like, she's Eve in the garden of AI. Lenna is now the single most used image in computer science history. Very few people know that she is actually a frame of a pornographic centerfold. What is interesting about this is it makes me think there's a type of original sin in the technical field: that there is this type of objectification of humans, and women in particular, but
there's also this pretense that somehow images, and the contexts where they came from, don't matter. Well, I'd like to suggest tonight that I think they do matter, and that now we face a situation where a host of unquestioned assumptions about race, gender, and sexuality are becoming the foundation of artificial intelligence.

But first, what does this term artificial intelligence even mean? It has so much hype around it, and it feels like it's everywhere and all around us, but actually the field has been around since 1956, and its definition has changed with every decade. When people talk about AI now, they generally are referring to a constellation of techniques, of which machine learning is currently the most popular. Now, these techniques have become a lot more powerful because of three things: the explosion of powerful computational infrastructure at the planetary scale; secondly, improved algorithms; but most importantly, massive troves of data, which have been harvested primarily by a handful of tech companies. But AI is much more than

just technical approaches; it's also social practices. It's about who designs these systems, who decides what problems are being prioritized, and how humans are going to be classified. (For some reason Mark is flashing in and out up here, so we could be at a congressional hearing right now.) Ultimately, the reason I like to think about the people behind AI systems is that they are also responsible for powerfully shaping the systems we have in the world, and I hate to tell you, but the AI field is very demographically skewed, to put it politely. The top seven AI companies in the world are overwhelmingly dominated by male engineers from very similar socioeconomic and educational backgrounds. This also affects the kinds of problems that AI is being applied to and which populations are not well served.

So you've likely seen lots of articles recently about bias in AI, and in some ways I think this term is too small, and it certainly shouldn't surprise us, because machine learning systems function by discriminating between us: by tracking how your purchases are different from my purchases, the particularities of your geolocation patterns, your social network, and then making predictions about how to influence you, whether that's trying to sell you things, trying to modulate your health insurance prices, or, if you're a political data firm, trying to change how you vote. All of these systems already exist, and they are trained on data that is informing them on how to make discriminations. But very often that input data is extremely skewed. For example, if you do an image search for CEO (and who knows, maybe we'll have the images back up at any minute now), I'm going to describe to you what you're going to get: you will see a host of white dudes in suits. And of course these images have all been collected
from the web, and now AI systems are recognizing them and presenting them back to us, but this can create a very powerful and sometimes concerning feedback loop. But our systems aren't just cataloging images and showing them to us; they're also interpreting them and tagging them with particular attributes. This is an image that comes from the YOLO9000 training set, and you can see that Barack Obama is being recognized as a leader.
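The feedback loop mentioned above, where skewed results feed the next round of data collection, can be made concrete with a toy model. This is an illustrative sketch only, an assumption for exposition rather than anything described in the talk: suppose each round of retraining over-selects the majority group in proportion to its current share of the results (a simple "rich get richer" update).

```python
def amplify(share: float, rounds: int) -> float:
    """Toy feedback loop: each round, a group's share of the results grows
    in proportion to its current prominence (share squared, renormalized).
    A stylized assumption for illustration, not a model of any real system."""
    for _ in range(rounds):
        share = share**2 / (share**2 + (1 - share) ** 2)
    return share

# A modest initial skew compounds quickly:
print(round(amplify(0.70, 1), 2))  # one round: 0.84
print(round(amplify(0.70, 5), 2))  # five rounds: 1.0
```

A perfectly balanced 0.5 share is a fixed point of this update, while any initial imbalance is driven toward 1, which is the qualitative worry: systems trained on their own skewed outputs entrench the skew.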

Michelle Obama is being recognized as an American. Okay. What kind of leadership here is being recognized, and why? In fact, when you work with these sets, you'll see how few women are actually ever tagged as leaders. And now these systems are being used across the US in hiring. This company, HireVue, is already using AI and emotion detection in job interviews: they're effectively tracking thousands of micro-movements in the faces of job applicants as they answer questions, then listening to their voices and the words that they use, and then mapping those patterns to the top leaders in their companies to decide whether to hire them. You can see the potential problems here, but this is already being used by companies like Goldman Sachs and Unilever. What are the deep cultural assumptions that these systems are being built on, and who might this not work well for?

So to understand how this is all happening, we need to ask: how does an AI learn to see, and what types of structural biases might be shaping that vision? This is a standard training dataset which is used in machine vision; it's called Labeled Faces in the Wild. Training sets like these are how AI systems learn to see, if you will. It's basically, in some cases, thousands or even millions of images made from human data scraped from places like YouTube, or from Flickr, or from news sites. This is how we are training contemporary AI systems to see: through the lens of our own past.

Now, Labeled Faces in the Wild has around 13,000 images, but it has some very notable skews. As people in this room have researched and shown, it's around 78% male and 84% white. So they're the people for whom a system trained on this will work best. But have a guess who the most represented face is; just have a thought about who it might be. It's kind of a hard one to pick: it's George W. Bush. He is actually by far and away the most represented, and this doesn't really make any sense until you hear that Labeled Faces in the Wild was built on an earlier training set, Faces in the Wild, which was created by scraping images from Yahoo News in 2002.
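Skews like the ones cited for Labeled Faces in the Wild (a single most-represented identity, a large male and white majority) are straightforward to audit once a dataset's annotations are in hand. A minimal sketch, using made-up stand-in annotations rather than the real LFW metadata:

```python
from collections import Counter

# Hypothetical per-image annotations (identity, gender, race); stand-ins only,
# not drawn from any real dataset.
annotations = [
    ("George W Bush", "male", "white"),
    ("George W Bush", "male", "white"),
    ("George W Bush", "male", "white"),
    ("Serena Williams", "female", "black"),
    ("Tony Blair", "male", "white"),
]

def audit(rows):
    """Summarize the demographic skew of an annotated face dataset."""
    n = len(rows)
    identities = Counter(identity for identity, _, _ in rows)
    genders = Counter(gender for _, gender, _ in rows)
    races = Counter(race for _, _, race in rows)
    top_identity, top_count = identities.most_common(1)[0]
    return {
        "most_represented": top_identity,
        "top_share": top_count / n,
        "male_share": genders["male"] / n,
        "white_share": races["white"] / n,
    }

print(audit(annotations))
```

Run against real annotations, a few lines like this surface figures such as the 78%/84% skew before a model is ever trained on the set.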

So it's no wonder Dubya is everywhere. The president, of course, gets a disproportionate amount of news attention, and for me this is a great reminder of how our datasets are reflecting our culture, but also our structures of power. Let's look at another benchmark training set. This is one that was released very recently, just at the end of last year, by Google. It's called AVA. It's made of hundreds of movie clips in which human actions are being extracted and labeled, and they said that they chose movies because they thought they represented the most lifelike activity. Lifelike, we think? Let's have a look at how lifelike it is. If you go through the dataset, you will notice a couple of somewhat concerning patterns. For instance, this is the category "playing with kids," and you'll notice it's all women. Apparently men don't play with kids. But don't worry, guys, if you're worried that you're not represented: all we have to do is go to the category called "kicking a person." All dudes. All men. And that is ultimately how an AI system is going to understand the differences between what men and women do: through mainstream cinema. These cultural lessons may not be the ones that are going to serve us best, and I think it makes human culture and gender roles much more rigidly fixed, rather than evolving over time. We need to consider ultimately what is at stake when we classify people in AI: not simply thinking about narrow bias or technical errors, but thinking about this as a repeating cultural practice that has consequences. This was actually a point made right back in 1751 by Diderot and d'Alembert when they were preparing what they saw as the complete encyclopedia of human knowledge. They had three classifications: memory, reason, and imagination.
And we started thinking: what would it look like if you built one of these for AI today? So this is something that we're doing at the AI Now Institute, just to keep track of all of these training datasets, and it looks like this. You can see where training sets are being made to categorize objects, people, places, and ideas, and if I zoom you in, you'll see how these sets build on the sets that came before them. So we have Labeled Faces in the Wild; before that we have Faces in the Wild; and you can go all the way back to some of the earliest facial recognition sets, like FERET, which was made by the Department of Defense in the 1990s. Right now we are experiencing the largest classification experiment that the world has ever seen. AI systems are categorizing everything from human actions, human feelings, and poses to millions of objects, places, and natural phenomena, but

the classifications we choose can have lasting and sometimes harmful effects. Back in the 17th century, Thomas Eckhardt proposed that there were 11 genders: we had genders for god, goddess, a woman, a man, an inanimate object, a beast, et cetera. Fast forward to 2013, and Facebook says there are two genders: men and women. By 2014, the same year that India's Supreme Court officially recognized transgender people and said that it is the right of every human being to choose their gender, Facebook quickly retired its binary gender classification and introduced this one, which has 71 choices in a drop-down menu. Now, we may see this as an improvement; it is certainly trying to be maximally inclusive. But what I would suggest is that these design decisions, between two genders, 71 genders, or whether having no gender field at all might be a good idea, are still incredibly important as political choices. Like race, attempting to settle the classifications of gender one way or the other can have the effect of locking in a worldview, rendering some people visible and some people invisible. We've had to learn this lesson about the power of classification so many times in the 20th century. This image comes from the Diagnostic and Statistical Manual of Mental Disorders, which back in 1952 listed homosexuality as a serious mental illness. It took over 30 years of activism before that was finally dropped, with devastating consequences for people who were identified this way. But now history is repeating itself in AI. Last year Michal Kosinski, whose work you may have heard of (it deeply influenced Cambridge Analytica), released a new paper colloquially known as the AI gaydar paper. They scraped thousands of images from dating sites and from Facebook, then ran them through a deep neural network to detect facial features that would supposedly allow you to tell if somebody was gay or not, just from their photo. Now,
this caused, as it should, enormous controversy. I think there are clear methodological and ethical problems with this paper, particularly when we consider that homosexuality is still criminalized in 78 countries, some of which apply the death penalty. But here's an even deeper problem, at the level of epistemology: these kinds of papers, I think, confuse relational categories like gender or sexuality with something naturalized, fixed, and concrete, like, say, labeling something a cup or a chair. And of course this might be giving you flashbacks to concepts like phrenology, where you could somehow tell somebody's true identity from their face and their head shape. In the 19th century, phrenologists believed that homosexuals could be detected by a bump on the back of their head, and I think you'll see from this illustration (I love this particular thing) that they say this with absolute certainty. So there you go. This, for me, is a moment where I think we have to sound an alarm about AI systems that are seeking to phrenologize humans, seeking to fix and label categories that are fluid and complex, and particularly when they make these claims with absolute certainty. But that's another thing. So now we're in a situation where AI systems are touching our lives everywhere from education to criminal justice, and given that these systems are very hard to see, how do we come to understand them better? How do we hold them to account? How do we start to understand and critically engage with this underworld of images driving AI? (Yeah, apparently he's happy, just so you know. That whole scene, we've been misinterpreting it this whole time.) Now, part of the problem here is that computer science as a discipline does not have a tradition of thinking about the work done by images, but this is where other disciplines are so important. Artists, for example, are rarely thought of as an important group in AI, but
in fact they are the ones most deeply trained in thinking about the cultural work that images do, because images are not always what they seem. Magritte once said that his paintings are mysteries that mean nothing; they just force us to ask a simple question: what does this mean? This simple act of questioning the meaning and work of an image (and of whatever that sound is, which is fantastic) is not just an artistic provocation but also a political one. So I personally have been very inspired by the artists who are contending with the bleeding technical edge of AI systems, to explore how they work and to show them to us in a new light, to ask: what does this mean? What are the cultural and political stakes here in these images? And why do I think artists can do this? Well, I worked

on a research project with a colleague, Luke Stark, a few years ago, where we started interviewing artists from around the world who were working with large-scale data and machine learning, and we asked what they were trying to produce with these tools. Again and again, artists would say that they were simply trying to figure out how to show people these sorts of decisions that are happening in the background all the time, and artists were thinking very deeply about the ethics of algorithmic systems. That taught me to think more radically about who should be included in the conversations about AI. Normally it's just computer scientists at the table, but we need a much wider lens, and we need to turn to those who look outside of the narrow boxes of technical systems. So that is one of the reasons that I'm excited to announce that we have launched a culture program within the AI Now Institute, to invite artists who are making critical interventions to work side by side with us as researchers, and I'm delighted that our inaugural artist fellows are here in the room tonight: Trevor Paglen and Heather Dewey-Hagborg. Amazing. Their work deeply inspires me to think of the many ways that we can make real interventions into opaque technical systems, because I think this is a crucial time to start holding these systems to account, because they're already being baked into everything from predictive policing to teacher assessment. And above all, this is what we want to avoid: we want to try to avoid a future of artificial intelligence that is artificially stupid. And let's be clear: I'm not suggesting that artists can reverse the power concentration in the AI industry, or even solve the lack of accountability in algorithmic systems, but they can help us to ask a bigger question. For example, right now the technical expansion of AI is considered
somehow inevitable, and all we can do is just tweak the edges, be it trying to technically clean up bias, or make better privacy settings, or write ethics codes. But all of these approaches are partial and incomplete. I would suggest: what if we tried to turn this arrow in the other direction, to ask instead, what kind of world do we want, and how can we design technologies that serve that vision rather than driving it? I think that's the much bigger question that we can ask ourselves, and by including artists and activists and philosophers and social scientists in the room and in the field of AI, I think we can move towards a much more just and equitable answer. Thank you.

Thank you so much, Kate. Okay, so next we'd like to bring up a group of curators and artists from REFRESH, a new collaborative platform, supported through NetGain, that's focused on creating a more inclusive vision of our cultural relationship to science and technology through art. So please help me welcome REFRESH co-founder and co-curator Heather Dewey-Hagborg, REFRESH artist Lauren McCarthy, and REFRESH co-curator Dorothy Santos.

Great. Wonderful to be here. So

Dorothy and I are gonna start things off, and then Lauren's gonna come in and give you all a nice long talk about her work. I can start, actually. Anyway, so in 2015 I started digging into the archive of a particularly prestigious media art festival that some of you might have heard of, called Ars Electronica, and I was curious about who was being represented, particularly among the top prize winners of this very well-known festival that gives out, you know, $20,000 prizes. What I found after digging into the archive, three decades of prize winners, was that it was men. Not that that was entirely surprising, but this led to the creation of a social media campaign that me and a group of amazing women worked on, called Kiss My Ars. We got some good traction on social media, so we ran this campaign in 2015 and 2016; we wrote a piece in The Guardian, and that kind of relaunched the campaign, and then we got even more attention. Meanwhile, Ars Electronica basically ignored the whole thing, or actually they posted a blog post of men talking about women, which was also not really addressing the problem. So after that we thought what we really need to do is something proactive: instead of just calling attention to the problem, we want to point towards all of the amazing artists that they're missing.

So, in a parallel universe (really, the West Coast), I was frustrated with expectations that I would perform my queer Filipina identity as someone covering art and technology, and my primary concern really was just to write about and curate work by artists whose practices critiqued Western medicine and used scientific methodologies such as DNA phenotyping, gene editing, and precision medicine. I came across the hashtag #KissMyArs and Heather's work, which I was writing about at the time, and I just knew that I had found something really special
that I needed to build this collaboration and this connection with Heather.

And so what is REFRESH? REFRESH is a collaborative and politically engaged platform at the intersection of art, science, and technology, established in 2016. As a collective, we begin with inclusion as a starting point for pursuing sustainable artistic and curatorial practices. REFRESH breaks down systemic, cultural, and economic oppression and offers validation and visibility to populations that have been historically marginalized, including women, transgender and cisgender people of color, LGBTQ people, and disabled artists from around the globe. And we're doing this not to check some kind of diversity checkmark; we do this because we really believe that this is where the most new and exciting ideas are happening, because the status quo is failing us and we desperately need new ideas.

So who is REFRESH? REFRESH includes Salome Asega, who is in the audience, a Brooklyn-based artist and researcher whose practice celebrates dissensus and multivocality. Dr. Heather Dewey-Hagborg, a transdisciplinary artist and educator who is interested in art as research and critical practice. Kathy High, an interdisciplinary artist working at the intersection of technology, science, speculative fiction, and art. Lynn Hershman Leeson, an artist and filmmaker who has received international acclaim for her art and filmmaking over the past five decades. Tiare Ribeaux, a Hawaiian American artist and curator and the founder of B4BEL4B, a gallery focused on innovation and radical inclusivity, showcasing underrepresented groups in technology and media. Dr. Camilla Mørk Røstvik, a fellow at the School of Art History at the University of St Andrews, UK, where she is researching the history and visual culture of menstruation. Dorothy R. Santos, a Filipina American writer, editor, curator, and educator whose research interests include biotechnology,
digital and computational media. And finally, Addie Wagenknecht, whose work explores the tension between expression and technology, blending conceptual art with digital hacking and sculpture.

Yes, my favorite topic. My favorite topic! So diversity is one thing; inclusion is far more challenging. In the conversations that Heather and I have had, along with the rest of the REFRESH team, I kind of think of inclusion this way, and the reason why this is difficult you'll see in the example that I provide: diversity is everyone's invited to the party; inclusion,

however, is when you're invited to the party and, okay, I'm just gonna put it out there: in my family, my immigrant Filipino family, pork is a leafy green, and for years I was a vegetarian. So what does that mean? Inclusion means that someone who invited me should know what my needs and my wants are. How will I feel included? It's one thing to be invited to have a seat at the table; it's another thing to know your guests.

So our first project is an exhibition and symposium that we're calling Refiguring the Future. It starts from the premise that our ideas of the future are tired and need refreshing. "Refiguring" is inspired by one of our participating artists, Morehshin Allahyari. Refiguring is a way of repurposing: in her work she takes the tools of digital colonialism, like 3D scanners and 3D printers, and uses them as probes and generative mechanisms for thinking about and creating new futures, reimagining and re-rendering histories. So to refigure is to hack, or to take apart problematic structures and to assemble this debris into something other. We're not merely stating and restating problems; we are pulling from different realms of thought and practice, such as punk, biohacking, and do-it-yourself culture, with the aim of taking things apart and building something less oppressive, fumbling towards a progressive vision and a liberatory future, showing spaces of possibility. But we cannot refigure the future without bringing in friends.

So we're very excited to announce (actually, all of this is new, so we're basically announcing everything today), but in particular announcing here first, that we're teaming up with Eyebeam, the legendary, amazing Brooklyn-based organization that's been supporting artists working with technology in political ways for two decades. And Eyebeam
is such a natural partner because they're an art organization that really cares deeply about the future of art, about the definition of culture, and about technology's effects on humans and the human experience. Roddy Schrock and others from Eyebeam are here in the audience. Yay! And together we have just brought on a two-year curatorial fellow, Lola Martinez. She's in the audience: a Cuban American curator and researcher whose practice focuses on technology and coloniality. So in essence this individual is helping us shepherd REFRESH and Refiguring the Future into the future. The future of refiguring the future! Yeah. We're excited also to make our first public announcement that, after two years of fundraising and very hard work, we also have a partner venue and some

approximate dates as well. So we're going to be working with the Hunter College Art Galleries in Manhattan, New York, to feature this inaugural exhibition. And, incredibly importantly, this wouldn't be possible without the generous support of the NetGain partnership. So look for this in the early part of 2019: late January, early February, we should be launching the exhibition. Okay, so enough about all of that; let's talk about our amazing artists. Yes, there are 15 confirmed artists, but we wanted to introduce and give you a sense of a few of them. We can't give away everything, you know; you'll just have to come to the exhibition. We'll talk about a few.

So first, Sonya Rapoport was a Bay Area based artist, and the reason why we are including Sonya Rapoport's work is because we wanted an intergenerational look at art, science, and technology. When Sonya was alive she was an interdisciplinary artist combining art, science, and technology in her work, and extremely highly collaborative. I'm convinced, and Heather and I talked about this, that Sonya was actually probably one of the first new media and digital artists engaging in social practice before it actually became a discipline. She was always inquisitive and enthusiastic, right up until her passing, and I actually had a three-hour studio visit with Sonya months before she passed; she was doing an artist residency, and she was 91 years old. In the work that you see here, Objects on My Dresser, Rapoport applied the principles of scientific visualization to the analysis of personal psychological space, drawing on objects found on her dresser. It was created over a period of five years, between 1979 and 1983, and you could say this is one of the first works of data visualization.

Claire Pentecost, who is one of our local Chicago-based artists, is an incredibly
Meticulous research, based artist writer and activist, whose work over the last three decades has, tackled the bio politics, of food. Agriculture. Bioengineering, and, most recently the Anthropocene. Clare's. Writing in particular on the idea of the public amateur, who. She says is one who concedes, to learn in public a shewing, the performance, of expertise, and instead, highlighting, the endless, sensitivity. Of knowledge construction this.

idea of the public amateur is a major inspiration for Refresh and for Refiguring the Future. So in the work you see here, called Proposal for a New American Agriculture, Claire has composed an American flag.

micha cárdenas, also in the crowd, is an artist and scholar focused on the different ways technology mediates the human body and experience. So here we see her project Pregnancy, which is a hybrid poetry and bio-art project that presents a vision of trans Latina reproductive futures. This project in particular, and really the overall breadth of micha's work, puts women-of-color feminism in dialogue with bio-art through DIY biotech and poetry describing trans of color experience.

Lee Blalock, who's another one of our Chicago based artists, says so many amazing things; this is a quote taken from her bio page: "If my practice has one goal, it's to express from crown to toenail all that is censored during the performance of daily life." I just love that so much. So Lee's work deals with the limits and restrictions of the body, and with thinking outside the concept of the body as we know it. She imagines other worlds, other bodies, other systems, and other rules beyond what we tend to think of as the human. Lee's work involves performance, painting, drawing, programming, and working with systems, order, and rules, and she really sees them as having laboratory potential. The image that you see here is from a piece called Capsule Performances, where she built a four-foot cube in her studio in which she would perform, using this four-foot cube as a canvas for arranging technologized, captured images of her body.

Mary Maggic is an artist and biohacker working at the intersection of biotechnology, cultural discourse and civil disobedience. Her work queers the tradition of tactical media, harnessing technological tools and turning them into tools for fighting oppression. Her project here, Housewives Making Drugs, involves the creation of DIY protocols for making estrogen at home in your kitchen, which I'm doing when I get home, democratizing this political hormone while also asking questions of safety, medicalization and bioethics. And last but not least, Zach Blas, who also has an incredibly meticulous research-based practice and uses it as a way of radically questioning the underlying structures of contemporary technology.

In Contra-Internet, his work confronts the transformation of the Internet into an instrument of oppression and capitalism. Zach utilizes queer and feminist methods to speculate on internet futures and network alternatives. The centerpiece of the Contra-Internet constellation of work is Jubilee 2033, which you see a picture of here, a queer science fiction film installation that follows Ayn Rand and Alan Greenspan as they go on an acid trip to a dystopian future-present Silicon Valley. I really recommend you check out that piece if you haven't seen it. But most important for us, really, is Zach's hopeful approach, a way of really embodying our wish to refigure the future and not just to critique it. So on that note, we want to leave you with an excerpt of Zach's Contra-Internet Inversion Practice #3: Modeling Paranoia.

[video excerpt: "…in store for me… I hear a new world…"]

So, we want to see and we want to hear that new world, and you can learn more about us at refreshart.tech, that's our website. Thank you all, and thank you, Lauren, take it away.

Um, one second. Cool. Thank you so much, everyone, for everything; I'm so honored to be here. I'll just jump right into it. So my art practice really starts from my own social anxiety in almost every situation. And so I started out, this is really early work, but I started out trying to think about what are the things that I in particular do badly compared to everyone else, like smiling, and how could I solve this problem. So I started hacking together this little hat that would detect if I was smiling, and if I stopped smiling it would stab me in the back of the head until I started smiling again. Yeah. Anyway, it was part of a series of devices, like an anti-daydreaming device that would detect if I was talking to someone and vibrate violently around the neck if I stopped paying attention, or a body contact training suit that would detect if I'd gone too long without making some, like, normal human body contact, and I'd have to reach out to people and touch them in order to talk to them. I used this DIY aesthetic to suggest that anyone could make these devices to help themselves fit in.

But also in here was this question: what is fitting in, what are these optimal behaviors we're trying to achieve?

Joanne McNeil drew attention to the emotional labor required in all our interactions and relationships, especially expected of women. With her Emotional Labor browser extension, you can write your messages as tersely as you like, then click a button and all the positive emotional extras will be added in for you, to save that extra work. Or Kelly Dobson's ScreamBody, which allows you to contain a scream in a public place, if it's maybe not appropriate, and release it later somewhere more appropriate. So these tools make us laugh, but I think we're reacting to this underlying truth, that we often want these things. And on the other hand, what is this fitting in, what are these normal behaviors that we're trying to achieve, and what are the things we do to try and get there? I'm really inspired by the work of Sara Hendren; this is a project with Caitrin Lynch, Engineering at Home, where Sara thinks a lot about this question of what is normal and how do we expand it. In this project she was documenting interactions with a woman named Cindy who, after suffering a catastrophic heart attack and amputations involving all four limbs, began hacking and building what she needed out of household items, adapting her environment. The website points to these new ways of understanding who can engineer, what counts as a hack, what counts as engineering, and then, more importantly, what is this normal space and can we kind of blow the walls off of it. So thinking about this: I hadn't really solved all my problems with the hat, and I thought maybe humans would be better than an algorithm at following me and guiding my behavior, and so I wondered if I could crowdsource my interactions, and specifically my failing dating life. So I used a system called Amazon Mechanical Turk, which is a website that allows you to post small jobs for small amounts of money and get people to do tasks for you, usually things that people are really good at but computers are not so good at, like look at this image and tag it, transcribe this audio, et cetera. In my case, well, I'll just play this video and talk over it a little bit, but it sort of explains the project.

So during this month of dates with people I met on OkCupid, I set up my phone to stream myself on the date to the web, and then I paid people through this website to watch and decide what I should do or say next, and then I would get these text messages and I had to perform them immediately. This was sort of the log that I kept, and then I iterated on the interface to figure out what mechanism of feedback was the most helpful. These are some of the things they said; sometimes they would give their observations in the third person, sometimes directions. So I'd stumbled upon a way to live, but I was still really confused about the boundaries of my own self-concept, what it meant for others to have this kind of control over me, wondering: is this dishonest, or is this just the way I've decided to live my life in, you know, the year 2013, or does it become more like this collective group of people actually merging into one? And with all this, it's sort of about prototyping the future. This is right around when Google Glass came out and everyone was really concerned with the idea that people would be wearing cameras, but I thought, we're already carrying cameras; what's really concerning to me is the idea of people consuming information without anyone knowing, constantly. So I make these projects as sort of storyboards, trying to imagine what is this near future and what might it be like. And as surveillance and big data become increasingly ubiquitous, we're forced to negotiate new relationships with them. A common reaction is fear, but when it's all around us, how do we still go on with our lives? So one thing that really stood out to me with this experience was just being in it, and that was the one part that I couldn't communicate to everyone else. When I was doing it, it was really hacky; sometimes the system would break and I'd run to the bathroom, kind of log into the server and try to fix the code, and come back out like I'd just been putting on some lipstick.

But I wanted to make something where other people could try this, and so what I made was an app that anyone could use, a little bit less about the kind of discreet, secretive thing and more about just imagining the future where you use this. This is part of the trailer for it.

Crowdpilot is an app that lets you crowdsource your conversations. Bring along your friends, or invite strangers, to help you in any situation, like a date, a meeting, a family gathering, or just let them figure out what's going on. Your crowdpilots listen in and give you live suggestions of what to do or say. Relax. Take control. Take a chance. That's Crowdpilot.

Yes, it was an app, you could download it from the App Store, and the response was overwhelmingly upset. Outraged. "We should all give up now." "Just kill me." "It's an art project, I can tell; it looks too nice to be an art project." "Ban startups, ban ideas." Fox News had their take: "Well, if you can't beat them, join them. Thanks to the new phone app Crowdpilot, the NSA is no longer the only snooping game in town; now you are too. So, tech guru Cassie with the slant on how it all works." "What are we talking about here?" "Well, I wish I had this app back in my dating days." "It's weird, it's just nerves everywhere; this is actually tempting evil right there." "No, but this is actually a great idea, because what it's doing is making another form of what we're already using: social media." But then, wait a minute, they get it sometimes. Um, you know, this is a little bit silly and outrageous, but the point for me is looking at the internet as a means of distribution and dissemination, and so the viral video is something that gets people to see this thing, and maybe they're outraged at first, but they start to have conversations, and more importantly they have to decide: will I use it, and how? Right, you don't get to just have a reaction; it's actually a possibility to download this app. And so you can actually drop into these sessions; these are some of the ones from right around the launch, you know, "help me convince my roommate to move out" or "help me settle this argument with my boyfriend." Amidst the rapid technological development and social discourse, it's so easy to dismiss something new with a knee-jerk reaction of like or dislike, but the innovations and the changes we're seeing aren't black and white. So to move forward I think we have to engage with these tensions and questions. The goal of this app was to create a space for people to do that: to engage, to question, and to decide for themselves how they wanted to live in the world.

Well, I became really obsessed with this idea of just watching people, and this next trailer kind of gets into the next project.

So, I wake up. I get dressed. I go out. I do things. I read a magazine, and I find out about people. Why do I know about their lives? Somebody should be knowing about mine. I want to share things with people, but I don't want to have to talk to people and tell them what I'm doing; I think it'd be great for them to see what I'm doing. It takes time to build relationships; it takes time to touch base with people. So I don't want another relationship; I just want to have a relationship with somebody that I never have to talk to, but they just follow me and see me. Having a relationship with myself. If I knew somebody was following me, watching my life, it might add some more fun into my life; I like to play. Doing something, having fun for myself, but at the same time creating a new experience for somebody else. I think I'm a pretty positive person, and I think that the things I do are with consideration of other people. Who knows what somebody wants to see, but if I bring out the best of myself,

maybe that will spark something in them.

So Follower is a service that provides a real-life follower for a day. In order to be followed, you go to a website and answer two questions: why do you want to be followed, and why should someone follow you? If you're selected, you're sent an app, and when you open it, it just says "waiting for follower." And then one day, you don't know when it will happen, but you wake up and you're notified: your follower is now following you. The follower stays just out of sight, but in your consciousness. And the follower is me, so I'm the blue dot running down the street after the red marker; they're in San Francisco for this one, and I'm physically following the person. At the end of the day you get one photo of yourself, taken somewhere during that day, with a notification: you're no longer being followed. We're living in this weird, anxious time where on one hand surveillance is pervasive and out of control, and on the other hand we have this intense desire to be seen, to share the intimate details of our lives. There are sites you can go to to buy online followers; ten dollars can get you a thousand; you can find out how real your followers are. But could a real-life follower provide something more meaningful or satisfying? How do we perform when we know, or think, someone's watching, and does it change when we're talking about physical space, or one individual person versus a crowd of attention-divided people online? If Follower offers surveillance as a luxury experience, this is an app for people that not only have nothing to hide but need to be seen. Embedded in this offer: who are the people that don't have the privilege of hiding, of not being seen, just because of who they are or what they look like? And then there's the gig economy, where it seems sometimes we're willing to try any app that promises us something convenient, novel, or useful. But I think putting an interface between people is risky; it weakens your connection to the person on the other end. And I really like this hashtag, #lifeafter; it's like the tasks disappeared, but the tasks didn't disappear, just the people, because you don't have to talk to them anymore; you just use an app to control them and have them come and hold your spot on the beach until you show up at just the right time.

And I'm really inspired by, you know, this practice of following in general. I see that beeping, but I think I have more time, sorry; that was what I was told, at least. Um, maybe I'll just skip these. Anyway, but I have to talk about Heather's project, right? So I'm interested in this idea of finding a small piece of someone and sort of expanding on that. So Heather was walking around the streets of Brooklyn picking up pieces of chewed gum or a cigarette butt and extracting from them enough DNA to construct a portrait of what that person might have looked like. And I think there's this idea that you get some small piece of someone and then immediately your mind goes to, but also maybe the technology goes to, expanding on that a lot. Ultimately I want to create different sorts of interactions and relationships than we have normally; I'm wondering, as much as technology might separate us at times, could it also bring us together in new and interesting ways? I will skip these, but these are some of the final portraits of the people.

And then finally, to close, in this last project I started thinking a lot about, rather than public space, the private space of the home. I was thinking about these smart devices being sold to outfit our homes with surveillance cameras, sensors, and automated control, offering us convenience at the cost of a loss of privacy and control over our lives and homes. We're meant to think these are about utility, but the space being entered is personal. "Alexa, play My Girl." "Okay." The home is the place where we're first socialized, first watched over, first cared for. How does it feel to have that space occupied and controlled by these AI devices, oh sorry, programmed by probably a small homogeneous group of developers that may or may not share our social values? And then women, long the keepers of the home domain, as complicated as that idea is, are now further subjugated, their control undermined by the smart home assisting and shaping every activity.

I'm interested in other ideas of what the smart home might be. I won't talk too much about these, but I wanted to show them: Lucy McRae's video work, Noam Toran's Accessories for Lonely Men, a chest hair curler, a heavy breather, a plate thrower, you get the idea. We talked about Mary Maggic already, and a Krzysztof Wodiczko project reimagining our relationship to architecture and the stories embedded in it. And so with all this, I decided what I really wanted to do was try to become Amazon Alexa, a smart home intelligence, for people in their own homes. The idea is that, again, you could go to a website, you could learn about the project, you could sign up, and you'd record a video explaining why you wanted to get Lauren in your home. I would come and begin the performance by installing a series of custom-designed networked smart devices, including cameras, microphones, switches, door locks, faucets, and other devices. And then I would remotely watch over the person 24/7 and control all aspects of their home. I attempt to be better than an AI, because I can understand them as a person and anticipate their needs. The relationship that emerges falls in the ambiguous space between human-to-machine and human-to-human.

And so I'll close with this last video, which shows some of the thoughts on what it might be like for one of the participants to get Lauren in their home.

"Where are my car keys?" "Lauren knows that I like it a little bit cooler than Miriam does." "You know, I'm usually the one that does all these little extra things, so at first I was a little bit careful about asking her, and now it's like, how else can we live?" "Lauren has recommended that I get a haircut every three weeks, and let me tell you, it's helped with my self-esteem a lot." "I am able to simply approach and carry on a conversation with the opposite sex, where at one point or another that wasn't so easy." "Lauren, toothpaste." "Lauren would know what I want, but then maybe it's not what I really want internally. But externally, Lauren thinks that playing music or shutting down all my electronics is the best way for me to cope and wind down, when maybe it's not." "Lauren was actually able to help her manage her medication and take her medication on time, and everything actually got a lot better after that." "You have those friends where it's kind of about you, like the friendship is about you? That's what it was like. It's like a roommate, it's a friend, but we're always talking about me; it's always about me, whatever it is. Because it's a real person, and one that's been through perhaps what I've been through." "And then I forget that she's around, even though she's kind of always around, or I assume she's always around. And then I'll remember she's there, and I wonder if my hair looks okay, and then I can check in." "Because I don't really like the idea of Lauren being in control; I like the idea of her being in support, but not in control." "On the one hand I'm perfectly fine having Lauren around, and it's become very comfortable for me, but one might argue that at some point there's a side of ourselves that we want to keep private." "Every sort of data, all the output, goes straight to Lauren, and that's where it ends." "To avoid any kind of explosion or any kind of anger or sadness... there's never a time I really have to ask for anything; it's like Lauren already knows what I'm feeling, or Lauren feels what I'm feeling, so you can focus on the more important things."

I'm not some automated system, I'm not pre-programmed. And like, Alexa, Siri, they don't care about you, but with this there's nothing artificial; these are people, and with each one I'm watching and anticipating and trying to figure out: what is it that they need? And it almost becomes sort of like a game: sure, I can turn on the lights or

run the faucet, but what is the thing that I could do that would bring a smile to their face, or actually surprise them, or just make them feel something? I think we just have to try things; we have to engage with these systems; we have to decide for ourselves how we want to feel, how we want to live in the world with them. Thank you.

Thanks, Lauren, that was terrific. It's now my pleasure to welcome on stage the leaders of the NetGain challenge, along with Dr. Alondra Nelson. Dr. Nelson is president of the Social Science Research Council and professor of sociology at Columbia University. Her work bridges and operates at the intersection of technology, science, inequality, and race. So with that, I would like to welcome them now on stage.

Good evening, everyone. So glad, delighted to be here. Eric, thank you for that introduction. I'm delighted to be here this evening with this distinguished group of philanthropic leaders, who are also pioneering in their commitment to helping us think as communities about emergent technology. So here with us this evening are Darren Walker, president of the Ford Foundation; Patrick Gaspard, president of the Open Society Foundations; Julia Stasch, president of the MacArthur Foundation; and Mitchell Baker, chairwoman and founder of Mozilla. As you will have heard from comments earlier this evening, several of us in the room spent the better part of today trying to think together about the social implications of artificial intelligence through the lens of the arts and of creative expression. One of my takeaways, I guess what I was left with, and I wanted to start by hearing what you made of the day and the conversation, and of course the incredible, provocative words and ideas and images we heard this evening, was this sort of toggle between being caught in a gilded cage. I mean, if algorithms are sort of rules and systems that give us directions about how to do things in the world, on the one hand there's a kind of gilded cage, maybe a bit of beauty to it, and on the other hand we might just be facing, sort of, simply prisons, right? Either way, on either side, there's a kind of constraining force. On the one hand there is the incredible hope and the new narratives that art offers us as a way to think about emergent technologies, but on the other hand, as Trevor Paglen reminded us this evening, some of these systems are, in his words, irredeemably undemocratic, right? And so there's this very Janus-faced, or sort of bittersweet, moment that we're living in with regards to AI. And so I wanted to begin with you, Darren, and get your thoughts on how you've thought about our conversations today and this evening.

Thank you, really delighted to be here. And so

algorithms and AI are not new; they are an updated way of control. We've always had control in society; we have now, simply, in this digital age, created a new way of control, a new way of defining power. And what I worry about is that, through AI, we are potentially going to simply replicate all of the discrimination, bias, prejudice, and power imbalance that we see in the analog world, now in the digital world. And so as the leader of a social justice foundation, what I am concerned about is power and justice, and how they will be advanced or constrained by technology. And so for me, I think we have a responsibility to raise this for the public through investment in what I call the three I's: in ideas, in institutions, in individuals, which is what we basically do. We foundations are basically financiers of many of the organizations and the people who we've heard from today; that's our job. And so our responsibility now is to ensure that we are resourcing the people who need to be engaging the public on these questions of justice, that we are investing in a new generation of institutions that currently don't exist and fortifying them to exist into the future, and that we're investing in ideas. Many of those ideas may or may not have long-term currency, but they will help to frame how the public understands this question of technology and justice. So that's how I see this.

Great, thanks. Patrick?

Wait, I don't get a different question? It's impossible to follow Darren, but it's even more impossible to follow Lauren; that was a remarkable presentation. And it really reminds me of something that the science fiction writer William Gibson said, Lauren, when he said that Earth is the alien planet now. I really think that your presentation, and the work of artists like you who are examining and interrogating what AI means in our lives, really allows us to pause, to hit the freeze-frame button, in order to begin to question things that have gone unquestioned for far too long. When Kate was making her presentation, she talked about phrenology, and I have to tell you that, as a Black man in America, I feel as if I'm phrenologized in every single room I've ever walked into, all of my life. But to pick up the thread of where Darren started, I think that now I'm phrenologized in a way that's completely unaccountable, because it's not done in a way that we have any transparent read into, which is why this conference is so critically important. This is an urgent matter; you know, this is something that's real and happening in our lives now. If the future isn't stupid, the present is what's actually really, insanely stupid. Let me lift up a quick anecdote and example from the realm of public policy in Indiana a few short years ago. The state of Indiana made a decision around 2008-2009 to figure out ways to streamline their welfare delivery system, and the governor of Indiana was harping on one instance where two co-workers in their system found a way to game things and steal eight thousand dollars from the system, so of course this meant that everything was broken. So over time, the Indiana legislature replaced 1,500 live human beings that were administering welfare benefits; they replaced them with online tools, applications, and a call center. And over the next three years after that, there were one million more denials of benefits in those three years, a 54% increase from the three years prior. So people who have been on the margins, desperately poor, who looked like my family, were being denied life-saving, essential benefits because of the way these algorithms have become profoundly unaccountable. The EU Commissioner for Competition, Margrethe Vestager, said not too long ago that algorithms really need to be taken to law school, right? Because this is really not machine learning, it's machine teaching, and we are basically teaching these machines to make the same screwed-up decisions that human beings have been making for some time now. So this confluence of data science and provocative art is exciting to be able to make an investment in right now.

That's great. So we'll go to Julia, but I wanted to ask, Patrick: you provide a really powerful example of the sort of damage that algorithms can do to people's lives, and I

wanted to ask you: what can either the arts or philanthropy do in that space? What's to be done about this Indiana example?

Well, I would say this about philanthropy, and I think the artists who've already been on stage have spoken with their example about the power of their intervention: philanthropy is not as nimble and as inventive as artists are, but there are things that we can do. The companies that are producing these algorithms, that are benefiting from these algorithms, have a tremendous amount of power in the resources that they have and the access that they have to policymakers. And the work of philanthropy, foundations like ours, with our modest investments, can tip the scales a bit on some of the resources and the access, and we can help lift the voices that have been marginalized in these kinds of debates. We know that it's mostly people who are well-off, who are wealthy, who have the most access to technology, but it's the poorest people in society who are most implicated by the outcomes, and being able to lift up those voices in public policy debates is critical in our advocacy. At the end of the day, as Darren said, as Kate said earlier, this is ultimately and truly about power, and foundations can lift up the power of networks. There are examples in the U.S., in the EU, in sub-Saharan Africa, that we can pull together, cobble together, to push back against some of the worst abuses from these industries.

Thank you. So, Julia, in our conversation earlier today you brought up Mimi Onuoha's phrase "algorithmic violence" as a way of thinking about what might be some of the more pernicious implications of artificial intelligence, and you also talked a bit about how the work that some of the investments th…

2018-06-14 21:29

