Humans, Data, AI & Ethics – Associate Professor Theresa Anderson


What I really want to do in this opening address, to set us up, to get the juices firing, is get us thinking about what it is that we as a community can be talking about, not just today but from here on in: building partnerships and connections around the theme of data ethics, and looking at different ways we can make sure we're keeping humans in the data. I'll talk a little more about how I'd like to propose we think about doing that.

We are swimming in data. That's one reason that we as researchers, practitioners and activists are engaging with this, but it's also why in the general public there's this growing concern about what happens with the data. It's not necessarily about new data; a lot of data has been with us from the beginning. It's just that now we have new ways of capturing, understanding and knowing, and there's more that can get lost and more that can leak. Those sorts of concerns are what led to the creation of the Humans and Data network through the Connected Intelligence Centre. We started having these lunchtime conversations, we've had data drinks, and within the Master of Data Science and Innovation we've been discussing these issues. The three starter questions I put out to the community to get the conversation going are the ones you have here. How do we lift the lid, if you like, on the black boxes that do some amazing work? Sometimes it's nice to have a black box that does something I don't have to think about, but shouldn't we as humans be capable of lifting those lids? How do we interrogate the data science practices that are starting to shape the way we work, and can we still be humanists in the way we engage with data science practice? And where does ethics come into play? As Glen flagged for us, at UTS there's a really lovely opportunity for us as a university to engage not just with data science practice but to look at the intersection of data science with creative intelligence and social justice, three tenets, hallmarks if you like, of the way we at UTS are trying to engage as researchers, as educators and as activists.

So to start off I'll be an academic and give one definition of data ethics, because what I've found myself transitioning from is the idea of being an information ethicist into being a data ethicist, and there's one particular definition I want to put to you at the start. I know it's early in the morning and it's before coffee, but we'll put that there. Then I'll briefly explore some of the conversations going on around this notion of the rise of the machines, both as a positive and as a negative, and that sets the scene for something that some of the Master of Data Science students are going to engage you in immediately after my talk.

That will be a bit of a game to get you thinking about utopias and dystopias, and where realities might sit somewhere in between. Then we'll get to this idea of data science practice and data advocacy, and I'll close with some provocations for us as a community: ways of keeping humans in our data practices, making the invisible visible, thinking about when we need to and when we can make the invisible visible, and how we can go about creating truly informed data publics.

This is probably the densest slide; this is where I can be a lecturer, and after this we'll go on and have a conversation. So, this idea of data ethics. Ellen and I have talked about this, because when I first started having conversations with Simon and members of the team about what we use as the shorthand for the issues we're trying to engage with, I set up the community using this idea of human-centred data science practice, or humans and data. Now, increasingly, we're starting to see, particularly in the academic literature, this notion of data ethics. This particular text was put out by the Oxford Internet Institute and involves Luciano Floridi, and that in itself was a marker for me, because Luciano has been an information ethicist and an information philosopher, and to see him starting to frame things at the level of data, as opposed to the level of analysis that is information, made me go: okay, so what is his argument? So I pulled this little statement out; as I said, it's the densest part. If we're using this term data ethics, what do we mean?

The reference he makes to why this level of analysis around data, as opposed to information, is critical flags exactly the sorts of things that we social informaticists have been talking about for a few decades: that the relationship between people, information and technology is a co-evolving one. It's not just about technology, it's not just about people, it's not just hardware and software; it's the interconnections and the interplays. So when we start to think about data ethics the way Floridi and Taddeo are defining it, that fits very well. It's this sense that, whatever the platform, whatever the technology, it is not just the hardware, as he says in what I've highlighted, that causes the ethical problems; it is what the hardware does with the software and the data. That is our starting point, and that is the source of some of these new moral dimensions. So this sense of morality comes in as well, which I want to come back to in a moment.

We've talked in the Black Box Brown Bag about how this rise of the machines is not just about data; it's about two co-evolving forces, if you like: the growing capacity of digital technologies to make different ways of doing, collecting and connecting possible, and also the rise of data. So we have three general enablers that are talked about by different sectors of the communities having conversations around data science and these issues of data ethics. One is the growth of sensors, the Internet of Things, the fact that we are increasingly developing cyber-physical systems that make data visible.
Then we've got growing analytic capacity for actually trying to understand what that data could be telling us, and I'm going to come back to that: what it says, as opposed to what it should be saying, or what is not being said. And then we've got this growth of communication infrastructure. The collision of those three enablers, if you like, is what brings us to this moment where people are increasingly starting to say: okay, how do we as humans stay in this conversation? You also hear this notion of datafication. How many people here are wearing some sort of a Fitbit or a sensor, or using something that's tracking? We outsource to things that then tell us what our day was like, because we as humans don't have to remember all of that, if I have to count all my stats or know where I've been. So, outsourcing to the machine these new capacities not only for collecting but then for trying to make sense of it: that intertwining of new AI capacities and data growth is part of the conversation, and this notion of datafication is something people are referring to. And I love this story that was in Science earlier in the year, about the fact that data storage capacity is now such that people have started to develop ways of encoding entire data warehouses in DNA.
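To make the density point concrete, here is a minimal sketch of the basic trick behind DNA storage: with four bases, each nucleotide can carry two bits. This is illustrative only; published schemes of the kind the Science story covers add error correction and avoid long runs of the same base.

```python
# Toy DNA storage encoding: two bits per nucleotide. Real schemes add
# error correction and avoid homopolymer runs; this only shows the idea.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"keep humans in the data"
strand = encode(message)
print(len(message), "bytes ->", len(strand), "bases")
assert decode(strand) == message  # round-trips losslessly
```

At two bits per base, with bases packed at molecular scale, the "truckload of data in a room" arithmetic starts to look plausible.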

That particular story in Science talked about imagining a truckload of data just coming to your door or being moved in. So as we start to collect more, we have more that we have to preserve, or protect, or be able to access. And as you will hear in our brilliant argument tonight, as team machine, there are certain things machines are just going to be able to do better; even as an information ethicist, I know that.

What this does is lead to a growing concern, not just among researchers and activists but in the general public. You often see reference to the idea of the rise of the machines, and this is still one of my favourite images, from a Terminator movie: this notion of something that is half machine, half human. Where are the controls? Where are the stops? And I come back to Luciano Floridi and things he's been saying for decades about where we are, as humans moving through a world that is always connected to information and cybernetics. We live in an infosphere, he says; we exist as information organisms. As we move through the world we give off information and we collect information, and as he argued back in 2007, if you are spending more time connected than sleeping, you are an inforg. So hands up: how many people spend more time connected than sleeping? Okay, I was worried there wouldn't be any; I suppose if we had more undergraduate students there would be more sleeping than connecting, I'm not sure. For the most part, and increasingly, people sleep and are connected at the same time, so there are these growing concerns about what's happening to our bodies. So we are inforgs, and we're in this infosphere.

Now increasingly people talk about artificial intelligence and machine learning, and it's very much a part of our present and our future, and there are growing concerns about the fact that AI is now starting to be taught right from wrong. That begs the question: who is doing the teaching, and what are those moralities, what are those concerns?

There are also opportunities, really interesting possibilities in the ways some of these tools can be used. This particular story that came out earlier in the year followed a high school student who had spare time before he started at Stanford, and he had a couple of parking fines. So he did what anyone would — I'm looking at Tom in the back row, I'm sure you've played with these sorts of ideas — I've got some time on my hands, let me see what I can do with the tool to work out how I can beat this fine. He created a chatbot that helped himself, and then he worked out a way to start helping others. He offered it for free, started making great inroads, training himself and developing real professional capacity around working with AI, and then he made the decision to turn his talent towards trying to address homelessness. So again, it's lovely to see the different ways that some of these technologies, and the people who develop capacity with these technologies, end up doing what I like to think of as using these as forces for good. We have those choices.
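The talk doesn't describe how that bot worked, but the genre is familiar: a small decision tree over the legally relevant questions that fills in a templated appeal letter. Purely as a hypothetical sketch of the shape of such a tool — none of these questions or grounds come from the actual service:

```python
# Hypothetical sketch of a rules-driven parking-fine appeal bot: ask the
# legally relevant questions, then fill a template. The real service was
# far more sophisticated; this only shows the shape of the idea.

QUESTIONS = {
    "signage_unclear": "Was the parking signage missing, obscured or contradictory? (y/n) ",
    "meter_broken": "Was the meter or payment machine out of order? (y/n) ",
    "medical_emergency": "Were you responding to a medical emergency? (y/n) ",
}

GROUNDS = {
    "signage_unclear": "the restriction was not adequately signposted",
    "meter_broken": "the payment equipment was not in working order",
    "medical_emergency": "I was attending a medical emergency",
}

def build_appeal() -> str:
    facts = [key for key, q in QUESTIONS.items() if input(q).strip().lower() == "y"]
    if not facts:
        return "No recognised grounds for appeal; consider paying the fine."
    grounds = "; and ".join(GROUNDS[f] for f in facts)
    return (f"Dear Sir/Madam,\n\nI wish to contest penalty notice [NUMBER] "
            f"because {grounds}.\nI respectfully request that the fine be "
            f"withdrawn.\n\nYours faithfully,")

if __name__ == "__main__":
    print(build_appeal())
```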
Now, AI is in our future, but this idea of thinking machines and automation has been with us as a species for a long time, and I love this quote from Pamela McCorduck: artificial intelligence began with an ancient wish to forge the gods. These are two lovely early illustrations of ways that we as humans tried to imagine machines that could be us, be like us, but not us: something that could extend the things we as people wanted to do. In Greek mythology you have the Talos automaton that circles the island and can be there to protect it, a way of automating something for the forces of good, unless you are on the wrong side of that automaton. And then al-Jazari, an engineer now rising in prominence through historical revisitations of some of the early work in Islamic mathematics. There are some beautiful illustrations of different kinds of automatons doing hand washing, or playing musical instruments, doing things that are more mechanistic: devices you can just imagine someone coming up with as ways of enabling humans to experience more of the world.

Then later on we have a Catalan poet who also tried to imagine what a thinking machine might look like, and what I wanted to point to by having these three examples is this intersection between art and science, between the aesthetic and the technical. It is really an aspiration of mine to see that we can carry on conversations that look at ways we can keep aspiring in that way, and at ways of supporting the human spirit along these lines.

Then there's this idea of the bright and the dark, the different ways we as a species have worked with technology, because as I said, that started almost from time immemorial, and as technology began to develop in the Industrial Revolution you start to see a ramping up of concerns: the dystopic views, but also the utopic. In the example of Jules Verne you can see that both of those can inhabit the same person. Is anyone here familiar with the lost novel, Paris in the Twentieth Century? This was supposedly one of the first novels Verne ever wrote, but it was rejected by the publisher because it was considered far too dark and gloomy, and far too unrealistic. He was painting a picture of what it would look like to be in Paris in the middle of the twentieth century, writing in the nineteenth, and what he describes is a person wandering through the streets somewhat disengaged, because the technology in use separates the human from the community. It's interesting to think that in the nineteenth century that was considered far too unrealistic; you look at it now and think, well, quite future-oriented. And yet at the same time Jules Verne is also the person who gives us the most beautiful illustrations as part of the world expositions in the late nineteenth and early twentieth century, where he describes with excitement imaginings like electricity: imagine if a room like this had light and we didn't have to have windows. You can just imagine the excitement in those places. So you get, in the spirit of one person, both the bright and the dark.

That continues alongside the rise of robotics and the rise of technology in the twentieth century; you get this tension between humans and machines. The rise of the machines is there, and in film there are some lovely illustrations. Metropolis is one of those classics, where we see machines taking over.

And then Charlie Chaplin showing: well, maybe they take over, but let's have a light, bubbly look at this, take a comic pause, and think about what kind of future we could shape.

Then we come to 1945, and one reason I always stop to think about Vannevar Bush is my background as a Sovietologist. When I was first an academic I specialised in looking at the tensions of the Cold War, and at the different ways that we in the West were trying to understand what was happening in the East. My early job, which now doesn't exist because it's been replaced by a machine, was to try to understand all the data that could be extracted from the Soviet Union, to paint a picture of what was happening behind the Iron Curtain.

So, 1945: the beginning of the atomic age parallels the rise of the computational age, of new computational intelligence. When I read this essay by Vannevar Bush I always imagine someone sitting in a room, having experienced a world where an atom bomb has gone off, trying to think about what we as a species could do to prevent that from happening again. When I look at various passages in that essay, one of his aspirations is this: there is just too much information out there for us as individuals, and as trained specialists in our domains, to make sense of fast enough to prevent this sort of thing from happening again. Surely, with computational capacity, it behooves us — or, I was told, since English is not my first language I sometimes pronounce it wrong, I think I heard the other day it's "behoves" and not "behooves" — it's important for us to look at the different ways of taking advantage of computational capacity to understand ways of solving the world's problems. That section of Vannevar Bush's essay is often invoked as the argument for why we just need to pass everything to machines, why computational intelligence is so important; it was one of the essays that supposedly led to the imaginings of hypertext. Vannevar Bush created these confections, the Memex; he came up with different ways of working with computers to manage information, large-scale information. But earlier in that essay there's this lovely piece, which I quite like, where he points to the fact that there are still things humans do better: that for mature thought there is no mechanical substitute. What I hear when I read that is someone who recognises the need for us to harness our creative capacity, to find ways to support human thought, and to allow us to be human.

The other reason I like that Vannevar Bush piece is that now, increasingly, in these conversations about AI, this idea of the arms race comes up: is the battle over AI the next arms race? Are we as a species facing extinction? Is artificial intelligence going to be the end of humanity, or will it save us? This is where, increasingly, in various domains and disciplines and organisations and communities, internationally and within Australia, you start to see people coming together to think about the policies, the guidelines, the codes they can write to try to manage this seemingly inevitable force.
That's some of what we're going to be talking about today, in the conversations we have with people from industry who are coming in to share. It's the kind of conversation I want us as a community to continue to have. These aren't things being discussed only by professionals; these are conversations that are moving ever deeper into the general public. I can remember early on, when we were first developing the Master of Data Science and Innovation, Simon and I would talk about algorithmic accountability.

If you searched the hashtag for algorithmic accountability back then, you found very few results; now there are so many variations on it, and we've moved into even richer conversations around it, and that's not that long ago. These issues of transparency, opacity, accountability, justice, social justice, making sure people aren't left behind: these are the conversations that are really critical for us as a community to come together and have, and they're the sorts of things being discussed with ever-increasing frequency.

There's this idea of algorithmic politics: that algorithms can now shape politics, and that could be a threat to democracy if we don't put some sort of control on it; thinking about how to tame tools so that they function in our best interests. An oft-cited piece in ProPublica earlier in the year looked at the different ways machines can actually embed bias and perpetuate bias. It's the sort of conversation now increasingly being had in Australia as well. Genevieve Bell has returned to Australia after years working with companies in the United States, engaging with this human-technical intertwining, to set up an institute to start to build more accountabilities. And Ellen Broad, who you'll hear from if you stay for the debate this evening, or wave your hand and you can meet her over the coffees and teas, has been talking about these things now that she's returned to Australia as well.

What this reminds us of is the fact that algorithms are human. They are designed, at the moment, still by humans; we haven't quite reached the singularity. So we have the capacity to think about how to shape them. This next story I found really amazing when I came across it earlier in the year. This is a specialist, a woman with great machine learning capacity, who was building a robot, who understood how to build robots, and who suddenly found that her own robot was misbehaving. When she looked at it, she worked out that the code being used for facial recognition was not recognising her skin, because she has dark skin. She had done exactly what anyone would do: instead of building code from scratch, you use what's available. Facial recognition exists, great, I'll use it, I don't have to start from the very beginning. Then came the unpacking, realising how deep those assumptions and preconceptions went. What is used to train will then perpetuate certain assumptions and expectations. Hence the statement that computers don't become biased on their own; they learn that from us.
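One practical way to lift the lid on a system like that face recogniser is a disaggregated evaluation: score it separately for each group it will meet, rather than reporting a single headline accuracy. A minimal sketch follows; every number in it is invented purely to illustrate the audit, not taken from any real system.

```python
# Minimal disaggregated evaluation: a single headline accuracy can hide
# a model that fails badly for one group. All numbers here are invented.

from collections import defaultdict

# (group, ground_truth, model_prediction) — hypothetical detection results
results = [
    ("lighter_skin", 1, 1), ("lighter_skin", 1, 1), ("lighter_skin", 0, 0),
    ("lighter_skin", 1, 1), ("darker_skin", 1, 0), ("darker_skin", 1, 0),
    ("darker_skin", 1, 1), ("darker_skin", 0, 0),
]

totals, correct = defaultdict(int), defaultdict(int)
for group, truth, pred in results:
    totals[group] += 1
    correct[group] += (truth == pred)

overall = sum(correct.values()) / sum(totals.values())
print(f"overall accuracy: {overall:.0%}")                    # looks acceptable
for group in totals:
    print(f"{group}: {correct[group] / totals[group]:.0%}")   # the gap appears
```

The headline number here is 75%, which sounds tolerable; the breakdown shows 100% for one group and 50% for the other, which is the whole story.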
Now, we can't go back; we can only go forward. So we as a community have an opportunity to start thinking about ways we can shape that training, and this is where I come back to Floridi and Taddeo and the work they've been doing around naming this idea of data ethics and thinking about codes of practice. It's the sort of thing we try to do with the data science students we're working with in MDSI; it's the sort of thing increasingly being done, nationally and internationally, with various communities of data scientists and coders: thinking about all the different ways we can become more aware of the opportunities for morally good solutions. Now, that's still very, very loaded. The fact that it's up there is great, but what constitutes morality, and whose morality? These are still questions to ask. But to begin with, what I propose, and what we try to do in MDSI for instance, is to lift the lids on those black boxes and start trying to find the invisible work that goes into designing, building and perpetuating the use of these tools. That means creating opportunities to do what, in the words of those of us trained in social informatics, is called articulation work: looking at the work that is needed to get work done. So let's think about the tools and the coding that we pick up prepackaged.

You know, it's like the provenance work you do when you're trying to think about where your food comes from. Let's think about where our data comes from, and where the tools that are managing and holding and housing and processing our data are coming from.

So this is how we try to harness data's human value within MDSI. We start with this wonderful statement, still one of my favourites, from Geoffrey Bowker, who has worked on infrastructure politics and has dealt with the politics of categorisation for a long time: that raw data is an oxymoron, and it's also a bad idea. So we try to practise careful cooking; we try to think about the different ways we can look at the origins of our data and be careful with it. In our graduate attributes and our course intended learning outcomes we make this very visible. We think about the different ways that we can not just develop this capacity as students and as educators, but take it into our practices. And it would be lovely to imagine this becoming a mantra for us as a university community: to think about the different ways of embracing our ethical responsibilities, and of making the invisible visible. That, for me, is a particularly critical one right now. There are so many places where data is held up as providing evidence, and there is little opportunity, unless we grab it, to ask: who is in there? Who is not in there? Why are they not in there? What is not represented, and why is it not represented? What is underrepresented? What is misrepresented? A sketch of how routine that check can be follows below. We have to develop the capacity to speak, to be the voice for people who initially have no voice in those conversations. That's how we as a community lead data science and take a leadership role in this, and offer pathways not just for professionals but for communities.
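Asking "who is in there and who is not" can itself be made routine. Here is a small sketch of the simplest version, comparing group shares in a dataset against a reference population; all figures are hypothetical.

```python
# Sketch of a representation check: compare group shares in a dataset
# against a reference population. Figures are hypothetical; the point is
# that under- and non-representation can be surfaced automatically.

dataset_counts = {"women": 180, "men": 560, "non-binary": 0}
reference_share = {"women": 0.50, "men": 0.49, "non-binary": 0.01}

total = sum(dataset_counts.values())
for group, expected in reference_share.items():
    observed = dataset_counts.get(group, 0) / total
    flag = "UNDER-REPRESENTED" if observed < 0.5 * expected else "ok"
    print(f"{group:11s} observed {observed:6.1%} vs expected {expected:6.1%}  {flag}")
```

The threshold here (flag anything below half its expected share) is arbitrary; what matters is that the question gets asked of every dataset, every time.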

That brings me to this idea of data advocacy as well. It isn't just about training our students, and it's not just about developing good practices as researchers and practitioners. It's about thinking about the different ways we can be a voice for data advocacy, and for better education and better data literacies within the community. Going back to that fear of the rise of the machines: one reason it exists is that a lot of people feel disenfranchised, disconnected and without power in an increasingly datafied world, where it feels like you have to have expertise to know what to do. You go into a bank and they'll tell you who you are, and you go: well, I didn't think that was me; okay, how do I speak up against that?

There's this beautiful story that Jer Thorp tells in a Medium post he calls Turning Data Around, where he describes a high school in New York City that was identified as the saddest place in Manhattan, based on data that had been gathered through an API. A research group far removed from New York had gathered data about the city, about Manhattan, and they'd worked out: okay, let's do sentiment analysis, a very common technique. They mapped it, and there was this hot spot near Central Park that turned out to be a high school. So then they did what a good researcher might do: they tried to find a possible explanation, and what they found was that the date when they had collected that data coincided with the return of students to the school after a break. Well, that's why they're depressed, you say; they've come back from vacation; who isn't sad coming back to high school after that? The only problem was that those high school students didn't realise they were supposed to be sad, because they weren't sad and they weren't depressed. And we can just imagine the consequences of telling a high school student that they're supposed to be depressed. So there's a real agency issue there, and there's a responsibility.

So what he talks about is: let's think about the ways we could turn the data around. So often data flows in one direction; data comes from us, but it rarely returns to us. What would happen if we built into our practices — and we try to do this in MDSI, and lots of participatory researchers try to do this as well — ways of building conversation, a to-ing and fro-ing? Sure, carry on doing great data work, don't expect everyone to develop that data capacity, but build into your processes the way to turn that data back around, and make sure you are conversing with the people from whom you took that data in the first place.
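The pipeline behind that story is easy to reconstruct in miniature: score geotagged posts against a word list, average by place, rank. The sketch below uses invented posts and a deliberately crude lexicon; notice that nothing in it knows when the data was collected, which is exactly how a back-to-school day becomes "the saddest place in Manhattan".

```python
# Miniature sentiment-mapping pipeline: lexicon-score geotagged posts,
# average by place, rank. Posts and lexicon are invented; the pipeline
# has no notion of *when* posts were collected, which is the failure
# mode in the "saddest place" story.

from collections import defaultdict
from statistics import mean

LEXICON = {"love": 1, "happy": 1, "great": 1, "ugh": -1, "sad": -1}

posts = [
    ("midtown", "great coffee this morning"),
    ("central_park", "love this happy day"),
    ("high_school", "ugh back to school"),         # back-to-school day...
    ("high_school", "so sad the holidays ended"),  # ...dominates this place
]

def score(text: str) -> float:
    return mean(LEXICON.get(w, 0) for w in text.lower().split())

by_place = defaultdict(list)
for place, text in posts:
    by_place[place].append(score(text))

for place, scores in sorted(by_place.items(), key=lambda kv: mean(kv[1])):
    print(f"{place:13s} mean sentiment {mean(scores):+.2f}")
```

Turning the data around, asking the students themselves, is the only step that catches the error; the code cannot.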

Now, that doesn't mean you can assume they're going to understand how to make sense of it, unless you are starting to train them as well, and the problem is that this slows the process down. I say "problem"; for me that's not a problem, I'm a participatory, ethnographic researcher, but often people say, well, that's not happening fast enough. What's the point of something happening so fast if it's wrong? Increasingly, I think, we have an ethical and a moral responsibility to think about creating new ways of working with data. We have a responsibility for making the invisible visible in these cases, because data is given a voice by the people and by the tools increasingly being used in this process. We can give it a different voice. But we have to train ourselves, and we have to train our data publics. We have to do what Jer Thorp urges: create real, functioning data publics, not just paying the idea lip service but making the investment in education and in training, and holding data stewards accountable.

So can we actually code human values into AI? This is a growing question. Can we create a morally good machine? For this I'm going to go way back, to early conversations around ethics, and I love this idea that comes from Aristotle. I didn't look at the original language, so apologies to anyone who can read the original; I drew on an English translation. The good of a human being is related to being human. But humans, being human, are not perfect; humans are subjective. So if all knowledge and every pursuit aims at some good, that question of what we mean by "good" becomes an important part of the conversation. Whose good? Coming back to the idea of "morally good" underpinning the definition of data ethics that Floridi and Taddeo present: is there one singular good? What is the good for one community as opposed to the good for another? So again, there's another layer of conversation that we as a community have to have. And this brings me back to some of the work of Geoffrey Bowker, Susan Leigh Star and Karen Ruhleder on this idea of infrastructure politics.

In the same way that the old mantra used to be that it takes a village to raise a child, I think it takes an infrastructure to raise and make an AI. It's not just the technology, it's not just the tool, it's not just the training to work out how to design it; it's the entire system, the entire process, the village around that development. It's technical, it's human; it's physical, it's social; it's visible, it's invisible. It's not just one infrastructure; infrastructure is in fact plural.

So, increasingly, the group of data ethicists I've been having conversations with talk about trying to understand the ecology: what is this dynamic ecological space within which this AI is being grown? How can we remain alert to actors who are excluded? Are there particular communities or individuals who are somehow out of the loop on this that we need to deliberately bring in? Can we start to map the infrastructure, and map the architecture, that is informing the design, the implementation, the use, the evaluation? Can we identify critical choice points, places where maybe we deliberately need to intervene or add evaluation so that we can catch something? Because, again, when things are released into the wild they don't behave as we had intended, and with the best of intentions we can create something that is problematic. If we identify choice points early enough, we could potentially still release things, evaluate, experience, but then also catch something before too much harm is done.

So we need to find the humans in the assemblage. We need to think about the different ways we can bring those intertwinings more to the fore, become more mindful of these co-evolving relationships, and foreground the way that knowledge work is actually performed in these data science practices. Because the data is just the starting point: most people, quite bluntly, aren't really concerned about the data; they're concerned with what the data will enable. They're concerned with the way the data will help them get their work done, get from point A to point B. It becomes information, it becomes knowledge, it provides insight. So getting that right is really critical. Bringing that background work centre stage, for us as practitioners, as educators, as advocates, is where I think there's a real critical moment in our time, a critical point in the conversation around these concerns, for us as a community and for the general public.

It's about holding these algorithms accountable, and the designers of these algorithms accountable: asking the difficult questions before too much gets crystallised, before it becomes so institutionalised that you hear, well, we can't do anything about it now. How do we change those models? How do we recognise categorisation as a provocation? This comes back again to the politics of classification and the politics of categorisation, and Simon has written about this in a really nice Medium post, talking about the fact that these data points are just tiny little portholes. They do not replace a human; they do not replace human experience. They become useful opportunities for insight, but let's not give them more credence than we should.
Let's keep the conversation going. Let's think, maybe, instead of artificial intelligence, about the possibility of augmented intelligence, where we are actually harnessing the capacity of the technology and allowing the delightful notions of humanity to be made better use of. Can we think about the different ways we can allow ourselves to have hunches, to draw on intuition and gut instinct, in ways we often don't, because we often do not have time to think? We're very often caught up in trying to design and get something out into production. What I would love us to be able to do is to try to build a folding and an unfolding of uncertainties into our practices. I look to the back row, to my students from MDSI: you know we celebrate uncertainty. We talk about the different ways that, instead of removing those uncertainties, we can actually build them into our design practice.
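One concrete way to fold uncertainty into design practice is to make the system defer: act automatically only above a confidence threshold, and route everything else to a person. A minimal sketch of that augmented-intelligence pattern; the scores and threshold are illustrative, not from any real model.

```python
# Minimal human-in-the-loop pattern: automate only the confident cases
# and surface the uncertain ones for human judgement. The confidence
# scores and threshold here are illustrative.

CONFIDENCE_THRESHOLD = 0.90

def route(case_id: str, prediction: str, confidence: float) -> str:
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"{case_id}: auto-apply '{prediction}' (confidence {confidence:.2f})"
    return f"{case_id}: defer to human review (confidence {confidence:.2f})"

for case in [("A17", "approve", 0.97), ("B02", "reject", 0.61), ("C44", "approve", 0.88)]:
    print(route(*case))
```

The design choice is that uncertainty is not hidden or rounded away; it is the thing that decides whether a human gets involved.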

Can we start to code something, then try to resolve what might be working and what might not be working as well as it should, and create that to-ing and fro-ing?

The one last thing I want to point to is that alongside all this data work, alongside working with these new technical tools, the other thing I want us to think about is what it means to be human. For this I'm going to come back to a statement from an information ethicist who again helped shape my thinking when I was working on my thesis: David Levy, who had worked at Xerox PARC as a high-level designer on some real cutting-edge technologies at the time, and who then moved into academic practice, where he suddenly realised that what he had imagined the academy to be, an opportunity for thinking, was quite the opposite, because being an academic is a lot of work. It involves a lot of management; it involves a lot of putting together information. What he found, ironically, was that he had little time to think. So he looked back at that essay from Vannevar Bush, the one titled As We May Think, and said: it's interesting that at this very moment, when we as scholars, and the general public too, have computational tools that allow us to outsource our memory and to gather information from more sources than we could have imagined even five years ago — and this statement still holds today — we are losing the time to look and to think at exactly the moment we have produced a remarkable new set of tools. In his case he was talking about scholarly investigation, but we could widen that out to the wider community. Just at that point, we've lost the opportunity to do what humans do best: to be creative, to ruminate, to understand what our gut is telling us, to think about instinct.

This is where the idea of slow data comes about, and one of the posters you'll see in the poster session will point to this, because one of our students has been engaging in an activity inspired by the work of two information visualisers and data specialists overseas, who have started to ask whether we could start drawing by hand, connecting to our humanity and starting to understand data in different ways. Giorgia Lupi, who is one of those activists, has started to refer to this as data humanism, and I love that idea: that alongside developing high-level data science capacities that work with the best of technologies, we can also start to forge better capacities in our human brains and in our human hearts, think about what it means to be human, celebrate that humanity, and start to build better AI partnerships, human-machine partnerships, that allow us to be more delightfully human.

So for me, the key things I would love us as a community to aspire to are these five statements. Look at ways to make personal data legible. Think about the different ways to build systems that are far more inclusive, and, if they're not inclusive, make sure we have mechanisms for identifying ways to make them more inclusive. Can we create better mechanisms for feedback, not just in the design but in the implementation, the deployment, the analysis and the evaluation?
Can we find ways to make sure we are holding the stewards of data accountable? And alongside that, we must be nurturing better-informed data publics, not just ourselves but the general community.

So, to close, these are some provocations I wanted to put out before you start the game, before you start having conversations amongst yourselves and hearing the rest of the really amazing talks that are going to be part of the day. I can't thank our participants enough for this, and I thank the Black Box Brown Bag community for starting to engage so passionately in these ideas. This is just the start of what is hopefully a long set of engagements, not just today but in the weeks and months ahead, and I think there'll be enough here for a few years for us to really work with, and to keep everybody, including ourselves, accountable.

Can we avoid data determinism, to start to make more space for data humanism?

Again, it's not an either/or; in Boolean terms, it's about combining with AND. Can we become more mindful of the assumptions we're making when we choose particular data sets, or particular subsets of that data, and do so at the expense of others? We always have to make choices; abstraction is about making choices. It's about becoming more mindful of the consequences, or potential consequences, of those choices. And can we start to build opportunities for engaging in ongoing dialogue about this, and not think that the design process is consultation, development, deployment, end of? We have a responsibility for keeping communities in the conversation on an ongoing basis, and for thinking about the different ways we can bring the providers of that data more actively into the mix.

So thank you very much for your attention and your engagement with these ideas, and for getting here so early. I guess that gives us some time for questions and conversation before we transition to the game. Thank you for listening.

[Audience question about data literacy in schools]

Thank you for that question. I think, in the same way that a lot of digital capacity and digital literacies are already part of primary school, data literacies belong there as well. I've talked with school librarians, and I've been giving talks in schools, about the different ways we can be playful in the activities that are set up to engage young people: in understanding, first off, what it is they are doing, what their data and digital practices are, and what the potential consequences are, and in the process empowering and educating them to become more aware of the different ways that data is used in research. You don't have to have a PhD to start to understand that. We did some work around mobile phone implementations in the New South Wales school system, and part of that process involved building in time to educate our participants, our young learners, in what it meant to interrogate the data we were collecting about them. It's really exciting when you watch an eleven-year-old go from "hey, any time you want to give me Coke and pizza, I'm here" to "let's talk about what it is you're doing with this data and what could happen." The challenge is that it takes time, and it means that something else has to not be given that time. Usually the difficulty in schools, when I've had those conversations with librarians and teachers, is: okay, I still have a curriculum to finish, how can I embed this? But I think there is such an enthusiastic group to work with, and if there are some people you can think of that you want to involve in taking some of these ideas further, I'd love us to talk about that. That'd be great, thank you.

[A comment begins from the floor, off-mic] Oh, sorry, yes, I should come over here with the mic; I thought we were going to have a roaming mic. Let me just pass it, because that is a good point — oh, there's a mic coming down. This is where I feel like Oprah Winfrey: look under your desk, everyone gets a Fitbit! See, look how the technology shapes what's happening; it's too smart for us.

[Audience member] I am personally really struck at how in academia and schools — the remark was made "because the curriculum must be done" — we've built a very data-driven and machine-based structure in which we work, which, although it was touted, in my personal memory, in the 90s, to be making administration a lot easier, appears in fact to have changed the way a lot of people are working and thinking, in unintended ways I'm sure, which leave them no time to think and feeling very contained about what they can do to change anything, even to teach well. And politically it seems to be leaving us in a society where no one feels the need to take responsibility for anything anymore, in lots of ways. I'm feeling Canberra at the moment; it's not just the public media or the social media. In my lifetime, all of those politicians would have needed to resign. It's absolutely scandalous, taking so little responsibility; it's like a mechanism they've lost, that they don't control anymore.

[Theresa Anderson] Thank you for the question, and I appreciate it; I really wanted you to be able to have a voice in this, so I want to practise what I preach and not try to sum that up inaccurately. My response would be this: the fuller essay that David Levy wrote, the one I was referring to where he talks about no time to think, is part of an argument he makes about contemplative scholarship as a way to first train ourselves as academics. And a lot of the work I do as a creative practitioner is about finding ways, for myself but also for the people who experience an installation that I produce, to become more aware of opportunities for pauses and for green spaces of the mind.

Because I would say we do still have an opportunity to subvert, but we can often lose sight of the fact that we have that power, and that's why I feel there's a data advocacy role. It is really heart-wrenching to hear someone say, oh, I can't change it, I can't change the systems. It's one reason that in our data science program we've talked about changing our own language: to talk not about being data-driven but data-informed, so that we ourselves don't start to give up. I mean, it's lazy to hand everything over to the machine. It takes a lot more work to think about what the machine says and then also make sure we are harnessing our own cognitive capacities sufficiently, and that takes training. That's where I think it's an important choice to make in schools, in education, but in our research practices as well: to provide those opportunities for slow thinking.

It's been an argument decades in the making for me. My PhD comes out of information retrieval, and you can imagine how weird it looked when I would talk to information retrieval people, or the first time I spoke to software engineers and they went, "seriously?" But there's a wealth of research and evidence that I will now stand here and talk about, and we can have a long conversation about that; I can point you to things. I think there is an opportunity, but we have to act now, because it is increasingly a situation where — you know, I love the Little Britain phrase, "the computer says no" — unless you can spit back and go, well, actually, it's saying no because of this, and it could say yes if you did X. That again takes training, awareness, and really a commitment to activism: to hold people accountable, to hold institutions accountable, for us to recognise that we have an opportunity in a democratic system to have a voice. It does mean putting your head above the parapet, and there is the pressure of time, and it can feel like a juggernaut. But I look at this room, and at the fact that every one of the conversations we tried to set up early in the year drew people in from across different disciplines and different backgrounds. We just have to keep one another together and support one another. You know: workers of the world unite, data ethicists of the world unite. So thank you, and that's my cue to pass the mic.

[Facilitator] I'm going to have to hit the pause button on Theresa at that point. Can we thank her once more? Thank you.

2018-03-19
