Artificial Borders: The Digital and Extraterritorial Protection of Fortress Europe


So, welcome to everyone who is joining us. I'm going to wait a few seconds before we formally start, as people are entering the virtual room and sitting down, maybe having a cup of coffee like we are, and then we'll get started on this conversation about EU digital borders with our guest, Petra Molnar. I'm seeing the room filling up quite quickly, so I think we should start. We have an hour and I want to make the most of it. So again, a warm welcome to everyone tuning in for what is already the eighth conversation in our Transformer States series on digital government and human rights. Before we begin, I should say that this conversation is being recorded and will be made available on our center's YouTube channel and our webpage in about a week's time. Please also keep in mind that there will be a Q&A with our guest, Petra Molnar, in the last 15 minutes of this conversation, so as soon as you have a question during the conversation, please enter it in the Q&A at the bottom of your screen, and we'll try to select some of those questions at the end to put to Petra. For each conversation in our series, my colleagues and I at the Digital Welfare State and Human Rights Project interview a human rights practitioner, academic, or expert on a specific case study of digital government and its implications for human rights, and that conversation is always about an hour long. As those of you who joined us earlier will know, one of our aims with this conversation series is to introduce a wider audience of human rights students, academics, and practitioners, as well as any other interested audiences, to the implications of digital government for the field of human rights. That also brings me to a second objective behind this Transformer States series, which is to create what we call a community of practice of people interested in the
advent of the digital state and its consequences for human rights, and for human beings more broadly speaking. Through these conversations, but also through regular blogs written by students, academics, and activists working on these issues and published on our webpage, we hope to serve as a hub for interesting discussion and exchange on the rise of the digital state and human rights. Today we will talk about the digital and increasingly extraterritorial border protection of the European Union with our guest, the inspiring Petra Molnar. In recent years, the European Union has been rapidly introducing digital technologies into its border management processes, especially as it attempts to monitor and control movement before people on the move reach its physical borders. Those technologies are becoming increasingly central to the EU's efforts in the migration area, and this includes, for example, the use of drone surveillance to facilitate interceptions and pushbacks of boats, and biometric scanning in partner countries to identify certain individuals and prevent them from traveling in the first place. Those are just two of many examples. Today's conversation will examine the human rights implications of the EU's use of AI and other new technologies to shift its border control operations ever further away from its physical territory, screening for risk and seeking to protect through prediction. What we will focus on especially is the importance of shifting our gaze away from technocratic discussions in Brussels, among policymakers and consultancies, about AI, towards the lived experiences of human beings on the move who are confronted by these new digital tools. This gap between technocratic discussions on the one hand and human realities on the other is quite significant in the EU, we find. To give one brief example at the outset: I was reading a recent study by RAND Europe, commissioned by Frontex, the European Border and Coast Guard Agency, on what AI might mean
for this agency. Just to quote a very frank assessment by RAND in that study: "On average, AI-based capabilities are expected to have the greatest impact on the speed and efficiency with which border security functions can be carried out. As such, there is more confidence in the positive contribution of AI to make border security functions more efficient by saving financial and human resources, rather than in the ability of AI to qualitatively improve the results of processes underpinning such functions (e.g. in their accuracy)." Now, I think it's important to remind ourselves that there are actual human beings who will experience the consequences of trade-offs like the ones presented by RAND right here, and that is the goal of today's conversation. So I'll end there after that quick summary. Let me now introduce you to my colleague Ngozi Nwanta. Ngozi is a doctoral student at New York University School of Law, and she is also an affiliate with our Digital Welfare State and Human Rights Project at the Center for Human Rights and Global Justice. Her doctoral research focuses on development finance and the role of law in governing identities and infrastructures, with a particular focus on digital technologies and digital identification projects. Ngozi, over to you. Thank you very much, Christiaan, thank you for the great introduction. It is my honor to introduce our panelist, our speaker for today's webinar, Petra Molnar. Petra is a human rights lawyer and a researcher specialized in migration, technology, and human rights. She is the Associate Director of the Refugee Law Lab, an organization that undertakes research and advocacy related to new legal technologies and their impact on refugees and other displaced communities. Last year, Petra authored a report about the role of digital technologies in the European Union's migration control policies, which is the subject of our conversation today,
called Technological Testing Grounds, and Victoria just shared a link to the report below. Now, today's conversation is all about the EU's introduction of digital technologies into its border control operations, as Christiaan just so well introduced. But all too often, the conversation about digital technologies can serve to steer discussion away from the underlying policy decisions and ideologies at hand by focusing instead only on the technologies themselves. To avoid this kind of technicized, depoliticized approach, we want to begin today's conversation by bringing out the politics of migration in the EU. So Petra, can you briefly tell us about the EU's approach to migration? What is the context, what is the historical background, within which technologies are being brought into the EU's border control policies? In other words, what is the starting point here? Thank you so much, Ngozi, and thank you, Christiaan and Victoria, for inviting me today. I've really been looking forward to this conversation, and I hope that it will be a great way to unpack some of these complicated threads that are woven throughout the way that the EU, but really the world, is dealing with migration these days. I think it's a great starting point, like you said, to try and get at some of the underlying reasons why we are seeing some of these more technocratic or techno-solutionist ways that Europe and other jurisdictions are moving towards migration management, because none of this really is new. In my work I look at surveillance and automation, but I'm trying to track it from a perspective of how power operates in society. Here in the EU in particular, over the last number of years, we've been seeing a turn towards politics of exclusion and deterrence, a sharpening of borders, and you can really see that manifesting in a variety of ways: for example, last year's New Pact on Migration, which makes reference to really interesting ways that the
region is dealing with migration: new databases and technologies at and around the border, a Europeanized returns process. This was all followed up by designating Turkey as a safe country for people who are seeking asylum, which really undermines and changes the way that people are moving into European territory. But it also has some really violent manifestations, and Christiaan already made a reference to this: we've been seeing an increase in violent pushbacks, detentions, and deportations, and some of these pushbacks have led to deaths, particularly in the Aegean and the Mediterranean seas. But really, the starting point has to be a broader and more nuanced conversation about who belongs and who is demarcated as the outsider, and in particular how powerful entities like states, bolstered by private-sector influence, are able to play up these differences and use increasingly more technologies at and around the border to sharpen this divide. Wow. So against this background, how and where are various digital technologies being brought into these border control efforts? Could you give us examples of the kinds of technologies that have been piloted and deployed and are becoming increasingly central to this agenda? Sure. For those of you who are aware of my work, I sometimes use the perspective of trying to map out how these technologies are playing out on a person as they're moving through their migration journey. I know it's a little bit artificial, of course, because the border as a concept is shifting and changing, and sometimes we need to unpick these concepts in a really critical way, and I know we will get into this a little bit later in the discussion. But for me it's actually helpful to try and demarcate how this is playing out on a person's life. There's a lot that actually happens before a person even crosses a border. They might be in a humanitarian emergency and accessing services in a
refugee camp, and find themselves interacting with biometric technologies such as iris scans or fingerprinting. There's a lot of population prediction modeling that is now happening about people moving from conflict zones and the ways that they might be planning out their journeys. Then, in the border space in the more traditional sense, let's say, we're seeing an increase in surveillance-type technologies: unpiloted drones, aerostat machines, all sorts of predictive modeling that is also happening in that space, and also some really fascinating and troubling experiments, such as AI-type lie detectors that have made the news in recent years and are being piloted in a variety of jurisdictions, and also automated decision-making that impacts a person once they enter into a country and interact with the immigration and refugee determination process. And on the back end, too, we've been trying to track technologies such as voice printing that is used in asylum applications in Germany, and the different ways that communities on the move are tracked and surveilled and made knowable to the state entity as it's trying to determine how public administration is going to proceed when it comes to dealing with communities on the move. But really, in my work and in the report that we put out last year, the idea is to show this kind of tapestry, this global panopticon, the ways that this is all connected, because it's easy to separate it in terms of use cases, but really what we're talking about is an increasing use of technologies at every point in a person's journey as they're on the move. Okay. So in what ways does the EU's reliance on digital technologies change pre-existing policies and politics of migration? Is there anything different between what these technologies are offering, or what these technologies are used to achieve, as against what was in place prior to the coming into effect of these
technologies? So what is new here? Could you, for example, speak to how these technologies are enabling a shifting of the border far outwards, away from the physical territories of the EU and towards faraway geographical spaces? Yeah, this is a really good question, I think, and it also picks up on what Christiaan used to introduce this discussion: this kind of obsession with making certain administrative decisions more speedy or efficient. This is a motivating logic that we've been seeing in a lot of immigration decision-making already, and it has really just become more of an appetite when it comes to bringing in new technologies at and around the border. So, in order to answer this question, I was really trying to reflect on what was, and what is, and what will be. In a way, we are talking about a regime of decision-making in immigration and refugee law that has always been opaque and always been discretionary. It has always been about violent border logics and exclusions of certain communities, particularly along ethnic and racial lines, when we're talking about the way that people are able to exercise their internationally protected right to asylum. None of this is new; we've been dealing with this for years and years and years. What is new is that now powerful actors, such as states and, again, the private-sector industry that has sprung up around this, this border-industrial complex as some have called it, are able to capitalize on certain perspectives that matter in terms of pushing these discussions forward. So the starting point isn't "what are the human rights implications of this technology?" Rather, for the private sector it's "how much money can be made in big border tech?", and for the state, increasingly, particularly in the European Union, "how can we fortify Fortress Europe and prevent people from being able to enter in the first place?" And I think this is also where this shifting of the border comes in, and this makes me think of a
great book by Dr. Ayelet Shachar called The Shifting Border, which really challenges us to think about the concept of the border as something that is not static. It's something that's changing and shifting and becomes less and less about a physical experience: it can be our phones, it can be our computers, it can be the way that we move through the world. It also then plays into policies of what some have called border externalization, or taking the border from its physical location and pushing it further and further afield. The European Union in particular is very good at this, and they've been doing it for years; this isn't anything super new. It's just that its sharpest manifestation is now playing out, where you have states, for example, that are subcontracted to do the dirty work of the EU further and further afield, to try and prevent people from even making it to European territory in the first place. And sometimes it's really hard to imagine what some of these frontier zones can look like. For those of you who are familiar with my work, most recently I've been based in Greece, which is in many ways at the frontier of Europe's migration policy, and while I'm talking I wanted to share some photos with you, if you don't mind clicking through some of them, Victoria. Perhaps just a bit of background: I work really closely with a fantastic photographer and filmmaker by the name of Kenya-Jade Pinto. Most of these photos are hers, although the blurry ones, like this one, are mine; you'll be able to tell the difference. What I try and do in my work is really document the securitization and the hostility that surround the border space, particularly in Europe, but we're also moving to look at other geographic contexts, such as the US-Mexico border, the ways that biometrics are playing out in East Africa, for example, and in other MENA regions as well. And what we try and do in our work is really to capture this stark divide between these broad policies of
exclusion, externalization, and shifting borders, and to really highlight how this is actually playing out, and also the discrepancies that are present in and around the border space. We can get into this a little bit later if you are interested in hearing more about the details of what I'm seeing on the ground, but often we're dealing with a really incongruous situation that makes me think of the pinnacle of cognitive dissonance. We're talking about the EU spending vast amounts of money on new technologies, such as drones and algorithmic motion detection that is making its way into Greece, and yet oftentimes these camps, before they're opened, don't have running water. So there's an interesting tension here. We're talking about a humanitarian emergency such as this one, for example: this is, from the summer, the old Samos camp, before people were forcibly transferred into a new one, which you will see in a minute. But again, it really goes to show the priorities of the region. We're essentially building glorified prisons, or prison-like camps, surrounded by barbed wire; you will see cameras and different types of technologies playing out in the way that people interact with their new surroundings. It really just goes to show that there is this incongruity: the European space has certain priorities that set the stage, again underpinned by an increasingly voracious private sector that is making inroads in terms of how these decisions are playing out. But it's not a human approach, is what I'm trying to say; the human experience has really been left by the wayside. Thanks so much for showing us those pictures as well, and just to say that Kenya-Jade Pinto, who took a lot of those pictures, has done a great job, because it's really impressive how you've been able to visualize what it's like when you go to the actual spaces in which people on the move are being "received", maybe
that's too polite a word. As we discussed earlier, that very much shows the contrast that we were talking about between RAND Europe or Deloitte writing a very clean study for people in Brussels, versus showing pictures of what it's actually like to be in a place like that in Greece. I very much liked, by the way, the "EU, where are you?" picture, and sometimes, knowing from my own fieldwork, in retrospect that sort of evidence almost seems planted, because you think: this is too appropriate, almost. But it goes to the core of what you've been working on. EU, where are you? Maybe you're in Brussels discussing those AI strategies, while claiming that here you are making sure that people's rights are protected. And just a final thing to say about that, by the way: what also strikes me is the contrast between cleanliness and messiness, if that's the right way of putting it. A study produced in Brussels often looks very clean; the people in hazmat suits look very clean and very anxious about not getting infected with any form of virus or germ; and then there are the realities that you just showed as well, which are often much more messy. In a way, this seems to me to be about a project trying to make everything very orderly and clean, whereas life is obviously not orderly and clean in that way, and I think that goes to the theme of our conversation today. I just wanted to ask you a follow-up question, Petra, and that is to take us into how you arrived at doing this research. What's the story there? And not just the personal story of how you arrived at doing this type of research, but also, if you could tell us a little bit more (the photos already go into this to some extent), what kind of methodologies did you use? What are your starting principles, and to what extent have you been very deliberate
about using certain methodologies and certain principles to do this kind of work? Yeah, thank you for that question. It's one that I don't often get asked, but it's something that I really think about daily, because I think it's really important to query our own position when it comes to the way that we arrive at certain projects and certain ways of working and living and moving through the world. And I really like that dichotomy that you just mentioned between the messiness and the cleanliness: the cleanliness and sterility of technology versus the messy human reality of life. I come very much from that side of things. I used to be a migrant justice organizer before I became a lawyer; I never planned to be a lawyer, but that's a long story. I eventually made my way into law school and then thought I would do, and I hesitate to put it this way, more "traditional" refugee-law-type things. I worked a lot on immigration detention and violence against women issues, and then, quite randomly, honestly, stumbled into this intersection between digital rights and human rights and the way it's playing out in migration. A colleague of mine and I did a report in Canada called Bots at the Gate, which was looking at automated decision-making in Canadian visa applications, and it really blew my mind. I was really only able to do the immigration and refugee analysis, because I didn't come from a digital rights or technical background. Now it's been a few years, and I've definitely been able to build up that side of the analysis, but by no means do I ever pretend to say that I'm a technologist, or even really a digital rights person; I'm a migration person through and through. And as someone who has also had a migration journey of my own, and someone who wears multiple hats in multiple ways as I move through the world, it also, I think, allows me the ability to query certain things from interdisciplinary and
a multi-methodology perspective. So it's been a bit of a messy journey, but as someone who's trying to understand, again, the threads, the tapestry of how this is all playing out, in terms of the actors that are involved, the types of decisions that are made, and the way this is playing out in people's lives, I actually think this messiness is really, really helpful. For me, also, I work from a really slow, deep methodological perspective. I'm also trained in qualitative research methods as an anthropologist, and so when I move through the world, I try not to take up too much space when I am listening to people and witnessing and holding space for their experiences, particularly when we're talking about really violent experiences, or even technology-induced trauma, for example. I spend a lot of time listening and sitting and being quiet and being present. There's a reason why I'm in Greece, and why I'm actually physically where I need to be, to be able to foster the kind of deep relationships that I think are necessary for the way that this has been playing out. I think, in ways, it's been fascinating, because in my mind it allows for really rich, nuanced, deep, and complex conversation that is very messy, but is, again, perhaps a little bit closer to what human reality is actually like. But you're absolutely right: it doesn't really square with conversations that are being had in different sectors. And it's been fascinating for me, since the release of our report that you so kindly shared with the audience, that we received a little bit of pushback, because some people in the policy circles in Brussels, for example, don't consider it research worthy of engaging with. And to that I always say: we can disagree on analysis, of course, but it's a bit rich to say that the kind of first-person accounts that I'm privileged to be able to hear on a daily basis are not expert knowledge enough to be then
introduced into conversations in these policymaker spaces. Because if that's not what we're really talking about, what's the point of even having these conversations? If we're not actively holding space for people's lived experience, then really it's just a feedback loop that we are engaged in. And unfortunately, this is what I have been seeing in the digital rights space writ large: there is a lot of exclusion, a lot of gatekeeping, and a real lack of nuanced analysis when it comes to intersectionality, systemic racism, and the way that, again, technology plays out in real people's lives. We have a lot of work to do. Perhaps, to follow up on that: it's clear that you're giving a critique here of the field that you've entered, and it's interesting that you described your background as being in law, activism, and anthropology, and now you've written a report which has "technology" in its title. So to what extent do you think you're an intruder in a space that's occupied by other experts? And what would be a response, then, to the existing research that has been done on the use of digital technologies in migration, or the digital rights field more broadly speaking, coming from quite a different background than most people in that space? Yeah, it's been really fascinating, and I think a longer-term project would be to actually map out how the digital rights space and the migrant justice space have been interacting with one another, kind of like a meta-project; I'm going to make a note of that for maybe when I have time in the future. But it's been interesting. Definitely, in many ways, years ago there was, I wouldn't say pushback, but just not an interest in certain spaces in having these kinds of conversations, and that goes both ways. Coming at it from the migration side of things, a lot of my colleagues, myself included, we didn't
really engage with these issues, because frankly, when the people you're working with are facing deportation tomorrow, you're not necessarily thinking about AI or drones or automated decision-making. And conversely, if you're in the digital rights space and you talk a lot about privacy and data protection and all of that, it's hard to see how that can then play out in different ways, whether we're talking about immigration and refugee camps, or welfare applications, or the criminal justice system. I think there has been this kind of divide between the two communities. But I will say, what's been really wonderful is that over the last few years there are more and more bridges being built between different perspectives, different communities, and different ways of knowing and sharing and learning from one another. That's been really fascinating to see, and I hope that trend continues in the future. And as I understand it, you're working yourself on bridging those gaps and trying to change the narratives and change the field in and of itself; you are working on a couple of projects that I think are relevant in that regard. Maybe you can tell us a little bit more? Trying to, trying to; we don't always get it right either, but it's a learning process. With a few colleagues over the last year, we've been trying to spin out this project that we're calling the Migration and Technology Monitor, which we're hoping will be a way to counterbalance some of these problematic approaches and the silencing that has been happening in this space. It's an interdisciplinary collaboration between journalists and filmmakers; we also have the UN Special Rapporteur on racism and discrimination working with us. It's really a hub for work that is being done in these spaces, but always thinking about how to center lived experience, and also working from a participatory action research perspective. So
instead, once again, we are not replicating this very Westernized way of producing knowledge that's very extractive, that is, creating another entity or an institution that just goes in, extracts knowledge from someone, and writes a report about it. And we are all guilty of it; as someone who's half academic, half practitioner, and someone who writes a lot for a living, there is an element of extraction in what I do as well, and I am aware of it. But what we're trying to do is find a balance, find a way to create a community where we can learn from each other, and also be able to shift some of the funding that is around to communities themselves, to researchers from within communities with lived experience of migration and of these high-risk technological applications that we're seeing, so that people can do the work themselves and we are not the ones going in and parachuting and doing that. It's not about giving voice to anybody. It always makes me think of this quote by Arundhati Roy that says there's really no such thing as the voiceless; there are only the deliberately silenced, or the preferably unheard. I've been reflecting a lot on that in the last few weeks, and it's really something that I think is integral to the way that we're looking at trying to work in this space. It's really interesting to hear about the forthcoming book, and I can't wait to sit down with a cup of coffee to really dissect every part of the book and internalize it. So, following on from the question that Christiaan just asked you, on how you're filling in the silences and gaps in the digital rights field: do you ever maybe see yourself as an outsider? Do you ever think that your approach to analyzing the issues within the space is seen as illegitimate, or that the knowledge you relay is not valuable enough, maybe, when compared to more tech-focused pieces? What has your response been to this, and
how do you perceive the added value of your research? Yeah, thanks. This is a really important thing to reflect on, particularly now that I've been in this space a little bit, and it's been fascinating to try and map out the ways that certain conversations and critiques are happening. Even a few years ago, a lot of the policy-level conversations in particular were centered around ethics: ethics of AI, ethics of technologies, and all of that. But the issue with ethics is that, very important as they are, they're non-binding, right? It's a normative idea that gets us thinking a lot, and it's super important, but if you're not also thinking about the way that these more high-risk applications are playing out in an area that is highly unregulated, that's where conversations around human rights and human rights impacts can be quite useful. Perhaps also as a caveat, though, and those of you who know my work will know this: I'm a very reluctant lawyer in many ways. I think the law as a project can inherently itself be very violent and perpetuate all sorts of problematic power dynamics, so I even personally struggle with identifying as a lawyer. But it can also be a very useful tool: a tool to be taken more seriously, which is something we all need to think about in different spaces and in different ways, but also a scaffolding, a scaffolding on which to pin rights and responsibilities, particularly when we're talking about an area that is highly unregulated and already very opaque and discretionary. Because, again, the context matters here. We're talking about immigration decision-making; we're talking about refugee law, in which there is inherently a lot of opacity written into the way that the law is allowed to operate and be practiced, and then the way that jurisprudence develops it. There's a lot of gray zone when it comes to the decision-making. So then, when you superimpose technological decision
making on it it creates a very very messy and a very very high risk kind of area and when you don't have certain laws and governance mechanisms in place people can kind of fall through the cracks it's a fascinating space too though right because now the european union has its proposed regulation on artificial intelligence that's been tabled a few months ago now it's going through the process it's going to take a while before it's actually enforced but it's interesting to see how a piece like that is trying to deal with you know high-risk human rights impacts of technologies including the ones at the border but again i think it's interesting when you look at something like the ai act um what are the silences there which voices are represented which experiences are the ones that are pushing the agenda and how is something even considered a risk uh and who gets to determine what is high risk there's there's a lot of politics in that conversation too absolutely uh just um for our audience i see that the questions are are coming in but by all means keep asking them in the in the q a and we'll get to them soon it's great to see so much enthusiasm in the questions that's a reflection on uh on the interest in this topic and also in petra and your amazing amazing work um i wanted to move on to the next part of the conversation talking a bit about experimentation but before we get there just a brief uh question on methodology going back to the photos that you just showed i read in your report that you made a very deliberate choice not to show the faces of the people that you've been speaking to people on the move either in brussels or in greece that you spoke to and that's part of a wider policy of the organization that you're based at i was wondering if you could talk briefly about sort of on the one hand trying to make visible the human lived experiences and juxtaposing them with a very dry technological
technocratic approach in brussels and other centers of power but then not being able to show the people that you're talking to how do you deal with that dilemma between wanting to give meaning to their stories and showing their stories in in in a variety of ways and then at the same time trying to protect the people um that you're uh that you're working with yeah absolutely and it's and it's a hard balance you know and for me it's been really uh so instructive to work closely with a photographer that really thinks critically about visual representation in these spaces that are so loaded with power differentials um because i think a lot of it has to do with the choices that are made in terms of how you tell a story and and the stories that you're kind of entrusted with in this space and it's really complicated because for sure i think images are such a powerful way to try and illuminate a lot of the stuff that i'm talking about that is invisible or really really hard to see and and hard to also conceptualize when we're talking about you know algorithmic decision making and you know biases and data sets or even something like a drone they're hard to see they're kind of there we know about them but it's hard to pin them down and especially as you're trying to explore the kind of human side of it how would you do that ethically and and carefully and critically um i mean my colleague kenya-jade pinto i would definitely urge you to check out her work um uh and i know there are others who are thinking this through as well but a lot of it is also done as a reaction to the ways that particularly in you know forced migration and kind of humanitarian settings images particularly images of black and brown and racialized bodies are used in really specific ways to further specific agendas and kind of dehumanize people in in really troubling ways that play up um these kind of tropes of uh you know lack of agency what counts as a proper refugee you see this everywhere you know the u.n is
guilty of it states are guilty of it academics are guilty of it as well there's all sorts of visual choices that are made that are incredibly extractive and really really lack kind of a critical engagement and really this goes back to the kind of central kernel of the way i try and work and that's always to try and figure out how does the person that is sharing their space with me how do they feel about being represented in this you know and if they're for example not a partner in our work and you know not the one kind of driving this and they are really kindly sharing um their story with me you know it's not up to me to then use their photo in a way to further my own career to further the kind of messaging that we're putting out there even if you know it's done for the the kind of greater good so to speak you know to highlight the harms it's not up to me to decide that someone's face or someone's experience like that is is there i have certain people that i work with now for a while who gave me permission to use quotes for example or to you know show certain images that they feel comfortable with but i wouldn't ever take a picture of somebody and then use it on my twitter feed or use it in the report at all you will notice for example that there are no faces at all in the project because at that point we also didn't want to put anyone in danger because of course people are still very much impacted and experiencing what we are writing about and trying to to get out there absolutely thanks so much for the great explanation of the choices that you made there in terms of representation because i think those are questions that are not just relevant for your work or reports on migration and tech but much more broadly speaking for the human rights field for journalism so i think that's very relevant for uh for our audience i just wanted to get to basically the title of your report and the central theme of your report technological testing grounds there's a
deliberate use i i imagine of the word testing in the in the title and just to say that we've had other conversations in this series where uh that element of testing also came up very clearly we talked about the australian cashless debit card for instance where that had been piloted tested if you will on for instance indigenous communities in australia there are many more examples that i could get from our work on digital government and human rights where you see that new tech is often tested on communities that are less powerful first and also think about the work of virginia eubanks for instance who makes that point repeatedly in her book automating inequality about uh using tech in social services in the u.s now in migration in the eu and the use of tech how do you see the role of experimentation i mean it strikes me that you often come across terms like pilots test beds sandboxes and other sorts of horrible horrible jargon so do you think that that is accidental or do you think that those terminologies and what's behind that that shows up deliberately in these policy documents and what's the purpose behind this testing rhetoric and testing environments yeah at least in our experience it's definitely quite deliberate and again i think this goes back to the context that we were talking about throughout this talk really the fact that we're talking about immigration decision making and refugee decision making a space that's historically extremely opaque really difficult to parse out and it's just really ripe for this type of experimentation and again this is just one manifestation immigration and migration policymaking has often been thought of as kind of an experiment policy wise law wise and now through technology and i think the reason why this is allowed to happen again is it goes to show that there is a lack of kind of a governance mechanism when we're talking about different states uh for example putting forward pilot projects that are funded through different
entities and sometimes partnering with academic institutions like the horizon 2020 program in the eu we're also talking about kind of extra territoriality here where you might have international organizations that are involved that are of course you know regulated in a very different way if at all they don't have to comply with certain regional uh directives or governance mechanisms and then often you have the private sector heavily involved in all of this both in terms of setting the agenda but then also obfuscating responsibility especially when things go wrong because when you're trying to compare the way that responsibility accrues between a public entity and a private entity it's very different and this is where kind of the individual or the community that's impacted can kind of fall through the cracks in the middle because the public entity can say well you know it's really the responsibility of the private sector that's developing these um technologies and the private sector will say well it's really not our problem either because we're just you know we were just contracted to do this by the state and then the individual rights holder or if you're trying to push the law forward and argue for you know community-based rights the community has virtually very little uh stake in the game when it comes to kind of again trying to pin themselves to the scaffolding of of our understanding of human rights i think it's it's an interesting i think space to try and parse out and this experimentation really really comes into play in really um disturbing ways you know the the ai lie detector for example that i mentioned at the beginning from iborderctrl and there's another one called avatar um i find the names also really fascinating but perhaps that's a different talk um you know they're projects that are pilots that are tested that are kind of played around with and sometimes they're not actually rolled out um on you know populations kind of in real
time but that's not really the point is it the point is who gets to experiment on whom and who gets to say no we are not okay with this like you said christian it's always along the lines of power and privilege whether we're looking at you know the welfare state or the criminal justice sector and the way that technology is playing out there or in my case borders and immigration there is a huge power differential between those who develop and deploy the technology and the communities that are at the sharp edges of this technology thank you very much petra um so talking about experiments and this notion that so much of these digital technologies are being deployed say far away from the border um or in places that one might call low right spaces how can there be meaningful accountability on the part of eu and its member states for human rights violation where migration control increasingly happens before the border but with the aid of digital technologies how can um accountability efforts be approached in these spaces yeah this is a question that keeps me up at night all the time and i don't have the answers um i think it has to be kind of a multilateral way that we think about creating accountability and governance and regulation it has to be localized but it also has to be regionalized and internationalized because we're really right now we're talking about largely the eu but this is a global phenomenon but it becomes problematic and tricky when you try and regulate and create some sort of system of accountability i mean i think this is where the human rights framework can be quite useful particularly when we're looking at traditional ways that we can get policy makers or the courts to try and wrap their minds around how this is playing out on real people but i think there's also a lot further that we can go and we can learn from different traditions of trying to understand how you know rights and responsibilities accrue on communities for example is there a 
collective right to privacy that a community can have if not why not why is the law so rigid when it comes to kind of being focused around an individual rights holder and i think also there's a lot of uh more work that needs to be done on more kind of critical perspectives around uh surveillance technology and also the administration of the kind of public state i think for me at least what's really instructive are the more abolitionist type perspectives and work of for example harsha walia on kind of the violent border logics that are inherent in bordering or creating borders and the way that the nation state is able to practice its kind of sovereign prerogative to control its its borders i mean that's all tied to the way that the technology is playing out on people's lives and the fact is in my work i argue the lack of governance and regulation is actually deliberate it it's good for the state and it's good for the private sector actors who make a lot of money from it to not regulate the space because then they're able to push pilot projects forward you know kind of increase innovation without having to be accountable to anybody thanks petra i um realized we're already in the last 15 minutes of our conversation time has really uh flown by uh so i wanted to get uh to the q a because there have been some amazing questions uh in the q a and in the in the chat i'll try to group some together because otherwise there would be too many for you to answer so um first there are a couple of questions about how to get involved basically there's kyla yoon who asks how can law students and future lawyers stay informed and get involved in resisting widespread use of such technologies are there roles that lawyers can play in particular as lawyers or at least as global citizens there's an anonymous attendee who basically asks a similar question do you have recommendations for how law students can get involved in similar work what sort of experience training is valuable for
us to pursue to work at the intersection of tech and immigration and then there's a perhaps a slightly different question but on the same theme by boeing kwon i'm wondering if the civil society ngos within the eu are considering any strategies together to deal with the digital border issue are there any kind of united groups or meetings surrounding digital borders so these are all questions about how to how to become active which are great questions great and i'm writing them all down so hopefully i don't miss anything um thank you all for engaging with my ideas and i'll try and get to all your questions if we can yeah i mean in terms of getting involved um particularly for for law students i think it's it's great to take as many courses that are kind of at the intersection of tech and law as possible but also then to delve into these more critical perspectives of you know abolitionism or whatever you want to call it um take some sociology classes take an anthropology class if you can um you know i went to law school a little later and i had my qualitative research training already in hand and as i said i'm a very reluctant lawyer anyways but i was able to draw on that i think it's important to pull from different perspectives and different kind of ways of looking at these issues because as i hope to kind of share with you today it's a really complicated area and we need all different disciplines also people who understand the tech from the more technological side to then engage with the human rights side as well but i think a lot of it has to do with kind of community building and learning from one another whether it comes from you know lawyers learning from civil society civil society sharing what they're seeing on the ground academics sitting there and witnessing and not bulldozing too much space which we sometimes see in this space quite often um and there's some really really i think promising initiatives um that are kind of out there that
are really springing up now and i would encourage you to get in touch with us at the migration technology monitor um our colleagues at amnesty international are also putting together a really great network of people working on this um there's a few people at edri the european digital rights for example that have been working on these issues i did a research fellowship with them a few years ago and we've been spinning out a few things there so i think it's starting i think we are managing to find each other now in in ways that we were not able to before i think one thing too though it's a resource issue particularly in the migration justice space um because again often you're talking to civil society or lawyers or you know social workers who are so overworked and underpaid that digital rights are kind of pushed to the side but now i think more and more people are realizing that this really is the next frontier of these issues when it comes to kind of migrant justice organizing and i think it's incumbent upon those of us who have more resources to enable this work to come up organically and from the grassroots from within the migrant justice organizations themselves thanks so much uh petra i'm gonna group together two questions now which are a bit different but i'm gonna do that for efficiency reasons uh i could use some ai here so first of all gavin sullivan asks a great question could you say more on how we connect the human-centered approach that you mentioned and human rights aspires to with algorithmic technologies that effect violence on human bodies by making humans governable through opaque risk profiles inferences and indicators etc that are opaque and hard for researchers lawyers activists to access i.e how to connect the internal and
external dimensions of algorithmic violence and how does the security securitization sorry of digital bordering and increasing privatization of the borders make critiquing these developments and experiments more complex and difficult so that's already a big question to uh to mull over but i just wanted to add a question of a different kind i have to go to the chat for that by um andrea quijada who asks i'm curious if you happen to see similarities between what is happening in the eu and what has been happening in the u.s i don't know if you look at all at the u.s but so much of what you are saying resonates deeply with how the us border has ideologically and digitally shifted into mexico to prevent migrants from moving from central america into mexico so perhaps you can reply to those two questions and then hopefully we'll have room for one more question at the end right and thank you so much christian for knitting this together for me like this this is really helpful um so yeah gavin's question is great because i think it gets at two issues that are inherent in trying to illuminate what is happening in border regimes generally and then made further complicated by the fact that a lot of the technology is either proprietary or difficult to understand for someone who is not a coder or a computer scientist or someone who can pick apart an algorithm and you know create a matrix of how the decision is made because also of course algorithmic decision making in and of itself is very opaque and hard to understand and it's been critiqued rightly so i mean i think part of that is interdisciplinary collaboration working with technologists who are able to illuminate why something works the way it does and then working you know with civil society with human rights lawyers with journalists to find ways to then critique and talk about these issues from an accessible perspective that gets at all these nuances but i think your question also speaks to a really worrying development
all over the world and that is this kind of securitization or militarization of a lot of immigration regimes and frankly being on the ground is getting increasingly difficult um it's harder to access certain spaces it's difficult to take pictures it's difficult to contact people inside refugee camps and that's a deliberate way that certain states and certain regions are securitizing the border and then also making critiquing it very very difficult so i think we're unfortunately going to see much sharper manifestations of this in the future and this perhaps gets me to the second question about the similarities between the us and the eu contexts because of course i think again this is a global phenomenon i kind of accidentally found myself at the fringes of europe partly because of covid and certain methodological constraints in terms of being able to do on-the-ground work during the pandemic but our project is global we really are trying to slowly slowly knit this all together and there are some really great folks doing work on the kind of smart bordering of the u.s mexico border which
again of course is deeply historical and it predates you know the biden administration and even the trump administration goes back all the way to obama and even before of course uh you know with 9/11 giving uh the u.s kind of migration machine all sorts of leeway to securitize and quote unquote smarten its border with mexico and unfortunately we are seeing the kind of parallels between again the sharpest manifestations of this the rise of deaths in the desert are largely due to people having to take more dangerous routes as they're trying to avoid smart towers um you know different types of technologies that are present in the kind of border that is currently in play it's really similar to what's happening in the mediterranean and aegean seas here they really are violent borders that are bolstered by technology thanks so much petra we have time i think for an answer to two more questions although we have only a few more minutes left uh they're very different um first of all eric rymondi has a question about methodology he asks when visiting lesbos and samos what was your experience with camp authorities and security like were you able to physically access the camps yourself and if so how which is a very interesting question and then there was a question from an anonymous attendee who asks what ai use cases in migration and border control would you call for banning in the eu ai act rather than categorizing them as as high risk oh two excellent questions i'll try and be brief because i know we have a few minutes so the first one is interesting and i mean i should perhaps be somewhat diplomatic but i'm not known for being that diplomatic so i'll just tell you um i am able to get inside the camps largely because i ask sometimes for official permission to go on press tours and i have been able to go so far um the more and more i write about this the less and less access i will have which unfortunately again goes to what we were just talking about with the kind of
securitization of the border so for those of you who are doing qualitative work and also work that is considered perhaps a little bit more on the activist side of things it's a balancing act always because you know i think it's my responsibility to write about it if i'm one of you know 20 people who can go inside the samos camp then i think like why sit on that information and you know put it in my book which i'm now writing and then like build my career on it and pat myself on the back three years later that's not really that important i think it's important to get that information out now but sometimes when you ruffle certain feathers then your access is curtailed so it's always a bit of a balancing act i am really lucky because here i work with quite a few journalists and we often collaborate together so it's easy to support each other when it comes to kind of access and also certain safety considerations that we all have to think about with this kind of work um and then in terms of uh the second question uh when it comes to the regulation oh gosh yeah this is this is a fascinating question and i think the way this is going to play out in the coming months is going to be really really telling in terms of how the eu will move forward i mean given that this is the end of our talk perhaps it's no surprise that my somewhat abolitionist type leanings are coming out so i would be more on the side of you know banning high-risk technologies unless and until we can find a way to get rid of systemic racism and violent border regimes but that's probably not going to happen so what would that actually look like to be using these technologies in ways that are at the very least rights respecting i don't know if it's possible however you know so this is my like pie in the sky abolitionist uh you know uh commitment that i have in my life and in my work but i am also a realist perhaps this is where my reluctant lawyering comes in because of course realistically calling for a
full-out ban may not get us anywhere when it comes to the current kind of regulatory space that we find ourselves in in the eu so then we have to be strategic and figure out okay so are we okay for example and i'm not saying this is on the table because there's a coalition of us trying to actually figure out what some of these bans and recommendations would be so if you are interested in this i would say stay tuned because we will be coming out with something in the next two months or so but it's a tough question because it's like well are we okay with you know banning ai in refugee applications but somehow fine with it in immigration i mean it sounds good in theory but if i put my refugee lawyer hat on you know people slip between categories and statuses all the time and just because you're a refugee at one point in your life doesn't mean that you might not have a spousal visa in the future or maybe you'll adopt a baby from another country or maybe you'll just need to go for a conference somewhere else and the fact that we then artificially would create this kind of distinction doesn't really sit well with me okay so maybe that's not good so then do we look at use cases do we look at regulating the private sector it's a really complicated space and i don't really have the answers for it and i think it's very much in development and the next few months and years are going to be quite telling whether or not we can actually arrive at a ban and the last thing i'll say and i know we're running out of time for me what's really instructive is also the conversations around for example the really really really super high risk ways of using technology for example in warfare or autonomous weapons i mean we couldn't even agree on a ban on that so how will we agree on a ban in immigration and refugee applications i don't know but i i think we have to try for it and see how far we can push the conversation because again it has real harmful impacts on people thank you so much uh
petra for for your time and for your very inspiring um um talk today uh and for not being uh diplomatic uh at times which i very much uh respect and uh and admire um also to say that you just mentioned the importance of sharing your work and not just thinking about yourself and i think that's imperative and i think it's great that you're bringing that out i'm often telling my three-year-old to share but i think it applies to all of us and in part it's in the spirit of this conversation uh series which is all about sharing more widely the great work that people like you are uh are doing uh i also wanted to thank nagotsi uh today for being a co-host uh in in this conversation and also she's hidden we can't see her face but victoria adelmant is really the driving force behind this conversation series including uh including this one i also wanted to thank her uh very much for for making this conversation today happen just to reiterate to our audience that the conversation has been recorded and will soon be made available on our website and also on the center's youtube channel victoria shared the links earlier and you should feel free again to share it uh with others and uh to use it in your teaching or in whatever other way you think is uh is useful um we will also publish a brief blog once the recording is available in which we'll also highlight some of the materials that petra has referred to her own report her earlier report in canada but also some other materials which are highly relevant so you can use them again for your own purposes and finally to say please join us for our next conversation in this series on the 24th of november when we will interview juan lopez he's a researcher at the global data justice initiative and at colombian ngo fundación karisma and we will talk about the use of digital technologies by the colombian government uh for the transfer of pandemic relief payments and that also proves uh to be a very interesting uh interesting conversation
thank you all for tuning in and hope to see you next month

2021-11-15
