Ifeoma Ajunwa on the limitless boundaries of employee surveillance


"And it's still quite shocking to me that even other scholars have that idea: that we should just give it all to the machines, that humans are so full of unconscious bias that we can't debug them, so we can only debug the machines. And I'm like, well, who's creating the machine?"

Today we're speaking with Ifeoma Ajunwa. Dr. Ajunwa is an associate professor at the Industrial and Labor Relations School at Cornell University and an associated faculty member at Cornell Law School. In addition, she is a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University and a faculty affiliate at the Center for the Study of Inequality at Cornell University. Welcome, Dr. Ajunwa, we're very excited to talk to you today.

Thank you so much. I'm excited to have this conversation.

Thanks for joining us; we really appreciate you spending the time.

Thanks, David. All right, we're going to jump right in with an interesting and deep question about employee surveillance. I know this is a topic you've spent quite a bit of time thinking about and researching, and it's on a lot of people's minds with so much remote work. One of the things we're interested in is how surveillance tools affect workers, and in what ways gender, race, and other axes of inequality shape those effects.

Well, thanks for that question, David. Surveillance is something that has been around for as long as we've had the workplace. Employers do have a vested interest in surveilling workers, particularly in ensuring productivity and deterring misconduct. The issue arises, however, when the surveillance becomes intrusive or pervasive, and when surveillance operates on axes that can be discriminatory or can be used to single out certain employees for harassment. We do need to be aware of that. Currently, under American law, there are really no limits on what can be surveilled in the workplace. In my law review article "Limitless Worker Surveillance," I look at the various types of surveillance currently employed in the workplace, whether surveillance of productivity or surveillance of health behaviors through workplace wellness programs, and I find that the law essentially allows the employer carte blanche in how far it can go in surveilling employees. And while employers might think this is a boon or a benefit, they really do have to be careful in weighing the surveillance choices they make, to ensure those choices do not become actionable against them or come to be seen as discrimination or harassment.

To that effect, I want to bring to mind a case that recently happened in the state of Georgia, called the "devious defecator" case for reasons that will soon become clear. In that case, certain individuals were leaving feces around the workplace, a warehouse, and the employer, to determine who was doing it, decided to surveil workers through DNA testing. Unfortunately, the employer singled out two employees for the DNA testing, and those two employees happened to be African-American. The DNA testing revealed that it was not those employees who were responsible for the acts of vandalism, and the employees subsequently sued their employer. In a verdict returned against the employer, the judge noted that this could be seen as harassment or discrimination because of the singling out of those two individuals, and that it also violated the Genetic Information Nondiscrimination Act (GINA). This is an interesting case for several reasons. First, you have to ask why the employer used DNA testing to accomplish this surveillance; could it not have used, say, video cameras, which are actually still perfectly legal? Second, the Genetic Information Nondiscrimination Act was not really created for the purpose it served in this case. It was created to prevent employers from discriminating against employees in hiring or retention because of their genetic profile. With this case, however, the court has stretched GINA to be, in some ways, an anti-surveillance law when it comes to scrutiny of an employee's DNA profile.

Wow, that's a fascinating intersection of law, technology, and employees' relationships with their employers. It's a little unusual, though, in that companies are, I think, to some degree aware that targeting and singling out people is dangerous from a legal perspective. On the flip side, a lot of technology is blanket: it casts a wide net that picks up all people and their activities in a broad sense. I wonder what's happening at that other end of the spectrum, where you may have software on your laptop, and if you sit at home and watch movies, your employer is capturing everything about you. There are certainly dangers there too, right?

Yeah, that's a great question, because nowadays surveillance is prevalent in the workplace; it is pervasive and widespread. It's not really just a trend, it's the standard. Any American working today can really expect to be surveilled in the workplace. And you might think, well, if everybody is being surveilled, then it will affect everyone equally, but that's not really the case. Let's take the employer's perspective for a moment. An employer might think, I'm surveilling all my employees equally, I'm not singling out anyone; perhaps I'm taking screenshots of their computers, perhaps I'm taking transcripts of their emails, and it's equal for everyone. However, this can still be a situation of "more data, more problems" for the employer, because the more data you collect, the more you put yourself at risk of collecting data that is sensitive, or data that is actually forbidden as a basis for employment decisions, and this can open the employer up to suit by an employee who comes from a protected category. For example, perhaps you have an employee who is not out in the workplace in terms of their sexual orientation, but the information from surveillance captures this. If the employer then takes an employment action against that employee, say they're fired, demoted, or not promoted, the employee could have reason to say, I suspect it was because of my sexual orientation, and that claim would be bolstered by the fact that the employer does actually have that information. So employers really do have to be cognizant of the issues that come with more data.

Following up on that, it sounds like part of what you're saying is that the threat or risk around these surveillance tools and regimes is not just to the individual employee, in terms of their privacy or their rights being violated, but also an exposure of the employer to certain kinds of legal risk. That's why it's important for employers and organizations to be thoughtful about it, not just in service of doing the right thing by their employees, but also to be cognizant of the risk they're exposing themselves to. Is that what you're saying?
Right. I really see the risks of surveillance as twofold. There is certainly the risk to the individual employee: invasions of their privacy, information about them being revealed without their consent, and perhaps that information then being used to treat them differently. You can think, for example, of women with children who perhaps prefer not to make that known in the workplace; but through surveillance of their emails, or even through screenshots of their computers, it becomes known, and this could in turn affect their promotion chances or their ascension to leadership positions. On the flip side, there is also a risk to the employer from pervasive surveillance, because the employer now has within its possession information pointing to protected categories, and the mere fact of having that information puts it at greater risk of lawsuits alleging discrimination.

It makes me wonder about what we're dealing with right now with remote work during the pandemic. I feel as though I'm reading articles all the time about an increase in these surveillance tools and in employers tracking employees, though it's not quite clear to me how much the prevalence has actually increased. I'd be curious to hear your thoughts: first, is it your sense that more intrusive or pervasive tools are being used? And second, to this point about the risk for employers, what would you advise organizations that are thinking, "Maybe we need to monitor our employees more proactively if they're working from home"?

That's a great question. I would say that with the COVID-19 crisis there certainly is an instinct to surveil workers who are working from home. Employers might have some anxiety about maintaining productivity, or even just deterring misconduct, and we have seen some high-profile cases of misconduct involving employees working from home. That being said, I think employers really need to be deliberate and conscientious about the surveillance tools they choose, and think about whether those tools are serving the purpose they want them to serve, or whether they are too invasive, infringing too far on the dignity and privacy of workers. Because there's another layer brought on by the COVID-19 crisis, which is that most employees are now working from home. There is a difference between surveilling a worker in the workplace and surveilling a worker in their actual home, and employers really do have to give some thought to that.

We've been talking a lot about how employers think about this; I wonder, for the people watching, how employees should be thinking about it. As you said, most of what we're doing is over video, and what you can see is not just the person; I know a lot of cats show up, and also children. Do you have any advice for employees? Do you have a sense of whether employees realize this, or should be more aware?

Well, I think it really behooves employees to be very careful when conducting work at home. I would urge any employee to treat their work hours as just that, work hours, and to be conscientious about the activities they engage in during those hours. Try to have a dedicated laptop for your work, and obviously don't do personal activities on that laptop. If you can afford it, have a dedicated space where you work, hopefully a place that can be secluded, where you can close a door, keep it quiet, and shut out distractions. I would also urge employees to understand that, with the advent of these technologies, if an employer gives you a laptop or any kind of electronic device, the law is that the device still belongs to the employer, such that the employer can surveil anything on it. It is important for employees to realize that when they are using those devices, to be professional during work hours, and to try very hard to keep their personal life separate from their work life.

Shifting gears to another way employers use technology: you've studied the use of artificial intelligence hiring and screening tools, which, as you've said, are often created or implemented ostensibly to reduce bias; they're pitched as an intervention to eliminate or mitigate human bias. I think most people who even casually keep up with news about technology and business know that that's very much not always the case, and that these tools can simply perpetuate bias. That's a long-winded introduction, but we'd love to hear you talk a little about your work on this: how is it that these algorithmic hiring tools can perpetuate inequalities? Maybe some examples of what you see in that space.

That's a great question. When it comes to automated hiring, I would say that the public impression, and also the ethos behind why employers adopt these tools, is that they are seen as impartial, neutral, as having less bias than human decision-making. In my paper "The Paradox of Automation as Anti-Bias Intervention," I examine this idea that automated hiring platforms are neutral or without bias, and that they can be an intervention to keep bias from entering the hiring process, and what I find is that this is not actually the reality. And don't get me wrong: I think automated hiring as a technological tool can be quite useful. But like any technological tool, automated hiring will perform the way the people who use it make it perform; the people who use automated hiring ultimately dictate the results. What I mean is that there is a false binary between automated decision-making and human decision-making, because we don't have the singularity; we don't have machines that are thinking completely on their own. All the algorithms we have right now are created by humans. Yes, we have machine learning algorithms that learn from an initial program and then create new programs, but you still have to understand that there is an initial program, and that the resulting algorithm is trained on data that a human decision-maker has chosen as the training data, and this training data can come with its own historically embedded bias.

To give you a real-life example: there was a news article, really a whistleblower account, revealing that Amazon had created an automated hiring program for the purpose of finding more women for its engineering and computer science positions. It turned out that the program was actually biased against women, and Amazon subsequently had to scrap it, and of course didn't really reveal this to the public. The question then became: how could this be? How could a program created to help women, created to ameliorate bias against women, go ahead and replicate that very bias? And that is an important point I make in "The Paradox of Automation as Anti-Bias Intervention": automated hiring platforms, if not programmed correctly, if care is not taken, can serve to replicate bias at large scale, and while doing so can also serve to obfuscate, to hide, the fact that this bias is happening. So it's not enough for an employer to say, "I want a more diverse workplace, so I am going to use automated hiring and thereby eliminate human bias." The employer actually needs to audit the results coming out of the automated hiring system, because those audits are what will tell it whether the platform has a problem. In my forthcoming paper "Automated Employment Discrimination," I advocate for an auditing imperative for automated hiring systems: why should we deploy automated hiring systems, some of which use machine learning, and simply depend on them to produce good results without actually checking? I argue that the federal government should mandate that automated hiring platforms be designed in a way that allows easy audits, with design features that would let audits be run in an hour or less. These are computerized systems, so it wouldn't really be a big burden on the employer.

And I want to add one other thing. Some employers take the tack of "see no evil, hear no evil": they don't want to do the audits because they are afraid of finding discrimination and then having to do something about it. That's not a good tack to take in this day and age. Why? Because a recent decision has now allowed academic researchers to audit these systems. So whether an employer wants it or not, an academic researcher could come along and audit its system, and then the employer is caught unawares. It is actually better for the employer to take on the responsibility of auditing its systems regularly, checking for bias, and then correcting for that bias.

I found what you said about this false binary between human and machine decision-making really useful, because in the general diversity, equity, and inclusion field there's a lot of discussion about how hard it is for people to unlearn bias, to de-bias a person, so we need to focus on processes and systems. I think there's a lot of merit to that, but I get uncomfortable sometimes with the pivot to "if we just have the right technologies and tools, then that's the solution." What you said is a very helpful way to articulate that concern: human and machine decision-making are not two independent things.

Yeah, and it's still quite shocking to me that even other scholars have that idea: that we should just give it all to the machines, that humans are so full of unconscious bias that we can't debug them, so we can only debug the machines. And I'm like, well, who's creating the machine?

Exactly. And I think there's a strong trend toward that, especially in behavioral-science-driven work on discrimination in the workplace. Those are really good examples, and you're starting to share examples of how technology can be improved to actually reduce bias. Are there other ways you know of, or have come across, where we can leverage technology to fight bias?

So, a lot of times the perception is that people like me are Cassandras, because we are always predicting doom and gloom when you use technology, and many people see technology as a panacea: there's a brand-new shiny tool, and they want to just use it without having to worry about consequences. I don't think I'm a technology pessimist, but I'm also not a blind technology optimist; I'm somewhere in between. Technological tools are just that, tools; the results depend on how you use them. Technology can be a boon for employers who are trying to do the right thing and diversify their workplaces, and it can be a boon for employees who are trying to get a foothold in the workplace, trying to find employment. But for that to happen, we need regulation of these technologies. We can't take a laissez-faire approach to the development of automated decision-making technologies; we need strong regulation to make sure they serve the good of society. In automated hiring specifically, I think the technology, with the proper regulations, could actually be a boon to anti-discrimination efforts, because if you have a data-retention mandate and a record-keeping design, then through automated hiring you could see exactly who is applying and exactly who is getting the job. There could then be very accurate records of the employment decision-making picture, such that you could see whether there is bias, whether there is employment discrimination. Frankly, the first step to fixing the problem is seeing the problem. With traditional hiring, the problem is often hidden; it's not easy to see the bias or the discrimination. With automated hiring, it could actually become easier to see all of that.

That's a good point: with automated hiring systems and the appropriate audit tools, you could actually see the scoring of factors like the ones you mentioned, say women's colleges, whereas with hiring managers that scoring is hidden away in someone's head; they may not even know why they're making a decision.

That's a great point, exactly. As we say in the AI field, the worst black box is the human mind; that's uncrackable, to some extent.

Maybe we could talk a little about wearable tech and its implications for employees and employers. I know in some of your writing and research you've had examples that affect people of different genders differently, and some of this technology is getting quite invasive. What can you share on this topic?

That's a great question. We've had so many technological advances in the past few decades, and one of the biggest is the rise of wearable tech, because as computer systems become smaller and smaller, we're able to embed tech in so many different things. Wearable tech has become even more than a trend; I would say it's become a fixture of the workplace. When I speak about wearable tech, probably the first thing that comes to mind for most people is the Fitbit you wear on your wrist; there are also rings that do similar things, like tracking your heart rate and pace; but there's actually a plethora of types of wearable tech. What I am seeing, though, is that these wearables are raising several legal questions. The first is related to data ownership and control. These devices are collecting so much data from employees, and there's a question of who owns the data. The device belongs to the employer, but the data is being generated by the employee; so should the employer own the data? And even if the employer owns the data, who has access to it? Should the employee have access to the data, to review it and make sure it's accurate, and have some say over how it is used? I wrote an article for Harvard Business Review noting that, currently, the data collected as part of workplace wellness programs through wearable tech can actually be sold without the knowledge or consent of the worker, and it has been, both currently and in the past. Should that be legal? Should employees have a say in how their data is exported and exploited?

And then, when you come to workplace wellness programs, you have the wearable tech like Fitbits, but you also have other apps that workers are asked to download on their phones to track their health habits, and unfortunately some of those apps have been found to be doing things that could be used for discriminatory purposes. There was a news article about Castlight, a workplace wellness vendor that had asked employees to download an app to track their prescription medicine usage, and the vendor was using this information essentially to figure out when a woman was about to get pregnant. Certain prescriptions are contraindicated when somebody is pregnant or about to get pregnant, so those women would stop taking those prescriptions, and Castlight was using that signal to predict when a woman was about to get pregnant. This is especially concerning because, although we have the Pregnancy Discrimination Act, which forbids employers from discriminating against women who are pregnant, notice that the act does not forbid employers from discriminating against women who are about to get pregnant. So this app was essentially a tool that could allow employers to discriminate against women who were about to get pregnant, without legal recourse. That is concerning, when wearable tech is used for those purposes.

Thank you. I can see from Colleen's face she has given up on all of humanity, especially the technology. I know some of your work on surveillance connects to other scholars you collaborate with or respect in the field; tell us about some of that.

I definitely want to mention the work of Ethan Bernstein here. He is a Harvard Business School professor who has done empirical work on surveillance in the workplace, looking at surveillance in factories in China and other places, and I want to highlight one important finding of his that employers need to keep in mind. In one of his papers he noted that when workers were overly surveilled, it actually backfired; it had the opposite of the effect employers wanted. He found that in one specific factory, when workers felt they were being overly surveilled, yes, they did the work exactly as they were expected to, but they didn't take initiative; they didn't get creative about getting things done in ways that were faster and more productive. I think employers really need to think about the fact that organizational theory has recognized something called practical drift: in any given job there is a standard way of getting the work done, a way that has been thought up by management; but the people on the ground, the people doing the actual work, sometimes quickly figure out that while the standard way is okay, there are better, quicker, or more efficient ways to get things done, and so they drift away from the standard way of doing things. That is practical drift. When you have over-surveillance, you're not allowing for this practical drift, and you're essentially cutting off your nose to spite your face, as they say: you're hamstringing your employees, keeping them from being as efficient as possible.

We often end these interviews by asking the person to recommend a resource or a takeaway for people who care about these topics, but we want to do that slightly differently, since you have a forthcoming book that will certainly be a resource for people who care about these issues. Your book, The Quantified Worker, is coming out next year, I believe in 2021; I'm sure you're in the home stretch of writing and editing. We'd love for you to talk a bit about the focus of the book, and also, to this point about creating change, who you hope will read it and what impact you hope it will have.

My book, The Quantified Worker, is really a historical and legal review of all the technology that is now remaking the workplace, with a focus on AI technologies: examining how those technologies are changing the employer-employee relationship, and whether we can ensure, through new legal regimes and new regulations, that those technologies don't change the workplace for the worse but actually change it for the better. My hope is that the book will be read not just by business leaders or HR people but also by employees, and definitely by lawmakers, to give an in-depth look at what these technologies are doing in the workplace. Because a lot of times we hear about these technologies, but without having experienced them firsthand we're not really aware of the impact they are having on the individual worker, or on society. The book will include historical accounts of the evolution of these technologies, to understand where they came from and the ethos behind them; interviews with people who have encountered these technologies, about their experiences; and, finally, proposals for legal changes, new laws, for how to better incorporate these tools in the workplace. Because I'm not a Luddite: I think these technologies are definitely here to stay. It is about making sure they operate in a way that respects human autonomy, and that respects our societal ideals of equal opportunity and inclusion for everyone, regardless of disability, race, gender, or sexual orientation. That's really what I hope to do with the book.

It sounds like a fantastic text, very valuable. We want to thank Dr. Ajunwa for taking the time to be with us today. That's a wrap on the interview, but the conversation continues.

Yes, thank you so much, Dr. Ajunwa; this has been a really fascinating conversation. And we want to hear from all of you watching, so please send us your comments, suggestions, ideas, and questions at justdigital.hbs.edu. You've been watching Pathways to a Just Digital Future, an investigative project that aims to better understand and address inequality in tech. This program was produced by the Harvard Business School Digital and Gender Initiatives. Our team includes Ethiopia Omari, Tonya Flint, Liz Sarley, and Thomas J. Midayo. I'm David Homa, and I'm Colleen Ammerman. Thanks for hanging out with us, and keep exploring at justdigital.hbs.edu.

2021-02-17
