NLM Science, Technology, and Society Lecture - Taking on Big Tech

I'm Patty Brennan, director of the National Library of Medicine. Thanks very much for joining us for this Science, Technology, and Society lecture, sponsored by NLM's Office of Strategic Initiatives. This lecture series aims to raise awareness of the societal and ethical implications of the conduct of biomedical research and its technologies, while seeding conversations across the Library, the NIH, and the broader biomedical research community.

We're extremely pleased to welcome Dr. Safiya Umoja Noble, a leading scholar in internet studies, a professor of gender studies and African American studies, and co-founder and co-director of the UCLA Center for Critical Internet Inquiry. In 2021, Dr. Noble was recognized as a MacArthur Foundation fellow for her groundbreaking work on algorithmic discrimination, the way in which algorithms produce systematic and repeatable results that create or reinforce unjust or prejudicial outcomes. This work prompted her founding of a nonprofit, Equity Engine, to accelerate investment in companies, education, and networks driven by women of color. She has also lent her expertise as a research associate at the Oxford Internet Institute, where she is a commissioner on the Oxford Commission on AI and Good Governance.

Her lecture, "Taking on Big Tech: New Paradigms for New Possibilities," will help us think through the ways in which technologies are not neutral and reflect the biases that exist within our broader society. When these biases go unchecked or uninterrogated, they can become baked into the algorithms that serve up information that is central to how we make decisions and see the world. I'm eager to hear her insights on how we may reverse these trends, particularly given our responsibility as the world's largest biomedical library and a leader in data science research.

The lecture will be followed by a Q&A, so if you have questions for Dr. Noble, please use the live feedback button to submit them; you will find it just below the talk description. There is also a link on your screen for ASL (American Sign Language) interpretation. Please join me in welcoming Dr. Noble. Dr. Noble, thank you again for joining us.

Thank you so much for this wonderful invitation; I'm truly thrilled to be here with you today. I consider it such an honor to get to speak before librarians. My home community, my scholarly community, is among library and information science professionals, so this is a great opportunity to share a little about my work with you. One of the things I'd also like to do in this talk is introduce you to other people now working in this field who I think would be a tremendous resource for you as well. What I'm going to ask in return, as I talk about my work and share other people's brilliance with you, is that you think about ways you might partner with us or recommend these scholars and their work, because there is still so much to do. It seems like every day we're reading a new headline about harms coming to various publics, and in particular to vulnerable people around the world, and for as many people as are coming into this field of study at the intersection of society and technology, we cannot even begin to keep up with the incredible amount of work there is to do. So I'm going to try to enlist the help of those of you watching today.
Think with me about how you can participate in stemming the tide around some of the dangers we are seeing as researchers.

Let me say a word about this photograph. It was inspired by the provocation that we think about the effects of going digital, of having everything be digitally networked, and the way in which knowledge and information sharing have become so deeply and profoundly tied to electronics. About three years ago I spent the summer in Accra, Ghana, looking at and thinking about the global implications of the way we talk about networked knowledge and networked information made available via computers and computing technology. And yet, particularly in the global North or in the West, we don't think about who pays the price, or about the global supply chain: the extraction industries of mining that are so necessary to make microprocessor chips and other resources needed for computing, or the end of the life cycle of electronics, when we produce billions of pounds of e-waste, discarded electronics. Those are things to be thinking about, the material dimensions of what it means to become so profoundly invested in the digital, and those are things I look at in my research as well. With that, I'll offer that there are many ways we could be thinking about the long-term viability of knowledge, and certainly that is what brought me to these inquiries myself.

Can you see this next slide? Can someone give me a thumbs up, Patty, or Miriam, or Mike? Great, thank you. There are a number of theorists who have been proposing important interventions at the intersection of race, technology, and society, and I want to foreground these people because some of them came before me; I stand on their shoulders. They were doing important work that disrupted the way we think about vulnerable communities, communities of color. My work is always centered on vulnerable people, because there are many communities upon whom technology is beta tested, who have very little ability or recourse to speak back to the ways in which they are misrepresented or outright harmed by different kinds of technologies. I want you to think about this in the context of librarianship and data curation and data management, because even inside our systems we often do not think about the ways people are represented or misrepresented, or the frames we deploy toward different kinds of communities; I'll talk more about that in a moment. These are scholars who have also been looking at a variety of technological histories and futures, and if you're interested in reading and learning more, I strongly recommend the Critical Race and Digital Studies syllabus. Those of us on the screen here, and many, many others whom I can't fit on one slide, have been doing this work since the early years of the internet and all the way through to contemporary work and future-casting.
That syllabus is definitely a very powerful and important resource for you and for the colleagues and students you may interact with. One of the marquee dimensions of my work has been the book Algorithms of Oppression, which I'll talk about in a moment, but this work on the harms that different kinds of digital technology platforms and digital media companies inflict on Black communities has been ongoing; it's surprising even to me how the last ten or eleven years have flown by. I point to this because the work of those of us who have been trying to sound the alarm that there are material and representational harms that come through different types of digital technologies, including in the library, has been quite obscure. It's really only in the last couple of years that the idea has taken hold that maybe digital technologies, everything the internet promised us, aren't quite as liberatory as we thought they would be, or that there are consequences to these types of investments, and we're talking about trillions of dollars of investment globally. We're just now getting to the moment where we have more visibility around these conversations and the interventions we so desperately need to slow the rapid rush toward uncritical deployment of different kinds of technologies.

I see this most acutely as a professor at UCLA, where my undergraduate students' relationship to the library is mostly about finding study space or places to sleep, not about having the facility and the stamina to engage with librarians and our many libraries on campus. Because they have been socialized around things like internet search, that has become the default way they think about accessing information and knowledge, and I find that incredibly dangerous for a number of reasons. So I want you to think with me today about what it means when large digital technology companies are shaping and remaking expectations about how to access knowledge and information, such that many people believe that everything can be known, and known in 0.03 seconds, and they search without the careful eye toward deeper investigation. That is also socializing researchers and the people who come into contact with libraries and librarianship. I often hear people talk about how difficult it is to navigate research communities, to go deep into research; we are all being socialized by this new phenomenon of internet search, and that's one of the things I want to talk about now.

When I was writing Algorithms of Oppression, I was thinking about the future of knowledge. I was at the Information School, the Graduate School of Library and Information Science as it was formerly named, at the University of Illinois at Urbana-Champaign. At the time, Google was really coming into vogue, as were search engines more broadly. Many of us on this call were on the internet long before search engines; we remember the old directories of information,
many of them curated and managed by librarians around the world, by DIY-ers, by subject matter experts of varying types. People understood the precarious and fragile, if anything, ways of exploring the internet then. I tell my students that we used to have these books, phone books if you will, of URLs; I know some of you on this call remember those days. In many ways we were relieved when search engines came about, because they did some of the hard work of indexing what was available on the web and displaying results for us in ways that felt less complicated. But the opacity of what was happening behind the interface that generated those displays was quite different from our own previous labor of looking through directories ourselves.

This is one of the things I think is so important. I've seen many academic libraries try to model their own search functionality on what commercial search engines did for the internet, and I think this is one of the places where we should stop and really spend some time thinking about the implications. First of all, there is something about this opaque interface, with the simplicity, or the alleged simplicity, of a search box where people enter terms. We know from the information retrieval literature that people search with very few words; unless you're a computer programmer pasting in a string of code to look for mistakes or breaks, for the most part people enter very few keywords into these opaque systems, and then they believe that what they get back is in fact the most credible and viable information. Excuse me, I'm fighting a bit of a cold today.
This book, the investigation I started in 2010 and 2011 and completed in 2016, published in 2018, was a look at the kinds of searches I had collected and conducted over several years on women and girls of color and on a variety of occupations, along with stories and case studies of people and ideas that had proliferated in search engines and even caused viral news stories, and what the implications were. What does it mean when we become reliant upon searching databases, or searching an index of the web, and relating to the results as if they are true and credible? What are the cultural implications of the work librarians do in creating metadata about the things we find in the library and online, and what are the incredible responsibilities we have in doing that? In the case of something like Google, you don't have librarians working on the search algorithms; you have people who are not necessarily thinking about the subjective nature of knowledge and information, about veracity, credibility, what should be seen, what should be known. These are the kinds of things I investigated for several years.

For example, from 2009 to 2012 I was conducting searches on the keywords "black girls," and the results looked like the ones you can see here: most of these are pornographic websites, or sites one click away from or directly connected to pornographic websites. I asked in the book what it means that you don't have to add descriptors like "sex" or "porn," and yet girls of color in the United States, Black girls, Latina girls, Asian girls, were for the most part synonymous with pornography. I've been writing and talking about this for a decade, and we have seen some improvements in the images and websites that come to the fore around Black girls; we still have work to do around other girls of color and women of color in the United States. This opened up a very important conversation, one we've been having since this book and earlier articles came out, about the incredibly subjective nature of search and the way the companies and industries with the most money are able to control the narratives and the kinds of resources that come to the fore. Companies in the gray market of SEO are also able to deeply manipulate and gamify search algorithms. Of course there is some effect from what the public is searching for and clicking on, but even for those who believe the results on the first page of search are purely connected to what the public searches for, we have to ask about the profoundly undemocratic way in which that happens, because Black girls themselves are such a small percentage of the population that they would never be able to control the narratives about Black women and girls if we're just talking about a majority-rules type of engagement. So this book is really about exploding a number of myths people hold about how search engines work, and about unpacking, as closely as we can glimpse without knowing what these proprietary algorithms are, what is happening in commercial search spaces.
It's also about what it means for us, in the library and information space, to try to model our own projects on commercial search engines; I think there's a lot of tension there that we should continue to investigate. Let me make this even more specific in the context of health and medicine. One of the things we know is that the most-searched keywords and terms in Google Search are related to health. So we need to ask about the way publics are profoundly reliant upon search technologies to help them navigate their own medical advice and health information, the things they are exposed to or that are recommended to them by healthcare professionals, and also the way they are targeted through social media with fraudulent, unscientific, unproven health remedies. We saw this, and I'm preaching to the choir here, with COVID-19, but also around vaccinations before that and other kinds of medical inquiry the public has. For me, these questions are not only about which words are searched, but also about how the public, for the most part, cannot access scientific research, because much of it is sequestered away in library databases that are not available to the public; it often sits in academic libraries where you need some type of access or relationship to the university in order to reach it. These are the tensions of this moment in particular, when in the U.S. we have such a precarious and fragile national healthcare system that the public is left with nothing but advertising engines to help them navigate these incredibly complex information needs.

All right, those are some things I think are worth thinking about. When I'm speaking with librarians, I try to encourage our community to think about what we can do and how we can partner with each other. We have such a long history of strategic partnerships and of building on behalf of the public in so many different ways. How might we be thinking about indexing the web and making scientific information more visible and readily available to the public, rather than leaving them wholly reliant on commercial advertising spaces?

One of the things we know is that these grotesque misrepresentations of Black women and girls are not just happening in the new media environment of the internet. These are old ideas, old stereotypes, racist and sexist tropes that have been with us since the beginning of the Americas, the early colonial period. These mythologies, for example the mythology that Black women are more sexual, had to be invented, produced, and circulated to justify enslavement and the reproduction of the enslaved labor force, which Black women were forced into; particularly after the transatlantic slave trade was made illegal, the only way people could remain enslaved was through forced reproduction. You can see some of these artifacts here, and if you're interested in these histories I encourage you to visit the Jim Crow Museum of Racist Memorabilia at Ferris State University, which really helps us understand the history of racist stereotyping.

What's so important is that these legacy racially biased, discriminatory ideas are also profoundly and deeply embedded in libraries. We want to understand that as we think about how the history of science and the history of medicine is a history not just of experimentation but of classifying Black people, Indigenous people, people of color, and Jewish people as subhuman. There are profound power hierarchies that we still live under that are tied to these early and contemporary histories of science and medicine. Here I would point you to the work of people like Dorothy Roberts, Alondra Nelson, Ruha Benjamin, and Terence Keel, who have really helped us understand these histories. If this is new information to you, I cannot underscore enough how important it is to understand these histories, because they are affecting the future of science and medicine, and I'm certainly thinking about how they're affecting the future of digital technologies. Even the underlying logics we hold about hierarchical systems are tied to these early scientific classification projects, and of course librarianship is also a science of classification, the classification of knowledge and information. Who is foregrounded in the hierarchy of these knowledge systems? Those are the kinds of things we really want to understand, know, and care about more deeply.
The contribution of those of us working in this area, and certainly of my own work, has been to think about the ways the algorithms we use in these digital systems are laden with a variety of racially and gender-discriminatory logics. When I was writing the book and working on this research more than a decade ago, arguing that the algorithms and artificial intelligence used to sort through and index the web were encoded in racially biased ways was heretical. It's shocking to me that ten years later it is common sense, common knowledge, that artificial intelligence and algorithms can be discriminatory, can be racist or sexist; at the time, people were not making that argument. We've done a lot of work, a lot of people have joined this conversation, and we now have an emergent field of critical race and digital studies, of critical internet studies, much more deeply tied to ethnic studies. I've helped generate and been a part of this community because, with respect to the leap of blind faith we place in so many types of software and so many database-driven projects being invested in, not enough time is being taken to understand what these technologies are and how they will remake our future. You can see here other important books: Automating Inequality, which looks at the way artificial intelligence has completely changed the social welfare system and foster care; Ruha Benjamin's Race After Technology; Data Feminism; Design Justice. These are all works that help us understand how and where we can intervene. Other dimensions of this work, for me, are scattered across other books and book chapters I've been working on with my students and colleagues over the last decade, and those are also places where you can find more people thinking critically about digital technology and social harm.

One thing I think is really important to consider is that we are living in a moment where people, and when I say people I mean middle-class Americans, are increasingly aware of the harms of social media. I sit as a member of the Real Facebook Oversight Board, which is a community of journalists, scholars, and critics of Facebook and of its failure to investigate its own contributions to undermining democracy around the world, its role in genocide and in human rights and civil rights violations. Many journalists are writing about the harms of social media, Facebook in particular. But one thing that is so important to remember is the relationship between social media and search. If you go back to 2016, during the presidential election, a search on "final election results" gave you, as a top hit on Google, a link to a disinformation site showing completely false information about Donald Trump winning the popular vote.
We know that he won the Electoral College and that Hillary Clinton won the popular vote. But you have to ask yourself what it means when hundreds of millions of people use a search engine to fact-check things they hear in broadcast and mass media or come across on social media, and when they go to the commercial advertising systems that search engines are, and try to use them like fact-checkers, they find propaganda or disinformation. To me these are the kinds of things that are incredibly threatening to democracy, and we should be paying very close attention, not only with respect to national politics but to the way people come to understand a whole host of issues around society, health, well-being, and our collective social welfare.

I want to pause here and point you to some other efforts that I think are really important. One thing we need desperately, in order to support the community of librarians and the agencies funding work in the public interest, is more support for research around these conversations. These are some efforts I think are important or that I have been connected to: our own UCLA Center for Critical Internet Inquiry, which I'll speak about in a moment; the Ida B. Wells Just Data Lab; the Algorithmic Justice League; Equity Engine; the UC Santa Barbara SEEDS program; the Distributed AI Research network; and Terence Keel's coroner reports project. I want to say a word about that last one because I think it's incredibly important. It has been so inspiring to see Terence Keel use deep knowledge of science, technology, and medicine to help the public, and people whose family members have been harmed through police brutality, understand things like how to read the autopsy reports of their loved ones. There are so many ways in which science, medicine, and technology converge to do the work of bringing about justice and helping communities that are neglected in our society, and these are the kinds of people who are so deserving of partnership and support; I want to make sure you know about them.

For us at UCLA, our work has really focused on policy. We understand the incredible moment we are living in, with the potential for federal policy that would foreground and more deeply invest in public institutions, public agencies, and public efforts; we think of research universities, academic libraries, archives, and community archives projects that are profoundly tied to serving the public, restoring the public good, and expanding on it. But we also see an incredible need for shifting culture and communicating to the public how important it is that our information resources not all live in the hands of commercial advertising platforms, which include social media companies, search companies, and other information and media companies that are truly tech companies. The conversations around companies like Spotify, for example, which are really tech companies more than media companies, matter because the reach of knowledge and information into the public is so important to understand.
And the business logics that drive tech companies are not well understood. They have positioned themselves so powerfully as public goods, while public goods themselves are defunded and under-supported. So we have focused on policy work and advocacy for public institutions, as well as the culture-shifting work we think needs to happen, which is helping the public understand what these different kinds of technologies are.

The last thing I want to share is the future work I'm doing; there's a preview of it in an interview I did with Ethan Zuckerman at UMass Amherst. We're in a moment where we have to ask what it means that seven of the ten most well-capitalized companies on planet Earth are tech companies. We're hearing discourses akin to those of the era of big cotton, an era predicated upon occupying Indigenous land and upon an enslaved labor force. The things written during that era claimed it would be impossible to end the slave trade and the trading in African people because the American economy was completely reliant upon it. We still hear these kinds of arguments about the reliance of every industry on the tech sector, on algorithms and artificial intelligence, and the claim that without those technologies our economy would collapse. And of course we've learned so much about what the limits of economic models should be. Should they violate human and sovereign rights in order for capital to flourish at all costs, including at the cost of human rights and civil rights? These are the questions we are grappling with right now. We see more and more headlines about the ways AI has falsely accused people, almost exclusively African American men. We see, over and over again, headlines about facial recognition failing, again and again, on Black women's faces and on people of color. So we're going to have to ask some very tough questions about the limits of AI, and I bring this conversation to this community because what scholars, librarians, data scientists, and information professionals do at the forefront of this work is really important. It is an incredibly important site for putting the brakes on and saying: we don't have enough information about some of the technologies being used and what the implications will be. Think about the thousands of years of librarianship it took for human beings to have bodies of knowledge, scientific information in particular, to use to improve our quality of life all around the world; and here we are, I think, on the precipice of employing technologies that might squander those thousands of years. What does it mean, at such a basic level, that we digitize knowledge to the degree that if you don't have electricity or access to the internet, you don't have access to knowledge? Those are profound shifts in the way human beings think about the future of knowledge. We are also living in a moment where big tech has hired many of the lobbyists and executives from the era of big tobacco.
I tell my students all the time that when I was born my mom was probably chain-smoking, the doctor probably had a cigarette in his mouth, and I'm sure my mom smoked two packs of cigarettes in her hospital bed. My students cannot believe that, because they've grown up under a different paradigm around tobacco. So we can ask ourselves: what would it take to create a paradigm shift around big tech? What would it take to think about the interests of the public, and the protections the public needs, from the kind of self-interested technology research and projects that are all around us and that we're contending with? To me this is an incredibly important place for librarianship. It is really the moment for information professionals to be at the forefront of articulating what our national and international agenda should be at the intersection of knowledge and technology, so that commercial interests, or the easiest or faulty technologies, do not simply settle into taken-for-granted ways of doing things, which is really what we're contending with now.

So what are some things we could do? First, we can integrate the study of science, technology, and AI with critical ethnic studies. This is what my work has done: it has brought information science and library science into dialogue with Black studies, with histories and theories of oppression, to help illuminate the stakes that live at those intersections, and I think we need more of that and more attention in these areas. We need to fund centers of expertise, as I've mentioned, but we also need to hire PhDs and researchers who come out of the humanities and social sciences and can work alongside technologists, teach in the sciences and in engineering, and work in our organizations to think about the things data scientists don't always think about when they inherit a dataset at the end of the line, or when they're not thinking about the data they use to train their machine learning algorithms; we can bring in other experts to help us think critically about that work. We need centers of excellence that can help shift public policy and culture, as I mentioned, and these are all things we can be doing together.

The last thing I'll say is that we have more data and technology than ever, but we also have more inequality and more injustice to go with it. We were promised that AI and algorithms would liberate us, but when we look at the research, economists tell us that by 2030 the top one percent of wealth holders will hold two-thirds of the world's wealth. We have to ask to what degree technology and AI are implicated in more sorting between the haves and the have-nots. My colleague Cathy O'Neil, who wrote the amazing book Weapons of Math Destruction, says AI is great for helping those who are already doing great do better, and those who are not doing great do worse. There is a lot to be said about where we are using these technologies to create more access for people who have power and privilege and less access for those who don't. These are the things I get up every day and think about. So I'll leave it there and just say thank you; I look forward to your questions.
Thank you very much, Dr. Noble. That was just incredible, and a broad swath of concerns and spaces that we've all been in, so I really appreciated it; it resonated with me. My mother, by the way, did not smoke, but I certainly babysat for lots of women who did. Let me start with some questions that will take you a little further into the interests of our audience. An audience member says: I appreciated that early in your talk you reflected on socializing students and researchers to understand the role librarians have in information ecosystems. Do you have any strategies to share around elevating the role of librarianship, and perhaps human content moderation, as we continue to think about the challenges you outlined in your talk?

That's a great question. My colleague and close collaborator Professor Sarah Roberts at UCLA and I were in graduate school together, and we miraculously ended up at UCLA together too. She did the first academic study of commercial content moderation, and at the time I was arguing that algorithms were racist and sexist, she was arguing that there was a hidden workforce of content moderators. For both of us, those questions emanated from our commitment to librarianship and from understanding librarians as content moderators too, quite frankly: adjudicators, people who use their expertise, things like literary warrant and other value systems, to decide what is collected, what is preserved, what is purged, what never makes it into a collection. These are content moderation issues that, in the context of global advertising companies, cannot be managed. The size and scale of the content that moves onto these platforms cannot be managed through software, and it cannot be managed through the global workforce at the speed at which content is posted. In the last interview I saw, one of YouTube's vice presidents said they had over 400 hours of content per minute being uploaded, just to YouTube. At that volume, of course, we have to think about how much is simply there. Sarah's work has been so important in helping us understand what content moderators have learned, not only about the decisions they make in moderation and how those decisions are used to train algorithms for large-scale computing, but also about the lack of depth of knowledge and the limited ability to even recognize racist propaganda, for example. So those things are very important. This is certainly a place for librarians: our professional training is about the subjective nature of knowledge and information, and we have professional standards for that adjudication. We are quite out front about that, which is very different from the way digital media platforms and companies argue that they are not biased, that they hold no values, that they are value-free. That is a fundamental tension: we have values, we know what those values are, and we advocate for them, versus "we have no values, we are neutral." Part of that, of course, is because the platforms do not want to be held accountable for the kind of content you're going to see on their platforms, or for propaganda, and they use Section 230 as a shield.
So part of what librarians and our organizations have to do is be out front. We also have an impulse toward being passive advocates. I always wonder about this because I went to library school in the Midwest, and there is such a humility about librarianship that is both beautiful and incredibly annoying, because I'm from California and I'm like, we've got to get in the streets. And we do have to get in the streets, metaphorically and maybe also actually: we need to be in front of policymakers, helping them understand why it's important that they invest in public institutions, in public libraries, in universities, in academic libraries, in the funding agencies, because we are on the front line providing high-quality knowledge to the public. If we don't take that role seriously, we will simply leave everyone to Google or Facebook, and to me that seems so incredibly unsatisfying when we have the language of advocacy and knowledge at our fingertips. We really have to be activated to use it.

Well, I like the assertiveness in that. Another question takes you into your personal process for search. One of the participants asks: can you take us through your personal process of engaging search when you'd like to learn about a particular topic, to give us a framework for being more mindful as we go into search?

Absolutely, it's such a great question, and it really depends on what it is. If I'm shopping, say I'm looking for new cleats for my son, I will use a search engine like Google, because I know it is a commercial advertising space and everybody is going to be trying to advertise the best price on cleats there; I'm very aware of that. If I'm trying to go deep on something I don't know much about, I typically start with a library database, just to get a landscape analysis of what's there. On my phone, for everyday things I need to know, I often use DuckDuckGo, because I'm mindful that it isn't tracking me, so past searches don't have as much influence on the searches I do. I often compare results with Google too, so I use a variety of methods. And then I still ask around. I'm fortunate because, as a faculty member at UCLA, I work with thousands of experts, so I'll ask whether anybody knows anybody who knows anything about X, Y, or Z. I'll go to our faculty pages; I'll search within the ucla.edu domain to see who is working on a topic, when it's more research-oriented, because I want to know who already knows more than I do and whom I have proximity to or a warm relationship with. I also sometimes use Google Scholar and compare that with the library databases, and I often find things in the databases. One last thing I want to say about library databases is that in the face of so many budget cuts, many libraries are cutting back on journal subscriptions; they're cutting back in a lot of different ways, and we need to be advocating for more resources on our campuses.
I say that because if you publish, let's say, at the intersection of race and science, and you're trying to get your work visible to other Black scholars who might also be working in that area, you might be publishing in more niche journals that Black scholars are reading and publishing in, and those tend to be the journals that are first on the chopping block, that don't get bundled into big purchases, that aren't owned by large scholarly publishing companies. You have to keep that in mind in terms of your collection development: how easy it is now, with so much consolidation in scholarly publishing, to nix access for underserved communities.

We would be remiss not to say that we're seeing lots of people thanking you and saying how much they've enjoyed this presentation, and I know that your language, including around collection development, is really resonating with many of the people listening right now. I have a third question, and maybe time for one more after that, and then we'll close at four. This question is specifically about librarianship: what kinds of competencies do library and information science professionals need to have in terms of technology, AI, big data, and their intersection with diversity, equity, and inclusion?

Thank you for asking that. When I was in the Information Studies department, which is our library science department at UCLA, I always taught the one diversity course, and I wish there had been many, many more, because our students really want and need those classes. They need to think about things like the social construction of datasets; you need to understand how datasets come into existence. Even the informaticists, who might think they didn't come here to learn about history or society, get in there and realize they needed it. Those histories are very important for us to teach, and to keep learning through professional development even when we are not fresh out of an MLIS program. The syllabus I shared with you is one resource, and there are a lot of critical librarians on Twitter under the #critlib hashtag sharing books and syllabi and things people should be reading to contextualize this work specifically in our field, at the intersection of power, knowledge, and oppression. There is such a wealth of people: I think of Melissa Adler, who could give you an amazing lecture, or Emily Drabinski on LGBTQIA+ concerns, and my gosh, nowhere could that be more crucial than in a conversation about science and medicine. So we have a lot of work right at our fingertips that we can access and that we should be touching.

Thanks very much. This is a long question; I want to see if you could touch on it briefly before we thank you and our audience. The questioner says: I was thinking about your suggestion that we hire PhDs in Black studies into science departments and into our offices to provide thoughtful perspectives across the life cycle of knowledge development; having those perspectives is particularly critical for rigorous and just policy development. Are there standard methods for evaluating justice and equity, particularly in policy? And for folks who don't have access to scholars in-house,
what would be the best options?

That's a hard last question. I think we're seeing some centers emerge where we are training doctoral students who can come out and do this work; certainly I've been training some students, I know Ruha Benjamin has been training undergraduates, and Andre Brock at Georgia Tech has been training students. There are faculty around the country who are developing and including students to take on policy work. It's a challenge, I'll tell you, because if you get a PhD at a Research 1 university, you are trained to go be a researcher at a Research 1 university; that's the project. The project is not to go work in public policy; in fact, that might be heavily discouraged. That is a structural tension in academia that we're going to have to contend with. But I will say that at the research centers that are emerging, we are working on policy, thinking about these things, and increasingly being called to testify and provide expertise. Right now that's a small community of people, but we are there. There are also great organizations, I'm thinking of Upturn and others, doing policy work at the intersection of AI, machine learning, and social justice, and those are places that can be tapped for expertise.

Thank you so much. I want to express my gratitude, how much I've learned, and how pleased I am that there are scholars like you and your colleagues working in this area. Let me turn back to Miriam and Mike for the final thank-you and farewell.

I'll let Miriam take that; I'll just express my thanks as well.

Yes, thank you so much for all of these insights, and for offering a framework that introduces a bit of friction into how we go about learning about the world and taking in new information. This has been a really lovely way to spend the afternoon, and we're honored to have had you with us. So thank you, and good afternoon to all. Thank you, everybody. Bye-bye now.

