Public Interest Technology Symposium: Tech-powered Public Health
- [Moderator] Welcome to this afternoon's Public Interest Technology Symposium. This event will be recorded and available for viewing at a later date. By joining this session, you are giving consent to be part of a recorded event. Please note that participants' cameras and microphones have been disabled and the chat feature will not be available. At any time during the discussion, you may enter a question using the Q&A feature located in the toolbar at the bottom of your screen.
The panelists will do their best to answer submitted questions as time allows. Thank you for joining us and it's now my pleasure to turn the event over to Dean Laura Haas. - Hello, everybody. I am Laura Haas. I am the Dean of the UMass Amherst Manning College of Information and Computer Sciences.
I am really pleased to welcome you all to the kickoff event for UMass Amherst's new Public Interest Technology initiative, PIT at UMass. I am thrilled to be here to introduce the initiative, which is being led by my colleagues, Francine Berman, Director of Public Interest Technology and one of our moderators for today's panel, Ethan Zuckerman, Associate Professor of Public Policy, Communication, and Information, and Charlie Schweik, Professor of Public Policy and Environmental Conservation. Oops, I'm going to try and share my screen for a minute here. So Public Interest Technology, what is it? It is a relatively new field of study that, to paraphrase a definition from the New America Foundation, focuses on the application of technology expertise to advance the public interest in a way that generates public benefits and promotes the common good. It centers the people who will be affected by a technology, making them part of the solution or policy design and implementation process.
Faculty and students in the Manning, in the... I am really having troubles; faculty and students in the Manning College of Information and Computer Sciences are working on a variety of Public Interest Technology projects at any one time, from teaching and learning computing ethics, to conducting research on fair and equitable artificial intelligence, to building tools that help aging populations stay in their homes and tools that provide more equitable ways to participate in civic life. Often, these projects cross disciplines and are joint work with other colleges. And likewise, given the ubiquity of technology, every academic field and sector has its own projects and challenges to contribute, such as our topic today, technology-powered public health. So it is fitting and necessary that the new PIT at UMass initiative brings together all of the university's colleges and schools.
Under Fran, Ethan and Charlie's leadership, the initiative will develop a variety of educational, research, experiential learning, and outreach offerings that empower our students and the broader community with the critical thinking, expertise, and information needed to promote personal and professional social responsibility and the common good in a technology-driven world. With that, I'd like to stop sharing and turn this over to my colleague, Anna Maria Siega-Riz, Dean of the UMass Amherst School of Public Health and Health Sciences, who will be introducing the topic of today's panel and our moderators. - Thank you so much, Dean Haas. It's a pleasure to be here and to collaborate with the school as well as with the libraries on this particular topic.
There is just no question that public health and health sciences is at the forefront of using these technologies that you are mentioning. For many, many years, we've been interested in the development of mobile technology and wearable devices that can facilitate real-time data collection of diet, lifestyle, and biochemical variables, as well as electronic medical records that allow us to better identify and enhance our services in healthcare. It is astronomical to think about how many people actually use these types of devices. A recent survey of 50 to 80-year-olds, and I want you to keep that age range in mind, conducted at the University of Michigan this past year, found that 44% had used a digital health app in the last year and 28% were using one at the time of the survey. And of those, 28% were using it to control their diabetes, 34% to track exercise, 22% to monitor diet, and 20% to help manage their weight.
You can only imagine, if they did this survey in younger individuals, how much higher those percentages would be. So clearly, these technologies are being used by us, not only for research purposes, but in order for us to improve our healthcare and personalize both our medical and lifestyle prescriptions in order to improve health and wellbeing. And yet at the same time, we haven't necessarily thought about the implications: how do you keep these data private and confidential, and how do you refrain from sharing data that individuals have allowed you to use for particular purposes?
So today, we'll be discussing some of those interesting tension points that we sometimes don't have the opportunity to really think about from an ethical, societal, and health inequalities viewpoint. And so I'm very thankful that the colleagues from our schools have joined together, as well as the library, to be able to tackle this topic. And with that, I'm actually going to introduce Dr. Francine Berman, who will then introduce the moderators.
And thank you so much for collaborating with us and allowing the School of Public Health to be part of this conversation. - Thanks very much, Anna Maria and Laura. We're really thrilled to be here at the inaugural symposium of the Public Interest Technology at UMass Amherst Initiative. To paraphrase Laura, Public Interest Technology really focuses on the development and realization of socially responsible solutions to today's challenges in a tech-driven world, any kinds of challenges. It serves as a critical foundation for 21st century education. We live in a tech-driven world.
We're very interested in the advancement and health of society. And so this kind of initiative helps both the people who create technology and the people who use technology, and that's all of us. The PIT at UMass initiative at UMass Amherst is, as was said before, a cross-campus initiative and touches every single one of our schools and colleges, the libraries, et cetera. Today's symposium covers a particularly important area for us all, tech and health. We live in a world where we've been going through a pandemic, we've been living in a world where we're very focused on our health.
Tech is a more and more important part of our health and the medical profession, so this is an ideal topic to get a cross-campus group of experts to be in conversation among themselves and with you about this really important topic. So today's symposium is a conversation; you'll hear from us, and we're very interested in your thoughts and questions, which we hope you send in as well. And I have the great pleasure of co-moderating this, and my co-moderator is Sarah Hutton. And I wanted to tell you a little bit about Sarah and the library. So Sarah is UMass Amherst's Interim Dean of Libraries.
She's an education policy researcher with interests in maker spaces in the public sphere and the connection between making and citizen-led policy spheres, so important for us all. Sarah's focus on open education and scholarship reflects some of the most important trends of our time, trends that are critical to accelerating innovation and maximizing the benefits of higher education. Her work at UMass is important. For thousands of years, libraries have been cultural keepers of knowledge, and today's libraries are no different.
Today's progressive libraries and the progressive librarians that lead them, like Sarah, focus on information in whatever form it takes, and in particular digital formats. Our best libraries focus on the nuanced challenges of making that information available and usable and findable and relevant. The opportunities for librarians like Sarah and libraries like the UMass Amherst libraries are to identify and develop and manage and disseminate that information in the public interest, and those opportunities are tremendous. So for all these reasons, I'm really delighted to be co-moderating with Sarah, and it's been wonderful to prepare this event with her. So at this point, I'm going to hand it over to Sarah, who will introduce you to some of our panelists.
- Thank you so much, Fran, and just for those who have not had the pleasure of meeting and working with Fran yet, just a little bit more on where she comes from. As you know, she is the Interim Director of the Public Interest Technology Program, and she's a Stuart Rice Honorary Research Professor as well in the Manning College of Information and Computer Sciences at UMass. In her role as a data scientist, her work is really focused on the social and environmental impacts of information technology, with a focus on those strategies that promote technology as a tool to advance humanity, not just technology for technology's sake, but in the public interest, and that's one of the things that really excites me about the work that I've been able to do with her so far and having her at UMass Amherst. And I am delighted that her previous work focused on the preservation and stewardship of cyberinfrastructure and digital information.
And there are so many things there in alignment with what the libraries do and what we support on this campus and in the profession of academic librarianship, so I'm delighted to be a part of this conversation today as an individual and also to have the libraries be a part of the panel today. So just a little bit more about how we're looking to frame this conversation. We're really looking at facilitating a dialogue with all of you today.
So while we are a collection of tech and health and public policy experts, we are also members of the public. We're also all consumers; we're using some of those same health monitoring devices. I'd be wearing my own right now if I hadn't broken the band when I was at the gym last week. We share many of the same concerns. And even though we may research a lot of what we'll be discussing today as a part of our daily practice, we have a lot of unanswered questions ourselves. And so we really want to share with you what we have learned in our work and also discuss with you the work that still needs to be done.
So we're going to be starting out this discussion by introducing each of the panelists and asking them a few questions. And once we get that conversation started, we'll move into some of the questions that were submitted during the registration process. And there were some amazing thoughts and ideas shared by you all as the attendees beforehand, so I'm really looking forward to discussing some of these topics. I am going to be monitoring the Q&A throughout, and we encourage folks to submit questions at any time through the Q&A function. We'll work to integrate those into the conversation as it is happening as opposed to saving them all for the end.
And we're going to talk through and unpack a lot of the concerns surrounding data as a public good, data privacy, privacy governance models, and customization versus intrusion as it plays out in health monitoring devices for personal and research use. So without further ado, I'm going to get us started with the panelists' introductions. It's my great pleasure to introduce John Sirard. Let me go ahead and do a screen share myself. All right, okay.
So Dr. John Sirard is an Associate Professor of Kinesiology in the UMass Amherst School of Public Health and Health Sciences. He is an Amherst alum, holding a Bachelor of Science and a master's from UMass, and received his PhD from the University of South Carolina in 2003, followed by a post-doctoral fellowship at the Stanford University School of Medicine. His areas of specialization include the measurement of physical activity, social and environmental influences on youth physical activity, and school-based physical activity interventions. His research program's goals are to develop, test, and disseminate successful intervention programs that work at multiple levels of influence to increase youth activity and decrease screen media use, leading to long-term improvements in physical, social, and mental health. He leads the work of the Physical Activity and Health Lab, whose mission is to advance the field of physical activity measurement through research and application, generating new knowledge about the health benefits of physical activity across the lifespan.
So just a couple questions to get us started in dialogue, John. As a part of your introduction and noticing that you're a double alumnus of UMass Amherst, tell us a little bit more about your history with UMass and your work here over time. - Yeah, okay. So hi, everyone. And yeah, so I'm sort of a local boy. I grew up in Ludlow, Massachusetts down the road here.
So I came here for my undergrad, which was actually in marketing, and it turns out I'm not so good at selling anything, so that didn't really work out. So I came back to school for, at the time, exercise science, now kinesiology, here at UMass. And then I went away for almost two decades, but had the opportunity to come back as a faculty member. So it's really great being back here and working with the professors that used to teach me, but now as a colleague, so it's been fun. - Great, thank you. And so looking at your work in measurement for the purpose of comparing determinants and creating interventions in physical health, this has a lot to do with wearables, and we'll be talking a lot about that today.
So have you always been interested in wearable devices in your ongoing research, or is this relatively new in your work? - Yeah, so, yeah. I've been working with wearables since my master's thesis, which was testing a device called the Actigraph, and we still use the Actigraph, a different, more advanced model of that device. But yeah, it's been about 20, almost 25 years now that I've been working in this space, primarily using research devices with children and adolescents, which is sort of my niche.
- Gotcha, so the Actigraph, is that the wearable accelerometer device? - Yeah, yeah, so it's accelerometer technology. It's the same technology that's in your Apple Watch, but just sort of the research-grade version of those, yeah. - Okay, so besides this particular accelerometer, have you used other wearable technologies in your research? - A bit, yeah. So we do focus more on the research devices, but we have done some validation testing with some youth and adult devices in the past.
And so the main caveat for research would be looking at those consumer devices in a research space. I think they work well as intervention tools to get people to be more active, but if we want to just measure physical activity, what's going on, and get a true picture, we rely more heavily on the research devices. And also, some of the privacy concerns around some of the commercial devices, which we'll talk about later, can be an issue; I think we avoid some of that with the research-grade devices. - Thank you, yeah. I know that we've talked before about some of the concerns surrounding proprietary algorithms that come along with those commercial devices. So yeah, and as you said, we'll be talking more about some of those research devices as we move through our panel discussion.
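To make the accelerometer discussion concrete, here is a minimal sketch, in Python, of how raw tri-axial accelerometer samples might be reduced to the per-epoch "activity counts" that research devices report, and then binned into intensity classes. The function names, sampling rate, and cut points are all hypothetical: commercial devices use proprietary algorithms, and research studies use validated, device-specific cut points.

```python
import numpy as np

def activity_counts(samples, fs=30.0, epoch_s=60):
    """Collapse raw tri-axial accelerometer samples (in g) into one
    activity count per epoch using the vector magnitude, a common
    simplification of what research-grade devices report."""
    # Magnitude of each (x, y, z) sample, with gravity (~1 g) removed
    vm = np.abs(np.linalg.norm(np.asarray(samples), axis=1) - 1.0)
    per_epoch = int(fs * epoch_s)
    n_epochs = len(vm) // per_epoch
    # Sum the movement signal within each epoch to get one count per epoch
    return vm[:n_epochs * per_epoch].reshape(n_epochs, per_epoch).sum(axis=1)

def intensity(counts, light=50.0, mvpa=200.0):
    # Hypothetical cut points, for illustration only
    return np.where(counts < light, "sedentary",
                    np.where(counts < mvpa, "light", "moderate-vigorous"))
```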
Thank you, John. I am now going to move on to introducing Laura Quilter. All right. So Laura Quilter is a copyright and information policy librarian and attorney at the UMass Amherst libraries.
She works to educate and support campus affiliates about copyright, privacy, and other information policy matters that may affect their teaching, research, or scholarship. I have worked with Laura for several years now in many different venues and projects in the libraries, and she is one of my cherished colleagues. So to start off, Laura, could you tell us a little bit more about what these images are that we're looking at in this slide? - Yeah, absolutely. And I guess I should say that nothing I say is legal advice. (laughs)
The first image is from a legal clinic that I worked in when I was in law school, the Samuelson Law Technology Clinic at Berkeley, and it says, "Making the world safe for, and sometimes from, technology." And that really captures what I think of as Public Interest Technology lawyering, which is that technologies are these tools, various tools of all sorts, that we use to make our lives better. With public interest law, we want to make sure that the technology benefits people and doesn't harm them, and that involves lots of trade-offs. And I also like it 'cause it has a robot. The second two images really speak to some points about privacy that I hope to make in the conversation. - Thank you. So I guess my next question for you, as an attorney and somebody who has focused so much on copyright law and policy surrounding data privacy, especially as we're talking about health data today: do you feel that US law is doing a good job protecting health data currently? - Yeah, such an interesting question, right? I mean, we're all familiar with HIPAA.
It's like the big dog in the room, right? And I think we all get so many notices relating to HIPAA all the time, and maybe that's why many of us are familiar with HIPAA. And so HIPAA does a very good job of giving us notices, right? Which makes us aware that there is a law and that somebody's collecting health information. To answer the question of whether HIPAA or US law in general is doing a good job, I mean, we could talk about the failings or some of the ways it's been good, but I think the bigger picture in my mind is that HIPAA is just one tiny patch in an overall landscape.
And so that's that picture on the right, Chromatopia, by an artist, Pablo Manga. And each one of those patches might be conceived of as one of the different statutes at a state or federal level on some particular subject, but the overall landscape doesn't afford you really robust privacy protection at all, right? And that kind of takes me to the middle image, right, which is a self-portrait patchwork quilt by Lorrie Cranor, a computer scientist at Carnegie Mellon. And she thinks a lot about privacy, among other issues. This was a pixelated photograph that she made of herself and then turned into a quilt, and I love this piece of work, but I think it's a really useful way to think about privacy being protected by technology, because pixelation is one of our ways of de-identifying images, right? But anybody can identify the person with machine learning and algorithms.
People can re-engineer all these de-identifications. And so that's another key point that I like to think about in these discussions, which is that technology as a solution for protecting privacy just throws us into an arms race. So we have a patchwork of law, we have a technology arms race, and then we have to put all those things together. That's what I think about.
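As a concrete illustration of the pixelation Laura describes, here is a minimal sketch using Python and the Pillow imaging library; the downscale-then-upscale trick is the standard naive approach, and the file names are placeholders. The arms-race point is that this transformation obscures information rather than removing it: models trained on pixelated faces can often still match identities.

```python
from PIL import Image

def pixelate(src, dst, block=16):
    """Naive de-identification: shrink the image, then blow it back
    up so each block becomes one flat color, as in the quilt."""
    img = Image.open(src)
    small = img.resize((max(1, img.width // block),
                        max(1, img.height // block)),
                       resample=Image.Resampling.BILINEAR)
    small.resize(img.size, resample=Image.Resampling.NEAREST).save(dst)

pixelate("portrait.jpg", "portrait_pixelated.jpg")  # hypothetical files
```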
- Oh, I love that, I love that. And it makes me think too, as you're talking about the patchwork and piecing together: a lot of what we'll be discussing is about personalized data that's put into different areas, for a consumer, different apps, different devices. And so there's less of a concern about there being a full profile on somebody, the sense that you're still protected, but then you think about stitching all of those quilt squares together and you have a very explicit profile of a person. And so I think that that'll be a really juicy part of the conversation.
Thank you. - Yeah. - All right, I'm going to hand the mic over to Fran to introduce our next panelist. - This is exciting, and let's get Liz on here. So let me tell you a little bit about Liz. Liz Evans is an Associate Professor of Community Health Education in the Department of Health Promotion and Policy. Her work spans opioid use, community engagement, ethics and storytelling, a really fascinating combination of things.
And it's a thrill to have her with us today. But as you have noticed, we asked each of the panelists to give us a slide and it could be any slide of their choice that they would like to talk to. So Liz has given us an incredibly intriguing slide. And so let me ask you, Liz, tell us about these two photos. - Yeah, thank you, and thank you so much for having me.
I'm so excited to join this discussion. I feel like I need to add a pixelated me, like what Laura was just sharing, to capture how I sometimes feel today. I chose these images.
Here's me when I was five, and here I am a few years ago when I joined UMass at a much older age. And it's just a reminder of how our health is shaped by where we live and other contexts, like our place in time, in history. And so now I'm very much interested in how context, especially interactions with institutions in our public sphere, can shape health above and beyond our biology or our sociodemographic characteristics.
So this is an image of me as I move along, I guess, in my life course, to remind folks that in public health, that is a very important concept that shapes what we do. - I see your work, and I kind of hear what you're saying as really talking about tech being a whole ecosystem. And I'm kind of interested in how you've thought about the ecosystem. So have your own experiences influenced your interest in tech and health? - Yes, I mean, there is me, of course. Personally, I too use an iPhone and use it to try to improve my own health behavior by tracking my steps, but then I wonder, where does that data go and how does it get used by others, maybe without my awareness? As for me in my research, I started out as a research assistant in 1990, where we collected data interviewing people who had just been incarcerated, using paper and pencil.
And now, in my own life, it's changed so drastically. I now use computers to analyze administrative big data. So this is data that's collected about folks as they interact with the criminal justice system or healthcare.
And data's collected about them as part of their digital footprint. And we can access that and analyze it to better understand what shapes their health over time. At the same time, there's been a real change in how we think about, I study addiction, we look less towards the criminal justice system itself as a solution for people who use and have a problem with substances, and instead are trying to use more health-oriented models.
So it's an exciting time, I guess, to think about how the technology is enabling different types of research, and our way of thinking about these health conditions is evolving as well. - I can't resist asking just one more really quick question, if that's okay. You have degrees in lit and public health, which I completely love. So say something about that. - Yeah, I guess I'm a little bit like John: when I was much younger, I soon realized, I don't think I'm very good at selling things.
I love to read books, I love literary fiction, so I studied comparative literature. I went on for a degree in public health, a PhD, but I still think about how storytelling is how we can understand the experiences and viewpoints of other people different from ourselves, and that can instill compassion and understanding in ways that sometimes this big data that I use for my research doesn't do a great job of helping me encompass. So I do think it's important to understand the strengths and limitations of these data that we use, but also to put front and center this responsibility to think about health equity as a guiding principle when we consider what type of research we should do, whether we want to do it, and how we go about using it responsibly. - I think your point is that communication is everything. - Thank you.
- Thank you so much. I'm really excited to see the discussion. Let's get Tauhidur up here and we'll introduce him and then we'll get going. Tauhidur, free up your camera.
Yeah, that very intelligent-looking person in the foreground there is Tauhidur, whom we hope to see in real time in a bit. He's an Assistant Professor in Computer and Information Sciences. He's focused on, oh, yay, there you are, on digital health and how we might rethink the core physical mechanisms of health technologies to better impact the way we diagnose disease and the way we track and manage our health. And for all of you who have Fitbits or follow your steps on your smartphone or use the machines at the gym, all of those things are made up of technology and devices and sensors. Tauhidur's work integrates sensors and data from our bodies and from our environment, and biological and behavioral measurements. So that's a lot of stuff to pull together to answer science questions.
So let me start by asking you what those science questions are. How do you use all that information to answer health questions? - Okay, thank you so much for the wonderful opportunity, and I'm really already inspired after hearing so many exciting ideas and projects that my fellow colleagues are doing here at UMass Amherst. Since a quick introduction has already been done, I'm an Assistant Professor in the College of Information and Computer Sciences at UMass Amherst, and broadly, in my research, I aim to harness health signals from large-scale data in different ways and applications, ranging from fighting substance use disorder, to detecting early psychopathologies, to developing epidemiological models for infectious diseases. And since you asked about the science piece, I think overall, that's where the equation kind of lies. I put it there on the slide.
Overall, the research philosophy and the science can be explained by this equation. To address the question head on, it generally starts from the community and thinking about the problems that the communities are facing. Some of these problems include the opioid epidemic, as I said, and vulnerable communities facing extreme temperatures. Then comes critical analysis of the problem itself, which leads to ideas and approaches with mobile sensors and various computing techniques and tools that we develop in the lab. The mobile computing tools are often paired with advanced artificial intelligence and machine learning. And lastly, the research often involves deployment and validation of such mobile tools in the community. And obviously, without effective community partnerships and learning from the target community, it becomes very difficult, so that is often the last and probably the most important piece of the puzzle.
And that's pretty much the science piece of my research. - So the amazing thing to me is, in looking at the kind of work you do, you've instrumented anything from hospitals in Amherst to, what, rickshaw drivers in Bangladesh? Is that true? - That's right, that's right, yeah. - Tell us a little bit about your community orientation, because these are very different populations, very different questions, or maybe the same questions, you might ask.
- Right, I mean, growing up in a developing country, in Dhaka, I completed my bachelor's degree there before moving to the States, where I did my master's and PhD, eventually coming here to join Amherst. I was always inclined to really build technologies that could address some of the problems that these vulnerable communities are really facing. And I wouldn't feel content just by innovating novel technologies themselves; I always have this inclination to see it through, right? See it through in terms of deploying some of this technology in the real world, and how computing can really solve some of the core social problems that we are facing today. So that's kind of the background. And some of the research that we're doing right now includes looking at participants or individuals with substance use disorder in Western Massachusetts; we're looking at the opioid epidemic in the Boston area, collecting a big dataset with wearable sensors from people who could potentially misuse opioids, so that's going on.
We're also looking at, as you said, trying to instrument rickshaws in Dhaka, in Bangladesh. Rickshaw pullers are really one of the most vulnerable communities, because the work is really excruciating. By looking at how they actually manage, how they work, and what their physiological and psychological variables are while they work in extreme temperatures, because the city is partially unplanned and a heat island effect exists there, we wanted to study the physiological, psychological, and even decision-making-related impacts of extreme heat. And just by looking at that, we could potentially build mobile technology that could help them better plan their lives and better adjust their lifestyles. We have also looked at other vulnerable communities, such as early preschool children here in Amherst.
We're trying to build models that leverage brain functioning and behaviors, which would allow us to screen for early psychopathologies. Emotion dysregulation can lead to psychopathologies, and there's a big gap that we see in psychological health medicine. - So much amazing work, and if you think about it, there are so many commonalities between your work and the other panelists'. We have culture, we have physiology, we have environmental influences.
Let's get you all together on the screen. And Sarah, I know has some questions, so everybody get your camera on. This is our choreography for today, I want you guys to know. And Sarah, I know, has been collecting a bunch of really interesting questions from the audience and we have Laura on here, or did we miss Laura? Aha, here she is, okay.
We've got the band together. So Sarah, I'm going to hand it to you for the first question for everybody. - Sure, sure, sure. And actually we had one pop into the Q&A during the panelists' introductions that I'm going to front load here. And this goes to a comment thread that we've talked about quite a bit in all of our work as a shared group. And so framing, I'm just going to read the question as it was submitted.
So we have seen distrust of public health with the polarization of pandemic camps. How can we rebuild that trust going forward, especially introducing personal tech devices? Many will see some of these helpful devices as an intrusion into their privacy. - Great question. - Yeah, it is. It's a really good question.
And one, I know, that impacts our daily lives. We read about it in the news, we read about it on social media. There are different opinions that foster distrust, there's different media that fosters distrust, and there's misinformation.
So thinking about it from the role of either researcher or consumer, I would invite anyone to just turn on your mic and jump in. - Yeah, I mean, the question is, are these helpful or intrusive? And my guess is that Laura might want to start us off, because what's the legal basis in order to even think about that kind of question? - Yeah, thank you. I love that question actually, because when I think about privacy, you've got the technology, and you can build in protections; you've got social norms, which we haven't talked about; and you've got law; and all of these things kind of interact.
And I think no one of them will be wholly effective without the others. The legal regulation of these things, the personal wearables and that sort of thing, is really nonexistent, except in isolated ways where it comes up. So how do we go about it? I think people's lack of trust is entirely merited, to be honest, right? I often will cover up my laptop camera with a little sticker, right? I'm not like a super paranoid person, but it's actually completely warranted to do that, I believe. So how do we begin creating trust? I think we have to start creating a culture of trustworthiness, which means that the norms have to lead.
I believe, right? We have to really, by having conversations like this, by having researchers like my colleagues here at UMass who are actually thinking about privacy as they do their research, they're building in protective norms. Eventually, those norms will get folded into law in more or less useful ways, but those protective norms are going to be the key piece of it, and we have to keep having these kinds of conversations and bringing people into them. And basically, that's my answer. It's not an easy answer at all, right? Changing social norms. - Well, just pushing back a little bit on that, Laura, don't you think that context matters? So for example, we do have HIPAA laws, and one question I think we got earlier was, how do we update those for the modern age? Which is really important, I think, especially when you have AI and you have autonomy and you have devices making decisions on their own. But one of the questions, I think, to push back on is, there are some contexts in which information is private and there are other contexts in which you would like people to know things about you, you know? If I'm lost hiking, I would really like somebody to locate me through my device, maybe not otherwise.
So how do we get that nuanced context into the policy and the legal system and standards in order to even make those kinds of decisions? - That's a really great point, and I certainly don't want to be accused of saying, "Oh, law has nothing to do with this," right? But I mean, I think to really build our trust, we're going to have to make it a more universal thing. Law is going to lead in some ways, though not United States law to date. (laughs) But yeah, we do need to be looking at these kinds of issues.
I think historically, the US was a leader in privacy law in the '70s, and we've really fallen way behind. But the good thing about that is that we've had 50 years of experiments at the state level, in the EU, in Canada, that we can look at: what works, what doesn't, what's not so effective, what ends up being more burdensome and thus adding to the problem? The HIPAA notices, to be honest, I think, are usually counterproductive. And when was the last time anybody read a HIPAA notice or a terms of use notice, right? Lawyers don't, right? That's a big secret.
We mostly don't, except when we're feeling nerdy. So I think we can look at these experiments in privacy law and really learn from them and start to make stabs at figuring out good regulations for these kinds of devices, right? And I could go on at length, because I'm a lawyer, I can talk, but I'm going to pause 'cause I don't want to stop anybody else from talking. - Yeah, I guess I maybe have something to add to this idea. I experienced the emergence of HIPAA as a researcher, and when it first came on the scene, it really made the research harder, because it imposed these regulatory requirements that people didn't understand very well and misinterpreted.
Now, I think there has been some evolution, so we know better how we can use HIPAA in a way to access data and patient records in responsible ways. But ultimately, my thinking around something like HIPAA is that it's a necessary thing, but we need to think of it as a living document. We have to keep revisiting it, updating it for our times. But also, we cannot rely only on HIPAA to solve all of these ethical issues. This is just one tool, and actually a pretty small, not especially influential one, like you just alluded to, Laura. Instead, we need other things in place. And one important one is what we're doing today: having these thoughtful discussions about the bigger ethical issues involved, and maybe giving the public the opportunity to engage and be a part of determining what the issues are, but also how we use the data in ways that we value and in ways that also yield benefits that outweigh any risks that come with them.
To me, that's the better, bigger issue, rather than privacy alone. - Okay. - And I think that's a fascinating response, and I want to just add one more point.
I feel like public education and some of the things that we, as researchers, could do, like open-sourcing our software systems when we develop some of these mobile devices, and maker culture, right? The whole idea of engaging and allowing people to experience how this tech is made, right? Ultimately, it's us, a subset of the public, who actually end up developing some of these tools, and I think that kind of practice could potentially help tackle misinformation, which became a big issue especially during the pandemic, as I've seen. - How can open source help us tackle misinformation? Say more about that, Tauhidur. - Yeah, I think part of the idea is that mistrust, ultimately, at its core, is a problem of lack of knowledge about a particular domain. If we don't know about sensors, for example, and mobile computing, how these are computed, how data are being collected, what the different kinds of encryption technologies out there are, what the vulnerabilities are, right? Not only the positive side of it, but also the vulnerabilities. If we have a clear idea about all this, and if we could disseminate knowledge and make the public aware of some of the strengths as well as the weaknesses, I think that would clear up a lot of mistrust potentially, and also allow people to have reasonable expectations of different technologies. And then the public could ultimately be the main forcing function when it comes to regulation, right? So that's, I think, how it could potentially work.
- So even more than misinformation, it's the trust of technology: really believing that the technology is working in some reasonable way, really believing the results and the information that it gives you, and sort of trust through familiarity and understanding what it means. Is that what you're getting at? - Right, right, right. And also maker culture, right? That promotes the idea that you could do it yourself, right? I mean, there are courses, for example, at UMass in CICS, I'm branding CICS for a second, but there are courses that allow you to learn about the smartphone sensors that are there and how to build with them quite easily, right? In a couple of hours, you can write an app that collects this data, and that will allow you to have a reasonable understanding of some of the vulnerabilities as well as strengths, which could partially inform the decision makers when it comes to regulating. - Yeah, go for it, Laura. - I just want to jump in on Tauhidur's comment, because I am an enormous fan of open source software in terms of increasing security and reliability.
So exactly for the reasons you state, that many eyes make all bugs shallow, right? However, it's not going to solve the problem of access to data, or of people who just share their data without thinking about it, or who, even if they get a notice that they're sharing their data, aren't able to understand the notice or the value of what they're sharing, or to get out from underneath the leverage of, oh, but I want my device to work, so if I want my device to work, I have to say yes, right? Open source can't solve those kinds of problems, right? That's where privacy as a public good comes into the space. And so that's why we also have to think about law, and why we also have to be, again, really trying to shift the norms so that we're all thinking about privacy, so that every researcher, every technologist who develops a new technology is not just thinking about bugs, they're thinking about protecting the data, right? That's one of the checklist things that they do. It's just baked in and- - Well, one of the things, if I could interject there: so yes, as an advocate for the maker culture and building as understanding and as a part of the experience, going back to some of the research, especially that John is engaged in, with some of these third-party devices, where you may not have a hand in constructing the device itself or a hand in the policy or governance of how the data is shared and how individual privacy is protected. I'd be interested to hear a little bit from John about how you talk with your students and your research participants about some of these inherent risks and how they can be managed, based on what is being used as a part of the research. - Yeah, in the research space with the research accelerometers, we don't typically talk about the general data security and confidentiality issues; we deal with all of that within the research and IRB protocols.
With the consumer devices, yeah, it is a little less clear, I guess, because of what Laura was just talking about, where it almost seems like, to protect the data, we have these commercial interests that have one particular interest, and they want to get access to the data, presumably to tell you something else, right? But then the researchers are just trying to do their thing with the data. And so we have these competing interests, it seems, right? And it almost seems like, as Laura was saying, law is going to have to lead the way here, because certainly those commercial interests are not going to do anything to reduce their market share. And so, yeah, I diverted a little bit, but I think, with the consumer devices, in the past we've set up the devices ourselves, so we're collecting the data and the participant doesn't really have access to that data. But certainly, the company is receiving that data on the backend as well; we haven't really dug too deep into that. And I must say none of our participants have actually brought it up as an issue, but certainly moving forward, I think it will become more of an issue, yeah. - Well, actually, I have kind of a follow-up question that was submitted before this session by one of the attendees.
I would be interested in your perspective, John, following up on this, on the issue of privacy and negotiating how you educate folks based on what it is that they're using and how they're participating. In your opinion, and then I would like to hear from others, what do you consider to be the biggest threat to personal data privacy as a part of participating in this type of research? - Yeah, I think with the commercial devices, the physical activity data itself might not be that onerous of a data breach. But a lot of the commercial devices also track location, similar to your phone, and I think that is probably the thing that really sticks in my mind.
I don't know, probably about 10 or 15 years ago, I had thought about doing some sort of intervention approach. It was like a point-of-decision prompt, but it was going to be through this cell phone technology and GPS technology. So we could see where a person was at lunchtime and maybe steer them towards a healthier option at lunch, or suggest going for a walk if they were in a certain environment or something. So I pitched that idea to three colleagues, and all of them said, "That's a little creepy." So there's just that sort of Big Brother is watching aspect. So I think the location thing is maybe the bigger concern, also the legal issues around that, but also the creepy factor.
- John, that reminds me of, and now it's an old story, of how Target used data analytics very successfully to try to figure out if people were pregnant or not, and then market baby formula and cribs and things like that to them. And there's a well-known case where they figured out that a teenage girl, or they thought a teenage girl, might be pregnant and tried to market to her, and her father was really upset about it, and it turned out she was pregnant. And what they learned from that, which is really interesting for the marketers in our audience here, is that the creepiness factor really turned people off, even though they could bring in new customers. And so what they did was they actually then sent you coupons for cribs and baby formula, but also for wine and barbecues and whatever else. They mixed it in, so you got what you wanted, and it lessened the creepiness factor. But it's that kind of information you don't expect them to know, just 'cause you gave them your credit card and they know what you bought, for example.
So it's those kinds of issues that are so subtle. I had a question that I kind of wanted to pick Liz's brain on, and others', on the opioid crisis, because anybody watching the news has seen these kinds of big court cases come and go, and really has a sense of how all of the players have really strong issues around the opioid crisis. You think about physicians, you think about pharmacists, you think about patients, whether they're chronic pain people or addicts, you think about the police who are involved in this, and tech has been a giant influence in sort of that momentum, in creating an environment where there are such subtle and complex relationships between all of those groups.
So I would love to get Liz's thoughts on it, and anybody else's. - Yeah, well, thank you for that. It's a great observation, and it's really true how the opioid epidemic created a context that accelerated big data and really made it take off. So all of these institutions collect data that now, in Massachusetts, is put into a public health data warehouse. So 98% of the population in Massachusetts is included in the state warehouse. And it has events like, have you ever been arrested or incarcerated? Have you used public assistance? Or how about your use of healthcare? What are your diagnoses? What kind of treatment do you receive? What are your outcomes? The whole point of all this was to gather this data up to hopefully better understand who's at risk of an overdose, and how we can better create policies that intervene and prevent that from happening. But now that it has been created, it's available to study other health conditions and other populations.
So it does raise all kinds of other questions and interesting research opportunities. For example, the data can be used not just to look at the patients themselves, those who have problems with opioid use disorders, but you could use it to identify inappropriate prescribing practices. How is it that maybe there are some physicians in certain places that it looks like are being a little loose and free with how they prescribe opioids? And so it's a way to shine a light on different causes of the opioid epidemic. It's not just about someone's personal behavior or risk factors.
It is about the healthcare system that they interact with and other institutions that can shape the decisions that they make. I put "decisions" in quotes because often they're kind of set up by the context to fall into a certain way of behaving. So yeah, I'm very interested also in the ethical issues that the public health data warehouse brings up.
When you just asked about some of the privacy concerns that people have, there's a lot around this data, 'cause we're all in that data set and we all wonder about it. Do you know what it's used for? I'm not always totally sure, and I study this for a living. But for people with opioid use disorders, that's a stigmatized, illegal behavior. So they're especially concerned that this data could be used to deny them health insurance, or jeopardize their employment, or threaten their parental rights, or it could lead to unjust policing or criminal justice consequences, ones that are not warranted. There could be racial profiling. So there's a lot of ways in which the data could be used inappropriately, and maybe contrary to a lot of our values, and it speaks to the need to have these safeguards in place, above and beyond just the individual institutions who contribute that data, and maybe a broader way of doing oversight and involving others in the uses of that data.
- What do you think the fixes would be, Liz? It's like, the data's out there, and it's helpful in some contexts. - Yeah. - And you don't want it used in other contexts. So how do you make sure that it's used in the right context? - Right, yeah. And another potential outcome is that folks hear of this and then choose to not go get healthcare that they need because they're afraid. So that's another potential unintended consequence, a bad one that we don't want to have happen. So when we think about what to do about it, I think having community advisory boards, so creating a forum where people who are interested in this and also affected by it have a voice.
They can come together and engage in the decision-making process. That's one way. We should also set some off-limits uses, like we all come together and agree:
these data cannot be used for X, Y, and Z, because that's not their intent and that's not what we value. Another important limitation of data like this, I guess, is that there can be these blind spots. So the data doesn't encompass everything about everyone.
There are just some serious blank holes in it. For example, opioid use disorder itself is not particularly well-marked in there, and neither, necessarily, is race and ethnicity. So think about how that really hampers our ability to look at disparities in who accesses care and what their outcomes are by cultural or ethnic identity, right? So we can do a lot more, I guess, to educate the public about the capabilities of data like this, and also to involve the public in what it is that we're doing. - That's amazing. Anybody else want to jump in on that? I think, Tauhidur, don't you do something around data and opioid communities as well? - Yeah, I mean, I mainly focus on wearable sensor development, and privacy is a big, big factor that informs the choice of the sensor itself, how we compute, and what features we extract out of the low-level signals, so this is really relevant.
One thing that I wanted to emphasize, which totally agrees with Liz's point and what John mentioned earlier, is that the harm of releasing a particular information channel in a dataset is not necessarily dictated by what that channel reveals about the user by itself; it also depends on these big companies, and how they could potentially put that channel of information together with other information sources that they passively collect. Just to give a concrete example: activity data. It may come across as innocuous, right? Not necessarily a super harmful information channel. But Google, or companies like Facebook, may be collecting our clicks, information about our searches, and then they could put it together and identify that, okay, I'm more likely to have a certain type of disease, or maybe, I don't know, my political inclination is such-and-such, right? So that becomes more of a challenge, when these big companies can put information together and mine that information, right? Yeah. - I'm very glad that Tauhidur brought that up, and is following up on things I was thinking about, and Liz and John were speaking to as well, which is that, with traditional threat modeling, you identify your asset, right? And you could do that with the public health database that Liz is mentioning, right? But the problem that we face, and this is what I think is the big challenge, is the problem that Tauhidur is describing, which is, oftentimes there's no one single asset, right? It is the aggregation of data from multiple assets, data that we may not even be aware that we are creating, right? Because somebody is very clever and they're like, "Oh my God, I can buy this data set from Google and I can pair it with this other data set and I can do something else. And I'm super smart and I've created an algorithm that now develops an entirely new data product that will be really valuable and interesting," right? So we haven't even thought about how to threat model that kind of thing, really. I mean, at least in terms of creating laws about that.
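To ground the aggregation threat the panelists describe, here is a minimal sketch of a classic linkage attack in Python with pandas, using entirely made-up data and column names: joining two datasets on shared quasi-identifiers re-identifies records that each dataset, taken alone, appears to anonymize.

```python
import pandas as pd

# Hypothetical "de-identified" wearable data: names stripped, but
# quasi-identifiers (ZIP code, birth year, sex) left in place.
activity = pd.DataFrame({
    "zip": ["01003", "01002"], "birth_year": [1971, 1995],
    "sex": ["F", "M"], "avg_daily_steps": [3200, 11800],
})

# Hypothetical public dataset (say, a voter roll) with the same fields.
public = pd.DataFrame({
    "name": ["A. Smith", "B. Jones"], "zip": ["01003", "01002"],
    "birth_year": [1971, 1995], "sex": ["F", "M"],
})

# A simple join re-attaches identities to the "anonymous" health data.
linked = activity.merge(public, on=["zip", "birth_year", "sex"])
print(linked[["name", "avg_daily_steps"]])
```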
I think we're all pretty familiar with the credit report and the credit score that we get from it, right? That didn't exist that long ago, a few decades ago. And so that was a novel creation that created a new piece of information that has become inordinately valuable and inordinately influential in people's lives, right? Now that we are out there engaging in numerous public health transactions, we have basically two laws, HIPAA and the Genetic Information Nondiscrimination Act, right? That's entirely insufficient to deal with the credit report of our health profiles that are going to be created. So traditional threat modeling is really hard to do with the threat that we're facing, and I think that's the biggest threat. It's not the government or the corporate sector or any one of those. It's the fact that everything is data, and all that data can be stitched together, honestly, by 15-year-old kids who download software and have great ideas with it.
That democratization is fabulous for a lot of things, but we've got to start thinking about what it means for our data. Even though I just said technology is not the solution, I do fantasize about types of encryption that would make interoperability difficult, right? Using quantum computing to create uniquely encrypted data sets for everything, so that it would just be really difficult to interface them. - I'm really with you on everything you just said, and I think you just brought up a really good point, because there are lots of different ways of getting at these problems.
There are legal ways, and policy ways, and community ways, and those are really important ways of getting at it, but there are also design ways. John and Tauhidur design devices and systems that work on those devices, and they can design in ways to protect the information or ways to exploit the information. The design makes a big difference. Use makes a big difference. And so there's this whole ecosystem that has to do with the people who make the devices, the people who use the devices, the people who regulate the devices, the rewards you get for using them, the punishments you get if you use them incorrectly. All of those things are really important.
And Laura, I think you bring up such really important points around that, points that I think for tech and health are particularly important, because when you have an environment where results can be catastrophic, and this is one of them, then you have to pay special attention to things like that. Sarah, I think you were going to say something, I'm sorry. - No, that's fine. No, that's totally fine. There are so many different questions that I want to ask, weaving in all these great topics that are coming up in the Q&A as well. One of the things, I mean, still talking about customization and especially what we do with big data and how that's governed: there are a couple of questions that have come in, and it's also a deep personal interest of mine, on the conversation about AI and the role that it plays in public health and public health practice, and in particular for improving disease surveillance and using larger data sets to look at different trends and themes for the purpose of improving interventions.
We're still looking at, what are the benefits of doing this? AI can solve a lot of problems, automation can solve a lot of problems, machine learning can, but there are those issues. And one of the questions that came in that I think gets at the heart of this really well is, how do we ensure that the development of some of this AI, on the one hand, there's the policy and governance regarding access to the larger data sets, but then in the dev work on the AI, what is the role that any of us have in the conversations about how these are developed, and are there philosophers, are there ethicists involved? Are there? I see the smiles, yeah, because, not typically. And what is the role of public health researchers in the development, in the dev work, of AI that is going to be using some of these larger data sets to provide recommendations on interventions? - Yeah, so I'll just jump in, but yeah. So I think this idea of using our wearable devices and other data sources for the purposes of precision medicine and targeting, whether it be behavioral intervention or pharmacologic intervention, I think that has tremendous potential to really move us forward with improving health. But I think, going back to what Tauhidur said earlier about these different data streams coming together in the back end, and what Laura was talking about, pulling these different pieces together and then using them not for the benefit of society.
So, yeah, that's something that I don't think I've wrestled with too much yet, but I think it's coming in the near future for me, and probably for most of us. So, yeah, definitely interesting. - Yeah, and I know that the CDC, I was looking at this a couple, I don't know, this was a couple years ago at this point, and of course, I'll drop a citation in chat, I already have another one in the queue on AI and public health, but the CDC has been querying about this too. They have been conducting systematic reviews on where the topic of AI appears in public health research, using PubMed and other sources, because they're trying to get a grip on, where is this going? What is the future of this? And so this is where the conversations are happening, live, right here with the researchers.
I would be really interested to hear your thoughts. It looks like, Tauhidur, you might be ready to jump in. - Yeah, I mean, this is fascinating and highly relevant to some of the things that I'm working on.
So i