History, Boundaries, Best Practices, and Team-Based Workable Paths Forward



Um, so I guess we'll get started. I know other people will watch this later; I signed a release for, like, everything. Okay. This is a mouthful. I have a whole bunch of slides. We have fewer people than expected, and if they weren't recording, we could go sit outside around a table, because that's even more fun, so maybe we can do that afterwards.

As Gabriel mentioned, I've been doing technology research review, as a reviewer and on IRBs, for a long time. I've built technology, I've led teams to build tech, and I've seen where things have gone wrong. I've also seen how these cultures and communities need to do a better job of getting to know one another so we can build better stuff and make it a lot better for everybody. So I have a lot of stickers of nonsense about stuff I've done.

Disclosures. I think it's really important to disclose the relationships that you have, so I have a number of relationships I wanted to make sure I disclosed, and the reason for disclosing them is that I'm going to use them as conversations to make the points I'm making today. Otherwise I wouldn't have anything about them in this talk, but then I wouldn't be able to go into any level of detail about anything. Some of the companies I work with: my company, Venture Catalyst; another company I'm involved with, Adaptive Health, a health technology company; a number of others I have to disclose because I have some level of equity or something in them, or bought stock because it was a good idea; a technology company, LifeLink; and then of course the big four: Microsoft, Alphabet, Amazon, and Apple. You know, you're the only M among a bunch of A's; it's fun. And then the two academic institutions I work with primarily: Memorial Sloan Kettering in New York City, and Baruch College, which is part of the CUNY system in New York City.

Okay. The most important slide of the entire talk is right here, and I'm going to prove this to you, hopefully, along the way. I'm one of those people who wants to put the most important thing up front, because if you fall asleep, or you get called away to something else, you may only watch the first five minutes of this. This. A thousand times this. This is the problem I identified as something I was extremely worried about, and it's why I've focused most of my IRB-related work, research review, and ethics work on this kind of issue: going upstream and trying to help technologists, investors, developers, regulators, researchers, and others do this better, so that we don't get this wrong.

Okay, a little outline of what we're going to do. We're going to talk a little bit about how we got here, which is a really big mess that none of us had anything to do with but we sort of inherited the outcomes from; a little bit about what oversight of technology research looks like; and then we're going to talk about research ethics using the example of digital health, just because digital health is the area I've worked in the most and it's a good driving problem. It's probably the most regulated and most difficult area, so you can do a lot of simplification and reduction from there to deal with the other issues; it's the best one to learn from because it hits all those edge cases.
Then there's a whole bunch of stuff about caution, and where things can go wrong. This, I think, is helpful, because you don't know unless you've seen it, and if we start sharing what we've seen go wrong, maybe we can make sure it doesn't go wrong again. And the last part is really pulling it all together and offering some recommendations about how you can design technology research in a more team-based way, and get things done in a way that makes your research better and also makes it easier to review and approve, which meets multiple goals at the same time.

Okay. How we got here. Very simply put, most of the history of our regulations around research is reactive: reactive to what we would now call misconduct, or abuse of people, and to situations where individuals and participants were vulnerable. It's a really nasty history. Most people get really uncomfortable with it, but we need to get uncomfortable with it to understand how we got to the position we're in now, with the regulations we have. We're not changing those regulations; we are subject to them. That's the way it is, and it's good to know that it came to us for a reason. It doesn't mean we're going to repeat this, or that we're bad people who could have been horrible and done it this way. No one is assuming that. But it shows why we have the structure that we have.

And it starts with the Nazis. It took less than ten minutes into this talk to say the word Nazi, and I actually have a reason for it instead of just calling someone one, because these were actual Nazis. The German experience happened as sort of an outcropping of the eugenics movement. They did really nasty things to people. They did stuff like take people to a high altitude to see how high they could go without oxygen, and measured different things about their blood chemistry until they died. They did the same with drowning, and the same with freezing. It was terrible. After the end of the war, there was a series of trials in Nuremberg, Germany. There was actually a large number of them, but people within research always think of the Nuremberg Code, which came from only one of those trials: the one specifically focused on the Nazi doctors. Oddly enough, virtually all of them were put to death save one or two, and one of them actually continued to work at a drug company in Germany for twenty years afterwards, which is really strange; no one really knows why this guy was spared his life. So it started with that, and we got some of the things we now call our requirements from the outcome of those trials in Nuremberg after the war.

The next set of terrible things was the revelation about the Tuskegee syphilis study, which began in the early 20th century. Then there was a groundbreaking paper, actually a perspective paper, by Henry Beecher that described 22 different types of unethical experiments and why they were unethical; they were archetypes. Then we had Stanley Milgram's experiments. This was pretty nasty stuff, where you were recruited into research and led to believe that you were torturing someone, and you were kind of being rewarded for it, even though the person on the other side was not actually being tortured. That was pretty awful, and of course it was published, and he got awards for it. And then we move into the 60s and 70s, when a number of terrible things happened in New York, where I live. They did things like purposely inject children with Down syndrome in a group home to make sure that they got hepatitis. We had the Jewish Chronic Disease Hospital case, where researchers injected patients with live cancer cells to see whether they would develop cancer. There were a number of issues, and most shockingly, the researcher behind that work
won a Lasker Prize. The Lasker Prize is kind of what they call the "baby Nobel": many people in the biomedical sciences who win a Lasker Prize are, usually within a few years, being considered for a Nobel Prize. Sure enough, he wound up essentially getting pushed out of research for the terrible things that he did. This is not limited just to biomedical research.

This is not just the kind of stuff that happens with drugs in doctors' offices. The tearoom trade study was social and behavioral research, where an ethnographer was going around in San Francisco following gay men into public restrooms and acting as a "watch queen," then writing down their license plates, following them home, and confronting them at the front door: "So, about you having sex with men in these restrooms..." One of the problems, besides the clear invasion of privacy, was that many of those men were married with families, and so this kind of invasion was really rather egregious. So it's not just limited to medical science; this happens in social and behavioral research as well.

Then we get even more recent. It happened at Stanford: the Stanford Prison Experiment. Kind of disgustingly, Zimbardo is really famous and made a ton of money writing the book about the thing he did that went terribly wrong, so he even capitalized on it afterwards. In this one, students were split up into guards and prisoners, and within a very short period of time the guards ended up actually abusing the prisoners, their own classmates, and the experiment had to stop.

Okay. All these things involved either people or situations in which the individual and the participant was vulnerable. What did it lead to? This is the brief regulatory history of how we got to all the things we have today, that we are subject to. We had the Nuremberg Code, which came from one of those trials after the war, the Doctors' Trial. We have the Declaration of Helsinki in the 60s. Then we get the National Research Act; this is actually what formally created the thing we now call the IRB, although the FDA did something in '66 that looked like it created the precursor to it. And then we have a number of other things. We are now under what we typically call the Common Rule. The Common Rule is called the Common Rule because a whole bunch of federal agencies decided they were going to abide by one rule in common, and, boringly, they decided to call it the Common Rule. There is currently a revision of the Common Rule that was supposed to take effect in January; it got delayed twice. They say it's going to take effect in January 2019. I'll believe it when I see it.

But we have this history of things that occurred. The important one that I think many researchers ought to remember is the threefold framework we got from the Belmont Report, which is this idea of autonomy, or respect for persons; justice; and risk-benefit, or beneficence. It was interesting, because they had sixty-some-odd different ethical norms and theories and things that they were thinking were important, and they kind of said, "Gee, we don't think anybody will remember all of these or be able to apply them," and whittled it down to these three. If you ask me, they're probably missing a good one, but this is better than nothing.

So what are we missing? I believe they're missing something that they wouldn't have had the presence of mind in '79 to put in. There's a researcher, I believe she's now retired, named Carol Gilligan, who was at Brandeis, who is nominally credited with the sociological study of the ethics of care and interconnectedness, and I think that's probably the fourth one we need,
in addition to these three. But I don't think she was quite famous enough yet when the report came out for them to consider it. That's probably number four.

So those were all these terrible atrocities. Now here's the earliest one that I found, which is not typically talked about that much, from the end of the 19th century. We were in the Spanish-American War in the United States, and there was, as I believe the Washington Post nicely put it, this "crack-brained" idea that Black people had an increased level of immunity to yellow fever. So they created what they called "immune regiments," made up of Black enlisted individuals, and they sent them into places with high amounts of yellow fever in Cuba during the Spanish-American War.

What happened? Well, it's racist, pernicious, and disgusting to believe that somebody has more immunity based on the color of their skin. Many of them died. It was an atrocity, and it was experimentation. This was us doing research without consent on our own soldiers and sending them places; it was pretty terrible. By the way, the reason I discovered this was that I was in South Carolina for a wedding maybe twelve years ago, and a plaque commemorating that immune regiment is at the South Carolina State House. I saw it, took a picture of it, and said, "I wonder what this is all about." Turns out that's what they were doing. So we've had these kinds of problems for a very long time. These are the most serious, but they led to the reactionary policies that form the framework we're operating under. And while nobody ever expects anyone to do something just terrible, it gives a sense of why the environment for how we have to oversee research looks the way it does. It was not created holistically; it was created in a reactionary way.

So, as I just said, this is all about vulnerability: people who are vulnerable, and situations that make people vulnerable. What does vulnerability mean? You don't have to read all of these now; you can come back to this later. The point is that they're all different. These are a number of different international guidelines and other instruments, many of which you are party to and have to abide by, especially as a global company. You operate in a variety of environments, you're going to be everywhere on the planet, and you may be subject to these kinds of definitions. There's not total harmony between them; they're different variations on saying the same thing: we should care about this, we should do a good job with it, we should mind it.

What are the effects of vulnerability when it comes to research? These are the things we're actually worried about: physical control, coercion, undue influence, and manipulation. These are not things that people ever want to be accused of, and they're not things that any researcher in their right mind would ever design something to do. It happens primarily by accident, or through not thinking things through in a very broad way. That's why we have to have more conversations with other people, and have review boards that will help us think through things a little more holistically. You want to have research that doesn't do these things, or minimizes them to such a level that they're not a problem.

Okay. In our regulations, which you are subject to here in the United States (the Common Rule, etc.), when you do research on humans, we have definitions of who's vulnerable, and they're written to say, okay, children are vulnerable: people who are not at the age of majority. If I asked everybody what you thought the age of majority was, how many of you would say 21? Anybody say 21? Nobody. 18? A couple of hands for 18. Anything other than 18? The answer is: it depends. Because if you're seeking medical care, and you're a woman, and it's related to something like contraception or a women's health issue, in many states you can actually get that care and consent as an adult prior to 18. You can drive a car at 16.
You can smoke cigarettes at 18. You can drink alcohol in the United States at 21. The age of majority for a given thing is not completely fixed; it may be fluid, and laws and jurisdictions may change it. You're allowed to get married at age 16, and even 14, in some states in the United States, in case any of you want to become a child bride, right? Go at 14 in Rhode Island, I believe. So these things are not fixed.

We have regulations that talk about research on children; we have regulations that talk about research on pregnant women, fetuses, and neonates; we have regulations that talk about research on prisoners. Those are the ones that are explicitly written in there. But it's not just the pathology or characteristics of a human being that matter. We need to go a step further, because you can imagine there are situations in which people might be vulnerable, and that's what we're typically talking about when we say going beyond the regulatory meaning to something broader and more useful. The President's Commission actually said as much: this really matters, and it's really situations in addition to characteristics. So what could that mean? Instead of having only these group-based definitions, you might want to consider situations as well.

You could have multiple vulnerabilities at the same time. You could have pregnant minors. You could be studying homeless people who are mentally ill. It is not written into the regulations that somebody with a mental illness is automatically vulnerable, but you could imagine cognitive impairment creating a vulnerability where someone is unable to protect their own interests, and that's true. You could also imagine that someone who is homeless, or socioeconomically disadvantaged compared to the society and context in which they're living, might not be able to protect their own interests. So these thoughts around vulnerability are really important for us as we design research, and as we design technology that's going to be used by everybody. Because let's face it: as a global company, you're creating things that are going to be used by everybody. You can't just say it's going to be these people over here, in this one neighborhood, in rural India. No. It's going to be everywhere. So we need to be more broad and more circumspect about this.

Now, how do you figure out what kind of protections you want to build in, or how to wrap your head around this? You probably want to know whether the research you're doing is actually targeting a group, or a situation, in which people might be vulnerable. In those cases you really need to understand it very well, and you need to design mitigating factors or other things to reduce that risk. But it also might not be targeted, and in the cases where it's not targeted, you want to be generally good about your design, but you're not necessarily doing research on vulnerable people if you're not targeting them. So you should know whether or not you're really focusing on such a group. The fact that someone may be pregnant has nothing to do with whether or not they're capable of providing consent to show you how easy it is to click things on a touchscreen. It's totally irrelevant, so there's no reason to mandate additional protections for a pregnant woman who's doing that. In fact, pregnancy is not by itself a vulnerability. Why would it be? Half the population are women, and women get pregnant, because that's how you reproduce and have more people on this planet. It is not automatically a pathologizing thing that makes them vulnerable. They are no less capable of making informed choices and protecting their own interests just because they are carrying a fetus and are going to have a child. So we need to think about whether it really matters in some of these circumstances, and this targeting idea helps answer that question.

Okay. So that's a little bit of the backstory of how we got to where we are, and why vulnerability really matters. It's really good to use because it often teases out edge cases, and for people who develop technology, edge cases are your friends, because if you can manage the edge cases, then usually what you're building will not break, or will break a lot less frequently. So we really can use this kind of framework in a way I think is understandable to people who develop technology and do research on it.

So how do we do this oversight? Okay. We have IRBs; we know what they do, we know what they are: they review research on humans. One of the things that I've been teaching people about research oversight and technology is to think about bucketing things.
Are you doing research that is simply something people are unfamiliar with? This is the most common case. If you're reviewing research, you may not be familiar with what that particular group or individual is doing. Cool. We're all really smart people; we can learn. People on IRBs are smart; they're very capable of learning, and learning quickly. So you're merely describing something to somebody who's unfamiliar with it. That's a teaching moment. That's easy.

The second bucket is stuff that's better technology for the same old thing. We have lots of structures that allow us to review quite readily most types of research in this world, in fact I think just about all of them, and some of it is just new ways of doing old stuff. When I was a little kid, our standardized tests were a Scantron and a number-two pencil. Some people after that started to do them on desktop computers, after that maybe laptops, then maybe even touchscreens. I'm waiting for the point at which we have a voice-enabled quiz that people take, right? But they're all still just answering questions. It's still essentially survey methodology. We've been doing that for thousands of years, ever since we figured out how to write. So we're using better and newer technology to do the same old thing, and if that's the case, we already have structures for it. Most of what I think people are doing in technology research can fall under this "better way to do the same old stuff." Wonderful: that means we know how to handle it.

The last bucket, and from somebody who's been an IRB chair at a hospital and reviewed research for a long time, this is the one that's really fun (I mean, they're all fun, but this one gets really interesting), is when you do something that's actually novel: an emergent property of a new system or platform, a useful and interesting combination of different pieces of tech, or new data types that you can gather that nobody could ever gather before. That stuff is novel and interesting, and you usually have to go into a little more detail, and there's a little more learning, to figure out how to apply the rules to it and make sure it's covered.

So, in terms of ease or difficulty of review, and therefore how much care you want to take in writing protocols to explain what you're doing: the top one's easiest; you're teaching somebody about something they just didn't know about. Great. The second one is a little harder, because you probably just have to make the correlations really obvious for them: we're doing something that we've done for hundreds of years, but in a better, more efficient, or technologically supported way. And at the gold end, we're doing something that's really novel. Not novel in the sense that you just want to tell people you're doing something cool, but actually novel: patentable kind of novel, that level of novelty.

Okay. So how do we get through this problem? I've always thought of this as a science and technology policy issue. We accept the premise that technology is in continuous development; the stuff is going to continue to grow and go forward. So we can't look at it in a stagnant way, especially as reviewers. We need to take our current regulations, overlay them, and try to understand how we can do things within the boundaries we already have, so it becomes efficient and easy for the researcher, as well as easy for the reviewer, and ensures that the participants are being protected.

And then the last part is trying to find flexibility. This is the hard part. You really have to know the rules pretty well, and this is what you'd be relying on your IRB and others for: finding the flexibilities so that you can advance things without them standing in your way, but also so that they're able to do their job, protect the participants, and provide you with structures that help your research both be better and flourish. So you can tell there's an interplay here that I'm arcing towards.

Okay. At a very basic level, you all know this: you're going to go create a digital solution, you've probably got a team, and you've got something you want to build. I come from healthcare, so we're going to talk about this in the context of digital health and healthcare. This is an interdisciplinary problem. People from Silicon Valley, and people who create technology, have a different way of going about their work than the people who have classically done research on humans for time immemorial: medical investigators, social and behavioral scientists. These are people who have an established way of doing this, and then there are the people who review it. We haven't had as much technology research go through IRBs historically as we have in the last ten years, and it's going to continue to rise, so everybody kind of needs to meet in the middle.

This isn't a new problem. Interdisciplinarity is difficult. People have written a lot about how to work within it, and all of that is useful. You have to court one another, and you kind of have to get to know one another. You know, put your swords down, right?

So what are some of the things you can do? Try to understand each other's cultures and the way that you operate. I ran a group that was doing research on a six-sided virtual reality facility, creating software to interact with and investigate protein structures. I'm a biochemist, so that was interesting to me. I was working with a virtual reality programmer, a mechanical engineer, a UX designer, a visual artist, and a few others. We had a very diverse team of people who all came from different backgrounds and practices, and probably the most important thing I made an effort to do was to learn what I would call the hundred terms of their field. Try to understand a hundred important terms of the work of your colleagues, so that you can respect and honor their contribution to the work and try to keep it interesting for them, and show that effort; and then encourage them to learn about your field, so that you can all be working together, hopefully with some unified terminology and without talking across one another.

You also have to recognize that there's a lot you need to manage in those processes. The second part is that we need to actually identify and fill holes in some of these teams, and this is hard to do. I think at Microsoft you might have a better chance at this than other people, because you seem to be more fluid in the way you create and collaborate in your teams. The more fluid you are in that, and the more willing people are to jump in and help, the better this usually becomes, because when you identify a hole, if there's no ego about it, you can say, "Oh, we're missing something; let's find someone who's good at that to work with us, and let's share the load." That is something that sometimes even IRBs will look at and say: you don't have somebody to think about this issue, and you probably should. IRBs can help identify that, though it's not their primary role; it's just that when you're reviewing research, you often will see it.

Okay. One of the things that technological development has done that I think is really marvelous, and we're going to see even more of it, is change the way we design research. Nowhere is this more interesting to me personally, because it affects my work and I'm seeing a lot of it, than in healthcare technology. Oftentimes the platforms themselves are really novel ways of conducting research, and have novel designs, because we've changed the environment in which we're conducting the research, and therefore some of the limitations we had on design may either go away or change. So you can do a lot more, and we're seeing a lot more interesting research designs. I wanted to go over one example. Every specialty is trying to integrate technology, within human medicine and, to a lesser extent, in veterinary medicine. We're seeing a lot of this happen, and it's got a few different categories, but it's essentially through the whole pipeline, from research to care.

One of the studies that I am currently an investigator on, and running, is doing some stuff like this, and it's a good example for talking about the different types of design. It is a virtual trial recruiting 4,000 people: a thousand each in New York, Boston, Philadelphia, and Los Angeles. They go onto a website, register for a research study, and consent through a conversational chatbot. Once they've been consented into the study, demographic information is collected, and a number of questions are asked, including who they would want genetic testing results returned by: their primary care provider (they can tell us who that is), or a medical geneticist. We're recruiting people of Ashkenazi Jewish ancestry; they are at a forty times higher risk of having mutations that cause an increased risk of breast cancer, prostate cancer, colorectal cancer, and fallopian and ovarian cancers.

These are not the kinds of things you ever want to get. So we're doing this population screening; it's technology supported, it's a virtual trial, and it's actually really quite fun and interesting. What I want to show you is the actual chatbot, because this is the kind of stuff that we're starting to see in healthcare technology that's really interesting. You can go onto the website to sign up for the study. We tried to use a really straightforward design: you know, big button, "Register Now." What it does is pop you into a chatbot, and this chatbot will walk you through the consent process. It's interactive, it's got videos, it goes through modules where it teaches people the basics of genetics and what the potential results of this type of genetic testing could be. So it's very different from direct-to-consumer genetic testing; this is actually done through cancer centers, with the support of genetic counselors, and with education and training integrated into it. This is not your typical kind of consent. It walks you through these things, it's very interactive, it requires you to actually do things, and it will even show you videos and ask you questions to confirm your understanding of the information you're being given. That is something we have historically never done, but now we have tools that allow us to attempt it, so we can start to do a lot better with the work that we're doing. We could go through the whole thing if you wanted; it goes on for a pretty long while; there are videos and animations and you name it. And of course I lost my place.

Okay, so as you can see, we've got an entire platform that is a study design, and there are a number of very interesting specific aims for the research study whose online system I just showed you. We're trying to see whether primary care providers feel comfortable returning genetic testing results to their patients. So we ask them; we give them the choice. We say: your patient has enrolled in this research study to get genetic testing done; do you feel comfortable returning the results to your patient? If you do, please indicate yes, and we're happy to support you with information, training, and live support from a genetic counselor or medical geneticist who can talk to you and help you learn about and feel comfortable returning these results to your patient. And if they say no, then we do it. We've used the system to make this a lot easier for us, and we track all of it. Certain types of results can be returned directly to the person. The whole thing is fully integrated with a place called Quest, a blood lab company, so all the participant does is go through this consent process online.
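To make that return-of-results routing concrete, here is a minimal sketch of the decision being described; the types and field names are hypothetical illustrations, not the study platform's actual code:

```python
# Hypothetical sketch of the return-of-results routing described above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Participant:
    name: str
    pcp_contact: Optional[str]          # primary care provider named at enrollment
    pcp_accepts: Optional[bool] = None  # None until the PCP responds

def route_results(p: Participant) -> str:
    """Decide who returns a participant's genetic testing results."""
    if p.pcp_contact and p.pcp_accepts:
        # The PCP said yes, so the study supports them with training and
        # live access to a genetic counselor or medical geneticist.
        return f"PCP ({p.pcp_contact}) returns results, with study support"
    # The PCP declined, never answered, or the participant chose otherwise:
    # the study's medical geneticists return the results.
    return "study geneticist returns results"
```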

Then they can walk in, and the system already has the order; they get the blood drawn and they walk out. They've never seen an investigator. They've gotten training, we've confirmed that they know what they're doing, they've given consent, and they've completed the research activity all on their own.

Some might think that recruitment would be pretty slow for this, that people wouldn't be willing to do it. We recruited almost 2,000 people in four months, which was really insane. It was really fast, it was really quick, it seemed to work, and it had good design. So we're noticing that, when done properly, this works, and I think we're just at the tip of the iceberg on this. I think it's a very good example. Is it perfect? Probably not, because it's the first time we've done something like this, so I'm sure it could be better. There's a lot of opportunity for us in this milieu to do a better job.

Okay. What about all the protections that we need for this kind of work? Unfortunately, it's not one-size-fits-all. Everybody always wants a quick answer, but anybody who's been in science long enough knows that when you ask a good question, the answer is always "it depends," and that's still the answer here: it depends. You really need to consult your IRB. Like, really. They're going to be your friend. They're going to frustrate you sometimes, but your work is going to be way better when you get in early and often and have the conversation. You'll have better-designed research, and ultimately better-designed products and services, when you have people who help you navigate that. And the protection requirements may vary depending on a few different things. So that's a framework you just need to have.

I want to go through a few of the standard things that seem to come up much more frequently in technology research. First, privacy protections. How many of you have taken some sort of training on privacy? Yes, the answer is everybody; otherwise I'm not sure they would ever let you in this building, right? So of course you've gone through all kinds of training on privacy. We have real definitions of privacy when it comes to protecting human subjects. Privacy is about people. I always tell people this: privacy is about people. It's about their private space, their expectations, whether you're welcome, whether you're trusted, all of these things. So you need to have an understanding of the impact of the research that you're doing and whether or not you have any privacy concerns. You can have privacy concerns; it's okay to have them. What's not okay is to either pretend they're not there or not do anything about them. You can acknowledge that there's a privacy concern and then say how you're going to handle it, how you plan on mitigating it, and how you plan on disclosing it to the person who's participating. If you do those things, you're probably going to be fine. Why? Because if you tell people what you're going to do, and you tell them what the risks are, and they're fully informed and they get it, and they know how you're trying to protect them from the bad things that could happen and how you're trying to honor them, they're going to trust you. And that's a good thing. So these are not things you need to be scared about or try to hide; these are things you should surface, and doing so just makes your life easier.
Do they come up the most common places they come up are in recruitment consent. And. Some of the procedures, and then in some of your research methods, and. You. Know for example if you think about social media research, there's. A lot of privacy concerns around social media research, and, a lot of it has to do with the methods and the way you're going to do it it's like oh I'm gonna get access to your Facebook page great okay you, can ask somebody for permission, to do that but you should tell them what you're doing in pretty, gory detail and for how long you, know you can't just say we would like access to your Facebook page to see how things are going with you from time to time but. Then in reality you've scraped, the entirety, of their Facebook page since 2005. And then put it into a database, that's. Probably. Going to inviolate, their privacy, and not, really, telling them what you're actually doing so, you probably need to be more descriptive in some of these cases of what you're doing.

Confidentiality is the second one. Privacy is about people; confidentiality is typically considered to be about data. You have lots of training in that; I'm assuming, again, that everybody here has been trained in confidentiality, data breaches, data privacy, and data protection, or you wouldn't have been allowed in this building. And there are a lot of things IRBs are mandated to look at around this too. So it's something we can design with, and do a better job on, and work together on in ways that actually make it feasible.

Really, what people are often most worried about, at the far end, is data breaches, and we've had a number of breaches in the last few years that have been very public. We had the Equifax breach; we've had EMRs, electronic medical record systems at hospitals, held hostage by people. These are things that really do concern people, and so the more we think about this up front, use procedures consistent with sound design, have reasonable expectations, talk about those risks, and have adequate security provisions in place, the better. No one's asking you to create something insane. They want you to demonstrate that you've thought it through: you've got a plan, you've written it down, and you know how to handle it if something happens. And handling it might not be what you think. Handling it could even be: "In the event that we have a data breach, our process is this: we notify the IRB, we call the information security officer, and we have a huddle to figure out what to do next." You write that into a protocol, and you're good, because you've demonstrated knowledge of how you're going to approach the problem. No one expects you to know exactly how some data breach is going to occur, or exactly how you're going to respond to it minute to minute. You want to demonstrate that you have this understanding: we are going to react in an appropriate manner, and here are the steps we're going to take. So it's usually a lot easier than people fear to create these kinds of protections.
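To show how little is actually being asked for, here is what that written-down breach plan might look like expressed in code; this is just a sketch of the documented sequence from the example above, not a real incident-response system:

```python
# A sketch of the breach-response language above, expressed as code.
# In practice each step would page a human; the protocol text describing
# this sequence is what the IRB actually reviews.
import logging

BREACH_RESPONSE_STEPS = [
    "Notify the IRB",
    "Call the information security officer",
    "Convene a huddle to decide what to do next",
]

def handle_suspected_breach(description: str) -> None:
    """Log a suspected breach and walk through the documented steps."""
    logging.warning("Suspected data breach: %s", description)
    for step in BREACH_RESPONSE_STEPS:
        logging.warning("Response step: %s", step)
```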
Okay. So we've talked about how we got here, and about vulnerability. We've talked a little bit about the generalized oversight of technology research and some of the large issues: data security, privacy, confidentiality, and so on. Now we're going to dig into the classic research ethics part of it. We'll use digital health as the example, just because it has most of the corner cases you'll ever find, and a number of things about it are instructive for other types of technology research. And I've been talking about it because this is the area where I've spent the most time. I'm not a UX designer, so I'm not going to talk about it from that standpoint, and I don't spend my time doing machine learning, so I'm not going to talk about it from that standpoint either.

Classically, in research ethics, remember I mentioned the Belmont Report: we've got these three big pillars, autonomy (or respect for persons), beneficence (the risk-benefit ratio), and justice, and we're going to talk about issues and opportunities in research ethics in digital health, or health technology research, related to these three things.

The first one is autonomy, and here the most interesting stuff lately has really been around consent. One thing that I've had to drive home really, really, really strongly with the technology companies I've worked with, and others, is the concept that a EULA is not a consent form. I almost want to have this printed on a mug and give it to people, or make t-shirts, because it's really, really, really important: a EULA is not a consent form.

You need look no further than what happened with Facebook, where they buried in the EULA the fact that they would do research on people, and they considered that to be consent. And, you know, it didn't quite go so well, did it? They really got their nose bloodied over this. I don't think they did it to be cruel, and I don't think they did it to be mean; I don't think it was malicious in any way, shape, or form. It just happened that way. But EULAs really aren't consent. Our regulations say how we're supposed to obtain consent and the kinds of things we're mandated to do, and they give us a posture that's much more interactive and much more transparent. An end user license agreement in size-7 font that you just click through is not going to do that. And ultimately, what I really think is probably true is that our regulations, which are 45 CFR 46, or 21 CFR if you're doing FDA-regulated work, are probably going to win if this ever goes to court against a EULA, because there's no federal code that talks about EULAs, but there is one that talks about consent for research. So we need to be careful about these contradictions between EULAs for a device, or for a SaaS platform, or for a social media platform, and the consent forms that we write.

How do we get some clarity on that? We really need our IRBs to help with this, and they can be the people coordinating with legal and others to get it done properly, because you don't want that kind of stuff to stand in the way of your doing research. But you also need to do it right, and it may impact your design; it may impact the things you tell participants and ask them to do in research that you're running. So we need to do that work early, and be very involved in the process, in an open-minded way, with others. As a community of people who do research in technology, we're going to learn a lot around these issues in the coming years. But if you want to get out ahead of it: don't assume that a EULA can say whatever it wants, that you can mandate people to sign it, or that it is sufficient for research consent, because it's not. That's a pretty important point that I hope you'll take away. It'll save you a lot of grief if you think about it that way, and it'll also reduce risk for you and your participants.

The other part of this autonomy piece was something I got to work on a few years ago, when I was still a professor at Mount Sinai and one of the IRB chairs at the hospital. It was a ResearchKit application. When ResearchKit came out, there were four apps that launched with it; it's an SDK that Apple released to allow mobile medical research on the iPhone. The one I worked on was around asthma, and it had symptom tracking and some other nice integrations. It also sent you notifications based on your location, looking up National Weather Service data and telling you about pollen counts, because one of the big triggers for asthmatics happens to be pollen in the air, along with air quality. So it was really quite fun. It was a minimal-risk research study.

It was just capturing a lot of data and then telling people some information. There was no medical intervention, no doctor telling them something, no prescriptions or anything like that. And we had these wonderful enrollment numbers: that huge ballooning number of 43,000-something people. That's what happens when Tim Cook, like, points at your app at a big conference and goes, "Look at this," and then you just watch your enrollment go up like that. Oddly enough, I worked on and wrote a bunch of this protocol, and I had a sneaking suspicion, though they never told us, that Apple might do something like that. So I wrote into the recruitment section that the app might be included in a Worldwide Developers Conference or an Apple keynote, including that it might be announced by the CEO of Apple, and that therefore our recruitment numbers could include tens of thousands of people. What was wonderful about writing that one paragraph into our recruitment plan: we never had non-compliance, because we predicted that this might occur and we wrote it in, so we didn't get in trouble for having our enrollment go to, I don't know, 9,000 times what we initially thought.

So what did we do here from an autonomy standpoint? The thing I was most heavily involved in, and continue to be involved in now, is: how do we do consent on mobile devices, tablets, and other computers, and take advantage of the technical solutions we have to improve our consenting process? Classically, consents for research are on pieces of paper, and they're long and ugly and legalistic, and people barely read them; they just kind of go "yeah, yeah, yeah," skip to the end, and sign. Do you really want that? That's not really informing anyone. So how can we use technical solutions to actually make that process better? I think this is a useful thing to do with technology, and we had a number of things in the ResearchKit SDK to do it. One of the ideas that came out of it, which Sage Bionetworks (they're here in Seattle), Apple, a few others, and myself were heavily involved in, was layered, participant-centric consent. With layered consent, each section of the consent form has an icon and a quick one-sentence version, the terse "this is what this is all about": "We're going to protect your data and keep it secure," right? That's the bottom line you want someone to get when they read a section about data security: you want them to feel that you're going to protect their data, that you're going to honor that. So, boom: first layer, an icon and a simple statement. Then you can tap and read more, and when you read more, it has the more detailed information you would get in a standard consent. You can flow through this in a way that makes it really easy, and then you can capture a signature and route it back. Wonderful, right? It sounds like a good thing.
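As a sketch, the layered structure can be thought of as data like this; the field names are mine, for illustration, and not ResearchKit's actual API:

```python
# A sketch of layered, participant-centric consent: each section carries
# an icon, a terse one-sentence first layer, and a full-detail layer the
# participant can expand before signing.
from dataclasses import dataclass

@dataclass
class ConsentSection:
    icon: str      # e.g. a padlock for the data-security section
    summary: str   # the terse first layer
    detail: str    # the "read more" layer, standard consent language

sections = [
    ConsentSection(
        icon="padlock",
        summary="We're going to protect your data and keep it secure.",
        detail="The full data-security language, as it would appear in "
               "a standard paper consent form...",
    ),
    # ...one section per topic, followed by signature capture.
]
```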

I thought that was neat, but I actually think we could do more. That's not enough; it's still very passive, right? I'm just swiping through some screens. It looks a lot like the "let's tour the functions of an app" flow when you first load a new one: the app tour, where someone just shows you a bunch of screens and you go "yeah, now I know what I'm doing," even though we don't, and then we skip right through and move on. I said no, I think there's more we can do here. I think it needs more interactivity. I think it needs a summary, and this is actually a regulatory thing: tell me everything in three or four sentences up front. What is this all about? Give me a sense of what I'm getting into at the beginning. I think summaries are really useful. I think we need easier ways to create, edit, and deploy these; we need real systems so we can make this more efficient. I don't think they should be hard-coded into each application; that's silly. We know how to do this, right? And I think there are a number of other issues we can solve. These are the things I've spent a lot of time on in the last two years, because I think we need it, and I think we can easily do it. I don't think this stuff is actually all that hard. So it was a good start that Apple got this going, but I think we can do better.

When we think about informing people, and about trying to have people understand information and know what they're signing up for and agreeing to, we know from a lot of the education literature, especially around adult learning, that people learn in different modalities. Some of you may be very visual learners: you might like graphics and charts. Some may be very text-based learners who like to read paragraphs about things. And some of you, very few of you, though I'm guessing here at Microsoft it's probably a larger percentage than elsewhere, may actually understand things like equations really well. You know the classic statement that every time you put an equation into a presentation you lose half the audience; but some people are much more technical in the way they learn. And if we're going to be recruiting participants into research, we should acknowledge that reality.

We should take what we know from an entire field of inquiry and see how we can use tech to do a better job with it. So what I came up with, and if it's wrong and people hate it they can blame me, my name's on it, they can come flog me with a rubber chicken, is that I think we need at least three modalities, and this can include quiz questions to confirm with the person that they understand what you've just talked to them about: charts, videos, text, some sort of interactivity. I think if you use a minimum of three of those, you've probably increased your chances of informing the participant, and I think we can do that now. And it was fun, because I've been saying this for a few years, and I finally got somebody from the FDA, when we were talking about some of the regulations on mobile technology in research; I mentioned the idea in one of their talks, and the best I got from them was: "Well, we don't mandate this. It's really not a bad idea." That was about as good as you're going to get. Great: I will accept the compliment of a double negative from the FDA.

So what are some of the benefits you actually get from this? It seems nice, and some people say, "Yeah, that sounds good, but what is it really going to do?" Electronic consenting, done this way, has a number of really important benefits. It can deal with low literacy and low numeracy. If you have somebody with lower literacy, but you do short video segments that explain things through a mobile device, you've just gotten around the literacy issue, because you're talking to them; they don't have to read through it. For low numeracy, if you show graphics that are representative and informative, you may get around that. So there are a lot of different ways we can provide value here. You can make it so someone can proceed at their own pace, stop, and come back to something. You can make it so someone has access to it at all times. If you're in a research study where they're doing something interventional or investigative, and something happens, you trip and break your ankle or something and wind up in a hospital, and you want to tell the doctor that you're in a research study: cool. With online consenting systems and apps, you could have that consent on your phone and be able to show it to the clinician; you'd be able to share that information a lot more readily, and you'd be able to remember what you're involved in. Most people, when they sign up for research and are given a paper consent form, lose it or throw it away in the first 48 hours. It's like, "I don't know where it is, it's somewhere at home," or, "Oh, it went out with the recycling along with the Sunday New York Times from last week," right? So with these kinds of tools I think we have an opportunity to create something way better, something that actually provides real value to people and reduces headaches.
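If you wanted to encode this "minimum three modalities plus a comprehension check" idea, it might look something like the following; a hypothetical sketch, not any shipping platform's design:

```python
# A sketch of a consent module offering at least three modalities and a
# quiz question that must be answered before the participant proceeds.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class QuizQuestion:
    prompt: str
    choices: List[str]
    correct_index: int

@dataclass
class ConsentModule:
    topic: str
    text: str          # for text-based learners
    video_url: str     # a spoken explanation, which also helps low literacy
    chart_url: str     # representative graphics, which also help low numeracy
    quiz: Optional[QuizQuestion] = None

def may_proceed(module: ConsentModule, answer_index: int) -> bool:
    """The participant moves on only after confirming understanding."""
    return module.quiz is None or answer_index == module.quiz.correct_index
```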
Like halfway through our enrollment, four months later it's, wild and it seems to be working and we we, went through the process we're asking follow-up questions, about their how you know how they liked it what they understood what they didn't etc to. See if this is really a viable option for people and so, far it's looking like, it is it's, it seems to be working we, also have analytics, about the funnel of how many people try and how many people complete etc, and it looks looks, really pretty nice we've, also built some more stuff, and. The number of those features of interactivity. Into. Into. A consent platform, for.

That's for a piece of tech I've been working on with another company. We can do things like finger-based signatures. We can have it generate PDFs that are locked and timestamped, so you can track and audit your consent forms. We can make it easy to email them to different parties; say you sign up for something but want to share it with a different doctor, because you've got more than one: it's seamless to send it. So we're pushing forward on these things, but this is relatively new. And this is not just "I've taken a document, turned it into a PDF, and given them a stylus to sign," right? We're doing interactivity; we've got the ability to create videos and put them in, ask quiz questions, personalize different sorts of charts and things. So it's heading in a very fun and interesting direction, and all of this is how I think we can increase autonomy and protect subjects in the conduct of research. And it's using best-in-class tech, the kinds of stuff that you guys build here.
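One way to get that "locked and timestamped" audit property is to hash the signed PDF and record when it was signed; a minimal sketch, since the platform's actual implementation isn't described here:

```python
# A sketch of a tamper-evident audit record for a signed consent PDF:
# hash the file and timestamp it, so any later alteration is detectable
# by re-hashing the file and comparing digests.
import hashlib
from datetime import datetime, timezone

def consent_audit_record(pdf_bytes: bytes) -> dict:
    return {
        "sha256": hashlib.sha256(pdf_bytes).hexdigest(),
        "signed_at": datetime.now(timezone.utc).isoformat(),
    }
```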

The second one here is collecting information from or about bystanders. Say you want people to take selfies of themselves every day for the next three years. Why? I don't know: you want to see how their affect changes over time, you want to see what their choices are, you name it; there could be any number of reasons. Well, what if in 50 out of the nine hundred to a thousand photos there's a person in the background that you can see? You've collected information about a bystander, and it's probably identifiable. What are you going to do? There's a risk that this could happen; it wasn't intentional, but it happened. You can actually solve it by saying: in our protocol, if we ever have people in the background, we're going to automatically process the image, identify the person who is part of the study, and then either crop out or obscure the other person in our data. That's totally fine; you're allowed to do that. You don't even have to automate it if you don't want to; you could do it manually. It sounds boring, but you could. These are the kinds of things that can come up.
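As an illustration of that crop-or-obscure step, here is a minimal sketch using OpenCV's stock face detector. It assumes, simplistically, that the largest detected face belongs to the enrolled participant; a real protocol would need a more reliable way to identify who is actually in the study.

    import cv2

    # Haar cascade face detector that ships with the opencv-python package.
    _cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def obscure_bystanders(image_path: str, output_path: str) -> int:
        """Blur every detected face except the largest one, which we
        assume (simplistically) belongs to the enrolled participant.
        Returns the number of faces that were obscured."""
        img = cv2.imread(image_path)
        if img is None:
            raise FileNotFoundError(image_path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = _cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) <= 1:
            cv2.imwrite(output_path, img)
            return 0
        # Keep the largest face; blur the rest in place.
        faces = sorted(faces, key=lambda f: f[2] * f[3], reverse=True)
        for (x, y, w, h) in faces[1:]:
            img[y:y + h, x:x + w] = cv2.GaussianBlur(
                img[y:y + h, x:x + w], (51, 51), 0)
        cv2.imwrite(output_path, img)
        return len(faces) - 1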
The last one here is consent legitimacy. This is something that often gets asked in the IRB context: is the person who we think they are? We haven't seen a huge spike in people trying to spoof research participation. It could happen, and it's often linked to how risky the research is: if you have really high-risk research, maybe you want some sort of technical solution to confirm someone's identity. But we don't usually see a lot of that kind of non-compliance. The more we see it, the more we may have to develop technical solutions to resolve it, but to date I haven't seen real problems here, so I don't think we have to go to the nth degree yet in confirming the identity of individuals.

Here is some material a colleague of mine put together when she worked with people in substance abuse treatment. She asked for, and got, feedback from participants about digital health risks related to their lives, and I think it's instructive to think about the impact on the participants you're actually studying and to put yourselves in their shoes. In some cases, especially if it's your area of focus, you may know a whole heck of a lot about them and be really tuned in; that's why you're doing research in that area. Ask them, and they'll give you good feedback. These were things people were really worried about, especially around substance abuse. There's the prospect of physical harm: they have the phone, they get a text message related to the research, and it could lead to an outcome where somebody else finds out about something, or thinks they're a snitch, and reacts violently toward them. You can get social harm: somebody finds out you may have been using drugs in the past. You might be completely sober at that point, but if you were using drugs and someone finds out you're in a research study that's talking about it or giving you support, you may get fired, you may get ostracized, all because you got a notification or a text message. These things are really worrying to people. And then we have economic harms and legal harms, and the legal one is particularly bad right now.

We have a number of places in the United States doing research on undocumented individuals and the immigrant experience. Okay: what happens if ICE walks in the door, tries to take your laptop, figures out who those people are, and then goes and finds them? Are you prepared? Have you disclosed that risk to them? Do you have things in your protocol that allow you to handle it? There actually are ways to handle it. These are some of the harms, and the potential protections, that you may want to build into your work. It's not because you think ICE is going to barrel into Microsoft Research and take your laptop; that would be really bold. But you want to have thought through how you're protecting the people you're gaining valuable knowledge about as you develop generalizable information you may want to publish, or a new product or a new service. It's about thinking this stuff through and being thoughtful; you can do a better job, and these harms are real. The one I just mentioned about undocumented individuals is very real. There is a way to get around it, and I have seen attempts by agencies to go after researchers, so this is not make-believe.
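The talk doesn't spell out which protections it has in mind, but one common technical safeguard for this scenario, offered purely as an illustration and not as the speaker's method, is to keep raw identifiers off study devices entirely and store only keyed pseudonyms, with the key held somewhere the data isn't.

    import hmac
    import hashlib

    def pseudonymize(identifier: str, secret_key: bytes) -> str:
        """Replace a raw identifier (name, phone number, etc.) with a
        keyed HMAC-SHA256 pseudonym. Without the key, the pseudonym can
        be neither reversed nor confirmed against a guessed identifier,
        so a seized device holding only pseudonyms does not expose
        participants."""
        return hmac.new(secret_key, identifier.encode("utf-8"),
                        hashlib.sha256).hexdigest()

    # The key must live apart from the data, for example with a
    # custodian at another institution, or the protection is illusory.

In US federally funded research, technical measures like this are typically paired with legal ones, such as a Certificate of Confidentiality, which limits compelled disclosure of identifiable research data.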

Okay, here are some of the other risks that are also inherent in mobile health and digital health research. The Hawthorne effect: we change our behavior because someone is watching us. This is true; I haven't seen anybody pick their nose today, because there are people around, and you're not going to pick your nose, right? That's the Hawthorne effect. In some cases digital solutions have a Hawthorne effect as an intended outcome: we want behavior change. We might want to help someone change their behavior in ways that support the goals in their life: better health, more efficient work product, getting along well, you name it. So sometimes the whole point is the Hawthorne effect, but it is still a risk. The more fun one is the anti-Hawthorne effect, which is: "I gave you permission to track what I'm doing and collect data about me, totally forgot that you were doing it, and then disclosed way more than I'm really comfortable disclosing to you." This can happen in social media: I gave you access to my social media, you're watching it, and then around Thanksgiving I went home, and you watched me have a flame war on Facebook with my relatives, and I'm kind of embarrassed; I really wish I hadn't disclosed that, and I really wish you hadn't seen it. So that would be a case of the anti-Hawthorne effect. The worst case of the anti-Hawthorne effect might include something like location-based services and tracking: the wife goes to work in the morning, and the husband has the tracker, and actually


Comments:

The entire point of the Milgram experiment was that people will do horrible things if an authority figure instructs them. In order to get humans to commit atrocities, certain types of justification are more effective than others. On a related note: Incorporation creates hivemind thinking, and since corporate entities operate under corporate law, they cannot help but become sociopathic entities given enough time.
