Robert Califf: "Observations on Technology Enabled Healthcare" | Talks at Google
Good morning, and thank you all for coming out to meet Rob Califf, former FDA Commissioner under President Obama and one of our early science advisors. I'm Paul Varghese, the clinical lead for health informatics at Verily Life Sciences, and today we're going to have a question-and-answer session with one of the most influential physician-scientists in the United States and internationally. I'll be asking Dr. Califf about a range of topics related to technology-enabled healthcare, and there will be an opportunity for the audience to ask questions as well.

For those of you who may not be familiar with Verily: we are an offshoot of Google X, we've been around since 2015, and our mission is closely aligned with Google's. We would like to make the world's healthcare information accessible and useful for the population at large, and we do this in a variety of ways: one is to collect information, organize it, and then make it actionable. You may have heard of some of our recent efforts in this field; some of them are prominent public projects. One is called the Project Baseline study, which aims to enroll 10,000 people to collect evidence in a way the world has not seen before: a broad range of healthcare information, both traditional and non-traditional, things like behavioral information and activity information. Another, more recent initiative is in the disease management space: we recently launched a joint venture with Sanofi called Onduo, for diabetes management initiatives. It is again trying to apply the best of what we know as a Google- and Alphabet-associated company to the area of healthcare, which we think is ripe for improvement with some of the technologies we take for granted.

That's a little bit about Verily. With me is Robert Califf, who is one of our Verily science advisors. There could be a whole separate talk about all the things that Dr. Califf has done, but I will hit the highlights. He is a physician-scientist, a cardiologist trained primarily at Duke University, where he spent the last thirty years becoming one of our more prominent medical researchers. I think the blurb that I read was 1,200 peer-reviewed journal articles and 50,000 citations, and that just shows the kind of work that he's done. It's particularly meaningful to me: I'm trained as a cardiologist, and when I did my training we learned Dr. Califf's work. Among his studies was one of the most prominent large-scale clinical trials that gave us true evidence-based medicine, called the GUSTO trial. And when we say large scale, I mean large scale: 40,000 people. Rob will have some opportunity to talk about what it was like to do that kind of evidence generation and data collection in an era when the most sophisticated technology was a fax machine. Those are some of the reasons why he was tapped by President Obama to help run the FDA. For those of you who are not familiar with the FDA, it is the agency tasked with ensuring the safety and efficacy of the medications and medical devices that we take for granted in our lives. I think the blurb from Forbes was that Rob Califf was the most qualified FDA Commissioner in the history of the organization, given the wide-ranging background that he had. For all of those reasons, I've asked him to come here and talk a little bit about what it's like to be a clinician, scientist, and educator, and to be on the government side of things, when we are trying to find new and innovative ways to apply technology to healthcare. So that's the background for Dr. Califf. Thanks.

Why come to technology? You have been on the government side, you've been on the clinical side. What is it about coming over to the technology era
or the technology sector that you found appealing? And the related question is something that Eric Schmidt posed to us about a year ago, which is: how does a technology-centric company such as Alphabet, such as Google, such as Verily, learn to become more healthcare-centric?

Well, first of all, let me just say I'm hedging my bets, as you know. I'm half time at Verily and half time back at the old university in North Carolina. I'm also chairing the board of an organization called the People-Centered Research Foundation, which is an effort to generate evidence, on behalf of patients and consumers, about health policies and medical products. It has been funded through the Affordable Care Act, actually, and now we have a hundred and thirty million Americans participating through the use of their health records and, hopefully, in clinical trials. So I see it as a comprehensive effort. But why, at age 66, spend two weeks in North Carolina and two weeks in California? It seems like an extreme thing to do. The answer is that, obviously, to me the technology sphere is taking off now and has a huge amount of money. So on one hand we've got tremendous talent and a revolution in technology, and then we've got a country, maybe exemplified by the Southeast and middle America, where life expectancy has gone the wrong direction three years in a row. We should be concerned about the world, but we'd better be concerned about the United States, because our health is deteriorating measurably under our feet. What I would hope to do is to bring these worlds together, because the traditional health care sphere needs better technology, and amazing things are possible. But the tech world has failed every single time it's tried to get into this space, and there are reasons for that which have to do with the human side of things, things that are, I think, fundamentally different between software development and its application to something very human, like delivering healthcare, and people making decisions about their healthcare. So at this stage of my career I want to be helpful, and I'm trying to keep a foot on each side of the equation.

So I'm going to ask you to play the role of physician. You're used to giving patients constructive information that can be tough to hear. What kind of constructive, tough medicine would you give to an organization like Google, Alphabet, and Verily? You mentioned the differences in how clinical medicine is done versus software development.

The part of this I'm very confident about giving advice on is to pay a lot more attention to the human side of things. The best software in the world, when it comes to people making everyday decisions about health, will not do the job unless it's integrated with people. It seems like a lot of what Silicon Valley is built on is an individual interacting with a computer, which has been great for me in my career, but I'm not making decisions about my health alone: often my wife is involved, my parents may be involved, and for many things, by law, I've got to deal with doctors and nurses and clinics and hospitals. So it's a complicated social, human equation that needs to be dealt with. The part I'm a little less confident about, though I'm sure it's an issue we need to pay attention to now: just looking at what's happening with Facebook, you cannot assume you can keep using people's information and selling it; a very strong governance approach, probably eventually regulatory, is going to be needed. Ultimately this is, I think, a very powerful double-edged sword that, if it's not handled through channels that have governance, could end up being detrimental, as I think this last election was. It certainly wasn't good for my career; I was out of a job January 20th.

So you talked about, you
touched on the role of what we do with people's data. One of the potential promises is that as we do better data collection, we have the advantage of technology being fairly ubiquitous: we have sensors, we're all carrying smartphones, we're gathering information, and we want to do the right thing by the patient and the right thing by the provider. Can you talk a little bit about how we empower people to get that information? For example, you were recently at South by Southwest talking on this very subject. Can you share where you think the obligation is in getting that information back to the patient? And do we not run the risk of overwhelming people? More information is not necessarily easier to manage.

There's a lot of wisdom in what you just said. In my opinion this goes way back for me. We were doing studies in the '80s of end-of-life care, and pretty easily demonstrated that different people will take exactly the same information and make very different decisions, oftentimes kind of shocking. There was a general view that if you had, say, a 1% chance of living the next six months, people would rationally say "don't resuscitate me," because it's just going to add a couple of days of misery, and I'd rather die peacefully under good social conditions. But a lot of Americans who look at 1 percent say, "I'm going to be the 1%," and often the families would have opinions about it. We've also got to look at the cognitive capabilities of the population: a very high proportion of Americans, something like 15%, can't read above a second-grade level. So the idea that we're going to empower people by just handing them, say, their genome in that circumstance doesn't make sense; on the other hand, you should get your genome. So I think one of the beauties of what could be done with current technology is to tailor the information to the person according to their capabilities and desires, also taking into account that someone who has an illness or a serious finding needs to have a relationship with a clinician of some type, and a system, and, like I say, very often with family members. I love the British term for this: "carers." A carer is someone who is not paid but takes care of you; a caregiver is someone who is paid, like a nurse or a social worker. Those things are different, because the carer often comes with a very emotional component. It's been said that in American families the women make about 80% of the health care decisions, and I think that is verified by empirical research.
So on average, if it's a man and you're not dealing with the significant other in a heterosexual couple, you're probably making a mistake, on average.

You talked about some of the new demands we're placing on patients when they get access to that information, and on the people surrounding them. If that is changing for the patient and the people around them, where is this going to impact the people we traditionally associate with caregiving: the physician, the nurse, and the roles and responsibilities we associate with them? In 2012 there was a lot of attention to what Vinod Khosla said, that you didn't want to see Dr. Varghese or Dr. Califf, you wanted to see Dr. Algorithm. Then four years later he kind of backtracked and said, well, I think there's a role for the physician, but it's going to be like 20% physician, not a hundred percent replacement. So the question for you is, from your vantage point, both as a physician and an educator: where do you see the role of the physician evolving? I ask in particular because your son is actually doing his medical training in emergency medicine. What do you see happening to your son and other people in their training?

Well, one thing you learn at the FDA in a big way is that when people are healthy, they really do behave like consumers: very little risk tolerance, a lot of desire for independence, particularly in Americans; it's a little different in other cultures. But when people get sick, they become dependent fairly quickly. They're also willing to take more risk, and they're more vulnerable to people selling them things when they're at risk. So I'm not worried about the role of the physician going away; I don't think that's going to happen. The physician becomes an enabler and an information transmitter. But there's also this other big thing happening. We use the word "physician," but that's really almost a metaphor for a clinician of some type. More and more, the clinician may be a nurse or a pharmacist or a physician's assistant: all kinds of people working in teams to help people take better care of themselves and, when the person desires it, making decisions for that person. For example, as a cardiologist, it's pretty rare to have somebody coming in with an acute heart attack making an argument about what treatment they want. You've got ten-out-of-ten chest pain and a 30% chance of dying in the next couple of days; you're pretty dependent, and it's critical to have doctors that can make decisions for people, and healthcare teams that can do that. On the other hand, if you're totally healthy and you're trying to decide which vitamin to take or whether to take a prophylactic statin, the locus of control is much more in the hands of the individual and the family. So the role of the doctor will change. The most imperiled in their current roles would be those that look at images, so radiologists and dermatologists, because it's clear that machine learning can do better for the repetitive parts. But think about it: right now those people make their living, for the most part, looking at images serially and producing reports. We have a big problem in translating the findings of those reports into useful actions, and there's a tremendous gap to be filled there for clinicians. So the jobs will just be different; I would predict that the better information we have, the more there's going to be a need for helpful people in health care.

That seems to be a natural segue. For those areas that are more amenable to these applications of machine learning and technology, how reliable should they be, and who are the people or organizations that should be in charge of those things? And I ask
in particular because the FDA has the purview of evaluating medical software as a device; there are some things that are exempt and some things that are not. As a physician, I have a lot of comfort and utility when I know that there's a tool that's reliable. I joke with my staff that I can answer nearly every medical question they ask me with one of three answers: "we don't know," "it depends," and my new favorite, "I don't remember." And I'm freely admitting that there are some things that are just —

I don't have time for that; I'm going to need a tool that helps me. But I know how to evaluate the safety and effectiveness of a drug or a medical device; I think we're really in uncharted territory when it comes to what we do with algorithms. Having sat in the highest seat in this arena, what are your thoughts?

Well, I feel compelled to say my career started in 1975 with one of the first databases in medicine. Before the 1970s, everything in medicine was on sheets of paper in scribbled handwriting, as you may know, and I was just fortunate, as a medical student, to land in a place that had a database in cardiology that measured everything that happened to people as part of clinical care, as part of the clinical record, and then followed them for life. So we had algorithms that were used in a predictive way back then for one very specific condition, and the first paper I wrote that was published in the peer-reviewed literature predicted that within five years all diseases would be treated with the assistance of a computer giving decision support, with the doctor and patient looking together at the information. I was off by 40 years; we're still not there, because it's such a vast enterprise and there is so much to do. But if we fast forward to my time at the FDA: I had seen this before, but at the FDA you really see it. If you're really good in your field and you're very principled and you always do the right thing, it's hard to imagine that there are nefarious people in the world, and also that there are a lot of people who are well-intentioned but just don't get it right. You see the mixture of those two at the FDA. The proportion of nefarious people is actually quite small, but they can have very powerful negative effects, and the history of the FDA is a history of catastrophic public health events that led to regulations. It started with a horse named Jim, a milk-wagon horse that was being used to develop antitoxin; the horse got infected, and it killed some children in St. Louis. That was 1901, and it led to the first Biologics Control Act. Then it was sulfanilamide: a company put antifreeze in an antibiotic that was sold to children, believe it or not, and it killed a number of children, so that led to safety regulation. Then there was thalidomide, which was being given as free samples to pregnant women to control nausea and caused horrible birth defects; that led to Kefauver-Harris in 1962. It's amazing to think that before 1962 you didn't have to show a drug was effective to put it on the market, and at that point they reviewed 3,000 drugs being sold routinely in the United States that had no evidence of any benefit. They were all pulled off the market. So if you don't think there are people selling stuff that's bad... And then for devices there was the Dalkon Shield, a pregnancy-prevention device that caused infections, and let's just say the company that made it didn't necessarily come clean with all the adverse event reports in a timely fashion. That led to more device regulation.

So now let's think about decision support and algorithms. It's understandable that there are a lot of smart people who say: regulators are in the way; we can do this right; it's going to help people; we know we're good. But let's say there's some decision support that a lot of people use, and there's a little glitch in the program, so the wrong dose of a drug is recommended. The cover right there would be to say, well, it's just decision support, the doctor makes the decision. But anybody who's been in a busy clinician's office knows that at ten minutes a patient, you're going to use decision support if you think it's good. And this is part of my meeting with Obama. Just a quick story on that: when I got offered the job, people
Know I'm an enormous State basketball fan, I was captain of my high school team. That. Season tickets at Cameron, Indoor Stadium for, decades and. He. Spent the first ten minutes of our one-on-one meeting, in the Oval Office telling. Me how much he hated Duke and he love loves, UNC. And. This went on for a while and but then we get you mean like everybody here yeah, you, got it I'd say Duke and the Red Sox have a lot in common, the payer more the Patriots probable in terms of national, hatred but the last the.
Last 20 minutes were literally. About. The importance, of technology to the American people and how to, the. Need the absolute need to regulate it but to. Do so in a way that. Fostered. Creativity. And innovation, and allowed this industry, to thrive, and. You know I loved it and I think I think there are ways to do that it starts, with saying that for now the FDA is not going to regulate decision support heavily, it's going to watch. It with something that's called, enforcement. Discretion, the. Exception, being if that decision support, is attached to a real, medical device that has consequences like. Your, cardiac defibrillator. You, wouldn't want the algorithm, to not quite be right with the cardiac, defibrillator. So, that's going to be regulated. As part of a medical device but. You know should you take a statin decision, support what, you do with your blood pressure medicine should. You get, a flu vaccine not. Going to be heavily regulated, but I've predicted publicly. And I'll just say it again there. Will be some catastrophes. That. Will occur and. That's, how we'll figure out what should be regulated. In the meanwhile. One. Of the reasons that I came to verily, as part of Google is my belief that a, company, with these resources, can. Afford to do it right because so often what you see both. In the academic world and in, the industry world is you've got a company that's just trying to skate by. Minimizes. Expenses. And get revenue as quickly as possible there's. A tendency, to cut corners, and. Often. You can cut corners and get away with it it doesn't affect anything but sometimes. Bad. Things happen and you, know this company can afford to do it right and I, think should not back off I, do. Want to say one other thing I. Advertising. Is a big part of this company and, the. Role of advertising. And all of this is a, very, complex, issue that we bet a lot of attention too. Because. Advertising. 
is how you will get products to people. Doctors, just like any other human beings, can't use something if they don't know about it. On the other hand, most advertising now is being used to sell things that aren't so good for people's health, and I would argue it's one of the reasons our health statistics are going backwards: we're not eating well, and we're spending a lot of money on things that are detrimental to health.

I'm thinking about the flip side of that. With those types of interactions that people have with the things they're getting prompted to look at, is this not an opportunity to treat that as information that can help us make better decisions? There is this concept of pharmacovigilance: after a product, a pharmaceutical or a medical device, gets released to the market, it's only then that we learn its true effect, because when you're doing the trial it's hard to anticipate everything. And now we get real-world evidence. With people interacting with the things they're looking for and searching for, is this not an opportunity for us to treat that as a valuable data stream, to help us counteract some of the things we're not able to anticipate now? At the same time, though, I think nobody has the right answer for the level of privacy associated with this, and I'm curious where you see both the benefit and the challenge of that.

Well, it is just the world's most important opportunity right now to make a difference. It's really fascinating if you look at the health statistics. We're going to have an amazing meeting of parts of Google and the Gates Foundation and Chris Murray, who produce these beautiful geospatial maps of the world. They've had a focus recently on the US, because we're looking particularly bad compared to other developed countries, and the breakdown of life expectancy and health status roughly looks like the election map. It's really fascinating. But if you go to the red counties where life expectancy is going backwards, it's mostly due to opioids, suicide, and cardiometabolic disease, which is roughly eating too much and not exercising. Everybody has a cell phone; everybody has access to information. It's a question of what we do with that information, and how we include people in an active way that leads to better health. Maybe people in the room have the answer; there are a lot of smart people here. But it seems to me that a lot of what's going on with the technology now runs on what I think of as addictive circuits; the neurobiology is pretty complicated and not so simple, but much of it is built on immediate delight. For almost everything related to your health, you need the opposite of immediate delight: you need executive function. You need to be able to say, no, I'm not going to eat that Bojangles' biscuit — where I come from, that's a big deal — I'm going to hold off and maybe have one a week as a treat. But what the advertising is telling you to do is to eat the biscuit now, and when you do, it feels pretty darn good. So how do you engineer executive function using technologies that currently lead to repetitive behaviors that are detrimental? That, I think, is a huge question. It's one of the reasons I came; I was hoping we could figure this out, because it's not so obvious. But we've got to involve people. What I'm telling the university side is that I don't know of anything more important for universities right now than to have serious academic, intellectual activity about what the rules of the game should be, given societal expectations of technology, that are good for society. You need policy schools, law schools, medical schools, English departments, history departments to come together and think this through. I'll say one other thing, because I sort
of make fun of Davos. I went this year; it's where millionaires tell billionaires what the middle class is like. But one of the themes there is this Fourth Industrial Revolution, which I think is very real. The first was water power, the second was electricity, the third was information technology, and we're just recovering from that revolution. Now we've got the fourth, which is the merger of biological, physical, and information sciences into a single sphere. It's dramatic, and it has profound consequences for our country and for people all over the world. I think we've learned we can't just leave this to chance; we've got to have serious people think about how it should be dealt with.

I'm going to get to a couple of audience questions in a bit. I just want to point out that this was Rob's official FDA portrait, and a couple of weeks after joining Verily, this is his official Verily portrait. So we got to him real quickly.

Yeah, and notice that Dr. Harrington sent you a Stanford t-shirt.

Yes, yes, yes. We were trying for a UNC t-shirt, but it was last-minute.

So, you introduced me to this concept, and I think it might be related to some of the things you just touched on: how do we view this information differently? This is actually a University of Washington course, "Calling Bullshit in the Age of Big Data," and apparently it's incredibly oversubscribed. So how do we teach people to understand what's put in front of them?

Well, I'm going to just talk about the health and medicine part of this; there's a whole other area in politics that we could spend a long time talking about. But you could say the FDA's job
basically, in society, is to call BS, but to do it with enforcement when it happens. And the problem we have is that if you just say "AI" or "machine learning," in much of academia now, people step back and say it must be great. But there's a lot of BS, and much of it comes from technically well-done work by people who don't understand things like confounding, lead-time bias, and the context of the research, and it can lead to very tragic wrong answers; there are many, many examples of this. What fascinated me about this course is that it's framed around big data of all types; it's a university course, not a health course. But essentially the curriculum is what you would learn in clinical epidemiology 101 if you were in a school of public health or a medical school: how to understand context, and how to know that you need the right teams of people with a track record of producing reliable results — not just accurate, but reliable for the purpose of the analysis, particularly if it's going to be used to make a decision or drive an action. And then there's having the courage to call BS. The great thing about the FDA is that you get vilified every day by the press and by Congress, but you actually have the legal authority to call BS and do something about it. For people in their everyday work who see something going on in the workplace that's not right — and this is very important for health research — people cutting corners, it's easier not to speak up, and bad things happen when that occurs. So maybe you can hold us accountable.

I think we're all familiar with the phrase "dogfooding." One of the things that is a tremendous asset for us in this company, in the Alphabet sphere, is that we have a ready, willing pool of people who will try the newest and the latest and contribute their information. How do we avoid building in those biases? From your description, you are well versed in looking for these confounding things through old-school methods of epidemiology, but for many of us the lure is that we have a ready pool and we can generate data quickly. What is it you think we should be paying more attention to here, especially when a result might be internally valid but not externally valid?

Well, the other great thing about the FDA — at least in the drug and device arena; we didn't get a chance to talk about food and cosmetics and animal health, and the FDA regulates twenty percent of the economy, half of it food; we could have a long discussion about that, but I'll avoid it for now — is that by law, companies have to demonstrate that their products are safe and effective in prospective studies. And so in health it's actually pretty clear, but you have to have the rigor and the courage to do it right. You can do all the stuff you want as you're developing your concepts and ideas; ultimately, the test is a prospective study that measures health outcomes. And if your product doesn't improve health outcomes, you've got to ask the question:
Why. Should anyone pay for it and. In. Many, cases a surprising. Finding, is that there's, actually a detrimental, effect of something that you thought was going to be great. You. Know. And there. Are many many examples I could give but I think the same is going to be true of information, technology, it looks logical, it makes sense that ought to work then. You deploy it in for reasons that you didn't understand. The. Wrong things happen, now, if you're selling shoes it's. Probably, ok if people get shoes that are uncomfortable though throw. Them away and buy some new shoes if. Someone's, making a life-or-death decision and, they're now dead that you can't take it back, and you, caused that so. It's. A different game but. The beauty of it is do. The outcome studies figure out if what, you're doing works and. Involve. People. That don't have a financial interest in the outcome in, the, studies, which. Is another hard part it's. One of these native, tensions, that. We. Know that there are things that worked well in the healthcare arena at. The same time we want the advantages, of what, we know works well in our arena, of Technology in that it. Is this sort of uncomfortable tension, that we we, sometimes think the, the adage move fast break things is the. The, thing that makes us successful but how does that translate into a. More, controlled. Arena. Where the consequences are much higher like, I said I don't think anyone solidus yet there's a reason that Silicon Valley has failed almost hundred percent of the time in this arena okay. And, part. Of it is this you. Can't just take human lives and do move fast you know a brick in the face of a human being is not a trivial issue. On. The other hand you know taking five years to get an answer for something on technology, that can move faster, this is what President. Obama was all over me about is how to do. This and I, think the key is informing, people of what you're doing so if you're gonna develop, a system, that will. 
figure out how to use technology quickly, make sure that the people who are getting health care in that system understand what you're doing. You're not doing it behind their backs secretly (back to Facebook again); they need to understand it and participate. That's why we don't call them research subjects anymore. We used to call them subjects when you did research, like they're inanimate objects that you do an experiment on. We now call them participants, because they have a right, a human right, to be involved and to understand what you're doing and why you're doing it. I believe, based on what we're seeing in the baseline study that you mentioned, that if you do inform people, they'll be excited about it. They'll want to participate, particularly if you give them feedback about what you're learning as you go along. So I think the transparency, the feedback, and being in it for the long term are key. If you want to make a buck quickly, there are some pretty easy ways to do it; they may not be good for people in the long term.

You've given us some bracing insights about what we can do better. I'm going to call up this guy named Rob Califf, who talked about what we can do with the clinical trial. This was a formative part of your contribution to the world of medicine: the large-scale clinical trial that produces the evidence-based medicine we rely on. Yet you were saying some bracing and critical things about that. Can you elaborate a little bit on this?

Yeah, you know, I just got screwed up early in my career, because I came along and someone asked me to run the cardiac care unit. At the time we didn't know what caused heart attacks. Fifty-six-year-old men were routinely dropping over dead from heart attacks; everyone was smoking cigarettes at very high rates. An enterprising
cardiologist did an angiogram and showed it was blood clots in the coronary arteries, and the race was on to develop treatments that worked. And for some strange reason, it became the norm all around the world to enter people into randomized trials, so we could do these very big randomized trials very quickly, get the answer, and move on. The risk of being dead now if you have a heart attack is less than half of what it was when we started, because of this rapid learning system using randomization to get the right answer. Because most of what we tried actually didn't work; people have forgotten about that. In most of our trials the treatment wasn't effective, even though we thought it should be.

But for some reason the whole thing got gummed up in bureaucracy and data collection. Part of the problem was that at the time, medical records were all written on paper, so you had to create a separate enterprise of data systems to collect data separately from the clinical care. Very
expensive. Then you had to fly nurses around on airplanes to check to make sure the records all matched; amazing amounts of bureaucratic human labor. The result is that even in cardiology, eighty-five percent of our major recommendations are not based on high-quality evidence, and that's the best there is. God bless you (my little brother's an orthopedic surgeon), god bless you if you need orthopedic surgery; it does a lot of good. But it's less than 5% of the major recommendations there that are based on high-quality evidence. So we need a different evidence generation system. I'm very excited that people in this room can come together and produce something that's far superior, with much more rapid learning. But we have to keep God's gift of randomization. I don't think that just analyzing our way through a bunch of data, because of all the complex biases that are involved, is enough.

About that evidence generation: are we trying to have it both ways? We want to make it easy to produce this information, to use smartphones and other types of activity trackers; at the same time, these aren't medical-grade units. So are we trying to get more data at the same time that we cannot fully trust the quality of that data?

I think we got this right at the FDA, and it's playing out, since I've left, in a very pleasing way. Basically, if you're otherwise healthy and there's a device that's meant to remind you to exercise or count your heart rate or stuff like that, that's fine. That's a consumer device; it doesn't have to be perfect, and it'll get better and better. If you've got heart failure and you're at risk of dropping over dead, that device needs to be accurate. And I think we need to have the discipline and rigor to perfect the engineering to the point where it really can be reliable. And it's
not so easy, I think we're finding. But I'm very optimistic; I think we will be able to do it. It also means that we've got to involve teams of people who really understand biomedical data and how it's used in the clinical setting, working with the engineers very closely, so that the information can really be used to inform good decisions. And I think the FDA is going to need to look carefully at a lot of the algorithms when they first come out, which will be a fascinating exercise.

I would like to give the audience the opportunity to ask questions. Brian?

Sure. So first, as a Pats fan speaking to a Duke fan, an expression I'll remind you of: they hate us because they ain't us. This is a little bit off topic, but I'm sure you'll have an opinion on it. What
do you think, and this is more of a business question: hospitals today, I believe, health care and hospitals in particular, are headed for a major disruption from a business model perspective. What are two or three things that you think hospitals need to start doing today, from a business perspective, to sustain themselves, or to continue to provide the value that they provide for the population?

There was some interesting traffic on Twitter today about this, related to an article that came out. I would make a big division between urban areas and rural areas. It turns out that rural hospitals need to become community centers, basically. If you look at rural America, where this demise that I've talked about is happening in terms of health statistics, it's closely related to the economics and loss of jobs and all that sort of stuff. At Duke we are affiliated with a group called LifePoint, which is a for-profit system that buys rural hospitals, and what's happening to those hospitals, when they're well managed, is they become sort of a waystation: they take care of people who need short stays, or who don't quite need nursing home care yet, but they also become community centers, because the main employment in these towns is education and health care. So I think rural hospitals need to be transformed; my friend Patrick and I just wrote a great JAMA editorial about the pathway for rural hospitals.

Urban hospitals, I think, can't help themselves right now, and there's no amount of efficiency in the current scheme that they can achieve that will fix the problem. This is just my opinion. It's fair to say they ought to continue to work more and more on getting more efficient, implementing decision support, etc. But fundamentally, what's needed is a different payment system, to keep people out of the hospital and then to optimize the use of the hospital, using very high-grade intelligence
here. To combine this with Clayton Christensen's focused factory idea: it ought to be that you stay out of a hospital unless you really need one; then, when you really need one, you ought to go to a place that's really good at what it does for the problem that you have, even if you have to travel, which is a disruption to the cultural things we've just been talking about. But there's no way that in the current payment scheme hospitals can afford to make that transition, so they're going to have to be forced to do it. And I hope that a role of Silicon Valley here, with all of its financial prowess as well as information, will be to help push the country to the transition it needs, away from fee-for-service medicine, which I think is the culprit.

Hi, my name is Bridget. I also work with hospitals, with Brian, so I have another hospital-related question. One thing I know about machine learning and artificial intelligence is that you get better results if you have bigger, more interconnected datasets, and there have been great moves to do that. But what I observe with hospitals is that the data is super siloed; it lives in every different spot. And while we have made major inroads with digitizing data, it's not connected to anything. Data about our most vulnerable and needy populations actually lives in a lot of these hospitals, but hospitals arguably are not technology companies, are not led by technology people or people who understand data, and again have these siloed data structures. So what can be the next big step to connect data, to allow machine learning algorithms to get smarter: not just pockets of success, but really scaled, interconnected systems that allow for better evidence-based medicine for your little brother?

Yeah. So, I completely agree with you about the nature of
the problem, but I would urge a couple of changes in terminology. America is coalescing into, and it's actually quantifiable, a little over six hundred fifty health systems now that own the hospitals. It's been said, and I haven't seen the data yet, but a pretty reliable person told me that if you look, there are about a hundred and fifty of these health systems that care for about seventy percent of the American population now. And all of them have a business interest, as you know, in aggregating their information into data, whatever
the right term is: warehouses, lakes, pools, all that stuff, so that they can conduct their business and stay viable financially. Because of fee-for-service, the data they're aggregating tends to be not quite accurate, but it's still usable for a lot of things; I don't want to sound totally cynical about that. But the culprit is that they think there's great value in that data, and so they're not very willing to share it or give it up, often under the guise of HIPAA and privacy issues and concerns, which are real but shouldn't prohibit us.

I'd say the second thing is, in our center at Duke for data science, I personally think we need to quit just saying "machine learning"; we need to say "quantitative methods," because a lot of what needs to be done doesn't require machine learning; it's just a matter of having the data. And so what I would do is focus a lot on the curation of the information, to get it into usable form, and then have teams of people who use the information for the purpose, using the right quantitative method for that purpose.

And then what I'd say is, we've got to break down the data silos, and I think our best weapon there is going to be patient advocacy groups. We really worked on this in the Obama administration, as you know, to try to get over the hurdle. If we'd just had another four years, a lot of things would be different, but we couldn't get there. And we are doing things, as I said, like the PCRF and Sentinel. In PCRF we got a third of America's health systems sharing data. PCRF is the People-Centered Research Foundation; it was created through funding from PCORI, which is paid for out of a tax on the Medicare fund and on insurance companies. There's a board that oversees that, which is industry, academia, government, the usual suspects. The goal is to do research using aggregated information. But
what you want is something even more granular
than what we have in that. But we're showing that, given the right incentives, people can share data on a massive scale. But I think actually the place to focus right now, for a period of time, is this thing that we were talking about: getting the social rules right and having people understand what the rules of the game need to be, so it can be shared. Because technically, I don't think it's a big issue right now; it's all cultural.

One of the dominant ways to get paid in the research and development of drugs and devices is to charge for them based on a monopoly. Do you see our intellectual property laws as well calibrated to the human health space, today and in the future?

As a general principle, I don't know another way to do it than to say you've got to have some period of exclusivity to recoup the cost of development. When you say "calibrated," that's a really important term, because it should be different for different circumstances, and I don't know that the current duration of time is right for the circumstances that we're currently in. A simple way to think about what you're referring to: obviously, the more rigorous we are about what gets on the market (I actually want to argue with myself here), in theory, the more you'd have to spend on development to get it right. But I would argue that if we use the technology that you guys are developing and automate the research systems, we could radically reduce the cost of development, because the biggest costs are in the late-phase clinical trials. But then you ask the question: if you have a drug that's life-saving, and you have a legal monopoly, which is a deal with society to give you a chance to recoup the cost, how do you set a price on that when the only way people can survive is if they get that drug? And that's where I think we've got it not right at this point. So you can think about that either as the length of time in which you have that monopoly, or as some other mechanism of setting prices that's more fair to people who are vulnerable.

And let's also now think about the big thing looming. Assuming we solve all the short-term problems, you've got Alzheimer's and other long-term chronic diseases where you will know that people are at risk long before they have the first symptom. So how do you develop a treatment and price it right when it
would be preemptive and worth a whole lot, but you've got to pay for it upfront? The hepatitis C drug situation is a good example of that. If you actually cost it out over a lifetime, it's a pretty good deal at the price, but the fact that people would have had to pay upfront, at one time, meant that whole state Medicaid programs would have gone bankrupt.

And what is the price right now? It's coming down, but it's in the tens of thousands per treatment. But for some rare diseases and certain types of cancers, it's three or four hundred thousand bucks a year. I think that's what you're getting at. So I don't think we have it right, and that's another area where, once we get adults in the room in Washington and at universities, we need to continue those discussions.

I have customers in health care trying to solve some of these problems, and as we work together with them and they have an interest in partnering with Google, how do we decide whether it's the right conversation to have with you guys? What are the criteria, or how do we determine that process?

Ah. So it's a little hard to come into this environment, I hate to tell you. The structural difference is really stark for me, because I'll step into my office in North Carolina at Duke University, and it's very hierarchical: I've got this big staff that makes everything work, people know who I am, and there's a funnel of information that comes up to me. Then I arrive out at Verily, and I'm in a little glass cage with another guy who's dean of a medical school; we're sharing an office, and I'm just another one of the people. And then you ask, how do decisions get made? This is a place where I think the software culture has a lot to offer in terms of the flatness. But it won't work like a software culture to make it work, and based
on everything I've seen, I think this is a major topic that the enterprise, Alphabet, needs to spend some time on. It's good to have some internal competition; Harvard is probably one of the best examples of a place where you can go 100 yards away and see people competing on the same thing. But to the outside world, if you don't offer some cohesive approach, it's hard enough for them just to get by right now, as you know. So I think we need to do some work to figure this out, and I hope I can be helpful with that.

Well, this has been wonderful. Thank you so much for coming down and listening to Rob, and great questions. Please come up and visit us at Verily on the 12th floor here in Cambridge if you want to continue the conversation. This was a great experience for us, and I hope to replicate it in the future. Thank you.