Using technologies from the gaming industry to improve medicine


(gentle music) - This is Stanford Engineering's
The Future of Everything. I'm your host, Russ Altman. The gaming industry is
huge and it attracts the attention of millions
of people, gamers.

The competitive landscape
of gaming drives innovation in hardware and software,
virtual reality goggles, motion detection and augmented reality where they project things
on top of real objects. These devices though are
not just for playing games. They have real promise as technologies to improve medical care. Bruce Daniel is a professor of Radiology and Bioengineering at Stanford University.

He studies how human computer technologies such as virtual reality goggles,
mixed reality projection or even motion detection
can improve medicine. He will tell us that the
technology is almost there for routine use in medicine, and now it's time to make good interfaces where we can train physicians how to best use these technologies. We're talking about mixed reality, augmented reality, virtual reality. You're a radiologist. What, Bruce, got you
interested in all these fields and their potential impact
on medicine in the future? - Well, two things; one is that if you're going to look at a virtual world, I can't think of anything more valuable for you to look at than pictures of the inside of your own body. The other part about it is
because for a long time, we had a very unique
facility here at Stanford that the former chair of Bioengineering, Dr. Pelc, had a lot to do
with, which was the MRT scanner.

It was an open MRI scanner
where you could get in there and you could do procedures on the patient while you were with the
patient in the MRI scanner. It was like a space shuttle. It was expensive. It kind of became hard to sustain. But it was amazing. I did more cases in
there than anyone else.

I really captured this idea
that wouldn't it be great if I could really intimately
use these pictures to help patients with procedures? Then I began thinking,
"Well, well when that, can not be sustained. How do we do something simular to that in a more cost effective way. "and I think this is
one of the solutions." - That's great, 'cause that gets to what
one of my big questions was, is there a problem here
that needs to be solved or do we have a solution
looking for problems? But it sounds like in your last answer that you actually see opportunities where medicine is not delivered
in the best possible way or surgery, medical care. So can you go a little deeper
into what are the problems that maybe we just accept as problems and don't realize some of
these could be actually solved? - Yeah. Well, so the most acute problems that I can think of are the ones in actual procedure delivery. So like surgery, for example.

I do a lot with breast cancer
and you might be surprised to know that if you have a
lumpectomy done as a woman, they get out the tumor about
three quarters of the time, maybe 80% of the time. But if you went to go
get your gallbladder out and the doc told you, "Well, we're gonna get it out almost all the time. But one out of five times or one out of four times, we're gonna have to redo it," you'd find a new surgeon. But this is the reality
that people deal with.

Surgeons are not doing this
because they're bad surgeons. It's because it's a very
challenging operation. Similarly, Christoph Leuze and Match Buddy are working on transcranial magnetic stimulation. A gradient device can be used to stimulate the brain quite amazingly; it's FDA approved to treat depression, but it only works in
about half the patients. Why is that? Well, one possibility is that
we're just not putting it in the right place every time we do the procedure every day for a month.

So if you could see which circuits you're activating while you're doing the procedure, maybe these things could
improve the efficacy of these procedures resulting
in fewer repeat procedures in a cost effective way. - Going back to the breast cancer, that means, if I'm
understanding you correctly, the main problem is
they can't find the lump that they're supposed to be optimizing, so to speak.
- Exactly. - That is supposed to be removed. Take us through how some of
these emerging technologies and a lot of them are based
on the gaming industry.

They're building these amazing devices, and not all, but a lot of this comes from the economies of scale of people who are playing games
and doing virtual reality. So how exactly... Could you walk us through what the surgeon's experience would be like, or how it would change, in using these technologies? - Well, what we've worked on initially is that the surgeon
puts this technology on and they look at their patient and the patient has had
a preoperative scan done say in my department, an MRI scan that is extremely accurate and it provides you volumetric 3D data. That data is processed into something that no longer looks
like a bunch of slices, but instead looks like
three dimensional models that maybe you 3D printed
almost or something like that. But then, instead of being a physical model you hold, it appears in your patient. So you can, right before you do surgery, do a virtual dissection
of the patient to see where it is you need to go.

Now there's a lot of
challenges to getting that to work to the level of precision that's really going to improve care, but that vision is very clearly possible. And you're absolutely right that it's all leveraged off
of the gaming technology that's coming out there. And frankly, kind of industrial engineering technology. So, a HoloLens device like I have here, one of these things you put on. So this device would never have... - Yeah, the headset. Put it on. - That device is extremely sophisticated and could never have
been built for the market of say 300,000 breast cancers a year, which is the most common cancer in women.

But even 300,000 a year
is not enough for this, but the 300 million
people who might be gaming in their basement, that's enough. That's how it gets built. - So how do the surgeons feel about this? I could imagine A, they're
excited because they want to get 100% of the lumps
taken out of course.

On the other hand, they
have a certain training and a certain workflow in their life and it probably did not involve, unless they were big
gamers as kids, headsets. How have surgeons responded to this and how might training have to change? - The surgeons have responded to this universally in favor. I think it's quite remarkable: we presented this a few years ago to the San Francisco breast surgery meeting.

And they looked at this and said, "This is gonna change things as much as the robots have changed surgery." And I think that if you're a surgeon, the last thing you want to do is have a conversation with your patients saying, "That operation we did a few days ago, well, we didn't quite get everything. We have to go back." They would really like to diminish that.

They'd also like to diminish
the fact that right now, their main strategy for avoiding that is taking out twice as much
tissue as they need to. In fact, they would like
to do a better cosmetic job for patients as well. They're really motivated, but the one challenge is
that the big early adopters on this are the neurosurgeons, 'cause the brain has always been an attractive imaging target and they have these million dollar rigs of things they put in the OR to do stereotactic surgery, et cetera. They're onboard already.

But a breast surgeon has to be able to do like eight operations a day. How do we make this work reliably and quickly, so that when you just put it on, you see what you need to see without having to click through a bunch of menus and log in and whatever, all this other stuff that using a computer seems to entail these days? We need to get to the point where this is as easy
as putting a stethoscope in your ear and listening to the patient. It has to be so easy that
when you put it on your head, you just see what you need to see. - So this is the traditional
user interface issue that sometimes engineers
who don't immerse themselves in the actual use cases can get wrong. Which leads me to another question, which is how good is our hardware? Is the hardware ready to do...

It's ready for gaming, but is it ready for serious physicians doing
life and death procedures? Or are we needing a few
more years of development? Do you, as a physician get to impact that development at all, or you're simply at the mercy of whatever the gaming industry needs? - It depends on your
application, I would say. A thing like the transcranial
magnetic stimulation, the device creates an effect that's around a centimeter or two in size. So you don't have to be
exactly that accurate because it's not gonna matter. But if you're putting in a
neurosurgical stimulator, you want to have
it within a millimeter or a fraction of a millimeter in order to hit the right nucleus and
treat somebody's Parkinson's or whatever you're trying to do.

It depends on the target. We're trying to go for low-hanging fruit that are big procedures that we're interested in; orthopedics is one we're interested in. The breast cancer surgery one...

The breast cancer surgery one, they're gonna take out a
rind of normal tissue anyway. That one's a pretty low hanging fruit. We have a project on
acetabular impingement syndrome where we gotta get down to
the millimeters of shaving. I'm not sure we're accurate
enough for that quite yet. Another big problem is that
these devices are interesting.

They're very compelling actually. And they really show you something that looks three dimensional and the history of that is fascinating. But what's interesting is that
you and I will put this on and ostensibly see some
virtual heart in front of us.

But where you see that heart and where I see the heart with the same headset might be slightly different, so we have to figure out how to make this virtual object appear in the right place. They have a fixed focus out at a couple of meters, but I wanna work at hand's distance. So we gotta fix that.

The last one we need to fix is some of the depth cues are not quite right yet. You would like to be able to perceive something as being not just in front of you, but inside of another object. And that causes some misunderstanding in your brain as you look at sort of like a surface and you try to reconcile
that I see an object, that's a surface, and I see another object that is supposed to be
behind that surface, but how do we render it so
that it really looks convincing that it's back there and
not floating in front of it? - That's fascinating 'cause we don't even do much of that in
real life, if you will. In other words, I'm thinking about when I'm swimming around, snorkeling, the 3D depth perception
becomes a challenge. That isn't a challenge when I'm in a normal terrestrial situation. So here, we're almost
having to train the surgeons for a new way of viewing. Is the resolution of
these goggles good enough? - The question you have there is... I think the resolution will improve a little bit, but the resolution's actually pretty good there.

I think that at some level, not all are using these goggles; some are using the microscope itself and then trying to overlay the images into that. But the images that we take, we think an MRI scan is super high resolution, but it's about half a millimeter, if pre-operative imaging is what you want to superimpose. - Before we leave this
idea of the visualization of body parts, basically, and we've been talking about surgery, are there other applications of this that you're excited about? So you're a radiologist and you're looking at MRIs and CTs all the time. Do you imagine you'll
start looking at those in a more augmented way, or is it primarily the surgical
impacts that we're looking at? - Well, no, I think... We are responsible for every voxel in the data set. Really, the best way to interpret every little bit is by slicing through it, and that's why we continue to do this despite 3D renderings having been available for a while on a computer screen.

But I think that more
and more regular doctors are really going to be using this. And the model that I see is gonna change from where we have a
report that we dictate and then when you go see your doctor, they're not even looking at you, they're looking at their
computer over here, reading my report and trying to figure out what's going on with you. And now it's gonna change where...

The device I showed you is gonna turn into something like this and they're gonna see this
museum gallery approach that's gonna come up. They're gonna look at you. They're gonna look at your
knee when you come in, 'cause your knee hurts.

This is one that my friend, Brian Hargreaves, who co-directs the lab, came up with. They're gonna look at your knee and you're gonna say, "It
hurts over here, doc." And they're gonna look at you and say, "Yeah, I can see inside you.

"That's right where your meniscus
is torn right over there." And it correlates perfectly. Or, "Oh, no, no, that's
a burst over here." I think the long term
vision is it's gonna affect the doctor patient relationship in many, many ways and make it better. The doctor's gonna pay
attention to you instead of that computer screen over there.

- That is super. As a regular doctor
myself, and by the way, I love that term, 'cause that
perfectly describes me. I love that idea because, for example, as a general internist, I don't
always think about anatomy. Every 10 years I take a test where I have to review my anatomy. But if I'm having some kind of display, especially if I can
still make eye contact, that was very important,
what you said about, it'll be more like a pair of glasses because I could scare my
patient if I walked in with one of those things
that looks like binoculars, but if it's much more subtle
contact with the patient, but I have visual assistance reminding me where the muscles are, how
to do the shoulder exam, how to do the knee exam, 'cause I'll do one of those
every week or two weeks. I don't do 'em enough
so that it's automatic.

That could really be
transformative in terms of the ability to improve
the physical examination. And as you know, many doctors are mourning the loss of physical examination skills and this could really pump that back up. - I had a case yesterday, I spent about an hour with
one of the gastroenterologists 'cause this patient was saying, "Every now and then I have this lump that shows up in my abdomen."

And she even had pictures
she had taken of this. And we were trying to figure out where this lump is compared to what's on the scans that I saw. We were talking on the phone and on Zoom and trying to figure this out. But if he had had this ability to just look at the patient and say, "I can see that lump is," I don't know, your gallbladder, or that's your liver edge.

I mean, it would've solved it in an intuitive way that would not have required this long conversation. But I do think the radiologist
role is gonna change. We're gonna go from making
reports to being the one who sort of annotates
and paints these images and says, "In three
dimensions, look over here, "this is a blocked vessel or a
torn vessel or a tumor here." You're going to be looking at... We're already doing a little bit of that in the prostate cancer actually. It's interesting.

- This is The Future of
Everything with Russ Altman. More with Bruce Daniel next on SiriusXM Business Radio channel 132. Welcome back to The Future of Everything. I'm Russ Altman, and I'm speaking with Prof. Bruce Daniel
of Stanford University. In the last segment,
Bruce described the ways in which augmented reality
may improve surgeries, both their success rate
and also their efficiency. In this segment, he will tell us how these technologies will
motivate physical therapy patients to do their exercises, how it may help teams of doctors come up with better plans for complex patients and how it may help
patients manage anxiety about getting into MRI machines.

Sign me up for that. So we had a great chat about these methods for augmenting anatomy. But I know that you've also
thought about other things in terms of things like physical therapy. Can you tell me how do
these technologies promise to impact physical therapy? - Well, what's really interesting about these new systems that have come out is that they were designed with sensors on them that can do things like track where you're looking with your eyes, and they need that in order to provide a convincing image. But to let you interact with them, they found a way to track what your hands are doing, so they can actually understand the pose of your hands very well.

So all of a sudden... And they can track your
head and your body. So all (indistinct).

- Here's where we're benefiting
from the gaming industry with the dancing apps and the shooting and tennis and bowling. - But there are times when
these kinds of movements have to be done as prescribed by a doctor in order to help you get over a stroke or some other kind of chronic injury. And so one of our students last
fall came up with this idea of a hand app for someone who's had maybe a stroke involving their limb, where they have to move their hand. But what they wanted to
do is not just train them to be able to move their hand, they wanted to motivate them. They didn't want people just
moving their hand a little bit. They want people moving their hand the full extension amount.

What they did is they had
little virtual turtles coming across that could be
seen in three dimensions. And it looked like you're
swatting these turtles with your fingers. And what was interesting is
the turtles would only... it's like whack-a-mole, but those turtles would only explode into these pretty fireworks
if you had done a good job with your hand, as
opposed to if you'd done a poor job with your hand.
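The interview doesn't go into the app's implementation, but the reward-gating idea described above can be sketched in a few lines of Python, assuming a hypothetical hand-tracking feed that reports a finger's knuckle and fingertip positions each frame (the names and thresholds below are illustrative, not from the actual app):

```python
import math

# Hypothetical per-frame hand data: 3D positions (in meters) for a finger's
# knuckle joint and fingertip, as a hand-tracking SDK might report them.
FULL_EXTENSION_M = 0.09   # assumed knuckle-to-tip distance for a fully straight finger
REWARD_THRESHOLD = 0.85   # fraction of full extension required to trigger the fireworks

def extension_ratio(knuckle, fingertip):
    """How close the finger is to full extension: near 0 when curled, ~1.0 when straight."""
    return min(math.dist(knuckle, fingertip) / FULL_EXTENSION_M, 1.0)

def swat_rewarded(knuckle, fingertip):
    """True (show the fireworks) only for a full-extension swat, per the whack-a-mole idea."""
    return extension_ratio(knuckle, fingertip) >= REWARD_THRESHOLD

# Example: a nearly straight finger earns the fireworks; a half-curled one does not.
print(swat_rewarded((0.0, 0.0, 0.0), (0.085, 0.0, 0.0)))  # True
print(swat_rewarded((0.0, 0.0, 0.0), (0.050, 0.0, 0.0)))  # False
```

A real headset app would pull these joint positions from its hand-tracking SDK and, as Bruce notes next, also count repetitions.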

Then they could keep track
of how many you've done and they could kind of make it fun to do this otherwise sort of boring
exercise for the patient. I'm not sure it's there for everyone, but this general paradigm of
improving human performance, as opposed to just healthcare, is something that I think is really
a great opportunity here that's just barely being explored. - You did try it out
though. Am I wrong that-- - Oh, yeah. Yeah, we've tried this app.

We now have to get to the point of trying it out on patients. It was a goal that was one we knew we could do, but the body tracking stuff is exploding all over. It will be more than just your fingers. It will be how you move your whole body, I think. - I mean, I'm very excited about this because I'm at an age when
we won't go into details, but either me or many
friends and relatives my age are in the middle of
physical therapy basically because that's what happens when you get to be a certain
age and it is boring. And anything that the
physical therapist can do to motivate is super important, especially for these long
haul physical therapy plans that are months.

It's a rare patient
that has the discipline to stick to it without some help. And so that's extremely exciting. I didn't wanna miss the
chance to also ask you about telepresence and education.

What's happening in that area? - Well, one of the things
that the vendors have realized is that a killer app for these is that you and I might in the future, not just have this conference over a video Zoom link like this, but that you and I will actually be able to wear these headsets and then
I will see a virtual version of you and you will see
a virtual version of me and we'll be exploring
this same space together. And this is actually already possible now, as long as we have... if you're willing to accept that the other person looks
like a cartoon avatar. But they're even getting to the point, I heard recently, with some of the headsets where they can actually monitor your smile and your eye movements and your blinking and make it less uncanny valley and more realistic. But if we could do that
together with a model of a congenitally deformed
heart, and you're a surgeon and I'm a cardiologist and
he's a cardiac radiologist and we have this baby heart right here, and we could talk about how does the VSD need to
be closed in this patient? And here's the blood flow
dynamics from the 4D flow MRI and how much flow is going
out the pulmonary artery versus the aorta and how
big the shunt fraction is, and it must be going across here; what kind of closure or which kind of operation should we do?
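For readers unfamiliar with the term, the shunt fraction Bruce mentions is conventionally reported as Qp/Qs, the ratio of pulmonary to systemic blood flow, which 4D flow MRI can derive from the net flow in the pulmonary artery and the aorta. A minimal sketch with made-up numbers, not values from the interview:

```python
def shunt_fraction(qp_l_per_min: float, qs_l_per_min: float) -> float:
    """Qp/Qs: net pulmonary artery flow divided by net aortic (systemic) flow.

    Both inputs are net forward flow in liters per minute, as could be measured
    from plane contours on a 4D flow MRI acquisition. A ratio near 1.0 means no
    shunt; substantially higher values indicate a left-to-right shunt such as a VSD.
    """
    return qp_l_per_min / qs_l_per_min

# Hypothetical example values: Qp = 6.3 L/min, Qs = 3.5 L/min
print(round(shunt_fraction(6.3, 3.5), 2))  # 1.8
```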

We can have this virtual conference. Right now what we do is we
all get together in a room and have these multidisciplinary boards. But that requires physical presence, which is hard to achieve
with a big group of experts. You might wanna have the expert on this particular rare condition
who's from Texas join you. And join in...

And even for general education, we have a student here who has scanned some beautiful high-resolution cadaver dissections, and another student who's working with doctors in rural Kenya, at a rural Kenyan medical school. They don't have the resources
to have a cadaver lab, but their doctors with a
very simple device could be looking at the same
dissection performed by the expert anatomist from (indistinct) and who has spent his
whole career dissecting out the femoral cutaneous nerve
that I couldn't even show you. I think that this will democratize a lot of medical education in that way, and that'll be great for everybody. - This is super exciting, 'cause I don't think patients
realize, maybe they do, that when they have complicated cases, it very much becomes a team sport, as you said, there's all kinds of experts, including nurses and pharmacists
and they all get together. And the ability to have a
virtual meeting be not just almost as good as meeting in person, but much better. And especially when you can
have 3D models of the organs or the malfunctions that are being...

that's a very exciting
model for both efficiency and improved quality. How far away are we from all this? - At Case Western, which is one of the leading medical schools in the country, the best one in Ohio, they got rid of their anatomy cadaver labs in favor of having virtual reality content like this delivered. They've actually transformed. - So a Case Western-trained physician may have learned
anatomy in a virtual world.

- Yes, absolutely. - And they know what
they're talking about? - They do, and maybe better
than the cadaver world. I mean, my experience with anatomy, and maybe you had this too, was that when we did our dissections, we would go in there and
try to find these things.

And then we would leave,
and then overnight, the prosector, who in our case had come from Heidelberg, Germany, would come and fix up all these things that we had made hamburger out of. And you go in the next
day and you're like, "Wow, it looks beautiful." (Bruce laughing) - We were in charge of putting pins where we thought various things were. And you're right, the next morning, all the pins would be moved
to the right location. - Exactly. And yet now we'll have
really this great expert and also you can take it apart, but you can put it back
together if it's virtual.

You can actually see the
relationships in so many ways. - We have only a couple minutes left, but there's one other area
that I wanted to ask about, 'cause it's a great topic. It's MRI and fear. I mean right off the
bat, when you say MRI, many of us shut down because we've been in those tiny little
tubes, trying to be still, worried about whether our
saliva is gonna choke us because they told us not to move our head and try not to swallow. But it sounds like there's
some virtual reality or augmented reality
solutions for MRI phobia.

- Well, a lot of phobias are being tackled with virtual reality, actually. Tom Caruso in the CHARIOT program here and Sam Rodriguez at the Children's Hospital have been trying to help kids get over
phobias by distracting them with video games in virtual
reality, for example. But our idea was, let's make the experience of an MRI not something that's unfamiliar to a child who has to have an MRI and then freaks out when they get stuffed in the tube and the gradients start banging. Let's give them that experience. And so they are guided through
in a really encouraging way that kind of incrementally increases what the experience is like, with a thing that's as simple as a Google Cardboard that you stick your phone in and they look at it. It may be at their home and
they lie down on their bed and they can pretend while
they're guided through to see what it would be like
to be in the MRI scanner and to hear those sounds.

What's really neat about it is that because, again, it's monitoring their head position to make it work, we could tell whether they're moving or not. And so we could actually show them, "Your picture's pretty good, but if you held a little more still, the green ball would stay green and then the picture would look even better." And so the idea is that we'll actually be able to train kids. One of our students presented this at the ISMRM conference
just a couple months ago. I was very excited about this possibility, that now you could help people prepare for what might be a scary medical procedure. - It's great because you
describe it as a thing initially for children, but many of us
who have had MRIs are thinking, "I need this."
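The interview doesn't describe how the stillness feedback is computed, but the green-ball idea Bruce mentions above could look something like this minimal Python sketch, assuming head positions sampled from the viewer's head tracking (the threshold and units are illustrative assumptions):

```python
import math

STILLNESS_MM = 2.0  # assumed threshold: maximum head drift (mm) allowed to stay "green"

def head_drift_mm(positions_mm):
    """Largest displacement (mm) of the head from where it started in this window.

    positions_mm is a list of (x, y, z) head positions sampled from the viewer's
    head tracking while the child practices lying still.
    """
    start = positions_mm[0]
    return max(math.dist(start, p) for p in positions_mm)

def ball_color(positions_mm):
    """Green while the head stays within the stillness threshold, otherwise red."""
    return "green" if head_drift_mm(positions_mm) <= STILLNESS_MM else "red"

# Example: a sub-millimeter wobble stays green; a 5 mm shift turns the ball red.
print(ball_color([(0, 0, 0), (0.5, 0.2, 0.1), (0.8, 0.3, 0.0)]))  # green
print(ball_color([(0, 0, 0), (2.0, 3.0, 4.0)]))                   # red
```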

I mean, it's such an
obvious idea to give me a little heads up about
what it's gonna be like, because most of us just
talk to our friends and say, "How bad was it?" And then of course, you
get whatever story you get. And the idea that there would be a prep, we all prep for various
procedures and various visits, and why not prep everybody for their MRIs. It's obvious that this will
have applications for adults. You just probably have to make whatever the distraction is, or whatever the game is... it has to become adult-oriented. But that doesn't sound
like it would be very hard. - Yeah. I mean, I think what would really be best would be combining it with this telepresence thing. People want the human connection.

If we could have somehow the technologist who's going to run your
MRI scan magically appear in this virtual training experience that a child or an adult is having, they would even begin to understand that there's gonna be another person there with them who might not
be in the room with them, but is listening to them or who will be in the
room at certain times. And they'll be able to understand
how to remain connected and that will alleviate a lot of anxiety. And that's my hunch.

I mean, it's unproven, but that's my hunch. - Thanks to Bruce Daniel. That was the future of
augmented reality in medicine. You have been listening to
The Future of Everything with Russ Altman on SiriusXM
Business Radio channel 132. (gentle music)

2022-07-27
