- Thanks to all of you for coming to Open Question. This is our fourth Open Question event, as I think was mentioned, and the last in this particular series. But as you know, we are trying to tackle wicked problems in these events and discussions, problems that are resistant to any easy solutions. And human health and disease is obviously one of the oldest and most challenging, and one we'll probably be working on forever, just because of the nature of the human body and how complicated it is.
So tonight we're gonna be a little more solutions-oriented as we dig into this question: when does tech make us healthier? And just to focus that question a little more, we're not gonna be talking about all of healthcare technology, because that's a huge field, and we all know about hospital treatments we've had, like MRIs and CT scans and laparoscopic surgeries. Tonight instead we're gonna look at more personal health technology, things that we might have with us every day that help us to improve our health and wellbeing. I mean, how many of you have your step counter, right, your phone on you counting your steps, or a Fitbit, or something like a sleep tracker? These are kind of the beginning of this kind of technology, and everyone has taken a home COVID test, which is another way of thinking about a technology you might carry with you. And so these are old news.
And so I think tonight we're gonna be hearing from panelists who will show us some of the latest advances in personal health technology, including things like a prosthetic hand that is beginning to feel almost like a real hand, a tattoo that can respond to your blood sugar levels, or a vest that can sense anxiety and help you to calm down. So with that, I'm gonna invite our distinguished UC Davis panelists to join me on the stage, and if you can give them a round of applause as they come on out. So I'll do a quick round of introductions. Just to my left here is Alyssa Weakley. Alyssa is a clinical neuropsychologist and an assistant professor of neurology at UC Davis School of Medicine. She studies dementia and cognitive rehabilitation, and she helps people with dementia and their families by developing remote caregiving systems.
And then next is Katia Vega. She's an associate professor of design at UC Davis, and she leads the Interactive Organisms Lab there, working on the integration of cosmetics with electronics, makeup with biosensors, and sustainable interfaces. And then to her left is Jonathan Schofield, an assistant professor in the Department of Mechanical and Aerospace Engineering at UC Davis. He studies human integration with assistive devices, and his group is collaborating with clinicians and engineers at Shriners Children's Hospital and UC Davis Health to develop intelligent devices like prosthetic limbs and cooperative robots. And then finally we have Gözde Göncü-Berk, an associate professor of design at UC Davis.
She leads the WEAR Lab, W-E-A-R, and they are pioneering wearable technology design and smart textiles research. Their developments have included clothing that responds to anxiety, therapeutic smart gloves for pain management, and textile stretch sensors to track body position. So I think now what I'll do is invite each of the panelists to give a brief introduction, a few minutes about their research, and I'll pass the clicker to Alyssa here and she can get us started. - All right, thank you very much for having me. I am Alyssa Weakley.
I'm an assistant professor in the Department of Neurology at UC Davis. I'm a neuropsychologist, but I've also been trained in gerontechnology, which is the design and testing of technology for older adults. And I'm particularly interested in creating technology solutions for older adults who want to remain independent in their own homes while getting some assistance from family members who are living at a distance. So rather than me talking more about my research, I think it's more impactful to have one of my participants explain a little about how it's changed her life thus far.
Maybe... - This really was working like 20 minutes ago and... - But it's fitting, because whenever you have a tech talk, you have to have failures in technology. It's actually mandatory, so it's perfect. Can anybody up there advance this slide? - There we go. - There we go. - I am June Atton, and I have three problems due to a deficit of oxygen during surgery many, many years ago. - Okay, that's it.
- iCare is really designed to help the individual who has the cognitive impairment and their family members, who become remote caregivers. iCare allows them to have conversations with their loved one in their own home, and allows them to set reminders and calendar entries, and helps them interact more in their life. Within the individual's home, what we have set up is an all-in-one computer system. That computer stays on at all times and sits in a prominent location within their home.
This helps with preserving that memory and being able to look back at it later on. In addition to iCare, we have partnered with researchers at UC Merced who have developed vibration sensors that unobtrusively collect information about activities within a home, without any video or audio. These vibration sensors allow us to determine a person's walking pattern, the activity level within the home, and how often they come and go. And then we're working with UC Davis researchers to help us put meaning to that information.
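The step those sensors perform, turning raw floor vibration into walking information, can be sketched in miniature. The following is a hypothetical Python illustration, not the actual research pipeline; the sample rate, threshold rule and synthetic signal are all invented for the example.

```python
import numpy as np

def detect_steps(signal, fs, threshold=None, min_gap_s=0.3):
    """Detect footstep impacts in a floor-vibration signal by simple
    peak picking: flag samples whose magnitude exceeds a threshold,
    keeping at most one detection per plausible step interval."""
    env = np.abs(signal)
    if threshold is None:
        threshold = env.mean() + 3 * env.std()  # crude adaptive threshold
    min_gap = int(min_gap_s * fs)
    peaks, last = [], -min_gap
    for i, v in enumerate(env):
        if v > threshold and i - last >= min_gap:
            peaks.append(i)
            last = i
    return peaks

def cadence_spm(peaks, fs):
    """Estimate walking cadence (steps per minute) from peak indices."""
    if len(peaks) < 2:
        return 0.0
    return 60.0 / (np.diff(peaks) / fs).mean()

# Synthetic 6-second recording: one impact every 0.6 s over sensor noise.
fs = 200  # Hz (assumed)
rng = np.random.default_rng(0)
sig = 0.05 * rng.standard_normal(6 * fs)
for step_t in np.arange(0.5, 6, 0.6):        # 10 injected footsteps
    idx = int(step_t * fs)
    sig[idx:idx + 10] += 1.0                 # short impact burst

peaks = detect_steps(sig, fs)
print(len(peaks), round(cadence_spm(peaks, fs), 1))  # expect 10 steps near 100 steps/min
```

A real system would add filtering, per-home calibration and person disambiguation, but the shape of the problem, impacts in a noisy signal mapped to gait statistics, is the same.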
- I may not know that my mind has gone off the rails and I've said or done something, or gone somewhere, that I shouldn't have. And my understanding is that she's going to affect people around the world, and I am just grateful from the bottom of my heart. She's already helped me, and she's still in the beginning stages. - Very impressive work.
My name is Katia Vega, associate professor in the Department of Design and also a member of the graduate group in Computer Science. I'm here to represent the work of my lab and my group, who are over here; we work with wearable devices on different kinds of substrates and with different kinds of technologies. Our aim in the projects I will show you is to give people access to information they usually don't have: what is in our body fluids, like our saliva, our tears, our sweat. Some of these projects involve thinking about how we modify the body and how we rethink the different kinds of aesthetics we use daily. One of them is these tattoos that change color with your biodata. We were using glucose, sodium and pH biosensors, probably similar to the test strips you have already used at some point, maybe a COVID test now that we're more familiar with these kinds of biosensors, which change color and reveal information about yourself.
And instead of going to the doctor or having a lab exam, what if we could rethink how tattoos exist, in kind of a static way, and create dynamic ones whose colors change with your metabolism? This was one initial project, a collaboration with MIT and Harvard Medical School, that led us to rethink other kinds of substrates to incorporate these biosensors. And this is another project; it's actually a lipstick I'm wearing right now that changes color with your pH levels. This project is called Bio Cosmic, and we could think about how makeup in general, and we're calling this whole area biocosmetic interfaces, how your cosmetics could actually react and reveal information about your body, or the fluids you are interacting with. I could show you later if you want a fast live demo.
And my lipstick is actually showing my coffee, which is a little more basic, in its color right now. The goal of the lab is to rethink the different ways we could understand this kind of information, going for non-traditional wearables in the form of tattoos, a lipstick, or dental braces and their ligatures, or, as in the student work currently showing, animal biosensing and computing with a litter box. So: how we can take information from around us and have it revealed in colors or electronics, to have a continuous way to monitor your body. I'm happy to answer questions later. Thank you. - Hi, so I'm John Schofield.
I am an assistant professor in mechanical and aerospace engineering at UC Davis. And rather than try and get my research across to you in about three minutes or so, I thought we could maybe play a game and we could learn about it that way. So hands up. Can everyone put both hands up? Yeah. Okay.
Follow along with me. I want you to bend your thumb, pointer, middle, ring, pinky all the way back. All the way back. Okay, good. You can put your hands down. Now whether or not you realize it, what you just did was an incredibly complex systems control problem.
Okay? So what you were doing is you were listening to noises that your brain then interpreted as auditory instructions. Halfway through, I stopped talking, and you were taking all my visual cues and interpreting those. Halfway through, I also pulled my hand down and you continued; you were predicting what I was intending you to do. Your brain integrated all of that, projected it out on a map to your own body, and you synchronized all those motions in time with me, and ultimately with your predictions of what you thought I wanted from you. So if you stop and think about that type of control from your body for just a second, it is intense. And the reason I bring this up is because one of the things we care deeply about in my lab is how can we leverage that immense control that our brain has over our bodies? And how can we use that to interact with assistive devices when a limb is injured or perhaps profoundly impaired? Let's do one more game: both hands up. Count your fingers, put your hands down. How many of us counted 10? Yeah, that always gets a chuckle.
Anyone get 11 or 12? No. Okay, well if you got 11 or 12, you're either really bad at counting or you're one of about every 2,000 or so people who are polydactyl. And why polydactyly is really interesting is because you're born with an extra digit, probably extra toes as well, and they legitimately are extra digits. Your brain can control them the same as it would your fingers. And so you can play piano, type on a keyboard, imagine the possibilities.
But what's really fascinating about this is if you think about it, that means your brain or my brain houses the capacity to control things that we might never have been born with. And what's really cool about that is one of the patient populations we do a lot of research with in our lab is children that are born missing a hand. And the question becomes, how would a child want to engage with a prosthetic limb if they never had that hand in the first place? What sort of neurological and muscular activity do they have available so that we can allow them to effectively use prosthetic limbs? And so what I'm gonna show you here is just a sample of some of the work we do. This is a 14-year-old boy who was born missing a limb and my grad student in the background is prompting him to do different motions with his missing hand. We are using wearable non-invasive sensors and simply measuring the activity of the muscles in his stump. And what you see now, we're gonna say open your hand.
What you're seeing there is a child who has never had a hand; we are capturing and decoding his intentions with his missing limb and having a robot achieve those intentions for him. So this is really exciting. There's lots of things I'm excited to talk to you about today, and I'm really, really happy to be here.
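The decoding step John describes, mapping measured muscle activity to an intended motion, can be sketched as a toy classifier. This is a hypothetical illustration, not the lab's actual decoder; real systems use more channels, richer features and stronger classifiers, but the pipeline of training on prompted motions and predicting on new windows has this shape.

```python
import numpy as np

def rms_features(window):
    """Root-mean-square amplitude per EMG channel: a classic, simple
    feature for decoding motion intent. `window` is (samples, channels)."""
    return np.sqrt((window ** 2).mean(axis=0))

class NearestCentroidDecoder:
    """Toy intent decoder: store the mean feature vector for each
    prompted motion ('open', 'close', ...), then label new windows
    by the nearest centroid."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = np.array(
            [X[[yi == c for yi in y]].mean(axis=0) for c in self.labels])
        return self
    def predict(self, x):
        d = np.linalg.norm(self.centroids - x, axis=1)
        return self.labels[int(np.argmin(d))]

# Synthetic 2-channel EMG: 'open' drives channel 0, 'close' channel 1.
rng = np.random.default_rng(1)
def fake_window(gains):
    return rng.standard_normal((200, 2)) * gains

train_X, train_y = [], []
for _ in range(20):
    train_X.append(rms_features(fake_window([1.0, 0.2]))); train_y.append("open")
    train_X.append(rms_features(fake_window([0.2, 1.0]))); train_y.append("close")

dec = NearestCentroidDecoder().fit(np.array(train_X), train_y)
print(dec.predict(rms_features(fake_window([1.0, 0.2]))))  # expect "open"
```

In a real fitting session, the prompted motions would come from the clinician's protocol and the predicted label would drive the prosthetic hand's grasp controller.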
Thanks for having me. - Amazing introduction, John. Is the mic working? No, not yet. I have a red one. Yeah, I'll take the red one.
- Yeah. - Hi everyone, my name is Gözde Göncü-Berk. I'm an associate professor of design at UC Davis, and I direct the WEAR Lab, where we develop wearable technology that's soft and textile-based. My research focuses on the development of electronic textiles and textile-based circuits and sensors. And we have a very heavy emphasis on chronic conditions people have to live with, trying to find solutions to these real-world problems. So a little bit about my background.
I have a background in industrial design, educated in designing hard products, but I switched to soft things later on in life. I got a master's degree in clothing and textile design and a PhD in design later on. Oh, here, this is me and my lab. I forgot, oopsies. So I have a little video that's gonna show the behind-the-scenes of what we do on a day-to-day basis and a quick intro to two projects. - WEAR Lab at the University of California,
Davis is an interdisciplinary research and design laboratory directed by Dr. Gözde Göncü-Berk, dedicated to exploring the intersection of textiles, technology and the human body. Focused on wearable innovation, and with a strong emphasis on sustainability, inclusivity and human-centered design, WEAR Lab fosters collaborations across disciplines including design, material science, engineering and health sciences. The lab's research spans a wide range of applications, from performance-enhancing gear and biomedical wearables to speculative designs that challenge the boundaries of human-product-environment interactions. Through 3D printing, laser cutting and computer-aided embroidery, new ideas take shape fast. At WEAR Lab we develop next-generation wearable technologies that go beyond passive sensing, creating garments that interact, respond and empower. One example of this is CalmWear, a smart tactile sensory stimulation garment designed to assist individuals with anxiety disorders.
Research has shown that deep pressure stimulation can help regulate the nervous system. CalmWear brings this concept to life as a smart vest with dynamic inflatable bladders that provide reactive compression and tactile actuation without restricting movement. But what sets CalmWear apart is its intelligent textile integration, enabling it to sense and respond to the wearer's physiological state. Embedded within the fabric, electronic textile structures monitor vital signs in real time. Two-lead electrocardiogram electrodes track heart rate and heart rate variability. A stretch sensor monitors respiration rate. Stitched conductive tracks seamlessly connect these sensors to the novel detachable electronics developed in-house. Urinary incontinence affects over 200 million people worldwide, impacting quality of life, independence and mental wellbeing. Current solutions focus on containment rather than prevention, relying on absorbent products or invasive medical procedures that fail to provide real-time monitoring and proactive management. Another project from WEAR Lab is Privy, a discreet wearable undergarment designed for continuous, non-invasive bladder monitoring.
Using bioimpedance spectroscopy, Privy measures the body's electrical resistance to detect changes in bladder volume. Embedded embroidered textile electrodes track these changes in real time, feeding data into an advanced machine learning algorithm that accurately predicts bladder fullness. Privy tracks subtle electrical resistance changes as the bladder fills, transmitting data to an embedded machine learning algorithm that analyzes patterns and predicts bladder fullness with an accuracy rate exceeding 90%. This AI-driven approach minimizes false alerts and optimizes reliability.
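The pipeline the narration describes, impedance readings in, estimated bladder volume and an alert out, can be sketched with a toy calibration model. Everything below is invented for illustration: a simple linear impedance-to-volume fit stands in for Privy's actual machine learning model, and all numbers are made up.

```python
import numpy as np

# Toy calibration data: as the bladder fills, measured impedance drops.
# The linear relationship and all values are assumptions for this sketch.
rng = np.random.default_rng(2)
volume_ml = np.linspace(0, 500, 50)                          # known volumes
impedance = 80 - 0.04 * volume_ml + rng.normal(0, 0.5, 50)   # ohms, noisy

# Fit volume = a * impedance + b by least squares.
A = np.stack([impedance, np.ones_like(impedance)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, volume_ml, rcond=None)

def estimate_volume(z_ohms):
    """Estimated bladder volume (ml) from an impedance reading."""
    return a * z_ohms + b

def fullness_alert(z_ohms, threshold_ml=400):
    """Fire an alert when the estimated volume crosses the threshold."""
    return estimate_volume(z_ohms) >= threshold_ml

print(round(estimate_volume(64.0)))                 # roughly 400 ml
print(fullness_alert(60.0), fullness_alert(78.0))   # alert only at low impedance
```

The thresholded estimate is where the real system's machine learning earns its keep: posture, hydration and electrode contact all perturb the raw impedance, which is why a learned model rather than a fixed line is needed to keep false alerts down.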
Unlike traditional catheterization or bulky ultrasound devices, Privy is soft, washable and designed for long-term wear. - This is really wonderful, and really, really cool stuff to see, so thank you for sharing. On screen, the projects are all finished and all working well, and anyone who's ever designed or developed anything knows that there's a long road to get to a slide deck like this.
And so I wondered if each of you could share a little bit about what your challenges were along the way to developing these systems, and how you overcame them. And we'll just maybe go down the row again. - So every product starts with an idea, and that idea needs to be conceptualized, and you need to do some initial testing, and then you need to get preliminary data, and then you need to apply for a grant, and then you need to get the money. And only once you get the money do you actually get to start doing the research.
And that's a long road in academia, from the start of an idea to when you can actually start testing it. I would say my biggest challenge is just getting that funding; technology work is not cheap. We need a fair amount of data and a fair amount of funding to even get the basic idea off the ground.
And so as a young researcher who was formerly a clinician, getting started with that process was an uphill battle, and it still is sometimes. And I don't know if you're all aware of this, but there have been some changes at NIH, and if it was already challenging for researchers with these ideas in health technology to get funding when NIH was in full force, imagine what it's gonna be like in the future. So write, send letters, let people know how this is going to impact science and science development in the future. - I'll say I definitely agree. We're living in very weird times, and I would also like to acknowledge you guys coming here, supporting research, supporting health and wellness research, and trying to learn about it. That's definitely something we'll see in the coming years affecting the future of research in general. Going to the question of our own challenges in the lab: I usually tell my students to think about, or try to find, a science-based technology, and we rethink how to bring it closer to users. And that involves technologies we don't know anything about. That could be, for example, the case of biosensors: colorimetric biosensors, electrochemical biosensors.
And we had to learn from scratch, reading all these scientific papers in a design-plus-CS lab, how to make this happen. Our main goal is to think about ways to bring these technologies closer to humans. So that's our very first challenge: rethinking these kinds of technologies, which are usually used in very controlled settings, and how we could actually use them and put them closer to everyone else.
So that's, I will say, the very first thing, and it was also a very big shift for us. We started out working a lot with pure wearable electronics, and now we are moving into this kind of scientific exploration as well, and that brings all these challenges that, if you are in science, you have to deal with a little more: controlled experiments, all these ways to recover from failures. Plus also computing and machine learning, because everyone does it, of course. And there's the way we incorporate different disciplines: our research incorporates design to think about the human body, to think about, for example, sweat, or our skin color, or how the information I'm wearing right now on my lips reveals something about myself.
So it also comes with some privacy issues, which we discuss in our work, and with how we can make everyone aware of what it means to create these kinds of technologies. With all the challenges of running an interdisciplinary research lab, we get to live that very closely. Definitely. - No, thank you. I kind of want to hit on that point.
I think, maybe to not quite answer the question, one of the things that is challenging for the field, and something we've been very fortunate in the laboratory to be able to build, is the interdisciplinary aspect of it. Especially being someone formally trained as an engineer, you do have to recognize that you don't have all the answers. If I were to exist and operate in a vacuum, the scale of the challenges we're interested in addressing and tackling just would not be something we could take on.
And so this idea of being able to interact with clinicians, with neurologists and neuroscientists, with patients and actual users, and to have everyone come together and provide their lived experience and their perspectives, is so critical to being able to answer some of these large grand challenges. Which I guess is kind of the theme of what's going on here. And we're so fortunate to be in this Northern California region, where these are very real things we can do. So I don't know about challenges, but there you go.
- Yeah, I agree with everything that's been said so far, especially the interdisciplinarity and, as a designer, the need to educate myself on every project, every challenge we are tackling. Right now in the lab we are learning a lot about bone, what happens to bone when you're in microgravity, just to be able to communicate with collaborators or write a grant. There's that never-ending learning, which is kind of a challenge, but at the same time, I think, the motivation behind all of this. I guess challenges are what we have on a day-to-day basis until we have something that's working. Because what we start with is some sort of vision, a problem to solve, and then we build a prototype, we build a technology around it, and it takes I don't know how many failures, how many fixes.
Sometimes it takes years. Sometimes, if we're lucky, we'll have something working within a couple of months. But on the practical level, I think the main challenge my lab has is marrying two very different materials, soft textiles with hard electronics, and making them fit a very complex 3D structure, the human body, which moves, bends and stretches. Therefore we have to be very careful about how we connect electronic textiles with actual hard electronics.
And those connection points are our biggest challenge. But those challenges also give us new project ideas. Now we are looking into creating modular structures where we can take the hard components apart from the soft components, which solves other challenges we have, like washability and repairability, making things less disposable and more sustainable in the long run. - Yeah. Alyssa, I wanted to ask you about the computer system that you developed. We all know it can be challenging to develop a computer system that older adults can use comfortably.
And then you're working with a population that has dementia. So how do you actually get to the point where it's something those people are going to be able to use, and going to be willing to use? - Yeah, it's a really important question. So I have the background of being a neuropsychologist; I work with people with Alzheimer's disease all the time.
And I felt like, going into this problem, I really understood how their brains worked and how they would interact with my software. I also have a background in designing technology specifically for older adults. So I was like, I have the perfect combination of skills to design this amazing interface that they're gonna be able to pick up and use right away. And I went through and did the appropriate iterative design process. I developed something, I checked in with my potential consumer base, caregivers and individuals with dementia, I asked for their feedback, and then I iterated on that. And I did that three times, because that's what the literature told me to do.
And I got very high satisfaction rates, 90 to 100 percent across the board. And I said, fantastic, this is going to really change people's lives. I did a pilot, deployed it out into people's homes, and then people would come back and say, I can't quite figure out how to use this button, or, I don't really like the way this is displayed. And what I learned was that I needed to take a different approach and work one-on-one with each individual, really get to know the person, how they want to use it and how they naturally gravitate towards it. I paid very close attention to where their eyes go, where their hands naturally move. And I slowly figured out how to make it as intuitive as possible, so that somebody could have just picked it up off the street and could still learn how to use its different functions.
Not to say that we have the perfect solution, but I think that we're getting closer with each iteration that we do. But overall, I think working with people one-on-one and doing this very close design with the end users is incredibly important with any patient population that you're working with, but particularly with individuals who may not remember day to day how to engage with something that's new. - Yeah. One of the other anecdotes you mentioned was that
the initial version was like a tablet, and that you found out what they really wanted was a desktop computer with a mouse and a keyboard. - Yeah, that's right. So I initially designed something very sleek. I got a very large touchscreen tablet, and I was like, this is something that I would want in my house.
We could put it up on the wall, it can be a part of the home, it'll look like a little whiteboard, we can put a pretty frame around it. And I brought this to people's homes, and they're like, yeah, it's nice, but I don't think I'm going to use it like this. Because what is comfortable to me is a desktop, and I want a mouse, I don't wanna use my finger, and I want a keyboard, and not a little keyboard, a big keyboard. So I totally changed that.
And now our hardware solution is an all-in-one computer, because they also want minimal wires, and they want it to still look aesthetically pleasing so they can put it in their dining room. So it still has to look nice, but it had a big transition from where it started. But honestly, that's my favorite part of this research, which was surprising to me.
I thought it was going to be, you know, seeing people being more functional and more independent, which I love, don't get me wrong. But what's been really fun is the surprises I've gotten to uncover along the way, what I thought was going to be a great solution, and being surprised when it wasn't. And I think it's great that I'm learning from this amazing population. - So I wanna make sure we have some time for questions from all of you. If you haven't yet, do jot down a question, and we will go around collecting those, and then they'll hand me up a couple to ask the panel. Meanwhile, let's see, Jonathan, I wanted to ask: you talked before about how you work a lot on the relationship between the person who's getting the prosthetic and the prosthetic itself. And it seems like it's kind of about integration, integrating that thing almost into the self in some way.
Is that kind of an accurate way to talk about it? - Yeah, I think there are a few ways to look at how the field is progressing. And if you want to break it down and be very blunt about it: today, in my laboratory and a handful of others across the country and the world, we can trick someone's brain into believing that a robotic limb is a part of their body. They can control the limb by thinking about moving their missing hand. We can allow them to feel touch in their missing hand when their prosthetic is touched. And we can also allow them to feel movement when they're moving their prosthesis.
We can close all the loops that the brain is looking for and trick the brain into believing it's a part of the body. The challenge is that that's in a very controlled laboratory setting. When you go into the real world, the way you might engage an object, how you handle yourself, how you might be moving, the context changes; things are moving, things are dynamic, it's always shifting. And so actually integrating this machine that's strapped to your body, tricking your brain into truly integrating it as a part of the self, is an incredibly complex thing to do out in the real world, where you really need to achieve meaningful fidelity between yourself and the machine. The communication needs to be spot on. And so I think there are a few things.
Yeah, that's absolutely the goal. It would be amazing if we could truly provide a replacement limb and not a machine that is strapped to the body, but there's also a moving goal in terms of what the patient might actually want. Some people truly do want that, you know, especially for a hand or an arm.
Some people want to be functional, want to use it as a piece of themselves, control it like it's themselves. Some individuals don't want that; some are happy with a hook, or with a passive device that looks very convincing so they can go out for a social night with their friends. So it's exciting to think about bionic replacement body parts that function just like the real limb. But as scientists, as clinicians, and as everyone else involved at the forefront of this technology, we have to be very cognizant that at the end of the day there's a patient wearing this, and there's a patient motivating what it is we need to be pursuing.
So it's a very multifaceted area to be looking into; one goal is a limb replacement that's truly integrated, but another full subset of goals is what does the patient want, and what can they best engage with? - Yeah, that makes sense. Gözde, I wonder if you could talk a little more about the vest for anxiety. That's an interesting kind of device, and I wonder if you could tell us a little about how you went about testing it and what the response was, how it actually works with people. - Yeah, great question. So that was a project we completed during the COVID lockdown, so it felt really timely; we were all feeling anxious. I had had this idea in my head forever, and then I happened to find the funding and the right collaborators. What's special about the vest is that it provides deep pressure, deep tactile stimulation, to calm the nervous system down, but it doesn't do it in a static way like a weighted blanket.
It responds automatically to your heart rate variability and your respiration rate, and when it senses your anxiety levels are high, it responds by gently inflating a bladder on the torso, giving you almost a gentle massage or hug-like sensation. This dynamic pressure is also regulated based on how your vital signs change; it deflates or re-inflates and helps you calm down. It took a lot of failures to get to that point, but once we were there it was like, okay, we have this, so now how do we test it? I had a collaborator from the UC Davis School of Nursing with a background in neuroscience and psychology, and we developed a testing protocol together where we had to make people anxious on purpose.
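The inflate/deflate loop described here can be sketched as a simple hysteresis controller. The thresholds, the stress proxies (low heart rate variability, fast breathing) and the on/off bladder command below are assumptions for illustration, not CalmWear's actual control logic.

```python
# Hypothetical CalmWear-style loop: inflate when stress proxies trip,
# deflate only after readings recover past a wider margin (hysteresis),
# so borderline readings don't make the vest chatter on and off.
HRV_LOW_MS = 40     # HRV (RMSSD) below this suggests stress (assumed value)
RESP_HIGH_BPM = 20  # breathing faster than this suggests stress (assumed)

def control_step(hrv_ms, resp_bpm, inflated):
    """One pass of the loop: return the new bladder state (True=inflated)."""
    if not inflated and (hrv_ms < HRV_LOW_MS or resp_bpm > RESP_HIGH_BPM):
        return True   # apply gentle pressure
    if inflated and hrv_ms > HRV_LOW_MS + 10 and resp_bpm < RESP_HIGH_BPM - 4:
        return False  # wearer has settled; release
    return inflated   # hold current state

# Simulated session: calm -> anxious -> gradually calming down.
readings = [(55, 14), (35, 22), (42, 19), (48, 18), (55, 14)]
states, state = [], False
for hrv, resp in readings:
    state = control_step(hrv, resp, state)
    states.append(state)
print(states)  # expect [False, True, True, True, False]
```

The hysteresis band is the design point worth noting: without it, a reading hovering at the threshold would toggle the pump on every cycle, which is the opposite of calming.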
So our protocol involved having our participants give an impromptu public talk on a really difficult subject, or answer complex math problems under time pressure, while wearing the vest. We tested their vital signs with and without the vest to see its effects, and then we had them take pre- and post-surveys. So it was a very interesting learning experience for us, but not such a great experience for our participants, I must say. - Let's see, do we have any questions from the audience? I can take a couple of those here, I'm sure. Okay, that sounds great. Okay, let's see.
So this is for all the panelists: what are the challenges for your inventions under the current American healthcare system? Does anyone wanna take that? - I'll dive on that grenade. No, it's a contentious issue. It really boils down to this: an advanced prosthetic hand that can move all of its fingers and make different grasping patterns has tremendous potential for the individuals who use it. You can imagine that moving from a grasper or a hook to being able to make different grasping patterns means big differences in the way you can interact with objects in day-to-day life. Each of those hands, on the cheap end, is about $18,000, up to about $50,000, and that's just for the hand, not for the prosthetic fitting services, et cetera.
And now it becomes a question of insurance providers needing to be willing to cover it, which puts the burden of proof on the patient and the physician; they need to work together to prove these things are going to benefit them, and so on and so forth. So one of the things we're always aware of in our field is that, given the healthcare climate and the way healthcare services and assistive devices are covered from an insurance perspective, just because we might have the most advanced, beautiful hand that could do everything a hand could possibly do, it might not actually get covered. And so insurance considerations are something I don't like to think about, quite honestly, but I think we have to think about them when we're actually caring for patients. So, a bit of a rant, I don't know. Yeah,
- Let's see. Here's another one: how do you deal with user error, especially coming from non-scientist users who might not understand the underlying mechanisms? - I can talk about that. So I think that user error is very telling, and for me it says that I can redesign this in a more intuitive way.
How can I take the error that I just observed and modify the design so that that error doesn't happen again? And it needs to be done in a way that people can manage without training. That is something I've learned over time. So user error is, I think, a window into opportunity. Okay. - I may just go to, oh, okay, here's one about the tattoos, Katia.
So specifically they're asking, how long do the tattoos work? Does the sensory part of the tattoo have to be refreshed? - Refreshed? Oh, okay. Yes, that's a great question, because I didn't have time to explain in detail the tests and the evolution we went through with that project. We had the project as a proof of concept.
So unfortunately I cannot make a tattoo for you guys today. We were evaluating different kinds of biosensors that could possibly be used as a tattoo, some colorimetric, changing color, and some fluorescent, changing the intensity of the fluorescence. We were using an ex vivo model, that means pig skin, to tattoo, to understand how deep those inks could go. And the ink needs to stay in the dermis, interacting with the interstitial fluid, to provide that kind of information. In terms of durability or refreshment, there are of course tests that we always do. Even for the lipstick project or other projects we are doing, we always have to do reversibility tests: how many times can these colors change? Every time there is a different kind of information, let's say a change in your metabolism, you have some fluid on top of the tattoo or on the lipstick, and that could change its color.
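The reversibility testing mentioned here, checking how many stimulus-and-recovery cycles a color-changing sensor survives, could be sketched like this. The function name, numbers, and tolerance are hypothetical, purely for illustration, and not drawn from the actual study.

```python
def reversibility_cycles(recovered_signals, baseline, tolerance=0.1):
    """Count stimulus/recovery cycles a colorimetric sensor survives.

    recovered_signals: the sensor's signal after each cycle once the
    stimulus is washed away. The sensor counts as reversible while it
    returns to within `tolerance` (fractional) of its baseline.
    All values here are illustrative, not experimental data.
    """
    cycles = 0
    for signal in recovered_signals:
        if abs(signal - baseline) > tolerance * baseline:
            break  # the dye no longer recovers; reversibility exhausted
        cycles += 1
    return cycles
```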
So there is still a lot of research that needs to be done; going into living skin is probably the next step. Something that was, I would say, very revealing for our lab when we started the tattoo project was that, when we think about tattoos, you probably imagine a specific population, and I received hundreds of emails from people suggesting unexpected ways the tattoos could be used. I received an email from someone who for four years had been using these glucose devices, pricking themselves every time to get their information, and he was telling me, I would like to have a tattoo, I've never had a tattoo, but with a tattoo I could have information about my glucose levels. Or a parent telling me, I really need to know the glucose levels of my two-year-old child, who is diabetic, type two.
And that was very revealing for us: to think about the ways we modify the body, but now, with something so close and intimate to our body and our fluids, how that could be a way to control and monitor yourself and your body. So I think it gives us a lot to think about and rethink, and to keep pursuing in this research. Thanks for the question. - Sure. Yeah. So we're getting close to time here, so I'm gonna do just one more question, which each of the panelists can answer to wrap us up. I wanted to look forward a little bit: what is your vision for your research, or your hope for the problems you might be able to tackle in the next, say, 10 or 15 years, and what's inspiring you about the future? So we'll start with you.
- So with my technology, I want to integrate sensors and information from the environment and relay that information to the individual, to the people who love them, as well as to their healthcare providers. We want to capture this information, use AI and machine-learning algorithms to detect changes in their behavior and their patterns, and then predict what's going to happen in the future. My goal with this technology is to eliminate crisis-driven care, which is a huge problem in older adult populations in general, but particularly for older adults with cognitive impairment who are living alone, who are not aware of health-related challenges they may be having, and who may want to keep that information a little secret from their family members. So what ends up happening is that intervention occurs at a point of crisis, and I want to prevent that by employing this type of technology, among other things. I want to incorporate all of this technology to really help people live independently, with a good quality of life, with dementia into old age, and really delay the disease course. If we can get this technology into people's hands early in the disease, my vision is that decline will stop right there, and people will continue to be able to live independently despite changes that might be going on in their brain.
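The core idea described here, flagging departures from a person's own behavioral baseline before they become a crisis, could be sketched with a simple rolling-baseline check. This is only an illustration under assumed inputs; the sensor summary, window length, and threshold are hypothetical, not the lab's actual algorithm.

```python
from statistics import mean, stdev

def flag_behavior_change(daily_values, window=14, z_threshold=2.5):
    """Flag days whose activity deviates sharply from a rolling baseline.

    daily_values: one summary number per day (e.g. hours of activity
    from in-home sensors; the metric, window, and threshold are
    illustrative assumptions). Returns one flag per day after the
    initial baseline window.
    """
    flags = []
    for i in range(window, len(daily_values)):
        baseline = daily_values[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # A day far outside the person's own recent pattern is flagged
        # for follow-up by family or a care provider, before a crisis.
        flags.append(sigma > 0 and
                     abs(daily_values[i] - mu) > z_threshold * sigma)
    return flags
```

Because the baseline is the individual's own recent history, the same code adapts to very different people; a quiet day for one person may be an alarming day for another.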
- Yeah, one of the goals we have in the Interactive Organisms Lab is to rethink the way we understand our body and how we have access to our own information; right now, to understand yourself, you need to have other devices. So that's why we're trying to go one step farther, incorporating these kinds of biosensors that allow us to create a new symbiosis between the human and the device. And that involves more than just the group of biosensors we're exploring right now.
So we could think about hormones, we could think about other information we could surface, for example around mental health and cortisol levels. We have a lot of information in our bodies that is not revealed, and our goal is to rethink how we could have ownership of that information, to have more access to it and a greater awareness of ourselves in general. So ideally our explorations will involve new form factors, not just tattoos or lipstick or other cosmetics, as a way to have a better awareness of our own selves. - Yeah. Thank you. So I think, maybe a bit
of a sobering statistic for our field is that a child who is born missing a limb has about a 35 to 45% chance of simply not using a prosthetic limb. They'll leave the device they were prescribed sitting in their closet. And if you start to ask why that is, you find that children using current prosthetic limbs actually describe themselves as less functional when they're wearing the device than when they're not. And so I think where we're going is this: we haven't yet leveraged the boom we've seen in things like smart sensing technologies, things that use artificial intelligence and machine learning. And what we're discovering is that the children themselves are incredibly capable, despite the fact that they were born without a hand.
And so where I see us hopefully going in the next 10 to 15 years is developing systems that are more contemporary, that leverage modern technological approaches in a clinically informed and patient-informed way, such that we can, if not fix the abandonment problem outright, at least offer devices that are functional and provide some benefit if people choose to wear them. - Great, thank you. So thinking about the future: our work is to create wearables that not only monitor things about the body or the environment, but can respond and tell the wearer, the user, what to do with that data.
There's a really big emphasis on the actuators, the response portion of the wearables. And my vision is, rather than a passive garment, almost a living system that can respond to the body and the environment, change something about itself, or nudge the wearer to change something about themselves to tackle the problem. That's the long-term vision. Within the next couple of years, our main goal is to take some of our wearables to the clinical testing level. For the bladder-monitoring wearable, after a year of battle with the IRB, we got IRB approval, and we are hoping to test it at the Department of Urology by catheterizing people with spinal cord injuries, filling the bladder with liquid, then emptying and refilling it, and collecting impedance spectroscopy data with the wearable. So that's something I'm very excited about.
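The fill-and-empty protocol described here pairs known infused volumes with measured impedance, which suggests a simple calibration step: fit a line relating impedance to volume, then estimate volume from impedance alone. The sketch below is a generic least-squares fit under that assumption; all numbers are made up for illustration and are not clinical data.

```python
def fit_volume_model(volumes_ml, impedances_ohm):
    """Least-squares line mapping measured impedance to bladder volume.

    During a catheterized fill/empty protocol, each known infused
    volume is paired with an impedance reading; the fitted line then
    estimates volume from impedance alone. Illustrative only.
    """
    n = len(volumes_ml)
    mean_z = sum(impedances_ohm) / n
    mean_v = sum(volumes_ml) / n
    num = sum((z - mean_z) * (v - mean_v)
              for z, v in zip(impedances_ohm, volumes_ml))
    den = sum((z - mean_z) ** 2 for z in impedances_ohm)
    slope = num / den
    intercept = mean_v - slope * mean_z
    # Return an estimator: volume (mL) as a function of impedance (ohm).
    return lambda z: slope * z + intercept
```

A real analysis would use full impedance spectra across frequencies rather than a single value, but the calibration idea, known volumes in, an impedance-to-volume mapping out, is the same.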
I guess we just get really inspired by so many things. Microgravity, outer space: our recent collaboration there is a huge inspiration in the lab now. What happens to the body under those conditions? So we are learning a lot about bone and kidneys and kidney stones; monitoring those and responding to those changes are, I think, the next goals in the lab. Thank you. - Thank you so much to all four of the panelists for the really fascinating and inspiring work that you're doing, and for coming to share it with us here tonight. And so if we could get a round of applause for our panelists.