"Smart Wheelchairs for People with Rare Diseases" by Dr. Edward Grant

Hello everyone, my name is Dr. Edward Grant, here at North Carolina State University, and today I'm going to tell you about our smart wheelchair research for people with rare diseases. As I say, my name is Edward Grant, but most people call me Eddie; I'm Scottish, and now a US citizen. My education, both in training and at college, was in mechanical engineering, for both my bachelor's and master's degrees, and for my PhD I worked on the AI topic of machine learning control. I worked at two universities in Scotland, and while I was at Strathclyde University I was very much part of the team at the original Turing Institute in Glasgow, where for a period of time I was acting director of research. I'm now a Professor of ECE at North Carolina State University, and I'm also a professor in the joint Department of Biomedical Engineering, shared between UNC Chapel Hill and North Carolina State University. I direct the Robotics and Intelligent Machines group here at NC State, and from 2009 to 2019 I was a visiting scientist in the Advanced Robotics department at the Istituto Italiano di Tecnologia (IIT) in Genova, Italy. This was one of the catalysts for the research I'm going to talk about.

My colleagues at the IIT in Genova negotiated a contract with researchers from the Fondazione Sanità in Rome. These clinicians dealt with patients with amyotrophic lateral sclerosis, or ALS, what is called here in the United States Lou Gehrig's disease, and they dealt mainly with patients with locked-in syndrome: essentially, patients who can only move their eyes, that is how far the disease has progressed. The clinicians wanted my colleagues at the IIT to develop user interfaces that would allow these patients to communicate with their outside world.

Here you see one of the slides developed by my colleagues at the IIT, showing some of the research interfaces they had built. The interface on the right is a keyboard projected onto a computer screen, with eye tracking done using a Kinect camera; with this arrangement a patient can in fact type words and sentences onto the computer screen. On the left is a much more comprehensive user interface for the patient, where an EEG sensor system reads the signals being generated by the brain to carry out specific tasks. What my colleagues were looking to develop were empathetic and expressive technologies for people with ALS, but they had to be technologies that would allow these patients to communicate with their outside world.

Having watched their research in operation over a few months, I was very impressed, and I wanted to emulate some of the work they had completed, but I wanted our group to use a wheelchair. How could we develop a smart wheelchair for patients with rare diseases like ALS, muscular dystrophy (MD), and multiple sclerosis (MS)? How could we develop a wheelchair control architecture that would allow us to customize sensor technologies to the individual patient, allow them to plan safe indoor navigation and localization, and, perhaps down the road, use cloud-based communication to develop machine learning controllers? I have always believed that you don't leave everything to the robot. The wheelchair is an autonomous robot, but putting all of that computing power on the robot is not what researchers should aim for. We should have simple architectures that allow us to move the data collected from the patient and the wheelchair off-board, and to do the computation related to those sensors and that data elsewhere, to give us smarter control within a building.

Here's our wheelchair. It was donated to us, and its control system didn't work, so this is when we decided that we could do better: we could develop a much more universal hardware control architecture that would allow us to plug and play our sensors if and when we needed them for specific patients. To begin with we used a 2D LiDAR sensor system and a Kinect camera, and once we got the controller operational we were able to conduct navigation and localization using standard algorithms that are in everyday use with autonomous vehicles these days. Originally we used Raspberry Pis, but because the architecture was evolving, down the road we wanted an optimized architecture, one that was fast and accurate, and the Raspberry Pi was not doing it for us. Here are our architectures; the diagrams show how the components were connected and what the function of each was. In the overall wheelchair hardware you can see that we changed the second Raspberry Pi, here at the end, to NVIDIA technology, which we knew could do fast processing of image data. You can see here also that we used the Robot Operating System (ROS), the standard for robot systems these days. It was all very straightforward, nothing complicated, nothing fancy; this was our universal architecture for building our universal controller at a higher level. At some point we were hoping to be able to move to the IBM Watson cloud, but that would require additional technology, so in these early experiments we concentrated on localization and navigation for the autonomous vehicle inside a building.
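The talk doesn't include the architecture diagrams themselves, but the plug-and-play idea can be illustrated with a minimal ROS sketch: each input modality is wrapped in a node that publishes a standard velocity command, so sensors can be swapped per patient without touching the drive side. The topic names, rates, and stop threshold below are my assumptions, not the group's actual configuration.

```python
#!/usr/bin/env python
# Minimal sketch of the "plug-and-play" idea: every input modality
# (LiDAR, voice, EMG/IMU, ...) is wrapped in a node that publishes a
# standard geometry_msgs/Twist on /cmd_vel, so sensors can be swapped
# without changing the motor-control side. Topic names, speeds, and the
# 0.5 m stop threshold are illustrative assumptions.
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

class SensorToCmdVel(object):
    def __init__(self):
        self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/scan', LaserScan, self.on_scan)

    def on_scan(self, scan):
        # Placeholder policy: creep forward, stop if anything sits
        # closer than 0.5 m in the beams straight ahead.
        mid = len(scan.ranges) // 2
        ahead = min(scan.ranges[mid - 10:mid + 10])
        cmd = Twist()
        cmd.linear.x = 0.0 if ahead < 0.5 else 0.2
        self.pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('sensor_to_cmd_vel')
    SensorToCmdVel()
    rospy.spin()
```

The design point this illustrates is that the motor controller only ever sees /cmd_vel, so a LiDAR node, a voice node, or an EMG node can be plugged in interchangeably for a given patient.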
Here I show you a video on simultaneous localization and mapping (SLAM), which is a standard activity for autonomous vehicles these days. My students Abhijit and Lakshay, whom you see here, conducted the experiments. They got the original architecture working, then loaded up the algorithms and got them working, and this wheelchair, as you can see, could in fact navigate the corridor. They did some extensive experiments where dynamic objects suddenly came into the navigational path of the robot, and it was able to compensate and navigate around a dynamic object. So we gained quite a bit of experience, and quite a bit of confidence, that the architecture was slowly but surely being optimized.

Our next activity was one suggested to us by an ALS patient, and it was voice control. His concern was that at some point he would lose his ability to speak, and even though he would lose it eventually, he wanted the system to still be able to navigate as his voice and accent deteriorated. This is when my students, along with my sponsor at IBM, played a joke on me. We took samples of data, we found a database with a huge number of accents, and we trained the system to follow voice commands. Where the joke was played was at a demonstration of voice control: my sponsor at IBM went behind my back to the students, and they played this joke on me. I arrived at the demonstration with the sponsors there, I asked the students to do some voice command control, and the wheelchair sat there like a rock. Then they asked me to give the wheelchair a command, and the wheelchair followed my commands. What they had done was train the wheelchair's voice control on 1,100 Scottish accents; the wheelchair only responded to my accent.
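The talk doesn't describe the voice model beyond the fact that it was trained on an accent database, so the following is only a minimal keyword-spotting sketch that substitutes the off-the-shelf SpeechRecognition package for their custom accent-trained model; the command phrases and velocity values are assumptions.

```python
# Minimal keyword-spotting sketch of voice control. This is NOT the
# group's accent-trained model; it uses the generic SpeechRecognition
# package, and the phrase-to-velocity table is an assumed placeholder.
import speech_recognition as sr

COMMANDS = {
    'come to me': (0.2, 0.0),   # (linear m/s, angular rad/s)
    'stop':       (0.0, 0.0),
    'turn left':  (0.0, 0.5),
    'turn right': (0.0, -0.5),
}

def listen_for_command(recognizer, mic):
    """Capture one utterance and map it to a velocity pair, or None."""
    with mic as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        text = recognizer.recognize_google(audio).lower()
    except (sr.UnknownValueError, sr.RequestError):
        return None  # unintelligible speech or recognizer unavailable
    for phrase, cmd in COMMANDS.items():
        if phrase in text:
            return cmd
    return None

if __name__ == '__main__':
    r, mic = sr.Recognizer(), sr.Microphone()
    while True:
        cmd = listen_for_command(r, mic)
        if cmd is not None:
            print('velocity command:', cmd)  # would be published on /cmd_vel
```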
Here's a video of the wheelchair moving from the corridor towards me as I give it commands, just asking it, essentially, to come to me: "Come to me, come to me, come on, you can do it, you can do it, come on, past the door, come on, that's it, very good, come on then, come to me, come on, keep coming, yes, it's me that's talking to you, it's your Scottish friend, he wants you to come to me, come to me, come on, yeah, that's it, steady, okay, let's go. Hey wheelchair, hey wheelchair, hello, hey wheelchair, don't listen to him, come to me, come on, you can do it, come to me, come on." We do have some fun at times.

So now we were getting real experience with this system we had created, and then the pandemic got in the way, so we had to be creative with our students in trying to progress the work we were doing. Some of the easiest sensors to work with are EMG sensors, which can pick up muscle activity, and we worked on a system using two EMG sensors connected to the wheelchair. We also needed an IMU sensor in order to achieve the control that we wanted. If the person could move their head and could activate the EMG sensors, then our desire was that the individual, with this hardware architecture of ours, could navigate in tight spaces, and this was one of the criteria for developing the architecture in the first place: how easy was it for the individual to control the wheelchair, and how fast could the wheelchair respond to any control action? Here is a video showing how Chris Han, one of my students, adapted the EMG sensors on his right and left arms and used an IMU attached to his ball cap, and he could navigate this basement of his, in tight spaces, showing good control and fast responses.
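The talk doesn't spell out the actual EMG/IMU control mapping, so here is one plausible sketch, assuming left/right arm EMG bursts steer the chair and head pitch from the cap-mounted IMU gates forward motion; the thresholds, the deadband, and the both-arms emergency-stop convention are all hypothetical.

```python
# A sketch of one plausible EMG + IMU mapping; the talk does not specify
# the real scheme. Left/right EMG bursts steer, head pitch sets forward
# speed. All thresholds and conventions here are hypothetical, and the
# caller is assumed to supply normalized sensor readings.
EMG_ON = 0.6        # normalized muscle-activation threshold (assumed)
MAX_SPEED = 0.3     # forward speed cap in m/s (assumed)

def wheelchair_command(left_emg, right_emg, pitch_deg):
    """Map two EMG channels and head pitch to (linear, angular) velocity."""
    # Head tilted forward beyond a 5-degree deadband -> go; else stop.
    linear = MAX_SPEED if pitch_deg > 5.0 else 0.0
    # Bursts on both arms mean emergency stop; one arm steers.
    if left_emg > EMG_ON and right_emg > EMG_ON:
        return (0.0, 0.0)
    if left_emg > EMG_ON:
        return (linear, 0.5)    # arc left
    if right_emg > EMG_ON:
        return (linear, -0.5)   # arc right
    return (linear, 0.0)

# Example: head tilted forward, right-arm burst -> forward-right arc.
print(wheelchair_command(left_emg=0.1, right_emg=0.8, pitch_deg=12.0))
```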
So this is pretty much where our research is at the moment. We have been donated another wheelchair that we're going to convert, similar to this one here, so we will have cooperating wheelchairs if we want to go that way. We want to use some of the mote technology that we developed way back for colonies of robots that would communicate and collaborate with one another. We want the wheelchair to be able to collaborate with the person, and we want the wheelchair to be able to collaborate with the building, to give it a great deal more autonomy, to give it more smarts in changing environments. The second wheelchair will allow us to do some of the future research we intend, and we can share this with you: if anyone is interested, we can share our hardware architectures, and we can share how to connect a system like ours and conduct experiments with smart wheelchairs.

These projects don't run with just one person. I have to thank our colleagues at the IIT in Genova, in particular Leonardo and Jacinto, and Nikhil and Nicola, two former students of mine here at NC State. I had a terrific group of students who have all contributed to the project, and you see their names here, along with John Hummel: John has muscular dystrophy, and he was one of the first people I came across who wanted to participate, and his family wanted to participate. My friends over in Industrial and Systems Engineering helped greatly by designing and building the boxes, with 3D printers, so that we could make good, compact architectures and housings for our hardware. And of course my sponsors at IBM, Dr. Andy Rindos in particular, and the help given by Anand Singh, who helped with the machine learning aspects of the IBM work. Thank you to everybody who participated, and thank you for listening. I hope to see you at the Liberate 2021 workshop, which will be held online. I'll be there, my students will be there, and we'll be able to answer any additional questions you have in relation to the research and the systems we've put together.
