Living, Learning and Creating with Social Robots
So welcome, everyone, and thanks for coming. Today I'm extremely excited to be introducing Cynthia Breazeal from MIT, where she's the director of the Personal Robots Group at the Media Lab. Cynthia is one of the pioneers of the field of human-robot interaction, and she literally wrote the book on social robotics. Cynthia is also the founder and chief experience officer of Jibo, really the first social robot for the home that is an actual working consumer robot, and it's a real success story for moving HRI out of the lab and into the real world with real people. Jibo even made the cover of Time magazine's 2017 issue on the 25 best inventions of the year. She has a long list of other awards and achievements over the years, with recognition by the National Academy of Engineering as well as the National Design Awards. But rather than list a whole bunch more of these awards, I'll go ahead and hand it over to Cynthia.
Thank you. All right, well, it's a pleasure to be here. I've had such a great visit, seeing old friends and meeting new friends and talking about living with AI and robots. So, you know, I started this work in social robots, I won't tell you how long ago, it was a long time ago, but the thing that is so striking to me right now is just who is living and interacting with AI today as part of our daily lives. I mean, even, you know, fairly recently, like 2014, when Alexa was announced in December. There are like 43 million of these smart speakers in people's homes now, and with that, again, a profound shift in who is interacting with AI, and just acknowledging there is so much opportunity and so much we still have yet to understand about what it is to live with intelligent machines, when you think about the diversity of users, from young children to our oldest seniors and everyone in between. And I think quite a lot about the home in particular. You know, I'm
sure you've all kind of heard the media quips of people saying, oh, the future is going to be like the Starship Enterprise, and we'll have this, you know, disembodied voice: computer this, computer that. And for me, you know, the home is a special place. I don't want it to feel like work, I don't want it to feel like I'm on the subway. It requires, I think, different
kinds of ways of thinking about the design, and how you fit an AI technology into the place that is in so many ways your most intimate, important place. It's a place where the people you love live. That really makes me also think quite a lot about our future living with these AI systems, and what this is going to mean in these very important spaces like the home. And we know it's not without important issues, right? So this news article was very striking to me when it came out in 2017. As you may recall, Mattel was going to basically create an Alexa-like talking speaker for the home, really catering not only toward young parents being able to reorder diapers and things like that, but interacting with kids directly. And it went very far down the product development pipeline, it was about to be released for Christmas, and they pulled the product, because there were just a lot of concerns expressed. Not because of things that they knew were bad about this technology, but just that we didn't know, right? We didn't know. So child advocacy groups, concerns around privacy, lawmakers, parents. Again, it just kind of speaks to the time, right? So we just have to be very mindful about bringing these technologies into the home, and appreciating that as researchers and scientists, there's so much more to creating these technologies that we have to be aware of. So, you know, today we have these smart speakers. I would say in general a lot of AI is designed to be very transactional in nature. It's very much like an intelligent tool to help you make decisions. But that's quite different from the vision that I grew up with, you know, growing up with Star Wars and thinking about these intelligent machines not just as digital assistants, but really as these kinds of sidekicks and these helpful companions that were not
just useful, but really connected to us on this human level. And they not only helped us do useful things that matter to us, they actually even helped us become the people we aspire to be, right? And that was really the vision that I grew up with of what this technology could mean to us. And the exciting thing is, now these technologies
are starting to come into real-world environments. We've brought them into schools, looking at robots as language-learning companions; into hospital pediatrics, to work with Child Life specialists; into assisted living facilities; and now, of course, they're finally coming into the home. So, again, what's the evolution of AI in the home going to be? From this sort of transactional AI that we have right now in digital assistants, will it become this different vision of the helpful companion, a much more humanistic design? I think the crux of the matter is, again, especially when you talk about robots, we still understand very little about this long-term interaction. When you talk about a research setting, it is hard, expensive, logistically challenging work to deploy these systems in the real world, into real environments where they're interacting with real people, where you're innovating on the algorithms and the capabilities as well as trying to understand what's the impact and benefit of these technologies on real people's lives. So this is just a paper that was written in 2013; we actually did an updated version of this, looking at all the longitudinal human-robot interaction studies ever done. Many of them were basically not even innovating on the algorithms, just taking a commercial product, like, you know, a Pleo or whatnot, into the environment and just looking to see what people did over time. And the punchline is, there's just not a whole lot of studies out there. So there was a real need for this kind of work, and there's a real need, I think, for coupling this with the algorithmic innovation. We need better tools, we need better processes to really help galvanize and empower the field to advance. And, you know, when I think about AI in the big picture, again, there's a lot of discussion around productivity and efficiency, the future of work. But I feel that the
promise of AI will not achieve its full potential if it can't address these very deep human needs. So, you know, I ask the question in a lot of my work: can AI actually help us to flourish, not just to be more productive, but to live better-quality lives and to be better people? And when you look at operationalized frameworks around things like well-being, it's very clear that that goes way beyond the cognitive, right? You have to think about the social, the emotional, the sense of achievement and meaning. If we're going to create AIs that help us to live better-quality lives, we have to create AIs that can engage and support us on all of these dimensions, and I would say, although there's a lot of work on the cognitive, the social-emotional is relatively very understudied. What we are starting to find out is, as you create these social robots, it's not as if they're like people, that you're going to replace people, or like other kinds of tools, like smart devices. They're kind of this new amalgam of relationships that we kind of understand, but at this new kind of intersection, right? So there are attributes of these systems that engage you like a motivating, collaborative ally, and I think that's really important. There are aspects, of course, of a robot that can be an internet-cloud-connected tool like any other kind of computing device; that's kind of a given. But the other thing that's fascinating is the way people naturally seem to want to relate to it as an other, where there are these attributes of being like an attentive companion animal, sort of non-judgmental, affectionate. And I think you could appreciate in that video I just showed you, the nature of how people interact with this technology is very different than a tablet or a computer or a VR headset. It is this social, emotional, relational, connected-other experience. And that actually can be very empowering
for people, if designed in the right way. So, you know, at the top I'm going to highlight, you
know, this, I think, is kind of the punchline of a lot of the work that I have done. Whether we're talking about a social robot or, I think, any other kind of technology that's trying to support human flourishing, the more you can support people on all of these dimensions, the better. We as human beings are clearly not purely cognitive creatures, right? We're deeply social, deeply emotional, deeply physical creatures. All of these things influence how we learn, how we make decisions, how we perceive the world around us, how we're influenced by events around us. And the more you can support and engage people in this holistic way, what we find is the more deeply people will invest in that interaction. And when people invest more deeply, they can often be more successful, and I think that's the punchline: they can be even more successful with the technology than if you don't design to support this holistic experience. So I want to talk about, with a mix of research and application, what is different about this experience with a social robot. And, you know, a lot of times we could ask the question, still today: does it matter that it's physically embodied? Does it matter that it's a robot? Ironically, I would say when we first started this field, we would get papers rejected from CHI point-blank just because it was on a robot, and reviewers would say things like, why is it a robot, it's so expensive, why not just make it a virtual agent? So the thing that we had to compete against was basically a screen with a virtual avatar. Now we're having to defend ourselves against smart speakers, and I'm kind of like, that seems like several huge steps backwards: you're taking an even more impoverished stimulus and asking us to compete against that. So, again, I guess that's just a lesson in kind of combating
ubiquity, right? Once devices are ubiquitous, then suddenly you don't have to fight these battles anymore. But we get asked this question a lot: what is it about the social co-presence, does it actually really matter? So I want to present, as a kind of case study, work that we've been doing in pediatrics. This is a long-standing collaboration with Boston Children's Hospital, where we've been working with Child Life specialists. Child Life specialists are professionals who are trained to address the emotional support and needs of child patients while in the hospital, but also those of their families. And the issue, of course, not surprising in a lot of these domains, is that there are not enough Child Life specialists to meet the demands of the admitted patients. And so the question of how you can create a different kind of technology that is an extension of the Child Life Services team, that helps them basically do their job in a more scalable, effective way, is actually really, really interesting. And I'd also say another kind of side effect of that is, how can you extend that quality of care to the home? Because you can maybe provide exceptional
quality of care in the hospital, but they need to be able to continue to engage patients at home, and they have no way of doing that. So again, the notion of a robot, of a different kind of technology to do that, is really, really interesting to them. So this is just a video that was done by the New York Times, kind of giving you a little bit of color on what this looks like. "A patient like this, hers is a chronic condition; she's going to be in and out of the hospital. She's here for procedures, she's here for doctor's visits, she's missing school, she's missing her friends. She's really not doing any of the things that a normal kid is doing, and what we want to offer kids like that is just one more way of helping them to feel okay where they are, in what's otherwise a really stressful experience. I think there's a way of connecting with kids that's different from what grown-ups can offer. They have incredible imaginations, and they can really suspend disbelief, and there's a relationship that develops between Huggable and a patient." "It's very nice to meet you." "Do you want to play again?"
All right, so what we have here, in this case: we just finished a clinical trial. We have a Child Life specialist who works in the room with one of three interventions. So we compared the physical robot, versus a kind of plug-compatible graphical version of the robot on a tablet, versus standard of care, which is a plush. The social agents are teleoperated by another Child Life specialist. There are a lot of reasons why we do that: one is just because we want to make sure that the quality of care is present; we also want to collect a lot of data to understand what is the phenomenon of having a robot in this kind of context, so that we can then think about what the AI opportunities are to create more autonomy for this. So we often start projects with this sort of teleoperation
paradigm to begin with; we get a lot of insights from that. So we just finished a clinical trial comparing these three interventions, and I think one of the things that's nice about this study is that the Child Life specialist in the room is doing her job to the best of her ability with each of these interventions. So in many ways all three of them are socially animated, whether that's a remote Child Life specialist teleoperating the virtual bear or the physical bear, or it's the Child Life specialist in the room kind of puppeteering the plush with the child. So all three of these are animate, one way or another. So, you know, we looked at the difference between these technologies across a lot of different dimensions, and we were particularly interested in the social-emotional factors. If you're going to try to create a technology intervention that can engage children and improve their, you know, emotional experience, we are very interested in looking at that. We were interested in looking at engagement, how often children would speak, because if you want to engage a child in a sort of health protocol in the hospital, their engagement and cooperativeness is very important. So if you look at these different charts, what we see is, you know, for things like overall joyfulness, based on affective analysis and sentiment analysis of the utterances that the children speak over time, we see that the positive affect increases over time with the physical robot, but
it even decreases over time with the virtual robot, and the plush is kind of the lowest. When we look at shared attention, the ability for a child to be engaged not only with the technology but with the other people in the room (you can imagine for a Child Life specialist this is important; you don't want the child just having their face in the tablet), we see a lot more joint, shared attention when you have the physical robot there. When you look at just the amount of utterances, you see children are talking more over time, and we're seeing that with the robot over the avatar and the plush. And we see that cooperativeness is also high with the robot as well as the avatar. And then when you look at these other social attributes, like affective touch, social touch: you know, people may be throwing the plush across the room, they may be poking the tablet; with this robot, all of the touch is a social, relational touch, and I think you see that in the video again and again and again.
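As a rough illustration of the kind of utterance-level sentiment analysis just described, here is a minimal Python sketch that scores each child utterance with a toy lexicon and fits a least-squares slope to see whether positive affect trends up or down over a session. The lexicon, the scoring rule, and the example utterances are all illustrative assumptions; they stand in for, and are not, the study's actual affective-analysis pipeline.

```python
# Toy lexicon; a real pipeline would use a trained affect model.
POSITIVE = {"yay", "fun", "love", "happy", "cool"}
NEGATIVE = {"sad", "scared", "boring", "ouch"}

def utterance_sentiment(utterance: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = utterance.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def affect_slope(utterances):
    """Least-squares slope of sentiment over utterance index.

    A positive slope means positive affect is increasing over the session,
    as reported for the physical-robot condition."""
    ys = [utterance_sentiment(u) for u in utterances]
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var if var else 0.0

# Hypothetical session transcript trending toward positive affect.
robot_session = ["this is boring", "okay", "that was fun", "yay I love this"]
print(affect_slope(robot_session))  # 1.0, i.e. affect trending upward
```

Comparing this slope across conditions (robot, tablet avatar, plush) is the shape of the analysis, even though the real measures are far richer.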
So, you know, in the hospital, they talk about emotion as the fourth vital sign, and they say this is so critical because we have all this equipment to measure all these other biometrics of the child, but the emotional state of the child affects everything, from recovery rates, to compliance, to even how long it may take a child to give you their arm so you can give them a shot, for instance. So they're very, very interested in trying to come up with new technologies to continually monitor this and engage children in new ways. We've been looking at the other end of life as well; here's another example of this kind of emotional lift. "Living alone can be lonely, and that's especially true for the elderly, but KRON 4's Maureen Kelly discovered that some senior living communities here in the Bay Area are finding camaraderie and fellowship through robots. 'Hey, Jibo. Okay, you dance, let me put on my dancing shoes.' 'That was wonderful.' There's a new regular visitor here: it's a robot called Jibo that's being used as part of a pilot project among several Bay Area facilities run by Elder Care Alliance. In addition to dancing, it can play the radio on request, take a photograph, and, among its other skills, tell jokes. 'What did the snail say when he was riding on the turtle's back? Wheee!' And it reacts to human touch; it purrs when petted. It also often has some quirky responses to questions. 'Hey, Jibo, what are you?' 'I am a robot, but I'm not just a machine. I have a heart. Well, not a real heart. But feelings. Well, not real feelings. You know what I mean.' Erin Partridge, a researcher and art therapist with Elder Care Alliance, takes Jibo with her when she meets with the seniors. She says this is not a case of caretakers being replaced by automation; instead, they are using this cute piece of technology to encourage
their residents to connect with each other, despite some needing different levels of care. 'So when we have something that all of us are experiencing all together, maybe for the first time, right, meeting a robot, you've got a focus point where we can all meet, right here in this moment, and have an interaction. Then it breaks that down, right? It doesn't matter that this person has dementia, or this person maybe has Parkinson's and has some trouble talking. We're all meeting together as humans and robots.' Jibo inspired some giggles and claps amongst this group. 'Well, when you get to this age, you should be able to learn more and more. What it does is bring that out of us. That's part of the miracle of it, and that's part of the miracle of living, as far as that goes.'" So that was very poignant for me. Again, it's about this emotional lift of engagement that's just very different from how we engage with other technologies. But it goes even beyond emotion. If you look at a number of these systematic comparative studies, comparing a physical robot, to even a video of a physical robot (so, you know, it's real, but it's just not right in front of you), to a graphical agent of the robot, to a disembodied voice, what we see is that often these robots actually do quite well. If you look at all these interpersonal dimensions that lead to the kinds of social judgments that affect people's acceptance and engagement of these technologies, the robots score quite well across trust, affect, attraction, empathy, engagement, persuasiveness, all of these kinds of things. So again, there's something actually very important going on about the physical embodiment of this kind of technology that is engaging people more deeply, in a different kind of way. So the second theme I want to touch on is, again, this notion of this collaborative, ally-like engagement, and how that's enabled with not
just a voice interface but this kind of interpersonal UI, this richly multimodal interpersonal UI, and thinking, you know, beyond just interface to building this working alliance and building this sense of rapport, which, again, when you collaborate with someone who's an empathetic other, that's very different than a useful
tool that you're using. And we know this from human social psychology, right? I mean, we've been developing these learning companions for children for about five years now. There are a lot of social outcomes you can look at, as well as educational outcomes, and they're all kind of tied to, again, this social engagement between people and other people, but now, in this case, people and robots. So again, just appreciating that there's a holistic kind of wrapper: the nature of engagement directly impacts things like learning, attentiveness, attitudes, and so forth. So, you know, we do a lot of data collection. There's a lot of excitement right now about deep learning, and, you know, there are certain kinds of data sets that are vast and you can label them, but there are a lot of important data sets that frankly don't exist. These kinds of data sets that are richly multimodal don't exist, and when we think about underrepresented populations, like young children, there's a dearth of data around this. So this is just a little montage to show you a little bit about our process. If you want to design a robot to engage in this nonverbal dance while engaging children in a story showing-and-telling experience, we collect data of what it's like to see children actually tell stories to one another. We run various algorithms to do automatic affective computing, and pose tracking, and all that kind of tracking, and then we use that to develop these computational models. We then embed that model in a robot, put it in the same context with a child, and see how well the model performs: is the model actually really capturing the interaction that you saw with the two children? So we do a lot of this kind of work, and these nonverbal cues, and the synchrony and the contingency,
turn out to be very important for how children engage and see the credibility of the robot as an informative other to learn from. So again, all these social judgments come along with this; that's really important for engaging the children as learners. So I want to just quickly touch on one project that, I think, from this video, really highlights this different way of engaging a technology. So this is a learning companion, Tega, a robot. We designed a series of educational games; this particular game is called Word Quest. The child and the robot play the game collaboratively, so the robot actually plays the game with the child as they take turns. There's a challenge word presented, you know, a pretty hard word for a preschooler, like "crimson" or "garment" or something like that, and their task is to pan through these story scenes and try to click on objects that match that challenge word. And what I just want you to notice from this interaction, which is based on a reinforcement learning system that learns a policy of engagement, is how these other aspects of the collaboration come out of this encounter. "We are trying to find lavender-colored stuff." All right, so there was a moment in there where you could hear the robot express confidence: "I'm sure you'll do better next time, I believe in you." And if you listened, you could hear the child very quietly mirror that back to the robot. So this is another really important phenomenon we're seeing: when an AI engages people, certainly children in this case, but we see it across different generations, as an empathetic other, you get all these other kinds of aspects of social modeling, and in this case the child was modeling empathy and a growth mindset back to the machine, right? So, you know, as a socially influenceable technology,
this is definitely something we need to really understand. We're using it to try to benefit children, but of course this is something that could be applied to not benefit children, right? So again, just to say, there's a lot of ethics and a lot of things we need to understand about this work. What's actually happening is that we're applying reinforcement learning to figure out a policy by which the robot decides whether it should play the role of an expert, where it may be defining the word for the child, reinforcing and demonstrating its understanding of why an object matches a word, or a novice, whereby the robot may ask the child questions, or the robot may make a mistake on purpose and ask the child why it made the mistake. We basically compared three reinforcement
learning conditions: always being an expert, always being a novice, or an adaptive role that switches between the two. We ran this with children in the Boston Public Schools; we recruited schools with high ESL populations. They played this collaborative, turn-taking game, where they learned five words in the first session and then six of these challenge words in the second session. You can see the challenge words here; they're things like "azure" and "gigantic," and they're just hard. We can see the benefit of having this adaptive role: it's actually better for the robot to not always be the expert. What we actually find is that if the robot is always an expert or always a novice, children actually start to kind of not pay attention. But when the robot keeps switching it up, more like an actual peer-like companion would, they notice when the robot makes a mistake and they engage more; they notice when the robot is asking a question and they answer, right? So again, understanding this role in the relationship, and how that impacts children's engagement and even the learning outcomes here, is really interesting. And what we find is that children who scored the lowest on the pretest had the largest gains from these two sessions with the robot. So again, this is just a couple of sessions with these words, with a pretest, a post-test after, and a delayed post-test. You know, I could imagine that, of course, if you had repeated sessions like this, complementing that in the home, children would score very high on this sort of challenge level. And when we think about rules of thumb of game design, there's usually this kind of 80/20 rule of mastery: things you already know versus the new challenge words. All of these words are hard, and the fact that children engaged through all of them and still learned is, I think, something worthy of further understanding.
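To make the adaptive-role result concrete, here is a toy Python simulation of the three policies compared in the study: always expert, always novice, and an adaptive policy that switches roles. The engagement model is a made-up habituation assumption (attention decays the longer the same role repeats), standing in for the real children, and the numbers are arbitrary; the study learned its switching policy with reinforcement learning rather than hand-coding it as done here.

```python
ROLES = ["expert", "novice"]

def engagement(action, last, streak):
    """Toy habituation model: attention drops the longer the robot has
    repeated the same role; switching roles restores full engagement."""
    return max(0.0, 1.0 - (0.2 * streak if action == last else 0.0))

def run(policy, steps=20):
    """Average simulated engagement over a session for a given role policy."""
    total, last, streak = 0.0, None, 0
    for _ in range(steps):
        a = policy(last)
        total += engagement(a, last, streak)
        streak = streak + 1 if a == last else 1
        last = a
    return total / steps

always_expert = lambda last: "expert"
always_novice = lambda last: "novice"
adaptive = lambda last: "novice" if last == "expert" else "expert"

for name, p in [("always expert", always_expert),
                ("always novice", always_novice),
                ("adaptive", adaptive)]:
    print(f"{name}: {run(p):.2f}")  # adaptive scores highest
```

Under this toy model, both fixed-role policies flatten out while the adaptive one stays fully engaging, mirroring the observation that children stop paying attention to an always-expert or always-novice robot.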
Through this relationship, can you actually have children accelerate their learning, can you challenge them harder, because they have this empathetic other, than if they just played the game by themselves, for instance? So these are the kinds of questions we're trying to understand and dig into, but again, it's very promising, very exciting, when you think about this different kind of engagement. The last thing I want to touch on is personalization. So the promise of an AI that lives with you is that it can learn about you, it can optimize its behaviors to achieve a learning outcome. This is another project, around early literacy and language learning, and this is just to say how important this is. You're noticing a lot of the work we're doing engages young children; the reason is, there's just too much data showing that if children don't start kindergarten ready to learn, it's very, very hard to catch up. So right now, in this country, 60%
of children do not attend a quality preschool, and 37% of 12th graders in 2015 couldn't even read at or above the proficiency level for their grade. Now, when you think about the future of the workforce, do you know a single scientist or engineer who can't read at grade level? And so the concern and the worry, of course, is that the die is getting cast way too early in a child's life. So, you know, people like Benjamin Bloom found this two-sigma effect, which argues that being able to learn at your own pace, to demonstrate mastery before you move on, having one-on-one personalized sessions, is the most effective way to learn. The problem is, it's not scalable and it's not affordable, right? But with AI, that can change. So this is another example, now, of learning oral language and vocabulary with a second-language or early-literacy robot, where the robot engages with children over a period of three months. So now we're moving into long-term interaction, with repeated encounters, where the robot is learning a model personalized to each child. The robot engages in a relational fashion; we've been developing new measures of how you measure this relationship between children and the robot, and how that affects learning efficacy. The robot learns their name, refers to them by name, brings up past interactions, talks about future interactions. And it's a storytelling and story-sharing paradigm, where the robot tells a story and asks questions, this dialogic question-asking. We know from how parents tell stories to children that this dialogic question-asking is critical for the best learning efficacy for children. So the robot engages in this way, asks the child to retell the story, and we capture a ton of data. We're capturing everything the child says; she's wearing sensors, so we're capturing biometric data; we're capturing her facial expressions; across all of these kids, to create a very rich corpus of data that
we're trying to learn from, again through reinforcement learning: a model by which the robot can predict, out of this library corpus of 72 stories that we can level by complexity, what is the optimal next story for the robot to tell the child to promote their vocabulary and oral language development. So that's the task at hand.
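The next-story selection task can be sketched with a drastically simplified stand-in for the learned policy: a per-child ability estimate updated from retell performance, and a rule that picks the easiest not-yet-told story above that estimate. The story names, levels, update rule, and retell scores below are all illustrative assumptions, not the study's reinforcement-learning model.

```python
# Hypothetical leveled library: 72 story names mapped to complexity 1..10.
STORY_LIBRARY = {f"story_{i:02d}": 1 + i % 10 for i in range(72)}

class ChildModel:
    def __init__(self, level=1.0, lr=0.5):
        self.level = level   # estimated oral-language ability
        self.lr = lr         # how fast the estimate moves
        self.told = set()    # stories already told to this child

    def update(self, story, retell_score):
        """retell_score in [0, 1]: a good retell pulls the ability estimate
        up toward the story's level, a poor one pulls it below that level."""
        lvl = STORY_LIBRARY[story]
        self.told.add(story)
        target = lvl if retell_score >= 0.5 else lvl - 1
        self.level += self.lr * (target - self.level)

    def next_story(self):
        """Pick the easiest untold story strictly above the current estimate."""
        candidates = [(lvl, name) for name, lvl in STORY_LIBRARY.items()
                      if name not in self.told and lvl > self.level]
        return min(candidates)[1] if candidates else None

# Two simulated children: a strong reteller and a struggling one end up on
# different story trajectories, i.e. a different policy per child.
fast, slow = ChildModel(), ChildModel()
for _ in range(5):
    fast.update(fast.next_story(), retell_score=0.9)
    slow.update(slow.next_story(), retell_score=0.3)
print(fast.level > slow.level)  # True: the personalized levels diverge
```

The real system replaces this hand-written update with a learned policy over much richer state (speech, biometrics, facial expressions), but the shape of the loop (observe the child, update a per-child model, select the next story) is the same.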
This is just some data, a confusion matrix, showing that the system is definitely learning a different policy per child. The models aren't converging yet, but the question is, does this actually impact the learning outcomes? So if you look at measures of the oral language gains, comparing the personalized policy versus a non-personalized one, we see an accelerated rate at which children are developing new oral syntactic categories, and again, we see big differences in vocabulary gains between the personalized robot versus the non-personalized one, which is still, again, over baseline. So, again, there's a lot of promise when we think about innovation of new technologies that engage different demographics, a diversity of needs, around these kinds of technologies that are really designed to try to match the way a particular demographic learns and engages; I think that's the big punchline. So when we look at this kind of work and we think about the bigger picture, there is a real opportunity here, and we hear this again and again when we engage our stakeholders: we can provide exceptional quality of care in our institutions; what we can't do is continue that engagement at home. And yet, if you could do that, you would have much better outcomes for those stakeholders. So whether it's education, where you learn at school, how do you continue that at home, with an effective personal tutor for whatever the subject matter is; or assisted living facilities, where they will tell you point-blank, we cannot build enough facilities to meet the silver tsunami that's coming our way, we have to be able to engage people in their homes as they age in place; or chronic disease management and health, where it's the same story. And even when we talk about high-touch
customer encounters, there's a lot of interest in a technology that can engage people in this humanistic way. Whether you're talking retail, hospitality, etc., there's a whole wealth of possibilities and opportunities for a technology that can engage people in this humanistic, personalized, high-touch way, and we're just starting to scratch the surface. So the last thing I want to talk about is a comparative study; to my knowledge this is the first study of its kind, where we compared a digital assistant — a talking speaker, the kind of transactional AI in the home as it is today — with this sort of relational AI, a social robot. We used Jibo because it's the one social robot out there that actually has the features and functions that allow a comparison with something like Alexa. So people are starting to live with social robots too, and the pictures people share with social robots look a little different from the pictures people share with Alexa. People are definitely engaging with this technology, and they talk about it as being like one of the family, which again I think is quite different. So: different kind of AI, different model, different kind of bias in how it's designed. How do these different generations, from children to adults to older adults, live with these different kinds of voice-based agents in the home, and what are the implications for how we should design these technologies? What are people's preferences? What are their boundaries? So this is just a fun video that highlights the difference in the design stance of these technologies. [In the video:] "Alexa, tell me about yourself." "I'm Amazon's Alexa, designed around your voice. I can provide information, music, news, weather, and more." "Hey Jibo, tell me about yourself." "Okay, sure. My name is Jibo. I'm a robot. My favorite things to do are talking to people and dancing. I also
really like Abraham Lincoln, because he was so honest and because I like his hat." "Okay Google, tell me about yourself." "I'm your Google Assistant. We can play mad libs, I can tell you a joke, or you can spin a wheel." Okay, so again, just to highlight the different design philosophies behind these technologies. Now, when we do this comparison, it's nice that each of these technologies has offerings in each of these categories: the functional, utilitarian skill sets; entertainment things like music and games; and social companionship, which could be things like jokes, greetings, and asking the agent about itself. They all have offerings in all of these areas; they're just weighted differently. We
actually had people take a personality test for these agents, and you can see people attribute very different personality profiles to something like Jibo versus something like Alexa. Alexa is seen as more conscientious, consistent, and predictable; Jibo is seen as more open and extroverted, and there's more empathy and emotiveness, obviously, in the designed experience. They're actually quite similar in agreeableness. Again, this is just to say: different design philosophies, different emphasis in the features and functions. So we did a two-phase study. We first brought people in before they interacted with the agent; these people had not experienced or owned an agent like an Alexa or Jibo at home, and we had them tell us their perceptions and attitudes toward the features and functions before they lived with the agent. Then half the people got an Alexa and half got a Jibo. They lived with the agent for a month, and then we looked at differences in engagement and perception — how their sense of the roles, features, and functions they wanted changed after living with the agent. We looked at children, which we defined as 5 to 17 years old in our sampling pool; younger adults and adults at 18 to 49; and older adults at 50 years and up. Those are the three demographics we looked at, and this is a rough distribution: a lot of our older adults were in their 50s and early 60s, but we had one 98-year-old, and she was awesome. Okay, so this just gives you a sense of the distribution of age profiles. So when we did our initial cultural probes and needs finding, we had all these cards with all these different features and functions, you
Know if you were going to have an agent like this what would you want it to do and we had people put these colored chips on the cards, that basically indicated, preferences, which is like I would like these things and I definitely would not like these things right so this is all about first impressions right, and, you. Know this, was fascinating right, so we look at children and, adults, and then we look at older, adults older. Adults, were the most open. To the broadest range of features and functions, adults. Were the pickiest I wouldn't. Have expected that I would have thought that the older adults would have been the most conservative but, they were the, most open, and this. Is to say in these different categories of things like reminders, and information, and suggestions, agent. Sharing something as a proactive agent, something it thinks is interesting, versus. Somebody trying to reach you kind of mediating social interaction, or somebody through social media sharing something we, see that you know if there's a lot of nuance here right so the red is things that people didn't think they want clearly suggestions, were the most kind of, polarizing. Kind of topic area but, it depended on what the agent was sharing you know reading writing, kind of things like that's sharing, something new people were open to but they didn't want the robot letting to know they should take a nap right they didn't want the sense of their autonomy, being, being kind of challenged.
by this kind of agent. But again, if you look at the different generations here, you see the older adults are the most open to a whole range of features and functions across these categories. That was fascinating. Now, we had people keep very detailed logs over the first 14 days; we didn't ask them to keep logs for the rest of the month, so within the one month we tracked detailed usage in the first 14 days. Even just from the color distribution of these two plots you see something very interesting: here's Alexa with younger adults, and this is Jibo with older adults. It's like the flip. Really fascinating. And then when you look at a five-day running average, we see trends of usage over time, which were also really fascinating. It seems that younger adults want to anchor their usage first in the utility, and if you deliver enough utility, they'll start to use the social, companionship, and entertainment things later. Children and seniors were the opposite: children mostly wanted entertainment, and if you couldn't anchor the experience in the social and emotional, for both children and seniors, you didn't get as much usage overall, even of the utility. Children were anchored in the social and entertainment, and older adults even more so. Again, this is fascinating to me in particular because this is where the social robotics is really coming through. It's not as if Alexa didn't have social and entertainment functions; arguably Alexa had many more of them — 10,000, 30,000 skills. But there's something about the nature and the quality of the engagement that was different enough that it really got traction with the older adults and the children.
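A five-day running average like the one shown is simple to compute; this generic sketch uses made-up daily interaction counts, not the study's data.

```python
def running_average(daily_counts, window=5):
    """Trailing running average over a per-day usage series.
    Early days average over the shorter window available so far."""
    out = []
    for i in range(len(daily_counts)):
        chunk = daily_counts[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical daily interaction counts over the 14 logged days.
usage = [12, 9, 15, 7, 11, 10, 8, 14, 9, 6, 10, 12, 7, 9]
smoothed = running_average(usage)
```

Plotting the smoothed series rather than the raw counts is what makes the diverging utility-first versus social-first trends visible across the two weeks of logs.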
Now, if you look at the change over the one month in what people thought they wanted in these different categories, we see that suggestions became even more polarizing. That's to say you may have a stakeholder pool where suggestions like medication reminders are actually really important, but the way you do them really matters. The categories with the most positive change — what people wanted more of — were the socially driven ones. So although a lot of these systems right now are built around utility, after living with these agents for a month, the socially driven categories were what people wanted more of. And then we had this intriguing construct called the wish jar: little wooden tokens on which, at any point during the 30 days, you could write a note about what you wanted the robot or agent to do, what you didn't like, whatever — just some sentiment about the experience — and we collected them. When you look at Alexa, you see a consolidation of topics: a lot around the functional, some around humor (they wanted Alexa to have a better sense of humor), some around movement, and some around proactivity. But interestingly, when you looked at the social robot, people were much more expansive and diverse in what they wanted this agent to do and how it would fit into their life. That's fascinating: for Alexa there's a much narrower categorization of how people see the agent fitting into their life; the robot potentially has a lot more headroom in where it can go. So again, just fascinating stuff. The last thing I want to talk about now is this bigger societal question of who creates with AI. We've been talking about people living with AI, about more vulnerable populations living with AI, while other institutions and organizations build these AIs for them. But when only a small
fraction of society can design with these technologies, almost by necessity they're only going to be applied to address the needs of a small subset of society; there's just going to be inherent bias. So how do you democratize who creates with AI? For me, it all gets down to education. So we're starting to look at AI education, even starting at preschool — pre-K through 12, though our sweet spot right now is pre-K through about 12 years old. The reason why is, first of all, children are living with these technologies: you want them to understand them, to appreciate the way they think, and to feel the right relationship to and empowerment over them. But you also want children growing up with the attitude that AI is something they can not only understand but actually create with. And so we've been building on top of the Scratch platform, adding extension blocks for things like Watson and Clarifai, for Hue lights, and for robots like Jibo and Cozmo, to empower children to
code these systems, to train their own models for these systems, and then to put those models into new experiences of personal significance to them. So this is an example of an exercise, the kind of warm-up thing we have children do. First they try to come up with rule-based code: what can we say to make Cozmo react to what we said? And they realize how time-consuming it is and how explicit each instance has to be. Then we invite them to do it with something like a classifier system that they can train. This is just getting their heads wrapped around how classifiers work, why they're interesting, and why you'd want one to make the behavior more generalizable. And then we have interfaces we're developing where kids can come up with the things they want to put in the training set. So, having Cozmo be happy at kind things you say, or act sad at mean things — some kids even wanted to create backhanded compliments. So anyway, the way kids think about these things is fascinating, and this gives them hands-on experience of what it's like to create these systems. So this is a quick summary video of the platform. Again, it's very much a research project, and it's not just for kids; it's for families too, because parents have these things in their house as well and they also need to understand them, and we're trying to understand how to create experiences that parents and kids can do together. So again, it's the Scratch-style block programming, but you can see a diversity of blocks: there's Clarifai, there's Jibo, there's Alexa. Kids are able to create these custom AI and IoT experiences across a big palette of AI-based technologies. So we allow them to train models. [Children in the video:] "I didn't think people could teach computers. It's like, mind blown." "The computer
memorized the colors, so, like, it's tagged and labeled those colors. Here it took a picture and labeled it as green." And so they're able to explain how they're training these models, and that's important too.
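The jump the children experience — from hand-written rules to a trained classifier — can be illustrated with a deliberately tiny word-overlap model. This is a classroom-style sketch, not the platform's actual classifier, and the training phrases are invented.

```python
from collections import Counter

class TinyTextClassifier:
    """Minimal trained model: score each label by how often its
    training-example words appear in the input, instead of matching
    hand-written rules for exact phrases."""

    def __init__(self):
        self.words = {}  # label -> Counter of training-example words

    def train(self, text, label):
        self.words.setdefault(label, Counter()).update(text.lower().split())

    def predict(self, text):
        tokens = text.lower().split()
        return max(self.words,
                   key=lambda label: sum(self.words[label][t] for t in tokens))

# Invented training set in the spirit of "react happily to kind things,
# act sad at mean things".
clf = TinyTextClassifier()
clf.train("you are awesome great job nice work", "happy")
clf.train("i love your hat so cool", "happy")
clf.train("you are terrible go away bad robot", "sad")
clf.train("i do not like you mean robot", "sad")
```

Unlike an explicit rule for each phrase, the model generalizes: `clf.predict("that was a great job")` picks "happy" even though that exact sentence was never written down, which is the point the exercise makes to the kids.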
Through the process of building these projects, kids are talking about how the systems work, how they need to change the training set, why the model isn't right yet. They're able to work collaboratively, articulate their ideas, and think creatively and systematically. So basically they're also learning a whole bunch of 21st-century learning skills, which I think is really critical. So we have an opportunity. Last week, I think, was the week the Computer Science Teachers Association partnered with AAAI on AI education. There has been a very grassroots teachers' movement to bring computer science education into K-12, and I think with that there's an opportunity to bring AI education into K-12 too, and to create the curriculum and exercises from the ground up to be hands-on and collaborative — getting children to tinker with this stuff, to think about this stuff, and to build the 21st-century skills we all want our kids to have, so they have access to the incredible opportunities of this time we live in. And I think this is critical, because if we want truly humanistic AI, we need to empower a much greater diversity of people to create these AIs, because it's people from different walks of life who have the empathy and the appreciation of the challenges and opportunities that matter to them in their communities. So whether it's creating relational AIs that serve as this kind of personalized, empathetic, supportive other, or these kinds of educational initiatives that empower a much broader diversity of people — kids as well as adults, and I would argue even seniors — to create with AIs, I think this is how we're going to get to AI that can truly benefit everyone and not just a few. All right, I'm going to end there. Thank you very much. [Audience question, mostly inaudible, about the assisted-living video and why the older adults were so open.]
Yeah. So what I think you're seeing from the in-home one-month study in general — the lesson to me — is that older adults and seniors do understand that technology can help and empower them, but it needs to be designed in the right way. So they're open to it, but there's a lot of stuff designed for them that isn't necessarily stuff they want to use. I think that was clear in the assisted living facility, where the thing that was actually compelling was that a practitioner in the facility had the intuition that this technology could enhance the way residents engage with one another. It was not replacement, and they made that point: this was not about replacing the practitioners. It was really about asking, if we introduce this other kind of technology, can we spur a richer, collaborative sense of community and connection among the residents? That was the hypothesis, and that study is still going on. I don't think it's because they're lonely; I think it's because they see technology as something that can really help them. It just needs to be designed in the right way. [Audience:] There was a recent case about kids mistreating Alexa, and I think they added in the new word "please." Is there anything about this approach that might discourage that, or make kids want to treat the agent more respectfully? Well, I think, again, we're in this very intriguing time, especially with these social robots, where children engage them more like another versus a device. And what we see because of that — I'll give you an example. We did a study last year, and we're just finishing up a larger one now. Children learn all kinds of things from others: there's the more curricular stuff, but there are also things like attitudes, like a fixed
mindset versus a growth mindset, and children acquire mindsets based on how adults and teachers praise them and speak to them. If parents and teachers reinforce "oh, you're so smart," the concern is what happens when things get hard: do they assume "I'm just not good enough anymore"? Whereas if they praise effort — that's how you learn — that's the growth mindset. So we designed a puzzle-solving game where we had children play with either what we call a neutral-mindset robot (we thought a fixed-mindset robot would be unethical), which would just make factual comments about the state of the game play, or a growth-mindset robot. What we found was that the children who interacted with the growth-mindset robot self-identified more strongly in the post-test as having a growth mindset, and on the intentional parts of the task where we challenged
kids and actually made them fail, they tried harder and demonstrated more grit. And even in that video I showed you, where the robot says "I believe in you" when she got it wrong, you heard her say back to the robot, "I believe in you." We see this social modeling phenomenon again and again and again; I would say at this point it's quite robust. And we're starting to explore that phenomenon with empathy: with a known issue like bullying, can you actually have social robots engage children and have them become more empathetic? That's just to say that this is a technology that clearly can engage children's social-emotional processing and behaviors, and it can be applied, I think, to really help kids. But it needs to be done in a really ethical, responsible way, because children are clearly engaging those parts of their brain when they engage with this technology — and it's not just kids, it's people of all ages. So yes, I think it's important that these technologies model the kind of behavior you want, because it helps reinforce the kind of behavior we want to see in each other. I think there's an opportunity there. [Audience:] Thinking about the transactional versus the social robot: even in conversational systems, it's very hard to pin down good reward functions that reinforcement learners can actually use. In the two examples you pointed out, I'm curious how robust you think the reward functions are. Yeah, I mean, it's early days. What I can say is that those inputs are going into the reinforcement learning element, which is trying to optimize something. We're looking at affective computing inputs, biometrics for engagement; we're looking at their responsiveness to the robot's dialogic questions; we're looking at whether they get things right or wrong; we're
Looking at their language samples, and so we're trying to both use. Reinforcement learning to kind of. Simultaneously. Maximize. Engagement as, well as these learning outcomes. And. We're seeing promise, I think that you know these are early days I think that the the point of showing the slides is this. Hasn't been done for kids this young first of all right I mean and. And and building a technology that, addresses, these social-emotional. Nonverbal, things, hasn't. Been done for kids like this right so it's a chef the fact that you couldn't even show boost, on both. The dimensions is noteworthy, and important, it's just to say there's. Probably a pony in there right a lot, more work needs to be done and I, think that's you know that's kind of the state of the field right now it's like we're. Starting to see a, different, way of engaging the, human mind and behavior that. Is, actually I think quite profound right this is no longer about naturalness. These. Are deep social, processes, that this technology is able to tap into right, and so this is kind of like this, is not shallow this is actually quite deep and. We. Need to understand. It and we need to think about how it can be leveraged to benefit people we need to be able to create best practices, so, it's not used to you know obviously try to make people do something, that's to benefit a third, party and not the person themselves right so it's just to say we. Live in interesting times. But, we are seeing a very kind. Of you, know provocative. Psychology. Happening, here and. Again it's it's it, runs deep and I think the more we do this work we're discovering how deep it actually goes a. Lot. Of the agents, have. This novelty effect like, about the Kip see I think I'm super excited and. It is hard to make. It permanent like me of this longer-term, friendship. What. Have you talked about this continued, engagement how. 
[Audience:] A lot of these agents have this novelty effect: at first people are super excited, but it's hard to make that permanent, to build a longer-term friendship. Have you thought about continued engagement — how you can actually form bonds and go beyond that problem? Absolutely. I think you can see that a number of these studies are longitudinal, so they're going well beyond the novelty effect. I actually think our field needs a real, deep, systematic investigation of what we mean by the novelty effect, because when I was first a graduate student, a five-minute encounter before you started the task was enough to address the novelty effect; now it's two weeks or you haven't addressed it. We just need to be much more rigorous about what we mean when we say "the novelty effect." That said, when you're interacting with a system every day in the home over a month or three months, you're past the novelty effect. So
it's a combination of things. I do think you definitely need enough freshness in the activities; no matter how compelling Angry Birds is, it's going to get boring at some point, so you need enough variability and freshness in the activities to sustain any sort of encounter. I think the relational things are also really important. The fact that the robot, whether it's Tega or Jibo, is actually doing these social grooming functions — greeting you in the morning, asking how your day was, remembering and personalizing that to you, commenting "did you have another good night's sleep?" or "I'm sorry, I hope you sleep better tomorrow night" — these things matter. What we're finding is that it's the personalization, combined with being that supportive, empathetic other, that actually matters to people. So I think it's a combination of those things, and there are others as well. We're at the very beginning of understanding how you design for sustained relationship, and you're starting to see the mechanisms we're exploring here. It's hard to do this work; it's hard to deploy these robot systems in people's homes for a long period of time to collect the data. It's logistically hard and expensive, but it's important work that needs to be done. So it's just to say there's a lot of opportunity here, and it's very poignant. Just like interpersonal interaction, there's a whole set of processes at play to build that sustained sense of not only engagement but the kind of relationship where you feel you're actually working with the other, over a sustained period of time, to reach a goal of personal significance. I think that's also part of it. [Audience:] I thought your part about the AI tutor was really interesting.
What are some of the next steps? Where do you see applications for the AI tutor, given the technology that we have right now? Yeah, so obviously computer tutoring is a huge field. It tends to be applied more towards older students, and more towards math and physics — the explicit STEM areas. We were going after younger, early-childhood learning for a number of reasons. One is that it's a critical time to intervene, as I talked about, and it's an under-explored area for technological innovation.
Children at that age just learn in a very different way: they absolutely learn from friendly, empathetic others; that's how they're wired to learn, so you need to design to support that. I think there's a whole host of things we need to address around early childhood learning, around early math and literacy. The fastest growing demographic entering our public schools is English language learners, which is why we are going out to recruit from schools with high English-language-learner populations. Kids with special needs are another huge area, whether it's attentional issues or ASD or whatnot. I think there are huge opportunities for giving everyone the best possible chance of being successful by addressing a diversity of learners; that's where a lot of the high-impact opportunity is. On the flip side, you can look at workforce retraining: with all the concern about AI replacing our jobs, I think there's a huge opportunity for AI to actually retrain us for new jobs. That could become another huge impact area for education. There's a lot more discussion around lifelong learning as well, and continuing education. It's all important, but for me, after looking at all of the data and all of the opportunity, it's the early childhood work. The fact that the die is being cast for way too many children, way too early in their lives, is just not right. So I do think that's going to be the biggest social-justice area of intervention for education. [Audience:] At one point you were showing how people perceive the personalities of Alexa and Jibo, and I'm wondering, for Jibo, how did that match the intended design of Jibo's personality? Is there anything that surprised you there? I think
that absolutely did match the design stance: Jibo is designed to be open and extroverted, and to convey and elicit this emotive, empathetic response. We would like it if he were perceived as more conscientious; some of that just continues to improve, and he doesn't have 10,000 skills yet, so there's some of that at play as well. But the punchline for me — especially from that running average, from seeing the trends — is that this is where the social robotics is really showing up. Alexa arguably has more offerings in all of those categories than a product like Jibo that's only been on the market a few months, and yet you see this different engagement. That's the social robotics coming through; that's a different way the robot shows up in people's lives, and it's leading to that engagement. I think this is something the field, and even the consumer landscape and corporations, have yet to really grasp: this kind of experience is quite different, and it's appealing to a diversity of users, which I think is also noteworthy. So the punchline for me is also: if younger adults are keying in on utility, that's just a matter of time for social robots; they'll have those skills too. At some point that's going to be moot, and then it really is about these other relational AI aspects. To me that says that in the long run, social robotics is going to be the next thing, because you can't just always be disembodied
talking speakers; clearly that's going to be the glide path, and we're starting to see it in this engagement data. And again, it comes down to the way our brains work: we evolved to want to connect and relate to animate entities in this way, and the more you design to support that, the richer the engagement. So it's not surprising when you think about it that way, but the industry has to catch up. Right now our spoken language systems are very much like playing chess: discrete turns. But human communication is a dance, not chess; it's mutually regulated, dynamic, fluid — all of these things — and that's where all of these other things, like liking and trust, come into play. [Audience:] Maybe this is projecting way too far into the future, but I remember reading an article which said that dogs have been bred to the point where they prefer the company of humans to the company of dogs. So it's not that far a stretch to think that eventually the social robot will be socially far superior to other humans, and hence humans may start preferring the company of robots to the company of humans. How far in the future do you think that is? So, I will say this: people need people. We need to feel that we belong to our community, to feel valued by our community, that our community cares about us. We can't flourish if we don't have that. So I think these robots and these intelligent technologies are going to be a wonderful augmentation of society, but people still will always need people, and these technologies should be designed to enhance and support that. I'm not sure I accept your premise. I mean, it's certainly true of dogs, but
I'm not talking about the dogs; I'm talking about what it is to be human. Humans need humans. I'm not saying dogs can't be bred to prefer humans over dogs; I'm saying the way human beings are, by our ve