Mark Wyner – The Voice of UX — How Speech Technologies Will Change the UX Landscape
Welcome. I'll try to follow up all these amazing speakers today. Really good conference; I've heard a lot of good information, and people touched, interestingly, on a lot of things I'm going to talk about today. So let's get into it: the voice of UX is not your mama's UX, okay? This is an evolution of communication, and that is at the core of voice-based, aural UX systems, and that's what I want to talk about today.

So, the first method. If we go back and look at the origins of being human and the origins of human communication, we find ourselves at speech. This first method of communication was two-way: I speak, you listen; you speak, I listen. Writing came along much later on the timeline, and what they've learned is that the earliest forms of writing were in the manner of talking. It was still written in this two-way fashion, this two-way style.

It wasn't until much later that we got the Gutenberg press and the ability to produce publications that could reach far and wide. Later we had electricity and other technologies, radio and television, which enabled us to communicate to much broader audiences over greater distances, but in a new manner where it was one-way communication. Information was broadcast: we would communicate, and that's where the conversation would end, and thus it wasn't even a conversation. But it was still communication.

Then the internet comes along, and we have this whole new world of communication. It's sort of a blend of this two-way and one-way communication, because now it's interactive, and we're given tools like keyboards and mice and other ways of reaching vast audiences. But now we are communicating two-way, and so now we are communicating with a huge audience of people; we have access to millions of other human beings in this new capacity, in this new vehicle, yet in the original form of speech, which is two-way communication.
So today, with emerging aural UX, voice-based UX systems, we mostly remove the visuals, the displays; we mostly remove the tools that we've had, and we return to this primal form of communication, this two-way dialogue. Only this time it's between human and machine. We are now talking with machines, but we have this primal instinct about the way we communicate with other human beings, and when we have that conversation there are some nuances we need to get into, especially as UX designers. This brings about many new considerations, especially for us. As Ash talked about the ethics and the responsibilities that we have as designers, and what we put out into the world, we have to think about this very carefully.

So let's explore this. Primarily, the art of conversation. As we return to voice-based communication, we return to conversation, which is different from just communication. Conversation: very important distinction, and conversation changes everything about user experience design.

Verbal conversation, verbal communication, is our most natural and primal form of communication, as I just mentioned. But when you're talking about system communication, it's very awkward; it's not something we're used to. We have very sophisticated systems we can communicate with, like Siri and Google Assistant and Alexa, and we can have seemingly normal conversations with these machines, and it feels natural, it seems natural. But we have this array of other devices, like our Honda Odyssey, which isn't that old but has a very clunky system inside it. Every time I want to say anything, I have to push a button on the steering wheel, wait for a prompt, and then say something; and if the automobile speaks to me and I want to respond, I have to go through that whole process again. That's an important distinction, because it changes how we communicate. It changes this natural, primitive way that we have conversation; now we have these subtle variances and this awkwardness in how we have those conversations, because of the systems that we're designing.

But when you think about natural language processing, which is at the core of how we communicate with machines: how natural is that, anyway? This phrase right here, "time flies like an arrow; fruit flies like a banana," is something called a syntactic ambiguity. It's something linguists use when they're assessing the dialogue and the dialects of various languages and studying the evolution of communication. They look at something like this and ask: how does this transpire across all these different languages? How is our communication evolving, not just across different languages but within one language?
As our own language evolves, we have all these other considerations: we have context, we have colloquialisms, we have slang. It's all this evolutionary dialogue that we're creating, and it changes conversations. John McWhorter is a linguist, and he has this TED talk, "Txtng is killing language. JK!!!" It's absolutely awesome; I highly recommend it, check it out. He's an incredibly smart human being, and in this talk he covers this evolution of human language, he talks about conversation, and he talks about something linguists refer to as pragmatic particles. An example of one is "lol," laugh out loud. He explains that when linguists study the evolution of a language, they look for these pragmatic particles, which create what they refer to as markers. In this case he makes a reference to how LOL has evolved and changed the way we actually speak to each other as human beings: LOL has evolved from "laugh out loud," which was its original meaning, into what linguists call a marker of empathy. He cites this conversation: "I just sent you an email lol." "I see it, so what's up lol." "I have to write a 10-page paper." His point is that no one's laughing here: there's nothing funny about sending an email, and there's certainly nothing funny about having to write a 10-page paper. This is an evolution; this is how we communicate.

Then we see this move out into the world beyond the technology we're using. Look at hashtags, which were originally designed for tagging content for easier findability and indexability. But, as my ten-year-old son would say, "I've done something, hashtag like a boss." It's part of communication, and you in this room would understand it, but potentially my grandparents may not, and potentially people in a developing country who don't use the internet may not. As Alessandra referred to earlier, we're talking about ways of communicating which are unnatural for people who don't have that context. You have something like this hashtag where there's no relevance. My daughter, my eight-year-old, was texting me. She loves this pink sheep here, Pink Sheep, that YouTube character she watches who says "prankster gangster." She was teaching me how to text, that's what she was doing, and so she said, "hashtag prankster gangster," and then, "there you go, you're getting better at this; you're learning how to text, Dad," because now I'd referenced this hashtag. It was really funny, but this is something that has no relevance in this context; this hashtag does nothing for this conversation, but it's a way of communicating that we understand, and my eight-year-old understands.

So when we're talking about conversation, a really important component is perspective, and I'm going to get into why this is important in a moment. In UX, as UX designers, we often assume perspective and we assume context. We do our best to truly understand the perspective of our audience, but a lot of times we make assumptions. This image right here: immediately there was something about it, and I didn't know what it was; it took me a second, and then I figured it out. I was sitting in a physician's office waiting for the doctor to come in, and I saw this map on the wall: a fire-escape route.
It's important information: you're going to die if you can't follow this map. This is not a board game. So I'm looking at this, and I'm thinking, okay, this is on the wall adjacent to the door where I'm leaving the room. Now, if you look at that map and consider it very carefully: this is the perspective I need, because it's on the wall, that's the door I'm exiting, and I need to turn left, not right. Not like this. I'm probably going to figure that out; many people will figure that out; I'm not doubting the intelligence of human beings. But I'm going to get into this idea of cognitive load, especially in situations which are paramount to our survival, and into something like perspective and how it can impact everything about our moment in time.

Here's another element of perspective which is really interesting, and this bothers me all the time. I ask Siri, "Remind me to pick up my book at Powell's," and Siri says, "Okay, I'll remind you," and sets the reminder as "pick up my book from Powell's." When I get this reminder the next day at 2 p.m., I'm going to read "pick up my book at Powell's," and that's going to make sense to me, and I'm going to pick up my book, because it is my book. But in the moment when I'm creating this reminder, I am having a conversation with this machine, and when I'm having this conversation with the machine, there's a very important thing happening. If I ask Ash to remind me tomorrow that I need to pick up my book at Powell's, he's not going to say, "Sure, I'll remind you to pick up my book at Powell's." He's going to say, "I'll remind you to pick up your book at Powell's."

Now, this is subtle. This is a nuance, and it may not seem huge, and it's weird that this bothers me all the time when nobody else thinks about it. But think about the cognitive load that is required. As subtle as it may seem, it changes our perspective on this conversation we're having with the machine. I'm asking it to remind me about my book, and yet the machine is saying "pick up my book." There's this element, sometimes subtle, sometimes subliminal, but it's there, and it impacts our cognitive load, and it impacts our ability to communicate with the machine, because we're trying to get something done.

So let's talk about cognitive load and working memory. Working memory: some people refer to its sibling, short-term memory, but it's not the same thing. Part of cognitive load is working memory. Working memory is the human brain's version of RAM, this temporary data storage that we have, and many if not all of you have experienced its limits at some point: walking into another room, saying, I'm going over here, there's this thing I'm going to get or something I'm going to say, and you get into that room and you're like, I have no idea why I'm in this room. This has happened, and that's a failure of working memory, and that's a perfect example of something that's incredibly critical when we're thinking about interfacing with a machine using a voice-based system.

So, something I learned when I was researching for this talk, a very interesting thing: researchers have found that there's a direct correlation between cognitive load and our pupil dilation. Isn't that fascinating?
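To make that Powell's example concrete: the perspective flip Siri skips, turning the speaker's "me" and "my" into "you" and "your" before reading a reminder back, is mechanical enough to sketch. This is a minimal illustration with a deliberately tiny word list, not how any real assistant actually does it.

```python
import re

# Hypothetical perspective-mirroring table: flip first-person words the
# speaker used into second person before reading a reminder back.
MIRROR = {"my": "your", "me": "you", "mine": "yours", "i": "you"}

def mirror_perspective(utterance):
    """Rewrite an utterance from the listener's perspective."""
    def swap(match):
        word = match.group(0)
        repl = MIRROR.get(word.lower())
        if repl is None:
            return word  # not a perspective word: keep it as-is
        # preserve a leading capital ("My" -> "Your")
        return repl.capitalize() if word[0].isupper() else repl
    return re.sub(r"[A-Za-z']+", swap, utterance)
```

So `mirror_perspective("remind me to pick up my book at Powell's")` yields "remind you to pick up your book at Powell's", which is the phrasing a human would use, and the phrasing that keeps the conversation's perspective intact.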
You can literally measure the cognitive load of a human being by measuring the dilation of his or her pupils. Researchers used this measurement system to study the cognitive load of responding to aural versus visual tasks: somebody telling you to go do a thing, versus seeing something written or printed on a screen that says go do this task. They learned that cognitive load is much higher when receiving aural tasks. So think about this high cognitive load, think about our working memory's capacity for failure, and then think about interfacing with a system where nothing about it is visual. You now have something that's very challenging, something that UX designers need to respond to.

And there's this other element of cognitive load: we choke. Athletes choke. Speakers choke; they forget what they want to say. It happens all the time, and it's related to cognitive load. Sian Beilock is a professor of psychology, and she notes: "Choking is suboptimal performance, not just poor performance. It's a performance that is inferior to what you can do and have done in the past, and occurs when you feel pressure to get everything right."

So she's talking about something that's easy for you, that you do all the time, and when you feel that pressure, the cognitive load goes up, your pupils dilate, all these things are happening, and you forget, and you fail, and you don't do things the way you know you can do them. This is important especially in voice-based UX, because voice-based UX is built on timers. It's a system that requires prompts, for call and response, and when you don't meet that timer, the pressure is on and you can fail. This is performance anxiety.

Compare that to what we're accustomed to from the origin of the web: you have this blinking cursor, and that blinking cursor is going to wait all day long. You can go to lunch, you can go to the pub, you can come back and wake your computer from sleep. Blink. What do you want? What are you searching for? That patience does not exist in voice-based UX. Voice-based UX asks us to make haste. So you've got to think about how you reduce this cognitive load. Think about an element like Siri, and I've done this: the prompt timer starts, I hear the beep, and I go, "Remind me tomorrow to pick up that book... what was it called? Oh..." and then she heard me, and there it is, there's my reminder, and sometimes I'll just leave it that way because I know what I needed to do, I know the book. But this happens. This is real. Many of you, if not all of you, may have experienced this type of performance anxiety, and it's related to cognitive load.

Interestingly, earlier we saw another slide titled "path of least resistance." Mental models: that's how we as UX designers create this path of least resistance; this is how we decrease cognitive load. So let's talk about that. There's a pianist by the name of Clarice Palk, and she says, "People don't know what they like; they like what they know." I love that quote. I think it's beautiful, and I think it's highly accurate, for everyday living and for what we do for a living, because we are creatures of comfort. We love comfort, and mental models feed our comfort. Mental models make the new seem familiar. I'll say it again: mental models make the new seem familiar. Mental models are how we create comfort; mental models are how we create a path of least resistance.
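Those prompt timers are worth making concrete, because they are the mechanical heart of the performance-anxiety problem. Here's a toy model of the call-and-response window; the numbers (an eight-second window, one re-prompt) are made up for illustration, not taken from any real assistant.

```python
# Hypothetical tuning: how long the mic stays open, and how many times
# the system re-prompts before giving up on the user.
RESPONSE_WINDOW = 8.0
MAX_REPROMPTS = 1

def run_dialog(responses):
    """Walk a scripted dialog: each item is (seconds_until_user_spoke, text).
    Returns the accepted utterance, or None if every window expired."""
    attempts = 0
    for delay, text in responses:
        if delay <= RESPONSE_WINDOW:
            return text            # user answered inside the window
        attempts += 1              # window expired: the pressure is on
        if attempts > MAX_REPROMPTS:
            break                  # system gives up; the task fails
    return None
```

The design point is the asymmetry with the blinking cursor: here, silence is a state the system acts on, so a user who blanks for a few seconds, exactly the choking Beilock describes, gets a failed task instead of a patient prompt.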
Mental models help us decrease that cognitive load, decrease that choke-ability, decrease that performance anxiety which is inherently part of voice-based UX systems. So think about Tesla. They release their first car, the Model S I think it was, with an unprecedented touchscreen environment in the dashboard. This didn't exist in automobiles. They were like, you know what, everybody likes touchscreens, let's take a tablet, throw it on the dashboard, and we're done. And I'm not even going to get into why it's highly inadvisable to be fumbling with a touchscreen while you're doing 60 miles an hour, going "I don't know where the volume is for this"; that's a whole other talk, I'll give it next year. But there's no mental model for this touchscreen. So they reach for a design mental model: they go back to the UI model of skeuomorphic design, reasoning that if we make it look like a button, people will know how to use it in the car. And this is very important.

What's cool, when we're thinking about these mental models and how we apply them, is that search is a great place to look. Search is an integral part of everything we do on the web; search is a big part of our voice-based UX systems; search was a big part of the web to begin with. We use the web to search for information; this is an inherent, core origin of the internet.

So let's look at visual, aural, and multimodal search. Visual search, let's begin there. This is something we're all accustomed to; this is where we began. Lindsey Horan: my favorite midfielder in the NWSL in the United States of America; she plays for our Portland Thorns. Amazing human being. If I want to find more information about Lindsey and I use a search engine, I get all these results. I can scan this page, I can whip through it; there are mental models for what this page looks like, the layout, all this information. It's quick, it's easy, it's done. No problem. Now, people with visual impairments can't scan this page, so: screen readers. That's the alternative; that's the adaptability; that's what we use to accommodate people. And so this, a list of the page's headings, is what a screen reader would return.
Now, is this a reasonable aural return? For screen readers, the answer was yes, and this is what you return. But if you're thinking about new voice-based systems, which aren't only for people with visual disabilities and impairments, then you have this whole other situation where lots of people get involved, and they're really invested, and they want to make an awesome system. So, is this a reasonable aural return? Probably not.

So Google asks: what is a reasonable return on something like this? This is a multimodal experience: you have Google Assistant on your phone, you speak into the phone, and you get visual results. Google says, well, for one tiny screen, let's limit this space. Do we want all these results? How can we simplify? A single return, the right information: I can probably guess; there she is; done. Then you have Google Assistant crossing two modalities. You first have your mobile phone, where you have a multimodal experience: I speak and I see a visual return. Then you have the Google Home, which has no visual return, so it's a single, aural-based model. Now you have these multiple modalities to design across, and you think about mental models. Google asks: that aural return, the list of headings, do we want to return that on Google Home? And if we return this list of all these headings when we're searching for Lindsey Horan, how do we reference which one we want, and where do we go from there? So Google says: let's distill this down to one result, the most meaningful information, done, and then let's do the same on the mobile phone. Now you have a mental model you can take with you across multiple modalities. This is a smart, adaptable decision Google made to handle this type of design decision-making.

Accessibility. We've got to talk about accessibility. I touched on it with screen readers; it's really important. This is my son, Cassius. He is ten years old, and he can only read and write a handful of words, because he has a developmental delay. Fourteen percent of Americans, according to the Department of Education, can only read at his level. Fourteen percent of all American adults are reading at the same level as my ten-year-old Kash, who by academic standards in America is well behind the other children his age. That's a significant number of people we have to accommodate.

This changed his life. This voice-based UX system, the ability for him to go find Star Wars Lego minifigures on his own, to find videos, to just take control of this system, where every ten-year-old boy wants to watch these great Star Wars episodes and whatever else, and The Lord of the Rings. This is important, and this is really life-changing: this aural-based UX system we're creating has an immediate benefit for the illiterate population.

Now, an important element when we're talking about accessibility: we're accustomed to a visual web; this is where we began. We have this visual web, and we made accommodations for people with visual impairments: we created screen readers and VoiceOver and other things. They adapt reasonably well; they're not amazing, but they do adapt. For hearing impairments, that's a much bigger mountain to climb: how do we visually accommodate a voice-based UX system?
Like this Amazon, echo they create the echo you speak they create this thing called voice cast and you can pair it with a Kindle a visual, and so now it's Gabe you've got this like, kind of you know makeshift. You know multimodal, experience, I speak I get the visual return not.
Ideal You, know loose accommodation, but it's an accommodation and then, just now they just released this echo spot this is a true multimodal, experience, so they're saying okay this, is now an accommodation. You, know for people with hearing impairments, and it's. A great benefit to everyone else right this is a really intelligent. Ux. Design decision, saying, this is gonna be this. Is gonna meet everyone's, needs all at once and this, you know it seems so simple and it seems so obvious but. Not everyone gets to this point and they made a really smart decision when they went through this process so. Let's. Talk about the language barrier. This is a whole other arena and. Alessandra. Talked about this earlier today she talked about this language barrier. There. Are over 7,000. Languages spoken worldwide, spoken. Written read worldwide, 7,000, languages, devices. Can be localized, right we have lots of devices different computers things you, can shift. The same device you know to 10, different countries and each one can be ready to go and localize in that particular regions, language. So. I want to talk about that this, slides for you in wherever you are in this, is uh one. Of my favorite, football defenders, in the entire world plays for Manchester United. This. Is his name. Now. I can type his name into a visual search engine, and. Voila there he is there's, my man okay. But. This is. Not. Eric, Bailey, as you might pronounce it he's a French. Man. From, Ivory, Coast and his, name is pronounced, Erica by E now. This brought a shitstorm, into, my life when I was communicating, with my devices I will tell you right now. Hey Siri what national, team does Eric bayi play for. That's. What I got, not. Quite not quite let's try okay let's just try Google ok. Google what, national team does Eric bayi play for. Eric. Barry we're. Good we got sports but we're not we're not quite there yet what, if I add some context. 
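You can see why context is decisive with a toy model: string similarity alone picks the wrong Eric, and a small prior for the user's known interest flips the answer. Everything here, the candidate list, the domains, the boost value, is invented for illustration; real assistants use acoustic models and far richer signals.

```python
from difflib import SequenceMatcher

# Hypothetical candidates a system might consider for a misheard name.
ENTITIES = {
    "Eric Berry":  "american football",
    "Éric Bailly": "soccer",
    "Eric Bana":   "film",
}

def resolve(heard, interest=None, context_boost=0.3):
    """Pick the entity whose name looks closest to what was heard,
    nudged toward the domain the user is known to care about."""
    def score(name):
        sim = SequenceMatcher(None, heard.lower(), name.lower()).ratio()
        bonus = context_boost if ENTITIES[name] == interest else 0.0
        return sim + bonus
    return max(ENTITIES, key=score)
```

Without the interest signal, `resolve("Eric Barry")` confidently returns "Eric Berry", the exact failure above; with `interest="soccer"`, the prior outweighs the closer spelling and it returns "Éric Bailly".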
"What national team does the soccer defender Éric Bailly play for?" There it is. Google knows. It knows I search for soccer players; now it gets it. There's context from which it can deduce all this information, and it concludes: he's probably not looking for Eric Berry; he's probably looking for Éric Bailly; I misheard him, because I'm a computer that can't understand the pronunciation of "Bailly."

So this is artificial intelligence using its machine learning, using its context, and understanding that I search for soccer players all the time; it's my favorite sport in the world and I research it constantly. Now it knows, and the next time I ask without that context, "What national team does Éric Bailly play for?", there it is; I get the results. This is really important, because the original search had no useful return, and it took me providing some context and then asking again to get back to this place where I want to have a natural, primal, comfortable conversation with the machine. It put the onus on me to adapt my search inquiry to get the results I wanted. That's something for us to consider. That's a language barrier.

So let's look at context and censorship, specifically privacy, things like that. You all know this one, or at least those of you with potty mouths like me: this is the bane of our texting existence. In a visual environment, I can see what's being censored. I know what I'm typing, and "ducking" is not what I meant, sort of, but you know. I can tap that little X, it's gone, and okay, now we're getting back to business. Speech-to-text, where I don't really know what's going to be printed? Not so much. This is a text I got from a friend of mine: "holy F," just hilarious, right? I'm thinking, that's hilarious, what's the deal here? She says, "I love the way my voice activation doesn't spell the word eff out." What? I'm thinking about this, and at first I thought she was just being sensitive, though I don't know why she'd be sensitive with me about cuss words. But no: she's saying the machine is censoring what she says and not printing her cuss words into the text message. "What kind of device are you using? I'm going to use this in my talk in New Zealand." "F Android. My droid is an a-hole." But that's a story for another day.

So let's talk about privacy. Public spaces, where privacy of auditory data is a concern.
I don't want to say my PIN, my password, or a credit card number when I'm talking to a system, and I don't want to be disrespectful when I'm around other human beings. So how do we adapt to that? We have to think about it as UX designers. Google Assistant, the app on our phone: great Scott, you tap this little keyboard icon and suddenly you don't have to use your voice; you can type. That's adaptability. This is a good multimodal experience that can be converted to a single-modal experience because of the context, because of the understanding that these UX designers added.

And here's something I've seen, and I know a lot of you have seen, over the past couple of years, especially in feeds like social media feeds: you're scrolling through and videos start auto-playing. This is a newer sort of thing; you used to have to actually press play, but now they play automatically. This brought something new into the spectrum of watching these videos. The platforms reasoned: it may be an inopportune time for audio to suddenly be blaring out, so if we're going to autoplay videos, we've got to keep them silent. But now you have video makers whose viewers are hanging out on the airplane, or on the can, or whatever, flipping through and seeing a video, maybe watching it in silence, or maybe meaning to watch it later and missing it because it gets buried in their feed. So the video makers say: let's think about this for a minute; what if we hard-code some transcriptions right into the videos? I'm not talking about closed captioning, because if my hearing is fine, if I don't have any hearing impairments, I'm not going to have closed captioning on. But if these are hard-coded, printed into the video, now I can see them, and now I can watch a video on silent. That's a good UX decision in thinking about this single-modal experience of watching a video; this is really intelligent.

So I want to get into this idea of personality, and trust, and control; these for me are key components when we're talking about voice-based UX systems. One of the earlier speakers talked about personality, about creating this personality with words, and he's absolutely correct: it is paramount in a voice-based UX environment, and I'll tell you why. Conversations change the psychological and emotional relationships that we have with machines, because they make those interactions suddenly personal. These dialogues we're having with machines trigger our Darwinian buttons, all the way back to the roots of who we are as human beings and how we communicate with each other. Suddenly it's personal, because we're having a conversation. This idea of speech is very, very important. As subtle or subliminal as it may seem, the psychological and emotional impact is huge, especially when you bring in the factor that we're talking to machines and not other human beings.

So how do we address that? Personality. Personality is a key UX component when we're thinking about our relationships with these machines, because when we think about personality, we begin to investigate a lot of things in this personal dialogue: the personality of the machine, the intonation of the voice, and other nonverbal communication cues. Nonverbal communication is a significant part of how we communicate, how we converse with each other: our facial expressions, our pose, our stance; we fold our arms, we roll our eyes when someone says something ridiculous. And there's the famous Wonder Woman stance; Amy Cuddy, I think her name is, gives a TED talk on this whole thing, the power stance, and how it changes us psychologically and changes the perspective of people looking at us. Like right now, look, see, now look at me; you all think I'm Wonder Woman right now; suddenly it's like magic. But it really does matter, and this is important: nonverbal communication cues are currently undetectable by AI in environments where there is no visual feedback mechanism.

But for what is verbal, we can begin to explore personality. That's how we can play with the cues we might be missing in other ways. But personality can harm as much as it can help. Personality says, Ricky Baker, anyone? Come on, this is New Zealand, he had to find his way into this talk. Personality says, "Do you trust me, Ricky?" Sorry, sorry, I'm done now. So, there's this thing called the aesthetic usability effect. Please raise your hand if you've heard of the aesthetic usability effect.
Such a small number of you have heard of this; that's amazing. Every single person in this room needs to go look it up, because it was life-changing for me as a UX designer, as a UI designer, in anything I've ever done. The aesthetic-usability effect is the idea that aesthetic designs are perceived as easier to use, and that their shortcomings are more likely to be forgiven.

There was a study by the Hitachi Design Center on human-computer interaction in 1995, and in the report on the results they noted that users are strongly influenced by the aesthetics of any given interface, even when they try to evaluate the underlying functionality. People were given an interface and asked to use it: was it easy to use? That was the fundamental question. With the pretty ones, more often than not, they said yes, it was easy to use. With the ugly ones, they said they had some problems. This is a very real effect. It's because humans like pretty things: we love beautiful art, we love museums, we love music, we love each other, we love to look at each other's faces and talk to each other. Any element of beauty, of aesthetics, is really important to us.

And when we're talking about this in the realm of personality and voice-based systems, we're talking about the biomimicry of UX design. There's a designer named Bert Brownton gum who talks about biomimicry being at the core of voice UX, with a skeuomorphic layer above that core level of how we interact. In an article he wrote about this, he notes that biomimicry manifests itself at a much deeper level than skeuomorphism and concerns the how and the why a product solves human problems, while skeuomorphism is the initial and temporary literalism for human interaction.

Personality is a concrete component of UX because of trust. At a minimum, we trust (I love this) that UX systems will be usable, functional, and delightful, as Ash talked about. And here's the thing: personality drives that trust. There's been research on this, and when you think about it, personality applies to voice-based UX the way aesthetics apply to visual UX. Personality is the aesthetic-usability effect of oral UX systems. There's a qualitative study on personality and trust and the correlation between the two, and Robert Sikora, who was directing the research project, says the findings showed that agreeableness was a significant predictor of propensity to trust: the higher an individual's agreeableness and emotional stability, the higher their propensity to trust.

So as UX designers, we spend all this time making things easier to use, making them more accessible and usable to people. But will personality supersede all of our best intentions to make a system easy to use, when we're talking about a voice-based system? Well.
People don't trust systems that they don't like, just like they think systems whose UI isn't beautiful aren't very usable.

Then there's the element of control. Human beings love to control; we are controlling beings. When we have control, we feel safe, we feel secure, and we want to continue on that path. When we don't have control, things get scary for us. So control is an element of having trust in a machine, and when we have that trust because we love its personality, it's a great recipe, a great cyclone of personality, trust, and control.

So I want to read a passage from a book called Old Man's War, by John Scalzi, which is incredible (I'm an AI enthusiast; I give talks on this). There's a passage in there which is highly relevant, and I think very realistic, about where we will go with these types of systems when we're talking about control. In this particular chapter, a character is setting up his new voice-based AI system, an oral-UX AI system. He doesn't want to use it; he's being forced to use it. He's saying, no, I don't want any part of this, I don't trust this thing. So they designed it so that he has control of the system, and this is what happens:

"Many BrainPal users find it useful to give their BrainPal a name other than BrainPal. Would you like to name your BrainPal at this time?"

"Yes," I said.

"Please speak the name you would like to give your BrainPal."

"Asshole," I said.

"You have selected Asshole. Be aware that many recruits have selected this name for their BrainPal. Would you like to choose a different name?"

"No," I said, and was proud that so many of my fellow recruits also felt this way about BrainPal.

"Your BrainPal is now Asshole. Now you must choose an access phrase to activate Asshole. Please say your activation phrase now."

"Hey, Asshole," I said.

"You have chosen 'Hey, Asshole.' Please say it again to confirm." I did. Then it asked me to choose the deactivation phrase. I chose, of course, "Go away, Asshole."

"Would you like Asshole to refer to itself in the first person?"

"Absolutely," I said.

"I am Asshole."

"Of course you are."

Personality, trust, and control. I think that's a very realistic passage. You provide control: think about this character in the book, a real tough grease-monkey, blue-collar military guy with his fellow recruits, and they're handing him this technology he doesn't want to use. So he names it Asshole, and that makes him feel really good, because he's in control. He knows he has the upper hand; he is better than this machine. That's really important, and now he has trust, because he can say, "Hey, Asshole, what's the weather like later," and snicker or whatever, and that makes him feel comfortable. That's a very realistic way we can think about UX in the form of personality and how it impacts trust.

This is not your mama's UX. This is an entirely new landscape of UX, and it's built around the art of conversation: personality, trust, and control. So as we move into this new territory, let us tread wisely.

Thank you.