Conversational AI: Best Practices for Building Bots


Welcome to our breakout session on conversational AI best practices for building bots. My name is Elaine Cheng, and I'm very honored to co-present this session with my colleague Vishwac. Both of us are from the Azure Bot Service team. Our team, together with a lot of partners, has worked with many customers around the world building bots, and in today's session we're going to share with you the best practices we've learned.

Nowadays bots and conversational AI are transforming how people interact with computers and how businesses provide services to their employees and customers, across different languages, cultures and industries. So let's see a show of hands: how many of you have built bots? Great. How many of you believe you know how to build great bots? Not a lot. Okay, so what we're hoping is that by the end of this session a lot of you will be able to walk away with the answer to that question: how to build a great bot.

To help companies across the world build conversational AI for their digital transformation journey, two years ago at the Microsoft Build conference we announced the public preview of the Microsoft Bot Framework. Last year we announced its general availability as Azure Bot Service, and in the last six months Azure Bot Service has achieved enterprise compliance, including successfully completing audits for ISO, PCI, HIPAA, and most recently SOC 1 and 2.

This week at Build we announced major updates to Microsoft's conversational AI tools. A lot of you have probably seen the different sessions throughout Build, and you can read more details in the blog post on our website. We're very excited about the momentum for conversational AI: we have over a thousand companies across the world building their business with Azure Bot Service, and you can find detailed customer case studies at customers.microsoft.com. Let me show you a few examples just to see how bots are used and in what scenarios.

The first one I'm going to show you is the Who Bot in Microsoft Teams. In Teams you can go to the app store, and there's a bot category where you will see a lot of bots; one that I use a lot is this Who Bot, so let's take a look. The Who Bot is integrated with Active Directory, Microsoft Graph and Exchange, so it can really help you navigate; this is a very typical employee productivity scenario. It's easy to use: in the text box you can see a list of sample queries to try, and it doesn't only handle the simple "who is this person", it also handles more interesting queries such as "who knows about this particular topic", for example "who knows about customers". One of the common things people want to know is who was in a particular meeting with them, so you can easily ask "who was in the meeting about..." and it will search your calendar and get that information for you very fast.

The second one I'm going to show you is called Eva, which stands for executive virtual assistant. This is a bot we use in the Microsoft executive briefing center, where we host VIP customers from across the world when they come to Microsoft headquarters or other locations; this particular one is used in Redmond. You can see the different scenarios it can help with.
When visitors come to the center, it can help with directions, or it can help you get information about the particular meeting or briefing you're in. It's also very adaptive to the user context, including multi-language support, so I can ask it a question, "tell me a story about Microsoft", in Chinese. (The question is asked in Chinese.)

Let's see what Eva will tell me. So this is actually one of the traditions at Microsoft: we provide free beverages, and every year there are over 23 million drinks served, and the most popular ones are chocolate milk and orange juice. You get the idea: it will help you get some interesting facts about Microsoft while people are visiting.

The third example I'm going to show you is Flo for Progressive. Progressive is one of the major insurance providers in the United States. To take advantage of the increasing use of the mobile channel to interact with their brand, they wanted to build a chatbot, so they worked with the Microsoft services team and chose to deploy the bot to Facebook Messenger. Let's have a conversation with Flo. One thing about Progressive is that they're really well known for their iconic spokesperson Flo, who is funny, insightful and also friendly, so they wanted that same consistent personality to show up in their chatbot. You can see that when you get greeted by Flo, you're already seeing some of that personality in the conversation. Flo can do a few things: it can answer common insurance questions, and it can also help you get quotes, which is a typical task-completion scenario. So let's ask her, "who are you?", and see what she says. She gives a playful answer about being a unicorn at the number one insurance company, and she even uses a little bit of emoji; that's how the personality really shows up in the chatbot.

So let's summarize from these few examples: what do you think makes a great bot? Is it how many AI services it uses? Is it whether it uses buttons or cards? Is it whether it uses voice or not? Great bots are very similar to great websites or web apps; the one thing they all have in common is that they have to provide a delightful user experience.

Let's break that down. A great bot has to provide a valuable experience: it addresses a particular user need and solves a problem effectively. In the Who Bot you saw how it addresses the employee productivity scenario; the Eva bot addresses the user need when visiting the Microsoft executive briefing center; and the Flo bot addresses insurance needs.

A great bot also has to be accessible. What do I mean by that? It's adaptive to the context, to the user and to the environment, and it's easy for the user to naturally know how they can use the bot. And a great bot has to be effective: actually better, easier and faster than the alternative experiences. Otherwise, why should people choose a bot?

Azure Bot Service provides the most comprehensive experience for creating conversational apps. A typical bot development workflow includes planning, building, testing, publishing, connecting and evaluation. In today's session we're going to focus on best practices especially in the planning stage, the build and testing stage, and evaluation.

So let's get started with the planning phase. We talked about how a great bot provides a great user experience; so what makes that experience? In the last few years we've seen a lot of developers building bots and a lot of users using bots, and one of the common mismatches we see is between what the developer thinks the users expect and what the users actually expect. So it's very important, when you start planning a bot, to think from the user's perspective.

In the communication between a bot and a user, what the user expects from the bot is: what I say is understood; the response I receive is appropriate; the service I get is delightful; and if I leave and come back, the context is carried forward, so where I left off is remembered.

How do you design such an experience? There are five best practices we want to talk about in the planning stage. Number one, form a multi-discipline team. Number two, understand your users in order to identify the bot use cases. Number three, create a bot measurement plan. Number four, define the bot persona. And number five, design the conversation flow. After you define the experience right, you can build that experience in parts, starting simple and layering in sophistication, and we're going to show you how.

Bots are no different from apps or websites in that having a diverse team is very important. This doesn't mean every team has to have this exact composition; the point is really about having a diverse team. Take the Flo chatbot as an example: the proof of concept was built by our Microsoft services team for Progressive, and in that five-week engagement they had two full-time developers, one full-time designer, one part-time insights consultant and one part-time PM.

So after you form the diverse team, where do you start? We really recommend taking a design-thinking approach and starting by understanding the target users in order to identify the bot use cases. Who are they? What is the full range of your users? What are their objectives, challenges, needs and expectations? And then, where are the potential use cases where the bot will add great value?

This is a tool we use in some of the bot design workshops. It helps you quickly identify the full range of users, their situations, settings and environments, and then their tasks. After you identify these, you can prioritize which of them make the most sense for a bot. Also, if you run this as a design workshop, it's really recommended that you do it with a diverse set of stakeholders from across your organization. If you're building a customer support bot, don't include only the customer support team; you really want to bring in other teams, including operations, IT, marketing or even legal, to make sure you have that diverse understanding of your user. It's also important to spend time with your actual users: if it's a call center, spend time in the call center and listen to the calls. That will help you build true empathy for the user and make sure you know what questions they are asking, how they are asking them, and what tasks they really need help with.

After you identify those use cases, it's a good idea to start defining the measurement plan. The importance of having the measurement plan in the planning phase is that you shouldn't treat analytics as an afterthought. That's a common pattern, just like app development: you build something and only later think, "oh, I should measure it." If you do this at the planning stage, you already know: this is the use case and these are the success metrics.

Then a very important piece is defining the bot persona. Any bot is really representing your brand, your product, your services. Some companies spend a lot of time training their customer-facing staff to make sure they know how to talk to a customer: what to say, what not to say, and if things go wrong, how to act. The same thing needs to apply to the bot. You need to think about how your bot will represent your brand, your product or your services. And I'm not saying this only for external-facing bots; even for internal, employee-facing bots, take an HR bot as an example, the bot represents your team's service, the HR service. So it's important to think about how you design that personality, how you design that persona, to set the right user expectations.

There are a few aspects to this. How does the bot look? In a bot's case it's really the bot icon, which is usually the first thing your users will see. Is it a company logo that sets a professional tone, or is it consistent with an iconic spokesperson, as in the Flo case? What's the name? The name is how users are going to find your bot, and they may also address your bot by that name, so again, what expectation do you want to set? How does it sound? A bot, unlike a website or app experience, doesn't have a lot of visual elements for you to play with; it's all through the conversation, so you really need to think about that tone of voice. And finally, how does the bot respond in different situations?

For example, no matter how sophisticated or smart your bot is, there will be questions the bot won't be able to answer and tasks it won't be able to complete. What would you do in that scenario? Would you hand over to a real person, or would you find some other way to help the user?

After you design that bot persona and get the personality right, the last step of the planning phase is to think about how you design your conversation flow. There are two important aspects I want to talk about here: one is the user interaction modality, and the other is the dialog. Depending on the scenario, you should choose the appropriate one. This picture shows increasing complexity to the right, but it doesn't mean that for every scenario you have to pick the most complex option. The more complex it is, the more effort you should spend in the implementation to make sure you can really design that experience well; it's about picking the right one, and that's what matters.

For user interaction there are many different types: it can be text, it can be cards or buttons, it can be speech, it can be a custom app, a custom website or even a custom device, or a combination of those modalities. Depending on the scenario, how do you pick the right modality? Is the user already in a messenger app like Facebook Messenger, where they are already very familiar with the typing experience? Or is the user on a tablet or mobile form factor, where they are already very familiar with navigating and getting tasks done by clicking and tapping? Also, a picture really is worth a thousand words, so in certain scenarios you really want to communicate that way. Another scenario is when you want to limit the choices: when you say "welcome, how can I help you", you may want to offer a few choices to set the right expectation of what the bot can do, and in those scenarios cards and buttons make a lot of sense.

Then, when is it appropriate to use voice? It's very suited to some common scenarios, such as when the user is in a hands-free situation, or when the bot is integrated with a device that has limited or no display surface. The takeaway is that it depends on the scenario, and a bot doesn't have to handle every scenario the same way: a bot can support a combination of these, depending on the particular turn in the conversation.

The other important aspect is the dialog. There are different types of dialog, and these are the common ones. The obvious one is the one-turn response, the question-and-answer pair. That can also sometimes be one-turn chit-chat: you say something and the bot responds with something, but neither the user nor the bot has any follow-up questions. That's what we mean by one-turn response. The second type is more for task completion: the guided-assistant dialog. In that scenario the bot needs to collect a set of missing pieces of information in order to get a task done, and that's how you design the guided-assistant experience. Typically that scenario is multi-turn, but it also depends: if the user already gives all the information in the same turn, you don't have to ask for it. Common scenarios are ticket booking, product purchase or form filling. The third type, which could be a one-turn response or a guided assistant, is about adding context: the context of the user, the context of the previous turn, or the context of the environment.

After you choose the right modality and the right dialog, you design the conversation flow. There are multiple outputs you can produce; it depends on the scenario and on how your team likes to work. One common output we see is storyboarding on a whiteboard in the design workshop; some teams turn this into a video or do a wireframe; or it could be using the .chat files we support, which you'll see in the Visual Studio Code demo later.

So now I've walked you through the five steps of planning. After you've designed the experience, I'm going to welcome Vishwac on stage to talk about the best practices in building and testing.

Thanks, Elaine. All right, I want to talk to you about the best practices we've been able to distill for building and testing your bot; let me make sure the demo is set up. Elaine was talking about starting simple and layering in sophistication. When you think about building your bot, the bot itself needs to stitch together a bunch of different bot parts. Bot parts include language understanding, QnA, dialog, language generation, and Cognitive Services like vision, speech, knowledge and more. Cognitive Services help your bot have a more meaningful and more human-like conversation with your users.

But where do you start? It's all a spectrum. Elaine touched quite a bit on user interaction and dialog, but language understanding and language generation are a spectrum too: you can always start with the simplest thing for your scenario and then layer in sophistication as needed. Quite often what we see developers using as a yardstick is the feedback loop, the telemetry, telling them "my users expect this particular thing to behave this way, so what do I now need to do to make that happen", and then they evolve and add that sophistication.

So, three key takeaways, three key best practices for building and testing bots. Start simple and layer in sophistication based on your scenario needs. Build your bot in parts: if you start with all the AI services out there, you can get very sophisticated very quickly, but you need to absolutely nail the core conversation model first, and then the AI services can come in to augment that experience, make it better and more complete, and provide a rich experience for the user.

For the rest of my talk I'm going to take you through the journey of building a Contoso Cafe bot, a cafe bot for a fictitious Contoso Cafe company based in the Pacific Northwest. We'll go through the journey from a little bit of planning all the way to building something that stitches a bunch of AI services together, and what that journey looks like. I'm also going to use several of the new features we announced this week at Build; I'm sure you've seen a lot about the Cognitive Services that are coming as well, and I'll touch on a few. There's an entire spectrum at the other end for sophistication, but the core of the demo is going to focus on starting simple and then layering in sophistication.

So let's jump into the demo. Elaine was talking about how bots are no different from apps and websites. For planning, one of the typical things you'd want to do is create mock-ups of conversations between the bot and the user. Here in VS Code I have a mock-up of a greeting scenario, and here's one for Q&A; question answering, having knowledge, is another very critical thing every bot should have. Here's one for "who are you", because a lot of users might say "who are you" or "what's your name". And here's one for booking a table; this one is a little more sophisticated and takes multiple turns, because the bot needs several pieces of information to complete the task. As you can see, these are simple text files, so you can use any text editor you prefer, and the .chat files also support rich attachments like cards and images; in this case I'm including an adaptive card. All of this can be part of your initial planning and iterative design process.

Once I have the .chat files, with a simple command-line tool I can convert them into conversation transcripts and view them in the emulator. For the sake of time I'm not going to convert each one of them into transcripts here; in the brand-new Bot Framework v4 emulator I've got them all loaded. The emulator renders these transcripts using the web chat control, the exact same web chat control you can embed within your own application or website. So here's the greeting transcript, here's one for Q&A, here's "who are you", and here's the one to book a table. While you're in planning and looking at all of these design ideas with the team, you want to be able to iterate quickly; sometimes it's easier if the view is very clean and you can focus on the actual conversation, so we have a presentation mode in the emulator that helps you do that.

Okay, now imagine your design, product management and leadership teams are happy with the plan, and you're about to go build out these four scenarios. The rest of the demo is focused on that specific journey. A good starting point for your bot is to start thinking about language understanding and QnA Maker, the question-and-answer capability. Here in VS Code I have simple markdown-based language understanding files that describe language understanding for the four scenarios we identified during planning. Here's one for greeting; here's one for Q&A, and these actually render as markdown; here's one for "who are you"; and here's one for booking a table, and this one even includes entity definitions.
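For reference, a language understanding (.lu) file of this kind is just markdown-style text with intents, example utterances, and optional entity labels. The snippet below is an illustrative sketch, not the exact file from the demo; the intent names, utterances and entity labels are made up for the example.

```
# BookTable
- book a table
- get me a table in {Location=Seattle}
- can you reserve a table for {PartySize=4} guests

# WhoAreYou
- who are you
- what's your name
```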

Once you've done that: markdown files are great for authoring and collaboration, but services typically prefer JSON file formats, so with a simple command-line tool I can convert all of the language understanding files into QnA Maker and LUIS JSON models, and that's exactly what I've done here.

As Elaine was saying, for dialog the spectrum runs from simple single-turn conversations, to guided assistants, to context-carry-over conversations; that's the right way to think about the spectrum of complexity. So we're going to start by building out a simple question-and-answer scenario for the bot. QnA Maker (qnamaker.ai) is an excellent service to check out: you can point it at your existing FAQ URLs and it will consume them and provide a knowledge base of automatically generated question-and-answer pairs. QnA Maker is generally available now, and we also announced that you can now point it at PDFs, which might contain an enormous amount of information, and it can still crawl through that and come up with a seed set of question-and-answer pairs based on the document you pointed it to. So let's take a look at getting set up with QnA Maker.

Another thing we've done this Build is bring the full power of the luis.ai and qnamaker.ai APIs to the command line, so you can adapt these tools to fit your own end-to-end development workflow. In this case I'm showing that, using a simple command-line tool, I can take the output of the previous tool and pass it in to create my initial bootstrap model for QnA Maker. The other thing we've spent time on, and we've heard this as active feedback, is that as you build a bot and keep adding services, it gets increasingly complex and it gets really hard to keep track of all the services your bot depends on. The new CLI tools and the emulator make it super easy to keep track of all the service references: if you look at the second half of this command, I'm taking the output of the create step and passing it to another command-line tool called msbot, which keeps track of all of these service references in a simple .bot JSON file.

All right, for the sake of time I'm not going to actually create a QnA Maker model here; I've done that already. Let's see what the code looks like to hook it up. But before I do that, I want to spend time on two things. One is the basic code structure: every single message to the bot is an activity, and there are different types of activities. Every single great bot out there responds to a welcome message, the message that gets sent the first time the user either adds the bot to their context or starts talking to it. This is super critical for conversational interfaces, because unlike apps and websites, where you have buttons and UI that explain what your app or website can do, this is the only way the user will actually know what this bot can do. In this case the cafe bot comes back and says: I can help you find cafe locations, I can answer some questions about Contoso Cafe, and I can book a table.
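As a minimal sketch of that welcome handling, here is roughly what it could look like with the released Bot Builder v4 SDK for C#. This is illustrative rather than the exact demo code: the class name and greeting text are made up, and it assumes the generally available ActivityHandler shape rather than the preview SDK shown on stage.

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Schema;

// Illustrative only: greets new members so they know what the bot can do.
public class CafeBot : ActivityHandler
{
    protected override async Task OnMembersAddedAsync(
        IList<ChannelAccount> membersAdded,
        ITurnContext<IConversationUpdateActivity> turnContext,
        CancellationToken cancellationToken)
    {
        foreach (var member in membersAdded)
        {
            // Only greet the user, not the bot itself being added to the conversation.
            if (member.Id != turnContext.Activity.Recipient.Id)
            {
                await turnContext.SendActivityAsync(
                    MessageFactory.Text(
                        "Hi! I can help you find cafe locations, answer questions " +
                        "about Contoso Cafe, and book a table."),
                    cancellationToken);
            }
        }
    }
}
```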

The second best practice I want to touch on before we add QnA Maker: as your bot scenarios get sophisticated and complex, there can be cases where users feel stuck, like "I actually want to stop booking the table, give me an escape hatch." Planning for that even before you write any code or integrate with any services is super critical, and as you add more and more dialogs you need to constantly update the escape hatch and make sure the user is clear on how they can reset the conversation, start over, or cancel what's going on right now, what we call global flow control commands. Make sure you're planning and thinking about that throughout your development phase. I've already got that code in here.

The other thing I'm going to do is uncomment this code. By the way, all of this code is built on top of our brand-new Bot Builder v4 SDK, which is available on GitHub as a preview, so please do go check it out. Here I'm simply making a call out to the QnA Maker service; if I have an answer I come back and render that answer, and if I don't, the bot is going to say "sorry, I don't have any results for you" (a sketch of this hookup follows at the end of this section). Let's try and talk to this bot in the emulator; let's restart.

All right. As soon as I open my bot in the new emulator I can see, as I was telling you, the list of all the service references. In this case I've added a QnA Maker model, so I see a reference to it right here, and I can deep-link into my QnA Maker model from within the emulator. Let's try and talk to this bot. One of the questions we had was "who's your CEO"; this was one of the question-and-answer pairs we identified during planning, and it comes back with an answer powered by QnA Maker. "What are your locations" was another question we had, and it comes back and works. So it's very quick to go from almost nothing to a bot with knowledge using the power of QnA Maker, and it's super easy to get it hooked up in your code as well.

All right, next up: Elaine was talking about adding cards in the user interaction spectrum. The first thing was text, and we just saw that; the next one is adding cards. We announced Adaptive Cards last year at Build and we're seeing a lot of momentum behind them. They're a way to add very rich, interactive cards within the bot conversation. They help quite a bit depending on where your users are: if they're on a device or form factor where they're already used to clicking or tapping around, cards are very effective. Cards are also very effective for orienting the user and conveying a lot of information in one small interaction. We have a bunch of samples on adaptivecards.io; do check them out, and hopefully you'll be able to use them in your next bot. Let's see how to add cards to this exact same bot: nothing else has changed in the code; the only real thing I'm doing here is sending a card with the welcome message.

The other thing I want to do while we're here is set up the top-level dispatch. Until now, the only thing the bot did was respond to QnA Maker questions, and if there was no QnA Maker answer it would say "sorry, I don't know." But we do want the bot to respond to greeting, we do want it to book a table, and we do want it to respond to "who are you."
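As a rough sketch of that QnA Maker hookup, here is what it could look like with the released Bot Builder v4 SDK for C#. The endpoint values and fallback text are placeholders and the demo's actual code differs; treat this as an illustration of the pattern, not the session's source.

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.AI.QnA;

public class QnAAnswerer
{
    private readonly QnAMaker _qna;

    public QnAAnswerer()
    {
        // Placeholder endpoint values; these come from your QnA Maker resource.
        _qna = new QnAMaker(new QnAMakerEndpoint
        {
            KnowledgeBaseId = "<knowledge-base-id>",
            EndpointKey = "<endpoint-key>",
            Host = "https://<your-resource>.azurewebsites.net/qnamaker",
        });
    }

    public async Task AnswerAsync(ITurnContext turnContext, CancellationToken cancellationToken)
    {
        // Ask QnA Maker for an answer to the user's utterance.
        var results = await _qna.GetAnswersAsync(turnContext);
        if (results != null && results.Length > 0)
        {
            await turnContext.SendActivityAsync(results[0].Answer, cancellationToken: cancellationToken);
        }
        else
        {
            // Fallback when the knowledge base has no match.
            await turnContext.SendActivityAsync(
                "Sorry, I don't have any results for you.", cancellationToken: cancellationToken);
        }
    }
}
```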

So I'm just getting set up with the top-level dispatch, and in most of those cases the bot is simply going to come back and say "I don't know that" or "I'm still learning that." This is just to give you a sense of what's going on as we build this bot. Let's go ahead and run this one.

All right, let's try this bot. Right off the bat, notice there was a welcome message, but there is also a card now, and I can click on the buttons to jump into a specific scenario; in this case it's going to say "I'm still learning how to do that." "Who are you" just comes back with "I'm the cafe bot." These were literal hard-coded strings, so if I typed "hi" it works, but if I say "hello" it's not going to work. It said "sorry, I don't understand." Still, it's a good starting point for thinking about your outer layer, the five things your bot should do in terms of scenarios, and getting that set up.

All right, next up in the spectrum of complexity, let's take a look at how to do multi-turn conversations; these are the guided conversations Elaine was talking about in the dialog spectrum. Here, for "who are you" and "book a table", I'm no longer saying "I'm learning that", because the bot has now acquired those skills. For each of these I'm going to model them as a waterfall dialog. There are multiple ways to model conversations, waterfall being one of them; you might have heard of FormFlow as another, so you can pick whichever works best for you, but in this case I've decided to go with waterfall.

If you look at the actual dialog for "who are you", it's literally a waterfall: execute the first step, so if the user says "who are you", we use a text prompt to say "hi, I'm the Contoso Cafe bot" and ask the user for their name. Whatever the user replies, because it's a text prompt, we take exactly what they typed; there is no further layer of understanding on top. We take that as the user's name, come back, greet them and say "nice to meet you." That's pretty much all the who-are-you dialog does right now. Here's another one for "book a table"; this one is a little more sophisticated because we're trying to get four pieces of information: a location, a date, a time, and a party size (number of guests) before we can book the table. So if the user says "book a table", the bot goes through the waterfall asking for each of those pieces of information: city first, then date, then time, then number of guests, and then it confirms whether it should go book the table. Then you call into your back-end service to book the table, and you come back and say "okay, I've booked the table" or "something went wrong, try again later." (A minimal waterfall sketch follows below.)
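Here is a minimal sketch of a waterfall dialog along these lines, using the released Bot Builder v4 Dialogs library for C#. The dialog and prompt IDs are made up, and the API shape here is the GA v4 SDK rather than the exact preview code shown in the demo.

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.Dialogs;

public class WhoAreYouDialog : ComponentDialog
{
    public WhoAreYouDialog() : base(nameof(WhoAreYouDialog))
    {
        AddDialog(new TextPrompt("namePrompt"));
        AddDialog(new WaterfallDialog("whoAreYou", new WaterfallStep[]
        {
            AskNameAsync,
            GreetAsync,
        }));
        InitialDialogId = "whoAreYou";
    }

    private static async Task<DialogTurnResult> AskNameAsync(
        WaterfallStepContext step, CancellationToken cancellationToken)
    {
        // First step: introduce the bot and prompt for the user's name.
        return await step.PromptAsync(
            "namePrompt",
            new PromptOptions { Prompt = MessageFactory.Text("Hi, I'm the Contoso Cafe bot. What's your name?") },
            cancellationToken);
    }

    private static async Task<DialogTurnResult> GreetAsync(
        WaterfallStepContext step, CancellationToken cancellationToken)
    {
        // Second step: a text prompt returns whatever the user typed, so treat it as the name.
        var name = (string)step.Result;
        await step.Context.SendActivityAsync($"Nice to meet you, {name}!", cancellationToken: cancellationToken);
        return await step.EndDialogAsync(cancellationToken: cancellationToken);
    }
}
```

The book-a-table dialog would follow the same pattern with more steps, one per missing piece of information, ending with a confirmation step before calling the back-end booking service.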

So that's pretty much all we have, but we've gone from single-turn to also being able to support multi-turn, back-and-forth conversation. Let's try and talk to this bot; make sure it's running, it is. Now it's going through the waterfall, so it's asking for the city. Say Seattle, tomorrow, say 7:00 p.m., two guests, and now it's at the confirmation step asking to confirm everything. Everything looks good, and it booked the table. The other scenario we did was "who are you": if I say "who are you", it asks me for my name, I type in my name, it picks that up with the simple text prompt, and we're good. So we've got the core of the conversation model figured out, and now we can start layering additional sophistication on top.

Let me show you two typical sophistications that are best practices to take back and think about adding to your bot. The first: in the new Bot Builder SDK we have a very rich variety of prompts, because as you saw in the previous demo you might have been thinking, "wait, you're using a text prompt to get a date, a time and a city, but I want to set constraints; I don't want my user to pick a date in the past, and my cafe is only open during this time window." The prompt system in the Bot Builder v4 SDK allows you to elegantly model all of that. So all I've done here is add a choice prompt, a number prompt, a date-time (Timex) prompt and a confirmation prompt. The choice prompt is what I use when asking the user for a city: instead of a text prompt I've swapped it to a choice prompt, and the message we use to ask the user is exactly the same. For date and time I'm using a slightly more sophisticated Timex-based prompt that lets me set a bunch of constraints; it can actually understand a date and time, parse it, and give me a result only if it matches my constraints, and I'll show you that code in a bit. For the number of guests we're using a number prompt, then a confirmation prompt to confirm, and we're done. The last piece: after we've booked the table successfully, we come back with a message saying it's done. Let's go ahead and try this and see it in action.

Right off the bat, because we had a choice prompt and we specified the constraints (I didn't show you where I specify the constraints, but here they are for the choice prompt), we were even able to automatically generate a list of suggested actions at the bottom that the user can click on. I can do interesting things: I could say "one", and it picked up Seattle, the first option presented. Our escape hatch continues to work, so even in the middle of the dialog I can say "start over." Let's try another scenario: I could say "how about Bellevue", and it picked up Bellevue, so there's a little bit of language-understanding matching going on as well. Let's start over again, say "book a table", and then say "how about the third one"; I'm literally typing the word "third", and it picked up Renton. So there's a bunch of natural-language-based interaction happening in the prompts themselves. (A sketch of these richer prompts follows below.)
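As an illustrative sketch (again against the released v4 SDK, with made-up prompt IDs, choices and constraints rather than the demo's code), here is roughly what swapping in a choice prompt for the city and a validated number prompt for the party size could look like.

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Builder.Dialogs.Choices;

public static class BookTablePrompts
{
    // A number prompt that only accepts a reasonable party size (assumed constraint of 1-10).
    public static NumberPrompt<int> CreatePartySizePrompt() =>
        new NumberPrompt<int>("partySizePrompt", (promptContext, cancellationToken) =>
        {
            var ok = promptContext.Recognized.Succeeded
                     && promptContext.Recognized.Value >= 1
                     && promptContext.Recognized.Value <= 10;
            return Task.FromResult(ok);
        });

    // A choice prompt renders its choices as suggested actions and matches input
    // like "one", "third", or a partial city name.
    public static ChoicePrompt CreateLocationPrompt() => new ChoicePrompt("locationPrompt");

    // Waterfall step that asks for the cafe location using the choice prompt.
    public static Task<DialogTurnResult> AskLocationAsync(
        WaterfallStepContext step, CancellationToken cancellationToken)
    {
        return step.PromptAsync(
            "locationPrompt",
            new PromptOptions
            {
                Prompt = MessageFactory.Text("Which cafe location would you like?"),
                Choices = ChoiceFactory.ToChoices(new List<string> { "Seattle", "Bellevue", "Renton" }),
            },
            cancellationToken);
    }
}
```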

The prompts are able to parse what the user says as it relates to that specific prompt, and to apply constraints; for the choice prompt, that's what you just saw. Now let's look at the date-time prompt, because that's the next thing it asks for. If I say "how about three weeks from now": too bad, the bot only accepts reservations for the next two weeks, and it only does 4 p.m. to 8 p.m. Where is that coming from? In my code I have a Timex resolver that applies a bunch of constraints on the Timex prompt itself. Here I'm saying the user needs to give me a date and time that is either this week or next week, and it has to be in the evening, where the definition of evening is 4 to 8 p.m. That kicked in, and that's why the bot says it doesn't work. So let's try something else: "Saturday at 7 p.m." Notice what happened: the bot says "okay, I have this Saturday, 7 p.m.", instead of next Saturday, the Saturday after, or the previous Saturday. We've also added a little language generation capability for dealing with TIMEX, or date-and-time, expressions, so the bot can come back and confirm "I have this Saturday; if you want to change your mind, say so right now." This is a good practice for a bot: ground and validate what it has understood after every single turn of the conversation. Of course you don't have to repeat everything that happened earlier; it's just immediate validation: I said Tuesday, you got Tuesday; I said 4 p.m., you picked up 4 p.m. Now it's asking how many people are in the party; just as with the other prompts, we're using a number prompt, so I could literally type the word "five" instead of the digit 5 and it picks that up. It picked up five guests at the Renton location for this Saturday at 7 p.m., I say yes, and we're good to go. So we haven't added any language understanding capability beyond what is natively available in the prompt system, but the conversation has already gotten a little more sophisticated.

The last thing I want to show in the demo is LUIS (luis.ai). Until now, to trigger a specific scenario you literally needed a hard-coded message; it was "hi", and only if the user said exactly "hi" would it trigger. But users could say many different things that should trigger that scenario: "hello", "good morning", "good afternoon", and there are numerous variations for booking a table, such as "get me a table" or "can you please book a table for five guests", and so on. LUIS is a fantastic service where you provide a list of example utterances and LUIS builds a machine-learned model that not only does intent classification (given a user utterance, it determines what the user is trying to do) but can also extract meaningful pieces of information, called entities, which in this case are things like the date and time, the location, and so on. So let's take a look at how to hook up language understanding powered by LUIS for this bot.

A while ago I showed you a markdown file for each of our scenarios, the language understanding for greeting, Q&A, "who are you" and "book a table", and we used a tool to generate a LUIS JSON model, so I'm simply rerunning that tool to bring it back for you. We also have another tool called LuisGen: it takes a LUIS JSON model and generates a strongly typed C# or TypeScript class. This is super beneficial because, right off the bat, for every intent and entity your bot has you get a nice enum of all the intents, and every entity is also strongly typed, so your code is going to look a lot cleaner.
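To give a flavor of that hookup, here is a small sketch using the released v4 LuisRecognizer, with placeholder credentials and a hand-written intent switch standing in for the LuisGen-generated class that the demo uses. The intent names match the scenarios above but are otherwise illustrative.

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.AI.Luis;

public class IntentRouter
{
    private readonly LuisRecognizer _luis;

    public IntentRouter()
    {
        // Placeholder values; these come from your LUIS application.
        var app = new LuisApplication(
            "<app-id>", "<endpoint-key>", "https://<region>.api.cognitive.microsoft.com");
        _luis = new LuisRecognizer(app);
    }

    public async Task RouteAsync(ITurnContext turnContext, CancellationToken cancellationToken)
    {
        // Run the user's utterance through LUIS and pick the top-scoring intent.
        var result = await _luis.RecognizeAsync(turnContext, cancellationToken);
        var (intent, _) = result.GetTopScoringIntent();

        switch (intent)
        {
            case "Greeting":
                await turnContext.SendActivityAsync("Hello! How can I help?", cancellationToken: cancellationToken);
                break;
            case "BookTable":
                // Begin the book-a-table waterfall dialog here.
                break;
            case "WhoAreYou":
                // Begin the who-are-you dialog here.
                break;
            default:
                // None intent: fall back to QnA Maker.
                break;
        }
    }
}
```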

And you're not directly dealing with the raw results from LUIS; you get a nice little abstraction there. The third thing, which I'm not going to actually run, is that, as I said, we're bringing the full power of LUIS to the command line as well: using the LUIS CLI's import command I can create a new application, import the JSON model I've created, and, just as we did in the QnA Maker case, pipe that to msbot so it's added as a new service reference for my bot.

I've got the LUIS model set up; back to the cafe bot code. The real change here is two pieces: I created a LuisRecognizer, and this is my new LuisGen-generated, strongly typed C# class. I switched out all of my hard-coded case statements for specific LUIS intents: this is the greeting intent, this is book-a-table, this is who-are-you, and for the None intent we fall back and ask whether QnA Maker has an answer.

Let's try and talk to this bot; I think it's already running, yep, it is. As you can see, when I connect to the bot I now see the LUIS service reference in the emulator as well. So instead of a hard-coded set of phrases, I can say a bunch of variations; let me pull up my notes here. "Can you please book a table": that triggered. Let's start over. "Get me a table": you no longer have to say exactly "book a table." Start over; the rest of the prompts are all unchanged, we're still using the exact same prompts. "What's your name": you no longer have to say exactly "who are you." That worked, and all of this is getting triggered because we added LUIS and it was able to do the intent arbitration very well. Start over, and lastly, "who's your CEO": this is again from QnA Maker, and it continues to work.

Hopefully that gave you a sense for the new tools we've announced at Build, and for how, as you start very simple and then layer in sophistication at each stage, the kind of experience the bot can provide also grows. A few of the things we did in the code: include a welcome message; make sure the user can restart or start over the conversation and handle global flow control commands; do the single-turn conversation with QnA Maker; and then, as you think about multi-turn dialog, start simple, nail the core conversation model, and then layer in AI services to augment that core experience. Two slides, I think; we just have a summary. Yep, we covered all of that: start simple, progressively layer in sophistication, build your bot in parts, and nail the core conversation model before adding the AI services to augment it. Back to you, Elaine. Thank you.

Wasn't that cool? So now, after you build and test the bot, publish it and connect it to the different channels, let's talk about the best practices in the evaluation phase. A conversational experience is unique: it tells you, from the users' own conversations, what they are looking for. There are three best practices I want to talk about. Number one, iterate with the feedback and really get the bot out soon. Don't wait until you have all the functionality; you can release limited functionality, but get it to users, because users will tell you what questions they're asking, how they're asking them, and what tasks they want to get done. No matter how much you anticipate, you won't get all of that right; your users may even speak languages you didn't anticipate. So get it out and start refining it as fast as you can. Also, start simple; the same principle Vishwac talked about applies in the evaluation phase. We have out-of-the-box tools for Azure Bot Service and for the Language Understanding service; at least start using those. If you need a custom solution, which is the second point, you can get there, but you don't have to wait until you have all the sophisticated analytics to start evaluating.

Number two, focus on business KPIs and leverage custom dashboards. Remember, in the planning phase you should already have defined your success metrics; in the evaluation phase you should review them and also look at your instrumentation to see whether the metrics are actually measured the way you expected. Here is where we see a lot of custom dashboards really help: we have logging with Application Insights, and a common pattern we see is integration with Power BI. And now we have the Bot Builder SDK v4, which is extensible and modular, so we expect to see a range of analytics middleware contributed by the community; a minimal sketch of what such middleware could look like follows below.
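For reference, here is a minimal sketch of analytics middleware for the v4 SDK in C#. The class name and the logging destination are made up; a real implementation would forward to Application Insights or another telemetry sink rather than the console.

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Schema;

// Illustrative analytics middleware: logs each incoming user utterance so it can
// later be analyzed in a dashboard.
public class AnalyticsMiddleware : IMiddleware
{
    public async Task OnTurnAsync(ITurnContext turnContext, NextDelegate next,
        CancellationToken cancellationToken = default)
    {
        if (turnContext.Activity.Type == ActivityTypes.Message)
        {
            // Replace this with your own telemetry call; Console is just a stand-in.
            System.Console.WriteLine($"User said: {turnContext.Activity.Text}");
        }

        // Always continue the pipeline so the bot's normal logic runs.
        await next(cancellationToken);
    }
}
```

Middleware of this kind is typically registered on the adapter (for example, something like `adapter.Use(new AnalyticsMiddleware())`), so it runs on every turn before the bot's own logic.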

If you have any analytics middleware you want to contribute, please join that party as well.

Number three, build evaluation into the process and gain business insights from the bot. What I mean is that it's not about how sophisticated your analytics tools are; it's really important that, even with simple tools, you build evaluation into your process. A bot provides a unique opportunity: it doesn't just help you refine the bot, it helps your overall service, because users may ask for products or services you haven't provided yet, and they may point out parts of your business that need improvement. So it goes way beyond the bot, and we're starting to see, as a common pattern, businesses using bot analytics as a way to gain business insights. In the customer case studies online you can look at Dixons Carphone and Telefónica; they both talk about how they gained business insights from their bots.

This is just a screenshot of the LUIS tool for reviewing endpoint utterances. The point is that you can start using it to constantly improve your model; even with very simple steps you can start simple and already make the bot constantly better, and you can also add phrase lists or patterns to improve your LUIS model that way.

Here are a few examples of bot analytics in the custom dashboards we see. If you use LUIS, a very common one is the intent metrics, which show you the common questions people are asking; it's also important to look at the None intent, because that's where you can see where you're not meeting user needs. There's also the view at the top right, a key-phrase view, which we see a lot of customers using. Another common one is sentiment analysis, because it gives you a good indication of user satisfaction; this is a typical case of integrating with Text Analytics, and you get a trend view of where user satisfaction, or sentiment, drops, plus a way to drill down and see what the issue is. The other metrics are more scenario-specific: we talked about identifying the use case and then seeing what business impact the bot can contribute. Take a customer support bot, for example.

A common metric people measure there is the handover rate: what percentage of the common questions can your bot cover, and which complex issues does it hand over to an agent? In that case we also see people look at which issues are most commonly handed over, and then focus on those to see whether the bot can be refined to take care of a little bit more.

Next, I'm going to show you a demo of a car assistant. In the previous demo from Vishwac you saw that Azure Bot Service really allows you to start simple, but I also want to show you that we can support sophisticated solutions as well. In this particular case it's an intelligent personal assistant; it can live in different environments, but here I'm showing an in-car assistant. Let me get it set up.

Okay, this is a simulation of the in-car assistant. The panels on the left and right simulate the different signals you can get from the car; the actual UI won't look like this, and will probably have very limited screen space, but this is a more elaborate UI to show you how the assistant works. If you're driving, it's a hands-free situation, so this is a scenario where voice is a very good modality.

Let's try it. Imagine I'm driving along and I want to save my seat heating preference right now. "Remember my seat heating preferences." "Sure thing, I'll remember your seat heating preferences for days like this. I have gathered these settings for seat heat level zero. I'm going to name it 'seat heating level zero.' Would you like a different name?" "No, that's okay." "Okay, all done." Note that normally "no" means no and "okay" means yes, but the model has been trained to understand that "no, that's okay" means no; this is where language understanding comes in.

Now imagine I'm driving along and suddenly a yellow light comes on, and I want to find out what it is. "What is the yellow light in my dash?" "If the tire pressure monitor lamp comes on in yellow, then the tire pressure is off by about ten percent or more." It also shows a card to display that information. What if I'm actually really frustrated and that emotion comes through in my input, and sentiment analysis detects it as negative? "What the heck is that yellow light in the dash?"

"Sorry to hear that. If the tire pressure monitor lamp comes on in yellow, then the tire pressure is off by about ten percent or more. I can also help you troubleshoot, or connect you to an agent."

So you can see how you can generate a conditional response: based on the results from Text Analytics and sentiment analysis, you decide on a different response, and in this case you may want to hand over to a dealer or an agent.

The other scenario I want to show you is how it can help with a productivity scenario. Let's say I'm driving and I suddenly have an idea I want to capture in a note. "Can you take a note?" "Of course. I am ready to capture your memo now. Press the record button to start recording; once you are done recording, I will save the note." This is a simulation where I click the record button, but imagine that in the car you would have an actual control to trigger it. "Follow up on the Build session." "I got your note; please wait while I save it." It's processing this, and then it will actually add the note to my OneNote. "Done. Your memo can be found in the memos tab in your OneNote." Okay, let's see. You can even see that the text was extracted from the audio; this is how integration with Office 365 can make for a really powerful scenario, and it can all be done by a car assistant built using our Azure Bot Service tools. This is a sophisticated scenario that can be achieved with our comprehensive tools; this example goes all the way to the most complex end of the spectrum.

To summarize the best practices for building bots from this session: in the planning phase, focus on designing an experience, not just functionality; in the build and test phase, build in parts, start simple, and then layer in sophistication; and then gather feedback, iterate, evaluate and refine.

Here are the three breakout sessions about conversational AI that we presented at Build, and this is the list of Channel 9 videos you can watch later to learn more about the tools we have. To close, I'd like to conclude with a customer testimonial from Telefónica:

"We have become data-driven. Our conversational AI lets us put people first. We are using artificial intelligence to change the way we relate to customers, and one of the reasons we began to use Microsoft technologies was because it offered an open framework. It was very easy for my development team to understand the tools, and we can move very fast and very easily into the use cases we want to develop for our customers. Using tools like Azure Bot Service and LUIS we keep getting new insights, and we are applying them to our core business. Microsoft AI was a key piece in our strategy because they have the tools, the platform, the full ecosystem that we needed."

With that, thank you; I would be happy to take questions. We'll also have our team here, so if you want to, come up and we can answer questions here as well.

A question on the dialog you were building in the demo, the table booking workflow: you were going very deliberately step by step through that, but can you maybe show what exists in the tooling today if you're going to get several pieces of information at once? If I said "book a table in Seattle at 2:00 p.m.", I've got a couple of pieces of information there, so you're extracting two or three entities and then additionally prompting for the rest.

Yep. The native prompt system we have can only handle one entity at a time, because it's layered at that level of sophistication. For handling cases where you get multiple pieces of information in one utterance, which is really what you're describing, a good recommendation is to have one intent in LUIS, ideally in a separate LUIS app.

LUIS is really good at capturing multiple entities in any given utterance and can come back with those values; then all you would do in your waterfall dialog step is check whether you got any new values and rehydrate the dialog state from them.

Okay, so are there plans to address that in the dialog model you've got? Yep, it doesn't exist right now and we're working on it, but for now you should still be unblocked. Okay, cool, thanks. Thank you.

I was wondering what you have for existing voice models, so you can talk to the bots and they can talk back, and what the roadmap is for things like Alexa and Google. Okay, sure. For voice, a couple of things. You can already build a bot and publish it to Cortana as a channel, and anything you do there is also supported through speech. We've also made the web chat control, which you can embed within your own website or application, support speech; you can use browser speech, Cognitive Services Speech, or even implement a custom speech recognizer and hook that up for your scenario. The third piece of work we've done: when you build a LUIS model, we take all those utterances and prime the speech recognition system with them, so speech recognition accuracy is improved as well. So again, start simple and layer in sophistication; that path is laid out for you to take that journey. The second part of your question was about Alexa and Google Home integrations: we don't have anything to announce at the moment, but stay tuned to our blogs, it's an area we are actively looking into. Thank you, great question.

Hi. The dashboard part of the presentation: is that one you built, or is it part of LUIS? Will it be generated? Which one do you mean, the sentiment analysis, the performance one? The business one, I think. Yeah, this one. This is just an example of a Power BI dashboard; it's not an out-of-the-box one yet. Okay, it looks very similar to others I've seen. You can build those using the middleware for our SDK v4. Okay, thanks. Thank you.

Hey, I've got a couple of questions. One is, if you have multiple bots, is there a way to switch context and reference another bot, have it run its own flow, and then return to the context you were originally in? Say you're building multiple bots that are each simplified to have expertise in one particular segment, without getting too complex, and you want to reference one and then bring the conversation back to what you were originally talking about. Right, so there are two things there. One is, as you build different groups of scenarios: a good example is a booking agent bot, where one half of the functionality might be to actually book a flight, a different set might be to get a ride to the airport, and a different one might be yet another scenario.

You showed the dashboard part of the presentation. Is that one you built, or is it part of LUIS? Is it generated?

Which one is that?

The sentiment analysis and performance one. I think the question is about the customer and business one. This one?

Yeah, this one. This is just an example of a Power BI dashboard.

Okay, so not an out-of-the-box one yet. It looks very similar to other such dashboards.

You can publish those yourself, actually, through the middleware for our SDK. Okay, thanks. Thank you.

I've got a couple of questions. One is, if you have multiple bots, is there a way to switch context and reference another bot, have it run its own script, and then return back to the context you were originally in? That is, if you're trying to build multiple bots that each stay simple and have expertise in one particular segment, without getting too complex, can you reference one of them and then bring things back into the context of what you were originally talking about?

Right, so there are two things there. One is that as you build out different groups of scenarios, you might have one bot do several things. A good example is a booking-agent bot: one half of the functionality might be to actually book a flight, a different set of functionality might be to get a ride to the airport, and a different one might be to check the status of that flight. What we have right now is actually a multi-layer problem. The first part is being able to dispatch into each of those scenarios the right way. We've made a lot of investment in that space and we have a tool called Dispatch, available in the Bot Builder tools: once you've built a different LUIS model for each scenario, it can ingest all of them and create a meta model that you can use to dispatch into the right scenario. So that's step one; a sketch of that routing follows this exchange.

The second thing you were asking about, I think, is taking a piece of information that was captured by the first scenario and making it available to the second scenario. With the recent changes we've made to the core bot framework itself, you're in control of bot state, so you can keep the dialog, user, and conversation state in your own Azure Storage or Cosmos DB or wherever you choose. We don't have native support for passing that around beyond the basic infrastructure that lets you hook into the backing database, but since you're in control of the state data, you should be able to persist whatever information you want, either across conversations for the same user or within a specific dialog, depending on how you write your code. There's a sketch of that after this exchange as well.

Do you have a sample of that Dispatch? Yes, on GitHub, in the BotBuilder-Samples repo; we've got 27 or so samples in there. Okay. And my last question was, do you also have a sample of using Twilio to do an SMS-based bot? I'm not sure about a sample, but it's already a channel we support. Oh, it is? Great. Yeah, and do check out Dispatch; I would love to hear your feedback. I'd like to do that. Okay, thanks. Thank you.
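A minimal sketch of the Dispatch routing just described, assuming the Node.js Bot Builder v4 SDK. The parent LUIS app is the one generated by the Dispatch CLI from the Bot Builder tools, and the child-intent names (l_FlightBooking, l_AirportRide) plus the handler functions are assumptions modelled on the public samples, not code from the session:

```ts
import { TurnContext } from 'botbuilder';
import { LuisRecognizer } from 'botbuilder-ai';

// The parent ("meta") LUIS app produced by the Dispatch tool after ingesting
// one LUIS app per scenario.
const dispatchRecognizer = new LuisRecognizer({
  applicationId: process.env.DISPATCH_LUIS_APP_ID!,
  endpointKey: process.env.LUIS_KEY!,
  endpoint: 'https://westus.api.cognitive.microsoft.com',
});

// Hypothetical per-scenario handlers, each backed by its own LUIS app and dialogs.
async function handleFlightBooking(context: TurnContext): Promise<void> {
  await context.sendActivity('Routing you to the flight-booking scenario.');
}
async function handleAirportRide(context: TurnContext): Promise<void> {
  await context.sendActivity('Routing you to the airport-ride scenario.');
}

// Step one from the answer above: dispatch each utterance into the right scenario.
export async function routeTurn(context: TurnContext): Promise<void> {
  const result = await dispatchRecognizer.recognize(context);
  switch (LuisRecognizer.topIntent(result)) {
    case 'l_FlightBooking':
      await handleFlightBooking(context);
      break;
    case 'l_AirportRide':
      await handleAirportRide(context);
      break;
    default:
      await context.sendActivity("Sorry, I didn't understand that.");
  }
}
```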

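And a sketch of the state side of that answer, again assuming the Node.js Bot Builder v4 SDK: the framework gives you the storage hook and you decide what to persist and where. MemoryStorage here is for local testing only, the commented-out CosmosDbStorage lines (from botbuilder-azure) are one production option, and the property and field names are assumptions:

```ts
import { ConversationState, MemoryStorage, TurnContext, UserState } from 'botbuilder';
// import { CosmosDbStorage } from 'botbuilder-azure';

// You own the backing store: in-memory for tests, Cosmos DB / Azure Storage in production.
const storage = new MemoryStorage();
// const storage = new CosmosDbStorage({
//   serviceEndpoint: '<COSMOS-ENDPOINT>', authKey: '<COSMOS-KEY>',
//   databaseId: 'bot', collectionId: 'state',
// });

const userState = new UserState(storage);                 // survives across conversations per user
const conversationState = new ConversationState(storage); // scoped to a single conversation

const profileAccessor = userState.createProperty<{ homeAirport?: string }>('userProfile');

export async function onTurn(context: TurnContext): Promise<void> {
  const profile = await profileAccessor.get(context, {});

  // A value captured by one scenario (say, flight booking)...
  if (!profile.homeAirport) {
    profile.homeAirport = 'SEA';
  }
  // ...is available later to another scenario (say, the ride to the airport).
  await context.sendActivity(`Home airport on file: ${profile.homeAirport}`);

  // Nothing is written to the store until you save.
  await userState.saveChanges(context);
  await conversationState.saveChanges(context);
}
```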
Great session, thank you. Two questions. One was, when you were showing the code there, you created the LUIS model and replaced the explicit intents with the LUIS intents, and then you also quickly added the QnA service as well, the QnA Maker.

Actually, adding QnA Maker was the first thing; it came much earlier.

If you could just bring that code up again.

We've got about ten more minutes, so that should be fine. So this is where we added the QnA Maker model, which was the third demo I showed, because that helps you do simple, single-turn conversations. And this code that we added is the exact same thing I had later on: at the end, when I added LUIS, this part of the code stayed the same. This stuff didn't change at all; it was already there, and the only thing I did was swap in the LUIS intents.

Conceptually that's interesting, so you're really bringing the QnA Maker and the LUIS model together.

Right. There are multiple different ways you can do that. I think the meta question you may be asking is: should I call LUIS first and then call QnA Maker as a fallback? That isn't the right approach, because depending on how you set up your intents and example utterances inside LUIS, you might get false positives from LUIS, where questions that should have been answered by QnA Maker end up landing there instead. If you run into that, the Dispatch tool I was just talking about can also take a look at all your QnA models, analyze what's in your QnA and what's in your LUIS, create a meta model, and dispatch to the right place. So we've thought about that exact problem; it's just not the common scenario, which is why I didn't show it here. If your more sophisticated scenario needs it, we have the tool for you to get that set up. There's a sketch of that arrangement below as well.

We have time for a couple more questions.

In the multi-turn scenario that you showed, the developers are explicitly managing the state. What about managing the state for us, supporting multi-turn natively, so that you do the inference and manage the state rather than us having to manage it?

Depending again on your scenario, if that's something that's required or interesting, we've announced Project Conversation Learner; I don't know if you've heard of it. It's where you basically provide a bunch of example dialogs and the system learns the conversation flow from them.

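Finally, a minimal sketch of the QnA Maker plus LUIS arrangement discussed in that exchange, assuming the Node.js Bot Builder v4 SDK. The resource names, the single "BookTable" intent, and the call order are illustrative assumptions; and as noted above, if LUIS false positives start absorbing questions that QnA Maker should answer, the Dispatch tool can arbitrate between the two instead:

```ts
import { TurnContext } from 'botbuilder';
import { LuisRecognizer, QnAMaker } from 'botbuilder-ai';

// Added early in the demo for simple, single-turn question answering;
// this part of the code stays the same once LUIS is introduced.
const qna = new QnAMaker({
  knowledgeBaseId: process.env.QNA_KB_ID!,
  endpointKey: process.env.QNA_KEY!,
  host: process.env.QNA_HOST!,
});

const luis = new LuisRecognizer({
  applicationId: process.env.LUIS_APP_ID!,
  endpointKey: process.env.LUIS_KEY!,
  endpoint: 'https://westus.api.cognitive.microsoft.com',
});

export async function onTurn(context: TurnContext): Promise<void> {
  // Single-turn FAQ check (newer SDKs expose getAnswers; earlier previews used generateAnswer).
  const answers = await qna.getAnswers(context);
  if (answers.length > 0) {
    await context.sendActivity(answers[0].answer);
    return;
  }

  // Previously this was explicit keyword/intent matching in code;
  // the only change later was swapping it out for LUIS intents.
  const result = await luis.recognize(context);
  switch (LuisRecognizer.topIntent(result)) {
    case 'BookTable':
      await context.sendActivity('Sure, let me get that table booked.');
      break;
    default:
      await context.sendActivity("Sorry, I didn't get that.");
  }
}
```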