Modernize Learning to Connect Competencies, Business KPIs, and Coaching


Thank you everyone who's dialed in. Good morning to those of you joining us from North America, and good afternoon to anyone joining us from across the Atlantic. We're here to discuss how to modernize learning to connect competencies to business KPIs and coaching. My name is Tim Dickinson from Watershed, and before we begin I have just a few notes. We've

reserved a few minutes at the end for Q&A, so if anyone has questions during the session, please feel free to enter them into the questions panel on the right-hand side of your screen. Any questions we don't get to today, we plan to follow up on with written responses as much as possible. We're also recording today's session, and we'll be providing access to that recording in about a week's time. With all of those logistics

covered, I'm thrilled to introduce our partner from Applied Industrial Technologies, where he's the director of organizational learning: Andy Webb. Take it away, Andy.

Tim, thanks so much. I'm excited to talk to you today about modernizing your digital learning approach. The presentation I'm going to share with you was first created last fall for a special SAP event, and it's evolved a little bit since then. But these are a lot of the things that are on my mind as a practitioner, someone who's in learning and training, about how we need to adapt. So when you

look at the word modernizing, I'm not sure what you think of, but it may conjure up different ideas about updating, right? Your learning experience, your engagement, perhaps your analytics, or even artificial intelligence. These are the kinds of updates we're talking about, and they're a lot more than a fresh coat of paint and new carpet, right? It goes beyond the movement where everyone just converted their PowerPoint into Articulate and called it rapid eLearning. A lot of these things go much deeper. So we're going to explore modern learning approaches

and take a deep look at how you design, the types of tools that can help, and then also measurement. And this stuff is not just theory. The team that I'm on at Applied is using a lot of these things, still struggling through some of it and trying to figure out the best ways to make them work. But

I am going to share with you two examples where we tie together some of these concepts: we'll look at a learning experience platform tied to an LRS and see learning analytics in action. When you step back and think about this whole concept of modernizing, a keyword that keeps coming to my mind is engagement. I think that's a theme we're going to see reflected in our design, our development, and our measurement of all the things we're trying to do from a training perspective. Just a quick background on myself: I work with Applied Industrial Technologies, a large industrial distribution company. We're about $3.5 billion annually in revenue, and we have about 7,000 employees. For the most part

we're tied to the manufacturing industry, so a lot of it's not real exciting, but currently we're doing some things like robotics, automation, and hydraulics. So let's go back to engagement. Many times when I teach in a sales or an operations classroom, in the first 30 to 40 minutes, when I'm thinking about engagement, do you know what that sounds like? (Apparently my speaker's not on, but those were crickets in the background.) It's really hard sometimes to initiate engagement in a classroom. But once you do, it totally changes

the experience, and many times the outcome, and I think as we look at our digital world, our modern learning efforts should take the same cue about engagement. We need to look at a lot of different parts and figure out how we engage our audiences, our stakeholders, and the many different folks involved in the process. So let's start with the most difficult topic: how to engage senior leadership on the training investment. I want to look for a little bit at

impact. Impact is perhaps the most important result that we get, and it sounds very subtle, but I think there's a big change going on inside the learning world. It's a fundamental change we need to make, from, basically, our learning ROI to whether we're helping drive business impact. Instead of “Here's how effective our learning is and how much money we've saved the company,” it's now more about whether our learning impacted the effectiveness of the business, and it should be measured in the KPIs the business already uses. So at Applied, if I were to calculate, let's say, $250,000 in learning ROI, capture a snapshot, and show it to senior management, that might get some attention, but it would be just that: a snapshot. It would be something we achieved, and that might be good. But to really involve and engage senior leadership, and leverage things like analytics, we need to cater what we're doing to their process. So I hope you see a little bit of the difference between just coming

up with a one-time ROI versus building your analytics, and the things you're doing, around what's happening in the business on an ongoing basis. A lot of the changes that need to occur really start with our design. In this case, we're going to look a little bit at how to leverage xAPI data. The way I sum that up is in a quote: learning effectiveness is not a training statistic at the end of the road, but engaging learning throughout the workflow to impact ongoing business KPIs. So I hope you catch that subtle change and are able to implement it.
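To make the xAPI piece a bit more concrete, here is a minimal sketch of what a single xAPI statement might look like for one of these learning activities. The email, verb, activity IDs, and score are illustrative placeholders, not Applied's actual data; a real implementation would send this JSON to the LRS's statements endpoint.

```python
import json

def make_statement(email, verb_id, verb_name, activity_id, activity_name, score=None):
    """Build a minimal xAPI statement dict (illustrative IDs, not Applied's actual data)."""
    stmt = {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }
    if score is not None:
        # A scaled score (0.0-1.0) lets the LRS compare results across modules.
        stmt["result"] = {"score": {"scaled": score}}
    return stmt

stmt = make_statement(
    "donna@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.com/courses/financial-acumen/pos-pricing", "Point-of-Sale Pricing",
    score=0.72,
)
print(json.dumps(stmt, indent=2))
```

Because each statement carries both the learner and the activity, the LRS can later line these records up against business KPIs for the same person.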

So we've talked a little bit about some things here; now, what does that actually look like? This was one of our first efforts, and some of you who have heard things I've done before may have seen parts of this, but I wanted to share a little about how we came to do this and what it looks like. We'll actually look at our LRS, and I'll do that by way of showing you a movie. First, if you notice, up top there are three columns. As we started this effort, we really looked at how to define competencies that matter in this specific business instance. In this case, we were trying to train our service center managers on some financial acumen points and really focus on operational metrics, because we needed those financial metrics.

I went around and talked with a lot of our vice presidents, and we narrowed down to a single KPI for each one of these competencies, an approach that we could use. Then we have an analyst on our team, and she was able to take data going back 24 months and come up with some of these benchmarks that you see, benchmarks that are reasonable for all of our locations to achieve. So as participants went through the digital part of the learning, we were able to look at the benchmarks, look at how each person was doing on their learning scores, and then look at their actual performance marks on the same metrics. That's how we came up with this specific chart. I'm going to show you a little video that will walk you through how we went about doing this type of thing, so let me go ahead and queue that up.

Using the organizational hierarchy, a single dashboard was created and segmented by scope of responsibility. Each leader could view team results to clarify expectations. Benchmarks were created from deep data as realistic field targets and are compared to the learning competencies.
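The transcript doesn't spell out exactly how the analyst derived those benchmarks from 24 months of data, but the general idea can be sketched roughly like this. The metric name, locations, and numbers are made up, and a pooled median is just one reasonable way to set a "realistic field target" that most locations can achieve.

```python
from statistics import median

# Hypothetical monthly KPI history: {metric: {location: [monthly values]}}.
# In practice this would cover 24 months per location.
history = {
    "gross_margin_pct": {
        "cleveland": [27.1, 26.8, 27.5, 28.0],
        "columbus":  [25.9, 26.2, 26.7, 26.4],
        "toledo":    [28.3, 27.9, 28.1, 28.6],
    },
}

def benchmark(metric_history):
    """Pool every location-month observation and take the median as the field target."""
    observations = [v for monthly in metric_history.values() for v in monthly]
    return round(median(observations), 1)

targets = {metric: benchmark(locations) for metric, locations in history.items()}
print(targets)
```

A median (rather than a mean or a top-performer value) keeps the target realistic for all locations, which matches the spirit of what the analyst was after.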

Millions of data points were simplified into five dynamic charts, including a comparison of monthly KPIs for the four-month trend. Qualitative analytics were used to consider coaching topics and training for individual associates. Let's take a look at point-of-sale pricing. If we click on this scattergram, you'll notice a 2x2 where we're comparing training scores versus actual performance on the job. If we look below the median, we might find somebody like Donna. She was close to the median with her training but has a ways to go on performance. Let's see if we can get a better understanding of how we might be able to help her. We'll go back to the dashboard.
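The quadrant logic behind that scattergram, splitting people on the medians of training score and on-the-job performance, can be sketched as follows. The names and scores here are invented for illustration, not Applied's actual data.

```python
from statistics import median

# Illustrative (learning score, performance score) pairs per associate.
people = {
    "Donna": (0.63, 0.31),
    "Raj":   (0.81, 0.77),
    "Mei":   (0.44, 0.62),
    "Luis":  (0.73, 0.40),
}

learn_med = median(score for score, _ in people.values())
perf_med = median(perf for _, perf in people.values())

def quadrant(learning, performance):
    """Place an associate in the 2x2: training results versus on-the-job results."""
    side = "high-learning" if learning >= learn_med else "low-learning"
    level = "high-performance" if performance >= perf_med else "low-performance"
    return f"{side}/{level}"

flags = {name: quadrant(l, p) for name, (l, p) in people.items()}
print(flags)
```

Anyone landing in the low-learning or low-performance half becomes a candidate for a targeted coaching conversation, which is exactly how Donna surfaced in the video.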

From here, we'll take a look at her competency map. First, we'll compare Donna to her peers on learning: other than pricing, she outperformed the average of her peers. Now let's take a look at her KPIs according to the benchmarks on this graphic. We'll compare the company benchmarks, which are in blue, to her learning scores in green and the actual KPIs of her service center, which are in purple. As you can see, this raises a lot of questions. She's outperforming the benchmarks on many metrics.

Financial acumen is net profit, and there's still a little bit of room for improvement there. Obviously, by tackling pricing and perhaps raising the bar on inventory, she should be able to increase her net profit. Inventory seems like a logical place to start: she already has great comprehension there, and you'd think she'd be able to raise her performance beyond the benchmark. Pricing looks like it'll be a little bit more of a challenge. Her learning score was below

the benchmark, and her metric is even below that. By having this kind of insight before a coaching session, her manager will have a better idea of how to help her and increase overall performance. So if someone isn't performing at benchmark, we're using a data-driven approach to suggest what areas need focus and to prevent long-term gaps from developing.

Alright, so moving on. As we looked at developing this effort, again, we took our competencies, we looked at the KPIs associated with them, and we developed a lot of the training around the objectives we had for this particular effort. So I want to step back and look at what we can do in

our learning design. If you think about modernizing, hopefully you're not looking at the 70:20:10 model, which, as someone pointed out recently, is a 20-year-old model. It has its merits, and there are some good things I think we've learned from it. But in some ways I feel like it's become the PowerPoint of learning: in many ways it creates a kind of fixed curriculum design approach, and we use some of the same parameters, or some of the same measurements, that we've used for a long time.

And by that I mean an LMS scoring-and-completion mentality. I think what we need to do is think a little more widely about who our audience is and how we actually develop content for these folks. What we should be looking at is really more of what I think of as frameworks: building ways for our folks to learn and perhaps develop some of their own content organically. I also think we need new measures. We can't be stuck in the

same measures that we've used previously. What I'd like to do is walk you through how we should go about thinking about things, or at least how I'm thinking about things moving forward: looking at the perspective of the learner, the people in our audience; the learning organization, the learning development team that may be inside your company; and also the business, the business leaders who are your sponsors behind a lot of the initiatives. And then I also want to talk about some of the new measures we need to look at, measures our team is trying to define and work through. One of them is engagement: how do you look at how well engaged

people are on a particular learning effort? There are things like badging, and we've done some different types of training around coaching; how can you measure coaching, as well as competency, sort of like we shared a moment ago? Many of you may know Josh Bersin, an HR specialist who does a lot in the learning sector. These are a couple of measures that I think we need to

think about as we look at the learners inside our organizations, those who are using and consuming some of the training we're providing, and about how to think a bit differently than just the 70:20:10 model. First, 68% of these folks prefer to learn at work. What's exciting about that to me is that the other 32% do not prefer to learn at work, meaning they're on the go, in a mobile situation, or at home doing things on their own time. As we think about that, it has design ramifications around how we need to think about mobile, about the learner, and about things we should be doing to impact our design. 58% prefer to learn at their own pace, and this is where things like microlearning and social learning come into play.

49% of them prefer to learn at the point of need, and this is really important for things that have a search engine: they need to be able to instantly find pieces of content that will help them. As we think about modernizing, these are some concepts that float to the front for the learner. Now, also for the learner, this is a look at JAM, which is the interface we use. I want to point out a couple of things that we have in development. First, it has notifications, so our

users can decide which groups they want to join, what content they want to see, and what they want to get alerted about. Second, I have here something we're working on where we'll be able to use user-specific stats. As we look at connecting some of the things that are in Watershed, we want to be able to show those and alert the user to things they are impacting, things happening in the business that we're working to change through learning and training. Third is recommended content. Right now, we're using a rather simple approach; down

the road, that could be AI, triggered by things that might be appropriate for that specific user. And then at the bottom is content exploration: in our situation, users have the opportunity to join things and look at groups that might appeal to them or that they're working with on an ongoing basis. We also sell a lot of different products, and we're trying to let some of our sellers understand different parts of the business by using different groups inside our JAM interface.

Next, I want to look at the learning organization. This is the organizational learning team inside your company that's doing a lot of the development. I think we need to think about new ways of designing training content that really bring about that modernization, as Josh puts it, in the flow of work: being able to provide ways for people to access the learning as they need it. Again, we've talked about coaching and some of these other concepts. Recently we've started doing a podcast application, and that's a great opportunity to get a little bit deeper and provide things for people who are traveling, particularly in cars to different sales calls, or on longer business trips where they may be able to download information before they hop on a plane. Simulations are another area where we've done some

work and are trying to figure out ways to measure it. I want to look now at three different types of tools or functions that you may want to think about leveraging as a learning organization. The first thing we did was really look at building our data: trying to put together a system of record that shared not only our learning data from our LMS, which is still an important part, but also some of our business stats. So for looking at different groups of people or different efforts, we have that inside an LRS. We've kind of covered parts of the social learning, but this is a space where people have the ability to collaborate; there are some great stories of people who have written in and asked for help, and others jump in and are able to provide something about a part, or some tips on what they need to understand in order to connect with a customer. The last part is an LXP. You may have heard of Degreed or LinkedIn Learning; we haven't yet developed our LXP, but we are putting up a learning experience platform. And really, I think one of the key elements of what's important in an LXP

is that sort of Netflix approach: showing you the different types of training that are available and giving you different options and choices for bringing content up and connecting with it. So these are some of the things the learning organization should be considering. And I think the challenge in all of this is how you put it under one roof, right? How can you

have a single point of learning where all this functionality is connected together and is easy, so that the user doesn't have to go out and log into three or four different websites whenever they want to learn something? That's something we're doing through JAM: across all the things we provide, a simple way to connect with a single access point.

Next, I want to look at things from the executive level. I had the opportunity to have a conversation with Nate from SAP; he's their global VP of learning strategy. I asked him: you've probably talked with a lot of people in learning; from what you see, what are the biggest gaps in learning strategy that are common among a lot of folks out there? The first item he named, without really skipping a beat, was analytics. Learning organizations really need to concentrate on their analytics and on being able to show business impact. I'll show you something in a minute, but we're starting by using some of

the different types of statistics we have coming off of our business intelligence, things that come out in monthly reports people are used to. In this case, I've brought up some charts that show the different types of products being sold by different sellers. If we were to do some coaching around that, we might look at how it impacts their sales on one of those particular items. These vertical lines are coaching points where we've been able to connect with the sellers. Again, this is the concept for what we're doing moving forward. The other big issue Nate told me about is mobile, and there's a lot we could say about mobile. But in our case, we're using JAM as our mobile point of entry and trying

to communicate a lot of different training and opportunities for engagement in that environment. One thing, though, that I think is important, particularly if you're leading a learning team, is to develop a strategy. What I've done is look at the different parts of training that I need to develop for senior leaders, and the biggest one is probably talent development. In our organization I report through operational excellence, but obviously our HR group and many of our different business leaders are interested in developing their teams.

And under that particular business need, if you think of this as sort of a marketing strategy for associate development, these are different tactics that support the strategy we're focusing on, using some of the different technologies available to us to ultimately come up with the measurements we'll be able to show. I've left a lot of this blank because I can't share my entire strategy, but I think putting something like this on paper is a good idea when you're trying to think about how you're going to modernize your approach and how you'll leverage these types of technologies. And then, most importantly: what is the measurement that you want those business leaders or business groups to be able to see?

Additionally, I've put together a technology roadmap, and I think this is an important thing over a period of multiple years: to show what you're working on and how you'll actually start to use some of these things, so that there's an expectation within the organization of what's coming. Now, what I'd like to do is show you another effort where we've combined our LXP and LRS; this is a coaching training opportunity that we had. Our efforts are really around sales coaching, and in this case, before we started looking at the sales components, we wanted to make sure our managers really understood what coaching is. So we built a hybrid blended course on our JAM platform where we defined what coaching was and came up with a coaching model. Then we used a series of webinars with an instructor to work on real

projects, things they were trying to do from a sales perspective, and we're starting to measure results. I'm going to show you a little bit of what that would look like and give you a view of what we're trying to do. As I mentioned, we've gotten a lot of this done, but there are some analytics I want to share with you that are really more of a concept at this point; we can answer questions on that a bit later. I've pulled together some of the things our analyst has built to tell a story of what this could look like. As we move forward, for sales managers we're dynamically visualizing product and sales

KPIs into discrete portions of the sales process they manage. We're trying to identify performance issues to help managers make better decisions about seller training and coaching opportunities. Underneath the hood, we've piped in SAP sales data from Business Objects on the business issue at hand: increasing active customer activity. For this example, we're going to select Mark; he's an account manager who has performed well. First we'll look at the customers-sold-to report: a histogram appears showing the number of active customers across all account managers within the month. If we click on Mark, we see that he has 60 active customers, 2 inactive customers he has reactivated, and 3 new customers. Next we'll look at the industries in which Mark is selling.
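The customer counts in that report could, in principle, be derived from raw order rows along these lines. The order data and the definitions of active, new, and reactivated here (bought this month; first-ever purchase this month; bought again after a gap of six months or more) are assumptions for illustration, not SAP's actual report logic.

```python
from collections import defaultdict
from datetime import date

# Hypothetical order rows: (account_manager, customer, order_date).
orders = [
    ("Mark", "Acme Fab",   date(2019, 3, 4)),
    ("Mark", "Acme Fab",   date(2019, 9, 12)),  # bought again after a 6-month gap
    ("Mark", "Delta Mfg",  date(2019, 9, 20)),  # first-ever purchase this month
    ("Mark", "Ohio Steel", date(2019, 8, 2)),
    ("Mark", "Ohio Steel", date(2019, 9, 9)),   # steadily active
]

def classify(orders, manager, year, month):
    """Count active / new / reactivated customers for one manager in one month."""
    by_customer = defaultdict(list)
    for mgr, cust, d in orders:
        if mgr == manager:
            by_customer[cust].append(d)
    counts = {"active": 0, "new": 0, "reactivated": 0}
    for dates in by_customer.values():
        this_month = [d for d in dates if (d.year, d.month) == (year, month)]
        if not this_month:
            continue
        counts["active"] += 1
        earlier = [d for d in dates if d < min(this_month)]
        if not earlier:
            counts["new"] += 1
        elif (min(this_month) - max(earlier)).days >= 180:
            counts["reactivated"] += 1
    return counts

print(classify(orders, "Mark", 2019, 9))
```

Aggregating the same counts across every account manager would produce the histogram the video describes.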

We can sort by industry and then look at results in descending order. For the fiscal year, Mark has sold about $127,000 to service industry customers. This represents 19% of his overall sales. Mark's portfolio is fairly diversified for the industries in his territory, but there

are several possibilities in construction and transportation that should be considered. Last, we'll compare the number of local accounts to which Mark has sold this year. His count is down 12, from 55 to 43, which confirms that increasing his local sales could really help. By checking these three reports, a manager can quickly gain insights to decide how to coach a seller. Let's look at JAM, our learning experience platform. Sellers and managers can explore media and documents that illustrate best practices. We'll select the CM 100, which defines active customers. This wiki page clearly defines expectations, along with tips from peers and leaders and related training. Coaching tasks related to active accounts can be assigned from within the tool. Once I've had

a coaching session, I'll log the activity. As a sales manager, it would be helpful to understand the amount of coaching in relation to movement on the metrics for active accounts. We can see how much coaching is occurring and view the results against the backdrop of a monthly active-accounts trend line. Over time, we may see a correlation between the amount of coaching and the impact on sales. For now, we're trying to assure leaders that coaching is occurring and

that there's a level of accountability with tasks and related training. Watershed and JAM are helping us deploy and visualize coaching and learning activities alongside performance metrics.

So what I'd like to do is follow up on the video there. We're not trying to make

a direct correlation yet, a claim that the coaching has a direct impact on sales; I think that's going to take us some time. But what I'm trying to do is put in front of the business: here's how you're trending on these particular topics, and we provided training for these topics because of things going on inside the business. Hopefully, with coaching, down the road we'll see an improvement on these things. But by using analytics and putting them in front of the

audience, our hope is to improve those metrics for these business leaders moving forward.

I want to share one more thing with you before we open it up for questions. This is a learning strategy model, something I was thinking about in terms of how we're working toward developing all this. Again, a lot of this is a work in progress; I don't think anybody has really completed this model. It's something you work through, and hopefully it has some value. As I've mentioned, our base layer is really data, right? Learning and

business data. The way to read this is to look over on the left-hand side: it starts with real basic information like user names and credentials. Then you might move into completion and scoring, and perhaps learning reporting. Eventually, you might be able to capture a learning experience and do some things around impact on the business. Then, hopefully, you can add in some of your KPIs, things the business uses, and import those into an LRS or leverage that xAPI data. And then, ultimately, at some point, have complete automation of some of these concepts.

So this is something we've talked about and are building toward. We're not there yet, but this is the base of what we're doing. On top of that base you have learning management. Most of you are probably familiar with these concepts: we're deploying different types of courses and content, where there are assignments, transcripts, and the ability to look at competencies.

The next layer looks at learning experience. Here you're using a lot of the different types of learning you might be providing, and hopefully capturing some of the ways you can record how people are progressing through those experiences. And the top is really the business impact; this is what you share with, obviously, sales leaders inside the business.

And you're looking at things like business impact, talent development, and skills development. From an executive standpoint, they only see above this line; they're not aware of all the things you're doing below it. But I think what's important is that, on that base layer, as you develop the different things in your progression of content, you continue to build out and strive toward a bigger base. Because I think the bigger your base is, the larger a chance you have to show that business impact moving forward. So that concludes what I wanted to share with you. So what I'll do now

is open it up to a couple of questions; we can go through the questions here, and hopefully I can be of help, and this might have provided some inspiration for some of the things you're doing in your learning organizations.

Thank you so much, Andy; this has been really helpful, and we all appreciate it very much. If anyone does have questions, feel free to continue entering them into the questions panel; we've got a few good ones that we can start ourselves off with. So just to begin, Andy: when you were showing the JAM homepage, there was a specific question about recommendations number 3 and 4 on that page, and whether those were manually identified or automatically identified. I guess if you don't want to go super specific on those exact recommendations, maybe just in general: could you elaborate on where some of those recommendations are coming from?

Yeah, right there. Perfect.

So right now, inside of JAM, there's an automated way to build that recommended content. I'm not sure if that's true AI; I think it's really just a matter of hits. What happens is, basically, the content in that particular group that gets hit the most is automatically added to the top of the recommended content page.

Perfect. That's helpful. And then

we've got a couple more good questions. One is around some of the competencies you described in the first video you showed. There was some interest in whether those were defined by you and your team or by the business, and in what that process looked like of coming to a common agreement on these competencies and how you were measuring them.

OK, yeah, good question. I could talk for quite a while about how this process evolved; it occurred over several months. Initially, when we went through an SAP deployment, we wanted to make sure that our business leaders really understood finance and financial acumen, so some of these were observations at the time. I worked for the VP of operational excellence, and the person in charge at that time saw a lot of these things. So we identified a number of them,

but then we went and met with vice presidents on each one of these topics. So, for example, for pricing and inventory, we went to the person in charge of finance, spoke with that individual, and asked, "What is your top KPI?" I told a story one time before about when we were looking at doing some things around the analytics: I went online, took some screen captures, and put some of those stats into Photoshop to show them, "Here's what your story can look like." Then we talked with those individuals and worked through the process of ironing out which KPIs were really elevated to the top. I tried to push them to one, and in most cases we were able to get it down to a single KPI for each particular metric. Then, just to finish the story: for the objective or the training we needed to build around it, and the assessment

questions we needed to come up with, in most cases they gave us the name of a specialist inside their department whom we worked with to develop that content. So that's how that process worked.

Thanks, Andy; I think that helps clarify it. We've got another really good question around actually getting some of the data from SAP into Watershed. Could you describe the process of getting the SAP data integrated into Watershed, along with some of the other tools you're using?

Yeah, I have an old diagram that will help. It's not completely accurate at the moment, but I think it'll illustrate the point. We have our LMS, which is SuccessFactors, and we have raw data coming in from SAP. Our Business Objects area is really the point at which our monthly reports are given to the business. This is really the key part where

we're tying into the metrics. Eventually we wanted to have all of that automated. You know, we had been doing some of that for a while, but in the stuff that I showed you, our analyst actually puts it into R in smaller groups and has figured out how to put some of that information into Watershed. So there are a few hops here, if you will; it's not all completely automated. But that's really the goal: to get it to a point where it's completely automated. For now, with some of the efforts that we have, it's going from Business Objects, and then from R things are transformed into what Watershed can import and use. Moving forward, I should point out that at each stage of the process we're really trying to clean up the data, and we're not just dumping everything into Watershed and hoping to sort it out later. So we very much have discrete sets of data. We're

not using all of Business Objects; we're only using the portions that we're focused on for that particular training initiative. And then, you know, those KPIs that are most important, you know, for, let's say, a pilot group that we're looking at, those would be used and filtered inside of R, and then that particular data would go into Watershed and be used for reports.
Thanks, Andy. That helps. Kind of along the lines of some of the discrete data points that you've pulled and some of the tools that you and your analysts have used: if you had to just, you know,

list a few of the important skills, both for you and members of your team, that have really helped to integrate some of these different platforms, what do you think those skills would look like?
That's another good question. So I think, yeah, outside of a standard development team, if you will, with instructional designers and those kinds of things (that's pretty much a given), there are three skill sets that are needed to sort of pull this together. One is somebody who understands the business and what some of the business objectives and goals are, and who can talk with business leaders and get into meetings where you can develop the kinds of questions and things that get to those KPIs. That's kind of the role that I have in overseeing some of the efforts. But I think another really important role is getting those data feeds hooked up, understanding how data works, and being able to manage the flow of things. That's a really tough part. In some cases we might get help from IT. In other cases we might be working, you know, with an outside vendor, or, you know, having somebody on the team who can really manage the data and that flow. And then the third skill set that's really important is data analyst work. So as you look at

this particular chart, you know, our data analyst was the one who did all the benchmarking. And, you know, this is an interesting graphic, because one time I asked her how many data points she thought it represented, and the answer was 81,000. That's really because we're looking at 24 months of data across all these different data points and different locations. You need somebody who can really understand and tell those stories and find the things that aren't obvious right away, because it gets a lot deeper than just completion status, and having somebody with a data background, I think, really helps with that. And, you know, a lot of times you're tempted to go with something that seems

obvious. You know, I really appreciate her input and her ability to tell me: well, that's not truly correlation yet; you know, there are more things that we have to prove, and we need to look at it over a longer period of time. So I think it's really important to have that kind of reference point, and to have somebody in the group who understands data from a pure data science standpoint, you know, as part of the team. So those are the three skill sets that I think are important.
Perfect, that is really helpful. Thanks, Andy. And with that, we've actually reached the end of the time that we had scheduled for this particular session. So thank you, everyone, so

much for your participation and for asking great questions. If you do have any questions that we didn't get to, please feel free to continue putting them into the questions panel before you log off, because we are going to follow up as best we can with any that we didn't get to today, answer them in written form, and provide some of those with the recording. As a reminder, there will be a recording of this webinar available as well. So thank you so much,

Andy. We really appreciate your time and your sharing all of this information with us, these stories and these insights. And so, with that, everyone have a great weekend. Thank you so much.
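As a footnote to Andy's description of the pipeline (SAP, to Business Objects, to the analyst's transforms, to Watershed): since Watershed ingests xAPI statements, the last hop amounts to mapping each cleaned-up report row into an xAPI statement. Here is a minimal Python sketch of that mapping, purely as an illustration; the field names, verb IRI, and activity IDs are hypothetical assumptions, not Applied's actual schema.

```python
# Illustrative sketch only: map one cleaned-up KPI report row (as it
# might look after the Business Objects export and cleanup step) into
# an xAPI statement dict that an LRS such as Watershed can import.
# All field names and IRIs below are hypothetical.

def row_to_statement(row):
    """Map one KPI report row to an xAPI statement dict."""
    return {
        "actor": {
            "mbox": f"mailto:{row['email']}",
            "name": row["employee"],
        },
        "verb": {
            # Hypothetical verb IRI for "a KPI value was recorded".
            "id": "https://example.com/verbs/measured",
            "display": {"en-US": "measured"},
        },
        "object": {
            "id": f"https://example.com/kpi/{row['kpi_id']}",
            "definition": {"name": {"en-US": row["kpi_name"]}},
        },
        "result": {"score": {"raw": row["value"]}},
        "timestamp": row["period_end"],  # ISO 8601 date string
    }

# One hypothetical row from the monthly export:
row = {
    "employee": "Pat Example",
    "email": "pat@example.com",
    "kpi_id": "inventory-turns",
    "kpi_name": "Inventory Turns",
    "value": 6.2,
    "period_end": "2023-03-31",
}

statement = row_to_statement(row)
print(statement["verb"]["display"]["en-US"],
      statement["result"]["score"]["raw"])
```

In a pipeline like the one described, statements built this way would then be sent to the LRS's standard xAPI statements endpoint with the account's credentials; the exact loading mechanism (and how much of it is automated) is the part Andy notes they were still working toward.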

2023-04-03 20:45
