Automate It. Episode 13 - Power Automate Desktop Monitoring Dashboard
What's up, everybody? Welcome back. This is Automate It, episode number... oh, I gotta go two hands... thirteen! I am Jon Levesque, I'm your host, and as always I am joined by the awesome crew of Kent, Pranav, and Apostolis. Welcome, guys, how you doing? Great, Jon, how about you? Doing good, doing good. Kent, how are you up in cold Canada? Not too bad, not too bad. I see you got a fancy new studio behind you, that looks pretty cool. Yeah, I got an upgrade in digs; we're no longer in the office, I'm now in the studio. And Pranav, how you doing, buddy? I'm doing well in rainy Seattle. I know, I wish we would get some snow or some sun or something else; I'm tired of gray and wet. I can sell you some snow, that's not a problem. A cooler of snow is not enough, Kent, I need it to come from the sky, you know? I appreciate the thought, though. I wish the borders were open, I'd come hang out up there. And real quick, let's say a big happy new year; this is our first episode back after the new year, so happy new year to everybody, glad to have you back with us.

Okay, so on to the business at hand. Today Apostolis is going to show us some great stuff, because for so many of you the out-of-the-box analytics you get in Flow and Power Automate are just not enough. We hear it all the time: how do I go deeper, how do I find out more? So Apostolis today is going to walk us through how we can do just that: how we can dig into the Common Data Service, now Microsoft Dataverse, find the real set of statistics where all that information lives, and decide what we want to display for ourselves. So Apostolis, I'm going to go ahead and hand things over to you, my friend. The floor is yours.

Yeah, thanks, Jon. So the topic of today is really, as Jon said, how we can go deeper on the flow analytics, now that we also have to distinguish between cloud flows and desktop flows. There are of course different reasons why you would like to dig deeper into the specifics of past flow executions, or maybe even in real time. In order to enable that, we leveraged something called Microsoft Dataverse. This is the underlying foundation of the Power Platform, but it is also driving the whole Dynamics 365 business solution stack. So we will be leveraging the Microsoft Dataverse API together with Power BI and the Power Automate service to get a little bit deeper when it comes to the flow analytics.

For that, let me jump into the deck. The agenda: first the key drivers, why we built this dashboard, how it looks, how it has been built, some really important findings around that, then a look at options for real-time monitoring, and of course Q&A. Before we start, as I just said, the underlying foundation of this is leveraging the breadth of the Power Platform to do this low-code/no-code kind of extension and reporting, to derive the insights we are looking for. In our scenario, from the left-hand side, we will be using Power BI, the business analytics tool; Power Automate, of course, for the flow execution; and Microsoft Dataverse as the underlying foundation, which is much more than only a data store. It is really a hyperscale database-as-a-service, if you like. If you look at what it is comprised of, as you can see here on the right-hand side, Microsoft Dataverse first of all runs on Azure, and it's much more than only Azure SQL Database.
It is really comprised of many components, making it super scalable, highly secure, and hyper-dense. And if you look at the internals of Dataverse from a feature perspective, it comes with a lot of built-in services: security, logic, data, storage, integration. Then there are the foundational topics like authentication and authorization, but also auditing, so you know who is accessing those components and when, and you have full traceability of how many executions of specific activities on a table have been done, and so on. On the right-hand side you see many connectivity options: reporting, connectivity to Azure Data Lake, and so forth. And the API is what we have leveraged in this scenario to connect to those endpoints. Whenever you execute a Power Automate Desktop flow, the telemetry, the execution log, is transferred to the cloud service, and from there we can access that data through the Dataverse API. This enables many scenarios across the Power Platform and even beyond it, when it comes to integrating with and surfacing the data sitting in Dataverse.

Okay, so why this dashboard? First of all, I was bored during the lockdown. I mean, it's really amazing how easily you can get started with these components. Once you know a little bit about the workings of the Power Platform, it's so easy to extend that knowledge to different toolsets, like Power BI in this case, or even small Power Apps which surface information, or even allow you to enter information which then becomes part of the overall flow. It's easy to upskill on that, and you will see this in the real-time dashboard at the end as well. The dashboard focuses on desktop flow insights: not only when an automation was executed, but how long it took, what actions were called, which actions in that automation failed, which ones took longer than in the previous week or the previous execution; there is some statistical process control you can do on these statistics. So that was really about getting deeper insight into what's going on from an execution perspective.

It also showcases the API endpoint consumption. Dataverse is a gift that keeps giving: all this built-in hyperscale architecture, security by design, record-level security (whenever you access the endpoint, it respects the record-level security applied for your user), audit management, and so on. It then allows for reactive governance and control, like we have with the CoE Starter Kit, where you can really control execution at the action level if you like (we'll see this later in the charts), and also proactive outreach: instead of blocking the whole service, you proactively monitor and reach out to folks and ask what the use cases were, why they used certain actions, just to understand a little more about what the demand is within the organization, and to help drive citizen developership. There's identification of API-based operations and also of pro-code actions; we'll see in a second what that means. And, as I said before, statistical reporting: you can really compare flow runs with previous flow runs and then judge whether a run should be considered an outlier, or whether there's a pattern telling you to look at the hardware or the VMs, and so forth.
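To make that Dataverse API consumption concrete before the demo, here is a minimal sketch of pulling desktop flow run history from the Web API. It assumes you have already acquired an OAuth bearer token for the environment (for example via MSAL); the flowsessions entity set and the selected column names follow the publicly documented schema for desktop flow runs, but treat them as assumptions to verify against your own environment.

```python
# Minimal sketch: pull desktop flow run history from the Dataverse Web API.
# Assumes an OAuth bearer token acquired out of band (e.g. via msal);
# ENV_URL and ACCESS_TOKEN are placeholders.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"  # your Dataverse environment URL
ACCESS_TOKEN = "<bearer token>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    # Ask Dataverse to return display names alongside raw ids/option values.
    "Prefer": 'odata.include-annotations="OData.Community.Display.V1.FormattedValue"',
}

# flowsessions holds desktop flow run records; the column names follow the
# public schema but should be verified against your environment.
resp = requests.get(
    f"{ENV_URL}/api/data/v9.2/flowsessions"
    "?$select=statuscode,startedon,completedon,errormessage"
    "&$orderby=startedon desc&$top=50",
    headers=headers,
    timeout=30,
)
resp.raise_for_status()
for run in resp.json()["value"]:
    print(run.get("startedon"),
          run.get("statuscode@OData.Community.Display.V1.FormattedValue"))
```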
It is also easy to deploy; that was one of the design principles. Many organizations don't have the full breadth of the Power Platform components enabled or deployed today, things like the CoE Starter Kit, which is really a massive governance framework. So if you're just getting started with Power Automate Desktop and the Power Automate service in general, and you don't want the CoE Starter Kit deployed in your organization, but you would still like those insights from the Power Automate service, this is where you can really get going with this starter template. And the last point, as I just mentioned, is really for organizations that haven't enabled the breadth of the platform yet: all you need is access to the Dataverse backend, with sufficient privileges (that's very important, because otherwise you won't see anything), plus Power BI Desktop, and off you go. Okay, any questions, Jon, team?

No, it's all very clear to me. I guess one question around what you mentioned, the right level of access: what is required in order to be able to retrieve this data?

First of all, flows are assigned to specific owners. Whenever you author a flow, you are the owner of that flow, and you can then share it with others as co-owners. It's very important that you have Dataverse environment access first of all; if you don't have access to that environment, you will not see anything. And then it goes further down to object access: you have role-based access control in Dataverse, on the tables, so you can say, on the flow session or on this or that workflow entity, I would like to restrict access. That's the idea behind it. And whatever has not been shared with you stays restricted: say a flow has not been shared with you, and its execution has not been shared with you; a normal, regular user without any administrative role won't see it. So you don't get more access by going through these Power BI reports via the Dataverse API, because that's the essence of what I said before: it is secure by design.

So just to follow up: if I was a regular maker building, say, cloud flows and desktop flows, I would be able to see my information, but if I was an administrator I would be able to see more information, because I've been granted that administrator role. Is that accurate?

That's correct, and I will demo this at the end as well: what a normal user sees versus what the administrator sees. And that's, I think, one of the essential reasons why I said Dataverse is the central piece of our whole Power Platform, and why it's so important to keep it front and center: all these scalability and security configurations, and the auditability of the system, are just God-given for us, right? As soon as you have a license for Power Automate Desktop or Power Automate, you can leverage all these technologies out of the box; you don't have to worry about them. Whether you access this data through the API, through the UI, or by other means, it's always respecting the security layer behind it.
So that's very powerful, of course. Yeah, I like that: God-given rights as soon as... that's awesome, that's great. All right, so let's have a look at the demo. Let's do it, demo time!

First of all, I'm opening Power BI here, which is leveraging, as I said before, the Microsoft Dataverse API to get all this information. We're starting here, and this is on purpose kept rather raw; it's a starter template. It's meant to be customizable, like the rest of the CoE Starter Kit; you can really customize it to your needs. Here's the data model, and here you can see a graphical representation of it: you can change it, add new relationships, add new measures, new calculated columns, and so on. Once you have started it as a template, you are asked to provide two things: the Dataverse environment URL and the flow session history URL from the Power Automate portal. Then you have to authenticate with a user that has the proper privileges, it loads the data, and it presents it in this starter dashboard.

What we see here is a filtering section on top, where you can filter by flow status (failed or succeeded) and run mode (attended, unattended, or executed from the Power Automate Desktop console). Then some basic statistics on top: how many makers do I have in my environment, how many desktop flows, how many legacy UI flows (still using the Windows recorder; we'll come to that in a second), how many overall flow runs I had in that time frame, attended and unattended runs, console runs, how many bots, how many canceled flows, failed flows, and so on. And this tile is for when you have super large Power Automate Desktop runs with many loops, or one large loop, inside them: they produce a lot of log content, and this overview is where you can identify those large-log flow runs. Then you have a couple of lists: top three makers, top ten bot hosts (the hosts where Power Automate Desktop has been run, through the on-premises data gateway), total hours processed, and so on. Then the top actions used in Power Automate Desktop: you can see the Write to Excel action has been used some 25,000 times, and what the average processing time on it was. Just some basic statistics and counts you can get out of whatever the flow portal is producing from an action perspective. If I click here on one of these flows and look at the processing history, all the actions registered there are compiled into this Power BI dashboard; from here, as I said, you see action, subflow, start, duration, and status, and when we look later at the action details you'll see many more things.

Let's go to the desktop flow overview. This is another important view, which works at the desktop flow name level. It gives you a really quick read: how many times a flow has been run, how many times it failed, and, importantly, how many actions were included in those flow runs. Is it using pro-code scripts (we'll come to this in a second), how many action calls were made with pro-code scripts, and what is the total runtime in minutes for that specific automation? Then we have an indicator of whether this is a legacy UI flow or not, just to give you an idea of which of those might be potential targets to migrate to the new Power Automate Desktop.
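As an aside: if you ever want to sanity-check tiles like these outside Power BI, the aggregations are straightforward to reproduce. A minimal sketch, using toy data shaped like the Web API rows from the earlier sketch; in practice you would feed in resp.json()["value"], and the column names remain assumptions.

```python
# Minimal sketch: recompute a few overview tiles from flow session records.
# `sessions` is toy data shaped like the earlier Web API response rows;
# "statuscode_label" stands in for the formatted-value annotation column.
import pandas as pd

sessions = [
    {"statuscode_label": "Succeeded", "startedon": "2021-01-14T09:00:00Z",
     "completedon": "2021-01-14T09:04:30Z"},
    {"statuscode_label": "Failed", "startedon": "2021-01-14T10:00:00Z",
     "completedon": "2021-01-14T10:12:00Z"},
]

runs = pd.DataFrame(sessions)
runs["startedon"] = pd.to_datetime(runs["startedon"])
runs["completedon"] = pd.to_datetime(runs["completedon"])
runs["duration_min"] = (runs["completedon"] - runs["startedon"]).dt.total_seconds() / 60

# Tile-style aggregates: total runs, runs per status, total hours processed.
print("total flow runs:", len(runs))
print(runs["statuscode_label"].value_counts())
print("total hours processed:", round(runs["duration_min"].sum() / 60, 1))
```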
Now, what is very important to note here is this feature, because this is where we start to really dig deeper into those flow runs, as you can see in the overview statistics as well. I have an indicator here which says how many flows are using pro-code scripting components in my flow run history. It gives you a good indication of how many of them have included things like Run VBScript, Run PowerShell script, Run Python script, or Open CMD session. These are usually signs that RPA developers, whether they are citizen developers or not, are leveraging the more advanced components in Power Automate Desktop, and then you might want to dig a little deeper. Without opening each and every flow individually, you want a 30,000-foot view: how many of my desktop flows are using technologies that might reach out to other web servers, or do some PowerShell scripting? Very sophisticated, and potentially also dangerous, things can be done with those components, so to get an overview of them, that's where this KPI comes from; that was the idea behind it. And we can drill down into these in a second.

So: overview stats, desktop flows, as I said, and now the desktop flow monitor. The desktop flow monitor is nothing different from what we see in the Power Automate portal, and the layout is aligned with it. If you look back here, the status is what you see there, plus start and stop and duration, and so on. Conveniently, if there was a failure, we are also showing what the error message was. And from here you have two links: you can go to the actual Power Automate Desktop flow, so if I click on that, it opens the Power Automate portal and goes directly to that flow; or you can go to the flow run history of that particular record, where I can again review whatever happened and the error message, "Failed to obtain output for custom object" here, for instance. And here I can see the screenshot as well, and exactly what happened. If this automation was triggered from a cloud flow, I can also go to the parent flow and have a look at what happened there, whether there was a failure, or why it was cancelled, and so on; it opens again on the other screen.

That's handy, giving a view that looks aligned with the portal, something familiar; it's like, oh, this makes sense to me, I've seen this before. Exactly. Especially, Jon, in the case of cloud flows, you know we have this option where you can see each and every component in the cloud flow, so we can really debug the specific action that caused the error, or even look at the inputs and outputs of those actions individually, just to make sure you understand what happened in those environments. Totally.

From here on, what's important, as I said: in the cloud flow history, if you go back to the history records, you will see a note (on the cloud flows, of course, not on the desktop flows) on top of the run history which says 28-day run history. Anything beyond that is cut off because of GDPR and privacy requirements and so on.
And to indicate that before you even click on one of those icons, there is a graphical indicator, a flag, showing that a process execution, a flow execution, is older than, say, 28 days. So here you see that indicator on those flows.

Okay, let's go up here and look at the next component. This is also important: if I want to review my flows and look at more details on specific errors, what's nice here, and what you cannot do from the portal side, is this. Take, for instance, a specific automation which succeeded in this case; you can right-click and drill through that record, the actual execution in the flow history. You can now go into details like the run details: which actions were included as part of that execution. You see here 24 actions, the average duration of each action was 1.85 seconds, and there's the max duration of a specific action execution. From in here you can open the flow itself again (it opens the portal), or you can drill down further into what information has been used, or what actions have been used in those flows. If, for instance, a script was used, you'll see it here, or if web service calls were made, you will see that in this experience.

So let's have a look at scripts; I think those will be used here. Oh, quite a lot of scripts, as you can see, and this is, I think, one of the key reasons this was actually built. It gives CoE folks, Center of Excellence folks, a real view into the situation across environments (or, in this particular case, for this environment), for all Power Automate Desktop flows and their respective actions. If I go back here, as I said, you will see a subflow and the action name, which always reflects what you see in Power Automate Desktop: those are the subflows and those are the specific actions. What we see here is exactly that detail level, plus the duration and the status. Going one step further: if we go back to our Power Automate Desktop flow and look at one of the really pro-code actions, like calling a web service, so if I double-click on that Invoke web service action, I see, okay, I have a streaming URL, and here I have the request body, and the request body is actually this JSON object. Normally I would have to go into each individual flow, review that stuff, and do my auditing that way, but by accessing the Dataverse backend, as we said before, I can conveniently do this directly from the dashboard experience. If I go here to scripts and drill through, we'll see we had a Write to CMD session action called which was calling a hostname, then a Python script executed from the DOS command line, then a Run application action which was calling Python with some parameters, then PowerShell scripts, VBScript, some JavaScript, or even more Python scripts, all as part of this specific flow. This gives you really huge visibility over the whole command and action history, if you want to look at it on a higher level, for all flows.
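The same pro-code flagging is easy to reason about outside the dashboard. A rough sketch, assuming you have extracted each flow's action log into a mapping of flow name to action names; the PRO_CODE set is an illustrative choice mirroring the actions called out here, and is meant to be adjusted, just like the dashboard's own filter.

```python
# Minimal sketch: flag "pro-code" usage in desktop flow action logs.
# The PRO_CODE set mirrors the actions named in the demo and is deliberately
# adjustable; the action_log shape is an assumption for illustration.
PRO_CODE = {
    "Run VBScript",
    "Run PowerShell script",
    "Run Python script",
    "Run JavaScript",
    "Open CMD session",
    "Write to CMD session",
    "Invoke web service",
}

# Toy example of the assumed input shape: {flow_name: [action names...]}
action_log = {
    "Process vendor invoices": ["Launch Excel", "Run VBScript", "Invoke web service"],
    "Copy countries": ["Launch Excel", "Write to Excel worksheet"],
}

def pro_code_report(log: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return, per flow, the pro-code actions it uses (empty list if none)."""
    return {flow: sorted({a for a in actions if a in PRO_CODE})
            for flow, actions in log.items()}

flagged = {f: acts for f, acts in pro_code_report(action_log).items() if acts}
print(f"{len(flagged)} flows use pro-code actions")
for flow, acts in flagged.items():
    print(f"  {flow}: {', '.join(acts)}")
```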
Then there's also the script action monitor, which allows you to filter by a specific desktop flow, or even by an action name. If you're looking only for Run VBScript, for instance, you can filter on that, and it will show, below, all actions which involve VBScript. Here, for instance, is an action calling some SAP automation, and you can review: are they doing something specific, something complex here, what transaction code is it using? Then you can again proactively approach those folks and say, okay, we might have some standard interfaces for that, or maybe an API, just to drive those discussions and have that visibility. What do you think, folks?

I think that's a lot of scripting. It's amazing, though, that you can dig into those individual steps and see every piece of information passing through that desktop flow. It's pretty awesome. Indeed. That's the kind of visibility organizations, larger enterprises at least, look for when they try to enable the service for the whole organization and have this kind of reactive monitoring in place; it gives them a lot of flexibility when it comes to auditing and reviewing those processes. Okay, so that's the auditing perspective. And you can really define this yourself: I've just filtered here, so if you open this up, you can see I've simply selected which of those actions I consider to be pro-code. If you want to drill through on other ones, if you want to see how many times a Write to CMD session action has been called, you can filter on that, and they will show up in the respective lines. So this is, as I said, completely flexible and up to the consumer to adjust to their needs.

The next level is this: now I know what actions are being used; how about looking at process variability and execution variability, so you see early signs if things go wrong, or if some processes run longer than expected, or fail consistently over a period of time? For that purpose you can filter here; say you have a particular desktop flow, "process vendor invoices", and you limit the time frame you want to look at, say a week's overview of the data. And this is what the chart looks like with the information you have filtered. From here you can again right-click and look at the executions of a particular day by drilling through to the details, and you come back to that view, now filtered for that specific process cluster on that day, with all its executions. Here you see immediately, okay, something went wrong here; I'd like to look at that, look at the scripts, and see there was some Run application happening, some systeminfo, whoami, and so on; you can read it, review it, and look at the error. From here, as I said before, you always have a link back to the Power Automate portal to look at the specifics of that execution. If I scroll down, I can click on this one: there was a Run subflow action which failed, but the real action that failed is further down here, and this is the problem. So you always have a link between your custom reporting and what the reality is, in real time, in those Power Automate Desktop and portal statistics.
Okay, and from here, if you like, you can again drill through to web services and so on. Back in the performance chart: as I said, this is the historical view, and you always see the duration, while this bar next to it reflects the previous day's, or the previous period's, execution count. And this is the distribution and the change over time; you can see, as an indicator, that this one is maybe not a stable process. It might have different inputs and outputs, which is why the process runs more often, or longer. But if you have a really stable process, with stable variability of input and output, this will give you really good insight into what's going on.

Then, the next level, if you are proficient in analyzing those data points (let's filter here to succeeded runs only): you can take a more statistical approach to the data. Starting from the left-hand side, you have upper and lower control limits, and then a center line, and this controls how many standard deviations your process is allowed to deviate from the mean. If I want to control my process at three standard deviations, three sigmas, I can open it up a little and say my process is still under control; or if I want a narrower, tighter band on execution, I can go down to a one- or two-sigma level, and then it shows, okay, I have outliers on these dates, these three points here, and you can dig deeper again on that side. And here is just a violin chart, which shows you the distribution across the different hosts those particular flows run on. Makes sense?

Yeah, the level of intelligence here is intense, my goodness. Each time you click down, I'm like, how can it get deeper? My gosh. Yeah, and that's really not rocket science, Jon. This is the power of Power BI, and actually of the DAX language. This dashboard comes preloaded: I've built a couple of these so-called measures, which give you the ability to look at data from yesterday, for instance, or the previous quarter, or month-over-month changes, which we'll see in a second. Those have been defined, and you can see the code here and change it; it's also a good template for learning these technologies and how to use them without any coding or statistical background, because Power BI and the DAX language give you a lot out of the box. You can create new quick measures, for instance, which have a lot of this intelligence built in, so you don't have to learn the language from scratch: average per category, time intelligence, month-over-month or quarter-over-quarter changes, and so on. That's all built into Power BI, and there are tons of videos and documentation available on how to do it. This chart is simply meant to show you how you could address those specific reporting needs if you wish to.
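For intuition on what that control chart is doing, here is a tiny sketch of the underlying math, with the sigma multiplier exposed as a parameter just like the selector in the demo. This is plain SPC arithmetic on a toy list of durations, not the dashboard's actual DAX.

```python
# Minimal sketch of the control-limit math behind the chart: flag runs whose
# duration falls outside mean ± k·sigma. Toy data, not the dashboard's DAX.
from statistics import mean, stdev

def control_limits(durations_min: list[float], k: float = 3.0):
    """Return (lower, center, upper) limits for a k-sigma control chart."""
    center = mean(durations_min)
    sigma = stdev(durations_min)
    return center - k * sigma, center, center + k * sigma

durations_min = [4.2, 4.5, 4.1, 4.4, 9.8, 4.3, 4.6]  # 9.8 is the outlier
low, center, high = control_limits(durations_min, k=2.0)  # tighter 2-sigma band
outliers = [d for d in durations_min if not (low <= d <= high)]
print(f"center={center:.2f}, limits=({low:.2f}, {high:.2f}), outliers={outliers}")
```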
Okay, so this is the view you see on the portal when you have a flow execution: a header for the flow execution, then the actions, and then the durations and the statuses. If we now look at this action performance day-over-day or month-over-month, it gives you a different kind of intelligence, again at the action level. For a specific time frame it will tell you, for instance, the previous execution duration for Run VBScript, and what the execution was on the previous day, and then it calculates the day-over-day change. You can look at this at the action level and, of course, at the time period level, or by host, by action, and so forth. The same goes for month-over-month changes. Say you have an update script doing some month-end transfer of data, and you know it's always a stable number of employees, a stable amount of data that should be transferred: you can watch the performance of that specific action, which may be calling a web service on the other end, as a month-over-month or day-over-day change, and so on. And the other three pages are just the drill-through reports you have seen before.

So this is all good and nice. As you might have noticed, we also have a filter here which is the hostname, which is actually the on-premises data gateway, or a cluster, and the same information is of course in the portal too. If you go to Data, and then the Gateways side, we see a list of on-premises data gateways, or really bot hosts and targets: single on-premises data gateways, or here we have two clusters, one a Windows 10 cluster and this one a Windows Server cluster. If I click on that, I see the participating gateways in that cluster, and again, per machine or per cluster, the previous executions of my PAD flows; here I had three different sessions.

So that's good and nice. The problem is... let me start one of our cloud flows. That cloud flow, as you will see in a second, triggers three Power Automate Desktop flows in parallel, which are executed on a specific cluster, this server cluster I've just shown you. Because I have set the flag "run on all gateways in the cluster", it chooses where to run the bots depending on load balancing. So these are the three bots I have here, or rather the same bot, sorry, but three executions of it, with three accounts which then access those machines accordingly. When I start that flow now (and this was newly added in December, really monitoring a cloud execution at the gateway level), if we go here and let it start... okay, it has started, as you can see. If you now go to the gateway overview, to Monitor and then Desktop flow queues, what I see is, first of all, my gateway cluster: how many gateways are part of it, and, if I go to the live updates, how many of those gateways are currently executing flows. Here we have two desktop flows queued and one currently running. If I click on that, you see the live update again, ordered by priority; if we go back, of these three, one was called with high priority, another with high priority, and one with normal priority, and this is of course reflected in this queue as well.
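Stepping back to those day-over-day measures for a second: the arithmetic they perform can be mirrored in a few lines of pandas. A rough sketch; the frame layout (one row per action execution, with action, date, and duration columns) is my own assumption for illustration, not the template's actual model.

```python
# Minimal sketch of the day-over-day comparison the DAX measures compute,
# done in pandas instead. The frame layout is an assumption for illustration.
import pandas as pd

df = pd.DataFrame({
    "action": ["Run VBScript"] * 4 + ["Invoke web service"] * 4,
    "date": pd.to_datetime(["2021-01-11", "2021-01-12", "2021-01-13", "2021-01-14"] * 2),
    "duration_s": [1.9, 2.0, 2.1, 3.5, 0.8, 0.9, 0.8, 0.9],
})

# Average duration per action per day, then compare each day to the previous.
daily = df.groupby(["action", "date"])["duration_s"].mean().reset_index()
daily["prev_day"] = daily.groupby("action")["duration_s"].shift(1)
daily["dod_change_pct"] = (daily["duration_s"] / daily["prev_day"] - 1) * 100
print(daily)
```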
Now, the problem with that queue view is: it's good and nice, but you don't see which machine a flow was executed on, you don't see the anticipated duration of that specific run, and you don't see the current context. Say you have a month-end-closing Power Automate Desktop flow with a really long-running reconciliation process, a loop with maybe a thousand records, running maybe two or three hours. It would be very convenient to have a live view, or at least snapshots, of the current execution state: I'm in the 15th loop, for instance, and I'm currently working with this category of data. To enable this, Power BI has again been used together with Power Automate Desktop, this time to define a so-called real-time streaming dataset. What I've done in the Power BI service is create a real-time streaming dataset; you can freely define these, adding fields just as your data schema requires. I've defined a couple of properties on the streaming dataset, which let you collect information and, optionally, save it: if the historic data analysis flag is on, the data is stored on the Power BI service, and if it's off, the data is only cached for about an hour and then it's gone. That's for purposes like IoT monitoring, where you don't necessarily have to store the data but want to detect real-time anomalies, and so on. So that's what I've done, very simply: I defined a couple of properties, and it then generates an API endpoint for you that you can post data against. Here is the body of the data, and if you're using cURL or PowerShell you can post it accordingly.

Then, on the Power Automate Desktop side: you would do this live monitoring only for specific bots, of course. You don't want it for bots that run five or ten seconds, or that are not critical, also from a sequence perspective. But where you do want it, you can send those monitoring events, those status events, to the Power BI streaming dataset through an invocation of a web service in Power Automate Desktop, and that's what I'm doing here: I'm collecting CPU, I'm collecting memory, a couple of telemetry values which I think are valuable for this specific flow. This is actually a demo flow which takes an Excel sheet with a list of countries and writes it into another Excel spreadsheet; this is the list of countries, very simple. And what I've decided for this demo flow is that whenever it loops through, I always send the current context, the current country, the current row in my Excel I'm working with, to my Power BI streaming dataset.
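For reference, the call that the Invoke web service action makes is just an HTTP POST against the push URL Power BI generates for the streaming dataset. Here is a rough Python equivalent; the push URL (which embeds the dataset key) is a placeholder you would copy from the dataset's API info page, and the field names simply mirror the demo's schema.

```python
# Minimal sketch: push one telemetry row to a Power BI streaming dataset.
# This is the same POST the demo performs via PAD's "Invoke web service"
# action. PUSH_URL is a placeholder copied from the dataset's API info page;
# field names mirror the demo's schema and are illustrative.
import datetime
import requests

PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<dataset>/rows?key=<key>"

row = {
    "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    "hostname": "BOT-HOST-01",
    "last_action": "Write to Excel worksheet",
    "current_row": 80,        # e.g. the country currently being processed
    "expected_rows": 196,
    "cpu_pct": 42.5,
    "free_memory_mb": 2048,
}

# The streaming endpoint expects a JSON array of rows, even for a single row.
resp = requests.post(PUSH_URL, json=[row], timeout=10)
resp.raise_for_status()
print("pushed:", resp.status_code)
```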
To show you all of that, you can define a Power BI dashboard which hosts live tiles connected to those streaming datasets. Sorry, it has just changed, as you can see, because I submitted a couple of new flows; let me resubmit those. This will take a second while they warm up, but meanwhile we have our nice view here: one is running, one is queued up, and the other one is next. So if I go back to my Power BI live dashboard: as the execution happens, those records come in, and you see the last action called was Write to Excel; you can see it's at Austria now, it's running, and if I want to look at the specific row, I can see the progress here as well. It's live-updating, as I said, every couple of seconds (you can configure this). You see the memory consumption and the CPU consumption, and you see the expected loop count of the process: it should be 196 loops, because that's the number of countries I have, and I'm currently at position 80. So you can judge: it will be finished maybe in two minutes, or it might be finished in ten hours. And with this level of view, whatever telemetry you want from your Power Automate Desktop host, CPU utilization, free memory, anything, you can surface it here and then filter by action, by host, and so on. Then you see the trend across those machines, I can look at the specific machines, and from here, if I want to know what exactly that specific flow was, I can go back to the flow history, and as it executes it shows at which stage and position we are in the cloud flow sequence. So this rounds it up, Jon: real-time processing alongside the more historical processing of the analytical data. I love it. You can see it's still executing, I have two bots still to finish, and you can see that directly from this dashboard as well. Any questions?

No questions. I'm overwhelmed; that's a lot of information. Yeah, and as I said, no C#, no Python code needed to do these things; it's really just configuration, plus access to the Dataverse backend, of course. That's cool. Pranav, you've been quiet, what do you have to say? I am mesmerized, in real time. Awesome. This fills a huge gap that we had from a customer perspective. We have the out-of-the-box analytics, but that's meant for a general-purpose audience and gives you one lens into your desktop and cloud flows. As you build more and more business-critical automation on the platform, the level of sensitivity you want tends to be a lot higher: you want more real-time insights and auditing. So this hits a nice sweet spot, layering on top of the out-of-the-box analytics, and all the data is still in Dataverse; it's the same source we're connecting to. We're not copying the data over to some other data source and running into a data fragmentation problem; it's just a different view over the same dataset, without needing any C#, Python, or data science skills. That's why I was totally mesmerized; it changed in real time, it's amazing. Love it.

Yeah, it's really great. I'm going to ask the question I know everyone watching wants me to ask: how do I get my hands on this? For those watching who have been mesmerized as well, Apostolis, how do others do this? Yeah, it's a super question. The whole point, as I said before, is to give this extra level of control to flow and desktop flow makers, but also to CoE folks and citizen developers; whoever has access to that backend data, and to that level of data, can of course make use of these things.
And to give that beauty to everybody, that's why it is also called a starter kit: we are planning to announce it soon and provide this downloadable component as a Power BI template. All you need, as I said before, is access to the Dataverse backend with sufficient privileges, plus those two URLs, and off you go. That covers the desktop flow analytics. The real-time dashboard is a different one, as you can see (I don't have it open), because real-time streaming datasets cannot reside in the same Power BI Desktop file as the other datasets; that's why there are two files. And I'm planning, I cannot commit to any dates, but because it's quite simple to do, I'm planning a blog post on how to replicate that real-time part on your own. It just takes, as I said before, a schema, these two actions to send something to the Power BI streaming dataset, and then defining that schema, with the fields you have seen, on the Power BI service, and then you can monitor whatever you like from an execution perspective. That will be very soon to come.

Okay, so you heard it here. I'll update this video when it releases; we're putting this video out as a preview of what's to come, but keep an eye on the description, as that's one place I'll update with the links when it releases. Also keep an eye on the Power Automate blog; I assume that's where the blog post will release, right, Apostolis? Yup. Perfect. So keep an eye on that as well, if you don't want to keep coming back to the video; the news will launch there. You heard it: it's coming soon, you'll be able to download a template where you plug in your information and get some of this goodness yourself, and that to me is the most exciting part about this whole thing. Perfect.

Okay, so some final words, Jon, because the deck has also been called "behind the scenes" or "backstage". Leveraging Dataverse is how I've realized this whole thing. As we said, there are three core entities you would want to look into: workflows, flow sessions, and file attachments. Workflows is where the desktop flows are stored, flow sessions is the execution history, and file attachments hold the action context of those flow session executions. Then I've used Web.Contents, for those who know Power BI, because it allows you to handle different HTTP statuses, but also lets you include the OData annotations, to get from Dataverse not only the IDs but also their respective display names.
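As a companion to the earlier flow-session sketch, the desktop flow definitions themselves can be pulled from that workflows entity. A sketch under stated assumptions: the category filter value (6 for desktop flows) and the column names are what the public schema suggests, but verify them against your environment's metadata.

```python
# Minimal sketch: list desktop flow definitions from the "workflows" entity.
# The category filter (6 = desktop flow) and column names are assumptions to
# verify against your environment; ENV_URL/token are placeholders as before.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"
headers = {
    "Authorization": "Bearer <token>",
    "Accept": "application/json",
    "Prefer": 'odata.include-annotations="OData.Community.Display.V1.FormattedValue"',
}

resp = requests.get(
    f"{ENV_URL}/api/data/v9.2/workflows"
    "?$select=name,createdon,_ownerid_value"
    "&$filter=category eq 6",
    headers=headers,
    timeout=30,
)
resp.raise_for_status()
for wf in resp.json()["value"]:
    print(wf["name"], "owned by",
          wf.get("_ownerid_value@OData.Community.Display.V1.FormattedValue"))
```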
Then, to do an automatic or scheduled refresh on the Power BI service, you have to use parameters with relative paths and so on to allow for that. The template also showcases some medium-to-advanced data processing commands in the M query language. If we go back to our dashboard and edit the query, just as an example, many of these things were included precisely as a showcase. If I go to the actions and open the advanced editor: when I'm getting the action context, it may include sensitive information, for instance a client ID or a client secret, so I can obfuscate that information during the transformation process, and that is showcased here with a couple of the script commands you see. So it's also a good showcase of how to do Power Query and M transformations.

Then, and this is very important: data privacy fundamentals. You cannot just ignore the privacy levels, because that is of course a security risk; but on top of that, if you mix them up, if for your connections you set the privacy level of one data source to Private and another, for instance, to Organizational, you will not be able to publish, and so you will not be able to automatically refresh on the Power BI service. In those terms as well, it's a good showcase. Yeah, and to add to that, a very good point: for folks who have been using sensitive text as inputs and outputs, both for desktop flows and cloud flows, all that data is not logged if you mark it as sensitive; it does not get logged in Dataverse, and it does not show up in these dashboards either. So do take a look at the privacy fundamentals; there are a lot of controls available, both from a maker's perspective and from a CoE perspective. Absolutely true. Whatever you see in the portal, you eventually see here too, so things like the client IDs and client secrets I just described: if you define them in the portal and they're not hidden, you would see them here as well, and that's where I'm doing some minor checks. But you of course have to go into the details of your own data, make your judgments about whether something should be considered sensitive, and filter it out accordingly. This is not a fundamentals course on security, but it shows there are means to restrict this already at the Power Query processing level.
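That obfuscation step is easy to mirror outside M as well. A minimal sketch of the idea; the key-name patterns are illustrative guesses at what sensitive fields in an action context might look like, not the template's actual rules.

```python
# Minimal sketch of the obfuscation idea shown in the M transformations:
# mask likely-sensitive values in an action-context payload before it lands
# in a report. SENSITIVE_KEYS patterns are illustrative guesses only.
import re

SENSITIVE_KEYS = re.compile(r"(client[_ ]?id|client[_ ]?secret|password|api[_ ]?key)", re.I)

def redact(context: dict) -> dict:
    """Return a copy of the action context with sensitive values masked."""
    return {
        key: "***redacted***" if SENSITIVE_KEYS.search(key) else value
        for key, value in context.items()
    }

ctx = {"url": "https://example.com/api", "client_secret": "s3cr3t", "rows": 196}
print(redact(ctx))  # {'url': ..., 'client_secret': '***redacted***', 'rows': 196}
```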
Then there are the DAX measures, the measures I showed you before, month-over-month changes and so on, and once it's deployed to the cloud it leverages OAuth for the authentication. Now, some findings. For those who are very well versed in Dataverse: we have introduced a concept called the TDS endpoint, which lets you connect to a Dataverse environment as a read-only SQL instance, if you like. That is of course very important for reporting, because you offload all your reporting load from the production environment and from production execution, and it again respects the security roles, and so forth. Now, the current implementation of TDS endpoints doesn't expose the flow session entities; that's why I had to use the Web.Contents route instead of the standard Dataverse (Common Data Service) connector we have. We are planning to support this in the future, but there's currently no ETA. Also, the cloud flow history retention may differ from the desktop flow history retention; I believe this has been rectified already. And for the large flow run log history: if you receive a 413 (payload too large) error, that's usually when you have super massive legacy flow runs with huge action logs. This has been resolved in the current version, but for the history you already have, if you hit it, now you know; the first page of the dashboard will surface those errors in the action log, so it won't fail, it will just tell you, okay, we have five of them.

Well, I think the important part here is: there is a lot of power, and you'll be able to get your hands on it pretty soon, as we talked about. Keep an eye on the blog, keep an eye on the description of this video; I will update it when this comes out, and you can get your hands on it. In the meantime, if you have questions for the team, if you want to pick Apostolis's brain about it a little bit, go ahead and leave us a comment or a question down below; we're happy to come back and answer those as best we can. All right, that's been it for episode number 13. You guys know what to do: go ahead and click like and get subscribed so you don't miss another video. And that's it from us; we'll see you in the next one!