In this video, our SAP Analytics Cloud product experts will demonstrate the main features of this release. Feel free to jump straight to the features you're interested in by using the timestamps in the description below. Let's dive in and hear from our SAP Analytics Cloud product experts.
Hello, I'm Adrian Westmoreland, a product manager with SAP Analytics Cloud, and I'd like to introduce you to a new feature that gives you the ability to visually distinguish tenant types. Administrators can now choose to display the tenant type or add a custom title that describes the system. The banner will appear above the shell bar for all users, so they can quickly and easily identify which tenant they are working in.
From the system administration area, under the default appearance tab, there is now a new toggle to allow the display of the system type. I can select from the predetermined types, or I can enter a custom title and custom color for my tenant identification. When I have enabled this, the banner will reflect my selection. Thank you. Just Ask is the new natural language query feature. It aims at enabling content consumers to get answers on their own.
This release introduces the ability to ask for results in a specific currency. This is available with acquired data models. Different conversion methods can be triggered depending on how the model is defined. Let's see this in action. First, we select an acquired data model with amounts in multiple currencies.
This model has a currency conversion variable. This variable will set the default currency. We select the Brazilian real as default. Now, if we ask for sales without specifying a currency, we obtain the result in Brazilian real. If we include a currency code in the question, like JPY, we obtain the amount in Japanese yen.
As long as a valid ISO currency code is provided, conversion is performed on the fly and does not require a conversion measure to be defined in the model. On the other hand, if we ask for sales in British pounds, "British pounds" is not a currency code. This question requires a currency conversion measure to be defined in the model. Finally, if we need results in local currencies,
we can leverage a currency conversion measure defined in the model. The result is automatically broken down by the dimension with the currency property, in this example the country dimension. Currency conversion can of course be combined with other elements in the question, for instance to compare actuals to forecasts. If we drill to the European Union, we see sales in Denmark in Danish krone, in Belgium in euros, in Sweden in Swedish krona, and in Poland in zloty.
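Conceptually, the two conversion paths differ in whether the question supplies a valid ISO code. A hedged Python sketch of that distinction (the rate table, function name, and values here are invented for illustration and are not part of SAC):

```python
# Hypothetical rate table: units of BRL per 1 unit of foreign currency.
RATES_TO_BRL = {"BRL": 1.0, "JPY": 0.035, "EUR": 6.2, "GBP": 7.3}

def convert_from_brl(amount_brl, requested):
    """Convert a default-currency (BRL) amount using an ISO code from the question."""
    code = requested.upper()
    if code in RATES_TO_BRL:
        # Valid ISO code: conversion happens on the fly, no model measure needed.
        return round(amount_brl / RATES_TO_BRL[code], 2)
    # Anything else ("British pounds") needs a conversion measure in the model.
    raise LookupError(f"'{requested}' is not an ISO currency code")

print(convert_from_brl(1000.0, "JPY"))  # the BRL amount expressed in yen
```

The fallback branch mirrors the behavior shown in the demo: free-text currency names cannot be converted on the fly and require the model-defined conversion measure.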
This is my insight I want to schedule. Under the share button, you see the new entry, schedule publication. That opens the dialog where you can give your event a name and specify the start date, the recurrence, and the file type; I keep Excel. Basically, you see it's the same dialog as if you schedule a story. As a recipient I pick my name, and with the create button I schedule the task. I jump directly to the calendar
where you can see the insight is already in progress. A few minutes later, the event was successfully completed, and here in the panel you can see all the settings. As usual, the admin or the persona with the authorization can change the settings at any time here in the calendar. That's the scheduling topic, and next I jump back to data analyzer in order to show you the drag and drop enhancements. So let's start with the measures. As mentioned, from now on you can change the order of the measures here in the table directly by drag and drop. Or you can swap measures; that puts the revenue on top and counter transaction at the end.
The same is possible with the dimensions on the rows, so you can either change the order or you can swap. And from the builder panel, you can swap dimensions. So let's say I want to see details for products but filtered on product group, let's say group four. I drop on group four. You see the filter is set on group four and the dimension has been replaced. The same is possible on the column axis. Let's say you want to see the month on the columns and you're only interested in revenue.
So you swap on revenue, you see it's filtered on revenue only. Calendar months is added to the column axis. That's it from my side for today. Thank you very much for your attention.
A new data integration feature added to this QRC is the ability to clone a query when creating a new load job. You will not see this option if you are creating your load job from a source that doesn't use queries, such as spreadsheet uploads, Google Docs, file uploads, etc. However, if the source does use queries, such as SQL sources or SAP BW, you will have the option to reuse an existing query. So when you're creating your load job, after you have selected the source, if the source has a query tied to it, you will be presented with the option in the dialog, where instead of selecting the connection and then going and creating your query, you will have the option to simply reuse an existing query against that connection. Of course, this increases efficiency, as you no longer have to recreate your queries from scratch when creating load jobs based on similar queries. With this QRC,
SAP Analytics Cloud now supports the use of the BW message server for load jobs. What this means in practice is that if you have a BW message server configured for load balancing, you can use this for import jobs now. It's very simple when you are setting up your connection to BW.
One of the options now in the BW connection dialog is the choice of server type. You can choose between message server and application server. And if you have a message server configured on your on-prem BW landscape and you configure it here, load jobs based on that connection will go through the message server. Hi, my name is Max Gander, and I am a product manager for SAP Analytics Cloud, and I am presenting the seamless planning integration with SAP Datasphere, which enables you to directly save SAP Analytics Cloud models in an SAP Datasphere space. So that means facts and master data of the model directly live within the chosen SAP Datasphere space.
And your data can also be made available and exposed in the data builder for downstream analytic modeling with SAP Datasphere. That means you always have live plan versus actual comparisons available. This is a very strategic initiative, and we are planning many enhancements throughout 2025 and beyond, such as the live consumption of SAP Datasphere data in the SAP Analytics Cloud planning model. Planning modeling still happens in SAP Analytics Cloud. So we are building a new model in ***. We're starting with data
and here we are prompted to choose the data storage location. Selecting SAP Datasphere means that we're using seamless planning. We're choosing a space that we have the appropriate access to, and now we can build a model. I chose to do so via flat file upload.
In the model, we can now choose to expose the fact data of the model in the data builder in SAP Datasphere. So we are providing a technical name and a business name as well. And we are doing the same thing for our new product public dimension table. Again, we are providing a business name and a technical name.
Now let's assume we have finished the building of our model. We have created a simple planning application in which we are creating a new product to be planned on. We're also putting it into the hierarchy, and now we are entering some data on it. We're publishing the data, and now we can check what this looks like in SAP Datasphere. So you see we have a bunch of tables that have been generated based on the settings that we made in SAP Analytics Cloud: fact data, master data. And now we can combine that with actual data and original master data that we have in SAP Datasphere.
We can check out the analytic model sitting on top of that modeling. And when we check the dependency view in the next step, we will see how this analytic model is compiled. We see we have a table and a view with the plan data coming directly from SAP Analytics Cloud, and we have a union view where we are bringing this data together with the actuals that reside in SAP Datasphere as well. Looking at the dependencies, we see some further modeling that we did, like bringing together original and updated master data.
And we can put an SAP Analytics Cloud story on top of that analytic model, which will always have live plan versus actual comparison. Compass is our new feature to perform risk analysis using a Monte Carlo simulation. For those new to this simulation technique, it is a method to simulate the possible impact of driver uncertainties. With Compass, you can create different scenarios to model your assumptions and understand the pessimistic, realistic, and optimistic cases.
You can also easily compare the impact of different scenarios against each other. Multidimensional simulations are possible, meaning that you are able to, for example, simulate uncertainties in different regions, company segments, or products, depending on your use case. Now, you do not need to be a data scientist or a math pro to use Compass, because the interface is tailored for non-technical business users. There is no coding, no duplication of data or formulas, and no additional IT setup required. It is just click and run, and you can simulate on your own, in a private mode, or together with your colleagues. Either way, you will see how Compass is not only able to help you simulate the unknown, but also improve interpretations of the risk involved with quantified risk analysis.
You can trigger a new compass simulation via the toolbar or directly off a dashboard. I will show you how to do it via the dashboard today. Let us imagine that we are a bike manufacturing company reviewing the budget for 2025. We can see here a simplified profit and loss structure with the different revenues and expenses aggregating up to the operating income.
Now we can explore what will happen with this for Q4 2025 if things are not going as planned for some of the drivers. We can trigger a Compass simulation directly off the dashboard to explore this. The drivers are automatically detected, and we can directly simulate a range of uncertainty for gross revenue to simulate the impact of possible trends and new deals. We can also simulate a range of fluctuation for direct material. We can now run the simulation and choose between preview, medium, or high precision, which is a choice between 1,000, 10,000, or 100,000 calculation iterations. Let us choose medium precision here. What is happening in the background is that 10,000 calculations of the operating income will be performed, with each calculation sampling a random value for each driver within the range you have specified.
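The sampling loop just described can be sketched in a few lines of Python. This is a hedged illustration only: the driver names, ranges, and the toy income formula are assumptions, and Compass's internal implementation is not public.

```python
import random

def simulate_operating_income(n_iterations=10_000, seed=1):
    """Monte Carlo sketch: sample each driver uniformly in its range, recompute income."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_iterations):
        gross_revenue = rng.uniform(90.0, 110.0)   # assumed uncertainty range, $M
        direct_material = rng.uniform(30.0, 40.0)  # assumed fluctuation range, $M
        other_expenses = 25.0                      # held constant in this sketch
        outcomes.append(gross_revenue - direct_material - other_expenses)
    return sorted(outcomes)

outcomes = simulate_operating_income()
# With the outcomes sorted, percentiles give the case boundaries, e.g. a
# 25% / 50% / 25% split into pessimistic / realistic / optimistic:
pessimistic_cap = outcomes[len(outcomes) // 4]
optimistic_floor = outcomes[3 * len(outcomes) // 4]
```

Sorting the outcomes is what lets the result be displayed as a distribution graph with adjustable case boundaries, as shown in the demo.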
The results will be sorted into a graph, and you can easily see the range of probable outcomes for the operating income. You can also see the boundaries between the pessimistic, realistic, and optimistic cases, with the value boundaries clearly displayed again at the bottom. You can open the case settings to change the name, color, and percentage division for the three cases.
The current budget plan, which is the baseline, is also displayed on the graph, and you can see here it is still within a realistic range, but there is a much higher chance of achieving a value below it rather than a value above it. You can also drag the slider to explore the chances of achieving individual values, or type them in, for example here $100 million, to understand the likelihood of achieving them. You can also save a simulation to show your colleagues or come back to it at a later point in time.
Once saved, you can create multiple scenarios to compare the results. For example, I will make a copy of the current one to simulate the scenario where I would secure the deals in the EU and US and also salvage some of the customer trends in Asia by spending more on marketing and customer services. You can run another simulation and compare the scenario results. You can add a description so that you know what you're simulating here, and you can continue to adjust the simulation until you have reached a state where you would like to publish it to show your concept to a colleague, who can now also create a private copy to simulate further. In this way, a common language for risk analysis between different departments can be created and a coordinated simulation environment can be enabled.
Hello, it's Derek from Product Management. This session, we're going to be looking at an enhancement we've made to data actions, where we can specify how we'd like to create a planning area when one may not exist. So in this case, we've previously had the option of using the recommended planning area or all version data, but now we've extended these capabilities to add the capability of creating an empty planning area. So if you create an empty planning area as part of the data action setup and configuration, what will happen is when you execute the data action, it'll dynamically extend for the scope of the data action as it executes. So let's take a look. So here we have a story with a data action associated with it, and this data action is intentionally very simple. It'll take a region,
multiply this value for sales by a certain growth factor, and then store it in 2025. And so when we look at our data action, the new configuration is contained in our default strategy that we want to use. This applies in the case where the version is not in edit mode.
And here we can specify whether we want to use the recommended planning area, all version data, or the new option here, which is an empty planning area. So this is great if you have access to, say, a large planning area for a version and you want to execute in just a small area for a more optimal execution. You can use this type of strategy. And when we execute it, we'll see that it'll give us a prompt. We're going to target the plan version, with our growth factor of 1.1, and we're just going to be using
this for France. So it's going to start with an empty planning area, and then as it executes it's going to add the French region into the planning area definition, and we'll take a look at that now. So we go into version management, look at the details, and then we can see here that, instead of using all the data or the recommended data for that strategy, our selections here are for 2025, and we can see our region for France is selected. We can also look at this in terms of our data actions as well. If we look at our job monitor, I'll look at the latest one here, and then we can review the execution steps here, and you can see the execution time. So typically, if you have a large
planning area, the prep time will be a little bit longer here, and you'll see by using these strategies, in some cases we can minimize the prep time used in the data action execution. We also see a note that this step is where the planning area was extended. So there's a note to give some additional indication to the person reviewing the jobs that the planning area was extended in the execution of this data action. So thanks, that's it for this demonstration.
Hello, it's Derek from Product Management. This session, we're going to be looking at our ability to toggle on and off a data action step within the data action designer. So when we deactivate a step, it's not going to execute when the data action is run, and it's also not going to perform any kind of validation checks. So let's take a look. So here we've opened up a data action with two steps, and we see on the second step there's an error. And the error is pretty simple in this case: it's related to a version that doesn't exist within the model anymore. So instead of correcting this error, which would be the normal process, if I wanted to, I could just simply deactivate the step, and then I can continue to execute this data action. So when
we deactivate a step, as you can see by the message, it is no longer going to go through the validation process, and when the data action executes, it's going to skip this step as well. So that's it for this demonstration. Hello, it's Derek from SAP Product Management. This session, we're going to be looking at an enhancement we've made to our validation rules. We've now included a toggle to activate as well as deactivate a rule. In addition, we will now deactivate a rule if it contains over 1 million combinations. So let's take a look.
Let's take a look at a simple model where we have some validation rules enabled. So when we look at our validation rules, you can see that we have two of them here, and we could certainly take a look at the definitions, but more importantly, we can easily toggle these validation rules from inactive to active and active to inactive. And what this will do is, when you are in a planning session, if you deactivate the rules, they won't be applied to your planning session as you're making updates. So take a look at this feature, very easy to use.
In addition, if a rule starts to exceed 1 million combinations, you'll see that the rule is automatically deactivated in that scenario as well. Hi, my name is Scott Godfrey, and I'm one of the product managers with SAP Analytics Cloud focused on planning, and I'm going to cover a couple of the other features releasing as part of our Q1 2025 QRC release in the planning area. So the first thing I want to cover is our job monitor, specifically our multi-action support within the job monitor.
So up until this point, we've had a data action monitor that you could access within the data action application, but we've now taken that out, extracted it, and made it a central component that you can access either directly from the data action or multi-action applications or from the system menu option. But the key here is that now you can track the run history for both data actions and multi-actions, including the execution of data actions within the context of multi-actions. This also gives us a platform to expand this to other types of batch jobs in the future, like end-user data upload jobs, for example. So let's jump over and see this in action. To jump directly to the job monitor, I can now access the system application list and the job monitor from within that. When I select that, you'll see that I jump to the job monitor. I'll have a list primarily of the main jobs. That's all
the data actions and multi-actions that have been run based on the most recent seven days. That's the default. But within that, I then have other tabs that I can access. So if I want to directly access the multi-actions, I can select that. I'll see that, in this case, I have two multi-actions that have run in the last week.
I have, much like the data actions, the ability to select the multi-action and see the individual steps, in this case the data actions within the multi-actions that have been executed, and the execution history, timing, and duration of those. If I go back to my main jobs, you'll see that I have, concisely, the list of all data actions and multi-actions. But then within that, if I were to select all jobs, what I see is the run history of the data actions and multi-actions, including the data actions that are nested within the multi-action. Right, so I see the multi-action itself plus the individual data actions that have been executed within the context of the multi-action. Now, another thing that's nice here is we do have very rich filtering capabilities. So as I said, the default is to show the last seven days of jobs, right? But I have a lot of capabilities around filtering based on time and time ranges. I have, obviously, the status.
I have some rich capabilities around filtering based on duration. So I could filter on individual second ranges, in this case 15 to 30 seconds, and I'll see all the jobs that ran in that window. So the last thing I want to cover is just some additional ways to get at the monitor itself. If I'm in the data action application, I have a button for the job monitor, and likewise in the multi-action application, I have a similar button, and in either case, if I click that button, I'm taken to the relevant subset of jobs. So that's the job monitor, which is being released with our Q1 2025 QRC release. The next thing I want to cover is our scripted table event, on-after-data-entry-process. The point of this is to capture an event after the user has made a data update in the table, it's been sent to the server, and any processing on the server side has occurred. So before control is returned to the user, we now have this capability to execute a certain set of code. Now, what's important about this event is it will be trapped in any standard data entry mode on the table, so basic data entry mode, mass data entry mode, or fluid data entry mode, and it will provide change tracking. So the data
originally submitted by the user and any changed or resulting cells will be returned in that function. And so this allows us to do things like create an array that captures changed data to then pass selective values or selective change records to our data action. This would be a typical use case we would expect to see. I do want to caution that you need to be judicious in how you use this capability,
right? So executing long-running data actions inside the event itself, or even executing short data actions when you have many users simultaneously, is inadvisable, as it can quickly cause a queuing problem and impact overall system performance. Now, I have a colleague who has written a very good blog related to this event. I would strongly suggest that you take a look at this blog.
It gives you some typical scenarios where we would expect you might utilize this capability, and maybe more importantly, it gives you some best practices and things you need to look out for when you're trying to leverage this. So I strongly suggest before you use this event, that you take a look at this blog and understand where and how it might be appropriate to use this event. So with that, let's jump over and see this in action.
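The change-tracking array described above can be sketched roughly like this, with plain Python standing in for SAC's JavaScript-like scripting. The snapshot structures, member names, and function name are all invented for illustration.

```python
def changed_members(before, after):
    """Return the members whose cell values differ between two snapshots."""
    return [member for member, value in after.items()
            if before.get(member) != value]

# Hypothetical cell snapshots around a data entry (member -> plan value):
before = {"Bikes": 100.0, "Helmets": 40.0, "Locks": 15.0}
after = {"Bikes": 120.0, "Helmets": 40.0, "Locks": 18.0}

# Only the changed members would be passed as the data action's scope,
# instead of running the data action over the full model.
scope = changed_members(before, after)
```

Passing only `scope` to the data action is what makes the copy-to-forecast scenario below faster than copying the whole version.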
So in my scenario, what I want to do is copy selective records from one version to another, and I only want to copy the changed records, those that I've modified. And this is a scenario now where I can change specific records in my source version, capture the changed records in an array, and then ultimately run a data action by clicking the copy-to-forecast button that will only pass the changed context to the data action itself. And so, depending on the data action itself and the size of the model, this can dramatically improve the execution times of the data action. I have another scenario where, what I can do in this case is sort of trap a data entry value. So in this case, let's say I don't want my plan value
to exceed budget by any more than a certain threshold. If in Europe I were to type, for example, 115, that's fine. But then in Asia, if I were to type, let's say, 100 million, right, I now am going to get a warning that I've exceeded my budget threshold. So in this case the event checks the budget versus the threshold and then disables the publish button. So this is another example of how you might use this table event.
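The guard logic in that second scenario amounts to a simple comparison. The sketch below is illustrative Python only; the real check runs inside the table's scripting event, and the 10% threshold is an assumption.

```python
THRESHOLD = 1.10  # plan may exceed budget by at most 10% (assumed)

def publish_allowed(plan_value, budget_value):
    """False means the event would warn the user and disable the publish button."""
    return plan_value <= budget_value * THRESHOLD

print(publish_allowed(115.0, 110.0))          # a small overshoot is fine
print(publish_allowed(100_000_000.0, 110.0))  # a huge entry trips the guard
```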
Now I'm just going to switch to another screen, and we'll look at sort of the design time for this. So what's important is that this is an event, and the event is table-based, right? So in this case, I'm going to select my source table. In the on-after-data-entry-process event, what I'm doing here is checking the context change and then storing any changed records into the array, so that I can then process them in a secondary step. Right, that secondary step in this case is the user actually clicking the copy-to-forecast button. If I just look at the on-click script for that, what I'm ultimately doing is setting the parameters for the data action based on the array that I created from that table event and then calling the data action only based on the subset of records that have been changed. So that's our new table event, on-after-data-entry-process, as part of our Q1 2025 QRC release. The next item I want to discuss today is our calendar template viewer. In Q4 2024, we released the ability to create calendar templates from an existing approval structure.
And the next steps for that are the ability to create a sort of a view and an authoring experience directly for those templates. And that's what we started here with Q1. So we've introduced the ability to view an existing calendar template structure and to use that to deploy an instance of the calendar process based on that structure. In addition to the viewer itself, we are shipping four standard calendar templates as part of the Q1 release. So these will help our customers rapidly deploy and instantiate specific workflows based on the typical structures we would see.
So with that, let's jump over and see the calendar template viewer in action. So if I just look at my file repository, I can filter out selectively only the calendar templates using the filter option. And if I click on one of those templates, I'll now be launched into the template viewer. Now this is a view only experience. It only allows me to browse and deploy an instance. It doesn't allow me to modify and enrich it in a meaningful way. That should come in the future.
I can see some of the details, if I need to, about the calendar process. Again, I have limited editing capabilities, but I can see the specific structural information about the template. And I, of course, can deploy an instance of that template to an actual calendar process. So in this case, if I wanted my Q1
2025 forecast, I could take the template and create the Q1 forecast from that. And we'll see here in a minute that my Q1 2025 forecast instance is deployed, specifically based on that structure. Additionally, as I mentioned earlier, we are shipping some standard calendar templates.
You can find those in the public folder under calendar templates; there will be four standard calendar templates based on typical structures that we see and expect our customers to use. Of course, we do have the ability to open any of these in our template viewer by clicking on any one of them. But in practice, you're likely to deploy these using the calendar wizard, especially if you want to replicate them down a process hierarchy. So again, this is the calendar template viewer as part of our Q1 2025 QRC release. Hello, my name is Jan Begonay. I'm a product manager with SAP
Analytics Cloud, with a focus on the add-in for Excel. Today, I will present new features as part of the Q1 2025 QRC release. First, as part of the seamless planning integration, it will be possible to create reports in Excel and do analysis and planning on top of it. Next, it's now also possible from the SAP Analytics Cloud web calendar menu to add add-in Excel workbooks as part of any task, to be part of the calendar integration. Also, we will introduce new illustrations to be fully compliant with the SAC ones.
So now you can already see new icons exactly similar to the ones that you know from SAP Analytics Cloud. In the add-in, we also introduce a new option that is accessible from the ribbon, named Reset Table. With this option, it will be possible to start from the initial navigation state again.
And the last feature for this QRC will be the capability to create specific formatting for each table, applying it to different levels. This is based on the standard Excel cell styles that can be changed, and then you can change the format, you can change the color, and you can apply it directly to your report and keep it after refresh, of course. But let's see a live demo.
In the panel, we can see a new option in the styling tab where you can create your own styling. This styling can be applied in different areas, so initially on the member cells but also extended to data. This is based on different cell styles that can be defined in the standard Excel ribbon, in the cells section, starting with SAC here. From here, you can modify the standard format, retrieve what you know from standard Excel capabilities, and apply it to your report. Of course, several cell styles can be created and applied to the report.
Each cell style can be changed or deleted using the icon in the menu. Hi, welcome to What's New for Q1 2025 for SAC. Let's start with feature one.
Threshold compared-to-measure feature for SAP BW and SAP S/4HANA. Thresholds comparing measures were already supported for Datasphere and data analyzer. Thresholds comparing measures were also supported for acquired models, but not for live models based on SAP BW and SAP S/4HANA. Now we have addressed this limitation, and SAP BW customers can use the compared-to-measure threshold functionality on live models as well. This will work on SAP BW versions greater than 7.5.
Let's see a demo. So here I have a chart with a compared-to-measure threshold which is defined on a live SAP BW model. So, how have I defined it? I have defined a threshold comparing two different measures: one is the measure that is used in the chart, and the other is a measure that is available in the data model. I have compared these two measures.
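A compared-to-measure threshold boils down to evaluating each data point against the reference measure rather than against a fixed number. A hedged sketch of that idea follows; the band percentage and status names are invented, not SAC's actual threshold configuration.

```python
def threshold_status(chart_value, reference_value):
    """Classify a chart measure against a reference measure from the model."""
    if chart_value >= reference_value:
        return "good"
    if chart_value >= 0.9 * reference_value:  # assumed warning band
        return "warning"
    return "critical"

# e.g. actual sales vs. a target measure coming from the live SAP BW model:
statuses = [threshold_status(a, t) for a, t in [(100, 90), (85, 90), (50, 90)]]
```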
This was not available as part of the SAP BW live connection. Now we have made it available, and you can define such a threshold. The next update is unit conversion support in story calculations. As part of the Q4 update, we supported unit conversion at the model level. Now, as part of the self-service use case, we also support unit conversion at the story level. This will help customers get an accurate aggregation of values.
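Under the hood, a fixed standard-unit conversion is just a factor table applied before aggregation, which is why mixed-unit values only sum correctly after converting. A minimal sketch (weight units only, chosen for illustration):

```python
# Fixed factors to the target unit, gram.
TO_GRAMS = {"G": 1.0, "KG": 1000.0, "T": 1_000_000.0, "MG": 0.001}

def to_grams(value, unit):
    """Convert one unit-sales value into grams using a fixed factor."""
    return value * TO_GRAMS[unit.upper()]

# Mixed-unit rows only aggregate meaningfully after conversion:
rows = [(2.5, "KG"), (300, "G"), (0.001, "T")]
total_grams = sum(to_grams(v, u) for v, u in rows)
```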
The unit conversion is available only for the new model type, and we support conversion of standard units only. Now let's look at a demo for this. So here's a table which is showing unit sales in multiple units of measure across the products.
Now let's try to convert these unit sales into a standard unit of measure called gram. So to do that, what I can do is go to measure, add a calculation, select the calculation type as unit conversion, and give an appropriate ID and description. Here I can select the measure which I want to convert and select the target unit. So I know that it's a fixed conversion.
And based on the list of the units that is already available, I can search through all the units, or I can just enter G, which stands for gram. I click on OK, and here I have my measure converted into grams. The next feature that we would like to discuss is the updates that we have made to the vertical filter panel. Now, this is a design time as well as a view time update. What we have introduced is an "applied to selected" section as part of the vertical filter panel. What it means
is it will show you the local filters that are applied at a widget level on the widgets that you have selected. And you can interact with these filters as well. So let's look at a demo for this. So here I have a story in edit mode. Let's first look at the design time persona.
And here I have a chart which shows the unit sales across multiple products. Let's click on it, and it will show you all the filters that are applied on the widget at design time, in the left panel under the applied to selected section. So here I have a filter that is applied on version and date. If I want to modify or change the values based on the filter that is set, I can select or deselect the values, click on apply selection, and it will make changes to the filter value.
If I want to add it back, I can click it, select it again, and click on apply selection. Now let's look at the same in the view time persona. So this is my story, opened in present mode.
I am clicking on the same widget again, and I can see that there is a predefined filter set on version and date. Now I can click on date and choose which values I do not want to select as part of my filters. I click Apply Selection and my chart is updated. If I want to set it back, I can click on reset, or I can go back and select the values again. Thank you. Hello everyone, my name is Jayden. I am
a product manager for SAP Analytics Cloud. Here I would like to introduce the new features we have delivered with the QRC1 release in the category of story extensibility. The first new feature is in the composite area. Within a composite, you are now able to add global calculations for charts and tables, with support for account calculations, dimension calculations, and cross calculations. We support account calculations and dimension calculations in a custom widget where applicable; a calculation input control is out of scope. Within a composite, you are now also able to define a flow layout panel, a slider, and a range slider. With that, we enable story designers to create more dynamic and interactive composites that can adapt to different screen sizes and user inputs.
Also for composites, we now support linked analysis with a composite as the target widget. We enable story designers to link a widget with composites using linked analysis in SAP Analytics Cloud. Composite widgets are supported as target widgets for the Only Selected Widgets scope in the linked analysis panel. This will have an impact on the widgets within the composite (charts, tables, custom widgets, text with a dynamic dimension, and images bound to a data source) that use the same data source as the source widget. The next feature is also related to composites. Within a composite, you now have the option to go to the data model settings and disable model management for story designers.
The default setting is unchecked. That means you create this composite once, and in the story you can adapt the composite's data model to the story's data model. But in some cases you don't want story designers to change the composite's data model in the story at all, and for those cases you can use this option. We also have a new feature: we are now able to update exposed script variables in the story URL parameters. You may already have implemented this scenario,
where you create a script variable, the script variable is exposed as a URL parameter, and you start your story with that URL parameter. In the story runtime, you use some scripting to update the script variable. But previously, you would notice that the script variable value in the URL parameter was not synchronized with the actual value in the story.
With this release, we synchronize the URL script variable parameter with the actual script variable value within the story. With that, it is easier to bookmark, copy and paste, or share the URL, with confidence that the URL contains the most current values. What you need to do is go to the script variable, where there is a checkbox called Enable Dynamic URL. Once you check that, the script variable parameter in the URL is synchronized with the actual current value in the story, and the story URL parameters are dynamically updated according to the latest script variable value at story runtime.
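Conceptually, Enable Dynamic URL keeps the `p_<variableName>` URL parameter in step with the script variable's current value. The helper below is an illustrative sketch of that idea, not an SAC API; the URL and variable name are made up for the example.

```javascript
// Sketch: rewrite the p_<variableName> parameter of a story URL so it
// reflects the script variable's current runtime value.
function syncScriptVariableInUrl(storyUrl, variableName, currentValue) {
  const url = new URL(storyUrl);
  // SAC exposes script variables as URL parameters with a "p_" prefix.
  url.searchParams.set("p_" + variableName, String(currentValue));
  return url.toString();
}

// The story was opened with the default reference value of 100:
const opened =
  "https://example.com/sac/story?storyId=ABC123&p_referenceValue=100";

// At runtime, a script (or a range slider) moves the value to 156:
const shareable = syncScriptVariableInUrl(opened, "referenceValue", 156);
// shareable now contains p_referenceValue=156, safe to bookmark or share.
```

This is why the shared or bookmarked URL carries the latest value instead of the one the story was originally opened with.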
Here is also an enhancement to pop-ups. With the new release, you are able to leverage a script API to dynamically adjust the size of pop-ups at story runtime, to address the need for flexible pop-up dimensions based on the screen size and improve the user experience. You can use the script API to size the pop-up more dynamically, setting the width and height either to pixel values or to percentage values.
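The percentage-based sizing described here boils down to computing pixel dimensions from the current screen size. The function below is an illustrative sketch of that computation, not the actual SAC pop-up script API.

```javascript
// Sketch: given the screen size, compute the pixel size of a pop-up
// whose width/height are set as percentages. Names are illustrative.
function popupSizeInPixels(screenWidth, screenHeight, widthPct, heightPct) {
  return {
    width: Math.round(screenWidth * (widthPct / 100)),
    height: Math.round(screenHeight * (heightPct / 100)),
  };
}

// A pop-up set to 60% height keeps 60% of whatever the screen height is:
const onLarge = popupSizeInPixels(1920, 1080, 50, 60); // height 648
const onSmall = popupSizeInPixels(1280, 720, 50, 60);  // height 432
```

This is why, in the demo later, shrinking the browser window makes the pop-up shrink with it while keeping the same proportion.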
With that, you can, for example, dynamically adjust the size of the pop-up during story runtime based on the data or the table size in your pop-up, for a better user experience. We also have additional improvements for input controls, on both the performance and the functional side. We have performance improvements when the input control has a hierarchical structure and is bound to a BW model. The related script APIs are the input control's setSelectedMembers and the variant that sets selected members with unbooked data; for these two APIs, we have performance improvements when the input control is bound to a BW model and has a hierarchical structure.
We also support a script API to return unbooked data for hierarchical input controls. If you use the input control's getActiveSelectedMembers with unbooked data, you are able to return the unbooked data for hierarchical input controls. With that, you can define more flexible and interactive scenarios by leveraging the input control script APIs. Next is also a very powerful feature: we provide a script API so that you can dynamically change the data model for charts. We already had this API for tables, so at runtime you could set the model for a table to exchange the underlying model.
Now we also make that possible for charts. At runtime, you can use the script API to change the underlying model. Here we use the script API Chart.setModel, where you give your model ID, and also the chart's API to open a select-model dialog. All the models that are available in SAP Analytics Cloud are supported here.
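The idea of Chart.setModel can be sketched with a small stand-in object. The `chart` stub, the model IDs, and the dropdown mapping below are all hypothetical; in SAC the call is made on the chart widget itself (e.g. `Chart_1.setModel(...)`), which rebinds the chart to the new model.

```javascript
// Stand-in stub for a chart widget; only the model binding is mimicked.
const chart = {
  modelId: "t.1:MODEL_ACTUALS_2024", // hypothetical model ID
  setModel: function (modelId) {
    // In SAC this would rebind the chart to the new model;
    // here we just record the new model ID.
    this.modelId = modelId;
  },
};

// E.g. a dropdown selection decides which model the chart should show:
const modelByChoice = {
  actuals: "t.1:MODEL_ACTUALS_2024",
  plan: "t.1:MODEL_PLAN_2025",
};
const selected = "plan";
chart.setModel(modelByChoice[selected]);
// chart.modelId → "t.1:MODEL_PLAN_2025"
```

A typical use is exactly this kind of runtime switch: one chart, several candidate models, and a selector widget that decides which one is bound.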
Now I would like to move to the demo part and show you all these new features. First I would like to show you global calculations in the composite definition. Here I have a composite. Within the composite, I have a chart and a numeric chart, and within the chart you now have the option to go to the account and add a calculation. You are able to add different calculations, like restricted measures, calculated measures, and so on; all these different calculations can now also be added in a composite.
Now I would also like to show you the new option under the global settings. There is a new option to disable model management for story designers. The default is unchecked, which means you create this composite once, and then in the story you can exchange the underlying model based on the story context.
But if you want your story designers to not change the underlying model, and always use the composite's model as it is, then you can just select this option to achieve that. Depending on your scenario, you can check or uncheck this option. The next item is also a composite enhancement: we also support the flow layout panel within composites.
Here you can see I have this story header as a composite, and within this story header composite I have a flow layout panel that displays text and four different icons. I can use the device previewer to preview what this looks like: a flow layout panel containing the text and these four icons. So why do I actually need a flow layout panel? Because a flow layout panel gives me a more adaptive layout. For example, there are four icons here, and depending on which icon I don't need, say the help icon, I can just make it invisible. You can see that the help icon disappears and the positions of the other icons automatically move to the right, because I have defined the flow layout panel with right orientation. I can go further and, for example, also make the mail tool invisible. So you see, with the flow layout panel you have more flexible layout options: you can very easily make an icon or widget visible or invisible, and the layout will adapt automatically.
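The reflow behavior just shown can be sketched as a small packing function: hide an item and the remaining visible items are re-packed against the right edge. All names and the packing math below are illustrative assumptions, not the SAC layout engine.

```javascript
// Sketch: pack only the visible items flush to the right edge of the panel.
function layoutRightAligned(items, panelWidth, itemWidth, gap) {
  const visible = items.filter(i => i.visible);
  return visible.map((item, idx) => ({
    name: item.name,
    // Items fill from the right; hiding one shifts the rest rightwards.
    x: panelWidth - (visible.length - idx) * (itemWidth + gap),
  }));
}

const icons = [
  { name: "text", visible: true },
  { name: "mail", visible: true },
  { name: "help", visible: false }, // hidden: the others move right automatically
  { name: "menu", visible: true },
];
const placed = layoutRightAligned(icons, 400, 40, 10);
// placed → [ {name:"text", x:250}, {name:"mail", x:300}, {name:"menu", x:350} ]
```

This is the property that makes the panel useful: visibility is the only thing you toggle, and positions follow.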
If I want to make it visible again, it reappears in place. You don't need to adjust the layout yourself; the flow layout panel does the job for you. That's why we also support the flow layout panel within composites, to help you define more interactive scenarios. Now let's have a look at a story that leverages these composites. Here is my example story: I have a header composite, I have this bar chart with a variance composite, and I also have this numeric chart with a variance composite. In this release, we also support composites in linked analysis. So if we have an input control,
and you go to the linked analysis settings, the default option, which we delivered before, is All Widgets on the Page. That means this input control impacts all composites that have the same data source as the input control. With this release, we also deliver the Only Selected Widgets option, where you can go to a dialog to select composites. For example, I can select that my input control only works on my employee expenses composite. So you can apply the linked analysis partially: the input control has an impact on some of your composites and not all of them. That's the additional flexibility we are delivering with the QRC1 release: for linked analysis, you have more flexibility with composites, and you can select the widgets that will be impacted by the linked analysis. OK, good. Now let's have a look at view time and how this story looks. I have just opened the story, which is based on the composites, in view time, and we can see all the composites with their details and so on.
We also have a new feature that supports dynamic pop-up sizes. You can see that this is the pop-up size I defined within the composite, and here is another pop-up with the same fixed size. But if I want the pop-up size to change dynamically at runtime, I can use scripting to do so. Here I am using scripting to dynamically adjust my pop-up size; you can see that this pop-up is much bigger than the one before, because at design time I wrote code to make the pop-up size dynamic. I have set the pop-up height to 60 percent.
So the pop-up height percentage is set to 60 percent. If I make my screen smaller, you see, the pop-up size is adjusted dynamically; it always keeps 60 percent of the height. That's quite nice: with the script API, you can adjust the pop-up size dynamically. Let me close the pop-up. The other feature I want to introduce here is the dynamic URL parameter.
Here I have a dynamic URL parameter: there is a reference line, and its default value is 100. You can see that my chart's reference line is at 100, and you can also see it in the URL parameter, which we show here.
In the URL parameter, you can see this value is 100. Now if I move my value, for example, to 156, the chart is updated, and the URL parameter is updated as well; you can see that the reference value is 156. If I use the range slider to move to 72, you can see that the script variable value in the URL is also updated to 72.
So you can see that the script variable within the URL is synchronized with the current script variable value in the story runtime. That's a real advantage: if I just want to copy and paste this link to share with my colleagues, or bookmark this URL in my favorites, it has the correct values, because they are exactly the current script variable values in the story. OK, that is the feature highlight I wanted to introduce for the QRC1 release in the category of story extensibility. Thanks for listening. Hi, I would like to demonstrate some new enhancements to the custom widget builder panel.
The problem today is that it is not possible to pass query totals to custom widgets. If the total is just a sum of the dimension members, it can be easily calculated. However, this does not work if the measure is a calculation based on other measures, such as a percentage. In cases like these, it may not be easy for the custom widget developer to calculate the correct percentage value. The solution is to improve the custom widget builder panel to include totals, and also to enhance the custom widget's returned data layout so that accurate total data is returned.
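A tiny example shows why a ratio measure's total cannot be derived by summing the row values, and why the query should return the total itself. The data and the ratio measure below are made up for illustration.

```javascript
// Two categories with an expense-to-budget ratio measure (in percent):
const rows = [
  { category: "A", expense: 50, budget: 100 }, // ratio 50%
  { category: "B", expense: 30, budget: 40 },  // ratio 75%
];

// Naive total: summing the row percentages gives a meaningless answer.
const naiveTotal = rows.reduce((s, r) => s + (r.expense / r.budget) * 100, 0);
// naiveTotal → 125, which is not a valid total percentage

// Correct total: the ratio of the totals (80 / 140), which is what the
// query total computed by the backend would return.
const totalExpense = rows.reduce((s, r) => s + r.expense, 0);
const totalBudget = rows.reduce((s, r) => s + r.budget, 0);
const totalRatio = (totalExpense / totalBudget) * 100;
// totalRatio ≈ 57.14
```

Only the backend (or whatever computes the query) can produce the correct 57.14, because it needs the underlying base measures, which the custom widget may not receive.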
We have tested this with both acquired data and BW live data. As shown here, the ratio percentage is being calculated on the table, and we would like to do the same in the custom widget. Here we have turned on the show-totals option, and we are going to show the category percentage in this custom widget.
Here is the BW live data. For BW live, the total is turned on by default for the table, while for the custom widget you can choose to turn it on. We can compare the data between the custom widget and the table. The second enhancement is to include the display of the ID and description for the measure in the custom widget. Right now there is no option to display both of them in the measure dialog, so you may get confused when two measures have the same description: how do you distinguish between them? The solution is quite straightforward, and it closes a functional gap with the standard chart, which shows the ID and description here.
Then we make sure that the data can be correctly returned. This is the BW live data. Good, so that is the demo. Thank you for watching. Hello everyone.
Today I will do a quick video about the video data story, a new innovation in SAP Analytics Cloud which allows users to easily convert their business insights into a short-form video with animation effects and background music, which makes it a brand new way of storytelling in SAC. Let's take a look at how it works. When you open a story with the optimized design experience and turn on the toggle in the story, you are able to select and add one of the charts into the video data story. By doing that, a new design panel is shown at the bottom of the screen, and you can select more charts to add to the video data story. So far, we only support four types of video charts: the bar chart, the pie chart, the line chart, and the numeric chart.
We can also use drag and drop to change the sequence of the charts. For a selected chart, for example this bar chart, once I have added it, I can do some drill-down in the story itself and add the result to the video data story as well. Since these two charts, or these two sections, are related to each other, I can also add a special transition effect between them. The effect is shown directly in the preview in the design panel.
For the video data story itself, I can change the orientation between portrait and landscape, and I can add an opening title to make it more meaningful for the audience. I can also change the background image: SAP Analytics Cloud provides a pre-delivered list of background images, and users can also upload custom images to use their own pictures. Users can also select background music from a pre-delivered list; for example, if I select this one, I can get an impression of how this music sounds.
Once you are satisfied with the design of the video data story, you can save it, and it will be saved as a new artifact in the file repository. This means that later on, this video data story can be edited or shared, just like a story in SAC. Now you can see how the video data story looks in the end. That is the demo for the video data story; thank you. Thanks for watching. Hopefully our demos have helped you leverage the latest features of SAP Analytics Cloud in your organization.
Remember to leave a like on this video and subscribe to our YouTube channel to stay up to date with all things Analytics Cloud. We have a lot of exciting new features coming your way in 2025, so stay tuned and we'll see you next time.