Visualizing Data for Digital Twins in UE5 Unreal Fest 2024

Show video

Very good afternoon, everyone. I'd like to start with a big thank you to our Epic colleagues for inviting us and hosting us here in beautiful Prague. It's truly a pleasure to be here. I've already seen some really cool talks earlier today. A particular highlight for me was the stuff that the guys at Vero were showing earlier. I don't know if any of them are here, but you should go and check it out if you missed it.

But really impressive stuff at scale in the AEC environment. And I think what I'll show you today will hopefully complement this nicely through a slightly deeper dive on the data side of things. So I'd like to take you through a couple of AEC use cases where we at Cityscape have deployed Unreal as the core tool to deliver insights and value to our clients by tackling complex, data-driven problems. But first, some introductions. I'm an architect, so I have to talk about myself first.

So, Rumen Dimov, and as I said, in my previous life I used to be a practicing architect, working mostly in CRE, residential, and education. Then, for a time, I worked for a large, well-known multi-disciplinary engineering consultancy, delivering information management as well as a sprinkle of enterprise architecture. But the focus there was always large government property portfolios and energy infrastructure, particularly new-build nuclear, with the common theme being, you know, these are multi-billion, decades-long, highly regulated, heavily information-dependent puzzles that need solving. And at this point, I was getting interested in the concept of Digital Twins, particularly in their ability to provide alignment, insight, and consensus for decision making on these most complex of problems in our industry. I'm sure many of you who dabble in this field would agree that the future of AEC is built on the back of the concept of Digital Twins, whatever definition you follow.

And we're not going to open that can of worms. But I myself came around to this belief, which led me to joining Cityscape, where we're in the business of realizing this future of the real estate industry, relying heavily on Unreal to do so. So who are we? We're a leading digital real estate consultancy at the forefront of revolutionizing the design, sale, and asset management of residential and commercial developments. We've got industry-leading production capabilities in-house using the latest digital technologies.

Unreal is a huge component of that. And we create transformative experiential environments and push the boundaries of real estate innovation. We've been doing this for a while. And in fact, we've been supported by Epic for many, many years now, I believe since 2008.

So that makes it over 15 years since we started recognizing the potential of Unreal. We're a 50-strong studio based in London, near London Bridge, and we collaborate with developers and asset managers to unlock their commercial goals through cutting-edge digital solutions. We're aggressively client-focused.

By understanding our clients' unique business objectives, we work alongside them to design bespoke strategies that maximize their potential and deliver commercial success. And bespoke is the keyword here. Many of our clients turn to us when they need innovation urgently, but also a clear and direct ROI path from day one.

And we're experts in what I would call inelegant advancement at speed. And this is something that is highly valued by our clients. And in fact, I think both of the use cases that I'll show you are really good examples of it. Yeah, so we utilize emerging technologies such as VR, AR, and interactive visualizations to showcase properties in new and exciting ways, and these immersive experiences enhance the sales process and elevate the brand's presence in markets across the globe. You can see we work with the best. And we're trusted by some fairly iconic and recognizable public and private entities.

And lastly, some exciting numbers. As I mentioned earlier, we've been creating a great volume of digital real estate over the past 15 years. A huge portion of it powered by Unreal. So what are we talking about today? I'd like to walk you through a couple of use cases where we've used Unreal as the core comms tool, as I said, connected to live data and deployed through an application via pixel streaming.

I'm not going to show you pixel streaming today because I just don't have that level of trust with it. But I do have the applications locally. And they are connected to live data, even though they're local. So you just have to trust me on that one. Both of these are real commercial projects.

So to set the context, I will talk you briefly through the original briefs, as well as how these informed the way we approach them. And I'll open the applications and talk you through the key relevant features at their particular point of development. Now, both of these are not in their final state. That is to say, they're working tools for solving problems, first and foremost. So please treat them as such.

And I'm hoping you walk away with a higher appreciation of Unreal's potential as a data comms tool, giving the user the ability to immediately understand the implications of data-driven decisions in a more experiential and interactive manner and, most importantly, at speed. Yeah, we believe that for Digital Twins to ever get the sort of universal adoption required to have a society-wide impact, they have to be visual decision-making tools. Visual is the key word here. They must be easy to understand and experiential in order to reach a broad audience and not just us technical geeks, right? And yeah, I think both of these use cases have a good story to tell in that space.

So, DESNZ Digital Twin Augmented Visual Demonstrator. It's a bit of a mouthful. To set the context for the purpose of this project: the UK is aiming to decarbonize the energy system as part of the drive for net zero by the year 2050. To make this happen, policymakers across all levels of government need to make a lot of decisions, and need to make sure they're equipped with the insights to make those decisions as effectively and influentially as possible. As the amount of information about the built environment under the control of some digital system increases exponentially, so do the valuable data streams being created alongside it. So the purpose of this demonstrator project was to investigate the use case of decarbonizing the UK residential stock, with the focus on making it easier for policymakers to understand how uncertainty plays a part in the modeling predictions, as well as presenting the results in an easy-to-understand, visual way, again, because they need to present to their local stakeholders.

So Energy Systems Catapult reached out to us through a recommendation from Epic to envision this communication approach and interactive application prototype. Shout-out to Rob Harrison and his team for supporting us here and making this happen. We did an initial round of visioning, and we then further developed the tool with our colleagues from Frazer-Nash Consultancy, who provided the heavy data science in the background, as well as the ESC team themselves, who lent us their internal expertise. And as I said, this was then deployed via pixel streaming and user tested to assess the priorities for development in the next round.

Thought I'd flash this up momentarily, just to give you a glimpse of how we practice this aforementioned inelegant progress at speed. This is a very typical thing that we do when we work with our clients. So you see we have the-- I don't know if you can see that, yep-- we've got the use cases in the top right here. And the sprint planning started from meeting one. When we met the client, we were actually writing on their walls. They have these sort of whiteboard walls, or at least I hope so.

Nobody stopped us. So maybe it's still on their walls now. And then, we've got a whole sort of stage of aligning and establishing the general principles of how things should be related, the scales, and how we can best communicate the various points that they're trying to make. And this is all supported by a mass of notes and precedents and various tests and discussions back and forth, mostly trying to actually get alignment within the team, rather than externally. So this is me trying to make sense of things that I'm slightly under-qualified to understand on the data science side of things. But, you know, very important to get to the result quickly.

So lastly, before we jump into the demo itself, just a quick overview of the key goals for this particular piece. We wanted to find a way, first and foremost, to illustrate the level of confidence in the outcomes and be able to sift and filter through accordingly. We obviously needed the ability to compare options. If you're trying to find the best option, you need to continuously test your hypothesis.

We needed to understand the logic of the outputs. So it is all well and good that the model tells you something, including how confident you can be in the outputs, but it's just as important to understand how it's arriving at this conclusion, at least at the highest level. As any policymaker or career politician will tell you, you 100% need this for accountability purposes.

And lastly, again, as we said, it all needs to be connected in a visual and immediate way so that it can be used as a comms tool. So at the top level here, we've got these sort of policy levers which, at the current point in time, are kind of simplified, but you can imagine these are basically sliders. You will note that cost has been consciously removed from this, so it's not throwing up any pound signs. It's just showing the intensity of investment or prioritization across the various groupings of technologies. So in mass electrification, you're talking about a number of solar panels and heat pumps. In thermal comfort, you're talking about insulated houses, which, again, come with the heat pumps.

So it's not singular technologies. It's groupings that have their logic to them. So let's just do a couple of options here, and then I'll explain how this works in the background.

So I'm just going to do this, click simulate. And before I show you the rest of the app, I'm just going to explain how this works. So the data is based on something called the National Buildings Model, which is basically a massive government database that holds information about the big majority of the UK residential stock, with varying degrees of completeness. That's tens of millions of properties. This includes things like your building's EPC rating, heating type, and so on and so forth. This then has two-way communication with a probabilistic graphical model, a Bayesian network, which basically compiles slices that are then used to power the visual demonstrator on demand.

So these are basically JSON files that we exchange with the network on demand. So when I'm pressing these scenarios, it sends a request. I get a JSON file that basically powers everything behind it, but I get it for the entire UK.
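To make that request/response pattern concrete, here's a minimal sketch of how a scenario request and slice parse might look inside an Unreal application using the engine's standard HTTP and Json modules. The endpoint URL and field names are placeholders I've assumed for illustration, not the actual service.

```cpp
// Minimal sketch: POST the selected policy-lever settings to the model service and
// parse the returned UK-wide slice. Endpoint and field names are hypothetical.
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"
#include "Dom/JsonObject.h"
#include "Dom/JsonValue.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

void RequestScenarioSlice(const FString& ScenarioJson)
{
    TSharedRef<IHttpRequest, ESPMode::ThreadSafe> Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(TEXT("https://example.org/api/scenario"));   // placeholder endpoint
    Request->SetVerb(TEXT("POST"));
    Request->SetHeader(TEXT("Content-Type"), TEXT("application/json"));
    Request->SetContentAsString(ScenarioJson);                   // e.g. the slider settings

    Request->OnProcessRequestComplete().BindLambda(
        [](FHttpRequestPtr, FHttpResponsePtr Response, bool bSucceeded)
        {
            if (!bSucceeded || !Response.IsValid())
            {
                return;   // a real tool would surface the failure in the UI
            }
            TSharedPtr<FJsonObject> Slice;
            const TSharedRef<TJsonReader<>> Reader =
                TJsonReaderFactory<>::Create(Response->GetContentAsString());
            if (FJsonSerializer::Deserialize(Reader, Slice) && Slice.IsValid())
            {
                // Hypothetical shape: one entry per local authority with its outcomes.
                const TArray<TSharedPtr<FJsonValue>>* LocalAuthorities = nullptr;
                if (Slice->TryGetArrayField(TEXT("local_authorities"), LocalAuthorities))
                {
                    // ...hand the decoded results to the dashboard widgets here.
                }
            }
        });

    Request->ProcessRequest();
}
```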

So it covers every local authority. It's a chunky file, but we have tested it in principle, and it works live. The graphical model is where, obviously, all the heavy data science happens, which I must admit is not my domain. But in principle, the model is made up of edges and nodes.

And each node carries uncertainty reflecting the lack of information, real-world complexity, and messiness. So this is a gross oversimplification of how it works. But it's not too dissimilar to how weather prediction models work, right? The key point here is that it gives us speed, as well as the ability to show not just what we think is the best outcome, but how confident we are in it.

And it allows us to query a massive data set much quicker than we would otherwise be able to. OK, so back to this. Just to talk you through what we've got here: we've got the impact up at the top. You have pretty standard things that you would find in any sort of dashboard.

So this is just the comparison of impact for the various policy groupings between option one and option two: CO2 emissions, again, reducing over time per the probabilistic model. Now, we can show the median probability around when we can hit the target dates. As you can see, because our data set is not very complete, we're getting a pretty loose idea of where it lands. It's somewhere between 2050 and 2070, maybe.
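That "somewhere between 2050 and 2070" read-out is essentially a quantile over a year distribution. As a worked illustration, assuming the slice provides a discrete probability per candidate year (my assumption, not the actual schema), the median or any other quantile date falls out of a simple cumulative sum:

```cpp
#include "CoreMinimal.h"

// Minimal sketch: given probabilities per target year, sorted by year, return the first
// year at which the cumulative probability reaches the requested quantile
// (e.g. 0.5 for the median, 0.1 / 0.9 for a loose interval).
int32 QuantileTargetYear(const TArray<TPair<int32, float>>& ProbabilityByYear, float Quantile)
{
    float Cumulative = 0.0f;
    for (const TPair<int32, float>& Entry : ProbabilityByYear)
    {
        Cumulative += Entry.Value;
        if (Cumulative >= Quantile)
        {
            return Entry.Key;
        }
    }
    // Fall back to the last year if the probabilities don't quite sum to one.
    return ProbabilityByYear.Num() > 0 ? ProbabilityByYear.Last().Key : -1;
}
```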

We also have things that are a bit more in the numbers: overall properties having solar panels installed, properties with heat pumps, amount of wall insulation, so on and so forth. All of this is tied together by this timeline at the bottom, which allows us to see the outcomes in an interactive way. We can filter by the different technologies. And then we have this confidence filter here, which again allows us to focus on specific things. So we can focus on things that have high impact and where we are fairly confident in those outcomes.
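The confidence filter itself is conceptually simple: keep only outcomes that clear both an impact threshold and a confidence threshold. A rough sketch of the kind of record and filter that could sit behind sliders like these; the struct and field names are assumptions for illustration, not the project's schema:

```cpp
#include "CoreMinimal.h"

// Hypothetical per-outcome record decoded from the scenario slice.
struct FOutcomeRecord
{
    FString Technology;         // e.g. "HeatPumps", "SolarPV", "WallInsulation"
    float   Impact = 0.f;       // modeled CO2 reduction at the scrubbed year
    float   Confidence = 0.f;   // 0..1, derived from the probabilistic model
};

// Keep only the outcomes that clear both sliders: enough impact, enough confidence.
TArray<FOutcomeRecord> FilterOutcomes(const TArray<FOutcomeRecord>& All,
                                      float MinImpact, float MinConfidence)
{
    TArray<FOutcomeRecord> Kept;
    for (const FOutcomeRecord& Record : All)
    {
        if (Record.Impact >= MinImpact && Record.Confidence >= MinConfidence)
        {
            Kept.Add(Record);
        }
    }
    return Kept;
}
```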

And there are some surprising results that happen here all the time. So we gain, for instance, confidence around 2050. And I'll show you how the simplified model works in the background in a minute.

We also have a regional view. So this is doing more or less the same thing, except at a local authority level. We've only done one here, but the idea is that it's going to be like a postcode search type of thing. And we also have something that we've called the progress chart.

So this helps us interrogate the impact of the illustrated scenarios in even more detail, mapping the quantum of carbon reductions across all the local authorities. So we have all 370 of them or so down here. And again, we can look at this as a projection over time between the two policies. So again, you get the various results and we can query down into the carbon emissions, and you can see this evolving over time. Yeah, as I was saying, this gives us the ability to illustrate the individual impacts of the various areas and their own individual targets on an XY canvas, where you have confidence on the Y axis, progress on the X axis, and then the size of the bubbles is effectively the carbon footprint.
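For the curious, the mapping behind a chart like that is straightforward to sketch: normalized progress drives X, confidence drives Y, and the bubble's area tracks the carbon footprint. The function names and the area-based scaling are my own illustrative choices, not necessarily what the demonstrator does:

```cpp
#include "CoreMinimal.h"

// Place one local authority on the chart: X from normalized progress, Y from confidence
// (flipped so high confidence sits at the top of the canvas).
FVector2D ProgressChartPosition(float Progress01, float Confidence01, const FVector2D& CanvasSize)
{
    return FVector2D(Progress01 * CanvasSize.X, (1.0f - Confidence01) * CanvasSize.Y);
}

// Scale the bubble so its area, not its radius, tracks the carbon footprint,
// which reads more honestly to the eye.
float ProgressBubbleRadius(float CarbonFootprint, float MaxFootprint, float MaxRadius)
{
    const float Normalized = FMath::Clamp(CarbonFootprint / MaxFootprint, 0.0f, 1.0f);
    return MaxRadius * FMath::Sqrt(Normalized);
}
```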

OK? Another tool in our arsenal is the outcome logic. This basically illustrates the distribution of contributions to the overall carbon emissions calculation for each policy scenario. What this is, effectively, is a representation of the Bayesian network calculation. You've got inputs, which are these, the data sets being used, and then calculation points, which explain what is being measured. And again, this is something that evolves over time.

So you'll see the opacity of the various nodes change, which reflects how intensely the individual nodes contribute. And then you have the critical paths, in terms of what contributes most to the overall outcome at the end, which is the overall CO2 emissions. Now, at this point, you're probably thinking to yourself: surely, surely, you can do this in Tableau. And you're not wrong. But where we come to the really interesting part is how we chain this together as a communication tool.
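A quick technical aside before that: the node fade mentioned above is just a contribution-to-opacity mapping per scrubbed year. A tiny sketch, with the opacity floor being an assumed value rather than the demonstrator's actual setting:

```cpp
#include "CoreMinimal.h"

// Map a node's contribution share (0..1) to an opacity, with a floor so that nodes
// contributing very little don't vanish from the outcome-logic graph entirely.
float OutcomeNodeOpacity(float ContributionShare, float MinOpacity = 0.15f)
{
    return FMath::Lerp(MinOpacity, 1.0f, FMath::Clamp(ContributionShare, 0.0f, 1.0f));
}
```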

So if you take a step back for a moment and put yourselves in the shoes of the target user, a policymaker, the main aim here is to communicate the practical reality of the proposed policy impact for your individual area, your constituents, right? That's what's relevant to them. And this presents multiple challenges for how you design something like this. On the one hand, you have the drive for realism, in order to answer the question: OK, the data makes sense, but what does it actually mean on the ground? What does it look like on my street? But reality may not always be your friend. And this is interesting in the sense that, let's say, the model tells you that by the year 2040, 40% of houses need to be insulated. Something like that.

Right? If you were to represent this realistically and have a conversation with your local constituents, you start getting into conversations like: wait, so why is my house not getting insulated and my neighbor's is, right? And this is not really the point. The point that you're trying to get across is that our built environment will have to change, it will have to evolve, if we're going to make any dent in these massive aspirations that we've got with regard to net zero. There's also the whole realm of GDPR, with regard to individually identifiable information. And there's also the issue of scalability. I mean, technologies are emerging, but there is nothing that quite fits the bill for what we're trying to do here, at least at the time we were developing this.

So what we've done here, instead of representing the reality, if you will, is imagine the target residential areas categorized as a set of non-geospecific character areas, used to illustrate that point instead. Now, we only have two here, so it's obviously not sufficient. But this is a solvable problem. You can imagine there will be maybe 50 or 60 kinds of situations which will be close enough to cover most of the use cases where this is particularly relevant. At the moment, we just have the sort of suburban 1960s semis and detached houses situation, and the sort of urban center, Victorian heritage sort of stuff.

Of course there will be many more, but, as I said, as a challenge this is much more manageable from a scalability and complexity perspective than what we might otherwise be getting ourselves into. And again, as usual, you have-- you know, we can compare the policies. So hope you found this one interesting. I think we're doing well on time, so I can show you another one as well, and then we can open them again if you want to have any questions towards the end. So, Barking Riverside. We were approached by David Watkinson, who's the planning, design and communications director at Barking Riverside, the development SPV.

This ambitious project, based in London, aims to develop 11,000 homes, and after years of diligent design, feasibility, and viability testing, the Barking Riverside team found themselves at a critical juncture. They needed a platform that could facilitate the review, engagement, and consensus building around the myriad of little decisions, particularly once things are happening on the ground, needed to propel the development forward and keep to the projected timescales they were envisaging. The mission was to create a singular, unified platform where both internal and external stakeholders could converge, a place where they could engage in an immersive and effective manner to drive the project forward.

So the challenge was to address the issues of data management, complexity, and coordination inherent to any such large and complex project with an extended development timeline. Our solution was designed to prevent the fracturing of work streams, data, and thought processes, and to keep the vision and data central to the decision making. So we integrated a tool that would streamline the process of seeing the current state of things and become a catalyst for resolving issues and making progress. So while this opens in the background, what were the key challenges? You obviously have to build 11,000 homes in zone two of London.

You're working with local authorities, developers, local stakeholders, Transport for London. How do you keep the timeline and the commercial goals at the forefront of the mission? They had a real challenge on their hands with the long-term vision, without a central source of truth to help them with design, development and review, engaging with stakeholders, assisting with planning, marketing, and eventually, potentially, estate management. There's also a real mix of housing developers and JVs involved on the project, so contractual arrangements for information transfer are complex.

I'm sure any of you who have worked on anything like this would know. There is a common data environment, but it's always difficult to assemble the full picture of the masterplan at any point in time, at sufficient speed to actually resolve the day-to-day issues on the ground. So what we did, to try and unlock this for them, was build efficient pipelines to quickly update models and sift through the CDE, creating builds on a regular basis and pushing them to pixel streaming, with various functionality in the background linked to data from the models and to spreadsheets online, which, again, I'll show you an example of how this works.

All right. So let's just open the extended options. So, yeah, as we said, 11,000 homes master plan. I'm just going to make my mouse run a bit faster here.

In Barking, in London. So this is pre-Google API, so we're relying on more traditional means to represent the wider context. We've got a number of tools that they would use on a regular basis, one of them being this data display tool. So here you've got all the various plots on the master plan. We're going to use this one here, so 2008A.

Now, this is all information that's actually driven by various data sources, one of them being a live data sheet. I've relinked it, so what you're seeing here is all scrambled information or massively out of date, but it lets me show you how this works. And this was a common theme in what they wanted us to integrate.

This is a sheet that they've designed in a particular way, and they don't want to change the format of it. They're like, can you just make the two things talk to each other. So what we can do is, you can see over here, there are a lot of data points that are driven by this sheet. So let's say a developer has pulled out of a plot and Epic Games has decided to get into development, and they are going to deliver many more units on this site. Let's say they're saying we're going to deliver 600 units here instead of the 440.

All we need to do is this. This is on Google Docs. And then I just need to come out of this and click back on it. And then we can see this has updated. This is all something that obviously not everybody has access to, but they can dynamically continue to update the overall picture of how the [INAUDIBLE] works. This is another tool that is linked to this.
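Before moving on to that tool, here's a flavor of how a "don't change our sheet, just make the two things talk" link like this can be wired up: pull the sheet down as CSV over the engine's HTTP module and map plot IDs to unit counts. This assumes the sheet is shared so it can be exported without authentication; the URL, column order, and plot IDs are placeholders, not the project's actual setup.

```cpp
#include "CoreMinimal.h"
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"
#include "Interfaces/IHttpResponse.h"

// Hypothetical sheet URL: assumes the plot data sheet can be exported as CSV by link.
static const TCHAR* GPlotSheetCsvUrl =
    TEXT("https://docs.google.com/spreadsheets/d/SHEET_ID/export?format=csv");

void FetchPlotSheet(TFunction<void(const TMap<FString, int32>&)> OnPlotUnits)
{
    TSharedRef<IHttpRequest, ESPMode::ThreadSafe> Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(GPlotSheetCsvUrl);
    Request->SetVerb(TEXT("GET"));
    Request->OnProcessRequestComplete().BindLambda(
        [OnPlotUnits](FHttpRequestPtr, FHttpResponsePtr Response, bool bSucceeded)
        {
            if (!bSucceeded || !Response.IsValid())
            {
                return;
            }
            TMap<FString, int32> UnitsByPlot;           // e.g. plot ID -> unit count
            TArray<FString> Lines;
            Response->GetContentAsString().ParseIntoArrayLines(Lines);
            for (int32 i = 1; i < Lines.Num(); ++i)     // skip the header row
            {
                TArray<FString> Cells;
                // Naive split: fine for simple sheets, not for quoted commas.
                Lines[i].ParseIntoArray(Cells, TEXT(","), /*CullEmpty=*/false);
                if (Cells.Num() >= 2)
                {
                    UnitsByPlot.Add(Cells[0], FCString::Atoi(*Cells[1]));
                }
            }
            OnPlotUnits(UnitsByPlot);                   // update widgets / plot actors here
        });
    Request->ProcessRequest();
}
```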

So we've got a timeline tool. Again, this is something that many of you would be familiar with, just to follow the component. All of this is based on the sheet, and it also connects with the geometry tags and attribution in the model itself as we build it. So again, if we wanted to say that this plot Epic has now taken over can also be built a year earlier, really efficiently, we can put that down here. Now this one, because it's linked to geometry, takes a bit longer to update. It will force this update in the background, and hopefully by the end of the talk I'll be able to show you that this has moved.
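The geometry link on the timeline is, conceptually, just a date comparison per plot: once the scrubbed date passes a plot's construction start, its tagged geometry is shown. A rough sketch under that assumption, with the struct and field names invented for illustration:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

// Illustrative schedule record: dates come from the same data sheet, matched to
// tagged geometry in the model as it is built.
struct FPlotSchedule
{
    AActor*   PlotActor = nullptr;
    FDateTime ConstructionStart;
    FDateTime ConstructionFinish;
};

// Show each plot only once its construction has started by the scrubbed timeline date.
void ApplyTimelineDate(const TArray<FPlotSchedule>& Plots, const FDateTime& ScrubDate)
{
    for (const FPlotSchedule& Plot : Plots)
    {
        if (!Plot.PlotActor)
        {
            continue;
        }
        const bool bVisible = (ScrubDate >= Plot.ConstructionStart);
        Plot.PlotActor->SetActorHiddenInGame(!bVisible);
    }
}
```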

So currently, we're saying that we're going to start construction in March 2022 and finish by September 2023. And you'll see how this changes in a minute. In addition to this, we obviously have the ability to roam the masterplan and look at it at its point in time. All of this is made to be accessible via pixel streaming on a touch pad.

So it needs a good internet connection, but it was working, and they were using it on the ground. It is a bit more stylized. So this is pre-UE5; it's 4.27. But all of this is obviously connected as well.

We did have some temporary works components as well, so they used this a lot to discuss what sort of use cases they could have for the hoarding and various other temporary works, and how to deal with infrastructure, interfaces, and so on and so forth. OK. So, yeah. Again, hope you found this use case interesting. I mean, the key takeaway here is that Unreal, we find, is perfect for these kinds of multi-use-case applications, where there is a need for a lot of flexibility to integrate various data sources.

And I can pull that back up in a second when we get to the questions. So obviously I only had about 30 minutes with you today. And before we get into the questions, I wanted to just share with you kind of a quick show reel of some of the other cool work that we do in this space. Simon made me very self-conscious earlier today in his talk, so I'm a bit worried I haven't shown you enough visual candy.

So I'm just going to play this. [GENTLE MUSIC] All right. So, yeah, I suppose in summary, we're seeing more and more of these technical challenges creating an opportunity space for connecting traditional visualization work, the stuff we usually do, with data. We see this as a big part of the digital twins future, with Unreal being, at least in our opinion, the best-placed tool for the job in many instances. Yeah. I think that's it for me.

[APPLAUSE]
