Rendered vs. Real Time: A Deep Dive into the World of Real-Time Graphics (SIGGRAPH 2021)


(upbeat music) (inspiring electronic music) - Hello, my name is Takuma Nakata. I'm a real-time graphics artist based in Kyoto, Japan. Before going into my actual presentation, let me explain a little bit about my background.

So I've been working as a freelance artist for almost a decade, designing technical installations and things like that, but I realized that every year the requirements were growing. First it was just a simple projection mapping, but then it became things like VR game development for motor shows, and in the end I had to design a huge illumination installation that I spent nearly all of 2018 building. And I was struggling, because what I'm mainly interested in is real-time graphics, but at the same time real-time graphics requires technical depth, and to make a living I often had to stop working on it. By the end of 2019 that struggle had become really big. That was the point when I found the Adobe Creative Residency, a residency program that Adobe runs to support young artists in building their careers. So I joined them.

I got huge support from Adobe in 2019 to build my artist career. Then in 2020 I started looking for a team I would feel at home with and could work together with, because even while working as an artist, I also wanted to stay in the technical scene to help people achieve their technical ideas. That's when I found a company called BASSDRUM, and I recently joined them as an employee. BASSDRUM is a company where nearly 90% of the members are technical directors who have been working in various fields, such as immersive experiences, web development, or even technical consulting. They do all kinds of things, and nearly all of them have had the same struggle I had.

So I started to feel really comfortable working with them. I also realized that if I worked with a team, I would be able to focus more on real-time graphics, which I believe is what I'm most talented at and what I can put most of my energy into. So, yeah, that was my story up to now. Today I'm going to talk about real-time graphics, which, as I said, is where I spend most of my time. At the end, I will also show an XR project that I'm working on right now, which really shows the interesting side of real-time graphics.

So real-time graphics is the world of computer graphics that happens entirely in real time, which means it requires strong processing power, because everything has to be computed in real time. So let me show you some examples of what I've actually rendered in real time. (high energy instrumental music) I hope you liked it. Everything you just saw was done in real time. No post effects, nothing else; it was all done in a piece of software called VVVV.

So you saw ambient occlusion, you saw ray marching with SDF functions, you probably saw parsed particles, and those were all processed in real time. And for that, I use this software called VVVV as my main tool. So here is a brief look at what VVVV actually looks like. It's a node-based programming environment where you connect modules and design your own system.

And it does not require coding skills; I can't code at all. Everything I do is node-based, but if you open up a node, there's actually code written inside. So if you know C# or similar languages, it should be easy to understand what's happening. Even if you don't know how to code, you can still build things, much like you would in other 3D software.
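To give a rough idea, here is the kind of C# you might find if you opened up a simple spread node. This is just a hypothetical sketch in the spirit of VVVV's LinearSpread node, not the actual source:

```csharp
using System.Collections.Generic;

public static class ExampleNode
{
    // Inputs become input pins on the node; the return value becomes an output pin.
    // Outputs `count` values spread evenly around `center` over a total `width`.
    public static IEnumerable<float> LinearSpread(float center, float width, int count)
    {
        for (int i = 0; i < count; i++)
        {
            // Normalize the index to [-0.5, 0.5] so the spread is centered.
            float t = count > 1 ? (float)i / (count - 1) - 0.5f : 0f;
            yield return center + t * width;
        }
    }
}
```

So even if you never open a node, this is roughly what the patch is executing under the hood.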

So in VVVV you can do tons of things. I mainly use it to render real-time graphics, but I also design interactive installations. You can also use it to operate hardware, for example a robot arm, or to play around with sensors and Arduino. Anything you can imagine doing with programming can actually be done using this software. So today I'm going to talk about my entire workflow, and this is what it actually looks like.

Since I come from a background of studying motion graphics and 3DCG, my workflow might be very similar to that of people working in 3D computer graphics, but the one difference you see here is that my main tool is a piece of software called VVVV rather than Cinema 4D or Blender, which are very cool 3D tools that I've also been using for a long time. The first part of my workflow is concept development, and here I have sketches and visual references. For this I use Adobe XD most of the time, because it has an infinite canvas where you can lay out different images.

You can also turn it into a slide deck; there's a lot you can do with it. I used to use Adobe Illustrator, but now I find XD more comfortable for laying out references and sketches. So this is a very rough screenshot of my XD.

What you see on the left are visuals that I've rendered in real time. I lay those out just to see what kind of visuals I'm getting as a result. What you see on the right are references I took from Pinterest or Vimeo, showing the kind of visual I actually want to create. Since real-time graphics is all about building your own system, before getting into the workflow you actually have to know exactly what you're going to be creating.

So most of the time I browse Pinterest quite a lot to find really cool graphics made in different 3D software, or even in Photoshop or Illustrator, and I try to rebuild them in real time. So I refer to them quite often. Another reference I use is Vimeo, because real-time graphics is also about animation.

It's not timeline-based; everything is processed frame by frame, which means I can easily add per-frame animation. And Vimeo is the best platform at the moment for me to reference that kind of animated content, or works, I'd say. After concept development is done, my next step is prototyping. This, including concept development, takes 40%, which is nearly half of my entire workflow. Prototyping covers R&D feasibility testing, plugin development, and shader development, so it's quite heavy work.

In concept development we found out what kind of visual we want to create, but then we have to check whether it's actually possible in VVVV. For example, if you want to convert OBJ data to particles, you have to find out whether there's a way to get all the vertex data and place a point at every vertex. This is a visual that I created recently, and because there was no built-in function to parse the vertex positions out of an OBJ, I had to create my own plugin to solve that requirement.

What you see on the right side is the parsed OBJ file. If you open up an OBJ file in Notepad, you should see something like this. (If it's binary data, you might not see the values, but OBJ is still a very simple, structured format.) What you see on the left side is the actual patch that I created. I didn't code anything at all.
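Written out as code, the logic of that patch would look roughly like this. It's a hypothetical C# sketch of the same idea, not the plugin itself:

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Numerics;

public static class ObjVertexParser
{
    // Collects every "v x y z" vertex line from an OBJ file as a Vector3.
    public static List<Vector3> ParseVertices(string path)
    {
        var vertices = new List<Vector3>();
        foreach (string line in File.ReadLines(path))
        {
            // Only vertex positions start with "v "; this skips normals (vn),
            // texture coordinates (vt), faces (f), comments, and so on.
            if (!line.StartsWith("v ")) continue;

            // Split on spaces, drop the leading "v", keep the three coordinates.
            string[] parts = line.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
            vertices.Add(new Vector3(
                float.Parse(parts[1], CultureInfo.InvariantCulture),
                float.Parse(parts[2], CultureInfo.InvariantCulture),
                float.Parse(parts[3], CultureInfo.InvariantCulture)));
        }
        return vertices;
    }
}
```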

As in the sketch above, the patch just reads all the lines, splits each line into tokens, skips the leading "v" because I don't need it, drops the spaces, sorts the values into vectors, and then outputs those vectors. That's what it's doing on the left side. Once you've finished developing a plugin like that, you should be able to get this kind of result, which is what we wanted from the references we gathered from Pinterest or Vimeo.

The next example I built is a PNG sequencer. In a real-time engine everything just happens in real time, which means you have to capture the result somehow if you want to add post effects in After Effects, for example.

For that, I designed my own PNG sequencer that keeps outputting PNGs from the real-time rendered scene, and this is how it looks. What you see on the right side is the actual structure, and what you see on the left side is the node that I designed. If you feed your render into this PNG sequencer, what you get out is this kind of PNG sequence.
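Conceptually, the node boils down to something like the following sketch, where the PNG bytes for each frame come from however your engine exposes the rendered image. This is a hedged illustration, not the actual VVVV node:

```csharp
using System.IO;

public class PngSequencer
{
    private readonly string folder;
    private int frameIndex;

    public PngSequencer(string folder)
    {
        this.folder = folder;
        Directory.CreateDirectory(folder);
    }

    // Call once per rendered frame, with that frame already encoded as PNG bytes.
    public void WriteFrame(byte[] pngBytes)
    {
        // frame_00000.png, frame_00001.png, ... zero-padding keeps the
        // sequence sortable so After Effects imports it in the right order.
        string path = Path.Combine(folder, $"frame_{frameIndex:D5}.png");
        File.WriteAllBytes(path, pngBytes);
        frameIndex++;
    }
}
```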

Later on, I'll put that sequence into After Effects to add some post-process effects. Once you're done with R&D and feasibility testing, what comes next is asset creation. Since real-time graphics is about building your own structure, you need to design your assets in other software. For example, if you want to use a 3D teapot, you could build your own modeling setup inside the real-time engine, but you would probably rather just model it in Blender or Cinema 4D.

I use Blender quite often. Also, for example, if I want a designed camera path, I'll use Adobe Illustrator to draw a spline, export it as an SVG, import that SVG file into VVVV, and design a plugin that makes the camera follow the SVG spline (I'll sketch that logic below). On the asset creation side, I also sometimes create texture noise. I can design texture noise in VVVV as well, but if I want, say, a very complex noise texture to map onto a geometry and distort its vertices, I would rather design that texture in After Effects, render it out as an MP4 or MOV, and bring it into VVVV as a noise texture to deform the object. Also, as I'll show later, I've recently been working on a photorealistic XR project, and for that I use Quixel Megascans, a platform with tons of really high-quality 3D models and textures.
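Going back to the camera path for a second, the spline-following idea amounts to something like this hypothetical C# sketch, assuming the SVG path has already been flattened into an ordered list of at least two points:

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

public static class SplineCamera
{
    // Returns the camera position for a progress value t in [0, 1]
    // along a polyline sampled from the SVG spline.
    public static Vector3 Sample(IReadOnlyList<Vector3> points, float t)
    {
        // Map t onto the segment list, then interpolate inside that segment.
        float scaled = Math.Clamp(t, 0f, 1f) * (points.Count - 1);
        int i = Math.Min((int)scaled, points.Count - 2);
        return Vector3.Lerp(points[i], points[i + 1], scaled - i);
    }
}
```

Each frame you advance t a little and feed the sampled position to the camera, which is exactly the kind of per-frame logic a real-time engine is good at.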

Coming back to Megascans: I take those assets, textures for example, and put them into my scene to make the real-time render richer and more gorgeous. Here's an example of how I actually do my modeling in Blender. The functions I'm using are quite simple, but Blender is a very powerful tool where you can really easily model something and export it in whatever file format you want.

I sometimes also check the render in Blender and try to find the best shader setup there, and then imitate that look in VVVV. The next process is development, which takes 50% of my entire workflow, which means it's the heaviest and most important part. Here I need to structure the whole system that gets me to the visual I actually want to create, and there are a lot of sub-processes. There's scene creation, which is very broad; inside it there's, for example, object instancing: if you want to add a lot of particles, you need to understand how object instancing works. There's volume rendering if you want to add volumes, ray marching if you want liquid-like geometry, the particle system, and timeline editing.

The real-time engine also has a timeline that can adjust the timing of the animation or the turbulence noise, for example, plus shading, ambient occlusion, and post effects. These are all done in the development process, basically using VVVV. VVVV also has a game engine built in, the Stride game engine. So I compose the 3D models I get out of Blender into a Stride scene, bring that into VVVV to structure the scene first, and then start adding logic and effects on top. That's how everything is done, as you can see here.
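To pick one technique from that list: ray marching is basically a small loop over a signed distance function. Here is a minimal CPU-side sketch in C#; in practice this runs per pixel in a shader, but the logic is the same:

```csharp
using System.Numerics;

public static class RayMarcher
{
    // Example SDF: a sphere of radius 1 at the origin.
    static float Sdf(Vector3 p) => p.Length() - 1f;

    // March along the ray until we get close enough to the surface or give up.
    public static bool March(Vector3 origin, Vector3 dir, out Vector3 hit)
    {
        float t = 0f;
        for (int i = 0; i < 128; i++)
        {
            hit = origin + dir * t;
            float d = Sdf(hit);
            if (d < 0.001f) return true; // close enough: treat as a surface hit
            t += d;                      // safe step: the SDF says nothing is closer
            if (t > 100f) break;         // ray escaped the scene
        }
        hit = default;
        return false;
    }
}
```

The liquid-like looks come from distorting the distance function itself, for example by adding turbulence noise to Sdf before the comparison.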

There are post effects here: I added ambient occlusion, depth of field, and that kind of thing. What you see at the top is more of the overall logic that produces my visual. This is an XR project that I'm working on right now. I get the camera data out of a special protocol and map it to the camera. But if you go inside, you can see that camera data coming from a real camera doesn't match directly, so you have to structure the whole process yourself to make sure it matches your software camera.
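In code, that mapping amounts to roughly the following. This is a hedged sketch that assumes the tracking protocol delivers position, pan/tilt/roll angles, and a field of view per frame; the real fields depend on the protocol:

```csharp
using System.Numerics;

public static class CameraMapper
{
    public static (Matrix4x4 view, Matrix4x4 projection) Map(
        Vector3 position, float panRad, float tiltRad, float rollRad,
        float verticalFovRad, float aspect)
    {
        // Build the camera's world transform from the tracked pose...
        var rotation = Matrix4x4.CreateFromYawPitchRoll(panRad, tiltRad, rollRad);
        Matrix4x4.Invert(rotation * Matrix4x4.CreateTranslation(position), out var view);

        // ...and match the virtual lens to the real one via the field of view.
        var projection = Matrix4x4.CreatePerspectiveFieldOfView(
            verticalFovRad, aspect, 0.1f, 1000f);
        return (view, projection);
    }
}
```

Getting lens distortion, sensor offsets, and timing right on top of this is exactly the "structure the whole process yourself" part.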

This is the part where I'm importing a 3D model from the Stride game engine, which I had already composed there. That's what you see here, and I added some particles. Here you can see that I'm parsing an STL file and converting each vertex to a point, so that it looks like there are photorealistic particles in the scene as well. So yeah, this part takes the most time.

It requires a lot of thinking about how you actually want to achieve things. Another important point is that since everything has to happen in real time, sometimes your idea exceeds your machine's spec, right? Sometimes it's just overloading: there are too many particles, there are too many shaders, the resolution is too high, and your framerate goes lower and lower. So you also have to think about how to avoid that. For example, if you need to cut the particle count, because that's what's making it so slow, then you have to understand how to actually do that.
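One simple approach, as a toy sketch: measure the frame time and scale the particle count until the framerate recovers. The numbers here are made up for illustration:

```csharp
public class ParticleBudget
{
    public int Count { get; private set; } = 1_000_000;

    // Call once per frame with the measured frame time in milliseconds.
    public void Update(float frameTimeMs, float targetMs = 16.7f)
    {
        if (frameTimeMs > targetMs * 1.1f)
            Count = (int)(Count * 0.9f);  // too slow: shed 10% of the particles
        else if (frameTimeMs < targetMs * 0.8f)
            Count = (int)(Count * 1.02f); // headroom: slowly grow back
    }
}
```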

And if you want to animate the particles but modifying the vertex positions directly doesn't work, you might want to add a texture that makes the particles look like they're distorting. I use After Effects for that: I go to After Effects, add a 2D texture, deform the rendered image, and make sure there's a way to make it feel like it's distorting. For that kind of quick look development, After Effects is definitely the better tool.
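What I end up rebuilding in the engine is, roughly, texture-driven vertex displacement, something like this hypothetical sketch where sampleNoise stands in for whatever texture lookup the engine provides:

```csharp
using System;
using System.Numerics;

public static class TextureDistort
{
    // Pushes a vertex along its normal by a value sampled from a noise texture.
    public static Vector3 Displace(
        Vector3 position, Vector3 normal, Vector2 uv,
        Func<Vector2, float> sampleNoise, float amount)
    {
        // Remap the noise from [0, 1] to [-1, 1] so vertices move both ways.
        float n = sampleNoise(uv) * 2f - 1f;
        return position + normal * (n * amount);
    }
}
```

Animating the noise texture over time is what makes the distortion feel alive without ever touching the particle positions directly.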

Then, after I've finished checking that in After Effects, I come back to VVVV and try to imitate it here so that I get a similar result. I'm going to show you the result of this work at the end, but let me keep going through my real-time graphics workflow. Next comes the post-production part, which is also quite important.

I said in the beginning that those videos had no post-editing, but that was just to show the possibility of real-time graphics. There are a lot of things that real-time graphics isn't actually good at, and color grading and time remapping definitely fall into that category. Color grading is more of a texture effect, but in After Effects you can do it at really good quality, because After Effects has a timeline, it has layers, and you can pre-compose. You can keep adding effects until it looks like the thing you want.

If you start doing that in VVVV, it gets too complicated: since everything flows from the top to the bottom, you don't want to recalculate the entire process that produces those frames. So I just render out the PNG sequence, put it into After Effects, and start color correcting, maybe changing colors, adding some colors, or adding some noise, until it reaches a quality I can actually deliver to my client or agency. Time remapping is also something the real-time engine can't really do. Once you render out a PNG sequence, quite often you realize you don't like the timing, and changing the timing in a real-time engine is hard work, because it's really not made for that kind of usage; what you build in a real-time engine has more of a raw nature.

So I'd rather just export it as a PNG sequence, put it into After Effects, and apply time remapping there to get a better timing result. Another tool I use quite often is Adobe Premiere: editing the exported sequences together has nothing to do with the real-time engine, so I use Premiere to compose the shots, add music, and add effects so that the story actually tells something. The last part is the activation part, which includes sharing.

Real-time graphics is not really common yet; there are still a lot of people who don't know what a real-time engine is. When I post my visuals on Instagram or other social media, quite a lot of people ask me what tool I used, like, is it Blender or is it Cinema 4D? They don't think it's real-time; they assume it's a pre-rendered visual. As an artist, I would like to spread the world of real-time graphics, because it's so interesting: you can design your own system to achieve exactly the visual you want, and that is really interesting.

So as an artist, I want to spread this world. What I do is post on social media quite often and also explain what's actually happening, to make sure people understand that there's something called real-time graphics. I just feel it's important for me to spread the word.

So there's sharing, which is only 5% of the workflow, but this 5% is very important to me. As an artist, I also design my own exhibitions, just to make sure I'm spreading the word correctly. I try printing out graphics that I got out of VVVV. This fish is another visual that I created in VVVV.

It uses ray marching, but there's only noise involved: I didn't model anything that structures the actual fish. There's just turbulent noise on the outside, and I put a small sphere in the middle. The noise distorts it, and that distortion made the visual look like a fish. This was at an exhibition I held last year.

Okay, this has been a long talk, but lastly, let me show an in-progress prototype that I'm working on right now, an XR project that I'm doing at BASSDRUM. XR is one of the most interesting technologies, or technical scenes, happening right now. The reason it's interesting is that XR requires a real-time engine: it has to match-move the real camera and the virtual camera, and the scene has to be rendered in real time. That's the biggest strength of XR. XR is quite often done with Unity or Unreal Engine, but VVVV also has huge potential for supporting XR projects.

And yeah, let me show the work that I'm working on right now. (lighthearted instrumental music) (inspiring electronic music)
