Realtime Raytracing in VR ?

Anonymous
Not applicable
1,892 Views
8 Replies
Message 1 of 9

Hello there, 

 

First, to the Autodesk team: you've done really nice work on the VRED 2019 version, our performance is now much improved with the same setup (GTX 1080 SLI).

 

This brings me to a new question: what kind of hardware is able to run an automotive scene with raytracing/global illumination in VR (with 30 million polygons, for example)?

 

Would the best Xeon CPU do that (RT/GI is still handled by the CPU instead of the GPU, right?) - say a Xeon Platinum 8180M with 28 cores - or two of them? Or am I dreaming?

 

Thanks a lot and have a nice day !

Replies (8)
Message 2 of 9

michael_nikelsky
Autodesk
Autodesk

Well, it depends on the complexity of the scene and the illumination, but a cluster with 500+ nodes / 10,000+ cores and an InfiniBand connection throughout might give you some decent results 😉



Michael Nikelsky
Sr. Principal Engineer
Message 3 of 9

Bob.Bon2000
Collaborator
Collaborator

I guess you want Unreal Engine with the new raytracing mode that is coming up, plus 2-4 Titans. That should work. I doubt raytraced VR in VRED will happen.

Message 4 of 9

richardlevene
Collaborator
Collaborator

Funny you bring this up @Anonymous

There was a post from Lukas (VRED product manager) on LinkedIn referencing a video by the Ford team about their "real time raytracing" VR setup.

 

https://www.linkedin.com/feed/update/urn:li:activity:63919317089329152

 

The quality does not look great, though, as it is not fully raytraced (just precomputed illumination), but it at least traces object-to-object reflections. They probably still need a decent number of nodes to get this running at the frame rate needed for VR.

 

As @michael_nikelsky says, you are going to need a lot of nodes and a super-fast InfiniBand network connection to try to get proper real-time raytracing for VR in VRED.

 

An interesting question would be: is VRED looking at making raytracing available on the GPU as well?

 

Richard

0 Likes
Message 5 of 9

michael_nikelsky
Autodesk
Autodesk

Ok, I think I need to set the record straight here in terms of Unreal. I think everybody is referring to the Star Wars Demo they just showed (if not, I am talking about this one: https://www.youtube.com/watch?v=J3ue35ago3Y).

 

I attended a session at this year's GTC that gave a full rundown of what this really was.

The whole thing was rendered on a DGX Station with 4 GV100s (so roughly 65k USD without any discounts) at Full HD (1920x1080) at (mostly) 24 fps - so roughly 50 MPixels/s. For VR you are usually rendering 3024x1680 at 90 fps, so about 450 MPixels/s. So you would need 9 of these stations (about 585k USD - and you will probably get a discount, so let's just say half a million USD should do it).
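The pixel-throughput arithmetic here can be checked with a quick back-of-the-envelope sketch (the resolutions, frame rates and the ~65k USD unit price are the figures quoted above; scaling linearly from pixels/s to number of stations is of course a simplification):

```python
# Star Wars demo: Full HD at (mostly) 24 fps on one DGX Station (4x GV100)
demo_mpix_s = 1920 * 1080 * 24 / 1e6   # ~49.8 MPixels/s, i.e. roughly 50

# Typical VR target quoted above: 3024x1680 at 90 fps
vr_mpix_s = 3024 * 1680 * 90 / 1e6     # ~457 MPixels/s, i.e. roughly 450

# Naive linear scaling: how many DGX Stations for the VR pixel rate?
stations = round(vr_mpix_s / demo_mpix_s)   # ratio is ~9.2, rounds to 9
cost_usd = stations * 65_000                # list price, before discounts
```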

But this is just what you need to render that one scene. So let's take a closer look at it, shall we?

 

The environment is completely baked; only the floor gets a layer of reflection. The characters sum up to roughly 2 million triangles, so let's be gentle and round up to 3 million. There are exactly two textured area lights in the scene, which are animated and evaluated with a combination of analytical integration and 1 shadow ray per light. The analytic integration is nice and makes the illumination noise-free, but it is actually wrong, since the shadows only calculate how much of the whole light is in shadow. But it is good enough, so I won't complain about this one. Then there is ambient occlusion done with 2 rays per pixel, plus 1 reflection ray (I don't remember if it was 1 or 2 bounces). The shaders also exist in two variants: one high quality for directly visible surfaces and one massively stripped down for when a reflection ray hits them.

What you get from all of this is a pretty noisy image, and this is where the magic happens.

Nvidia has designed some very clever filtering for the area light shadows, the ambient occlusion and the reflections. Combined with temporal antialiasing you get a pretty good-looking image - but I would not call this GI, since there is essentially no diffuse indirect illumination and the glossy reflection rays are also quite limited.
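To put "very few rays" in perspective, here is my own rough estimate of the demo's ray budget, combining the per-pixel ray counts listed above with the ~50 MPixels/s throughput (it ignores secondary bounce rays, primary visibility and shading cost, so treat it as an illustration only):

```python
# Per-pixel ray counts quoted above:
shadow_rays = 2      # 1 shadow ray for each of the two area lights
ao_rays = 2          # ambient occlusion rays
reflection_rays = 1  # single reflection ray (1-2 bounces)
rays_per_pixel = shadow_rays + ao_rays + reflection_rays   # 5 rays/pixel

demo_mpix_s = 1920 * 1080 * 24 / 1e6     # ~49.8 MPixels/s at Full HD, 24 fps
mrays_s = rays_per_pixel * demo_mpix_s   # ~249 MRays/s across 4 GV100s
```

That tiny ray budget is exactly why the denoising filters are doing the heavy lifting.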

 

Now back to the original question about GI with a 30 million triangle model: So... 10 times more geometry (yeah, I know, raytracing is quite good at this, but it will still probably cost you a factor of 2 or so), probably much more complex shaders (I mean, the stormtroopers were chrome and untextured plastic with maybe a slight bump and a roughness map; it does not get much simpler than that) and real(!) GI (for the sake of simplicity let's do a photon map with final gathering for this and skip the ambient occlusion, so this might come cheap, at least for static scenes) and, to not make it completely unrealistic, still use the denoising filters...

...well, I would say, quadruple the number of DGX Stations and see if it works and fits within your 2 million USD budget...

 

Don't get me wrong, I think they did an awesome job with this one - but there are quite a few differences between an engineering dataset and a finely hand-tuned demo scene.

 

And about GPU raytracing in VRED: We'll see ... 😉

 

 



Michael Nikelsky
Sr. Principal Engineer
Message 6 of 9

richardlevene
Collaborator
Collaborator

@michael_nikelsky that is what I call a forum post! Have to say I am very pleased to see you, Pascal and some of the other VRED team so active on the forums. You guys have definitely improved on that since our time using VRED. So thank you.

 

I wasn't really talking about the Nvidia demo in terms of real-time raytracing.

 

I know it is still full of hacks and I was aware of the rig and the cost so I definitely didn't think it was just done on a couple of consumer cards.

 

When I talk about real-time raytracing I mean brute-force path tracing, like you get when you set VRED to Full Global Illumination mode. That is the aim in my mind. Yes, there are optimisations in the code so it is not really "brute force", but it is close enough to that reality.

 

Full Global Illumination on the GPU in VRED would be cool, though. Although please do not make that a priority. That would be at the bottom of my ever-growing VRED requests list.

 

Best,

 

Richard

 

Message 7 of 9

Bob.Bon2000
Collaborator
Collaborator

Heheh


Funny how a mere mention of "Unreal" and out of nowhere AD people come hurtling over to clear it all up ;- )

 

And no, I was not thinking of the Star Wars tech. More of RTX: https://www.youtube.com/watch?v=tjf-1BxpR9c&t=5s

https://www.youtube.com/watch?v=jkhBlmKtEAk

Which, to my knowledge, does not require the massive DGX Station etc., and they run the demo at 30 fps. So not far off the 60/90 we need.

 

Message 8 of 9

michael_nikelsky
Autodesk
Autodesk

Well, you are wrong; I just don't like it when people believe everything that marketing tells them.

 

The Star Wars demo was rendered using RTX, and I think this is the next thing I need to clear up: RTX is nothing groundbreaking or new. It is a marketing thing, something Nvidia decided to call their raytracing interface for OptiX, DirectX and Vulkan. In fact, if you look at the new DirectX Raytracing, it is merely a stripped-down version of Nvidia's OptiX which lacks some features and can directly use DirectX buffers/textures and so on. OptiX has been around for years and is what, for example, iRay is based on - and that is exactly the performance you can expect from RTX in terms of raytracing.

The Volta GPUs perform better than the Pascal GPUs, whose performance was better than the Maxwell GPUs, and so on. The performance difference going from Pascal to Volta is not massively larger than going from Maxwell to Pascal. The main advantages of Volta are HBM2, which definitely helps raytracing, and the tensor cores, which are 16-bit float only and are therefore only used for the denoising. And this denoising is actually the real new thing that RTX brings; it is the only thing that makes these demos possible.

If you watch the video you posted, it should be clear what it really is: a scene rendered with DirectX with added raytracing effects, but shooting very few rays and creating very noisy images. These images are then denoised with special filters for denoising shadows, ambient occlusion (which should already give you a hint that this is not GI you are seeing, because AO is just a fake you do when you can't calculate GI) and glossy reflections. Add temporal antialiasing into the mix and you end up with nice-looking effects for games.

 

So to sum it up: all that is new in RTX at this point in time is basically a collection of carefully designed denoisers for raytraced effects. It is not some new hardware feature that makes raytracing faster; the only really new (well, about a year old...) hardware is the 16-bit tensor cores, which are really useful for these denoising filters. Other than that, RTX is now just a branded version of what has been in OptiX for years.

 

Kind regards

Michael

 



Michael Nikelsky
Sr. Principal Engineer
Message 9 of 9

Anonymous
Not applicable

Thanks for the explanations, guys, very interesting, but I think we can forget about that for the moment. I didn't want to start a fight around the "Unreal" subject, sorry.
