DirectX 12 and Mantle: Can they deliver movie quality?

Published on Sunday, February 8, 2015 By Brad Wardell In PC Gaming

A couple weeks ago I wrote about DirectX 11 vs. DirectX 12 and how they worked.

Bottom line: DirectX 11 stuffs all of the commands to the GPU through a single pipeline. DirectX 12 makes that submission parallel across CPU cores. In theory, you could see an N-times boost (N = number of cores).
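To make the difference concrete, here's a toy model of the two submission styles. This is deliberately not real DirectX or Mantle code; the record_batch function and its 20-microsecond cost are stand-ins I made up for illustration. One thread recording every batch behaves like the DirectX 9/11 path; splitting the same batches across worker threads, each filling its own list, behaves like the DirectX 12 path, and the speedup roughly tracks the core count.

```cpp
// Toy model only: "record_batch" stands in for the CPU cost of validating and
// encoding one draw call. Costs and counts are invented for illustration.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

static void record_batch() {
    std::this_thread::sleep_for(std::chrono::microseconds(20));
}

int main() {
    const int batches = 2000;
    const int workers = std::max(1u, std::thread::hardware_concurrency());

    // "DirectX 9/11 style": a single thread records every batch.
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < batches; ++i) record_batch();
    const auto serial = std::chrono::steady_clock::now() - t0;

    // "DirectX 12 / Mantle style": each worker records its share of the
    // batches into its own list; the lists are then handed to the GPU together.
    t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (int w = 0; w < workers; ++w)
        pool.emplace_back([&, w] {
            for (int i = w; i < batches; i += workers) record_batch();
        });
    for (auto& t : pool) t.join();
    const auto parallel = std::chrono::steady_clock::now() - t0;

    const auto ms = [](auto d) {
        return (long long)std::chrono::duration_cast<std::chrono::milliseconds>(d).count();
    };
    std::printf("serial: %lld ms, parallel on %d threads: %lld ms\n",
                ms(serial), workers, ms(parallel));
}
```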

The article got a lot of coverage with some people arguing that such massive gains due to a platform change were impossible.

I had the advantage of knowing that Star Swarm was going to be benchmarked on DirectX 12 vs. DirectX 11: http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm

In the AnandTech article, the AMD GPU (the brand the consoles use) got a 5X (500%) performance boost just from having the game, which was originally optimized for DirectX 11, updated to use DirectX 12 instead.

You can see their video here: https://www.youtube.com/watch?v=ilVq4kRedcw. Note that this is just a quick engine demo. Having decent textures and such wouldn’t have affected the performance.

The software matters: DirectX 9 vs. DirectX 12 on same hardware

Hardware is only part of the picture and, in fact, these days a relatively small part of the picture. The biggest monster machine I have available to me (Core i7-5960X with a Radeon R9 290X) is still going to top out at around 8,000 batches on DirectX 9 for the typical developer. On Mantle or DirectX 12, however, that same hardware can do >100,000 batches.

Terminology: Number of batches. 

This is how much information can be put on the screen in a given frame. The more batches, the more sophisticated the scene. The next big debate will be how many batches a scene needs before the typical viewer thinks what they’re looking at is “real life”.

You want >30 FPS (or better yet, over 60 FPS), so it becomes a question of how many batches you can shove into a scene while keeping it at that speed.
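As a back-of-the-envelope illustration, the math is just frame time divided by CPU submission cost per batch. The per-batch costs below are numbers I picked so the results roughly line up with the figures in this article; they are not measurements.

```cpp
// Rough frame-budget arithmetic with invented per-batch CPU costs:
// how many batches fit in a frame at a given target frame rate?
#include <cstdio>

int main() {
    const double frame_ms_30fps = 1000.0 / 30.0;   // ~33.3 ms per frame
    const double frame_ms_60fps = 1000.0 / 60.0;   // ~16.7 ms per frame

    // Hypothetical CPU cost to submit one batch (illustrative only).
    const double dx9_us_per_batch  = 8.0;   // single-threaded submission
    const double dx12_us_per_batch = 0.3;   // work spread across many cores

    std::printf("DX9-style:  %.0f batches at 30fps, %.0f at 60fps\n",
                frame_ms_30fps * 1000.0 / dx9_us_per_batch,
                frame_ms_60fps * 1000.0 / dx9_us_per_batch);
    std::printf("DX12-style: %.0f batches at 30fps, %.0f at 60fps\n",
                frame_ms_30fps * 1000.0 / dx12_us_per_batch,
                frame_ms_60fps * 1000.0 / dx12_us_per_batch);
}
```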

Batches are directly tied to on-screen sophistication

Look around your room. How many sources of light are there? I have 3 monitors and about a half dozen light bulbs in this room. Those lights are bouncing off drink glasses, my desk, the monitor, etc.

Now, how many light sources does a typical game today have? 4? I saw someone on a forum laughing that I didn’t know what I was talking about because they had seen an example with 8 light sources. Eight.

A real-life battle scene would likely have hundreds or thousands of different light sources involved. In a game today, only 4 (or EIGHT) of those lights are going to be real. You may not notice this, but your brain will. And that’s just lighting. Add real-world temporal anti-aliasing, much greater material sophistication, and all the other prerequisites for something that looks real, or at least not blatantly plasticky. And as Oxide’s Dan Baker points out, we haven’t really touched on lens effects.
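Here's a rough sketch of why light counts matter so much for batch counts. With classic multi-pass forward lighting and per-light shadow maps, each shadow-casting light ends up re-drawing some portion of the scene, so batches scale roughly with objects times lights. All of the numbers below (the object count, the fraction re-drawn per light) are invented for illustration.

```cpp
// Why light counts blow up batch counts under a multi-pass lighting approach.
// Every number here is an illustrative guess, not a measurement.
#include <cstdio>

int main() {
    const int visible_objects = 1500;      // draw calls for the base pass (made up)
    const double redrawn_per_light = 0.5;  // fraction of the scene re-drawn per light (made up)

    for (int lights : {4, 8, 100, 1000}) {
        const double batches =
            visible_objects + lights * visible_objects * redrawn_per_light;
        std::printf("%4d lights -> roughly %.0f batches per frame\n", lights, batches);
    }
}
```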

Batch count survey

Best guesses below. It would be great to get exact numbers from some enterprising game developer (hint hint).

Platform              Max Batch Count on a Radeon HD 7790
DirectX 9             4,000
DirectX 11            11,000
DirectX 12 / Mantle   50,000

This presumes 30fps at 720p. I haven’t tried DirectX 9 at 1920x1080 with that level of card and am too lazy to try now.

Frames per second needs to die

FPS needs to die as a benchmark result. Let us presume that 30fps or 60fps is a prerequisite at this stage of the game. The question, therefore, is how many batches per frame a piece of hardware or software can deliver and still hit those two minimum qualifiers.

Using FPS at this stage is like arguing processor power based on MHz.
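A sketch of what that benchmark number could look like in practice: take a run's per-frame timings and batch counts, and report the most batches a frame sustained while still meeting the target frame time. The sample data here is made up.

```cpp
// Sketch of the proposed metric: instead of raw FPS, report how many batches
// per frame a run sustained while still meeting a target frame time.
#include <algorithm>
#include <cstdio>
#include <vector>

struct FrameSample { double ms; int batches; };

// Largest per-frame batch count among frames that met the target frame time,
// or 0 if no frame met it.
int batches_at_target(const std::vector<FrameSample>& frames, double target_fps) {
    const double budget_ms = 1000.0 / target_fps;
    int best = 0;
    for (const auto& f : frames)
        if (f.ms <= budget_ms) best = std::max(best, f.batches);
    return best;
}

int main() {
    // Made-up benchmark run: frame time in ms, batches submitted that frame.
    std::vector<FrameSample> run = {
        {28.0, 42000}, {31.5, 51000}, {33.0, 49000}, {40.2, 55000}};
    std::printf("max batches/frame at 30fps: %d\n", batches_at_target(run, 30.0));
    std::printf("max batches/frame at 60fps: %d\n", batches_at_target(run, 60.0));
}
```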

When is it Star Wars prequel level?

I got quoted as saying that by the end of the current console generation you’ll have games that are able to do LOTR-like battle scenes. I stand by that. I’m putting it here, in writing, on this blog so people can point back to it in a few years and see if I was right (or tragically wrong).

But we can have a useful debate on it now: Is 50,000 batches in a given frame enough to produce a scene of sufficient fidelity that the typical viewer will find it realistic? We have had this debate internally a lot over the past year. Our lead graphics dev says you’d really need more like 300,000 batches. My position is that at 50,000 you reach well into the “good enough” range provided that the engine is doing a lot of optimizing (LOD, good use of built-in hardware features, etc.).
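For what it's worth, here's the kind of optimizing I mean by LOD, as a sketch with invented thresholds and per-model costs: the farther a unit is from the camera, the cheaper the model you draw it with, so an enormous army doesn't cost anywhere near units-times-full-detail batches.

```cpp
// Distance-based LOD sketch. All thresholds, batch costs, and unit counts are
// invented for illustration; the point is how fast the per-frame batch count
// shrinks when distant objects get cheaper representations.
#include <cstdio>

enum class Lod { High, Medium, Low, Impostor };

Lod pick_lod(double distance_meters) {
    if (distance_meters < 20.0)  return Lod::High;     // full-detail model
    if (distance_meters < 60.0)  return Lod::Medium;   // reduced mesh/materials
    if (distance_meters < 150.0) return Lod::Low;      // simple mesh, one material
    return Lod::Impostor;                               // billboard / batched sprite
}

int batches_for(Lod lod) {
    switch (lod) {
        case Lod::High:     return 12;  // multiple materials, attachments
        case Lod::Medium:   return 5;
        case Lod::Low:      return 2;
        case Lod::Impostor: return 0;   // folded into one instanced draw
    }
    return 0;
}

int main() {
    // A made-up army: a few units near the camera, thousands far away.
    const int near_units = 50, mid_units = 500, far_units = 5000;
    const int total = near_units * batches_for(pick_lod(10.0))
                    + mid_units  * batches_for(pick_lod(40.0))
                    + far_units  * batches_for(pick_lod(200.0));
    std::printf("approx batches for the army: %d\n", total);
}
```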

[image: battle scene from the Star Wars prequels]

I’m pretty sure this can be done with 50k batches at 24fps. They don’t even have real light. Seriously, go back and watch the prequels; most of their explosions might as well be cartoons.

[image: another scene from the Star Wars prequels]

This is very doable at 50k batches.

[image: a more sophisticated live-action battle scene]

With smart LOD, yeah, I think this can be done with 50k batches per frame at a level where the typical viewer can’t tell the difference. (This scene is admittedly vastly more sophisticated than that; look closely at the light reflections on the people, then point and laugh at the Gungans above. It’s okay. They’re used to it.)

 

Next Up: TXAA, GPU computing

As GDC gets closer, I am going to try to dive into why Mantle, DirectX 12, and other upcoming architectures are a big deal beyond just making the screen prettier or a given frame faster.

AMD and NVIDIA have some amazing features built into their hardware that are scarcely used because of the CPU-to-GPU bottleneck that is now being addressed. In addition, a lot of interesting things start to become possible when the CPU’s and GPU’s total computing power can finally be effectively merged instead of segregated as they are today.