Nvidia announced their new GeForce RTX 20-series cards today. It's kind of a big deal to a lot of people, but why are people so excited, and is it actually a big deal?

BACKGROUND

Firstly, the 20-series is the next generation of Nvidia graphics cards. PC (and recently Mac) gaming enthusiasts are likely pretty excited because the 20-series was announced later than anticipated. Historically, Nvidia has released a new generation every year or two, with mid-generation updates in the form of Ti models arriving halfway through a series' life cycle. So with the 20-series announced two and a half years after the initial release of the preceding 10-series, people have been riding the rumor and hype train for nearly a year at this point.

Also for context, the crypto-currency mining rush of the last few years has made it nearly impossible for gamers to afford the 10-series for much of its life cycle. It hardly helped that Nvidia's biggest competitor, AMD, whose Radeon graphics cards had been more or less out of the running for nearly a decade, finally returned with a competitive series, only for those cards to immediately disappear into the hands of miners as well.

SERIES AND GENERATIONS

So is the 20-series just a more powerful 10-series update? Yes and no. The 10-series was built on a 16/14nm process, which allowed Nvidia to more or less continue its trend of each new generation offering roughly 50-60% more performance than the previous series, and roughly 25-30% more than the mid-generation Ti refresh that preceded it. Despite the keynote, it's difficult to say whether the 20-series will continue that trend.

A technical analysis suggests the boards have the faster memory and power delivery needed to support a 50-60% performance bump, and the rated clock speeds and CUDA core counts seem to back that up. So what does Nvidia CEO Jensen Huang mean when he talks about the 20-series being 10-16x faster than the previous generation? He's referring to what the new Turing architecture of the 20-series chips can do.

TURING

The primary feature of recent Nvidia offerings has been their CUDA cores. A CUDA (Compute Unified Device Architecture) core (or, in AMD's case, a Stream Processor) is a means of parallelizing the work of drawing the pixels on your screen: hardware that "divides and conquers" the task of drawing each frame. These CUDA cores are optimized for the floating-point operations (measured in FLOPS) behind things like color calculations.
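To make that "divide and conquer" idea concrete, here's a minimal CUDA sketch (the kernel name and the toy gradient math are mine, purely for illustration) in which every pixel of a frame gets its own thread:

```cuda
#include <cuda_runtime.h>

// Each CUDA thread shades exactly one pixel. The "shading" here is just a
// toy gradient to keep the sketch short; the point is the one-thread-per-pixel
// divide and conquer.
__global__ void shadePixels(float3 *frameBuffer, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Per-pixel floating point math: the kind of work CUDA cores are built for.
    float r = (float)x / width;
    float g = (float)y / height;
    frameBuffer[y * width + x] = make_float3(r, g, 0.5f);
}

int main()
{
    const int width = 1920, height = 1080;
    float3 *frameBuffer;
    cudaMalloc(&frameBuffer, width * height * sizeof(float3));

    // Launch enough 16x16 thread blocks to cover the whole frame.
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
    shadePixels<<<grid, block>>>(frameBuffer, width, height);
    cudaDeviceSynchronize();

    cudaFree(frameBuffer);
    return 0;
}
```

A modern GPU runs thousands of these threads at once, which is why per-pixel floating-point work scales so well on graphics hardware.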

The new Turing chips, unlike the preceding Pascal architecture, borrow some features from Volta, the architecture behind Nvidia's last generation of professional-grade cards. Without getting too technical, the Turing chips reserve additional die real estate for new kinds of operations alongside the CUDA cores: namely, Tensor cores and a new RT (ray tracing) core.

With DirectX 12, and more specifically DXR, ray tracing is squarely on the horizon. Ray tracing works by tracing rays from the camera back into the scene and sampling how light bounces around it, allowing for accurate reflections, shadows, and secondary (indirect) lighting. This could eventually make workarounds like baked reflections and baked global illumination obsolete. Nvidia's inclusion of hardware ray-tracing acceleration is the reason these new cards carry the RTX moniker instead of GTX. But more on that later.
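For a rough feel of the work those RT cores are meant to accelerate, here's a toy CUDA-style sketch (illustrative only; real RTX rendering goes through DXR or Vulkan rather than hand-written kernels, and all the names here are made up) of the secondary rays a ray tracer spawns at each surface hit:

```cuda
#include <cuda_runtime.h>
#include <math.h>

// Small vector helpers (CUDA doesn't define float3 math by default).
__device__ float3 vsub(float3 a, float3 b) { return make_float3(a.x - b.x, a.y - b.y, a.z - b.z); }
__device__ float  vdot(float3 a, float3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
__device__ float3 vnorm(float3 v)
{
    float len = sqrtf(vdot(v, v));
    return make_float3(v.x / len, v.y / len, v.z / len);
}

// Mirror an incoming direction d about the surface normal n: d - 2(d.n)n.
__device__ float3 reflectDir(float3 d, float3 n)
{
    float dn = vdot(d, n);
    return make_float3(d.x - 2.0f * dn * n.x,
                       d.y - 2.0f * dn * n.y,
                       d.z - 2.0f * dn * n.z);
}

// At each point a camera ray hits, the tracer spawns secondary rays: one
// toward the light (is this point in shadow?) and one along the mirror
// direction (what does this surface reflect?). Chasing those rays through
// the scene geometry is the intersection work the RT core is built to speed up.
__device__ void secondaryRays(float3 hitPoint, float3 normal, float3 viewDir,
                              float3 lightPos,
                              float3 *shadowDir, float3 *reflectedDir)
{
    *shadowDir    = vnorm(vsub(lightPos, hitPoint));
    *reflectedDir = reflectDir(viewDir, normal);
}
```

The math per bounce is simple; the expensive part is testing each of those rays against millions of triangles, which is exactly what the dedicated hardware is for.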

The Tensor cores, on the other hand, are optimized for matrix operations of the form A × B + C (a fused multiply-accumulate over small matrices), allowing for much faster neural net operations.
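For the curious, the CUDA toolkit already exposes this operation through its WMMA (warp matrix multiply-accumulate) intrinsics, introduced alongside Volta. The sketch below is a minimal example, assuming a Tensor-core-capable GPU and CUDA 9 or later, in which one warp computes a single 16x16 D = A × B + C tile:

```cuda
#include <mma.h>
#include <cuda_fp16.h>
using namespace nvcuda;

// One warp computes D = A * B + C for 16x16 half-precision tiles, accumulating
// in FP32: the fused matrix multiply-accumulate the Tensor cores are built around.
// Requires a Tensor-core-capable GPU (Volta/Turing) and CUDA 9 or later.
__global__ void tensorCoreMma(const half *A, const half *B, const float *C, float *D)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> aFrag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> bFrag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> accFrag;

    wmma::load_matrix_sync(aFrag, A, 16);                          // load the A tile
    wmma::load_matrix_sync(bFrag, B, 16);                          // load the B tile
    wmma::load_matrix_sync(accFrag, C, 16, wmma::mem_row_major);   // load the C tile
    wmma::mma_sync(accFrag, aFrag, bFrag, accFrag);                // acc = A * B + acc
    wmma::store_matrix_sync(D, accFrag, 16, wmma::mem_row_major);  // write out D
}
```

Launched with one warp (32 threads) per tile, this is the primitive that deep learning libraries build their matrix math on; nobody would hand-write it for a game, but it's what the "Tensor core" marketing actually refers to.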

FOR GAMERS

For gamers, the biggest deal is that the RTX 2070, 2080, and 2080 Ti, though not inexpensive, are a potentially great future-proofing option. In games that utilize ray tracing, Nvidia claims the 20-series cards may deliver 10-16 times the performance of their equivalent 10-series predecessors. Even in games that don't utilize the new technology, the baseline performance increase should still let gamers comfortably max out current AAA titles at 4K (CPU and RAM permitting). And judging from how previous generations have benchmarked, the 20-series also seems more than up to the task of handily supporting next-gen VR platforms and their ever-increasing pixel density.

FOR PROFESSIONALS

For professionals, the new 20-series' value is not quite as clear. In the worst case, the hardware acceleration for things like rendering may be no more than 50% better than the previous 10-series equivalent, which would leave these cards feeling relatively underpowered for video editing, CAD, and simulation work.

But if Mr. Huang's admittedly vague claims are to be believed, these cards may do a whole lot more than that. The 20-series is the first set of consumer-priced cards to feature Tensor cores, which means the general population will have the opportunity to train neural networks at rates up to an order of magnitude faster than before. There's also no telling whether software developers will find ways to harness the Tensor cores for additional hardware acceleration in tasks beyond neural nets. Though the metrics are still unclear, Huang claims the Tensor cores alone are the equivalent of "ten GTX 1080s". Transistor and operation counts aside, real-world application performance is still unknown.

FOR INDIES

Indies come in all shapes and sizes, so to clarify, this is for us micro-indies. Any indie studio big enough to have a CG specialist of any kind probably isn't reading this anyway. So for the rest of us, here's what I see.

Here's why the new 20-series cards are a big deal.

1) As ray tracing becomes more of the standard, the complexity involved in making higher-quality visuals will only continue to come down. As it stands, graphics on the realistic end of the spectrum require nice models, quality textures, level-of-detail scaling, PBR shaders, and an in-depth understanding of your engine of choice's lighting system (render pipelines, rendering paths, global illumination, post-processing, and a half-dozen other details) just to approach even low-level realism.

However, at least for us Unity folks, the new Unity Scriptable Render Pipeline, HDRP template, and RTX should allow us to do all of that with closer to half of the bells and whistles.

2) Neural net AI has begun to make its mark on the gaming world, as Elon Musk's OpenAI team has trained bots to take on professional Dota 2 players 1v1 and 5v5 with a high degree of success. Only time will tell whether access to Tensor cores becomes the standard way we build first-pass, or even final-pass, AI for our games, but now, for the first time, those of us without the benefit of $10,000 - $20,000 computers can attempt rigorous training of neural-network-driven AI on our home computers before the heat death of the universe.

3) It's good to be aware of trends. Granted, the rate at which people replace their computers seems to be slowing, so it may be that none of this will matter for the majority of the install base for another 3 to 6 years. But there's also a chance that current advancements will make today's hardware feel underpowered very soon: VR is gearing up for its second swing at hooking the masses; 4K displays, which quadruple the rendering overhead of the 1080p standard, are becoming more and more prevalent; gamer frame-rate expectations have recently more than doubled; and Intel and AMD are in the middle of a CPU war that has pushed max commercial thread counts from a decade-stagnant 8 to a staggering 64 in the last 12 months. And then there's this.

FINAL THOUGHTS

Personally, I'm very excited for the new 20-series cards. I don't think I'll be picking one up on preorder or anything, but this feels similar to when I picked up my first 3dfx Voodoo graphics card. Sure, for a while the only thing it did was make Quake look better, but there's no denying the impact that technology has had, not just on games but on engineering physics simulations, computer graphics, new branches of computer science, and cryptography.

Hopefully this clears up what all the hoopla is about. The 20-series is probably not the Earth-shattering technological revolution Nvidia would like you to believe it is, but it's an interesting and noteworthy step up in commercial computing technology.