It's been a long time coming, but Nvidia finally has a new generation of consumer cards coming in hot. This isn't just an upgrade, however; Big Green is changing the game and possibly shaking up the graphics market like never before.
When the 10-series cards were released more than two years ago, it was one of the largest generational leaps we've seen from any GPU maker. The 10-series cards were much more powerful, tier for tier, than the 900-series cards. They used much less power too, which meant that we finally got full versions of these chips in laptops. Thanks to chips like the GTX 1050 Ti, mid-range gaming laptops were finally good enough to run games at high settings at 1080p and 60 fps.
Although demand from cryptocurrency miners drove up the prices of these cards toward the end of their run, they almost always offered a great deal at every tier. That means these new cards from Nvidia have huge shoes to fill. With the cards launching on 20 September 2018, now is a crucial time to think about what your next GPU purchase will be.
The Most Important Facts

There's a lot to process when it comes to the 20-series, but we want to highlight a few key facts that will help frame the changes these cards will bring.
First of all, this series is based on an entirely new GPU design known as Turing. Nvidia names its GPU architectures after famous scientists. The 10-series used the Pascal architecture, but the most recent new chip was actually known as "Volta", which featured in the Titan V. The major addition with Volta was the inclusion of specialized cores that accelerate machine learning. Nvidia was diversifying the functions of the GPU, adding more than just CUDA cores into the mix.
Turing contains the same machine learning advances as Volta, but adds much more. These cards support GDDR6 memory and up-to-date DisplayPort technology, which means 4K at 120Hz for starters. Nvidia has also developed a new version of GPU Boost, which dynamically increases clock speeds if the thermal environment allows for it. Turing also includes the NVLink technology we saw with Volta, which allows for much better integration of multiple cards and even lets them pool their memory.
What makes Turing stand out is its diverse mix of hardware processing units, which combine into something that looks very special. It includes traditional CUDA cores, AI-accelerating Tensor cores, and new real-time ray-tracing cores, which is why these cards are called RTX and not GTX.
While each type of core can accelerate a specialized function on its own, such as AI, the cores also work together on various aspects of graphical rendering to produce a qualitative leap over previous generations. Raw performance alone is not what will set Turing apart. This is very likely the start of a true new generation of graphics.
What Is "Ray Tracing"?
In case you aren't aware, ray tracing is an advanced rendering method that simulates the path a ray of light takes through a scene. This calculates reflection, refraction, and overall lighting in a way that closely matches real life, which means you get incredibly photorealistic renders. If you've watched any big-budget CG movie recently, you've probably seen ray tracing in action.
The reason we don't use this rendering method for general graphical use on computers is that it demands too much processing power to be done in real time. Those movies render each scene over many hours.
What we use for real-time rendering instead is known as rasterization. Over time, engineers and scientists have figured out various tricks to simulate good lighting using rasterization.
So how has Nvidia achieved real-time ray tracing? By handling certain aspects of the scene with ray tracing and others with rasterization, it's possible to achieve much more realistic output in real time. The AI cores also play a role here, since machine learning helps integrate everything and smooth out the rough edges. Just keep in mind that games need to be patched in order to make use of ray tracing.
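To give a feel for the math the RT cores are accelerating, here is a toy Python sketch of the most basic ray-tracing operation: firing a single ray and finding where it hits a sphere. This is purely illustrative (the function name and scene are our own invention, and real hardware traverses complex acceleration structures), but it shows why every ray costs real arithmetic.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along a ray to its first hit with a sphere,
    or None on a miss. Solves |origin + t*direction - center|^2 = r^2
    for t, a simple quadratic in t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a) # nearest of the two roots
    return t if t > 0 else None          # ignore hits behind the camera

# One ray fired from the camera straight down the z-axis
# toward a unit sphere centered 5 units away:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(hit)  # 4.0 — the ray travels 4 units before touching the surface
```

A full ray tracer repeats this kind of test millions of times per frame, spawning new rays at every bounce, which is exactly the workload that has kept it out of real-time graphics until now.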
Right now, three consumer cards have been revealed. The top dog is the RTX 2080 Ti. This is the first time Nvidia has revealed the Ti card right from the start, so we can make informed decisions from day one. There are also the regular 2080 and the 2070.
On the professional side of things, there are the Quadro RTX 8000, 6000, and 5000. The top cards in either camp have over 4,000 CUDA cores, but the Quadro cards have far more RAM.
Power draw is very modest at the top end, with a 650W PSU suggested for the 2080 Ti.
Until actual cards make it into the hands of independent reviewers, we won't know the real deal. However, Nvidia is making some big claims on the performance front. The 2080 Ti will apparently bring a 50% per-core performance increase over the 10-series. The thing is, because Turing is so different, we don't actually know how to gauge its performance.
Sure, it will outperform Pascal on the same jobs, but the new technology brings qualitative changes that are hard to quantify. For example, we now have a spec known as the "Gigaray": the 2070 can cast 6 billion rays per second. Is that good? There's no frame of reference. In short, we're getting a roughly 50% performance boost plus a whole host of new things the card can do, things that make a dramatic difference to how our graphics will look.
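One way to build a frame of reference for that Gigaray number is some quick back-of-the-envelope arithmetic. The sketch below (our own estimate, not an Nvidia figure) divides the 2070's quoted 6 billion rays per second across a 4K image at 60 fps:

```python
giga_rays_per_second = 6e9   # Nvidia's quoted figure for the RTX 2070
fps = 60                     # target frame rate
pixels_4k = 3840 * 2160      # roughly 8.3 million pixels

rays_per_frame = giga_rays_per_second / fps   # 100 million rays per frame
rays_per_pixel = rays_per_frame / pixels_4k   # about 12 rays per pixel
print(round(rays_per_pixel, 1))
```

A dozen rays per pixel is a tiny sample count next to the hundreds or thousands that offline film renderers use, which is exactly why Turing pairs sparse ray tracing with rasterization and AI-based cleanup instead of tracing everything.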
What About the Midrange?
These cards are all enthusiast grade, which means that most customers will not be buying them. That means there's a lot of anticipation for the 2050 and 2060 cards, if that's what they will be called. With a 1.5x performance multiplier, that would mean something closer to the current 1070's performance at 1050 prices. It might not shake out that way, but if it does, that's another reason to get excited.
Buy Now or Wait?
If you are already on the 10-series, then I think a 50% boost plus those amazing ray-tracing and AI abilities are worth the money. Founders Edition cards are a little more expensive, but overall the pricing on these new cards isn't crazy at all.
If you don't care about ray tracing, perhaps because software support is still thin, then waiting for a 10-series price drop is a good idea. Beyond that, though, there's no reason not to embrace the 20-series other than waiting for real-world benchmarks. So it looks like it's time to start saving up!