Ah, raytracing. If anything demonstrates Nvidia’s dominance of the discrete GPU market, it’s that Team Green has successfully used marketing spin to repackage what’s often described as the pinnacle of graphics technology as a gimmicky, Turing-specific, Gameworks-style add-on, shunting attention away from horrifying price gouging and the lack of a generational performance leap anywhere apart from the extreme top end. It’s almost as big a hoot as DLSS, which promises better performance by… reducing the resolution. (Reports indicate that in many cases, regular upscaling can look better than Nvidia’s bespoke, AI-based solution.) Turing has many issues, a terrible price-performance ratio being only the most notable. Just as important is the fact that Turing’s ray-tracing feature, which utilizes its RT cores and is supposedly meant to offset the lack of a credible performance uplift vis-a-vis Pascal, isn’t quite what it’s cracked up to be.
With the impending announcement of new consoles by Sony and Microsoft, there’s an understandable amount of hype around ninth-gen console capabilities. With the rumour mill pointing to a Navi/Ryzen combination as the platform of choice, lots of people are interested in the kind of graphics features these new consoles could support. Could Project Scarlett or the PS5 make use of ray-tracing with the kind of hardware they’re likely packing? Could they bring effects similar to what we’ve seen on the Turing cards to mainstream console audiences? The answer is… well… kinda. To understand why, we’ll have to look at what raytracing actually is, how it differs from conventional raster rendering, and what Nvidia actually means when it says Turing cards can do raytracing (TLDR: they can leverage a limited implementation of raytracing, but within conventional raster renderers and not across the entire pipeline).
You see, RTX “raytracing” isn’t actually, well, raytracing. It speeds up limited implementations of raytracing that enhance visuals within conventional renderers that still utilize rasterization. What’s rasterization? When we have conversations about polygon counts and shading, global illumination, SSAO and the rest, we’re actually talking about a whole host of disparate techniques that synergize in a conventional rasterized renderer to deliver an image that looks approximately like it would in real life. In rasterization, 3D objects are made up of millions of 2D triangles that intersect with each other in meshes. These meshes carry the information used to assign a colour value to each pixel on the display. Shading refers to all the techniques used to determine how that colour value should change, based on an approximated understanding of how light would interact with the scene. (Actual) raytracing is fundamentally different. To oversimplify, it works in the opposite manner to how you see things in real life. Lights in the real world, whether the sun, candles, or your monitor or phone display right now, cast rays of photons that bounce off other objects and eventually hit your retina, carrying information about the shape of those objects and the manner in which they bounced off them.
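To make the contrast concrete, here’s a minimal sketch of the core rasterization step: deciding which pixels a single projected triangle covers. The edge-function approach and all names are illustrative, not taken from any particular engine.

```python
# Toy rasterizer: which pixel centres does one 2D triangle cover?
# (Illustrative only; real GPUs do this massively in parallel with many
# extra rules for precision and shared edges.)

def edge(ax, ay, bx, by, px, py):
    # Signed area test: positive if (px, py) lies to the left of edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    """Return the set of (x, y) pixels whose centres fall inside the triangle.

    Vertices must be in counter-clockwise order.
    """
    covered = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel centre
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Inside if the point is on the same side of all three edges.
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                covered.add((x, y))
    return covered

pixels = rasterize_triangle((1, 1), (8, 2), (4, 7), 10, 10)
print(len(pixels), "pixels covered")
```

A real renderer would then interpolate per-vertex attributes (colour, texture coordinates, normals) across those covered pixels; that interpolation is where shading takes over.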
In a conventional raster engine, a light source can hit an object and determine what colour that object will be. But the diffuse light reflected from that object isn’t going to then inform the colour of surrounding objects, which is the way real life (and ray-tracing) do things. Techniques like global illumination offer a credible approximation of the behaviour of diffuse lighting due to “light bounce.” But they are, at the end of the day, approximations. In an actual ray-traced engine, rays are projected out from the camera, hit objects, bounce around and then return an output. Unlike rasterization, ray-tracing gives you the “real thing” as far as the behaviour of light in the scene is concerned, not a shader-based approximation. Authentic ray-tracing, with multiple rays per pixel, is extremely intensive computationally and just isn’t possible in real time on current hardware, not by a long shot. For lighting that remains static in the scene, rasterized renderers often use “lightmaps”: the ray-traced output is pre-computed beforehand for those lights in the scene that don’t change at all. This is how you get nice, apparently soft shadowing from static objects like buildings in, say, the Battlefield series. Games like Assassin’s Creed Unity took this approach to commendable heights. Of course, in the real world lights don’t remain in the same place all the time, so pre-calculated output would be useless for most of the light interactions in a given scene. You’d need real-time raytracing to capture the movement of light as objects and lights move around, and that is an order of magnitude more intensive.
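The camera-first idea described above can be sketched in a few lines: one primary ray per pixel, an intersection test against the scene, and a simple diffuse term at the hit point. The one-sphere scene and every name here are invented for illustration; a real path tracer would then fire many more rays per pixel to follow the bounces.

```python
# Toy camera-first ray caster: one ray per pixel, one sphere, one light.
# (Illustrative only; all scene values are made up.)
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def norm(a):
    l = math.sqrt(dot(a, a))
    return tuple(x / l for x in a)

def hit_sphere(origin, direction, centre, radius):
    # Solve |origin + t*direction - centre|^2 = radius^2 for nearest t > 0.
    oc = sub(origin, centre)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit-length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def trace(px, py, width, height):
    """One primary ray for pixel (px, py); returns a 0..1 brightness."""
    camera = (0.0, 0.0, 0.0)
    sphere_centre, sphere_radius = (0.0, 0.0, -3.0), 1.0
    light = (2.0, 2.0, 0.0)
    # Map the pixel onto a view plane at z = -1.
    x = (px + 0.5) / width * 2.0 - 1.0
    y = 1.0 - (py + 0.5) / height * 2.0
    direction = norm((x, y, -1.0))
    t = hit_sphere(camera, direction, sphere_centre, sphere_radius)
    if t is None:
        return 0.0  # ray escaped the scene: background
    hit = tuple(o + t * d for o, d in zip(camera, direction))
    normal = norm(sub(hit, sphere_centre))
    to_light = norm(sub(light, hit))
    return max(0.0, dot(normal, to_light))  # Lambertian diffuse term

centre_shade = trace(32, 32, 64, 64)  # a pixel near the image centre hits the sphere
corner_shade = trace(0, 0, 64, 64)    # a corner pixel misses it entirely
```

The expensive part is hidden in `hit_sphere`: a real scene has millions of triangles, and each secondary bounce repeats the whole intersection search, which is exactly what acceleration structures and RT cores exist to speed up.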
The newly released ray-traced renderer for Quake 2 puts things in perspective. A 2080 Ti, with the largest complement of RT cores of any Turing card, is unable to hit a locked 60 FPS with Quake 2’s ray-traced renderer. A 22-year-old game that, in rasterization mode, runs well on 2008-era Nokia phones can’t hit a locked 60 FPS on the world’s most powerful GPU when “real” raytracing is enabled.
If it wasn’t clear yet, current hardware, whether PC or console, is simply incapable of handling complete implementations of raytracing. You would need hardware that’s 5-10 times as powerful as today’s fastest GPUs to run fully ray-traced editions of even today’s less-intensive games. Moreover, because that kind of power just doesn’t exist, most current game engines are raster-based and would have to be rewritten from scratch to support full raytracing. As far as full raytracing implementations are concerned, the possibility of this happening on the PS5 or Project Scarlett is basically non-existent, and it’s not likely to happen in the PC space either for at least another five years.
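Some rough back-of-envelope arithmetic, with assumed, illustrative sample and bounce counts, shows the scale of the problem: even a single ray per pixel at 1080p/60 is over a hundred million rays per second, before you multiply by the samples and bounces that genuine path tracing needs.

```python
# Rough ray-budget arithmetic. The samples-per-pixel and bounce counts below
# are assumptions for illustration, not measurements; offline film renderers
# routinely use far higher sample counts than this.

width, height, fps = 1920, 1080, 60
primary_rays_per_sec = width * height * fps  # one ray per pixel per frame

samples_per_pixel = 16   # assumed; still noisy by offline standards
bounces_per_sample = 4   # assumed; a few indirect bounces

total_rays_per_sec = primary_rays_per_sec * samples_per_pixel * bounces_per_sample
print(f"{primary_rays_per_sec / 1e6:.0f} M primary rays/s")
print(f"{total_rays_per_sec / 1e9:.1f} G total rays/s")
```

That works out to roughly 124 million primary rays per second, and about 8 billion rays per second under these modest assumptions, with each ray requiring an intersection search against the whole scene.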
However, and this is what the RTX cards have shown us so far, we do have the capability today to utilize real-time raytracing within conventional rasterized engines to improve specific parts of the rendering pipeline. Take Metro Exodus, for instance. The engine’s still raster-based, but ray-tracing can be enabled for light cast by outdoor sources such as the sun, making specific scenes, particularly outdoor environments, more physically correct and lifelike. Even this, however, tanks performance on the RTX cards, especially at higher resolutions. Keeping in mind that AMD’s current PC flagship, the Radeon VII, only keeps pace with the RTX 2080, we would expect the ninth-gen consoles to have GPU capabilities somewhere between an RTX 2060 and a 2070, but without dedicated RT hardware. Raytracing calculations are (obviously) possible without fixed-function hardware like the RT cores on Turing GPUs. With enough hardware grunt, “regular” GPUs can pull off RTX effects just fine; Q2VKPT can run on the 2080 Ti and high-end Radeon parts as well, albeit at lower resolutions. It’s also important to note that developers have had very little time to play around with raytracing.
It’s only been a few months since the RTX cards launched, after all. And different approaches would have different performance implications—while replacing the global lighting system with ray-tracing in AAA titles is likely to be beyond the capabilities of the ninth-gen consoles, we can easily see Project Scarlett and the PS5 utilizing ray-traced shadows. Indie titles and smaller-scale, less interactive games (think the next-gen’s version of The Order: 1886) could also utilize raytracing for a wider range of functions, free from the constraint of rendering large open worlds.
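A ray-traced-shadows pass of the kind described above might look something like this in outline: the rasterizer has already produced the visible surface points, and each one casts a single shadow ray toward the light to test visibility, instead of sampling a shadow map. The tiny scene and every name here are hypothetical.

```python
# Hypothetical hybrid-pipeline sketch: rasterized visibility, ray-traced shadows.
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))

def occluded(point, light, spheres):
    """Shadow ray: does the segment point -> light hit any occluder sphere?"""
    d = tuple(l - p for l, p in zip(light, point))
    dist = math.sqrt(dot(d, d))
    d = tuple(x / dist for x in d)
    for centre, radius in spheres:
        oc = tuple(p - c for p, c in zip(point, centre))
        b = 2.0 * dot(oc, d)
        c = dot(oc, oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc >= 0:
            t = (-b - math.sqrt(disc)) / 2.0
            if 1e-4 < t < dist:  # hit between the surface and the light
                return True
    return False

# Pretend the raster pass already gave us these visible ground-plane points.
sun = (0.0, 10.0, 0.0)
occluders = [((0.0, 1.0, 0.0), 0.5)]  # one sphere floating above the origin
pixels = [(-2.0, 0.0, 0.0), (0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
shadowed = [occluded(p, sun, occluders) for p in pixels]
print(shadowed)  # only the point directly under the sphere is in shadow
```

The appeal of this arrangement is that the ray count stays bounded at roughly one shadow ray per shaded pixel per light, which is far cheaper than tracing the entire image and is the kind of budget a Navi-class console GPU could plausibly afford.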
At the end of the day, it’s fairly clear to us that the time for real ray-tracing has yet to come. We just don’t have graphics hardware that’s fast enough. We will, however, reach that point sometime in the next five to ten years. If AMD and Nvidia double down on adding fixed-function raytracing hardware to their GPUs, as seen in the RTX cards, that day may come sooner rather than later. In the meantime, the use of raytracing to enhance specific aspects of the rendering pipeline in rasterized engines is likely to increase. And with the kind of power likely available to the PS5 and Project Scarlett, these kinds of “ray-traced” experiences are likely to make their way over to consoles.