It’s odd to see a game become a flagship for the future of graphics technology twice. When Quake II launched at the end of 1997, it was one of the first games to ship with support for hardware acceleration out of the box. It was a watershed moment, the point where games began to move away from software rendering as the default, making graphics cards a necessity instead of a novelty. I actually remember running Quake II with a graphics card for the first time (I can’t remember which card, sadly) and marvelling as the scuzzy, brownish frames suddenly became crisp and smooth. It felt almost like a magic trick, like I was suddenly looking at a different game.
22 years later, it’s all happening again. Only this time I’m cooing as the laser blast from my starting pistol illuminates a hallway, and staring into a pool of water as it reflects the sky above me. I’m blowing barrels up despite there being no nearby enemies, just to see how it affects the light levels in the room. It’s like being 10 again, except I’m not glancing over my shoulder in case my parents discover me playing a gory FPS that’s entirely unsuitable for my age.
Has there been a technology that felt like such a leap forward since hardware acceleration? Physics? HDR? DirectX 9? I’ve been racking my brain to think of a moment that seemed equally significant, or at least one that was marketed as such. In theory, real-time ray tracing is the holy grail of graphical realism. The ability to dynamically simulate how light bounces off objects is fundamental to creating a realistic-looking world. From affecting how different materials look, to creating realistic reflections and achieving that seemingly impossible goal of real-time global illumination, the potential ramifications of real-time ray tracing are enormous.
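At its core, the idea is a simple recursion: the light a ray gathers at each surface it hits is whatever that surface emits, plus some fraction of the light gathered from the next bounce. Here's a deliberately toy sketch of that recursion, with no real geometry; the `surface_chain` structure and numbers are my own illustrative assumptions, not how any actual renderer is organised.

```python
def gather_light(surface_chain, depth=0, max_depth=3):
    """Toy model of recursive light gathering along one ray's path.

    surface_chain: list of (emitted, reflectivity) pairs, one per
    surface the ray hits in order. Each bounce contributes its own
    emitted light plus an attenuated share of the next bounce.
    """
    if depth >= max_depth or depth >= len(surface_chain):
        return 0.0  # ray escapes or we give up after max_depth bounces
    emitted, reflectivity = surface_chain[depth]
    return emitted + reflectivity * gather_light(surface_chain, depth + 1, max_depth)

# A dark wall (emits nothing, reflects half) lit indirectly by a
# bright blaster bolt further down the ray's path:
wall_then_bolt = [(0.0, 0.5), (5.0, 0.0)]
print(gather_light(wall_then_bolt))  # 2.5 — the wall glows at half the bolt's brightness
```

This is why a blaster shot can visibly light a corridor it never touches directly: the light arrives via bounces, attenuated at each one, rather than from a hand-placed light source.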
But what’s the reality? I’ve spent the past couple of days mucking about in Quake II RTX and Remedy’s Control to get an idea of how RTX alters the overall experience, and I thought I’d share my initial impressions. This is neither a scientific test of ray tracing’s effectiveness nor a thorough benchmarking of the hardware; it’s just a broad consideration of how ray tracing affects the look of virtual worlds and what improvements it offers over existing graphics tech.
I picked Quake II and Control because I have a clear idea of how both games look without ray tracing. I’ve played Quake II many times over the years, and I reviewed Control just over a month ago. As it turns out, my first impressions of the effectiveness of ray tracing in the two games were quite different.
With Quake II, the effect of RTX is immediately apparent, mainly because Quake II’s lighting and texturing are so basic compared to modern games. With RTX on, textures and materials appear much softer and more brightly coloured than in the original version. Quake II RTX also effectively demonstrates how ray tracing simulates dynamic light changes, from blaster shots casting a moving orb of light down long, dark corridors, to explosive canisters briefly turning a room bright orange. The relatively sparse geometry of Quake II’s world means the ray tracing is always front and centre.
By comparison, the effect of ray tracing in Control is less immediately obvious. This is because modern games have become very good at faking what ray tracing does for real, particularly when it comes to lighting. Developers have spent years figuring out how to replicate the “natural” look of materials under specific light sources without having an all-purpose global lighting simulation. Consequently, we already have bespoke, localised solutions for dynamic lighting and shadows, ambient occlusion, soft shadows, day/night cycle lighting adjustments, crepuscular rays, and so on.
With or without RTX, playing Control at 1080p or higher with all settings maxed out is still going to look amazing. That being said, there are specific areas where RTX makes a noticeable difference in the game. The most significant of these is reflections. With RTX on, surfaces like puddles on the floor, glass windows, and even the protective glass panes of picture frames start to bounce back detailed reflections of the world based on the angle you’re viewing them from. You also get compound multi-bounce reflections, where two panes of glass will reflect off one another, producing that “infinite tunnel” effect you’ll often see showcased at science museums.
It’s not just purely reflective surfaces that are affected, either. Any surface that’s smooth or glossy enough to partially reflect light will show the difference. Control’s brutalist architecture demonstrates this very well. Polished non-glass surfaces like marble give a partial reflection, while surfaces like varnished wood visibly bounce light. Given the amount of wood, glass, and marble you see in Control, this adds up to a lot of extra visual complexity.
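The "partial reflection" idea can be pictured as a simple blend: the final colour of a surface point mixes its own diffuse colour with whatever the reflected ray sees, weighted by how glossy the material is. The sketch below is a hypothetical toy model of my own (real engines use far more sophisticated material systems, and this is certainly not Remedy's shader), but it captures the intuition.

```python
def shade(base_color, reflected_color, glossiness):
    """Blend a surface's own colour with what it reflects.

    base_color / reflected_color: RGB tuples in [0, 1].
    glossiness: 0.0 = matte (pure diffuse), 1.0 = perfect mirror.
    Illustrative only; real materials vary this per angle and wavelength.
    """
    return tuple(b * (1.0 - glossiness) + r * glossiness
                 for b, r in zip(base_color, reflected_color))

sky = (0.2, 0.4, 1.0)          # colour arriving along the reflected ray
marble = (0.8, 0.8, 0.8)

print(shade(marble, sky, 0.3))  # polished marble: mostly itself, a hint of sky
print(shade(marble, sky, 1.0))  # a mirror: you only see the sky
```

A matte wall (glossiness near zero) shows almost none of the reflection, which is why ray-traced reflections pop on marble and varnished wood but not on raw concrete.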
It’s worth noting RTX is doing a lot else as well, such as dynamically altering lights and shadows depending on where light sources are and if/how they are moved. Throwing a desk lamp at an enemy in Control produces some cool effects, for example. You also get a much greater distinction between light and dark areas, as light levels are determined by the ray tracing algorithm rather than the level that the developer set. Heavily shadowed areas are very dark, while highly illuminated areas, such as Control's furnace room, are dazzlingly bright.
Ultimately, RTX makes a substantial difference, although it’s more noticeable in some games than in others. It’s also clearly still early days for the tech, and in some situations it can make the image look worse rather than better. For example, I noticed ray tracing produces a fuzziness on certain objects, usually objects with dense or fine patterning like wire-mesh fences, or Jesse Faden’s hair. I don’t know why this occurs, but I’d guess it has something to do with the “de-noising” process (ray tracing produces a lot of image noise that then has to be removed to present a clean image). I also think that, generally, there needs to be more distinction between surface types. Sometimes you’ll see thin glass giving off too much of a mirror-like reflection, wood that bounces light like chromed metal, and so on.
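My de-noising guess is just that, a guess, but the trade-off is easy to illustrate. The crudest possible denoiser averages each pixel with its neighbours, and any averaging that smooths away noise will also smooth away genuinely fine detail, such as the alternating wire/gap pattern of a mesh fence. The box filter below is my own simplified stand-in; real-time denoisers are vastly more sophisticated, but they face the same tension.

```python
def denoise(row, radius=1):
    """Naive spatial box filter over a 1-D row of pixel values.

    Averages each pixel with its neighbours within `radius`.
    Purely illustrative: real denoisers weight samples far more
    cleverly, but averaging of some kind is always involved.
    """
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

fence = [1.0, 0.0, 1.0, 0.0, 1.0, 0.0]  # crisp wire/gap pattern
smoothed = denoise(fence)
# The full-contrast pattern collapses toward a uniform grey:
print(max(smoothed) - min(smoothed))  # well under the original contrast of 1.0
```

High-frequency detail like hair or mesh is exactly what this kind of filtering struggles to preserve, which would explain the fuzz showing up on those objects in particular.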
The other issue is performance. Current RTX cards only guarantee you a smooth ride at 1080p with ray tracing enabled. For most people this will be fine, but if you’re aiming for a higher resolution, it’s more of a gamble. Using my RTX 2080 (backed by a Ryzen 5 3600, 32GB DDR4, and a 1TB SSD), Control at 2160p with ray tracing was entirely unplayable. At 1440p, it's consistently around 35fps, and at 1080p around 50fps. Quake II fared better at higher resolutions. At 4K, I got around 33fps with all RTX settings enabled and global illumination set to medium. Setting it to high gave an average frame rate of 28fps. In short, 60fps with an RTX card is achievable at 1080p. For higher resolutions, unless you’re willing to invest in a 2080 Ti, it’s a struggle.
Nonetheless, I’d say that ray tracing is far from a gimmick. In fact, I'm confident that it’s only a matter of time before ray tracing becomes the norm. The overall effect it has on image complexity is substantial, and that effect will only increase as cards are developed that allow for full global illumination at a lower cost to performance, and as developers give more nuanced consideration to how light bounces off specific surface types. How long it will be before this happens is a harder question to answer, but I think it’ll be sooner rather than later. There are two important milestones coming in the next year or so. The first is Cyberpunk 2077, which I reckon will encourage a lot of people to adopt RTX-enabled cards. The second is the next generation of consoles, as both PS5 and the next Xbox have already been confirmed to support ray tracing (though neither uses Nvidia RTX hardware). To what degree this support is enabled in terms of amount and quality is very much up in the air, but it will give devs considerably more motivation to implement it than they have now. From that point, it’s global illumination all the way, baby.
September 18 2020 | 18:30