Real-time ray tracing is finally becoming commonplace in high-end PC games and even some PS5 and Xbox Series X games. It usually comes with a performance hit, though, forcing players to choose between smoother gameplay and more accurate visuals. Which raises the question: should players even care about ray tracing?
Why Ray Tracing Is a Big Deal (Even If It Hurts Performance)
Although ray tracing is a relatively new buzzword in gaming, the technique has been a mainstay of computer graphics in film and TV for years. It simply refers to tracing the paths of light rays as they bounce around a scene, which lets computers accurately render shadows, reflections, highlights, and bounced light. The result is a scene that looks more realistic with less manual work. The downside is that ray tracing takes so much processing power that film studios can spend days rendering a single highly detailed scene.
The real breakthrough for video games is real-time ray tracing. Modern consoles and graphics cards finally have enough processing power to handle the brute-force work of ray tracing. Even so, it's often limited to specific effects. Cyberpunk 2077, for example, has separate toggles for ray-traced reflections and shadows, so you can choose which aspects of the game's graphics get improved.
But do you really care that much about more accurate shadows? Any single improvement might sound insignificant, but the same is true of most advances in visual fidelity. We tend to notice when bad graphics stick out; when the graphics are good, we simply get more immersed in the game.
More importantly, ray tracing makes developers' jobs easier. Most current games still offer non-ray-traced graphics options for players who don't have the hardware to turn ray tracing on, but making those fallbacks look right takes far more work. The more time developers spend hand-tuning the faux shadows in a scene, the less time they have for everything else.
In the long run, as gaming hardware gets more powerful, ray-traced graphics will become the standard, enabling developers to create gorgeous experiences with less effort than before. There's still something to be said for simply appreciating how good a video game looks, and a game will almost always look its best with ray tracing on.
Why You Still Might Want to Turn It Off
For now, though, ray tracing is a demanding task, and the current generation of consoles has arrived at a challenging time. Outputting games at 4K is becoming standard, even for players who don't own a 4K TV. Games are increasingly targeting 60 frames per second, and those that don't stick out like sore thumbs. Some games are even trying to hit 120 frames per second.
All of those features demand massive amounts of processing power compared to previous generations. All other things being equal, 4K requires roughly four times as much processing power as 1080p, because a 3840x2160 frame contains four times as many pixels as a 1920x1080 one. Running at 60 frames per second requires roughly twice as much processing as 30 frames per second, because the hardware renders exactly twice as many frames in the same amount of time, and 120 frames per second doubles the workload again. In other words, new consoles and graphics cards have to move a lot of data very fast just to keep up with modern features.
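The scaling above is easy to check with some back-of-the-envelope arithmetic. The sketch below is a rough illustration only (real GPU workloads don't scale perfectly linearly with pixel count, and the resolutions and frame rates are just the common ones named in this article):

```python
# Rough comparison of raw pixel throughput at common resolutions and frame
# rates. This ignores ray tracing itself and every other real-world factor;
# it only illustrates how pixel counts multiply.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Total pixels the GPU must produce each second."""
    return width * height * fps

# Baseline: 1080p at 30 frames per second.
baseline = pixels_per_second(1920, 1080, 30)

print(pixels_per_second(3840, 2160, 30) / baseline)   # 4K at 30 fps  -> 4.0x
print(pixels_per_second(1920, 1080, 60) / baseline)   # 1080p at 60   -> 2.0x
print(pixels_per_second(3840, 2160, 120) / baseline)  # 4K at 120 fps -> 16.0x
```

Stacking 4K and 120 frames per second multiplies the two factors together, which is why the combined workload balloons so quickly.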
Adding ray tracing on top of that is like trying to squeeze in a fourth job when you’re already working 90 hours a week. Eventually, something has to give. And that’s exactly where the current crop of ray-traced video games find themselves.
Depending on your graphics card, Cyberpunk 2077 might be unplayably slow on PC with ray tracing turned on. Call of Duty: Black Ops Cold War can run at 120 frames per second or with ray tracing enabled, but not both. Even on hardware that can handle ray tracing and high frame rates at the same time, you might get a smoother, less buggy experience if you skip ray tracing entirely.