Doom: The Dark Ages is the latest first-person shooter developed by id Software, serving as the eighth main entry in the iconic franchise and the third installment of the modern series, following the critically acclaimed Doom Eternal, released in 2020. This article is not a game review; instead, it is an in-depth benchmarking analysis of the new title. As anticipated, some configurations struggle, and our aim is to clarify the performance you can expect across a wide range of setups.
Doom: The Dark Ages is built on the id Tech 8 engine, which uses fully dynamic lighting, including ray-traced global illumination and ray-traced reflections. As a result, a GPU with hardware ray-tracing support is a hard requirement rather than an optional extra. Path tracing is expected to arrive in a post-launch update, so even owners of high-end GPUs such as the RTX 5090 will have to wait for that feature.
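Since hardware ray tracing is a hard requirement here, readers on older cards may want to confirm that their GPU actually exposes the relevant capabilities before purchasing. Below is a minimal sketch (not anything shipped with the game) that calls the Vulkan SDK's vulkaninfo tool and checks for the standard ray-tracing extensions; it assumes vulkaninfo is installed and on your PATH.

```python
# Minimal sketch: check whether the installed Vulkan driver reports the
# standard hardware ray-tracing extensions. Assumes the `vulkaninfo` tool
# (from the Vulkan SDK or your GPU driver package) is available on PATH.
import shutil
import subprocess

RT_EXTENSIONS = (
    "VK_KHR_acceleration_structure",
    "VK_KHR_ray_tracing_pipeline",
    "VK_KHR_ray_query",
)

def has_hardware_rt() -> bool:
    if shutil.which("vulkaninfo") is None:
        raise RuntimeError("vulkaninfo not found; install the Vulkan SDK or vulkan-tools")
    # vulkaninfo prints every device extension the driver exposes.
    output = subprocess.run(["vulkaninfo"], capture_output=True, text=True, check=True).stdout
    return all(ext in output for ext in RT_EXTENSIONS)

if __name__ == "__main__":
    print("Hardware ray tracing exposed:", has_hardware_rt())
```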
For our benchmarking tests, we used driver version 576.31 from Nvidia and version 25.5.1 from AMD, both of which are optimized for Doom: The Dark Ages. Our test pass covered a specific segment of Chapter 6, Siege Part 1, played on the Aspiring Slayer difficulty, starting at a checkpoint just before a significant battle sequence. This intense fight provided a rigorous test of frame rate performance, as measuring less demanding sections, such as traversing near-empty corridors, would yield artificially high frame rates.
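For readers who want to reproduce this kind of test pass, the standard approach is to capture per-frame times with a tool such as PresentMon or CapFrameX during the fight and derive the averages from that log. Here is a minimal sketch of the math, assuming a simple CSV of frame times in milliseconds; the file name and column label are placeholders rather than any particular tool's output format.

```python
# Minimal sketch: compute average fps and 1% lows from per-frame times (ms)
# captured during a benchmark pass. The CSV layout ("frametime_ms") is a
# placeholder, not a specific capture tool's schema.
import csv

def load_frametimes_ms(path: str, column: str = "frametime_ms") -> list[float]:
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def average_fps(frametimes_ms: list[float]) -> float:
    # Average fps is total frames divided by total elapsed time,
    # not the mean of per-frame fps values.
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms: list[float]) -> float:
    # 1% lows: average fps across the slowest 1% of frames.
    worst = sorted(frametimes_ms, reverse=True)[: max(1, len(frametimes_ms) // 100)]
    return 1000.0 * len(worst) / sum(worst)

if __name__ == "__main__":
    frames = load_frametimes_ms("benchmark_pass.csv")
    print(f"Average: {average_fps(frames):.1f} fps, 1% lows: {one_percent_low_fps(frames):.1f} fps")
```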
Doom: The Dark Ages offers multiple graphics presets, yet for this analysis we concentrated on just two, Ultra Nightmare and Medium, because there is minimal performance difference among the highest quality settings: Ultra Nightmare, Nightmare, Ultra, and High. Notably, the RTX 5080 demonstrated only a 2% performance increase moving from Ultra Nightmare to High, while the 9070 XT saw a slightly larger 5% uplift in the same scenario. Surprisingly, the 9070 XT outperformed the RTX 5080 with the provided drivers, despite Nvidia having early access to the game and asking reviewers to showcase the RTX features of the GeForce 50 series.
Even when using the Medium preset, the Radeon GPU only saw an 8% increase, while the GeForce GPU improved by a mere 2%. Lowering the settings to Low resulted in a more significant boost: a 14% increase for the 9070 XT and a 13% gain for the RTX 5080. However, this means that the Low preset only offers an 18% performance improvement for the RTX 5080 and 29% for the 9070 XT compared to the highest quality preset, highlighting a concerning lack of performance scaling across presets.
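For clarity, the percentages quoted throughout are simple ratios of average frame rates between presets. The sketch below shows that calculation with made-up fps figures (not our measured data), along with a reminder that per-step gains compound multiplicatively.

```python
# Minimal sketch: how a percentage uplift between two presets is derived from
# average frame rates. The fps values are illustrative, not measured data.
def uplift_percent(fps_baseline: float, fps_compared: float) -> float:
    return (fps_compared / fps_baseline - 1.0) * 100.0

ultra_nightmare_fps = 100.0  # hypothetical Ultra Nightmare average
low_fps = 118.0              # hypothetical Low preset average

print(f"Low vs Ultra Nightmare: +{uplift_percent(ultra_nightmare_fps, low_fps):.0f}%")

# Per-step gains compound multiplicatively: for example, +8% followed by a
# further +14% works out to roughly +23% overall (1.08 * 1.14 ~= 1.23).
```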
Examining the visual differences between presets reveals that the variations are minimal. In our initial example, we found almost no discernible differences among all six presets. Even when scrutinizing the game’s native presentation, the visuals across the top four presets appeared identical. Comparisons between Medium and Ultra Nightmare settings yielded negligible differences—perhaps only a slight enhancement in distant shadow quality.
With the Low preset, while textures on weapons appeared somewhat softer and distant lighting quality diminished, the overall presentation remained strikingly similar to Ultra Nightmare. A close-up detail comparison reaffirmed this observation, with the most noticeable differences stemming from texture quality and reflections. In scenes featuring intense fire effects, the distinctions remained subtle, suggesting that many users might struggle to identify the differences between presets.
For our benchmark tests, we utilized a robust test system comprising an AMD Ryzen 7 9800X3D processor, paired with 32 GB of DDR5-6000 memory. Before finalizing this article, we obtained Nvidia's latest driver (version 576.40), which yielded a minor performance uplift of around 2-3%. Despite our inquiries, Nvidia was unable to specify the expected performance gains from this new driver.
At 1080p using the Ultra Nightmare preset with native TAA, the RTX 5090 averaged 151 fps, merely 3% faster than the RTX 4090. This puzzling result raises questions about the relative gains of the GeForce 50 series, as the RTX 5080 was also 8% slower than the RTX 4080.
At 1440p with Ultra Nightmare settings, the RTX 5090 averaged 125 fps, a 6% increase over the RTX 4090 and a 30% lead over the Radeon 7900 XTX and 9070 XT. Many GPUs maintained performance above 60 fps, indicating a solid experience for most players.
For native 4K performance, the RTX 5090 achieved an average of 82 fps, with the RTX 4090 and 9070 XT falling behind at 74 fps and 56 fps, respectively. Users not equipped with high-end GPUs can expect sub-60 fps without employing upscaling techniques.
Switching to the Medium preset at 1080p, most GPUs delivered frame rates above 60 fps. However, the Radeon RX 7600 series underperformed, averaging only 52 fps.
At 1440p, the RTX 5090 averaged 135 fps, while the RTX 4090 followed closely at 130 fps. Performance across the 40 and 50 series GPUs was surprisingly similar, with the RTX 5080 again coming in 8% slower than the RTX 4080.
Finally, at 4K with Medium settings, the RTX 5090 managed 89 fps, reflecting only a 9% improvement over its Ultra Nightmare performance. Lower-end cards, like the RTX 5070, showed modest gains, further highlighting the limited performance scaling in this game.
In our analysis of FSR and DLSS upscaling, we compared the Radeon 9070 XT and RTX 5080. The Radeon GPU outperformed the GeForce model, delivering up to 11% more performance at native resolution. With the Quality upscaling option, this margin expanded to 16%, indicating that the Radeon GPU gains more from upscaling in this title.
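Those upscaling gains come from rendering internally at a reduced resolution and reconstructing to the target output. As a rough reference, the sketch below computes the internal render resolution from the per-axis scale factors commonly cited for DLSS and FSR quality modes; the exact factors can vary slightly between upscaler versions, so treat these as approximations.

```python
# Minimal sketch: approximate internal render resolutions for common upscaler
# quality modes. Per-axis scale factors are the commonly cited DLSS/FSR values
# and may differ slightly between upscaler versions.
COMMON_SCALE_FACTORS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    factor = COMMON_SCALE_FACTORS[mode]
    return round(output_w * factor), round(output_h * factor)

for output_w, output_h in ((2560, 1440), (3840, 2160)):
    for mode in COMMON_SCALE_FACTORS:
        w, h = internal_resolution(output_w, output_h, mode)
        print(f"{output_w}x{output_h} {mode:>17}: renders at {w}x{h}")
```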
Focusing on the performance of 8 GB GPUs in Doom: The Dark Ages, particularly the new 8 GB version of the RTX 5060 Ti, we found that they generally performed adequately. However, the game would clearly benefit from a 4K texture pack: many textures appear low-resolution even at the higher settings, echoing our observations of other recent titles.
In our tests, the 8 GB GPU struggled significantly during intense scenarios, particularly in large horde battles. In contrast, the 16 GB model remained playable throughout, and the performance metrics showed a substantial gap: the 16 GB card was approximately 40% faster when using DLSS Quality upscaling.
Our findings indicated that adjusting the texture pool size from the default 2 GB to 1.5 GB considerably improved performance on the 8 GB 5060 Ti at 1440p with upscaling. This adjustment allowed the lower VRAM model to match the performance of its 16 GB counterpart without noticeable visual degradation. Strikingly, we observed no substantial improvement when increasing the texture pool on the 16 GB GPU.
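Our working explanation for why shrinking the pool helps is simple accounting: the texture pool is only one of several allocations that must fit into the 8 GB card's local memory, and once the total exceeds what the driver can keep resident, streaming stalls follow. The sketch below is a back-of-the-envelope budget in which every figure other than the pool setting is an assumption for illustration, not a measurement from the game.

```python
# Back-of-the-envelope VRAM budget for an 8 GB card. All figures except the
# texture pool setting are assumptions for illustration, not measured values.
VRAM_GB = 8.0

budget_gb = {
    "texture pool": 2.0,                 # the game's default setting
    "render targets / buffers": 2.3,     # assumed; scales with internal resolution
    "RT acceleration structures": 1.2,   # assumed
    "geometry, shaders, misc": 2.0,      # assumed
    "OS / desktop reservation": 0.8,     # assumed
}

def report(budget: dict[str, float]) -> None:
    used = sum(budget.values())
    status = "over budget" if used > VRAM_GB else f"{VRAM_GB - used:.1f} GB headroom"
    print(f"{used:.1f} GB of {VRAM_GB:.1f} GB requested -> {status}")

report(budget_gb)                 # default 2.0 GB pool: slightly over budget
budget_gb["texture pool"] = 1.5   # the adjustment that helped the 8 GB 5060 Ti
report(budget_gb)                 # frees 0.5 GB, bringing the total back under 8 GB
```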
Doom: The Dark Ages is a well-optimized game that runs smoothly at 1440p with upscaling enabled. Quality upscaling improves performance by roughly 30% to 40%, enough for GPUs like the Radeon 7700 XT and GeForce RTX 5060 Ti to stay above 60 fps. However, the game exhibits limited performance scaling, with the gap between the highest and lowest presets amounting to an improvement of only around 30% at best. This is a concern for gamers on lower-end GPUs, as many struggle to exceed 60 fps.
Visually, the game impresses with its effects and overall presentation, but it suffers from inconsistent texture quality. As we have seen with other titles, a high-resolution texture pack could significantly enhance the experience. Ultimately, developers must balance optimizing for 8 GB of VRAM against delivering high-fidelity visuals, and we hope to see a transformative texture pack for Doom: The Dark Ages in the future.