Just a few days ago, we looked at a benchmark overview of Intel’s next-gen flagship CPU, the Core i9-13900K. That benchmark didn’t include any games and only tested the processor’s synthetic performance along with its relative power consumption. Today, the same leaker has posted a follow-up to that, discovered by HXL (@9550pro), which sheds some more light on the overall package Raptor Lake has to offer.
Just like last time, this leaked benchmark comes from “Extreme Gamer” over on Bilibili, who has put the 13900K up against its current-gen counterpart, the Core i9-12900K(F). For the test bench, he’s using an ASUS Maximus Z690 motherboard, equipped with 32GB of TeamGroup T-Force DDR5-6400 memory, a 360mm AIO liquid cooler, and a 1500W Cooler Master power supply. Moreover, the tester used the same GeForce RTX 3060 Ti graphics card with both CPUs.
Moving on to performance, let’s take a look at synthetic benchmarks first. It’s important to mention that the i9-13900K is running at 5.5GHz here, while the 12900KF is only running at 4.9GHz, which is a pretty sizeable increase. Regardless, Extreme Gamer put both CPUs through the entire 3DMark suite, and the results are interesting, to say the least.
In FireStrike, FireStrike Ultra, FireStrike Extreme, Time Spy, and Time Spy Extreme, both processors produce nearly identical numbers. The Raptor Lake part is just barely faster in each case, but it’s within the margin of error. However, more drastic differences can be seen in other tests below, particularly in FireStrike Extreme Physics, where the 13900K leads by over 16,000 points.
On average, the i9-13900K performs about 18% better across the 3DMark suite than the 12900K. The largest discrepancy is seen in FireStrike Extreme Physics, where the 13900K achieves a ~28% higher score. The closest the two CPUs came in this benchmark run was in FireStrike Ultra, where the 13900K scored just 1% more points than the 12900K, again within the margin of error.
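As a quick aside, the uplift percentages quoted here follow the standard percent-difference formula, (new − old) / old. A minimal sketch of the calculation; note the scores below are hypothetical placeholders for illustration, not the leaked numbers:

```python
def pct_uplift(new_score: float, old_score: float) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score - old_score) / old_score * 100

# Hypothetical example scores (NOT the leaked results):
print(round(pct_uplift(59_000, 50_000), 1))  # 18.0
```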
Now, moving on to the juicy bit: games. Extreme Gamer tested 8 games, and the 13900K performed close to the 12900KF in nearly all of them. As we all know, games are CPU-bound at lower resolutions, so 1080p and below is the best indicator of how much the CPU affects a game’s performance.
As soon as you move up the resolution ladder, the GPU takes over more and more, with the CPU being there only to assist. With that in mind, check out the performance comparison for all 8 games below, before we move on to individual analysis.
Jumping into specifics: in Far Cry 6, the 13900K is barely better than the 12900KF, producing a handful of extra frames that you’d never even notice. At 4K and 1440p, as expected, both CPUs post the exact same numbers for the most part. On average, we see a 2.8% improvement with all resolutions taken into account.
A similar picture can be seen in Counter-Strike: Global Offensive. As you can see below, at 1080p, the 13900K produces roughly 20 more frames than the 12900K. But when those numbers are in the 700FPS range, it really does not matter one bit. At 4K, the 13900K manages to push out 416FPS while the 12900K is “limited” to 414FPS.
Lastly, let’s take a look at Red Dead Redemption 2, a heavily GPU-bound game that stresses even the most powerful hardware. Once again, at 1080p, the Alder Lake and Raptor Lake parts are separated by literally just one frame. At 4K, the 13900K delivers around 45FPS, whereas the 12900K delivers 42FPS. At 1440p, there’s just a 5FPS difference between the two CPUs.
Averaging across 6 games, the Core i9-13900K is barely 5% faster at best, but even that increase is negated by the power consumption figures. The 13900K consumes significantly more power than the 12900K to achieve better frame rates. The reason for this is the higher clock speeds of the Raptor Lake SKU, which has a 600MHz advantage over the Alder Lake SKU.
At 1080p resolution in Horizon Zero Dawn, the 13900K pulled about 157W of power, while the 12900KF pulled only about 113W, a roughly 39% increase. In game, however, the 13900K only performed about 4% better at that resolution, which makes the chip extremely inefficient at this stage. The same goes for the rest of the games tested here: the 13900K’s extra power draw far outweighs the performance benefit it delivers.
- Compared to Core i9-12900K at 1080p: 19.1% higher power consumption
- Compared to Core i9-12900K at 1440p: 19.8% higher power consumption
- Compared to Core i9-12900K at 4K: 26.2% higher power consumption
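To put those efficiency claims in concrete terms, here is a quick sketch using the Horizon Zero Dawn figures quoted above (157W vs 113W, ~4% faster); the performance-per-watt ratio is my own back-of-the-envelope framing, not a metric from the leak:

```python
# Figures quoted above for Horizon Zero Dawn at 1080p (leaked, approximate):
power_13900k, power_12900k = 157.0, 113.0  # watts
perf_uplift = 1.04                         # 13900K ~4% faster in-game

# Relative power increase of the 13900K over the 12900KF:
power_increase = (power_13900k - power_12900k) / power_12900k * 100
print(f"Power increase: {power_increase:.1f}%")  # ~38.9%

# Relative performance-per-watt of the 13900K vs the 12900KF; below 1.0
# means the 13900K does less work per watt than its predecessor:
efficiency_ratio = perf_uplift / (power_13900k / power_12900k)
print(f"Relative perf-per-watt: {efficiency_ratio:.2f}")  # ~0.75
```

In other words, at these clocks the 13900K delivers only about three-quarters of the 12900KF’s performance per watt in this title.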
All in all, you have to keep in mind that this is still a pre-release chip. It’s one step closer to final than an engineering sample, but it still requires some tweaks before it can hit the shelves. As you can tell, right now, the i9-13900K is a glorified net-negative that chugs watts like they’re nothing.
Raptor Lake is still a couple of months out, so Intel has enough time to sort things out and fine-tune the lineup even more. Because if the processor ships in this state, it will only extend AMD’s lead when it comes to efficiency. And with Zen 4 right around the corner, the competition is tougher than ever. This is prime time for Intel to step up its game.