Digital Foundry Tests Intel's XeSS in Shadow of the Tomb Raider, Initial Results Much Better Than Expected

In a recent video, Digital Foundry showcased Intel's XeSS technology in Shadow of the Tomb Raider. For now, XeSS is exclusive to this title, but the test gives us an idea of the performance gains and image quality to expect. Let's first discuss how XeSS works.

XeSS

Xe Super Sampling (XeSS) is Intel's AI-upscaling solution for games, providing enhanced performance and, at times, better image quality than native resolution. It leverages the Xe architecture: the game is rendered at a lower internal resolution, then AI upscales the image to your desired output resolution. This boosts the frame rate and reduces the load on the GPU.
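The arithmetic behind that frame-rate boost is simple: shading cost scales roughly with pixel count. Here is a minimal sketch (not Intel's actual API; the function names are ours) of the render-resolution math common to upscalers like XeSS:

```python
# Illustrative only: how a per-axis upscale factor shrinks the
# number of pixels the GPU actually has to shade.

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render size for a given per-axis upscale factor.

    scale=2.0 (a typical 'Performance' preset) halves each axis,
    so the GPU shades only a quarter of the output pixels.
    """
    return round(out_w / scale), round(out_h / scale)

def pixel_savings(out_w: int, out_h: int, scale: float) -> float:
    """Fraction of output pixels the GPU no longer shades natively."""
    w, h = internal_resolution(out_w, out_h, scale)
    return 1 - (w * h) / (out_w * out_h)

# 1440p output with a 2x performance-style preset:
w, h = internal_resolution(2560, 1440, 2.0)   # -> (1280, 720)
print(f"render {w}x{h}, shading {pixel_savings(2560, 1440, 2.0):.0%} fewer pixels")
```

Shading 75% fewer pixels does not translate into 4x the frame rate, of course, since the AI upscaling pass itself has a fixed cost per frame.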

The Results

Digital Foundry benchmarked Shadow of the Tomb Raider using both DLSS and XeSS (version 1.187), pairing NVIDIA's RTX 3070 against Intel's Arc A770. Interestingly, the A770 is the direct competitor to the RTX 3060 Ti, not the 3070.

Intel Arc A770 XeSS at 1440p

The Arc A770 with XeSS delivers a 52% performance uplift in its Performance mode compared to native 1440p. Those looking for the best image should opt for the XeSS Ultra Quality mode, which still grants around a 16% performance boost while looking almost identical to native resolution. With upscaling technology, you often won't notice the quality difference in fast-paced games anyway.

Intel A770 with XeSS at 1440p | Digital Foundry via Videocardz

Intel Arc A770 XeSS at 4K

Moving over to 4K, we see a massive 88% performance uplift compared to native resolution using the XeSS Performance mode. Keep in mind that a still image cannot capture every upscaling artifact that may appear in motion at the lower internal resolution.

Intel A770 with XeSS at 4K | Digital Foundry via Videocardz

XeSS vs DLSS

When compared against NVIDIA's signature DLSS technology, XeSS does fall behind a bit due to a few rendering issues (excess shadows). In addition, the tree on the left is clearer with DLSS, while XeSS fails to achieve similar quality. In short, XeSS makes the image look rather soft.

4K DLSS vs 4K XeSS | Digital Foundry via Videocardz

Looking at complex textures such as bushes, where most AA (anti-aliasing) techniques struggle the most, DLSS pulls ahead even against native 4K + TAA. Intel's XeSS, however, fails to achieve image quality akin to NVIDIA's, which may be down to the excess softening mentioned above. All in all, even DLSS 1.0 was far from great, so we expect Team Blue to improve with each iteration.

4K DLSS vs 4K XeSS | Digital Foundry via Videocardz

XeSS’s Possible Advantage

The chart below shows that all three technologies render games at the same internal resolution in their 'Performance' modes. However, as we move up the quality ladder, Intel's XeSS Ultra Quality mode renders games at a roughly 30% higher resolution than its competitors' top presets. This gives it a sizeable edge in picture quality, albeit at the cost of a few frames.
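That ~30% gap follows from the per-axis scale factors commonly published for each preset. The sketch below assumes those widely cited factors (XeSS Ultra Quality at 1.3x, the usual Quality presets at 1.5x, Performance at 2.0x); the exact chart figures come from Digital Foundry:

```python
# Assumed per-axis render-scale factors (commonly published values,
# not taken verbatim from Digital Foundry's chart).
SCALE = {
    "XeSS Ultra Quality": 1.3,            # renders at ~77% per axis
    "XeSS/DLSS/FSR Quality": 1.5,         # renders at ~67% per axis
    "XeSS/DLSS/FSR Performance": 2.0,     # identical across all three
}

def render_pixels(out_w: int, out_h: int, scale: float) -> int:
    """Internal pixel count for a given output size and scale factor."""
    return round(out_w / scale) * round(out_h / scale)

# At 4K output, XeSS Ultra Quality shades noticeably more pixels
# than the other vendors' highest (Quality) presets:
uq = render_pixels(3840, 2160, SCALE["XeSS Ultra Quality"])
q = render_pixels(3840, 2160, SCALE["XeSS/DLSS/FSR Quality"])
print(f"Ultra Quality shades {uq / q - 1:.0%} more pixels than Quality")
```

More shaded pixels mean more detail for the upscaler to preserve, which is why Ultra Quality can approach native image quality while still being cheaper than native rendering.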

XeSS vs FSR vs DLSS in terms of resolution used | Digital Foundry via Videocardz

Conclusion

Similar to AMD's FSR, we hope to see Intel's XeSS support GPUs from other vendors when it arrives in the near future. The road is bumpy, but Intel has made it clear that they are here to stay. NVIDIA's DLSS is undoubtedly better thanks to its use of Tensor cores, although XeSS and FSR are accessible to 'most' gamers.

Abdullah Faisal
With a love for computers since the age of five, Abdullah has always sought to delve into the depths of information, and uses it as his guiding light. He believes success is of utmost importance as history is written by the victor.