The battle for GPU supremacy between AMD and NVIDIA has never been more ferocious. Both giants are throwing everything they’ve got into developing next-gen hardware, in a battle that will dictate customers’ buying patterns for the next two years or so. However, a new player has stepped in for the first time in nearly three decades, and it’s a familiar one.
Intel has long tried to create its own line of dedicated graphics to compete with the industry’s duopoly. It helps that the company has extensive experience in the field, having developed UHD and Iris Graphics backed by world-class engineers. Last year, Intel finally made Arc official as the new graphics brand that will cover all of its GPU products and services going forward.
Fast forward to today, and the company has launched its first-ever Arc A-Series desktop GPU, the Arc A380. It’s a capable entry-level performer with a competitive price tag, though it’s only available in China for now. A worldwide launch for Arc as a whole is planned for later this year, right around the time AMD and NVIDIA begin to introduce their own next-gen designs.
But to get here, Intel had to go through a lot of trial and error, and Arc had a proper predecessor that helped lay the foundation for what Intel’s next-gen discrete GPUs would become. As some of you may know, Arc is also known as “DG2”. From that, you can make the connection that there was a DG1 that preceded it. And that’s exactly what we’re talking about today.
DG1 tested in 2022
DG1 was more of a silent launch for Intel, since it was never really meant to be a consumer product: it was mostly seen in OEM builds, and the same silicon also shipped in laptops as Iris Xe MAX, Intel’s entry-level discrete mobile GPU. Even though no one in their right mind should buy a DG1 now or have anything to do with it, one crazy lad over on Chiphell has done the unthinkable.
User Misaka_9993 went ahead and purchased a DG1 discrete GPU after they couldn’t bear waiting any longer to get their hands on DG2 for testing, given its insane prices and limited availability. The DG1 was then put inside an AMD system with integrated graphics and pitted against the iGPU. The results are interesting, to say the least.
The tester used an aftermarket DG1 variant from GUNNIR with 80 Execution Units (EUs). Even though it isn’t mentioned, Misaka is most likely using the Iris Xe MAX Index V2 model.
The test bench for this comparison was built on the AMD platform using a Ryzen 7 5700G, socketed into an ASUS ROG X570-I motherboard with Resizable BAR turned on. As some of you may know, the 5700G is an APU, so it has a fairly decent Vega 8 iGPU inside, which the Intel DG1 is going up against. Just seeing Windows Task Manager list an AMD GPU and an Intel GPU side by side is honestly surreal.
As for performance, the gaming chops of the DG1 aren’t great, especially because it’s technically not supposed to work on AMD motherboards; Misaka had to boot the system in UEFI mode with CSM disabled just to get the card running. It’s also important to note that the DG1 is a PCIe x8 card, and since the 5700G platform tops out at PCIe Gen3, the card is limited to a Gen3 x8 link in this system.
In the AIDA64 GPGPU benchmark, the DG1 is slower than the Vega 8 in single-precision workloads, as AMD’s offering has a roughly 50% higher TFLOPS figure: 2.35 TFLOPS vs. 1.58 TFLOPS. The Vega 8 also posts significantly higher memory read and write bandwidth and handles more IOPS than the DG1. In the Julia graphics test, the Vega 8 smokes the DG1 by netting almost 200 more frames.
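For anyone wanting to sanity-check that “roughly 50%” claim, here is a quick back-of-the-envelope calculation in Python using only the two TFLOPS figures quoted above:

```python
# Single-precision throughput reported by AIDA64, as quoted in the article.
vega8_tflops = 2.35  # AMD Vega 8 (Ryzen 7 5700G iGPU)
dg1_tflops = 1.58    # Intel DG1 (GUNNIR, 80 EUs)

# Relative advantage of the Vega 8 over the DG1, in percent.
advantage = (vega8_tflops / dg1_tflops - 1) * 100
print(f"Vega 8 single-precision advantage: {advantage:.0f}%")
```

The ratio works out to about 49%, so “roughly 50% higher” is an accurate characterization.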
Despite the card’s unimpressive on-paper image, the DG1 packs a lot of media muscle. When transcoding a 4K60 HEVC Dolby Vision Profile 5 test film to 4K H.264 SDR in Jellyfin, the GPU produced over 120 FPS. For context, the GTX 1650 managed about 130 FPS in the same test, so the two solutions are very close.
Furthermore, when running the same test with 1080p footage, the DG1 pushed out 210+ FPS compared to the GTX 1650’s 300 FPS, but with a major difference: the GTX 1650 consumed 30-50W during these tests, while the DG1 drew only about 7-10W, a gargantuan win in efficiency for Intel. For its class, the DG1’s transcoding performance is excellent.
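The efficiency gap is easiest to appreciate as frames per watt. A rough calculation in Python, using the FPS and power figures quoted above (taking the midpoint of each reported power range is our simplification, not a measured value):

```python
# Frames-per-watt comparison for the 1080p transcode test.
# Power draw ranges are from the article (30-50W for the GTX 1650,
# 7-10W for the DG1); midpoints are our own rough assumption.
gtx1650_fps, gtx1650_watts = 300, (30 + 50) / 2  # 40 W midpoint
dg1_fps, dg1_watts = 210, (7 + 10) / 2           # 8.5 W midpoint

print(f"GTX 1650: {gtx1650_fps / gtx1650_watts:.1f} FPS/W")
print(f"DG1:      {dg1_fps / dg1_watts:.1f} FPS/W")
```

Under those assumptions, the DG1 delivers roughly three times the frames per watt of the GTX 1650 in this workload, which is what makes its decode block so attractive for low-power media servers.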
Keep in mind that DG1 was never positioned as a proper option in the sub-$200 entry-level segment; it served more as a building block for Intel, a necessary step toward where the company really wanted to go: the Arc A-Series. If we look at the specific GUNNIR SKU used in these tests, its official successor would be the recently launched Arc A380, based on the ACM-G11 GPU, formerly known as DG2-128 (DG2 generation, 128 EUs).
It’s fascinating to look back at what Intel achieved with its first modern discrete GPU and how it helped the company carve out Arc. Even though DG1 is more than obsolete today, in hindsight it was an instrumental stepping stone toward Intel standing among the big dogs, AMD and NVIDIA. Whether DG2, aka the Arc A-Series, can give the market leaders a run for their money remains to be seen, but at least we can confidently say that Intel definitely tried.