The somewhat anticipated Xe LP graphics, which Intel promises to release this year (as early as this summer), have been leaked by APISAK in a Geekbench test, showing interesting results. The unreleased GPU was able to match Nvidia's MX350, a rebrand of the GTX 1050 with a much lower power limit. Notably, the GPU ran at a maximum of 1.3 GHz (up from the 1.1 GHz of Ice Lake's graphics) and had 6 GB of RAM.
The first question you might have when seeing this result is whether this is integrated graphics or discrete graphics. Though Tiger Lake CPUs use the same GPU architecture as the discrete DG1, both will actually be present in laptops. Remember CES this year? The DG1 demo was actually performed on a laptop. So, a laptop could have both Tiger Lake integrated graphics and DG1 discrete graphics even though they're mostly the same. One indication that this is DG1 is that the GPU has 6 GB of VRAM, a discrete-sized amount. Intel could partition any amount of system memory as VRAM for an iGPU, but 6 GB is a lot, especially when you consider that partitioning DRAM as VRAM doesn't improve performance. In my opinion, this is DG1 and not the integrated graphics of Tiger Lake. There is one indication that this could be the integrated GPU, since the device is listed as "mobile graphics," but that's not entirely reliable. In November of last year, a leaked GFXBench/Compubench result for the DG1 actually spurred me to publish some info I had initially held back because I wasn't sure if it was true. That benchmark showed a discrete, desktop DG1 GPU running in a 9600K system, yet it showed up as "mobile graphics," presumably because the drivers did not distinguish between desktop and mobile GPUs, as only mobile DG1 GPUs would be sold to consumers.
Performance for Intel looks good given that this is a first-generation product, but we have yet to see power efficiency, pricing, and availability, which are where 10nm-based products naturally struggle due to poor yields on the 10nm node. Even if Xe LP is a failure this generation, it could certainly threaten Nvidia in the next generation once Intel finally develops a good node.