A first look at, and review of, Intel’s ARC A770 LE dGPU (16GB)

Intel has finally launched the ARC A770 GPU everyone has been waiting for, so how does it perform? I’ve been using the A770 for a few days to get a feel for its performance, keeping an eye out for any problems and bugs while gaming.

First, a quick look at the product

ARC A770 is packaged in a medium-sized box and is protected on all sides. Included in the box are the GPU, a cable which connects to a motherboard’s USB header for lighting control, and a pamphlet with the most important part inside – an Intel ARC sticker similar to the “Core i9” stickers you would get with a CPU.

Installing the graphics driver was a straightforward process, but don’t forget to enable Resizable BAR in your motherboard’s BIOS before installing the GPU – performance will be seriously impacted without it!

Control of ARC is split across two different programs. Intel Graphics Command Center, introduced shortly after Intel announced its intention to return to dedicated graphics, is still the primary hub for changing settings. Things like Adaptive Sync (FreeSync) are enabled by default, but some options that ought to be enabled are not – for example, the Adaptive Tessellation feature, which can improve performance in tessellation-heavy scenarios.

Arc Control

In addition to Intel Graphics Command Center, ARC is also controlled via the Intel Arc Control overlay. There are a few things I don’t like about this overlay. The first is that you’re required to give it administrator permission to start – and since it starts with Windows, every time you boot your computer you’re presented with an annoying prompt to give it permission to load.

The next thing is that when used on the desktop, it’s wasteful with screen space. While the overlay is engaged, it only uses a portion of your screen – but you can’t interact with anything else.

Other than those minor annoyances, the overlay works well enough.

Its main purposes are:

  1. Overclocking
  2. Streaming and Recording
  3. Performance Overlay

The overclocking feature is simple to use and works well enough, letting you quickly raise the card’s power limit. The LE edition A770 has a maximum power budget of 228 watts versus the 190W default; I’m told 3rd party boards may have higher overclocking limits.
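For context, that works out to 20% of extra power headroom over stock: (228 − 190) / 190 = 0.20.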

Pressing Alt + O brings up a general performance overlay, which oddly lacks the option to show framerate.

The Intel ARC performance overlay, shown while writing this article

Arc Control’s recording options are more advanced than Intel Graphics Command Center’s, supporting things like AV1 recording, but in some ways the UI functionality is worse – few hotkeys are available (there isn’t one for starting/stopping recording via the Arc Overlay), and they can’t be modified.

Thoughts on Gaming

I spent a few days gaming on the A770 before writing this article, but honestly that isn’t enough time for in-depth testing. Most of the time in my limited testing, ARC simply worked – unlike Intel’s iGPU drivers, which sometimes had odd quirks.

However, I did encounter two driver-related problems before writing this article.

In Fallout 4, overall performance is very good, and if you’re running the game’s default settings – which cap the framerate at 60fps – you’ll never notice this problem.

Fallout 4’s framerate can be unlocked, and the game works fine up to 120Hz or so. In most areas of the game, Intel’s A770 LE can hold at or near 120fps. However, there are a few very select scenarios where GPU utilization – and framerate – will drop. The easiest place to replicate this is Diamond City, overlooking the city from the bleachers. It happens in a few other scenarios in the game, but they’re fairly rare.

The second problem I had was with Sleeping Dogs. When the game ran, it ran well – hundreds of fps. However, it was completely unstable and would crash during gameplay – I wasn’t even able to complete a single run of its in-game benchmark!

Variable Refresh Rate support – AKA FreeSync

If you’ve ever used an Nvidia GPU and activated G-Sync on an “unsupported” display, you might notice quirks that wouldn’t happen with a Radeon GPU. For example, I’ve noticed varying levels of brightness flickering on multiple FreeSync monitors in certain scenarios when they were paired with Nvidia RTX 2070 and RTX 3060ti GPUs.

I looked for this issue – and a few others I’ve noticed when using an “uncertified” monitor with an Nvidia GPU – and haven’t been able to replicate them using Intel ARC graphics. FreeSync works great on Intel’s A770.

Gaming Benchmarks

Intel knew ARC would have driver issues in some games, and claimed it would price ARC based on its worst-case performance scenarios. Intel rates the A770’s overall performance as beating Nvidia’s RTX 3060. Intel’s A770 is available for $329, while the cheapest RTX 3060 I can currently find on Newegg is $349. However, that ignores Radeon – which offers the RX 6600XT in competition with Nvidia’s RTX 3060, and which can currently be found on Newegg for as little as $279.

Unfortunately, I don’t have a 3060 on hand – I have a 3060ti. However, Tom’s Hardware rates the 3060 at 80% of the speed of a 3060ti, so a rough comparison can be made. Here’s the average performance of all 12 games I tested, with estimated RTX 3060 results.
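To make that estimate concrete, here’s a minimal Python sketch of the method – scaling measured 3060ti results by 0.8 – using only the two 3060ti figures quoted later in this article. It’s an illustration of how the estimated numbers were derived, not part of the test setup itself.

    # Estimate RTX 3060 performance by scaling measured RTX 3060ti results,
    # per Tom's Hardware's ~80% relative-performance figure.
    SCALING_FACTOR = 0.8

    # Measured RTX 3060ti averages quoted in this article (fps)
    rtx_3060ti_fps = {
        "Bioshock 2": 258.0,
        "Assassin's Creed: Odyssey": 90.5,
    }

    for game, fps in rtx_3060ti_fps.items():
        estimate = fps * SCALING_FACTOR
        print(f"{game}: measured 3060ti {fps:.1f}fps -> estimated 3060 {estimate:.1f}fps")

For Assassin’s Creed: Odyssey, for example, the 3060ti’s 90.5fps works out to an estimated ~72.4fps for the RTX 3060.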

All tests were performed at 2560×1440 resolution on a Nixeus EDG27240x monitor.

Looking at the 12-game average, ARC performs a little worse than the estimated RTX 3060. But the average alone doesn’t quite capture ARC’s performance: in some scenarios it is extremely close to the RTX 3060ti, while in others it performs much worse.

Below are the individual game benchmarks, starting with ARC’s best-performing titles.

Strange Brigade is of particular note here, as in this title its minimum framerates were slightly better than the RTX 3060ti’s!

The benchmarks above are where I consider ARC A770 to be performing within, or above, expectations. The results below are where ARC performs poorly.

Just Cause 2 was a title I expected to perform poorly. And really, it’s not that the Arc A770 performs badly in this title – it averages 165 fps – it’s that Nvidia’s GPU does so much better.

Cyberpunk’s poor performance here left me scratching my head, as the A770 was supposed to be competitive with the RTX 3060 in this title. I’m attributing this to a problem with the launch driver, one I hope is rectified soon.

Bioshock 2 is a DX11 game, and overall performance at 172fps is pretty darned good for actual gameplay – but next to the 3060ti’s 258fps, it’s a poor result.

I had expected the 3060ti to lead in Assassin’s Creed: Odyssey, but I didn’t expect the gap to be this large. At 57.2 vs 90.5fps, the A770 provides only 63.2% of the 3060ti’s performance.

Gaming Power Consumption

Final Thoughts

Intel has stepped into the GPU game, bringing much-needed competition to the lower end of the GPU stack. Overall gaming performance is decent, similar to Nvidia’s RTX 3060. It has a few nice features – AV1 encoding and 16GB of VRAM amongst them – and a few flaws, including higher power consumption and subpar performance in certain titles.

Looking at value, however, I’m not quite impressed, given that Intel had indicated ARC would be priced based on its worst-case performance scenarios. If you only compare the A770 to the RTX 3060, it offers similar performance per dollar. But such a comparison ignores the existence of Radeon’s RX 6600XT, which can be found for $50 less and is honestly a better value than both Intel’s and Nvidia’s offerings.

I’m glad to see Intel enter the GPU market, and I hope they’ll stick around – but as of right now I can’t quite recommend ARC as an alternative to Radeon or Nvidia products given its current state of performance. I hope this will change with the release of their next-generation products, codenamed Battlemage.
