
This article was published on March 30, 2022

Intel’s first Arc GPUs have finally arrived — here’s what you need to know

Team Green and Team Red, meet Team Blue


Not to be melodramatic, but today might be one of the most important days in PC gaming history. Intel is launching its first batch of Arc GPUs today, adding a third major competitor to a market that’s been dominated by Nvidia and AMD (or ATI, back in the day) for as long as many gamers and creators can remember.

Now the question is just whether Intel’s discrete GPUs are any good. 

I’ll go ahead and warn you right away that Intel didn’t provide any hard comparisons with its competitors; we’ll have to wait for real-world benchmarks for those. Still, there’s a lot to unpack here. The technically inclined can go read Intel’s official announcement, but we’ve summed up the main things you should know.

So can I go buy an Intel GPU and stick it into my desktop now?

Not quite yet. Intel is rolling out its Arc family of GPUs slowly, starting with entry-level units designed for mainstream laptops.

So far, Intel is dividing its GPUs into three tiers: Arc 3, Arc 5, and Arc 7. Arc 3 is aimed at thin-and-light laptops and is focused on providing a solid 1080p experience, while Arc 5 and Arc 7 target higher resolutions, framerates, and effects.


Only Arc 3 chips are available starting today, with devices coming from Samsung, Asus, Dell, Lenovo, HP, Acer, and more. The first Arc 3 models are divided into Arc A350M and A370M variants, with the latter being more powerful.

Laptops powered by Arc 5 and Arc 7 will begin to show up in the “early summer,” while desktop and workstation GPUs arrive “this summer.” Sigh.

What do Intel’s Arc chips do differently from AMD and Nvidia?

The most notable technology on offer might be what Intel is calling its ‘Xe Matrix Extensions’ (XMX) AI engines, which live alongside traditional GPU vector engines in each of Intel’s Xe graphics cores. Intel claims that its XMX engines “provide a 16x increase in compute capability to complete AI inferencing operations compared with traditional GPU vector units.”

That’s a bit of gobbledygook, but it basically means that Intel’s XMX tech should be a lot more efficient at solving AI operations than typical GPUs, assuming optimal implementation.

How much of a difference that makes in the real world remains to be seen, but Intel is making a big deal of Arc’s ability to combine traditional graphics performance with AI enhancement.


To that end, Arc GPUs will support Intel’s take on AI-based upscaling, called XeSS. Like Nvidia’s DLSS and AMD’s FSR, XeSS allows your GPU to render graphics at a comfortable resolution — say, 1080p — but output a result that looks a lot closer to native 4K.

Unsurprisingly, Intel implies that because its XMX cores are so efficient, its version of AI upscaling outperforms the competition’s, but we’ll have to wait for some direct comparisons to see if that holds up.

Unfortunately, XeSS does require game developers to get on board to support the technology.

That said, the neat thing about XeSS is that it doesn’t only work with Intel’s GPUs. Although it is most efficient with XMX, it can actually work with many current competing GPUs from AMD and Nvidia, as well as Intel’s less powerful Iris graphics, by leveraging the older DP4a technology. That helps increase its chances of widespread adoption in the long run.

In the meantime, Intel says more than 20 games will support XeSS when it becomes available in the summer.

What else can Arc do?

AI enhancements aside, Intel says its Deep Link technology will allow Arc GPUs to work more seamlessly with Intel’s CPUs and integrated graphics to enable significant performance gains across various workloads:

  • ‘Dynamic Power Share’ can boost performance by up to 30% in intensive workloads by rapidly adjusting the power consumption of the CPU and GPU based on an application’s needs.
  • ‘Hyper Encode’ can shorten render time by up to 60% “compared to Iris Xe graphics alone” by leveraging the media engines from both your integrated graphics and your dedicated graphics.
  • ‘Hyper Compute’ offers up to 24% higher performance in “a variety of new workloads” by combining the compute and AI capabilities of the CPU, integrated Iris graphics, and dedicated Arc graphics all at once.


Other features include:

  • Support for DirectX 12 Ultimate, including ray tracing, variable-rate shading, mesh shading, and sampler feedback.
  • Hardware-accelerated AV1 encoding and decoding, allowing for 50% higher quality game streams for the same data bandwidth as H.264.
  • Adaptive Sync and Speed Sync help eliminate tearing while maintaining low latency.
  • Smooth Sync helps minimize the visual impact of tearing, essentially making it look less janky.
  • Support for two 8K displays at 60 Hz, or four 4K displays at 120 Hz.
  • Intel’s Arc Control App mimics Nvidia’s GeForce Experience and AMD’s Radeon Software as an all-in-one hub for viewing the latest drivers and adjusting settings.


Okay, but do we have any idea how Arc compares to Nvidia and AMD GPUs?

Unfortunately, it’s just too early to tell. Intel’s Arc briefing didn’t include direct comparisons with its competitors, only with the company’s own Iris Xe integrated graphics. It shows significant improvements, as you can see in the chart below, but that’s obviously expected when you go from integrated graphics to a dedicated chipset.

Intel Arc performance chart (Arc 3 compared with Iris Xe integrated graphics)

Based on the targeted power consumption and execution units, the Arc A350M seems to be competitive with GPUs like Nvidia’s GeForce MX series, while the Arc A370M might be competitive with a mobile RTX 3050. But again, the proof is in the pudding. Intel’s XeSS and Deep Link technologies may help give it an edge in some situations, while lack of optimization could give Nvidia the lead in others.

What does this mean for the tech industry?

Regardless, the most exciting thing about this launch is the fact that we have any real competition to Nvidia and AMD at all, let alone from such a major player. It’s particularly important for laptops, where Intel has the distinct advantage of (theoretically) being able to make its CPU, integrated GPU, and external GPU coexist as efficiently as possible while minimizing power consumption.

While many of us are still eagerly awaiting what Intel will do in the desktop graphics market, it’s likely Intel’s play for the mobile market will be of wider consequence.

There’s a very real possibility that Arc could lead to the wider proliferation of dedicated graphics in lighter and more affordable laptops, especially as competition leads to better performance.

The most impressive thing about Intel’s Arc launch may be how many manufacturers have already signed up to take advantage of the new graphics chipset, especially in this performance category. While gaming laptops with powerful GPUs are a dime a dozen, there’s traditionally been a massive performance gap between mainstream lightweight laptops and those gaming models.

AMD has tried to fill some of that gap, but Intel’s market dominance will likely help the company push for more devices with Arc graphics. The first Arc-enabled devices will start at $899, suggesting discrete graphics in laptops are about to become a good deal more common.

One thing’s for sure: the GPU industry has never been so exciting.

 
