There’s one thing we all know about integrated graphics: they suck. They’re usually barely powerful enough to play modern games, or even some older ones. But Intel is trying to make them suck significantly less with its ‘Gen11’ series of GPUs next year.
Intel seems to have skipped a number, considering we’re currently at Gen9, but the jump is indicative of the performance boost the company is going for. Long story short, the new GPUs have more than twice as many execution units as their predecessors, going from 24 to 64. They would be the first Intel GPUs to exceed one teraflop of performance.
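To see how the EU count maps to that teraflop figure, here's a back-of-the-envelope sketch. It assumes (these numbers are not from the article) that each Gen-architecture EU has two 4-wide FP32 units and that a fused multiply-add counts as two operations, i.e. 16 FLOPS per EU per clock; the clock speeds are illustrative placeholders.

```python
# Rough peak FP32 throughput estimate for an Intel-style iGPU.
# Assumption (not from the article): 2 FPUs x SIMD-4 x 2 ops (FMA)
# = 16 FLOPS per execution unit per clock.
FLOPS_PER_EU_PER_CLOCK = 2 * 4 * 2

def peak_tflops(execution_units: int, clock_ghz: float) -> float:
    """Peak single-precision throughput in teraflops."""
    return execution_units * FLOPS_PER_EU_PER_CLOCK * clock_ghz / 1000

# 24 EUs (Gen9-style) at a hypothetical 1.1 GHz vs. 64 EUs at 1.0 GHz:
gen9_estimate = peak_tflops(24, 1.1)    # ≈ 0.42 TFLOPS
gen11_estimate = peak_tflops(64, 1.0)   # ≈ 1.02 TFLOPS
```

Under those assumptions, 64 EUs clear the 1 TFLOP mark even at a modest 1 GHz, which is consistent with the claim above.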
Current Iris GPUs can get near that mark, but they’re still relatively rare on mainstream laptops. On the other hand, the new components will only bring Intel level with AMD, which has offered 1TFLOP+ performance on some of its iGPUs for months.
The Gen11 chips will also use tile-based rendering, dividing scenes into tiles so that portions can be rendered individually. The main benefit is saving on memory bandwidth, which is at a premium with integrated components. The new GPUs will also support HDR and Adaptive Sync.
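The idea behind tile-based rendering can be sketched in a few lines. This is an illustrative toy, not Intel's actual pipeline: the screen is split into small tiles, each tile is shaded in a tiny local buffer (standing in for fast on-chip memory), and the finished tile is written out to the framebuffer in one pass instead of scattering writes across the whole frame.

```python
# Toy sketch of tile-based rendering (illustrative, not Intel's pipeline).
WIDTH, HEIGHT, TILE = 256, 128, 32

def render_pixel(x, y):
    # Stand-in for real shading work.
    return (x ^ y) & 0xFF

def render_tiled():
    framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]
    for ty in range(0, HEIGHT, TILE):
        for tx in range(0, WIDTH, TILE):
            # Shade the whole tile in a small local buffer...
            tile = [[render_pixel(tx + x, ty + y) for x in range(TILE)]
                    for y in range(TILE)]
            # ...then flush it to (slow, bandwidth-limited) memory once.
            for y in range(TILE):
                framebuffer[ty + y][tx:tx + TILE] = tile[y]
    return framebuffer
```

Because all the intermediate reads and writes for a tile stay in local storage, main memory only sees one bulk write per tile, which is where the bandwidth saving comes from.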
Unfortunately, the company didn’t provide any direct performance comparisons with its previous generation, so we’ll have to wait for benchmarks to see how much of a leap it truly is.
Of course, if you’re on a desktop you can just get a dedicated graphics card, but that’s usually not an option on the laptops most of these GPUs are installed in. Even with the advent of eGPUs via Thunderbolt 3, adding external graphics is prohibitively expensive, and not all laptops support Thunderbolt 3. Besides, it’s not like you’re going to carry an eGPU with you everywhere you take your laptop.
Point is, better integrated graphics are more than welcome. They’re not going to turn your laptop into a serious gaming machine, but at least some modern games will be playable at low-to-medium settings.
That said, Intel also reiterated its plans to release its first dedicated GPU in two decades. The new hardware – dubbed Intel Xe – is supposed to be competitive with current offerings from AMD and Nvidia, but we’ll have to wait until 2020 to hear more.
Published December 12, 2018 — 18:37 UTC