Intel is looking into CPU overhead associated with Arc GPUs on older chips

[Image: ASRock Arc B580 Steel Legend (Image credit: Tom's Hardware)]

The Intel Arc B580 is one of the cheapest modern GPUs on the market today, with Tom's Hardware crowning it the $249 GPU champion. However, several reviewers have discovered that the budget GPU performs poorly when paired with older processors. Intel has acknowledged the issue and says it will investigate.

A moderator on the Intel Community forum started an “Intel Arc Graphics and CPU Overhead” thread, saying, “Thank you for your patience. We are aware of reports of performance sensitivity in some games when paired with older generation processors. We have increased our platform coverage to include more configurations in our validation process, and we are continuing to investigate optimizations.”

Reviewers first noticed the issue in early January, when they tested the Intel Arc B580 on both older and newer CPUs. For example, Hardware Unboxed ran gaming benchmarks with the GPU on an AMD Ryzen 7 9800X3D and a Ryzen 5 2600, using an Nvidia GeForce RTX 4060 as a control. Its results are below.

Game Title | Arc B580 + Ryzen 7 9800X3D (FPS) | RTX 4060 + Ryzen 7 9800X3D (FPS) | % Difference | Arc B580 + Ryzen 5 2600 (FPS) | RTX 4060 + Ryzen 5 2600 (FPS) | % Difference
Warhammer 40,000: Space Marine 2 | 62 | 74 | 16.22% | 31 | 52 | 40.38%
Rainbow Six Siege | 240 | 309 | 22.33% | 212 | 223 | 4.93%
Hogwarts Legacy | 71 | 70 | -1.43% | 34 | 52 | 34.62%
Starfield | 40 | 50 | 20.00% | 24 | 44 | 45.45%
Marvel's Spider-Man Remastered | 152 | 127 | -19.69% | 46 | 78 | 41.03%
Average | 113 | 126 | 10.32% | 69.4 | 89.8 | 22.72%

The Intel Arc B580 performs respectably when paired with the Ryzen 7 9800X3D, with the RTX 4060 outperforming it by around 10% on average. However, the gap between the two entry-level desktop GPUs widens considerably with the much older Ryzen 5 2600: the RTX 4060's advantage grows to nearly 23%, and some games drop to an unplayable 30 FPS or lower on the B580. Note that these figures don't include 1% lows, which are much worse for the B580.
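For reference, each % Difference column in the table above expresses the B580's deficit relative to the RTX 4060 on the same CPU (negative values mean the B580 is ahead). Here is a minimal Python sketch of that arithmetic, reproducing the table's Average row from Hardware Unboxed's numbers; the variable and function names are ours, purely illustrative:

```python
# Average FPS from Hardware Unboxed's tests: (Arc B580, RTX 4060) per CPU.
results = {
    "Warhammer 40,000: Space Marine 2": {"9800X3D": (62, 74), "2600": (31, 52)},
    "Rainbow Six Siege": {"9800X3D": (240, 309), "2600": (212, 223)},
    "Hogwarts Legacy": {"9800X3D": (71, 70), "2600": (34, 52)},
    "Starfield": {"9800X3D": (40, 50), "2600": (24, 44)},
    "Marvel's Spider-Man Remastered": {"9800X3D": (152, 127), "2600": (46, 78)},
}

def pct_difference(b580_fps: float, rtx4060_fps: float) -> float:
    """B580's deficit relative to the RTX 4060; negative means the B580 leads."""
    return (rtx4060_fps - b580_fps) / rtx4060_fps * 100

for cpu in ("9800X3D", "2600"):
    b580_avg = sum(fps[cpu][0] for fps in results.values()) / len(results)
    rtx_avg = sum(fps[cpu][1] for fps in results.values()) / len(results)
    # Prints a 10.32% deficit on the 9800X3D and 22.72% on the Ryzen 5 2600,
    # matching the table's Average row.
    print(f"{cpu}: B580 {b580_avg:.1f} vs RTX 4060 {rtx_avg:.1f} FPS "
          f"-> {pct_difference(b580_avg, rtx_avg):.2f}% deficit")
```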

Hopefully, Intel can find a fix for this issue sooner rather than later, especially as it affects the core market of the Intel Arc B580: budget gamers. You might expect that buying the B580 to replace your old GTX 1060, which was still the most popular GPU on Steam as late as 2022, would finally let you play some newer games. But this overhead issue may leave you disappointed, as it can keep some titles from running well unless you also upgrade your CPU.

This might not be too big of a problem if you already have an AM4 motherboard, especially as AMD is still releasing new chips for this socket in 2025. But if you have an AM3+ or older motherboard (or an older Intel platform), you'll likely need to buy a new CPU and motherboard, increasing the cost of upgrading your system to play the latest titles. And with tariffs driving up prices across the board, that could put newer games out of reach entirely.


Jowi Morales
Contributing Writer

Jowi Morales is a tech enthusiast with years of experience in the industry. He has written for several tech publications since 2021, focusing on tech hardware and consumer electronics.

  • watzupken
    There are two things that Intel needs to do for their GPUs:
    1. Optimize for older CPUs, and
    2. Improve performance consistency - I think it is evident that the average and 1% low FPS fluctuate a lot more on Intel GPUs. So while the GPU appears to be doing well when looking at average FPS, the user experience can be more stuttery than I'd like.

    It is still a great-value dGPU, but it needs further refinement. I do hope that they can mitigate these two issues, mostly through future driver updates.
    Reply
  • TerryLaze
    watzupken said:
    There are two things that Intel needs to do for their GPUs:
    1. Optimize for older CPUs, and
    Isn't the number one complaint about x86 from many people that the legacy support is holding it back and making it inefficient?!
    Why would we want that for GPUs as well?!
    Older CPUs (the ones that will be able to handle this tier of GPU) will be phased out fast enough that it will not affect them that much.
    Reply
  • Notton
    It'd be nice if Intel launched the B380 and B310. There's zero stock of those in my local stores, and the B580/570 carry an unreasonable price premium.
    I just need something with four display outputs and AV1 decode/encode, which my R7 5800X3D does not offer.
    Reply
  • rluker5
    TechYesCity tested this with a 10400T against both comparable Nvidia and AMD GPUs. Nvidia did better, but the RX 7600 was within 1% of the B580's performance differential from the 9800X3D. I don't think the difference in either case was more than 10%, so this is apparently an AMD CPU issue more than a B580 issue; if it were an Arc issue, it would also have to be a Radeon issue.

    Nvidia did a good thing with their driver-based CPU load distribution back in the Kepler days.
    Reply
  • thestryker
    rluker5 said:
    TechYesCity tested this with a 10400T against both comparable Nvidia and AMD GPUs. Nvidia did better, but the RX 7600 was within 1% of the B580's performance differential from the 9800X3D. I don't think the difference in either case was more than 10%, so this is apparently an AMD CPU issue more than a B580 issue; if it were an Arc issue, it would also have to be a Radeon issue.
    It's not an AMD CPU issue at all; it's a problem with the drivers, the design, or a combination of both on Intel Arc cards. It doesn't appear in every game, and it only appears when CPU-limited. When HUB originally contacted Intel about it, Intel confirmed the findings.
    Reply
  • thestryker
    watzupken said:
    There are two things that Intel needs to do for their GPUs:
    1. Optimize for older CPUs, and
    2. Improve performance consistency - I think it is evident that the average and 1% low FPS fluctuate a lot more on Intel GPUs. So while the GPU appears to be doing well when looking at average FPS, the user experience can be more stuttery than I'd like.

    It is still a great-value dGPU, but it needs further refinement. I do hope that they can mitigate these two issues, mostly through future driver updates.
    There's no such thing as optimizing for older CPUs here. The problem this article is referring to is Arc cards creating a CPU bottleneck where other cards do not. I would not be surprised if the performance consistency issues were related to whatever the cause of this is.

    edit: I'm also not sure it's a problem that can be entirely fixed, as they likely would have already done so. This may be similar to the design issues on Alchemist which held back performance. If it cannot be fixed, the fix would perhaps have to wait for Celestial (maybe even Druid, given it has been a known issue since Alchemist).
    Reply
  • rluker5
    thestryker said:
    It's not an AMD CPU issue at all it's a problem with the drivers, design or combination of both on Intel Arc cards. It doesn't appear in every game, and only appears when CPU limited. When HUB originally contacted Intel about it they confirmed the findings.
    HUB is trying too hard to create a narrative. Gamers Nexus did not find a significant difference between the 9800X3D, 12400, and 5600X when Battlemage cards were GPU-limited (as they will be almost all of the time), except for the occasional shortcomings of the 5600X: https://youtu.be/m9uK4D35FlM?t=1034
    Tech Yes City found what I mentioned in my previous post: https://youtu.be/mVC2eP9xmXQ?t=316
    If you take in all of the testing to see the whole picture, then the worst performance drops come from older Ryzen CPUs compared to the fastest CPUs on Arc GPUs, and, when tested, AMD GPUs with budget Intel CPUs show an equivalent dropoff to Arc.

    The presented evidence supports this if you include all reputable sources that have first-hand testing.
    The existence of that lopsided evidence is largely due to HUB cherry-picking the worst-case Battlemage performance drops with weaker CPUs, and those worst cases happened to be with Ryzen chips only, and only with Arc compared to Nvidia, not Radeon.

    HUB has all sorts of evidence of just how poorly older Ryzen chips perform in gaming when they are not helped by Nvidia's driver-based enhanced multithreading. But there's no evidence that the more common older budget Intel CPUs suffer from the same problem with Arc any more than they would when paired with a Radeon. It would be a more compelling argument that it is the fault of Arc and not Ryzen if they could show that Arc is the culprit and not Ryzen, don't you think? But they can only show the bad performance when the two are paired in scenarios specifically taxing on the Ryzen.

    Do I think Intel GPUs fare worse with cheap CPUs than Radeon? Yes, but the difference is likely nowhere near what HUB or others mining AMClicks insinuate, and it affects users less than implied because budget GPUs are generally run in heavily GPU-limited scenarios. I also think it is fitting that HUB is making the case that old Ryzen chips game poorly in their attempt to portray Arc in a bad light. But they have gotten many more clicks than they would have by showing the more boring big picture.
    Reply
  • thestryker
    rluker5 said:
    HUB is trying too hard to create a narrative. Gamers Nexus did not find a significant difference between the 9800X3D, 12400, and 5600X when Battlemage cards were GPU-limited (as they will be almost all of the time), except for the occasional shortcomings of the 5600X: https://youtu.be/m9uK4D35FlM?t=1034
    Tech Yes City found what I mentioned in my previous post: https://youtu.be/mVC2eP9xmXQ?t=316
    If you take in all of the testing to see the whole picture, then the worst performance drops come from older Ryzen CPUs compared to the fastest CPUs on Arc GPUs, and, when tested, AMD GPUs with budget Intel CPUs show an equivalent dropoff to Arc.

    The presented evidence supports this if you include all reputable sources that have first-hand testing.
    The existence of that lopsided evidence is largely due to HUB cherry-picking the worst-case Battlemage performance drops with weaker CPUs, and those worst cases happened to be with Ryzen chips only, and only with Arc compared to Nvidia, not Radeon.

    HUB has all sorts of evidence of just how poorly older Ryzen chips perform in gaming when they are not helped by Nvidia's driver-based enhanced multithreading. But there's no evidence that the more common older budget Intel CPUs suffer from the same problem with Arc any more than they would when paired with a Radeon. It would be a more compelling argument that it is the fault of Arc and not Ryzen if they could show that Arc is the culprit and not Ryzen, don't you think? But they can only show the bad performance when the two are paired in scenarios specifically taxing on the Ryzen.

    Do I think Intel GPUs fare worse with cheap CPUs than Radeon? Yes, but the difference is likely nowhere near what HUB or others mining AMClicks insinuate, and it affects users less than implied because budget GPUs are generally run in heavily GPU-limited scenarios. I also think it is fitting that HUB is making the case that old Ryzen chips game poorly in their attempt to portray Arc in a bad light. But they have gotten many more clicks than they would have by showing the more boring big picture.
    GPU-limited testing will obviously never show a CPU limit. Testing solely this way is disingenuous at best, because games change based on where you are.

    Starfield is a great example: most places are never going to be CPU-limited, but the ones that are can absolutely tank performance without enough CPU. If you're using an Arc GPU, this sort of circumstance will be significantly worse. If you look at the Hardware Canucks video, they showed 9th-gen Intel CPUs experiencing the exact same problems.

    The truth is very simple: there's a CPU overhead problem with Arc GPUs. It won't always be a problem, but when it is, it can absolutely tank performance. Trying to make it out to be overblown or an AMD problem is a really bad look, dismissing valid criticism of the GPUs.
    Reply
  • TerryLaze
    thestryker said:
    Testing solely this way is disingenuous

    If you look at the Hardware Canucks video, they showed 9th-gen Intel CPUs experiencing the exact same problems.
    They have the same bad testing as HUB: they compare an X3D CPU to a non-X3D CPU, and it doesn't matter if the non-X3D one is AMD or Intel.
    We all know that Intel GPUs need ReBAR to work well, and ReBAR is a way to move data between the CPU and GPU faster, something you would think a large cache would help immensely.
    They don't actually show CPU overhead, because for that they would have to test on the exact same CPU with cores disabled or clocked down; that would show whether it's really CPU overhead. Removing the cache only shows that cache helps ReBAR.
    Reply
  • thestryker
    TerryLaze said:
    They have the same bad testing as HUB: they compare an X3D CPU to a non-X3D CPU, and it doesn't matter if the non-X3D one is AMD or Intel.
    We all know that Intel GPUs need ReBAR to work well, and ReBAR is a way to move data between the CPU and GPU faster, something you would think a large cache would help immensely.
    They don't actually show CPU overhead, because for that they would have to test on the exact same CPU with cores disabled or clocked down; that would show whether it's really CPU overhead. Removing the cache only shows that cache helps ReBAR.
    Go look at the Tom's Hardware review: Jarred tested with both the 13900K and the 9800X3D, and there's functionally no difference. If the cache somehow made the B580 faster, it would show up there too.

    No matter what, though, the bottom line is that Intel confirmed the findings, telling HUB they saw the same results. Intel has also now put out this post clearly indicating there are issues. This is not some imaginary problem; it's very real. It's not a huge problem and would never stop me from buying an Arc card for a lower-end system (so long as it had ReBAR support), but it's something very valuable to be aware of.
    Reply