First Intel Core 2 and Nvidia RTX 50 gaming experiments disappoint
The system played nicely, but the ‘majority of games’ with RT fell flat.

Earlier in the week, we reported on an Nvidia driver change that opened up crazy new possibilities for PC DIYers. However, the dreams of Intel Core 2 system builders enjoying outlandish high jinks with Nvidia RTX 50 GPUs have now partially evaporated. X (formerly Twitter)-based tech enthusiast Bob Pony, who first surfaced the driver change, is back, but their tales of “struggles” aren’t exactly the news we wanted to hear.
To recap, the latest Nvidia GeForce driver re-enabled support for Intel processors dating back to the Core 2 era, as it no longer requires CPU support for the POPCNT instruction. That is all well and good, and Pony took to X yesterday to “happily confirm that it's possible to use an NVIDIA RTX 50 series graphics card in an old system such as an Intel Core 2 Quad.” Specifically, they partnered their old Core 2 Quad Q9450 with an unashamedly modern RTX 5060 Ti. “It works!” Pony celebrated, and tipped off followers to avoid trying the same with an RTX 5060 non-Ti due to its PCIe x8 interface.
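For context, POPCNT (population count) arrived on Intel CPUs with Nehalem, alongside SSE4.2, so Penryn-era Core 2 chips like the Q9450 top out at SSE4.1 and lack it. You can check which of these feature flags your own CPU exposes with a minimal, Linux-only Python sketch — it simply reads the kernel's flag list from /proc/cpuinfo (flag names are as the kernel reports them, e.g. `popcnt`, `sse4_2`, `avx`):

```python
def cpu_flags():
    """Return the CPU feature-flag set the Linux kernel reports
    for the first core listed in /proc/cpuinfo."""
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()  # no x86-style "flags" line found (e.g. non-x86 CPU)

flags = cpu_flags()
for feat in ("popcnt", "sse4_2", "avx"):
    print(f"{feat}: {'yes' if feat in flags else 'no'}")
```

On any CPU from the last decade or so, all three should report "yes"; on a Core 2 Quad, all three would come back "no".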
What happened next is that Pony quickly went from fiddling around in Windows 11 to trying to get some modern games running on their May-to-December combo.
The struggles of using a Core 2 Quad paired with NVIDIA RTX 5060 Ti... can't play majority of games that use ray tracing due to the processor lacking some instruction sets required for the game to run. 🫠 pic.twitter.com/XcwZFxhHXS
May 17, 2025
Things didn’t go great, and a few hours ago, Pony returned to social media with a less chirpy Tweet. Their key finding was that they “can't play [the] majority of games that use ray tracing due to the processor lacking some instruction sets required for the game to run.” The included screenshot showed evidence of Quake II RTX falling on its face.
So, ray-traced games are out of the question with these lopsided systems, but plenty of others may still deliver decent performance. It would be great if social media communities could start putting together a compatibility list. In the meantime, perhaps we can keep an eye on Pony's social media posts.
We might also find that Intel Core 2 system owners are more attracted to older titles, which their previous GPU wasn’t that great at accelerating. If this were the case, an RT gaming roadblock wouldn’t be perceived as a significant loss.

Mark Tyson is a news editor at Tom's Hardware. He enjoys covering the full breadth of PC tech; from business and semiconductor design to products approaching the edge of reason.
-
BFG-9000 Well whaddya know, the oft-forgotten, problematic nForce 790i Ultra SLI may be finicky with memory, not overclock very well, and have severe SATA data corruption issues, but it can actually POST using a modern nVidia GPU with hybrid vBIOS!
It's too bad the POPCNT requirement also applies to Windows 11 24H2 and later, so no still-supported version of Windows will work on such a system after November, but perhaps such a configuration can be made to work in Linux after that. -
Energy96 While a fun and interesting experiment, it's not really that useful and the results are not really surprising.
Anyone affording a high-end 50- or 40-series GPU can afford a good current-gen CPU anyway.
Fun read though. -
HardwiredWireless First they complain that the ultra 285 is too slow so they try using a core 2 duo. What a fool. You have to laugh at such ignorance.Reply -
DavidLejdar
Energy96 said:
While a fun and interesting experiment, it's not really that useful and the results are not really surprising.
Anyone affording a high-end 50- or 40-series GPU can afford a good current-gen CPU anyway.
Fun read though.
Depends. On a budget, it is a valid question whether it is time to upgrade the CPU and motherboard, or whether one can go for a higher GPU tier instead, like a 5070 instead of a 5060 Ti.
In this case, RT not working sounds like a knock-out for newer games, so a CPU upgrade is needed. But when someone is still on a later model, what's the FPS difference between pairing the GPU with, e.g., a 4570 versus a modern CPU? Or, in far more cases, DDR4-era CPUs: they may still be good enough to run, e.g., a 5070 Ti. But how good, actually? If such rigs happen to be CPU-bound, how big is the impact: just a few frames, or only half the frame rate?
Personally, I'd like more data on that, to get a clearer picture of which CPUs can still serve as a good base for a gaming rig with a modern GPU. Not that I would start a business with it straight away, as in buying GPU-less second-hand rigs, putting a modern GPU in, and selling the result as a half-new gaming rig for not much more than the retail price of the GPU itself. But it seems like a relevant question in general: which CPUs are no longer fast enough when planning to upgrade the GPU? -
usertests
HardwiredWireless said:
First they complain that the ultra 285 is too slow so they try using a core 2 duo. What a fool. You have to laugh at such ignorance.
No Fun Allowed -
8086
DavidLejdar said:
Depends. On a budget, it is a valid question whether it is time to upgrade the CPU and motherboard, or whether one can go for a higher GPU tier instead, like a 5070 instead of a 5060 Ti.
In this case, RT not working sounds like a knock-out for newer games, so a CPU upgrade is needed. But when someone is still on a later model, what's the FPS difference between pairing the GPU with, e.g., a 4570 versus a modern CPU? Or, in far more cases, DDR4-era CPUs: they may still be good enough to run, e.g., a 5070 Ti. But how good, actually? If such rigs happen to be CPU-bound, how big is the impact: just a few frames, or only half the frame rate?
Personally, I'd like more data on that, to get a clearer picture of which CPUs can still serve as a good base for a gaming rig with a modern GPU. Not that I would start a business with it straight away, as in buying GPU-less second-hand rigs, putting a modern GPU in, and selling the result as a half-new gaming rig for not much more than the retail price of the GPU itself. But it seems like a relevant question in general: which CPUs are no longer fast enough when planning to upgrade the GPU?
It's a good technical lesson for those less informed about PCs on how CPU choice can bottleneck and render even the most expensive GPUs almost useless. -
hush now
JohnyFin said:
This is again stupid test from TikTock environment, click byte crap. Senseless.
ok sir, let's get you to bed. It's funny saying something like this when it was a Gen X YouTuber who did the experiment, and techies have been doing these kinds of tests, mixing hardware from different generations, since before the internet. So it's absolutely not a "TikTock environment" thing.