Nvidia Responds to AMD's Claim of PhysX Failure
AMD accuses Nvidia of disabling multi-core CPU support in the PhysX API -- Nvidia says it's untrue.
With PhysX being an Nvidia property, there are obvious reasons why AMD wouldn't be first in line to sing the praises of that specific proprietary physics technology.
Earlier this month, AMD worldwide developer relations manager Richard Huddy said in an interview with Bit-tech that Nvidia is squandering CPU resources.
"The other thing is that all these CPU cores we have are underutilised and I'm going to take another pop at Nvidia here. When they bought Ageia, they had a fairly respectable multicore implementation of PhysX. If you look at it now it basically runs predominantly on one, or at most, two cores," said Huddy. "It's the same thing as Intel's old compiler tricks that it used to do; Nvidia simply takes out all the multicore optimisations in PhysX. In fact, if coded well, the CPU can tackle most of the physics situations presented to it."
We asked Nvidia for its response to the allegations made by AMD. Nadeem Mohammad, PhysX director of product management, stepped up to the mic in hopes of setting the record straight:
I have been a member of the PhysX team, first with Ageia and then with Nvidia, and I can honestly say that since the merger with Nvidia there have been no changes to the SDK code which purposely reduce the software performance of PhysX or its use of CPU multi-cores.

Our PhysX SDK API is designed such that thread control is done explicitly by the application developer, not by the SDK functions themselves. One of the best examples is 3DMark Vantage, which can use 12 threads while running in software-only PhysX. This can easily be tested by anyone with a multi-core CPU system and a PhysX-capable GeForce GPU. This level of multi-core support and programming methodology has not changed since day one. And to anticipate another ridiculous claim, it would be nonsense to say we “tuned” PhysX multi-core support for this case.

PhysX is a cross-platform solution. Our SDKs and tools are available for the Wii, PS3, Xbox 360, the PC and even the iPhone through one of our partners. We continue to invest substantial resources into improving PhysX support on ALL platforms -- not just for those supporting GPU acceleration.

As is par for the course, this is yet another completely unsubstantiated accusation made by an employee of one of our competitors. I am writing here to address it directly and call it for what it is: completely false. Nvidia PhysX fully supports multi-core CPUs and multithreaded applications, period. Our developer tools allow developers to design their use of PhysX in PC games to take full advantage of multi-core CPUs and to fully use the multithreaded capabilities.
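To make the model Mohammad describes more concrete (the application, not the SDK, owns thread control), here is a minimal C++ sketch. PhysicsScene and stepSimulation are hypothetical stand-ins rather than actual PhysX SDK types or calls; the sketch only shows how a game could spread software-physics work across as many cores as it chooses.

```cpp
// Illustrative sketch only: PhysicsScene and stepSimulation() are hypothetical
// stand-ins, not actual PhysX SDK calls. The point is the pattern Mohammad
// describes: the application owns the threads, so it decides how many CPU
// cores the software-physics workload is spread across.

#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct PhysicsScene { /* opaque per-scene simulation state (hypothetical) */ };

// Stub standing in for one software-physics step on a single scene.
void stepSimulation(PhysicsScene& scene, float dt)
{
    (void)scene;
    (void)dt;
    // A real engine would integrate rigid bodies, resolve contacts, etc.
}

// Application-side threading: the game, not the physics SDK, decides how
// many worker threads call into the physics step each frame.
void stepAllScenes(std::vector<PhysicsScene>& scenes, float dt)
{
    const std::size_t workers =
        std::max<std::size_t>(1, std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    for (std::size_t t = 0; t < workers; ++t) {
        pool.emplace_back([&scenes, dt, t, workers] {
            // Static partition: worker t steps every workers-th scene.
            for (std::size_t i = t; i < scenes.size(); i += workers)
                stepSimulation(scenes[i], dt);
        });
    }
    for (auto& th : pool)
        th.join(); // all physics finished before the frame is rendered
}

int main()
{
    std::vector<PhysicsScene> scenes(12);  // e.g. twelve independent simulation islands
    stepAllScenes(scenes, 1.0f / 60.0f);   // one 60 Hz frame's worth of physics
}
```

Under a model like this, how well a title scales across cores depends on how the developer partitions the work, which is exactly where Huddy and Nvidia disagree about who is responsible for the largely single-threaded behavior players observe.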
randomizer: Quoting Nadeem Mohammad: "And to anticipate another ridiculous claim, it would be nonsense to say we “tuned” PhysX multi-core support for this case."
Oh absolutely, nonsense indeed. :sarcastic: In fact it's such utter nonsense that he won't even bother to provide evidence to substantiate any of his claims - which, funnily enough, is what he accused Richard Huddy of.
FoShizzleDizzle: Not to take sides here, as I own an Nvidia card fwiw, but I came to the same conclusion as Richard Huddy before ever knowing he made this statement. It struck me when toying around with PhysX on Batman Arkham Asylum.
I disabled GPU PhysX and let the CPU handle it just to see how it performed. Strangely, my CPU usage barely increased at all and framerates suffered immensely as a result - the same thing reportedly occurs with ATI cards.
The physics being calculated in this application are not particularly intensive from a visual standpoint, especially not compared to, say, what GTA IV does (which relies solely on the CPU). They are just terribly optimized and, by my estimation, intentionally gimped when handled by the CPU.
Anyone can connect the dots and understand why this is so. It's just stupid, because I bet a quad-core CPU, or even a triple-core, paired with say a measly 9800 GT could max out PhysX and the in-game settings if the CPU handled the PhysX without being gimped. But since it is gimped, owners of such a card pretty much cannot run PhysX.
demosthenes81: If game developers added true multicore support in the first place, I bet this would never even have come up. Even the newest games like Borderlands have bad multicore support. I know almost nobody with a single-core CPU these days; the devs need to step up.
Honis: Quoting randomizer: "Oh absolutely, nonsense indeed. In fact it's such utter nonsense that he won't even bother to provide evidence to substantiate any of his claims - which, funnily enough, is what he accused Richard Huddy of."
I think the Batman Arkham Asylum benchmarks are evidence enough that something fishy is going wrong in Nvidia's APIs.
http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html
Murissokah: The response sounded quite well-founded. I don't think Nvidia is to blame on this one.
porksmuggler: My first thought is that PhysX has been a market failure since its Ageia days. Nvidia is just using this proprietary gimmick to hawk more GPUs. I was stunned when Nvidia bought Ageia, but I guess the price was right, and their in-house development was lagging. The list of games using PhysX is just sad, and the performance hit with PhysX enabled is rough. Makes you wonder how big a carrot Nvidia has to dangle out there to get developers to bite.
randomizer: Quoting Honis: "I think the Batman Arkham Asylum benchmarks are evidence enough that something fishy is going wrong in Nvidia's APIs. http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html"
Oh I wasn't doubting that at all. My post was meant to have a sarcastic tone, but text doesn't convey sarcasm well. I'll have to fix it up.
EDIT: A smilie makes all the difference :D
mlopinto2k1: Quoting randomizer: "Oh absolutely, nonsense indeed. In fact it's such utter nonsense that he won't even bother to provide evidence to substantiate any of his claims - which, funnily enough, is what he accused Richard Huddy of."
Funnily? :P