What Do High-End Graphics Cards Cost In Terms Of Electricity?
Many reviews analyze the minimum and maximum power consumption of a given graphics card. But just how much power does a high-end graphics card really need during the course of standard operation? This long-term test sheds some light on that question.
Conclusion And Summary
Splurging on the best hardware every now and then is fine, but that kind of self-indulgence, combined with heavy utilization, is bad news for the electricity bill. An enthusiast who enjoys graphical delicacies in limited doses can stay on solid financial footing, but the hardcore gamer whose card runs at full load for hours every day will feel the punishment from the electricity company.
The average user should stay away from high-end graphics cards altogether: even though such products raise the ceiling of what can be rendered, the cards sit underutilized most of the time. And while overall power consumption stays relatively low thanks to all that idle time, the money spent powering an oversized card is still wasted.
Those who decide to get a high-performance graphics card should be aware of the added costs. It is just like buying a car: even if you can just barely afford a Porsche, you still have to pay for tires, gas, insurance, and taxes. And if that stretches the budget, driving is less fun.
The differences between the various card categories are extreme at intensive utilization levels. With this brief analysis, we want to give you a starting point for balancing the performance you actually need against the running costs that power consumption adds. Above all, gamers should avoid pointlessly oversized graphics cards; in the end, that only benefits the electricity companies. So think and calculate before you buy!
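The "think and calculate" advice above boils down to simple arithmetic: energy in kilowatt-hours is load wattage times hours of use, divided by 1,000, and the yearly cost is that figure times the price per kilowatt-hour. Here is a minimal sketch of that calculation; the wattage, hours, and price used in the example are illustrative assumptions, not measurements from this article.

```python
def annual_energy_cost(load_watts, hours_per_day, price_per_kwh):
    """Estimate the yearly electricity cost of a component that draws
    load_watts for hours_per_day, every day, at price_per_kwh."""
    kwh_per_year = load_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# Assumed example: a card drawing ~250 W in games, played 8 hours a day,
# at $0.13 per kWh.
cost = annual_energy_cost(250, 8, 0.13)
print(f"${cost:.2f} per year")  # prints "$94.90 per year"
```

Plugging in your own card's measured load draw, your real hours of play, and your local tariff gives the figure worth comparing against a cheaper card before buying.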
damric: I don't get it. Are they saying that a GTX 480 will cost a hardcore gamer $90/year in electricity? Seems like a drop in the bucket considering my power bills are over $90/month in the winter and over $250/month in the summer. Just think of all the money the hardcore gamer saves from not having a girlfriend :D
scook9: They are also neglecting the positive side effects, like not needing a space heater in the winter... you recoup a lot of energy right there :D
porksmuggler: ^Tell me about it, warmest room in the house right here. Turn the thermostat down, and boot the rig up.

Typo on the enthusiast graph. Calculations are correct, but it should be 13ct/kWh, not 22ct/kWh.
aznshinobi: The fact that you mentioned a Porsche... no matter what the context, I love that you mentioned it :D
AMW1011: So at worst, my GTX 480 is costing me $90 a year? Sorry if I'm not alarmed...

Also, I can't imagine having 8 hours of gaming time every day. Even 5 hours seems extreme. Sometimes you just can't game AT ALL in a day, or a week.

Some people do have lives...
nebun (replying to alikum, "Nvidia cards consume power like crazy"): Who cares... if you have the money to buy them, you can pay for the electricity. It's just like SUVs: if you have the money to buy them, you can keep them running.
nebun (replying to AMW1011): I run my 480 SLI rig to fold almost 24/7... do I care about my bill? HELL NO