6

I currently own a Radeon HD 5570 graphics card, which supposedly consumes very little power. The problem is that it stutters during some games and movies. My machine is an Intel quad-core i5-750 with 4 GB of RAM running Windows 7, using a 550 W Antec Basiq PSU.

I plan to upgrade to the Nvidia GTS 250 card, which requires a 450 W minimum PSU.

Would I see any substantial difference in my monthly electricity bill? I mostly use my computer for browsing the web, programming, and watching movies; maybe 5%-10% of the time I spend playing games.

I reside in NYC; according to http://michaelbluejay.com/electricity/cost.html, my cost per kilowatt-hour is 14¢.

wonea
  • 1,877
TheOne
  • 1,405

3 Answers

3

After upgrading your power supply, your computer will continue to draw (almost) the same amount of power as it did before. The only additional power draw will be from the requirements of the new video card (plus a bit more for the internal losses of the larger PSU).

Keep in mind that a 450W PSU does not draw 450 watts of power.

Your computer only draws as much power as it needs. In fact, the highest power demand occurs when you first turn on your PC and all the hard drives, DVD/CD drives, fans, etc. spin up. The higher-capacity PSU has to handle that startup load, which can be several times the power demand of a steady-running PC.

Take a look at the technical specifications of the two video cards. They should tell you the peak power requirements. Subtract one from the other and that will give you the maximum additional power demands of the system. Remember, computer components don't always run at their peak capacity, so what you are calculating is an unlikely, worst-case scenario.


By the Numbers

  • ATI Radeon™ HD 5570 - Maximum board power: 45 Watts
  • Nvidia GeForce GTS 250 - Maximum Graphics Card Power: 150 Watts

Difference: 105 Watts (1.47 cents (US)/hour maximum, pushing the card to maximum usage)
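
If you want to plug in your own numbers, here is a quick Python sketch of that arithmetic; the 4 hours per day of full-load GPU use is just an assumption for illustration, not something from the question:

    # Rough worst-case estimate of the extra electricity cost of swapping
    # the HD 5570 (45 W max) for a GTS 250 (150 W max) at 14 cents/kWh.
    old_card_watts = 45      # Radeon HD 5570 maximum board power
    new_card_watts = 150     # GeForce GTS 250 maximum card power
    rate_per_kwh = 0.14      # NYC rate in dollars per kWh

    extra_kw = (new_card_watts - old_card_watts) / 1000   # 0.105 kW
    cost_per_hour = extra_kw * rate_per_kwh                # ~$0.0147/hour

    hours_per_day = 4  # assumed full-load hours per day, for illustration only
    print(f"Extra cost per hour at full load: {cost_per_hour * 100:.2f} cents")
    print(f"Extra cost per month (~{hours_per_day} h/day): "
          f"${cost_per_hour * hours_per_day * 30:.2f}")

At those assumptions the worst case works out to well under $2 a month.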

2

Well, running a 450 W PSU for one hour at 100% load would use 0.45 kWh.

If you pay 14¢ per kWh, then that will cost 6.3¢ for every hour of use.

I don't know what PSU you have at the moment, but for argument's sake, let's say it's a 250 W one.

For that, it would cost 3.5¢ an hour.

So, the increase per hour would be 2.8¢. Let's say you use the computer for 40 hours a week, then that would cost you an extra $1.12 a week.

And that's assuming you run the PSU at 100% capacity all the time.
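
Here is the same back-of-the-envelope math as a small Python sketch; the 250 W figure is the same guess as above, and real machines never run their PSU flat out:

    # Worst-case weekly cost comparison, assuming both PSUs somehow ran
    # at their full rated wattage the whole time (they don't).
    rate_per_kwh = 0.14      # dollars per kWh
    hours_per_week = 40      # usage assumed in this answer

    def weekly_cost(psu_watts):
        kwh_per_week = psu_watts / 1000 * hours_per_week
        return kwh_per_week * rate_per_kwh

    old_cost = weekly_cost(250)  # hypothetical current PSU
    new_cost = weekly_cost(450)  # PSU sized for the GTS 250

    print(f"Old: ${old_cost:.2f}/week, new: ${new_cost:.2f}/week, "
          f"extra: ${new_cost - old_cost:.2f}/week")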

So, I'd say no - you won't see a difference in your electricity bill.

2

If you use your graphics card often (playing games, converting videos, whatever uses the GPU), your power consumption will most likely rise.

Also keep in mind that the rating on your PSU is a maximum that can only be sustained for short periods (not five-hour gaming sessions), so an "oversized" PSU will last a lot longer before it has to be replaced.

Additionally, most PSUs are more efficient when not operating near full capacity, so less power is wasted as heat. An example: my PC draws at most about 500 W (all hard drives working, CPU and GPU at 100%), while my PSU is rated at 850 W maximum. It never gets hot and has been working for three years now without any problems.

More info on efficiency: http://en.wikipedia.org/wiki/80_Plus
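
As a rough illustration of the efficiency point, here is a tiny Python sketch; the percentages are assumptions in the ballpark of the 80 Plus tiers, not measurements of any particular unit:

    def wall_draw(dc_load_watts, efficiency):
        # Power pulled from the outlet for a given DC load.
        return dc_load_watts / efficiency

    # Hypothetical 500 W DC load on two differently sized supplies:
    print(round(wall_draw(500, 0.80)))  # ~625 W when the PSU runs near its limit (~80% efficient)
    print(round(wall_draw(500, 0.87)))  # ~575 W when a larger PSU sits near half load (~87% efficient)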

Personally, I wouldn't recommend using your current power supply with that new graphics card, as it would most likely run hot and stop working if used for more than a few hours at a time.

cfstras
  • 399