
Not sure if this is the right place to ask such a question, but I don't see a more relevant place on Stack Exchange.

So recently I've become kind of obsessed with using power more efficiently when it comes to my computer, and with the long-term cost of running a computer through its power supply. I've also been reading about the 80 Plus standard.

Anyway, here is my question. I recently read that the only way to really know how many watts your computer is drawing is to buy a Kill A Watt meter, which I did. I actually bought the EZ version, which shows cost estimates among a number of other readings.

At idle levels (CPU ~1-5%), my PC draws ~100 watts according to this meter. When I stress my computer to around 70-80% CPU load, it goes up to ~200 watts. That load includes browsing 3 or 4 CPU-heavy websites, running a high-end game in windowed mode, playing a 10 GB Blu-ray movie, and listening to music at the same time.
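For reference, a rough back-of-the-envelope sketch of what those readings cost per year (the electricity price and daily hours below are assumptions, not part of my measurements):

```python
# Rough annual cost estimate from the Kill A Watt readings above.
# Hours of use and price per kWh are assumed, illustrative values.
idle_watts = 100        # measured at ~1-5% CPU
load_watts = 200        # measured at ~70-80% CPU
hours_per_day = 8       # assumption
price_per_kwh = 0.15    # assumption, in $/kWh

def annual_cost(watts):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"Idle: ${annual_cost(idle_watts):.2f} per year")   # ~$43.80
print(f"Load: ${annual_cost(load_watts):.2f} per year")   # ~$87.60
```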

I'm really trying to understand why on earth people are buying power supplies rated at 600, 850, or sometimes even 1000+ watts. Can one of you electrical geniuses explain this to me?

My computer Specs:

[Screenshots of system specifications; per the answers below, the build includes an Intel Core i5-2500K CPU and an nVidia GTX 660 graphics card.]

Other specs: 1 × 128 GB SSD, 2 TB WD Green hard drive, 1 TB hard drive, optical mouse, USB keyboard, no optical drive, 430 W Thermaltake power supply


4 Answers


If your question is "Why do people buy large power supplies?" then there are a few reasons.

Their power requirements could actually be that high

Your computer has fairly modest power requirements. Power requirements are mostly driven by the CPU and GPU. Your i5-2500K CPU has a maximum TDP (thermal design power) of 95 W, and your nVidia GTX660 has a TDP of 140 W. Given your particular hardware it's not surprising that you observe 200 W as the maximum load. A 300 W or 450 W power supply would be quite adequate.

However, other people's computers can have much higher power requirements. A computer with a 150 W CPU and 2×250W graphics cards will need at least 650 W, not including the disk drives, USB devices, fans, water cooling pumps, LED lights (...) also drawing power from the power supply.

Such a computer would generally be equipped with a power supply in the 800-1000 W range.
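As a rough sketch of that arithmetic (the CPU and GPU wattages are the figures quoted above; the allowance for other components and the safety margin are assumptions):

```python
# PSU sizing sketch for the hypothetical high-end build described above.
components = {
    "CPU": 150,                           # W, as quoted above
    "GPU #1": 250,                        # W, as quoted above
    "GPU #2": 250,                        # W, as quoted above
    "drives/fans/pumps/USB/LEDs": 75,     # W, assumed allowance
}

total_watts = sum(components.values())
safety_margin = 1.2                       # assumed ~20% headroom
print(f"Estimated peak draw: {total_watts} W")                        # 725 W
print(f"Suggested PSU rating: ~{total_watts * safety_margin:.0f} W")  # ~870 W
```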

They are purposely buying a bigger power supply than they need

Even if the computer's power requirements are only, say, 300 W, some people will go out and buy a 600 W power supply for it anyway.

The reasons they believe this is a good idea include:

  1. To allow some headroom for future upgrading of the computer - new parts might need more watts.
  2. So that the power supply will run at a lower temperature. Components degrade faster at higher temperatures, so the power supply is less likely to fail.
  3. To run the power supply at a higher efficiency. A power supply run at 100% of rated load is actually less efficient than a power supply run at 80% of rated load.
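To illustrate point 3, a small sketch of how wall draw changes with efficiency (the efficiency figures are illustrative assumptions, not measurements from any particular 80 Plus unit):

```python
# Same 300 W DC load, supplied at two assumed efficiency levels.
dc_load = 300                      # W delivered to the components

scenarios = {
    "PSU near 100% of its rating (assumed 85% efficient)": 0.85,
    "PSU around 50-80% of its rating (assumed 90% efficient)": 0.90,
}

for label, efficiency in scenarios.items():
    wall_draw = dc_load / efficiency
    print(f"{label}: {wall_draw:.0f} W from the wall, "
          f"{wall_draw - dc_load:.0f} W lost as heat")
```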

First off, a lot of people buy into the sales pitch that a bigger power supply is better and, of course, promises more room to expand. I think a lot of people also assume that a power supply will just draw what it needs, and don't realise there is an efficiency overhead (again, this is a subjective opinion, and won't hold true for anyone who has bothered to learn what "80 Plus" means).

Another part of the puzzle is that people who haven't built a lot of systems generally don't know how much power a system requires, and it's easier to buy a bigger supply than to work it out - and still be wrong.

There is also some truth to the idea that a bigger power supply can mean components specced to handle greater current, which tend to last longer; similarly, they can better handle transient changes in load (i.e. they have more headroom for sudden draws).

Of course, very often a smaller, high-quality power supply will actually do much better than an overly large one.

davidgo
  1. In your experiment you probably didn't hit the maximum possible power consumption of your PC yet. A high-end game might not put your GPU under full load; if you want to determine the maximum power consumption of your GPU, you should try a program like FurMark (this program can damage certain older GPUs, so be careful).

  2. Yes, many power supplies are larger than they have to be. Higher numbers are generally better for marketing, so that could certainly be a driving factor behind this. Many people also don't have a realistic estimate of how much power their computer will draw, so they'll get something bigger just to be sure.

  3. The selection of lower-rated power supplies is rather limited, and in many cases there is not a large price difference between lower-rated and higher-rated units.

  4. If you put in a lot of high-end GPUs, you might actually need that much power. A high-end GPU can draw around 250W, so with a few of them you could actually need a 1000 W power supply.
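As a quick sanity check of point 4 (the GPU count, CPU wattage, and allowance for other parts below are assumptions for illustration):

```python
# Illustrative estimate for a multi-GPU build.
gpu_count = 3
gpu_watts = 250      # W per high-end GPU, as quoted in point 4
cpu_watts = 125      # W, assumed
other_watts = 75     # W, assumed (drives, fans, etc.)

total_watts = gpu_count * gpu_watts + cpu_watts + other_watts
print(f"Estimated peak draw: {total_watts} W")   # 950 W -> a 1000 W PSU is reasonable
```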


Every power supply on the planet, be it an electrochemical battery, solar cell, or ATX PSU, has an internal resistance.

Every load on the planet, be it a light bulb, a water pump, or a computer, also has an internal resistance. As the current requirement of the load increases, its resistance necessarily decreases.

Putting Kirchhoff's circuit laws together with Ohm's law, the two resistances form a voltage divider: the voltage supplied to the load decreases in proportion to the ratio of the internal resistances. Eventually the voltage may drop too low to operate the load properly.

Supplies with a higher power rating must, according to Ohm's law, have a lower internal resistance in order to be able to provide more current and therefore power. This lower resistance tips the voltage back up towards the "true" voltage of the supply, allowing larger loads to operate properly.
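A minimal numeric sketch of that voltage-divider effect (all resistance and voltage values below are made-up illustrative numbers, not specs of any real PSU):

```python
# Voltage divider: V_load = V_source * R_load / (R_internal + R_load)
def load_voltage(v_source, r_internal, r_load):
    return v_source * r_load / (r_internal + r_load)

v_source = 12.0   # nominal 12 V rail (illustrative)
r_load = 0.6      # a load of ~20 A at 12 V, modelled as a resistance (illustrative)

for r_internal in (0.05, 0.02, 0.005):   # "small" -> "large" supply (illustrative)
    v = load_voltage(v_source, r_internal, r_load)
    print(f"internal resistance {r_internal:.3f} ohm -> load sees {v:.2f} V")
```

With the highest internal resistance the load only sees about 11.1 V, while the lowest-resistance supply delivers close to the nominal 12 V.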