I played with voltages in my PC a while back. My specs:
i5 3570K
Zotac Z77 ITX board
2x 4GB DDR3 @ 1.25v
mSATA Samsung 850 EVO SSD, 250GB
Radeon HD7850 2GB
Fortron 400w Platinum-rated power supply
Noctua NH-L9i cooler
I started with a full ATX motherboard, watercooling, and an 80+ rated 680w PSU. Sitting idle, the PC drew close to 100w at the wall, as measured by a Kill-a-Watt. After downsizing to a 380w power supply (Antec Earthwatts) and my Zotac ITX board, idle power consumption dropped to around 45w. I snagged the Fortron power supply for $25 (Slickdeals), which further dropped idle power consumption to ~37w. The discrete video card accounts for about 12w of that at idle, meaning the PC sans gaming GPU draws approximately 25w.
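For a rough sense of what those idle numbers add up to over a year, here's a quick back-of-the-envelope script. The $0.12/kWh electricity rate is just an assumption, not my actual rate; plug in your own.

    # Rough annual cost of idle power draw, assuming the PC sits at idle 24/7.
    # The $0.12/kWh electricity rate is an assumption, not a measured figure.
    RATE_USD_PER_KWH = 0.12
    HOURS_PER_YEAR = 24 * 365

    def annual_cost_usd(watts):
        return watts / 1000 * HOURS_PER_YEAR * RATE_USD_PER_KWH

    for label, watts in [("original ATX build", 100),
                         ("Antec Earthwatts + ITX board", 45),
                         ("Fortron 400w Platinum", 37)]:
        print(f"{label}: ~{watts}w idle -> ~${annual_cost_usd(watts):.0f}/year")
    # original ATX build: ~100w idle -> ~$105/year
    # Antec Earthwatts + ITX board: ~45w idle -> ~$47/year
    # Fortron 400w Platinum: ~37w idle -> ~$39/year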
I spent a while playing with voltage and frequency to see how each relates to power consumption.
Starting off with a 4GHz overclock, I loaded it up with Prime95 and gradually lowered the voltage until I got instability. I found that I was able to reduce the total system power consumption with the CPU fully loaded (but not the GPU) from ~120w to ~95w by decreasing the CPU's operating voltage by 130mv.
This in turn reduced operating temperatures and fan noise.
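As a rough sanity check on that drop: dynamic CPU power scales approximately with voltage squared at a fixed frequency (roughly P = C x V^2 x f). The 1.20v stock load voltage below is an assumed figure for illustration, not something I measured.

    # Rough check: dynamic power scales ~V^2 at a fixed frequency (P ~ C * V^2 * f).
    # The 1.20v stock load voltage is an assumed/illustrative figure.
    V_STOCK = 1.20
    V_UNDERVOLT = V_STOCK - 0.130      # the 130mv undervolt
    IDLE_W = 37                        # whole-system idle draw, GPU installed

    # Treat the whole idle draw as non-CPU overhead (only roughly true).
    cpu_stock_w = 120 - IDLE_W         # CPU-attributable draw under Prime95, stock voltage
    cpu_undervolt_w = 95 - IDLE_W      # same, after the undervolt

    v2_ratio = (V_UNDERVOLT / V_STOCK) ** 2
    print(f"V^2 alone predicts a {1 - v2_ratio:.0%} drop in CPU power")
    print(f"measured drop: {1 - cpu_undervolt_w / cpu_stock_w:.0%}")
    # V^2 alone predicts a 20% drop in CPU power
    # measured drop: 30%

The measured drop comes out a bit larger than the V^2 term alone predicts, which seems plausible: leakage and VRM losses also fall as voltage and temperature come down.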
Next, I plotted the minimum stable voltage vs frequency, after extensive trial and error.
Due to the shape of the voltage curve, CPU power consumption climbs much faster than linearly as frequency increases (power scales roughly with voltage squared times frequency, and the required voltage itself rises with frequency). There seem to be two major inflection points, one at ~2800mhz and another at ~4000mhz.
This is a graph of the CPU's power efficiency versus frequency. I imagine it pretty closely resembles a BSFC (brake-specific fuel consumption) vs load or RPM graph for an engine.
One would think, then, that running the CPU as slowly as possible would be most efficient. That's actually not the case, at least if you're leaving the PC on 24/7 under load, doing work. The rest of the system draws a roughly fixed amount of power regardless of CPU frequency, so the longer a task takes at a lower clock, the more of that fixed power gets charged against it; once you factor that in, dropping below a certain frequency actually hurts total-system efficiency.
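Here's a toy model of that tradeoff. The voltage/frequency curve and the power constant below are made-up numbers for illustration (the real curve came from the trial-and-error testing above), but the shape of the result is the point: with a fixed ~25w of non-CPU draw, total energy per unit of work bottoms out somewhere in the middle of the frequency range, not at the lowest clock.

    # Toy model: total-system energy per unit of work vs CPU frequency.
    # The voltage curve vmin() and the constant C are invented for illustration;
    # they are NOT my measured values.
    FIXED_W = 25.0                      # non-CPU system draw (w)

    def vmin(ghz):
        # assumed minimum stable voltage: nearly flat at low clocks, rising steeply up high
        return 0.90 + 0.02 * ghz + 0.0015 * ghz ** 3

    def cpu_power(ghz):
        # P ~ C * V^2 * f, with C picked so ~4ghz lands near the ~95w CPU-attributable draw
        C = 16.0
        return C * vmin(ghz) ** 2 * ghz

    results = []
    for tenths in range(16, 43, 2):     # 1.6ghz .. 4.2ghz
        ghz = tenths / 10
        # work done is proportional to frequency, so energy per unit of work
        # is total power divided by frequency
        energy_per_work = (FIXED_W + cpu_power(ghz)) / ghz
        results.append((energy_per_work, ghz))
        print(f"{ghz:.1f}ghz: {energy_per_work:5.1f} energy per unit of work (arbitrary units)")
    print(f"most efficient clock for the whole system: ~{min(results)[1]:.1f}ghz")

Where the minimum actually sits depends entirely on the real voltage curve and on how much fixed draw the system has; the shape, with efficiency peaking in the middle and falling off at both ends, is what matters.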
Intel ships this CPU at a stock frequency of 3.4ghz, with turbo up to 3.8ghz. For a system with slightly higher fixed power consumption, that stock frequency would be just about dead-center of the total-system power efficiency peak.
If you're not running your PC 24/7 and relying on it to do the most calculations over time with the least energy, lowering CPU clockspeed is beneficial, to a point. I've only backed my clockspeed off a bit, from 4ghz down to 3.8ghz, because I still appreciate the extra performance.
Some other areas where I have room for improvement:
-My 15-year-old Z5500 (500w) speaker system draws 15w just plugged into the wall, and 35w when turned on but playing no sound. I use these speakers on both my PC and my wife's, though, and they're wonderful for watching movies. I can't at present justify spending several hundred dollars to save <20w, so I'm likely to keep using them until they die.
-My 24" LCD screen is now 10 years old and uses CCFL backlighting. When on, it draws about 70w, whereas my wife's much newer 27" screen with LED backlighting draws about 35w. Here too, though, I can't justify spending several hundred dollars for the power it would save (rough payback math below).
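For what it's worth, the payback math on those two replacements is easy to run. The usage hours, electricity rate, and replacement costs below are all assumptions, not quotes:

    # Rough payback period for replacing a power-hungry device.
    # Usage hours, electricity rate, and replacement costs are assumptions.
    RATE_USD_PER_KWH = 0.12

    def payback_years(watts_saved, hours_per_day, replacement_cost_usd):
        kwh_saved_per_year = watts_saved / 1000 * hours_per_day * 365
        return replacement_cost_usd / (kwh_saved_per_year * RATE_USD_PER_KWH)

    # speakers: ~20w saved around the clock (they stay plugged in 24/7), ~$300 to replace
    print(f"speakers: ~{payback_years(20, 24, 300):.0f} years to break even")
    # monitor: ~35w saved while in use, say 6 hours/day, ~$250 to replace
    print(f"monitor:  ~{payback_years(35, 6, 250):.0f} years to break even")
    # speakers: ~14 years to break even
    # monitor:  ~27 years to break even

A payback period measured in decades is roughly why neither replacement makes sense on power savings alone.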
Hope this is useful or interesting to someone!