NiHaoMike
Master EcoModder
I just finished measuring my laptop's power usage while playing mariposahd.tv, comparing software decoding against Hannah Montana hardware decoding. Software decoding drew 32W while Hannah Montana drew 34W, so at first glance Hannah Montana costs an extra 2W. However, the video quality was noticeably better with Hannah Montana, so it isn't a straight apples-to-apples comparison. To me, the extra 2W is well worth the improvement in video quality, so in my opinion, Hannah Montana wins.
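For anyone who wants to repeat the comparison, something like this Python sketch could log the average draw during playback. It assumes a Linux laptop that exposes battery readings through sysfs; the BAT0 path, units, and the one-minute sample window are assumptions, so check what your machine actually provides:

[CODE]#!/usr/bin/env python
# Rough power logger -- a sketch, assuming the battery's instantaneous
# draw shows up at /sys/class/power_supply/BAT0/power_now in microwatts.
# Some machines expose current_now/voltage_now instead; adjust to taste.
import time

POWER_FILE = "/sys/class/power_supply/BAT0/power_now"
SAMPLES = 60       # sample for one minute
INTERVAL = 1.0     # seconds between samples

readings = []
for _ in range(SAMPLES):
    with open(POWER_FILE) as f:
        readings.append(int(f.read().strip()))  # instantaneous draw in uW
    time.sleep(INTERVAL)

avg_w = sum(readings) / len(readings) / 1e6  # convert uW -> W
print("Average draw: %.1f W over %d samples" % (avg_w, len(readings)))[/CODE]

Run it once while the clip plays with software decoding and once with hardware decoding, unplugged both times so the battery readout reflects the whole system.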

Another interesting observation: software decoding upclocked the CPU from 800MHz to 1.6GHz, while the video card (including the onboard Hannah Montana core) stayed at 130MHz. Hannah Montana hardware decoding upclocked the video card to 475MHz, while the CPU stayed at 800MHz.
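You can watch the CPU side of that yourself during playback. Here's a sketch assuming Linux cpufreq sysfs; the GPU/Hannah Montana clock is vendor-specific, so only the CPU cores are shown:

[CODE]# Quick check of per-core CPU clocks during playback -- a sketch
# assuming the Linux cpufreq sysfs interface is available.
import glob

for path in sorted(glob.glob(
        "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
    with open(path) as f:
        khz = int(f.read().strip())   # cpufreq reports kHz
    core = path.split("/")[5]         # e.g. "cpu0"
    print("%s: %d MHz" % (core, khz // 1000))[/CODE]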

Note that I did a "dummy run" before recording any measurements to avoid inaccuracies due to caching. The first time the file is played, the data has to be read from disk, but on later runs it will likely be served from RAM by the OS cache. RAM uses less power than the hard drive, so caching can skew the measurements if it isn't accounted for. (The first run also pays for reading the file from disk *and* copying it into the cache, so the skew would be considerable!)
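The dummy run doesn't have to be a full playback. A plain sequential read is enough to pull the file into the OS page cache, along these lines (the file name is just a placeholder):

[CODE]# Cache warm-up before measuring -- a sketch. Reading the whole file
# once pulls it into the OS page cache, so the measured runs don't pay
# the disk-read power cost. "test_clip.mp4" is a hypothetical name.
CHUNK = 1024 * 1024  # read 1 MB at a time

def warm_cache(path):
    with open(path, "rb") as f:
        while f.read(CHUNK):  # discard the data; the read itself fills the cache
            pass

warm_cache("test_clip.mp4")[/CODE]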
