Quote:
Originally Posted by jamesqf
But now Intel's up to the I7 processor series, the mobile versions of which consume a max of 17W, which includes a lot of functionality which in the P4 was relegated to separate chips...
Then look at the larger picture: I do most of my work on a notebook+display+cable modem, router, etc that probably draws an average of 40 watts (plus my share of whatever the cable company uses to run its system)....I don't have to drive to a physical office to work. It saves all the energy involved in making and transporting paper checks to pay my bills (and get paid!), gas I'd use to go to physical stores for shopping...
The 17 W CPU you reference is among the lowest-power processors available, not the average. The trend is for processors, and whole computer systems, to grow more power-hungry over time. Average power supply wattage has risen over the years: my first PC in 1995 had a 150 W power supply and was among the fastest you could buy; today my PC has an 1100 W PSU and is likewise among the fastest. In 1995 most people didn't even own a computer; in 2012 we have multiple computers.
This doesn't just apply to computers. My old cell phone would run for 4 days on a charge; my new one won't go 24 hours without demanding a recharge.
Transportation: the best-selling vehicle in 1908 was the Ford Model T, with a 20 hp engine that returned 17 mpg. A hundred years later the best-selling vehicle is a Ford F-150 with a 300 hp engine, and... it still gets 17 mpg.
100 Years of Improvement?
TV: 10 years ago I owned a 32" CRT TV that consumed roughly 200 W. Now I have a 60" TV that consumes roughly 200 W.
TV sizes are growing
My point, though, is not that computers, cars, or phones will always consume more power, but that we will always find ways to spend whatever resources (energy) are available to us.
Quote:
Originally Posted by Ladogaboy
I don't remember where I read the article, but it was describing how Americans' electrical power consumption is drastically lower than it was even just 10 to 20 years ago. The article stated that this was due to the increased efficiency of modern electronic devices. Though Americans tend to use more electronic devices than they did in the past, the difference in efficiency has lessened the overall load on the power grid...
Now to see if this second job can lead to some telecommuting...
Here (Page 2) is an interesting link that shows household energy use over the years. According to the graphs, overall energy use per capita has remained fairly constant over the past 20 years. However, electricity use has risen dramatically over the past 60 years.
Buy a house from the 50s and see how it handles modern electrical demands. Make sure you have plenty of spare fuses and a flashlight for the inevitable circuit overload.
I am quite excited for the day that I telecommute, and I see this saving a lot of energy as the practice becomes more widely adopted. The employee saves money by not having to commute, and the employer saves money by not having to power (or even build) an office space. It's a win/win situation. With all of the saved energy and income, I'll have to think of ways to spend it on something else. I've always wanted to travel Europe and Asia... a hot tub for our Pacific Northwest winters also sounds lovely.