03-01-2013, 02:25 PM  #509
Originally Posted by Arragonis:
So if you look at CCGT (aka Gas) it has to go up and down to compensate - that is the idling, using fuel, making CO2...
You completely missed my point, which was NOT that you don't need peaker plants. Of course you do, even in the current system (as your graphs demonstrate). My point is that you don't need to design them in such a way that they idle all the time, burning fuel without generating electricity.

Typical peaker plants are gas turbines, which are basically jet engines. Does your average airliner sit around idling and burning fuel when it's not flying (or taxiing)? And how long does it take to spool up from idle to takeoff thrust? A matter of seconds.

Then consider that wind generation (since that's what you have in the UK) just isn't going to change all that quickly - the wind doesn't suddenly start or stop blowing simultaneously all over Britain, does it? So peaker plants should be designed to come on line within that window.
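To put rough numbers on that argument (all figures below are illustrative assumptions, not UK grid data), here is a quick sketch of how much shortfall a fast-start peaker would have to ride through while it spools up:

```python
# Back-of-the-envelope: can a fast-start peaker cover an aggregate wind ramp?
# All numbers are illustrative assumptions, not measured UK grid data.

wind_capacity_mw = 10_000      # assumed installed wind fleet
max_ramp_pct_per_min = 0.5     # assumed worst-case aggregate ramp (geographic smoothing)
peaker_start_min = 10          # assumed time for a gas turbine to reach full output

# Largest drop in wind output before the peaker is fully on line
shortfall_mw = wind_capacity_mw * (max_ramp_pct_per_min / 100) * peaker_start_min
print(f"Worst-case shortfall while peaker starts: {shortfall_mw:.0f} MW")
```

The point is that geographic smoothing keeps the aggregate ramp slow enough that a plant starting from cold within minutes, rather than idling continuously, can cover the gap.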

That also answers your problem with farm/sewage gas generation. Since peaker power is more valuable than baseload, it would pay to store some fraction of the generated gas to use during periods of high demand, rather than supplying baseload.
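A toy arbitrage calculation (the prices and volumes are made-up round numbers, chosen only to show the shape of the comparison) illustrates why shifting the same gas to peak hours pays:

```python
# Illustrative arbitrage: sell biogas generation at peak vs. flat baseload.
# Prices and volumes are made-up round numbers for comparison only.

daily_gas_kwh = 24_000    # assumed daily biogas energy from the digester
baseload_price = 0.05     # assumed baseload price, $/kWh
peak_price = 0.15         # assumed peak price, $/kWh

revenue_baseload = daily_gas_kwh * baseload_price
# Store the gas instead, and burn it all during peak hours
revenue_peaking = daily_gas_kwh * peak_price
print(f"Flat baseload: ${revenue_baseload:.0f}/day, peak-only: ${revenue_peaking:.0f}/day")
```

The same energy earns a multiple of the baseload revenue, which is what funds the storage tank.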

Originally Posted by Arragonis:
The bottom line is the UK is an industrial country (still) so we need more baseload - new nuclear, or more gas, or get on with fracking asap - we have potentially a world record deposit in Cumbria. We can afford to do this if we didn't subsidise windmills and solar.
And you could afford to build new nuclear, or invest in energy efficiency.

Originally Posted by Arragonis:
Its not 100% no, the overground lines in some places come down and the infrastructure is in dire need of an upgrade - as is Transport, comms - the lot. A whole lot of digging and putting new stuff in.
Which costs money, no? So the question is whether it's more economical for third-world countries, which don't have an existing grid, to build one, or to adopt distributed power systems instead.

Originally Posted by Arragonis:
So I have to know when the power is off to charge them, or just leave then on charge all the time ? Is that efficient ?
Given competent engineers with a mandate to design for efficiency, this should be no problem: plug the tool in, and it charges whenever power is available. The "vampire power" drawn by the current generation of "wall warts" is an artifact of careless design coupled with a 120/220 V AC distribution system.
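A minimal sketch of that charging logic, with a hypothetical should_charge() decision function standing in for what would really be hardware reads of mains presence and pack voltage:

```python
# Sketch of "charge whenever power is available" logic for a cordless-tool
# charger. should_charge() is a hypothetical decision function; a real
# charger would read mains presence and state of charge from hardware.

def should_charge(mains_available: bool, state_of_charge: float) -> bool:
    """Charge only while mains power is present and the pack isn't full."""
    return mains_available and state_of_charge < 1.0

# Power comes and goes; the tool simply tops up whenever it can.
print(should_charge(True, 0.6))   # mains on, pack partly empty -> charge
print(should_charge(False, 0.6))  # mains out -> sit idle, no vampire draw
```

The user never needs to know when the power is on; the charger handles it.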

Originally Posted by Arragonis:
And when the charge runs out and the power is not back on ?
And when you want to use a corded power tool, and the power's out?

Originally Posted by Arragonis:
That problem seems simple to solve if you have the baseload to supply it, transmission can be solved up to some distances (we use AC over here, I believe that makes transmission less loss-y ?).
Well, no, it's not simple at all. Remember that inertia thing? Dumping a major load like an arc furnace onto the system is analogous to letting out the clutch with your engine at idle. The system effectively stalls: voltage drops (a brownout), the local frequency can degrade enough that it falls out of sync with the rest of the grid, and you get separation and islanding. Worst case, you get a blackout over the whole system.

So if you want to put your arc furnace out in the desert, quite a long ways from most generation, you obviously have problems.
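The initial frequency dip from such a load step can be roughed out with the standard swing-equation relation df/dt = ΔP·f0 / (2·H·S); the inertia constant and system size below are assumed round numbers, not real grid figures:

```python
# Rough swing-equation estimate of frequency decline after a sudden load step.
# H (inertia constant) and system size are assumed round numbers.

f0 = 50.0             # nominal grid frequency, Hz (UK)
h_seconds = 4.0       # assumed average inertia constant of online generation
system_mva = 30_000   # assumed online generating capacity
load_step_mw = 600    # assumed arc furnace switching on at once

# Initial rate of change of frequency: df/dt = dP * f0 / (2 * H * S)
rocof = load_step_mw * f0 / (2 * h_seconds * system_mva)
print(f"Initial frequency decline: {rocof:.3f} Hz/s")
```

Frequency-protection relays trip at surprisingly small deviations, so even a decline of this order, sustained for a few seconds, is enough to start separating parts of the grid.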

The bottom line of all this is that a lot of people insist on saying "can't" when what they really mean is "that's not the way we've always done it". All the problems are fairly simple, technically, and probably less expensive in the long run than maintaining the current system. (And that's without even considering the environmental effects of CO2 and fracking.) You just have a lot of people emotionally and financially invested in business as usual.
The Following User Says Thank You to jamesqf For This Useful Post:
Occasionally6 (07-11-2013)