Quote:
Originally Posted by dcb
that graph shows a spread of about %0.006
I get a 0.07% rise, so it looks like a decimal point is off.
Regardless, you can't just look at a percentage variation of solar irradiance without comparing it to the percentage rise in average global temperature.
So, let's put the 0.6 C rise in the same percentage terms as the rise in solar irradiance. The average global surface temperature is commonly estimated at about 14-15 C, or roughly 288 K, so 0.6 / 288 gives about a 0.2% rise.
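To make the arithmetic explicit, here's the calculation in a few lines of Python (the 0.07% and 0.6 C figures are the ones discussed above; the ~288 K baseline is just the standard round number for mean surface temperature):

```python
# Quick check of the two percentage figures quoted above.
solar_rise_pct = 0.07              # % rise in solar irradiance, read off the graph

temp_rise_K = 0.6                  # observed warming in kelvin (a 0.6 C rise)
baseline_K = 273.15 + 14.5         # ~14-15 C mean surface temperature, in kelvin

temp_rise_pct = 100.0 * temp_rise_K / baseline_K
print(f"solar irradiance rise: {solar_rise_pct:.2f}%")   # 0.07%
print(f"temperature rise:      {temp_rise_pct:.2f}%")    # ~0.21%
```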
Now, take that 0.07% rise in solar irradiance, run it through the Earth's atmosphere with all of its many mechanisms that are still poorly understood (even by pro-AGW climatologists), and you end up with the observed 0.2% rise in average global temperature.
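As a sanity check on how much work the atmosphere has to do there: for a simple equilibrium blackbody, temperature scales as the fourth root of irradiance, so the direct response to a 0.07% irradiance rise would only be about 0.02%. The rest would have to come from amplifying mechanisms like the one below. A back-of-envelope sketch (the blackbody scaling is standard physics; the ~12x amplification figure holds only under these simplifying assumptions):

```python
# Equilibrium blackbody: T ~ S**0.25, so dT/T is roughly (1/4) * dS/S.
solar_rise_pct = 0.07                        # % rise in solar irradiance
direct_temp_rise_pct = solar_rise_pct / 4    # ~0.018%, direct blackbody response

observed_temp_rise_pct = 0.21                # from the calculation above
amplification = observed_temp_rise_pct / direct_temp_rise_pct

print(f"direct blackbody response: {direct_temp_rise_pct:.3f}%")
print(f"implied amplification:     ~{amplification:.0f}x")   # ~12x from feedbacks
```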
Keep in mind that one of these poorly understood mechanisms is the ability of cosmic rays to influence cloud formation. Since clouds reflect sunlight back into space, that sunlight never reaches the surface to warm it, and so never feeds into greenhouse warming. The Sun's magnetic activity is known to partially shield the Earth from galactic cosmic rays. So if solar activity goes up, the Earth is shielded a little better than before, and cloud formation drops; if cloud formation drops, more sunlight reaches the surface and heats the lower atmosphere.
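To be clear about what that chain claims, here's a toy sign-propagation sketch. Every coefficient below is a made-up placeholder, chosen only for its sign; the real sensitivities are exactly the poorly constrained quantities in question:

```python
# Toy sketch of the proposed chain:
#   solar activity up -> cosmic-ray flux down -> cloud cover down
#   -> albedo down -> absorbed sunlight up -> warming push.
# All coefficients are hypothetical placeholders; only the signs matter.

def absorbed_sunlight_change(solar_activity_change: float) -> float:
    d_cosmic_rays = -0.5 * solar_activity_change  # hypothetical: stronger sun, fewer rays
    d_cloud_cover = +0.3 * d_cosmic_rays          # hypothetical: fewer rays, fewer clouds
    d_albedo      = +0.4 * d_cloud_cover          # hypothetical: fewer clouds, lower albedo
    return -1.0 * d_albedo                        # lower albedo, more sunlight absorbed

print(absorbed_sunlight_change(+1.0))  # positive: rising solar activity -> net warming push
```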