I did RRTM runs for 10,000 ppm H2O/330 ppm CO2, and then again with no CO2. Downwelling LW radiation dropped by less than 5%, from 353 to 336 W/m^2.
Increasing CO2 to 560 ppm increased the greenhouse effect by about half a percent, from 353 to 355 W/m^2. This roughly translates to 1.4 C warming (280 K * 0.005) with a doubling of CO2.
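The back-of-envelope arithmetic above can be checked with a few lines of code. This is only the commenter's linear scaling of temperature with fractional flux change, not RRTM itself; note that 2/353 is actually about 0.57%, which the comment rounds down to 0.5%, so the unrounded figure comes out closer to 1.6 C.

```python
# Reproduce the back-of-envelope estimate: warming ~ (fractional change
# in downwelling LW flux) * surface temperature. This linear scaling is
# the commenter's simplification, not a radiative transfer calculation.

def fractional_warming(flux_before, flux_after, surface_temp_k=280.0):
    """Estimate warming as (fractional flux change) * surface temperature (K)."""
    frac = (flux_after - flux_before) / flux_before
    return frac * surface_temp_k

# Flux values (W/m^2) quoted from the RRTM runs above:
dT = fractional_warming(353.0, 355.0)
print(round(dT, 2))  # close to the quoted 1.4 C once the ratio is rounded to 0.5%
```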
However, the numbers are exaggerated because I did cloud-free runs. If clouds were included, the effect of CO2 would be smaller.
They are going to say that feedback effects will make it much worse. On the other hand, it will probably be difficult to double the CO2 in the atmosphere (through human production), because so much will settle out.
Empirical evidence lends no support to that.
If you don’t trust a fully coupled complex climate model, you certainly shouldn’t trust a simplistic 1-dimensional radiative transfer model.
Quite the opposite. Radiative transfer is a relatively simple static calculation which can be at least partially verified through direct measurement.
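To illustrate the kind of "simple static calculation" meant here, below is a minimal gray-slab sketch: the downwelling emission from a single isothermal atmospheric layer, using emissivity 1 - exp(-tau) and the Stefan-Boltzmann law. The layer temperature and optical depth are illustrative assumptions, and this toy is far cruder than a band model like RRTM, but it shows the static, measurable character of the quantity being computed.

```python
import math

# Toy sketch (NOT RRTM): downwelling LW flux from one isothermal gray
# layer. Layer emissivity follows Beer-Lambert: eps = 1 - exp(-tau).
# The layer then radiates eps * sigma * T^4 toward the surface.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def downwelling_flux(layer_temp_k, optical_depth):
    """Downwelling flux (W/m^2) from a single gray layer."""
    emissivity = 1.0 - math.exp(-optical_depth)
    return emissivity * SIGMA * layer_temp_k ** 4

# Illustrative values: a layer at 260 K with optical depth 1.5
print(round(downwelling_flux(260.0, 1.5), 1))
```

A number like this is directly comparable to an instrument reading at the surface, which is the sense in which radiative transfer can be "at least partially verified through direct measurement."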
To the extent that a simple model is true, it is irrelevant.
The radiative transfer model is the only part of the model which calculates the greenhouse effect. It isn’t clear to me what you have in mind.
What kind of experiment can test (even partially) the radiative transfer “model” hypothesis with direct measurement in the wild?
What equipment is involved?
Has anyone done this? If so, when, where, and what were the results? Were they verified by independent parties?
Are there other radiative transfer models that are more accurate assuming it’s a real phenomenon that can be verified?
Steve, could you say something about which software you used and how you configured it for the “RRTM runs” that you mentioned? Thanks.