The first is the climate sensitivity. I apply that to the Mauna Loa CO2 data from 1959 onward, which generates the set of temperatures we would expect if the climate response were instantaneous.

The second variable is the climate response time. As I stated previously, I have set this up as a logarithmic function that can be 'stretched' or 'squeezed'.
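For the record, the two variables together can be sketched roughly as follows. This is only a sketch: it assumes the standard logarithmic CO2 forcing relation (sensitivity in degrees per doubling over a 280 ppm baseline), and it parameterises the 'stretchable' response curve as scale * ln(1 + years/stretch), which is one of several possible shapes for such a function, not necessarily the exact one I am using.

```python
import math

def equilibrium_delta_t(co2_ppm, sensitivity, baseline_ppm=280.0):
    """Equilibrium warming for a given CO2 level, using the standard
    logarithmic forcing relation: `sensitivity` is degrees per doubling."""
    return sensitivity * math.log(co2_ppm / baseline_ppm, 2)

def response_fraction(years, scale, stretch):
    """One possible 'stretchable' logarithmic response: the fraction of
    the equilibrium warming realised after `years` years.  Larger
    `stretch` squeezes the curve (a slower response); `scale` sets how
    quickly the fraction grows.  Deliberately not clamped at 1, so an
    over-fast fit can demand more than 100 per cent of the response."""
    return scale * math.log(1.0 + years / stretch) if years > 0 else 0.0

# A doubling of CO2 at a 3-degree sensitivity gives 3 degrees at equilibrium.
print(equilibrium_delta_t(560.0, 3.0))
```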

I have chosen to ignore the first 10 years of data, and thus the absolute temperature value for the whole period. The reason is that the first CO2 reading makes the model's temperature jump suddenly above the 280 ppm baseline, instead of reflecting the slow rise that actually occurred. I believe this must distort the temperature data, although I have not yet investigated in what fashion it does so.

This means that I cannot directly compare measured historical temperatures with the temperatures output by my model. I do not think this is a problem, however, because I can still compare trends (which is another way of saying that I am measuring the difference, or anomaly, between the temperature my model shows for 1970 and the temperature it shows for 2010).

Using GISS data, the trend between 1970 and 2010 is 0.0163 degrees per year. I can fiddle with the response-time parameter to make any climate sensitivity match this trend. However, the response times required for a particular sensitivity to do so give us an interesting picture of which sensitivities are realistic.
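The trend comparison itself is trivial; it amounts to something like this (the helper below is illustrative, with the GISS figure of 0.0163 degrees per year as the target the fitted parameters must hit):

```python
def model_trend(temps, y0=1970, y1=2010):
    """Trend in degrees per year between two model years, taken as the
    anomaly difference divided by the number of years between them."""
    return (temps[y1] - temps[y0]) / (y1 - y0)

# Toy output: if the model warms 0.652 degrees over those 40 years,
# its trend matches the GISS value of 0.0163 degrees per year.
print(model_trend({1970: 0.0, 2010: 0.652}))
```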

Using my model, a sensitivity of two degrees requires 70 per cent of the expected total temperature increase from a given rise in CO2 to occur in the first 10 years. Further, as we move past 30 years, more than 100 per cent must have occurred, which is of course impossible. This would seem to rule out two degrees as a viable sensitivity value

*under this model*. (I am not yet claiming that my model is of use).

If we examine a sensitivity of six degrees, however, we get a different picture. Early on it seems okay, with a bit over 30 per cent of the expected temperature rise occurring in the first decade. But getting the next 30 per cent takes a further 170 years. And the next 15 per cent takes close to a further 700 years ... And that leaves a further 25 per cent of the response still to come. That does not seem plausible either, leaving six degrees as not a viable sensitivity value
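The pattern behind those numbers is general: with a logarithmic response curve, each further slice of the remaining warming takes exponentially longer to arrive. A sketch, inverting the same assumed scale * ln(1 + years/stretch) form (the parameter values below are purely illustrative, not my fitted ones):

```python
import math

def years_to_fraction(frac, scale, stretch):
    """Invert f(t) = scale * ln(1 + t/stretch): the number of years
    needed to realise `frac` of the equilibrium warming."""
    return stretch * (math.exp(frac / scale) - 1.0)

# Illustrative parameters: each step from 30 to 60 to 90 per cent of the
# response takes several times longer than the step before it.
scale, stretch = 0.28, 5.0
for frac in (0.3, 0.6, 0.9):
    print(frac, round(years_to_fraction(frac, scale, stretch), 1))
```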

*under this model*.

Three degrees of sensitivity forces me to use a pretty fast response time to get a match: over 50 per cent in the first decade and 75 per cent after a touch over 30 years.

Four degrees of sensitivity requires over 40 per cent in the first decade and a total of around 75 per cent after 85 years.

A sensitivity of 4.5 degrees has just under 40 per cent in the first decade and around 70 per cent after 100 years.

The question then becomes: which is plausible? I would suggest that the last is the most plausible

*using my model*.

However, now the question becomes: is a logarithmic model realistic? Hansen et al. use a linear model, with one line for the first decade and another line for the next 90 years, so perhaps a logarithmic model is not realistic.
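Before I do, here is the shape I will be testing: a piecewise-linear response fraction, one slope over the first decade and a second, shallower slope over the following 90 years. The breakpoint fractions below are placeholders of my own, not Hansen et al.'s actual values:

```python
def linear_response_fraction(years, frac_at_10=0.4, frac_at_100=1.0):
    """Piecewise-linear response fraction: one line up to year 10, a
    second line up to year 100, flat afterwards.  The default fractions
    at the breakpoints are placeholders."""
    if years <= 0:
        return 0.0
    if years <= 10:
        return frac_at_10 * years / 10.0
    if years <= 100:
        return frac_at_10 + (frac_at_100 - frac_at_10) * (years - 10) / 90.0
    return frac_at_100

print(linear_response_fraction(55))  # halfway along the second segment
```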

I will test the linear method in my model and report back.
