Testing my model using Hansen's linear response times (which leaves me with only one variable to play with, climate sensitivity), I need to use a climate sensitivity somewhere between 5 and 5.5 degrees to get a match with the observed temperature trend between 1970 and 2010.

This could indicate that my model is wrong in other respects - I will need to read Hansen's paper carefully to check this, as he does mention a long-term sensitivity of around six degrees.

I should also note here that I am aware that my model is attributing all of the temperature increase between 1970 and 2010 to CO2, making the assumption that other forcings cancel out over that period. This is likely true for things like solar forcings, ENSO and so forth. However, aerosols are still an issue.

## Thursday, April 28, 2011

### More on my temperature model

My temperature model - which is really a test of the climate sensitivity, as it is looking backwards over the last 50 years of data - has two basic variables.

The first is the climate sensitivity. I input that against the Mauna Loa CO2 data since 1959, which then generates the set of temperatures that we would expect were the climate response time instantaneous.

The second variable is the climate response time. As I stated previously, I have set this up as a logarithmic function that can be 'stretched' or 'squeezed'.
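As a rough sketch of how the two variables fit together (this is illustrative only: the exact form of my 'stretchable' response function is not reproduced here, and the form below is an assumption):

```python
import math

def equilibrium_warming(co2_ppm, sensitivity, co2_base=280.0):
    """Expected total warming (degrees) at equilibrium for a given CO2
    level, with sensitivity expressed in degrees per doubling of CO2."""
    return sensitivity * math.log2(co2_ppm / co2_base)

def response_fraction(years, stretch=30.0):
    """Illustrative 'stretchable' logarithmic response: the fraction of
    the equilibrium warming realised after a given number of years.
    A larger stretch pushes more of the warming into later years.
    (This particular functional form is an assumption for illustration,
    normalised so the response is complete after 1,000 years.)"""
    return min(1.0, math.log1p(years / stretch) / math.log1p(1000.0 / stretch))
```

Multiplying the two gives a model temperature: the equilibrium warming implied by the CO2 level, scaled by how much of it has arrived so far.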

I have chosen to ignore the first 10 years of data and thus the absolute temperature value for the whole time period. The reason for this is that the first CO2 data point makes the model's earth jump suddenly above 280 ppm, instead of the slow rise that occurred in reality. I believe that this must distort the temperature data, although I have not yet investigated in what fashion it does so.

This means that I cannot directly compare measured historical temperatures with the temperatures outputted by my model. I do not think that this is a problem, however, as what I can do is compare trends (which is another way of saying that I am measuring the difference, or anomaly, between the temperature my model shows for 1970 and the temperature my model shows for 2010).

Using GISS data, the trend between 1970 and 2010 is 0.0163 degrees per year. I can fiddle with the response time parameter to make any climate sensitivity match this trend. However, the response times required for any particular sensitivity to do so give us an interesting picture of which sensitivities are realistic.
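A crude sanity check on this (using roughly 325 ppm for 1970 and 390 ppm for 2010 as approximate Mauna Loa values — my round figures, not exact annual means): the GISS trend implies about 0.65 degrees of warming over the 40 years, which can be compared with the equilibrium warming from the CO2 change alone.

```python
import math

# Assumed approximate Mauna Loa CO2 levels (ppm); round figures only.
co2_1970, co2_2010 = 325.0, 390.0

observed_rise = 0.0163 * 40   # GISS trend over 1970-2010, ~0.65 degrees

for sensitivity in (2.0, 3.0, 4.5, 6.0):
    # Equilibrium warming from the 1970-2010 CO2 change alone.
    equilibrium = sensitivity * math.log2(co2_2010 / co2_1970)
    # Fraction of that equilibrium that must have been realised within
    # the 40 years to match the observed rise.
    needed = observed_rise / equilibrium
    print(f"S = {sensitivity}: need {needed:.0%} of equilibrium in 40 years")
```

This simplified check ignores the lagged response to pre-1970 CO2, so the percentages are not the same as my model's, but it shows the shape of the problem: at two degrees the required fraction comes out above 100 per cent.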

Using my model, a sensitivity of two degrees requires 70 per cent of the expected total temperature increase from a given rise in CO2 to occur in the first 10 years. Further, as we move past 30 years, more than 100 per cent must occur. This would seem to rule out two degrees as a viable sensitivity value *under this model*. (I am not yet claiming that my model is of use.)

If we examine a sensitivity of six degrees, however, we get a different picture. Early on, it seems okay, with a bit over 30 per cent of the expected temperature rise occurring in the first decade. But to get the next 30 per cent takes a further 170 years. And then the next 15 per cent takes close to a further 700 years ... And that leaves a further 25 per cent of the response still to come. That does not seem plausible, either, leaving six degrees as not a viable sensitivity value *under this model*.

Three degrees sensitivity forces me to use a pretty fast response time to get a match - over 50 per cent in the first decade and 75 per cent after a touch over 30 years.

Four degrees sensitivity requires over 40 per cent in the first decade and around a total of 75 per cent after 85 years.

A sensitivity of 4.5 degrees has just under 40 per cent in the first decade and around 70 per cent after 100 years.

The question then becomes: which is plausible? I would suggest that the last is the most plausible *using my model*.

However, now the question becomes: is a logarithmic model realistic? Hansen et al. use a linear model, with one line for the first decade and another line for the next 90 years, so maybe a logarithmic model is not realistic.

I will test the linear method in my model and report back.

## Wednesday, April 27, 2011

### Climate sensitivity revisited

I have been working with a simple model for temperature that has the earth responding logarithmically to CO2 forcing (for example, depending on the parameters that I use, it might warm by 40 per cent of the expected total warming in the first 10 years and then by another 30 per cent of the expected total warming in the next 90 years) and then running that model using different climate sensitivities.

Climate sensitivity is commonly defined as the predicted climate response to a forcing and in the case of CO2 it is put as X degrees per doubling.

The values for X that I have tried range from one to 10.

The CO2 data I am taking from Mauna Loa.

At the moment I am having some difficulty getting my model to come close to matching observations if I use a low climate sensitivity. I can almost do it if I have a very fast response time. For example, if I choose a climate sensitivity of two degrees per doubling and I have the vast majority (80 per cent) of the temperature response occurring within 50 years, with more than 50 per cent of that in the first decade, I can fit the model to the current observed temperature.
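A back-of-envelope version of that two-degree experiment (taking roughly 0.8 degrees as the observed rise above pre-industrial by 2010 and 390 ppm as current CO2 — both my round figures for illustration):

```python
import math

sensitivity = 2.0   # degrees per doubling of CO2

# Equilibrium warming for ~390 ppm against a 280 ppm baseline, ~0.96 degrees.
equilibrium = sensitivity * math.log2(390 / 280)

# With 80 per cent of the response realised, the model sits near 0.77 degrees,
# close to the roughly 0.8 degrees observed.
realised = 0.80 * equilibrium
print(f"model: {realised:.2f} degrees vs observed ~0.8 degrees")
```

So the level can nearly be matched with a fast response, even though the recent rate cannot.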

However, even here there is a problem: the rate of observed change is still faster than my model shows.

The better fits are with higher climate sensitivities, but even there things are not perfect. (Note: I would not expect them to be so, as my model is leaving out climate variability, but they are still not good enough for my purposes).

This seems reasonable: based on our observations of temperature and atmospheric CO2 concentrations over the last 130 years and the linear fit between the two, a sensitivity of two degrees would seem to be implied. However, this would seem to suggest an almost instantaneous response to CO2. If instead some kind of logarithmic fit was used, I wonder what result we would end up with?

I am assuming that there is a major problem with a model such as this. Hansen seems to use a linear model, with different slopes at different periods of time (for example, four per cent of the response per year for the first decade, followed by about 0.4 per cent of the response per year for the rest of the century). According to him, other models use much longer response times, at least for the second half of the response.
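My reading of that scheme, as a sketch (the slopes here are my paraphrase of Hansen for illustration, not his published numbers):

```python
def hansen_style_fraction(years):
    """Piecewise-linear response: roughly 4 per cent of the equilibrium
    response per year for the first decade, then roughly 0.4 per cent
    per year for the remaining 90 years (a paraphrase, for illustration)."""
    if years <= 10:
        return 0.04 * years
    return min(1.0, 0.4 + 0.004 * (years - 10))
```

On these slopes only about three-quarters of the response has arrived after a century, which fits the idea that a substantial part of the warming is still in the pipeline.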

If anyone has any advice on this, that would be appreciated. I can obviously provide the full model (which is not very full or large) to anyone who wishes to see it.

### Inflow and rainfall for the first four months of the year

As of 27 April 2011, Canberra has received 234 mm of rainfall and, according to my estimates, 28,500 megalitres of inflow into our dams. This has to be an estimate, as it looks as though ACTEWAGL carried out a large release of water over the Easter break from the Cotter and Bendora dams.

This gives a rate of about 120 megalitres of inflow per millimetre of rain, still lower than I would have expected after our wet year.
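The arithmetic behind that rate:

```python
inflow_ml = 28_500   # estimated inflow so far this year, megalitres
rain_mm = 234        # rainfall so far this year, millimetres

# Megalitres of inflow per millimetre of rain.
rate = inflow_ml / rain_mm
print(f"{rate:.0f} ML/mm")   # about 120
```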

We have had a dry April, bringing the projections for the year down to something just over 700 mm. If things transition to an El Niño we may get less, however.

## Thursday, April 7, 2011

### Yearly inflow to the end of March

The first three months of the year have gone, and we have seen low inflow for the amount of rain that we have received. In absolute terms, however, it is still a large inflow: 23,000 megalitres from 225 mm of rain.

(Note: this assumes that ACTEWAGL are still releasing around 130 megalitres a day to avoid problems with the dams. I suspect that this will not be the case in April, but I will continue to track inflow as if it were until the end of April, or until I get more information.)

This keeps the runoff per millimetre at just over 100 megalitres, which is odd, given the saturated nature of the soils. What it could mean, however, is that much of the rain is simply not falling in the catchment area, which is something that I was concerned about as a possibility last year.

By the same time last year, we had received around 16,700 megalitres in runoff from around the 230 mm of rainfall. This means that there has been a significant improvement, but not near what I expected - I thought that we would have returned to double this, or even higher. (Note that the long-term average is *triple* this rate of megalitres per mm.)
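Putting the two years side by side:

```python
this_year = 23_000 / 225   # ML per mm to end of March this year, ~102
last_year = 16_700 / 230   # ML per mm to end of March last year, ~73

# Improvement over last year's rate; well short of the doubling I expected.
improvement = this_year / last_year
print(f"{improvement:.1f}x last year's rate")
```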
