climate science

Here’s a straightforward approach to dealing with denial. Most of these points make sense to me:

Tips for dealing with denial

  • Communicate a consistent message. Do not attempt to “soften the blow” too much, by making the issue seem less than it is.
  • Try not to provide too much information at one time. This sometimes can overwhelm [deniers]. Keep the first meeting as brief and succinct as possible, and end with the scheduling of a follow-up meeting.
  • Ask open-ended questions, and allow [deniers] plenty of time to talk. Undoubtedly, they are fearful of losing something very important—health, independence, or optimism/faith about the future.
  • Explain to [deniers] that information is something that you can provide, but that it is their choice what, if anything, they want to do with the information provided. Ask them what they want to know about, and let them guide the conversation.
  • Provide reading materials, which [deniers] can peruse at their own discretion.
  • End your meetings with [deniers] positively, and try to instill in them a sense of self-confidence in their abilities to [deal with the problem].
  • Recommend support groups, whenever possible.
  • Make it clear that the [problem] will never “go away” … but emphatically explain that [solutions] can lessen the severity of the [problem].
  • Explain to [deniers] that even if they do not believe that [the problem exists], the recommendations that you are making certainly will not harm them in any way. Ask them to humor you by making an attempt to follow your advice for a little while.
  • Know that [people] in denial often will refuse to admit that they are upset. They claim they are not upset—after all, nothing is wrong. Ask them how they would feel if they really did have the [problem] that they are denying that they have.
  • Remember that tough love often does not work with [people] in denial. Many [authorities] have said, “There is not much use talking to you right now. Just call me when you accept that  __________.”  They never hear from the [denier] again. Do not expect that [denier] will independently have a sudden insight. However, you can say, “I feel like you have other things on your mind today. We can talk more about this tomorrow at noon. Please feel free to call me if you have any questions before then.”
  • Expect [deniers] to direct their anger at you. Many times when you try to deconstruct their carefully built wall of denial, [deniers] will become angry. Do not react to this anger.

Some pretty sound advice there, I reckon. Some of it I’ve already seen in action in climate circles, some not.

The source? Advice written for clinicians on dealing with denial in patients.

I wonder why more climate advocates haven’t looked at this kind of thing. It seems like a fairly obvious starting point, even if it can’t be linearly extrapolated to large groups…

The IPCC is being reviewed by the Interacademy Council (which represents dozens of national science academies). And they’re taking public comment. This might be a good chance to get some improvements. The comments form is at:

If you can’t think of anything, here’s what I wrote:

  • The IPCC needs to report more frequently. Interim reports, or even annual updates would be very useful.
  • More focus on possible tipping points. Especially estimates of sea-level rise from glacial melt, and estimates of non-linear responses to warming.
  • More transparency in the process – especially which representatives are making which changes to the final release.
  • Stop being so conservative. Offer your analysis, and be prepared to defend it when it gets attacked by the fossil fuel lobby and governments.
  • Work with science communicators. Create a lay-person’s version of the report.

I’d like to declare here and now that I’m sceptical about the “reality” of the round earth. There are many dissenting voices, sceptics of the current “consensus”, and significant evidence to show that the earth is not round. Not to mention that it’s bleeding obvious – just look out the window: no curvature there, eh?

But despite this, dissenting voices in the debate are silenced. Proponents of the round-earth hypothesis pursue their beliefs with a zeal unmatched even by the world’s most fundamentalist religions. While it’s true that many scientists believe that the earth is round, there are also significant dissenting voices – but were one to mention this in general conversation, or on talk-back radio, one would immediately be shouted down, cut off, ostracised. In short, censored.

This is not how science should operate. Science is not decided by majority opinion, but by healthy debate. And while one side is being censored, there can be no real debate.

I’m not saying definitively that the earth is flat or round – I’m still undecided – just that the debate needs to be opened up, so the true process of science can run its course, with maximum access to evidence and competing theories from both sides. Until all the information is on the table, I’ll remain most sceptical of the majority-imposed “consensus”.

Sound familiar? The above arguments are frequently used by the denial-o-sphere (denial-o-plane?). While climate science is obviously not as developed, as certain, or as simple as planetary physics, that does not mean that the above arguments carry any weight in a climate context.

I’ve been starting to learn Octave, a maths programming language. Octave is similar to other packages that are often used to create the nice graphs you see around the place, especially in relation to climate change. This is a bit of a slap-dash tutorial on how to get some graphs happening with Octave. It assumes roughly advanced high-school level maths.

If you wanna learn, I suggest you get QtOctave, which is damn nice, and is in the Ubuntu repositories, and probably in most other Linux distributions (you can run Octave on Windows – but if you really want to be this geeky, and are still on Windows, you need to re-assess your values). QtOctave has a nice help-search function that lets you find most of what you need to know about functions, and installing it pulls in all the prerequisites too, although depending on your distro, you might need some of the extra packages from octave-forge.

At the very bottom is an attachment with most of this code in it. I think most of this stuff will also work in Matlab, but you gotta pay for that…

Then read all of this excellent tutorial. That’s where I learned nearly everything for this tutorial, apart from the names of a few functions.

Crank out a graph!

Now you’re ready to go. Get yourself a copy of some temperature data to play with. I used NASA’s GISTEMP data. You can use any data you want, but I’ve attached a file that will do everything I’m talking about here, and includes octave-formatted GISTEMP data.

Ok, so assuming you’ve got your data in a matrix, you can then extract the relevant bits (some of the variable names here differ from those in the attachment, to save space):

% get the years from the first column
yr = GISTEMPdata(:,1);

(You did read that octave tutorial, right?)

% get the monthly averages (columns 2 to 13)
Temps = GISTEMPdata(:,2:13);
% Average them to get the yearly means (the 2 means operate along the
% second dimension, i.e. one mean per row/year)
AnnualTemps = mean (Temps, 2);

You can now hack out a simple graph:

plot(yr, AnnualTemps)


If you read the tutorial, you’ll know how to adjust the axes, and add legends and titles, and all that jazz. I’m going to ignore that.

You’ll notice that the data range from -60 to 80. That’s because it’s a graph of temperature differences (anomalies) – which means that what matters isn’t the starting point, but rather, the relationships between the data. In this case, the -60 means -0.6DegC, and 80 means +0.8DegC (this is explained in the header of the GISTEMP file I linked to up top).

To change it to real values, to give it some human scale, we have to make the 1951-1980 average = 14DegC.

% Subtract the 1951-1980 baseline mean, convert hundredths of a degree
% to degrees, and add 14
% rows 72:101 correspond to 1951-1980 (row = year - 1879)
RealTemp = ( AnnualTemps - mean( mean( GISTEMPdata(72:101,2:13) ) ) ) / 100 + 14;
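As a quick sanity check (assuming, as in the attachment, that the data starts in 1880, so rows 72:101 really are 1951-1980), the 1951-1980 mean of RealTemp should come out very close to 14:

% the baseline period should average out to ~14DegC by construction
mean (RealTemp(72:101))

If it doesn’t, the row offsets are probably wrong for your copy of the data.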


Cool, huh? Okay, let’s get a Trend line going.

Getting Trendy

So, basically, a trend line is a best-fit line. You can do this automatically with a couple of functions in Octave, but since we’re after a straight trend line at the moment, we can use a fairly simple one: a first-degree polynomial fit. (A first-degree polynomial is a straight line, at any angle, from any starting point.)

Polynomials are those equations you did in high school maths, that looked like:

y = x^2 + 3x + 1.5

That one would give you a basic parabola, shifted down and to the left a bit (the vertex sits at (-1.5, -0.75)). Higher-degree polynomials (where x is raised to the power of 2 or more) aren’t particularly useful for finding trend lines – they can look pretty, but don’t really help much. More on that later. Simple first-degree polynomials (straight lines) are a good way of getting an idea of an overall trend.
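You can check that parabola claim yourself in a couple of lines (the range -5 to 2 is just an arbitrary window around the vertex):

% plot y = x^2 + 3x + 1.5; the vertex sits at (-1.5, -0.75),
% i.e. down and to the left of the origin
x = -5:0.1:2;
plot (x, x.^2 + 3*x + 1.5)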

To get the equation for the line, we need to get all the values for the basic form of a first degree polynomial:

y = mx+b

To get m and b from the data, we can use the polyfit() function, with 1, for 1st degree:

EQ = polyfit ( yr , RealTemp , 1 ) ;

which provides us with an array, like:

0.0061271   2.1103472

The first value is m, the second is b.  Now we apply y=mx+b:

TrendLine = EQ(1) .* yr + EQ(2)

Now you can graph the trend line, with the original data:

plot(yr, AnnualTemps, yr, TrendLine)


Looks ok to me. (I also note that even with the so-called “cooling since 1998/2000/2002/cherrypick”, 2008’s average temperature is almost 0.2DegC higher than the linear trend for the last 129 years.)

How Not To Do Climate Stats

This is where the higher-degree polynomial equations come in. A high-degree polynomial can easily be made to fit a curve, but that doesn’t mean anything in particular unless a physical cause can be hypothesised that would produce a high-degree polynomial response matching the trend. I don’t know of any that can.

All this was recently news, because The Australian published a piece of stupid masquerading as climate science.

Anyway, I want to show you how to do that same kind of stupid (albeit with 129 years of data, not 30). You can try it with the last 30 if you like. Or with the last two. I don’t care, just don’t be surprised by the results, because they don’t mean anything.

So, we want a sixth-degree polynomial, that best fits the data we have. In other words, we want something like this:

y = rx^6 + qx^5 + px^4 + ox^3 + nx^2 + mx + b

And we need to find r, q, p, o, n, m, and b. Again, we do it with polyfit(), this time with 6:

EQ = polyfit ( yr , RealTemp , 6 ) ;
and we get something like:

1.3740e-16 -7.9135e-13 1.5165e-09 -9.6503e-07 -1.9859e-09 -2.5545e-12 -2.6291e-15

You might point out that these numbers are so small that they are ridiculous. To that, I’d reply: Good point.

Anyway, on with the stupidity, let’s whack those numbers into the above equation:

TempPoly6=EQ(1).*(yr.^6) + EQ(2).*(yr.^5) + EQ(3).*(yr.^4) + EQ(4).*(yr.^3) + EQ(5).*(yr.^2) + EQ(6).*(yr.^1) + EQ(7);

I hope that makes sense, it took me a while to get it.
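As an aside, Octave also has a built-in function, polyval(), which evaluates a polynomial from a coefficient vector in exactly the order polyfit() returns it (highest power first). If I’ve read the docs right, the long line above collapses to:

% evaluate the fitted 6th-degree polynomial at every year
TempPoly6 = polyval (EQ, yr);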

Now we can graph it, along with the real data, and the linear trend line:

plot(yr, AnnualTemps, yr, TrendLine, yr, TempPoly6 );


Nice, huh? Now, any sane person without any stats education would look at that and think: yep, that’s a pretty good match. Looks like a good fit to me.

But you already know it’s stupid, so you should be looking at it with even more critical eyes than usual. One of the best ways to be critical in a situation like this is to step back, and take a wide view. So let’s see how those trend lines look if we add another century on each end: 1700 to 2100.

To do this in Octave, you need to stretch the “years” component first, then just put it back into the same equations:

yr = [1700:2100]'
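If you’re not sure what the trailing ' is doing, compare the sizes:

size ([1700:2100])    % a 1 x 401 row vector
size ([1700:2100]')   % a 401 x 1 column vector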

The ' is important: it transposes the row vector into a column vector, to match the shape of the data. Now you can just hit the up-key to access the same lines as before:

TrendLine = EQ(1) .* yr + EQ(2)

TempPoly6=EQ(1).*(yr.^6) + EQ(2).*(yr.^5) + EQ(3).*(yr.^4) + EQ(4).*(yr.^3) + EQ(5).*(yr.^2) + EQ(6).*(yr.^1) + EQ(7);

Then just run the last plot command again (yr has changed length though, so use the years column from GISTEMPdata for the original data):

plot(GISTEMPdata(:,1), AnnualTemps, yr, TrendLine, yr, TempPoly6 );


That’s right. By 2100, temperatures won’t be 2DegC warmer, nor 4… Nope, it’s gonna be 21 degrees centigrade – 7 degrees warmer. And the “medieval warm period”? Didn’t exist. Was actually an ice age.


I’m not a statistician, though I do hope to be doing stats at Uni this year. I’m reasonably sure this is all correct, though I haven’t used this kind of maths since high-school, more than half a decade ago. I learned what I now know in Octave in the last 2-3 days, so there might be better ways of doing this, I don’t know. I’d appreciate any corrections, if they’re needed, and feedback is always welcome.

I’d also appreciate any help on running a LOESS filter on the data. I only understand the maths in the vaguest terms (as far as I can tell, it fits a low-degree polynomial to a weighted window around each point – a kind of moving local regression), but it seems to apply a very useful smoothing, although it doesn’t provide any kind of future prediction the way a linear trend does (i.e. in a very limited way).
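In the meantime, here’s a poor-man’s smoother: a plain centred moving average. It’s nothing like a real LOESS (no local polynomial fitting, no weighting), and the 11-year window is an arbitrary choice, but it shows the general idea of local smoothing:

% crude 11-year centred moving average (not LOESS!)
win = 11;
Smoothed = conv (AnnualTemps, ones(win,1)/win, 'same');
plot (GISTEMPdata(:,1), AnnualTemps, GISTEMPdata(:,1), Smoothed)

Note that the first and last few years are biased, because the averaging window runs off the ends of the data.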


gisstempdata.m – THIS IS A PLAIN TEXT FILE, NOT AN ODT. Rename it to gisstempdata.m to use it in Octave/Matlab.

Until now, the technology hasn’t been available to obtain fine-scaled, precise measurements of CO2 in the atmosphere. But the launch next year of two carbon-detecting satellites, NASA’s Orbiting Carbon Observatory and the Japanese Greenhouse Gases Observing Satellite, should soon help to fill in this knowledge gap, which is critical to establishing a reliable carbon accounting system. – Amanda Leigh Mascarelli

There’s more info on the NASA project at, and on the Japanese project at

It amazes me that this isn’t getting more attention already. It’s going to mean a massive increase in our ability to account for carbon and other greenhouse-gas emissions and uptakes. It seems to me that these projects should be WAY more exciting than the Large Hadron Collider, for example, since they will so directly affect the science around one of the most important and controversial issues of this… century? Millennium?

It also strikes me that images extrapolated from the data could be strikingly beautiful – in a similar way to the “earth by night” photos. Obviously carbon concentrations won’t be so strictly confined as light sources, and the images will obviously be false colour (since CO2 is invisible). But other effects, like those of Coriolis winds and ocean and forest carbon sinks, would be great to see in action, especially with changes over the seasons.


Mascarelli, A. L. (2008, December 18). What we’ve learned in 2008. Nature Reports Climate Change. Retrieved January 12, 2009, from

When I’m reading about climate change in public forums like the internet, or newspapers, I expect to see denial arguments all over. Usually, they’re the same old shit that’s been roundly debunked by numerous people. So it’s a pleasant surprise to find new arguments – it gives you something to think about.

This one really was surprising though: Richard Lindzen is well known for being a good debater, and well read. He’s one of the last deniers that the mainstream seems to accept. So it’s a surprise that I haven’t seen this particular argument before: usually these things get picked up like smallpox. This article’s actually a bit old (2004), so I’d expect it to be well spread around the internet by now, but it isn’t.

In the 2004 article/interview by Marc Morano for, Lindzen says: “Although there is [Arctic] melting going [on] now, there has been a lot of melting that went on in the [19]30s and then there was freezing.”

Ok, so the basic appeal of the argument – it’s happened before, so who cares if it’s happening now? – has appeared in many denial rants before, but this one is very specific, and it isn’t documented in any of the other major lists of old denier arguments.

The second part of the surprise is that it’s so damn easy to debunk. You don’t need to be a scientist for this one. You just need to go to the NOAA Arctic website (see the updated graph). Ok, so there was melting from 1934-40, but there was roughly the same amount of ice INCREASE in the years before that trend started. If any sane person looked at that graph, they’d immediately see that the sea-ice extent trend is pretty much static up to about the 50s or 60s, and then the trend swings down dramatically, dropping from a relatively constant ~13.5 million sq km down to about 11.5-12 million sq km over the last decade.

Anyway, the article is generally crap – nothing that hasn’t been talked about thousands of times since. I just thought this specific bit should be pointed out. I’m not going to go seeking it, but I’d be interested to know what Lindzen thinks about that graph, and whether NOAA is part of the whole conspiracy or not. I’d also be interested to hear why he thinks that fossil fuel companies, with all their billions of dollars of annual profit, haven’t been funnelling some of that money into climate science to see if they can get a different result – obviously, if it could be done, the rewards (of not having to deal with environmental regulation) would be significant…

Luckily for me, I’d never heard of Neal Boortz up until the release of the IPCC’s Summary for Policy Makers, 2007. Then, in comments on an intro report on the SPM on RealClimate, a few people mentioned his latest attack:

Have a read. It’s quite entertaining, even in its simplicity. I thought I’d take a look at it, and rebut some of the more interesting points. If you think I’ve missed an important one, let me know, and I’ll have a go at it.
