The CSIRO last week released what was effectively a statistical analysis
of the reality surrounding large parts of the infamous Home Insulation Program – or, for those of you not familiar with this particular policy, you may have heard of it via its common alternative name in the mainstream media, the “OMG, PETER GARRETT IS BURNING DOWN OUR FUCKING HOUSES!” policy.
As we here have long known and talked about, the reality of the Home Insulation Program was always vastly different to its hysterical media portrayal - driven as it was by naive and innumerate journalists looking for easy sensational headlines, and partisan hacks prostituting their cheap wares before a gullible public. Having a cowardly government lacking the plums to tell them all where to stick it was another unfortunate sub-plot in this tale of public deception about the reality of a substantial piece of public policy.
The CSIRO report covers three large areas – analysis of fire related incidents, broader safety risk issues relating to the insulation program and the development of a risk profiling tool. Today, we’ll just focus on the fire related incidents component, as we’ve long been following this particular issue in depth and it’s nice to be able to bring it to a close, flip the bird at our detractors and exit the battlefield under a big banner saying “We told you so” :-P
The first thing that needs to be done is explain what the CSIRO *didn’t* do. They *didn’t* answer the elephant-in-the-room question: “In the 12 month period after having insulation installed, was there a difference between pre-program and in-program probability of having an insulation related fire incident?”. They provided all the data we need to get an estimate of it, and they made a sort of assumption about it, but they didn’t actually attempt to tackle that important question head on.
This question is important because it tells us whether the Home Insulation Program was safer or more dangerous in terms of fires than what existed before it over the short term – over the first 12 months of having insulation installed.
The second thing the CSIRO didn’t do was provide long term background fire rates (the number of fires we should expect to see every year from all houses that have had insulation installed for longer than 12 months) that allow us to answer the questions *we’ve* been asking. They’ve provided background rates that answer a lot of different questions, that answer a lot of questions other people may have been asking, but not the ones we have. This is a simple methodology issue which we can easily deal with since the CSIRO kindly provided in the report all the data we need.
To get a feel for what actually happened, it’s worth looking at the strong relationship between when the fire incidents under the program occurred (defined as houses which had the fire services called out to them over what turned out to be an insulation related fire) and when installations actually occurred. This data comes from Table 5.1 on page 30 of the report.
As the number of installations increased during the program, the number of fire incidents increased with it, before those fire incidents trailed off on an 8 to 12 month tail after the installations had ceased. This is important because it shows us straight away that most fires happened relatively quickly after insulation was installed. To really highlight this relationship further, if we change this data from a chronological month by month representation into one where we measure how many days insulation had been installed, for every fire incident under the program, this is what we get:
This data comes from Table 5.2 on page 32 of the report. I’ve aggregated it into 10 day periods, except for the first observation which covers 11 days (days 1 through 10 plus what the CSIRO reported as the zero day – the same day as installation).
Of the 156 fire callouts that occurred under the program (note – not the “hundreds” as some media outfits would have you believe, but 156), 43% of those fires occurred within 10 days of having the insulation installed.
After that, the fires dropped dramatically and continued to reduce as a function of time. The reason for this is fairly obvious – if your insulation was sitting on top of, say, a poorly installed downlight with no thermal cover (for instance), the insulation would ignite as soon as the light was left on for any decent length of time. Most things that could catch fire, would catch fire at the earliest opportunity.
An easy way to highlight this reality further is to chart the proportion of all fire related incidents under the program that had occurred after X days of insulation being installed:
That tells the story on the distribution of fire related incidents as a function of time. It tells us that the short term fire rate is much larger than the long term fire rate, simply as a result of most of the stuff that *can* burn, *does* burn and burns early.
This sort of distribution has an important consequence when it comes to directly comparing what went on before the Home Insulation Program and what went on during the program in terms of the probability of a fire occurring.
We don’t know (yet) whether the pre-program insulation manifested the same type of distributional behaviour, but because of what we witnessed during the program, we would expect it to. We would at least expect some *similar* dynamic to have played out in the pre-program insulation numbers. It certainly suggests that we should attempt to find out.
Fortuitously, the CSIRO report provides us with the data necessary in Table 5.5 on page 36 of the report. Here they used data from the Australasian Fire and Emergency Services Authorities Council to show us how many fire callout incidents were recorded each month between March 2007 and December 2010 for *both* buildings with insulation installed pre-HIP *and* for buildings with insulation installed under the Home Insulation Program.
As we know from the HIP data, most fires that happen as a result of getting insulation installed happen relatively quickly - and certainly over the first 12 month period of having such insulation installed.
The Home Insulation Program was announced on the 9th of February 2009. So if we take an annualised fire incident rate for insulation before February of 2009, that annualised rate will have two components – the short term rate (fires that occurred from the insulation that was installed within the previous 12 months) *and* the long term rate (the fires that occurred from the rest of the “insulation stock” older than 12 months).
We know that 65-70K installations per year was the medium term norm in Australia (both from the CSIRO report and the Senate estimates hearings on the matter). That’s about 5800 installs a month (taking the 70K a year figure), every month, going back before the fire incident data in the CSIRO report started. We also know that from March 2010 onward, there was no pre-HIP program insulation left in Australia that had been installed in homes for less than 12 months.
If we take the annualised fire rate for homes where insulation was installed pre-HIP from March 2010 onwards, that fire rate contains only one component – the fire rate for pre-HIP installations older than 12 months.
So, just looking at the data for houses where insulation was installed before the Home Insulation Program, if we subtract the annualised fire rate for those homes from March 2010 onward from the rate for those homes from January 2009 and earlier, we cancel out the long term rate component (which we assume would be the same for both periods – there’s no particular reason why they would be significantly different) and end up with the short term rate for the pre-HIP installations.
That then lets us compare the short term rate pre-HIP to the short term rate during the HIP.
We know from the CSIRO report (page 37) that as of March 2008, there were approximately 4,273,000 homes with roof insulation installed. We know that during previous years, around 70,000 installations were made per year, which equates to around 5,833 per month. If we take the March 2008 figure as the true level of stock, then adjust on a monthly basis both forwards and backwards through time by that 5,833 per month level, that gives us an estimate of the number of houses in Australia with insulation installed for each month – our “stock”.
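As a quick sketch, that stock adjustment looks like this in Python – the anchor figures are the ones quoted above, and the linear extrapolation is the same simplifying assumption the text makes:

```python
# Estimate the number of insulated houses ("stock") in any given month,
# anchored at the CSIRO's March 2008 figure of ~4,273,000 homes and the
# pre-HIP installation norm of ~70,000 installs a year (~5,833 a month).
ANCHOR_MONTH = (2008, 3)          # March 2008
ANCHOR_STOCK = 4_273_000          # homes with roof insulation at the anchor
INSTALLS_PER_MONTH = 70_000 / 12  # ~5,833 pre-HIP installs a month

def months_between(a, b):
    """Whole months from (year, month) a to (year, month) b."""
    return (b[0] - a[0]) * 12 + (b[1] - a[1])

def estimated_stock(year, month):
    """Linear stock estimate: anchor value +/- ~5,833 per month."""
    return ANCHOR_STOCK + months_between(ANCHOR_MONTH, (year, month)) * INSTALLS_PER_MONTH

# e.g. a year before the anchor, the estimated stock is 70,000 lower:
print(round(estimated_stock(2007, 3)))  # 4203000
```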
The “Fire Callout Incidents” column has the number of fire incidents that occurred in each month.
The “Annualised Fire Rate per 100,000 houses per year” is the number of fires divided by the number of insulated houses multiplied by 100,000 (to give us the rate per month per 100,000) then multiplied by 12 (to give us the annualised rate). This tells us for every 100,000 homes with insulation, how many fire incidents there are per year.
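That rate calculation is a one-liner. The 10 fires and 4,273,000 homes plugged in below are illustrative numbers to show the scale of the result, not figures from the report:

```python
# Annualised fire rate per 100,000 insulated houses, as described above:
# (fires in month / insulated houses) * 100,000 gives the monthly rate per
# 100,000 homes; multiplying by 12 annualises it.
def annualised_rate_per_100k(fires_in_month, insulated_houses):
    return fires_in_month / insulated_houses * 100_000 * 12

# Illustrative only: 10 fires in a month across ~4.27M insulated homes
print(round(annualised_rate_per_100k(10, 4_273_000), 2))  # 2.81
```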
If we take the average of this rate for the 23 months’ worth of data, that tells us that the average fire rate occurring (which consists of both the short term and long term components mentioned earlier) was 2.83 fires per 100,000 homes with insulation per year.
If we do the same for these pre-HIP houses using the March 2010 numbers onwards – the period where all pre-HIP insulation is older than 12 months – we get:
So the difference between the two rates is 2.83 – 2.06 = 0.76 (the arithmetic here uses unrounded values, so the rounded figures won’t reproduce it exactly).
Adjusting for the per-100,000 scaling and multiplying by the estimated pre-HIP stock gives us a number in terms of fires, which is 0.76 x (4,328,333/100,000) = 33.1 fires
So there were an estimated 33.1 fires that happened in the pre-HIP program each year from insulation installed for less than 12 months.
As there were approximately 70,000 installs undertaken each year pre-HIP, this gives us (33.1/70,000) x 100,000 = 47.3 fires per 100,000 homes per year as the short term pre-HIP fire rate.
This is the estimate of the number of fires per 100,000 houses with insulation less than 12 months old we would expect to see under the pre-HIP insulation industry.
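The whole subtraction above can be re-traced in a few lines of Python. It uses the rounded rates quoted in the text, so the result lands near – but not exactly on – the 33.1 and 47.3 figures, which come from unrounded data:

```python
# Re-tracing the pre-HIP short term rate derivation with the rounded rates
# quoted in the text; the small drift from the article's 33.1 and 47.3
# figures is purely rounding.
PRE_HIP_AVG_RATE = 2.83     # annualised fires per 100k insulated homes (pre-Feb 2009)
PRE_HIP_LONG_RATE = 2.06    # same measure, March 2010 onward (all stock > 12 months old)
PRE_HIP_STOCK = 4_328_333   # estimated number of pre-HIP insulated homes
INSTALLS_PER_YEAR = 70_000  # pre-HIP installation norm

short_term_component = PRE_HIP_AVG_RATE - PRE_HIP_LONG_RATE        # ~0.77 per 100k homes
short_term_fires = short_term_component * PRE_HIP_STOCK / 100_000  # ~33 fires a year
short_term_rate = short_term_fires / INSTALLS_PER_YEAR * 100_000   # ~48 per 100k installs
```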
If we compare that to the short term rate for the Home Insulation Program, we can do it two ways. First, we simply divide the number of fires that occurred within the first 12 months (153 to 154 fires – we’ll be generous and call it 154) by the number of installations (1,108,151) and multiply by 100,000 to give us the comparable rate:
(154/1,108,151)*100,000 = 13.9
Alternatively, we could utilise the annualised rates over the first 12 months (as we did for the pre-HIP data) using the data from Table 5.2 on page 32 of the report to end up with around 13.1.
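The first, simpler version of that calculation is trivial to check with the figures quoted above:

```python
# HIP short term rate: fires within 12 months of install, divided by total
# installs, scaled to a per-100,000 figure.
hip_fires_first_12_months = 154  # being generous, per the text (153 to 154)
hip_installs = 1_108_151

hip_short_term_rate = hip_fires_first_12_months / hip_installs * 100_000
print(round(hip_short_term_rate, 1))  # 13.9
```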
So let’s be conservative and use the larger rate.
The number of fires per 100,000 installs that occurred within 12 months of installation was 47.3 before the Home Insulation Program and 13.9 during the Home Insulation Program
The Home Insulation Program reduced the short term fire rate by approximately 70% compared to what was happening before it.
The Home Insulation Program was over 3 times safer than the industry it replaced in terms of the number of fires experienced within 12 months of getting insulation installed.
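Both of those headline claims fall straight out of the two rates:

```python
# Comparing the two short term rates derived above.
pre_hip_rate = 47.3  # fires per 100k installs within 12 months, pre-HIP
hip_rate = 13.9      # same measure under the Home Insulation Program

reduction = (pre_hip_rate - hip_rate) / pre_hip_rate  # ~0.71, i.e. ~70% lower
ratio = pre_hip_rate / hip_rate                       # ~3.4, i.e. over 3x safer
```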
Now, what about the long term fire rates – the rate of fires we expect to occur from insulation stock older than 12 months?
We’ve already figured it out for the pre-HIP insulation – 2.06 fires per year for every 100,000 houses with insulation installed.
What about the long term rate for Home Insulation Program?
Since not all of the insulation installed under the Home Insulation Program had been in the ceiling for longer than 12 months when the data was collected, we have to take note of the number of fires that occurred and the number of installs that were older than 12 months at the time. Thankfully, the CSIRO did this for us in Table 5.2 on page 32 of the report, and we can follow their approach here.
We’re after the data that matches the “has been installed for 351-400 days” or longer (the best estimate available for greater than 12 months), which looks like this:
So the long term rate for the post-12 month period is already starting to average around the 0.66 fires per 100,000 installed houses mark, compared to the 2.06 fires per 100,000 installed houses that we currently see from the pre-HIP industry installations. Again, the Home Insulation Program appears much safer than what it replaced in terms of the number of long term fires.
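The same quick comparison works for the long term rates:

```python
# Long term (post-12 month) rates, per 100,000 insulated houses per year.
pre_hip_long_rate = 2.06  # pre-HIP installations, March 2010 onward
hip_long_rate = 0.66      # HIP installations older than ~12 months

long_term_ratio = pre_hip_long_rate / hip_long_rate  # ~3.1x safer long term too
```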
This differs from the CSIRO’s long term rate because the CSIRO didn’t measure post-12 month rates in and of themselves, but deployed a methodology that measured a different thing entirely (and answered a completely different set of questions in the process).
Their current rate estimates carry fire rates from previous months into the calculation of each subsequent period. They used a model where, as you can see on page 33 of the report (reproduced below), the forecast of the model (the blue line) is already out of sync with the most recent 3 aggregated observations – observations covering over 35% of the time period of the entire sample.
However, even by that methodology and its reliance on curve fitting, the CSIRO estimate that the forecast line “flattens off to a value of 1.1 per 100,000 per year” – still only half of the long term rate we see happening currently with the most recent available data for houses insulated before the Home Insulation Program ever saw the light of day.
Ultimately, the HIP – as we’ve stated from the beginning, regularly, using publicly available data at the time – was much safer in terms of fire rates than what preceded it. Now, however, we know that it was safer over both the short term (the fire rates over the 12 month period from installation) as well as the longer term (the residual long term fire rates that occur from 12 months after the insulation was installed).
There’s plenty that could be said about the widespread and pathetic excuse for journalism that was involved in the coverage of the HIP program – but what else is new?
Much of News Ltd – particularly that shit sheet The Australian – not to mention the entertainers pretending to be informed commentators who live under the bridges of talkback radio, had their heads firmly embedded up their own sphincters. But again, what else is new? Their silence on the report is pretty predictable. An under-qualified media will continue to give us under-qualified policy analysis, leaving them regularly stranded on the wrong side of reality when it all comes out in the wash. Eggs, faces and no real surprises.
The sad part here is that a significant proportion of the public will also be left stranded in a fictional world – at least those among them with the not unreasonable expectation that the people informing them about political reality are actually up to the job.
But the fault lies not at the feet of those with reasonable expectations of the fourth estate, but with those who have proven – time and time again – that they cannot live up to those essentially reasonable expectations.
Which, thinking about them, reminds me of a song actually (NSFW)