Over the summer Ofgem issued an ECO2 Cavity Wall Checklist, which sets out the new rules of evidence that must be observed when overwriting the default RdSAP U-value for a wall with a higher U-value. In the consultation document Ofgem justified the new evidence requirements on the grounds that

we are concerned that wall U-values for cavity wall insulation measures (CWI) are being overwritten to values that we consider to be unreasonably high for the premises in question, and as a result the calculated savings for a measure are artificially inflated. This could lead to fewer households benefitting under ECO.

So at the heart of this is the claim that higher starting U-values will result in more savings (which may well be unjustified). But, surely if we are assessing the impact of insulation the starting U-value doesn’t matter? Does it?

Strangely, that is the one piece of the explanation which is missing from the consultation and from the final checklist; because while it is true that a given thickness of a given insulation will always have the same thermal resistance, it is not true that the benefit of that insulation will be the same.

Let’s unpack that a bit.

At its simplest, the thermal performance of a wall with regard to heat transfer can be expressed as a thermal resistance. If, for a moment, we ignore thermal bridging, we can calculate the thermal resistance of a wall by adding up the resistances of the layers of which it is composed. A cavity wall made up of an outer leaf of brick and an inner leaf of 100 mm dense aggregate blockwork with 20 mm of plaster to the inside might have a total resistance of 0.617 m²K/W. A similar wall with an inner leaf of 100 mm aircrete blockwork might have a resistance of 1.090 m²K/W. The wall with the aircrete inner leaf has a higher resistance than the wall with the dense blockwork inner leaf.
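That layer-by-layer sum is easy to sketch in a few lines of Python. The conductivities and surface resistances below are typical illustrative values, not the exact build-up behind the 0.617 m²K/W figure, so the total comes out close to, but not exactly at, that number:

```python
# Resistance of a solid layer: R = thickness / conductivity (m²K/W).
def layer_resistance(thickness_m, conductivity_w_mk):
    return thickness_m / conductivity_w_mk

# Standard external/internal surface and cavity resistances (m²K/W).
R_SE, R_SI, R_CAVITY = 0.04, 0.13, 0.18

# Illustrative dense-block cavity wall build-up (assumed conductivities).
layers = [
    layer_resistance(0.1025, 0.77),  # brick outer leaf
    layer_resistance(0.100, 1.13),   # dense aggregate block inner leaf
    layer_resistance(0.020, 0.57),   # dense plaster
]

total_r = R_SE + R_SI + R_CAVITY + sum(layers)
print(f"Total resistance: {total_r:.2f} m²K/W")  # ~0.61 m²K/W
```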

That’s fine as far as it goes, but it doesn’t allow us to quantify the most important aspect of performance: the rate of heat transfer. For that we need the U-value, which is the reciprocal of the thermal resistance and expresses the rate of heat transfer in watts per square metre kelvin (W/m²K). As an equation:

U = 1/R

So our two walls have U-values of 1.620 W/m²K and 0.918 W/m²K respectively.
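Those two figures follow directly from the reciprocal relationship, as a one-line check shows (rounded here to two decimal places, at which precision they match the values above):

```python
# U-value is the reciprocal of total thermal resistance: U = 1/R.
r_dense, r_aircrete = 0.617, 1.090  # wall resistances from above, m²K/W
u_dense, u_aircrete = 1 / r_dense, 1 / r_aircrete
print(f"{u_dense:.2f} W/m²K, {u_aircrete:.2f} W/m²K")  # 1.62 W/m²K, 0.92 W/m²K
```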

What happens if we add insulation to each of those walls, say 120 mm of insulation with a thermal conductivity of 0.030 W/mK? We can evaluate that by going back to the thermal resistance in each case. The insulation has a resistance of 4.0 m²K/W, so the wall resistances are now 4.617 m²K/W and 5.090 m²K/W, giving U-values of 0.217 W/m²K and 0.196 W/m²K. But it’s not just the final number we’re interested in, it’s the change. The wall with the dense block inner leaf has gone from 1.620 to 0.217, a drop of 1.403; the other wall went from 0.918 to 0.196, a drop of 0.722. So, although the final U-value is lower in the aircrete wall, the *reduction* in U-value is much greater in the dense block wall.
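The whole comparison can be reproduced in a few lines from the resistances quoted above (printed to two decimal places, at which precision the figures agree with those in the text):

```python
# Added resistance of the insulation: thickness / conductivity.
r_insulation = 0.120 / 0.030  # 120 mm at 0.030 W/mK -> 4.0 m²K/W

for name, r_wall in [("dense block", 0.617), ("aircrete", 1.090)]:
    u_before = 1 / r_wall                   # uninsulated U-value
    u_after = 1 / (r_wall + r_insulation)   # insulated U-value
    drop = u_before - u_after
    print(f"{name}: {u_before:.2f} -> {u_after:.2f} W/m²K, drop {drop:.2f}")
```

The same added resistance produces a drop of about 1.40 W/m²K for the dense block wall but only about 0.72 W/m²K for the aircrete wall.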

You can see that visually on the graph in this post, where the same thickness of insulation added to a wall with a high U-value gives a bigger drop than the same amount added to a wall with a low U-value. (There is a more detailed discussion of U-values in my book How Buildings Work.)
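The diminishing-returns shape of that curve is easy to tabulate: hold the added resistance fixed and vary the starting U-value (the starting values below are arbitrary, chosen only to show the trend):

```python
r_insulation = 4.0  # fixed added resistance, m²K/W

for u_start in [2.0, 1.5, 1.0, 0.5]:
    u_after = 1 / (1 / u_start + r_insulation)
    print(f"start {u_start:.1f} -> drop {u_start - u_after:.2f} W/m²K")
```

The drop shrinks from about 1.78 W/m²K at a starting U-value of 2.0 down to about 0.33 W/m²K at a starting value of 0.5, even though the insulation is identical in every case.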

As ECO is all about savings, a higher starting U-value is better. I took a standard house type I use in SAP training and experimented with the effect on emissions of using the two starting U-values I calculated earlier. The annual carbon dioxide savings are 1,080 kg CO₂ for insulating the dense block wall and 669 kg CO₂ for insulating the aircrete block wall. The insulation is the same, but the difference of 411 kg CO₂ is entirely down to that higher starting U-value.