
“Science-based targets”: sounds good, means very little

WHEN I FIRST heard the term science-based target (SBT) bandied around in the public arena I thought “oh good – they are advocating a rational approach to energy management”. I thought they were promoting the idea that I always push, which is to compare your actual energy consumption against an expected quantity calculated, on a scientific basis, from the prevailing conditions of weather, production activity, or whatever other measurable factors drive variation in consumption.

How wrong I was. Firstly, SBTs are targets for emissions, not energy consumption; and secondly, a target is defined as ‘science-based’ if, to quote the Carbon Trust, “it is in line with the level of decarbonisation required to keep the global temperature increase below 2°C compared to pre-industrial temperatures”. I have three problems with all of this.

Firstly I have a problem with climate change. I believe it is real, of course; and I am sure that human activity, fuel use in particular, is the major cause. What I don’t agree with is using it as a motivator or as a way to define goals. It is too remote, too big, and too abstract to be relevant to the individual enterprise. And it is too contentious. To mention climate change is to invite debate; to debate is to delay.

Secondly, global targets cannot be transcribed directly into local ones. If your global target is a reduction of x% and you set x% as the target for every user, you will fail because some people will be unable or unwilling to achieve a cut of x% while those who do achieve x% will stop when they have done so. In short there will be too few over-achievers to compensate for the laggards.

Finally I object to the focus on decarbonisation. Not that decarbonisation itself is valueless; quite the opposite. It is the risk that people prioritise decarbonisation of supply, rather than reduction of demand. If you decarbonise the supply to a wasteful operation, you have denied low-carbon energy to somebody somewhere who needed it for a useful purpose. We should always put energy saving first, and that is where effective monitoring and targeting, including rational comparisons of actual and expected consumption, has an essential part to play.

Bulk measurement and verification

Anyone familiar with the principles of monitoring and targeting (M&T) and measurement and verification (M&V) will recognise the overlap between the two. Both involve establishing the mathematical relationship between energy consumption and one or more independently-variable ‘driving factors’, of which one important example would be the weather expressed numerically as heating or cooling degree days.
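By way of illustration, here is a minimal sketch of how weekly heating degree days might be computed from daily mean temperatures. The 15.5 °C base temperature is a common UK convention but still an assumption, and the temperature figures are invented, not data from any real site.

```python
# Sketch: weekly heating degree days (HDD) from daily mean temperatures.
# The base temperature and the sample week are assumptions for illustration.

BASE_TEMP = 15.5  # °C - choose the base suited to the building in practice

def daily_hdd(mean_temp, base=BASE_TEMP):
    """Degree days accrued on one day: the shortfall below the base."""
    return max(0.0, base - mean_temp)

def weekly_hdd(daily_means, base=BASE_TEMP):
    """Sum the daily degree days over a seven-day week."""
    return sum(daily_hdd(t, base) for t in daily_means)

week_of_temps = [4.0, 6.5, 8.0, 12.0, 15.5, 17.0, 10.0]  # invented figures
print(round(weekly_hdd(week_of_temps), 1))  # → 37.0
```

The weekly HDD figure then serves as the driving-factor value against which that week's consumption is regressed.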

One of my clients deals with a huge chain of retail stores with all-electric services. The stores are the subject of a rolling refit programme, during which the opportunity is taken to improve energy performance. Individually the savings, although a substantial percentage, are too small in absolute terms to warrant full-blown M&V. Nevertheless my client wanted some kind of process to confirm that savings were being achieved and to estimate their value.

My associate Dan Curtis and I set up a pilot process dealing in the first instance with a sample of a hundred refitted stores. We used a basic M&T analysis toolkit capable of cusum analysis and regression modelling with two driving factors, plus an overspend league table (all in accordance with Carbon Trust Guide CTG008). Although historical half-hourly data are available we based our primary analysis on weekly intervals.

The process

The scheme will work like this. After picking a particular dataset for investigation, the analyst will identify a run of weeks prior to the refit and use their data to establish a degree-day-related formula for expected consumption. This becomes the baseline model (note that in line with best M&V practice we talk about a ‘baseline model’ and not a baseline quantity; we are interested in the constant and coefficients of the pre-refit formula). Here is an example of a store whose electricity consumption was weakly related to heating degree days prior to its refit:
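For readers who like to see the arithmetic, here is a hypothetical sketch of fitting such a baseline model (expected kWh = c + m × HDD) by ordinary least squares. The function name and all the weekly figures are invented for illustration; any M&T toolkit would do this internally.

```python
# Minimal ordinary-least-squares fit of a one-factor baseline model
# E = c + m * HDD, in pure Python. Figures are invented pre-refit weeks.

def fit_line(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    m = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
        sum((xi - mean_x) ** 2 for xi in x)
    c = mean_y - m * mean_x
    return c, m

hdd = [10, 20, 30, 40, 50]             # weekly heating degree days
kwh = [1050, 1100, 1140, 1210, 1250]   # weekly consumption, same weeks

c, m = fit_line(hdd, kwh)

def expected(week_hdd):
    """Baseline model: expected consumption for a given week's HDD."""
    return c + m * week_hdd

print(round(c, 1), round(m, 2))  # fitted constant and HDD coefficient
```

The constant and coefficient, not any single kWh quantity, are what get carried forward as the baseline.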

Cusum analysis using this baseline model yields a chart which starts horizontal but then turns downwards when the energy performance improves after the refit:
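The cusum calculation itself is simple enough to sketch: accumulate each week's difference between actual and expected consumption, so that a persistent saving appears as a downward-sloping trace. The figures below are invented, with the refit assumed after week three.

```python
# Sketch of cusum against a baseline model: a running total of
# (actual - expected) per week. All figures are invented.

def cusum(actual, expected):
    total, trace = 0.0, []
    for a, e in zip(actual, expected):
        total += a - e
        trace.append(total)
    return trace

expected = [1000, 1000, 1000, 1000, 1000, 1000]  # baseline-model estimates
actual   = [1005,  995, 1000,  940,  950,  945]  # refit after week 3

print(cusum(actual, expected))
# → [5.0, 0.0, 0.0, -60.0, -110.0, -165.0]
```

The horizontal start followed by the steady downward turn is exactly the shape described above.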

Thanks to the availability of half-hourly data, the M&T software can display a ‘heatmap’ chart showing half-hourly consumption before, during and after the refit. In this example it is interesting to note that savings did not kick in until two weeks after completion of the refit:
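Behind the scenes a heatmap is essentially a reshaping exercise: the flat stream of half-hourly readings is folded into a day-by-period grid, with one coloured cell per reading. Here is a hypothetical sketch of that reshaping (invented readings, 48 half-hour periods per day):

```python
# Sketch of reshaping half-hourly readings into the day x half-hour grid
# behind a heatmap display. Real software would colour each cell; here we
# just build the matrix. Readings are invented.

def to_grid(readings, periods_per_day=48):
    """Flat list of readings in time order -> list of per-day rows."""
    return [readings[i:i + periods_per_day]
            for i in range(0, len(readings), periods_per_day)]

two_days = [0.5] * 48 + [0.4] * 48   # day 2 uses less in every period
grid = to_grid(two_days)
print(len(grid), len(grid[0]), grid[1][0])  # → 2 48 0.4
```

A drop in consumption after a refit shows up as a visible change of shading from one row of the grid to the next.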

Once enough weeks have passed (as in the case under discussion) the analyst can carry out a fresh regression analysis to establish the new performance characteristic, and this becomes the target for every subsequent week. The diagram below shows the target (green) and baseline (grey) characteristics, at a future date when most of the pre-refit data points are no longer plotted:

A CTG008-compliant M&T scheme retains both the baseline and target models. This has several benefits:

  • Annual savings can be projected fairly even if the pre- or post-refit periods are less than a year;
  • The baseline model enables savings to be tracked objectively: each week’s ‘avoided energy consumption’ is the difference between actual consumption and what the baseline model yielded as an estimate (given the prevailing degree-day figures); and
  • The target model provides a dynamic yardstick for ongoing weekly consumptions. If the energy-saving measures cease to work, actual consumption will exceed what the target model predicts (again given the prevailing degree-day figures). See final section below on routine monitoring.
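The two weekly calculations in the bullet points above can be sketched as follows. The model coefficients and the week's figures are invented, and the names `baseline` and `target` are hypothetical, not taken from any particular toolkit.

```python
# Sketch of the weekly tracking calculations: 'avoided consumption'
# against the baseline model, and an exception check against the target
# model. Coefficients and weekly figures are illustrative only.

def model(c, m):
    """Build a one-factor expected-consumption formula E = c + m * HDD."""
    return lambda hdd: c + m * hdd

baseline = model(1200, 6.0)   # pre-refit characteristic
target   = model(1000, 5.0)   # post-refit characteristic

week_hdd, actual_kwh = 40, 1230   # one invented week

avoided = baseline(week_hdd) - actual_kwh   # claimed saving this week
excess  = actual_kwh - target(week_hdd)     # overspend versus target

print(avoided, excess)  # → 210.0 30.0
```

Keeping both models means the same week's figure can be read two ways: as a saving relative to how the store used to behave, and as a (smaller) overspend relative to how it should now behave.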

I am covering advanced M&T methods in a workshop on 11 September in Birmingham

A legitimate approach?

Doing measurement and verification this way is a long way off the requirements of IPMVP. In the circumstances we are talking about – a continuous pipeline of refits managed by dozens of project teams – it would never be feasible to have M&V plans for every intervention. Among the implications of this is that no account is taken (yet) of static factors. However, the deployment of heat-map visualisations means that certain kinds of change (for example altered opening hours) can be spotted easily, and others will reveal themselves as unexplained shifts in consumption. I would expect that with the sheer volume of projects being monitored, my client will gradually build up a repertoire of common static-factor events and their typical impact. This makes the approach essentially a pragmatic one of judging by results after the event; the antithesis of IPMVP, but much better aligned to real-world operations.

Long-term routine monitoring

The planned methodology, particularly when it comes to dealing with erosion of savings performance, relies on being able to prioritise adverse incidents. Analysts should only be investigating in depth those cases where something significant has gone wrong. Fortunately the M&T environment is perfect for this, since ranked exception reporting is one of its key features. Every week, the analyst will run the Overspend League Table report, which ranks any discrepancies in descending order of apparent weekly cost:

Any important issues are therefore at the top of page 1, and a significance flag is also provided: a yellow dot indicating variation within normal uncertainty bands, and a red dot indicating unusually high deviation. Remedial effort can then be efficiently targeted, and expected-consumption formulae retuned if necessary.
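A ranked exception report of this kind reduces to a short calculation: cost each store's excess over its target-model estimate, flag its significance, and sort. The sketch below is hypothetical; the flat unit price, the tolerance threshold, and all store figures are invented, and a real scheme would derive the significance bands statistically from each model's scatter rather than from a fixed kWh tolerance.

```python
# Sketch of an overspend league table: excess over target, costed,
# flagged, and ranked. Price, tolerance, and figures are invented.

PRICE = 0.15  # £/kWh, assumed flat rate for illustration

def league_table(rows, tolerance_kwh=50):
    """rows: (store, actual_kwh, target_kwh) -> ranked overspend report."""
    report = []
    for store, actual, target in rows:
        excess = actual - target
        flag = 'RED' if excess > tolerance_kwh else 'YELLOW'
        report.append((store, round(excess * PRICE, 2), flag))
    return sorted(report, key=lambda r: r[1], reverse=True)

rows = [('Store A', 1230, 1200),
        ('Store B',  980, 1050),
        ('Store C', 1510, 1400)]

for line in league_table(rows):
    print(line)
```

The biggest apparent loss lands at the top, so the analyst's attention goes first to the cases that matter.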

Monitoring external lighting

The diagram below shows the relationship, over the past year, between weekly electricity consumption and the number of hours of darkness per week for a surface car park. It is among the most consistent cases I have ever seen:

Figure 1: relationship between kWh and hours of darkness

There is a single outlier (caused by meter error).

Although both low daylight availability and cold weather occur in the winter, heating degree days cannot be used as the driving factor for daylight-linked loads. Plotting the same consumption data against heating degree days gives a very poor correlation:

Figure 2: relationship between kWh and heating degree days

There are two reasons for the poor correlation. One is the erratic nature of the weather (compared with very regular variations in daylight availability) and the other is the phase difference of several weeks between the shortest days and the coldest weather. If we co-plot the data from Figure 2 as a time-series chart we see this illustrated perfectly. In Figure 3 the dots represent actual electricity consumption and the green trace shows what consumption was predicted by the best-fit relationship with heating degree days:

Figure 3: actual kWh compared with a weather-linked model of expected consumption

Compare Figure 3 with the daylight-linked model:

Figure 4: actual and expected kWh co-plotted using daylight-linked model

One significant finding (echoed in numerous other cases) is that it is not necessary to measure actual hours of darkness: standard weekly figures work perfectly well. It is evident that occasional overcast and variable cloud cover do not introduce perceptible levels of error. Moreover, figures for the UK appear to work acceptably at other latitudes: the case examined here is in northern Spain (41°N) but used my standard darkness-hour table for 52°N.
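For anyone wanting to generate their own figures rather than use a published table, hours of darkness can be approximated from latitude alone using the standard sunset-hour-angle formula. The sketch below uses a textbook approximation for solar declination and ignores refraction and twilight, so it is indicative only.

```python
# Sketch: approximate hours of darkness per day and per week from
# latitude and day-of-year, via the sunset-hour-angle formula.
# Declination approximation is a common simplification; refraction
# and twilight are ignored.

import math

def darkness_hours(day_of_year, lat_deg):
    # Approximate solar declination (radians) for the given day.
    decl = math.radians(-23.44) * math.cos(2 * math.pi * (day_of_year + 10) / 365)
    lat = math.radians(lat_deg)
    x = -math.tan(lat) * math.tan(decl)
    x = max(-1.0, min(1.0, x))          # clamp for polar day/night
    daylight = (24 / math.pi) * math.acos(x)
    return 24 - daylight

def weekly_darkness(start_day, lat_deg):
    """Total darkness hours over the seven days from start_day."""
    return sum(darkness_hours(start_day + d, lat_deg) for d in range(7))

print(round(weekly_darkness(355, 52), 1))  # midwinter week at 52°N
```

At 52°N this gives roughly twelve dark hours a day at the equinoxes, rising to over sixteen in midwinter, which is the regular seasonal swing that makes darkness-linked regression so consistent.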

You can download my standard weekly and monthly hours-of-darkness tables here.

This article is promoting my advanced energy monitoring and targeting workshop in Birmingham on 11 September

ISO 50001: transition to 2018 edition

The story so far: ISO 50001 is an international standard which lays down a harmonised recommended method for managing energy. Published in 2011, it is analogous to ISO 14001, which covers environmental management, and ISO 9001, which covers quality management. Organisations can be certified to ISO 50001 to show that they have energy-management procedures which meet certain criteria.

At the time of writing, the original 2011 edition of ISO 50001 is due to be phased out and replaced with a new 2018 version. To help understand the differences, I have approached it from the point of view of the main topics that you or an auditor might explore when establishing compliance, and the questions that would be asked. I give the section references of both old (2011) and new (2018) editions, and where necessary there is a note of any material differences.


Project sketch: vetting product offers

My client in this case is an international hotel brand. Individual hotels get approached by people selling questionable energy-saving products and rarely if ever have enough knowledge to defend themselves against bogus and exaggerated offers.

The company has established a core group of engineers and sustainability staff to carry out centralised vetting. My job is to provide technical advice during the initial filtering phase and to join a twice-yearly meeting to interview suppliers who are being taken further.

Project sketch: user requirement specification

Our client, a university, has a long-established metering system based on proprietary hardware with associated software for managing and interrogating the meters and storing their output for use, among other things, in a monitoring and targeting scheme. They have two major stakeholders, one predominantly interested in monitoring and managing power quality and availability, and the other in billing the various user departments. The existing scheme suffers from certain limitations and the client is considering migrating to a new data-collection provider.

Project sketch: bulk measurement and verification

Our client in this case is a national retail chain which is continually and progressively improving its estate through the application of generic energy-saving fixes. Savings need to be measured and verified, but individual project values and expected savings are generally too low to merit the cost of rigorous adherence to the International Performance Measurement and Verification Protocol (IPMVP).

We are conducting a proof-of-concept study.

Using M&T techniques on billing patterns

One of my current projects is to help someone with an international estate to forecast their monthly energy consumption and hence develop a monthly budget profile. Their budgetary control will be that much tighter because it has seasonality built into it in a realistic fashion.

Predicting kWh consumption at site level is reasonably straightforward because one can use regression analysis against appropriate heating and cooling degree-day values, and then extrapolate using (say) ten-year average figures for each month. The difficulty comes in translating predicted consumptions into costs. To do this rigorously one would mimic the tariff model for each account, but apart from being laborious this method needs inputs relating to peak demand and other variables, and it presumes being able to get information from local managers in a timely manner.

To get around these practical difficulties I have been trying a different approach. Using monthly book-keeping figures I analysed, in each case, the variation in spend against the variation in consumption. Gratifyingly, nearly all the accounts I looked at displayed a straight-line relationship, i.e., a certain fixed monthly spend plus a flat rate per kWh. Although these were only approximations, many of them were accurate to within half a percent or so. Here is an example in which the highlighted points represent the most recent nine months, which are evidently on a different tariff from before:
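As a sketch of the underlying arithmetic, the fixed-plus-flat-rate decomposition can be recovered by simple least-squares regression of monthly spend against monthly kWh. The billing figures below are invented for illustration; on a real account the fit would only be approximate.

```python
# Sketch of the bill-pattern analysis: regress monthly spend against
# monthly kWh to recover an apparent fixed charge plus flat unit rate.
# Billing figures are invented and happen to fit exactly.

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return my - slope * mx, slope       # (fixed charge, rate per kWh)

kwh   = [8000, 9500, 11000, 12500, 14000]   # monthly consumption
spend = [1300, 1480, 1660, 1840, 2020]      # monthly spend, same months

fixed, rate = fit_line(kwh, spend)
print(round(fixed, 2), round(rate, 4))  # → 340.0 0.12
```

Multiplying a predicted monthly kWh by the recovered rate and adding the fixed element then gives a budget cost without having to reconstruct the contract tariff.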

I am not claiming this approach would work in all circumstances but it looks like a promising shortcut.

Cusum analysis also had a part to play because it showed if there had been tariff changes, allowing me to limit the analysis to current tariffs only.

Furthermore, in one or two instances there were clear anomalies in past bills, where spends exceeded what would have been expected. This suggests it would be possible to include bill-checking in a routine monitoring and targeting scheme without the need for thorough scrutiny of contract tariffs.

The methods discussed in this article are taught as part of my energy monitoring and targeting courses: click here for details