One of my current projects is to help someone with an international estate to forecast their monthly energy consumption and hence develop a monthly budget profile. Their budgetary control will be that much tighter because it has seasonality built into it in a realistic fashion.
Predicting kWh consumptions at site level is reasonably straightforward because one can use regression analysis against appropriate heating and cooling degree-day values, and then extrapolate using (say) ten-year average figures for each month. The difficulty comes in translating predicted consumptions into costs. To do this rigorously one would mimic the tariff model for each account, but apart from being laborious this method needs inputs relating to peak demand and other variables, and it presumes being able to get information from local managers in a timely manner. To get around these practical difficulties I have been trying a different approach. Using monthly book-keeping figures I analysed, in each case, the variation in spend against the variation in consumption. Gratifyingly, nearly all the accounts I looked at displayed a straight-line relationship, i.e., a certain fixed monthly spend plus a flat rate per kWh. Although these were only approximations, many of them were accurate to half a percent or so. Here is an example in which the highlighted points represent the most recent nine months, which are evidently on a different tariff from before:
I am not claiming this approach would work in all circumstances but it looks like a promising shortcut.
Cusum analysis also had a part to play because it showed whether there had been tariff changes, allowing me to limit the analysis to current tariffs only.
The methods discussed in this article are taught as part of my energy monitoring and targeting courses: click here for details
Furthermore, in one or two instances there were clear anomalies in the past bills where spends exceeded what would have been expected. This suggests it would be possible to include bill-checking in a routine monitoring and targeting scheme without the need for thorough scrutiny of contract tariffs.
I ALWAYS THOUGHT that the diagrammatic representation of the “plan, do, check, act” cycle in ISO50001:2011 was a little strangely drawn (left-hand side of the picture below), although it does vaguely give the sense of a preparatory period followed by a repetitive cycle and occasional review. Turns out, though, that it was wrong all along, because in the 2018 version of the Standard, the final draft of which is available to buy in advance of publication in August, it seems to have been “corrected” (right-hand side below). For my money the new version is less meaningful than the old one.
ISO50001 has been revised not because there was much fundamentally wrong with the 2011 version but as a matter of standards policy: it and other management-system standards such as ISO9001 (quality) and ISO14001 (environment) have a lot in common and are all being rewritten to match a new common “High Level Structure” with identical core text and harmonized definitions. ISO50001’s requirements, with one exception, will remain broadly the same as they were in 2011.
It is just a pity that ISO50001:2018 fails in some respects to meet its own stated objective of clarity, and there is evidence of muddled thinking on the part of the authors. The PDCA diagram is a case in point. I see also, for example, that the text refers to relevant variables (i.e., driving factors like degree days etc) affecting energy ‘performance’ whereas what they really affect is energy consumption. To take a trivial example, if you drive twice as many miles one week as another, your fuel consumption will be double but your fuel performance (expressed as miles per gallon) might well be the same. Mileage in this case is the relevant variable but it is the consumption, not the performance, that it affects. This wrong-headed view of ‘performance’ pervades the document and looking in the definitions section of the Standard you can see why: to most of us, energy performance means the effectiveness with which energy is converted into useful output or service; ISO50001:2018 however defines it as ‘measurable result(s) related to energy efficiency, energy use, and energy consumption’. I struggle to find practical meaning in that, and I suspect the drafting committee members themselves got confused by it.
Furthermore, the committee have ignored warnings about ambiguity in the way they use the term Energy Performance Indicator (EnPI). There are always two aspects to an EnPI: (a) the method by which it is calculated, what we might call the EnPI formulation, and (b) its numerical value at a given time. Where the new standard means the latter, it says so, and uses the phrase ‘EnPI value’ in such cases. However, when referring to the EnPI formulation, it unwisely expresses this merely as ‘EnPI’, which is open to misinterpretation by the unwary. For example, Section 6.4, Energy Performance Indicators, says that the method for determining and updating the EnPI(s) shall be maintained as documented information. I bet a fair proportion of people will take the phrase ‘determining and updating the EnPI(s)’ to mean calculating their values. It does not. The absence of the word ‘values’ means that you should be determining and updating what EnPIs you use and how they are derived.
Failure to explicitly label EnPI ‘formulations’ as such has also led to an error in the text: section 9.1.1 bullet (a) (2) says that EnPIs need to be monitored and measured. That should obviously have said EnPI values.
The new version adds an explicit requirement to ‘demonstrate continual energy performance improvement’. No such explicit requirement appeared in the 2011 text, but since last year, thanks to the rules governing certifying bodies, you cannot even be certified in the first place if you don’t meet it. There was a lot of debate on this during consultation, but the requirement survived even though it does not appear in the much-vaunted High Level Structure to which ISO50001 was supposedly rewritten to conform. That being the case, it is paramount that users adopt energy performance indicators that accurately reflect progress. Simple ratio-based metrics like kWh/tonne (or, in data centres, Power Usage Effectiveness) are not fit for purpose, and their users risk losing their certification because EnPIs of that kind often give perverse results and may fail to reflect savings that have really been achieved.
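A toy illustration, with invented figures, of why a simple ratio like kWh/tonne misleads. Assume a plant whose expected consumption is a fixed base load plus a constant amount per tonne of output:

```python
# Hypothetical plant: expected consumption = 20000 + 150 * tonnes (kWh).
BASE, RATE = 20_000, 150

def kwh_per_tonne(tonnes):
    """Ratio-based EnPI for a month with the stated throughput."""
    return (BASE + RATE * tonnes) / tonnes

slack = kwh_per_tonne(100)   # 350 kWh/t in a slack month
busy  = kwh_per_tonne(400)   # 200 kWh/t in a busy month
# Identical underlying performance, yet the 'EnPI value' improves by over
# 40% simply because the fixed base load is spread over more tonnes.
```

An energy manager judged on kWh/tonne would be rewarded for a busy month and punished for a quiet one, regardless of anything they actually did.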
On a positive note, the new version of the Standard retains the requirement to compare actual and expected energy consumption, and to investigate significant deviations in energy performance. These requirements are actually central to effective ongoing energy management. Moreover, a proper understanding of how to calculate expected consumption is the key to the computation of accurate EnPIs, making it a mission-critical concept for anyone wanting to keep their certification.
One ironic and highly satisfying way to debunk the claims for magnetic fuel conditioning is to pitch one supplier against another. I have been digging in the archive for claims made by different suppliers, and with assistance from eagle-eyed newsletter reader Mark J., have compiled the following account. Let’s start with Magnatech. Their web site makes a bald assertion that passing fuel through a magnet’s negative and positive (sic) fields makes it easier for the fuel to bond with oxygen and burn. They offer no explanation of how this works but say it creates a rise in flame temperature of “an extra 120°C or more”. However, their competitor Maximus Green says that the flame temperature only rises by 20°C, but they gamely have a crack at explaining how: they claim that hydrocarbon fuel molecules clump together in large “associations” because they are randomly charged positive and negative (although even if that were true, wouldn’t they just pair up?). Passing through a magnetic field, they say, gives all the molecules a positive charge, breaking up these supposed big clusters of fuel molecules. They don’t say where all the resulting spare electrons go.
Or at least that’s what Maximus Green used to say. In a recent (unsuccessful) submission to the Advertising Standards Authority they offered a completely different story. Quoted in the ASA ruling they said that “the hydrogen and carbon compound of gas and oil had two distinct isometric (sic) forms – ‘Ortho-state’ and ‘Para-state’ – which were characterised by different, opposite nucleus spins. The Ortho-state was more unstable and reactive in comparison to the Para-state, and therefore that state was desired because it resulted in a higher rate of combustion. They said that when fuel passed through the magnetic field the hydrocarbon molecule changed from the para-hydrogen state to the ortho-hydrogen state, and that the higher energised spin state of the ortho-hydrogen molecules produced high electrical potential (reactivity), which attracted additional oxygen and therefore increased combustion efficiency”.
Another player, Maxsys, meanwhile, are having none of this ionised oil, lumpy gas or nuclear spin stuff. Their 2014 brochure lays the blame on very fine dust in the fuel. By applying a magnetic field, they say “nanoparticles that would normally pass through the combustion or reduce heat transfer efficiency, by clinging to and fouling surfaces, begin to cluster together”, an effect which forms “larger colloids, less likely to create a film deposit and compromise a plant’s performance”. Now pardon my scientific knowledge, but a “colloid” is a stable suspension of very fine particles in a liquid. Milk is a good example. Be that as it may, Maxsys are saying that magnetic fields cause things to clump together, in direct contradiction to what we heard earlier from Maximus Green in one of their versions of how magnetism supposedly works.
Someone is telling porkies and I will leave it to you, dear reader, to work out who.
Footnote: an independent test of the efficacy of magnets on fuel lines was carried out by Exeter University in 1997. Their report, which strangely is never quoted by vendors, can be downloaded here.
This kit of parts appears in my latest “Kitchen Table-top Talk” on energy topics. It demonstrates the principle of evaporative cooling, a technique which can reduce the temperature of air in a space without the need for active refrigeration. To view the video please visit my YouTube channel.
Certification to ISO50001 can yield benefits, but it will be fatally compromised if a misleading energy performance indicator is used to track progress.
Power Usage Effectiveness, PUE, is the data-centre industry’s common way of reporting energy performance, but it does not work. It is distorted by weather conditions and (worse still) gives perverse results if users improve the energy efficiency of the IT equipment housed in a centre through (for example) virtualisation.
This presentation given at Data Centres North in May 2018 explains the problem and shows how a corrected PUE should be computed.
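A toy calculation of the perversity, using invented loads (these numbers are not taken from the presentation). PUE is total facility power divided by IT power, so cutting the IT load makes the ratio look worse even when total demand falls:

```python
# Hypothetical data centre: 1000 kW of IT load, 500 kW of cooling/overheads.
it_kw, overhead_kw = 1000.0, 500.0
pue_before = (it_kw + overhead_kw) / it_kw   # PUE = 1.5

# Virtualisation halves the IT load; overheads fall only modestly because
# much of the cooling and distribution loss is effectively fixed.
it_kw, overhead_kw = 500.0, 400.0
pue_after = (it_kw + overhead_kw) / it_kw    # PUE = 1.8

# Total demand fell from 1500 kW to 900 kW, a real and substantial saving,
# yet the reported PUE got 'worse'.
```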
And the winner of the Pants on Fire Award is… DB2 Management OÜ who sell a product called ‘Ecovolt’. This device, which plugs into a standard 13A wall socket, is claimed to cut 30-50% off your electricity consumption. What makes it a stand-out candidate for the Pants on Fire Award is the advertisers’ invocation of conspiracy theory. Their web site includes a short video purporting to prove the device’s energy-saving effect. It shows a pair of electric hair clippers on an extension adaptor drawing 0.28 A. When the Ecovolt device is plugged into a neighbouring socket, the current falls to 0.08 A. Electrical engineers will recognise this as an example of power-factor correction and nothing to do with reducing the real power drawn by the appliance; like the EPS Energy Saver which I reported on a couple of years ago (pictured below), the Ecovolt probably contains a big capacitor and not much else.
The visitor to the web site sees continual pop-up notices saying that Tatiana, Sara, or Phillip and so on have just ordered Ecovolt. Keep your eye on those alerts for more than 70 seconds and Tatiana, Sara and Phillip appear again followed by six other repeated names. That’s the kind of loyal customer we all want.
The firm operates out of a Post Office Box in Tallinn, Estonia, and sells a diverse product range including night driving glasses, dash cams, and non-stick frying pans. I ought also to mention that Ecovolt is someone else’s trade mark.
Dateline 1st April 2018: we bring you news of the first ever universal energy-saving product. It is a multi-award-winning patented gel, discovered by an ex-NASA scientist, which boasts a unique combination of nano-magnetic and photo-piezo-electric properties.
Used as an additive in heating-system water it has a triple action. Firstly by reducing surface tension, it improves thermal contact between the water and internal heat transfer surfaces. As a result radiators heat up faster and cool down more slowly, saving energy. Secondly it removes air (improving thermal contact between the water and internal heat transfer surfaces). Removing air means less corrosion and scaling, while its nano-magnetic properties repel any residual magnetite. As a result radiators heat up faster and cool down more slowly, saving energy. Finally it fills in the gaps between water molecules, improving thermal contact between the water and internal heat transfer surfaces. As a result radiators heat up faster and cool down more slowly, saving energy.
The product can also be applied to radiators externally as a paint which promotes heat transfer through far infra-red radiation. As a result rooms heat up faster and cool down more slowly, saving energy.
Another way to use it is as a wall paint. Used externally, its embedded nano-scale vacuum bubbles allow it to act as a superinsulator: just 0.25mm thickness is the equivalent of 7cm thick conventional cavity fill or exterior wall insulation. As an internal paint applied to the wall behind a heating radiator it reflects wasted heat back into the room, which then heats up faster and cools down more slowly, saving energy. The gel changes to a solid at exactly your preferred room temperature, absorbing or releasing latent heat. As a result of this ‘phase change’ action, when applied as an undercoat for interior wall paint or as a wallpaper adhesive, the room will heat up faster and cool down more slowly while maintaining a steady temperature, saving energy.
It can even be used for painting windows, where its photo-electric properties allow it to generate free energy from the sun without loss of light transmission into the room, and as a floor paint its piezo-electric properties mean it can capture energy from passing pedestrians, generating enough power.
It has benefits in plant rooms and substations, too. As a coating on gas or oil supply pipes, its nano-magnetic effect yields all the benefits of the different types of awkward and bulky bolt-on magnetic devices. For example by rearranging the ortho- and para-hydrogen molecules it promotes more complete and rapid combustion. It also aligns the fuel molecules and makes them more reactive, which promotes more complete and rapid combustion. In the case of oil fuels this calorific value enhancement (CVE) can be further increased by adding the product to the fuel itself, where it alters a previously-undiscovered property of the oil to make its molecules more reactive, which promotes more complete and rapid combustion.
The gel is non-Newtonian, so its action does not have any equal and opposite reaction, making it an ideal lubricant to reduce energy losses in gearboxes.
On electrical systems the product can be applied to the outer insulation of supply cables where its nano-magnetic properties will optimise the voltage without the need for transformers or other lossy electrical devices. Moreover, it has the effect of counteracting the random ‘Brownian motion’ of the free electrons in the conductors so that they move in a more orderly manner through your electrical equipment, improving its efficiency by up to several percent.
As a refrigerant additive, it modifies a previously-unknown property of the refrigerant fluid, causing it to absorb heat faster and release it more slowly, saving energy, and when applied to the thermostat sensor of a freezer it shields it from the effects of changing temperature, reducing the operation of the refrigeration compressor and saving energy.
In his highly-recommended book Information dashboard design, data-presentation guru Stephen Few criticises pie charts as being a poor way to present numerical data and I quite strongly agree. Although they seem to be a good way to compare relative quantities, they have real limitations especially when there are more than about five categories to compare. A horizontal bar chart is nearly always going to be a better choice because:
there is always space to put a label against each item;
you can accommodate more categories;
relative values are easier to judge;
you can rank entries for greater clarity;
it will take less space while being more legible; and
you don’t need to rely on colour coding (meaning colours can be used to emphasise particular items if needed).
Pie charts with numerous categories and a colour-coded key can be incredibly difficult to interpret, even for readers with perfect colour perception, and bad luck if you ever have to distribute black-and-white photocopies of them.
Data presentation is one of the topics I cover in my advanced M&T master classes. For forthcoming dates click here
ONE OF MY GREAT FRUSTRATIONS when training people in the analysis and presentation of energy consumption data is that there are very few commercial software products that do the job sufficiently well to deserve recommendation. If any developers out there are interested, these are some of the things you’re typically getting wrong:
1. Passive cusum charts: energy M&T software usually includes cusum charting because it is widely recognised as a desirable feature. The majority of products, however, fail to exploit cusum’s potential as a diagnostic aid, and treat it as nothing more than a passive reporting tool. What could you do better? The key thing is to let the user interactively select segments of the cusum history for analysis. This allows them, for example, to pick periods of sustained favourable performance in order to set ‘tough but achievable’ performance targets; or to diagnose behaviour during abnormal periods. Being able to identify the timing, magnitude and nature of an adverse change in performance as part of a desktop analysis is a powerful facility that good M&T software should provide.
2. Dumb exception criteria: if your M&T software flags exceptions based on a global percentage threshold, it is underpowered in two respects. For one thing, the cost of a given percentage deviation depends crucially on the size of the underlying consumption and the unit price of the commodity in question. Too many users are seeing a clutter of alerts about what are actually trivial overspends.
Secondly, different percentages are appropriate in different cases. Fixed-percentage thresholds are weak because they are arbitrary: set the limit too low, and you clutter your exception reports with alerts which are in reality just normal random variations. Set the threshold too high, and solvable problems slip unchallenged under the radar. The answer is to set a separate threshold individually for each consumption stream. It sounds like a lot of work, but it isn’t; it should be easy to build the required statistical analysis into the software.
3. Precedent-based targets: just comparing current consumption with past periods is a weak method. Not only is it based on the false premise that prevailing conditions will have been the same; if the user happens to suffer an incident that wastes energy, it creates a licence to do the same a year later. There are fundamentally better ways to compute comparison values, based on known relationships between consumption and relevant driving factors.
Tip: if your software does not treat degree-day figures, production statistics and other driving factors as equal in importance to consumption data, you have a fundamental problem.
4. Showing you everything: sometimes the reporting philosophy seems to be “we’ve collected all this data so we’d better prove it”, and the software makes no attempt to filter or prioritise the information it handles. A few simple rules are worth following.
Your first line of defence can be a weekly exception report (daily if you are super-keen);
The exception report should prioritise incidents by the cost of the deviations from expected consumption;
It should filter out or de-emphasise those that fall within their customary bounds of variability;
Only in significant and exceptional cases should it be necessary to examine detailed records.
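The prioritisation rule above can be sketched in a few lines; the meter names and figures here are invented for illustration.

```python
# Invented exception records: (meter, expected kWh, actual kWh, GBP/kWh).
readings = [
    ("Boiler house",  12000, 13500, 0.045),
    ("Compressors",    3000,  3900, 0.120),
    ("Site lighting",   900,  1100, 0.120),
]

# Rank incidents by the cost of the deviation, not its percentage size.
report = sorted(
    ((name, (actual - expected) * price)
     for name, expected, actual, price in readings),
    key=lambda row: row[1],
    reverse=True,
)
# 'Site lighting' overspends by a bigger percentage than the boiler house
# (22% vs 12.5%) yet costs least, so it drops to the bottom of the report.
```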
5. Bells and whistles: presumably in order to give salesmen something to wow prospective customers, M&T software commonly employs gratuitous animation, 3-D effects, superfluous colour and tricksy elements like speedometer dials. Ridiculously cluttered ‘dashboards’ are the order of the day.
Tip: please, please read Stephen Few’s book “Information dashboard design”
Current details of my courses and masterclasses on monitoring and targeting can be found here