
Pie charts


In his highly recommended book Information Dashboard Design, data-presentation guru Stephen Few criticises pie charts as a poor way to present numerical data, and I quite strongly agree. Although they seem to be a good way to compare relative quantities, they have real limitations, especially when there are more than about five categories to compare. A horizontal bar chart is nearly always going to be a better choice because

  1. there is always space to put a label against each item;
  2. you can accommodate more categories;
  3. relative values are easier to judge;
  4. you can rank entries for greater clarity;
  5. it will take less space while being more legible; and
  6. you don’t need to rely on colour coding (meaning colours can be used to emphasise particular items if needed).

Pie charts with numerous categories and a colour-coded key can be incredibly difficult to interpret, even for readers with perfect colour perception, and it is bad luck if you ever have to distribute black-and-white photocopies of them.


Data presentation is one of the topics I cover in my advanced M&T masterclasses. For forthcoming dates, click here.


Common weaknesses in M&T software

ONE OF MY GREAT FRUSTRATIONS when training people in the analysis and presentation of energy consumption data is that there are very few commercial software products that do the job sufficiently well to deserve recommendation. If any developers out there are interested, these are some of the things you’re typically getting wrong:

1. Passive cusum charts: energy M&T software usually includes cusum charting because it is widely recognised as a desirable feature. The majority of products, however, fail to exploit cusum’s potential as a diagnostic aid, and treat it as nothing more than a passive reporting tool. What could you do better? The key thing is to let the user interactively select segments of the cusum history for analysis. This allows them, for example, to pick periods of sustained favourable performance in order to set ‘tough but achievable’ performance targets; or to diagnose behaviour during abnormal periods. Being able to identify the timing, magnitude and nature of an adverse change in performance as part of a desktop analysis is a powerful facility that good M&T software should provide.
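For developers wondering what the underlying arithmetic looks like, here is a minimal sketch, assuming weekly actual and expected consumption figures are already available (all numbers invented for illustration):

```python
# Minimal cusum sketch: the running total of (actual - expected) deviations.
# 'actual' and 'expected' are weekly kWh figures in chronological order.
from itertools import accumulate

actual   = [102.0, 98.5, 110.2, 95.0, 120.3]   # illustrative weekly kWh
expected = [100.0, 97.0, 105.0, 96.0, 104.0]   # from the expected-consumption model

deviations = [a - e for a, e in zip(actual, expected)]
cusum = list(accumulate(deviations))

# A sustained change in performance shows up as a change in the slope of the
# cusum trace; an interactive tool would let the user select a straight
# segment and refit the expected-consumption model to just those weeks.
print(cusum)
```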

2. Dumb exception criteria: if your M&T software flags exceptions based on a global percentage threshold, it is underpowered in two respects. For one thing, the cost of a given percentage deviation depends crucially on the size of the underlying consumption and the unit price of the commodity in question. Too many users are seeing a clutter of alerts about what are actually trivial overspends.

Secondly, different percentages are appropriate in different cases. Fixed-percentage thresholds are weak because they are arbitrary: set the limit too low, and you clutter your exception reports with alerts which are in reality just normal random variations. Set the threshold too high, and solvable problems slip unchallenged under the radar. The answer is to set a separate threshold individually for each consumption stream. It sounds like a lot of work, but it isn't; it should be easy to build the required statistical analysis into the software.
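To show how little work it is, here is a sketch of one plausible approach (not any particular product's method): derive each stream's threshold from the scatter of its own past deviations, and express it in cost terms.

```python
# Per-stream exception thresholds from each stream's own historical scatter,
# instead of one global percentage. Figures and prices are illustrative.
import statistics

def exception_threshold(past_deviations_kwh, unit_price, sigmas=2.0):
    """Deviation worth flagging for one stream, expressed as cost per week."""
    sd = statistics.stdev(past_deviations_kwh)   # kWh scatter of this stream
    return sigmas * sd * unit_price

# Two streams with different variability get different thresholds:
boiler = exception_threshold([120, -80, 50, -150, 90, -30], unit_price=0.05)
lights = exception_threshold([12, -8, 5, -11, 9, -4], unit_price=0.15)
print(round(boiler, 2), round(lights, 2))
```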

3. Precedent-based targets: just comparing current consumption with past periods is a weak method. Not only is it based on the false premise that prevailing conditions will have been the same; if the user happens to suffer an incident that wastes energy, it creates a licence to do the same a year later. There are fundamentally better ways to compute comparison values, based on known relationships between consumption and relevant driving factors.

Tip: if your software does not treat degree-day figures, production statistics, etc. as equal in importance to consumption data, you have a fundamental problem.

4. Showing you everything: sometimes the reporting philosophy seems to be “we’ve collected all this data so we’d better prove it”, and the software makes no attempt to filter or prioritise the information it handles. A few simple rules are worth following (a sketch of rules 2 and 3 appears after the list).

  1. Your first line of defence can be a weekly exception report (daily if you are super-keen);
  2. The exception report should prioritise incidents by the cost of the deviations from expected consumption;
  3. It should filter out or de-emphasise those that fall within their customary bounds of variability;
  4. Only in significant and exceptional cases should it be necessary to examine detailed records.
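Here, as promised, is a sketch of rules 2 and 3. The stream names, figures and field layout are hypothetical; the point is only the cost-ranking and filtering logic.

```python
# Rules 2 and 3: rank exceptions by the cost of the deviation from expected
# consumption, and suppress those within customary bounds of variability.
streams = [
    # (name, deviation_kWh, unit_price, customary_bound_kWh) - all invented
    ("Boiler house gas",  450.0, 0.05, 200.0),
    ("Car park lighting",   6.0, 0.15,  10.0),   # within bounds: suppressed
    ("Compressor ring",   180.0, 0.15,  90.0),
]

report = sorted(
    ((name, dev * price) for name, dev, price, bound in streams
     if abs(dev) > bound),
    key=lambda item: -abs(item[1]),
)
for name, cost in report:
    print(f"{name}: {cost:+.2f} per week over expected")
```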

5. Bells and whistles: presumably in order to give salesmen something to wow prospective customers, M&T software commonly employs gratuitous animation, 3-D effects, superfluous colour and tricksy elements like speedometer dials. Ridiculously cluttered ‘dashboards’ are the order of the day.

Tip: please, please read Stephen Few’s book “Information Dashboard Design”


Current details of my courses and masterclasses on monitoring and targeting can be found here.

Energy monitoring of multi-modal objects

Background: conventional energy monitoring

In classic monitoring and targeting practice, consumption is logged at regular intervals along with relevant associated driving factors and a formula is derived which computes expected consumption from those factors. A common example would be expected fuel consumption for space heating, calculated from measured local degree-day values via a simple straight-line relationship whereby expected consumption equals a certain fixed amount per week plus so many kWh per degree-day. Using this simple mathematical model, weekly actual consumptions can then be judged against expected values to reveal divergence from efficient operation regardless of weather variations. The same principle applies in energy-intensive manufacturing, external lighting, air compressors, vehicles and any other situation where variation in consumption is driven by variation in one or more independently measurable factors. The expected-consumption models may be simple or complex.
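As a concrete illustration, the straight-line model can be fitted by ordinary least squares from nothing more than paired weekly figures. This is a minimal sketch with invented numbers:

```python
# Least-squares fit of: expected kWh = base_load + slope * degree_days
# Paired weekly figures below are invented; real practice uses more history.
degree_days = [45.0, 60.0, 30.0, 75.0, 50.0]
consumption = [1450.0, 1890.0, 1020.0, 2310.0, 1580.0]   # kWh

n = len(degree_days)
mean_x = sum(degree_days) / n
mean_y = sum(consumption) / n
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(degree_days, consumption))
         / sum((x - mean_x) ** 2 for x in degree_days))
base_load = mean_y - slope * mean_x

print(f"{base_load:.0f} kWh/week fixed + {slope:.1f} kWh per degree-day")
```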

Comparing actual and expected consumptions through time gives us valuable graphical views such as control charts and cusum charts. These of course rely on the data being sequential, i.e., in the correct chronological sequence, but they do not necessarily need the data to be consecutive. That is to say, it is permissible to have gaps, for instance to skip invalid or missing measurements.

The Brigadoon method

“Brigadoon” is a 1940s Broadway musical about a mythical Highland village that appears in the real world for only one day a year (although, as far as its inhabitants are concerned, time is continuous); its plot concerns two tourists who happen upon this remote spot on the day the village is there. The story came to mind some years ago when I was struggling with energy monitoring of student residences. Weekly fuel consumption naturally dropped during vacations (or should have done), and I realised I would need two different expected-consumption models: one for occupied weeks, and another for unoccupied weeks using degree-days computed to a lower base temperature. One way to accommodate this would have been a single, more complex model that took the term/vacation state into account. In the event I opted for splitting the data history into two: one for term weeks, and the other for vacation weeks. Each history thus had very long gaps in it, but there is no objection to closing up the gaps so that, in effect, the last week of each term is immediately followed by the first week of the next, and likewise for vacations.
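In code terms the split is trivial. A minimal sketch, with invented weekly records carrying a term/vacation label:

```python
# The 'Brigadoon' split: one weekly history becomes two, and closing up the
# gaps gives each virtual building an apparently continuous record.
history = [
    # (week_ending, kWh, degree_days, state) - invented examples
    ("2024-01-07",  900, 55, "vacation"),
    ("2024-01-14", 1850, 52, "term"),
    ("2024-01-21", 1790, 48, "term"),
    ("2024-03-31",  820, 40, "vacation"),
]

term     = [row for row in history if row[3] == "term"]
vacation = [row for row in history if row[3] == "vacation"]

# Each sub-history is then analysed as a building in its own right, with its
# own degree-day base temperature and its own expected-consumption model.
print(len(term), "term weeks;", len(vacation), "vacation weeks")
```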

This strategy made the single building into two different ones. Somewhat like Brigadoon, the ‘vacant’ manifestation of the building for instance only comes into existence outside term time, but it appears to have a continuous history. The diagram below shows the control chart using a single degree-day model on the left, as per conventional practice, while on the right we see the separate control charts for the two virtual buildings, plotted with the same limits to show the reduction in modelling error.

Not just space heating

This principle can be used in many situations. I have used it very successfully on distillation columns in a chemical works to eliminate non-steady-state operation. I recommended it for a dairy processing plant with automatic meter reading where the night shift only does cleaning while the day shift does production: the meters can be read at shift change to give separate ‘active’ and ‘cleaning’ histories for every week. A friend recently asked me to look at data collected from a number of kilns with batch firing times extending over days, processing different products; here it will be possible to split the histories by firing programme: one history for programme 20, another for 13, and so on.

Door air curtains

In situations where it is necessary to keep a building’s outer doors open, you will sometimes find “air curtains”: fans which blow a sheet of air down across the width of the doorway. These are an effective way of preventing dust and insects getting in through the door: the contaminants are entrained in the outer layer of airflow, and where the jet hits the floor it splits, with the outer layer discharging them back outside.

Convective circulation in open doorway

Some suppliers of air curtains claim that they conserve energy as well. The basis of this claim lies in what would naturally happen in an open doorway in still conditions, namely convective circulation in which warm air at high level flows out to be balanced by cold air flowing inwards at low level (right). This effect will be especially marked with high doorways. The claim for air curtains is that they disrupt the flow of escaping warm air, forcing it down to floor level where the jet splits, with the warm inner layer returning inside.

However, even in still conditions there is a problem here, because the fan draws air from high level inside, and when the jet splits at floor level only half of it returns inside: the other 50% of the internal air drawn into the fan is diverted outside (left).

A further problem, particularly with pedestrian doorways, is that the air curtain usually needs heating to prevent the perception of cold that the air’s velocity would otherwise create. If the building doesn’t actually need that heat, it is all a waste of money. Even if it does need the heat, half of what is put in goes straight outside.

In windy conditions the argument for air curtains as heat barriers really breaks down. A moving sheet of air is simply not as effective as a door. If there is any differential pressure whatever, that sheet of air will be displaced, and the problem is exacerbated if there are open doors or windows on the far side of the space – or extract fans. In one instance I visited a restaurant that operated an open-door policy. Its air curtain had a 20 kW heater that ran continuously, but the downjet did not reach the floor: about 60 cm above the floor it turned inwards, along with a layer of cold air at floor level, thanks to the kitchen extract depressurising the space.

Condensing boilers (not)

The exhaust from a natural gas appliance contains about 0.15 litres of water per kWh of gas input, and about a tenth of the thermal output is lost because that water is emitted as vapour. Condensing boilers are a good idea in theory because they can condense the vapour and recover latent heat from the products of combustion, boosting output by around a tenth.
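That “around a tenth” is easy to check from the latent heat of vaporisation, taken here as roughly 2.4 MJ/kg near flue-gas conditions (a round figure, used only for the sanity check):

```python
# Latent heat carried away as water vapour, per kWh of gas input.
water_per_kwh  = 0.15     # kg of water vapour per kWh of gas input (above)
latent_heat    = 2.4e6    # J/kg, approximate round figure
joules_per_kwh = 3.6e6

latent_fraction = water_per_kwh * latent_heat / joules_per_kwh
print(f"{latent_fraction:.2f} kWh of latent heat per kWh of gas input")
# -> about 0.10, i.e. roughly a tenth, as stated.
```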

In practice, too few condensing boilers achieve their potential, because they cannot cool the flue gas below its dew point (around 59 °C). The result: plumes of vapour outside. A typical plume resembles what you’d see boiling a 2-3 kW kettle in the open air, and that’s a measure of how much energy is being wasted.

The truth is that so-called condensing boilers need to be installed in heating systems with low return water temperatures: underfloor heating, or systems with oversized radiators, for example. Only then will the heat exchangers run at temperatures low enough for the exhaust vapour to condense.


Nice try, but…

A recent issue of the CIBSE Journal, which one would have thought ought to have high editorial standards, published an article which was basically a puff piece for a certain boiler water additive. It contained some fairly odd assertions, such as that the water in the system would heat up faster but somehow cool down more slowly. Leaving aside the fact that large systems in fact operate at steady water temperatures, this would be magic indeed. The author suggested that the additive reduced the insulating effect of steam bubbles on the heat-exchanger surface, and thus improved heat transfer. He may have been taking the word ‘boiler’ too literally: steam bubbles don’t normally occur in a low- or medium-temperature hot-water boiler, and if they did, I defy him to explain how they would interfere with heat transfer in the heat emitters.

But for me the best bit was a chart relating to an evaluation of the product in situ. A scatter diagram compared the before-and-after relationships between fuel consumption and degree days (a proxy for heating load). This is good: it is the sort of analysis one might expect to see.

The chart looked like this, and I can’t dispute that performance was better after than before. The problem is that this chart does not tell quite the story they wanted. The claim for the additive is that it improves heat transfer; the reduction in fuel consumption should therefore be proportional to load, and the ‘after’ line ought really to have a shallower gradient as well as a lower intercept. If the intercept reduces but the gradient stays the same, as happened here, it is because some fixed load (such as boiler standing losses) has disappeared. One cannot help wondering whether they had idle boilers in circuit before the system was dosed, but not afterwards.
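Readers wanting to apply the same test to their own data need only fit the two lines and compare coefficients. A sketch, with numbers invented to mimic the pattern in the article’s chart:

```python
# Fit 'before' and 'after' straight lines and compare their coefficients.
def fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope            # (intercept, gradient)

dd     = [20, 40, 60, 80]                    # degree days (invented)
before = [1400, 2200, 3000, 3800]            # kWh (invented)
after  = [1100, 1900, 2700, 3500]            # same gradient, lower intercept

(b0, b1), (a0, a1) = fit(dd, before), fit(dd, after)
# Better heat transfer should shrink the gradient (kWh per degree day);
# a lower intercept alone points to a removed fixed load instead.
print(f"before: {b0:.0f} + {b1:.0f}*dd   after: {a0:.0f} + {a1:.0f}*dd")
```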

The analysis illustrated here is among the useful techniques people learn on my energy monitoring and targeting courses.

Kinetic plates

When this “kinetic plate” was installed in 2009, the Guardian published an article which suggested that it would harvest up to “30 kWh per hour” of “green energy” from the traffic passing over it. Rubbish, of course. Firstly (as was acknowledged in a muted disclaimer at the foot of the article) it wasn’t free energy; it was energy taken from the passing vehicles (and thus paid for by their drivers). But what about the 30 kWh per hour claim? That’s the equivalent of harnessing the output from the engine of this Peugeot and running it flat out for 15 minutes in the hour.

Really? We can do some quick sums on this. Say the car, with its driver, weighs about 1,400 kg, and suppose that it depresses the plate 10 mm (0.01 m). If we take the gravitational field strength as 9.8 N/kg, the energy imparted by the car as it drives onto the plate is 1,400 x 0.01 x 9.8 = 137.2 joules (watt-seconds). That is only 0.000038 kWh. In other words, you’d need getting on for eight hundred thousand cars an hour to achieve a 30 kW output, even if the mechanism were 100% efficient, which it won’t be.

Along similar lines the IMechE published an article about a kinetic pavement in 2015. This related to a system for capturing energy from pedestrians, and rather usefully it included some statistics: that 54,267 footsteps had generated 217,028 watt-seconds. I hope all my readers can confirm for themselves that this equates to a mere 0.06 kWh.
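Readers who prefer to let the computer do the sums can verify both figures in a few lines:

```python
# Checking the kinetic plate and kinetic pavement claims.
g = 9.8                                    # N/kg, gravitational field strength
car_energy_j = 1400 * g * 0.01             # car of 1,400 kg onto a 10 mm plate
cars_per_hour = 30 * 3.6e6 / car_energy_j  # cars needed for 30 kWh per hour
print(f"{car_energy_j:.0f} J per car; {cars_per_hour:,.0f} cars per hour")

footstep_kwh = 217_028 / 3.6e6             # IMechE pavement figure
print(f"{footstep_kwh:.2f} kWh from 54,267 footsteps")
# -> about 137 J per car, roughly 790,000 cars per hour, and 0.06 kWh.
```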

Daylight-linked consumption

When monitoring consumption in outside lighting circuits with photocell control, it is reasonable to expect weekly consumption to vary according to how many hours of darkness there were. And that’s exactly what we can see here in this Spanish car park:

It is a textbook example: with the exception of two weeks, it shows the tightest correlation that I have ever seen in any energy-consuming system.

The negative intercept is interesting, and a glance at the daily demand profile (viewed as a heatmap) shows how it comes about:

Moving left to right, from January to March, we see the duration of daylight (zero consumption, shown in blue) increase. High consumption starts at dusk and finishes at dawn, but from about 10 p.m. to 5 a.m. it drops back to a low level. It is this “missing” consumption for about seven hours in the night which creates the negative intercept. If they kept all the lights on from dusk to dawn, the line would go through the origin.
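The arithmetic behind the negative intercept is worth spelling out. If lighting of total power P runs from dusk to dawn except for h hours each night, weekly consumption is roughly P x (D - 7h), where D is the weekly hours of darkness, so the intercept is -7hP. A sketch with invented values:

```python
# Part-night switch-off and the negative intercept. P and h are invented;
# the real circuit's low level is approximated here as fully off.
P = 20.0    # kW of controlled lighting (assumed)
h = 7.0     # hours per night at the low level (from the heatmap)

for D in (80.0, 100.0, 120.0):           # weekly hours of darkness
    kwh = P * (D - 7 * h)
    print(f"D={D:.0f} h: {kwh:.0f} kWh")
# Gradient is P; intercept is -7*h*P, i.e. negative. With h = 0 the line
# passes through the origin, as the post says.
```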

For weekly and monthly tabulations of hours of darkness (suitable for England and other countries at similar latitudes) click here.


Unethical sales of a so-called energy-saving product

The BBC’s Watchdog programme (series 38, episode 1) did an excellent job of exposing the activities of a company called Energysave, which was caught training its salesmen to use high-pressure sales techniques on vulnerable customers. They were claiming, outrageously, that their product, a water-repellent coating for masonry, cut heat loss by a third. (Although their own website says conductivity “decreases enormously with dampness”. Whoops.)

Warning sign: an illiterate promotional video

You can catch up with the episode on BBC iPlayer. The relevant material is in two parts at 16’04” and 32’53” with the final confrontation scene at 51’52”, but the episode also includes stuff on defective smart meter installations.

Thanks to newsletter reader Istvan Sereg for the tip-off.