Participating in a remote meeting for one hour generates the same emissions as driving just 580 metres. That was my conclusion when someone asked me about the relative environmental impacts of remote and in-person meetings. Here’s how I approached the question…
We’ll start by estimating the energy intensity of data communications. We know from an Ofcom study that in 2018 the average UK fixed broadband connection was using 240 GB per month, and if we assume £30 per month was the typical tariff, that works out at £0.125 per GB. Now let’s assume that this price covers the operator’s costs and that, pessimistically, 50% of that cost is for electricity which they were buying at (say) £0.15 per kWh. This implies an energy intensity of £0.125 x 50% / £0.15 = 0.42 kWh per GB.
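The intensity estimate can be restated as a quick sketch. All the figures are the article’s assumptions (tariff, electricity share and price), not measured values:

```python
# Energy intensity of data communications, per the assumptions above.
monthly_tariff_gbp = 30.0   # assumed typical fixed-broadband tariff
monthly_usage_gb = 240.0    # Ofcom 2018 average monthly usage
electricity_share = 0.50    # pessimistic share of cost going on electricity
electricity_price_gbp_per_kwh = 0.15

cost_per_gb = monthly_tariff_gbp / monthly_usage_gb  # £0.125 per GB
energy_per_gb = cost_per_gb * electricity_share / electricity_price_gbp_per_kwh

print(f"{energy_per_gb:.2f} kWh per GB")  # about 0.42
```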
But how much data is there in a remote meeting? Fortunately we can get a good direct estimate from the sizes of session recordings. My two-hour on-line events have typically resulted in recordings of around 500 MB, which must be the equivalent of all the data broadcast to each participant (as a sense check, that’s 250 megabytes per hour, or about 0.55 megabits per second bandwidth). To be conservative let’s add as much again for return traffic from each participant, giving a total of 500 MB (0.5 GB) per hour per participant.
At 0.42 kWh per GB that implies 0.5 x 0.42 = 0.21 kWh per participant-hour.
This only accounts for the communications element. To be fair we need to add the cost of central data processing, and to do that I’m first going to guess that the server consumes 100 watts for the purposes of processing the meeting. Secondly I’ll assume that the meeting has four participants. That would imply 0.025 kWh per participant-hour, bringing the total to 0.235 kWh. The fact that it’s a small correction means the conclusions aren’t very sensitive to the number of participants. If we assume a grid carbon intensity of 0.3 kgCO2/kWh we arrive at emissions of 0.235 x 0.3 = 0.07 kgCO2 per participant-hour.
How does that final figure compare with car travel to the meeting? The average car in the UK emits about 0.12 kgCO2 per km, so attending an hour-long remote meeting equates, in emissions terms, to 0.07/0.12 = 0.58 km of car travel. Case closed.
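The whole chain of arithmetic, from data volume to car-kilometres, can be laid out end to end. Every input is one of the article’s stated assumptions:

```python
# End-to-end restatement of the remote-meeting emissions estimate.
ENERGY_PER_GB_KWH = 0.42   # communications energy intensity, derived earlier
data_per_hour_gb = 0.5     # per participant, including return traffic
server_watts = 100.0       # guessed server load for the meeting
participants = 4
grid_intensity = 0.3       # kgCO2 per kWh
car_kgco2_per_km = 0.12    # average UK car

comms_kwh = data_per_hour_gb * ENERGY_PER_GB_KWH     # 0.21 kWh
server_kwh = (server_watts / 1000.0) / participants  # 0.025 kWh
total_kwh = comms_kwh + server_kwh                   # 0.235 kWh
emissions_kg = total_kwh * grid_intensity            # ~0.07 kgCO2
km_equivalent = emissions_kg / car_kgco2_per_km      # ~0.58 km
```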
This article first appeared in the Energy Management Register bulletin on 12 July, 2021. Subscriptions are free of charge: please follow this link. You can unsubscribe again from any issue.
Submitted by the Association of Midlands Energy Professionals
The Association of Midlands Energy Professionals (MEP) invites you to join us for our FOURTH annual event for energy assessors and other energy professionals, under the title: “Gearing up for Change”.
With BREXIT behind us and the TRUSTMARK now up and running, MEES is having a significant impact on the type of work we do. WHOLE HOUSE RETROFIT, PAS2038, and THE FUTURE HOMES STANDARD are becoming even bigger drivers as the gathering momentum surrounding climate change is set to make 2021 a year of real involvement and opportunity for us in changing the behaviours of our customers.
We will be on the front line, giving advice and promoting change to UK consumers.
We will hear from a selection of keynote speakers who will be bringing us up to date on all the above and more. The speakers will include the leading lights from the Retrofit Academy, TrustMark, and the Accreditation Bodies.
There will be a selection of workshops to participate in. These workshops will give you tasters of state-of-the-art tools and techniques.
There will be a “BBC Question Time” style session where you can put questions to a panel of experts including the Accreditation Bodies.
This will be a full and informative day which will provide 5 hours CPD plus valuable networking with fellow assessors and other professionals.
Date: Wednesday 29 September 2021 (10.00 to 16.30)
The charge for the event is £50 for MEP members, and £60 for non-members. Subscribers to the Energy Management Register newsletter can join at MEP members’ rates using their customary discount code. There will be a £10 early bird discount for those booking before the end of August 2021. Booking forms are available from
This story concerns a commercial data centre, and specifically its cooling system. The players are: (a) clients whose servers are housed in the centre; (b) a facilities operations team responsible for maintaining conditions in the server hall; and (c) a sustainability manager whose duty is to ensure that energy consumption is minimised. There is a service level agreement in place and the facilities team are contractually obliged to report regularly on the server-room temperature.
The sustainability manager regularly reviews consumption against weather-related targets, in order to detect excessive consumption. Specifically he uses the relationship between chiller electricity consumption and cooling degree days, as illustrated in Figure 1:
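The weather-related target behind Figure 1 is a simple least-squares regression of weekly chiller consumption against cooling degree days. A minimal sketch, with illustrative data rather than the actual site’s figures:

```python
# Weather-related target: chiller kWh regressed on cooling degree days (CDD).
# The data points below are illustrative only.
import numpy as np

cdd = np.array([5.0, 12.0, 20.0, 28.0, 35.0])        # weekly cooling degree days
kwh = np.array([1600., 2450., 3400., 4350., 5200.])  # weekly chiller electricity

slope, intercept = np.polyfit(cdd, kwh, 1)  # least-squares straight-line fit

def expected_kwh(weekly_cdd):
    """Expected (target) consumption for a week with the given degree-day value."""
    return intercept + slope * weekly_cdd
```

Comparing each week’s metered consumption with `expected_kwh` of that week’s degree days is what reveals the kind of deviation shown in Figure 2.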
At the end of September 2020, weekly consumption began to deviate from expected values. The first few weeks of abnormal performance are highlighted in Figure 2:
Figure 3 is a control chart which shows that the deviation is not only statistically significant compared with anything previously observed, but it’s also persistent:
At this point the sustainability manager challenged the operations team for an explanation. The problem turned out to be the location of the temperature sensor that was used for their routine service-level reports. It was not registering the actual air temperature at equipment level, but a higher value. To get around this problem the ops team had started overcooling the building to ensure that their temperature reports were within the specification.
The problem was ultimately rectified by relocating the sensor used for reporting, and reverting to the correct space temperature set point. Figure 4 shows how consumption then came back within its normal control limits:
Thanks to Dave Covell of Clearlead Consulting for this picture of a magnet on a gas line which had grabbed hold of his steel-capped safety shoe as he was passing. We know these magnets are useless as energy savers, but it’s a bit rich that they make safety shoes hazardous…
We will have two real experts on heat pumps addressing our afternoon conference “Decarbonising heat – practical realities” on 8 July, which will focus on the non-domestic market and the lessons that can be learned from real-life installations.
Ben Whittle is a technical manager from the Energy Saving Trust. He has been working in the world of renewables for 20 years, and has previously worked for companies designing and installing solar thermal, solar PV, biomass and heat pump systems to megawatt scale.
John Cantor started out manufacturing and installing bespoke heat-pump systems during the 1980s and 90s. He was a system inspector for the first UK grants through BRE and was also on the MCS working group. He’s an honorary member of the Ground Source Heat Pump Association and author of ‘Heat Pumps for the Home’.
As well as heat pumps we’ll examine the realities of biomass installations and consider the prospects for hydrogen. More details at https://vesma.com/z200
Delighted to have Jan Rosenow opening our afternoon conference “Decarbonising heating – practical realities” on 8 July. Focussing on the non-domestic market, our speakers will discuss real-life experience with biomass boilers and heat-pump systems, and draw lessons for future projects. We’ll also hear from an expert on hydrogen about how that might be introduced into the national heating-fuel mix, and we have allowed plenty of time for questions from the audience.
WHEN thinking about possible energy-saving projects, you might ask yourself how radical you want them to be. There’s a spectrum from the relatively easy to the costly and disruptive. Readers may have their own views on this but I think the spectrum runs like this:
Make sure control setpoints and timings are correct. This is generally the cheapest and least disruptive measure one can take;
Enhance the control strategies. For example introduce floating setpoints on chilled water circuits, optimum start in place of fixed timers, or variable-speed control of motor-driven equipment;
Implement loss-reducing modifications. Examples here include zone isolation valves in a compressed-air network or thermal insulation on hot pipework;
Improve component efficiency. The classic case here is lighting technology (which may be cost-effective in its own right) but think also about things such as the introduction of higher-efficiency or better-sized motors (which may only be economical when replacement is necessary for other reasons);
Improve process layouts and integration. Here I am thinking primarily about opportunities for waste heat recovery, but there are other special cases where part-processed materials may gain or lose moisture or heat while in transit between stages to the detriment of overall energy efficiency. And finally the nuclear option:
Retire buildings or process plant in favour of more energy efficient replacements
Effective energy waste avoidance relies crucially on the comparison of actual and ‘expected’ consumptions. Classically we do this on a weekly or monthly basis, using models for expected consumption that are linked to independent driving factors. But there are other ways to skin that cat.
Buildings will in many cases have a characteristic diurnal pattern of demand that can be expressed as a profile at, say, half-hourly intervals. With a large enough group of similar buildings, and taking account of drivers like the weather, it seems possible in theory to create a dynamic template for each building against which its demand can be assessed in near-real-time. The template is just a different way of calculating and expressing expected consumption, but it creates the realistic prospect of daily exception reports. Of course the implied excess costs need to be taken into account: you need to be able to suppress the clutter of insignificant deviations, prioritise cases for investigation, and estimate the value of resolving them, just as you would if you were using a weekly or monthly overspend league table.
The role of artificial intelligence here is to learn what ‘correct’ behaviour looks like and one advantage of this in large estates is that it obviates the need for human analysts to calibrate degree-day regression models for every meter. Another benefit would be the recognition of common abnormalities in profiles. Properly trained with correct human feedback, an AI-based pattern recognition system could in principle recognise symptoms that have occurred before elsewhere and associate them with remedies that have previously been successfully applied.
A further benefit is advanced benchmarking. In classical M&T we know that buildings can be benchmarked by comparing the slopes and intercepts respectively of their degree-day regression lines. A pattern-analysis system can take this more incisive analysis to a whole new level.
I will be interviewing James Ferguson, a keen proponent of AI in energy waste detection, on 15 July 2021 in my “Energy Conversations” series of open video calls. If this is a subject which interests you, you can request a place in the audience here.
Once you have discovered how to routinely calculate expected consumptions for comparison with actual recorded values, you can get some very useful insights into the energy behaviour of the processes, buildings and vehicles under your supervision. One thing you can do is chart the history of how actual and expected consumption compare. In this example we are looking at the daily electricity consumption of a large air-compressor installation:
The green trace represents expected kWh (computed by a formula based on the daily air output) and the individual points represent the actual metered kWh. Most of the time the two agree, but there were times in this case when they diverged.
It is illuminating to concentrate on the extent to which actual consumption has deviated from expected values, so in the following chart we focus on the difference between them:
There will always be some discrepancy between actual and expected consumptions. Part of the difference is purely random, and the limits of this typical background variation are signified by the red dotted lines. If the difference goes outside these bounds, it is probably because of an underlying shift in how the object is performing. In the above diagram there were three episodes (one moderate, two more severe) of abnormal performance. Significant positive deviations (above the upper control limit) are more usual than negative ones because consuming more energy than required for a given output is much more likely than using less.
For training in energy consumption analysis look for ‘monitoring and targeting’ at VESMA.COM
In a well-constructed energy monitoring and targeting scheme, every stream of consumption that has a formula for expected consumption will also have its own control limit. The limits will be narrow where data are reliable, the formula is appropriate, and the monitored object operates in a predictable way. The limits will be wider where it is harder to model expected consumption accurately, and where there is uncertainty in the measurements of consumption or driving factors. However, it is not burdensome to derive specific control limits for every individual consumption stream because there are reliable statistical methods which can largely automate the process.
Control charts are useful as part of an energy awareness-raising programme. It is easy for people to understand that the trace should normally fall between the control limits, and that will be true regardless of the complexity of the underlying calculations. If people see it deviate above the upper limit, they know some energy waste or losses have occurred; the person responsible will know too, and will be aware that everyone else can see it. This creates some incentive to resolve the issue, and once it has been sorted out everyone will see the trace come back between the limits.
Widespread adoption of automatic meter reading has given many energy users a huge volume of fine-grained data about energy consumption. How best to use it? A ‘heat-map’ chart is a powerful visualisation technique that can easily show ten weeks’ half-hourly data in a single screen. This for example is the pattern of a building’s gas consumption between November and January:
Each vertical slice of the chart is one day, running midnight to midnight top to bottom, with each half-hourly cell colour-coded according to demand. This creates a contour-map effect and, when you look at this specific example, you can see numerous features:
Fixed ‘off’ time;
Optimised startup time (starts later when the building has not cooled down as much overnight);
Peak output during startup;
Off at weekends but with some heating early on Saturday mornings;
Shut-down over Christmas and New Year; and
A brief burst of consumption during the Christmas break, presumably frost protection.
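Assembling such a chart is mostly a matter of reshaping the meter data. A minimal sketch, using synthetic readings in place of real half-hourly data:

```python
# Half-hourly readings reshaped into a time-of-day x day grid for a heat map.
import numpy as np

days, slots = 70, 48  # ten weeks of half-hourly data

# Stand-in for a real half-hourly meter series, oldest reading first.
readings = np.random.default_rng(0).random(days * slots)

# grid[t, d] is the demand in half-hour t (midnight = 0) on day d,
# so each column is one day, running midnight to midnight top to bottom.
grid = readings.reshape(days, slots).T
```

Colour-coding `grid` by value (for example with matplotlib’s `pcolormesh`) produces the contour-map effect shown in the examples.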
This building’s gas consumption pattern is quite similar to the previous one’s (they both belong to the same organisation), but the early-morning startup boost is much more evident and occurs even during the Christmas and New Year break:
Next we have a fairly typical profile for electricity consumption in an office building. What is slightly questionable is the higher daytime consumption near the start (April) compared with the end (June). This suggests the use of portable heaters. Note also that the peak half-hourly demands can easily be seen (Friday of the second week and Wednesday of the fifth week). In both cases it is evident that those peaks occurred not because of any specific incident but because consumption had generally been higher than usual all day:
In this final example we are looking at short-term heatmap views of electricity feeding a set of independent batch processes in a pharmaceutical plant. The left-hand diagram is the actual measured consumption while the right-hand diagram is the expected profile based on a mathematical model of the plant into which we had put information about machine scheduling: