Energy Management Information Systems: The Missing Link for Facilities Efficiency (Part 1)

Les Lambert, P.E. | Feb 25, 2003

Introduction

With rising energy costs, you'd like to bring your facility to peak energy efficiency, and keep it that way - economically. But most facilities are not "commissioned" for best efficiency before going into service. Once in service, efficiency usually goes downhill. And you won't learn whether day-to-day operation and maintenance is at peak efficiency from utility bills. What's a manager to do?

You could hire consultants or an energy service company, buy a building automation system, or replace inefficient equipment -- given lots of budget, and competent, trustworthy outside help.

Better yet, you could do much of it yourself. Experience shows that savings of 10% or more often result from simply operating what you have more efficiently. Operating and Maintenance (O&M) changes usually have the shortest paybacks. But you must know how your facility uses energy, and how to readily exploit this information. This takes an Energy Management Information System.

The Energy Manager's Challenge

Management theory says information - to see what's happening, and the results of your actions - is essential. But getting that information has been easier said than done. Here are some challenges a successful energy management information system must address:

Monthly utility bills make you wait until next month's bills to see if this month's actions did any good. When the bills arrive, you must guess whether differences are due to weather, different schedules, holidays, or your efforts. Effects of conservation efforts are often masked by weather and by daily or weekly occupancy-driven fluctuations in energy use. Staff may not be following management direction. People make mistakes, and equipment mal-adjustments and failures can go unrecognized. Results of HVAC tuning efforts, and of changes or retrofits that affect multiple systems, are particularly hard to gauge.

Worse yet, most energy inefficiencies are invisible to management. Is your equipment working as intended? Are lights turned off after the cleaning crew goes home? Is the main heating, ventilating and air conditioning (HVAC) equipment properly tuned up for efficiency? Was the last HVAC repair properly done, or just a cosmetic band-aid? Do fan systems run full-time at night (drawing in - and conditioning - lots of fresh air) when nobody's there? These and similar operational problems require periodic management attention. If no one checks, they get out of control.

Building automation systems - or "energy management and control systems" (EMCS) - only solve part of the problem. If properly programmed, they allow you to schedule and regulate equipment more efficiently. But they don't usefully summarize energy data. Some provide floods of data - so much, it's time-consuming and difficult to review. But neither an EMCS nor utility bills tell you how to adjust for weather and occupancy changes to see long-term trends, or trend changes.

Energy managers often have tiny budgets. Other facilities management duties may leave scant time for conservation. The typical facilities manager cannot afford the time to become a specialist in energy conservation. Worst of all, managers typically must try to save energy with woefully inadequate energy management information. No wonder energy gets wasted.

Wanted: MIS

Fifty years ago, some expected energy to become "too cheap to meter". Instead, energy's now too expensive not to meter - in detail. Most CEOs would be horrified at having no cost accounting other than monthly total expenditures. But that's precisely how most of us "manage" energy.

The United States spends over 2% of GDP annually on financial management information systems (MIS). Legions of accountants, auditors, bookkeepers, and data entry personnel, with their computers, programmers, and software, labor mightily to give managers detailed cost breakdowns and analyses. This enables managers to know - not guess - where their money goes - and more efficiently manage it. Why do any less, with costly (and mostly polluting, non-renewable) energy? We should properly equip ourselves, and spend at least as much - over 2% of annual energy costs - for on-going energy management.

A building's energy MIS should provide these basics:

  • A permanent set of economical basic measurements.
  • A flexible capability to measure items of temporary special interest.
  • Low effort data processing and simple interpretation, to identify trends and track problems.

The first two of these basics are readily achievable instrumentation tasks. But with no simple, practical way to use the resulting data, the instrumentation does little good.

The MIS data processing and interpretation should be as nearly intuitive as possible. Benchmarking should support assessing the need for action and progress toward efficiency goals. The MIS should support typical conservation activities such as commissioning, scheduling, tuning, and retrofitting. Go/no-go criteria should quickly signal if action is needed. When things are operating efficiently, data review time needs should be minimal. When action's needed, the MIS should provide information for trouble-shooting and follow-up. The MIS should empower on-site or remote managers to rely more on their own judgement, and less on outside consultants, to save energy.

If the MIS performs these functions successfully, it should pay for itself through enhanced O&M energy savings. Such savings might typically amount to 10%. In facilities with annual bills over $25,000 to $50,000, realizing such savings should pay for the cost of essential measurements in a few years.

Such a system exists.

The Energy Signature Concept

It's long been known that utility billing data correlates with monthly weather. But the correlation is sometimes poor, due to "scatter" in the data. Poor correlation limits data usefulness. So we need a conceptual model that accounts for scatter, or better yet, gets rid of scatter. Here's the basic conceptual model for energy signatures:

With constant building physical condition, occupancy, operation and maintenance, variations in energy use rate depend solely on weather.

But building occupancy usually isn't constant, and many people can't resist fiddling with thermostat settings. To account for changes in thermostat settings, we can use inside-to-outside temperature difference instead of outside temperature. For occupancy, most changes occur in daily and weekly cycles. If we are always careful to measure over an even number of cycles, and only attempt comparing like kinds of cycles, the basic idea still works.

Figure 1 shows hourly facility energy use data for a week. It's obvious that energy use rates won't be representative (or equal) on Monday morning or on Monday night or Friday afternoon or Sunday. Each of these times represents a different part of a single occupancy cycle. For spotting long-term trends, looking at only part of a cycle produces "sampling error". In a building, getting a representative value, unaffected by sampling error, requires aggregated measurements over one or more complete cycles - not partial ones. Using weekly time aggregation provides "optimal smoothing", to eliminate most of the data scatter.
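
As a concrete illustration of this aggregation step, the sketch below (Python with pandas) computes cycle-average energy use rate and mean coincident delta-T over complete weekly cycles only, discarding partial cycles. The DataFrame and column names (df, kwh, temp_in, temp_out) are placeholders rather than part of any particular metering system, and the Wednesday week boundary is only an approximation of the Wednesday-noon data week suggested later in this article.

```python
# Minimal sketch: cycle-average values over complete 168-hour weeks only.
# Assumes a DataFrame `df` with a DatetimeIndex of hourly readings and
# columns 'kwh' (hourly energy use), 'temp_in' and 'temp_out' (placeholders).
import pandas as pd

def weekly_cycle_averages(df):
    grouper = pd.Grouper(freq="W-WED")          # approximate Wednesday data-week boundary
    delta_t = df["temp_in"] - df["temp_out"]
    weekly = pd.DataFrame({
        "kw_avg": df.groupby(grouper)["kwh"].mean(),   # mean hourly kWh = average kW
        "delta_t": delta_t.groupby(grouper).mean(),    # mean coincident delta-T
        "hours": df.groupby(grouper)["kwh"].count(),
    })
    # Keep only complete cycles; partial cycles produce the sampling error described above
    return weekly[weekly["hours"] == 168]
```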

Figure 2 - Example Energy Signature

Figure 2 shows a month of hourly data for the same building as shown in figure 1. By using weekly data aggregation, the long-term pattern, showing that energy use rate depends on weather and temperature settings, emerges from the jumble of hourly energy use data. The plot is nearly linear as a function of delta-T, in spite of a change in the night and week-end setback temperatures and schedule for the last week of the data period shown. (The two excursions "above the line" result from atypical occupancy patterns, discussed later.) Contrasted to 100% or greater standard deviation in hourly energy data in Figure 1, the regularity of this curve shows the energy signature's optimal smoothing benefits.

Now the conceptual model can be restated and expanded:

  1. With constant building physical condition, and consistent occupancy, operation and maintenance ("consistent conditions"), variations in total energy use rate from one occupancy cycle to the next depend entirely on weather and inside temperature.
  2. For weather variations, outside temperature is the dominant driver of energy use rate; variations in humidity, wind, and insolation tend to be randomly distributed or second order effects.
  3. Under consistent conditions, a plot of cycle-average energy use rate versus mean coincident delta-T, over a sufficient number of cycles, will form a characteristic curve. With consistent conditions, all future cycles will fall on this same curve.
  4. Significant scatter in such a curve will be due to inconsistent operation, or to changes in building physical condition, maintenance, or occupancy.

Notice the substitution of "consistent" for "constant", as regards occupancy, operation, and maintenance. "Occupancy cycle" means the regular occupancy cycle. It might be one day for a nursing home, prison or hospital, but more typically is one week. Regular occupancy cycle does not include irregular situations such as weeks with holidays, plant shut-downs for maintenance, etc. Things that change total or heating or cooling loads or HVAC efficiency compared to regular occupancy should be considered as possibly irregular. The examples that follow will help you understand what's "regular" and what's not.

Thermal inertia, and letting changes in wind, humidity, and sun average out, are good reasons to use weekly (not daily) cycles. Defining "significant scatter" needs examples or statistics. For a statistical explanation, read the following paragraph. Otherwise, the examples that follow show what "significant scatter" looks like.

Statisticians often describe a data set with an equation that produces a curve or line, chosen to pass through or near as many points as possible on a graph of the data. "Scatter" refers to deviation of data points from the curve or line fitted to them. The coefficient of determination (r2) describes how well the curve fits the data. An r2 of 1.0 indicates the curve fits the data perfectly, with zero scatter. Points away from the curve constitute scatter; points far from the curve are "outliers". Values below 1.0 indicate the curve doesn't fully describe the relationship between the data points. For example, r2 = 0.96 indicates that the curve fit accounts for 96% of the variation in data values, with 4% "scatter", or unexplained variation. For an energy signature like Figure 2, r2 = 0.96 indicates that variation of energy use rate with delta-T explains 96% of the variation in energy use rate, leaving 4% scatter unexplained by delta-T. For heating conditions, a working definition of "significant scatter" is an r2 value less than 0.96.
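
For readers who want to compute this themselves, here is a minimal sketch of a least-squares line fit and r2 calculation for weekly signature points, using numpy. It assumes the hypothetical `weekly` table from the aggregation sketch above; the 0.96 threshold is simply the working heating-season definition of significant scatter given in the preceding paragraph.

```python
# Minimal sketch: fit a line to weekly (delta-T, average kW) points and report r2.
import numpy as np

def signature_fit(delta_t, kw):
    delta_t = np.asarray(delta_t, dtype=float)
    kw = np.asarray(kw, dtype=float)
    slope, intercept = np.polyfit(delta_t, kw, 1)   # least-squares line
    predicted = slope * delta_t + intercept
    ss_res = np.sum((kw - predicted) ** 2)          # unexplained variation (scatter)
    ss_tot = np.sum((kw - kw.mean()) ** 2)          # total variation
    return slope, intercept, 1.0 - ss_res / ss_tot

slope, intercept, r2 = signature_fit(weekly["delta_t"], weekly["kw_avg"])
if r2 < 0.96:   # working threshold for "significant scatter" under heating conditions
    print(f"r2 = {r2:.3f}: significant scatter - check for operational changes")
```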

Measurement Requirements

"Basic measurements" needed for an energy signature are modest. Figure 2 is an energy signature for an all-electric building. Three data points - hourly measurements of total energy use, inside and outside temperature - were needed (with gas or steam, a fourth data channel is needed.) Beyond these minimums, more measurements may be added to improve problem-solving capabilities, or for larger buildings.

Energy Signature Characteristics

Figure 2 uses hourly data in a weekly "rolling average" form, to get a continuous curve. Each hourly data point uses energy data for the most recent 168 hours (one week), matched with temperatures for the same 168 hours. Updating each hour gives a continuous curve of hourly values. The endpoint of each week's data is marked with a symbol. It's useful to mark each week's data with a different symbol, so that points of interest can be related back to a particular time. When plotting longer periods of time, plotting just the weekly values and their symbols produces less clutter. Showing energy use rate on a per-square-foot basis gives size-independent values that enable comparisons between buildings. Energy data can also be plotted against outside temperature, but ignoring inside temperature increases scatter.
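
A rolling version of the same calculation reproduces the continuous curve described above. The sketch below (again pandas, with matplotlib for the plot) assumes the hourly DataFrame `df` from the earlier sketch; the floor area is a placeholder, and marking every 168th point is only an approximation of the per-week symbols in Figure 2.

```python
# Minimal sketch: rolling weekly energy signature, in W/sq ft vs. delta-T.
import pandas as pd
import matplotlib.pyplot as plt

FLOOR_AREA_SQFT = 200_000                     # building-specific placeholder

rolling_kw = df["kwh"].rolling(168).mean()    # average kW over the most recent week
rolling_dt = (df["temp_in"] - df["temp_out"]).rolling(168).mean()

signature = pd.DataFrame({
    "delta_t": rolling_dt,
    "w_per_sqft": rolling_kw * 1000.0 / FLOOR_AREA_SQFT,   # size-independent intensity
}).dropna()

plt.plot(signature["delta_t"], signature["w_per_sqft"], linewidth=0.5)
week_marks = signature.iloc[::168]            # roughly one marker per data week
plt.scatter(week_marks["delta_t"], week_marks["w_per_sqft"])
plt.xlabel("Mean coincident delta-T (°F)")
plt.ylabel("Energy use rate (W/sq ft)")
plt.show()
```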

Example Energy Signatures

Example headings indicate the main ideas shown. Additional application and interpretation tips are also presented.

Detecting Variations in Operation. Figure 3 shows 1990-91 heating season data for a 200,000 ft2 all-electric middle school (Plymouth-Carver Intermediate School, PCIS) in Massachusetts (Lambert 1991). A first-cut linear fit of all points gives r2 = 0.87. But the data suggest two lines (rather than a single line or curve), indicating two distinct types of consistent operation. We found that on holiday weekends (points V, T, C, N, and K - Veterans Day, Thanksgiving, Christmas, New Year's, and Martin Luther King Day) rooftop HVAC was shut down entirely for three days, not two.

This means there are really two distinct occupancy cycle types. Re-evaluating the data as two sets, the fits improved to r2 = 0.95 (two-day weekends) and 0.925 (three-day weekends). But this left a puzzle. Delta-T explicitly accounts for temperature setback. "Consistent conditions" implies a consistent average building loss coefficient. Just reducing temperatures would shift operation to the left, on the same curve. Two distinct curves meant something else had changed - the average building loss coefficient. Shutting fans off reduced ventilation loads, reducing the weekly average building loss coefficient for three-day weekends.
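
In code, treating the two occupancy cycle types separately is just a matter of splitting the weekly table before fitting. The sketch below assumes a hypothetical boolean column `three_day_weekend` flagging the holiday weeks; it is an illustration of the re-evaluation described above, not the tooling used in the original study.

```python
# Minimal sketch: fit two-day and three-day weekend cycles as distinct sets.
import numpy as np

def fit_line(points):
    slope, intercept = np.polyfit(points["delta_t"], points["kw_avg"], 1)
    predicted = slope * points["delta_t"] + intercept
    ss_res = ((points["kw_avg"] - predicted) ** 2).sum()
    ss_tot = ((points["kw_avg"] - points["kw_avg"].mean()) ** 2).sum()
    return slope, intercept, 1.0 - ss_res / ss_tot

regular = weekly[~weekly["three_day_weekend"]]
holiday = weekly[weekly["three_day_weekend"]]
print("two-day weekends:  ", fit_line(regular))
print("three-day weekends:", fit_line(holiday))   # fans off longer -> lower loss coefficient
```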

This data, confirmed by on-site investigation, helped us to discover that rooftop HVAC mixed air temperatures had been revised from 65°F to 60°F, coincident with a lighting retrofit. Greatly increased ventilation made the lighting retrofit appear to fail.

Also, the energy signature slopes allow quantifying setback savings. They showed the school could save $1,500 per month for each degree that wintertime average inside temperatures were reduced.
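
The arithmetic behind such an estimate is simple: the heating-season slope gives kW of average demand per °F of delta-T, so each degree of average setback saves roughly slope x hours x tariff. The figures below are hypothetical placeholders, not the measured values for this school.

```python
# Illustrative setback-savings estimate from a fitted signature slope.
slope_kw_per_degF = 25.0      # kW of average demand per °F of delta-T (hypothetical)
electricity_rate = 0.08       # $/kWh (hypothetical tariff)
hours_per_month = 730

savings_per_degree = slope_kw_per_degF * hours_per_month * electricity_rate
print(f"~${savings_per_degree:,.0f} per month per °F of average setback")
```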

Holidays mostly fall near week-ends. This, and thermal inertia effects, make Wednesday noon a good time to start the "data week". Thus, one week (not two) is affected by most holidays.

Figure 3 used the building main meter and the average of five inside temperatures, to sample 36 rooftop HVAC zones.

Comparing Different Buildings. Energy signatures can be compared on a watts-per-square-foot basis. PCIS (the school in Fig. 3) is compared to four nearby 55,000 sq ft all-electric elementary schools in Figure 4. Linear fits of data (holiday weeks excluded) are shown for PCIS plus Federal, Indian Brook, South and West (P, F, I, S, W).

The elementary schools were built from the same plans. Of the same vintage as PCIS, they had near-identical designs in terms of insulation, fenestration, structural panels, mechanical equipment, and so on. If size were the only difference, one would expect near-identical area-normalized energy signatures for all five. Instead, there are significant differences.

We suspected O&M differences; two different ventilation control types and control settings were discovered. Federal and Indian Brook used a fixed percentage of outside air for ventilation. PCIS, South, and West all used mixed air temperature control, albeit with differing mixed air temperature settings.

Checking Staff Efficiency, Detecting Interactive Effects and Breakdowns. Figures 5 and 6 show weekly data for Federal and West Elementary.

As with PCIS, regular and holiday weeks differed. Figure 5 shows four holiday weeks "below trend", consistent with longer HVAC shut-downs for holiday weeks (except King day) at Federal.

Figure 6 shows some holiday weeks above trend at West; there were no holiday shutdowns. West's custodian was suspected of neglecting to program holiday shutdowns. But data also showed an equipment problem at West, discussed below.

Thanksgiving and Christmas weeks (T & C) are visibly above trend in Figure 6. This indicates interaction between lights and HVAC. Why?

Both lights and HVAC furnish part of the energy required during heating conditions. The heat from "internal gains" such as lights is delivered at 100% efficiency, since lights are entirely within the heated envelope. But energy input to the HVAC system suffers over 25% losses, from ducts (conduction & leakage), and from tempering mixed air; efficiency is less than 75%.

Internal gains from lights and equipment were lower during the holidays, requiring more heat from the rooftop HVAC. Resistance space heat required higher total energy input to maintain the same delta-T, when the schools were vacant.

Interaction between lights (100% efficiency) and HVAC (~75% efficiency) thus explains why points T and C were above trend at West.
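
A quick back-of-the-envelope calculation makes the interaction concrete. The numbers below are illustrative only; the point is that shifting heat delivery from 100%-efficient internal gains to roughly 75%-efficient HVAC raises total purchased energy at the same delta-T.

```python
# Illustrative numbers only - not measured values for these schools.
heat_needed_kwh = 1000.0      # weekly heat the envelope requires at a given delta-T
internal_gains_kwh = 300.0    # heat normally supplied by lights and equipment

# Regular week: lights deliver 300 kWh at 100% efficiency; HVAC supplies the rest
regular_input = internal_gains_kwh + (heat_needed_kwh - internal_gains_kwh) / 0.75

# Holiday week: lights off, so HVAC must supply everything at ~75% efficiency
holiday_input = heat_needed_kwh / 0.75

print(regular_input, holiday_input)   # ~1233 vs ~1333 kWh - the vacant week uses more
```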

Figure 6 also shows a striking O&M problem. The high outlier (point K) includes a three-day weekend, Martin Luther King Day. On Saturday, setback had started when a power outage disrupted the EMCS. It reverted to fail-safe mode - occupied temperature and ventilation settings, day and night, for the whole weekend - just after passage of a cold front. The extra cost of this one O&M mishap was $2,600, all because the EMCS needed new back-up batteries.

Each elementary school used two inside temperature sensors to sample 9 HVAC zones.

Energy Signatures used as an MIS for Commissioning and O&M Improvement.

Figures 7...9 show energy signatures used for both commissioning and energy management. A multi-channel programmable data logger was used to gather data for the energy signatures. This allowed monitoring other items expected to be of interest.

In commissioning, it's not unusual to find errors in controls installation or programming. For each modular, we monitored total and HVAC energy use (two units per modular, except for A), and one inside temperature. Monitoring each HVAC unit separately allowed use of HVAC diagnostics1 on each unit, for rapid commissioning. Subtracting HVAC from total energy isolates lighting & equipment, to examine it for efficient scheduling.
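
As a sketch of that subtraction (Python with pandas; the channel names total_kwh, hvac1_kwh, and hvac2_kwh are hypothetical logger channels for one modular, not the actual channel list), an hour-by-weekday profile of the remainder makes after-hours lighting and plug loads easy to spot:

```python
# Minimal sketch: isolate lighting & plug loads and profile them by time of day.
import pandas as pd

lights_and_plugs = df["total_kwh"] - df["hvac1_kwh"] - df["hvac2_kwh"]

# Average use by weekday and hour, to reveal loads left on after hours
profile = lights_and_plugs.groupby(
    [lights_and_plugs.index.dayofweek, lights_and_plugs.index.hour]
).mean()
print(profile.unstack())   # rows: weekday (0 = Monday), columns: hour of day
```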

Figure 7 shows baseline data for six modular classrooms in Bend, OR - three weeks of pre-commissioning data, gathered during February and March 1998, for each of buildings A...F. Of the six modulars, five are identical in envelope, size and HVAC units. Modular A is half-size, with one HVAC unit instead of two. Since the modular classrooms all came from the same production line, one might expect them to be nearly identical. If they were identical, they would all plot on the same line. But the plot looks like something from a 12-gauge shotgun. Their performance is not identical.

First, modular D uses about 150% more energy than the others. Second, although A, B, C, E and F are more closely grouped, they still differ by more than 50% in energy intensity. Third, there's a 10°F range in delta-T values, for modulars subject to identical weather. All these observations warranted follow-up.

Diagnostics for modular D showed one HVAC unit operated continuously, alternately heating and cooling. The compressor or the strip heat was on at all times. Our diagnostics spotted this problem immediately, and guided warranty service. When called, the service contractor's first impulse was to replace the thermostat. Discovering this had already been done twice, we removed a known functioning thermostat from another unit, and showed it did not fix the problem. Obliged to probe the HVAC unit's factory wiring, the serviceman stopped unwanted cooling.

Subsequently, diagnostics showed the unit's fan still ran continuously, regardless of the fan mode selected. That problem was eliminated on the third warranty service call.

Why weren't these modulars, from the same production line, all alike? The differences could not be due solely to inside temperature preferences. Had that been so, all would have plotted on the same line, merely at different points to the right or left of each other. Modular A is a single-wide, but the others are double-wides; hence Modular A could have a slightly higher skin loss coefficient per square foot. But E and F are higher than B and C, despite identical sizes, envelopes and HVAC units. Item four of our concept statement says that occupancy differences, O&M differences or hidden physical differences are the remaining possible explanations for the scatter. Ventilation differences (leading to different overall heat loss coefficients) were suspected.

HVAC diagnostics showed a variety of fan modes in use, ranging from 24/7 "fan-on" to "fan-auto" during occupied hours. Different amounts of ventilation likely account for some of the differences. We decided to try using fan-auto mode as default, with staff able to manually select fan-on mode for more fresh air.

Finally, the wide delta-T range showed inconsistent temperature settings or setbacks. After reviewing fan modes, we saw variations in temperature patterns. Few of the staff understood how to program their thermostats. The result? Some thermostats were used as manual on/off controls.

Manual operation also caused expensive demand peaks at morning warm-up. Monitoring confirmed many units were turned on when teachers first arrived. To fix this, all thermostats were re-programmed for aggressive night and weekend setbacks, and a "staggered" (sequential rather than simultaneous) recovery from setback.
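
A staggered recovery schedule is easy to generate. The sketch below is only an illustration: the unit labels, earliest start time, and 15-minute spacing are hypothetical, and in practice the spacing would be chosen from each unit's observed recovery time and the demand limit being targeted.

```python
# Illustrative staggered morning-recovery schedule for 11 rooftop units.
from datetime import datetime, timedelta

units = ["A", "B1", "B2", "C1", "C2", "D1", "D2", "E1", "E2", "F1", "F2"]
first_start = datetime(1998, 3, 2, 4, 30)   # earliest unit begins recovery (example date)
stagger = timedelta(minutes=15)             # spacing between successive starts

for i, unit in enumerate(units):
    start = first_start + i * stagger
    print(f"unit {unit}: begin warm-up at {start:%H:%M}")
```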

1 HVAC diagnostics allow many HVAC operating characteristics to be monitored, using only a power sensor. In this case, five parameters - Total HVAC energy use, fan-on mode energy use, compressor-mode energy use, compressor-mode duty cycle, and number of on-off cycles per hour - were monitored for each of 11 HVAC units, providing 55 channels using a total of 11 power sensors. See references for more info on monitoring details of HVAC operation.
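
The footnote's five parameters can be derived from a single power channel by classifying each logged reading into off, fan-only, or compressor mode. The sketch below is a reconstruction of that idea, not the referenced diagnostic tool itself; the logging interval and nominal fan/compressor draws are hypothetical defaults that would be tuned per unit from observed power levels.

```python
# Minimal sketch: single-power-sensor HVAC diagnostics.
import pandas as pd

def hvac_diagnostics(power_kw, interval_h=0.25, fan_kw=0.5, compressor_kw=3.0):
    """power_kw: Series of average kW per logging interval (hypothetical channel)."""
    mode = pd.cut(power_kw,
                  bins=[-1.0, 0.5 * fan_kw, 0.5 * (fan_kw + compressor_kw), float("inf")],
                  labels=["off", "fan_only", "compressor"])
    on = (mode == "compressor")
    hours = len(power_kw) * interval_h
    starts = (on.astype(int).diff() == 1).sum()      # off/fan -> compressor transitions
    return {
        "total_kwh": power_kw.sum() * interval_h,
        "fan_only_kwh": power_kw[mode == "fan_only"].sum() * interval_h,
        "compressor_kwh": power_kw[on].sum() * interval_h,
        "compressor_duty_cycle": on.mean(),
        "cycles_per_hour": starts / hours,
    }
```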

Estimating Savings Potential. Figure 8 is an early energy signature for modular C, showing the first four weeks.

The energy signature for Modular C was below baseline data for all the other modulars. We concluded that modular C's energy signature was an achievable condition or goal for the rest of the modulars, except A.

Four weeks of data for C didn't cover a full year's temperature range, but they allowed rough estimates of what could be accomplished if the other modulars performed like it. This allowed quantifying three of the four problems shown by Figure 7, to estimate the O&M savings potential for the six modulars. This was done by looking at the percentage reduction possible at a known delta-T, or (for setbacks) the reduction in delta-T, with results as follows (a rough calculation of the first kind is sketched after the list):

  • 25% energy savings (of the total for all six modulars) from correcting modular D's continuous cooling/heating problem.
  • 10 to 15% energy savings from consistent, aggressive setbacks (for affected units).
  • 10 to 15% energy savings from systematic ventilation control (for affected units).
  • 40% reduction in peak demand, from a staggered start sequence for morning warm-up.
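
The "percentage reduction at a known delta-T" calculation is straightforward once two signatures have been fitted. In the sketch below, the slope and intercept values are hypothetical placeholders rather than the fitted coefficients for these modulars; the structure of the comparison is the point.

```python
# Illustrative savings estimate: compare a modular's signature with the goal signature.
reference_delta_t = 40.0    # °F, a representative heating-season delta-T (assumed)

def intensity(slope, intercept, delta_t):
    """Energy use intensity (W/sq ft) predicted by a linear signature."""
    return slope * delta_t + intercept

target = intensity(0.030, 0.40, reference_delta_t)    # modular C (achievable goal) - hypothetical fit
current = intensity(0.045, 0.55, reference_delta_t)   # a modular needing work - hypothetical fit

savings_fraction = 1.0 - target / current
print(f"estimated savings at {reference_delta_t}°F delta-T: {savings_fraction:.0%}")
```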

Other observations - The only change made to modular C (at the start of week 4) was adding a setback schedule. Since energy signatures explicitly account for changing inside temperatures, the data should still plot on the same curve. After adding setbacks, that is just what happened - operating points moved further to the left. Symbol "C" represents baseline data; "1" represents data after setback started. Setbacks reduced average inside temperature by about 3°F, reducing heating energy intensity by about 0.2 W/sq ft (about 10%). Note: the energy effect of a temperature change must be assessed from the average change in inside temperature, rather than from an up-or-down shift in the energy signature.

The two segments of the line in Figure 8 that are "above trend" include data from holidays (as opposed to weeks with five days of classes). As noted for Figure 6, HVAC had to provide more of the total heat needed when the building was vacant, with lights off. The lights delivered heat at greater efficiency than the HVAC, showing HVAC losses caused by ventilation or ducting.

In Part 2 of this article, we'll show before-and-after examples of commissioning work, showing clearly measurable savings. We'll also show how to make weather-normalized comparisons between computer model outputs and real buildings, and describe how to use these techniques in your facility. References follow Part 2.
