CLIMATE CONTROL? Better delivery

REDUCING T&D LINE LOSSES TO REDUCE CARBON EMISSIONS

Published In: Intelligent Utility Magazine March/April 2009


AS THE PRESSURE TO MINIMIZE CARBON EMISSIONS mounts, policymakers and electric utilities have been pursuing energy efficiency with a renewed sense of urgency as a means toward this end. While most of the attention in energy efficiency has focused on more efficient end-use of electricity, there is growing interest in expanding the scope of energy efficiency to include more efficient power delivery through the electric grid.

While built for reliability, the grid is not necessarily optimized for transmission and distribution (T&D) efficiency. Electricity losses that occur in T&D between generation sources and end users represent a significant amount of energy. According to the U.S. Energy Information Administration (EIA), total net generation to the U.S. grid in 2006 was 3,897 million MWh. Power delivery losses vary by utility, but nationally the figure is estimated at 7 to 8 percent. On that basis, power delivery losses amounted to 273 to 311 million MWh in 2006, equivalent to 186 to 211 million tons of CO2 emissions.
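The loss and emissions figures above follow from simple arithmetic. A quick check, using the 0.68 tons CO2 per MWh average carbon intensity for U.S. generation cited later in this article:

```python
# Back-of-the-envelope check of the cited loss figures (2006 EIA data).
NET_GENERATION_MWH = 3_897e6   # total net generation to the U.S. grid, 2006
CO2_TONS_PER_MWH = 0.68        # average carbon intensity of U.S. generation

for loss_fraction in (0.07, 0.08):
    losses_mwh = NET_GENERATION_MWH * loss_fraction
    co2_tons = losses_mwh * CO2_TONS_PER_MWH
    print(f"{loss_fraction:.0%} losses: {losses_mwh / 1e6:.0f} million MWh, "
          f"{co2_tons / 1e6:.0f} million tons CO2")
```

Small rounding differences aside, this reproduces the 273-to-311-million-MWh and roughly 186-to-211-million-ton ranges in the text.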

Electricity losses in power delivery are a reality of physics, but there are incremental steps that the utility industry can take to improve the efficiency of the T&D system and, by extension, avoid incremental generation and CO2 emissions.

One approach to reducing power delivery losses is investing in traditional infrastructure - including upgrading distribution transformers with more efficient amorphous core transformers, reconductoring lines, and utilizing distributed generation closer to load centers. However, these measures typically require large capital expenditures and are usually undertaken to meet T&D capacity or replacement requirements rather than for the purpose of reducing losses. The loss reduction impact of such T&D infrastructure projects is usually regarded as a side benefit.

The other approach to reducing power delivery losses is investing in advanced communications and control technology - functionality that many attribute to the concept of a smart grid.

Today, there is much discussion of potential operational benefits of a smarter grid, which could directly or indirectly improve utility customer service. These include advanced distribution management functions, outage management and power theft detection. Other operational benefits might be automated change of service, improved asset management capabilities, greater load profiling ability, grid stabilization, and a variety of advanced metering functions.

An often overlooked possible application of a smart grid is the potential to improve the efficiency of power delivery using the existing power infrastructure. The promulgation of standard communications protocols through a smart grid promises to enable utilities to monitor and modulate the operating parameters of what today are operationally incompatible components in the T&D infrastructure.

REDUCING T&D LOSSES

Better real-time monitoring, analysis and control technologies as part of a smart grid can facilitate the implementation of an optimum voltage profile to minimize reactive power flow in the transmission network. Minimizing the reactive power flow also reduces transmission losses while maintaining system reliability.

When designing improvements to the transmission system, technologies such as synchronous condensers, shunt capacitors, static VAR compensators (SVC), and STATCOM can help reduce transmission losses by directly controlling the reactive flow and providing local reactive support.
Similarly, utilities can reduce distribution losses by facilitating more intelligent controls on capacitors and optimizing their usage to reduce system losses further. A smarter grid may also enable automatic reconfiguration to minimize losses during the day, which requires distribution state estimations, more sensors, and real-time control.
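The physics behind these reactive-compensation measures can be illustrated with the standard approximation for series line losses, P_loss ≈ (P² + Q²)R/V². The sketch below uses entirely hypothetical line parameters to show how supplying VARs locally, rather than shipping them down the line, cuts losses:

```python
# Illustrative only: why reducing reactive power flow cuts line losses.
# Uses the standard approximation P_loss ≈ (P^2 + Q^2) * R / V^2 for a
# three-phase line; the numbers below are hypothetical.
def line_losses_mw(p_mw: float, q_mvar: float, r_ohm: float, v_kv: float) -> float:
    """Approximate series (I^2 R) losses on a line carrying P + jQ at voltage V.
    With P/Q in MW/MVAr, R in ohms, and V in kV, the result is in MW."""
    s_squared = p_mw**2 + q_mvar**2   # |S|^2 in (MVA)^2
    return s_squared * r_ohm / v_kv**2

# Same real power delivered, but local reactive support (e.g., a shunt
# capacitor or SVC) drops the reactive flow on the line from 60 to 10 MVAr.
before = line_losses_mw(p_mw=100, q_mvar=60, r_ohm=5, v_kv=230)
after = line_losses_mw(p_mw=100, q_mvar=10, r_ohm=5, v_kv=230)
print(f"losses: {before:.2f} MW -> {after:.2f} MW")
```

In this hypothetical case the reactive compensation reduces line losses by roughly a quarter without changing the real power delivered.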

CONSERVATION VOLTAGE REDUCTION

A smart grid can also help to reduce load and losses through adaptive voltage control at substations and line drop compensation on voltage regulators and load tap changers (LTCs) to levelize feeder voltages based on load. The American National Standards Institute (ANSI) standard C84.1 specifies a preferred tolerance of +/- 5 percent for 120-volt nominal service voltage to the customer meter, or a range of 114 to 126 volts, according to the Northwest Energy Efficiency Alliance (NEEA). Utilities tend to keep the average voltage above 120 volts to provide a safety margin during peak load periods. However, maintaining voltage on the upper end of the ANSI C84.1 band at all times - which most utilities do - causes the overall system to consume more energy. A smart grid using sensors along the line to monitor and maintain voltage at 114 volts could minimize energy consumption and losses while meeting all voltage standards. While the impact of voltage reduction on energy consumption will vary from circuit to circuit based on the relative apportionment of resistive and reactive load, utility experience has shown that, on average, a 1 percent reduction in voltage yields a 0.8 percent reduction in power draw, according to NEEA.
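The NEEA rule of thumb above - each 1 percent voltage reduction yielding roughly a 0.8 percent reduction in power draw - can be applied directly to the ANSI C84.1 band. A minimal sketch:

```python
# Sketch of the NEEA rule of thumb: each 1% voltage reduction yields
# roughly a 0.8% reduction in power draw (the "CVR factor").
CVR_FACTOR = 0.8

def cvr_savings_percent(voltage_reduction_percent: float) -> float:
    """Approximate percent power reduction for a given percent voltage reduction."""
    return CVR_FACTOR * voltage_reduction_percent

# Moving from the top of the ANSI C84.1 band (126 V) to the bottom (114 V)
# is about a 9.5% voltage reduction.
reduction_pct = (126 - 114) / 126 * 100
print(f"{cvr_savings_percent(reduction_pct):.1f}% power reduction")
```

By this rule, operating at the bottom of the band rather than the top corresponds to roughly a 7.6 percent drop in power draw, on a circuit with the average resistive/reactive mix.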

In the Pacific Northwest, for example, NEEA has been operating a distribution efficiency initiative since 2003 with Bonneville Power Administration and participating utilities to demonstrate the effectiveness of strategies such as adaptive voltage control to reduce line losses. In a report prepared by Global Energy Partners, LLC, the market potential for voltage regulation in the Pacific Northwest region was determined to be as high as 6 average MW.

The most readily quantifiable example of the impact of a smart grid on T&D efficiency is the potential to regulate voltage more precisely. We have assumed that additional voltage reduction controlled by a smart grid would be confined to the residential sector, because residential loads tend to be more resistive and therefore more responsive to voltage reduction, as opposed to commercial and industrial loads, which tend to be more reactive because of increased motor and refrigeration loads.

As shown in table 1, of the 2,179 distribution substations in the United States, 70 percent are assumed to serve predominantly residential circuits. A range of savings induced by a smart grid is a function of:

  • Market penetration of voltage regulation between 25 and 50 percent of residential distribution substations by 2030 (7.5 percent of distribution circuits already have voltage regulation capability)

  • Average percent voltage reduction between 1 and 4 percent (that is, between 1.3 and 5.0 volts from a baseline of 126 volts).

On this basis, we quantify the savings range for conservation voltage reduction enabled by a smart grid between 3.5 and 28 million MWh per year in 2030. This represents a 1 to 11 percent reduction in the T&D losses forecasted by the EIA in 2030.

Assuming a reduction in the carbon-intensity of U.S. generation from 0.68 to 0.40 tons CO2 per MWh by 2030 with a greater share of generation from renewables, nuclear, and advanced coal with carbon capture and storage consistent with the Electric Power Research Institute (EPRI) Prism analysis, this represents a reduction in CO2 emissions of 1.4 to 11.2 million tons of CO2 in 2030. This CO2 reduction is the equivalent of taking 329,000 to 2.63 million vehicles off the road each year.
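The emissions arithmetic above follows directly from the savings range. The per-vehicle figure implied by the article's equivalence works out to roughly 4.26 tons CO2 per vehicle per year, which is taken as an assumption in the check below:

```python
# Reproduces the 2030 CO2 and vehicle-equivalence arithmetic above.
CO2_INTENSITY_2030 = 0.40   # tons CO2 per MWh, per the EPRI Prism assumption
TONS_PER_VEHICLE = 4.26     # tons CO2 per vehicle per year (assumed, implied by the article)

for savings_mwh in (3.5e6, 28e6):
    co2_tons = savings_mwh * CO2_INTENSITY_2030
    vehicles = co2_tons / TONS_PER_VEHICLE
    print(f"{savings_mwh / 1e6:.1f} million MWh -> {co2_tons / 1e6:.1f} million "
          f"tons CO2 (~{vehicles:,.0f} vehicles)")
```

This recovers the 1.4-to-11.2-million-ton CO2 range and the 329,000-to-2.63-million-vehicle equivalence cited in the text.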


This article was written by Omar Siddiqui
