Utilities are turning to value creation from data streams arising from grid modernization. Data analytics are nothing new to the industry, but the imagination and technical capabilities to advance from a historical view of past events to real-time or predictive capabilities does require new processes, systems and ways of thinking.
The smart grid is leading the power industry into a data and analytics boom as its implementation phase gives way to its value phase, according to Christine Richards, senior analyst with Energy Central's Utility Analytics Institute, who moderated a webcast on the topic yesterday.
Richards defined the decade from 2000 to 2010 as the smart grid's development phase, the period from 2007 to 2012 as the infrastructure, or implementation, phase, and the decade from 2010 to 2020 as the value phase.
In the value phase, grid optimization, intelligent asset management and the integration of renewable resources and electric vehicles will be achieved by transforming raw data into actionable intelligence for all business and operational units of a utility, Richards said.
Three utilities presented their experience with data analytics in the webcast, titled "Lessons Learned: How Utilities Leverage Data." An audio replay and the slide deck are available.
The webcast clearly connected the traditional and new sources of data, how that data is processed for integrity ("data you can trust," as one participant dubbed it), how it is mashed up with other data to create myriad insights and how those insights can be presented to enable evidence-based decisions, often in real time.
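As a sketch, the stages the panelists described — validation for integrity, mash-up with other sources, and a derived insight — might look like the following. The records, field names and thresholds here are hypothetical illustrations, not drawn from any panelist's actual system:

```python
# Illustrative pipeline: validate raw meter reads ("data you can trust"),
# mash them up with weather data, then derive a simple insight.
# All data and thresholds are hypothetical.

def validate(reads):
    """Keep only reads with plausible, non-negative kWh values."""
    return [r for r in reads if 0 <= r["kwh"] < 100]

def enrich(reads, weather_by_hour):
    """Mash up each read with the temperature recorded for its hour."""
    return [dict(r, temp_f=weather_by_hour[r["hour"]]) for r in reads]

def insight(reads):
    """Flag hours with high load despite mild weather -- a crude proxy
    for usage patterns worth a closer look."""
    return [r["hour"] for r in reads if r["kwh"] > 5 and r["temp_f"] < 75]

raw = [
    {"hour": 14, "kwh": 6.2},
    {"hour": 15, "kwh": -1.0},   # bad read, dropped by validation
    {"hour": 16, "kwh": 7.1},
]
weather = {14: 68, 15: 70, 16: 92}

flagged = insight(enrich(validate(raw), weather))
print(flagged)  # hour 14: high load in mild weather
```

Real systems replace each stage with far richer rules, but the shape — cleanse, join, interpret — is the same one the panelists described.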
Alan Dulgeroff, director of IT enterprise and corporate systems for Sempra Utilities, kicked off the presentation with analytics work at Sempra's flagship, San Diego Gas & Electric (SDG&E), which serves 1.4 million electric customers over 4,100 square miles.
"Our future lies in energy sharing and data sharing," Dulgeroff said.
By "energy sharing" he meant distributed generation and "data sharing" referred to the two-way, utility-customer flow of electricity, price signals and information.
The traditional basis for data creation and interpretation at SDG&E began with a geographic information system (GIS), an outage management system, SCADA (supervisory control and data acquisition) substations and some 1,300 field devices. With the advent of advanced metering infrastructure (AMI) and other grid modernization steps, data sources will multiply and become richer, and the value of analytics will increase accordingly, Dulgeroff said.
Today, for example, SDG&E runs the third-largest privately owned weather network in the country, which has served in an emergency preparedness role in an area often beset by wildfires and high winds. In the future that system will be applied to forecasting conditions affecting distributed renewable generation sources and coordinating their integration onto the grid.
Dulgeroff described how data generation and its transformation by analytics into useful guidance for decision-making related to multiple areas defined by SDG&E in its "Smart Grid Deployment Plan" filed with the state of California: customer empowerment, outage/distribution management, condition-based asset maintenance and the integration of distributed renewable energy sources.
In many of these cases, the proper analysis of data will transform the utility's role from reacting to historical events and outcomes to predictive, proactive stances that avoid outages or power quality issues, to cite two examples.
Producing the imagination and skills to fully exploit data's promise has led SDG&E to partner with San Diego State University, some of whose computer and electrical engineering students have found jobs with the utility.
Paul Dick, director of enterprise information management for OGE Energy Corp., noted that its flagship utility, Oklahoma Gas & Electric, serves 800,000 customers in Oklahoma and Arkansas over 30,000 square miles of territory.
OGE Energy's "2020 Plan" calls for avoiding new fossil fuel generating capacity until 2020, Dick said. That means increased penetration of residential and commercial demand response, distributed generation, plug-in hybrid electric vehicles and geothermal energy sources. A related goal: reduce peak load by 500 megawatts in the coming decade.
Reaching that goal externally entails providing customers with the information and tools to enable them to make energy management decisions, while internally an integrated operations center will drive the utility's IT and OT (operational technology) teams to work more closely together. Those goals depend on exploiting new, rich data sources for strategic and real-time decision-making, Dick said.
The OGE "Information Factory" has a three-tier architecture, according to Dick. Its data warehouse has been sized to accommodate the hundreds of terabytes of data expected by 2014, and the architecture includes expanded data integration capabilities and new presentation capabilities. New analytical capabilities currently in progress include dynamic segmentation of meter data, load curve analyses, the integrated operations center already cited and distribution management capabilities.
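Load curve analysis and customer segmentation can be sketched in a few lines. The rule below — grouping customers by the load factor of a daily curve, the ratio of average to peak demand — is a common textbook approach, offered here purely as an illustration rather than OGE's actual method, and the data is invented:

```python
# Sketch of load-curve segmentation: split customers into "flat"
# (steady usage) and "peaky" (sharp peaks) groups by load factor.
# Data and threshold are hypothetical.

def load_factor(curve):
    """Average demand divided by peak demand over one daily curve."""
    return sum(curve) / len(curve) / max(curve)

def segment(curves, threshold=0.5):
    """Group customer IDs by whether their load factor meets the threshold."""
    groups = {"flat": [], "peaky": []}
    for cust_id, curve in curves.items():
        key = "flat" if load_factor(curve) >= threshold else "peaky"
        groups[key].append(cust_id)
    return groups

daily_kw = {
    "A": [2, 2, 2, 2, 2, 3],    # steady usage -> high load factor
    "B": [1, 1, 1, 1, 1, 10],   # sharp evening peak -> low load factor
}
print(segment(daily_kw))  # {'flat': ['A'], 'peaky': ['B']}
```

Peaky customers are the natural targets for the demand response and peak-shaving programs in OGE's 2020 Plan, which is why segmentation of meter data sits alongside load curve analysis on the utility's roadmap.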
One difficulty in conveying new, data-enabled capabilities, of course, is describing how such things get done in terms of systems. Martin Mysyk, senior enterprise architect for TransAlta Corp., Canada's largest publicly traded electricity generator, based in Calgary, Alberta, provided several slides to illustrate the process.
One of Mysyk's slides, dubbed "Leveraging Data from Plant Floor to Decision Maker - Value Scenarios," illustrated how discrete data with limited value reaches the status of actionable information with enterprise value. Another, dubbed "Data to Information - From Source to Delivery," showed the operational steps required to get from data generation to outputs such as integrated reports, dashboards, analytics and answers to self-service queries.
At least one question to the panelists focused on how to "future proof" data analytics tools. The short answer: build in agility and capacity for scalability.
An overflowing column, of course, doesn't do justice to the nuance provided by these panelists, and I can recommend the slide deck and a replay of the audio for a fuller sense of the complexities involved in creating business value from utilities' new data streams.
For a use case that applies data analytics to revenue protection, tap last week's column, "The Promise (and Application) of Data Analytics," which directs you to another webcast on that critical topic.
Intelligent Utility Daily