OGE’s Information Factory takes flight
Handling disparate data effectively for analytics
Published In: Intelligent Utility January / February 2012
IN "LESSONS LEARNED: HOW UTILITIES LEVERAGE DATA," an Intelligent Utility Reality Webcast held in mid-December, panelists discussed the ways in which their utilities were leveraging and analyzing the data available through meter data management, outage management, distribution management systems and more.
Paul Dick, director of enterprise information management for OGE Energy Corp., explained OGE's "big data" in these terms.
Defining big data
"What does OGE's big data look like? In the past it was SCADA, it was OMS, it was all these different messages and events that we're trying to capture," Dick said. "So for us, meter interval data with the demand response program we have is going to drive our typical meter interval data just a bit higher in terms of the volumetrics of that. We're actually taking in 50 billion events annually, just from a meter interval perspective.
"Our Integrated Operating Center, also, when we collect all the messages that are coming from this kind of technology, and we put together the meter events, the AMI events, the OMS [outage management system] integration, the DMS [distribution management system] messages too, it's a very large volumetric again that's going to take some skill to manage, incorporate it into the data model, and also to build analytics on top of that."
DMS messages "can be massive, and they can be large in width, as well," Dick said. "The value there is also providing OMS the intelligence from DMS, so your reliability processes can be better maintained. The Holy Grail, from a distribution perspective, is integrated Volt/VAR controls (IVVC), and what you are able to optimize."
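The 50 billion annual meter-interval events Dick cites translate into a substantial sustained ingest rate. A rough back-of-the-envelope check (illustrative arithmetic only, not an OGE figure beyond the annual total):

```python
# Back-of-the-envelope: what does 50 billion events/year mean per second?
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

annual_events = 50_000_000_000  # 50 billion meter-interval events (from the article)
events_per_second = annual_events / SECONDS_PER_YEAR
print(f"{events_per_second:,.0f} events/sec sustained")  # ≈ 1,585 events/sec
```

And that is the steady-state average; AMI reads typically arrive in bursts, so peak rates would run well above it.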
Handling data from disparate systems
With all that data to manage, OGE developed what the utility calls an Information Factory program. "It's basically, in a nutshell, nothing different than anyone else has ever seen, but maybe we've just tooled the capabilities in a different way," Dick said. "We've spun up our enterprise service bus to start handling messages from all these disparate systems in near-real-time and real-time fashion.
"So we had the traditional batch processes ... big chunks of data over specific time dimensions like everybody else has, but this is now giving us a new capability to be extracting and (building) composite applications above and beyond the transactions which everyone's accustomed to."
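The pattern Dick describes, correlating bus messages from AMI, OMS and DMS into composite views as they arrive rather than waiting for a nightly batch load, can be sketched as follows. This is a minimal illustrative sketch; the message fields and class names are assumptions, not OGE's actual schema or service-bus technology:

```python
from collections import defaultdict
from dataclasses import dataclass, field

# Hypothetical message shape; the field names are illustrative.
@dataclass
class BusMessage:
    source: str      # originating system, e.g. "AMI", "OMS", "DMS"
    feeder_id: str   # distribution feeder the event relates to
    payload: dict = field(default_factory=dict)

class CompositeView:
    """Correlates messages from disparate systems by feeder as they
    arrive, so applications can read a merged view in near real time."""
    def __init__(self):
        self._by_feeder = defaultdict(dict)

    def on_message(self, msg: BusMessage) -> None:
        # Keep the latest payload from each source system, keyed by feeder.
        self._by_feeder[msg.feeder_id][msg.source] = msg.payload

    def feeder_snapshot(self, feeder_id: str) -> dict:
        return dict(self._by_feeder[feeder_id])

view = CompositeView()
view.on_message(BusMessage("AMI", "F-101", {"kwh": 3.2}))
view.on_message(BusMessage("OMS", "F-101", {"outage": False}))
print(view.feeder_snapshot("F-101"))
```

The design choice is the one Dick points to: the composite is built incrementally from the message stream, so it is current between batch windows instead of only after them.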
Geospatial-enabled presentation, he said, "allows us to layer these things on dynamically, so that each time we add a new subject area within the warehouse, it's automatically picked up in the new layer it was assigned. Users can actually go and pick that layer from a dynamic library. We also beefed up and added some statistical modeling capabilities, so that we can start to do these predictive looks into the IOC and the operating centers as well as the predictive looks into the DMS system, IVVC and so on."
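The dynamic-layer idea, where adding a warehouse subject area automatically exposes a map layer users can pick from a library, might look something like this in outline. All names and the registry mechanism here are hypothetical; the article does not describe OGE's actual GIS tooling:

```python
# Illustrative sketch: each new warehouse subject area registers a
# matching map layer, which users can then pick from a dynamic library.
class LayerLibrary:
    def __init__(self):
        self._layers = {}

    def register(self, subject_area: str, query: str) -> None:
        # Adding a subject area automatically exposes a layer of the
        # same name, so no per-layer configuration step is needed.
        self._layers[subject_area] = {"name": subject_area, "query": query}

    def available_layers(self) -> list:
        return sorted(self._layers)

    def pick(self, name: str) -> dict:
        return self._layers[name]

lib = LayerLibrary()
lib.register("AMI events", "SELECT * FROM ami_events")
lib.register("OMS outages", "SELECT * FROM oms_outages")
print(lib.available_layers())  # ['AMI events', 'OMS outages']
```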
But the importance of the last piece of the equation should not be underplayed, Dick said. "We believe that after these analytical findings are present, everybody else needs to know, to be able to go in and see what's been done, how to leverage those things, and the governance around them, so that people use them in the correct way. Collaboration is becoming a bigger and bigger thing for the presentation layer."