From Research Horizon's "Data Driven" article:
With unprecedented amounts of data suddenly on tap, the challenge many researchers face is how to consume it.
For example, inexpensive sensor technology has made it easy for power companies to collect data on critical high-value assets such as generators and turbines. Yet analytical technology has lagged behind, inhibiting companies' ability to make sense of that data, said Nagi Gebraeel, Georgia Power associate professor in Georgia Tech’s School of Industrial and Systems Engineering (ISyE) and associate director of the Strategic Energy Institute.
In response, Gebraeel’s research group is developing a new computational platform to provide detection and predictive analytics for the energy industry. This platform remotely assesses the health and performance of equipment in real time and monitors trends to determine such things as:
- The best time to perform maintenance.
- When to order new parts so they don’t linger in inventory, costing money and possibly becoming obsolete.
- How shutting down one piece of equipment will affect the entire network.
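The first of these determinations rests on predicting when a degrading asset will cross a failure threshold. As a minimal, generic sketch (not the group's actual platform), the function below fits a linear trend to a condition-monitoring signal, such as vibration amplitude, and extrapolates it to a hypothetical failure threshold to estimate remaining useful life; the threshold value and signal model are illustrative assumptions only.

```python
import numpy as np

def estimate_rul(times, signal, threshold):
    """Fit a linear trend to a degradation signal and extrapolate
    to the failure threshold to estimate remaining useful life.

    This is a deliberately simple stand-in for the richer stochastic
    degradation models used in real prognostics work."""
    slope, intercept = np.polyfit(times, signal, 1)
    if slope <= 0:
        return float("inf")  # no measurable upward degradation yet
    t_fail = (threshold - intercept) / slope  # time the trend hits threshold
    return max(t_fail - times[-1], 0.0)      # time remaining past last reading

# Simulated vibration readings drifting upward toward an assumed threshold
hours = np.arange(0, 100, 10)
vibration = 0.05 * hours + 1.0 + np.random.default_rng(0).normal(0, 0.1, hours.size)
rul = estimate_rul(hours, vibration, threshold=10.0)
```

A maintenance window could then be scheduled just inside the estimated remaining life, rather than on a fixed calendar interval.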
“The latter is especially important because any slack caused by shutting down one generator has to be picked up by the rest of the generators,” Gebraeel said. “Now their lifetime has to be re-evaluated because they are working in overload. That’s where optimization and analytics intersect.”
By integrating detection, prediction, and optimization capabilities, the new platform could help power companies achieve significant savings. Indeed, a preliminary study shows a 40 to 45 percent reduction in maintenance costs alone.
In the past, there’s been a lot of unnecessary preventative maintenance, Gebraeel pointed out. “Companies do it because of safety, which is rational, but they are being too conservative because they don’t have enough visibility into their assets.”
Key to creating the computational platform is re-engineering older statistical algorithms that were developed in the context of limited data, Gebraeel said. Today’s algorithms must be executed on processing platforms that can handle terabytes and petabytes of data, deployed across a large number of computer nodes.
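One common pattern in that re-engineering is replacing batch formulas, which assume the whole dataset fits in memory, with streaming updates that process one reading at a time. As a generic illustration of the idea (not the group's platform), Welford's online algorithm computes a running mean and variance over an arbitrarily long sensor stream in constant memory:

```python
class StreamingStats:
    """Welford's online algorithm: running mean and variance over a
    data stream, without ever holding the stream in memory."""

    def __init__(self):
        self.n = 0        # readings seen so far
        self.mean = 0.0   # running mean
        self.m2 = 0.0     # sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        # Sample variance; zero until at least two readings arrive
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0
```

Because each node needs only its own running totals, updates like this parallelize naturally across the many compute nodes such a platform would span.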
Read the rest of the article at http://www.rh.gatech.edu/features/data-driven.