ADVANCES IN COMPUTE POWER, ANALYTICS AND SENSOR TECHNOLOGY COULD REVOLUTIONIZE TURBOMACHINERY DESIGN, OPERATIONS & MAINTENANCE
The last decade has seen monumental advances in compute power. The average laptop, these days, would put a supercomputer of the nineties to shame. This has propelled the field of data analytics into the stratosphere. Calculations which used to take months can be accomplished in minutes.
Now factor in gains in sensor and wireless technology, which make it possible to gather an ocean of information from plants and turbomachinery. This combination of compute power, analytics and the assembly of huge quantities of data is known under the collective title of “Big Data.”
Turbomachinery product and service providers, such as GE, Siemens, Praxair, Woodward and Invensys, are already investing heavily in Big Data, to help users extend operating times of turbines and plants; better predict machine failures; and plan overhauls and maintenance cycles.
Others caution, however, that it may be too much, too soon. Klaus Brun, Senior Machinery Manager at the Southwest Research Institute (SwRI), would rather see this computing power used to compare massive quantities of measured data, or for computational fluid dynamics (CFD) and finite element analysis (FEA) programs, than to determine performance optimization or maintenance cycles for operating gas turbines.
In any case, some are convinced that investing now is well worth the risk and are pushing to be ahead of the curve. Praxair, an industrial gases company, has established a central data monitoring center for all its facilities worldwide, using Big Data technologies.
“With Big Data employed correctly, we can see if a facility in Argentina has problems and mitigate issues before they happen,” said Victor Rioli, Reliability Director in Global Operations Excellence at Praxair. “This means we can reduce our costs, our customers’ costs and minimize environmental impacts as there is no need to have an engineer stop in a facility once a month, just to see what’s going on.”
Similarly, GE Oil & Gas is implementing Big Data to increase efficiency and output of turbines for onsite generation, as well as for applications such as compression of natural gas for pipeline transmission.
“A small increase of only 1% in several areas such as efficiency, reduced emissions, uptime, safety and productivity can mean millions of dollars in profitability, or make it viable to exploit resources in remote regions and harsh environments,” said Dan Heintzelman, President & CEO of GE Oil & Gas.
Meanwhile, companies such as Siemens, Woodward and Invensys, as well as IT partners such as Amazon.com, are figuring out how best to harness these vast stores of data to change the way turbomachinery is designed, operated and maintained. Instead of evaluating the gathered data to obtain a historical view of machine performance, it can be reviewed and acted upon in real time.
“Analyzing Big Data at rest is too slow,” said Greg Schulz, an analyst with IT consulting firm StorageIO. “You have to be able to process it as it comes in.”
For example, the sensors on a jet engine generate several terabytes (TBs) of data every flight. Paul Maritz, CEO of IT firm Pivotal, explained that most analytics applications would first ingest the data, store it and then analyze it. His firm is figuring out the best way to use this information during the journey in order to optimize jet fuel consumption, improve engine efficiency and in the longer term, to drastically reduce maintenance costs.
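The difference between storing data first and analyzing it in flight can be sketched with online statistics: instead of ingesting a full flight's worth of readings before computing anything, running aggregates are updated one sample at a time. The sketch below is purely illustrative (the class and function names are invented, and this is not Pivotal's actual pipeline); it uses Welford's online algorithm to maintain a running mean and variance and flags readings that deviate sharply from the baseline as they arrive.

```python
# Illustrative only: streaming analysis of a sensor feed, updating
# statistics per sample instead of batch-processing stored data.
# Names and thresholds are invented, not from any vendor's product.

class OnlineStats:
    """Incrementally track the mean and variance of a sensor stream."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations (Welford)

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

def monitor(stream, limit=3.0):
    """Flag sample indices more than `limit` std devs from the running mean."""
    stats = OnlineStats()
    alerts = []
    for i, x in enumerate(stream):
        if stats.n > 30:  # wait for a baseline before alerting
            sigma = stats.variance ** 0.5
            if sigma > 0 and abs(x - stats.mean) > limit * sigma:
                alerts.append(i)
        stats.update(x)
    return alerts
```

Because each update is constant-time and constant-memory, the same logic can run at the edge (on the engine controller) rather than in a data center after landing.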
Pivotal is owned mostly by IT giants EMC and VMware, but GE has taken a 10% stake worth $110 million.
“As advanced telemetry is added into devices, it will ramp up the data volume by two orders of magnitude,” said Maritz. “A gas turbine, jet engine or locomotive can harness telemetry to deliver new value. By reviewing real time data, there is no need to break down an engine every 2,000 hours. This will enable OEMs to profitably change their business model to contracting for hours of operation, not sales of machines.”
Consequently, turbine OEMs and control system manufacturers are gearing up for what promises to be a revolution in plant control, profitability and efficiency. Far from being pie in the sky, they have already begun to add these capabilities to their turbines and compressors, and to offer service contracts that take remote monitoring of turbomachinery to a new level.
GE’s initial foray into Big Data is through its “Predictivity services,” which collect data from 1,600 units that can be used to improve efficiency and lifecycle value while reducing downtime. So far, GE has gathered 93 million hours of operational data and 25 TB of engineering data. Another 30,000 hours of operational data are being added daily as the company ramps up its Big Data initiative. “With smarter machines, the software inside makes them even more valuable,” said Jim Fowler, CIO, GE Power Gen Services. “You can get more out of the asset you own.”
GE Predictivity services and technologies manage and operate jet engines and gas turbines, using the latest sensors, which gather five-to-ten times the data of standard sensors, in order to fuel more accurate analytics. Applications include condition-based maintenance, outage management, fuel consumption and controls, and plant automation. They can be deployed at the asset, facility or fleet level, or can run on equipment belonging to GE, with users accessing the data over an internet connection. Predictivity gathers and analyzes historical and current information including operational data, fuel costs, weather patterns and climate. It slices and dices that data as a means of optimizing turbine operation. “We want to make an average plant on an average day run like the best plant on the best day,” said Fowler.
One of the services GE offers is called Startup Advisor, which allows a plant operator to simulate plant startup based on current operating conditions. The operator puts in the variables and gets instructions on optimal startup whether he wants the lowest fuel costs, the fastest start or the lowest emissions. This is particularly useful in deregulated markets with day-ahead bidding where a company can lose its place if it is not ready to comply with emissions standards. “Big Data comes in where we take readings 24 times a second to ensure optimal output at a low cost and with the lowest emissions,” said Fowler.
He said that customers collect a lot of data continually, but they are not sure what to do with it. GE has assembled the resources that plant operators lack, such as advanced analytics that ingest the data and put it into a format that allows operators to make better decisions. Or those decisions can be left up to the control units. “Analytics is catching up to the volume and velocity of data,” said Fowler. “As sensors and control units get smarter, we can push analytics out to the unit and make decisions at the controller level without user intervention.”
Not all decisions, however, should be automated. In some cases, decisions need to be made after looking at the entire fleet, and in others it is best to let the operator make the final decision. In June 2013, GE released Proficy Historian HD, a new type of Big Data historian data management software. The software can gather and analyze 152,000 samples per second, or 13 billion samples per day. This is used by GE for time series data where large turbines can generate as much as 1 TB of data per day.
As would be expected, control vendors are scrambling to come to terms with the ramifications of Big Data. Will Schulke, Senior Staff System Engineer at Woodward, made the company’s motivation clear: unplanned downtime costs too much not to work towards improved prevention, longer maintenance intervals and faster prediction of problems. “In order to reduce emissions and improve efficiency, systems are becoming more complex such that data is the foundational method to maintaining top performance.”
Schulke predicted that Big Data will primarily be used by turbine manufacturers in developing prognostics from data to help users extend the operation times of turbines and plants. The problem is not how to gather the data; plants have used relatively slower trending and historian systems for decades to monitor and solve problems. But that pace is no longer acceptable.
High-speed data needs to be gathered at the component and system level, added Schulke, and modeled against previously known or gathered data sets. It can then be aggregated with other units to provide fleetwide data that can be used to determine how an individual unit performs compared with other similar units.
“Although there are many uses for Big Data, we see rotating equipment owners and manufacturers collecting fleet-wide data today to better understand where the weakest parts of the system are and to allow them to better verify maintenance periods,” said Schulke. “As rotating equipment designs migrate to new technology, Big Data will have a role in fleet-wide and unit performance verification and model creation. In the future this data and its related verification help us improve and verify system designs as well as better predict failures and plan system overhauls.”
About 20 years ago, Woodward began including an embedded control Datalog with high-speed, control-execution-rate resolution for analysis after a shutdown occurred. This Datalog gathered about 50-to-75 critical signals at 5-to-10 millisecond (ms) resolution for 30-to-40 seconds prior to a control shutdown. The current Datalog can store multiple crash logs with 600-to-700 variables at the same 5-to-10 ms resolution for 2-to-3 minutes.
“We are seeing more and more requests to capture high-speed, high-resolution data continuously vs. traditional methods that use longer term, slower type trends,” said Schulke.
The drop in price of computer storage over the last few years has made it cost effective to store far more data for a longer period of time, noted Schulke. A full year of data covering 300 variables at 10 ms resolution would fill about 1.4 TB of disk space. With a PC backup drive costing $100 per TB or less, that year of data can be stored for less than $150. Therefore, Woodward has added options of an unlimited time window or a continuous data log inside its turbine controls.
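The arithmetic behind that figure is worth making explicit. At 4 bytes per floating-point sample (an assumption on my part; the article does not state a sample size), a year of 300 variables at 10 ms resolution is considerably larger in raw form, so the quoted 1.4 TB implies some compression, which is realistic for slowly changing process data:

```python
# Back-of-envelope check of the storage figure quoted above.
# Assumes 4-byte (float32) samples; the quoted 1.4 TB then implies
# roughly 2.7:1 compression of the raw stream.

variables = 300
sample_rate_hz = 100              # 10 ms resolution
seconds_per_year = 365 * 24 * 3600

samples_per_year = variables * sample_rate_hz * seconds_per_year
raw_tb = samples_per_year * 4 / 1e12   # 4 bytes per sample

print(f"samples/year:        {samples_per_year:.3g}")  # ~9.5e11
print(f"raw size:            {raw_tb:.1f} TB")         # ~3.8 TB uncompressed
print(f"implied compression: {raw_tb / 1.4:.1f}:1")    # vs. the quoted 1.4 TB
```

Either way, the conclusion stands: at $100 or less per TB of disk, a full year of high-resolution data costs only a few hundred dollars to keep.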
Woodward recently implemented Big Data analytics within its actuation and metering valve products to assist customers with component level performance and reliability verification. In this case, software predicts maintenance times for issues such as premature gear wear in electric actuators.
Woodward is far from the only controls vendor investing in this technology. Terence Creevan, Triconex Turbomachinery Control Principal Application Engineer, Invensys, said that Big Data can reduce maintenance costs for tasks that have been traditionally based on calendar days by allowing maintenance to be based on machine usage as measured by runtime hours or number of starts.
“In the real world, machinery does not necessarily wear by the calendar,” said Creevan. “Big Data helps by providing more data logging points with a higher resolution, which can be used to more accurately predict when maintenance is required.”
For gas turbines, “numbers of fired starts” is a more precise indicator of when a major overhaul might be required than counting off the days, but even that is crude compared to monitoring all turbine parameters. As gas turbines operate, efficiency declines due to temperature changes, humidity, barometric pressure, wear, various buildups, dirt and other factors.
“More data with higher resolution would lead to higher levels of efficiency by providing a means of continuous rather than periodic tuning by an engineer,” said Creevan.
The higher resolution in data point logging that Big Data provides allows for quicker, more accurate diagnostics. A resolution of 1 ms provides better sequence of events and historical data, which improves troubleshooting and results in a decreased time to diagnose and repair failures.
Additionally, Big Data is also being used to simulate how a turbine will respond under different operating conditions, and how the controls should be set for best performance. Invensys’s DYNSIM platform is a process simulator that represents plant or process performance. The resulting model is connected to the Triconex controller. The user can make changes or create specific simulation cases. This enables system testing in a safe environment. Faults can be discovered and resolved without incurring large costs.
Siemens takes a different approach to Big Data. Instead of importing mountains of data from every conceivable sensor, system and component, it is working to reduce the number of monitoring signals to the essentials to implement an optimized, task-specific integration of sensors in the plant. “We regard on-site data processing as complementary to back office analyses,” said Timot Veer, Product Manager, Remote Monitoring & Services, Energy Service Fossil. “Big Data, however, does provide support in overall analyses and assists in integrating various business and operational processes.”
The Siemens Power Diagnostic Centers (PDC) in the U.S. and Germany cover more than 530 turbines as well as other equipment worldwide. Through Power Diagnostic Services (PDS), these centers provide long-term maintenance customers with 24/7 monitoring. For security purposes, it is a passive data gathering and analysis system: Siemens cannot remotely control any of the plant’s instrumentation and control systems or engine operation. The goal is to detect abnormal operating conditions of gas turbines, steam turbines, heat recovery boilers and generators before those abnormalities lead to system trips, forced outages, or lowered efficiency and output. This gives plant operators greater control over scheduling repairs and upgrades.
PDS monitors more than 134,000 configured systems, with 5.4 million connections per month and several hundred GB of data transferred per day. It includes individual turbines or entire turbine-generators according to closely defined, plant-specific routines which automatically record various parameters for plant condition. “This enables early detection of rotor vibration issues, burners and combustion chambers exhibiting deviations in behavior, changes to plant performance, faulty sensors and actuators,” said Veer.
If the diagnostics system detects conspicuous behavior, Siemens contacts the plant operator and they jointly evaluate the event and potential deviation from normal. On this basis, recommendations for further safe operation are made. In agreement with the customer, additional diagnostic measures can be implemented at the project site to support more detailed root cause analyses.
Veer claimed that PDS can sometimes detect potential issues months before instrumentation and control (I&C) systems would generate an alarm or a trip. The alarm thresholds of PDS monitoring are normally set below those of the plant I&C systems, allowing intervention to prevent plant trips.
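The logic behind that early warning can be sketched simply: hold a diagnostic threshold below the trip threshold, and extrapolate a drifting signal's trend to estimate how long remains before a trip. Everything in the sketch below (the function, thresholds and signal values) is invented for illustration and is not Siemens's actual algorithm.

```python
# Hedged sketch of two-tier monitoring: a WARN level set below the
# TRIP level, plus a least-squares trend projection of a drifting
# signal. Invented for illustration; not any vendor's implementation.

def check_signal(history, warn_level, trip_level, dt=1.0):
    """history: recent samples, oldest first, spaced dt time units apart."""
    current = history[-1]
    if current >= trip_level:
        return "TRIP"
    if current >= warn_level:
        return "WARN"
    # Least-squares slope over the window to project the trend forward.
    n = len(history)
    t_mean = (n - 1) / 2
    x_mean = sum(history) / n
    num = sum((i - t_mean) * (x - x_mean) for i, x in enumerate(history))
    den = sum((i - t_mean) ** 2 for i in range(n))
    slope = num / den / dt
    if slope > 0:
        time_to_trip = (trip_level - current) / slope
        return f"OK (projected trip in {time_to_trip:.0f} units)"
    return "OK"

# A bearing temperature creeping upward: still below both thresholds,
# but the trend says the trip level is only a few samples away.
print(check_signal([92.0, 93.5, 95.0, 96.5, 98.0],
                   warn_level=105.0, trip_level=110.0))
# prints "OK (projected trip in 8 units)"
```

The projection is what buys the operator months of lead time: the signal is still "healthy" by I&C standards when the diagnostic system first flags the trend.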
Not everyone, however, thinks it is necessary to gather and process such huge quantities of turbine data. SwRI’s Klaus Brun said that a cycle deck program, a 1-D thermodynamic analysis that predicts how the machine should be behaving, is adequate and easy to run. “These programs are fairly accurate as they are the same programs the manufacturer uses for his gas turbine performance guarantees,” said Brun. “Even 20 years ago you could run a transient cycle deck program in real time with a gas turbine using a simple PC.”
He and fellow Mythbuster Rainer Kurz pose serious questions about the hurdles that lie ahead for Big Data, as covered later in this issue (p. 52).
Clearly, there is a ways to go before the hype makes much of a difference on the plant floor. That said, able minds are working on how to harness Big Data to make turbomachinery run better. The likelihood is that it will take the long-term involvement of control vendors to gradually bridge the gap between OEM dreams and plant manager realities.
When that gap is bridged, plant managers need not worry that they will be out of a job. Sanjay Sarma, Director of Digital Learning at the Massachusetts Institute of Technology (MIT), made the point that computer geeks and Big Data analysts know nothing about gas turbines and compressors.
“Data from heavy industrial units need to be made sense of,” said Sarma. “Insight is required on how often to collect it and how quickly to react. Plant operators know this far more than a computer guy so you need that partnership of the domain expert working with the software and Big Data guys.”