Consumer market will drive industrial Internet of things for manufacturing

Technology Update: The same disruptive activities are at work for the industrial Internet of things (IIoT) as when the personal computer changed the way businesses operated.

By Jim Campbell March 6, 2014

It’s official. The entire world knows about the Internet of things (IoT). A recent article in the Wall Street Journal described the efforts of businesses built around producing and consuming data measured by devices. These businesses are built on the idea that smart machines will help market products to the consumer. Your egg carton will tell you when the eggs are old. Your refrigerator will tell you when it’s time to buy milk and will eventually tell you which supermarket has the best price. And, with Google’s purchase of Nest, it’s clear that consumer-based companies want to merge all that data from your cell phone (location, Web browsing history, and so on) with data from your home.

Engineers in the machine condition monitoring world have been gathering data from machines for decades. Early data collection from remote machines was done over dedicated phone lines and modems. Supervisory control and data acquisition (SCADA) systems archived the data for historical review. And, as more programmable logic controllers (PLCs) became involved, the realm of machine-to-machine (M2M) communication appeared, where a local PLC could react to data from a remote machine to improve control and safety.

So, if machine condition monitoring engineers have been active in M2M for years, why would they care about the buzz around IoT? The reason is that we are all going to benefit from the huge amount of development being done for that consumer-based market. Remember how the personal computer changed the way businesses operated? The same disruptive activities are at work here. Let’s explore some. 

Hardware enablers

Cell phones have caused a 42% increase in the number of U.S.-based cellular towers in the past five years, and Internet data traffic carried just on U.S.-based wireless networks (not including wired traffic) is around 1500 petabytes per year. Internet data consumption has grown at around 43% per year on average over the last five years, with video streaming consuming almost half of the entire data throughput in recent years. And microprocessor designs and miniature micro-electromechanical systems (MEMS) sensors originally developed for cell phones are being considered for industrial machine monitoring. This push is having an impact on traditional sensors and controllers. Companies are seriously considering making their machines "smarter" because the cost of doing so is dropping as the infrastructure becomes ubiquitous. Even companies that might not see an immediate benefit are joining the Smart Machine revolution because they are worried about being left behind.

Some examples of the infrastructure being developed are IMI’s accelerometer in a TO-8 package, wireless capability for traditional accelerometers from Lord MicroStrain, the Imp module from Electric Imp for integrating Wi-Fi and the cloud into products, powerful and fast FPGA-enhanced sbRIO controllers from National Instruments for complex machine monitoring and control, the Zynq-based MicroZed that combines ARM processor technology with FPGA capabilities, and the low-power Waspmote from Libelium with eight options for integrated wireless technologies.

Interestingly, cellular providers, such as AT&T and Verizon, have developed business units around M2M, so that data produced by measurement hardware can be connected to the world. Clearly, they see opportunities to increase business by carrying the huge amount of data being produced by machines. And machines can produce enormous amounts of data. While not typical, CERN’s Large Hadron Collider is illustrative of the possibilities: that one "machine" has collected over 100 petabytes of data in the past four years.

Software enablers

The data generated by all the machine monitoring devices and sensors is the industrial version of the big data topic widely discussed in the past few years.

Big data is not new to engineers. For example, engineers have been performing condition monitoring on machines for decades; traditional SCADA historians are the original big data management tools.

Such historians are still widely used, but their tag-based data model can be restrictive in some cases and, in my opinion, in a growing number of them. Certainly, the ability to graph and analyze one or more tag values as a function of time yields huge benefits. Data from transient events can be monitored and analyzed to understand how to improve the reliability of a system. Long-term trends help predict future problems and allow proactive scheduled maintenance. And, of course, traditional process control monitoring and tuning benefit from data collected in SCADA systems.
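To make the tag-based model concrete, here is a minimal sketch (in Python, using synthetic data and a hypothetical tag name, not the format of any particular historian product) of the basic idea: one scalar value per tag per timestamp, with a trend being just a time-ordered scan of those values.

    from datetime import datetime, timedelta

    # Each historian sample is one scalar value for one named tag at one
    # timestamp. The tag name and values below are made up for illustration.
    start = datetime(2014, 3, 1)
    samples = [
        {"tag": "PUMP7.VIB_RMS",
         "timestamp": start + timedelta(hours=h),
         "value": round(0.12 + 0.01 * h, 3)}
        for h in range(24)
    ]

    # Trending a tag over a time window is a filtered, time-ordered scan.
    window = [s for s in samples
              if s["tag"] == "PUMP7.VIB_RMS"
              and start <= s["timestamp"] < start + timedelta(hours=12)]
    for s in window:
        print(s["timestamp"].isoformat(), s["value"])

A full vibration waveform, with thousands of samples per capture, does not fit naturally into this one-scalar-per-timestamp layout, which leads to the problem described next.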

However, the ability to consume and use multiple data sources from disparate sensor types and machines has been a difficult problem for decades. Traditional SQL-based databases have yielded custom solutions that have worked well for specific industries. For example, tools used for condition monitoring on rotating equipment have traditionally pushed RMS (root-mean-square) vibration levels into SCADA systems, but the actual vibration waveforms were lost.

These waveforms contain valuable information for diagnosing heightened vibration levels. (Is it a bearing? Is the fault on the inner or outer race?) Consequently, custom applications were developed to merge these vibration waveforms with SCADA historical data to allow vibration engineers to identify quickly that an issue is occurring, perhaps through alarm levels on the SCADA data, and then locate the archived vibration data for further detailed analysis.
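As a rough illustration of why the waveform itself matters, the sketch below (Python, with a purely synthetic signal, an assumed sample rate, and a made-up fault frequency) computes the single RMS value a historian would typically keep and then the spectrum that actually shows where the vibration energy lies.

    import numpy as np

    fs = 10_000                      # assumed sample rate, Hz
    t = np.arange(0, 1.0, 1.0 / fs)  # one second of data

    # Synthetic waveform: shaft rotation at 30 Hz plus a small component at a
    # hypothetical outer-race fault frequency of 107 Hz, plus noise.
    signal = (1.0 * np.sin(2 * np.pi * 30.0 * t)
              + 0.2 * np.sin(2 * np.pi * 107.0 * t)
              + 0.05 * np.random.randn(t.size))

    # The one scalar a historian usually keeps:
    rms = np.sqrt(np.mean(signal ** 2))
    print(f"RMS level: {rms:.3f}")

    # The spectrum of the full waveform reveals the individual frequencies.
    spectrum = np.abs(np.fft.rfft(signal)) / t.size
    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    for f, a in zip(freqs, spectrum):
        if a > 0.05:
            print(f"peak at {f:.1f} Hz, amplitude {a:.3f}")

The actual fault frequencies of interest depend on bearing geometry and shaft speed; the point is simply that they can be recovered only from the stored waveform, not from the RMS scalar alone.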

Rather than building custom applications, companies have begun to leverage emerging technology to help simplify the aggregation of multiple types of data, such as vibration waveforms and tag-based scalars. Some of that technology has been leveraged from Web tools, such as RDF and other NoSQL databases, and some from the enhanced interconnectivity available between computers that simplifies the creation of distributed systems.
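As a minimal sketch of what that aggregation can look like, the document-style (NoSQL) record below keeps a vibration waveform alongside the scalar values traditionally sent to the historian. The field names, asset hierarchy, and values are illustrative assumptions only, not the schema of any particular product.

    import json
    from datetime import datetime, timezone

    # One capture, stored as a single nested document rather than as rows in
    # several relational tables. All names and values are hypothetical.
    capture = {
        "asset": {"site": "Plant-3", "machine": "Pump-7", "bearing": "DE"},
        "timestamp": datetime(2014, 3, 1, 10, 15, tzinfo=timezone.utc).isoformat(),
        "scalars": {"vib_rms_g": 0.31, "speed_rpm": 1780, "temp_c": 64.2},
        "waveform": {
            "sample_rate_hz": 10_000,
            "units": "g",
            "samples": [0.02, -0.05, 0.11, -0.08],  # truncated for illustration
        },
        # Links to related records can be added without a schema change.
        "links": {"work_order": "WO-10482", "previous_capture": "cap-2014-02-28"},
    }

    # A document database would store this structure directly; here it is JSON.
    print(json.dumps(capture, indent=2))

Because the record is a single nested document, the waveform, the scalar tags, and the links to related records travel together instead of living in separate systems.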

Unstructured advantages

All these tools can handle unstructured data, as opposed to the tabular structure imposed by relational databases. Not only do traditional relational databases complicate the creation of linkages between disparate data, they also restrict data types to a finite set (unless you count BLOBs) and are not flexible when changes are required to the schema.

The newer unstructured databases are no longer restricted to certain types of data. Furthermore, this lack of structure allows arbitrary connections between data entities and adaptability when new entities are introduced or old ones are removed.
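The sketch below illustrates that flexibility with RDF-style triples of the form (subject, predicate, object); the entity and predicate names are hypothetical.

    # A tiny in-memory triple store. Adding a new kind of entity or a new
    # relationship later is just another triple; no tables or schemas change.
    triples = {
        ("pump-7", "locatedAt", "plant-3"),
        ("pump-7", "hasBearing", "bearing-DE"),
        ("bearing-DE", "hasCapture", "cap-2014-03-01"),
        ("cap-2014-03-01", "exceededAlarm", "vib-rms-high"),
    }
    triples.add(("pump-7", "maintainedUnder", "WO-10482"))

    # Query: which captures are linked to machines located at plant-3?
    machines = {s for (s, p, o) in triples if p == "locatedAt" and o == "plant-3"}
    bearings = {o for (s, p, o) in triples if p == "hasBearing" and s in machines}
    captures = {o for (s, p, o) in triples if p == "hasCapture" and s in bearings}
    print(captures)

Production NoSQL and RDF stores provide this same kind of arbitrary linkage at scale, with indexing and query languages layered on top.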

Some worthy examples of tools to manage arbitrary data sources are VantagePoint, which is part of Rockwell Automation FactoryTalk; GE’s Industrial Internet tools; Digi International’s cloud-based solutions; the Aperio database framework from Viewpoint Systems for making connections between arbitrary data; and the ThingWorx platform for visualizing and connecting arbitrary data sources. Even low-level tools, such as NoSQL databases, are now widely used.

Deployment and configuration

An often overlooked aspect of monitoring multiple assets is the need to manage the suites of devices performing the data collection. A typical scenario has perhaps hundreds to thousands of devices deployed in the field across multiple geographic locations. Each device has certain channel configurations (sensor types and counts) and applications (software running on the device controller performing the analysis and managing the local data storage). Over time, changes are made to these devices. Perhaps a new version of the application software is needed for some new analysis or a new sensor has been included.

Ideally, deploying the new application or setting up the new channel configuration can be done remotely. The large number of devices calls for tools to help automate deployment. Automation helps ensure that deployments succeed and helps manage the available configurations with each rollout. This automation is still an emerging topic that will include device-type-dependent tools and some generic configuration management tools.
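A minimal sketch of the desired-state comparison at the heart of such automation follows; the device records, version numbers, and channel counts are hypothetical.

    # Compare each device's reported state against the desired state and list
    # the work needed to bring it into line. All records here are made up.
    desired = {"app_version": "2.4.0", "channels": {"accel": 4, "temp": 2}}

    devices = [
        {"id": "site1-dev-001", "app_version": "2.4.0", "channels": {"accel": 4, "temp": 2}},
        {"id": "site1-dev-002", "app_version": "2.3.1", "channels": {"accel": 4, "temp": 2}},
        {"id": "site2-dev-014", "app_version": "2.4.0", "channels": {"accel": 2, "temp": 2}},
    ]

    for dev in devices:
        actions = []
        if dev["app_version"] != desired["app_version"]:
            actions.append(f"deploy app {desired['app_version']}")
        if dev["channels"] != desired["channels"]:
            actions.append("push channel configuration")
        if actions:
            # In practice this would call the device's remote-management interface.
            print(dev["id"], "->", "; ".join(actions))

In a real deployment the listed actions would be carried out through whatever remote-management interface the devices expose, with results logged so the fleet’s configuration state is always known.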

Commercial tech helps industry

By leveraging the technological developments arising from the consumer world, the industrial world is realizing huge gains in infrastructure and product platforms for monitoring remote machines. Engineers have more tools than ever to collect and analyze data from those machines and to make informed decisions about asset utilization and predictive maintenance.

– Jim Campbell is president of Viewpoint Systems Inc., a Control System Integrators Association member based in Rochester, N.Y. Edited by Mark T. Hoske, content manager, CFE Media, Control Engineering, mhoske@cfemedia.com.

ONLINE 


www.viewpointusa.com 

www.controlsys.org 

Key concepts

  • Focus on M2M, remote monitoring, data visualization, and management systems.
  • Why use these rather than traditional SCADA historians for asset monitoring.