Cloud Instrumentation: Data without infrastructure

Will instrumentation and other devices in your plant be communicating via the cloud rather than your own networks? While it may not happen tomorrow, the technology is advancing and may be closer than you think. What is this cloud, and what can it do for you?

03/15/2011


The development of cloud instrumentation that communicates via the Internet is a major technological advance, but one that is difficult to comprehend fully without context. To establish that context, a short bit of history helps explain why cloud instrumentation is so significant.

The earliest electronic instruments, let us call them traditional instruments, typically used a standalone or box format. Users connected sensors directly to the box, which contained the measurement circuitry and displayed the results, initially on analog meters and later on digital displays (Figure 1).

Traditional instruments, then and now, have a fixed hardware and software architecture, with most if not all of the functionality defined inside the box by the vendor. Users are often limited to performing custom analysis offline on the full data sets. Also, there is little or no flexibility to replace any of the instrument components with off-the-shelf parts.

In many cases, test engineers wanted to have instruments communicate with each other, for instance in a stimulus/response experiment, where a signal generator instructs a digitizer when to start taking samples. This was initially done with serial links, but in the 1970s the Hewlett Packard Interface Bus, which evolved into today's IEEE-488 interface, became extremely popular for connecting instruments.

The next major breakthrough in measurement technology came with the availability of desktop computers, which made it more cost-effective to run test programs, control instruments, collect data, and let test engineers process and display data. Plug-in IEEE-488 boards allowed minicomputers and later PCs to perform these tasks.

Today such interface cards are often not needed thanks to instruments that communicate with PCs directly via USB or Ethernet, and more recently over wireless Ethernet schemes.

When the prices of PCs, motherboards, and embedded PCs dropped significantly, the next logical step was to combine the instrument and the PC into one box creating the PC-based instrument. PC-based measurements can be done in three forms:

  • Traditional standalone instruments such as oscilloscopes or logic analyzers incorporate full PC functionality so users can perform analysis using familiar tools such as spreadsheets or popular packages such as Matlab;
  • The instrument moves into the PC itself. Manufacturers developed plug-in instrument cards, initially on the PC-XT and PC-AT bus and today for variations of the PCI and PCI-E bus; and
  • When expansion slots disappeared from most PCs, engineers who wanted to work with instrument cards migrated to chassis schemes where one or two slots would hold a PC motherboard in an industrial card format and the remaining slots would hold data acquisition cards.

In PC-based instrumentation, the application software runs in a local PC, which becomes the instrument. Data from sensors and signals moves into the PC over various communication buses, where the analysis, presentation, and communication functions required by the instrument are applied. This measurement method works very well for automated production testing on a factory line, or for experiments that take place in laboratories.

The hardware and software components of a PC-based instrument are flexible; you can interchange them inside the same PC based on the application’s requirements. But the PC-based instrument is not software scalable, because every new instrument requires another computer, and another license of the same software package has to be purchased and installed. This is the Microsoft model: every new PC requires a new operating system license and, along with it, new licenses for all the software that runs on that OS on that computer. The PC-based instrument therefore remains the best tool you can buy for creating test fixtures and benchtop instruments for local automated testing, but it has the worst financial scalability when the setup must be replicated elsewhere.

Distributed and remote measurements are done using wireless modules or nodes. These modules may use Bluetooth, ZigBee, Wi-Fi, or proprietary serial communication protocols, with firmware that may support hopping, meshing, TCP/IP, serial bit banging, and so on. When installing distributed and remote measurement nodes, data may be transferred to a PC running the application software; this approach is characterized as a PC-based data acquisition system. But if the wireless modules support it, digitized data can also be sent straight to the Internet via off-the-shelf access points, rather than to a PC or any box instrument. In this case you are no longer inside the two classical measurement systems discussed so far, box or PC-based instrument, but have entered the cloud instrument model.

This very simple decision – to send digitized data straight into the Internet – is a key milestone, and if it is adopted, then we are about to enter a new era in measurement technology. In this new era, not only is the wireless hardware freed from the confines of cabling, the software is no longer relegated to a specific box or PC. Instead, measurement front ends connect to the Internet and a Web page or a Web widget becomes the instrument, which users can access from anything that can surf the Web, including mobile devices such as smart phones. At present there is little or no use for raw sensor data being available on the Internet. But with Web-enabled applications starting to emerge, the instrumentation cloud scheme presents an enormous opportunity. A cloud instrument will route digitized sensor data packets to dedicated computers located anywhere on the Internet, and they will either use local software or run Web-based programs for computation, simulation, modeling, analysis, and presentation. The following diagram graphically illustrates this concept.

The cloud instrument

As we try to project how future measurement technologies will look, we must use our intuition as we study the patterns that have enabled the tremendous growth and scale of Web applications in recent years. Within the cloud computing space, the platform companies that are doing well are focusing on new applications in the cloud as opposed to moving existing applications. This applies to test and measurement as well. Greater adoption of consumer-facing applications like environmental monitoring, electricity use, building automation, grid applications, biomedical monitoring, tracking the spread of diseases and viruses, and seismic, hurricane, and tornado monitoring will push forward measurement use cases where the things that need to be monitored are very dense, widespread, and Web-enabled. For these applications to be possible, we need two critical things:

First, sensors and the hardware measurement component must be designed into the fabric of things. Second, the software component must be pushed from local PCs to the cloud for much more processing capability, speed, visibility, and reach. We are currently in the visionary phase of this technology. At this point, cloud instruments are working prototypes of the concept.

A sensor becomes a cloud instrument when it is connected to a wireless tag such as a Wi-Fi tag. The tag digitizes the data and sends it on to an access point, where the data is routed to the Internet and to a server IP. There, a customized engine collects the data and feeds it into applications such as metering, charting, control, analysis, modeling, and data mining, with display on Web page and Web widget instruments.

In a cloud instrument, one or several sensors are connected to a wireless tag device such as the Tag4M Wi-Fi tag. The digitizer radio chip required must be very accurate, very small, and very low-power, and it must be network enabled, with the ability to talk to existing infrastructure. The Tag4M Wi-Fi tag, seen in the picture below, is a prototype that fits this concept.

Such a Wi-Fi tag is a small, battery-powered 802.11b/g device that allows connection of up to five analog and four digital sensors. (Download a datasheet at http://www.tag4m.com/) The tag digitizes the sensor signals and sends data to the nearest Wi-Fi access point, which routes the digitized data to the Internet and to a server IP running a tag engine application named Web Page Instrument.
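As a rough sketch of the server side of this flow, the Python snippet below shows a minimal tag engine that listens for incoming packets and decodes a hypothetical JSON payload. The port number, field names, and packet format are assumptions made for illustration; they are not Tag4M's documented protocol.

    # Minimal "tag engine" receiver sketch. Assumes a hypothetical JSON-over-UDP
    # payload such as: {"mac": "...2886", "battery_v": 3.1, "rssi": -58, "ai0": 0.712}
    import json
    import socket

    UDP_PORT = 9750  # assumed port for incoming tag packets

    def run_tag_engine() -> None:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", UDP_PORT))
        print(f"Tag engine listening on UDP port {UDP_PORT}")
        while True:
            data, addr = sock.recvfrom(1024)           # one reading per datagram
            packet = json.loads(data.decode("utf-8"))  # decode the tag's payload
            print(f"{addr[0]}  tag {packet['mac']}  AI-0 = {packet['ai0']:.3f} V  "
                  f"battery = {packet['battery_v']:.2f} V  RSSI = {packet['rssi']} dBm")

    if __name__ == "__main__":
        run_tag_engine()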

When the tag is powered, it automatically shows up as a Web page instrument. (See an example at http://demo.tag4m.com) The Web page instrument's standard tag display shows the tag MAC address, measurement channels, battery voltage, RSSI, and tag sleep time. The Web page instrument lists all the tags that are live and associated with access points anywhere on the Internet.

Let us assume that the measurement application is to read temperature from a thermistor connected to channel AI-0 of a Tag4M Wi-Fi tag with a MAC address ending in 2886.

This tag can easily be configured for channel AI-0 measurement by clicking on the tag MAC address, which opens the tag configuration panel (Figure 9).

The Web page instrument is now reading digitized voltage values from channel AI-0, which is populated with a thermistor (Figure 10). The application may run in the cloud or on a private server, and the tag and sensor can be located anywhere, as long as the Web page instrument can see the tag. The voltage read from channel AI-0 can then be converted into the temperature of the thermistor.
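As a sketch of that conversion step, assuming the thermistor forms the lower leg of a simple voltage divider against a known excitation voltage and using nominal beta-equation parameters for a 10 kΩ thermistor (the divider wiring, excitation voltage, and beta constant are assumptions, not the tag's documented circuit):

    import math

    # Assumed circuit: V_REF -- R_FIXED -- (AI-0 node) -- thermistor -- GND
    V_REF = 3.0         # excitation voltage in volts (assumed)
    R_FIXED = 10_000.0  # series resistor in ohms (assumed)
    R_25 = 10_000.0     # thermistor resistance at 25 deg C (10 kOhm nominal)
    BETA = 3892.0       # beta constant in kelvin (typical value; check the datasheet)

    def voltage_to_celsius(v_ai0: float) -> float:
        """Convert a digitized AI-0 voltage into temperature via the beta equation."""
        r_therm = R_FIXED * v_ai0 / (V_REF - v_ai0)              # divider -> resistance
        inv_t = 1.0 / 298.15 + math.log(r_therm / R_25) / BETA   # beta equation
        return 1.0 / inv_t - 273.15                              # kelvin -> Celsius

    print(voltage_to_celsius(1.5))  # about 25 deg C when the divider is balanced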

We could also build a totally new Web page that presents tag measurements the way we want, or build another Web application that pulls the data it needs from the Web page instrument and presents it with scaled measurements, analysis, charts, and so on. There is a lot of opportunity for systems integration at the software level of a cloud instrument. Companies like Pachube offer services ranging from hosting of the Web page instrument to specialized applications.

Tag4M provides tools for companies to implement cloud instruments. We neither host nor develop custom applications, but rather offer the tools you need – Wi-Fi tags, Web page instruments with source code, and widget instruments – to develop applications. Widget instruments are small, Web-based measurement instruments that can be installed and executed within a Web page or media site. A widget instrument receives feeds of specific digitized sensor data from one or several tags shown in a Web page instrument and either displays the data as is or processes it before display. The unique features of a widget instrument are its small size and its ability to be embedded in a Web page, media site, or smartphone user interface.

The thermistor instrument can be customized to show numbers and/or images, as in the live widget example.

Generally, cloud computing customers do not own the physical infrastructure; they avoid capital expenditures by renting usage from a third-party provider. They consume resources as a service and pay only for the resources they use. If we apply this model to the thermistor application, the cloud instrument customer who needs a thermistor instrument to embed into his or her Web or media site only has to purchase the tag and connect channel AI-0 of the tag to a 10K3A1A thermistor. This is the hardware component of the application. Beyond that, the customer does not need to own or know anything about the software component. The cloud instrument service provider hosts the Web page instrument and provides the user with the code for embedding the widget instrument into a Web or media site.

An example of such a site can be seen at http://sensedin.blogspot.com

With a cloud instrument, the hardware is unchained because measurement front ends are no longer tied to a particular box, instrumentation network, or PC. A Wi-Fi sensor tag connects to any access point or router within range. Even more significant, the software is unchained because measurement data is no longer tied to a particular data-acquisition or analysis program on a specific computer. Instead of a sensor tag sending data to a device driver that feeds readings into local software on a PC, it sends the data to an access point for routing to an Internet IP address of a server located on a company network or in the cloud. A user-controlled Web-based application server, or Web page instrument, makes digitized tag data available for processing via widget instruments.

A federation of sensors connected to Web page instruments running on a wide range of devices can provide any number of services such as:

  • Anything from simple data presentation to an alarming service that sends a text message or tweet, or even a correcting signal back to the sensor tag (see the sketch after this list);
  • A metering service that calculates consumption and costs;
  • A database service that stores data; and
  • Machine health monitoring, home-based medical supervision and many others in what will become an enormous application space.
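As a minimal illustration of the alarming service in the first item, the sketch below polls a hypothetical data feed exposed by a Web page instrument and raises an alert when a reading crosses a threshold. The feed URL, JSON field name, and threshold are assumptions for illustration; a real service would send a text message, tweet, or correcting command instead of printing.

    import json
    import time
    import urllib.request

    FEED_URL = "http://example.com/tags/2886/ai0"  # hypothetical Web page instrument feed
    THRESHOLD_C = 30.0                             # alarm when temperature exceeds this

    def latest_reading() -> float:
        with urllib.request.urlopen(FEED_URL) as resp:
            return json.loads(resp.read())["temperature_c"]  # assumed field name

    def alarm_loop(poll_seconds: int = 60) -> None:
        while True:
            temp = latest_reading()
            if temp > THRESHOLD_C:
                # A production service would send a text, tweet, or a correcting
                # signal back to the tag here; a print statement stands in for that.
                print(f"ALERT: {temp:.1f} deg C exceeds {THRESHOLD_C} deg C")
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        alarm_loop()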

What is even more important than the diversity of applications is that with processing taking place in the cloud, information from all these applications can be shared, data-mined, and refined using behavioral algorithms to detect patterns that can help us make predictions and define strategies for a better life together.

Access points will play a much larger role, not only for routing sensor data but also for hosting and routing application code. Innovation will also take place in the domain of Web page instruments thanks to tools such as Web page instrument builders and updaters. Cluster Web pages can include widgets for portable Wi-Fi platforms, running wave-type applications that move data from page to page to follow the progress of a physical process. We are working our way towards an instrumentation cloud where not only the sensors but also the logging, analysis and control programs, and widget instrument deployment and display can be anywhere you want them. The question is how soon will it all happen? Let us look at the current business environment for cloud instrumentation.

The future of the cloud

We are currently facing the same business path, or cycle, that surrounded the introduction and adoption of PC-based instrumentation. Back when PC-based instrumentation was invented, it was very hard to explain to ordinary people why the approach would be beneficial. How could a standard personal computer be a better instrument than a (real) traditional instrument? The answer was not obvious at the time because there were no projects and no software applications written for the Mac or PC to do measurements. People were looking for use cases that might benefit from instruments with a larger display, more storage capability, and more flexible hardware. Brute number crunching was also attractive, offering the potential for precise analog-to-digital converters moving from 14-bit to 16-bit to 24-bit at high scan rates.

But history shows that they were looking in the wrong places. The drivers of the PC revolution—data manipulation and presentation, what can be done with acquired data, and how fast—these proved to be much more important than converter accuracy and bits of precision. Smarter technology (and not necessarily better) eventually created a new multi-billion dollar market for PC-based instrumentation, named automated test and measurement.

This is happening all over again, and we are at the beginning of a similar cycle today. Companies that are in the new business of cloud Web services today tend to concentrate on enterprise administration, offering services such as high-performance data transport technology to, from, and between remote cloud infrastructures, IT financial management solutions, automation of IT management functions, database security, risk and compliance (SRC) solutions for the enterprise, mobile workforce scheduling, real-time route planning, and enterprise-class optimization. These companies do not intentionally ignore applications that need measurements, but rather adapt to where the market takes them, and this is obviously the corporate world, banking, IT, personnel administration, and so forth.

There are definitely projects, or rather problems, that cloud instruments can help solve at a global level. The following description, from IBM's call to action "A mandate for change is a mandate for smart," is relevant to the subject:

“The problems of global climate change and energy, global supply chains for food and medicine, new security concerns ranging from identity theft to terrorism — all issues of a hyper connected world — have surfaced since the start of this decade. The world continues to get “smaller” and “flatter.” But we see that being connected isn’t enough. Fortunately, something else is happening that holds new potential: the planet is becoming smarter. That is, intelligence is being infused into the way the world literally works — into the systems, processes and infrastructure that enable physical goods to be developed, manufactured, bought and sold. That allow services to be delivered. That facilitate the movement of everything from money and oil to water and electrons. And that help billions of people work and live. How is this possible? First, the world is becoming instrumented. Imagine a billion transistors for every human being. We’re almost there. Sensors are being embedded everywhere: in cars, appliances, cameras, roads, pipelines…even in medicine and livestock. Second, our world is becoming interconnected. Soon, there will be two billion people on the Internet — but systems and objects can now “speak” to each other, as well. Think of a trillion connected and intelligent things, and the oceans of data they will produce. Third, all of those instrumented and interconnected things are becoming intelligent. They are being linked to powerful new backend systems that can process all that data, and to advanced analytics capable of turning it into real insight, in real time. With computational power now being put into things we wouldn’t recognize at first sight as computers, any person, any object, any process or service and any organization — large or small — can become digitally aware, connected and smart. With so much technology and networking available at such low cost, what wouldn’t you enhance? What wouldn’t you connect? What information wouldn’t you mine for insight? What service wouldn’t you provide a customer, a citizen, a student or a patient? The answer is, you will do all these things — because you can and you must.”

The idea of the world becoming instrumented should not be focused only on sensor data. There is a larger virtual space, named the Internet of Things (IoT), that covers any device or object that can potentially access the Internet. Things have services, data, events, and much more to offer than a simple sensor does. In the Internet of Things space we will be able to identify not only an object but also its semantics – its functionality, its shape, and its type – in the context of pursued objectives and at a specific time. Cloud instrumentation will provide the sensor monitoring and measurement capabilities of this very wide and well-organized space.

Limits to cloud instrumentation

Cloud instruments ride two big technological waves or platforms: the Internet and cloud computing. When we ask about limitations in cloud instrumentation, we are in fact thinking about problems like data security, confidentiality, reliability, availability, and latency. This is appropriate, in that the Internet and cloud computing platforms currently have certain inadequacies that are transferred to the instrument. Along this line of thought, though, we should also note that these technologies are very much in their emerging stage right now, and therefore have tremendous potential for advancement in the years to come. Cloud instruments will continuously benefit from every technological improvement. We need to keep this in mind when looking at current limitations.

A wireless thermistor is not in itself a cloud instrument. But when you place it in a network with access to the Internet, that simple sensor becomes one. There are two segments of data communication in this setup: the wireless portion that covers communication between the digitizer radio chip and the access point, and the Internet segment. Our company’s position regarding limitations to cloud instrumentation is two-fold:

First, we, the cloud instrumentation community, have to take care of the wireless segment by setting standards that regulate the interfaces between the radio digitizer chip and the sensor/access point.

Second, we will ride the technology wave that will improve security, determinism, and latency over the Internet by using off-the-shelf protocols and standards.

At the hardware level we need to set open standards that regulate the interface between the radio digitizer chip and the sensor. These standards will open the field for manufacturers to create radio digitizer chips with Internet connectivity that are sensor interchangeable, sensor plug-and-play, and can associate with any off-the-shelf infrastructure access point.

At the firmware level we need to define how we read, digitize, packetize, and send sensor data over wireless in a way that maximizes data security, confidentiality, reliability, availability, and determinism.
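One way to picture the read, digitize, packetize, and send sequence is the sketch below, which packs a handful of readings into a compact binary frame before transmission. The frame layout (MAC address, battery level, RSSI, then one value per analog channel) is a hypothetical example for illustration, not the tag's actual firmware format.

    import struct

    def build_frame(mac: bytes, battery_mv: int, rssi_dbm: int, ai_counts: list) -> bytes:
        """Pack readings into a hypothetical frame: 6-byte MAC, battery (mV), RSSI,
        a channel count, then one unsigned 16-bit ADC value per analog channel."""
        header = struct.pack("!6sHbB", mac, battery_mv, rssi_dbm, len(ai_counts))
        body = struct.pack(f"!{len(ai_counts)}H", *ai_counts)
        return header + body

    frame = build_frame(b"\x00\x11\x22\x33\x28\x86", 3100, -58, [512, 498, 0, 0, 1023])
    print(len(frame), frame.hex())  # 20-byte frame ready for the radio to send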

The radio digitizer chip code includes an authentication security function, which can be used to perform the initial (open) steps of authentication and association between the chip and its access point. Depending on the security level chosen, these may be followed by one or more further authentication and/or key exchange transactions. The radio digitizer chip supports a range of security levels. As long as the application is in possession of the appropriate keys and/or credentials, the method to be used can be chosen at run time. In some security levels, certain key lengths are configurable. Longer keys usually lead to significant increases in processing, if only during the handshake stage. Before choosing a long key, we should consider the purpose of choosing that security level. In some cases it is to protect the data, in which case a longer key could be justified. In many cases, though, the reason for implementing security is purely to gain access to the network. In these cases, time and power savings can be made by reducing the level of security.

Availability, reliability, and latency of data at the wireless level are also controlled by code running in the digitizer radio chip. Each data or command frame is initially transmitted at a preferred default rate. If the initial transmission fails (i.e., no acknowledgement is received), a number of further attempts are made at this rate, according to a configuration variable. If this count is exhausted, the hardware attempts to transmit at a slower rate, and so on, down to the slowest rate setting. If the rate count is exceeded, a failure code is returned.
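The retry-and-fallback behavior described above can be sketched as follows; the rate table, retry count, and send/acknowledge interface are placeholders rather than the chip's actual firmware API.

    RATES_MBPS = [54, 24, 11, 5.5, 2, 1]  # preferred rate first, slowest last (assumed table)
    RETRIES_PER_RATE = 3                  # the configuration variable in the description

    def transmit(frame: bytes, send_at_rate) -> bool:
        """Try the frame at each rate, a fixed number of times per rate.
        send_at_rate(frame, rate) must return True when an acknowledgement arrives."""
        for rate in RATES_MBPS:
            for _ in range(RETRIES_PER_RATE):
                if send_at_rate(frame, rate):
                    return True  # acknowledged
        return False             # all rates exhausted: report a failure code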

One very interesting application that may be well served by cloud instrumentation is the electric utility distribution grid. Many grids have lately been upgraded from what the industry knows as zonal to nodal (Figure 12). This means that generators, utilities, retail electric providers, and customers can now buy and sell power based on a much more specific region of the state or country and have costs (such as congestion charges) distributed much more fairly as well.

This transition has been made possible mainly by upgrading the power grid from analog to digital, but also by the use of cloud instrument digital meters to monitor low-voltage distribution lines.

A complete transition, when upgrading a power grid from analog to digital, requires digitization of the small, low-voltage distribution lines that feed each home and business. A key element is to replace the decades-old power meter, which relies on turning gears, with a cloud instrument digital meter that can track the current going into a building and also the current sent back out. The cloud instrument energy meter is connected to the grid’s main communication line and also to the Internet so it can send measurement data to a nodal cloud computing cluster for analysis of demand and price prediction. The use of cloud instrumentation allows utilities to much better assess how much power and reactive power is flowing from independent producers back into the grid. It also allows a utility to sense very local disturbances, which can provide an earlier warning of problems that may be mounting, thereby improving look-ahead simulation. And it will allow utilities to offer customers hour-by-hour rates, including incentives to run appliances and machines during off-peak times that might vary day to day, reducing demand spikes that can destabilize a grid. Unlike a regular meter, the cloud instrument can be used as a Web-based energy portal that would allow network intelligence to flow back and forth, with consumers responding to variations in pricing. The portal is a tool for moving beyond the commodity model of electricity delivery into a new era of energy services as diverse as those in today’s dynamic telecommunications market.
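As a rough sketch of the kind of computation such a meter feeds into the cloud, the example below nets hourly energy flows into and out of a building and prices them with hour-by-hour rates. The interval readings and tariff values are purely illustrative.

    # Hourly interval data from a cloud instrument energy meter (illustrative numbers):
    # kWh drawn from the grid, kWh exported back to it, and the rate for that hour.
    intervals = [
        {"hour": 13, "kwh_in": 2.4, "kwh_out": 1.1, "rate_per_kwh": 0.18},
        {"hour": 14, "kwh_in": 1.9, "kwh_out": 1.6, "rate_per_kwh": 0.22},
        {"hour": 15, "kwh_in": 2.8, "kwh_out": 0.4, "rate_per_kwh": 0.31},
    ]

    net_kwh = sum(i["kwh_in"] - i["kwh_out"] for i in intervals)
    cost = sum((i["kwh_in"] - i["kwh_out"]) * i["rate_per_kwh"] for i in intervals)
    print(f"Net consumption: {net_kwh:.2f} kWh, cost: ${cost:.2f}")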

Perhaps as an industrial user who works with process instrumentation in a typical plant environment, you’re asking yourself whether it’s conceivable that cloud instrumentation could someday be used for monitoring industrial processes and even controlling them, including critical or real-time ones. Well, consider this historical precedent. The IBM Personal Computer XT, released on March 8, 1983, came standard with 128 kB of memory, a 360 kB double-sided 5.25 in. full-height floppy disk drive, a 10 MB Seagate ST-412 hard drive with Xebec 1210 MFM controller, an asynchronous adapter (serial card with 8250 UART), and a 130 W power supply. The motherboard had eight 8-bit ISA expansion slots and an Intel 8088 microprocessor running at 4.77 MHz (with a socket for an 8087 math coprocessor). The operating system usually sold with it was PC-DOS 2.0 and above. Did those of us who were around at that time think that we would someday do process control, including critical or real-time control, with this machine? The answer is probably not. Technology is very hard to predict, but the history of technology repeats itself, so we believe that cloud instrumentation, riding the waves of the Internet and cloud computing, is in the best position to someday take on critical process control. Watch for major improvements in Internet determinism, security, and reliability in the years to come.

Joining art and technology

When looking at the widgets, temperature avatar, and digital sensor frame created by Tag4M, we realize that the cloud instrument concept brings closer together, perhaps more than ever before, two of humanity’s driving engines: art and technology. This makes perfect sense, because as we move through life and interconnect with objects we generate data, lots of data, which tells stories about our lives. We believe the art we create needs to capture these stories, just as instruments do, but using warmer and more human images to reflect our interaction with the world around us. This new type of art, which we have named Multi-Sensory Moving Art, will be dynamic and therefore able to tell a developing story continuously, just like an instrument, but on an artistic level. We foresee a very thin line between an instrument and a piece of avatar art, because both have measurement capability; the only difference between the two will be the degree of artistic expression used in the user interface.

The following is an example of a temperature instrument that is also a piece of art. This multi-sensory moving artwork, titled Life & Death, is also a thermometer showing the temperature at the artist’s location, or perhaps a collector’s location, a gallery, etc. The art images change in relation to the temperature readings. This is the equivalent of the traditional graph or chart displayed by measurement instruments. Instead of images of art, the container could also display images of the object whose temperature is monitored by the tag.

Marius Ghercioiu is president of Tag4M at Cores Electronic, LLC, in Austin, TX. Reach him at info(at)tag4m.com. Tag4M is working its way towards an instrumentation cloud where not only the sensors but also the logging, analysis, control programs, sensor widget deployment, and display can be anywhere.

www.tag4m.com


