
Digitizing analog sensor data for the IoT


Feature articles | By eeNews Europe



According to The Guardian, the term Internet of Things was coined by Kevin Ashton in 1999, a bit of anecdotal history that Cisco seems to confirm.

A few hours of research turned up a reference from Auto-ID Labs using the term Internet of Things in 2001. Given that the Auto-ID Labs' predecessor, the Auto-ID Center, was formed in 1999 (see this), we can trace the IoT back at least 14 years, and maybe 16. Fifteen-ish years may qualify the IoT as having one of the longest hype cycles ever, since it only reached the zenith of Gartner's Hype Cycle last year.

So much for ancient history—the IoT is nearly passé, and now we are in the hype cycle of the Internet of Everything (IoE) and the Industrial Internet of Things (IIoT). I thought Cisco had coined IoE, but according to a date-windowed Google search, the domain internetofeverything.io has existed since as early as 2008 (note: running a Whois did not reveal the creation date), though the earliest reference I could find was an Economist article from 2010. The earliest Cisco reference I could find for IoE dates from 2012.

The Industrial Internet of Things seems to have been around for at least as long; I found an obscure International Sociological Association conference agenda from 2009 that includes a presentation titled "Technology and Networked Memory: Toward an Internet of Old Things." The description of that presentation uses the term Industrial Internet of Things.

You are probably asking by now, where is this going?



I’ve mentioned in other blogs that the IoT means basically every node is a sensor, and sensors are inherently analog. This has led silicon vendors like STMicroelectronics and Texas Instruments to provide analog front ends (AFEs) and other building blocks to help get real, analog data into the digital domain. Unfortunately, that is only part of the problem—we still need to get the sensor data, even in digital form, to a monitoring system. That is where the IIoT comes in, and in particular wireless sensor networks (WSNs) and smart sensors in industrial settings. While it is great for consumers to control their lights or thermostats from their phones, industrial installations involve far higher numbers of nodes per site and require reliable yet efficient communication methods.

Over the last few years I have been involved in some real-world IIoT application developments and became aware of an interesting protocol that has gained wide use for transporting digital data from sensor nodes to higher-level application layers. MQTT is a protocol invented by Andy Stanford-Clark of IBM (together with Arlen Nipper), specifically for constrained devices and low-bandwidth, high-latency, or unreliable networks.

In layman’s terms, the latter is a good description of much of the IoT. OASIS has many online resources, including open-source code for MQTT and a nice overview presentation. Of particular interest is the power consumed per "piece of information" by MQTT vs. HTTP; this article presents data indicating that MQTT has advantages in many applications, especially short-range applications using WiFi instead of 3G.

While MQTT 3.1.1 is an OASIS Standard, a variant of MQTT, MQTT-SN, has been developed by Stanford-Clark and colleagues at IBM specifically for sensor networks. For my purposes here I don’t need to dig into the differences, but MQTT-SN is worth looking at if you plan to be in the smart sensor business.

Brilliantly or ironically, depending on your point of view, MQTT is a "pub/sub" (publish/subscribe) protocol that works well with one-to-many and many-to-one architectures, without endpoints needing to address each other directly. Also important is that MQTT is intended to support asynchronous data—consider a smart sensor that, instead of sending a temperature reading every minute, publishes temperature every hour and otherwise reports only when the change exceeds some threshold. Such a scheme means you don’t know when the next update may occur, but it lowers the power requirements considerably.
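As a rough illustration of that threshold-plus-heartbeat reporting scheme, here is a minimal sketch in Python using the open-source Eclipse Paho MQTT client (1.x API). The broker address, topic name, threshold, and read_temperature() function are all hypothetical placeholders, not details from the article.

```python
# Sketch: asynchronous, threshold-based publishing over MQTT.
# Assumes the Eclipse Paho client (pip install paho-mqtt, 1.x API);
# broker address, topic, and read_temperature() are placeholders.
import time
import paho.mqtt.client as mqtt

BROKER = "gateway.local"                       # hypothetical gateway/broker
TOPIC = "factory/line1/node42/temperature"     # hypothetical topic layout
THRESHOLD_C = 0.5                              # report only changes larger than this
HEARTBEAT_S = 3600                             # but publish at least once per hour

def read_temperature():
    """Placeholder for the actual sensor/AFE read-out."""
    return 21.7

client = mqtt.Client()
client.connect(BROKER, 1883, keepalive=60)
client.loop_start()

last_value = None
last_time = 0.0
while True:
    temp = read_temperature()
    now = time.time()
    changed = last_value is None or abs(temp - last_value) > THRESHOLD_C
    if changed or (now - last_time) > HEARTBEAT_S:
        client.publish(TOPIC, payload=f"{temp:.2f}", qos=1)
        last_value, last_time = temp, now
    time.sleep(60)  # sample every minute, publish only when needed
```

The radio (and with it most of the power budget) is exercised only when something worth reporting has happened, which is the point of the asynchronous model.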



A common real-world implementation would use an IEEE 802.15.4 wireless protocol to connect a whole factory of sensor nodes to a gateway. The nodes are identified by simple IDs, but more importantly they publish on various "topics". Topics are defined by the implementation, for example "temperature" or "battery status". The gateway subscribes to those topics and, after registering the nodes, simply "listens" for updates. Higher application layers then decide what to do with the messages, which they access via APIs (application programming interfaces) exposed by the gateway.
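A gateway-side subscriber along these lines might look like the sketch below, again using the Eclipse Paho Python client (1.x API). The topic hierarchy and broker address are assumptions for illustration only; a real gateway would hand the messages to its registration and API layers rather than print them.

```python
# Sketch: a gateway subscriber that "listens" for node updates by topic.
# Topic layout and broker address are assumptions, not from the article.
import paho.mqtt.client as mqtt

BROKER = "localhost"   # the broker could run on the gateway itself

def on_connect(client, userdata, flags, rc):
    # Subscribe to every node's temperature and battery-status topics.
    client.subscribe("factory/+/+/temperature", qos=1)
    client.subscribe("factory/+/+/battery", qos=1)

def on_message(client, userdata, msg):
    # Higher application layers would pick these up via the gateway's APIs;
    # here we just show the node (encoded in the topic) and its value.
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()
```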

The gateway (which could be a virtual device running in a cloud) could be designed with APIs that are internet-accessible. Those APIs can also be kept entirely inside a corporate firewall, so no additional cyber-risk need be incurred if that is a concern. Figure 1 is an example of combining REST (Representational State Transfer) APIs with an MQTT broker, used with permission of the author, Michael Koster of ARM. REST is relatively new to a lot of people and can be confusing, but you can get an introduction in this article from IBM.

Figure 1: The MQTT nodes publish updates that are routed by a broker; endpoints wishing to access information subscribe to topics over the API. In this example, applications can also use a REST interface to the broker. Used with permission of Michael Koster.

Conceptually, the architecture in Figure 1 allows, say, sensor-node data to be used by multiple applications and multiple sessions. To put that most simply, there could be a web page showing the temperature of all the nodes in a system. As many users as needed could browse to that page and view the data (with authentication if desired). A second application, say a desktop widget, could alert when certain nodes exceed a defined temperature range. Any number of users could run that widget on their desktops simultaneously.
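To make the idea concrete, here is a minimal sketch of that pattern: an MQTT subscriber caches the latest readings and a small REST endpoint serves them to any number of clients. Flask is used purely for illustration; the topic filter, broker address, and URL path are hypothetical, and this is not Koster's implementation.

```python
# Sketch of the Figure 1 idea: MQTT in, REST out. Many web or desktop
# clients can poll the REST endpoint while the broker handles pub/sub.
# Broker address, topic filter, and route are assumptions for illustration.
import threading
import paho.mqtt.client as mqtt
from flask import Flask, jsonify

latest = {}            # topic -> most recent payload seen
app = Flask(__name__)

def on_message(client, userdata, msg):
    latest[msg.topic] = msg.payload.decode()

def run_mqtt():
    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883, keepalive=60)
    client.subscribe("factory/#", qos=1)
    client.loop_forever()

@app.route("/api/readings")
def readings():
    # Any number of users or applications can read this concurrently.
    return jsonify(latest)

if __name__ == "__main__":
    threading.Thread(target=run_mqtt, daemon=True).start()
    app.run(port=8080)
```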

So where does this leave us? First, from the supplier point of view, I expect to see growth in smart sensor packages combining various types of sensors, AFEs, microcontrollers or microprocessors, a wireless chipset, and a protocol like MQTT in one package. From the user standpoint, once the network is in place, adding more nodes is as simple as activating a smart sensor and having it self-register with the broker (in the gateway). Approaches like these will drive higher use of sensors, and correspondingly more demand for analog silicon for signal conditioning and other aspects of the sensor hardware layer.

Blaine Bateman is president of EAF LLC, a consultancy in strategy, market analysis, technology due diligence, and related areas, and has over 20 years of experience. Prior to forming EAF, Blaine was VP of Strategic Markets, VP of Strategic Business Dev., and Global VP of Marketing with Laird Technologies. He has experience in electronics, automotive, wireless, instruments, and cryogenics. Over his career, he has received 18 patents in chemical instruments, antennas, and RF design.
