On the train to work, Lee opened an email on her smartphone, sent from a PAC (programmable automation controller) operating a surface-mount tool at her factory. Attached was a quality-control report that suggested changing the tool’s solder temperature. To generate that suggestion, the PAC had securely sent yesterday’s production data to a cloud-based analytics system to compare the machine’s current and historical data. Next, it accessed the machine manufacturer’s website and obtained the latest recommended settings.
Finally, the PAC built a production efficiency report with a suggested solder temperature for today’s production run that would increase yield by 7% over yesterday’s run. Lee clicked a link in the email and connected to the PAC’s mobile interface over a secure, encrypted channel. She logged in and navigated to the machine’s solder temperature set point, where she entered the recommended value. All this took place before she got to the office.
PAC at the edge
The PAC operating the surface-mount tool at Lee’s factory sits at the edge of the factory’s network. Systems like these are increasingly able to perform edge computing: placing computing resources as needed along the path from sensor to cloud, so that data is collected, filtered, and processed at or near its source and the total amount sent to the cloud for storage, processing, and analysis is reduced. As a result, businesses can more quickly identify real opportunities for operational efficiency improvement and meaningful revenue generation.
To foster such business benefits, data from the physical world of machines and equipment must be available to the digital world of the internet and information technology systems, quickly, easily, and continuously.
Successful IoT applications require operational technology (OT) professionals to make data from their systems, which monitor and control the physical world, accessible to the data processing systems of information technology (IT) professionals. Once that data is accessible, cognitive prognostics algorithms running on IT systems can analyse it, refining raw physical data into actionable information that predicts outcomes in real time. The results can be used to improve inventory management and predictive maintenance as well as reduce asset downtime. But before such benefits can be realised, three problems need to be solved—connectivity, Big Data, and IoT architecture.
The connectivity problem
The Internet of Things runs on vast amounts of data, generated by the physical world and then transported and analysed by the digital world. It’s an attempt to achieve perpetual connectivity and communication between people and things, and even between things and other things. But, unfortunately, most of these things were never designed to serve this new purpose; many were designed and installed long before the internet existed. At the edge, things like sensors, circuits, relays, and meters are attached to the industrial control systems used to operate equipment and machines. These sensors translate what’s physically happening in the world (temperature, light, vibration, sound, motion, flow rate, and so on) into an electrical signal, such as a voltage or current, that other systems can interpret to monitor and control physical equipment and machines.
These sensors typically have little or no intelligence; they are designed merely to observe and report, not to communicate with the digital world of the IoT. The physical world’s language, the language of flow meters, temperature sensors, switches, and relays, is rarely digital bits and bytes. It is not the ones and zeros that information technology and internet devices use to communicate. These devices also lack the physical connections and logical interfaces to communicate on the Internet of Things: they have no built-in Ethernet jack or wireless interface.
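What this translation looks like in practice can be sketched in a few lines of Python: an edge device linearly maps a raw 4–20 mA current-loop signal onto engineering units before any digital system can use it. The 4–20 mA loop is a common industrial convention, but the 0–300 °C span here is an illustrative assumption, not any particular sensor’s specification.

```python
# A minimal sketch of signal translation at the edge: mapping a raw
# 4-20 mA current-loop reading to engineering units (here, degrees C).
# The 0-300 degree span is an illustrative assumption.

def scale_current_loop(milliamps, lo=0.0, hi=300.0):
    """Linearly map a 4-20 mA sensor signal onto an engineering range."""
    if not 4.0 <= milliamps <= 20.0:
        # Readings outside the live-zero loop usually mean a wiring fault.
        raise ValueError("signal outside the 4-20 mA loop")
    return lo + (milliamps - 4.0) * (hi - lo) / 16.0

mid_scale = scale_current_loop(12.0)  # 12 mA is mid-scale: 150.0
```

A live-zero range like 4–20 mA is used deliberately: a reading of 0 mA is unambiguously a broken wire, not a cold process.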
Integrating these disconnected things and systems is no small task, and given the technical pitfalls and risks involved, it’s fair to wonder how long it will take to realise a return on our investments in IoT applications. One option is simply to wait for highly intelligent, connected sensors to reach the marketplace, but those sensors are years away from being cost-effective. Moreover, sensors installed today, or even decades ago, are still performing their tasks. They’re just not connected to the IoT, so the data they generate is siloed and inaccessible to IT systems for further analysis.
The Big Data problem
Across the globe, a massive installed base of things exists today, generating useful data that the IoT wants to access and consume. In oil & gas applications, a typical oilfield has up to 30,000 sensors installed. Factories and plants across the world have billions of sensors. Each sensor is capable of generating huge amounts of data from the physical world. Some IoT applications could generate terabytes of data per second.
These are volumes of data the digital world has never seen before. This is the Big Data problem. Moving that much data onto existing network and internet infrastructure for cloud-based analytics and centralised management will clog networks, vastly increasing network and internet latency. For many industrial IoT applications that is not acceptable, because real-time control and monitoring are mandatory. For the Internet of Things to reach critical mass, intelligence must be pushed to the network edge, where the physical world meets the digital world. Computing systems at the network edge must have the capability to collect, filter, and process data generated at the source, before it’s transmitted up to the IoT. And at the same time, these edge computing systems must be able to complete the local real-time process control and automation tasks of traditional industrial applications.
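The data-reduction role described above can be sketched in a few lines of Python: instead of streaming every raw reading to the cloud, the edge device forwards only summary statistics plus any readings outside an acceptable band. The window size, band limits, and field names are illustrative assumptions, not any vendor’s API.

```python
# A sketch of edge-side data reduction: one window of raw readings is
# collapsed into a compact record before anything leaves the device.
# Band limits (180-220) and field names are illustrative.

def reduce_window(readings, low=180.0, high=220.0):
    """Summarise one window of raw sensor readings for the cloud."""
    outliers = [r for r in readings if r < low or r > high]
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
        "outliers": outliers,  # only these raw values leave the edge
    }

# One second of 1 kHz samples collapses to a handful of numbers:
window = [200.0] * 998 + [250.0, 199.0]
record = reduce_window(window)
```

A thousand raw samples become one small record, while the single out-of-band reading (250.0) is still reported individually for immediate attention.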
The IoT architecture problem
Let’s take a look at how today’s IoT architecture works, so we can see its complexity and perhaps find a path forward. For a cloud-based server to capture data from an analog sensor today, the sensor’s data must be translated by a series of disparate software and hardware tools. First, the sensor is physically wired to a device such as a PLC (programmable logic controller). While modern PLCs do provide basic analog-to-digital conversion of sensor signals, PLCs were not designed to interface with the Internet of Things.
PLC hardware, software, and programming languages were designed for repetitive, application-specific tasks like process control and discrete automation. They typically use proprietary protocols and languages for communication and programming, and do not include information security standards like encryption and authentication. PLCs were originally designed as standalone systems. The protocols they use are seldom internet compliant and are designed for point-to-point communication instead of the point-to-multipoint communication architecture found in the IoT ecosystem.
If systems that communicate using internet-compliant protocols—such as PCs, web servers, and databases—want to communicate with a PLC, a vendor-specific and often proprietary software driver or hardware-based protocol gateway is required. OPC (Open Platform Communications) software is one solution to this communication disconnect. But OPC was originally designed around PC architecture, using Microsoft’s Windows-only inter-process communication technology, COM/DCOM. Most systems and devices connecting to the IoT are not Windows-based.
For example, take your smartphone. It’s likely an Apple or Android device: Android runs on a modified Linux kernel, and Apple’s iOS is a Unix-based system, and neither supports COM/DCOM process exchange. OPC UA (Unified Architecture) has since been released, but in practice it often wraps existing OPC drivers built on Windows architecture, and it requires design engineers to build an OPC UA client adapter into their products. Even then, modern network and internet assets such as web servers, databases, smartphones, and tablets do not speak OPC UA.
PLCs, OPC servers, proprietary drivers, and protocol gateways quickly add up to a convoluted IoT architecture. These layers of complexity require time, money, and specific domain expertise to install and maintain. Worse, by the time data from the physical world reaches its destination, it has been converted by so many different pieces of hardware and software that its integrity can be jeopardised. Imagine the difficulty of provisioning and troubleshooting these IoT systems, and then consider that today’s automation architectures often do not address information security at all. Sending data generated at the edge through so many layers of conversion not only increases network latency but also opens up complex information security concerns as the data is transported to the cloud. Multiply these issues across the billions of devices we expect to connect, and you see the communication challenge the IoT faces.
There has to be a better way!
Flattening the IoT architecture
As we’ve seen, for the IoT to reach critical mass, internet protocols and technologies need to be driven into systems at the edge, where the physical world and the digital world connect. Layers of complexity must be removed from the communication process between digital systems and physical assets. Modern IoT system architectures must be flattened, streamlined, optimised, and secured.
If we drive internet connectivity and data processing power into edge devices, we can greatly accelerate our time to insight and action. Edge computing devices will become the sensor on-ramp for the billions of data points we intend to connect to the IoT. These edge computing systems will need the ability to receive the input signals of the physical world and output the meaningful data the IoT needs, in a form that digital internet-enabled systems already understand. Edge computing systems must easily and securely access the cloud through the open, standards-based communication technologies the internet is based on:
Internet technologies like TCP/IP, HTTP/S, MQTT, and RESTful APIs—the dialect of the internet—must be built directly into the input/output level, the point of physical-to-digital conversion.
Internet security technologies like SSL/TLS encryption and authentication must be built directly into edge computing systems.
Cloud-based systems must be able to make RESTful API calls to access data, or subscribe to data points on remote edge devices, without the layers of complexity and conversions that exist in industrial applications today.
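As a sketch of that flattened model, the snippet below builds an authenticated HTTPS request for a single named I/O point on an edge device, using only the Python standard library. The host name, endpoint path, and API-key header are hypothetical illustrations, not any real device’s API.

```python
# A sketch of a direct, flattened REST interaction: a cloud service reads
# one named I/O point from an edge device over HTTPS in a single call.
# The host, path, and API-key header below are hypothetical.
import urllib.request

def build_point_request(host, point, api_key):
    """Build an authenticated HTTPS GET for one named I/O point."""
    url = f"https://{host}/api/v1/points/{point}"
    return urllib.request.Request(url, headers={
        "X-API-Key": api_key,         # device-level authentication
        "Accept": "application/json",
    })

req = build_point_request("edge-pac.example.com", "solder_temp", "secret-key")
# urllib.request.urlopen(req) would perform the TLS-secured call and
# return a JSON body, e.g. something like {"value": 245.0, "units": "C"}.
```

No OPC server, no proprietary driver, no protocol gateway: one encrypted, authenticated request straight from the digital world to the edge of the physical one.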
The power of interoperability
We did not always have one cohesive system for transmitting and sharing information. Before the internet and the world wide web, many different internet-like protocols and architectures existed, and computer systems ran different operating systems requiring different programming languages.
Small pockets of interconnectivity existed, but for the most part systems were disconnected from each other. It was very similar to the way industrial systems communicate today, with the need for converters, adapters, and gateways.
The internet was designed to allow input/output and information systems to share data through a common interface, removing layers of complexity and allowing for greater interoperability between systems designed and manufactured by different vendors. That’s why an Apple computer or Android phone today can send an email to a Windows computer—they speak the same internet languages. Today’s internet uses a common set of protocols, tools, and routines designed to make the transportation, acquisition, and analysis of digital information a seamless process, no matter what device you’re using.
Although sensors and other physical assets installed at the edge may not have been designed with internet interoperability in mind, there’s still a massive opportunity to collect meaningful data from the huge installed base of existing things. But it will require a solution that understands both sides of the OT and IT convergence—something that can:
Process and filter mountains of data, sending only the necessary data to the cloud for analysis
Provide communications interfaces and processing power to maintain the closed-loop, real-time control requirements of industrial applications
Deliver all of the above in a package suitable for challenging industrial environments, where dust, moisture, vibration, and electromagnetic interference are common and temperatures vary widely
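The second requirement above, keeping closed-loop control local even while the cloud handles analytics, can be illustrated with a toy hysteresis controller in Python. The set point and band are illustrative values, not a real machine’s parameters.

```python
# Toy closed-loop control step: the edge device decides the actuator state
# locally, on every scan, with no cloud round trip in the loop.
# Set point and band are illustrative values.

def control_step(temp, heater_on, set_point=200.0, band=2.0):
    """Return the new heater state for one temperature reading (hysteresis)."""
    if temp < set_point - band:
        return True        # too cold: switch the heater on
    if temp > set_point + band:
        return False       # too hot: switch the heater off
    return heater_on       # inside the band: hold the current state

state = False
for reading in (190.0, 199.5, 203.0, 205.5):
    state = control_step(reading, state)
```

The hysteresis band prevents the actuator from chattering on and off around the set point, and because the decision is made locally, a lost cloud connection degrades analytics, not control.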
This article is reproduced courtesy of Opto 22.