Tech Terms Part II

As mentioned in our previous article, technical terms continue to change and evolve. Below are more terms we found useful to know and understand, and we think you will, too.

Artificial Intelligence (AI), Machine Learning (ML) & Deep Learning – Not only a mouthful but also confusing word spaghetti; the three terms seem to get used interchangeably in a lot of scenarios. Nvidia uses an interesting infographic to provide some clarification.

The Nvidia definition uses the date of introduction to organize the hierarchy. And since AI was used as a common term first, they represent it as the most general term. On the other hand, it seems that intuitively AI would have to include characteristics associated with intelligence – adaptation and reasoning, to name a few – whereas Machine Learning could deliver value from data mining without necessarily having adaptive characteristics or applying reason. For our purposes, we use Machine Learning as the broader term to describe solutions that address Predictive and Prescriptive Analytics, and AI as a narrower subset, more analogous to Deep Learning. But don’t be surprised to see these terms continue to be used synonymously; we look forward to clearer definitions taking shape.

Digital Twin – An emerging term, commonly used in the AI and ML space, to describe a software model of a physical object. The term has broad meaning; in fact, someday we might end up with more granularity, like we did with Analytics. On one extreme are the complex models that represent an entire plant using first-principles techniques, similar to a flight simulator for planes. Plants have used these digital twins for decades to help with commissioning and optimization. More recently, the term has been used to describe applications like visualization and data models used for remote monitoring, or software sensors – statistical models that estimate the reading of a physical instrument in a plant. In all these cases, we have digital representations of a physical object, used for a variety of purposes.
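The software-sensor flavor of digital twin can be sketched in a few lines. Everything below (the tag names, the synthetic temperature/flow/moisture relationship, the numbers) is illustrative, not from any real plant or instrument:

```python
import numpy as np

# Hypothetical soft sensor: estimate a lab-measured quality value (moisture)
# from two routinely measured process variables (temperature, flow).
rng = np.random.default_rng(0)
temp = rng.uniform(80.0, 120.0, 200)   # degrees C
flow = rng.uniform(10.0, 30.0, 200)    # m3/h
# "True" relationship the physical analyzer would report, plus measurement noise
moisture = 5.0 - 0.02 * temp + 0.1 * flow + rng.normal(0.0, 0.05, 200)

# Fit the statistical model: ordinary least squares on [1, temp, flow]
X = np.column_stack([np.ones_like(temp), temp, flow])
coef, *_ = np.linalg.lstsq(X, moisture, rcond=None)

def soft_sensor(temp_c: float, flow_m3h: float) -> float:
    """Digital twin of the moisture analyzer: predicts what it would read."""
    return coef[0] + coef[1] * temp_c + coef[2] * flow_m3h

print(soft_sensor(100.0, 20.0))  # close to 5.0 - 2.0 + 2.0 = 5.0
```

Once fitted, the soft sensor can run continuously between lab samples, which is exactly the "digital representation of a physical object" idea.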

Cloud – This is an interesting one. For a long time, the term sounded sophisticated but in reality just meant that your application/OS/etc. was hosted on a server somewhere else – no different than when many corporations created Data Centers years ago to centrally host and maintain software. But applications and technology have evolved, and the term Cloud has fulfilled its original promise. Cloud solutions are now purpose-built: they not only run on a server somewhere else but are designed with the scalability, ease of installation, support, and security that make them unique.

Edge – Another example of a technology that has expanded after being used in some industries for a long time. It’s simply the idea of placing applications (data collection, processing, etc.) near the data source or end user. In Telecom, Edge computing is a significant advancement and a key differentiator in the 5G rollout. In plants, instruments and pieces of equipment can now be considered part of the Edge. However, real-time Historians have utilized Edge computing for decades: remote data collection nodes have been placed as close as possible to the data source (DCS or PLC) for years to deliver Store & Forward data collection and pre-processing of data.
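The Store & Forward pattern behind those collection nodes can be sketched as follows. This is a minimal illustration, not a real historian's collector (which would add disk persistence, compression, backfill logic, and so on); the tag names are hypothetical:

```python
from collections import deque
from typing import Callable, Tuple

Sample = Tuple[str, float, float]  # (tag, timestamp, value)

class EdgeCollector:
    """Minimal store-and-forward sketch: buffer readings locally at the
    Edge and flush them to the central historian when the link is up."""

    def __init__(self, send: Callable[[Sample], bool], maxlen: int = 10_000):
        self._send = send                    # returns True if the historian accepted it
        self._buffer = deque(maxlen=maxlen)  # local store near the data source

    def collect(self, tag: str, timestamp: float, value: float) -> None:
        self._buffer.append((tag, timestamp, value))
        self.flush()

    def flush(self) -> None:
        # Forward in arrival order; stop (and keep buffering) if the link drops.
        while self._buffer:
            if not self._send(self._buffer[0]):
                return
            self._buffer.popleft()
```

When the network to the historian fails, readings accumulate in the buffer; once `flush()` succeeds again, nothing was lost and order is preserved.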

In many Use Cases, Edge computing is just a product of the continued evolution of Cloud-based computing and the realization that a hybrid strategy is required. Some pieces will live in the Cloud and others will continue to live On Premise – the Edge. If you’re peering down from the Cloud, the plant probably looks like the Edge. To many of us, it’s where we’ve been since the 1980s.

A good article discussing Cloud vs. Edge can be found here – https://www.arcweb.com/blog/edge-cloud-analytics

Block Chain – Probably our favorite term these days. Right now it feels like a solution looking for a problem, but a list of buzzwords would not be complete without it. So, until someone develops a proven Real-Time Historian or DCS block chain, we’ll let others work on this one and you won’t hear much from us. (As a side note: as of June 25th, 2018, the average Bitcoin transaction confirmation time over the previous 60 days was ~20 minutes – https://blockchain.info/charts/avg-confirmation-time?timespan=60days.)

Have questions or need clarification? Feel free to reach out to us.

Why We’re Implementing the OPC UA Spec. (and how it will benefit our customers)

Going back 15 years now, dataPARC had the notion of a “Process Area” that allowed tags from multiple systems to be organized by Asset, providing filters (like Grade or Product) for all tags assigned to an Asset and allowing other useful associations to be applied globally. Building on this experience, the next major version of PARCview takes the next step in Asset Management and adopts the ISA 95 companion specification to OPC UA. The implementation will give end users a familiar, standards-based architecture for organizing their plant data.
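As a rough illustration of the idea (the asset names and tags below are hypothetical, not dataPARC's actual model), an asset hierarchy that organizes tags from multiple systems and lets an association apply to a whole subtree might look like:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Asset:
    """One node in an ISA-95-style hierarchy (e.g. Site > Area > Unit)."""
    name: str
    tags: List[str] = field(default_factory=list)      # tags from any source system
    children: List["Asset"] = field(default_factory=list)

    def all_tags(self) -> List[str]:
        # A filter (like Grade or Product) applied at this asset can then
        # cover every tag in the subtree beneath it.
        found = list(self.tags)
        for child in self.children:
            found.extend(child.all_tags())
        return found

mill = Asset("Mill", children=[
    Asset("PaperMachine1", tags=["PM1.Speed", "PM1.BasisWeight"]),
    Asset("PowerHouse", tags=["BLR1.SteamFlow"]),
])
print(mill.all_tags())  # all tags under the Mill, regardless of source system
```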
Read More

OPC UA: A Framework for the Industrial Internet of Things

The ISA 95 spec and OPC UA companion standard provide a model that allows software programs to exchange all the relevant information throughout a manufacturing organization. This lays the groundwork for an industrial internet of things by breaking down the communication barriers between objects.

The Internet of Things

The concept of an “Internet of Things” (IoT) has been around since about 2005 and has really begun to catch on in recent years. Basically, IoT means connecting multiple “things” equipped with sensors to data processing programs capable of sending data to and receiving data from those things. The things can be anything from household appliances to industrial machines. In factories, this concept is called the Industrial Internet of Things, or Industrie 4.0 – the term coined in Germany because IoT is considered the 4th Industrial Revolution. No matter what it’s called, the goal is to create a networked world with intelligent objects that can communicate and interact with each other.
Read More