What is the Role of Standards, Big Data and Analytics in Manufacturing?


Manufacturing has been traditionally complex, so how do you inject today’s new technologies of big data, real-time analytics, and interoperability into it?

Analytics for discrete manufacturing have been advancing slowly when compared with other industries like finance and media.

A little background on manufacturing: Manufacturing can be broken down into two high-level categories, discrete and continuous process. Discrete manufacturing is concerned with individual parts that are machined or produced separately (discretely) and often go into products like automobiles, aircraft, electronics, and appliances, to name a few. Continuous process manufacturing is concerned with a continuous flow of material, as in the food, pharmaceutical, and chemical industries.

Discrete parts are often high value and have complex geometries and multistep processes. In most cases, the processes use general-purpose equipment capable of producing parts for many products, whereas in a continuous process the equipment is often specific to a single product or type of product, the volumes are much greater, and variation is very limited.

Within discrete manufacturing there are two high-level categories: high volume/low mix and low volume/high mix. The first is aligned with consumer electronics and automotive, where large numbers of parts are made on dedicated lines. In low volume/high mix, individual orders for different parts are produced in small quantities. Many of the objectives of industrial movements like Industry 4.0 are to achieve higher levels of agility and support greater product variety, which means production is moving toward smaller batch sizes and higher mix. This article will discuss low volume/high mix discrete manufacturing, the challenges of applying analytics to it, and what it will take to create value that matches the hype.

A Lack of Interoperability

The reasons for the lack of analytical advancement are numerous, but most have to do with the lack of semantic interoperability between key manufacturing systems and equipment. The other area of concern is the lack of data management infrastructure that can be deployed effectively and safely in industrial environments. These concerns are being addressed, but because of the recent hype, the complexity and core value drivers are often overlooked, leading to a failure to deliver any perceived improvement over the prevalent manual processes in use today.

See also: In 2018, prepare for IoT interoperability

The emergence of the Industrial Internet of Things (IIoT) has provided the economic motivation to address the historical concerns of data analytics for manufacturing systems, but at the same time it has created a hype cycle that has been detrimental to addressing those concerns because of the tendency to assume that, at some point, value is created by magic. With complex analytics like Machine Learning (ML) and Artificial Intelligence (AI), a lack of understanding of the underlying capabilities and limitations allows claims to be made that cannot be supported by the technology.

Manufacturing data, in particular discrete manufacturing data, is very complex, and if it is analyzed without regard to the context in which it is created, the value is often lost. In many manufacturing systems, the equipment can be used to make a wide variety of parts with widely varying processes. A Computer Numerical Control (CNC) machine tool is a general-purpose piece of equipment that runs a program that can execute complex high-speed motion with incredible accuracy.

CNC machine tool and controller manufacturers have historically not been concerned with providing data from their equipment beyond the minimum required to load programs and do some low-speed monitoring of processes as they execute. Many of the Application Programming Interfaces (APIs) were written to support a Human Machine Interface (HMI) so that machine tool builders could create their own custom look and feel. Because these APIs were not designed for data analytics and did not envision connectivity to the internet, they often lack richness of data and the necessary security.

The main problem is that each controller manufacturer has a proprietary protocol and structure for the data that are accessible. As a result, data collection and analysis are often far too costly and onerous for many companies to attempt. The development of standards like MTConnect has greatly reduced the complexity of getting data from multiple equipment vendors and allowed for faster development of more analytically rich applications, but most are still primitive compared to other industries. MTConnect has reduced this burden by providing data that has semantic content, meaning that each piece of data has a well-defined meaning, for example, Positions, Temperatures, Modes, and States, and has context with respect to the logical components of the machine, such as the Controller, Axes, and Electrical and Pneumatic Systems. Each piece of numeric data also has defined units and format.
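
To make this concrete, here is a minimal sketch in Python of reading such a semantically tagged stream. The XML is a simplified, illustrative MTConnect-style document (real responses carry an MTConnectStreams envelope, namespaces, and a header), and the device and data-item names are hypothetical; the point is that every observation arrives with its semantic type, its component context, and its units.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative MTConnect-style stream: each observation carries its
# semantic type (the tag name), the component it belongs to, units, and a timestamp.
SAMPLE_XML = """
<DeviceStream name="Mill-1">
  <ComponentStream component="Linear" name="X">
    <Samples>
      <Position dataItemId="xpos" timestamp="2024-01-01T12:00:00Z" units="MILLIMETER">120.42</Position>
      <Temperature dataItemId="xtemp" timestamp="2024-01-01T12:00:00Z" units="CELSIUS">41.7</Temperature>
    </Samples>
  </ComponentStream>
  <ComponentStream component="Controller" name="controller">
    <Events>
      <Execution dataItemId="exec" timestamp="2024-01-01T12:00:00Z">ACTIVE</Execution>
    </Events>
  </ComponentStream>
</DeviceStream>
"""

def observations(xml_text):
    """Yield (component, semantic_type, units, value) for every observation."""
    root = ET.fromstring(xml_text)
    for comp in root.iter("ComponentStream"):
        for group in comp:                  # Samples, Events, Condition groups
            for obs in group:
                yield (
                    comp.get("name"),       # logical component: X axis, Controller, ...
                    obs.tag,                # semantic type: Position, Temperature, Execution
                    obs.get("units"),       # defined units for numeric data, else None
                    obs.text.strip(),
                )

for row in observations(SAMPLE_XML):
    print(row)
```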

Custom Rules Not Needed

Having semantic data with component-level context allows an application to apply similar analytics across a large number of machines from multiple vendors without custom rules for each vendor. Standardized data yields significantly greater value than data delivered in native vendor formats without common semantics; one can state that data cannot be analyzed if it cannot be understood. Pure AI approaches to determining the meaning of the data have been largely unsuccessful because of the variety and volatility of the data, and it makes little to no sense to try to infer what can be easily tagged at the point of collection. Semantic standards provide the most efficient way to collect meaningful data and the foundation that creates the shortest path to value in the least amount of time.
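
As an illustration of what that buys you, the following sketch applies one rule fleet-wide. The observations and machine names are hypothetical and already flattened into a common form; because the semantic type and units are standardized, the same threshold check works for every machine regardless of which vendor's controller produced the data.

```python
# Hypothetical, flattened observations from three machines of different vendors.
observations = [
    {"machine": "Mill-1",  "component": "X", "type": "Temperature", "units": "CELSIUS", "value": 41.7},
    {"machine": "Lathe-7", "component": "Z", "type": "Temperature", "units": "CELSIUS", "value": 68.2},
    {"machine": "Mill-3",  "component": "C", "type": "Load",        "units": "PERCENT", "value": 35.0},
]

def overheating_axes(obs, limit_c=60.0):
    """Return (machine, component) pairs whose axis temperature exceeds limit_c."""
    return [
        (o["machine"], o["component"])
        for o in obs
        if o["type"] == "Temperature" and o["units"] == "CELSIUS" and o["value"] > limit_c
    ]

# One rule for the whole fleet, with no per-vendor translation tables.
print(overheating_axes(observations))   # [('Lathe-7', 'Z')]
```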

See also: IIoT needs semantic standards, says IIC’s Soley

A paper written by Google, "Hidden Technical Debt in Machine Learning Systems," does a nice job of highlighting the often-overlooked truth about machine learning and the hidden debt incurred when developing these systems. The majority of the time and expense is spent on developing the supporting systems to collect, transform, clean, and store the data and to monitor and maintain the infrastructure. In most cases, the data does not come to the analytics in a pristine, easily consumable form ready for immediate ingestion and analysis. The data is often dirty, with many anomalous values that must be removed, and one must have an adequate sample size to determine the statistical significance of any signal that may be correlated to a critical event. One does not want to produce too many false alerts, since they will result in a loss of trust and the eventual dismantling of the system.
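
A small sketch of that unglamorous plumbing, assuming a stream of temperature readings with an occasional sensor-dropout sentinel: anomalous values are filtered with a robust median-based rule, and no baseline is computed until the sample is large enough to mean anything. The values and thresholds are illustrative.

```python
import statistics

def clean_series(values, k=5.0):
    """Drop grossly anomalous readings (dropouts, spikes) with a robust median/MAD filter."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values) or 1e-9
    return [v for v in values if abs(v - med) / mad <= k]

def enough_samples(values, minimum=30):
    """Refuse to draw conclusions from a sample too small to be meaningful."""
    return len(values) >= minimum

raw = [41.2, 40.9, 41.5, -999.0, 41.1, 40.8]   # -999.0: a typical sensor-dropout sentinel
cleaned = clean_series(raw)

if enough_samples(cleaned):
    print("baseline temperature:", statistics.fmean(cleaned))
else:
    print("not enough clean data yet; keep collecting")
```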

The next concern is process and part context. The equipment context tells one what happened during the process, but to determine if it was correct, one must know the intent. If the same part is made millions of times, it is easier to automate analysis since one can collect a baseline set of data and compare each subsequent part to the set of good and bad parts used as the training set. Anomaly detection can be done by comparing each subsequent part and classifying it without the need for the explicit intent being communicated. This, of course, assumes that there is sufficient data volume and content to find a suitable signal.
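
For the high-volume case, a baseline comparison can be as simple as the following sketch: summarize each part with a few features, fit the distribution of known-good parts, and flag any new part that strays too far. The feature names, values, and three-sigma threshold are hypothetical.

```python
import statistics

# Per-part summary features (e.g. peak spindle load in percent, cycle time in seconds)
# collected from parts already known to be good. Values are hypothetical.
baseline = [
    {"peak_load": 62.1, "cycle_time": 184.2},
    {"peak_load": 61.7, "cycle_time": 183.8},
    {"peak_load": 63.0, "cycle_time": 185.1},
    {"peak_load": 62.4, "cycle_time": 184.6},
]

def fit(parts):
    """Mean and standard deviation of each feature over the known-good parts."""
    return {k: (statistics.fmean(p[k] for p in parts),
                statistics.stdev(p[k] for p in parts)) for k in parts[0]}

def is_anomalous(part, model, k=3.0):
    """Flag the part if any feature is more than k standard deviations from baseline."""
    return any(abs(part[f] - mu) / (sd or 1e-9) > k for f, (mu, sd) in model.items())

model = fit(baseline)
print(is_anomalous({"peak_load": 71.5, "cycle_time": 184.4}, model))  # True: load spike
print(is_anomalous({"peak_load": 62.3, "cycle_time": 184.9}, model))  # False
```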

When working with low volume/high mix manufacturing, there are often too few parts of a single type to train the analytics, so the process intent must be provided. To address this, the design and engineering information must be provided with annotations that can be understood by software systems. Currently, much of the intent is captured in proprietary formats in the Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) tools, as well as in engineering and simulation technologies.

There is currently limited standardization in these areas. The current standards provide support for digital representations of geometries with tolerances (GD&T) and annotations (PMI), but the annotations are provided for human consumption, not for software automation. The same situation occurs with engineering standards. The information communicated to the manufacturing equipment is a series of low-level motion commands and tool selections, with proprietary codes that cause changes to operational states. The instructions and intent are provided as notes on printed documents for humans to interpret but are not communicated to the equipment.
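
As a thought experiment, the sketch below shows the kind of machine-readable intent record that could travel with a feature from design through execution. No current standard defines this model; the fields are purely illustrative of information that today lives in drawing notes and CAM comments.

```python
from dataclasses import dataclass, asdict
import json

# Illustrative only: not a standardized model, just a sketch of machine-readable intent.
@dataclass
class FeatureIntent:
    feature_id: str       # identity shared across CAD, CAM, and execution data
    feature_type: str     # e.g. "hole", "pocket", "face"
    material: str
    nominal_mm: float     # nominal dimension
    tolerance_mm: float   # allowable deviation (GD&T-style, greatly simplified)
    tool: str             # intended tool
    notes: str = ""       # human-readable note, kept but not relied upon

intent = FeatureIntent(
    feature_id="part-123/feature-7",
    feature_type="hole",
    material="Al-6061",
    nominal_mm=12.000,
    tolerance_mm=0.025,
    tool="10mm carbide drill",
    notes="Deburr after drilling",
)

print(json.dumps(asdict(intent), indent=2))   # exchangeable with other systems
```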

For the next level of analytics, all of this information, as well as the reasons why decisions are made, will need to be delivered in standardized, machine-accessible information models that can be accessed by many systems to compare the outcome of the process as executed with the intent of the process. These models need to be combined using a common identity scheme and managed across multiple disparate systems, with security and provenance encoded in each information model to make sure the information has not been altered, forged, or accessed by malicious or unauthorized parties.
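
A minimal sketch of the provenance piece, assuming a shared secret between producer and consumer: the information model is canonically serialized and signed so that any later alteration is detectable. Real deployments would need proper key management, identity, and access control, which are out of scope here.

```python
import hmac, hashlib, json

SECRET_KEY = b"shared-or-per-source-key"   # hypothetical; real systems need key management

def sign_model(model: dict) -> dict:
    """Attach an HMAC over the canonical serialization of the model."""
    payload = json.dumps(model, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"model": model, "signature": tag}

def verify_model(record: dict) -> bool:
    """Recompute the HMAC and compare; False means the model was altered."""
    payload = json.dumps(record["model"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

record = sign_model({"id": "part-123/feature-7", "nominal_mm": 12.0})
print(verify_model(record))              # True
record["model"]["nominal_mm"] = 11.5     # tampering
print(verify_model(record))              # False
```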

Intent Provides Data Context for Analytics

Intent allows analytical software to understand the context of the data within the product and the process being executed. For example, the analytics can determine whether an individual material removal operation is correct by knowing the material, the type of feature or geometric change being made, the desired outcome, the tooling and cutting conditions that were present, and the equipment and its capabilities. All of these factors are required to understand what happened in a process and then compare it with the outcome to provide feedback for prescriptive process improvement.
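
Put together, intent-aware checking can be as direct as the sketch below: compare what the process produced and how it ran against what was intended. The field names and limits are hypothetical and continue the illustrative intent record from above.

```python
# Hypothetical intent and executed-process records for one feature.
intent = {
    "feature_id": "part-123/feature-7",
    "nominal_mm": 12.000,
    "tolerance_mm": 0.025,
    "max_spindle_load_pct": 70.0,
    "expected_tool": "10mm carbide drill",
}

executed = {
    "feature_id": "part-123/feature-7",
    "measured_mm": 12.031,
    "peak_spindle_load_pct": 74.5,
    "tool_used": "10mm carbide drill",
}

def check_feature(intent, executed):
    """Return a list of deviations between process intent and executed outcome."""
    problems = []
    if abs(executed["measured_mm"] - intent["nominal_mm"]) > intent["tolerance_mm"]:
        problems.append("dimension out of tolerance")
    if executed["peak_spindle_load_pct"] > intent["max_spindle_load_pct"]:
        problems.append("spindle load exceeded intended limit")
    if executed["tool_used"] != intent["expected_tool"]:
        problems.append("unexpected tool")
    return problems

print(check_feature(intent, executed))
# ['dimension out of tolerance', 'spindle load exceeded intended limit']
```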

Cutting corners by trying to infer the intent has not been successful beyond basic utilization and time management. There has been some success in maintenance, but high mix tends to reduce the analysis to basic time accounting and limit monitoring. There is a lot that can be done to deliver on the promise of digital manufacturing and prescriptive analytics, but the foundational work has not been completed, and the standardization of the intent-based information models still needs to be done. In addition, the control systems will need to be enhanced to provide an adequate signal at much higher rates to allow the analytics to find the patterns and drive the learning systems. But the learning systems will not succeed if the up-front context is not provided and standardized semantics are not used when the information is created.

There is still a lot of work to be done to achieve greater levels of agility in manufacturing, but that work is now understood and has been started. The data management systems will need to deal with special concerns around security and safety that are not currently addressed by operational technology (OT) systems, and those systems will need to be enhanced if the analytics are to progress. The goal of agile, on-demand manufacturing is achievable with greater software and execution automation, but the foundations must be completed first.


About William Sobel

William Sobel is Chief Strategy Officer and co-founder of VIMANA, the leading predictive analytics solution for manufacturing intelligence. He is the principal architect of the MTConnect Standard, chair of its Technical Steering Committee, and co-chair of the Industrial Analytics task group of the Industrial Internet Consortium. He brings over 30 years of experience advancing the state of the art in manufacturing technology and is currently developing standards for self-aware IIoT systems in manufacturing.
