Advantages of a Plant Wide Information System

When operating a continuous manufacturing process of any kind, it pays to have as much data as possible. Simply collecting and storing data does not, by itself, yield measurable benefits: to take full advantage of the data, it must be organized, archived, and then made available in a variety of formats throughout a facility. This is the function of a versatile and robust plant-wide information system such as the dataPARC software suite.

The term “plant-wide” applies to an information system in two ways. The first is collecting information from the various data sources throughout the actual manufacturing process, including the administrative infrastructure around the plant. The information system should then condition and archive that data.

“Data conditioning” refers to a variety of techniques that include, but are not limited to, averaging, filtering, correlating time stamps, creating combined calculated values, and aggregating raw data. The second sense of “plant-wide” refers to the re-presentation of the conditioned data throughout the mill. While using a system like this may seem like an obvious idea, many plants still do not have one.
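As an illustration, here is a minimal sketch of one conditioning step mentioned above, averaging raw samples into fixed time buckets so that readings from different sources line up on common time stamps. The `(timestamp, value)` input format and the function name are assumptions for the example, not dataPARC's actual interface:

```python
from statistics import mean

def aggregate(samples, bucket_seconds=60):
    """Average raw (epoch_seconds, value) samples into fixed time buckets.

    Aligning every data source to the same bucket boundaries is what
    makes later cross-variable comparison possible.
    """
    buckets = {}
    for ts, value in samples:
        key = ts - (ts % bucket_seconds)  # floor timestamp to bucket start
        buckets.setdefault(key, []).append(value)
    return {key: mean(vals) for key, vals in sorted(buckets.items())}

raw = [(0, 10.0), (30, 12.0), (60, 11.0), (90, 13.0)]
print(aggregate(raw, 60))  # {0: 11.0, 60: 12.0}
```

A real historian performs this continuously and per-tag, but the principle is the same: many raw points in, one conditioned value per interval out.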

The History of Data Management in Plants

Historically speaking, as manufacturing facilities have transitioned from analog mechanical and pneumatic control systems and paper based recordkeeping into the digital age, the changes have not been at all uniform. Production processes were largely re-instrumented and put under the control of computer based Distributed Control Systems (DCSs) or Programmable Logic Controllers (PLCs). Some of these systems had the ability to archive data, some did not. Initially these systems were offered by a variety of vendors and the exchange of data was either not considered, or discouraged. It is not unusual to see incompatible DCS systems from different vendors within a single facility. As computing technology advanced, the vendor offering the best blend of features and price constantly changed, leading to a diversification of systems as different areas of a facility were modernized.

The Problem with “Data Islands”

Quality control labs also took advantage of advancing technology, investing in database programs and communication interfaces tailored to archiving both manual data entries and automated input from certain instruments. Raw material ordering and inventory were tracked in database programs optimized for those purposes, as were warehousing and shipping information. Each department in a facility did indeed move forward: digitizing and storing data reduced costs and increased efficiency at the departmental level. While this computerization improved data sharing between departments in some ways, a facility that relies on these marginally connected “data islands” is missing out on many of the benefits of a plant-wide information system capable of integrating data from all those sources.

Troubleshooting with Quality and Process Data

Consider an example of troubleshooting a quality problem in an integrated pulp and paper mill, where product paper reels are produced every 20 to 60 minutes. Several quality tests are run on samples taken from each reel. Suppose that the machine direction (MD) tensile strength was measured as being below the lower acceptable limit for a particular grade on a couple of consecutive reels. With only “data islands” in place, this information would probably be made available to the paper machine operators through an electronic report, and they would be left on their own to figure out the cause and solution for this problem.

With a plant-wide information system in place, the MD strength data could easily be trended next to any number of upstream process variables. A good information system would also be able to “time shift” the quality data, so that the drop in the strength numbers for the reels could be visually matched against changes in other process variables. Doing this, the machine operators would see that the drop in strength had begun before any refining change was made on the machine, so adjusting the refiners could not have caused it.
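The “time shift” operation described above can be sketched in a few lines: upstream points are moved forward by the process transport lag so they line up with the downstream quality results. The lag value and the series contents below are hypothetical, chosen only to mirror the digester-to-reel example:

```python
def time_shift(series, lag_seconds):
    """Shift (epoch_seconds, value) points forward by a transport lag,
    so an upstream variable aligns with downstream quality data."""
    return [(ts + lag_seconds, value) for ts, value in series]

# Hypothetical upstream pulp-strength proxy, shifted by a 3-hour lag
upstream = [(0, 95.0), (3600, 60.0), (7200, 94.0)]
print(time_shift(upstream, 3 * 3600))
# [(10800, 95.0), (14400, 60.0), (18000, 94.0)]
```

In a trending tool this shift is applied visually, letting the operator slide one pen along the time axis until cause and effect overlap.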

A good plant-wide information system such as dataPARC would give the paper machine operators access to variables from outside their area. By casting their troubleshooting net a little farther, the PM operators could see that the time at which the drop in paper strength occurred at the reel closely matched an earlier upset in the digester, which led to the production of 3 hours of over-cooked, low strength pulp.

The Benefits of Process Information and Corresponding Insight

Having this insight would lead to two positive outcomes. Not only would the source of the low strength paper be discovered, but by knowing that it came from outside the paper machine, those operators would not create additional, possibly off-spec product by “chasing their tail” and further changing refiner settings. With the knowledge that the 3 hours of low strength pulp had largely already passed through the machine, they would also know that the strength number would in all likelihood return without the operators making any changes to the stock prep and machine settings. In this case the enhanced data access would lead to good decision making and more efficient operation.

Combining Raw Cost and Process Information

In a manufacturing facility, electrical, fuel and raw material costs originate in Enterprise Resource Planning (ERP) software. These costs are sometimes dynamic, and the ability to access those numbers is an important capability for a plant-wide information system. Some facilities generate and sell electrical power as well as consume it. Having accurate real time cost data helps engineers and operators optimize fuel types, steam generation and electrical power flows to maximize profits.

Additionally, showing actual costs in process trends is a technique used to further operator involvement in optimizing a process. A steam vent of 10,000 lbs per hour may provide a convenient way to operate a given process for a period of time, but it comes at a cost. If that vent flow is displayed as a loss of $100 per hour, based on the flow and the value of the steam, it is much easier to communicate to operators the importance of eliminating that mode of operation.
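The conversion behind that $100-per-hour figure is a simple calculated value of the kind an information system can display alongside the flow. The steam price below is an assumption chosen to be consistent with the example (10,000 lb/hr at $10 per 1,000 lb):

```python
STEAM_COST_PER_KLB = 10.0  # assumed $/1000 lb of steam, for illustration

def vent_loss_per_hour(flow_lb_per_hr):
    """Convert a vented steam flow into a dollars-per-hour loss figure."""
    return flow_lb_per_hr / 1000.0 * STEAM_COST_PER_KLB

print(vent_loss_per_hour(10_000))  # 100.0
```

Trending the dollar figure rather than the raw flow is the point: operators respond to “$100/hr lost” more readily than to “10,000 lb/hr vented.”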

Pushing Process Information

While the previous two examples apply to enhancing the operation of an actual production process, it is equally important that the vital metrics of the process be seamlessly returned to an ERP or other administrative software for ordering and shipping reasons. Modern manufacturing philosophy says that minimizing inventory is one way to reduce costs. As the period of time between the production and shipping of goods or product is reduced, it becomes increasingly important for the shipping planners to have real time information about manufacturing problems which might lead to the inability to meet an order. It is the role of a plant-wide information system to make this interchange of data happen.

Goals of Plant-Wide Information Sharing

As stated above, a plant-wide information system should fulfill two important goals. One is to collect and archive as much data as is needed to operate the plant and allow for effective troubleshooting. Just as importantly, it should present the same conditioned and calculated values, in various formats, to everyone throughout the mill. By using a single set of values, all the decision makers, from planners and engineers to process operators, work with the same up-to-date data.

Contact us to learn more about dataPARC for your plant-wide data integration needs.

Process Data Compression: Why it’s a BAD Idea

Most people are familiar with compressing data files so that they require less storage and are easier to send electronically. Similar concepts are popular with process data historians. With process data, compression means reducing the number of stored data points while trying not to affect the quality of the data. Compression can be accomplished using one of several algorithms (e.g., swinging door, boxcar/backslope). Each algorithm uses some criterion to eliminate points along a run of constant change (slope), within some tolerance.
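To make the idea concrete, here is a minimal sketch of the simplest such scheme, a dead-band filter that discards points within a tolerance of the last stored value. Production historians use more elaborate slope-based tests such as swinging door; this is only an illustration of the discard-within-tolerance principle:

```python
def deadband_compress(points, tolerance):
    """Keep only (timestamp, value) points that differ from the last
    stored value by more than `tolerance`; the rest are discarded."""
    if not points:
        return []
    stored = [points[0]]  # always keep the first point
    for ts, value in points[1:]:
        if abs(value - stored[-1][1]) > tolerance:
            stored.append((ts, value))
    return stored

raw = [(0, 50.0), (1, 50.2), (2, 50.1), (3, 52.0), (4, 52.1)]
print(deadband_compress(raw, 0.5))  # [(0, 50.0), (3, 52.0)]
```

Note that three of the five points are gone; whether the discarded detail was truly “unimportant” is exactly the question the next paragraph raises.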

The main drivers for compression are disk space and network traffic or data retrieval speed. The goal of compression is to remove data that is “unimportant.” Proponents of compression make convincing arguments, such as that the shape of a trend graph remains the same. However, compression has several drawbacks.
Read More

Data Historian Reporting Best Practices

Historian packages were originally intended to help operators and engineers understand and operate manufacturing processes. Current and historical data were constantly displayed on a dedicated screen next to the primary control screens, and users were intended to interact with it at that location more or less continuously. As the historian became a one-stop source for all types of data throughout a facility, it became a tool that could benefit supervisory and management personnel as well. This led to the development of a variety of remote notification and reporting tools to meet the somewhat different needs of these individuals.
Read More