Frequently asked questions in near-infrared spectroscopy analysis – Part 1

Whether you are new to the technique, a seasoned veteran, or just curious about near-infrared spectroscopy (NIRS), Metrohm is here to help you learn all about how to perform the best analysis possible with your instruments.

In this series, we will cover several frequently asked questions regarding both our laboratory NIRS instruments and our line of Process Analysis NIRS products.

1. What is the difference between IR spectroscopy and NIR spectroscopy?

IR (infrared) and NIR (near-infrared) spectroscopy utilize different spectral ranges of light. Light in the NIR range is higher in energy than IR light (Figure 1), which affects the interaction with the molecules in a sample.
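
As a rough back-of-the-envelope comparison (the wavelengths below are only representative examples, not values taken from this post), the photon energy scales inversely with the wavelength:

```latex
% Photon energy as a function of wavelength (h: Planck constant, c: speed of light)
\[
  E = \frac{hc}{\lambda}, \qquad
  \frac{E_{\mathrm{NIR}}}{E_{\mathrm{IR}}}
  = \frac{\lambda_{\mathrm{IR}}}{\lambda_{\mathrm{NIR}}}
  \approx \frac{5~\mu\mathrm{m}}{1~\mu\mathrm{m}} = 5
\]
```

So a NIR photon at around 1 µm carries roughly five times the energy of a mid-IR photon at around 5 µm.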

Figure 1. The electromagnetic spectrum.

This energy difference has both advantages and disadvantages, and the selection of the ideal technology depends very much on the application. Higher-energy NIR light is absorbed less strongly than IR light by most organic materials, and the resulting bands are broad and overlapping, making it difficult to assign them to specific functional groups without mathematical processing.

However, this same feature makes it possible to perform analysis without sample preparation, as there is no need to prepare very thin layers of analyte or use ATR (attenuated total reflection). Additionally, NIRS can quantify the water content in samples up to 15%.

Want to learn more about how to perform faster quality control at lower operating costs by using NIRS in your lab? Download our free white paper here: Boost Efficiency in the QC laboratory: How NIRS helps reduce costs up to 90%.

The weaker absorption of NIR light allows long pathlengths to be used for liquid measurements, which is particularly helpful in industrial process environments. Speaking of such process applications, NIR spectroscopy allows long fiber optic cables to connect the analyzer to the measuring probe, enabling remote measurements throughout the process thanks to the low absorbance of NIR light by the fiber (Figure 2).

Figure 2. Illustration of the long-distance measurement capability of a NIRS process analyzer using low-dispersion fiber optic cables. Many sampling options are available for completely automated analysis, allowing users to gather real-time data for immediate process adjustments.

For more information, read our previous blog post outlining the differences between infrared and near-infrared spectroscopy.

2. NIR spectroscopy is a «secondary technology». What does this mean?

To create prediction models in NIR spectroscopy, the NIR spectra are correlated with parameters of interest, e.g., the water content in a sample. These models are then used during routine quality control to analyze samples.

Values from a reference (primary) method need to be correlated with the NIR spectrum to create prediction models (Figure 3). Because NIR spectroscopy results depend on the availability of such reference values during prediction model development, it is considered a secondary technique.

Figure 3. Correlation plot of moisture content in samples measured by NIRS compared to the same samples measured with a primary laboratory method.
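
A common chemometric figure of merit for the agreement shown in such a correlation plot (a standard definition, not something specified in this post) is the root mean square error of prediction:

```latex
% Root mean square error of prediction for n validation samples
\[
  \mathrm{RMSEP} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^2}
\]
```

Here, ŷ is the value predicted by NIRS, y the value from the primary (reference) method, and n the number of validation samples.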

For more information about how Karl Fischer titration and NIR spectroscopy work in perfect synergy, download our brochure: Water Content Analysis – Karl Fischer titration and Near-Infrared Spectroscopy in perfect synergy.

Read our previous blog posts to learn more about NIRS as a secondary technique.

3. What is a prediction model, and how often do I need to create/update it?

In NIR spectroscopy, prediction models interpret a sample’s NIR spectrum to determine the values of key quality parameters such as water content, density, or total acid number, just to name a few. Prediction models are created by combining sample NIR spectra with reference values from reference methods, such as Karl Fischer titration for water content (Figure 3).
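
As a minimal sketch of how such a prediction model can be built, the example below uses partial least squares (PLS) regression, a commonly used chemometric approach; the post does not prescribe a specific algorithm, and all data, array names, and the number of latent variables here are hypothetical:

```python
# Minimal sketch: building a NIRS prediction model with PLS regression.
# Assumptions: "spectra" is an (n_samples x n_wavelengths) array of NIR
# absorbance spectra and "water_ref" holds the matching Karl Fischer
# reference values (% water). All data shown here are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 500
spectra = rng.normal(size=(n_samples, n_wavelengths))    # placeholder spectra
water_ref = rng.uniform(0.1, 5.0, size=n_samples)        # placeholder KF values (%)

# Split into calibration and validation sets
X_cal, X_val, y_cal, y_val = train_test_split(
    spectra, water_ref, test_size=0.3, random_state=0)

# Fit the PLS model (the number of latent variables is application-dependent)
model = PLSRegression(n_components=5)
model.fit(X_cal, y_cal)

# Predict the validation set and report the RMSEP
y_pred = model.predict(X_val).ravel()
rmsep = np.sqrt(mean_squared_error(y_val, y_pred))
print(f"RMSEP: {rmsep:.3f} % water")
```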

A prediction model built from a sufficient number of representative spectra and reference values is typically created once and only needs an update if the samples begin to vary (for example, after a change in production equipment or process parameters, a new raw material supplier, etc.).

Want to know more about prediction models for NIRS? Read our blog post about the creation and validation of prediction models here.

4. How many samples are required to develop a prediction model?

The number of samples needed for a good prediction model depends on the complexity of the sample matrix and the molecular absorptivity of the key parameter.

For an «easy» matrix, e.g., a halogenated solvent with its water concentration as the measurement parameter, a sample set of 10–20 spectra covering the complete concentration range of interest may be sufficient. For applications that are more complex, we recommend using at least 40–60 spectra in order to build a reliable prediction model.

Find out more about NIRS pre-calibrations built on prediction models and how they can save time and effort in the lab.

5. Which norms describe the use of NIR in regulated and non-regulated industries?

Norms describing how to implement a near-infrared spectroscopy system in a validated environment include USP <856> and USP <1856>. For non-regulated environments, ASTM E1655 is a general norm describing how to create prediction models and the basic requirements for near-infrared spectroscopy systems. Method validation and instrument validation are guided by ASTM D6122 and ASTM D6299, respectively.

Figure 4. Different steps for the successful development of quantitative methods according to international standards.

For specific measurements, e.g. RON and MON analysis in fuels, standards such as ASTM D2699 and ASTM D2700 should be followed.

For further information, download our free Application Note: Quality Control of Gasoline – Rapid determination of RON, MON, AKI, aromatic content, and density with NIRS.

6. How can NIRS be implemented in a production process?

Chemical analysis in process streams is not always a simple task. The chemical and physical properties of the sample streams, such as viscosity and flammability, can interfere with the measurements. Some industrial processes are quite delicate: even the slightest changes to the process parameters can lead to significant variability in the properties of the final products. Therefore, it is essential to measure the properties of the stream continuously and adjust the processing parameters via rapid feedback to ensure a consistent, high-quality product.

Figure 5. Example of the integration of inline NIRS analysis in a fluid bed dryer of a production plant.

Curious about this type of application? Download it for free from the Metrohm website!

The use of fiber optic probes in NIRS systems has opened up new perspectives for process monitoring. A suitable NIR probe connected to the spectrometer via optical fiber allows direct online and inline monitoring without interference in the process. Currently, a wide variety of NIR optical probes are available, from transmission pair probes and immersion probes to reflectance and transflectance probes, suitable for contact and non-contact measurements. This diversity allows NIR spectroscopy to be applied to almost any kind of sample composition, including melts, solutions, emulsions, and solid powders.

Selecting the right probe, or sample interface, to use with a NIR process analyzer is crucial to successful inline or online process monitoring. Depending on whether the sample is in a liquid, solid, or gaseous state, reflectance, transflectance, or transmission probes are used, and specific fitting attachments connect the probes to the reactor, tank, or pipe. With more than 45 years of experience, Metrohm Process Analytics can design the best solution for your process.

Visit our website to find a selection of free Application Notes to download related to NIRS measurements in industrial processes.

7. How can product quality be optimized with process NIRS?

Regular control of key process parameters is essential to comply with product and process specifications and helps attain optimal product quality and consistency in any industry. NIRS analyzers can provide data every 30 seconds for near real-time monitoring of production processes.

Figure 6. The Metrohm Process Analytics NIRS XDS Process Analyzer, shown here with multiplexer option allowing up to 9 measuring channels. Here, both microbundle (yellow) and single fiber (blue) optical cables are connected, with both a reflectance probe and transmission pair configured.

Using NIRS process analyzers is not only preferable for 24/7 monitoring of the manufacturing process, it is also extremely beneficial for inspecting the quality of raw materials and reagents. By providing data in «real-time» to the industrial control system (e.g., DCS or PLC), any process can be automated based on the NIRS data. As a result, downtimes are reduced, unforeseen situations are avoided, and costly company assets are safeguarded.
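
As a purely illustrative sketch of the feedback described above, the loop below reacts when the NIRS prediction drifts outside a tolerance band; read_nirs_prediction() and write_setpoint_to_plc() are hypothetical placeholders for the analyzer output and the DCS/PLC interface, and no specific communication protocol is implied by the post:

```python
# Minimal sketch of automated process readjustment based on NIRS data.
# read_nirs_prediction() and write_setpoint_to_plc() are hypothetical
# placeholders for the analyzer output and the DCS/PLC interface.
import time

TARGET = 2.0        # desired value of the monitored parameter (e.g., % water)
TOLERANCE = 0.2     # acceptable deviation before the process is readjusted
GAIN = 0.5          # proportional correction factor (illustrative only)

def read_nirs_prediction() -> float:
    """Placeholder: return the latest value predicted by the NIRS analyzer."""
    raise NotImplementedError

def write_setpoint_to_plc(correction: float) -> None:
    """Placeholder: send a setpoint correction to the control system."""
    raise NotImplementedError

def control_loop(poll_interval_s: float = 30.0) -> None:
    """Poll the analyzer (e.g., every 30 s) and react to deviations."""
    while True:
        value = read_nirs_prediction()
        deviation = value - TARGET
        if abs(deviation) > TOLERANCE:
            # Simple proportional correction; a real plant would use its own DCS/PLC logic
            write_setpoint_to_plc(-GAIN * deviation)
        time.sleep(poll_interval_s)
```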

Furthermore, the software included with Metrohm Process Analytics NIRS instruments has a built-in chemometric package that allows qualification of a product even while it is still being produced. A report is then generated that can be used directly by the QC manager. As a result, product quality consistency improves, which can translate into added revenue.

Do you want to learn more about improving product quality with online or inline NIRS analysis? Take a look at our brochure!

In the next part of this FAQ, we will cover even more of your burning questions regarding NIRS for lab and process measurements. Don’t forget to subscribe to the blog so you don’t miss out on future posts!

Want to learn more about NIR spectroscopy and potential applications? Have a look at our free and comprehensive application booklet about NIR spectroscopy.

Download our Monograph

A guide to near-infrared spectroscopic analysis of industrial manufacturing processes

Post written by Dr. Nicolas Rühl (Product Manager Spectroscopy at Metrohm International Headquarters, Herisau, Switzerland) and Dr. Alexandre Olive (Product Manager Process Spectroscopy at Metrohm Applikon, Schiedam, The Netherlands).

Forewarned is Forearmed: Error and risk minimization in process analysis – Part 3

In the course of life, each of us learns to trust our gut feelings or our experiences to avoid situations that seem dangerous or risky. You quite literally sense potential dangers with an uneasy feeling. Who hasn’t painfully learned that touching a hot stove top isn’t a good idea? Or who voluntarily goes outside during a tornado?

While humans can rely on their intuition and learned patterns to avoid dangers or use protective strategies, this is far more complicated with electronic systems or machines. All components of a system must be in a permanently safe state. Failures and malfunctions of individual components can have devastating consequences for production processes and the safety of the operators.

An example of this is the Seveso disaster in 1976, in which the highly toxic dioxin TCDD escaped as a result of an uncontrolled reaction and caused lasting damage to flora and fauna. In response to this and other major chemical accidents, the European Seveso III Directive came into force in 2012 to control major-accident hazards and prevent such accidents.

Have you read Part 1 and Part 2 of our «Advantages of PAT (Process Analytical Technology)» series? If not, find them here!

Recognize, master, and avoid errors

Process engineering systems that are operated continuously contain countless components that can wear out or fail during their life cycle. However, if the measuring, control, or regulating circuit is affected, failures can cause immense damage. Under no circumstances should humans or the environment be exposed to any kind of danger. For this reason, the functional safety of the components must be guaranteed, and their risk and hazard potential must be analyzed in detail.

The service life of mechanical components can be evaluated by observing mechanical wear and tear. However, the aging behavior of electronic components is difficult to assess. A unit of measure that makes risk reduction and thus functional safety quantifiable is the so-called «Safety Integrity Level» (SIL). 

The following procedure is followed:

  1.   Risk analysis
  2.   Realization of risk reduction
  3.   Evidence that the realized risk reduction corresponds at least to the required risk reduction

«Process analysis systems are part of the entire safety cycle of a manufacturing plant and therefore only one component whose risk of malfunctions and failures must be considered in an assessment.»

Risk assessment: a process is considered safe if the current risk has been reduced below the level of the tolerable risk. If safety is ensured by technical measures, one speaks of functional safety.
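
A minimal sketch of the comparison described above, i.e., whether the realized risk reduction is at least as large as the required one; the numbers are purely illustrative and not taken from any standard:

```python
# Minimal sketch: is the realized risk reduction at least as large as required?
# Values are purely illustrative (arbitrary risk units per year).
def required_risk_reduction(current_risk: float, tolerable_risk: float) -> float:
    """Risk reduction factor needed to bring the current risk below the tolerable risk."""
    return current_risk / tolerable_risk

def is_functionally_safe(current_risk: float, tolerable_risk: float,
                         realized_reduction: float) -> bool:
    """True if the technical measures reduce the risk at least as much as required."""
    return realized_reduction >= required_risk_reduction(current_risk, tolerable_risk)

# Example: current risk 1e-2, tolerable risk 1e-4 -> a reduction factor of at
# least 100 is required; a realized factor of 250 would therefore be sufficient.
print(is_functionally_safe(current_risk=1e-2, tolerable_risk=1e-4,
                           realized_reduction=250))   # True
```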

Significance for process analysis systems

Errors can happen anywhere, and can never be completely excluded. To minimize possible errors, it is therefore necessary to estimate the risk of occurrence and the damage to be expected from it as part of a risk analysis. A distinction must be made here between systematic and random errors.

Systematic errors are potentially avoidable and are caused, for example, by software errors or configuration deficiencies. Accordingly, they already exist during or prior to commissioning.

In contrast, random errors are potentially difficult to avoid because they occur arbitrarily. Nevertheless, the error rate or failure probability can be determined statistically and experimentally.

Random errors usually result from the hardware and occur during operation. Ultimately, systematic errors should be avoided, and random errors should be mastered to ensure trouble-free functionality.
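
One standard way to quantify the failure probability of such random (hardware) errors, given here only as a generic illustration rather than as part of the original text, is the constant-failure-rate model:

```latex
% Constant-failure-rate (exponential) model for random hardware failures
% R(t): survival probability, F(t): failure probability, \lambda: failure rate
\[
  R(t) = e^{-\lambda t}, \qquad
  F(t) = 1 - e^{-\lambda t}, \qquad
  \mathrm{MTBF} = \frac{1}{\lambda}
\]
```

The failure rate λ is exactly the quantity that can be estimated statistically and experimentally, as mentioned above.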

Process analysis systems are the link between manual laboratory analysis and the industrial process. In applications where continuous and fully automatic monitoring of critical parameters is required, process analyzers are indispensable. Because the analysis conditions in the laboratory and directly in the process differ, there are some challenges when transferring the measurement technology from the lab to the process. The decisive factors are the working and environmental conditions (e.g., high temperatures, corrosive atmospheres, moisture, dust, or potentially explosive environments), which the process analyzers must withstand in terms of their design, construction materials, and the reliability of their components. The analyzer automatically and continuously transmits system and diagnostic data so that hardware or software failures can be prevented through preventive measures. This significantly reduces the chance of random errors occurring.

General process analyzer setup

a) Analyzer Setup

Process analyzers have been specially developed for use in harsh and aggressive industrial environments. The IP66-rated housing is divided into two separate sections: a wet part and an electronics part. The electronics part contains all components needed to control and operate the process analyzer. Modular components like burettes, valves, pumps, sampling systems, titration vessels, and electrodes can be found in the wet part. Representative samples can thus be taken from a process measuring point several meters away. The analysis procedure, the methods to be used, and the method calculations are freely programmable.

A touchscreen with intuitive menu navigation allows easy operation, so that production processes can be optimized at any time. The course of the measurement is graphically represented and documented over the entire determination, so that the analysis process is completely controlled. The measurement results can be generated 24/7 and allow close and fully automatic monitoring of the process. Limits, alarms, or results are reliably transferred to the process control system.

When operating the analyzer, there is a risk that software errors can lead to failures. In order to recognize this with foresight, the system runs self-diagnostic procedures as soon as it is powered on and also during operation. These include, e.g., checking pumps and burettes, checking for leaks, or checking the communication between the I/O controller, the human interface, and the respective analysis module.
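
A minimal sketch of what such a self-check sequence could look like in software; the check names mirror the examples in the text, but every function here is a hypothetical placeholder rather than an actual analyzer interface:

```python
# Minimal sketch of a power-on / runtime self-diagnostic sequence.
# The individual checks are hypothetical placeholders mirroring the examples
# in the text (pumps, burettes, leak test, module communication).
from typing import Callable, Dict

def check_pumps() -> bool: return True                 # placeholder
def check_burettes() -> bool: return True              # placeholder
def check_for_leaks() -> bool: return True             # placeholder
def check_module_communication() -> bool: return True  # placeholder

DIAGNOSTICS: Dict[str, Callable[[], bool]] = {
    "pumps": check_pumps,
    "burettes": check_burettes,
    "leak test": check_for_leaks,
    "I/O controller <-> analysis module": check_module_communication,
}

def run_self_diagnostics() -> bool:
    """Run all checks, report any failures, and return True only if all pass."""
    all_ok = True
    for name, check in DIAGNOSTICS.items():
        if not check():
            print(f"Self-diagnostic failed: {name}")
            all_ok = False
    return all_ok
```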

b) Sensors

The central component of a process analyzer is the measurement technique in use. Sensors or electrodes must meet several requirements, such as chemical resistance, ease of maintenance, robustness, and precision. The safety-related risk arises from the possibility that measurement sensors fail due to aging, or become damaged and subsequently deliver incorrect measurement results.

Failure of the electrode, contamination, or damage must be reported immediately. With online analysis systems, the analysis is performed in an external measuring cell. In addition, recurring calibration and conditioning routines are predefined and are performed automatically. The status of the electrode is continuously monitored by the system.

Between measurements, the electrode is immersed in a membrane-friendly storage solution that prevents drying out and at the same time regenerates the swelling layer. The electrode is therefore always ready for use and does not have to be removed from the process for maintenance. This enables reliable process control even under harsh industrial conditions.

c) Analysis

Process analyzers must be able to handle samples for analysis over a wide concentration range (from % down to trace levels) without causing carry-over or cross-sensitivity issues. In many cases, different samples from several measuring points are determined in parallel in one system using different analysis techniques. The sample preparation (e.g., filtering, diluting, or wet chemical digestion) must be just as reliable and smooth as the fully automatic transfer of results to the process control system so that a quick response is possible.

Potential dangers for the entire system can be caused by incorrect measurement results. To minimize this risk, a detector notifies the system when sample is present in the vessel. Diagnostic data, such as the initial potential of the analysis, the titration curves, or the color development in photometric measurements, are continuously recorded and interpreted. Results can be verified by reference analysis, or their plausibility can be checked using standard and check solutions.
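
As a small illustration of the plausibility check with standard and check solutions mentioned above (the tolerance and values are hypothetical, not a Metrohm specification):

```python
# Minimal sketch: plausibility check of an analyzer result against a check
# standard of known concentration. The tolerance is purely illustrative.
def is_plausible(measured: float, nominal: float, tolerance_pct: float = 2.0) -> bool:
    """True if the measured value lies within +/- tolerance_pct of the nominal value."""
    return abs(measured - nominal) <= nominal * tolerance_pct / 100.0

# Example: check solution with a nominal value of 10.0 (arbitrary units)
print(is_plausible(measured=10.1, nominal=10.0))   # True
print(is_plausible(measured=10.5, nominal=10.0))   # False
```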

Detect errors before they arise

The risk assessment procedures that are carried out in the context of a SIL classification for process engineering plants are ultimately based on mathematical calculations. However, in the 24/7 operation of a plant, random errors can never be completely excluded. Residual risk always remains. Therefore, the importance of preventive maintenance activities is growing immensely in order to avoid hardware and software failures during operation.

A regular check of the process analyzer and its diagnostic data is the basic requirement for permanent, trouble-free operation. With tailor-made maintenance and service concepts, the analyzer is supported by certified service engineers over the entire life cycle. Regular maintenance plans, application support, calibration, or performance certificates, repairs, and original spare parts as well as proper commissioning are just a few examples.

Advantages of preventive maintenance from Metrohm Process Analytics

  • Preservation of your investment
  • Minimized risk of failure
  • Reliable measurement results
  • Calculable costs
  • Original spare parts
  • Fast repair
  • Remote Support

In addition, transparent communication between the process control system and the analyzer is also relevant in the context of digitalization. The collection of performance data from the analyzer to assess the state of the control system is only one component. The continuous monitoring of relevant system components allows conclusions to be drawn about any necessary maintenance work, which ideally should be carried out at regular intervals. The question arises as to how the collected data are interpreted and how quickly intervention is necessary. Software care packages help to test the software according to the manufacturer’s specifications and to perform data backups and software maintenance.

«Remote support is particularly important in times when you cannot always be on site.»

In real emergency situations in which rapid error analysis is required, manufacturers can easily support the operator remotely using remote maintenance solutions. The system availability is increased, expensive failures and downtimes are avoided, and the optimal performance of the analyzer is ensured.

Read what our customers have to say!

We have supported customers even in the most unlikely of places: from the production floor to the desert and even on active ships!

Post written by Dr. Kerstin Dreblow, Product Manager Wet Chemical Process Analyzers, Deutsche Metrohm Prozessanalytik (Germany).

Making a better beer with chemistry

Lager or ale? Pale ale or stout? Specialty beer, or basic draft? This week, to celebrate the International Beer Day on Friday, August 7th, I have chosen to write about a subject near and dear to me: how to make a better beer! Like many others, at the beginning of my adult life, I enjoyed the beverage without giving much thought to the vast array of styles and how they differed, beyond the obvious visual and gustatory senses. However, as a chemist with many chemist friends, I was introduced at several points to the world of homebrewing. Eventually, I succumbed.

Back in 2014, my husband and I bought all of the accessories to brew 25 liters (~6.5 gallons) of our own beer at a time. The entire process is controlled by us, from designing a recipe and milling the grains to sanitizing and bottling the finished product. We enjoy being able to develop the exact bitterness, sweetness, mouthfeel, and alcohol content for each batch we brew.

Over the years we have become more serious about this hobby by optimizing the procedure and making various improvements to the setup – including building our own temperature-controlled fermentation fridge managed by software. However, without an automated system, we occasionally run into issues with reproducibility between batches when using the same recipe. This is an issue that every brewer can relate to, no matter the size of their operation.

Working for Metrohm since 2013 has allowed me to have access to different analytical instrumentation in order to check certain quality attributes (e.g., strike water composition, mash pH, bitterness). However, Metrohm can provide much more to those working in the brewing industry. Keep reading to discover how we have improved analysis at the largest brewery in Switzerland.

Are you looking for applications in alcoholic beverages? Check out this selection of FREE Application Notes from Metrohm:

Lagers vs. Ales

There are two primary classes of beer: lagers and ales. The major contrast between the two is the type of yeast used for the fermentation process. Lagers must be fermented at colder temperatures, which lends crisp flavors and low ester formation. However, colder processes take longer, and so fermentation steps can last for some months. Ales have a much more sweet and fruity palate of flavors and are much easier to create than lagers, as the fermentation takes place at warmer temperatures and happens at a much faster rate.

Comparison between the fermentation of lagers and ales.

Diving a bit deeper, there are several styles of beer, from light pilsners and pale ales to porters and black imperial stouts. The variety of colors and flavors depend mostly on the grains used during the mash, which is the initial process of soaking the milled grains at a specific temperature (or range) to modify the starches and sugars for the yeast to be able to digest. The strain of yeast also contributes to the final flavor, whether it is dry, fruity, or even sour. Taking good care of the yeast is one of the most important parts of creating a great tasting beer.

Brewing terminology

  • Malting: process of germinating and kilning barley to produce usable sugars in the grain
  • Milling: act of grinding the grains to increase surface area and optimize extraction of sugars
  • Mashing: releasing malt sugars by soaking the milled grains in (hot) water, providing wort
  • Wort: the solution of extracted grain sugars
  • Lautering: process of clarifying wort after mashing
  • Sparging: rinsing the used grains to extract the last amount of malt sugars
  • Boiling: clarified wort is boiled, accomplishing sterilization (hops are added in this step)
  • Cooling: wort must be cooled well below body temperature (37 °C) as quickly as possible to avoid infection
  • Pitching: prepared yeast (dry or slurry) is added to the cooled brewed wort, oxygen is introduced
  • Fermenting: the process whereby yeast consumes simple sugars and excretes ethanol and CO2 as major products

Ingredients for a proper beer

These days, beer can contain several different ingredients and still adhere to a style. Barley, oats, wheat, rye, fruit, honey, spices, hops, yeast, water, and more are all components of our contemporary beer culture. However, in Bavaria during the 1500s, the rules were much stricter. A purity law known as the Reinheitsgebot (1516) stated that beer must be produced only with water, barley, and hops. Other adjuncts were not allowed, which meant that grains such as rye and wheat were forbidden in the brewing process. We all know how seriously the Germans take their beer – you only need to visit the Oktoberfest once to understand!

The bitterness compounds in hops, known as «alpha acids», can be easily determined with Metrohm instrumentation. Check out our brochure for more information:

You may have noticed that yeast was not one of the few ingredients mentioned in the purity law; however, it was still essential to the brewing process. The yeast was simply harvested at the end of each batch and added into the next, and its propagation during fermentation always ensured there was enough each time. Ensuring the health of the yeast is integral to fermentation and to the quality of the final product. With proper nutrients, oxygen levels, stable temperatures, and a supply of simple digestible sugars, alcohol contents up to 25% (and even beyond) can be achieved with some yeast strains without distillation (whether through heating or freezing, as for eisbocks).

Improved quality with analytical testing

Good beers do not make themselves. For larger brewing operations, which rely on consistency in quality and flavor between large batch volumes as well as across different countries, comprehensive analytical testing is the key to success.

Metrohm is well-equipped for this task, offering many solutions for breweries large and small.

Don’t take it from me – listen to one of our customers, Jules Wyss, manager of the Quality Assurance laboratory at Feldschlösschen brewery, the largest brewery in Switzerland.

«I have decided to go with Metrohm, because they are the only ones who are up to such a job at all. They share with us their huge know-how.

I can’t think of any other supplier who would have been able to help me in the same way.»

Jules Wyss

Manager Quality Assurance Laboratory, Feldschlösschen Getränke AG

Previous solutions failed

For a long time, Jules determined the quality parameters in his beer samples using separate analysis systems: a titrator, HPLC system, alcohol measuring device, and a density meter. These separate measurements involved a huge amount of work: not only the analyses themselves, but also the documentation and archiving of the results all had to be handled separately. Furthermore, Jules often had to contend with unreliable results – depending on the measurement procedure, he had to analyze one sample up to three times in order to obtain an accurate result.

A tailor-made system for Feldschlösschen

Jules’ close collaboration with Metrohm has produced a system that takes care of the majority of the necessary measurements. According to Jules, the system can determine around 90% of the parameters he needs to measure. Jules’ new analysis system combines various analysis techniques: ion chromatography and titration from Metrohm as well as alcohol, density, and color measurement from another manufacturer. They are all controlled by the tiamo titration software. This means that bitterness, citric acid, pH value, alcohol content, density, and color can all be determined by executing a single method in tiamo.

Measurement of the overall water quality as well as downstream analysis of the sanitization process on the bottling line is also possible with Metrohm’s line of Process Analysis instrumentation.

Integrated analytical systems with automated capabilities allow for a «plug and play» determination of a variety of quality parameters for QA/QC analysts in the brewing industry. Sample analysis is streamlined and simplified, and throughput is increased via the automation of time-consuming preparative and data collection steps, which also reduces the chance of human error.

Something to celebrate: The Metrohm 6-pack (2018)

In 2018, Metrohm celebrated its 75 year Jubilee. At this time, I decided to combine my experience as a laboratory analyst and as a marketing manager to brew a series of six different styles of beer for the company, as a giveaway for customers of our Metrohm Process Analytics brand, for whom I worked at the time. Each batch was brewed to contain precisely 7.5% ABV (alcohol by volume), echoing the 75 year anniversary. The array of ales was designed to appeal to a broad audience, featuring a stout, porter, brown ale, red ale, hefeweizen, and an India pale ale (IPA). Each style requires different actions, especially during the mashing process, based on the type of grains used and the desired outcome (e.g., flavor balance, mouthfeel, alcohol content).
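
Hitting a target ABV like this is typically done from gravity readings; a common homebrewing approximation (not described in this post) is:

```latex
% Common homebrewing approximation (OG: original gravity, FG: final gravity)
\[
  \mathrm{ABV} \approx (\mathrm{OG} - \mathrm{FG}) \times 131.25
\]
% Example: OG = 1.075 and FG = 1.018 give ABV of about 0.057 x 131.25 = 7.5 %
```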

Bespoke bottle caps featuring the Metrohm logo.
The 6 styles of beers brewed as a special customer giveaway to celebrate the Metrohm 75 year Jubilee.

Using a Metrohm Ion Chromatograph, I analyzed my home tap water for concentrations of major cations and anions to ensure no extra salts were needed to adjust it prior to mashing. After some of the beers were prepared, I tested my colleagues at Metrohm International Headquarters in the IC department, to see if they could determine the difference between two bottles with different ingredients:

Overlaid chromatograms from IC organic acid analysis highlighting the differences between 2 styles of the Metrohm 75 year Jubilee beers.

The IC analysis of organic acids and anions showed a clear difference between the beers, allowing them to determine which sample corresponded to which style, since I did not label them prior to shipping the bottles for analysis. As the milk stout contained added lactose, this peak was very pronounced and a perfect indicator to use.

Metrohm ion chromatography, along with titration, NIRS, and other techniques, allows for reliable, comprehensive beer analysis for all.

In conclusion, I wish you a very happy International Beer Day this Friday. Hopefully this article has illuminated the various ways that beer and other alcoholic beverages can be analytically tested for quality control parameters and more – quickly, easily, and reliably – with Metrohm instrumentation.

For more information about the beer quality parameters measured at Feldschlösschen brewery, take a look at our article: «In the kingdom of beer – The largest brewery in Switzerland gets a made-to-measure system». Cheers!

Read the full article:

«In the kingdom of beer – The largest brewery in Switzerland gets a made-to-measure system»

Post written by Dr. Alyson Lanciki, Scientific Editor (and «chief brewing officer») at Metrohm International Headquarters, Herisau, Switzerland.

The role of process automation in an interconnected world – Part 2

The following scenario sounds like a fictional dystopian narrative, but it is a lived reality. A catastrophe, much like the current COVID-19 crisis, is dramatically impacting society. Normality as it was known before has suddenly changed: streets are deserted, shops are closed, and manufacturing is reduced or at a complete standstill. But what happens to safety-related systems, e.g., in the pharmaceutical or food industry, which must not stand still and are designed in such a way that they cannot fail? How can the risk of breakdowns and downtimes be minimized? Or in the event of failure, how can the damage to people and the environment be limited and, in general, the operational sequence maintained?

Digitalization: curse or blessing? 

When considering process engineering plants, one is repeatedly confronted with buzzwords such as «Industry 4.0», «digitalization», «digital transformation», «IoT», «smart manufacturing», etc. The topic is often discussed controversially, and it frequently comes down to an either-or dichotomy: human or machine, and the fears associated with it. No matter what name you give to digitalization, each term here has one thing in common: intelligently networking separate locations and processes in industrial production using modern information and communication technologies. Process automation is a small but important building block that needs attention. Data can only be consistently recorded, forwarded, and reproduced with robust and reliable measurement technology.

For some time already, topics such as sensors, automation, and process control have been discussed in the process industry under the heading of PAT, with the aim of reducing downtimes and optimizing the use of resources. However, it is not just about the pure collection of data, but also about its meaningful interpretation and integration into the QM system. Only consistent assessment and evaluation can lead to a significant increase in efficiency and optimization.

This represents a real opportunity to maintain production processes with reduced manpower in times of crisis. The results of the relevant analyses are transferred to the process fully automatically. This enables high availability and rapid intervention, as well as the assurance of high quality requirements for both process security and process optimization. In addition, online monitoring of all system components and preventive maintenance activities effectively counteract failures.

Digitally networked production plants

Even though digitalization is relatively well-established in the private sector under the catchphrase «smart home», in many production areas the topic is still very much in its infancy. Intelligently networking different processes places high demands on the measurement technology. Process analysis systems make a major contribution to the analysis of critical parameters, and forwarding the data to the control room is crucial for process control and optimization. In order to correspond to the state of the art, process analysis systems must meet the following requirements:

Transparent communication / operational maintenance

Processes must be continuously monitored and plant safety guaranteed. Downtimes are associated with high costs and therefore cannot be tolerated. In order to effectively minimize the risk of failures, device-specific diagnostic data must be continuously transmitted as part of the self-check, or failures must be prevented with the help of preventive maintenance activities. Ideally, the response is quick and faults are remedied, even remotely, without having to shut down the system.

Future-proof automation

If you consider how many years (or even decades) process plants are in operation, it is self-explanatory that extensions and optimizations must be possible within their lifetime. This includes both the implementation of state-of-the-art analyzers and the communication between the systems.

Redundant systems

In order to prevent faults from endangering the entire system operation, redundancy concepts are generally used.

Practical example: Smart concepts for fermentation processes

Fermenters or bioreactors are used in a wide variety of industries to cultivate microorganisms or cells. Bacteria, yeasts, mammalian cells, or their components serve as important active ingredients in pharmaceuticals or as basic chemicals in the chemical industry. Bioreactors also assist degradation processes in wastewater treatment, and brewing kettles in beer production can be considered a kind of bioreactor as well. In order to meet the high requirements for a corresponding product yield and to maintain the ideal conditions for proper metabolism, critical parameters have to be checked closely, and often.

The conditions must be optimally adapted to those of the organism’s natural habitat. In addition to the pH value and temperature, this also includes the composition of the matrix, the turbidity, or the content of O2 and CO2. The creation of optimal environmental conditions is crucial for a successful cultivation of the organisms. The smallest deviations have devastating consequences for their survival, and can cause significant economic damage.

As a rule, many of the parameters mentioned are measured directly in the medium using inline probes and sensors. However, their application has a major disadvantage. Mechanical loads (e.g., glass breakage) or solids can lead to rapid material wear and contaminated batches, resulting in high operational costs. With the advent of smart technologies, online analysis systems and maintenance-free sensors have become indispensable to ensure the survival of the microorganisms. In this way, reliably measured values are delivered around the clock, and it is ensured that these are transferred directly to all common process control systems or integrated into existing QM systems.

Rather than manual offline measurement in a separate laboratory, the analysis is moved to an external measuring cell. The sample stream is fed to the analysis system by suction with peristaltic pumps or bypass lines. Online analysis not only enables the possibility of 24/7 operation and thus a close control of the critical parameters, but also the combination of different analysis methods and the determination of further parameters. This means that several parameters as well as multiple measuring points can be monitored with one system.

The heart of the analysis systems is the intelligent sensor technology, whose robustness is crucial for the reliable generation of measured values.

pH measurement as a vital key parameter in bioreactors

Knowledge of the exact pH value is crucial for the product yield, especially in fermentation processes. The activity of the organism and its metabolism are directly dependent on the pH value. The ideal conditions for optimal cell growth and proper metabolism are within a limited pH tolerance range, which must be continuously monitored and adjusted with the help of highly accurate sensors.

However, the exact measurement of the pH value is subject to a number of chemical, physical, and mechanical influencing factors, which means that the determination with conventional inline sensors is often too imprecise and can lead to expensive failures for users. For example, compliance with hygiene measures is of fundamental importance in the pharmaceutical and food industries. Pipelines in the production are cleaned with solutions at elevated temperatures. Fixed sensors that are exposed to these solutions see detrimental effects: significantly reduced lifespan, sensitivity, and accuracy.

Intelligent and maintenance-free pH electrodes

Glass electrodes are most commonly used for pH measurement because they are still by far the most resistant, versatile, and reliable solution. However, in many cases changes due to aging processes or contamination in the diaphragm remain undetected. Glass breakage also poses a high risk, because it may result in the entire production batch being discarded.

The aging of the pH-sensitive glass relates to the change in the hydration layer, which becomes thicker as time goes on. The consequences are a sluggish response, drift effects, or a decrease in slope. In this case, calibration or adjustment with suitable buffer solutions is necessary. Especially if there are no empirical values available, short calibration intervals are recommended, which significantly increase the effort for maintenance work.
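
As a small illustration of such a two-point calibration check: the theoretical Nernst slope at 25 °C is about −59.2 mV per pH unit, and the acceptance limit used below is illustrative only, not a Metrohm specification.

```python
# Minimal sketch: two-point pH electrode calibration check.
# The theoretical (Nernst) slope at 25 °C is about -59.2 mV per pH unit;
# the 95 % acceptance limit below is illustrative, not a Metrohm specification.
NERNST_SLOPE_MV_PER_PH = -59.2

def electrode_slope(ph1: float, mv1: float, ph2: float, mv2: float) -> float:
    """Slope in mV per pH unit calculated from readings in two buffer solutions."""
    return (mv2 - mv1) / (ph2 - ph1)

def needs_recalibration(slope_mv_per_ph: float, min_fraction: float = 0.95) -> bool:
    """Flag the electrode if its slope drops below a fraction of the Nernst value."""
    return abs(slope_mv_per_ph) < min_fraction * abs(NERNST_SLOPE_MV_PER_PH)

# Example: readings of +177 mV in pH 4.01 buffer and -170 mV in pH 10.01 buffer
slope = electrode_slope(ph1=4.01, mv1=177.0, ph2=10.01, mv2=-170.0)
print(f"Slope: {slope:.1f} mV/pH, recalibrate: {needs_recalibration(slope)}")
```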

With online process analyzers, the measurement is transferred from the process to an external measuring cell. This enables a long-lasting pH measurement to be achieved with an accuracy that is not possible with classic inline probes.

In many processes, measurement with process sensors takes place directly in the medium. This inevitably means that the calibration and maintenance of electrodes is particularly challenging in places that are difficult to access, leading to expensive maintenance work and downtimes. Regular calibration of the electrodes is recommended, especially when they are used under extreme conditions or at the edge of the defined specifications.

If the measurement is carried out with online process analyzers, then calibration, adjustment and cleaning are carried out fully automatically. The system continuously monitors the condition of the electrode. Between measurements, the electrode is immersed in a membrane-friendly storage solution that avoids drying out, and at the same time prevents the hydration layer from swelling further as it does not contain alkali ions. The electrode is always ready for use and does not have to be removed from the process for maintenance work.

The 2026 pH Analyzer from Metrohm Process Analytics is a fully automatic analysis system, e.g., for determining the pH value as an individual process parameter.

Maintenance and digitalization

In addition to the automatic monitoring of critical process parameters, transparent communication between the system and the analyzer also plays a decisive role in terms of maintenance measures. The collection of vital data from the analyzer to assess the state of the system is only one component. The continuous monitoring of relevant system components enables conclusions to be drawn about any necessary maintenance work. For example, routine checks on the condition of the electrodes (slope / zero point check, possibly automatic calibration) are carried out regularly during the analysis process. Based on the data, calibration and cleaning processes are performed fully automatically, which allow robust measurement even at measuring points that are difficult to access or in aggressive process media. This means that the operator is outside the danger zone, which contributes to increased safety.

Summary

The linking of production processes with digital technology holds particularly large potential and contributes to the economic security of companies. In addition, the pressure on companies to face the demands of digitalization in production is growing steadily. As an example, in the area of fermentation processes, the survival of the microorganisms is ensured by closely monitoring the relevant parameters. Intelligent systems increase the degree of automation and can make the process along the entire value chain more efficient.

Find out in the next installment how functional safety concepts help you to act before a worst-case scenario, in which errors occur and systems fail, comes true.

Want to learn more about the history of process analysis technology at Metrohm? Check out our previous blog posts:

Read what our customers have to say!

We have supported customers even in the most unlikely of places: from the production floor to the desert and even on active ships!

Post written by Dr. Kerstin Dreblow, Product Manager Wet Chemical Process Analyzers, Deutsche Metrohm Prozessanalytik (Germany).

To automate or not to automate? Advantages of PAT – Part 1

I have to admit that the technological world of process analysis seemed foreign to me for a while. When I first heard about process automation, I imagined futuristic robots doing the work, similar to modern science fiction films. Perhaps many people have the same impression.

There is often a great deal of uncertainty about what the expression «we automate your process» actually means. In this blog series, I want to show you that process analytical technology (PAT) is less complicated than expected and offers several advantages for users.

What does process analytical technology (PAT) mean? 

I was once told in conversation:

«Process analytics is for everyone who believes that they don’t need it.»

There is definitely truth in this statement, and it certainly shows the abundance of application possibilities. At the same time, it should be considered that in the future, users of process analytical technology will not only invest in conventional measurement technologies (e.g., direct measurement, TDLAS, GC), but also increasingly in the determination of substance properties and material compositions.

Pollutants (gases and aerosols) in ambient air are especially harmful to human health. These substances can be continuously and reliably monitored by process analyzers.

PAT serves to analyze, optimize, and ultimately control processes and their critical parameters. This control makes a major contribution to quality assurance and the overall process reliability at the manufacturer. Thinking back to some well-known chemical disasters (e.g. Minamata, Toulouse, or Tianjin) in which poisonous substances were released, causing immense damage to people and the environment, the importance regarding regular monitoring of critical parameters becomes abundantly clear. The list of analytes that can and must be monitored is long, ranging from contamination in wastewater due to municipal or industrial wastewater treatment plants, to pharmaceutical agents, to gases and aerosols in the ambient air.

From Lab to Process

Considering the history of manufacturing and other industrial processes, it is clear that the ultimate goal is to increase throughput in ever-shorter timeframes, with an eye on safety measures and minimization of costs where possible. Independence through automation and fast, reliable data transfer is a high priority.

In order to make the process economically viable along the entire value chain, the resulting products should be manufactured at the highest quality in a short time and with minimal raw material and energy usage. For 24/7 operations in particular, knowledge of the composition of the starting materials and intermediate products (or rather, any impurities) is essential for optimal process control and reliability.

How can reliable process monitoring be ensured around the clock? Very few companies have in-house laboratories with a true three-shift operation, and many send their samples to external laboratories. Additionally, the samples are sometimes taken at long time intervals. This carries various risks.

On one hand, the time lost between the sampling event and receiving the results from the analysis is enormous. It is only possible to react to fluctuations and deviations from target concentrations or limit values with a certain delay. On the other hand, the working and environmental conditions are not comparable and can lead to changes in the sample. Oxidation, pressure or temperature changes, introduction of moisture, and many other factors can change a sample’s original properties during transport, waiting periods, and manual laboratory analysis.

Example trend graph comparing process deviations mitigated by manual control (grey) and fully automatic process control (orange) via PAT.

Process analyzers: automated operation around the clock

Analyses, which are usually carried out manually, are automated by using industrial process analyzers. The samples are automatically removed from critical points in the production process and processed further. The information obtained is used to control the process without any delay, as the data can be transferred immediately to a central computing system at the plant. Automated analysis right at the sample point allows for increased accuracy and reproducibility of the data.

In practice, this entails rerouting a partial stream from the process in question to be fed to the analyzer by means of valves, peristaltic pumps, or bypass lines. Each sample is therefore fresh and correlates to the current process conditions. Probes can also be integrated directly into the process for continuous inline measurement.

The analysis is performed using common titration, spectroscopy, ion chromatography, or electrochemical methods known from the laboratory, which are optimally integrated into the process analyzer for each individual application requirement. The methods can be used in combination, allowing several measuring points to be monitored in parallel with one system. Thanks to the process analyzers that are specifically configured and expandable for the application, the optimal conditions for stable process control are obtained.

Spectroscopic methods have become particularly well-established in recent years for process analysis and optimization purposes. In contrast to conventional analysis methods, near-infrared (NIR) spectroscopy offers a number of advantages, especially its analysis speed. Results can be acquired within a few seconds and transferred directly to the process control system so that production processes can be optimized quickly and reliably. Samples are analyzed in situ, completely without the use of chemicals and in a non-destructive manner, which means further added value for process safety.

The many advantages of PAT

Automation in the context of process analysis technology does not always have anything to do with futuristic robots. Instead, PAT offers companies a number of advantages:

 

  • Fully automatic, 24/7 monitoring of the process
  • Timely and automatic feedback of the analysis results to the system control for automatic process readjustment
  • Reduction in fluctuations of product quality
  • Increased process understanding to run production more efficiently
  • Independence from your own laboratory (or contract lab)
  • Complete digital traceability of analysis results
  • Total solution concepts including sample preconditioning, saving time and increasing safety

What’s next?

In our next post in this series, you will discover the role process analysis technology plays in digital transformation with regard to «Industry 4.0».

Want to learn more about the history of process analysis technology at Metrohm? Check out our previous blog post:

Read what our customers have to say!

We have supported customers even in the most unlikely of places: from the production floor to the desert and even on active ships!

Post written by Dr. Kerstin Dreblow, Product Manager Wet Chemical Process Analyzers, Deutsche Metrohm Prozessanalytik (Germany), with contributions from Dr. Alyson Lanciki, Scientific Editor at Metrohm International Headquarters (Switzerland).