Tips and Tricks for IC Columns


Monitoring and maintaining column performance

One of the basic requirements for ensuring reliable chromatographic analyses is a high-performance separation column. Ion chromatography (IC) users should regularly check the performance of their column. This way, if a drop in performance becomes apparent, steps can be taken in good time to restore or maintain the proper functioning of the column, avoiding downtime and lost sample throughput. In this blog post, we explain how you can assess column performance, which parameters you should monitor, and which measures you can take to ensure excellent column performance.

 

First-time use of a new separation column

When you use a column for the very first time, we recommend that you check its initial performance. The Certificate of Analysis (CoA), which you receive with every purchase of a Metrohm column, is your source of reference here. Record a chromatogram and use the analysis conditions specified in the CoA: these include flow rate, temperature, eluent (mobile phase), analyte concentration, sample loop size, and suppression.

You can evaluate the column’s performance by comparing some of the result parameters with the values listed in the CoA (e.g. retention time, theoretical plates, asymmetry, resolution, and peak height).

Regular monitoring of column performance

Columns that are already in use should be monitored regularly, too! We recommend carrying out these tests with check standards under the application conditions you normally use, because performance varies depending on the type of analysis and associated analysis conditions as well as the instrumental setup. If a reduction in performance is observed, the requirements of the application determine whether the column can still be used.

Below, we explain how to determine your column performance based on five performance indicators. You will also find out how you can prevent or rectify a decline in performance.


 

Backpressure

Monitor the backpressure: When you use your new column for the first time, save the backpressure value under the analysis conditions of your application as a reference («common variable» in MagIC Net). Then use the user-defined results to monitor the difference between the initial backpressure and the one displayed during the current determination.

If you identify an increase in the backpressure in comparison with the saved initial value, this indicates that particles have been deposited in either the guard column or separation column. If the measured increase is greater than 1 MPa, action must be taken. First, you should check which of the columns is affected (guard vs. separation). If the guard column is contaminated, it should be replaced, as this is its primary function. If the separation column is affected, remove it from the system, turn it around and reinstall, and then rinse it for several hours in this reversed flow direction. If this doesn’t help, we strongly recommend that you consider replacing the column. This will be essential if the maximum permitted backpressure for the column is reached.
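The monitoring logic described above can be sketched in a few lines. This is an illustrative example with made-up pressure values, not MagIC Net code:

```python
# Illustrative sketch (not MagIC Net code): flag the column when the backpressure
# rises more than 1 MPa above the reference value saved at first use.
REFERENCE_MPA = 8.2   # example initial backpressure saved as the reference
THRESHOLD_MPA = 1.0   # action limit discussed above

def backpressure_check(current_mpa, reference_mpa=REFERENCE_MPA):
    """Return the increase over the reference and whether action is needed."""
    increase = current_mpa - reference_mpa
    return increase, increase > THRESHOLD_MPA

increase, action_needed = backpressure_check(9.5)
print(f"Increase: {increase:.1f} MPa, action needed: {action_needed}")
```

In practice you would save the reference as a common variable and let the user-defined results compute the difference for every determination.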

Retention time

To track changes in the retention time (which signal a decrease in column performance), the retention time of the last analyte peak is monitored in the chromatogram. Sulfate, for example, is suitable for this, as it usually elutes right at the end of standard anion chromatograms. Here too, work with a common variable in MagIC Net to save the initial value.

Unstable retention times can be caused by carbon dioxide introduced from the ambient air or from air bubbles present in the eluent. Luckily, these problems can be resolved easily (see Table 1).

Table 1. Preventing and correcting performance loss in IC columns (click to enlarge).

The column may also have lost some capacity. This capacity loss can be caused by the presence of high-valency ions which are difficult to remove due to their strong attraction to the stationary phase. The column should then be regenerated in accordance with the column leaflet to remove any contamination. If this doesn’t lead to any improvement, then consider replacing the column depending on the requirements of the application, particularly in the event of progressive capacity loss.

Capacity loss can also occur if the functional groups are permanently detached from the stationary phase. In such a case, the column cannot be regenerated and must be replaced.

Resolution

Monitor the chromatographic resolution by comparing measurements from a predefined check standard with an initial reference value. If the resolution is R > 1.5, the peaks are considered baseline-separated (see illustration below). However, in cases involving highly concentrated matrices and for peaks that are more widely spread, the resolution value must be higher to ensure baseline separation.
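For reference, resolution can be calculated from the retention times and baseline peak widths of two neighboring peaks. The numbers below are invented for illustration:

```python
# Illustrative only: resolution from retention times and baseline peak widths,
# R = 2 * (t2 - t1) / (w1 + w2). The values are made-up example numbers.
def resolution(t1, t2, w1, w2):
    return 2 * (t2 - t1) / (w1 + w2)

# e.g. two anion peaks at 6.0 and 7.5 min, each with a 0.45 min baseline width
r = resolution(6.0, 7.5, 0.45, 0.45)
print(f"R = {r:.2f}")  # R > 1.5 means the peaks are baseline-separated
```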

If a loss of resolution occurs, first make sure that it is not caused by the eluent or the IC system. Once these have been ruled out, it is possible that the adsorptive effect of contaminations in the guard column or separation column may be responsible. A contaminated guard column should be replaced. If the cause of the problem is found to be the separation column, this should be regenerated in accordance with the column leaflet to free it from any organic or inorganic contamination. If the loss of resolution progresses, a column replacement is inevitable.

Theoretical plates

Save the initial number of theoretical plates in MagIC Net as a common variable, as mentioned earlier for other parameters. Usually, the last eluting peak is used – in anion chromatograms, sulfate would yet again prove to be a suitable candidate. Theoretical plates also depend on the analyte concentration. Therefore, it is ideal to monitor this parameter during check standard measurements and not during sample measurements. You can track the development of any changes to the number of theoretical plates via the user-defined results in MagIC Net.

A decrease in the theoretical plates can suggest dead volume in the IC system (see Table 1). A low number of theoretical plates may also be observed if the column has been overloaded by a high salt concentration in the sample matrix, for instance. If the theoretical plates decrease by more than 20%, this indicates that column performance is declining. Depending on the requirements of the application, action may need to be taken.
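The plate count itself follows from the retention time and the peak width at half height, N = 5.54 · (tR / w½)². A short sketch with assumed numbers shows how a broadening peak triggers the 20% criterion:

```python
# Sketch with assumed numbers: theoretical plates from the half-height width,
# N = 5.54 * (tR / w_half)**2, plus the 20% decline check described above.
def plates(t_r, w_half):
    return 5.54 * (t_r / w_half) ** 2

initial = plates(14.0, 0.30)    # reference value saved at first use
current = plates(14.0, 0.35)    # broader peak after months of use
decline = 1 - current / initial
print(f"N: {initial:.0f} -> {current:.0f}, decline {decline:.0%}")
```

Here the decline is about 27%, exceeding the 20% threshold, so action would be warranted depending on the application.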

If the guard column is the reason for the drop in performance, it should be replaced. If the problem is with the separation column, we recommend regenerating the column in accordance with the column leaflet to eliminate any organic or inorganic contamination. If this doesn’t help, you should consider replacing the column, particularly if a trend toward lower theoretical plates is observed.

Asymmetry

Determine the initial asymmetry of the analytes by measuring a predefined check standard under the analysis conditions of your application. Save it as a common variable, then track the user-defined results to observe the development of asymmetry over time. The maximum acceptable values for the asymmetry vary depending on the analyte. For example, calcium and magnesium peaks initially present relatively high asymmetry values.

Asymmetry is defined as the distance from the centerline of the peak to the descending side of the peak (B in the figure below) divided by the distance from the centerline of the peak to the ascending side of the peak (A in the figure below), where both distances are measured at 10% of the peak height. Some pharmacopoeias may use other definitions – please check to be sure of the requirements in your country.

AS > 1 means a peak has tailing, and AS < 1 equates to peak fronting. Optimum chromatography is achieved with peak asymmetries as close as possible to 1. As a general rule, column performance is considered in decline when the asymmetry is AS > 2 or AS < 0.5. Depending on the requirements of the application, measures have to be taken in this case in order to improve symmetry and to enable better integration.
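The definition above translates directly into a one-line calculation. The half-widths here are invented for illustration:

```python
# Minimal sketch of the definition above: asymmetry AS = B / A, where A and B
# are the front and back half-widths measured at 10% of the peak height.
def asymmetry(a_front, b_back):
    return b_back / a_front

AS = asymmetry(a_front=0.10, b_back=0.16)  # made-up widths in minutes
print(f"AS = {AS:.2f}")  # 1 < AS <= 2: some tailing, still acceptable
```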

The reason for high asymmetry values may be down to the ion chromatograph – due to dead volume, for example. If this is not the case, it is important to find out whether the asymmetry is caused by problems with the guard column or with the separation column. If the guard column causes the asymmetry, it should be replaced. If it is the separation column, it should first be regenerated in accordance with the column leaflet to remove any organic or inorganic contamination. If this doesn’t help, you should consider replacing the column. If a trend toward higher asymmetry values can be observed, replacement is unavoidable.

In summary, there are many ways in which you can estimate the performance of the column and track concrete figures over its lifetime. Proper maintenance, along with the consistent use of a guard column for extra protection, can extend the lifetime of the separation column.

Need help choosing the right column for your application?

Look no further!

Try the Metrohm Column Finder here:

For further guidance about IC column maintenance, you can watch our tutorial videos here:

Post written by Dr. Alyson Lanciki, Scientific Editor at Metrohm International Headquarters, Herisau, Switzerland. Primary research and content contribution done by Stephanie Kappes.

How to Transfer Manual Titration to Autotitration


Maybe you’ve read our earlier blog post on the main error sources in manual titration and are now wondering what you have to do in order to convert your manual titration to autotitration. In this blog entry, I want to give you a step-by-step guide on how to proceed and what you have to consider.

Let’s jump right in.

Choice of sensor

The first and most crucial step in transferring a manual titration to autotitration is the choice of the sensor for the indication of the equivalence point.

One of the simplest choices is to use a photometric sensor, effectively replacing the human eye, especially when norms or standards stipulate the use of color indicators. However, it is often simpler to use a potentiometric electrode for indication purposes, because no indicator solution is needed and you can even combine multiple titrations into one to save time.

The electrode choice depends on the type of reaction, the sample, and the titrant used. Acid-base titrations require a different electrode than redox or precipitation titrations. Additionally, the sample matrix can have a significant influence on the electrode. The more complex the matrix is, the more crucial the choice. For example, you must use a different pH electrode for non-aqueous titrations than for aqueous titrations.

To help you select the best electrode for your titrations, we’ve prepared a free flyer, which you can download below. If you prefer, our Electrode Finder is even easier to use. Select the reaction type and application area of your titration and we will present you the best solution.

Optrode: optical sensor for photometric titrations.

Optimizing the sample size and solution volumes

If you’ve performed manual titrations before, then you know that many of these methods have endpoints requiring the use of up to 30 mL or even 40 mL of titrant. Autotitrators are most commonly equipped with 10 mL or 20 mL burets. Since refilling the buret during the titration causes errors, you should reduce the sample size for autotitrations. In general, for autotitration it is recommended that the equivalence point lies between 10% and 90% of the total buret volume. The second step when transferring a manual titration to autotitration is thus the optimization of the sample size.
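As a hypothetical worked example of the 10–90% rule, suppose your manual method consumed 32 mL of titrant and you are switching to a 20 mL buret:

```python
# Hypothetical worked example: scale down a manual titration that consumed
# 32 mL of titrant so the equivalence point fits a 20 mL buret (10-90% rule).
buret_ml = 20.0
manual_consumption_ml = 32.0

low, high = 0.10 * buret_ml, 0.90 * buret_ml   # allowed EP window: 2-18 mL
target_ml = 0.5 * buret_ml                     # aim near mid-buret, 10 mL
scale = target_ml / manual_consumption_ml      # factor for the sample size
print(f"Use {scale:.2f}x the manual sample size "
      f"(expected EP ~{target_ml:.0f} mL, allowed {low:.0f}-{high:.0f} mL)")
```

Cutting the sample size to roughly a third places the equivalence point comfortably inside the buret volume without refilling.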

Don’t forget: decreasing the sample size additionally reduces the waste produced, since you need less titrant, contributing to greener chemistry and cost savings with each titration.

When you transfer a manual titration to autotitration, it might be necessary to adjust the amount of used diluent (water or solvent) for the analysis. To obtain accurate results it is imperative that the glass membrane (for measurement) and diaphragm (for reference) of the sensor are fully immersed into the solution, as shown here.

Selecting the right titration mode

Depending on the reaction type, some titration reactions are completed faster than others. For this reason, autotitrators are equipped with different titration modes.

The two most often used modes (monotonic and dynamic mode) can be distinguished by the way the titrant is added. When using the monotonic mode, the same amount of titrant is always added with each addition. In the dynamic mode, the amount of titrant added differs depending on how close you are to the equivalence point. The closer you are to the equivalence point, the smaller the additions—similar to a manual titration.

As a rule of thumb, use the dynamic mode for fast titrations such as acid-base titration and the monotonic mode for slower titrations, where the equivalence point is suddenly reached (e.g. vitamin C determination).
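The difference between the two modes can be illustrated conceptually. The increment formula below is a simplified stand-in for what autotitrator firmware actually does, purely for illustration:

```python
# Conceptual sketch only (not autotitrator firmware): how monotonic and
# dynamic dosing differ in the volume added per step.
def monotonic_step(fixed_ml=0.05):
    return fixed_ml  # the same increment is added every time

def dynamic_step(slope, max_ml=0.5, min_ml=0.01):
    """Shrink the increment as the signal slope (mV/mL) steepens near the EP."""
    step = max_ml / (1 + abs(slope) / 10)
    return max(min_ml, step)

print(dynamic_step(slope=1))    # far from the EP: large addition
print(dynamic_step(slope=200))  # near the EP: small addition
```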

Learn more about the different titration modes in this free webinar available on our website.

Optimizing the titration setup

The stirring speed and sensor placement within your titration beaker will influence the accuracy of your results.

Depending on the sample beaker and stirrer, choose the stirring speed in such a way that the mixing is fast enough but no splashing occurs. Also be sure that no vortex is created in the solution due to the speed of mixing, leaving your electrode hanging in dry air. Make sure to place the electrode close to the beaker wall and upstream of (behind) the buret tip as displayed in this example. This allows for an ideal mixing of the titrant with the sample and improves your accuracy.

Optimizing the titration method

In this last step, I want to present some options about how you can optimize your titration in regards to titration speed and titrant consumption.

One efficient way to speed up titrations, especially monotonic titrations, is to use a start volume. This is similar to pre-dosing titrant into your sample before beginning the manual titration. Don’t forget to add a pause step after the addition! This way, the added titrant can mix well with the sample before the titration starts.

To save titrant and to reduce waste, I recommend using stop criteria. The simplest stop criterion is the stop volume. If your equivalence point always occurs at the same volume, this is the easiest way to go. If the volume of your equivalence point varies, you can instead define a volume to be added after the expected equivalence point has been reached. In general, I recommend a stop volume of approximately 1 mL after the equivalence point.

To summarize:

  • Select the right sensor for your titration
  • Adjust sample size and diluent volume
  • Select the titration mode depending on your reaction
  • Optimize your titration for speed and titrant consumption

 

You see, changing from manual to autotitration is as easy as it sounds – maybe even easier!

If you want to learn even more about practical aspects of modern titration, have a look at our monograph.

Want to learn more?

Download our free monograph:

Practical aspects of modern titration

Post written by Lucia Meier, Product Specialist Titration at Metrohm International Headquarters, Herisau, Switzerland.

Benefits of NIR spectroscopy: Part 3


This blog post is part of the series “NIR spectroscopy: helping you save time and money”. 

How to implement NIRS in your laboratory workflow

This is the third installment in our series about NIR spectroscopy. In our previous installments of this series, we explained how this analytical technique works from a sample measurement point of view and outlined the difference between NIR and IR spectroscopy.

Here, we describe how to implement a NIR method in your laboratory, exemplified by a real case. Let’s begin by making a few assumptions:

  • your business produces polymeric material and the laboratory has invested in a NIR analyzer for rapid moisture measurements (as an alternative to Karl Fischer Titration) and rapid intrinsic viscosity measurements (as an alternative to measurements with a viscometer)
  • your new NIRS DS2500 Analyzer has just been received in your laboratory

The workflow is described in Figure 1.

Figure 1. Workflow for NIR spectroscopy method implementation (click to enlarge).

Step 1: Create calibration set

NIR spectroscopy is a secondary method, meaning it requires «training» with a set of spectra corresponding to parameter values sourced from a primary method (such as titration). In the upcoming example for analyzing moisture and intrinsic viscosity, the values from the primary analyses are known. These calibration set samples must cover the complete expected concentration range of the parameters tested for the method to be robust. This reflects other techniques (e.g. HPLC) where the calibration standard curve needs to span the complete expected concentration range. Therefore, if you expect the moisture content of a substance to be between 0.35% and 1.5%, then the training/calibration set must cover this range as well.

After measuring the samples on the NIRS DS2500 Analyzer, you need to link the values obtained from the primary methods (Karl Fischer Titration and viscometry) on the same samples to the NIR spectra. Simply enter the moisture and viscosity values using the Metrohm Vision Air Complete software package (Figure 2). Subsequently, this data set (the calibration set) is used for prediction model development.

Figure 2. Display of 10 NIR measurements linked with intrinsic viscosity and moisture reference values obtained with KF titration and viscometry (click to enlarge).

Step 2: Create and validate prediction models

Now that the calibration set has been measured across the range of expected values, a prediction model must be created. Do not worry – all of the procedures are fully developed and implemented in the Metrohm Vision Air Complete software package.

First, visually inspect the spectra to identify regions that change with varying concentration. Often, applying a mathematical adjustment, such as the first or second derivative, enhances the visibility of the spectral differences (Figure 3).

Figure 3. Example of how mathematical treatment intensifies spectral information: a) without any mathematical optimization and b) with an applied second derivative, highlighting the spectral difference at 1920 nm and intensifying the peaks near 2010 nm (click to enlarge).

Univariate vs. multivariate data analysis

Once these regions have been identified visually, the software attempts to correlate them with values sourced from the primary method. The result is a correlation diagram, including the respective figures of merit: the Standard Error of Calibration (SEC, precision) and the correlation coefficient (R2), shown for the moisture example in Figure 4. The same procedure is carried out for the other parameters (in this case, intrinsic viscosity).

This process is again similar to general working procedures with HPLC. When creating a calibration curve with HPLC, typically the peak height or peak area (surface) is linked with a known internal standard concentration. Because only one variable is used (peak height or surface), this procedure is known as «univariate data analysis».

NIR spectroscopy, on the other hand, is a «multivariate data analysis» technology. NIRS utilizes a spectral range (e.g. 1900–2000 nm for water), so multiple absorbance values are used to create the correlation.

Figure 4. Correlation plot and figures of merit (FOM) for the prediction of water in polymer samples using NIR spectroscopy. The «split set» function in the Metrohm Vision Air Complete software package allows the generation of a validation data set, which is used to validate the prediction model (click to enlarge).

How many spectra are needed?

The ideal number of spectra in a calibration set depends on the variation in the sample (particle size, chemical distribution, etc.). In this example, we used 10 polymer samples, which is a good starting point to check the feasibility of the application. However, to build a robust model which covers all sample variations, more sample spectra are required. As a rule, approximately 40–50 sample spectra will provide a suitable prediction model in most cases.

This data set of 40–50 spectra is also used to validate the prediction model. This can be done using the Metrohm Vision Air Complete software package, which splits the data set into two groups of samples:

1. Calibration set (75%)
2. Validation set (25%)

As before, a prediction model is created using the calibration set, but the predictions are now validated against the validation set. Results for these polymer samples are shown above in Figure 4.
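The 75/25 split can be illustrated with a short sketch. The Vision Air software performs this internally, so the code below is purely conceptual:

```python
# Conceptual sketch (the Metrohm Vision Air software does this internally):
# split a set of spectra 75/25 into calibration and validation subsets.
import random

samples = list(range(48))          # e.g. 48 sample spectra with reference values
random.seed(0)                     # reproducible split for the example
random.shuffle(samples)

n_cal = int(0.75 * len(samples))   # 75% for calibration
cal_set, val_set = samples[:n_cal], samples[n_cal:]
print(len(cal_set), len(val_set))  # -> 36 12
```

The model is then fitted on the calibration subset only, and its predictions are judged against the held-out validation subset.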

Users who are inexperienced with NIR model creation and do not yet feel confident with it can rely on Metrohm support, which is known for its high-quality service. They will assist you with the prediction model creation and validation.

Step 3: Routine analysis

The beauty of the NIRS technique comes into focus now that the prediction model has been created and validated.

Polymer samples with unknown moisture content and unknown intrinsic viscosity can now be analyzed at the push of a button. The NIRS DS2500 Analyzer will display results for those parameters in less than a minute. Typically, the spectrum itself is not shown during this step—just the result—sometimes highlighted by a yellow or red box to indicate results with a warning or error, as shown in Figure 5.

Figure 5. Overview of a selection of NIR predicted results, with clear pass (no box) and fail (red box) indications (click to enlarge).

Display possibilities

Of course, the option also exists to display the spectra, but for most users (especially shift workers), these spectra have no meaning, and they can derive no information from them. In these situations only the numeric values are important, along with a clear pass/fail indication.

Another display possibility is the trend chart, which allows for the proactive adjustment of production processes. Warning and action limits are highlighted here as well (Figure 6).

Figure 6. Trend chart of NIR moisture content analysis results. The parallel lines indicate defined warning (yellow) and action (red) limits (click to enlarge).

Summary

The majority of the effort needed to implement NIR spectroscopy in the laboratory comes at the beginning of the workflow, during collection and measurement of samples that span the complete concentration range. The prediction model creation and validation, as well as the implementation in routine analysis, is done with the help of the Metrohm Vision Air Complete software package and can be completed within a short period. Additionally, our Metrohm NIRS specialists will happily support you with the prediction model creation if you require assistance.

At this point, note that there are cases where NIR spectroscopy can be implemented directly without any prediction model development, using Metrohm pre-calibrations. These are robust, ready-to-use operating procedures for certain applications (e.g. viscosity of PET) based on real product spectra.

We will present and discuss their characteristics and advantages in the next installment of the series. Stay tuned, and don’t forget to subscribe!

For more information

If you want to learn more about selected NIR applications in the polymer industry, visit our website!

We offer NIRS analyzers suitable for laboratory work as well as for harsh industrial process conditions.

Post written by Dr. Dave van Staveren, Head of Competence Center Spectroscopy at Metrohm International Headquarters, Herisau, Switzerland.

Increase productivity and profitability in environmental analysis with IC

Nearly every chemist begins his or her path under the guidance of trained professionals, learning the correct way to implement the scientific method and to handle themselves safely in the laboratory. I am no different; I obtained my doctorate in Analytical and Environmental Chemistry in 2010. I have worked in the environmental analysis sector since 2003, investigating soil contamination due to heavy metals and chemical spills, performing water quality analysis, and especially carrying out studies relating to atmospheric chemistry. During these years, I’ve been exposed to several analytical technologies, varied laboratory sizes, and different sample preparation procedures.

A common theme runs throughout these different places—the hunt for more time and a bigger budget. However, with the right tools at your disposal, you can have your cake and eat it, too.

Environmental chemical analysis

The focus of environmental analysis lies in these three major sectors:

• Air
• Water
• Soil

It is in our best interest to study these interconnected areas as thoroughly as possible, considering how heavily our health and the future of our species relate to and rely upon them.

Authorities and regulations

With that in mind, local and governmental authorities have developed and enforced several regulations for the good of public health.

One of the more well-known authorities on the subject is the United States Environmental Protection Agency (EPA). Under the Clean Air Act (enacted in 1970) and the Clean Water Act (1972), as well as the requirement to report the use and disposal of toxic chemical substances (TRI reporting), several norms and standards have been developed over the intervening years to meet the stringent guidelines brought forth in these and other regulations.

In the world of water analysis, one of the most common methods you will hear about is EPA Method 300. Methods 300.0 and 300.1 give detailed instructions to chemical analysts regarding the measurement of common anions (Part A) and inorganic disinfection byproducts (Part B) in water via ion chromatography.

Meet the family! The Metrohm 940 Professional IC Vario TWO/SeS/PP, 930 Compact IC Flex Oven/SeS/PP/Deg, and Eco IC.

Heavier workloads = less time per sample

A growing list of aqueous contaminants and increasingly stringent regulatory requirements require labs to process more samples in less time, without sacrificing accuracy.

The nature of the samples measured in environmental laboratories is such that sample preparation is required—this always involves filtering the samples, and in many cases diluting them as well. This procedure is the only way to prevent damage to the analysis system and to achieve accurate results.

However, sample preparation is an expensive step, as it involves a significant amount of work as well as costly consumables.

Time to crunch the numbers!

A 30-day study was performed on a Metrohm IC system with automatic ultrafiltration and dilution by an environmental analysis laboratory in the US. This lab, like many others, processes a high volume of samples, including some with a limited shelf life. Reliability is therefore a particularly important criterion when it comes to buying a new system.

Economic considerations also play a key role: a new system should pay for itself as quickly as possible; it needs to be generating a return on investment after a year at the latest.

So, how did we perform in the study? We tested several parameters, described in the following sections.

(Ultra)filtration

All aqueous environmental samples must be filtered prior to analysis. This prevents particles from the sample contaminating or blocking the separation column, significantly extending its lifetime. The high volume of samples at the lab involved in this study drove the material cost down to only $1 USD per syringe filter. However, since each individual sample requires a new filter, with 14,300 samples a year this still amounts to $14,300 – just for filtration materials!

The integrated ultrafiltration in the ion chromatography system from Metrohm only needs one filter change per day, saving this laboratory over $12,000 per year. What’s more, the ultrafiltration process is fully automated.

Compared to manual filtration, this saves three minutes of working time per sample. With labor costs of $18 per hour, this again corresponds to savings of around $13,000 per year.
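These savings can be verified with a quick calculation using the study’s figures:

```python
# Reproducing the filtration arithmetic above (figures from the study):
samples_per_year = 14_300
filter_cost = 1.00                      # USD per syringe filter
minutes_saved_per_sample = 3            # vs. manual filtration
labor_rate = 18.00                      # USD per hour

material = samples_per_year * filter_cost
labor = samples_per_year * minutes_saved_per_sample / 60 * labor_rate
print(f"Materials: ${material:,.0f}, labor: ${labor:,.0f}, "
      f"total: ${material + labor:,.0f} per year")
```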

The Metrohm inline ultrafiltration cell.

Overall, using ultrafiltration saves over $25,000 in annual expenditure – a significant return on investment (ROI).

Yearly cost savings estimated by switching from manual filtration to automatic inline ultrafiltration (click to enlarge).

For even more information about this time-saver, read our earlier blog post about how to determine when it is time to exchange the ultrafiltration membrane:

Suppression

Suppression reduces the conductivity of the eluent, resulting in more sensitive conductivity detection of the analyte. This makes it possible to achieve particularly low limits of detection and quantification.

The instrument previously used at this laboratory (from a different supplier) employed membrane-based suppressors. These suppressors have to be replaced every three months, costing approximately $1,200 each time. The Metrohm Suppressor Module (MSM), on the other hand, is a one-off purchase because it utilizes ion exchanger particles in a robust micro-packed bed for suppression instead of membranes. The three suppression cartridges of the MSM alternate between suppression, rinsing, and regeneration, thereby ensuring continuous suppression at all times.

The regeneration reagents are inexpensive, averaging $52 per 1,000 samples, resulting in total annual costs of around $750 for 14,300 samples. This is much cheaper than the cost of replacing a membrane suppressor multiple times!
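Again, the arithmetic is easy to check against the quarterly membrane replacements mentioned above:

```python
# Checking the reagent figure above: $52 per 1,000 samples at 14,300 samples/year,
# compared with replacing a $1,200 membrane suppressor four times a year.
samples_per_year = 14_300
reagent_cost_per_1000 = 52.00
msm_annual = samples_per_year / 1000 * reagent_cost_per_1000
membrane_annual = 4 * 1_200.00
print(f"MSM reagents: ${msm_annual:,.0f}/year vs. "
      f"membranes: ${membrane_annual:,.0f}/year")
```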

    The Metrohm Suppressor Module (MSM) high-capacity version.
    Yearly cost savings estimated by switching from membrane suppressors to the packed bed MSM. (click to enlarge).
    Want to learn even more about suppression in anion chromatography? Download our free brochure here:

    Separation Columns

    With Metrohm columns, the environmental laboratory in this study achieved better separation of the analytes and a much longer column service life – on average, 7,000 injections compared to 1,200 with the previous columns. There appear to be two factors that are key to the reduced wear on the separation column:

    1. The Metrohm ion chromatograph provides measuring signals that are four to five times stronger. This makes it possible to reduce the injection volume by a factor of five.

    2. Additionally, Metrohm Inline Ultrafiltration removes particles down to a size of 0.2 μm, whereas manual filtration with syringe filters can only remove particles down to 20 μm.

    Selection of Metrohm separation columns in various lengths with intelligent chips (top) and protective guard columns (bottom).
    Overall, using Metrohm separation columns saves nearly $18,000 in one year for a high-throughput environmental analysis laboratory.
    Yearly cost savings estimated by switching from using Metrohm separation columns compared to the competition (click to enlarge).
    Looking for a specific column for your analysis challenges? Check out our Column Finder here!

    Dilution

    If the determination indicates that the analyte concentration is too high, i.e., outside the permissible determination range, the sample must be diluted and reanalyzed.

    This is the situation for around 30% of the samples at the laboratory involved in this study. Manual dilution takes the lab staff at least three minutes per sample. With labor costs of approximately $18 per hour, this adds up to annual costs of $3,800.
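The labor figure quoted above can be verified with a quick calculation, a sketch using only the numbers from the text:

```python
# Annual labor cost of manual dilution, using the figures from the text.
SAMPLES_PER_YEAR = 14_300   # lab throughput from earlier in the article
DILUTED_FRACTION = 0.30     # ~30% of samples need dilution
MINUTES_PER_DILUTION = 3
HOURLY_WAGE = 18            # USD

hours = SAMPLES_PER_YEAR * DILUTED_FRACTION * MINUTES_PER_DILUTION / 60
cost = hours * HOURLY_WAGE  # ~$3,861, matching the ~$3,800 in the text
print(f"{hours:.0f} hours/year -> ${cost:,.0f}")
```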

    Automatic Inline Dilution eliminates this expense: the analysis system dilutes the relevant samples fully automatically and then measures them again. This makes the laboratory much more efficient: the daily sample throughput increases, and samples with a limited shelf life are always analyzed in good time.

    Yearly cost savings estimated by switching from manual dilution practices to automatic inline dilution from Metrohm.
    Find out more about the many different Metrohm Inline Sample Preparation options available here:

    Robustness

    Significant cost savings weren’t the only benefit of the Metrohm analysis system – the 30-day comparison study also revealed a number of other advantages. The company was impressed with the robustness of the instrument and with its ability to measure the entire range of samples processed in their laboratory.

    Its stable calibration also made it possible to reduce the calibration frequency: the new system only needs calibrating every two to three weeks instead of every two to three days.

    The most impressive features, though, were the high measuring sensitivity and the large linear range of the detector. Thanks to the latter, only 2% of the samples remain outside the measuring range and have to be diluted – compared to 30% with the old system.

    Final Results

    The 30-day test proved to the lab in question that the Metrohm IC with the integrated automatic inline sample preparation techniques saves both material and labor costs. It also offers a number of improvements in analysis performance compared to the systems previously used on site.

    The most significant savings are those for labor and material costs as a result of using ultrafiltration, followed by those resulting from the longer service life of the separation column.

     

    For the final savings calculation over an entire year, download our white paper on the subject below.

    Want to learn more?

    Download our free White Paper:

    High productivity and profitability in environmental IC analysis

    Post written by Dr. Alyson Lanciki, Scientific Editor at Metrohm International Headquarters, Herisau, Switzerland.

    How Mira Became Mobile

    Handheld Raman spectrometers are truly like no other analytical chemical instruments. All spectrometers (e.g. IR/NIR, UV-Vis, GC/MS, and Raman) rely on interactions between matter and energy and include detectors that collect information about resulting atomic and molecular changes. This information is used to qualify and/or quantify various chemical species. Typically, a spectrometer is a benchtop instrument attached to a computer or other visual display that is used by an analytical chemist in a laboratory.

    Classical Raman spectrometers fall into this category. Lasers, filters, detectors, and all associated hardware for sampling are combined in one unit, while data processing and viewing occur nearby.

    For a comparison of other spectroscopic techniques, visit our previous blog post «Infrared spectroscopy and near infrared spectroscopy – is there a difference?».

    Raman is a unique investigative analytical technique in many ways. It is said, «If you can see it, Raman can ID it.»

    Indeed, Raman’s strengths are its simple sampling methods combined with its specificity. Direct analysis is possible for many pure substances without sample preparation. Sampling is performed via direct contact with a substance, remotely, or through a barrier. Even solutes in water may be directly identified. This technique is highly specific; each material investigated with Raman produces a unique «fingerprint» spectrum. Raman spectroscopy is successful at positively identifying each distinct substance, while accurately rejecting even very similar compounds.

    Mira (Metrohm Instant Raman Analyzer) with several sampling attachments for easy analysis: with or without sample contact.

    The Raman spectrum

    Raman spectra contain peaks across a range that correspond to specific molecular connectivity and can be used to determine the composition of a sample. The spectral range depends on spectrometer design and reflects a balance between resolution and sensitivity.

    The «fingerprint region» (400–1800 cm-1) is used to ID unknowns and verify known materials. The region below 400 cm-1 is helpful in the analysis of minerals, gemstones, metals, and semiconductors. For most organic materials (oils, polymers, plastics, proteins, sugars/starches, alcohols, solvents, etc.), very little information above 2255 cm-1 is useful in Raman applications, as carbon-hydrogen chains contribute little to molecular qualification.
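The regions described above can be captured in a simple lookup function. The boundaries follow the text; real band assignments are far more nuanced, so treat this as a sketch:

```python
# Rough classification of Raman spectral regions (wavenumbers in cm^-1),
# using the boundaries given in the text above.

def raman_region(wavenumber_cm1: float) -> str:
    if wavenumber_cm1 < 400:
        return "low-frequency: minerals, gemstones, metals, semiconductors"
    if wavenumber_cm1 <= 1800:
        return "fingerprint: ID of unknowns, verification of known materials"
    if wavenumber_cm1 <= 2255:
        return "upper range: some useful bands for organics"
    return "C-H stretch region: little qualification value for most organics"

print(raman_region(1050))  # a typical fingerprint-region band
```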

    A selection of different bonds and functional groups with their general regions of activity in the Raman portion of the electromagnetic spectrum.

    Mira’s measuring range of 400–2300 cm-1 is perfect for most Raman applications, including:

    • Pharma & Other Regulated Industries
    • Food
    • Personal Care & Cosmetics
    • Defense & Security
    • Process Analytics
    • Materials ID
    • Education & Research

    Mira is available in different configurations for all kinds of applications and user needs.

    Good things come in small packages

    Technology, analysis, ease of use, accuracy—handheld Raman has all of this in a small format that escapes the confines of the lab. It also invites many new types of users who employ Raman for vastly new and exciting applications. In the rest of this blog post, I share details about the development of components that led to miniaturization of Raman. This is followed by the origin story of Metrohm Raman, manufacturer of Mira (Metrohm Instant Raman Analyzers).

    Four significant innovations came together to create Mira: diode lasers, specialized filters and gratings, on-axis optics, and the CCD (Charge Coupled Device) in a unique design called the «astigmatic spectrograph». These basic components of a Raman spectrograph can be seen in the graphical representation above. Note that this is not an accurate depiction of the unique geometries found within Mira’s case!

    Raman spectroscopy is a technique which relies on the excitation of molecules with light (energy). C.V. Raman’s discovery of Raman scattering in 1928 was enabled by focused sunlight, which was quickly replaced with a mercury lamp for excitation and photographic plates for detection. This resulted in a straightforward, popular, and effective method for determining the structure of simple molecules.

    C.V. Raman. India Post, Government of India / GODL-India

    The first commercial Raman spectrometer was available in the 1950s. As lasers became more available in the 1960s, followed by improved filter technology in the 1970s, Raman grew in popularity as a technique for a wide range of chemical analyses. Integrated systems were first seen in the 1990s, and the miniaturization of instruments began in the early 2000s.

    Miniaturization of Raman spectrometers

    Diode lasers were the first step toward handheld Raman. Those of you of a certain age may remember these as the kind of small, cool, low-energy lasers used in CD players, stabilized at the source with a unique kind of diffraction grating.

    Powerful, efficient optical filters also contribute to miniaturization by controlling laser light scattering within the spectrograph. The development of sensitive, small Charge Coupled Devices (CCDs), which are commonly used in mobile phone cameras, permitted the detection of Raman scattering and efficient transmission of the resulting signals to a computer for processing.

    The astigmatic spectrograph simplified both geometry and alignment for the many components within a Raman spectrometer; this design was the final advancement in the development of handheld Raman.

    From Wyoming to Switzerland

    By the 1990s, new technologies developed for diverse industries were being incorporated into Raman spectroscopy. In Laramie, WY (USA) at the time, Dr. Keith Carron was a professor of Analytical Chemistry with a focus on Surface Enhanced Raman Scattering (SERS). Dr. Carron already had robust SERS tests, but he envisioned a low-cost Raman system that would introduce his tests to industrial, medical, or defense and security markets. His next steps would revolutionize Raman spectroscopy.

    Using commercial off-the-shelf parts, Dr. Carron and his team developed an economical benchtop instrument that eliminated the high cost of Raman analysis, helping to enable its use in university curricula. In the early 2000s, a research and education boom began as Raman grew from an esoteric technique used in high-end applications into a widely available tool for all kinds of tasks. Dr. Carron is responsible for ushering Raman into the current era. A collaboration led to a portable Raman system and, ultimately, to a new astigmatic spectrograph design in a very small instrument.

    The U.S. tragedies of September 11, 2001 created an immediate push for technology to detect terrorist activity. Around this time, anthrax scares further reinforced the need for “white powder” analyzers. Fieldable chemical analysis became the goal to achieve.

    Dr. Carron was inspired to invent a truly handheld, battery powered Raman device for the identification of explosives and other illicit materials. A number of iterations led to CBex, a palm-sized Raman system (even smaller than Mira!) designed by Snowy Range Industries, in February 2012 (see image). CBex caught the attention of Metrohm AG, and an offer of cooperation was sent to Dr. Carron in August 2013.

    Along comes Mira

    Mira was born in 2015. Not only is it a novel analytical instrument, but it is also unique amongst handheld Raman spectrometers. Mira has the smallest form factor of all commercially available Raman instruments. What truly sets Mira apart from the competition is its built-in Smart Acquire routines, which provide anyone, anywhere, access to highly accurate analytical results. It is rugged, meeting MIL-STD 810G and IP67 specifications—you can drop Mira or submerge it in a liquid to get an ID.

    Once Raman escaped the confines of the laboratory, it suddenly had the potential for new uses by non-technical operators, who could perform demanding analytical tests safely, quickly, and accurately.

    In fact, miniaturization of Raman has revolutionized safety in a number of ways:

    • Direct analysis eliminates dangers from exposure to laboratory solvents and other chemicals.
    • Through-packaging analysis prevents user contact with potentially hazardous materials.
    • Simplified on-site materials ID verifies the quality of ingredients in foods, medicines, supplements, cosmetics, and skin care products.
    • ID of illicit materials such as narcotics, explosives, and chemical warfare agents supports quick action by military and civilian agencies.

    What’s Next?

    I hope that you have enjoyed learning about the evolution of Raman technology from benchtop systems to the handheld instruments we have today. In the coming months we will publish articles about Mira that describe, in detail, several interesting applications of handheld Raman spectroscopy—subscribe to our blog so you don’t miss out!

    As a sneak preview: In 1 month we will be introducing a brand new system, aimed at protecting consumer safety through the ID of trace contaminants in foods. Stay tuned…

    Free White Paper:

    Instrument Calibration, System Verification, and Performance Validation for Mira

    Post written by Dr. Melissa Gelwicks, Technical Writer at Metrohm Raman, Laramie, Wyoming (USA).


    How to determine if your edible oils are rancid

    Rancidity is the process through which oils and fats become partially or completely oxidized after exposure to moisture, air, or even light. Though not always obvious, foods can go rancid long before they become old. This is especially problematic for oils whose antioxidant properties are highly valued, such as olive oil. A simple (and free) test for rancidity of oils can be performed at home using your own analytical instruments: your senses of smell and taste.

    1.  Pour a few milliliters of the oil into a shallow bowl or cup, and breathe in the scent.
    2.  If the smell is slightly sweet (like adhesive paste), or gives off a fermented odor, then the oil is probably rancid.
    3.  A taste test should be performed to be sure, since some oils may have a naturally sweet scent.
    4.  Ensure the oil sample is at room temperature, then sip a small amount into your mouth without swallowing. Similar to tasting wine, slurp air across the oil in your mouth, then exhale to determine if the oil has flavor.
    5.  If the oil has no flavor, it is most likely rancid. Do not consume it!

    Once food has turned rancid, there is no way to fix it. So, if you find out by means of the sensory test that the oil is rancid, it is already too late. For those of us who would rather not end up with rancid food in our mouths, the ability to accurately predict the future oxidation behavior of edible oils would be a great help. In fact, this is exactly what the Rancimat from Metrohm can do if you follow our tips and tricks in this article.

    Rancimat to the rescue!

    With the 892 Rancimat and the 893 Biodiesel Rancimat, Metrohm offers two instruments for the simple and reliable determination of the oxidation stability of natural fats and oils and of biodiesel, respectively. The method, also known as the Rancimat method or Rancimat test, is the same in both cases. It is based on a simple principle of reaction kinetics, according to which the rate of a chemical reaction (here the oxidation of fatty acids) can be accelerated by increasing the temperature.
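This kinetic principle can be sketched numerically. A common rule of thumb (assumed here purely for illustration; it is not a Metrohm specification, and the actual temperature dependence must be determined experimentally) is that the oxidation rate roughly doubles for every 10 °C increase, so the induction time measured at an elevated test temperature can be extrapolated down to storage temperature:

```python
# Sketch of temperature extrapolation for oxidation induction time.
# Assumption: the oxidation rate roughly doubles per 10 degC (rule of
# thumb), so the induction time halves per 10 degC.

def extrapolate_induction_time(t_measured_h: float,
                               temp_measured_c: float,
                               temp_target_c: float) -> float:
    """Estimate the induction time at temp_target_c from a measurement
    made at the elevated temperature temp_measured_c."""
    return t_measured_h * 2 ** ((temp_measured_c - temp_target_c) / 10.0)

# Example: an 8 h induction time measured at 110 degC (the EN 14214 test
# temperature mentioned below), extrapolated to 20 degC storage.
estimate_h = extrapolate_induction_time(8.0, 110.0, 20.0)
print(f"Estimated stability at 20 degC: ~{estimate_h:.0f} h")
```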

    The 892 Rancimat (L) and the 893 Biodiesel Rancimat (R) from Metrohm.

    How does it work?

    During the determination, a stream of air is passed through the sample at a constant temperature (e.g. 110 °C according to standard EN 14214 for biodiesel). Any oxidation products that develop are transferred by the air stream to a measuring vessel, where they are detected by the change in conductivity of an absorption solution. In addition to the temperature (both the accuracy and stability of which are guaranteed by the Rancimat system), the preparation of the measurement and the condition of the accessories also influence the quality and reproducibility of the results. In this blog post, we have compiled some practical experience in using the Rancimat to help you.

    Oxidation stability: practical tips and tricks from the experts

    Remove foreign particles from the reaction vessel

    Foreign particles in the reaction vessel can catalyze reactions in the sample, leading to measurement results which are not reproducible. Remove foreign objects such as packaging remains from the reaction vessels using a strong gas stream (preferably nitrogen).

    Weigh sample with a plastic spatula

    Weigh the sample directly into the reaction vessel. Make sure that the filling height does not exceed 3.5 cm. An error of ±10% in the sample weight has no influence on the final result.

    Metal spatulas should not be used for weighing, as the metal ions could catalytically accelerate oxidation.

    Reaction vessel lid

    The green reaction vessel lid (see following image, article number: 6.2753.100) must seal the reaction vessel tightly. If this is no longer possible, the lid must be replaced. Leaky reaction vessel lids lead to incorrect and non-reproducible measurement results!

    Tip: to make it easier to seal or to remove the lid, a fine film of silicone oil can be applied with a finger to the upper outer edge of the reaction vessel, to a height of about 1 cm.

    Position and stability of the air tube

    The stable, vertical positioning of the air tube (article number: 6.2418.100 or 6.2418.130) in the reaction vessel increases the reproducibility of the measurement results.

    The air tube should protrude straight down into the vessel as illustrated in the following graphic representation.

    Absorbent solution in the measuring vessel

    Deionized water is used as the absorption solution with the Rancimat. Prior to beginning the analysis, the electrical conductivity of the water in the measuring vessel should not exceed 5 µS/cm.

    If this value is higher, check the filters of the water system, and also ensure that there are no other sources of contamination.

    Need replacement measuring vessels?

    We’ve got you covered.

    Positioning of the cannula for air supply

    The PTFE (polytetrafluoroethylene) cannula for the air supply into the absorption solution (article number: 6.1819.080) must be aligned properly so that no air passes over the electrodes of the conductivity measuring cell, as shown in the graphic.

    Air bubbles at the electrodes lead to noisy measurement curves that are difficult to evaluate.

    Is it time to start the determination yet?

    First, the heating block must reach and stabilize at the temperature defined in the method before the reaction vessel is inserted into the instrument.

    The sample identification data is then entered in the StabNet software by the operator.

    After connecting all of the tubing for the air supply, the reaction vessel can be inserted into the heating block. The sample measurement begins immediately after pressing the button on the Rancimat.

    Cleaning: important for reproducible results

    To obtain reliable analysis results, cleaning all accessories is of the utmost importance.

    Both the reaction vessel and the inlet tube are disposable items. You can dispose of these materials once they have cooled down. The rest of the accessories can be cleaned with a laboratory dishwasher (or equivalent) at maximum temperature and maximum drying time.

    If you use glass or polycarbonate materials for the measuring vessels, you can of course also clean them in the same manner. The same applies to the measuring vessel lid with integrated conductivity electrode, the transparent silicone tubing, or the black Iso-Versinic tube, as well as the reaction vessel lid.

    Tip: the silicone or Iso-Versinic tubing should be washed in a vertical position inside of the dishwasher to ensure it is thoroughly cleaned inside.

    After washing, the transfer tubes and the reaction vessel lids should be heated at 80 °C for at least two hours in a drying cabinet, since the materials of these accessories absorb reaction products. This step further reduces the possibility of carryover to the next measurement which leads to unstable measurement results.

    Maintenance

    Depending on the use of the Rancimat, a regular visual check of the air filter on the back of the instrument is recommended. A clogged filter will lead to fluctuating air flows. The molecular sieve may also need regular replacement, depending on instrument usage.

    I hope that these tips have given you some helpful suggestions which will save you a little time and troubleshooting when using the Rancimat for determination of the oxidation stability of edible oils and other products. Good luck with your determinations!

    Want to learn more?

    Check out all of the stability measurement options offered by Metrohm.

    Post written by Simon Lüthi, PM Titration (Meters & Measuring Instruments) at Metrohm International Headquarters, Herisau, Switzerland.