
HILIC-MS impurity profiling of therapeutic PS-oligonucleotides

Ion-Pairing Hydrophilic Interaction Chromatography for Impurity Profiling of Therapeutic Phosphorothioated Oligonucleotides

Oligonucleotides are short strands of synthetic DNA or RNA that are produced via solid-phase synthesis, during which numerous closely related impurities are generated. Ion-pairing reversed-phase liquid chromatography (IP-RPLC), anion-exchange chromatography (AEX), and hydrophilic interaction chromatography (HILIC) are often used to profile these impurities and allow good separation of impurities comprising a different number of nucleotides than the full-length product (FLP). However, impurities comprising the same number of nucleotides as the FLP are often not separated. Therefore, ion-pairing HILIC (IP-HILIC) was explored as an alternative separation mode to overcome these challenges.

Key points:

  • Changed selectivity: by adding ion-pairing reagents (IPRs) to the HILIC eluent, the relative contribution of the highly polar phosphate moieties to HILIC retention is reduced, thereby increasing the relative contribution of the nucleobase composition and conjugated groups.
  • Suppressed diastereomer separation: phosphorothioation of the phosphate groups results in the formation of diastereomers, with 2^n possible diastereomers (n = number of phosphorothioate linkages). IPRs in the HILIC eluent reduced diastereomer separation, leading to sharper peaks (a brief worked example of this 2^n scaling is given below the list).
  • Separation of same-length impurities: IP-HILIC shows increased separation of impurities comprising the same number of nucleotides as the FLP, such as deaminated products that differ by less than 1 Da from the FLP. This is noteworthy, as no other MS-compatible, one-dimensional LC separation can achieve this.
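For a sense of scale (the oligonucleotide length here is a hypothetical example, not taken from the study): a fully phosphorothioated 20-mer contains n = 19 internucleotide linkages, so

\[
2^{n} = 2^{19} = 524\,288
\]

possible diastereomers underlie what should ideally elute as a single FLP peak, which is why suppressing their partial separation yields sharper chromatography.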

Figure 1: IP-HILIC-MS total and extracted ion chromatograms of GalNAc-conjugated oligonucleotides (top) and non-conjugated oligonucleotides (bottom) and the mass spectra of peaks A-D (A & C: FLPs, B & D: deaminated products)

The developed IP-HILIC method shows great potential as a screening method for quality control. The work is published in Analytical Chemistry and can be accessed via the following link: https://pubs.acs.org/doi/full/10.1021/acs.analchem.5c01407

 


SEC-MS and Enzymes for polyester polymer analysis

CAST scientist Masashi Serizawa recently published a manuscript in which he investigated a novel size-exclusion chromatography method hyphenated with mass spectrometry and ultraviolet detection (SEC-MS/UV) to characterize poly(lactide-co-glycolide) (PLGA), and he also co-authored work on the use of enzymes for polyester polymer degradation.

 

  • SEC-MS characterization of PLGA:

Size-exclusion chromatography (SEC) hyphenated with MS is valuable for microstructure analysis. While SEC-UV/RI determines the molecular weight distribution (MWD), SEC-MS is used for the chemical composition distribution (CCD) and functionality-type distribution (FTD). However, previous applications of SEC-MS have failed to address the risk of polymer fragmentation during the analysis. It is crucial to establish whether SEC-MS methods can be applied to biodegradable polymers and to recognize whether fragmentation occurs during the SEC separation or during the ESI-MS process.

 

This study addresses fragmentation in PLGA analysis by optimizing SEC-MS conditions. We demonstrate that cesium iodide (CsI) minimizes fragmentation during electrospray ionization (ESI-MS), simplifying spectra and enabling differentiation of PLGA isomers. This facilitates accurate determination of CCD and FTD, even revealing “blockiness” when coupled with selective degradation.

Figure 1: schematic illustration of in-source fragmentation in SEC-MS, depending on ionization agents

The study is supported by the COAST/TKI-Chemistry POLY-SEQU-ENCHY project between the UvA and Corbion and is funded by Mitsubishi Chemical Corporation. This work was recently published in the Journal of the American Society for Mass Spectrometry and can be accessed freely at the link below:

https://doi.org/10.1021/jasms.4c00447

 

  • Insights into the selectivity of enzymes for polyester co-polymer degradation

Another example of the SEC-MS/UV polymer application conducted by our group is the structural analysis of aromatic/aliphatic polyesters. To understand the polymer chain structure of aromatic/aliphatic polyesters, Eman et al. successfully developed two novel thermostable cutinases that primarily degrade aliphatic ester bonds. Thanks to enzyme engineering, these enzymes maintain activity at elevated temperatures of up to 90 °C.

For both enzymes, higher hydrolysis rates were observed for aliphatic compared to aromatic homo-polyesters. SEC-MS analysis revealed that the hydrolysis of aliphatic/aromatic co-polyesters occurred at the aliphatic monomers, significantly reducing the molecular weight and changing the end-group composition. These results underline the importance of co-polymer composition in the biodegradation of co-polymer systems and demonstrate the applicability of enzymes for the analytical characterization of synthetic polymers by selectively reducing their molecular weight.

Figure 2: Results of SEC-MS/UV analysis of a copolymer containing aromatic/aliphatic polyesters, comparing before and after enzymatic degradation (degradations were performed at 71 °C using the thermostable cutinase)

This research was funded by Topconsortium voor Kennis en Innovatie (TKI) Chemie, deployment project PPS-programma toeslag 2019 (CHEMIE.PGT.2020.020). This work was recently published in Chemistry – A European Journal and can be accessed at the link below:

https://doi.org/10.1002/chem.202403879


NPLC method to characterize end groups of poly(lactic acid-co-glycolic acid) copolymers

CAST scientist Masashi Serizawa recently published a manuscript in which he investigated a novel gradient-elution normal-phase liquid chromatography method using basic and acidic additives to simultaneously separate PLGAs by end group and by chemical composition.

PLGA is an important material in drug delivery systems. It is used in nanoparticle-containing drugs to prevent a sudden increase in drug concentration in the body when the drug is ingested. The LA/GA ratio and differences in the terminal structure of PLGA have a significant effect on the degradation rate of PLGA in the body.

To resolve these differences, we created a unique ternary-gradient liquid chromatography method utilizing base and acid additives. Initially, we used a gradient of hexane (a poor solvent) and ethyl acetate (a good solvent) with a mobile phase containing a base additive to separate the non-acid-terminated PLGAs (ester-terminated PLGA and cyclic PLGA) based on their chemical composition. Subsequently, by switching the mobile phase to THF containing an acid additive, we were able to elute the acid-terminated PLGA.

This method offers the advantage of quick analysis compared to traditional NMR methods, making it potentially valuable for future industrial research. Furthermore, it can be applied to high molecular weight PLGA of 180 kDa, making it useful for the development of high molecular weight PLGA, which is challenging to analyze using mass spectrometry techniques such as MALDI-TOF-MS.

 

Figure: (left) schematic illustration of the working principle of the NPLC separation. (right): key results obtained in the study

 

The study is supported by the COAST/ TKI-Chemistry POLY-SEQU-ENCHY project between the UvA and Corbion (Gorinchem, The Netherlands) and is funded by Mitsubishi Chemical Corporation.

The link to the publication is reported below.

https://doi.org/10.1016/j.chroma.2024.465137


2D-LC in industry: technological innovations reviewed

CAST scientist Rick van den Hurk wrote a review on recent developments in 2D-LC and the use of 2D-LC in industry. He did this under the supervision of Bob Pirok and in collaboration with Matthias Pursch (Dow) and Dwight Stoll (Gustavus Adolphus College).

Two-dimensional liquid chromatography (2D-LC) greatly advances the separation power of analytical separation science through increased peak capacity, as well as by offering more-tailored selectivity combinations.

However, the field of 2D-LC is, particularly in contrast to 2D-GC, still relatively immature and under significant development. In 2019, Bob Pirok, Dwight Stoll and Peter Schoenmakers published a review in which they examined the latest trends from 2015 until 2019 [1].

In this recent installment [2], Rick van den Hurk reviewed the innovations between 2019 and 2023. The review also devotes significant attention to the implementation of the technique in industry.

The authors examined over 200 articles and also compared these with the articles published prior to 2019. In their review, the authors concluded that mobile-phase mismatch continues to be an important focus area for the field, and several modulation strategies and new variants were discussed.

Van den Hurk and co-workers also noticed that a third of the publications had at least one author affiliated with industry. Application fields with particularly strong industrial involvement were polymer characterization, metabolomics, and pharmaceutical and biopharmaceutical analysis. Furthermore, industrial applications favored the use of heart-cut 2D-LC and largely employed on-line hyphenation. The authors did note that the database was likely missing a number of industrial works that are not published for confidentiality reasons.

Other important developments were the increased popularity of computer-aided strategies, alternative gradient-elution methods to facilitate modulation, as well as multi-stage, multi-dimensional separations, the latter of which were applied to the characterization of protein therapeutics.

Figure. Number of applications per application area distributed by non-comprehensive (light blue) and comprehensive (dark blue) applications between 2019 and 2023. Reproduced with permission from [2].

Two-dimensional liquid chromatography is of paramount importance to the PARADISE project of CAST scientist Bob Pirok in which multi-dimensional separation technology is used to achieve separation of highly complex samples. In addition, the project aims to characterize the correlation of different sample properties within a single analytical solution. The outcomes of this recent review are thus of value to the ongoing progress in the PARADISE project. Rick van den Hurk is a PhD candidate in the PARADISE project, which stands for Propelling Analysts by Removing Analytical-, Data-, Instrument- and Sample-related Encumbrances and receives funding from the Dutch Research Council (NWO), as well as a number of public and private organisations. Read more about the PARADISE project here.

In addition, the review was part of Pirok’s UPSTAIRS project, which aims to improve the accessibility of advanced separation technology by developing computational methods to leverage chromatographic theory in an unsupervised workflow. This project also receives funding from NWO.

The work was published in TrAC Trends in Analytical Chemistry as open access, and can thus be accessed for free here.

References

  1. Recent Developments in Two-Dimensional Liquid Chromatography: Fundamental Improvements for Practical Applications, B.W.J. Pirok, D.R. Stoll and P.J. Schoenmakers, Anal. Chem., 2019, 91(1), 240-263, DOI: 10.1021/acs.analchem.8b04841
  2. Recent trends in two-dimensional liquid chromatography, R.S. van den Hurk, M. Pursch, D.R. Stoll, B.W.J. Pirok, TrAC Trends in Analytical Chemistry, 2023, 166, 117166, DOI: 10.1016/j.trac.2023.117166

Algorithm for evaluation of background correction algorithms

Chromatographic signals comprise three components: (i) low-frequency baseline drift, (ii) high-frequency noise, and (iii) the mid-frequency peaks of interest. The first two contributions together form the “background” of the signal. Often, there is more background than chromatographic information, as each data point contains a background contribution. In such a case, or if the background is of a frequency very similar to that of the relevant signals, problems may occur with the interpretation of the data. For example, peak detection may be hindered, and errors in classification, discrimination, and, especially, quantification may occur [1].

We earlier reported how the past decade has seen the development of a plethora of different background-correction algorithms [1]. While this is a welcome trend, it is currently unclear which tool is useful in which case. There is thus a need for a tool to objectively compare these algorithms, allowing users to select the appropriate algorithm for their application.

Scientists often use simulated data for such a comparison, yet such data is controversial as it is regarded as poorly representing realistic cases. Experimental data solves this issue, but is difficult to generate for each type of chromatographic application. Moreover, it is impossible to determine the statistically true value of, for example, the area of any given peak in the chromatogram.

In this light, scientist Leon Niezen developed a data simulation tool to generate realistic data for use in algorithm comparison studies. The tool was designed to combine experimental baseline drift and noise signals with carefully modeled chromatographic peaks. For the latter, Niezen modelled experimental chromatographic peaks with distribution functions (Figure 1).
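The idea can be summarized in a minimal sketch (an illustration of the concept only, not the published tool; the peak parameters, drift shape and noise level below are all assumed for demonstration):

```python
import numpy as np
from scipy.stats import exponnorm

# Ground-truth chromatogram built from exponentially modified Gaussian
# (EMG) peaks, to which drift and noise are added afterwards.
t = np.linspace(0, 20, 4000)                     # time axis (min)

def emg_peak(t, area, t_r, sigma, tau):
    """Single EMG peak; the ratio tau/sigma controls the tailing."""
    return area * exponnorm.pdf(t, tau / sigma, loc=t_r, scale=sigma)

# Hypothetical peak table: (area, retention time, sigma, tau)
peaks = [(1.0, 4.0, 0.05, 0.08), (0.6, 7.5, 0.06, 0.10), (0.8, 12.0, 0.07, 0.05)]
true_signal = sum(emg_peak(t, *p) for p in peaks)

# Background: the published tool combines experimentally recorded drift and
# noise with the modelled peaks; a synthetic drift and white noise stand in here.
drift = 0.2 * np.sin(2 * np.pi * t / 40) + 0.01 * t
noise = np.random.default_rng(0).normal(scale=0.005, size=t.size)

simulated = true_signal + drift + noise          # input for a correction algorithm
```

Because the peaks are generated from known parameters, the “true” peak areas and the injected background are available as ground truth for any subsequent comparison.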

Figure 1. A) Fit for Modified Pearson VII and EMG distributions on experimental data, B) AIC values for the five best distribution models for each of the fitted peaks and C) zoomed-in fits and residuals for five individual peaks. Reproduced, with permission, from [2].

Niezen then applied the tool to evaluate a large number of the background-correction algorithms that have been developed. By varying signal properties, he was able to discern the strengths and weaknesses of the various algorithms. An example is shown in Figure 2, and a minimal sketch of the evaluation idea follows below the figure caption.

Figure 2. A) Root mean square error (RMSE) surfaces obtained for the various drift-correction methods in combination with the sparsity-assisted signal (SASS) smoothing algorithm [3]. Methods are indicated by the coloured dots. B) Bottom view (lowest values) resulting from the overlaid RMSE surfaces. Reproduced, with permission, from [2].
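Because the simulator injects a known drift, the residual error of any background-correction algorithm can be scored as an RMSE against that ground truth. A self-contained sketch of this evaluation step (the iterative polynomial baseline below is merely an illustrative stand-in, not one of the algorithms benchmarked in the publication):

```python
import numpy as np

# Synthetic signal with a known drift, one peak and white noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 20, 2000)
true_drift = 0.05 * t + 0.3 * np.exp(-((t - 10) ** 2) / 50)
peak = np.exp(-((t - 8) ** 2) / (2 * 0.05 ** 2))
signal = true_drift + peak + rng.normal(scale=0.005, size=t.size)

def poly_baseline(t, y, order=4, n_iter=20):
    """Iteratively re-fit a polynomial, clipping the signal to the fit so
    that peaks are progressively excluded from the baseline estimate."""
    y_work = y.copy()
    for _ in range(n_iter):
        coeffs = np.polyfit(t, y_work, order)
        baseline = np.polyval(coeffs, t)
        y_work = np.minimum(y_work, baseline)
    return baseline

estimated = poly_baseline(t, signal)
rmse = np.sqrt(np.mean((estimated - true_drift) ** 2))
print(f"RMSE versus the known drift: {rmse:.4f}")
```

Repeating this scoring while varying, for example, the drift amplitude or noise level produces RMSE surfaces of the kind shown in Figure 2.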

The tool was made publicly available and the work can be downloaded open access. Aside from being useful to scientists, the work will also be of significant importance to the automation (‘AutoLC’) project that is currently commencing in Amsterdam. The work by Niezen was funded by the UNMATCHED project, which is supported by BASF, Covestro, DSM, and Nouryon, and receives funding from the Dutch Research Council (NWO) in the framework of the Innovation Fund for Chemistry and from the Ministry of Economic Affairs in the framework of the “PPS-toeslagregeling”.

References

[1] Recent applications of chemometrics in one- and two-dimensional chromatography
T.S. Bos, W.C. Knol, S.R.A. Molenaar, L.E. Niezen, P.J. Schoenmakers, G.W. Somsen, B.W.J. Pirok, J. Sep. Sci. 43(9-10), 2020, 1678-1727, DOI: 10.1002/jssc.202000011 [OPEN ACCESS]

[2] Critical comparison of background correction algorithms used in chromatography, L.E. Niezen, P.J. Schoenmakers, B.W.J. Pirok, Anal. Chim. Acta, 2022, 1201, 339605, DOI: 10.1016/j.aca.2022.339605 [OPEN ACCESS]

[3] Sparsity-assisted signal smoothing (revisited), I. Selesnick, ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing – Proceedings, 2017, DOI: 10.1109/ICASSP.2017.7953017


Use and Limits of Scouting Experiments for Retention Modelling

Retention modelling is a useful technique which can substantially shorten the method-development process for LC separations [1]. Believe it or not, as complicated and distant from routine analytical practice as it may seem, it actually makes life easier.

Retention modelling saves us a lot of time

One branch of retention modelling in liquid chromatography employs scouting (or ‘scanning’) experiments to probe retention of analytes of interest. By fitting models to the recorded retention times, retention parameters are obtained for each individual analyte. Interestingly, this information can be used to predict retention times for the analytes under conditions different from those used for the scouting experiments. 

In other words, a computer can use these to simulate large numbers of hypothetical separation methods. By analyzing the resulting separations, optimal method parameters can be discerned. As a consequence, trial-and-error method development can be replaced by a number of scouting experiments and thus saves valuable time.
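As a minimal sketch of this workflow (the two-parameter log-linear retention model below is a common choice, but the compositions, retention factors and dead time are illustrative assumptions, not data from the cited work):

```python
import numpy as np
from scipy.optimize import curve_fit

# Retention modelling from isocratic scouting runs with the common
# two-parameter model ln k = ln k0 - S * phi (phi = modifier fraction).
phi_scout = np.array([0.30, 0.40, 0.50])   # scouting compositions (assumed)
k_scout = np.array([9.2, 3.6, 1.5])        # measured retention factors (assumed)

def ln_k_model(phi, ln_k0, S):
    return ln_k0 - S * phi

(ln_k0, S), _ = curve_fit(ln_k_model, phi_scout, np.log(k_scout))

# Predict retention under a condition that was never measured.
phi_new = 0.45
k_pred = np.exp(ln_k_model(phi_new, ln_k0, S))
t0 = 1.0                                    # column dead time (min), assumed
print(f"k at phi={phi_new}: {k_pred:.2f} -> t_R ~ {t0 * (1 + k_pred):.2f} min")
```

Once such parameters are available for every analyte of interest, the same calculation can be repeated in silico for thousands of candidate methods, which is exactly the optimization step described above.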

Figure 1. Workflow of the method optimization using scanning gradients to obtain retention-model parameters. The workflow starts at the top right with an insufficiently resolved sample, on which scouting experiments are performed to yield retention parameters. These can be used to compute optimal method parameters, thus saving a lot of valuable method-development time. Reproduced with permission from [2].

We have earlier explained why, ideally, we would like to use gradient elution during these scanning experiments, something also shown by Vivó-Truyols et al. [3]. In contrast to isocratic elution, gradients are more practical and save a lot of time, at the cost of significantly more challenging modelling and less accurate retention parameters.

With the entire method-development automatization approach hinging on the accuracy of retention parameters, it is imperative that the usefulness of the gradient experiments is mapped and, if possible, improved.

Optimizing the separation of unknown degradation products

CAST member and PhD candidate Mimi den Uijl develops analytical methods based on 2D-LC to degrade and characterize small molecules and their degradation products. Under the lead of Prof. Maarten van Bommel, her application areas range from cultural-heritage art objects, to environmental aqueous samples, to foodstuffs (Unilever). While the identity of the parent molecules is sometimes known, this is certainly not the case for degradation products. Yet optimized methods are needed to characterize such samples.

Den Uijl and co-workers participated in a large collaboration between the teams of Bob Pirok (University of Amsterdam) and Dwight Stoll (Gustavus Adolphus College), and – in their project – systematically investigated the use and limits of gradient experiments for method-development workflows.

Figure 2. Combined results of all investigated factors. The box-and-whisker plots represent the average prediction error of all the compounds for a noisy dataset (top) and highly-precise (bottom) dataset. See publication for more details. Reproduced with permission from [2].

For the first stage, published in Journal of Chromatography A, two datasets of various small molecules ranging in chemical properties were generated [2]. One dataset was recorded with very high measurement precision relative to the other. Interestingly, some of the results differed between the two sets.

Several key observations

These and other key observations are detailed in the article, which was published in Journal of Chromatography A as an open-access article. This means you can download and read it for free. Meanwhile, we decided to continue this project, so we hope to be able to report more about this at a later stage.

References

[1] Recent applications of retention modelling in liquid chromatography, M.J. den Uijl, P.J. Schoenmakers, B.W.J. Pirok, and M.R. van Bommel, J. Sep. Sci., 2020, DOI: 10.1002/jssc.202000905.

[2] Measuring and using scanning-gradient data for use in method optimization for liquid chromatography, Mimi J. den Uijl, Peter J. Schoenmakers, Grace K. Schulte, Dwight R. Stoll, Maarten R. van Bommel, Bob W.J. Pirok, J. Chromatogr. A, 2021, 1636, 461780, DOI: 10.1016/j.chroma.2020.461780

[3] Error analysis and performance of different retention models in the transference of data from/to isocratic/gradient elution, G. Vivó-Truyols, J.R. Torres-Lapasió, M.C. García-Alvarez-Coque, J. Chromatogr. A, 2003, 1018(2), 169-181, DOI: 10.1016/j.chroma.2003.08.044.

[4] Reducing the influence of geometry-induced gradient deformation in liquid chromatographic retention modelling, T.S. Bos, L.E. Niezen, M.J. den Uijl, S.R.A. Molenaar, S. Lege, P.J. Schoenmakers, G.W. Somsen, B.W.J. Pirok, J. Chromatogr. A, 2021, 1635, 461714, DOI: 10.1016/j.chroma.2020.461714.

Mimi den Uijl was a PhD student in the TooCOLD (Toolbox for studying the Chemistry Of Light-induced Degradation) project at the University of Amsterdam. In this project, Mimi developed light-induced reaction modulators for use in 2D-LC. She defended her PhD in 2022.


Reducing Effect of Gradient Deformation For LC Retention Modelling

Retention modelling is a useful technique which can substantially shorten the method-development process for LC separations. One approach utilizes so-called scanning (or ‘scouting’) experiments using isocratic or gradient elution [1]. Here, a number of pre-defined methods are employed to record retention times, to which empirical models are fitted.

Isocratic experiments will generally yield reliable, solid datasets that are very suitable for retention modelling. Using isocratic elution is, however, not always very practical. Indeed, some of these scouting experiments can take rather long. Moreover, some manual fine-tuning and experience with the analytes in question are needed to identify the appropriate modifier concentrations.

In contrast, gradient elution allows rather quick and easy scanning experiments, at a significant cost to the usefulness of the resulting data. Where isocratic experiments directly measure the retention factor at a certain modifier fraction (φ), the retention time in gradient elution depends on the gradient experienced by the analyte.
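This dependence can be made explicit with the general equation of gradient elution (a textbook relation, not specific to the cited work). Neglecting the dwell time, the retention time t_R follows implicitly from the composition profile φ(t) experienced by the analyte:

\[
\int_{0}^{t_R - t_0} \frac{\mathrm{d}t}{k\bigl(\varphi(t)\bigr)} = t_0 ,
\qquad \text{with, for example, } \ln k(\varphi) = \ln k_0 - S\,\varphi .
\]

Any distortion of the delivered φ(t) therefore changes the integrand, and thereby the retention parameters that are back-calculated from measured retention times.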


Figure 1. Schematic illustrating a programmed linear gradient and the experienced gradients for two different systems.

However, as the programmed change in composition produced by the pump migrates through the chromatographic system, its shape is altered. In Figure 1, above, we can see how this leads to the familiar difference between the programmed (dark blue) and effective (purple, light blue) gradients for two different systems.

This deviation is the product of an array of effects, such as the morphology of and inefficiencies in the pump components, the chromatographic system volumes, and the accuracy of the pumped mobile-phase composition (A vs. B). The latter can rather easily deviate if the pump does not take into account the change in density as φ increases.

Figure 2. Response functions of systems 1 and 2. The shape essentially represents the differences between the programmed and effective gradients shown in Figure 1.

The overall effect can be represented by response functions. These functions essentially describe the difference between the programmed and measured gradient. Two examples for two different systems are shown above in Figure 2. Indeed, depending on the pump characteristics, dramatic changes can be observed.
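A minimal sketch of how such a response function can be applied (the exponential mixing response and its time constant are assumed purely for illustration; they are not the response functions reported for the instruments in the study):

```python
import numpy as np

# Programmed linear gradient: phi from 0.05 to 0.95 over 10 min.
dt = 0.01                                   # sampling interval (min)
t = np.arange(0, 15, dt)
phi_prog = np.clip(0.05 + (0.95 - 0.05) * t / 10.0, 0.05, 0.95)

# Assumed system response: exponential washout of a mixing volume,
# truncated at five time constants and normalised to unit area.
tau = 0.4                                   # illustrative mixing time constant (min)
n_h = int(5 * tau / dt)
h = np.exp(-np.arange(n_h) * dt / tau)
h /= h.sum()

# Effective gradient = programmed gradient convolved with the response.
# Padding with the initial composition avoids an artificial start-up dip.
phi_padded = np.concatenate([np.full(n_h, phi_prog[0]), phi_prog])
phi_eff = np.convolve(phi_padded, h)[n_h : n_h + t.size]

# phi_prog - phi_eff then plays the role of the response functions in
# Figure 2: the effective gradient lags behind and is smoothed.
```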

 

The problem is complicated further as the recorded dwell curve may itself represent an inaccurate depiction. Depending on the detector, solvatochromic effects and the presence of other mobile-phase components can severely convolute the true depiction of the experienced gradient.

For modelling, gradient deformation is a problem because retention parameters are fitted under the assumption that the analytes experience the programmed gradient; any deviation between the programmed and effective gradient therefore propagates into errors in the fitted parameters and the resulting predictions.

As part of a larger collaboration with Agilent Technologies in the “DAS PRETSEL” project, Tijmen Bos, with the assistance of fellow CAST members Mimi den Uijl, Leon Niezen and Stef Molenaar, developed an algorithm to partially reduce the effects of gradient deformation.

In their work, Bos et al. showed that gradient deformation significantly impacts retention parameters. By fitting so-called Stable distribution functions to the measured dwell curves, the authors were able to significantly reduce the prediction errors for water-water systems (Figure 3). Conveniently, the Stable parameters turned out to be related to physical parameters of the chromatographic system.
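A rough sketch of that fitting step, with a synthetic stand-in for the measured dwell curve and an arbitrary parametrization chosen only for illustration (the actual curves, parameter values and fitting details are in the publication):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import levy_stable

# Sketch: fit a Stable cumulative distribution function to a measured
# dwell/step curve (detector response to a step change in composition,
# normalised between 0 and 1). The "measured" data below are synthetic.
t = np.linspace(0, 5, 300)                           # time (min)
rng = np.random.default_rng(2)
measured = levy_stable.cdf(t, 1.6, 0.9, loc=1.2, scale=0.25)
measured = measured + rng.normal(scale=0.01, size=t.size)

def stable_step(t, alpha, beta, loc, scale):
    """Stable CDF used as a model for the deformed step response."""
    return levy_stable.cdf(t, alpha, beta, loc=loc, scale=scale)

p0 = [1.5, 0.5, 1.0, 0.3]                            # rough initial guess
bounds = ([0.5, -1.0, 0.0, 0.01], [2.0, 1.0, 5.0, 2.0])
params, _ = curve_fit(stable_step, t, measured, p0=p0, bounds=bounds)
print("fitted (alpha, beta, loc, scale):", params)
```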

Figure 3. Relative errors (%) in the predicted retention times of the test compounds on Instruments 2 (top) and 3 (bottom) obtained when using retention parameters determined for the test compounds on Instrument 1 at different flow rates. Please see the publication for details about the instruments. Reproduced with permission from [2].

This work is part of a larger project. In this first stage, we mainly targeted the geometric influences. Now, we shift our focus to more complicated solvent systems and also to the effect on larger molecular systems.

The work was recently published open access in Journal of Chromatography A and can be downloaded for free here. An accompanying video pitch can be viewed below. Readers interested in learning more about retention modelling and its application areas are referred elsewhere [1].

References

[1] Recent applications of retention modelling in liquid chromatography, M.J. den Uijl, P.J. Schoenmakers, B.W.J. Pirok, and M.R. van Bommel, J. Sep. Sci., 2020, DOI: 10.1002/jssc.202000905.

[2] Reducing the influence of geometry-induced gradient deformation in liquid chromatographic retention modelling, T.S. Bos, L.E. Niezen, M.J. den Uijl, S.R.A. Molenaar, S. Lege, P.J. Schoenmakers, G.W. Somsen, B.W.J. Pirok, J. Chromatogr. A, 2021, 1635, 461714, DOI: 10.1016/j.chroma.2020.461714.

The Authors

Tijmen Bos

Mimi den Uijl

Leon Niezen

Stef Molenaar

Researchers Bos, Niezen and Molenaar were part of the UNMATCHED project, which was supported by BASF, DSM and Nouryon, and received funding from the Dutch Research Council (NWO). Den Uijl was part of the TooCOLD project, which was supported by Unilever and NWO. You can read more about them and find their contact info on the Team page.


Recent applications of retention modelling in LC

Ever since the 1970s, retention modelling has been a point of interest for the characterization of retention mechanisms. In Amsterdam, modelling of retention is mainly conducted for the purpose of method optimization. Spurred by applications of retention modelling to characterize HILIC, the field has recently seen a significant number of developments. Furthermore, the rapidly growing technological capabilities in data science continue to introduce new opportunities to model retention.

PhD candidate Mimi den Uijl (Van ‘t Hoff Institute for Molecular Sciences) set out to track these developments, covering mainly the last five years [1]. Focusing on applications of the modelling of mobile-phase effects, Den Uijl found five main categories under which most applications could be classified: method optimization, method transfer, stationary-phase characterization, selectivity characterization and lipophilicity characterization.

Den Uijl identified a number of focus areas for which retention modelling was mainly applied and mapped their generic workflows. Reprinted with permission from [1].

"There is currently no consensus on the quality of retention models, which frustrates the comparison and evaluation of models. Reported prediction errors range from 0.1 to 10%, but almost all authors speak of “accurate” or “good” models."

Den Uijl furthermore reviewed the use of individual models and found that a surprisingly small number of studies reported numerical evaluations of the regression. As one of her conclusions, Den Uijl noted that model parameters may eventually be used as system-independent retention data, if numerical evaluation data were provided.

Her review, which she wrote together with Peter Schoenmakers, Maarten van Bommel and Bob Pirok, was published open-access in the special Reviews 2021 issue of Journal of Separation Science. The publication can freely be accessed here.

References

[1] Recent applications of retention modelling in liquid chromatography, M.J. den Uijl,  P.J. Schoenmakers,  B.W.J. Pirok, and  M.R. van Bommel, J. Sep. Sci., 2020, DOI: 10.1002/jssc.202000905.

Mimi den Uijl was a PhD student in the TooCOLD (Toolbox for studying the Chemistry Of Light-induced Degradation) project at the University of Amsterdam. In this project, Mimi developed light-induced reaction modulators for use in 2D-LC. You can read more about her on the Team page.


Detection Challenges in Polymer Analysis with LC

With their large distributions culminating in wide envelopes of – almost exclusively – co-eluting peaks, polymers certainly present a unique challenge relative to the analysis of small molecules. If anything, this challenge has spurred innovations which have also benefited other fields. A good example has been the work on polymer analysis with 2D-LC in the first years of this millennium, which have certainly contributed to recent developments on the technique [1,2].

In his review, Wouter Knol (Van ‘t Hoff Institute for Molecular Sciences) covers another research area which offers a lot of room for innovation: detection. Together with his co-authors, Knol provided an exhaustive overview of applications of detection techniques in LC for polymer analysis. For each detection technique, notable recent applications are discussed and the authors distilled the key advantages and disadvantages of each approach.

One particularly useful trait of the review is this table which summarizes all key strengths and weaknesses of each detection technique employed for LC in polymer analysis. Reprinted with permission from [3].

Knol and co-workers noted that several promising approaches receive surprisingly little attention in recent literature, and he concluded that these opportunities deserve more attention. You can download and read the paper, which was published open access in the special Reviews 2021 issue of Journal of Separation Science, here.

Example of a separation of complex polyether polyols with LC×LC-MS by Groeneveld et al. showing a clear structure based on the number of ethylene oxide/propylene oxide units in the polymer. Reprinted with permission from [4].

References

[1] Recent Developments in Two-Dimensional Liquid Chromatography: Fundamental Improvements for Practical Applications
B.W.J. Pirok, D.R. Stoll and P.J. Schoenmakers, Anal. Chem., 2019, 91(1), 240-263, DOI: 10.1021/acs.analchem.8b04841

[2] Comprehensive Two-Dimensional Ultrahigh-Pressure Liquid Chromatography for Separations of Polymers E. Uliyanchenko, P.J.C.H. Cools, Sj. van der Wal and P. J. Schoenmakers, Anal. Chem. 2012, 84, 18, 7802–7809, DOI: 10.1021/ac3011582

[3] Detection challenges in quantitative polymer analysis by liquid chromatography, W.C. Knol, B.W.J. Pirok, and R.A.H. Peters, J. Sep. Sci. 2020, DOI: 10.1002/jssc.202000768

[4] Characterization of complex polyether polyols using comprehensive two-dimensional liquid chromatography hyphenated to high-resolution mass spectrometry G. Groeneveld, M.N. Dunkle, M. Rinken, A.F.G. Gargano, A. de Niet, M. Pursch, E.P.C. Mes, and P.J. Schoenmakers, J. Chromatogr. A, 1569, 2018,  128-138, DOI: 10.1016/j.chroma.2018.07.054

Wouter Knol was a PhD candidate in the group of Peter Schoenmakers and Bob Pirok. He worked in the UNMATCHED project in Amsterdam and mainly focused on techniques to determine the sequence distribution of polymers.

You can read more about him on our Team page.


Liquid Chromatography in the Oil and Gas Industry

With the petroleum industry representing roughly 40% of the chemical industry [1], the analysis of petroleum-related samples is of interest for the optimization of refining processes, studying environmental pollution or monitoring of other processes, such as biodegradation. While gas chromatography is commonly used for most petrochemical samples, liquid-phase separations may be employed for the difficult cases.

To cover the applications of liquid chromatography in this context, MSc graduate student Denice van Herwerden compiled these in a chapter contributed to the book Analytical Techniques in the Oil and Gas Industry for Environmental Monitoring [2].

In her review, Van Herwerden observed that LC is mainly applied for the analysis of heavier petroleum fractions (due to their high boiling points), thermally labile compounds, and, for example, acidic compounds that would require a derivatization step prior to GC analysis. She also discussed applications where LC is used as a pre-separation technique to decrease the sample dimensionality prior to GC analysis. Denice found that, while both on-line and off-line coupling may be used, off-line coupling has recently been favored.

Finally, Van Herwerden addressed a limited number of applications of comprehensive two-dimensional liquid chromatography to the analysis of heavy-oil fractions and derivatives.

Analytical Techniques in the Oil and Gas Industry for Environmental Monitoring (ISBN: 9781119523307) is a book edited by Melissa Dunkle and William Winniford. The book was published this August and includes 11 chapters. The chapter on LC applications, which was co-written by Bob Pirok and Peter Schoenmakers, can be found here.

References

[1] Hazardous Effects of Petrochemical Industries: A Review. A. Sharma, P. Sharma, A. Sharma, R. Tyagi and A. Dixit, Advances in Petrochemical Science, 2018, 3(2), 2–4, DOI: 10.19080/rapsci.2017.03.555607

[2] Liquid Chromatography: Applications for the Oil and Gas Industry, D. van Herwerden, B.W.J. Pirok, and P.J. Schoenmakers, Analytical Techniques in the Oil and Gas Industry for Environmental Monitoring, 2020, John Wiley & Sons, Inc., ISBN: 9781119523307, DOI: 10.1002/9781119523314.ch5