ABSTRACT
We report a new approach to measuring very low rates of water vapor transmission through high-performance barrier layers, based on detection of the water vapor by cavity ring-down infrared spectroscopy. It provides accurate and traceable measurements with a detection limit for water vapor transmission significantly below 1 × 10⁻⁴ g/m²/day. The system is underpinned by dynamic reference standards of water vapor generated between 5 and 2000 nmol/mol with an estimated relative expanded uncertainty of ±2%. It has been compared with other methods and demonstrates good comparability.
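At its core, the measurement described above converts a detected water-vapor amount fraction in a carrier gas stream into a mass flux per unit of barrier area. A minimal sketch of that conversion, in which all numerical inputs (carrier flow rate, sample area, measured amount fraction) are hypothetical illustrations rather than values from the paper:

```python
# Illustrative conversion of a measured water-vapor amount fraction
# (e.g. from cavity ring-down spectroscopy of the dry-side sweep gas)
# into a water vapor transmission rate (WVTR) in g/m^2/day.
# All numerical inputs below are hypothetical.

M_H2O = 18.015   # g/mol, molar mass of water
V_M = 22.414     # L/mol, molar volume of an ideal gas at 0 degC, 1 atm

def wvtr_g_per_m2_day(x_h2o, carrier_flow_l_min, sample_area_m2):
    """WVTR from amount fraction x_h2o (mol/mol) of water vapor in a
    carrier gas sweeping the dry side of a barrier sample."""
    n_total = carrier_flow_l_min / V_M           # mol/min of carrier gas
    m_h2o_per_min = x_h2o * n_total * M_H2O      # g/min of permeated water
    return m_h2o_per_min * 1440.0 / sample_area_m2   # g per m^2 per day

# Example: 10 nmol/mol above background in a 0.1 L/min sweep flow
# over a 50 cm^2 sample area.
rate = wvtr_g_per_m2_day(10e-9, 0.1, 50e-4)
print(f"WVTR = {rate:.2e} g/m^2/day")
```

The sketch shows why sub-10⁻⁴ g/m²/day detection demands trace-level water-vapor standards: at these transmission rates, the excess water in the sweep gas sits at nmol/mol amount fractions.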
ABSTRACT
A system for generating traceable reference standards of water vapor at trace levels between 5 and 2000 nmol/mol has been developed. It can provide different amount fractions of trace water vapor by using continuous accurate measurements of mass loss from a permeation device coupled with a dilution system based on an array of critical flow orifices. An estimated relative expanded uncertainty of ±2% has been achieved for most amount fractions generated. The system has been used in an international comparison and demonstrates excellent comparability with National Metrology Institutes maintaining standards of water vapor in this range using other methods.
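The generation principle described above combines a gravimetrically measured mass-loss rate from the permeation device with a known dilution flow to yield the delivered amount fraction. A minimal sketch of that calculation, together with a simple quadrature combination of two assumed uncertainty components; all numerical values (mass-loss rate, flow, uncertainty components) are hypothetical, not taken from the paper:

```python
# Illustrative calculation of the water-vapor amount fraction delivered
# by a permeation device diluted into a dry carrier gas stream, plus a
# simple quadrature combination of two dominant uncertainty components.
# All numerical values are hypothetical.
import math

M_H2O = 18.015   # g/mol, molar mass of water
V_M = 22.414     # L/mol, molar volume of an ideal gas at 0 degC, 1 atm

def amount_fraction_nmol_mol(mass_loss_ng_min, dilution_flow_l_min):
    """Amount fraction (nmol/mol) of water vapor in the diluted stream."""
    n_h2o_nmol = mass_loss_ng_min / M_H2O    # nmol/min of water vapor
    n_total_mol = dilution_flow_l_min / V_M  # mol/min of diluent gas
    return n_h2o_nmol / n_total_mol          # (nmol/min)/(mol/min) = nmol/mol

def expanded_uncertainty_percent(u_mass_pct, u_flow_pct, k=2):
    """Relative expanded uncertainty (k=2) from two relative standard
    uncertainties combined in quadrature (hypothetical components)."""
    return k * math.sqrt(u_mass_pct**2 + u_flow_pct**2)

# Example: 100 ng/min mass loss diluted into 1 L/min of dry carrier
x = amount_fraction_nmol_mol(100.0, 1.0)
u = expanded_uncertainty_percent(0.7, 0.7)
print(f"x = {x:.1f} nmol/mol, U(k=2) = {u:.2f} %")
```

Varying the dilution flow through an array of critical flow orifices, as the abstract describes, shifts the delivered amount fraction across the 5 to 2000 nmol/mol range without changing the permeation source.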
ABSTRACT
We report the use of a calibration transfer strategy to correct for drift in the quantitative sensitivity of a portable quadrupole mass spectrometer (QMS) aimed at process monitoring applications. Gas mixtures of CH4/Ar/C2H6/CO2 were studied, with calibration-phase measurements made of the pure gases for a univariate analysis and of 40 multi-component mixtures for a multivariate approach. To evaluate the calibrations, test-set spectra of a CH4/Ar/C2H6/CO2 gas mixture were recorded every two weeks over a period of 12 months. As part of the strategy, a standard of pure argon was measured during both the calibration and test phases so that correction factors could be calculated for each measurement day. It was shown that, in the absence of a calibration transfer strategy, quantifications of test-set spectra could be inaccurate by more than an order of magnitude over 12 months. Furthermore, owing to drift in the sensitivity over the 6 days required to record the training set in the calibration phase, the multivariate analysis quantified test spectra less accurately than the univariate analysis. However, by applying the calibration transfer strategy across all measurements (both calibration and test phases), the errors in prediction using the multivariate analysis previously seen after 2 weeks were not observed until approximately 12 months later.
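The single-standard transfer idea above can be sketched in a few lines: the pure-argon reference is measured on every measurement day, and the ratio of its response on the calibration day to its response today rescales all intensities before quantification. The peak intensities, sensitivity, and drift factor below are hypothetical values chosen for illustration, not data from the study:

```python
# Minimal sketch of single-standard calibration transfer for a QMS.
# A pure-argon transfer standard is measured on every measurement day;
# the ratio of its calibration-day response to its current response
# gives a per-day correction factor applied before quantification.
# All numerical values are hypothetical.

def correction_factor(ar_ref_calibration, ar_ref_today):
    """Drift correction factor from the argon transfer standard."""
    return ar_ref_calibration / ar_ref_today

def quantify_univariate(intensity, sensitivity, factor=1.0):
    """Amount fraction from a (drift-corrected) single peak intensity."""
    return intensity * factor / sensitivity

# Calibration day: pure CH4 gives 2.0e-7 A at m/z 16 for x(CH4) = 1.0,
# and the argon standard reads 5.0e-8 A at m/z 40.
sens_ch4 = 2.0e-7
ar_cal = 5.0e-8

# Months later the instrument sensitivity has halved; without the
# transfer standard, quantified amount fractions would be low by 2x.
ar_today = 2.5e-8
i_ch4_today = 6.0e-8   # observed CH4 peak in a test mixture

f = correction_factor(ar_cal, ar_today)
x_uncorrected = quantify_univariate(i_ch4_today, sens_ch4)
x_corrected = quantify_univariate(i_ch4_today, sens_ch4, f)
print(f"factor = {f}, uncorrected = {x_uncorrected}, corrected = {x_corrected}")
```

The same per-day factor can rescale whole spectra before a multivariate model is applied, which is how the strategy also removes the day-to-day drift accumulated while recording a multi-day training set.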