Results 1 - 13 of 13
1.
Int J Mol Sci ; 23(6)2022 Mar 17.
Article in English | MEDLINE | ID: mdl-35328645

ABSTRACT

Flow cytometry is widely used within the manufacturing of cell and gene therapies to measure and characterise cells. Conventional manual data analysis relies heavily on operator judgement, presenting a major source of variation that can adversely impact the quality and predictive potential of therapies given to patients. Computational tools have the capacity to minimise operator variation and bias in flow cytometry data analysis; however, in many cases confidence in these technologies has yet to be fully established, a gap mirrored by aspects of regulatory concern. Here, we employed synthetic flow cytometry datasets with controlled population separation and normal/skewed distributions to investigate the accuracy and reproducibility of six cell population identification tools, each of which implements a different unsupervised clustering algorithm: Flock2, flowMeans, FlowSOM, PhenoGraph, SPADE3 and SWIFT (density-based, k-means, self-organising map, k-nearest neighbour, deterministic k-means and model-based clustering, respectively). We found that outputs from software analysing the same reference synthetic dataset vary considerably, and that accuracy deteriorates as the cluster separation index falls below zero. Consequently, as clusters begin to merge, flowMeans and Flock2 struggle to identify target clusters more than the other platforms do. Moreover, the presence of skewed cell populations resulted in poor performance from SWIFT, whereas FlowSOM, PhenoGraph and SPADE3 were comparatively unaffected. These findings illustrate how novel synthetic flow cytometry datasets can be utilised to validate a range of automated cell identification methods, leading to enhanced confidence in the data quality of automated cell characterisations and enumerations.
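
A minimal sketch of the underlying idea, assuming a generic separation index (distance between cluster means minus a multiple of the spread, not necessarily the paper's definition) and scikit-learn's k-means as a stand-in clusterer:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def separation_index(mu1, mu2, sd, k=2.0):
    # Assumed, illustrative definition: positive when the +-k*sd envelopes do not overlap.
    return np.linalg.norm(mu1 - mu2) - 2 * k * sd

for gap in [8.0, 4.0, 2.0, 1.0]:
    mu1, mu2, sd = np.zeros(2), np.array([gap, 0.0]), 1.0
    x = np.vstack([rng.normal(mu1, sd, (500, 2)), rng.normal(mu2, sd, (500, 2))])
    truth = np.repeat([0, 1], 500)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(x)
    acc = max(np.mean(labels == truth), np.mean(labels != truth))   # label-swap safe
    print(f"separation index {separation_index(mu1, mu2, sd):+.1f}  accuracy {acc:.2f}")
```

As the gap shrinks and the index goes negative, the labelling accuracy drifts towards chance, mirroring the deterioration reported above.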


Subject(s)
Data Analysis , Software , Algorithms , Cluster Analysis , Flow Cytometry/methods , Genetic Therapy , Humans , Reproducibility of Results
2.
PDA J Pharm Sci Technol ; 76(3): 200-215, 2022.
Article in English | MEDLINE | ID: mdl-35031542

ABSTRACT

Application of synthetic datasets in the training and validation of analysis tools has led to improvements in many decision-making tasks in a range of domains, from computer vision to digital pathology. Synthetic datasets overcome the constraints of real-world datasets, namely difficulties in collection and labeling, expense, time, and privacy concerns. In flow cytometry, real cell-based datasets are limited by properties such as size, number of parameters, distance between cell populations, and distributions, and are often focused on a narrow range of disease or cell types. Researchers in some cases have designed these desired properties into synthetic datasets; however, operators have implemented them inconsistently, and there is a scarcity of publicly available, high-quality synthetic datasets. In this research, we propose a method to systematically design and generate flow cytometry synthetic datasets with highly controlled characteristics. We demonstrate the generation of two-cluster synthetic datasets with specific degrees of separation between cell populations, and of non-normal distributions with increasing levels of skewness and orientations of skew pairs. We apply our synthetic datasets to test the performance of a popular automated cell population identification software package, SPADE3, and define the region where the software performance decreases as the clusters get closer together. Application of the synthetic skewed dataset suggests the software is capable of processing non-normal data. We calculate the classification accuracy of SPADE3 with a robustness not achievable using real-world datasets. Our approach aims to advance research toward the generation of high-quality synthetic flow cytometry datasets and to increase awareness of them among the community. The synthetic datasets can be used in benchmarking studies that critically evaluate cell population identification tools and help illustrate potential digital platform inconsistencies. These datasets have the potential to improve cell characterization workflows that integrate automated analysis in clinical diagnostics and cell therapy manufacturing.
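
A hedged sketch of the general approach, not the authors' generator: two synthetic two-dimensional populations with a chosen mean separation, with optional skewness introduced via SciPy's skew-normal distribution:

```python
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(1)

def two_cluster_dataset(n=1000, separation=5.0, sd=1.0, skew=0.0):
    """Return (events, labels); skew != 0 makes the second cluster non-normal."""
    c1 = rng.normal(loc=0.0, scale=sd, size=(n, 2))
    if skew == 0.0:
        c2 = rng.normal(loc=separation, scale=sd, size=(n, 2))
    else:
        c2 = skewnorm.rvs(a=skew, loc=separation, scale=sd,
                          size=(n, 2), random_state=rng)
    return np.vstack([c1, c2]), np.repeat([0, 1], n)

events, labels = two_cluster_dataset(separation=3.0, skew=4.0)
print(events.shape, labels.shape)
```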


Subject(s)
Benchmarking , Flow Cytometry/methods
3.
Sensors (Basel) ; 21(15)2021 Jul 28.
Article in English | MEDLINE | ID: mdl-34372336

ABSTRACT

In-situ metrology utilised for surface topography, texture and form analysis, along with quality control processes, requires a high level of reliability. Hence, a traceable method for calibrating the measurement system's transfer function is required at regular intervals. This paper compares three methods of dimensional calibration for a spectral-domain low-coherence interferometer: a reference laser interferometer versus two types of single material measure. Additionally, the impact of dataset sparsity is shown, along with the effect on system performance of using a single calibration dataset when operating across different media.
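
One step implied here can be sketched as follows, assuming a simple linear (gain/offset) transfer-function correction and illustrative displacement values; the paper's actual calibration procedure is not reproduced:

```python
import numpy as np

# Traceable reference displacements vs readings from the instrument under
# calibration (illustrative values, micrometres).
reference_um = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
measured_um = np.array([0.1, 10.4, 20.9, 31.1, 41.6])

# Fit measured = gain * reference + offset, then invert to correct readings.
gain, offset = np.polyfit(reference_um, measured_um, 1)
corrected_um = (measured_um - offset) / gain
rms_residual = np.sqrt(np.mean((corrected_um - reference_um) ** 2))
print(f"gain {gain:.4f}, offset {offset:.3f} um, RMS residual {rms_residual:.3f} um")
```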

4.
Methods Protoc ; 4(2)2021 Mar 30.
Article in English | MEDLINE | ID: mdl-33808088

ABSTRACT

Measured variability of product within Cell and Gene Therapy (CGT) manufacturing arises from numerous sources across the pre-analytical to post-analytical phases of testing. Operators are an integral part of the manufacturing process and are an important source of variability, as a result of personal differences influenced by numerous factors. This research uses measurement uncertainty, in comparison to the Coefficient of Variation, to quantify participant variation when completing Flow Cytometry data analysis through a 5-step gating sequence. Two study stages captured participants applying gates using their own judgement and then following a diagrammatical protocol, respectively. Measurement uncertainty was quantified for each participant (and analysis phase) by following Guide to the Expression of Uncertainty in Measurement protocols, combining the standard deviations from each gating step in the respective protocols in quadrature. When participants followed a diagrammatical protocol, variation between participants was reduced by 57%, increasing confidence in a more uniform reported cell count percentage. Measurement uncertainty provided greater resolution to the analysis processes, identifying that most of the variability in the Flow Cytometry gating process arises at the very first gate, where target cells must be isolated from dead or dying cells. This work has demonstrated the potential for greater usage of measurement uncertainty within CGT manufacturing scenarios, due to the resolution it provides for root cause analysis and continuous improvement.
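
A worked sketch of the GUM-style combination described above, using assumed per-gate standard deviations rather than the study's data:

```python
import math

# Per-gate standard deviations (% cell count) for one participant's 5-step
# gating sequence; the values are assumed, not the study's data.
gate_sds = [2.1, 0.9, 0.6, 0.5, 0.4]

u_combined = math.sqrt(sum(sd ** 2 for sd in gate_sds))   # summation in quadrature
U_expanded = 2.0 * u_combined                             # coverage factor k = 2 (~95 %)
print(f"combined standard uncertainty u = {u_combined:.2f} %, expanded U (k=2) = {U_expanded:.2f} %")
```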

5.
Cytometry A ; 99(10): 1007-1021, 2021 10.
Article in English | MEDLINE | ID: mdl-33606354

ABSTRACT

Automated flow cytometry (FC) data analysis tools for cell population identification and characterization are increasingly being used in academic, biotechnology, pharmaceutical, and clinical laboratories. These computational methods are designed to overcome reproducibility and process bottleneck issues in manual gating; however, the take-up of these tools remains (anecdotally) low. Here, we performed a comprehensive literature survey of state-of-the-art computational tools, typically published by research, clinical, and biomanufacturing laboratories, for automated FC data analysis and identified popular tools based on literature citation counts. Dimensionality reduction methods ranked highly, such as generic t-distributed stochastic neighbor embedding (t-SNE) and viSNE, its initial Matlab-based implementation for cytometry data. Software with graphical user interfaces also ranked highly, including PhenoGraph, SPADE1, FlowSOM, and Citrus, with unsupervised learning methods outnumbering supervised learning methods, and algorithm popularity spread across k-means, hierarchical, density-based, model-based, and other classes of clustering algorithms. Additionally, to illustrate actual use within clinical settings alongside citation frequency, a survey issued by UK NEQAS Leucocyte Immunophenotyping was completed to identify software usage trends among clinical laboratories. The survey revealed that 53% of laboratories have not yet taken up automated cell population identification methods, though among those that have, Infinicyt software is the most frequently identified. Survey respondents considered data output quality to be the most important factor when using automated FC data analysis software, followed by software speed and level of technical support. This review found differences in software usage between biomedical institutions, with tools for discovery, data exploration, and visualization more popular in academia, whereas automated tools for specialized targeted analysis applying supervised learning methods were more widely used in clinical settings.
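
As a generic illustration of the dimensionality-reduction step mentioned above, a scikit-learn t-SNE embedding (not viSNE itself) of simulated multi-marker events might look like this:

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(2)
# Three mock populations, eight markers each (stand-in for compensated FC events).
events = np.vstack([rng.normal(m, 0.5, (300, 8)) for m in (0.0, 2.0, 4.0)])

embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(events)
print(embedding.shape)   # (900, 2) low-dimensional map for visual exploration/gating
```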


Subject(s)
Data Analysis , Software , Algorithms , Cluster Analysis , Flow Cytometry , Immunophenotyping , Reproducibility of Results
6.
PDA J Pharm Sci Technol ; 75(1): 33-47, 2021.
Article in English | MEDLINE | ID: mdl-33067330

ABSTRACT

Flow cytometry is a complex measurement and characterization technique, utilized within the manufacture, measurement, and release of cell and gene therapy products for rapid, high-content, and multiplexed discriminatory cell analysis. A number of factors influence the variability in the reported measurement, including, but not limited to, biological variation, reagent variation, laser and optical configurations, and data analysis methods. This research focused on understanding the contribution of manual operator variability within the data analysis phase. Thirty-eight participants completed a questionnaire, providing information about experience and motivational factors, before completing a simple gating study. The results were analyzed using gauge repeatability and reproducibility techniques to quantify participant uncertainty. The uncertainties from the various stages of the gating sequence were combined through summation in quadrature and expanded to give each participant a representative uncertainty value. Of the participants surveyed, 85% preferred manual gating to automated data analysis, with the primary reasons being legacy ("it's always been done that way") and accuracy, not in the metrological sense but in the clear definition of the correct target population. The median expanded uncertainty was calculated as 3.6% for the population studied, with no significant difference between more and less experienced users. Operator subjectivity can be quantified for inclusion within measurement uncertainty budgets, as required for various standards and qualifications. An emphasis on biomanufacturing measurement terminology is needed to help identify potential future solutions, possibly drawing on translational clinical models to promote better training and protocols within industrial and research settings.
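
A simplified sketch of a gauge repeatability and reproducibility split on assumed gating results (not the study's data), combined in quadrature and expanded with a coverage factor of k = 2:

```python
import numpy as np

# Rows = operators, columns = repeat gating results (% target population), assumed values.
results = np.array([[52.1, 52.4, 51.9],
                    [53.0, 53.4, 52.8],
                    [51.5, 51.2, 51.8]])

repeatability = np.sqrt(np.mean(np.var(results, axis=1, ddof=1)))   # pooled within-operator SD
reproducibility = np.std(results.mean(axis=1), ddof=1)              # spread of operator means
u_combined = np.hypot(repeatability, reproducibility)               # summation in quadrature
print(f"repeatability {repeatability:.2f} %, reproducibility {reproducibility:.2f} %, "
      f"combined {u_combined:.2f} %, expanded (k=2) {2 * u_combined:.2f} %")
```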


Subject(s)
Data Analysis , Flow Cytometry , Humans , Reference Standards , Reproducibility of Results , Uncertainty
7.
Regen Med ; 14(11): 1029-1046, 2019 11.
Article in English | MEDLINE | ID: mdl-31718498

ABSTRACT

Aim: Understanding blood component variation as a function of healthy population metrics is necessary to inform biomanufacturing process design. Methods: UK Biobank metrics were examined for variation in white blood cell count as an analog to potential manufacturing starting material input. Results: White blood cell count variation of four orders of magnitude (6.65 × 10⁹ cells/l) was found. Variation increased with age, and increased with weight up to 80 kg then decreased. Health state showed a greater absolute number of participants with an elevated count. The female range was greater than the male range. Cell counts and distributions differed between centers. Conclusion: This variation and range of process input signals a requirement for new strategies for manufacturing process design and control.
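
The kind of stratified spread analysis described above could be sketched as follows on mock data; the field names and values are placeholders, not UK Biobank variables:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "age": rng.integers(40, 70, 5000),
    "wbc_1e9_per_l": rng.lognormal(mean=np.log(6.9), sigma=0.25, size=5000),
})

# Median, spread and range of white blood cell count within age bands.
by_age = df.groupby(pd.cut(df["age"], bins=[39, 50, 60, 70]), observed=True)["wbc_1e9_per_l"]
summary = by_age.agg(["median", "std", lambda s: s.max() - s.min()])
summary.columns = ["median", "std", "range"]
print(summary)
```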


Subject(s)
Biological Specimen Banks , Cell- and Tissue-Based Therapy/methods , Age Factors , Aged , Aged, 80 and over , Body Weight , Diet , Exercise , Female , Humans , Leukocytes/cytology , Male , Middle Aged , United Kingdom
8.
Appl Opt ; 55(13): 3555-65, 2016 May 01.
Article in English | MEDLINE | ID: mdl-27140371

ABSTRACT

In a recent publication [3rd International Conference on Surface Metrology, Annecy, France, 2012, p. 1] it was shown that surface roughness measurements made using a focus variation microscope (FVM) are influenced by surface tilt. The effect appears to be most significant when the surface has microscale roughness (Ra ≈ 50 nm) that is sufficient to provide a diffusely scattered signal comparable in magnitude to the specular component. This paper explores, from first principles, image formation using the focus variation method. With the assumption of incoherent scattering, it is shown that the process is linear and the 3D point spread characteristics and transfer characteristics of the instrument are well defined. It is argued that for the case of microscale roughness and through-the-objective illumination, the assumption of incoherence cannot be justified and more rigorous analysis is required. Using a foil model of surface scattering, the images that are recorded by an FVM have been calculated. It is shown that for the case of through-the-objective illumination at small tilt angles, the signal quality is degraded in a systematic manner. This is attributed to the mixing of specular and diffusely reflected components and leads to an asymmetry in the k-space representation of the output signals. It is shown that by using extra-aperture illumination or tilt angles greater than the acceptance angle of the aperture (such that the specular component is lost), the incoherent assumption can be justified once again. The work highlights the importance of using ring-light illumination and/or polarizing optics, which are often available as options on commercial instruments, as a means to mitigate or prevent these effects.
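
A toy numerical illustration, not the paper's foil-model calculation, of why the incoherent assumption breaks down when specular and diffuse fields are comparable: the coherent sum carries a cross term that the incoherent (linear) model ignores.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4096
specular = 1.0 + 0.0j                                       # deterministic specular field
diffuse = 0.8 * np.exp(1j * rng.uniform(0, 2 * np.pi, n))   # speckle-like diffuse field

coherent = np.abs(specular + diffuse) ** 2                  # what the detector actually records
incoherent = np.abs(specular) ** 2 + np.abs(diffuse) ** 2   # linear (incoherent) model

print(f"mean intensity: coherent {coherent.mean():.3f} vs incoherent {incoherent.mean():.3f}")
print(f"intensity fluctuation (std): coherent {coherent.std():.3f} vs incoherent {incoherent.std():.3f}")
```

The mean intensities agree, but the coherent signal carries large interference-driven fluctuations that the incoherent model cannot represent.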

9.
Cytotherapy ; 18(5): 686-94, 2016 May.
Article in English | MEDLINE | ID: mdl-27059205

ABSTRACT

Currently, cellular therapies such as hematopoietic stem cell transplantation (HSCT) are produced at a small scale on a case-by-case basis, usually in a clinical or near-clinical setting. Meeting the demand for future cellular therapies will require a robust and scalable manufacturing process that is either designed around, or controls, the variation associated with biological starting materials. Understanding variation requires both a measure of the allowable variation (that which does not negatively affect patient outcome) and the achievable variation (with current technology). The prevalence of HSCT makes it an ideal case study in preparation for more complex biological manufacturing with more challenging regulatory classifications. A systematic meta-analysis of the medical literature surrounding HSCT has been completed, of which the key outcomes are the following: (i) the range of transplanted CD34+ cells/kg can be up to six orders of magnitude around the median for allogeneic procedures and four orders of magnitude for autologous procedures, (ii) there is no improvement in the variation encountered over a period of 30 years and (iii) as study size increases, the amount of variation encountered also increases. A more detailed, stratified source from a controlled single-site clinical center is required to further define a control strategy for the manufacture of biologics.
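
The "orders of magnitude around the median" summary can be sketched as follows on placeholder CD34+ dose values (cells/kg), not the study's data:

```python
import numpy as np

doses = np.array([1.2e4, 8.0e5, 2.3e6, 5.0e6, 9.7e6, 4.1e7, 6.0e8])   # CD34+ cells/kg, illustrative
median = np.median(doses)
span_decades = np.log10(doses.max() / doses.min())
print(f"median dose {median:.2e} cells/kg; range spans {span_decades:.1f} orders of magnitude")
```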


Subject(s)
Cell- and Tissue-Based Therapy/methods , Hematopoietic Stem Cell Transplantation/methods , Cell Count , Humans , Middle Aged , Process Assessment, Health Care , Quality Control
10.
J Acoust Soc Am ; 115(1): 187-95, 2004 Jan.
Article in English | MEDLINE | ID: mdl-14759010

ABSTRACT

Localized changes in the density of water induced by the presence of an acoustic field cause perturbations in the localized refractive index. This relationship has given rise to a number of nonperturbing optical metrology techniques for recording measurement parameters from underwater acoustic fields. A recently developed method involves the use of a Laser Doppler Vibrometer (LDV) targeted at a fixed, nonvibrating plate through an underwater acoustic field. Measurements of the rate of change of optical pathlength along a line section enable the identification of the temporal and frequency characteristics of the acoustic wave front. This approach has been extended through the use of a scanning LDV, which facilitates the measurement of a range of spatially distributed parameters. A mathematical model is presented that relates the distribution of pressure amplitude and phase in a planar wave front to the rate of change of optical pathlength measured by the LDV along a specifically orientated laser line section. Measurements of a 1 MHz acoustic tone burst generated by a focused transducer are described and the results are presented. Graphical depictions of the acoustic power and phase distribution recorded by the LDV are shown, together with images representing the time history during the acoustic wave propagation.
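
A back-of-envelope sketch of the acousto-optic relationship described above, with assumed parameter values (the piezo-optic coefficient of water is taken as roughly 1.5 × 10⁻¹⁰ Pa⁻¹ and the pressure is assumed uniform along the measured line section):

```python
import numpy as np

dn_dp = 1.5e-10        # piezo-optic coefficient of water, ~1.5e-10 per Pa (approximate)
f = 1.0e6              # 1 MHz tone
p0 = 1.0e5             # 100 kPa pressure amplitude (illustrative)
path_length = 0.01     # 10 mm of the laser line crossing the acoustic beam (illustrative)

# Peak rate of change of pressure for the tone, assumed uniform along the path.
dp_dt_peak = 2 * np.pi * f * p0
# Rate of change of optical path length = apparent velocity reported by the LDV.
apparent_velocity = dn_dp * path_length * dp_dt_peak
print(f"peak apparent velocity ~ {apparent_velocity:.2f} m/s")
```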

11.
Appl Opt ; 43(3): 579-84, 2004 Jan 20.
Article in English | MEDLINE | ID: mdl-14765916

ABSTRACT

We propose a computer-aided method of lens manufacture that allows assembly, adjustment, and test phases to be run concurrently until an acceptable level of optical performance is reached. Misalignment of elements within a compound lens is determined by comparing the results of physical ray tracing, performed with an array of Gaussian laser beams, with numerically obtained geometric ray traces. An estimate of the misalignment errors is made, and individual elements are adjusted in an iterative manner until the performance criteria are achieved. The method is illustrated for the alignment of an air-spaced doublet.
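
A schematic sketch of the estimation step implied above (not the authors' implementation): misalignment parameters are recovered by least squares from the difference between measured beam positions and those predicted by a numerical ray trace; the sensitivity matrix and residuals here are invented for illustration.

```python
import numpy as np

# Assumed linearised sensitivity matrix: how each misalignment parameter
# (e.g. element decentre x, decentre y, tilt) moves each traced spot coordinate.
S = np.array([[ 1.0,  0.0,  0.3],
              [ 0.0,  1.0, -0.2],
              [ 0.8,  0.1,  0.5],
              [-0.1,  0.9,  0.4]])
# Measured beam positions minus numerically ray-traced positions (mm, illustrative).
residuals = np.array([0.12, -0.05, 0.10, -0.02])

misalignment, *_ = np.linalg.lstsq(S, residuals, rcond=None)
print("estimated misalignment parameters:", np.round(misalignment, 4))
# In practice this estimate drives an adjustment, and the compare/adjust loop
# repeats until the optical performance criteria are met.
```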

12.
Appl Opt ; 42(28): 5634-41, 2003 Oct 01.
Article in English | MEDLINE | ID: mdl-14528924

ABSTRACT

Interferometric measurement techniques such as holographic interferometry and electronic speckle-pattern interferometry are valuable for measuring the deformation of objects. Conventional theoretical models of deformation measurement assume collimated illumination and telecentric imaging, which are usually only practical for small objects. Large objects often require divergent illumination, for which the models are valid only when the object is planar, and then only in the paraxial region. We present an analysis and discussion of the three-dimensional systematic sensitivity errors for both in-plane and out-of-plane interferometer configurations, where it is shown that the errors can be significant. A dimensionless approach is adopted to make the analysis generic and hence scalable to a system of any size.
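
A hedged numerical sketch of the sensitivity error discussed above, assuming the standard phase-displacement relation Δφ = (2π/λ) d · (k_obs − k_ill) and an illustrative out-of-plane geometry (not the paper's dimensionless formulation):

```python
import numpy as np

lam = 532e-9                            # wavelength, m (illustrative)
point = np.array([0.2, 0.0, 0.0])       # off-axis point on the object surface, m
source = np.array([0.0, 0.0, 1.0])      # divergent illumination source, m
camera = np.array([0.0, 0.0, 1.2])      # observation pupil, m
d = np.array([0.0, 0.0, 100e-9])        # 100 nm out-of-plane displacement

unit = lambda v: v / np.linalg.norm(v)
k_ill = unit(point - source)            # actual illumination direction at the point
k_obs = unit(camera - point)            # actual observation direction
phase_actual = 2 * np.pi / lam * d @ (k_obs - k_ill)

# Collimated, telecentric (paraxial) assumption: sensitivity vector (0, 0, 2).
phase_assumed = 2 * np.pi / lam * d @ np.array([0.0, 0.0, 2.0])
print(f"systematic sensitivity error at this point: "
      f"{abs(phase_actual - phase_assumed) / phase_assumed:.1%}")
```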

13.
Appl Opt ; 42(4): 701-7, 2003 Feb 01.
Article in English | MEDLINE | ID: mdl-12564489

ABSTRACT

A scanning probe consisting of a source and receive fiber pair is used to measure the phase difference between wave fronts scattered from the front and rear surfaces of an aspheric optic. This system can be thought of as a classical interferometer with an aperture synthesized from the data collected along the path of the probe. If the form of either surface is known, the other can be deduced. In contrast with classical interferometers, the method does not need test or null plates and has the potential to be integrated into the manufacturing process.
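
A simplified, assumed reconstruction sketch (near-normal incidence, double pass through the glass, refraction neglected; not the authors' method), shown as a round trip on simulated data:

```python
import numpy as np

lam = 633e-9                    # probe wavelength, m (illustrative)
n_glass = 1.52                  # refractive index of the optic (assumed)

x = np.linspace(-0.01, 0.01, 201)            # probe positions along the scan, m
front = x**2 / (2 * 0.05)                    # known front-surface sag (R = 50 mm, paraxial)
thickness_true = 5e-3 - x**2 / (2 * 0.08)    # "unknown" thickness profile used to simulate data

# Simulated measured phase difference between front- and rear-scattered wavefronts.
phase = 2 * np.pi / lam * (2 * n_glass * thickness_true)

# Reconstruction: thickness from phase, rear surface from the known front surface.
thickness = phase * lam / (2 * np.pi) / (2 * n_glass)
rear = front + thickness
print(f"max round-trip reconstruction error: {np.max(np.abs(thickness - thickness_true)):.2e} m")
```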
