Results 1 - 4 of 4
1.
Drug Discov Today; 24(9): 1795-1805, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31207205

ABSTRACT

Multiple factors are driving the digital transformation of the biopharmaceutical industry. Novel digital techniques, often marketed as 'Pharma 4.0', are expected to remove long-standing obstacles in the biopharma life cycle. Pharma 4.0 concepts, such as cyberphysical systems and dark factories, require data science tools as core technological components. Here, we review current data science applications at various stages of the bioprocess life cycle, including their scopes and data sources. We are convinced that the scope and usefulness of these tools are currently limited by technical and nontechnical problems experienced during their development and deployment. We suggest that the establishment of DevOps mind- and toolsets could improve this situation and would be an essential cornerstone in the further development of Pharma 4.0 systems.


Subject(s)
Data Science/trends, Drug Industry/trends, Drug Development, Humans, Information Storage and Retrieval, Information Technology/trends, Manufacturing Industry/trends
2.
PDA J Pharm Sci Technol; 73(4): 373-390, 2019.
Article in English | MEDLINE | ID: mdl-30770485

ABSTRACT

In the pharmaceutical industry, process validation tasks are based on the raw data and the analytical results derived from the process. Process validation failure affects both patient safety and the economic success of the manufacturing company. Hence, data integrity is highly critical in this area. Regulatory agencies, such as the Food and Drug Administration (FDA), reacted to past data integrity breaches by publishing new guidelines on data integrity for the correct handling of data in the pharmaceutical context. In this contribution, we want to show how data integrity can be improved on a technological level, removing the need for trusted third parties and centralized systems for this task. To this end, we implemented an approach that uses existing tools, currently mostly used by software developers, and combined them with a new smart contract built on top of the Ethereum blockchain. In a case study, we test how data manipulation or backdating of results can be easily detected and how regulatory agencies can audit the complete data flow from the regulatory report back to the original raw data. The results of this contribution outline a possible road map for the development of production-ready tools, such as versioned database systems that natively interoperate with distributed ledgers. This will improve the trustworthiness of pharmaceutical manufacturing data, both protecting the intellectual property of the industrial company and improving the safety of patients.

LAY ABSTRACT: In the pharmaceutical industry, economically driven manufacturing companies are regulated and controlled by regulatory agencies. The pharmaceutical manufacturing companies need to produce large amounts of process and analytical data to show that their products are safe for patients. As the decisions of the regulatory agencies rely on this data, manufacturing companies need to prove how their generated data can be protected from technical breaches or data manipulation. As of today, the available technical solutions to provide data integrity are not working well enough. Regulatory agencies have published multiple documents highlighting the current data integrity issues. In this contribution, we show how blockchain, a technology that multiple cryptocurrencies such as Bitcoin rely on, can help to improve the integrity of manufacturing data and data science analysis procedures. To this end, we combined a smart contract on the Ethereum blockchain with tools currently mostly used by software developers. The presented workflow shows how data integrity can be guaranteed on a technological level without the need for trusted third parties.
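The tamper-detection idea behind this workflow can be illustrated with a minimal sketch (a simplified stand-in, not the authors' actual Ethereum smart contract; batch names and values are made up): a cryptographic fingerprint of the raw data is anchored in an append-only ledger at generation time, so any later manipulation or backdating changes the fingerprint and is detectable at audit time.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest that uniquely identifies a piece of raw data."""
    return hashlib.sha256(data).hexdigest()

# At data-generation time: anchor the digest in a tamper-evident ledger.
# (On Ethereum this would be a transaction against a smart contract;
# a plain dict stands in for the ledger here.)
ledger = {}
raw = b"batch-42: titer=1.93 g/L, pH=7.01"   # hypothetical batch record
ledger["batch-42"] = fingerprint(raw)

# At audit time: recompute the digest and compare with the anchored one.
tampered = b"batch-42: titer=2.10 g/L, pH=7.01"  # manipulated copy
original_ok = fingerprint(raw) == ledger["batch-42"]        # True
tamper_found = fingerprint(tampered) != ledger["batch-42"]  # True
```

Only the digest, not the raw data, needs to leave the company, which is why such a scheme can protect intellectual property while still letting an auditor verify integrity.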


Subject(s)
Blockchain, Data Analysis, Drug Industry/standards, Pharmaceutical Preparations/standards, Quality Control, Technology, Pharmaceutical/standards, Drug Industry/methods, Patient Safety, Systems Integration, Technology, Pharmaceutical/methods, United States, United States Food and Drug Administration
3.
Article in English | MEDLINE | ID: mdl-29906679

ABSTRACT

Chromatography is one of the most versatile unit operations in the biotechnological industry. Regulatory initiatives such as Process Analytical Technology and Quality by Design have led to the implementation of new chromatographic devices, which represent an almost inexhaustible source of data. However, the analysis of large datasets is complicated, and significant amounts of information remain hidden in them. Here we present a new, top-down approach for the systematic analysis of chromatographic datasets. The goal of this approach is to analyze the dataset as a whole, starting with the most important, global information. The workflow should highlight interesting regions (outliers, drifts, data inconsistencies) and help to localize those regions within a multidimensional space in a straightforward way. Moving window factor models were used to extract the most important information, focusing on the differences between samples. The prototype was implemented as an interactive visualization tool for the explorative analysis of complex datasets. We found that the tool makes it convenient to localize variance in a multidimensional dataset and allows the user to differentiate between explainable and unexplainable variance. Starting with one global difference descriptor per sample, the analysis ends with highly resolved, time-dependent difference descriptor values, intended as a starting point for a detailed analysis of the underlying raw data.
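A moving-window factor model of this kind can be sketched as follows (a simplified illustration on synthetic data, not the authors' prototype; array shapes and the rank-1 choice are assumptions): within each time window, a factor model is fitted across aligned chromatograms, and the magnitude of each sample's score on the dominant factor serves as a local difference descriptor, so windows in which one sample deviates from the rest stand out.

```python
import numpy as np

def window_descriptors(samples: np.ndarray, window: int = 20, step: int = 10):
    """For each time window, fit a rank-1 factor model (via SVD) across
    samples and report each sample's score magnitude as a local
    difference descriptor.

    samples: (n_samples, n_timepoints) array of aligned chromatograms.
    Returns an (n_samples, n_windows) descriptor matrix.
    """
    n, t = samples.shape
    starts = list(range(0, t - window + 1, step))
    descriptors = np.empty((n, len(starts)))
    for j, s in enumerate(starts):
        block = samples[:, s:s + window]
        centered = block - block.mean(axis=0)   # deviation from window mean
        u, sv, vt = np.linalg.svd(centered, full_matrices=False)
        # score on the dominant factor = how strongly this sample
        # deviates along the main between-sample difference direction
        descriptors[:, j] = np.abs(sv[0] * u[:, 0])
    return descriptors

# Toy example: three similar chromatograms plus one with a local drift.
t = np.linspace(0, 1, 100)
base = np.exp(-((t - 0.5) ** 2) / 0.01)
rng = np.random.default_rng(0)
data = np.stack([base + 0.01 * rng.standard_normal(100) for _ in range(3)]
                + [base + 0.3 * np.exp(-((t - 0.8) ** 2) / 0.002)])
d = window_descriptors(data)
outlier = int(np.argmax(d.sum(axis=1)))  # the drifting sample dominates
```

Summing the descriptor matrix over windows gives the one-number-per-sample global view described above, while the per-window columns localize where in the chromatogram the difference occurs.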


Subject(s)
Chromatography, Data Interpretation, Statistical, Multivariate Analysis, Algorithms, Databases, Factual
4.
Anal Bioanal Chem; 409(3): 693-706, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27376358

ABSTRACT

In biopharmaceutical process development and manufacturing, the online measurement of biomass and derived specific turnover rates is a central task for physiologically monitoring and controlling the process. However, hard-type sensors such as dielectric spectroscopy, broth fluorescence, or permittivity measurement harbor various disadvantages. Soft-sensors, which use measurements of the off-gas stream and substrate feed to reconcile turnover rates and provide an online estimate of biomass formation, are therefore smart alternatives. For the reconciliation procedure, mass and energy balances are used together with accuracy estimates of the measured conversion rates, which until now were chosen arbitrarily and kept static over the entire process. In this contribution, we present a novel strategy within the soft-sensor framework (named adaptive soft-sensor) to propagate uncertainties from the measurements to the conversion rates and demonstrate its benefits: under industrially relevant conditions, the errors of the resulting estimated biomass formation rate and specific substrate consumption rate could be decreased by 43% and 64%, respectively, compared with traditional soft-sensor approaches. Moreover, we present a generic workflow to determine the raw-signal accuracy required to obtain predefined accuracies of the soft-sensor estimates, so that appropriate measurement devices and maintenance intervals can be selected. Using this workflow, we further demonstrate that the estimation accuracy of the soft-sensor can be substantially increased.
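The propagation idea can be sketched as follows (a generic first-order illustration with made-up numbers, not the authors' adaptive soft-sensor): measurement variances are propagated linearly into the variance of a conversion rate of the form r = q * (x_in - x_out), and two redundant estimates of the same rate are then reconciled by inverse-variance weighting, which is where measurement-derived (rather than static, arbitrary) accuracy estimates pay off.

```python
def rate_and_variance(q, x_in, x_out, var_q, var_x):
    """First-order propagation of measurement uncertainty into a
    conversion rate r = q * (x_in - x_out), e.g. an uptake rate from
    gas flow q and off-gas mole fractions x_in, x_out:
    var(r) = (x_in - x_out)^2 var(q) + q^2 (var(x_in) + var(x_out))."""
    r = q * (x_in - x_out)
    var_r = (x_in - x_out) ** 2 * var_q + q ** 2 * (2 * var_x)
    return r, var_r

def reconcile(r1, v1, r2, v2):
    """Inverse-variance weighted reconciliation of two redundant
    estimates of the same rate; the combined variance is never larger
    than the better of the two inputs."""
    w1, w2 = 1.0 / v1, 1.0 / v2
    return (w1 * r1 + w2 * r2) / (w1 + w2), 1.0 / (w1 + w2)

# Hypothetical numbers: gas flow 10 mol/h, O2 mole fractions in/out.
r_gas, v_gas = rate_and_variance(10.0, 0.2095, 0.1800, 0.04, 1e-6)
# Second, independent estimate of the same rate (e.g. from a substrate
# balance), with its own propagated variance.
r_sub, v_sub = 0.28, 4e-4
r_rec, v_rec = reconcile(r_gas, v_gas, r_sub, v_sub)
```

Because the variances are recomputed from the current raw signals, the weighting adapts over the process instead of staying fixed, which is the gist of how the adaptive scheme reduces estimation error.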


Subject(s)
Biomass, Biosensing Techniques/methods, Quality Control, Technology, Pharmaceutical/instrumentation, Technology, Pharmaceutical/methods, Biosensing Techniques/instrumentation