1.
J Environ Health Sci Eng ; 22(1): 229-243, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38887771

ABSTRACT

Society's reliance on chemicals over the last few decades has led to their increased production, application and discharge into the environment. Wastewater treatment plants (WWTPs) receive a multitude of these chemicals, such as pharmaceutical compounds (PCs). Their biodegradability by activated sludge microorganisms is often decisive for their elimination during wastewater treatment. This paper focuses on two PCs, carbamazepine (CBZ) and diclofenac (DCF), and their main transformation products (TPs). Laboratory degradation tests with these two pharmaceuticals were performed under aerobic conditions using activated sludge as inoculum, and microbial metabolites were analyzed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The biodegradability of CBZ and DCF by activated sludge was evaluated at two different Mixed Liquor Suspended Solids (MLSS) concentrations. In addition, this article proposes a decision support system to optimize the prediction of this type of pharmaceutical compound. Support Vector Machine, Random Forest, Decision Tree and Multilayer Perceptron Network techniques are studied and analyzed to select the most reliable and accurate predictor for the decision system. There were no significant differences in the removal of DCF between 30 mg MLSS/L and 60 mg MLSS/L, and DCF was removed better than CBZ in all experiments. The TPs detected in the samples were mainly 4-OH-DCF for DCF and 10,11-epoxy-CBZ for CBZ. The results show that the best models are obtained with the Random Forest and Multilayer Perceptron Network techniques, with a model fit of more than 95% for both carbamazepine and diclofenac metabolites. Root mean square errors of 0.80 µg/L were obtained for the 4-OH-DCF metabolite with Random Forest and of 1.13 µg/L for the 10,11-epoxy-CBZ metabolite with the Multilayer Perceptron Network. Supplementary Information: The online version contains supplementary material available at 10.1007/s40201-023-00890-x.
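
The abstract gives no implementation details; the following is a minimal, hypothetical sketch of how such a comparison of Random Forest and Multilayer Perceptron regressors by RMSE could be set up with scikit-learn, using synthetic data in place of the study's metabolite measurements.

```python
# Minimal sketch (not the authors' code): comparing Random Forest and MLP
# regressors by RMSE on synthetic data standing in for metabolite concentrations.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))                                   # hypothetical predictors
y = X @ rng.normal(size=5) + rng.normal(scale=0.5, size=300)    # metabolite conc. (µg/L), synthetic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "MLP": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: RMSE = {rmse:.2f} µg/L")
```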

2.
Sci Rep ; 11(1): 15173, 2021 07 26.
Article in English | MEDLINE | ID: mdl-34312455

ABSTRACT

We are witnessing the dramatic consequences of the COVID-19 pandemic which, unfortunately, go beyond the impact on the health system. Until herd immunity is achieved through vaccination, the only available mechanisms for controlling the pandemic are quarantines, perimeter closures and social distancing, all aimed at reducing mobility. Governments apply these measures only for limited periods, since they involve the closure of economic activities such as tourism, cultural activities or nightlife. The main criterion for establishing these measures and planning socioeconomic subsidies is the evolution of infections. However, the collapse of the health system and the unpredictability of human behavior, among other factors, make it difficult to predict this evolution in the short to medium term. This article evaluates different models for the early prediction of the evolution of the COVID-19 pandemic in order to create a decision support system for policy-makers. We consider a wide range of models, including artificial neural networks such as LSTM and GRU and statistically based models such as autoregressive (AR) and ARIMA models. Moreover, several consensus strategies that ensemble all models into one system are proposed to obtain better results in this uncertain environment. Finally, a multivariate model that includes mobility data provided by Google is proposed to better forecast trend changes in the 14-day cumulative incidence (CI). A real case study in Spain is evaluated, providing very accurate predictions of the 14-day CI in scenarios with and without trend changes, reaching an R² of 0.93, an RMSE of 4.16 and an MAE of 1.08.
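
As an illustration of the consensus idea (not the paper's actual strategies), the hedged sketch below averages the forecasts of several hypothetical models into a single prediction, optionally weighting each model, e.g., by its validation error.

```python
# Minimal sketch (assumptions only): combine per-model forecasts by a simple
# (optionally weighted) average consensus.
import numpy as np

def consensus_forecast(forecasts, weights=None):
    """forecasts: dict mapping model name -> array of predicted 14-day CI values."""
    stacked = np.vstack(list(forecasts.values()))
    if weights is None:                               # plain mean consensus
        return stacked.mean(axis=0)
    w = np.asarray([weights[name] for name in forecasts])
    return (w[:, None] * stacked).sum(axis=0) / w.sum()

# Hypothetical per-model forecasts for the next 7 days:
forecasts = {
    "lstm":  np.array([250, 248, 245, 240, 238, 233, 230], dtype=float),
    "gru":   np.array([252, 250, 247, 244, 240, 236, 231], dtype=float),
    "arima": np.array([255, 254, 252, 249, 247, 244, 241], dtype=float),
}
print(consensus_forecast(forecasts))
# Weighted variant, e.g. weights derived from validation performance:
print(consensus_forecast(forecasts, weights={"lstm": 0.5, "gru": 0.3, "arima": 0.2}))
```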


Subject(s)
COVID-19/epidemiology; Artificial Intelligence; Forecasting; Humans; Incidence; Models, Statistical; Neural Networks, Computer; Spain/epidemiology
3.
Int J Mol Sci ; 22(9)2021 Apr 22.
Article in English | MEDLINE | ID: mdl-33922356

ABSTRACT

Artificial Intelligence is providing astonishing results, with medicine being one of its favourite playgrounds. Machine Learning and, in particular, Deep Neural Networks are behind this revolution. Among the most challenging targets of interest in medicine are cancer diagnosis and therapies but, to start this revolution, software tools need to be adapted to cover the new requirements. In this sense, learning tools are becoming a commodity but, to be able to assist doctors on a daily basis, it is essential to fully understand how models can be interpreted. In this survey, we analyse current machine learning models and other in-silico tools as applied to medicine, specifically to cancer research, and we discuss their interpretability, performance and the input data they are fed with. Artificial neural networks (ANN), logistic regression (LR) and support vector machines (SVM) have been observed to be the preferred models. In addition, convolutional neural networks (CNNs), supported by the rapid development of graphics processing units (GPUs) and high-performance computing (HPC) infrastructures, are gaining importance when image processing is feasible. However, the interpretability of machine learning predictions, which would allow doctors to understand them, trust them and gain useful insights for clinical practice, is still rarely considered; this is a factor that needs to be improved to enhance doctors' predictive capacity and achieve individualised therapies in the near future.
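
As a simple illustration of why interpretability matters (not taken from the survey), the sketch below fits a logistic regression on synthetic data and reads its coefficients as per-feature effects, the kind of inspection that is straightforward for LR but much harder for deep networks. The feature names are hypothetical.

```python
# Illustration only: logistic regression coefficients as interpretable feature effects.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
feature_names = ["marker_A", "marker_B", "marker_C"]   # hypothetical tumor markers
X = rng.normal(size=(200, 3))
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
for name, coef in zip(feature_names, clf.coef_[0]):
    # Each coefficient is the change in log-odds of the positive class per unit of the feature.
    print(f"{name}: log-odds change per unit = {coef:+.2f}")
```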


Subject(s)
Antineoplastic Agents/therapeutic use; Machine Learning; Molecular Targeted Therapy; Neoplasm Proteins/antagonists & inhibitors; Neoplasms/drug therapy; Precision Medicine; Humans; Neoplasms/metabolism; Neoplasms/pathology; Neural Networks, Computer
4.
Bioinformatics ; 37(11): 1515-1520, 2021 07 12.
Article in English | MEDLINE | ID: mdl-31960899

ABSTRACT

MOTIVATION: Molecular docking methods are extensively used to predict the interaction between protein-ligand systems in terms of structure and binding affinity, through the optimization of a physics-based scoring function. However, the computational requirements of these simulations grow exponentially with: (i) the global optimization procedure, (ii) the number and degrees of freedom of the molecular conformations generated and (iii) the mathematical complexity of the scoring function. RESULTS: In this work, we introduce a novel molecular docking method named METADOCK 2, which incorporates several novel features, such as (i) a ligand-dependent blind docking approach that exhaustively scans the whole protein surface to detect novel allosteric sites, (ii) an optimization method that enables the use of a wide range of metaheuristics and (iii) a heterogeneous implementation based on multicore CPUs and multiple graphics processing units. Two representative scoring functions implemented in METADOCK 2 are extensively evaluated in terms of computational performance and accuracy on several benchmarks (such as the well-known DUD) against AutoDock 4.2 and AutoDock Vina. The results place METADOCK 2 as an efficient and accurate docking methodology able to deal with complex systems whose computational demands are staggering, outperforming both AutoDock Vina and AutoDock 4. AVAILABILITY AND IMPLEMENTATION: https://Baldoimbernon@bitbucket.org/Baldoimbernon/metadock_2.git. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
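
METADOCK 2's implementation is not reproduced here; purely as a toy illustration, the sketch below frames blind docking as global optimization of a placeholder scoring function over ligand poses, using naive random search where the real tool uses configurable metaheuristics on CPUs and GPUs.

```python
# Toy sketch (assumptions only, unrelated to METADOCK 2's code): blind docking as
# global optimization over ligand poses scored by a placeholder energy function.
import numpy as np

rng = np.random.default_rng(0)
protein_surface = rng.uniform(-20, 20, size=(500, 3))   # hypothetical surface points (Å)

def score(pose):
    """Placeholder scoring function: lower is better. A real one would model
    van der Waals, electrostatics, hydrogen bonds, desolvation, etc."""
    position, rotation = pose[:3], pose[3:]
    dist = np.linalg.norm(protein_surface - position, axis=1).min()
    return (dist - 3.0) ** 2 + 0.01 * np.abs(rotation).sum()

def random_search(n_iter=5000):
    best_pose, best_score = None, np.inf
    for _ in range(n_iter):
        pose = np.concatenate([rng.uniform(-20, 20, 3),         # translation
                               rng.uniform(-np.pi, np.pi, 3)])  # Euler angles
        s = score(pose)
        if s < best_score:
            best_pose, best_score = pose, s
    return best_pose, best_score

pose, s = random_search()
print("best score:", round(s, 3))
```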


Subject(s)
Proteins; Ligands; Molecular Conformation; Molecular Docking Simulation
5.
Sensors (Basel) ; 20(24)2020 Dec 12.
Article in English | MEDLINE | ID: mdl-33322717

ABSTRACT

Precision agriculture is a growing sector that improves traditional agricultural processes through the use of new technologies. In southeast Spain, farmers continuously fight against harsh conditions caused by the effects of climate change. Among these problems, the great variability of temperatures (up to 20 °C within the same day) stands out: it causes stone fruit trees to flower prematurely, and low winter temperatures then freeze the flowers, causing the loss of the crop. Farmers use anti-frost techniques to prevent crop loss, and the most widely used ones rely on water irrigation, as they are cheaper than other techniques. However, these techniques waste a great deal of water, which is a scarce resource, especially in this area. In this article, we propose a novel intelligent Internet of Things (IoT) monitoring system to optimize the use of water in these anti-frost techniques while minimizing crop loss. The intelligent component of the IoT system is based on a multivariate Long Short-Term Memory (LSTM) model designed to predict low temperatures. We compare the proposed multivariate model with its univariate counterpart to determine which model predicts low temperatures more accurately. Accurate prediction of low temperatures would translate into significant water savings, as anti-frost techniques would not be activated unless necessary. Our experimental results show that the proposed multivariate LSTM approach improves on its univariate counterpart, obtaining an average quadratic error no greater than 0.65 °C and a coefficient of determination R² greater than 0.97. The proposed system has been deployed and is currently operating in a real environment with satisfactory performance.
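
The deployed system's code is not shown in the abstract; the following is a minimal sketch, under assumed input variables, window size and random data, of a multivariate LSTM regressor of the kind described, built with the Keras API.

```python
# Minimal sketch (assumed setup, not the deployed system): a multivariate LSTM
# that predicts temperature from windows of several sensor readings.
import numpy as np
import tensorflow as tf

timesteps, n_features = 24, 3          # e.g. 24 hourly readings of 3 variables (assumption)
X = np.random.rand(1000, timesteps, n_features).astype("float32")
y = np.random.rand(1000, 1).astype("float32")    # temperature to predict (synthetic)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(timesteps, n_features)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```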

6.
Sensors (Basel) ; 20(3)2020 Feb 07.
Article in English | MEDLINE | ID: mdl-32046231

ABSTRACT

Wireless acoustic sensor networks are nowadays an essential tool for noise pollution monitoring and management in cities. The increased computing capacity of the nodes that form the network allows the addition of processing algorithms and artificial intelligence that provide more information about the sound sources and the environment, e.g., detecting sound events or calculating loudness. Several models for predicting sound pressure levels in cities are available, mainly for road, railway and air traffic noise. However, these models are mostly based on auxiliary data, e.g., vehicle flows or street geometry, and predict long-term equivalent levels. Therefore, short-term forecasting of sound levels could be a helpful tool for urban planners and managers. In this work, a Long Short-Term Memory (LSTM) deep neural network technique is proposed to model the temporal behavior of sound levels at a given location, both sound pressure level and loudness level, in order to predict near-future values. The proposed technique can be trained for, and integrated into, every node of a sensor network to provide novel functionalities, e.g., early warning against noise pollution and backup in case of node or network malfunction. To validate this approach, one-minute equivalent sound levels, captured in a two-month measurement campaign by a node of a deployed network of acoustic sensors, were used to train it and to obtain different forecasting models. The developed LSTM models and autoregressive integrated moving average (ARIMA) models were assessed for predicting sound levels over several time horizons, from 1 to 60 min. Comparison of the results shows that the LSTM models outperform the statistics-based models. In general, the LSTM models achieve predictions with a mean square error of less than 4.3 dB for sound pressure level and less than 2 phons for loudness. Moreover, the goodness of fit of the LSTM models and the behavior pattern of the data in terms of prediction of sound levels are satisfactory.
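
For the statistics-based baseline mentioned in the abstract, a minimal sketch is shown below: it fits an ARIMA model (with an assumed order) on a synthetic series of one-minute equivalent sound levels and forecasts the next 60 minutes using statsmodels. The series and model order are illustrative, not the paper's.

```python
# Minimal sketch (illustration, not the paper's models): ARIMA baseline on
# synthetic 1-minute equivalent sound levels, forecasting the next 60 minutes.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Hypothetical 1-minute Leq values (dB) over two days, with a daily cycle plus noise:
leq = 55 + 5 * np.sin(np.arange(2880) * 2 * np.pi / 1440) + rng.normal(scale=1.5, size=2880)

model = ARIMA(leq, order=(2, 1, 2)).fit()     # assumed (p, d, q) order
forecast = model.forecast(steps=60)           # next 60 minutes
print(forecast[:5])
```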

7.
PLoS One ; 14(7): e0219388, 2019.
Article in English | MEDLINE | ID: mdl-31348783

ABSTRACT

INTRODUCTION: Ovarian tumors are the most common diagnostic challenge for gynecologists, and ultrasound examination has become the main technique for the assessment of ovarian pathology and for the preoperative distinction between malignant and benign ovarian tumors. However, ultrasonography is highly examiner-dependent and there may be important variability between two different specialists examining the same case. The objective of this work is the evaluation of different well-known Machine Learning (ML) systems for the automatic categorization of ovarian tumors from ultrasound images. METHODS: We used a real patient database whose input features were extracted from 348 images from the IOTA tumor image database, together with the class labels of the images. For each patient case and ultrasound image, the input features were previously extracted using Fourier descriptors computed on the Region Of Interest (ROI). Then, four ML techniques were considered for the classification stage: K-Nearest Neighbors (KNN), Linear Discriminant (LD), Support Vector Machine (SVM) and Extreme Learning Machine (ELM). RESULTS: According to our results, the KNN classifier provides inaccurate predictions (less than 60% accuracy) independently of the size of the local approximation, whereas the classifiers based on LD, SVM and ELM are robust in this biomedical classification task (more than 85% accuracy). CONCLUSIONS: ML methods can be efficiently used to develop the classification stage of computer-aided diagnosis systems for ovarian tumors based on ultrasound images. These approaches are able to provide automatic classification with a high rate of accuracy. Future work should aim at enhancing the classifier design using ensemble techniques. Another line of ongoing work is to exploit different kinds of features extracted from ultrasound images.
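
The study's exact feature extraction is not detailed in the abstract; the hedged sketch below illustrates the general pipeline under simple assumptions: Fourier descriptors computed from a ROI contour, fed to an SVM classifier, on synthetic contours rather than the IOTA images.

```python
# Minimal sketch (assumed pipeline, not the study's code): Fourier descriptors
# of a contour as features for an SVM classifier on synthetic shapes.
import numpy as np
from sklearn.svm import SVC

def fourier_descriptors(contour, n_coeffs=10):
    """contour: (N, 2) array of boundary points of the ROI."""
    z = contour[:, 0] + 1j * contour[:, 1]          # complex representation of the boundary
    coeffs = np.fft.fft(z)
    mags = np.abs(coeffs[1:n_coeffs + 1])           # magnitudes of low-frequency coefficients
    return mags / (np.abs(coeffs[1]) + 1e-12)       # normalize for scale invariance

# Hypothetical data: 100 synthetic contours with binary labels.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
contours, labels = [], []
for i in range(100):
    r = 1.0 + (0.3 if i % 2 else 0.05) * np.sin(5 * t + rng.uniform(0, np.pi))
    contours.append(np.column_stack([r * np.cos(t), r * np.sin(t)]))
    labels.append(i % 2)

X = np.array([fourier_descriptors(c) for c in contours])
clf = SVC(kernel="rbf").fit(X[:80], labels[:80])
print("accuracy:", clf.score(X[80:], labels[80:]))
```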


Subject(s)
Algorithms; Fourier Analysis; Image Processing, Computer-Assisted; Machine Learning; Ovarian Neoplasms/diagnostic imaging; Ultrasonography; Area Under Curve; Female; Humans; ROC Curve
8.
Front Neuroinform ; 11: 39, 2017.
Article in English | MEDLINE | ID: mdl-28690512

ABSTRACT

Faced with a new concept to learn, our brain does not work in isolation: it uses all previously learned knowledge. In addition, the brain is able to set aside knowledge that does not benefit us and to use what is actually useful. In machine learning, we do not usually benefit from the knowledge of other learned tasks. However, there is a methodology called Multitask Learning (MTL), based on the idea that learning a task along with other related tasks produces a transfer of information between them, which can be advantageous for learning the first one. This paper presents a new method to completely design MTL architectures, including the selection of the subtasks most helpful for learning the main task and of the optimal network connections. In this sense, the proposed method performs a complete design of the MTL scheme. The method is simple and uses the advantages of the Extreme Learning Machine to automatically design an MTL machine, eliminating those factors that hinder, or do not benefit, the learning process of the main task. The resulting architecture is unique and is obtained without trial-and-error methodologies that increase the computational complexity. The results obtained on several real problems show the good performance of the networks designed with this method.
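
The paper's MTL design procedure is not reproduced here; as background, the sketch below shows the generic ELM building block it relies on: random input weights and biases, with the output weights solved in closed form by least squares. The data and sizes are assumptions.

```python
# Minimal sketch of a single-task Extreme Learning Machine (generic building
# block, not the paper's MTL design): random hidden layer, closed-form output weights.
import numpy as np

class ELM:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))  # random, never trained
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                 # hidden layer output
        self.beta = np.linalg.pinv(H) @ y                # least-squares output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Tiny usage example on synthetic data:
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
elm = ELM(n_hidden=100).fit(X[:150], y[:150])
print("test MSE:", np.mean((elm.predict(X[150:]) - y[150:]) ** 2))
```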

9.
Cureus ; 8(4): e570, 2016 Apr 15.
Article in English | MEDLINE | ID: mdl-27186452

ABSTRACT

Fetal lung masses are rare findings in prenatal ultrasound scanning in the general population, of which congenital cystic adenomatoid malformation is the most commonly diagnosed type. This paper reports a single case of congenital cystic adenomatoid malformation detected at our hospital and the subsequent clinical follow-up using ultrasound scanning and fetal magnetic resonance imaging.

10.
Curr Drug Targets ; 17(14): 1626-1648, 2016.
Article in English | MEDLINE | ID: mdl-26844561

ABSTRACT

The protein-folding problem has been studied extensively over the last fifty years. Understanding the dynamics of the global shape of a protein and its influence on the protein's biological function can help us discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed to predict the three-dimensional arrangement of a protein's atoms from its sequence. However, the computational complexity of this problem makes the search for new models, novel algorithmic strategies and hardware platforms that provide solutions in a reasonable time frame mandatory. In this review we present past and current trends in protein folding simulation from both perspectives: hardware and software. Of particular interest to us are the use of inexact solutions to this computationally hard problem and the hardware platforms that have been used to run this kind of soft computing technique.


Subject(s)
Computational Biology/instrumentation; Proteins/chemistry; Algorithms; Computational Biology/methods; Humans; Models, Molecular; Protein Conformation; Protein Folding; Software
11.
Biomed Res Int ; 2014: 959645, 2014.
Article in English | MEDLINE | ID: mdl-25045715

ABSTRACT

According to the World Health Organization, the world's leading cause of death is heart disease, with nearly two million deaths per year. Although some risk factors cannot be changed, there are some keys to preventing heart disease. One of the most important is to keep an active daily life with moderate exercise. However, deciding what moderate exercise is, or when a slightly abnormal heart rate value is a risk, depends on the person and the activity. In this paper we propose a context-aware system that is able to determine, in an unobtrusive way, the activity the person is performing. We have defined an ontology to represent the available knowledge about the person (biometric data, fitness status, medical information, etc.) and her current activity (level of intensity, heart rate recommended for that activity, etc.). With such knowledge, a set of expert rules based on this ontology is used in a reasoning process to infer alert levels or suggestions for the user when the intensity of the activity is detected as dangerous for her health. We show how this approach can be accomplished using only everyday devices such as a smartphone and a smartwatch.
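
The ontology and expert rules themselves are not given in the abstract; the sketch below is a hypothetical, much-simplified rule check that raises an alert when the measured heart rate exceeds a recommended fraction of the maximum heart rate for the detected activity, using the common 220 − age heuristic. The activity thresholds are assumptions, not the paper's values.

```python
# Hypothetical, simplified rule check (not the paper's ontology or rules).
ACTIVITY_INTENSITY = {          # assumed fraction of max heart rate considered safe
    "resting": 0.50,
    "walking": 0.65,
    "running": 0.85,
}

def heart_rate_alert(age, activity, measured_hr):
    max_hr = 220 - age                               # simplified max heart rate estimate
    recommended = ACTIVITY_INTENSITY[activity] * max_hr
    if measured_hr > recommended:
        return f"ALERT: {measured_hr} bpm exceeds ~{recommended:.0f} bpm for {activity}"
    return f"OK: {measured_hr} bpm within range for {activity}"

print(heart_rate_alert(age=45, activity="walking", measured_hr=150))   # triggers an alert
print(heart_rate_alert(age=45, activity="running", measured_hr=140))   # within range
```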


Subject(s)
Exercise; Heart Diseases/epidemiology; Risk Assessment/methods; Biometry; Heart Diseases/pathology; Heart Rate; Humans
12.
Neural Netw ; 48: 19-24, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23892908

ABSTRACT

Selecting the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons and the corresponding interconnection weights. This problem has been widely studied, but the proposed solutions usually involve excessive computational cost and do not provide a unique solution. This paper proposes a new technique to efficiently design a MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides high generalization capability and a unique solution for the architecture design. Moreover, the selected final network retains only those input connections that are relevant for the classification task. Experimental results show these advantages.
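
The paper's design algorithm is not reproduced here; the sketch below is only a loose illustration of the input-pruning idea: after fitting an ELM on synthetic data, inputs are ranked with a simple connection-weight heuristic and those below a hypothetical threshold are dropped.

```python
# Illustrative sketch only (not the paper's algorithm): rank inputs of a fitted
# ELM with a connection-weight heuristic and prune the low-scoring ones.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))
y = (X[:, 0] + 2 * X[:, 2] > 0).astype(float)    # synthetic target using only inputs 0 and 2

n_hidden = 80
W = rng.normal(size=(6, n_hidden))               # random ELM input weights
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)
beta = np.linalg.pinv(H) @ y                     # ELM output weights (least squares)

# Heuristic relevance of input i: aggregate |W[i, j] * beta[j]| over hidden units.
relevance = np.abs(W * beta).sum(axis=1)
relevance /= relevance.max()
kept = np.where(relevance > 0.5)[0]              # hypothetical pruning threshold
print("input relevance:", np.round(relevance, 2))
print("inputs kept:", kept)
```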


Subject(s)
Artificial Intelligence; Computer Systems; Neural Networks, Computer; Algorithms; Data Interpretation, Statistical; Neurons/physiology; Reproducibility of Results