1.
Appl Soil Ecol ; 196, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38463139

ABSTRACT

Remediation methods for soil contaminated with poly- and perfluoroalkyl substances (PFAS) are needed to prevent their leaching into drinking water sources and to protect living organisms in the surrounding environment. In this study, the efficacy of processed and amended clays and carbons as soil amendments to sequester PFAS and prevent leaching was assessed using PFAS-contaminated soil and validated using sensitive ecotoxicological bioassays. Four different soil matrices including quartz sand, clay loam soil, garden soil, and compost were spiked with 4 PFAS congeners (PFOA, PFOS, GenX, and PFBS) at 0.01-0.2 µg/mL and subjected to a 3-step extraction method to quantify the leachability of PFAS from each matrix. The multistep extraction method showed that PFAS leaching from soil was aligned with the total carbon content of the soil, and the recovery was dependent on the PFAS concentration. To prevent the leaching of PFAS, several sorbents including activated carbon (AC), calcium montmorillonite (CM), acid processed montmorillonite (APM), and organoclays modified with carnitine, choline, and chlorophyll were added to the four soil matrices at 0.5-4% w/w, and PFAS was extracted using the LEAF method. Total PFAS bioavailability was reduced by 58-97% by all sorbents in a dose-dependent manner, with AC being the most efficient sorbent (73-97% reduction). The water leachates and soil were tested for toxicity using an aquatic plant (Lemna minor) and a soil nematode (Caenorhabditis elegans), respectively, to validate the reduction in PFAS bioavailability. Growth parameters in both ecotoxicological models showed a dose-dependent reduction in toxicity, with added growth promotion from the organoclays due to their added nutrients. Kinetic studies at varying time intervals and varying pHs simulating acidic rain, fresh water, and brackish water suggested stable sorption of PFAS on all sorbents that fit a pseudo-second-order model for up to 21 days. Soil contaminated with more than 0.1 µg/mL PFAS may require reapplication of soil amendments every 21 days. Overall, AC showed the highest sorption percentage of total PFAS in vitro, while organoclays delivered higher protection in ecotoxicological models (in vivo). This study suggests that in situ immobilization with soil amendments can reduce PFAS leachates and their bioavailability to surrounding organisms. A combination of sorbents may facilitate the most effective remediation of complex soil matrices containing mixtures of PFAS and prevent leaching and uptake into plants.
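The pseudo-second-order kinetics mentioned above can be illustrated with a short sketch. The model form and its linearised least-squares fit are standard; the data below are synthetic and purely illustrative, not from the study:

```python
import numpy as np

def fit_pseudo_second_order(t, q):
    """Fit q(t) = (qe**2 * k2 * t) / (1 + qe * k2 * t) using the
    linearised form t/q = 1/(k2 * qe**2) + t/qe."""
    slope, intercept = np.polyfit(t, t / q, 1)
    qe = 1.0 / slope                  # equilibrium sorbed amount
    k2 = 1.0 / (intercept * qe ** 2)  # pseudo-second-order rate constant
    return qe, k2

# synthetic sorbed-amount data over a 21-day window (hypothetical units)
t = np.array([1.0, 3.0, 7.0, 14.0, 21.0])
q = (0.15 ** 2 * 2.0 * t) / (1 + 0.15 * 2.0 * t)  # generated with qe=0.15, k2=2.0

qe, k2 = fit_pseudo_second_order(t, q)  # recovers qe ≈ 0.15, k2 ≈ 2.0
```

In practice one would compare the fitted curve against measured time points at each pH condition to judge sorption stability.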

2.
Environ Pollut ; 347: 123762, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38479705

ABSTRACT

Toxic substances, such as polycyclic aromatic hydrocarbons (PAHs) and heavy metals, can accumulate in soil, posing a risk to human health and the environment. To reduce the risk of exposure, rapid identification and remediation of potentially hazardous soils is necessary. Adsorption of contaminants by activated carbons and clay materials is commonly utilized to decrease the bioavailability of chemicals in soil and environmental toxicity in vitro, and this study aims to determine their efficacy in real-life soil samples. Two ecotoxicological models (Lemna minor and Caenorhabditis elegans) were used to test residential soil samples, known to contain an average of 5.3, 262, and 9.6 ppm of PAHs, lead, and mercury, respectively, for potential toxicity. Toxicity testing of these soils indicated that 86% and 58% of soils caused ≤50% inhibition of growth and survival of L. minor and C. elegans, respectively. Importantly, 3 soil samples caused ≥90% inhibition of growth in both models, and the toxicity was positively correlated with levels of heavy metals. These toxic soil samples were prioritized for remediation using activated carbon and SM-Tyrosine sorbents, which have been shown to immobilize PAHs and heavy metals, respectively. The inclusion of low levels of SM-Tyrosine protected the growth and survival of L. minor and C. elegans by 83% and 78%, respectively, from the polluted soil samples, while activated carbon offered no significant protection. These results also indicated that heavy metals were the driver of toxicity in the samples. Results from this study demonstrate that adsorption technologies are effective strategies for remediating complex, real-life soil samples contaminated with hazardous pollutants and protecting natural soil and groundwater resources and habitats. The results highlight the applicability of these ecotoxicological models as rapid screening tools for monitoring soil quality and verifying the efficacy of remediation practices.


Subject(s)
Araceae , Metals, Heavy , Polycyclic Aromatic Hydrocarbons , Soil Pollutants , Animals , Humans , Clay , Caenorhabditis elegans , Charcoal , Metals, Heavy/toxicity , Metals, Heavy/analysis , Polycyclic Aromatic Hydrocarbons/analysis , Soil/chemistry , Tyrosine , Soil Pollutants/analysis
3.
Environ Sci Pollut Res Int ; 31(14): 21781-21796, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38396181

ABSTRACT

Pesticides are commonly found in the environment and pose a risk to target and non-target species; therefore, employing a set of bioassays to rapidly assess the toxicity of these chemicals to diverse species is crucial. The toxicity of nine individual pesticides from the organophosphate, organochlorine, phenylurea, dinitroaniline, carbamate, and viologen chemical classes, and a mixture of all the compounds, was tested in three bioassays (Hydra vulgaris, Lemna minor, and Caenorhabditis elegans) that represent aquatic, plant, and soil-dwelling species, respectively. Multiple endpoints related to growth and survival were measured for each model, and EC10 and EC50 values were derived for each endpoint to identify sensitivity patterns according to chemical class and target organism. L. minor had the lowest EC10 and EC50 values for seven and five of the individual pesticides, respectively. L. minor was also one to two orders of magnitude more sensitive to the mixture than H. vulgaris and C. elegans, with EC50 values of 0.00042, 0.0014, and 0.038 mM, respectively. H. vulgaris was the most sensitive species to the remaining individual pesticides, and C. elegans consistently ranked the least sensitive to all tested compounds. When comparing the EC50 values across all pesticides, the endpoints of L. minor were correlated with each other, while the endpoints measured in H. vulgaris and C. elegans were clustered together. While there was no apparent relationship between the chemical class of a pesticide and its toxicity, the compounds clustered more closely based on target organism (herbicide vs. insecticide). The results of this study demonstrate that the combination of these plant, soil, and aquatic species can serve as representative indicators of pesticide pollution in environmental samples.
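EC10 and EC50 values like those above are typically derived by fitting a dose-response curve to each endpoint. A minimal sketch using a two-parameter log-logistic (Hill) model follows; the concentrations and the "true" EC50 of 4.2e-4 mM are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(c, ec50, h):
    """Fraction of control growth at concentration c (two-parameter model)."""
    return 1.0 / (1.0 + (c / ec50) ** h)

conc = np.array([1e-5, 1e-4, 1e-3, 1e-2, 1e-1])   # mM, hypothetical dilution series
resp = log_logistic(conc, 4.2e-4, 1.3)            # synthetic "measurements"

(ec50, h), _ = curve_fit(log_logistic, conc, resp, p0=[1e-3, 1.0])
# EC10: concentration causing 10% inhibition, i.e. response = 0.9
ec10 = ec50 * (1.0 / 9.0) ** (1.0 / h)
```

Repeating this fit per endpoint and species yields the sensitivity rankings discussed in the abstract.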


Subject(s)
Araceae , Pesticides , Animals , Pesticides/toxicity , Pesticides/chemistry , Caenorhabditis elegans , Carbamates/toxicity , Organophosphates , Soil
4.
Chem Eng Sci ; 281, 2023 Nov 05.
Article in English | MEDLINE | ID: mdl-37637227

ABSTRACT

Humans are continuously exposed to a variety of toxicants and chemicals, and this exposure is exacerbated during and after environmental catastrophes such as floods, earthquakes, and hurricanes. The hazardous chemical mixtures generated during these events threaten the health and safety of humans and other living organisms. This necessitates the development of rapid decision-making tools to help mitigate the adverse effects of exposure on key modulators of the endocrine system, such as the estrogen receptor alpha (ERα). The mechanistic stages of estrogenic transcriptional activity can be measured with high content/high throughput microscopy-based biosensor assays at the single-cell level, which generate millions of object-based minable data points. By combining computational modeling and experimental analysis, we built a highly accurate data-driven classification framework to assess the endocrine disrupting potential of environmental compounds. The effects of these compounds on the ERα pathway are predicted as being receptor agonists or antagonists using principal component analysis (PCA) projections of high throughput, high content image analysis descriptors. The framework also combines rigorous preprocessing steps and nonlinear machine learning algorithms, such as Support Vector Machine and Random Forest classifiers, to develop highly accurate mathematical representations of the separation between ERα agonists and antagonists. The results show that Support Vector Machines classify the unseen chemicals correctly with more than 96% accuracy using the proposed framework, where the preprocessing and PCA steps play a key role in suppressing experimental noise and unraveling hidden patterns in the dataset.
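A skeleton of this kind of classification framework (scaling, PCA projection, then an SVM) can be assembled with scikit-learn. Everything below is synthetic stand-in data, not the study's imaging descriptors:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# hypothetical image-analysis descriptors: 200 "compounds" x 50 features;
# agonists (label 1) are shifted relative to antagonists (label 0)
X = rng.normal(size=(200, 50))
y = (rng.random(200) < 0.5).astype(int)
X[y == 1, :5] += 2.0              # the separation lives in a few features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)       # held-out classification accuracy
```

The PCA step plays the noise-suppression role described in the abstract: the classifier sees only the dominant directions of variation.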

5.
ESCAPE ; 52: 2631-2636, 2023.
Article in English | MEDLINE | ID: mdl-37575176

ABSTRACT

We develop a machine learning framework that integrates high content/high throughput image analysis and artificial neural networks (ANNs) to model the separation between chemical compounds based on their estrogenic receptor activity. Natural and man-made chemicals have the potential to disrupt the endocrine system by interfering with hormone actions in people and wildlife. Although numerous studies have revealed new knowledge on the mechanism through which these compounds interfere with various hormone receptors, it is still a very challenging task to comprehensively evaluate the endocrine disrupting potential of all existing chemicals and their mixtures by pure in vitro or in vivo approaches. Machine learning offers a unique advantage in the rapid evaluation of chemical toxicity through learning the underlying patterns in the experimental biological activity data. Motivated by this, we train and test ANN classifiers for modeling the activity of estrogen receptor-α agonists and antagonists at the single-cell level by using high throughput/high content microscopy descriptors. Our framework preprocesses the experimental data by cleaning, scaling, and feature engineering where only the middle 50% of the values from each sample with detectable receptor-DNA binding is considered in the dataset. Principal component analysis is also used to minimize the effects of experimental noise in modeling where these projected features are used in classification model building. The results show that our ANN-based nonlinear data-driven framework classifies the benchmark agonist and antagonist chemicals with 98.41% accuracy.
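The "middle 50%" preprocessing step can be sketched directly: keep only the observations between the 25th and 75th percentiles of each sample. The numbers below are toy values, not assay data:

```python
import numpy as np

def middle_50(values):
    """Keep only observations between the 25th and 75th percentiles,
    mirroring the per-sample filtering step described above."""
    q1, q3 = np.percentile(values, [25, 75])
    return values[(values >= q1) & (values <= q3)]

# hypothetical single-cell readings for one sample
cells = np.array([0.1, 0.2, 5.0, 5.1, 5.2, 5.3, 9.9, 10.0])
kept = middle_50(cells)   # drops the extreme tails on both sides
```

Filtering each sample this way before PCA reduces the influence of outlier cells on the learned projection.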

6.
Comput Chem Eng ; 156, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34720250

ABSTRACT

The coordination of interconnected elements across the different layers of the supply chain is essential for all industrial processes and the key to optimal decision-making. Yet, the modeling and optimization of such interdependent systems remain burdensome. In this paper, we address the simultaneous modeling and optimization of medium-term planning and short-term scheduling problems under demand uncertainty using mixed-integer bi-level multi-follower programming and data-driven optimization. Bi-level multi-follower programs model the natural hierarchy between different layers of supply chain management holistically, while scenario analysis and data-driven optimization allow us to retrieve guaranteed feasible solutions of the integrated formulation under various demand considerations. We address the data-driven optimization of this challenging class of problems using the DOMINO framework, which was initially developed to solve single-leader single-follower bi-level optimization problems to guaranteed feasibility. This framework is extended to solve single-leader multi-follower stochastic formulations, and its performance is characterized on well-known single- and multi-product process scheduling case studies. Through our data-driven algorithmic approach, we present guaranteed feasible solutions to linear and nonlinear mixed-integer bi-level formulations of simultaneous planning and scheduling problems, and we further characterize the effect of scheduling-level complexity on solution performance for problems spanning several hundred continuous and binary variables and thousands of constraints.

7.
ESCAPE ; 51: 205-210, 2022.
Article in English | MEDLINE | ID: mdl-36622647

ABSTRACT

This work addresses the control optimization of time-varying systems without the full discretization of the underlying high-fidelity models and derives optimal control trajectories using surrogate modeling and data-driven optimization. Time-varying systems are ubiquitous in the chemical process industry, and their systematic control is essential for ensuring that each system operates at the desired settings. To this end, we postulate nonlinear continuous-time control action trajectories using time-varying surrogate models and derive the parameters of these functional forms using data-driven optimization. Data-driven optimization allows us to collect data from the high-fidelity model without pursuing any discretization and to fine-tune candidate control trajectories based on the retrieved input-output information from the nonlinear system. We test exponential and polynomial surrogate forms for the control trajectories and explore various data-driven optimization strategies (local vs. global and sample-based vs. model-based) to test the consistency of each approach for controlling dynamic systems. The applicability of our approach is demonstrated on a motivating example and a CSTR control case study with favorable results.
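The idea can be sketched in a few lines: postulate an exponential control trajectory u(t) = a·exp(-b·t), simulate a toy first-order system in place of the high-fidelity model, and tune (a, b) with a derivative-free optimizer. The dynamics, setpoint, and surrogate form below are illustrative assumptions, not the paper's case study:

```python
import numpy as np
from scipy.optimize import minimize

def simulate(params, dt=0.01, T=5.0):
    """Forward-Euler integration of a toy plant dx/dt = -x + u(t),
    with the control postulated as u(t) = a * exp(-b * t).
    Returns the accumulated tracking error against setpoint x* = 1."""
    a, b = params
    x, cost = 0.0, 0.0
    for k in range(int(T / dt)):
        t = k * dt
        u = a * np.exp(-b * t)
        cost += (x - 1.0) ** 2 * dt
        x += dt * (-x + u)
    return cost

# derivative-free (sample-based) tuning of the trajectory parameters
res = minimize(simulate, x0=[1.0, 1.0], method="Nelder-Mead")
a_opt, b_opt = res.x
```

No discretization of the control profile itself is needed; the optimizer only ever sees input-output pairs from the simulator, which is the data-driven setting the abstract describes.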

8.
ESCAPE ; 50: 481-486, 2021.
Article in English | MEDLINE | ID: mdl-34355221

ABSTRACT

A comprehensive evaluation of toxic chemicals and understanding their potential harm to human physiology is vital in mitigating their adverse effects following exposure from environmental emergencies. In this work, we develop data-driven classification models to facilitate rapid decision making in such catastrophic events and predict the estrogenic activity of environmental toxicants as estrogen receptor-α (ERα) agonists or antagonists. By combining high-content analysis, big-data analytics, and machine learning algorithms, we demonstrate that highly accurate classifiers can be constructed for evaluating the estrogenic potential of many chemicals. We follow a rigorous, high throughput microscopy-based high-content analysis pipeline to measure the single cell-level response of benchmark compounds with known in vivo effects on the ERα pathway. The resulting high-dimensional dataset is then pre-processed by fitting a non-central gamma probability distribution function to each feature, compound, and concentration. The characteristic parameters of the distribution, which represent the mean and the shape of the distribution, are used as features for the classification analysis via Random Forest (RF) and Support Vector Machine (SVM) algorithms. The results show that the SVM classifier can predict the estrogenic potential of benchmark chemicals with higher accuracy than the RF algorithm, which misclassifies two antagonist compounds.
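The distribution-fitting step can be sketched as follows. Note that scipy has no non-central gamma fitter, so an ordinary gamma fit with the location fixed at zero stands in here, and the readings are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical single-cell intensity readings for one feature/compound/dose
readings = rng.gamma(shape=3.0, scale=2.0, size=5000)

# stand-in for the non-central gamma fit: plain gamma MLE, loc pinned at 0
shape, loc, scale = stats.gamma.fit(readings, floc=0)

# characteristic parameters (distribution mean, shape) become the
# per-condition features fed to the RF/SVM classifiers
summary = (shape * scale, shape)
```

Collapsing each noisy single-cell distribution to two characteristic parameters is what makes the downstream classification tractable.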

9.
ESCAPE ; 50: 1707-1713, 2021.
Article in English | MEDLINE | ID: mdl-34414400

ABSTRACT

Supply chain management is an interconnected problem that requires the coordination of various decisions and elements across long-term (i.e., supply chain structure), medium-term (i.e., production planning), and short-term (i.e., production scheduling) operations. Traditionally, decision-making strategies for such problems follow a sequential approach, where longer-term decisions are made first and then implemented at the lower levels accordingly. However, there are shared variables across the different decision layers of the supply chain that dictate the feasibility and optimality of the overall supply chain performance. Multi-level programming offers a holistic approach that explicitly accounts for this inherent hierarchy and interconnectivity between supply chain elements; however, it requires more rigorous solution strategies, as such problems are strongly NP-hard. In this work, we use the DOMINO framework, a data-driven optimization algorithm initially developed to solve single-leader single-follower bi-level mixed-integer optimization problems, and further develop it to address integrated planning and scheduling formulations with multiple follower lower-level problems, which have not received extensive attention in the open literature. By sampling the production targets over a pre-specified planning horizon, DOMINO deterministically solves the scheduling problem at each planning period per sample, while accounting for the total cost of planning, inventories, and demand satisfaction. This input-output data is then passed to a data-driven optimizer to recover a guaranteed feasible, near-optimal solution to the integrated planning and scheduling problem. We show the applicability of the proposed approach on the solution of a two-product planning and scheduling case study.

10.
Ind Eng Chem Res ; 60(23): 8493-8503, 2021 Jun 16.
Article in English | MEDLINE | ID: mdl-34219916

ABSTRACT

Industrial process systems need to be optimized, simultaneously satisfying financial, quality and safety criteria. To meet all those potentially conflicting optimization objectives, multiobjective optimization formulations can be used to derive optimal trade-off solutions. In this work, we present a framework that provides the exact Pareto front of multiobjective mixed-integer linear optimization problems through multiparametric programming. The original multiobjective optimization program is reformulated through the well-established ϵ-constraint scalarization method, in which the vector of scalarization parameters is treated as a right-hand side uncertainty for the multiparametric program. The algorithmic procedure then derives the optimal solution of the resulting multiparametric mixed-integer linear programming problem as an affine function of the ϵ parameters, which explicitly generates the Pareto front of the multiobjective problem. The solution of a numerical example is analytically presented to exhibit the steps of the approach, while its practicality is shown through a simultaneous process and product design problem case study. Finally, the computational performance is benchmarked with case studies of varying dimensionality with respect to the number of objective functions and decision variables.
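The ε-constraint scalarization can be illustrated on a continuous toy LP. The paper derives the front parametrically for mixed-integer problems; here, for simplicity, each ε value is just solved as a separate LP:

```python
import numpy as np
from scipy.optimize import linprog

# toy bi-objective LP: min f1 = -x1 and min f2 = -x2, s.t. x1 + x2 <= 1, x >= 0
pareto = []
for eps in np.linspace(-1.0, 0.0, 5):     # sweep the bound on f2 = -x2
    res = linprog(c=[-1.0, 0.0],          # minimise f1 only
                  A_ub=[[1.0, 1.0],       # x1 + x2 <= 1
                        [0.0, -1.0]],     # epsilon-constraint: f2 <= eps
                  b_ub=[1.0, eps],
                  bounds=[(0, None), (0, None)])
    pareto.append((-res.x[0], -res.x[1]))  # record (f1, f2) at the optimum
```

Every point satisfies f1 + f2 = -1, i.e. the full trade-off line x1 + x2 = 1; the multiparametric approach in the paper recovers this front as an explicit affine function of ε rather than by enumeration.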

11.
ACS Omega ; 6(22): 14090-14103, 2021 Jun 08.
Article in English | MEDLINE | ID: mdl-34124432

ABSTRACT

An attractive approach to minimize human and animal exposures to toxic environmental contaminants is the use of safe and effective sorbent materials to sequester them. Montmorillonite clays have been shown to tightly bind diverse toxic chemicals. Due to their promise as sorbents to mitigate chemical exposures, it is important to understand their function and rapidly screen and predict optimal clay-chemical combinations for further testing. We derived adsorption free-energy values for a structurally and physicochemically diverse set of toxic chemicals using experimental adsorption isotherms performed in the current and previous studies. We studied the diverse set of chemicals using minimalistic MD simulations and showed that their interaction energies with calcium montmorillonite clays calculated using simulation snapshots in combination with their net charge and their corresponding solvent's dielectric constant can be used as inputs to a minimalistic model to predict adsorption free energies in agreement with experiments. Additionally, experiments and computations were used to reveal structural and physicochemical properties associated with chemicals that can be adsorbed to calcium montmorillonite clay. These properties include positively charged groups, phosphine groups, halide-rich moieties, hydrogen bond donor/acceptors, and large, rigid structures. The combined experimental and computational approaches used in this study highlight the importance and potential applicability of analogous methods to study and design novel advanced sorbent systems in the future, broadening their applicability for environmental contaminants.
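One common route from adsorption isotherms to free energies, sketched with synthetic data: fit a Langmuir isotherm and convert the affinity constant via ΔG = -RT ln K. The study's actual isotherm treatment may differ, and the unit/reference-state bookkeeping is glossed over here:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, qmax, K):
    """Langmuir isotherm: q = qmax * K * c / (1 + K * c)."""
    return qmax * K * c / (1.0 + K * c)

c = np.array([1e-6, 1e-5, 1e-4, 1e-3, 1e-2])   # equilibrium conc. (mol/L)
q = langmuir(c, 0.8, 5e4)                       # synthetic sorption data

(qmax, K), _ = curve_fit(langmuir, c, q, p0=[1.0, 1e4])
R, T = 8.314, 298.15
dG = -R * T * np.log(K)   # J/mol, treating K as dimensionless vs. 1 mol/L
```

A more negative ΔG indicates tighter binding, which is the quantity the MD-derived interaction energies are benchmarked against.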

12.
Ind Eng Chem Res ; 59(37): 16357-16367, 2020 Sep 16.
Article in English | MEDLINE | ID: mdl-33041499

ABSTRACT

The construction and expansion of steam cracking plants and feedstock diversification have resulted in significant demand for numerical simulation and optimization models to achieve molecular refining and intelligent manufacturing. However, existing models cannot be widely applied in industrial practice because of their high computational expense, long runtimes, and data size requirements. In this paper, a high-performance optimization process, which integrates transfer learning and a heuristic algorithm, is proposed for the optimization of furnaces for various feedstocks. An effective transfer learning structure, based on motif features of the reaction network, is designed, and a product distribution prediction program is compiled on top of it. A hybrid genetic algorithm and particle swarm optimization method is then applied to optimize the coil outlet temperature (COT) curve using the derived prediction model, and results are obtained for different product pricing policies. The results are determined by the weight coefficients of the prices of the different products and can be further explained by the yield distribution pattern and the reaction mechanism.
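The swarm-based COT optimization can be sketched with a plain particle swarm (the paper uses a hybrid GA-PSO and a learned yield model; here a made-up quadratic profit function, peaking at an assumed 850 °C, stands in for the surrogate):

```python
import numpy as np

rng = np.random.default_rng(0)

def profit(cot):
    """Hypothetical stand-in for the surrogate yield model: product value
    as a function of coil outlet temperature (degC), peaking near 850."""
    return -(cot - 850.0) ** 2

# minimal particle swarm over a single COT value in [800, 900]
pos = rng.uniform(800, 900, size=20)      # particle positions
vel = np.zeros(20)                        # particle velocities
pbest = pos.copy()                        # per-particle best positions
gbest = pos[np.argmax(profit(pos))]       # global best position
for _ in range(100):
    r1, r2 = rng.random(20), rng.random(20)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 800, 900)
    better = profit(pos) > profit(pbest)
    pbest = np.where(better, pos, pbest)
    gbest = pbest[np.argmax(profit(pbest))]
```

In the paper's setting the decision variable is a full COT curve and the objective weights the predicted yields by product prices; the swarm mechanics are the same.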

13.
PLoS Comput Biol ; 16(9): e1008191, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32970665

ABSTRACT

Environmental toxicants affect human health in various ways. Of the thousands of chemicals present in the environment, those with adverse effects on the endocrine system are referred to as endocrine-disrupting chemicals (EDCs). Here, we focused on a subclass of EDCs that impacts the estrogen receptor (ER), a pivotal transcriptional regulator in health and disease. Estrogenic activity of compounds can be measured by many in vitro or cell-based high throughput assays that record various endpoints from large pools of cells, and increasingly at the single-cell level. To simultaneously capture multiple mechanistic ER endpoints in individual cells that are affected by EDCs, we previously developed a sensitive high throughput/high content imaging assay that is based upon a stable cell line harboring a visible multicopy ER responsive transcription unit and expressing a green fluorescent protein (GFP) fusion of ER. High content analysis generates voluminous multiplex data comprised of minable features that describe numerous mechanistic endpoints. In this study, we present a machine learning pipeline for rapid, accurate, and sensitive assessment of the endocrine-disrupting potential of benchmark chemicals based on data generated from high content analysis. The multidimensional imaging data was used to train a classification model to ultimately predict the impact of unknown compounds on the ER, either as agonists or antagonists. To this end, both linear logistic regression and nonlinear Random Forest classifiers were benchmarked and evaluated for predicting the estrogenic activity of unknown compounds. Furthermore, through feature selection, data visualization, and model discrimination, the most informative features were identified for the classification of ER agonists/antagonists. The results of this data-driven study showed that highly accurate and generalized classification models with a minimum number of features can be constructed without loss of generality, where these machine learning models serve as a means for rapid mechanistic/phenotypic evaluation of the estrogenic potential of many chemicals.
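Benchmarking a linear against a nonlinear classifier with cross-validation, plus a crude importance-based feature selection, can be sketched as follows. The dataset is a synthetic stand-in generated by scikit-learn, not the imaging data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# synthetic stand-in for the high-content features (agonist vs. antagonist)
X, y = make_classification(n_samples=400, n_features=30, n_informative=5,
                           random_state=0)

# benchmark linear logistic regression vs. nonlinear Random Forest (5-fold CV)
lr_acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
rf_acc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                         X, y, cv=5).mean()

# feature selection: rank features by RF importance and keep the top five
top5 = np.argsort(rf.feature_importances_)[::-1][:5]
```

Retraining on only the top-ranked features is one way to arrive at the "minimum number of features" models the abstract describes.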


Subject(s)
Algorithms , Estrogens/classification , Machine Learning , Cell Line , Estrogens/metabolism , Humans , Receptors, Estrogen/metabolism
14.
AIChE J ; 66(10)2020 Oct.
Article in English | MEDLINE | ID: mdl-32921798

ABSTRACT

A Support Vector Machine (SVM)-based optimization framework is presented for the data-driven optimization of numerically infeasible Differential Algebraic Equations (DAEs) without the full discretization of the underlying first-principles model. By formulating the stability constraint of the numerical integration of a DAE system as a supervised classification problem, we are able to demonstrate that SVMs can accurately map the boundary of numerical infeasibility. The necessity of this data-driven approach is demonstrated on a 2-dimensional motivating example, where highly accurate SVM models are trained, validated, and tested using the data collected from the numerical integration of DAEs. Furthermore, this methodology is extended and tested on a multi-dimensional case study from reaction engineering (i.e., thermal cracking of natural gas liquids). The data-driven optimization of this complex case study is explored by integrating the SVM models with a constrained global grey-box optimization algorithm, namely the ARGONAUT framework.
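The classification idea can be sketched in two dimensions. A made-up circular region stands in for the set of parameter values where the integrator succeeds; in the paper, labels would instead come from attempted numerical integrations of the DAE system:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# 2-D design space sampled uniformly; label 1 = "integration feasible"
X = rng.uniform(-2, 2, size=(500, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1.5).astype(int)   # invented feasibility rule

# train an RBF-kernel SVM on 400 points, test on the remaining 100
svm = SVC(kernel="rbf", gamma="scale").fit(X[:400], y[:400])
acc = svm.score(X[400:], y[400:])
```

The fitted decision boundary can then serve as an inexpensive feasibility constraint inside a grey-box optimizer, which is the role the SVM plays alongside ARGONAUT in the paper.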

15.
J Glob Optim ; 78(1): 1-36, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32753792

ABSTRACT

The Data-driven Optimization of bi-level Mixed-Integer NOnlinear problems (DOMINO) framework is presented for addressing the optimization of bi-level mixed-integer nonlinear programming problems. In this framework, bi-level optimization problems are approximated as single-level optimization problems by collecting samples of the upper-level objective and solving the lower-level problem to global optimality at those sampling points. This process is done through the integration of the DOMINO framework with a grey-box optimization solver to perform design of experiments on the upper-level objective, and to consecutively approximate and optimize bi-level mixed-integer nonlinear programming problems that are challenging to solve using exact methods. The performance of DOMINO is assessed through solving numerous bi-level benchmark problems, a land allocation problem in Food-Energy-Water Nexus, and through employing different data-driven optimization methodologies, including both local and global methods. Although this data-driven approach cannot provide a theoretical guarantee to global optimality, we present an algorithmic advancement that can guarantee feasibility to large-scale bi-level optimization problems when the lower-level problem is solved to global optimality at convergence.

16.
ESCAPE ; 46: 967-972, 2019.
Article in English | MEDLINE | ID: mdl-31612156

ABSTRACT

The National Institute of Environmental Health Sciences (NIEHS) Superfund Research Program (SRP) aims to support university-based multidisciplinary research on human health and environmental issues related to hazardous substances and pollutants. The Texas A&M Superfund Research Program comprehensively evaluates the complexities of hazardous chemical mixtures and their potential adverse health impacts due to exposure through a number of multidisciplinary projects and cores. One of the essential components of the Texas A&M Superfund Research Center is the Data Science Core, which serves as the basis for translating the data produced by the multidisciplinary research projects into useful knowledge for the community via data collection, quality control, analysis, and model generation. In this work, we demonstrate the Texas A&M Superfund Research Program computational platform, which houses and integrates the large-scale, diverse datasets generated across the Center, provides basic visualization services to facilitate interpretation, monitors data quality, and implements a variety of state-of-the-art statistical analyses for model and tool development. The platform aims to facilitate effective integration and collaboration across the Center and acts as an enabler for the dissemination of the comprehensive ad hoc tools and models developed to address the environmental and health effects of exposure to chemical mixtures during environmental emergency-related contamination events.

17.
PLoS One ; 14(10): e0223517, 2019.
Article in English | MEDLINE | ID: mdl-31600275

ABSTRACT

A detailed characterization of the chemical composition of complex substances, such as products of petroleum refining and environmental mixtures, is greatly needed in exposure assessment and manufacturing. The inherent complexity and variability in the composition of complex substances obfuscate the choices for their detailed analytical characterization. Yet, in lieu of exact chemical composition of complex substances, evaluation of the degree of similarity is a sensible path toward decision-making in environmental health regulations. Grouping of similar complex substances is a challenge that can be addressed via advanced analytical methods and streamlined data analysis and visualization techniques. Here, we propose a framework with unsupervised and supervised analyses to optimally group complex substances based on their analytical features. We test two data sets of complex oil-derived substances. The first data set is from gas chromatography-mass spectrometry (GC-MS) analysis of 20 Standard Reference Materials representing crude oils and oil refining products. The second data set consists of 15 samples of various gas oils analyzed using three analytical techniques: GC-MS, GC×GC-flame ionization detection (FID), and ion mobility spectrometry-mass spectrometry (IM-MS). We use hierarchical clustering using Pearson correlation as a similarity metric for the unsupervised analysis and build classification models using the Random Forest algorithm for the supervised analysis. We present a quantitative comparative assessment of clustering results via the Fowlkes-Mallows index, and of classification results via model accuracies in predicting the group of an unknown complex substance. We demonstrate the effect of (i) different grouping methodologies, (ii) data set size, and (iii) dimensionality reduction on the grouping quality, and (iv) different analytical techniques on the characterization of the complex substances. While the complexity and variability in chemical composition are an inherent feature of complex substances, we demonstrate how the choices of the data analysis and visualization methods can impact the communication of their characteristics to delineate sufficient similarity.
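The unsupervised half of the framework (hierarchical clustering with Pearson correlation as the similarity metric, scored by the Fowlkes-Mallows index) can be sketched as follows, using synthetic profiles in place of the GC-MS features:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist
from sklearn.metrics import fowlkes_mallows_score

rng = np.random.default_rng(0)
# hypothetical analytical profiles: two groups of 10 substances, 40 features
group_a = rng.normal(0.0, 1.0, size=(10, 40)) + np.linspace(0, 5, 40)
group_b = rng.normal(0.0, 1.0, size=(10, 40)) - np.linspace(0, 5, 40)
X = np.vstack([group_a, group_b])
truth = np.array([0] * 10 + [1] * 10)

# pairwise distance = 1 - Pearson correlation between profiles
d = pdist(X, metric="correlation")
labels = fcluster(linkage(d, method="average"), t=2, criterion="maxclust")
fm = fowlkes_mallows_score(truth, labels)   # 1.0 = perfect grouping
```

The choice of linkage method (here average) is one of the grouping-methodology choices whose effect the paper quantifies.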


Subject(s)
Chemistry Techniques, Analytical/methods , Gas Chromatography-Mass Spectrometry , Petroleum/analysis , Principal Component Analysis , Reference Standards , Sample Size
18.
Comput Chem Eng ; 116: 488-502, 2018 Aug 04.
Article in English | MEDLINE | ID: mdl-30546167

ABSTRACT

The (global) optimization of energy systems, commonly characterized by high-fidelity and large-scale complex models, poses a formidable challenge partially due to the high noise and/or computational expense associated with the calculation of derivatives. This complexity is further amplified in the presence of multiple conflicting objectives, for which the goal is to generate trade-off compromise solutions, commonly known as Pareto-optimal solutions. We have previously introduced the p-ARGONAUT system, parallel AlgoRithms for Global Optimization of coNstrAined grey-box compUTational problems, which is designed to optimize general constrained single objective grey-box problems by postulating accurate and tractable surrogate formulations for all unknown equations in a computationally efficient manner. In this work, we extend p-ARGONAUT towards multi-objective optimization problems and test the performance of the framework, both in terms of accuracy and consistency, under many equality constraints. Computational results are reported for a number of benchmark multi-objective problems and a case study of an energy market design problem for a commercial building, while the performance of the framework is compared with other derivative-free optimization solvers.

19.
ESCAPE ; 43: 421-426, 2018.
Article in English | MEDLINE | ID: mdl-30534632

ABSTRACT

The ultimate goal of the Texas A&M Superfund program is to develop comprehensive tools and models for addressing exposure to chemical mixtures during environmental emergency-related contamination events. With that goal, we aim to design a framework for the optimal grouping of chemical mixtures based on their chemical characteristics and bioactivity properties, and to facilitate comparative assessment of their human health impacts through read-across. The optimal clustering of the chemical mixtures guides the selection of sorption material in such a way that the adverse health effects of each group are mitigated. Here, we perform (i) hierarchical clustering of complex substances using chemical and biological data, and (ii) predictive modeling of the sorption activity of broad-acting materials via regression techniques. Dimensionality reduction techniques are also incorporated to further improve the results. We adopt several recent examples of chemical substances of Unknown or Variable composition, Complex reaction products, and Biological materials (UVCB) as benchmark complex substances, where their grouping is optimized by maximizing the Fowlkes-Mallows (FM) index. The choices of clustering method and visualization technique are shown to influence how the groupings are communicated for read-across.

20.
Int Symp Process Syst Eng ; 44: 1885-1890, 2018.
Article in English | MEDLINE | ID: mdl-30397687

ABSTRACT

The land use allocation problem is an important issue for sustainable development. Land use optimization can have a profound influence on the provision of interconnected elements that strongly rely on the same land resources, such as food, energy, and water. However, a major challenge in land use optimization arises from the multiple stakeholders and their differing, and often conflicting, objectives. Industries, agricultural producers, and developers are mainly concerned with profits and costs, while government agents are concerned with a host of economic, environmental, and sustainability factors. In this work, we developed a hierarchical FEW-N approach to tackle the problem of land use optimization and facilitate decision-making to decrease the competition for resources and contribute to the sustainable development of the land. We formulate the problem as a Stackelberg duopoly game, a sequential game with two players, a leader and a follower (Stackelberg, 2011). The government agents are treated as the leader (with the objective of minimizing the competition between the FEW-N elements), and the agricultural producers and land developers as the followers (with the objective of maximizing their profit). This formulation results in a bi-level mixed-integer programming problem that is solved using a novel bi-level optimization algorithm through ARGONAUT, a hybrid optimization framework tailored to solve high-dimensional constrained grey-box optimization problems by connecting surrogate model identification with deterministic global optimization. Results show that our data-driven approach allows us to provide feasible solutions to complex bi-level problems that are very difficult to solve deterministically.
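The leader-follower structure can be illustrated by brute-force sampling of the leader's decision, computing the follower's best response at each sample. This is a one-dimensional toy, not the paper's mixed-integer formulation, and every function and number below is invented:

```python
import numpy as np

# leader (government): fraction of land reserved for water/energy use
land_fractions = np.linspace(0.0, 1.0, 101)

def follower_best_response(reserved):
    """Follower (producer) picks crop acreage a in [0, 1 - reserved] to
    maximise the concave profit a * (2 - a); the optimum is the largest
    feasible acreage since the unconstrained peak is at a = 1."""
    return min(1.0 - reserved, 1.0)

def leader_cost(reserved, a):
    """Leader penalises resource competition (a times unreserved land)
    plus deviation from an assumed reserve target of 0.4."""
    return a * (1.0 - reserved) + (0.4 - reserved) ** 2

# sample the upper level; solve the lower level exactly at each sample
best = min(land_fractions,
           key=lambda r: leader_cost(r, follower_best_response(r)))
```

Replacing the grid with an adaptive surrogate-guided sampler over the leader's variables gives the flavor of the ARGONAUT-based approach used in the paper.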
