1.
Nat Commun ; 13(1): 7233, 2022 11 24.
Article in English | MEDLINE | ID: mdl-36433980

ABSTRACT

Climate extremes cause significant winter wheat yield loss, and when multiple extremes occur simultaneously their impacts can far exceed those of single extremes in isolation. Here we show that compound hot-dry-windy events (HDW) increased significantly in the U.S. Great Plains from 1982 to 2020. These HDW events were the most impactful drivers of wheat yield loss, accounting for a 4% yield reduction per 10 h of HDW during heading to maturity. Current HDW trends are associated with yield reduction rates of up to 0.09 t ha⁻¹ per decade, and HDW variations are atmospherically bridged with the Pacific Decadal Oscillation. We quantify the spatially distributed "yield shock", with losses concentrated in severely HDW-affected areas, presumably the same areas affected by the Dust Bowl of the 1930s. Our findings indicate that compound HDW events, which traditional risk assessments have overlooked, have significant implications for U.S. winter wheat production and beyond.
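As a hedged illustration of the reported loss rate (a 4% yield reduction per 10 h of HDW exposure during heading to maturity), the sketch below applies that rate linearly to a made-up baseline yield; the function name and the 3.0 t/ha baseline are illustrative assumptions, not values from the paper.

```python
# Illustrative arithmetic only: applies the reported rate of ~4% winter wheat
# yield loss per 10 hours of compound hot-dry-windy (HDW) exposure. The
# baseline yield is an invented example value.

def hdw_yield_loss(baseline_t_ha: float, hdw_hours: float,
                   loss_per_10h: float = 0.04) -> float:
    """Return estimated yield (t/ha) after a linear HDW-driven reduction."""
    loss_fraction = loss_per_10h * (hdw_hours / 10.0)
    return baseline_t_ha * (1.0 - loss_fraction)

# Example: a hypothetical 3.0 t/ha field exposed to 25 h of HDW
print(hdw_yield_loss(3.0, 25.0))  # 3.0 * (1 - 0.10) = 2.7
```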


Subject(s)
Triticum , Wind , Seasons , Climate , Climate Change
2.
Article in English | MEDLINE | ID: mdl-36072087

ABSTRACT

Research data management is becoming increasingly complex as the volume of data, metadata, and code grows. Researchers must often acquire multidisciplinary skills to obtain, transfer, share, and compute on large datasets. In this paper we present the results of an investigation into providing a familiar web-based experience for researchers to manage their data and code, leveraging popular, well-funded tools and services. We show how researchers can save time and avoid mistakes, provide a detailed discussion of our system architecture and implementation, and summarize the new capabilities and time savings that can be achieved.

3.
Proc IEEE Int Conf Big Data ; 2021: 4113-4118, 2021 Dec.
Article in English | MEDLINE | ID: mdl-36745144

ABSTRACT

This paper presents a novel use case of Graph Convolutional Network (GCN) representation learning for predictive data mining, specifically from user/task data in the domain of high-performance computing (HPC). It outlines an approach based on a coalesced dataset: logs from the Slurm workload manager, joined with user-experience survey data from computational cluster users. We introduce a new method of constructing a heterogeneous unweighted HPC graph consisting of multiple typed nodes after revealing the manifold relations between those nodes. The GCN structure used here supports two tasks: i) determining whether a job will complete or fail and ii) predicting memory and CPU requirements, by training semi-supervised GCN classification and regression models on the generated graph. The graph is partitioned using graph clustering. We conducted classification and regression experiments with the proposed framework on our HPC log dataset and evaluated the predictions of our trained models against baselines using test score, F1-score, precision, and recall for classification, and R1 score for regression, showing that our framework achieves significant improvements.
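The propagation rule underlying GCN models like the one described can be sketched in a few lines. This toy example implements one layer of the standard rule H' = ReLU(D^-1/2 (A + I) D^-1/2 H W) on an invented 3-node graph; it is not the paper's heterogeneous HPC graph or trained model.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])              # adjacency with self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(0.0, A_norm @ H @ W)      # ReLU activation

rng = np.random.default_rng(0)
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])                    # toy 3-node graph
H = rng.random((3, 4))                          # 4 input features per node
W = rng.random((4, 2))                          # project to 2 hidden features
print(gcn_layer(A, H, W).shape)                 # (3, 2)
```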

4.
Article in English | MEDLINE | ID: mdl-36760802

ABSTRACT

Determining resource allocations (memory and time) for submitted jobs in High-Performance Computing (HPC) systems is a challenging process, even for computer scientists. HPC users are strongly encouraged to overestimate the resource allocations for their submitted jobs so that their jobs will not be killed due to insufficient resources. Users overestimate because of the wide variety of HPC applications and environment configuration options, and because of their lack of knowledge of the complex structure of HPC systems. This wastes HPC resources, decreases the utilization of HPC systems, and increases waiting and turnaround times for submitted jobs. In this paper, we introduce the first fully-offline, fully-automated, stand-alone, and open-source Machine Learning (ML) tool to help users predict memory and time requirements for the jobs they submit to the cluster. Our tool implements six discriminative ML models from the scikit-learn library and Microsoft LightGBM, applied to historical accounting data (sacct data) from the Simple Linux Utility for Resource Management (Slurm). We tested our tool on historical data from the HPC resources of Kansas State University (Beocat), covering January 2019 through March 2021 and containing around 17.6 million jobs. Our results show that our tool achieves high predictive accuracy (R² of 0.72 using LightGBM for predicting memory and 0.74 using Random Forest for predicting time), helps dramatically reduce the average waiting and turnaround times for submitted jobs, and increases the utilization of HPC resources. Hence, our tool also decreases the power consumption of the HPC resources.
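As a rough, hedged sketch of the kind of discriminative model the abstract names, the example below trains a scikit-learn Random Forest to predict a job's runtime. The features and the data-generating formula are invented stand-ins for real Slurm sacct history, not the paper's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
cpus = rng.integers(1, 65, n)          # requested CPUs (invented)
mem_req = rng.uniform(1, 256, n)       # requested memory in GB (invented)
partition = rng.integers(0, 5, n)      # partition id (invented)
X = np.column_stack([cpus, mem_req, partition])
# Synthetic "true" runtime loosely tied to the features, plus noise
y = 0.5 * cpus + 0.1 * mem_req + 5.0 * partition + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))
print(f"R^2 = {r2:.2f}")  # a strong fit on this easy synthetic signal
```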

5.
Article in English | MEDLINE | ID: mdl-35373221

ABSTRACT

In this paper, we present a novel methodology for predicting job resources (memory and time) for jobs submitted to HPC systems. Our methodology is based on historical job data (sacct data) provided by the Slurm workload manager, using supervised machine learning. This Machine Learning (ML) prediction model is effective and useful for both HPC administrators and HPC users. Moreover, our ML model increases the efficiency and utilization of HPC systems and thus reduces power consumption as well. Our model applies several supervised discriminative models from the scikit-learn machine learning library and LightGBM to the historical Slurm data. It helps HPC users determine the required amount of resources for their submitted jobs and makes it easier for them to use HPC resources efficiently. This work provides the second step towards implementing our general open-source tool for HPC service providers. Our ML model has been implemented and tested on two HPC providers: an XSEDE service provider, the University of Colorado Boulder (RMACC Summit), and Kansas State University (Beocat). We used more than two hundred thousand jobs (one hundred thousand from RMACC Summit and one hundred thousand from Beocat) to build and assess our ML model's performance. In particular, we measured the improvement in running time, turnaround time, and average waiting time for submitted jobs, and measured the utilization of the HPC clusters. Our model achieved up to 86% accuracy in predicting the amounts of time and memory for both the RMACC Summit and Beocat HPC resources. Our results show that our model helps dramatically reduce average waiting time (from 380 to 4 hours on RMACC Summit and from 662 to 28 hours on Beocat), reduces turnaround time (from 403 to 6 hours on RMACC Summit and from 673 to 35 hours on Beocat), and achieves up to 100% utilization for both HPC resources.

6.
Merrill Ser Res Mission Public Univ ; 2019: 36-41, 2020 Jan 13.
Article in English | MEDLINE | ID: mdl-35309728

ABSTRACT

We present a "Researcher's Hierarchy of Needs" (loosely based on Maslow's Hierarchy of Needs) in the context of interdisciplinary research in a "big data" era. We discuss multiple tensions and difficulties that researchers face in today's environment, some current efforts and suggested policy changes to address these shortcomings and present our vision of a future interdisciplinary ecosystem.

7.
PEARC19 (2019); 2019 Jul.
Article in English | MEDLINE | ID: mdl-35308798

ABSTRACT

High-Performance Computing (HPC) systems are resources utilized for data capture, sharing, and analysis. The majority of our HPC users come from disciplines other than Computer Science. HPC users, including computer scientists, have difficulty deciding the required amount of resources for their jobs submitted to the cluster. Consequently, users are encouraged to overestimate resources for their submitted jobs so that the jobs will not be killed due to insufficient resources. This wastes HPC resources and leads to inefficient cluster utilization. We created a supervised machine learning model and integrated it into the Slurm resource-manager simulator to predict the amount of memory required and the time required to run the computation. Our model uses several machine learning algorithms. Our goal is to integrate and test the proposed supervised machine learning model within Slurm. We used over 10,000 tasks selected from our HPC log files to evaluate the performance and accuracy of the integrated model. The purpose of our work is to increase the performance of Slurm by predicting the memory and time required for each particular job, in order to improve the utilization of the HPC system using our integrated supervised machine learning model. Our results indicate that for larger jobs our model helps dramatically reduce computational turnaround time (from five days to ten hours), substantially increases the utilization of the HPC system, and decreases the average waiting time for submitted jobs.
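A minimal, hypothetical sketch of the log-preparation step such a model needs: turning pipe-delimited Slurm accounting lines (in the style of `sacct --parsable2` with fields JobID|ReqCPUS|ReqMem|Elapsed|State) into numeric features. The sample lines and the field subset are invented, not taken from the paper's logs.

```python
# Hypothetical sacct-style parser; the field layout and sample records below
# are assumptions for illustration only.

def parse_elapsed(hms: str) -> int:
    """Convert HH:MM:SS to seconds."""
    h, m, s = (int(p) for p in hms.split(":"))
    return 3600 * h + 60 * m + s

def parse_sacct(lines):
    rows = []
    for line in lines:
        job_id, cpus, mem, elapsed, state = line.split("|")
        rows.append({
            "cpus": int(cpus),
            "mem_gb": float(mem.rstrip("Gn")),   # e.g. "16G" or "16Gn"
            "seconds": parse_elapsed(elapsed),
            "completed": state == "COMPLETED",
        })
    return rows

sample = ["1001|8|16G|01:30:00|COMPLETED",
          "1002|4|8G|00:10:42|FAILED"]
print(parse_sacct(sample)[0]["seconds"])  # 5400
```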

8.
PLoS One ; 9(7): e100850, 2014.
Article in English | MEDLINE | ID: mdl-24992684

ABSTRACT

Variations in spatio-temporal patterns of Human Monocytic Ehrlichiosis (HME) infection in the state of Kansas, USA were examined, and the relationships between HME relative risk and various environmental, climatic, and socio-economic variables were evaluated. The HME data used in the study were reported to the Kansas Department of Health and Environment between 2005 and 2012, and geospatial variables representing the physical environment [National Land Cover/Land Use, NASA Moderate Resolution Imaging Spectroradiometer (MODIS)], climate [NASA MODIS, Prediction of Worldwide Renewable Energy (POWER)], and socio-economic conditions (US Census Bureau) were derived from publicly available sources. Following univariate screening of candidate variables using logistic regressions, two Bayesian hierarchical models were fit: a partial spatio-temporal model with random effects and a spatio-temporal interaction term, and a second model that included additional covariate terms. The best-fitting model revealed that spatio-temporal autocorrelation in Kansas increased steadily from 2005 to 2012, and identified poverty status, relative humidity, and an interactive factor, 'diurnal temperature range x mixed forest area', as significant county-level risk factors for HME. The identification of significant spatio-temporal patterns and new risk factors is important in the context of HME prevention, for future research on the ecology and evolution of HME, and for understanding climate change impacts on tick-borne diseases.
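The univariate screening step can be sketched as follows, with heavy hedging: the HME data are not available here, so synthetic county-style covariates are screened by fitting one-variable logistic regressions and ranking them by cross-validated ROC AUC (a stand-in criterion; the abstract does not specify the exact screening statistic). All variable names and distributions are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 500
covariates = {
    "relative_humidity": rng.normal(60, 10, n),   # invented distribution
    "poverty_rate": rng.uniform(0, 30, n),        # invented distribution
    "noise_variable": rng.normal(0, 1, n),        # should screen out
}
# Synthetic outcome driven by humidity and poverty only
logit = (0.15 * (covariates["relative_humidity"] - 60)
         + 0.1 * (covariates["poverty_rate"] - 15))
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

aucs = {}
for name, x in covariates.items():
    aucs[name] = cross_val_score(LogisticRegression(), x.reshape(-1, 1), y,
                                 cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {aucs[name]:.2f}")
```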


Subject(s)
Ehrlichiosis/epidemiology , Spatio-Temporal Analysis , Bayes Theorem , Climate Change , Humans , Kansas/epidemiology , Risk Factors , Socioeconomic Factors
9.
Article in English | MEDLINE | ID: mdl-19963906

ABSTRACT

Cattle health assessment is receiving increased attention due to threats that disease and bioterrorism pose to producer profits and to the safety of the food supply. Ingestible pill technology offers a promising means to obtain these physiologic data, since a bovine reticulum is an environment sheltered from outside elements that offers direct access to feed intake and heart/lung data. Traditional radio-frequency links are inappropriate for this application, as water absorption severely limits transmission ranges through tissue. This paper presents the initial design of a communications link that utilizes magnetic induction for signal transport and should be well-suited for a tissue medium. The link consists of a transmitter/receiver pair that employs loop antennae frequency matched at 125 kHz. Optimization of the link design offers the potential to achieve transmission distances of several feet through tissue.
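The frequency-matching idea can be illustrated with the resonance relation f = 1/(2π√(LC)); the 1 mH loop inductance below is an assumed example value, not a figure from the paper.

```python
import math

def matching_capacitance(f_hz: float, L_henry: float) -> float:
    """Capacitance that resonates with L at f, from f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / ((2 * math.pi * f_hz) ** 2 * L_henry)

L = 1e-3                                  # assumed 1 mH loop inductance
C = matching_capacitance(125e3, L)        # match at the 125 kHz link frequency
print(f"{C * 1e12:.0f} pF")               # ~1621 pF for this assumed coil
```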


Subject(s)
Dosage Forms , Health , Magnetics/instrumentation , Telemetry/instrumentation , Telemetry/veterinary , Animals , Cattle
10.
Am J Vet Res ; 69(8): 1005-12, 2008 Aug.
Article in English | MEDLINE | ID: mdl-18672963

ABSTRACT

OBJECTIVE: To determine the accuracy of accelerometers for measuring behavior changes in calves and to determine differences in beef calf behavior from before to after castration. ANIMALS: 3 healthy Holstein calves and 12 healthy beef calves. PROCEDURES: 2-dimensional accelerometers were placed on 3 calves, and data were logged simultaneous to video recording of animal behavior. Resulting data were used to generate and validate predictive models to classify posture (standing or lying) and type of activity (standing in place, walking, eating, getting up, lying awake, or lying sleeping). The algorithms developed were used to conduct a prospective trial to compare calf behavior in the first 24 hours after castration (n = 6) with behavior of noncastrated control calves (6) and with presurgical readings from the same castrated calves. RESULTS: On the basis of the analysis of the 2-dimensional accelerometer signal, posture was classified with a high degree of accuracy (98.3%) and the specific activity was estimated with a reasonably low misclassification rate (23.5%). Use of the system to compare behavior after castration revealed that castrated calves spent a significantly larger amount of time standing (82.2%), compared with presurgical readings (46.2%). CONCLUSIONS AND CLINICAL RELEVANCE: 2-dimensional accelerometers provided accurate classification of posture and reasonable classification of activity. Applying the system in a castration trial illustrated the usefulness of accelerometers for measuring behavioral changes in individual calves.
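A hedged sketch of the posture-classification idea: standing vs. lying are simulated as different gravity orientations on the two accelerometer axes, and a small decision tree stands in for the paper's predictive models. All numbers are invented, not calf data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
n = 400
# Axis readings in g: standing ~ (0, 1), lying ~ (1, 0), plus sensor noise
standing = rng.normal([0.0, 1.0], 0.15, size=(n, 2))
lying = rng.normal([1.0, 0.0], 0.15, size=(n, 2))
X = np.vstack([standing, lying])
y = np.array([1] * n + [0] * n)           # 1 = standing, 0 = lying

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[0.05, 0.95]]))        # a reading near the "standing" cluster
```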


Subject(s)
Motor Activity , Orchiectomy , Posture , Acceleration , Animals , Cattle , Equipment Design , Meat , Mice
11.
Article in English | MEDLINE | ID: mdl-19163803

ABSTRACT

Decreased agricultural profit margins and recent bioterrorism concerns have led to an increased interest in monitoring livestock health. Heart rate and core body temperature are traditional vital parameters for cattle health assessment, as they provide warnings for illness and disease. However, obtaining these data in the field is time and labor intensive, which speaks to the need for solutions that provide continuous and automatic acquisition of these parameters. This paper presents the design of a pill that can remain in an animal's reticulum and use electrocardiographic techniques to ascertain heart rate. The wired prototype has been tested with a fistulated steer. These tests demonstrate that consistent heart vector data can be acquired even in the presence of animal movement and rumination. After minor processing, these signals are suitable for use with peak detection circuitry that can automate heart rate determination.
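The peak-detection step that automates heart rate determination can be sketched on a synthetic beat train; the 250 Hz sample rate, threshold, and refractory period are assumed values, and the pure-Python detector stands in for the peak-detection circuitry.

```python
import numpy as np

fs = 250                                  # assumed sample rate (Hz)
t = np.arange(0, 10, 1 / fs)              # 10 s of signal
bpm_true = 72
phase = (t * bpm_true / 60) % 1.0
sig = np.exp(-((phase - 0.5) ** 2) / 0.001)   # clean synthetic beats

# Simple peak detection: local maxima above a threshold, with a refractory
# period so each beat is counted once.
thresh, refractory = 0.5, int(0.4 * fs)
peaks, last = [], -refractory
for i in range(1, len(sig) - 1):
    if (sig[i] > thresh and sig[i] >= sig[i - 1] and sig[i] > sig[i + 1]
            and i - last >= refractory):
        peaks.append(i)
        last = i
bpm_est = 60 * (len(peaks) - 1) * fs / (peaks[-1] - peaks[0])
print(round(bpm_est))  # 72
```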


Subject(s)
Cattle/physiology , Diagnosis, Computer-Assisted/instrumentation , Electrocardiography, Ambulatory/instrumentation , Electrodes , Heart Rate/physiology , Signal Processing, Computer-Assisted/instrumentation , Telemetry/instrumentation , Animals , Diagnosis, Computer-Assisted/methods , Electrocardiography, Ambulatory/methods , Equipment Design , Equipment Failure Analysis , Reproducibility of Results , Sensitivity and Specificity
12.
Conf Proc IEEE Eng Med Biol Soc ; 2006: 3190-3, 2006.
Article in English | MEDLINE | ID: mdl-17945758

ABSTRACT

The livestock industry is an integral part of the United States economy. The continued production of quality beef requires new and improved methods for long term monitoring of animal health. Additional benefits can be realized from this class of technology, such as the ability to identify the presence of disease early and thereby prevent its spread. An important element of health assessment is the ability to monitor vital data such as heart rate and core body temperature. This paper presents preliminary results from the design of an ingestible pill that allows one to acquire heart rate (via a phonocardiograph) and core temperature in cattle. Packaging, circuitry, algorithms, and the wireless link are addressed.


Subject(s)
Body Temperature , Cattle/physiology , Heart Rate , Telemetry/veterinary , Acoustics , Animals , Biomedical Engineering , Computer Simulation , Electronics, Medical , Equipment Design , Software , Telemetry/instrumentation , Thermometers/veterinary
13.
Conf Proc IEEE Eng Med Biol Soc ; 2006: 4659-62, 2006.
Article in English | MEDLINE | ID: mdl-17946256

ABSTRACT

Clinical techniques for monitoring livestock health are insufficient, as they provide only sporadic information and require too much resource investment in terms of time and veterinary expertise. A sophisticated system capable of continuously assessing the health of individual animals, aggregating these data, and reporting the results to owners and regional authorities could provide tremendous benefit to the livestock industry. Such a system would not only improve individual animal health but would also help to identify and prevent widespread disease, whether it originated from natural causes or from biological attacks. This paper presents results from a prototype telemonitoring system that utilizes wearable technology to provide continuous animal health data. The infrastructure, hardware, software, and representative physiological measurements are presented.


Subject(s)
Animals, Domestic , Cattle Diseases/epidemiology , Medical Records Systems, Computerized , Telemedicine/instrumentation , Acoustics , Animal Husbandry , Animals , Body Temperature , Cattle , Computer Communication Networks , Computers , Equipment Design , Heart Rate , Monitoring, Physiologic , Software