Results 1 - 3 of 3
1.
Annu Int Conf IEEE Eng Med Biol Soc; 2018: 1160-1163, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30440597

ABSTRACT

Evaluation of lung mechanics is the primary component in designing lung-protective, optimal ventilation strategies. This paper presents a machine learning approach for bedside assessment of respiratory resistance (R) and compliance (C). We develop machine learning algorithms that track flow rate and airway pressure and estimate R and C continuously and in real time. An experimental study is conducted by connecting a pressure-control ventilator to a test lung that simulates various R and C values, in order to gather sensor data for validation of the devised algorithms. We develop supervised learning algorithms based on decision tree, decision table, and Support Vector Machine (SVM) techniques to predict R and C values. Our experimental results demonstrate that the proposed algorithms achieve 90.3%, 93.1%, and 63.9% accuracy in assessing respiratory R and C using the decision table, decision tree, and SVM, respectively. These results, along with our ability to estimate R and C with 99.4% accuracy using a linear regression model, demonstrate the potential of the proposed approach for constructing a new generation of ventilation technologies that leverage novel computational models to control their underlying parameters for personalized healthcare and context-aware interventions.
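The abstract does not detail the estimation procedure; the following is a minimal, illustrative sketch of how R and C could be recovered from pressure and flow samples by linear regression, assuming the standard single-compartment equation of motion P(t) = R·Q(t) + V(t)/C + P0. The function name, units, and sampling setup are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def estimate_r_c(pressure, flow, dt):
    """Estimate respiratory resistance R and compliance C from airway
    pressure [cmH2O] and flow [L/s] sampled at interval dt [s], assuming
    the single-compartment model P(t) = R*Q(t) + V(t)/C + P0
    (a standard model; the paper's exact formulation is not stated)."""
    volume = np.cumsum(flow) * dt                     # V(t) by numerical integration of flow
    # Design matrix: columns for Q(t), V(t), and a constant baseline P0
    X = np.column_stack([flow, volume, np.ones_like(volume)])
    coeffs, *_ = np.linalg.lstsq(X, pressure, rcond=None)
    R, elastance, p0 = coeffs
    C = 1.0 / elastance                               # compliance is the inverse of elastance
    return R, C

# Hypothetical usage with recorded sensor data sampled at 100 Hz:
# R_hat, C_hat = estimate_r_c(pressure_samples, flow_samples, dt=0.01)
```

In this formulation the regression coefficient on V(t) is the elastance 1/C, so compliance is obtained by inverting it.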


Subject(s)
Algorithms; Respiration, Artificial; Lung; Machine Learning; Support Vector Machine; Ventilators, Mechanical
2.
IEEE J Biomed Health Inform; 22(1): 252-264, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29300701

ABSTRACT

Diet and physical activity are known to be important lifestyle factors in the self-management and prevention of many chronic diseases. Mobile sensors such as accelerometers have been used to measure physical activity or detect eating time. In many intervention studies, however, stringent monitoring of overall dietary composition and energy intake is needed. Currently, such monitoring relies on self-reported data, with users either entering text or taking an image that represents their food intake. These approaches suffer from limitations such as low adherence in technology adoption and time sensitivity to the dietary intake context. To address these limitations, we introduce the development and validation of Speech2Health, a voice-based mobile nutrition monitoring system that combines speech processing, natural language processing (NLP), and text mining techniques in a unified platform to facilitate nutrition monitoring. After converting the spoken data to text, nutrition-specific data are identified within the text using an NLP-based approach that combines standard NLP with the pattern mapping technique we introduce. We then develop a tiered matching algorithm to search for the food name in our nutrition database and accurately compute calorie intake values. We evaluate Speech2Health using real data collected from 30 participants. Our experimental results show that Speech2Health achieves an accuracy of 92.2% in computing calorie intake. Furthermore, our user study demonstrates that Speech2Health achieves significantly higher scores on technology adoption metrics than text-based and image-based nutrition monitoring. Our research demonstrates that new sensor modalities such as voice can be used either standalone or as a complementary source of information to existing modalities to improve the accuracy and acceptability of mobile health technologies for dietary composition monitoring.
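The tiered matching algorithm is not specified in the abstract beyond its name; the sketch below shows one plausible tiered lookup (exact match, then token-subset match, then fuzzy match). The database schema, tier ordering, and similarity threshold are assumptions for illustration only.

```python
import difflib

def tiered_match(food_name, nutrition_db):
    """Look up a transcribed food name in a nutrition database using a
    tiered strategy (illustrative; the paper's actual tiers are not given).
    nutrition_db maps food names to calories per serving (hypothetical schema)."""
    query = food_name.lower().strip()

    # Tier 1: exact match on the normalized name
    if query in nutrition_db:
        return query, nutrition_db[query]

    # Tier 2: every query token appears in a database entry's name
    tokens = set(query.split())
    for name, calories in nutrition_db.items():
        if tokens <= set(name.split()):
            return name, calories

    # Tier 3: closest fuzzy match above a similarity threshold
    close = difflib.get_close_matches(query, list(nutrition_db), n=1, cutoff=0.8)
    if close:
        return close[0], nutrition_db[close[0]]
    return None, None

# Hypothetical usage:
# db = {"apple": 95, "peanut butter sandwich": 340}
# tiered_match("Apple", db)  # -> ("apple", 95)
```

Falling through tiers in order of strictness keeps exact hits cheap while still tolerating recognition noise in the transcribed food name.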


Subject(s)
Diet/classification; Natural Language Processing; Nutrition Assessment; Smartphone; Telemedicine/methods; Adolescent; Adult; Algorithms; Eating/physiology; Humans; Nutrition Policy; Pattern Recognition, Automated; Software; United States; United States Department of Agriculture; Wearable Electronic Devices; Young Adult
3.
Annu Int Conf IEEE Eng Med Biol Soc; 2016: 1991-1994, 2016 Aug.
Article in English | MEDLINE | ID: mdl-28268720

ABSTRACT

Diet and physical activity are important lifestyle and behavioral factors in the self-management and prevention of many chronic diseases. Mobile sensors such as accelerometers have been used in the past to objectively measure physical activity or detect eating time. Diet monitoring, however, still relies on data self-recorded by end users, who use mobile devices to record nutrition intake by either entering text or taking images. Such approaches have shown low adherence in technology adoption and achieve only moderate accuracy. In this paper, we propose the development and validation of Speech-to-Nutrient-Information (S2NI), a comprehensive nutrition monitoring system that combines speech processing, natural language processing, and text mining in a unified platform to extract nutrient information, such as calorie intake, from spoken data. After converting the voice data to text, we identify food name and portion size information within the text. We then develop a tiered matching algorithm to search for the food name in our nutrition database and to accurately compute calorie intake. Due to its pervasive nature and ease of use, S2NI enables users to report their diet routine more frequently and at any time through their smartphone. We evaluate S2NI using real data collected from 10 participants. Our experimental results show that S2NI achieves 80.6% accuracy in computing calorie intake.
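As with Speech2Health above, the exact parsing and lookup rules are not given in the abstract; the sketch below only illustrates how a transcribed phrase with a portion size might be converted to a calorie estimate. The calorie table, unit mapping, number words, and phrase pattern are all hypothetical placeholders.

```python
import re

# Hypothetical per-unit calorie table and unit normalization; the paper's
# nutrition database schema and parsing rules are not described.
CALORIES_PER_UNIT = {"rice": ("cup", 205), "orange juice": ("cup", 112)}
UNIT_WORDS = {"cup": "cup", "cups": "cup"}
NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "half": 0.5}

def calories_from_utterance(text):
    """Extract (quantity, unit, food) from a transcribed phrase such as
    'two cups of rice' and scale the per-unit calories accordingly."""
    m = re.match(r"(\S+)\s+(\S+)\s+of\s+(.+)", text.lower().strip())
    if not m:
        return None
    qty_word, unit_word, food = m.groups()
    qty = NUMBER_WORDS.get(qty_word)
    if qty is None:
        try:
            qty = float(qty_word)          # e.g. "2 cups of rice"
        except ValueError:
            return None
    unit = UNIT_WORDS.get(unit_word)
    if food not in CALORIES_PER_UNIT or unit is None:
        return None
    base_unit, per_unit_calories = CALORIES_PER_UNIT[food]
    return qty * per_unit_calories if unit == base_unit else None

# calories_from_utterance("two cups of rice")  # -> 410
```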


Subject(s)
Monitoring, Physiologic/instrumentation; Computers; Diet; Feeding Behavior; Humans; Self Care