1.
Sensors (Basel) ; 23(6)2023 Mar 13.
Article in English | MEDLINE | ID: mdl-36991791

ABSTRACT

Human context recognition (HCR) using sensor data is a crucial task in Context-Aware (CA) applications in domains such as healthcare and security. Supervised machine learning HCR models are trained on smartphone HCR datasets that are either scripted or gathered in-the-wild. Scripted datasets are the most accurate because of their consistent visit patterns, and supervised HCR models perform well on them but poorly on realistic data. In-the-wild datasets are more realistic, but they degrade HCR model performance due to data imbalance, missing or incorrect labels, and a wide variety of phone placements and device types. Lab-to-field approaches learn a robust data representation from a scripted, high-fidelity dataset and then use it to improve performance on a noisy, in-the-wild dataset with similar labels. This research introduces Triplet-based Domain Adaptation for Context REcognition (Triple-DARE), a lab-to-field neural network method that combines three loss functions to enhance intra-class compactness and inter-class separation in the embedding space of multi-labeled datasets: (1) a domain alignment loss to learn domain-invariant embeddings; (2) a classification loss to preserve task-discriminative features; and (3) a joint fusion triplet loss. Rigorous evaluations showed that Triple-DARE achieved 6.3% and 4.5% higher F1-score and classification accuracy, respectively, than state-of-the-art HCR baselines, and outperformed non-adaptive HCR models by 44.6% and 10.7%, respectively.
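The three-part objective described above can be illustrated with a minimal sketch. This is not the paper's implementation: the margin, the loss weights, and the function names are all assumptions for illustration; the actual Triple-DARE formulation (e.g. how the domain alignment and joint fusion terms are computed) may differ.

```python
# Hypothetical sketch of a Triple-DARE-style combined objective.
# Weights w_cls / w_dom / w_trip and margin are illustrative, not the paper's values.

def euclidean_dist_sq(a, b):
    """Squared Euclidean distance between two embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet margin loss: pull same-class embeddings together
    (anchor/positive) and push different-class embeddings apart
    (anchor/negative) by at least `margin`."""
    return max(0.0,
               euclidean_dist_sq(anchor, positive)
               - euclidean_dist_sq(anchor, negative)
               + margin)

def combined_loss(cls_loss, dom_loss, trip_loss,
                  w_cls=1.0, w_dom=0.5, w_trip=0.5):
    """Weighted sum of the three terms named in the abstract:
    classification loss, domain alignment loss, and triplet loss."""
    return w_cls * cls_loss + w_dom * dom_loss + w_trip * trip_loss
```

For example, `triplet_loss([0.0, 0.0], [1.0, 0.0], [1.2, 0.0])` is positive because the negative is barely farther from the anchor than the positive, so the margin is violated; in training, each term would be computed per batch and the weighted sum backpropagated through the shared embedding network.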


Subject(s)
Neural Networks, Computer; Supervised Machine Learning; Humans; Acclimatization; Records; Smartphone
2.
IEEE Comput Graph Appl ; 41(3): 96-104, 2021.
Article in English | MEDLINE | ID: mdl-33961548

ABSTRACT

Smartphone health sensing tools, which analyze passively gathered human behavior data, can provide clinicians with a longitudinal view of their patients' ailments in natural settings. In this Visualization Viewpoints article, we postulate that interactive visual analytics (IVA) can assist data scientists during the development of such tools by facilitating the discovery and correction of wrong or missing user-provided ground-truth health annotations. IVA can also assist clinicians in making sense of their patients' behaviors by providing additional contextual and semantic information. We review the current state of the art, outline unique challenges, and illustrate our viewpoints using our own work as well as that of other researchers. Finally, we articulate open challenges in this exciting and emerging field of research.


Subject(s)
Semantics; Smartphone; Humans