Results 1 - 7 of 7
1.
J Med Internet Res ; 25: e44265, 2023 Dec 18.
Article in English | MEDLINE | ID: mdl-38109188

ABSTRACT

The effective management of chronic conditions requires an approach that promotes a shift in care from the clinic to the home, improves the efficiency of health care systems, and benefits all users irrespective of their needs and preferences. Digital health can provide a solution to this challenge, and in this paper, we provide our vision for a smart health ecosystem. A smart health ecosystem leverages the interoperability of digital health technologies and advancements in big data and artificial intelligence for data collection and analysis and the provision of support. We envisage that this approach will allow a comprehensive picture of health, personalization, and tailoring of behavioral and clinical support; drive theoretical advancements; and empower people to manage their own health with support from health care professionals. We illustrate the concept with 2 use cases and discuss topics for further consideration and research, concluding with a message to encourage people with chronic conditions, their caregivers, health care professionals, policy and decision makers, and technology experts to join their efforts and work toward adopting a smart health ecosystem.


Subject(s)
Artificial Intelligence , Ecosystem , Humans , Ambulatory Care Facilities , Big Data , Chronic Disease
2.
J Med Syst ; 46(6): 36, 2022 May 06.
Article in English | MEDLINE | ID: mdl-35522356

ABSTRACT

The World Health Organization (WHO) recommends a six-step hand hygiene technique. Although multiple studies have reported that this technique yields inadequate skin coverage, they have relied on manual labeling, which provides only low-resolution estimates of coverage outcomes. We developed a computational system that precisely quantifies hand hygiene outcomes and provides high-resolution skin coverage visualizations to support improving hygiene technique. We identified frequently untreated areas on the dorsal side of the hands around the abductor digiti minimi and the first dorsal interosseous. We also estimated that excluding Steps 3, 6R, and 6L from the six-step technique leads to a cumulative coverage loss of less than 1%, indicating that these steps may be redundant. Our study demonstrates that the six-step hand hygiene technique could be improved to reduce untreated areas and remove potentially redundant steps. Furthermore, our system can be used to computationally validate newly proposed techniques and help optimize hand hygiene procedures.
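As a rough illustration of the kind of computation involved, the sketch below estimates cumulative skin coverage as the union of per-step coverage masks and shows how excluding steps changes the total. The step names, mask resolution, and data are hypothetical placeholders, not the authors' actual pipeline.

```python
# Hypothetical sketch: cumulative skin coverage from per-step coverage masks.
# Masks here are synthetic, so the printed loss is illustrative only.
import numpy as np

def cumulative_coverage(step_masks):
    """Fraction of hand-surface pixels covered by the union of the given steps.

    step_masks: list of boolean arrays of identical shape, one per hygiene step,
                where True marks skin treated during that step.
    """
    union = np.zeros_like(step_masks[0], dtype=bool)
    for mask in step_masks:
        union |= mask
    return union.mean()

# Example: coverage loss when hypothetical steps are excluded.
rng = np.random.default_rng(0)
masks = {name: rng.random((64, 64)) < p
         for name, p in [("step1", 0.6), ("step2", 0.5), ("step3", 0.4),
                         ("step6L", 0.3), ("step6R", 0.3)]}
full = cumulative_coverage(list(masks.values()))
reduced = cumulative_coverage([m for k, m in masks.items()
                               if k not in {"step3", "step6L", "step6R"}])
print(f"coverage loss from excluding steps: {full - reduced:.3%}")
```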


Subject(s)
Cross Infection , Hand Hygiene , Hand , Hand Disinfection/methods , Hand Hygiene/methods , Humans , Muscle, Skeletal , Upper Extremity , World Health Organization
3.
Yearb Med Inform ; 30(1): 191-199, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34479391

ABSTRACT

OBJECTIVES: To describe the use and promise of conversational agents in digital health, including health promotion and prevention, and how they can be combined with other new technologies to provide healthcare at home. METHOD: A narrative review of recent advances in the technologies underpinning conversational agents and of their use and potential for healthcare and improving health outcomes. RESULTS: By responding to written and spoken language, conversational agents present a versatile, natural user interface and have the potential to make services and applications more widely accessible. Historically, conversational interfaces for health applications have focused mainly on mental health, but with an increase in affordable devices and the modernization of health services, conversational agents are becoming more widely deployed across the health system. We present our work on context-aware voice assistants capable of proactively engaging users and delivering health information and services. The proactive voice agents we deploy allow us to conduct experience sampling in people's homes and to collect information about the contexts in which users interact with them. CONCLUSION: In this article, we describe the state of the art of these and other enabling technologies for speech and conversation and discuss ongoing research efforts to develop conversational agents that "live" with patients and customize their services around patients' needs. These agents can function as "digital companions" that send reminders about medications and appointments, proactively check in to gather self-assessments, and follow up with patients on their treatment plans. Together with unobtrusive and continuous collection of other health data, conversational agents can provide novel and deeply personalized access to digital healthcare, and they will become an increasingly important part of the ecosystem for future healthcare delivery.
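One way to picture such a proactive, context-aware agent is the minimal trigger sketch below. The context signals, quiet-hours window, and prompt text are illustrative assumptions, not the system described in the review.

```python
# Minimal sketch of a proactive, context-aware check-in trigger.
# The context fields and thresholds are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class Context:
    user_present: bool        # e.g., from a presence sensor
    speaking_detected: bool   # avoid interrupting an ongoing conversation
    now: datetime

def should_prompt(ctx: Context, quiet_start=time(21, 0), quiet_end=time(8, 0)) -> bool:
    """Decide whether the agent should proactively start a self-assessment."""
    in_quiet_hours = ctx.now.time() >= quiet_start or ctx.now.time() <= quiet_end
    return ctx.user_present and not ctx.speaking_detected and not in_quiet_hours

if should_prompt(Context(user_present=True, speaking_detected=False,
                         now=datetime(2021, 6, 1, 10, 30))):
    print("How are you feeling this morning? (experience-sampling prompt)")
```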


Subject(s)
Health Promotion , Health Services Accessibility , Speech Recognition Software , Telemedicine , Communication , Humans , Monitoring, Physiologic/methods , User-Computer Interface
4.
IEEE Trans Vis Comput Graph ; 26(12): 3402-3413, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32986552

ABSTRACT

The presence of fully occluded targets is common within virtual environments, ranging from a virtual object located behind a wall to a datapoint of interest hidden in a complex visualization. However, efficient input techniques for locating and selecting these targets remain largely underexplored in virtual reality (VR) systems. In this paper, we developed an initial set of seven techniques for fully occluded target selection in VR. We then evaluated their performance in a user study and derived a set of design implications for simple and more complex tasks from our results. Based on these insights, we refined the most promising techniques and conducted a second, more comprehensive user study. Our results show how factors such as occlusion layers, target depths, object densities, and the estimation of target locations can affect technique performance. Our findings from both studies, and the recommendations distilled from them, can inform the design of future VR systems that support selection of fully occluded targets.
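The abstract does not detail the seven techniques, but the sketch below illustrates one plausible approach to occluded selection: cast a ray, collect every intersected object (including those behind occluders), and let the user cycle through them in depth order. All names and values are hypothetical.

```python
# Hedged sketch of one plausible occluded-selection technique, not necessarily
# one of the paper's seven: ray cast, then cycle through hits in depth order.
from dataclasses import dataclass

@dataclass
class Hit:
    name: str
    depth: float  # distance along the ray from the controller

def ray_hits(scene, origin, direction):
    """Placeholder: a real VR engine would return all ray-object intersections."""
    return sorted(scene, key=lambda h: h.depth)

def cycle_selection(hits, step):
    """Return the target selected after `step` presses of a cycle button."""
    return hits[step % len(hits)] if hits else None

scene = [Hit("wall", 1.2), Hit("hidden datapoint", 2.5), Hit("back panel", 4.0)]
hits = ray_hits(scene, origin=(0, 0, 0), direction=(0, 0, 1))
print(cycle_selection(hits, step=1).name)  # -> "hidden datapoint"
```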

5.
Sensors (Basel) ; 20(9), 2020 Apr 29.
Article in English | MEDLINE | ID: mdl-32365724

ABSTRACT

Mind wandering is a drift of attention away from the physical world and towards our thoughts and concerns. It affects our cognitive state in ways that can foster creativity but hinder productivity. In the context of learning, mind wandering is primarily associated with lower performance. This study has two goals. First, we investigate the effects of text semantics and music on the frequency and type of mind wandering. Second, using eye-tracking and electrodermal features, we propose a novel technique for automatic, user-independent detection of mind wandering. We find that mind wandering was most frequent for texts for which readers had high expertise and that were combined with sad music. Furthermore, a significant increase in task-related thoughts was observed for texts for which readers had little prior knowledge. A Random Forest classification model yielded an F1-score of 0.78 when using only electrodermal features to detect mind wandering, 0.80 when using only eye-movement features, and 0.83 when using both. Our findings pave the way for building applications that automatically detect episodes of mind wandering during reading.
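A minimal sketch of the classification setup described above, using hypothetical eye-tracking and electrodermal features and synthetic data, might look as follows; a leave-one-group-out split over participant IDs stands in for the user-independent evaluation.

```python
# Sketch: Random Forest over eye-tracking + electrodermal features,
# evaluated user-independently. Feature names and data are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(42)
n = 200
X_eye = rng.normal(size=(n, 4))   # e.g., fixation duration, saccade rate, ...
X_eda = rng.normal(size=(n, 2))   # e.g., skin-conductance level and responses
X = np.hstack([X_eye, X_eda])
y = rng.integers(0, 2, size=n)          # 1 = mind wandering, 0 = on task
groups = rng.integers(0, 10, size=n)    # participant IDs (user-independent CV)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, groups=groups,
                         cv=LeaveOneGroupOut(), scoring="f1")
print(f"mean F1 across held-out participants: {scores.mean():.2f}")
```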


Subject(s)
Attention , Biosensing Techniques , Eye Movements , Eye-Tracking Technology , Female , Humans , Male , Reading
6.
JMIR Mhealth Uhealth ; 8(3): e17001, 2020 Mar 26.
Article in English | MEDLINE | ID: mdl-32213469

ABSTRACT

BACKGROUND: Hand hygiene is a crucial and cost-effective method for preventing health care-associated infections, and in 2009, the World Health Organization (WHO) issued guidelines to encourage and standardize hand hygiene procedures. However, a common challenge in health care settings is low adherence, leading to poor handwashing quality. Recent advances in machine learning and wearable sensing have made it possible to accurately measure handwashing quality for the purposes of training, feedback, or accreditation. OBJECTIVE: We measured the accuracy of a sensor armband (Myo armband) in detecting the steps and duration of the WHO procedures for handwashing and handrubbing. METHODS: We recruited 20 participants (10 females; mean age 26.5 years, SD 3.3). In a semistructured environment, we collected armband data (acceleration, gyroscope, orientation, and surface electromyography data) and video data from each participant during 15 handrub and 15 handwash sessions. We evaluated detection accuracy for different armband placements, sensor configurations, user-dependent versus user-independent models, and the use of bootstrapping. RESULTS: Using a single armband, accuracy was 96% (SD 0.01) for the user-dependent model and 82% (SD 0.08) for the user-independent model; with two armbands, accuracy increased to 97% (SD 0.01) and 91% (SD 0.04), respectively. Performance increased when the armband was placed on the forearm (user dependent: 97%, SD 0.01; user independent: 91%, SD 0.04) and decreased when placed on the upper arm (user dependent: 96%, SD 0.01; user independent: 80%, SD 0.06). In terms of bootstrapping, user-dependent models achieved more than 80% accuracy after six training sessions and 90% after 16 sessions. Finally, we found that the combination of accelerometer and gyroscope minimizes power consumption and cost while maximizing performance. CONCLUSIONS: A sensor armband can be used to measure hand hygiene quality relatively accurately, for both handwashing and handrubbing. Performance is acceptable with a single armband worn on the upper arm and can be substantially improved by placing the armband on the forearm or by using two armbands.
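A hedged sketch of the sensing pipeline implied here, with assumed window length, sampling rate, and synthetic signals, could extract simple sliding-window statistics from the accelerometer and gyroscope streams before feeding any classifier.

```python
# Sketch: sliding-window features from 3-axis accelerometer and gyroscope
# streams, ready for a step classifier. Parameters and data are assumptions.
import numpy as np

def window_features(acc, gyr, fs=50, win_s=2.0, hop_s=1.0):
    """Extract simple per-window statistics from 3-axis accel and gyro signals."""
    win, hop = int(fs * win_s), int(fs * hop_s)
    feats = []
    for start in range(0, len(acc) - win + 1, hop):
        a, g = acc[start:start + win], gyr[start:start + win]
        feats.append(np.concatenate([a.mean(0), a.std(0), g.mean(0), g.std(0)]))
    return np.array(feats)

# Example with synthetic signals: 30 s of 3-axis accel/gyro at 50 Hz.
rng = np.random.default_rng(1)
acc = rng.normal(size=(1500, 3))
gyr = rng.normal(size=(1500, 3))
X = window_features(acc, gyr)
print(X.shape)  # (n_windows, 12) feature matrix for a downstream classifier
```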


Subject(s)
Hand Disinfection , Adult , Electromyography , Female , Humans , Machine Learning , Male , Pilot Projects , Young Adult
7.
Hum Factors ; 55(1): 157-182, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23516800

ABSTRACT

OBJECTIVE: The goal of this project is to evaluate a new auditory cue, which the authors call spearcons, against other auditory cues, with the aim of improving auditory menu navigation. BACKGROUND: With the shrinking displays of mobile devices and increasing technology use by visually impaired users, it becomes important to improve the usability of interfaces that do not rely on a graphical user interface (GUI), such as auditory menus. Nonspeech sounds such as auditory icons (i.e., representative real-world sounds of objects or events) and earcons (i.e., brief musical melody patterns) have been proposed to enhance menu navigation. To compensate for the weaknesses of traditional nonspeech auditory cues, the authors developed spearcons by speeding up a spoken phrase, even to the point where it is no longer recognized as speech. METHOD: The authors conducted five empirical experiments. In Experiments 1 and 2, they measured menu navigation efficiency and accuracy across cue types. In Experiments 3 and 4, they evaluated the learning rate of the cues and of speech itself. In Experiment 5, they assessed spearcon enhancements compared with plain TTS (text-to-speech rendering of written menu items) in a two-dimensional auditory menu. RESULTS: Spearcons outperformed traditional and newer hybrid auditory cues in navigation efficiency, accuracy, and learning rate. Moreover, spearcons showed learnability comparable to normal speech and led to better performance than speech-only auditory cues in two-dimensional menu navigation. CONCLUSION: These results show that spearcons can be more effective than previous auditory cues in menu-based interfaces. APPLICATION: Spearcons broaden the taxonomy of nonspeech auditory cues, and users can benefit from their application in real devices.
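A spearcon can be approximated, as a sketch, by time-compressing a pre-synthesized speech clip while preserving its pitch contour; the file names and the 2.5x compression factor below are assumptions, not the authors' exact parameters.

```python
# Sketch of spearcon generation: compress a synthesized menu-item utterance in
# time until it is barely (or no longer) recognizable as speech.
import librosa
import soundfile as sf

speech, sr = librosa.load("menu_item_tts.wav", sr=None)    # pre-synthesized TTS clip (hypothetical file)
spearcon = librosa.effects.time_stretch(speech, rate=2.5)  # 2.5x faster, pitch contour preserved
sf.write("menu_item_spearcon.wav", spearcon, sr)
print(f"{len(speech)/sr:.2f}s speech -> {len(spearcon)/sr:.2f}s spearcon")
```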


Subject(s)
Acoustic Stimulation/methods , Auditory Perception , Cell Phone/trends , Computers, Handheld/trends , User-Computer Interface , Adolescent , Analysis of Variance , Cell Phone/instrumentation , Cues , Data Display , Female , Humans , Male , Sound , Speech , Young Adult