Results 1 - 20 of 71
1.
Front Artif Intell ; 7: 1386753, 2024.
Article in English | MEDLINE | ID: mdl-38952408

ABSTRACT

Introduction: Computerized sentiment detection, based on artificial intelligence and computer vision, has become essential in recent years. Thanks to developments in deep neural networks, this technology can now account for environmental, social, and cultural factors, as well as facial expressions. We aim to create more empathetic systems for various purposes, from medicine to interpreting emotional interactions on social media. Methods: To develop this technology, we combined authentic images from various databases, including EMOTIC (ADE20K, MSCOCO), EMODB_SMALL, and FRAMESDB, to train our models. We developed two sophisticated algorithms based on deep learning techniques, DCNN and VGG19. By optimizing the hyperparameters of our models, we analyze context and body language to improve our understanding of human emotions in images. We merge the 26 discrete emotional categories with the three continuous emotional dimensions to identify emotions in context. The proposed pipeline is completed by fusing our models. Results: We adjusted the parameters to outperform previous methods in capturing various emotions in different contexts. Our study showed that the Sentiment_recognition_model and VGG19_contexte increased mAP by 42.81% and 44.12%, respectively, surpassing the results of previous studies. Discussion: This groundbreaking research could significantly improve contextual emotion recognition in images. The implications of these promising results are far-reaching, extending to diverse fields such as social robotics, affective computing, human-machine interaction, and human-robot communication.
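To make the two-branch, two-head design described in this abstract concrete, the following is a minimal sketch (not the authors' released code) of a context-plus-body network with a 26-category discrete head and a 3-dimensional continuous head; the backbone choice, feature sizes, and fusion by concatenation are assumptions for illustration only.

# Hedged sketch of a two-branch, two-head context-emotion model (illustrative only).
import torch
import torch.nn as nn
from torchvision import models

class ContextEmotionNet(nn.Module):
    def __init__(self, n_discrete=26, n_continuous=3):
        super().__init__()
        # Body branch: VGG19 convolutional features applied to the person crop.
        self.body = models.vgg19(weights="DEFAULT").features
        # Context branch: a second backbone applied to the full image.
        self.context = models.vgg19(weights="DEFAULT").features
        self.pool = nn.AdaptiveAvgPool2d(1)
        fused = 512 * 2  # each VGG19 feature map has 512 channels
        self.discrete_head = nn.Linear(fused, n_discrete)      # 26 emotion categories
        self.continuous_head = nn.Linear(fused, n_continuous)  # valence/arousal/dominance

    def forward(self, body_crop, full_image):
        b = self.pool(self.body(body_crop)).flatten(1)
        c = self.pool(self.context(full_image)).flatten(1)
        x = torch.cat([b, c], dim=1)  # simple concatenation fusion (an assumption)
        return self.discrete_head(x), self.continuous_head(x)

# Training would typically pair a multi-label loss (e.g., BCEWithLogitsLoss) on the
# discrete head with an L1/L2 loss on the continuous dimensions.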

2.
Front Psychol ; 15: 1287952, 2024.
Article in English | MEDLINE | ID: mdl-38770252

ABSTRACT

Individuals with Parkinson's disease (PD) may exhibit impaired emotion perception. However, research demonstrating this decline has been based almost entirely on the recognition of isolated emotional cues. In real life, emotional cues such as expressive faces are typically encountered alongside expressive bodies. The current study investigated emotion perception in individuals with PD (n = 37) using emotionally incongruent composite displays of facial and body expressions, as well as isolated face and body expressions, and congruent composite displays as a baseline. In addition to a group of healthy controls (HC) (n = 50), we also included control individuals with schizophrenia (SZ) (n = 30), who, as in PD, display similar motor symptomatology and decreased emotion perception abilities. The results show that individuals with PD had an increased tendency to categorize incongruent face-body combinations in line with the body emotion, whereas healthy controls tended to classify them in line with the facial emotion. No consistent pattern for prioritizing the face or body was found in individuals with SZ. These results were not explained by recognition of the isolated emotional cues, cognitive status, depression, or motor symptoms of individuals with PD and SZ. As real-life expressions may include inconsistent cues in the body and face, these findings may have implications for the way individuals with PD and SZ interpret the emotions of others.

3.
Cureus ; 16(4): e58504, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38765425

ABSTRACT

Introduction In December 2019, COVID-19 originated in Wuhan, China, triggering a global pandemic. Despite the resulting disruption, the Saudi Arabian Ministry of Education ensured the safe continuation of teaching and learning activities. Amid the pandemic, health sciences students were exposed to diverse learning opportunities. Methods This study sought to explore their experiences with online teaching. Conducted as a descriptive cross-sectional study, it involved 397 health sciences students from three universities in the Makkah province who had encountered both traditional and virtual teaching methods. Results Most participants were female (71.1%), predominantly from Jeddah city (76.5%). The highest agreement scores were observed for student comprehension during online sessions (61.1%). A significant proportion (74.4%) found paying attention during online lectures easier than traditional ones. Blackboard emerged as the preferred educational platform for online teaching. Notably, there were no significant variations in students' perceptions of online teaching based on location, gender, or specialisation. Approximately 54.7% of students preferred watching their instructors through a webcam during online lectures. Conclusion Medical educators can leverage these findings to develop standardised teaching protocols and enhance the effectiveness of online education systems. The study underscores the importance of instructors using webcams during online teaching sessions, as it allows students to visually connect with their instructors, potentially improving the learning experience.

4.
Animals (Basel) ; 14(5)2024 Feb 21.
Article in English | MEDLINE | ID: mdl-38473063

ABSTRACT

This study aimed to investigate the common social and communicative behaviors of the Fonni's Dog under different outdoor conditions. For this study, 70 adult dogs (3-7 years; 32 intact males, 38 intact females) belonging to the Fonni's breed were used. A total of 35 dogs were kept in kennels and 35 were free-ranging dogs in their sheep/goat livestock units. A behavioral repertoire was adapted from the literature and an ethogram was filled in for each dog. All dogs were evaluated in the presence of the owner. Fisher's exact test with Bonferroni correction was used to test possible differences in the categorical variables (presence or absence of the behavior) between free-ranging dogs and dogs kept in kennels. The study revealed that several categories of the dogs' body language were associated with the management condition. However, the breed's motivations (guarding and defense of the territory) were satisfied both in kenneled dogs and in dogs that were free on the property. The current study suggests a good behavioral balance in Fonni's Dogs, which could be attributed to correct communication between dogs and owners.

5.
PNAS Nexus ; 3(2): pgae012, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38344008

ABSTRACT

For highly visual species like primates, facial and bodily emotion expressions play a crucial role in emotion perception. However, most research focuses on facial expressions, while the perception of bodily cues is still poorly understood. Using a novel comparative priming eye-tracking design, we examined whether our close primate relatives, the chimpanzees (Pan troglodytes), and humans infer emotions from bodily cues through subsequent perceptual integration with facial expressions. In experiment 1, we primed chimpanzees with videos of bodily movements of unfamiliar conspecifics engaged in social activities of opposite valence (play and fear) against neutral control scenes to examine attentional bias toward succeeding congruent or incongruent facial expressions. In experiment 2, we assessed the same attentional bias in humans yet using stimuli showing unfamiliar humans. In experiment 3, humans watched the chimpanzee stimuli of experiment 1, to examine cross-species emotion perception. Chimpanzees exhibited a persistent fear-related attention bias but did not associate bodily with congruent facial cues. In contrast, humans prioritized conspecifics' congruent facial expressions (matching bodily scenes) over incongruent ones (mismatching). Nevertheless, humans exhibited no congruency effect when viewing chimpanzee stimuli, suggesting difficulty in cross-species emotion perception. These results highlight differences in emotion perception, with humans being greatly affected by fearful and playful bodily cues and chimpanzees being strongly drawn toward fearful expressions, regardless of the preceding bodily priming cue. These data advance our understanding of the evolution of emotion signaling and the presence of distinct perceptual patterns in hominids.

6.
J Health Care Chaplain ; 30(2): 122-136, 2024.
Article in English | MEDLINE | ID: mdl-37178134

ABSTRACT

Recent research has described broad types of healthcare chaplains' activities, but many questions remain about how these professionals perform these tasks, whether variations occur, and if so, in what ways. Twenty-three chaplains were interviewed in-depth. Chaplains described engaging in highly dynamic processes, involving both verbal and non-verbal interactions. They face challenges and vary in how they start interactions, use verbal and non-verbal cues, and communicate through physical appearance. In these processes, when entering patients' rooms, they seek to "read the room," follow patients' leads, look for cues, match the energy/mood in the room, and adjust their body language appropriately, while maintaining open-ended stances. They face choices about what, if anything, to communicate through clothing (e.g., wearing clerical collars or crosses) and can confront additional challenges with members of groups different from their own, at times requiring further sensitivity. These data, the first to examine the challenges chaplains confront when entering patients' rooms and engaging in non-verbal communication, can enhance understanding of these issues and help chaplains and other healthcare professionals provide more sensitive and astute context-based care. These findings thus have critical implications for education, practice, and research concerning chaplains and other providers.


Subject(s)
Clergy , Patients , Humans , Health Facilities , Nonverbal Communication , Delivery of Health Care
7.
Comput Biol Med ; 167: 107626, 2023 12.
Article in English | MEDLINE | ID: mdl-37918262

ABSTRACT

BACKGROUND: Infant crying is babies' first means of communicating during their initial months of life. A misunderstanding of the cry message can compromise infant care and the infant's future neurodevelopment. METHODS: An exploratory study collecting multimodal data (i.e., crying, electroencephalography (EEG), near-infrared spectroscopy (NIRS), facial expressions, and body movements) from 38 healthy full-term newborns was conducted. Cry types were defined based on different conditions (i.e., hunger, sleepiness, fussiness, need to burp, and distress). Statistical analysis, Machine Learning (ML), and Deep Learning (DL) techniques were used to identify relevant features for cry type classification and to evaluate a robust DL algorithm named Acoustic MultiStage Interpreter (AMSI). RESULTS: Significant differences were found across cry types based on acoustics, EEG, NIRS, facial expressions, and body movements. Acoustics and body language were identified as the most relevant ML features for identifying the cause of crying. The DL AMSI algorithm achieved an accuracy rate of 92%. CONCLUSIONS: This study sets a precedent for cry analysis research by highlighting the complexity of newborn cry expression and strengthening the potential use of infant cry analysis as an objective, reliable, accessible, and non-invasive tool for cry interpretation, improving the infant-parent relationship and supporting family well-being.
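The AMSI algorithm is not specified in this abstract; purely as a hedged sketch of the general approach (acoustic features feeding a supervised classifier over the five cry conditions), one could summarize each recording with MFCC statistics via librosa and train a standard classifier. The feature set, sampling rate, and classifier below are illustrative assumptions, not the study's pipeline.

# Illustrative acoustic cry-type classifier (not AMSI).
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

CRY_TYPES = ["hunger", "sleepiness", "fussiness", "burp", "distress"]

def acoustic_features(wav_path, sr=16000):
    # Load the recording and summarize MFCCs over time (mean + std per coefficient).
    y, sr = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical usage: wav_paths and labels would come from the annotated recordings.
# X = np.stack([acoustic_features(p) for p in wav_paths])
# clf = RandomForestClassifier(n_estimators=200, random_state=0)
# print(cross_val_score(clf, X, labels, cv=5).mean())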


Subject(s)
Algorithms , Crying , Humans , Infant, Newborn , Infant , Acoustics , Brain/diagnostic imaging , Kinesics
8.
Front Neurosci ; 17: 1266873, 2023.
Article in English | MEDLINE | ID: mdl-37799341

ABSTRACT

Introduction: Even though infant crying is a common phenomenon in humans' early life, it remains a challenge for researchers to properly understand it as a reflection of complex neurophysiological functions. Our study aims to determine the association of neonatal cry acoustics with neurophysiological signals and behavioral features across different levels of cry distress in newborns. Methods: Multimodal data from 25 healthy term newborns were collected by simultaneously recording infant cry vocalizations, electroencephalography (EEG), near-infrared spectroscopy (NIRS), and videos of facial expressions and body movements. Statistical analysis was conducted on this dataset to identify correlations among variables during three different infant conditions (i.e., resting, cry, and distress). A Deep Learning (DL) algorithm was used to objectively and automatically evaluate the level of cry distress in infants. Results: We found correlations between most of the features extracted from the signals depending on the infant's arousal state, among them: fundamental frequency (F0), brain activity (delta, theta, and alpha frequency bands), cerebral and body oxygenation, heart rate, facial tension, and body rigidity. These associations reinforce the idea that what occurs at the acoustic level can be characterized by behavioral and neurophysiological patterns. Finally, the DL audio model developed was able to classify the different levels of distress, achieving 93% accuracy. Conclusion: Our findings strengthen the potential of crying as a biomarker of the physical, emotional, and health status of the infant, making it a crucial tool for caregivers and clinicians.
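As an illustrative example of the kind of cross-modal correlation analysis reported here (with entirely hypothetical numbers, not the study's data), Spearman correlations between cry fundamental frequency (F0) and physiological measures can be computed with scipy:

import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-epoch measurements aligned across modalities.
f0 = np.array([310.0, 345.2, 402.1, 380.4, 455.9])    # cry fundamental frequency (Hz)
theta_power = np.array([4.1, 3.8, 5.2, 4.9, 6.0])     # EEG theta band power (a.u.)
heart_rate = np.array([128, 131, 140, 137, 149])      # beats per minute

for name, signal in [("theta power", theta_power), ("heart rate", heart_rate)]:
    rho, p = spearmanr(f0, signal)
    print(f"F0 vs {name}: rho={rho:.2f}, p={p:.3f}")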

9.
Front Psychol ; 14: 1174424, 2023.
Article in English | MEDLINE | ID: mdl-37663337

ABSTRACT

The embodied mind in motion is a concept in which health and well-being, prevention and therapy, as well as lifestyle and habits meet. The mind changes profoundly in the course of dementias, affecting daily living and resulting in reduced quality of life. Interdisciplinary approaches are required for a holistic understanding of how the mind is affected by dementia. We here explore what such a holistic theory of dementia might look like and propose the idea of "embodied mind in motion". The paradigm is biopsychosocial or biocultural, the theoretical anchor point is the lifeworld, and the guiding concept is "embodiment," as body and mind are constantly in motion. Physical activity is, hence, central to the experience of health and well-being, beyond being "exercise" and "health behavior". We discuss the embodied mind in motion referring to phenomenology, enactivism and (philosophical) anthropology. In our view, habits are embodied long-term memories and a philosophical equivalent to lifestyle. They unfold the meaningfulness of moving the body, complementing the objectifiable benefits of physical exercise. Empirical studies on "holistic activities" like hiking, yoga, music and dance illustrate improved integration into everyday life. Their meaningfulness enhances compliance and increases the preventive and even therapeutic potential. A crucial factor for this is the emotional dimension of lifestyle, exemplified by the virally popularized performance of "Swan Lake" by wheelchair-bound ex-ballerina Marta Cinta González Saldaña, who suffered from Alzheimer's disease. A number of epistemological and ontological consequences anchor "embodied movement" as a valuable principle for dementia research.

10.
Violence Against Women ; 29(14): 2915-2940, 2023 11.
Article in English | MEDLINE | ID: mdl-37644854

ABSTRACT

What do women learn in feminist self-defense that is empowering? This study examined the skills women used months and years after completing an IMPACT self-defense course. Ninety-seven survey participants described skills they had used and incorporated into their lives. The major themes that emerged through a classic grounded theory analysis were awareness, boundary setting, assertive body language, and managing adrenaline to prevent, interrupt, or stop uncomfortable, intrusive, or hostile behaviors. IMPACT-trained women did not engage in self-blaming or risky behavior and used their skills to prevent and interrupt aggressive behavior.


Subject(s)
Empowerment , Feminism , Female , Humans , Aggression , Kinesics
11.
eNeuro ; 10(9)2023 09.
Article in English | MEDLINE | ID: mdl-37648448

ABSTRACT

Understanding the neural basis of emotions is a critical step to uncover the biological substrates of neuropsychiatric disorders. To study this aspect in freely behaving mice, neuroscientists have relied on the observation of ethologically relevant bodily cues to infer the affective state of the subject, both in neutral conditions and in response to a stimulus. The best example of this is the widespread assessment of freezing in experiments testing both conditioned and unconditioned fear responses. While robust and powerful, these approaches come at a cost: they are usually confined within selected time windows, accounting for only a limited portion of the complexity of emotional fluctuation. Moreover, they often rely on visual inspection and subjective judgment, resulting in inconsistency across experiments and questionable result interpretations. To overcome these limitations, novel tools are arising, fostering a new avenue in the study of naturalistic mouse behavior. In this work we developed a computational tool [stimulus-evoked behavioral tracking in 3D for rodents (SEB3R)] to automate and standardize an ethologically driven observation of freely moving mice. Using a combination of machine learning-based behavioral tracking and unsupervised cluster analysis, we identified statistically meaningful postures that could be used for empirical inference on a subsecond scale. We validated the efficacy of this tool in a stimulus-driven test, the whisker nuisance (WN) task, in which mice are challenged with prolonged and invasive whisker stimulation, showing that the identified postures can be reliably used as a proxy for stimulus-driven fearful and explorative behaviors.
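SEB3R's exact pipeline is detailed in the paper; as a hedged sketch of the underlying idea (per-frame pose keypoints clustered without supervision into candidate postures), keypoint vectors from a markerless tracker could be centered, scaled, and clustered with k-means. The input layout, number of clusters, and normalization are assumptions for illustration.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# keypoints: (n_frames, n_bodyparts * 3) array of 3D coordinates from a pose tracker
# (hypothetical input; SEB3R's own preprocessing may differ).
def cluster_postures(keypoints, n_clusters=8, random_state=0):
    # Center each frame on the body centroid so clusters reflect posture, not position.
    frames = keypoints.reshape(len(keypoints), -1, 3)
    centered = frames - frames.mean(axis=1, keepdims=True)
    X = StandardScaler().fit_transform(centered.reshape(len(keypoints), -1))
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=random_state).fit_predict(X)
    return labels  # one posture label per frame, usable for stimulus-locked analyses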


Subject(s)
Emotions , Fear , Animals , Mice , Exploratory Behavior , Posture , Kinesics
12.
JMIR Serious Games ; 11: e45600, 2023 Jun 30.
Article in English | MEDLINE | ID: mdl-37389910

ABSTRACT

BACKGROUND: After the COVID-19 pandemic, society has become more aware of the importance of some basic hygienic habits to avoid exposure to pathogens transmitted via hands. Given that a high frequency of touching mucous membranes can lead to a high risk of infection, it is essential to establish strategies to reduce this behavior as a preventive measure against contagion. This risk can be extrapolated to a multitude of health scenarios and to the transmission of many infectious diseases. #RedPingüiNO was designed as an intervention to prevent the transmission of SARS-CoV-2 and other pathogens through the reduction of facial self-touches by thoughtfully engaging participants in a serious game. OBJECTIVE: Facial self-touches should be understood as behaviors performed with limited control and awareness, used to regulate cognitively and emotionally demanding situations, or as part of nonverbal communication. The objective of this study was to ensure that participants become aware of and reduce these behaviors through a game of self-perception. METHODS: The quasi-experimental intervention was applied to 103 healthy university students selected by convenience sampling and put into practice for 2 weeks, with 1 control group (n=24, 23.3%) and 2 experimental groups (experimental group with no additional social reinforcement interventions: n=36, 35%; experimental group with additional social reinforcement interventions: n=43, 41.7%). The objective was to improve knowledge and perception and reduce facial self-touches to prevent exposure to pathogens transmitted via hands, not only in health multihazard scenarios but also in ordinary circumstances. The ad hoc instrument used to analyze the experience consisted of 43 items and was valid and reliable for the purpose of this study. The items were divided into 5 blocks extracted from the theoretical framework: sociological issues (1-5); hygiene habits (6-13); risk awareness (14-19); strategies for not touching the face (20-26); and questions after the intervention (27-42), designed as a postintervention tool assessing the game experience. Validation of the content was achieved through assessment by 12 expert referees. External validation was performed using a test-retest procedure, and reliability was verified using the Spearman correlation. RESULTS: The results of the ad hoc questionnaire, which were analyzed using the Wilcoxon signed-rank test and McNemar test to identify significant differences between test and retest at a 95% CI, showed that facial self-touches were reduced (item 20, P<.001; item 26, P=.04) and that awareness of this spontaneous behavior and its triggers increased (item 15; P=.007). The results were reinforced by qualitative findings from the daily logs. CONCLUSIONS: The intervention had a greater effect when the game was shared and involved interaction between people; however, it was helpful in reducing facial self-touches in both cases. In summary, this game is suitable for reducing facial self-touches, and owing to its free availability and design, it can be adapted to various contexts.
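As a minimal illustration of the test-retest analysis described above (using synthetic paired responses, not the study's data), the Wilcoxon signed-rank test from scipy and McNemar's test from statsmodels can be applied as follows:

import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired Likert responses (pre vs. post) for one questionnaire item.
pre = np.array([4, 5, 4, 3, 5, 4, 4, 5, 3, 4])
post = np.array([3, 4, 3, 3, 4, 3, 4, 4, 2, 3])
stat, p = wilcoxon(pre, post)
print(f"Wilcoxon signed-rank: W={stat}, p={p:.3f}")

# Hypothetical paired yes/no answers summarized as a 2x2 contingency table:
# [[both yes, yes->no], [no->yes, both no]].
table = np.array([[30, 18], [5, 50]])
print(f"McNemar exact p-value: {mcnemar(table, exact=True).pvalue:.3f}")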

14.
Rev. Hosp. Ital. B. Aires (2004) ; 43(1): 37-40, mar. 2023.
Article in Spanish | LILACS, UNISALUD, BINACIS | ID: biblio-1437220

ABSTRACT

Reading personal accounts of situations in which health is compromised reveals the impact that the way professionals communicate has on patients and their families. This article reviews texts from the literature with paradigmatic examples, mentions possible reasons for the use of medical jargon, and reflects on the importance of adequate communication with patients and their families, emphasizing active listening and an individualized approach that considers the context, habits, and skills that must be present in encounters involving critical situations. The importance of developing training programs that include simulation-based encounters is underlined.


Subject(s)
Humans , Physician-Patient Relations , Professional-Family Relations , Empathy , Health Communication , Communication Barriers , Patient-Centered Care , Decision Making
15.
Perspect Psychol Sci ; 18(6): 1388-1411, 2023 Nov.
Article in English | MEDLINE | ID: mdl-36791676

ABSTRACT

Research and theory in nonverbal communication have made great advances toward understanding the patterns and functions of nonverbal behavior in social settings. Progress has been hindered, we argue, by presumptions about nonverbal behavior that follow from both received wisdom and faulty evidence. In this article, we document four persistent misconceptions about nonverbal communication: namely, that people communicate using decodable body language; that they have a stable personal space by which they regulate contact with others; that they express emotion using universal, evolved, iconic, categorical facial expressions; and that they can deceive and detect deception, using dependable telltale clues. We show how these misconceptions permeate research as well as the practices of popular behavior experts, with consequences that extend from intimate relationships to the boardroom and courtroom and even to the arena of international security. Notwithstanding these misconceptions, existing frameworks of nonverbal communication are being challenged by more comprehensive systems approaches and by virtual technologies that ambiguate the roles and identities of interactants and the contexts of interaction.


Subject(s)
Facial Expression , Nonverbal Communication , Humans , Nonverbal Communication/psychology , Emotions , Sexual Behavior
16.
Healthcare (Basel) ; 10(12)2022 Dec 10.
Article in English | MEDLINE | ID: mdl-36554028

ABSTRACT

In recent decades, epidemic and pandemic illnesses have grown prevalent and are a regular source of concern throughout the world. The extent to which the globe has been affected by the COVID-19 epidemic is well documented. Smart technology is now widely used in medical applications, with the automated detection of status and feelings becoming a significant study area. As a result, a variety of studies have begun to focus on the automated detection of symptoms in individuals infected with a pandemic or epidemic disease by studying their body language. The recognition and interpretation of arm and leg motions, facial expressions, and body postures is still a developing field, and there is a dearth of comprehensive studies that might aid in illness diagnosis utilizing artificial intelligence techniques and technologies. This literature review is a meta-review of past papers that utilized AI for body language classification through full-body tracking or facial expression detection for various tasks, such as fall detection and COVID-19 detection; it examines the methods proposed in each paper, their significance, and their results.

17.
Int J Nurs Stud ; 136: 104365, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36327681

ABSTRACT

BACKGROUND: Many people living with dementia experience challenges comprehending language and benefit from nonverbal communication supports. Little published empirical evidence exists for care partners regarding supportive strategies for nonverbal communication with people living with dementia. This study aimed to conduct a scoping review of nonverbal strategies for care partners which have been observed to support communication with people living with dementia. METHODS: Current best practices for scoping research guided this review. CINAHL, PsycInfo, Scopus, and Pubmed databases were searched December 8, 2020. Empirical studies that examined the supportiveness of nonverbal communication strategies used by care partners of people living with dementia were eligible. All publication dates were included. Eligible studies were published in English in peer-reviewed journals. Studies were screened first by title and abstract, and subsequently by full-text review. Data charting was conducted using an evidence summary table, which was subsequently used for analysis. Results were presented in the form of a written summary. RESULTS: Sixteen studies were included in the final review. Six categories of supportive nonverbal communication strategies were identified: eye contact, gestures, facial expression, touch, close proximity, and frontal orientation. Studies observed six outcomes which indicated that these nonverbal strategies were supportive for communication with people living with dementia; however, person-centered outcomes were limited. CONCLUSIONS: The review identified supportive nonverbal communication strategies used by care partners with people living with dementia in the current literature. Disagreement exists in the literature regarding which outcomes define supportive nonverbal communication with people living with dementia. This in combination with the benefits of person-centered approaches to care with people living with dementia presents a critical need to delineate which nonverbal communication strategies are person-centered in future research. TWEETABLE ABSTRACT: Six supportive nonverbal communication strategies identified by scoping literature review, but there is disagreement in how the literature defines "supportive" @marie_y_s @EmmaBender19.


Subject(s)
Dementia , Humans , Nonverbal Communication , Delivery of Health Care , Empirical Research
18.
Front Psychol ; 13: 1035328, 2022.
Article in English | MEDLINE | ID: mdl-36405118

ABSTRACT

A classical theoretical frame for interpreting motor reactions to emotional stimuli holds that such stimuli, particularly threat-related ones, are processed preferentially, i.e., they are capable of capturing attention automatically. Research has recently challenged this view, showing that the task relevance of emotional stimuli is crucial to obtaining a reliable behavioral effect. Such evidence indicated that emotional facial expressions do not automatically influence motor responses in healthy young adults, but do so only when intrinsically pertinent to the subject's ongoing goals. Given the theoretical relevance of these findings, it is essential to assess their generalizability to different, socially relevant emotional stimuli such as emotional body postures. To address this issue, we compared the performance of 36 right-handed participants in two different versions of a Go/No-go task. In the Emotional Discrimination task, participants were required to withhold their responses at the display of emotional body postures (fearful or happy) and to move at the presentation of neutral postures. By contrast, in the control task, the same images were shown, but participants had to respond according to the color of the actor/actress' t-shirt, disregarding the emotional content. Results showed that participants made more commission errors (instances in which they moved even though the No-go signal was presented) for happy than for fearful body postures in the Emotional Discrimination task. However, this difference disappeared in the control task. Such evidence indicates that, like facial emotion, emotional body expressions do not influence motor control automatically, but only when they are task-relevant.

19.
Front Neurosci ; 16: 997263, 2022.
Article in English | MEDLINE | ID: mdl-36248653

ABSTRACT

While reading faces covered by masks during the COVID-19 pandemic, efficient social interaction requires combining information from different sources, such as the eyes (when faces are hidden by masks) and bodies. This may be challenging for individuals with neuropsychiatric conditions, in particular autism spectrum disorders. Here we examined whether the reading of dynamic faces, bodies, and eyes is tied in a gender-specific way, and how these capabilities relate to the expression of autistic traits. Females and males completed a task with point-light faces along with a task with point-light body locomotion portraying different emotional expressions, and had to infer the emotional content of the displays. In addition, participants were administered the Reading the Mind in the Eyes Test, modified, and the Autism Spectrum Quotient questionnaire. The findings show that only in females is inferring emotions from dynamic bodies and faces firmly linked, whereas in males, reading the eyes is tied to face reading. Strikingly, in neurotypical males only, the accuracy of face, body, and eyes reading was negatively associated with autistic traits. The outcome points to gender-specific modes of social cognition: females rely on dynamic cues alone while reading faces and bodies, whereas males most likely rely on configural information. The findings are of value for examining face and body language reading in neuropsychiatric conditions, in particular autism, most of which are gender/sex-specific. This work suggests that if male individuals with autistic traits experience difficulties in reading masked faces, these deficits are unlikely to be compensated for by reading (even dynamic) bodies and faces. By contrast, in females, reading covered faces as well as the language of dynamic bodies and faces is not necessarily connected to autistic traits, preventing them from paying the high costs of maladaptive social interaction.

20.
Healthcare (Basel) ; 10(7)2022 Jul 04.
Article in English | MEDLINE | ID: mdl-35885777

ABSTRACT

Given the current COVID-19 pandemic, medical research today focuses on epidemic diseases. Innovative technology is incorporated in most medical applications, emphasizing the automatic recognition of physical and emotional states. Most research is concerned with the automatic identification of symptoms displayed by patients through analyzing their body language. The development of technologies for recognizing and interpreting arm and leg gestures, facial features, and body postures is still in its early stages. More extensive research is needed on the use of artificial intelligence (AI) techniques in disease detection. This paper presents a comprehensive survey of the research performed on body language processing. After defining and explaining the different types of body language, we justify the use of automatic recognition and its application in healthcare. We briefly describe the automatic recognition framework using AI to recognize various body language elements and discuss automatic gesture recognition approaches that help better identify the external symptoms of epidemic and pandemic diseases. From this study, we found that prior work has shown that body language can be analyzed and understood by machine learning (ML). Because diseases also produce clear and distinct symptoms in the body, body language is affected in ways that carry features specific to a particular disease, making it possible to characterize the features and body language changes associated with each disease. Hence, ML can help detect diseases, including pandemic and epidemic diseases, among others.
