1.
Front Psychol; 14: 1223806, 2023.
Article in English | MEDLINE | ID: mdl-37583610

ABSTRACT

Introduction: This work explores the use of automated facial coding software (FaceReader) as an alternative and/or complementary method to manual coding.

Methods: We used videos of parents (fathers, n = 36; mothers, n = 29) taken from the Avon Longitudinal Study of Parents and Children. The videos, obtained during real-life parent-infant interactions in the home, were coded both manually (using an existing coding scheme) and by FaceReader. We established a correspondence between the manual and automated coding categories (Positive, Neutral, Negative, and Surprise) before using contingency tables to examine the software's detection rate and to quantify the agreement between manual and automated coding. Using binary logistic regression, we examined how well FaceReader outputs predicted the manually classified facial expressions. An interaction term was included to estimate the influence of parent gender on predictive accuracy.

Results: The automated facial detection rate was low (25.2% for fathers, 24.6% for mothers) compared to manual coding; we discuss potential explanations for this (e.g., poor lighting and facial occlusion). The logistic regression analyses found that Surprise and Positive expressions had strong predictive capabilities, whilst Negative expressions performed poorly. Mothers' faces were more informative for predicting Positive and Neutral expressions, whilst fathers' faces were more informative for predicting Negative and Surprise expressions.

Discussion: We discuss the implications of these findings for future automated facial coding studies, and we emphasise the need to consider gender-specific influences in automated facial coding research.
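The regression approach described in the abstract can be sketched as follows. This is not the authors' code: the data are synthetic, and the variable names (`facereader_positive`, `is_mother`) are illustrative assumptions. It shows the general shape of a binary logistic regression predicting a manually coded category from a FaceReader intensity, with a gender interaction term.

```python
# Hedged sketch: logistic regression with a gender interaction term,
# in the spirit of the analysis described above. All data are
# synthetic; names and coefficients are illustrative, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400
facereader_positive = rng.uniform(0, 1, n)   # hypothetical FaceReader "Positive" score
is_mother = rng.integers(0, 2, n)            # 1 = mother, 0 = father

# Synthetic ground truth in which mothers' scores carry more weight,
# loosely mirroring the reported gender-specific effects.
logit = -2 + 3 * facereader_positive + 1.5 * is_mother * facereader_positive
manual_positive = rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logit))

# Design matrix: main effects plus the interaction term.
X = np.column_stack([facereader_positive, is_mother,
                     facereader_positive * is_mother])
model = LogisticRegression().fit(X, manual_positive)
print(model.coef_)  # three coefficients: score, gender, score x gender
```

The interaction column (`facereader_positive * is_mother`) is what lets the model's slope for the FaceReader score differ between mothers and fathers, which is the effect the abstract's interaction term is probing.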

2.
Nat Methods; 18(8): 953-958, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34312564

ABSTRACT

Unbiased quantitative analysis of macroscopic biological samples demands fast imaging systems capable of maintaining high resolution across large volumes. Here we introduce RAPID (rapid autofocusing via pupil-split image phase detection), a real-time autofocus method applicable in every widefield-based microscope. RAPID-enabled light-sheet microscopy reliably reconstructs intact, cleared mouse brains with subcellular resolution, and allowed us to characterize the three-dimensional (3D) spatial clustering of somatostatin-positive neurons in the whole encephalon, including densely labeled areas. It also enabled 3D morphological analysis of microglia across the entire brain. Beyond light-sheet microscopy, we demonstrate that RAPID maintains high image quality in various settings, from in vivo fluorescence imaging to 3D tracking of fast-moving organisms. RAPID thus provides a flexible autofocus solution that is suitable for traditional automated microscopy tasks as well as for quantitative analysis of large biological specimens.
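The principle behind pupil-split phase detection can be illustrated in one dimension. This is a toy sketch, not the RAPID implementation: when the pupil is split, defocus displaces the two sub-aperture images in opposite directions, and the signed displacement (recovered here by cross-correlation) serves as a focus-error signal.

```python
# Hedged 1D sketch of phase-detection autofocus (not the RAPID code):
# estimate the signed shift between two sub-pupil images; in a real
# system this shift is proportional to defocus and drives the focus motor.
import numpy as np

def focus_error(left_img, right_img):
    """Signed shift (in pixels) of left_img relative to right_img."""
    a = left_img - left_img.mean()
    b = right_img - right_img.mean()
    corr = np.correlate(a, b, mode="full")   # cross-correlation over all lags
    return int(corr.argmax() - (len(b) - 1)) # lag of the correlation peak

# Toy scene: one bright feature, displaced by +/-3 px to mimic defocus.
x = np.arange(128)
scene = np.exp(-0.5 * ((x - 64) / 3.0) ** 2)
left = np.roll(scene, +3)
right = np.roll(scene, -3)
print(focus_error(left, right))  # -> 6 (total split between the two images)
```

The sign of the error distinguishes front-focus from back-focus, which is why phase detection can correct focus in a single step rather than by searching through a contrast stack.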


Subject(s)
Brain/anatomy & histology; Image Processing, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Microglia/cytology; Microscopy, Fluorescence/methods; Animals; Male; Mice