Results 1 - 2 of 2
1.
Sensors (Basel); 23(14). 2023 Jul 22.
Article in English | MEDLINE | ID: mdl-37514902

ABSTRACT

RGB-T tracking uses images from both the visible and thermal modalities. The primary objective is to adaptively leverage whichever modality is dominant under the current conditions, so as to achieve more robust tracking than single-modality tracking. This paper proposes MACFT, an RGB-T tracker based on a mixed-attention mechanism that achieves complementary fusion of the two modalities. In the feature extraction stage, separate transformer backbone branches extract modality-specific and modality-shared information. Mixed-attention operations in the backbone enable information interaction and self-enhancement between the template and search images, yielding a robust feature representation that better captures the high-level semantics of the target. In the feature fusion stage, a modality shared-specific feature interaction structure based on the mixed-attention mechanism suppresses noise from low-quality modalities while enhancing information from the dominant modality. Evaluation on multiple public RGB-T datasets demonstrates that the proposed tracker outperforms other RGB-T trackers on standard evaluation metrics while also adapting well to long-term tracking scenarios.
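
As a rough illustration of the feature-fusion idea described above, the sketch below cross-attends RGB and thermal feature tokens and then gates the fused result toward the more reliable modality. This is a minimal sketch, not the authors' MACFT code: the module names, token shapes, and gating scheme are assumptions chosen for illustration (PyTorch).

# Minimal sketch (not the MACFT implementation) of a mixed-attention fusion block
# for RGB-T features. Shapes, module names, and the gating scheme are assumptions.
import torch
import torch.nn as nn

class MixedAttentionFusion(nn.Module):
    """Cross-attends RGB and thermal tokens, then gates the fused result so the
    more informative (dominant) modality is emphasized."""

    def __init__(self, dim: int = 256, num_heads: int = 8):
        super().__init__()
        # Cross-attention in both directions: RGB queries thermal and vice versa.
        self.rgb_from_t = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.t_from_rgb = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Per-token gate estimating how much to trust the RGB branch (0..1).
        self.gate = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(),
                                  nn.Linear(dim, 1), nn.Sigmoid())
        self.norm = nn.LayerNorm(dim)

    def forward(self, rgb_tokens: torch.Tensor, t_tokens: torch.Tensor) -> torch.Tensor:
        # rgb_tokens, t_tokens: (batch, num_tokens, dim) features from the two backbone branches.
        rgb_enh, _ = self.rgb_from_t(rgb_tokens, t_tokens, t_tokens)  # RGB enhanced by thermal
        t_enh, _ = self.t_from_rgb(t_tokens, rgb_tokens, rgb_tokens)  # thermal enhanced by RGB
        rgb_enh = rgb_tokens + rgb_enh  # residual connections
        t_enh = t_tokens + t_enh
        # The gate decides, per token, how strongly to weight the RGB branch versus the
        # thermal branch, suppressing a low-quality modality while keeping the dominant one.
        g = self.gate(torch.cat([rgb_enh, t_enh], dim=-1))
        fused = g * rgb_enh + (1.0 - g) * t_enh
        return self.norm(fused)

if __name__ == "__main__":
    rgb = torch.randn(2, 64, 256)      # e.g. an 8x8 grid of search-region tokens
    thermal = torch.randn(2, 64, 256)
    fused = MixedAttentionFusion()(rgb, thermal)
    print(fused.shape)                 # torch.Size([2, 64, 256])
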

2.
AMIA Annu Symp Proc; 2021: 1039-1048. 2021.
Article in English | MEDLINE | ID: mdl-35308958

ABSTRACT

Burn wounds are most commonly evaluated through visual inspection to determine surgical candidacy, taking into account burn depth and individualized patient factors. This process, though cost effective, is subjective and varies with provider experience. Deep learning models can assist in determining burn wound surgical candidacy by making predictions from wound and patient characteristics. To this end, we present a multimodal deep learning approach and a complementary mobile application - DL4Burn - for predicting burn surgical candidacy, emulating the multi-factored approach used by clinicians. Specifically, we propose a ResNet50-based multimodal model and validate it using retrospectively obtained patient burn images, demographic data, and injury data.
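
As a rough illustration of the multimodal design described above, the sketch below couples a ResNet50 image branch with a small MLP over tabular demographic/injury features and fuses the two embeddings into a binary surgical-candidacy output. This is not the DL4Burn implementation: the feature dimensions, tabular field choices, and fusion head are assumptions made for illustration (PyTorch/torchvision).

# Illustrative sketch only; the actual DL4Burn architecture and feature set are
# described in the paper. Dimensions and tabular fields below are assumptions.
import torch
import torch.nn as nn
from torchvision import models

class BurnCandidacyNet(nn.Module):
    def __init__(self, num_tabular_features: int = 12):
        super().__init__()
        # Image branch: ImageNet-pretrained ResNet50 with its classifier removed.
        backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        backbone.fc = nn.Identity()            # expose the 2048-d pooled features
        self.image_branch = backbone
        # Tabular branch: demographic and injury descriptors (e.g. age, burn size - assumed).
        self.tabular_branch = nn.Sequential(
            nn.Linear(num_tabular_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Fusion head: concatenate both embeddings and predict a candidacy logit.
        self.head = nn.Sequential(
            nn.Linear(2048 + 64, 256), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(256, 1),
        )

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(image)        # (batch, 2048)
        tab_feat = self.tabular_branch(tabular)    # (batch, 64)
        return self.head(torch.cat([img_feat, tab_feat], dim=1))  # (batch, 1) logit

if __name__ == "__main__":
    model = BurnCandidacyNet()
    wound = torch.randn(4, 3, 224, 224)   # burn wound photographs
    clinical = torch.randn(4, 12)         # standardized demographic/injury features
    print(model(wound, clinical).shape)   # torch.Size([4, 1])
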


Subject(s)
Burns, Deep Learning, Burns/surgery, Humans, Retrospective Studies