Results 1 - 3 of 3
1.
J Breast Imaging ; 6(1): 45-52, 2024 Jan 19.
Article in English | MEDLINE | ID: mdl-38243861

ABSTRACT

OBJECTIVE: To investigate the efficacy of immersive virtual reality (VR) combined with standard local anesthetic for mitigating anxiety and pain during US-guided breast biopsies, compared with local anesthetic alone. METHODS: Patients scheduled for US-guided biopsy were invited to participate. Eligible patients were females 18 years of age or older. Patients were randomized to the VR or control group at a 1:1 ratio. Patients in the VR group underwent biopsy with the addition of a VR experience, and patients in the control group underwent the usual biopsy. Patient-perceived levels of anxiety and pain were collected before and after biopsy via the State-Trait Anxiety Inventory (STAI) and Visual Analog Scale (VAS). Physiological data were captured during biopsy using a clinically validated wristband. Differences in anxiety, pain, and physiological data were compared between the VR and control groups. RESULTS: Sixty patients were enrolled. After excluding 2 patients because of VR device malfunction, 29 patients remained in the VR group and 29 in the control group for analysis. The VR group had reduced anxiety compared with the control group based on postintervention STAI (P <.001) and VAS (P = .036). The VR group did not have lower pain based on postintervention VAS (P = .555). Physiological measures in the VR group showed higher RR intervals and decreased skin conductance levels, both of which are associated with lower anxiety. CONCLUSION: Use of VR in addition to standard local anesthetic for US-guided breast biopsies was associated with reduced patient anxiety. Virtual reality may be a useful tool to improve the patient biopsy experience.
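
The abstract does not report which statistical test produced the P values above, so purely as an illustration, here is a minimal Python sketch of a two-sample comparison of postintervention STAI scores between groups. The simulated group means, spreads, and the choice of a Welch t-test are all assumptions, not values or methods taken from the study.

```python
# Illustrative group comparison on simulated STAI-like scores (hypothetical
# data; the study's actual statistical test is not specified in the abstract).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
stai_vr = rng.normal(32, 8, 29)       # 29 VR patients (simulated scores)
stai_control = rng.normal(42, 8, 29)  # 29 control patients (simulated scores)

# Welch t-test: does not assume equal variances between groups.
t, p = stats.ttest_ind(stai_vr, stai_control, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```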


Subject(s)
Anesthetics, Local , Virtual Reality , Adolescent , Adult , Female , Humans , Anxiety , Anxiety Disorders , Pain/prevention & control
2.
Neuroreport ; 18(8): 763-6, 2007 May 28.
Article in English | MEDLINE | ID: mdl-17471062

ABSTRACT

Gaze direction signals another's focus of social attention. Here, we recorded event-related potentials to a multi-face display in which a gaze aversion created three different social scenarios: social attention, mutual gaze exchange, and gaze avoidance. N170 was unaffected by social scenario. P350 latency was shortest for social attention and mutual gaze exchange, whereas P500 was largest for gaze avoidance. Our data suggest that neural activity after 300 ms poststimulus may index processes associated with extracting social meaning, whereas activity earlier than 300 ms may index processing of gaze change independent of social context.
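
For readers unfamiliar with how component latencies such as the P350 are measured, the sketch below shows one common approach: locating the positive peak within a fixed poststimulus window. The synthetic waveform, sampling rate, and window bounds are assumptions for illustration only, not parameters from this paper.

```python
# Peak-latency extraction for a positive ERP component (synthetic example).
import numpy as np

fs = 500                                   # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.8, 1 / fs)           # epoch from -100 to 800 ms
erp = np.exp(-((t - 0.35) ** 2) / 0.002)   # synthetic P350-like peak at 350 ms

window = (t >= 0.30) & (t <= 0.45)         # search window for P350 (assumed)
peak_idx = np.argmax(erp[window])          # index of the maximum in the window
peak_latency_ms = t[window][peak_idx] * 1000
print(f"P350 peak latency: {peak_latency_ms:.0f} ms")
```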


Subject(s)
Attention/physiology , Brain Mapping , Evoked Potentials/physiology , Pattern Recognition, Visual/physiology , Social Behavior , Adult , Analysis of Variance , Eye Movements , Female , Functional Laterality/physiology , Humans , Male , Middle Aged , Photic Stimulation/methods , Reaction Time/physiology
3.
Neuropsychologia ; 45(1): 93-106, 2007 Jan 07.
Article in English | MEDLINE | ID: mdl-16766000

ABSTRACT

During social interactions our brains continuously integrate incoming auditory and visual input from the movements and vocalizations of others. Yet, the dynamics of the neural events elicited by these multisensory stimuli remain largely uncharacterized. Here we recorded audiovisual scalp event-related potentials (ERPs) to dynamic human faces with associated human vocalizations. Audiovisual controls were a dynamic monkey face with a species-appropriate vocalization, and a house with an opening front door accompanied by a creaking door sound. Subjects decided whether audiovisual stimulus trials were congruent (e.g. human face-human sound) or incongruent (e.g. house image-monkey sound). An early auditory ERP component, N140, was largest in response to human and monkey vocalizations. This effect was strongest in the presence of the dynamic human face, suggesting that species-specific visual information can modulate auditory ERP characteristics. A motion-induced visual N170 did not change amplitude or latency across visual motion category in the presence of sound. A species-specific incongruity response, consisting of a late positive ERP at around 400 ms (P400), was selectively larger only when human faces were mismatched with a non-human sound. We also recorded visual ERPs at trial onset, and found that the category-specific N170 did not alter its behavior as a function of stimulus category, which was somewhat unexpected given that two face types were contrasted with a house image. In conclusion, we present evidence for species-specificity in vocalization selectivity in early ERPs, and in a multisensory incongruity response whose amplitude is modulated only when human face motion is paired with an incongruous auditory stimulus.
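
As a toy illustration of the congruency judgment described above, the sketch below labels every visual-auditory pairing as congruent or incongruent. The stimulus names are examples drawn from the abstract, not the actual stimulus set used in the experiment.

```python
# Toy congruency labeling: a trial is congruent when the visual and auditory
# categories match (stimulus names are illustrative, from the abstract).
from itertools import product

visual = ["human_face", "monkey_face", "house"]
auditory = ["human_voice", "monkey_call", "door_creak"]
match = {"human_face": "human_voice",
         "monkey_face": "monkey_call",
         "house": "door_creak"}

for v, a in product(visual, auditory):
    label = "congruent" if match[v] == a else "incongruent"
    print(f"{v} + {a}: {label}")
```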


Subject(s)
Brain/physiology , Face , Motion Perception/physiology , Speech Perception/physiology , Vocalization, Animal/physiology , Acoustic Stimulation , Adult , Animals , Area Under Curve , Electroencephalography , Evoked Potentials/physiology , Female , Haplorhini , Humans , Male , Middle Aged , Photic Stimulation , Social Perception , Temporal Lobe/physiology