Results 1 - 4 of 4
1.
Article in English | MEDLINE | ID: mdl-38648153

ABSTRACT

Extensive research has been conducted on haptic feedback for texture simulation in virtual reality (VR). However, it remains challenging to modify the perceived tactile texture of existing physical objects, which often serve as anchors for virtual objects in mixed reality (MR). In this paper, we present ViboPneumo, a finger-worn haptic device that uses vibratory-pneumatic feedback to modulate (i.e., increase and decrease) the perceived roughness of the material surface contacted by the user's fingerpad, while preserving the perceived sensation of other haptic properties (e.g., temperature or stickiness) in MR. Our device includes a silicone-based pneumatic actuator that lifts the user's fingerpad from the physical surface, reducing the contact area to decrease perceived roughness, and an on-finger vibrator to increase it. The results of our perceptual study showed that participants could perceive changes in roughness, both increases and decreases, relative to the original material surface. We also observed overlapping roughness ratings between certain haptic stimuli (i.e., vibrotactile and pneumatic) and the roughness of some materials as originally perceived without any haptic feedback. This suggests the potential to alter the perceived texture of one material toward that of another in terms of roughness (e.g., making ceramic feel like glass). Lastly, a user study of the MR experience showed that ViboPneumo significantly improved the MR user experience, particularly visual-haptic matching, compared to a bare-finger condition. We also demonstrated several application scenarios for ViboPneumo.

2.
IEEE Trans Vis Comput Graph ; 30(5): 2559-2569, 2024 May.
Article in English | MEDLINE | ID: mdl-38437107

ABSTRACT

For VR interaction, a home environment with a complicated spatial setup and dynamic elements may hinder the VR user experience. In particular, pets' movements can be highly unpredictable. In this paper, we investigate the integration of real-world pet activities into immersive VR interaction. Our pilot study showed that active pet movements, especially those of dogs, could negatively impact users' performance and experience in immersive VR. We proposed three types of pet integration: a semitransparent real-world portal, a non-interactive object in VR, and an interactive object in VR. We conducted a user study with 16 pet owners and their pets. The results showed that, compared to the baseline condition without any pet-integration technique, integrating the pet as an interactive object in VR yielded significantly higher participant ratings in perceived realism, joy, multisensory engagement, and connection with their pets in VR.


Subject(s)
Computer Graphics , Virtual Reality , Humans , Animals , Dogs , Pilot Projects , Movement
3.
IEEE Trans Vis Comput Graph ; 29(12): 5149-5164, 2023 Dec.
Article in English | MEDLINE | ID: mdl-36074874

ABSTRACT

There has been increasing focus on haptic interfaces for virtual reality (VR) to support a high-quality touch experience. However, it is still challenging to haptically simulate the real-world experience of walking through different fluid media. To tackle this problem, we present PropelWalker, a pair of calf-worn haptic devices that simulate the buoyancy and resistive forces experienced when a user's lower limbs interact with different fluids and materials in VR. Using four ducted fans, two installed on each calf, the system controls the strength and direction of the airflow in real time to provide different levels of force. Our technical evaluation shows that PropelWalker can generate vertical forces of up to 27 N in two directions (i.e., upward and downward) within 0.85 seconds. Furthermore, the system can stably maintain the generated force with only minor turbulence. We further conducted three user-perception studies to understand PropelWalker's ability to generate distinguishable force stimuli. First, we conducted just-noticeable-difference (JND) experiments to investigate the threshold of human perception of on-leg airflow force feedback. Our second perception study showed that users could distinguish four PropelWalker-generated force levels simulating different walking media (i.e., dry ground, water, mud, and sand) with an average accuracy of 94.2%. Lastly, our VR user study showed that PropelWalker could significantly improve users' sense of presence in VR.


Subject(s)
Virtual Reality , Wearable Electronic Devices , Humans , Feedback , Leg , Computer Graphics , Touch , Walking , User-Computer Interface
4.
IEEE Trans Vis Comput Graph ; 27(5): 2597-2607, 2021 May.
Article in English | MEDLINE | ID: mdl-33750694

ABSTRACT

Low-cost virtual-reality (VR) head-mounted displays (HMDs) that integrate smartphones have brought immersive VR to the masses and increased the ubiquity of VR. However, these systems are often limited by poor interactivity. In this paper, we present GestOnHMD, a gesture-based interaction technique and gesture-classification pipeline that leverages the stereo microphones of a commodity smartphone to detect tapping and scratching gestures on the front, left, and right surfaces of a mobile VR headset. Using Google Cardboard as our target headset, we first conducted a gesture-elicitation study to generate 150 user-defined gestures, 50 for each surface. We then selected 15, 9, and 9 gestures for the front, left, and right surfaces, respectively, based on user preferences and signal detectability. We constructed a dataset containing the acoustic signals of 18 users performing these on-surface gestures and trained a deep-learning classification pipeline for gesture detection and recognition. Lastly, with a real-time demonstration of GestOnHMD, we conducted a series of online participatory-design sessions to collect a set of user-defined gesture-referent mappings that could potentially benefit from GestOnHMD.


Subject(s)
Computer Graphics/instrumentation , Gestures , Smart Glasses , Virtual Reality , Acoustics , Adult , Deep Learning , Equipment Design , Female , Hand/physiology , Humans , Male , Signal Processing, Computer-Assisted , Smartphone , Young Adult