1.
Sensors (Basel) ; 19(23)2019 Dec 03.
Article in English | MEDLINE | ID: mdl-31816889

ABSTRACT

This paper presents two methodologies for delivering multimedia content to visually impaired people using a haptic device and a braille display. Our previous research, which used a Kinect v2 and a haptic device with 2D+ (RGB frame with depth) data, was limited by slow operational speed when reconstructing object details. This study therefore focuses on the development of a 2D multiarray braille display driven by an electronic book translator application, chosen for its accuracy and high speed. The approach provides mobility and uses the 2D multiarray braille display to represent the contours of media content more efficiently. In conclusion, this study renders considerably more text content than previous 1D braille displays, and it also represents illustrations and figures on the braille display through quantization and binarization.
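The quantization-and-binarization step described in the abstract can be sketched as follows: a grayscale image is down-sampled to the braille cell grid (quantization) and thresholded so each raised pin maps to a dark region (binarization). This is a minimal illustration; the function name, grid sizes, and threshold are assumptions, not details from the paper.

```python
def to_braille_dots(gray, rows, cols, threshold=128):
    """Quantize a grayscale image (list of lists, values 0-255) to a
    rows x cols binary dot pattern for a 2D multiarray braille display."""
    h, w = len(gray), len(gray[0])
    dots = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Quantization: average the block of pixels in this cell.
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            block = [gray[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            mean = sum(block) / len(block)
            # Binarization: raise the pin (1) where the image is dark.
            row.append(1 if mean < threshold else 0)
        dots.append(row)
    return dots

# A 4x4 image with a dark left half maps to raised pins on the left.
img = [[0, 0, 255, 255]] * 4
print(to_braille_dots(img, 2, 2))  # [[1, 0], [1, 0]]
```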


Subject(s)
Brain/diagnostic imaging , Self-Help Devices , Sensory Aids , Touch , Visually Impaired Persons/rehabilitation , Algorithms , Blindness , Computers, Handheld , Data Display , Equipment Design , Humans , Multimedia , Reading , Reproducibility of Results , Ultrasonics , User-Computer Interface
2.
Sensors (Basel) ; 18(11)2018 Nov 01.
Article in English | MEDLINE | ID: mdl-30388872

ABSTRACT

360-degree video streaming for high-quality virtual reality (VR) is challenging for current wireless systems because of the huge bandwidth it requires. Millimeter-wave (mmWave) communications in the 60 GHz band have gained considerable interest from industry and academia because they promise gigabit wireless connectivity over a huge unlicensed bandwidth (up to 7 GHz). This massive unlicensed bandwidth offers great potential for meeting the demands of 360-degree video streaming. This paper investigates 360-degree video streaming for mobile VR using SHVC, the scalable extension of the High-Efficiency Video Coding (HEVC) standard, and PC offloading over 60 GHz networks. We present a conceptual architecture based on advanced tiled SHVC and mmWave communications, comprising two main parts: (1) tile-based SHVC for 360-degree video streaming with optimized parallel decoding, and (2) a personal computer (PC) offloading mechanism for transmitting uncompressed video (viewport only). The experimental results show that our tile extractor method reduces the bandwidth required for 360-degree video streaming by more than 47%, and the tile partitioning mechanism improves decoding time by up to 25%. The PC offloading mechanism also successfully offloads 360-degree decoded (or viewport-only) video to mobile devices using mmWave communication and the proposed transmission schemes.
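The bandwidth saving from viewport-driven tiled streaming can be illustrated with a back-of-the-envelope model: a low-rate base layer is streamed for the full 360-degree frame, while high-rate enhancement tiles are sent only for tiles overlapping the viewport. All rates and tile counts below are invented assumptions for illustration, not figures from the paper.

```python
def tiled_bitrate(total_tiles, viewport_tiles, base_rate, enh_rate):
    """Bitrate (Mbps) when only viewport tiles carry the enhancement layer."""
    return base_rate + enh_rate * viewport_tiles / total_tiles

full_rate = 100.0                        # hypothetical full-quality 360 stream
tiled = tiled_bitrate(total_tiles=24,    # e.g. a 6x4 tile grid
                      viewport_tiles=6,  # tiles visible in the viewport
                      base_rate=20.0,    # base layer covers every tile
                      enh_rate=80.0)     # enhancement layer for the full frame
saving = 1 - tiled / full_rate
print(f"{tiled:.1f} Mbps, {saving:.0%} saved")  # 40.0 Mbps, 60% saved
```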

3.
Sensors (Basel) ; 18(9)2018 Sep 18.
Article in English | MEDLINE | ID: mdl-30231529

ABSTRACT

Recently, with the increasing demand for virtual reality (VR), experiencing immersive content in VR has become easier. However, processing 360-degree video requires a tremendous amount of computation and bandwidth. Moreover, additional information such as scene depth is required to enjoy stereoscopic 360-degree content. This paper therefore proposes an efficient method of streaming high-quality 360-degree video. To reduce the bandwidth needed to stream and synthesize 3DoF+ 360-degree video, which supports limited user movement, a suitable down-sampling ratio and quantization parameter are selected from an analysis of the bitrate versus peak signal-to-noise ratio curve. High-Efficiency Video Coding (HEVC) is used to encode and decode the 360-degree video, and a view synthesizer produces intermediate views, providing the user with an immersive experience.
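The rate-distortion selection described above can be sketched as a simple search: given measured (bitrate, PSNR) points for candidate down-sampling ratios and quantization parameters, pick the setting with the highest PSNR that fits a bitrate budget. The sample points and function name are invented for illustration.

```python
def pick_setting(points, budget_mbps):
    """points: list of (downsample_ratio, qp, bitrate_mbps, psnr_db).
    Returns the highest-PSNR point within the bitrate budget, or None."""
    feasible = [p for p in points if p[2] <= budget_mbps]
    if not feasible:
        return None
    return max(feasible, key=lambda p: p[3])  # best quality within budget

candidates = [
    (1.0, 22, 60.0, 42.1),  # full resolution, low QP: best quality, costly
    (0.5, 27, 25.0, 38.4),  # half resolution, moderate QP
    (0.5, 32, 12.0, 35.0),  # half resolution, high QP: cheapest
]
print(pick_setting(candidates, budget_mbps=30.0))  # (0.5, 27, 25.0, 38.4)
```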

4.
IEEE Trans Haptics ; 8(3): 327-38, 2015.
Article in English | MEDLINE | ID: mdl-26219098

ABSTRACT

This paper presents a haptic telepresence system that enables visually impaired users to explore visually rich locations such as art galleries and museums using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of these data as a tangible haptic experience has not been sufficiently explored, especially for telepresence for individuals with visual impairments. Thus, the proposed system addresses real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two haptic telepresence scenarios: mobile navigation and object exploration in a remote environment. Participants with and without visual impairments took part in experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments, offering an enhanced interactive experience in which they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.
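A common way to turn depth data into a haptic sensation, which could underlie a haptic-rendering step like the one described above, is a penalty-based spring model: the depth map gives the surface position under the haptic cursor, and a restoring force pushes the cursor back once it penetrates that surface. The stiffness value and function names below are illustrative assumptions, not the paper's parameters.

```python
def haptic_force(depth_map, x, y, probe_z, stiffness=200.0):
    """Return the repulsive force (N) along z for a probe at (x, y, probe_z).
    depth_map[y][x] is the sensed surface depth in metres (larger = farther)."""
    surface_z = depth_map[y][x]
    penetration = probe_z - surface_z   # > 0 once the probe passes the surface
    if penetration <= 0:
        return 0.0                      # free space: no force
    return stiffness * penetration      # Hooke's-law spring pushback

depth = [[1.0, 1.0], [0.8, 1.0]]        # toy 2x2 depth map (metres)
print(round(haptic_force(depth, 0, 1, 0.85), 2))  # 5 cm inside -> 10.0 N
```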


Subject(s)
Data Display , Museums , Robotics/methods , Self-Help Devices , Sensory Aids , Touch/physiology , Vision Disorders/rehabilitation , Algorithms , Computer Graphics , Computer Simulation , Female , Humans , Male , Multimedia , Task Performance and Analysis , User-Computer Interface