Results 1 - 2 of 2
1.
Front Robot AI; 11: 1329270, 2024.
Article in English | MEDLINE | ID: mdl-38783889

ABSTRACT

Shared autonomy holds promise for assistive robotics, whereby physically impaired people can direct robots to perform various tasks for them. However, a robot that is capable of many tasks also introduces many choices for the user, such as which object or location should be the target of interaction. In the context of non-invasive brain-computer interfaces for shared autonomy (most commonly electroencephalography-based), the two most common options are to present either auditory or visual stimuli to the user, each with its respective pros and cons. Using the oddball paradigm, we designed comparable auditory and visual interfaces to speak or display the choices to the user, and had users complete a multi-stage robotic manipulation task involving location and object selection. Users displayed differing competencies and preferences for the different interfaces, highlighting the importance of considering modalities beyond vision when constructing human-robot interfaces.
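
The oddball-paradigm selection described in this abstract can be pictured with a minimal sketch: each candidate choice is flashed or spoken repeatedly in shuffled blocks, and the choice whose presentations evoke the strongest response (e.g., a P300 amplitude estimate) is taken as the user's target. Everything below is an illustrative assumption (function names, block counts, the scoring rule, and the fake responses), not the authors' implementation.

import random

def oddball_schedule(choices, repetitions=10, seed=0):
    """Return a presentation order in which every choice appears once per
    repetition block, shuffled within each block (oddball-style schedule)."""
    rng = random.Random(seed)
    schedule = []
    for _ in range(repetitions):
        block = list(choices)
        rng.shuffle(block)
        schedule.extend(block)
    return schedule

def score_target(schedule, responses, choices):
    """Sum a per-stimulus response score (e.g., an estimated P300 amplitude)
    for each choice and return the highest-scoring one."""
    totals = {c: 0.0 for c in choices}
    for stimulus, value in zip(schedule, responses):
        totals[stimulus] += value
    return max(totals, key=totals.get)

if __name__ == "__main__":
    options = ["cup", "book", "table", "shelf"]      # hypothetical targets
    order = oddball_schedule(options, repetitions=5)
    # Fake responses: pretend "book" evokes a larger response when presented.
    fake = [1.0 if s == "book" else 0.1 for s in order]
    print(score_target(order, fake, options))        # -> book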

2.
Front Neurorobot; 16: 898859, 2022.
Article in English | MEDLINE | ID: mdl-36310633

ABSTRACT

Despite the importance of usability in human-machine interaction (HMI), the most commonly used devices are not usable by all potential users. In particular, users with little or no technological experience, or with special needs, require carefully designed systems and easy-to-use interfaces that support recognition over recall. To this end, Natural User Interfaces (NUIs) represent an effective strategy, as the user's learning is facilitated by interface features that mimic the "natural" human sensorimotor embodied interactions with the environment. This paper compares the usability of a new NUI (based on an eye-tracker and hand gesture recognition) with a traditional interface (keyboard) for the distal control of a simulated drone flying in a virtual environment. The whole interface relies on "dAIsy", new software allowing the flexible use of different input devices and the control of different robotic platforms. The 59 users involved in the study were required to complete two tasks with each interface while their performance was recorded: (a) exploration: detecting trees embedded in an urban environment; (b) accuracy: guiding the drone as accurately and as fast as possible along a predefined track. They were then administered questionnaires regarding their background, the perceived embodiment of the device, and the perceived quality of the virtual experience while using either the NUI or the traditional interface. The results appear contradictory and call for further investigation: (a) contrary to our hypothesis, the specific NUI used led to lower performance than the traditional interface; (b) however, the NUI was evaluated as more natural and embodied. The final part of the paper discusses the possible causes underlying these results, suggesting possible future improvements to the NUI.
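
The abstract describes dAIsy as a layer that lets different input devices (keyboard, eye-tracker plus hand gestures) drive the same simulated drone. A small adapter-style sketch can make that idea concrete; it is a generic illustration only, and none of the names or mappings below reflect dAIsy's actual API or the study's control scheme.

from dataclasses import dataclass

@dataclass
class DroneCommand:
    forward: float = 0.0   # m/s, positive = forward
    lateral: float = 0.0   # m/s, positive = right
    yaw: float = 0.0       # rad/s, positive = clockwise

def keyboard_adapter(keys_down):
    """Map held keys to a velocity command (traditional interface)."""
    return DroneCommand(
        forward=1.0 if "w" in keys_down else -1.0 if "s" in keys_down else 0.0,
        lateral=1.0 if "d" in keys_down else -1.0 if "a" in keys_down else 0.0,
    )

def gaze_gesture_adapter(gaze_x, pinch):
    """Map a normalized horizontal gaze position and a pinch gesture
    to a velocity command (NUI-style input)."""
    return DroneCommand(
        yaw=0.5 * (gaze_x - 0.5),        # look left/right to steer
        forward=1.0 if pinch else 0.0,   # pinch to move forward
    )

if __name__ == "__main__":
    # Both adapters produce the same command type, so the drone simulator
    # can stay agnostic about which input device the user chose.
    print(keyboard_adapter({"w", "d"}))
    print(gaze_gesture_adapter(gaze_x=0.8, pinch=True))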
