Results 1 - 2 of 2
1.
IEEE Trans Haptics; 13(2): 312-324, 2020.
Article in English | MEDLINE | ID: mdl-31603800

ABSTRACT

Achieving both high stiffness and low inertia is a major challenge for current haptic devices. Impedance-based devices are limited in providing high stiffness, whereas admittance-based devices are limited in achieving low inertia. It is therefore difficult to simulate hard contact and small inertia simultaneously in virtual environments. In this paper, we introduce a co-actuation module to overcome this difficulty. The module is a one degree-of-freedom (DOF) revolute joint consisting of a link and a physical constraint, with a clearance between the two components. A motor drives the physical constraint so that it moves cooperatively with the link. In free space, the constraint does not contact the link, so users can move the link freely without feeling the inertia of the motor. In constrained space, the constraint comes into contact with the link, so users feel resistance from the motor. Through direct physical contact between the link and the constraint, users can feel a hard virtual surface. This paper describes the principle and implementation of the proposed co-actuation module. Performance was evaluated using a two-DOF haptic device in a task workspace of 100 mm × 100 mm. The effective inertia of the device is 64-142 g within the task workspace. The device can stably render a virtual wall with stiffness as high as 65 N/mm. Penetration into the virtual wall was 0.02-0.41 mm when tapping the wall at speeds of 80-320 mm/s. The maximum back-driving force was about 0.19 N when moving at 4.5-8.6 mm/s. The experimental results demonstrate that co-actuation is feasible for achieving high force, high stiffness, and low inertia in haptic devices.
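The free-space/constrained-space switching described above can be sketched as a simple per-joint control rule. The code below is a minimal, hypothetical single-joint illustration assuming a PD-controlled constraint motor, an arbitrary clearance, and one virtual wall defined in joint space; all names, gains, and geometry are assumptions made for the example rather than the authors' implementation.

```python
# Illustrative co-actuation switching logic for one revolute joint: the
# motor-driven physical constraint either follows the link with a clearance
# (free space) or parks at the virtual wall (constrained space), so the user
# only ever contacts a passive, stiff stop.

CLEARANCE = 0.010      # assumed link/constraint clearance in joint space (rad)
WALL_ANGLE = 0.500     # assumed virtual-wall position in joint space (rad)
KP, KD = 40.0, 0.5     # assumed PD gains for the constraint motor

def constraint_motor_torque(link_angle, link_vel,
                            constraint_angle, constraint_vel):
    """Torque command for the constraint motor (illustrative sketch only)."""
    if link_angle < WALL_ANGLE - CLEARANCE:
        # Free space: keep the constraint centered on the link so the
        # clearance prevents contact and the user never feels the motor.
        target_angle, target_vel = link_angle, link_vel
    else:
        # Constrained space: hold the constraint at the wall so the link
        # contacts it directly, rendering a hard virtual surface.
        target_angle, target_vel = WALL_ANGLE, 0.0
    return (KP * (target_angle - constraint_angle)
            + KD * (target_vel - constraint_vel))
```

The device in the paper applies this idea per joint of a two-DOF mechanism; the sketch only captures the switching between tracking with clearance and parking at the wall.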


Subject(s)
Equipment Design; Feedback, Sensory; Psychomotor Performance; Touch Perception; User-Computer Interface; Adult; Equipment Design/standards; Feasibility Studies; Humans; Proof of Concept Study
2.
Front Neurorobot; 13: 117, 2019.
Article in English | MEDLINE | ID: mdl-32116632

ABSTRACT

This paper presents an intuitive end-to-end interaction system between a human and a hexacopter Unmanned Aerial Vehicle (UAV) for field exploration, in which the UAV can be commanded by natural human poses. LEDs installed on the UAV communicate its state and intents to the human as feedback throughout the interaction. A real-time multi-human pose estimation system is built that runs with low latency while maintaining competitive performance. The UAV is equipped with a robotic arm, and kinematic and dynamic attitude models for it are provided by introducing the center of gravity (COG) of the vehicle. In addition, a super-twisting extended state observer (STESO)-based back-stepping controller (BSC) is constructed to estimate and attenuate complex disturbances in the UAV's attitude control system, such as wind gusts and model uncertainties. A stability analysis of the entire control system is also presented based on Lyapunov stability theory. The pose estimation system is integrated with the proposed intelligent control architecture to command the UAV to execute an exploration task stably. Additionally, all components of the interaction system are described. Several simulations and experiments were conducted to demonstrate the effectiveness of the whole system and of its individual components.
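As a rough illustration of the disturbance-estimation idea behind the STESO, the sketch below simulates one attitude axis with a super-twisting (sliding-mode) observer, a simplified relative of the observer mentioned above, reconstructing the rate despite an unknown bounded disturbance. The plant model, disturbance profile, gains, and feedback law are assumptions made for this example and are not taken from the paper.

```python
import numpy as np

DT = 0.001                 # assumed integration step (s)
K1, K2 = 3.0, 5.0          # assumed super-twisting observer gains

def simulate(steps=5000):
    """Toy single-axis attitude simulation with a super-twisting observer."""
    theta, omega = 0.2, 0.0        # true angle (rad) and rate (rad/s)
    th_hat, om_hat = 0.0, 0.0      # observer estimates
    for k in range(steps):
        t = k * DT
        u = -4.0 * theta - 2.0 * omega   # feedback on the true state, to keep the toy short
        d = 0.3 * np.sin(2.0 * t)        # assumed lumped disturbance (wind gust, model error)
        # True dynamics: theta_dot = omega, omega_dot = u + d
        theta += DT * omega
        omega += DT * (u + d)
        # Super-twisting observer driven by the angle estimation error e
        e = theta - th_hat
        th_hat += DT * (om_hat + K1 * np.sqrt(abs(e)) * np.sign(e))
        om_hat += DT * (u + K2 * np.sign(e))
    return theta - th_hat, omega - om_hat  # estimation errors after the run

if __name__ == "__main__":
    print(simulate())
```

In the paper, the disturbance estimate feeds a back-stepping attitude controller with a Lyapunov-based stability analysis; the toy above only shows the observer keeping its estimation errors small despite the unmodeled disturbance.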
