Results 1 - 4 of 4
1.
IEEE Trans Haptics ; 15(1): 200-211, 2022.
Article in English | MEDLINE | ID: mdl-34529575

ABSTRACT

The objective of this paper is to develop and evaluate a directional vibrotactile feedback interface as a guidance tool for postural adjustments during work. In contrast to existing active and wearable systems such as exoskeletons, we aim to create a lightweight and intuitive interface capable of guiding its wearers towards more ergonomic and healthy working conditions. To achieve this, a vibrotactile device called ErgoTac is employed to develop three feedback modalities, each able to provide directional guidance at the body segments towards a desired pose. We also evaluate which modality is the most suitable, comfortable, and intuitive for the user. To this end, the modalities are first compared experimentally on fifteen subjects wearing eight ErgoTac devices while achieving targeted arm and torso configurations. The most effective directional feedback modality is then evaluated on five subjects in a set of experiments in which an ergonomic optimisation module provides the optimised body posture during heavy lifting or forceful exertion tasks. The results yield strong evidence of the usefulness and intuitiveness of one of the developed modalities in guiding users towards ergonomic working conditions by minimising the effect of an external load on the body joints. We believe that the integration of such low-cost devices in workplaces can help address the well-known and complex problem of work-related musculoskeletal disorders.
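To make the guidance idea concrete, below is a minimal sketch of how a directional cue could be derived from the error between a segment's current and target orientation and mapped onto a ring of tactors. The motor layout, the 2D simplification, and the attraction-style mapping are assumptions for illustration, not the authors' ErgoTac firmware.

    # Illustrative sketch (not the ErgoTac implementation): map the error
    # between a body segment's current and target direction to a vibration
    # cue on a hypothetical ring of tactors around the segment.
    import numpy as np

    N_MOTORS = 4  # assumed tactor ring, indexed counterclockwise from the wearer's right

    def directional_cue(current_dir, target_dir):
        """Return (motor_index, intensity in [0, 1]) steering toward target_dir.

        current_dir, target_dir: 2D unit vectors in the segment's transverse
        plane (a simplification of the full 3D orientation).
        """
        error = target_dir - current_dir
        magnitude = np.linalg.norm(error)
        if magnitude < 1e-3:                    # already at the target pose
            return None, 0.0
        angle = np.arctan2(error[1], error[0])  # direction to move toward
        motor = int(round(angle / (2 * np.pi / N_MOTORS))) % N_MOTORS
        intensity = min(1.0, magnitude)         # stronger cue for larger error
        return motor, intensity

    # Example: torso leaning left, target upright -> right-side tactor fires
    motor, level = directional_cue(np.array([-0.3, 0.95]), np.array([0.0, 1.0]))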


Subject(s)
Ergonomics , Vibration , Feedback , Humans , Posture , Torso
2.
Sensors (Basel) ; 20(10)2020 May 18.
Article in English | MEDLINE | ID: mdl-32443547

ABSTRACT

In physical Human-Robot Interaction (pHRI), the forces exerted by humans need to be estimated so that robot commands can accommodate human constraints, preferences, and needs. This paper presents a method for estimating the interaction forces between a human and a robot using a gripper with proprioceptive sensing. Specifically, we measure the forces exerted in the frontal plane by a human limb grabbed by an underactuated gripper, using only the gripper's own sensors. This is achieved via a regression method trained on experimental data consisting of the phalanx angles and actuator signals. The proposed method is intended for adaptive shared control in limb manipulation. Although adding dedicated force sensors would provide better performance, the results obtained are accurate enough for this application. The approach requires no additional hardware: it relies solely on the gripper motor feedback (current, position, and torque) and the joint angles. It is also computationally cheap, so processing times are low enough to allow continuous, human-adapted pHRI for shared control.
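As a rough illustration of the regression step described above, here is a minimal sketch that learns a mapping from proprioceptive signals (phalanx angles plus motor current, position, and torque) to a planar force. The feature layout, the random-forest regressor, and the synthetic placeholder data are assumptions, not the authors' exact pipeline.

    # Minimal sketch of the regression idea, assuming a 7-feature layout:
    # 4 phalanx angles + motor current, position, and torque per sample.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder for logged experimental data: one row per sample.
    X = rng.normal(size=(1000, 7))
    # Targets: force components (Fx, Fy) in the frontal plane, e.g. from a
    # reference force sensor used only during training.
    y = rng.normal(size=(1000, 2))

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    print("R^2 on held-out data:", model.score(X_test, y_test))

At run time the trained model would be queried once per control cycle with the latest proprioceptive readings, which is what keeps the approach cheap enough for continuous shared control.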


Subject(s)
Fingers , Proprioception , Robotics , Feedback , Humans , Regression Analysis , Torque
3.
Sensors (Basel) ; 19(24)2019 Dec 05.
Article in English | MEDLINE | ID: mdl-31817320

ABSTRACT

This paper presents a novel method of active tactile perception based on 3D neural networks and a high-resolution tactile sensor installed on a robot gripper. A haptic exploratory procedure based on robotic palpation obtains pressure images at different grasping forces, providing information not only about the external shape of the object but also about its internal features. The gripper consists of two underactuated fingers with a tactile sensor array in the thumb. A new representation of tactile information as 3D tactile tensors is described: during a squeeze-and-release process, the pressure images read from the tactile sensor are concatenated into a tensor that captures the variation of the pressure matrices with the grasping force. These tensors feed a 3D Convolutional Neural Network (3D CNN), called 3D TactNet, which classifies the grasped object through active interaction. Results show that the 3D CNN performs better, providing higher recognition rates with less training data.
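The tensor construction and classification step might look like the sketch below: pressure images from a squeeze-and-release are stacked along a depth axis and passed through a small 3D CNN. The layer sizes, 16x16 sensor resolution, and class count are assumptions standing in for the paper's 3D TactNet architecture.

    # Sketch of the "3D tactile tensor" idea with an assumed small 3D CNN.
    import torch
    import torch.nn as nn

    N_FRAMES, H, W = 10, 16, 16  # pressure images across increasing grasp force
    N_CLASSES = 8                # hypothetical number of object classes

    class TactNet3D(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(1, 8, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool3d(2),   # halves frames, height, and width
                nn.Conv3d(8, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool3d(2),
            )
            self.classifier = nn.Linear(
                16 * (N_FRAMES // 4) * (H // 4) * (W // 4), N_CLASSES)

        def forward(self, x):      # x: (batch, 1, frames, H, W)
            x = self.features(x)
            return self.classifier(x.flatten(1))

    # One squeeze-and-release trial -> one tactile tensor -> class logits
    tensor = torch.randn(1, 1, N_FRAMES, H, W)
    logits = TactNet3D()(tensor)

The 3D convolutions slide across the force axis as well as the image plane, which is what lets the network exploit how pressure patterns evolve with grasp force rather than treating each frame independently.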


Subject(s)
Neural Networks, Computer , Robotics , Deep Learning , Equipment Design , Palpation , Touch
4.
Sensors (Basel) ; 18(3)2018 Feb 26.
Article in English | MEDLINE | ID: mdl-29495409

ABSTRACT

Tactile perception can help first-response robotic teams distinguish human limbs from other objects of similar shape in disaster scenarios, where visibility is often reduced by dust, mud, or smoke. Here, the integration of a flexible tactile sensor into adaptive grippers is evaluated by measuring the performance of an object recognition task based on deep convolutional neural networks (DCNNs). A total of 15 classes with 50 tactile images each, including human body parts and common environment objects, were used to train classifiers for semi-rigid and flexible adaptive grippers based on the fin ray effect. These classifiers were compared against the rigid configuration and against a support vector machine (SVM) classifier. Finally, a two-level output network is proposed that provides both object-type recognition and human/non-human classification. Sensors mounted in adaptive grippers register a higher number of non-null tactels (up to 37% more) and lower mean pressure values (up to 72% less) than a rigid sensor, yielding the softer grip needed in physical human-robot interaction (pHRI). A semi-rigid implementation with a 95.13% object recognition rate was chosen, even though human/non-human classification performed better (98.78%) with a rigid sensor.
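One way to realise the two-level output mentioned above is a shared convolutional trunk with two heads, one for the 15 object classes and one for binary human/non-human classification. The trunk layout and input resolution below are assumptions for illustration, not the paper's exact DCNN.

    # Sketch of a two-level output network over a single tactile image.
    import torch
    import torch.nn as nn

    H, W = 28, 50            # assumed tactile image resolution
    N_OBJECT_CLASSES = 15    # 15 trained classes per the abstract

    class TwoLevelTactileNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.trunk = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),   # -> (batch, 32, 1, 1)
                nn.Flatten(),
            )
            self.object_head = nn.Linear(32, N_OBJECT_CLASSES)
            self.human_head = nn.Linear(32, 2)  # human vs. non-human

        def forward(self, x):                   # x: (batch, 1, H, W)
            z = self.trunk(x)
            return self.object_head(z), self.human_head(z)

    obj_logits, human_logits = TwoLevelTactileNet()(torch.randn(1, 1, H, W))

Sharing the trunk lets the safety-critical human/non-human decision reuse the same tactile features as the finer-grained object classifier instead of training two separate networks.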


Subject(s)
Robotics , Hand Strength , Humans , Neural Networks, Computer , Support Vector Machine , Touch