1.
Front Robot AI; 8: 631371, 2021.
Article in English | MEDLINE | ID: mdl-34113655

ABSTRACT

Soft robotic grippers are increasingly desired in applications that involve grasping complex and deformable objects. However, their flexible nature and non-linear dynamics make modelling and control difficult. Numerical techniques such as Finite Element Analysis (FEA) offer an accurate way of modelling complex deformations; however, FEA approaches are computationally expensive and consequently challenging to employ for real-time control tasks. Existing analytical techniques simplify modelling by approximating the deformed gripper geometry. Although this approach is less computationally demanding, it is limited in design scope and can lead to larger estimation errors. In this paper, we present a learning-based framework that predicts contact forces as well as stress distributions from soft Fin Ray Effect (FRE) finger images in real time. These images are used to learn internal representations of deformation with a deep neural encoder, which are then decoded to contact forces and stress maps by separate branches. The entire network is trained jointly in an end-to-end fashion. To address the challenge of obtaining sufficient labelled training data, we employ FEA to generate simulated images that supervise our framework. This yields accurate predictions, faster inference, and a large and diverse dataset for better generalisability. Furthermore, our approach predicts a detailed stress distribution that can guide grasp planning, which is particularly useful for delicate objects. The proposed approach is validated by comparing the predicted contact forces to ground-truth forces computed by FEA as well as to measurements from a real force sensor. We rigorously evaluate its performance under variations in contact point, object material, object shape, viewing angle, and level of occlusion.
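
The abstract above does not give implementation details, but the shared-encoder, two-branch idea it describes can be sketched roughly as follows in PyTorch. Everything concrete here is an assumption made for illustration: the FingerNet class name, the 64x64 input resolution, the layer sizes, and the equal weighting of the two losses are not taken from the paper.

# Minimal sketch: a CNN encoder maps a finger image to a latent code, one head
# regresses contact force and another decodes a stress map; both branches are
# trained jointly against FEA-generated labels. All sizes are illustrative.
import torch
import torch.nn as nn

class FingerNet(nn.Module):
    def __init__(self, latent_dim=128):
        super().__init__()
        # Encoder: 64x64 grayscale finger image -> latent vector (assumed resolution)
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim), nn.ReLU(),
        )
        # Branch 1: scalar contact force
        self.force_head = nn.Linear(latent_dim, 1)
        # Branch 2: stress map decoded back to image resolution
        self.stress_head = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),              # 32 -> 64
        )

    def forward(self, img):
        z = self.encoder(img)
        return self.force_head(z), self.stress_head(z)

# Joint end-to-end training step on FEA-labelled data (random batch shown here).
model = FingerNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
img = torch.rand(8, 1, 64, 64)          # simulated finger images
force_gt = torch.rand(8, 1)             # FEA ground-truth contact forces
stress_gt = torch.rand(8, 1, 64, 64)    # FEA ground-truth stress maps
force_pred, stress_pred = model(img)
loss = nn.functional.mse_loss(force_pred, force_gt) \
     + nn.functional.mse_loss(stress_pred, stress_gt)   # equal weighting assumed
opt.zero_grad()
loss.backward()
opt.step()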

2.
Front Robot AI; 5: 2, 2018.
Article in English | MEDLINE | ID: mdl-33500889

ABSTRACT

This paper presents a fully printable sensorized bending actuator that can be calibrated to provide reliable bending feedback and simple contact detection. A soft bending actuator with a pleated morphology, as well as a flexible resistive strain sensor, were directly 3D printed using easily accessible FDM printer hardware with a dual-extrusion tool head. The flexible sensor was welded directly to the bending actuator's body and systematically tested to characterize its response under variable input pressure. A signal conditioning circuit was developed to enhance the quality of the sensory feedback, and flexible conductive threads were used for wiring. The sensorized actuator's response was then calibrated against a vision system to convert the sensor readings to real bending-angle values. The empirical relationship was derived using linear regression and validated at untrained input conditions to evaluate its accuracy. Furthermore, the sensorized actuator was tested in a constrained setup that prevents bending, to evaluate the potential of using the same sensor for simple contact detection by comparing the constrained and free-bending responses at the same input pressures. The results demonstrate how a dual-extrusion FDM printing process can be tuned to directly print highly customizable flexible strain sensors that provide reliable bending feedback and basic contact detection. Adding such sensing capability to bending actuators enhances their functionality and reliability for applications such as controlled soft grasping, flexible wearables, and haptic devices.
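
As with the first entry, the abstract describes the method only at a high level; the sketch below illustrates the two uses of the printed strain sensor it mentions, under stated assumptions. The calibration data, the reading_to_angle and contact_detected names, and the detection threshold are hypothetical, not values from the paper.

# Minimal sketch: (1) linear calibration from conditioned sensor readings to
# bending angle, fitted against vision-system ground truth, and (2) contact
# detection by comparing a reading with the expected free-bending reading at
# the same input pressure. All numbers below are made up for illustration.
import numpy as np

# Hypothetical calibration data: normalised sensor output vs. angle from the vision system.
sensor_reading = np.array([0.10, 0.25, 0.41, 0.58, 0.74, 0.90])
bending_angle = np.array([5.0, 21.0, 38.0, 55.0, 71.0, 88.0])   # degrees

# Linear regression: angle ~ slope * reading + intercept
slope, intercept = np.polyfit(sensor_reading, bending_angle, deg=1)

def reading_to_angle(reading):
    """Convert a conditioned sensor reading to an estimated bending angle (degrees)."""
    return slope * reading + intercept

# Validation at an untrained input condition (hypothetical held-out sample).
print("estimated angle:", reading_to_angle(0.50))

# Simple contact detection: if the response at a given pressure deviates from the
# expected free-bending response by more than a threshold, bending is obstructed.
def contact_detected(reading, expected_free_reading, threshold=0.05):
    return abs(reading - expected_free_reading) > threshold

print("contact:", contact_detected(reading=0.32, expected_free_reading=0.41))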
