1.
IEEE Trans Haptics; 16(3): 400-411, 2023.
Article in English | MEDLINE | ID: mdl-37527306

ABSTRACT

Human-like robot hands provide the flexibility to manipulate a variety of objects found in unstructured environments. Knowledge of object properties and of the motion trajectory is required, but it is often unavailable in real-world manipulation tasks. Although it is possible to grasp and manipulate unknown objects, an uninformed grasp leads to inferior stability, accuracy, and repeatability of the manipulation. A central challenge of in-hand manipulation in unstructured environments is therefore to acquire this information safely and efficiently. We propose an in-hand manipulation framework that assumes no prior information about the object or the motion, but instead extracts the object properties through a novel haptic exploration procedure and learns the motion from demonstration using dynamical movement primitives. We evaluate our approach in manipulation experiments with unknown objects using a human-like robot hand. The results show that haptic exploration significantly improves manipulation robustness and accuracy compared to the virtual-spring baseline that is widely used for grasping unknown objects.
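
The abstract does not give implementation details; purely as a hedged illustration of the kind of dynamical movement primitive (DMP) used to learn motions from demonstration, a single-degree-of-freedom rollout might look like the Python sketch below. All function names, parameter values, and basis settings are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dmp_rollout(x0, g, weights, centers, widths, tau=1.0, dt=0.01,
                alpha=25.0, beta=6.25, alpha_s=4.0, duration=1.0):
    """Roll out a single-DOF dynamical movement primitive (illustrative sketch)."""
    x, v, s = x0, 0.0, 1.0                               # state, velocity, canonical phase
    traj = []
    for _ in range(int(duration / dt)):
        psi = np.exp(-widths * (s - centers) ** 2)       # RBF basis over the canonical phase
        f = s * (g - x0) * psi.dot(weights) / (psi.sum() + 1e-10)  # learned forcing term
        dv = (alpha * (beta * (g - x) - v) + f) / tau    # spring-damper pulled toward goal g
        dx = v / tau
        ds = -alpha_s * s / tau                          # canonical system decays to zero
        v, x, s = v + dv * dt, x + dx * dt, s + ds * dt
        traj.append(x)
    return np.array(traj)
```

In a learning-from-demonstration setting, `weights` would be fitted so that the forcing term reproduces a demonstrated trajectory; here they are simply assumed to be given.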


Subject(s)
Touch Perception , Touch , Humans , Haptic Technology , Learning , Hand
2.
Cognition; 239: 105532, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37442021

ABSTRACT

Music performances are rich in systematic temporal irregularities called "microtiming", too fine-grained to be notated in a musical score but important for musical expression and communication. Several studies have examined listeners' preference for rhythms varying in microtiming, but few have addressed precisely how microtiming is perceived, especially in terms of cognitive mechanisms, making the empirical evidence difficult to interpret. Here we provide evidence that microtiming perception can be simulated as a process of probabilistic prediction. Participants performed an XAB discrimination test, in which an archetypal popular drum rhythm was presented with different microtiming. The results indicate that listeners could implicitly discriminate the mean and variance of stimulus microtiming. Furthermore, their responses were effectively simulated by a Bayesian model of entrainment, using a distance function derived from its dynamic posterior estimate over phase. Wide individual differences in participant sensitivity to microtiming were predicted by a model parameter likened to noisy timekeeping processes in the brain. Overall, this suggests that the cognitive mechanisms underlying perception of microtiming reflect a continuous inferential process, potentially driving qualitative judgements of rhythmic feel.
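
The paper's model is a continuous Bayesian entrainment model; purely as a hedged sketch of how an XAB response could be simulated once a posterior-derived distance between the probe X and each candidate (A, B) is available, one might use a noisy choice rule like the following. The distance computation itself, which in the paper comes from the dynamic posterior estimate over phase, is not shown, and all names are illustrative.

```python
import numpy as np

def xab_choice_prob(d_xa, d_xb, noise=1.0):
    """Probability of choosing A in an XAB trial, given model-derived distances
    between the probe X and candidates A and B. Smaller distance -> more likely chosen.
    `noise` loosely plays the role of a participant's internal timing noise."""
    return 1.0 / (1.0 + np.exp((d_xa - d_xb) / noise))

def simulate_response(d_xa, d_xb, noise=1.0, rng=np.random.default_rng()):
    """Draw a single simulated XAB response ('A' or 'B')."""
    return "A" if rng.random() < xab_choice_prob(d_xa, d_xb, noise) else "B"
```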


Subject(s)
Auditory Perception , Music , Humans , Bayes Theorem , Auditory Perception/physiology , Brain/physiology , Emotions , Music/psychology
3.
Article in English | MEDLINE | ID: mdl-36327180

ABSTRACT

Multifingered robot hands can be extremely effective at physically exploring and recognizing objects, especially if they are extensively covered with distributed tactile sensors. Convolutional neural networks (CNNs) have proven successful at processing high-dimensional data, such as camera images, and are therefore well suited to analyzing distributed tactile information as well. However, a major challenge is to organize tactile inputs coming from different locations on the hand into a coherent structure that can leverage the computational properties of the CNN. We therefore introduce a morphology-specific CNN (MS-CNN), in which hierarchical convolutional layers are formed following the physical configuration of the tactile sensors on the robot. We equipped a four-fingered Allegro robot hand with several uSkin tactile sensors; overall, the hand is covered with 240 sensitive elements, each measuring three-axis contact force. The MS-CNN layers process the tactile data hierarchically: first at the level of small local clusters, then per finger, and then across the entire hand. We show experimentally that, after training, the robot hand can successfully recognize objects from a single touch, with a recognition rate of over 95%. Interestingly, the learned MS-CNN representation transfers well to novel tasks: by adding a limited amount of data about new objects, the network can recognize nine types of physical properties.
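
As a hedged illustration of the hierarchical idea (local convolutions per finger, then fusion at hand level), a minimal PyTorch sketch is given below. The layer sizes, taxel-grid shapes, and class count are assumptions and do not reproduce the published MS-CNN architecture.

```python
import torch
import torch.nn as nn

class MSCNNSketch(nn.Module):
    """Morphology-inspired CNN sketch: convolve each finger's taxel map locally,
    pool to a per-finger feature vector, then fuse all fingers at hand level."""
    def __init__(self, n_fingers=4, n_classes=10):
        super().__init__()
        self.finger_net = nn.Sequential(                              # shared per-finger branch
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),    # 3 channels = force axes
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                                  # per-finger feature vector
        )
        self.hand_net = nn.Sequential(                                # hand-level fusion + classifier
            nn.Linear(32 * n_fingers, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, finger_maps):          # list of (B, 3, H, W) tensors, one per finger
        feats = [self.finger_net(m).flatten(1) for m in finger_maps]
        return self.hand_net(torch.cat(feats, dim=1))
```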

4.
PLoS Comput Biol; 18(9): e1010579, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36174063

ABSTRACT

Long-term and culture-specific experience of music shapes rhythm perception, leading to enculturated expectations that make certain rhythms easier to track and more conducive to synchronized movement. However, the influence of enculturated bias on the moment-to-moment dynamics of rhythm tracking is not well understood. Recent modeling work has formulated entrainment to rhythms as a formal inference problem, where phase is continuously estimated based on precise event times and their correspondence to timing expectations: PIPPET (Phase Inference from Point Process Event Timing). Here we propose that the problem of optimally tracking a rhythm also requires an ongoing process of inferring which pattern of event timing expectations is most suitable to predict a stimulus rhythm. We formalize this insight as an extension of PIPPET called pPIPPET (PIPPET with pattern inference). The variational solution to this problem introduces terms representing the likelihood that a stimulus is based on a particular member of a set of event timing patterns, which we initialize according to culturally-learned prior expectations of a listener. We evaluate pPIPPET in three experiments. First, we demonstrate that pPIPPET can qualitatively reproduce enculturated bias observed in human tapping data for simple two-interval rhythms. Second, we simulate categorization of a continuous three-interval rhythm space by Western-trained musicians through derivation of a comprehensive set of priors for pPIPPET from metrical patterns in a sample of Western rhythms. Third, we simulate iterated reproduction of three-interval rhythms, and show that models configured with notated rhythms from different cultures exhibit both universal and enculturated biases as observed experimentally in listeners from those cultures. These results suggest the influence of enculturated timing expectations on human perceptual and motor entrainment can be understood as approximating optimal inference about the rhythmic stimulus, with respect to prototypical patterns in an empirical sample of rhythms that represent the music-cultural environment of the listener.
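
The full pPIPPET model performs continuous variational inference over phase and pattern; the toy Python sketch below only illustrates the discrete flavor of the pattern-inference step, i.e. updating a prior over candidate timing patterns as each event arrives. The Gaussian timing likelihood and its width are assumptions made for illustration.

```python
import numpy as np

def update_pattern_posterior(prior, expected_times, event_time, sigma=0.03):
    """Bayesian update of the posterior over candidate timing patterns given one
    observed event time (toy discrete analogue, not the paper's variational model)."""
    likelihoods = []
    for times in expected_times:                   # each pattern = list of expected event times (s)
        nearest = min(times, key=lambda t: abs(t - event_time))
        err = event_time - nearest                 # deviation from the closest expectation
        likelihoods.append(np.exp(-0.5 * (err / sigma) ** 2))
    posterior = prior * np.array(likelihoods)
    return posterior / posterior.sum()
```

A culturally learned prior would enter here as the initial `prior` vector over patterns; the paper derives such priors from corpora of notated rhythms.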


Subject(s)
Auditory Perception , Music , Bias , Humans , Movement , Periodicity
5.
Front Robot AI; 8: 703869, 2021.
Article in English | MEDLINE | ID: mdl-34458325

ABSTRACT

Grasp stability prediction of unknown objects is crucial to enable autonomous robotic manipulation in an unstructured environment. Even if prior information about the object is available, real-time local exploration might be necessary to mitigate object modelling inaccuracies. This paper presents an approach to predict safe grasps of unknown objects using depth vision and a dexterous robot hand equipped with tactile feedback. Our approach does not assume any prior knowledge about the objects. First, an object pose estimation is obtained from RGB-D sensing; then, the object is explored haptically to maximise a given grasp metric. We compare two probabilistic methods (i.e. standard and unscented Bayesian Optimisation) against random exploration (i.e. uniform grid search). Our experimental results demonstrate that these probabilistic methods can provide confident predictions after a limited number of exploratory observations, and that unscented Bayesian Optimisation can find safer grasps, taking into account the uncertainty in robot sensing and grasp execution.
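
As a hedged sketch of one step of standard Bayesian optimisation over grasp parameters (the unscented variant, which additionally propagates sensing and execution uncertainty through the surrogate, is not shown), the following assumes an illustrative grasp-metric interface and kernel choice:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def propose_next_grasp(X_tried, y_metric, candidates, xi=0.01):
    """One Bayesian-optimisation step: fit a GP to grasps tried so far and pick the
    candidate grasp parameters with the highest expected improvement of the metric."""
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_tried, y_metric)
    mu, sigma = gp.predict(candidates, return_std=True)
    improvement = mu - y_metric.max() - xi
    z = improvement / np.maximum(sigma, 1e-12)
    ei = improvement * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    return candidates[np.argmax(ei)]
```

After each proposal, the robot would execute the exploratory grasp, evaluate the metric from tactile feedback, append the result to `X_tried`/`y_metric`, and repeat.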

6.
Sensors (Basel); 21(15), 2021 Jul 28.
Article in English | MEDLINE | ID: mdl-34372335

ABSTRACT

Tactile sensing is crucial for robots to manipulate objects successfully. However, integrating tactile sensors into robotic hands is still challenging, mainly because small, multi-curved surfaces must be covered with several components that have to be miniaturized. In this paper, we report the design of a novel magnetic tactile sensor to be integrated into the robotic hand of the humanoid robot Vizzy. We designed and fabricated a flexible 4 × 2 matrix of Si chips of magnetoresistive spin-valve sensors that, coupled with a single small magnet, can measure contact forces from 0.1 to 5 N at multiple locations over the surface of a robotic fingertip. This design is innovative with respect to previous work in the literature and is made possible by careful engineering and miniaturization of the custom-made electronic components we employ. In addition, we characterize the behavior of the sensor through a COMSOL simulation, which can be used to generate optimized designs for sensors with different geometries.
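
The paper is primarily a hardware and simulation study; the Python sketch below only illustrates, under assumed interfaces, how raw magnetoresistive channel readings could be mapped to a 3-axis force estimate with a simple least-squares calibration. The real sensor response is generally nonlinear, so this is not the authors' procedure.

```python
import numpy as np

def fit_force_calibration(readings, forces):
    """Fit a linear map from raw channel readings (n_samples, n_channels) to
    3-axis forces (n_samples, 3) by least squares, with a bias term."""
    X = np.hstack([readings, np.ones((readings.shape[0], 1))])
    W, *_ = np.linalg.lstsq(X, forces, rcond=None)   # (n_channels + 1, 3) weight matrix
    return W

def readings_to_force(W, reading):
    """Apply the fitted calibration to a single reading vector."""
    return np.append(reading, 1.0) @ W
```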


Subject(s)
Robotics , Touch , Electronics , Fingers , Magnetic Phenomena
8.
Front Robot AI; 7: 79, 2020.
Article in English | MEDLINE | ID: mdl-33501246

ABSTRACT

Plants are movers, but the nature of their movement differs dramatically from that of creatures that move their whole body from point A to point B. Plants grow to where they are going. Bio-inspired robotics sometimes emulates plants' growth-based movement; but growing is part of a broader system of movement guidance and control. We argue that ecological psychology's conception of "information" and "control" can simultaneously make sense of what it means for a plant to navigate its environment and provide a control scheme for the design of ecological plant-inspired robotics. In this effort, we will outline several control laws and give special consideration to the class of control laws identified by tau theory, such as time to contact.
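
For readers unfamiliar with tau theory, a minimal sketch of the quantities involved is given below: tau is the time-to-contact of a closing gap, and a tau-coupling law coordinates two gaps so that their taus stay in a fixed ratio. The function names and the sign convention (positive rate = gap closing) are illustrative assumptions, not a plant-growth controller.

```python
def tau(gap, closure_rate):
    """Time-to-contact of a gap: remaining gap divided by its rate of closure."""
    return gap / closure_rate if closure_rate != 0 else float("inf")

def tau_coupled_rate(gap_x, gap_y, rate_y, k=1.0):
    """Closure rate for gap x that keeps tau_x = k * tau_y, i.e. the two gaps
    close in a coordinated way (a control-law sketch under assumed conventions)."""
    tau_y = tau(gap_y, rate_y)
    return gap_x / (k * tau_y)
```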

9.
Front Robot AI; 5: 46, 2018.
Article in English | MEDLINE | ID: mdl-33500931

ABSTRACT

Humanoid robots are resourceful platforms and can be used in diverse application scenarios. However, their high number of degrees of freedom (i.e., moving arms, head, and eyes) degrades the precision of eye-hand coordination. A good kinematic calibration is often difficult to achieve, due to several factors, e.g., unmodeled deformations of the structure or backlash in the actuators. This is particularly challenging for very complex robots such as the iCub humanoid robot, which has 12 degrees of freedom and cable-driven actuation in the serial chain from the eyes to the hand. Exploiting real-time robot sensing is of paramount importance for increasing the accuracy of the coordination, for example, to realize precise grasping and manipulation tasks. In this code paper, we propose an online and markerless solution to the eye-hand kinematic calibration of the iCub humanoid robot. We implement a sequential Monte Carlo algorithm that estimates kinematic calibration parameters (joint offsets), improving eye-hand coordination based on the robot's proprioception and vision sensing. We demonstrate the usefulness of the developed code and its accuracy in simulation and real-world scenarios. The code is written in C++ and CUDA, where we exploit the GPU to increase the speed of the method. The code is made available online along with a dataset for testing purposes.
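
The released code is C++/CUDA; purely as an illustrative Python sketch of one sequential Monte Carlo update for joint-offset estimation, the following assumes a hypothetical `observe_error` callback that compares the forward-kinematics prediction (with candidate offsets applied) against the hand detected in the camera image.

```python
import numpy as np

def smc_offset_step(particles, weights, observe_error,
                    process_noise=1e-3, obs_sigma=0.01, rng=np.random.default_rng()):
    """One predict-update-resample step over candidate joint-offset vectors.
    `particles` is (N, n_joints); `observe_error(offsets)` is a hypothetical callback
    returning the image-space error of the predicted hand position."""
    particles = particles + rng.normal(0.0, process_noise, particles.shape)   # diffuse
    errors = np.array([observe_error(p) for p in particles])
    weights = weights * np.exp(-0.5 * (errors / obs_sigma) ** 2)              # reweight
    weights = weights / weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)          # resample
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```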

10.
Sensors (Basel); 16(4), 2016 Apr 07.
Article in English | MEDLINE | ID: mdl-27070604

ABSTRACT

This paper presents an easy means of producing a 3-axis Hall effect-based skin sensor for robotic applications. The sensor uses an off-the-shelf chip, is physically small, and provides digital output. It also has a soft exterior for safe interaction with the environment; in particular, it is covered with soft silicone approximately 8 mm thick. Tests were performed to evaluate drift due to temperature changes, and a compensation scheme using the chip's integral temperature sensor was implemented. The hysteresis and the crosstalk between the 3-axis measurements were also evaluated. The sensor is able to detect minimal forces of about 1 gf. The sensor was calibrated, and results with total forces up to 1450 gf in the normal and tangential directions are presented. The tests revealed that the sensor is able to measure the different components of the force vector.
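
As a hedged sketch of the two processing steps described above (temperature-drift compensation from the chip's integral temperature sensor, followed by a calibration from magnetic readings to force), the following assumes a linear drift model and a fitted 3 × 3 gain matrix; both are illustrative simplifications rather than the authors' published procedure.

```python
import numpy as np

def compensate_drift(raw_xyz, temperature, drift_coeffs, t_ref=25.0):
    """Subtract a per-axis linear temperature-drift term estimated from a drift test."""
    return np.asarray(raw_xyz) - np.asarray(drift_coeffs) * (temperature - t_ref)

def calibrate_force(compensated_xyz, gain_matrix):
    """Map compensated 3-axis readings to normal/tangential force components
    via a fitted 3x3 gain matrix (linear sketch; the real response may be nonlinear)."""
    return np.asarray(gain_matrix) @ compensated_xyz
```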
