Results 1 - 5 of 5
1.
Nat Commun ; 15(1): 2646, 2024 Mar 26.
Article in English | MEDLINE | ID: mdl-38531857

ABSTRACT

Animals traverse vegetation through direct physical interaction, using their entire body to push aside and slide along compliant obstacles. Current drones lack this interaction versatility, which stems from synergies between body morphology and feedback control modulated by sensing. Taking inspiration from nature, we show that a task-oriented design allows a drone with a minimalistic controller to traverse obstacles with unknown elastic responses. A discoid sensorized shell allows the drone to establish and sense contacts anywhere along the shell and facilitates sliding along obstacles. This simplifies the formalization of the control strategy, which requires neither a model of the interaction with the environment nor high-level switching conditions for alternating between pushing and sliding. We use an optimization-based controller that enforces safety constraints on the robot's state and dampens the oscillations of the environment during interaction, even if the elastic response is unknown and variable. Experimental evaluation, using a hinged surface with three different stiffness values ranging from 18 to 155.5 N mm rad⁻¹, validates the proposed embodied aerial physical interaction strategy. By also showcasing the traversal of isolated branches, this work makes an initial contribution toward enabling drone flight across cluttered vegetation, with potential applications in environmental monitoring, precision agriculture, and search and rescue.
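The abstract does not specify the controller, but the general idea of an optimization-based controller that tracks a desired motion while enforcing state constraints can be illustrated with a minimal sketch. The point-mass model, gains, limits, and the cvxpy-based quadratic program below are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch, assuming a point-mass model and illustrative gains/limits;
# this is NOT the paper's controller, only an example of tracking a desired
# acceleration subject to simple state constraints while damping contact forces.
import numpy as np
import cvxpy as cp

def interaction_control_step(pos, vel, pos_ref, f_contact,
                             mass=1.5, dt=0.02,
                             k_p=4.0, k_d=3.0, k_f=0.8,
                             vel_max=1.0, acc_max=5.0):
    """Return a commanded acceleration (3-vector) for one control step."""
    # Desired acceleration: PD tracking plus damping of the sensed contact force,
    # which discourages exciting oscillations of the compliant obstacle.
    acc_des = k_p * (pos_ref - pos) - k_d * vel - k_f * f_contact / mass

    acc = cp.Variable(3)
    objective = cp.Minimize(cp.sum_squares(acc - acc_des))
    constraints = [
        cp.norm(acc, "inf") <= acc_max,              # actuation limit
        cp.norm(vel + acc * dt, "inf") <= vel_max,   # velocity safety constraint
    ]
    # Constraints are kept simple and assumed feasible for this sketch.
    cp.Problem(objective, constraints).solve()
    return acc.value
```

Solving such a small QP at every control step keeps the formulation free of any explicit model of the obstacle's elastic response, in line with the strategy summarized above.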

2.
Sensors (Basel) ; 23(13)2023 Jun 30.
Article in English | MEDLINE | ID: mdl-37447914

ABSTRACT

Ensuring safe and continuous autonomous navigation in long-term mobile robot applications is still challenging. To maintain a reliable representation of the current environment without periodic remapping, the map should be updated online. However, if the robot's pose is estimated incorrectly, updating the map can introduce errors that prevent the robot from localising and jeopardise map accuracy. In this paper, we propose a safe Lidar-based occupancy grid map-updating algorithm for dynamic environments that takes into account the uncertainty in the robot's pose estimate. The proposed approach enables robust long-term operation: it can recover the robot's pose even when the robot gets lost, so that the map update process can continue and the map remains coherent. Moreover, the approach is robust to temporary changes in the environment caused by dynamic obstacles such as humans and other robots. Results on map quality, localisation performance, and pose recovery, both in simulation and in experiments, are reported.
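As a rough illustration of gating map updates on pose uncertainty, the sketch below applies a standard log-odds occupancy update only when the pose covariance is small enough. The thresholds, grid representation, and covariance gate are assumptions for illustration, not the paper's algorithm.

```python
# A minimal sketch, not the paper's method: standard log-odds occupancy update
# applied only when the pose estimate is confident enough, so an uncertain pose
# cannot corrupt the map. All thresholds and constants are illustrative.
import numpy as np

L_OCC, L_FREE, L_MIN, L_MAX = 0.85, -0.4, -4.0, 4.0

def update_grid(log_odds, hit_cells, free_cells, pose_cov, cov_gate=0.05):
    """Update the log-odds grid in place; skip the update if the pose is too uncertain.

    hit_cells / free_cells: iterables of (row, col) indices obtained by ray-casting
    the Lidar scan from the estimated pose (not shown here).
    """
    if np.trace(pose_cov[:2, :2]) > cov_gate:
        return False  # pose too uncertain: keep the previous map coherent
    for r, c in free_cells:
        log_odds[r, c] = np.clip(log_odds[r, c] + L_FREE, L_MIN, L_MAX)
    for r, c in hit_cells:
        log_odds[r, c] = np.clip(log_odds[r, c] + L_OCC, L_MIN, L_MAX)
    return True
```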


Subject(s)
Robotics, Humans, Robotics/methods, Algorithms, Computer Simulation, Computer Systems
3.
Sensors (Basel) ; 22(8)2022 Apr 13.
Article in English | MEDLINE | ID: mdl-35458952

ABSTRACT

Legged robots are meant to autonomously navigate unstructured environments for applications such as search and rescue, inspection, or maintenance. In autonomous navigation, a close relationship between locomotion and perception is crucial: the robot has to perceive the environment and detect any change in order to make decisions autonomously based on what it perceives. One main challenge in autonomous navigation for legged robots is locomotion over unstructured terrain. In particular, when the ground is slippery, common control techniques and state estimation algorithms may not be effective, because they typically assume the ground to be non-slippery. This paper addresses the problem of slip detection, a fundamental first step toward implementing appropriate control strategies and performing dynamic whole-body locomotion. We propose a slip detection approach that is independent of the gait type and does not rely on estimates of the robot's position and velocity in an inertial frame, which are usually prone to drift. To the best of our knowledge, this is the first quadruped slip detector able to detect slippage of more than one foot at the same time, relying only on measurements expressed in a non-inertial frame. We validate the approach on the 90 kg Hydraulically actuated Quadruped robot (HyQ) from the Istituto Italiano di Tecnologia (IIT), and we compare it against a state-of-the-art slip detection algorithm.
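A minimal sketch of the general idea, detecting slip from measurements expressed in the base (non-inertial) frame, is given below. The consensus test, thresholds, and frame conventions are illustrative assumptions and do not reproduce the paper's method.

```python
# A minimal sketch, not the paper's algorithm: for a non-slipping stance foot,
# the quantity v_i + omega x p_i (base frame) equals the negated trunk linear
# velocity, so it must be the same for every non-slipping stance foot. A foot
# whose value deviates from the consensus of the other stance feet is flagged
# as slipping, without any inertial-frame velocity estimate. Threshold is illustrative.
import numpy as np

def detect_slips(foot_pos_b, foot_vel_b, gyro_b, in_stance, slip_threshold=0.05):
    """foot_pos_b, foot_vel_b: (n_feet, 3) arrays from leg kinematics, base frame.
    gyro_b: (3,) base angular velocity. Returns a per-foot list of slip flags."""
    compensated = foot_vel_b + np.cross(gyro_b, foot_pos_b)
    stance_idx = [i for i, s in enumerate(in_stance) if s]
    flags = [False] * len(in_stance)
    if len(stance_idx) < 2:
        return flags  # need at least two stance feet to compare
    for i in stance_idx:
        others = [j for j in stance_idx if j != i]
        consensus = np.median(compensated[others], axis=0)
        if np.linalg.norm(compensated[i] - consensus) > slip_threshold:
            flags[i] = True
    return flags
```

Because the comparison is between simultaneous stance feet, more than one foot can be flagged in the same control cycle, provided a non-slipping majority remains.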


Subject(s)
Robotics, Algorithms, Gait, Locomotion, Lower Extremity, Robotics/methods
4.
IEEE Trans Haptics ; 14(1): 109-122, 2021.
Article in English | MEDLINE | ID: mdl-32746372

ABSTRACT

In recent years, considerable effort has been devoted to developing technological travel aids that increase the autonomy of blind people and improve their quality of life. These systems convey spatial information about the environment to end users through sensory substitution (auditory, haptic). However, despite promising research outcomes, such solutions have met with limited acceptance in real-world use, often because real end users are scarcely involved in the conceptual and design phases. In this article, we propose a novel indoor navigation system based on wearable haptic technologies, with all developmental phases driven by continuous feedback from visually impaired persons. The proposed travel aid consists of an RGB-D camera, a processing unit that extracts visual information for obstacle avoidance, and a wearable device that provides normal and tangential force cues for guidance in an unknown indoor environment. Experiments with blindfolded subjects and visually impaired participants show that our system can be an effective support during indoor navigation and a viable tool for training blind people in the use of travel aids.
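To illustrate how depth data might be mapped to normal and tangential haptic cues, the sketch below splits a depth image into sectors and derives two scalar cues. The sector layout, thresholds, and cue mapping are assumptions for illustration and do not reproduce the authors' pipeline.

```python
# A minimal sketch, not the authors' system: turn an RGB-D depth frame into a
# "stop/slow down" (normal) cue and a "steer left/right" (tangential) cue by
# comparing obstacle closeness in left, centre, and right sectors.
# Sector split, range limits, and cue scaling are illustrative assumptions.
import numpy as np

def haptic_guidance_cue(depth_m, near=0.5, far=2.0):
    """depth_m: (H, W) depth image in metres. Returns (normal, tangential) cues."""
    depth = np.nan_to_num(depth_m, nan=far)          # treat missing depth as far
    closeness = 1.0 - (np.clip(depth, near, far) - near) / (far - near)  # 1 = very close
    h, _ = closeness.shape
    left, centre, right = np.array_split(closeness[h // 3:, :], 3, axis=1)  # lower 2/3
    # Normal cue: how close obstacles straight ahead are (higher = slow down/stop).
    normal = float(centre.mean())
    # Tangential cue: positive = steer right (left sector more cluttered), negative = left.
    tangential = float(left.mean() - right.mean())
    return normal, tangential
```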


Subject(s)
Visually Impaired Persons, Wearable Electronic Devices, Humans, Quality of Life
5.
Sensors (Basel) ; 20(14)2020 Jul 17.
Article in English | MEDLINE | ID: mdl-32709102

ABSTRACT

Self-driving vehicles promise one of the greatest technological and social revolutions of the next decade, with the potential to drastically change human mobility and goods transportation, particularly in terms of efficiency and safety. Autonomous racing poses very similar technological challenges while allowing more extreme conditions in an environment that is safe for humans. While the software stack driving the racing car consists of several modules, in this paper we focus on the localization problem, whose output is the estimated vehicle pose needed by the planning and control modules. When driving near the friction limits, localization accuracy is critical, as small errors can induce large control errors due to the nonlinearities of the vehicle's dynamic model. We present a localization architecture for a racing car that does not rely on Global Navigation Satellite Systems (GNSS). It consists of two multi-rate Extended Kalman Filters and an extension of a state-of-the-art laser-based Monte Carlo localization approach that exploits a priori knowledge of the environment and context. We first compare the proposed method with a solution based on a widely employed state-of-the-art implementation, outlining its strengths and limitations within our experimental scenario. The architecture is then tested both in simulation and experimentally on a full-scale autonomous electric racing car during an event of Roborace Season Alpha. The results show its robustness to the kidnapped-robot problem typical of particle-filter localization methods, while providing a smooth, high-rate pose estimate. The pose error distribution depends on the car's velocity, spanning on average from 0.1 m (at 60 km/h) to 1.48 m (at 200 km/h) laterally and from 1.9 m (at 100 km/h) to 4.92 m (at 200 km/h) longitudinally.
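The sketch below illustrates the general pattern of a multi-rate filter: high-rate prediction from odometry with lower-rate corrections whenever a laser-based Monte Carlo localization pose arrives. The planar constant-velocity model and noise values are illustrative assumptions, not the paper's architecture.

```python
# A minimal sketch, not the paper's filter: planar EKF with high-rate prediction
# from body velocity and yaw rate, and low-rate correction from an external
# (e.g. Monte Carlo localization) pose. Models and noise values are illustrative.
import numpy as np

class PoseEKF:
    def __init__(self):
        self.x = np.zeros(3)                    # state: [x, y, yaw]
        self.P = np.eye(3) * 0.1
        self.Q = np.diag([0.05, 0.05, 0.01])    # process noise (per second)
        self.R = np.diag([0.2, 0.2, 0.05])      # pose-measurement noise

    def predict(self, v, omega, dt):
        """High-rate step from longitudinal speed v and yaw rate omega."""
        x, y, yaw = self.x
        self.x = np.array([x + v * np.cos(yaw) * dt,
                           y + v * np.sin(yaw) * dt,
                           yaw + omega * dt])
        F = np.array([[1.0, 0.0, -v * np.sin(yaw) * dt],
                      [0.0, 1.0,  v * np.cos(yaw) * dt],
                      [0.0, 0.0, 1.0]])
        self.P = F @ self.P @ F.T + self.Q * dt

    def correct(self, measured_pose):
        """Low-rate correction with an external pose estimate [x, y, yaw]."""
        innovation = measured_pose - self.x
        innovation[2] = (innovation[2] + np.pi) % (2 * np.pi) - np.pi  # wrap yaw
        S = self.P + self.R                      # measurement matrix is identity
        K = self.P @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(3) - K) @ self.P
```

Running the prediction at the odometry rate and the correction only when a scan-matched pose is available yields a smooth, high-rate estimate between low-rate map-based fixes, which is the behaviour the abstract describes.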
