Results 1 - 3 of 3
2.
Nature; 609(7928): 709-717, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36131037

ABSTRACT

Additive manufacturing methods [1-4] using static and mobile robots are being developed for both on-site construction [5-8] and off-site prefabrication [9,10]. Here we introduce a method of additive manufacturing, referred to as aerial additive manufacturing (Aerial-AM), that utilizes a team of aerial robots inspired by natural builders [11] such as wasps that use collective building methods [12,13]. We present a scalable multi-robot three-dimensional (3D) printing and path-planning framework that enables robot tasks and population size to be adapted to variations in print geometry throughout a building mission. The multi-robot manufacturing framework allows for autonomous 3D printing under human supervision, real-time assessment of printed geometry and robot behavioural adaptation. To validate autonomous Aerial-AM based on the framework, we develop BuilDrones for depositing materials during flight and ScanDrones for measuring the print quality, and integrate a generic real-time model-predictive-control scheme with the Aerial-AM robots. In addition, we integrate a dynamically self-aligning delta manipulator with the BuilDrone to further improve the manufacturing accuracy to five millimetres for printing geometry with precise trajectory requirements, and develop four cementitious-polymeric composite mixtures suitable for continuous material deposition. We demonstrate proof-of-concept prints including a cylinder 2.05 metres high consisting of 72 layers of a rapid-curing insulation foam material and a cylinder 0.18 metres high consisting of 28 layers of structural pseudoplastic cementitious material, a light-trail virtual print of a dome-like geometry, and multi-robot simulations. Aerial-AM allows manufacturing in-flight and offers future possibilities for building in unbounded, at-height or hard-to-access locations.
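
The abstract mentions a generic real-time model-predictive-control scheme steering the robots along print trajectories. As a minimal sketch of what a receding-horizon tracking step of that kind can look like, the Python below runs random-shooting MPC on a 2D double-integrator model; the dynamics, horizon, cost weights, and every name here are illustrative assumptions, not the authors' controller.

```python
import numpy as np

DT = 0.1          # control period [s] (assumed)
HORIZON = 10      # prediction steps
N_SAMPLES = 256   # candidate control sequences per MPC step

def step(state, u):
    """Double-integrator model: state = [x, y, vx, vy], u = [ax, ay]."""
    pos, vel = state[:2], state[2:]
    vel = vel + u * DT
    pos = pos + vel * DT
    return np.concatenate([pos, vel])

def mpc_action(state, waypoints, rng):
    """Random-shooting MPC: sample control sequences, roll out the model,
    and return the first action of the lowest-cost sequence."""
    best_cost, best_u0 = np.inf, np.zeros(2)
    for _ in range(N_SAMPLES):
        u_seq = rng.uniform(-1.0, 1.0, size=(HORIZON, 2))  # accel bounds
        s, cost = state, 0.0
        for k in range(HORIZON):
            s = step(s, u_seq[k])
            target = waypoints[min(k, len(waypoints) - 1)]
            cost += np.sum((s[:2] - target) ** 2) + 0.01 * np.sum(u_seq[k] ** 2)
        if cost < best_cost:
            best_cost, best_u0 = cost, u_seq[0]
    return best_u0

rng = np.random.default_rng(0)
angles = np.linspace(0.0, 2.0 * np.pi, 200)
path = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # circular layer
state = np.array([1.0, 0.0, 0.0, 0.0])
for t in range(50):
    state = step(state, mpc_action(state, path[t:t + HORIZON], rng))
```

In practice such systems solve a constrained quadratic program rather than sampling, but the receding-horizon structure, roll out a model over a short horizon, apply only the first action, replan, is the same.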

3.
IEEE Trans Pattern Anal Mach Intell; 44(1): 154-180, 2022 Jan.
Article in English | MEDLINE | ID: mdl-32750812

ABSTRACT

Event cameras are bio-inspired sensors that differ from conventional frame cameras: Instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes, and output a stream of events that encode the time, location and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of µs), very high dynamic range (140 dB versus 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz) resulting in reduced motion blur. Hence, event cameras have a large potential for robotics and computer vision in challenging scenarios for traditional cameras, such as low-latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
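
The events described here are tuples of timestamp, pixel location, and polarity (the sign of the brightness change), and one common way to feed them to frame-based or learning-based pipelines is to accumulate them into a frame-like tensor. The Python sketch below illustrates that representation; the Event class, the 180x240 sensor resolution, and the accumulate helper are hypothetical names chosen for illustration, not the survey's API.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    t: float       # timestamp [s]; real sensors give µs-order resolution
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 for a brightness increase, -1 for a decrease

def accumulate(events, height=180, width=240):
    """Collapse an event stream into a signed per-pixel histogram, a
    simple frame-like representation for downstream vision pipelines."""
    frame = np.zeros((height, width), dtype=np.int32)
    for e in events:
        frame[e.y, e.x] += e.polarity
    return frame

# Toy stream: a bright edge sweeping rightwards along one row.
events = [Event(t=i * 1e-6, x=10 + i, y=64, polarity=+1) for i in range(50)]
print(accumulate(events)[64, 10:15])  # -> [1 1 1 1 1]
```

Accumulation discards the fine-grained timing that gives event cameras their advantage, which is why the survey also covers asynchronous, per-event processing such as spiking neural networks.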


Subjects
Algorithms, Robotics, Neural Networks (Computer)