Results 1 - 2 of 2
1.
Progress in Biochemistry and Biophysics; (12): 890-911, 2024.
Article in Chinese | WPRIM | ID: wpr-1039077

Abstract

Human-animal interaction has a long tradition dating back to ancient times. With rapid advances in intelligent chips, wearable devices, and machine algorithms, intelligent interaction between animals and electronic technology, mediated by electronic devices and systems for communication, perception, and control, has become a reality. These devices aim to implement an animal-centric working mode, deepening human understanding of animals and promoting the development of animal intelligence and creativity. Taking medium-sized and large animals as research objects, with the goal of enhancing their abilities, this article introduces the concept of the “intelligent animal augmentation system” (IAAS), uses it to characterize such devices, and provides a comprehensive overview of existing animal-computer interface solutions. In general, IAAS can be divided into implantable and non-implantable types, each composed of an interface platform, perception-and-interpretation components, and control-and-instruction components. Through enhancement systems and architectural patterns at various levels, intelligent interaction between humans and animals can be realized. Although existing IAAS still lack a complete, independent interaction-system architecture, they hold great promise and room for development. Not only can they serve as substitutes for cutting-edge devices and transportation equipment, they are also expected to achieve cross-species information interaction through intelligent interconnection. Additionally, IAAS can promote bidirectional interaction between humans and animals, playing a significant role in advancing animal ethics and ecological protection. Furthermore, interaction models developed with animal subjects can offer instructive research experience for the design of human-computer interaction systems, contributing to the more efficient realization of the ambitious goal of human-machine integration.
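
As an illustrative aid only (the review does not publish code), the taxonomy described above, implantable versus non-implantable systems each built from an interface platform, perception-and-interpretation components, and control-and-instruction components, could be sketched as a small data model. All class names, fields, and example values below are hypothetical:

# Hypothetical sketch of the IAAS taxonomy described in the abstract.
# Class names, fields, and the example configuration are all assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class InterfaceType(Enum):
    IMPLANTABLE = auto()      # e.g., neural or muscular implants
    NON_IMPLANTABLE = auto()  # e.g., instrumented harnesses or vests


@dataclass
class IAAS:
    """Intelligent animal augmentation system, per the abstract's breakdown."""
    interface: InterfaceType
    platform: str            # physical interface platform (harness, implant)
    perception: list[str]    # perception-and-interpretation channels
    instruction: list[str]   # control-and-instruction channels


# An invented non-implantable configuration for a medium-sized animal:
example = IAAS(
    interface=InterfaceType.NON_IMPLANTABLE,
    platform="instrumented harness",
    perception=["IMU posture tracking", "GPS", "vocalization classifier"],
    instruction=["vibrotactile cues", "audio tones"],
)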

2.
Progress in Biochemistry and Biophysics; (12): 158-176, 2024.
Article in English | WPRIM | ID: wpr-1039132

Abstract

Objective: Existing artificial vision devices fall into two types, implanted and extracorporeal, both of which have drawbacks. The former requires surgical implantation, which may cause irreversible trauma, while the latter suffers from relatively simple instructions, limited application scenarios, and over-reliance on the judgment of artificial intelligence (AI), which cannot guarantee adequate safety. Here we propose a system that supports voice interaction and converts information about the surrounding environment into tactile commands delivered to the head and neck. Compared with existing extracorporeal devices, our device provides a larger information capacity and offers advantages such as lower cost, lower risk, and suitability for a variety of life and work scenarios.

Methods: Using the latest long-range wireless communication and chip technologies, the microelectronic devices, cameras, and sensors worn by the user, and the large databases and computing power of the cloud, backend staff can gain full, real-time insight into the user's scene, environmental parameters, and status remotely (for example, from across the city). Meanwhile, by querying the cloud and in-memory databases, aided by AI recognition and manual analysis, they can quickly work out the most reasonable action plan and send instructions to the user. In addition, the backend staff can provide humanistic care and emotional support through voice dialog.

Results: This study proposes the concept of a “remote virtual companion” and demonstrates the related hardware and software along with test results. The system not only performs basic guide functions, such as helping a person with visual impairment shop in supermarkets, find seats in cafes, walk on streets, assemble complex puzzles, and play cards, but also meets the demands of fast-paced daily tasks such as cycling.

Conclusion: Experimental results show that this “remote virtual companion” is applicable to a variety of scenarios and demands. It can help blind people with travel, shopping, and entertainment, or accompany the elderly on trips, wilderness explorations, and journeys.
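
For illustration only, the Methods section implies a client-side loop of this shape: stream camera and sensor data to a remote backend, then render the reply as tactile commands on the head/neck actuators. The paper does not publish its code; every function, message field, and class name in this sketch is an assumption:

# Hypothetical client loop for the “remote virtual companion”; the message
# format and the TactileArray API are assumptions, not the authors' code.
import json
import time


class TactileArray:
    """Stand-in for the head/neck vibrotactile hardware."""
    def pulse(self, zone: str, intensity: float) -> None:
        print(f"pulse zone={zone} intensity={intensity:.1f}")


def companion_loop(capture_frame, read_sensors, send_to_backend, actuators):
    """Repeat one capture -> upload -> command cycle in real time."""
    while True:
        payload = {
            "frame": capture_frame(),   # base64-encoded JPEG from the worn camera
            "sensors": read_sensors(),  # e.g., GPS, IMU, ambient light readings
            "timestamp": time.time(),
        }
        # Backend staff, aided by AI recognition and the cloud database,
        # analyze the scene and reply with an action plan: tactile zones
        # to stimulate plus optional voice audio for the earpiece.
        reply = json.loads(send_to_backend(json.dumps(payload)))
        for cmd in reply.get("tactile", []):
            actuators.pulse(cmd["zone"], cmd["intensity"])
        if reply.get("voice"):
            pass  # play reply["voice"] through the earpiece (omitted here)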
