Results 1 - 2 of 2
1.
Sensors (Basel); 23(12), 2023 Jun 19.
Article in English | MEDLINE | ID: mdl-37420891

ABSTRACT

Diabetic retinopathy (DR) is a common complication of long-term diabetes, affecting the human eye and potentially leading to permanent blindness. The early detection of DR is crucial for effective treatment, as symptoms often manifest only in later stages. The manual grading of retinal images is time-consuming, error-prone, and inconvenient for patients. In this study, we propose two deep learning (DL) architectures for DR detection and classification: a hybrid network combining VGG16 with an XGBoost classifier, and the DenseNet 121 network. To evaluate the two DL models, we preprocessed a collection of retinal images obtained from the APTOS 2019 Blindness Detection Kaggle Dataset. This dataset exhibits an imbalanced class distribution, which we addressed through appropriate balancing techniques. The performance of the considered models was assessed in terms of accuracy. The results showed that the hybrid network achieved an accuracy of 79.50%, while the DenseNet 121 model achieved an accuracy of 97.30%. Furthermore, a comparative analysis with existing methods utilizing the same dataset revealed the superior performance of the DenseNet 121 network. The findings of this study demonstrate the potential of DL architectures for the early detection and classification of DR. The superior performance of the DenseNet 121 model highlights its effectiveness in this domain. The implementation of such automated methods can significantly improve the efficiency and accuracy of DR diagnosis, benefiting both healthcare providers and patients.
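The abstract mentions balancing the imbalanced class distribution but does not specify the technique. One common option is random oversampling of minority classes, sketched below; the function name and array shapes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def oversample(images, labels, rng=None):
    """Balance a class-imbalanced dataset by randomly duplicating
    minority-class samples until every class matches the majority count.

    images: array of shape (N, ...); labels: integer array of shape (N,).
    """
    rng = rng or np.random.default_rng(0)
    classes, counts = np.unique(labels, return_counts=True)
    target = counts.max()  # size of the largest class
    idx = []
    for c in classes:
        members = np.flatnonzero(labels == c)
        idx.append(members)  # keep all original samples
        extra = target - members.size
        if extra > 0:
            # draw additional samples with replacement from this class
            idx.append(rng.choice(members, size=extra, replace=True))
    idx = np.concatenate(idx)
    return images[idx], labels[idx]
```

After balancing, every class contributes the same number of samples to training, which prevents the majority class from dominating the accuracy metric.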


Subjects
Deep Learning, Diabetes Mellitus, Diabetic Retinopathy, Humans, Diabetic Retinopathy/diagnostic imaging, Neural Networks (Computer), Blindness, Health Personnel
2.
Multimed Tools Appl; 81(3): 3297-3325, 2022.
Article in English | MEDLINE | ID: mdl-34345198

ABSTRACT

Robotics is one of today's most rapidly emerging technologies, used in applications ranging from complex rocket technology to crop monitoring in agriculture. Robots can be exceptionally useful in a smart hospital environment provided that they are equipped with improved vision capabilities for detecting and avoiding obstacles in their path, allowing them to perform their tasks without disturbance. In the particular case of Autonomous Nursing Robots, the essential issues are effective robot path planning for the delivery of medicines to patients, measuring the patient's body parameters through sensors, and interacting with and informing the patient, by means of voice-based modules, about the doctors' visiting schedule, his/her body parameter details, etc. This paper presents a complete Autonomous Nursing Robot which supports all the aforementioned tasks. We present a new Autonomous Nursing Robot system capable of operating in a smart hospital environment. The objective of the system is to identify the patient's room, perform robot path planning for the delivery of medicines to the patient, and measure the patient's body parameters, using a wireless BLE (Bluetooth Low Energy) beacon receiver together with BLE beacon transmitters at the respective patient rooms. Assuming that a wireless beacon is kept in the patient's room, the robot follows the beacon's signal, identifies the respective room and delivers the needed medicine to the patient. A new fuzzy controller system, which consists of three ultrasonic sensors and one camera, is developed to detect the optimal robot path and to avoid collisions with both stationary and moving obstacles. The fuzzy controller effectively detects obstacles in the robot's vicinity and makes proper decisions for avoiding them. The navigation of the robot is implemented on a BLE tag module using the AOA (Angle of Arrival) method.
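The abstract describes a fuzzy controller fed by three ultrasonic distance sensors. A minimal sketch of such a steering rule is shown below; the membership function, distance thresholds, and action names are illustrative assumptions, since the paper's actual rule base is not given here.

```python
def steering_decision(left_cm, front_cm, right_cm, near=30.0, far=80.0):
    """Tiny fuzzy-style controller: each sensor distance is mapped to a
    'blocked' membership in [0, 1], and the robot steers toward the
    least-blocked side when the front is obstructed."""
    def blocked(d):
        # full membership when closer than `near`, zero beyond `far`,
        # linear ramp in between
        if d <= near:
            return 1.0
        if d >= far:
            return 0.0
        return (far - d) / (far - near)

    b_left, b_front, b_right = blocked(left_cm), blocked(front_cm), blocked(right_cm)
    if b_front < 0.5:
        return "forward"       # path ahead is sufficiently clear
    if b_left < b_right:
        return "turn_left"     # left side is less blocked
    if b_right < b_left:
        return "turn_right"    # right side is less blocked
    return "stop"              # boxed in on all sides
```

A real implementation would defuzzify into a continuous steering angle rather than discrete actions, but the membership-then-decision structure is the same.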
The robot uses sensors to measure the patient's body parameters and uploads these data to the hospital patient database system in a private cloud mode. It also makes use of Google Assistant to interact with the patients. The robotic system was implemented on the Raspberry Pi using Matlab 2018b. The system performance was evaluated on a PC with an Intel Core i5 processor, while solar power was used to power the system. Several sensors, namely an HC-SR04 ultrasonic sensor, a Logitech HD 720p image sensor, a temperature sensor and a heart rate sensor, are used together with a camera to generate datasets for testing the proposed system. In particular, the system was tested on operations taking place in a private hospital in Tirunelveli, Tamilnadu, India. A detailed comparison is performed, using performance metrics such as Correlation, Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE), against the related works of Deepu et al., Huh and Seo, Chinmayi et al., Alli et al., Xu, Ran et al., and Lee et al. The experimental system validation showed that the fuzzy controller achieves very high accuracy in obstacle detection and avoidance, with a very low computational time for taking directional decisions. Moreover, the experimental results demonstrated that the robotic system achieves superior accuracy in detecting/avoiding obstacles compared to other systems of similar purpose presented in the related works.
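The comparison metrics named above (correlation, RMSE, and MAPE) have standard definitions and can be computed as follows; this is a generic sketch, not the authors' code.

```python
import numpy as np

def evaluation_metrics(actual, predicted):
    """Return (Pearson correlation, RMSE, MAPE in percent) between
    an actual and a predicted series. MAPE assumes no actual value is zero."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    corr = np.corrcoef(actual, predicted)[0, 1]
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    mape = np.mean(np.abs((actual - predicted) / actual)) * 100.0
    return corr, rmse, mape
```

Higher correlation and lower RMSE/MAPE indicate a closer match between predicted and measured body-parameter readings.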
