Results 1 - 6 of 6
1.
Int J Neural Syst ; 34(2): 2450005, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38063381

ABSTRACT

Autism Spectrum Disorder (ASD) is a complex and heterogeneous neurodevelopmental disorder that affects a significant proportion of the population, with estimates suggesting that about 1 in 100 children worldwide are affected. This study introduces a new Deep Neural Network for identifying ASD in children through gait analysis, using features extracted from the frames of video recordings of their walking patterns. The method is image-based and combines gait analysis with deep learning, offering a noninvasive and objective assessment of neurodevelopmental disorders while delivering high accuracy in ASD detection. The model follows a bimodal approach: two distinct Convolutional Neural Networks process two feature sets extracted from the same videos, and the features obtained from their convolutional stages are flattened and concatenated into a single vector, which serves as input to the fully connected layers for binary classification. This approach demonstrates the potential for effective ASD detection in children by combining gait analysis and deep learning techniques.
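
The bimodal architecture described above can be illustrated with a minimal sketch: two convolutional branches each process one feature set, their flattened outputs are concatenated, and fully connected layers produce the binary output. The layer sizes, input shapes, and framework choice (PyTorch) below are illustrative assumptions, not the paper's configuration.

# Minimal PyTorch sketch of a bimodal CNN: two convolutional branches whose
# flattened features are concatenated and fed to fully connected layers for
# binary classification. Layer sizes and input shapes are illustrative.
import torch
import torch.nn as nn


class CNNBranch(nn.Module):
    """One convolutional branch processing a single feature set."""

    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),  # fixed spatial size -> fixed flattened dimension
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.flatten(self.features(x), start_dim=1)  # shape (N, 32*4*4)


class BimodalGaitClassifier(nn.Module):
    """Concatenates the flattened outputs of two branches before the classifier head."""

    def __init__(self):
        super().__init__()
        self.branch_a = CNNBranch()
        self.branch_b = CNNBranch()
        self.classifier = nn.Sequential(
            nn.Linear(2 * 32 * 4 * 4, 128),
            nn.ReLU(),
            nn.Linear(128, 1),  # single logit for the binary ASD / non-ASD decision
        )

    def forward(self, x_a: torch.Tensor, x_b: torch.Tensor) -> torch.Tensor:
        merged = torch.cat([self.branch_a(x_a), self.branch_b(x_b)], dim=1)
        return self.classifier(merged)


if __name__ == "__main__":
    model = BimodalGaitClassifier()
    frames_a = torch.randn(8, 3, 64, 64)  # feature set 1 (dummy batch)
    frames_b = torch.randn(8, 3, 64, 64)  # feature set 2 (dummy batch)
    print(model(frames_a, frames_b).shape)  # torch.Size([8, 1])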


Subject(s)
Autism Spectrum Disorder , Deep Learning , Child , Humans , Autism Spectrum Disorder/diagnosis , Neural Networks, Computer , Video Recording/methods
2.
Sci Rep ; 13(1): 9786, 2023 06 16.
Article in English | MEDLINE | ID: mdl-37328550

ABSTRACT

Affective states are psycho-physiological constructs connecting mental and physiological processes. They can be represented in terms of arousal and valence according to Russell's model and can be extracted from physiological changes in the human body. However, a well-established optimal feature set and a classification method that is effective in terms of both accuracy and estimation time are not present in the literature. This paper aims to define a reliable and efficient approach for real-time affective state estimation. To this end, the optimal physiological feature set and the most effective machine learning algorithm for both binary and multi-class classification problems were identified. The ReliefF feature selection algorithm was implemented to define a reduced optimal feature set. Supervised learning algorithms, namely K-Nearest Neighbors (KNN), cubic and Gaussian Support Vector Machines, and Linear Discriminant Analysis, were implemented to compare their effectiveness in affective state estimation. The developed approach was tested on physiological signals acquired from 20 healthy volunteers during the presentation of images from the International Affective Picture System, conceived to induce different affective states. The ReliefF algorithm reduced the number of physiological features from 23 to 13. The performances of the machine learning algorithms were compared, and the experimental results showed that both accuracy and estimation time benefited from the use of the optimal feature set. Furthermore, the KNN algorithm proved to be the most suitable for affective state estimation. The results of the assessment of arousal and valence states on the 20 participants indicate that the KNN classifier, adopted with the 13 identified optimal features, is the most effective approach for real-time affective state estimation.
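
A minimal sketch of the described pipeline, combining ReliefF feature selection with a KNN classifier, is given below. It assumes the scikit-learn API and the skrebate implementation of ReliefF; the neighbor counts, cross-validation setup, and dummy data are illustrative, not the values used in the study.

# Sketch of a ReliefF + KNN pipeline: reduce 23 physiological features to 13,
# then classify with K-Nearest Neighbors. Parameters and data are placeholders.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from skrebate import ReliefF  # one available ReliefF implementation (assumed here)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 23))            # 23 physiological features (dummy data)
y = rng.integers(0, 2, size=200)          # binary arousal or valence labels (dummy)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("relieff", ReliefF(n_features_to_select=13, n_neighbors=20)),  # keep 13 features
    ("knn", KNeighborsClassifier(n_neighbors=5)),
])

scores = cross_val_score(pipeline, X, y, cv=5)
print("mean CV accuracy:", scores.mean())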


Subject(s)
Algorithms , Emotions , Humans , Machine Learning , Support Vector Machine
3.
Bioengineering (Basel) ; 10(1), 2023 Jan 04.
Article in English | MEDLINE | ID: mdl-36671635

ABSTRACT

The ability to finely control hand grip forces can be compromised by neuromuscular or musculoskeletal disorders. It is therefore recommended to include the training and assessment of grip force control in rehabilitation therapy. The benefits of robot-mediated therapy have been widely reported in the literature, and its combination with virtual reality and biofeedback can improve rehabilitation outcomes. However, existing systems for hand rehabilitation do not allow both monitoring/training the forces exerted by individual fingers and providing biofeedback. This paper describes the development of a system for the assessment and recovery of grip force control. An exoskeleton for hand rehabilitation was instrumented to sense grip forces at the fingertips, and two operation modalities are proposed: (i) active-assisted training to help the user reach target force values and (ii) virtual reality games, in the form of tracking tasks, to train and assess the user's grip force control. For the active-assisted modality, the control of the exoskeleton motors generated additional grip force at the fingertips, confirming the feasibility of this modality. The developed virtual reality games were positively received by the volunteers and allowed the performance of healthy and pathological users to be evaluated.
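
The active-assisted modality can be pictured as a simple force-tracking loop in which the motor command grows with the gap between the measured fingertip force and the target. The proportional law, gains, deadband, and names below are hypothetical illustrations, not the system's actual controller.

# Hypothetical sketch of an active-assisted grip force mode: a proportional law
# drives each exoskeleton motor to close the gap between the measured fingertip
# force and the target force. Gains and limits are assumptions.
from dataclasses import dataclass


@dataclass
class FingerAssist:
    k_p: float = 0.8          # proportional gain [command units per N]
    max_command: float = 1.0  # saturation of the motor command
    deadband: float = 0.2     # [N] tolerance around the target where no assistance is given

    def command(self, measured_force: float, target_force: float) -> float:
        error = target_force - measured_force
        if abs(error) <= self.deadband:
            return 0.0
        u = self.k_p * error
        return max(-self.max_command, min(self.max_command, u))


if __name__ == "__main__":
    assist = FingerAssist()
    # user is 1.5 N short of the 5 N target: the exoskeleton adds closing force
    print(assist.command(measured_force=3.5, target_force=5.0))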

4.
Appl Ergon ; 82: 102950, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31542573

ABSTRACT

Repetitive and intensive exercises during robot-aided rehabilitation may expose patients to inappropriate and unsafe postures. The introduction of sensory feedback can help the subject perform the rehabilitation task with an ergonomic posture. In this work, visual and vibrotactile feedback were introduced in a robotic platform for upper limb rehabilitation to ensure an ergonomic posture during rehabilitation. The two feedback modalities were used to provide information about incorrect neck and trunk posture. Ten healthy subjects were involved in this study. Each of them performed 3D reaching movements with the aid of the robotic platform in three different conditions, i.e. without feedback, with visual feedback, and with vibrotactile feedback, and a comparative analysis was carried out to evaluate feedback effectiveness, acceptance, and performance. Experimental results show that, without feedback, the subjects reach and maintain incorrect neck and trunk configurations that, if repeated, can lead to musculoskeletal disorders. Conversely, with visual or vibrotactile feedback, the subjects tend to correct inappropriate trunk and head posture during task execution.
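
The feedback logic can be sketched as a threshold check on estimated neck and trunk angles that triggers a cue on the selected channel. The angle limits, channel handling, and function names below are illustrative assumptions rather than the study's implementation.

# Hypothetical sketch of posture feedback: when estimated neck or trunk flexion
# leaves an ergonomic range, a cue is issued on the selected feedback channel.
from enum import Enum
from typing import Optional


class FeedbackChannel(Enum):
    NONE = "none"
    VISUAL = "visual"
    VIBROTACTILE = "vibrotactile"


# illustrative ergonomic limits in degrees (not the study's thresholds)
TRUNK_FLEXION_LIMIT_DEG = 20.0
NECK_FLEXION_LIMIT_DEG = 25.0


def posture_feedback(trunk_flexion_deg: float,
                     neck_flexion_deg: float,
                     channel: FeedbackChannel) -> Optional[str]:
    """Return a cue description when posture exceeds the limits, None otherwise."""
    bad_trunk = trunk_flexion_deg > TRUNK_FLEXION_LIMIT_DEG
    bad_neck = neck_flexion_deg > NECK_FLEXION_LIMIT_DEG
    if channel is FeedbackChannel.NONE or not (bad_trunk or bad_neck):
        return None
    segment = "trunk" if bad_trunk else "neck"
    if channel is FeedbackChannel.VISUAL:
        return f"show on-screen warning: straighten {segment}"
    return f"activate vibration cue: straighten {segment}"


if __name__ == "__main__":
    print(posture_feedback(27.0, 10.0, FeedbackChannel.VIBROTACTILE))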


Subject(s)
Feedback, Sensory , Posture/physiology , Rehabilitation/instrumentation , Robotics/instrumentation , Upper Extremity , Equipment Design , Ergonomics , Humans
5.
Front Neurorobot ; 12: 67, 2018.
Article in English | MEDLINE | ID: mdl-30364325

ABSTRACT

The design of patient-tailored rehabilitative protocols is one of the crucial factors that influence motor recovery mechanisms, such as neuroplasticity. This approach, which includes the patient in the control loop and is characterized by a control strategy adaptable to the user's requirements, is expected to significantly improve functional recovery in robot-aided rehabilitation. In this paper, a novel 3D bio-cooperative robotic platform is developed. A new arm-weight support system is included in an operational robotic platform for 3D upper limb robot-aided rehabilitation. The robotic platform is capable of adapting therapy characteristics to specific patient needs, thanks to biomechanical and physiological measurements, thus closing the control loop around the subject. The level of arm-weight support and the level of assistance provided by the end-effector robot are varied on the basis of muscular fatigue and biomechanical indicators. An assistance-as-needed approach is applied to provide the appropriate amount of assistance. The proposed platform was experimentally validated on 10 healthy subjects, who performed 3D point-to-point tasks in two different conditions, i.e., with and without assistance-as-needed. The results demonstrated the capability of the proposed system to properly adapt to the real needs of the patients. Moreover, the provided assistance was shown to reduce muscular fatigue without negatively influencing motion execution.
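
An assistance-as-needed rule of the kind described can be sketched as a simple update law that raises the assistance level when a fatigue or tracking-error indicator grows and lowers it otherwise. The thresholds, step size, and function signature below are hypothetical, not the platform's actual algorithm.

# Hypothetical assistance-as-needed update: increase assistance when the subject
# is fatigued or tracking poorly, decrease it when they perform well.
def adapt_assistance(assistance: float,
                     fatigue_index: float,
                     tracking_error: float,
                     fatigue_threshold: float = 0.7,
                     error_threshold: float = 0.05,
                     step: float = 0.1) -> float:
    """Return the updated assistance level, clipped to [0, 1]."""
    if fatigue_index > fatigue_threshold or tracking_error > error_threshold:
        assistance += step      # subject is struggling: provide more help
    else:
        assistance -= step      # subject performs well: challenge them more
    return min(1.0, max(0.0, assistance))


if __name__ == "__main__":
    level = 0.5
    for fatigue, err in [(0.8, 0.02), (0.4, 0.01), (0.3, 0.09)]:
        level = adapt_assistance(level, fatigue, err)
        print(round(level, 2))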

6.
IEEE Int Conf Rehabil Robot ; 2017: 1001-1006, 2017 07.
Article in English | MEDLINE | ID: mdl-28813952

ABSTRACT

People with a high level of disability experience great difficulty in performing activities of daily living and rely on their residual motor functions to operate assistive devices. The commercially available interfaces used to control assistive manipulators are typically based on joysticks and can be used only by subjects with residual upper-limb mobility. Many other solutions can be found in the literature, based on the use of multiple sensory systems for detecting human motion intention and state. Some of them require a high cognitive workload for the user; others are more intuitive and easy to use but have not been widely investigated in terms of usability and user acceptance. The objective of this work is to propose an intuitive and robust user interface for assistive robots that is unobtrusive for the user and easily adaptable to subjects with different levels of disability. The proposed user interface combines M-IMUs and EMG for the continuous control of an arm-hand robotic system. The system was experimentally validated and compared to a standard voice interface. Sixteen healthy subjects volunteered to participate in the study: 8 subjects used the combined M-IMU/EMG robot control, and 8 subjects used the voice control. The arm-hand robotic system, composed of the KUKA LWR 4+ and the IH2 Azzurra hand, was controlled to accomplish the daily living task of drinking. Performance indices and evaluation scales were adopted to assess the performance of the two interfaces.
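
The combined M-IMU/EMG control can be sketched as two mappings: forearm orientation drives an end-effector velocity command, while an EMG envelope crossing a threshold toggles hand closing. The gains, deadzone, thresholds, and function names below are illustrative assumptions, not the interface's actual implementation.

# Hypothetical sketch of a combined M-IMU/EMG interface: IMU angles -> Cartesian
# velocity command for the arm, EMG envelope -> open/close command for the hand.
import numpy as np


def imu_to_velocity(pitch_rad: float, yaw_rad: float,
                    gain: float = 0.2, deadzone_rad: float = 0.05) -> np.ndarray:
    """Map forearm pitch/yaw angles to a planar Cartesian velocity command [m/s]."""
    cmd = np.array([pitch_rad, yaw_rad], dtype=float)
    cmd[np.abs(cmd) < deadzone_rad] = 0.0   # ignore small involuntary motions
    return gain * cmd


def emg_hand_command(emg_envelope: float, threshold: float = 0.3) -> str:
    """Close the hand when the rectified/filtered EMG envelope exceeds the threshold."""
    return "close" if emg_envelope > threshold else "open"


if __name__ == "__main__":
    print(imu_to_velocity(0.2, -0.1))   # [ 0.04 -0.02]
    print(emg_hand_command(0.45))       # close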


Subject(s)
Disabled Persons/rehabilitation , Electromyography/instrumentation , Rehabilitation , Robotics/instrumentation , Self-Help Devices , Accelerometry/instrumentation , Adult , Equipment Design , Humans , Rehabilitation/instrumentation , Rehabilitation/methods , Young Adult