Results 1 - 16 of 16
1.
Comput Biol Med ; 179: 108839, 2024 Jul 12.
Article in English | MEDLINE | ID: mdl-39002320

ABSTRACT

BACKGROUND: Although early rehabilitation is important following a stroke, severely affected patients have limited options for intensive rehabilitation, as they are often bedridden. To create a system for early rehabilitation of the lower extremities in these patients, we combined the robotic manipulator ROBERT® with electromyography (EMG)-triggered functional electrical stimulation (FES) and developed a novel user-driven Assist-As-Needed (AAN) control. The method is based on a state machine able to detect user movement capability, assessed by the presence of an EMG trigger and the movement velocity, and to provide different levels of assistance as required by the patient (no support, FES only, and simultaneous FES and mechanical assistance). METHODS: To technically validate the system, we tested 10 able-bodied participants who were instructed to perform specific behaviors to test the system states while conducting knee extension and ankle dorsal flexion exercises. The system was also tested on two stroke patients to establish its clinical feasibility. RESULTS: The technical validation showed that the state machine correctly detected the participants' behavior and activated the target AAN state in more than 96% of the exercise repetitions. The clinical feasibility test showed that the system successfully recognized the patients' movement capacity and activated assistive states according to their needs, providing the minimal level of support required to exercise successfully. CONCLUSIONS: The system was technically validated and shown to be preliminarily clinically feasible. The present study shows that the novel system can be used to deliver exercises with a high number of repetitions while engaging the participants' residual capabilities through the AAN strategy.
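The three assistance levels can be illustrated with a minimal state-selection sketch. This is a hedged reconstruction: the function name, the velocity threshold, and the exact trigger-to-state logic are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a user-driven Assist-As-Needed (AAN) state
# selection, as described in the abstract. State names, the threshold,
# and the decision order are illustrative assumptions.

def aan_state(emg_trigger: bool, velocity: float,
              velocity_threshold: float = 0.1) -> str:
    """Select an assistance level for one exercise repetition.

    - No EMG trigger: the patient cannot initiate the movement,
      so provide simultaneous FES and mechanical assistance.
    - EMG trigger but slow movement: provide FES only.
    - EMG trigger and sufficient velocity: no support needed.
    """
    if not emg_trigger:
        return "FES_AND_MECHANICAL"
    if velocity < velocity_threshold:
        return "FES_ONLY"
    return "NO_SUPPORT"
```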

2.
IEEE Int Conf Rehabil Robot ; 2022: 1-5, 2022 07.
Article in English | MEDLINE | ID: mdl-36176141

ABSTRACT

This study describes an interdisciplinary approach to developing a 5-degrees-of-freedom assistive upper limb exoskeleton (ULE) for users with severe to complete functional tetraplegia. Four application levels were identified for the ULE, ranging from basic technical application to interaction with users, interaction with caregivers, and interaction with society, each level posing requirements for the design and functionality of the ULE. These requirements were addressed through an interdisciplinary collaboration involving users, clinicians, and researchers within social sciences and humanities, mechanical engineering, control engineering, media technology, and biomedical engineering. The results showed that the developed ULE, the EXOTIC, had a high level of usability, safety, and adoptability. Further, the results showed that several topics are important to address explicitly to facilitate interdisciplinary collaboration, including defining a common language, a joint visualization of the end goal, and a physical frame for the collaboration, such as a shared laboratory. The study underlined the importance of interdisciplinarity, and we believe that future collaboration amongst interdisciplinary researchers and centres, also at an international level, can strongly facilitate the usefulness and adoption of assistive exoskeletons and similar technologies.


Subject(s)
Disabled Persons , Exoskeleton Device , Humans , Motivation , Upper Extremity
3.
Article in English | MEDLINE | ID: mdl-35290187

ABSTRACT

Individuals with severe tetraplegia can benefit from brain-computer interfaces (BCIs). While most movement-related BCI systems focus on right/left hand and/or foot movements, very few studies have considered tongue movements to construct a multiclass BCI. The aim of this study was to decode four movement directions of the tongue (left, right, up, and down) from single-trial pre-movement EEG and to provide a feature and classifier investigation. In offline analyses (from ten individuals without a disability), detection and classification were performed using temporal, spectral, entropy, and template features, classified using either linear discriminant analysis, a support vector machine, a random forest, or a multilayer perceptron. Besides the 4-class classification scenario, all possible 3- and 2-class scenarios were tested to find the most discriminable movement types. Linear discriminant analysis achieved, on average, higher classification accuracies for both movement detection and classification. The right and down tongue movements provided the highest and lowest detection accuracy (95.3±4.3% and 91.7±4.8%), respectively. The 4-class classification achieved an accuracy of 62.6±7.2%, while the best 3-class classification (using left, right, and up movements) and 2-class classification (using left and right movements) achieved accuracies of 75.6±8.4% and 87.7±8.0%, respectively. Using only a combination of the temporal and template feature groups provided further classification accuracy improvements. Presumably, this is because these feature groups capture the movement-related cortical potentials, which differ noticeably between the left and right brain hemispheres for the different movements. This study shows that the cortical representation of the tongue is useful for extracting control signals for multiclass movement detection BCIs.
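As a rough illustration of the best-performing classifier family, the following is a minimal two-class linear discriminant sketch in pure Python. It assumes a diagonal pooled covariance for simplicity, and the data it is applied to are illustrative toy values, not the EEG features used in the study.

```python
# Simplified two-class linear discriminant (diagonal pooled covariance).
# This is a pedagogical sketch, not the study's LDA implementation.

def fit_lda(class0, class1):
    """Return (weights, threshold) for a diagonal-covariance LDA."""
    n_feat = len(class0[0])
    mean = lambda rows, j: sum(r[j] for r in rows) / len(rows)
    mu0 = [mean(class0, j) for j in range(n_feat)]
    mu1 = [mean(class1, j) for j in range(n_feat)]
    # Pooled per-feature variance over both classes.
    var = []
    for j in range(n_feat):
        devs = [(r[j] - mu0[j]) ** 2 for r in class0] + \
               [(r[j] - mu1[j]) ** 2 for r in class1]
        var.append(sum(devs) / (len(class0) + len(class1) - 2))
    # Discriminant direction: covariance-scaled mean difference.
    w = [(mu1[j] - mu0[j]) / var[j] for j in range(n_feat)]
    # Decision threshold at the midpoint of the projected class means.
    proj = lambda x: sum(wj * xj for wj, xj in zip(w, x))
    threshold = (proj(mu0) + proj(mu1)) / 2
    return w, threshold

def predict(w, threshold, x):
    """Classify a feature vector: 1 if above the threshold, else 0."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) > threshold else 0
```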


Subject(s)
Brain-Computer Interfaces , Electroencephalography , Hand , Humans , Movement , Tongue
4.
IEEE Trans Biomed Eng ; 68(6): 2011-2020, 2021 06.
Article in English | MEDLINE | ID: mdl-33449876

ABSTRACT

OBJECTIVE: This study investigates the functional performance of a novel prosthesis control scheme integrating an inductive tongue interface and myoelectric control. The tongue interface allowed direct selection of the desired grasp, while myoelectric signals were used to open and close the robotic hand. METHODS: The novel method was compared to a conventional sequential on/off myoelectric control scheme using functional tasks defined by the Assistive Hand Assessment protocol. Ten able-bodied participants were fitted with the SmartHand on their left forearm. They used both the conventional myoelectric control and the Tongue and Myoelectric Hybrid interface (TMH) to accomplish two activities of daily living (i.e., preparing a sandwich and gift wrapping). Sessions were video recorded, and the outcome measure was the completion time for the subtasks as well as the full tasks. RESULTS: The sandwich task was completed significantly faster, with a 19% decrease in completion time, using the TMH compared to the conventional sequential on/off myoelectric control scheme (p < 0.05). CONCLUSION: The results indicate that the TMH control scheme facilitates the active use of the prosthetic device by simplifying grasp selection, thereby leading to faster completion of challenging and relevant tasks involving bimanual activities.


Subject(s)
Artificial Limbs , Robotic Surgical Procedures , Activities of Daily Living , Electromyography , Hand , Hand Strength , Humans , Prosthesis Design , Tongue
5.
IEEE Trans Biomed Eng ; 68(8): 2552-2562, 2021 08.
Article in English | MEDLINE | ID: mdl-33513095

ABSTRACT

Individuals with tetraplegia have a challenging life due to a lack of independence and autonomy. Assistive robots have the potential to assist with the activities of daily living and thus improve the quality of life. However, an efficient and reliable control interface for severely disabled individuals is still missing. An intraoral tongue-computer interface (ITCI) for people with tetraplegia has previously been introduced and tested for controlling a robotic manipulator in a study deploying discrete tongue robot mapping. To improve the efficiency of the interface, the current study proposed the use of virtual buttons based on the ITCI and evaluated them in combination with a joystick-like control implementation, enabling continuous control commands. Twelve able-bodied volunteers participated in a three-day experiment. They controlled an assistive robotic manipulator through the tongue to perform two tasks: Pouring water in a cup (PW) and picking up a roll of tape (PUT). Four different tongue-robot mapping methods were compared. The results showed that using continuous commands reduced the task completion time by 16% and the number of commands of the PUT test by 20% compared with discrete commands. The highest success rate for completing the tasks was 77.8% for the PUT test and 100% for the PW test, both achieved by the control methods with continuous commands. Thus, the study demonstrated that incorporating continuous commands can improve the performance of the ITCI system for controlling robotic manipulators.


Subject(s)
Robotic Surgical Procedures , Robotics , Activities of Daily Living , Humans , Physical Functional Performance , Quality of Life , Tongue , User-Computer Interface
6.
IEEE Rev Biomed Eng ; 12: 138-153, 2019.
Article in English | MEDLINE | ID: mdl-30561350

ABSTRACT

With numerous rehabilitation systems in existence, classification and comparison become difficult, especially because of the many factors involved. Moreover, most current reviews are descriptive and do not provide systematic methods for the visual comparison of systems. This review proposes a method for classifying systems and representing them graphically, so that various characteristics of the different systems can easily be visualized at the same time. This method could be a first step toward standardizing the evaluation of gait rehabilitation systems. It evaluates four main modules (body weight support, reciprocal stepping mechanism, pelvis mechanism, and environment module) of 27 different gait systems based on a set of characteristics. The combination of these modular evaluations provides a description of the system "in the space of rehabilitation." The evaluation of each robotic module, based on specific characteristics, showed diverse tendencies. While there is growing interest in developing more sophisticated reciprocal stepping mechanisms, few researchers are dedicated to enhancing the properties of pelvis mechanisms.


Subject(s)
Gait Disorders, Neurologic/rehabilitation , Gait/physiology , Robotics , Telerehabilitation/trends , Biomechanical Phenomena/physiology , Equipment Design , Gait Disorders, Neurologic/physiopathology , Humans , Pelvis/physiopathology
7.
Annu Int Conf IEEE Eng Med Biol Soc ; 2018: 2483-2486, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30440911

ABSTRACT

This paper assesses the ability to speak while using an inductive tongue-computer interface. Lately, tongue-computer interfaces have been proposed for computer/robotic interfacing for individuals with tetraplegia. To be useful in home settings, these interfaces should be aesthetic and interfere as little as possible with the limited preserved functionality of individuals with tetraplegia. Since, from an aesthetic point of view, tongue interfaces are preferably entirely intra-oral, it is relevant to address their effect on speech. Here we show that reading more than 566 words while using an inductive tongue-computer interface results in a maximum sensor activation time of less than 0.6 s, which means that false activations can be avoided by a sensor dwell time of 0.6 s. Furthermore, we show that it is possible to speak while controlling a powered wheelchair with the inductive tongue-computer interface.
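The dwell-time result above implies a simple filtering rule: ignore any sensor contact shorter than 0.6 s, so that speech-related contacts do not issue commands. A minimal sketch (the function and argument names are hypothetical):

```python
# Illustrative dwell-time filter: a sensor contact counts as a command
# only if it lasts at least the dwell time (0.6 s per the finding above),
# so brief speech-related activations are ignored.

def dwell_filter(activations, dwell=0.6):
    """activations: list of (start_time, end_time) sensor contacts, in
    seconds. Return only those contacts long enough to be commands."""
    return [(s, e) for s, e in activations if e - s >= dwell]
```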


Subject(s)
Quadriplegia , Wheelchairs , Equipment Design , Humans , Tongue , User-Computer Interface
8.
IEEE Int Conf Rehabil Robot ; 2017: 925-928, 2017 07.
Article in English | MEDLINE | ID: mdl-28813939

ABSTRACT

This paper demonstrates how an assistive 6-DoF robotic arm with a gripper can be controlled manually using a tongue interface. The proposed method suggests that it is possible for a user to manipulate the surroundings with his or her tongue using the inductive tongue control system as deployed in this study. The sensors of an inductive tongue-computer interface were mapped to the Cartesian control of an assistive robotic arm. The resulting control system was tested by comparing manual control of the robot using a standard keyboard with control using the tongue interface. Two healthy subjects controlled the robotic arm to precisely move a bottle of water from one location to another. The results show that the tongue interface could fully control the robotic arm in a similar manner to the standard keyboard, resulting in the same number of successful manipulations and an average increase in task duration of up to 30% compared with the standard keyboard.
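The sensor-to-Cartesian mapping described above could, for example, be sketched as follows. The sensor IDs, axis assignments, and step size are illustrative assumptions, not the layout used in the study.

```python
# Hypothetical mapping of tongue-interface sensors to Cartesian jog
# commands for a 6-DoF arm with a gripper. IDs and axes are assumptions.

SENSOR_TO_COMMAND = {
    0: ("x", +1), 1: ("x", -1),        # forward / backward
    2: ("y", +1), 3: ("y", -1),        # left / right
    4: ("z", +1), 5: ("z", -1),        # up / down
    6: ("grip", +1), 7: ("grip", -1),  # open / close gripper
}

def jog_step(position, sensor_id, step=0.01):
    """Return the new (x, y, z) after one jog command, in metres.
    Gripper commands leave the Cartesian position unchanged."""
    axis, sign = SENSOR_TO_COMMAND[sensor_id]
    if axis == "grip":
        return position
    idx = "xyz".index(axis)
    new = list(position)
    new[idx] += sign * step
    return tuple(new)
```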


Subject(s)
Man-Machine Systems , Robotics/instrumentation , Robotics/methods , Self-Help Devices , Tongue/physiology , User-Computer Interface , Adult , Equipment Design , Humans , Male , Task Performance and Analysis
9.
Disabil Rehabil Assist Technol ; 9(4): 307-17, 2014 Jul.
Article in English | MEDLINE | ID: mdl-23931550

ABSTRACT

PURPOSE: To evaluate the typing and pointing performance, and the improvement over time, of four able-bodied participants using an intra-oral tongue-computer interface for computer control. BACKGROUND: A physically disabled individual may lack the ability to efficiently control standard computer input devices. There have been several efforts to produce and evaluate interfaces that provide individuals with physical disabilities the possibility to control personal computers. METHOD: Training with the intra-oral tongue-computer interface was performed by playing games over 18 sessions. Skill improvement was measured through typing and pointing exercises at the end of each training session. RESULTS: Typing throughput improved from an average of 2.36 to 5.43 correct words per minute. Pointing throughput improved from an average of 0.47 to 0.85 bits/s. Target tracking performance, measured as relative time on target, improved from an average of 36% to 47%. Path following throughput improved from an average of 0.31 to 0.83 bits/s and decreased to 0.53 bits/s with more difficult tasks. CONCLUSIONS: The learning curves support the notion that the tongue can rapidly learn novel motor tasks. Typing and pointing performance with the tongue-computer interface is comparable to that of other proficient assistive devices, which makes the tongue a feasible input organ for computer control. IMPLICATIONS FOR REHABILITATION: Intra-oral computer interfaces could provide individuals with severe upper-limb mobility impairments the opportunity to control computers and automatic equipment. Typing and pointing performance with the tongue-computer interface is comparable to that of other proficient assistive devices; moreover, the interface does not easily cause fatigue and can be invisible to other people, which is highly prioritized by assistive device users. A combination of visual and auditory feedback is vital for good performance of an intra-oral computer interface and helps to reduce involuntary or erroneous activations.


Subject(s)
Computer Peripherals , Psychomotor Performance/physiology , Self-Help Devices , Tongue , User-Computer Interface , Adult , Equipment Design , Female , Humans , Male , Middle Aged , Practice, Psychological
10.
IEEE Trans Biomed Eng ; 59(1): 174-82, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21954196

ABSTRACT

This study assessed the ability of the tongue tip to accurately select intraoral targets embedded in an upper palatal tongue-computer interface, using 18 able-bodied volunteers. Four performance measures, based on modifications to Fitts's law, were determined for three different tongue-computer interface layouts. The layouts differed with respect to the number and location of the targets in the palatal interface. Assessment of intraoral target selection speed and accuracy revealed that performance was indeed dependent on the location of, and distance between, the targets. Performance was faster and more accurate for targets located farther from the base of the tongue than for posterior and medial targets. A regression model was built that predicted intraoral target selection time, based on target location and movement amplitude, better than a standard Fitts's law model did. A 30% improvement in speed and accuracy over three daily practice sessions of 30 min emphasizes the remarkable motor learning abilities of the tongue musculature and provides further evidence that the tongue is useful for operating computer-interface technologies.
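For reference, the standard Fitts's law quantities against which such models are compared can be computed as below. This uses the common Shannon formulation; any amplitude and width values passed in are illustrative, not measurements from the study.

```python
# Standard Fitts's law quantities (Shannon formulation), shown as a
# baseline sketch; not the study's modified regression model.
import math

def index_of_difficulty(amplitude, width):
    """ID = log2(A / W + 1), in bits, for movement amplitude A
    and target width W (same units)."""
    return math.log2(amplitude / width + 1)

def throughput(amplitude, width, movement_time):
    """Throughput in bits per second for one selection."""
    return index_of_difficulty(amplitude, width) / movement_time
```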


Subject(s)
Learning/physiology , Movement/physiology , Palate/physiology , Task Performance and Analysis , Tongue/physiology , Touch/physiology , User-Computer Interface , Adult , Biofeedback, Psychology/methods , Biofeedback, Psychology/physiology , Female , Humans
11.
Article in English | MEDLINE | ID: mdl-19963972

ABSTRACT

This paper presents the development of a character activation time prediction model for tongue-typing. This model is based on a modification of Fitts's law that is more suitable for tip-of-tongue selectivity tasks around the palatal area. The model was trained and evaluated with data from tongue-selectivity experiments using an inductive tongue-computer interface. It takes into account the movement amplitude, target position, interactions between them, character disambiguation time and error correction time.


Subject(s)
Communication Aids for Disabled , Models, Biological , Psychomotor Performance/physiology , Tongue/physiology , Touch/physiology , User-Computer Interface , Word Processing , Computer Simulation , Computer-Aided Design , Equipment Design , Equipment Failure Analysis , Humans , Sensitivity and Specificity , Time Factors
12.
Article in English | MEDLINE | ID: mdl-19963971

ABSTRACT

This work describes a novel fully integrated inductive tongue computer interface for disabled people. The interface consists of an oral unit placed in the mouth, including inductive sensors, related electronics, a system for wireless transmission and a rechargeable battery. The system is activated using an activation unit placed on the tongue, and incorporates 18 inductive sensors, arranged in both a key area and a mouse-pad area. The system's functionality was demonstrated in a pilot experiment, where a typing rate of up to 70 characters/minute was obtained with an error rate of 3%. Future work will include tests with disabled subjects.


Subject(s)
Communication Aids for Disabled , Disabled Persons/rehabilitation , Signal Processing, Computer-Assisted/instrumentation , Telemetry/instrumentation , Tongue/physiology , User-Computer Interface , Computer-Aided Design , Equipment Design , Equipment Failure Analysis , Humans , Man-Machine Systems , Systems Integration
13.
Article in English | MEDLINE | ID: mdl-19964489

ABSTRACT

Effective human input devices for computer control are very important to quadriplegics and others with severe disabilities. This paper describes a framework for computer control without need for special PC software or drivers. The framework is based on a tongue control system recently developed at Center for Sensory-Motor Interaction (SMI), Aalborg University. The framework provides emulation of a standard USB keyboard and mouse, and allows tongue control of any computer using standard USB drivers available in all modern operating systems.


Subject(s)
Communication Aids for Disabled , Computer Peripherals , Man-Machine Systems , Signal Processing, Computer-Assisted/instrumentation , Telemetry/instrumentation , Tongue , User-Computer Interface , Feedback , Humans , Touch , Transducers
14.
Article in English | MEDLINE | ID: mdl-19965193

ABSTRACT

Experimental results for pointing tasks using a tongue control system are reported in this paper. Ten untrained subjects participated in the experiment. Both typing and pointing tasks were performed by each subject in three short-term training sessions on consecutive days. The system provided a key pad (14 sensors) and a mouse pad (10 sensors with joystick functionality) whose placements were interchanged (front, back) for half of the subjects. The pointing tasks consisted of selecting and tracking a target circle (of 50, 75, and 100 pixels diameter) that appeared randomly at each of 16 positions uniformly distributed along the perimeter of a layout circle of 250 pixels diameter. The throughput was 0.808 bits per second, and the time on target was 0.164 of the total tracking time. The pad layout, the subjects, the sessions, the target diameters, and the angle of the tracking direction had a statistically significant effect on the two performance measures. Long-term training is required to assess the improvement of user capability.
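The target geometry described above can be reconstructed as a short sketch: 16 positions uniformly distributed on the perimeter of a 250-pixel-diameter layout circle. The centre coordinates are assumptions.

```python
# Illustrative reconstruction of the pointing-task geometry: 16 target
# positions on the perimeter of a 250-pixel-diameter layout circle
# centred at (cx, cy). Centre coordinates are assumed, not from the paper.
import math

def target_positions(cx=0.0, cy=0.0, layout_diameter=250.0, n=16):
    """Return n (x, y) positions uniformly spaced on the layout circle."""
    r = layout_diameter / 2
    return [(cx + r * math.cos(2 * math.pi * k / n),
             cy + r * math.sin(2 * math.pi * k / n)) for k in range(n)]
```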


Subject(s)
Communication Aids for Disabled , Equipment Design/instrumentation , Self-Help Devices , Tongue/physiology , Computer Peripherals , Computers , Ergonomics/instrumentation , Humans , Man-Machine Systems , Software , Transducers , User-Computer Interface
15.
IEEE Trans Biomed Eng ; 55(1): 372-5, 2008 Jan.
Article in English | MEDLINE | ID: mdl-18232387

ABSTRACT

Single fiber action potentials (SFAPs) from peripheral nerves, such as recorded with cuff electrodes, can be modelled as the convolution of a source current and a weight function that describes the recording electrodes and the surrounding medium. It is shown that for cuff electrodes, the weight function is linearly scaled with the action potential (AP) velocity and that it is, therefore, possible to implement a model of the recorded SFAPs based on a wavelet multiresolution technique (filterbank), where the wavelet scale is proportional to the AP velocity. The model resulted in single fiber action potentials matching the results from other models with a goodness of fit exceeding 0.99. This formulation of the SFAP may serve as a basis for model-based wavelet analysis and for advanced cuff design.
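The convolution model described above, a recorded SFAP as the convolution of a source current with an electrode/medium weight function, can be sketched with a plain discrete convolution. The sequences below are toy values, not physiological data, and the velocity-dependent wavelet scaling of the weight function is not reproduced here.

```python
# Sketch of the SFAP model: recorded signal = source current convolved
# with a weight function describing the electrodes and medium. Toy
# sequences only; the wavelet-scale/velocity relation is not modelled.

def convolve(source, weight):
    """Full discrete linear convolution of two sequences."""
    n = len(source) + len(weight) - 1
    out = [0.0] * n
    for i, s in enumerate(source):
        for j, w in enumerate(weight):
            out[i + j] += s * w
    return out
```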


Subject(s)
Action Potentials/physiology , Diagnosis, Computer-Assisted/methods , Electrodiagnosis/methods , Models, Neurological , Nerve Fibers/physiology , Neural Conduction/physiology , Animals , Computer Simulation , Humans
16.
IEEE Trans Biomed Eng ; 53(12 Pt 2): 2594-7, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17152438

ABSTRACT

This letter introduces a new inductive tongue-computer interface to be used by disabled people for environmental control. The interface demands little effort from the user, provides the basis for an invisible interface, and has the potential to support a large number of commands.


Subject(s)
Man-Machine Systems , Self-Help Devices , Therapy, Computer-Assisted/instrumentation , Therapy, Computer-Assisted/methods , Tongue/physiology , User-Computer Interface , Equipment Design , Equipment Failure Analysis , Humans