Results 1 - 18 of 18
1.
J Neural Eng ; 21(2)2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38386506

ABSTRACT

Objective. A key challenge of virtual reality (VR) applications is to maintain a reliable human-avatar mapping. Users may lose the sense of controlling (sense of agency), owning (sense of body ownership), or being located (sense of self-location) inside the virtual body when they perceive erroneous interaction, i.e. a break-in-embodiment (BiE). However, the ways to detect such an inadequate event are currently limited to questionnaires or spontaneous reports from users. The ability to implicitly detect BiE in real-time would enable us to adjust the human-avatar mapping without interruption. Approach. We propose and empirically demonstrate a novel brain-computer interface (BCI) approach that monitors the occurrence of BiE based on the user's brain oscillatory activity in real-time to adjust the human-avatar mapping in VR. We collected EEG activity from 37 participants while they performed reaching movements with their avatar under different magnitudes of distortion. Main results. Our BCI approach seamlessly predicts the occurrence of BiE across varying magnitudes of erroneous interaction. The mapping was customized by a BCI-reinforcement learning (RL) closed-loop system to prevent BiE from occurring. Furthermore, a non-personalized BCI decoder generalizes to new users, enabling 'Plug-and-Play' ErrP-based non-invasive BCI. The proposed VR system allows customization of the human-avatar mapping without personalized BCI decoders or spontaneous reports. Significance. We anticipate that our newly developed VR-BCI can be useful for maintaining an engaging avatar-based interaction and a compelling immersive experience while detecting when users notice a problem and seamlessly correcting it.
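A minimal sketch of the kind of closed loop this abstract describes, under assumed names and shapes: a pretrained decoder turns a post-movement EEG epoch into a BiE probability, which is then used as a penalty signal to reduce the avatar-mapping distortion. The decoder, epoch dimensions, and update rule below are illustrative, not the paper's implementation.

```python
import numpy as np

def decode_bie_probability(eeg_epoch, weights, bias):
    """Toy linear decoder: flatten the epoch and apply a logistic readout."""
    score = float(np.dot(weights, eeg_epoch.ravel()) + bias)
    return 1.0 / (1.0 + np.exp(-score))

def update_mapping(distortion_gain, p_bie, learning_rate=0.1):
    """Reduce the mapping distortion in proportion to the decoded BiE probability."""
    return distortion_gain * (1.0 - learning_rate * p_bie)

rng = np.random.default_rng(0)
weights = rng.normal(size=64 * 128)   # assumed epoch size: 64 channels x 128 samples
distortion_gain = 0.5                 # current human-avatar mapping distortion

for trial in range(10):
    epoch = rng.normal(size=(64, 128))            # stand-in for a post-movement EEG epoch
    p_bie = decode_bie_probability(epoch, weights, bias=0.0)
    distortion_gain = update_mapping(distortion_gain, p_bie)
print(f"final distortion gain: {distortion_gain:.3f}")
```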


Subject(s)
Avatar , Virtual Reality , Humans , User-Computer Interface , Movement , Electroencephalography
2.
J Neural Eng ; 18(2)2021 02 26.
Article in English | MEDLINE | ID: mdl-33494072

ABSTRACT

Objective. In contrast to classical visual brain-computer interface (BCI) paradigms, which adhere to a rigid trial structure and restricted user behavior, electroencephalogram (EEG)-based decoding of visual recognition during our daily activities remains challenging. The objective of this study is to explore the feasibility of decoding the EEG signature of visual recognition in experimental conditions promoting our natural ocular behavior when interacting with a dynamic environment. Approach. In our experiment, subjects visually search for a target object among suddenly appearing objects in the environment while driving a car simulator. Given that subjects exhibit unconstrained overt visual behavior, we based our study on eye-fixation-related potentials (EFRPs). We report on gaze behavior and single-trial EFRP decoding performance (fixations on visually similar target vs. non-target objects). In addition, we demonstrate the application of our approach in a closed-loop BCI setup. Main results. To identify the target out of four symbol types along a road segment, the BCI system integrated the decoding probabilities of multiple EFRPs and achieved an average online accuracy of 0.37 ± 0.06 (12 subjects), statistically significantly above chance level. Using the acquired data, we performed a comparative study of classification algorithms (discriminating target vs. non-target) and feature spaces in a simulated online scenario. The EEG approaches yielded similarly moderate performances of at most 0.6 AUC, yet still statistically significantly above chance level. In addition, the gaze duration (dwell time) appears to be an additional informative feature in this context. Significance. These results show that visual recognition of sudden events can be decoded during active driving. This study therefore lays a foundation for assistive and recommender systems based on the driver's brain signals.
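One plausible way to integrate per-fixation decoding probabilities across multiple EFRPs, sketched with placeholder values: accumulate log-odds of "this fixation was on the target" per candidate symbol and pick the best-supported candidate. The probabilities and candidate count are illustrative, not data from the study.

```python
import numpy as np

def accumulate_fixations(fixations, n_candidates=4):
    """fixations: iterable of (candidate_index, p_target) pairs from a single-trial decoder."""
    log_evidence = np.zeros(n_candidates)
    for candidate, p_target in fixations:
        p = np.clip(p_target, 1e-6, 1 - 1e-6)
        log_evidence[candidate] += np.log(p / (1 - p))   # log-odds of "this is the target"
    return int(np.argmax(log_evidence)), log_evidence

# Illustrative fixation sequence: (which symbol was fixated, decoder's target probability)
fixations = [(0, 0.55), (2, 0.70), (0, 0.40), (2, 0.65), (1, 0.45)]
best, evidence = accumulate_fixations(fixations)
print("selected symbol:", best, "log-odds:", np.round(evidence, 2))
```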


Subject(s)
Automobile Driving , Brain-Computer Interfaces , Brain , Electroencephalography/methods , Evoked Potentials, Visual , Fixation, Ocular , Humans
3.
J Neural Eng ; 17(3): 036030, 2020 06 25.
Article in English | MEDLINE | ID: mdl-32442981

ABSTRACT

OBJECTIVE: Event-related potentials (ERPs), which reflect the cognitive response to external stimuli, are widely used in brain-computer interfaces. ERP waveforms are characterized by a series of components of particular latency and amplitude. Classical ERP decoding methods exploit this waveform characteristic and thus achieve high performance only if there is sufficient time- and phase-locking across trials. This condition is not fulfilled when the experimental tasks are challenging or when generalization across various experimental conditions is required. Features based on spatial covariances across channels can potentially overcome latency jitter and delays, since they aggregate information across time. APPROACH: We compared the performance stability of waveform and covariance-based features, as well as their combination, in two simulated scenarios: 1) generalization across experiments on error-related potentials and 2) dealing with larger latency jitter across trials. MAIN RESULTS: Features based on spatial covariances provide stable performance with only a minor decline under jitter levels of up to ± 300 ms, whereas the decoding performance with waveform features quickly drops from 0.85 to 0.55 AUC. Generalization across ErrP experiments also resulted in significantly more stable performance with covariance-based features. SIGNIFICANCE: The results confirmed our hypothesis that covariance-based features can be used to: 1) classify ERPs with higher intrinsic variability more reliably in challenging real-life applications and 2) generalize across related experimental protocols.
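A hedged sketch of covariance-based ERP features: each epoch (channels x samples) is summarized by its spatial covariance matrix, which aggregates information across time and is therefore less sensitive to latency jitter than raw waveform features. The classifier choice and synthetic data are illustrative, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def covariance_features(epochs):
    """epochs: (n_trials, n_channels, n_samples) -> vectorized upper-triangular covariances."""
    n_trials, n_channels, _ = epochs.shape
    iu = np.triu_indices(n_channels)
    feats = np.empty((n_trials, len(iu[0])))
    for i, epoch in enumerate(epochs):
        cov = np.cov(epoch)          # spatial covariance across channels
        feats[i] = cov[iu]
    return feats

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16, 128))  # 200 synthetic epochs, 16 channels, 128 samples
y = rng.integers(0, 2, size=200)     # synthetic labels (e.g., error vs. correct)
clf = LogisticRegression(max_iter=1000).fit(covariance_features(X), y)
```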


Subject(s)
Brain-Computer Interfaces , Electroencephalography , Evoked Potentials
4.
J Neural Eng ; 14(5): 056017, 2017 10.
Article in English | MEDLINE | ID: mdl-28696340

ABSTRACT

OBJECTIVE: Brain-machine interfaces (BMIs) have been proposed in closed-loop applications for neuromodulation and neurorehabilitation. This study describes the impact of different feedback modalities on the performance of an EEG-based BMI that decodes motor imagery (MI) of leg flexion and extension. APPROACH: We executed experiments in a lower-limb gait trainer (the legoPress) in which nine able-bodied subjects participated in three consecutive sessions based on a crossover design. A random forest classifier was trained on the offline session and tested online with visual and proprioceptive feedback, respectively. Post-hoc classification was conducted to assess the impact of feedback modality and learning effect (an improvement over time) on the simulated trial-based performance. Finally, we performed feature analysis to investigate the discriminant power and brain pattern modulations across subjects. MAIN RESULTS: (i) For real-time classification, the average accuracy was [Formula: see text]% and [Formula: see text]% for the two online sessions. The results were significantly higher than chance level, demonstrating the feasibility of distinguishing between MI of leg extension and flexion. (ii) For post-hoc classification, the performance with proprioceptive feedback ([Formula: see text]%) was significantly better than with visual feedback ([Formula: see text]%), while there was no significant learning effect. (iii) We report individual discriminant features and brain patterns associated with each feedback modality; these exhibited differences between the two modalities, although no general conclusion can be drawn. SIGNIFICANCE: The study reports a closed-loop brain-controlled gait trainer as a proof of concept for neurorehabilitation devices. We show the feasibility of decoding lower-limb movement in an intuitive and natural way. To the best of our knowledge, this is the first online study discussing the role of feedback modalities in lower-limb MI decoding. Our results suggest that proprioceptive feedback has an advantage over visual feedback, which could be used to improve robot-assisted strategies for motor training and functional recovery.
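A minimal sketch of the kind of pipeline named here: band-power features from EEG epochs fed to a random forest that discriminates leg flexion vs. extension MI. Band limits, sampling rate, epoch shapes, and data are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def band_power(epochs, fs=512, band=(8.0, 30.0)):
    """epochs: (n_trials, n_channels, n_samples) -> mean log power in `band` per channel."""
    freqs = np.fft.rfftfreq(epochs.shape[-1], d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    psd = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    return np.log(psd[..., mask].mean(axis=-1) + 1e-12)

rng = np.random.default_rng(0)
X = band_power(rng.normal(size=(120, 16, 1024)))   # 120 synthetic MI epochs
y = rng.integers(0, 2, size=120)                   # flexion vs. extension labels
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
```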


Subject(s)
Brain-Computer Interfaces , Brain/physiology , Feedback, Sensory/physiology , Gait/physiology , Proprioception/physiology , Visual Perception/physiology , Adult , Female , Humans , Lower Extremity/physiology , Male , Young Adult
5.
Med Biol Eng Comput ; 55(8): 1339-1352, 2017 Aug.
Article in English | MEDLINE | ID: mdl-27858227

ABSTRACT

Brain-Computer Interfaces (BCI) rely on the interpretation of brain activity to provide people with disabilities with an alternative/augmentative interaction path. In this light, BCI could be considered an enabling technology in many fields, including the control of Active and Assisted Living (AAL) systems. Interaction barriers could indeed be removed, enabling users with severe motor impairments to gain control over a wide range of AAL features. In this paper, a cost-effective BCI solution, targeted at (but not limited to) AAL system control, is presented. A custom hardware module is briefly reviewed, while the signal processing techniques are covered in more depth. Steady-state visual evoked potentials (SSVEP) are exploited in this work as the operating BCI protocol. In contrast with most common SSVEP-BCI approaches, we propose the definition of a prediction confidence indicator, which is shown to improve overall classification accuracy. The confidence indicator is derived without any subject-specific approach and is stable across users: it can thus be defined once and then shared between different persons, allowing a kind of Plug&Play interaction. Furthermore, by modelling rest/idle periods with the confidence indicator, it is possible to detect active control periods and separate them from "background activity", which is essential for real-time, self-paced operation. Finally, the indicator also allows the most appropriate observation window length to be chosen dynamically, improving the system's responsiveness and the user's comfort. Good results are achieved under such operating conditions, including, for instance, a false positive rate of 0.16 min-1, which outperforms current literature findings.
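A hedged sketch of an SSVEP classifier with a simple confidence indicator, using the common CCA recipe rather than the paper's exact method: CCA correlations against sine/cosine references are computed per stimulation frequency, and the margin between the best and second-best correlation acts as a subject-independent confidence; low-margin windows are treated as idle. All frequencies, thresholds, and data are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def reference(freq, fs, n_samples, n_harmonics=2):
    t = np.arange(n_samples) / fs
    comps = []
    for h in range(1, n_harmonics + 1):
        comps += [np.sin(2 * np.pi * h * freq * t), np.cos(2 * np.pi * h * freq * t)]
    return np.column_stack(comps)

def classify_window(eeg, freqs, fs, min_margin=0.05):
    """eeg: (n_samples, n_channels). Returns (predicted frequency index or None for idle, margin)."""
    corrs = []
    for f in freqs:
        ref = reference(f, fs, eeg.shape[0])
        u, v = CCA(n_components=1).fit(eeg, ref).transform(eeg, ref)
        corrs.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
    order = np.argsort(corrs)[::-1]
    margin = corrs[order[0]] - corrs[order[1]]     # confidence: gap to the runner-up
    return (int(order[0]) if margin >= min_margin else None), margin

rng = np.random.default_rng(0)
pred, margin = classify_window(rng.normal(size=(512, 8)), freqs=[8, 10, 12, 15], fs=256)
```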


Subject(s)
Brain-Computer Interfaces , Electroencephalography/instrumentation , Electroencephalography/methods , Evoked Potentials, Visual/physiology , Self-Help Devices , Signal Processing, Computer-Assisted/instrumentation , Visual Cortex/physiology , Brain Mapping/instrumentation , Brain Mapping/methods , Equipment Design , Equipment Failure Analysis , Reproducibility of Results , Sensitivity and Specificity , User-Computer Interface
6.
Annu Int Conf IEEE Eng Med Biol Soc ; 2016: 5733-5736, 2016 Aug.
Article in English | MEDLINE | ID: mdl-28269556

ABSTRACT

Movement-related cortical potentials (MRCPs) have been the subject of numerous studies. They accompany many self-initiated movements, which makes them a good candidate for incorporation in BCI paradigms. In this work we propose a novel experimental protocol involving natural control of a computer mouse and, based on EEG recordings from 5 subjects, show that it elicits MRCPs. We also show the feasibility of online detection of MRCPs by implementing a classification-based detection framework. Additionally, we discuss the adverse effects of the causality restriction on detection performance by implementing an additional offline approach that relaxes those restrictions and comparing the results. The best MRCP detection performance achieved on the recorded data was an average maximum accuracy of 0.76 with the offline approach and an average AUC of 0.953 with the online approach.
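A hedged sketch of online, classification-based MRCP detection: a classifier trained on low-frequency epochs is applied to a causal sliding window of the ongoing EEG, and a detection is declared when the score crosses a threshold. Window length, features, threshold, and data are illustrative assumptions, not the protocol's parameters.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def window_features(window):
    """window: (n_channels, n_samples) -> coarse low-frequency features (block-averaged samples)."""
    return window.reshape(window.shape[0], -1, 16).mean(axis=-1).ravel()

rng = np.random.default_rng(0)
X_train = np.stack([window_features(rng.normal(size=(8, 256))) for _ in range(200)])
y_train = rng.integers(0, 2, size=200)            # movement-preceding vs. rest epochs
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)

stream = rng.normal(size=(8, 5120))               # continuous 8-channel recording
for start in range(0, stream.shape[1] - 256, 64): # causal sliding window, 64-sample step
    score = lda.decision_function([window_features(stream[:, start:start + 256])])[0]
    if score > 1.5:
        print("MRCP detected at sample", start)
```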


Subject(s)
Electroencephalography/methods , Evoked Potentials , Movement/physiology , Signal Processing, Computer-Assisted , Female , Humans , Male , User-Computer Interface , Young Adult
7.
Neuroimage ; 120: 64-74, 2015 Oct 15.
Article in English | MEDLINE | ID: mdl-26169320

ABSTRACT

Electrophysiological and neuroimaging evidence suggests the existence of common mechanisms for monitoring erroneous events, independent of the source of errors. Previous works have described modulations of theta activity in the medial frontal cortex elicited by either self-generated errors or erroneous feedback. Similar patterns have recently been reported to appear after the observation of external errors. Here we report cross-regional interactions after the observation of errors at both the average and single-trial levels. We recorded scalp electroencephalography (EEG) signals from 15 subjects while they monitored the movement of a cursor on a computer screen. Connectivity patterns, estimated using multivariate autoregressive models, show increased error-related modulations of the information transfer in the theta and alpha bands between frontocentral and frontolateral areas. Conversely, a decrease of connectivity in the beta band is also observed. These network patterns are similar to those elicited by self-generated errors. However, since no motor response is required, they appear to be related to intrinsic mechanisms of error processing rather than to co-activation of motor areas. Notably, we demonstrate that cross-regional interaction patterns can be estimated on a trial-by-trial basis. These trial-specific patterns, consistent with the multi-trial analysis, convey discriminant information on whether a trial was elicited by observation of an erroneous action. Overall, our study supports the role of frequency-specific modulations in the medial frontal cortex in coordinating cross-regional activity during cognitive monitoring on a single-trial basis.
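A hedged sketch of connectivity estimation from a multivariate autoregressive (MVAR) model: fit a VAR to multichannel EEG, build the spectral transfer matrix H(f), and derive a directed transfer function (DTF)-style measure at a band-specific frequency. Model order, frequency, and the synthetic data are illustrative; the paper's exact estimator may differ.

```python
import numpy as np
from statsmodels.tsa.api import VAR

def transfer_matrix(coefs, freq, fs):
    """coefs: (order, k, k) VAR coefficients -> spectral transfer matrix H(f) = inv(A(f))."""
    order, k, _ = coefs.shape
    A = np.eye(k, dtype=complex)
    for lag in range(1, order + 1):
        A -= coefs[lag - 1] * np.exp(-2j * np.pi * freq * lag / fs)
    return np.linalg.inv(A)

def dtf(coefs, freq, fs):
    H = transfer_matrix(coefs, freq, fs)
    power = np.abs(H) ** 2
    return power / power.sum(axis=1, keepdims=True)   # normalized inflow per target channel

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 4))          # 1000 samples, 4 channels (synthetic)
results = VAR(data).fit(5)                 # MVAR model of order 5
theta_dtf = dtf(results.coefs, freq=6.0, fs=256.0)   # theta-band directed transfer
```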


Subject(s)
Cerebral Cortex/physiology , Electroencephalography/methods , Executive Function/physiology , Motion Perception/physiology , Nerve Net/physiology , Adult , Alpha Rhythm/physiology , Beta Rhythm/physiology , Female , Humans , Male , Theta Rhythm/physiology , Young Adult
8.
Article in English | MEDLINE | ID: mdl-25570292

ABSTRACT

Decoding user intention from non-invasive EEG signals is a challenging problem. In this paper, we study the feasibility of predicting the goal for controlling a robot arm in self-paced reaching movements, i.e., spontaneous movements that do not require an external cue. Our proposed system continuously estimates the goal throughout a trial, starting before movement onset, by online classification, and generates optimal trajectories for driving the robot arm to the estimated goal. Experiments using EEG signals from one healthy subject (right arm) yield smooth reaching movements of a simulated 7-degree-of-freedom KUKA robot arm in a planar center-out reaching task, with approximately 80% accuracy of reaching the actual goal.
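A hedged sketch of the control scheme described: decoder probabilities for candidate reach goals are accumulated over successive EEG windows, and the arm is continuously driven along a smooth (minimum-jerk) trajectory toward the currently most likely goal. Goal positions, probabilities, and dimensions are placeholders, not the study's values.

```python
import numpy as np

def min_jerk(start, goal, n_steps):
    s = np.linspace(0.0, 1.0, n_steps)[:, None]
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5          # minimum-jerk time scaling
    return start + blend * (goal - start)

goals = np.array([[0.4, 0.0], [0.28, 0.28], [0.0, 0.4], [-0.28, 0.28]])  # planar targets (m)
log_post = np.zeros(len(goals))
position = np.zeros(2)

rng = np.random.default_rng(0)
for window in range(8):
    probs = rng.dirichlet(np.ones(len(goals)))        # stand-in for decoder output per window
    log_post += np.log(probs + 1e-12)                 # accumulate evidence over windows
    target = goals[np.argmax(log_post)]
    position = min_jerk(position, target, n_steps=20)[1]   # advance one step toward the goal
print("estimated goal:", np.argmax(log_post), "current position:", np.round(position, 3))
```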


Subject(s)
Arm/physiology , Electroencephalography , Robotics , Brain-Computer Interfaces , Humans , Movement , Reward , Stroke/physiopathology
9.
Article in English | MEDLINE | ID: mdl-24110382

ABSTRACT

Controlling a brain-actuated device requires the participant to split visual attention between the device's interaction with its environment and the status information of the Brain-Computer Interface (BCI). Such parallel visual tasks partly conflict with the goal of achieving good and natural device control. Is there a way to free the visual channel from one of these tasks? To address this, a stimulation system based on 6 coin motors was developed, which provides a spatially continuous tactile illusion as BCI feedback, so that the visual channel can be devoted to the device. Several experiments are conducted in this work to optimize the tactile illusion patterns and to investigate the influence on the electroencephalogram (EEG). Finally, 6 healthy BCI participants compare visual with tactile feedback in online BCI recordings. The developed stimulator can be used without interfering with the EEG. All subjects were able to perceive this type of tactile feedback well, and no statistical degradation in online BCI performance could be identified between visual and tactile feedback.
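A hedged sketch of one way to produce a spatially continuous tactile illusion with a ring of coin motors: the perceived position between two adjacent motors is steered by splitting the drive intensity between them. Motor count, update rate, and the intensity law are assumptions, not the developed stimulator's exact parameters.

```python
import numpy as np

def motor_intensities(position, n_motors=6):
    """position in [0, n_motors): distribute intensity between the two nearest motors."""
    low = int(np.floor(position)) % n_motors
    high = (low + 1) % n_motors
    frac = position - np.floor(position)
    drive = np.zeros(n_motors)
    drive[low], drive[high] = np.sqrt(1 - frac), np.sqrt(frac)   # energy-preserving split
    return drive

for pos in np.linspace(0, 5.5, 12):       # sweep the illusory point around the ring
    print(round(float(pos), 2), np.round(motor_intensities(pos), 2))
```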


Subject(s)
Brain-Computer Interfaces , Feedback, Sensory , Touch/physiology , Visual Perception/physiology , Adult , Attention/physiology , Electric Stimulation , Electrodes , Electroencephalography , Female , Humans , Male , Young Adult
10.
Article in English | MEDLINE | ID: mdl-24110383

ABSTRACT

Motor-disabled end users have successfully driven a telepresence robot in a complex environment using a Brain-Computer Interface (BCI). However, to facilitate the interaction aspect that underpins the notion of telepresence, users must be able to voluntarily and reliably stop the robot at any moment, not just drive from point to point. In this work, we propose to exploit the user's residual muscular activity to provide a fast and reliable control channel that can start or stop the telepresence robot at any moment. Our preliminary results show that this hybrid approach not only increases accuracy, but also helps reduce the workload, and it was the preferred control paradigm of all the participants.
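A hedged sketch of a residual-EMG start/stop channel: the EMG envelope (rectified and smoothed) is compared against a threshold with hysteresis, toggling the robot between moving and stopped. Sampling rate, thresholds, and the synthetic signal are assumptions.

```python
import numpy as np

def emg_envelope(emg, fs, window_s=0.2):
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    return np.convolve(np.abs(emg), kernel, mode="same")

def start_stop_commands(envelope, on_thresh, off_thresh):
    """Hysteresis detector: emit a toggle when the envelope rises above on_thresh."""
    commands, armed = [], True
    for i, value in enumerate(envelope):
        if armed and value > on_thresh:
            commands.append(i)       # toggle robot state at this sample
            armed = False
        elif not armed and value < off_thresh:
            armed = True             # re-arm once the muscle relaxes
    return commands

rng = np.random.default_rng(0)
fs = 1000
emg = rng.normal(scale=0.05, size=5 * fs)
emg[2000:2300] += rng.normal(scale=0.8, size=300)      # simulated brief contraction
toggles = start_stop_commands(emg_envelope(emg, fs), on_thresh=0.3, off_thresh=0.1)
```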


Subject(s)
Brain-Computer Interfaces , Robotics/instrumentation , Telemedicine/instrumentation , Adult , Electroencephalography , Electromyography , Humans , Male
11.
Article in English | MEDLINE | ID: mdl-22255795

ABSTRACT

One of the main problems of both synchronous and asynchronous EEG-based BCIs is the need for an initial calibration phase before the system can be used. This phase is necessary due to the high non-stationarity of the EEG, which changes between sessions and users. The calibration process limits BCI systems to scenarios where the outputs are very controlled, and makes these systems unfriendly and exhausting for users. Although reducing calibration time has been studied for asynchronous signals, it is still an open issue for event-related potentials. Here, we propose minimizing the calibration time for single-trial error potentials by using classifiers based on inter-subject information. The results show that it is possible to have a high-performing classifier from the beginning of the experiment, one which is able to adapt itself, making the calibration phase shorter and transparent to the user.
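A hedged sketch of the calibration-reduction idea: a generic error-potential classifier is first trained on pooled data from other subjects, then incrementally adapted with the new user's trials as labels become available, so usable output exists from the very first trial. Feature dimensions, classifier choice, and data are illustrative placeholders.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(600, 50))                 # pooled trials from previous subjects
y_pool = rng.integers(0, 2, size=600)
clf = SGDClassifier(loss="log_loss", random_state=0)
clf.partial_fit(X_pool, y_pool, classes=np.array([0, 1]))   # inter-subject prior model

for trial in range(40):                             # new subject's session
    x_new = rng.normal(size=(1, 50))
    pred = clf.predict(x_new)                       # usable prediction from trial one
    y_true = rng.integers(0, 2, size=1)             # label revealed after the trial
    clf.partial_fit(x_new, y_true)                  # incremental subject-specific adaptation
```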


Subject(s)
Brain/physiology , Man-Machine Systems , Adult , Algorithms , Brain/pathology , Calibration , Electrodes , Electroencephalography/methods , Equipment Design , Evoked Potentials , Female , Humans , Male , Neurophysiology/methods , Pattern Recognition, Automated/methods , Reproducibility of Results , Time Factors , User-Computer Interface
12.
Article in English | MEDLINE | ID: mdl-22254868

ABSTRACT

Motor imagery (MI) brain-computer interfaces (BCIs) translate a subject's motor intention to a command signal. Most MI BCIs use power features in the mu or beta rhythms, while several results have been reported using a measure of phase synchrony, the phase-locking value (PLV). In this study, we investigated the performance of various phase-based features, including instantaneous phase difference (IPD) and PLV, for control of a MI BCI. Patterns of phase synchrony differentially appear over the motor cortices and between the primary motor cortex (M1) and supplementary motor area (SMA) during MI. Offline results, along with preliminary online sessions, indicate that IPD serves as a robust control signal for differentiating between MI classes, and that the phase relations between channels are relatively stable over several months. Offline and online trial-level classification accuracies based on IPD ranged from 84% to 99%, whereas the performance for the corresponding amplitude features ranged from 70% to 100%.
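A hedged sketch of the phase features mentioned: band-pass two channels in the mu band, take instantaneous phases with the Hilbert transform, and compute the instantaneous phase difference (IPD) and the phase-locking value (PLV) across trials. Band edges, sampling rate, channel names, and the synthetic signals are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def band_phase(x, fs, band=(8.0, 12.0)):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, x, axis=-1), axis=-1))

def plv(trials_a, trials_b, fs):
    """trials_*: (n_trials, n_samples). PLV over trials for each time point."""
    dphi = band_phase(trials_a, fs) - band_phase(trials_b, fs)   # instantaneous phase difference
    return np.abs(np.exp(1j * dphi).mean(axis=0))

rng = np.random.default_rng(0)
fs, t = 256, np.arange(512) / 256
c3 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=(30, 512))    # 30 trials over "M1"
sma = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.normal(size=(30, 512))  # 30 trials over "SMA"
print("mean PLV:", plv(c3, sma, fs).mean().round(3))
```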


Subject(s)
Man-Machine Systems , Motor Cortex/physiology , User-Computer Interface , Bayes Theorem , Humans , Probability
13.
Article in English | MEDLINE | ID: mdl-22255272

ABSTRACT

In this paper we present the first results of users with disabilities mentally controlling a telepresence robot, a rather complex task as the robot is continuously moving and the user must control it for a long period of time (over 6 minutes) to complete the whole path. These two users drove the telepresence robot from their clinic, more than 100 km away. Remarkably, although the patients had never visited the location where the telepresence robot was operating, they achieved performance similar to that of a group of four healthy users who were familiar with the environment. In particular, the experimental results reported in this paper demonstrate the benefits of shared control for brain-controlled telepresence robots: it allows all subjects (including novel BMI subjects such as our users with disabilities) to complete a complex task in a similar time and with a similar number of commands to those required by manual control.


Subject(s)
Disabled Persons , Robotics , Female , Humans , Male
14.
Article in English | MEDLINE | ID: mdl-21096001

ABSTRACT

Practical Brain-Computer Interfaces (BCIs) for disabled people should allow them to use all their remaining functionalities as control possibilities. Sometimes these people have residual muscular activity, most likely in the morning when they are not exhausted. In this work we fuse electromyographic (EMG) with electroencephalographic (EEG) activity in the framework of a so-called hybrid BCI (hBCI). In this way, subjects can achieve good control of their hBCI independently of their level of muscular fatigue. Furthermore, although EMG alone yields good performance, it is outperformed by the hybrid fusion of EEG and EMG. Two different fusion techniques are explored, showing graceful performance degradation in the case of signal attenuation. Such a system allows very reliable control and a smooth handover if the subjects become exhausted or fatigued during the day.
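A hedged sketch of one plausible fusion rule for such a hybrid BCI: EEG and EMG classifier probabilities are combined as a weighted (naive Bayes-style) product, with the EMG weight lowered as the signal attenuates, giving the graceful degradation described. The weighting scheme is illustrative, not the paper's exact fusion techniques.

```python
import numpy as np

def fuse(p_eeg, p_emg, emg_reliability):
    """p_*: class-probability vectors; emg_reliability in [0, 1] scales the EMG vote."""
    log_p = np.log(p_eeg + 1e-12) + emg_reliability * np.log(p_emg + 1e-12)
    fused = np.exp(log_p - log_p.max())
    return fused / fused.sum()

p_eeg = np.array([0.62, 0.38])            # EEG classifier output (e.g., left vs. right)
p_emg = np.array([0.90, 0.10])            # EMG classifier output
for reliability in (1.0, 0.5, 0.0):       # fresh muscle -> fatigued -> no usable EMG
    print(reliability, np.round(fuse(p_eeg, p_emg, reliability), 3))
```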


Subject(s)
Brain/physiology , Man-Machine Systems , Muscles/physiology , Electroencephalography , Electromyography , Humans , Task Performance and Analysis
15.
IEEE Trans Biomed Eng ; 55(3): 923-9, 2008 Mar.
Article in English | MEDLINE | ID: mdl-18334383

ABSTRACT

Brain-computer interfaces (BCIs) are prone to errors in the recognition of the subject's intent. An elegant approach to improve the accuracy of BCIs consists of a verification procedure directly based on the presence of error-related potentials (ErrP) in the electroencephalogram (EEG) recorded right after the occurrence of an error. Several studies show the presence of ErrP in typical choice-reaction tasks. However, in the context of a BCI, the central question is: "Are ErrP also elicited when the error is made by the interface during the recognition of the subject's intent?" We have thus explored whether ErrP also follow feedback indicating incorrect responses of a simulated BCI interface. Five healthy volunteer subjects participated in a new human-robot interaction experiment, which seems to confirm the previously reported presence of a new kind of ErrP. However, in order to exploit these ErrP, we need to detect them in each single trial using a short window following the feedback associated with the response of the BCI. We achieved average recognition rates of correct and erroneous single trials of 83.5% and 79.2%, respectively, using a classifier built with data recorded up to three months earlier.


Subject(s)
Electroencephalography/methods , Evoked Potentials/physiology , Gyrus Cinguli/physiology , Imagination/physiology , Motor Cortex/physiology , Robotics/methods , User-Computer Interface , Algorithms , Artifacts , Feedback/physiology , Humans , Intention , Man-Machine Systems , Reproducibility of Results , Sensitivity and Specificity
16.
Article in English | MEDLINE | ID: mdl-18003064

ABSTRACT

Brain-Computer Interfaces (BCIs) need an uninterrupted flow of feedback to the user, which is usually delivered through the visual channel. Our aim is to explore the benefits of vibrotactile feedback during users' training and control of EEG-based BCI applications. An experimental setup for the delivery of vibrotactile feedback, including specific hardware and software arrangements, was specified. We compared vibrotactile and visual feedback, addressing the performance in the presence of a complex visual task on the same (visual) or a different (tactile) sensory channel. The preliminary experimental setup included a simulated BCI control, in which all parts reflected the computational and actuation process of an actual BCI, except the source, which was simulated using a "noisy" PC mouse. Results indicated that the vibrotactile channel can function as a valuable feedback modality with reliability comparable to classical visual feedback. The advantages of using vibrotactile feedback emerged when the visual channel was highly loaded by a complex task.


Subject(s)
Brain/physiology , Electroencephalography , Feedback , Humans , Magnetics , Shoulder , Touch , User-Computer Interface , Vibration
17.
Comput Intell Neurosci ; : 48937, 2007.
Article in English | MEDLINE | ID: mdl-18354734

ABSTRACT

To be correctly mastered, brain-computer interfaces (BCIs) need an uninterrupted flow of feedback to the user. This feedback is usually delivered through the visual channel. Our aim was to explore the benefits of vibrotactile feedback during users' training and control of EEG-based BCI applications. A protocol for delivering vibrotactile feedback, including specific hardware and software arrangements, was specified. In three studies with 33 subjects (including 3 with spinal cord injury), we compared vibrotactile and visual feedback, addressing: (I) the feasibility of subjects' training to master their EEG rhythms using tactile feedback; (II) the compatibility of this form of feedback in the presence of a visual distracter; (III) the performance in the presence of a complex visual task on the same (visual) or a different (tactile) sensory channel. The stimulation protocol we developed supports a general usage of the tactors beyond these preliminary experiments. All studies indicated that the vibrotactile channel can function as a valuable feedback modality with reliability comparable to classical visual feedback. The advantages of using vibrotactile feedback emerged when the visual channel was highly loaded by a complex task. In all experiments, vibrotactile feedback felt, after some training, more natural for both controls and SCI users.

18.
Comput Intell Neurosci ; : 25130, 2007.
Article in English | MEDLINE | ID: mdl-18354739

ABSTRACT

Controlling a robotic device using human brain signals is an interesting and challenging task. The device may be complicated to control, and the nonstationary nature of the brain signals provides a rather unstable input. With the use of intelligent processing algorithms adapted to the task at hand, however, the performance can be increased. This paper introduces a shared control system that helps the subject drive an intelligent wheelchair with a noninvasive brain interface. The subject's steering intentions are estimated from electroencephalogram (EEG) signals and passed through the shared control system before being sent to the wheelchair motors. Experimental results show the possibility of a significant improvement in overall driving performance when using the shared control system compared to driving without it. These results were obtained with 2 healthy subjects during their first day of training with the brain-actuated wheelchair.
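A hedged sketch of a shared-control blend: the BCI's estimated steering command is mixed with an obstacle-avoidance command, with the blend weight shifting toward the safety behaviour as obstacles get closer. The geometry and weighting law are illustrative only, not the wheelchair system's actual controller.

```python
import numpy as np

def shared_control(user_cmd, avoid_cmd, obstacle_dist, safe_dist=1.0):
    """Commands are (linear, angular) velocities; closer obstacles give the robot more say."""
    alpha = np.clip(obstacle_dist / safe_dist, 0.0, 1.0)   # 1 = trust user, 0 = trust safety layer
    return alpha * np.asarray(user_cmd) + (1.0 - alpha) * np.asarray(avoid_cmd)

user_cmd = (0.4, 0.0)          # decoded intention: go straight at 0.4 m/s
avoid_cmd = (0.1, 0.8)         # safety layer: slow down and turn away from the obstacle
for dist in (2.0, 0.8, 0.2):   # obstacle distance in metres
    print(dist, np.round(shared_control(user_cmd, avoid_cmd, dist), 2))
```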
