Results 1 - 20 of 24
1.
PLoS One ; 19(5): e0300128, 2024.
Article in English | MEDLINE | ID: mdl-38758733

ABSTRACT

Interpersonal touch plays a crucial role in human communication, development, and wellness. Mediated interpersonal touch (MIT), a technology for distant or virtually simulated interpersonal touch, has received significant attention as a means to counteract the negative consequences of touch deprivation. Studies investigating the effectiveness of MIT have primarily focused on self-report or behavioral correlates. It is largely unknown how MIT affects neural processes such as interbrain functional connectivity during human interactions. Given that users exchange haptic information simultaneously during interpersonal touch, interbrain functional connectivity provides a more ecologically valid way of studying the neural correlates associated with MIT. In this study, a palm squeeze task is designed to examine interbrain synchrony associated with MIT using EEG-based hyperscanning. The phase locking value (PLV) index is used to measure interbrain synchrony. Results demonstrate that MIT elicits a significant increase in alpha interbrain synchronization between participants' brains. In particular, there was a significant difference in the alpha PLV indices between the no-MIT and MIT conditions in the early stage (130-470 ms) of the interaction period (t-test, p < 0.05). Given the role that alpha interbrain synchrony plays during social interaction, the significant increase in the PLV index during MIT interaction seems to indicate an effect of social coordination. The findings and limitations of this study are further discussed, and perspectives on future research are provided.
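For readers who want to see how an interbrain synchrony measure of this kind is computed, the following is a minimal sketch of an alpha-band phase locking value between one EEG channel from each participant, assuming single-channel NumPy arrays and a known sampling rate; the band limits, filter order, and synthetic data are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(signal, low_hz, high_hz, fs, order=4):
    """Zero-phase band-pass filter (e.g., the 8-13 Hz alpha band)."""
    nyq = fs / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)

def alpha_plv(eeg_a, eeg_b, fs=256.0, low_hz=8.0, high_hz=13.0):
    """Phase locking value between channels from two participants, in [0, 1]."""
    phase_a = np.angle(hilbert(bandpass(eeg_a, low_hz, high_hz, fs)))
    phase_b = np.angle(hilbert(bandpass(eeg_b, low_hz, high_hz, fs)))
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

# Example with synthetic data: two noisy 10 Hz signals sharing a stable phase lag.
fs = 256.0
t = np.arange(0, 2.0, 1.0 / fs)
a = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
b = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * np.random.randn(t.size)
print(f"alpha PLV ≈ {alpha_plv(a, b, fs):.2f}")
```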


Subject(s)
Brain , Electroencephalography , Interpersonal Relations , Touch , Humans , Brain/physiology , Male , Female , Young Adult , Touch/physiology , Adult , Alpha Rhythm/physiology , Touch Perception/physiology , Social Interaction
2.
J Neural Eng ; 21(2)2024 Mar 22.
Article in English | MEDLINE | ID: mdl-38479013

ABSTRACT

Objective. Classifying motor imagery (MI) tasks that involve fine motor control of the individual five fingers presents unique challenges when utilizing electroencephalography (EEG) data. In this paper, we systematically assess the classification of MI for the individual five fingers using single-trial time-domain EEG signals. This assessment encompasses both within-subject and cross-subject scenarios, supported by data-driven analysis that provides statistical validation of the neural correlates that could potentially discriminate between the five fingers. Approach. We present Shapley-informed augmentation, an informed approach to enhance within-subject classification accuracy. This method is rooted in insights gained from our data-driven analysis, which revealed inconsistent temporal features encoding the five fingers' MI across sessions of the same subject. To evaluate its impact, we compare within-subject classification performance both before and after implementing this augmentation technique. Main results. Both the data-driven approach and the model explainability analysis revealed that the parietal cortex contains neural information that helps discriminate between the individual five fingers' MI. Shapley-informed augmentation successfully improved classification accuracy in sessions severely affected by inconsistent temporal features. The accuracy for sessions impacted by inconsistency in their temporal features increased by an average of 26.3% ± 6.70%, thereby enabling a broader range of subjects to benefit from brain-computer interface (BCI) applications involving five-finger MI classification. Conversely, non-impacted sessions experienced only a negligible average accuracy decrease of 2.01% ± 5.44%. The average classification accuracy achieved is around 60.0% (within-session), 50.0% (within-subject), and 40.0% (leave-one-subject-out). Significance. This research offers data-driven evidence of neural correlates that could discriminate between the individual five fingers' MI and introduces a novel Shapley-informed augmentation method to address the temporal variability of features, ultimately contributing to the development of personalized systems.
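The paper's Shapley-informed augmentation procedure is not reproduced here, but its underlying ingredient, per-feature Shapley values for a trained classifier, can be sketched with the `shap` package. The placeholder model, random data, and feature layout below are assumptions for illustration only.

```python
import numpy as np
import shap
from sklearn.linear_model import LogisticRegression

# Placeholder data: 100 trials x 32 temporal features, 5 finger classes.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 32))
y = rng.integers(0, 5, size=100)

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Kernel SHAP approximates Shapley values for any prediction function.
background = X[:20]                                  # small background set keeps it fast
explainer = shap.KernelExplainer(clf.predict_proba, background)
sv = np.abs(np.asarray(explainer.shap_values(X[:10])))   # explain the first 10 trials

# Average the magnitudes over everything except the feature dimension (size 32),
# giving one importance score per temporal feature.
importance = sv.mean(axis=tuple(ax for ax, n in enumerate(sv.shape) if n != X.shape[1]))
print(importance.shape)   # (32,)
```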


Subject(s)
Brain-Computer Interfaces , Imagination , Humans , Imagery, Psychotherapy , Fingers , Brain , Electroencephalography/methods , Algorithms
3.
IEEE Trans Haptics ; 17(1): 39-44, 2024.
Article in English | MEDLINE | ID: mdl-38224514

ABSTRACT

Although medical simulators have benefited from the use of haptics and virtual reality (VR) for decades, haptics has become the bottleneck in producing a low-cost, compact, and accurate training experience. This is particularly the case for the inferior alveolar nerve block (IANB) procedure in dentistry, which is one of the most difficult motor skills to acquire. As existing works are either oversimplified or overcomplicated for practical deployment, we introduce an origami-based haptic syringe interface for IANB local anesthesia training. By harnessing the versatile mechanical tunability of the Kresling origami pattern, our interface simulates the tactile experience of the plunger while injecting the anesthetic solution. We present the design, development, and characterization process, as well as a preliminary usability study. The force profile generated by the syringe interface is perceptually similar to that of the Carpule syringe. The usability study suggests that the haptic syringe significantly improves the IANB training simulation and has the potential to be utilized in several other medical training and simulation applications.


Subject(s)
Anesthesia, Local , Touch Perception , Humans , Syringes , Haptic Technology , User-Computer Interface , Computer Simulation , Clinical Competence
4.
IEEE Trans Haptics ; PP, 2023 Dec 29.
Article in English | MEDLINE | ID: mdl-38157457

ABSTRACT

The use of vibrotactile feedback, in place of a full-fledged force feedback experience, has recently received increased attention in the haptics community due to its clear advantages in terms of cost, expressiveness, and wearability. However, designers and engineers must balance a number of technical challenges when designing vibrotactile actuators, including expressiveness (a wide band of actuation frequencies), flexibility, and the complexity of the manufacturing process. To address these challenges, we present the design and characterization of an origami-inspired flexible vibrotactile actuator, named OriVib, with a tunable resonance frequency (expressiveness), an origami-inspired structure (flexible, soft contact with the human body), and a streamlined manufacturing process (low cost). Based on its characterization, the fabricated OriVib actuator with a 54 mm diameter can produce up to 1.2 g of vibration intensity, with the intensity increasing linearly over the 6-11 V input range. The resonance frequency is tunable through the characteristic diameter (the resonance frequency decreases in an almost inversely proportional fashion as the diameter increases). As for its thermal signature, the OriVib actuator maintains its temperature below 38 °C when actuated within 6-8 V. In terms of repeatability, the OriVib maintained an average vibration intensity of 0.849 g (standard deviation 0.035 g) for at least 2 million cycles. These results validate the OriVib as an expressive, low-cost, and flexible vibrotactile actuator.
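The reported linear relation between drive voltage and vibration intensity is the kind of characterization that can be summarized with an ordinary least-squares fit; the sketch below uses made-up measurements purely for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical characterization data: drive voltage (V) vs. measured peak intensity (g).
voltage = np.array([6.0, 7.0, 8.0, 9.0, 10.0, 11.0])
intensity_g = np.array([0.55, 0.68, 0.81, 0.95, 1.07, 1.20])

# First-order polynomial fit: intensity ≈ slope * voltage + offset.
slope, offset = np.polyfit(voltage, intensity_g, deg=1)
predicted = slope * voltage + offset
r_squared = 1 - np.sum((intensity_g - predicted) ** 2) / np.sum((intensity_g - intensity_g.mean()) ** 2)

print(f"intensity ≈ {slope:.3f} g/V * V {offset:+.3f} g (R² = {r_squared:.3f})")
```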

5.
Front Robot AI ; 10: 1193388, 2023.
Article in English | MEDLINE | ID: mdl-37779578

ABSTRACT

Introduction: Handwriting is a complex task that requires the coordination of motor, sensory, cognitive, memory, and linguistic skills to master. The extent to which these processes are involved depends on the complexity of the handwriting task. Evaluating the difficulty of a handwriting task is a challenging problem since it relies on the subjective judgment of experts. Methods: In this paper, we propose a machine learning approach for evaluating the difficulty level of handwriting tasks. We propose two convolutional neural network (CNN) models for single- and multilabel classification, where the single-label classification is based on the mean of expert evaluations while the multilabel classification predicts the distribution of experts' assessments. The models are trained with a dataset containing 117 spatio-temporal features from the stylus and hand kinematics, recorded for all letters of the Arabic alphabet. Results: While the single- and multilabel classification models achieve decent accuracy (96% and 88%, respectively) using all features, the hand kinematics features do not significantly influence the performance of the models. Discussion: The proposed models are capable of extracting meaningful features from the handwriting samples and predicting their difficulty levels accurately. The proposed approach has the potential to be used to personalize handwriting learning tools and provide automatic evaluation of handwriting quality.
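To make the single-label versus distribution-of-ratings distinction concrete, here is a minimal PyTorch sketch of a small 1D CNN over the 117 stylus/hand features, trained with a hard class index in one case and with soft expert-rating targets in the other. Treating the feature vector as a 1D sequence, the three difficulty levels, the layer sizes, and the loss choices are all assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

NUM_FEATURES = 117   # stylus + hand kinematics features per sample
NUM_LEVELS = 3       # assumed number of difficulty levels

class DifficultyCNN(nn.Module):
    """Small 1D-convolutional trunk followed by a linear classification head."""
    def __init__(self):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, NUM_LEVELS)

    def forward(self, x):                      # x: (batch, NUM_FEATURES)
        return self.head(self.trunk(x.unsqueeze(1)))   # raw logits

x = torch.randn(8, NUM_FEATURES)

# Single-label: hard class index derived from the mean of expert ratings.
hard_labels = torch.randint(0, NUM_LEVELS, (8,))
loss_single = nn.CrossEntropyLoss()(DifficultyCNN()(x), hard_labels)

# Distribution target: predict the distribution of expert ratings (soft labels).
soft_labels = torch.softmax(torch.randn(8, NUM_LEVELS), dim=1)   # placeholder distributions
log_probs = torch.log_softmax(DifficultyCNN()(x), dim=1)
loss_multi = nn.KLDivLoss(reduction="batchmean")(log_probs, soft_labels)
print(loss_single.item(), loss_multi.item())
```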

6.
J Neural Eng ; 20(5)2023 09 29.
Article in English | MEDLINE | ID: mdl-37732958

ABSTRACT

Objective. Single-trial electroencephalography (EEG) classification is a promising approach to evaluate the cognitive experience associated with haptic feedback. Convolutional neural networks (CNNs), which are among the most widely used deep learning techniques, have demonstrated their effectiveness in extracting EEG features for the classification of different cognitive functions, including the perception of vibration intensity that is often experienced during human-computer interaction. This paper proposes a novel CNN ensemble model to classify vibration intensity from single-trial EEG data that outperforms state-of-the-art EEG models. Approach. The proposed ensemble model, named SE NexFusion, builds upon the observed complementary learning behaviors of the EEGNex and TCNet Fusion models, exhibited in learning personal as well as generic neural features associated with vibration intensity. The proposed ensemble employs multi-branch feature encoders corroborated with squeeze-and-excitation units, which enable rich feature encoding while recalibrating the weighting of the obtained feature maps based on their discriminative power. The model takes a single trial of raw EEG as input and does not require complex EEG signal preprocessing. Main results. The proposed model outperforms several state-of-the-art benchmarked EEG models by achieving an average accuracy of 60.7% and 61.6% under leave-one-subject-out and within-subject cross-validation (three classes), respectively. We further validate the robustness of the model through the Shapley values explainability method, where the most influential spatio-temporal features of the model are cross-checked against the neural correlates that encode vibration intensity. Significance. Results show that SE NexFusion outperforms other benchmarked EEG models in classifying vibration intensity. Additionally, the explainability analysis confirms the robustness of the model in attending to features associated with the neural correlates of vibration intensity.
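Squeeze-and-excitation recalibration, the mechanism named in the ensemble's feature encoders, can be sketched in PyTorch as follows; the channel count and reduction ratio are generic assumptions rather than the SE NexFusion configuration.

```python
import torch
import torch.nn as nn

class SqueezeExcite1D(nn.Module):
    """Channel-wise recalibration for 1D feature maps of shape (batch, channels, time)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)            # "squeeze": global temporal average
        self.fc = nn.Sequential(                       # "excitation": per-channel gate
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        weights = self.fc(self.pool(x).squeeze(-1))    # (batch, channels), values in [0, 1]
        return x * weights.unsqueeze(-1)               # reweight each feature map

# Example: recalibrate a (batch=4, channels=32, time=256) EEG feature map.
feature_map = torch.randn(4, 32, 256)
print(SqueezeExcite1D(32)(feature_map).shape)          # torch.Size([4, 32, 256])
```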


Subject(s)
Deep Learning , Humans , Vibration , Neural Networks, Computer , Electroencephalography/methods , Algorithms
7.
IEEE Trans Haptics ; 16(4): 524-529, 2023.
Article in English | MEDLINE | ID: mdl-37126610

ABSTRACT

Reliable haptic interfaces augment human-computer interaction via simulated tactile and kinesthetic feedback. As haptic technologies advance, user experience evaluation becomes more crucial. Conventionally, self-reporting is used to evaluate haptic experiences; however, it can be inconsistent or imprecise due to human error. A promising alternative is using neurocognitive methods with machine or deep learning models to evaluate the human haptic experience. Machine and deep learning models can be trained on electroencephalography (EEG) data labeled based on either self-reports or the actual physical stimulation. As the literature lacks a systematic study on which approach is more robust, we develop a visuo-haptic task to answer this question by examining an important haptic experience, namely haptic delay. EEG is recorded during the experiment, and participants report whether they detected a delay in the haptic modality through self-report. Four machine/deep learning models were trained twice on the EEG data using the two labeling methods. Models trained with labels from the physical stimuli significantly outperformed those trained with self-report labels. Although this finding holds true for one particular haptic experience (haptic delay), it cannot be extrapolated to others; rather, it suggests that EEG data labeling plays a prominent role in evaluating the haptic experience through neurocognitive methods.


Subject(s)
Touch Perception , Humans , Haptic Interfaces , Haptic Technology , Self Report , Physical Stimulation , Machine Learning , Electroencephalography
8.
Front Neurosci ; 17: 1320417, 2023.
Article in English | MEDLINE | ID: mdl-38260029

ABSTRACT

Introduction: Thermal feedback technologies have been explored in human-computer interaction to provide secondary information and enhance the overall user experience. Unlike for fast-response haptic modalities such as vibration and force feedback, the brain processes associated with thermal feedback are not fully understood. Methods: In this study, we utilize electroencephalography (EEG) brain imaging to systematically examine the neural correlates associated with a wide range of thermal stimuli, including 9, 15, 32, and 42°C, during active touch at the fingertip. A custom experimental setup is developed to provide thermal stimulation at the desired temperature levels. A total of 30 participants are recruited to experience the four levels of thermal stimulation by actively touching a thermal stimulation unit with the index finger while brain activity is recorded via EEG. Time-frequency analysis and power spectral density (PSD) of the EEG data are utilized to analyze the delta, theta, alpha, beta, and gamma frequency bands. Results: The results show that the delta, theta, and alpha PSDs of the 9 and 15°C stimuli are significantly higher than the PSDs of the 32 and 42°C stimuli in the right frontal area during the early stage of the stimulation, from 282 ms up to 1,108 ms (one-way ANOVA, Holm-Bonferroni correction, p < 0.05). No significant differences in PSDs are found between the 9 and 15°C thermal stimuli or between the 32 and 42°C thermal stimuli. Discussion: The findings of this study inform the development of thermal feedback systems in human-computer interaction.
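Band-limited power spectral density of the kind reported here is commonly computed with Welch's method; the sketch below extracts per-band power from one EEG channel, with the sampling rate, window length, and synthetic data all assumed for illustration.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power(eeg, fs, band, nperseg=512):
    """Average PSD of one channel within a frequency band (same units as eeg^2/Hz)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(nperseg, len(eeg)))
    low, high = band
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

# Synthetic example: 2 s of noise sampled at 500 Hz.
fs = 500.0
eeg = np.random.randn(int(2 * fs))
for name, band in BANDS.items():
    print(f"{name:>5}: {band_power(eeg, fs, band):.4f}")
```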

9.
Front Neurosci ; 16: 961101, 2022.
Article in English | MEDLINE | ID: mdl-36330339

ABSTRACT

Haptic technologies enable users to physically interact with remote or virtual environments by applying force, vibration, or motion via haptic interfaces. However, the delivery of timely haptic feedback remains a challenge due to the stringent computation and communication requirements associated with haptic data transfer. Haptic delay disrupts the realism of the user experience and interferes with the quality of interaction. Research efforts have been devoted to studying the neural correlates of delayed sensory stimulation to better understand and thus mitigate the impact of delay. However, little is known about the functional neural networks that process haptic delay. This paper investigates the underlying neural networks associated with processing haptic delay in passive and active haptic interactions. Nineteen participants completed a visuo-haptic task using a computer screen and a haptic device while electroencephalography (EEG) data were being recorded. A combined approach based on phase locking value (PLV) functional connectivity and graph theory was used. To assess the effects of haptic delay on functional connectivity, we evaluate a global connectivity property through the small-worldness index and a local connectivity property through the nodal strength index. Results suggest that the brain exhibits significantly different network characteristics when a haptic delay is introduced. Haptic delay caused an increase in the small-worldness index in the delta and theta bands as well as an increased nodal strength index in the middle central region. Inter-regional connectivity analysis showed that the middle central region was significantly connected to the parietal and occipital regions as a result of haptic delay. These results are interpreted as indicating the detection of conflicting visuo-haptic information in the middle central region and its subsequent resolution and integration in the parietal and occipital regions.
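The graph-theoretic indices used here can be sketched from a channel-by-channel PLV matrix: nodal strength as the sum of a node's connection weights, and small-worldness as the clustering/path-length ratio of a thresholded graph relative to a size-matched random graph. The threshold, the random-reference choice, and the synthetic matrix below are illustrative assumptions only.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_channels = 16
plv = rng.uniform(0.2, 0.9, size=(n_channels, n_channels))
plv = (plv + plv.T) / 2                    # PLV matrices are symmetric
np.fill_diagonal(plv, 0.0)

# Nodal strength: sum of PLV weights attached to each electrode.
nodal_strength = plv.sum(axis=1)

# Binarize at an arbitrary threshold and build an undirected graph.
adjacency = (plv > 0.5).astype(int)
G = nx.from_numpy_array(adjacency)

# Small-worldness sigma = (C / C_rand) / (L / L_rand), with a size-matched random reference.
C = nx.average_clustering(G)
L = nx.average_shortest_path_length(G)
G_rand = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=1)
C_rand = nx.average_clustering(G_rand)
L_rand = nx.average_shortest_path_length(G_rand)
sigma = (C / C_rand) / (L / L_rand)

print("strongest node:", int(nodal_strength.argmax()), "sigma:", round(sigma, 2))
```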

10.
Front Robot AI ; 9: 1013043, 2022.
Article in English | MEDLINE | ID: mdl-36237844

ABSTRACT

Haptic technologies are becoming increasingly valuable in human-computer interaction systems as they provide means of physical interaction with a remote or virtual environment. One of the persistent challenges in tele-haptic systems, which communicate haptic information over a computer network, is the synchrony of the delivered haptic information with the rest of the sensory modalities. Delayed haptic feedback can have serious implications for user performance and the overall experience. Limited research efforts have been devoted to studying the implications of haptic delay on the human neural response and relating them to the overall haptic experience. Deep learning could offer autonomous interpretation of brain activity in response to a haptic experience such as haptic delay. In this work, we propose an ensemble of 2D CNN and transformer models that is capable of detecting the presence and severity of haptic delay from single-trial electroencephalography (EEG) data. Two EEG-based experiments involving visuo-haptic interaction tasks are proposed. The first experiment aims to collect data for detecting the presence of haptic delay during discrete force feedback using a bouncing ball on a racket simulation, while the second aims to collect data for detecting the severity level (none, mild, moderate, severe) of the haptic delay during continuous force feedback via grasping/releasing of an object in a bucket. The ensemble model showed a promising performance with an accuracy of 0.9142 ± 0.0157 for detecting haptic delay during discrete force feedback and 0.6625 ± 0.0067 for classifying the severity of haptic delay during continuous force feedback (four levels). These results were obtained by training the model with raw EEG data as well as their wavelet transforms using several wavelet kernels. This study is a step forward towards developing a cognitive evaluation of the user experience while interacting with haptic interfaces.
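Wavelet inputs of the general kind mentioned for the ensemble can be produced with a continuous wavelet transform; this sketch turns one EEG channel into a scalogram with PyWavelets, where the Morlet kernel, scale range, and synthetic signal are assumptions for illustration only.

```python
import numpy as np
import pywt

fs = 250.0                                   # assumed EEG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)   # synthetic single channel

# Map target frequencies (2-40 Hz) to wavelet scales for the chosen kernel.
wavelet = "morl"
freqs_hz = np.linspace(2, 40, 39)
fc = pywt.central_frequency(wavelet)          # dimensionless center frequency of the kernel
scales = fc * fs / freqs_hz                   # scale corresponding to each target frequency

# Continuous wavelet transform -> (n_freqs, n_samples) scalogram usable as a 2D-CNN input.
coeffs, freqs = pywt.cwt(eeg, scales, wavelet, sampling_period=1.0 / fs)
scalogram = np.abs(coeffs)
print(scalogram.shape)                        # (39, 500)
```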

11.
Sci Rep ; 12(1): 8869, 2022 05 25.
Article in English | MEDLINE | ID: mdl-35614196

ABSTRACT

The use of haptic technologies in modern life scenarios is becoming the new normal, particularly in rehabilitation, medical training, and entertainment applications. An evident challenge in haptic telepresence systems is the delay in haptic information. How humans perceive delayed visual and audio information has been extensively studied; however, the same for haptically delayed environments remains largely unknown. Here, we develop a visuo-haptic experimental setting that simulates a pick-and-place task and involves continuous haptic feedback stimulation with four possible haptic delay levels. The setting is built using a haptic device and a computer screen. We use electroencephalography (EEG) to study the neural correlates that could be used to identify the amount of the experienced haptic delay. EEG data were collected from 34 participants. Results revealed that midfrontal theta oscillation plays a pivotal role in quantifying the amount of haptic delay, while parietal alpha showed a significant modulation that encodes the presence of haptic delay. Based on the available literature, these results suggest that the amount of haptic delay is proportional to the neural activation associated with conflict detection and resolution as well as with multi-sensory divided attention.


Subject(s)
Haptic Technology , Theta Rhythm , Attention , Electroencephalography , Feedback , Humans , Theta Rhythm/physiology
12.
IEEE Trans Haptics ; 15(1): 74-78, 2022.
Article in English | MEDLINE | ID: mdl-35077368

ABSTRACT

Wearable haptic technologies have garnered recent widespread attention due to increased accessibility, functionality, and affordability. These systems typically provide haptic feedback to augment the human ability to interact with their environment. This study compares two haptic feedback modalities, vibrotactile and electrical muscle stimulation (EMS), against visual feedback in eliciting a motor response during active hand movement. Forty-five participants, divided into three groups, performed a task to touch their face and received one of three possible sensory feedback cues, namely visual, vibrotactile, and EMS, to interrupt their movement and avoid touching their face. Two quantitative performance measures are used in the comparison: the response time (time elapsed from stimulation to motor response) and the error rate (percentage of trials in which the user fails to avoid touching their face). Results showed that vibrotactile and EMS feedback yielded significantly faster response times than visual feedback, while no significant differences between vibrotactile and EMS were observed. Furthermore, the error rate was significantly lower for EMS compared to visual feedback, whereas no significant differences were observed between vibrotactile and visual feedback. In conclusion, EMS feedback seems preferable for applications where errors are not tolerable (e.g., critical medical applications), whereas vibrotactile feedback is superior for non-critical applications due to its low cost and higher usability (it is more pleasant than EMS).
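A between-group comparison of response times like the one reported here can be reproduced with a one-way ANOVA followed by pairwise tests; the arrays below are placeholder values, and the exact statistical procedure used in the paper may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical response times (ms) for the three 15-participant groups.
visual       = rng.normal(560, 60, 15)
vibrotactile = rng.normal(430, 55, 15)
ems          = rng.normal(440, 55, 15)

# Omnibus test across the three feedback modalities.
f_stat, p_omnibus = stats.f_oneway(visual, vibrotactile, ems)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_omnibus:.4f}")

# Pairwise follow-ups (Bonferroni correction over the three comparisons).
pairs = {"visual vs vibrotactile": (visual, vibrotactile),
         "visual vs EMS": (visual, ems),
         "vibrotactile vs EMS": (vibrotactile, ems)}
for name, (a, b) in pairs.items():
    t_stat, p = stats.ttest_ind(a, b)
    print(f"{name}: t = {t_stat:.2f}, corrected p = {min(p * 3, 1.0):.4f}")
```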


Subject(s)
Hand , Psychomotor Performance , Feedback , Feedback, Sensory , Hand/physiology , Humans , Muscles , Psychomotor Performance/physiology , Touch/physiology , Vibration
13.
Front Neurosci ; 15: 682113, 2021.
Article in English | MEDLINE | ID: mdl-34858124

ABSTRACT

Vibrotactile feedback technology has become widely used in human-computer interaction due to its low cost, wearability, and expressiveness. Although neuroimaging studies have investigated neural processes associated with different types of vibrotactile feedback, how vibration intensity is encoded in the brain remains largely unknown. The aim of this study is to investigate neural processes associated with vibration intensity using electroencephalography. Twenty-nine healthy participants (aged 18-40 years, nine females) experienced vibrotactile feedback at the distal phalanx of the left index finger under three vibration intensity conditions: no vibration, low-intensity vibration (1.56 g), and high-intensity vibration (2.26 g). The alpha- and beta-band event-related desynchronization (ERD), as well as the P2 and P3 event-related potential components, are obtained for each of the three vibration intensity conditions. Results demonstrate that the alpha-band ERD in the contralateral somatosensory and motor cortex areas is significantly associated with the vibration intensity. The average power spectral density (PSD) during the peak period of the ERD (400-600 ms) is significantly stronger for the high- and low-intensity conditions compared to the no-vibration condition. Furthermore, the average PSD during the ERD rebound (700-2,000 ms) is significantly maintained for the high-intensity condition compared to the low-intensity and no-vibration conditions. Beta ERD signals the presence of vibration. These findings inform the development of quantitative measurements of vibration intensity based on neural signals.
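Event-related desynchronization of the kind quantified here is conventionally expressed as the percentage power change relative to a pre-stimulus baseline; the sketch below applies that formula to a band-limited power envelope, with the band, baseline window, and synthetic trial all assumed for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def erd_percent(trial, fs, band=(8.0, 13.0), baseline=(-0.5, 0.0), window=(0.4, 0.6), t0=1.0):
    """ERD% = (baseline_power - task_power) / baseline_power * 100 for one epoch.

    `trial` is a single-channel epoch; `t0` is stimulus onset (s) from epoch start.
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    power = np.abs(hilbert(filtfilt(b, a, trial))) ** 2
    t = np.arange(trial.size) / fs - t0
    p_base = power[(t >= baseline[0]) & (t < baseline[1])].mean()
    p_task = power[(t >= window[0]) & (t < window[1])].mean()
    return (p_base - p_task) / p_base * 100.0

# Synthetic epoch: alpha activity that weakens after stimulus onset at t0 = 1 s.
fs = 250.0
t = np.arange(0, 3.0, 1.0 / fs)
alpha = np.where(t < 1.0, 1.0, 0.5) * np.sin(2 * np.pi * 10 * t)
trial = alpha + 0.2 * np.random.randn(t.size)
print(f"alpha ERD ≈ {erd_percent(trial, fs):.1f}%")
```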

14.
Sci Rep ; 11(1): 17074, 2021 08 23.
Article in English | MEDLINE | ID: mdl-34426593

ABSTRACT

Haptic technologies aim to simulate tactile or kinesthetic interactions with a physical or virtual environment in order to enhance user experience and/or performance. However, due to stringent communication and computational requirements, the user experience is affected by delayed haptic feedback. While delayed feedback is well understood in the visual and auditory modalities, little research has systematically examined the neural correlates associated with delayed haptic feedback. In this paper, we used electroencephalography (EEG) to study sensory and cognitive neural correlates of haptic delay during passive and active tasks performed using a haptic device and a computer screen. Results revealed that theta power oscillation was significantly higher in the midfrontal cortex in the presence of haptic delay. Sensory correlates represented by the beta rebound were found to be similar in the passive task and different in the active task under the delayed and synchronous conditions. Additionally, the event-related potential (ERP) P200 component was modulated under the haptic delay condition during the passive task. The P200 amplitude was significantly reduced in the last 20% of trials during the passive task and in the absence of haptic delay. Results suggest that haptic delay could be associated with increased cognitive control processes, including multi-sensory divided attention followed by conflict detection and resolution, with earlier detection during the active task. Additionally, haptic delay tends to generate greater perceptual attention that does not significantly decay across trials during the passive task.
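A P200 modulation like the one reported here is typically quantified as the peak amplitude of the trial-averaged ERP in a post-stimulus window around 150-275 ms; the sketch below does exactly that over a stack of epochs, with the window, sampling rate, and synthetic data chosen only for illustration.

```python
import numpy as np

def p200_peak(epochs, fs, t0=0.2, window=(0.15, 0.275)):
    """Peak amplitude and latency of the P200 in the trial-averaged ERP.

    `epochs` has shape (n_trials, n_samples); `t0` is stimulus onset (s) within each epoch.
    """
    erp = epochs.mean(axis=0)
    t = np.arange(erp.size) / fs - t0
    mask = (t >= window[0]) & (t <= window[1])
    idx = np.argmax(erp[mask])
    return erp[mask][idx], t[mask][idx]

# Synthetic epochs: a positive deflection around 200 ms after onset plus noise.
fs = 500.0
t = np.arange(-0.2, 0.8, 1.0 / fs)
template = 4.0 * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))       # Gaussian "P200"
epochs = template + np.random.randn(60, t.size)
amp, lat = p200_peak(epochs, fs)
print(f"P200 ≈ {amp:.2f} µV at {lat * 1000:.0f} ms")
```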

15.
IEEE Trans Haptics ; 14(4): 825-834, 2021.
Article in English | MEDLINE | ID: mdl-34038368

ABSTRACT

Handwriting is a fundamental human skill that is essential for communication, yet it is one of the most complex skills to master. Pen-based interaction with touchscreen devices is increasingly used in digital handwriting practice to simulate the pen-and-paper experience, but it is mostly based on auditory-visual feedback. Given that handwriting relies on visual and motor skills, haptic feedback has recently been explored to augment audio-visual systems and further support the handwriting process. In this article, we present an assistive platform entitled KATIB (meaning "writer" in Arabic) that provides high-fidelity kinesthetic feedback, in addition to audio-visual feedback, to support handwriting using magnetic forces. We propose novel contactless kinesthetic guidance methods, namely proactive and retroactive guidance, to guide the handwriting stylus along a desirable trajectory based on position control. Detaching the handwriting stylus from any mechanical device enables learners to have full control over grasping and moving at their own pace and style. The proposed platform is characterized for haptic interaction. Finally, a psychophysical experiment is conducted to validate that the kinesthetic guidance is perceivable and beneficial as sensory feedback, using a novel handwriting copy task. Contactless kinesthetic feedback seems to play a significant role in supporting digital handwriting by influencing the kinematics of the handwriting process.


Subject(s)
Handwriting , Kinesthesis , Feedback , Feedback, Sensory , Humans , Magnetic Phenomena
16.
Front Robot AI ; 8: 612392, 2021.
Article in English | MEDLINE | ID: mdl-33898529

ABSTRACT

Most people touch their faces unconsciously, for instance to scratch an itch or to rest their chin in their hands. To reduce the spread of the novel coronavirus (COVID-19), public health officials recommend against touching one's face, as the virus is transmitted through mucous membranes in the mouth, nose, and eyes. Students, office workers, medical personnel, and people on trains were found to touch their faces between 9 and 23 times per hour. This paper introduces FaceGuard, a system that utilizes deep learning to predict hand movements that result in touching the face and provides sensory feedback to stop the user from touching the face. The system utilizes an inertial measurement unit (IMU) to obtain features that characterize hand movement involving face touching. Time-series data can be efficiently classified using a 1D convolutional neural network (1D-CNN) with minimal feature engineering; 1D-CNN filters automatically extract temporal features from IMU data. Thus, a 1D-CNN-based prediction model is developed and trained with data from 4,800 trials recorded from 40 participants. Training data are collected for hand movements involving face touching during various everyday activities such as sitting, standing, or walking. Results showed that while the average time needed to touch the face is 1,200 ms, a prediction accuracy of more than 92% is achieved with less than 550 ms of IMU data. As for the sensory response, the paper presents a psychophysical experiment to compare the response time for three sensory feedback modalities, namely visual, auditory, and vibrotactile. Results demonstrate that the response time is significantly shorter for vibrotactile feedback (427.3 ms) compared to visual (561.70 ms) and auditory (520.97 ms) feedback. Furthermore, the success rate (of avoiding face touching) is also statistically higher for vibrotactile and auditory feedback compared to visual feedback. These results demonstrate the feasibility of predicting a hand movement and providing timely sensory feedback within less than a second in order to avoid face touching.
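A 1D-CNN of the general kind described, taking short windows of IMU data and predicting whether the ongoing hand movement will end in a face touch, can be sketched in PyTorch as follows; the channel count (accelerometer plus gyroscope), window length, and layer sizes are assumptions, not the FaceGuard architecture.

```python
import torch
import torch.nn as nn

WINDOW = 50      # assumed ~550 ms of IMU data at ~90 Hz
CHANNELS = 6     # 3-axis accelerometer + 3-axis gyroscope

class FaceTouchCNN(nn.Module):
    """Binary classifier: will this hand movement end in a face touch?"""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, 1),
        )

    def forward(self, x):                    # x: (batch, CHANNELS, WINDOW)
        return self.net(x).squeeze(-1)       # raw logits

model = FaceTouchCNN()
imu_window = torch.randn(16, CHANNELS, WINDOW)
labels = torch.randint(0, 2, (16,)).float()
loss = nn.BCEWithLogitsLoss()(model(imu_window), labels)
loss.backward()
print(f"example training loss: {loss.item():.3f}")
```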

17.
IEEE Trans Haptics ; 14(3): 626-634, 2021.
Article in English | MEDLINE | ID: mdl-33769937

ABSTRACT

Handwriting is an essential skill for developing sensorimotor and intellectual abilities in children. Handwriting constitutes a complex activity relying on cognitive, visual-motor, memory, and linguistic abilities, and is therefore challenging to master, especially for children with learning difficulties such as those with cognitive, sensorimotor, or memory deficits. Recently emerged haptic guidance systems have the potential to facilitate the acquisition of handwriting skills in both adults and children. In this paper, we present a longitudinal experimental study that examined the effects of haptic guidance on improving handwriting skills in children with cognitive and fine motor delays as a function of handwriting complexity in terms of visual familiarity and haptic difficulty. A haptic-based handwriting training platform that provides haptic guidance along the trajectory of a handwriting task was utilized. Twelve children with cognitive and fine motor delays, defined in terms of intellectual difficulty (IQ score) and mild motor difficulty in pincer grasp control, participated in the study. Children were divided into two groups, a target group and a control group. The target group completed haptic-guided training and a pencil-and-paper test, whereas the control group took only the pencil-and-paper test without any training. A total of 32 handwriting tasks were used in the study; 16 tasks were used for training, while all 32 tasks were completed for evaluation. Results demonstrated that the target group performed significantly better than the control group for handwriting tasks that are visually familiar but haptically difficult (Wilcoxon signed-rank test, p < 0.01). An improvement was also seen in the performance of untrained tasks as well as trained tasks (Spearman's correlation coefficient, 0.667; p = 0.05). In addition to confirming that haptic guidance can significantly improve motor functions, this study revealed a significant effect of task difficulty (visual familiarity and haptic complexity) on the effectiveness of haptic guidance for handwriting skill acquisition for children with cognitive and fine motor delays.


Subject(s)
Handwriting , Motor Skills , Adult , Child , Cognition , Hand Strength , Humans , Recognition, Psychology
18.
IEEE Trans Haptics ; 13(4): 825-830, 2020.
Article in English | MEDLINE | ID: mdl-32054586

ABSTRACT

The use of haptic technology has recently become essential in human-computer interaction to improve performance and user experience. Mid-air tactile feedback co-located with virtual touchscreen displays has great potential to improve performance in dual-task situations, such as when using a phone while walking or driving. The purpose of this article is to investigate the effects of augmenting a virtual touchscreen with mid-air tactile feedback to improve dual-task performance, where the primary task is driving in a simulation environment and the secondary task involves interacting with a virtual touchscreen. Performance metrics included primary task performance (velocity error, deviation from the middle of the road, number of collisions, and number of off-road glances), secondary task performance (interaction time and reach time), and quality of user experience (perceived difficulty and satisfaction). Results demonstrate that adding mid-air tactile feedback to the virtual touchscreen resulted in statistically significant improvements in the primary task performance (average speed error, spatial deviation, and number of off-road glances), the secondary task (reach time), and the perceived difficulty. These results provide strong motivation for augmenting virtual touchscreens with mid-air tactile feedback in dual-task human-computer interaction applications.


Subject(s)
Automobile Driving , Task Performance and Analysis , Computer Simulation , Feedback , Humans , User-Computer Interface
19.
IEEE Trans Haptics ; 12(4): 461-469, 2019.
Article in English | MEDLINE | ID: mdl-31247561

ABSTRACT

Haptic technologies have the potential to considerably improve the acquisition of handwriting skills by providing physical assistance that improves movement accuracy and precision. To date, very few studies have thoroughly examined the effectiveness of various haptic guidance methods in supporting the acquisition of handwriting skills. In this paper, we examine the role of several haptic guidance methods, namely full haptic guidance, partial haptic guidance, disturbance haptic guidance, and no haptic guidance, in improving the learning outcomes of handwriting skill acquisition for typical children. A group of 42 children from Cranleigh School Abu Dhabi across two educational stages, namely Foundation Stage 2 (FS2, 4-5 years old) and Year 2 (6-7 years old), participated in this study. Results showed that disturbance haptic guidance was the most effective for high-complexity handwriting tasks (such as writing the letters "o" and "g"), partial haptic guidance was the most effective for medium-complexity handwriting tasks (such as "t," "r," "s," "e," "n," "a," and "b"), and full haptic guidance was the most effective for low-complexity letters (such as "i"). Another interesting finding was that FS2 participants had a statistically significant improvement in handwriting speed compared to the Year 2 group, demonstrated by a significantly shorter test completion time. Furthermore, female children performed statistically better than their male counterparts under partial guidance. These results can be utilized to build more effective haptic-based handwriting tools for typical children.


Subject(s)
Handwriting , Motor Skills/physiology , Movement/physiology , Teaching , Child , Child, Preschool , Female , Humans , Male , Touch/physiology
20.
Front Neurorobot ; 13: 27, 2019.
Article in English | MEDLINE | ID: mdl-31191286

ABSTRACT

Tactile sensation largely influences human perception, for instance when using a mobile device or a touch screen. Active touch, which involves tactile and proprioceptive sensing under the control of movement, is the dominant tactile exploration mechanism compared to passive touch (being touched). This paper investigates the role of friction stimulation objectively and quantitatively in active touch tasks, during real human-computer interaction on a touch-screen device. In this study, 24 participants completed an active touch task that involved stroking the virtual strings of a guitar on a touch-screen device while the electroencephalography (EEG) signal was recorded. Statistically significant differences in beta and gamma oscillations in the middle frontal and parietal areas during the late period of the active touch task are found. Furthermore, stronger beta event-related desynchronization (ERD) and rebound in the presence of friction stimulation are observed in the contralateral parietal area. However, in the ipsilateral parietal area, there is a difference in beta oscillation only during the late period of the motor task. As for implicit emotion communication, a significant increase in emotional responses for valence, arousal, dominance, and satisfaction is observed when the friction stimulation is applied. It is argued that the friction stimulation felt at the participants' fingertips on a touch-screen device induces additional cognitive processing compared to when no friction stimulation is applied. This study provides objective and quantitative evidence that friction stimulation is able to affect bottom-up sensation and cognitive processing.
