Results 1 - 20 of 286
1.
J Exp Child Psychol ; 242: 105885, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38471382

ABSTRACT

Previous work has suggested a different developmental timeline and role of visual experience for the use of spatial and non-spatial features in haptic object recognition. To investigate this conjecture, we used a haptic ambiguous odd-one-out task in which one object needed to be selected as being different from two other objects. The odd-one-out could be selected based on four characteristics: size, shape (spatial), texture, and weight (non-spatial). We tested sighted children from 4 to 12 years of age; congenitally blind, late blind, and low-vision adults; and normally sighted adults. Given the protracted developmental time course for spatial perception, we expected a shift from a preference for non-spatial features toward spatial features during typical development. Due to the dominant influence of vision on spatial perception, we expected congenitally blind adults to show a similar preference for non-spatial features as the youngest children. The results confirmed our first hypothesis: the 4-year-olds demonstrated a lower dominance of spatial features for object classification compared with older children and sighted adults. In contrast, our second hypothesis was not confirmed; congenitally blind adults' preferred categorization criteria were indistinguishable from those of sighted controls. These findings suggest an early development, but late maturation, of spatial processing in haptic object recognition independent of visual experience.


Subject(s)
Child Development , Spatial Processing , Adult , Child , Humans , Adolescent , Child, Preschool , Haptic Technology , Space Perception , Visual Perception , Touch
2.
IEEE Trans Vis Comput Graph ; 30(5): 2247-2256, 2024 May.
Article in English | MEDLINE | ID: mdl-38437075

ABSTRACT

Physical QWERTY keyboards are the current standard for performing precision text-entry with extended reality devices. Ideally, there would exist a comparable, self-contained solution that works anywhere, without requiring external keyboards. Unfortunately, when physical keyboards are recreated virtually, we currently lose critical haptic feedback information from the sense of touch, which impedes typing. In this paper, we introduce the MusiKeys Technique, which uses auditory feedback in virtual reality to communicate missing haptic feedback information typists normally receive when using a physical keyboard. To examine this concept, we conducted a user study with 24 participants which encompassed four mid-air virtual keyboards augmented with increasing amounts of feedback information, along with a fifth physical keyboard for reference. Results suggest that providing clicking feedback on key-press and key-release improves typing performance compared to not providing auditory feedback, which is consistent with the literature. We also found that audio can serve as a substitute for information contained in haptic feedback, in that users can accurately perceive the presented information. However, under our specific study conditions, this awareness of the feedback information did not yield significant differences in typing performance. Our results suggest this kind of feedback replacement can be perceived by users but needs more research to tune and improve the specific techniques.


Subject(s)
Haptic Technology , Touch Perception , Humans , Equipment Design , Computer Graphics , Touch , User-Computer Interface
3.
IEEE Trans Vis Comput Graph ; 30(5): 2422-2433, 2024 May.
Article in English | MEDLINE | ID: mdl-38437136

ABSTRACT

Spatial search tasks are common and crucial in many Virtual Reality (VR) applications. Traditional methods to enhance the performance of spatial search often employ sensory cues such as visual, auditory, or haptic feedback. However, the design and use of bimanual haptic feedback with two VR controllers for spatial search in VR remain largely unexplored. In this work, we explored bimanual haptic feedback with various combinations of haptic properties, designing four types of bimanual haptic feedback for spatial search tasks in VR. Two experiments were designed to evaluate the effectiveness of bimanual haptic feedback on spatial direction guidance and search in VR. The results from the first experiment reveal that our proposed bimanual haptic schemes significantly enhanced the recognition of spatial directions in terms of accuracy and speed compared to spatial audio feedback. The second experiment's findings suggest that the performance of bimanual haptic feedback was comparable to or even better than that of a visual arrow, especially in reducing the angle of head movement and in enhancing search for targets behind the participants, which was also supported by subjective feedback. Based on these findings, we derived a set of design recommendations for spatial search using bimanual haptic feedback in VR.


Subject(s)
Haptic Technology , Virtual Reality , Humans , Feedback , Computer Graphics , Feedback, Sensory
4.
IEEE Trans Vis Comput Graph ; 30(5): 2839-2848, 2024 May.
Article in English | MEDLINE | ID: mdl-38498761

ABSTRACT

The inferior alveolar nerve block (IANB) is a dental anesthetic injection that is critical to the performance of many dental procedures. Dental students typically learn to administer an IANB through videos and practice on silicone molds and, in many dental schools, on other students. This causes significant stress for both the students and their early patients. To reduce discomfort and improve clinical outcomes, we created an anatomically informed virtual reality headset-based educational system for the IANB. It combines a layered 3D anatomical model, dynamic visual guidance for syringe position and orientation, and active force feedback to emulate syringe interaction with tissue. A companion mobile augmented reality application allows students to step through a visualization of the procedure on a phone or tablet. We conducted a user study to determine the advantages of preclinical training with our IANB simulator. We found that in comparison to dental students who were exposed only to traditional supplementary study materials, dental students who used our IANB simulator were more confident administering their first clinical injections, had less need for syringe readjustments, and had greater success in numbing patients.


Subject(s)
Augmented Reality , Nerve Block , Virtual Reality , Humans , Haptic Technology , Mandibular Nerve , Computer Graphics , Nerve Block/methods
5.
No Shinkei Geka ; 52(2): 279-288, 2024 Mar.
Article in Japanese | MEDLINE | ID: mdl-38514117

ABSTRACT

We established a unique pre-surgical simulation method by applying interactive virtual simulation (IVS) using multi-fusion three-dimensional imaging data, which presents high-quality visualization of microsurgical anatomy. Our IVS provided a realistic environment for imitating surgical manipulations, such as dissecting bone, retracting brain tissue, and removing tumors, with tactile and kinesthetic sensations delivered through a dedicated haptic device. The great advantage of our IVS was in deciding on the most appropriate craniotomy and bone resection to create the optimal surgical window and obtain the best working space with a thorough understanding of the lesion-bone relationship. Particularly for skull-base tumors, tailoring the craniotomy and bone resection to the individual patient was readily achieved using our IVS. In cases of large skull-base meningiomas, our IVS was also helpful preoperatively because the tumor's several compartments could be approached from every potentially usable surgical direction. Additionally, the risk-free, realistic microsurgical environment of the IVS improved the microsurgical sense and skills of young trainees through repetition of surgical tasks. Finally, our pre-surgical IVS method provided a realistic environment for practicing microsurgical procedures virtually and enabled us to ascertain the complex microsurgical anatomy, determine optimal surgical strategies, and efficiently educate neurosurgical trainees.


Subject(s)
Meningeal Neoplasms , Meningioma , Neurosurgery , Skull Base Neoplasms , Humans , Haptic Technology , Neurosurgical Procedures/methods , Meningioma/diagnostic imaging , Meningioma/surgery , Skull Base Neoplasms/surgery , Meningeal Neoplasms/surgery
6.
Cortex ; 173: 263-282, 2024 04.
Article in English | MEDLINE | ID: mdl-38432177

ABSTRACT

Current accounts of behavioral and neurocognitive correlates of plasticity in blindness are just beginning to incorporate the role of speech and verbal production. We assessed Vygotsky/Luria's speech mediation hypothesis, according to which speech activity can become a mediating tool for perception of complex stimuli, specifically for encoding tactual/haptic spatial patterns that convey pictorial information (haptic pictures). We compared verbalization in congenitally totally blind (CTB) and age-matched sighted but visually impaired (VI) children during a haptic picture-naming task that included two repeated (test-retest) identifications. The children were instructed to explore 10 haptic schematic pictures of objects (e.g., cup) and body parts (e.g., face) and provide (without experimenter feedback) their typical name. Children's explorations and verbalizations were video-recorded and transcribed into audio segments. Using the Computerized Analysis of Language (CLAN) program, we extracted several measurements from the observed verbalizations, including number of utterances and words, utterance/word duration, and exploration time. Using the Word2Vec natural language processing technique, we operationalized semantic content from the relative distances between the names provided. Furthermore, we conducted an observational content analysis in which three judges categorized verbalizations according to a rating scale assessing verbalization content. Results consistently indicated across all measures that the CTB children were faster and semantically more precise than their VI counterparts in the first identification test; however, the VI children reached the same level of precision and speed as the CTB children at retest. Overall, the task was harder for the VI group. Consistent with the current neuroscience literature, the prominent role of speech in the CTB and VI children's data suggests that an underlying cross-modal involvement of integrated brain networks, notably associated with Broca's network and likely also influenced by Braille, could play a key role in compensatory plasticity via the mediational mechanism postulated by Luria.
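To illustrate how semantic content can be operationalized from relative distances between names in an embedding space, a minimal Python sketch is given below. It is not the authors' analysis pipeline: the pretrained model name and the target/response word pairs are hypothetical, and the gensim library is assumed to be available.

```python
# Minimal sketch (not the study's code): score how semantically close a child's
# response is to the canonical picture name using cosine similarity in a
# pretrained word-embedding space. Model name and word pairs are hypothetical.
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-50")   # any pretrained KeyedVectors works

pairs = [("cup", "mug"), ("face", "head")]   # (canonical name, child's response)

for target, response in pairs:
    sim = model.similarity(target, response)  # cosine similarity; higher = closer
    print(f"{target!r} vs {response!r}: similarity = {sim:.2f}")
```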


Subject(s)
Haptic Technology , Speech , Child , Humans , Blindness/psychology , Vision Disorders , Touch
7.
BMC Ophthalmol ; 24(1): 103, 2024 Mar 05.
Article in English | MEDLINE | ID: mdl-38443841

ABSTRACT

PURPOSE: To measure the dislocation forces in relation to haptic material, flange size, and needle used. SETTING: Hanusch Hospital, Vienna, Austria. DESIGN: Laboratory investigation. METHODS, MAIN OUTCOME MEASURES: 30 G (gauge) thin-wall and 27 G standard needles were used for a 2 mm tangential scleral tunnel in combination with different PVDF (polyvinylidene fluoride) and PMMA (polymethylmethacrylate) haptics. Flanges were created by heating 1 mm of the haptic end, non-forceps assisted for PVDF and forceps assisted for PMMA haptics. The dislocation force was measured in non-preserved cadaver sclera using a tensiometer. RESULTS: The PVDF flanges achieved a mushroom-like shape and the PMMA flanges a conic shape. For 30 G needle tunnels, the dislocation forces for PVDF and PMMA haptic flanges were 1.58 ± 0.68 N (n = 10) and 0.70 ± 0.14 N (n = 9), respectively (p = 0.003). For 27 G needle tunnels, the dislocation forces for PVDF and PMMA haptic flanges were 0.31 ± 0.35 N (n = 3) and 0.0 N (n = 4), respectively. Flange size correlated with dislocation force in the 30 G needle tunnel experiments (r = 0.92) when flanges were larger than 384 micrometres. CONCLUSIONS: The highest dislocation forces were found for PVDF haptic flanges, with their characteristic mushroom-like shape, in 30 G thin-wall needle scleral tunnels. Forceps-assisted flange creation in PMMA haptics did not compensate for the disadvantage of their characteristic conic flange shape.
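As a rough illustration of the summary statistics reported above (group means ± SD and the correlation between flange size and dislocation force), the Python sketch below uses hypothetical values, not the study's data, and assumes a Welch t-test for the group comparison since the abstract does not name the test used.

```python
# Illustration only: hypothetical dislocation forces (N) and flange sizes (µm);
# these are not the values measured in the study.
import numpy as np
from scipy import stats

pvdf_30g = np.array([1.4, 1.7, 2.3, 0.9, 1.6, 1.1, 1.8, 2.0, 1.3, 1.7])  # n = 10
pmma_30g = np.array([0.6, 0.8, 0.7, 0.5, 0.9, 0.7, 0.6, 0.8, 0.7])       # n = 9

print(f"PVDF 30 G: {pvdf_30g.mean():.2f} ± {pvdf_30g.std(ddof=1):.2f} N")
print(f"PMMA 30 G: {pmma_30g.mean():.2f} ± {pmma_30g.std(ddof=1):.2f} N")

# Between-group comparison (test assumed; the abstract only reports p = 0.003).
t, p = stats.ttest_ind(pvdf_30g, pmma_30g, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")

# Pearson correlation between flange size and dislocation force (abstract: r = 0.92).
flange_um = np.array([400, 430, 520, 385, 450, 395, 470, 500, 410, 455])
r, p_r = stats.pearsonr(flange_um, pvdf_30g)
print(f"r = {r:.2f}, p = {p_r:.4f}")
```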


Subject(s)
Fluorocarbon Polymers , Haptic Technology , Lenses, Intraocular , Polyvinyls , Humans , Polymethyl Methacrylate , Sclera/surgery
8.
Sci Rep ; 14(1): 5140, 2024 03 01.
Article in English | MEDLINE | ID: mdl-38429357

ABSTRACT

Accomplishing motor function requires multimodal information, such as visual and haptic feedback, which induces a sense of ownership (SoO) over one's own body part. In this study, we developed a visual-haptic human-machine interface that combines three types of feedback (visual, haptic, and kinesthetic) in the context of passive hand-grasping motion, aiming to generate SoO over a virtual hand. We tested two conditions; in both, the three feedback streams were synchronous: in the first condition they were in phase, and in the second they were in antiphase. In both conditions, we used passive visual feedback (a pre-recorded video of a real hand), haptic feedback (a balloon inflating and deflating), and kinesthetic feedback (finger movement following the balloon curvature). To quantify SoO, the participants' reaction time was measured in response to a sense of threat. We found that most participants had a shorter reaction time under the antiphase condition, indicating that the synchronous antiphase configuration of the multimodal system was better than the in-phase condition for inducing SoO over the virtual hand. We conclude that stronger haptic feedback plays a key role in SoO in accordance with visual information. Because the virtual hand is closing and the high pressure from the balloon against the hand creates the sensation of grasping and closing the hand, it appeared at the perceptual level as though the person was closing his or her hand.


Subject(s)
Haptic Technology , Ownership , Humans , Female , Male , Feedback , Hand , Upper Extremity
9.
Transl Vis Sci Technol ; 13(2): 19, 2024 02 01.
Article in English | MEDLINE | ID: mdl-38407885

ABSTRACT

Purpose: This study aimed to determine the influence of decentration, rotation, and tilt on the objective optical quality of plate-haptic toric intraocular lenses (tIOLs). Methods: The area ratio of the modulation transfer function (MTF), the Strehl ratio of the point spread function (PSF), and higher-order aberrations (HOAs) for 3 mm and 5 mm pupil diameters (PD) were evaluated at 1 month postoperatively. Retroillumination images captured by the OPD-Scan III were used to quantify the degree of decentration and rotation, whereas tIOL tilt was obtained directly from the tilt aberration. Patients were separated into two subgroups based on tIOL misalignment cutoff values. Results: The study included 29 eyes of 24 patients. Optical quality with decentration of more than 0.25 mm did not differ substantially from that with decentration of 0.25 mm or less. The PSF at 3 mm PD and the MTF, intraocular HOAs, and trefoil aberration at 3 mm and 5 mm PD deteriorated significantly with rotation of more than 3 degrees, whereas only intraocular HOAs at 5 mm PD and coma at 3 mm and 5 mm PD were significantly worse with tilt of more than 0.1 µm and 0.25 µm at the corresponding PD. Furthermore, tIOL rotation and tilt were highly correlated with intraocular trefoil aberration and coma, respectively. Conclusions: The optical quality of the monofocal bitoric IOL is relatively tolerant of decentration at 1 month after surgery but more sensitive to intraocular trefoil aberration caused by rotation and to coma aberration induced by tilt. Translational Relevance: To our knowledge, this is the first study to investigate the relationship between plate-haptic bitoric IOL misalignment and objective optical quality measured by the OPD-Scan III in a real-world setting, which may provide reference information for IOL selection to improve surgical outcomes.
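For reference, the objective optical-quality metrics named above have standard textbook definitions, sketched below; these are generic formulations and not necessarily the exact computations performed by the OPD-Scan III.

```latex
% Standard (textbook) definitions; not necessarily the device's exact metrics.
% Strehl ratio: peak of the measured PSF relative to the diffraction-limited PSF.
\[
  S = \frac{\max \, \mathrm{PSF}_{\text{measured}}}{\max \, \mathrm{PSF}_{\text{diffraction-limited}}}
\]
% MTF area ratio: area under the measured MTF up to a cutoff spatial frequency
% f_c, normalized by the corresponding diffraction-limited area.
\[
  A_{\mathrm{MTF}} = \frac{\int_{0}^{f_c} \mathrm{MTF}_{\text{measured}}(f)\,\mathrm{d}f}
                          {\int_{0}^{f_c} \mathrm{MTF}_{\text{diffraction-limited}}(f)\,\mathrm{d}f}
\]
```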


Subject(s)
Coma , Lenses, Intraocular , Humans , Haptic Technology , Rotation , Postoperative Period
10.
J Exp Child Psychol ; 241: 105856, 2024 May.
Article in English | MEDLINE | ID: mdl-38306737

ABSTRACT

Sound-shape correspondence refers to the preferential mapping of information across the senses, such as associating a nonsense word like bouba with rounded abstract shapes and kiki with spiky abstract shapes. Here we focused on audio-tactile (AT) sound-shape correspondences between nonsense words and abstract shapes that are felt but not seen. Despite previous research indicating a role for visual experience in establishing AT associations, it remains unclear how visual experience facilitates AT correspondences. Here we investigated one hypothesis: that seeing the abstract shapes improves haptic exploration by (a) increasing effective haptic strategies and/or (b) decreasing ineffective haptic strategies. We analyzed five haptic strategies in video recordings of 6- to 8-year-old children obtained in a previous study. We found that the dominant strategy used to explore shapes differed based on visual experience. Effective strategies, which provide information about shape, were dominant in participants with prior visual experience, whereas ineffective strategies, which do not provide information about shape, were dominant in participants without prior visual experience. With prior visual experience, poking, an effective and efficient strategy, was dominant, whereas without prior visual experience, uncategorizable and ineffective strategies were dominant. These findings suggest that prior visual experience of abstract shapes in 6- to 8-year-olds can increase the effectiveness and efficiency of haptic exploration, potentially explaining why prior visual experience can increase the strength of AT sound-shape correspondences.


Subject(s)
Haptic Technology , Vision, Ocular , Child , Humans , Touch , Sound , Emotions
11.
J Exp Child Psychol ; 241: 105870, 2024 May.
Article in English | MEDLINE | ID: mdl-38354447

ABSTRACT

Geometrical knowledge is typically taught to children through a combination of vision and repetitive drawing (i.e. haptics), yet our understanding of how different spatial senses contribute to geometric perception during childhood is poor. Studies of line orientation suggest a dominant role of vision affecting the calibration of haptics during development; however, the associated multisensory interactions underpinning angle perception are unknown. Here we examined visual, haptic, and bimodal perception of angles across three age groups of children: 6 to 8 years, 8 to 10 years, and 10 to 12 years, with age categories also representing their class (grade) in primary school. All participants first learned an angular shape, presented dynamically, in one of three sensory tracing conditions: visual only, haptic only, or bimodal exploration. At test, which was visual only, participants selected a target angle from four possible alternatives with distractor angle sizes varying relative to the target angle size. We found a clear improvement in accuracy of angle perception with development for all learning modalities. Angle perception in the youngest group was equally poor (but above chance) for all modalities; however, for the two older child groups, visual learning was better than haptics. Haptic perception did not improve to the level of vision with age (even in a comparison adult group), and we found no specific benefit for bimodal learning over visual learning in any age group, including adults. Our results support a developmental increment in both spatial accuracy and precision in all modalities, which was greater in vision than in haptics, and are consistent with previous accounts of cross-sensory calibration in the perception of geometric forms.


Subject(s)
Touch Perception , Visual Perception , Adult , Child , Humans , Adolescent , Haptic Technology , Vision, Ocular , Spatial Learning , Knowledge
12.
J Endod ; 50(4): 533-539.e1, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38280513

ABSTRACT

There has been a significant increase in robot-assisted dental procedures in the past decade, particularly in the area of robot-assisted implant placement. The objective of this case report was to assess the initial use of the Yomi Robot's assistance and haptic guidance during endodontic microsurgery. The robot was used during the osteotomy and root-end resection of the first and second upper left premolars. The report aims to inform clinicians of the initial implementation of this cutting-edge technology in endodontics and its potential to enhance endodontic microsurgery. The Yomi Robot was used to perform osteotomy and root-end resection during apical surgery in a patient presenting with symptomatic upper left first and second premolars. The treatment procedure was decided based on the clinical examination, chart data, and radiographic examinations, which showed periapical lesions on both premolars, taking into consideration the failed endodontic retreatment on the first premolar, the post and ceramic coronal restorations on both teeth, and the patient's desire to save them. The Yomi Robot system provides auditory, visual, and physical guidance to clinicians during surgery while using a cone-beam computed tomography scan for precision planning with greater accuracy and minimized potential for human error. Further studies are needed to establish a protocol for robot-guided procedures in endodontics.


Subject(s)
Endodontics , Robotics , Humans , Root Canal Therapy/methods , Haptic Technology , Endodontics/methods , Bicuspid/diagnostic imaging , Bicuspid/surgery , Cone-Beam Computed Tomography
13.
Indian J Ophthalmol ; 72(4): 544-548, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38189463

ABSTRACT

PURPOSE: To compare the visual outcomes and complication rates between the extra-ocular needle-guided haptic insertion technique (XNIT) and the conventional handshake (HS) technique of scleral-fixated intraocular lens (SFIOL) implantation. METHODS: In this retrospective study, we retrieved data of patients who had undergone SFIOL surgery at our institute from January 2018 to May 2022 for aphakia following either complicated cataract surgery or ocular trauma and who had a minimum follow-up of 3 months. RESULTS: Of the 156 eyes, the HS technique was used in 80 eyes and XNIT in the remaining 76 eyes. At the 3-month follow-up visit, there was no significant difference in median best-corrected visual acuity (BCVA) (P = 0.988) or uncorrected visual acuity (UCVA) (P = 0.765) between the two techniques. There was no statistically significant difference between preoperative median BCVA and postoperative UCVA for XNIT (P = 0.961) or the HS technique (P = 0.831) at the 3-month follow-up visit. Complication rates for the two techniques were minimal and comparable. The most common postoperative complication was corneal edema. The incidence of cystoid macular edema was slightly higher in the XNIT group but not statistically significant (P = 0.05). Two patients in the HS group developed retinal detachment, which settled after repeat surgery. CONCLUSION: The newer XNIT technique was found to be as safe and effective as the conventional HS technique.
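The group comparisons of median visual acuity above can be illustrated with a short sketch. The abstract does not name the statistical test; the Python snippet below assumes a nonparametric Mann-Whitney U test on hypothetical logMAR acuity values and is not the study's analysis code.

```python
# Illustration only: hypothetical 3-month logMAR BCVA values for each technique;
# the test choice (Mann-Whitney U) is an assumption, not stated in the abstract.
import numpy as np
from scipy import stats

bcva_hs   = np.array([0.2, 0.3, 0.1, 0.4, 0.2, 0.3, 0.2, 0.5, 0.1, 0.3])
bcva_xnit = np.array([0.2, 0.2, 0.3, 0.1, 0.4, 0.3, 0.2, 0.2, 0.3, 0.4])

print(f"Median BCVA, HS:   {np.median(bcva_hs):.2f} logMAR")
print(f"Median BCVA, XNIT: {np.median(bcva_xnit):.2f} logMAR")

# Two-sided comparison of the two independent groups (assumed test).
u, p = stats.mannwhitneyu(bcva_hs, bcva_xnit, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```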


Subject(s)
Lens Implantation, Intraocular , Lenses, Intraocular , Humans , Lens Implantation, Intraocular/methods , Retrospective Studies , Haptic Technology , Visual Acuity , Sclera/surgery , Suture Techniques
14.
IEEE Trans Haptics ; 17(1): 39-44, 2024.
Article in English | MEDLINE | ID: mdl-38224514

ABSTRACT

Although medical simulators have benefited from the use of haptics and virtual reality (VR) for decades, the former has become the bottleneck in producing a low-cost, compact, and accurate training experience. This is particularly the case for the inferior alveolar nerve block (IANB) procedure in dentistry, which is one of the most difficult motor skills to acquire. As existing works are still oversimplified or overcomplicated for practical deployment, we introduce an origami-based haptic syringe interface for IANB local anesthesia training. By harnessing the versatile mechanical tunability of the Kresling origami pattern, our interface simulated the tactile experience of the plunger while injecting the anesthetic solution. We present the design, development, and characterization process, as well as a preliminary usability study. The force profile generated by the syringe interface is perceptually similar to that of the Carpule syringe. The usability study suggests that the haptic syringe significantly improves the IANB training simulation and has the potential to be utilized in several other medical training and simulation applications.


Subject(s)
Anesthesia, Local , Touch Perception , Humans , Syringes , Haptic Technology , User-Computer Interface , Computer Simulation , Clinical Competence
15.
J Neurosci ; 44(13)2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38267257

ABSTRACT

Visual and haptic perceptions of 3D shape are plagued by distortions, which are influenced by nonvisual factors, such as gravitational vestibular signals. Whether gravity acts directly on the visual or haptic systems or at a higher, modality-independent level of information processing remains unknown. To test these hypotheses, we examined visual and haptic 3D shape perception by asking male and female human subjects to perform a "squaring" task in upright and supine postures and in microgravity. Subjects adjusted one edge of a 3D object to match the length of another in each of the three canonical reference planes, and we recorded the matching errors to obtain a characterization of the perceived 3D shape. The results show opposing, body-centered patterns of errors for visual and haptic modalities, whose amplitudes are negatively correlated, suggesting that they arise in distinct, modality-specific representations that are nevertheless linked at some level. On the other hand, weightlessness significantly modulated both visual and haptic perceptual distortions in the same way, indicating a common, modality-independent origin for gravity's effects. Overall, our findings show a link between modality-specific visual and haptic perceptual distortions and demonstrate a role of gravity-related signals on a modality-independent internal representation of the body and peripersonal 3D space used to interpret incoming sensory inputs.


Subject(s)
Touch Perception , Vestibule, Labyrinth , Humans , Male , Female , Visual Perception , Haptic Technology , Cognition , Space Perception
16.
IEEE Trans Haptics ; 17(1): 33-38, 2024.
Article in English | MEDLINE | ID: mdl-38227400

ABSTRACT

In this paper, we explore the effects of multimodal haptic feedback combining vibrotactile and electrical muscle stimulation (EMS) on expressing virtual collisions. We first present a wearable multimodal haptic device capable of generating both mechanical vibration and EMS stimuli. The two types of haptic stimulus are combined into a haptic rendering method that conveys improved virtual collision sensations. This multimodal rendering method highlights the strengths of each modality while compensating for mutual weaknesses. The multimodal rendering method is compared in subjective quality with two unimodal methods (vibration only and EMS only) by a user study. Experimental results demonstrate that our multimodal feedback method can elicit more realistic, enjoyable, expressive, and preferable user experiences.


Subject(s)
Touch Perception , Touch , Humans , Touch/physiology , Touch Perception/physiology , Feedback , Haptic Interfaces , Haptic Technology , User-Computer Interface , Muscles , Vibration
17.
IEEE Trans Haptics ; 17(1): 26-32, 2024.
Article in English | MEDLINE | ID: mdl-38227401

ABSTRACT

Some interactions in virtual environments need to be operated on inclined planes. If a real inclined plane can be found in the real environment that corresponds exactly to the angle of the virtual inclined plane to provide haptic feedback, the user's immersion can be enhanced. However, it is not easy to find such a real inclined plane in the real environment. We proposed a horizontal-plane haptic redirection scheme in which users interacting with virtual inclined planes obtain haptic feedback, via redirection mapping, from real horizontal planes that are easily available in the real world. We also designed an integrated solution, based on the Vive Pro headset, to locate the real horizontal plane and perform the haptic redirection. We then measured the angle and size thresholds for horizontal-plane haptic redirection as 20° and 88%, respectively, through a user study. Through experiments, we also found that when the degree of redirection exceeded the threshold, the user's operation efficiency was significantly reduced. In addition, we compared the horizontal-plane haptic redirection scheme with a scheme without redirection and a scheme without haptic feedback to demonstrate the validity and necessity of the redirection scheme proposed in this paper.


Subject(s)
Touch Perception , Humans , Feedback , Haptic Technology , User-Computer Interface
18.
Indian J Ophthalmol ; 72(Suppl 2): S319-S322, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38271430

ABSTRACT

To evaluate a novel technique for six-point scleral fixation of a three-looped-haptic posterior chamber intraocular lens (PCIOL) using a single suture. Nine eyes of nine patients were studied from September 2021 to March 2023. All patients had undergone vitrectomy. Only a single 9-0 polypropylene suture was used for scleral fixation. The three looped haptics were fixed at the 12, 4, and 8 o'clock positions with six-point scleral fixation. The entire procedure took about 30 minutes. Among the nine patients, eight eyes (88.8%) had a significant improvement in best-corrected visual acuity, whereas one eye (11.2%) showed no change. No intraoperative or postoperative complications were observed. On ultrasound biomicroscopy, the intraocular lenses were well positioned and stable, with no tilt in the horizontal or vertical axis. Six-point scleral fixation of a three-looped-haptic PCIOL with a single suture is safe and effective.


Subject(s)
Lens Implantation, Intraocular , Lenses, Intraocular , Humans , Lens Implantation, Intraocular/methods , Haptic Technology , Suture Techniques , Visual Acuity , Sclera/surgery , Sutures , Retrospective Studies
19.
J Robot Surg ; 18(1): 43, 2024 Jan 18.
Article in English | MEDLINE | ID: mdl-38236452

ABSTRACT

Robotic surgery started nearly 30 years ago. It has achieved telepresence and the performance of repetitive, precise, and accurate tasks. The "master-slave" robotic system allows the surgeon to control manipulators from a distant site. Robotic surgical fingers have been developed that allow surgeons to move them accurately through sensors fixed on the surgeon's hand, and haptic sensors have been developed to transmit sensation from the robotic finger to the surgeon's finger. A complete system comprising a robotic surgical finger, 3D printed on a stereolithography (SLA) printer, with a haptic feedback system is proposed. The developed system includes a master glove that controls the motion of a 3-DOF robotic slave finger while receiving haptic feedback of the force/pressure exerted on it. Precise control of the slave robotic finger was achieved by applying a fast and robust proportional-integral-derivative (PID) control algorithm using an Arduino-based hardware and software module. The individual joint angles of the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joints and the wrist were measured using rotary and inertial sensors, respectively. The ranges of movement for the MCP, PIP, and wrist joints were measured to be 0-86°, 0-71°, and 0-89°, respectively. The robotic finger mimics the glove motion, requiring a minimal learning curve for the device. The collected data for the slave motion are in good agreement with the master-glove motion data. A vibro-tactile haptic feedback system was developed to distinguish between three different materials mimicking human flesh, tumor, and bone. The master-slave system, with a robotic surgical finger that moves simultaneously with the surgeon's finger and provides good haptic sensation, will give the surgeon the opportunity to perform finger dissection in laparoscopic and robotic surgery, as used to be done in open surgery. 3D bioprinting will make this process even cheaper, with the added advantage of making surgical tools locally according to the needs of the surgery. Ongoing work aims to develop a silicone-based 8 mm robotic surgical finger with multiple types of haptic feedback.
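The PID control loop mentioned above can be sketched generically. The Python snippet below is an illustrative discrete-time PID position controller tracking a glove-commanded joint angle against a crude first-order joint model; the gains, time step, and plant model are assumptions, not the authors' Arduino firmware.

```python
# Generic discrete-time PID position controller (illustrative; not the authors'
# Arduino firmware). The "plant" is a crude model in which joint angular
# velocity equals the controller output.
def pid_step(setpoint, measured, state, kp=4.0, ki=2.0, kd=0.05, dt=0.01):
    """One PID update; `state` carries the integral term and previous error."""
    error = setpoint - measured
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

target = 60.0   # glove-commanded MCP angle (degrees), within the 0-86° range
angle = 0.0     # current slave-joint angle (degrees)
dt = 0.01
state = {"integral": 0.0, "prev_error": target - angle}  # avoid derivative kick

for step in range(501):
    command = pid_step(target, angle, state, dt=dt)
    angle += command * dt            # first-order plant: velocity = command
    if step % 100 == 0:
        print(f"t = {step * dt:.2f} s, angle = {angle:.1f} deg")
```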


Subject(s)
Robotic Surgical Procedures , Robotics , Humans , Robotic Surgical Procedures/methods , Feedback , Haptic Technology , Upper Extremity
20.
J Vis ; 24(1): 8, 2024 Jan 02.
Article in English | MEDLINE | ID: mdl-38206278

ABSTRACT

Adapting to particular features of a haptic shape, for example the slant of a surface, affects how a subsequently touched shape is perceived (aftereffect). Previous studies showed that this adaptation is largely based on our proprioceptive sense of hand posture, yet the influence of vision on haptic shape adaptation has remained relatively unexplored. Here, using a slant-adaptation paradigm, we investigated whether visual information affects haptic adaptation and, if so, how. To this end, we varied the available visual cues during the adaptation period. These conditions ranged from providing visual information only about the slant of the surface, or the reference frame in which it was presented, to providing visual information only about the location of the fingertips. Additionally, we tested several combinations of these visual cues. We show that as soon as the visual information can be used as a spatial reference to link one's own fingertip position to the surface slant, haptic adaptation is greatly reduced. This means that, under these viewing conditions, vision dominates touch, which is one reason why we do not easily adapt to haptic shape in daily life: we usually have visual information about both the hand and the object available simultaneously.


Subject(s)
Haptic Technology , Touch Perception , Humans , Hand , Cues , Posture