1.
IEEE Robot Autom Lett ; 9(2): 1166-1173, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38292408

ABSTRACT

Head and neck cancers are the seventh most common cancers worldwide, with squamous cell carcinoma being the most prevalent histologic subtype. Surgical resection is a primary treatment modality for many patients with head and neck squamous cell carcinoma, and accurately identifying tumor boundaries and ensuring sufficient resection margins are critical to optimizing oncologic outcomes. This study presents an innovative autonomous system for tumor resection (ASTR) and conducts a feasibility study by performing supervised autonomous midline partial glossectomy of a pseudotumor with millimeter accuracy. The proposed ASTR system consists of a dual-camera vision system, an electrosurgical instrument, a newly developed vacuum grasping instrument, two 6-DOF manipulators, and a novel autonomous control system. The letter introduces an ontology-based research framework for creating and implementing a complex autonomous surgical workflow, using the glossectomy as a case study. Porcine tongue tissues are used in this study and are marked with color inks and near-infrared fluorescent (NIRF) markers to indicate the pseudotumor. ASTR actively monitors the NIRF markers and gathers spatial and color data from the samples, enabling the planning and execution of robot trajectories in accordance with the proposed glossectomy workflow. The system successfully performs six consecutive supervised autonomous pseudotumor resections on porcine specimens. The average surface and depth resection errors measure 0.73±0.60 mm and 1.89±0.54 mm, respectively, with no positive tumor margins detected in any of the six resections. The resection accuracy is demonstrated to be on par with manual pseudotumor glossectomy performed by an experienced otolaryngologist.
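The abstract above plans resection trajectories from tracked NIRF boundary markers. The paper's actual planner is not described here; as a minimal sketch under stated assumptions (the function name, a planar marker layout, and radial offsetting are all illustrative, not the authors' method), one way to expand tracked marker points into a cut boundary with a fixed margin is:

```python
import numpy as np

def plan_resection_boundary(marker_points, margin_mm):
    """Offset tracked NIRF marker points radially outward from their
    centroid to form a planned cut boundary with the requested margin.

    marker_points: (N, 2) marker positions in the tissue surface plane (mm).
    Returns an (N, 2) array of planned cut points.
    """
    pts = np.asarray(marker_points, dtype=float)
    centroid = pts.mean(axis=0)
    vecs = pts - centroid
    dists = np.linalg.norm(vecs, axis=1, keepdims=True)
    # Push each boundary point outward by margin_mm along its radial direction.
    return centroid + vecs * (1.0 + margin_mm / dists)
```

A real system would do this on the reconstructed 3D surface with intraoperatively updated marker poses; the sketch only shows the margin-expansion idea.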

2.
J Opt Soc Am A Opt Image Sci Vis ; 39(4): 655-661, 2022 Apr 01.
Article in English | MEDLINE | ID: mdl-35471389

ABSTRACT

Point clouds are widely used because they carry richer information than images. Fringe projection profilometry (FPP) is a camera-based point cloud acquisition technique being developed as a vision system for robotic surgery. For semi-autonomous robotic suturing, fluorescent fiducials were previously placed on the target tissue as suture landmarks; this not only increases system complexity but also raises safety concerns. To address these problems, we propose a numerical landmark localization algorithm based on a convolutional neural network (CNN) and a conditional random field (CRF). A CNN regresses landmark heatmaps from the four-channel image data generated by the FPP. A CRF leveraging both local and global shape constraints is developed to refine the landmark coordinates, reject spurious landmarks, and recover missing landmarks. The robustness of the proposed method is demonstrated through ex vivo porcine intestine landmark localization experiments.


Subject(s)
Algorithms , Neural Networks, Computer , Animals , Swine
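Entry 2 regresses per-landmark heatmaps with a CNN and then refines them with a CRF. As an illustrative sketch of only the decoding step (the function name and threshold are hypothetical, and a fixed confidence threshold is a crude stand-in for the paper's CRF-based rejection of extra landmarks):

```python
import numpy as np

def decode_landmarks(heatmaps, peak_thresh=0.5):
    """Extract one (row, col) landmark per heatmap channel by argmax,
    rejecting channels whose peak value falls below peak_thresh.

    heatmaps: (K, H, W) array of predicted per-landmark heatmaps.
    Returns a list with a (row, col) tuple, or None, per channel.
    """
    out = []
    for hm in heatmaps:
        idx = np.unravel_index(np.argmax(hm), hm.shape)
        out.append(idx if hm[idx] >= peak_thresh else None)
    return out
```

The paper's CRF additionally enforces local and global shape constraints across landmarks (spacing, ordering) and can recover missing ones, which a per-channel threshold cannot do.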
3.
Article in English | MEDLINE | ID: mdl-34840856

ABSTRACT

Autonomous robotic suturing has the potential to improve surgical outcomes by offering greater accuracy, repeatability, and consistency than manual operation. However, full autonomy in complex surgical environments is not yet practical, and human supervision is required to guarantee safety. In this paper, we develop a confidence-based supervised autonomous suturing method in which the Smart Tissue Autonomous Robot (STAR) and the surgeon perform suturing tasks collaboratively with the highest possible degree of autonomy. With the proposed method, STAR sutures autonomously when it is highly confident and otherwise asks the operator for assistance in adjusting suture positions. We evaluate the accuracy of the proposed control method in robotic suturing tests on synthetic vaginal cuff tissue and compare the results to vaginal cuff closures performed by an experienced surgeon. Our tests indicate that with the proposed confidence-based method, STAR can predict the success of purely autonomous suture placement with an accuracy of 94.74%. Moreover, with an additional 25% human intervention, STAR achieves a 98.1% suture placement accuracy, compared to an 85.4% accuracy for completely autonomous robotic suturing. Finally, our experiments indicate that STAR, using the proposed method, achieves 1.6 times better consistency in suture spacing and 1.8 times better consistency in suture bite size than the manual results.
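The confidence-gated arbitration described in entry 3 can be sketched as follows. This is a minimal illustration, not STAR's actual controller: the function names and the 0.8 threshold are assumptions, and the real system derives its confidence from a learned predictor of suture-placement success.

```python
def select_suture_mode(confidence, threshold=0.8):
    """Confidence-gated autonomy: act autonomously when the predicted
    success confidence clears the threshold, otherwise request operator
    adjustment of the planned suture position."""
    return "autonomous" if confidence >= threshold else "ask_operator"

def run_suture_plan(confidences, threshold=0.8):
    """Choose a mode per planned suture and report the fraction of
    sutures needing human intervention."""
    modes = [select_suture_mode(c, threshold) for c in confidences]
    frac = modes.count("ask_operator") / len(modes)
    return modes, frac
```

The design point the abstract reports is exactly this trade-off: a modest intervention fraction (25% of sutures) buys a jump from 85.4% to 98.1% placement accuracy.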

4.
Front Robot AI ; 8: 645756, 2021.
Article in English | MEDLINE | ID: mdl-34113656

ABSTRACT

The COVID-19 pandemic has emerged as a serious global health crisis, with the predominant morbidity and mortality linked to pulmonary involvement. Point-of-care ultrasound (POCUS) scanning has become one of the primary methods for its diagnosis and staging, but it requires close contact between healthcare workers and patients, increasing the risk of infection. This work therefore proposes an autonomous robotic solution that enables POCUS scanning of COVID-19 patients' lungs for diagnosis and staging. An algorithm was developed for approximating the optimal position of an ultrasound probe on a patient from prior CT scans to reach predefined lung infiltrates. In the absence of prior CT scans, a deep learning method was developed for predicting the 3D landmark positions of a human ribcage given a torso surface model. The landmarks, combined with the surface model, are then used to estimate the optimal ultrasound probe position on the patient for imaging infiltrates. These algorithms, combined with a force-displacement profile collection methodology, enabled the system to image all points of interest in a simulated experimental setup with an average accuracy of 20.6 ± 14.7 mm using prior CT scans and 19.8 ± 16.9 mm using only ribcage landmark estimation. A study on a full-torso ultrasound phantom showed that autonomously acquired ultrasound images were 100% interpretable when using force feedback with prior CT and 88% with landmark estimation, compared to 75% and 58%, respectively, without force feedback. This demonstrates the preliminary feasibility of the system and its potential to help mitigate the spread of COVID-19 in vulnerable environments.
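Entry 4 estimates a probe placement on the torso surface that reaches an internal infiltrate. As a deliberately simplified sketch (the function name is hypothetical, and "closest surface point, aimed at the target" stands in for the paper's CT- and landmark-based optimization, which also accounts for rib occlusion and acoustic windows):

```python
import numpy as np

def probe_pose_from_surface(surface_points, target):
    """Pick the torso surface sample closest to an internal target and
    aim the probe axis from that point toward the target.

    surface_points: (N, 3) torso surface samples (mm).
    target: (3,) infiltrate position (mm).
    Returns (probe position, unit probe-axis direction).
    """
    pts = np.asarray(surface_points, dtype=float)
    tgt = np.asarray(target, dtype=float)
    i = np.argmin(np.linalg.norm(pts - tgt, axis=1))
    pos = pts[i]
    axis = tgt - pos
    return pos, axis / np.linalg.norm(axis)
```

The force-displacement profiling the abstract mentions then regulates contact along this axis, which is what lifted image interpretability from 75%/58% to 100%/88%.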

5.
Article in English | MEDLINE | ID: mdl-38533465

ABSTRACT

Surgical resection is the current clinical standard of care for treating squamous cell carcinoma. Maintaining an adequate tumor resection margin is key to a good surgical outcome, but tumor edge delineation errors are inevitable in manual surgery due to the difficulty of visualization and hand-eye coordination. Surgical automation is a growing field of robotics that aims to relieve surgeon burden and achieve consistent, potentially better surgical outcomes. This paper reports a novel robotic supervised autonomous electrosurgery technique for soft tissue resection that achieves millimeter accuracy. The tumor resection procedure is decomposed into subtasks for more direct understanding and automation. A 4-DOF suction system is developed and integrated with a 6-DOF electrocautery robot to perform resection experiments. A novel near-infrared fluorescent marker is manually dispensed on cadaver samples to define a pseudotumor and is intraoperatively tracked using a dual-camera system. The autonomous dual-robot resection cooperation workflow is proposed and evaluated in this study. The integrated system autonomously localizes the pseudotumor by tracking the near-infrared marker and performs supervised autonomous resection in cadaver porcine tongues (N=3). All three pseudotumors were successfully removed from the porcine samples. The average surface and depth resection errors are 1.19 mm and 1.83 mm, respectively. This work is an essential step towards autonomous tumor resection.

6.
Med Image Comput Comput Assist Interv ; 11764: 320-328, 2019 Oct.
Article in English | MEDLINE | ID: mdl-33511379

ABSTRACT

Oral squamous cell carcinoma (OSCC) is the most common cancer in the head and neck region and is associated with high morbidity and mortality rates. Surgical resection is usually the primary treatment strategy for OSCC, and maintaining effective tumor resection margins is paramount to surgical outcomes. In practice, wide tumor excisions impair post-surgical organ function, while narrow resection margins are associated with tumor recurrence. Identifying and tracking these resection margins remains a challenge because they migrate and shrink under preoperative chemotherapy or radiation therapy and deform intraoperatively. This paper reports a novel near-infrared (NIR) fluorescent marking and landmark-based deformable image registration (DIR) method to precisely predict deformed margins. The accuracy of DIR-predicted resection margins on porcine cadaver tongues is compared with rigid image registration and a surgeon's manual prediction. Furthermore, our tracking and registration technique is integrated into a robotic system and tested on ex vivo porcine cadaver tongues to demonstrate the feasibility of supervised autonomous tumor bed resections.
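Landmark-based deformable registration, as in entry 6, is commonly realized with a thin-plate spline fitted to corresponding landmark pairs. The sketch below is a standard 2D TPS warp, not necessarily the paper's exact formulation (the function name is hypothetical, and the paper may work in 3D with regularization); it maps preoperative landmark positions to their deformed positions and can then carry a planned margin onto the deformed tissue.

```python
import numpy as np

def tps_fit(src, dst):
    """Fit a 2D thin-plate-spline warp mapping landmark positions
    src -> dst (both (N, 2), mm). Returns a function warping (M, 2) points."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    n = len(src)

    def U(r2):
        # TPS radial basis r^2 * log(r^2), with U(0) = 0.
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(r2 > 0, r2 * np.log(r2), 0.0)

    d2 = np.sum((src[:, None] - src[None]) ** 2, axis=-1)
    K = U(d2)
    P = np.hstack([np.ones((n, 1)), src])
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    coef = np.linalg.solve(A, b)  # (n+3, 2): bending weights + affine part

    def warp(pts):
        pts = np.asarray(pts, dtype=float)
        r2 = np.sum((pts[:, None] - src[None]) ** 2, axis=-1)
        Q = np.hstack([np.ones((len(pts), 1)), pts])
        return U(r2) @ coef[:n] + Q @ coef[n:]

    return warp
```

A rigid registration (the baseline the paper compares against) cannot represent the shrinkage and bending this interpolant captures, which is why DIR predicts the deformed margin more accurately.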

7.
IEEE Int Conf Robot Autom ; 2018: 6637-6644, 2018 May.
Article in English | MEDLINE | ID: mdl-31475074

ABSTRACT

This paper reports a robotic laparoscopic surgery system that performs electrosurgery on porcine cadaver kidney, and evaluates its accuracy under open-loop control in targeting and cutting tasks guided by a novel 3D endoscope. We describe the design and integration of the novel laparoscopic imaging system, which reconstructs the surgical field using structured light. A targeting task is first performed to determine the average positioning error of the system as guided by the laparoscopic camera. The imaging system is then used to reconstruct the surface of a porcine cadaver kidney and generate a cutting trajectory of consistent depth. The paper concludes by using the robotic system under open-loop control to cut this trajectory with a multi-degree-of-freedom electrosurgical tool. For a commanded cutting depth of 3 mm, the robotic surgical system follows the trajectory with an average depth of 2.44 mm and a standard deviation of 0.34 mm. The average positional accuracy of the system was 2.74±0.99 mm.
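The constant-depth trajectory in entry 7 follows directly from the reconstructed surface: the tool tip tracks the local surface height minus the commanded depth. A minimal sketch, assuming heights sampled along the planned path (function names are illustrative, not from the paper):

```python
import numpy as np

def cut_trajectory(surface_heights, depth_mm):
    """Tool-tip heights keeping a constant cutting depth below the
    structured-light-reconstructed surface along the planned path.

    surface_heights: 1D array of surface heights z(s) along the path (mm).
    """
    return np.asarray(surface_heights, dtype=float) - depth_mm

def depth_stats(surface_heights, tool_heights):
    """Mean and standard deviation of achieved cut depth, the metrics
    reported in the evaluation (2.44 mm mean, 0.34 mm SD for a 3 mm plan)."""
    d = np.asarray(surface_heights, dtype=float) - np.asarray(tool_heights, dtype=float)
    return float(d.mean()), float(d.std())
```

Under open-loop control the achieved depth deviates from the plan (here, 2.44 mm for a 3 mm command), since reconstruction error and tissue deflection are not corrected during the cut.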
