Results 1 - 4 of 4
1.
Stud Health Technol Inform ; 81: 119-25, 2001.
Article in English | MEDLINE | ID: mdl-11317724

ABSTRACT

Due to increases in network speed and bandwidth, distributed exploration of medical data in immersive Virtual Reality (VR) environments is becoming increasingly feasible. The volumetric display of radiological data in such environments presents a unique set of challenges. The sheer size and complexity of the datasets involved not only make them difficult to transmit to remote sites but also require extensive user interaction to make them understandable to the investigator and manageable for the rendering hardware. A sophisticated VR user interface is required so that the clinician can focus on the aspects of the data that will provide educational and/or diagnostic insight. We describe a software system for data acquisition, data display, Tele-Immersion, and data manipulation that supports interactive, collaborative investigation of large radiological datasets. The hardware required by this strategy is still at the high end of the graphics workstation market. Future software ports to Linux and NT, along with the rapid development of PC graphics cards, open the possibility of later work with Linux or NT PCs and PC clusters.
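The abstract notes that the size of these volumes makes them hard both to transmit and to render interactively. As a minimal sketch of one common mitigation (block-averaged downsampling before transmission or rendering), not a description of the system in the citation, the following function and its `factor` parameter are hypothetical:

```python
import numpy as np

def downsample_volume(vol: np.ndarray, factor: int = 2) -> np.ndarray:
    """Reduce a 3D volume by block-averaging.

    Illustrative only: trims each axis to a multiple of `factor`,
    then averages each factor**3 block into one voxel, cutting the
    data volume by factor**3 for cheaper transmission/rendering.
    """
    z, y, x = (d - d % factor for d in vol.shape)
    v = vol[:z, :y, :x]
    v = v.reshape(z // factor, factor,
                  y // factor, factor,
                  x // factor, factor)
    return v.mean(axis=(1, 3, 5))
```

A 512-slice CT series downsampled with `factor=2` shrinks to one eighth of its original voxel count, which is the kind of trade-off an interactive VR front end might offer before requesting full-resolution data.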


Subject(s)
Computer Communication Networks , Medical Records Systems, Computerized , User-Computer Interface , Computer Systems , Humans , Imaging, Three-Dimensional , Magnetic Resonance Imaging , Microcomputers , Software Design , Tomography, X-Ray Computed
2.
Article in English | MEDLINE | ID: mdl-11317826

ABSTRACT

It requires skill, effort, and time to visualize desired anatomic structures from radiological data in three dimensions. There have been many attempts to automate this process and make it less labor intensive. The technique we have developed is based on mutual information for automatic multi-modality image fusion (MIAMI Fuse, University of Michigan). The initial development of our technique has focused on the autocolorization of the liver, portal vein, and hepatic vein. A standard dataset in which these structures had been segmented and assigned colors was created from the full-color Visible Human Female (VHF) and then optimally fused to the fresh CT Visible Human Female. This semi-automatic segmentation and coloring of the CT dataset was subjectively evaluated to be reasonably accurate. The transformation could be viewed interactively on the ImmersaDesk, in an immersive Virtual Reality (VR) environment. This 3D segmentation and visualization method marks the first step toward a broader, standardized automatic structure visualization method for radiological data. Such a method would permit segmentation of radiological data by canonical structure information and not just from the data's intrinsic dynamic range.
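The fusion criterion named here, mutual information, can be estimated from the joint intensity histogram of two images. The sketch below shows that standard textbook estimator; it is not the MIAMI Fuse implementation, and the function name and `bins` parameter are assumptions for illustration:

```python
import numpy as np

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    """Estimate mutual information between two images from their
    joint intensity histogram: MI = sum p(x,y) log(p(x,y) / (p(x)p(y))).
    High MI indicates statistically dependent (well-aligned) images.
    """
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)       # marginal of a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of b
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

A registration method maximizes this quantity over spatial transformations: an image compared with itself scores its own entropy, while two unrelated images score near zero, which is why MI works across modalities such as CT and full-color cryosection data.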


Subject(s)
Anatomy, Cross-Sectional , Imaging, Three-Dimensional , Radiographic Image Enhancement , Tomography, X-Ray Computed , User-Computer Interface , Angiography , Color , Humans , Image Processing, Computer-Assisted
3.
Article in English | MEDLINE | ID: mdl-10977581

ABSTRACT

Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks with low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular, general-purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object, which is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network.
We tested the environment with three nodes: an ImmersaDesk at the VRMedLab, a CAVE at the Electronic Visualization Laboratory (EVL) on the east campus, and a CT scanner in the UIC Hospital. CT data was pulled directly from the scanner to the Tele-Immersion server in our laboratory, and the data was then synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than confining medical volume visualization to a single VR device, the Tele-Immersive environment combines teleconferencing, tele-presence, and virtual reality so that geographically distributed clinicians can intuitively interact with the same medical volumetric models, point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
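One of the collaborative interactions the abstract lists is windowing, i.e. mapping a chosen range of CT Hounsfield values to the displayable intensity range. A minimal sketch of that mapping follows; the function name and signature are hypothetical and do not reflect the DICOM loader's actual API:

```python
import numpy as np

def apply_window(hu: np.ndarray, center: float, width: float) -> np.ndarray:
    """Map CT Hounsfield values to [0, 1] display intensities using a
    window center/width. Values below the window clamp to 0, values
    above clamp to 1; the window interior maps linearly.
    """
    lo = center - width / 2.0
    hi = center + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)
```

For example, a soft-tissue window of roughly center 40 / width 400 maps air (about -1000 HU) to black and dense bone to white, while spreading the soft-tissue range across the available contrast; in a shared VR session each participant would see the same window settings applied to the distributed volume.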


Subject(s)
Image Processing, Computer-Assisted/instrumentation , Internet/instrumentation , Magnetic Resonance Imaging/instrumentation , Teleradiology/instrumentation , Tomography, X-Ray Computed/instrumentation , User-Computer Interface , Humans , Medical Records Systems, Computerized/instrumentation , Radiology Information Systems/instrumentation , Software , Telecommunications/instrumentation
4.
Proc AMIA Symp ; : 345-8, 1999.
Article in English | MEDLINE | ID: mdl-10566378

ABSTRACT

This paper describes the development of the Virtual Pelvic Floor, a new method of teaching the complex anatomy of the pelvic region utilizing virtual reality and advanced networking technology. Virtual reality technology allows improved visualization of three-dimensional structures over conventional media because it supports stereo vision, viewer-centered perspective, large angles of view, and interactivity. Two or more ImmersaDesk systems (drafting-table-format virtual reality displays) are networked together, providing an environment where teacher and students share a high-quality three-dimensional anatomical model and are able to converse, see each other, and point in three dimensions to indicate areas of interest. This project was realized by the teamwork of surgeons, medical artists and sculptors, computer scientists, and computer visualization experts. It demonstrates the future of virtual reality for surgical education and applications for the Next Generation Internet.


Subject(s)
Anatomy, Cross-Sectional , Anatomy/education , Pelvic Floor/anatomy & histology , User-Computer Interface , Humans