2 Materials and Methods

We designed and implemented a surgical navigator based on virtual reality. The navigator offers the clinician a software application with a 3D virtual scene in which the surgical instrument is visualized coherently with respect to a 3D model of the patient's anatomy. An electromagnetic localizer provides the positions and orientations needed to render the objects correctly. Furthermore, we studied and developed a procedure to obtain a phantom for in-vitro testing of the system.

2.1 Prototypal System Architecture

The prototypal system comprises a Philips iU22 ultrasound system, an electromagnetic localizer (Aurora®, Northern Digital Inc., Canada), a sensorized surgical instrument, and a laptop running an in-house software application. Figure 1 shows the prototypal architecture.

Fig. 1. General description of the architecture: an EM sensor is embedded in the surgical tool; an EM sensor is fixed to the US transducer

Several basic problems had to be addressed in designing and implementing the US navigation system: virtual information representation, instrument tracking and calibration, and the design and implementation of the Graphical User Interface (GUI).

Virtual Information Representation

Obtaining the 3D model of the anatomy requires an imaging source: we chose intraoperative imaging to avoid the critical registration problem. We chose 3D ultrasound (US) because it presents several advantages with respect to other imaging modalities: it is portable, less expensive, non-invasive, and does not require a dedicated room.

To acquire the ultrasound (US) volumetric dataset we used the trans-vaginal V9-4v volume curved array transducer connected to the Philips® iU22 ultrasound system. The transducer has an operating frequency range of 4 to 9 MHz, a 150-degree field of view, and 2D, 3D and 4D imaging modes including Doppler, STIC, PW Doppler, M-mode and CPA. We verified that the transducer was suitable for our purpose by acquiring four volumetric datasets of the prostate during trans-rectal US examinations of four male patients suffering from benign prostatic hyperplasia (BPH). First, we resampled the US volumes to obtain cubic voxels of constant size. Figure 2 shows the prostate of one patient. We then processed the volumes to build virtual 3D models of the prostate and of the adenoma. These patient-specific 3D models were used to build a phantom, required to validate the navigation system.
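As an illustration only (not the authors' pipeline), the following Python sketch shows one way to resample an anisotropic US volume to cubic voxels of constant size using SciPy; the function name `to_isotropic` and the inputs `volume`, `spacing` and `target` are our own assumptions.

```python
# Hypothetical sketch: resampling an anisotropic US volume to cubic voxels.
import numpy as np
from scipy.ndimage import zoom


def to_isotropic(volume, spacing, target=None):
    """Resample `volume` (z, y, x array) with per-axis `spacing` in mm
    to cubic voxels of edge `target` mm (default: the smallest spacing)."""
    spacing = np.asarray(spacing, dtype=float)
    target = float(target) if target is not None else spacing.min()
    factors = spacing / target                 # zoom factor per axis
    resampled = zoom(volume, factors, order=1)  # trilinear interpolation
    return resampled, (target,) * 3
```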

Fig. 2. Ultrasound scan of a human prostate acquired by using a trans-rectal V9-4v volume curved array transducer and a Philips® iU22 ultrasound system

Localization

The localizer allows real-time tracking of the position and orientation of the tracked objects and is therefore necessary to coherently represent the information provided to the user in the virtual reality scene.

We chose EM tracking because the poor visibility and limited space of the surgical environment do not allow the direct line of sight required by more precise optical technologies.

We fixed a six-degree-of-freedom (DOF) electromagnetic sensor to the V9-4v volume curved array transducer. In the prototypal system we used a cylindrical instrument simulating the surgical cutting device. We embedded an Aurora® 5 DOF electromagnetic sensor (0.5 mm diameter) in the instrument to track its position and orientation.
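As a hedged illustration (not NDI's API nor the authors' code), the sketch below shows how a single EM reading, assumed here to be a position in millimetres plus a unit quaternion, can be turned into a 4x4 homogeneous pose so that every tracked object is expressed in the localizer reference frame; `pose_matrix` is a hypothetical helper name.

```python
# Hypothetical sketch: one EM reading -> 4x4 pose in the localizer frame.
import numpy as np
from scipy.spatial.transform import Rotation


def pose_matrix(position, quat_wxyz):
    """Homogeneous transform from sensor frame to localizer frame."""
    w, x, y, z = quat_wxyz
    T = np.eye(4)
    T[:3, :3] = Rotation.from_quat([x, y, z, w]).as_matrix()  # scipy uses (x, y, z, w)
    T[:3, 3] = position
    return T


# With two readings (probe sensor and tool sensor), the tool expressed in the
# probe-sensor frame is: T_probe_tool = np.linalg.inv(T_loc_probe) @ T_loc_tool
```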

Calibration

By reading the position and orientation of the sensors fixed on the objects, the localizer allows all sensors to be referred to the same reference frame (fixed on the localizer itself). It is then necessary to determine the relationship between each sensor and the object on which it is mounted. In particular, in our case, a calibration procedure was needed to obtain the position and orientation of the US scan volume from the 3D position and orientation of the sensor attached to the US probe. To this aim, we implemented a software routine and built a calibration phantom. The phantom was a simple three-point phantom. It was placed in an aqueous environment, and we acquired the US volumetric dataset and, simultaneously, electromagnetic (EM) measurements at a rate of one acquisition per second. We acquired the data in ten different positions. We adopted a closed-form solution to the problem (Figure 3), which reduces to solving an equation of the type AX = XB, well known in robotics. The A matrices are determined from the localizer measurements; the B matrices are obtained by means of a point-based rigid registration routine applied to the points' coordinates in the US volume reference frame in two poses. A detailed description of these routines can be found in our previous works [8-9]. The calibration of the cylindrical instrument was performed easily, since the embedded sensor was aligned with the object's main axis.
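For illustration, the sketch below shows one possible closed-form solution of AX = XB using the classical quaternion formulation; it is not the specific routine described in [8-9]. The inputs A_i and B_i are assumed to be the relative motions of the probe sensor (from the localizer) and of the US volume (from the point-based registration) between pose pairs, given as 4x4 homogeneous matrices; the function name `solve_ax_xb` is ours.

```python
# Hypothetical sketch: quaternion-based closed-form solution of A X = X B.
import numpy as np
from scipy.spatial.transform import Rotation


def _quat_wxyz(R):
    """Rotation matrix -> unit quaternion in (w, x, y, z) order."""
    x, y, z, w = Rotation.from_matrix(R).as_quat()
    return np.array([w, x, y, z])


def _left(q):   # left quaternion-multiplication matrix: _left(q) @ p == q * p
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])


def _right(q):  # right quaternion-multiplication matrix: _right(q) @ p == p * q
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])


def solve_ax_xb(A_list, B_list):
    """Return X (4x4) such that A_i X ~= X B_i for all motion pairs."""
    # Rotation: stack (L(qA) - R(qB)) q_X = 0 and take the null space via SVD.
    M = np.vstack([_left(_quat_wxyz(A[:3, :3])) - _right(_quat_wxyz(B[:3, :3]))
                   for A, B in zip(A_list, B_list)])
    q_x = np.linalg.svd(M)[2][-1]          # right singular vector of smallest sigma
    R_x = Rotation.from_quat([q_x[1], q_x[2], q_x[3], q_x[0]]).as_matrix()

    # Translation: (R_A - I) t_X = R_X t_B - t_A, solved in least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.hstack([R_x @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_x = np.linalg.lstsq(C, d, rcond=None)[0]

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_x, t_x
    return X
```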

Fig. 3. Schematic of the calibration procedure used to determine the position and orientation of the US scan volume from the 3D position and orientation of the sensor attached to the US probe. The indices n and p range from 1 to 10 with the condition n ≠ p.

Graphical User Interface (GUI)

Our navigator provides two kinds of visualization: a classical 3D virtual scene in which the clinician can change the point of view, and a view in which the camera and the light of the scene are attached to the tip of the instrument, producing an endoscopic view. Figure 4 shows the GUI of the navigation system.
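As a hypothetical sketch of the second visualization (not the authors' implementation), the camera parameters for the endoscopic view can be derived from the tracked tool-tip pose; here the tool axis is assumed to be the local +z direction and `look_ahead_mm`, like the function name `endoscopic_camera`, is our own choice.

```python
# Hypothetical sketch: camera parameters for an endoscopic view from the tool pose.
import numpy as np


def endoscopic_camera(T_tip, look_ahead_mm=30.0):
    """Return (position, focal_point, view_up) for the virtual camera,
    given T_tip (4x4, tool-tip frame -> scene frame)."""
    position = T_tip[:3, 3]                 # camera sits on the instrument tip
    axis = T_tip[:3, 2]                     # local +z expressed in the scene frame
    focal_point = position + look_ahead_mm * axis
    view_up = T_tip[:3, 1]                  # local +y as the up direction
    return position, focal_point, view_up
```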

 