Skull Base Surgery Navigation System Based on Updating Preoperative Images Using Positional Information of Surgical Tools
Please use this identifier to cite or link to this publication: http://hdl.handle.net/10380/3428
DOI: https://doi.org/10.54294/ty4p4n
Published in The MIDAS Journal - MICCAI 2013 Workshop: Systems and Architectures for Computer Assisted Interventions.
In this paper, we introduce a new concept of surgical navigation that processes information interactively between the real and virtual spaces, namely, updating preoperative images using the positional information of surgical tools. Although organs are deformed by operative procedures during surgery, surgical navigation systems usually do not modify the reference images acquired before surgery. It is therefore useful to generate deformed reference images as surgery progresses. We developed a skull base surgery navigation system that updates the preoperative images during surgery. To estimate the resected regions, the proposed system uses the positional information of the surgical tools tracked by the navigation system. It reflects bone removal in the preoperative images by changing their voxel values according to the tracked tool positions, and the updated reference images are generated by visualizing the modified volume with a volume rendering method. We evaluated the proposed system on a skull phantom fabricated from CT images with a 3D printer. The experimental results showed that the system updated the reference images in real time as surgical tasks, including the bone removal process, were performed. The accuracy of the proposed method was about 1 mm. This capability is very useful for surgeons drilling into complex bone structures such as the skull base.
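The following is a minimal sketch of such a voxel update, not the authors' implementation: it assumes the preoperative CT volume is available as a NumPy array of Hounsfield values, that the tracked tool tip has already been transformed into voxel (i, j, k) coordinates, and that bone removal is approximated as a sphere of the drill radius around the tip. The names mark_resection, AIR_HU, and the parameters are illustrative.

```python
import numpy as np

AIR_HU = -1000  # value used to mark removed bone as air (assumed)

def mark_resection(volume, spacing, tip_ijk, drill_radius_mm):
    """Set voxels within the drill radius of the tracked tool tip to air.

    volume          : 3-D NumPy array of CT intensities (HU), indexed [i, j, k]
    spacing         : voxel size in mm along each axis, e.g. (0.5, 0.5, 0.5)
    tip_ijk         : tool-tip position already transformed into voxel indices
    drill_radius_mm : effective radius of the drill burr in mm
    """
    spacing = np.asarray(spacing, dtype=float)
    tip = np.asarray(tip_ijk, dtype=float)

    # Bounding box of the sphere around the tip, clipped to the volume extent.
    half = np.ceil(drill_radius_mm / spacing).astype(int)
    lo = np.maximum(np.floor(tip).astype(int) - half, 0)
    hi = np.minimum(np.floor(tip).astype(int) + half + 1, volume.shape)

    # Physical distance (squared) of each voxel in the box from the tool tip.
    grid = np.mgrid[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
    d2 = sum(((grid[a] - tip[a]) * spacing[a]) ** 2 for a in range(3))

    # Overwrite voxels inside the drill sphere; the view writes back into volume.
    sub = volume[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
    sub[d2 <= drill_radius_mm ** 2] = AIR_HU
    return volume
```

In a system of the kind described, a routine like this would be invoked each time the tracker reports a new tool-tip position, and the modified volume would then be passed to the volume renderer so that the drilled region disappears from the updated reference images.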