Visual navigation system for endoscopic surgery
Abstract
An endoscopic surgical navigation system comprises a data acquisition subsystem, a tracking subsystem, a registration subsystem, a data processing subsystem and a user interface subsystem. The data acquisition subsystem inputs intra-operative scan data from a medical scanning device during an endoscopic procedure. The tracking subsystem captures data representing positions and orientations of a flexible endoscope during the endoscopic procedure. The registration subsystem determines transformation parameters for coregistering the intra-operative scan data and the data indicative of positions and orientations of the endoscope. The data processing subsystem coregisters the intra-operative scan data and the data indicative of positions and orientations of the endoscope based on the transformation parameters and generates real-time image data representing 3D internal views of a body that are coregistered with live video from an endoscopic video camera. The user interface subsystem receives input from a user for controlling the system and provides output to the user.
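The abstract's subsystem decomposition can be illustrated as a small set of cooperating components. The Python below is a hypothetical sketch for exposition only: every class, method, and placeholder value is invented and is not the patented implementation.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Pose:
    """Position and orientation of the endoscope, in tracker coordinates."""
    position: np.ndarray      # 3-vector
    orientation: np.ndarray   # 3x3 rotation matrix


class TrackingSubsystem:
    """Captures endoscope poses during the procedure (stubbed here)."""

    def capture_pose(self) -> Pose:
        return Pose(position=np.zeros(3), orientation=np.eye(3))


class RegistrationSubsystem:
    """Determines transformation parameters; modeled as a rigid 4x4 matrix."""

    def determine_transform(self) -> np.ndarray:
        return np.eye(4)  # identity placeholder


class DataProcessingSubsystem:
    """Coregisters tracked poses with scan data using the transform."""

    def __init__(self, tracker_to_scan: np.ndarray):
        self.tracker_to_scan = tracker_to_scan

    def pose_in_scan_space(self, pose: Pose) -> np.ndarray:
        homogeneous = np.append(pose.position, 1.0)
        return (self.tracker_to_scan @ homogeneous)[:3]


# Wire the pipeline together the way the abstract describes it.
tracker = TrackingSubsystem()
registration = RegistrationSubsystem()
processor = DataProcessingSubsystem(registration.determine_transform())
tip = processor.pose_in_scan_space(tracker.capture_pose())
```

The data acquisition and user interface subsystems are omitted from the sketch; they would feed scan volumes in and composite the rendered views and live video out.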
18 Claims
1. A method comprising:

during an endoscopic procedure on a body performed by a surgeon, inputting intra-operative scan data generated during the endoscopic procedure by a medical scanning device, the intra-operative scan data being representative of a region of interest in the body;

inputting pre-operative scan data generated prior to the endoscopic procedure by a medical scanning device, the pre-operative scan data being representative of the region of interest in the body;

coregistering the pre-operative scan data with the intra-operative scan data;

capturing data indicative of positions and orientations of a flexible endoscope during the endoscopic procedure;

generating real-time three-dimensional surface models of the region of interest based on the intra-operative scan data, the pre-operative scan data, and the data indicative of positions and orientations of the flexible endoscope;

coregistering the generated real-time three-dimensional surface models with live video images generated by a video camera coupled to the flexible endoscope; and

displaying, simultaneously and coregistered with each other, the generated real-time three-dimensional surface models and the live video images on a display device, wherein the real-time three-dimensional surface models are rendered from a point of view based on the position and orientation of the body during the endoscopic procedure and not on the point of view of the camera.

(Dependent claims 2-10.)
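The claim recites coregistering pre-operative with intra-operative scan data but does not specify a method. A common approach is to estimate a rigid transform between corresponding landmark points; the Kabsch/Procrustes algorithm below is one plausible sketch, not the patent's technique.

```python
import numpy as np


def rigid_register(moving: np.ndarray, fixed: np.ndarray):
    """Estimate rotation R and translation t so that R @ m + t ~ f.

    Both inputs are (N, 3) arrays of corresponding landmark points.
    Classic Kabsch algorithm: SVD of the cross-covariance matrix of
    the mean-centred point sets.
    """
    mu_m, mu_f = moving.mean(axis=0), fixed.mean(axis=0)
    H = (moving - mu_m).T @ (fixed - mu_f)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_f - R @ mu_m
    return R, t
```

With noiseless corresponding points the transform is recovered exactly; in practice the landmarks would come from fiducials or anatomical features identified in both scans.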
11. An endoscopic surgical navigation system comprising:

a data acquisition subsystem to input pre-operative scan data and intra-operative scan data representative of a region of interest of a body provided by a medical scanning device during an endoscopic procedure;

a tracking subsystem to capture data indicative of positions and orientations of a flexible endoscope during the endoscopic procedure performed by a surgeon;

a registration subsystem to determine transformation parameters for use in coregistering the pre-operative scan data, the intra-operative scan data, and the data indicative of positions and orientations of the flexible endoscope;

a data processing device to coregister the pre-operative scan data, the intra-operative scan data, and the data indicative of positions and orientations of the flexible endoscope based on the transformation parameters and to generate, in real-time, surface models representing three-dimensional internal views of the region of interest based on the intra-operative scan data, the pre-operative scan data, and the data indicative of positions and orientations of the flexible endoscope, wherein the generated surface models are coregistered with live video generated by a video camera coupled to the flexible endoscope; and

a user interface subsystem to receive input from a user for controlling said endoscopic surgical navigation system and to provide output to the user, including displaying, simultaneously and coregistered with each other, the generated real-time three-dimensional surface models and the live video images on a display device, wherein the real-time three-dimensional surface models are rendered from a point of view based on the position and orientation of the body during the endoscopic procedure and not on the point of view of the camera.

(Dependent claims 12-17.)
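The wherein clause in claim 11 fixes the rendering viewpoint to the body rather than to the camera. One hedged reading: the renderer's view matrix is derived from the tracked body pose and deliberately does not change as the endoscope camera moves. The sketch below illustrates that reading; all names are invented.

```python
import numpy as np


def look_at(eye, target, up):
    """Build a right-handed 4x4 view matrix from a viewpoint specification."""
    f = target - eye
    f = f / np.linalg.norm(f)
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view


def model_view_matrix(body_pose_eye, camera_pose_eye):
    """Per the claim: the view depends on the body pose, not the camera.

    camera_pose_eye is accepted but intentionally ignored, so the surface
    model's viewpoint stays stable while the endoscope moves.
    """
    return look_at(body_pose_eye,
                   target=np.zeros(3),
                   up=np.array([0.0, 0.0, 1.0]))
```

This stability is the practical point: the surgeon sees the anatomy from a fixed, body-relative vantage alongside the moving endoscopic video.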
18. An endoscopic surgical navigation system comprising:

a data acquisition subsystem to input intra-operative scan data representative of a region of interest of a body provided by a medical scanning device during an endoscopic procedure;

a tracking subsystem to capture data indicative of positions and orientations of a flexible endoscope during the endoscopic procedure;

a registration subsystem to determine transformation parameters for use in coregistering the scan data and the data indicative of positions and orientations of the flexible endoscope, the registration subsystem including:
a multi-modal image coregistration module to coregister the intra-operative patient scan data with pre-operative scan data, and
an endoscope registration module operable to transform data acquired by the tracking subsystem into a common co-ordinate space, based on data indicative of positions and orientations of the flexible endoscope placed on the origin of a co-ordinate system of the medical scanning device;

a data processing device to coregister the intra-operative patient scan data, the pre-operative scan data, and the data indicative of positions and orientations of the flexible endoscope based on the transformation parameters and to generate, in real-time, surface models representing three-dimensional internal views of the region of interest based on the intra-operative scan data, the pre-operative scan data, and the data indicative of positions and orientations of the flexible endoscope, wherein the generated surface models are coregistered with live video generated by a video camera coupled to the flexible endoscope, the data processing device including:
an image reslicer to reformat images from the scan data to user-specified positions and orientations,
an anatomical model generation module to generate surface models of the region of interest from the scan data,
a transform module to apply a geometric transform on three-dimensional points within the region of interest, based on an image, a transform and an interpolator, and
a barrel-distortion correction unit to correct barrel distortion in the live video; and

a user interface subsystem to receive input from a user for controlling said endoscopic surgical navigation system and to provide output to the user, the user interface subsystem including a merge unit to mix graphical models from the anatomical model generation module, image slices generated by the image reslicer, video generated by the video camera, and text, at user-specified positions and orientations, to produce images for display including the generated surface models, wherein the real-time three-dimensional surface models are rendered from a point of view based on the position and orientation of the body during the endoscopic procedure and not on the point of view of the camera; and

a measurement subsystem to generate measurements of physical properties of anatomical features within the region of interest in response to user inputs.
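Claim 18's barrel-distortion correction unit addresses the radial distortion typical of wide-angle endoscope optics. A common model (not necessarily the one used in the patent) is the even-order polynomial radial model; the sketch maps undistorted normalized coordinates to their distorted locations, which is how a correction lookup table is usually built by inverse mapping.

```python
import numpy as np


def distort_radial(x, y, k1: float, k2: float = 0.0):
    """Apply the even-order polynomial radial distortion model.

    (x, y) are normalized image coordinates relative to the distortion
    centre.  Barrel distortion corresponds to k1 < 0: points are pulled
    toward the centre.  To correct a captured frame, evaluate this
    forward model on the output pixel grid and sample the distorted
    image at the returned coordinates.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

The "image, a transform and an interpolator" triple named in the claim's transform module mirrors the standard resampling pattern in medical-imaging toolkits, where an output image is filled by transforming each output point and interpolating the input image there.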
Specification