Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
Abstract
Methods and system perform tool tracking during minimally invasive robotic surgery. Tool states are determined using triangulation techniques or a Bayesian filter from either or both non-endoscopically derived and endoscopically derived tool state information, or from either or both non-visually derived and visually derived tool state information. The non-endoscopically derived tool state information is derived from sensor data provided either by sensors associated with a mechanism for manipulating the tool, or sensors capable of detecting identifiable signals emanating or reflecting from the tool and indicative of its position, or external cameras viewing an end of the tool extending out of the body. The endoscopically derived tool state information is derived from image data provided by an endoscope inserted in the body so as to view the tool.
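The fusion summarized in the abstract can be illustrated with a minimal scalar Kalman-style update (a sketch only; the function name, values, and variances here are invented, not taken from the patent): a kinematics-derived (non-endoscopic) estimate serves as the prediction and an image-derived (endoscopic) measurement corrects it, each weighted by its uncertainty.

```python
def fuse_estimates(kinematic_pos, kinematic_var, image_pos, image_var):
    """Fuse a kinematics-derived and an image-derived position estimate.

    Classic scalar Kalman update: the fused estimate weights each source
    by the inverse of its variance, so the more trusted source dominates.
    """
    gain = kinematic_var / (kinematic_var + image_var)
    fused_pos = kinematic_pos + gain * (image_pos - kinematic_pos)
    fused_var = (1.0 - gain) * kinematic_var
    return fused_pos, fused_var

# Example: kinematics reports 10.0 mm (variance 4.0),
# vision reports 12.0 mm (variance 1.0); the fused estimate leans
# toward the lower-variance vision measurement.
pos, var = fuse_estimates(10.0, 4.0, 12.0, 1.0)
```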
553 Citations
126 Claims
1. A tool tracking method comprising:
- tracking a tool by processing non-endoscopically derived tool state information and endoscopically derived tool state information generated while the tool is inserted and being manipulated through a minimally invasive incision in a body.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26)
- 11. The method according to claim 10, wherein quality measures are respectively derived for the non-endoscopically and the endoscopically derived tool states, and weightings of contributions of the non-endoscopically and the endoscopically derived tool states in the Bayesian filter are determined by the quality measures.
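One simple reading of quality-weighted contributions is to treat each source's quality measure as an inverse variance (this mapping is an assumption for illustration; the claim does not specify it):

```python
def quality_weighted_fusion(estimates):
    """Fuse several tool-state estimates, weighting each by its quality.

    `estimates` is a list of (value, quality) pairs; quality is treated
    here as an inverse-variance weight, so higher-quality sources
    contribute more to the fused result.
    """
    total_weight = sum(q for _, q in estimates)
    return sum(v * q for v, q in estimates) / total_weight

# Endoscopic estimate 2.0 with quality 3.0, kinematic estimate 6.0
# with quality 1.0: the fused value sits closer to the endoscopic one.
fused = quality_weighted_fusion([(2.0, 3.0), (6.0, 1.0)])
```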
27. A tool tracking method comprising:
- receiving sensor information indicative of a position and orientation of a tool when the tool is inserted through an incision in a body;
- receiving image information for the tool; and
- determining the position and orientation of the tool using both the sensor and the image information.
- View Dependent Claims (28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53)
54. A minimally invasive robotic surgery system with tool tracking, comprising:
- one or more non-endoscopic devices providing data from which non-endoscopically derived tool state information is generated when a tool is inserted and robotically manipulated through an incision in a body;
- an endoscope capturing images from which endoscopically derived tool state information is generated for an area within the body when the tool is inserted therein; and
- a processor configured to process the non-endoscopically and endoscopically derived tool state information for tracking the state of the tool.
- View Dependent Claims (55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67)
68. A minimally invasive robotic surgery system with tool tracking, comprising:
- one or more sensors providing sensor data from which non-visually derived tool state information for a tool is generated when the tool is inserted and robotically manipulated through an incision in a body;
- at least one camera capturing image information of the tool when the tool is inserted therein; and
- a processor configured to process the non-visually derived tool state information and the image information for tracking the state of the tool.
- View Dependent Claims (69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86)
87. A tool tracking method comprising:
- determining a computer model of a tool;
- receiving a captured image including a view of the tool;
- determining an estimated position and orientation of the tool from the captured image, and positioning and orienting the computer model at that estimated position and orientation in reference to the captured image; and
- modifying the estimated position and orientation of the computer model with respect to an image of the tool in the captured image until the computer model approximately overlays the image so as to correct the estimated position and orientation of the tool for the captured image.
- View Dependent Claims (88, 89, 90, 91, 92, 93, 94)
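The model-overlay refinement of claim 87 can be sketched as an iterative 2-D pose search (this coordinate-descent scheme, the point sets, and all names are illustrative stand-ins; the patent does not prescribe a particular optimizer):

```python
import numpy as np

def refine_pose(model_pts, image_pts, pose0, step=0.1, min_step=1e-5):
    """Adjust pose (tx, ty, theta) until the transformed model
    approximately overlays the observed tool points.

    Coordinate descent with a shrinking step: nudge each pose parameter
    and keep changes that reduce the mean squared overlay error.
    """
    def transform(pose):
        tx, ty, th = pose
        rot = np.array([[np.cos(th), -np.sin(th)],
                        [np.sin(th),  np.cos(th)]])
        return model_pts @ rot.T + np.array([tx, ty])

    def error(pose):
        return float(np.mean(np.sum((transform(pose) - image_pts) ** 2, axis=1)))

    pose = np.array(pose0, dtype=float)
    while step > min_step:
        improved = False
        for i in range(3):
            for delta in (step, -step):
                trial = pose.copy()
                trial[i] += delta
                if error(trial) < error(pose):
                    pose, improved = trial, True
        if not improved:
            step *= 0.5
    return pose

# Synthetic check: the "image" is the model shifted by (1.0, -0.5)
model = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
observed = model + np.array([1.0, -0.5])
pose = refine_pose(model, observed, pose0=(0.0, 0.0, 0.0))
```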
95. A tool tracking method comprising:
- determining whether sensor data indicative of a tool state is available for a point in time;
- determining whether image data indicative of the tool state is available for the point in time; and
- determining the tool state using both the sensor data and the image data if both are available for the point in time, or using only the sensor data if only the sensor data is available, or using only the image data if only the image data is available.
- View Dependent Claims (96, 97, 98, 99, 100)
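The branch structure of claim 95 maps directly onto a small dispatch function (a sketch; the averaging used as the "both available" path is a placeholder for whatever fusion the system actually applies):

```python
def estimate_tool_state(sensor=None, image=None):
    """Pick the estimation path based on which data sources are available
    for a point in time: fuse when both are present, otherwise fall back
    to whichever source exists.
    """
    if sensor is not None and image is not None:
        return 0.5 * (sensor + image)  # placeholder fusion rule
    if sensor is not None:
        return sensor
    if image is not None:
        return image
    raise ValueError("no data available for this point in time")

# All three availability cases:
both = estimate_tool_state(sensor=4.0, image=6.0)
sensor_only = estimate_tool_state(sensor=4.0)
image_only = estimate_tool_state(image=6.0)
```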
101. A tool tracking method comprising:
- determining a first estimated tool state relative to a landmark for a point in time using first sensor data indicative of the tool state at the point in time;
- determining an estimated camera state relative to the landmark for the point in time using second sensor data indicative of the camera state at the point in time;
- determining a second estimated tool state relative to the camera for the point in time using image data generated by the camera and indicative of the tool state at the point in time;
- translating the first estimated tool state so as to be relative to the camera instead of the landmark; and
- computing an error transform between the translated first and the second estimated tool states so that, at a subsequent point in time, if image data indicative of the tool state at the subsequent point in time is not available, then the tool state is determined by applying the error transform to a third estimated tool state determined using sensor data indicative of the tool state at the subsequent point in time, translated so as to be relative to the camera instead of the landmark.
- View Dependent Claims (102, 103)
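The error-transform idea of claim 101 can be shown with 2-D homogeneous transforms (all pose values here are invented, and the planar SE(2) representation is a simplification of the 3-D case): the transform relating the kinematic estimate to the image-based estimate is computed once, then reused to correct later kinematic estimates when no image data is available.

```python
import numpy as np

def se2(tx, ty, th):
    """Homogeneous 2-D rigid transform (rotation th, translation tx, ty)."""
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

# Tool pose from kinematics, already translated into the camera frame
kin_pose_cam = se2(1.0, 2.0, 0.1)
# Tool pose observed in the image (slightly different: kinematics is biased)
img_pose_cam = se2(1.05, 1.9, 0.12)

# Error transform: what must be composed with the kinematic estimate
# to reproduce the image-based estimate.
error_tf = img_pose_cam @ np.linalg.inv(kin_pose_cam)

# Later, when no image data is available, correct a fresh kinematic estimate:
kin_pose_later = se2(1.2, 2.1, 0.1)
corrected = error_tf @ kin_pose_later
```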
104. A tool tracking method comprising:
- determining non-endoscopically derived estimated state information for a tool at a given time;
- determining endoscopically derived estimated state information for the tool at the given time; and
- providing the non-endoscopically and endoscopically derived estimated states for the tool to a Bayesian filter configured so as to generate an optimal estimate of the state of the tool.
- View Dependent Claims (105, 106, 107, 108, 109, 110, 111)
112. A tool tracking and calibration method comprising:
- generating visually derived state information from image data received from a camera viewing a tool;
- generating state vector information by combining initial values for a set of camera parameters with the visually derived state information; and
- providing the state vector information to a Bayesian filter for processing so as to generate an optimal estimate of a state of the tool and corrected values for the set of camera parameters.
- View Dependent Claims (113, 114, 115)
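The idea behind claim 112 is joint estimation: the state vector carries both the tool state and camera parameters, so the filter corrects the camera calibration as a side effect of tracking. A deliberately simplified stand-in (the linear measurement model, known depth, and a least-squares solve instead of a Bayesian filter are all assumptions made for illustration):

```python
import numpy as np

# A tool carries markers at known offsets along its shaft (illustrative).
offsets = np.array([0.0, 1.0, 2.0, 3.0])
z = 10.0                     # depth, taken as known for simplicity
true_f, true_x = 1.2, 4.0    # "unknown" camera focal scale and tool position
u = true_f * (true_x + offsets) / z   # simulated image measurements

# u_i = (f*x)/z + (f/z)*offset_i is linear in a = f*x/z and b = f/z,
# so a single least-squares fit jointly recovers the tool state and the
# camera parameter -- the same augmented-state idea, minus the filtering.
A = np.column_stack([np.ones_like(offsets), offsets])
a, b = np.linalg.lstsq(A, u, rcond=None)[0]
f_est = b * z        # corrected camera parameter
x_est = a / b        # estimated tool position
```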
116. A camera tracking method comprising:
- determining a position of a tool in a fixed reference frame from non-visually derived tool state information generated from sensor data indicative of the position of the tool;
- determining a position of the tool in a camera frame moveable with a camera using visually derived tool state information generated from image data provided by the camera while viewing the tool; and
- determining a position of the camera in the fixed reference frame using the position of the tool in the fixed reference frame and the position of the tool in the moveable camera frame.
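Claim 116 is a frame-composition argument: if the same tool point is known in both the fixed frame and the camera frame, the camera's position falls out of world = R·cam + t. A numeric sketch (the camera orientation is assumed known here, which goes beyond what the claim states; all values are invented):

```python
import numpy as np

# Tool position in the fixed (world) frame, from kinematics sensors
tool_world = np.array([3.0, 1.0, 5.0])
# The same tool point seen in the camera frame, from image data
tool_cam = np.array([0.5, -0.2, 4.0])
# Camera orientation in the world frame (assumed known; 90-degree yaw)
R_cam = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])

# tool_world = R_cam @ tool_cam + cam_world
# => camera position: cam_world = tool_world - R_cam @ tool_cam
cam_world = tool_world - R_cam @ tool_cam
```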
117. A tool tracking method comprising:
- determining a position of a camera in a fixed reference frame from non-visually derived camera state information generated from sensor data indicative of the position of the camera;
- determining a position of a tool in a camera frame moveable with the camera using visually derived tool state information generated from image data provided by the camera while viewing the tool; and
- determining a position of the tool in the fixed reference frame using the position of the camera in the fixed reference frame and the position of the tool in the moveable camera frame.
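Claim 117 runs the same composition in the other direction: with the camera pose known in the fixed frame, the image-derived tool position is mapped into that frame. A numeric sketch (all values invented; rotation plus translation represent the sensed camera pose):

```python
import numpy as np

# Camera pose in the fixed frame, from sensor data (90-degree yaw)
R_cam = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
t_cam = np.array([2.0, 0.0, 1.0])
# Tool position as seen by the camera, from image data
tool_cam = np.array([1.0, 0.5, 6.0])

# Compose camera pose with the camera-frame observation:
tool_world = R_cam @ tool_cam + t_cam
```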
118. A tool tracking method comprising:
- generating a plurality of estimated tool states for each point in a plurality of points in time, while the tool is inserted and being manipulated through an incision in a body; and
- determining an optimal estimated tool state for each point in the plurality of points in time by processing the plurality of estimated tool states using Bayesian techniques.
- View Dependent Claims (119, 120, 121, 122, 123, 124, 125, 126)
Specification