Systems, methods, apparatuses, and computer-readable media for image guided surgery
First Claim
1. A method for image guided surgery, comprising:
determining, with a computer system, device type information for a first medical device;
determining, with the computer system, real-time emplacement information for the first medical device;
determining, with the computer system, real-time emplacement information for a second medical device;
determining the real-time relative emplacements of the first and second medical devices with the computer system;
determining a 2D image plane based at least on the real-time emplacement information for the second medical device;
determining, with the computer system, at least based on the device type of the first medical device, real-time prediction information for the first medical device, wherein the prediction information comprises a plurality of graphical indicators of the real-time relative emplacement of a plurality of portions of the first medical device with respect to the image plane, wherein each graphical indicator indicates whether a corresponding portion of the first medical device is in a proximal or distal position with respect to the image plane;
receiving, in real-time, a 2D image corresponding to at least a portion of the determined 2D image plane from the second medical device;
determining a perspective view of the 2D image within a virtual 3D space in response to real-time relative emplacement information for the second medical device with respect to a position of a user;
generating image guidance information, with the computer system, based at least on the real-time relative emplacements of the first and second medical devices and the real-time prediction information for the first medical device;
causing one or more displays to display a graphical rendering of the image guidance information; and
causing the one or more displays to display in real time the perspective view of the 2D image within the virtual 3D space based at least on the relative emplacement information for the second medical device with respect to the position of the user.
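The claim's per-portion graphical indicators reduce to a geometric test: each tracked portion of the first medical device lies on one side of the 2D image plane or the other. The sketch below illustrates that test as a signed-distance computation; it is an illustrative sketch only, not the patented implementation, and the function and parameter names are assumptions.

```python
import numpy as np

def plane_side_indicators(device_points, plane_point, plane_normal, tol=1e-6):
    """Classify tracked portions of a medical device relative to an image plane.

    device_points: (N, 3) array of 3D positions along the device (e.g., points
                   along a needle shaft and its tip).
    plane_point:   any point lying on the image plane.
    plane_normal:  normal vector of the image plane (need not be unit length).
    Returns a label per point: 'proximal' (in front of the plane),
    'distal' (behind it), or 'in-plane' (within tolerance of it).
    """
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)  # normalize so distances are in world units
    # Signed distance of each point to the plane along the normal.
    d = (np.asarray(device_points, dtype=float) - plane_point) @ n
    return ["proximal" if di > tol else "distal" if di < -tol else "in-plane"
            for di in d]
```

A renderer could then map each label to a distinct glyph or color for the corresponding portion of the device, updating per frame as tracking data arrives.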
Abstract
Presented herein are methods, systems, devices, and computer-readable media for image guided surgery. The systems herein allow a physician to use multiple instruments for a surgery and simultaneously provide image-guidance data for those instruments. Various embodiments disclosed herein provide information to physicians about procedures they are performing, the devices (such as ablation needles, ultrasound wands or probes, scalpels, cauterizers, etc.) they are using during the procedure, the relative emplacements or poses of these devices, prediction information for those devices, and other information. Some embodiments provide useful information about 3D data sets. Additionally, some embodiments provide for quickly calibratable surgical instruments or attachments for surgical instruments.
22 Claims
1. A method for image guided surgery (recited in full under First Claim above); dependent claims 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12.
13. A system for image guided surgery, comprising:
a computer system, wherein the computer system is configured to:
determine device type information for a first medical device;
determine real-time emplacement information for the first medical device;
determine real-time emplacement information for a second medical device;
determine the real-time relative emplacements of the first and second medical devices;
determine a 2D image plane based at least on the real-time emplacement information for the second medical device;
determine, at least based on the device type of the first medical device, real-time prediction information for the first medical device, wherein the prediction information comprises a plurality of graphical indicators of the real-time relative emplacement of a plurality of portions of the first medical device and the image plane, wherein each graphical indicator indicates whether a corresponding portion of the first medical device is in front of or behind the image plane;
receive, in real-time, a 2D image corresponding to at least a portion of the determined 2D image plane from the second medical device;
determine a perspective view of the 2D image within a virtual 3D space in response to real-time relative emplacement information for the second medical device with respect to a position of a user;
generate image guidance information based at least on the real-time relative emplacements of the first and second medical devices and the real-time prediction information for the first medical device;
cause one or more displays to display a graphical rendering of the image guidance information; and
cause the one or more displays to display in real time the perspective view of the 2D image within the virtual 3D space.
Dependent claims: 14, 15, 16, 17, 18, 19, 20, 21.
22. A non-transitory computer-readable medium comprising computer-executable instructions that, when executed, cause one or more processors to:
determine device type information for a first medical device;
determine real-time emplacement information for the first medical device;
determine real-time emplacement information for a second medical device;
determine the real-time relative emplacements of the first and second medical devices;
determine a 2D image plane based at least on the real-time emplacement information for the second medical device;
determine, at least based on the device type of the first medical device, real-time prediction information for the first medical device, wherein the prediction information comprises a plurality of graphical indicators of the real-time relative emplacement of a plurality of portions of the first medical device and the image plane, wherein each graphical indicator indicates whether a corresponding portion of the first medical device is in front of or behind the image plane;
receive, in real-time, a 2D image corresponding to at least a portion of the determined 2D image plane from the second medical device;
determine a perspective view of the 2D image within a virtual 3D space in response to real-time relative emplacement information for the second medical device with respect to a position of a user;
generate image guidance information based at least on the real-time relative emplacements of the first and second medical devices and the real-time prediction information for the first medical device;
cause one or more displays to display a graphical rendering of the image guidance information; and
cause the one or more displays to display in real time the perspective view of the 2D image within the virtual 3D space.
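One common way to realize the claimed perspective view is to place the live 2D image as a textured quad inside the virtual 3D scene, positioned by the tracked pose of the imaging device (e.g., an ultrasound probe); a renderer then projects that quad relative to the user's viewing position. The sketch below computes the quad's world-space corners from a 4x4 homogeneous pose. It is illustrative only; the pose convention (image plane spanning the local x/y axes below the probe face) and all names are assumptions, not the patent's implementation.

```python
import numpy as np

def image_quad_in_world(probe_pose, width_m, depth_m):
    """Place the four corners of a 2D image into virtual 3D space.

    probe_pose: 4x4 homogeneous transform from probe-local to world coordinates,
                as produced by a tracking system.
    width_m:    lateral width of the image, in meters.
    depth_m:    imaging depth of the image, in meters.
    Returns a (4, 3) array of world-space corner positions, ordered
    counter-clockwise starting from the top-left corner.
    """
    w, d = width_m, depth_m
    # Corners in probe-local coordinates: the image spans the local x axis
    # (lateral) and extends along the local y axis (depth), in homogeneous form.
    corners = np.array([[-w / 2, 0.0, 0.0, 1.0],
                        [ w / 2, 0.0, 0.0, 1.0],
                        [ w / 2, d,   0.0, 1.0],
                        [-w / 2, d,   0.0, 1.0]])
    # Transform to world space and drop the homogeneous coordinate.
    return (probe_pose @ corners.T).T[:, :3]
```

Rendering this quad with the current camera placed at the user's tracked head position yields the claimed user-relative perspective view, updated each frame as the probe and user move.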
Specification