Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
First Claim
1. A method for image management in image-guided medical procedures, comprising:
obtaining pose data for a set of 3D visualizable medical data, wherein the set of 3D visualizable medical data includes a plurality of parallel slices;
determining, with one or more computing devices, a real-time pose of a tracking sensor associated with a medical device;
based at least in part on the real-time pose of the tracking sensor associated with the medical device, determining a pose of an imaging plane associated with the medical device;
selecting a region of the set of 3D visualizable medical data based at least in part on the determined pose of the imaging plane, wherein the selected region of the set of 3D visualizable medical data comprises data from the set of 3D visualizable medical data that intersects with at least a portion of the imaging plane associated with the medical device;
determining a perspective view of the selected region of the set of 3D visualizable medical data based at least in part on an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
causing the one or more displays to concurrently display, in a virtual 3D space:
a rendering of the perspective view of the selected region of the set of 3D visualizable medical data, and
image guidance data based at least in part on the real-time pose of the tracking sensor associated with the medical device.
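The region-selection step of the claim picks out which of the parallel slices a bounded imaging plane (for example, an ultrasound scan plane) actually crosses. A minimal sketch of one way to compute that, assuming the slices are evenly spaced along a common normal and the imaging plane is given by its four corner points; the function and parameter names here are illustrative, not taken from the patent:

```python
import numpy as np

def select_intersecting_slices(slice_origin, slice_normal, slice_spacing,
                               num_slices, plane_corners):
    """Return indices of parallel slices crossed by a bounded imaging plane.

    slice_origin   : 3-vector, a point on slice 0
    slice_normal   : 3-vector perpendicular to every slice
    slice_spacing  : distance between adjacent slices
    num_slices     : number of parallel slices in the 3D data set
    plane_corners  : (4, 3) array, corners of the bounded imaging plane
    """
    n = np.asarray(slice_normal, dtype=float)
    n /= np.linalg.norm(n)
    # Signed distance of each corner from slice 0, measured along the normal.
    d = (np.asarray(plane_corners, dtype=float) - slice_origin) @ n
    lo, hi = d.min(), d.max()
    # A slice at offset k * spacing is crossed iff that offset lies in [lo, hi].
    first = max(0, int(np.ceil(lo / slice_spacing)))
    last = min(num_slices - 1, int(np.floor(hi / slice_spacing)))
    return list(range(first, last + 1))
```

Because every corner is projected onto the slice normal, the same test works whether the imaging plane cuts across the slices obliquely or lies parallel to a single slice.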
Abstract
Presented herein are methods, systems, devices, and computer-readable media for image management in image-guided medical procedures. Some embodiments herein allow a physician to use multiple instruments for a surgery and simultaneously provide image-guidance data for those instruments. Various embodiments disclosed herein provide information to physicians about procedures they are performing, the devices (such as ablation needles, ultrasound transducers or probes, scalpels, cauterizers, etc.) they are using during the procedure, the relative emplacements or poses of these devices, prediction information for those devices, and other information. Some embodiments provide useful information about 3D data sets and allow the operator to control the presentation of regions of interest. Additionally, some embodiments provide for quick calibration of surgical instruments or attachments for surgical instruments.
20 Claims
1. A method for image management in image-guided medical procedures, comprising:
obtaining pose data for a set of 3D visualizable medical data, wherein the set of 3D visualizable medical data includes a plurality of parallel slices;
determining, with one or more computing devices, a real-time pose of a tracking sensor associated with a medical device;
based at least in part on the real-time pose of the tracking sensor associated with the medical device, determining a pose of an imaging plane associated with the medical device;
selecting a region of the set of 3D visualizable medical data based at least in part on the determined pose of the imaging plane, wherein the selected region of the set of 3D visualizable medical data comprises data from the set of 3D visualizable medical data that intersects with at least a portion of the imaging plane associated with the medical device;
determining a perspective view of the selected region of the set of 3D visualizable medical data based at least in part on an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
causing the one or more displays to concurrently display, in a virtual 3D space:
a rendering of the perspective view of the selected region of the set of 3D visualizable medical data, and
image guidance data based at least in part on the real-time pose of the tracking sensor associated with the medical device.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
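The step of determining the imaging plane's pose from the tracking sensor's pose typically amounts to composing the sensor's tracked pose with a fixed calibration transform from the sensor to the device's image plane. A hedged sketch using 4x4 homogeneous matrices; the rigid-calibration assumption and the function names are mine, not the patent's:

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def imaging_plane_pose(sensor_pose, sensor_to_plane):
    """Compose the tracked sensor pose (world <- sensor) with the fixed
    sensor-to-plane calibration (sensor <- plane) to obtain the imaging
    plane's pose in world coordinates (world <- plane)."""
    return sensor_pose @ sensor_to_plane
```

The calibration transform is constant for a given device and mounting, so only the sensor pose needs to be re-read each frame.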
13. A system, comprising:
a computing device comprising one or more processors, the computing device configured to:
obtain pose data for a set of 3D visualizable medical data, wherein the set of 3D visualizable medical data includes a plurality of parallel slices;
determine a real-time pose of a tracking sensor associated with a medical device;
based at least in part on the real-time pose of the tracking sensor associated with the medical device, determine a pose of an imaging plane associated with the medical device;
select a region of the set of 3D visualizable medical data based at least in part on the determined pose of the imaging plane, wherein the selected region comprises data from the set of 3D visualizable medical data that intersects with at least a portion of the imaging plane;
determine a perspective view of the selected region of the set of 3D visualizable medical data based at least in part on an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
cause the one or more displays to concurrently display, in a virtual 3D space:
a rendering of the perspective view of the selected region of the set of 3D visualizable medical data, and
image guidance data based at least in part on the real-time pose of the tracking sensor associated with the medical device.

View Dependent Claims (14, 15, 16)
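A perspective view anchored to a fixed expected user location in front of the display can be expressed as an ordinary look-at view matrix, with the expected user location as the eye point. The sketch below shows that standard construction under my own assumptions; it is not the claimed implementation:

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Right-handed view matrix placing the camera at `eye` (the expected
    user location) looking toward `target` in the virtual 3D space.
    `up` must not be parallel to the viewing direction."""
    eye = np.asarray(eye, dtype=float)
    f = np.asarray(target, dtype=float) - eye
    f /= np.linalg.norm(f)                     # forward axis
    s = np.cross(f, np.asarray(up, dtype=float))
    s /= np.linalg.norm(s)                     # right axis
    u = np.cross(s, f)                         # true up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye          # translate eye to the origin
    return view
```

Because the expected user location is fixed relative to the display, this matrix can be computed once rather than per frame.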
17. A non-transitory computer-readable medium comprising computer-executable instructions for image management in image-guided medical procedures, said computer-executable instructions, when running on one or more computing devices, performing a method comprising:
obtaining pose data for a set of 3D visualizable medical data, wherein the set of 3D visualizable medical data includes a plurality of parallel slices;
determining a real-time pose of a tracking sensor associated with a medical device;
based at least in part on the real-time pose of the tracking sensor, determining a pose of an imaging plane associated with the medical device;
selecting a region of the set of 3D visualizable medical data based at least in part on the determined pose of the imaging plane, wherein the selected region comprises data from the set of 3D visualizable medical data that intersects with at least a portion of the imaging plane;
determining a perspective view of the selected region of the set of 3D visualizable medical data based at least in part on an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
causing the one or more displays to concurrently display, in a virtual 3D space:
a rendering of the perspective view of the selected region of the set of 3D visualizable medical data; and
image guidance data based at least in part on the real-time pose of the tracking sensor.

View Dependent Claims (18, 19, 20)
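Rendering the selected region and the image guidance data together in a virtual 3D space usually pairs the view transform with a perspective projection. A generic OpenGL-style projection matrix, shown only as an illustration of that final display step, not as anything the patent specifies:

```python
import numpy as np

def perspective(fov_y_deg, aspect, near, far):
    """OpenGL-convention perspective projection matrix: points on the near
    plane map to NDC depth -1, points on the far plane to NDC depth +1."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0                 # perspective divide by -z
    return m
```

Applying this after the view matrix yields clip coordinates; dividing by the resulting w component gives normalized device coordinates for the display.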
Specification