Systems, methods, apparatuses, and computer-readable media for image guided surgery
First Claim
1. A method, comprising:
determining, with one or more processors, real-time position and orientation of a first medical device;
determining, with the one or more processors, real-time position and orientation of a second medical device;
determining position and orientation of a 2D image area based at least in part on the determined real-time position and orientation of the second medical device;
receiving in real-time image data for the 2D image area from the second medical device;
determining a perspective view of the 2D image area in a virtual 3D space based at least in part on a relative orientation of the second medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
causing the one or more displays to display in real-time a perspective view of an image corresponding to the image data in the virtual 3D space based at least in part on the determined perspective view of the 2D image area.
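The claimed steps amount to a rendering pipeline: track the imaging device's pose, attach the 2D image area to that pose, and project it into a virtual 3D scene viewed from a fixed expected user location. Below is a minimal numpy sketch of that pipeline, not taken from the patent; the image-plane dimensions, probe-frame convention, and pinhole projection toward the viewer are illustrative assumptions.

```python
import numpy as np

def pose_matrix(position, rotation):
    """Build a 4x4 homogeneous pose from a 3x3 rotation and a position."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

def image_area_corners(probe_pose, width, depth):
    """Corners of the 2D image area in world coordinates.

    Assumption: the image plane extends below the probe tip, spanning
    `width` laterally (probe x) and `depth` axially (probe -z).
    """
    local = np.array([
        [-width / 2, 0.0, 0.0, 1.0],
        [ width / 2, 0.0, 0.0, 1.0],
        [ width / 2, 0.0, -depth, 1.0],
        [-width / 2, 0.0, -depth, 1.0],
    ])
    return (probe_pose @ local.T).T[:, :3]

def perspective_project(points, eye, focal=1.0):
    """Project world points toward a fixed expected user location `eye`.

    A minimal pinhole model looking down -z; points farther from the
    viewer foreshorten, giving the perspective view of the 2D image area.
    """
    rel = points - eye
    return np.stack([focal * rel[:, 0] / -rel[:, 2],
                     focal * rel[:, 1] / -rel[:, 2]], axis=1)

# Probe at the origin with identity orientation; viewer 2 m in front.
corners = image_area_corners(pose_matrix(np.zeros(3), np.eye(3)),
                             width=0.1, depth=0.15)
projected = perspective_project(corners, eye=np.array([0.0, 0.0, 2.0]))
```

Tilting `probe_pose` in this sketch tilts the projected quadrilateral, which is the "relative orientation of the second medical device with respect to an expected location of a user" dependence the claim recites.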
Abstract
Presented herein are methods, systems, devices, and computer-readable media for image guided surgery. The systems herein allow a physician to use multiple instruments for a surgery and simultaneously provide image-guidance data for those instruments. Various embodiments disclosed herein provide information to physicians about procedures they are performing, the devices (such as ablation needles, ultrasound wands or probes, scalpels, cauterizers, etc.) they are using during the procedure, the relative emplacements or poses of these devices, prediction information for those devices, and other information. Some embodiments provide useful information about 3D data sets. Additionally, some embodiments provide for quickly calibratable surgical instruments or attachments for surgical instruments.
447 Citations
20 Claims
1. A method, comprising:
determining, with one or more processors, real-time position and orientation of a first medical device;
determining, with the one or more processors, real-time position and orientation of a second medical device;
determining position and orientation of a 2D image area based at least in part on the determined real-time position and orientation of the second medical device;
receiving in real-time image data for the 2D image area from the second medical device;
determining a perspective view of the 2D image area in a virtual 3D space based at least in part on a relative orientation of the second medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
causing the one or more displays to display in real-time a perspective view of an image corresponding to the image data in the virtual 3D space based at least in part on the determined perspective view of the 2D image area.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A system, comprising:
a computing device comprising one or more processors, the computing device configured to:
determine real-time position and orientation of a first medical device based at least in part on data received from a first pose sensor;
determine real-time position and orientation of a second medical device based at least in part on data received from a second pose sensor;
determine position and orientation of a 2D image area based at least in part on the determined real-time position and orientation of the second medical device;
receive image data for the 2D image area from the second medical device;
determine a perspective view of the 2D image area in a virtual 3D space based at least in part on a relative orientation of the second medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
cause the one or more displays to display in real-time a perspective view of an image corresponding to the image data in the virtual 3D space based at least in part on the determined perspective view of the 2D image area.
View Dependent Claims (10, 11, 12, 13, 14, 15, 16)
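The system and medium claims add that each device's pose is determined "based at least in part on data received from a … pose sensor." In tracked-instrument systems this typically means composing the tracker's raw sensor pose with a fixed sensor-to-device calibration transform. A minimal sketch, with hypothetical numeric values and a plain 4x4 homogeneous-transform composition (not the patent's own method):

```python
import numpy as np

def compose(a, b):
    """Compose two 4x4 homogeneous transforms: a then b, i.e. a @ b."""
    return a @ b

# Raw pose reported by the pose sensor in tracker/world coordinates
# (hypothetical: sensor 10 cm along x, no rotation).
world_T_sensor = np.eye(4)
world_T_sensor[:3, 3] = [0.10, 0.0, 0.0]

# Fixed calibration from the sensor mount to the device tip
# (hypothetical: tip a further 2 cm along x in the sensor frame).
sensor_T_device = np.eye(4)
sensor_T_device[:3, 3] = [0.02, 0.0, 0.0]

# Real-time device pose = raw sensor pose composed with calibration.
world_T_device = compose(world_T_sensor, sensor_T_device)
```

The same composition, applied per frame to each sensor's stream, yields the real-time poses of the first and second medical devices that the remaining steps consume.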
17. A non-transitory computer-readable medium comprising computer-executable instructions that, when executed, cause one or more processors to:
determine real-time position and orientation of a first medical device based at least in part on data received from a first pose sensor;
determine real-time position and orientation of a second medical device based at least in part on data received from a second pose sensor;
determine position and orientation of a 2D image area based at least in part on the determined real-time position and orientation of the second medical device;
receive image data for the 2D image area from the second medical device;
determine a perspective view of the 2D image area in a virtual 3D space based at least in part on a relative orientation of the second medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
cause the one or more displays to display in real-time a perspective view of an image corresponding to the image data in the virtual 3D space based at least in part on the determined perspective view of the 2D image area.
View Dependent Claims (18, 19, 20)
Specification