Systems, methods, apparatuses, and computer-readable media for image guided surgery
First Claim
1. A method of displaying a perspective view of an image, the method comprising:
receiving emplacement data associated with a physical medical device;
determining an emplacement of a medical image slice based at least in part on the emplacement data;
determining a perspective view of the medical image slice in a virtual 3D space based at least in part on a relative orientation of the physical medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
causing the one or more displays to display in real-time the perspective view of the medical image slice in the virtual 3D space.
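For orientation, the steps of the claimed method can be sketched in code. Everything below is a hypothetical illustration only: the pose representation (position vector plus rotation matrix), the function names, and the look-at construction of the perspective view are assumptions for the sketch, not the implementation disclosed in the patent.

```python
import numpy as np

def slice_emplacement(device_pos, device_rot, offset=0.0):
    # Hypothetical mapping: place the medical image slice at the device tip,
    # sharing the physical device's orientation (emplacement data -> slice emplacement).
    tip = device_pos + device_rot @ np.array([0.0, 0.0, offset])
    return tip, device_rot

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    # Right-handed look-at view matrix: the "eye" is the expected location of
    # the user, a fixed point in front of the display.
    f = target - eye
    f = f / np.linalg.norm(f)
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def perspective_view_of_slice(device_pos, device_rot, viewer_pos):
    # Determine the slice emplacement from the device emplacement data, then
    # build a perspective view of it relative to the fixed viewer location.
    slice_pos, slice_rot = slice_emplacement(device_pos, device_rot)
    return look_at(viewer_pos, slice_pos), slice_pos, slice_rot
```

In this sketch the view matrix would be recomputed every time new emplacement data arrives, so that the displayed slice tracks the physical device in real time as the claim requires.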
Abstract
Presented herein are methods, systems, devices, and computer-readable media for image guided surgery. The systems herein allow a physician to use multiple instruments for a surgery and simultaneously provide image-guidance data for those instruments. Various embodiments disclosed herein provide information to physicians about procedures they are performing, the devices (such as ablation needles, ultrasound wands or probes, scalpels, cauterizers, etc.) they are using during the procedure, the relative emplacements or poses of these devices, prediction information for those devices, and other information. Some embodiments provide useful information about 3D data sets. Additionally, some embodiments provide for quickly calibratable surgical instruments or attachments for surgical instruments.
20 Claims
1. A method of displaying a perspective view of an image, the method comprising:
receiving emplacement data associated with a physical medical device;
determining an emplacement of a medical image slice based at least in part on the emplacement data;
determining a perspective view of the medical image slice in a virtual 3D space based at least in part on a relative orientation of the physical medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
causing the one or more displays to display in real-time the perspective view of the medical image slice in the virtual 3D space.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A system, comprising:
a computing device comprising one or more processors, the computing device configured to:
receive emplacement data associated with a physical medical device;
determine an emplacement of a medical image slice based at least in part on the emplacement data;
determine a perspective view of the medical image slice in a virtual 3D space based at least in part on a relative orientation of the physical medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
cause the one or more displays to display in real-time the perspective view of the medical image slice in the virtual 3D space.
- View Dependent Claims (11, 12, 13, 14, 15, 16, 17)
18. A non-transitory computer-readable medium comprising computer-executable instructions that, when executed, cause one or more processors to:
receive emplacement data associated with a physical medical device;
determine an emplacement of a medical image slice based at least in part on the emplacement data;
determine a perspective view of the medical image slice in a virtual 3D space based at least in part on a relative orientation of the physical medical device with respect to an expected location of a user, wherein the expected location of the user is a fixed location in front of one or more displays; and
cause the one or more displays to display in real-time the perspective view of the medical image slice in the virtual 3D space.
- View Dependent Claims (19, 20)
Specification