System and method for augmented reality navigation in a medical intervention procedure
First Claim
1. A method for augmented reality navigation during a medical intervention, said method comprising the steps of:
providing a stereoscopic head mounted display, said display including a pair of stereo viewing cameras, at least one tracking camera, and a stereoscopic guidance display;
providing a plurality of markers on a frame attached to a table;
providing a medical instrument for performing said medical intervention, and providing a plurality of markers on the instrument;
during a medical intervention on a patient positioned on said table, determining a rigid body transformation between the tracking camera and the frame markers, and determining said patient's body pose from said rigid body transformation;
determining the pose of said medical instrument with respect to the table from the patient's body pose and said instrument markers;
displaying in the stereoscopic guidance display a visual representation of said patient, said instrument, and a path for guiding said instrument to perform said medical intervention, wherein the visual representation of said patient and said instrument is generated based on a signal received from said viewing cameras, and wherein the visual representation of said patient is overlaid with a medical image of a target of said intervention; and
color-coding the visual representation of the medical instrument to indicate penetration of said medical instrument toward said target in real time to show the time evolution of the medical instrument's trajectory.
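The rigid body transformation recited above can be estimated from corresponding 3D marker positions seen by the tracking camera and their known positions on the table frame. A minimal sketch using the standard Kabsch/Umeyama least-squares fit (function and variable names are illustrative assumptions, not details from the patent):

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ~= R @ src + t.

    src, dst: (N, 3) arrays of corresponding marker positions, e.g. the
    frame markers in tracking-camera coordinates (src) and their known
    positions in table coordinates (dst). Kabsch/Umeyama method.
    """
    # Center both point sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # SVD of the cross-covariance matrix gives the optimal rotation.
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

The same routine composes along the chain camera-to-frame and frame-to-instrument to place the instrument markers in table coordinates.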
Abstract
A method for augmented reality navigation of a medical intervention includes providing a stereoscopic head mounted display, the display including a pair of stereo viewing cameras, at least one tracking camera, and a stereoscopic guidance display. During a medical intervention on a patient, the patient's body pose is determined from a rigid body transformation between the tracking camera and frame markers on the scanning table, and the pose of an intervention instrument with respect to the table is determined. A visual representation of the patient overlaid with an image of the intervention target, the instrument, and a path for guiding the instrument to perform said medical intervention is displayed in the stereoscopic guidance display.
22 Claims
1. A method for augmented reality navigation during a medical intervention, said method comprising the steps of:
providing a stereoscopic head mounted display, said display including a pair of stereo viewing cameras, at least one tracking camera, and a stereoscopic guidance display;
providing a plurality of markers on a frame attached to a table;
providing a medical instrument for performing said medical intervention, and providing a plurality of markers on the instrument;
during a medical intervention on a patient positioned on said table, determining a rigid body transformation between the tracking camera and the frame markers, and determining said patient's body pose from said rigid body transformation;
determining the pose of said medical instrument with respect to the table from the patient's body pose and said instrument markers;
displaying in the stereoscopic guidance display a visual representation of said patient, said instrument, and a path for guiding said instrument to perform said medical intervention, wherein the visual representation of said patient and said instrument is generated based on a signal received from said viewing cameras, and wherein the visual representation of said patient is overlaid with a medical image of a target of said intervention; and
color-coding the visual representation of the medical instrument to indicate penetration of said medical instrument toward said target in real time to show the time evolution of the medical instrument's trajectory.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A program storage device readable by a computer, tangibly embodying a program of instructions executable by the computer to perform the method steps for augmented reality navigation during a medical intervention, said method comprising the steps of:
determining a rigid body transformation between a tracking camera and a plurality of frame markers attached to a table;
determining a patient's body pose from said rigid body transformation;
during a medical intervention on said patient, wherein said patient is positioned on said table, determining the pose of a medical instrument with respect to the table from the patient's body pose and a plurality of instrument markers;
providing a stereoscopic video-view of the patient in real time in a stereoscopic head mounted display, said display including a pair of stereo viewing cameras, at least one tracking camera, and a stereoscopic guidance display;
displaying in the stereoscopic video-view a visual representation of said instrument, and a path for guiding said instrument to perform said medical intervention, wherein the visual representation of said patient is overlaid with an image of a target of said intervention; and
color-coding the visual representation of the medical instrument to indicate penetration of said medical instrument toward said target in real time to show the time evolution of the medical instrument's trajectory.
- View Dependent Claims (11, 12, 13, 14, 15, 16, 17)
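The color-coding step recited in claims 1 and 10 can be sketched as a mapping from the instrument tip's remaining distance to the target onto a color ramp. The function name, the entry/target arguments, and the green-to-red ramp below are illustrative assumptions, not details from the patent:

```python
import numpy as np

def penetration_color(tip, entry, target):
    """Return an RGB color for the rendered instrument: green at the
    planned entry point, shading toward red as the tip nears the target.

    tip, entry, target: 3-vectors in table coordinates.
    """
    total = np.linalg.norm(target - entry)       # planned path length
    remaining = np.linalg.norm(target - tip)     # distance still to go
    f = np.clip(1.0 - remaining / total, 0.0, 1.0)  # 0 at entry, 1 at target
    green = np.array([0.0, 1.0, 0.0])
    red = np.array([1.0, 0.0, 0.0])
    return (1.0 - f) * green + f * red
```

Rendering successive instrument positions with this color, without erasing earlier ones, would also show the time evolution of the trajectory as a colored trail.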
18. A method for augmented reality navigation during a magnetic resonance guided intervention, said method comprising the steps of:
providing a stereoscopic head mounted display, said display including a pair of stereo viewing cameras, at least one tracking camera, and a stereoscopic guidance display;
providing a biopsy needle for performing said intervention;
providing a scanning table having a set of attached markers;
providing a calibration phantom that includes magnetic and optical markers;
determining a transformation between said at least one tracking camera and said scanning table by scanning said calibration phantom;
during a medical intervention on a patient positioned on said table, determining said patient's body pose from a rigid body transformation between the tracking camera and the table;
determining the pose of said biopsy needle with respect to the table from the patient's body pose;
displaying in the stereoscopic guidance display a visual representation of said patient, said needle, and a path for guiding said needle to perform said intervention, wherein the visual representation of said patient and said needle is generated based on a signal received from said viewing cameras, and wherein the visual representation of said patient is overlaid with a magnetic resonance image of a target of said intervention; and
displaying concentric circles in the guidance display to indicate distance of the needle to the target.
- View Dependent Claims (19, 20, 21, 22)
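One way to realize the concentric-circle distance cue of claim 18 is to draw one ring per fixed increment of remaining needle-to-target distance, so the ring count shrinks as the needle advances. The function name and the 10 mm spacing are illustrative assumptions, not details from the patent:

```python
import math

def guidance_rings(distance_mm, spacing_mm=10.0):
    """Radii (mm) of concentric circles to draw around the projected
    target: one ring per started `spacing_mm` of remaining distance."""
    if distance_mm <= 0:
        return []  # needle at (or past) the target: no rings
    n = math.ceil(distance_mm / spacing_mm)
    return [spacing_mm * (i + 1) for i in range(n)]
```

The radii would then be projected into the stereoscopic guidance display around the target overlay.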
Specification