Surgeon head-mounted display apparatuses
First Claim
1. An augmented reality surgical system comprising:
a display comprising a see-through display screen that displays images while allowing transmission of ambient light therethrough;
a motion sensor connected to the display and configured to output a motion signal, wherein the motion sensor includes at least one inertial sensor configured to output the motion signal indicating a measurement of movement or static orientation of a user's head while wearing the display; and
at least one camera configured to observe a first set of reference markers connected to a patient, and a second set of reference markers connected to a surgical tool located within a surgical room;
a computer configured to:
compute the relative location and orientation of the display and the first and second set of reference markers;
generate a three dimensional anatomical image using patient data created by medical imaging equipment that has imaged a portion of the patient;
generate a video signal based on at least a portion of the three dimensional anatomical image and the location and orientation of the first and second set of reference markers coupled to the patient and the surgical tool;
output the video signal to the display screen on the display,
the computer including a gesture interpretation module configured to sense gestures of the user, wherein the computer changes the displayed video signal on the display based on the sensed gestures of the user.
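The claim's first computer step, computing the relative location and orientation of the display and the two marker sets, amounts to composing rigid-body transforms observed in the camera frame. A minimal sketch of that composition is below; it is illustrative only and not from the patent, and the frame names, helper functions, and numeric values are assumptions.

```python
import numpy as np

def pose_to_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def relative_pose(T_cam_a: np.ndarray, T_cam_b: np.ndarray) -> np.ndarray:
    """Pose of frame B expressed in frame A, given both poses in the camera frame."""
    return np.linalg.inv(T_cam_a) @ T_cam_b

# Hypothetical camera-frame observations: patient markers 100 mm straight ahead,
# tool markers 20 mm to the side at the same depth.
T_cam_patient = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 100.0]))
T_cam_tool = pose_to_matrix(np.eye(3), np.array([20.0, 0.0, 100.0]))

# Tool pose in the patient frame; its translation is the tool's offset
# from the patient markers, independent of where the camera sits.
T_patient_tool = relative_pose(T_cam_patient, T_cam_tool)
print(T_patient_tool[:3, 3])
```

The same composition yields the display's pose relative to either marker set once the head-mounted camera's own pose is known.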
Abstract
An augmented reality surgical system includes a head mounted display (HMD) with a see-through display screen, a motion sensor, a camera, and computer equipment. The motion sensor outputs a head motion signal indicating measured movement of the HMD. The computer equipment computes the relative location and orientation of reference markers connected to the HMD and to the patient based on processing a video signal from the camera. The computer equipment generates a three dimensional anatomical model using patient data created by medical imaging equipment, rotates and scales at least a portion of the three dimensional anatomical model based on the relative location and orientation of the reference markers, and further rotates at least a portion of the three dimensional anatomical model based on the head motion signal to track measured movement of the HMD. The rotated and scaled three dimensional anatomical model is displayed on the display screen.
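The rotate-and-scale step the abstract describes can be sketched as composing a marker-derived rotation, a head-motion rotation, and a scale factor into one transform applied to the model's vertices. This is a simplified illustration under assumed conventions (yaw about the y axis, row-vector vertices), not the patent's actual implementation.

```python
import numpy as np

def yaw_matrix(theta: float) -> np.ndarray:
    """Rotation about the vertical (y) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def model_transform(marker_rotation: np.ndarray, head_yaw: float,
                    scale: float) -> np.ndarray:
    """Compose the marker-derived rotation, the head-motion rotation,
    and a uniform scale into a single 3x3 transform."""
    return scale * (yaw_matrix(head_yaw) @ marker_rotation)

# One model vertex on the z axis; simulate a 90-degree head turn at 2x zoom.
vertices = np.array([[0.0, 0.0, 1.0]])
M = model_transform(np.eye(3), np.pi / 2, 2.0)
transformed = vertices @ M.T
```

In practice the head-motion rotation would be integrated from the inertial sensor's signal rather than supplied as a single yaw angle.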
19 Claims
1. An augmented reality surgical system comprising:
a display comprising a see-through display screen that displays images while allowing transmission of ambient light therethrough; a motion sensor connected to the display and configured to output a motion signal, wherein the motion sensor includes at least one inertial sensor configured to output the motion signal indicating a measurement of movement or static orientation of a user's head while wearing the display; and at least one camera configured to observe a first set of reference markers connected to a patient, and a second set of reference markers connected to a surgical tool located within a surgical room; a computer configured to: compute the relative location and orientation of the display and the first and second set of reference markers; generate a three dimensional anatomical image using patient data created by medical imaging equipment that has imaged a portion of the patient; generate a video signal based on at least a portion of the three dimensional anatomical image and the location and orientation of the first and second set of reference markers coupled to the patient and the surgical tool; output the video signal to the display screen on the display, the computer including a gesture interpretation module configured to sense gestures of the user, wherein the computer changes the displayed video signal on the display based on the sensed gestures of the user. - View Dependent Claims (2, 3, 4, 5, 6, 7)
8. An augmented reality surgical system comprising:
a positioning tracking system for tracking a location of a surgical tool, a head mounted display, and a surgical site; a computer system using patient data from an imaging device to generate an image model of a targeted site of the patient; wherein the image model includes reference markers that assist in correlating between virtual locations and physical locations of the patient, wherein the computer system processes the present locations of the head mounted display, the surgical site and the surgical tool obtained by the positioning tracking system and uses the reference markers contained in the image model to generate a graphical representation of the patient to the head mounted display for a user to view, wherein the computer system provides a graphical representation of a virtual trajectory of a surgical tool to the patient's surgical site, wherein a motion sensor includes at least one inertial sensor configured to output a motion signal indicating a measurement of movement or static orientation of a user's head while wearing the head mounted display, the computer system including a gesture interpretation module configured to sense gestures of the user, wherein the computer system changes the displayed graphical representation of the patient on the display based on the sensed gestures of the user. - View Dependent Claims (9, 10, 11, 12, 13, 14)
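Claim 8's "virtual trajectory of a surgical tool to the patient's surgical site" can be modeled, at its simplest, as a straight line from the tracked tool tip to the planned target, sampled for rendering as an overlay. The sketch below assumes a linear trajectory and illustrative coordinates; the patent does not specify this parameterization.

```python
import numpy as np

def virtual_trajectory(tool_tip: np.ndarray, target: np.ndarray,
                       n_points: int = 5) -> np.ndarray:
    """Sample n_points along the straight line from the tracked tool tip
    to the planned target, for rendering as a trajectory overlay."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]  # parameter 0 -> 1
    return np.asarray(tool_tip) + t * (np.asarray(target) - np.asarray(tool_tip))

# Hypothetical tool tip at the origin, target 10 mm lateral and 20 mm deep.
path = virtual_trajectory(np.array([0.0, 0.0, 0.0]),
                          np.array([10.0, 0.0, 20.0]), n_points=3)
```

A renderer would then draw the sampled points (projected through the display's tracked pose) so the trajectory appears anchored to the patient.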
15. An augmented reality surgical system for displaying multiple video streams to a user, the augmented reality surgical system comprising:
a display comprising a see-through display screen that displays images while allowing transmission of ambient light therethrough; a motion sensor connected to the display and configured to output a motion signal, wherein the motion sensor includes at least one inertial sensor configured to output the motion signal indicating a measurement of movement or static orientation of a user's head while wearing the display; and a computer configured to receive a plurality of patient data including a three dimensional anatomical image of the patient and relative location and orientation of the patient and to control which of the plurality of patient data to output as a video signal to the display screen, wherein the computer provides a graphical representation of a virtual trajectory of a surgical tool to the patient's surgical site, the computer including a gesture interpretation module configured to sense gestures of the user, wherein the computer changes the displayed video signal on the display screen based on the sensed gestures of the user. - View Dependent Claims (16, 17, 18, 19)
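Claim 15 combines two behaviors: selecting which of several patient-data streams feeds the video signal, and changing that selection in response to sensed gestures. A minimal sketch of that selection logic follows; the gesture names and stream labels are illustrative assumptions, not terms from the patent.

```python
class StreamSelector:
    """Select which patient-data stream drives the HMD video signal,
    cycling the selection in response to interpreted gestures."""

    def __init__(self, streams):
        self.streams = list(streams)
        self.index = 0  # currently displayed stream

    def on_gesture(self, gesture: str) -> str:
        """Advance or rewind the selection; unrecognized gestures are ignored."""
        if gesture == "swipe_left":
            self.index = (self.index + 1) % len(self.streams)
        elif gesture == "swipe_right":
            self.index = (self.index - 1) % len(self.streams)
        return self.streams[self.index]

# Hypothetical streams: the 3D model, raw CT slices, and a live endoscope feed.
sel = StreamSelector(["3d_model", "ct_slices", "endoscope_feed"])
sel.on_gesture("swipe_left")  # -> "ct_slices"
```

In the claimed system the gesture interpretation module would emit these events from camera or sensor data, and the selected stream would be composited into the video signal sent to the see-through display.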
Specification