Interactive digital arrow (d'arrow) three-dimensional (3D) pointing
First Claim
1. A real-time three dimensional (3D) pointing device for identifying 3D locations within a subject comprising:
a) a semi-transparent screen interposed between said subject and an operator for displaying an image provided to it to appear superimposed on external structures of said subject seen through the screen;
b) a moveable mechanical arm for holding the semi-transparent screen in a position and orientation selected by said operator between said operator and said subject such that said operator may view said subject through said semi-transparent screen;
c) touch sensors for indicating a two dimensional (2D) position of the semi-transparent screen selected by said operator and identifying this position as a screen location;
d) a tracking device for repeatedly measuring the location and orientation of the operator, said subject, and the semi-transparent screen;
e) a symbol generation unit coupled to the tracking device for determining a depth based upon the screen distance from the subject, and for displaying a symbol at the "target location" defined as the screen location and depth on the semi-transparent screen in proper relation to the internal and external structures; and
f) a workstation coupled to the tracking device, for receiving the locations and orientations of said subject, operator and the semi-transparent screen, creating an image of internal structures of said subject from a set of imaging data on the semi-transparent screen consistent with the locations and orientations of the operator, said subject and the semi-transparent screen.
Abstract
An interactive three-dimensional (3D) pointing device for selecting points within a subject employs a tracking device which determines the positions of the operator, of a semi-transparent screen positioned by the operator, and of the subject, and provides this information to a model workstation. The model workstation superimposes computer graphic images of internal structures of the subject on the semi-transparent screen through which the operator is viewing the subject. The superimposed image is derived from image data either previously generated and stored or obtained with an imaging system. The images of the internal structures are registered with the operator's view of the external structures of the subject. The operator interactively views internal and external structures and the relation between them simultaneously, while moving the screen to select 3D target points at an image depth within the subject. Optionally, other input devices may be used to identify current "target points" as selected points. The 3D points are then provided to an output device which utilizes them. Another embodiment employs stereoscopic viewing methods to provide 3D representations of the internal images superimposed on external structures, allowing the operator to employ parallax in selecting 3D points.
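The patent gives no formulas, but the core selection step it describes can be sketched in code: the 2D touch location on the tracked screen is mapped into world coordinates, then pushed along the screen's normal toward the subject by a depth derived from the screen-to-subject distance. The following is an illustrative sketch only; all names are invented here, and the "half the distance" depth rule stands in for the patent's unspecified "function of" that distance.

```python
import numpy as np

def target_point(screen_origin, screen_x_axis, screen_y_axis,
                 touch_uv, subject_center):
    """Map a 2D touch on the tracked screen to a 3D point inside the subject.

    screen_origin, screen_x_axis, screen_y_axis: tracked screen pose in world
    coordinates; touch_uv: touch position in the screen's 2D basis;
    subject_center: tracked subject position.
    """
    screen_origin = np.asarray(screen_origin, dtype=float)
    screen_x_axis = np.asarray(screen_x_axis, dtype=float)
    screen_y_axis = np.asarray(screen_y_axis, dtype=float)
    subject_center = np.asarray(subject_center, dtype=float)

    # World-space position of the touched point on the screen plane.
    touch_world = (screen_origin
                   + touch_uv[0] * screen_x_axis
                   + touch_uv[1] * screen_y_axis)

    # Unit normal of the screen, oriented from the screen toward the subject.
    normal = np.cross(screen_x_axis, screen_y_axis)
    normal = normal / np.linalg.norm(normal)
    if np.dot(subject_center - touch_world, normal) < 0:
        normal = -normal

    # Depth as a function of screen-subject distance (illustrative: half).
    distance = np.linalg.norm(subject_center - touch_world)
    depth = 0.5 * distance

    return touch_world + depth * normal
```

In the claimed device this computation would run continuously as the tracking device reports new poses, so the displayed symbol follows the screen as the operator repositions it.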
13 Claims
1. A real-time three dimensional (3D) pointing device for identifying 3D locations within a subject comprising:
a) a semi-transparent screen interposed between said subject and an operator for displaying an image provided to it to appear superimposed on external structures of said subject seen through the screen;
b) a moveable mechanical arm for holding the semi-transparent screen in a position and orientation selected by said operator between said operator and said subject such that said operator may view said subject through said semi-transparent screen;
c) touch sensors for indicating a two dimensional (2D) position of the semi-transparent screen selected by said operator and identifying this position as a screen location;
d) a tracking device for repeatedly measuring the location and orientation of the operator, said subject, and the semi-transparent screen;
e) a symbol generation unit coupled to the tracking device for determining a depth based upon the screen distance from the subject, and for displaying a symbol at the "target location" defined as the screen location and depth on the semi-transparent screen in proper relation to the internal and external structures; and
f) a workstation coupled to the tracking device, for receiving the locations and orientations of said subject, operator and the semi-transparent screen, creating an image of internal structures of said subject from a set of imaging data on the semi-transparent screen consistent with the locations and orientations of the operator, said subject and the semi-transparent screen.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8)
9. A method of aiding an operator in identifying three-dimensional (3D) locations within a subject comprising the steps of:
a) acquiring multi-dimensional imaging data defining internal structures of said subject;
b) positioning a semi-transparent screen at a selected location and orientation between an operator and said subject, allowing the operator to view external structures of said subject through the semi-transparent screen;
c) measuring locations and orientations of said subject, said operator and the semi-transparent screen;
d) superimposing a computer generated image of the internal structures, created from the imaging data, on the semi-transparent screen consistent with the measured locations and orientations;
e) determining a position of the semi-transparent screen touched by the operator;
f) identifying this position as the selected screen location;
g) calculating a depth within the subject as a function of the distance between the screen and the subject; and
h) creating a symbol on the screen representing a 3D "target location", being the point at the calculated depth from the selected screen location.

View Dependent Claims (10, 11, 12)
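The superimposition step of the method (displaying internal structures "consistent with the measured locations and orientations") amounts to projecting each internal-structure point onto the screen plane along the ray from the operator's eye. The sketch below illustrates that geometry only; the function name, arguments, and plane-intersection formulation are assumptions of this note, not taken from the patent.

```python
import numpy as np

def project_to_screen(eye, screen_origin, screen_x, screen_y, point):
    """Project an internal-structure point onto the semi-transparent screen
    plane along the eye-to-point ray; return its (u, v) coordinates in the
    screen's 2D basis, or None if the ray is parallel to the plane."""
    eye = np.asarray(eye, dtype=float)
    screen_origin = np.asarray(screen_origin, dtype=float)
    screen_x = np.asarray(screen_x, dtype=float)
    screen_y = np.asarray(screen_y, dtype=float)
    point = np.asarray(point, dtype=float)

    normal = np.cross(screen_x, screen_y)   # screen plane normal
    direction = point - eye                 # ray from operator's eye
    denom = np.dot(direction, normal)
    if abs(denom) < 1e-9:
        return None                         # ray parallel to screen plane

    # Parameter t where the ray crosses the plane through screen_origin.
    t = np.dot(screen_origin - eye, normal) / denom
    hit = eye + t * direction

    # Express the intersection point in the screen's 2D (u, v) basis.
    rel = hit - screen_origin
    return float(np.dot(rel, screen_x)), float(np.dot(rel, screen_y))
```

Repeating this projection for every displayed model point, each time the tracker reports new operator and screen poses, keeps the internal image registered with the operator's view of the subject's external structures.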
13. A real-time three dimensional (3D) pointing device for interacting with an operator to select 3D locations of stored computer graphic models of structures within a subject comprising:
a) a semi-transparent screen allowing said operator to see external structures of said subject through the screen and also for displaying images of internal structures superimposed upon the external structures;
b) a mechanical arm coupled to the screen and fixed at a second end for adjustably holding the semi-transparent screen between said subject and said operator in an operator-selected position;
c) touch sensors for interactively determining 2D screen locations touched by the operator;
d) a tracking device for measuring locations and orientations of the semi-transparent screen, said operator and said subject;
e) a workstation coupled to the semi-transparent screen, the touch sensors and the tracking device, for receiving the locations and orientations of the screen, said subject and said operator, and for displaying computer graphic models of said internal structures of said subject correctly registered with said subject's external structures from the operator's location and position; and
f) a symbol generation device coupled to the tracking device for determining a distance between the screen and said subject, determining a depth perpendicular to the screen within said subject as a function of this distance, and displaying a symbol on the screen representing a 3D location defined by the selected 2D screen location and the depth, as viewed from the operator's location and orientation through the screen.
Specification