Method and apparatus for marking a position of a real world object in a see-through display
First Claim
1. A method for marking a position of a real world object on a see-through display comprising:
capturing an image of a real world object with an imaging device;
identifying a pointing device within the field of view of the imaging device;
determining a viewing angle along a line from a see-through display to a tip of the pointing device to the real world object based on an azimuth angle and an elevation angle between locations of the real world object and the pointing device within the field of view of the imaging device;
steering a ranging device toward the real world object based on the determined viewing angle;
determining a real world distance between the ranging device and the real world object with the ranging device, wherein the ranging device determines the real world distance based on an amount of time it takes a signal to travel from the ranging device to the real world object and back to the ranging device;
determining an orientation of the see-through display;
calculating a real world position of the real world object based on the viewing angle to the real world object, the real world distance between the ranging device and the real world object, and the orientation of the see-through display;
determining a first location on the see-through display that corresponds to the calculated real world position of the real world object;
displaying a mark on the see-through display at the first location that corresponds to the calculated real world position of the real world object;
tracking movement of the see-through display relative to the calculated real world position of the real world object using an inertial sensor; and
adjusting the first location of the mark as displayed on the see-through display to account for the movement of the see-through display relative to the calculated real world position of the real world object.
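The claim's step of determining azimuth and elevation angles from a point's location within the imaging device's field of view can be sketched as follows. This is a minimal illustration assuming a simple pinhole camera with known horizontal and vertical fields of view; the function name and parameters are hypothetical and not taken from the patent.

```python
import math

def pixel_to_angles(px, py, width, height, hfov_deg, vfov_deg):
    """Map a pixel location in the imaging device's frame to azimuth and
    elevation angles relative to the optical axis (pinhole camera model)."""
    # Normalized offsets from the image center, in [-1, 1].
    nx = (px - width / 2.0) / (width / 2.0)
    ny = (height / 2.0 - py) / (height / 2.0)  # image y grows downward
    # Project through the half-angle tangent of each field of view.
    azimuth = math.degrees(math.atan(nx * math.tan(math.radians(hfov_deg) / 2.0)))
    elevation = math.degrees(math.atan(ny * math.tan(math.radians(vfov_deg) / 2.0)))
    return azimuth, elevation
```

Under this model a point at the image center yields angles of (0, 0), and a point at the right edge of a 60° horizontal field of view yields an azimuth of 30°.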
Abstract
A method for marking a position of a real world object on a see-through display is provided. The method includes capturing an image of a real world object with an imaging device. A viewing angle and a distance to the object are determined. A real world position of the object is calculated based on the viewing angle to the object and the distance to the object. A location on the see-through display that corresponds to the real world position of the object is determined. A mark is then displayed on the see-through display at the location that corresponds to the real world object.
27 Claims
1. A method for marking a position of a real world object on a see-through display comprising:
capturing an image of a real world object with an imaging device;
identifying a pointing device within the field of view of the imaging device;
determining a viewing angle along a line from a see-through display to a tip of the pointing device to the real world object based on an azimuth angle and an elevation angle between locations of the real world object and the pointing device within the field of view of the imaging device;
steering a ranging device toward the real world object based on the determined viewing angle;
determining a real world distance between the ranging device and the real world object with the ranging device, wherein the ranging device determines the real world distance based on an amount of time it takes a signal to travel from the ranging device to the real world object and back to the ranging device;
determining an orientation of the see-through display;
calculating a real world position of the real world object based on the viewing angle to the real world object, the real world distance between the ranging device and the real world object, and the orientation of the see-through display;
determining a first location on the see-through display that corresponds to the calculated real world position of the real world object;
displaying a mark on the see-through display at the first location that corresponds to the calculated real world position of the real world object;
tracking movement of the see-through display relative to the calculated real world position of the real world object using an inertial sensor; and
adjusting the first location of the mark as displayed on the see-through display to account for the movement of the see-through display relative to the calculated real world position of the real world object.
- View Dependent Claims (2, 3, 4, 5, 6, 20, 23)
7. A system for marking a position of a real world object on a see-through display comprising:
a processor;
a see-through display communicatively coupled to the processor;
an imaging device communicatively coupled to the processor; and
a ranging device communicatively coupled to the processor;
wherein the imaging device is configured to:
capture an image of a real world object;
wherein the processor is configured to:
identify a pointing device within the field of view of the imaging device;
determine a viewing angle along a line from a see-through display to a tip of the pointing device to the real world object based on an azimuth angle and an elevation angle between locations of the real world object and the pointing device within the field of view of the imaging device;
steer the ranging device toward the real world object based on the determined viewing angle;
determine a real world distance between the ranging device and the real world object with the ranging device, wherein the ranging device determines the real world distance based on an amount of time it takes a signal to travel from the ranging device to the real world object and back to the ranging device;
calculate a real world position of the real world object based on the viewing angle to the real world object, the real world distance between the ranging device and the real world object, and an orientation of the see-through display;
determine a first location on the see-through display that corresponds to the calculated real world position of the real world object;
display a mark on the see-through display at the first location that corresponds to the calculated real world position of the real world object;
track movement of the see-through display relative to the calculated real world position of the real world object using an inertial sensor; and
adjust the first location of the mark as displayed on the see-through display to account for the movement of the see-through display relative to the real world position of the real world object.
- View Dependent Claims (8, 9, 10, 11, 12, 21, 24)
13. A program product comprising a non-transitory processor readable medium on which program instructions are embodied, wherein the program instructions are operable to:
capture an image of a real world object with the imaging device;
identify a pointing device within the field of view of the imaging device;
determine a viewing angle along a line from the see-through display to a tip of the pointing device to the real world object based on an azimuth angle and an elevation angle between locations of the real world object and the pointing device within the field of view of the imaging device;
steer a ranging device toward the real world object based on the determined viewing angle;
determine a real world distance between the ranging device and the real world object with the ranging device, wherein the ranging device determines the real world distance based on an amount of time it takes a signal to travel from the ranging device to the real world object and back to the ranging device;
determine an orientation of the see-through display;
calculate a real world position of the real world object based on the viewing angle to the real world object, the real world distance between the ranging device and the real world object, and the orientation of the see-through display;
determine a first location on the see-through display that corresponds to the calculated real world position of the real world object;
display a mark on the see-through display at the first location that corresponds to the calculated real world position of the real world object;
track movement of the see-through display relative to the calculated real world position of the real world object using an inertial sensor; and
adjust the first location of the mark as displayed on the see-through display to account for the movement of the see-through display relative to the real world position of the real world object.
- View Dependent Claims (14, 15, 16, 17, 22, 25)
18. A method of determining a distance to a point comprising:
identifying a point within a field of view of an imaging device, the point corresponding to a real world object within the field of view of the imaging device;
determining an orientation between the imaging device and the point within the field of view of the imaging device based on a position of the point within the field of view of the imaging device, wherein determining the orientation includes determining an azimuth angle for the point based on the position of the point within the field of view of the imaging device and determining an elevation angle for the point based on the position of the point within the field of view of the imaging device;
steering a laser ranging device such that a laser beam from the laser ranging device propagates at the determined orientation towards the real world object, wherein steering the laser ranging device includes orienting the laser ranging device based on the azimuth angle and the elevation angle;
determining a distance from the laser ranging device to the real world object located at the determined orientation based on an amount of time it takes the laser beam to travel from the laser ranging device to the real world object and back to the laser ranging device;
calculating a real world position of the real world object based on the determined distance from the laser ranging device to the real world object and the determined orientation;
determining a first location on a see-through display that corresponds to the calculated real world position of the real world object;
displaying a mark on the see-through display at the first location that corresponds to the calculated real world position of the real world object;
tracking movement of the see-through display relative to the calculated real world position of the real world object using an inertial sensor; and
adjusting the first location of the mark as displayed on the see-through display to account for the movement of the see-through display relative to the calculated real world position of the real world object.
- View Dependent Claims (26)
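Claim 18's range and position computation can be sketched numerically. This is a hedged illustration assuming a laser time-of-flight range (half the round-trip path at the speed of light) and a local Cartesian frame with y along the optical axis and z up; the names and frame convention are illustrative, not from the patent.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s):
    """Range from the round-trip travel time of the laser beam:
    the beam covers the distance twice, so halve the total path."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def object_position(distance, azimuth_deg, elevation_deg):
    """Convert range plus azimuth/elevation angles into a position in a
    local Cartesian frame (x right, y along the optical axis, z up)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (distance * math.cos(el) * math.sin(az),
            distance * math.cos(el) * math.cos(az),
            distance * math.sin(el))
```

For scale, a 1 µs round trip corresponds to a range of roughly 150 m.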
19. An apparatus comprising:
a processing unit;
a see-through display communicatively coupled to the processing unit;
a laser ranging device coupled to the processing unit, wherein the laser ranging device is steerable, such that a laser beam from the laser ranging device is capable of being directed towards a particular orientation; and
an imaging device coupled to the processing unit;
wherein the processing unit is configured to:
identify a point within a field of view of the imaging device, the point corresponding to a real world object within the field of view of the imaging device;
determine an orientation between the imaging device and the point within the field of view of the imaging device based on a position of the point within the field of view of the imaging device, wherein the processing unit is configured to determine the orientation by being configured to determine an azimuth angle for the point based on the position of the point within the field of view of the imaging device and determine an elevation angle for the point based on the position of the point within the field of view of the imaging device;
provide the determined orientation to the laser ranging device by being configured to provide the azimuth angle and the elevation angle to the laser ranging device, wherein the laser ranging device is configured to be steered such that a laser beam from the laser ranging device propagates at the determined orientation towards the real world object;
determine a distance from the laser ranging device to an object located at the determined orientation based on an amount of time it takes the laser beam to travel from the laser ranging device to the object and back to the laser ranging device;
calculate a real world position of the real world object based on the determined distance from the laser ranging device to the real world object and the determined orientation;
determine a first location on the see-through display that corresponds to the calculated real world position of the real world object;
display a mark on the see-through display at the first location that corresponds to the calculated real world position of the real world object;
track movement of the see-through display relative to the calculated real world position of the real world object using an inertial sensor; and
adjust the first location of the mark as displayed on the see-through display to account for the movement of the see-through display relative to the real world position of the real world object.
- View Dependent Claims (27)
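The final limitation shared by the independent claims, shifting the mark as the display moves relative to the object, can be sketched with a small-angle approximation in which an inertially sensed change in yaw and pitch maps to a pixel offset through the display's field of view. The function, its parameters, and its sign conventions are assumptions for illustration only.

```python
def adjust_mark(mark_xy, delta_yaw_deg, delta_pitch_deg,
                width, height, hfov_deg, vfov_deg):
    """Compensate a mark's display location for rotation of the
    see-through display reported by an inertial sensor.  Small-angle
    approximation: one degree of rotation moves the scene by a fixed
    number of pixels derived from the display's field of view."""
    px_per_deg_x = width / hfov_deg
    px_per_deg_y = height / vfov_deg
    x, y = mark_xy
    # Yawing the display to the right shifts the world (and the mark) left;
    # pitching up shifts it down (display y grows downward).
    return (x - delta_yaw_deg * px_per_deg_x,
            y + delta_pitch_deg * px_per_deg_y)
```

For example, on a 640-pixel-wide display spanning a 64° horizontal field of view, a 1° yaw to the right moves the mark 10 pixels to the left.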