Determining a virtual representation of an environment by projecting texture patterns
First Claim
1. A method comprising:
projecting a plurality of different patterns of light using a plurality of projectors, wherein projecting the plurality of different patterns of light using the plurality of projectors comprises projecting, using a first projector of the plurality of projectors, a first texture pattern having a first wavelength while projecting, using a second projector of the plurality of projectors, a second texture pattern having a second wavelength such that at least a portion of the first texture pattern overlaps at least a portion of the second texture pattern on a surface in an environment of a computing device;
receiving sensor data by the computing device and from a plurality of optical sensors, wherein the plurality of optical sensors are configured to distinguish between the plurality of different patterns of light, and wherein the sensor data comprises at least one first image of an overlapping of the first texture pattern and the second texture pattern as perceived from a first viewpoint of a first optical sensor of the plurality of optical sensors and at least one second image of the overlapping of the first texture pattern and the second texture pattern as perceived from a second viewpoint of a second optical sensor of the plurality of optical sensors;
determining, by the computing device and based on the received sensor data, corresponding features between the at least one first image and the at least one second image; and
based on the determined corresponding features, determining, by the computing device, an output including a virtual representation of the environment of the computing device, wherein the output comprises a depth measurement indicative of a distance to at least one object in the environment.
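The depth measurement recited in the final step is conventionally recovered by triangulating a matched feature across the two viewpoints. A minimal sketch, assuming a calibrated, rectified stereo pair (the focal length and baseline values below are illustrative, not from the patent):

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Depth (meters) of a point whose image shifts by `disparity_px`
    pixels between two rectified viewpoints separated by `baseline_m`."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# A feature shifted 40 px between the two sensors, f = 800 px, B = 0.1 m:
print(depth_from_disparity(40.0, 800.0, 0.1))  # -> 2.0 (meters)
```

The relation depth = f·B/d is why the claimed overlapping texture patterns matter: without reliable correspondences, the disparity d cannot be measured on low-texture surfaces.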
Abstract
Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. Based on the determined corresponding features, the computing device may then determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
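The "corresponding features" step can be sketched as patch matching along a scanline: the projected random texture gives otherwise featureless surfaces enough local structure for correlation to lock onto. A simplified sketch using normalized cross-correlation; the function and parameter names are hypothetical, not from the patent:

```python
import numpy as np

def match_block(left, right, row, col, patch=5, max_disp=32):
    """Disparity of the patch centered at (row, col) in `left`, found by
    scanning the same row of `right` and maximizing normalized
    cross-correlation (the projected texture makes the patch distinctive)."""
    h = patch // 2
    ref = left[row - h:row + h + 1, col - h:col + h + 1].astype(float)
    ref = (ref - ref.mean()) / (ref.std() + 1e-9)
    best_d, best_score = 0, -np.inf
    for d in range(max_disp):
        c = col - d  # candidate column in the second viewpoint
        if c - h < 0:
            break
        cand = right[row - h:row + h + 1, c - h:c + h + 1].astype(float)
        cand = (cand - cand.mean()) / (cand.std() + 1e-9)
        score = float((ref * cand).sum())
        if score > best_score:
            best_d, best_score = d, score
    return best_d

# Synthetic check: a random texture shifted by 7 px should match at d = 7.
rng = np.random.default_rng(0)
left = rng.random((40, 60))        # stands in for the textured scene
right = np.roll(left, -7, axis=1)  # second viewpoint, 7 px disparity
print(match_block(left, right, 20, 30, max_disp=16))  # -> 7
```

A production system would match densely and filter outliers, but the core idea is the same: the random texture patterns turn correspondence into a well-posed correlation problem.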
19 Claims
1. A method comprising:
projecting a plurality of different patterns of light using a plurality of projectors, wherein projecting the plurality of different patterns of light using the plurality of projectors comprises projecting, using a first projector of the plurality of projectors, a first texture pattern having a first wavelength while projecting, using a second projector of the plurality of projectors, a second texture pattern having a second wavelength such that at least a portion of the first texture pattern overlaps at least a portion of the second texture pattern on a surface in an environment of a computing device;
receiving sensor data by the computing device and from a plurality of optical sensors, wherein the plurality of optical sensors are configured to distinguish between the plurality of different patterns of light, and wherein the sensor data comprises at least one first image of an overlapping of the first texture pattern and the second texture pattern as perceived from a first viewpoint of a first optical sensor of the plurality of optical sensors and at least one second image of the overlapping of the first texture pattern and the second texture pattern as perceived from a second viewpoint of a second optical sensor of the plurality of optical sensors;
determining, by the computing device and based on the received sensor data, corresponding features between the at least one first image and the at least one second image; and
based on the determined corresponding features, determining, by the computing device, an output including a virtual representation of the environment of the computing device, wherein the output comprises a depth measurement indicative of a distance to at least one object in the environment.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
11. A non-transitory computer-readable medium having stored therein instructions that, when executed by a computing device, cause the computing device to perform functions comprising:
causing a plurality of projectors to project a plurality of different patterns of light, wherein causing the plurality of projectors to project the plurality of different patterns of light comprises causing a first projector of the plurality of projectors to project a first texture pattern having a first wavelength while causing a second projector of the plurality of projectors to project a second texture pattern having a second wavelength such that at least a portion of the first texture pattern overlaps at least a portion of the second texture pattern on a surface in an environment of the computing device;
receiving sensor data from a plurality of optical sensors, wherein the plurality of optical sensors are configured to distinguish between the plurality of different patterns of light, and wherein the sensor data comprises at least one first image of an overlapping of the first texture pattern and the second texture pattern as perceived from a first viewpoint of a first optical sensor of the plurality of optical sensors and at least one second image of the overlapping of the first texture pattern and the second texture pattern as perceived from a second viewpoint of a second optical sensor of the plurality of optical sensors;
based on the received sensor data, determining corresponding features between the at least one first image and the at least one second image; and
based on the determined corresponding features, determining an output including a virtual representation of the environment of the computing device, wherein the output comprises a depth measurement indicative of a distance to at least one object in the environment.
View Dependent Claims (12, 13, 14, 15, 16)
17. A system comprising:
a robotic manipulator;
at least one projector coupled to the robotic manipulator and configured to project different patterns of light;
at least one stereo camera coupled to the robotic manipulator and configured to obtain sensor data that is indicative of an environment of the system; and
a computing device configured to perform functions comprising:
determining an expected amount of motion of the robotic manipulator during a future time period,
determining that the expected amount of motion is less than a threshold, and
in response to determining that the expected amount of motion is less than the threshold, causing the at least one projector to project multiple texture patterns during the future time period.
View Dependent Claims (18, 19)
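The gating logic in claim 17 (project multiple patterns only when the manipulator is expected to be nearly still, so the overlapping patterns stay registered while they are imaged) reduces to a threshold test on predicted motion. A minimal sketch; the function name and the 5 mm threshold are illustrative, not from the patent:

```python
def should_project_patterns(expected_motion_m, threshold_m=0.005):
    """True when the expected manipulator motion over the upcoming time
    period is below the threshold, i.e. the scene is stable enough to
    project and image multiple overlapping texture patterns."""
    return expected_motion_m < threshold_m

print(should_project_patterns(0.001))  # arm nearly still -> True
print(should_project_patterns(0.050))  # arm in motion    -> False
```

In practice the expected motion would come from the manipulator's planned trajectory for the upcoming period, and the controller would fall back to single-pattern or no projection when the test fails.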
Specification