Visual-inertial positional awareness for autonomous and non-autonomous device
First Claim
1. A system including:
a mobile platform controllable by a host, the mobile platform having disposed thereon:
a visual sensor comprising at least one RGB sensing capable camera and at least one other grayscale camera disposed at a distance relative to one another to form a region in which the fields of view at least partially overlap, thereby providing stereoscopic imaging capability;
a multi-axis inertial measuring unit (IMU) capable of providing measurement of at least acceleration; and
a visual inertial control unit, including:
a first interface that couples to the visual sensor to receive sets of image data;
a second interface that couples to the multi-axis IMU to receive accelerometer sensor data;
a cache storage that stores the sets of image data;
a single instruction-multiple data processing element having direct memory access to the cache storage;
an inertial measurement engine that performs time stamping of inertial data received via the second interface, corrects the timestamped inertial data for bias, applies a stored scale factor to the corrected inertial data, and corrects the scaled inertial data for misalignment in the IMU to form localization data;
an imaging engine that performs imaging undistortion on the sets of image data; and
a communications interface to provide the localization data and the undistorted sets of image data to a host controlling the mobile platform.
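As an editorial illustration, the inertial measurement engine recited above applies a fixed sequence: time stamping, bias correction, a stored scale factor, and misalignment correction. A minimal Python sketch of that sequence follows; all names and calibration values (BIAS, SCALE, MISALIGNMENT) are assumptions for the sketch, not values from the patent.

```python
import time

# Assumed per-axis calibration constants for illustration only.
BIAS = (0.02, -0.01, 0.05)       # accelerometer bias estimate (m/s^2)
SCALE = (1.001, 0.998, 1.002)    # stored per-axis scale factors
# Small off-diagonal terms model axis misalignment inside the IMU.
MISALIGNMENT = (
    (1.0,    0.001, -0.002),
    (-0.001, 1.0,    0.003),
    (0.002, -0.003,  1.0),
)

def correct_inertial(raw_xyz, timestamp=None):
    """Return a timestamped, bias/scale/misalignment-corrected sample."""
    if timestamp is None:
        timestamp = time.monotonic()   # time stamping on receipt
    # 1. Subtract the bias estimate from each raw readout.
    unbiased = [r - b for r, b in zip(raw_xyz, BIAS)]
    # 2. Apply the stored scale factor per axis.
    scaled = [u * s for u, s in zip(unbiased, SCALE)]
    # 3. Correct for misalignment with a 3x3 matrix multiply.
    corrected = [
        sum(MISALIGNMENT[i][j] * scaled[j] for j in range(3))
        for i in range(3)
    ]
    return timestamp, corrected

ts, acc = correct_inertial((0.12, -0.04, 9.81))
```

The corrected samples form the localization data that the communications interface forwards to the host.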
Abstract
The described positional awareness techniques employ visual-inertial sensory data gathering and analysis hardware, described with reference to specific example implementations. Improvements in the use of sensors, techniques, and hardware design enable specific embodiments to provide positional awareness to machines with improved speed and accuracy.
21 Claims
1. A system including:
a mobile platform controllable by a host, the mobile platform having disposed thereon:
a visual sensor comprising at least one RGB sensing capable camera and at least one other grayscale camera disposed at a distance relative to one another to form a region in which the fields of view at least partially overlap, thereby providing stereoscopic imaging capability;
a multi-axis inertial measuring unit (IMU) capable of providing measurement of at least acceleration; and
a visual inertial control unit, including:
a first interface that couples to the visual sensor to receive sets of image data;
a second interface that couples to the multi-axis IMU to receive accelerometer sensor data;
a cache storage that stores the sets of image data;
a single instruction-multiple data processing element having direct memory access to the cache storage;
an inertial measurement engine that performs time stamping of inertial data received via the second interface, corrects the timestamped inertial data for bias, applies a stored scale factor to the corrected inertial data, and corrects the scaled inertial data for misalignment in the IMU to form localization data;
an imaging engine that performs imaging undistortion on the sets of image data; and
a communications interface to provide the localization data and the undistorted sets of image data to a host controlling the mobile platform.
- View Dependent Claims (2, 3, 4, 5, 6, 7)
8. An apparatus for guiding a mobile device using information from one or more cameras with distance calculation and a multi-axis inertial measuring unit (IMU), the apparatus including:
a first interface that couples to the one or more cameras to receive sets of image data;
a second interface that couples to a multi-axis IMU to receive accelerometer sensor data;
a cache storage that stores the sets of image data;
a single instruction-multiple data processing element having direct memory access to the cache storage;
an inertial measurement engine that performs time stamping of inertial data received via the second interface, corrects inertial readouts in the timestamped inertial data for bias, applies a stored scale factor to the corrected inertial data, and corrects the scaled inertial data for misalignment in the IMU to form localization data;
an imaging engine that performs imaging undistortion on the sets of image data; and
a communications interface to provide the localization data and the undistorted sets of image data to a host controlling the mobile device.
- View Dependent Claims (9, 10, 11, 12, 13, 14)
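The imaging undistortion step performed by the claimed imaging engine can likewise be illustrated. The sketch below assumes a one-coefficient Brown-Conrady radial model with a single-step inverse approximation; production pipelines use fuller models and typically iterate. The intrinsics FX, FY, CX, CY and coefficient K1 are assumed values, not values from the patent.

```python
# Assumed camera intrinsics for illustration only.
FX, FY = 450.0, 450.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point
K1 = -0.25              # radial distortion coefficient

def undistort_point(u, v):
    """Map a distorted pixel (u, v) to its undistorted position."""
    # Normalize to camera coordinates.
    x = (u - CX) / FX
    y = (v - CY) / FY
    r2 = x * x + y * y
    # One-step approximation of the inverse radial distortion:
    # divide out the first-order distortion factor.
    factor = 1.0 + K1 * r2
    xu, yu = x / factor, y / factor
    # Project back to pixel coordinates.
    return CX + FX * xu, CY + FY * yu

# The principal point is a fixed point of the model.
assert undistort_point(CX, CY) == (CX, CY)
```

In the claimed system this correction would be applied to every pixel of the buffered image sets before they are forwarded to the host.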
15. A non-transitory computer readable storage medium impressed with computer program instructions to guide a mobile device using information from a camera with distance calculation and multi-axis inertial measuring unit (IMU), the instructions, when executed on a processor, implement a method, the method including:
buffering image sets from a visual sensor comprising at least one RGB sensing capable camera and from at least one other grayscale camera disposed at a distance relative to one another to form a region in which the fields of view at least partially overlap, thereby providing stereoscopic imaging capability;
buffering inertial measurements from a multi-axis inertial measuring unit (IMU) capable of providing measurement of at least acceleration;
receiving at a visual inertial control unit the sets of image data;
receiving at the visual inertial control unit sensor data from the multi-axis IMU;
time stamping by an inertial measurement engine the inertial data received;
correcting inertial readouts in the timestamped inertial data for bias;
scaling the corrected inertial data using a stored scale factor;
correcting the scaled inertial data for misalignment in the IMU to form localization data;
performing imaging undistortion on the sets of image data; and
providing across a communications interface the localization data and the undistorted sets of image data to a host controlling a mobile platform.
- View Dependent Claims (16, 17, 18, 19, 20, 21)
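The buffering steps recited in the method of claim 15 can be sketched as a pair of timestamped queues that the control unit drains in arrival order. Queue sizes, record shapes, and function names below are assumptions for illustration.

```python
from collections import deque

# Bounded queues: (timestamp, payload) records, oldest first.
IMAGE_BUFFER = deque(maxlen=8)    # (timestamp, stereo image pair)
IMU_BUFFER = deque(maxlen=256)    # (timestamp, accelerometer xyz)

def buffer_image_set(timestamp, rgb_frame, grayscale_frame):
    # The overlapping fields of view make the pair a stereo image set.
    IMAGE_BUFFER.append((timestamp, (rgb_frame, grayscale_frame)))

def buffer_imu_sample(timestamp, accel_xyz):
    IMU_BUFFER.append((timestamp, accel_xyz))

def imu_samples_before(t):
    """Drain all inertial samples stamped at or before image time t."""
    out = []
    while IMU_BUFFER and IMU_BUFFER[0][0] <= t:
        out.append(IMU_BUFFER.popleft())
    return out
```

Pairing each image set with the inertial samples that precede it is what lets the downstream engines fuse the two streams into localization data.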
Specification