
Visual-inertial positional awareness for autonomous and non-autonomous tracking

  • US 10,043,076 B1
  • Filed: 08/29/2016
  • Issued: 08/07/2018
  • Est. Priority Date: 08/29/2016
  • Status: Active Grant
First Claim

1. A system including:

  • a mobile platform having disposed thereon:

    at least one camera;

    a multi-axis inertial measuring unit (IMU); and

    an interface to a host including one or more processors coupled to memory, the memory loaded with computer instructions to update a position of the mobile platform that includes the camera with distance calculation and multi-axis IMU, the instructions, when executed on the processors, implement actions comprising:

    receiving a location of the mobile platform and perspective, including view direction, of the camera, referred to collectively as an initial pose;

    while waiting for a new frame, between successive camera frames, updating the initial pose using inertial data from the multi-axis IMU, to generate a propagated pose;

    correcting drift between the propagated pose, based on the inertial data, and an actual perspective of a new pose, using the new frame captured by the camera, including:

    using the propagated pose, estimating an overlap between the successive camera frames to reduce computation requirements, correlating the new frame with a previous frame by 2D comparison of the successive camera frames, beginning with the estimated overlap;

    retrieving at least some feature points within a field of view of the propagated pose from a 3D map using the propagated pose;

    extracting new features from the new frame;

    matching the extracted new features to the retrieved feature points based on (1) reuse of matched features from the previous frame and (2) matching of features in the new frame with reprojected feature positions from the 3D map onto a 2D view from a perspective of the propagated pose, producing a list of matching features; and

    calculating a visually corrected pose using positions of the matching features in the list of matching features to determine a perspective from which the new frame was viewed by the camera; and

    responsive to requests for location of the mobile platform and/or the perspective of the camera, providing data based on one or both of the propagated pose, based on the inertial data, and the visually corrected pose.
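
The claim above recites a visual-inertial tracking pipeline; the sketches that follow walk through its steps one at a time. They are illustrative only: identifiers, data layouts, and parameter values are assumptions, not the patent's disclosure. First, the element on updating the initial pose between successive camera frames describes inertial dead-reckoning. A minimal sketch, assuming a rotation-matrix pose, first-order gyro integration, and gravity removal in the world frame (`Pose`, `ImuSample`, and `propagate_pose` are hypothetical names):

```python
# Minimal sketch of IMU-based pose propagation between camera frames.
# All names (Pose, ImuSample, propagate_pose) are illustrative; the patent
# does not specify this representation.
from dataclasses import dataclass
import numpy as np


@dataclass
class Pose:
    position: np.ndarray      # 3-vector, meters, world frame
    rotation: np.ndarray      # 3x3 rotation matrix, body-to-world
    velocity: np.ndarray      # 3-vector, m/s, world frame


@dataclass
class ImuSample:
    gyro: np.ndarray          # 3-vector, rad/s, body frame
    accel: np.ndarray         # 3-vector, m/s^2, body frame (specific force)
    dt: float                 # seconds since the previous sample


GRAVITY = np.array([0.0, 0.0, -9.81])  # assumed z-up world frame


def small_angle_rotation(omega: np.ndarray, dt: float) -> np.ndarray:
    """First-order rotation update from a single angular-rate sample."""
    wx, wy, wz = omega * dt
    skew = np.array([[0.0, -wz, wy],
                     [wz, 0.0, -wx],
                     [-wy, wx, 0.0]])
    return np.eye(3) + skew


def propagate_pose(initial: Pose, samples: list) -> Pose:
    """Integrate IMU samples accumulated while waiting for the next frame."""
    R = initial.rotation.copy()
    p = initial.position.copy()
    v = initial.velocity.copy()
    for s in samples:
        R = R @ small_angle_rotation(s.gyro, s.dt)
        accel_world = R @ s.accel + GRAVITY          # remove gravity in world frame
        p = p + v * s.dt + 0.5 * accel_world * s.dt ** 2
        v = v + accel_world * s.dt
    return Pose(position=p, rotation=R, velocity=v)
```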
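
The drift-correction element first estimates how far the image content should have shifted under the propagated pose and then correlates the new frame with the previous one starting from that estimate, so only a small search window is scanned. A sketch under a small-rotation pinhole assumption; the shift model, the normalized-correlation score, and the `radius` gate are illustrative choices, not the patent's method:

```python
# Sketch: use the propagated pose to seed a 2D correlation between successive
# frames, searching only a small window around the predicted shift.
import numpy as np


def predict_pixel_shift(d_rotation: np.ndarray, focal_px: float):
    """Approximate image shift caused by a small inter-frame rotation.

    d_rotation is the relative rotation (previous pose to propagated pose);
    for small angles, yaw/pitch map roughly to horizontal/vertical shift.
    """
    yaw = np.arctan2(d_rotation[0, 2], d_rotation[2, 2])
    pitch = -np.arcsin(np.clip(d_rotation[1, 2], -1.0, 1.0))
    return int(round(focal_px * yaw)), int(round(focal_px * pitch))


def correlate_with_seed(prev: np.ndarray, new: np.ndarray,
                        seed_shift, radius: int = 8):
    """Find the integer shift that best aligns the frames, scanning only a
    small window around the seed shift predicted from the propagated pose."""
    best, best_score = seed_shift, -np.inf
    for dx in range(seed_shift[0] - radius, seed_shift[0] + radius + 1):
        for dy in range(seed_shift[1] - radius, seed_shift[1] + radius + 1):
            # np.roll wraps around the image borders; acceptable for a sketch.
            shifted = np.roll(np.roll(new, -dy, axis=0), -dx, axis=1)
            a, b = prev - prev.mean(), shifted - shifted.mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            score = float((a * b).sum() / denom) if denom > 0 else -np.inf
            if score > best_score:
                best, best_score = (dx, dy), score
    return best
```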
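
Retrieving feature points "within a field of view of the propagated pose" amounts to culling the 3D map by visibility. A minimal sketch assuming a pinhole camera with intrinsic matrix `K`; the map layout (an `(N, 3)` array of world points) is an assumption:

```python
# Sketch: keep only the 3D map points that project inside the image under the
# propagated pose (pinhole model). Names and map layout are assumptions.
import numpy as np


def visible_map_points(points_world: np.ndarray, rotation: np.ndarray,
                       position: np.ndarray, K: np.ndarray,
                       width: int, height: int) -> np.ndarray:
    """Return indices of map points within the camera's field of view.

    points_world: (N, 3) map points in the world frame
    rotation / position: propagated camera pose in the world frame
    K: 3x3 camera intrinsic matrix
    """
    # Transform map points into the camera frame (world-to-camera).
    pts_cam = (rotation.T @ (points_world - position).T).T
    in_front = pts_cam[:, 2] > 0.1                     # discard points behind the camera
    proj = (K @ pts_cam.T).T
    uv = proj[:, :2] / np.maximum(proj[:, 2:3], 1e-9)  # perspective divide
    in_image = ((uv[:, 0] >= 0) & (uv[:, 0] < width) &
                (uv[:, 1] >= 0) & (uv[:, 1] < height))
    return np.flatnonzero(in_front & in_image)
```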
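
Feature extraction itself is assumed to come from any standard detector (the claim does not name one); the sketch below covers the two matching sources the claim recites: reuse of matched features from the previous frame, and matching of new features against map points reprojected onto the 2D view of the propagated pose. The descriptor metric and the pixel gate are illustrative:

```python
# Sketch of the two-stage matching recited in the claim: (1) reuse of matched
# features from the previous frame, and (2) matching against map points
# reprojected from the propagated pose. Feature indices are assumed to persist
# across frames (e.g., via tracking); all names are illustrative.
import numpy as np


def reproject(points_world, rotation, position, K):
    """Project world points into the 2D view of the propagated pose."""
    cam = (rotation.T @ (points_world - position).T).T
    proj = (K @ cam.T).T
    return proj[:, :2] / np.maximum(proj[:, 2:3], 1e-9)


def match_features(new_kps, new_desc, map_ids, map_uv, map_desc,
                   prev_matches, gate_px=12.0):
    """Return a sorted list of (feature_index, map_point_id) matches.

    new_kps / new_desc: keypoint positions and descriptors from the new frame
    map_ids / map_uv / map_desc: ids, reprojected positions, and descriptors
        of the map points retrieved for the propagated pose
    prev_matches: dict feature_index -> map_point_id from the previous frame
    """
    matches = {}
    # Stage 1: reuse previous-frame matches that are still geometrically
    # consistent with the reprojected map-point position.
    for feat_idx, mp_id in prev_matches.items():
        rows = np.flatnonzero(map_ids == mp_id)
        if rows.size and feat_idx < len(new_kps):
            if np.linalg.norm(new_kps[feat_idx] - map_uv[rows[0]]) < gate_px:
                matches[feat_idx] = int(mp_id)
    # Stage 2: for still-unmatched features, pick the closest-descriptor map
    # point among those reprojected within the pixel gate.
    for i in range(len(new_kps)):
        if i in matches:
            continue
        near = np.flatnonzero(np.linalg.norm(map_uv - new_kps[i], axis=1) < gate_px)
        if near.size:
            d = np.linalg.norm(map_desc[near] - new_desc[i], axis=1)
            matches[i] = int(map_ids[near[np.argmin(d)]])
    return sorted(matches.items())
```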
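
The visually corrected pose is computed from the positions of the matched features; one common way to do this is to refine the propagated pose by minimizing reprojection error over the 2D-3D matches (a Gauss-Newton variant of PnP). The solver below is an illustrative choice, not the patent's specific algorithm; `rodrigues` and `correct_pose` are hypothetical names:

```python
# Sketch: refine the propagated pose (R0, p0) into a visually corrected pose
# by minimizing reprojection error over matched 2D-3D correspondences,
# using Gauss-Newton with numerical Jacobians. Illustrative only.
import numpy as np


def rodrigues(w):
    """Rotation matrix from an axis-angle 3-vector."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * Kx + (1 - np.cos(theta)) * (Kx @ Kx)


def residuals(params, pts3d, pts2d, K, R0, p0):
    """Stacked reprojection errors for a 6-DoF perturbation of (R0, p0)."""
    R = R0 @ rodrigues(params[:3])
    p = p0 + params[3:]
    cam = (R.T @ (pts3d - p).T).T
    uv = (K @ cam.T).T
    uv = uv[:, :2] / np.maximum(uv[:, 2:3], 1e-9)
    return (uv - pts2d).ravel()


def correct_pose(R0, p0, pts3d, pts2d, K, iters=10):
    """Gauss-Newton refinement of the propagated pose."""
    x = np.zeros(6)
    eps = 1e-6
    for _ in range(iters):
        r = residuals(x, pts3d, pts2d, K, R0, p0)
        J = np.zeros((r.size, 6))
        for j in range(6):                      # numerical Jacobian column
            dx = np.zeros(6)
            dx[j] = eps
            J[:, j] = (residuals(x + dx, pts3d, pts2d, K, R0, p0) - r) / eps
        x -= np.linalg.lstsq(J, r, rcond=None)[0]
    return R0 @ rodrigues(x[:3]), p0 + x[3:]
```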
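
Finally, the last element lets location/perspective requests be answered from either the propagated pose or the visually corrected pose. A sketch of one plausible policy, in which a fresh visual correction is preferred and the IMU-propagated pose is the fallback; `PoseService` and the freshness threshold are assumptions, not from the patent:

```python
# Sketch: answer host requests for location/perspective from the latest
# visually corrected pose when it is fresh, else the IMU-propagated pose.
import time
from dataclasses import dataclass
from typing import Optional


@dataclass
class PoseService:
    propagated_pose: Optional[object] = None    # updated between frames from IMU data
    corrected_pose: Optional[object] = None     # updated when a new frame is processed
    corrected_time: float = 0.0
    max_correction_age_s: float = 0.1           # assumed freshness threshold

    def on_imu_update(self, pose) -> None:
        self.propagated_pose = pose

    def on_visual_correction(self, pose) -> None:
        self.corrected_pose = pose
        self.corrected_time = time.monotonic()

    def query(self):
        """Answer a location/perspective request from the host."""
        fresh = (time.monotonic() - self.corrected_time) < self.max_correction_age_s
        if self.corrected_pose is not None and fresh:
            return self.corrected_pose
        return self.propagated_pose
```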
