
Method and system for full path analysis

  • US 10,163,031 B2
  • Filed: 09/05/2012
  • Issued: 12/25/2018
  • Est. Priority Date: 02/29/2012
  • Status: Expired due to Fees
First Claim

1. A method comprising:

  • identifying a base track of a tracked subject, the base track being identified as a first track in a first camera space based on images of the tracked subject captured by a first camera in a space, the first track being a first superimposed line in the first camera space that represents locations occupied by the tracked subject in the first camera space over a fixed period of time;

    identifying a first candidate track of a first candidate subject from a plurality of candidate tracks captured by a second camera, the first candidate track being identified as a second track in a second camera space based on images of the first candidate subject captured by the second camera in the space, the second track being a second superimposed line in the second camera space that represents locations occupied by the first candidate subject in the second camera space over the fixed period of time;

    identifying a second candidate track of a second candidate subject from the plurality of candidate tracks captured by the second camera, the second candidate track being identified as a third track in the second camera space based on images of the second candidate subject captured by the second camera in the space, the third track being a third superimposed line in the second camera space that represents locations occupied by the second candidate subject in the second camera space over the fixed period of time;

    generating a model comprising a camera-space data to real-space data correspondence that is independent of camera lens type, the camera-space data having coordinates measured in pixels and the real-space data having coordinates measured in lengths, the model being generated by mapping a set of calibration points identified in the camera space of each camera to a set of corresponding calibration points identified in a floor plan representing the real space;

    applying the model comprising the camera-space to real-space correspondence mapping data to project the base track, the first candidate track, and the second candidate track onto the floor plan of the space;

    comparing a first set of attributes associated with the tracked subject, a second set of attributes associated with the first candidate subject, and a third set of attributes associated with the second candidate subject, the comparing comprising:

    scoring the projected base track and projected first candidate track by calculating a first plurality of differences between the first set of attributes and the second set of attributes and applying weights to the first plurality of differences such that a color attribute difference is weighted more than the other attributes to determine a score for the projected first candidate track; and

    scoring the projected base track and projected second candidate track by calculating a second plurality of differences between the first set of attributes and the third set of attributes and applying the same weights as applied to the first plurality of differences to determine a score for the projected second candidate track;

    determining, based upon the comparison, that the first candidate subject is the same as the tracked subject when the score for the projected first candidate track is greater than a user-specified threshold; and

    joining, based upon the determination that the first candidate subject is the same as the tracked subject, the projected first candidate track to the projected base track to create a continuous track of the tracked subject on the floor plan of the space.
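
Illustrative sketch (Python): the per-camera track of the claim, the "superimposed line" of locations a subject occupies over the fixed period of time, can be represented as a simple data structure. The field names below are assumptions made for illustration, not terms taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Track:
    """A track in one camera's pixel space: the sequence of (u, v) locations
    a subject occupies over the fixed observation period.
    Field names are illustrative only, not drawn from the patent."""
    camera_id: str
    subject_id: str
    pixel_points: List[Tuple[float, float]]   # per-frame (u, v) locations
    start_time: float                         # start of the fixed period
    end_time: float                           # end of the fixed period
```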
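
The claimed camera-space to real-space model, generated from calibration point pairs between each camera's image and the floor plan, could for a planar floor be realized as a homography. The sketch below uses a direct linear transform (DLT); the function names and the choice of a homography are assumptions for illustration, not the patent's stated implementation.

```python
import numpy as np

def fit_floor_homography(camera_pts, floor_pts):
    """Fit a 3x3 homography H mapping pixel coordinates (u, v) to floor-plan
    coordinates (x, y) from at least four calibration point correspondences,
    using the direct linear transform (DLT)."""
    rows = []
    for (u, v), (x, y) in zip(camera_pts, floor_pts):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # The homography vector is the null vector of the stacked constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project_track(H, track_pixels):
    """Project a track (a sequence of (u, v) pixel locations) onto the
    floor plan, returning (x, y) real-space coordinates."""
    pts = np.asarray(track_pixels, dtype=float)
    homogeneous = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homogeneous @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

With one such model per camera, the base track from the first camera and both candidate tracks from the second camera land in a common floor-plan coordinate frame, where they can be compared and joined.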
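
The weighted attribute comparison, user-specified threshold test, and track joining might be sketched as follows. The particular attributes, weight values, similarity form, and the 0.8 threshold are illustrative assumptions; the claim requires only that the color attribute difference carry the largest weight, that the same weights be applied to each candidate, and that a candidate's score exceed the user-specified threshold before its projected track is joined to the projected base track.

```python
import numpy as np

# Illustrative weights: the claim requires only that the color attribute
# difference be weighted more heavily than the others and that the same
# weights be reused for every candidate.
WEIGHTS = {"color": 0.5, "height": 0.2, "speed": 0.15, "direction": 0.15}

def score_candidate(base_attrs, cand_attrs, weights=WEIGHTS):
    """Score a projected candidate track against the projected base track
    from weighted per-attribute differences (higher score = closer match).
    Attributes are treated as scalars here for simplicity."""
    score = 0.0
    for name, w in weights.items():
        diff = abs(base_attrs[name] - cand_attrs[name])
        score += w / (1.0 + diff)   # smaller difference -> larger contribution
    return score

def join_matching_track(base_track_xy, candidates, base_attrs, threshold=0.8):
    """Join the best-scoring candidate track to the base track on the floor
    plan when its score exceeds the user-specified threshold.  `candidates`
    is a list of (projected_track_xy, attrs) pairs; the threshold value is
    illustrative."""
    best_score, best_track = None, None
    for track_xy, attrs in candidates:
        s = score_candidate(base_attrs, attrs)
        if s > threshold and (best_score is None or s > best_score):
            best_score, best_track = s, track_xy
    if best_track is None:
        return np.asarray(base_track_xy)          # no candidate matched
    # Concatenate to form one continuous track of the tracked subject.
    return np.vstack([np.asarray(base_track_xy), np.asarray(best_track)])
```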
