Method and system for in-store shopper behavior analysis with multi-modal sensor fusion
Abstract
The present invention provides a comprehensive method for automatically and unobtrusively analyzing the in-store behavior of people visiting a physical space using multi-modal fusion of multiple types of sensors. The sensors employed may include cameras that capture a plurality of images and mobile signal sensors that capture a plurality of Wi-Fi signals. The present invention integrates the input sensor measurements to reliably and persistently track people's physical attributes and detect their interactions with retail elements. The physical and contextual attributes collected from the processed shopper tracks include the motion dynamics changes triggered by implicit and explicit interactions with a retail element, comprising the behavior information for the people's trips. The present invention integrates point-of-sale transaction data with the shopper behavior by finding and associating the transaction data that corresponds to a shopper trajectory and fusing them to generate a complete intermediate representation of the shopper trip data, called a TripVector. The shopper behavior analyses are carried out based on the extracted TripVector. The analyzed behavior information for the shopper trips yields exemplary behavior analyses comprising map generation as a visualization of the behavior and quantitative shopper metric derivation at multiple scales (e.g., store-wide and category-level), including path-to-purchase shopper metrics (e.g., traffic distribution, shopping action distribution, buying action distribution, conversion funnel) and category dynamics (e.g., dominant path, category correlation, category sequence). The present invention includes a set of derived methods for different sensor configurations.
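The pipeline the abstract describes (fused trip representation in, shopper metrics out) can be illustrated with a minimal sketch. All names and the metric formulas below are illustrative assumptions for exposition; the patent defines the TripVector only as an intermediate representation of a trip:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class TripVector:
    """Hypothetical intermediate representation of one shopper trip:
    a fused trajectory plus recognized actions and purchased items."""
    trajectory: List[Tuple[float, float, float]]  # (t, x, y) samples
    actions: List[str] = field(default_factory=list)  # e.g. "approach", "pickup"
    purchased_items: List[str] = field(default_factory=list)


def derive_metrics(trips: List[TripVector]) -> Dict[str, float]:
    """Store-wide metrics of the kind the abstract lists: traffic,
    shopping-action distribution, and a buying conversion rate."""
    visits = len(trips)
    shopped = sum(1 for t in trips if t.actions)
    bought = sum(1 for t in trips if t.purchased_items)
    return {
        "traffic": float(visits),
        "shopping_rate": shopped / visits if visits else 0.0,
        "conversion": bought / visits if visits else 0.0,
    }
```

A category-level variant would compute the same ratios over only the trajectory samples falling inside one category's floor region.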
Citations
44 Claims
1. A method for automatically and unobtrusively analyzing in-store behavior of people visiting a physical space based on a fusion of a set of mobile signal- and vision-based person trajectories, an association of the set of mobile signal- and vision-based trajectories with a set of transaction data, and automatic recognition of a set of pre-defined shopping actions, using at least a computing machine, a set of mobile signal and vision sensors, and a set of computer vision and mobile signal processing algorithms, comprising:
a. setting up a plurality of types of vision and mobile signal sensors in an area of interest such as a retail store,
b. tracking a plurality of persons individually using a set of cameras, a set of mobile signal sensors, and a set of corresponding computer vision and mobile signal processing algorithms, yielding a set of vision-based trajectories and a set of mobile signal-based trajectories,
c. fusing a mobile signal-based trajectory to a set of corresponding vision-based trajectories through a matching method and generating a fused trajectory for a person, further comprising:
i. retrieving a pool of candidate vision-based trajectories from a database, wherein the pool of candidate vision-based trajectories is generated in a similar time frame to that during which the mobile signal-based trajectory is generated,
ii. identifying a set of vision-based trajectories among the pool of candidate vision-based trajectories by comparing the distance statistics of the set of vision-based trajectories to the mobile signal-based trajectory of the mobile-carrying person and comparing the motion dynamics, including direction and speed, of the set of vision-based trajectories and the mobile signal-based trajectory,
iii. integrating the set of vision-based trajectories to generate a fused trajectory and to account for a plurality of vision measurements for a same target at a same time instance,
iv. interpolating the missing segments of the fused trajectory by excerpting the missing segments from the mobile signal-based trajectory stored in a database, based on a set of point-to-point correspondence information between the set of vision-based trajectories and the mobile signal-based trajectory, and
v. refining the fused trajectory by incorporating a store floor plan and a set of layout information that describes an occupancy map of a set of fixtures and other facilities or equipment where a set of shopper trajectories cannot exist,
d. associating a transaction data set among a pool of candidate transaction data to the fused trajectory based on a set of purchased items and the locations of said set of purchased items,
e. extracting an intermediate shopper behavior representation, called a TripVector, from the fused trajectory and the transaction data set associated with said fused trajectory through detecting and recognizing a set of pre-defined shopping actions, and
f. generating a set of pre-defined shopper metric measurements and behavior analyses based on the TripVector, wherein the transaction data set includes a set of items purchased in a trip by a shopper.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 36, 37)
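The matching method of steps c.i–c.ii can be sketched as follows. The claim names the comparison criteria (distance statistics, direction, speed) but not the statistics or thresholds themselves, so mean point-to-point distance and mean speed difference are used here as hypothetical stand-ins:

```python
import math
from typing import List, Tuple

Trajectory = List[Tuple[float, float, float]]  # (t, x, y) samples


def mean_distance(traj_a: Trajectory, traj_b: Trajectory) -> float:
    """Mean point-to-point distance over the overlapping prefix of two
    trajectories sampled at the same time instances."""
    n = min(len(traj_a), len(traj_b))
    return sum(
        math.hypot(traj_a[i][1] - traj_b[i][1], traj_a[i][2] - traj_b[i][2])
        for i in range(n)
    ) / n


def mean_speed(traj: Trajectory) -> float:
    """Average speed, a simple stand-in for 'motion dynamics'."""
    speeds = [
        math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
        for (t0, x0, y0), (t1, x1, y1) in zip(traj, traj[1:])
    ]
    return sum(speeds) / len(speeds)


def match_vision_trajectories(
    mobile_traj: Trajectory,
    candidates: List[Trajectory],
    max_dist: float = 2.0,        # assumed threshold, metres
    max_speed_diff: float = 0.5,  # assumed threshold, m/s
) -> List[Trajectory]:
    """Keep the candidate vision trajectories consistent with the
    mobile-signal trajectory in both position and motion dynamics."""
    return [
        cand for cand in candidates
        if mean_distance(mobile_traj, cand) <= max_dist
        and abs(mean_speed(mobile_traj) - mean_speed(cand)) <= max_speed_diff
    ]
```

The matched set would then be integrated into a single fused trajectory (step c.iii), with gaps filled from the mobile-signal trajectory (step c.iv).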
23. An automatic and unobtrusive system that analyzes in-store behavior of a plurality of people visiting a physical space based on a fusion of a set of mobile signal- and vision-based person trajectories, an association of the set of mobile signal- and vision-based trajectories with a set of transaction data, and automatic recognition of a set of pre-defined shopping actions, using at least a computing machine, a set of mobile signal and vision sensors, and a set of computer vision and mobile signal processing algorithms, comprising:
a. a computer and a set of human operators that set up a plurality of types of vision and mobile signal sensors in an area of interest such as a retail store,
b. the computer that tracks a plurality of persons individually using a set of cameras, a set of mobile signal sensors, and a set of corresponding computer vision and mobile signal processing algorithms, yielding a set of vision-based trajectories and a set of mobile signal-based trajectories,
c. the computer that fuses a mobile signal-based trajectory to a set of corresponding vision-based trajectories through a matching method and generating a fused trajectory for a person, further comprising:
i. retrieving a pool of candidate vision-based trajectories from a database, wherein the pool of candidate vision-based trajectories is generated in a similar time frame to that during which the mobile signal-based trajectory is generated,
ii. identifying a set of vision-based trajectories among the pool of candidate vision-based trajectories by comparing the distance statistics of the set of vision-based trajectories to the mobile signal-based trajectory of the mobile-carrying person and comparing the motion dynamics, including direction and speed, of the set of vision-based trajectories and the mobile signal-based trajectory,
iii. integrating the set of vision-based trajectories to generate a fused trajectory and to account for a plurality of vision measurements for a same target at a same time instance,
iv. interpolating the missing segments of the fused trajectory by excerpting the missing segments from the mobile signal-based trajectory stored in a database, based on a set of point-to-point correspondence information between the set of vision-based trajectories and the mobile signal-based trajectory, and
v. refining the fused trajectory by incorporating a store floor plan and a set of layout information that describes an occupancy map of a set of fixtures and other facilities or equipment where a set of shopper trajectories cannot exist,
d. the computer that associates a transaction data set among a pool of candidate transaction data to the fused trajectory based on a set of purchased items and the locations of said set of purchased items,
e. the computer that extracts an intermediate shopper behavior representation, called a TripVector, from the fused trajectory and the transaction data set associated with said fused trajectory through detecting and recognizing a set of pre-defined shopping actions, and
f. the computer that generates a set of pre-defined shopper metric measurements and behavior analyses based on the TripVector, wherein a transaction data set includes a set of items purchased in a trip by a shopper.
- View Dependent Claims (24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 38, 39, 40, 41, 42, 43, 44)
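Step d, common to both independent claims, associates a transaction with a fused trajectory using the shelf locations of the purchased items. The claim names the inputs but not the matching function, so the coverage score below is an assumed stand-in: pick the candidate transaction whose items' locations the trajectory passed closest to:

```python
import math
from typing import Dict, List, Tuple

Trajectory = List[Tuple[float, float, float]]  # (t, x, y) samples


def visits_location(trajectory: Trajectory,
                    loc: Tuple[float, float],
                    radius: float = 1.5) -> bool:
    """True if the trajectory passes within `radius` of a shelf location."""
    return any(
        math.hypot(x - loc[0], y - loc[1]) <= radius
        for _t, x, y in trajectory
    )


def associate_transaction(trajectory: Trajectory,
                          transactions: List[dict],
                          item_locations: Dict[str, Tuple[float, float]]) -> dict:
    """Select the candidate transaction whose purchased items' shelf
    locations are best covered by the fused trajectory."""
    def coverage(txn: dict) -> float:
        items = txn["items"]
        hits = sum(
            1 for item in items
            if item in item_locations
            and visits_location(trajectory, item_locations[item])
        )
        return hits / len(items) if items else 0.0

    return max(transactions, key=coverage)
```

In practice the candidate pool would also be pre-filtered by checkout time against the trajectory's end time, which the claims leave implicit.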
Specification