Dynamic content analysis and engagement system for shared displays
First Claim
1. A system for sensing and analyzing a level of engagement of viewers with content outputted to a shared digital display comprising:
an attention sensor comprising one or more sensors proximately located to the shared digital display and configured to detect presence of one or more viewers within a determined range of the shared digital display and configured to detect image and/or audio data associated with each viewer; and
a computing system configured to:
analyze the presence and image and/or audio data associated with each viewer detected by the attention sensor to generate engagement metrics regarding each viewer, the generated engagement metrics comprising individual viewer metrics and crowd metrics, the individual viewer metrics including at least arrival time, departure time, viewing time, and viewer dwell time, and the crowd metrics including at least one of crowd size, crowd turnover, crowd engaged, crowd disengaged, or a crowd engagement rating; and
in near real time, dynamically reconfigure content presented on the shared digital display based upon the generated engagement metrics.
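The individual viewer metrics recited in the claim (arrival time, departure time, viewing time, and dwell time) can be illustrated with a minimal per-viewer tracker. The class name, the per-frame update interface, and the engagement ratio below are assumptions chosen for illustration; they are not taken from the patent's specification.

```python
class ViewerMetrics:
    """Per-viewer timing metrics named in claim 1 (illustrative sketch)."""

    def __init__(self, viewer_id, arrival_time):
        self.viewer_id = viewer_id
        self.arrival_time = arrival_time   # first detection within sensor range
        self.departure_time = None         # set when the viewer leaves
        self.dwell_time = 0.0              # total seconds spent within range
        self.viewing_time = 0.0            # seconds spent gazing at the display

    def update(self, gazing_at_display, dt):
        """Advance by one sensor frame; dt is seconds since the last frame."""
        self.dwell_time += dt
        if gazing_at_display:
            self.viewing_time += dt

    def depart(self, departure_time):
        """Record the time the viewer left the sensed range."""
        self.departure_time = departure_time

    def engagement_ratio(self):
        """Fraction of dwell time spent actually viewing the display."""
        return self.viewing_time / self.dwell_time if self.dwell_time else 0.0
```

A hypothetical analysis module would create one such record per detected viewer and feed it every sensor frame, giving the computing system the timing inputs the claim requires.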
1 Assignment
0 Petitions
Abstract
Methods, systems, and techniques for analyzing the level of engagement of viewers of shared digital displays and enhancing that engagement by dynamically modifying the content of the display and incorporating personalized content created by viewers are provided. One example embodiment is the Dynamic Content Analysis and Engagement System (DCAES), which enables the operator of the system to analyze engagement, generate metrics, manage owner and viewer generated content, and dynamically modify the content on the shared display to increase the engagement of the viewer. In one example embodiment, the DCAES comprises an attention sensor module, an analysis module, a content management module, and a display control module.
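The four modules named in the abstract can be sketched as a minimal wiring of interfaces. Every detail below (method names, the frame and metrics dictionaries, the `DCAES.step` loop) is a hypothetical reading of the abstract for illustration, not the patent's actual implementation.

```python
class AttentionSensorModule:
    def read_frame(self):
        """Return raw presence plus image/audio data for each viewer."""
        raise NotImplementedError

class AnalysisModule:
    def compute_metrics(self, frame):
        """Derive individual-viewer and crowd engagement metrics."""
        raise NotImplementedError

class ContentManagementModule:
    def select_content(self, metrics):
        """Choose owner- or viewer-generated content given the metrics."""
        raise NotImplementedError

class DisplayControlModule:
    def render(self, content):
        """Push the reconfigured content to the shared display."""
        raise NotImplementedError

class DCAES:
    """Wires the four modules into the near-real-time loop of the claims."""

    def __init__(self, sensor, analysis, content, display):
        self.sensor, self.analysis = sensor, analysis
        self.content, self.display = content, display

    def step(self):
        """One sense -> analyze -> select -> render iteration."""
        frame = self.sensor.read_frame()
        metrics = self.analysis.compute_metrics(frame)
        self.display.render(self.content.select_content(metrics))
```

Running `step` repeatedly would realize the "dynamically reconfigure content in near real time" behavior: each iteration re-derives metrics from fresh sensor data and re-renders accordingly.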
19 Claims
1. A system for sensing and analyzing a level of engagement of viewers with content outputted to a shared digital display comprising:
an attention sensor comprising one or more sensors proximately located to the shared digital display and configured to detect presence of one or more viewers within a determined range of the shared digital display and configured to detect image and/or audio data associated with each viewer; and
a computing system configured to:
analyze the presence and image and/or audio data associated with each viewer detected by the attention sensor to generate engagement metrics regarding each viewer, the generated engagement metrics comprising individual viewer metrics and crowd metrics, the individual viewer metrics including at least arrival time, departure time, viewing time, and viewer dwell time, and the crowd metrics including at least one of crowd size, crowd turnover, crowd engaged, crowd disengaged, or a crowd engagement rating; and
in near real time, dynamically reconfigure content presented on the shared digital display based upon the generated engagement metrics.
Dependent claims: 2-16.
17. A method for sensing and analyzing a level of engagement of a plurality of viewers with content outputted to a shared digital display comprising:
receiving data from an attention sensor detecting presence of one or more viewers within a determined range of the shared digital display and detecting image and/or audio data associated with each viewer;
combining the detected presence data and the image and/or audio data for each viewer to generate a multidimensional model to determine posture and/or physical state of each viewer;
analyzing the multidimensional model to determine a focal point for each viewer;
calculating engagement metrics based upon the determined focal point, the engagement metrics comprising individual viewer metrics and crowd metrics, the individual viewer metrics including at least arrival time, departure time, viewing time, and viewer dwell time, and the crowd metrics including at least one of crowd size, crowd turnover, crowd engaged, crowd disengaged, or a crowd engagement rating; and
generating and reconfiguring content in near real time to present modified content on the shared digital display based upon the calculated engagement metrics.
Dependent claim: 18.
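The "determine focal point" step of claim 17 could, under one plausible reading, be realized as a ray-plane intersection between the viewer's gaze direction (taken from the multidimensional model) and the display surface. The function below is such a sketch, with the display modeled as the plane z = 0; the coordinate convention and function name are assumptions, not the patent's prescribed method.

```python
def focal_point_on_display(head_pos, gaze_dir, display_z=0.0):
    """Intersect a viewer's gaze ray with the display plane z = display_z.

    head_pos is the viewer's head position (x, y, z) and gaze_dir the
    gaze direction vector, both taken from the multidimensional model.
    Returns the (x, y) focal point on the display, or None when the
    viewer is looking parallel to or away from the display plane.
    """
    hx, hy, hz = head_pos
    gx, gy, gz = gaze_dir
    if abs(gz) < 1e-9:                 # gaze parallel to the display plane
        return None
    t = (display_z - hz) / gz          # parameter along the gaze ray
    if t <= 0:                         # display plane is behind the viewer
        return None
    return (hx + t * gx, hy + t * gy)
```

The resulting focal point could then be tested against the display's bounds to classify the viewer as engaged (looking at the screen) or disengaged, feeding the metric calculation in the next step of the claim.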
19. A non-transitory computer readable storage medium containing instructions for controlling a computer processor to determine a level of engagement with content presented on a shared digital display by performing a method comprising:
receiving data from an attention sensor detecting presence of one or more viewers within a determined range of the shared digital display and detecting image and/or audio data associated with each viewer;
combining the detected presence data and the image and/or audio data for each viewer to generate a multidimensional model associated with each viewer;
analyzing the multidimensional model to determine a focal point for each viewer and an individual engagement metric including at least one of dwell time, viewing time, arrival time, or departure time;
calculating engagement metrics based upon the determined focal point and the individual engagement metric, the engagement metrics comprising individual viewer metrics and crowd metrics, the individual viewer metrics including at least arrival time, departure time, viewing time, and viewer dwell time, and the crowd metrics including at least one of crowd size, crowd turnover, crowd engaged, crowd disengaged, or a crowd engagement rating; and
generating and reconfiguring content in near real time for the shared digital display based upon the calculated engagement metrics.
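The crowd metrics recited in the claims (crowd size, crowd engaged, crowd disengaged, crowd engagement rating) can be aggregated from the individual viewer metrics. The sketch below assumes a per-viewer record of dwell and viewing time and a hypothetical 0.5 viewing-to-dwell threshold separating "engaged" from "disengaged" viewers; neither the record layout nor the threshold comes from the patent text.

```python
def crowd_metrics(viewers, engaged_threshold=0.5):
    """Aggregate per-viewer timing records into crowd-level metrics.

    Each viewer is a dict with 'dwell_time' and 'viewing_time' in
    seconds; a viewer counts as engaged when the share of dwell time
    spent viewing meets the (assumed) threshold.
    """
    size = len(viewers)
    engaged = sum(
        1 for v in viewers
        if v["dwell_time"] > 0
        and v["viewing_time"] / v["dwell_time"] >= engaged_threshold
    )
    return {
        "crowd_size": size,
        "crowd_engaged": engaged,
        "crowd_disengaged": size - engaged,
        "crowd_engagement_rating": engaged / size if size else 0.0,
    }
```

Crowd turnover, the remaining metric in the claims, would additionally require comparing arrival and departure times across successive time windows, which this per-snapshot sketch does not attempt.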
Specification