Predictive user modeling in user interface design
Abstract
Dynamic modification of user interfaces is disclosed, based upon identification of the current state of the user and the sensing of a particular situation in which the user is involved and/or the environment in which the user is situated. In particular, emotional and mental states of a user are identified, and these states are taken into consideration when creating and/or adapting an interface to be used by the user. The interface is modified/created automatically based on identified user biometrics, that is, measured physical properties of the user.
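As a rough illustration of the sensing flow the abstract describes, combining measured physical properties of the user with environmental conditions to recognize a current situation, one might sketch the logic as below. All function names, dictionary keys, and thresholds here are invented for illustration and do not appear in the patent:

```python
def identify_emotional_state(heart_rate, skin_conductance):
    """Toy rule-based classifier over user biometrics.

    A real system would use a trained model rather than fixed thresholds.
    """
    if heart_rate > 110 and skin_conductance > 0.8:
        return "stressed"
    return "calm"

def recognize_situation(physical, environment):
    """Combine measured physical properties with environmental conditions."""
    # A medical-device indication (e.g., cardiac alert) dominates everything else.
    if physical.get("medical_alert") == "cardiac":
        return "medical_emergency"
    if environment.get("vehicle_moving") and physical.get("heart_rate", 0) > 110:
        return "stressful_driving"
    return "routine"

state = identify_emotional_state(heart_rate=120, skin_conductance=0.9)
situation = recognize_situation({"heart_rate": 120}, {"vehicle_moving": True})
```

The recognized state and situation would then feed the interface-creation step described in the claims.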
18 Claims
1. A method of facilitating machine interaction with a user, comprising the steps of:
identifying a current user emotional state of a user;
recognizing in real time a current situation involving said user by:
measuring current physical properties of said user;
measuring current environmental conditions affecting said user; and
recognizing the current situation involving said user based on said measured current physical properties and said measured current environmental conditions;
identifying a plurality of possible user interface components to provide to the user;
calculating a quantitative measure of impact on a state of the user for each of the plurality of possible user interface components;
predicting desired user states for said user based on said current user emotional state and said current situation using a supervised learning algorithm; and
dynamically creating a new user interface comprising one or more user interface components selected from the plurality of possible user interface components, based on said desired user states, said current user emotional state, and said current situation,
wherein the one or more user interface components selected from the plurality of possible user interface components are selected such that a sum of the quantitative measures of impact for the selected user interface components does not exceed a specified threshold value,
wherein the current physical properties of the user comprise an indication received from a medical device that the user is having or is about to have a heart attack.

Dependent claims: 2, 3, 4, 5, 6, 7, 8
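The selection constraint in claim 1, choosing interface components such that their summed impact measures stay at or below a specified threshold, can be sketched as a simple greedy filter. The class, field names, component names, and impact values below are illustrative assumptions, not taken from the patent, and the greedy lowest-impact-first ordering is just one possible selection strategy:

```python
from dataclasses import dataclass

@dataclass
class UIComponent:
    name: str
    impact: float  # quantitative measure of impact on the user's state

def select_components(candidates, threshold):
    """Pick components whose summed impact does not exceed the threshold."""
    selected, total = [], 0.0
    # Greedy heuristic: try lower-impact components first so more of them fit.
    for c in sorted(candidates, key=lambda c: c.impact):
        if total + c.impact <= threshold:
            selected.append(c)
            total += c.impact
    return selected

components = [
    UIComponent("alert_banner", 0.6),
    UIComponent("large_button", 0.2),
    UIComponent("voice_prompt", 0.3),
    UIComponent("detail_panel", 0.5),
]
chosen = select_components(components, threshold=0.9)
```

Under these assumed values, only the two lowest-impact components fit under the 0.9 threshold; the claim itself specifies only the sum constraint, not how the subset is chosen.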
9. A system of facilitating machine interaction with a user, comprising:
means for identifying a current user emotional state of a user;
means for recognizing in real time a current situation involving said user, said means for recognizing including:
means for measuring current physical properties of said user;
means for measuring current environmental conditions affecting said user; and
means for recognizing the current situation involving said user based on said measured current physical properties and said measured current environmental conditions;
means for identifying a plurality of possible user interface components to provide to the user;
means for calculating a quantitative measure of impact on a state of the user for each of the plurality of possible user interface components;
a supervised learning algorithm predicting desired user states for said user based on said current user emotional state and said current situation; and
means for dynamically creating a new user interface comprising one or more user interface components selected from the plurality of possible user interface components, based on said desired user states, said current user emotional state, and said current situation;
wherein the one or more user interface components selected from the plurality of possible user interface components are selected such that a sum of the quantitative measures of impact for the selected user interface components does not exceed a specified threshold value,
wherein the current physical properties of the user comprise an indication received from a medical device that the user is having or is about to have a heart attack.

Dependent claims: 10, 11, 12, 13, 14
15. A computer program product for facilitating machine interaction with a user, the computer program product comprising a non-transitory computer-readable storage medium having computer-readable program code embodied in the medium, the computer-readable program code comprising:
computer-readable program code that identifies a current user emotional state of a user;
computer-readable program code that recognizes in real time a current situation involving said user by:
measuring current physical properties of said user;
measuring current environmental conditions affecting said user; and
recognizing the current situation involving said user based on said measured current physical properties and said measured current environmental conditions;
computer-readable program code that identifies a plurality of possible user interface components to provide to the user;
computer-readable program code that calculates a quantitative measure of impact on a state of the user for each of the plurality of possible user interface components;
computer-readable program code that executes a supervised learning algorithm to predict desired user states for said user based on said current user emotional state and said current situation; and
computer-readable program code that dynamically creates a new user interface comprising one or more user interface components selected from the plurality of possible user interface components, based on said desired user states, said current user emotional state, and said current situation,
wherein the one or more user interface components selected from the plurality of possible user interface components are selected such that a sum of the quantitative measures of impact for the selected user interface components does not exceed a specified threshold value,
wherein the current physical properties of the user comprise an indication received from a medical device that the user is having or is about to have a heart attack.

Dependent claims: 16, 17, 18
Specification