SEARCH USER INTERFACE USING OUTWARD PHYSICAL EXPRESSIONS
Abstract
The disclosed architecture enables user feedback, in the form of gestures and, optionally, voice signals, from one or more users interacting with a search engine framework. For example, document relevance, document ranking, and the output of the search engine can be modified based on the capture and interpretation of a user's physical gestures. A specific gesture is recognized from the physical location and movement of the user's joints over time. The architecture captures emotive responses while the user navigates the voice-driven and gesture-driven interface, and indicates that the appropriate feedback has been captured. The feedback can be used to alter the search query, personalize responses over the search/browsing session, modify result ranking, navigate the user interface, and modify the entire results page, among other uses.
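As an illustration of the feedback-driven re-ranking the abstract describes, the sketch below applies an interpreted gesture to one result's score and re-sorts the results page. The gesture labels, feedback weight, and data model are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical gesture labels; the patent does not enumerate specific gestures.
POSITIVE, NEGATIVE = "thumbs_up", "thumbs_down"

@dataclass
class Result:
    doc_id: str
    score: float           # base relevance score from the search engine
    feedback: float = 0.0  # accumulated gesture-derived feedback

def apply_gesture_feedback(results, doc_id, gesture, weight=0.1):
    """Boost or demote one result based on an interpreted gesture, then re-rank."""
    delta = {POSITIVE: weight, NEGATIVE: -weight}.get(gesture, 0.0)
    for r in results:
        if r.doc_id == doc_id:
            r.feedback += delta
    # Re-rank on base score plus accumulated feedback.
    return sorted(results, key=lambda r: r.score + r.feedback, reverse=True)
```

Under this sketch, repeated negative gestures on a result progressively demote it across the session, which is one way to read the abstract's "personalize the response using the feedback collected through the search/browsing session."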
27 Claims
1. A system, comprising:

a user interaction component in association with a search engine framework that employs a gesture recognition component to capture and interpret a gesture of a user as interaction with the search engine framework, the gesture is user feedback related to interactions with the results and related interfaces by the user to collect data for improving a user search experience; and

a microprocessor that executes computer-executable instructions stored in memory.

Dependent claims: 2–16.
17. A method, comprising acts of:

capturing a gesture of a user as part of a data search experience, the gesture is interactive feedback related to the search experience;

comparing the captured gesture to joint characteristics data of the user analyzed as a function of time;

interpreting the gesture as a command defined as compatible with a search engine framework;

executing the command via the search engine framework;

interacting with a search interface according to the command;

presenting a visual representation related to the gesture to the user via the search interface; and

utilizing a microprocessor that executes instructions stored in memory.

Dependent claims: 18–22.
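The compare/interpret/execute acts of claim 17 can be sketched as follows, assuming joint data arrives as per-frame coordinates of one tracked hand joint. The swipe threshold and the command vocabulary are illustrative assumptions, not from the patent.

```python
# Joint characteristics analyzed as a function of time: here, the x-coordinate
# of a hand joint sampled per frame. Threshold and command names are assumed.

def interpret_gesture(hand_x_over_time, threshold=0.3):
    """Compare joint positions over time and classify a horizontal swipe."""
    displacement = hand_x_over_time[-1] - hand_x_over_time[0]
    if displacement > threshold:
        return "next_page"
    if displacement < -threshold:
        return "previous_page"
    return None  # no recognizable gesture

# Commands defined as compatible with the (hypothetical) search interface:
COMMANDS = {
    "next_page": lambda page: page + 1,
    "previous_page": lambda page: max(0, page - 1),
}

def execute(command, page):
    """Execute the interpreted command against the current results-page index."""
    return COMMANDS[command](page) if command in COMMANDS else page
```

A right-to-left trajectory maps to "previous_page", a left-to-right trajectory to "next_page", and anything under the threshold is ignored rather than executed.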
23. A method, comprising acts of:

receiving a gesture from a user viewing a search result user interface of a search engine framework, the gesture is user interactive feedback related to search results;

analyzing the gesture of the user based on captured image features of the user as a function of time;

interpreting the gesture as a command compatible with the search engine framework;

executing the command to facilitate interacting with a search result of a results page via a user interface of the search engine framework;

recognizing voice commands to navigate the user interface;

presenting a visual representation of the gesture and an effect of the gesture to the user via the user interface of the search engine framework; and

utilizing a microprocessor that executes instructions stored in memory.

Dependent claims: 24–27.
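Claim 23 combines gesture input with voice commands for navigating the same interface. A minimal sketch is to normalize both modalities into one command vocabulary; the phrase-to-command mapping below is an illustrative assumption.

```python
# Illustrative mapping of recognized voice phrases to interface commands;
# the patent does not specify a phrase vocabulary.
VOICE_COMMANDS = {
    "scroll down": "next_page",
    "go back": "previous_page",
}

def dispatch(event_type, payload):
    """Normalize voice phrases and interpreted gestures to one command set."""
    if event_type == "voice":
        return VOICE_COMMANDS.get(payload.strip().lower())
    if event_type == "gesture":
        # Gestures are assumed already interpreted into command names.
        return payload
    return None
```

Routing both modalities through one dispatcher keeps the downstream interface logic identical regardless of whether the user spoke or gestured.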