Updating printed content with personalized virtual data
First Claim
1. A machine-implemented method of augmenting real printed content with personalized virtual data that is personalized for a user wearing a near-eye, mixed reality display device, the method comprising:
retrieving identity data for a user wearing a near-eye, mixed reality display device;
identifying a printed content item viewable through and in a field of view of the near-eye, mixed reality display device, the identified item being real and providing real printed content;
identifying user selection of a printed content portion within the real printed content of the printed content item based on physical action user input;
determining whether personalizable virtual data is available for the user selected printed content portion;
responsive to the personalizable virtual data being determined to be available for the user selected printed content portion, selecting personalized virtual data from the determined to be available virtual data based on user profile data corresponding to the retrieved identity data for the user wearing the near-eye, mixed reality display device, wherein the selecting of the personalized virtual data from the available personalizable data includes selecting the personalized virtual data based on a current state of being of the user, the current state of being of the user including an emotional state of the user; and
causing display of the selected, personalized virtual data in the field of view of the near-eye, mixed reality display device and in a position registered to a position of the user selected printed content portion of the automatically identified printed content item.
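The selection step of claim 1 keys the choice of virtual data to the user's current state of being, including an emotional state. A minimal sketch in Python; the datastore layout, mood tags, and matching rule are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class VirtualData:
    content: str
    mood_tag: str  # emotional state this variant targets (illustrative)

# Hypothetical datastore: printed content portion -> candidate virtual data.
DATASTORE = {
    "chapter-3-figure": [
        VirtualData("calming annotation", "stressed"),
        VirtualData("detailed annotation", "focused"),
    ],
}

def select_personalized_virtual_data(portion_id: str, emotional_state: str):
    """Claim-1 selection step: if personalizable virtual data is available
    for the selected portion, pick the variant matching the user's current
    emotional state (here, a simple mood-tag match)."""
    candidates = DATASTORE.get(portion_id)
    if not candidates:                       # no personalizable data available
        return None
    for item in candidates:                  # prefer a state-matched variant
        if item.mood_tag == emotional_state:
            return item
    return candidates[0]                     # fall back to the first variant
```

The display step (registration to the printed portion's position in the field of view) would then consume the returned item.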
Abstract
The technology provides for updating printed content with personalized virtual data using a see-through, near-eye, mixed reality display device system. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. Virtual data is selected from available virtual data for the printed content selection based on user profile data, and the display device system displays the selected virtual data in a position registered to the position of the printed content selection. In some examples, a task related to the printed content item is determined based on physical action user input, and personalized virtual data is displayed registered to the printed content item in accordance with the task.
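The abstract identifies user selection from physical action input such as eye gaze. One common way to turn gaze into a selection (an assumption here; the patent does not specify this detector) is dwell time: the gaze point must remain inside the printed portion's bounding box for a threshold duration.

```python
def detect_dwell_selection(gaze_samples, box, dwell_ms=600, sample_ms=100):
    """gaze_samples: list of (x, y) gaze points taken every sample_ms.
    box: (x0, y0, x1, y1) bounding box of a printed content portion.
    Returns True once the gaze has stayed inside the box for dwell_ms."""
    x0, y0, x1, y1 = box
    run = 0
    for gx, gy in gaze_samples:
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            run += sample_ms          # accumulate continuous in-box time
            if run >= dwell_ms:
                return True
        else:
            run = 0                   # gaze left the portion; restart
    return False
```

A gesture detector (e.g., a fingertip held over the portion) could feed the same selection logic in place of gaze samples.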
106 Citations
18 Claims
1. (Independent claim; the full text is set out above under First Claim.)
Dependent claims: 2, 3, 4, 5, 6
7. A near-eye, mixed reality display device system configured to automatically augment real printed content with personalized virtual data, the system comprising:
a display positioned by a support structure;
at least one outward facing camera positioned on the support structure for capturing image data of a field of view of real objects disposed beyond the display;
one or more software controlled processors having access to user profile data of a user of the display, the user profile data being stored in a memory and being communicatively coupled to a search engine having access to one or more datastores including content, layout and virtual data for works and printed content items embodying the works;
the one or more software controlled processors communicatively coupled to the at least one outward facing camera for receiving image data of the field of view of the camera and being configured for automatically identifying user physical action indicative of the user selecting a printed content portion in a printed content item among the real objects within the field of view of the camera, the identified action being directed to a real printed content providing item that provides real printed content;
the one or more software controlled processors being configured for determining a current state of being of the user, the current state of being of the user including an emotional state of the user, and storing the current state of being in the user profile data of the user;
the one or more software controlled processors being configured for automatically selecting user-personalized virtual data from the one or more datastores based on the user profile data of the user including based on the current state of being and based on the user selected printed content portion of the real printed content providing item; and
the one or more software controlled processors being configured for causing at least one communicatively coupled image generation unit for the display to display the automatically selected virtual data as being automatically registered with a field of view position of the printed content portion.
Dependent claims: 8, 9, 10, 11, 12, 13
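Claim 7's processors resolve the camera's image data against datastores holding content, layout, and virtual data for known works. A toy lookup sketch; the OCR step, datastore schema, and work keys are invented for illustration:

```python
# Hypothetical datastore mapping a work to its layout and virtual data.
WORKS_DATASTORE = {
    "a tale of two cities": {
        "layout": "paperback-1994",
        "virtual_data": ["character map", "historical timeline"],
    },
}

def identify_printed_content_item(ocr_text: str):
    """Match text recognized in the outward-facing camera's image data
    against known works; None means the item was not identified."""
    return WORKS_DATASTORE.get(ocr_text.strip().lower())
```

In a real system the search engine would match on layout features and fuzzy text rather than an exact-key dictionary lookup.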
14. One or more processor readable non-volatile storage devices having instructions encoded thereon for causing one or more processors to execute a machine-implemented method for augmenting real printed content with user personalized virtual data for a user wearing and using a near-eye, mixed reality display device, the method comprising:
identifying a real printed content item in a field of view of the near-eye, mixed reality display device, the identified item being real and providing real printed content;
determining a task related to the identified printed content item based on detected physical action of the user directed to a user selected portion of the real printed content; and
performing the task, which further includes:
determining a current state of being of the user wearing the near-eye, mixed reality display device, the current state of being of the user including an emotional state of the user, and storing the current state of being of the user,
selecting an output format for personalized virtual data associated with the task based on the current state of being of the user, the output format being from the group consisting of non-literary visual symbols, a language of text, and audio data, and
outputting the personalized virtual data in accordance with the task and placed in registration with the user selected portion of the real printed content.
Dependent claims: 15, 16, 17, 18
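Claim 14 selects the output format for the task's virtual data from a closed group (non-literary visual symbols, a language of text, audio data) based on the user's current state of being. A sketch of one possible selection rule; the specific mapping from states to formats is purely illustrative, not the claimed rule:

```python
OUTPUT_FORMATS = ("visual_symbols", "text", "audio")

def select_output_format(emotional_state: str, eyes_tired: bool = False) -> str:
    """Choose how to render personalized virtual data from the user's
    current state of being (illustrative heuristic)."""
    if eyes_tired:
        return "audio"             # spoken output when the eyes need rest
    if emotional_state == "stressed":
        return "visual_symbols"    # lightweight symbols instead of dense text
    return "text"                  # default: textual virtual data
```

The chosen format would drive how the virtual data is placed in registration with the user selected portion (or, for audio, played instead of overlaid).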
Specification