Synchronized, interactive augmented reality displays for multifunction devices
First Claim
1. A computer-implemented method for generating and synchronizing interactive Augmented Reality (AR) displays, comprising:
capturing live video of a real-world, physical environment and displaying the live video on a touch sensitive surface of a device;
combining an information layer and the live video, the information layer related to one or more objects in the live video;
modeling computer-generated imagery based on the live video;
displaying the computer-generated imagery representing one or more objects in the live video on the touch sensitive surface;
overlaying the information layer on the computer-generated imagery;
receiving sensor data from one or more onboard motion sensors indicating that the device is in motion; and
synchronizing the display of the live video, the computer-generated imagery and the information layer on the touch sensitive surface using the sensor data.
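The claimed method steps can be read as one per-frame pass: capture a frame, combine the layers, and use onboard motion-sensor data to keep the computer-generated imagery and the information layer registered to the live video. A minimal sketch in Python, assuming a simple accumulated-rotation offset model (the `Frame`, `SensorSample`, and `render` names are illustrative stand-ins, not any actual device API):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: str        # stand-in for captured image data
    timestamp: float   # capture time, seconds

@dataclass
class SensorSample:
    timestamp: float
    rotation: float    # rotation sensed since the previous sample, radians

def motion_offset(frame: Frame, samples: list) -> float:
    """Accumulate onboard motion-sensor rotation up to the frame's
    capture time; this single offset keeps the overlays registered."""
    return sum(s.rotation for s in samples if s.timestamp <= frame.timestamp)

def render(frame: Frame, info_layer: dict, samples: list) -> dict:
    """Combine live video, computer-generated imagery, and the
    information layer, shifting both overlays by the sensed motion."""
    offset = motion_offset(frame, samples)
    return {
        "video": frame.pixels,        # live video, displayed as captured
        "cgi_offset": offset,         # CGI re-registered to the video
        "info_layer": info_layer,     # annotations for detected objects
        "info_layer_offset": offset,  # layer shifted together with the CGI
    }
```

The key point of the final claim step is that a single sensor-derived offset drives all three displayed elements, which is why `cgi_offset` and `info_layer_offset` are deliberately the same value here.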
Abstract
A device can receive live video of a real-world, physical environment on a touch sensitive surface. One or more objects can be identified in the live video. An information layer can be generated related to the objects. In some implementations, the information layer can include annotations made by a user through the touch sensitive surface. The information layer and live video can be combined in a display of the device. Data can be received from one or more onboard sensors indicating that the device is in motion. The sensor data can be used to synchronize the live video and the information layer as the perspective of the video camera view changes due to the motion. The live video and information layer can be shared with other devices over a communication link.
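The abstract's synchronization idea, keeping an annotation anchored to its object as the camera view changes, can be sketched with a pinhole small-angle approximation: a camera yaw of theta radians shifts image content by roughly the focal length in pixels times theta. A hypothetical helper (the function name and the default focal length are assumptions for illustration, not a calibrated camera model):

```python
def reproject_annotation(x, y, yaw_delta, focal_px=1000.0):
    """Shift an annotation's on-screen anchor to follow a small camera
    yaw change (radians). Under a pinhole small-angle approximation,
    a yaw of theta moves image content by about focal_px * theta pixels
    in the opposite direction, so the anchor shifts to match."""
    return (x - focal_px * yaw_delta, y)
```

In this model a 0.01 rad yaw shifts a point anchored at x = 100 by about 10 pixels; a real implementation would use the device's calibrated intrinsics and full rotation data rather than this one-axis approximation.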
24 Claims
1. A computer-implemented method for generating and synchronizing interactive Augmented Reality (AR) displays, comprising:
capturing live video of a real-world, physical environment and displaying the live video on a touch sensitive surface of a device;
combining an information layer and the live video, the information layer related to one or more objects in the live video;
modeling computer-generated imagery based on the live video;
displaying the computer-generated imagery representing one or more objects in the live video on the touch sensitive surface;
overlaying the information layer on the computer-generated imagery;
receiving sensor data from one or more onboard motion sensors indicating that the device is in motion; and
synchronizing the display of the live video, the computer-generated imagery and the information layer on the touch sensitive surface using the sensor data.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 21, 22
17. An apparatus, comprising:
a touch sensitive surface configured for receiving touch input;
one or more onboard motion sensors configured for sensing motion of the apparatus;
a video camera for capturing live video of a real-world, physical environment for display on the touch sensitive surface; and
a processor coupled to the touch sensitive surface, the motion sensor and the video camera, the processor configured for combining an information layer and live video in a display of the device, modeling computer-generated imagery based on the live video, displaying computer-generated imagery representing one or more objects in the live video on the touch sensitive surface, overlaying the information layer on the computer-generated imagery, receiving sensor data from the one or more onboard motion sensors indicating that the apparatus is in motion, and synchronizing the live video, computer-generated imagery and the information layer using the sensor data.
Dependent claims: 18, 19, 20, 23, 24
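The apparatus claim wires the same steps into components: a touch surface feeding the information layer, motion sensors feeding the synchronization, and a processor combining everything per frame. A plain-Python sketch of that wiring, where the camera, sensor, and annotation objects are simple caller-supplied stand-ins rather than any real device framework:

```python
class ARApparatus:
    """Illustrative stand-in for the claimed apparatus: holds the
    camera, motion sensor, and touch-driven information layer, and
    runs one processor pass per frame."""

    def __init__(self, camera, motion_sensor):
        self.camera = camera                # captures live video frames
        self.motion_sensor = motion_sensor  # reports sensed motion
        self.info_layer = []                # annotations from touch input

    def on_touch(self, annotation):
        """Touch input on the surface adds a user annotation
        to the information layer."""
        self.info_layer.append(annotation)

    def step(self):
        """One processor pass: capture, combine, and synchronize the
        live video, CGI, and information layer using sensor data."""
        frame = self.camera.capture()
        motion = self.motion_sensor.read()  # e.g. accumulated rotation
        return {
            "video": frame,
            "overlay_offset": motion,       # CGI and layer shifted together
            "annotations": list(self.info_layer),
        }
```

A caller would drive `step()` once per captured frame; the design point mirrored from the claim is that the processor, not the individual components, owns the combining and synchronizing.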
Specification