AUDIO, VIDEO, AND ACTUATION (A/V/A) SYNCHRONIZATION FOR MIXED REALITY
First Claim
1. A computing system comprising:
network interface circuitry;
a processor coupled to the network interface circuitry; and
one or more memory devices coupled to the processor, the one or more memory devices including instructions, which when executed by the processor, cause the system to:
prepare mixed reality content with timestamp audio, video, and actuation metadata in a content stream to describe modality synchronization requirements, wherein the actuation metadata describes haptic output, projectile launcher output, and other mechanical stimulation output to provide a user with a sense of physical contact;
present the mixed reality content;
determine whether audio, video, and actuation components of the mixed reality content are in synchronization; and
improve perception of the mixed reality content when the audio, video, and actuation components of the mixed reality content are not in synchronization.
Abstract
Systems, apparatuses, and methods for monitoring and adjusting audio/video/actuation (A/V/A) synchronization to ensure immersive mixed reality experiences. The method includes preparing the mixed reality content with timestamp metadata in the content stream to describe modality synchronization requirements. The mixed reality content is presented, and synchronization between its components is monitored. The system determines whether the components of the mixed reality content are in synchronization; if they are not, the system may take action to improve the perception of A/V/A synchronization.
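
The abstract's first step, preparing content with per-modality timestamp metadata, can be pictured with a small sketch. The Python below is illustrative only and not taken from the patent; the field names, the JSON layout, the payload identifier, and the skew budgets are all assumptions.

import json
from dataclasses import dataclass, asdict

@dataclass
class ModalityMetadata:
    modality: str            # "audio", "video", or "actuation"
    presentation_ts_ms: int  # when this sample should reach the user
    max_skew_ms: int         # synchronization requirement relative to the other modalities

@dataclass
class StreamSample:
    payload_id: str          # identifies the A/V/A payload in the content stream
    metadata: list           # list of ModalityMetadata entries

sample = StreamSample(
    payload_id="scene-42/frame-1037",   # hypothetical identifier
    metadata=[
        ModalityMetadata("video", presentation_ts_ms=40_000, max_skew_ms=15),
        ModalityMetadata("audio", presentation_ts_ms=40_000, max_skew_ms=30),
        # actuation covers haptic, projectile-launcher, and other mechanical outputs
        ModalityMetadata("actuation", presentation_ts_ms=40_000, max_skew_ms=50),
    ],
)
print(json.dumps({"payload_id": sample.payload_id,
                  "metadata": [asdict(m) for m in sample.metadata]}, indent=2))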
31 Claims
1. A computing system comprising:
network interface circuitry;
a processor coupled to the network interface circuitry; and
one or more memory devices coupled to the processor, the one or more memory devices including instructions, which when executed by the processor, cause the system to:
prepare mixed reality content with timestamp audio, video, and actuation metadata in a content stream to describe modality synchronization requirements, wherein the actuation metadata describes haptic output, projectile launcher output, and other mechanical stimulation output to provide a user with a sense of physical contact;
present the mixed reality content;
determine whether audio, video, and actuation components of the mixed reality content are in synchronization; and
improve perception of the mixed reality content when the audio, video, and actuation components of the mixed reality content are not in synchronization.
Dependent claims: 2, 3, 4, 5, 7, 26, 30, 31.
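
For illustration of the "determine whether ... are in synchronization" limitation in claim 1, the following Python sketch compares measured output times against per-modality skew budgets. The video-as-reference anchoring, the dictionary shapes, and the numeric values are assumptions, not part of the claim.

def in_synchronization(scheduled_ms, measured_ms, max_skew_ms):
    """All arguments are dicts keyed by modality name ("audio", "video", "actuation")."""
    # Use video as the reference modality (an assumption; any anchor would do).
    ref_skew = measured_ms["video"] - scheduled_ms["video"]
    for modality in ("audio", "actuation"):
        skew = measured_ms[modality] - scheduled_ms[modality]
        if abs(skew - ref_skew) > max_skew_ms[modality]:
            return False
    return True

# Example: the actuation path (e.g., a haptic pulse) lands 80 ms behind video -> out of sync.
print(in_synchronization(
    scheduled_ms={"video": 40_000, "audio": 40_000, "actuation": 40_000},
    measured_ms={"video": 40_010, "audio": 40_020, "actuation": 40_090},
    max_skew_ms={"audio": 30, "actuation": 50},
))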
6. (canceled)
8. An apparatus comprising:
a substrate; and
logic coupled to the substrate, wherein the logic includes one or more of configurable logic or fixed-functionality hardware logic, the logic coupled to the substrate to:
prepare the mixed reality content with timestamp audio, video, and actuation metadata in a content stream to describe modality synchronization requirements, wherein the actuation metadata describes haptic output, projectile launcher output, and other mechanical stimulation output to provide a user with a sense of physical contact;
present the mixed reality content;
determine whether audio, video, and actuation components of the mixed reality content are in synchronization; and
improve perception of the mixed reality content when the audio, video, and actuation components of the mixed reality content are not in synchronization.
Dependent claims: 9, 10, 11, 12, 14, 27.
13. (canceled)
15. A method of synchronizing mixed reality content, comprising:
preparing the mixed reality content with timestamp audio, video, and actuation metadata in a content stream to describe modality synchronization requirements;
presenting the mixed reality content;
monitoring synchronization between audio, video, and actuation components of the mixed reality content;
determining whether the audio, video, and actuation components of the mixed reality content are in synchronization, wherein the actuation metadata describes haptic output, projectile launcher output, and other mechanical stimulation output to provide a user with a sense of physical contact; and
improving perception of the mixed reality content when the audio, video, and actuation components of the mixed reality content are not in synchronization.
Dependent claims: 16, 17, 19, 28.
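
Claim 15's monitor/determine/improve flow can be sketched as a single control step. The stub classes, the hypothetical add_delay_ms API, and the "hold back the modalities that are running ahead" policy below are assumptions for illustration; the patent does not prescribe this particular corrective action.

class StubPresenter:
    def __init__(self):
        self.delays_ms = {"audio": 0.0, "video": 0.0, "actuation": 0.0}
    def present_next(self):
        pass  # render video, play audio, fire actuators for the current timestamp
    def add_delay_ms(self, modality, ms):
        self.delays_ms[modality] += ms

class StubMonitor:
    def measured_skews_ms(self):
        # Pretend the actuation path (e.g., a haptic actuator) is lagging 70 ms.
        return {"audio": 5.0, "video": 0.0, "actuation": 70.0}

def synchronize_step(presenter, monitor, max_skew_ms=50.0):
    presenter.present_next()                      # presenting the mixed reality content
    skews = monitor.measured_skews_ms()           # monitoring synchronization
    worst = max(skews.values())
    if worst - min(skews.values()) <= max_skew_ms:
        return False                              # determined to be in synchronization
    for modality, skew in skews.items():          # improving perception:
        presenter.add_delay_ms(modality, worst - skew)  # re-time the leading modalities
    return True

presenter = StubPresenter()
if synchronize_step(presenter, StubMonitor()):
    print("adjusted delays:", presenter.delays_ms)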
18. (canceled)
20. At least one non-transitory computer readable storage medium comprising a set of instructions, which when executed by a computing system, cause the computing system to:
prepare the mixed reality content with timestamp audio, video, and actuation metadata in a content stream to describe modality synchronization requirements, wherein the actuation metadata describes haptic output, projectile launcher output, and other mechanical stimulation output to provide a user with a sense of physical contact;
present the mixed reality content;
monitor synchronization between audio, video, and actuation components of the mixed reality content;
determine whether the audio, video, and actuation components of the mixed reality content are in synchronization; and
improve perception of the mixed reality content when the audio, video, and actuation components of the mixed reality content are not in synchronization.
Dependent claims: 21, 22, 25, 29.
23. (canceled)
24. (canceled)
Specification