Surround sound in a sensory immersive motion capture simulation environment

  • US 8,825,187 B1
  • Filed: 03/15/2012
  • Issued: 09/02/2014
  • Est. Priority Date: 03/15/2011
  • Status: Active Grant
First Claim

1. A computer program product tangibly embodied in a non-transitory storage medium and comprising instructions that when executed by a processor perform a method, the method comprising:

  • receiving, by a wearable computing device of a first entity, audio data that is generated responsive to a second entity triggering an audio event in a capture volume;

    receiving, by the wearable computing device of the first entity, three dimensional (3D) motion data of a virtual representation of the first entity in a simulated virtual environment, wherein the 3D motion data of the virtual representation of the first entity is calculated based on 3D motion data of the first entity in the capture volume;

    receiving, by the wearable computing device of the first entity, 3D motion data of a virtual representation of the second entity in the simulated virtual environment; and

    processing the audio data, the 3D motion data of the virtual representation of the first entity and the 3D motion data of the virtual representation of the second entity to generate multi-channel audio output data customized to a perspective of the virtual representation of the first entity in the simulated virtual environment, wherein the multi-channel audio output data is associated with the audio event, and wherein generating the multi-channel audio output data comprises:

    updating, at a sound library module of the wearable computing device, the 3D motion data of the virtual representation of the first entity in the simulated virtual environment,

    updating, at the sound library module, the 3D motion data of the virtual representation of the second entity in the simulated virtual environment,

    calculating, by the sound library module, a distance between the virtual representation of the first entity and the virtual representation of the second entity in the simulated virtual environment, and

    calculating, by the sound library module, a direction of the virtual representation of the second entity in reference to the virtual representation of the first entity in the simulated virtual environment,

    wherein the direction and the distance are calculated based on at least one of the updated 3D motion data of the virtual representation of the first entity and the updated 3D motion data of the virtual representation of the second entity.
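
The claim recites the spatialization steps but discloses no source code. As a minimal sketch, the Python below illustrates the kind of distance and direction calculation the claimed sound library module performs on the two updated virtual representations, reduced to a two-channel output for brevity (the claim itself covers multi-channel output). The function and parameter names, the inverse-distance attenuation, and the constant-power panning are all assumptions made for illustration, not the patent's disclosed implementation.

```python
import math

def spatialize(listener_pos, listener_yaw, source_pos, samples):
    """Hypothetical sketch: derive the distance and direction between the
    listener's and the sound source's virtual representations, then scale
    the audio samples into left/right channel gains."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dz = source_pos[2] - listener_pos[2]

    # Distance between the two virtual representations in the simulated environment.
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)

    # Direction (azimuth) of the source relative to the listener's heading,
    # assuming +z is "forward" and yaw is measured in radians.
    azimuth = math.atan2(dx, dz) - listener_yaw

    # Assumed rendering: inverse-distance attenuation plus constant-power
    # stereo panning; a production sound library could instead apply HRTFs
    # or a full surround-channel matrix.
    attenuation = 1.0 / max(distance, 1.0)
    pan = math.sin(azimuth)  # -1 = hard left, +1 = hard right
    left_gain = attenuation * math.cos((pan + 1.0) * math.pi / 4.0)
    right_gain = attenuation * math.sin((pan + 1.0) * math.pi / 4.0)
    return ([s * left_gain for s in samples], [s * right_gain for s in samples])

# Example: an audio event triggered three metres to the listener's right
# lands almost entirely in the right channel, attenuated by distance.
left_ch, right_ch = spatialize((0.0, 0.0, 0.0), 0.0, (3.0, 0.0, 0.0), [0.5, -0.5])
```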
