Gesture-controlled augmented reality experience using a mobile communications device
Abstract
A method for producing a gesture-controlled augmented reality experience using a first mobile communications device includes receiving a motion sensor input on a motion sensor input modality of the first mobile communications device, calculating a trajectory of a camera of the first mobile communications device in response to the received motion sensor input, receiving a visual input captured by the camera of the first mobile communications device, translating a gesture of a user into a set of quantified values based on the received visual input and the calculated trajectory of the camera, and controlling an augmented reality object within a three-dimensional virtual environment in response to a substantial match between the set of quantified values and a set of predefined values.
20 Claims
1. A method for producing a gesture-controlled augmented reality experience using a first mobile communications device, the method comprising:
receiving a motion sensor input on a motion sensor input modality of the first mobile communications device from a user;
calculating a trajectory of a camera of the first mobile communications device in response to the received motion sensor input;
receiving one or more images of a person other than the user captured by the camera of the first mobile communications device;
translating a gesture of the person other than the user into a set of quantified values based on the received one or more images and the calculated trajectory of the camera, the set of quantified values being calculated by a removal of the calculated trajectory from the received one or more images of the person other than the user; and
controlling an augmented reality object within a three-dimensional virtual environment in response to a substantial match between the set of quantified values and a set of predefined values.
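The patent discloses no source code, but the core of claim 1 can be illustrated in a minimal sketch: subtract the camera's own calculated trajectory from the gesture track observed in the images, quantify what remains, and test for a "substantial match" against predefined values. All function names, the 2-D point representation, and the tolerance are hypothetical choices, not anything recited in the claims.

```python
import math

def remove_camera_trajectory(observed_track, camera_track):
    """Subtract the camera's own motion (the calculated trajectory)
    from the gesture track observed across the image sequence, leaving
    motion attributable to the person alone.
    Each track is a list of (x, y) positions, one per frame."""
    return [(ox - cx, oy - cy)
            for (ox, oy), (cx, cy) in zip(observed_track, camera_track)]

def quantify_gesture(track):
    """Reduce a compensated track to a set of quantified values --
    here, simply per-frame displacement vectors (dx, dy)."""
    return [(x1 - x0, y1 - y0)
            for (x0, y0), (x1, y1) in zip(track, track[1:])]

def substantially_matches(values, predefined, tolerance=0.25):
    """Compare quantified values against predefined values, allowing a
    per-step distance tolerance (one reading of 'substantial match')."""
    if len(values) != len(predefined):
        return False
    return all(math.hypot(vx - px, vy - py) <= tolerance
               for (vx, vy), (px, py) in zip(values, predefined))
```

For example, if the camera pans right by one unit per frame while the person raises a hand, compensation removes the pan and the quantified values reduce to a pure upward swipe that can be matched against a predefined "swipe up" template.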
2. The method of claim 1, further comprising:
displaying the augmented reality object on the first mobile communications device.
3. The method of claim 2, wherein said displaying the augmented reality object includes displaying a movable-window view of the three-dimensional virtual environment on the first mobile communications device.
4. The method of claim 1, further comprising:
outputting, on the first mobile communications device, at least one of visual, auditory, and haptic feedback in response to a substantial match between the set of quantified values and a set of predefined values.
5. The method of claim 1, further comprising:
receiving an auditory input captured by a microphone of the first mobile communications device;
wherein said controlling includes controlling the augmented reality object within the three-dimensional virtual environment in response to the received auditory input.
6. The method of claim 5, further comprising:
detecting a signature of music in response to the received auditory input;
wherein said controlling includes controlling the augmented reality object within the three-dimensional virtual environment in response to the detected signature.
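Claim 6 does not say how a "signature of music" is detected. As one hypothetical sketch (not the patented method), a crude onset detector can flag audio frames whose short-term energy spikes above the running average; the detected beat indices could then drive the augmented reality object. The function name, frame size, and threshold are all illustrative assumptions.

```python
def detect_music_signature(samples, frame_size=4, threshold=1.5):
    """Flag frames whose short-term energy exceeds `threshold` times
    the average frame energy -- a crude beat/onset signature.
    Returns the indices of frames detected as beats."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples), frame_size)]
    energies = [sum(s * s for s in f) / len(f) for f in frames if f]
    avg = sum(energies) / len(energies)
    return [i for i, e in enumerate(energies) if e > threshold * avg]
```

A controller could, for instance, pulse the AR object's scale on each returned frame index, tying the object's movement to the detected signature as claim 6 recites.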
7. The method of claim 1, further comprising:
transmitting object control data to a second mobile communications device held by the person other than the user and communicatively coupled to the first mobile communications device, the object control data defining one or more movements of the augmented reality object within the three-dimensional virtual environment in response to said controlling.
8. The method of claim 7, further comprising:
receiving a set of secondary quantified values from the second mobile communications device;
wherein said controlling includes controlling the augmented reality object within the three-dimensional virtual environment in response to a substantial match between the set of secondary quantified values and the set of predefined values.
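Claims 7 and 8 recite a two-device exchange: object control data flows to the second device, and secondary quantified values flow back. The claims specify no wire format or transport, so the following is purely an illustrative sketch using JSON payloads and scalar quantified values; every name here is hypothetical, and the transport (e.g. Bluetooth or Wi-Fi) is out of scope.

```python
import json

def encode_object_control_data(object_id, movements):
    """Pack the AR object's movements (claim 7's 'object control
    data') into a JSON payload for the second device."""
    return json.dumps({"object": object_id, "movements": movements})

def decode_object_control_data(payload):
    """Unpack object control data on the receiving device."""
    msg = json.loads(payload)
    return msg["object"], msg["movements"]

def match_either(primary, secondary, predefined, tolerance=0.25):
    """Per claim 8, control can fire when either the first device's
    quantified values or the secondary quantified values received from
    the second device substantially match the predefined values."""
    def close(vals):
        return (len(vals) == len(predefined) and
                all(abs(v - p) <= tolerance
                    for v, p in zip(vals, predefined)))
    return close(primary) or close(secondary)
```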
9. The method of claim 8, wherein the set of secondary quantified values is derived from a motion sensor input on a motion sensor input modality of the second mobile communications device.
10. The method of claim 8, wherein the set of secondary quantified values is derived from one or more images captured by a camera of the second mobile communications device.
11. The method of claim 8, wherein the set of secondary quantified values is derived from an auditory input captured by a microphone of the second mobile communications device.
12. The method of claim 1, further comprising:
transmitting feedback data to a second mobile communications device held by the person other than the user and communicatively coupled to the first mobile communications device, the feedback data defining at least one of visual, auditory, and haptic feedback to be output by the second mobile communications device in response to a substantial match between the set of quantified values and a set of predefined values.
13. The method of claim 1, wherein the motion sensor input modality of the first mobile communications device includes at least one of an accelerometer, a gyroscope, and a magnetometer integrated into the first mobile communications device.
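Claim 13 names the sensors but not how a trajectory is calculated from them. One textbook approach, shown here only as an assumption, is dead reckoning: double-integrating accelerometer samples into positions. Real systems fuse gyroscope and magnetometer readings (e.g. with a complementary or Kalman filter) because pure integration drifts quickly; the function name and 2-D simplification are illustrative.

```python
def integrate_trajectory(accel_samples, dt=0.01):
    """Dead-reckon camera position from accelerometer samples by
    double integration (velocity, then position). Drift-prone in
    practice; shown only to illustrate 'calculating a trajectory ...
    in response to the received motion sensor input'.
    accel_samples: list of (ax, ay) accelerations in m/s^2."""
    vx = vy = x = y = 0.0
    trajectory = [(0.0, 0.0)]  # start at the origin
    for ax, ay in accel_samples:
        vx += ax * dt          # integrate acceleration -> velocity
        vy += ay * dt
        x += vx * dt           # integrate velocity -> position
        y += vy * dt
        trajectory.append((x, y))
    return trajectory
```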
14. A system comprising a non-transitory program storage medium readable by a first mobile communications device, the medium tangibly embodying one or more programs of instructions executable by the device to perform operations for producing a gesture-controlled augmented reality experience, the operations comprising:
receiving a motion sensor input on a motion sensor input modality of the first mobile communications device from a user;
calculating a trajectory of a camera of the first mobile communications device in response to the received motion sensor input;
receiving one or more images of a person other than the user captured by the camera of the first mobile communications device;
translating a gesture of the person other than the user into a set of quantified values based on the received one or more images and the calculated trajectory of the camera, the set of quantified values being calculated by a removal of the calculated trajectory from the received one or more images of the person other than the user; and
controlling an augmented reality object within a three-dimensional virtual environment in response to a substantial match between the set of quantified values and a set of predefined values.
15. The system of claim 14, further comprising:
the first mobile communications device;
wherein the first mobile communications device includes a processor or programmable circuitry for executing the one or more programs of instructions.
16. The system of claim 15, further comprising:
a second mobile communications device held by the person other than the user and communicatively coupled to the first mobile communications device;
wherein the operations further comprise transmitting object control data to the second mobile communications device, the object control data defining one or more movements of the augmented reality object within the three-dimensional virtual environment in response to said controlling.
17. The system of claim 16, wherein:
the second mobile communications device transmits a set of secondary quantified values to the first mobile communications device; and
said controlling includes controlling the augmented reality object within the three-dimensional virtual environment in response to a substantial match between the set of secondary quantified values and the set of predefined values.
18. The system of claim 17, wherein the second mobile communications device derives the set of secondary quantified values from a motion sensor input on a motion sensor input modality of the second mobile communications device, one or more images captured by a camera of the second mobile communications device, and/or an auditory input captured by a microphone of the second mobile communications device.
19. A mobile communications device operable to produce a gesture-controlled augmented reality experience, the mobile communications device comprising:
a motion sensor for receiving a motion sensor input from a user;
a camera for capturing one or more images of a person other than the user; and
a processor for calculating a trajectory of the camera in response to the received motion sensor input, translating a gesture of the person other than the user into a set of quantified values based on the received one or more images and the calculated trajectory of the camera, the set of quantified values being calculated by a removal of the calculated trajectory from the received one or more images of the person other than the user, and controlling an augmented reality object within a three-dimensional virtual environment in response to a substantial match between the set of quantified values and a set of predefined values.
20. The mobile communications device of claim 19, further comprising:
a display;
wherein the processor controls the display to display the augmented reality object within a movable-window view of the three-dimensional virtual environment.
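Claims 3 and 20 recite a "movable-window view": as the device moves, its display shows the slice of the three-dimensional virtual environment the camera currently faces. As a hypothetical illustration only, the sketch below selects which named objects fall inside an axis-aligned window centred on the camera position; a real renderer would use a proper view frustum, and all names here are assumptions.

```python
def movable_window_view(camera_position, window_size, world_objects):
    """Return the names of virtual objects visible through a window of
    `window_size` (w, h, d) centred on `camera_position` (x, y, z).
    Moving the device moves the window across the 3-D environment."""
    cx, cy, cz = camera_position
    w, h, d = window_size
    return [name for name, (x, y, z) in world_objects.items()
            if abs(x - cx) <= w / 2
            and abs(y - cy) <= h / 2
            and abs(z - cz) <= d / 2]
```

Moving the camera position (e.g. as the calculated trajectory updates) changes which objects the window reveals, matching the movable-window behaviour the claims describe.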
Specification