User movement interpretation in computer generated reality
Abstract
Technologies are generally described for a system for interpreting user movement in computer generated reality. In some examples, the system includes a user interface effective to generate movement data relating to movement of the user interface. In some examples, the system further includes a processor effective to receive the movement data. In some examples, the processor is further effective to define a coordinate system based on the movement data and map the movement data to the coordinate system to produce mapped movement data. In some examples, the processor is further effective to determine a feature of the mapped movement data and to map the feature to a code. In some examples, the processor is further effective to send the code to an application and receive application data from the application in response to the code. In some examples, the processor is further effective to generate an image based on the application data.
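The abstract describes a pipeline: receive movement data, map it into a coordinate system, extract a feature, map the feature to a code, and exchange that code for application data. A minimal Python sketch of that flow follows; every name and threshold in it is an illustrative assumption made for this example, not a term defined by the patent.

```python
# Illustrative sketch of the abstract's pipeline; names and thresholds are
# assumptions made for this example, not definitions from the patent.

def sign_of(d, threshold=0.5):
    """Quantize a per-dimension displacement into "+", "-", or "0"."""
    return "+" if d > threshold else "-" if d < -threshold else "0"

def interpret_movement(samples, code_table, application):
    """samples: raw (x, y, z) readings from a user interface."""
    # Define a coordinate system from the data itself: normalize the
    # samples relative to the first reading and the largest excursion.
    origin = samples[0]
    span = max(max(abs(s[i] - origin[i]) for s in samples) for i in range(3)) or 1.0
    mapped = [tuple((s[i] - origin[i]) / span for i in range(3)) for s in samples]

    # Determine a feature: here simply the net change per dimension
    # between the start location and the final location.
    start, final = mapped[0], mapped[-1]
    feature = tuple(sign_of(final[i] - start[i]) for i in range(3))

    code = code_table.get(feature)   # map the feature to a code
    return application(code)         # send the code, receive application data
```

For example, a mostly rightward sweep quantizes to the feature ("+", "0", "0"), which a code table might translate to a swipe code that the application answers with render data.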
8 Claims
1. A method for interpreting user movement in computer generated reality, the method comprising:
receiving, by a processor, movement data relating to movement of a user interface;
defining, by the processor, a three dimensional coordinate system based on the movement data, the three dimensional coordinate system including a first, second and third dimension, and the three dimensional coordinate system including a cube having six sides, each side having a center and four respective edges;
mapping, by the processor, the movement data to the three dimensional coordinate system to produce mapped movement data;
determining, by the processor, a start location, a final location, a path between the start and final location, and at least one intermediate location between the start location and the final location along the path, of the mapped movement data in the coordinate system, the final location representing a change in user movement from the start location in the first dimension, in the second dimension and in the third dimension, wherein a first line between the start location and the intermediate location is different from a second line between the start location and the final location; and
determining, by the processor, a feature of the mapped movement data based on the path, the mapped movement data, a first intersection between one of the edges of the cube and the mapped movement data, a second intersection between one of the centers of one of the sides of the cube in the coordinate system and the mapped movement data, and a difference between the start location and the final location; and
mapping, by the processor, the feature to a code. - View Dependent Claims (2, 3)
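The feature of claim 1 combines the path, an intersection with one of the cube's edges, an intersection with the center of one of its sides, and the difference between the start and final locations. A hedged Python sketch of such a feature extractor follows; the unit cube, the tolerance, and the feature encoding are all choices made for illustration, not geometry specified by the claim.

```python
# Illustrative geometry for the cube-based feature of claim 1.  The cube,
# tolerance, and feature encoding are assumptions for this sketch, not
# definitions from the patent.

CUBE_SIDE_CENTERS = {
    "+x": (1, 0, 0), "-x": (-1, 0, 0),
    "+y": (0, 1, 0), "-y": (0, -1, 0),
    "+z": (0, 0, 1), "-z": (0, 0, -1),
}

def near(p, q, tol=0.2):
    """True when two 3D points agree within tol in every coordinate."""
    return all(abs(a - b) <= tol for a, b in zip(p, q))

def cube_feature(mapped):
    """Build a feature from a path of mapped points in the cube's frame."""
    start, final = mapped[0], mapped[-1]
    # First intersection: a point of the path near an edge of the cube.
    # An edge is where two faces meet, i.e. two coordinates near magnitude 1.
    edge_hit = any(
        sum(1 for c in p if abs(abs(c) - 1.0) <= 0.2) >= 2 for p in mapped
    )
    # Second intersection: a point of the path near the center of a side.
    center_hit = next(
        (name for p in mapped for name, c in CUBE_SIDE_CENTERS.items() if near(p, c)),
        None,
    )
    # Difference between the start location and the final location.
    delta = tuple(f - s for s, f in zip(start, final))
    return (edge_hit, center_hit, delta)
```

A straight push from the cube's center toward one face, for instance, crosses that side's center but no edge, giving a feature distinct from a diagonal stroke that clips an edge.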
4. A device for interpreting user movement in computer generated reality, the device comprising:
a memory effective to store instructions;
a processor in communication with the memory, the processor effective to read the instructions and effective to receive movement data relating to movement of a user interface;
define a three dimensional coordinate system based on the movement data, the three dimensional coordinate system including a first, second and third dimension, and the three dimensional coordinate system including a cube having six sides, each side having a center and four respective edges;
map the movement data to the three dimensional coordinate system to produce mapped movement data;
determine a start location, a final location, a path between the start and final location, and at least one intermediate location between the start location and the final location along the path, of the mapped movement data in the coordinate system, the final location representing a change in user movement from the start location in the first dimension, in the second dimension and in the third dimension, wherein a first line between the start location and the intermediate location is different from a second line between the start location and the final location;
determine a feature of the mapped movement data based on the path, the mapped movement data, a first intersection between one of the edges of the cube and the mapped movement data, a second intersection between one of the centers of one of the sides of the cube in the coordinate system and the mapped movement data, and a difference between the start location and the final location; and
map the feature to a code. - View Dependent Claims (5, 6)
7. A system for interpreting user movement in computer generated reality, the system comprising:
a user interface, wherein the user interface is effective to generate movement data relating to movement of the user interface;
a memory effective to store instructions;
a display;
a processor in communication with the memory, the user interface and the display, wherein the processor is effective to read the instructions and effective to receive the movement data;
define a three dimensional coordinate system based on the movement data, the three dimensional coordinate system including a first, second and third dimension, and the three dimensional coordinate system including a cube having six sides, each side having a center and four respective edges;
map the movement data to the three dimensional coordinate system to produce mapped movement data;
determine a start location, a final location, a path between the start and final location, and at least one intermediate location between the start location and the final location along the path, of the mapped movement data in the coordinate system, the final location representing a change in user movement from the start location in the first dimension, in the second dimension and in the third dimension, wherein a first line between the start location and the intermediate location is different from a second line between the start location and the final location;
determine a feature of the mapped movement data based on the path, the mapped movement data, a first intersection between one of the edges of the cube and the mapped movement data, a second intersection between one of the centers of one of the sides of the cube in the coordinate system and the mapped movement data, and a difference between the start location and the final location;
map the feature to a code;
send the code to the memory;
receive application data from the memory in response to the code;
generate an image based on the application data; and
send the image to the display; and
wherein the display is effective to receive the image and display the image. - View Dependent Claims (8)
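Claim 7 closes the loop around the recognizer: the processor sends the code to the memory, receives application data in response, generates an image from that data, and hands the image to the display. A toy Python sketch of that exchange, with all classes and the render step invented for illustration:

```python
# Hedged sketch of the claim-7 loop: code in, application data back,
# image out to the display.  These classes are illustrative assumptions,
# not structures defined by the patent.

class Memory:
    """Stands in for stored instructions plus per-code application data."""
    def __init__(self, app_data):
        self.app_data = app_data

    def respond(self, code):
        # Return the application data associated with a code, if any.
        return self.app_data.get(code)

class Display:
    """Receives images and displays (here: records) them."""
    def __init__(self):
        self.last_image = None

    def show(self, image):
        self.last_image = image

def run_frame(code, memory, display):
    data = memory.respond(code)   # receive application data for the code
    image = f"render({data})"     # generate an image based on that data
    display.show(image)           # send the image to the display
    return image
```

In a real system the render step would produce a frame buffer rather than a string, but the control flow, code to data to image to display, is the same.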
Specification