Altering properties of rendered objects via control points
First Claim
1. A device comprising:
one or more processors;
memory;
one or more modules stored in the memory and executable by the one or more processors to perform operations comprising:
presenting, in a three-dimensional coordinate space, a rendered object via a display of the device;
detecting, in the three-dimensional coordinate space, a gesture performed by or in association with a control object;
identifying, based at least in part on the gesture, a target control point from a plurality of control points, each control point of the plurality of control points displayed in a position that is associated with an edge or a vertex of the rendered object;
tracking movement of the control object in the three-dimensional coordinate space;
causing the target control point to move with the movement of the control object in the three-dimensional coordinate space;
determining a displacement of the target control point from an original position to a new position in the three-dimensional coordinate space, the new position based at least in part on the movement of the control object;
altering a property of the rendered object based at least in part on the displacement; and
modifying a rendering of the rendered object to reflect an alteration of the property.
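The claimed sequence — track the control object, move the target control point with it, determine the displacement, and alter a property accordingly — can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the `ControlPoint` and `RenderedObject` types and the choice of per-axis scale as the altered property are assumptions for demonstration.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass
class ControlPoint:
    original: Vec3   # position of the control point before the gesture
    current: Vec3    # position tracked against the moving control object

@dataclass
class RenderedObject:
    scale: Vec3 = (1.0, 1.0, 1.0)

def apply_displacement(obj: RenderedObject, cp: ControlPoint) -> Vec3:
    """Determine the control point's displacement from its original to its
    new position, and alter a property of the rendered object (here,
    per-axis scale) based at least in part on that displacement."""
    dx = cp.current[0] - cp.original[0]
    dy = cp.current[1] - cp.original[1]
    dz = cp.current[2] - cp.original[2]
    sx, sy, sz = obj.scale
    obj.scale = (sx + dx, sy + dy, sz + dz)  # "altering a property"
    return (dx, dy, dz)
```

For example, dragging a control point 0.5 units along the x-axis would stretch the object's x-scale by 0.5; the renderer would then redraw the object to reflect the altered property.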
Abstract
Altering properties of rendered objects and/or mixed reality environments utilizing control points associated with the rendered objects and/or mixed reality environments is described. Techniques described can include detecting a gesture performed by or in association with a control object. Based at least in part on detecting the gesture, techniques described can identify a target control point that is associated with a rendered object and/or a mixed reality environment. As the control object moves within the mixed reality environment, the target control point can track the movement of the control object. Based at least in part on the movement of the control object, a property of the rendered object and/or the mixed reality environment can be altered. A rendering of the rendered object and/or the mixed reality environment can be modified to reflect any alterations to the property.
20 Claims
1. A device comprising:
one or more processors;
memory;
one or more modules stored in the memory and executable by the one or more processors to perform operations comprising:
presenting, in a three-dimensional coordinate space, a rendered object via a display of the device;
detecting, in the three-dimensional coordinate space, a gesture performed by or in association with a control object;
identifying, based at least in part on the gesture, a target control point from a plurality of control points, each control point of the plurality of control points displayed in a position that is associated with an edge or a vertex of the rendered object;
tracking movement of the control object in the three-dimensional coordinate space;
causing the target control point to move with the movement of the control object in the three-dimensional coordinate space;
determining a displacement of the target control point from an original position to a new position in the three-dimensional coordinate space, the new position based at least in part on the movement of the control object;
altering a property of the rendered object based at least in part on the displacement; and
modifying a rendering of the rendered object to reflect an alteration of the property.
View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 18)
9. A computer-implemented method for altering one or more properties of a rendered object in a mixed reality environment, the computer-implemented method comprising:
presenting the rendered object in a three-dimensional coordinate space;
detecting, in the three-dimensional coordinate space, a gesture performed by or in association with a control object;
identifying, by a processor, a target control point of a plurality of control points selected by the gesture, wherein each control point of the plurality of control points is displayed in a position that is associated with an edge or a vertex of the rendered object and the target control point is within a threshold distance of the gesture performed by or in association with the control object;
tracking movement of the control object in the three-dimensional coordinate space;
causing the target control point to move with the movement of the control object in the three-dimensional coordinate space;
altering a property of the rendered object based at least in part on the movement of the control object; and
modifying a rendering of the rendered object to reflect an alteration of the property.
View Dependent Claims (10, 11, 12, 13, 14, 15, 16, 19)
-
-
17. A computer-implemented method for altering one or more properties of a mixed reality environment, the computer-implemented method comprising:
-
presenting an object in a three-dimensional coordinate space; presenting a plurality of control points in a three-dimensional coordinate space, the plurality of control points positioned on an edge or a vertex of the object; identifying, by a processor, a target control point of the plurality of control points, the target control point being within a threshold distance of a control object; tracking movement of the control object in the three-dimensional coordinate space; causing the target control point to move with the movement of the control object in the three-dimensional coordinate space; altering a property of the object based at least in part on the movement of the control object; and modifying a rendering of the object to reflect an alteration of the property. - View Dependent Claims (20)
-
Specification