Object creation using body gestures
First Claim
1. A method comprising:
creating, using an imaging device, a three dimensional (3D) object having an initial shape based on an outline of an initial pose of a user;
causing presentation of the 3D object having the initial shape on a display to provide feedback to the user;
editing the 3D object by:
determining a reference mode of a plurality of reference modes for editing the 3D object, the plurality of reference modes including a spatial mapping mode and a body reference mapping mode, the body reference mapping mode allowing the user to edit the 3D object based on at least one of the user being inside of the 3D object and the user being the 3D object, and the spatial mapping mode allowing the user to edit the 3D object based on at least one of the user stepping out of the 3D object and the user being next to the 3D object;
determining an edit operation to be performed on the 3D object;
performing the edit operation on the 3D object based on a body gesture of the user that is sensed by the imaging device and the reference mode for editing the 3D object; and
causing presentation of the 3D object to the user; and
animating the 3D object based on additional body gestures of the user.
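The mode-selection step of the claim can be sketched in code. This is a hypothetical illustration only: the claim does not specify a representation, so the axis-aligned bounding box, the mode names, and the function names below are all assumptions chosen to show how the user's tracked position relative to the object could select between the two reference modes.

```python
from dataclasses import dataclass

# Illustrative mode labels; the patent names the modes but not any constants.
SPATIAL_MAPPING = "spatial_mapping"
BODY_REFERENCE_MAPPING = "body_reference_mapping"

@dataclass
class AABB:
    """Axis-aligned bounding box standing in for the 3D object's extent (an assumption)."""
    min_corner: tuple
    max_corner: tuple

    def contains(self, point):
        # True when the point lies inside the box on every axis.
        return all(lo <= p <= hi for p, lo, hi in
                   zip(point, self.min_corner, self.max_corner))

def determine_reference_mode(user_position, object_bounds):
    """Select body reference mapping when the user is inside (or 'is') the
    object, and spatial mapping when the user has stepped out of it or
    stands next to it."""
    if object_bounds.contains(user_position):
        return BODY_REFERENCE_MAPPING
    return SPATIAL_MAPPING
```

A usage note: with the user's tracked torso position at the origin and the object's bounds spanning it, the sketch returns the body reference mapping mode; moving the position outside the bounds flips it to spatial mapping.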
3 Assignments
0 Petitions
Abstract
An intuitive interface may allow users of a computing device (e.g., children) to create imaginary three dimensional (3D) objects of any shape using body gestures performed by the users as a primary or only input. A user may make motions while in front of an imaging device that senses movement of the user. The interface may allow first-person and/or third-person interaction during creation of objects, which may map a body of the user to a body of an object presented on a display. In an example process, the user may start by scanning an arbitrary body gesture into an initial shape of an object. Next, the user may perform various gestures with his or her body, which may result in various edits to the object. After the object is completed, it may be animated, possibly based on movements of the user.
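The scan-edit-animate loop the abstract describes can be outlined as a minimal sketch. Everything here is an assumption: `scan_outline`, `apply_edit`, and `animate` are hypothetical stand-ins for the system's actual scanning, editing, and animation stages, and the "done" sentinel is an invented way to mark the end of editing.

```python
def create_object(gesture_stream, scan_outline, apply_edit, animate):
    """Sketch of the abstract's process: the first gesture (an initial pose)
    is scanned into an initial shape, subsequent gestures edit the shape,
    and remaining gestures drive animation of the finished object."""
    shape = scan_outline(next(gesture_stream))  # initial pose -> initial shape
    for gesture in gesture_stream:
        if gesture == "done":                   # hypothetical end-of-editing signal
            break
        shape = apply_edit(shape, gesture)      # each gesture produces an edit
    return animate(shape, gesture_stream)       # leftover gestures animate the object
```

With stub stages this shows the control flow only; a real system would recognize gestures from imaging-device frames rather than consume strings.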
20 Claims
1. A method comprising:
creating, using an imaging device, a three dimensional (3D) object having an initial shape based on an outline of an initial pose of a user;
causing presentation of the 3D object having the initial shape on a display to provide feedback to the user;
editing the 3D object by:
determining a reference mode of a plurality of reference modes for editing the 3D object, the plurality of reference modes including a spatial mapping mode and a body reference mapping mode, the body reference mapping mode allowing the user to edit the 3D object based on at least one of the user being inside of the 3D object and the user being the 3D object, and the spatial mapping mode allowing the user to edit the 3D object based on at least one of the user stepping out of the 3D object and the user being next to the 3D object;
determining an edit operation to be performed on the 3D object;
performing the edit operation on the 3D object based on a body gesture of the user that is sensed by the imaging device and the reference mode for editing the 3D object; and
causing presentation of the 3D object to the user; and
animating the 3D object based on additional body gestures of the user.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
13. A system comprising:
an imaging device to detect body gestures of a user;
one or more processors; and
memory to store computer readable instructions that, when executed on the one or more processors, cause the one or more processors to perform acts comprising:
creating, using the imaging device, a three dimensional (3D) object having an initial shape based on an outline of an initial pose of the user;
causing presentation of the 3D object having the initial shape on a display to provide feedback to the user;
identifying a body gesture of the user based on information received from the imaging device;
determining an edit operation to be performed on the 3D object based on the body gesture;
modifying the 3D object using the edit operation;
identifying a tool based on the body gesture;
determining a tool operation to be performed by the tool;
determining a reference mode of a plurality of reference modes for editing the 3D object, the plurality of reference modes including a spatial mapping mode and a body reference mapping mode, the body reference mapping mode allowing the user to edit the 3D object based on at least one of the user being inside of the 3D object and the user being the 3D object, and the spatial mapping mode allowing the user to edit the 3D object based on at least one of the user stepping out of the 3D object and the user being next to the 3D object; and
modifying the 3D object, wherein modifying the 3D object includes causing the tool to interact with the 3D object based on the body gesture and the reference mode for editing the 3D object.
- View Dependent Claims (14, 15, 16, 17)
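Claim 13's tool flow (a gesture identifies a tool, the tool defines an operation, and the operation modifies the object under the active reference mode) can be sketched as a dispatch table. The gesture names, tool names, and operation signature below are all hypothetical; the patent does not enumerate any specific tools.

```python
# Illustrative gesture-to-tool mapping; none of these names come from the patent.
TOOL_FOR_GESTURE = {
    "pinch": "extrude",
    "sweep": "smooth",
    "chop": "cut",
}

def modify_with_tool(obj, gesture, reference_mode, operations):
    """Identify a tool from the body gesture, then apply that tool's
    operation to the object, parameterized by the active reference mode."""
    tool = TOOL_FOR_GESTURE.get(gesture)
    if tool is None:
        return obj  # unrecognized gesture: leave the object unchanged
    return operations[tool](obj, reference_mode)
```

Passing the reference mode into the operation mirrors the claim's requirement that the tool's interaction with the object depends on both the gesture and the mode.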
18. One or more memory devices storing computer-executable instructions that, when executed on one or more processors, cause the one or more processors to perform acts comprising:
obtaining a three dimensional (3D) object having an initial shape based on an outline of an initial pose of a user;
causing presentation of the 3D object having the initial shape on a display to provide feedback to the user;
determining a reference mode of a plurality of reference modes for editing the 3D object, the plurality of reference modes including a spatial mapping mode and a body reference mapping mode, the body reference mapping mode allowing the user to edit the 3D object based on at least one of the user being inside of the 3D object and the user being the 3D object, and the spatial mapping mode allowing the user to edit the 3D object based on at least one of the user stepping out of the 3D object and the user being next to the 3D object;
determining an edit operation to be performed on the 3D object;
performing the edit operation on the 3D object based on a body gesture of the user that is sensed by an imaging device and the reference mode for editing the 3D object;
causing presentation of the 3D object;
allowing the user to map a first feature of the 3D object to a first body part of the user; and
animating the 3D object by moving the first feature in response to a first movement of the first body part of the user.
- View Dependent Claims (19, 20)
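Claim 18's final elements, binding a feature of the object to a body part and then moving the feature with that body part, can be sketched as follows. The class, the name-keyed feature positions, and the translation-only motion model are assumptions for illustration; the patent does not prescribe a data representation.

```python
class AnimatedObject:
    """Hypothetical object whose named features can be bound to tracked
    body parts, so that body movement drives feature movement."""

    def __init__(self, features):
        # features: feature name -> (x, y, z) position of a controllable feature
        self.features = dict(features)
        self.bindings = {}  # body part name -> feature name

    def map_feature(self, feature, body_part):
        """Bind a feature of the object to one of the user's body parts."""
        self.bindings[body_part] = feature

    def on_body_movement(self, body_part, delta):
        """Translate the bound feature by the body part's movement delta;
        movements of unbound body parts are ignored."""
        feature = self.bindings.get(body_part)
        if feature is not None:
            x, y, z = self.features[feature]
            dx, dy, dz = delta
            self.features[feature] = (x + dx, y + dy, z + dz)
```

For example, mapping a "wing" feature to the user's left arm makes each sensed arm movement translate the wing, which is the claim's animation step in miniature.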
Specification