FREE-SPACE USER INTERFACE AND CONTROL USING VIRTUAL CONSTRUCTS
Abstract
During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes may be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct may be updated, continuously or from time to time, based on the control object's location.
37 Claims
1. A computer-implemented method of controlling a machine user interface, comprising:

receiving information including motion information for a control object;

determining from the motion information whether a motion of the control object is an engagement gesture according to an occurrence of an engagement gesture applied to at least one virtual control construct defined within a field of view of an image capturing device;

determining a control to which the engagement gesture is applicable; and

manipulating the control according to at least the motion information.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11
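The steps of claim 1 can be sketched as follows. This is an illustrative interpretation only, not the patented implementation: it assumes the virtual control construct is a plane at z = 0, represents tracked samples as (x, y, z) tuples, and the names `is_engagement` and `applicable_control` are hypothetical.

```python
# Hypothetical sketch of claim 1: detect an engagement gesture against a
# virtual control construct (modeled as the plane z == 0), determine the
# control it applies to, and manipulate that control.

def is_engagement(track):
    """True if consecutive samples show the control object crossing the
    virtual plane from the free side (z > 0) to the engaged side (z <= 0)."""
    return any(a[2] > 0 >= b[2] for a, b in zip(track, track[1:]))

def applicable_control(sample, controls):
    """Determine the control the gesture applies to: here, the one whose
    screen position is nearest the (x, y) of the engaging sample."""
    return min(controls,
               key=lambda c: (c[1][0] - sample[0]) ** 2
                           + (c[1][1] - sample[1]) ** 2)[0]

# Tracked motion of a fingertip pushing through the virtual plane:
track = [(0.0, 0.0, 1.0), (0.1, 0.0, 0.5), (0.2, 0.0, -0.1)]
controls = [("slider", (0, 0)), ("button", (5, 5))]
target = applicable_control(track[-1], controls) if is_engagement(track) else None
# target -> "slider"
```

The nearest-control rule stands in for "determining a control to which the engagement gesture is applicable"; the claim itself does not specify how that determination is made.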
12. A computer-implemented method of controlling a machine user interface, comprising:

receiving information including motion information for a control object;

determining from the motion information whether a motion of the control object is an engagement gesture according to an occurrence of an engagement gesture applied to at least one virtual control construct defined within a field of view of an image capturing device, comprising:

determining whether an intersection occurred between the control object and the at least one virtual control construct, and when an intersection has occurred, determining from the motion information whether the engagement includes continued motion after intersection;

otherwise, determining whether a dis-intersection of the control object from the at least one virtual control construct occurred;

otherwise, determining whether motion of the control object occurred relative to the at least one virtual control construct;

determining from the motion information a set of engagement attributes defining an engagement gesture; and

identifying an engagement gesture by correlating motion information to at least one engagement gesture based at least upon one or more of motion of the control object, occurrence of any of an intersection, a dis-intersection or a non-intersection of the control object with the virtual control construct, and the set of engagement attributes;

determining a control to which the engagement gesture is applicable; and

manipulating the control according to at least the engagement gesture.
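The branching determinations of claim 12 (intersection with continued motion, dis-intersection, or motion without intersection) can be sketched as a classifier over the control object's successive signed distances from a planar construct. The function name, the drag threshold, and the planar model are all assumptions for illustration.

```python
# Hypothetical sketch of claim 12's branching: classify the motion of a
# control object from its successive signed distances to a virtual plane
# (positive = the user's side of the plane).

def classify_engagement(zs, plane=0.0):
    crossed_in = any(a > plane >= b for a, b in zip(zs, zs[1:]))   # intersection
    crossed_out = any(a <= plane < b for a, b in zip(zs, zs[1:]))  # dis-intersection
    if crossed_in and crossed_out:
        return "dis-intersect"        # pierced the plane, then withdrew
    if crossed_in:
        # Continued motion after intersection distinguishes, e.g., a drag
        # from a simple poke (0.05 is an arbitrary illustrative threshold).
        behind = [z for z in zs if z <= plane]
        return "intersect+drag" if max(behind) - min(behind) > 0.05 else "intersect"
    return "hover"                    # motion relative to, but not through, the plane
```

In claim 12's terms, the returned label plus the raw motion would form part of the "set of engagement attributes" correlated against known engagement gestures.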
13. A computer-implemented method for controlling a user interface via free-space motions of a control object, the method comprising:

receiving motion information indicating positions of a control object being tracked in free space; and

using a processor,

(i) defining a virtual control construct, at least a portion thereof having a spatial position determined based at least in part on the motion information such that the virtual control construct portion is positioned proximate to the control object;

(ii) determining from the motion information whether the tracked motions of the control object indicate that the control object has intersected the virtual control construct; and

(iii) switching from conducting control of a user interface in a first mode to conducting control of the user interface in a second mode based at least in part upon an occurrence of the control object intersecting the virtual control construct.

Dependent claims: 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35
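Claim 13's mode switch, combined with the abstract's note that the construct's position is updated from the control object's location, can be sketched as a plane that trails the hand at a fixed offset until a deliberate push pierces it. The trailing-offset scheme and the name `track_modes` are assumptions, not the patented method.

```python
# Hypothetical sketch of claim 13: a virtual plane is kept proximate to the
# control object (trailing it at a fixed offset while disengaged); piercing
# the plane switches the interface from a first mode ("disengaged") to a
# second mode ("engaged"); withdrawing back through it switches back.

def track_modes(zs, offset=0.3):
    plane = zs[0] - offset            # construct positioned near the object
    mode = "disengaged"
    modes = []
    for z in zs:
        if mode == "disengaged":
            plane = max(plane, z - offset)  # plane follows a retreating hand
            if z <= plane:
                mode = "engaged"            # intersection: switch modes
        elif z > plane:
            mode = "disengaged"             # dis-intersection: switch back
        modes.append(mode)
    return modes

# A push through the plane engages; withdrawing past it disengages again:
modes = track_modes([1.0, 0.9, 0.6, 0.6, 1.0])
# -> ['disengaged', 'disengaged', 'engaged', 'engaged', 'disengaged']
```

Updating `plane` only while disengaged is one way to realize "updated, continuously or from time to time, based on the control object's location" without the plane chasing the hand during a forward push.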
36. A system for controlling a machine user interface via free-space motions of a control object tracked with an image capturing device, the system comprising:

a processor; and

memory storing (i) motion information for the control object; and (ii) processor-executable instructions for causing the processor to determine from the motion information whether a motion of the control object is an engagement gesture according to an occurrence of an engagement gesture applied to at least one virtual control construct defined within a field of view of the image capturing device, to determine a control to which the engagement gesture is applicable, and to manipulate the control according to at least the motion information.
37. A non-transitory machine-readable medium, storing one or more instructions which, when executed by one or more processors, cause the one or more processors to:

determine from motion information received for a control object whether a motion of the control object is an engagement gesture according to an occurrence of an engagement gesture applied to at least one virtual control construct defined within a field of view of an image capturing device;

determine a control to which the engagement gesture is applicable; and

manipulate the control according to at least the motion information.
Specification