Non-tactile interface systems and methods
First Claim
1. A machine-implemented method for processing an input gesture, the method comprising:
tracking, using a 3D sensor, user movements including sensing positional information of a portion of a hand in a monitored region of space monitored by the 3D sensor;
using the sensed positional information of the portion of the hand, defining a plurality of distinct user-specific virtual planes, including at least a first user-specific virtual plane defined in space relative to a position of, and corresponding to, a first finger of the hand and a second user-specific virtual plane defined in space relative to a position of, and corresponding to, a second finger of the hand, in the monitored region of space;
detecting, by the 3D sensor, a first finger state of the first finger relative to the corresponding first user-specific virtual plane and a second finger state of the second finger relative to the corresponding second user-specific virtual plane, wherein a finger state for a finger relative to the corresponding user-specific virtual plane defined for the finger is one of:
the finger moving closer to or farther away from the corresponding user-specific virtual plane, and the finger moving on or against the corresponding user-specific virtual plane;
determining an input gesture made by the portion of the hand based on the first finger state and the second finger state;
interpreting the input gesture as a command using the input gesture determined from the first finger state and the second finger state; and
providing the command to a machine for executing an action appropriate to the command.
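The claimed steps can be illustrated in code. The following is a minimal, hypothetical Python sketch (not the patent's implementation): each finger's user-specific virtual plane is modeled as a point plus a unit normal, a finger state is classified from two successive sensed positions, and a pair of finger states is mapped to an input gesture. The names, the contact tolerance, and the gesture table are all assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

def _dot(a: Vec3, b: Vec3) -> float:
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

@dataclass
class VirtualPlane:
    """A user-specific virtual plane: a point on the plane plus a unit normal."""
    point: Vec3
    normal: Vec3

    def signed_distance(self, p: Vec3) -> float:
        d = (p[0] - self.point[0], p[1] - self.point[1], p[2] - self.point[2])
        return _dot(d, self.normal)

ON_PLANE_EPS = 5.0  # mm; hypothetical tolerance for "moving on or against" the plane

def finger_state(plane: VirtualPlane, prev_pos: Vec3, curr_pos: Vec3) -> str:
    """Classify one finger's state relative to its corresponding virtual plane."""
    d_prev = abs(plane.signed_distance(prev_pos))
    d_curr = abs(plane.signed_distance(curr_pos))
    if d_curr <= ON_PLANE_EPS:
        return "on_plane"      # moving on or against the plane
    if d_curr < d_prev:
        return "approaching"   # moving closer to the plane
    return "receding"          # moving farther from the plane

# Hypothetical mapping from a pair of finger states to an input gesture.
GESTURES = {
    ("approaching", "approaching"): "pinch",
    ("receding", "receding"): "release",
    ("on_plane", "on_plane"): "two_finger_press",
}

def determine_gesture(first: str, second: str) -> str:
    """Determine the input gesture from the first and second finger states."""
    return GESTURES.get((first, second), "no_gesture")
```

For example, with a plane through the origin whose normal is (0, 0, 1), a finger sampled at z = 50 mm and then z = 20 mm is "approaching"; two approaching fingers would be combined into a "pinch" gesture, which a downstream stage could then interpret as a command.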
Abstract
Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. The method can also include determining, from the zone, a correct way to interpret inputs made by a position, shape, or motion of the portion of the hand or other detectable object.
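The zone-based interpretation the abstract describes can be sketched with a small, hypothetical Python example (the zone shapes, names, and interpretations are assumptions, not taken from the patent): the monitored region is partitioned into axis-aligned zones, and the zone containing the detected position selects how the input is interpreted.

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Zone:
    """One axis-aligned sub-region of the monitored space (hypothetical shape)."""
    name: str
    lo: Vec3             # minimum corner, in sensor coordinates
    hi: Vec3             # maximum corner, in sensor coordinates
    interpretation: str  # how inputs detected in this zone are interpreted

    def contains(self, p: Vec3) -> bool:
        return all(l <= c <= h for l, c, h in zip(self.lo, p, self.hi))

def zone_for(position: Vec3, zones: Sequence[Zone]) -> Optional[Zone]:
    """Return the first zone containing the detected position, or None."""
    for zone in zones:
        if zone.contains(position):
            return zone
    return None

# Example partition: inputs near the sensor act as touches, farther ones as hover.
ZONES = [
    Zone("near", (0, 0, 0), (200, 200, 50), "touch"),
    Zone("far", (0, 0, 50), (200, 200, 300), "hover"),
]
```

Here `zone_for((80, 40, 20), ZONES)` falls in the "near" zone, so that input would be interpreted as a touch, while the same hand position at z = 120 would be interpreted as a hover.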
211 Citations
20 Claims
1. A machine-implemented method for processing an input gesture (set forth in full above under "First Claim").
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13.
14. A system, including:
an image-capture device including at least one camera;
an image analyzer coupled to the at least one camera, the image analyzer being configured to:
track, using a 3D sensor, user movements including sensing positional information of a portion of a hand in a monitored region of space monitored by the 3D sensor;
using the sensed positional information of the portion of the hand, define a plurality of distinct user-specific virtual planes, including at least a first user-specific virtual plane defined in space relative to a position of, and corresponding to, a first finger of the hand and a second user-specific virtual plane defined in space relative to a position of, and corresponding to, a second finger of the hand, in the monitored region of space;
detect a first finger state of the first finger relative to the corresponding first user-specific virtual plane and a second finger state of the second finger relative to the corresponding second user-specific virtual plane, wherein a finger state for a finger relative to the corresponding user-specific virtual plane defined for the finger is one of:
the finger moving closer to or farther away from the corresponding user-specific virtual plane, and the finger moving on or against the corresponding user-specific virtual plane;
determine an input gesture made by the portion of the hand based on the first finger state and the second finger state;
interpret the input gesture as a command using the input gesture determined from the first finger state and the second finger state; and
provide the command to a machine for executing an action appropriate to the command.
Dependent claims: 15, 16, 17, 18, 19.
20. A non-transitory computer-readable storage medium storing instructions which when executed by a processor cause the processor to:
track, using a 3D sensor, user movements including sensing positional information of a portion of a hand in a monitored region of space monitored by the 3D sensor;
using the sensed positional information of the portion of the hand, define a plurality of distinct user-specific virtual planes, including at least a first user-specific virtual plane defined in space relative to a position of, and corresponding to, a first finger of the hand and a second user-specific virtual plane defined in space relative to a position of, and corresponding to, a second finger of the hand, in the monitored region of space;
detect, by the 3D sensor, a first finger state of the first finger relative to the corresponding first user-specific virtual plane and a second finger state of the second finger relative to the corresponding second user-specific virtual plane, wherein a finger state for a finger relative to the corresponding user-specific virtual plane defined for the finger is one of:
the finger moving closer to or farther away from the corresponding user-specific virtual plane, and the finger moving on or against the corresponding user-specific virtual plane;
determine an input gesture made by the portion of the hand based on the first finger state and the second finger state;
interpret the input gesture as a command using the input gesture determined from the first finger state and the second finger state; and
provide the command to a machine for executing an action appropriate to the command.