GESTURE-BASED USER INTERFACE
First Claim
1. A computer-implemented method for a gesture-based user interface for controlling a software program on an electronic device, said method comprising:
- receiving image data from a multi-aperture image sensor in said electronic device, said image sensor being configured to simultaneously expose an image sensor to at least a first part of the electromagnetic (EM) spectrum using a first aperture and at least a second part of the EM spectrum using one or more second apertures;
- determining sharpness information in at least one area of said image data associated with at least part of an object imaged by said first aperture and said one or more second apertures onto the image plane of said image sensor;
- generating depth information on the basis of at least part of said sharpness information; and
- recognizing, on the basis of said depth information, at least part of a gesture associated with a movement of said object.
1 Assignment
0 Petitions
Abstract
A computer-implemented method for a gesture-based user interface and a gesture-based user interface system are described. The method comprises receiving image data from a multi-aperture image sensor in said electronic device, said image sensor being configured to simultaneously expose an image sensor to at least a first part of the electromagnetic (EM) spectrum using a first aperture and at least a second part of the EM spectrum using one or more second apertures; determining sharpness information in at least one area of said image data associated with at least part of an object imaged by said first aperture and said one or more second apertures onto the image plane of said image sensor; generating depth information on the basis of at least part of said sharpness information; and recognizing, on the basis of said depth information, at least part of a gesture associated with a movement of said object.
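The depth-from-sharpness idea in the abstract can be sketched briefly: the small second aperture (e.g. infrared) has a deep depth of field and stays sharp at most distances, while the larger first aperture blurs as an object leaves the focal plane, so comparing per-patch sharpness across the two channels yields a depth cue. This is a minimal, non-authoritative illustration; the second-difference sharpness measure, the 1-D patch format, and all names are assumptions, not taken from the patent.

```python
def sharpness(signal):
    """Mean absolute second difference: a crude high-frequency (edge) measure."""
    if len(signal) < 3:
        return 0.0
    return sum(abs(signal[i - 1] - 2 * signal[i] + signal[i + 1])
               for i in range(1, len(signal) - 1)) / (len(signal) - 2)

def relative_depth_cue(visible_patch, ir_patch, eps=1e-9):
    """Compare sharpness across the two aperture channels.

    The small (IR) aperture keeps its channel sharp at most distances; the
    larger visible-band aperture defocuses away from the focal plane, so this
    ratio falls as the imaged object moves off the plane of focus.
    """
    return sharpness(visible_patch) / (sharpness(ir_patch) + eps)

# Illustrative patches: a crisp edge (as the small-aperture channel might see
# it) and a defocused copy (as the visible channel might see it off-focus).
ir = [0, 0, 10, 10, 0, 0]
visible_in_focus = [0, 0, 10, 10, 0, 0]
visible_defocused = [0, 2.5, 7.5, 7.5, 2.5, 0]
assert relative_depth_cue(visible_defocused, ir) < relative_depth_cue(visible_in_focus, ir)
```

In a real implementation the comparison would run per image region (the claimed "at least one area"), producing the depth map that the dependent claims build on.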
107 Citations
22 Claims
1. A computer-implemented method for a gesture-based user interface for controlling a software program on an electronic device, said method comprising:
receiving image data from a multi-aperture image sensor in said electronic device, said image sensor being configured to simultaneously expose an image sensor to at least a first part of the electromagnetic (EM) spectrum using a first aperture and at least a second part of the EM spectrum using one or more second apertures;
determining sharpness information in at least one area of said image data associated with at least part of an object imaged by said first aperture and said one or more second apertures onto the image plane of said image sensor;
generating depth information on the basis of at least part of said sharpness information; and
recognizing, on the basis of said depth information, at least part of a gesture associated with a movement of said object. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15)
16. The method according to claim 1, said method further comprising:
on the basis of said depth map, determining in said image data a region of interest associated with a fingertip;
extracting one or more directional features from an enhanced image formed by blending first low-frequency image data associated with said first part of the EM spectrum with said second high-frequency image data;
authenticating a user by matching said extracted directional features with directional reference features associated with a fingerprint of said user.
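The fingerprint step above extracts "directional features" and matches them against enrolled references. One common way to realize such features is a magnitude-weighted histogram of gradient orientations; the sketch below uses that, with cosine similarity for the match. This is an illustrative stand-in, not the patent's method: the histogram binning, the threshold, and the function names are all assumptions.

```python
import math

def orientation_histogram(img, bins=8):
    """Directional features: magnitude-weighted histogram of gradient
    orientations over a grayscale image given as a list of rows of floats."""
    hist = [0.0] * bins
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            if mag == 0.0:
                continue
            ang = math.atan2(gy, gx) % math.pi   # orientation, not direction
            hist[min(int(ang / math.pi * bins), bins - 1)] += mag
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def authenticate(features, reference, threshold=0.9):
    """Accept when cosine similarity to the enrolled reference is high enough."""
    dot = sum(a * b for a, b in zip(features, reference))
    norm = (math.sqrt(sum(a * a for a in features))
            * math.sqrt(sum(b * b for b in reference)))
    return norm > 0 and dot / norm >= threshold
```

Per the claim, the image fed to `orientation_histogram` would be the enhanced blend of low-frequency visible data with high-frequency data from the second aperture, restricted to the fingertip region of interest found via the depth map.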
17. A gesture-based user interface system for use in an electronic device, said user interface system being configured to control a software program on said electronic device, said system comprising:
a multi-aperture image sensor configured to generate image data, said multi-aperture image sensor being configured to simultaneously expose an image sensor to at least a first part of the electromagnetic (EM) spectrum using a first aperture and at least a second part of the EM spectrum using one or more second apertures;
one or more filters configured to generate sharpness information in at least one area of said image data associated with at least part of an object, preferably at least part of a human body part, imaged by said first aperture and said one or more second apertures onto the image plane of said image sensor;
a generator configured to generate depth information on the basis of at least part of said sharpness information; and
a gesture recognition module configured to recognize, on the basis of said depth information, a gesture associated with a movement of said object. - View Dependent Claims (18, 19, 20, 21)
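The gesture recognition module in the system claim consumes per-frame depth information for a tracked object. As a deliberately small sketch of such a module, assuming a scalar depth cue per frame and the (illustrative) convention that a larger cue means the object is closer to the focal plane; the gesture labels and threshold are invented for the example:

```python
def classify_gesture(depth_cues, min_change=0.2):
    """Classify a tracked object's motion from a sequence of per-frame
    depth cues (one float per frame for, e.g., a hand region).

    A sufficiently large net increase of the cue over the window is read as
    a push toward the camera, a net decrease as a pull away; anything
    smaller is reported as no gesture.
    """
    if len(depth_cues) < 2:
        return None
    delta = depth_cues[-1] - depth_cues[0]
    if delta >= min_change:
        return "push"
    if delta <= -min_change:
        return "pull"
    return None

# Example: a hand approaching the camera over three frames.
assert classify_gesture([0.2, 0.4, 0.8]) == "push"
```

A production module would track the object region across frames and feed richer trajectories (lateral motion, multiple fingers) to the classifier; the endpoint-difference rule here only illustrates where the depth information plugs in.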
22. (canceled)
Specification