Virtual keyboards and methods of providing the same
First Claim
1. A system for inputting data into a computing device, the system comprising:
a camera for capturing a sequence of images containing a finger of a user;
a virtual keyboard comprising a plurality of virtual keys, each virtual key having a plurality of virtual sensors, each virtual sensor comprising a region of adjoining pixels within a corresponding virtual key, wherein a plurality of the virtual keys have different virtual sensor layouts from one another;
a display for displaying each image of the sequence combined with the virtual keyboard, wherein the position of the finger in the displayed image relative to the virtual keyboard changes as the finger of the user moves relative to the camera;
a video feature extraction module configured to:
detect motion of the user's finger in the sequence of images relative to the virtual keyboard,
identify a sequence of actuated virtual sensors within at least one of the plurality of virtual keys based on the detected motion, wherein a virtual sensor is actuated when the motion of the user's finger is detected over a threshold number of pixels within the region of adjoining pixels corresponding to the actuated virtual sensor, and
collect virtual sensor actuation data comprising both geographical and temporal information related to each actuated virtual sensor of the sequence of actuated virtual sensors; and
a gesture pattern matching module for using the sequence of virtual sensor actuations to recognize a user's gesture as one of a valid keystroke and a rejected keystroke and input data into the computing device.
Abstract
The present disclosure provides systems, methods and apparatus, including computer programs encoded on computer storage media, for providing virtual keyboards. In one aspect, a system includes a camera, a display, a video feature extraction module and a gesture pattern matching module. The camera captures a sequence of images containing a finger of a user, and the display displays each image combined with a virtual keyboard having a plurality of virtual keys. The video feature extraction module detects motion of the finger in the sequence of images relative to virtual sensors of the virtual keys, and determines sensor actuation data based on the detected motion relative to the virtual sensors. The gesture pattern matching module uses the sensor actuation data to recognize a gesture.
119 Citations
40 Claims
1. A system for inputting data into a computing device, the system comprising:
a camera for capturing a sequence of images containing a finger of a user;
a virtual keyboard comprising a plurality of virtual keys, each virtual key having a plurality of virtual sensors, each virtual sensor comprising a region of adjoining pixels within a corresponding virtual key, wherein a plurality of the virtual keys have different virtual sensor layouts from one another;
a display for displaying each image of the sequence combined with the virtual keyboard, wherein the position of the finger in the displayed image relative to the virtual keyboard changes as the finger of the user moves relative to the camera;
a video feature extraction module configured to:
detect motion of the user's finger in the sequence of images relative to the virtual keyboard,
identify a sequence of actuated virtual sensors within at least one of the plurality of virtual keys based on the detected motion, wherein a virtual sensor is actuated when the motion of the user's finger is detected over a threshold number of pixels within the region of adjoining pixels corresponding to the actuated virtual sensor, and
collect virtual sensor actuation data comprising both geographical and temporal information related to each actuated virtual sensor of the sequence of actuated virtual sensors; and
a gesture pattern matching module for using the sequence of virtual sensor actuations to recognize a user's gesture as one of a valid keystroke and a rejected keystroke and input data into the computing device.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 38, 39, 40)
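The virtual-sensor actuation step of the claim can be sketched in code. This is a minimal illustrative sketch, not the patent's reference implementation: the motion test (per-pixel frame differencing), the `MOTION_DELTA` and `MIN_ACTIVE_PIXELS` thresholds, and all function names are assumptions. It shows how a sensor, modeled as a region of adjoining pixel coordinates within a virtual key, actuates only when motion covers a threshold number of its pixels, and how each actuation record carries both geographical (centroid) and temporal (timestamp) information.

```python
MOTION_DELTA = 30        # assumed per-pixel intensity change counted as motion
MIN_ACTIVE_PIXELS = 12   # assumed threshold number of moving pixels per sensor

def moving_pixels(prev_frame, frame, region, delta=MOTION_DELTA):
    """Pixels of `region` (a list of (row, col) coords) that changed
    by more than `delta` between two grayscale frames."""
    return [(r, c) for (r, c) in region
            if abs(frame[r][c] - prev_frame[r][c]) > delta]

def actuated_sensors(prev_frame, frame, sensors, t):
    """Return (sensor_id, timestamp, centroid) actuation records.

    `sensors` maps sensor_id -> region, where a region is the list of
    adjoining (row, col) pixel coordinates inside that virtual key.
    """
    events = []
    for sensor_id, region in sensors.items():
        active = moving_pixels(prev_frame, frame, region)
        if len(active) >= MIN_ACTIVE_PIXELS:
            # Geographical information: centroid of the moving pixels;
            # temporal information: the frame timestamp t.
            cy = sum(r for r, _ in active) / len(active)
            cx = sum(c for _, c in active) / len(active)
            events.append((sensor_id, t, (cy, cx)))
    return events
```

Running this per frame pair yields the claimed sequence of actuated virtual sensors, ordered by timestamp, which downstream gesture matching can consume.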
10. A method for inputting data into an electronic device using a virtual keyboard, the method comprising:
receiving a sequence of images containing a finger of a user;
generating a combined image for each image of the sequence, the combined image containing the finger of the user and a virtual keyboard, the virtual keyboard comprising a plurality of virtual keys, each virtual key having a plurality of virtual sensors, each virtual sensor comprising a region of adjoining pixels within a corresponding virtual key, wherein a plurality of the virtual keys have different virtual sensor layouts from one another;
detecting a motion of the finger in the sequence of images relative to the virtual sensors;
identifying a sequence of actuated virtual sensors within at least one of the plurality of virtual keys based on the detected motion, wherein a virtual sensor is actuated when the motion of the user's finger is detected over a threshold number of pixels within the plurality of pixels corresponding to the actuated virtual sensor;
generating sensor actuation data comprising both geographical and temporal information related to each actuated virtual sensor of the sequence of actuated virtual sensors; and
recognizing a gesture as one of a valid keystroke and a rejected keystroke using the sequence of virtual sensor actuations to input data into the electronic device.
- View Dependent Claims (11, 12, 13, 14, 15, 16, 17, 18, 19)
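The final recognizing step can likewise be sketched. The validity rule used here (all actuated sensors belong to one virtual key, and the gesture's duration falls inside a dwell-time window) is an assumption chosen for illustration; the claims leave the matching criteria open, and the dwell thresholds and names below are hypothetical.

```python
DWELL_MIN_MS = 40    # assumed: shorter gestures are treated as noise
DWELL_MAX_MS = 600   # assumed: longer gestures are treated as hovering

def classify_gesture(actuations, key_of_sensor,
                     dwell_min=DWELL_MIN_MS, dwell_max=DWELL_MAX_MS):
    """Classify a sequence of (sensor_id, timestamp_ms) actuations.

    Returns ("valid", key) when every actuated sensor belongs to the
    same virtual key and the gesture's duration falls inside the dwell
    window; otherwise returns ("rejected", None).
    """
    if not actuations:
        return ("rejected", None)
    keys = {key_of_sensor[s] for s, _ in actuations}
    if len(keys) != 1:                       # finger crossed key borders
        return ("rejected", None)
    duration = actuations[-1][1] - actuations[0][1]
    if not (dwell_min <= duration <= dwell_max):
        return ("rejected", None)
    return ("valid", keys.pop())
```

A valid result maps directly to the claimed data input (the recognized key's character); a rejected result discards the gesture without emitting a keystroke.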
20. A system for inputting data into a computing device, the system comprising:
means for capturing a sequence of images containing a finger of a user;
means for displaying each image of the sequence combined with a virtual keyboard, the virtual keyboard comprising a plurality of virtual keys, each virtual key having a plurality of virtual sensors, each virtual sensor comprising a region of adjoining pixels within a corresponding virtual key, wherein a plurality of the virtual keys have different virtual sensor layouts from one another, wherein the position of the finger in the displayed image relative to the virtual keyboard changes as the finger of the user moves relative to the camera;
means for detecting motion of the user's finger in the sequence of images relative to the virtual sensors;
means for identifying a sequence of actuated virtual sensors within at least one of the plurality of virtual keys based on the detected motion, wherein a virtual sensor is actuated when the motion of the user's finger is detected over a threshold number of pixels within the plurality of pixels corresponding to the actuated virtual sensor;
means for generating sensor actuation data comprising both geographical and temporal information related to each actuated virtual sensor of the sequence of actuated virtual sensors; and
means for recognizing a user's gesture as one of a valid keystroke and a rejected keystroke using the sequence of virtual sensor actuations to input data into the computing device.
- View Dependent Claims (21, 22, 23, 24, 25, 26, 27)
28. A non-transitory computer-readable storage medium comprising instructions that when executed perform a method of inputting data into an electronic device using a virtual keyboard, the method comprising:
receiving a sequence of images containing a finger of a user;
generating a combined image for each image of the sequence, the combined image containing the finger of the user and a virtual keyboard, the virtual keyboard comprising a plurality of virtual keys, each virtual key having a plurality of virtual sensors, each virtual sensor comprising a region of adjoining pixels within a corresponding virtual key, wherein a plurality of the virtual keys have different virtual sensor layouts from one another;
detecting a motion of the finger in the sequence of images relative to the virtual sensors;
identifying a sequence of actuated virtual sensors within at least one of the plurality of virtual keys based on the detected motion, wherein a virtual sensor is actuated when the motion of the user's finger is detected over a threshold number of pixels within the plurality of pixels corresponding to the actuated virtual sensor;
generating sensor actuation data comprising both geographical and temporal information related to each actuated virtual sensor of the sequence of actuated virtual sensors; and
recognizing a gesture as one of a valid keystroke and a rejected keystroke using the sequence of virtual sensor actuations to input data into the electronic device.
- View Dependent Claims (29, 30, 31, 32, 33, 34, 35, 36, 37)
Specification