Optical flow based tilt sensor
Abstract
A method is described for determining a description of motion of a moving mobile camera to determine a user input to an application. The method may involve capturing a series of images from a moving mobile camera and comparing stationary features present in the series of images. Optical flow analysis may be performed on the series of images to determine a description of motion of the moving mobile camera. Based on the determined motion, a user input to an application may be determined and the application may respond to the user input, for example, by updating a user interface of the application.
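The pipeline in the abstract (track stationary features across frames, turn their apparent displacement into a motion description, then into a user input) can be sketched in plain Python. This is an illustrative sketch only, not the patented implementation; feature detection and matching are assumed to happen elsewhere, and all function names and the threshold value are hypothetical.

```python
# Illustrative sketch (not the patented implementation): derive a coarse
# motion description for a handheld camera by comparing positions of
# stationary scene features across two frames. Features are (x, y) pixel
# coordinates; detection/matching is assumed to be done upstream.

def describe_motion(prev_features, curr_features):
    """Average feature displacement, interpreted as camera motion.

    Stationary features appear to move opposite to the camera, so the
    camera's lateral motion is the negated mean feature displacement.
    """
    n = len(prev_features)
    dx = sum(c[0] - p[0] for p, c in zip(prev_features, curr_features)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_features, curr_features)) / n
    return {"camera_dx": -dx, "camera_dy": -dy}

def to_user_input(motion, threshold=2.0):
    """Map a motion description to a discrete user input for an application.

    The threshold (in pixels) suppresses jitter; its value is an assumption.
    """
    dx, dy = motion["camera_dx"], motion["camera_dy"]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "none"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

For example, if the tracked features shift 5 pixels left between frames, the camera has moved right, and the application receives a "right" input.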
62 Citations
22 Claims
1. A method comprising:
- accessing, by at least one processor, images captured from an image sensor positioned on a handheld device;
- identifying, by the at least one processor, a first stationary feature that is present in the images captured from the image sensor on the handheld device;
- identifying, by the at least one processor, a second stationary feature that is present in the images captured from the image sensor on the handheld device, the second stationary feature being different than and spaced apart from the first stationary feature;
- determining, by the at least one processor, a description of motion of the handheld device in a depth direction based on changes in distance between the first stationary feature and the second stationary feature within the captured images, the depth direction being a direction of movement toward and away from the first stationary feature and the second stationary feature; and
- determining, by the at least one processor, user input for an application based on the determined description of motion of the handheld device in the depth direction.

View Dependent Claims (2, 3, 4, 5)

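The depth-direction determination in claim 1 rests on a simple geometric fact: as the camera moves toward two stationary features, the pixel distance between them grows, and as it moves away, that distance shrinks. A minimal Python sketch of that classification, with a hypothetical tolerance to absorb measurement noise:

```python
import math

# Hypothetical sketch of the claim-1 depth determination: classify motion
# along the depth axis from the change in pixel distance between two
# tracked stationary features. The tolerance value is an assumption.

def feature_distance(a, b):
    """Euclidean pixel distance between two (x, y) feature positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def depth_motion(f1_prev, f2_prev, f1_curr, f2_curr, tolerance=0.01):
    """Return 'toward', 'away', or 'steady' for motion in the depth direction.

    Compares the inter-feature distance in the current frame against the
    previous frame, as a relative change.
    """
    d_prev = feature_distance(f1_prev, f2_prev)
    d_curr = feature_distance(f1_curr, f2_curr)
    change = (d_curr - d_prev) / d_prev
    if change > tolerance:
        return "toward"   # features spread apart: camera approaching them
    if change < -tolerance:
        return "away"     # features converge: camera receding from them
    return "steady"
```

A benefit of using the ratio of distances rather than absolute feature positions is that it is insensitive to lateral camera motion, which shifts both features together without changing their separation.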
6. A system comprising:
- a handheld device;
- an image sensor positioned on the handheld device; and
- at least one processor configured to:
  - access images captured from the image sensor positioned on the handheld device;
  - identify a first stationary feature that is present in the images captured from the image sensor on the handheld device;
  - identify a second stationary feature that is present in the images captured from the image sensor on the handheld device, the second stationary feature being different than and spaced apart from the first stationary feature;
  - determine a description of motion of the handheld device in a depth direction based on changes in distance between the first stationary feature and the second stationary feature within the captured images, the depth direction being a direction of movement toward and away from the first stationary feature and the second stationary feature; and
  - determine user input for an application based on the determined description of motion of the handheld device in the depth direction.

View Dependent Claims (7, 8, 9, 10)

11. A system comprising:
- a handheld device;
- an image sensor positioned on the handheld device; and
- at least one processor configured to:
  - access images captured from the image sensor positioned on the handheld device;
  - identify a first stationary feature that is present in the images captured from the image sensor on the handheld device;
  - identify a second stationary feature that is present in the images captured from the image sensor on the handheld device, the second stationary feature being different than and spaced apart from the first stationary feature;
  - determine positions of the first stationary feature and the second stationary feature within the images captured from the image sensor on the handheld device;
  - provide a user interface that includes a visual display for an application, the user interface being separate from the handheld device on which the image sensor is positioned;
  - determine user input for the application based on the determined positions of the first stationary feature and the second stationary feature within the images captured from the image sensor on the handheld device; and
  - control the user interface that is separate from the handheld device on which the image sensor is positioned based on the determined user input for the application.

View Dependent Claims (12, 13, 14, 15, 16, 17, 18, 19, 20)

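Claim 11 controls a user interface on a separate display from the positions of the tracked features in the camera image. One way to picture this is mapping the midpoint of the two features to cursor coordinates on that display. The sketch below is purely illustrative; the resolutions, the mirrored x axis, and every name in it are assumptions, not the claimed implementation.

```python
# Hypothetical sketch of claim 11's position-based control: map the
# midpoint of two tracked feature positions (camera image coordinates)
# to a cursor position on a separate display. Resolutions are assumed.

IMG_W, IMG_H = 640, 480          # camera image resolution (assumption)
SCREEN_W, SCREEN_H = 1920, 1080  # separate display resolution (assumption)

def cursor_from_features(f1, f2):
    """Map the midpoint of two feature positions to display coordinates.

    The x axis is mirrored: when the handheld camera pans right, the
    stationary features drift left in the image, so mirroring makes the
    cursor follow the direction of the user's hand motion.
    """
    mid_x = (f1[0] + f2[0]) / 2
    mid_y = (f1[1] + f2[1]) / 2
    sx = (1 - mid_x / IMG_W) * SCREEN_W
    sy = (mid_y / IMG_H) * SCREEN_H
    return (round(sx), round(sy))
```

With both features at the image center, the cursor lands at the center of the display; as the camera tilts or pans, the midpoint shifts and the cursor tracks it.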
21. An apparatus comprising:
- means for accessing images captured from an image sensor positioned on a handheld device;
- means for identifying a first stationary feature that is present in the images captured from the image sensor on the handheld device;
- means for identifying a second stationary feature that is present in the images captured from the image sensor on the handheld device, the second stationary feature being different than and spaced apart from the first stationary feature;
- means for determining a description of motion of the handheld device in a depth direction based on changes in distance between the first stationary feature and the second stationary feature within the captured images, the depth direction being a direction of movement toward and away from the first stationary feature and the second stationary feature; and
- means for determining user input for an application based on the determined description of motion of the handheld device in the depth direction.

22. At least one non-transitory computer-readable medium having instructions stored thereon that, when executed, cause at least one processor to:
- access images captured from an image sensor positioned on a handheld device;
- identify a first stationary feature that is present in the images captured from the image sensor on the handheld device;
- identify a second stationary feature that is present in the images captured from the image sensor on the handheld device, the second stationary feature being different than and spaced apart from the first stationary feature;
- determine a description of motion of the handheld device in a depth direction based on changes in distance between the first stationary feature and the second stationary feature within the captured images, the depth direction being a direction of movement toward and away from the first stationary feature and the second stationary feature; and
- determine user input for an application based on the determined description of motion of the handheld device in the depth direction.

Specification