Ergonomic physical interaction zone cursor mapping
First Claim
1. A computer-implemented method for moving a cursor on a two-dimensional (“2D”) display in response to motion of a user's hands, the method comprising the steps of:
capturing locations of each of the user's hands within respective monitored physical interaction zones (PHIZs), each of the PHIZs being ergonomically matched to the user's natural range of motions;
tracking the locations for each of the user's hands within each PHIZ to determine motion of each of the user's hands; and
mapping the tracked hand locations from either one of the PHIZs to the display so that motion of either of the user's hands in a respective PHIZ results in a corresponding motion of the cursor on the display.
Abstract
Users move their hands in a three-dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region of the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows additional interactions to be performed, such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
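The mapping the abstract describes can be illustrated with a short sketch. This is not the patent's implementation: the PHIZ bounds, screen resolution, and press-depth threshold below are all assumed values chosen for illustration, and the clamping and rounding choices are the sketch's own.

```python
# Illustrative sketch (assumed values, not the claimed method): map a tracked
# hand position inside a 3D physical interaction zone (PHIZ) to a 2D cursor,
# using z-axis depth as a "press" gesture.
from dataclasses import dataclass

@dataclass
class Phiz:
    """An axis-aligned 3D box, in meters, relative to the sensor."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float  # z = distance pushed toward the display
    z_max: float

def map_hand_to_cursor(hand, phiz, screen_w=1920, screen_h=1080, press_depth=0.6):
    """Return (cursor_x, cursor_y, pressed) for a hand position (x, y, z)."""
    # Normalize each axis to [0, 1] within the PHIZ volume.
    nx = (hand[0] - phiz.x_min) / (phiz.x_max - phiz.x_min)
    ny = (hand[1] - phiz.y_min) / (phiz.y_max - phiz.y_min)
    nz = (hand[2] - phiz.z_min) / (phiz.z_max - phiz.z_min)
    # Clamp x/y so the cursor stays on the visible display.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    # x/y drive the cursor; pushing past press_depth in z acts as a press.
    return round(nx * screen_w), round((1.0 - ny) * screen_h), nz >= press_depth

phiz = Phiz(-0.3, 0.3, 0.9, 1.5, 0.4, 1.0)
print(map_hand_to_cursor((0.0, 1.2, 0.8), phiz))  # -> (960, 540, True)
```

A hand centered in the PHIZ maps to the center of the display, and pushing past the assumed z threshold registers as a press, matching the abstract's pressing/zooming interactions in spirit.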
20 Claims
- 1. A computer-implemented method for moving a cursor on a two-dimensional (“2D”) display in response to motion of a user's hands, the method comprising the steps of: capturing locations of each of the user's hands within respective monitored physical interaction zones (PHIZs), each of the PHIZs being ergonomically matched to the user's natural range of motions; tracking the locations for each of the user's hands within each PHIZ to determine motion of each of the user's hands; and mapping the tracked hand locations from either one of the PHIZs to the display so that motion of either of the user's hands in a respective PHIZ results in a corresponding motion of the cursor on the display. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13)
- 9. The computer-implemented method of claim 1 further including a step of utilizing a hand PHIZ in which motion of the user's hand or fingers is determined relative to a wrist joint of the user.
- 14. One or more computer-readable storage media storing instructions which, when executed on one or more processors disposed in a computing device, perform a method for controlling a movement of a cursor on a user interface (“UI”), the method comprising the steps of: using an optical sensor to detect a position of a hand or fingers of a subject within one or more of a plurality of three-dimensional (“3D”) physical interaction zones (“PHIZs”) located in a real world space, each PHIZ being shaped, sized, and located relative to the subject to enable ergonomic motion of the subject's hand or fingers throughout the PHIZ's volume; dynamically selecting one of the plurality of PHIZs, each PHIZ in the plurality being shaped, sized, and located relative to the subject for tracking ergonomic motion of the subject's hand or fingers about different arm joints, the arm joints comprising one of shoulder, elbow, or wrist; mapping the detected hand or finger position to a cursor location in a user interface (“UI”) supported by a two-dimensional (“2D”) display; and moving the cursor in the UI in correspondence with the subject's hand or finger motion within the selected PHIZ. - View Dependent Claims (15, 16)
- 17. A system for mapping motion of a user's hand in a three-dimensional (“3D”) space to motion of a cursor in a two-dimensional (“2D”) user interface (UI), comprising: at least one processor; a display configured to support at least a visible portion of the UI; an optical sensor configured to capture an orientation of the user within a physical space and for capturing locations of the user's hand within a 3D physical interaction zone (“PHIZ”) that is configured so that the user is able to move the hand to reach all points within the PHIZ in an ergonomic manner; and memory bearing executable instructions that, when executed by the at least one processor, perform a method comprising the steps of: using the optical sensor, determining the location of the hand within the PHIZ relative to a known point on the user's body using a 3D coordinate system, mapping the hand location in 3D coordinates in the PHIZ to 2D coordinates in the UI wherein the 2D coordinates include non-visible portions of the UI, and repeating the determining and mapping steps to move the cursor in the UI at a 2D coordinate to correspond with motion of the user's hand. - View Dependent Claims (18, 19, 20)
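Claim 14's dynamic selection among shoulder-, elbow-, and wrist-pivoted PHIZs can be sketched as follows. Everything here is an assumption for illustration, not the claimed method: the joint names, the reach radii, and the idea of using arm extension as the selection heuristic are all hypothetical.

```python
# Illustrative sketch (assumed heuristic, not the claimed method): choose
# which arm joint a PHIZ pivots about from how far the arm is extended.
def select_phiz(joints, extension):
    """Pick a pivot joint position and PHIZ reach radius (meters).

    joints: dict of joint name -> (x, y, z) from a hypothetical skeletal
    tracker; extension: fraction of full arm extension in [0, 1]. A nearly
    straight arm sweeps about the shoulder, a bent arm about the elbow,
    and small hand/finger motions pivot about the wrist.
    """
    if extension > 0.8:
        pivot, reach = "shoulder", 0.60  # large PHIZ, whole-arm motion
    elif extension > 0.4:
        pivot, reach = "elbow", 0.35     # medium PHIZ, forearm motion
    else:
        pivot, reach = "wrist", 0.12     # small PHIZ, hand/finger motion
    return joints[pivot], reach

joints = {"shoulder": (0.0, 1.4, 0.0), "elbow": (0.1, 1.2, 0.2), "wrist": (0.2, 1.1, 0.4)}
center, reach = select_phiz(joints, extension=0.9)
print(center, reach)  # shoulder-centered PHIZ selected for an extended arm
```

Sizing the selected PHIZ to the pivoting joint's reach is what keeps each zone ergonomic, per the claim: a wrist-pivoted zone is much smaller than a shoulder-pivoted one, so small finger motions and large arm sweeps each map comfortably onto the full display.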
Specification