Ergonomic physical interaction zone cursor mapping
First Claim
1. A computer-implemented method for moving a cursor on a two-dimensional (“2D”) display in response to motion of a user's hand, the method comprising the steps of:
capturing locations of the user's hand within a monitored physical interaction zone (“PHIZ”), the PHIZ being ergonomically matched to the user's natural range of motions;
tracking the locations within the PHIZ to determine motion of the user's hand; and
mapping the tracked hand locations from the PHIZ to the display so that motion of the hand in the PHIZ results in a corresponding motion of the cursor on the display.
Abstract
Users move their hands in a three-dimensional (“3D”) physical interaction zone (“PHIZ”) to control a cursor in a user interface (“UI”) shown on a computer-coupled 2D display such as a television or monitor. The PHIZ is shaped, sized, and positioned relative to the user to ergonomically match the user's natural range of motions so that cursor control is intuitive and comfortable over the entire region on the UI that supports cursor interaction. A motion capture system tracks the user's hand so that the user's 3D motions within the PHIZ can be mapped to the 2D UI. Accordingly, when the user moves his or her hands in the PHIZ, the cursor correspondingly moves on the display. Movement in the z direction (i.e., back and forth) in the PHIZ allows for additional interactions to be performed such as pressing, zooming, 3D manipulations, or other forms of input to the UI.
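The abstract's mapping can be pictured with a minimal sketch. This is an illustration, not the patented implementation: the PHIZ bounds, screen size, and press threshold are assumed values, and the z-axis press is just one of the "additional interactions" the abstract mentions.

```python
def phiz_to_ui(hand_xyz, phiz_min, phiz_max, screen_w, screen_h, press_z=0.25):
    """Map a 3D hand position inside the PHIZ to a 2D cursor position.

    hand_xyz, phiz_min, phiz_max are (x, y, z) tuples in the sensor's
    coordinate frame; z decreases as the hand moves toward the display.
    """
    # Normalize each axis of the hand position to [0, 1] within the PHIZ.
    nx = (hand_xyz[0] - phiz_min[0]) / (phiz_max[0] - phiz_min[0])
    ny = (hand_xyz[1] - phiz_min[1]) / (phiz_max[1] - phiz_min[1])
    nz = (hand_xyz[2] - phiz_min[2]) / (phiz_max[2] - phiz_min[2])
    # Clamp so the cursor stops at the display edge if the hand exits the PHIZ.
    clamp = lambda v: max(0.0, min(1.0, v))
    cursor = (clamp(nx) * screen_w, clamp(ny) * screen_h)
    # Motion in z ("back and forth") drives an additional interaction:
    # here, crossing a threshold toward the display registers a press.
    pressed = clamp(nz) < press_z
    return cursor, pressed
```

A hand centered in a 1 m cube PHIZ maps to the center of a 1920x1080 display without a press being registered.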
20 Claims
1. A computer-implemented method for moving a cursor on a two-dimensional (“2D”) display in response to motion of a user's hand, the method comprising the steps of:
capturing locations of the user's hand within a monitored physical interaction zone (“PHIZ”), the PHIZ being ergonomically matched to the user's natural range of motions;
tracking the locations within the PHIZ to determine motion of the user's hand; and
mapping the tracked hand locations from the PHIZ to the display so that motion of the hand in the PHIZ results in a corresponding motion of the cursor on the display.
Dependent claims: 2–15.
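The capture, track, and map steps of claim 1 can be sketched as a small stateful class; the PHIZ geometry and screen size here are assumed values, and the class is an illustration rather than the claimed method itself.

```python
class CursorMapper:
    """Capture hand locations in the PHIZ, track motion, map to the display."""

    def __init__(self, phiz_origin, phiz_size, screen):
        self.phiz_origin = phiz_origin  # (x, y) of the PHIZ's lower-left corner
        self.phiz_size = phiz_size      # (width, height) of the PHIZ
        self.screen = screen            # (width, height) of the 2D display
        self.history = []               # captured locations, for motion tracking

    def capture(self, hand_xy):
        # Step 1: capture the hand location within the monitored PHIZ.
        self.history.append(hand_xy)

    def motion(self):
        # Step 2: track the locations to determine motion of the hand,
        # here as the displacement between the two most recent samples.
        if len(self.history) < 2:
            return (0.0, 0.0)
        (x0, y0), (x1, y1) = self.history[-2], self.history[-1]
        return (x1 - x0, y1 - y0)

    def cursor(self):
        # Step 3: map the latest hand location from the PHIZ to the display,
        # so hand motion in the PHIZ yields corresponding cursor motion.
        x, y = self.history[-1]
        u = (x - self.phiz_origin[0]) / self.phiz_size[0]
        v = (y - self.phiz_origin[1]) / self.phiz_size[1]
        return (u * self.screen[0], v * self.screen[1])
```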
16. One or more computer-readable storage media storing instructions which, when executed on one or more processors disposed in a computing device perform a method for controlling a movement of a cursor on a user interface (“UI”), the method comprising the steps of:
using an optical sensor to detect a position of a hand or fingers of a subject within one or more of a plurality of three-dimensional (“3D”) physical interaction zones (“PHIZs”) located in a real world space, each PHIZ being shaped, sized, and located relative to the subject to enable ergonomic motion of the subject's hand or fingers throughout the PHIZ's volume;
dynamically selecting one of the plurality of PHIZs, each PHIZ in the plurality being shaped, sized, and located relative to the subject for tracking ergonomic motion of the subject's hand or fingers about different arm joints, the arm joints comprising one of shoulder, elbow, or wrist;
dynamically applying tuning parameters to the selected PHIZ to adjust at least one of shape of the PHIZ, size of the PHIZ, or location of the PHIZ relative to the subject, the dynamic application being dependent on one of the subject's position in the space, orientation in the space, distance from the optical sensor, past behavior, or UI context;
mapping the detected hand or finger position to a cursor location in a user interface (“UI”) supported by a two-dimensional (“2D”) display; and
moving the cursor in the UI in correspondence with the subject's hand or finger motion within the selected PHIZ.
Dependent claims: 17, 18.
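Claim 16's per-joint zones, dynamic selection, and dynamic tuning can be sketched as follows. The zone dimensions, the UI-context selection rule, and the distance-based scaling are illustrative assumptions only; the claim does not specify particular values or rules.

```python
PHIZS = {
    # Each zone is shaped and sized for ergonomic motion about one arm joint:
    # shoulder motion covers the largest volume, wrist motion the smallest.
    "shoulder": {"size": (0.60, 0.45), "offset": (0.0, -0.10)},
    "elbow":    {"size": (0.35, 0.30), "offset": (0.0, -0.05)},
    "wrist":    {"size": (0.15, 0.12), "offset": (0.0,  0.00)},
}

def select_phiz(ui_context):
    # Dynamic selection driven by UI context: a coarse tile menu can use the
    # large shoulder zone, while fine-grained targets favor wrist motion.
    return {"menu": "shoulder", "text": "wrist"}.get(ui_context, "elbow")

def tune_phiz(joint, distance_from_sensor):
    # Dynamic tuning: scale the selected zone with the subject's distance
    # from the optical sensor, within clamped bounds, so distant users are
    # not asked to make disproportionately large motions.
    zone = dict(PHIZS[joint])
    scale = min(1.5, max(0.75, distance_from_sensor / 2.0))
    w, h = zone["size"]
    zone["size"] = (w * scale, h * scale)
    return zone
```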
19. A system for mapping motion of a user's hand in a three-dimensional (“3D”) space to motion of a cursor in a two-dimensional (“2D”) display, comprising:
at least one processor;
an optical sensor for capturing an orientation of the user within a physical space and for capturing locations of the user's hand within a 3D physical interaction zone (“PHIZ”) that is configured so that the user is able to move the hand to reach all points within the PHIZ in an ergonomic manner; and
memory bearing executable instructions that, when executed by the at least one processor, perform a method comprising the steps of:
using the optical sensor, determining the location of the hand within the PHIZ relative to a known point on the user's body using a 3D coordinate system,
mapping the hand location in 3D coordinates in the PHIZ to 2D coordinates associated with the cursor in the display, and
repeating the determining and mapping steps to move the cursor in the display to correspond with motion of the user's hand.
Dependent claims: 20.
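The determining and mapping steps of claim 19 can be sketched in two functions. The shoulder as the known body point, the arm-reach constant, and the screen size are assumptions for illustration; the claim itself names only "a known point on the user's body."

```python
def hand_relative_to_body(hand_xyz, shoulder_xyz):
    # Determining step: express the hand location in a 3D coordinate system
    # anchored at a known point on the user's body (here, the shoulder).
    return tuple(h - s for h, s in zip(hand_xyz, shoulder_xyz))

def map_to_display(rel_xyz, reach=0.6, screen=(1920, 1080)):
    # Mapping step: convert relative 3D coordinates in the PHIZ to 2D cursor
    # coordinates. `reach` is an assumed comfortable arm extent defining the
    # PHIZ span; z is ignored here, as the cursor position is purely 2D.
    u = (rel_xyz[0] + reach) / (2 * reach)
    v = (reach - rel_xyz[1]) / (2 * reach)  # flip y: up in space is up on screen
    clamp = lambda t: max(0.0, min(1.0, t))
    return (clamp(u) * screen[0], clamp(v) * screen[1])
```

Repeating the two calls each sensor frame moves the cursor in correspondence with the hand, as the claim's final step recites.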
Specification