Object-drag continuity between discontinuous touch-screens
First Claim
1. Apparatus for manipulating a first object between discontinuous source and target touch-screens of a computer comprising:
a single virtual display, the first object being displayed on the source touch-screen and being known in the virtual display by unique parameters; a buffer in the computer for storing the unique parameters of the first object;
means for triggering manipulation of the first object from the source touch-screen to the target touch-screen;
means for releasing the first object's parameters from the buffer for display of the first object on the target touch-screen; and
program means on the computer for selecting the first object upon a contact action of a pointer and the source touch-screen, for implementing the triggering manipulation means, and for implementing the releasing means.
Abstract
Apparatus and process are provided for dragging or manipulating an object across a non-touch-sensitive discontinuity between touch-sensitive screens of a computer. The object is selected and its parameters are stored in a buffer. The user activates means to trigger manipulation of the object from the source screen to the target screen. In one embodiment, a pointer is manipulated continuously on the source screen to effect the transfer. The object can be latched in a buffer for release when the pointer contacts the target screen, preferably before a timer expires. Alternatively, the object is dragged in a gesture or to impinge a hot switch which directs the computer to release the object on the target screen. In a hardware embodiment, buttons on a wireless pointer can be invoked to specify cut, copy or menu options and to hold the object in the buffer despite a pointer lift. In another software/hardware embodiment, the steps of source-screen and object selection can be aided with eye-tracking and voice-recognition hardware and software.
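The abstract's buffer-based flow (select an object, latch its parameters, trigger the manipulation, release on the target screen, optionally deleting the source copy) can be sketched as follows. This is a minimal illustrative model, not the patented implementation; all names (`DragBuffer`, `DualScreenDrag`, the screen dictionaries) are assumptions.

```python
class DragBuffer:
    """Holds a selected object's unique parameters between screens."""
    def __init__(self):
        self.params = None
        self.latched = False

    def store(self, params):
        # Latch the object's parameters for the cross-screen transfer.
        self.params = dict(params)
        self.latched = True

    def release(self):
        """Return the stored parameters and clear the latch."""
        if not self.latched:
            return None
        params, self.params, self.latched = self.params, None, False
        return params


class DualScreenDrag:
    """Models a drag from a source to a target touch-screen across a
    non-touch-sensitive discontinuity, per the abstract."""
    def __init__(self):
        self.buffer = DragBuffer()
        self.screens = {"source": {}, "target": {}}

    def select(self, object_id):
        # Selecting the object latches its unique parameters in the buffer.
        params = self.screens["source"].get(object_id)
        if params is not None:
            self.buffer.store({"id": object_id, **params})
        return params is not None

    def release_on_target(self, x, y, cut=False):
        # Releasing displays the object on the target screen; a cut option
        # additionally deletes it from the source screen.
        params = self.buffer.release()
        if params is None:
            return False
        obj_id = params.pop("id")
        self.screens["target"][obj_id] = {**params, "x": x, "y": y}
        if cut:
            self.screens["source"].pop(obj_id, None)
        return True
```

A copy option would simply call `release_on_target` with `cut=False`, leaving the source copy in place.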
37 Claims
-
1. Apparatus for manipulating a first object between discontinuous source and target touch-screens of a computer comprising:
-
a single virtual display, the first object being displayed on the source touch-screen and being known in the virtual display by unique parameters; a buffer in the computer for storing the unique parameters of the first object;
means for triggering manipulation of the first object from the source touch-screen to the target touch-screen;
means for releasing the first object's parameters from the buffer for display of the first object on the target touch-screen; and
program means on the computer for selecting the first object upon a contact action of a pointer and the source touch-screen, for implementing the triggering manipulation means, and for implementing the releasing means. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
gesture-recognition means for comparing the gesture of the pointer on the source touch-screen with predefined gestures and wherein, if a gesture is recognized as being one of the predefined gestures, the triggering manipulation means is activated for manipulating the first object to the target touch-screen.
-
-
11. Apparatus for manipulating a first object between discontinuous source and target screens of a single virtual display of a computer, the first object being displayed on the source screen and being known in the virtual display by unique parameters, comprising:
-
means for selecting the first object on the source screen;
a buffer for storing the first object's parameters when it is selected;
means associated with the source screen which, when activated by the user through a predefined motion of the pointer upon and restricted to the source touch-screen, manipulates the first object from the source screen to the target screen; and
means which, when actuated, release the first object's parameters from the buffer for display of the first object on the target screen;
microphone means for receiving voice commands and emitting digitized voice signals; and
voice recognition means for receiving and recognizing digitized voice signals and for determining if a voice command is recognized as having identified a unique parameter of the first object and if a voice command is recognized as having identified a source screen and wherein the means for selecting the first object comprises determining if the identified first object is displayed on the identified source screen. - View Dependent Claims (12)
an eye-tracking interface for detecting which of the source or target screens is being watched by the user; and
wherein the means for selecting the first object comprise determining if the identified first object is displayed on the identified source touch-screen.
-
-
13. A process for manipulating a first object between discontinuous source and target touch-screens of a single virtual display of a computer, the first object being displayed on the source touch-screen and being known in the virtual display by unique parameters, the process comprising the steps of:
-
selecting the first object from the source touch-screen when the first object is contacted by a pointer;
storing the first object's unique parameters in a buffer in the computer when it is selected;
applying a program on the computer to sense contact of the pointer with the touch-screens and to trigger manipulation of the first object from the source touch-screen to the target touch-screen; and
releasing the first object's parameters from the buffer for display of the transferred first object on the target touch-screen. - View Dependent Claims (14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37)
initiating a timer upon latching the buffer, the timer having a predetermined timeout; and
releasing the first object's parameters to the target touch-screen before the timer reaches timeout.
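The timed latch of claims 14 and 15 (initiate a timer when the buffer is latched; release succeeds only before the predetermined timeout) can be sketched as below. The class name, the 3-second default, and the injectable clock are illustrative assumptions.

```python
import time

class TimedLatch:
    """Buffer latch with the predetermined timeout of claims 14-15."""
    def __init__(self, timeout_s=3.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock          # injectable for deterministic testing
        self.params = None
        self.latched_at = None

    def latch(self, params):
        # Latching the buffer initiates the timer (claim 14).
        self.params = params
        self.latched_at = self.clock()

    def release(self):
        # Release succeeds only before the timer reaches timeout (claim 15);
        # after timeout the latched parameters are discarded.
        if self.latched_at is None:
            return None
        expired = (self.clock() - self.latched_at) >= self.timeout_s
        params, self.params, self.latched_at = self.params, None, None
        return None if expired else params
```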
-
-
16. The process as recited in claim 13 further comprising:
-
setting a cut flag which specifies that the first object is to be deleted after release to the target touch-screen;
checking the state of the cut flag upon releasing the first object's parameters to the target touch-screen; and
deleting the first object from the source touch-screen if the cut flag is set.
-
-
17. The process as recited in claim 13 wherein the releasing of the first object's parameters from the buffer comprises touching the pointer to the target touch-screen.
-
18. The process as recited in claim 13 wherein the first object is manipulated to the target touch-screen by:
-
defining a hot switch zone on the source touch-screen;
dragging the pointer and selected first object across the source touch-screen; and
impinging the first object with the hot switch zone for transferring the first object to the target touch-screen.
-
-
19. The process as recited in claim 18 wherein the hot switch zone is a boundary of the source touch-screen.
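A hot switch zone placed at a screen boundary, as in claims 18 and 19, can be modeled as a strip along the edge of the source screen that the dragged object must impinge. The zone geometry, edge names, and thickness below are illustrative assumptions, not values from the patent.

```python
def make_boundary_zone(screen_w, screen_h, edge="right", thickness=20):
    """Return a hot-switch rectangle (x0, y0, x1, y1) along one screen edge
    (claim 19: the hot switch zone is a boundary of the source screen)."""
    zones = {
        "right":  (screen_w - thickness, 0, screen_w, screen_h),
        "left":   (0, 0, thickness, screen_h),
        "top":    (0, 0, screen_w, thickness),
        "bottom": (0, screen_h - thickness, screen_w, screen_h),
    }
    return zones[edge]

def impinges(zone, x, y):
    """True when the dragged object's position falls inside the zone,
    which would trigger transfer to the target screen (claim 18)."""
    x0, y0, x1, y1 = zone
    return x0 <= x <= x1 and y0 <= y <= y1
```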
-
20. The process as recited in claim 13 wherein the first object is manipulated to the target touch-screen by:
-
dragging the pointer and first object across the source touch-screen;
comparing the velocity of the dragged first object against a predetermined drag velocity; and, if the velocity is greater than the predetermined drag velocity, transferring the first object to the target touch-screen.
-
-
21. The process as recited in claim 18 further comprising:
comparing a velocity of the first object when it impinges the hot switch zone against a predetermined drag velocity and, if the first object's velocity is greater than the predetermined drag velocity, transferring the first object to the target touch-screen.
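The velocity test of claims 20 and 21 amounts to estimating drag speed from successive timestamped pointer samples and transferring only when it exceeds the predetermined value. This is a minimal sketch; the sample format `(x, y, t)` and the threshold are assumptions.

```python
def drag_velocity(p0, p1):
    """Speed in pixels/second between two timestamped samples (x, y, t)."""
    (x0, y0, t0), (x1, y1, t1) = p0, p1
    dt = t1 - t0
    if dt <= 0:
        return 0.0
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt

def should_transfer(p0, p1, threshold=500.0):
    # Claims 20-21: transfer only when the drag velocity is greater than
    # the predetermined drag velocity (strict comparison).
    return drag_velocity(p0, p1) > threshold
```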
-
22. The process as recited in claim 18 wherein the hot switch zone is a third object displayed on the source touch-screen.
-
23. The process as recited in claim 22 further comprising:
forming a virtual second object on the target touch-screen when the first object impinges the third object;
mapping the source touch-screen to the display on the target touch-screen;
dragging the pointer over the source touch-screen for dragging the virtual second object over the target touch-screen so that, when the first object is released, the first object is transferred to the target touch-screen at the location of the virtual second object.
-
-
24. The process as recited in claim 22 further comprising:
-
displaying a virtual target screen on the source touch-screen when the first object impinges the third object; and
dragging the pointer over the source touch-screen for scrolling the virtual target screen progressively under the first object on the source touch-screen so that, when the first object is released, the first object is transferred to the target touch-screen to a location corresponding to where the first object was located over the virtual target screen.
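The mapping step in claims 23 and 24, where pointer motion confined to the source touch-screen steers a virtual object over the target display, reduces to a linear mapping of source coordinates onto the target's coordinate space. A minimal sketch, with screen resolutions as assumed example values:

```python
def map_to_target(x, y, src_size, dst_size):
    """Linearly map a point on the source touch-screen onto the target
    screen, so a virtual second object can track the drag (claim 23)."""
    sw, sh = src_size
    dw, dh = dst_size
    return (x * dw / sw, y * dh / sh)
```

With this mapping, releasing the first object at source position `(x, y)` places it at `map_to_target(x, y, ...)` on the target, matching the "location of the virtual second object" language.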
-
-
25. The process as recited in claim 18 further comprising:
-
displaying a menu of options when the first object impinges the hot switch zone; and
selecting an option from the menu so that the first object is transferred to the target touch-screen according to the menu option.
-
-
26. The process as recited in claim 25 wherein one menu option is a copy option for transferring and releasing the first object to the target touch-screen while leaving a copy of the first object on the source touch-screen.
-
27. The process as recited in claim 25 wherein one menu option is a cut option for:
-
transferring and releasing the first object to the target touch-screen; and
deleting the first object from the source touch-screen.
-
-
28. The process as recited in claim 13 wherein the first object is manipulated to the target touch-screen by:
-
dragging the pointer across the source touch-screen as a gesture;
comparing the gesture against pre-determined gestures so that, if it matches a known pre-determined gesture, the first object is transferred to the target touch-screen.
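The gesture comparison in claims 28-30 can be sketched by reducing a pointer stroke to a dominant direction and looking it up in a table of pre-determined gestures. The gesture table (an upward flick for transfer, a rightward flick for copy) is purely an illustrative assumption:

```python
def classify_stroke(points):
    """Reduce a stroke (list of (x, y) samples) to a compass direction."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"   # screen y grows downward

# Hypothetical pre-determined gestures and the manipulation each triggers.
PREDEFINED_GESTURES = {"up": "transfer", "right": "copy"}

def match_gesture(points):
    """Return the triggered manipulation, or None if the gesture is not
    one of the pre-determined gestures (claim 28)."""
    return PREDEFINED_GESTURES.get(classify_stroke(points))
```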
-
-
29. The process as recited in claim 28 wherein the gesture matches a pre-determined copy gesture so that the first object is transferred to the target touch-screen and when released thereto, a copy of the first object remains on the source touch-screen.
-
30. The process as recited in claim 28 wherein the gesture matches a pre-determined cut gesture so that the first object is transferred to the target touch-screen and when released thereto, the first object is deleted from the source touch-screen.
-
31. The process as recited in claim 13 further comprising the steps of:
-
providing a wireless pointer having one or more buttons, the state of buttons being determinable;
touching the pointer to the source touch-screen to select the first object;
actuating a first button on the wireless pointer for latching the first object's parameters in the buffer and maintaining them there until released;
touching the wireless pointer to the target touch-screen at a release point to which the first object is to be dragged; and
actuating the first button for releasing the first object to the target touch-screen.
-
-
32. The process as recited in claim 31 further comprising:
-
actuating a second button on the pointer for displaying a context option menu on either of the source and target touch-screens;
touching the context menu for selecting a first manipulation option therefrom;
touching the pointer to the target touch-screen at a location where the first object is to be released;
actuating the second button on the wireless pointer for displaying the context menu; and
touching the context menu for selecting a second option therefrom for transferring and releasing the first object to the target touch-screen at the release location.
-
-
33. The process as recited in claim 32 wherein an option from the context menu is a copy option so that when the first object is transferred and released to the target touch-screen, a copy of the first object remains on the source touch-screen.
-
34. The process as recited in claim 32 wherein an option from the context menu is a cut option so that when the first object is transferred and released to the target touch-screen, the first object is deleted from the source touch-screen.
-
35. The process as recited in claim 13 wherein the first object is selected by:
-
providing a predetermined voice vocabulary;
providing means for recognizing voice commands by comparing them with the predetermined vocabulary;
receiving voice commands from the user;
recognizing the voice commands by comparing them against the predetermined vocabulary for a match;
determining if a vocabulary match identifies a unique parameter of an object on the touch-screen; and
selecting the object as the first object if the object having the recognized unique parameter is displayed on the source touch-screen.
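The voice-selection steps of claim 35 (match a recognized command against a predetermined vocabulary, then confirm an object with that unique parameter is displayed on the source touch-screen) can be sketched as below. The vocabulary contents and the use of an object's name as its unique parameter are illustrative assumptions:

```python
# Hypothetical predetermined vocabulary of selectable object names.
VOCABULARY = {"circle", "square", "triangle"}

def select_by_voice(command, source_screen_objects):
    """Return the object id if the recognized word both matches the
    vocabulary and names an object displayed on the source touch-screen;
    otherwise return None (no selection)."""
    word = command.strip().lower()
    if word not in VOCABULARY:            # no vocabulary match
        return None
    # Select only if the identified object is actually displayed
    # on the source touch-screen (final step of claim 35).
    return word if word in source_screen_objects else None
```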
-
-
36. The process as recited in claim 13 further comprising the steps of:
-
providing an eye-tracking interface;
detecting if a touch-screen is being watched by the user using the eye-tracking interface;
selecting the detected touch-screen as being the source touch-screen.
-
-
37. The process as recited in claim 36 wherein the first object is manipulated for transferring it from the source touch-screen to the target touch-screen by:
-
tracking the eyes of the user as the user looks from the source touch-screen to the target touch-screen for detecting a cross-discontinuity drag; and
releasing the first object's parameters from the buffer for display of the transferred first object on the target touch-screen.
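The eye-tracking manipulation of claims 36 and 37 can be sketched as watching a stream of gaze events for a source-to-target transition while an object is latched. The event representation (a list of screen names) and function name are assumptions for illustration:

```python
def detect_cross_discontinuity_drag(gaze_events, latched):
    """Return True when the user's gaze moves from the source screen to
    the target screen while an object is latched in the buffer, which
    claim 37 treats as a cross-discontinuity drag."""
    if not latched:
        return False
    for prev, cur in zip(gaze_events, gaze_events[1:]):
        if prev == "source" and cur == "target":
            return True
    return False
```

On detection, the release step of claim 37 would then unload the buffer onto the target touch-screen, as in the buffered-release sketches above the claims.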
-
Specification