Work environment for information sharing and collaboration
First Claim
1. A method implemented in a computer infrastructure comprising a combination of hardware and software, the method comprising:
linking a first user device to a first screen of a work environment and a second user device to a second screen of the work environment, wherein the first user device and the second user device are separate devices from one another and from the first and second screens;
displaying data associated with the first user device on the first screen;
detecting manipulation of the data at the first screen, wherein the detecting the manipulation includes detecting a flick gesture of the data on the first screen;
displaying a copy of the data on the second screen based on the detecting; and
transferring the copy of the data to the second user device based upon acceptance by a user of the copy of the data displayed on the second screen, wherein the work environment comprises a smart table, and wherein the method further comprises:
determining a direction of the flick gesture;
extrapolating a line based on the determined direction;
determining whether the extrapolated line intersects the second screen and a third screen which is located on the smart table based on coordinate data regarding spatial locations of the second screen and the third screen on the smart table;
determining a speed of the flick gesture; and
determining whether an intended recipient of the data on the first screen is the second screen or the third screen based on an initial speed of travel, a rate of deceleration after the initial speed of travel of the flick gesture, and the spatial locations of the second screen and the third screen on the smart table.
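The recipient-determination steps recited above (direction, line extrapolation, intersection with the screens' coordinates on the table, and a speed/deceleration test) amount to a small 2D kinematics routine. The following is a minimal sketch, assuming a flat table coordinate system, axis-aligned screen bounds, a constant-deceleration model, and a sampling-based intersection test; all names and numeric choices are illustrative and not taken from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class Screen:
    """Axis-aligned bounding box of a screen, in smart-table coordinates (assumed layout)."""
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def ray_hits_screen(origin, direction, screen, step=1.0, max_dist=500.0):
    """Walk along the extrapolated line and test whether it enters the screen's box.
    Returns (hit, distance-from-origin) for the first sample inside the box."""
    ox, oy = origin
    dx, dy = direction
    d = 0.0
    while d <= max_dist:
        x, y = ox + dx * d, oy + dy * d
        if screen.x_min <= x <= screen.x_max and screen.y_min <= y <= screen.y_max:
            return True, d
        d += step
    return False, None

def intended_recipient(start, end, dt, decel, screens):
    """Pick the intended recipient screen of a flick gesture.

    start/end: gesture touch points (table coordinates); dt: gesture duration in
    seconds; decel: assumed constant deceleration after release; screens: candidate
    Screen boxes (e.g. the second and third screens on the smart table)."""
    vx = (end[0] - start[0]) / dt
    vy = (end[1] - start[1]) / dt
    speed = math.hypot(vx, vy)          # initial speed of travel
    if speed == 0.0:
        return None
    direction = (vx / speed, vy / speed)  # direction of the flick gesture
    # Kinematic stopping distance under constant deceleration: v^2 / (2a).
    reach = speed ** 2 / (2.0 * decel)
    hits = []
    for s in screens:
        hit, dist = ray_hits_screen(end, direction, s)
        if hit and dist <= reach:       # intersected within the flick's reach
            hits.append((dist, s.name))
    return min(hits)[1] if hits else None
```

Under constant deceleration `a`, a flick released at speed `v` travels `v**2 / (2*a)` before stopping, so a screen counts as the intended recipient only if the extrapolated line reaches it within that stopping distance; the nearest such screen wins.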
Abstract
An approach for collaboration is provided. An approach includes linking a first user device to a first collaboration screen of a work environment and a second user device to a second collaboration screen of the work environment. The approach also includes displaying data associated with the first user device on the first collaboration screen. The approach further includes detecting manipulation of the data at the first collaboration screen. The approach additionally includes displaying a copy of the data on the second collaboration screen based on the detecting.
18 Claims
15. A computer program product comprising a hardware computer usable storage device having readable program code embodied in the storage device, the computer program product includes at least one component operable to:
wirelessly link a first user device, which includes a processor and a display, to a first screen of a work environment;
wirelessly link a second user device, which includes a processor and a display, to a second screen of the work environment;
display, on the first screen, an icon of a file stored on the first user device;
detect manipulation of the icon on the first screen, wherein the detecting the manipulation includes detecting a flick gesture of the data on the first screen;
display a copy of the icon on the second screen based on the detecting the manipulation;
detect one of acceptance and denial at the second screen based on the displaying the copy of the icon;
when acceptance is detected, cause a copy of the file to be stored on the second user device;
when denial is detected, remove the copy of the icon from the second screen,
wherein the work environment comprises a smart table, and wherein the at least one component is further operable to:
determine a direction of the flick gesture;
extrapolate a line based on the determined direction;
determine whether the extrapolated line intersects the second screen and a third screen which is located on the smart table based on coordinate data regarding spatial locations of the second screen and the third screen on the smart table;
determine a speed of the flick gesture; and
determine whether an intended recipient of the data on the first screen is the second screen or the third screen based on an initial speed of travel, a rate of deceleration after the initial speed of travel of the flick gesture, and the spatial locations of the second screen and the third screen on the smart table.
Dependent claims: 16
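The acceptance/denial branch in claim 15 is a simple two-way protocol step: acceptance triggers the file transfer, denial removes the displayed icon copy. A minimal sketch, assuming hypothetical names (`Response`, `handle_flicked_icon`, and the `copy_file` callback) that do not appear in the patent:

```python
from enum import Enum, auto

class Response(Enum):
    """User's reaction at the second screen to the flicked icon copy."""
    ACCEPT = auto()
    DENY = auto()

def handle_flicked_icon(response, second_screen_icons, icon, copy_file):
    """Resolve a flicked icon displayed on the second screen.

    On acceptance, invoke the (assumed) transfer callback so a copy of the file
    is stored on the second user device; on denial, remove the icon copy from
    the second screen's display list."""
    if response is Response.ACCEPT:
        copy_file(icon)                   # transfer the underlying file to the second device
        return "transferred"
    second_screen_icons.remove(icon)      # denial: drop the copy of the icon from the screen
    return "removed"
```

Note that on acceptance the icon copy stays on the second screen; only denial removes it, matching the claim's two distinct branches.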
17. A system comprising:
a work environment comprising screens;
a CPU, a computer readable memory and a computer readable storage media;
program instructions to link a first user device, which includes a processor and a display, to a first one of the screens and a second user device, which includes a processor and a display, to a second one of the screens, wherein the first user device and the second user device are separate devices from one another and from the first one of the screens and from the second one of the screens;
program instructions to display an icon, corresponding to data stored on the first user device, on the first one of the screens;
program instructions to display a copy of the data stored on the first user device on the second one of the screens based on detecting manipulation of the icon of the data on the first one of the screens, wherein the detecting the manipulation includes detecting a flick gesture of the data on the first screen; and
program instructions to transfer the copy of the data displayed on the second one of the screens to the second user device based upon acceptance by a user of the copy of the data displayed on the second screen;
wherein the program instructions are stored on the computer readable storage media for execution by the CPU via the computer readable memory,
wherein the work environment comprises a smart table, and wherein the system further comprises:
program instructions to determine a direction of the flick gesture;
program instructions to extrapolate a line based on the determined direction;
program instructions to determine whether the extrapolated line intersects the second screen and a third screen which is located on the smart table based on coordinate data regarding spatial locations of the second screen and the third screen on the smart table;
program instructions to determine a speed of the flick gesture; and
program instructions to determine whether an intended recipient of the data on the first screen is the second screen or the third screen based on an initial speed of travel and a rate of deceleration after the initial speed of travel of the flick gesture, and the spatial locations of the second screen and the third screen on the smart table.
Dependent claims: 18