Control of remote applications using companion device
Abstract
Embodiments of the invention provide underlying communication functionality to enable companion experiences. A companion experience allows the user to interact with content playing on a primary device through a companion device. An application on the companion device interacts with an application running on a base device (e.g., a game console, PC, or TV) to provide additional interface options on the companion that are related to a title or application playing on the base device.
16 Claims
1. A method of using a companion device to control an application running on a primary device, the method comprising:

outputting for display to a screen on a display device a graphical user interface generated by an application running on the primary device;

receiving from the companion device, which has a touch screen display, normalized touch data that describes a touch event comprising an object contacting the touch screen at a plurality of touch points on the touch screen display, the normalized touch data comprising a normalized x coordinate and a normalized y coordinate for each of the touch points, wherein the normalized x coordinate is calculated using the formula xn=x/w, where xn is the normalized x coordinate, x is an x coordinate for a touch point, and w is the width of the touch screen display in pixels;

wherein the normalized y coordinate is calculated using the formula yn=y/h, where yn is the normalized y coordinate, y is a y coordinate for the touch point, and h is the height of the touch screen display in pixels, wherein the touch event is movement across the touch screen;

calculating a distance an object on the graphical user interface is to be moved using the normalized touch data as input and applying a dampening logic that adjusts a delta calculated between touch points using both a dampening coefficient and an acceleration component; and

outputting for display to the display device an updated graphical user interface generated by the application showing the object moved the distance.

Dependent claims: 2, 3, 4, 5, 6.
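The normalization and dampening steps of claim 1 can be sketched as follows. The normalization formulas (xn=x/w, yn=y/h) come directly from the claim; the dampening coefficient, acceleration exponent, and their values are illustrative assumptions, since the patent does not specify them.

```python
def normalize(x, y, w, h):
    """Normalize touch coordinates to [0, 1]: xn = x/w, yn = y/h."""
    return x / w, y / h

def dampened_delta(p0, p1, damp=0.5, accel=1.4):
    """Adjust the raw delta between two normalized touch points using a
    dampening coefficient and an acceleration component.

    `damp` and `accel` are hypothetical values chosen for illustration.
    The acceleration term makes fast swipes move the on-screen object
    disproportionately farther than slow, precise drags.
    """
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    speed = (dx * dx + dy * dy) ** 0.5
    # scale grows with swipe speed; zero movement yields zero delta
    scale = damp * (speed ** (accel - 1)) if speed > 0 else 0.0
    return dx * scale, dy * scale
```

For example, a touch at pixel (640, 360) on a 1280x720 companion screen normalizes to (0.5, 0.5), so the same gesture maps to the same relative position regardless of the companion device's resolution.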
7. One or more computer-storage media comprising computer-executable instructions embodied thereon that, when executed by a computing device, perform a method for using a companion device to manipulate a graphical user interface generated by an application operating on a primary device, the method comprising:

outputting for display on a screen of a display device a graphical user interface ("GUI") generated by the application running on the primary device, the GUI comprising a cursor having a present location;

receiving control information from the companion device, which has a touch screen display, the control information comprising normalized touch data that describes a touch event comprising an object contacting the touch screen at a plurality of touch points on the touch screen display, the normalized touch data comprising a normalized x coordinate and a normalized y coordinate for each of the touch points, wherein the normalized x coordinate is calculated using the formula xn=x/w, where xn is the normalized x coordinate, x is an x coordinate for a touch point, and w is the width of the touch screen display in pixels;

wherein the normalized y coordinate is calculated using the formula yn=y/h, where yn is the normalized y coordinate, y is a y coordinate for the touch point, and h is the height of the touch screen display in pixels, wherein the touch event is movement across the touch screen;

calculating a distance an object on the graphical user interface is to be moved using the normalized touch data as input and applying a dampening logic that adjusts a delta calculated between touch points using both a dampening coefficient and an acceleration component; and

outputting for display an updated GUI showing the object moved the distance.

Dependent claims: 8, 9, 10, 11, 12.
13. A computing system to implement a method for using a companion device to manipulate a graphical user interface generated by an application operating on a primary device, the computing system comprising:

a processor; and

computer storage media having computer-executable instructions stored thereon which, when executed by the processor, configure the computing system to:

activate, at the companion device, a navigation function that enables touch input on a touch screen display that is integrated in the companion device to manipulate the application running on the primary device that is separate from the companion device and has a separate screen;

receive, from the touch screen display, touch data describing points contacted during a touch event comprising an object contacting the touch screen at a plurality of touch points;

convert the touch data to normalized touch data, the normalized touch data comprising a normalized x coordinate and a normalized y coordinate for each of the touch points, wherein the normalized x coordinate is calculated using the formula xn=x/w, where xn is the normalized x coordinate, x is an x coordinate for a touch point, and w is the width of the touch screen display in pixels, and wherein the normalized y coordinate is calculated using the formula yn=y/h, where yn is the normalized y coordinate, y is a y coordinate for the touch point, and h is the height of the touch screen display in pixels;

dynamically adjust a frame rate at which the normalized touch data is communicated to the primary device to match a frame rate of the primary device based on network latency characteristics; and

communicate the normalized touch data to the primary device that is associated with a display device having a primary screen.

Dependent claims: 14, 15, 16.
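The frame-rate adjustment step of claim 13 can be sketched as a simple send-rate heuristic. The function name, the back-off rule, and the cap are illustrative assumptions, not taken from the patent, which only states that the rate is adjusted to match the primary device's frame rate based on network latency characteristics.

```python
def send_interval(primary_fps, latency_ms, max_fps=120.0):
    """Return seconds between normalized-touch updates sent to the primary.

    Targets the primary device's frame rate, but backs off when the measured
    network latency exceeds one frame's budget, so the companion does not
    queue touch samples that would arrive stale.
    """
    frame_budget_ms = 1000.0 / primary_fps
    if latency_ms > frame_budget_ms:
        # network is slower than a frame: send roughly one update per round trip
        effective_fps = min(max_fps, 1000.0 / latency_ms)
    else:
        # network keeps up: send one update per frame of the primary device
        effective_fps = min(max_fps, primary_fps)
    return 1.0 / effective_fps
```

With a 60 fps primary (about a 16.7 ms frame budget), 5 ms of latency keeps updates at one per frame, while 50 ms of latency drops the send rate to 20 updates per second.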