Methods for interactive communications with real time effects and avatar environment interaction
First Claim
1. A computer implemented method for interactively animating an avatar in response to real world input, data of the avatar transmitted between a first user and a second user using a computer program that is executed on at least one computer in a computer network and each of the first user and the second user interacting through a respective computing system that is at least partially executing the computer program, comprising:
identifying components of the avatar representing the first user that can be modified using real-time effects;
identifying controller input from the second user, the controller input being detected by the computing system, and the controller input indicating a selection of one of the identified components of the avatar representing the first user to be modified during play of a game between the first user and the second user, wherein the second user that provides the controller input is capable of interacting via one or more computer processors with the first user and the first user is capable of interacting via one or more computer processors with the second user;
applying the real-time effects onto the selected component of the avatar representing the first user in response to the identified controller input, the applying of the real-time effects onto the identified components acting to augment the avatar of the first user;
providing for display the avatar of the first user, having the applied real-time effects, on a screen connected to the computing system of one or both of the first and second users; and
tracking movements of the first user so that the tracked movements are followed by the avatar of the first user when displayed;
wherein the real-time effects applied onto the selected component of the avatar of the first user move with the movement of the avatar of the first user.
Abstract
A computer implemented method for interactively animating an avatar in response to real world input is provided. The avatar can be transmitted between a first user and a second user using a computer program that is executed on at least one computer in a computer network. Additionally, the first user and the second user each interact using a respective computing system that is at least partially executing the computer program. The method is initiated by identifying components of the avatar representing the first user that can be modified using real-time effects. The method continues with the computing system identifying controller input from either the first user or the second user. The identified controller input is used to determine which of the identified components of the avatar representing the first user will be modified. In response to the identified controller input, the real-time effects are applied to the identified components of the avatar that represent the first user. The avatar of the first user is augmented to reflect the application of the real-time effects. In another operation, the method displays the augmented avatar of the first user on a screen connected to the computing system of one or both of the first and second users.
Citations
19 Claims
1. A computer implemented method for interactively animating an avatar in response to real world input, data of the avatar transmitted between a first user and a second user using a computer program that is executed on at least one computer in a computer network and each of the first user and the second user interacting through a respective computing system that is at least partially executing the computer program, comprising:
identifying components of the avatar representing the first user that can be modified using real-time effects;
identifying controller input from the second user, the controller input being detected by the computing system, and the controller input indicating a selection of one of the identified components of the avatar representing the first user to be modified during play of a game between the first user and the second user, wherein the second user that provides the controller input is capable of interacting via one or more computer processors with the first user and the first user is capable of interacting via one or more computer processors with the second user;
applying the real-time effects onto the selected component of the avatar representing the first user in response to the identified controller input, the applying of the real-time effects onto the identified components acting to augment the avatar of the first user;
providing for display the avatar of the first user, having the applied real-time effects, on a screen connected to the computing system of one or both of the first and second users; and
tracking movements of the first user so that the tracked movements are followed by the avatar of the first user when displayed;
wherein the real-time effects applied onto the selected component of the avatar of the first user move with the movement of the avatar of the first user.
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
11. A computer implemented method for interactively applying animation onto users in response to real world input, data being transmitted between a first user and a second user using a computer program that is executed on at least one computer in a computer network and each of the first user and the second user interacting through a respective computing system that is at least partially executing the computer program, comprising:
identifying components of an image of the second user that can be modified using real-time effects;
identifying controller input from the first user, the controller input being detected by the computing system, and the controller input indicating a selection of one of the identified components of the image of the second user to be modified during play of a game between the first user and the second user, wherein the first user that provides the controller input is capable of interacting via one or more computer processors with the second user and the second user is capable of interacting via one or more computer processors with the first user;
applying the real-time effects onto the selected component of the image of the second user in response to the identified controller input, the applying of the real-time effects onto the selected component acting to augment the image of the second user, without permission of the second user to cause a taunting effect by the first user;
while enabling interactive communication between the first and second users, providing for display the image of the second user, having the applied real-time effects, on a screen connected to the computing system of one or both of the first and second users; and
tracking movements of the second user so that the real-time effects applied onto the image of the second user move as does the second user.
- View Dependent Claims (12, 13, 14)
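Claim 11 differs from claim 1 in that the effect is applied onto an image of the second user and must follow that user's tracked movements. A hedged sketch of this anchoring, with a stand-in `FaceTracker` class in place of real camera-based tracking (all names are illustrative, not from the patent):

```python
# Hypothetical sketch: an effect anchored to a tracked feature in the
# second user's image follows that feature as the user moves.
class FaceTracker:
    """Stand-in for a real camera-based tracker."""

    def __init__(self, position):
        self.position = position

    def update(self, new_position):
        # In a real system this would come from per-frame image analysis.
        self.position = new_position


def place_effect(tracker, effect, offset):
    # The effect is drawn relative to the tracked position, so when the
    # second user moves, the augmentation moves with them.
    x = tracker.position[0] + offset[0]
    y = tracker.position[1] + offset[1]
    return {"effect": effect, "at": (x, y)}


tracker = FaceTracker(position=(100, 40))
frame1 = place_effect(tracker, "donkey_ears", offset=(0, -10))
tracker.update((120, 40))                  # second user moves
frame2 = place_effect(tracker, "donkey_ears", offset=(0, -10))
print(frame1["at"], frame2["at"])          # → (100, 30) (120, 30)
```

The effect's screen position is never stored absolutely; it is recomputed each frame from the tracker output, which is what keeps the augmentation attached to the moving user.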
15. A method for modifying an avatar, the method comprising:
providing a first region for display of a first image, the first image representing a first user;
providing a second region for display of a second image, the second image representing a second user;
providing a third region for display of a shared interactive application, the shared interactive application facilitating an interaction between the first user and the second user, wherein the first, second, and third regions are provided for display on a display screen;
receiving a controller input from the first user, the controller input indicating a selection of a component of the second image to be modified;
applying real-time effects to the second image in response to receiving the controller input, wherein applying the real-time effects to the second image is performed to modify the selected component of the second image, wherein applying the real-time effects to the second image is performed during a session of interaction between the first user and the second user via the shared interactive application; and
tracking movements of the second user so that the real-time effects applied onto the second image of the second user moves as does the second user;
wherein the first image is a captured image of the first user or an avatar representing the first user;
wherein the second image is a captured image of the second user or an avatar representing the second user.
- View Dependent Claims (16, 17, 18, 19)
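Claim 15's three-region screen layout and the routing of the first user's controller input can be sketched with plain dictionaries. The region names, the `"hat"` component, and the `"sparkle"` effect are illustrative assumptions only; no specific UI toolkit is implied by the claim.

```python
# Hypothetical sketch of the claim-15 layout and input routing.
def build_layout(first_image, second_image, shared_app):
    # Three regions provided for display on one display screen.
    return {"region1": first_image, "region2": second_image,
            "region3": shared_app}


def handle_controller_input(layout, selected_component, effect):
    # Input from the first user selects a component of the second image;
    # the effect is applied during the shared interactive session.
    layout["region2"]["effects"][selected_component] = effect
    return layout


layout = build_layout(
    first_image={"kind": "avatar", "effects": {}},     # represents user 1
    second_image={"kind": "captured", "effects": {}},  # represents user 2
    shared_app={"kind": "game", "name": "checkers"},   # shared application
)
handle_controller_input(layout, "hat", "sparkle")
print(layout["region2"]["effects"])  # → {'hat': 'sparkle'}
```

Either region may hold a captured image or an avatar, per the final wherein clauses; the routing logic above is indifferent to which, which is why the `kind` field is only descriptive.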
Specification