TOUCH CONTROL WITH DYNAMIC ZONES AND DISPLAYED ELEMENTS
Abstract
Methods and systems for providing user control of objects or vehicles simulated in a simulated environment or virtual world are described herein. The user control may include various types of touch controls that may be used by a player to move the object or vehicle in a direction and/or at a velocity. The user, in some embodiments, may be able to select which type of touch control is used in the user interface. In some arrangements, the touch control may include a touch zone arrangement that can be changed based on a condition of the virtual world or a player interaction with a user interface. Additionally, in some embodiments, the user control may include one or more controls for removing one or more statuses that may be received by the vehicle.
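The abstract's last feature, a control for removing statuses received by the vehicle, can be sketched as follows; the `Vehicle` class, the status names, and `pressRemovalControl` are illustrative assumptions, not terms from the patent:

```typescript
// Hypothetical sketch: a vehicle accumulates statuses inflicted by the
// virtual world, and a dedicated control removes one when pressed.
class Vehicle {
  private statuses = new Set<string>();

  applyStatus(status: string): void {
    this.statuses.add(status);
  }

  hasStatus(status: string): boolean {
    return this.statuses.has(status);
  }

  // Pressing the removal control clears the status and reports whether
  // anything was actually removed.
  pressRemovalControl(status: string): boolean {
    return this.statuses.delete(status);
  }
}
```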
14 Claims
1. A method, comprising:

generating a user interface for a virtual world that includes a touch control having a touch zone arrangement in which a user may press to control an object of the virtual world; and

changing, responsive to one of a condition of the virtual world and a user interaction with the user interface, the touch control to have a different touch zone arrangement.

Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10.
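Claim 1's method, a touch control whose zone arrangement changes in response to a virtual-world condition or a user interaction, can be sketched roughly as below. The two arrangements, the `vehicleInWater` condition, and every other name here are illustrative assumptions, not details from the patent:

```typescript
// Hypothetical sketch: a touch control with a swappable touch-zone
// arrangement. A world condition selects which arrangement is active,
// and presses are hit-tested against the active arrangement.
type Zone = { id: string; x: number; y: number; width: number; height: number };
type ZoneArrangement = Zone[];

// Example arrangement 1: a four-zone directional pad.
const fourZone: ZoneArrangement = [
  { id: "up",    x: 40, y: 0,  width: 40, height: 40 },
  { id: "down",  x: 40, y: 80, width: 40, height: 40 },
  { id: "left",  x: 0,  y: 40, width: 40, height: 40 },
  { id: "right", x: 80, y: 40, width: 40, height: 40 },
];

// Example arrangement 2: a two-zone forward/backward strip.
const twoZone: ZoneArrangement = [
  { id: "forward",  x: 0, y: 0,  width: 120, height: 60 },
  { id: "backward", x: 0, y: 60, width: 120, height: 60 },
];

// An assumed world condition that triggers the arrangement change,
// e.g. the controlled vehicle entering water.
interface WorldCondition {
  vehicleInWater: boolean;
}

// Choose the active touch-zone arrangement from the world condition.
function selectArrangement(cond: WorldCondition): ZoneArrangement {
  return cond.vehicleInWater ? twoZone : fourZone;
}

// Resolve a press at (px, py) to the zone it falls in, if any.
function hitTest(arrangement: ZoneArrangement, px: number, py: number): string | null {
  for (const z of arrangement) {
    if (px >= z.x && px < z.x + z.width && py >= z.y && py < z.y + z.height) {
      return z.id;
    }
  }
  return null;
}
```

Because presses are re-hit-tested against whichever arrangement is active, the same screen coordinates can issue different commands before and after the arrangement changes.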
11. A method, comprising:

receiving a first gesture input from a user;

responsive to receiving the first gesture input, generating, as part of a user interface for a virtual world, a first touch control element of a dissected touch control at a first location;

controlling an object of the virtual world based on the first touch control element;

receiving a second gesture input from the user;

responsive to receiving the second gesture input, generating, as part of the user interface, a second touch control element of the dissected touch control at a second location different from the first location; and

controlling the object based on the first touch control element.

Dependent claims: 12, 13, 14.
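Claim 11's dissected touch control, whose elements are generated wherever the user's gestures land, might look roughly like this in outline. The class, its fields, and the gesture-to-element mapping are hypothetical assumptions made for illustration:

```typescript
// Hypothetical sketch: a "dissected" touch control whose first and
// second elements are placed at the locations of successive gestures.
type Point = { x: number; y: number };

interface TouchControlElement {
  kind: "first" | "second";
  location: Point;
}

class DissectedTouchControl {
  elements: TouchControlElement[] = [];

  // Each gesture spawns the next control element at the gesture's
  // location: the first gesture yields the first element, a later
  // gesture yields the second element at its own (different) location.
  onGesture(location: Point): TouchControlElement {
    const kind = this.elements.length === 0 ? "first" : "second";
    const element: TouchControlElement = { kind, location };
    this.elements.push(element);
    return element;
  }
}
```

Placing each element where the gesture occurs means the control is assembled around wherever the player's hands already are, rather than at fixed screen positions.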
Specification