Graphical user interface for a gaming system
First Claim
1. An electronic device for playing a game comprising:
- a user interface comprising a first region and a second region, a first set of selectable objects presented in the first region and a second set of non-selectable objects presented in the second region, the user interface comprising a touch-sensitive display screen configured to sense simultaneous touching operations performed at multiple points of the second region; and
a computing hardware operable to execute a software product, wherein executing the software product results in generating and rendering multiple instances of a graphical object associated with the first set of selectable objects on the touch-sensitive display screen of the user interface, and wherein execution of the software product on the computing hardware causes the computing hardware to:
detect a selection of one or more of the selectable objects presented in the first region of the user interface, wherein the selected one or more selectable objects correspond to one or more resources configured to perform one or more operations on the non-selectable object in the second region;
detect multiple touching operations around the non-selectable object in the second region of the user interface, each touching operation corresponding to a touch point on the second region of the user interface;
deploy a first resource in the form of a first set of multiple graphical objects to a first set of touch points in the second region of the user interface and a second resource in the form of a second set of multiple graphical objects to a second set of touch points in the second region of the user interface;
cause the first resource to perform an action on the non-selectable object and the second resource to perform an action on the non-selectable object; and
reformat and represent the second region of the user interface to show a current state of the non-selectable object after the action performed by the first resource and the action performed by the second resource.
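The flow recited in claim 1 (select resources, detect touch points, deploy one graphical-object instance per touch point, act on the non-selectable object, re-render its state) can be sketched in Python. This is an illustrative model only: the patent specifies no implementation, and every name, class, and value below (`Resource`, `Target`, `deploy_and_act`, the damage numbers) is an assumption.

```python
# Hypothetical sketch of the interaction flow recited in claim 1.
# All names and values are illustrative assumptions, not the patented design.
from dataclasses import dataclass


@dataclass
class Resource:
    """A selectable resource (first region) that can act on a target."""
    name: str
    damage: int


@dataclass
class Target:
    """The non-selectable object rendered in the second region."""
    health: int = 100

    def apply(self, resource: Resource) -> None:
        self.health -= resource.damage


def deploy_and_act(resources, touch_point_sets, target):
    """Deploy one graphical-object instance of each selected resource at
    each of its touch points, have every instance act on the target, and
    return the deployed instances plus the target's state for re-rendering."""
    deployed = []
    for resource, points in zip(resources, touch_point_sets):
        for point in points:
            deployed.append((resource, point))  # one instance per touch point
            target.apply(resource)              # instance acts on the target
    return deployed, target.health


# Example: a first resource at two touch points, a second at one.
archer, giant = Resource("archer", 5), Resource("giant", 20)
deployed, health = deploy_and_act(
    [archer, giant],
    [[(10, 20), (30, 40)], [(50, 60)]],
    Target(health=100),
)
print(len(deployed), health)  # 3 instances deployed; 100 - 5 - 5 - 20 = 70
```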
Abstract
A method and a system for improving a user's experience with a graphical user interface corresponding to a gaming environment. A software product corresponding to the game is executed on the computing hardware of an electronic device. The interface renders multiple graphical objects and user-selectable options corresponding to the graphical objects. The user selects one or more of the selectable options and then performs a touching or swiping operation through multiple points on the display screen. The touching or swiping operation deploys multiple resources corresponding to the selected option at different locations on the interface. To control the deployed resources, the user can swipe through different regions of the display screen as desired. The number of resources deployed at the different locations on the screen depends on certain parameters, including the pressure applied by the user on the screen during the touching or swiping operation.
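The abstract states that the number of deployed resources depends on parameters such as the pressure applied to the screen. A minimal sketch of one such mapping follows; the patent does not disclose a formula, so the linear scaling and the thresholds here are purely assumed.

```python
# Illustrative only: the patent does not specify how touch pressure maps
# to a deployment count, so this linear scaling is an assumption.

def resources_to_deploy(pressure: float, max_count: int = 5) -> int:
    """Map a normalized touch pressure (0.0-1.0) to the number of
    resource instances deployed at a touch point."""
    if pressure <= 0.0:
        return 0
    # Scale pressure linearly into the range 1..max_count.
    return min(max_count, 1 + int(pressure * (max_count - 1)))


# Light, firm, and maximal presses yield increasing deployment counts.
print(resources_to_deploy(0.1), resources_to_deploy(0.9), resources_to_deploy(1.0))
# prints "1 4 5"
```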
20 Claims
1. An electronic device for playing a game comprising:
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 11, 12
10. A method of facilitating user interactions with a graphical user interface, the graphical user interface being generated and rendered on the display of an electronic device by execution of a software product on a computing hardware of the electronic device, wherein execution of the software product on the computing hardware of the electronic device causes the graphical user interface to:
render one or more graphical objects in first and second regions of the graphical user interface, graphical objects in the first region corresponding to one or more user-selectable options, the one or more user-selectable options corresponding to one or more resources to be deployed on the second region of the graphical user interface, and wherein a graphical object in the second region is not selectable;
detect a selection of one or more of the user-selectable options in the first region;
detect a touching operation at different points on the second region of the graphical user interface;
deploy the one or more resources corresponding to the selected user-selectable option at multiple locations on the interface simultaneously, the multiple locations corresponding to the different points of the detected touching operation, wherein the one or more resources are deployed at the multiple locations only if a time duration of the touching operation at the multiple points on the display screen exceeds a predetermined time period;
cause the one or more resources to perform an action on the non-selectable object; and
reformat and represent the second region of the user interface to show a current state of the non-selectable object after the action performed by the one or more resources, the current state being a result of an interaction of the one or more resources with the non-selectable object.
Dependent claims: 13, 14, 15, 16, 17, 18, 19, 20
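The distinguishing limitation of claim 10 is the timing gate: resources are deployed at the touched locations only if the touching operation lasts longer than a predetermined period. A minimal sketch of that gate follows; the threshold value and all names are assumptions, since the patent leaves the period unspecified.

```python
# Hypothetical sketch of the duration gate in claim 10. The threshold
# value and function names are assumptions; the patent does not fix them.

PREDETERMINED_PERIOD = 0.5  # seconds; an assumed threshold


def maybe_deploy(touch_points, touch_duration, period=PREDETERMINED_PERIOD):
    """Return deployment locations only when the touching operation has
    exceeded the predetermined period; otherwise deploy nothing."""
    if touch_duration <= period:
        return []              # too brief: no deployment occurs
    return list(touch_points)  # deploy simultaneously at every touched point


# A brief tap deploys nothing; a sustained touch deploys at every point.
print(maybe_deploy([(1, 2), (3, 4)], touch_duration=0.2))  # []
print(maybe_deploy([(1, 2), (3, 4)], touch_duration=0.8))  # [(1, 2), (3, 4)]
```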
Specification