Panning In A Three Dimensional Environment On A Mobile Device
Abstract
This invention relates to panning in a three dimensional environment on a mobile device. In an embodiment, a computer-implemented method is provided for navigating a virtual camera in a three dimensional environment on a mobile device having a touch screen. A user input is received indicating that an object has touched a first point on a touch screen of the mobile device and the object has been dragged to a second point on the touch screen. A first target location in the three dimensional environment is determined based on the first point on the touch screen. A second target location in the three dimensional environment is determined based on the second point on the touch screen. Finally, a three dimensional model is moved in the three dimensional environment relative to the virtual camera according to the first and second target locations.
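The panning technique the abstract describes can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes a perspective camera at `camera_pos` looking straight down the negative z axis, a ground plane at z = 0 as the intersection surface, and a symmetric frustum given by `fov_deg`; all of those details are assumptions for illustration. Each touched screen point is cast as a ray into the scene to find a target location, and the model is translated by the difference between the two targets so the touched point stays under the finger.

```python
import numpy as np

def screen_to_world(point_px, screen_size, camera_pos, fov_deg=60.0):
    """Cast a ray from the camera through a screen point and intersect it
    with the ground plane (z = 0), returning the 3D target location.

    Assumes a camera at `camera_pos` looking down the -z axis with a
    symmetric perspective frustum (illustrative camera model).
    """
    w, h = screen_size
    # Normalised device coordinates in [-1, 1]; screen y grows downward.
    ndc_x = 2.0 * point_px[0] / w - 1.0
    ndc_y = 1.0 - 2.0 * point_px[1] / h
    half = np.tan(np.radians(fov_deg) / 2.0)
    aspect = w / h
    # Ray direction in camera space (camera looks down -z).
    direction = np.array([ndc_x * half * aspect, ndc_y * half, -1.0])
    # Intersect with the ground plane z = 0: camera_pos.z + t * dir.z = 0.
    t = -camera_pos[2] / direction[2]
    return camera_pos + t * direction

def pan_model(model_offset, first_px, second_px, screen_size, camera_pos):
    """Move the model (not the camera) by the difference between the two
    target locations determined from the first and second touch points."""
    first_target = screen_to_world(first_px, screen_size, camera_pos)
    second_target = screen_to_world(second_px, screen_size, camera_pos)
    return model_offset + (second_target - first_target)
```

Moving the model relative to the camera, rather than the camera itself, is equivalent from the user's point of view but keeps the camera parameters fixed during the drag.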
20 Claims
1. A computer-implemented method for navigating a virtual camera in a three dimensional environment on a mobile device having a touch screen, comprising:
(a) receiving a user input indicating that an object has touched a first point on a touch screen of the mobile device and the object has been dragged to a second point on the touch screen;

(b) determining a first target location in the three dimensional environment based on the first point on the touch screen;

(c) determining a second target location in the three dimensional environment based on the second point on the touch screen; and

(d) moving a three dimensional model in the three dimensional environment relative to the virtual camera according to the first and second target locations.

(Dependent claims: 2, 3, 4, 5, 6, 7, 8)
9. A system for navigating a virtual camera in a three dimensional environment on a mobile device, comprising:
a touch receiver that receives a user input indicating that an object has touched a first point on a touch screen of the mobile device and the object has been dragged to a second point on the touch screen;

a target module that determines a first target location in the three dimensional environment based on the first point on the touch screen and determines a second target location in the three dimensional environment based on the second point on the touch screen; and

a pan module that moves a three dimensional model in the three dimensional environment relative to the virtual camera according to the first and second target locations.

(Dependent claims: 10, 11, 12, 13, 14, 15, 16)
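Claim 9 recasts the method as a system of three components. The decomposition can be sketched as below; the class names follow the claim, but the internals (the pixel-to-world scaling in `TargetModule`, the tuple-based offsets) are placeholder assumptions standing in for a real camera-ray computation.

```python
class TouchReceiver:
    """Receives the drag input: the first and second touch points."""
    def receive(self, first_point, second_point):
        return first_point, second_point

class TargetModule:
    """Determines target locations in the environment from screen points.
    A real implementation would intersect a camera ray with the model;
    this placeholder scales screen pixels to world units."""
    def __init__(self, pixels_per_unit=100.0):
        self.ppu = pixels_per_unit
    def target(self, point):
        x, y = point
        return (x / self.ppu, y / self.ppu, 0.0)

class PanModule:
    """Moves the model relative to the virtual camera by the difference
    between the first and second target locations."""
    def pan(self, model_offset, first_target, second_target):
        return tuple(m + (s - f)
                     for m, f, s in zip(model_offset, first_target, second_target))
```

Usage: a 200-pixel horizontal drag at 100 pixels per unit pans the model two world units.

```python
first, second = TouchReceiver().receive((100, 200), (300, 200))
tm, pm = TargetModule(), PanModule()
offset = pm.pan((0.0, 0.0, 0.0), tm.target(first), tm.target(second))
# offset == (2.0, 0.0, 0.0)
```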
17. A computer-implemented method for navigating a virtual camera in a three dimensional environment on a mobile device having a touch screen, comprising:
(a) receiving a user input indicating that an object has touched a first point on a screen of the mobile device and the object has been dragged to a second point on the touch screen;

(b) receiving an orientation of the mobile device; and

(c) determining a panning mode from a plurality of panning modes based on the orientation of the mobile device.

(Dependent claims: 18, 19, 20)
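Claim 17 selects among panning modes from device orientation. A minimal sketch of such a selection follows; the claim only requires that the mode depend on orientation, so the pitch threshold of 45 degrees and the mode names here are illustrative assumptions, not the claimed modes.

```python
def panning_mode(pitch_deg):
    """Select a panning mode from the device's pitch angle.

    Hypothetical rule: a device held nearly flat (small pitch) pans the
    model across the ground plane; a device held upright pans toward
    the horizon. Threshold and names are assumptions for illustration.
    """
    return "ground-plane" if abs(pitch_deg) < 45.0 else "horizon"
```

A phone lying on a table (`pitch_deg` near 0) would thus select ground-plane panning, while one held facing the user would select horizon panning.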
Specification