Method of controlling virtual object or viewpoint on two-dimensional interactive display
First Claim
1. A method of controlling a manipulation target, the method comprising:
determining whether a distance between two touch points is less than a predetermined value;
pairing the two touch points when the distance between the two touch points is less than the predetermined value;
controlling the manipulation target based on the paired touch points, wherein the controlling comprises converting a user input by the paired touch points to at least one of a movement along an X axis, a movement along a Y axis, a movement along a Z axis, a roll based on the X axis, a pitch based on the Y axis, and a yaw based on the Z axis, in a three-dimensional (3D) space comprising the X axis, the Y axis, and the Z axis; and
determining, as the manipulation target, any one of a virtual object and a viewpoint of the user, based on the paired touch points, wherein the virtual object is determined as the manipulation target when a location of the paired touch points exists in a region corresponding to the virtual object, and the viewpoint of the user is determined as the manipulation target when the location of the paired touch points exists in a region corresponding to a background.
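The claimed steps (distance test, pairing, and region-based target selection) can be sketched in code. This is a minimal illustration, not the patent's implementation: the threshold value, the rectangle representation of the object region, and all function names are assumptions.

```python
import math

# Illustrative sketch of the claimed method. The threshold, region shape,
# and names are assumed for the example, not taken from the patent.

PAIR_DISTANCE_THRESHOLD = 80.0  # "predetermined value", assumed to be in pixels

def distance(p, q):
    """Euclidean distance between two 2D touch points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pair_touches(p, q, threshold=PAIR_DISTANCE_THRESHOLD):
    """Pair the two touch points only when their distance is below the threshold."""
    if distance(p, q) < threshold:
        return (p, q)
    return None

def select_target(pair, object_region):
    """Pick the virtual object when the paired touches fall inside its
    on-screen region, otherwise the user's viewpoint (background)."""
    if pair is None:
        return None
    mid = ((pair[0][0] + pair[1][0]) / 2, (pair[0][1] + pair[1][1]) / 2)
    x0, y0, x1, y1 = object_region  # axis-aligned rectangle, an assumption
    inside = x0 <= mid[0] <= x1 and y0 <= mid[1] <= y1
    return "virtual_object" if inside else "viewpoint"
```

Here the midpoint of the pair stands in for "a location of the paired touch points"; the claim does not fix how that location is computed.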
Abstract
A method of controlling a viewpoint of a user or a virtual object on a two-dimensional (2D) interactive display is provided. The method may convert a user input to at least 6 degrees of freedom (DOF) structured data according to a number of touch points, a movement direction thereof, and a rotation direction thereof. Any one of the virtual object and the viewpoint of the user may be determined as a manipulation target based on a location of the touch point.
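As a rough illustration of the 6-DOF conversion the abstract describes, a two-touch pair's centroid motion can drive X/Y translation, its change in spread (pinch) Z translation, and its twist a yaw about the Z axis. The mapping below is an assumption for the sketch, not the patent's stated rules; roll and pitch would need further gesture conventions that are omitted here.

```python
import math

# Minimal sketch, assuming one plausible gesture-to-DOF mapping:
# centroid motion -> X/Y translation, pinch -> Z translation,
# pair rotation -> yaw about the Z axis. Roll and pitch omitted.

def centroid(pair):
    (x0, y0), (x1, y1) = pair
    return ((x0 + x1) / 2, (y0 + y1) / 2)

def spread(pair):
    """Distance between the two touches of a pair."""
    (x0, y0), (x1, y1) = pair
    return math.hypot(x1 - x0, y1 - y0)

def angle(pair):
    """Orientation of the segment joining the two touches, in radians."""
    (x0, y0), (x1, y1) = pair
    return math.atan2(y1 - y0, x1 - x0)

def to_6dof(prev_pair, curr_pair):
    """Per-frame deltas for a paired two-touch gesture."""
    pcx, pcy = centroid(prev_pair)
    ccx, ccy = centroid(curr_pair)
    return {
        "dx": ccx - pcx,                              # movement along X
        "dy": ccy - pcy,                              # movement along Y
        "dz": spread(curr_pair) - spread(prev_pair),  # pinch -> Z movement
        "yaw": angle(curr_pair) - angle(prev_pair),   # twist -> yaw about Z
    }
```

For example, dragging both touches 10 px right yields a pure `dx`, while rotating one touch a quarter turn around the other yields a yaw of π/2.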
13 Claims
1. A method of controlling a manipulation target, the method comprising:
determining whether a distance between two touch points is less than a predetermined value;
pairing the two touch points when the distance between the two touch points is less than the predetermined value;
controlling the manipulation target based on the paired touch points, wherein the controlling comprises converting a user input by the paired touch points to at least one of a movement along an X axis, a movement along a Y axis, a movement along a Z axis, a roll based on the X axis, a pitch based on the Y axis, and a yaw based on the Z axis, in a three-dimensional (3D) space comprising the X axis, the Y axis, and the Z axis; and
determining, as the manipulation target, any one of a virtual object and a viewpoint of the user, based on the paired touch points, wherein the virtual object is determined as the manipulation target when a location of the paired touch points exists in a region corresponding to the virtual object, and the viewpoint of the user is determined as the manipulation target when the location of the paired touch points exists in a region corresponding to a background.
Dependent claims: 2, 3, 4, 5, 6, 7.
8. An apparatus of controlling a manipulation target, the apparatus comprising:
a sensor configured to recognize two touch points; and
a controller configured to:
determine whether a distance between the two touch points is less than a predetermined value;
pair the two touch points when the distance between the two touch points is less than the predetermined value;
control the manipulation target based on the paired touch points;
convert a user input by the paired touch points to at least one of a movement along an X axis, a movement along a Y axis, a movement along a Z axis, a roll based on the X axis, a pitch based on the Y axis, and a yaw based on the Z axis, in a three-dimensional (3D) space comprising the X axis, the Y axis, and the Z axis; and
determine, as the manipulation target, any one of a virtual object and a viewpoint of the user, based on the paired touch points, wherein the virtual object is determined as the manipulation target when a location of the paired touch points exists in a region corresponding to the virtual object, and the viewpoint of the user is determined as the manipulation target when the location of the paired touch points exists in a region corresponding to a background.
Dependent claims: 9, 10, 11, 12, 13.
Specification