METHOD OF CONTROLLING VIRTUAL OBJECT OR VIEW POINT ON TWO DIMENSIONAL INTERACTIVE DISPLAY
Abstract
A method of controlling a viewpoint of a user or a virtual object on a two-dimensional (2D) interactive display is provided. The method may convert a user input to at least 6 degrees of freedom (DOF) structured data according to a number of touch points, a movement direction thereof, and a rotation direction thereof. Any one of the virtual object and the viewpoint of the user may be determined as a manipulation target based on a location of the touch point.
15 Claims
1. A method of controlling a virtual object or a viewpoint of a user on a two-dimensional (2D) interactive display, the method comprising:
verifying, when a number of touch points by a user input is at least two, two touch points most adjacent to each other among the at least two touch points;

determining whether a distance between the verified two touch points is less than a predetermined value;

pairing the verified two touch points when the distance between the verified two touch points is less than the predetermined value;

converting the user input to at least 6 degrees of freedom (DOF) structured data according to at least one of a movement direction of the paired touch points and a rotation direction of the paired touch points; and

controlling a manipulation target using the at least 6DOF structured data.

View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10)
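The verifying, distance-check, and pairing steps of claim 1 can be sketched as follows. This is an illustrative reading, not the patent's implementation: the touch-point representation (x, y tuples) and the threshold value are assumptions, since the claim recites only "a predetermined value".

```python
import math

# Hypothetical threshold; the claim only recites "a predetermined value".
PAIR_DISTANCE_THRESHOLD = 80.0

def find_closest_pair(touch_points):
    """Verify the two touch points most adjacent to each other.

    Returns ((p1, p2), distance), or None when fewer than two points exist.
    """
    if len(touch_points) < 2:
        return None
    best, best_dist = None, float("inf")
    for i in range(len(touch_points)):
        for j in range(i + 1, len(touch_points)):
            (x1, y1), (x2, y2) = touch_points[i], touch_points[j]
            d = math.hypot(x2 - x1, y2 - y1)
            if d < best_dist:
                best, best_dist = (touch_points[i], touch_points[j]), d
    return best, best_dist

def pair_touches(touch_points):
    """Pair the two most adjacent touch points when their distance is
    less than the predetermined value; otherwise return None."""
    result = find_closest_pair(touch_points)
    if result is None:
        return None
    pair, dist = result
    return pair if dist < PAIR_DISTANCE_THRESHOLD else None
```

Only a paired result would then feed the conversion to 6DOF structured data; unpaired touches fall outside this claim limitation.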
11. A method of processing a virtual manipulation medium that is used to control a virtual object or a viewpoint of a user on an interactive display, the method comprising:
receiving, from a user input via the virtual manipulation medium, at least 6 degrees of freedom (6DOF) structured data, wherein the at least 6DOF structured data is generated based on a number of touch points, a movement direction thereof, a location thereof, and a rotation direction thereof;

determining, as a manipulation target, any one of the virtual object, displayed on a screen, and the viewpoint of the user, based on the location of the touch point, wherein the virtual object is determined as the manipulation target when a location of the touch points exists in a region including the virtual object, and the viewpoint of the user is determined as the manipulation target when the location of the touch points exists in a region excluding the virtual object;

displaying, as a user interface, the virtual manipulation medium in any one of a region including the virtual object and a region excluding the virtual object, depending on whether the virtual object is determined as the manipulation target or the viewpoint of the user is determined as the manipulation target; and

applying a video effect or an audio effect to the manipulation target that is any one of the virtual object and the viewpoint of the user.

View Dependent Claims (12, 13, 14)
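Claim 11's target-selection rule (object region vs. everywhere else) can be sketched as a simple hit test. The rectangular region type and the string labels are illustrative assumptions; the claim does not specify the region's geometry.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned region assumed to contain the virtual object on screen."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def select_manipulation_target(touch_location, object_region):
    """Apply claim 11's rule: a touch inside the region including the
    virtual object manipulates the object; a touch in a region excluding
    the object manipulates the user's viewpoint."""
    px, py = touch_location
    return "virtual_object" if object_region.contains(px, py) else "viewpoint"
```

In a full implementation the same location would also decide where the virtual manipulation medium is drawn, per the displaying step of the claim.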
15. A computing system for controlling a virtual object or a viewpoint of a user on a two-dimensional (2D) interactive display, the system comprising:
a multi-touch sensor to recognize one or more touches of the user and to generate one or more touch signals, wherein a virtual manipulation medium is displayed on the interactive display after the multi-touch sensor recognizes the one or more touches;

a multi-touch driver, using at least one processor, to generate multi-touch structured data based on one or more touch signals received from the multi-touch sensor;

a six degree of freedom interaction engine to receive the multi-touch structured data from the multi-touch driver and to generate six degrees of freedom structured data;

a virtual manipulation medium processing engine to generate a command to apply at least one of a video effect and an audio effect to the virtual manipulation medium to correspond to a user gesture using the six degrees of freedom structured data; and

a three-dimensional virtual application to generate a command to perform a graphic process according to motion of the virtual object or according to a movement of a camera, using the six degrees of freedom structured data,

wherein, when the number of touches is two, the displayed virtual manipulation medium has a spherical form which has a diameter which is determined by a distance between two fingers, and the displayed virtual medium includes display of three-dimensional (3D) coordinate axes.
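The two-touch geometry in claim 15's wherein clause can be sketched numerically. The claim fixes only that the sphere's diameter is determined by the distance between the two fingers; centering the sphere at the midpoint of the two touches is an assumption added here for illustration.

```python
import math

def manipulation_sphere(touch_a, touch_b):
    """Compute the spherical virtual manipulation medium for two touches.

    Diameter = distance between the two fingers (per claim 15).
    Center at the midpoint of the touches (assumption, not in the claim).
    Returns (center, diameter) in screen coordinates.
    """
    (x1, y1), (x2, y2) = touch_a, touch_b
    diameter = math.hypot(x2 - x1, y2 - y1)
    center = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    return center, diameter
```

A renderer would draw this sphere together with the 3D coordinate axes recited in the claim, resizing it as the fingers move apart or together.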
Specification