Gesture-based object measurement method and apparatus
First Claim
1. A gesture-based object measurement method, comprising:
collecting image information of a to-be-measured object and a gesture;
extracting contour information of the to-be-measured object from the collected image information;
determining, according to the gesture, a target measurement area that is in the image information and in which the user is interested when the gesture meets a set condition;
partitioning contour information of the target measurement area off the extracted contour information;
obtaining three-dimensional coordinate values obtained after each endpoint comprised in the partitioned-off contour information is mapped to three-dimensional space; and
calculating, according to the obtained three-dimensional coordinate values, a measurement parameter value of the to-be-measured object corresponding to the contour information of the target measurement area.
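The final calculating step of claim 1, deriving a measurement parameter value such as a length or an area from the three-dimensional coordinates of the contour endpoints, can be sketched as follows. This is a minimal illustration, assuming the partitioned-off contour is a closed planar polygon; the function names and the example rectangle are illustrative and not taken from the specification:

```python
import math

def polyline_length(points):
    """Sum of Euclidean distances between consecutive 3-D endpoints."""
    return sum(
        math.dist(points[i], points[i + 1])
        for i in range(len(points) - 1)
    )

def polygon_area_3d(points):
    """Area of a planar polygon given its 3-D vertices, via the
    cross-product (vector shoelace) formula."""
    sx = sy = sz = 0.0
    n = len(points)
    for i in range(n):
        x1, y1, z1 = points[i]
        x2, y2, z2 = points[(i + 1) % n]
        sx += y1 * z2 - z1 * y2
        sy += z1 * x2 - x1 * z2
        sz += x1 * y2 - y1 * x2
    return 0.5 * math.sqrt(sx * sx + sy * sy + sz * sz)

# Example: a 2 m x 1 m rectangle lying in the z = 0 plane.
rect = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (2.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
print(polyline_length(rect + [rect[0]]))  # perimeter: 6.0
print(polygon_area_3d(rect))              # area: 2.0
```

A volume measurement would require a closed mesh rather than a single contour, which is why the claim leaves the parameter type open ("a measurement parameter value").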
Abstract
In the field of man-machine interaction technologies, a gesture-based object measurement method and apparatus are provided to improve measurement efficiency. According to the method, after image information is collected, contour information of a to-be-measured object is automatically extracted and partitioned off, and a measurement parameter value such as a length, an area, or a volume is calculated on this basis. In this way, real-time online measurement is implemented; the measurement process becomes simpler, more convenient, visual, and effective; augmented reality (AR) measurement efficiency is improved; and the process is more natural, closer to the human intention, and yields a more accurate measurement result.
17 Citations
20 Claims
1. A gesture-based object measurement method, comprising:
collecting image information of a to-be-measured object and a gesture;
extracting contour information of the to-be-measured object from the collected image information;
determining, according to the gesture, a target measurement area that is in the image information and in which the user is interested when the gesture meets a set condition;
partitioning contour information of the target measurement area off the extracted contour information;
obtaining three-dimensional coordinate values obtained after each endpoint comprised in the partitioned-off contour information is mapped to three-dimensional space; and
calculating, according to the obtained three-dimensional coordinate values, a measurement parameter value of the to-be-measured object corresponding to the contour information of the target measurement area.
Dependent claims: 2, 3, 4, 5, 6
7. A gesture-based object measurement apparatus, comprising:
a display;
a sensor configured to collect image information of a to-be-measured object and a gesture;
a processor coupled to the sensor and configured to:
determine a measurement positioning point of the to-be-measured object according to the gesture when the gesture collected by the sensor meets a set condition;
obtain three-dimensional coordinate values obtained after the measurement positioning point is mapped to three-dimensional space;
determine a measurement parameter of the to-be-measured object; and
calculate a value of the measurement parameter of the to-be-measured object according to the measurement parameter and the three-dimensional coordinate values; and
a transceiver coupled to the processor and configured to send the value of the measurement parameter to the display, the display being configured to display the received parameter value.
Dependent claims: 8, 9, 10, 11, 12, 13
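Claim 7 leaves open how a measurement positioning point is "mapped to three-dimensional space". One common realization, not specified by the claim, is pinhole back-projection from a pixel location and a depth reading. The camera intrinsics and pixel values below are hypothetical:

```python
import math

def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project a 2-D pixel (u, v) with a depth reading into
    camera-space 3-D coordinates using pinhole intrinsics
    (focal lengths fx, fy and principal point cx, cy)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def measure_distance(p1, p2):
    """Euclidean distance between two 3-D positioning points."""
    return math.dist(p1, p2)

# Hypothetical intrinsics; two positioning points observed at 1.5 m depth.
fx = fy = 500.0
cx, cy = 320.0, 240.0
a = pixel_to_3d(220.0, 240.0, 1.5, fx, fy, cx, cy)
b = pixel_to_3d(420.0, 240.0, 1.5, fx, fy, cx, cy)
print(measure_distance(a, b))  # ~0.6 m between the two points
```

In an apparatus as claimed, the depth value would come from the sensor (for example, a depth or stereo camera), and the computed distance would be the "value of the measurement parameter" sent to the display.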
14. A gesture-based object measurement apparatus, comprising:
a display;
a sensor configured to collect image information of a to-be-measured object and a gesture;
a processor coupled to the sensor and configured to:
extract contour information of the to-be-measured object from the image information collected by the sensor;
determine, according to the gesture, a target measurement area that is in the image information and in which the user is interested when the gesture collected by the sensor meets a set condition;
partition contour information of the target measurement area off the extracted contour information;
obtain three-dimensional coordinate values obtained after each endpoint comprised in the partitioned-off contour information is mapped to three-dimensional space; and
calculate, according to the obtained three-dimensional coordinate values, a measurement parameter value of the to-be-measured object corresponding to the contour information of the target measurement area; and
a transceiver coupled to the processor and configured to send the measurement parameter value to the display, the display being configured to display the received measurement parameter value.
Dependent claims: 15, 16, 17, 18, 19, 20
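The "set condition" on the gesture and the partitioning of contour information are likewise left open by the claim language. A minimal sketch, assuming the condition is a fingertip dwell (the tracked point stays nearly still for a number of frames) and the target measurement area is an axis-aligned rectangle; both assumptions and all names are illustrative:

```python
def gesture_meets_condition(positions, max_jitter=5.0, min_frames=10):
    """A simple 'set condition': the tracked fingertip has dwelled,
    staying within max_jitter pixels of its mean position for at
    least the last min_frames frames."""
    if len(positions) < min_frames:
        return False
    recent = positions[-min_frames:]
    mx = sum(p[0] for p in recent) / min_frames
    my = sum(p[1] for p in recent) / min_frames
    return all(
        (p[0] - mx) ** 2 + (p[1] - my) ** 2 <= max_jitter ** 2
        for p in recent
    )

def partition_contour(contour, area):
    """Partition off the contour points that fall inside the target
    measurement area, given as a rectangle (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = area
    return [(x, y) for (x, y) in contour if x0 <= x <= x1 and y0 <= y <= y1]

steady = [(100.0, 100.0)] * 12
print(gesture_meets_condition(steady))  # True: the fingertip dwelled
contour = [(1, 1), (5, 5), (20, 20)]
print(partition_contour(contour, (0, 0, 10, 10)))  # [(1, 1), (5, 5)]
```

The dependent claims (not reproduced here) may narrow the condition and the area shape; this sketch only illustrates the general flow of the independent claim.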
Specification