Method and system of automatically extracting facial features
Abstract
An automatic feature extraction system for analyzing a face image and finding the facial features of the face. In the pre-processing stage, a second-chance region growing method is applied to determine a face region on the face image. In the feature extraction stage, three primary facial features, including both eyes and the mouth, are extracted first. Then other facial features can be extracted according to these extracted primary facial features. Searching feature points can be achieved by calculating the cost function of each point using a simple feature template. In addition, a genetic algorithm can be used to accelerate the process of searching feature points.
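The abstract's pre-processing stage determines the face region by a "second-chance" region growing method, but this excerpt does not define the second-chance mechanism. The sketch below is therefore only an illustrative guess: a conventional mean-similarity region grow that, if the first pass yields a degenerately small region, retries once with a relaxed threshold. The function names, the 4-connected growth rule, and the minimum-size criterion are all hypothetical.

```python
from collections import deque

def region_grow(image, seed, threshold):
    """Grow a region from `seed`, accepting 4-connected neighbors whose
    intensity differs from the running region mean by at most `threshold`."""
    rows, cols = len(image), len(image[0])
    region = {seed}
    total = image[seed[0]][seed[1]]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region:
                mean = total / len(region)
                if abs(image[nr][nc] - mean) <= threshold:
                    region.add((nr, nc))
                    total += image[nr][nc]
                    queue.append((nr, nc))
    return region

def second_chance_region_grow(image, seed, threshold, relaxed_threshold):
    """Hypothetical 'second-chance' variant: if the strict pass produces a
    region too small to be a face, retry once with a relaxed threshold."""
    region = region_grow(image, seed, threshold)
    if len(region) < 4:  # arbitrary minimum-size criterion for this sketch
        region = region_grow(image, seed, relaxed_threshold)
    return region
```

A bright outlier pixel (e.g. a specular highlight) stays excluded on both passes, while the relaxed pass recovers the rest of a mildly shaded region that the strict pass rejected.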
17 Claims
1. An automatic facial feature extraction system for analyzing a face image, comprising:
a pre-processing unit for generating a face region according to the face image by a second-chance region growing method;
a front-end feature extraction unit, coupled to the pre-processing unit, for dividing the face region into a plurality of primary sub-regions pertaining to primary facial features and extracting primary facial feature data from the face image by searching feature points in image portions of the face image corresponding to the primary sub-regions; and
a back-end feature extraction unit, coupled to the front-end feature extraction unit, for determining a plurality of secondary sub-regions pertaining to secondary facial features according to the position information of the primary facial feature data and extracting secondary facial feature data from the face image by searching feature points in the secondary sub-regions, wherein each estimated area corresponding to the facial feature data of the face image is processed with a post-processing method by a post-processing unit, and the post-processing method comprises the steps of:
generating a threshold image of the estimated area by converting intensity values of points within the estimated area to binary values;
calculating intensity accumulations of boundaries of the threshold image corresponding to the estimated area;
shrinking the estimated area if one of the intensity accumulations equals zero; and
generating the facial feature data corresponding to the estimated area if none of the intensity accumulations equals zero. (Dependent claims 2 and 3 not shown.)
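The four post-processing steps recited above can be sketched as follows. This is a minimal illustration under stated assumptions: the "estimated area" is an axis-aligned rectangle `(top, left, bottom, right)`, the "threshold image" is a 0/1 map produced by a fixed intensity cutoff, and the "intensity accumulations" are the sums of binary values along the four boundary rows and columns. The shrink step trims one empty boundary per iteration; all names and the cutoff are hypothetical.

```python
def binarize(image, area, cutoff):
    """Threshold the points inside `area` = (top, left, bottom, right) to 0/1."""
    top, left, bottom, right = area
    return [[1 if image[r][c] >= cutoff else 0 for c in range(left, right + 1)]
            for r in range(top, bottom + 1)]

def boundary_accumulations(binary):
    """Sums of binary values along the top, bottom, left and right boundaries."""
    return [
        sum(binary[0]),                 # top row
        sum(binary[-1]),                # bottom row
        sum(row[0] for row in binary),  # left column
        sum(row[-1] for row in binary), # right column
    ]

def refine_area(image, area, cutoff):
    """Shrink the estimated area while any boundary of its threshold image
    accumulates to zero; return it once every boundary touches the feature."""
    top, left, bottom, right = area
    while top < bottom and left < right:
        binary = binarize(image, (top, left, bottom, right), cutoff)
        acc_top, acc_bottom, acc_left, acc_right = boundary_accumulations(binary)
        if acc_top == 0:
            top += 1
        elif acc_bottom == 0:
            bottom -= 1
        elif acc_left == 0:
            left += 1
        elif acc_right == 0:
            right -= 1
        else:
            # No empty boundary: the area fits the feature; emit feature data.
            return (top, left, bottom, right)
    return (top, left, bottom, right)
```

Starting from a loose 5x5 search window around a 2x2 bright blob, the loop trims the four empty margins and returns the tight bounding rectangle of the blob.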
4. An automatic facial feature extraction system for analyzing a face image, comprising:
a pre-processing unit for generating a face region according to the face image;
a front-end feature extraction unit, coupled to the pre-processing unit, for dividing the face region into a plurality of primary sub-regions pertaining to primary facial features and extracting primary facial feature data from the face image by searching feature points in image portions of the face image corresponding to the primary sub-regions; and
a back-end feature extraction unit, coupled to the front-end feature extraction unit, for determining a plurality of secondary sub-regions pertaining to secondary facial features according to the position information of the primary facial feature data and extracting secondary facial feature data from the face image by searching feature points in the secondary sub-regions;
wherein the primary facial features include the eyes and the mouth; and
wherein the front-end feature extraction unit comprises:
a partitioning unit for partitioning the face image into a right-eye image portion, a left-eye image portion and a mouth image portion according to the face region generated by the pre-processing unit;
a first extractor, coupled to the partitioning unit, for determining a first estimated area pertaining to the left eye of the primary facial features by a feature-point-searching rule;
a second extractor, coupled to the partitioning unit, for determining a second estimated area pertaining to the right eye of the primary facial features by the feature-point-searching rule;
a third extractor, coupled to the partitioning unit, for determining a third estimated area pertaining to the mouth of the primary facial features by the feature-point-searching rule; and
a post-processing unit, coupled to the first extractor, the second extractor, and the third extractor, for generating the primary facial feature data pertaining to the eyes and the mouth by applying a post-processing method, and the post-processing method comprises the steps of:
generating a threshold image of each estimated area by converting intensity values of points within the estimated area to binary values;
calculating intensity accumulations of boundaries of the threshold image corresponding to the estimated area;
shrinking the estimated area if one of the intensity accumulations equals zero; and
generating the primary facial feature data corresponding to the estimated area if none of the intensity accumulations equals zero. (Dependent claims 5, 6 and 7 not shown.)
an initial population generator to generate an initial population in the genetic algorithm, wherein the initial population is acquired by a spiral function;
a fitness evaluator to determine a fitness value associated with each chromosome in the initial population;
a reproduction unit to determine an interval for each chromosome according to its corresponding fitness value and to select the chromosome into a mating pool according to its corresponding interval;
a mutation unit to perform mutation operations to mutate the chromosomes in the mating pool into candidate chromosomes of the new generation; and
a survival competition unit to determine the estimated areas pertaining to the primary facial features by selecting part of the chromosomes and the candidate chromosomes of the new generation according to their fitness values.
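The genetic-algorithm unit described above can be sketched as a loop over generations. The patent excerpt does not give the spiral function's form, so an Archimedean spiral winding out from the search-window center is assumed for the initial population; roulette-wheel selection stands in for the fitness-proportional "intervals", Gaussian perturbation for the mutation operation, and truncation of the combined parent/offspring pool for survival competition. All parameter values and the example fitness function are illustrative.

```python
import math
import random

def spiral_population(center, size, step=2.0, turn=0.5):
    """Hypothetical spiral initializer: sample candidate points along an
    Archimedean spiral (radius = step * t) winding out from `center`."""
    cx, cy = center
    return [(cx + step * (turn * i) * math.cos(turn * i),
             cy + step * (turn * i) * math.sin(turn * i))
            for i in range(size)]

def evolve(fitness, center, pop_size=20, generations=30, sigma=1.5, seed=0):
    """Sketch of the claimed GA loop: spiral initial population, reproduction
    by fitness-proportional intervals, mutation, and survival competition."""
    rng = random.Random(seed)
    pop = spiral_population(center, pop_size)
    for _ in range(generations):
        scores = [fitness(p) for p in pop]
        total = sum(scores)
        # Reproduction: each chromosome owns an interval proportional to its
        # fitness; spinning the roulette selects it into the mating pool.
        mating_pool = []
        for _ in range(pop_size):
            spin, acc = rng.uniform(0, total), 0.0
            for p, s in zip(pop, scores):
                acc += s
                if spin <= acc:
                    mating_pool.append(p)
                    break
        # Mutation: perturb mating-pool chromosomes into candidate offspring.
        offspring = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma))
                     for x, y in mating_pool]
        # Survival competition: the best pop_size of parents + offspring survive.
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)
```

Because survival competition never discards the best chromosome found so far, fitness is non-decreasing across generations, which is what lets the GA replace an exhaustive feature-point scan.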
8. An automatic facial feature extraction system for analyzing a face image, comprising:
a pre-processing unit for generating a face region according to the face image;
a front-end feature extraction unit, coupled to the pre-processing unit, for dividing the face region into a plurality of primary sub-regions pertaining to primary facial features and extracting primary facial feature data from the face image by searching feature points in image portions of the face image corresponding to the primary sub-regions; and
a back-end feature extraction unit, coupled to the front-end feature extraction unit, for determining a plurality of secondary sub-regions pertaining to secondary facial features according to the position information of the primary facial feature data and extracting secondary facial feature data from the face image by searching feature points in the secondary sub-regions, the secondary facial features including the nose and the eyebrows;
wherein the back-end feature extraction unit comprises:
a partitioning unit for partitioning the face image into two eyebrow image portions and a nose image portion according to the position information of the primary facial feature data generated by the front-end feature extraction unit;
an extractor, coupled to the partitioning unit, for determining three estimated areas pertaining to the secondary facial features by a feature-point-searching rule; and
a post-processing unit, coupled to the extractor, for generating the secondary facial feature data pertaining to the nose and the eyebrows by applying a post-processing method, and the post-processing method comprises the steps of:
generating a threshold image of each estimated area by converting intensity values of points within the estimated area to binary values;
calculating intensity accumulations of boundaries of the threshold image corresponding to the estimated area;
shrinking the estimated area if one of the intensity accumulations equals zero; and
generating the secondary facial feature data corresponding to the estimated area if none of the intensity accumulations equals zero. (Dependent claims 9, 10 and 11 not shown.)
an initial population generator to generate an initial population in the genetic algorithm, wherein the initial population is acquired by a spiral function;
a fitness evaluator to determine a fitness value associated with each chromosome in the initial population;
a reproduction unit to determine an interval for each chromosome according to its corresponding fitness value and to select the chromosome into a mating pool according to its corresponding interval;
a mutation unit to perform mutation operations to mutate the chromosomes in the mating pool into candidate chromosomes of the new generation; and
a survival competition unit to determine the estimated areas pertaining to the primary facial features by selecting part of the chromosomes and the candidate chromosomes of the new generation according to their fitness values.
12. A method for automatically extracting facial features from a face image, comprising the steps of:
determining a face region according to the face image by a second-chance region growing method;
partitioning a plurality of primary sub-regions pertaining to the primary facial features from the face region;
extracting primary facial feature data from the face image by searching feature points in image portions of the face image corresponding to the primary sub-regions;
determining a plurality of secondary sub-regions pertaining to secondary facial features according to the position information of the primary facial feature data; and
extracting secondary facial feature data from the face image by searching feature points in the secondary sub-regions;
wherein each estimated area corresponding to the facial feature data of the face image is processed with a post-processing method, and the post-processing method comprises the steps of:
generating a threshold image of the estimated area by converting intensity values of points within the estimated area to binary values;
calculating intensity accumulations of boundaries of the threshold image corresponding to the estimated area;
shrinking the estimated area if one of the intensity accumulations equals zero; and
generating the facial feature data corresponding to the estimated area if none of the intensity accumulations equals zero. (Dependent claims 13 through 17 not shown.)
an initial population generator to generate an initial population in the genetic algorithm, wherein the initial population is acquired by a spiral function;
a fitness evaluator to determine a fitness value associated with each chromosome in the initial population;
a reproduction unit to determine an interval for each chromosome according to its corresponding fitness value and to select the chromosome into a mating pool according to its corresponding interval;
a mutation unit to perform mutation operations to mutate the chromosomes in the mating pool into candidate chromosomes of the new generation; and
a survival competition unit to determine the estimated areas pertaining to the primary facial features by selecting part of the chromosomes and the candidate chromosomes of the new generation according to their fitness values.
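The claims repeatedly invoke a "feature-point-searching rule", which the abstract describes as calculating a cost function at each point using a simple feature template. A minimal sketch, assuming a sum-of-absolute-differences cost against a small intensity template scanned exhaustively over a sub-region (the genetic algorithm above would replace this exhaustive scan); the function names and template form are hypothetical:

```python
def template_cost(image, template, top, left):
    """Sum of absolute differences between `template` and the image patch
    whose top-left corner is (top, left); lower cost means a better match."""
    return sum(
        abs(image[top + r][left + c] - template[r][c])
        for r in range(len(template))
        for c in range(len(template[0]))
    )

def search_feature_point(image, template, region):
    """Evaluate the cost function at every candidate point in
    `region` = (top, left, bottom, right); return the best point and cost."""
    t_rows, t_cols = len(template), len(template[0])
    top, left, bottom, right = region
    best, best_cost = None, float("inf")
    for r in range(top, min(bottom + 1, len(image) - t_rows + 1)):
        for c in range(left, min(right + 1, len(image[0]) - t_cols + 1)):
            cost = template_cost(image, template, r, c)
            if cost < best_cost:
                best, best_cost = (r, c), cost
    return best, best_cost
```

In the system above, each eye or mouth sub-region would be searched with its own template, and the winning point seeds the estimated area handed to the post-processing unit.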
Specification