Method and apparatus for single channel color image segmentation using local context based adaptive weighting
Abstract
A method and apparatus for single channel color image segmentation using local context based adaptive weighting is provided. The varying weightings of the projection vector are determined as a function of local input image activity context. A Sobel operator is used to calculate the input image activity. A binary map is created for each color channel and is adapted to store binary markers indicative of local activity levels on a per pixel basis. The binary maps are low pass filtered and then normalized to generate a context based adaptive weighting vector for use in single color segmentation of a multi-channel color image signal.
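The pipeline the abstract describes can be sketched end to end in NumPy. This is a minimal illustration under stated assumptions, not the patented implementation: a 3×3 box filter stands in for the M×N pyramidal filter of the dependent claims, edge padding is an arbitrary boundary choice, and all function names are illustrative.

```python
import numpy as np

def activity(chan):
    """Sobel gradient magnitude as the per-pixel activity estimate."""
    p = np.pad(chan.astype(float), 1, mode="edge")
    # Horizontal (sx) and vertical (sy) Sobel responses via shifted slices.
    sx = (p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2])
    sy = (p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:])
    return np.sqrt(sx ** 2 + sy ** 2)

def box_lpf(m, size=3):
    """Box low-pass filter standing in for the M x N pyramidal filter."""
    r = size // 2
    p = np.pad(m.astype(float), r, mode="edge")
    h, w = m.shape
    out = np.zeros((h, w))
    for dy in range(size):
        for dx in range(size):
            out += p[dy:dy + h, dx:dx + w]
    return out / size ** 2

def weighting_vector(R, G, B):
    """Context based adaptive weighting vector w(i,k) for three channels."""
    Rs, Gs, Bs = activity(R), activity(G), activity(B)
    # Winner-take-all binary maps; ties favor green, then red (claim 7).
    Gsb = (Gs >= Rs) & (Gs >= Bs)
    Rsb = ~Gsb & (Rs >= Bs)
    Bsb = ~(Gsb | Rsb)
    Rsb1, Gsb1, Bsb1 = (box_lpf(m) for m in (Rsb, Gsb, Bsb))
    # Normalize so the three weights sum to one at each pixel (claim 9).
    total = Rsb1 + Gsb1 + Bsb1
    return Rsb1 / total, Gsb1 / total, Bsb1 / total
```

Because exactly one binary map is set at every pixel, the three filtered maps sum to one before normalization; the explicit division simply makes the unitary-sum property robust to a different filter choice.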
20 Claims
1. A method of generating a context based adaptive weighting vector for use in single color segmentation of a color image formed of first color channel pixels R(i,k), second color channel pixels G(i,k), and third color channel pixels B(i,k), the method comprising the steps of:
obtaining a first color channel activity estimate Rs(i,k) representation of a measure of local first color channel video signal variation at each first color channel pixel R(i,k) of the color image;
obtaining a second color channel activity estimate Gs(i,k) representation of a measure of local second color channel video signal variation at each second color channel pixel G(i,k) of the color image;
obtaining a third color channel activity estimate Bs(i,k) representation of a measure of local third color channel video signal variation at each third color channel pixel B(i,k) of the color image;
for each pixel (i,k) of the color image, comparing the first color channel activity estimate Rs(i,k) with the second color channel activity estimate Gs(i,k) and with the third color channel activity estimate Bs(i,k) to identify a one of the first, second, and third color channels having greatest activity;
generating a first color channel binary map Rsb(i,k) by storing, for each pixel (i,k) of the color image, a first binary value for pixel locations where the first color channel had said greatest activity and a second binary value for pixel locations where the first color channel did not have said greatest activity;
generating a second color channel binary map Gsb(i,k) by storing, for each pixel (i,k) of the color image, the first binary value for pixel locations where the second color channel had said greatest activity and the second binary value for pixel locations where the second color channel did not have said greatest activity;
generating a third color channel binary map Bsb(i,k) by storing, for each pixel (i,k) of the color image, the first binary value for pixel locations where the third color channel had said greatest activity and the second binary value for pixel locations where the third color channel did not have said greatest activity;
low pass filtering the first color channel binary map Rsb(i,k) to generate a first color low pass filtered binary map Rsb1(i,k);
low pass filtering the second color channel binary map Gsb(i,k) to generate a second color low pass filtered binary map Gsb1(i,k);
low pass filtering the third color channel binary map Bsb(i,k) to generate a third color low pass filtered binary map Bsb1(i,k); and
generating an adaptive weighting vector w(i,k) by combining said first, second, and third color low pass filtered binary maps as w(i,k)=[Rsb1(i,k) Gsb1(i,k) Bsb1(i,k)]′.
2. The method of generating a context based adaptive weighting vector according to claim 1 wherein:
the step of obtaining said first color channel activity estimate Rs(i,k) includes obtaining a red channel activity estimate Rs(i,k) representation of a measure of local red channel video signal variation at each red channel pixel R(i,k) of the color image;
the step of obtaining said second color channel activity estimate Gs(i,k) includes obtaining a green channel activity estimate Gs(i,k) representation of a measure of local green channel video signal variation at each green channel pixel G(i,k) of the color image;
the step of obtaining said third color channel activity estimate Bs(i,k) includes obtaining a blue channel activity estimate Bs(i,k) representation of a measure of local blue channel video signal variation at each blue channel pixel B(i,k) of the color image;
the step of comparing the first color channel activity estimate with the second color channel activity estimate and with the third color channel activity estimate includes comparing said red channel activity estimate Rs(i,k) with said green channel activity estimate Gs(i,k) and with said blue channel activity estimate Bs(i,k) to identify a one of the red, green, and blue channels having greatest activity;
the step of generating said first color channel binary map Rsb(i,k) includes generating a red channel binary map Rsb(i,k) for storing, for each pixel (i,k) of the color image, said first binary value for pixel locations where said red channel had said greatest activity and said second binary value for pixel locations where said red channel did not have said greatest activity;
the step of generating said second color channel binary map Gsb(i,k) includes generating a green channel binary map Gsb(i,k) for storing, for each pixel (i,k) of the color image, said first binary value for pixel locations where said green channel had said greatest activity and said second binary value for pixel locations where said green channel did not have said greatest activity;
the step of generating said third color channel binary map Bsb(i,k) includes generating a blue channel binary map Bsb(i,k) for storing, for each pixel (i,k) of the color image, said first binary value for pixel locations where said blue channel had said greatest activity and said second binary value for pixel locations where said blue channel did not have said greatest activity;
the step of low pass filtering said first color channel binary map includes low pass filtering said red channel binary map Rsb(i,k) to generate a red low pass filtered binary map Rsb1(i,k);
the step of low pass filtering said second color channel binary map includes low pass filtering said green channel binary map Gsb(i,k) to generate a green low pass filtered binary map Gsb1(i,k);
the step of low pass filtering said third color channel binary map includes low pass filtering said blue channel binary map Bsb(i,k) to generate a blue low pass filtered binary map Bsb1(i,k); and
the step of generating said adaptive weighting vector w(i,k) includes generating said adaptive weighting vector w(i,k) by combining said red, green, and blue low pass filtered binary maps as w(i,k)=[Rsb1(i,k) Gsb1(i,k) Bsb1(i,k)]′.
3. The method of generating a context based adaptive weighting vector according to claim 2 wherein:
the step of obtaining said red channel activity estimate Rs(i,k) includes the step of applying a Sobel filter to each of said red channel pixels R(i,k);
the step of obtaining said green channel activity estimate Gs(i,k) includes the step of applying a Sobel filter to each of said green channel pixels G(i,k); and
the step of obtaining said blue channel activity estimate Bs(i,k) includes the step of applying a Sobel filter to each of said blue channel pixels B(i,k).
4. The method of generating a context based adaptive weighting vector according to claim 3 wherein:
the step of applying said Sobel filter to each of said red channel pixels R(i,k) includes applying said Sobel filter in both vertical and horizontal red channel image directions;
the step of applying said Sobel filter to each of said green channel pixels G(i,k) includes applying said Sobel filter in both vertical and horizontal green channel image directions; and
the step of applying said Sobel filter to each of said blue channel pixels B(i,k) includes applying said Sobel filter in both vertical and horizontal blue channel image directions.
5. The method of generating a context based adaptive weighting vector according to claim 4 wherein:
the step of applying the Sobel filter in both said vertical and horizontal red channel image directions includes calculating a norm √(Sx²+Sy²) of vertical and horizontal image direction components as said red channel activity estimate Rs(i,k);
the step of applying the Sobel filter in both said vertical and horizontal green channel image directions includes calculating a norm √(Sx²+Sy²) of vertical and horizontal image direction components as said green channel activity estimate Gs(i,k); and
the step of applying the Sobel filter in both said vertical and horizontal blue channel image directions includes calculating a norm √(Sx²+Sy²) of vertical and horizontal image direction components as said blue channel activity estimate Bs(i,k).
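Claims 3 through 5 amount to computing a Sobel gradient magnitude per channel: two directional responses, then the norm √(Sx²+Sy²). A sketch of that computation, with illustrative names and boundary handling the patent does not prescribe:

```python
import numpy as np

# 3x3 Sobel kernels for the horizontal (x) and vertical (y) responses.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def sobel_norm(chan):
    """Activity estimate: norm sqrt(Sx^2 + Sy^2) of the two Sobel responses."""
    p = np.pad(chan.astype(float), 1, mode="edge")
    h, w = chan.shape
    sx = np.zeros((h, w))
    sy = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            win = p[dy:dy + h, dx:dx + w]
            sx += KX[dy, dx] * win
            sy += KY[dy, dx] * win
    return np.sqrt(sx ** 2 + sy ** 2)
```

A flat channel yields zero activity everywhere, while a step edge yields a high norm along the edge, which is exactly the "local video signal variation" the claims use to pick the dominant channel.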
6. The method of generating a context based adaptive weighting vector according to claim 2 wherein:
the step of generating said red channel binary map Rsb(i,k) includes storing, for each pixel (i,k) of the RGB color image, a logical “1” as said first binary value where the red channel had said greatest activity and storing a logical “0” as said second binary value for pixel locations where the red channel did not have said greatest activity;
the step of generating said green channel binary map Gsb(i,k) includes storing, for each pixel (i,k) of the RGB color image, a logical “1” as said first binary value where the green channel had said greatest activity and storing a logical “0” as said second binary value for pixel locations where the green channel did not have said greatest activity; and
the step of generating said blue channel binary map Bsb(i,k) includes storing, for each pixel (i,k) of the RGB color image, a logical “1” as said first binary value where the blue channel had said greatest activity and storing a logical “0” as said second binary value for pixel locations where the blue channel did not have said greatest activity.
7. The method of generating a context based adaptive weighting vector according to claim 2 wherein the step of comparing the red channel activity estimate Rs(i,k) with the green channel activity estimate Gs(i,k) and with the blue channel activity estimate Bs(i,k) includes comparing the red channel activity estimate Rs(i,k) with the green channel activity estimate Gs(i,k) and with the blue channel activity estimate Bs(i,k) according to:
if (Gs≧Rs && Gs≧Bs) Gsb=1
else if (Rs≧Bs) Rsb=1
else Bsb=1.
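The conditional of claim 7 maps directly onto elementwise array operations. A sketch of the claimed precedence (green wins ties against red and blue, red wins ties against blue), with illustrative function naming:

```python
import numpy as np

def binary_maps(Rs, Gs, Bs):
    """Winner-take-all binary maps with the tie-breaking order of claim 7:
    green beats red and blue on ties, and red beats blue on ties."""
    Gsb = (Gs >= Rs) & (Gs >= Bs)
    Rsb = ~Gsb & (Rs >= Bs)
    Bsb = ~(Gsb | Rsb)
    return Rsb.astype(int), Gsb.astype(int), Bsb.astype(int)
```

By construction exactly one of the three maps holds a logical 1 at every pixel, which is what lets the later low-pass filtering and normalization produce weights that sum to one.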
8. The method of generating a context based adaptive weighting vector according to claim 2 wherein:
the step of low pass filtering the red channel binary map Rsb(i,k) includes low pass filtering the red channel binary map Rsb(i,k) using a first M×N low pass pyramidal filter;
the step of low pass filtering the green channel binary map Gsb(i,k) includes low pass filtering the green channel binary map Gsb(i,k) using said first M×N low pass pyramidal filter; and
the step of low pass filtering the blue channel binary map Bsb(i,k) includes low pass filtering the blue channel binary map Bsb(i,k) using said first M×N low pass pyramidal filter.
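The claim names an M×N low pass pyramidal filter without giving its coefficients here. One common reading of "pyramidal" is a separable triangular kernel; the sketch below constructs such a kernel under that assumption, normalized to unit sum so filtering a binary map yields values in [0, 1].

```python
import numpy as np

def pyramidal_kernel(M, N):
    """M x N pyramidal low-pass kernel: separable triangular weights,
    normalized to unit sum so filtering preserves the local mean."""
    def tri(n):
        # Triangular ramp peaking at the center tap.
        return 1.0 - np.abs(np.arange(n) - (n - 1) / 2) / ((n + 1) / 2)
    k = np.outer(tri(M), tri(N))
    return k / k.sum()
```

For M = N = 3 this gives the familiar 3×3 pyramid with weights proportional to [[1, 2, 1], [2, 4, 2], [1, 2, 1]]; convolving each binary map with it turns hard 0/1 decisions into smoothly varying weights.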
9. The method of generating a context based adaptive weighting vector according to claim 8 wherein the step of low pass filtering the red, green, and blue channel binary maps includes normalizing the red, green, and blue channel binary maps to a unitary sum at each pixel according to:
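The normalization formula of claim 9 is not reproduced in this text. A plausible per-pixel normalization to a unitary sum, with illustrative names and an added guard against all-zero pixels that the claim does not mention, is:

```python
import numpy as np

def normalize_maps(Rsb1, Gsb1, Bsb1, eps=1e-12):
    """Scale the three low pass filtered maps to a unitary sum per pixel."""
    total = Rsb1 + Gsb1 + Bsb1
    total = np.where(total > eps, total, 1.0)  # guard all-zero pixels
    return Rsb1 / total, Gsb1 / total, Bsb1 / total
```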
10. The method of generating a context based adaptive weighting vector according to claim 9 wherein the step of generating said adaptive weighting vector w(i,k) includes the step of combining said red, green, and blue low pass filtered binary maps as:
11. A method of generating a context based adaptive weighting vector for use in single channel segmentation of an image formed of first channel pixels A(i,k) and second channel pixels B(i,k), the method comprising the steps of:
obtaining activity estimate representations of a measure of local channel signal variation at each first channel pixel as As(i,k) and at each second channel pixel as Bs(i,k) of the image;
for each pixel (i,k) of the image, comparing said activity estimate representations to identify a one of the first and second channels having greatest activity;
generating a first channel binary map Asb(i,k) and a second channel binary map Bsb(i,k) by storing in each of the first and second binary maps, for each pixel (i,k) of the image, a first binary value for pixel locations where the respective channel had said greatest activity and a second binary value for pixel locations where the respective channel did not have said greatest activity;
filtering the first channel binary map Asb(i,k) and the second channel binary map Bsb(i,k) to generate a first filtered binary map Asb1(i,k) and a second filtered binary map Bsb1(i,k); and
generating an adaptive weighting vector w(i,k) by combining said first and second filtered binary maps as:
View Dependent Claims (12, 13, 14, 15)
15. The method of generating a context based adaptive weighting vector according to claim 14 wherein the step of generating said adaptive weighting vector w(i,k) includes the step of combining said first and second low pass filtered binary maps as:
16. An apparatus for generating a context based adaptive weighting vector for use in single channel segmentation of an image formed of first channel pixels A(i,k) and second channel pixels B(i,k), the apparatus comprising:
an activity estimator circuit for obtaining activity estimate representations of a measure of local channel signal variation at each first channel pixel as As(i,k) and at each second channel pixel as Bs(i,k) of the image;
a comparator circuit for comparing, for each pixel (i,k) of the image, said activity estimate representations to identify a one of the first and second channels having greatest activity and for generating a first channel binary map Asb(i,k) and a second channel binary map Bsb(i,k) by storing in each of the first and second binary maps, for each pixel (i,k) of the image, a first binary value for pixel locations where the respective channel had said greatest activity and a second binary value for pixel locations where the respective channel did not have said greatest activity;
a filter circuit for filtering the first channel binary map Asb(i,k) and the second channel binary map Bsb(i,k) to generate a first filtered binary map Asb1(i,k) and a second filtered binary map Bsb1(i,k); and
a normalizer circuit for generating an adaptive weighting vector w(i,k) by combining said first and second filtered binary maps as:
View Dependent Claims (17, 18, 19, 20)
20. The apparatus according to claim 19 wherein the normalizer circuit includes a circuit for combining said first and second low pass filtered binary maps as:
Specification