Method for determining a border in a complex scene with applications to image masking
Abstract
A method is used for identifying a border in a digital image that is defined by a plurality of pixels, each pixel being defined by a pixel color and a pixel position indicating a location of the pixel in the digital image. The method includes receiving for each pixel a pixel gradient indicating a direction and magnitude of change in color. A user inputs an area of interest that includes at least a portion of the border to be identified. The method then includes estimating information about an edge zone that models the border portion including estimating a position, direction, and width of the edge zone. The position of the edge zone is estimated by calculating a weighted average value of pixel positions of each pixel in the area of interest. The direction of the edge zone is estimated by calculating a weighted average value of pixel gradient direction of each pixel in the area of interest. The method further includes calculating a measure of confidence in the edge zone information and estimating a width of the edge zone at which the calculated measure of confidence decreases appreciably if the estimated width increases. The border is identified based on the estimated edge zone information. An identified border is used to improve masking an object bound by the identified border from the digital image.
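The abstract's core estimation step can be sketched in code. This is an illustrative reading, not the patent's implementation: the abstract says position and direction come from weighted averages over the area of interest but leaves the weighting function open, so the gradient magnitude is used as the weight here purely as an assumption.

```python
import math

def estimate_edge_zone(pixels):
    """Estimate edge zone position and direction as weighted averages.

    `pixels` is a list of (x, y, gx, gy) tuples: pixel position plus
    color-gradient components. The gradient magnitude serves as the
    weight -- a hypothetical choice the abstract does not specify.
    """
    total_w = wx = wy = wgx = wgy = 0.0
    for x, y, gx, gy in pixels:
        w = math.hypot(gx, gy)          # weight: gradient magnitude
        total_w += w
        wx += w * x                     # weighted pixel positions
        wy += w * y
        wgx += w * gx                   # weighted gradient components
        wgy += w * gy
    if total_w == 0:
        return None
    pos = (wx / total_w, wy / total_w)
    direction = math.atan2(wgy, wgx)    # mean gradient angle, in radians
    return pos, direction
```

For two equally weighted pixels straddling a vertical edge, the estimated position falls midway between them and the direction matches their shared gradient angle.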
187 Citations
91 Claims
-
1. A method for identifying a border in a digital image, the digital image defined by a plurality of pixels, each pixel defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the method comprising:
-
receiving as an input an area of interest that includes at least a portion of the border to be identified;
estimating information about an edge zone that models the border portion, the estimating including:
estimating a position of the edge zone including calculating a weighted average value of pixel positions of each pixel in the area of interest;
estimating a width of the edge zone;
calculating a measure of confidence in the edge zone information; and
estimating a width of the edge zone at which the calculated measure of confidence decreases appreciably if the estimated edge zone width increases; and
using the estimated edge zone information to identify the border. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38)
estimating a center of the area of interest, and weighting each position of a pixel in the area of interest by a third function of a difference between the pixel position and the estimated area of interest center.
-
-
8. The method of claim 7 in which estimating the area of interest center includes accessing a previously estimated edge zone position.
-
9. The method of claim 8 in which receiving the area of interest includes receiving information relating to the area of interest through a user controlled graphical interface device and in which estimating the center of the area of interest includes finding one or more local maxima of a function of the gradients along a path that intersects the graphical interface device and lies parallel to a predetermined direction relating to the edge zone.
-
10. The method of claim 9 in which the predetermined direction relating to the edge zone is a previously estimated normal direction of the identified border.
-
11. The method of claim 9 in which estimating the center of the area of interest includes selecting as the center of the area of interest one of the local maxima including:
-
finding a distance between a local maximum and a previously estimated edge zone position, and calculating a gradient value of the pixels that are crossed along a path connecting the previously estimated edge zone position and the local maximum.
-
-
12. The method of claim 9 in which estimating the center of the area of interest includes selecting as the center of the area of interest one of the local maxima including calculating an average value over pixels along a path between the selected local maximum and a previously estimated edge zone position, the average value corresponding to an amount of agreement in gradient angle with the predetermined direction.
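Claims 11 and 12 select a local maximum by scoring the path back to the previously estimated edge zone position. A minimal sketch of claim 12's scoring, assuming gradient angles are available per pixel along each candidate path and using the cosine of the angular difference as the (illustrative) agreement measure:

```python
import math

def path_agreement(path_angles, predetermined_angle):
    """Average agreement in gradient angle with a predetermined direction.

    `path_angles` holds the gradient angle (radians) at each pixel crossed
    along the path from the previous edge zone position to a candidate
    local maximum. Cosine similarity is an illustrative agreement measure.
    """
    if not path_angles:
        return 0.0
    return sum(math.cos(a - predetermined_angle)
               for a in path_angles) / len(path_angles)

def select_center(candidates, predetermined_angle):
    """Pick the candidate local maximum whose path agrees best.

    `candidates` maps each candidate position to the gradient angles along
    its connecting path (a hypothetical data layout).
    """
    return max(candidates,
               key=lambda c: path_agreement(candidates[c],
                                            predetermined_angle))
```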
-
13. The method of claim 9 in which estimating the center of the area of interest includes selecting as the center of the area of interest one of the local maxima that lies in the identified border.
-
14. The method of claim 1 further comprising receiving for each pixel in the area of interest, a pixel gradient indicating a direction and magnitude of change in color, in which estimating information about the edge zone includes estimating a direction of the edge zone including calculating a weighted average value of pixel gradient directions at each pixel in the area of interest.
-
15. The method of claim 14 in which calculating the measure of confidence in the edge zone information includes calculating an average value of a difference between the estimated edge zone direction and the pixel gradient direction at each pixel over an estimated edge zone area.
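Claim 15's confidence measure -- an average over the edge zone of the difference between the estimated edge zone direction and each pixel's gradient direction -- can be sketched as follows. The mapping of the mean angular difference onto a [0, 1] confidence score is an illustrative choice, not specified by the claim.

```python
import math

def edge_zone_confidence(estimated_angle, gradient_angles):
    """Average angular disagreement mapped to a confidence in [0, 1].

    Perfect alignment of every pixel gradient with the estimated edge
    zone direction gives 1.0; full opposition gives 0.0 (hypothetical
    normalization).
    """
    if not gradient_angles:
        return 0.0
    # Wrap each angular difference into [0, pi] before averaging.
    diffs = [abs(math.atan2(math.sin(a - estimated_angle),
                            math.cos(a - estimated_angle)))
             for a in gradient_angles]
    mean_diff = sum(diffs) / len(diffs)
    return 1.0 - mean_diff / math.pi     # 1.0 = aligned, 0.0 = opposed
```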
-
16. The method of claim 15 in which identifying the border includes:
-
comparing the position of a pixel in the area of interest to the estimated edge zone position to determine a relative position of the pixel; and
calculating an opacity for a pixel in the area of interest based on the relative position.
-
-
17. The method of claim 16 in which comparing the position of the pixel in the area of interest to the estimated edge zone position includes comparing the position of the pixel to the estimated edge zone position along the estimated edge zone direction.
-
18. The method of claim 17 in which calculating the opacity includes calculating a value based on the comparison and the estimated edge zone width.
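Claims 16-18 derive a pixel's opacity from its position relative to the edge zone, compared along the estimated edge zone direction and scaled by the estimated width. A minimal sketch, assuming a linear opacity ramp across the edge zone width (the ramp shape is an assumption, not stated in the claims):

```python
import math

def pixel_opacity(pixel, zone_pos, zone_angle, zone_width):
    """Opacity from a pixel's signed offset along the edge zone direction.

    Projects the pixel's offset from the estimated edge zone position onto
    the estimated direction, then ramps opacity linearly across the
    estimated width -- an illustrative soft-edge model.
    """
    dx = pixel[0] - zone_pos[0]
    dy = pixel[1] - zone_pos[1]
    # Signed distance along the edge zone direction.
    d = dx * math.cos(zone_angle) + dy * math.sin(zone_angle)
    t = d / zone_width + 0.5            # map [-w/2, w/2] onto [0, 1]
    return max(0.0, min(1.0, t))
```

A pixel at the edge zone center gets opacity 0.5; pixels beyond half the width on either side clamp to fully opaque or fully transparent.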
-
19. The method of claim 16 further comprising:
-
receiving as the input a second area of interest, in which the second area of interest includes at least a portion of the border to be identified, and the portion is modeled by a second edge zone, receiving for each pixel in the second area of interest, a pixel gradient indicating a direction and magnitude of change in color, estimating a position of the edge zone for the second area of interest including calculating a weighted average value of pixel positions of each pixel in the second area of interest, estimating a direction of the edge zone for the second area of interest including calculating a weighted average value of gradient directions at each pixel in the second area of interest, estimating a width of the edge zone for the second area of interest;
calculating a measure of confidence in the edge zone direction, the edge zone position, and the edge zone width for the second area of interest, estimating for the second area of interest a width of the edge zone at which the calculated measure of confidence decreases appreciably if the estimated width increases, and calculating an opacity for a pixel in the second area of interest including comparing the position of the pixel in the other area of interest to the estimated edge zone position for the second area of interest to determine another relative position of the pixel.
-
-
20. The method of claim 19 further comprising analyzing the calculated measures of confidence for those areas of interest in which a pixel is included and calculating a border-derived opacity for the pixel based on the analyzed measures of confidence.
-
21. The method of claim 20 in which analyzing the calculated measures of confidence includes calculating a measure of confidence that corresponds to an acceptable measure of confidence and calculating the border-derived opacity includes selecting an opacity corresponding to an acceptable measure of confidence.
-
22. The method of claim 20 in which calculating the border-derived opacity includes weighting each opacity by a corresponding calculated measure of confidence to produce a weighted average value for the opacities for each pixel.
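Claim 22's confidence-weighted combination of per-area opacities for a single pixel can be sketched directly; the zero-total fallback is an added assumption for pixels with no confident coverage.

```python
def border_derived_opacity(opacities, confidences):
    """Combine one pixel's opacities from several areas of interest,
    weighting each opacity by its area's calculated measure of confidence
    to produce a weighted average (per claim 22)."""
    total = sum(confidences)
    if total == 0:
        return 0.0                      # no confident area covers the pixel
    return sum(o * c for o, c in zip(opacities, confidences)) / total
```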
-
23. The method of claim 20 in which calculating the border-derived opacity includes selecting an opacity corresponding to a most recently calculated opacity for a given pixel.
-
24. The method of claim 19 further comprising masking a region of the digital image including receiving input from a user indicating the region to mask.
-
25. The method of claim 24 in which masking the region of the digital image includes:
-
calculating a second opacity for a pixel using a color space computation based on a linear blend model, comparing the border-derived opacity to the second opacity, and calculating a final opacity for the given pixel.
-
-
26. The method of claim 25 in which calculating the final opacity includes analyzing one or more of the measures of confidence in the areas of interest in which the pixel is included.
-
27. The method of claim 25 in which determining the final opacity includes:
-
estimating an error in calculating the border-derived opacity, estimating an error in calculating the second opacity, and selecting a final opacity from the group of the border-derived opacity, the second opacity, and a composite opacity that depends on the border-derived opacity and the second opacity based on the estimated errors.
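Claim 27's selection among the border-derived opacity, the second (color-space) opacity, and a composite can be sketched as an error-driven choice. The 2x dominance threshold and the inverse-error composite weighting are illustrative assumptions; the claim only requires that the selection depend on the estimated errors.

```python
def final_opacity(border_opacity, err_border, second_opacity, err_second):
    """Select a final opacity based on the two estimated errors.

    Returns the border-derived opacity or the second opacity when one
    error is clearly smaller, otherwise an inverse-error weighted
    composite of the two (hypothetical policy).
    """
    if err_border < 0.5 * err_second:
        return border_opacity            # border estimate clearly better
    if err_second < 0.5 * err_border:
        return second_opacity            # color estimate clearly better
    # Comparable errors: inverse-error weighted composite.
    wb = 1.0 / (err_border + 1e-9)
    ws = 1.0 / (err_second + 1e-9)
    return (wb * border_opacity + ws * second_opacity) / (wb + ws)
```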
-
-
28. The method of claim 14 further comprising automatically indicating to a user the estimated edge zone information.
-
29. The method of claim 28 further comprising enabling the user to identify the area of interest using the indicated edge zone information.
-
30. The method of claim 29 further comprising automatically indicating to the user the estimated edge zone position.
-
31. The method of claim 29 further comprising automatically indicating to the user the estimated edge zone direction.
-
32. The method of claim 29 further comprising automatically indicating to the user the estimated edge zone width.
-
33. The method of claim 29 further comprising automatically indicating to the user the calculated measure of confidence.
-
34. The method of claim 19 in which using the estimated edge zone information includes combining the edge zone with the second edge zone.
-
35. The method of claim 34 in which combining the edge zone with the second edge zone includes determining a union of the second edge zone with the edge zone.
-
36. The method of claim 34 in which combining the edge zone with the second edge zone includes combining the opacity for each of the pixels in the area of interest with the opacity for each of the pixels in the second area of interest.
-
37. The method of claim 14 in which the edge zone width is estimated along the estimated edge zone direction.
-
38. The method of claim 1 in which the border includes one or more pixels along a direction normal to a general direction of the border.
-
39. A method for interactively indicating to a user an identified edge zone in a digital image, the digital image defined by a plurality of pixels, each pixel being defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the method comprising:
-
receiving for each pixel in the digital image, a pixel gradient indicating a direction and magnitude of change in color;
receiving as an input an area of interest that includes at least a portion of the edge zone to be identified;
estimating a position and direction of the edge zone based on the pixel positions and gradients in the area of interest;
estimating a width of the edge zone;
calculating a measure of confidence in the estimated edge zone position, width and direction;
based on the estimated position and direction, and calculated measure of confidence, estimating a width of the edge zone;
automatically indicating to the user the estimated width of the edge zone; and
receiving a user modification of a next area of interest based on the indication. - View Dependent Claims (40, 41, 42)
-
-
43. A method for identifying an edge zone in a digital image, the digital image defined by a plurality of pixels, each pixel defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the method comprising:
-
receiving for each pixel in the digital image, a pixel gradient indicating a direction and magnitude of change in color;
receiving as an input an area of interest that includes at least a part of the edge zone to be identified;
receiving a direction based on an area of interest adjacent to the inputted area of interest;
estimating a position and direction of the edge zone based on the positions and gradients at pixels in the area of interest and the received direction;
estimating an initial width of the edge zone;
calculating a measure of confidence in the estimated edge zone position, direction and width; and
based on the calculated measure of confidence and estimated initial width, estimating a width of the edge zone to identify the edge zone.
-
-
44. A method for extracting an object from a background in a digital image, the digital image defined by a plurality of pixels, each pixel being defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the method comprising:
-
receiving as an input an area of interest that includes at least a part of a border, the border relating to an object;
estimating a position, direction, width, and measure of confidence of a portion of the border that lies within the area of interest;
identifying the border based on the estimated position, direction, measure of confidence, and width of the border portion;
estimating an opacity of each pixel in the digital image including, for each pixel, comparing a first opacity calculated using the identified border and a second opacity calculated by analyzing colors in a predetermined neighborhood near the pixel; and
extracting the object from the image including estimating an intrinsic color of a pixel in the object. - View Dependent Claims (45)
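The "second opacity calculated by analyzing colors" of claims 25 and 44 refers to a color-space computation under a linear blend model. A standard least-squares reading of that model, offered as a sketch rather than the patent's actual computation: if an observed color c satisfies c = a*fg + (1-a)*bg for foreground and background colors fg and bg, the alpha is recovered by projecting c onto the bg-to-fg line.

```python
def blend_alpha(color, fg, bg):
    """Second opacity from a linear blend model in color space.

    Projects `color` onto the line from background `bg` to foreground
    `fg` and clamps to [0, 1]. Colors are RGB triples; this is the
    standard least-squares alpha for c = a*fg + (1-a)*bg.
    """
    num = sum((c - b) * (f - b) for c, f, b in zip(color, fg, bg))
    den = sum((f - b) ** 2 for f, b in zip(fg, bg))
    if den == 0:
        return 0.0                      # fg == bg: alpha is undefined
    return max(0.0, min(1.0, num / den))
```

A mid-gray pixel between pure black background and pure white foreground yields an alpha of 128/255.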
-
-
46. A computer-implemented system for identifying a border in a digital image, the digital image defined by a plurality of pixels, the border being modeled by a plurality of edge zones, each pixel being defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the system comprising:
-
a storage device configured to store the digital image; and
a processor configured to:
receive as an input an area of interest that includes at least a portion of the border to be identified;
estimate a position of an edge zone by calculating a weighted average value of pixel positions of each pixel in the area of interest;
estimate an initial width of the edge zone;
calculate a measure of confidence in the edge zone position and initial width;
estimate a width of the edge zone at which the calculated measure of confidence decreases appreciably if the estimated initial width increases to identify the edge zone; and
use the estimated edge zone position and width, and calculated measure of confidence to identify the border.
-
-
47. Computer software, tangibly embodied in a computer-readable medium or a propagated carrier signal, for identifying a border in a digital image defined by a plurality of pixels, the border being modeled by a plurality of edge zones, each pixel being defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the software comprising instructions to perform the following operations:
-
receive as input an area of interest that includes at least a part of an edge zone to be identified;
estimate a position of the edge zone by calculating a weighted average value of pixel positions of each pixel in the area of interest;
estimate an initial width of the edge zone;
calculate a measure of confidence in the edge zone position and initial width;
estimate a width of the edge zone at which the calculated measure of confidence decreases appreciably if the estimated initial width increases to identify the edge zone; and
use the estimated edge zone width and position, and calculated measure of confidence to identify the border. - View Dependent Claims (48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 83, 84)
estimate a center of the area of interest and weight each position of a pixel in the area of interest by a third function of a difference between the pixel position and the estimated area of interest center.
-
-
54. The software of claim 53 in which the instructions to estimate the area of interest center include instructions to access a previously estimated edge zone position.
-
55. The software of claim 54 in which the instructions to receive the area of interest include instructions to receive information relating to the area of interest through a user controlled graphical interface device and in which the instructions to estimate the center of the area of interest include instructions to find one or more local maxima of a function of the gradients along a path that intersects the graphical interface device and lies parallel to a predetermined direction relating to the edge zone.
-
56. The software of claim 55 in which the predetermined direction relating to the edge zone is a previously estimated normal direction of the identified border.
-
57. The software of claim 55 in which the instructions to estimate the center of the area of interest include instructions to select as the center of the area of interest one of the local maxima and instructions to perform the following operations:
-
find a distance between a local maximum and a previously estimated edge zone position, and calculate a gradient value of the pixels that are crossed along a path connecting the previously estimated edge zone position and the local maximum.
-
-
58. The software of claim 55 in which the instructions to estimate the center of the area of interest include instructions to select as the center of the area of interest one of the local maxima, the instructions to select including instructions to calculate an average value over pixels along a path between the selected local maximum and a previously estimated edge zone position, the average value corresponding to an amount of agreement in gradient angle with the predetermined direction.
-
59. The software of claim 55 in which the instructions to estimate the center of the area of interest include instructions to select as the center of the area of interest one of the local maxima that lies in the identified border.
-
60. The software of claim 47 further comprising instructions to receive, for each pixel in the area of interest, a pixel gradient indicating a direction and magnitude of change in color, in which the instructions to estimate information about the edge zone include instructions to estimate a direction of the edge zone that include instructions to calculate a weighted average value of pixel gradient directions at each pixel in the area of interest.
-
61. The software of claim 60 in which the instructions to calculate the measure of confidence in the edge zone information include instructions to calculate an average value of a difference between the estimated edge zone direction and the pixel gradient direction at each pixel over an estimated edge zone area.
-
62. The software of claim 61 in which the instructions to identify the border include instructions to perform the following operations:
-
compare the position of a pixel in the area of interest to the estimated edge zone position to determine a relative position of the pixel; and
calculate an opacity for a pixel in the area of interest based on the relative position.
-
-
-
63. The software of claim 62 in which the instructions to compare the position of the pixel in the area of interest to the estimated edge zone position include instructions to compare the position of the pixel to the estimated edge zone position along the estimated edge zone direction.
-
64. The software of claim 63 in which the instructions to calculate the opacity include instructions to calculate a value based on the comparison and the estimated edge zone width.
-
65. The software of claim 62 further comprising instructions to:
-
receive as the input a second area of interest, in which the second area of interest includes at least a portion of the border to be identified, and the portion is modeled by a second edge zone, receive for each pixel in the second area of interest, a pixel gradient indicating a direction and magnitude of change in color, estimate a position of the edge zone for the second area of interest including calculating a weighted average value of pixel positions of each pixel in the second area of interest, estimate a direction of the edge zone for the second area of interest including calculating a weighted average value of gradient directions at each pixel in the second area of interest, estimate a width of the edge zone for the second area of interest;
calculate a measure of confidence in the edge zone direction, the edge zone position, and the edge zone width for the second area of interest, estimate for the second area of interest a width of the edge zone at which the calculated measure of confidence decreases appreciably if the estimated width increases, and calculate an opacity for a pixel in the second area of interest including comparing the position of the pixel in the other area of interest to the estimated edge zone position for the second area of interest to determine another relative position of the pixel.
-
-
66. The software of claim 65 further comprising instructions to analyze the calculated measures of confidence for those areas of interest in which a pixel is included and to calculate a border-derived opacity for the pixel based on the analyzed measures of confidence.
-
67. The software of claim 66 in which the instructions to analyze the calculated measures of confidence include instructions to calculate a measure of confidence that corresponds to an acceptable measure of confidence and to calculate the border-derived opacity by selecting an opacity corresponding to an acceptable measure of confidence.
-
68. The software of claim 66 in which the instructions to calculate the border-derived opacity include instructions to weight each opacity by a corresponding calculated measure of confidence to produce a weighted average value for the opacities for each pixel.
-
69. The software of claim 66 in which the instructions to calculate the border-derived opacity include instructions to select an opacity corresponding to a most recently calculated opacity for a given pixel.
-
70. The software of claim 65 further comprising instructions to mask a region of the digital image and to receive input from a user indicating the region to mask.
-
71. The software of claim 70 in which the instructions to mask the region of the digital image include instructions to perform the following operations:
-
calculate a second opacity for a pixel using a color space computation based on a linear blend model, compare the border-derived opacity to the second opacity, and calculate a final opacity for the given pixel.
-
-
72. The software of claim 71 in which the instructions to calculate the final opacity include instructions to analyze one or more of the measures of confidence in the areas of interest in which the pixel is included.
-
73. The software of claim 71 in which the instructions to determine the final opacity include instructions to perform the following operations:
-
estimate an error in calculating the border-derived opacity, estimate an error in calculating the second opacity, and select a final opacity from the group of the border-derived opacity, the second opacity, and a composite opacity that depends on the border-derived opacity and the second opacity based on the estimated errors.
-
-
74. The software of claim 60 further comprising instructions to automatically indicate to a user the estimated edge zone information.
-
75. The software of claim 74 further comprising instructions to enable the user to identify the area of interest using the indicated edge zone information.
-
76. The software of claim 75 comprising instructions to automatically indicate to the user the estimated edge zone position.
-
77. The software of claim 75 further comprising instructions to automatically indicate to the user the estimated edge zone direction.
-
78. The software of claim 75 further comprising instructions to automatically indicate to the user the estimated edge zone width.
-
79. The software of claim 75 further comprising instructions to automatically indicate to the user the calculated measure of confidence.
-
80. The software of claim 65 in which the instructions to use the estimated edge zone information include instructions to combine the edge zone with the second edge zone.
-
81. The software of claim 80 in which the instructions to combine the edge zone with the second edge zone include instructions to determine a union of the second edge zone with the edge zone.
-
82. The software of claim 80 in which the instructions to combine the edge zone with the second edge zone include instructions to combine the opacity for each of the pixels in the area of interest with the opacity for each of the pixels in the second area of interest.
-
83. The software of claim 60 in which the instructions to estimate the edge zone width include instructions to estimate the edge zone width along the estimated edge zone direction.
-
84. The software of claim 47 in which the border includes one or more pixels along a direction normal to a general direction of the border.
-
85. Computer software, tangibly embodied in a computer-readable medium or a propagated carrier signal, for interactively indicating to a user an identified edge zone in a digital image, the digital image defined by a plurality of pixels, each pixel being defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the software comprising instructions to perform the following operations:
-
receive for each pixel in the digital image, a pixel gradient indicating a direction and magnitude of change in color, receive as an input an area of interest that includes at least a portion of the edge zone to be identified;
estimate a position and direction of the edge zone based on the pixel positions and gradients in the area of interest;
estimate a width of the edge zone;
calculate a measure of confidence in the estimated edge zone position, width, and direction;
based on the estimated position and direction, and calculated measure of confidence, estimate a width of the edge zone;
automatically indicate to the user the estimated width of the edge zone; and
receive a user modification of a next area of interest based on the indication. - View Dependent Claims (86, 87, 88)
-
-
89. Computer software, tangibly embodied in a computer-readable medium or a propagated carrier signal, for identifying an edge zone in a digital image, the digital image defined by a plurality of pixels, each pixel defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the software comprising instructions to perform the following operations:
-
receive for each pixel in the digital image, a pixel gradient indicating a direction and magnitude of change in color;
receive as an input an area of interest that includes at least a part of the edge zone to be identified;
receive a direction based on an area of interest adjacent to the inputted area of interest;
estimate a position and direction of the edge zone based on the positions and gradients at pixels in the area of interest and the received direction;
estimate an initial width of the edge zone;
calculate a measure of confidence in the estimated edge zone position, direction and width; and
based on the calculated measure of confidence and estimated initial width, estimate a width of the edge zone to identify the edge zone.
-
-
90. Computer software, tangibly embodied in a computer-readable medium or a propagated carrier signal, for extracting an object from a background in a digital image, the digital image defined by a plurality of pixels, each pixel being defined by a pixel color and a pixel position indicating a location of the pixel in the digital image, the software comprising instructions to perform the following operations:
-
receive as an input an area of interest that includes at least a part of a border, the border relating to an object;
estimate a position, direction, width, and measure of confidence of a portion of the border that lies within the area of interest;
identify the border based on the estimated position, direction, measure of confidence, and width of the border portion;
estimate an opacity of each pixel in the digital image including, for each pixel, comparing a first opacity calculated using the identified border and a second opacity calculated by analyzing colors in a predetermined neighborhood near the pixel; and
extract the object from the image including estimating an intrinsic color of a pixel in the object. - View Dependent Claims (91)
-
Specification