Apparatus and method for color calibration
10291823 · 2019-05-14
CPC classification
H04N1/409
ELECTRICITY
H04N1/3871
ELECTRICITY
G06T7/143
PHYSICS
H04N1/6027
ELECTRICITY
International classification
H04N1/407
ELECTRICITY
G06T7/143
PHYSICS
H04N1/409
ELECTRICITY
Abstract
An apparatus and a method for mapping colors of a source image to colors of a reference image, where the apparatus is configured to match a histogram of the source image to a histogram of the reference image to generate a histogram matched image of the source image, generate a conditional probability distribution of the reference image, and detect outliers in the histogram matched image based on the conditional probability distribution.
Claims
1. An apparatus for mapping colors of a source image to colors of a reference image, comprising: a memory comprising instructions; and a processor coupled to the memory, wherein the instructions cause the processor to be configured to: match a histogram of the source image to a histogram of the reference image to generate a histogram matched image of the source image; generate a conditional probability distribution of the reference image by estimating a conditional probability (p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN)) of an intensity (I.sub.c) of a given pixel of the reference image given intensities (I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) of neighboring pixels of the given pixel, wherein the neighboring pixels are selected according to a neighborhood configuration; and detect outliers in the histogram matched image based on the conditional probability distribution.
2. The apparatus of claim 1, wherein the neighborhood configuration is adaptively selected according to the content of the source image or content of the reference image.
3. The apparatus of claim 2, wherein a number of neighbors defined by the neighborhood configuration depends on a texture of the source image and a texture of the reference image, and wherein the number of neighbors is increased for a source image and a reference image with larger homogenous areas or with less texture.
4. The apparatus of claim 2, wherein a number of neighbors defined by the neighborhood configuration depends on a texture of the source image, and wherein the number of neighbors is increased for a source image with larger homogenous area or with less texture.
5. The apparatus of claim 2, wherein a number of neighbors defined by the neighborhood configuration depends on a texture of the reference image, and wherein the number of neighbors is increased for a reference image with larger homogenous area or with less texture.
6. The apparatus of claim 1, wherein the instructions further cause the processor to be configured to detect a matched pixel of the histogram matched image as outlier when, based on the conditional probability distribution, a probability of a combination of the matched pixel and of its neighboring pixels, which are selected according to the neighborhood configuration, is lower than a threshold (T).
7. The apparatus of claim 6, wherein the T is predefined by a user, and wherein the T is comprised between 0.05 and 0.3.
8. The apparatus of claim 6, wherein the instructions further cause the processor to be configured to correct the outliers by correcting an intensity of the matched pixel, which is detected as the outlier based on the conditional probability distribution.
9. The apparatus of claim 8, wherein the intensity of the matched pixel detected as the outlier is corrected by finding the I.sub.c that maximizes the p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN).
10. The apparatus of claim 8, wherein the instructions further cause the processor to be configured to iterate several times the detection of the outliers and the correction of the outliers, and wherein the number of iterations is increased when an amount of texture in the source image or in the reference image decreases.
11. The apparatus of claim 6, wherein the T is computed automatically, and wherein the T is comprised between 0.05 and 0.3.
12. The apparatus of claim 1, wherein the instructions further cause the processor to be configured to detect a matched pixel of the histogram matched image as outlier when, for the matched pixel and its neighboring pixels that are selected according to the neighborhood configuration, the p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) is lower than a threshold (T).
13. The apparatus of claim 1, wherein the instructions further cause the processor to be configured to detect the outliers in the histogram matched image for each color channel.
14. The apparatus of claim 1, wherein the instructions further cause the processor to be configured to generate an outlier map identifying all the outliers detected in the histogram matched image.
15. The apparatus of claim 1, wherein the instructions further cause the processor to be configured to: detect edges of the histogram matched image; generate a corresponding edge map of the reference image and a corresponding edge map of the histogram matched image; generate the conditional probability distribution based on the edge map of the reference image; and detect the outliers in the histogram matched image based on the edge map of the histogram matched image.
16. The apparatus of claim 15, wherein the instructions further cause the processor to be configured to: detect edges of the source image; generate a corresponding edge map of the source image; and process the edge map of the histogram matched image by combining the edge map of the histogram matched image and the edge map of the source image.
17. The apparatus of claim 16, wherein the instructions further cause the processor to be configured to: generate an exposedness map identifying, for each pixel position (x,y), when the pixel at this position (x,y) is better exposed in the histogram matched image or in the source image; and process the edge map of the histogram matched image by combining the edge map of the histogram matched image and the edge map of the source image according to the exposedness map.
18. The apparatus of claim 15, wherein the instructions further cause the processor to be configured to apply an erosion on the edge map of the reference image and on the edge map of the histogram matched image.
19. A method for mapping colors of a source image to the colors of a reference image, comprising: matching a histogram of the source image to a histogram of the reference image to generate a histogram matched image of the source image; generating a conditional probability distribution of the reference image by estimating a conditional probability (p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN)) of an intensity (I.sub.c) of a given pixel of the reference image given intensities (I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) of neighboring pixels of the given pixel, wherein the neighboring pixels are selected according to a neighborhood configuration; and detecting outliers in the histogram matched image based on the conditional probability distribution.
Description
BRIEF DESCRIPTION OF DRAWINGS
(1) The above aspects and implementation forms of the present application will be explained in the following description of specific embodiments in relation to the enclosed drawings.
DETAILED DESCRIPTION OF EMBODIMENTS
(10) The apparatus 56 is adapted to map the colors of at least one source image to the colors of a reference image. In this context, color mapping, also called color calibration or color matching, is the operation of mapping the colors of a source image to the colors of a reference image. The apparatus 56 comprises a histogram matching unit 52 adapted to match the histogram of the source image to the histogram of the reference image to generate a histogram matched image of the source image, a probability distribution computation unit 54 adapted to generate a conditional probability distribution of the reference image, and an outlier detection unit 55 adapted to detect outliers in the histogram matched image based on the conditional probability distribution.
(11) The source image is an image whose color distribution or histogram will be mapped to a reference during a color mapping operation, and the reference image is an image whose color distribution or histogram will be used as a reference during a color mapping operation.
(12) The histogram matching unit 52 is adapted to match the histogram of an input image being the source image to the histogram of an input image being the reference image to generate a histogram matched image of the source image. The histogram matching unit 52 shown in
(13) The histogram matching performed by the histogram matching unit 52 comprises matching the histogram of the source image to the histogram of the reference image, i.e. matching the source histogram to the reference histogram. This matching outputs the histogram matched image of the source image, whose color properties are ideally close to those of the reference image.
(14) The matching can comprise, as known in the art, the following steps to generate the histogram matched image.
(15) At first, a respective histogram is generated from the source image and from the reference image, wherein a histogram is a graphical representation of the distribution of the pixel color intensities of an image. Each of the two generated histograms is then normalized by dividing the histogram values by the total number of pixels in the respective image.
(16) In a further step, the normalized histograms are processed to generate a respective cumulative distribution function (CDF) for the histogram of the source image and for the histogram of the reference image, wherein a CDF describes the likelihood of a random variable X being less than or equal to a specific value x according to the equation F.sub.X(x)=P(X≤x). The two CDFs may be respectively named F.sub.source( ) and F.sub.reference( ). The two CDFs then operate as respective look-up tables. The indices of each look-up table correspond to the grey level values, and the content of each look-up table at a given index corresponds to the value of the CDF. For example, the grey level values range from 0 to 255.
(17) In a further step, for each pixel of the source image having an associated grey level value of I.sub.source, the look-up table of the source image is used to determine the corresponding value of the CDF of the source image. This CDF value is then searched and identified in the look-up table of the reference image to determine the corresponding grey level value I.sub.reference in the reference image. At last, the grey level I.sub.reference is substituted for the grey level I.sub.source in the source image to generate the histogram matched image of the source image. Hence, the two CDFs are used to match every pixel grey level I.sub.source in the source image in the range {0, 255} to a new intensity I.sub.reference which satisfies the following equation:
F.sub.source(I.sub.source)=F.sub.reference(I.sub.reference).
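The CDF-based matching described above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed apparatus; the function name `histogram_match` and the toy images are hypothetical.

```python
import numpy as np

def histogram_match(source, reference, levels=256):
    """Map each grey level of `source` to the reference grey level whose
    CDF value reaches it, i.e. solve
    F_source(I_source) = F_reference(I_reference)."""
    h_src, _ = np.histogram(source, bins=levels, range=(0, levels))
    h_ref, _ = np.histogram(reference, bins=levels, range=(0, levels))
    cdf_src = np.cumsum(h_src) / source.size      # F_source as a look-up table
    cdf_ref = np.cumsum(h_ref) / reference.size   # F_reference as a look-up table
    # Inverse look-up in F_reference: smallest reference level whose CDF
    # reaches the source CDF value.
    mapping = np.searchsorted(cdf_ref, cdf_src).clip(0, levels - 1)
    return mapping.astype(np.uint8)[source]

# Toy example: a dark source image matched to a brighter reference.
rng = np.random.default_rng(0)
source = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)        # dark
reference = rng.integers(128, 256, size=(32, 32), dtype=np.uint8)  # bright
matched = histogram_match(source, reference)
print(matched.min(), matched.max())  # intensities now lie in the reference range
```

The inverse look-up is done here with `searchsorted` rather than a nearest-value search, which guarantees a monotone mapping of grey levels.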
(18) The probability distribution computation unit 54 is adapted to generate a conditional probability distribution of the reference image. Preferably, the probability distribution computation unit 54 is adapted to generate the conditional probability distribution by estimating a conditional probability p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) of an intensity I.sub.c of a given pixel of the reference image given intensities I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN of neighboring pixels of said given pixel, wherein the neighboring pixels are selected according to a neighborhood configuration.
(19) This means that the conditional probabilities used for modeling the color distribution of the reference image are computed. For a selected neighborhood configuration with N neighbors, for each neighbor, the conditional probability p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) is estimated. The values I.sub.c and I.sub.n1 to I.sub.nN represent respectively the intensities of the central pixel of the neighborhood configuration and the N intensities of the remaining pixels of the neighborhood configuration.
(20) The estimated conditional probability is based on the computed joint probabilities p(I.sub.c, I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) and p(I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) from the reference image which contains the targeted color distribution, using a Bayesian framework according to the equation p(a|b)=p(a, b)/p(b). Alternatively, an assumption can be made prior to the estimation of the conditional probability, in which the joint probabilities are assumed to be statistically independent.
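The estimation of p(I.sub.c|I.sub.n1 . . . I.sub.nN) from joint counts via p(a|b)=p(a, b)/p(b) can be sketched as follows for the four-neighbor configuration. This is an illustrative Python sketch; the quantization to 8 intensity bins is an assumption made only to keep the joint tables small.

```python
import numpy as np
from collections import defaultdict

def conditional_distribution(ref, bins=8):
    """Estimate p(I_c | I_n1..I_n4) on a quantized reference image for the
    four-neighbor configuration, via p(a|b) = p(a, b) / p(b)."""
    q = ref.astype(int) * bins // 256            # quantized intensities
    joint = defaultdict(int)                     # counts of (I_c, I_n1..I_n4)
    marg = defaultdict(int)                      # counts of (I_n1..I_n4)
    h, w = q.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            nbrs = (q[y - 1, x], q[y + 1, x], q[y, x - 1], q[y, x + 1])
            joint[(q[y, x],) + nbrs] += 1
            marg[nbrs] += 1
    def p(c, nbrs):
        return joint[(c,) + nbrs] / marg[nbrs] if marg[nbrs] else 0.0
    return p

# On a constant image, matching the neighbors is certain and any other
# intensity has probability zero.
flat = np.full((16, 16), 200, dtype=np.uint8)
p = conditional_distribution(flat)
c = 200 * 8 // 256                               # quantized level of 200
print(p(c, (c, c, c, c)), p(0, (c, c, c, c)))    # 1.0 0.0
```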
(22) According to a first example, the neighborhood configuration (i.e., Neighborhood Configuration 1) comprises the central pixel as well as the four pixels directly adjacent to the central pixel, i.e. the four pixels above 1, under 3, to the left 2 and to the right 4 of the central pixel. The second example (i.e., Neighborhood Configuration 2) of
(23) The neighborhood configuration used by the probability distribution computation unit 54 of the apparatus 56, is preferably adaptively selected according to the content of the source image and/or of the reference image. The choice of the neighborhood configuration can be adaptively set to fit the nature of the input images. The adaptive selection can be carried out depending on the source image or the reference image since both input images are supposed to comprise similar contents, or even depending on both input images.
(24) The number of neighbors defined by the neighborhood configuration preferably depends on the texture of the source image and/or of the reference image. Preferably, the number of neighbors is increased for a source image and/or reference image with larger homogenous areas. Alternatively or in addition thereto, the number of neighbors is increased for a source image and/or reference image presenting less texture. For input images that are rich in textures, configurations with a smaller number of neighbors are more suitable, like in the first configuration of
(25) For the adaptive selection of the neighborhood configuration, the apparatus 56 can comprise a memory (not shown) adapted to store different neighborhood configurations like the configurations shown in
(27) The previously computed conditional probability distribution is used to detect histogram matching artifacts and noise, i.e. outliers. The underlying idea is to check, for each pixel inside the histogram matched image and according to the designated configuration, whether this combination of the central pixel intensity and its neighbors is likely or not.
(28) Preferably, the outlier detection unit 55 is adapted to detect a matched pixel of the histogram matched image as outlier if, based on the conditional probability distribution, the probability of a combination of the matched pixel and of its neighboring pixels, which are selected according to the neighborhood configuration, is lower than a threshold T.
(29) Preferably, the outlier detection unit 55 is adapted to detect a matched pixel of the histogram matched image as outlier if, for the matched pixel and its neighboring pixels that are selected according to the neighborhood configuration, the conditional probability p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN) is lower than a threshold T. The neighborhood configuration is preferably the configuration already selected by the apparatus within the context of the probability distribution computation unit 54 for generating the conditional probability distribution.
(30) Therefore, in the outlier detection unit 55, histogram matching outliers are detected given the previously estimated conditional probability distribution and the threshold T. The proposed approach checks whether the distribution of the pixel under consideration and its neighbors in the histogram matched image is likely or not: if p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN)>T, it is likely that the pixel corresponds to a correct match, and if p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN)<T, it is likely that the pixel is a mismatch, i.e. an outlier.
(31) If the conditional probability is smaller than the threshold T, then it is assumed that the central pixel (pixel under consideration) is erroneous, thus considered to be an outlier.
(32) The threshold T, which is preferably stored in the apparatus, can be predefined by a user or can be computed automatically. A user can for example send the value of the threshold T to the apparatus or store the value directly in the apparatus. The value of the threshold T can be computed based on prior experiments, i.e. empirically. For example, if in prior experiments the number of matched pixels detected as outliers is too high, i.e. if the number of detected outliers is too high with respect to the actual number of outliers, then the threshold T can be reduced to detect fewer outliers in the next color mapping operation. In the opposite case, the threshold T can be increased to detect more outliers in the next color mapping operation. The threshold T can preferably be set to a value between 0.05 and 0.3.
(33) The outlier detection unit 55 is preferably adapted to detect outliers in the histogram matched image for each color channel. That is, outliers can be detected in the histogram matched image for the red channel, the green channel, and the blue channel. The detection is performed independently for each channel. The color channels, for which the outlier detection is carried out, depend on the chosen color model. In case the color mapping is based on the red, green, and blue (RGB) color model, outliers are detected within each of the red, green, and blue channels. For the alternative cyan, magenta, yellow, and black (CMYK) color model, outliers are detected within each of the four cyan, magenta, yellow, and black channels. In case the input images, i.e. the source image and the reference image, are grayscale images, only one channel is available and outliers are detected for only this channel.
(34) The outlier detection unit 55 is preferably adapted to generate an outlier map identifying all outliers detected in the histogram matched image. The outlier map is preferably a binary outlier map, i.e. a map that indicates the locations or coordinates of the histogram matching outliers. The white pixels of the binary outlier map may indicate the locations of probable histogram matching-related outliers, whereas the black pixels then indicate pixels that have not been detected as outliers. Conversely, the black pixels of the binary outlier map can instead indicate the detected outliers.
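The thresholded detection and the resulting binary outlier map can be sketched as follows. This is an illustrative Python sketch; the toy conditional model and the choice T=0.1, taken from the stated 0.05-0.3 range, are assumptions.

```python
import numpy as np

def detect_outliers(matched, cond_prob, T=0.1, bins=8):
    """Binary outlier map: a matched pixel is flagged when the conditional
    probability of its (quantized) intensity given its four neighbors
    falls below the threshold T."""
    q = matched.astype(int) * bins // 256
    h, w = q.shape
    outliers = np.zeros((h, w), dtype=bool)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            nbrs = (q[y - 1, x], q[y + 1, x], q[y, x - 1], q[y, x + 1])
            if cond_prob(q[y, x], nbrs) < T:
                outliers[y, x] = True
    return outliers

# Toy conditional model: a uniform neighborhood is likely, anything else not.
toy_prob = lambda c, nbrs: 1.0 if all(n == c for n in nbrs) else 0.01
img = np.full((8, 8), 200, dtype=np.uint8)
img[4, 4] = 0                       # one corrupted pixel
omap = detect_outliers(img, toy_prob)
print(int(omap.sum()))  # 5: the corrupted pixel and its four neighbors
```

Note that with this toy model the neighbors of the corrupted pixel are also flagged, since their neighborhoods are no longer uniform; the correction stage described below repairs such pixels.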
(35) If the outlier detection unit 55 is adapted to detect outliers in the histogram matched image for several color channels, like RGB channels, it is also adapted to generate several outlier maps. Preferably, one outlier map is then generated for each color channel.
(38) The apparatus 106 comprises a histogram matching unit 102 that is adapted to match the histogram of the source image to the histogram of the reference image to generate a histogram matched image of the source image, a probability distribution computation unit 104 that is adapted to generate a conditional probability distribution of the reference image, and an outlier detection and correction unit 105 that is adapted to detect outliers in the histogram matched image based on the conditional probability distribution. Preferably, the histogram matching unit 102, the probability distribution computation unit 104, and the outlier detection and correction unit 105 correspond to and comprise the features of respectively the histogram matching unit 52, the probability distribution computation unit 54, and the outlier detection unit 55 of the apparatus 56 shown in
(39) Preferably, the apparatus 106 further comprises a capture reference image unit 100 and a capture source image unit 101. These units 100, 101 are preferably cameras for capturing the reference image and the source image. In case a color mapping is performed for more than one source image, the apparatus 106 can comprise one or more additional capture source image units 101. Alternatively, the reference image and the source image can be captured by only one single capture image unit (not shown) that is then used to capture sequentially the reference image and the source image or vice versa. The output of such a single capture image unit is connected to the histogram matching unit 102. Alternatively, and instead of comprising the capture reference image unit 100 and the capture source image unit 101, the apparatus 106 can comprise one or more interfaces (not shown) adapted to receive the reference image and the source image and transmit them to the histogram matching unit 102, wherein in such a case the reference image and the source image are captured outside of the apparatus 106.
(40) Further on, the apparatus 106 preferably comprises an edge mapping unit 103. The edge mapping unit 103 is shown in more detail in
(42) Accordingly, the probability distribution computation unit 104 is adapted to generate the conditional probability distribution based on the reference image as well as based on the edge map of the reference image. This means that the edges detected in the reference image are not taken into consideration for generating the conditional probability distribution.
(43) Similarly, the outlier detection and correction unit 105 is adapted to detect outliers in the histogram matched image based on the conditional probability distribution as well as based on the edge map of the histogram matched image. This means that the edges detected in the histogram matched image are not taken into consideration when detecting outliers.
(45) Preferably, the erosion operation unit 302 is adapted to apply the erosion according to an erosion kernel, i.e. to detect pixels in the vicinity of the edges according to an erosion kernel. The size of the erosion kernel then can depend on the neighborhood configuration of the apparatus, and for example be the same as the size of the neighborhood configuration. The size of the erosion kernel can depend on the size or the depth of the neighborhood configuration. The depth can be defined as being the distance, within the neighborhood configuration, from the central pixel to the most distant pixel. For example, the first and third configurations shown in
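A plain binary erosion with a square kernel whose size follows the neighborhood depth can be sketched as follows. This is an illustrative Python sketch; the square kernel shape is an assumption.

```python
import numpy as np

def binary_erosion(mask, depth=1):
    """Erode a binary map with a square kernel of size 2*depth+1, where
    `depth` may be chosen as the depth of the neighborhood configuration."""
    h, w = mask.shape
    k = 2 * depth + 1
    padded = np.pad(mask, depth, mode='constant', constant_values=False)
    out = np.ones((h, w), dtype=bool)
    for dy in range(k):                # AND over all kernel offsets
        for dx in range(k):
            out &= padded[dy:dy + h, dx:dx + w]
    return out

# A 7x7 block of True shrinks to 5x5 under erosion with depth 1.
m = np.zeros((11, 11), dtype=bool)
m[2:9, 2:9] = True
print(int(binary_erosion(m, depth=1).sum()))  # 25
```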
(48) In this context, a pixel being better exposed than another pixel means a pixel presenting a better exposure than said another pixel, wherein the exposure describes the amount of light gathered by the capture unit, e.g. by the camera. A low-exposed image appears dark and a high-exposed image appears bright. The exposedness map allows for disregarding pixels being low-exposed or high-exposed and privileging pixels being better exposed.
(49) The exposedness assessment unit 303 generates an exposedness map from the source image and the histogram matched image. The exposedness map is a binary map that indicates, for each pixel, the image where this pixel is best exposed. For example, if the binary map has a value of 1, respectively 0, at a given pixel location (x,y), this indicates that the pixel defined by this location is better exposed in the source image, whereas a value of 0, respectively 1, indicates that the pixel is better exposed in the histogram matched image. The exposedness map can be generated according to the known exposure fusion method, which is e.g. described in Mertens, Kautz, and Van Reeth, "Exposure Fusion," Pacific Graphics, 2007, pp. 369-378.
(51) The selection of pixels having a higher exposedness allows for discarding e.g. pixels having an intensity near zero or near one, i.e. pixels being underexposed or overexposed. The exposedness can be calculated using a Gaussian weighting curve according to the following equation:
exp(−(I−0.5).sup.2/(2σ.sup.2))
(53) As proposed in Mertens, Kautz, and Van Reeth, "Exposure Fusion," Pacific Graphics, 2007, pp. 369-378, the exposedness or well-exposedness can be obtained by weighting each pixel intensity based on how close it is to 0.5, wherein the value of σ in the above equation may be 0.2. For pixels having several intensities in several channels, the exposedness can be obtained by applying the Gaussian weighting curve on each channel and then adding or multiplying the obtained values of each channel.
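The Gaussian well-exposedness weighting and the resulting binary exposedness map can be sketched as follows. This is an illustrative Python sketch with σ=0.2 as stated; the function names are hypothetical.

```python
import numpy as np

def exposedness(intensity, sigma=0.2):
    """Gaussian well-exposedness weight: intensities (normalized to [0, 1])
    near 0.5 weigh close to 1; under- or over-exposed ones close to 0."""
    return np.exp(-((intensity - 0.5) ** 2) / (2 * sigma ** 2))

def exposedness_map(source, matched):
    """Binary map: True where a pixel is better exposed in the source image
    than in the histogram matched image."""
    return exposedness(source) > exposedness(matched)

src = np.array([[0.5, 0.05]])     # well exposed, under-exposed
mat = np.array([[0.95, 0.5]])     # over-exposed, well exposed
em = exposedness_map(src, mat)
print(em)  # [[ True False]]
```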
(56) The outlier detection and correction unit 105 comprises an outlier detection unit 501 that corresponds to the outlier detection unit 55 of
(57) The outlier detection unit 501 differs from the outlier detection unit 55 of
(58) An outlier correction unit 502 is provided in the outlier detection and correction unit 105 for correcting the outliers identified by the outlier map. The correction is performed by correcting the intensity of the matched pixels, which are detected as outliers by the outlier detection unit 501, based on the conditional probability distribution.
(59) The intensity of a matched pixel detected as outlier is corrected by finding the intensity I.sub.c that maximizes the conditional probability p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN). Thereby, the newly found intensity I.sub.cNew that maximizes the conditional probability will replace the previous intensity value within the histogram matched image.
(60) Hence, having located all possible outliers separately for each color channel, the outlier correction unit 502 focuses on finding a replacement for, and thus correcting, the histogram matching-related outliers as indicated in the previously generated outlier maps. The underlying idea is to find a maximum a posteriori (MAP) estimate for the intensity of the central pixel using its neighbors. MAP estimation is a Bayesian approach used to estimate the probability of unobserved data based on the available empirical information. Here, the MAP estimate aims at finding the intensity I.sub.c which maximizes the conditional probability p(I.sub.c|I.sub.n1, I.sub.n2, I.sub.n3 . . . I.sub.nN):
I.sub.cNew=arg max{p(I|I.sub.n1,I.sub.n2,I.sub.n3 . . . I.sub.nN)}; I∈{0, . . . , 255}.
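The MAP correction of the equation above can be sketched as follows. This is an illustrative Python sketch; the toy conditional model peaked at the neighbor mean is an assumption used only to exercise the arg max.

```python
import numpy as np

def correct_outlier(nbrs, cond_prob, levels=256):
    """MAP correction: return the intensity I in {0..levels-1} that
    maximizes p(I | I_n1 . . . I_nN) for the given neighbors."""
    probs = [cond_prob(i, nbrs) for i in range(levels)]
    return int(np.argmax(probs))

# Toy conditional model peaked at the mean of the neighbors.
toy_prob = lambda c, nbrs: float(np.exp(-((c - np.mean(nbrs)) ** 2) / 50.0))
print(correct_outlier((120, 118, 122, 120), toy_prob))  # 120
```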
(61) The output of the outlier correction unit 502 is then an improved version of the histogram matched image, labeled in
(62) The number of iterations can be set automatically to better fit the requirements of the input images. For example, an increased number of iterations is beneficial in the case of texture-less images. Since iterations tend to blur the final result, the number of iterations shall be reduced for images with more textures. For determining the number of iterations, a value reflecting the amount of texture can be calculated by known techniques, for example by computing the variance of a patch, by edge detection or by a gradient-based technique.
(63) Optionally, in case the previously described detection and correction steps are iterated several times, the joint edge map can be updated in a joint edge map actualization unit 503. This update is done for example after each iteration of the detection and correction steps. The joint edge map actualization unit 503 receives as inputs the source image and the statistically enhanced histogram matched image. These two input images are used by the joint edge map actualization unit 503 for generating an updated version of the joint edge map, in a similar way as the edge detection unit 301 generates the joint edge map based on the source image and the histogram matched image. This means that the joint edge map actualization unit 503 can similarly comprise an edge detection unit, an optional erosion operation unit, an exposedness assessment unit and a fusion operation unit (not shown) to generate the updated version of the joint edge map. This updated version is then fed back to the outlier detection unit 501 as shown in
(64) Further, this edge detection unit (not shown) can, similarly to the edge detection unit 301, detect edges of the statistically enhanced histogram matched image and of the source image, and generate a corresponding edge map of the statistically enhanced histogram matched image and a corresponding edge map of the source image. This optional erosion operation unit (not shown) can, similarly to the erosion operation unit 302, apply an erosion on these two edge maps. Further on, the exposedness assessment unit and the fusion operation unit (not shown) can then, similarly to the exposedness assessment unit 303 and the fusion operation unit 304, generate the updated version of the joint edge map.
(65) The resulting histogram matched image that is output by the outlier detection and correction unit 105 comprises less noise and fewer artifacts, therefore allowing for a better and more accurate edge detection.
(66) The present application has been described in conjunction with various embodiments as examples as well as implementations. However, other variations can be understood and effected by those persons skilled in the art and practicing the claimed application, from a study of the drawings, this disclosure and the independent claims. In the claims as well as in the description the word "comprising" does not exclude other elements or steps and the indefinite article "a" or "an" does not exclude a plurality. A single element or other unit may fulfill the functions of several entities or items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used in an advantageous implementation.