Method of digital measuring color of fabrics based on digital camera

11614362 · 2023-03-28

Abstract

A method of digitally measuring the color of fabrics based on a digital camera includes: making plain fabric samples; obtaining ground-truth color of the plain fabrics using a spectrophotometer; capturing a raw format digital image of the plain fabrics using the digital camera and extracting raw camera responses of the plain fabrics; capturing a raw format digital image of a target fabric and extracting the raw camera responses of a region of interest (ROI) in the target fabric; calculating a Euclidean distance and a similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics; normalizing the Euclidean distance and the similarity coefficient; calculating a weighting coefficient of each color data of the plain fabrics based on the normalized Euclidean distance and similarity coefficient; weighting each color data of the plain fabrics with the corresponding weighting coefficient; and summing the weighted color data of the plain fabrics.

Claims

1. A method of digital measuring color of fabrics based on a digital camera, the method comprising: step 1: making plain fabric samples; step 2: obtaining ground-truth color of plain fabrics using a spectrophotometer; step 3: capturing a raw format digital image of the plain fabrics using the digital camera and extracting raw camera responses of the plain fabrics; step 4: capturing a raw format digital image of a target fabric and extracting the raw camera responses of a region of interest (ROI) in the target fabric; step 5: calculating a Euclidean distance and a similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics; step 6: normalizing the Euclidean distance and the similarity coefficient; step 7: calculating a weighting coefficient of each color data of the plain fabrics based on the normalized Euclidean distance and the normalized similarity coefficient; step 8: weighting each of the color data of the plain fabrics with a corresponding weighting coefficient; and step 9: summing the weighted color data of the plain fabrics to get color data of the ROI in the target fabric.

2. The method of digital measuring color of fabrics based on a digital camera of claim 1, wherein the making plain fabric samples in the step 1 is as follows: first, collecting historical color data of the plain fabric samples, and using a convex hull algorithm to extract convex hull color data, represented as C.sub.hull; then, investigating and acquiring a color difference, represented as ΔE, when producing the plain fabrics; after that, using the convex hull color data C.sub.hull and the color difference ΔE as inputs to generate a principal dataset of the plain fabric samples based on a color system production method; finally, obtaining the plain fabric samples by fixing plain fabric entities that are produced in a textile factory on a white cardboard.

3. The method of digital measuring color of fabrics based on a digital camera of claim 1, wherein the obtaining the ground-truth color of the plain fabrics using the spectrophotometer in the step 2 is as follows: obtaining CIEXYZ tristimulus values of the plain fabrics by measuring with the spectrophotometer; and according to colorimetry, the CIEXYZ tristimulus values are calculated as expressed in equation (1) and equation (2):
X=k∫.sub.λx(λ)E(λ)S(λ)dλ
Y=k∫.sub.λy(λ)E(λ)S(λ)dλ
Z=k∫.sub.λz(λ)E(λ)S(λ)dλ,  (1) where, k=100/[∫.sub.λy(λ)E(λ)dλ],  (2) where, x(λ), y(λ) and z(λ) are color matching functions of standard observers, E(λ) is a spectral reflectance of an object measured by the spectrophotometer, S(λ) is a relative spectral power distribution of a light source, λ is a wavelength, k is an adjustment factor, and X, Y, and Z are respective tristimulus values of the plain fabrics; with the calculated tristimulus values, the corresponding color data of fabrics are calculated as expressed in equation (3) and equation (4): L=116f(Y/Y.sub.n)−16, a=500[f(X/X.sub.n)−f(Y/Y.sub.n)], b=200[f(Y/Y.sub.n)−f(Z/Z.sub.n)],  (3) where, f(H/H.sub.n)=(H/H.sub.n).sup.1/3 if (H/H.sub.n)>(24/116).sup.3, and f(H/H.sub.n)=(841/108)(H/H.sub.n)+16/116 if (H/H.sub.n)≤(24/116).sup.3,  (4) where, L, a, and b are the luminance, red-green, and yellow-blue values of the fabric in a CIELab color space, X, Y, and Z are the respective tristimulus values of the plain fabrics, X.sub.n, Y.sub.n, and Z.sub.n are tristimulus values of a reference light source, and H and H.sub.n represent the tristimulus values of the plain fabrics and the reference light source respectively; completing measurements of the color data of the plain fabrics to obtain color data, expressed as c.sub.i, for each plain fabric, c.sub.i is a 1×3 row vector, each color data of the plain fabrics is expressed as equation (5):
c.sub.i=(L.sub.i,a.sub.i,b.sub.i)i∈(1,2, . . . ,N),  (5) where i indicates an ith plain fabric, i is from 1 to N, N is a total number of the plain fabric samples, L.sub.i, a.sub.i, and b.sub.i are luminance, red-green, and yellow-blue values of the ith plain fabric in a CIELab color space.

4. The method of digital measuring color of fabrics based on a digital camera of claim 1, wherein the capturing the raw format digital image of the plain fabrics using the digital camera and extracting the raw camera responses of the plain fabrics in the step 3 is as follows: first, capturing the raw format digital images of the plain fabrics under a uniformly illuminated environment; then, for each digital image of the plain fabrics, extracting m-by-m pixels of the raw camera responses in a central area of each digital image of the plain fabrics, and calculating a mean of the m-by-m pixels of the raw camera responses as expressed in equation (6): d.sub.i=[1/(m×m)]Σ.sub.j=1.sup.m×m(r.sub.i,j,g.sub.i,j,b.sub.i,j),  (6) where, i indicates an ith plain fabric, and j indicates a jth pixel of the central area extracted in the plain fabric, r.sub.i,j, g.sub.i,j, and b.sub.i,j are red-channel, green-channel, and blue-channel raw camera responses of the jth pixel in the ith plain fabric, and d.sub.i is a mean of the raw camera responses of the ith plain fabric, which is a 1×3 row vector.

5. The method of digital measuring color of fabrics based on a digital camera of claim 1, wherein the capturing the raw format digital image of the target fabric and extracting the raw camera responses of the ROI in the target fabric in the step 4 is as follows: first, acquiring the raw format digital image of the target fabric using the digital camera under the same imaging condition parameters and illumination parameters used for capturing the plain fabrics; then, selecting the ROI manually, wherein a single pixel is saved as d.sub.t directly when the single pixel is selected as the ROI, where d.sub.t is a raw camera response row vector with a dimension of 1×3, and acquiring a mean of L pixels of the raw camera responses according to equation (6) and saving the acquired mean as d.sub.t when the L pixels are included in the ROI, d.sub.t is expressed as equation (7):
d.sub.t=(r.sub.t,g.sub.t,b.sub.t),  (7) where, r.sub.t, g.sub.t, and b.sub.t are red-channel, green-channel, and blue-channel of the raw camera responses of the ROI.

6. The method of digital measuring color of fabrics based on a digital camera of claim 1, wherein the calculating the Euclidean distance and the similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics in the step 5 is as follows: first, calculating the Euclidean distance between the raw camera responses of the ROI in the target fabric and the plain fabrics is expressed as equation (8) and equation (9):
e.sub.i=√{square root over ((r.sub.t−r.sub.i).sup.2+(g.sub.t−g.sub.i).sup.2+(b.sub.t−b.sub.i).sup.2)},  (8)
e=(e.sub.1,e.sub.2, . . . ,e.sub.N),  (9) where r.sub.t, g.sub.t, and b.sub.t are red-channel, green-channel, and blue-channel raw format camera responses of the ROI, r.sub.i, g.sub.i, and b.sub.i are red-channel, green-channel, and blue-channel raw format camera responses of an ith plain fabric, e.sub.i is the Euclidean distance between the raw format camera responses of the ROI in the target fabric and the ith plain fabric, e is a row vector including N Euclidean distances; then, calculating the similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics is expressed as equation (10) and equation (11): s.sub.i=(d.sub.t·d.sub.i)/(∥d.sub.t∥×∥d.sub.i∥),  (10) s=(s.sub.1,s.sub.2, . . . ,s.sub.N),  (11) where d.sub.t is a raw camera response vector of the ROI, d.sub.i is a raw format camera response vector of an ith plain fabric, ‘⋅’ indicates a vector inner product, s.sub.i is the similarity coefficient between the raw format camera responses of the ROI in the target fabric and the ith plain fabric, and s is a row vector including N similarity coefficients.

7. The method of digital measuring color of fabrics based on a digital camera of claim 6, wherein the normalizing the Euclidean distance and the similarity coefficient in the step 6 is expressed as equation (12) to equation (13):
e.sub.n,i=e.sub.i⋅/max(e),  (12)
s.sub.n,i=s.sub.i⋅/max(s),  (13) where, e.sub.n,i and s.sub.n,i are the normalized Euclidean distance and the similarity coefficient respectively, and max(⋅) is a function for finding a maximum value, and ‘⋅/’ means element division.

8. The method of digital measuring color of fabrics based on a digital camera of claim 7, wherein the calculating the weighting coefficient of each of the color data of the plain fabrics based on the normalized Euclidean distance and the normalized similarity coefficient in the step 7 is expressed as equation (14): w.sub.i=s.sub.n,i·exp(−e.sub.n,i.sup.2),  (14) where, w.sub.i represents the weighting coefficient of the ith plain fabric, and exp(⋅) represents an exponential function.

9. The method of digital measuring color of fabrics based on a digital camera of claim 1, wherein the weighting each of the color data of the plain fabrics with the corresponding weighting coefficient in the step 8 is expressed as equation (15):
c.sub.w,i=w.sub.i·c.sub.i=(w.sub.i·L.sub.i,w.sub.i·a.sub.i,w.sub.i·b.sub.i),  (15) where, c.sub.w,i is the weighted color data of the ith plain fabric, w.sub.i is the weighting coefficient of the ith plain fabric, c.sub.i is the color data of the ith plain fabric, L.sub.i, a.sub.i, and b.sub.i are luminance, red-green, and yellow-blue values of the ith plain fabric in a CIELab color space.

10. The method of digital measuring color of fabrics based on a digital camera of claim 1, wherein the summing the weighted color data of the plain fabrics to get the color data of the ROI in the target fabric in the step 9 is expressed as equation (16): c.sub.t=(1/N)Σ.sub.i=1.sup.N c.sub.w,i,  (16) where, c.sub.t is the color data of the ROI in the target fabric, c.sub.w,i is the weighted color data of the ith plain fabric, and N is a total number of the plain fabrics; the measurement of the color of the ROI in the target fabric based on the digital camera is thus completed.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) FIG. 1 is a flowchart of the method of digitally measuring the color of fabrics based on a digital camera.

(2) FIG. 2 is the color data distribution of 20 measured target fabrics according to one embodiment of the disclosure.

DETAILED DESCRIPTION OF THE EMBODIMENTS

(3) To further illustrate, embodiments detailing a method of digitally measuring the color of fabrics based on a digital camera are described below. It should be noted that the following embodiments are intended to describe and not to limit the disclosure. When implementing the embodiments of the disclosure, a person skilled in the art may implement them using computer software.

(4) Referring to FIG. 1, an embodiment of the disclosure of a method of digital measuring color of fabrics based on a digital camera comprises the following steps:

(5) Step 1: making plain fabric samples;

(6) Step 2: obtaining the ground-truth color of plain fabrics using the spectrophotometer;

(7) Step 3: capturing the raw format digital image of plain fabrics using a digital camera and extracting the raw camera responses of the plain fabrics;

(8) Step 4: capturing the raw format digital image of a target fabric and extracting the raw camera response of a region of interest (ROI) in a target fabric;

(9) Step 5: calculating the Euclidean distance and similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics;

(10) Step 6: normalizing the Euclidean distance and the similarity coefficient;

(11) Step 7: calculating the weighting coefficient of each color data of the plain fabric based on the normalized Euclidean distance and similarity coefficient;

(12) Step 8: weighting the color data of the plain fabrics with the corresponding weighting coefficient;

(13) Step 9: summing the weighted color data of the plain fabrics to get the color data of the ROI in the target fabric.

(14) Each step of the process is explained via the following embodiments. The embodiments are carried out based on the plain fabric weaving technique, using an X-rite Color i7 spectrophotometer, a Canon 600D digital camera, and a VeriVide DigiEye light box for the tests.

(15) In step 1, 1872 historical color data of plain fabrics are collected from company A, and 237 convex hull color data are extracted using a convex hull algorithm (see reference 1). The maximum level of color difference ΔE of the color of fabrics at company A is 3.5 units. 1236 color data of plain fabrics are generated using the previously proposed color system production method (see reference 2). The 1236 plain fabric samples are produced and fixed on white cardboards through a proofing process. In addition, 20 jacquard fabrics are also produced as test targets.

(16) In step 2, by adopting the X-rite Color i7 spectrophotometer, the ground-truth color data of all plain fabrics are acquired. In the measurement, the CIEXYZ tristimulus values of the plain fabrics are first measured. The details of calculating CIEXYZ are listed in equation (1) and equation (2):
X=k∫.sub.λx(λ)E(λ)S(λ)dλ
Y=k∫.sub.λy(λ)E(λ)S(λ)dλ
Z=k∫.sub.λz(λ)E(λ)S(λ)dλ,  (1)

(17) where,

(18) k=100/[∫.sub.λy(λ)E(λ)dλ],  (2)

(19) In equation (1) and equation (2), x(λ), y(λ) and z(λ) are the color matching functions of standard observers, E(λ) is the measured spectral reflectance of the object, S(λ) is the relative spectral power distribution of the light source, λ indicates the wavelength, k is the adjustment factor, and X, Y, and Z are the tristimulus values of the plain fabrics. After acquiring the tristimulus values, the corresponding CIELab color data of the fabrics can be further calculated as shown in equation (3) and equation (4) to complete the color measurement of the plain fabrics. In an embodiment, the CIE D65 standard light source that is frequently used in the field of art is adopted as the reference light source when calculating the CIELab color data of the fabrics.

(20) L=116f(Y/Y.sub.n)−16, a=500[f(X/X.sub.n)−f(Y/Y.sub.n)], b=200[f(Y/Y.sub.n)−f(Z/Z.sub.n)],  (3)

(21) where,

(22) f(H/H.sub.n)=(H/H.sub.n).sup.1/3 if (H/H.sub.n)>(24/116).sup.3; f(H/H.sub.n)=(841/108)(H/H.sub.n)+16/116 if (H/H.sub.n)≤(24/116).sup.3,  (4)

(23) In equation (3) and equation (4), L, a, and b are the luminance, red-green, and yellow-blue values of the fabric in the CIELab color space. X, Y, and Z are the tristimulus values of the fabrics, X.sub.n, Y.sub.n, and Z.sub.n are the tristimulus values of the reference light source, and H and H.sub.n in equation (4) represent the tristimulus values of the plain fabrics and the reference light source respectively. Upon completing the measurement of the ground-truth color data of the plain fabrics, the color data of each plain fabric sample c.sub.i is obtained. c.sub.i is a 1×3 row vector that can be expressed as equation (5):
c.sub.i=(L.sub.i,a.sub.i,b.sub.i)i∈(1,2, . . . ,N),  (5)

(24) where i indicates the ith plain fabric, and the value of i is from 1 to N. N is the total number of plain fabric samples; L.sub.i, a.sub.i, and b.sub.i are the luminance, red-green, and yellow-blue values of the ith plain fabric in the CIELab color space. In an embodiment, the value of N is 1236.
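The conversion of equations (3) to (5) can be sketched in NumPy as follows. This is a minimal sketch, not the patent's implementation: the function names are illustrative, and the D65 white-point values (95.047, 100.0, 108.883, nominal 2° standard observer values) are an assumption, since the embodiment states only that the CIE D65 source is used.

```python
import numpy as np

# Assumed nominal CIE D65 / 2-degree observer white point (X_n, Y_n, Z_n).
XN, YN, ZN = 95.047, 100.0, 108.883

def _f(t):
    # Piecewise function of equation (4): cube root above the threshold
    # (24/116)^3, linear segment (841/108)*t + 16/116 below it.
    t = np.asarray(t, dtype=float)
    thresh = (24.0 / 116.0) ** 3
    return np.where(t > thresh, np.cbrt(t), (841.0 / 108.0) * t + 16.0 / 116.0)

def xyz_to_lab(X, Y, Z):
    # Equation (3): CIEXYZ tristimulus values to CIELab color data.
    fx, fy, fz = _f(X / XN), _f(Y / YN), _f(Z / ZN)
    L = 116.0 * fy - 16.0
    a = 500.0 * (fx - fy)
    b = 200.0 * (fy - fz)
    return L, a, b
```

Measuring the white point itself should return the achromatic reference (L = 100, a = b = 0), which serves as a quick sanity check of the piecewise function.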

(25) In step 3, after completing the measurement of the ground-truth color data of the plain fabrics, the raw format digital images of the plain fabrics are captured in order in the VeriVide DigiEye light box under uniform illumination, and the raw format images are saved (in *.CR2 format). For each digital image of the plain fabrics, the raw camera responses of every pixel in an m×m pixel area near the center of the digital image are extracted, and the mean of the raw camera responses of the m×m pixels is calculated as expressed in equation (6):

(26) d.sub.i=[1/(m×m)]Σ.sub.j=1.sup.m×m(r.sub.i,j,g.sub.i,j,b.sub.i,j),  (6)

(27) where, i indicates the ith plain fabric, and j indicates the jth pixel of the extracted area in the plain fabric, r.sub.i,j, g.sub.i,j, and b.sub.i,j are the red-channel, green-channel, and blue-channel raw camera responses of the jth pixel in the ith plain fabric, and d.sub.i is the mean raw camera response of the ith plain fabric, which is a 1×3 row vector. The value of m is 100 in the embodiments.
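A minimal NumPy sketch of the patch extraction and averaging in equation (6); it assumes the raw responses have already been decoded from the *.CR2 file into an H×W×3 array, and the function name and array layout are illustrative rather than taken from the patent:

```python
import numpy as np

def mean_raw_response(raw_image, m=100):
    # Equation (6): mean raw (r, g, b) camera response over the central
    # m-by-m pixel area of one fabric image; returns a 1x3 row vector.
    h, w, _ = raw_image.shape
    r0, c0 = (h - m) // 2, (w - m) // 2
    patch = raw_image[r0:r0 + m, c0:c0 + m, :]
    return patch.reshape(-1, 3).mean(axis=0)
```

Averaging the central patch rather than a single pixel suppresses sensor noise and the fabric's weave texture in the response estimate.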

(28) In step 4, after the capturing of the raw format digital images of the plain fabrics and the extraction of the raw camera responses are completed, the digital images of the target jacquard fabrics are acquired using the digital camera under the same illumination and imaging conditions used for capturing the plain fabrics. The region of interest (ROI) is selected manually for each target jacquard fabric, and the corresponding raw camera response is acquired. It should be noted that if the ROI includes just one pixel, the raw camera response of this pixel is directly saved as d.sub.t. d.sub.t is a raw camera response row vector with a dimension of 1×3. If the ROI includes L pixels, the mean of the raw camera responses of these L pixels is calculated as indicated in equation (6), and the mean of the raw camera responses is saved as d.sub.t, as shown in equation (7):
d.sub.t=(r.sub.t,g.sub.t,b.sub.t),  (7)

(29) where, r.sub.t, g.sub.t, and b.sub.t are the red-channel, green-channel, and blue-channel raw camera responses of the ROI. In the embodiments, the ROI is randomly selected while making sure the selected area is a pure color.

(30) In step 5, after completing the extraction of the raw camera responses of the ROI, the Euclidean distance and similarity coefficient between the raw camera responses of the ROI in the target fabric and plain fabrics are calculated. The details are as follows.

(31) Firstly, the Euclidean distance between the raw camera responses of ROI in the target fabric and the plain fabrics are calculated as shown in equation (8) to equation (9):
e.sub.i=√{square root over ((r.sub.t−r.sub.i).sup.2+(g.sub.t−g.sub.i).sup.2+(b.sub.t−b.sub.i).sup.2)},  (8)
e=(e.sub.1,e.sub.2, . . . ,e.sub.N),  (9)

(32) where r.sub.t, g.sub.t, and b.sub.t are the red-channel, green-channel, and blue-channel raw format camera responses of the ROI, r.sub.i, g.sub.i, and b.sub.i are the red-channel, green-channel, and blue-channel raw format camera responses of the ith plain fabric, e.sub.i is the Euclidean distance between the raw format camera responses of the ROI in the target fabric and the ith plain fabric, and e is the row vector including N Euclidean distances.
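Equations (8) and (9) amount to a vectorized distance computation over all N plain fabrics at once; a sketch, with the array names d_t and D chosen for illustration:

```python
import numpy as np

def euclidean_distances(d_t, D):
    # Equations (8)-(9): Euclidean distance between the ROI response d_t
    # (shape (3,)) and each of the N plain-fabric responses in D (shape (N, 3)).
    return np.sqrt(((D - d_t) ** 2).sum(axis=1))
```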

(33) Then, the similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics are calculated as shown in equation (10) to equation (11):

(34) s.sub.i=(d.sub.t·d.sub.i)/(∥d.sub.t∥×∥d.sub.i∥),  (10) s=(s.sub.1,s.sub.2, . . . ,s.sub.N),  (11)

(35) where d.sub.t is the raw camera response vector of the ROI, d.sub.i is the raw camera response vector of the ith plain fabric, ‘⋅’ indicates the vector inner product, s.sub.i is the similarity coefficient between the raw camera responses of the ROI in the target fabric and the ith plain fabric, and s is the row vector including N similarity coefficients.
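The similarity coefficient of equations (10) and (11) is the cosine of the angle between the two raw-response vectors; a NumPy sketch under the same illustrative naming as above:

```python
import numpy as np

def similarity_coefficients(d_t, D):
    # Equations (10)-(11): normalized inner product (cosine similarity)
    # between d_t (shape (3,)) and each row of D (shape (N, 3)).
    return (D @ d_t) / (np.linalg.norm(D, axis=1) * np.linalg.norm(d_t))
```

Because the cosine ignores vector magnitude, this coefficient captures chromatic similarity while the Euclidean distance of equation (8) captures overall response difference, which is why the method uses both.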

(36) In step 6, after the calculation of the Euclidean distance and similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics is completed, the Euclidean distance and similarity coefficient are normalized as expressed in equation (12) to equation (13):
e.sub.n,i=e.sub.i⋅/max(e),  (12)
s.sub.n,i=s.sub.i⋅/max(s),  (13)

(37) where, e.sub.n,i and s.sub.n,i are the normalized Euclidean distance and the normalized similarity coefficient respectively, and max(⋅) is the function for finding the maximum value, and ‘⋅/’ means element division.

(38) In step 7, after the normalization of the Euclidean distance and similarity coefficient between the raw camera responses of the ROI in the target fabric and the plain fabrics is completed, the weighting coefficient of each color data of the plain fabrics based on the normalized Euclidean distance and similarity coefficient are calculated as expressed in equation (14):

(39) w.sub.i=s.sub.n,i·exp(−e.sub.n,i.sup.2),  (14)

(40) where, w.sub.i represents the weighting coefficient of the ith plain fabric, and exp(⋅) represents the exponential function.
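Equations (12) to (14) can be combined into one small routine: a sketch (the function name is an illustrative assumption) that normalizes each vector by its maximum element and then forms the weights, which grow with similarity and decay with the squared distance:

```python
import numpy as np

def weighting_coefficients(e, s):
    # Equations (12)-(13): normalize distances and similarities by their maxima.
    e_n = e / np.max(e)
    s_n = s / np.max(s)
    # Equation (14): w_i = s_n,i * exp(-e_n,i^2).
    return s_n * np.exp(-e_n ** 2)
```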

(41) In step 8, after the calculation of the weighting coefficient of each color data of the plain fabrics is completed, weighting the color data of the plain fabrics with the corresponding weighting coefficients is carried out as expressed in equation (15):
c.sub.w,i=w.sub.i·c.sub.i=(w.sub.i·L.sub.i,w.sub.i·a.sub.i,w.sub.i·b.sub.i),  (15)

(42) where, c.sub.w,i is the weighted color data of the ith plain fabric, w.sub.i is the weighting coefficient of the color data of the ith plain fabric, c.sub.i is the color data of the ith plain fabric, and L.sub.i, a.sub.i, and b.sub.i are the luminance, red-green, and yellow-blue values of the ith plain fabric in the CIELab color space.

(43) In step 9, after weighting the color data of the plain fabrics with the corresponding weighting coefficients is completed, the weighted color data of the plain fabrics are summed to get the color data of the ROI in the target fabric as expressed in equation (16). At this point, the color measurement of the measuring area of the target fabric is completed.

(44) c.sub.t=(1/N)Σ.sub.i=1.sup.N c.sub.w,i,  (16)

(45) where, c.sub.t is the color data of the ROI in the target fabric, c.sub.w,i is the weighted color data of the ith plain fabric, and N is the total number of plain fabrics. In the embodiments, the color measurement of 20 ROIs in 20 target fabrics according to the above-mentioned steps is completed. The corresponding color data distribution is plotted as shown in FIG. 2.
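The final estimate of equations (15) and (16) is a weighted combination of the ground-truth Lab data. The sketch below follows the formulas as written, including the 1/N averaging of equation (16); the function and array names are illustrative:

```python
import numpy as np

def estimate_roi_color(C, w):
    # Equation (15): weight each plain fabric's (L_i, a_i, b_i) row by w_i.
    C_w = w[:, None] * C
    # Equation (16): c_t = (1/N) * sum over i of c_w,i.
    return C_w.mean(axis=0)
```

Combined with the earlier sketches (distance, similarity, and weighting), this completes a minimal end-to-end rendition of steps 5 through 9 under the stated assumptions.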

(46) It will be obvious to those skilled in the art that changes and modifications may be made, and therefore, the aim in the appended claims is to cover all such changes and modifications.