SYSTEMS FOR CHARACTERIZING AMBIENT ILLUMINATION
20220360756 · 2022-11-10
Inventors
CPC classification
G01J3/505
PHYSICS
H04N23/74
ELECTRICITY
H04N23/88
ELECTRICITY
H04N9/77
ELECTRICITY
International classification
H04N9/73
ELECTRICITY
Abstract
A camera system with a multispectral sensor that can be used in combination with a flash to determine a spectrum of the ambient illumination without needing a separate measurement. This may then be used to colour-correct an image captured with or without flash.
Claims
1. A method of using a camera system to characterize ambient illumination, the camera system having an image sensor to capture a view of a scene, a flash to provide flash illumination of the scene, and a multispectral sensor to capture a spectrum of light from the scene in a plurality of wavelength channels, the method comprising: capturing a first spectrum of light (D.sub.TFA) from the scene using the multispectral sensor whilst the flash is operating to illuminate the scene in addition to ambient illumination; capturing a second spectrum of light (D.sub.TA) from the scene using the multispectral sensor whilst the flash is not operating and the scene is illuminated by the ambient illumination; determining a difference between the first and second spectra representing a scene flash spectrum (D.sub.TF), wherein the scene flash spectrum represents a spectrum of the scene when illuminated by the flash without the ambient illumination; compensating the scene flash spectrum using a spectrum of the flash illumination (E(λ)) to determine a colour-compensated scene flash spectrum (R.sub.T(λ)), wherein the colour-compensated scene flash spectrum represents an average reflectance spectrum of the scene when compensated for the spectrum of the flash illumination; and processing the second spectrum of light from the scene using the colour-compensated scene flash spectrum to estimate a spectrum of the ambient illumination (R.sub.A(λ)).
2. A method as claimed in claim 1 wherein processing the second spectrum of light from the scene using the colour-compensated scene flash spectrum comprises dividing each of a set of values (R.sub.TA(λ)) representing the second spectrum of light at each of a respective set of wavelength points by a corresponding value for the colour-compensated scene flash spectrum (R.sub.T(λ)) at the respective wavelength point.
3. A method as claimed in claim 1 wherein processing the second spectrum of light from the scene using the colour-compensated scene flash spectrum to estimate a spectrum of the ambient illumination comprises dividing a representation of the second spectrum of light (R.sub.TA(λ)) by the colour-compensated scene flash spectrum (R.sub.T(λ)).
4. A method as claimed in claim 3 further comprising compensating each of the first and second spectra for a response of the multispectral sensor (M.sub.TF; M.sub.TA).
5. A method as claimed in claim 1 wherein the multispectral sensor has n wavelength channels, wherein the first and second spectra are represented by respective first spectrum and second spectrum vectors of length n (D.sub.TFA; D.sub.TA), and wherein determining the difference between the first and second spectra comprises subtracting one of the first spectrum and second spectrum vectors from the other to determine a scene flash spectrum vector (D.sub.TF).
6. A method as claimed in claim 5 wherein the spectrum of the flash illumination is represented by a flash illumination vector (E(λ)) of length m, where m represents a number of wavelength points defined by the spectrum; wherein a sensitivity of the multispectral sensor at the wavelength points for each wavelength channel is defined by a n×m sensitivity matrix (S(λ)); and wherein compensating the scene flash spectrum using the spectrum of the flash illumination comprises multiplying the scene flash spectrum vector (D.sub.TF) by a matrix (M.sub.TF) defined by a combination of the sensitivity matrix and the flash illumination vector to obtain a colour-compensated scene flash spectrum vector (R.sub.T(λ)) representing the colour-compensated scene flash spectrum.
7. A method as claimed in claim 6 further comprising multiplying the second spectrum vector (D.sub.TA) by an inverse of the sensitivity matrix (M.sub.TA) to obtain a sensor-compensated second spectrum vector (R.sub.TA(λ)), and dividing sensor-compensated second spectrum vector by the colour-compensated scene flash spectrum vector (R.sub.T(λ)).
8. A method as claimed in claim 1 wherein the multispectral sensor has at least four wavelength channels.
9. A method as claimed in claim 1 further comprising adapting an RGB to CIE XYZ transformation matrix of the camera using the estimated spectrum of the ambient illumination.
10. A method as claimed in claim 1 further comprising using the image sensor to capture an image, and i) using the estimated spectrum of the ambient illumination to colour-correct the image and/or ii) storing data representing the estimated spectrum of the ambient illumination with image data representing the captured image.
11. A method as claimed in claim 1 further comprising processing the estimated spectrum of the ambient illumination to classify the ambient illumination into one of a set of discrete categories, and controlling one or both of image capture and image processing by the camera dependent upon the category of the ambient illumination.
12. A method as claimed in claim 1 further comprising processing the estimated spectrum of the ambient illumination to determine illumination data characterizing a colour or colour temperature of the ambient illumination, and i) using the illumination data to colour-correct the image and/or ii) storing the illumination data with image data from the image sensor.
13. A method as claimed in claim 1 further comprising determining a colour transformation matrix, wherein the colour transformation matrix comprises a matrix to transform from an RGB to a CIE XYZ colour space adapted to compensate for the estimated spectrum of the ambient illumination.
14. Processor control code, or one or more computer readable media storing processor control code, to implement the method of claim 1.
15. A camera system (100) comprising: an image sensor (106) to capture a view of a scene; a flash (110) to provide flash illumination of the scene; a multispectral sensor (108) to capture a spectrum of light from the scene in a plurality of wavelength channels; an image processing subsystem (120) configured to: capture a first spectrum of light (D.sub.TFA) from the scene using the multispectral sensor whilst the flash is operating to illuminate the scene in addition to ambient illumination; capture a second spectrum of light (D.sub.TA) from the scene using the multispectral sensor whilst the flash is not operating and the scene is illuminated by the ambient illumination; determine a difference between the first and second spectra representing a scene flash spectrum (D.sub.TF), wherein the scene flash spectrum represents a spectrum of the scene when illuminated by the flash without the ambient illumination; compensate the scene flash spectrum using a spectrum of the flash illumination (E(λ)) to determine a colour-compensated scene flash spectrum (R.sub.T(λ)), wherein the colour-compensated scene flash spectrum represents an average reflectance spectrum of the scene when compensated for the spectrum of the flash illumination; and process the second spectrum of light from the scene using the colour-compensated scene flash spectrum to estimate a spectrum of the ambient illumination (R.sub.A(λ)).
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] In the drawings like reference numerals indicate like elements.
DETAILED DESCRIPTION
[0039] To properly capture or represent a colour under coloured ambient illumination requires knowledge of the ambient light spectrum. For example under reddish ambient illumination a white surface will appear reddish, but so will a reddish surface under white light illumination.
[0040] This specification describes a camera system with a multispectral sensor that can be used in combination with a flash to determine a spectrum of the ambient illumination without needing a separate measurement. This may then be used to colour-correct an image captured with or without flash, e.g. to perform automatic white balancing. The camera system may form part of a consumer electronic device such as a camera, mobile phone, tablet, laptop, or other device. However variants of the system need not capture an image, e.g. a variant of the system may be used in a projector to project a colour-corrected image.
[0042] In implementations the multispectral sensor 108 comprises an n-channel sensor, where n≥4 (e.g. n=6 to 8) defines the number of wavelength channels of the multispectral sensor. In this specification the data captured from the multispectral sensor 108 is referred to as a spectrum, even though in some implementations it may define only four points of the spectrum.
[0044] The camera system 100 also includes a flash 110 to illuminate, directly or indirectly, a scene viewed by the multispectral sensor 108. The flash 110 may comprise e.g. one or more LEDs (light emitting diodes).
[0045] The field of view (FOV) of the multispectral sensor 108 and the FOV (strictly, the field of illumination) of the flash 110 should overlap, and in implementations are similar. The multispectral sensor 108, flash 110, and image sensor 106 may be located physically close to one another on the mobile device 102.
[0046] The camera system 100 includes an image processing subsystem 120. In implementations this is configured to control the flash 110, e.g. via line 126, to provide flash illumination of the scene, and to capture data from the multispectral sensor 108, e.g. via line 124, representing a spectrum of light from the scene. The image processing subsystem 120 may also cooperate with the image sensor 106 to capture an image of the scene, e.g. via line 128, and optionally to process the captured image. In implementations the image processing subsystem 120 is integrated with the camera system 100 (though shown separately in the drawings).
[0047] In implementations the image processing subsystem 120 is configured to control the flash and capture and process data from the multispectral sensor 108 in order to estimate a spectrum of ambient light illuminating the scene. The image processing subsystem 120 has an output 130, which is internal to the mobile device 102. The output 130, and generally an output as described in this specification, may comprise e.g. an electrical connection or a register of the image processing subsystem 120.
[0048] The spectrum of the ambient illumination may be provided as an explicit output of the image processing subsystem 120 i.e. output 130 may provide data defining a spectrum. Also or instead the image processing subsystem 120 may further process the spectrum of the ambient illumination e.g. to classify the ambient illumination into one of a set of discrete categories, e.g. representing different types of illumination such as one or more of: fluorescent light illumination, LED illumination, tungsten/incandescent illumination, and daylight illumination. The output 130 may then comprise data identifying the category into which ambient illumination is classified.
[0049] The image processing subsystem 120 may be implemented in hardware, software (which as used here includes firmware), or a combination of the two.
[0051] Thus at step 202 the process controls e.g. triggers the flash to illuminate the scene and captures a first spectrum of light from the scene using the multispectral sensor whilst the flash is operating to illuminate the scene in addition to ambient illumination.
[0052] In general the colour of a scene can be described by a product of the reflectance of the scene, i.e. the colour of reflecting surfaces in the scene (the “target colour”) and the illumination i.e. the colour of the light illuminating the scene.
[0053] The first spectrum may be represented by a first spectrum vector D.sub.TFA of dimension n, each element of the vector corresponding to a wavelength channel of the n-channel multispectral sensor. Here D.sub.TFA denotes the detected multispectral sensor signal. This comprises the scene illumination, i.e. a combination of the ambient light illumination of the scene and the flash illumination of the scene, modified (multiplied) by the target colour of the scene, that is, multiplied by a reflectance of the scene at each wavelength component of the flash illumination (see later).
[0054] At step 204 the process captures a second spectrum of light from the scene using the multispectral sensor, whilst the flash is not operating and the scene is illuminated by just the ambient illumination.
[0055] The second spectrum may be represented by a second spectrum vector D.sub.TA of dimension n, each element of the vector corresponding to a wavelength channel of the n-channel multispectral sensor. Here D.sub.TA denotes the detected multispectral sensor signal, which comprises the ambient light illumination of the scene modified (multiplied) by the target colour of the scene.
[0056] Steps 202 and 204 may be performed in any order.
[0057] At step 206 the process determines a difference between the first and second spectra. This represents a spectrum of the scene as would be detected by the multispectral sensor when illuminated by the flash without the ambient illumination, here referred to as a scene flash spectrum. The scene flash spectrum may be represented by an n-dimensional scene flash spectrum vector, D.sub.TF, where:
D.sub.TF=D.sub.TFA−D.sub.TA
[0058] To avoid inaccuracy this difference should not be too small. For example a radiance of the flash may be larger, e.g. at least around 5% larger, than a radiance of the ambient light. Implementations of the technique described herein are particularly suitable for indoor applications.
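For illustration, the difference step described above may be sketched as follows, with hypothetical 8-channel sensor readings and the approximately 5% flash-contribution check suggested in the text applied as an overall radiance comparison (all values are assumptions, not taken from this specification):

```python
import numpy as np

# Hypothetical readings from an 8-channel multispectral sensor (arbitrary units).
d_tfa = np.array([5.2, 6.1, 7.0, 6.5, 5.9, 5.5, 5.0, 4.8])  # D_TFA: flash + ambient
d_ta = np.array([4.0, 4.5, 5.0, 4.8, 4.6, 4.4, 4.1, 4.0])   # D_TA: ambient only

# Scene flash spectrum D_TF: the contribution of the flash alone.
d_tf = d_tfa - d_ta

# Guard against a difference that is too small to be reliable, comparing the
# total flash contribution to the total ambient signal (~5% threshold).
sufficient = d_tf.sum() / d_ta.sum() >= 0.05
```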
[0059] Knowing the spectrum of the scene when illuminated by the flash i.e. the scene flash spectrum vector, D.sub.TF, and knowing the spectrum of the flash illumination, the (mean) target colour of the scene may be determined. The target colour is represented by an average reflectance spectrum of the scene when compensated for the spectrum of the flash illumination.
[0060] The spectrum of the flash illumination may be represented by an m-dimensional flash illumination vector, E(λ), where m represents a number of wavelength points defined by the spectrum, e.g. defining an optical output of the flash at each of m different wavelengths. In the case of an LED flash comprising m LEDs each at a different wavelength, e.g. a white LED flash, the flash illumination vector may define the optical output at each LED wavelength. The optical output may be defined as the radiance at each wavelength, or may be defined in arbitrary units.
[0061] Thus at step 208 the process may compensate the scene flash spectrum using the spectrum of the flash illumination to determine a colour-compensated scene flash spectrum.
[0062] In general terms, since the scene flash spectrum is a product of the spectrum of the flash illumination and the target colour, the true target colour may be recovered by dividing the scene flash spectrum by the spectrum of the flash illumination. In practice, however, this may involve multiplying the scene flash spectrum vector by a matrix, M.sub.TF, derived from a combination of a sensitivity matrix, S(λ), representing a wavelength-dependent sensitivity of the multispectral sensor 108, and the flash illumination vector.
[0063] The true target colour, i.e. the colour-compensated scene flash spectrum, may be represented by an m-dimensional colour-compensated scene flash spectrum vector, R.sub.T(λ), which defines a true colour (average reflectance) of the scene at each of the m different wavelengths of the flash illumination vector. Then, for example,
R.sub.T(λ)=M.sub.TF.sup.T·D.sub.TF
[0064] where the superscript T denotes a matrix transpose.
[0065] In general (though not necessarily) the n wavelength channels of the sensor may not coincide with the m wavelengths of the flash illumination. An n×m sensitivity matrix, S(λ), may thus be defined for the multispectral sensor 108. The sensitivity matrix may define the sensitivity of each of the n sensor wavelength channels at each of the m wavelengths representing the spectrum of the flash illumination. The sensitivity matrix may be known for the multispectral sensor 108 or may be determined by calibration.
[0066] In some implementations the matrix, M.sub.TF, represents an inverse of the sensitivity of the multispectral sensor measured at the spectrum of the flash illumination. This may be a pseudo inverse or Wiener inverse of a matrix generated from the n×m sensitivity matrix, S(λ), and m-dimensional flash illumination vector, E(λ). The matrix generated from S(λ) and E(λ), which may be termed S.sub.TF, may be generated by elementwise multiplication of each m-dimensional column of S(λ) by E(λ). For example the ith column of S.sub.TF may have elements S.sub.i,1⋅E.sub.1, . . . , S.sub.i,m ⋅E.sub.m, where i=1 . . . n.
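The construction of S.sub.TF by elementwise multiplication may be sketched as follows; the channel count, wavelength count, and matrix values are hypothetical:

```python
import numpy as np

n, m = 4, 8  # hypothetical channel and wavelength counts

# Hypothetical n×m sensitivity matrix S(λ) and m-point flash spectrum E(λ).
rng = np.random.default_rng(0)
S = rng.uniform(0.1, 1.0, size=(n, m))
E = rng.uniform(0.5, 1.5, size=m)

# S_TF[i, j] = S[i, j] * E[j]: each row of sensitivities is scaled
# elementwise by the flash output at the corresponding wavelength.
S_TF = S * E  # broadcasting multiplies every row of S by E
```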
[0067] Alternatively the matrix M.sub.TF may be determined by calibration, e.g. by illuminating a scene e.g. a white wall, at each separate wavelength m under zero ambient light.
[0068] The matrix, M.sub.TF, may be determined as a pseudo inverse by determining:
M.sub.TF=S.sub.TF.sup.T*(S.sub.TF*S.sub.TF.sup.T).sup.−1
[0069] The matrix, M.sub.TF, may be determined as a Wiener inverse by determining:
M.sub.TF=S.sub.TF.sup.T*Smooth*(S.sub.TF*Smooth*S.sub.TF.sup.T).sup.−1
[0070] where Smooth is a smoothing matrix.
[0071] Determining M.sub.TF as a Wiener inverse is more representative than determining M.sub.TF as a pseudo-inverse. Other methods may also be used to determine M.sub.TF.
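The pseudo inverse and Wiener inverse of S.sub.TF may be sketched as follows. The specification does not reproduce its example Smooth matrix, so an exponential correlation matrix, a common choice in spectral reconstruction, is assumed here; the terms are ordered so that the matrix dimensions conform:

```python
import numpy as np

n, m = 4, 8
rng = np.random.default_rng(1)
S_TF = rng.uniform(0.1, 1.0, size=(n, m))  # hypothetical flash-weighted sensitivities

# Pseudo inverse: M_TF = S_TF^T (S_TF S_TF^T)^-1, an m×n right inverse.
M_pseudo = S_TF.T @ np.linalg.inv(S_TF @ S_TF.T)

# Wiener inverse with a smoothing matrix; an exponential correlation matrix
# Smooth[i, j] = rho^|i - j| is assumed as the smoothing prior.
rho = 0.9
idx = np.arange(m)
Smooth = rho ** np.abs(idx[:, None] - idx[None, :])  # m×m
M_wiener = Smooth @ S_TF.T @ np.linalg.inv(S_TF @ Smooth @ S_TF.T)
```

Both matrices are right inverses of S_TF; they differ in how the underdetermined m-point reconstruction is regularized.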
[0072] The process may then use the true target colour, i.e. the colour-compensated scene flash spectrum vector, R.sub.T(λ), to estimate a spectrum of the ambient illumination (step 210). This may use the second spectrum, obtained when the scene is illuminated by just ambient illumination, as represented by the second spectrum vector D.sub.TA.
[0073] In an example implementation a second matrix, M.sub.TA, represents an inverse of the sensor sensitivity. In this case M.sub.TA, may be determined as a pseudo inverse or Wiener inverse of the n×m sensitivity matrix S(λ) characterizing the spectral sensitivity of the multispectral sensor 108.
[0074] Thus an m-dimensional sensor-compensated second spectrum vector, R.sub.TA(λ), may be determined as:
R.sub.TA(λ)=M.sub.TA.sup.T·D.sub.TA
[0075] where again the superscript T denotes a matrix transpose. Here R.sub.TA(λ), represents a measurement of the second (ambient light only) spectrum compensated for the response of the multispectral sensor 108.
[0076] The spectrum of the ambient illumination may be represented by an m-dimensional vector R.sub.A(λ) with a value for each of m wavelengths. This may be determined by elementwise dividing the sensor-compensated second spectrum vector R.sub.TA(λ) by the colour-compensated scene flash spectrum vector R.sub.T(λ):
R.sub.A(λ)=R.sub.TA(λ)/R.sub.T(λ)
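The elementwise division step may be sketched as follows, with hypothetical m=8 spectra:

```python
import numpy as np

# Hypothetical reconstructed m-point spectra (arbitrary units).
r_t = np.array([0.9, 0.8, 0.7, 0.6, 0.6, 0.7, 0.8, 0.9])            # R_T(λ): mean scene reflectance
r_ta = np.array([0.45, 0.48, 0.49, 0.48, 0.54, 0.63, 0.72, 0.81])   # R_TA(λ): ambient-lit scene

# Ambient illumination estimate R_A(λ): elementwise division removes the
# scene reflectance, leaving the ambient light spectrum.
r_a = r_ta / r_t
```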
[0077] The process may provide the spectrum of the ambient illumination as an output and/or the process may then classify the type (category) of ambient illumination (step 212). There are many ways in which this might be done, for example determining which of a set of template or reference spectra best matches the measured spectrum. The classification may simply aim to distinguish between natural and artificial illumination, or may also attempt to determine a particular type (category) of artificial illumination.
[0078] An advantage of classifying the ambient illumination is that this may allow a more accurate determination of the ambient light spectrum: a relatively few points may serve to classify the ambient light spectrum but once the type of ambient illumination is known better colour compensation may be applied.
[0079] As an example, in one approach to classifying the type of ambient illumination the measured spectrum of the ambient illumination is first normalized, e.g.
R.sub.An(λ)=R.sub.A(λ)/R.sub.A(mean)
[0080] where R.sub.A (mean) is the mean of the elements of R.sub.A (λ). A set of i reference spectra may be represented as an i×m matrix, R.sub.Aref(λ, i), where i indexes a reference spectrum and m indexes wavelength. These may be normalized in the same way to determine a set of normalized reference spectra, R.sub.An_ref(λ, i):
R.sub.An_ref(λ,i)=R.sub.Aref(λ,i)/R.sub.Aref(mean,i)
[0081] where R.sub.Aref(mean, i) is the mean of the elements of R.sub.Aref(λ, i) for reference spectrum i. The set of reference spectra may be stored in non-volatile memory 122.
[0082] An integrated deviation, d(i), between the measured spectrum of the ambient illumination and each of the reference spectra may then be determined, e.g.
d(i)=sum(abs(R.sub.An(λ)−R.sub.An_ref(λ,i)))
[0083] where abs(⋅) denotes taking an absolute value and sum(⋅) denotes summing over λ. The type of ambient illumination may then be determined by finding the best match to the reference spectra, i.e. the value of i which minimises d(i).
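The normalization and best-match classification described above may be sketched as follows, with hypothetical spectra and category labels:

```python
import numpy as np

# Hypothetical measured ambient spectrum and reference spectra (m=4 points).
r_a = np.array([1.0, 2.0, 3.0, 2.0])
refs = np.array([
    [0.9, 2.1, 3.1, 1.9],   # e.g. a "daylight"-like reference
    [3.0, 2.0, 1.0, 2.0],   # e.g. a "tungsten"-like reference
])
labels = ["daylight", "tungsten"]

# Normalize the measured and reference spectra by their means.
r_an = r_a / r_a.mean()
refs_n = refs / refs.mean(axis=1, keepdims=True)

# Integrated absolute deviation d(i) per reference; pick the smallest.
d = np.abs(r_an - refs_n).sum(axis=1)
best = labels[int(np.argmin(d))]
```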
[0084] Also or instead the measured spectrum of the ambient illumination may be processed to determine a location of the ambient illumination in a colour space (on a chromaticity diagram), such as a location in CIE u′, v′ colour space.
[0085] Also or instead the measured spectrum of the ambient illumination may be processed to determine colour temperature, e.g. correlated colour temperature, CCT, of the ambient illumination.
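As one illustration of determining a correlated colour temperature from chromaticity, McCamy's well-known approximation may be used. This formula is a standard technique, not one specified in this document, and obtaining the xy chromaticity from the estimated spectrum would additionally require the CIE colour matching functions:

```python
# McCamy's approximation: correlated colour temperature (CCT) in kelvin
# from CIE 1931 xy chromaticity coordinates.
def mccamy_cct(x: float, y: float) -> float:
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) should yield roughly 6500 K.
cct_d65 = mccamy_cct(0.3127, 0.3290)
```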
[0086] This information may be provided as an illumination data output from the system.
[0087] The illumination data may be stored with image data from the image sensor, and/or used to colour-correct a captured image, or used in some other way. For example an image captured under low-CCT ambient light may be processed so as to appear captured under other, e.g. standard, illumination such as D65 daylight.
[0088] Also or instead data characterizing the true target colour, i.e. independent of the ambient light, may be provided as a data output from the system. For example the system may output the colour-compensated scene flash spectrum vector, R.sub.T(λ). This may be used, for example, to determine the true colour of a target such as paint on a wall, to overcome metamerism when colour matching.
[0090] The CIE 1931 and CIE 1964 colour spaces are designed to better match human vision and have associated colour matching functions. These can be thought of as defining the sensitivity curves of three light detectors for CIE tristimulus values XYZ, approximately blue, green and red. A 3×3 matrix defines a conversion from RGB to XYZ for a particular image sensor. As described above, for implementations of the system described herein this may also take account of the spectrum of the ambient illumination (which may be treated as a modification to the RGB sensor characteristic).
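For illustration, applying such a 3×3 conversion matrix may be sketched as follows. The standard linear-sRGB to XYZ (D65) matrix is assumed here, whereas the described system would use a camera-specific matrix adapted for the estimated ambient spectrum:

```python
import numpy as np

# Standard linear-sRGB to CIE XYZ matrix for a D65 white point (illustrative;
# a camera-specific, ambient-adapted matrix would replace this in practice).
RGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

rgb = np.array([1.0, 1.0, 1.0])   # linear RGB white
xyz = RGB_TO_XYZ @ rgb            # tristimulus values of the white point
```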
[0092] In broad terms there has been described a method of calculating a spectrum of ambient light by measuring an image scene with and without flash light. The method comprises reconstructing a mean reflectance colour of the scene under ambient light (A) by using a first matrix operation (optimized for direct measurement); reconstructing the mean reflectance colour of the scene (B) by using a second matrix operation (optimized for colour measurement and with respect to the known spectrum of a flash light); and calculating the ambient light spectrum as the difference between A and B.
[0093] The type of ambient light source may be detected by comparison with a set of expected reconstructed data for typical light sources, for example by calculating the spectral deviation and finding the reference light source spectrum with the lowest deviation, e.g. using a least squares method or a sum of absolute differences between the spectra.
[0094] The method may be used for ambient white balancing in a camera, and/or for generating specific conversion matrices from e.g. RGB values of a (sensor) pixel to a standard colour space (e.g. XYZ), and/or to provide additional image information for post image capture processing.
LIST OF REFERENCE NUMERALS
[0095] 100 camera system [0096] 102 mobile device [0097] 104 display [0098] 106 image sensor [0099] 108 multispectral sensor [0100] 110 flash [0101] 120 image processing subsystem [0102] 122 non-volatile memory [0103] 124 multispectral data capture line [0104] 126 flash control line [0105] 128 image capture line [0106] 130 output [0107] 202 control flash to illuminate scene and capture first spectrum [0108] 204 capture second spectrum from the scene under ambient illumination [0109] 206 determine a difference between the first and second spectra, D.sub.TF [0110] 208 colour-compensate to determine true mean colour of the scene, R.sub.T(λ) [0111] 210 use true colour to estimate spectrum of ambient illumination, R.sub.A(λ) [0112] 212 classify the type of ambient illumination [0113] 300 D65 daylight spectrum [0114] 302 low CCT halogen lamp spectrum [0115] 304 white LED spectrum [0116] 306 fluorescent light spectrum
[0117] Features of the method and system which have been described or depicted herein in combination e.g. in an embodiment, may be implemented separately or in sub-combinations. Features from different embodiments may be combined. Thus each feature disclosed or illustrated in the present specification may be incorporated in the invention, whether alone or in any appropriate combination with any other feature disclosed or illustrated herein. Method steps should not be taken as requiring a particular order e.g. that in which they are described or depicted, unless this is specifically stated. A system may be configured to perform a task by providing processor control code and/or dedicated or programmed hardware e.g. electronic circuitry to implement the task.
[0118] Aspects of the method and system have been described in terms of embodiments, but these embodiments are illustrative only and the claims are not limited to those embodiments.
[0119] For example, the system may be used in an electronic device such as a projector, laptop, or smart home device such as a smart speaker, in which case the imaging sensor may be omitted. In such applications the system may still be termed a camera system as it includes a light sensor, i.e. the multispectral sensor, though not necessarily an imaging sensor. In such applications the system may be used e.g. to colour-correct a displayed image.
[0120] Those skilled in the art will be able to make modifications and alternatives in view of the disclosure which are contemplated as falling within the scope of the claims.